gov.sandia.cognition.learning.algorithm.regression

Class LevenbergMarquardtEstimation

• All Implemented Interfaces:

`CloneableSerializable`, `AnytimeAlgorithm`, `BatchCostMinimizationLearner`, `IterativeAlgorithm`, `StoppableAlgorithm`

```@PublicationReference(
    author={"William H. Press","Saul A. Teukolsky","William T. Vetterling","Brian P. Flannery"},
    title="Numerical Recipes in C, Second Edition",
    type=Book,
    year=1992,
    pages={685,687},
    notes="Section 15.5",
    url="http://www.nrbook.com/a/bookcpdf.php")
@PublicationReference(
    author="Wikipedia",
    title="Levenberg-Marquardt algorithm",
    type=WebPage,
    year=2008,
    url="http://en.wikipedia.org/wiki/Levenberg-Marquardt_algorithm")
public class LevenbergMarquardtEstimation
extends LeastSquaresEstimator```
Implementation of the nonlinear regression algorithm known as Levenberg-Marquardt Estimation (LMA). In its pure form, this algorithm computes the least-squares solution to the parameters of a functional form given a (weighted) set of input-output Vector pairs. While the algorithm is specified in terms of parameter gradients, it has been proven that a forward-difference Jacobian approximation does not hurt the convergence properties of LMA. (Thus, you can use GradientDescendableApproximator with impunity.) LMA requires storage of the explicit Jacobian, which may not be possible for large problems.
When gradients are available, LMA appears competitive with BFGS in function evaluations, but BFGS is ~2 times faster than LMA.
When gradients are not available, LMA slows down by a factor of ~3 and requires ~M additional function evaluations, where M is the number of parameters in the function. However, LMA with an approximated parameter Jacobian is ~4 times FASTER than Powell's minimization (and Powell's method requires ~8 times more function evaluations).
Take-home message: use ParameterDifferentiableCostMinimizer with BFGS when possible, or LMA with an approximated Jacobian when gradient information is not available.
Loosely based on Numerical Recipes in C, pp. 685-687.
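The damping policy this class documents (divide the damping factor by the divisor on a cost-reducing iteration, multiply it otherwise, and stop after too many iterations without improvement) can be illustrated with a minimal, self-contained sketch. This is not the Foundry API; `LmaSketch`, its toy one-parameter model y = a·x, and the data are all hypothetical, with constants chosen to mirror the documented defaults (damping 1.0, divisor 10.0, 4 iterations without improvement).

```java
/** Hypothetical sketch of the Levenberg-Marquardt damping loop for y = a*x. */
public class LmaSketch {

    /** Sum of squared residuals for parameter a. */
    static double cost(double a, double[] x, double[] y) {
        double c = 0.0;
        for (int i = 0; i < x.length; i++) {
            double r = y[i] - a * x[i];
            c += r * r;
        }
        return c;
    }

    /** Fits a in y = a*x by the LM damping policy; returns the estimate. */
    static double fit(double[] x, double[] y) {
        double a = 0.0;              // initial parameter guess
        double lambda = 1.0;         // mirrors DEFAULT_DAMPING
        final double divisor = 10.0; // mirrors DEFAULT_DAMPING_DIVISOR
        final double tolerance = 1e-9;
        int stagnant = 0;            // sequential iterations without improvement
        double c = cost(a, x, y);
        for (int iter = 0; iter < 100 && stagnant < 4; iter++) {
            // For y = a*x the Jacobian entry is x_i, so the damped
            // normal equation (J'J + lambda) * step = J'r is scalar.
            double jtj = 0.0, jtr = 0.0;
            for (int i = 0; i < x.length; i++) {
                jtj += x[i] * x[i];
                jtr += x[i] * (y[i] - a * x[i]);
            }
            double step = jtr / (jtj + lambda);
            double cNew = cost(a + step, x, y);
            if (cNew < c) {
                a += step;           // cost-reducing step: accept it
                lambda /= divisor;   // ... and relax the damping
                if (c - cNew < tolerance) break;
                c = cNew;
                stagnant = 0;
            } else {
                lambda *= divisor;   // unsuccessful step: increase damping
                stagnant++;
            }
        }
        return a;
    }

    public static void main(String[] args) {
        double[] x = {1, 2, 3};
        double[] y = {2, 4, 6};      // generated by a = 2
        System.out.println(fit(x, y));
    }
}
```

Because this toy problem is linear, a single accepted step with small damping lands near the exact solution; the damping machinery only matters on nonlinear problems, where rejected steps raise lambda and push the update toward small gradient-descent steps.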
Since:
2.1
Author:
Kevin R. Dixon
Serialized Form
• Field Summary

Fields
Modifier and Type Field and Description
`static double` `DEFAULT_DAMPING`
Default initial value of the damping factor, 1.0
`static double` `DEFAULT_DAMPING_DIVISOR`
Divisor of the damping factor on a successful (cost-reducing) iteration, 10.0
`static int` `DEFAULT_MAX_ITERATIONS_WITHOUT_IMPROVEMENT`
Default maximum number of iterations without improvement before stopping, 4
• Fields inherited from class gov.sandia.cognition.learning.algorithm.regression.AbstractParameterCostMinimizer

`DEFAULT_MAX_ITERATIONS, DEFAULT_TOLERANCE`
• Fields inherited from class gov.sandia.cognition.learning.algorithm.AbstractAnytimeBatchLearner

`data, keepGoing`
• Fields inherited from class gov.sandia.cognition.algorithm.AbstractAnytimeAlgorithm

`maxIterations`
• Fields inherited from class gov.sandia.cognition.algorithm.AbstractIterativeAlgorithm

`DEFAULT_ITERATION, iteration`
• Constructor Summary

Constructors
Constructor and Description
`LevenbergMarquardtEstimation()`
Creates a new instance of LevenbergMarquardtEstimation
```LevenbergMarquardtEstimation(double dampingFactor, double dampingFactorDivisor, int maxIterations, int maxIterationsWithoutImprovement, double tolerance)```
Creates a new instance of LevenbergMarquardtEstimation
• Method Summary

All Methods
Modifier and Type Method and Description
`protected void` `cleanupAlgorithm()`
Called to clean up the learning algorithm's state after learning has finished.
`double` `getDampingFactor()`
Getter for dampingFactor
`double` `getDampingFactorDivisor()`
Getter for dampingFactorDivisor
`int` `getIterationsWithoutImprovement()`
Getter for iterationsWithoutImprovement
`int` `getMaxIterationsWithoutImprovement()`
Getter for maxIterationsWithoutImprovement
`protected boolean` `initializeAlgorithm()`
Called to initialize the learning algorithm's state based on the data that is stored in the data field.
`void` `setDampingFactor(double dampingFactor)`
Setter for dampingFactor
`void` `setDampingFactorDivisor(double dampingFactorDivisor)`
Setter for dampingFactorDivisor
`void` `setIterationsWithoutImprovement(int iterationsWithoutImprovement)`
Setter for iterationsWithoutImprovement
`void` `setMaxIterationsWithoutImprovement(int maxIterationsWithoutImprovement)`
Setter for maxIterationsWithoutImprovement
`protected boolean` `step()`
Called to take a single step of the learning algorithm.
• Methods inherited from class gov.sandia.cognition.learning.algorithm.regression.AbstractParameterCostMinimizer

`getCostFunction, getObjectToOptimize, getPerformance, getResult, getResultCost, getTolerance, setCostFunction, setObjectToOptimize, setResult, setResultCost, setTolerance`
• Methods inherited from class gov.sandia.cognition.learning.algorithm.AbstractAnytimeBatchLearner

`clone, getData, getKeepGoing, learn, setData, setKeepGoing, stop`
• Methods inherited from class gov.sandia.cognition.algorithm.AbstractAnytimeAlgorithm

`getMaxIterations, isResultValid, setMaxIterations`
• Methods inherited from class gov.sandia.cognition.algorithm.AbstractIterativeAlgorithm

`addIterativeAlgorithmListener, fireAlgorithmEnded, fireAlgorithmStarted, fireStepEnded, fireStepStarted, getIteration, getListeners, removeIterativeAlgorithmListener, setIteration, setListeners`
• Methods inherited from class java.lang.Object

`equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait`
• Methods inherited from interface gov.sandia.cognition.learning.algorithm.BatchCostMinimizationLearner

`learn`
• Methods inherited from interface gov.sandia.cognition.util.CloneableSerializable

`clone`
• Methods inherited from interface gov.sandia.cognition.algorithm.AnytimeAlgorithm

`getMaxIterations, setMaxIterations`
• Methods inherited from interface gov.sandia.cognition.algorithm.IterativeAlgorithm

`addIterativeAlgorithmListener, getIteration, removeIterativeAlgorithmListener`
• Methods inherited from interface gov.sandia.cognition.algorithm.StoppableAlgorithm

`isResultValid, stop`
• Field Detail

• DEFAULT_DAMPING

`public static final double DEFAULT_DAMPING`
Default initial value of the damping factor, 1.0
Constant Field Values
• DEFAULT_DAMPING_DIVISOR

`public static final double DEFAULT_DAMPING_DIVISOR`
Divisor of the damping factor on a successful (cost-reducing) iteration, 10.0
Constant Field Values
• DEFAULT_MAX_ITERATIONS_WITHOUT_IMPROVEMENT

`public static final int DEFAULT_MAX_ITERATIONS_WITHOUT_IMPROVEMENT`
Default maximum number of iterations without improvement before stopping, 4
Constant Field Values
• Constructor Detail

• LevenbergMarquardtEstimation

`public LevenbergMarquardtEstimation()`
Creates a new instance of LevenbergMarquardtEstimation
• LevenbergMarquardtEstimation

```public LevenbergMarquardtEstimation(double dampingFactor,
double dampingFactorDivisor,
int maxIterations,
int maxIterationsWithoutImprovement,
double tolerance)```
Creates a new instance of LevenbergMarquardtEstimation
Parameters:
`dampingFactor` - Initial damping factor for the ridge regression
`dampingFactorDivisor` - Divisor of the damping factor on a successful iteration, must be greater than 1.0, typically ~10.0
`maxIterations` - Maximum iterations before stopping
`maxIterationsWithoutImprovement` - Maximum number of sequential unsuccessful iterations (without a cost-reducing step) allowed before stopping
`tolerance` - Stopping criterion for the algorithm, typically ~1e-5
• Method Detail

• initializeAlgorithm

`protected boolean initializeAlgorithm()`
Description copied from class: `AbstractAnytimeBatchLearner`
Called to initialize the learning algorithm's state based on the data that is stored in the data field. The return value indicates if the algorithm can be run or not based on the initialization.
Specified by:
`initializeAlgorithm` in class `AbstractAnytimeBatchLearner<java.util.Collection<? extends InputOutputPair<? extends Vector,Vector>>,GradientDescendable>`
Returns:
True if the learning algorithm can be run and false if it cannot.
• step

`protected boolean step()`
Description copied from class: `AbstractAnytimeBatchLearner`
Called to take a single step of the learning algorithm.
Specified by:
`step` in class `AbstractAnytimeBatchLearner<java.util.Collection<? extends InputOutputPair<? extends Vector,Vector>>,GradientDescendable>`
Returns:
True if another step can be taken and false if the algorithm should halt.
• cleanupAlgorithm

`protected void cleanupAlgorithm()`
Description copied from class: `AbstractAnytimeBatchLearner`
Called to clean up the learning algorithm's state after learning has finished.
Specified by:
`cleanupAlgorithm` in class `AbstractAnytimeBatchLearner<java.util.Collection<? extends InputOutputPair<? extends Vector,Vector>>,GradientDescendable>`
• getIterationsWithoutImprovement

`public int getIterationsWithoutImprovement()`
Getter for iterationsWithoutImprovement
Returns:
Number of sequential unsuccessful iterations without a cost-reducing step
• setIterationsWithoutImprovement

`public void setIterationsWithoutImprovement(int iterationsWithoutImprovement)`
Setter for iterationsWithoutImprovement
Parameters:
`iterationsWithoutImprovement` - Number of sequential unsuccessful iterations without a cost-reducing step
• getMaxIterationsWithoutImprovement

`public int getMaxIterationsWithoutImprovement()`
Getter for maxIterationsWithoutImprovement
Returns:
Maximum number of iterations without improvement before stopping
• setMaxIterationsWithoutImprovement

`public void setMaxIterationsWithoutImprovement(int maxIterationsWithoutImprovement)`
Setter for maxIterationsWithoutImprovement
Parameters:
`maxIterationsWithoutImprovement` - Maximum number of iterations without improvement before stopping
• getDampingFactor

`public double getDampingFactor()`
Getter for dampingFactor
Returns:
Current damping factor for the ridge regression
• setDampingFactor

`public void setDampingFactor(double dampingFactor)`
Setter for dampingFactor
Parameters:
`dampingFactor` - Current damping factor for the ridge regression
• getDampingFactorDivisor

`public double getDampingFactorDivisor()`
Getter for dampingFactorDivisor
Returns:
Divisor of the damping factor on a successful iteration, must be greater than 1.0, typically ~10.0
• setDampingFactorDivisor

`public void setDampingFactorDivisor(double dampingFactorDivisor)`
Setter for dampingFactorDivisor
Parameters:
`dampingFactorDivisor` - Divisor of the damping factor on a successful iteration, must be greater than 1.0, typically ~10.0