gov.sandia.cognition.learning.algorithm.minimization.line

## Class LineMinimizerBacktracking

• All Implemented Interfaces:
AnytimeAlgorithm<InputOutputPair<java.lang.Double,java.lang.Double>>, IterativeAlgorithm, StoppableAlgorithm, AnytimeBatchLearner<Evaluator<java.lang.Double,java.lang.Double>,InputOutputPair<java.lang.Double,java.lang.Double>>, BatchLearner<Evaluator<java.lang.Double,java.lang.Double>,InputOutputPair<java.lang.Double,java.lang.Double>>, FunctionMinimizer<java.lang.Double,java.lang.Double,Evaluator<java.lang.Double,java.lang.Double>>, LineMinimizer<Evaluator<java.lang.Double,java.lang.Double>>, CloneableSerializable, java.io.Serializable, java.lang.Cloneable

```@PublicationReference(author={"William H. Press","Saul A. Teukolsky","William T. Vetterling","Brian P. Flannery"},
title="Numerical Recipes in C, Second Edition",
type=Book,
year=1992,
pages={384,386},
url="http://www.nrbook.com/a/bookcpdf.php")
public class LineMinimizerBacktracking
extends AbstractAnytimeLineMinimizer<Evaluator<java.lang.Double,java.lang.Double>>```
Implementation of the backtracking line-minimization algorithm. This algorithm takes a full Newton step in the direction of a minimum and, if this does not find a sufficiently decreasing function evaluation, it then successively reduces the step length and tries again until a sufficiently decreasing value is found.
This is a very inexact line minimizer, but it appears to help the performance of both Quasi-Newton and Conjugate Gradient algorithms. In my tests, it appears to reduce the computation by about 30%-60%.
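The core idea described above can be shown as a standalone sketch (this is not the Foundry implementation, just an illustration of the technique): take the full step along the search direction, then geometrically shrink it until the sufficient-decrease (Armijo) condition holds.

```java
import java.util.function.DoubleUnaryOperator;

public class BacktrackingSketch {
    /**
     * Backtracking line search along a fixed direction.
     *
     * @param f          function value as a function of the step length t
     * @param slope      directional derivative of f at t = 0 (negative for descent)
     * @param fullStep   initial (full Newton) step length
     * @param shrink     geometric decrease factor, e.g. 0.5
     * @param sufficient sufficient-decrease constant in (0, 1)
     * @return a step length t satisfying f(t) <= f(0) + sufficient * t * slope
     */
    static double backtrack(DoubleUnaryOperator f, double slope,
            double fullStep, double shrink, double sufficient) {
        double f0 = f.applyAsDouble(0.0);
        double t = fullStep;
        // Shrink the step until the decrease is "sufficient" relative to
        // the decrease predicted by the slope at t = 0.
        while (f.applyAsDouble(t) > f0 + sufficient * t * slope) {
            t *= shrink;
        }
        return t;
    }

    public static void main(String[] args) {
        // Minimize f(x) = x^2 from x0 = 2 along direction d = -1,
        // so the line function is f(t) = (2 - t)^2 with slope f'(0) = -4.
        DoubleUnaryOperator f = t -> (2.0 - t) * (2.0 - t);
        double t = backtrack(f, -4.0, 8.0, 0.5, 0.5);
        System.out.println(t);
    }
}
```

Note that, as the class description says, this search only demands a sufficient decrease rather than an exact minimum, which is why it is inexpensive per outer iteration of a Quasi-Newton or Conjugate Gradient method.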
Since:
2.1
Author:
Kevin R. Dixon
Serialized Form
• ### Field Summary

Fields
Modifier and Type Field and Description
`static double` `DEFAULT_GEOMETRIC_DECREASE`
Default amount to decrease the step amount each iteration, 0.5.
`static boolean` `DEFAULT_NUMERICAL_DERIVATIVE`
Default flag to use numerical differentiation, true.
`static double` `DEFAULT_SUFFICIENT_DECREASE`
Default sufficient decrease value, 0.5.
`static double` `STEP_MAX`
Maximum step size allowed by a parabolic fit, 100.0.
• ### Fields inherited from class gov.sandia.cognition.learning.algorithm.minimization.line.AbstractAnytimeLineMinimizer

`DEFAULT_MAX_ITERATIONS, DEFAULT_TOLERANCE`
• ### Fields inherited from class gov.sandia.cognition.learning.algorithm.minimization.AbstractAnytimeFunctionMinimizer

`initialGuess, result, tolerance`
• ### Fields inherited from class gov.sandia.cognition.learning.algorithm.AbstractAnytimeBatchLearner

`data, keepGoing`
• ### Fields inherited from class gov.sandia.cognition.algorithm.AbstractAnytimeAlgorithm

`maxIterations`
• ### Fields inherited from class gov.sandia.cognition.algorithm.AbstractIterativeAlgorithm

`DEFAULT_ITERATION, iteration`
• ### Constructor Summary

Constructors
Constructor and Description
`LineMinimizerBacktracking()`
Creates a new instance of LineMinimizerBacktracking
`LineMinimizerBacktracking(double geometricDecrease)`
Creates a new instance of LineMinimizerBacktracking
```LineMinimizerBacktracking(double geometricDecrease, boolean numericalDerivative)```
Creates a new instance of LineMinimizerBacktracking
```LineMinimizerBacktracking(double geometricDecrease, boolean numericalDerivative, double sufficientDecrease)```
Creates a new instance of LineMinimizerBacktracking
• ### Method Summary

All Methods
Modifier and Type Method and Description
`boolean` `bracketingStep()`
Continues the bracketing phase of the algorithm, which attempts to place a bracket around a known minimum.
`double` `getGeometricDecrease()`
Getter for geometricDecrease.
`boolean` `getNumericalDerivative()`
Getter for numericalDerivative
`double` `getSufficientDecrease()`
Getter for sufficientDecrease
`protected boolean` `initializeAlgorithm()`
Called to initialize the learning algorithm's state based on the data that is stored in the data field.
`boolean` `sectioningStep()`
Continues the sectioning phase of the algorithm.
`void` `setGeometricDecrease(double geometricDecrease)`
Setter for geometricDecrease
`void` `setNumericalDerivative(boolean numericalDerivative)`
Setter for numericalDerivative
`void` `setSufficientDecrease(double sufficientDecrease)`
Setter for sufficientDecrease
• ### Methods inherited from class gov.sandia.cognition.learning.algorithm.minimization.line.AbstractAnytimeLineMinimizer

`cleanupAlgorithm, getBracket, getInitialGuessFunctionValue, getInitialGuessSlope, getInterpolator, isValidBracket, minimizeAlongDirection, setBracket, setData, setInitialGuess, setInitialGuessFunctionValue, setInitialGuessSlope, setInterpolator, setValidBracket, step`
• ### Methods inherited from class gov.sandia.cognition.learning.algorithm.minimization.AbstractAnytimeFunctionMinimizer

`getInitialGuess, getResult, getTolerance, setResult, setTolerance`
• ### Methods inherited from class gov.sandia.cognition.learning.algorithm.AbstractAnytimeBatchLearner

`clone, getData, getKeepGoing, learn, setKeepGoing, stop`
• ### Methods inherited from class gov.sandia.cognition.algorithm.AbstractAnytimeAlgorithm

`getMaxIterations, isResultValid, setMaxIterations`
• ### Methods inherited from class gov.sandia.cognition.algorithm.AbstractIterativeAlgorithm

`addIterativeAlgorithmListener, fireAlgorithmEnded, fireAlgorithmStarted, fireStepEnded, fireStepStarted, getIteration, getListeners, removeIterativeAlgorithmListener, setIteration, setListeners`
• ### Methods inherited from class java.lang.Object

`equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait`
• ### Methods inherited from interface gov.sandia.cognition.learning.algorithm.minimization.FunctionMinimizer

`getInitialGuess, getTolerance, learn, setTolerance`
• ### Methods inherited from interface gov.sandia.cognition.util.CloneableSerializable

`clone`
• ### Methods inherited from interface gov.sandia.cognition.algorithm.AnytimeAlgorithm

`getMaxIterations, getResult, setMaxIterations`
• ### Methods inherited from interface gov.sandia.cognition.algorithm.IterativeAlgorithm

`addIterativeAlgorithmListener, getIteration, removeIterativeAlgorithmListener`
• ### Methods inherited from interface gov.sandia.cognition.algorithm.StoppableAlgorithm

`isResultValid, stop`
• ### Field Detail

• #### STEP_MAX

`public static final double STEP_MAX`
Maximum step size allowed by a parabolic fit, 100.0.
Constant Field Values
• #### DEFAULT_SUFFICIENT_DECREASE

`public static final double DEFAULT_SUFFICIENT_DECREASE`
Default sufficient decrease value, 0.5.
Constant Field Values
• #### DEFAULT_GEOMETRIC_DECREASE

`public static final double DEFAULT_GEOMETRIC_DECREASE`
Default amount to decrease the step amount each iteration, 0.5.
Constant Field Values
• #### DEFAULT_NUMERICAL_DERIVATIVE

`public static final boolean DEFAULT_NUMERICAL_DERIVATIVE`
Default flag to use numerical differentiation, true.
Constant Field Values
• ### Constructor Detail

• #### LineMinimizerBacktracking

`public LineMinimizerBacktracking()`
Creates a new instance of LineMinimizerBacktracking
• #### LineMinimizerBacktracking

`public LineMinimizerBacktracking(double geometricDecrease)`
Creates a new instance of LineMinimizerBacktracking
Parameters:
`geometricDecrease` - Amount to decrease the step amount each iteration.
• #### LineMinimizerBacktracking

```public LineMinimizerBacktracking(double geometricDecrease,
boolean numericalDerivative)```
Creates a new instance of LineMinimizerBacktracking
Parameters:
`geometricDecrease` - Amount to decrease the step amount each iteration.
`numericalDerivative` - Flag indicating whether to use numerical differentiation.
• #### LineMinimizerBacktracking

```public LineMinimizerBacktracking(double geometricDecrease,
boolean numericalDerivative,
double sufficientDecrease)```
Creates a new instance of LineMinimizerBacktracking
Parameters:
`geometricDecrease` - Amount to decrease the step amount each iteration.
`numericalDerivative` - Flag indicating whether to use numerical differentiation.
`sufficientDecrease` - Sufficient decrease condition; must be in (0,1). Smaller values (0.1) result in more accurate searches, while larger values (0.9) tend to be easier to satisfy.
• ### Method Detail

• #### initializeAlgorithm

`protected boolean initializeAlgorithm()`
Description copied from class: `AbstractAnytimeBatchLearner`
Called to initialize the learning algorithm's state based on the data that is stored in the data field. The return value indicates if the algorithm can be run or not based on the initialization.
Overrides:
`initializeAlgorithm` in class `AbstractAnytimeLineMinimizer<Evaluator<java.lang.Double,java.lang.Double>>`
Returns:
True if the learning algorithm can be run and false if it cannot.
• #### bracketingStep

`public boolean bracketingStep()`
Description copied from interface: `LineMinimizer`
Continues the bracketing phase of the algorithm, which attempts to place a bracket around a known minimum.
Specified by:
`bracketingStep` in interface `LineMinimizer<Evaluator<java.lang.Double,java.lang.Double>>`
Specified by:
`bracketingStep` in class `AbstractAnytimeLineMinimizer<Evaluator<java.lang.Double,java.lang.Double>>`
Returns:
True if a valid bracket has been found, false to continue the bracketing phase of the algorithm.
• #### sectioningStep

`public boolean sectioningStep()`
Description copied from interface: `LineMinimizer`
Continues the sectioning phase of the algorithm. This phase occurs after the bracketing phase and attempts to shrink the size of the bracket (using a LineBracketInterpolator) until sufficient accuracy is attained.
Specified by:
`sectioningStep` in interface `LineMinimizer<Evaluator<java.lang.Double,java.lang.Double>>`
Specified by:
`sectioningStep` in class `AbstractAnytimeLineMinimizer<Evaluator<java.lang.Double,java.lang.Double>>`
Returns:
True that a valid minimum has been found, false to continue sectioning the bracket.
• #### getSufficientDecrease

`public double getSufficientDecrease()`
Getter for sufficientDecrease
Returns:
Sufficient decrease condition; must be in (0,1). Smaller values (0.1) result in more accurate searches, while larger values (0.9) tend to be easier to satisfy.
• #### setSufficientDecrease

`public void setSufficientDecrease(double sufficientDecrease)`
Setter for sufficientDecrease
Parameters:
`sufficientDecrease` - Sufficient decrease condition; must be in (0,1). Smaller values (0.1) result in more accurate searches, while larger values (0.9) tend to be easier to satisfy.
• #### getGeometricDecrease

`public double getGeometricDecrease()`
Getter for geometricDecrease.
Returns:
Amount to decrease the step amount each iteration.
• #### setGeometricDecrease

`public void setGeometricDecrease(double geometricDecrease)`
Setter for geometricDecrease
Parameters:
`geometricDecrease` - Amount to decrease the step amount each iteration.
• #### getNumericalDerivative

`public boolean getNumericalDerivative()`
Getter for numericalDerivative
Returns:
Flag indicating whether to use numerical differentiation.
• #### setNumericalDerivative

`public void setNumericalDerivative(boolean numericalDerivative)`
Setter for numericalDerivative
Parameters:
`numericalDerivative` - Flag indicating whether to use numerical differentiation.