gov.sandia.cognition.learning.algorithm.minimization

• All Implemented Interfaces:
AnytimeAlgorithm<InputOutputPair<Vector,java.lang.Double>>, IterativeAlgorithm, StoppableAlgorithm, AnytimeBatchLearner<Evaluator<? super Vector,java.lang.Double>,InputOutputPair<Vector,java.lang.Double>>, BatchLearner<Evaluator<? super Vector,java.lang.Double>,InputOutputPair<Vector,java.lang.Double>>, FunctionMinimizer<Vector,java.lang.Double,Evaluator<? super Vector,java.lang.Double>>, CloneableSerializable, java.io.Serializable, java.lang.Cloneable

```@PublicationReference(author={"William H. Press","Saul A. Teukolsky","William T. Vetterling","Brian P. Flannery"},
    title="Numerical Recipes in C, Second Edition", type=Book, year=1992, pages={411,412},
    notes="Section 10.5", url="http://www.nrbook.com/a/bookcpdf.php")
@PublicationReference(author="Wikipedia", title="Nelder-Mead method", type=WebPage, year=2008,
    url="http://en.wikipedia.org/wiki/Nelder-Mead_method")
public class FunctionMinimizerNelderMead
extends AbstractAnytimeFunctionMinimizer<Vector,java.lang.Double,Evaluator<? super Vector,java.lang.Double>>```
Implementation of the Downhill Simplex minimization algorithm, also known as the Nelder-Mead method. It finds the minimum of a nonlinear function without using derivative information. In my experience, this method "gives up" easily: the simplex collapses and gets stuck on complicated nonlinear manifolds. I would recommend using Powell's method instead.
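For intuition about what the class does, here is a self-contained sketch of the Downhill Simplex iteration, independent of the Foundry API. It assumes the standard textbook coefficients (reflection 1.0, expansion 2.0, contraction 0.5, shrink 0.5) and a made-up quadratic objective; the Foundry implementation may differ in details such as its convergence test.

```java
import java.util.Arrays;
import java.util.Comparator;

public class NelderMeadSketch {

    // Example objective: f(x, y) = (x - 1)^2 + (y - 2)^2, minimum at (1, 2).
    static double f(double[] x) {
        double dx = x[0] - 1.0;
        double dy = x[1] - 2.0;
        return dx * dx + dy * dy;
    }

    // Returns c + t * (c - w): reflection (t = 1), expansion (t = 2),
    // inside contraction (t = -0.5).
    static double[] blend(double[] c, double[] w, double t) {
        double[] r = new double[c.length];
        for (int j = 0; j < c.length; j++) {
            r[j] = c[j] + t * (c[j] - w[j]);
        }
        return r;
    }

    static double[] minimize(double[] initialGuess, double offset,
            double tolerance, int maxIterations) {
        int n = initialGuess.length;

        // Initialize the simplex about the initial guess: the point itself
        // plus one vertex per dimension, offset along that axis.
        double[][] simplex = new double[n + 1][];
        simplex[0] = initialGuess.clone();
        for (int i = 0; i < n; i++) {
            simplex[i + 1] = initialGuess.clone();
            simplex[i + 1][i] += offset;
        }

        for (int iter = 0; iter < maxIterations; iter++) {
            // Order vertices from best (lowest f) to worst.
            Arrays.sort(simplex, Comparator.comparingDouble(NelderMeadSketch::f));
            double fBest = f(simplex[0]);
            double fSecond = f(simplex[n - 1]);
            double fWorst = f(simplex[n]);
            if (fWorst - fBest <= tolerance) {
                break;  // simplex has collapsed to within tolerance
            }

            // Centroid of all vertices except the worst.
            double[] centroid = new double[n];
            for (int i = 0; i < n; i++) {
                for (int j = 0; j < n; j++) {
                    centroid[j] += simplex[i][j] / n;
                }
            }

            double[] worst = simplex[n];
            double[] xr = blend(centroid, worst, 1.0);  // reflect worst vertex
            double fr = f(xr);
            if (fr < fBest) {
                // Reflection is the new best: try expanding further.
                double[] xe = blend(centroid, worst, 2.0);
                simplex[n] = (f(xe) < fr) ? xe : xr;
            } else if (fr < fSecond) {
                simplex[n] = xr;  // accept the reflected point
            } else {
                // Contract toward the centroid; shrink everything if that fails.
                double[] xc = blend(centroid, worst, -0.5);
                if (f(xc) < fWorst) {
                    simplex[n] = xc;
                } else {
                    // Shrink every vertex halfway toward the best vertex.
                    for (int i = 1; i <= n; i++) {
                        simplex[i] = blend(simplex[0], simplex[i], -0.5);
                    }
                }
            }
        }
        Arrays.sort(simplex, Comparator.comparingDouble(NelderMeadSketch::f));
        return simplex[0];
    }
}
```

The shrink branch is the failure mode the description warns about: once the simplex has shrunk onto a narrow curved valley, every reflection and contraction fails, the vertices collapse together, and the tolerance test stops the run far from the minimum.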
Since:
2.0
Author:
Kevin R. Dixon
Serialized Form
• ### Field Summary

Fields
Modifier and Type Field and Description
`static int` `DEFAULT_MAX_ITERATIONS`
Default max iterations, 4000
`static double` `DEFAULT_TOLERANCE`
Default tolerance, 0.001
• ### Fields inherited from class gov.sandia.cognition.learning.algorithm.minimization.AbstractAnytimeFunctionMinimizer

`initialGuess, result, tolerance`
• ### Fields inherited from class gov.sandia.cognition.learning.algorithm.AbstractAnytimeBatchLearner

`data, keepGoing`
• ### Fields inherited from class gov.sandia.cognition.algorithm.AbstractAnytimeAlgorithm

`maxIterations`
• ### Fields inherited from class gov.sandia.cognition.algorithm.AbstractIterativeAlgorithm

`DEFAULT_ITERATION, iteration`
• ### Constructor Summary

Constructors
Constructor and Description
`FunctionMinimizerNelderMead()`
Creates a new instance of FunctionMinimizerNelderMead
• ### Method Summary

Modifier and Type Method and Description
`protected void` `cleanupAlgorithm()`
Called to clean up the learning algorithm's state after learning has finished.
`protected Vector` `computeSimplexInputSum()`
Computes the sum of input values in the simplex
`protected boolean` `initializeAlgorithm()`
Called to initialize the learning algorithm's state based on the data that is stored in the data field.
`java.util.ArrayList<DefaultInputOutputPair<Vector,java.lang.Double>>` ```initializeSimplex(InputOutputPair<Vector,java.lang.Double> initialPoint, double offsetValue)```
Initializes the simplex about the initial point
`protected boolean` `step()`
Called to take a single step of the learning algorithm.
• ### Methods inherited from class gov.sandia.cognition.learning.algorithm.minimization.AbstractAnytimeFunctionMinimizer

`getInitialGuess, getResult, getTolerance, setInitialGuess, setResult, setTolerance`
• ### Methods inherited from class gov.sandia.cognition.learning.algorithm.AbstractAnytimeBatchLearner

`clone, getData, getKeepGoing, learn, setData, setKeepGoing, stop`
• ### Methods inherited from class gov.sandia.cognition.algorithm.AbstractAnytimeAlgorithm

`getMaxIterations, isResultValid, setMaxIterations`
• ### Methods inherited from class gov.sandia.cognition.algorithm.AbstractIterativeAlgorithm

`addIterativeAlgorithmListener, fireAlgorithmEnded, fireAlgorithmStarted, fireStepEnded, fireStepStarted, getIteration, getListeners, removeIterativeAlgorithmListener, setIteration, setListeners`
• ### Methods inherited from class java.lang.Object

`equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait`
• ### Methods inherited from interface gov.sandia.cognition.learning.algorithm.minimization.FunctionMinimizer

`learn`
• ### Methods inherited from interface gov.sandia.cognition.util.CloneableSerializable

`clone`
• ### Methods inherited from interface gov.sandia.cognition.algorithm.AnytimeAlgorithm

`getMaxIterations, setMaxIterations`
• ### Methods inherited from interface gov.sandia.cognition.algorithm.IterativeAlgorithm

`addIterativeAlgorithmListener, getIteration, removeIterativeAlgorithmListener`
• ### Methods inherited from interface gov.sandia.cognition.algorithm.StoppableAlgorithm

`isResultValid, stop`
• ### Field Detail

• #### DEFAULT_TOLERANCE

`public static final double DEFAULT_TOLERANCE`
Default tolerance, 0.001
Constant Field Values
• #### DEFAULT_MAX_ITERATIONS

`public static final int DEFAULT_MAX_ITERATIONS`
Default max iterations, 4000
Constant Field Values
• ### Constructor Detail

• #### FunctionMinimizerNelderMead

`public FunctionMinimizerNelderMead()`
Creates a new instance of FunctionMinimizerNelderMead
• ### Method Detail

• #### initializeAlgorithm

`protected boolean initializeAlgorithm()`
Description copied from class: `AbstractAnytimeBatchLearner`
Called to initialize the learning algorithm's state based on the data that is stored in the data field. The return value indicates if the algorithm can be run or not based on the initialization.
Specified by:
`initializeAlgorithm` in class `AbstractAnytimeBatchLearner<Evaluator<? super Vector,java.lang.Double>,InputOutputPair<Vector,java.lang.Double>>`
Returns:
True if the learning algorithm can be run and false if it cannot.
• #### initializeSimplex

```public java.util.ArrayList<DefaultInputOutputPair<Vector,java.lang.Double>> initializeSimplex(InputOutputPair<Vector,java.lang.Double> initialPoint, double offsetValue)```
Initializes the simplex about the initial point
Parameters:
`initialPoint` - Initial point about which to initialize the simplex
`offsetValue` - Value to use to spread out the vertices of the simplex
Returns:
Simplex
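A plausible reading of "spread out the vertices" (an assumption; the Foundry source may construct the simplex differently) is n + 1 vertices: the initial point itself, plus one copy per dimension offset by `offsetValue` along that axis. Sketched on plain arrays rather than Foundry `Vector`s:

```java
import java.util.ArrayList;

public class SimplexInitSketch {

    // Builds n + 1 simplex vertices about initialPoint: the point itself,
    // plus one vertex per dimension offset by offsetValue along that axis.
    static ArrayList<double[]> initializeSimplex(double[] initialPoint, double offsetValue) {
        ArrayList<double[]> simplex = new ArrayList<>();
        simplex.add(initialPoint.clone());
        for (int i = 0; i < initialPoint.length; i++) {
            double[] vertex = initialPoint.clone();
            vertex[i] += offsetValue;  // spread vertex i along axis i
            simplex.add(vertex);
        }
        return simplex;
    }
}
```

For a 2-D initial point this yields three vertices, the minimum needed for the simplex to span the search space.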
• #### step

`protected boolean step()`
Description copied from class: `AbstractAnytimeBatchLearner`
Called to take a single step of the learning algorithm.
Specified by:
`step` in class `AbstractAnytimeBatchLearner<Evaluator<? super Vector,java.lang.Double>,InputOutputPair<Vector,java.lang.Double>>`
Returns:
True if another step can be taken and false if the algorithm should halt.
• #### cleanupAlgorithm

`protected void cleanupAlgorithm()`
Description copied from class: `AbstractAnytimeBatchLearner`
Called to clean up the learning algorithm's state after learning has finished.
Specified by:
`cleanupAlgorithm` in class `AbstractAnytimeBatchLearner<Evaluator<? super Vector,java.lang.Double>,InputOutputPair<Vector,java.lang.Double>>`
• #### computeSimplexInputSum

`protected Vector computeSimplexInputSum()`
Computes the sum of input values in the simplex
Returns:
Sum of input values in the simplex
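A vertex sum like this is the natural building block for the reflection centroid: the centroid of the n non-worst vertices is (sum - worst) / n. How the Foundry actually consumes the sum internally is an assumption here; the arithmetic itself is just:

```java
public class SimplexCentroidSketch {

    // Centroid of all simplex vertices except the worst one, computed from
    // the precomputed input sum: (sum - worst) / n.
    static double[] centroidExcludingWorst(double[] inputSum, double[] worst, int n) {
        double[] centroid = new double[inputSum.length];
        for (int j = 0; j < inputSum.length; j++) {
            centroid[j] = (inputSum[j] - worst[j]) / n;
        }
        return centroid;
    }
}
```

Caching the full sum and subtracting one vertex is cheaper than re-summing n vertices on every step, which is presumably why the sum is exposed as its own method.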