gov.sandia.cognition.learning.algorithm.minimization

## Class FunctionMinimizerDirectionSetPowell

• All Implemented Interfaces:
AnytimeAlgorithm<InputOutputPair<Vector,java.lang.Double>>, IterativeAlgorithm, StoppableAlgorithm, AnytimeBatchLearner<Evaluator<? super Vector,java.lang.Double>,InputOutputPair<Vector,java.lang.Double>>, BatchLearner<Evaluator<? super Vector,java.lang.Double>,InputOutputPair<Vector,java.lang.Double>>, FunctionMinimizer<Vector,java.lang.Double,Evaluator<? super Vector,java.lang.Double>>, CloneableSerializable, java.io.Serializable, java.lang.Cloneable

@PublicationReference(author="R. Fletcher", title="Practical Methods of Optimization, Second Edition", type=Book, year=1987, pages={87,90}, notes="Section 4.2")
@PublicationReference(author={"William H. Press","Saul A. Teukolsky","William T. Vetterling","Brian P. Flannery"}, title="Numerical Recipes in C, Second Edition", type=Book, year=1992, pages={417,418}, notes="Section 10.5", url="http://www.nrbook.com/a/bookcpdf.php")
public class FunctionMinimizerDirectionSetPowell
extends AbstractAnytimeFunctionMinimizer<Vector,java.lang.Double,Evaluator<? super Vector,java.lang.Double>>
Implementation of the derivative-free unconstrained nonlinear direction-set minimization algorithm called "Powell's Method" by Numerical Recipes. The method was originally known as Smith's Direction-Set Method, to which Powell made an ingenious improvement; Powell's Method was later improved upon by Brent. This algorithm creates a basis set of search directions and repeatedly searches along each direction until a local minimum is found, using only function evaluations; that is, no gradient information is needed. This algorithm is amazingly good at finding a minimum, and is my method of choice for derivative-free minimization. However, be sure to check the performance of this algorithm's cousin, the Nelder-Mead downhill simplex (FunctionMinimizerNelderMead), before deciding on a derivative-free minimization algorithm.

That being said, it is sometimes more effective to use approximated gradients and algorithms like BFGS (FunctionMinimizerBFGS) than derivative-free minimization algorithms. Thus, I would try them both.
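To make the description above concrete, here is a rough, self-contained sketch of direction-set minimization in plain Java. This is not the Foundry implementation: the class and method names are hypothetical, the line searches use a simple golden-section search over an assumed fixed bracket of [-10, 10] (the real class delegates this to a pluggable LineMinimizer), and Powell's direction update is reduced to its simplest form.

```java
import java.util.function.DoubleUnaryOperator;
import java.util.function.Function;

public class PowellSketch {

    /** Golden-section search: minimizes g on [a, b] to within tol. */
    static double goldenSection(DoubleUnaryOperator g, double a, double b, double tol) {
        final double phi = (Math.sqrt(5.0) - 1.0) / 2.0; // inverse golden ratio
        double c = b - phi * (b - a);
        double d = a + phi * (b - a);
        while (Math.abs(b - a) > tol) {
            if (g.applyAsDouble(c) < g.applyAsDouble(d)) {
                b = d;
            } else {
                a = c;
            }
            c = b - phi * (b - a);
            d = a + phi * (b - a);
        }
        return 0.5 * (a + b);
    }

    /** Direction-set (Powell-style) minimization using only function evaluations. */
    static double[] minimize(Function<double[], Double> f, double[] initialGuess,
                             int maxIterations, double tolerance) {
        final int n = initialGuess.length;
        double[] x = initialGuess.clone();
        double[][] dirs = new double[n][n];            // start with the coordinate axes
        for (int i = 0; i < n; i++) {
            dirs[i][i] = 1.0;
        }
        for (int iter = 0; iter < maxIterations; iter++) {
            double[] xStart = x.clone();
            double fStart = f.apply(x);
            double biggestDrop = 0.0;
            int biggestIndex = 0;
            for (int i = 0; i < n; i++) {
                final double[] base = x.clone();
                final double[] dir = dirs[i];
                // 1-D slice of f along the current search direction
                DoubleUnaryOperator g = t -> {
                    double[] p = base.clone();
                    for (int k = 0; k < p.length; k++) {
                        p[k] += t * dir[k];
                    }
                    return f.apply(p);
                };
                double fBefore = f.apply(x);
                double t = goldenSection(g, -10.0, 10.0, tolerance); // assumed bracket
                for (int k = 0; k < n; k++) {
                    x[k] += t * dir[k];
                }
                double drop = fBefore - f.apply(x);
                if (drop > biggestDrop) {
                    biggestDrop = drop;
                    biggestIndex = i;
                }
            }
            if (fStart - f.apply(x) < tolerance) {
                break;                                  // sweep made no real progress
            }
            // Powell's update: replace the direction of largest decrease
            // with the net displacement of the whole sweep.
            double[] net = new double[n];
            for (int k = 0; k < n; k++) {
                net[k] = x[k] - xStart[k];
            }
            dirs[biggestIndex] = net;
        }
        return x;
    }
}
```

Note how no derivative of f ever appears: each sweep only evaluates the function along one direction at a time, which is exactly why this family of methods suits objectives whose gradients are unavailable or expensive.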
Since:
2.0
Author:
Kevin R. Dixon
Serialized Form
• ### Field Detail

• #### DEFAULT_MAX_ITERATIONS

public static final int DEFAULT_MAX_ITERATIONS
Default maximum number of iterations before stopping, 1000
Constant Field Values
• #### DEFAULT_TOLERANCE

public static final double DEFAULT_TOLERANCE
Default tolerance, 1.0E-5
Constant Field Values
• #### DEFAULT_LINE_MINIMIZER

public static final LineMinimizer<?> DEFAULT_LINE_MINIMIZER
Default line minimization algorithm, LineMinimizerDerivativeFree
• ### Constructor Detail

• #### FunctionMinimizerDirectionSetPowell

public FunctionMinimizerDirectionSetPowell()
Default constructor
• #### FunctionMinimizerDirectionSetPowell

public FunctionMinimizerDirectionSetPowell(LineMinimizer<?> lineMinimizer)
Creates a new instance of FunctionMinimizerDirectionSetPowell
Parameters:
lineMinimizer - Work-horse algorithm that minimizes the function along a direction
• #### FunctionMinimizerDirectionSetPowell

public FunctionMinimizerDirectionSetPowell(LineMinimizer<?> lineMinimizer,
Vector initialGuess,
double tolerance,
int maxIterations)
Creates a new instance of FunctionMinimizerDirectionSetPowell
Parameters:
lineMinimizer - Work-horse algorithm that minimizes the function along a direction
initialGuess - Initial guess about the minimum of the method
tolerance - Tolerance of the minimization algorithm, must be >= 0.0, typically ~1e-10
maxIterations - Maximum number of iterations, must be > 0, typically ~100
• ### Method Detail

• #### initializeAlgorithm

protected boolean initializeAlgorithm()
Description copied from class: AbstractAnytimeBatchLearner
Called to initialize the learning algorithm's state based on the data that is stored in the data field. The return value indicates if the algorithm can be run or not based on the initialization.
Specified by:
initializeAlgorithm in class AbstractAnytimeBatchLearner<Evaluator<? super Vector,java.lang.Double>,InputOutputPair<Vector,java.lang.Double>>
Returns:
True if the learning algorithm can be run and false if it cannot.
• #### getLineMinimizer

public LineMinimizer<?> getLineMinimizer()
Getter for lineMinimizer
Returns:
Work-horse algorithm that minimizes the function along a direction
• #### setLineMinimizer

public void setLineMinimizer(LineMinimizer<?> lineMinimizer)
Setter for lineMinimizer
Parameters:
lineMinimizer - Work-horse algorithm that minimizes the function along a direction