@PublicationReference(author="R. Fletcher",title="Practical Methods of Optimization, Second Edition",type=Book,year=1987,pages=55,notes="Section 3.2, Equation 3.2.12") @PublicationReference(author="Wikipedia",title="BFGS method",type=WebPage,year=2008,url="http://en.wikipedia.org/wiki/BFGS_method") @PublicationReference(author={"William H. Press","Saul A. Teukolsky","William T. Vetterling","Brian P. Flannery"},title="Numerical Recipes in C, Second Edition",type=Book,year=1992,pages={428,429},notes="Section 10.7",url="http://www.nrbook.com/a/bookcpdf.php") public class FunctionMinimizerBFGS extends FunctionMinimizerQuasiNewton
Implementation of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) Quasi-Newton
minimization algorithm. BFGS can also be used with approximated gradients,
such as GradientDescendableApproximator (when used for parameter cost
minimization), when exact gradients are not available. Using an approximated
Jacobian tends to slow BFGS down by a factor of about 3, but it still appears
to generally outperform derivative-free minimization techniques, such as
Powell's direction-set method. Also, BFGS appears to outperform
Levenberg-Marquardt estimation for parameter estimation, in my experience.
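For orientation, here is a minimal sketch of how this minimizer is typically driven. The setter and learn methods below are listed on this page, but the package paths, the generic signature of the evaluator, and the InputOutputPair result type are assumptions about the surrounding library, not verbatim API; the evaluator f is left as a parameter.

```java
import gov.sandia.cognition.learning.algorithm.minimization.FunctionMinimizerBFGS;
import gov.sandia.cognition.learning.data.InputOutputPair;
import gov.sandia.cognition.math.DifferentiableEvaluator;
import gov.sandia.cognition.math.matrix.Vector;
import gov.sandia.cognition.math.matrix.VectorFactory;

public class BfgsSketch
{
    // f supplies both function values and gradients; with only approximate
    // gradients (e.g., GradientDescendableApproximator for parameter cost
    // minimization), expect roughly a 3x slowdown, as noted above.
    public static InputOutputPair<Vector, Double> minimize(
        DifferentiableEvaluator<? super Vector, Double, Vector> f)
    {
        FunctionMinimizerBFGS bfgs = new FunctionMinimizerBFGS();
        bfgs.setInitialGuess(VectorFactory.getDefault().createVector(2));
        bfgs.setTolerance(1e-10);   // typical value, per the constructor docs below
        bfgs.setMaxIterations(100); // typical value, per the constructor docs below
        return bfgs.learn(f);       // (location, value) of the minimum found
    }
}
```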
| Constructor and Description |
|---|
| FunctionMinimizerBFGS() Creates a new instance of FunctionMinimizerBFGS |
| FunctionMinimizerBFGS(LineMinimizer<?> lineMinimizer) Creates a new instance of FunctionMinimizerBFGS |
| FunctionMinimizerBFGS(LineMinimizer<?> lineMinimizer, Vector initialGuess, double tolerance, int maxIterations) Creates a new instance of FunctionMinimizerBFGS |
| Modifier and Type | Method and Description |
|---|---|
| static boolean | BFGSupdateRule(Matrix hessianInverse, Vector delta, Vector gamma, double tolerance) BFGS Quasi-Newton update rule |
| boolean | updateHessianInverse(Matrix hessianInverse, Vector delta, Vector gamma) The step that makes BFGS/DFP/SR1 different from each other. |
public FunctionMinimizerBFGS()
Creates a new instance of FunctionMinimizerBFGS
public FunctionMinimizerBFGS(LineMinimizer<?> lineMinimizer)
Parameters:
lineMinimizer - Work-horse algorithm that minimizes the function along a direction

public FunctionMinimizerBFGS(LineMinimizer<?> lineMinimizer, Vector initialGuess, double tolerance, int maxIterations)
Parameters:
lineMinimizer - Work-horse algorithm that minimizes the function along a direction
initialGuess - Initial guess about the minimum of the method
tolerance - Tolerance of the minimization algorithm, must be >= 0.0, typically ~1e-10
maxIterations - Maximum number of iterations, must be > 0, typically ~100
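As a concrete illustration of the four-argument constructor, here is a hedged sketch; LineMinimizerDerivativeBased and the package paths are assumptions about the surrounding library, not taken from this page.

```java
import gov.sandia.cognition.learning.algorithm.minimization.FunctionMinimizerBFGS;
import gov.sandia.cognition.learning.algorithm.minimization.line.LineMinimizerDerivativeBased;
import gov.sandia.cognition.math.matrix.Vector;
import gov.sandia.cognition.math.matrix.VectorFactory;

public class BfgsConstruction
{
    public static FunctionMinimizerBFGS newMinimizer()
    {
        // Start the search from the origin of a 2-dimensional input space.
        Vector initialGuess = VectorFactory.getDefault().createVector(2);

        return new FunctionMinimizerBFGS(
            new LineMinimizerDerivativeBased(), // assumed line-minimizer implementation
            initialGuess,
            1e-10,  // tolerance, typical per the docs above
            100);   // maxIterations, typical per the docs above
    }
}
```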
public boolean updateHessianInverse(Matrix hessianInverse, Vector delta, Vector gamma)
The step that makes BFGS/DFP/SR1 different from each other.
Overrides:
updateHessianInverse in class FunctionMinimizerQuasiNewton
Parameters:
hessianInverse - Current estimate of the Hessian inverse. Must be modified!
delta - Change in the search points (xnew - xold)
gamma - Change in the gradients (gnew - gold)

@PublicationReference(author="R. Fletcher", title="Practical Methods of Optimization, Second Edition", type=Book, year=1987, pages=55, notes="Section 3.2, Equation 3.2.12")
public static boolean BFGSupdateRule(Matrix hessianInverse, Vector delta, Vector gamma, double tolerance)
BFGS Quasi-Newton update rule
Parameters:
hessianInverse - Current Hessian inverse estimate, will be modified
delta - Change in the search points (xnew - xold)
gamma - Change in the gradients (gnew - gold)
tolerance - Tolerance of the algorithm
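To make the update rule concrete, here is a self-contained plain-Java sketch of the BFGS inverse-Hessian update that the Fletcher reference (Eq. 3.2.12) describes, using arrays rather than the Foundry's Matrix/Vector types. The guard against a tiny delta-gamma inner product is an assumption about how the tolerance is applied, not the Foundry's exact skip condition.

```java
public final class BfgsUpdateSketch
{
    // H := H + (1 + g'Hg / d'g) * (d d') / (d'g) - (d g'H + H g d') / (d'g)
    // where d = delta (change in search point), g = gamma (change in gradient),
    // and H is the current symmetric inverse-Hessian estimate.
    // Returns false (H unchanged) when d'g is too small for a stable update.
    public static boolean bfgsUpdateRule(
        double[][] H, double[] delta, double[] gamma, double tolerance)
    {
        final int n = delta.length;

        double deltaGamma = 0.0;                  // d'g
        for (int i = 0; i < n; i++) { deltaGamma += delta[i] * gamma[i]; }
        if (Math.abs(deltaGamma) <= tolerance) { return false; }

        double[] Hg = new double[n];              // H g (equals g'H by symmetry)
        for (int i = 0; i < n; i++)
        {
            for (int j = 0; j < n; j++) { Hg[i] += H[i][j] * gamma[j]; }
        }

        double gammaHGamma = 0.0;                 // g'Hg
        for (int i = 0; i < n; i++) { gammaHGamma += gamma[i] * Hg[i]; }

        // Rank-two correction applied in place.
        final double c = (1.0 + gammaHGamma / deltaGamma) / deltaGamma;
        for (int i = 0; i < n; i++)
        {
            for (int j = 0; j < n; j++)
            {
                H[i][j] += c * delta[i] * delta[j]
                    - (delta[i] * Hg[j] + Hg[i] * delta[j]) / deltaGamma;
            }
        }
        return true;
    }
}
```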