Interface Summary

ConvergenceChecker<PAIR> 
This interface specifies how to check if an optimization algorithm has
converged.

OptimizationData 
Marker interface.

OptimizationProblem<PAIR> 
Common settings for all optimization problems.

Class Summary

AbstractConvergenceChecker<PAIR> 
Base class for all convergence checker implementations.

AbstractOptimizationProblem<PAIR> 
Base class for implementing optimization problems.

BaseMultiStartMultivariateOptimizer<PAIR> 
Base class for multi-start optimizers of a multivariate function.

BaseMultivariateOptimizer<PAIR> 
Base class for implementing optimizers for multivariate functions.

BaseOptimizer<PAIR> 
Base class for implementing optimizers.

InitialGuess 
Starting point (first guess) of the optimization procedure.

MaxEval 
Maximum number of evaluations of the function to be optimized.

MaxIter 
Maximum number of iterations performed by an (iterative) algorithm.

PointValuePair 
This class holds a point and the value of an objective function at
that point.

PointVectorValuePair 
This class holds a point and the vectorial value of an objective function at
that point.

SimpleBounds 
Simple optimization constraints: lower and upper bounds.

SimplePointChecker<PAIR extends Pair<double[],? extends Object>>
Simple implementation of the ConvergenceChecker interface using only point coordinates.

SimpleValueChecker
Simple implementation of the ConvergenceChecker interface using only objective function values.

SimpleVectorValueChecker
Simple implementation of the ConvergenceChecker interface using only objective function values.
Generally, optimizers are algorithms that will either minimize or maximize a scalar function, called the objective function.
For some scalar objective functions the gradient can be computed (analytically
or numerically). Algorithms that use this knowledge are defined in the
org.apache.commons.math3.optim.nonlinear.scalar.gradient
package.
The algorithms that do not need this additional information are located in
the org.apache.commons.math3.optim.nonlinear.scalar.noderiv
package.
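As an illustrative sketch (assuming commons-math3 3.x on the classpath; the class name SimplexDemo and the quadratic objective are our own, not part of the library), a derivative-free minimization with the Nelder-Mead simplex from the noderiv package might look like:

```java
import org.apache.commons.math3.analysis.MultivariateFunction;
import org.apache.commons.math3.optim.InitialGuess;
import org.apache.commons.math3.optim.MaxEval;
import org.apache.commons.math3.optim.PointValuePair;
import org.apache.commons.math3.optim.nonlinear.scalar.GoalType;
import org.apache.commons.math3.optim.nonlinear.scalar.ObjectiveFunction;
import org.apache.commons.math3.optim.nonlinear.scalar.noderiv.NelderMeadSimplex;
import org.apache.commons.math3.optim.nonlinear.scalar.noderiv.SimplexOptimizer;

public class SimplexDemo {

    // Minimize f(x, y) = (x - 1)^2 + (y - 2)^2 without derivatives.
    static PointValuePair solve() {
        MultivariateFunction f = point -> {
            double dx = point[0] - 1.0;
            double dy = point[1] - 2.0;
            return dx * dx + dy * dy;
        };
        // Relative and absolute convergence thresholds on objective values.
        SimplexOptimizer optimizer = new SimplexOptimizer(1e-10, 1e-12);
        return optimizer.optimize(
                new MaxEval(1000),                       // evaluation budget
                new ObjectiveFunction(f),
                GoalType.MINIMIZE,
                new InitialGuess(new double[] { 0, 0 }), // starting point
                new NelderMeadSimplex(2));               // 2-dimensional simplex
    }

    public static void main(String[] args) {
        PointValuePair optimum = solve();
        System.out.printf("min at (%.4f, %.4f)%n",
                optimum.getPoint()[0], optimum.getPoint()[1]);
    }
}
```

Passing GoalType.MAXIMIZE instead would maximize the objective; the same optimize(OptimizationData...) calling convention applies to the gradient-based optimizers as well.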
Some problems are solved more efficiently by algorithms that, instead of an objective function, need access to a model function: such a model predicts a set of values which the algorithm tries to match with a set of given target values. Those algorithms are located in the org.apache.commons.math3.optim.nonlinear.vector package.
Algorithms that also require the Jacobian matrix of the model are located in the org.apache.commons.math3.optim.nonlinear.vector.jacobian package. The non-linear least-squares optimizers are a specialization of the latter, that minimize the distance (called cost or χ²) between model and observations.
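A least-squares fit with a user-supplied Jacobian might be sketched as follows (LeastSquaresDemo and the linear model are our illustrative choices; note that in later 3.x releases this vector API was deprecated in favor of org.apache.commons.math3.fitting.leastsquares):

```java
import org.apache.commons.math3.analysis.MultivariateMatrixFunction;
import org.apache.commons.math3.analysis.MultivariateVectorFunction;
import org.apache.commons.math3.optim.InitialGuess;
import org.apache.commons.math3.optim.MaxEval;
import org.apache.commons.math3.optim.PointVectorValuePair;
import org.apache.commons.math3.optim.nonlinear.vector.ModelFunction;
import org.apache.commons.math3.optim.nonlinear.vector.ModelFunctionJacobian;
import org.apache.commons.math3.optim.nonlinear.vector.Target;
import org.apache.commons.math3.optim.nonlinear.vector.Weight;
import org.apache.commons.math3.optim.nonlinear.vector.jacobian.LevenbergMarquardtOptimizer;

public class LeastSquaresDemo {

    // Fit y = a + b*x to three observations; exact solution is a = 1, b = 2.
    static final double[] X = { 0, 1, 2 };

    static double[] fit() {
        // Model predicts one value per abscissa.
        MultivariateVectorFunction model = p -> new double[] {
            p[0] + p[1] * X[0],
            p[0] + p[1] * X[1],
            p[0] + p[1] * X[2]
        };
        // Jacobian row i is (df_i/da, df_i/db) = (1, x_i).
        MultivariateMatrixFunction jacobian = p -> new double[][] {
            { 1, X[0] }, { 1, X[1] }, { 1, X[2] }
        };
        LevenbergMarquardtOptimizer optimizer = new LevenbergMarquardtOptimizer();
        PointVectorValuePair optimum = optimizer.optimize(
                new MaxEval(1000),
                new ModelFunction(model),
                new ModelFunctionJacobian(jacobian),
                new Target(new double[] { 1, 3, 5 }),   // observed values
                new Weight(new double[] { 1, 1, 1 }),   // equal weights
                new InitialGuess(new double[] { 0, 0 }));
        return optimum.getPoint();
    }

    public static void main(String[] args) {
        double[] p = fit();
        System.out.printf("a = %.4f, b = %.4f%n", p[0], p[1]);
    }
}
```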
For cases where the Jacobian cannot be provided, a utility class will convert a (vector) model into a (scalar) objective function.
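That utility class is LeastSquaresConverter, which wraps a vector model and its target values into a sum-of-squared-residuals scalar objective. A sketch (ConverterDemo and the two-component model are our own):

```java
import org.apache.commons.math3.analysis.MultivariateFunction;
import org.apache.commons.math3.analysis.MultivariateVectorFunction;
import org.apache.commons.math3.optim.InitialGuess;
import org.apache.commons.math3.optim.MaxEval;
import org.apache.commons.math3.optim.PointValuePair;
import org.apache.commons.math3.optim.nonlinear.scalar.GoalType;
import org.apache.commons.math3.optim.nonlinear.scalar.LeastSquaresConverter;
import org.apache.commons.math3.optim.nonlinear.scalar.ObjectiveFunction;
import org.apache.commons.math3.optim.nonlinear.scalar.noderiv.NelderMeadSimplex;
import org.apache.commons.math3.optim.nonlinear.scalar.noderiv.SimplexOptimizer;

public class ConverterDemo {

    static double[] solve() {
        // Vector model predicts (p0, p0 + p1); target values are (1, 3),
        // so the exact solution is p0 = 1, p1 = 2.
        MultivariateVectorFunction model = p -> new double[] { p[0], p[0] + p[1] };
        // Converter turns the vector model into a scalar sum of squared residuals.
        MultivariateFunction objective =
                new LeastSquaresConverter(model, new double[] { 1, 3 });
        SimplexOptimizer optimizer = new SimplexOptimizer(1e-10, 1e-12);
        PointValuePair optimum = optimizer.optimize(
                new MaxEval(1000),
                new ObjectiveFunction(objective),
                GoalType.MINIMIZE,
                new InitialGuess(new double[] { 0, 0 }),
                new NelderMeadSimplex(2));
        return optimum.getPoint();
    }

    public static void main(String[] args) {
        double[] p = solve();
        System.out.println(p[0] + " " + p[1]);
    }
}
```

Since the converted objective is an ordinary MultivariateFunction, any scalar optimizer (derivative-free here) can minimize it.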
This package provides common functionality for the optimization algorithms.
Abstract classes (BaseOptimizer and BaseMultivariateOptimizer) contain boilerplate code for storing evaluations and iterations counters and a user-defined convergence checker.
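As a sketch of that boilerplate in use (CounterDemo is our own name), a custom SimpleValueChecker can be passed to the optimizer's constructor, and the counters are available afterwards through getEvaluations() and getIterations():

```java
import org.apache.commons.math3.analysis.MultivariateFunction;
import org.apache.commons.math3.optim.InitialGuess;
import org.apache.commons.math3.optim.MaxEval;
import org.apache.commons.math3.optim.SimpleValueChecker;
import org.apache.commons.math3.optim.nonlinear.scalar.GoalType;
import org.apache.commons.math3.optim.nonlinear.scalar.ObjectiveFunction;
import org.apache.commons.math3.optim.nonlinear.scalar.noderiv.NelderMeadSimplex;
import org.apache.commons.math3.optim.nonlinear.scalar.noderiv.SimplexOptimizer;

public class CounterDemo {

    static int[] run() {
        // Converge when successive objective values agree to these tolerances.
        SimplexOptimizer optimizer =
                new SimplexOptimizer(new SimpleValueChecker(1e-9, 1e-9));
        MultivariateFunction f = p -> p[0] * p[0] + p[1] * p[1];
        optimizer.optimize(
                new MaxEval(500),
                new ObjectiveFunction(f),
                GoalType.MINIMIZE,
                new InitialGuess(new double[] { 3, -2 }),
                new NelderMeadSimplex(2));
        // Counters maintained by the BaseOptimizer boilerplate.
        return new int[] { optimizer.getEvaluations(), optimizer.getIterations() };
    }

    public static void main(String[] args) {
        int[] counts = run();
        System.out.println("evaluations: " + counts[0]
                + ", iterations: " + counts[1]);
    }
}
```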
For each of the optimizer types, there is a special implementation that wraps an optimizer instance and provides a "multi-start" feature: it calls the underlying optimizer several times with different starting points and returns the best optimum found, or all optima if so desired. This could be useful to avoid being trapped in a local extremum.
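For the scalar case that wrapper is MultiStartMultivariateOptimizer. A sketch (MultiStartDemo, the quartic objective, and the seed are our own illustrative choices): starting a local search at x = 1.3 would find only the local minimum near x ≈ 1.35, while random restarts also reach the global minimum near x ≈ -1.47.

```java
import org.apache.commons.math3.analysis.MultivariateFunction;
import org.apache.commons.math3.optim.InitialGuess;
import org.apache.commons.math3.optim.MaxEval;
import org.apache.commons.math3.optim.PointValuePair;
import org.apache.commons.math3.optim.nonlinear.scalar.GoalType;
import org.apache.commons.math3.optim.nonlinear.scalar.MultiStartMultivariateOptimizer;
import org.apache.commons.math3.optim.nonlinear.scalar.ObjectiveFunction;
import org.apache.commons.math3.optim.nonlinear.scalar.noderiv.NelderMeadSimplex;
import org.apache.commons.math3.optim.nonlinear.scalar.noderiv.SimplexOptimizer;
import org.apache.commons.math3.random.GaussianRandomGenerator;
import org.apache.commons.math3.random.JDKRandomGenerator;
import org.apache.commons.math3.random.RandomVectorGenerator;
import org.apache.commons.math3.random.UncorrelatedRandomVectorGenerator;

public class MultiStartDemo {

    static PointValuePair solve() {
        // Two basins: local minimum near x = 1.35, global one near x = -1.47.
        MultivariateFunction f = p -> {
            double x = p[0];
            return x * x * x * x - 4 * x * x + x;
        };
        SimplexOptimizer underlying = new SimplexOptimizer(1e-10, 1e-12);
        JDKRandomGenerator rng = new JDKRandomGenerator();
        rng.setSeed(42); // reproducible restarts
        // Gaussian starting points with mean 0 and standard deviation 2.
        RandomVectorGenerator starts = new UncorrelatedRandomVectorGenerator(
                new double[] { 0 },
                new double[] { 2 },
                new GaussianRandomGenerator(rng));
        MultiStartMultivariateOptimizer optimizer =
                new MultiStartMultivariateOptimizer(underlying, 10, starts);
        return optimizer.optimize(
                new MaxEval(5000),                        // budget across all restarts
                new ObjectiveFunction(f),
                GoalType.MINIMIZE,
                new InitialGuess(new double[] { 1.3 }),   // deliberately in the local basin
                new NelderMeadSimplex(1));
        }

    public static void main(String[] args) {
        PointValuePair best = solve();
        System.out.printf("best x = %.4f, f = %.4f%n",
                best.getPoint()[0], best.getValue());
    }
}
```

After optimize() returns the best optimum, getOptima() on the multi-start wrapper yields all the optima found, sorted from best to worst.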
Copyright © 2003–2016 The Apache Software Foundation. All rights reserved.