optimset

Specify optimization function options.

Syntax

options = optimset('option1', value1, 'option2', value2, ...)

Inputs

optionN
The name of option N.
valueN
The value of option N.

Outputs

options
A struct containing the options.
The available options are as follows:
MaxIter
The maximum number of iterations allowed.
MaxFunEvals
The maximum number of function evaluations allowed.
MaxFail
The maximum number of failed sample evaluations.
TolFun
The termination tolerance on the objective function convergence.
TolX
The termination tolerance on the parameter convergence.
TolCon
The constraint violation allowance, as a percent.
TolKKT
The termination tolerance on the Karush-Kuhn-Tucker conditions.
GradObj
An 'on'/'off' flag to indicate whether the objective function will return the gradient as an optional second return value. (Only for fminunc, fmincon). The function signature is as follows: function [sys, grad] = System(x), where sys and grad contain the system function and its gradient.
GradConstr
An 'on'/'off' flag to indicate whether the non-linear constraint function will return the gradients as the optional third and fourth return values. If a non-linear constraint is used, then GradConstr must be set the same as GradObj. (Only for fmincon). The function signature is as follows: function [c, ceq, cj, ceqj] = ConFunc(x), where c and ceq contain inequality and equality constraints, respectively, and cj and ceqj contain their Jacobians. The inequality constraints are assumed to have upper bounds of 0.
Jacobian
An 'on'/'off' flag to indicate whether the objective function will return the Jacobian as an optional second return value. (Only for fsolve, lsqcurvefit). The function signature is as follows: function [res, jac] = System(x), where res and jac contain the residuals vector of the system function and its Jacobian.
Display
An 'iter'/'off' flag to indicate whether objective function results will be displayed at each iteration. For more extensive iteration information, see the output return argument of the optimization function.
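As an illustrative sketch of the GradObj and GradConstr signatures described above (the function bodies, names, and values are hypothetical, not part of the library):

```matlab
% Hypothetical objective that also returns its gradient,
% for use with optimset('GradObj', 'on').
function [sys, grad] = System(x)
    sys  = x(1)^2 + 3*x(2)^2;      % objective function value
    grad = [2*x(1); 6*x(2)];       % gradient of the objective
end

% Hypothetical non-linear constraint function that also returns
% Jacobians, for use with optimset('GradConstr', 'on').
% Inequality constraints are assumed to have upper bounds of 0.
function [c, ceq, cj, ceqj] = ConFunc(x)
    c    = x(1) + x(2) - 1;        % inequality constraint: c <= 0
    ceq  = [];                     % no equality constraints
    cj   = [1, 1];                 % Jacobian of the inequality constraint
    ceqj = [];                     % Jacobian of the equality constraints
end
```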

Examples

Set options to control the number of iterations and display intermediate data:

options = optimset('MaxIter', 200, 'Display', 'iter')
options = struct [
Display: iter
MaxIter: 200
]
Set options to specify that the analytical Jacobian is returned by the objective function:

options = optimset('Jacobian', 'on')
options = struct [
Jacobian: on
]
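The resulting options struct is typically passed as the final argument to a solver. A minimal sketch, assuming the solver accepts the struct in that position (as fminunc does) and using a hypothetical objective:

```matlab
% Allow more iterations and evaluations, and tighten the parameter
% tolerance (values are illustrative)
options = optimset('MaxIter', 500, 'MaxFunEvals', 2000, 'TolX', 1.0e-9);

% Hypothetical objective; pass options as the last solver argument
f = @(x) (x(1) - 1)^2 + (x(2) + 2)^2;
x = fminunc(f, [0, 0], options);
```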

Comments

For fminbnd, the only available tolerance option is TolX.

For fmincon, the available tolerance options are MaxFail, TolX, TolCon, and TolKKT.

For fminunc, the available tolerance options are TolFun and TolX.

For fminsearch, the only available tolerance option is TolX.

For fsolve, the available tolerance options are TolFun and TolX.

For fzero, the only available tolerance option is TolX.

For lsqcurvefit, the available tolerance options are TolFun and TolX.

The solver functions will terminate the first time that any of the convergence tolerance criteria are met.

The default value for MaxIter is 100.

The default value for MaxFunEvals is 400.

The default value for MaxFail is 20000.

The default value for TolFun is 1.0e-7.

The default value for TolX is 1.0e-3 for fmincon, and 1.0e-7 otherwise.

The default value for TolCon is 0.5.

The default value for TolKKT is 1.0e-4.

TolX, when used with fmincon, sets the convergence criteria relative to the design variable bounds. The TolX value is applied to the interval sizes as a scale factor. For all other functions, TolX sets the convergence criteria relative to the design variable magnitudes.

TolCon only applies when fmincon cannot find a feasible solution. In such cases, the function will return the best infeasible solution found within the allowed violation, along with a warning. The algorithm does not attempt to minimize the violation itself. The TolCon value is applied as a percent of the constraint bound, with an absolute minimum of 1.0e-4 applied when the bound is zero or near zero.

TolKKT sets the convergence criterion for the optimal relationship between the gradients of the objective and constraint functions, which is an equation involving Lagrange multipliers.
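For example, the fmincon tolerances above can be adjusted together in a single call (the values here are illustrative, not recommendations):

```matlab
% Allow up to 1% constraint violation when no feasible solution is
% found, and tighten the KKT convergence criterion
options = optimset('TolCon', 1.0, 'TolKKT', 1.0e-6);
```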