The sherpa.optmethods.optfcts module

Optimizing functions.

These functions take a callback, the starting parameter values, and the minimum and maximum parameter bounds, along with optional arguments, and return a tuple containing

status, parameters, statistic, message, dict

where status is a boolean indicating whether the optimisation succeeded, parameters is the array of parameter values at the best-fit location, statistic is the statistic value at that location, message is a string (when status is False it describes the failure), and the final element is a dictionary whose contents depend on the optimiser.

The callback is called with the current parameter values and should return the current statistic value together with an array of the per-bin statistic values.


Each optimizer has certain classes of problem where it is more, or less, successful. For instance, the lmdif function, being a least-squares method, should only be used with chi-square based statistics, whereas the simplex-based optimisers such as neldermead can be used with any statistic.
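For example, a Poisson-based statistic such as Cash is not built from squared residuals, but a callback for it still follows the same (statistic, per-bin values) contract. A minimal sketch, using the standard Cash form rather than any class from this module:

```python
import numpy as np

y = np.asarray([3, 2, 7])

def cash_cb(pars):
    """Cash-like statistic for a constant model: 2 * sum(model - data * log(model))."""
    model = np.full(y.shape, pars[0], dtype=float)
    # Per-bin contributions; note these are not squared residuals,
    # so they do not have the form a least-squares optimiser expects.
    perbin = 2 * (model - y * np.log(model))
    return perbin.sum(), perbin
```

For this data the statistic is minimised at the mean of y (4.0), the same location a simplex optimiser would find.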


Fit a constant model to the array of values in y, using a least-square statistic:

>>> import numpy as np
>>> y = np.asarray([3, 2, 7])
>>> def cb(pars):
...     'Least-squares statistic value from fitting a constant model to y'
...     dy = y - pars[0]
...     dy *= dy
...     return (dy.sum(), dy)

This can be evaluated using the neldermead optimiser, starting at a model value of 1 and bounded to the range 0 to 10000:

>>> from sherpa.optmethods.optfcts import neldermead
>>> res = neldermead(cb, [1], [0], [1e4])
>>> print(res)
(True, array([4.]), 14.0, 'Optimization terminated successfully', {'info': True, 'nfev': 98})
>>> print(f"Best-fit value: {res[1][0]}")
Best-fit value: 4.0
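The five-element return tuple can be unpacked directly. A sketch of the contract, reproducing the result shown above as a literal rather than re-running the optimiser:

```python
import numpy as np

# The result tuple from the example above, written out as a literal.
res = (True, np.array([4.0]), 14.0,
       'Optimization terminated successfully',
       {'info': True, 'nfev': 98})

status, bestfit, statval, msg, extra = res
if not status:
    # msg describes the failure when status is False
    raise RuntimeError(msg)

print(bestfit[0], statval)  # best-fit parameter and statistic value
```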


difevo(fcn, x0, xmin, xmax[, ftol, maxfev, ...])

difevo_lm(fcn, x0, xmin, xmax[, ftol, ...])

difevo_nm(fcn, x0, xmin, xmax, ftol, maxfev, ...)

grid_search(fcn, x0, xmin, xmax[, num, ...])

Grid Search optimization method.

lmdif(fcn, x0, xmin, xmax[, ftol, xtol, ...])

Levenberg-Marquardt optimization method.

minim(fcn, x0, xmin, xmax[, ftol, maxfev, ...])

montecarlo(fcn, x0, xmin, xmax[, ftol, ...])

Monte Carlo optimization method.

neldermead(fcn, x0, xmin, xmax[, ftol, ...])

Nelder-Mead Simplex optimization method.
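Conceptually, grid_search evaluates the callback at points spread over the parameter range and keeps the best one. A simplified pure-NumPy sketch of that idea, not Sherpa's implementation (the num keyword here mirrors the parameter name in the signature above):

```python
import numpy as np

y = np.asarray([3, 2, 7])

def cb(pars):
    """Least-squares statistic for a constant model, as in the earlier example."""
    dy = (y - pars[0]) ** 2
    return dy.sum(), dy

def toy_grid_search(fcn, xmin, xmax, num=16):
    """Evaluate fcn on a regular 1D grid and return the best point found."""
    grid = np.linspace(xmin[0], xmax[0], num)
    stats = [fcn([g])[0] for g in grid]
    best = int(np.argmin(stats))
    return grid[best], stats[best]

par, stat = toy_grid_search(cb, [0], [10], num=101)
```

With a fine enough grid this recovers the same best-fit value (4.0) and statistic (14.0) as the neldermead example above, at the cost of many more function evaluations.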