The sherpa.optmethods.optfcts module
Optimizing functions.
These functions take a callback, the current set of parameters, the minimum and maximum parameter ranges, along with optional arguments, and return a tuple containing
status, parameters, statistic, message, dict
where status is a boolean indicating whether the optimisation succeeded, parameters is the list of parameter values at the best-fit location, statistic is the statistic value at that location, message is a string which gives information on the failure when status is False, and dict is a dictionary whose contents depend on the optimiser.
The callback should return the current statistic value and an array of the statistic value per bin.
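As a minimal sketch of that contract, a compatible callback evaluates the statistic for a trial parameter vector and returns both the total and the per-bin values (the data array and the sum-of-squares statistic here are hypothetical, chosen only to illustrate the shape of the interface):

```python
import numpy as np

data = np.asarray([1.0, 2.0, 3.0])  # hypothetical data values


def callback(pars):
    """Return (total statistic, per-bin statistic) for a constant model."""
    resid = data - pars[0]
    perbin = resid * resid          # per-bin statistic values
    return perbin.sum(), perbin     # total first, then the array


stat, perbin = callback([2.0])
print(stat)  # 2.0
```

The optimisers repeatedly call this function with candidate parameter vectors and use the returned values to decide where to search next.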
Notes
Each optimizer has certain classes of problem where it is more, or less, successful. For instance, the lmdif (Levenberg-Marquardt) function should only be used with chi-square based statistics, since it relies on the statistic being a sum of squared per-bin terms.
Examples
Fit a constant model to the array of values in y, using a least-square statistic:
>>> import numpy as np
>>> y = np.asarray([3, 2, 7])
>>> def cb(pars):
... 'Least-squares statistic value from fitting a constant model to y'
... dy = y - pars[0]
... dy *= dy
... return (dy.sum(), dy)
...
This can be evaluated using the neldermead optimiser, starting at a model value of 1 and bounded to the range 0 to 10000:
>>> res = neldermead(cb, [1], [0], [1e4])
>>> print(res)
(True, array([4.]), 14.0, 'Optimization terminated successfully', {'info': True, 'nfev': 98})
>>> print(f"Best-fit value: {res[1][0]}")
Best-fit value: 4.0
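The result can be checked without the optimiser: for a least-squares fit of a constant model, the best-fit value is the mean of the data, and the statistic at that point is the sum of squared residuals. A sketch using only NumPy reproduces the values above:

```python
import numpy as np

y = np.asarray([3, 2, 7])

# The least-squares best-fit constant is the mean of the data.
best = y.mean()
print(best)  # 4.0, matching res[1][0] above

# The statistic is the sum of squared residuals at that value.
stat = ((y - best) ** 2).sum()
print(stat)  # 14.0, matching the statistic returned by neldermead
```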
Functions

grid_search
    Grid Search optimization method.
lmdif
    Levenberg-Marquardt optimization method.
montecarlo
    Monte Carlo optimization method.
neldermead
    Nelder-Mead Simplex optimization method.