CRS

class CRS(max_evals=1000)
Controlled Random Search (CRS) with local mutation optimizer.
Controlled Random Search (CRS) with local mutation is part of the family of CRS optimizers. The CRS optimizers start with a random population of points and randomly evolve these points by heuristic rules. In the case of CRS with local mutation, the evolution is a randomized version of the NELDER_MEAD local optimizer.

NLopt global optimizer, derivative-free. For further detail, please refer to https://nlopt.readthedocs.io/en/latest/NLopt_Algorithms/#controlled-random-search-crs-with-local-mutation
Parameters
    max_evals (int) – Maximum allowed number of function evaluations.

Raises
    NameError – NLopt library not installed.
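A minimal usage sketch, assuming NLopt is installed, that optimize follows the shared Optimizer interface and returns a (point, value, nfev) tuple, and that the variable_bounds and initial_point parameter names and the import path (which differs across Qiskit versions) match your installation:

    import numpy as np
    from qiskit.aqua.components.optimizers import CRS  # import path is an assumption; adjust to your Qiskit version

    # Simple quadratic objective with its minimum at (1, -2).
    def objective(x):
        return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

    optimizer = CRS(max_evals=1000)

    # CRS is a global, derivative-free method, so finite variable bounds are supplied
    # and no gradient function is needed.
    point, value, nfev = optimizer.optimize(
        num_vars=2,
        objective_function=objective,
        variable_bounds=[(-5.0, 5.0), (-5.0, 5.0)],
        initial_point=np.array([0.0, 0.0]),
    )
    print(point, value, nfev)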
Attributes

CRS.bounds_support_level
    Returns bounds support level
CRS.gradient_support_level
    Returns gradient support level
CRS.initial_point_support_level
    Returns initial point support level
CRS.is_bounds_ignored
    Returns is bounds ignored
CRS.is_bounds_required
    Returns is bounds required
CRS.is_bounds_supported
    Returns is bounds supported
CRS.is_gradient_ignored
    Returns is gradient ignored
CRS.is_gradient_required
    Returns is gradient required
CRS.is_gradient_supported
    Returns is gradient supported
CRS.is_initial_point_ignored
    Returns is initial point ignored
CRS.is_initial_point_required
    Returns is initial point required
CRS.is_initial_point_supported
    Returns is initial point supported
CRS.setting
    Return setting
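These attributes come from the shared Optimizer interface. A short sketch of querying a few of them, assuming NLopt is installed so the class can be instantiated, and under the same import-path assumption as above:

    from qiskit.aqua.components.optimizers import CRS  # import path is an assumption; adjust to your Qiskit version

    optimizer = CRS(max_evals=500)

    # Each flag is a boolean derived from the optimizer's support-level settings.
    print(optimizer.is_gradient_ignored)        # does the optimizer ignore a supplied gradient?
    print(optimizer.is_bounds_supported)        # can variable bounds be passed to optimize()?
    print(optimizer.is_initial_point_required)  # must an initial point be supplied?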
Methods

CRS.get_nlopt_optimizer()
    Return NLopt optimizer type
CRS.get_support_level()
    Return support level dictionary
CRS.gradient_num_diff(x_center, f, epsilon)
    Compute the gradient around the point x_center by numeric differentiation, evaluated in parallel.
CRS.optimize(num_vars, objective_function[, …])
    Perform optimization.
CRS.print_options()
    Print algorithm-specific options.
CRS.set_max_evals_grouped(limit)
    Set max evals grouped
CRS.set_options(**kwargs)
    Sets or updates values in the options dictionary.
CRS.wrap_function(function, args)
    Wrap the function to implicitly inject the args at the call of the function.
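A small sketch of using the two static helpers together, assuming they behave as the shared Optimizer helpers do (wrap_function appends args to each call, gradient_num_diff takes finite differences) and under the same import-path assumption as above:

    import numpy as np
    from qiskit.aqua.components.optimizers import CRS  # import path is an assumption; adjust to your Qiskit version

    # Objective that takes an extra argument; wrap_function injects it at call time.
    def shifted_norm(x, shift):
        return float(np.sum((x - shift) ** 2))

    wrapped = CRS.wrap_function(shifted_norm, (np.array([1.0, -2.0]),))

    # Finite-difference gradient of the wrapped objective around the origin.
    grad = CRS.gradient_num_diff(np.zeros(2), wrapped, epsilon=1e-6)
    print(grad)  # approximately [-2., 4.]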