NFT
- class NFT(maxiter=None, maxfev=1024, disp=False, reset_interval=32)
Nakanishi-Fujii-Todo algorithm.
See https://arxiv.org/abs/1903.12166
Built using the SciPy framework; for details, please refer to https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html.
- Parameters
maxiter (Optional[int]) – Maximum number of iterations to perform.
maxfev (int) – Maximum number of function evaluations to perform.
disp (bool) – Whether to print convergence messages.
reset_interval (int) – The minimum is re-estimated directly from the objective function once every reset_interval parameter updates, rather than carried over from the analytic update.
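A minimal construction sketch (not part of the original reference; the import path below assumes the qiskit.algorithms.optimizers module, while older releases expose the same class as qiskit.aqua.components.optimizers.NFT):

    from qiskit.algorithms.optimizers import NFT

    # Construct the optimizer with the documented defaults; see the
    # parameter descriptions above for the meaning of each argument.
    optimizer = NFT(maxiter=None, maxfev=1024, disp=False, reset_interval=32)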
Notes
In this optimization method, the objective function must satisfy the three conditions described in [1].
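As a brief orientation (paraphrasing [1]; the conditions themselves are not restated on this page): when every tunable gate is a rotation exp(-i θ P / 2) whose generator P squares to the identity, the cost restricted to any single parameter θ_j is a sinusoid,

    f(θ_j) = a cos(θ_j - b) + c,

so its exact minimum along that coordinate can be computed from a few function evaluations. NFT applies this coordinate-wise update sequentially over the parameters.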
References
[1] K. M. Nakanishi, K. Fujii, and S. Todo. 2019. Sequential minimal optimization for quantum-classical hybrid algorithms. arXiv preprint arXiv:1903.12166.
Attributes
bounds_support_level
Returns bounds support level
gradient_support_level
Returns gradient support level
initial_point_support_level
Returns initial point support level
is_bounds_ignored
Returns is bounds ignored
is_bounds_required
Returns is bounds required
is_bounds_supported
Returns is bounds supported
is_gradient_ignored
Returns is gradient ignored
is_gradient_required
Returns is gradient required
is_gradient_supported
Returns is gradient supported
is_initial_point_ignored
Returns is initial point ignored
is_initial_point_required
Returns is initial point required
is_initial_point_supported
Returns is initial point supported
setting
Return setting
Methods
NFT.get_support_level()
Return support level dictionary.
NFT.gradient_num_diff(x_center, f, epsilon)
Compute the gradient with numeric differentiation, in a parallel way, around the point x_center.
NFT.optimize(num_vars, objective_function[, …])
Perform optimization.
NFT.print_options()
Print algorithm-specific options.
NFT.set_max_evals_grouped(limit)
Set max evals grouped.
NFT.set_options(**kwargs)
Sets or updates values in the options dictionary.
NFT.wrap_function(function, args)
Wrap the function to implicitly inject the args at the call of the function.
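A usage sketch of the optimize() interface listed above. This is not taken from the page itself: it assumes the legacy Optimizer interface that returns a (point, value, nfev) tuple, and the toy objective is chosen to be a sinusoid in each parameter so the NFT coordinate update applies exactly.

    import numpy as np
    from qiskit.algorithms.optimizers import NFT

    # Toy objective: a sinusoid in each individual parameter, which is the
    # structure the NFT coordinate update exploits (see the Notes above).
    def objective(x):
        return np.cos(x[0]) + 2.0 * np.cos(x[1] - 0.5)

    optimizer = NFT(maxiter=10, maxfev=1024, reset_interval=32)
    point, value, nfev = optimizer.optimize(
        num_vars=2,
        objective_function=objective,
        initial_point=np.array([0.1, 0.1]),
    )
    # Expect a point near [pi, pi + 0.5] with a value near -3.0.
    print(point, value, nfev)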