ADAM

class ADAM(maxiter=10000, tol=1e-06, lr=0.001, beta_1=0.9, beta_2=0.99, noise_factor=1e-08, eps=1e-10, amsgrad=False, snapshot_dir=None)

Bases: qiskit.algorithms.optimizers.optimizer.Optimizer
Adam and AMSGRAD optimizers.
Adam [1] is a gradient-based optimization algorithm that relies on adaptive estimates of lower-order moments. The algorithm requires little memory and is invariant to diagonal rescaling of the gradients. Furthermore, it is able to cope with non-stationary objective functions and with noisy and/or sparse gradients.
AMSGRAD [2] (a variant of Adam) uses a ‘long-term memory’ of past gradients and thereby improves convergence properties.
References
- [1]: Kingma, Diederik & Ba, Jimmy (2014), Adam: A Method for Stochastic Optimization.
- [2]: Sashank J. Reddi and Satyen Kale and Sanjiv Kumar (2018), On the Convergence of Adam and Beyond. arXiv:1904.09237
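As a rough illustration of the update rules from [1] and [2], here is a plain-NumPy sketch of a single step. It mirrors the hyperparameter names documented below, but it is only a sketch, not the class's actual implementation:

    import numpy as np

    def adam_step(params, grad, m, v, v_eff, t, lr=0.001, beta_1=0.9,
                  beta_2=0.99, noise_factor=1e-08, amsgrad=False):
        """One Adam/AMSGRAD update on a parameter vector."""
        t += 1
        m = beta_1 * m + (1 - beta_1) * grad       # first-moment estimate
        v = beta_2 * v + (1 - beta_2) * grad ** 2  # second-moment estimate
        # bias-corrected effective step size
        lr_eff = lr * np.sqrt(1 - beta_2 ** t) / (1 - beta_1 ** t)
        if amsgrad:
            v_eff = np.maximum(v_eff, v)           # the 'long-term memory' of [2]
            denom = np.sqrt(v_eff) + noise_factor
        else:
            denom = np.sqrt(v) + noise_factor
        params = params - lr_eff * m / denom
        return params, m, v, v_eff, t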
Note

This component uses functionality that is normally random. If you want to reproduce behavior, set the random number generator seed in the algorithm_globals (qiskit.utils.algorithm_globals.random_seed = seed).
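For instance, fixing the seed before constructing the optimizer makes a run repeatable:

    from qiskit.utils import algorithm_globals

    # Fix the global seed so any stochastic behavior is reproducible
    algorithm_globals.random_seed = 42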
Parameters

- maxiter (int) – Maximum number of iterations
- tol (float) – Tolerance for termination
- lr (float) – Value >= 0, learning rate
- beta_1 (float) – Value in range 0 to 1, generally close to 1
- beta_2 (float) – Value in range 0 to 1, generally close to 1
- noise_factor (float) – Value >= 0, noise factor
- eps (float) – Value >= 0, epsilon to be used for finite differences if no analytic gradient method is given
- amsgrad (bool) – True to use AMSGRAD, False if not
- snapshot_dir (Optional[str]) – If not None, save the optimizer's parameters after every step to the given directory
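A minimal usage sketch, assuming the minimize interface of qiskit.algorithms.optimizers; the toy quadratic objective and the hyperparameter values are illustrative, not recommendations:

    import numpy as np
    from qiskit.algorithms.optimizers import ADAM

    # Toy objective: a quadratic with its minimum at (1, -2)
    def objective(x):
        return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

    adam = ADAM(maxiter=200, lr=0.1, amsgrad=True)
    result = adam.minimize(fun=objective, x0=np.array([0.0, 0.0]))
    print(result.x, result.fun)  # best parameters found and the objective there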
Methods

- get_support_level: Return support level dictionary.
- gradient_num_diff: Compute the gradient by numeric differentiation, in parallel, around the point x_center.
- load_params: Load iteration parameters from a file called adam_params.csv.
- minimize: Run the minimization.
- optimize: Perform optimization.
- print_options: Print algorithm-specific options.
- save_params: Save the current iteration parameters to a file called adam_params.csv.
- set_max_evals_grouped: Set max evals grouped.
- set_options: Sets or updates values in the options dictionary.
- wrap_function: Wrap the function to implicitly inject the args at the call of the function.
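To illustrate the snapshot round trip, here is a hedged sketch, assuming load_params takes the directory that contains adam_params.csv; the directory path is made up:

    from qiskit.algorithms.optimizers import ADAM

    # snapshot_dir makes ADAM write adam_params.csv after every step
    adam = ADAM(maxiter=100, snapshot_dir="./adam_run")

    # ... later, restore the saved iteration state to resume or inspect a run
    adam.load_params("./adam_run")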
Attributes
-
bounds_support_level
¶ Returns bounds support level
-
gradient_support_level
¶ Returns gradient support level
-
initial_point_support_level
¶ Returns initial point support level
-
is_bounds_ignored
¶ Returns is bounds ignored
-
is_bounds_required
¶ Returns is bounds required
-
is_bounds_supported
¶ Returns is bounds supported
-
is_gradient_ignored
¶ Returns is gradient ignored
-
is_gradient_required
¶ Returns is gradient required
-
is_gradient_supported
¶ Returns is gradient supported
-
is_initial_point_ignored
¶ Returns is initial point ignored
-
is_initial_point_required
¶ Returns is initial point required
-
is_initial_point_supported
¶ Returns is initial point supported
-
setting
¶ Return setting
-
settings
¶ - Return type
Dict
[str
,Any
]
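As a quick illustration of the settings attribute, it is expected to expose the constructor arguments as a plain dictionary; the exact keys shown in the comment are an assumption:

    from qiskit.algorithms.optimizers import ADAM

    adam = ADAM(lr=0.01, amsgrad=True)
    # Expected to mirror the constructor arguments, e.g.
    # {'maxiter': 10000, 'tol': 1e-06, 'lr': 0.01, ..., 'amsgrad': True, ...}
    print(adam.settings)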