AQGD
- class AQGD(maxiter=1000, eta=3.0, tol=1e-06, disp=False, momentum=0.25)
Analytic Quantum Gradient Descent (AQGD) optimizer.
Performs gradient descent optimization with a momentum term and analytic gradients for parametrized quantum gates, i.e., Pauli rotations. See, for example:
K. Mitarai, M. Negoro, M. Kitagawa, and K. Fujii. (2018). Quantum circuit learning. Phys. Rev. A 98, 032309. https://arxiv.org/abs/1803.00745
Maria Schuld, Ville Bergholm, Christian Gogolin, Josh Izaac, Nathan Killoran. (2019). Evaluating analytic gradients on quantum hardware. Phys. Rev. A 99, 032331. https://arxiv.org/abs/1811.11184
for further details on analytic gradients of parametrized quantum gates.
Gradients are computed “analytically” using the quantum circuit when evaluating the objective function.
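The "analytic" gradient referred to here is the parameter-shift rule: for a gate generated by a Pauli rotation, the expectation value is sinusoidal in the parameter, so the exact derivative can be obtained from two shifted evaluations. A minimal standalone sketch (not the Qiskit implementation; `math.sin` stands in for a circuit expectation value):

```python
import math

# Parameter-shift rule for Pauli rotations:
#   d/dtheta f(theta) = [f(theta + pi/2) - f(theta - pi/2)] / 2
def analytic_deriv(f, theta):
    return (f(theta + math.pi / 2) - f(theta - math.pi / 2)) / 2

# Stand-in for a circuit expectation value measured on hardware.
f = math.sin

theta = 0.3
# Matches the exact derivative cos(theta) to machine precision.
assert abs(analytic_deriv(f, theta) - math.cos(theta)) < 1e-12
```

Unlike finite differences, the shift here is large (pi/2), so the rule is exact for sinusoidal expectation values rather than an approximation.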
- Parameters
  - maxiter (int) – Maximum number of iterations; each iteration performs one gradient evaluation.
  - eta (float) – The coefficient of the gradient update. Increasing this value results in larger step sizes: `param = previous_param - eta * deriv`
  - tol (float) – The convergence criterion that must be reached before stopping. Optimization stops when: `absolute(loss - previous_loss) < tol`
  - disp (bool) – Set to True to display convergence messages.
  - momentum (float) – Bias towards the previous gradient momentum in the current update. Must be within the bounds: [0, 1)
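The roles of eta, momentum, tol, and maxiter can be illustrated with a hedged standalone sketch of momentum gradient descent (hypothetical code, not the class itself; the toy problem uses a smaller eta than the default for stability):

```python
def momentum_descent(loss, grad, params, eta=3.0, momentum=0.25,
                     tol=1e-6, maxiter=1000):
    """Gradient descent with a momentum term, mirroring the parameters above."""
    prev_loss = float("inf")
    mprev = [0.0] * len(params)  # previous momentum, one term per parameter
    for _ in range(maxiter):
        g = grad(params)
        # Bias the current update towards the previous momentum term.
        mprev = [momentum * m + (1 - momentum) * gj
                 for m, gj in zip(mprev, g)]
        # param = previous_param - eta * deriv
        params = [p - eta * m for p, m in zip(params, mprev)]
        cur = loss(params)
        if abs(cur - prev_loss) < tol:  # convergence criterion
            return params
        prev_loss = cur
    return params

# Minimize (x - 2)^2; a smaller eta keeps this toy problem stable.
result = momentum_descent(lambda p: (p[0] - 2) ** 2,
                          lambda p: [2 * (p[0] - 2)],
                          [0.0], eta=0.1)
```

With momentum in [0, 1), the update is a convex blend of the previous momentum and the fresh gradient, which damps oscillation without changing the fixed point.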
Attributes
- bounds_support_level – Returns the bounds support level
- gradient_support_level – Returns the gradient support level
- initial_point_support_level – Returns the initial point support level
- is_bounds_ignored – Returns whether bounds are ignored
- is_bounds_required – Returns whether bounds are required
- is_bounds_supported – Returns whether bounds are supported
- is_gradient_ignored – Returns whether the gradient is ignored
- is_gradient_required – Returns whether the gradient is required
- is_gradient_supported – Returns whether the gradient is supported
- is_initial_point_ignored – Returns whether the initial point is ignored
- is_initial_point_required – Returns whether the initial point is required
- is_initial_point_supported – Returns whether the initial point is supported
- setting – Return setting
Methods
- AQGD.converged(objval[, n]) – Determines if the objective function has converged by finding the difference between the current value and the previous n values.
- AQGD.deriv(j, params, obj) – Obtains the analytical quantum derivative of the objective function with respect to the jth parameter.
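A hedged sketch of what a per-parameter shift derivative might look like (standalone illustration, not the actual method; `obj` here is a hypothetical stand-in for a sum of Pauli-rotation expectation values):

```python
import math

def deriv_j(j, params, obj, shift=math.pi / 2):
    """Parameter-shift derivative of obj with respect to params[j]."""
    plus, minus = list(params), list(params)
    plus[j] += shift   # shift only the jth parameter
    minus[j] -= shift
    return (obj(plus) - obj(minus)) / 2

# Toy objective mimicking a sum of Pauli-rotation expectation values.
obj = lambda p: math.sin(p[0]) + math.cos(p[1])

# Exact derivative w.r.t. p[1] is -sin(p[1]).
d = deriv_j(1, [0.2, 0.5], obj)
assert abs(d - (-math.sin(0.5))) < 1e-12
```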
- AQGD.get_support_level() – Return support level dictionary.
- AQGD.gradient_num_diff(x_center, f, epsilon) – Computes the gradient of f numerically, in parallel, around the point x_center.
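The numeric fallback can be sketched with forward differences (a simplified sequential stand-in under the assumption that the real method batches its function evaluations for parallelism; `num_gradient` is a hypothetical name):

```python
def num_gradient(x_center, f, epsilon):
    """Forward-difference gradient of f around x_center (sequential sketch)."""
    f0 = f(x_center)
    grad = []
    for i in range(len(x_center)):
        shifted = list(x_center)
        shifted[i] += epsilon  # perturb one coordinate at a time
        grad.append((f(shifted) - f0) / epsilon)
    return grad

# Gradient of x0^2 + 3*x1 at (1, 2) is approximately (2, 3).
g = num_gradient([1.0, 2.0], lambda x: x[0] ** 2 + 3 * x[1], 1e-6)
```

The forward-difference error scales with epsilon, unlike the exact parameter-shift derivative used for Pauli rotations.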
- AQGD.optimize(num_vars, objective_function) – Perform optimization.
- AQGD.print_options() – Print algorithm-specific options.
- AQGD.set_max_evals_grouped(limit) – Set the maximum number of evaluations that may be grouped together.
- AQGD.set_options(**kwargs) – Sets or updates values in the options dictionary.
- AQGD.update(j, params, deriv, mprev) – Updates the jth parameter based on the derivative and previous momentum.
- AQGD.wrap_function(function, args) – Wrap the function to implicitly inject the args at the call of the function.
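Argument injection of this kind is essentially a closure over the fixed args. A hedged sketch of the idea (an illustration, not necessarily the library's exact behavior):

```python
def wrap_function(function, args):
    """Return a wrapper that appends the fixed args to every call."""
    def wrapper(*wrapper_args):
        return function(*wrapper_args, *args)
    return wrapper

def add(a, b, c):
    return a + b + c

# (10, 20) are injected implicitly on every call of the wrapper.
add_tail = wrap_function(add, (10, 20))
assert add_tail(1) == 31  # 1 + 10 + 20
```

This lets an optimizer pass around a single-argument objective while extra context (e.g. fixed data) rides along in the closure.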