Class LBFGS

Implementation of the limited memory BFGS algorithm

Methods

LBFGS:new ([damping=1][, H0=1/75][, history=25][, discard="none"][, scaling="none"], ...) Instantiating a new LBFGS object.
LBFGS:reset () Reset the LBFGS object
LBFGS:correct_dF (dF) Normalize the parameter displacement to a given max-change.
LBFGS:add_history (F, G) Add the current optimization variable and the gradient variable to the history.
LBFGS:optimize (F, G) Perform an LBFGS step with input parameters F and gradient G
LBFGS:SIESTA (siesta) SIESTA function for performing a complete SIESTA LBFGS optimization.
LBFGS:info () Print information regarding the LBFGS object


Methods

LBFGS:new ([damping=1][, H0=1/75][, history=25][, discard="none"][, scaling="none"], ...)
Instantiating a new LBFGS object.

The LBFGS algorithm is a straightforward optimization algorithm which requires very few arguments for a successful optimization. The most important parameter is the initial Hessian value, which for large values (close to 1) may have difficulty converging because it is more aggressive (keeps more of the initial gradient). The default value is rather safe and should enable optimization on most systems.

This optimization method also implements a history-discard strategy, if needed, for possibly speeding up the convergence. A field in the argument table, discard, may be passed which takes one of:

  • "none", no discard strategy
  • "max-dF", if a displacement is being made beyond the max-displacement we do not store the step in the history

This optimization method also implements a scaling strategy, if needed, for possibly speeding up the convergence. A field in the argument table, scaling, may be passed which takes one of:

  • "none", no scaling strategy used
  • "initial", scale only the initial inverse Hessian and use that in all subsequent iterations
  • "every", scale for every step

Parameters:

  • damping number damping parameter for the parameter change (default 1)
  • H0 number initial Hessian value; larger values are safer but may take longer to converge (default 1/75)
  • history int number of previous steps used when calculating the new Hessian (default 25)
  • discard string method for discarding a previous history step (default "none")
  • scaling string method for scaling the inverse Hessian (default "none")
  • ... any arguments Optimizer:new accepts

Usage:

    lbfgs = LBFGS{<field1 = value>, <field2 = value>}
    while not lbfgs:optimized() do
       F = lbfgs:optimize(F, G)
    end
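
A more concrete sketch of the same pattern with explicit options is given below. It assumes the class is exposed through the flos library; the require path and namespace are assumptions that may need adjusting to your setup.

    local flos = require "flos"

    -- instantiate with explicit options; all fields are optional and
    -- default to the values listed under Parameters
    local lbfgs = flos.LBFGS{
       damping = 1.,        -- keep the full parameter change
       H0 = 1. / 75.,       -- conservative initial Hessian value
       history = 25,        -- number of previous steps retained
       discard = "max-dF",  -- drop history steps beyond the max-displacement
       scaling = "initial", -- scale only the initial inverse Hessian
    }
    lbfgs:info()

The optimization loop then proceeds exactly as in the Usage example above.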
LBFGS:reset ()
Reset the LBFGS object
LBFGS:correct_dF (dF)
Normalize the parameter displacement to a given max-change. The LBFGS algorithm always performs a global correction to maintain the minimization direction.

Parameters:

  • dF Array the parameter displacements that are to be normalized

Returns:

    the normalized dF according to the global or local correction
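
For illustration, a global correction amounts to rescaling the whole displacement by one common factor whenever any component exceeds the allowed maximum, so the minimization direction is preserved. Below is a hedged sketch in plain Lua; the actual LBFGS:correct_dF may differ in details, and max_change here is a hypothetical stand-in for the object's max-change setting.

    -- illustrative global max-change correction on a plain Lua table
    local function correct_dF_sketch(dF, max_change)
       -- largest absolute displacement component
       local dmax = 0.
       for i = 1, #dF do
          dmax = math.max(dmax, math.abs(dF[i]))
       end
       -- rescale by a single factor so the direction is unchanged
       if dmax > max_change then
          local scale = max_change / dmax
          for i = 1, #dF do
             dF[i] = dF[i] * scale
          end
       end
       return dF
    end

    print(table.concat(correct_dF_sketch({0.4, -1.2, 0.1}, 0.6), ", "))
    -- prints: 0.2, -0.6, 0.05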
LBFGS:add_history (F, G)
Add the current optimization variable and the gradient variable to the history. This function calculates the residuals and updates the kernel of the residual dot-product.

Parameters:

  • F Array the parameters for the function
  • G Array the gradient of the function with the parameters F
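
Roughly speaking, the history consists of displacement and gradient-difference residuals between consecutive calls. The following is a simplified sketch using plain Lua tables; the real method stores flos Arrays and additionally updates the dot-product kernel used by the update.

    local history = {}    -- list of {dF = ..., dG = ...} residual pairs
    local F_prev, G_prev  -- values from the previous call

    local function add_history_sketch(F, G)
       if F_prev then
          local dF, dG = {}, {}
          for i = 1, #F do
             dF[i] = F[i] - F_prev[i]  -- parameter residual
             dG[i] = G[i] - G_prev[i]  -- gradient residual
          end
          history[#history + 1] = {dF = dF, dG = dG}
       end
       F_prev, G_prev = F, G
    end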
LBFGS:optimize (F, G)
Perform an LBFGS step with input parameters F and gradient G.

Parameters:

  • F Array the parameters for the function
  • G Array the gradient for the function with parameters F

Returns:

    a new set of parameters which should converge towards a local minimum point
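
The step itself follows the standard L-BFGS two-loop recursion over the stored residuals. The sketch below illustrates that recursion in plain Lua for the algorithm in general, not the flos implementation; H0 is the initial inverse Hessian and history holds the residual pairs from add_history.

    local function dot(a, b)
       local s = 0.
       for i = 1, #a do s = s + a[i] * b[i] end
       return s
    end

    -- standard two-loop recursion: returns the search direction for a
    -- minimization (i.e. against the gradient)
    local function lbfgs_direction(G, history, H0)
       local q = {}
       for i = 1, #G do q[i] = G[i] end
       local alpha, rho = {}, {}
       -- first loop: newest to oldest history entry
       for k = #history, 1, -1 do
          local h = history[k]
          rho[k] = 1. / dot(h.dG, h.dF)
          alpha[k] = rho[k] * dot(h.dF, q)
          for i = 1, #q do q[i] = q[i] - alpha[k] * h.dG[i] end
       end
       -- apply the initial inverse Hessian
       for i = 1, #q do q[i] = H0 * q[i] end
       -- second loop: oldest to newest history entry
       for k = 1, #history do
          local h = history[k]
          local beta = rho[k] * dot(h.dG, q)
          for i = 1, #q do q[i] = q[i] + (alpha[k] - beta) * h.dF[i] end
       end
       -- step against the gradient (minimization)
       for i = 1, #q do q[i] = -q[i] end
       return q
    end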
LBFGS:SIESTA (siesta)
SIESTA function for performing a complete SIESTA LBFGS optimization.

This function will query these fdf-flags from SIESTA:

  • MD.MaxForceTol
  • MD.MaxCGDispl

and use those as the tolerance for convergence as well as the maximum displacement for each optimization step.

Everything else is controlled by the LBFGS object.

Note that all internal operations in this function rely on units being in:

  • Ang
  • eV
  • eV/Ang

Parameters:

  • siesta table the SIESTA global table.
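
A hedged sketch of hooking this method into a SIESTA run through the Lua interface follows. It assumes the flos library is installed and that siesta_comm is the function SIESTA calls on every Lua communication step; names may need adjusting to your setup.

    -- contents of the Lua file SIESTA is pointed to
    local flos = require "flos"

    local lbfgs = flos.LBFGS{H0 = 1. / 75.}

    function siesta_comm()
       -- let the LBFGS object handle the complete relaxation; convergence
       -- tolerance and max displacement are taken from MD.MaxForceTol and
       -- MD.MaxCGDispl as described above
       lbfgs:SIESTA(siesta)
    end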
LBFGS:info ()
Print information regarding the LBFGS object