esys.downunder.minimizers Package

Classes

class esys.downunder.minimizers.AbstractMinimizer(J=None, m_tol=0.0001, J_tol=None, imax=300)

Bases: object

Base class for function minimization methods.

getCostFunction()

Returns the cost function to be minimized.

Return type: CostFunction

getOptions()

Returns a dictionary of minimizer-specific options.

getResult()

Returns the result of the minimization.

logSummary()

Outputs a summary of the completed minimization process to the logger.

run(x0)

Executes the minimization algorithm for f starting with the initial guess x0.

Returns: the result of the minimization

setCallback(callback)

Sets a callback function to be called after every iteration. The arguments to the function are: (k, x, Jx, g_Jxx), where k is the current iteration, x is the current estimate, Jx=f(x) and g_Jxx=grad f(x).
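
For example, a callback that records the convergence history could look like this (the names here are illustrative, not part of the escript API):

```python
# Hypothetical callback with the (k, x, Jx, g_Jxx) signature described above;
# it simply records the iteration number and cost-function value.
history = []

def progress_callback(k, x, Jx, g_Jxx):
    # k: current iteration, x: current estimate,
    # Jx = f(x), g_Jxx = grad f(x)
    history.append((k, Jx))
```

It would then be registered with minimizer.setCallback(progress_callback).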

setCostFunction(J)

Sets the cost function to be minimized.

Parameters: J (CostFunction) – the cost function to be minimized

setMaxIterations(imax)

Sets the maximum number of iterations before the minimizer terminates.

setOptions(**opts)

Sets minimizer-specific options. For a list of possible options see getOptions().

setTolerance(m_tol=0.0001, J_tol=None)

Sets the tolerance for the stopping criterion. The minimizer stops when an appropriate norm is less than m_tol.

class esys.downunder.minimizers.MinimizerBFGS(J=None, m_tol=0.0001, J_tol=None, imax=300)

Bases: esys.downunder.minimizers.AbstractMinimizer

Minimizer that uses the Broyden-Fletcher-Goldfarb-Shanno method.
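
As an illustration of the update at the heart of this method, here is a minimal pure-Python sketch of BFGS with a backtracking Armijo line search on a hypothetical 2D quadratic. The names and the test problem are illustrative only; the real class operates on escript data objects and uses the strong-Wolfe line_search routine documented below.

```python
def f(x):
    # illustrative quadratic: f(x) = 0.5*(x0^2 + 10*x1^2), minimum at the origin
    return 0.5 * (x[0] ** 2 + 10.0 * x[1] ** 2)

def grad_f(x):
    return [x[0], 10.0 * x[1]]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def bfgs(x, imax=300, m_tol=1e-9):
    n = len(x)
    H = [[float(i == j) for j in range(n)] for i in range(n)]  # inverse-Hessian estimate
    g = grad_f(x)
    for _ in range(imax):
        if max(abs(gi) for gi in g) < m_tol:
            break
        p = [-dot(H[i], g) for i in range(n)]   # search direction p = -H g
        alpha, c1 = 1.0, 1e-4                   # backtracking Armijo line search
        while f([xi + alpha * pi for xi, pi in zip(x, p)]) > f(x) + c1 * alpha * dot(g, p):
            alpha *= 0.5
        x_new = [xi + alpha * pi for xi, pi in zip(x, p)]
        g_new = grad_f(x_new)
        s = [a - b for a, b in zip(x_new, x)]   # step taken
        y = [a - b for a, b in zip(g_new, g)]   # change in gradient
        rho = 1.0 / dot(y, s)
        # BFGS update: H <- (I - rho*s*y^T) H (I - rho*y*s^T) + rho*s*s^T
        E = [[float(i == j) - rho * s[i] * y[j] for j in range(n)] for i in range(n)]
        Et = [[float(i == j) - rho * y[i] * s[j] for j in range(n)] for i in range(n)]
        H = matmul(matmul(E, H), Et)
        for i in range(n):
            for j in range(n):
                H[i][j] += rho * s[i] * s[j]
        x, g = x_new, g_new
    return x
```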

getCostFunction()

Returns the cost function to be minimized.

Return type: CostFunction

getOptions()

Returns a dictionary of minimizer-specific options.

getResult()

Returns the result of the minimization.

logSummary()

Outputs a summary of the completed minimization process to the logger.

run(x)

Executes the minimization algorithm starting with the initial guess x.

setCallback(callback)

Sets a callback function to be called after every iteration. The arguments to the function are: (k, x, Jx, g_Jxx), where k is the current iteration, x is the current estimate, Jx=f(x) and g_Jxx=grad f(x).

setCostFunction(J)

Sets the cost function to be minimized.

Parameters: J (CostFunction) – the cost function to be minimized

setMaxIterations(imax)

Sets the maximum number of iterations before the minimizer terminates.

setOptions(**opts)

Sets minimizer-specific options. For a list of possible options see getOptions().

setTolerance(m_tol=0.0001, J_tol=None)

Sets the tolerance for the stopping criterion. The minimizer stops when an appropriate norm is less than m_tol.

class esys.downunder.minimizers.MinimizerException

Bases: exceptions.Exception

This is a generic exception thrown by a minimizer.

args
message

class esys.downunder.minimizers.MinimizerIterationIncurableBreakDown

Bases: esys.downunder.minimizers.MinimizerException

Exception thrown if the iteration scheme encountered an incurable breakdown.

args
message

class esys.downunder.minimizers.MinimizerLBFGS(J=None, m_tol=0.0001, J_tol=None, imax=300)

Bases: esys.downunder.minimizers.AbstractMinimizer

Minimizer that uses the limited-memory Broyden-Fletcher-Goldfarb-Shanno method.
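
The difference from full BFGS is that the inverse Hessian is never stored densely; the search direction is assembled from the most recent (s, y) pairs via the classic two-loop recursion, sketched below in plain Python. The names are hypothetical, not escript API:

```python
def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def two_loop(g, s_list, y_list):
    """Return H*g, where H is the implicit L-BFGS inverse-Hessian estimate
    built from the stored step vectors s_i and gradient changes y_i."""
    q = list(g)
    alphas = []
    # first loop: newest pair to oldest
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / dot(y, s)
        a = rho * dot(s, q)
        alphas.append(a)
        q = [qi - a * yi for qi, yi in zip(q, y)]
    # initial scaling H0 = gamma * I from the most recent pair
    s, y = s_list[-1], y_list[-1]
    gamma = dot(s, y) / dot(y, y)
    r = [gamma * qi for qi in q]
    # second loop: oldest pair to newest
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
        rho = 1.0 / dot(y, s)
        b = rho * dot(y, r)
        r = [ri + si * (a - b) for ri, si in zip(r, s)]
    return r
```

With a single stored pair s = [2, 0], y = [1, 0], the recursion maps g = [1, 0] to [2, 0], i.e. it has absorbed the observed curvature along that direction.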

getCostFunction()

Returns the cost function to be minimized.

Return type: CostFunction

getOptions()

Returns a dictionary of minimizer-specific options.

getResult()

Returns the result of the minimization.

logSummary()

Outputs a summary of the completed minimization process to the logger.

run(x)

Executes the minimization algorithm starting with the initial guess x.

setCallback(callback)

Sets a callback function to be called after every iteration. The arguments to the function are: (k, x, Jx, g_Jxx), where k is the current iteration, x is the current estimate, Jx=f(x) and g_Jxx=grad f(x).

setCostFunction(J)

Sets the cost function to be minimized.

Parameters: J (CostFunction) – the cost function to be minimized

setMaxIterations(imax)

Sets the maximum number of iterations before the minimizer terminates.

setOptions(**opts)

Sets minimizer-specific options. For a list of possible options see getOptions().

setTolerance(m_tol=0.0001, J_tol=None)

Sets the tolerance for the stopping criterion. The minimizer stops when an appropriate norm is less than m_tol.

class esys.downunder.minimizers.MinimizerMaxIterReached

Bases: esys.downunder.minimizers.MinimizerException

Exception thrown if the maximum number of iteration steps is reached.

args
message

class esys.downunder.minimizers.MinimizerNLCG(J=None, m_tol=0.0001, J_tol=None, imax=300)

Bases: esys.downunder.minimizers.AbstractMinimizer

Minimizer that uses the nonlinear conjugate gradient method (Fletcher-Reeves variant).
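
A minimal pure-Python sketch of the Fletcher-Reeves iteration on a hypothetical 2D quadratic follows; it is illustrative only, not the escript implementation (which works on escript data objects and uses the line search documented below). Because the test function is quadratic, an exact line search is available in closed form.

```python
# illustrative problem: f(x) = 0.5 * x^T A x with A = diag(1, 10)
A = [1.0, 10.0]

def grad_f(x):
    return [A[i] * x[i] for i in range(2)]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def nlcg(x, imax=20, tol=1e-10):
    g = grad_f(x)
    d = [-gi for gi in g]               # initial direction: steepest descent
    for _ in range(imax):
        if max(abs(gi) for gi in g) < tol:
            break
        # exact line search for a quadratic: alpha = -g.d / (d^T A d)
        alpha = -dot(g, d) / sum(A[i] * d[i] * d[i] for i in range(2))
        x = [x[i] + alpha * d[i] for i in range(2)]
        g_new = grad_f(x)
        # Fletcher-Reeves update: beta = |g_new|^2 / |g|^2
        beta = dot(g_new, g_new) / dot(g, g)
        d = [-g_new[i] + beta * d[i] for i in range(2)]
        g = g_new
    return x

# CG theory: with an exact line search on an n-dimensional quadratic,
# the iteration reaches the minimizer in at most n steps.
```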

getCostFunction()

Returns the cost function to be minimized.

Return type: CostFunction

getOptions()

Returns a dictionary of minimizer-specific options.

getResult()

Returns the result of the minimization.

logSummary()

Outputs a summary of the completed minimization process to the logger.

run(x)

Executes the minimization algorithm starting with the initial guess x.

setCallback(callback)

Sets a callback function to be called after every iteration. The arguments to the function are: (k, x, Jx, g_Jxx), where k is the current iteration, x is the current estimate, Jx=f(x) and g_Jxx=grad f(x).

setCostFunction(J)

Sets the cost function to be minimized.

Parameters: J (CostFunction) – the cost function to be minimized

setMaxIterations(imax)

Sets the maximum number of iterations before the minimizer terminates.

setOptions(**opts)

Sets minimizer-specific options. For a list of possible options see getOptions().

setTolerance(m_tol=0.0001, J_tol=None)

Sets the tolerance for the stopping criterion. The minimizer stops when an appropriate norm is less than m_tol.

Functions

esys.downunder.minimizers.Lsup(arg)

Returns the Lsup-norm of argument arg. This is the maximum absolute value over all data points. This function is equivalent to sup(abs(arg)).

Parameters: arg (float, int, escript.Data, numpy.ndarray) – argument
Returns: maximum of the absolute values of arg over all components and all data points
Return type: float
Raises: TypeError – if the type of arg cannot be processed
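
For plain Python numbers and sequences of numbers, the behaviour can be sketched as follows (the name lsup and the fallback handling are illustrative; the real function also handles escript.Data and numpy.ndarray arguments):

```python
# Hypothetical pure-Python equivalent of Lsup for flat sequences and scalars.
def lsup(arg):
    try:
        return max(abs(v) for v in arg)  # sequence of numbers
    except TypeError:
        return abs(float(arg))           # plain float/int
```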
esys.downunder.minimizers._zoom(phi, gradphi, phiargs, alpha_lo, alpha_hi, phi_lo, phi_hi, c1, c2, phi0, gphi0, IMAX=25)

Helper function for line_search below which tries to tighten the range alpha_lo...alpha_hi. See Chapter 3 of ‘Numerical Optimization’ by J. Nocedal for an explanation.

esys.downunder.minimizers.line_search(f, x, p, g_Jx, Jx, alpha_truncationax, c1, c2, IMAX)

Line search method that satisfies the strong Wolfe conditions. See Chapter 3 of ‘Numerical Optimization’ by J. Nocedal for an explanation.

Parameters:
  • f – callable objective function f(x)
  • x – start value for the line search
  • p – search direction
  • g_Jx – value of the gradient of f at x
  • Jx – value of f(x)
  • alpha_truncationax – algorithm terminates if alpha reaches this value
  • c1 – value for the Armijo condition (see reference)
  • c2 – value for the curvature condition (see reference)
  • IMAX – maximum number of iterations to perform
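
For orientation, here is a heavily simplified pure-Python sketch of a backtracking line search that enforces only the Armijo (sufficient decrease) condition. The routine documented above additionally enforces the curvature condition (strong Wolfe) and uses _zoom to bracket the step; all names below are illustrative.

```python
def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def backtracking_line_search(f, grad_f, x, p, c1=1e-4, alpha=1.0, imax=25):
    """Halve alpha until f(x + alpha*p) <= f(x) + c1*alpha*grad_f(x).p."""
    fx = f(x)
    slope = dot(grad_f(x), p)   # must be negative for a descent direction
    for _ in range(imax):
        x_new = [xi + alpha * pi for xi, pi in zip(x, p)]
        if f(x_new) <= fx + c1 * alpha * slope:
            return alpha
        alpha *= 0.5
    return alpha

# e.g. on f(x) = x0^2 with x = [1.0] and descent direction p = [-2.0]
```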
esys.downunder.minimizers.sqrt(arg)

Returns the square root of argument arg.

Parameters: arg (float, escript.Data, Symbol, numpy.ndarray) – argument
Return type: float, escript.Data, Symbol, numpy.ndarray depending on the type of arg
Raises: TypeError – if the type of the argument is not expected

Others

  • EPSILON
  • __all__
  • __copyright__
  • __license__
  • __url__
  • lslogger
  • zoomlogger