Bases: object
A function f(x) that can be minimized (base class).
Example of usage:
cf = DerivedCostFunction()
# ... calculate x ...
args = cf.getArguments(x)  # this could be potentially expensive!
f = cf.getValue(x, *args)
# ... x may need to be updated without using the gradient ...
# ... but then ...
gf = cf.getGradient(x, *args)
The class distinguishes between the representation of the solution x (x-type) and the gradients (r-type).
Variables: provides_inverse_Hessian_approximation – This member should be set to True in subclasses that provide a valid implementation of getInverseHessianApproximation()
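As a concrete illustration, the following sketch subclasses CostFunction for a simple quadratic f(x) = 0.5*x.A.x - b.x, using numpy arrays for both the x-type and the r-type. The class name, the quadratic form and the use of numpy are assumptions for illustration; only the method names and the calling pattern come from the usage example above.

import numpy as np
from esys.downunder.costfunctions import CostFunction

class QuadraticCostFunction(CostFunction):
    """Illustrative cost function f(x) = 0.5*x.A.x - b.x with gradient A.x - b"""
    def __init__(self, A, b):
        super().__init__()
        self.A = np.asarray(A)
        self.b = np.asarray(b)

    def getArguments(self, x):
        # precalculate A.x once; it is shared by getValue() and getGradient()
        return (self.A.dot(x),)

    def getValue(self, x, *args):
        Ax = args[0] if args else self.A.dot(x)
        return 0.5 * x.dot(Ax) - self.b.dot(x)

    def getGradient(self, x, *args):
        Ax = args[0] if args else self.A.dot(x)
        return Ax - self.b

cf = QuadraticCostFunction(A=np.diag([1., 2., 3.]), b=np.ones(3))
x = np.zeros(3)
args = cf.getArguments(x)   # potentially expensive, done once per location x
print(cf.getValue(x, *args), cf.getGradient(x, *args))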
returns precalculated values that are shared in the calculation of f(x) and grad f(x) and the Hessian operator. The default implementation returns an empty tuple.
Parameters: x (x-type) – location of derivative
Return type: tuple
returns the dual product of x and r
Return type: float
returns the gradient of f at x using the precalculated values for x.
Parameters: x (x-type) – location of derivative; args – precalculated values for x (from getArguments())
Return type: r-type
returns an approximate evaluation p of the inverse of the Hessian operator of the cost function for a given gradient r at a given location x: H(x) p = r
Parameters: x (x-type) – location of derivative; r (r-type) – a given gradient; args – precalculated values for x (from getArguments())
Return type: x-type
Note: In general it is assumed that the Hessian H(x) needs to be calculated for each new location x. However, the solver may indicate that this is not required, typically when the iteration is close to completion.
Note: Subclasses that implement this method should set the class variable provides_inverse_Hessian_approximation to True to enable the solver to call this method.
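Building on the QuadraticCostFunction sketch above, a subclass that offers such an approximation could look like the following; using the diagonal of A as a cheap stand-in for the inverse Hessian is an assumption for illustration, only the method name and the class variable come from this reference.

class PreconditionedQuadraticCostFunction(QuadraticCostFunction):
    # announce to the solver that getInverseHessianApproximation() is available
    provides_inverse_Hessian_approximation = True

    def getInverseHessianApproximation(self, x, r, *args):
        # approximately solve H(x) p = r; for the quadratic example H == A,
        # so dividing by the diagonal of A gives a cheap approximation p
        return r / np.diag(self.A)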
returns the norm of x
Return type: float
returns the value f(x) using the precalculated values for x.
Parameters: x (x-type) – a solution approximation
Return type: float
notifies the class that the Hessian operator needs to be updated. This method is called by the solver class.
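Assuming this notification method is named updateHessian() (as in the esys.downunder API), the PreconditionedQuadraticCostFunction sketch above might use it to drop cached Hessian data; the cache attribute is purely illustrative.

    def updateHessian(self):
        # discard any cached Hessian data so that the next call to
        # getInverseHessianApproximation() rebuilds it for the new location x
        self._hessian_cache = None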
Bases: esys.downunder.costfunctions.CostFunction
This is an instrumented version of the CostFunction class: its method calls update statistical information. The actual work is done by the methods of the same name with a leading underscore. These methods need to be overwritten for a particular cost function implementation.
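For example, a concrete metered cost function overrides the underscored methods rather than the public ones. The quadratic form below is the same illustrative assumption as in the CostFunction sketch above; the underscored method names follow the naming rule just stated.

import numpy as np
from esys.downunder.costfunctions import MeteredCostFunction

class MeteredQuadraticCostFunction(MeteredCostFunction):
    def __init__(self, A, b):
        super().__init__()
        self.A = np.asarray(A)
        self.b = np.asarray(b)

    def _getArguments(self, x):
        return (self.A.dot(x),)

    def _getValue(self, x, *args):
        Ax = args[0] if args else self.A.dot(x)
        return 0.5 * x.dot(Ax) - self.b.dot(x)

    def _getGradient(self, x, *args):
        Ax = args[0] if args else self.A.dot(x)
        return Ax - self.b

    def _getDualProduct(self, x, r):
        return float(np.dot(x, r))

    def _getNorm(self, x):
        return float(np.linalg.norm(x))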
returns precalculated values that are shared in the calculation of f(x) and grad f(x) and the Hessian operator
Parameters: x (x-type) – location of derivative
returns the dual product of x and r
Return type: float
returns the gradient of f at x using the precalculated values for x.
Parameters: x (x-type) – location of derivative; args – precalculated values for x (from getArguments())
Return type: r-type
returns an approximate evaluation p of the inverse of the Hessian operator of the cost function for a given gradient r at a given location x: H(x) p = r
Parameters: x (x-type) – location of derivative; r (r-type) – a given gradient; args – precalculated values for x (from getArguments())
Return type: x-type
Note: In general it is assumed that the Hessian H(x) needs to be calculated for each new location x. However, the solver may indicate that this is not required, typically when the iteration is close to completion.
returns the norm of x
Return type: float
returns the value f(x) using the precalculated values for x.
Parameters: x (x-type) – a solution approximation
Return type: float
resets all statistical counters
notifies the class that the Hessian operator needs to be updated. This method is called by the solver class.