2.1.6.1.6. sf_tools.signal.optimisation module¶
OPTIMISATION CLASSES
This module contains classes for optimisation algorithms.
| Author: | Samuel Farrens <samuel.farrens@gmail.com> |
|---|---|
| Version: | 1.3 |
| Date: | 20/10/2017 |
Notes
Input classes must have the following properties:
- Gradient Operators
  Must have the following methods:
  - get_grad() - calculate the gradient
  Must have the following variables:
  - grad - the gradient
  - inv_spec_rad - inverse spectral radius \(\frac{1}{\rho}\)
- Linear Operators
  Must have the following methods:
  - op() - operator
  - adj_op() - adjoint operator
  Must have the following variables:
  - l1norm - the l1 norm of the operator
- Proximity Operators
  Must have the following methods:
  - op() - operator
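As an illustration, a minimal set of input classes satisfying these requirements might look like the sketch below. The classes themselves, the example problem (a least-squares data term with a non-negativity constraint) and the call signatures of get_grad() and op() are assumptions made for illustration only; they are not part of this module.

```python
import numpy as np


class GradBasic(object):
    """Illustrative gradient operator for 1/2 ||M x - y||^2."""

    def __init__(self, M, y):
        self.M = M
        self.y = y
        # Spectral radius of M^T M taken as the squared largest singular value.
        rho = np.linalg.norm(M, 2) ** 2
        self.inv_spec_rad = 1.0 / rho
        self.grad = np.zeros(M.shape[1])

    def get_grad(self, x):
        # Store M^T (M x - y) in the required `grad` attribute.
        self.grad = self.M.T.dot(self.M.dot(x) - self.y)


class LinearIdentity(object):
    """Illustrative identity linear operator."""

    def __init__(self):
        self.l1norm = 1.0

    def op(self, data):
        return data

    def adj_op(self, data):
        return data


class ProxPositive(object):
    """Illustrative proximity operator: projection onto x >= 0."""

    def op(self, data, extra_factor=1.0):
        return np.maximum(data, 0.0)
```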
The following notation is used to implement the algorithms:
- x_old is used in place of \(x_{n}\).
- x_new is used in place of \(x_{n+1}\).
- x_prox is used in place of \(\tilde{x}_{n+1}\).
- x_temp is used for intermediate operations.
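In this notation, a single relaxed forward-backward step can be written schematically as

\(\tilde{x}_{n+1} = \textrm{prox}\left(x_{n} - \gamma \nabla F(x_{n})\right)\)

\(x_{n+1} = x_{n} + \lambda_{n}\left(\tilde{x}_{n+1} - x_{n}\right)\)

where \(\gamma\) is a gradient step size (typically taken as the inverse spectral radius \(\frac{1}{\rho}\)) and \(\lambda_{n}\) is the relaxation parameter. This is a generic illustration of how the quantities relate, not necessarily the exact update rule of every class below.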
class sf_tools.signal.optimisation.FISTA(lambda_init=None, active=True)[source]¶
Bases: object

This class is inherited by optimisation classes to speed up convergence.
Parameters: - lambda_init (float, optional) – Initial value of the relaxation parameter
- active (bool, optional) – Option to activate the speed-up (default is True)
speed_switch(turn_on=True)[source]¶ Speed switch
This method turns on or off the speed-up.
Parameters: turn_on (bool) – Option to turn on the speed-up (default is True)
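For example, the speed-up could be toggled as follows (FISTA is normally used through the solver classes documented below; the standalone instance here is for illustration only):

```python
from sf_tools.signal.optimisation import FISTA

fista = FISTA()                    # lambda_init=None, active=True by default
fista.speed_switch(turn_on=False)  # disable the speed-up
fista.speed_switch(turn_on=True)   # turn it back on
```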
class sf_tools.signal.optimisation.ForwardBackward(x, grad, prox, cost=None, lambda_init=None, lambda_update=None, use_fista=True, auto_iterate=True)[source]¶
Bases: sf_tools.signal.optimisation.FISTA

Forward-Backward optimisation

This class implements standard forward-backward optimisation with the option to use the FISTA speed-up.
Parameters: - x (np.ndarray) – Initial guess for the primal variable
- grad (class) – Gradient operator class
- prox (class) – Proximity operator class
- cost (class, optional) – Cost function class
- lambda_init (float, optional) – Initial value of the relaxation parameter
- lambda_update (function, optional) – Relaxation parameter update method
- use_fista (bool, optional) – Option to use FISTA (default is True)
- auto_iterate (bool, optional) – Option to automatically begin iterations upon initialisation (default is True)
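A construction sketch, reusing the illustrative GradBasic and ProxPositive classes from the Notes section. The lambda_update signature assumed here (previous value in, new value out) is a guess, as it is not specified in this section.

```python
import numpy as np
from sf_tools.signal.optimisation import ForwardBackward

# Illustrative problem data.
M = np.random.randn(20, 10)
y = M.dot(np.abs(np.random.randn(10)))
grad_op = GradBasic(M, y)   # sketch class from the Notes, not part of sf_tools
prox_op = ProxPositive()    # sketch class from the Notes, not part of sf_tools

def lambda_update(lambda_old):
    # Assumed signature: receives the previous relaxation parameter.
    return 0.99 * lambda_old

fb = ForwardBackward(np.zeros(10), grad_op, prox_op,
                     lambda_init=1.0, lambda_update=lambda_update,
                     use_fista=True, auto_iterate=True)
```

With auto_iterate=True the iterations begin immediately upon construction; retrieving the final estimate is not covered in this section.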
class sf_tools.signal.optimisation.GenForwardBackward(x, grad, prox_list, cost=None, lambda_init=1.0, lambda_update=None, weights=None, auto_iterate=True)[source]¶
Bases: object

Generalized Forward-Backward optimisation
This class implements algorithm 1 from [R2012]
Parameters: - x (np.ndarray) – Initial guess for the primal variable
- grad (class) – Gradient operator class
- prox_list (list) – List of proximity operator classes
- cost (class, optional) – Cost function class
- lambda_init (float, optional) – Initial value of the relaxation parameter
- lambda_update (function, optional) – Relaxation parameter update method
- weights (np.ndarray, optional) – Proximity operator weights
- auto_iterate (bool, optional) – Option to automatically begin iterations upon initialisation (default is True)
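Construction is analogous to ForwardBackward, but with a list of proximity operators and optional weights. This sketch reuses grad_op and ProxPositive from the previous example and is illustrative only.

```python
import numpy as np
from sf_tools.signal.optimisation import GenForwardBackward

# Two proximity operators applied with equal weights.
prox_list = [ProxPositive(), ProxPositive()]
weights = np.array([0.5, 0.5])

gfb = GenForwardBackward(np.zeros(10), grad_op, prox_list,
                         lambda_init=1.0, weights=weights,
                         auto_iterate=True)
```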
class sf_tools.signal.optimisation.Condat(x, y, grad, prox, prox_dual, linear, cost, rho, sigma, tau, rho_update=None, sigma_update=None, tau_update=None, auto_iterate=True)[source]¶
Bases: object

Condat optimisation
This class implements algorithm 10.7 from [Con2013]
Parameters: - x (np.ndarray) – Initial guess for the primal variable
- y (np.ndarray) – Initial guess for the dual variable
- grad (class) – Gradient operator class
- prox (class) – Proximity primal operator class
- prox_dual (class) – Proximity dual operator class
- linear (class) – Linear operator class
- cost (class) – Cost function class
- rho (float) – Relaxation parameter
- sigma (float) – Proximal dual parameter
- tau (float) – Proximal primal parameter
- rho_update (function, optional) – Relaxation parameter update method
- sigma_update (function, optional) – Proximal dual parameter update method
- tau_update (function, optional) – Proximal primal parameter update method
- auto_iterate (bool, optional) – Option to automatically begin iterations upon initialisation (default is True)
update_param()[source]¶ Update parameters
This method updates the values of rho, sigma and tau with the methods provided.
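A construction sketch for the Condat solver, again reusing the illustrative operator classes from the Notes. The cost class interface is not described in this section, so a placeholder is passed here, and the assumed signature of the update callable (previous value in, new value out) is a guess.

```python
import numpy as np
from sf_tools.signal.optimisation import Condat

def rho_update(rho_old):
    # Assumed signature: receives the previous relaxation parameter.
    return max(0.5, 0.9 * rho_old)

cd = Condat(x=np.zeros(10), y=np.zeros(10),
            grad=grad_op,              # sketch gradient operator from the Notes
            prox=ProxPositive(),       # primal proximity operator (sketch)
            prox_dual=ProxPositive(),  # dual proximity operator (sketch)
            linear=LinearIdentity(),   # sketch linear operator from the Notes
            cost=None,                 # placeholder; the cost interface is not documented here
            rho=1.0, sigma=0.5, tau=0.5,
            rho_update=rho_update,
            auto_iterate=True)
```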