2.1.6.1.6. sf_tools.signal.optimisation module

OPTIMISATION CLASSES

This module contains classes for optimisation algorithms.

Author: Samuel Farrens <samuel.farrens@gmail.com>
Version: 1.3
Date: 20/10/2017

Notes

Input classes must have the following properties:

  • Gradient Operators

    Must have the following methods:

      • get_grad() - calculate the gradient

    Must have the following variables:

      • grad - the gradient
      • inv_spec_rad - inverse spectral radius \(\frac{1}{\rho}\)

  • Linear Operators

    Must have the following methods:

      • op() - operator
      • adj_op() - adjoint operator

    Must have the following variables:

      • l1norm - the l1 norm of the operator

  • Proximity Operators

    Must have the following methods:

      • op() - operator

The following notation is used to implement the algorithms:

  • x_old is used in place of \(x_{n}\).
  • x_new is used in place of \(x_{n+1}\).
  • x_prox is used in place of \(\tilde{x}_{n+1}\).
  • x_temp is used for intermediate operations.
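The interface above can be illustrated with a few minimal operator classes. The names `GradBasic`, `LinearIdentity` and `SoftThreshold` are hypothetical, not classes provided by sf_tools; they simply show the methods and variables an input class needs:

```python
import numpy as np


class GradBasic(object):
    """Hypothetical gradient operator for 0.5 * ||x - y||^2."""

    def __init__(self, y):
        self.y = y
        self.inv_spec_rad = 1.0  # the Hessian is the identity, so rho = 1

    def get_grad(self, x):
        # the gradient of 0.5 * ||x - y||^2 is x - y
        self.grad = x - self.y


class LinearIdentity(object):
    """Hypothetical linear operator: the identity transform."""

    l1norm = 1.0

    def op(self, data):
        return data

    def adj_op(self, data):
        return data


class SoftThreshold(object):
    """Hypothetical proximity operator: soft thresholding (prox of l1)."""

    def __init__(self, threshold):
        self.threshold = threshold

    def op(self, data):
        return np.sign(data) * np.maximum(np.abs(data) - self.threshold, 0.0)
```

Any classes exposing these methods and variables can be passed to the optimisation classes below.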
class sf_tools.signal.optimisation.FISTA(lambda_init=None, active=True)[source]

Bases: object

This class is inherited by optimisation classes to speed up convergence

Parameters:
  • lambda_init (float, optional) – Initial value of the relaxation parameter
  • active (bool, optional) – Option to activate FISTA convergence speed-up (default is True)
speed_switch(turn_on=True)[source]

Speed switch

This method turns on or off the speed-up

Parameters: turn_on (bool) – Option to turn on speed-up (default is True)
update_lambda()[source]

Update lambda

This method updates the value of lambda

Notes

Implements steps 3 and 4 from algorithm 10.7 in [B2011]

speed_up()[source]

Speed-up

This method returns the update if the speed-up is active
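Steps 3 and 4 of algorithm 10.7 in [B2011] are the standard FISTA momentum update, which can be sketched as follows; the function name `update_lambda_fista` is illustrative, not sf_tools' own:

```python
import numpy as np


def update_lambda_fista(t_old):
    """Return the new momentum term t and the relaxation parameter lambda."""
    # step 3: update the momentum term
    t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t_old ** 2))
    # step 4: derive the relaxation parameter from old and new momentum
    lambda_param = 1.0 + (t_old - 1.0) / t_new
    return t_new, lambda_param
```

The speed-up then replaces the plain proximal point by the extrapolation x_new = x_prox + (lambda - 1) * (x_prox - x_old).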

class sf_tools.signal.optimisation.ForwardBackward(x, grad, prox, cost=None, lambda_init=None, lambda_update=None, use_fista=True, auto_iterate=True)[source]

Bases: sf_tools.signal.optimisation.FISTA

Forward-Backward optimisation

This class implements standard forward-backward optimisation with the option to use the FISTA speed-up

Parameters:
  • x (np.ndarray) – Initial guess for the primal variable
  • grad (class) – Gradient operator class
  • prox (class) – Proximity operator class
  • cost (class, optional) – Cost function class
  • lambda_init (float, optional) – Initial value of the relaxation parameter
  • lambda_update (function, optional) – Relaxation parameter update method
  • use_fista (bool, optional) – Option to use FISTA (default is True)
  • auto_iterate (bool, optional) – Option to automatically begin iterations upon initialisation (default is True)
update()[source]

Update

This method updates the current reconstruction

Notes

Implements algorithm 10.7 (or 10.5) from [B2011]

iterate(max_iter=150)[source]

Iterate

This method calls update() until either the convergence criterion is met or the maximum number of iterations is reached

Parameters: max_iter (int, optional) – Maximum number of iterations (default is 150)
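The iteration performed here can be sketched as a bare forward-backward loop in the spirit of algorithm 10.5 from [B2011], applied to a toy problem; the function names are illustrative, not sf_tools' API, and in the class above the step size would come from the gradient operator's inv_spec_rad:

```python
import numpy as np


def forward_backward(x0, grad_func, prox_op, step, lambda_param=1.0,
                     max_iter=150):
    """Minimise f(x) + g(x) given the gradient of f and the prox of g."""
    x_old = x0
    for _ in range(max_iter):
        # forward (gradient) step followed by backward (proximal) step
        x_prox = prox_op(x_old - step * grad_func(x_old))
        # relaxed update towards the proximal point
        x_old = x_old + lambda_param * (x_prox - x_old)
    return x_old


# Toy problem: minimise 0.5 * (x - 3)^2 + |x|, whose solution is x = 2.
soft = lambda v: np.sign(v) * np.maximum(np.abs(v) - 1.0, 0.0)
x = forward_backward(np.array(0.0), lambda x: x - 3.0, soft, step=1.0)
```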
class sf_tools.signal.optimisation.GenForwardBackward(x, grad, prox_list, cost=None, lambda_init=1.0, lambda_update=None, weights=None, auto_iterate=True)[source]

Bases: object

Generalized Forward-Backward optimisation

This class implements algorithm 1 from [R2012]

Parameters:
  • x (np.ndarray) – Initial guess for the primal variable
  • grad (class) – Gradient operator class
  • prox_list (list) – List of proximity operator classes
  • cost (class, optional) – Cost function class
  • lambda_init (float, optional) – Initial value of the relaxation parameter
  • lambda_update (function, optional) – Relaxation parameter update method
  • weights (np.ndarray, optional) – Proximity operator weights
  • auto_iterate (bool, optional) – Option to automatically begin iterations upon initialisation (default is True)
update()[source]

Update

This method updates the current reconstruction

Notes

Implements algorithm 1 from [R2012]

iterate(max_iter=150)[source]

Iterate

This method calls update() until either the convergence criterion is met or the maximum number of iterations is reached

Parameters: max_iter (int, optional) – Maximum number of iterations (default is 150)
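Algorithm 1 from [R2012] maintains one auxiliary variable per proximity operator and averages them with the given weights. A minimal sketch, with illustrative names and the per-operator step scaling of [R2012] assumed to be folded into the proximity operators:

```python
import numpy as np


def gen_forward_backward(x0, grad_func, prox_list, weights, step,
                         lambda_param=1.0, max_iter=150):
    """Sketch of algorithm 1 from [R2012]: one z_i per proximity operator."""
    x = np.copy(x0)
    z = [np.copy(x0) for _ in prox_list]
    for _ in range(max_iter):
        g = grad_func(x)
        for i, prox_op in enumerate(prox_list):
            # each auxiliary variable takes a relaxed step towards its prox
            z[i] = z[i] + lambda_param * (
                prox_op(2.0 * x - z[i] - step * g) - x)
        # the primal variable is the weighted average of the z_i
        x = sum(w * zi for w, zi in zip(weights, z))
    return x
```

With a single proximity operator and unit weight this reduces to the standard forward-backward update.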
class sf_tools.signal.optimisation.Condat(x, y, grad, prox, prox_dual, linear, cost, rho, sigma, tau, rho_update=None, sigma_update=None, tau_update=None, auto_iterate=True)[source]

Bases: object

Condat optimisation

This class implements algorithm 3.1 from [Con2013]

Parameters:
  • x (np.ndarray) – Initial guess for the primal variable
  • y (np.ndarray) – Initial guess for the dual variable
  • grad (class) – Gradient operator class
  • prox (class) – Proximity primal operator class
  • prox_dual (class) – Proximity dual operator class
  • linear (class) – Linear operator class
  • cost (class) – Cost function class
  • rho (float) – Relaxation parameter
  • sigma (float) – Proximal dual parameter
  • tau (float) – Proximal primal parameter
  • rho_update (function, optional) – Relaxation parameter update method
  • sigma_update (function, optional) – Proximal dual parameter update method
  • tau_update (function, optional) – Proximal primal parameter update method
  • auto_iterate (bool, optional) – Option to automatically begin iterations upon initialisation (default is True)
update_param()[source]

Update parameters

This method updates the values of rho, sigma and tau with the methods provided

update()[source]

Update

This method updates the current reconstruction

Notes

Implements equation 9 (algorithm 3.1) from [Con2013]

  • the primal proximity operator is set up for a positivity constraint
iterate(max_iter=150)[source]

Iterate

This method calls update() until either the convergence criterion is met or the maximum number of iterations is reached

Parameters: max_iter (int, optional) – Maximum number of iterations (default is 150)
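A single relaxed primal-dual step in the spirit of equation 9 (algorithm 3.1) from [Con2013] can be sketched as follows; the function names are illustrative, not sf_tools' own, and the primal prox here is generic rather than the positivity constraint noted above:

```python
import numpy as np


def condat_update(x, y, grad_func, prox_op, prox_dual_op, lin_op,
                  lin_adj_op, rho, sigma, tau):
    """One relaxed primal-dual step on (x, y)."""
    # primal proximal step along the gradient and adjoint dual directions
    x_prox = prox_op(x - tau * grad_func(x) - tau * lin_adj_op(y))
    # dual proximal step using the reflected primal point
    y_prox = prox_dual_op(y + sigma * lin_op(2.0 * x_prox - x))
    # relaxation of both variables with parameter rho
    x_new = rho * x_prox + (1.0 - rho) * x
    y_new = rho * y_prox + (1.0 - rho) * y
    return x_new, y_new
```

Iterating this step with suitable rho, sigma and tau (see [Con2013] for the convergence conditions) drives (x, y) to a primal-dual solution.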