Algorithms

Proximal optimization algorithms for the minimization of non-smooth objective functions.

Proximal Gradient Methods

ProximalMethods.prox_grad — Function
prox_grad(x0, f, ∇f!, prox!; style=:noaccel, β=.5, ϵ=1e-7, max_iter=1000)

Minimize an objective function $f(x) + g(x)$, where $f(x)$ is differentiable while $g(x)$ is not, using the proximal gradient method.

Arguments

  • x0::AbstractVector : initial parameter values (n x 1)
  • f::Function : $f(x)$
  • ∇f!::Function : gradient of f
  • prox!::Function : proximal operator of $g(x)$
  • style::Symbol : acceleration style
  • β::Real : line search parameter
  • ϵ::Real : tolerance
  • max_iter::Integer : max number of iterations

Returns

  • x::AbstractVector : minimizer (optimal parameter values) (n x 1)
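
As an illustration, the sketch below applies prox_grad to a lasso-type problem with $f(x) = \tfrac{1}{2}\|Ax - b\|^2$ and $g(x) = γ\|x\|_1$. The data A, b, and γ are made up, and the in-place conventions assumed here, ∇f!(g, x) writing the gradient into g and prox!(x, λ) applying the proximal operator of $g$ with step size λ, are a guess at the package's calling convention, not a confirmed API.

using ProximalMethods, LinearAlgebra

A = [1.0 0.0; 0.0 2.0; 1.0 1.0]   # illustrative data
b = [1.0, 2.0, 3.0]
γ = 0.1                           # ℓ1 penalty weight

f(x) = 0.5 * norm(A * x - b)^2
∇f!(g, x) = (g .= A' * (A * x - b))                            # gradient of f, written in place (assumed signature)
prox!(x, λ) = (x .= sign.(x) .* max.(abs.(x) .- λ * γ, 0.0))   # soft-thresholding: prox of λγ‖⋅‖₁ (assumed signature)

x0 = zeros(2)
x = prox_grad(x0, f, ∇f!, prox!)
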
ProximalMethods.prox_grad! — Function
prox_grad!(x, f, ∇f!, prox!; style=:noaccel, β=.5, ϵ=1e-7, max_iter=1000)

Minimize an objective function $f(x) + g(x)$, where $f(x)$ is differentiable while $g(x)$ is not, using the proximal gradient method, storing the result in x. See also prox_grad.
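
Reusing the lasso setup sketched above, the in-place variant overwrites its first argument instead of returning a fresh vector:

x = copy(x0)                   # x0, f, ∇f!, prox! as in the sketch above
prox_grad!(x, f, ∇f!, prox!)   # x now holds the minimizer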


Alternating Direction Method of Multipliers (ADMM)

ProximalMethods.admm — Function
admm(x0, prox_f!, prox_g!; λ=1., α=1., ϵ_abs=1e-7, ϵ_rel=1e-4, max_iter=1000)

Minimize an objective function $f(x) + g(x)$, where $f(x)$ and $g(x)$ can both be nonsmooth, using the alternating direction method of multipliers (ADMM), also known as Douglas-Rachford splitting. The method has two hyperparameters, λ and α. λ controls the scaling of the update steps, acting as a pseudo step size, and is equal to the inverse of the augmented Lagrangian parameter. $α ∈ [0,2]$ is the relaxation parameter, where $α < 1$ denotes under-relaxation and $α > 1$ over-relaxation.
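
For orientation, a scaled-form ADMM iteration with relaxation is typically written as below; the updates inside admm may differ in detail, but this is where λ and α enter:

$$
\begin{aligned}
x^{k+1} &= \operatorname{prox}_{λf}(z^k - u^k) \\
\hat{x}^{k+1} &= α x^{k+1} + (1 - α) z^k \\
z^{k+1} &= \operatorname{prox}_{λg}(\hat{x}^{k+1} + u^k) \\
u^{k+1} &= u^k + \hat{x}^{k+1} - z^{k+1}
\end{aligned}
$$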

Arguments

  • x0::AbstractVector: initial parameter values (n x 1)
  • prox_f!::Function : proximal operator of $f(x)$
  • prox_g!::Function : proximal operator of $g(x)$
  • λ::Real : proximal scaling parameter
  • α::Real : relaxation parameter
  • ϵ_abs::Real : absolute tolerance
  • ϵ_rel::Real : relative tolerance
  • max_iter::Integer : max number of iterations

Returns

  • x::AbstractVector : minimizer $∈ dom f$ (n x 1)
  • z::AbstractVector : minimizer $∈ dom g$ (n x 1)
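
A minimal sketch of a call, for a problem where both proximal operators have closed forms: $f(x) = \tfrac{1}{2}\|x - b\|^2$ and $g(x) = γ\|x\|_1$. The data b and γ are made up, and the in-place signature prox!(x, λ) is the same assumed convention as in the prox_grad sketch above.

using ProximalMethods

b = [1.0, -2.0, 3.0]   # illustrative data
γ = 0.5                # ℓ1 penalty weight

prox_f!(x, λ) = (x .= (x .+ λ .* b) ./ (1 + λ))                  # closed-form prox of λf (assumed signature)
prox_g!(x, λ) = (x .= sign.(x) .* max.(abs.(x) .- λ * γ, 0.0))   # soft-thresholding: prox of λg (assumed signature)

x0 = zeros(3)
x, z = admm(x0, prox_f!, prox_g!, α=1.5)   # α > 1: over-relaxation
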
ProximalMethods.admm! — Function
admm!(x, prox_f!, prox_g!; λ=1., α=1., ϵ_abs=1e-7, ϵ_rel=1e-4, max_iter=1000)

Minimize an objective function $f(x) + g(x)$, where $f(x)$ and $g(x)$ can both be nonsmooth, using the alternating direction method of multipliers, also known as Douglas-Rachford splitting, overwriting x. See also admm.
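
With the same proximal operators as in the sketch above, the in-place variant overwrites its first argument:

x = copy(x0)                             # x0, prox_f!, prox_g! as above
admm!(x, prox_f!, prox_g!, α=1.5)        # x is overwritten with the minimizer in dom f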
