Algorithms
Proximal optimization algorithms for the minimization of non-smooth objective functions.
Proximal Gradient Methods
ProximalMethods.prox_grad — Function
prox_grad(x0, f, ∇f!, prox!; style=:noaccel, β=.5, ϵ=1e-7, max_iter=1000)
Minimize an objective function $f(x) + g(x)$, where $f(x)$ is differentiable while $g(x)$ is not, using the proximal gradient method.
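At each iteration the method takes a gradient step on $f(x)$ followed by an application of the proximal operator of $g(x)$; with step size $λ$ found by backtracking line search (shrinking by the factor β), the standard update is

$x^{k+1} = prox_{λg}(x^k − λ∇f(x^k))$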
Arguments
x0::AbstractVector
: initial parameter values (n x 1)
f::Function
: $f(x)$
∇f!::Function
: gradient of f
prox!::Function
: proximal operator of $g(x)$
style::Symbol
: acceleration style
β::Real
: line search parameter
ϵ::Real
: tolerance
max_iter::Integer
: max number of iterations
Returns
x::AbstractVector
: minimizer (optimal parameter values) (n x 1)
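As a usage sketch, consider the lasso problem $f(x) = ½‖Ax − b‖²$, $g(x) = γ‖x‖₁$. The calling conventions assumed below (∇f!(∇, x) writing the gradient into its first argument, and prox!(x, λ) applying the proximal operator of $λg(x)$ in place) are illustrative; check the package source for the exact signatures.

using ProximalMethods, LinearAlgebra

A = randn(100, 20); b = randn(100); γ = 0.1

f(x) = 0.5 * norm(A * x - b)^2                                 # smooth part
∇f!(∇, x) = (∇ .= A' * (A * x - b))                            # in-place gradient of f
prox!(x, λ) = (x .= sign.(x) .* max.(abs.(x) .- λ * γ, 0.0))   # soft-thresholding, prox of λγ‖·‖₁

x̂ = prox_grad(zeros(20), f, ∇f!, prox!)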
ProximalMethods.prox_grad! — Function
prox_grad!(x, f, ∇f!, prox!; style=:noaccel, β=.5, ϵ=1e-7, max_iter=1000)
Minimize an objective function $f(x) + g(x)$, where $f(x)$ is differentiable while $g(x)$ is not, using the proximal gradient method, storing the result in x. See also prox_grad.
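Continuing the sketch above, the in-place variant overwrites its first argument:

x = zeros(20)
prox_grad!(x, f, ∇f!, prox!)   # x now holds the minimizer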
Alternating Direction Method of Multipliers (ADMM)
ProximalMethods.admm — Function
admm(x0, prox_f!, prox_g!; λ=1., α=1., ϵ_abs=1e-7, ϵ_rel=1e-4, max_iter=1000)
Minimize an objective function $f(x) + g(x)$, where $f(x)$ and $g(x)$ can both be nonsmooth, using the alternating direction method of multipliers (ADMM), also known as Douglas-Rachford splitting. The method has two hyperparameters, λ and α. λ controls the scaling of the update steps, i.e. it acts as a pseudo step size, and is equal to the inverse of the augmented Lagrangian parameter. $α ∈ [0,2]$ is the relaxation parameter, where $α < 1$ denotes under-relaxation and $α > 1$ over-relaxation.
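For reference, one standard scaled-form ADMM iteration with over-relaxation (see, e.g., Boyd et al., 2011) is

$x^{k+1} = prox_{λf}(z^k − u^k)$
$z^{k+1} = prox_{λg}(αx^{k+1} + (1−α)z^k + u^k)$
$u^{k+1} = u^k + αx^{k+1} + (1−α)z^k − z^{k+1}$

where $u$ is the scaled dual variable; the implementation here may differ in details.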
Arguments
x0::AbstractVector
: initial parameter values (n x 1)
prox_f!::Function
: proximal operator of $f(x)$
prox_g!::Function
: proximal operator of $g(x)$
λ::Real
: proximal scaling parameter
α::Real
: relaxation parameter
ϵ_abs::Real
: absolute tolerance
ϵ_rel::Real
: relative tolerance
max_iter::Integer
: max number of iterations
Returns
x::AbstractVector
: minimizer $∈ dom f$ (n x 1)
z::AbstractVector
: minimizer $∈ dom g$ (n x 1)
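As a usage sketch for the same lasso problem as above, assuming prox_f!(x, λ) and prox_g!(z, λ) overwrite their first argument with the proximal operators of $λf(x)$ and $λg(z)$ respectively (check the package source for the exact conventions):

using ProximalMethods, LinearAlgebra

A = randn(100, 20); b = randn(100); γ = 0.1
F = cholesky(Symmetric(I + A' * A))    # factor I + λAᵀA once (valid for the default λ = 1)

prox_f!(x, λ) = (x .= F \ (x .+ λ .* (A' * b)))                 # prox of λ·½‖Ax − b‖²
prox_g!(z, λ) = (z .= sign.(z) .* max.(abs.(z) .- λ * γ, 0.0))  # soft-thresholding

x̂, ẑ = admm(zeros(20), prox_f!, prox_g!)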
ProximalMethods.admm! — Function
admm!(x, prox_f!, prox_g!; λ=1., α=1., ϵ_abs=1e-7, ϵ_rel=1e-4, max_iter=1000)
Minimize an objective function $f(x) + g(x)$, where $f(x)$ and $g(x)$ can both be nonsmooth, using the alternating direction method of multipliers, also known as Douglas-Rachford splitting, overwriting x. See also admm.
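As with prox_grad!, the in-place variant can reuse the proximal operators from the sketch above:

x = zeros(20)
admm!(x, prox_f!, prox_g!)   # x is overwritten with the minimizer $∈ dom f$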