Nikola Janjušević

# Scaled form ADMM

The alternating direction method of multipliers (ADMM) algorithm solves problems of the form,

$\begin{array}{rcl} &\underset{x,z}{\mathrm{minimize}\,}\, && f(x) + g(z) \\ &\text{subject to}\, && Ax + Bz = c \end{array}$

The iterates of ADMM are defined by the augmented Lagrangian,

$L_{\rho}(x,z,y) = f(x) + g(z) + y^\top(Ax + Bz - c) + (\rho/2)\lVert Ax+Bz-c \rVert_2^2,$

and consist of,

$\begin{aligned} x^{k+1} &\coloneqq \underset{x}{\mathrm{argmin}\,}\, L_{\rho}(x,z^k,y^k) \\ z^{k+1} &\coloneqq \underset{z}{\mathrm{argmin}\,}\, L_{\rho}(x^{k+1},z,y^k) \\ y^{k+1} &\coloneqq y^k + \rho(Ax^{k+1} + Bz^{k+1} - c) \end{aligned}$

Note that the method of multipliers updates $x$ and $z$ jointly, whereas ADMM alternates between them in a single Gauss-Seidel pass. An often more convenient scaled form is obtained by completing the square in the dual variable and the primal residual $r = Ax + Bz - c$, and defining the scaled dual variable $u = (1/\rho)y$.

$\begin{array}{rcl} y^\top r + (\rho/2) \lVert r \rVert^2_2 &=& \frac{\rho}{2}\left[ \frac{2}{\rho}y^\top r + r^\top r + \frac{1}{\rho^2}y^\top y - \frac{1}{\rho^2}y^\top y \right] \\ &=& \frac{\rho}{2}\lVert r + u \rVert_2^2 - \frac{\rho}{2}\lVert u \rVert^2_2 \end{array}$
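As a sanity check, this completing-the-square identity can be verified numerically; the vectors and the value of $\rho$ below are arbitrary:

```python
import numpy as np

# Verify  y^T r + (rho/2)||r||^2  ==  (rho/2)||r + u||^2 - (rho/2)||u||^2
# for u = y / rho, using arbitrary random vectors.
rng = np.random.default_rng(0)
rho = 2.5
y = rng.standard_normal(5)
r = rng.standard_normal(5)
u = y / rho  # scaled dual variable

lhs = y @ r + (rho / 2) * np.linalg.norm(r) ** 2
rhs = (rho / 2) * np.linalg.norm(r + u) ** 2 - (rho / 2) * np.linalg.norm(u) ** 2
assert np.isclose(lhs, rhs)
```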

With the scaled dual variable, our augmented Lagrangian is written as,

$L_{\rho}(x,z,u) = f(x) + g(z) + (\rho/2)\lVert Ax + Bz - c + u \rVert_2^2 - (\rho/2)\lVert u \rVert_2^2,$

and the ADMM iterates are expressed as,

**Scaled form ADMM**

$\begin{aligned} x^{k+1} &\coloneqq \underset{x}{\mathrm{argmin}\,}\, f(x) + (\rho/2)\lVert Ax + Bz^k - c + u^k \rVert_2^2\\ z^{k+1} &\coloneqq \underset{z}{\mathrm{argmin}\,}\, g(z) + (\rho/2)\lVert Ax^{k+1} + Bz - c + u^k \rVert_2^2\\ u^{k+1} &\coloneqq u^k + Ax^{k+1} + Bz^{k+1} - c \end{aligned}$
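For instance, taking $A = I$, $B = -I$, $c = 0$ (the splitting $\mathrm{minimize}\ f(x) + g(z)$ subject to $x = z$), the scaled updates reduce to proximal operators, where $\mathrm{prox}_{\lambda f}(v) = \underset{x}{\mathrm{argmin}\,}\, f(x) + \frac{1}{2\lambda}\lVert x - v \rVert_2^2$:

$\begin{aligned} x^{k+1} &= \mathrm{prox}_{(1/\rho) f}(z^k - u^k) \\ z^{k+1} &= \mathrm{prox}_{(1/\rho) g}(x^{k+1} + u^k) \\ u^{k+1} &= u^k + x^{k+1} - z^{k+1} \end{aligned}$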

The ADMM iterates written in scaled form are often more amenable to expression in terms of proximal operators.
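As a concrete illustration, below is a minimal NumPy sketch of scaled-form ADMM for the lasso, $\mathrm{minimize}\ \frac{1}{2}\lVert Dx - b\rVert_2^2 + \lambda \lVert z \rVert_1$ subject to $x - z = 0$ (i.e. $A = I$, $B = -I$, $c = 0$), where the $z$-update is the proximal operator of the $\ell_1$ norm, elementwise soft-thresholding. All names and parameter values here are illustrative, not from the text above.

```python
import numpy as np

def soft_threshold(v, kappa):
    """Proximal operator of kappa * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def lasso_admm(D, b, lam, rho=1.0, n_iter=200):
    """Scaled-form ADMM for: minimize (1/2)||Dx - b||^2 + lam*||z||_1, s.t. x = z."""
    m, n = D.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable, u = y / rho
    # The x-update solves a ridge-type linear system; since (D^T D + rho I)
    # is fixed across iterations, factor it once.
    Dtb = D.T @ b
    L = np.linalg.cholesky(D.T @ D + rho * np.eye(n))
    for _ in range(n_iter):
        # x-update: argmin_x (1/2)||Dx - b||^2 + (rho/2)||x - z + u||^2
        x = np.linalg.solve(L.T, np.linalg.solve(L, Dtb + rho * (z - u)))
        # z-update: prox of (lam/rho) * ||.||_1
        z = soft_threshold(x + u, lam / rho)
        # scaled dual update: u accumulates the primal residual
        u = u + x - z
    return z

# Small synthetic example with a sparse ground truth.
rng = np.random.default_rng(1)
D = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[:3] = [1.0, -2.0, 0.5]
b = D @ x_true
z = lasso_admm(D, b, lam=0.1)
```

Caching the Cholesky factor of $D^\top D + \rho I$ is a standard trick: each $x$-update then costs only two triangular solves rather than a fresh matrix factorization.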