Author manuscript; available in PMC: 2021 Nov 5.
Published in final edited form as: J Mach Learn Res. 2021 Jan;22:55.

Algorithm 9.

Proximal linearized 2-block ADMM with backtracking when the loss is differentiable and the gradient is Lipschitz continuous

Input: X, \gamma, w, \alpha, \zeta
Initialize: U^{(0)}, V^{(0)}, \Lambda^{(0)}, t
Precompute: difference matrix D, \tilde{x}_j
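The precomputed difference matrix D is not spelled out in this box; in convex-clustering-style problems it typically stacks one row per edge of the weight graph, so that row l of DU equals the difference of the two rows of U joined by the l-th edge. A minimal sketch of that construction (the function name and edge-list format are illustrative assumptions, not from the paper):

```python
import numpy as np

def difference_matrix(n, edges):
    """Directed difference matrix D of shape (len(edges), n):
    row l of D @ U equals U[i] - U[j] for the l-th edge (i, j).
    NOTE: edge-list representation is an assumption for illustration."""
    D = np.zeros((len(edges), n))
    for l, (i, j) in enumerate(edges):
        D[l, i] = 1.0
        D[l, j] = -1.0
    return D
```

Applying D to U then yields the pairwise row differences that the fusion penalty acts on.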
while not converged do
for j = 1 to p do
  t = 1
  \tilde{u}_j^{(k-1)} = U_{\cdot j}^{(k-1)} - \tilde{x}_j 1_n
  \nabla g(\tilde{u}_j^{(k-1)}) = \nabla\ell(X_{\cdot j}, \tilde{u}_j^{(k-1)} + \tilde{x}_j 1_n) + \rho D^T (D(\tilde{u}_j^{(k-1)} + \tilde{x}_j 1_n) - V_{\cdot j}^{(k-1)} + \Lambda_{\cdot j}^{(k-1)})
  z = prox_{t\alpha\zeta_j \|\cdot\|_2} (\tilde{u}_j^{(k-1)} - t \nabla g(\tilde{u}_j^{(k-1)}))
  while g(z) > g(\tilde{u}_j^{(k-1)}) - \nabla g(\tilde{u}_j^{(k-1)})^T (\tilde{u}_j^{(k-1)} - z) + \frac{1}{2t} \|z - \tilde{u}_j^{(k-1)}\|_2^2 do
   t = \beta t
   z = prox_{t\alpha\zeta_j \|\cdot\|_2} (\tilde{u}_j^{(k-1)} - t \nabla g(\tilde{u}_j^{(k-1)}))
  end while
  U_{\cdot j}^{(k)} = z + \tilde{x}_j 1_n
end for
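The inner loop is a proximal gradient step on the smooth part g with backtracking on the step size t: shrink t by the factor β until the quadratic upper bound (the majorization condition) holds, then accept z. A sketch, assuming the nonsmooth term is a multiple of the ℓ2 norm so its prox is block soft-thresholding; function names are illustrative:

```python
import numpy as np

def prox_l2(v, threshold):
    """Prox of threshold * ||.||_2: block soft-thresholding of the vector v."""
    norm = np.linalg.norm(v)
    if norm <= threshold:
        return np.zeros_like(v)
    return (1.0 - threshold / norm) * v

def backtracking_prox_step(u, grad_g, g, threshold, t=1.0, beta=0.5):
    """One proximal gradient step on g + threshold * ||.||_2 with backtracking.

    u:      current (centered) column iterate
    grad_g: callable returning the gradient of the smooth part g
    g:      callable returning the value of g
    """
    gu, grad_u = g(u), grad_g(u)
    z = prox_l2(u - t * grad_u, t * threshold)
    # Shrink t until g(z) is majorized by the quadratic model at u.
    while g(z) > gu - grad_u @ (u - z) + np.sum((z - u) ** 2) / (2 * t):
        t *= beta
        z = prox_l2(u - t * grad_u, t * threshold)
    return z, t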
V^{(k)} = prox_{(\gamma/\rho) P_1(\cdot\,; w)} (D U^{(k)} + \Lambda^{(k-1)})
\Lambda^{(k)} = \Lambda^{(k-1)} + (D U^{(k)} - V^{(k)})
end while
Output: U^{(k)}.
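The V- and Λ-updates close each outer iteration. A sketch, assuming for illustration that P_1(·; w) is a weighted sum of row-wise ℓ2 norms (so its prox is row-wise block soft-thresholding); a different choice of P_1 would swap in its own prox:

```python
import numpy as np

def prox_row_groups(A, thresholds):
    """Row-wise block soft-thresholding: prox of sum_l thresholds[l] * ||A[l]||_2."""
    out = np.zeros_like(A)
    for l in range(A.shape[0]):
        norm = np.linalg.norm(A[l])
        if norm > thresholds[l]:
            out[l] = (1.0 - thresholds[l] / norm) * A[l]
    return out

def v_lambda_updates(D, U, Lam_prev, gamma, rho, w):
    """V- and Lambda-updates of the 2-block ADMM iteration (illustrative names)."""
    V = prox_row_groups(D @ U + Lam_prev, (gamma / rho) * np.asarray(w))
    Lam = Lam_prev + (D @ U - V)  # dual ascent on the splitting constraint DU = V
    return V, Lam
```

The dual variable Λ accumulates the residual of the splitting constraint DU = V, so at convergence the fused differences agree with V.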