Author manuscript; available in PMC: 2019 Jun 26.
Published in final edited form as: Adv Neural Inf Process Syst. 2018 Dec;31:5695–5705.

Algorithm 1.

Optimizing hyperparameters with the decoupled Laplace approximation

Require: input carriers g, choices y
Require: initial hyperparameters θ_0, subset of hyperparameters to be optimized θ_OPT
1: repeat
2:  Optimize for w given current θ → w_MAP, Hessian of log-posterior H_θ, log-evidence E
3:  Determine Gaussian prior N(0, C_θ) and Laplace approx. posterior N(w_MAP, −H_θ^{−1})
4:  Calculate Gaussian approximation to likelihood N(w_L, Γ) using the Gaussian product identity, where Γ^{−1} = −(H_θ + C_θ^{−1}) and w_L = −Γ H_θ w_MAP
5:  Optimize E w.r.t. θ_OPT using closed-form update (with sparse operations) w_MAP = −H_θ^{−1} Γ^{−1} w_L
6:  Update best θ and corresponding best E
7: until θ converges
8: return w_MAP and θ with best E
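The steps above can be sketched for a toy model. The following is a minimal illustration of the decoupled-Laplace idea, not the paper's implementation: it assumes a Bernoulli GLM with an isotropic prior N(0, σ²I) whose single hyperparameter σ² plays the role of θ, and all function names (`fit_map`, `log_evidence`, etc.) are our own. One expensive MAP fit (step 2) yields the Gaussian likelihood approximation N(w_L, Γ) (steps 3-4); hyperparameter search then needs only cheap Gaussian algebra (step 5).

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal

# Hypothetical toy data: input carriers g, binary choices y.
rng = np.random.default_rng(0)
D, T = 5, 400
g = rng.normal(size=(T, D))
w_true = rng.normal(size=D)
y = (rng.random(T) < 1 / (1 + np.exp(-(g @ w_true)))).astype(float)

def neg_log_lik(w):
    z = g @ w
    return np.sum(np.logaddexp(0.0, z) - y * z)

def fit_map(sigma2):
    """Step 2: w_MAP and Hessian H of the LOG-posterior, prior N(0, sigma2*I)."""
    Cinv = np.eye(D) / sigma2
    obj = lambda w: neg_log_lik(w) + 0.5 * w @ Cinv @ w
    w_map = minimize(obj, np.zeros(D), method="BFGS").x
    p = 1 / (1 + np.exp(-(g @ w_map)))
    H = -(g.T * (p * (1 - p))) @ g - Cinv  # negative definite
    return w_map, H

# Steps 2-4: one MAP fit at the initial hyperparameter, then decouple.
sigma2_0 = 1.0
w_map, H = fit_map(sigma2_0)
Gamma_inv = -(H + np.eye(D) / sigma2_0)   # likelihood precision Γ^{-1} = -(H + C^{-1})
Gamma = np.linalg.inv(Gamma_inv)
w_L = -Gamma @ H @ w_map                  # likelihood mean w_L = -Γ H w_MAP

def log_evidence(sigma2):
    """Step 5: evidence of the Gaussian surrogate (up to a θ-independent constant)."""
    C = sigma2 * np.eye(D)
    return multivariate_normal(mean=np.zeros(D), cov=C + Gamma).logpdf(w_L)

# Hyperparameter search without re-fitting; grid search stands in for an optimizer.
grid = np.logspace(-2, 2, 41)
best = grid[np.argmax([log_evidence(s) for s in grid])]
H_best = -(np.eye(D) / best + Gamma_inv)                  # new log-posterior Hessian
w_map_best = -np.linalg.solve(H_best, Gamma_inv @ w_L)    # w_MAP = -H^{-1} Γ^{-1} w_L
print("best sigma^2:", best)
```

In a real application this inner loop would be repeated until θ converges (step 7), re-fitting w_MAP at the new θ each time; sparse banded solvers replace the dense `inv`/`solve` calls when w is a long time-varying weight trajectory.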