
Algorithm 1. Optimizing hyperparameters with the decoupled Laplace approximation

Require: inputs $x$, choices $y$
Require: initial hyperparameters $\theta_0$, subset of hyperparameters to be optimized $\theta_{\mathrm{opt}}$
1: repeat
2: Optimize for $w$ given current $\theta$, yielding $w_{\mathrm{MAP}}$, the Hessian of the log-posterior $H_\theta$, and the log-evidence $E$
3: Determine Gaussian prior $\mathcal{N}(0, C_\theta)$ and Laplace appx. posterior $\mathcal{N}(w_{\mathrm{MAP}}, -H_\theta^{-1})$
4: Calculate Gaussian approximation to likelihood $\mathcal{N}(w_L, \Gamma)$ using the product identity, where $\Gamma^{-1} = -(H_\theta + C_\theta^{-1})$ and $w_L = -\Gamma H_\theta w_{\mathrm{MAP}}$
5: Optimize $E$ w.r.t. $\theta_{\mathrm{opt}}$ using the closed-form update (with sparse operations) $w_{\mathrm{MAP}}' = -H_{\theta'}^{-1}\,\Gamma^{-1} w_L$, where $-H_{\theta'} = \Gamma^{-1} + C_{\theta'}^{-1}$
6: Update best $\theta$ and corresponding best $E$
7: until $\theta$ converges
8: return $w_{\mathrm{MAP}}$ and $\theta$ with best $E$
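
To make the loop concrete, here is a minimal sketch in Python under simplifying assumptions: a static logistic-regression likelihood stands in for the paper's dynamic psychophysical GLM, a single log-variance hyperparameter $\theta$ gives $C_\theta = e^{\theta} I$, and dense linear algebra replaces the sparse operations the algorithm calls for. All function names below are hypothetical. The point of the decoupled step is that once the likelihood is frozen at $\mathcal{N}(w_L, \Gamma)$, the evidence for any candidate $\theta'$ reduces to the Gaussian convolution $\mathcal{N}(w_L; 0, \Gamma + C_{\theta'})$, so the inner evidence optimization never re-runs the MAP fit.

```python
# Sketch of the decoupled-Laplace loop for a static logistic model with
# C_theta = exp(theta) * I. Illustrative assumptions, not the paper's code.
import numpy as np
from scipy.optimize import minimize


def neg_log_lik(w, X, y):
    """Bernoulli logistic negative log-likelihood (y is a 0/1 float array)."""
    z = X @ w
    return np.sum(np.logaddexp(0.0, z)) - y @ z


def neg_log_lik_hessian(w, X):
    """Hessian of the negative log-likelihood: X^T diag(p(1-p)) X."""
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    return (X * (p * (1.0 - p))[:, None]).T @ X


def fit_map(theta, X, y):
    """Step 2: w_MAP and Hessian H of the log-posterior for fixed theta."""
    d = X.shape[1]
    C_inv = np.exp(-theta) * np.eye(d)            # prior precision C_theta^{-1}
    obj = lambda w: neg_log_lik(w, X, y) + 0.5 * w @ C_inv @ w
    w_map = minimize(obj, np.zeros(d), method="BFGS").x
    H = -(neg_log_lik_hessian(w_map, X) + C_inv)  # negative definite at the mode
    return w_map, H


def approx_log_evidence(theta_new, w_L, Gamma):
    """Step 5: with the likelihood frozen at N(w_L, Gamma), the evidence for a
    new theta is N(w_L; 0, Gamma + C_theta'), up to a theta-independent constant."""
    d = len(w_L)
    S = Gamma + np.exp(theta_new) * np.eye(d)
    _, logdet = np.linalg.slogdet(S)
    return -0.5 * (logdet + w_L @ np.linalg.solve(S, w_L))


def optimize_hyperparams(X, y, theta0, max_iters=25, tol=1e-4):
    theta = theta0
    for _ in range(max_iters):
        w_map, H = fit_map(theta, X, y)                   # steps 2-3
        d = len(w_map)
        C_inv = np.exp(-theta) * np.eye(d)
        # Step 4: Gamma^{-1} = -(H + C^{-1}); jitter guards against a
        # singular likelihood Hessian.
        Gamma = np.linalg.inv(-(H + C_inv) + 1e-9 * np.eye(d))
        w_L = Gamma @ (-(H @ w_map))                      # step 4: w_L = -Gamma H w_MAP
        res = minimize(lambda t: -approx_log_evidence(t[0], w_L, Gamma),
                       np.array([theta]))                 # step 5
        theta_new = float(res.x[0])
        if abs(theta_new - theta) < tol:                  # step 7
            theta = theta_new
            break
        theta = theta_new
    w_map, _ = fit_map(theta, X, y)                       # step 8
    return w_map, theta


# Example usage on synthetic data:
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
w_true = rng.normal(size=5)
y = (rng.random(500) < 1.0 / (1.0 + np.exp(-(X @ w_true)))).astype(float)
w_hat, theta_hat = optimize_hyperparams(X, y, theta0=0.0)
```

The design point of the decoupled step is visible in `approx_log_evidence`: each trial value of $\theta$ costs one determinant and one linear solve, whereas a naive evidence ascent would re-run the inner MAP optimization for every $\theta$ evaluated.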