Front Comput Neurosci. 2017 Jul 31;11:68. doi: 10.3389/fncom.2017.00068

Algorithm 2.

Low-rank MNE block coordinate descent algorithm (locally optimal domain)

1: inputs: maximum rank r, maximum range for regularization parameters ϵmax, number of regularization parameter grid points ngrid, training set indices Ttrain ⊆ {1, …, N} and cross-validation set indices TCV ⊂ {1, …, N} where Ttrain ∩ TCV = ∅, paired data samples (st, yt) for all t ∈ Ttrain ∪ TCV, initial guess for weights a, h, U, and V, set πk for all k = 1, …, r, maximum iterations Mmax, convergence precision δp, maximum failures to find a better solution σmax
2: initialization: J ← UVT, Lbest ← L(a, h, U, V)|TCV (evaluated over data indices t ∈ TCV), regularization grid resolution δϵ ← ϵmax/ngrid, early completion switch σ ← 1
3:  
4: for m ← 1, … , Mmax do
5:       for k ← 1, … , r do
6:            a′ ← a, h′ ← h, u′ ← U·,k, v′ ← V·,k
7:            J ← J − u′v′T                                                                                                 ⊳ remove block k from J
8:            for n ← 0, … , ngrid do
9:                 ϵk ← nδϵ
10:                 a′, h′, u′, v′ ← Solve the block k subproblem (Equation 13) using an
11:                    interior-point method algorithm with inputs a′, h′, u′, v′, J,
12:                    ϵk, πk, (st, yt): ∀t ∈ Ttrain
13:                 L′ ← L(a′, h′, J + u′v′T)|TCV
14:                 if L′ < Lbest − δp then
15:                    Lbest ← L′
16:                    a ← a′, h ← h′, U·,k ← u′, V·,k ← v′
17:                    σ ← 0
18:                 else if L′ ≤ L(a, h, U, V)|TCV then
19:                    a ← a′, h ← h′, U·,k ← u′, V·,k ← v′ (or skip for monotonic convergence)
20:            J ← J + U·,kV·,kT                                                                                      ⊳ include block k's solution in J
21:       if σ = σmax then
22:            break                                                                                                               ⊳ optimization has finished
23:       σ ← σ + 1
24:  
25: outputs: a, h, J
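
To make the control flow above concrete, the following is a minimal Python sketch of the same block coordinate descent structure. It assumes the standard second-order MNE likelihood P(y = 1 | s) = 1/(1 + exp(a + h·s + sᵀJs)) with J = UVᵀ; the solve_block routine, its ϵ-weighted ridge penalty, the omission of the constraint set πk, and all default hyperparameters are illustrative stand-ins for the interior-point solve of the Equation 13 subproblem, not the authors' implementation.

# Minimal sketch of Algorithm 2's loop structure (illustrative assumptions noted above).
import numpy as np
from scipy.optimize import minimize


def neg_log_likelihood(a, h, J, S, y):
    """Mean logistic negative log-likelihood of the second-order MNE model."""
    z = a + S @ h + np.einsum("ti,ij,tj->t", S, J, S)
    # -log P(y=1) = log(1 + e^z), -log P(y=0) = log(1 + e^-z); logaddexp keeps this stable.
    return np.mean(y * np.logaddexp(0.0, z) + (1 - y) * np.logaddexp(0.0, -z))


def solve_block(a, h, u, v, J_rest, eps, S, y):
    """Stand-in for the Equation 13 subproblem: refit (a, h, u, v) with J_rest held fixed."""
    d = S.shape[1]

    def unpack(x):
        return x[0], x[1:1 + d], x[1 + d:1 + 2 * d], x[1 + 2 * d:]

    def objective(x):
        a_, h_, u_, v_ = unpack(x)
        nll = neg_log_likelihood(a_, h_, J_rest + np.outer(u_, v_), S, y)
        return nll + eps * (u_ @ u_ + v_ @ v_)   # assumed ridge penalty on the block weights

    x0 = np.concatenate(([a], h, u, v))
    res = minimize(objective, x0, method="L-BFGS-B")   # numerical gradients, for brevity
    return unpack(res.x)


def low_rank_mne_bcd(S_tr, y_tr, S_cv, y_cv, rank, eps_max=1.0, n_grid=5,
                     max_iter=20, delta_p=1e-4, sigma_max=3, seed=0):
    """Outer loop mirroring Algorithm 2: sweep rank-1 blocks of J = U V^T over a grid of eps."""
    rng = np.random.default_rng(seed)
    d = S_tr.shape[1]
    a, h = 0.0, np.zeros(d)
    U = 1e-3 * rng.standard_normal((d, rank))
    V = 1e-3 * rng.standard_normal((d, rank))
    J = U @ V.T
    L_best = neg_log_likelihood(a, h, J, S_cv, y_cv)
    d_eps = eps_max / n_grid
    sigma = 1

    for _ in range(max_iter):
        for k in range(rank):
            a_p, h_p, u_p, v_p = a, h.copy(), U[:, k].copy(), V[:, k].copy()
            J = J - np.outer(u_p, v_p)                      # remove block k from J
            for n in range(n_grid + 1):
                eps_k = n * d_eps
                a_p, h_p, u_p, v_p = solve_block(a_p, h_p, u_p, v_p, J, eps_k, S_tr, y_tr)
                L_new = neg_log_likelihood(a_p, h_p, J + np.outer(u_p, v_p), S_cv, y_cv)
                if L_new < L_best - delta_p:                # strictly better on the CV set
                    L_best, sigma = L_new, 0
                    a, h, U[:, k], V[:, k] = a_p, h_p, u_p, v_p
                elif L_new <= neg_log_likelihood(a, h, U @ V.T, S_cv, y_cv):
                    a, h, U[:, k], V[:, k] = a_p, h_p, u_p, v_p
            J = J + np.outer(U[:, k], V[:, k])              # include block k's solution in J
        if sigma == sigma_max:
            break                                            # no recent CV improvement: stop
        sigma += 1
    return a, h, J

In practice the block subproblem would be solved with analytic gradients (or the paper's interior-point formulation) rather than SciPy's finite-difference L-BFGS, which is used here only to keep the sketch short.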