. 2020 Jun 10;20(11):3305. doi: 10.3390/s20113305
Algorithm 1: The solution of Equation (3). The Newton step is computed with linear conjugate gradient.
Input: S = (⟨(x_1, 1), …, (x_n, n)⟩, V, W, C)
ψ_i = Σ_{t=1}^{i} v_t x_t
ω(i, j) = max(w_i, w_j)
q_k ∈ Q = {ω(i, j)(ψ_i − ψ_j) | i > j}
u is randomly initialized
repeat
D_kk = 1(u^T q_k < 1)
g = u + 2C Q^T D(Qu − 1)
repeat
  Update based on the Hessian-vector product s + 2C Q^T D(Qs) for some vector s.
until Convergence of linear conjugate gradient
δu = (I + 2C Q^T D Q)^{−1} g
u ← u − τ δu (τ found by line search)
until Convergence of Newton
return u
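The gradient g and Hessian (I + 2C Q^T D Q) above are those of the quadratic-hinge objective f(u) = ½‖u‖² + C Σ_k max(0, 1 − q_k^T u)², so the inner loop is a matrix-free linear conjugate gradient and the outer loop a damped Newton iteration. A minimal NumPy sketch of that structure follows; the names (`newton_rank`, `cg_solve`) and the backtracking Armijo line search are illustrative assumptions, and the rows of the matrix `Q` are taken to be the pair vectors q_k already built from the data:

```python
import numpy as np

def objective(u, Q, C):
    """f(u) = 1/2 ||u||^2 + C * sum_k max(0, 1 - q_k^T u)^2 (quadratic hinge)."""
    margins = 1.0 - Q @ u
    return 0.5 * (u @ u) + C * np.sum(np.maximum(margins, 0.0) ** 2)

def cg_solve(hess_vec, b, max_iter=100, tol=1e-10):
    """Linear conjugate gradient for H x = b, using only Hessian-vector products."""
    x = np.zeros_like(b)
    r = b.copy()                              # residual b - H x, with x = 0
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Hp = hess_vec(p)
        alpha = rs / (p @ Hp)
        x += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:             # convergence of linear CG
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def newton_rank(Q, C=1.0, max_newton=50, tol=1e-8, seed=0):
    """Truncated-Newton solver; rows of Q are the pair vectors q_k."""
    rng = np.random.default_rng(seed)
    u = rng.normal(size=Q.shape[1])           # u is randomly initialized
    for _ in range(max_newton):
        active = Q @ u < 1.0                  # D_kk = 1(u^T q_k < 1)
        Qa = Q[active]                        # rows with nonzero D_kk
        g = u + 2.0 * C * Qa.T @ (Qa @ u - 1.0)
        if np.linalg.norm(g) < tol:           # convergence of Newton
            break
        # Hessian-vector product H s = s + 2C Q^T D (Q s), D fixed this step
        def hess_vec(s, Qa=Qa):
            return s + 2.0 * C * Qa.T @ (Qa @ s)
        delta = cg_solve(hess_vec, g)         # delta = (I + 2C Q^T D Q)^{-1} g
        # backtracking line search for tau (Armijo condition, assumed here)
        tau, f0, slope = 1.0, objective(u, Q, C), g @ delta
        for _ in range(30):
            if objective(u - tau * delta, Q, C) <= f0 - 1e-4 * tau * slope:
                break
            tau *= 0.5
        u = u - tau * delta
    return u
```

Since the Hessian is only ever applied to vectors, Q^T D Q is never formed explicitly, which keeps each inner iteration at O(nnz(Q)) cost; this matches the algorithm's note that the CG update needs only s + 2C Q^T D(Qs).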