. 2024 Feb 7;24(4):1092. doi: 10.3390/s24041092
Algorithm 6 Weighted Softmax Loss Function—Variant 1
  • Require: y, ŷ

  • ŷ ← exp(ŷ − max(ŷ))

  • ŷ ← ŷ / Σ(ŷ)

  • grad ← zero matrix with shape of ŷ

  • hess ← zero matrix with shape of ŷ

  • weights ← {(0, 1): 0.1, (1, 0): 0.1, (0, 2): 0.17, (2, 0): 0.17, (1, 2): 0.1, (2, 1): 0.1}

  • for i = 0 to length of y − 1 do

  •     for j = 0 to number of columns in ŷ − 1 do

  •         weight ← weights[min(y[i], j), max(y[i], j)]

  •         if weight is not set then

  •             weight ← 0

  •         end if

  •         prob ← ŷ[i, j]

  •         grad[i, j] ← weight × (prob − (y[i] == j))

  •         hess[i, j] ← weight × prob × (1 − prob) + 0.02

  •     end for

  • end for

  • return grad, hess
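The algorithm above can be sketched as a NumPy function of the kind used as a custom objective in gradient-boosting libraries (it returns the per-element gradient and Hessian of the weighted softmax loss). This is a minimal illustration, not the authors' implementation; the function name, the three-class assumption, and the example inputs are ours, while the pairwise weights, the `0.02` Hessian offset, and the (min, max) key lookup follow the pseudocode.

```python
import numpy as np

def weighted_softmax_loss_v1(y, y_hat):
    """Gradient and Hessian of the weighted softmax loss (Variant 1).

    y     : (n,) integer class labels in {0, 1, 2}
    y_hat : (n, 3) raw scores; converted to probabilities row-wise below
    """
    # Row-wise softmax, subtracting the row max for numerical stability
    y_hat = np.exp(y_hat - y_hat.max(axis=1, keepdims=True))
    y_hat = y_hat / y_hat.sum(axis=1, keepdims=True)

    grad = np.zeros_like(y_hat)
    hess = np.zeros_like(y_hat)

    # Pairwise class weights, keyed by the (min, max) ordered class pair
    weights = {(0, 1): 0.1, (1, 0): 0.1, (0, 2): 0.17,
               (2, 0): 0.17, (1, 2): 0.1, (2, 1): 0.1}

    for i in range(len(y)):
        for j in range(y_hat.shape[1]):
            # Pairs not listed (e.g. the diagonal (c, c)) default to weight 0
            weight = weights.get((min(y[i], j), max(y[i], j)), 0)
            prob = y_hat[i, j]
            grad[i, j] = weight * (prob - (y[i] == j))
            hess[i, j] = weight * prob * (1.0 - prob) + 0.02
    return grad, hess

# Example: two samples, uniform logits, so each softmax probability is 1/3
grad, hess = weighted_softmax_loss_v1(np.array([0, 2]), np.zeros((2, 3)))
```

Because the lookup key is always ordered as (min, max), the diagonal entries where `j == y[i]` miss the dictionary and get weight 0, so only off-diagonal (misclassification) terms contribute to the gradient; the constant 0.02 keeps every Hessian entry strictly positive.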