Sensors. 2024 Feb 7;24(4):1092. doi: 10.3390/s24041092
Algorithm 7 Weighted Softmax Loss with Edge Penalty
  • Require: y, ŷ

  • Initialize grad and hess as zero matrices of the same shape as ŷ

  • Define class-pair weights

  • Set extra_penalty ← 1.2

  • for i = 0 to len(y) − 1 do

  •     for j = 0 to columns(ŷ) − 1 do

  •         weight ← weight for the pair (min(y[i], j), max(y[i], j))

  •         prob ← ŷ[i, j]

  •         penalty ← extra_penalty if y[i] or j is an edge class, else 1

  •         grad[i, j] ← penalty × weight × (prob − 𝟙(y[i] == j))

  •         hess[i, j] ← penalty × weight × prob × (1 − prob) + 0.02

  •     end for

  • end for

  • return grad, hess
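The steps above can be sketched in NumPy as a per-element gradient/hessian computation, in the style of a custom multiclass objective for a gradient-boosting library. The function name `weighted_softmax_grad_hess`, the `pair_weights` dictionary, and the `edge_classes` parameter are illustrative assumptions; the extra penalty of 1.2 and the 0.02 hessian stabilizer come from the algorithm itself.

```python
import numpy as np

def weighted_softmax_grad_hess(y, y_hat, pair_weights, extra_penalty=1.2,
                               edge_classes=(0,)):
    """Sketch of Algorithm 7: weighted softmax loss with edge penalty.

    y         : integer class labels, shape (n,)
    y_hat     : softmax probabilities, shape (n, n_classes)
    pair_weights : dict mapping unordered class pairs (min, max) to a
                   weight; pairs not listed default to 1.0 (assumption)
    edge_classes : classes treated as "edge" classes (assumption)
    """
    grad = np.zeros_like(y_hat)
    hess = np.zeros_like(y_hat)
    for i in range(len(y)):
        for j in range(y_hat.shape[1]):
            # weight for the unordered pair (min(y[i], j), max(y[i], j))
            weight = pair_weights.get((min(y[i], j), max(y[i], j)), 1.0)
            prob = y_hat[i, j]
            # extra penalty when either the true or candidate class is an edge class
            penalty = extra_penalty if (y[i] in edge_classes or j in edge_classes) else 1.0
            # standard softmax gradient (prob minus indicator), scaled
            grad[i, j] = penalty * weight * (prob - (y[i] == j))
            # softmax hessian with a small additive stabilizer
            hess[i, j] = penalty * weight * prob * (1 - prob) + 0.02
    return grad, hess
```

A function of this shape (returning flattened `grad` and `hess`) is what XGBoost- or LightGBM-style custom objectives expect; the double loop mirrors the pseudocode and could be vectorized for production use.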