Algorithm 1 The FFCNN back-propagation algorithm.
Input: Labeled source domain samples $\{(x_i^S, y_i)\}_{i=1}^{m}$, unlabeled target domain samples
$\{x_i^T\}_{i=1}^{m}$, regularization parameter $\lambda$, learning rate $\eta$, dilation rates $\{r_1, r_2, r_3\}$.
Output: Network parameters $\theta_{r_j}^{conv1}$, $\theta^{conv2}$, $\theta^{fc}$, $\theta^{clf}$ and predicted labels for target
 domain samples.
Begin:
 Initialize $\theta_{r_j}^{conv1}$, $\theta^{conv2}$, $\theta^{fc}$, $\theta^{clf}$.
while the stopping criterion is not met do
  for each mini-batch of m source and target domain samples do
   Calculate the output $x_{r_j}^{conv1}$ of each branch of the dilated convolution layer according to
   Equation (9).
   Concatenate $\{x_{r_j}^{conv1}\}_{j=1}^{3}$ and calculate the output of the second convolution layer according
   to Equation (10).
   Calculate the feature representations $z_i$ and the output of the softmax layer according to
   Equation (11).
   Calculate the loss $\mathcal{L}(y^S, \tilde{y}^S, z^S, z^T)$ according to Equation (12).
   Update $\theta_{r_j}^{conv1}$, $\theta^{conv2}$, $\theta^{fc}$, $\theta^{clf}$ according to Equation (13).
  end for
end while
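
To make the steps of Algorithm 1 concrete, the following is a minimal PyTorch sketch of one training iteration. The layer sizes, the 1-D input shape, and the use of a linear-kernel MMD as the domain term in the loss of Equation (12) are illustrative assumptions rather than the paper's exact architecture or loss; only the overall structure (three dilated-convolution branches, concatenation, a second convolution, a fully connected feature layer $z_i$, a softmax classifier, and a $\lambda$-weighted domain term) follows the algorithm.

```python
# Minimal sketch of Algorithm 1. Layer widths, the 1-D input shape, and the
# MMD-style domain term are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FFCNN(nn.Module):
    def __init__(self, dilation_rates=(1, 2, 3), n_classes=10):
        super().__init__()
        # One branch per dilation rate r_j (Equation (9)); padding keeps lengths equal.
        self.branches = nn.ModuleList(
            nn.Conv1d(1, 16, kernel_size=3, dilation=r, padding=r)
            for r in dilation_rates
        )
        # Second convolution layer over the concatenated branch outputs (Equation (10)).
        self.conv2 = nn.Conv1d(16 * len(dilation_rates), 32, kernel_size=3, padding=1)
        self.fc = nn.Linear(32, 64)          # feature representation z_i (Equation (11))
        self.clf = nn.Linear(64, n_classes)  # softmax classifier

    def forward(self, x):
        x = torch.cat([F.relu(b(x)) for b in self.branches], dim=1)
        x = F.relu(self.conv2(x))
        x = x.mean(dim=2)                    # global average pooling over time
        z = F.relu(self.fc(x))
        return z, self.clf(z)


def mmd(zs, zt):
    # Placeholder linear-kernel MMD between source and target features;
    # the paper's Equation (12) may use a different discrepancy measure.
    return ((zs.mean(dim=0) - zt.mean(dim=0)) ** 2).sum()


def train_step(model, opt, xs, ys, xt, lam=0.1):
    # One mini-batch update corresponding to Equations (12) and (13).
    zs, logits_s = model(xs)
    zt, _ = model(xt)
    loss = F.cross_entropy(logits_s, ys) + lam * mmd(zs, zt)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

Training would repeat train_step over paired source and target mini-batches until the stopping criterion of the outer loop is met; the learning rate $\eta$ is set in the optimizer, and the predicted target labels are read from the classifier output after training.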