. 2022 Dec 16;22(24):9922. doi: 10.3390/s22249922
Algorithm 1 Pseudocode for training the BERT-DTCN.
Require: Xs: training set for the BERT-DTCN, including the constructed defect text dataset and labels; Nc: number of classifier-training iterations per mini-batch.
1: for the number of training iterations do
2:     Sample a mini-batch of m examples from the training set Xs;
3:     for i = 1 to Nc do
4:         Update the BERT-DTCN by minimizing the classification loss Loss_c;
5:     end for
6: end for
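The nested loop above can be sketched in plain Python. This is a minimal illustration of the control flow only, assuming the paper's setup: the single scalar parameter `w`, the toy quadratic loss, and the hyperparameter names `num_iterations`, `m`, `Nc`, and `lr` are hypothetical stand-ins for the BERT-DTCN parameters, its classification loss Loss_c, and the paper's actual settings.

```python
import random

def train_bert_dtcn(Xs, num_iterations=50, m=8, Nc=3, lr=0.1):
    """Sketch of Algorithm 1: outer loop over training iterations,
    inner loop of Nc classifier updates per sampled mini-batch."""
    w = 0.0  # hypothetical scalar standing in for the BERT-DTCN weights
    for _ in range(num_iterations):                     # line 1
        # line 2: sample a mini-batch of m (example, label) pairs from Xs
        batch = random.sample(Xs, min(m, len(Xs)))
        for _ in range(Nc):                             # line 3
            # line 4: one gradient step on a toy stand-in for Loss_c,
            # here mean squared error (w - y)^2 over the mini-batch
            grad = sum(2.0 * (w - y) for _, y in batch) / len(batch)
            w -= lr * grad
    return w
```

With this toy loss, the parameter converges toward the mean label of the dataset, which is enough to exercise the loop structure; in the paper the update would instead be a backpropagation step through the BERT-DTCN.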