Algorithm 1: ShuffleNet-Light Architecture for Feature Extraction and Classification of PSLs
Input: Input tensor (X), a 2-D (256 × 256 × 3) PSLs training dataset.
Output: Obtained and classified feature map $x = (x_1, x_2, \ldots, x_n)$ of the augmented 2-D image.
Main Process:
Step 1. Define number of stages = 4
Step 2. Iterate for Each Stage
  • (a) Depthwise-CNN is applied to tensor x with a kernel size of (3 × 3) and a number of filters, followed by batch normalization, the ReLU activation function, a Pointwise-CNN with a kernel size of (1 × 1), batch normalization, and the GELU activation function.
  • (b) Pointwise-CNN is applied to tensor x with a kernel size of (1 × 1) and a number of filters, followed by batch normalization, the ReLU activation function, a second Pointwise-CNN with a kernel size of (1 × 1), batch normalization, and the GELU activation function.
Step 3. Fscale = Squeeze-and-Excitation (SE) block, which contains expansion (1 × 1 × 3) layers.
Step 4. Fcat(i) = concatenation (# feature maps)
Step 5. channel = shuffle (x)
[End Step 2]
Step 6. Model Construction
  • (a) Define the Global-Average-Pooling layer.
  • (b) Define the Fully-Connected (FC) layer and apply the GELU activation function.
Step 7. Afterward, the feature map $x = (x_1, x_2, \ldots, x_n)$ is generated and recognized by the Softmax function.
Step 8. Test sample labels $y_i^t$ are predicted using the decision function $y_i^t = \sum_{t=0}^{M-1} f_t(x_i)$.
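To make the listed steps concrete, the following is a minimal PyTorch sketch of a ShuffleNet-Light-style block and network covering Steps 1–7. The channel widths, the SE reduction ratio, the shuffle group count, the max-pooling between stages, and the seven-class output head are illustrative assumptions, not the authors' exact configuration; "branch normalization" in the algorithm is read here as batch normalization.

```python
import torch
import torch.nn as nn


def channel_shuffle(x, groups):
    """Step 5: interleave channels across groups so information mixes between branches."""
    n, c, h, w = x.size()
    x = x.view(n, groups, c // groups, h, w).transpose(1, 2).contiguous()
    return x.view(n, c, h, w)


class SEBlock(nn.Module):
    """Step 3: Squeeze-and-Excitation recalibration producing F_scale."""
    def __init__(self, channels, reduction=4):  # reduction ratio is an assumption
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid(),
        )

    def forward(self, x):
        n, c, _, _ = x.size()
        scale = self.fc(self.pool(x).view(n, c)).view(n, c, 1, 1)
        return x * scale


class ShuffleLightUnit(nn.Module):
    """One stage of Step 2: two branches, concatenation (Step 4), SE, channel shuffle."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        mid = out_ch // 2
        self.branch_a = nn.Sequential(  # Step 2(a)
            nn.Conv2d(in_ch, in_ch, 3, padding=1, groups=in_ch, bias=False),  # depthwise 3x3
            nn.BatchNorm2d(in_ch), nn.ReLU(inplace=True),
            nn.Conv2d(in_ch, mid, 1, bias=False),  # pointwise 1x1
            nn.BatchNorm2d(mid), nn.GELU(),
        )
        self.branch_b = nn.Sequential(  # Step 2(b)
            nn.Conv2d(in_ch, mid, 1, bias=False),  # pointwise 1x1
            nn.BatchNorm2d(mid), nn.ReLU(inplace=True),
            nn.Conv2d(mid, mid, 1, bias=False),    # second pointwise 1x1
            nn.BatchNorm2d(mid), nn.GELU(),
        )
        self.se = SEBlock(out_ch)

    def forward(self, x):
        x = torch.cat([self.branch_a(x), self.branch_b(x)], dim=1)  # Step 4: F_cat
        x = self.se(x)                                              # Step 3: F_scale
        return channel_shuffle(x, groups=2)                         # Step 5


class ShuffleNetLight(nn.Module):
    """Steps 1, 6 and 7: four stages, global average pooling, FC + GELU, Softmax."""
    def __init__(self, num_classes=7, widths=(24, 48, 96, 192)):  # widths/classes assumed
        super().__init__()
        layers, in_ch = [], 3
        for w in widths:  # Step 1: number of stages = 4
            layers += [ShuffleLightUnit(in_ch, w), nn.MaxPool2d(2)]  # downsampling assumed
            in_ch = w
        self.features = nn.Sequential(*layers)
        self.gap = nn.AdaptiveAvgPool2d(1)  # Step 6(a)
        self.fc = nn.Sequential(nn.Linear(in_ch, num_classes), nn.GELU())  # Step 6(b)

    def forward(self, x):
        x = self.gap(self.features(x)).flatten(1)
        return torch.softmax(self.fc(x), dim=1)  # Step 7: class probabilities
```

For a (256 × 256 × 3) input, `ShuffleNetLight()(torch.randn(1, 3, 256, 256))` returns a 1 × 7 vector of class probabilities. The Step 8 decision rule, which sums the outputs $f_t(x_i)$ of $M$ classifiers, would be applied on top of such per-model predictions and is not part of this sketch.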