2022 Mar 27;12(4):508. doi: 10.3390/biom12040508
Algorithm 1: Autoencoder Algorithm. Pseudocode of the proposed autoencoder algorithm for 2D molecular fingerprints.
1: Mols = 2D fingerprint dataset descriptor
2: M = number of database molecules   // 102,516
3: N = number of hidden layers
4: α = learning rate value (used here as the error threshold)
5: Epoch = 0
6: For k = 1:M   // for all dataset molecules
7:   Input = Mols(k)   // input data
8:   x = Mols(k)   // initialize the encoder input layer with molecule k
9:   AE(x)   // Autoencoder function
10:    For i = 1 until N   // start the encoder phase
11:      If Epoch = 0 then   // first-time training
12:        wi = random(0,1)   // initialize the weight matrix for the first training pass
13:        bi = random(0,1)   // initialize the bias vector for the first training pass
14:      Else
15:        wi = wi + Δwi   // update the weight matrix based on the error value
16:        bi = bi + Δbi   // update the bias vector based on the error value
17:      hi = 1 / (1 + e^−(wi·x + bi))   // calculate the hidden-layer values based on Equation (1)
18:      x = hi   // the hidden-layer values become the input to the next hidden layer
    End   // end encoder phase
19:   Encoded_data = x   // keep the last encoder layer, which is the new molecule representation
20:   h = x   // the last encoder layer becomes the decoder input
21:   n = N
22:   For j = 1 until N   // start the decoder phase
23:      ŵj = wnᵀ   // the weight matrix of decoder layer j is the transpose of encoder layer n's weight matrix
24:      b̂j = bnᵀ   // the bias vector of decoder layer j is the transpose of encoder layer n's bias vector
25:      zj = 1 / (1 + e^−(ŵj·h + b̂j))   // calculate zj, the reconstructed decoder-layer values
26:      h = zj   // the decoder-layer values become the input to the next hidden layer
27:      n = n − 1
   End   // end decoder phase
28:   output = h   // reconstructed data
29:   ε = ‖Input − output‖²   // calculate the error value based on Equation (3)
30:   If (ε > α)   // if the error value is greater than the threshold
31:      Epoch = Epoch + 1   // more training is needed to reduce the error
32:      Go to 9   // call the AE function again for further training (fine-tuning)
   Else
33:      New_Rep_mols(k) = Encoded_data   // store the compressed representation of molecule k
   End If
End   // end dataset loop
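The steps of Algorithm 1 can be sketched as a small runnable program. This is a minimal illustration, not the authors' implementation: it assumes a single encoder layer (N = 1), tied decoder weights ŵ = wᵀ as in step 23, sigmoid activations as in Equation (1), the squared reconstruction error of Equation (3), and plain gradient descent standing in for the unspecified "wi = wi + Δwi" update. The class and function names (`TiedAutoencoder`, `encode`, `decode`, `train_step`) are illustrative, and a separate output bias is added since a decoder needs a bias of input dimension.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(a):
    # h = 1 / (1 + e^-a), the activation used in steps 17 and 25
    return 1.0 / (1.0 + np.exp(-a))

class TiedAutoencoder:
    """Single-hidden-layer autoencoder with tied weights (decoder uses w^T)."""

    def __init__(self, n_in, n_hidden):
        # Weights and biases initialized uniformly in (0, 1), as in steps 12-13
        self.w = rng.random((n_hidden, n_in))
        self.b = rng.random(n_hidden)
        self.b_out = rng.random(n_in)  # extra output bias (an assumption)

    def encode(self, x):
        # Step 17: hidden representation of the fingerprint
        return sigmoid(self.w @ x + self.b)

    def decode(self, h):
        # Steps 23-25: decoder reuses the transposed encoder weights
        return sigmoid(self.w.T @ h + self.b_out)

    def train_step(self, x, lr=0.5):
        # One forward/backward pass; gradient descent plays the role of
        # the "update based on the error value" in steps 15-16.
        h = self.encode(x)
        z = self.decode(h)
        err = z - x
        dz = err * z * (1 - z)            # gradient at the decoder pre-activation
        dh = (self.w @ dz) * h * (1 - h)  # gradient at the encoder pre-activation
        self.w -= lr * (np.outer(dh, x) + np.outer(h, dz))  # tied-weight gradient
        self.b -= lr * dh
        self.b_out -= lr * dz
        return float(np.sum(err ** 2))    # Equation (3): squared reconstruction error

# Toy stand-in for the 2D fingerprint dataset: 8 binary vectors of length 16
mols = rng.integers(0, 2, size=(8, 16)).astype(float)
ae = TiedAutoencoder(n_in=16, n_hidden=6)
for epoch in range(500):
    loss = sum(ae.train_step(x) for x in mols)

# Step 33: the encoded layer is the new, compressed molecule representation
new_rep_mols = np.array([ae.encode(x) for x in mols])
```

After training, `new_rep_mols` holds the lower-dimensional representations that Algorithm 1 stores in `New_Rep_mols(k)`; in the paper's setting the input would be the full fingerprint of each of the 102,516 database molecules rather than this toy data.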