
Algorithm 1. Attention mechanism-based LSTM with class weight.

    Require: Training data X_train and labels Y_train, testing data X_test and labels Y_test, batch size b, epochs e, number of LSTM units u, dropout rate d, learning rate α, class weight w, and optimizer o
    Ensure: Trained model
1:     Procedure ATTENTIONLAYER(input_shape, return_sequences=True)
2:        b ← Initialize biases with zeros
3:        return Weighted sum of input
4:     end Procedure
5:     Procedure CREATE_MODEL(units=u, dropout_rate=d, optimizer=o, learning_rate=α)
6:        LSTM layer with u units and input shape
7:        Attention layer
8:        Dropout layer with dropout rate d
9:        Dense layer with sigmoid activation
10:       Compile model with binary cross-entropy loss and optimizer o with learning rate α
11:        return Compiled model
12:     end Procedure
13:      model ← CREATE_MODEL(units=u, dropout_rate=d, optimizer=Adam, learning_rate=α)
14:      model.fit(X_train, Y_train, batch_size=b, epochs=e, class_weight=w)
15:      return Trained model
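
For concreteness, the following is a minimal runnable sketch of Algorithm 1 in Python. TensorFlow/Keras is an assumption inferred from the pseudocode's model.fit/class_weight calls; the layer sizes, timesteps, synthetic data, and class-weight values below are illustrative placeholders, not values from the paper.

    # Minimal sketch of Algorithm 1 (assumed framework: TensorFlow/Keras).
    # Shapes, units, and class weights are illustrative placeholders.
    import numpy as np
    import tensorflow as tf
    from tensorflow.keras import layers, models, optimizers

    class AttentionLayer(layers.Layer):
        """Additive attention over LSTM timesteps; returns a weighted sum."""
        def build(self, input_shape):
            # One score weight per feature; biases initialized with zeros,
            # as in lines 1-3 of the pseudocode.
            self.W = self.add_weight(name="att_weight",
                                     shape=(input_shape[-1], 1),
                                     initializer="glorot_uniform")
            self.b = self.add_weight(name="att_bias",
                                     shape=(input_shape[1], 1),
                                     initializer="zeros")
            super().build(input_shape)

        def call(self, x):
            # x has shape (batch, timesteps, features)
            e = tf.tanh(tf.matmul(x, self.W) + self.b)  # unnormalized scores
            a = tf.nn.softmax(e, axis=1)                # attention weights over time
            return tf.reduce_sum(x * a, axis=1)         # weighted sum of input

    def create_model(units=64, dropout_rate=0.2, learning_rate=1e-3,
                     timesteps=30, n_features=8):
        inputs = layers.Input(shape=(timesteps, n_features))
        x = layers.LSTM(units, return_sequences=True)(inputs)  # LSTM layer, u units
        x = AttentionLayer()(x)                                # attention layer
        x = layers.Dropout(dropout_rate)(x)                    # dropout rate d
        outputs = layers.Dense(1, activation="sigmoid")(x)     # sigmoid output
        model = models.Model(inputs, outputs)
        # Binary cross-entropy loss, Adam optimizer with learning rate α
        model.compile(loss="binary_crossentropy",
                      optimizer=optimizers.Adam(learning_rate=learning_rate),
                      metrics=["accuracy"])
        return model

    # Synthetic stand-ins for X_train / Y_train (placeholders only).
    X_train = np.random.rand(100, 30, 8).astype("float32")
    Y_train = np.random.randint(0, 2, size=(100, 1))

    model = create_model()
    # class_weight w up-weights the minority class, e.g. {0: 1.0, 1: 5.0}
    model.fit(X_train, Y_train, batch_size=32, epochs=5,
              class_weight={0: 1.0, 1: 5.0})

Passing class_weight to model.fit scales the loss contribution of each class, so errors on the up-weighted (minority) class are penalized more heavily during training without resampling the data.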