Algorithm 2: Proposed Model (SA-BiLSTM)

1. Input: a sequence of vectors (a1, a2, …, aT).

2. Bidirectional LSTM processing:
   For_LSTM: process the input sequence from left to right.
   Back_LSTM: process the input sequence from right to left.
   Concatenate outputs: form a single sequence of hidden states.

3. LSTM cell computation (a cell-level sketch follows this listing):
   The input gate is computed via Equation (26).
   The forget gate is computed via Equation (27).
   The output gate and input modulation gate are computed via Equations (28) and (29).
   The memory cell is updated via Equation (30).
   The hidden state is updated via Equation (31).

4. Multi-head attention mechanism (a model-level sketch follows this listing):
   The hidden states are divided into multiple "heads".
   Attention weights are computed separately for each head.
   Weighted sum: Σ(attention weights ⊙ hidden states), i.e., the element-wise products of the attention weights and hidden states, summed over the sequence.

5. Fully connected layers: process the attention-weighted hidden states through one or more fully connected layers.

6. Output: the classification result, computed from the attention-weighted hidden states.
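
To make the cell update in step 3 concrete, the following is a minimal NumPy sketch assuming the standard LSTM gate formulation; the exact gate definitions used by the model are those of Equations (26)-(31) in the text, and the equation references in the comments only indicate which standard gate each line corresponds to. The parameter names (W, U, b) and the dimensions are illustrative, not taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell_step(a_t, h_prev, c_prev, W, U, b):
    """One LSTM cell update, assuming the standard gate formulation."""
    i_t = sigmoid(W["i"] @ a_t + U["i"] @ h_prev + b["i"])  # input gate (cf. Eq. (26))
    f_t = sigmoid(W["f"] @ a_t + U["f"] @ h_prev + b["f"])  # forget gate (cf. Eq. (27))
    o_t = sigmoid(W["o"] @ a_t + U["o"] @ h_prev + b["o"])  # output gate (cf. Eq. (28))
    g_t = np.tanh(W["g"] @ a_t + U["g"] @ h_prev + b["g"])  # input modulation gate (cf. Eq. (29))
    c_t = f_t * c_prev + i_t * g_t                          # memory cell update (cf. Eq. (30))
    h_t = o_t * np.tanh(c_t)                                # hidden state update (cf. Eq. (31))
    return h_t, c_t

# Illustrative usage: run the cell over a short random sequence.
d, k = 8, 16                                                # input and hidden sizes (assumed)
rng = np.random.default_rng(0)
W = {g: 0.1 * rng.standard_normal((k, d)) for g in "ifog"}
U = {g: 0.1 * rng.standard_normal((k, k)) for g in "ifog"}
b = {g: np.zeros(k) for g in "ifog"}
h, c = np.zeros(k), np.zeros(k)
for a_t in rng.standard_normal((5, d)):                     # five input vectors a_t
    h, c = lstm_cell_step(a_t, h, c, W, U, b)
```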
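
At the model level, steps 2 and 4-6 can be sketched as below, using PyTorch purely for illustration. The hidden size, number of attention heads, mean pooling over the attended hidden states, the single fully connected layer, and the number of output classes are assumptions made for this example, not values specified by the algorithm.

```python
import torch
import torch.nn as nn

class SABiLSTM(nn.Module):
    """Illustrative sketch: BiLSTM -> multi-head self-attention -> FC -> classifier."""

    def __init__(self, input_dim, hidden_dim=128, num_heads=4, num_classes=2):
        super().__init__()
        # Step 2: forward and backward LSTMs whose hidden states are concatenated.
        self.bilstm = nn.LSTM(input_dim, hidden_dim, batch_first=True, bidirectional=True)
        # Step 4: hidden states split across heads; attention weights computed per head.
        self.attention = nn.MultiheadAttention(2 * hidden_dim, num_heads, batch_first=True)
        # Step 5: fully connected layer over the attention-weighted hidden states.
        self.fc = nn.Sequential(nn.Linear(2 * hidden_dim, hidden_dim), nn.ReLU())
        # Step 6: classification output.
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, a):                              # a: (batch, T, input_dim)
        h, _ = self.bilstm(a)                          # (batch, T, 2*hidden_dim)
        attended, weights = self.attention(h, h, h)    # weighted sums of hidden states
        pooled = attended.mean(dim=1)                  # pool over time (one possible choice)
        return self.classifier(self.fc(pooled))        # class logits

# Usage on a random batch of 8 sequences, each with T = 20 vectors of dimension 50:
model = SABiLSTM(input_dim=50)
logits = model(torch.randn(8, 20, 50))                 # shape (8, 2)
```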