PLoS One. 2024 Dec 26;19(12):e0312856. doi: 10.1371/journal.pone.0312856

Table 2. Description of the STL-LSTM algorithm.

Algorithm: LSTM model based on data characteristics and a spatio-temporal attention mechanism
Input: The input variable $x(k)=[x_1(k),x_2(k),\dots,x_T(k)]$ represents the short-term data of the $k$-th cycle, where $T$ is the dimensionality of the short-term data. $Q(k)$ represents the real capacity value at the $k$-th cycle.
Output: The output variable $\tilde{Q}(k)$ represents the estimated SOH (capacity) value at the $k$-th cycle.
Algorithm: Long Short-Term Memory network based on a spatio-temporal attention mechanism (STL-LSTM)
Procedure:
1. Standardize the data set.
2. Initialize the network parameters, including the weight matrix $W$ and the bias vectors $U$, $V$.
3. Execute for each time step $k = 1$ to $T$:
a. Short-term feature extraction:
For each time step $k = 1$ to $T$:
i. Calculate the activation values of the input, forget, and output gates:
Input gate: $i(k)=\sigma(W_i[s(k-1),c(k)]+b_i)$
Forget gate: $f(k)=\sigma(W_f[s(k-1),c(k)]+b_f)$
Output gate: $o(k)=\sigma(W_o[s(k-1),c(k)]+b_o)$
ii. Calculate the candidate cell state:
$\tilde{C}(k)=\tanh(W_c[s(k-1),c(k)]+b_c)$
iii. Update the cell state:
$C(k)=f(k)\odot C(k-1)+i(k)\odot \tilde{C}(k)$
iv. Update the hidden state:
$s(k)=o(k)\odot \tanh(C(k))$
b. Spatial attention mechanism:
For each time step $k = 1$ to $T$:
i. Calculate the spatial attention weight of each input feature with respect to the previous capacity $Q(k-1)$:
$\alpha_i(k)=\dfrac{\exp\lvert x_i(k)Q(k-1)\rvert}{\sum_{j=1}^{T}\exp\lvert x_j(k)Q(k-1)\rvert},\quad 1\le i\le T$
ii. Calculate the weighted sum of all input features to obtain the spatially attended feature $x_s(k)$:
$x_s(k)=\sum_{i=1}^{T}\alpha_i(k)x_i(k)$
c. Temporal attention mechanism:
i. Calculate the temporal attention weight of each attended feature with respect to the previous hidden state $s(k-1)$:
$\beta_i(k)=\dfrac{\exp\lvert x_{s,i}(k)s(k-1)\rvert}{\sum_{j=1}^{M}\exp\lvert x_{s,j}(k)s(k-1)\rvert},\quad 1\le i\le M$
ii. Calculate the final feature input $c(k)$:
$c(k)=\sum_{i=1}^{M}\beta_i(k)x_{s,i}(k)$
d. Capacity estimation:
i. Calculate the trend data $L(k)$ as the average capacity over the previous $M$ cycles:
$L(k)=\frac{1}{M}\sum_{i=1}^{M}Q(k-i)$
ii. Take $c(k)$, $L(k)$, $Q(k-1)$, and $\tilde{Q}(k-1)$ as inputs of the LSTM model and estimate the subsequent capacity $\tilde{Q}(k)$:
$\tilde{Q}(k)=f_l[c(k),L(k),Q(k-1),\tilde{Q}(k-1)]+b_v$
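To make the data flow of Table 2 concrete, the following is a minimal, untrained NumPy sketch of one STL-LSTM forward pass. It assumes scalar per-time-step features $x_i(k)$, attention scores formed as the products $x_i(k)Q(k-1)$ and $x_{s,i}(k)\bar{s}(k-1)$ (using the mean of $s(k-1)$ as a scalar summary, since the table does not fix how a vector hidden state enters the $\beta$ score), a sliding window of the last $M$ spatially attended features, and a linear readout standing in for $f_l$. The function names, window size `M=5`, hidden size, and random initialization are illustrative assumptions, not specifics from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def norm_exp_abs(scores):
    """Attention weights exp(|score|) / sum(exp(|score|)), mirroring the
    alpha/beta definitions in Table 2."""
    e = np.exp(np.abs(scores))
    return e / e.sum()

def lstm_cell(s_prev, C_prev, u_k, p):
    """One LSTM step over the concatenated [s(k-1), u(k)] vector (step a)."""
    z = np.concatenate([s_prev, u_k])
    i = sigmoid(p["Wi"] @ z + p["bi"])        # input gate  i(k)
    f = sigmoid(p["Wf"] @ z + p["bf"])        # forget gate f(k)
    o = sigmoid(p["Wo"] @ z + p["bo"])        # output gate o(k)
    C_tilde = np.tanh(p["Wc"] @ z + p["bc"])  # candidate cell state
    C = f * C_prev + i * C_tilde              # cell-state update
    s = o * np.tanh(C)                        # hidden-state update
    return s, C

def stl_lstm_forward(X, Q, M=5, hidden=8):
    """Untrained forward pass. X: (N, T) short-term scalar features per cycle;
    Q: (N,) measured capacities. Returns one-step-ahead estimates Q_hat."""
    N, _ = X.shape
    d_in = 4  # u(k) = [c(k), L(k), Q(k-1), Q_hat(k-1)], as in step d.ii
    p = {g: rng.normal(0.0, 0.1, (hidden, hidden + d_in))
         for g in ("Wi", "Wf", "Wo", "Wc")}
    p.update({b: np.zeros(hidden) for b in ("bi", "bf", "bo", "bc")})
    W_out, b_v = rng.normal(0.0, 0.1, hidden), 0.0  # linear head standing in for f_l

    s, C = np.zeros(hidden), np.zeros(hidden)
    xs_hist, Q_hat = [], np.zeros(N)
    Q_hat[0] = Q[0]  # seed the recursion with the first measured capacity
    for k in range(1, N):
        # (b) spatial attention over the T time steps of cycle k,
        #     scored against the previous capacity Q(k-1)
        alpha = norm_exp_abs(X[k] * Q[k - 1])
        x_s = float(alpha @ X[k])
        xs_hist.append(x_s)
        # (c) temporal attention over the last M attended features, scored
        #     against a scalar summary (mean) of the previous hidden state
        win = np.array(xs_hist[-M:])
        beta = norm_exp_abs(win * s.mean())
        c_k = float(beta @ win)
        # (d.i) trend data L(k): mean of the previous M measured capacities
        L_k = Q[max(0, k - M):k].mean()
        # (d.ii) LSTM step on [c(k), L(k), Q(k-1), Q_hat(k-1)], then readout
        u_k = np.array([c_k, L_k, Q[k - 1], Q_hat[k - 1]])
        s, C = lstm_cell(s, C, u_k, p)
        Q_hat[k] = W_out @ s + b_v
    return Q_hat

# Toy usage: 100 cycles, 20 time steps each, slowly fading capacity.
X = rng.normal(size=(100, 20))
Q = 1.0 - 0.002 * np.arange(100) + rng.normal(0.0, 0.002, 100)
print(stl_lstm_forward(X, Q)[:5])
```

Realizing $f_l$ as an LSTM step over the 4-dimensional input $[c(k), L(k), Q(k-1), \tilde{Q}(k-1)]$ followed by a linear head matches step d.ii of the table; in practice the weights would be trained against measured capacities rather than drawn at random.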