J Imaging. 2022 Apr 2;8(4):97. doi: 10.3390/jimaging8040097
Algorithm 2: LDA steps
1: Compute the $d$-dimensional mean vectors of the dataset classes:
$m_i = \frac{1}{n_i} \sum_{x \in D_i} x$ (4)
2: Compute the scatter matrices: the between-class and within-class scatter matrices.
The within-class scatter matrix SW is computed by the following equation:
$S_W = \sum_{i=1}^{c} S_i$ (5)
where $S_i = \sum_{x \in D_i} (x - m_i)(x - m_i)^T$ (6)
The between-class scatter matrix SB is computed by the following equation:
$S_B = \sum_{i=1}^{c} N_i (m_i - m)(m_i - m)^T$ (7)
where $m$ is the overall mean, and $m_i$ and $N_i$ are the sample mean and the size of the $i$-th class, respectively.
3: Compute the eigenvectors and associated eigenvalues of the matrix $S_W^{-1} S_B$.
4: Sort the eigenvectors by decreasing eigenvalue and select the $k$ eigenvectors with the largest eigenvalues to form a $d \times k$ matrix $W$.
5: Use the eigenvector matrix $W$ to project the original samples onto the new subspace via the equation:
$Y = XW$ (8)
where $X$ is an $n \times d$ matrix representing the $n$ samples, and $Y$ is the transformed $n \times k$ sample matrix in the new subspace.
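The steps of Algorithm 2 can be sketched in a few lines of NumPy; the function name `lda_fit` and the use of `np.linalg.inv` to form $S_W^{-1} S_B$ are illustrative choices, not part of the original text:

```python
import numpy as np

def lda_fit(X, y, k):
    """Sketch of the LDA steps above: class means, scatter matrices,
    eigendecomposition of S_W^{-1} S_B, and projection to k dimensions."""
    classes = np.unique(y)
    n, d = X.shape
    m = X.mean(axis=0)                       # overall mean m (d,)

    S_W = np.zeros((d, d))                   # within-class scatter, Eq. (5)
    S_B = np.zeros((d, d))                   # between-class scatter, Eq. (7)
    for c in classes:
        Xc = X[y == c]
        mi = Xc.mean(axis=0)                 # class mean m_i, Eq. (4)
        diff = Xc - mi
        S_W += diff.T @ diff                 # S_i accumulated into S_W, Eq. (6)
        dm = (mi - m).reshape(-1, 1)
        S_B += Xc.shape[0] * (dm @ dm.T)     # N_i (m_i - m)(m_i - m)^T

    # Step 3: eigenpairs of S_W^{-1} S_B; step 4: sort by decreasing eigenvalue
    eigvals, eigvecs = np.linalg.eig(np.linalg.inv(S_W) @ S_B)
    order = np.argsort(eigvals.real)[::-1]
    W = eigvecs[:, order[:k]].real           # d x k projection matrix W

    return X @ W                             # step 5: Y = X W, Eq. (8)
```

In practice a solver such as `scipy.linalg.eigh` on the generalized problem is preferred over explicitly inverting $S_W$, which can be ill-conditioned; the explicit inverse is kept here only to mirror the algorithm as stated.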