PeerJ Comput Sci. 2025 Jul 31;11:e3089. doi: 10.7717/peerj-cs.3089

Table 2. Summary of related works.

| Method | Core | Defects |
| --- | --- | --- |
| Anomal-E | E-GraphSAGE encoder + modified DGI; edge embeddings fed to anomaly detectors (PCA, IF, etc.). | Complex preprocessing; real-time efficiency unaddressed. |
| RNN-XGBoost | RNN variants (LSTM/GRU) + XGBoost for feature selection/classification. | Limited handling of high-dimensional heterogeneous data. |
| Adaptive CNN-GRU | 1D CNN (spatial) + GRU (temporal) + data preprocessing + softmax. | High computational load from deep architecture. |
| GJOADL-IDSNS | GJOA (feature selection) + A-BiLSTM (classification) + SSA (hyperparameter tuning). | High computational overhead from optimization algorithms. |
| Dugat-LSTM | Multi-step preprocessing (M-squared, KerPCA/CHbO) + LSTM classification. | Limited generalization to unknown attacks. |
| IDS-SIoDL | LSTM + feature engineering (autoencoder, GA, IG) for preprocessing/classification. | Unvalidated generalization to unseen attack variants. |
| Hybrid CNN-LSTM | CNN (spatial) + LSTM (temporal) + PCA optimization + model pruning. | Limited detection of unknown/novel attacks. |
| DRL-based IDS (DQN, DDQN, PG, AC) | DRL treats network features as states and labels as actions. | High training complexity. |
| MARL-based IDS | Two-layer architecture (detection agents + decision agent) + improved DQN. | Increased system complexity. |
| ID-RDRL | RFE + DT (feature selection) + DQN classification. | Poor non-linear feature handling. |
| AE-RL | Adversarial training of environment/classifier agents; dynamic resampling via Q-functions. | Stability issues in parallel training. |
| AE-SAC | Environment agent for data resampling + SAC to maximize action entropy and cumulative reward. | Complex structure causing longer training time. |
| Big-IDS | Decentralized MARL + shared target networks + cloud/streaming techniques. | High computational cost (encryption/distributed training). |
| RFS-DRL | DQN + RFE (feature selection) + epsilon-greedy/experience replay. | Low U2R detection accuracy; high hardware requirements. |
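The DRL-based rows above all share one framing: a (discretized) feature vector is the state, each traffic label is an action, and the agent is rewarded for predicting the correct label. A minimal tabular sketch of that framing, with epsilon-greedy exploration as in RFS-DRL, is shown below. The dataset, state/action counts, and hyperparameters are illustrative placeholders, not taken from any of the surveyed papers (which use deep Q-networks rather than a Q-table).

```python
import numpy as np

rng = np.random.default_rng(0)

N_STATES = 16   # discretized feature buckets (hypothetical)
N_ACTIONS = 2   # actions = labels: 0 = benign, 1 = attack
EPSILON = 0.1   # epsilon-greedy exploration rate
ALPHA = 0.5     # learning rate

# Synthetic "flows": a state index plus its ground-truth label,
# generated from a toy decision boundary.
states = rng.integers(0, N_STATES, size=2000)
labels = (states >= N_STATES // 2).astype(int)

Q = np.zeros((N_STATES, N_ACTIONS))

for s, y in zip(states, labels):
    # Epsilon-greedy action selection over the label set.
    if rng.random() < EPSILON:
        a = int(rng.integers(0, N_ACTIONS))
    else:
        a = int(np.argmax(Q[s]))
    reward = 1.0 if a == y else -1.0
    # One-step (bandit-style) update: classification has no next state,
    # so the target is just the immediate reward.
    Q[s, a] += ALPHA * (reward - Q[s, a])

# The greedy policy over Q is the learned classifier.
preds = np.argmax(Q[states], axis=1)
accuracy = float((preds == labels).mean())
print(f"training-set accuracy: {accuracy:.2f}")
```

The "high training complexity" defect noted in the table is visible even here: the agent must revisit each state many times before the greedy policy stabilizes, whereas a supervised learner sees the label directly on every sample.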