Table 8. Conclusions about learning approaches.
| Ml And Dl Techniques | Advantages | Disadvantages | Suitability towards the Attacks |
|---|---|---|---|
| DT | Inherent feature selection, less preprocessing required, simple and easy to implement, can handle missing values; coupling with clustering decreases the processing time in misuse-based detection [29]. | Long training time, high complexity, and small alterations in the data cause significant changes to the tree. | C4.5 and C5.0 show very similar results to ANN in [110] with real IoT data. J48 shows a high affinity towards the DOS attack [111]. |
| SVM | High success rate in IDS, well suited for binary classification, requires only small datasets for training; enhanced SVM shows better results on novel and real attacks. | Weak in multiclass classification, massive memory consumption, performance depends on the kernel function. | Used in [9] for attack detection; also useful for spoofing attacks, intrusions in access control [112], and online outlier detection [113]. |
| KNN | Fast training phase; makes no assumptions about the data. | Requires abundant storage, computationally expensive at prediction time, sensitive to the value of K, and suffers from the curse of dimensionality. | Mostly used in combination with other classifiers [48,107]. Useful for access control intrusion detection and malware detection. |
| RF | No feature selection required, no overfitting problem, usually achieves the best accuracy. | Time-consuming because many decision trees must be built. | Achieved 99% accuracy for the DOS attack [106]. Useful for malware detection, link fault detection [83], and access control. |
| NB | Robust to noise, simple and easy to implement. | The assumption of independence among features prevents it from capturing useful information carried by feature dependencies. | Used in [49] for intrusion detection and access control. |
| ANN | Robust model that can handle non-linear data. | Suffers from overfitting and is time-consuming to train; selecting the activation function and estimating an appropriate number of units in each layer are additional overheads. | Very useful for DOS attack detection [83,114]. |
| RNN | Efficient modeling of time-series data. | Difficult to train; cannot remember very long sequences with ReLU or tanh activation functions [115]. | Eavesdropping detection [107]. |
| LSTM | Reduces the load of feature engineering, effective for unstructured datasets, can remember long sequences of attack patterns. | Difficult to train because of very large memory bandwidth requirements. | IoT malware [108], botnet activities; used in [116] for attack detection in fog networks. |
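
To make the comparison in Table 8 concrete, the following is a minimal sketch of how the classical techniques (DT, SVM, KNN, RF, NB) could be trained and compared with scikit-learn. It is not the experimental setup of the cited works: the synthetic dataset, feature dimensions, and hyperparameters are illustrative assumptions only.

```python
# Minimal sketch (assumed data and hyperparameters): training the classical
# classifiers from Table 8 on a synthetic binary "attack vs. normal" dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

# Synthetic stand-in for network-traffic features (e.g., packet counts, durations).
X, y = make_classification(n_samples=5000, n_features=20, n_informative=10,
                           weights=[0.8, 0.2], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

models = {
    "DT":  DecisionTreeClassifier(max_depth=10, random_state=0),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),      # kernel choice matters (Table 8)
    "KNN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),  # sensitive to K
    "RF":  RandomForestClassifier(n_estimators=100, random_state=0),
    "NB":  GaussianNB(),                                                    # assumes feature independence
}

for name, model in models.items():
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: accuracy = {acc:.3f}")
```

In practice, the trade-offs listed in Table 8 (e.g., the kernel dependence of SVM or the storage and K-sensitivity of KNN) would be explored by varying these hyperparameters and measuring detection metrics beyond accuracy, such as precision and recall per attack class.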
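For the sequence models (RNN/LSTM), a comparable sketch is shown below using Keras. The architecture, window length, and features are assumptions for illustration and are not taken from [108] or [116]; the random arrays merely stand in for fixed-length windows of traffic features labeled as attack or benign.

```python
# Minimal sketch (assumed architecture and shapes): an LSTM classifier over
# fixed-length windows of traffic features, illustrating the sequence models
# summarized in Table 8.
import numpy as np
import tensorflow as tf

timesteps, n_features = 50, 8          # 50 packets per window, 8 features each (assumed)
X = np.random.rand(1000, timesteps, n_features).astype("float32")  # placeholder traffic windows
y = np.random.randint(0, 2, size=(1000,))                          # 1 = attack, 0 = benign

model = tf.keras.Sequential([
    tf.keras.Input(shape=(timesteps, n_features)),
    tf.keras.layers.LSTM(64),                            # can remember long attack patterns (Table 8)
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=32, validation_split=0.2)
```

The same skeleton applies to a plain RNN by replacing the LSTM layer with a SimpleRNN layer, which makes the training-difficulty contrast noted in Table 8 easy to observe on longer windows.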