Artificial Neural Network
ANN
It is trained by processing examples, each of which contains a known “input” and “result,” forming probability-weighted associations between the two, which are stored within the data structure of the net itself [34].
Backpropagation Neural Network |
- |
Backpropagation is a process involved in training a neural network: the error from a forward pass is fed backward through the network's layers to fine-tune the weights. Backpropagation is the essence of neural network training [35].
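
As an illustration of the two entries above (a network trained on known input/result pairs, with the forward-pass error fed backward to adjust the weights), a minimal NumPy sketch follows; the two-layer architecture, sigmoid activations, learning rate, and random data are illustrative assumptions, not part of the cited definitions.

```python
import numpy as np

# Minimal sketch: one hidden layer, sigmoid activations, squared error.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))             # 8 examples, 3 input features
y = rng.integers(0, 2, size=(8, 1))     # known "results" (targets)

W1, W2 = rng.normal(size=(3, 4)), rng.normal(size=(4, 1))
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(1000):
    # Forward pass
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    err = out - y                       # error of the forward pass
    # Backward pass: propagate the error and update the weights
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d_out
    W1 -= 0.5 * X.T @ d_h
```
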
Bayesian Inference |
- |
It allows an algorithm to make predictions based on prior beliefs. In Bayesian inference, a prior distribution over the predictors is updated with observed data (the evidence) to yield a posterior distribution [36].
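
A small worked example of this prior-to-posterior update, using a Beta prior and binomial evidence (the conjugate pair is chosen here purely for illustration and is not specified in the cited reference):

```python
from scipy import stats

# Prior belief about a success probability p: Beta(2, 2), centered on 0.5.
a, b = 2, 2

# New evidence: 7 successes observed in 10 trials.
successes, trials = 7, 10

# Conjugate update: the posterior over p is Beta(a + successes, b + failures).
posterior = stats.beta(a + successes, b + (trials - successes))
print(posterior.mean())   # posterior mean of p, roughly 0.64
```
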
Causal Associational Network |
CASNET |
This model consists of three main components: observations of a patient, pathophysiological states, and disease classifications. As observations are recorded, they are associated with the appropriate states [37]. |
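
A toy sketch of that three-layer structure is shown below; the observation, state, and disease names are invented for illustration and are not drawn from the original CASNET knowledge base.

```python
# Hypothetical mapping from observations to pathophysiological states,
# and from states to disease classifications.
observation_to_states = {
    "elevated_intraocular_pressure": ["impaired_aqueous_outflow"],
    "visual_field_loss": ["optic_nerve_damage"],
}
state_to_diseases = {
    "impaired_aqueous_outflow": ["open_angle_glaucoma"],
    "optic_nerve_damage": ["open_angle_glaucoma"],
}

def classify(observations):
    # As observations are recorded, associate them with states,
    # then collect the disease classifications those states support.
    states = {s for o in observations for s in observation_to_states.get(o, [])}
    return {d for s in states for d in state_to_diseases.get(s, [])}

print(classify(["elevated_intraocular_pressure", "visual_field_loss"]))
```
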
Convolutional Neural Network |
CNN |
A network architecture for deep learning that learns directly from data. CNNs are particularly useful for finding patterns in images to recognize objects, classes, and categories. They can also be quite effective for classifying audio, time-series, and signal data [38]. |
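
A minimal sketch of such an architecture in PyTorch, assuming a toy 10-class image task; the layer sizes and input shape are arbitrary choices for illustration.

```python
import torch
from torch import nn

# A small image classifier: convolutional layers learn spatial patterns
# directly from pixel data, followed by a fully connected classifier.
cnn = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 7 * 7, 10),   # 10 output classes
)

x = torch.randn(4, 1, 28, 28)    # batch of 4 grayscale 28x28 images
print(cnn(x).shape)              # torch.Size([4, 10])
```
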
Deep Neural Network |
DNN |
An ANN with multiple layers between the input and output layers. There are different types of neural networks, but they always consist of the same components: neurons, synapses, weights, biases, and functions [39].
Light Gradient Boosting Machine |
LightGBM |
LightGBM is a gradient-boosting ensemble method based on decision trees. As with other decision-tree-based methods, LightGBM can be used for both classification and regression. LightGBM is optimized for high performance on distributed systems [40].
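
A minimal usage sketch with LightGBM's scikit-learn interface; the synthetic data and hyperparameters are placeholders rather than recommended settings.

```python
import lightgbm as lgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Gradient-boosted decision trees for classification.
clf = lgb.LGBMClassifier(n_estimators=100)
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))
```
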
Multilayer Perceptron |
MLP |
A feedforward artificial neural network that generates a set of outputs from a set of inputs. An MLP is characterized by several layers of nodes connected as a directed graph between the input and output layers, and it is trained using backpropagation [41].
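
A brief sketch using scikit-learn's MLPClassifier, which fits a feedforward network of fully connected layers by backpropagation; the hidden-layer sizes here are arbitrary.

```python
from sklearn.datasets import load_digits
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)

# Two hidden layers between the input and output layers;
# the weights are fitted by backpropagation.
mlp = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300, random_state=0)
mlp.fit(X, y)
print(mlp.score(X, y))
```
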
Natural Language Processing |
NLP |
It enables machines to understand human language. Its goal is to build systems that can make sense of text and automatically perform tasks such as translation, spell checking, or topic classification [18].
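
As a small example of one such task (topic classification), the following sketch fits a bag-of-words classifier on a handful of toy sentences; the texts and labels are invented for illustration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy topic classification: texts are vectorized and a classifier
# learns to assign each one a label.
texts = ["the team won the match", "stocks fell sharply today",
         "a thrilling overtime goal", "markets rallied after the report"]
labels = ["sports", "finance", "sports", "finance"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)
print(clf.predict(["the striker scored twice"]))
```
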
Optimal Channel Networks |
OCNet |
Oriented spanning trees that reproduce all scaling features characteristic of real, natural river networks. As such, they can be used in a variety of numerical and laboratory experiments in the fields of hydrology, ecology, and epidemiology [42]. |
Probabilistic Neural Network |
PNN |
A feedforward neural network used to handle classification and pattern recognition problems [43]. |
Random Forest Models
-
An ensemble learning method for classification, regression, and other tasks that operates by constructing a multitude of decision trees at training time [44].
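
A minimal scikit-learn sketch of this idea; the dataset and number of trees are chosen only for illustration.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)

# An ensemble of decision trees built on bootstrap samples of the data;
# predictions are aggregated across the trees.
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(X, y)
print(rf.score(X, y))
```
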
Recurrent Neural Network |
RNN |
An ANN where connections between nodes can create a cycle, allowing output from some nodes to affect subsequent input to the same nodes. This allows it to exhibit temporal dynamic behavior [45]. |
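
A short PyTorch sketch of a recurrent layer, in which the hidden state produced at each time step is fed back in at the next step; the sizes are arbitrary.

```python
import torch
from torch import nn

# A recurrent layer: feedback connections give the network
# memory over the time dimension of a sequence.
rnn = nn.RNN(input_size=8, hidden_size=16, batch_first=True)

x = torch.randn(4, 10, 8)        # batch of 4 sequences, 10 time steps each
out, h_n = rnn(x)
print(out.shape, h_n.shape)      # (4, 10, 16) and (1, 4, 16)
```
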
Region-based Convolutional Neural Network |
R-CNN |
The key concept behind the R-CNN series is region proposals. Region proposals are used to localize objects within an image [46,47]. |
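
As an illustration, torchvision ships a later member of this family, Faster R-CNN, in which a region proposal network suggests candidate boxes that are then classified; the sketch below runs it with untrained weights purely to show the interface.

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

# Faster R-CNN: region proposals are generated internally and the
# proposed regions are classified into object categories.
model = fasterrcnn_resnet50_fpn()    # untrained weights by default
model.eval()

images = [torch.rand(3, 224, 224)]   # one random RGB image
with torch.no_grad():
    predictions = model(images)
print(predictions[0].keys())         # boxes, labels, scores
```
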
Support Vector Machine |
SVM |
A supervised machine learning algorithm that performs classification or regression on groups of data. In supervised learning, the system is provided with both input data and the desired outputs, which are labeled for classification [48].
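
A brief supervised-learning sketch with scikit-learn's SVC; the dataset and kernel choice are illustrative.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Supervised learning: labeled inputs and outputs are provided for training.
svm = SVC(kernel="rbf", C=1.0)
svm.fit(X_tr, y_tr)
print(svm.score(X_te, y_te))
```
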
Extreme Gradient Boosting |
XGBoost |
XGBoost, which stands for extreme gradient boosting, is a scalable, distributed gradient-boosted decision tree (GBDT) machine learning library. It provides parallel tree boosting and is a leading machine learning library for regression, classification, and ranking problems [49].
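
A minimal sketch with XGBoost's scikit-learn wrapper on synthetic regression data; the hyperparameters are placeholders rather than tuned values.

```python
import xgboost as xgb
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Gradient-boosted decision trees for regression.
model = xgb.XGBRegressor(n_estimators=200, max_depth=4)
model.fit(X_tr, y_tr)
print(model.score(X_te, y_te))
```
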