Multilayer perceptron (MLP)
A multilayer system whose basic unit is the perceptron. Data flows forward from the input layer through one or more hidden layers to the output layer, and the network is trained with the backpropagation learning algorithm.
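The forward pass described above can be sketched in plain Python. This is a minimal illustration, not a production implementation: the layer sizes, sigmoid activation, and random weights are assumptions chosen for clarity.

```python
import math
import random

def mlp_forward(x, w1, b1, w2, b2):
    # hidden layer: weighted sum of inputs followed by a sigmoid activation
    h = [1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(row, x)) + b)))
         for row, b in zip(w1, b1)]
    # output layer: weighted sum of hidden activations, sigmoid again
    return [1 / (1 + math.exp(-(sum(wi * hi for wi, hi in zip(row, h)) + b)))
            for row, b in zip(w2, b2)]

random.seed(0)
# hypothetical shapes: 3 inputs -> 4 hidden units -> 2 outputs
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(4)]
b1 = [0.0] * 4
w2 = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(2)]
b2 = [0.0] * 2

y = mlp_forward([0.5, -0.2, 0.1], w1, b1, w2, b2)
print(y)  # two output activations, each between 0 and 1
```

In practice the weights would be adjusted by backpropagation rather than left random; the sketch only shows the forward-directed data flow.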
Deep Learning |
Learning organized into multiple layers of representation, each corresponding to a different level of abstraction.
Convolutional Neural Networks (CNNs) |
An extension of the MLP applied to two-dimensional matrices through the convolution operation. Each CNN neuron is connected to a limited number of inputs within a confined, contiguous region; CNNs are widely used for computer vision tasks.
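The "confined contiguous region" idea is the convolution itself, sketched here in plain Python. The image and kernel values are illustrative assumptions; real CNNs learn the kernel weights.

```python
def conv2d(image, kernel):
    # slide the kernel over the image; each output value depends only on
    # a small local neighbourhood, the CNN's confined input region
    kh, kw = len(kernel), len(kernel[0])
    oh = len(image) - kh + 1
    ow = len(image[0]) - kw + 1
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(ow)]
            for i in range(oh)]

# a 4x4 toy image with a bright top-left block
image = [[1, 1, 0, 0],
         [1, 1, 0, 0],
         [0, 0, 0, 0],
         [0, 0, 0, 0]]
# a vertical-edge-style 3x3 kernel
kernel = [[1, 0, -1],
          [1, 0, -1],
          [1, 0, -1]]

out = conv2d(image, kernel)
print(out)  # -> [[2, 2], [1, 1]]
```

Because the kernel is reused at every position, the layer has far fewer parameters than a fully connected MLP over the same image.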
Recurrent neural networks (RNNs) |
Networks for sequential or time-series data that form loops between layers, storing information for a short time. They are commonly used for speech recognition.
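The loop over time steps can be sketched as a single recurrent update. The weight values and dimensions here are arbitrary assumptions for illustration; the point is that each hidden state depends on both the current input and the previous state.

```python
import math

def rnn_step(x_t, h_prev, w_x, w_h, b):
    # the new hidden state mixes the current input with the previous state,
    # which is how the network keeps a short-term memory of the sequence
    return [math.tanh(sum(wx * xi for wx, xi in zip(w_x[i], x_t))
                      + sum(wh * hj for wh, hj in zip(w_h[i], h_prev))
                      + b[i])
            for i in range(len(b))]

# hypothetical weights: 1-dimensional input, 2-dimensional hidden state
w_x = [[0.5], [-0.3]]
w_h = [[0.1, 0.2], [0.0, 0.4]]
b = [0.0, 0.0]

h = [0.0, 0.0]
for x_t in ([1.0], [0.5], [-1.0]):  # a short input sequence
    h = rnn_step(x_t, h, w_x, w_h, b)
print(h)  # final hidden state summarises the whole sequence
```

Because the same weights are reused at every step, gradients can shrink over long sequences, which is the limitation the LSTM below addresses.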
Deep belief networks (DBNs) |
Networks with links between layers but none between units within the same layer.
Long short-term memory (LSTM) |
An advanced variant of the RNN that stores information for a more extended period, capturing so-called "long-term dependencies".
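What lets an LSTM hold information longer is its gated cell state. A minimal scalar sketch of one LSTM step follows; the gate weights are made-up constants, and real implementations work on vectors with learned parameters.

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def lstm_step(x, h, c, W):
    # W maps each gate name to (input weight, recurrent weight, bias)
    f = sigmoid(W['f'][0] * x + W['f'][1] * h + W['f'][2])   # forget gate
    i = sigmoid(W['i'][0] * x + W['i'][1] * h + W['i'][2])   # input gate
    g = math.tanh(W['g'][0] * x + W['g'][1] * h + W['g'][2])  # candidate value
    o = sigmoid(W['o'][0] * x + W['o'][1] * h + W['o'][2])   # output gate
    c_new = f * c + i * g        # cell state: the long-term memory carrier
    h_new = o * math.tanh(c_new)  # hidden state exposed to the next layer
    return h_new, c_new

# hypothetical gate parameters for a single-unit cell
W = {'f': (0.5, 0.1, 1.0), 'i': (0.6, -0.2, 0.0),
     'g': (1.0, 0.3, 0.0), 'o': (0.4, 0.2, 0.0)}

h, c = 0.0, 0.0
for x in (1.0, 0.0, -0.5):  # a short input sequence
    h, c = lstm_step(x, h, c, W)
print(h, c)
```

The forget gate lets the cell state `c` pass through many steps largely unchanged, which is what preserves long-term dependencies where a plain RNN's memory fades.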
Mixed Networks |
Hybrid networks built by combining two or more specific ANN types (e.g., CNNs and RNNs).