Table 3.
| SMNN | Hidden layer | Neuron type | Dropout | Hidden layer | Neuron type | Dropout | Hidden layer | Neuron type | Output layer | Neuron type |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 50 | ReLU | | | | | | | 1 | Sigmoid |
| 2 | 100 | ReLU | | | | | | | 1 | Sigmoid |
| 3 | 200 | ReLU | | | | | | | 1 | Sigmoid |
| 4 | 50 | ReLU | 20% | 50 | PReLU | | | | 1 | Sigmoid |
| 5 | 100 | ReLU | 20% | 100 | PReLU | | | | 1 | Sigmoid |
| 6 | 200 | ReLU | 20% | 200 | PReLU | | | | 1 | Sigmoid |
| 7 | 50 | ReLU | 20% | 50 | PReLU | 20% | 50 | PReLU | 1 | Sigmoid |
| 8 | 100 | ReLU | 20% | 100 | PReLU | 20% | 100 | PReLU | 1 | Sigmoid |
| 9 | 200 | ReLU | 20% | 200 | PReLU | 20% | 200 | PReLU | 1 | Sigmoid |
Numbers correspond to the number of neurons in each layer; for example, SMNN 1 consists of one hidden linear layer with 50 ReLU neurons and a linear output layer with one sigmoid neuron. denotes the dimension of the input to the first hidden layer.
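As a minimal sketch of how one of these architectures fits together, the forward pass of SMNN 7 (50 ReLU → 20% dropout → 50 PReLU → 20% dropout → 50 PReLU → 1 sigmoid) could look like the following NumPy code. The input dimension `d = 10`, the random weight initialization, and the fixed PReLU slope are illustrative assumptions, not values from the table:

```python
import numpy as np

rng = np.random.default_rng(0)

def linear(n_in, n_out):
    # He-style random initialization; weights are illustrative, not trained
    return rng.normal(0.0, np.sqrt(2.0 / n_in), (n_in, n_out)), np.zeros(n_out)

def relu(x):
    return np.maximum(0.0, x)

def prelu(x, a=0.25):
    # PReLU with a fixed negative slope a (learned per neuron in practice)
    return np.where(x > 0, x, a * x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def dropout(h, p=0.2, train=False):
    # inverted dropout: active only during training, identity at inference
    if not train:
        return h
    mask = rng.random(h.shape) > p
    return h * mask / (1.0 - p)

d = 10                   # assumed input dimension (not specified in the table)
W1, b1 = linear(d, 50)   # hidden layer 1: 50 ReLU neurons
W2, b2 = linear(50, 50)  # hidden layer 2: 50 PReLU neurons
W3, b3 = linear(50, 50)  # hidden layer 3: 50 PReLU neurons
W4, b4 = linear(50, 1)   # output layer: 1 sigmoid neuron

def smnn7_forward(x, train=False):
    h = relu(x @ W1 + b1)
    h = dropout(h, 0.2, train)   # 20% dropout after hidden layer 1
    h = prelu(h @ W2 + b2)
    h = dropout(h, 0.2, train)   # 20% dropout after hidden layer 2
    h = prelu(h @ W3 + b3)
    return sigmoid(h @ W4 + b4)  # one value in (0, 1) per input sample
```

The other rows differ only in the number of hidden layers (one, two, or three) and their widths (50, 100, or 200), so the same building blocks cover all nine variants.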