Sci Rep. 2023 Jul 15;13:11463. doi: 10.1038/s41598-023-38706-5

Table 5. The neural networks' hyperparameters.

NNs | Hyperparameters
PRNN

hidden layer sizes = 10, training function = 'trainscg', performance function = 'crossentropy', number of epochs = 100 (an approximate Keras sketch is given below the table)

FNN

Input layer (10 neurons, activation = 'tanh'), 2 hidden layers (2nd layer: 8 neurons, activation = 'tanh'; 3rd layer: 6 neurons, activation = 'tanh'), output layer (1 neuron, activation = 'sigmoid');

optimizer = 'adam', learning rate = 0.001, loss = 'binary_crossentropy', metrics = ['accuracy'], number of epochs = 100 (a Keras sketch is given below the table)

1D-CNN

2 Conv1D layers, 1 MaxPool1D layer and 2 Dense layers, as follows:

Conv1D layer (filters = 64, kernel_size = 2, activation = 'relu'), Dropout layer (dropout rate = 0.2), Conv1D layer (filters = 32, kernel_size = 1, activation = 'relu'), MaxPool1D layer (pool_size = 1), Flatten layer, Dense layer (32 neurons, activation = 'relu'), Dense layer (1 neuron, activation = 'sigmoid'); loss = 'binary_crossentropy', optimizer = 'adam', learning rate = 0.001, metrics = ['accuracy']
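
The PRNN settings ('trainscg', 'crossentropy', a single hidden layer of size 10) describe a MATLAB-style pattern-recognition network and do not map one-to-one onto Keras. Purely as an illustration, a rough Keras analogue might look as follows; Adam replaces scaled conjugate gradient (which Keras does not provide), and the activations, input dimension, and single sigmoid output are assumptions rather than values from the table.

```python
# Rough Keras analogue of the PRNN row, for illustration only.
# Adam stands in for 'trainscg'; tanh/sigmoid and n_features are assumptions.
from tensorflow import keras
from tensorflow.keras import layers

n_features = 10  # assumed input dimension

prnn = keras.Sequential([
    keras.Input(shape=(n_features,)),
    layers.Dense(10, activation='tanh'),    # one hidden layer of 10 neurons
    layers.Dense(1, activation='sigmoid'),  # assumed binary output
])

prnn.compile(optimizer='adam',            # stand-in for scaled conjugate gradient
             loss='binary_crossentropy',  # cross-entropy performance function
             metrics=['accuracy'])
# Training would use epochs=100, matching the table's epoch count.
```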
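
The FNN row maps directly onto the Keras Sequential API. The sketch below uses only the layer sizes, activations, optimizer, learning rate, loss, metric, and epoch count listed in the table; the input feature dimension n_features and the toy arrays used to illustrate the fit call are assumptions, not values from the paper.

```python
# Minimal Keras sketch of the FNN row of Table 5.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_features = 10  # assumed input dimension; the paper's feature count may differ

fnn = keras.Sequential([
    keras.Input(shape=(n_features,)),
    layers.Dense(10, activation='tanh'),    # input layer: 10 neurons, tanh
    layers.Dense(8, activation='tanh'),     # 2nd layer (hidden): 8 neurons, tanh
    layers.Dense(6, activation='tanh'),     # 3rd layer (hidden): 6 neurons, tanh
    layers.Dense(1, activation='sigmoid'),  # output layer: 1 neuron, sigmoid
])

fnn.compile(optimizer=keras.optimizers.Adam(learning_rate=0.001),
            loss='binary_crossentropy',
            metrics=['accuracy'])

# Toy arrays only to illustrate the fit call with the table's epoch count.
X = np.random.rand(64, n_features).astype('float32')
y = np.random.randint(0, 2, size=(64, 1)).astype('float32')
fnn.fit(X, y, epochs=100, verbose=0)
```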
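
The 1D-CNN row can likewise be assembled as a Keras Sequential model. The filters, kernel sizes, dropout rate, pooling size, dense-layer sizes, optimizer, learning rate, loss, and metric come from the table; the input shape (timesteps x channels) is an assumption, since the table does not state how the features are reshaped for Conv1D, and no epoch count is listed for this model.

```python
# Minimal Keras sketch of the 1D-CNN row of Table 5.
from tensorflow import keras
from tensorflow.keras import layers

n_steps, n_channels = 10, 1  # assumed Conv1D input shape (timesteps, channels)

cnn = keras.Sequential([
    keras.Input(shape=(n_steps, n_channels)),
    layers.Conv1D(filters=64, kernel_size=2, activation='relu'),
    layers.Dropout(0.2),                        # dropout rate = 0.2
    layers.Conv1D(filters=32, kernel_size=1, activation='relu'),
    layers.MaxPool1D(pool_size=1),
    layers.Flatten(),
    layers.Dense(32, activation='relu'),
    layers.Dense(1, activation='sigmoid'),      # binary output
])

cnn.compile(optimizer=keras.optimizers.Adam(learning_rate=0.001),
            loss='binary_crossentropy',
            metrics=['accuracy'])

cnn.summary()
```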