BMC Bioinformatics. 2022 Oct 6;23:413. doi: 10.1186/s12859-022-04971-w

Table 2.

EnsembleSplice's CNN and DNN model architectures

Neural network   Layer type
CNN 1            Conv1D(72, 5)
                 Conv1D(144, 7)
                 Conv1D(168, 7)
                 Flatten()
                 Dropout(0.20)
                 Dense(2, "sigmoid")
CNN 2            Conv1D(136, 3)
                 Conv1D(72, 4)
                 MaxPooling1D(7)
                 Conv1D(272, 7)
                 MaxPooling1D(3)
                 Flatten()
                 Dropout(0.35)
                 Dense(2, "sigmoid")
CNN 3            Conv1D(208, 9)
                 MaxPooling1D(6)
                 Conv1D(120, 5)
                 MaxPooling1D(3)
                 Flatten()
                 Dropout(0.20)
                 Dense(2, "sigmoid")
CNN 4            Conv1D(250, 5)
                 Conv1D(250, 5)
                 Conv1D(250, 5)
                 MaxPooling1D(3)
                 Flatten()
                 Dropout(0.20)
                 Dense(2, "sigmoid")
DNN 1            Flatten()
                 Dense(704)
                 Dense(224)
                 Dropout(0.10)
                 Dense(512)
                 Dropout(0.15)
                 Dense(2, "sigmoid")
DNN 2            Flatten()
                 Dense(704)
                 Dense(224)
                 Dense(128)
                 Dropout(0.15)
                 Dense(2, "sigmoid")
DNN 3            Flatten()
                 Dense(256)
                 Dense(352)
                 Dense(32)
                 Dense(352)
                 Dropout(0.15)
                 Dense(2, "sigmoid")
DNN 4            Flatten()
                 Dense(250)
                 Dense(250)
                 Dense(250)
                 Dropout(0.25)
                 Dense(2, "sigmoid")

For convolutional (Conv1D) layers, the first and second parameters are the number of filters and the kernel size, respectively; all convolutional layers share the same activation function (ReLU) and padding. For max-pooling layers, the parameter is the pool size. For dense layers in the deep neural networks (DNNs), the parameter is the number of nodes, with ReLU activation (the final two-node layers use sigmoid, as shown). DNN 4 uses a random-normal kernel initializer.
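To make the table concrete, the shapes flowing through one of these networks can be traced by hand. The sketch below follows CNN 2, assuming Keras-style defaults: "same" padding for convolutions (so sequence length is preserved), max-pooling stride equal to the pool size, and a hypothetical input of 400 one-hot encoded nucleotides (400 × 4) — the paper's actual window length and padding mode may differ.

```python
# Trace tensor shapes through CNN 2 from the table.
# Assumptions (not stated in the table): input of 400 one-hot encoded
# nucleotides (400 x 4), "same" conv padding, pooling stride = pool size.

def conv1d_same(length, channels, filters, kernel):
    # "same" padding preserves sequence length; channel count becomes
    # the number of filters (kernel size does not affect output length)
    return length, filters

def maxpool1d(length, channels, pool):
    # with stride = pool size and "valid" padding, the output length is
    # floor(length / pool); channels are unchanged
    return length // pool, channels

shape = (400, 4)
shape = conv1d_same(*shape, 136, 3)   # Conv1D(136, 3)  -> (400, 136)
shape = conv1d_same(*shape, 72, 4)    # Conv1D(72, 4)   -> (400, 72)
shape = maxpool1d(*shape, 7)          # MaxPooling1D(7) -> (57, 72)
shape = conv1d_same(*shape, 272, 7)   # Conv1D(272, 7)  -> (57, 272)
shape = maxpool1d(*shape, 3)          # MaxPooling1D(3) -> (19, 272)
flat = shape[0] * shape[1]            # Flatten()       -> 5168 features
print(shape, flat)                    # (19, 272) 5168
```

The flattened 5168-feature vector then passes through Dropout(0.35) into the two-node sigmoid output layer, matching the table's final rows for CNN 2.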