PLoS ONE. 2020 May 6;15(5):e0232683. doi: 10.1371/journal.pone.0232683

Table 1. Diffusivity equation—forward modeling: PINN performance as function of training set size.

Relative L2 errors between the exact and predicted values of p on the validation set, shown as a function of the number of initial and boundary training points, Nb, and the number of collocation points, NΠ. The network architecture is fixed at 4 layers with 10 neurons per hidden layer.
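The error metric tabulated below is the standard relative L2 norm of the discrepancy between the predicted and exact solutions. A minimal sketch, assuming the exact and predicted pressures are sampled as NumPy arrays over the validation set (the function name is illustrative, not from the paper's code):

```python
import numpy as np

def relative_l2_error(p_exact, p_pred):
    """Relative L2 error: ||p_pred - p_exact||_2 / ||p_exact||_2."""
    return np.linalg.norm(p_pred - p_exact) / np.linalg.norm(p_exact)

# Toy check: a uniform 1% perturbation yields a relative error of 0.01,
# comparable in magnitude to the mid-range entries in the tables below.
p = np.linspace(1.0, 2.0, 100)
print(relative_l2_error(p, 1.01 * p))
```

Because the metric is normalized by the magnitude of the exact solution, errors are comparable across experiments with different pressure scales.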

Panel A: Average over 1 realization
Nb \ NΠ      2       5       10      20      40      80      160     320     640
2 0.1010 0.1150 0.4060 0.2570 0.1530 0.0108 0.1930 0.0236 0.0216
3 0.0339 0.0665 0.1050 0.0122 0.0020 0.0973 0.0025 0.0314 0.0074
6 0.0248 0.0401 0.1110 0.0807 0.0012 0.0057 0.0014 0.0116 0.0046
12 0.0072 0.1230 0.0126 0.0296 0.0012 0.0119 0.0010 0.0015 0.0076
24 0.0152 0.0018 0.0008 0.0008 0.0006 0.0016 0.0008 0.0006 0.0009
48 0.0124 0.0127 0.0015 0.0037 0.0044 0.0013 0.0035 0.0002 0.0026
96 0.0076 0.0010 0.0023 0.0015 0.0026 0.0017 0.0009 0.0007 0.0010
192 0.0036 0.0006 0.0205 0.0025 0.0010 0.0051 0.0005 0.0008 0.0002
384 0.0020 0.0019 0.0009 0.0004 0.0008 0.0001 0.0005 0.0005 0.0003
Panel B: Average over 3 realizations
Nb \ NΠ      2       5       10      20      40      80      160     320     640
2 0.0972 0.0596 0.0417 0.0859 0.0380 0.0086 0.0041 0.0105 0.0050
3 0.0744 0.0652 0.0969 0.0415 0.0173 0.0268 0.0261 0.0084 0.0077
6 0.1020 0.1160 0.0841 0.0133 0.0071 0.0756 0.0217 0.0078 0.0064
12 0.0853 0.0099 0.0751 0.0088 0.0061 0.0456 0.0036 0.0059 0.0033
24 0.0447 0.0272 0.0104 0.0214 0.0290 0.0015 0.0041 0.0017 0.0060
48 0.0201 0.0037 0.0012 0.0230 0.0040 0.0072 0.0042 0.0060 0.0025
96 0.0104 0.0020 0.0013 0.0024 0.0048 0.0011 0.0009 0.0019 0.0020
192 0.0186 0.0014 0.0022 0.0006 0.0007 0.0011 0.0006 0.0004 0.0008
384 0.0015 0.0011 0.0010 0.0009 0.0005 0.0007 0.0010 0.0005 0.0005
Panel C: Average over 24 realizations
Nb \ NΠ      2       5       10      20      40      80      160     320     640
2 0.1002 0.0965 0.0904 0.0732 0.0849 0.0514 0.0139 0.0194 0.0188
3 0.0879 0.0775 0.0861 0.0513 0.0352 0.0354 0.0260 0.0111 0.0117
6 0.0710 0.0537 0.0529 0.0241 0.0144 0.0265 0.0153 0.0056 0.0107
12 0.0541 0.0351 0.0376 0.0302 0.0104 0.0235 0.0111 0.0135 0.0058
24 0.0320 0.0233 0.0141 0.0311 0.0160 0.0052 0.0061 0.0043 0.0039
48 0.0178 0.0050 0.0162 0.0111 0.0040 0.0024 0.0044 0.0041 0.0026
96 0.0087 0.0037 0.0031 0.0034 0.0022 0.0015 0.0021 0.0010 0.0012
192 0.0064 0.0049 0.0017 0.0010 0.0010 0.0011 0.0012 0.0009 0.0009
384 0.0046 0.0026 0.0011 0.0011 0.0010 0.0007 0.0008 0.0007 0.0006