2021 Aug 6;8(8):e19824. doi: 10.2196/19824

Table 2.

The Akaike information criterion (AIC) results for all models. Each row reports the number of parameters (K), the likelihood, and the AIC. A lower AIC is better.

Model            Number of parameters, K    Likelihood    AIC
MILA-SocNet^a    59,668                     −143.72       −597.05
MIL-SocNet^b     56,296                     −210.22       −464.45
Deep learning    138,502                    −309.97       −260.84
Language         16,695.5                   −420.31       −61.03
LIWC^c           93                         −169.62       575.92
Usr2Vec          100                        −190.28       640.32
Topic            200                        −276.42       1290.66

^a MILA-SocNet: multiple instance learning with anaphoric resolution for social network.

^b MIL-SocNet: multiple instance learning for social network.

^c LIWC: linguistic inquiry and word count.
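For reference, a minimal sketch of how an AIC value is obtained from a fitted model's log-likelihood, assuming the standard definition AIC = 2K − 2·ln(L). The figures below are hypothetical illustrations, not values from the table, whose entries may follow a different convention or scaling.

```python
def aic(num_params: float, log_likelihood: float) -> float:
    """Akaike information criterion: AIC = 2K - 2*ln(L).

    num_params     -- number of free parameters in the model (K)
    log_likelihood -- maximized log-likelihood of the model, ln(L)
    """
    return 2 * num_params - 2 * log_likelihood

# Hypothetical example: a model with 3 parameters and
# log-likelihood -10.5 yields AIC = 2*3 - 2*(-10.5) = 27.0
print(aic(3, -10.5))
```

When comparing models fitted to the same data, only the difference in AIC matters: the model with the lowest AIC is preferred, which is why the rows above are ranked in ascending AIC order.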