
Table 2.

Results obtained in the OAEI-13 conference track compared with OntoPhil (ordered by F-measure).

Matcher        Precision (P)   F-measure (F)   Recall (R)
MaasMatch      0.28            0.37            0.55
CroMatcher     0.52            0.51            0.50
RIMOM2013      0.59            0.54            0.49
XMapGen        0.63            0.54            0.50
StringEquiv    0.80            0.56            0.43
XMapSig        0.76            0.56            0.44
SYNTHESIS      0.77            0.57            0.45
CIDER_CL       0.75            0.58            0.47
OntoK          0.77            0.58            0.47
HotMatch       0.71            0.59            0.51
IAMA           0.78            0.59            0.48
LogMapLite     0.73            0.59            0.50
WikiMatch      0.73            0.59            0.49
edna           0.76            0.60            0.49
HerTUDA        0.74            0.60            0.50
WeSeE-Match    0.85            0.61            0.47
MapSSS         0.83            0.62            0.50
ServOMap       0.73            0.63            0.55
ODGOMS         0.75            0.64            0.56
StringsAuto    0.78            0.64            0.54
OntoPhil       0.86            0.67            0.57
LogMap         0.80            0.68            0.59
AML            0.87            0.69            0.57
YAM++          0.80            0.74            0.69
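
Assuming the F column reports the balanced F-measure (the harmonic mean of precision and recall), its values can be reproduced from the P and R columns up to rounding:

F = \frac{2 \, P \, R}{P + R}, \qquad \text{e.g., for YAM++:} \quad \frac{2 \cdot 0.80 \cdot 0.69}{0.80 + 0.69} \approx 0.74.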