Sensors. 2014 Dec 8;14(12):23581–23619. doi: 10.3390/s141223581

Table 1.

Results obtained in the OAEI-13 benchmark track compared with OntoPhil (ordered by F-measure).

System        Precision  F-measure  Recall
MaPSSS        0.84       0.14       0.08
StringsAuto   0.84       0.14       0.08
RiMOM2013     0.49       0.19       0.12
ServOMap      0.53       0.33       0.22
LogMapLt      0.43       0.46       0.50
MaasMatch     0.66       0.50       0.41
OntoK         0.69       0.51       0.40
LogMap        0.72       0.51       0.39
XMapGen       0.66       0.52       0.44
edna          0.58       0.54       0.50
WeSeE         0.96       0.55       0.39
AML           1.00       0.57       0.40
XMapSig       0.71       0.59       0.50
Synthesis     0.60       0.60       0.60
CIDER-CL      0.84       0.66       0.55
CroMatcher    0.75       0.68       0.63
Hertuda       0.90       0.68       0.54
Hotmatch      0.96       0.68       0.50
WikiMatch     0.99       0.69       0.53
ODGOMS        0.98       0.70       0.54
OntoPhil      1.00       0.71       0.62
IAMA          0.99       0.73       0.57
YAM++         0.84       0.77       0.70
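
Precision, F-measure, and recall here are the standard evaluation measures of the OAEI benchmark track. As a minimal sketch (not part of the original evaluation; the f_measure helper and the spot-checked rows are illustrative choices of ours), the balanced F-measure is assumed to be the harmonic mean F = 2PR/(P + R):

    # Balanced F-measure (F1): harmonic mean of precision and recall.
    # Minimal sketch; helper name and example rows are ours, not the paper's.
    def f_measure(precision: float, recall: float) -> float:
        if precision + recall == 0.0:
            return 0.0
        return 2.0 * precision * recall / (precision + recall)

    # Spot-check a few rows of Table 1: system -> (P, R).
    rows = {
        "Synthesis": (0.60, 0.60),  # reported F = 0.60
        "AML":       (1.00, 0.40),  # reported F = 0.57
        "YAM++":     (0.84, 0.70),  # reported F = 0.77
    }

    for system, (p, r) in rows.items():
        print(f"{system}: F = {f_measure(p, r):.2f}")

Because the OAEI evaluation aggregates precision and recall over many benchmark test cases, an F value recomputed from the rounded aggregates in Table 1 can differ slightly from the published figure (e.g., YAM++ recomputes to 0.76 against a reported 0.77).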