J Cheminform. 2017 May 25;9:32. doi: 10.1186/s13321-017-0219-x

Table 1. Results for the training data of the CASMI 2016 contest

| # | Tools | Approach | Top hits | Top 5 | Top 10 | Top 20 |
|---|-------|----------|----------|-------|--------|--------|
| 1 | MetFrag + CFM-ID + DB + MS/MS | Voting/consensus | 290 | 304 | 305 | 306 |
| 2 | CFM-ID + ID_sorted + MAGMa(+) + DB + MS/MS | Voting/consensus | 289 | 304 | 306 | 308 |
| 3 | MetFrag + ID_sorted + DB + MS/MS | Voting/consensus | 288 | 305 | 306 | 308 |
| 4 | MetFrag + DB + MS/MS | | 288 | 305 | 305 | 307 |
| 5 | MAGMa(+) + ID_sorted + DB + MS/MS | Voting/consensus | 288 | 304 | 307 | 309 |
| 6 | CFM-ID + ID_sorted + MAGMa(+) + MetFrag + DB + MS/MS | Voting/consensus | 288 | 304 | 305 | 308 |
| 7 | MetFrag + CFM-ID + MAGMa(+) + DB + MS/MS | Voting/consensus | 288 | 304 | 305 | 307 |
| 8 | CFM-ID + MAGMa(+) + DB + MS/MS | Voting/consensus | 288 | 303 | 306 | 307 |
| 9 | MetFrag + MAGMa(+) + DB + MS/MS | Voting/consensus | 288 | 303 | 305 | 307 |
| 10 | CFM-ID + ID_sorted + DB + MS/MS | Voting/consensus | 287 | 304 | 306 | 308 |
| 11 | CFM-ID + DB + MS/MS | | 287 | 304 | 304 | 306 |
| 12 | ID_sorted + DB + MS/MS | | 286 | 306 | 306 | 308 |
| 13 | MetFrag + MS-FINDER + DB + MS/MS | Voting/consensus | 286 | 302 | 305 | 307 |
| 14 | MS-FINDER + CFM-ID + DB + MS/MS | Voting/consensus | 286 | 301 | 304 | 305 |
| 15 | MAGMa(+) + DB + MS/MS | | 286 | 301 | 302 | 303 |
| 16 | MetFrag + MS-FINDER + CFM-ID + DB + MS/MS | Voting/consensus | 285 | 303 | 305 | 307 |
| 17 | MS-FINDER + ID_sorted + DB + MS/MS | Voting/consensus | 285 | 302 | 306 | 307 |
| 18 | MetFrag + MS-FINDER + CFM-ID + MAGMa(+) + DB + MS/MS | Voting/consensus | 285 | 302 | 305 | 307 |
| 19 | MS-FINDER + DB + MS/MS | | 285 | 300 | 302 | 303 |
| 20 | CFM-ID + ID_sorted + MAGMa(+) + MetFrag + MS-FINDER + DB + MS/MS | Voting/consensus | 284 | 303 | 306 | 307 |
| 21 | MetFrag + MS-FINDER + MAGMa(+) + DB + MS/MS | Voting/consensus | 284 | 302 | 306 | 306 |
| 22 | MS-FINDER + MAGMa(+) + DB + MS/MS | Voting/consensus | 284 | 301 | 305 | 306 |
| 23 | MS-FINDER + CFM-ID + MAGMa(+) + DB + MS/MS | Voting/consensus | 283 | 302 | 305 | 305 |
| 24 | MetFrag + CFM-ID + DB | Voting/consensus | 243 | 291 | 296 | 304 |
| 25 | MetFrag + MS-FINDER + CFM-ID + DB | Voting/consensus | 242 | 289 | 298 | 301 |
| 26 | MetFrag + CFM-ID + MAGMa(+) + DB | Voting/consensus | 240 | 290 | 297 | 304 |
| 27 | MS-FINDER + DB | | 239 | 284 | 294 | 296 |
| 28 | MetFrag + DB | | 238 | 290 | 296 | 301 |
| 29 | MS-FINDER + CFM-ID + DB | Voting/consensus | 238 | 287 | 297 | 298 |
| 30 | MS-FINDER + CFM-ID + MAGMa(+) + DB | Voting/consensus | 237 | 288 | 298 | 300 |
| 31 | CFM-ID + MAGMa(+) + DB | Voting/consensus | 236 | 289 | 298 | 303 |
| 32 | MetFrag + MS-FINDER + DB | Voting/consensus | 236 | 289 | 297 | 300 |
| 33 | MetFrag + MS-FINDER + MAGMa(+) + DB | Voting/consensus | 236 | 288 | 298 | 300 |
| 34 | MAGMa(+) + DB | | 236 | 287 | 294 | 299 |
| 35 | CFM-ID + DB | | 236 | 286 | 295 | 302 |
| 36 | MetFrag + MAGMa(+) + DB | Voting/consensus | 235 | 290 | 298 | 301 |
| 37 | MS-FINDER + MAGMa(+) + DB | Voting/consensus | 235 | 288 | 298 | 299 |
| 38 | ID_sorted + DB | | 227 | 291 | 301 | 303 |
| 39 | Randomize + DB + MS/MS | | 195 | 273 | 289 | 305 |
| 40 | Randomize + DB | | 193 | 268 | 283 | 298 |
| 41 | ID_sorted | | 143 | 249 | 267 | 270 |
| 42 | MetFrag + CFM-ID | In silico voting/consensus | 69 | 155 | 194 | 230 |
| 43 | MetFrag + CFM-ID + MAGMa(+) | In silico voting/consensus | 62 | 154 | 187 | 228 |
| 44 | MetFrag + MS-FINDER + CFM-ID + MAGMa(+) | In silico voting/consensus | 62 | 145 | 180 | 228 |
| 45 | MetFrag + MS-FINDER + CFM-ID | In silico voting/consensus | 58 | 145 | 179 | 221 |
| 46 | MS-FINDER + CFM-ID + MAGMa(+) | In silico voting/consensus | 58 | 133 | 170 | 213 |
| 47 | CFM-ID + MAGMa(+) | In silico voting/consensus | 55 | 134 | 179 | 221 |
| 48 | MetFrag | In silico only | 52 | 134 | 171 | 210 |
| 49 | MetFrag + MAGMa(+) | In silico voting/consensus | 52 | 133 | 171 | 210 |
| 50 | MAGMa(+) | In silico only | 50 | 121 | 151 | 189 |
| 51 | MS-FINDER + CFM-ID | In silico voting/consensus | 50 | 111 | 141 | 188 |
| 52 | MetFrag + MS-FINDER + MAGMa(+) | In silico voting/consensus | 49 | 128 | 153 | 210 |
| 53 | CFM-ID | In silico only | 48 | 124 | 170 | 209 |
| 54 | MS-FINDER + MAGMa(+) | In silico voting/consensus | 44 | 105 | 135 | 183 |
| 55 | MetFrag + MS-FINDER | In silico voting/consensus | 43 | 120 | 143 | 178 |
| 56 | MS-FINDER | In silico only | 32 | 86 | 117 | 145 |
| 57 | Randomize | | 4 | 13 | 27 | 46 |

‘MetFrag’ (MetFragCL), ‘CFM-ID’, ‘MAGMa(+)’ and ‘MS-FINDER’ designate results obtained with the respective in silico fragmentation software tools. ‘DB’ designates priority ranking by presence in chemical and biochemical databases. ‘MS/MS’ designates presence in MS/MS libraries, based on a dot-product similarity score above 400. The 312 MS/MS spectra of the CASMI 2016 training data were used.
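
For reference, the ‘MS/MS’ library-match criterion (dot-product similarity above 400) can be illustrated with a minimal sketch of a dot-product (cosine) score between two spectra, scaled to 0–1000. This is a generic illustration, not the scoring code used in the study; the bin width, the absence of intensity weighting, and the toy m/z values are assumptions.

```python
import math

def dot_product_similarity(spec_a, spec_b, bin_width=0.01):
    """Dot-product (cosine) similarity of two MS/MS spectra, scaled to 0-1000.

    spec_a / spec_b: iterables of (m/z, intensity) pairs. Peaks are pooled
    into fixed-width m/z bins; real implementations usually match peaks
    within an m/z tolerance and may weight intensities, so treat this as a
    simplified sketch of the scoring idea only.
    """
    def binned(spec):
        vec = {}
        for mz, intensity in spec:
            key = round(mz / bin_width)          # index of the m/z bin
            vec[key] = vec.get(key, 0.0) + intensity
        return vec

    a, b = binned(spec_a), binned(spec_b)
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return 1000.0 * dot / norm if norm else 0.0

# Toy query spectrum vs. a hypothetical library spectrum (values made up)
query = [(91.05, 100.0), (119.05, 35.0), (147.04, 80.0)]
library = [(91.05, 95.0), (119.05, 40.0), (147.04, 75.0), (65.04, 10.0)]
score = dot_product_similarity(query, library)
print(score > 400)  # True -> would count as an 'MS/MS' library hit under the >400 rule
```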
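The ‘Voting/consensus’ rows combine the candidate rankings of several tools, and the ‘Top hits’/‘Top 5’/‘Top 10’/‘Top 20’ columns count how many of the 312 training challenges place the correct structure within each cut-off. The sketch below shows one simple way to do such a combination (average rank across tools) followed by the top-k counting; the paper's actual voting scheme, tie handling, and penalties for missing candidates may differ, and all identifiers here are illustrative.

```python
from statistics import mean

def consensus_rank(per_tool_rankings):
    """Combine candidate rankings from several tools by average rank.

    per_tool_rankings: list of lists, each an ordered list of candidate IDs
    (best first) produced by one tool for the same challenge. A candidate
    missing from a tool's list is penalised with a rank one past its end.
    """
    candidates = set()
    for ranking in per_tool_rankings:
        candidates.update(ranking)

    def avg_rank(cand):
        return mean(ranking.index(cand) + 1 if cand in ranking
                    else len(ranking) + 1
                    for ranking in per_tool_rankings)

    return sorted(candidates, key=avg_rank)

def top_k_counts(challenges, ks=(1, 5, 10, 20)):
    """Count challenges whose correct candidate lands within each top-k cut-off.

    challenges: list of (ranking, correct_candidate_id) pairs.
    Returns a dict keyed by k, analogous to the Top hits / Top 5 / Top 10 /
    Top 20 columns of Table 1.
    """
    counts = {k: 0 for k in ks}
    for ranking, correct in challenges:
        if correct in ranking:
            pos = ranking.index(correct) + 1
            for k in ks:
                if pos <= k:
                    counts[k] += 1
    return counts

# Toy example: two tools, one challenge, correct structure is 'cand1'
tool_a = ["cand1", "cand3", "cand2"]
tool_b = ["cand1", "cand2", "cand4"]
consensus = consensus_rank([tool_a, tool_b])
print(consensus)                             # ['cand1', 'cand2', 'cand3', 'cand4']
print(top_k_counts([(consensus, "cand1")]))  # {1: 1, 5: 1, 10: 1, 20: 1}
```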