Nucleic Acids Res. 2009 Jun 3;37(Web Server issue):W312–W316. doi: 10.1093/nar/gkp479

Table 1.

Comparison of the top-performing gene finding systems in the ab initio setting of the nGASP challenge (10)

Method             Nucleotide          Exon                Transcript          Gene
                   Sn    Sp    Avg    Sn    Sp    Avg     Sn    Sp    Avg     Sn    Sp    Avg
mGene.init         96.8  90.9  93.8   85.1  80.2  82.6    49.6  42.3  45.9    60.7  42.3  51.5
mGene.init (dev)   96.9  91.6  94.2   84.2  78.6  81.4    44.3  38.7  41.5    54.3  40.1  47.2
Craig              95.5  90.9  93.2   80.3  78.2  79.2    35.7  35.4  35.6    43.7  35.4  39.6
Fgenesh            98.2  87.1  92.7   86.4  73.6  80.0    47.1  34.1  40.6    57.7  34.1  45.9
Augustus           97.0  89.0  93.0   86.1  72.6  79.3    52.9  28.6  40.8    64.4  34.5  49.4

Shown are sensitivity (Sn), specificity (Sp) and their average (Avg), each in percent, at the nucleotide, exon, transcript and gene levels; where several submissions were made for one method, we chose the version with the best gene-level average of sensitivity and specificity. The mGene.init predictions were prepared after the deadline, but in strict adherence to the rules and conditions of the nGASP challenge. The result of the best-performing method at each evaluation level is set in bold face. The evaluation is based on the participants' submitted sets and was performed with our own evaluation routine; the numbers deviate slightly from the official nGASP evaluation at the transcript and gene levels due to minor differences in the evaluation criteria. These differences, however, do not change the ranking.
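As a minimal sketch of how the per-level statistics in such evaluations are typically computed: gene-finding benchmarks conventionally define sensitivity as Sn = TP/(TP + FN) and "specificity" as Sp = TP/(TP + FP) (i.e. precision), with the reported summary being their arithmetic mean. The counts in the example below are hypothetical, chosen only to illustrate the arithmetic.

```python
def accuracy(tp: int, fn: int, fp: int) -> tuple[float, float, float]:
    """Return (Sn, Sp, average), each in percent.

    Sn = TP / (TP + FN)  -- fraction of true features recovered
    Sp = TP / (TP + FP)  -- fraction of predicted features that are true
    """
    sn = 100.0 * tp / (tp + fn)
    sp = 100.0 * tp / (tp + fp)
    return sn, sp, (sn + sp) / 2.0


# Hypothetical exon-level counts: 851 exons correctly predicted,
# 149 true exons missed, 198 predictions with no true counterpart.
sn, sp, avg = accuracy(tp=851, fn=149, fp=198)
print(round(sn, 1), round(sp, 1), round(avg, 1))  # → 85.1 81.1 83.1
```

The same computation is repeated at each level (nucleotide, exon, transcript, gene), with TP/FN/FP counted in the units of that level.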