Table 1.
AUC performance comparison between iDeep and other methods on the 31 experiments
Protein | iDeep | iONMF | NMF | SNMF | QNO | Oli | iDeep-kmer | DeepBind |
---|---|---|---|---|---|---|---|---|
1 Ago/EIF | 0.90 | 0.89 | 0.89 | 0.85 | 0.87 | 0.61 | 0.87 | 0.69 |
2 Ago2-MNase | 0.73 | 0.71 | 0.69 | 0.66 | 0.69 | 0.51 | 0.67 | 0.53 |
3 Ago2-1 | 0.91 | 0.81 | 0.81 | 0.76 | 0.83 | 0.80 | 0.82 | 0.81 |
4 Ago2-2 | 0.91 | 0.84 | 0.82 | 0.79 | 0.82 | 0.80 | 0.83 | 0.81 |
5 Ago2 | 0.74 | 0.73 | 0.71 | 0.65 | 0.66 | 0.53 | 0.65 | 0.58 |
6 eIF4AIII-1 | 0.94 | 0.92 | 0.91 | 0.78 | 0.95 | 0.92 | 0.95 | 0.93 |
7 eIF4AIII-2 | 0.97 | 0.93 | 0.93 | 0.67 | 0.64 | 0.93 | 0.94 | 0.93 |
8 ELAVL1-1 | 0.96 | 0.91 | 0.89 | 0.71 | 0.80 | 0.89 | 0.95 | 0.90 |
9 ELAVL1-MNase | 0.68 | 0.71 | 0.70 | 0.68 | 0.70 | 0.49 | 0.66 | 0.54 |
10 ELAVL1A | 0.94 | 0.94 | 0.93 | 0.91 | 0.92 | 0.84 | 0.95 | 0.87 |
11 ELAVL1-2 | 0.97 | 0.95 | 0.94 | 0.90 | 0.95 | 0.88 | 0.97 | 0.91 |
12 ESWR1 | 0.95 | 0.87 | 0.85 | 0.80 | 0.85 | 0.81 | 0.92 | 0.88 |
13 FUS | 0.92 | 0.81 | 0.73 | 0.55 | 0.65 | 0.85 | 0.87 | 0.92 |
14 Mut FUS | 0.97 | 0.96 | 0.95 | 0.91 | 0.94 | 0.82 | 0.97 | 0.91 |
15 IGF2BP1-3 | 0.95 | 0.93 | 0.92 | 0.89 | 0.91 | 0.57 | 0.93 | 0.68 |
16 hnRNPC-1 | 0.93 | 0.95 | 0.93 | 0.45 | 0.63 | 0.88 | 0.92 | 0.95 |
17 hnRNPC-2 | 0.97 | 0.97 | 0.96 | 0.48 | 0.70 | 0.94 | 0.95 | 0.97 |
18 hnRNPL-1 | 0.82 | 0.74 | 0.73 | 0.70 | 0.77 | 0.39 | 0.79 | 0.76 |
19 hnRNPL-2 | 0.82 | 0.66 | 0.62 | 0.56 | 0.61 | 0.47 | 0.72 | 0.74 |
20 hnRNPL-like | 0.79 | 0.69 | 0.67 | 0.63 | 0.68 | 0.56 | 0.70 | 0.70 |
21 MOV10 | 0.97 | 0.96 | 0.96 | 0.89 | 0.92 | 0.78 | 0.97 | 0.80 |
22 Nsun2 | 0.87 | 0.81 | 0.80 | 0.69 | 0.82 | 0.75 | 0.81 | 0.84 |
23 PUM2 | 0.98 | 0.93 | 0.92 | 0.86 | 0.89 | 0.94 | 0.98 | 0.93 |
24 QKI | 0.95 | 0.84 | 0.77 | 0.52 | 0.62 | 0.92 | 0.92 | 0.95 |
25 SRSF1 | 0.92 | 0.85 | 0.85 | 0.73 | 0.86 | 0.84 | 0.85 | 0.85 |
26 TAF15 | 0.97 | 0.91 | 0.89 | 0.82 | 0.91 | 0.80 | 0.95 | 0.95 |
27 TDP-43 | 0.89 | 0.84 | 0.78 | 0.45 | 0.57 | 0.88 | 0.85 | 0.89 |
28 TIA1 | 0.94 | 0.93 | 0.92 | 0.86 | 0.90 | 0.84 | 0.96 | 0.90 |
29 TIAL1 | 0.92 | 0.87 | 0.86 | 0.73 | 0.85 | 0.83 | 0.90 | 0.87 |
30 U2AF2 | 0.95 | 0.82 | 0.74 | 0.61 | 0.70 | 0.86 | 0.91 | 0.95 |
31 U2AF2(KD) | 0.92 | 0.80 | 0.74 | 0.60 | 0.74 | 0.84 | 0.88 | 0.91 |
Mean | 0.90 ± 0.08 | 0.85 ± 0.08 | 0.83 ± 0.10 | 0.71 ± 0.14 | 0.79 ± 0.12 | 0.77 ± 0.16 | 0.87 ± 0.09 | 0.83 ± 0.12 |
The performances of iONMF, NMF, SNMF and QNO are taken from [5]. DeepBind, Oli and iDeep-kmer were run on the same data as iDeep; iDeep-kmer replaces the CNN sequence and motif modalities of iDeep with k-mer features.
Boldface indicates the best performance among the compared methods.
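Each cell reports the area under the ROC curve (AUC) on the test set of one experiment, and the last row summarizes each method by the mean ± standard deviation over the 31 experiments. The following is a minimal sketch of how such values can be computed; the label and score arrays are toy placeholders (not data from the paper), and only the iDeep column values are copied from the table above.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Per-experiment AUC: y_true holds binary bound/unbound labels of the test
# sites, y_score holds one method's predicted binding scores.
# These arrays are illustrative placeholders, not values from the paper.
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
y_score = np.array([0.92, 0.31, 0.78, 0.64, 0.42, 0.18, 0.85, 0.55])
print("AUC:", roc_auc_score(y_true, y_score))

# Summary row: mean and standard deviation of the 31 per-experiment AUCs
# for one method (here the iDeep column of Table 1).
ideep_aucs = np.array([0.90, 0.73, 0.91, 0.91, 0.74, 0.94, 0.97, 0.96, 0.68,
                       0.94, 0.97, 0.95, 0.92, 0.97, 0.95, 0.93, 0.97, 0.82,
                       0.82, 0.79, 0.97, 0.87, 0.98, 0.95, 0.92, 0.97, 0.89,
                       0.94, 0.92, 0.95, 0.92])
print(f"Mean AUC: {ideep_aucs.mean():.2f} +/- {ideep_aucs.std():.2f}")
```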