Table 1.
Comparison of Dice scores by class range and incremental step, following the setups in Sect. 3.1.2
| Method | 7–2 (2 steps) | | | 7–2–2 (3 steps) | | | | 7–4 (2 steps) | | |
|---|---|---|---|---|---|---|---|---|---|---|
| | 1–7 | 8–9 | 1–9 | 1–7 | 8–9 | 10–11 | 1–11 | 1–7 | 8–11 | 1–11 |
| **DeepLabV3+ Framework** | | | | | | | | | | |
| Fine-tuning | 0.00 ± 0.00 | 16.10 ± 7.67 | 3.58 ± 7.61 | 0.00 ± 0.00 | 0.00 ± 0.00 | 21.35 ± 3.61 | 3.88 ± 8.38 | 0.00 ± 0.00 | 20.55 ± 7.35 | 7.47 ± 10.83 |
| MiB [9] | 45.73 ± 10.82 | 17.39 ± 7.92 | 39.43 ± 15.61 | 37.36 ± 13.74 | 18.33 ± 6.92 | 32.17 ± 8.32 | 32.96 ± 13.89 | 46.25 ± 11.06 | 27.12 ± 12.61 | 39.29 ± 14.85 |
| PLOP [13] | 40.68 ± 13.84 | 16.48 ± 7.55 | 35.30 ± 16.21 | 31.23 ± 14.02 | 13.72 ± 8.01 | 25.92 ± 9.27 | 27.08 ± 14.00 | 40.33 ± 14.00 | 24.42 ± 11.56 | 34.54 ± 15.23 |
| SSUL [15] | 55.16 ± 23.12 | 34.42 ± 11.80 | 50.55 ± 22.82 | 48.30 ± 21.81 | 27.75 ± 12.43 | 41.15 ± 2.40 | 43.26 ± 19.81 | 55.13 ± 23.92 | 39.07 ± 8.59 | 49.29 ± 21.14 |
| InSeg [14] | 57.84 ± 23.22 | 33.95 ± 9.15 | 52.53 ± 23.17 | 52.25 ± 22.55 | 27.33 ± 10.47 | 40.93 ± 1.90 | 45.66 ± 20.90 | 53.40 ± 20.45 | 37.76 ± 10.38 | 47.71 ± 19.02 |
| NeST [16] | 57.76 ± 18.49 | 28.89 ± 7.52 | 51.34 ± 20.56 | 56.56 ± 19.69 | 29.11 ± 7.80 | 37.20 ± 6.67 | 48.05 ± 19.96 | 58.72 ± 11.20 | 28.59 ± 5.12 | 47.77 ± 17.31 |
| IDEC [17] | 63.80 ± 16.34 | 32.53 ± 16.10 | 56.85 ± 20.84 | 51.05 ± 21.36 | 25.17 ± 15.35 | 44.81 ± 3.31 | 45.21 ± 20.74 | 64.86 ± 16.69 | 40.59 ± 11.31 | 56.03 ± 18.98 |
| Ours | 68.86 ± 16.20 | 33.70 ± 8.58 | 61.05 ± 20.83 | 65.26 ± 18.91 | 29.08 ± 9.00 | 45.68 ± 1.60 | 55.12 ± 21.15 | 68.79 ± 16.73 | 47.34 ± 9.92 | 60.99 ± 17.90 |
| Offline | 69.83 ± 9.26 | 28.76 ± 12.60 | 60.70 ± 19.84 | 69.24 ± 9.95 | 28.95 ± 10.15 | 58.25 ± 2.65 | 59.92 ± 17.70 | 69.24 ± 9.95 | 43.60 ± 16.42 | 59.92 ± 17.70 |
| **ViT Encoder + Mask Decoder (MedSAM) Framework** | | | | | | | | | | |
| MBS [18] | 62.45 ± 28.18 | 47.06 ± 10.98 | 56.59 ± 27.65 | 60.66 ± 27.75 | 19.90 ± 2.79 | 48.20 ± 2.23 | 50.98 ± 27.00 | 61.57 ± 19.60 | 39.69 ± 15.98 | 53.61 ± 21.17 |
| Ours | 64.29 ± 14.43 | 43.87 ± 9.87 | 59.75 ± 19.05 | 62.88 ± 14.29 | 33.11 ± 12.61 | 47.37 ± 1.35 | 54.65 ± 17.21 | 64.34 ± 20.59 | 40.74 ± 10.56 | 55.76 ± 20.96 |
| Offline | 64.86 ± 12.53 | 28.35 ± 10.11 | 56.74 ± 19.37 | 64.75 ± 13.23 | 31.74 ± 10.58 | 48.16 ± 1.39 | 55.73 ± 17.29 | 64.75 ± 13.23 | 39.95 ± 11.15 | 55.73 ± 17.29 |
Offline methods are trained with all data available at once, without incremental learning steps. The highest and second-highest results are highlighted in bold and underlined, respectively
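Each table entry is a mean ± standard deviation of Dice scores aggregated over a class range. As a minimal sketch of how such an entry could be computed (the function names, toy label maps, and class IDs below are illustrative, not taken from the paper):

```python
import numpy as np

def dice(pred, gt):
    # Dice = 2|A∩B| / (|A| + |B|); defined as 1.0 when both masks are empty
    pred, gt = pred.astype(bool), gt.astype(bool)
    denom = pred.sum() + gt.sum()
    return 1.0 if denom == 0 else 2.0 * np.logical_and(pred, gt).sum() / denom

def class_range_dice(pred_labels, gt_labels, class_ids):
    # Average per-class Dice over one column group (e.g. classes 1-7)
    return float(np.mean([dice(pred_labels == c, gt_labels == c) for c in class_ids]))

# Toy 3x3 label maps with two foreground classes
gt = np.array([[1, 1, 0], [2, 2, 0], [0, 0, 0]])
pred = np.array([[1, 0, 0], [2, 2, 0], [0, 0, 0]])

# In practice this list would hold one score per test case; a table cell
# would then be np.mean(per_case) ± np.std(per_case), scaled to percent
per_case = [class_range_dice(pred, gt, [1, 2])]
mean, std = np.mean(per_case), np.std(per_case)
```

Here class 1 scores 2/3 (one of two ground-truth pixels recovered) and class 2 scores 1.0, so the single-case group score is their average, 5/6.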