Reference | Method | Result | Dataset | Year
Das & Sabut (2016) | adaptive thresholding, morphological processing, and kernel fuzzy C-means (KFCM) clustering | PSNR = 8.5299 | MICCAI 2008 | 2016
Rela, Nagaraja & Ramana (2020) | superpixel-based fast fuzzy C-means clustering | Dice = 0.9154 | 20 CT images | 2020
Anter, Bhattacharyya & Zhang (2020) | CALOFCM, an optimization method based on fast FCM, chaos theory, and the bio-inspired ant lion optimizer | Dice = 0.773 | 27 CT images | 2020
Anter & Hassenian (2019) | watershed algorithm, neutrosophic sets (NS), and fast fuzzy C-means clustering | Dice = 0.9288 | 30 CT images | 2019
Liu et al. (2019) | deeper U-Net that copies only pooling-layer features in its skip connections, with graph segmentation to refine the results | Dice = 0.9505 | CodaLab | 2019
Xu et al. (2020) | improved UNet++ with residual structures added to the convolution blocks to avoid vanishing gradients | Dice = 0.9336 | 15 patients | 2020
Seo et al. (2019) | improved U-Net skip connections by adding a residual path with deconvolution layers and activation operations | Dice = 0.8972 | LiTS | 2019
Li et al. (2020a) | attention mechanism module added to the convolution blocks of UNet++ | Dice = 0.9815 | LiTS | 2020
Ahmad et al. (2019a) | deep belief network trained with unsupervised pretraining and supervised fine-tuning | Dice = 0.9480 (Sliver07); Dice = 0.9183 (3Dircadb01) | Sliver07, 3Dircadb01 | 2019
Ahmad et al. (2022) | very lightweight convolutional neural network with Gaussian weight initialization | Dice = 0.95 (Sliver07); Dice = 0.929 (3Dircadb01); Dice = 0.9731 (LiTS) | Sliver07, 3Dircadb01, LiTS | 2022
Ahmad et al. (2019b) | a new approach called CNN-LivSeg | Dice = 0.9541 | Sliver07 | 2019
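Nearly all of the results above are reported as the Dice similarity coefficient, DSC = 2|A ∩ B| / (|A| + |B|), computed between a predicted liver mask A and the ground-truth mask B. The following is a minimal NumPy sketch of that metric; the function name, argument names, and epsilon smoothing term are illustrative choices and are not taken from any of the cited works.

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice similarity coefficient between two binary segmentation masks.

    pred, target: arrays of the same shape, interpreted as binary masks
    (non-zero = foreground). eps avoids division by zero for empty masks.
    """
    pred = np.asarray(pred, dtype=bool)
    target = np.asarray(target, dtype=bool)
    intersection = np.logical_and(pred, target).sum()
    return 2.0 * intersection / (pred.sum() + target.sum() + eps)

# A perfect prediction gives Dice close to 1.0; disjoint masks give ~0.0.
if __name__ == "__main__":
    gt = np.zeros((4, 4), dtype=int)
    gt[1:3, 1:3] = 1
    print(dice_coefficient(gt, gt))      # ~1.0
    print(dice_coefficient(gt, 1 - gt))  # ~0.0
```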