Front Comput Neurosci. 2019 Dec 11;13:83. doi: 10.3389/fncom.2019.00083

Table 1. Data augmentation techniques applied in the approaches validated within the BraTS 2018 challenge framework.

References | Model | Augmentation techniques reported (Flip / Rotation / Translation / Scale / Shear / Elastic / GAD / Pixel-wise)
Albiol et al., 2019 | VGG, Inception, Dense | 3D affine transformations
Benson et al., 2018* | CNN (encoder-decoder) | Yes; Random
Carver et al., 2018 | U-Net | Yes
Chandra et al., 2018 | V-Net, ResNet-18, FC-CRF | Yes; Yes
Dai et al., 2018 | Domain-adapted U-Net | Yes
Feng et al., 2018 | U-Net | Yes
Gholami et al., 2018* | U-Net | PDE
Isensee et al., 2018 | U-Net | Yes; Yes; Yes; Random; Gamma
Kao et al., 2018 | DeepMedic, 3D U-Net | Yes
Kermi et al., 2018 | U-Net | Yes; Yes; Yes
Lachinov et al., 2018* | Cascaded U-Net | Yes; B-spline; Gaussian
Ma and Yang, 2018 | 3D CNN | Yes; Yes; Yes
McKinley et al., 2018 | Dense CNN | Yes; Yes; Shift, scale
Mehta and Arbel, 2018 | U-Net | Yes; Yes; Yes; Yes
Myronenko, 2018 | CNN (encoder-decoder) | Yes; Yes; Shift
Nuechterlein and Mehta, 2018 | 3D-ESPNet | Yes; Yes
Puybareau et al., 2018 | VGG-16 | Yes; Yes
Rezaei et al., 2018 | Voxel-GAN | Yes; Yes; Gaussian
Sun et al., 2018 | CNN, DFKZ, 3D CNN | Yes; Gaussian
Wang et al., 2018* | CNN | Yes; Yes; Yes; Random
Methods using each technique, count (%): Flip 15 (75%); Rotation 8 (40%); Translation 2 (10%); Scale 9 (45%); Shear 1 (5%); Elastic 2 (10%); GAD (generating artificial data) 1 (5%); Pixel-wise 8 (40%)

The top-performing techniques (over the unseen test set) are annotated in green in the original table.

*The authors verified the impact of data augmentation on the generalization abilities of their deep models.

The authors used both training- and test-time data augmentation.
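To give a concrete picture of the geometric and pixel-wise augmentation families tallied in Table 1 (flip, rotation, scaling, elastic deformation, and gamma correction), the sketch below applies them to a single 3D MRI volume with NumPy and SciPy. It is a minimal illustration, not the pipeline of any cited method: the function names (`augment`, `_match_shape`), probabilities, and parameter ranges are assumptions chosen for demonstration only.

```python
# Illustrative sketch of common augmentations from Table 1; parameter ranges are
# assumptions, not values taken from the reviewed BraTS 2018 approaches.
import numpy as np
from scipy.ndimage import rotate, zoom, gaussian_filter, map_coordinates

rng = np.random.default_rng(42)

def augment(volume):
    """Apply random flip, rotation, scaling, elastic deformation, and gamma correction."""
    vol = volume.astype(np.float32)

    # Flip: mirror along a randomly chosen axis with probability 0.5.
    if rng.random() < 0.5:
        vol = np.flip(vol, axis=int(rng.integers(0, 3)))

    # Rotation: small random in-plane rotation (assumed range of +/-10 degrees).
    angle = rng.uniform(-10, 10)
    vol = rotate(vol, angle, axes=(1, 2), reshape=False, order=1, mode="nearest")

    # Scale: isotropic zoom (assumed range [0.9, 1.1]), then crop/pad back to shape.
    factor = rng.uniform(0.9, 1.1)
    vol = _match_shape(zoom(vol, factor, order=1), vol.shape)

    # Elastic: random displacement field smoothed with a Gaussian filter.
    alpha, sigma = 5.0, 3.0  # assumed deformation magnitude and smoothness
    coords = np.meshgrid(*[np.arange(s) for s in vol.shape], indexing="ij")
    displaced = [c + gaussian_filter(rng.uniform(-1, 1, vol.shape), sigma) * alpha
                 for c in coords]
    vol = map_coordinates(vol, displaced, order=1, mode="nearest")

    # Pixel-wise: random gamma correction on intensities normalized to [0, 1].
    lo, hi = vol.min(), vol.max()
    norm = (vol - lo) / (hi - lo + 1e-8)
    return norm ** rng.uniform(0.8, 1.2) * (hi - lo) + lo

def _match_shape(arr, shape):
    """Center-crop or zero-pad `arr` to the target `shape`."""
    out = np.zeros(shape, dtype=arr.dtype)
    src = tuple(slice(max((a - s) // 2, 0), max((a - s) // 2, 0) + min(a, s))
                for a, s in zip(arr.shape, shape))
    dst = tuple(slice(max((s - a) // 2, 0), max((s - a) // 2, 0) + min(a, s))
                for a, s in zip(arr.shape, shape))
    out[dst] = arr[src]
    return out

# Example: augment a dummy 64x64x64 volume.
augmented = augment(rng.normal(size=(64, 64, 64)))
```

In practice, identical spatial transforms would also be applied to the segmentation masks (using nearest-neighbor interpolation), and the ranges tuned per dataset; the GAD column (e.g., PDE- or GAN-based generation of artificial data) is a separate, model-driven family not covered by this sketch.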