Abstract
Computer-aided diagnosis of skin lesions is a growing area of research, and interest in developing such systems has increased in recent years. This paper aims to review, synthesize, and evaluate the quality of evidence for the diagnostic accuracy of computer-aided systems. It covers papers published in the last five years in the ScienceDirect, IEEE, and SpringerLink databases: 53 articles using traditional machine learning methods and 49 articles using deep learning methods. The studies are compared based on their contributions, the methods used, and the results achieved. The work identifies the main challenges of evaluating skin lesion segmentation and classification methods, such as small datasets, ad hoc image selection, and racial bias.
Keywords: skin image segmentation, skin lesion classification, machine learning, deep learning, small data, racial bias
1. Introduction
The annual incidence of melanoma has increased by 53%, due in part to increased ultraviolet (UV) exposure [1]. Although melanoma is one of the deadliest types of skin cancer, early identification gives a high chance of survival.
Cancer develops when cells in the body begin to proliferate uncontrollably; cancerous cells can form in practically any part of the body and then spread, or metastasize [2]. Skin cancer is the uncontrolled proliferation of abnormal skin cells. Unrepaired DNA damage to skin cells, most typically caused by UV radiation from the sun or tanning beds, creates mutations, or genetic flaws, that cause skin cells to multiply rapidly and form malignant tumors.
Several varieties of benign and malignant skin lesions make the diagnosis of skin lesions complex. Squamous Cell Carcinoma (SCC), Basal Cell Carcinoma (BCC), and melanoma are the major forms of irregular skin cells seen in clinical practice [3]. Further, the Skin Cancer Foundation (SCF) [4] distinguishes three less common types of abnormal cells, namely Merkel cell carcinoma, Actinic Keratosis (AKIEC), and atypical moles. The six forms of skin lesions are depicted in Figure 1. After melanoma, atypical moles are the second most harmful. According to the SCF [4], the distinctions between these abnormal tissues are as follows:
Actinic Keratosis (AKIEC) or solar keratosis: This is a crusty, scaly growth on the skin. It is classified as pre-cancer because it has the potential to turn into skin cancer if left untreated;
Atypical moles: Also known as dysplastic nevi, these are benign moles with an irregular appearance. They may look like melanoma, and people who have them are more likely to develop melanoma in a mole or anywhere else on the body;
Basal Cell Carcinoma (BCC): This is the most common form of skin cancer. This form of skin cancer spreads rarely. Its common symptoms are open sores, shiny bumps, red spots, pink growths, or scars;
Melanoma: This is the most lethal kind of skin cancer. It is often black or brown, although it can also be pink, red, purple, blue, or white. UV radiation from the sun or tanning beds causes cancerous tumors. Melanoma is frequently treatable if detected and treated early, but, if not, the disease can spread to other places of the body and the therapy would be complicated and deadly;
Merkel cell carcinoma: This is an uncommon and aggressive kind of skin cancer that has a high chance of metastasizing. However, it is 40 times less prevalent than melanoma;
Squamous Cell Carcinoma (SCC): This is the second most frequent kind of skin cancer. Scaly red spots, open sores, raised growths with a central depression, or warts are frequent signs.
The dominant cause of these forms of skin cancer is skin tissue damage caused by UV radiation [5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21]. A dermatologist’s visual examination is a common clinical procedure for melanoma diagnosis [18]. The precision of such clinical diagnosis is somewhat deceptive [19]. Dermoscopy is a non-invasive diagnostic method that connects clinical dermatology with dermatopathology by displaying morphological characteristics that are not discernible to the naked eye. The visualized morphological details can be significantly improved with different techniques such as solar scans [20], epiluminescence microscopy (ELM), cross-polarization epiluminescence (XLM), and side transillumination [21,22,23,24]. The dermatologist therefore obtains further diagnostic criteria. Dermoscopy improves diagnostic performance by 10–30% relative to the naked eye [25,26,27,28]. Nevertheless, [29,30,31] reported that the diagnostic accuracy of dermoscopy is lower for novice dermatologists than for experts, because the process requires a great deal of experience to identify lesions [32].
According to ref. [33], professional dermatologists achieved 90% sensitivity and 59% specificity in the identification of skin lesions, whereas the corresponding figures for less qualified general practitioners dropped to about 62–63%.
A visual inspection by a dermatologist of the suspicious skin region is the first step in diagnosing a malignant lesion. A correct diagnosis is essential because certain types of lesions are similar; moreover, the accuracy of a Computer-Aided Diagnosis (CAD) system is close to that of an experienced dermatologist [34,35,36]. Without technological assistance, dermatologists diagnose melanoma with a 65–80% accuracy rate [37]. For suspect cases, a dermatoscopic image is taken using a very high-resolution camera to complete the visual examination. The lighting is controlled and a filter is used during the recording to reduce skin reflections and thus visualize the deeper layers of skin. This technical assistance led to a 49% improvement in skin lesion diagnosis [31]. Ultimately, the combination of a visual examination and dermatoscopic images led to an absolute accuracy of 75–84% for melanoma detection [38,39].
Classifying skin lesions has been an aim of the machine learning community for some time. Automated classification of lesions is used in clinical examination to help physicians and allow rapid and affordable access to lifesaving diagnoses [40], and outside the hospital environment, smartphone apps have been used [41]. Before 2016, most research adopted the traditional machine learning workflow of preprocessing (enhancement), segmentation, feature extraction, and classification [41,42,43]. These phases are explained below, and a minimal code sketch of the full pipeline follows the list:
Image enhancement: This phase aims to eliminate all noise and artifacts such as hair and blood vessels in dermoscopic images;
Segmentation: Segmenting the Region of Interest (ROI) is a crucial step in CAD systems. The large number of different skin lesion types makes segmentation of skin cancer images complex, and it remains one of the most difficult and tedious tasks in a CAD system;
Feature extraction: After defining the ROI, the goal of the feature extraction step is to identify the best set of features that have high discrimination capability to classify the dataset into two or more classes;
Classification and detection: The proposed system is evaluated according to its capability to classify the dataset into different classes. Hence, the choice of classifier is critical for good performance; however, it depends on the set of extracted features and the required number of classes. The classification performance measures are accuracy, specificity, sensitivity, precision, and the Receiver Operating Characteristic (ROC).
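To make the workflow concrete, the following is a minimal, illustrative sketch of the four stages applied to a single image, assuming scikit-image and scikit-learn are available. The median filter, Otsu threshold, shape/color features, and SVM classifier are placeholders chosen for brevity, not the method of any particular reviewed paper.

```python
# Minimal illustration of the classic CAD pipeline: enhancement -> segmentation
# -> feature extraction -> classification. The filter, threshold, feature set,
# and classifier are illustrative placeholders. Assumes the lesion is darker
# than the surrounding skin.
import numpy as np
from skimage import color, filters, measure
from skimage.morphology import closing, disk
from sklearn.svm import SVC

def extract_features(rgb_image):
    gray = color.rgb2gray(rgb_image)
    smoothed = filters.median(gray, disk(3))              # enhancement: suppress hair/noise
    mask = smoothed < filters.threshold_otsu(smoothed)    # segmentation: Otsu threshold
    mask = closing(mask, disk(5))                         # clean up small gaps in the mask
    props = measure.regionprops(mask.astype(int))[0]      # assumes a lesion region exists
    circularity = 4 * np.pi * props.area / (props.perimeter ** 2 + 1e-8)  # border proxy
    color_std = [rgb_image[mask][:, c].std() for c in range(3)]           # color variegation
    return [props.area, circularity, *color_std]

# Hypothetical usage with lists of RGB arrays and binary labels (benign/melanoma):
# X = np.array([extract_features(im) for im in train_images])
# clf = SVC(kernel="rbf").fit(X, train_labels)
```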
The need for high-performance CAD systems is essential for lesion detection and diagnosis. Feature selection is a crucial task in CAD system development; the choice of appropriate features for the automatic recognition of pigmented skin lesion images has been studied since 1987 [44]. In the same manner, errors and data loss have a significant influence on the classification rate. For example, an inaccurate segmentation result often leads to poor feature extraction and, thus, low classification accuracy. Machine vision and computer analysis are becoming more critical to producing a successful automatic melanoma diagnosis system [45,46,47,48,49,50]. An accurate CAD system would help doctors and dermatologists make better and more dependable diagnoses.
Many CAD systems have been proposed using different border detection, feature extraction, feature selection, and classification algorithms. Some studies [51,52,53,54,55,56,57,58] have analyzed image processing techniques for diagnosing skin cancer; moreover, they compared Artificial Intelligence (AI) and CAD system performance against the diagnostic accuracy of experienced dermatologists. However, further work is required to define and reduce ambiguity in automated decision support systems to enhance diagnostic accuracy. There is no comprehensive and up-to-date review of automatic skin lesion diagnosis models, and the constant development of new dermoscopic classification algorithms and techniques in recent years would benefit from such a study.
2. Methods
2.1. Systematic Review
We searched for systematic reviews and original research papers written in English in the ScienceDirect, IEEE, and SpringerLink databases. In this analysis, only papers that were published in journals and followed proper scientific procedure were considered.
Papers were included based on the following inclusion criteria: (i) binary or multi-class classification or segmentation of skin lesions, (ii) traditional machine learning methods, (iii) deep learning models, (iv) digital image modality, (v) publication in well-defined journals, and (vi) publication in English.
The exclusion criteria were used to exclude irrelevant studies: (i) review articles, (ii) papers published in a language other than English, (iii) conference papers, (iv) books, and (v) book chapters.
The PRISMA flow diagram in Figure 2 shows the selection procedure [59]. The initial search identified 111,701 literature sources satisfying the search criteria. These were supplemented with 5757 records identified using other methods (forward and backward snowballing). After the removal of duplicates, 106,398 records remained. Applying the inclusion criteria yielded 801 full-text articles, which were further screened using the exclusion criteria. Finally, 53 articles using traditional machine learning methods and 49 articles using deep learning were selected. The selected articles were further analyzed and their results are discussed in this study. In addition, for studies that tested several models, we list only the model with the best score in each sample.
2.2. Datasets
The performance of melanoma diagnosis has improved with the dermoscopic method [21]. Dermoscopy is a non-invasive skin imaging technique that captures illuminated and magnified images of skin lesions to improve the clarity of the spots; visualization of the deeper skin layers is improved by removing reflections from the skin surface [60]. Automatic identification of melanoma in dermoscopic images is a challenging task due to many factors: firstly, intraclass variability in lesions such as texture, scale, and color; secondly, the high resemblance between melanoma and non-melanoma lesions; and finally, surrounding artifacts such as hair, veins, color calibration charts, and ruler marks.
In this section, the datasets most used in this area of research are described. A broad range of freely available online datasets, such as MED-NODE, DermIS, DermQuest, ISIC 2016, 2017, 2018, and 2019, and PH2, as well as paid datasets such as Dermofit, have been used.
MED-NODE consists of 170 images of melanoma and nevus, divided into 70 and 100 images, respectively. The dataset came from the digital image archive of the Department of Dermatology of the University of Medicine in the Netherlands, and a system for detecting skin cancer from these images was developed and tested on it [61]. DermIS [62] is the Dermatology Information System skin dataset. This dataset is divided into two classes, nevus and melanoma, and contains 69 images: 26 nevus and 43 melanoma. DermQuest is a dataset consisting of 137 images, divided into two classes, melanoma and nevus, with 76 and 61 images, respectively [63].
The PH2 [64] database was created in collaboration between Porto University, the Technical University of Lisbon, and Hospital Pedro Hispano in Matosinhos. It comprises 200 RGB color images with a 768 × 560 pixel resolution. The dataset has three groups of images: melanoma, common nevus, and atypical nevus, with 40, 80, and 80 images, respectively.
The skin colors represented in the PH2 database range from white to creamy white. As illustrated in Figure 3, the images were carefully selected, taking into account their resolution, quality, and dermoscopic features.
The International Skin Imaging Collaboration (ISIC) 2016 [65] dataset, referred to as the 2016 ISIC-ISBI challenge, provides 900 training images. Participants can produce and submit automated results on a separate test dataset (350 images). The training dataset consists of two classes, melanoma and benign, containing 173 and 727 dermoscopic images, respectively.
The International Skin Imaging Collaboration (ISIC) 2017 dataset [66] is also referred to as the 2017 ISBI Challenge on Skin Lesion Analysis Towards Melanoma Detection. The challenge provides training data (2000 images), a separate validation dataset (150 images), and a blind held-out test dataset (600 images). The training dataset consists of three classes, with 374, 254, and 1372 dermoscopic images for melanoma, seborrheic keratosis, and nevus, respectively. Figure 4 shows examples of different skin lesions from ISIC 2017.
The International Skin Imaging Collaboration (ISIC) 2018 dataset [67,68], also referred to as the HAM10000 (“Human Against Machine with 10,000 training images”) dataset, is divided into a training dataset of 10,015 images and a test dataset of 1512 images. It was compiled using a variety of dermatoscopy techniques on all anatomic sites (except mucosa and nails) from a retrospective sample of patients who had undergone skin cancer screening at multiple institutions. The training dataset consists of seven classes: AKIEC, BCC, Benign Keratosis (BKL), Dermatofibroma (DF), Melanocytic nevus (NV), Melanoma (MEL), and Vascular lesion (VASC), containing 327, 514, 1099, 115, 6705, 1113, and 142 images, respectively. Classifying images into these seven groups is one of the most difficult challenges.
Another dataset from the International Skin Imaging Collaboration, ISIC 2019 (BCN_20000) [69], consists of eight known classes and one class for outlier images. The known classes are MEL, NV, BCC, AKIEC, BKL, DF, VASC, and SCC. ISIC 2019 contains 25,331 images: 867 AKIEC, 2624 BKL, 3323 BCC, 239 DF, 12,875 NV, 4522 MEL, 628 SCC, and 253 VASC. Figure 5 depicts the several types of skin cancer. This dataset is one of the hardest to classify because of its eight classes with an uneven number of images per class; the hardest part of the challenge is detecting outlier, or “out of distribution”, images that do not belong to any of the known diagnoses.
The Dermofit Image Library dataset is composed of 1300 skin images with corresponding class labels and lesion segmentations. There are ten lesion categories in this dataset: AKIEC, BCC, hemangioma, intraepithelial carcinoma, nevus, SCC, pyogenic granuloma, seborrhoeic keratosis, DF, and MEL. The classes contain 331, 76, 257, 239, 65, 45, 97, 88, 78, and 24 images for nevus, MEL, seborrhoeic keratosis, BCC, DF, AKIEC, hemangioma, SCC, intraepithelial carcinoma, and pyogenic granuloma, respectively [70].
The Interactive Atlas of Dermoscopy (EDRA) [71] is another skin image dataset, with 20 labels: melanoma (with several subtypes), BCC, blue nevus, Clark’s nevus, combined nevus, congenital nevus, dermal nevus, DF, lentigo, melanosis, recurrent nevus, Reed nevus, seborrheic keratosis (SK), and vascular lesions (VASC).
The ISIC challenge 2020 dataset consists of 33,126 dermoscopic images acquired from over 2000 patients. The images are divided into nine classes in addition to an unknown image class [72]. Table 1 summarizes the total number of images and the number of images in each class for all of these datasets.
Table 1. The total number of images and the number of images per class in each dataset.

Dataset | Nevus or Atypical Nevus | Common Nevus | Melanoma | Seborrheic Keratosis | Basal Cell Carcinoma | Dermatofibroma | Actinic Keratosis | Vascular Lesion or Hemangioma | Squamous Cell Carcinoma | Intraepithelial Carcinoma | Pyogenic Granuloma | Total |
---|---|---|---|---|---|---|---|---|---|---|---|---|
MedNode | 100 | - | 70 | - | - | - | - | - | - | - | - | 170 |
Dermis | 26 | - | 43 | - | - | - | - | - | - | - | - | 69 |
DermQuest | 61 | - | 76 | - | - | - | - | - | - | - | - | 137 |
Ph2 | 80 | 80 | 40 | - | - | - | - | - | - | - | - | 200 |
ISIC 2016 | 726 | - | 173 | - | - | - | - | - | - | - | - | 899 |
ISIC 2017 | 1372 | - | 374 | 254 | - | - | - | - | - | - | - | 2000 |
ISIC 2018 | 6705 | - | 1113 | 1099 | 514 | 115 | 327 | 142 | - | - | - | 10,015 |
ISIC 2019 | 12,875 | - | 4522 | 2624 | 3323 | 239 | 867 | 253 | 628 | - | - | 25,331 |
Dermofit | 331 | - | 76 | 257 | 239 | 65 | 45 | 97 | 88 | 78 | 24 | 1300 |
EDRA | 560 | 55 | 196 | 45 | 42 | 20 | - | 29 | - | 64 | - | 1011 |
ISIC2020 | 46 | 5193 | 584 | 135 | 7 | - | 37 | - | - | - | - | 6002 |
For more than 30 years, skin cancer detection by CAD systems has been a hot topic of research [73]. For example, several methods for melanoma identification, classification, and segmentation have been developed and tested [74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,97,98,99]. The following sections review the state-of-the-art efforts reported in journal papers published in the ScienceDirect, IEEE, and SpringerLink databases during the last five years.
2.3. Traditional Machine Learning
The asymmetry, border, color, diameter (ABCD) rule was used by the authors of [100] to analyze the border and color of skin lesions. They classified the features using a multilayer perceptron network (MLP) trained with backpropagation. In [101], a Gabor filter and geodesic active contours were used to enhance the image and remove hair; the ABCD scoring method was then used to extract features, and a combination of existing methods was used to classify lesions. In [102], the authors classified melanoma based on the thickness of the lesion. Two classification schemata were used: the first classified lesions into thin or thick, and the second classified lesions into thin, intermediate, and thick. They combined logistic regression with artificial neural networks for classification. In [103], lesions were enhanced using a median filter applied separately to each channel of the RGB space and then segmented using a deformable model. A segmentation method based on the Chan–Vese model was proposed by [104]: images were first enhanced using an anisotropic diffusion filter, ABCD was used to extract features from the segmented regions, and these features were classified using a support vector machine (SVM). In [105], the authors proposed a classification system for BCC and MEL using paraconsistent logic (PL) with two-valued annotation. They extracted the degrees of evidence, formation pattern, and diagnosis comparison; the spectra used to differentiate between normal tissue, BCC, and MEL were 30, 96, and 19, respectively. A Delaunay triangulation was used in [106] to extract the binary mask of ROIs. The authors of [107] segmented histopathological images by extracting the granular layer boundary and then used intensity profiles to classify only two lesion types.
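As a rough illustration of how one component of the ABCD rule can be quantified, the sketch below scores lesion asymmetry from a binary segmentation mask by comparing the mask with its mirror image about the centroid column; the actual scoring schemes used in [100,101,104] differ in detail and are not reproduced here.

```python
# Illustrative asymmetry score for the "A" in the ABCD rule: the fraction of
# lesion area that does not overlap its own horizontal mirror image taken about
# the centroid column. A perfectly symmetric lesion scores close to 0.
import numpy as np

def asymmetry_score(mask: np.ndarray) -> float:
    """mask: 2-D boolean array, True inside the lesion."""
    ys, xs = np.nonzero(mask)
    cx = int(round(xs.mean()))                                   # centroid column
    cols = np.clip(2 * cx - np.arange(mask.shape[1]), 0, mask.shape[1] - 1)
    mirrored = mask[:, cols]                                     # reflect about the centroid
    non_overlap = np.logical_xor(mask, mirrored).sum()
    return non_overlap / (2.0 * mask.sum())                      # roughly normalized to [0, 1]
```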
A comparison of four classification methods used for skin lesion diagnosis is summarized in [108] to determine the best method. Based on the histopathology of BCC, the authors of [109] combined two or three Fourier transform features to form one Z-transform feature. A CAD system using principal component analysis (PCA) and SVM was proposed in [110] to classify skin psoriasis. A framework for the segmentation of BCC was proposed in [111], in which the hemoglobin component was clustered using k-means. In [112], skin cancer was classified based on lesion depth using 3D reconstruction; adaptive snakes, stereo vision, structure from motion, and depth from focus were used for segmentation and classification.
In [113], the lesion was extracted using a self-generated neural network, and descriptive border, texture, and color features were then extracted. Finally, an ensemble classifier that combined fuzzy neural networks with a backpropagation (BP) neural network was used to classify the lesions based on the extracted features. A fixed grid wavelet network was used by [114] to enhance and segment skin lesions, and the resulting features were classified by D-optimality orthogonal matching pursuit. Based on a chaotic time series analysis of the boundary, Khodadadi et al. [115] analyzed the irregular boundary of an infected skin lesion using the Lyapunov exponent and Kolmogorov–Sinai entropy.
In [116], a segmentation technique for skin lesions based on ant colony optimization was proposed, using three types of lesion features: texture, relative color, and geometrical properties. These features were then classified by two classifiers: an artificial neural network (ANN) and K-nearest neighbour (KNN). Based on shape and color, the authors of [117] combined several features after segmentation using ABCD; these features were classified and tested both individually and after being combined. The Histogram of Gradients (HOG) and the Histogram of Lines (HL) were used in [118] to create a separate bag of features for each. The bags of features were used to extract texture and color features for skin lesion detection, and color features were extracted using third-order Zernike moments.
Roberta et al. [119] proposed a skin lesion diagnosis method based on an ensemble model for feature manipulation. Przystalski et al. [120] proposed multispectral lesion analysis using fractal methods. Jaisakthi et al. [121] proposed a segmentation method for skin lesions using the GrabCut algorithm and k-means. Do et al. [122] introduced a melanoma detection system for smartphones: images were acquired using a smartphone camera, and the authors searched for the processing method best suited to smartphones, starting with a hierarchical segmentation approach and numerical features to classify a skin lesion. Adjed et al. [123] proposed a feature extraction method using the wavelet transform, curvelet transform, and local binary patterns (LBP); the extracted features were classified using an SVM. Hosseinzade [124] segmented lesion images based on texture characteristics, which were described by Gabor filters and then clustered using k-means.
Akram et al. [125] proposed an automatic skin lesion segmentation and classification system using several methods, including ABCD, fuzzy C-means, pair-threshold binary decomposition, HOG, and linear discriminant analysis (LDA). Jamil et al. [126] proposed a technique for skin lesion detection and classified lesions based on color, shape, Gabor wavelet, and gray intensity features. Khan et al. [127] combined the Bhattacharyya distance and variance in an entropy-based method for skin lesion detection and classification.
Tan et al. [128] enhanced Particle Swarm Optimization (PSO) for skin lesion feature optimization. The authors modified two PSO models for discriminative feature selection: the first performed a global search over combined lesion features and an in-depth local search by separating lesions into specific areas, while the second used random acceleration coefficients. These features were subsequently classified using different classification methods. Tajeddin et al. [129] classified skin melanoma based on highly discriminative features. They started with contour propagation for lesion segmentation; to extract features from the peripheral area, lesions were mapped into log-polar space using Daugman’s transformation. Finally, different classifiers were compared.
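The sketch below shows the general shape of PSO-based feature selection in the spirit of [128]; the modified velocity updates and local-search strategy of that paper are not reproduced, the fitness function is simply cross-validated KNN accuracy, and all hyperparameters are illustrative toy values.

```python
# Schematic PSO feature selection: each particle is a real-valued vector over
# features; features with weight > 0.5 are selected and scored by cross-validated
# KNN accuracy. Not the exact algorithm of [128].
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def pso_select(X, y, n_particles=20, n_iter=30, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    dim = X.shape[1]
    pos = rng.random((n_particles, dim))
    vel = np.zeros((n_particles, dim))

    def fitness(p):
        sel = p > 0.5
        if not sel.any():
            return 0.0
        clf = KNeighborsClassifier(n_neighbors=3)
        return cross_val_score(clf, X[:, sel], y, cv=3).mean()

    pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_fit.argmax()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, 1.0)
        fit = np.array([fitness(p) for p in pos])
        improved = fit > pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
        gbest = pbest[pbest_fit.argmax()].copy()
    return gbest > 0.5      # boolean mask of selected features
```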
To classify skin lesions, the authors of [130] utilized the structural co-occurrence matrix of frequencies extracted from dermoscopic images. Peñaranda et al. [131] classified skin lesions by analyzing skin cells using Fourier transform infrared spectroscopy and also studied the perturbations that influenced the results. Wahba et al. [132] proposed a skin lesion classification system based on the gray-level difference method, discriminating between four lesion types by extracting features with ABCD and the cumulative level-difference mean and classifying them using an SVM. Zakeri et al. [133] proposed a CAD system to differentiate between melanoma and dysplastic lesions; they enhanced the gray-level co-occurrence matrix to extract features, which were classified using an SVM. Pathan et al. [134] proposed a detection system for pigment networks that differentiated between typical and atypical network patterns. In [135], laser-induced breakdown spectroscopy was used with a combination of statistical methods to distinguish between soft tissues of the skin.
Chatterjee et al. [136] utilized non-invasive skin lesion images to distinguish melanoma from nevi, obtaining the texture pattern of the skin using 2D wavelet packet decomposition. Qasim et al. [137] proposed a skin lesion classification system in which images were enhanced with a Gaussian filter, KNN was used to extract the ROI, and the segmented ROI was classified using an SVM. Madooei et al. [138] utilized the blue-whitish structure to differentiate melanoma from nevi. Saez et al. [139] classified lesions as melanoma or nevi based on color, using both the lesion color itself and neighborhood color values. Navarro et al. [140] proposed a segmentation system for skin lesions and classified the segmented lesions into melanoma and nevi. Riaz et al. [141] proposed a CAD system for skin lesions that started with lesion segmentation to extract the ROI and utilized the Kullback–Leibler divergence to detect lesion boundaries.
Sabbaghi et al. [142] presented a QuadTree-based approach built on the perception of lesion color, finding that the three most common colors of melanoma were blue-grey, black, and pink; they used different classifiers, such as SVM, ANN, LDA, and random forests (RFs). Murugan et al. [143] utilized watershed segmentation to extract the ROI; features were extracted using ABCD and the Gray-Level Co-occurrence Matrix (GLCM) and classified using KNN, RF, and SVM. Khalid et al. [144] suggested a segmentation method for dermoscopic skin lesion images using a combination of the wavelet transform with morphological operations. Majumder et al. [145] proposed three features for melanoma classification based on the ABCD rule. Chatterjee et al. [146] introduced fractal-based regional texture analysis (FRTA) to extract shape, fractal dimension, texture, and color features and classify three lesion types using an SVM with an RBF kernel.
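Several of these works rely on GLCM texture descriptors. A minimal sketch of such features is shown below, assuming scikit-image ≥ 0.19 (where the functions are spelled graycomatrix/graycoprops); the particular descriptor set is illustrative.

```python
# Illustrative GLCM texture descriptors of the kind used in [143,146]: contrast,
# correlation, energy, and homogeneity of the grayscale lesion region.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(gray_uint8):
    """gray_uint8: 2-D uint8 grayscale image (or cropped lesion region)."""
    glcm = graycomatrix(gray_uint8, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "correlation", "energy", "homogeneity"]
    return [graycoprops(glcm, p).mean() for p in props]   # averaged over the 4 directions
```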
Chatterjee et al. [147] proposed a classification system for four kinds of lesions. Features were extracted using cross-correlation techniques based on frequency domain analysis, and the system differentiated between benign and malignant lesions of the epidermal and melanocytic classes in a binary manner. Upadhyay et al. [148] extracted color, orientation histogram, and gradient location features from skin lesions; these features were fused and classified as benign or malignant using an SVM. Pathana et al. [149] proposed a skin lesion CAD system. Garcia-Arroyo et al. [150] proposed a skin lesion detection system in which lesions were segmented using fuzzy histogram thresholding. Hu et al. [151] suggested a skin lesion classification approach that introduced codewords for a bag of features to measure the similarity between features. Moradi et al. [152] suggested a skin lesion segmentation and classification system based on sparse kernel representation. Pereira et al. [153] proposed a skin lesion classification system based on lesion borderline characteristics, combining LBP with gradients.
2.4. Deep Learning
Kawahara et al. [154] proposed a skin lesion classification system using a convolutional neural network (CNN); the modified pre-trained CNN was able to classify images with different resolutions. Yu et al. [155] proposed a novel CNN for skin lesion segmentation and classification based on residual learning and consisting of 50 layers. Codella et al. [156] proposed a deep residual network for skin lesion classification using the benchmark dataset ISIC 2016. Bozorgtabar et al. [157] proposed a decision support system that localized skin cancer automatically using deep convolutional learning for pixel-wise image segmentation. Yuan et al. [158] proposed a skin lesion segmentation system using a 19-layer CNN; instead of a traditional loss function, they used the Jaccard distance as the loss function.
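A commonly used differentiable form of the Jaccard (IoU) distance loss for binary segmentation is sketched below in PyTorch; it illustrates the idea and may differ in detail from the exact formulation used in [158].

```python
# Soft (differentiable) Jaccard distance loss for binary lesion segmentation.
import torch

def soft_jaccard_loss(pred: torch.Tensor, target: torch.Tensor, eps: float = 1e-7):
    """pred: sigmoid probabilities, target: binary masks, both shaped (N, 1, H, W)."""
    pred, target = pred.flatten(1), target.flatten(1)
    intersection = (pred * target).sum(dim=1)
    union = pred.sum(dim=1) + target.sum(dim=1) - intersection
    return (1.0 - (intersection + eps) / (union + eps)).mean()
```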
Sultana et al. [159] proposed a skin lesion detection system using deep residual learning with a regularized framework. Rundo et al. [160] utilized the ABCD rule to analyze skin lesions, and ad hoc clustering was performed using a pre-trained deep Levenberg–Marquardt neural network. Creswell et al. [161] proposed denoising adversarial autoencoders to classify limited and imbalanced skin lesion images. Harangi [162] ensembled four different CNNs to investigate the impact on performance. Guo et al. [163] ensembled multiple ResNets to analyze skin lesions; the training images for each ResNet were preprocessed in different ways while the labels remained unchanged.
Monedero et al. [164] utilized lesion thickness to detect melanoma using the Breslow index; the extracted texture, shape, pigment network, and color features were fed to GoogleNet to classify lesions into five types. Hagerty et al. [165] utilized deep learning and conventional image processing to extract different skin lesion features; the features from both sources were combined and fused, and the newly generated features were used to classify lesions. Polap [166] proposed an IoT-based skin lesion classification system using deep learning. The proposed model classified images in a short time but with a low classification rate. Such a system could be used in a smart home as part of an intelligent monitoring system [167].
Sarkar et al. [168] proposed a depth-wise separable residual deep convolutional network to classify skin cancer, with preprocessing consisting of a non-local means filter followed by contrast-limited adaptive histogram equalization (CLAHE) applied over the discrete wavelet transform (DWT). Zhang et al. [169] proposed a CNN model with attention residual learning to classify skin lesions into three classes; the proposed model has four residual blocks with a total of 50 layers. Albahar [170] introduced a skin lesion detection system for binary classification into malignant or benign, proposing a seven-layer CNN and a regularization method that controls the complexity of the classifier using the standard deviation of the weight matrix.
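For illustration, the first two preprocessing steps mentioned above (non-local means denoising followed by CLAHE) can be sketched with OpenCV as follows; the wavelet-domain processing of [168] is omitted and the parameter values are illustrative defaults rather than those of the paper.

```python
# Non-local means denoising followed by CLAHE applied to the luminance channel only.
import cv2

def enhance(bgr_image):
    """bgr_image: 8-bit BGR image as loaded by cv2.imread."""
    denoised = cv2.fastNlMeansDenoisingColored(bgr_image, None, 10, 10, 7, 21)
    lab = cv2.cvtColor(denoised, cv2.COLOR_BGR2Lab)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    lab = cv2.merge((clahe.apply(l), a, b))         # equalize contrast on the L channel
    return cv2.cvtColor(lab, cv2.COLOR_Lab2BGR)
```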
González-Díaz [171] introduced a skin lesion CAD system using a CNN called DermaKNet. The proposed CNN was based on ResNet50, with a modulation block applied over the outputs of the convolutional res5c layer. Two pooling layers (AVG and polar AVG) operated in parallel, and three fully connected layers were used at the end of the CNN. Because melanoma grows in different ways, the third fully connected layer precedes an asymmetry block, which was used to detect the different patterns of melanoma growth.
Kawahara et al. [172] proposed a skin lesion classification system using a CNN that works on multiple tasks simultaneously and is able to classify the seven-point melanoma checklist criteria, using both skin lesion images and patient meta-data. Yu et al. [173] proposed a skin lesion classification system using a CNN and a local descriptor encoding method: lesion features were extracted using ResNet50 and ResNet101, a Fisher vector (FV) was used to build a global image representation from the extracted features, and an SVM with a Chi-squared kernel was used for classification. Dorj et al. [174] proposed a skin cancer classification system in which a pre-trained AlexNet was used to extract features and an Error-Correcting Output Codes (ECOC) SVM was used to classify them.
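A minimal sketch of the “pre-trained CNN as a fixed feature extractor plus classical classifier” pattern used in [173,174] is given below, assuming torchvision ≥ 0.13 (for the weights API) and scikit-learn; the Fisher-vector encoding and ECOC coding of those papers are omitted.

```python
# ImageNet-pretrained ResNet50 as a fixed feature extractor; an SVM is trained
# on the resulting 2048-dimensional descriptors.
import torch
import torchvision.models as models
import torchvision.transforms as T
from sklearn.svm import SVC

backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
backbone.fc = torch.nn.Identity()                 # drop the 1000-way ImageNet head
backbone.eval()

preprocess = T.Compose([T.Resize(256), T.CenterCrop(224), T.ToTensor(),
                        T.Normalize(mean=[0.485, 0.456, 0.406],
                                    std=[0.229, 0.224, 0.225])])

@torch.no_grad()
def features(pil_images):
    batch = torch.stack([preprocess(im) for im in pil_images])
    return backbone(batch).numpy()                # (N, 2048) descriptors

# Hypothetical usage with lists of PIL images and integer labels:
# clf = SVC(kernel="rbf").fit(features(train_images), train_labels)
# predictions = clf.predict(features(test_images))
```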
Gavrilov et al. [175] proposed a skin neoplasm (cancer) classification system using a CNN, applying transfer learning to Inception V3 (GoogleNet). Furthermore, web and mobile applications were developed to allow patients to assess their lesions and obtain a preliminary diagnosis from images captured by themselves. Chen et al. [176] proposed a classification system for facial skin diseases, using three CNN models with transfer learning to classify five skin diseases of the face; the proposed model was deployed on a cloud platform. Mahbod et al. [177] utilized four different CNN models (AlexNet, VGG, ResNet-18, and ResNet-101) to classify three skin lesions, using SVM, RF, and MLP to classify the extracted CNN features; the different classification results were ensembled to generate a single classification for the input lesions. Brinker et al. [178] proposed a skin lesion classification system using a pre-trained model; the modified pre-trained ResNet50 outperformed expert dermatologists in classifying lesions into melanoma and nevus.
Tan et al. [179] utilized PSO for skin lesion segmentation, optimizing PSO with different methods, such as the Firefly Algorithm (FA), spiral search action, probability distributions, crossover, and mutation. K-means was used to enhance lesion segmentation, and the hybrid learning PSO (HLPSO) was used in the development of the CNN. The classification system could classify lesions into melanoma and nevus. Khan et al. [180] used a custom ten-layer CNN for image segmentation and a deep pre-trained CNN model for feature extraction; an improved moth flame optimization (IMFO) algorithm was then used for feature selection, and the selected features were fused using multiset maximum correlation analysis and classified using a Kernel Extreme Learning Machine (ELM) classifier.
Tschandl et al. [181] proposed combining and expanding different CNNs for the segmentation and classification of skin lesions, using three well-known benchmark datasets; they found that post-processing with a small, noisy dataset decreased the Jaccard loss. Vasconcelos et al. [182] proposed a skin lesion segmentation system using morphological geodesic active contours. Different CNN models were used, such as full resolution convolutional networks (FrCNs) and deep class-specific learning with probability-based step-wise integration (DCL-PSI), and the proposed model was able to classify skin lesions into melanoma and nevus.
Burlina et al. [183] proposed a CNN for detecting acute Lyme disease from erythema migrans images, even under different acquisition and quality conditions; they fine-tuned and replaced the final layers of ResNet50, and the proposed model was able to classify four different lesions. Maron et al. [184] proposed a system that classified five types of skin lesions, applying transfer learning to ResNet50 and fine-tuning the CNN weights; they compared the classification rate of the CNN against 112 expert dermatologists.
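The transfer-learning setup common to several of the works above, e.g., [178,183,184] (replacing the final fully connected layer of an ImageNet-pretrained ResNet50 and fine-tuning on lesion images), can be sketched as follows; the number of classes, optimizer, and learning rate are illustrative and not taken from any specific paper.

```python
# Fine-tuning ResNet50 after swapping its final layer for a lesion-class head.
import torch
import torch.nn as nn
import torchvision.models as models

num_classes = 5                                   # illustrative; e.g., a 5-class setting
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, num_classes)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_one_epoch(loader):
    """loader yields (images, labels) batches shaped (B, 3, 224, 224) and (B,)."""
    model.train()
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```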
Goyal et al. [185] proposed a skin lesion segmentation system that ensembles the segmentation outputs of Mask R-CNN and DeepLabV3+. Albert [186] proposed Predict-Evaluate-Correct K-fold (PECK), which trains CNNs from limited data, for skin lesion classification, and Synthesis and Convergence of Intermediate Decaying Omnigradients (SCIDOG) to detect lesion contours; the segmented lesions were then classified using a CNN combined with SVM and RF. Ahmad et al. [187] utilized three loss functions by fine-tuning ResNet152 and InceptionResNet-V2 layers; the L2 distance between images was computed in Euclidean space and used to adapt the weights for classifying images.
Kwasigroch et al. [188] proposed a skin lesion classification system using a CNN whose architecture was found by a hill-climbing search; the network was grown gradually, which reduced the computational cost of the search. Adegun et al. [189] proposed an encoder–decoder network with subnetworks connected by skip connections, used for skin lesion segmentation and pixel-wise classification. Song et al. [190] suggested a CNN that could segment, detect, and classify skin lesions; to handle imbalanced datasets, they utilized a loss function based on the Jaccard distance and the focal loss. Wei et al. [191] proposed a skin lesion recognition system based on fine-grained classification to discriminate features, using different lightweight CNNs for segmentation and classification.
Gong et al. [192] proposed a dermoscopic skin image classification system using deep learning models. The authors enhanced skin images using StyleGANs, and the enhanced images were classified using 43 CNNs divided into three groups with different fusion methods; the final classification decision was generated using the max voting technique.
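The final max-voting fusion step can be illustrated in a few lines of NumPy; the 43 CNNs and the three-group fusion scheme of [192] are not reproduced here.

```python
# Majority (max) voting over the class predictions of several models.
import numpy as np

def max_vote(predictions: np.ndarray) -> np.ndarray:
    """predictions: (n_models, n_samples) array of integer class labels."""
    n_classes = predictions.max() + 1
    votes = np.apply_along_axis(np.bincount, 0, predictions, minlength=n_classes)
    return votes.argmax(axis=0)                   # winning class per sample
```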
Nasiri et al. [193] proposed a skin lesion classification system based on deep learning with case-based reasoning (CBR). Öztürk et al. [194] proposed a segmentation system for skin lesions. Hosny et al. [195] proposed a new deep CNN classification system for skin lesions, performing three different experiments with three datasets. They compared the accuracy of traditional machine learning classifiers with that of the CNN and found that the conventional classifiers led to a lower classification rate.
Amin et al. [196] proposed a skin lesion classification system. Firstly, they enhanced the images; then, biorthogonal 2-D wavelet transforms and the Otsu algorithm were used to segment lesions. Finally, two pre-trained models were fused serially to extract features for classification using PCA. Mahbod et al. [197] investigated the effect of different skin lesion image sizes using transfer learning with pre-trained models.
Hameed et al. [198] proposed a skin lesion classification system based on a multiclass multilevel algorithm. Traditional machine learning and deep learning methods were used with the proposed model. Zhang et al. [199] proposed an optimization algorithm for optimal weight selection to minimize the network output error for skin lesion classification. Hasan et al. [200] proposed a semantic segmentation network to segment skin lesions called DSNet. They utilized depth-wise separable convolution to reduce the number of parameters that produced a lightweight network. Al-masni et al. [201] proposed a diagnostic framework for skin lesion classification systems, which combined segmentation of lesion boundaries with multiple classification stages. The proposed system segmented the lesions using a full resolution convolutional network (FrCN) with four CNNs for classification. This system was evaluated and tested using three benchmark datasets.
Pour et al. [202] proposed a segmentation method for skin lesions using a CNN trained from scratch on a small dataset with augmentation. Olusola et al. [203] utilized image augmentation (a variant of SMOTE) and then classified skin lesions into benign and malignant using SqueezeNet. Hosny et al. [204] utilized AlexNet with transfer learning to classify the challenging ISIC 2018 dataset, using different approaches for lesion segmentation. Hosny et al. [205] proposed a CAD system for skin lesions using the challenging ISIC 2019 dataset, which poses several challenges, such as imbalanced classes and unknown images. The authors utilized a bootstrap weighted classifier with a multiclass SVM; this classifier changed the weights according to the image class. Finally, the authors dealt with unknown images in two different ways. The first was to train GoogleNet with a new class containing a number of unknown images collected from various sources. The second was a similarity score: if the highest similarity score of a tested image against the eight known classes was less than 0.5, the image was identified as unknown, or out of distribution (a minimal sketch of this thresholding is given below). The number of classes used for skin disease recognition in the analyzed works is summarized in Figure 6. Generally, the vast majority of works used two-class recognition only, while only one study [154] used 10 classes for recognition.
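The unknown-class thresholding described for [205] can be illustrated as follows, assuming `probs` holds the per-class scores produced by the trained classifier for each test image; the 0.5 threshold follows the description above.

```python
# Flag images whose best score over the known classes falls below a threshold
# as "unknown" / out of distribution.
import numpy as np

def classify_with_unknown(probs: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """probs: (n_samples, n_known_classes) class scores, e.g., softmax outputs."""
    best_class = probs.argmax(axis=1)
    confident = probs.max(axis=1) >= threshold
    return np.where(confident, best_class, -1)    # -1 denotes the unknown class
```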
3. Discussion and Conclusions
This paper is an analytical survey of the literature on skin lesion image classification and disease recognition. It is a comprehensive review of the methods and algorithms used in the processing, segmentation, and classification of skin lesion images. Both classical machine learning methods and deep convolutional neural network models were explored. A discussion of the available and well-known datasets, with a comparison between them, was introduced. At the end of the systematic survey, a table was used to compare state-of-the-art methods in a novel way. The column “Simple” in Table 2 indicates that the proposed method is uncomplicated, easy to apply, and does not require specialized hardware, whereas the column “Contribution Achieved” indicates whether the research paper achieved what it set out to do.
Table 2. Comparison of the reviewed skin lesion segmentation and classification methods.

Reference | Simple | Dataset Number | Dataset Type | Dataset Free | Colored Images | Enhancement | Segmentation | Contribution | Contribution Achieved | Methods and Tools | No. of Classes | Accuracy (%) | Sensitivity (%) | Specificity (%) | Precision (%) | Dice (%) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
[100] | Y | 1 | D | Y | N | Y | Y | Mobile Assessment | N | ABCD, MLP | 2 | 88 | 66 | 93 | N/A | N/A |
[101] | N | 1 | A | N | N | Y | Y | Classify lesion Atlas | Y | Gabor filter, ABCD | 2 | 94 | 91.25 | 95.83 | N/A | N/A |
[102] | N | 1 | A | N | Y | Y | Y | Classify lesion Atlas based on the thickness | Y | logistic regression with ANN | 2 | 64.4 | 55.2 | N/A | N/A | N/A |
[103] | N | 1 | A | N | Y | Y | Y | Increase classification using a novel segmentation method | Y | Median Filter | 2 | N/A | N/A | N/A | N/A | N/A |
[103] | | 1 | D | Y | | | | | | | | | | | | |
[104] | N | 3 | A | N | Y | Y | Y | Segment and classify pigmented lesions | Y | Anisotropic Diffusion, Chan–Vese’s, ABCD, SVM | 2 | 79.01 | N/A | N/A | 80 | N/A |
[104] | | 1 | D | Y | | | | | | | | | | | | |
[105] | N | 1 | RS | N | - | - | - | classification | Y | Paraconsistent analysis network | 3 | 93.79 | N/A | N/A | N/A | N/A |
[106] | Y | 1 | D | Y | N | Y | Y | Segmentation | Y | Delaunay Triangulation | 2 | 75.91 | 67.46 | 95.93 | N/A | N/A |
[107] | N | 1 | H | N | Y | N | Y | Segmentation and Classification | Y | Granular Layer, Intensity Profiles | 2 | 92.1 | N/A | 97.6 | 88.1 | N/A |
[108] | Y | 1 | F | N | - | - | - | Comparison of four Classification methods | Y | KNN with Sequential Scanning selection, KNN with GA, ANN with GA, Adaptive Neuro-Fuzzy Inference | 2 | 94 | N/A | N/A | N/A | N/A |
[109] | Y | 1 | H | N | Y | N | Y | Classification of lesions with and without segmentation | Y | Z-transforms | 2 | 85.18 | 91.66 | 80 | 78.57 | N/A |
[110] | Y | 1 | JPEG | N | Y | N | N | Classification of Psoriasis | Y | PCA, SVM | 2 | 100 | 100 | 100 | N/A | N/A |
[111] | Y | 3 | A | N | N | N | Y | Segmentation | Y | k-means | 2 | N/A | N/A | N/A | N/A | N/A |
[112] | Y | 1 | A | N | Y | N | Y | Classification of skin cancer based on the depth of the lesion using 3D reconstruction | Y | adaptive snake, stereo vision, structure from motion, depth from focus | 8 | 86 | 98 | 99 | N/A | N/A |
[112] | | 1 | D | Y | | | | | | | 3 | | | | | |
[113] | N | 2 | D | N | N | N | Y | Classification of melanoma | Y | a self-generated neural network, fuzzy neural networks with backpropagation (BP) neural networks | 2 | 94.17 | 95 | 93.75 | N/A | N/A |
[114] | N | 1 | JPEG | N | Y | Y | Y | Segmentation and Classification | Y | fixed grid wavelet network, D-optimality orthogonal matching pursuit | 2 | 91.82 | 92.61 | 91 | N/A | N/A |
[115] | N | 1 | SI | N | N | N | Y | Classification of the regular and irregular boundary of skin lesion | Y | 1D time series for Lyapunov exponent and Kolmogorov–Sinai entropy | 2 | 95 | 100 | 92.5 | 86.5 | N/A |
[115] | | 1 | D | Y | | | | | | | | | | | | |
[116] | Y | 2 | D | Y | Y | N | Y | Segmentation and classification of lesions to melanoma and Benign | Y | An ant colony, KNN, ANN | 2 | N/A | N/A | N/A | N/A | N/A |
[117] | N | 1 | A | N | Y | N | Y | Segmentation and combine features, classification | Y | ABCD, SVM | 2 | 90 | 72.5 | 94.4 | N/A | N/A |
[117] | | 2 | D | Y | | | | | | | | | | | | |
[118] | N | 2 | D | N | N | Y | Y | Codebook generated from a bag of features | Y | Histogram of Gradients, Histogram of Lines, Zernike moments | 2 | 92.96 | 96.04 | 84.78 | N/A | N/A |
[119] | N | 1 | D | Y | Y | N | Y | Classification of skin lesion using feature subset selection | Y | Optimum-path forest integrated with a majority voting | 2 | 94.3 | 91.8 | 96.7 | N/A | N/A |
[120] | N | 1 | SIA scope | N | N | N | Y | Analysis of multispectral skin lesion | Y | Box-counting dimension and lacunarity, Hunter score pattern detection, RBF kernel, SVM | 2 | 97 | 59.6 | 97.8 | N/A | N/A |
[121] | N | 2 | D | Y | Y | Y | Y | Automatic segmentation of skin lesion | Y | The grab-Cut algorithm, k-means | 2 | 96.04 | 89 | 98.79 | N/A | 91.39 |
[122] | Y | 1 | CCD | N | Y | N | Y | Classification of skin lesion using a smartphone | Y | Otsu’s, Minimum Spanning Tree, Color Triangle | 2 | 87.98 | 89.09 | 86.87 | N/A | N/A |
[123] | Y | 1 | D | Y | Y | Y | N | Enhancement and fusion of skin lesion features for classification | Y | Wavelet transform, Curvelet transform, local binary pattern, SVM | 2 | 86.17 | 78.93 | 93.25 | N/A | N/A |
[124] | N | - | - | - | N | Y | Y | Extract fabric characteristics for lesion classification | Y | Gabor filters based on shark smell optimizing method, K-means | - | N/A | N/A | N/A | N/A | N/A |
[125] | N | 3 | D | Y | N | Y | Y | Skin lesion segmentation and recognition based on fused features | Y | ABCD, fuzzy C-means, pair threshold binary decomposition, HOG, linear discriminant analysis, linear regression, complex tree, W-KNN, Naive Bayes, ensemble boosted tree, ensemble subspace discriminant analysis | 2, 3 | 99 | 98.5 | 99 | 98.5 | N/A |
[126] | N | 1 | D | Y | Y | Y | Y | Skin lesion segmentation with several classifiers | Y | 2-D Gabor wavelet, OTSU’s, median filtering, morphological operations, m- mediod classifier, SVM, Gaussian Mixture Modeling | 2 | 97.26 | 96.48 | 96.9 | N/A | N/A |
[127] | N | 3 | D | Y | Y | Y | Y | Classify the selected features of parallel fusion of features after segmentation | Y | Global maxima and minima, uniform and normal distribution, mean, mean deviation, HOG, Harlick method, M-SVM | 2 | 93.2 | 93 | 97 | 93.5 | N/A |
[128] | N | 3 | D | Y/N | N | Y | Y | Optimize features of skin lesion and classify these features | Y | Particle Swarm Optimization, KNN, SVM, decision tree | 2 | N/A | N/A | N/A | N/A | N/A |
[129] | N | 1 | D | Y | Y | Y | Y | Using high discriminative features for melanoma classification | Y | Histogram correction, OTSU’s, corner detection, Gray-level co-occurrence matrix features, Daugman’s rubber sheet model, RUSBoost, linear SVM | 2 | 95 | 95 | 95 | N/A | 92 |
[130] | N | 3 | D | Y | N | Y | Y | Classify melanoma based on the main frequencies from dermoscopic images | Y | ABCD, HU Moments, GLCM, Structural Co-occurrence matrix, MLP, LSSVM, Minimal Learning Machine | 2 | 89.93 | 92.15 | 89.9 | N/A | 91.05 |
[131] | N | 1 | I | N | - | Y | Y | Differentiate infrared spectroscopy of skin lesion | Y | Fourier transform infrared, Morphological information, PCA | 2 | 85 | N/A | N/A | N/A | N/A |
[132] | N | 1 | D | Y | Y | Y | Y | Classifying four skin lesions based on the Gray-level difference method | N | Gray-level difference method, ABCD, SVM | 4 | 100 | 100 | 100 | N/A | N/A |
[133] | N | 1 | CCD | N | Y | Y | Segment skin lesion and using a hybrid classifier to classify these lesions | Y | ABCD, pigment distribution and texture, GLCM, Log-linearized Gaussian mixture neural network, KNN, linear discriminant analysis, LDA, SVM, majority vote | 3 | 98.5 | 95 | 99.5 | N/A | N/A | |
[134] | N | 1 | D | Y | Y | Y | Y | Classify typical and atypical pigment network to diagnose melanoma | Y | Laplacian filter, median filter, polynomial curve fitting, connected component analysis, 2D Gabor filters, Gray-Level co-occurrence Matrix, Pearson product-moment co-relation, Probabilistic SVM, ANN | 2 | 86.7 | 84.6 | 88.7 | N/A | N/A |
[135] | N | 1 | LIBS spectra | N | - | Y | Y | Classification of skin tissue | Y | laser-induced breakdown spectroscopy, PCA, KNN, SVM | 2 | 76.84 | 74.2 | 86.9 | N/A | N/A |
[136] | Y | 2 | A | N | N | Y | Y | Classification of melanoma versus nevi by correlation bias reduction | Y | 2D wavelet packet decomposition, SVM recursive feature elimination (SVM -RFE) | 2 | 98.28 | 97.63 | 100 | N/A | N/A |
[136] | | 2 | D | Y | | | | | | | | | | | | |
[137] | Y | 1 | D | N | Y | Y | Y | Classification of lesions into melanoma and nevi | Y | Gaussian Filter, KNN, SVM | 2 | 96 | 97 | 96 | 97 | N/A |
[138] | Y | 1 | A | N | Y | N | Y | Using of blue-whitish structure for melanoma classification | Y | multiple instance learning (MIL) paradigm, Markov network, SVM | 2 | 84.5 | 74.42 | 87.9 | 61.54 | N/A |
[138] | | 1 | D | Y | | | | | | | | | | | | |
[139] | N | 1 | A | N | Y | N | Y | Classify lesion into melanoma or nevi based of color features | Y | K-means, pixel-based classification | 2 | 89.42 | N/A | N/A | N/A | N/A |
[140] | Y | 1 | D | Y | Y | Y | Segmentation and classification of skin lesions | Y | Hough Transform, ABCD, SP-SIFT, LF-SLIC region labeling | 2 | 96 | N/A | N/A | N/A | 93.8 | |
[141] | Y | 2 | D | Y | Y | Y | Y | Detect the boundaries of lesions to classify into melanoma and nevi | Y | Kullback-Leibler divergence, local binary patterns, SVM, KNN | 2 | 80.7 | N/A | N/A | N/A | N/A |
[142] | N | 1 | JPEG | N | Y | Y | Y | A QuadTree-based melanoma detection system based on color | Y | hybrid thresholding method, adaptive histogram thresholding, Euclidean distance transform, QuadTree, Kolmogorov-Smirnov, SVM, ANN, LDA, random forests | 2 | 73.8 | 75.7 | 73.3 | N/A | N/A |
[143] | Y | 1 | D | Y | N | Y | Y | Segmentation and classification of skin lesion | Y | Median Filter, watershed segmentation, ABCD, GLCM, KNN, RF, SVM | 2 | 89.43 | 91.15 | 87.71 | N/A | N/A |
[144] | Y | 1 | D | Y | N | Y | Y | Segmentations of Enhanced dermoscopic lesion images | Y | wavelet transform, morphological operations, Gray Thresholding, Cohen–Daubechies–Feauveau biorthogonal wavelet, Active contour, Color enhancement, Adaptive thresholding, Gradient vector flow | 2 | 93.87 | N/A | N/A | N/A | 92.72 |
[145] | Y | 1 | D | Y | N | Y | Y | Three distinct features to classify melanoma | N | ABCD, Otsu, Chan–Vese, Dull-Razor, ANN | 2 | 98.2 | 98 | 98.2 | N/A | N/A |
[146] | N | 2 | A | N | N | Y | Y | Classification three lesions based on shape, fractal dimension, texture, and color features | Y | recursive feature elimination, GLCM, fractal-based regional texture analysis, SVM, RBF | 3 | 98.99 | 98.28 | 98.48 | N/A | 91.42 |
[146] | | 2 | D | Y | | | | | | | | | | | | |
[147] | N | 2 | A | N | Y | N | N | Extraction of features using frequency domain analysis and classify these features | Y | Cross spectrum-based feature extraction, Spatial feature extraction, SVM-RFE with CBR | 4 | 98.72 | 98.89 | 98.83 | N/A | N/A |
[147] | | 2 | D | Y | | | | | | | | | | | | |
[148] | Y | 1 | D | N | Y | N | N | Improve bag of dense features to Classify skin lesions | Y | Gradient Location, Orientation Histogram, color features, SVM | 2 | 78 | N/A | N/A | N/A | N/A |
[149] | N | 2 | D | Y | N | Y | Y | CAD system for clinical assist | N | chroma based deformable models, speed function, Chan-Vese, Wilcoxon Rank Sum statistics, Discrete Wavelet Transform, Asymmetry, and Compactness Index, SVM | 2 | 88 | 95 | 82 | N/A | N/A |
[150] | Y | 2 | D | Y | N | Y | Y | Segmentation of skin lesions using fuzzy pixels classification and histogram thresholding | Y | fuzzy classification, histogram thresholding | 2 | 88.4 | 86.9 | 92.3 | N/A | 76 |
[151] | N | 1 | D | Y | Y | Y | Y | Lesion classification based on feature similarity measurement for codebook learning in the bag-of-features model | Y | Codebook learning, k-means, color histogram, scale-invariant feature transform (SIFT) | 2 | 82 | 80 | 83 | N/A | N/A |
[151] | | 1 | C | Y | | | | | | | | | | | | |
[152] | Y | 1 | D | Y | Y | Y | Y | Segmentation and classification of skin lesions using Kernel sparse representation | Y | Sparse coding, kernel dictionary, K-SVD | 3 | 91.34 | 93.17 | 91.48 | N/A | 91.25 |
[153] | N | 1 | D | N | N | Y | Y | Improve skin lesion classification using borderline characteristics | Y | Gradient-based Histogram Thresholding, Local Binary Patterns Clustering, Euclidean distance, Discrete Fourier Transform spectrum (DCT), power spectral density (PSD), SVM, Feedforward Neural Network (FNN) | 2 | 91 | 68 | 96 | N/A | N/A |
[153] | | 1 | C | Y | | | | | | | | | | | | |
[154] | Y | 1 | D | N | Y | N | Y | Multi-resolution-Tract CNN | N | AlexNet, GPU | 10 | 79.5 | N/A | N/A | N/A | N/A |
[155] | Y | 1 | D | Y | Y | Y | Y | Automatic segmentation and classification for skin lesions | Y | CNN with 50 layers, residual learning, SoftMax, SVM, Augmentation, GPU | 2 | 94 | N/A | N/A | N/A | N/A |
[156] | N | 1 | D | Y | N | Y | Y | Classification of segmented skin lesions | Y | U-Net, Sparse coding, Deep Residual Network (DRN), Augmentation | 2 | 76 | 82 | 62 | N/A | N/A |
[157] | Y | 1 | D | Y | N | Y | Y | Segmentation of skin lesions using deep learning | Y | fully convolutional networks (FCN), VGG, Augmentation | 2 | N/A | N/A | N/A | N/A | 89.2 |
[158] | Y | 1 | D | Y | Y | Y | Y | Automatic Skin Lesion Segmentation Using CNN | Y | FCN with 19 layers, Jaccard Distance, Augmentation, GPU | 2 | 95.5 | 91.8 | 96.6 | N/A | 92.2 |
[159] | Y | 3 | D | Y | Y | Y | N | Detection of melanoma using CNN and regularized fisher framework | Y | ResNet50, transfer learning, SoftMax, SVM, Augmentation, GPU | 2 | 78.3 | 35 | 88.8 | N/A | N/A |
[159] | | 1 | C | Y | | | | | | | | | | | | |
[160] | N | 1 | D | Y | N | Y | Y | Evaluation of skin lesion using Levenberg neural networks and stacked autoencoders clustering | Y | ABCD, morphological analysis, Levenberg–Marquardt neural network | 2 | N/A | 98 | 98 | N/A | N/A |
[161] | N | 1 | D | Y | Y | N | N | Classify limited and imbalanced skin lesion images | N | Adversarial Autoencoder with 19 layers, Augmentation, GPU | 2 | N/A | N/A | 83 | N/A | N/A |
[162] | Y | 1 | D | Y | Y | N | N | Ensemble different CNN for skin lesion classification | Y | GoogLeNet, AlexNet, ResNet, VGGNet, Sum of the probabilities, Product of the possibilities, Simple majority voting (SMV), Sum of the maximal probabilities (SMP), Weighted ensemble of CNN, Augmentation, GPU | 3 | 86.6 | 55.6 | 78.5 | N/A | N/A |
[163] | N | 1 | D | Y | Y | N | N | Skin lesion analysis using Multichannel ResNet | Y | Ensemble multi-ResNet50, ANN, concatenated Fully Connected Layer, Augmentation, GPU | 3 | 82.4 | N/A | N/A | N/A | N/A |
[164] | N | 1 | A | N | Y | Y | Y | Classify skin lesions based on border thickness | Y | GoogleNet, Breslow index | 5 | 66.2 | 89.19 | 85 | N/A | N/A |
[165] | N | 1 | D | Y | Y | Y | Y | Classify lesion using fused features that extracted from deep learning and image processing | Y | ResNet50, Telangiectasia Vessel Detection Algorithm, transfer learning, GPU | 5 | N/A | N/A | N/A | N/A | N/A |
[166] | Y | 1 | D | Y | Y | N | N | Classify skin lesions over IoT using deep learning | Y | CNN with nine layers, IoT, | 7 | 81.4 | N/A | N/A | N/A | N/A |
[168] | Y | 3 | D | Y | Y | Y | N | Skin lesion classification using depthwise separable residual CNN | Y | Non-local means filter, contrast-limited adaptive histogram equalization, discrete Wavelet transforms, depthwise separable residual DCNN | 2 | 99.5 | 99.31 | 100 | N/A | N/A |
[169] | Y | 1 | D | Y | Y | Y | Y | Classify skin lesion using Attention Residual Learning | Y | Attention residual learning CNN, ResNet50, transfer learning, Augmentation, GPU | 3 | N/A | N/A | N/A | N/A | N/A |
[170] | Y | 1 | D | Y | Y | N | N | Classify skin lesion using CNN with novel regularization method | Y | CNN 7 layers, standard deviation of the weight matrix, GPU | 2 | 97.49 | 94.3 | 93.6 | N/A | N/A |
[171] | N | 1 | D | Y | N | Y | Y | Classify skin lesion by Incorporating the knowledge of dermatologists to CNN | Y | ResNet50, DermaKNet, Modulation Block, asymmetry block, AVG layer, Polar AVG layer. | 3 | 91.7 | N/A | 65.2 | N/A | N/A |
[172] | Y | 1 | - | - | Y | Y | Y | Classify skin lesions based on the novel 7-point melanoma checklist using Multitask CNN | Y | Multitask CNN, 7-point melanoma checklist, Augmentation, GPU | 3 | 87.1 | 77.3 | 89.4 | 63 | N/A |
[173] | Y | 1 | D | Y | Y | N | N | Classify skin lesion using Aggregated CNN | Y | ResNet50, ResNet101, fisher vector (FV), SVM, Chi-squared kernel, transfer learning, Augmentation, GPU | 2 | 86.81 | N/A | N/A | N/A | N/A |
[174] | Y | - | - | - | Y | N | N | Using CNN as a feature extractor for skin lesion images and classify these features | Y | Alex-Net, ECOC SVM, transfer learning | 4 | 94.2 | 97.83 | 90.74 | N/A | N/A |
[175] | Y | 1 | D | Y | Y | N | N | Classification of skin neoplasms using CNN and transfer learning with web and mobile application | N | Inception V3 (GoogleNet), transfer learning, Augmentation, GPU | - | 91 | N/A | N/A | N/A | N/A |
[176] | N | 1 | C | N | Y | N | Y | An application used through the cloud to classify diseases of face skin | Y | LeNet-5, AlexNet and VGG16, transfer learning, Augmentation, GPU | 5 | N/A | N/A | N/A | N/A | N/A |
[177] | Y | 2 | D | Y | Y | Y | Y | Skin lesion classification using 4 CNNs and ensembling of the final classification results | Y | AlexNet, VGG, ResNet-18, ResNet-101, SVM, MLP, random forest, transfer learning, Augmentation, GPU | 3 | 87.7 | 85 | 73.29 | N/A | N/A |
[178] | Y | 1 | D | Y | Y | N | N | Compare the ability of deep learning model to classify skin lesions with expert dermatologists | Y | ResNet50, local outlier factor, transfer learning, GPU | 2 | N/A | 87.5 | 60 | N/A | N/A |
[179] | N | 1 | D | N | Y | Y | Y | Evolving the deep learning model | PSO, hybrid learning PSO, Firefly Algorithm, spiral research action, probability distributions, crossover, mutation, K-Means, VGG16, Augmentation, GPU | 2 | 73.76 | N/A | N/A | N/A | N/A | |
[181] | Y | 3 | D | Y | Y | Y | Y | Combine and expand current segmentation CNN to enhance the classification of skin lesions | Y | U-Net, ResNet34, LinkNet34, LinkNet152, fine-tuning, PyTorch, transfer learning, Augmentation, GPU, Jaccard-loss, | 2 | N/A | N/A | N/A | N/A | N/A |
[182] | N | 1 | D | Y | N | Y | Y | Skin lesion segmentation using geodesic morphological active contours | Y | Gaussian filter, Otsu’s threshold, deformable models, partial differential equation, Mathematical morphology, active geodesic contour, neural network, deep learning, statistical region merging (SRM) | 2 | 94.59 | 91.72 | 97.99 | N/A | 89 |
[183] | Y | 1 | - | - | Y | N | N | Detection of erythema migrans and other confounding skin lesions using deep learning | Y | ResNet50, Keras, TensorFlow, fine-tuning, transfer learning, Augmentation, GPU | 4 | 86.53 | 76.4 | 75.96 | N/A | 92.09 |
[184] | Y | 1 | D | Y | Y | N | N | Comparing between the CNN and 112 dermatologists for skin lesion detection | Y | ResNet50, fine-tuning, transfer learning, Augmentation, GPU | 5 | N/A | 56.5 | 98.2 | N/A | N/A |
[185] | Y | 2 | D | Y | Y | Y | Y | Segmentation of skin lesions by ensembling the segmentation outputs of two CNNs | Y | DEEPLABV3+, Mask R-CNN, ABCD, fine-tuning, transfer learning, Augmentation, GPU | 2 | 94.08 | 89.93 | 95 | N/A | N/A |
[186] | Y | 1 | C | Y | N | Y | Y | An algorithm able to train a CNN with limited data | Y | Inception V3 (GoogleNet), PECK, SCIDOG, SVM, RF, fine-tuning, transfer learning, Augmentation, GPU | 2 | 91 | 92 | 93 | N/A | 90.7 |
[187] | Y | 1 | C | N | Y | N | N | Classify skin disease of faces using Euclidean space to compute L-2 distance between images | Y | ResNet152, InceptionResNet-V2, fine-tuning, Euclidean space, L-2 distance, transfer learning, Augmentation, GPU | 4 | 87.42 | 97.04 | 97.23 | N/A | N/A |
[188] | Y | 1 | D | Y | Y | Y | Y | Neural Architecture Search to increase the size of the network based on the dataset size | Y | VGG8, VGG11, VGG16, 5-fold validation, Neural Architecture Search (NAS), hill-climbing, transfer learning, Augmentation, GPU | 2 | 77 | N/A | N/A | N/A | N/A |
[189] | Y | 2 | D | Y | Y | Y | Y | Skin lesion segmentation and pixel-wise classification using encoder and decoder network | Y | CNN, encoder-decoder deep network with skip connection, softmax, transfer learning, Augmentation, GPU | 2 | 95 | 97 | 96 | N/A | N/A |
[190] | Y | 2 | D | Y | Y | Y | Y | Multitasks DCNN for skin lesion segmentation, detection, and classification | Y | Multitask DCNN, Jaccard distance, focal loss, Augmentation, GPU | 2 | 95.9 | 83.1 | 98.6 | N/A | 95 |
[191] | Y | 1 | D | Y | Y | Y | Y | Lightweight CNN for skin lesion segmentation and classification | Y | Lightweight CNN, MobileNet, DenseNet, U-Net, focal loss, fine-tune, transfer learning, Augmentation, GPU | 2 | 96.2 | 93.4 | 97.4 | N/A | 88.9 |
[192] | N | 1 | D | Y | Y | Y | N | Enhancement and classification of dermoscopic skin images | Y | StyleGANs, 43 CNNs (ResNet50, VGG11, VGG13, AlexNet, SENet, etc.) max voting, fine-tune, transfer learning, Augmentation, GPU | 8 | 99.5 | 98.3 | 99.6 | N/A | 92.3 |
[193] | Y | 1 | D | Y | Y | Y | Y | Skin lesion classification based on CNN | Y | Deep-class CNN, Augmentation, GPU | 2 | 75 | 73 | 78 | N/A | N/A |
[194] | Y | 2 | D | Y | Y | Y | Y | Skin lesion segmentation using encoder-decoder FCN | Y | FCN, GPU | 3 | 96.92 | 96.88 | 95.31 | N/A | N/A |
[195] | Y | 3 | D | Y | Y | N | Y | Classify skin melanoma by extracting ROIs and using CNNs | Y | AlexNet, ResNet101, GoogleNet, Multiclass SVM, SoftMax, Histogram based windowing process, hierarchical clustering, fine-tune, transfer learning, Augmentation, GPU | 3 | 98.14 | 97.27 | 98.60 | N/A | 88.64 |
[196] | Y | 3 | D | Y | N | Y | Y | The fusion of extracted deep features of a skin lesion for classification | Y | Biorthogonal 2-D wavelet transform, Otsu algorithm, Alex and VGG-16, PCA, fusion, fine-tune, transfer learning, Augmentation, GPU | 2 | 99.9 | 99.5 | 99.6 | N/A | N/A |
[197] | N | 3 | D | Y | Y | Y | Y | Ensemble multiscale and multi-CNN network | Y | EfficientNetB0, EfficientNetB1, SeResNeXt-50, fusion, fine-tune, transfer learning, Augmentation, GPU | 7 | 96.3 | N/A | N/A | 91.3 | 82 |
[198] | Y | 2 | D | Y | Y | Y | Y | Multiclass multilevel classification of skin lesions using traditional machine learning and transfer learning | Y | K-means, Otsu’s thresholding, GLCM, ANN, k-fold validation, AlexNet, fine-tune, transfer learning, Augmentation, GPU | 4 | 93.02 | 87.87 | 98.17 | 97.96 | N/A |
[199] | N | 2 | D | Y | N | Y | Y | Optimized algorithm for weight selection to minimize the output of the network | Y | The bubble-net mechanism, Whale optimization algorithm (WOA), Lévy Flight Mechanism, genetic algorithm, shark smell optimization (SSO), world cup optimization algorithm, grasshopper optimization algorithm (GOA), particle swarm optimization algorithm (PSO), LeNet, fine-tune, transfer learning, GPU | 2 | N/A | N/A | N/A | N/A | N/A |
[200] | Y | 3 | D | Y | N | Y | Y | Semantic skin lesion segmentation with parameter reduction | Y | U-Net, FCN8s, DSNet, Augmentation, GPU | 7 | N/A | 87.5 | 95.5 | N/A | N/A |
[201] | N | 3 | D | Y | Y | Y | Y | Different CNN network integration for segmentation and multiple classification stage | Y | Inception-v3, ResNet-50, Inception-ResNet-v2, and DenseNet-201 | 7 | 89.28 | 81 | 87.16 | N/A | 81.82 |
[202] | Y | 3 | D | Y | Y | Y | Y | Skin lesion segmentation based on CNN | Y | CIElab, FCN, U-Net, Augmentation, GPU | 2 | N/A | N/A | N/A | N/A | 87.1 |
[205] | Y | 1 | D | Y | Y | N | N | Classify the challenging dataset ISIC2018 | Y | AlexNet, 10-fold cross-validation, fine-tune, transfer learning, Augmentation, GPU | 7 | 92.99 | 70.44 | 96 | 62.78 | N/A |
[204] | Y | 1 | D | Y | Y | N | N | Classify the challenging dataset ISIC2019 | Y | GoogleNet, Similarity score, bootstrap weighted SVM classifier, SoftMax, fine-tune, transfer learning, Augmentation, GPU | 9 | 98.70 | 95.6 | 99.27 | 95.06 | N/A |
The comparison of skin lesion classification methods shows that the problem formulation varies slightly from work to work. An efficient melanoma detection pipeline has five core elements: data acquisition (collection), fine-tuning, feature selection, deep learning, and final model development. The first step is data acquisition, in which images from publicly available benchmarks as well as from unlisted and non-public sources, such as melanoma images collected from the internet, are gathered for detecting skin cancer.
Regarding deep learning, several learning strategies were used: some works relied on transfer learning, others on ensemble approaches, and some employed hybrid techniques that combine neural networks with fully convolutional networks. Pre-trained deep learning models, as well as handcrafted methods combined with a deep learning approach, have already shown promising results for high-accuracy melanoma detection.
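As an illustration of the transfer-learning and ensembling strategies that recur throughout the reviewed works, the following minimal PyTorch/torchvision sketch fine-tunes the head of an ImageNet-pretrained ResNet-50 and averages the softmax outputs of several such models; the class count, optimizer, and hyperparameters are placeholder assumptions rather than settings reported by any specific study.

```python
# Minimal transfer-learning sketch (PyTorch/torchvision assumed; values are illustrative).
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 7  # placeholder: e.g., seven pigmented-lesion categories

# Load an ImageNet-pretrained ResNet-50 and replace its classification head.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
for param in model.parameters():      # freeze the convolutional backbone
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)  # new trainable head

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)

def train_step(images, labels):
    """One fine-tuning step on a mini-batch of lesion images and labels."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

def ensemble_predict(members, images):
    """Mean-of-probabilities ensemble over several fine-tuned models."""
    probs = torch.stack([torch.softmax(m(images), dim=1) for m in members])
    return probs.mean(dim=0).argmax(dim=1)
```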
A limited range of images is available for training, testing, and comparison, as most of the datasets are small. With small datasets, the proposed methods perform well but are prone to over-fitting, and when tested on a larger image set their performance becomes unpredictable. For example, the PH2 dataset contains just 200 images. The problem of training with a small dataset can be mitigated by data augmentation, image generation with generative adversarial networks, and transfer learning. Some researchers use non-public databases and internet images, which makes it more difficult to replicate their findings because the dataset is unavailable, and the selection of images from the internet may be biased.
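As a hedged illustration of how on-the-fly augmentation can stretch a small dataset, the sketch below defines a typical training-time pipeline for dermoscopy images; torchvision is assumed, and the specific transforms and parameter values are examples rather than the exact settings of the reviewed papers.

```python
from torchvision import transforms

# On-the-fly augmentation effectively multiplies a small dataset such as PH2
# without storing new images; parameter values here are illustrative only.
train_transforms = transforms.Compose([
    transforms.RandomResizedCrop(224, scale=(0.8, 1.0)),   # random crop and re-scale
    transforms.RandomHorizontalFlip(),
    transforms.RandomVerticalFlip(),                        # lesions have no canonical orientation
    transforms.RandomRotation(degrees=30),
    transforms.ColorJitter(brightness=0.2, contrast=0.2, saturation=0.2),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],        # ImageNet statistics, matching
                         std=[0.229, 0.224, 0.225]),        # an ImageNet-pretrained backbone
])
```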
Another major task in this research field is the creation of large public image datasets whose images are as representative of the world’s population as possible, in order to avoid racial bias [206]. Bias with respect to gender and race means that models and algorithms fail to give optimal results for people of an under-represented gender or ethnicity.
Current datasets mostly contain lesions from light-colored skin; for example, the ISIC dataset images are mainly acquired in the USA, Europe, and Australia. Moreover, a CNN can only learn to account for skin color, and thus classify lesions of dark-skinned patients properly, if the training dataset contains a sufficient number of images of dark-skinned people. Lesion size also matters significantly: melanomas smaller than 6 mm cannot be detected easily.
Adding clinical data such as race, age, gender, and skin type as inputs to the classifiers may help to increase classification accuracy, and this supplemental data could also support dermatologists’ decision making; these aspects should be included in future work. Finally, from the authors’ perspective and based on dataset size, deep learning is preferable to traditional machine learning when the dataset contains a large number of images per class, and even with datasets containing few images, deep learning can overcome this limitation through different augmentation methods. Deep learning makes intelligent decisions on its own with a higher accuracy rate.
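As a purely illustrative sketch of how such clinical metadata might be combined with image features, the following hypothetical PyTorch module concatenates a CNN embedding with encoded patient variables before the final classifier; the architecture, feature sizes, and choice of variables are assumptions for illustration and are not taken from any of the reviewed papers.

```python
# Hypothetical image-plus-metadata fusion sketch (PyTorch/torchvision assumed).
import torch
import torch.nn as nn
from torchvision import models

class LesionClassifierWithMetadata(nn.Module):
    """Concatenates a CNN image embedding with encoded clinical variables before classification."""
    def __init__(self, num_classes: int = 2, num_metadata: int = 4):
        super().__init__()
        backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
        backbone.fc = nn.Identity()                  # keep the 2048-d image embedding
        self.backbone = backbone
        self.meta_encoder = nn.Sequential(           # embed age, sex, skin type, ... (normalized)
            nn.Linear(num_metadata, 32), nn.ReLU())
        self.head = nn.Linear(2048 + 32, num_classes)

    def forward(self, image, metadata):
        img_feat = self.backbone(image)
        meta_feat = self.meta_encoder(metadata)
        return self.head(torch.cat([img_feat, meta_feat], dim=1))

# Example: a batch of 8 dermoscopy images plus 4 normalized clinical variables per image.
model = LesionClassifierWithMetadata()
logits = model(torch.randn(8, 3, 224, 224), torch.randn(8, 4))
```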
Author Contributions
Methodology, M.A.K.; formal analysis, K.M.H., M.M.E., R.D.; investigation, M.A.K., K.M.H., M.M.E.; resources, M.A.K., M.M.E.; data curation, M.A.K.; writing—original draft preparation, M.A.K., M.M.E.; writing—review and editing, K.M.H., R.D.; project administration, K.M.H.; funding acquisition, M.M.E., R.D. All authors have read and agreed to the published version of the manuscript.
Funding
This research received no external funding.
Institutional Review Board Statement
Not applicable.
Informed Consent Statement
Not applicable.
Data Availability Statement
Not applicable.
Conflicts of Interest
The authors declare no conflict of interest.
Footnotes
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- 1.Capdehourat G., Corez A., Bazzano A., Alonso R., Musé P. Toward a combined tool to assist dermatologists in melanoma detection from dermoscopic images of pigmented skin lesions. Pattern Recognit. Lett. 2011;32:2187–2196. doi: 10.1016/j.patrec.2011.06.015. [DOI] [Google Scholar]
- 2.American Cancer Society Statistics 2013. [(accessed on 10 May 2021)]; Available online: https://www.cancer.org/research/cancer-facts-statistics/all-cancer-facts-figures/cancer-facts-figures-2013.html.
- 3.Korotkov K., Garcia R. Computerized analysis of pigmented skin lesions: A review. Artif. Intell. Med. 2012;56:69–90. doi: 10.1016/j.artmed.2012.08.002. [DOI] [PubMed] [Google Scholar]
- 4.Skin Cancer Foundation. [(accessed on 20 August 2016)];Skin Cancer Information. 2016 Available online: http://www.skincancer.org/skin-cancer-information.
- 5.AlZubi S., Islam N., Abbod M. Multiresolution analysis using wavelet, ridgelet, and curvelet transforms for medical image segmentation. Int. J. Biomed. Imaging. 2011;2011:136034. doi: 10.1155/2011/136034. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 6.Arroyo J.L.G., Zapirain B.G. Detection of pigment network in dermoscopy images using supervised machine learning and structural analysis. Comput. Biol. Med. 2014;44:144–157. doi: 10.1016/j.compbiomed.2013.11.002. [DOI] [PubMed] [Google Scholar]
- 7.Lee H.Y., Chay W.Y., Tang M.B., Chio M.T., Tan S.H. Melanoma: Differences between asian and caucasian patients. Ann. Acad. Med. Singap. 2012;41:17–20. [PubMed] [Google Scholar]
- 8.Rigel D.S., Russak J., Friedman R. The evolution of melanoma diagnosis: 25 years beyond the abcds. CA Cancer J. Clin. 2010;60:301–316. doi: 10.3322/caac.20074. [DOI] [PubMed] [Google Scholar]
- 9.Laikova K.V., Oberemok V.V., Krasnodubets A.M., Gal’chinsky N.V., Useinov R.Z., Novikov I.A., Temirova Z.Z., Gorlov M.V., Shved N.A., Kumeiko V.V., et al. Advances in the Understanding of Skin Cancer: Ultraviolet Radiation, Mutations, and Antisense Oligonucleotides as Anticancer Drugs. Molecules. 2019;24:1516. doi: 10.3390/molecules24081516. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 10.Apalla Z., Lallas A., Sotiriou E., Lazaridou E., Ioannides D. Epidemiological trends in skin cancer. Dermatol. Pract. Concept. 2017;7:1–6. doi: 10.5826/dpc.0702a01. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 11.Karimkhani C., Green A.C., Nijsten T., Weinstock M.A., Dellavalle R.P., Naghavi M., Fitzmaurice C. The global burden of melanoma: Results from the Global Burden of Disease Study 2015. Br. J. Dermatol. 2017;177:134–140. doi: 10.1111/bjd.15510. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 12.Schadendorf D., Lebbé C., zur Hausen A., Avril M.F., Hariharan S., Bharmal M., Becker J.C. Merkel cell carcinoma: Epidemiology, prognosis, therapy and unmet medical needs. Eur. J. Cancer. 2017;71:53–69. doi: 10.1016/j.ejca.2016.10.022. [DOI] [PubMed] [Google Scholar]
- 13.Timerman D., McEnery-Stonelake M., Joyce C.J., Nambudiri V.E., Hodi F.S., Claus E.B., Ibrahim N., Lin J.Y. Vitamin D deficiency is associated with a worse prognosis in metastatic melanoma. Oncotarget. 2017;8:6873–6882. doi: 10.18632/oncotarget.14316. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 14.Feller L., Khammissa R.A., Kramer B., Altini M., Lemmer J. Basal cell carcinoma, squamous cell carcinoma and melanoma of the head and face. Head Face Med. 2016;12:11. doi: 10.1186/s13005-016-0106-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 15.Becker J.C., Stang A., DeCaprio J.A., Cerroni L., Lebbé C., Veness M., Nghiem P. Merkel cell carcinoma. Nat. Rev. Dis. Primers. 2017;3:170–177. doi: 10.1038/nrdp.2017.77. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 16.Glazer A.M., Winkelmann R.R., Farberg A.S., Rigel D.S. Analysis of trends in US melanoma incidence and mortality. JAMA Dermatol. 2016;153:225–226. doi: 10.1001/jamadermatol.2016.4512. [DOI] [PubMed] [Google Scholar]
- 17.Lv R., Sun Q. A Network Meta-Analysis of Non-Melanoma Skin Cancer (NMSC) Treatments: Efficacy and Safety Assessment. J. Cell. Biochem. 2017;118:3686–3695. doi: 10.1002/jcb.26015. [DOI] [PubMed] [Google Scholar]
- 18.Lindelof B., Hedblad M.-A. Accuracy in the clinical diagnosis and pattern of malignant melanoma at a dermatological clinic. J. Dermatol. 1994;21:461–464. doi: 10.1111/j.1346-8138.1994.tb01775.x. [DOI] [PubMed] [Google Scholar]
- 19.Morton C.A., Mackie R.M. Clinical accuracy of the diagnosis of cutaneous malignant melanoma. Br. J. Dermatol. 1998;138:283–287. doi: 10.1046/j.1365-2133.1998.02075.x. [DOI] [PubMed] [Google Scholar]
- 20.Menzies S.W., Bischof L., Talbot H., Gutenev A., Avramidis M., Wong L., Lo S.K., Mackellar G., Skladnev V., McCarthy W., et al. The performance of SolarScan: An automated dermoscopy image analysis instrument for the diagnosis of primary melanoma. Arch. Dermatol. 2005;141:1388–1396. doi: 10.1001/archderm.141.11.1388. [DOI] [PubMed] [Google Scholar]
- 21.Binder M., Schwarz M., Winkler A., Steiner A., Kaider A., Wolff K., Pehamberger H. Epiluminescence microscopy: A useful tool for the diagnosis of pigmented skin lesions for formally trained dermatologists. Arch. Dermatol. 1995;131:286–291. doi: 10.1001/archderm.1995.01690150050011. [DOI] [PubMed] [Google Scholar]
- 22.Pehamberger H., Binder M., Steiner A., Wolff K. In vivo epiluminescence microscopy: Improvement of early diagnosis of melanoma. J. Investig. Dermatol. 1993;100:3. doi: 10.1038/jid.1993.63. [DOI] [PubMed] [Google Scholar]
- 23.Dhawan A.P., Gordon R., Rangayyan R.M. Nevoscopy: Three-dimensional computed tomography of nevi and melanomas in situ by transillumination. IEEE Trans. Med. Imaging. 1984;3:54–61. doi: 10.1109/TMI.1984.4307657. [DOI] [PubMed] [Google Scholar]
- 24.Zouridakis G., Duvic M.D.M., Mullani N.A. Transillumination Imaging for Early Skin Cancer Detection. Biomedical Imaging Lab., Department of Computer Science, University of Houston; Houston, TX, USA: 2005. Technol Report 2005. [Google Scholar]
- 25.Vestergaard M.E., Macaskill P., Holt P.E., Menzies S.W. Dermoscopy compared with naked eye examination for the diagnosis of primary melanoma: A meta-analysis of studies performed in a clinical setting. Br. J. Dermatol. 2008;159:669–676. doi: 10.1111/j.1365-2133.2008.08713.x. [DOI] [PubMed] [Google Scholar]
- 26.Carli P., Giorgi V.D., Crocetti E., Mannone F., Massi D., Chiarugi A., Giannotti B. Improvement of malignant/benign ratio in excised melanocytic lesions in the “dermoscopy era”: A retrospective study 1997–2001. Br. J. Dermatol. 2004;150:687–692. doi: 10.1111/j.0007-0963.2004.05860.x. [DOI] [PubMed] [Google Scholar]
- 27.Carli P., Giorgi V.D., Chiarugi A., Nardini P., Weinstock M.A., Crocetti E., Stante M., Giannotti B. Addition of dermoscopy to conventional naked-eye examination in melanoma screening: A randomized study. J. Am. Acad. Dermatol. 2004;50:683–689. doi: 10.1016/j.jaad.2003.09.009. [DOI] [PubMed] [Google Scholar]
- 28.Mayer J. Systematic review of the diagnostic accuracy of dermatoscopy in detecting malignant melanoma. Med. J. Aust. 1997;167:206–210. doi: 10.5694/j.1326-5377.1997.tb138847.x. [DOI] [PubMed] [Google Scholar]
- 29.Piccolo D., Ferrari A., Peris K., Daidone R., Ruggeri B., Chimenti S. Dermoscopic diagnosis by a trained clinician vs. a clinician with minimal dermoscopy training vs. computer-aided diagnosis of 341 pigmented skin lesions: A comparative study. Br. J. Dermatol. 2002;147:481–486. doi: 10.1046/j.1365-2133.2002.04978.x. [DOI] [PubMed] [Google Scholar]
- 30.Braun R.P., Rabinovitz H.S., Oliviero M., Kopf A.W., Saurat J.-H. Dermoscopy of pigmented skin lesions. J. Am. Acad. Dermatol. 2005;52:109–121. doi: 10.1016/j.jaad.2001.11.001. [DOI] [PubMed] [Google Scholar]
- 31.Kittler H., Pehamberger H., Wolff K., Binder M. Diagnostic accuracy of dermoscopy. Lancet Oncol. 2002;3:159–165. doi: 10.1016/S1470-2045(02)00679-4. [DOI] [PubMed] [Google Scholar]
- 32.Whited J.D., Grichnik J.M. Does this patient have a mole or a melanoma? J. Am. Med. Assoc. 1998;279:696–701. doi: 10.1001/jama.279.9.696. [DOI] [PubMed] [Google Scholar]
- 33.Burroni M., Corona R., Dell’Eva G., Sera F., Bono R., Puddu P., Perotti R., Nobile F., Andreassi L., Rubegni P. Melanoma computer-aided diagnosis: Reliability and feasibility study. Clin. Cancer Res. 2004;10:1881–1886. doi: 10.1158/1078-0432.CCR-03-0039. [DOI] [PubMed] [Google Scholar]
- 34.Nami N., Giannini E., Burroni M., Fimiani M., Rubegni P. Teledermatology: State-of-the-art and future perspectives. Expert Rev. Dermatol. 2014;7:1–3. doi: 10.1586/edm.11.79. [DOI] [Google Scholar]
- 35.Fabbrocini G., Triassi M., Mauriello M.C., Torre G., Annunziata M.C., De Vita V., Pastore F., D’Arco V., Monfrecola G. Epidemiology of skin cancer: Role of some environmental factors. Cancers. 2010;2:1980–1989. doi: 10.3390/cancers2041980. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 36.Haenssle H., Fink C., Schneiderbauer R., Toberer F., Buhl T., Blum A., Reader Study Level-I and Level-II Groups Man against machine: Diagnostic performance of a deep learning convolutional neural network for dermoscopic melanoma recognition in comparison to 58 dermatologists. Ann. Oncol. 2018;29:1836–1842. doi: 10.1093/annonc/mdy166. [DOI] [PubMed] [Google Scholar]
- 37.Argenziano G., Soyer H.P. Dermoscopy of pigmented skin lesions: A valuable tool for early diagnosis of melanoma. Lancet Oncol. 2001;2:443–449. doi: 10.1016/S1470-2045(00)00422-8. [DOI] [PubMed] [Google Scholar]
- 38.Ali A.R.A., Deserno T.M. A systematic review of automated melanoma detection in dermatoscopic images and its ground truth data. Proc. SPIE Int. Soc. Opt. Eng. 2012;8318:1–6. doi: 10.1117/12.912389. [DOI] [Google Scholar]
- 39.Fabbrocini G., De Vita V., Pastore F., D’Arco V., Mazzella C., Annunziata M.C., Cacciapuoti S., Mauriello M.C., Monfrecola A. Teledermatology: From prevention to diagnosis of nonmelanoma and melanoma skin cancer. Int. J. Telemed. Appl. 2011;2011:125762. doi: 10.1155/2011/125762. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 40.Foraker R., Kite B., Kelley M.M., Lai A.M., Roth C., Lopetegui M.A., Shoben A.B., Langan M., Rutledge N., Payne P.R.O. EHR-based visualization tool: Adoption rates, satisfaction, and patient outcomes. EGEMS. 2015;3:1159. doi: 10.13063/2327-9214.1159. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 41.Fabbrocini G., Betta G., Di Leo G., Liguori C., Paolillo A., Pietrosanto A., Sommella P., Rescigno O., Cacciapuoti S., Pastore F., et al. Epiluminescence image processing for melanocytic skin lesion diagnosis based on 7-point check-list: A preliminary discussion on three parameters. Open Dermatol. J. 2010;4:110–115. doi: 10.2174/1874372201004010110. [DOI] [Google Scholar]
- 42.Hart P.E., Stork D.G., Duda R.O. Pattern Classification. 2nd ed. John Wiley & Sons; Hoboken, NJ, USA: 2000. [Google Scholar]
- 43.Oliveira R.B., Papa J.P., Pereira A.S., Tavares J.M.R.S. Computational methods for pigmented skin lesion classification in images: Review and future trends. Neural Comput. Appl. 2016;29:613–636. doi: 10.1007/s00521-016-2482-6. [DOI] [Google Scholar]
- 44.Cascinelli N., Ferrario M., Tonelli T., Leo E. A possible new tool for clinical diagnosis of melanoma: The computer. J. Am. Acad. Dermatol. 1987;16:361–367. doi: 10.1016/S0190-9622(87)70050-4. [DOI] [PubMed] [Google Scholar]
- 45.Hall P.N., Claridge E., Smith J.D.M. Computer screening for early detection of melanoma—Is there a future? Br. J. Dermatol. 1995;132:325–338. doi: 10.1111/j.1365-2133.1995.tb08664.x. [DOI] [PubMed] [Google Scholar]
- 46.Cristofolini M., Bauer P., Boi S., Cristofolini P., Micciolo R., Sicher M.C. Diagnosis of cutaneous melanoma: Accuracy of a computerized image analysis system (SkinView). Skin Res. Technol. 1997;3:23–27. doi: 10.1111/j.1600-0846.1997.tb00155.x. [DOI] [PubMed] [Google Scholar]
- 47.Umbaugh S.E. Computer Vision in Medicine: Color Metrics and Image Segmentation Methods for Skin Cancer Diagnosis. Electrical Engineering Department, University of Missouri; Rolla, MO, USA: 1990. [Google Scholar]
- 48.Stanganelli I., Brucale A., Calori L., Gori R., Lovato A., Magi S., Kopf B., Bacchilega R., Rapisarda V., Testori A., et al. Computer-aided diagnosis of melanocytic lesions. Anticancer Res. 2005;25:4577–4582. [PubMed] [Google Scholar]
- 49.Rubegni P., Cevenini G., Burroni M., Perotti R., Dell’Eva G., Sbano P., Miracco C., Luzi P., Tosi P., Barbini P., et al. Automated diagnosis of pigmented skin lesions. Int. J. Cancer. 2002;101:576–580. doi: 10.1002/ijc.10620. [DOI] [PubMed] [Google Scholar]
- 50.Sober A.J., Burstein J.M. Computerized digital image analysis: An aid for melanoma diagnosis—Preliminary investigations and brief review. J. Dermatol. 1994;21:885–890. doi: 10.1111/j.1346-8138.1994.tb03307.x. [DOI] [PubMed] [Google Scholar]
- 51.Rajpara S.M., Botello A.P., Townend J., Ormerod A.D. Systematic review of dermoscopy and digital dermoscopy/artificial intelligence for the diagnosis of melanoma. Br. J. Dermatol. 2009;161:591–604. doi: 10.1111/j.1365-2133.2009.09093.x. [DOI] [PubMed] [Google Scholar]
- 52.Rosado B., Menzies S., Harbauer A., Pehamberger H., Wolff K., Binder M., Kittler H. Accuracy of computer diagnosis of melanoma: A quantitative meta-analysis. Arch. Dermatol. 2003;139:361–367. doi: 10.1001/archderm.139.3.361. [DOI] [PubMed] [Google Scholar]
- 53.Bauer P., Cristofolini P., Boi S., Burroni M., Dell’Eva G., Micciolo R., Cristofolini M. Digital epiluminescence microscopy: Usefulness in the differential diagnosis of cutaneous pigmentary lesions. A statistical comparison between visual and computer inspection. Melanoma Res. 2000;10:345–349. doi: 10.1097/00008390-200008000-00005. [DOI] [PubMed] [Google Scholar]
- 54.Maglogiannis I., Doukas C.N. Overview of advanced computer vision systems for skin lesions characterization. IEEE Trans. Inf. Technol. Biomed. 2009;13:721–733. doi: 10.1109/TITB.2009.2017529. [DOI] [PubMed] [Google Scholar]
- 55.Friedman R.J., Gutkowicz-Krusin D., Farber M.J., Warycha M., Schneider-Kels L., Papastathis N., Mihm M.C., Googe P., King R., Prieto V.G., et al. The diagnostic performance of expert dermoscopists vs a computer vision system on small-diameter melanomas. Arch. Dermatol. 2008;144:476–482. doi: 10.1001/archderm.144.4.476. [DOI] [PubMed] [Google Scholar]
- 56.Blum A., Zalaudek I., Argenziano G. Digital image analysis for diagnosis of skin tumors. Semin. Cutan. Med. Surgery. 2008;27:11–15. doi: 10.1016/j.sder.2007.12.005. [DOI] [PubMed] [Google Scholar]
- 57.Maruthamuthu M.K., Raffiee A.H., Oliveira D.M.D., Ardekani A.M., Verma M.S. Raman spectra-based deep learning: A tool to identify microbial contamination. MicrobiologyOpen. 2020;9:e1122. doi: 10.1002/mbo3.1122. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 58.Zhao Z., Wu C.M., Zhang S., He F., Liu F., Wang B., Huang Y., Shi W., Jian D., Xie H., et al. A Novel Convolutional Neural Network for the Diagnosis and Classification of Rosacea: Usability Study. Jmir Med. Inform. 2021;9:e23415. doi: 10.2196/23415. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 59.Moher D., Liberati A., Tetzlaff J., Altman D.G., The PRISMA Group Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement. PLoS Med. 2009;6:E1000097. doi: 10.1371/journal.pmed.1000097. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 60.Siegel R.L., Miller K.D., Jemal A. Cancer statistics, 2018. CA Cancer J. Clin. 2018;68:7–30. doi: 10.3322/caac.21442. [DOI] [PubMed] [Google Scholar]
- 61.Giotis I., Molders N., Land S., Biehl M., Jonkman M.F., Petkov N. MED-NODE: A computer-assisted melanoma diagnosis system using non-dermoscopic images. Expert Syst. Appl. 2015;42:6578–6585. doi: 10.1016/j.eswa.2015.04.034. [DOI] [Google Scholar]
- 62.Dermatology Information System. [(accessed on 25 January 2016)]; Available online: http://www.dermis.net.
- 63.DermQuest. [(accessed on 25 January 2016)]; Available online: http://www.dermquest.com.
- 64.Mendonça T., Ferreira P.M., Marques J.S., Marcal A.R.S., Rozeira J. PH2- A dermoscopic image database for research and benchmarking; Proceedings of the 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC); Osaka, Japan. 3–7 July 2013; pp. 5437–5440. [DOI] [PubMed] [Google Scholar]
- 65.Gutman D., Codella N.C.F., Celebi M.E., Helba B., Marchetti M., Mishra N., Halpern A. Skin Lesion Analysis toward Melanoma Detection: A Challenge at the International Symposium on Biomedical Imaging (ISBI) 2016, hosted by the International Skin Imaging Collaboration (ISIC). arXiv. 2016. arXiv:1605.01397. [Google Scholar]
- 66.Codella N., Gutman D., Celebi M.E., Helba B., Marchetti M.A., Dusza S., Kalloo A., Liopyris K., Mishra N., Kittler H., et al. Skin Lesion Analysis Toward Melanoma Detection: A Challenge at the 2017 International Symposium on Biomedical Imaging (ISBI), Hosted by the International Skin Imaging Collaboration (ISIC). arXiv. 2017. arXiv:1710.05006. [Google Scholar]
- 67.Codella N., Rotemberg V., Tschandl P., Celebi M.E., Dusza S., Gutman D., Helba B., Kalloo A., Liopyris K., Marchetti M., et al. Skin Lesion Analysis Toward Melanoma Detection 2018: A Challenge Hosted by the International Skin Imaging Collaboration (ISIC). arXiv. 2018. arXiv:1902.03368. [Google Scholar]
- 68.Tschandl P., Rosendahl C., Kittler H. The HAM10000 dataset, a large collection of multi-source dermatoscopic images of common pigmented skin lesions. Sci. Data. 2018;5:180161. doi: 10.1038/sdata.2018.161. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 69.Combalia M., Codella N.C.F., Rotemberg V., Helba B., Vilaplana V., Reiter O., Halpern A.C., Puig S., Malvehy J. BCN20000: Dermoscopic Lesions in the Wild. arXiv. 2019. arXiv:1908.02288. doi: 10.1038/s41597-024-03387-w. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 70.Ballerini L., Fisher R.B., Aldridge B., Rees J. A Color and Texture Based Hierarchical K-NN Approach to the Classification of Non-melanoma Skin Lesions. In: Celebi M., Schaefer G., editors. Color Medical Image Analysis. Volume 6. Springer; Dordrecht, The Netherlands: 2013. Lecture Notes in Computational Vision and Biomechanics. [Google Scholar]
- 71.Lio P.A., Nghiem P. Interactive atlas of dermoscopy. J. Am. Acad. Dermatol. 2004;50:807–808. doi: 10.1016/j.jaad.2003.07.029. [DOI] [Google Scholar]
- 72.Rotemberg V., Kurtansky N., Betz-Stablein B., Caffery L., Chousakos E., Codella N., Combalia M., Dusza S., Guitera P., Gutman D., et al. A patient-centric dataset of images and metadata for identifying melanomas using clinical context. Sci. Data. 2021;8:34. doi: 10.1038/s41597-021-00815-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 73.Korotkov K. Ph.D. Thesis. Universitat de Girona; Girona, Spain: 2014. Automatic Change Detection in Multiple Pigmented Skin Lesions. [Google Scholar]
- 74.Kadry S., Taniar D., Damasevicius R., Rajinikanth V., Lawal I.A. Extraction of abnormal skin lesion from dermoscopy image using VGG-SegNet; Proceedings of the 2021 IEEE 7th International Conference on Bio Signals, Images and Instrumentation, ICBSII 2021; Chennai, India. 25–27 March 2021; [DOI] [Google Scholar]
- 75.Mishra N.K., Celebi M.E. An overview of melanoma detection in dermoscopy images using image processing and machine learning. arXiv. 2016. arXiv:1601.07843. [Google Scholar]
- 76.Hosny K.M., Kassem M.A., Foaud M.M. Classification of skin lesions using transfer learning and augmentation with Alex-net. PLoS ONE. 2019;14:e0217293. doi: 10.1371/journal.pone.0217293. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 77.Hosny K.M., Kassem M.A., Foaud M.M. Skin Cancer Classification using Deep Learning and Transfer Learning; Proceedings of the 9th Cairo International Biomedical Engineering; Cairo, Egypt. 20–22 December 2018; pp. 90–93. [Google Scholar]
- 78.Hosny K.M., Kassem M.A., Fouad M.M. Deep Learning in Computer Vision: Theories and Applications. CRC; Boca Raton, FL, USA: 2020. Skin Melanoma Classification Using Deep Convolutional Neural Networks. [DOI] [Google Scholar]
- 79.Glaister J., Wong A., Clausi D.A. Segmentation of Skin Lesions From Digital Images Using Joint Statistical Texture Distinctiveness. IEEE Trans. Biomed. Eng. 2014;61:1220–1230. doi: 10.1109/TBME.2013.2297622. [DOI] [PubMed] [Google Scholar]
- 80.Celebi M.E., Zornberg A. Automated Quantification of Clinically Significant Colors in Dermoscopy Images and Its Application to Skin Lesion Classification. IEEE Syst. J. 2014;8:980–984. doi: 10.1109/JSYST.2014.2313671. [DOI] [Google Scholar]
- 81.Barata C., Ruela M., Francisco M., Mendonça T., Marques J.S. Two Systems for the Detection of Melanomas in Dermoscopy Images Using Texture and Color Features. IEEE Syst. J. 2014;8:965–979. doi: 10.1109/JSYST.2013.2271540. [DOI] [Google Scholar]
- 82.Sáez A., Serrano C., Acha B. Model-Based Classification Methods of Global Patterns in Dermoscopic Images. IEEE Trans. Med. Imaging. 2014;33:1137–1147. doi: 10.1109/TMI.2014.2305769. [DOI] [PubMed] [Google Scholar]
- 83.Abuzaghleh O., Barkana B.D., Faezipour M. Automated skin lesion analysis based on color and shape geometry feature set for melanoma early detection and prevention; Proceedings of the IEEE Long Island Systems, Applications and Technology (LISAT) Conference 2014; Farmingdale, NY, USA. 2 May 2014; pp. 1–6. [DOI] [Google Scholar]
- 84.Abuzaghleh O., Barkana B.D., Faezipour M. SKINcure: A real time image analysis system to aid in the malignant melanoma prevention and early detection; Proceedings of the Southwest Symposium on Image Analysis and Interpretation; San Diego, CA, USA. 6–8 April 2014; pp. 85–88. [DOI] [Google Scholar]
- 85.Surówka G., Ogorzałek M. On optimal wavelet bases for classification of skin lesion images through ensemble learning; Proceedings of the International Joint Conference on Neural Networks (IJCNN); Beijing, China. 6–11 July 2014; pp. 165–170. [DOI] [Google Scholar]
- 86.Lezoray O., Revenu M., Desvignes M. Graph-based skin lesion segmentation of multispectral dermoscopic images; Proceedings of the IEEE International Conference on Image Processing (ICIP); Paris, France. 27–30 October 2014; pp. 897–901. [DOI] [Google Scholar]
- 87.Sheha M.A., Sharwy A., Mabrouk M.S. Pigmented skin lesion diagnosis using geometric and chromatic features; Proceedings of the Cairo International Biomedical Engineering Conference (CIBEC); Giza, Egypt. 11–13 December 2014; pp. 115–120. [DOI] [Google Scholar]
- 88.Dhinagar N.J., Celenk M. Analysis of regularity in skin pigmentation and vascularity by an optimized feature space for early cancer classification; Proceedings of the 7th International Conference on Biomedical Engineering and Informatics; Dalian, China. 14–16 October 2014; pp. 709–713. [DOI] [Google Scholar]
- 89.Haider S., Cho D., Amelard R., Wong A., Clausi D.A. Enhanced classification of malignant melanoma lesions via the integration of physiological features from dermatological photographs; Proceedings of the 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society; Chicago, IL, USA. 26–30 August 2014; pp. 6455–6458. [DOI] [PubMed] [Google Scholar]
- 90.Masood A., Al-Jumaily A.A. Integrating soft and hard threshold selection algorithms for accurate segmentation of skin lesion; Proceedings of the 2nd Middle East Conference on Biomedical Engineering; Doha, Qatar. 17–20 February 2014; pp. 83–86. [DOI] [Google Scholar]
- 91.Abuzaghleh O., Barkana B.D., Faezipour M. Noninvasive Real-Time Automated Skin Lesion Analysis System for Melanoma Early Detection and Prevention. IEEE J. Transl. Eng. Health Med. 2015;3:1–12. doi: 10.1109/JTEHM.2015.2419612. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 92.Harmouche R., Subbanna N.K., Collins D.L., Arnold D.L., Arbel T. Probabilistic Multiple Sclerosis Lesion Classification Based on Modeling Regional Intensity Variability and Local Neighborhood Information. IEEE Trans. Biomed. Eng. 2015;62:1281–1292. doi: 10.1109/TBME.2014.2385635. [DOI] [PubMed] [Google Scholar]
- 93.Lu C., Ma Z., Mandal M. Automated segmentation of the epidermis area in skin whole slide histopathological images. IET Image Process. 2015;9:735–742. doi: 10.1049/iet-ipr.2014.0192. [DOI] [Google Scholar]
- 94.Jiji G.W., Raj P.S.J.D. Content-based image retrieval in dermatology using intelligent technique. IET Image Process. 2015;9:306–317. doi: 10.1049/iet-ipr.2013.0501. [DOI] [Google Scholar]
- 95.Barata C., Celebi M.E., Marques J.S. Improving Dermoscopy Image Classification Using Color Constancy. IEEE J. Biomed. Health Inform. 2015;19:1146–1152. doi: 10.1109/JBHI.2014.2336473. [DOI] [PubMed] [Google Scholar]
- 96.Valavanis I., Maglogiannis I., Chatziioannou A.A. Exploring Robust Diagnostic Signatures for Cutaneous Melanoma Utilizing Genetic and Imaging Data. IEEE J. Biomed. Health Inform. 2015;19:190–198. doi: 10.1109/JBHI.2014.2336617. [DOI] [PubMed] [Google Scholar]
- 97.Amelard R., Glaister J., Wong A., Clausi D.A. High-Level Intuitive Features (HLIFs) for Intuitive Skin Lesion Description. IEEE Trans. Biomed. Eng. 2015;62:820–831. doi: 10.1109/TBME.2014.2365518. [DOI] [PubMed] [Google Scholar]
- 98.Shimizu K., Iyatomi H., Celebi M.E., Norton K., Tanaka M. Four-Class Classification of Skin Lesions With Task Decomposition Strategy. IEEE Trans. Biomed. Eng. 2015;62:274–283. doi: 10.1109/TBME.2014.2348323. [DOI] [PubMed] [Google Scholar]
- 99.Schaefer G., Krawczyk B., Celebi M.E., Iyatomi H. An ensemble classification approach for melanoma diagnosis. Memetic Comp. 2014;6:233–240. doi: 10.1007/s12293-014-0144-8. [DOI] [Google Scholar]
- 100.Alencar F.E.S., Lopes D.C., Neto F.M.M. Development of a System Classification of Images Dermoscopic for Mobile Devices. IEEE Lat. Am. Trans. 2016;14:325–330. doi: 10.1109/TLA.2016.7430097. [DOI] [Google Scholar]
- 101.Kasmi R., Mokrani K. Classification of malignant melanoma and benign skin lesions: Implementation of automatic ABCD rule. IET Image Process. 2016;10:448–455. doi: 10.1049/iet-ipr.2015.0385. [DOI] [Google Scholar]
- 102.Sáez A., Sánchez-Monedero J., Gutiérrez P.A., Hervás-Martínez C. Machine Learning Methods for Binary and Multiclass Classification of Melanoma Thickness From Dermoscopic Images. IEEE Trans. Med. Imaging. 2016;35:1036–1045. doi: 10.1109/TMI.2015.2506270. [DOI] [PubMed] [Google Scholar]
- 103.Ma Z., Tavares J.M.R.S. A Novel Approach to Segment Skin Lesions in Dermoscopic Images Based on a Deformable Model. IEEE J. Biomed. Health Inform. 2016;20:615–623. doi: 10.1109/JBHI.2015.2390032. [DOI] [PubMed] [Google Scholar]
- 104.Oliveira R.B., Marranghello N., Pereira A.S., Tavares J.M.R.S. A computational approach for detecting pigmented skin lesions in macroscopic images. Expert Syst. Appl. 2016;61:53–63. doi: 10.1016/j.eswa.2016.05.017. [DOI] [Google Scholar]
- 105.Inácio D.F., Célio V.N., Vilanova G.D., Conceição M.M., Fábio G., Minoro A.J., Tavares P.M., Landulfo S. Paraconsistent analysis network applied in the treatment of Raman spectroscopy data to support medical diagnosis of skin cancer. Med. Biol. Eng. Comput. 2016;54:1453–1467. doi: 10.1007/s11517-016-1471-3. [DOI] [PubMed] [Google Scholar]
- 106.Pennisi A., Bloisi D.D., Nardi D.A., Giampetruzzi R., Mondino C., Facchiano A. Skin lesion image segmentation using Delaunay Triangulation for melanoma detection. Comput. Med. Imaging Graph. 2016;52:89–103. doi: 10.1016/j.compmedimag.2016.05.002. [DOI] [PubMed] [Google Scholar]
- 107.Noroozi N., Zakerolhosseini A. Differential diagnosis of squamous cell carcinoma in situ using skin histopathological images. Comput. Biol. Med. 2016;70:23–39. doi: 10.1016/j.compbiomed.2015.12.024. [DOI] [PubMed] [Google Scholar]
- 108.Odeh S.M., Baareh A.K.M. A comparison of classification methods as diagnostic system: A case study on skin lesions. Comput. Methods Programs Biomed. 2016;137:311–319. doi: 10.1016/j.cmpb.2016.09.012. [DOI] [PubMed] [Google Scholar]
- 109.Noroozi N., Zakerolhosseini A. Computer assisted diagnosis of basal cell carcinoma using Z-transform features. J. Vis. Commun. Image Represent. 2016;40(Pt A):128–148. doi: 10.1016/j.jvcir.2016.06.014. [DOI] [Google Scholar]
- 110.Shrivastava V.K., Londhe N.D., Sonawane R.S., Suri J.S. Computer-aided diagnosis of psoriasis skin images with HOS, texture and color features: A first comparative study of its kind. Comput. Methods Programs Biomed. 2016;126:98–109. doi: 10.1016/j.cmpb.2015.11.013. [DOI] [PubMed] [Google Scholar]
- 111.Kharazmi P., AlJasser M.I., Lui H., Wang Z.J., Lee T.K. Automated Detection and Segmentation of Vascular Structures of Skin Lesions Seen in Dermoscopy, With an Application to Basal Cell Carcinoma Classification. IEEE J. Biomed. Health Inform. 2017;21:1675–1684. doi: 10.1109/JBHI.2016.2637342. [DOI] [PubMed] [Google Scholar]
- 112.Satheesha T.Y., Satyanarayana D., Prasad M.N.G., Dhruve K.D. Melanoma Is Skin Deep: A 3D Reconstruction Technique for Computerized Dermoscopic Skin Lesion Classification. IEEE J. Transl. Eng. Health Med. 2017;5:1–17. doi: 10.1109/JTEHM.2017.2648797. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 113.Xie F., Fan H., Li Y., Jiang Z., Meng R., Bovik A. Melanoma Classification on Dermoscopy Images Using a Neural Network Ensemble Model. IEEE Trans. Med. Imaging. 2017;36:849–858. doi: 10.1109/TMI.2016.2633551. [DOI] [PubMed] [Google Scholar]
- 114.Sadri A.R., Azarianpour S., Zekri M., Celebi M.E., Sadri S. WN-based approach to melanoma diagnosis from dermoscopy images. IET Image Process. 2017;11:475–482. doi: 10.1049/iet-ipr.2016.0681. [DOI] [Google Scholar]
- 115.Hamed K., Khaki S.A., Mohammad A., Jahed M., Ali H. Nonlinear Analysis of the Contour Boundary Irregularity of Skin Lesion Using Lyapunov Exponent and K-S Entropy. J. Med. Biol. Eng. 2017;37:409–419. [Google Scholar]
- 116.Dalila F., Zohra A., Reda K., Hocine C. Segmentation and classification of melanoma and benign skin lesions. Optik. 2017;140:749–761. doi: 10.1016/j.ijleo.2017.04.084. [DOI] [Google Scholar]
- 117.Ma Z., Tavares J.M.R.S. Effective features to classify skin lesions in dermoscopic images. Expert Syst. Appl. 2017;84:92–101. doi: 10.1016/j.eswa.2017.05.003. [DOI] [Google Scholar]
- 118.Alfed N., Khelifi F. Bagged textural and color features for melanoma skin cancer detection in dermoscopic and standard images. Expert Syst. Appl. 2017;90:101–110. doi: 10.1016/j.eswa.2017.08.010. [DOI] [Google Scholar]
- 119.Oliveira R.B., Pereira A.S., Tavares J.M.R.S. Skin lesion computational diagnosis of dermoscopic images: Ensemble models based on input feature manipulation. Comput. Methods Programs Biomed. 2017;149:43–53. doi: 10.1016/j.cmpb.2017.07.009. [DOI] [PubMed] [Google Scholar]
- 120.Przystalski K., Ogorzałek M.J. Multispectral skin patterns analysis using fractal methods. Expert Syst. Appl. 2017;88:318–326. doi: 10.1016/j.eswa.2017.07.011. [DOI] [Google Scholar]
- 121.Jaisakthi S.M., Mirunalini P., Aravindan C. Automated skin lesion segmentation of dermoscopic images using GrabCut and k-means algorithms. IET Comput. Vis. 2018;12:1088–1095. doi: 10.1049/iet-cvi.2018.5289. [DOI] [Google Scholar]
- 122.Do T., Hoang T., Pomponiu V., Zhou Y., Chen Z., Cheung N., Koh D., Tan A., Tan S. Accessible Melanoma Detection Using Smartphones and Mobile Image Analysis. IEEE Trans. Multimed. 2018;20:2849–2864. doi: 10.1109/TMM.2018.2814346. [DOI] [Google Scholar]
- 123.Adjed F., Gardezi S.J.S., Ababsa F., Faye I., Dass S.C. Fusion of structural and textural features for melanoma recognition. IET Comput. Vis. 2018;12:185–195. doi: 10.1049/iet-cvi.2017.0193. [DOI] [Google Scholar]
- 124.Hosseinzadeh H. Automated skin lesion division utilizing Gabor filters based on shark smell optimizing method. Evol. Syst. 2018;11:1–10. doi: 10.1007/s12530-018-9258-4. [DOI] [Google Scholar]
- 125.Akram T., Khan M.A., Sharif M., Yasmin M. Skin lesion segmentation and recognition using multichannel saliency estimation and M-SVM on selected serially fused features. J. Ambient. Intell. Humaniz. Comput. 2018 doi: 10.1007/s12652-018-1051-5. [DOI] [Google Scholar]
- 126.Jamil U., Khalid S., Akram M.U., Ahmad A., Jabbar S. Melanocytic and nevus lesion detection from diseased dermoscopic images using fuzzy and wavelet techniques. Soft Comput. 2018;22:1577–1593. doi: 10.1007/s00500-017-2947-2. [DOI] [Google Scholar]
- 127.Khan M., Akram T., Sharif M., Shahzad A., Aurangzeb K., Alhussein M., Haider S.I., Altamrah A. An implementation of normal distribution based segmentation and entropy controlled features selection for skin lesion detection and classification. BMC Cancer. 2018;18:638. doi: 10.1186/s12885-018-4465-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 128.Tan T.Y., Zhang L., Neoh S.C., Lim C.P. Intelligent skin cancer detection using enhanced particle swarm optimization. Knowl. Based Syst. 2018;158:118–135. doi: 10.1016/j.knosys.2018.05.042. [DOI] [Google Scholar]
- 129.Tajeddin N.Z., Asl B.M. Melanoma recognition in dermoscopy images using lesion’s peripheral region information. Comput. Methods Programs Biomed. 2018;163:143–153. doi: 10.1016/j.cmpb.2018.05.005. [DOI] [PubMed] [Google Scholar]
- 130.Filho P.P.R., Peixoto S.A., Nóbrega R.V.M., Hemanth D.J., Medeiros A.G., Sangaiah A.K., Albuquerque V.H.C. Automatic histologically-closer classification of skin lesions. Comput. Med. Imaging Graph. 2018;68:40–54. doi: 10.1016/j.compmedimag.2018.05.004. [DOI] [PubMed] [Google Scholar]
- 131.Peñaranda F., Naranjo V., Lloyd G., Kastl L., Kemper B., Schnekenburger J., Nallala J., Stone N. Discrimination of skin cancer cells using Fourier transform infrared spectroscopy. Comput. Biol. Med. 2018;100:50–61. doi: 10.1016/j.compbiomed.2018.06.023. [DOI] [PubMed] [Google Scholar]
- 132.Wahba M.A., Ashour A.S., Guo Y., Napoleon S.A., Elnaby M.M.A. A novel cumulative level difference mean based GLDM and modified ABCD features ranked using eigenvector centrality approach for four skin lesion types classification. Comput. Methods Programs Biomed. 2018;165:163–174. doi: 10.1016/j.cmpb.2018.08.009. [DOI] [PubMed] [Google Scholar]
- 133.Zakeri A., Hokmabadi A. Improvement in the diagnosis of melanoma and dysplastic lesions by introducing ABCD-PDT features and a hybrid classifier. Biocybern. Biomed. Eng. 2018;38:456–466. doi: 10.1016/j.bbe.2018.03.005. [DOI] [Google Scholar]
- 134.Pathan S., Prabhu K.G., Siddalingaswamy P.C. A methodological approach to classify typical and atypical pigment network patterns for melanoma diagnosis. Biomed. Signal Process. Control. 2018;44:25–37. doi: 10.1016/j.bspc.2018.03.017. [DOI] [Google Scholar]
- 135.Li X., Yang S., Fan R., Yu X., Chen D.G. Discrimination of soft tissues using laser-induced breakdown spectroscopy in combination with k nearest neighbors (kNN) and support vector machine (SVM) classifiers. Opt. Laser Technol. 2018;102:233–239. doi: 10.1016/j.optlastec.2018.01.028. [DOI] [Google Scholar]
- 136.Chatterjee S., Dey D., Munshi S. Optimal selection of features using wavelet fractal descriptors and automatic correlation bias reduction for classifying skin lesions. Biomed. Signal Process. Control. 2018;40:252–262. doi: 10.1016/j.bspc.2017.09.028. [DOI] [Google Scholar]
- 137.Khan M.Q., Hussain A., Rehman S.U., Khan U., Maqsood M., Mehmood K., Khan M.A. Classification of Melanoma and Nevus in Digital Images for Diagnosis of Skin Cancer. IEEE Access. 2019;7:90132–90144. doi: 10.1109/ACCESS.2019.2926837. [DOI] [Google Scholar]
- 138.Madooei A., Drew M.S., Hajimirsadeghi H. Learning to Detect Blue–White Structures in Dermoscopy Images With Weak Supervision. IEEE J. Biomed. Health Inform. 2019;23:779–786. doi: 10.1109/JBHI.2018.2835405. [DOI] [PubMed] [Google Scholar]
- 139.Sáez A., Acha B., Serrano A., Serrano C. Statistical Detection of Colors in Dermoscopic Images With a Texton-Based Estimation of Probabilities. IEEE J. Biomed. Health Inform. 2019;23:560–569. doi: 10.1109/JBHI.2018.2823499. [DOI] [PubMed] [Google Scholar]
- 140.Navarro F., Escudero-Viñolo M., Bescós J. Accurate Segmentation and Registration of Skin Lesion Images to Evaluate Lesion Change. IEEE J. Biomed. Health Inform. 2019;23:501–508. doi: 10.1109/JBHI.2018.2825251. [DOI] [PubMed] [Google Scholar]
- 141.Riaz F., Naeem S., Nawaz R., Coimbra M. Active Contours Based Segmentation and Lesion Periphery Analysis for Characterization of Skin Lesions in Dermoscopy Images. IEEE J. Biomed. Health Inform. 2019;23:489–500. doi: 10.1109/JBHI.2018.2832455. [DOI] [PubMed] [Google Scholar]
- 142.Mahmouei S.S., Aldeen M., Stoecker W.V., Garnavi R. Biologically Inspired QuadTree Color Detection in Dermoscopy Images of Melanoma. IEEE J. Biomed. Health Inform. 2019;23:570–577. doi: 10.1109/JBHI.2018.2841428. [DOI] [PubMed] [Google Scholar]
- 143.Murugan A., Nair S.H., Kumar K.P.S. Detection of Skin Cancer Using SVM, Random Forest and kNN Classifiers. J. Med. Syst. 2019;43:269. doi: 10.1007/s10916-019-1400-8. [DOI] [PubMed] [Google Scholar]
- 144.Khalid S., Jamil U., Saleem K., Akram M.U., Manzoor W., Ahmed W., Sohail A. Segmentation of skin lesion using Cohen–Daubechies–Feauveau biorthogonal wavelet. SpringerPlus. 2016;5:1603. doi: 10.1186/s40064-016-3211-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 145.Majumder S., Ullah M.A. Feature extraction from dermoscopy images for melanoma diagnosis. SN Appl. Sci. 2019;1:753. doi: 10.1007/s42452-019-0786-8. [DOI] [Google Scholar]
- 146.Chatterjee S., Dey D., Munshi S. Integration of morphological preprocessing and fractal based feature extraction with recursive feature elimination for skin lesion types classification. Comput. Methods Programs Biomed. 2019;178:201–218. doi: 10.1016/j.cmpb.2019.06.018. [DOI] [PubMed] [Google Scholar]
- 147.Chatterjee S., Dey D., Munshi S., Gorai S. Extraction of features from cross correlation in space and frequency domains for classification of skin lesions. Biomed. Signal Process. Control. 2019;53:101581. doi: 10.1016/j.bspc.2019.101581. [DOI] [Google Scholar]
- 148.Upadhyay P.K., Chandra S. An improved bag of dense features for skin lesion recognition. J. King Saud Univ. Comput. Inf. Sci. 2019 doi: 10.1016/j.jksuci.2019.02.007. [DOI] [Google Scholar]
- 149.Pathan S., Prabhu K.G., Siddalingaswamy P.C. Automated detection of melanocytes related pigmented skin lesions: A clinical framework. Biomed. Signal Process. Control. 2019;51:59–72. doi: 10.1016/j.bspc.2019.02.013. [DOI] [Google Scholar]
- 150.Garcia-Arroyo J.L., Garcia-Zapirain B. Segmentation of skin lesions in dermoscopy images using fuzzy classification of pixels and histogram thresholding. Comput. Methods Programs Biomed. 2019;168:11–19. doi: 10.1016/j.cmpb.2018.11.001. [DOI] [PubMed] [Google Scholar]
- 151.Hu K., Niu X., Liu S., Zhang Y., Cao C., Xiao F., Yang W., Gao X. Classification of melanoma based on feature similarity measurement for codebook learning in the bag-of-features model. Biomed. Signal Process. Control. 2019;51:200–209. doi: 10.1016/j.bspc.2019.02.018. [DOI] [Google Scholar]
- 152.Moradi N., Mahdavi-Amiri N. Kernel sparse representation based model for skin lesions segmentation and classification. Comput. Methods Programs Biomed. 2019;182:105038. doi: 10.1016/j.cmpb.2019.105038. [DOI] [PubMed] [Google Scholar]
- 153.Pereira P.M.M., Fonseca-Pinto R., Paiva R.P., Assuncao P.A.A., Tavora L.M.N., Thomaz L.A., Faria S.M.M. Skin lesion classification enhancement using border-line features—The melanoma vs nevus problem. Biomed. Signal Process. Control. 2020;57:2020. doi: 10.1016/j.bspc.2019.101765. [DOI] [Google Scholar]
- 154.Kawahara J., Hamarneh G. Multi-resolution-Tract CNN with Hybrid Pretrained and Skin-Lesion Trained Layers. Mach. Learn. Med. Imaging. 2016;10019:164–171. [Google Scholar]
- 155.Yu L., Chen H., Dou Q., Qin J., Heng P. Automated Melanoma Recognition in Dermoscopy Images via Very Deep Residual Networks. IEEE Trans. Med. Imaging. 2017;36:994–1004. doi: 10.1109/TMI.2016.2642839. [DOI] [PubMed] [Google Scholar]
- 156.Codella N.C.F., Nguyen Q.-B., Pankanti S., Gutman D.A., Helba B., Halpern A.C., Smith J.R. Deep learning ensembles for melanoma recognition in dermoscopy images. IBM J. Res. Dev. 2017;61:5–15. [Google Scholar]
- 157.Bozorgtabar B., Sedai S., Roy P.K., Garnavi R. Skin lesion segmentation using deep convolution networks guided by local unsupervised learning. IBM J. Res. Dev. 2017;61:6–14. doi: 10.1147/JRD.2017.2708283. [DOI] [Google Scholar]
- 158.Yuan Y., Chao M., Lo Y. Automatic Skin Lesion Segmentation Using Deep Fully Convolutional Networks With Jaccard Distance. IEEE Trans. Med. Imaging. 2017;36:1876–1886. doi: 10.1109/TMI.2017.2695227. [DOI] [PubMed] [Google Scholar]
- 159.Sultana N.N., Mandal B., Puhan N.B. Deep residual network with regularised fisher framework for detection of melanoma. IET Comput. Vis. 2018;12:1096–1104. doi: 10.1049/iet-cvi.2018.5238. [DOI] [Google Scholar]
- 160.Rundo F., Conoci S., Banna G.L., Ortis A., Stanco F., Battiato S. Evaluation of Levenberg–Marquardt neural networks and stacked autoencoders clustering for skin lesion analysis, screening and follow-up. IET Comput. Vis. 2018;12:957–962. doi: 10.1049/iet-cvi.2018.5195. [DOI] [Google Scholar]
- 161.Creswell A., Pouplin A., Bharath A.A. Denoising adversarial autoencoders: Classifying skin lesions using limited labelled training data. IET Comput. Vis. 2018;12:1105–1111. doi: 10.1049/iet-cvi.2018.5243. [DOI] [Google Scholar]
- 162.Harangi B. Skin lesion classification with ensembles of deep convolutional neural networks. J. Biomed. Inform. 2018;86:25–32. doi: 10.1016/j.jbi.2018.08.006. [DOI] [PubMed] [Google Scholar]
- 163.Guo S., Yang Z. Multi-Channel-ResNet: An integration framework towards skin lesion analysis. Inform. Med. Unlocked. 2018;12:67–74. doi: 10.1016/j.imu.2018.06.006. [DOI] [Google Scholar]
- 164.Sánchez-Monedero J., Pérez-Ortiz M., Sáez A., Gutiérrez P.A., Hervás-Martínez C. Partial order label decomposition approaches for melanoma diagnosis. Appl. Soft Comput. 2018;64:341–355. doi: 10.1016/j.asoc.2017.11.042. [DOI] [Google Scholar]
- 165.Hagerty J.R., Stanley R.J., Almubarak H.A., Lama N., Kasmi R., Guo P., Drugge R.J., Rabinovitz H.S., Oliviero M., Stoecker W.V. Deep Learning and Handcrafted Method Fusion: Higher Diagnostic Accuracy for Melanoma Dermoscopy Images. IEEE J. Biomed. Health Inform. 2019;23:1385–1391. doi: 10.1109/JBHI.2019.2891049. [DOI] [PubMed] [Google Scholar]
- 166.Połap D. Analysis of Skin Marks Through the Use of Intelligent Things. IEEE Access. 2019;7:149355–149363. doi: 10.1109/ACCESS.2019.2947354. [DOI] [Google Scholar]
- 167.Połap D., Winnicka A., Serwata K., Kęsik K., Woźniak M. An intelligent system for monitoring skin diseases. Sensors. 2018;18:2552. doi: 10.3390/s18082552. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 168.Sarkar R., Chatterjee C.C., Hazra A. Diagnosis of melanoma from dermoscopic images using a deep depthwise separable residual convolutional network. IET Image Process. 2019;13:2130–2142. doi: 10.1049/iet-ipr.2018.6669. [DOI] [Google Scholar]
- 169.Zhang J., Xie Y., Xia Y., Shen C. Attention Residual Learning for Skin Lesion Classification. IEEE Trans. Med. Imaging. 2019;38:2092–2103. doi: 10.1109/TMI.2019.2893944. [DOI] [PubMed] [Google Scholar]
- 170.Albahar M.A. Skin Lesion Classification Using Convolutional Neural Network With Novel Regularizer. IEEE Access. 2019;7:38306–38313. doi: 10.1109/ACCESS.2019.2906241. [DOI] [Google Scholar]
- 171.González-Díaz I. DermaKNet: Incorporating the Knowledge of Dermatologists to Convolutional Neural Networks for Skin Lesion Diagnosis. IEEE J. Biomed. Health Inform. 2019;23:547–559. doi: 10.1109/JBHI.2018.2806962. [DOI] [PubMed] [Google Scholar]
- 172.Kawahara J., Daneshvar S., Argenziano G., Hamarneh G. Seven-Point Checklist and Skin Lesion Classification Using Multitask Multimodal Neural Nets. IEEE J. Biomed. Health Inform. 2019;23:538–546. doi: 10.1109/JBHI.2018.2824327. [DOI] [PubMed] [Google Scholar]
- 173.Yu Z., Jiang X., Zhou F., Qin J., Ni D., Chen S., Lei B., Wang T. Melanoma Recognition in Dermoscopy Images via Aggregated Deep Convolutional Features. IEEE Trans. Biomed. Eng. 2019;66:1006–1016. doi: 10.1109/TBME.2018.2866166. [DOI] [PubMed] [Google Scholar]
- 174.Dorj U., Lee K., Choi J., Lee M. The skin cancer classification using deep convolutional neural network. Multimed. Tools Appl. 2018;77:9909–9924. doi: 10.1007/s11042-018-5714-1. [DOI] [Google Scholar]
- 175.Gavrilov D.A., Melerzanov A.V., Shchelkunov N.N., Zakirov E.I. Use of Neural Network-Based Deep Learning Techniques for the Diagnostics of Skin Diseases. Biomed. Eng. 2019;52:348–352. doi: 10.1007/s10527-019-09845-9. [DOI] [Google Scholar]
- 176.Chen M., Zhou P., Wu D., Hu L., Hassan M.M., Alamri A. AI-Skin: Skin disease recognition based on self-learning and wide data collection through a closed-loop framework. Inf. Fusion. 2020;54:1–9. doi: 10.1016/j.inffus.2019.06.005. [DOI] [Google Scholar]
- 177.Mahbod A., Schaefer G., Ellinger I., Ecker R., Pitiot A., Wang C. Fusing fine-tuned deep features for skin lesion classification. Comput. Med. Imaging Graph. 2019;71:19–29. doi: 10.1016/j.compmedimag.2018.10.007. [DOI] [PubMed] [Google Scholar]
- 178.Brinker T.J., Hekler A., Enk A.H., Klode J., Hauschild A., Berking C., Schilling B., Haferkamp S., Schadendorf D., Holland-Letz T., et al. Deep learning outperformed 136 of 157 dermatologists in a head-to-head dermoscopic melanoma image classification task. Eur. J. Cancer. 2019;113:47–54. doi: 10.1016/j.ejca.2019.04.001. [DOI] [PubMed] [Google Scholar]
- 179.Tan T.Y., Zhang L., Lim C.P. Adaptive melanoma diagnosis using evolving clustering, ensemble and deep neural networks. Knowl. Based Syst. 2020;187:104807. doi: 10.1016/j.knosys.2019.06.015. [DOI] [Google Scholar]
- 180.Khan M.A., Sharif M., Akram T., Damaševičius R., Maskeliūnas R. Skin lesion segmentation and multiclass classification using deep learning features and improved moth flame optimization. Diagnostics. 2021;11:811. doi: 10.3390/diagnostics11050811. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 181.Tschandl P., Sinz C., Kittler H. Domain-specific classification-pretrained fully convolutional network encoders for skin lesion segmentation. Comput. Biol. Med. 2019;104:111–116. doi: 10.1016/j.compbiomed.2018.11.010. [DOI] [PubMed] [Google Scholar]
- 182.Vasconcelos F.F.X., Medeiros A.G., Peixoto S.A., Filho P.P.R. Automatic skin lesions segmentation based on a new morphological approach via geodesic active contour. Cogn. Syst. Res. 2019;55:44–59. doi: 10.1016/j.cogsys.2018.12.008. [DOI] [Google Scholar]
- 183.Burlina P.M., Joshi N.J., Ng E., Billings S.D., Rebman A.W., Aucott J.N. Automated detection of erythema migrans and other confounding skin lesions via deep learning. Comput. Biol. Med. 2019;105:151–156. doi: 10.1016/j.compbiomed.2018.12.007. [DOI] [PubMed] [Google Scholar]
- 184.Maron R.C., Weichenthal M., Utikal J.S., Hekler A., Berking C., Hauschild A., Enk A.H., Haferkamp S., Klode J., Schadendorf D., et al. Systematic outperformance of 112 dermatologists in multiclass skin cancer image classification by convolutional neural networks. Eur. J. Cancer. 2019;119:57–65. doi: 10.1016/j.ejca.2019.06.013. [DOI] [PubMed] [Google Scholar]
- 185.Goyal M., Oakley A., Bansal P., Dancey D., Yap M.H. Skin Lesion Segmentation in Dermoscopic Images with Ensemble Deep Learning Methods. IEEE Access. 2020;8:4171–4181. doi: 10.1109/ACCESS.2019.2960504. [DOI] [Google Scholar]
- 186.Albert B.A. Deep Learning from Limited Training Data: Novel Segmentation and Ensemble Algorithms Applied to Automatic Melanoma Diagnosis. IEEE Access. 2020;8:31254–31269. doi: 10.1109/ACCESS.2020.2973188. [DOI] [Google Scholar]
- 187.Ahmad B., Usama M., Huang C., Hwang K., Hossain M.S., Muhammad G. Discriminative Feature Learning for Skin Disease Classification Using Deep Convolutional Neural Network. IEEE Access. 2020;8:39025–39033. doi: 10.1109/ACCESS.2020.2975198. [DOI] [Google Scholar]
- 188.Kwasigroch A., Grochowski M., Mikołajczyk A. Neural Architecture Search for Skin Lesion Classification. IEEE Access. 2020;8:9061–9071. doi: 10.1109/ACCESS.2020.2964424. [DOI] [Google Scholar]
- 189.Adegun A.A., Viriri S. Deep Learning-Based System for Automatic Melanoma Detection. IEEE Access. 2020;8:7160–7172. doi: 10.1109/ACCESS.2019.2962812. [DOI] [Google Scholar]
- 190.Song L., Lin J.P., Wang Z.J., Wang H. An End-to-end Multi-task Deep Learning Framework for Skin Lesion Analysis. IEEE J. Biomed. Health Inform. 2020 doi: 10.1109/JBHI.2020.2973614. [DOI] [PubMed] [Google Scholar]
- 191.Wei L., Ding K., Hu H. Automatic Skin Cancer Detection in Dermoscopy Images Based on Ensemble Lightweight Deep Learning Network. IEEE Access. 2020;8:99633–99647. doi: 10.1109/ACCESS.2020.2997710. [DOI] [Google Scholar]
- 192.Gong A., Yao X., Lin W. Dermoscopy Image Classification Based on StyleGANs and Decision Fusion. IEEE Access. 2020;8:70640–70650. doi: 10.1109/ACCESS.2020.2986916. [DOI] [Google Scholar]
- 193.Nasiri S., Helsper J., Jung M., Fathi M. DePicT Melanoma Deep-CLASS: A deep convolutional neural networks approach to classify skin lesion images. BMC Bioinform. 2020;21:84. doi: 10.1186/s12859-020-3351-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 194.Öztürk S., Özkaya U. Skin Lesion Segmentation with Improved Convolutional Neural Network. J. Digit. Imaging. 2020;33:958–970. doi: 10.1007/s10278-020-00343-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 195.Hosny K.M., Kassem M.A., Foaud M.M. Skin Melanoma Classification Using ROI and Data Augmentation with Deep Convolutional Neural Networks. Multimed. Tools Appl. 2020;79:24029–24055. doi: 10.1007/s11042-020-09067-2. [DOI] [Google Scholar]
- 196.Amin J., Sharif A., Gul N., Anjum M.A., Nisar M.W., Azam F., Bukhari S.A.C. Integrated design of deep features fusion for localization and classification of skin cancer. Pattern Recognit. Lett. 2020;131:63–70. [Google Scholar]
- 197.Mahbod A., Schaefer G., Wang C., Dorffner G., Ecker R., Ellinger I. Transfer learning using a multi-scale and multi-network ensemble for skin lesion classification. Comput. Methods Programs Biomed. 2020;193:105475. doi: 10.1016/j.cmpb.2020.105475. [DOI] [PubMed] [Google Scholar]
- 198.Hameed N., Shabut A.M., Ghosh M.K., Hossain M.A. Multi-class multi-level classification algorithm for skin lesions classification using machine learning techniques. Expert Syst. Appl. 2020;141:112961. doi: 10.1016/j.eswa.2019.112961. [DOI] [Google Scholar]
- 199.Zhang N., Cai Y., Wang Y., Tian Y., Wang X., Badami B. Skin cancer diagnosis based on optimized convolutional neural network. Artif. Intell. Med. 2020;102:101756. doi: 10.1016/j.artmed.2019.101756. [DOI] [PubMed] [Google Scholar]
- 200.Hasan K., Dahal L., Samarakoon P.N., Tushar F.I., Martí R. DSNet: Automatic dermoscopic skin lesion segmentation. Comput. Biol. Med. 2020;120:103738. doi: 10.1016/j.compbiomed.2020.103738. [DOI] [PubMed] [Google Scholar]
- 201.Al-masni M.A., Kim D., Kim T. Multiple skin lesions diagnostics via integrated deep convolutional networks for segmentation and classification. Comput. Methods Programs Biomed. 2020;190:105351. doi: 10.1016/j.cmpb.2020.105351. [DOI] [PubMed] [Google Scholar]
- 202.Pour M.P., Seker H. Transform domain representation-driven convolutional neural networks for skin lesion segmentation. Expert Syst. Appl. 2020;144:113129. doi: 10.1016/j.eswa.2019.113129. [DOI] [Google Scholar]
- 203.Abayomi-Alli O.O., Damaševičius R., Misra S., Maskeliūnas R., Abayomi-Alli A. Malignant skin melanoma detection using image augmentation by oversampling in non-linear lower-dimensional embedding manifold. Turk. J. Elec. Eng. Comp. Sci. 2021 in press. [Google Scholar]
- 204.Hosny K.M., Kassem M.A., Foaud M.M. Classification of Skin Lesions into Seven Classes Using Transfer Learning with AlexNet. J. Digit. Imaging. 2020;33:1325–1334. doi: 10.1007/s10278-020-00371-9. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 205.Hosny K.M., Kassem M.A., Foaud M.M. Skin Lesions Classification into Eight Classes for ISIC 2019 Using Deep Convolutional Neural Network and Transfer Learning. IEEE Access. 2020;8:114822–114832. [Google Scholar]
- 206.Ntoutsi E., Fafalios P., Gadiraju U., Iosifidis V., Nejdl W., Vidal M., Ruggieri S., Turini F., Papadopoulos S., Krasanakis E., et al. Bias in data-driven artificial intelligence systems—An introductory survey. WIREs Data Min. Knowl. Discov. 2020;10:3. doi: 10.1002/widm.1356. [DOI] [Google Scholar]
Data Availability Statement
Not applicable.