Author manuscript; available in PMC: 2025 Jul 30.
Published in final edited form as: Multimed Tools Appl. 2024 Jul 2;84(16):16971–17020. doi: 10.1007/s11042-024-19725-4

Brain magnetic resonance image (MRI) segmentation using multimodal optimization

Taymaz Akan 1,2, Amin Golzari Oskouei 3,4, Sait Alp 5, Mohammad Alfrad Nobel Bhuiyan 1
PMCID: PMC12308384  NIHMSID: NIHMS2095516  PMID: 40740598

Abstract

One of the most intensively studied problems in the medical science community is segmenting tumors from brain magnetic resonance imaging (MRI). Diagnosing malignant tumors at an early stage is necessary to provide treatment, and the patient's prognosis improves if the tumor is detected early. Medical experts currently segment brain tumors manually when making a diagnosis. This study proposes a new approach to simplify and automate this process. In recent research, multi-level segmentation has been widely used in medical image analysis, and the effectiveness and precision of a segmentation method are directly tied to the number of segments used. However, choosing the appropriate number of segments is often left up to the user and is challenging for many segmentation algorithms. The proposed method is a modified version of the 3D histogram-based segmentation method that can automatically determine an appropriate number of segments. The algorithm contains three main steps. First, a Gaussian filter smooths the 3D RGB histogram of the image, eliminating unreliable and non-dominating histogram peaks that are too close together. Next, a multimodal particle swarm optimization method identifies the histogram's peaks. Finally, each pixel is placed in the cluster that best fits its characteristics based on a non-Euclidean distance. The proposed algorithm has been applied to The Cancer Imaging Archive (TCIA) and the Brain MRI Images for Brain Tumor Detection datasets. Its results are compared with those of three clustering methods: FCM, FCM_FWCW, and FCM_FW. In the comparative analysis across various MRI slices, our algorithm consistently demonstrates superior performance, achieving the top mean rank in all three metrics, which indicates its robustness and effectiveness in clustering. The experiments confirm that the proposed method is effective and capable of finding the proper clusters.

Keywords: Image segmentation, MRI, Brain tumor, Multimodal optimization

1. Introduction

Clinicians can identify diseases earlier thanks to medical imaging, which improves patient outcomes. Appropriate medical image analysis is essential to aid specialists and promote a healthy community. The application of image processing techniques for analyzing medical images is highly successful, thanks to advanced medical equipment. Image segmentation, the first step in image processing, is one of the most significant and complex challenges in image analysis, particularly in medical applications. Segmentation divides an image into meaningful, nonoverlapping, homogeneous, connected regions with respect to color similarity. Medical image segmentation aims to isolate anatomical objects of interest for analysis and is critical in medical imaging applications [14]. There is a wide variety of image segmentation methods, including threshold-based, clustering-based, region-based, edge-based, etc. [5–9]. Other hybrid image segmentation techniques combine multiple approaches [10]. Many years of research have examined segmentation features and methods. Nevertheless, one of the restrictions is that the appropriate number of segments is a parameter that must be established a priori, and determining this value is not an easy task [11]. In addition, the problem remains difficult because, as the desired number of segments increases, the problem's computing cost increases exponentially, making it unfeasible to search all possible solutions exhaustively with exact methods.

Computerized tomography (CT), magnetic resonance imaging (MRI), electroencephalography (EEG), and positron emission tomography (PET) are the four most prevalent imaging techniques for the brain. However, MRI is the most frequently utilized method: it does not expose patients to radiation, is minimally invasive, and is widely available [12]. Furthermore, MRI can discern more clearly between fatty tissue, water, muscle, and other types of soft tissue, and it provides greater soft tissue contrast [13]. The information contained in these images is provided to medical professionals, who can utilize it to assist in the diagnosis of a wide range of illnesses and ailments. The goal of tumor segmentation in the human brain is to differentiate between normal and diseased brain tissues, specifically cerebrospinal fluid, gray matter, and white matter. When investigating brain tumors, it is currently simple to identify abnormal tissues; nevertheless, the segmentation process is challenging, making it difficult to reproduce results, characterize abnormalities, and achieve precision [13]. A tumor is the mass that forms when cancer cells expand uncontrollably. Tumors come in various subtypes and manifestations, each of which can be treated with a specific modality according to its unique symptoms. Brain segmentation results in an image with labels that show the boundaries of the different regions, or a group of contours. Segmentation methods can be further divided into bi-level methods, which divide an image into two parts, and multi-level methods, which divide an image into more than two parts [14]. Since MRI brain images contain more than two different types of regions, each of which may correspond to a unique object, the bi-level segmentation technique cannot be effective and results in under-segmentation.

As a result, multi-level image segmentation algorithms should be utilized to segment MRI brain images. Also, segmenting color (RGB) images is difficult due to the range of color intensities across three color channels, unlike gray images, which have only one [15]. Therefore, various approaches can be taken to segment MRI brain images [1, 13]. Methods for segmenting MRI brain images can be divided into three broad classes: classification-based, region-based, and boundary-based methods [16, 17]. Fuzzy c-means is a typical classification-based approach widely used in medical image segmentation [18–21]. However, these clustering methods require the proper number of segments to be specified in advance.

Furthermore, the computational time is also a consideration, since it depends on the number of clusters and the image size. The motivation of this paper is therefore to segment the image automatically, regardless of image size, and to select the appropriate number of clusters along with the actual peaks. To get around these problems, this study proposes a 3D histogram and a modified multimodal particle swarm optimization method for brain MRI segmentation.

The cluster centers can be found by detecting the peaks in a three-dimensional histogram of a color image created from the RGB values of the pixels and smoothed with a Gaussian filter. The multimodal variation of particle swarm optimization (PSO) with a local search strategy has been utilized to find the global and local peaks in the histogram, which serve as the cluster centers. The contribution of the proposed clustering-based method is that a non-Euclidean distance metric replaces the Euclidean metric for calculating pixel similarity and the movement strategy of the multimodal optimization algorithm. The number of clusters in the image can be automatically determined from the number of PSO-detected peaks. Following the discovery of peaks, individual pixels are assigned to the cluster to which they are closest under the non-Euclidean distance, which yields the final segmented brain image. The proposed algorithm has been compared with the FCM [22] clustering algorithm.

The rest of this paper is organized as follows. Section 2 presents FCM clustering. Section 3 describes the multimodal PSO algorithm. The proposed method is described in Section 4. Finally, Section 5 presents the experimental and comparative results and concludes the paper.

2. Fuzzy c-means clustering

The k-means and c-means algorithms are two of the most well-known clustering methodologies in color image segmentation. These algorithms frequently yield good results and are widely used. Nevertheless, as mentioned before, one of the constraints is that the number of clusters is a parameter that must be determined in advance, and determining the value of this parameter is not easy. In addition, the computational time is a primary concern, as it depends on the required number of clusters and the image size. The fuzzy c-means algorithm uses fuzzy memberships to assign each pixel to a category by minimizing the following objective function:

$J = \sum_{j=1}^{N}\sum_{i=1}^{c} u_{ij}^{m}\left\|x_j - v_i\right\|^2$, (1)

Where $u_{ij}$ represents pixel $x_j$'s membership in the $i$th cluster, $v_i$ is the center of the $i$th cluster, $\|\cdot\|$ is a norm metric, and $m$ is a constant. The fuzziness of the final partition is determined by the value of the parameter $m$; usually $m = 2$.

The cost function is minimized when pixels close to their cluster's centroid receive high membership values and pixels far from the centroid receive low membership values. The membership function thus reflects the probability that a pixel belongs to a particular cluster, and in this technique that probability depends only on the pixel's distance from each cluster center. The membership functions and cluster centers are updated as follows:

$u_{ij} = \dfrac{1}{\sum_{k=1}^{c}\left(\left\|x_j - v_i\right\| / \left\|x_j - v_k\right\|\right)^{2/(m-1)}}$ (2)
$v_i = \dfrac{\sum_{j=1}^{N} u_{ij}^{m} x_j}{\sum_{j=1}^{N} u_{ij}^{m}}$ (3)

The algorithm starts by randomly selecting the cluster centers. Next, c-means assigns a random membership grade to each data point for each cluster. Finally, c-means iteratively updates the cluster centers and the membership grades, attempting to place the cluster centers in the correct locations within the data set and computing each data point's degree of membership in each cluster. Each iteration minimizes an objective function that weighs a data point's distance from a cluster center by its degree of membership in that cluster.
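As an illustration, Eqs. 1–3 can be sketched in a few lines of NumPy. This is a minimal sketch for intuition, not the authors' implementation; the initialization scheme, tolerance, and iteration cap are illustrative choices:

```python
import numpy as np

def fcm(X, c, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Minimal fuzzy c-means: alternate the membership update (Eq. 2)
    and the center update (Eq. 3) until the memberships stop changing."""
    rng = np.random.default_rng(seed)
    N = X.shape[0]
    # random initial memberships; each row sums to 1
    U = rng.random((N, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        Um = U ** m
        V = Um.T @ X / Um.sum(axis=0)[:, None]         # Eq. 3: cluster centers
        d = np.linalg.norm(X[:, None, :] - V[None], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U_new = inv / inv.sum(axis=1, keepdims=True)   # Eq. 2: memberships
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return U, V
```

Calling `fcm(pixels, c)` on an `(N, 3)` array of RGB pixels returns the soft memberships and the `c` cluster centers; a hard segmentation is obtained with `U.argmax(axis=1)`.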

3. Multimodal PSO

While unimodal optimization algorithms can only identify a single global optimum (solution) within the set of options, multimodal optimization algorithms can find many local and global optimum solutions [23, 24]. Even though multimodal optimization approaches have not been explored nearly as extensively as unimodal methods, they have recently gained much attention. However, most of them share a problem with niching parameters: existing approaches have trouble determining the right niching radius [25]. In the majority of studies, basic unimodal optimization algorithms, such as the Genetic Algorithm [26–28] and the PSO algorithm [29–32], have been modified to become MMO algorithms. PSO's movement (crossover) mechanism is well suited to adaptation into a multimodal form. PSO is a nature-inspired method for unimodal stochastic optimization that has been used to solve many computational problems [33], and it has been extended to multimodal forms numerous times in the literature. PSO mimics the swarming behavior of foraging birds to move particles toward the best solution; it therefore depends on the movement of particles in the search space to determine the best value. Each particle keeps track of its own best position (the personal best) and the overall best position (the global best) attained by the population so far. Each particle (the $i$th) is associated with a position vector and a velocity vector, recorded as $x_i$ and $v_i$, respectively, which are updated according to the following equation:

$v_i(t+1) = w\,v_i(t) + R_1 C_1\left(p_i^{best} - x_i(t)\right) + R_2 C_2\left(g^{best} - x_i(t)\right)$, $\quad x_i(t+1) = x_i(t) + v_i(t+1)$ (4)

Where $t$ is the iteration number, $w$ indicates the inertia weight, and $p_i^{best}$ and $g^{best}$ correspond to the locations of the personal best and global best, respectively. $R_1$ and $R_2$ are two uniformly distributed random numbers generated in the interval [0,1]. $C_1$ and $C_2$ represent the particle's confidence in itself and in its neighbors, respectively. Unimodal PSO cannot locate multiple solutions, as all particles move toward the global best ($g^{best}$). However, PSO's particle-motion mechanism can easily be adapted to handle multimodal problems. Carrera and Coello Coello (2009) introduced a modified PSO variant for solving multimodal problems inspired by electrostatic charge interactions [32]. To locate multiple optimal solutions, each solution moves toward the solution with the greatest electrostatic interaction, calculated from the current fitness values. These interactions are determined mathematically as follows:

$F_{ij} = \dfrac{Q_i Q_j}{4\pi\varepsilon_0 r^2}$ (5)

Where $Q_i$ and $Q_j$, $r \neq 0$, and $\varepsilon_0$ refer, respectively, to the electrical charges of the interacting particles, the distance between them, and the permittivity of the vacuum. To apply these ideas in an optimization context, each solution's fitness value plays the role of the particle's electric charge. Herein, Eq. 5 is simulated as:

$F_{ij} = \alpha\,\dfrac{f\left(p_i^{best}\right) f\left(p_j^{best}\right)}{\left\|p_i^{best} - p_j^{best}\right\|^2}$ (6)

In this case, the constant scalar $4\pi\varepsilon_0$ is replaced by the variable $\alpha$, computed as in Li [34]. $g^{best}$ in Eq. 4 is replaced by $index_i = \arg\max_{j=1:M} F_{ij}$ or a constant index $j$. Here, $M$ denotes the population size.

$v_i(t+1) = w\,v_i(t) + R_1 C_1\left(p_i^{best} - x_i(t)\right) + R_2 C_2\left(p_{index_i} - x_i(t)\right)$, $\quad x_i(t+1) = x_i(t) + v_i(t+1)$ (7)
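One step of this multimodal movement rule (Eqs. 6–7) can be sketched as follows. This is a hypothetical illustration assuming a fixed inertia weight and the Euclidean denominator of Eq. 6 (the proposed method later substitutes the non-Euclidean distance); the function name and defaults are ours, not the paper's:

```python
import numpy as np

def multimodal_step(x, v, pbest, fit, w=0.9, c1=2.15, c2=2.15, alpha=1.0, rng=None):
    """One velocity/position update per Eq. 7: instead of a single gbest,
    each particle i moves toward the personal best of the particle j that
    maximizes the interaction F_ij of Eq. 6."""
    rng = rng or np.random.default_rng(0)
    M = x.shape[0]
    # squared pairwise distances between personal bests (Eq. 6 denominator)
    d2 = ((pbest[:, None, :] - pbest[None, :, :]) ** 2).sum(-1)
    F = alpha * fit[:, None] * fit[None, :] / (d2 + 1e-12)  # Eq. 6
    np.fill_diagonal(F, -np.inf)          # a particle does not attract itself
    idx = F.argmax(axis=1)                # index_i = argmax_j F_ij
    r1, r2 = rng.random((M, 1)), rng.random((M, 1))
    v_new = w * v + r1 * c1 * (pbest - x) + r2 * c2 * (pbest[idx] - x)  # Eq. 7
    return x + v_new, v_new
```

Because each particle follows its own most-interacting neighbor rather than one global best, subpopulations can settle on different histogram peaks instead of collapsing to a single optimum.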

4. Proposed method

In the 3DHP method, all global and local peaks within a 3D color histogram, corresponding to the clusters' center points, were located using the aforementioned multimodal approach with a local search strategy. A pixel's color in RGB-model images is derived from a weighted combination of red, green, and blue components, so each pixel in an image can be represented as a three-dimensional feature vector made up of its three color components. A 3D histogram can then be constructed over these three color axes [35]. The presence of peaks in the histogram indicates that the image comprises multiple distinct segments, with each peak corresponding to a particular segment. In 3DHP, a three-dimensional Gaussian filter was applied to the three-dimensional histogram to reduce the effect of noise and produce a smoothed histogram. This procedure also eliminates insignificant smaller peaks that may have been present in the histogram. Figure 1 illustrates the three-dimensional histogram, the original color distribution, and the color distribution after smoothing for the image of Lena. The 3D histogram was treated as the objective function, with the positions of peaks as the solution space; the number of pixels at a particular position corresponds to the fitness value. Moreover, an additional local search step proposed in [36] was integrated into multimodal PSO to enhance its local search ability. Finally, the fitness values are used to examine the neighbors of the $i$th particle, and the following changes are made to the position of the $i$th particle:

$temp_d = \begin{cases} p_{d,i}^{best} + C_1 \cdot rand \cdot \left(p_{d,i}^{best\_nearest} - p_{d,i}^{best}\right), & f(bestNearest_i) \ge f(pbest_i) \\ p_{d,i}^{best} + C_1 \cdot rand \cdot \left(p_{d,i}^{best} - p_{d,i}^{best\_nearest}\right), & f(bestNearest_i) < f(pbest_i) \end{cases}, \quad d = 1, \dots, D$ (8)
$f(temp) > f(pbest_i) \Rightarrow pbest_i = temp$ (9)

Fig. 1 Graphics of the 3D histogram, RGB color distribution, and Lenna's smoothed color distribution: (a) original Lenna image, (b) three-dimensional histogram of Lenna, (c) and (d) the normal and smoothed RGB representations of Lenna [11]

Where $bestNearest_i$ is the particle closest in distance to the $i$th particle, $D$ is the number of dimensions, and $temp$ is a new position for the $i$th particle. The new position replaces the particle's position ($x_i$) if it is determined to be superior to the current one. Consequently, particles need not all move to a single global optimum, and other possible local solutions are not missed. In the next step, $K$ dominant peaks are located. Then, $K$ sets of peak intensity levels corresponding to cluster centers are automatically obtained for each RGB component. These peaks are represented as $p_1^{rgb} = (r_1, g_1, b_1),\ p_2^{rgb} = (r_2, g_2, b_2),\ \dots,\ p_K^{rgb} = (r_K, g_K, b_K)$.
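A minimal sketch of this local search (Eqs. 8–9), assuming `f` is the (smoothed) histogram height at a position; the greedy acceptance follows Eq. 9, while the function name and defaults are illustrative:

```python
import numpy as np

def local_search(pbest, pbest_fit, f, c1=2.15, rng=None):
    """Local search of Eqs. 8-9: each personal best steps toward its nearest
    other personal best when that neighbor is fitter (else away from it), and
    the move is kept only if the new point has a higher fitness."""
    rng = rng or np.random.default_rng(0)
    M = pbest.shape[0]
    d = np.linalg.norm(pbest[:, None] - pbest[None], axis=2)
    np.fill_diagonal(d, np.inf)               # ignore self-distances
    for i in range(M):
        j = d[i].argmin()                     # bestNearest_i
        step = c1 * rng.random() * (pbest[j] - pbest[i])
        # Eq. 8: direction depends on which of the two bests is fitter
        temp = pbest[i] + (step if pbest_fit[j] >= pbest_fit[i] else -step)
        if f(temp) > pbest_fit[i]:            # Eq. 9: greedy acceptance
            pbest[i], pbest_fit[i] = temp, f(temp)
    return pbest, pbest_fit
```

The greedy acceptance in Eq. 9 guarantees that no particle's personal-best fitness ever decreases during this step.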

Additionally, to eliminate non-dominant clusters, it is beneficial to limit how close two peaks may lie to each other. Therefore, a dominant peak eliminates all non-dominant peaks within its radius, based on a distance limit parameter. It is essential to note that this procedure is not mandatory and can be skipped if desired. For 3DHP, this parameter was set to 80 pixels. Ultimately, every pixel is assigned to the peak closest to it in terms of the Euclidean distance. The following equation was used to calculate the Euclidean distance between the $k$th peak and the $(i,j)$th pixel:

$\left\|p_k^{rgb} - I_{i,j}^{rgb}\right\| = \sqrt{\left(p_k^{r} - I_{i,j}^{r}\right)^2 + \left(p_k^{g} - I_{i,j}^{g}\right)^2 + \left(p_k^{b} - I_{i,j}^{b}\right)^2}$ (10)
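The optional dominance filtering described above can be sketched as follows, using the 80-pixel distance limit of 3DHP; the strongest-first greedy strategy and the function name are our assumptions:

```python
import numpy as np

def filter_peaks(peaks, heights, radius=80.0):
    """Keep only dominant peaks: a peak is dropped when a higher peak
    already lies within `radius` of it (the 3DHP distance limit)."""
    order = np.argsort(heights)[::-1]         # visit strongest peaks first
    kept = []
    for i in order:
        if all(np.linalg.norm(peaks[i] - peaks[j]) > radius for j in kept):
            kept.append(i)
    return peaks[sorted(kept)]                # surviving peaks, original order
```

The number of rows returned is the automatically determined cluster count K.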

Also, a non-Euclidean distance criterion was proposed in [37] and later applied to color image segmentation using FCM. This distance is calculated as:

$ned\left(x_i, c_j\right) = \sum_{a=1}^{A}\left(1 - e^{-\left(x_{i,a} - c_{j,a}\right)^2}\right)$ (11)

Where A is the number of features.

In the proposed method, the denominator of the fraction in Eq. 6, $\left\|p_i^{best} - p_j^{best}\right\|^2$, is replaced by the non-Euclidean distance of Eq. 11, which in this case can be expressed as:

$ned\left(p_i^{best}, p_j^{best}\right) = \sum_{d=1}^{D=3}\left(1 - e^{-\left(p_{i,d}^{best} - p_{j,d}^{best}\right)^2}\right)$ (12)

Therefore:

$F_{ij} = \alpha\,\dfrac{f\left(p_i^{best}\right) f\left(p_j^{best}\right)}{ned\left(p_i^{best}, p_j^{best}\right)}$ (13)

Also, after locating the histogram peaks, every pixel is assigned to the closest peak (cluster head) in terms of the non-Euclidean distance instead of the Euclidean distance. Consequently, Eq. 10 can be reformulated as:

$ned\left(p_k^{rgb}, I_{i,j}^{rgb}\right) = 1 - e^{-\left[\left(p_k^{r} - I_{i,j}^{r}\right)^2 + \left(p_k^{g} - I_{i,j}^{g}\right)^2 + \left(p_k^{b} - I_{i,j}^{b}\right)^2\right]}$ (14)
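Pixel assignment under Eq. 14 can be sketched as below. Note that since $1 - e^{-d}$ is monotone in $d$, the nearest peak is the same as under the squared Euclidean distance; the code nonetheless evaluates Eq. 14 literally. The function name is illustrative:

```python
import numpy as np

def assign_pixels(img, peaks):
    """Assign every pixel to the closest peak under the non-Euclidean
    distance of Eq. 14: ned = 1 - exp(-||p_k - I_ij||^2)."""
    X = img.reshape(-1, 3).astype(float)                 # (H*W, 3) pixel vectors
    sq = ((X[:, None, :] - peaks[None, :, :]) ** 2).sum(-1)
    ned = 1.0 - np.exp(-sq)                              # Eq. 14 per pixel/peak pair
    return ned.argmin(axis=1).reshape(img.shape[:2])     # label map (H, W)
```

The returned label map is the final segmentation: each label indexes one detected histogram peak.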

It is worth mentioning that a Gaussian-smoothing preprocessing step is applied to the RGB image before calculating the 3D histogram. The $\sigma$ and window size for this filter are set to 0.5 and 3 × 3, respectively.
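The 3D histogram construction and its Gaussian smoothing can be sketched with SciPy's `gaussian_filter`; the bin count and histogram-smoothing sigma here are illustrative assumptions, not the paper's values:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def smoothed_histogram(img, bins=64, sigma=1.0):
    """Build the 3D RGB histogram of an image and smooth it with a 3D
    Gaussian filter so that weak, closely spaced peaks are merged away."""
    # img: (H, W, 3) uint8 array; quantize each channel into `bins` levels
    q = (img.astype(np.int64) * bins) // 256
    hist = np.zeros((bins, bins, bins), dtype=float)
    np.add.at(hist, (q[..., 0], q[..., 1], q[..., 2]), 1.0)
    return gaussian_filter(hist, sigma=sigma)   # smoothed 3D histogram
```

Local maxima of the returned array are the candidate cluster centers that the multimodal PSO then searches for.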

The flow diagram of the overall method is illustrated in Fig. 2.

Fig. 2 The flow diagram of the overall method. A Gaussian filter smooths the 3D RGB histogram, eliminating unreliable and non-dominating peaks; a multimodal particle swarm optimization method identifies the remaining peaks; and each pixel is assigned to the nearest peak based on the non-Euclidean distance

5. Experimental results and performance evaluation

In this section, extensive experiments are performed to evaluate the proposed method. Its results are compared with the well-known FCM [22], FCM_FW [3], and FCM_FWCW [38]. The required parameters of the proposed method and their values are shown in Table 1. The fuzziness parameter in all soft (fuzzy) clustering methods is set to 2, and the maximum number of iterations for FCM, FCM_FW, and FCM_FWCW is 100.

Table 1.

Required parameters of the proposed method

Parameter	Value
Population size	350
Number of iterations	350
w_max	0.98
w_min	0.3
C1	2.15
C2	2.15

5.1. Dataset

We used the following datasets to evaluate the proposed method:

5.2. Evaluation metrics

As the brain MRI slices are heterogeneous, qualitative (visual) evaluation of different methods is insufficient to analyze the results accurately. Therefore, quantitative metrics are needed to evaluate the results of various methods [40]. In the experiments, the following two groups of metrics are used to measure the performance of algorithms.

1) Internal clustering metrics:

For this group, a lower value indicates a better segmentation result.

F: this metric penalizes over-segmentation [41] (segmenting one region of the image into more than one segment):

$F = \dfrac{1}{1000\,(M \times N)}\,\sqrt{R}\,\sum_{i=1}^{R}\dfrac{e_i^2}{\sqrt{A_i}}$ (15)

where $M$ and $N$ represent the length and width of the input image, $R$ is the number of segmented regions, $A_i$ indicates the number of pixels in the $i$th segmented region, $e_i$ indicates the color error in region $i$, and $\sqrt{R}$ is a penalizing term that discourages over-segmentation.

F′: this metric penalizes over-segmentation and is noise-robust [42]:

$F' = \dfrac{1}{10000\,(M \times N)}\,\sqrt{\sum_{A=1}^{Max}\left[R(A)\right]^{1+\frac{1}{A}}}\,\sum_{i=1}^{R}\dfrac{e_i^2}{\sqrt{A_i}}$ (16)

where $R(A)$ denotes the number of regions with area exactly $A$, and $Max$ is the largest region area.

Q: this metric penalizes non-homogeneous regions [42]:

$Q = \dfrac{1}{10000\,(M \times N)}\,\sqrt{R}\,\sum_{i=1}^{R}\left[\dfrac{e_i^2}{1+\log A_i} + \left(\dfrac{R(A_i)}{A_i}\right)^2\right]$ (17)
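As a concrete example, the F measure of Eq. 15 can be computed as follows, assuming $e_i$ is the Euclidean color error of region $i$ relative to its mean color; the function name is ours:

```python
import numpy as np

def liu_yang_F(img, labels):
    """Evaluation function F of Eq. 15: sqrt(R) times the sum over regions of
    (squared color error / sqrt(region area)), scaled by the image size."""
    M, N = labels.shape
    X = img.reshape(-1, 3).astype(float)
    L = labels.ravel()
    regions = np.unique(L)
    total = 0.0
    for r in regions:
        px = X[L == r]
        e2 = ((px - px.mean(axis=0)) ** 2).sum()   # squared color error e_i^2
        total += e2 / np.sqrt(len(px))
    return np.sqrt(len(regions)) * total / (1000.0 * M * N)
```

A perfectly homogeneous segmentation has zero color error in every region and hence F = 0; splitting homogeneous regions further raises the $\sqrt{R}$ penalty.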

2) External clustering metrics:

For this group, a higher value indicates a better segmentation result.

Accuracy:

It is the number of correctly predicted pixels divided by the total number of pixels. This metric is calculated by Eq. (18).

$Accuracy = \dfrac{TP + TN}{TP + TN + FP + FN}$ (18)
Precision:

It is the ratio of correct positive prediction pixels to the number of positive pixels predicted. This metric is calculated by Eq. (19).

$Precision = \dfrac{TP}{TP + FP}$ (19)
Recall:

It is the ratio of the number of correct positive prediction pixels to the number of all relevant pixels. This metric is calculated by Eq. (20).

$Recall = \dfrac{TP}{TP + FN}$ (20)
F1 Score:

It is the harmonic mean between Precision and Recall. This metric is calculated by Eq. (21).

$F1\;Score = \dfrac{2 \times Recall \times Precision}{Recall + Precision}$ (21)
Specificity:

The Specificity rate corresponds to the proportion of negative pixels that are correctly considered negative concerning all negative pixels. This metric is calculated by Eq. (22).

$Specificity = \dfrac{TN}{TN + FP}$ (22)

In Eqs. (18) to (22), TP, FP, TN, and FN represent True Positive, False Positive, True Negative, and False Negative, respectively.

In our experiments, these metrics are expressed as percentages; a higher percentage indicates better performance.
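The external metrics of Eqs. 18–22 can be computed from binary tumor masks as sketched below (returned as percentages; the guards against empty classes are an illustrative choice, not part of the equations):

```python
import numpy as np

def external_metrics(pred, truth):
    """Compute the external metrics of Eqs. 18-22 from two binary masks
    (tumor = 1, background = 0), returned as percentages."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    tp = np.sum(pred & truth)         # correctly predicted tumor pixels
    tn = np.sum(~pred & ~truth)       # correctly predicted background pixels
    fp = np.sum(pred & ~truth)
    fn = np.sum(~pred & truth)
    acc = (tp + tn) / (tp + tn + fp + fn)              # Eq. 18
    prec = tp / max(tp + fp, 1)                        # Eq. 19
    rec = tp / max(tp + fn, 1)                         # Eq. 20
    f1 = 2 * prec * rec / max(prec + rec, 1e-12)       # Eq. 21
    spec = tn / max(tn + fp, 1)                        # Eq. 22
    return {k: 100.0 * v for k, v in
            {"accuracy": acc, "precision": prec, "recall": rec,
             "f1": f1, "specificity": spec}.items()}
```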

5.3. Experiment 1: Visualization-based analysis using internal metrics

In this section, we evaluate the proposed method against the other methods qualitatively and quantitatively. Several images have been selected from each dataset for the quality assessment. In selecting these images, we tried to cover different types of tumors: small and large tumor sizes, different tumor tissues, spherical and non-spherical tumor shapes, and different lighting conditions. 23 images are selected from the first dataset, 15 from the second, and 15 from the third.

Figures 3, 4, and 5 demonstrate the visual qualitative results on the first, second, and third datasets, respectively. The segmentation results for each method are displayed by applying a distinct color set to the base image to highlight the clusters obtained. Tables 2 (first dataset), 4 (second dataset), and 6 (third dataset) list the peak locations of the three-dimensional histogram and the cluster centroids for each cluster obtained by FCM, FCM_FW [3], and FCM_FWCW. Likewise, Tables 3, 5, and 7 show the numerical and quantitative analysis of the results from each of the tested methods.

Fig. 3 Original MRI and segmented MRI slices by the M3DHP, FCM, FCM_FW and FCM_FWCW (first dataset)

Fig. 4 Original MRI and segmented MRI slices by the M3DHP, FCM, and FCM_FWCW on the second dataset

Fig. 5 Original MRI and segmented MRI slices by the M3DHP, FCM, and FCM_FWCW on the third dataset

Table 2.

Cluster heads and centers of proposed and other methods (first dataset)

M3DHP FCM FCM_FWCW FCM_FW
TCGA_HT_7877_19980917_23 (number of clusters = 11)
1 59 80,153,134,109,146,121,166,104,184 62 1 43 28 4 55,103,133 78 16,161 1 4 23 49 57 64 80 4 30 61,146
1 59 80,153,134,108,147,121,169,104,183 62 1 43 28 4 55,103,133 78 16,161 1 4 23 49 57 64 80 4 30 61,146
1 60 80,153,135,109,145,121,161 96,183 62 1 43 28 4 55,103,133 78 16,161 1 4 23 49 57 64 80 4 30 61,146
TCGA_DU_7014_19860618_45 (number of clusters = 11)
1 34 30 17 31 34 74 37 72 71 46 30 26 32 1 29 2 46 59 35 14 73 1 2 14 25 28 30 31 36 47 60 74 2 3 3 20 31 31 36 40 55 72
1 32 21 17 75 83 77 68 69 73 67 76 18 35 3 27 5 45 59 29 12 75 3 4 12 25 23 74 31 31 46 60 75 4 5 6 17 31 74 32 38 54 74
1 36 35 26 30 35 86 38 78 97 78 31 50 34 1 32 2 55 73 37 25 84 1 2 26 30 53 31 33 37 57 73 84 2 3 4 32 33 32 38 56 68 84
TCGA_DU_5855_19951217_22 (number of clusters = 12)
2 46 30 39 52 39 33 47 54 33 65 27 41 29 32 39 45 62 9 21 49 1 43 1 1 13 24 32 40 42 45 46 51 2 29 43 47 59
1 58 37 50 43 33 98 45 54 49 62 32 95 36 35 44 56 49 13 26 51 2 63 1 2 17 31 38 53 42 57 52 54 2 35 63 55 50
1 47 42 43 74 55 41 61 83 60 75 54 40 38 46 45 45 78 20 35 48 1 44 1 1 25 33 39 43 54 46 49 51 2 41 44 48 72
TCGA_HT_8105_19980826_26 (number of clusters = 7)
2 47 83 28 98 98,152 65 50 21 84 40 2104 3 22 46 61 82,104 3 27 52 91
1 47 84 28 97,107,145 65 50 21 84 40 2104 3 22 46 61 82,104 3 27 52 91
1 48 83 31 97,102,155 65 50 21 84 40 2104 3 22 46 61 82,104 3 27 52 91
TCGA_FG_6688_20020215_24 (number of clusters = 8)
4 74,122 63,118,134,134,107 36 70 75 52 20 61,117 3 4 24 39 53 62 71 76,117 4 23 38 53 62 71 76,116
1 25 46 42 50 51 43 35 11 23 23 21 6 23 44 1 2 7 13 22 25 24 24 45 2 7 13 22 25 24 24 45
3 64,105 61,114,120,111 62 37 61 64 49 21 54,101 2 3 25 40 49 55 61 65,102 3 24 39 49 55 61 65,100
TCGA_DU_7014_19860618_30 (number of clusters = 4)
1 33 25 77 2 76 29 37 3 28 35 75 3 27 35 74
1 30 57 81 4 80 29 36 5 30 33 79 5 29 33 77
1 36 28 91 3 91 35 63 4 34 48 91 4 34 46 90
TCGA_FG_A60K_20040224_40 (number of clusters = 7)
1 84,100 77 45 51,161 21 48 76,105,124 90 1 2 47 79 96,112 4 95
1 85,100 76 46 51,160 21 48 76,105,124 90 1 2 47 79 96,112 4 95
1 85,100 76 47 54,160 21 48 76,105,124 90 1 2 47 79 96,112 4 95
TCGA_DU_5872_19950223_38 (number of clusters = 24)
1 61 30 47 34 38 18,155,157,138 89,131,137,145,101,129,117,158 31,131,128,104,143,126 1 56,101 63 62 46,127 2 33 48,154 34 3 49 53,162 63 5 28 5 52 3 25 2 1 1 3 3 3 4 4 4 5 6 6 15 22 26 29 30 34 47 51 55 56 57 61 66 1 3 4 4 4 5 16 25 29 31 36 46 48 50 54 55 61 63 64 66,103,135,157
1 60 12 61 19 15 17,179,165,172,122,13,183,139,102,177,132,185 30,131,179 88,178,199 1 54,116 52 68 30,144 10 16,117,158 49 13 60 77,183 60 10 10 7 66 6 25 8 1 8 8 11 13 6 8 15 10 8 11 19 16 17 40 11 23 65 64 62 61 60 60 60 2 10 7 9 12 10 19 60 11 32 20 56,115 64 71 59 56 69 61 56,118,150,170
1 55 27 46 55 41 32,146,139,133,100,120,135,120,104,142,102,136 20,104,127,117,146,114 1 51,109 56 56 62,125 3 53 46,141 38 3 47 50,149 56 3 25 3 49 3 27 3 1 2 4 4 4 4 4 4 4 4 4 19 32 25 36 26 41 44 48 52 52 53 57 59 1 4 4 4 4 4 21 27 26 36 57 47 46 48 50 51 55 57 57 60,110,129,147
TCGA_DU_7306_19930512_21 (number of clusters = 7)
2 17 6 3 4 23 27 9 1 18 6 14 13 38 15 1 7 14 18 20 38 1 2 11 17 20 36
1 39 37 82 34 46 34 3 38 11 20 34 90 43 4 13 36 38 39 89 3 5 29 38 39 83
1 17 3 29 28 26 22 1 17 7 17 12 32 14 1 8 13 16 19 32 1 2 11 15 19 32
TCGA_DU_7013_19860523_24 (number of clusters = 7)
1 61 67,124,134 85 85 128 67 34 4 54 91 13 5 15 37 56 68 91,128 7 19 39 56 68 91,127
1 61 66,124,133 89 88 128 67 34 4 54 91 13 5 15 37 56 68 91,128 7 19 39 56 68 91,127
1 60 67,125,133 81 94 128 67 34 4 54 91 13 5 15 37 56 68 91,128 7 19 39 56 68 91,127
TCGA_HT_A616_19991226_19 (number of clusters = 11)
1 41,113 85 22 79,106,185,138,193,144 43 38 1 51 3 94,120 22 64 12 33 3 19 38 47 62 90,118 2 4 23 38 48 97
1 41,114 87 15 79,103,185,139,191,148 43 38 1 51 3 94,120 22 64 12 33 3 19 38 47 62 90,118 2 4 23 38 48 97
1 41,114 87 17 80,104,186,134,194,143 43 38 1 51 3 94,120 22 64 12 33 3 19 38 47 62 90,118 2 4 23 38 48 97
TCGA_HT_8106_19970727_12 (number of clusters = 23)
1 81 91 87 72 58 72 62 83 87 53 49 57 89 47,155,126,138 73 72 88,131,138 130 88 40 32 82,169 83 53 73 75 14 67 80 88 59 84 4 56 76 1 1 88 91 1 2 5 17 31 44 56 57 68 68 75 75 81 82 83 86 90 90 90 92,135,171 1 1 1 4 15 25 39 43 57 60 68 71 74 79 82 83 85 89 91 92,120,148
1 71 68 84 74 22 59 68 69,164 14 36 49,112 30,132,105,126 86 62,141,121,149 111 68 24 28 68,152,116 52 54 73 14 65 81 91 39,159 5 23 66 2 4 77,175 5 3 6 16 26 36 24 56 46 67 68 82 71 86,154 71 71 82,104 174 117 153 2 4 5 5 15 22 23 41 31 60 67 52 75 76 69,150 83 72 87,172 99,131
1 78,102 74 65 91 63 54 67 70 85,103 95 97 74,159,148,157 61 52 67,142,169 151 94 72 42 81,163 74 50,142 66 28 60 75 85,116 69 10 94 73 2 3 90 85 4 3 9 31 47 60 95 61,131 62 72 66 78 76 68 87 96 93 86 86,151,164 2 3 5 5 27 40 68 49,100 56 63,130 66 73 81 69 81 94 91 84,144,160
TCGA_FG_A4MU_20030903_25 (number of clusters = 16)
63 89 71 72 98 83 43 77,124 77,117 91,116 90,118 137 63 79 67 68,123 61,126 87 50 87 64,104 72 88 75 91 52 61 62 67 67 68 74 80 82 89 91 91,100,116,126 52 62 68 68 68 76 77 87 88 90 91,102,114,125
1 50 56 66 53 45 98 46 94 80 90 22 82 72 90 90 53 50 20 58 77 2 92 46 98 43 11 64 52 53 47 46 98 3 3 18 53 59 55 37 50 55 46 50 56 77 93 99 3 20 53 59 48 52 42 54 46 49 60 75 92
1 60 56 59 29 29 46 92,105 54,112 44,105 59 83 92 54 56 37 54 87 1105 58 50 29 18 37 55 60 90 60 51 2 2 31 55 55 57 36 58 61 60 61 32 91,106 51 2 31 55 55 90 57 29 61 60 61 34 91,106
TCGA_FG_7634_20000128_21 (number of clusters = 9)
3 47 18 24 66 79 59 40 59 27 14 13 76 2 23 45 55 14 4 16 21 26 32 33 34 76 4 14 16 25 28 32 33 75
1 51 54,100,119,140 90,125 57 51 35 54,139 2 95 50 98 17 4 27 59 45 85 52,102,139 3 18 34 66 54 98 48,136
1 51 54 99,120,140 93,125 52 51 35 54,139 2 95 50 98 17 4 27 59 45 85 52,102,139 3 18 34 66 54 98 48,136
TCGA_FG_6690_20020226_32 (number of clusters = 12)
1 70 61 47 32 23 40 93 46 61,106 43 97 3 47 52 75 26 44 1 71 35 42 60 2 21 30 39 42 47 49 57 65 73 78,100 2 28 39 44 47 55 68 94
1 42 43 81 21 6 43,150 52 37,162 34 151 2 80 53,115 11 42 1 40 23 24 42 2 8 25 35 28 79 46 47 43 41,116,151 2 21 30 43 78 47 45,139
1 57 50 37 35 19 31 76 51 40 79 24 86 3 41 45 74 28 39 1 57 41 74 49 2 21 34 40 75 41 42 47 53 59 80 89 2 32 58 40 42 46 58 87
TCGA_DU_7014_19860618_47 (number of clusters = 6)
[Numeric cluster-head and cluster-center values are illegible in the source and are omitted. The remaining first-dataset images and the number of clusters determined for each are:]
TCGA_DU_5855_19951217_17 (number of clusters = 14)
TCGA_CS_6667_20011105_11 (number of clusters = 18)
TCGA_DU_5855_19951217_23 (number of clusters = 10)
TCGA_FG_6688_20020215_28 (number of clusters = 7)
TCGA_FG_7634_20000128_27 (number of clusters = 9)
TCGA_FG_A60K_20040224_57 (number of clusters = 6)
TCGA_HT_8105_19980826_30 (number of clusters = 5)

Table 4.

Cluster heads and centers of proposed and other methods (second dataset)

M3DHP FCM FCM_FWCW FCM_FW
[Numeric cluster-head and cluster-center values are illegible in the source and are omitted. The second-dataset images and the number of clusters determined for each are:]
Y108 (number of clusters = 13)
Y91 (number of clusters = 15)
Y162 (number of clusters = 14)
Y6 (number of clusters = 13)
Y96 (number of clusters = 14)
Y51 (number of clusters = 7)
Y188 (number of clusters = 14)
Y103 (number of clusters = 8)
Y257 (number of clusters = 14)
Y104 (number of clusters = 13)
Y26 (number of clusters = 8)
Y74 (number of clusters = 13)
Y46 (number of clusters = 17)
Y55 (number of clusters = 14)
Y27 (number of clusters = 14)

Table 6.

Cluster heads and centers of proposed and other methods (third dataset)

M3DHP FCM FCM_FWCW FCM_FW
[Numeric cluster-head and cluster-center values are illegible in the source and are omitted. The third-dataset images and the number of clusters determined for each are:]
1 (number of clusters = 13)
38 (number of clusters = 10)
70 (number of clusters = 10)
72 (number of clusters = 10)
112 (number of clusters = 10)
118 (number of clusters = 11)
133 (number of clusters = 16)
155 (number of clusters = 9)
163 (number of clusters = 9)
193 (number of clusters = 14)
196 (number of clusters = 11)
198 (number of clusters = 13)
202 (number of clusters = 12)
217 (number of clusters = 11)
220 (number of clusters = 12)

Table 3.

Statistical results of the first dataset

Name Metric M3DHP FCM FCM_FWCW FCM_FW
TCGA_HT_7877_19980917_23 F 1.0915e-06 2.8215e-06 1.2423e-06 5.5836e-06
F’ 1.0915e-07 2.8215e-07 1.2423e-07 5.5836e-07
Q 5.8614e-07 4.6243e-06 2.98e-06 3.5379e-06
TCGA_DU_7014_19860618_45 F 1.348e-06 3.4175e-06 3.2885e-06 6.0497e-06
F’ 1.348e-07 3.4175e-07 3.2885e-07 6.0497e-07
Q 1.0571e-06 6.1599e-06 5.8026e-06 3.8491e-06
TCGA_DU_5855_19951217_22 F 1.5969e-06 3.0059e-06 1.2229e-06 2.218e-06
F’ 1.5969e-07 3.0059e-07 1.2229e-07 2.218e-07
Q 1.0242e-06 5.1063e-06 2.6194e-06 1.5561e-06
TCGA_HT_8105_19980826_26 F 8.7984e-07 8.2234e-07 6.9341e-07 1.6573e-06
F’ 8.7984e-08 8.2234e-08 6.9341e-08 1.6573e-07
Q 4.8981e-07 2.0828e-06 1.8582e-06 1.5417e-06
TCGA_FG_6688_20020215_24 F 9.4135e-07 5.8185e-06 5.7484e-06 3.6052e-06
F’ 9.4135e-08 5.8185e-07 5.7484e-07 3.6052e-07
Q 7.3559e-07 7.3364e-06 7.2822e-06 2.7147e-06
TCGA_DU_7014_19860618_30 F 3.4707e-07 2.002e-06 1.7636e-06 2.8665e-06
F’ 3.4707e-08 2.002e-07 1.7636e-07 2.8665e-07
Q 5.8773e-07 3.4258e-06 3.1991e-06 2.9416e-06
TCGA_FG_A60K_20040224_40 F 1.0316e-06 8.6835e-07 5.5555e-07 7.5381e-07
F’ 1.0316e-07 8.6835e-08 5.5555e-08 7.5381e-08
Q 7.3167e-07 2.1293e-06 1.648e-06 1.12e-06
TCGA_DU_5872_19950223_38 F 3.8285e-05 2.4414e-05 4.6159e-05 6.1421e-05
F’ 3.9093e-06 2.4414e-06 4.6159e-06 6.1421e-06
Q 1.4907e-05 3.1027e-05 4.852e-05 2.8252e-05
TCGA_DU_7306_19930512_21 F 4.2638e-07 4.86e-06 4.0295e-06 1.7874e-06
F’ 4.2638e-08 4.86e-07 4.0295e-07 1.7874e-07
Q 3.87e-07 6.1229e-06 5.2312e-06 1.2685e-06
TCGA_DU_7013_19860523_24 F 4.292e-05 9.7622e-06 9.9038e-06 3.1462e-06
F’ 4.292e-06 9.7622e-07 9.9038e-07 3.1462e-07
Q 1.3489e-05 1.2307e-05 1.2358e-05 2.2102e-06
TCGA_HT_A616_19991226_19 F 4.3661e-06 1.0658e-05 6.5489e-06 5.0238e-06
F’ 4.3661e-07 1.0658e-06 6.5489e-07 5.0238e-07
Q 1.0376e-06 1.0934e-05 6.2287e-06 3.032e-06
TCGA_HT_8106_19970727_12 F 1.6255e-05 3.4835e-05 2.9403e-05 6.398e-05
F’ 1.6255e-06 3.4835e-06 2.9403e-06 6.398e-06
Q 9.6308e-06 3.1458e-05 2.7419e-05 3.1075e-05
TCGA_FG_A4MU_20030903_25 F 4.199e-06 3.6253e-05 2.8401e-05 1.7958e-05
F’ 4.199e-07 3.6253e-06 2.8401e-06 1.7958e-06
Q 3.4396e-06 3.8506e-05 3.1563e-05 1.11e-05
TCGA_FG_7634_20000128_21 F 2.1719e-06 1.6669e-05 1.2824e-05 9.1281e-06
F’ 2.1719e-07 1.6669e-06 1.2824e-06 9.1281e-07
Q 1.6485e-06 1.873e-05 1.4851e-05 6.3491e-06
TCGA_FG_6690_20020226_32 F 3.278e-06 6.0225e-06 4.4616e-06 1.2786e-05
F’ 3.278e-07 6.0225e-07 4.4616e-07 1.2786e-06
Q 2.8851e-06 8.2565e-06 5.738e-06 8.3646e-06
TCGA_DU_7014_19860618_47 F 7.5072e-07 7.3005e-07 7.1976e-07 4.7852e-06
F’ 7.5072e-08 7.3005e-08 7.1976e-08 4.7852e-07
Q 7.3365e-07 1.9358e-06 1.9276e-06 3.5308e-06
TCGA_DU_5855_19951217_17 F 4.5493e-06 1.3564e-05 5.668e-06 7.9173e-06
F’ 4.5493e-07 1.3564e-06 5.668e-07 7.9173e-07
Q 1.9563e-06 1.4077e-05 6.6236e-06 4.1431e-06
TCGA_CS_6667_20011105_11 F 9.8201e-06 4.8824e-05 2.971e-05 2.2404e-05
F’ 9.8201e-07 4.8824e-06 2.971e-06 2.2404e-06
Q 5.4688e-06 3.7546e-05 2.9517e-05 1.3535e-05
TCGA_DU_5855_19951217_23 F 9.2185e-07 4.7199e-06 2.3582e-07 2.2748e-06
F’ 9.2185e-08 4.7199e-07 2.3582e-08 2.2748e-07
Q 7.4472e-07 2.6525e-06 3.0045e-07 1.5694e-06
TCGA_FG_6688_20020215_28 F 7.4779e-07 3.1794e-06 4.3233e-07 3.1316e-06
F’ 7.4779e-08 3.1794e-07 4.3233e-08 3.1316e-07
Q 7.163e-07 2.5181e-06 6.0248e-07 2.4944e-06
TCGA_FG_7634_20000128_27 F 1.6364e-06 7.0829e-06 7.5158e-07 5.0282e-06
F’ 1.6364e-07 7.0829e-07 7.5158e-08 5.0282e-07
Q 1.2841e-06 4.2955e-06 8.7233e-07 3.6228e-06
TCGA_FG_A60K_20040224_57 F 3.1745e-07 2.8192e-06 2.8656e-07 2.1684e-06
F’ 3.1745e-08 2.8192e-07 2.8656e-08 2.1684e-07
Q 3.6815e-07 1.7924e-06 3.8077e-07 1.6941e-06
TCGA_HT_8105_19980826_30 F 2.6643e-07 1.6962e-06 2.3248e-07 1.6918e-06
F’ 2.6643e-08 1.6962e-07 2.3248e-08 1.6918e-07
Q 2.7046e-07 1.1885e-06 2.8371e-07 1.1948e-06
Mean ranks F 1.9565 2.5652 2.8696 2.6087
F’ 1.9565 2.5652 2.8696 2.6087
Q 1.3913 3.2609 3.0870 2.2609

Table 5.

Statistical results of the second dataset

Name Metric M3DHP FCM FCM_FWCW FCM_FW
Y108 F 1.2046E-08 1.7083e-06 1.4207e-06 1.4593e-06
F’ 1.2046E-09 1.7083e-07 1.4207e-07 1.4593e-07
Q 4.464E-08 1.6522e-06 1.764e-06 1.7048e-06
Y91 F 1.1861E-08 1.3373e-06 1.2707e-06 1.2474e-06
F’ 1.225E-09 1.3373e-07 1.2707e-07 1.2474e-07
Q 4.9535E-08 1.3829e-06 1.4642e-06 1.6784e-06
Y162 F 1.3421E-08 1.733e-06 1.438e-06 1.4968e-06
F’ 1.3421E-09 1.733e-07 1.438e-07 1.4968e-07
Q 4.9978E-08 1.6378e-06 1.7666e-06 1.6752e-06
Y6 F 2.9306E-08 2.9988e-06 2.6386e-06 2.6144e-06
F’ 2.9306E-09 2.9988e-07 2.6386e-07 2.6144e-07
Q 5.3039E-08 1.9407e-06 1.8291e-06 1.9711e-06
Y96 F 2.8984E-08 2.7573e-06 8.829e-05 2.0924e-06
F’ 2.8984E-09 2.7573e-07 8.829e-06 2.0924e-07
Q 5.5713E-08 1.9101e-06 9.5744e-05 1.8068e-06
Y51 F 1.4832E-08 3.5889e-06 1.8564e-06 1.8368e-06
F’ 1.4832E-09 3.5889e-07 1.8564e-07 1.8368e-07
Q 6.5103E-08 3.0306e-06 2.1063e-06 2.0799e-06
Y188 F 5.3196E-08 7.8759e-06 0.00012718 8.755e-06
F’ 5.3196E-09 7.8759e-07 1.2718e-05 8.755e-07
Q 9.1804E-08 4.1384e-06 0.00011733 5.4e-06
Y103 F 2.1086E-08 2.585e-06 2.5897e-06 2.5174e-06
F’ 2.1086E-09 2.585e-07 2.5897e-07 2.5174e-07
Q 9.2882E-08 2.6984e-06 2.7113e-06 2.6947e-06
Y257 F 5.7117E-08 8.1859e-06 0.00024618 6.8999e-06
F’ 5.7117E-09 8.1859e-07 2.4618e-05 6.8999e-07
Q 1.0464E-07 4.2513e-06 0.00020675 4.6112e-06
Y104 F 2.68e-08 3.5976e-06 3.4846e-06 3.5133e-06
F’ 2.68e-09 3.5976e-07 3.4846e-07 3.5133e-07
Q 8.8698e-08 3.0279e-06 2.8991e-06 3.0865e-06
Y26 F 5.0321e-08 7.264e-06 7.221e-06 7.4326e-06
F’ 5.0321e-09 7.264e-07 7.221e-07 7.4326e-07
Q 1.2017e-07 5.0267e-06 4.9761e-06 5.6908e-06
Y74 F 6.2502e-08 9.7585e-06 7.9441e-06 9.0475e-06
F’ 6.2502e-09 9.7585e-07 7.9441e-07 9.0475e-07
Q 1.601e-07 6.2042e-06 5.8793e-06 6.188e-06
Y46 F 6.3921e-08 7.4007e-06 7.9425e-06 7.7024e-06
F’ 6.3921e-09 7.4007e-07 7.9425e-07 7.926e-07
Q 1.5302e-07 5.2013e-06 5.6053e-06 5.5636e-06
Y55 F 7.3447e-08 1.0033e-05 9.9115e-06 9.4861e-06
F’ 7.3447e-09 1.0386e-06 9.9115e-07 9.4861e-07
Q 1.6832e-07 6.2789e-06 6.5502e-06 6.3632e-06
Y27 F 7.4054e-08 1.0627e-05 1.0549e-05 9.6694e-06
F’ 7.4054e-09 1.0627e-06 1.0549e-06 1.0101e-06
Q 1.374e-07 5.4274e-06 5.5127e-06 5.2346e-06
Mean ranks F 1 3.4000 3.1333 2.4667
F’ 1 3.4000 3.1333 2.4667
Q 1 2.8000 3.0667 3.1333

Table 7.

Statistical results of the third dataset

Name Metric M3DHP FCM FCM_FWCW FCM_FW
1 F 8.2421e-07 6.5013e-06 6.143e-06 6.4675e-06
F’ 8.2421e-08 6.5013e-07 6.143e-07 6.4675e-07
Q 1.2179e-06 5.0839e-06 5.3816e-06 5.0659e-06
38 F 8.3189e-07 7.0701e-06 7.0384e-06 7.309e-06
F’ 8.3189e-08 7.0701e-07 7.0384e-07 7.309e-07
Q 1.3121e-06 5.7837e-06 5.7766e-06 6.1937e-06
70 F 7.4278e-07 6.58e-06 6.3453e-06 6.2719e-06
F’ 7.4278e-08 6.58e-07 6.3453e-07 6.2719e-07
Q 9.3506e-07 4.6164e-06 4.6242e-06 4.5495e-06
72 F 4.9015e-07 3.7114e-06 3.6656e-06 4.1829e-06
F’ 4.9015e-08 3.7114e-07 3.6656e-07 4.1829e-07
Q 7.729e-07 3.0127e-06 3.0999e-06 3.596e-06
112 F 8.1694e-07 6.3685e-06 6.5435e-06 6.4552e-06
F’ 8.1694e-08 6.3685e-07 6.5435e-07 6.4552e-07
Q 1.0785e-06 4.6562e-06 4.7679e-06 5.0617e-06
118 F 6.5639e-07 6.4394e-06 5.5014e-06 5.9383e-06
F’ 6.5639e-08 6.4394e-07 5.5014e-07 5.9383e-07
Q 7.2408e-07 4.1281e-06 4.2612e-06 3.9874e-06
133 F 7.7764e-07 6.2645e-06 6.6291e-06 6.293e-06
F’ 7.7764e-08 6.2645e-07 6.6291e-07 6.293e-07
Q 9.803e-07 3.972e-06 4.1823e-06 3.9673e-06
155 F 6.404e-07 6.3761e-06 6.2239e-06 5.8332e-06
F’ 6.404e-08 6.3761e-07 6.2239e-07 5.8332e-07
Q 1.0142e-06 5.3399e-06 5.2456e-06 4.8609e-06
163 F 8.0216e-07 7.904e-06 7.9085e-06 7.2487e-06
F’ 8.0216e-08 7.904e-07 7.9085e-07 7.2487e-07
Q 9.3735e-07 4.9534e-06 5.1848e-06 4.9982e-06
193 F 7.8125e-07 6.0877e-06 6.0927e-06 5.594e-06
F’ 7.8125e-08 6.0877e-07 6.3071e-07 5.594e-07
Q 9.204e-07 3.9247e-06 3.9381e-06 3.6887e-06
196 F 4.7168e-07 4.0365e-06 4.0808e-06 3.8212e-06
F’ 4.7168e-08 4.0365e-07 4.0808e-07 3.8212e-07
Q 5.988e-07 2.8232e-06 2.857e-06 2.9362e-06
198 F 6.48e-07 5.0297e-06 4.9793e-06 4.9754e-06
F’ 6.48e-08 5.0297e-07 4.9793e-07 4.9754e-07
Q 9.3293e-07 4.0181e-06 3.9713e-06 3.9837e-06
202 F 6.2007e-07 4.6009e-06 4.5859e-06 4.6129e-06
F’ 6.2007e-08 4.6009e-07 4.5859e-07 4.6129e-07
Q 7.0764e-07 3.4117e-06 3.3518e-06 3.4905e-06
217 F 6.9417e-07 6.4246e-06 6.9218e-06 6.4857e-06
F’ 6.9417e-08 6.4246e-07 6.9218e-07 6.4857e-07
Q 7.7638e-07 4.1981e-06 4.6997e-06 4.6129e-06
220 F 7.8933e-07 5.3978e-06 5.0835e-06 4.975e-06
F’ 7.8933e-08 5.3978e-07 5.0835e-07 4.975e-07
Q 1.086e-06 4.2062e-06 3.891e-06 4.252e-06
Mean ranks F 1 3.2666 2.933 2.800
F’ 1 3.2666 2.933 2.800
Q 1 2.0666 2.666 3.266

In our tests, the number of peaks found by M3DHP determines the number of clusters (m) used for FCM, FCM_FW, and FCM_FWCW. However, with FCM_FW and FCM_FWCW there are scenarios in which some clusters end up empty, with no pixels assigned to them: the algorithm may converge to a solution in which one or more clusters have no associated data points. This can arise from several factors, including the choice of initial cluster centers, the data distribution, or a specified number of clusters that does not match the natural clustering in the data.
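A quick way to detect this failure mode after fuzzy clustering is to defuzzify the membership matrix and count how many pixels each cluster wins. The sketch below is illustrative only; the membership matrix `u` is a toy example, not taken from the paper:

```python
import numpy as np

def empty_clusters(membership: np.ndarray) -> list:
    """Return indices of clusters that receive no pixels under the usual
    hard assignment (each pixel goes to its maximum-membership cluster).

    membership: (n_pixels, m) fuzzy membership matrix, rows summing to 1.
    """
    hard = membership.argmax(axis=1)                      # defuzzify
    counts = np.bincount(hard, minlength=membership.shape[1])
    return [k for k, c in enumerate(counts) if c == 0]

# Toy example: 4 pixels, 3 requested clusters; cluster 2 never wins,
# so it would end up empty after defuzzification.
u = np.array([[0.7, 0.2, 0.1],
              [0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3],
              [0.1, 0.6, 0.3]])
print(empty_clusters(u))   # -> [2]
```

In practice such empty clusters can be dropped or re-seeded before reporting the final segmentation.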

As Figs. 3, 4, and 5 show, M3DHP clearly separates the background in every image of all three datasets. FCM, in contrast, mistakenly splits the background into many regions, over-segmenting 11 of the 23 images in the first dataset and 5 of the 15 images in each of the second and third datasets.

For TCGA_HT_8105_19980826_26, the tumor is distinguished more easily with M3DHP than with FCM, which mistakenly assigns many pixels to the tumor cluster. For TCGA_FG_6688_20020215_24, M3DHP very clearly delineates the tumor at the bottom center of the brain, whereas FCM fails in this case. For TCGA_DU_7014_19860618_30, the proposed algorithm segments the image correctly, while FCM merges the brain and tumor regions into a single region, under-segmenting the image. Visual inspection shows that M3DHP generally yields more homogeneous segmentation regions (Tables 4, 5, and 6).

As shown in Tables 3, 5, and 7, the proposed method outperforms FCM in 12 out of 23 cases according to F and F’, and in 17 out of 23 cases according to Q. The values of the three evaluation functions F, F’, and Q indicate that the quantitative performance of the two approaches is comparable on the same image; the differences are not large, and all values tend toward zero. Nevertheless, M3DHP produces reliable results for all three statistical metrics and typically outperforms FCM on F, F’, and Q. The last three rows of Tables 3, 5, and 7 give the average rank of each algorithm under the three performance indicators; M3DHP ranks first on all three.
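The paper does not restate the definitions of F, F’, and Q in this section; assuming they are the standard Liu–Yang F and Borsotti F’/Q unsupervised evaluation functions (lower is better for all three), a minimal sketch of their computation is given below. The normalization constants follow the common formulations and may differ from the authors' exact implementation:

```python
import numpy as np

def segmentation_quality(image, labels):
    """Sketch of the Liu-Yang F and Borsotti F'/Q evaluation functions.

    image:  (H, W, C) float array, the original image.
    labels: (H, W) int array, region label per pixel.
    Lower values indicate a better segmentation for all three metrics.
    """
    N = labels.size
    regions = np.unique(labels)
    R = len(regions)

    areas = np.empty(R)
    errs = np.empty(R)            # squared colour error e_i^2 per region
    for i, r in enumerate(regions):
        mask = labels == r
        areas[i] = mask.sum()
        mean = image[mask].mean(axis=0)
        errs[i] = ((image[mask] - mean) ** 2).sum()

    # Liu-Yang F: penalizes many regions and high within-region error.
    F = np.sqrt(R) * np.sum(errs / np.sqrt(areas))

    # R(A): how many regions have exactly area A (penalizes many tiny regions).
    area_counts = {a: int((areas == a).sum()) for a in np.unique(areas)}
    RA = np.array([area_counts[a] for a in areas])

    Fp = (1.0 / (1000.0 * N)) * np.sqrt(
        sum(c ** (1.0 + 1.0 / a) for a, c in area_counts.items())
    ) * np.sum(errs / np.sqrt(areas))

    Q = (1.0 / (1000.0 * N)) * np.sqrt(R) * np.sum(
        errs / (1.0 + np.log10(areas)) + (RA / areas) ** 2
    )
    return F, Fp, Q
```

On a segmentation whose regions are perfectly uniform, F and F’ vanish while Q retains a small positive term from the region-count penalty, which matches the near-zero magnitudes reported in the tables.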

In the comparative analysis against the three baseline algorithms across the various MRI slices, using the metrics F, F’, and Q, our algorithm consistently demonstrates superior performance. It achieves the best mean rank on all three metrics, indicating its robustness and effectiveness in clustering. FCM_FW generally ranks second, outperforming the standard FCM, which consistently ranks last. The consistent top ranking of our algorithm across all metrics underscores its potential as a preferred choice for clustering tasks in varied contexts.

5.4. Experiment 2: Numerical analysis of all images using internal metrics

In this section, we evaluate the average results obtained over the images of all three datasets. As Tables 8 and 9 show, M3DHP achieves the best results. Under both the F and F’ criteria, FCM_FWCW is the second-best method after M3DHP, with F and F’ values close to those of M3DHP; its Q values, however, are much larger than M3DHP's. Under the Q criterion, the second-best method is FCM_FW on the first dataset and FCM on the second. On the third dataset, as on the other two, our model ranks first.
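The mean ranks reported in the last rows of Tables 3, 5, and 7 (and in Table 9) can be computed by ranking the four methods per image and averaging the ranks down the columns. A minimal sketch with purely illustrative numbers (not taken from the paper's tables):

```python
import numpy as np

# Per-image metric values (rows) for four methods (columns); lower is better.
# Illustrative numbers only.
scores = np.array([
    [1.1e-6, 2.8e-6, 1.2e-6, 5.6e-6],
    [1.3e-6, 3.4e-6, 3.3e-6, 6.0e-6],
    [1.6e-6, 3.0e-6, 1.2e-6, 2.2e-6],
])

# Rank each row: 1 = best (smallest value). Double argsort gives the rank
# of every entry within its row (assumes no ties).
ranks = scores.argsort(axis=1).argsort(axis=1) + 1
mean_ranks = ranks.mean(axis=0)
print(mean_ranks)   # column 0 obtains the lowest mean rank here
```

A lower mean rank over all images indicates a method that is consistently better, which is how the tables summarize the per-image comparisons.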

Table 8.

The average performance of M3DHP and other state-of-the-art methods on all samples

Dataset Metrics M3DHP FCM FCM_FWCW FCM_FW
First F 1.39358120903132e-05 2.46990771224074e-05 1.95984239064522e-05 2.49533320028885e-05
F’ 1.40279322651129e-06 2.47177780413038e-06 1.96132588186868e-06 2.49722421278007e-06
Q 6.43614675892207e-06 2.54933248262013e-05 2.18424729314335e-05 1.41096227777129e-05
Second F 8.76604477611940e-07 0.000120429245447443 0.000768951296150804 0.000112026131324210
F’ 8.77502985074627e-08 1.21001514764840e-05 7.69064842309086e-05 1.13036581872294e-05
Q 8.20987313432836e-07 3.07843306248018e-05 0.000264834617236399 3.18941499332907e-05
Third F 1.214845454545455e-06 2.283909454545455e-06 2.271761000000001e-06 2.280264918181819e-06
F’ 1.216481818181818e-07 2.287917985007947e-07 2.286540368679730e-07 2.284011692280614e-07
Q 1.154263636363637e-06 2.056635539597964e-06 2.060901245568683e-06 2.060945271617046e-06

Table 9.

The mean rank of M3DHP and other state-of-the-art methods on all samples

Dataset Metrics M3DHP FCM FCM_FWCW FCM_FW
First F 1 3 2 4
F’ 1 3 2 4
Q 1 4 3 2
Second F 1 4 2 3
F’ 1 4 2 3
Q 1 2 4 3
Third F 1 3 4 2
F’ 1 4 3 2
Q 1 2 3 4

These findings lead us to conclude that M3DHP performs competently in segmenting brain magnetic resonance images. The visual and numerical results show that the proposed technique produces promising segmentations, supported by its ability to automatically determine the appropriate number of clusters and their centroids.

The thorough testing demonstrates that the proposed method performs well for image segmentation, surpassing the performance standards set by well-known methods like FCM, FCM_FWCW, and FCM_FW. The consistently superior outcomes across multiple evaluation criteria underscore the potential of the proposed method as a noteworthy contribution to the field.

5.5. Experiment 3: Analysis of all images with external metrics

In this experiment, to investigate the proposed method in depth and compare it with other state-of-the-art methods, its performance is evaluated against image ground truths using external evaluation metrics: accuracy, F1-score, precision, recall, and specificity. The statistical results are reported in Table 10, and the visual segmentation results are shown in Fig. 6. We test the proposed method only on the first dataset, because ground truth is not available for the second.
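All five external metrics follow from the confusion matrix of a binary predicted tumor mask against the ground-truth mask. A minimal sketch (function name and percentage scaling are our choices, matching the percentage values reported in Table 10):

```python
import numpy as np

def external_metrics(pred: np.ndarray, truth: np.ndarray) -> dict:
    """Accuracy, F1-score, precision, recall, and specificity (in %)
    for a binary predicted mask against a binary ground-truth mask."""
    pred = pred.astype(bool).ravel()
    truth = truth.astype(bool).ravel()
    tp = np.sum(pred & truth)      # tumor pixels correctly detected
    tn = np.sum(~pred & ~truth)    # background correctly rejected
    fp = np.sum(pred & ~truth)     # background flagged as tumor
    fn = np.sum(~pred & truth)     # tumor pixels missed
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)        # a.k.a. sensitivity
    return {
        "accuracy": 100 * (tp + tn) / (tp + tn + fp + fn),
        "f1": 100 * 2 * precision * recall / (precision + recall),
        "precision": 100 * precision,
        "recall": 100 * recall,
        "specificity": 100 * tn / (tn + fp),
    }
```

Because tumors occupy a small fraction of each slice, accuracy alone is optimistic; F1-score, precision, and recall are the more informative columns in Table 10.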

Table 10.

Statistical results of the first dataset with external metrics

Name Metric M3DHP FCM FCM_FWCW FCM_FW
TCGA_HT_7877_19980917_23 Accuracy 99.68 99.77 99.62 99.76
F1-Score 93.67 95.57 92.77 95.35
Precision 97.59 97.75 93.76 97.28
Recall 90.06 93.50 91.81 93.49
Specificity 90.06 93.50 91.81 93.49
TCGA_DU_7014_19860618_45 Accuracy 96.33 96.48 96.45 96.20
F1-Score 85.47 85.96 85.82 84.71
Precision 91.05 96.67 96.19 96.46
Recall 80.53 77.38 77.47 75.51
Specificity 80.53 77.38 77.47 75.51
TCGA_DU_5855_19951217_22 Accuracy 99.61 99.60 91.52 93.62
F1-Score 92.47 92.14 70.12 72.97
Precision 93.86 94.25 56.49 58.69
Recall 91.12 90.11 92.42 96.43
Specificity 91.12 90.11 92.42 96.43
TCGA_HT_8105_19980826_26 Accuracy 99.10 96.02 94.80 95.02
F1-Score 79 70.45 69.49 69.65
Precision 66.67 55.15 54.06 54.22
Recall 96.91 97.51 97.23 97.33
Specificity 96.91 97.51 97.23 97.33
TCGA_FG_6688_20020215_24 Accuracy 97.60 96.82 97.49 97.26
F1-Score 84.84 80.79 83.69 83.16
Precision 90.61 83 93.72 86.48
Recall 79.76 78.69 75.59 80.08
Specificity 79.76 78.69 75.59 80.08
TCGA_DU_7014_19860618_30 Accuracy 97.95 96.35 96.35 96.35
F1-Score 73.92 74.33 74.33 74.33
Precision 66.14 61.58 61.58 61.58
Recall 83.77 93.75 93.75 93.75
Specificity 83.77 93.75 93.75 93.75
TCGA_FG_A60K_20040224_40 Accuracy 98.27 70.13 72.65 73.49
F1-Score 89.07 66.64 65.69 62.76
Precision 84.10 55.08 54.94 54.02
Recall 94.66 84.35 81.68 74.88
Specificity 94.66 84.35 81.68 74.88
TCGA_DU_5872_19950223_38 Accuracy 98 99.31 99.40 99.38
F1-Score 83.92 94.95 95.62 95.49
Precision 98.94 98.93 98.55 98.38
Recall 72.87 91.28 92.86 92.75
Specificity 72.87 91.28 92.86 92.75
TCGA_DU_7306_19930512_21 Accuracy 99.21 99.61 98.88 98.08
F1-Score 89.47 95.24 89.15 85.21
Precision 99.60 95.76 83.28 76.09
Recall 81.21 94.72 95.91 96.81
Specificity 81.21 94.72 95.91 96.81
TCGA_DU_7013_19860523_24 Accuracy 96.28 90.44 87.08 88.80
F1-Score 78.48 66.44 69.28 62.41
Precision 75.64 60.20 59.70 57.29
Recall 81.55 74.12 82.51 68.52
Specificity 81.55 74.12 82.51 68.52
TCGA_HT_A616_19991226_19 Accuracy 93.08 90.16 90.75 90.75
F1-Score 74.29 64.13 66 66
Precision 63.18 56.89 57.98 57.98
Recall 90.16 73.49 76.60 76.60
Specificity 90.16 73.49 76.60 76.60
TCGA_HT_8106_19970727_12 Accuracy 76.63 89.63 91.12 90.97
F1-Score 60.52 62.37 67.69 65.14
Precision 53.27 56.41 59.60 58.37
Recall 70.05 69.75 78.32 73.70
Specificity 70.05 69.75 78.32 73.70
TCGA_FG_A4MU_20030903_25 Accuracy 97.74 98.12 97.74 97.16
F1-Score 81.96 83.42 82.21 80.33
Precision 71.11 73.42 71.15 68.40
Recall 96.70 96.58 97.33 97.30
Specificity 96.70 96.58 97.33 97.30
TCGA_FG_7634_20000128_21 Accuracy 98.25 98.22 95.52 96.57
F1-Score 90.12 90.22 72.88 83.97
Precision 88.20 87.47 73.18 78.32
Recall 92.12 93.14 72.59 90.49
Specificity 92.12 93.14 72.59 90.49
TCGA_FG_6690_20020226_32 Accuracy 95.53 96.62 95.55 94.81
F1-Score 76.12 78 69.88 74.32
Precision 65.10 68.29 62.70 63.27
Recall 91.62 90.94 78.91 90.05
Specificity 91.62 90.94 78.91 90.05
TCGA_DU_7014_19860618_47 Accuracy 98.74 99.25 99.26 99.26
F1-Score 89.37 93.94 94 93.96
Precision 99.32 98.90 98.80 98.88
Recall 81.23 89.45 89.65 89.52
Specificity 81.23 89.45 89.65 89.52
TCGA_DU_5855_19951217_17 Accuracy 99.63 97.37 93.62 71.97
F1-Score 80.45 69.54 67.35 63.53
Precision 68.37 53.79 51.64 50.38
Recall 97.71 98.33 96.80 85.95
Specificity 97.71 98.33 96.80 85.95
TCGA_CS_6667_20011105_11 Accuracy 96.23 97.35 94.88 93.03
F1-Score 67.25 67.91 66.92 66.45
Precision 51.51 52.13 51.13 50.84
Recall 96.85 97.41 96.80 95.88
Specificity 96.85 97.41 96.80 95.88
TCGA_DU_5855_19951217_23 Accuracy 97.64 97.77 94.70 78.84
F1-Score 76.18 75.90 65.61 67.81
Precision 73.80 75 60.01 54.83
Recall 78.72 76.83 72.35 88.85
Specificity 78.72 76.83 72.35 88.85
TCGA_FG_6688_20020215_28 Accuracy 95.47 95.94 95.88 95.22
F1-Score 79.66 79.41 80.26 82.32
Precision 77.59 80.68 79.81 76.44
Recall 81.85 78.18 80.71 89.18
Specificity 81.85 78.18 80.71 89.18
TCGA_FG_7634_20000128_27 Accuracy 96.68 96.91 96.81 96.71
F1-Score 65.43 63.80 64.63 64.88
Precision 50.97 50.89 50.94 50.93
Recall 91.34 85.46 88.41 89.36
Specificity 91.34 85.46 88.41 89.36
TCGA_FG_A60K_20040224_57 Accuracy 97.59 96.85 96.95 97.02
F1-Score 84.81 79.24 79.95 80.42
Precision 98.73 97.33 95.89 93.93
Recall 74.33 66.82 68.56 70.31
Specificity 74.33 66.82 68.56 70.31
TCGA_HT_8105_19980826_30 Accuracy 69.82 98.34 98.28 92.24
F1-Score 56.39 74.70 77.37 71.56
Precision 50.84 70.74 70.90 57.34
Recall 63.29 79.13 85.15 95.18
Specificity 63.29 79.13 85.15 95.18
Mean Accuracy 95.5243 95.4374 94.5783 92.7178
F1-Score 79.6896 78.4822 76.1178 75.9448
Precision 77.2257 74.7961 71.1304 69.5826
Recall 85.1483 85.6922 85.3657 87.0400
Specificity 85.1483 85.6922 85.3657 87.0400

Fig. 6.


Original MRI, ground truth, and MRI slices segmented by M3DHP, FCM, FCM_FW, and FCM_FWCW

Table 10 and Fig. 6 show that the proposed method has the best average performance on the tested images. For image TCGA_DU_5855_19951217_23, the tumor and non-tumor areas are well segmented; owing to the tumor's low light intensity, shape, and texture, the other methods could not accurately detect the entire tumor area. In image TCGA_DU_7014_19860618_45, the compared methods segment the border of the skull as tumor area, an error due mainly to the high brightness of the tissue surrounding the skull, whereas the proposed method detects the tumor area well. Averaged over all test images, the accuracy, F1-score, and precision of the proposed method are 95.52%, 79.69%, and 77.23%, respectively. This result highlights the efficacy of the feature combination employed in our method. The recall and specificity of the proposed method are lower than those of the FCM_FW method. The reason is apparent: feature weighting schemes applied to efficiently extracted features can improve these metrics, but our method, like FCM and FCM_FWCW, does not include a feature extraction phase.

6. Conclusion

In this paper, we propose a modified form of the 3D histogram-based segmentation technique, M3DHP, that can choose the appropriate number of segments automatically. The appropriate number of segments is determined through peak detection with a multimodal optimization algorithm. The proposed method has been applied to brain MRI segmentation. Because the optimal number of clusters is typically unknown, M3DHP is more flexible in practice than other methods. To demonstrate its efficiency, the proposed method has been compared with the well-known FCM clustering scheme. The experimental results show that the proposed strategy produces the desired outcomes and outperforms FCM. The method currently works on a single 2D MRI slice; the next step in our research will be to extend the algorithm to 3D MRI segmentation.
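The three-step pipeline summarized above (smooth the 3D color histogram, find its peaks, assign each pixel to the nearest peak) can be sketched compactly. The sketch below is a simplified stand-in, not the authors' implementation: it substitutes plain local-maxima detection for the multimodal PSO peak search, uses Euclidean distance instead of the paper's non-Euclidean measure, and the function name, bin count, and sigma are our own illustrative choices.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def segment_by_histogram_peaks(img, bins=32, sigma=1.0):
    """Simplified sketch of 3D-histogram peak segmentation.

    img: H x W x 3 uint8 RGB image.
    Returns (labels, centers): per-pixel cluster indices and peak colors.
    """
    h, w, _ = img.shape
    # Quantize each channel to `bins` levels and build the 3D RGB histogram.
    idx = img.astype(int) * bins // 256
    hist, _ = np.histogramdd(idx.reshape(-1, 3), bins=(bins,) * 3,
                             range=[(0, bins)] * 3)
    # Step 1: Gaussian smoothing removes spurious, closely spaced peaks.
    hist = gaussian_filter(hist, sigma)
    # Step 2: local maxima stand in for the PSO-found peaks; the number of
    # peaks found is the number of segments, chosen automatically.
    peaks = np.argwhere((hist == maximum_filter(hist, size=3)) & (hist > 0))
    centers = (peaks + 0.5) * (256 / bins)  # peak bin -> representative color
    # Step 3: assign each pixel to the nearest peak color.
    d = np.linalg.norm(img.reshape(-1, 1, 3) - centers[None, :, :], axis=2)
    return d.argmin(axis=1).reshape(h, w), centers
```

On a synthetic two-color image this recovers exactly two segments without the number of clusters being specified in advance, which is the property that distinguishes M3DHP from fixed-k methods such as FCM.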

Funding

No funding for this study.

Footnotes

Ethical approval and consent to participate Not applicable.

Human ethics Not applicable.

Consent for publication Not applicable.

Competing interests The authors declare no potential competing interests.

Data availability

The brain images supporting Fig. 1 are publicly available on Kaggle: https://www.kaggle.com/datasets/mateuszbuda/lgg-mri-segmentation

