Abstract
In decision-making systems, how to measure uncertain information remains an open issue, especially for information processing modeled on complex planes. In this paper, a new complex entropy is proposed to measure the uncertainty of a complex-valued distribution (CvD). The proposed complex entropy is a generalization of Gini entropy that has a powerful capability to measure uncertainty. In particular, when a CvD reduces to a probability distribution, the complex entropy will degrade into Gini entropy. In addition, the properties of complex entropy, including the nonnegativity, maximum and minimum entropies, and boundedness, are analyzed and discussed. Several numerical examples illuminate the superiority of the newly defined complex entropy. Based on the newly defined complex entropy, a multisource information fusion algorithm for decision-making is developed. Finally, we apply the decision-making algorithm in a medical diagnosis problem to validate its practicability.
1. Introduction
Uncertainty is inevitable in the applications of decision-making systems [1–4]. Considerable attention has addressed uncertainty in the past few decades [5, 6]. How to express the knowledge involved in sources of uncertain information still remains an open issue [7, 8]. Hence, researchers have attempted to model and measure uncertain information using extended soft sets [9], evidence theory [10], reasoning [11–13], belief structures [14, 15], D numbers [16, 17], Z numbers [18, 19], and other hybrid methods [20–23].
One successful alternative uncertain information measure is Gini entropy [24], which is simple to implement and has received substantial attention from researchers. Inspired by Gini entropy [24], Yager and Petry [25] recently devised an intelligent quality-based approach for fusing multisource information [26]. Bouhamed et al. [27] extended it to combine multisource possibilistic information. Later, researchers generalized the Gini entropy-based information quality to belief functions to measure uncertainty. The method of Li et al. [28, 29] is an example that has been well applied in various fields. Although Gini entropy [24] can be used to measure uncertainty, it applies only to probability distributions.
The complex-valued model has attractive expressive properties, especially for the modeling of uncertainty [30, 31]. Therefore, it has been widely investigated and applied in various fields, such as medical diagnosis [32], decision-making [33, 34], and predicting interference effects [35, 36]. Given that the complex-valued representation model is well suited for certain applications, how can Gini entropy be generalized to complex planes to provide a more powerful capability to measure uncertainty?
In this paper, to address the abovementioned issue, a generalized entropy is proposed for measuring the uncertainty of CvDs. When CvDs reduce to probability distributions, the newly defined entropy degrades into Gini entropy. Specifically, vector expressions of CvDs are first proposed to model knowledge in complex planes. After that, a novel complex entropy called Xiao entropy is defined to measure uncertainties of CvDs. Then, the properties of complex entropy, including nonnegativity, maximum and minimum entropies, and boundedness, are analyzed and discussed. Based on the newly defined complex entropy, a multisource information fusion algorithm for decision-making is devised. Finally, we apply the decision-making algorithm in a medical diagnosis problem to verify its practicability.
The contributions of this work are summarized as follows:
A novel complex entropy, called Xiao entropy, which has the properties of nonnegativity, maximum and minimum entropies, and boundedness, is defined for the CvD
The multisource information fusion algorithm based on the newly defined entropy can be well applied to support decision-making
This study provides a new perspective of complex-valued representation for uncertain information and offers a promising and generalized solution in terms of uncertainty measurements
The preliminaries are introduced in Section 2. In Section 3, CvD vectors are defined. In Section 4, a complex entropy is defined to measure the uncertainty of CvDs. In Section 5, several numerical examples illustrate the properties of complex entropy. In Section 6, an algorithm for decision-making is designed on the basis of the newly defined entropy. Then, the decision-making algorithm is used in a medical diagnosis. Section 7 concludes this work.
2. Preliminaries
In this section, some essential concepts of uncertainty measures related to this work are introduced.
Definition 1 . —
(Gini entropy). Let P=[p1,…, pj,…, pn] be a probability distribution vector. The Gini entropy of P is defined by [24]
(1) G(P) = ∑_{j=1}^{n} p_j (1 − p_j) = 1 − ∑_{j=1}^{n} p_j².
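As a sanity check, Gini entropy is straightforward to compute; the following minimal Python sketch (function name illustrative) implements equation (1):

```python
def gini_entropy(p):
    # G(P) = sum_j p_j * (1 - p_j) = 1 - sum_j p_j**2
    return 1.0 - sum(x * x for x in p)

print(gini_entropy([1.0, 0.0]))  # completely certain distribution: 0.0
print(gini_entropy([0.5, 0.5]))  # uniform distribution over n = 2: 0.5
```

A uniform distribution over n outcomes attains the maximum 1 − 1/n, while a degenerate distribution gives 0.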
Definition 2 . —
(Pennecchi and Oberto's uncertainty measures). Let ℂ=[𝔠1,…, 𝔠j,…, 𝔠n] be a CvD vector, where 𝔠j=aj+bji. Pennecchi and Oberto's modulus estimations of ℂ are defined by [37]
(2) where the resulting modulus estimates can be used as uncertainty measures.
3. Vector Representation of CvD
Modeling uncertainty has attracted a substantial amount of attention in a variety of areas [38]. Many methods have been proposed and applied in various fields, such as failure and risk analysis [39], classification [40, 41], information fusion [42], and decision-making [43, 44]. Here, a vector representation of CvD is presented for expressing uncertainty in a complex plane. In addition, the norm of CvD is also defined and analyzed.
Definition 3 . —
(CvD vector). Let ℂk be a CvD vector on the frame of discernment (FOD) Ψ={ψ1,…, ψj,…, ψn}, denoted by
(3) ℂk = [𝔠k1,…, 𝔠kj,…, 𝔠kn], where 𝔠kj is the complex value with regard to the occurrence of ψj:
(4) 𝔠kj = akj + bkj i, where akj and bkj are real numbers and i is the imaginary unit, satisfying i² = −1.
𝔠 kj in equation (4) satisfies
(5) |𝔠kj| = √(akj² + bkj²) ∈ [0, 1], with ∑_{j=1}^{n} |𝔠kj| = 1, where |𝔠kj| is the modulus of 𝔠kj.
Equation (4) is also expressed as follows:
(6) 𝔠kj = rkj e^{θkj i}, with
(7) rkj = √(akj² + bkj²) and θkj = arg(𝔠kj), where rkj = |𝔠kj| ≥ 0 and θkj ∈ [−π, π] denotes an angle (phase) of 𝔠kj.
Definition 4 . —
(norm of CvD). Let ℂk be a CvD vector on FOD Ψ. Norm of CvD vector, ‖ℂk‖, is defined by
(8) ‖ℂk‖ = √(∑_{j=1}^{n} |𝔠kj|²).
Considering the properties of the CvD vector in Definition 3, where for each 𝔠kj, akj² + bkj² ∈ [0, 1] and ∑_{j=1}^{n} |𝔠kj| = 1, we observe the following:
Case 1 . —
The maximal value of ‖ℂk‖, denoted by max[‖ℂk‖], is generated, when
(9) 𝔠kj = 1 for one ψj and all other 𝔠kj = 0, such that
(10) max[‖ℂk‖] = 1.
Case 2 . —
When 𝔠kj degrades into real numbers, i.e., 𝔠kj=akj(bkj=0), the minimum value of ‖ℂk‖, denoted by min[‖ℂk‖], is generated, when
(11) 𝔠kj = akj = (1/n) for 1 ≤ j ≤ n, where
(12) min[‖ℂk‖] = √(∑_{j=1}^{n} (1/n)²) = 1/√n.
In summary,
(13) 1/√n ≤ ‖ℂk‖ ≤ 1, where ‖ℂk‖ has a maximum value of 1 with 𝔠kj = 1 for one ψj and all other 𝔠kj = 0, and a minimum value of 1/√n with all 𝔠kj = (1/n).
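The norm of Definition 4 and its bounds can be checked numerically; this short Python sketch (names illustrative) uses `abs` so that real and complex elements are handled uniformly:

```python
import math

def cvd_norm(c):
    # ||C_k|| = sqrt(sum_j |c_kj|^2)
    return math.sqrt(sum(abs(z) ** 2 for z in c))

n = 4
print(cvd_norm([1, 0, 0, 0]))  # one certain element: the maximum, 1.0
print(cvd_norm([1 / n] * n))   # all moduli equal to 1/n: the minimum, 1/sqrt(n) = 0.5
```

Because only moduli enter the sum, replacing a real element by a complex one with the same modulus leaves the norm unchanged.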
4. Entropy for CvD
Entropy is useful for measuring uncertainty [45–47], where many kinds of entropies, such as Tsallis entropy [48], fuzzy entropy [49, 50], Deng entropy [51–53], and cross-entropy [54], are presented for different aspects [55–59]. Among them, Shannon and Gini entropies are very popular. The greater the uncertainty is, the greater the entropy is; the lesser the uncertainty is, the lesser the entropy is [60]. We make use of the concept of Gini entropy [24] to measure the uncertainty of CvD.
Definition 5 . —
(complex entropy). Let ℂk be a CvD vector on FOD Ψ. The complex entropy of ℂk, denoted as E𝒳(ℂk), is defined as
(14) E𝒳(ℂk) = ∑_{j=1}^{n} |𝔠kj| (1 − |𝔠kj|) = 1 − ∑_{j=1}^{n} |𝔠kj|², where the second equality uses ∑_{j=1}^{n} |𝔠kj| = 1.
When a CvD reduces to a probability distribution, where bkj=0 and 𝔠kj=akj, then E𝒳(ℂk) can be expressed as follows:
(15) E𝒳(ℂk) = ∑_{j=1}^{n} akj (1 − akj) = 1 − ∑_{j=1}^{n} akj²,
which is equal to equation (1).
Property 1 . —
E𝒳 is a generalized model of Gini entropy [24]. Specifically, when a CvD becomes a probability distribution, E𝒳 degrades into Gini entropy [24].
According to equation (13), because 1/√n ≤ ‖ℂk‖ ≤ 1, we have
(16) 1/n ≤ ∑_{j=1}^{n} |𝔠kj|² ≤ 1,
such that
(17) 0 ≤ 1 − ∑_{j=1}^{n} |𝔠kj|² ≤ 1 − (1/n).
Thus, E𝒳(ℂk) has the boundedness of [0, 1 − (1/n)].
It is inferred that
E𝒳(ℂk) reaches its maximal value E𝒳(ℂk)=1 − (1/n) when 𝔠kj=(1/n) for 1 ≤ j ≤ n. When n⟶+∞, so that (1/n)⟶0, E𝒳(ℂk) approaches the maximum value 1.
E𝒳(ℂk) reaches its minimal value E𝒳(ℂk)=0 when one 𝔠kj=1 and others 𝔠kj=0.
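Following Definition 5, the complex entropy depends only on the moduli of the elements. A minimal Python sketch (using the simplified form E𝒳 = 1 − ∑|𝔠|², valid because the moduli sum to 1):

```python
import cmath

def complex_entropy(c):
    # E_X(C_k) = sum_j |c_kj| (1 - |c_kj|) = 1 - sum_j |c_kj|^2
    return 1.0 - sum(abs(z) ** 2 for z in c)

print(complex_entropy([1, 0, 0]))  # completely certain CvD: 0.0
# uniform moduli 1/3 with arbitrary phases: maximal entropy 1 - 1/3
c = [(1 / 3) * cmath.exp(1j * t) for t in (0.2, -0.5, 1.1)]
print(complex_entropy(c))
```

Note that the phases are immaterial: only the moduli |𝔠kj| enter the computation, mirroring the analysis in Example 4 below.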
Remark 1 . —
Notably, the larger E𝒳(ℂk) is, the larger the uncertainty in CvD ℂk is, which results in lower certainty.
Definition 6 . —
(the completely certain CvD). CvD ℂk is completely certain when E𝒳(ℂk)=0.
Definition 7 . —
(the completely uncertain CvD).
CvD ℂk is completely uncertain when E𝒳(ℂk)=1.
Theorem 1 . —
E𝒳 has the desired properties of the entropy of the CvD, including nonnegativity, maximum and minimum entropies, and boundedness.
Property 2 . —
Let ℂk be an arbitrary CvD:
P 2.1 Nonnegativity: E𝒳(ℂk) ≥ 0
P 2.2 Maximum entropy: E𝒳(ℂk) ≤ max[E𝒳(ℂk)]
P 2.3 Minimum entropy: E𝒳(ℂk) ≥ min[E𝒳(ℂk)]
P 2.4 Boundedness: 0 ≤ E𝒳(ℂk) ≤ 1
Proof —
The proofs are trivial.
5. Numerical Examples
In this section, several examples are presented to illustrate the entropy for CvD.
Example 1 . —
Consider a CvD ℂ in the FOD Ψ={ψ1, ψ2}:
(18) ℂ = [x, 1 − x].
In Example 1, ℂ changes as parameter x varies, where x is set within [0,1], such that ℂ reduces to a probability distribution.
By leveraging the Gini entropy G and Xiao entropy E𝒳, the corresponding entropy measures are shown in Figure 1. Clearly, E𝒳 is the same as G entropy, which verifies that when a CvD reduces to a probability distribution, E𝒳 degrades into Gini entropy. Additionally, when x=0 or x=1, such that ℂ=[0,1] or ℂ=[1,0], G(ℂ) and E𝒳(ℂ) achieve the minimum entropy of 0, because in this case, ℂ is the completely certain CvD. By contrast, only when x=0.5, such that ℂ=[0.5, 0.5], can G(ℂ) and E𝒳(ℂ) achieve a maximum entropy of 0.5.
Figure 1.

G and E𝒳 in Example 1.
Example 2 . —
Consider a CvD ℂ in the FOD Ψ={ψ1, ψ2}:
(19) ℂ = [r e^{θ1 i}, (1 − r) e^{θ2 i}], where θ1 and θ2 are fixed phases.
In Example 2, ℂ changes as modulus r varies, where r is set within [0.01,0.99].
Because ℂ consists of complex numbers, Gini entropy is not applicable. The result of E𝒳 is shown in Figure 2. As r increases from 0.01 to 0.5, E𝒳 increases from 0.0198 to 0.5, whereas as r increases from 0.5 to 0.99, E𝒳 gradually decreases back to 0.0198. This result shows a trend similar to that of the entropy measures in Figure 1.
Figure 2.

E𝒳 in Example 2.
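The reported values in Example 2 are consistent with a two-element CvD whose moduli are r and 1 − r, for which E𝒳 = 1 − r² − (1 − r)² = 2r(1 − r). The following sketch assumes this form (the specific phases θ1, θ2 are hypothetical; they do not affect the entropy):

```python
import cmath

def complex_entropy(c):
    return 1.0 - sum(abs(z) ** 2 for z in c)

def example2_cvd(r, theta1=0.3, theta2=0.2):
    # hypothetical reconstruction: moduli r and 1 - r with fixed phases
    return [r * cmath.exp(1j * theta1), (1 - r) * cmath.exp(1j * theta2)]

for r in (0.01, 0.5, 0.99):
    print(r, round(complex_entropy(example2_cvd(r)), 4))
# 0.01 -> 0.0198, 0.5 -> 0.5 (the maximum), 0.99 -> 0.0198
```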
A comparison of the results in Examples 1 and 2 shows that the proposed E𝒳 entropy is a more capable uncertainty measure than Gini entropy.
Example 3 . —
Consider a CvD ℂ in the FOD Ψ={ψ1,…, ψj,…, ψx}:
(20) ℂ = [1/α, 1/α,…, 1/α], a uniform CvD over x = α elements.
In Example 3, we set six different scales of α, namely, α ∈ [1,10], [1, 102], [1, 103], [1, 104], [1, 105], and [1, 106], to measure the variation of G(ℂ) and E𝒳(ℂ).
Figures 3(a)–3(f) depict the results of G(ℂ) and E𝒳(ℂ) for the six cases, respectively. As α varies within [1, 10], E𝒳(ℂ) has a maximum value of max[E𝒳(ℂ)]=0.9 and a minimum value of min[E𝒳(ℂ)]=0. When α changes within [1, 10²], max[E𝒳(ℂ)]=0.99; within [1, 10³], max[E𝒳(ℂ)]=0.999; within [1, 10⁴], max[E𝒳(ℂ)]=0.9999; and within [1, 10⁵] and [1, 10⁶], max[E𝒳(ℂ)]≈1, with min[E𝒳(ℂ)]=0 in every case. Thus, when a CvD becomes a completely certain distribution, i.e., a probability distribution in which 𝔠kj=akj=1 for one j and the other 𝔠kj=0, it has a minimum entropy of min[E𝒳(ℂ)]=0. On the other hand, when α⟶+∞, max[E𝒳(ℂ)] approaches 1, because in this case ℂ is completely uncertain.
Figure 3.

The entropy measures of G(ℂ) and E𝒳(ℂ) in Example 3. (a)G(ℂ) and E𝒳(ℂ): 1≤α≤10. (b)G(ℂ) and E𝒳(ℂ): 1≤α≤102. (c)G(ℂ) and E𝒳(ℂ): 1≤α≤103. (d)G(ℂ) and E𝒳(ℂ): 1≤α≤104. (e)G(ℂ) and E𝒳(ℂ): 1 ≤α≤105. (f)G(ℂ) and E𝒳(ℂ): 1≤α≤106.
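The extremes reported in Example 3 are consistent with a uniform CvD over α elements, for which E𝒳(ℂ) = 1 − 1/α; a quick check under that assumption:

```python
def complex_entropy(c):
    return 1.0 - sum(abs(z) ** 2 for z in c)

for alpha in (1, 10, 100, 1000):
    uniform = [1 / alpha] * alpha  # uniform CvD over alpha elements
    print(alpha, round(complex_entropy(uniform), 6))
# alpha = 1 is completely certain (entropy 0); 1 - 1/alpha approaches 1 as alpha grows
```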
Example 4 . —
Assume that there is a CvD ℂ in the FOD Ψ={ψ1,…, ψj,…, ψx}:
(21)
In Example 4, ℂ changes as r and ξ vary. Here, we set r within [0,1] and ξ within [-1,1], as shown in Figure 4(a). The entropy measure of E𝒳(ℂ) is presented in Figure 4(b), which shows how the variations in the modulus and angle of the elements in ℂ impact E𝒳(ℂ).
Figure 4.

The entropy measure in Example 4. (a) The variation of r and ξ. (b) E𝒳(ℂ).
E𝒳(ℂ) changes as r varies, whereas the variation in angle θ=ξπ has no effect on E𝒳(ℂ). This result is reasonable because rkj² = |𝔠kj|² = akj² + bkj² is related to the modulus r rather than θ.
Example 5 . —
Consider Example 2.
In Example 5, r is set within [0,1]. We compare the proposed E𝒳 with related work, namely, Pennecchi and Oberto's two uncertainty measures [37].
Comparing the results shown in Figure 5, one of Pennecchi and Oberto's measures remains 0.5 regardless of r and therefore cannot discriminate the uncertainty. The other measure performs better: as r increases from 0.01 to 0.5, it increases from 0.2929 to 0.4208, and as r increases from 0.5 to 0.99, it gradually decreases back to 0.2929. Nevertheless, the proposed E𝒳 has better discrimination as an uncertainty measure and is superior to both.
Figure 5.

Comparison of different uncertainty measures.
6. Algorithm and Application
How to deal with decision-making problems has attracted much attention [61–65], especially for complex-valued expressed information [66, 67]. In this section, we first design a multisource information fusion algorithm for decision-making based on the proposed entropy. Then, we apply the decision-making algorithm in medical diagnosis to validate its practicability.
6.1. A Multisource Information Fusion Algorithm for Decision-Making
Problem statement: let Ψ be a FOD with a set of objectives {ψ1,…, ψj,…, ψn} to be recognized. Suppose there are t CvDs: ℂ={ℂ1,…, ℂk,…, ℂt} where ℂk=[𝔠k1,…, 𝔠kj,…, 𝔠kn] and 𝔠kj=akj+bkji. The decision-making algorithm is to identify the target from {ψ1,…, ψj,…, ψn} by combining multiple CvDs {ℂ1,…, ℂk,…, ℂt}.
The specific steps are given as follows:
- Step 1: For 1 ≤ k ≤ t, the entropy of CvD ℂk, denoted by E𝒳(ℂk), is generated as follows:
(22) E𝒳(ℂk) = 1 − ∑_{j=1}^{n} |𝔠kj|².
- Step 2: For 1 ≤ k ≤ t, the information volume of CvD ℂk, denoted by IV(ℂk), is measured by
(23) IV(ℂk) = e^{E𝒳(ℂk)}.
- Step 3: The information volume IV(ℂk) is normalized by
(24) ĨV(ℂk) = IV(ℂk) / ∑_{m=1}^{t} IV(ℂm).
- Step 4: According to the normalized information volumes, the weighted average CvD, denoted as W̄, is defined by
(25) W̄ = ∑_{k=1}^{t} ĨV(ℂk) |ℂk|, where |ℂk| = [|𝔠k1|,…, |𝔠kj|,…, |𝔠kn|] and W̄(ψj) denotes the jth component of W̄.
- Step 5: W̄ is fused via the complex Dempster's combination rule [68] t − 1 times:
(26) F = W̄ ⊕ W̄ ⊕ ⋯ ⊕ W̄ (t − 1 fusions).
- Step 6: For F, the ψδ with the maximum absolute value is chosen:
(27) ψδ = arg max_{1 ≤ j ≤ n} |F(ψj)|.
- Step 7: Let λ be a threshold value for decision-making, which can be set in advance according to the specific application. If |F(ψδ)| ≥ λ, ψδ is identified as the target:
(28) T = ψδ if |F(ψδ)| ≥ λ.
If |F(ψδ)| < λ, the target cannot be determined.
The corresponding pseudocode is given in Algorithm 1.
Algorithm 1.

Complex entropy-based multisource information fusion algorithm for decision-making.
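The following Python sketch reproduces Steps 1–7 for singleton hypotheses. Two modeling assumptions are made explicit here: the information volume is taken as IV(ℂk) = e^{E𝒳(ℂk)}, and the complex Dempster's combination of the weighted average with itself is computed as an elementwise product followed by renormalization; both choices reproduce the values reported in Tables 2 and 3 below. All names are illustrative:

```python
import cmath
import math

def complex_entropy(c):
    # Step 1: E_X = 1 - sum_j |c_kj|^2
    return 1.0 - sum(abs(z) ** 2 for z in c)

def fuse_cvds(cvds, lam=0.80):
    t, n = len(cvds), len(cvds[0])
    # Step 2: information volume (assumed IV = exp(E_X); matches Table 2)
    iv = [math.exp(complex_entropy(c)) for c in cvds]
    # Step 3: normalize the information volumes into weights
    w = [v / sum(iv) for v in iv]
    # Step 4: weighted average of the modulus vectors |C_k|
    avg = [sum(w[k] * abs(cvds[k][j]) for k in range(t)) for j in range(n)]
    # Steps 5-6: fuse the average with itself t - 1 times; on singleton
    # hypotheses, Dempster's rule reduces to an elementwise product
    # followed by renormalization
    fused, history = avg[:], []
    for _ in range(t - 1):
        fused = [f * a for f, a in zip(fused, avg)]
        s = sum(fused)
        fused = [f / s for f in fused]
        history.append(fused)
    # Step 7: accept the best hypothesis only if it reaches the threshold
    best = max(range(n), key=lambda j: fused[j])
    return (best if fused[best] >= lam else None), history

# Expert evaluations from Table 1: modulus * e^(phase * i)
moduli = [
    [0.65, 0.10, 0.10, 0.15],
    [0.10, 0.60, 0.10, 0.20],
    [0.40, 0.10, 0.30, 0.20],
    [0.50, 0.20, 0.10, 0.20],
    [0.55, 0.10, 0.15, 0.20],
]
phases = [
    [0.2, 0.3, 0.3, 0.2],
    [0.3, 0.2, 0.3, 0.2],
    [0.3, 0.3, 0.4, 0.4],
    [0.2, 0.2, 0.3, 0.3],
    [0.2, 0.3, 0.2, 0.3],
]
cvds = [[m * cmath.exp(1j * p) for m, p in zip(ms, ps)]
        for ms, ps in zip(moduli, phases)]
target, history = fuse_cvds(cvds)
print(target)                              # 0, i.e., D1: viral fever
print([round(h[0], 4) for h in history])   # D1 value after each fusion round; cf. Table 3
```

The intermediate D1 values match Table 3 (0.6435, 0.8034, 0.9011, and approximately 0.9525) up to rounding.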
6.2. Application in Medical Diagnosis
In this section, the proposed decision-making method is applied in medical diagnosis to demonstrate its practicability. The scenario and data of the application are based on [32].
Consider a medical diagnosis problem in which a patient P suffers from one of the diseases in D = {D1: viral fever, D2: malaria, D3: typhoid, D4: stomach problem}. To clarify which disease the patient most likely has, five experts evaluate the patient's condition, and the evaluation data are modeled as CvDs in Table 1. The threshold λ is set to 0.80 for this application. We diagnose patient P by integrating the evaluations of the five experts.
Table 1.
The evaluated data for patient modeled as CvDs.
| Experts | CvDs | Diseases | |||
|---|---|---|---|---|---|
| Viral fever: D1 | Malaria: D2 | Typhoid: D3 | Stomach problem: D4 | ||
| E1 | ℂE1 | 0.65e^{0.2i} | 0.10e^{0.3i} | 0.10e^{0.3i} | 0.15e^{0.2i} |
| E2 | ℂE2 | 0.10e^{0.3i} | 0.60e^{0.2i} | 0.10e^{0.3i} | 0.20e^{0.2i} |
| E3 | ℂE3 | 0.40e^{0.3i} | 0.10e^{0.3i} | 0.30e^{0.4i} | 0.20e^{0.4i} |
| E4 | ℂE4 | 0.50e^{0.2i} | 0.20e^{0.2i} | 0.10e^{0.3i} | 0.20e^{0.3i} |
| E5 | ℂE5 | 0.55e^{0.2i} | 0.10e^{0.3i} | 0.15e^{0.2i} | 0.20e^{0.3i} |
Then, the decision-making algorithm is applied to medical diagnosis by the following steps:
Step 1: The entropy values of CvD ℂEk(1 ≤ k ≤ 5) are calculated by equation (22), as shown in Table 2.
Step 2: The information volumes of CvD ℂEk(1 ≤ k ≤ 5) are calculated by equation (23), as shown in Table 2.
Step 3: The information volume IV(ℂEk)(1 ≤ k ≤ 5) is normalized by equation (24), as shown in Table 2.
Step 4: The weighted average CvD is generated by equation (25), as shown in Table 3.
Step 5: By gradually fusing the weighted average CvD 4 times, the corresponding results are generated by equation (26), as shown in Table 3.
Step 6: In each fused result, the maximal absolute value, which corresponds to D1, is selected, as shown in Table 3.
Step 7: Patient P is diagnosed as most likely to suffer the disease D1:
(29) T = D1: viral fever, since the fused value 0.9525 for D1 satisfies 0.9525 ≥ λ = 0.80.
Table 2.
The results in terms of entropy, information volume, and normalized information volume.
| Results | CvDs | ||||
|---|---|---|---|---|---|
| ℂ E1 | ℂ E2 | ℂ E3 | ℂ E4 | ℂ E5 | |
| E𝒳(ℂEk) | 0.5350 | 0.5800 | 0.7000 | 0.6600 | 0.6250 |
| IV(ℂEk) | 1.7074 | 1.7860 | 2.0138 | 1.9348 | 1.8682 |
| Normalized IV(ℂEk) | 0.1834 | 0.1918 | 0.2163 | 0.2078 | 0.2007 |
Table 3.
The weighted average CvD and fused results obtained by the complex Dempster's combination rule.
| Results | Diseases | Diagnosis results | |||
|---|---|---|---|---|---|
| Viral fever: D1 | Malaria: D2 | Typhoid: D3 | Stomach problem: D4 | ||
| Weighted average CvD | 0.4392 | 0.2167 | 0.1533 | 0.1908 | Cannot be determined |
| Fused 1 time | 0.6435 | 0.1567 | 0.0784 | 0.1215 | Cannot be determined |
| Fused 2 times | 0.8034 | 0.0965 | 0.0342 | 0.0659 | Viral fever |
| Fused 3 times | 0.9011 | 0.0534 | 0.0134 | 0.0321 | Viral fever |
| Fused 4 times | 0.9525 | 0.0279 | 0.0049 | 0.0148 | Viral fever |
6.3. Discussion
As shown in Table 1, |ℂE1(D1)|=0.65, |ℂE3(D1)|=0.4, |ℂE4(D1)|=0.5, and |ℂE5(D1)|=0.55, which all support the disease viral fever: D1. However, |ℂE2(D2)|=0.6 supports malaria: D2. Hence, ℂE2 conflicts with ℂE1, ℂE3, ℂE4, and ℂE5. From Table 1 alone, it is difficult to make an accurate decision because a conflict exists among the experts; it is necessary to fuse the data collected from the different experts to better support decision-making. There are five evaluations from five experts. To illuminate the effectiveness of the proposed decision-making algorithm, we gradually fuse the weighted average CvD, and the results are given in Table 3.
When the weighted average CvD is fused once, the largest fused value is 0.6435, obtained for D1. Because 0.6435 is smaller than the threshold λ=0.80, the patient's disease cannot be determined. When the weighted average CvD is fused twice, the largest value is 0.8034 for D1. Because 0.8034 is larger than the threshold λ=0.80, the patient is diagnosed with viral fever: D1. When the weighted average CvD is fused three and four times, the values for D1 increase further to 0.9011 and 0.9525, which better support the decision. Finally, the patient is diagnosed as most likely suffering from viral fever: D1. Consequently, the fused value for disease D1 increases from 0.6435 to 0.8034 to 0.9011 and then to 0.9525, as shown in Figure 6. As a result, the proposed decision-making algorithm is effective in addressing the medical diagnosis problem.
Figure 6.

The fusion results of the weighted average CvD.
7. Conclusions
In this paper, a complex entropy, called Xiao entropy, is proposed to measure the uncertainty of complex-valued distributions (CvDs). The complex entropy is a generalized model of Gini entropy. Specifically, when the CvD turns into a probability distribution, the proposed entropy degrades into Gini entropy. Furthermore, we study the properties of complex entropy, including nonnegativity, maximum and minimum entropies, and boundedness. Several numerical examples compare the proposed complex entropy with related works. The results illuminate the superiority of the proposed complex entropy. Based on the complex entropy, a multisource information fusion algorithm for decision-making is devised. Finally, we apply the decision-making algorithm in a medical diagnosis problem to validate its practicability.
The main contributions of this study are as follows: it provides a new perspective of complex-valued representation for uncertain information; the newly defined complex entropy has a powerful capability to measure uncertainty; and it offers a promising application in decision theory. In future work, we intend to apply this complex entropy to more complex decision-making problems, such as the analysis and processing of images and physiological signals.
Acknowledgments
This research was supported by the National Natural Science Foundation of China (nos. 62003280 and 61902189).
Data Availability
The data used to support the findings of this study are provided in the article.
Conflicts of Interest
The author states that there are no conflicts of interest.
References
- 1.Yager R. R., Reformat M. Z. Selecting an action to satisfy multiple aspects of a system based on uncertain granular observations. Expert Systems with Applications. 2019;126:1–8. doi: 10.1016/j.eswa.2018.12.059. [DOI] [Google Scholar]
- 2.Liu Z., Li G., Mercier G., He Y., Pan Q. Change detection in heterogenous remote sensing images via homogeneous pixel transformation. IEEE Transactions on Image Processing. 2017;27(4):1822–1834. doi: 10.1109/TIP.2017.2784560. [DOI] [PubMed] [Google Scholar]
- 3.Kang B., Zhang P., Gao Z., Chhipi-Shrestha G., Hewage K., Sadiq R. Environmental assessment under uncertainty using Dempster-Shafer theory and Z-numbers. Journal of Ambient Intelligence and Humanized Computing. 2020;11(5):2041–2060. doi: 10.1007/s12652-019-01228-y. [DOI] [Google Scholar]
- 4.Fei L., Feng Y., Liu L. On Pythagorean fuzzy decision making using soft likelihood functions. International Journal of Intelligent Systems. 2019;34(12):3317–3335. doi: 10.1002/int.22199. [DOI] [Google Scholar]
- 5.Xiao F. Evidential fuzzy multicriteria decision making based on belief entropy. IEEE Transactions on Fuzzy Systems. 2020;28(7):1477–1491. [Google Scholar]
- 6.Fei L., Feng Y. An attitudinal nonlinear integral and applications in decision making. International Journal of Fuzzy Systems. 2020 doi: 10.1007/s40815-020-00862-5. [DOI] [Google Scholar]
- 7.Xue Y., Deng Y., Garg H. Uncertain database retrieval with measure - based belief function attribute values under intuitionistic fuzzy set. Information Sciences. 2021;546:436–447. doi: 10.1016/j.ins.2020.08.096. [DOI] [Google Scholar]
- 8.Wang X., Song Y. Uncertainty measure in evidence theory with its applications. Applied Intelligence. 2018;48(7):1672–1688. doi: 10.1007/s10489-017-1024-y. [DOI] [Google Scholar]
- 9.Feng F., Cho J., Pedrycz W., Fujita H., Herawan T. Soft set based association rule mining. Knowledge-Based Systems. 2016;111:268–282. doi: 10.1016/j.knosys.2016.08.020. [DOI] [Google Scholar]
- 10.Yager R. R. Generalized dempster-shafer structures. IEEE Transactions on Fuzzy Systems. 2019;27(3):428–435. doi: 10.1109/tfuzz.2018.2859899. [DOI] [Google Scholar]
- 11.Liu Z.-G., Pan Q., Dezert J., Martin A. Combination of classifiers with optimal weight based on evidential reasoning. IEEE Transactions on Fuzzy Systems. 2018;26(3):1217–1230. doi: 10.1109/tfuzz.2017.2718483. [DOI] [Google Scholar]
- 12.Fei L., Feng Y. A novel retrieval strategy for case-based reasoning based on attitudinal Choquet integral. Engineering Applications of Artificial Intelligence. 2020;94 doi: 10.1016/j.engappai.2020.103791.103791 [DOI] [Google Scholar]
- 13.Zhou M., Liu X., Yang J. Evidential reasoning approach for MADM based on incomplete interval value. Journal of Intelligent & Fuzzy Systems. 2017;33(6):3707–3721. doi: 10.3233/jifs-17522. [DOI] [Google Scholar]
- 14.Xu X., Xu H., Wen C., Li J., Hou P., Zhang J. A belief rule-based evidence updating method for industrial alarm system design. Control Engineering Practice. 2018;81:73–84. doi: 10.1016/j.conengprac.2018.09.001. [DOI] [Google Scholar]
- 15.Fu C., Xue M., Xu D.-L., Yang S.-L. Selecting strategic partner for tax information systems based on weight learning with belief structures. International Journal of Approximate Reasoning. 2019;105:66–84. doi: 10.1016/j.ijar.2018.11.009. [DOI] [Google Scholar]
- 16.Deng X., Jiang W. A total uncertainty measure for D numbers based on belief intervals. International Journal of Intelligent Systems. 2019;34(12):p. 3302. doi: 10.1002/int.22195. [DOI] [Google Scholar]
- 17.Liu B., Deng Y. Risk evaluation in failure mode and effects analysis based on D numbers theory. International Journal of Computers Communications & Control. 2019;14(5):672–691. [Google Scholar]
- 18.Jiang W., Cao Y., Deng X. A novel Z-network model based on Bayesian network and Z-number. IEEE Transactions on Fuzzy Systems. 2020;28(8):p. 1585. doi: 10.1109/TFUZZ.2019.2918999. [DOI] [Google Scholar]
- 19.Tian Y., Liu L., Mi X., Kang B. Zslf:a new soft likelihood function based on Z-numbers and its application in expert decision system. IEEE Transactions on Fuzzy Systems. 2020;22(7):2333–2349. doi: 10.1109/TFUZZ.2020.2997328. [DOI] [Google Scholar]
- 20.Fei L., Feng Y., Liu L. Evidence combination using OWA‐based soft likelihood functions. International Journal of Intelligent Systems. 2019;34(9):2269–2290. doi: 10.1002/int.22166. [DOI] [Google Scholar]
- 21.Chen C.-H. An arrival time prediction method for bus system. IEEE Internet of Things Journal. 2018;5(5):4231–4232. doi: 10.1109/jiot.2018.2863555. [DOI] [Google Scholar]
- 22.Jiang W., Huang K., Geng J., Deng X. Multi-scale metric learning for few-shot learning. IEEE Transactions on Circuits and Systems for Video Technology. 2020:p. 1. doi: 10.1109/TCSVT.2020.2995754. [DOI] [Google Scholar]
- 23.Chen C.-H., Song F., Hwang F.-J., Wu L. A probability density function generator based on neural networks. Physica A: Statistical Mechanics and Its Applications. 2020;541 doi: 10.1016/j.physa.2019.123344.123344 [DOI] [Google Scholar]
- 24.Gini C. Variabilità e mutabilità, In: Pizetti E., Salvemini T., editors. Reprinted in Memorie di Metodologica Statistica. Rome: Libreria Eredi Virgilio Veschi; 2005. [Google Scholar]
- 25.Yager R. R., Petry F. An intelligent quality-based approach to fusing multi-source probabilistic information. Information Fusion. 2016;31:127–136. doi: 10.1016/j.inffus.2016.02.005. [DOI] [Google Scholar]
- 26.Yager R. R., Petry F. E. Using Quality Measures in the Intelligent Fusion of Probabilistic Information,” in Information Quality In Information Fusion And Decision Making. Berlin, Germany: Springer; 2019. pp. 51–77. [Google Scholar]
- 27.Bouhamed S. A., Kallel I. K., Yager R. R., Bossé É., Solaiman B. An intelligent quality-based approach to fusing multi-source possibilistic information. Information Fusion. 2020;55:68–90. [Google Scholar]
- 28.Li D., Deng Y., Gao X. A generalized expression for information quality of basic probability assignment. IEEE Access. 2019;7(1):174 734–174 739. doi: 10.1109/access.2019.2956956. [DOI] [Google Scholar]
- 29.Li D., Deng Y. A new correlation coefficient based on generalized information quality. IEEE Access. 2019;7(1):175 411–175 419. doi: 10.1109/access.2019.2957796. [DOI] [Google Scholar]
- 30.Garg H., Rani D. New generalised Bonferroni mean aggregation operators of complex intuitionistic fuzzy information based on Archimedean t-norm and t-conorm. Journal of Experimental & Theoretical Artificial Intelligence. 2020;32(1):81–109. doi: 10.1080/0952813x.2019.1620871. [DOI] [Google Scholar]
- 31.Gao X., Deng Y. Quantum model of mass function. International Journal of Intelligent Systems. 2020;35(2):267–282. doi: 10.1002/int.22208. [DOI] [Google Scholar]
- 32.Xiao F. “Generalization of Dempster–Shafer theory: a complex mass function. Applied Intelligence. 2019;50(10):3266–3275. [Google Scholar]
- 33.He Z., Chan F. T. S., Jiang W. A quantum framework for modelling subjectivity in multi-attribute group decision making. Computers & Industrial Engineering. 2018;124:560–572. doi: 10.1016/j.cie.2018.08.001. [DOI] [Google Scholar]
- 34.Garg H., Rani D. A robust correlation coefficient measure of complex intuitionistic fuzzy sets and their applications in decision-making. Applied Intelligence. 2019;49(2):496–512. doi: 10.1007/s10489-018-1290-3. [DOI] [Google Scholar]
- 35.Xiao F. CEQD: a complex mass function to predict interference effects. IEEE Transactions on Cybernetics. 2021:p. 1. doi: 10.1109/TCYB.2020.3040770. [DOI] [PubMed] [Google Scholar]
- 36.Dai J., Deng Y. A new method to predict the interference effect in quantum-like Bayesian networks. Soft Computing. 2020;24(14):10 287–310 294. doi: 10.1007/s00500-020-04693-2. [DOI] [Google Scholar]
- 37.Pennecchi F., Oberto L. Uncertainty evaluation for the estimate of a complex-valued quantity modulus. Metrologia. 2010;47(3):p. 157. doi: 10.1088/0026-1394/47/3/006. [DOI] [Google Scholar]
- 38.Deng X. Analyzing the monotonicity of belief interval based uncertainty measures in belief function theory. International Journal of Intelligent Systems. 2018;33(9):1869–1879. doi: 10.1002/int.21999. [DOI] [Google Scholar]
- 39.Xiao F., Cao Z., Jolfaei A. A novel conflict measurement in decision making and its application in fault diagnosis. IEEE Transactions on Fuzzy Systems. 2020;29(1):186–197. [Google Scholar]
- 40.Liu Z., Pan Q., Dezert J., Han J.-W., He Y. Classifier fusion with contextual reliability evaluation. IEEE Transactions on Cybernetics. 2018;48(5):1605–1618. doi: 10.1109/tcyb.2017.2710205. [DOI] [PubMed] [Google Scholar]
- 41.Xu X., Zheng J., Yang J.-b., Xu D.-l., Chen Y.-w. Data classification using evidence reasoning rule. Knowledge-Based Systems. 2017;116:144–151. doi: 10.1016/j.knosys.2016.11.001. [DOI] [Google Scholar]
- 42.Xiao F. Evidence combination based on prospect theory for multi-sensor data fusion. ISA Transactions. 2020;106:253–261. doi: 10.1016/j.isatra.2020.06.024. [DOI] [PubMed] [Google Scholar]
- 43.Fu C., Chang W., Yang S. Multiple criteria group decision making based on group satisfaction. Information Sciences. 2020;518:309–329. doi: 10.1016/j.ins.2020.01.021. [DOI] [Google Scholar]
- 44.Zhou M., Liu X.-B., Chen Y.-W., Yang J.-B. Evidential reasoning rule for MADM with both weights and reliabilities in group decision making. Knowledge-Based Systems. 2018;143:142–161. doi: 10.1016/j.knosys.2017.12.013. [DOI] [Google Scholar]
- 45.Wang C., Tan Z. X., Ye Y., Wang L., Cheong K. H., Xie N.-G. A rumor spreading model based on information entropy. Scientific Reports. 2017;7(1):1–14. doi: 10.1038/s41598-017-09171-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 46.Yan H., Deng Y. An improved belief entropy in evidence theory. IEEE Access. 2020;8(1):57 505–557 516. doi: 10.1109/access.2020.2982579. [DOI] [Google Scholar]
- 47.Babajanyan S., Allahverdyan A., Cheong K. H. Energy and entropy: path from game theory to statistical mechanics. Physical Review Research. 2020;2(4) doi: 10.1103/physrevresearch.2.043055.043055 [DOI] [Google Scholar]
- 48.Zhang J., Liu R., Zhang J., Kang B. Extension of Yager’s negation of a probability distribution based on Tsallis entropy. International Journal of Intelligent Systems. 2020;35(1):72–84. doi: 10.1002/int.22198. [DOI] [Google Scholar]
- 49.Cao Z., Lin C.-T., Lai K.-L., et al. Extraction of SSVEPs-based inherent fuzzy entropy using a wearable headband EEG in migraine patients. IEEE Transactions on Fuzzy Systems. 2020;28(1):p. 14. doi: 10.1109/TFUZZ.2019.2905823. [DOI] [Google Scholar]
- 50.Liu P., Zhang X., Wang Z. An extended VIKOR method for multiple attribute decision making with linguistic D numbers based on fuzzy entropy. International Journal of Information Technology & Decision Making. 2020;19(01):143–167. doi: 10.1142/s0219622019500433. [DOI] [Google Scholar]
- 51.Deng Y. Uncertainty measure in evidence theory. Science China Information Sciences. 2020;63(11):p. 210201. doi: 10.1007/s11432-020-3006-9. [DOI] [Google Scholar]
- 52.Liu F., Gao X., Zhao J., Deng Y. Generalized belief entropy and its application in identifying conflict evidence. IEEE Access. 2019;7(1):126 625–126 633. doi: 10.1109/access.2019.2939332. [DOI] [Google Scholar]
- 53.Gao X., Deng Y. The pseudo-pascal triangle of maximum Deng entropy. International Journal of Computers Communications & Control. 2020;15(1):p. 1006. doi: 10.15837/ijccc.2020.1.3735. [DOI] [Google Scholar]
- 54.Song Y., Fu Q., Wang Y.-F., Wang X. Divergence-based cross entropy and uncertainty measures of Atanassov’s intuitionistic fuzzy sets with their application in decision making. Applied Soft Computing. 2019;84 doi: 10.1016/j.asoc.2019.105703.105703 [DOI] [Google Scholar]
- 55.Deng Y. Information volume of mass function. International Journal of Computers Communications & Control. 2020;15(6):p. 3983. doi: 10.15837/ijccc.2020.6.3983. [DOI] [Google Scholar]
- 56.Abellan J., Bosse E. Critique of recent uncertainty measures developed under the evidence theory and belief intervals. IEEE Transactions on Systems, Man, and Cybernetics: Systems. 2020;50(3):1186–1192. doi: 10.1109/tsmc.2017.2770128. [DOI] [Google Scholar]
- 57.Pan L., Deng Y. Probability transform based on the ordered weighted averaging and entropy difference. International Journal of Computers Communications & Control. 2020;15(4):p. 3743. doi: 10.15837/ijccc.2020.4.3743. [DOI] [Google Scholar]
- 58.Cui H., Liu Q., Zhang J., Kang B. An improved deng entropy and its application in pattern recognition. IEEE Access. 2019;7:18 284–18 292. [Google Scholar]
- 59.Deng Y., Deng Y. 2021. Entropy Measure of Quantum Entanglement. [DOI]
- 60.Xiao F. GIQ: a generalized intelligent quality-based approach for fusing multi-source information. IEEE Transactions on Fuzzy Systems. 2020:p. 1. doi: 10.1109/TFUZZ.2020.2991296. [DOI] [Google Scholar]
- 61.Xiao F. A distance measure for intuitionistic fuzzy sets and its application to pattern classification problems. IEEE Transactions on Systems, Man, and Cybernetics: Systems. 2019 doi: 10.1109/TSMC.2019.2958635. [DOI] [Google Scholar]
- 62.Liao H., Ren Z., Fang R. A Deng-entropy-based evidential reasoning approach for multi-expert multi-criterion decision-making with uncertainty. International Journal of Computational Intelligence Systems. 2020;13(1):1281–1294. [Google Scholar]
- 63.Fei L., Lu J., Feng Y. An extended best-worst multi-criteria decision-making method by belief functions and its applications in hospital service evaluation. Computers & Industrial Engineering. 2020;142 doi: 10.1016/j.cie.2020.106355.106355 [DOI] [Google Scholar]
- 64.Garg H. A new possibility degree measure for interval‐valued q‐rung orthopair fuzzy sets in decision‐making. International Journal of Intelligent Systems. 2021;36(1):526–557. doi: 10.1002/int.22308. [DOI] [Google Scholar]
- 65.Tang M., Liao H., Mi X., Xu X., Herrera F. Dynamic subgroup-quality-based consensus in managing consistency, nearness, and evenness quality indices for large-scale group decision making under hesitant environment. Journal of the Operational Research Society. 2020:1–14. doi: 10.1080/01605682.2019.1708823. [DOI] [Google Scholar]
- 66.Xiao F. CED: a distance for complex mass functions. IEEE Transactions on Neural Networks and Learning Systems. 2020:p. 1. doi: 10.1109/TNNLS.2020.2984918. [DOI] [PubMed] [Google Scholar]
- 67.Garg H., Rani D. Complex interval-valued intuitionistic fuzzy sets and their aggregation operators. Fundamenta Informaticae. 2019;164(1):61–101. doi: 10.3233/fi-2019-1755. [DOI] [Google Scholar]
- 68.Xiao F. Generalized belief function in complex evidence theory. Journal of Intelligent & Fuzzy Systems. 2020;38(4):3665–3673. doi: 10.3233/jifs-179589. [DOI] [Google Scholar]