Abstract
Evidence theory is widely used to fuse uncertain information, but the fusion of conflicting evidence remains an open problem. To address conflicting evidence fusion in single target recognition, we propose a novel evidence combination method based on an improved pignistic probability function. First, the improved pignistic probability function redistributes the probability of multi-subset propositions according to the weight of the single-subset propositions in a basic probability assignment (BPA), which reduces both the computational complexity and the information loss of the conversion. A combination of the Manhattan distance and the evidence angle is then proposed to extract the certainty of each piece of evidence and to obtain the mutual support between pieces of evidence; entropy is used to calculate the uncertainty of the evidence, and the weighted-average method is used to correct and update the original evidence. Finally, the Dempster combination rule is used to fuse the updated evidence. Verified on highly conflicting examples with single-subset and multi-subset propositions, and compared with the Jousselme distance method, the combination of the Lance distance and reliability entropy, and the combination of the Jousselme distance and an uncertainty measure, our approach achieved better convergence and improved the average accuracy by 0.51% and 2.43%.
Keywords: DS evidence theory, pignistic probability function, information fusion
1. Introduction
As an uncertain reasoning method, evidence theory [1,2] requires weaker conditions than Bayesian probability theory, yet it can express "uncertainty" and "ignorance" directly. The primary data required in evidence theory are more intuitive and easier to obtain than in probabilistic reasoning, and knowledge and data from different experts or data sources can be integrated quickly to describe uncertainty flexibly. Evidence theory has been widely used in supplier selection [3,4], target recognition [5,6], decision making [7,8], reliability analysis [9,10,11,12,13], optimization in uncertain environments [14,15,16], etc. However, in applications of DS evidence theory, evidence fusion plays a crucial role because evidence sources are not fully reliable. The Dempster fusion rule is based on the multiplication principle, and when the evidence is highly conflicting it can produce counterintuitive outcomes, known as the "Zadeh paradox".
To address the issue of conflicting evidence fusion, scholars have presented many outstanding studies. When the original evidence is conflicting, traditional DS theory cannot be applied directly and needs to be improved. In recent years, a large number of scholars and experts have improved evidence theory from two aspects. The first is to improve the fusion rule of DS evidence theory. Sun introduced the concept of credibility, assumed that the credibility of all evidence is equal, and modified the Dempster rule by weighted summation [17]. Yang established an evidential reasoning (ER) rule that combines multiple pieces of independent evidence with weights and reliabilities, extending the Dempster rule by specifying how to combine completely reliable evidence and by analyzing significant or complete conflicts through reliability perturbations [18]. Deng proposed generalized evidence theory (GET) and defined the generalized basic probability assignment (GBPA); he established a model to deal with uncertain information, provided a generalized combination rule (GCR) for combining GBPAs, and constructed a generalized conflict model to measure the conflict between evidence [19]. The second aspect is to modify the original evidence. Smets proposed the pignistic probability transformation, which adopts an average distribution strategy so that the transformed mass assignments satisfy the probability conditions [20]. Based on a geometric interpretation of evidence theory, Jousselme defined the Jousselme evidence distance to describe the differences between pieces of evidence through distance information [21]. Murphy argued that the evidence should be averaged with weights, which better handles the normalization problem [22]. Tang proposed a multi-sensor data fusion method based on weighted belief entropy, which measures the uncertainty of evidence through the mass functions and the recognition framework to reduce the loss of evidence information [23].
In this field, domestic and foreign scholars and experts have achieved excellent results within a particular range. However, in single target recognition there is only one true target, so the desired fusion result is a single-subset proposition; a fusion result that still contains multi-subset propositions leaves uncertainty in the outcome and reduces the probability of correct target recognition. At present, no relevant research has been conducted on this specific problem. In this article, a method for recognizing single targets through conflicting evidence fusion is proposed. The method consolidates multi-subset propositions within the framework of evidence theory into single-subset propositions, while incorporating the evidence distance, the evidence angle, and entropy to enhance the accuracy of target fusion. First, the pignistic probability function is improved to transform each piece of original evidence into single-subset propositions, so that the single recognition result produced by the Dempster rule does not include multi-subset propositions. Then, the combination of the Manhattan distance and the evidence angle is proposed to extract the degree of evidence certainty and to obtain the mutual support between all pieces of evidence. Furthermore, entropy is introduced to calculate the uncertainty of the evidence. The initial evidence is corrected according to these coefficients and transformed into updated evidence. Finally, the Dempster combination rule is used to fuse the updated evidence.
2. Materials and Methods
2.1. Dempster Rule
The research objects under consideration form a nonempty set, which is called the domain. Let U be the domain containing all possible values of a variable X, where the elements of U are mutually exclusive; then, U is called the recognition framework of X.
Definition 1.
Let U be a recognition framework, and let the function $m: 2^U \to [0, 1]$ satisfy the following conditions:
- (1) $m(\varnothing) = 0$;
- (2) $\sum_{A \subseteq U} m(A) = 1$.
Then, $m(A)$ is the basic probability assignment (BPA) of A, $m$ is the mass function, and $m(A)$ represents the degree of trust in A. If $m(A) > 0$, A is called a focal element.
Suppose $m_1$ and $m_2$ are two basic probability assignments on the same recognition framework U, with focal elements $A_i$ and $B_j$, respectively. Namely,

$$\sum_{i} m_1(A_i) = 1, \qquad \sum_{j} m_2(B_j) = 1 \tag{1}$$

Then,

$$m(A) = \begin{cases} \dfrac{1}{1-K} \sum\limits_{A_i \cap B_j = A} m_1(A_i)\, m_2(B_j), & A \neq \varnothing \\ 0, & A = \varnothing \end{cases} \tag{2}$$

where $K = \sum_{A_i \cap B_j = \varnothing} m_1(A_i)\, m_2(B_j)$; $A \subseteq U$; and K is the conflict factor, which reflects the degree of conflict between the evidence. $1/(1-K)$ is called the normalization factor. The Dempster rule reallocates the conflicting mass to the nonempty sets in proportion to their combined masses.
Define the system recognition framework as $U = \{A_1, A_2, \ldots, A_M\}$, the N pieces of evidence as $E_1, E_2, \ldots, E_N$, and the mass function corresponding to each piece of evidence as $m_1, m_2, \ldots, m_N$.
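To make the combination rule concrete, the following minimal sketch (ours, not part of the original paper) combines two mass functions represented as Python dictionaries keyed by frozensets of hypotheses; the proposition names in the usage lines are illustrative only.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster combination rule of Equation (2) for two BPAs given as
    dicts mapping frozenset propositions to masses."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:                                   # non-empty intersection keeps the mass
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y                       # K: mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict (K = 1); the Dempster rule is undefined")
    scale = 1.0 / (1.0 - conflict)                  # normalization factor 1/(1 - K)
    return {a: scale * v for a, v in combined.items()}

# Illustrative use with two single-subset BPAs over {A, B, C}:
m1 = {frozenset("A"): 0.9, frozenset("C"): 0.1}
m2 = {frozenset("A"): 0.8, frozenset("B"): 0.1, frozenset("C"): 0.1}
print(dempster_combine(m1, m2))   # most of the mass concentrates on {A}
```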
2.2. Improved Pignistic Probability Function
In single target recognition, the fusion result should identify one specific target. In the framework of DS evidence theory, when the evidence contains multi-subset propositions, the fusion result also contains multi-subset propositions, which increases the computational complexity. This work improves the pignistic probability function to transform multi-subset propositions into single-subset propositions, distributing the BPA of a multi-subset proposition according to the weight of its single-subset propositions. The weight is allocated according to the information on each single-subset proposition provided by the evidence itself, which reduces the computational complexity and the information loss when transforming a BPA into a single-subset BPA with the pignistic probability function [24]. The improved pignistic probability conversion function is as follows:
$$BetP'(B) = m(B) + \sum_{B \subset A,\ |A| > 1} \frac{m(B)}{\sum_{C \subset A,\ |C| = 1} m(C)}\, m(A) \tag{3}$$
where B is a single-subset proposition in U, A refers to a multi-subset proposition of the original evidence with $B \subset A$ and $A \neq \varnothing$, $\varnothing$ is the empty set, and $|A|$ represents the number of elements contained in proposition A.
After the pignistic probability conversion, the BPA $m_i$ is converted into a single-subset BPA $m_i'$:
$$m_i'(B) = BetP_i'(B), \qquad \forall B \subseteq U,\ |B| = 1 \tag{4}$$
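The sketch below follows our reading of Formula (3): every multi-subset mass is redistributed over its single-element members in proportion to those members' own masses. The uniform fallback when all member masses are zero is our assumption, since the paper does not state how that case is handled.

```python
def improved_pignistic(m):
    """Improved pignistic transformation of Formula (3): the mass of every
    multi-subset proposition is split over its single-element members in
    proportion to the members' own single-subset masses."""
    out = {a: v for a, v in m.items() if len(a) == 1}      # keep single-subset masses
    for a, mass in m.items():
        if len(a) <= 1:
            continue
        members = [frozenset([e]) for e in a]
        weights = [m.get(b, 0.0) for b in members]
        total = sum(weights)
        for b, w in zip(members, weights):
            if total > 0:
                out[b] = out.get(b, 0.0) + mass * w / total
            else:                                           # assumption: fall back to an even split
                out[b] = out.get(b, 0.0) + mass / len(members)
    return out

# Example: m({A}) = 0.5, m({B}) = 0.2, m({A, B}) = 0.3
print(improved_pignistic({frozenset("A"): 0.5, frozenset("B"): 0.2,
                          frozenset("AB"): 0.3}))
# {A}: 0.5 + 0.3 * 5/7 = 0.714..., {B}: 0.2 + 0.3 * 2/7 = 0.285...
```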
2.3. Evidence Support Based on the Manhattan Distance
The distance between evidence [25] can effectively measure the degree of support between pieces of evidence. At present, domestic and foreign scholars have proposed a variety of distance measures, including the Lance distance, the Jousselme distance, and the Mahalanobis distance. However, the Lance distance does not take the correlation between indicators into account, the Jousselme distance is affected by the dispersion of the basic probability distribution of the evidence [26], and the Mahalanobis distance requires the covariance matrix, which is computationally expensive. The Manhattan distance adopted in this paper is computed over the single-subset BPAs to measure the similarity between pieces of evidence and has low computational complexity.
The Manhattan distance between two pieces of evidence is calculated as:
$$d_{ij} = \sum_{k=1}^{M} \left| m_i'(A_k) - m_j'(A_k) \right| \tag{5}$$
where $i, j = 1, 2, \ldots, N$. The Manhattan distance between each pair of evidence is calculated to obtain the distance matrix D:
$$D = \begin{bmatrix} 0 & d_{12} & \cdots & d_{1N} \\ d_{21} & 0 & \cdots & d_{2N} \\ \vdots & \vdots & \ddots & \vdots \\ d_{N1} & d_{N2} & \cdots & 0 \end{bmatrix} \tag{6}$$
The distance between evidence is negatively correlated with the support.
The evidence support of $m_i'$ is calculated as:
$$Sup(m_i') = \frac{1}{\sum_{j=1,\, j \neq i}^{N} d_{ij}} \tag{7}$$
The support degree is obtained based on the Manhattan distance between the evidence. The support is normalized to obtain the support coefficient $\alpha_i$ of the evidence:
$$\alpha_i = \frac{Sup(m_i')}{\sum_{k=1}^{N} Sup(m_k')} \tag{8}$$
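A sketch of Equations (5)–(8) follows. The reciprocal form used for the support in Equation (7) is our reading of the (garbled) original and is the least certain part of this sketch, so the resulting coefficients may differ slightly from the values reported later in Table 6.

```python
import numpy as np

def support_coefficients(bpas):
    """Support coefficient alpha_i of each piece of evidence from pairwise
    Manhattan distances (Equations (5)-(8)).  `bpas` is an (N, M) array whose
    rows are the converted single-subset BPAs m_i'."""
    d = np.abs(bpas[:, None, :] - bpas[None, :, :]).sum(axis=2)  # Eq. (5)/(6): distance matrix D
    sup = 1.0 / d.sum(axis=1)       # Eq. (7): support drops as the total distance grows
    return sup / sup.sum()          # Eq. (8): normalized support coefficient alpha_i
```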
2.4. Evidence Similarity Based on Evidence Angle
The angle between two pieces of evidence can be used to characterize the consistency between the two bodies of evidence, and the resulting value measures the similarity between them. The formula for the evidence angle [27] is as follows:
$$Ang(m_i', m_j') = \cos\theta_{ij} = \frac{\sum_{k=1}^{M} m_i'(A_k)\, m_j'(A_k)}{\sqrt{\sum_{k=1}^{M} m_i'(A_k)^2}\ \sqrt{\sum_{k=1}^{M} m_j'(A_k)^2}} \tag{9}$$
The larger the value of $Ang(m_i', m_j')$, the more consistent the two pieces of evidence are, which indicates a higher similarity between them. The evidence angle between each pair of evidence is calculated to form the angle matrix Ang, and the similarity of each piece of evidence is then calculated from the angle matrix:
$$Sim(m_i') = \sum_{j=1,\, j \neq i}^{N} Ang(m_i', m_j') \tag{10}$$
The similarity between evidence is measured based on the evidence angle. The similarity is normalized to obtain the similarity coefficient $\beta_i$ of the evidence:
$$\beta_i = \frac{Sim(m_i')}{\sum_{k=1}^{N} Sim(m_k')} \tag{11}$$
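The following sketch of Equations (9)–(11) sums the cosine of the angle between every pair of BPA vectors over the other pieces of evidence and normalizes the result. Applied to the converted BPAs of Table 5, it gives (0.2352, 0.0629, 0.2336, 0.2376, 0.2307), which agrees with the similarity coefficients reported in Table 6.

```python
import numpy as np

def similarity_coefficients(bpas):
    """Similarity coefficient beta_i based on the evidence angle
    (Equations (9)-(11)) for an (N, M) array of single-subset BPAs."""
    norms = np.linalg.norm(bpas, axis=1)
    ang = (bpas @ bpas.T) / np.outer(norms, norms)  # Eq. (9): angle (cosine) matrix Ang
    sim = ang.sum(axis=1) - 1.0                     # Eq. (10): drop the self term cos(theta_ii) = 1
    return sim / sim.sum()                          # Eq. (11): normalized similarity beta_i
```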
2.5. Evidence Uncertainty Based on Entropy
In evidence theory, the amount of information that a piece of evidence carries can be measured by information entropy. The higher the information entropy, the more information the evidence carries, the lower the probability of any single outcome, and the higher the uncertainty. The information entropy is calculated as follows:
$$H_i = -\sum_{k=1}^{M} m_i'(A_k) \log_2 m_i'(A_k) \tag{12}$$
The uncertainty coefficient of each piece of evidence is calculated by:
$$\mu_i = e^{H_i} \tag{13}$$
where $i = 1, 2, \ldots, N$ and $0 \log_2 0 = 0$ by convention.
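A sketch of Equations (12) and (13) follows. The exponential form $\mu_i = e^{H_i}$ is inferred from the coefficient values later reported in Table 6 (for example, $e^{H} \approx 1.60$ for a (0.9, 0.1, 0) BPA); treat it as our reading rather than a verbatim reproduction of the paper's formula.

```python
import numpy as np

def uncertainty_coefficients(bpas):
    """Uncertainty coefficient mu_i = exp(H_i) of each converted BPA, where
    H_i is the Shannon entropy in bits (Equations (12) and (13))."""
    safe = np.where(bpas > 0, bpas, 1.0)            # convention: 0 * log2(0) = 0
    h = -(bpas * np.log2(safe)).sum(axis=1)         # Eq. (12): information entropy H_i
    return np.exp(h)                                # Eq. (13): uncertainty coefficient mu_i
```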
2.6. Evidence Fusion Based on the Dempster Rule
The evidence fusion coefficient integrating the Manhattan distance, the evidence angle, and the reliability entropy is:
$$w_i = \alpha_i\, \beta_i\, \mu_i \tag{14}$$
The fusion coefficient is normalized to obtain the final evidence fusion coefficient $\tilde{w}_i$. The single-subset BPA is then modified:
$$\tilde{m}(A_k) = \sum_{i=1}^{N} \tilde{w}_i\, m_i'(A_k), \qquad k = 1, 2, \ldots, M \tag{15}$$
where $\tilde{w}_i = w_i / \sum_{k=1}^{N} w_k$.
All of the initial evidence is replaced with the modified evidence $\tilde{m}$, and finally the modified evidence is fused $N-1$ times with the Dempster rule:
$$m = \underbrace{\tilde{m} \oplus \tilde{m} \oplus \cdots \oplus \tilde{m}}_{N-1\ \text{combinations}} \tag{16}$$

where $\oplus$ denotes the Dempster combination of Equation (2).
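Putting the pieces together, the following sketch chains the helper functions defined above (dempster_combine, improved_pignistic, support_coefficients, similarity_coefficients, uncertainty_coefficients). Because the support expression in Equation (7) is the least certain part of our reconstruction, the numerical output may deviate slightly from the tables in Section 3.

```python
import numpy as np

def fuse(evidence):
    """End-to-end sketch of the proposed method: convert each BPA with the
    improved pignistic function, weight the evidence with w_i = alpha_i *
    beta_i * mu_i (Eq. (14)), form the weighted-average BPA (Eq. (15)), and
    combine it with itself N-1 times with the Dempster rule (Eq. (16))."""
    converted = [improved_pignistic(m) for m in evidence]
    props = sorted({p for m in converted for p in m}, key=sorted)
    bpas = np.array([[m.get(p, 0.0) for p in props] for m in converted])
    w = (support_coefficients(bpas) * similarity_coefficients(bpas)
         * uncertainty_coefficients(bpas))
    w /= w.sum()                                     # normalized fusion coefficients
    avg = dict(zip(props, w @ bpas))                 # Eq. (15): weighted-average BPA
    fused = avg
    for _ in range(len(evidence) - 1):               # Eq. (16): N-1 Dempster combinations
        fused = dempster_combine(fused, avg)
    return fused

# Illustrative call on the five BPAs of Table 1:
evidence = [{frozenset("A"): 0.90, frozenset("C"): 0.10},
            {frozenset("B"): 0.01, frozenset("C"): 0.99},
            {frozenset("A"): 0.50, frozenset("B"): 0.20, frozenset("C"): 0.30},
            {frozenset("A"): 0.98, frozenset("B"): 0.01, frozenset("C"): 0.01},
            {frozenset("A"): 0.90, frozenset("B"): 0.05, frozenset("C"): 0.05}]
print(fuse(evidence))   # the mass should concentrate on {A}, as in Table 2
```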
The flow graph of the method proposed in this paper is shown in Figure 1.
Figure 1.
The flow graph of the proposed method.
3. Results
In the following, we verify the effectiveness of the method through two highly conflicting examples: one that contains only single-subset propositions and one that contains multi-subset propositions.
3.1. An Example of Single-Subset Proposition Conflicting Evidence
An example of single-subset proposition conflicting evidence can be found in reference [28]. The evidence recognition framework is $U = \{A, B, C\}$, and there are five independent pieces of evidence. The corresponding BPAs are shown in Table 1 [28].
Table 1.
Single-subset proposition conflicting evidence.
| Evidence | m(A) | m(B) | m(C) |
|---|---|---|---|
| $m_1$ | 0.90 | 0 | 0.10 |
| $m_2$ | 0 | 0.01 | 0.99 |
| $m_3$ | 0.50 | 0.20 | 0.30 |
| $m_4$ | 0.98 | 0.01 | 0.01 |
| $m_5$ | 0.90 | 0.05 | 0.05 |
3.1.1. Improved Pignistic Probability Function
This example contains only single-subset propositions, so the BPA converted by Formula (3) is identical to the original BPA, i.e., $m_i' = m_i$.
3.1.2. Calculate Fusion Coefficient
Formulas (5)–(8) are applied to obtain the distance matrix, the support based on the Manhattan distance between the evidence, and the evidence support coefficient $\alpha_i$. Formulas (9)–(11) are applied to obtain the evidence similarity coefficient $\beta_i$, and Formulas (12) and (13) are applied to obtain the evidence uncertainty coefficient $\mu_i$.
The support coefficient and the similarity coefficient represent the certainty of the evidence, and the uncertainty coefficient represents the uncertainty of the evidence. The final evidence fusion coefficient is obtained by applying Formulas (14) and (15), and the modified evidence $\tilde{m}$ replaces the initial evidence.
3.1.3. Evidence Fusion Based on the Dempster Rule
Formula (16) is applied for fusion four times, and the fusion results are shown in Table 2 and Figure 2. A comparison with other methods is shown in Table 3 and Figure 3.
Table 2.
Fusion results of single-subset proposition conflicting examples.
| Fusion Times | m(A) | m(B) | m(C) |
|---|---|---|---|
| First | 0.9758 | 0.0051 | 0.0191 |
| Second | 0.9969 | 0.0004 | 0.0027 |
| Third | 0.9996 | 0.0000 | 0.0004 |
| Fourth | 0.9999 | 0.0000 | 0.0001 |
Figure 2.
Fusion results of single-subset proposition conflicting examples.
Table 3.
Comparison of evidence fusion with different methods.
| Approach | BPA | $m_1, m_2$ | $m_1, m_2, m_3$ | $m_1, \ldots, m_4$ | $m_1, \ldots, m_5$ |
|---|---|---|---|---|---|
| Dempster-Shafer | m(A) | 0 | 0 | 0 | 0 |
| | m(B) | 0 | 0 | 0 | 0 |
| | m(C) | 1 | 1 | 1 | 1 |
| Murphy [22] | m(A) | 0.4054 | 0.5055 | 0.8930 | 0.9834 |
| | m(B) | 0.0001 | 0.0000 | 0.0001 | 0.0000 |
| | m(C) | 0.5946 | 0.4945 | 0.1069 | 0.0166 |
| Deng [29] | m(A) | 0.4054 | 0.7211 | 0.9910 | 0.9996 |
| | m(B) | 0.0001 | 0.0040 | 0.0001 | 0.0000 |
| | m(C) | 0.5946 | 0.2749 | 0.0089 | 0.0003 |
| Wang [28] | m(A) | 0.5745 | 0.8382 | 0.9558 | 0.9968 |
| | m(B) | 0.0033 | 0.0142 | 0.0010 | 0.0001 |
| | m(C) | 0.4223 | 0.1476 | 0.0431 | 0.0031 |
| Chen [30] | m(A) | 0.4054 | 0.7211 | 0.9910 | 0.9996 |
| | m(B) | 0.0001 | 0.0040 | 0.0001 | 0.0000 |
| | m(C) | 0.5946 | 0.2749 | 0.0089 | 0.0003 |
| Xiao [31] | m(A) | 0.2790 | 0.5763 | 0.9397 | 0.9963 |
| | m(B) | 0.0001 | 0.0065 | 0.0004 | 0.0000 |
| | m(C) | 0.7210 | 0.4173 | 0.0599 | 0.0037 |
| Zhao [32] | m(A) | 0.4571 | 0.7178 | 0.9792 | 0.9991 |
| | m(B) | 0.0000 | 0.0046 | 0.0001 | 0.0000 |
| | m(C) | 0.5429 | 0.2775 | 0.0207 | 0.0009 |
| Ours | m(A) | 0.5784 | 0.8406 | 0.9962 | 0.9999 |
| | m(B) | 0.0000 | 0.0187 | 0.0002 | 0.0000 |
| | m(C) | 0.4216 | 0.1407 | 0.0036 | 0.0001 |
Figure 3.
Comparison of different methods on the fusion of several pieces of single-subset proposition evidence.
3.2. An Example of Multi-Subset Proposition Conflicting Evidence
Suppose there is a multi-sensor target recognition system whose recognition framework is $U = \{A, B, C\}$, where A is the real target. There are five independent sensors, and their recognition results are shown in Table 4.
Table 4.
Multi-subset proposition conflicting evidence.
| Evidence | m(A) | m(B) | m(C) | m(A, C) |
|---|---|---|---|---|
| $m_1$ | 0.41 | 0.29 | 0.30 | 0.00 |
| $m_2$ | 0.00 | 0.90 | 0.10 | 0.00 |
| $m_3$ | 0.58 | 0.07 | 0.00 | 0.35 |
| $m_4$ | 0.55 | 0.10 | 0.00 | 0.35 |
| $m_5$ | 0.60 | 0.00 | 0.10 | 0.30 |
3.2.1. Improved Pignistic Probability Function
According to Formula (3), the converted BPA is shown in Table 5.
Table 5.
Conversed BPA.
| Evidence | m'(A) | m'(B) | m'(C) |
|---|---|---|---|
| $m_1'$ | 0.41 | 0.29 | 0.30 |
| $m_2'$ | 0.00 | 0.90 | 0.10 |
| $m_3'$ | 0.93 | 0.07 | 0.00 |
| $m_4'$ | 0.90 | 0.10 | 0.00 |
| $m_5'$ | 0.8571 | 0.00 | 0.1429 |
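As a check of Formula (3), consider the fifth piece of evidence in Table 4: its mass of 0.30 on the multi-subset proposition {A, C} is split in proportion to the single-subset masses 0.60 and 0.10,

$$m_5'(A) = 0.60 + 0.30 \times \frac{0.60}{0.60 + 0.10} = 0.8571, \qquad m_5'(C) = 0.10 + 0.30 \times \frac{0.10}{0.60 + 0.10} = 0.1429,$$

which reproduces the last row of Table 5. For the third and fourth pieces of evidence, the single-subset mass of C is zero, so the entire 0.35 on {A, C} is assigned to A.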
3.2.2. Calculate Fusion Coefficient
The coefficients calculated by Formulas (5)–(14) are shown in Table 6.
Table 6.
Fusion coefficients of multi-subset proposition evidence.
| Coefficient | $m_1$ | $m_2$ | $m_3$ | $m_4$ | $m_5$ |
|---|---|---|---|---|---|
| Support coefficient $\alpha_i$ | 0.0440 | 0.0221 | 0.2924 | 0.3212 | 0.3203 |
| Similarity coefficient $\beta_i$ | 0.2352 | 0.0629 | 0.2336 | 0.2376 | 0.2307 |
| Uncertainty coefficient $\mu_i$ | 4.7894 | 1.5984 | 1.4470 | 1.5984 | 1.8072 |
| Fusion coefficient $\tilde{w}_i$ | 0.1221 | 0.0054 | 0.2433 | 0.3004 | 0.3287 |
Applying Formula (15), we obtained the final modified BPA: $\tilde{m}(A) = 0.8284$, $\tilde{m}(B) = 0.0873$, and $\tilde{m}(C) = 0.0841$.
3.2.3. Evidence Fusion Based on the Dempster Rule
Evidence fusion was performed four times by the Dempster rule, and the fusion results are shown in Table 7 and Figure 4. A comparison with other methods is shown in Table 8 and Figure 5.
Table 7.
Fusion results of multi-subset proposition conflict examples.
| Fusion Times | m(A) | m(B) | m(C) |
|---|---|---|---|
| First | 0.9790 | 0.0109 | 0.0101 |
| Second | 0.9978 | 0.0012 | 0.0010 |
| Third | 0.9998 | 0.0001 | 0.0001 |
| Fourth | 1.0000 | 0.0000 | 0.0000 |
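The first row of Table 7 can be checked directly: one Dempster self-combination of the modified BPA $\tilde{m} = (0.8284, 0.0873, 0.0841)$ gives

$$m(A) = \frac{0.8284^2}{0.8284^2 + 0.0873^2 + 0.0841^2} \approx 0.9790, \qquad m(B) \approx 0.0109, \qquad m(C) \approx 0.0101,$$

and each further combination with $\tilde{m}$ drives the mass of A closer to 1.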
Figure 4.
Fusion results of multi-subset proposition conflict.
Table 8.
Comparison of evidence fusion with different methods.
| Approach | BPA | $m_1, m_2$ | $m_1, m_2, m_3$ | $m_1, \ldots, m_4$ | $m_1, \ldots, m_5$ |
|---|---|---|---|---|---|
| Dempster-Shafer | m(A) | 0 | 0 | 0 | 0 |
| | m(B) | 0.8969 | 0.6350 | 0.3320 | 0 |
| | m(C) | 0.1031 | 0.3650 | 0.6680 | 1 |
| Murphy [22] | m(A) | 0.0964 | 0.4939 | 0.8362 | 0.9613 |
| | m(B) | 0.8119 | 0.4180 | 0.1147 | 0.0147 |
| | m(C) | 0.0917 | 0.0792 | 0.0410 | 0.0166 |
| | m(A, C) | 0.0000 | 0.0090 | 0.0081 | 0.0032 |
| Deng [29] | m(A) | 0.0000 | 0.6019 | 0.9329 | 0.9802 |
| | m(B) | 0.8969 | 0.2908 | 0.0225 | 0.0009 |
| | m(C) | 0.1031 | 0.0991 | 0.0354 | 0.0154 |
| | m(A, C) | 0.0000 | 0.0082 | 0.0092 | 0.0035 |
| Chen [30] | m(A) | 0.0000 | 0.7985 | 0.9629 | 0.9855 |
| | m(B) | 0.8969 | 0.1060 | 0.0043 | 0.0001 |
| | m(C) | 0.1031 | 0.0752 | 0.0190 | 0.0096 |
| | m(A, C) | 0.0000 | 0.0203 | 0.0139 | 0.0048 |
| Xiao [31] | m(A) | 0.1420 | 0.6391 | 0.9400 | 0.9816 |
| | m(B) | 0.7412 | 0.2462 | 0.0165 | 0.0006 |
| | m(C) | 0.1168 | 0.1072 | 0.0341 | 0.0141 |
| | m(A, C) | 0.0000 | 0.0075 | 0.0093 | 0.0037 |
| Zhao [32] | m(A) | 0.1046 | 0.6945 | 0.9355 | 0.9817 |
| | m(B) | 0.7989 | 0.1902 | 0.0163 | 0.0000 |
| | m(C) | 0.0965 | 0.1062 | 0.0409 | 0.0147 |
| | m(A, C) | 0.0000 | 0.0091 | 0.0073 | 0.0036 |
| Ours | m(A) | 0.2678 | 0.6714 | 0.9983 | 1.0000 |
| | m(B) | 0.5551 | 0.2205 | 0.0015 | 0.0000 |
| | m(C) | 0.1771 | 0.1080 | 0.0001 | 0.0000 |
Figure 5.
Comparison of different methods on the fusion of several pieces of multi-subset proposition evidence.
4. Discussion
As shown in the tables and figures above, applying the Dempster fusion rule directly leads to counterintuitive results.
The fusion results for the single-subset and multi-subset conflicting evidence are discussed in this section. As Table 2, Table 7, Figure 2, and Figure 4 show, our proposed method has a good fusion effect: the BPA of the true target A reaches 0.9999 and 1.0000 at the fourth fusion, while the BPA assigned to the other propositions decreases as the number of fusions increases. This shows that the method proposed in this paper can effectively extract the characteristics of the evidence.
Analyzing Table 3, Table 8, Figure 3, and Figure 5, when the number of pieces of evidence is two or three, our method is not always as effective as Chen's method and Zhao's method: with a small amount of evidence, the input data are insufficient and it is difficult to extract multiple features from each evidence source. However, as the number of evidence sources increases, the accuracy of our method improves rapidly. The tables and figures show that our proposed method has higher accuracy and a better effect after three or more fusion processes, and for four or more fusion processes it also performs best on the multi-subset proposition conflict example.
The experiments show that, when there are sufficient evidence sources, the method proposed in this article can effectively extract the mutually supportive features between the evidence sources and achieve good results.
5. Conclusions
In this article, considering the characteristics of the evidence and its information richness, we proposed a novel evidence combination method based on an improved pignistic probability function to solve the problem of highly conflicting evidence fusion in DS evidence theory. The experiments show that the method handles single target recognition problems well and improves the fusion accuracy of the evidence theory framework in target recognition. Evidence theory has a strong ability to handle uncertainty problems, and our next work will further investigate how it can be extended and applied in the real world.
Abbreviations
The following abbreviations are used in this manuscript:
| BPA | basic probability assignment |
Author Contributions
Conceptualization, X.S. and F.L.; methodology, P.Q.; validation, L.Y.; data curation, G.H.; writing—original draft preparation, F.L.; writing—review and editing, X.S. All authors have read and agreed to the published version of the manuscript.
Institutional Review Board Statement
The study did not require ethical approval.
Data Availability Statement
No new data were created or analyzed in this study. Data sharing is not applicable to this article.
Conflicts of Interest
The authors declare no conflict of interest.
Funding Statement
This research received no external funding.
Footnotes
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
References
- 1.Dempster A.P. Upper and Lower Probabilities Induced by a Multivalued Mapping. Ann. Math. Stat. 1967;38:325–339. doi: 10.1214/aoms/1177698950. [DOI] [Google Scholar]
- 2.Shafer G. A Mathematical Theory of Evidence. Volume 25. Princeton University Press; Princeton, NJ, USA: 1976. pp. 10–40. [Google Scholar]
- 3.Dou Z., Xu X., Lin Y., Zhou R. Application of D-S evidence fusion method in the fault detection of temperature sensor. Math. Probl. Eng. 2014;2014:395057. doi: 10.1155/2014/395057. [DOI] [Google Scholar]
- 4.Deng X., Hu Y., Deng Y., Mahadevan S. Supplier selection using AHP methodology extended by D numbers. Expert Syst. Appl. 2014;41:156–167. doi: 10.1016/j.eswa.2013.07.018. [DOI] [Google Scholar]
- 5.Xu X., Li S., Song X., Wen C., Xu D. The optimal design of industrial alarm systems based on evidence theory. Control. Eng. Pract. 2016;46:142–156. doi: 10.1016/j.conengprac.2015.10.014. [DOI] [Google Scholar]
- 6.Chen Y., Cremers A.B., Cao Z. Interactive color image segmentation via iterative evidential labeling. Inf. Fusion. 2014;20:292–304. doi: 10.1016/j.inffus.2014.03.007. [DOI] [Google Scholar]
- 7.Suh D., Yook J. A method to determine basic probability assignment in context awareness of a moving object. Int. J. Distrib. Sens. Netw. 2013;9:972641. doi: 10.1155/2013/972641. [DOI] [Google Scholar]
- 8.Zadeh L.A. A simple view of the Dempster-Shafer theory of evidence and its implication for the rule of combination. AI Mag. 1986;7:85–90. [Google Scholar]
- 9.Leung Y., Ji N.-N., Ma J.-H. An integrated information fusion approach based on the theory of evidence and group decision-making. Inf. Fusion. 2013;14:410–422. doi: 10.1016/j.inffus.2012.08.002. [DOI] [Google Scholar]
- 10.Khamseh S.A., Sedigh A.K., Moshiri B., Fatehi A. Control performance assessment based on sensor fusion techniques. Control. Eng. Pract. 2016;49:14–28. doi: 10.1016/j.conengprac.2016.01.008. [DOI] [Google Scholar]
- 11.Niu D., Wei Y., Shi Y., Karimi H.R. A novel evaluation model for hybrid power system based on vague set and Dempster-Shafer evidence theory. Math. Probl. Eng. 2012;2012:784389. doi: 10.1155/2012/784389. [DOI] [Google Scholar]
- 12.Zhou Q., Zhou H., Zhou Q., Yang F., Luo L., Li T. Structural damage detection based on posteriori probability support vector machine and Dempster-Shafer evidence theory. Appl. Soft Comput. 2015;36:368–374. doi: 10.1016/j.asoc.2015.06.057. [DOI] [Google Scholar]
- 13.Xu J., Zhong Z., Xu L. ISHM-oriented adaptive fault diagnostics for avionics based on a distributed intelligent agent system. Int. J. Syst. Sci. 2015;46:2287–2302. doi: 10.1080/00207721.2014.971090. [DOI] [Google Scholar]
- 14.Deng Y., Liu Y., Zhou D. An improved genetic algorithm with initial population strategy for symmetric TSP. Math. Probl. Eng. 2015;2015:212794. doi: 10.1155/2015/212794. [DOI] [Google Scholar]
- 15.Du W.-B., Gao Y., Liu C., Zheng Z., Wang Z. Adequate is better: Particle swarm optimization with limited-information. Appl. Math. Comput. 2015;268:832–838. doi: 10.1016/j.amc.2015.06.062. [DOI] [Google Scholar]
- 16.Deng X., Jiang W. On the negation of a Dempster-Shafer belief structure based on maximum uncertainty allocation. Inf. Sci. 2020;516:346–352. doi: 10.1016/j.ins.2019.12.080. [DOI] [Google Scholar]
- 17.Sun Q., Ye X.Q., Gu W.-K. A New Combination Rules of Evidence Theory. ACTA Electron. Sin. 2000;28:117. [Google Scholar]
- 18.Yang J., Xu D. Evidential reasoning rule for evidence combination. Artif. Intell. 2013;205:1–29. doi: 10.1016/j.artint.2013.09.003. [DOI] [Google Scholar]
- 19.Deng Y. Generalized evidence theory. Appl. Intell. 2015;43:530–543. doi: 10.1007/s10489-015-0661-2. [DOI] [Google Scholar]
- 20.Smets P., Kennes R. The transferable belief model. Artif. Intell. 1994;66:191–234. doi: 10.1016/0004-3702(94)90026-4. [DOI] [Google Scholar]
- 21.Jousselme A.L., Grenier D., Bossé E. A new distance between two bodies of evidence. Inf. Fusion. 2001;2:91–101. doi: 10.1016/S1566-2535(01)00026-4. [DOI] [Google Scholar]
- 22.Murphy C.K. Combining belief functions when evidence conflicts. Decis. Support Syst. 2000;29:1–9. doi: 10.1016/S0167-9236(99)00084-6. [DOI] [Google Scholar]
- 23.Tang Y., Zhou D., Xu S. A Weighted Belief Entropy-based Uncertainty Measure for Multi-sensor Data Fusion. Sensors. 2017;17:928. doi: 10.3390/s17040928. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 24.Smets P. The combination of evidence in the transferable belief model. IEEE Trans. Pattern Anal. Mach. Intell. 1990;12:447–458. doi: 10.1109/34.55104. [DOI] [Google Scholar]
- 25.Jousselme A.L., Maupin P. Distances in evidence theory: Comprehensive survey and generalizations. Int. J. Approx. Reason. 2012;53:118–145. doi: 10.1016/j.ijar.2011.07.006. [DOI] [Google Scholar]
- 26.Mao Y.F., Zhang G.D.L., Wang L. Measurement of evidence conflict based on overlapping degree. Control. Decis. 2017;32:293–298. [Google Scholar]
- 27.Chen L., Diao L., Sang J. A New Method to Handle Conflict when Combining Evidences Using Entropy Function and Evidence Angle with an Effective Application in Fault Diagnosis. Math. Probl. Eng. 2020;2020:3564365. doi: 10.1155/2020/3564365. [DOI] [Google Scholar]
- 28.Wang X., Di P., Yin D. Conflict Evidence Fusion Method Based on Lance Distance and Credibility Entropy. Syst. Eng. Electron. 2022;44:592–602. [Google Scholar]
- 29.Deng Y., Shi W.-K., Zhu Z.-F. Efficient combination approach of conflict evidence. J. Infrared Millim. Waves. 2004;23:27–32. [Google Scholar]
- 30.Chen L., Diao L., Sang J. Weighted Evidence Combination Rule Based on Evidence Distance and Uncertainty Measure: An Application in Fault Diagnosis. Math. Probl. Eng. 2018;2018:1–10. [Google Scholar]
- 31.Xiao F. Multi-sensor data fusion based on the belief divergence measure of evidences and the belief entropy. Inf. Fusion. 2018;46:23–32. doi: 10.1016/j.inffus.2018.04.003. [DOI] [Google Scholar]
- 32.Zhao K.Y., Sun R.T., Li L. An improved evidence fusion algorithm in multi-sensor systems. Appl. Intell. 2021;51:7614–7624. doi: 10.1007/s10489-021-02279-5. [DOI] [Google Scholar]