Algorithm 2 Splitting Rules based on Deng Entropy
Require: A root node X, where the ith instance has n condition attributes and one decision attribute D; stopping criterion: until all condition attributes are used up.
Ensure: An Evidential Decision Tree.
if all the samples in X belong to the same class then
Mark X as a leaf node and assign the class as its label.
return.
end if
for each attribute in X do
Compute the Deng entropy according to Equation (10), where the BBA represents the evidence of instance i for the attribute. The smaller the entropy value, the better the subsequent division.
end for
Get the best attribute and its splitting point.
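To make the entropy-based splitting step concrete, the following is a minimal Python sketch. It uses the standard definition of Deng entropy, E_d(m) = -Σ_{A⊆X} m(A) log2( m(A) / (2^|A| - 1) ), where the sum runs over the focal elements of a basic belief assignment (BBA) m. The function names and the per-attribute aggregation (summing entropies over instances) are illustrative assumptions, not necessarily the paper's exact Equation (10):

```python
from math import log2

def deng_entropy(bba):
    """Deng entropy of a basic belief assignment (BBA).

    `bba` maps each focal element (a frozenset of hypotheses) to its
    mass. E_d(m) = -sum_A m(A) * log2( m(A) / (2**|A| - 1) ).
    For a Bayesian BBA (singleton focal elements only) this reduces
    to the ordinary Shannon entropy.
    """
    total = 0.0
    for focal, mass in bba.items():
        if mass > 0.0:
            total += mass * log2(mass / (2 ** len(focal) - 1))
    return -total

def best_attribute(bbas_by_attr):
    """Pick the attribute whose instance BBAs yield the smallest
    total Deng entropy (smaller entropy -> better split).

    `bbas_by_attr` maps an attribute name to the list of BBAs
    built for the instances under that attribute.
    """
    return min(
        bbas_by_attr,
        key=lambda attr: sum(deng_entropy(m) for m in bbas_by_attr[attr]),
    )

# Example: two candidate attributes over the frame {a, b}.
# "color" produces confident singleton BBAs; "shape" assigns
# mass to the whole frame, so it carries more uncertainty.
bbas = {
    "color": [{frozenset({"a"}): 0.9, frozenset({"b"}): 0.1}],
    "shape": [{frozenset({"a", "b"}): 1.0}],
}
print(best_attribute(bbas))  # -> color
```

Note that mass on a multi-element focal set is penalized by the 2^|A| - 1 term, so attributes whose BBAs spread belief over larger subsets score a higher entropy and are ranked worse, matching the rule that a smaller entropy value indicates a better subsequent division.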