Entropy. 2019 Sep 16;21(9):897. doi: 10.3390/e21090897
Algorithm 2 Splitting Rules based on Deng Entropy
  • Require: A root node $X=\{x_i\}_{i=1}^{N}$, where $x_i$ is the $i$th instance with $n$ condition attributes $\{A_k\}_{k=1}^{n}$ and one decision attribute $D$; the stopping criterion: stop when all condition attributes are used up.

  • Ensure: An Evidential Decision Tree.

  • if the samples in X belong to some class then

  •   Mark X as a leaf node and assign the class as its label.

  •   return.

  • end if

  • for each attribute $A_k$, $k=1,2,\dots,n$ in $X$ do

  •   Compute the Deng entropy $E_d(A_k)$ according to Equation (10):

  •   $E_d(A_k) = -\frac{1}{N}\sum_{i=1}^{N}\sum_{w\subseteq\Theta} c_i^k(w)\log_2\frac{c_i^k(w)}{2^{|w|}-1}$, where $c_i^k$ represents a BBA for the instance $i$ of attribute $A_k$.

  •   $A_{k^*} = \arg\min_{A_k} E_d(A_k)$. The smaller the entropy value, the better the subsequent division.

  • end for

  •  Get the best attribute $A_{k^*}$ and the splitting point $c_{k^*}$.
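
The entropy computation and attribute selection in Algorithm 2 can be sketched in Python. This is a minimal illustration, not the paper's implementation: the BBA representation (a dict mapping each focal element, as a frozenset over the frame $\Theta$, to its mass) and the helper names `deng_entropy`, `attribute_entropy`, and `best_attribute` are assumptions for this example.

```python
import math

def deng_entropy(bba):
    """Deng entropy of one BBA: -sum_w m(w) * log2( m(w) / (2^|w| - 1) ).

    `bba` maps each focal element (a frozenset of hypotheses) to its mass.
    Zero-mass focal elements contribute nothing, matching the convention
    0 * log2(0) = 0.
    """
    ed = 0.0
    for w, mass in bba.items():
        if mass > 0:
            ed -= mass * math.log2(mass / (2 ** len(w) - 1))
    return ed

def attribute_entropy(bbas):
    """E_d(A_k): average of the Deng entropies of the N instance-level
    BBAs c_i^k associated with attribute A_k (the 1/N factor in Eq. (10))."""
    return sum(deng_entropy(b) for b in bbas) / len(bbas)

def best_attribute(attr_bbas):
    """Splitting rule of Algorithm 2: pick the attribute A_{k*} whose
    Deng entropy is smallest. `attr_bbas` maps attribute name -> list of
    per-instance BBAs."""
    return min(attr_bbas, key=lambda a: attribute_entropy(attr_bbas[a]))
```

For instance, an attribute whose instance BBAs are all certain singletons (mass 1 on a single hypothesis) has Deng entropy 0, whereas one whose BBAs put all mass on the full frame is maximally uncertain, so `best_attribute` selects the former, in line with "the smaller the entropy value, the better the subsequent division."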