2017 Jun 12;17(6):1360. doi: 10.3390/s17061360
Algorithm 1 C4.5 algorithm for arrhythmia learning.
1:  function DecisionTree(D, A, T)     ▷ D: training dataset, A: feature set, T: decision tree
2:    if D contains only training examples of the same heartbeat type cj ∈ C then     ▷ C: set of heartbeat types
3:      Make T a leaf node labeled with heartbeat type cj;
4:    else if A = ∅ then
5:      Make T a leaf node labeled with heartbeat type cj, the most frequent heartbeat type in D;
6:    else         ▷ D contains examples belonging to a mixture of heartbeat types
7:      p0 = impurityEval-1(D);     ▷ select a single feature to partition D into subsets so that each subset is purer
8:      for each feature Ai ∈ A (= {A1, A2, …, Ak}) do
9:        pi = impurityEval-2(Ai, D);
10:     end for
11:     Select Ag ∈ {A1, A2, …, Ak} that provides the biggest impurity reduction, computed as p0 − pi;
12:     if (p0 − pg) < threshold then     ▷ Ag does not significantly reduce impurity p0
13:       Make T a leaf node labeled with cj, the most frequent heartbeat type in D;
14:     else                    ▷ Ag is able to reduce impurity p0
15:       Make T a decision node on Ag;
16:       Let the possible values of Ag be v1, v2, …, vm. Partition D into m disjoint subsets D1, D2, …, Dm based on the m values of Ag;
17:       for each Dj ∈ D (= {D1, D2, …, Dm}) do
18:         if Dj ≠ ∅ then
19:           Create a branch (edge) node Tj for vj as a child node of T;
20:           DecisionTree(Dj, A ∖ {Ag}, Tj);
21:         end if
22:       end for
23:     end if
24:   end if
25: end function
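The recursion above can be sketched in Python. This is a minimal illustration, not the paper's implementation: it assumes categorical features and uses entropy for impurityEval-1 and the weighted post-split entropy for impurityEval-2 (the paper does not fix these choices here), with examples represented as dictionaries mapping feature names to values.

```python
from collections import Counter
import math

def entropy(labels):
    # impurityEval-1: entropy of the heartbeat-type labels in D (assumed measure)
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def split_impurity(X, y, feat):
    # impurityEval-2: weighted entropy of the subsets D1..Dm induced by `feat`
    parts = {}
    for x, yi in zip(X, y):
        parts.setdefault(x[feat], []).append(yi)
    n = len(y)
    return sum(len(ys) / n * entropy(ys) for ys in parts.values())

def decision_tree(X, y, features, threshold=1e-9):
    # Leaf cases: all examples share one heartbeat type, or no features remain
    if len(set(y)) == 1:
        return y[0]
    majority = Counter(y).most_common(1)[0][0]
    if not features:
        return majority
    p0 = entropy(y)
    # Pick Ag with the biggest impurity reduction p0 - pi
    best = min(features, key=lambda f: split_impurity(X, y, f))
    if p0 - split_impurity(X, y, best) < threshold:
        return majority            # Ag does not significantly reduce impurity
    # Decision node on Ag: one child per observed value vj, built without Ag
    rest = [f for f in features if f != best]
    children = {}
    for v in set(x[best] for x in X):
        Xv = [x for x in X if x[best] == v]
        yv = [yi for x, yi in zip(X, y) if x[best] == v]
        children[v] = decision_tree(Xv, yv, rest, threshold)
    return {"feature": best, "children": children}

def classify(tree, x):
    # Walk decision nodes until a leaf (a heartbeat-type label) is reached
    while isinstance(tree, dict):
        tree = tree["children"][x[tree["feature"]]]
    return tree
```

For example, on four toy beats where a hypothetical `qrs` feature separates the two types perfectly, the tree splits on `qrs` first and each branch becomes a pure leaf.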