Entropy. 2019 Dec 7;21(12):1204. doi: 10.3390/e21121204
Algorithm 2. Estimation of the tail entropy of a distribution function.
  1. Estimate the probability density function, obtaining the values $\hat{f}_n(X_i)$ for $i = 0, \dots, n-1$;

  2. Sample from the probability density function using the sampled function $S_n(\hat{f}_n)(i) = \hat{f}_n(X_i)$ for $i = 0, \dots, n-1$;

  3. Define a quantum $q > 0$; then $Q_q S_n(\hat{f}_n)(j) = (i + 1/2)\,q$ if $\hat{f}_n(X_j) \in [iq, (i+1)q)$;

  4. Compute the probabilities $p_n^*(i) = \dfrac{c_n(i)}{\alpha \sum_j c_n(j)} = \dfrac{c_n(i)}{\alpha n} = \dfrac{\operatorname{card}\{\,j : \hat{f}_n(X_j) \in [iq, (i+1)q)\,\}}{\alpha n}$;

  5. Estimate the tail entropy of $F$ as $H_{\alpha,q}(\hat{f}_n) = -\sum_{i=0}^{m} p_n^*(i) \log_2 p_n^*(i)$, where $m = [\alpha/q]$.
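The steps above can be sketched in Python. This is a minimal illustration, not the authors' implementation: a Gaussian kernel density estimate stands in for the unspecified density estimator of step 1, and the Silverman-style bandwidth rule, the function name `tail_entropy`, and all parameter defaults are assumptions for the example.

```python
import numpy as np

def tail_entropy(samples, alpha, q, bandwidth=None):
    """Sketch of Algorithm 2: estimate the tail entropy H_{alpha,q}
    of the distribution underlying `samples`."""
    x = np.asarray(samples, dtype=float)
    n = x.size
    # Step 1: density estimate f^_n evaluated at the sample points X_i
    # (Gaussian KDE with a Silverman-style bandwidth, an assumed choice).
    h = bandwidth or 1.06 * x.std() * n ** (-1 / 5)
    f_hat = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2).sum(axis=1)
    f_hat /= n * h * np.sqrt(2.0 * np.pi)
    # Steps 2-3: sample the density at the X_j and quantize with quantum q:
    # f^_n(X_j) in [iq, (i+1)q) is mapped to bin index i.
    bins = np.floor(f_hat / q).astype(int)
    # Step 4: bin counts c_n(i) and probabilities p*_n(i) = c_n(i) / (alpha n),
    # keeping only the tail bins i = 0..m with m = [alpha/q].
    m = int(alpha / q)
    counts = np.bincount(bins[bins <= m], minlength=m + 1)
    p = counts / (alpha * n)
    # Step 5: Shannon entropy over the tail bins, with 0 log 0 := 0.
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())
```

A quick usage example: `tail_entropy(rng.normal(size=200), alpha=0.5, q=0.05)` returns a single nonnegative-looking entropy value for a standard normal sample; the choice of `alpha` and `q` controls how much of the low-density tail enters the sum.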