Entropy. 2020 Dec 5;22(12):1374. doi: 10.3390/e22121374

A Review of Fractional Order Entropies

António M. Lopes 1,* and José A. Tenreiro Machado 2
PMCID: PMC7761995  PMID: 33279919

Abstract

Fractional calculus (FC) is the area of calculus that generalizes the operations of differentiation and integration. FC operators are non-local and capture the history of dynamical effects present in many natural and artificial phenomena. Entropy is a measure of uncertainty, diversity and randomness often adopted for characterizing complex dynamical systems. Stemming from the synergies between the two areas, this paper reviews the concept of entropy in the framework of FC. Several new entropy definitions have been proposed in recent decades, expanding the scope of applicability of this seminal tool. However, FC is not yet well disseminated in the entropy community. Therefore, new definitions based on FC can generalize both concepts from the theoretical and applied points of view. The time to come will prove to what extent the new formulations will be useful.

Keywords: fractional calculus, entropy, information theory

1. Introduction

In recent decades, the generalization of the concepts of differentiation [1,2,3,4] and entropy [5,6,7,8] has received considerable attention. In the first case we may cite the fractional calculus (FC) [9,10]. FC was introduced by Leibniz in the scope of mathematics by the end of the 17th century, but only recently found application in biology [11,12], physics [13,14] and engineering [15,16], among others [17,18]. The concept of entropy was introduced by Clausius [19] and Boltzmann [20] in the field of thermodynamics. Later, entropy was also explored by Shannon [21] and Jaynes [22] in the context of information theory. Meanwhile, both topics evolved considerably, motivating the formulation of fractional operators [23,24] and entropy indices [25,26,27,28,29,30,31,32,33,34,35,36,37,38]. These generalizations extend the application of the two mathematical tools and highlight certain characteristics, such as power-law behavior, non-locality and long-range memory [39,40].

This paper reviews the concept of entropy in the framework of FC. In fact, FC is not yet well disseminated in the entropy community and, therefore, new definitions based on FC may expand the scope of this powerful tool. To the authors’ best knowledge, new entropy definitions are welcomed by the scientific community, somewhat contrary to what happens with recent fractional operators. Consequently, the manuscript does not intend to assess the pros or the cons of the distinct formulations for some given problem. In a similar line of thought, the analysis of entropy-based indices proposed in the literature for comparing or characterizing some phenomena or probability distributions is outside the focus of this paper. Interested readers can obtain further information on divergence measures [41] and mutual information [42], as well as on sample [43], approximate [44], permutation [45], spectral [46], and fuzzy [47] entropies, among others [48]. Indeed, the main idea of this paper is to review the concept of fractional entropy and to present the current state of its development.

The paper is organized as follows. Section 2 presents the fundamental concepts of FC. Section 3 introduces different entropies with one, two and three parameters. Section 4 reviews the fractional-order entropy formulations. Section 5 compares the different formulations for four well-known distributions. Section 6 assesses the impact of the fractional entropies and analyses their main areas of application. Finally, Section 7 outlines the main conclusions.

2. Fractional-Order Derivatives and Integrals

FC models capture non-local effects and are useful in the study of phenomena with long-range correlations in time or space.

Let us consider the finite interval $[a,b]$, with $a,b\in\mathbb{R}$ and $a<b$, and let $n-1<q<n$, with $n\in\mathbb{N}$. Euler's gamma function is denoted by $\Gamma(\cdot)$ and the operator $[\cdot]$ calculates the integer part of the argument. Several definitions of fractional derivatives have been formulated [24,49,50]. A small set is presented in the follow-up, which includes both historically relevant and widely used definitions:

  • The left-side and the right-side Caputo derivatives,
    ${}^{C}D_{a+}^{q}f(x)=\frac{1}{\Gamma(n-q)}\int_{a}^{x}\frac{1}{(x-\tau)^{q-n+1}}\,\frac{d^{n}}{d\tau^{n}}f(\tau)\,d\tau,\quad x\geq a,$ (1)
    ${}^{C}D_{b-}^{q}f(x)=\frac{(-1)^{n}}{\Gamma(n-q)}\int_{x}^{b}\frac{1}{(\tau-x)^{q-n+1}}\,\frac{d^{n}}{d\tau^{n}}f(\tau)\,d\tau,\quad x\leq b,$ (2)
  • The left-side and the right-side Grünwald-Letnikov derivatives,
    ${}^{GL}D_{a+}^{q}f(x)=\lim_{h\to 0^{+}}h^{-q}\sum_{m=0}^{\left[\frac{x-a}{h}\right]}(-1)^{m}\binom{q}{m}f(x-mh),\quad x\geq a,$ (3)
    ${}^{GL}D_{b-}^{q}f(x)=\lim_{h\to 0^{+}}h^{-q}\sum_{m=0}^{\left[\frac{b-x}{h}\right]}(-1)^{m}\binom{q}{m}f(x+mh),\quad x\leq b,$ (4)
  • The Hadamard derivative,
    ${}^{Ha}D_{+}^{q}f(x)=\frac{q}{\Gamma(1-q)}\int_{0}^{x}\frac{f(x)-f(\tau)}{\left[\log(x/\tau)\right]^{q+1}}\,\frac{d\tau}{\tau},$ (5)
  • The left-side and right-side Hilfer derivatives of type $\beta$, with $0\leq\beta\leq 1$,
    ${}^{H}D_{a+}^{q,\beta}f(x)={}^{RL}I_{a+}^{\beta(n-q)}\,\frac{d^{n}}{dx^{n}}\,{}^{RL}I_{a+}^{(1-\beta)(n-q)}f(x),$ (6)
    ${}^{H}D_{b-}^{q,\beta}f(x)={}^{RL}I_{b-}^{\beta(n-q)}\,(-1)^{n}\frac{d^{n}}{dx^{n}}\,{}^{RL}I_{b-}^{(1-\beta)(n-q)}f(x),$ (7)
    where ${}^{RL}I_{a+}^{q}$ and ${}^{RL}I_{b-}^{q}$ denote the left-side and right-side Riemann-Liouville fractional integrals of order $q>0$, respectively, defined by:
    ${}^{RL}I_{a+}^{q}f(x)=\frac{1}{\Gamma(q)}\int_{a}^{x}\frac{f(\tau)}{(x-\tau)^{1-q}}\,d\tau,\quad x\geq a,$ (8)
    ${}^{RL}I_{b-}^{q}f(x)=\frac{1}{\Gamma(q)}\int_{x}^{b}\frac{f(\tau)}{(\tau-x)^{1-q}}\,d\tau,\quad x\leq b,$ (9)
  • The Karcı derivative,
    ${}^{K}D^{q}f(x)=\lim_{h\to 0}\frac{\frac{d}{dh}\left\{[f(x+h)]^{q}-[f(x)]^{q}\right\}}{\frac{d}{dh}\left[(x+h)^{q}-x^{q}\right]}=\frac{df(x)}{dx}\cdot\frac{[f(x)]^{q-1}}{x^{q-1}},$ (10)
  • The Liouville, the left-side and the right-side Liouville derivatives,
    ${}^{L}D^{q}f(x)=\frac{1}{\Gamma(1-q)}\frac{d}{dx}\int_{-\infty}^{x}\frac{f(\tau)}{(x-\tau)^{q}}\,d\tau,\quad -\infty<x<+\infty,$ (11)
    ${}^{L}D_{0+}^{q}f(x)=\frac{1}{\Gamma(n-q)}\frac{d^{n}}{dx^{n}}\int_{0}^{x}\frac{f(\tau)}{(x-\tau)^{q-n+1}}\,d\tau,\quad x>0,$ (12)
    ${}^{L}D_{-}^{q}f(x)=\frac{(-1)^{n}}{\Gamma(n-q)}\frac{d^{n}}{dx^{n}}\int_{x}^{+\infty}\frac{f(\tau)}{(\tau-x)^{q-n+1}}\,d\tau,\quad x<+\infty,$ (13)
  • The Marchaud, the left-side and the right-side Marchaud derivatives,
    ${}^{M}D^{q}f(x)=\frac{q}{\Gamma(1-q)}\int_{-\infty}^{x}\frac{f(x)-f(\tau)}{(x-\tau)^{q+1}}\,d\tau,$ (14)
    ${}^{M}D_{+}^{q}f(x)=\frac{q}{\Gamma(1-q)}\int_{0}^{+\infty}\frac{f(x)-f(x-\tau)}{\tau^{q+1}}\,d\tau,$ (15)
    ${}^{M}D_{-}^{q}f(x)=\frac{q}{\Gamma(1-q)}\int_{0}^{+\infty}\frac{f(x)-f(x+\tau)}{\tau^{q+1}}\,d\tau,$ (16)
  • The left-side and the right-side Riemann-Liouville derivatives,
    ${}^{RL}D_{a+}^{q}f(x)=\frac{1}{\Gamma(n-q)}\frac{d^{n}}{dx^{n}}\int_{a}^{x}\frac{f(\tau)}{(x-\tau)^{q-n+1}}\,d\tau,\quad x\geq a,$ (17)
    ${}^{RL}D_{b-}^{q}f(x)=\frac{(-1)^{n}}{\Gamma(n-q)}\frac{d^{n}}{dx^{n}}\int_{x}^{b}\frac{f(\tau)}{(\tau-x)^{q-n+1}}\,d\tau,\quad x\leq b,$ (18)
  • The Riesz derivative,
    ${}^{R}D_{x}^{q}f(x)=-\frac{1}{2\cos(q\pi/2)}\,\frac{1}{\Gamma(n-q)}\,\frac{d^{n}}{dx^{n}}\left[\int_{-\infty}^{x}\frac{f(\tau)}{(x-\tau)^{q-n+1}}\,d\tau+\int_{x}^{+\infty}\frac{f(\tau)}{(\tau-x)^{q-n+1}}\,d\tau\right],$ (19)
  • The local Yang derivative,
    ${}^{Y}D^{q}f(x)\big|_{x=x_{0}}=\lim_{x\to x_{0}}\frac{\Delta^{q}\left[f(x)-f(x_{0})\right]}{(x-x_{0})^{q}}.$ (20)

Often, the Caputo formulation is applied in physics and numerical integration, the Riemann-Liouville in calculus, and the Grünwald-Letnikov in engineering, signal processing and control. These classical definitions are the most frequently used by researchers. Regarding the mathematical pros and cons of the Karcı and the Yang derivatives, readers may consult [23,51] and the references therein. It should be noted that some formulations require careful reflection and are a matter of some controversy, since many authors do not consider them to be fractional operators [23,52,53]. Nevertheless, the debate about what the term ‘fractional derivative’ really means is still ongoing among contemporary mathematicians [51].
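As an illustration, the Grünwald-Letnikov definition (3) lends itself directly to numerical evaluation, since it involves only function samples and binomial weights. The following Python fragment is a minimal sketch, where the truncation step h and the test function are illustrative assumptions, not part of the original text:

```python
# Minimal numerical sketch of the left-side Grünwald-Letnikov derivative (3):
# the limit is approximated by a small, fixed step h.
import numpy as np
from scipy.special import binom

def gl_derivative(f, x, q, a=0.0, h=1e-3):
    """Approximate the Grünwald-Letnikov derivative of order q at point x."""
    m = np.arange(int((x - a) / h) + 1)
    weights = (-1.0) ** m * binom(q, m)   # (-1)^m * binomial(q, m)
    return np.sum(weights * f(x - m * h)) / h ** q

# Check against a classical closed form: D^{1/2} of f(x) = x equals
# 2*sqrt(x/pi) when a = 0.
x = 1.0
print(gl_derivative(lambda t: t, x, q=0.5))   # ~1.128
print(2.0 * np.sqrt(x / np.pi))               # 1.1283791...
```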

3. The Concept of Entropy

Let us consider a discrete probability distribution $P=\{p_{1},p_{2},\ldots,p_{N}\}$, with $\sum_{i}p_{i}=1$ and $p_{i}\geq 0$. The Shannon entropy, $S^{(S)}$, of the distribution $P$ is defined as:

$S^{(S)}=\sum_{i}p_{i}\,I(p_{i})=-\sum_{i}p_{i}\ln p_{i},$ (21)

and represents the expected value of the information content given by $I(p_{i})=-\ln p_{i}$. Therefore, for the uniform probability distribution we have $p_{i}=N^{-1}$, $N\in\mathbb{N}$, and the Shannon entropy takes its maximum value $S=\ln N$, yielding the Boltzmann formula, up to the multiplicative factor $k$ that denotes the Boltzmann constant.

The Rényi and Tsallis entropies are one-parameter generalizations of (21) given by, respectively:

$S_{q}^{(R)}=\frac{1}{1-q}\ln\sum_{i}p_{i}^{q},\quad q>0,\ q\neq 1,$ (22)
$S_{q}^{(T)}=\frac{1}{q-1}\left(1-\sum_{i}p_{i}^{q}\right),\quad q\in\mathbb{R}.$ (23)

The entropies $S_{q}^{(R)}$ and $S_{q}^{(T)}$ reduce to the Shannon formulation $S^{(S)}$ when $q\to 1$. The Rényi entropy has an inverse power law equilibrium distribution [54], satisfying the zero-th law of thermodynamics [55]. It is important in statistics and ecology to quantify diversity, in quantum information to measure entanglement, and in computer science for randomness extraction. The Tsallis entropy was proposed in the scope of nonextensive statistical mechanics and has found application in the field of complex dynamics, in diffusion equations [56] and Fokker-Planck systems [57].
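A short numerical sketch of (21)-(23) makes the limit $q\to 1$ concrete; the distribution below is an arbitrary illustrative choice:

```python
# Shannon (21), Renyi (22) and Tsallis (23) entropies of a discrete
# distribution; both generalizations approach the Shannon value as q -> 1.
import numpy as np

def shannon(p):
    return -np.sum(p * np.log(p))

def renyi(p, q):
    return np.log(np.sum(p ** q)) / (1.0 - q)

def tsallis(p, q):
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = np.array([0.5, 0.25, 0.125, 0.125])
print(shannon(p))          # 1.2130...
print(renyi(p, 1.0001))    # ~1.2130 (close to the Shannon value)
print(tsallis(p, 1.0001))  # ~1.2129 (close to the Shannon value)
```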

Other one-parameter entropies are the Landsberg-Vedral and Abe formulations [26,58]:

$S_{q}^{(L)}=\frac{1}{1-q}\left(1-\frac{1}{\sum_{i}p_{i}^{q}}\right),$ (24)
$S_{q}^{(A)}=-\sum_{i}\frac{p_{i}^{q}-p_{i}^{q^{-1}}}{q-q^{-1}},\quad q\in\,]0,1].$ (25)

Expression (24) is related to the Tsallis entropy by $S_{q}^{(L)}=S_{q}^{(T)}/\sum_{i}p_{i}^{q}$, and is often known as the normalized Tsallis entropy. Expression (25) is a symmetric modification of the Tsallis entropy, which is invariant under the exchange $q\leftrightarrow q^{-1}$, and we have $S_{q}^{(A)}=\frac{(q-1)\,S_{q}^{(T)}-(q^{-1}-1)\,S_{q^{-1}}^{(T)}}{q-q^{-1}}$.

The two-parameter Sharma-Mittal entropy [32] is a generalization of the Shannon, Tsallis and Rényi entropies, and is defined as follows:

$S_{r,q}^{(SM)}=\frac{1}{1-r}\left[\left(\sum_{i}p_{i}^{q}\right)^{\frac{1-r}{1-q}}-1\right],\quad q>0,\ q\neq 1,\ r\neq 1.$ (26)

The Sharma-Mittal entropy reduces to the Rényi, Tsallis and Shannon formulations in the limits $r\to 1$, $r\to q$ and $\{r,q\}\to\{1,1\}$, respectively.
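The limiting behavior of (26) can be checked numerically, as in the sketch below; the distribution and the value q = 2 are illustrative assumptions:

```python
# Sharma-Mittal entropy (26); r -> 1 recovers Renyi and r -> q recovers
# Tsallis, here verified by evaluating near the limits.
import numpy as np

def sharma_mittal(p, q, r):
    return (np.sum(p ** q) ** ((1.0 - r) / (1.0 - q)) - 1.0) / (1.0 - r)

p = np.array([0.5, 0.25, 0.125, 0.125])
q = 2.0
print(sharma_mittal(p, q, r=1.0001))     # ~1.0678, the Renyi value S_2^(R)
print(sharma_mittal(p, q, r=q + 1e-4))   # ~0.6562, the Tsallis value S_2^(T)
```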

Examples of three-parameter formulations consist of the gamma and the Kaniadakis entropies, Sd,c1,c2(G) and Sκ,τ,ζ(K), respectively. The gamma entropy is given by [35]:

$S_{d,c_{1},c_{2}}^{(G)}=\sum_{i}\frac{e}{c_{2}-c_{1}}\,\Gamma\left(d+1,\,1-c_{1}\ln p_{i},\,1-c_{2}\ln p_{i}\right),$ (27)

where $e$ denotes the Napier constant and $\Gamma(a,z_{1},z_{2})$ represents the generalized incomplete gamma function, defined by:

$\Gamma(a,z_{1},z_{2})=\Gamma(a,z_{1})-\Gamma(a,z_{2})=\int_{z_{1}}^{z_{2}}t^{a-1}e^{-t}\,dt,$ (28)

where $\Gamma(x,y)=\int_{y}^{\infty}t^{x-1}e^{-t}\,dt$ is the upper incomplete gamma function.

The entropy Sd,c1,c2(G) follows the first three Khinchin axioms [35,59,60] within the parameter regions defined by (29) and (30):

$c_{2}>1>c_{1}>0,\quad 1-\frac{1}{c_{1}}<d<1-\frac{1}{c_{2}},$ (29)
$c_{1}>1>c_{2}>0,\quad 1-\frac{1}{c_{2}}<d<1-\frac{1}{c_{1}}.$ (30)

Different combinations of the parameters yield distinct entropy formulations [35]. For example, if we set $\{d,c_{1},c_{2}\}=\{0,1,q\}$, then we recover the Tsallis entropy, while for $\{d,c_{1},c_{2}\}=\{0,1\pm\epsilon,1\mp\epsilon\}$, $\epsilon\to 0$, we obtain the Shannon entropy.

The Kaniadakis entropy belongs to a class of trace-form entropies given by [38]:

$S=-\sum_{i}p_{i}\,\Lambda(p_{i}),$ (31)

where $\Lambda(x)$ is a strictly increasing function defined for positive values of the argument, noting that $\Lambda(x\to 0^{+})=-\infty$. The function $\Lambda(x)$ can be viewed as a generalization of the ordinary logarithm [38] that, for three parameters, yields:

$\Lambda(x)=\ln_{\kappa,\tau,\zeta}(x)=\frac{\zeta^{-\kappa}x^{\tau+\kappa}-\zeta^{\kappa}x^{\tau-\kappa}-\zeta^{-\kappa}+\zeta^{\kappa}}{(\kappa+\tau)\zeta^{-\kappa}+(\kappa-\tau)\zeta^{\kappa}}.$ (32)

Therefore, the Kaniadakis entropy, Sκ,τ,ζ(K), can be expressed as:

$S_{\kappa,\tau,\zeta}^{(K)}=-\sum_{i}p_{i}\ln_{\kappa,\tau,\zeta}(p_{i}),\quad \kappa,\zeta\in\mathbb{R},\ |\kappa|-1<\tau\leq|\kappa|.$ (33)

The entropy $S_{\kappa,\tau,\zeta}^{(K)}$ is Lesche [61] and thermodynamically [62] stable for $-|\kappa|\leq\tau\leq|\kappa|$. Distinct combinations of the parameters yield several entropy formulations [38]. For example, if we set $\kappa=\tau=(q-1)/2$ or $\kappa\to 0$, $\tau=0$, then expression (33) yields the Tsallis and the Shannon entropies, respectively.
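Under the reconstruction of (32) given above, these special cases can be verified numerically; the sketch below is an illustration only, with ζ = 1 chosen for simplicity:

```python
# Three-parameter logarithm (32) and Kaniadakis entropy (33);
# kappa = tau = (q-1)/2 reproduces Tsallis, and kappa -> 0, tau = 0, Shannon.
import numpy as np

def ln_ktz(x, kappa, tau, zeta):
    num = (zeta ** -kappa * x ** (tau + kappa) - zeta ** kappa * x ** (tau - kappa)
           - zeta ** -kappa + zeta ** kappa)
    den = (kappa + tau) * zeta ** -kappa + (kappa - tau) * zeta ** kappa
    return num / den

def kaniadakis(p, kappa, tau, zeta=1.0):
    return -np.sum(p * ln_ktz(p, kappa, tau, zeta))

p = np.array([0.5, 0.25, 0.125, 0.125])
q = 2.0
print(kaniadakis(p, (q - 1) / 2, (q - 1) / 2))  # ~0.6562, the Tsallis value
print(kaniadakis(p, 1e-8, 0.0))                 # ~1.2130, the Shannon value
```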

Other entropies can be found in the literature [63,64], but a thorough review of all proposed formulations is out of the scope of this paper.

4. Fractional Generalizations of Entropy

It was noted [65] that the Shannon and Tsallis entropies have the same generating function $\sum_{i}p_{i}^{x}$ and that the difference between Formulas (21) and (23) is just due to the adopted differentiation operator. In fact, using the standard first-order derivative, $\frac{d}{dx}$, we obtain the Shannon entropy:

$S^{(S)}=-\lim_{x\to 1}\frac{d}{dx}\sum_{i}p_{i}^{x},$ (34)

while adopting the Jackson q-derivative [66], $D_{q}f(x)=\frac{f(qx)-f(x)}{qx-x}$, $0<q<1$, yields the Tsallis entropy [28]:

$S_{q}^{(T)}=-\lim_{x\to 1}D_{q}\sum_{i}p_{i}^{x}.$ (35)

Other expressions for entropy can be obtained by adopting additional differentiation operators.
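For instance, the generating-function mechanism of (34) and (35) can be mimicked numerically: a central difference plays the role of $\frac{d}{dx}$ and the Jackson quotient is evaluated directly. The distribution and the order q = 2 are illustrative assumptions:

```python
# Generating function phi(x) = sum_i p_i^x; an ordinary derivative at x = 1
# yields (minus) the Shannon entropy, the Jackson q-derivative the Tsallis one.
import numpy as np

p = np.array([0.5, 0.25, 0.125, 0.125])
phi = lambda x: np.sum(p ** x)

h, q, x = 1e-6, 2.0, 1.0
print(-(phi(x + h) - phi(x - h)) / (2.0 * h))  # ~1.2130, Shannon via (34)
print(-(phi(q * x) - phi(x)) / (q * x - x))    # 0.65625, Tsallis via (35)
```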

In 2001, Akimoto and Suzuki [67] proposed the one-parameter fractional entropy, Sα(AS), given by:

$S_{\alpha}^{(AS)}=-\lim_{x\to 1}\sum_{i}\frac{d^{\alpha}}{dx^{\alpha}}e^{x\ln p_{i}},$ (36)

where dαdxα=RLDa+α is the Riemann-Liouville operator (17), with a=0.

The expressions (36) and (17) yield:

$S_{\alpha}^{(AS)}=\sum_{i}\frac{\alpha-1}{\Gamma(2-\alpha)}\,{}_{1}F_{1}(1;1-\alpha;\ln p_{i}),\quad 0<\alpha<1,$ (37)

where ${}_{1}F_{1}(a;b;x)$ denotes the confluent hypergeometric function of the first kind [68]:

${}_{1}F_{1}(a;b;x)=1+\frac{a}{b}\frac{x}{1!}+\frac{a(a+1)}{b(b+1)}\frac{x^{2}}{2!}+\frac{a(a+1)(a+2)}{b(b+1)(b+2)}\frac{x^{3}}{3!}+\cdots.$ (38)

It can be shown [67] that the entropy (37) has the concavity and non-extensivity properties. In the limit $\alpha\to 1$, it obeys positivity and yields the Shannon entropy, $S^{(S)}$.
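SciPy exposes the confluent hypergeometric function, so (37) can be evaluated directly; the sketch below assumes the reconstruction of (37) given above and an illustrative distribution:

```python
# Akimoto-Suzuki entropy (37) via scipy.special.hyp1f1; as alpha -> 1 the
# value approaches the Shannon entropy of the distribution.
import numpy as np
from scipy.special import hyp1f1, gamma

def akimoto_suzuki(p, alpha):
    pref = (alpha - 1.0) / gamma(2.0 - alpha)
    return np.sum(pref * hyp1f1(1.0, 1.0 - alpha, np.log(p)))

p = np.array([0.5, 0.25, 0.125, 0.125])
print(akimoto_suzuki(p, 0.999))  # ~1.21, close to the Shannon value
print(akimoto_suzuki(p, 0.5))    # an intermediate fractional value
```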

In 2009, Ubriaco introduced a one-parameter fractional entropy, Sα(U), given by [69]:

$S_{\alpha}^{(U)}=\lim_{x\to 1}\left(-\frac{d}{dx}\right){}^{RL}D^{\alpha-1}\sum_{i}e^{x\ln p_{i}},$ (39)

where ${}^{RL}D^{\alpha-1}$ is the Riemann-Liouville left-side derivative (17) with $a\to-\infty$.

Therefore, we obtain:

$S_{\alpha}^{(U)}=-\lim_{x\to 1}\frac{d}{dx}\,\frac{1}{\Gamma(1-\alpha)}\sum_{i}\int_{-\infty}^{x}\frac{e^{\tau\ln p_{i}}}{(x-\tau)^{\alpha}}\,d\tau.$ (40)

Performing the integration and taking the limit $x\to 1$ yields:

$S_{\alpha}^{(U)}=\sum_{i}p_{i}\left(-\ln p_{i}\right)^{\alpha},\quad 0\leq\alpha\leq 1.$ (41)

The Ubriaco entropy (41) is thermodynamically stable and obeys the same properties as the Shannon entropy, with the exception of additivity. When $\alpha\to 1$, we recover the Shannon entropy.

In 2012, Yu et al. [70] formulated a one-parameter fractional entropy by means of the simple expression:

$S_{\alpha}^{(Y)}={}^{RL}I_{0}^{\alpha}\left(-\sum_{i}p_{i}\ln p_{i}\right),\quad \alpha\in\mathbb{R}^{+},$ (42)

where the operator ${}^{RL}I_{0}^{\alpha}$ is the left-side Riemann-Liouville integral (8), with $a=0$. Expression (42) obeys the concavity property and is an extension and generalization of the Shannon entropy.

Another fractional entropy was derived in 2014 by Radhakrishnan et al. [71], being given by:

$S_{q,\alpha}^{(RCJ)}=\sum_{i}p_{i}^{q}\left(-\ln p_{i}\right)^{\alpha},\quad q,\alpha>0.$ (43)

The two-parameter expression (43) was inspired by (41) and by the entropy (44), derived by Wang in the context of incomplete information theory [72]:

$S_{q}^{(W)}=\langle-\ln p\rangle_{q}=-\sum_{i}p_{i}^{q}\ln p_{i},$ (44)

where $\sum_{i}p_{i}^{q}=1$ and $\langle O\rangle_{q}=\sum_{i}p_{i}^{q}O_{i}$ denotes the q-expectation that characterizes the incomplete normalization.

The entropy (43) is considered a fractional entropy in a fractal phase space, in which the parameters q and α are associated with fractality and fractionality, respectively. In the limit, when (i) $q\to 1$, Equation (43) reduces to (41), (ii) $\alpha\to 1$ recovers $S_{q}^{(W)}$, and (iii) $\{q,\alpha\}\to\{1,1\}$, expression (43) yields the standard Shannon formula (21).
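A direct sketch of (43) and its limiting cases follows; note that the incomplete normalization $\sum_{i}p_{i}^{q}=1$ required by (44) is not enforced here, and the distribution is an illustrative assumption:

```python
# Two-parameter entropy (43); q -> 1 gives the Ubriaco entropy (41) and
# alpha -> 1 gives a Wang-type q-expectation as in (44).
import numpy as np

def rcj(p, q, alpha):
    return np.sum(p ** q * (-np.log(p)) ** alpha)

p = np.array([0.5, 0.25, 0.125, 0.125])
print(rcj(p, 1.0, 0.8))  # Ubriaco S_0.8^(U), Equation (41)
print(rcj(p, 1.5, 1.0))  # Wang-type value, Equation (44)
print(rcj(p, 1.0, 1.0))  # 1.2130..., the Shannon entropy
```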

In 2014, Machado followed a different line of thought [73], viewing the Shannon information $I(p_{i})=-\ln p_{i}$ as a function of order zero lying between the integer-order cases $D^{-1}I(p_{i})=p_{i}(1-\ln p_{i})$ and $D^{1}I(p_{i})=-\frac{1}{p_{i}}$. In the perspective of FC, this observation motivated the formulation of information and entropy of order $\alpha\in\mathbb{R}$ as [24]:

$I_{\alpha}(p_{i})=D^{\alpha}I(p_{i})=-\frac{p_{i}^{-\alpha}}{\Gamma(\alpha+1)}\left[\ln p_{i}+\tilde{\psi}\right],$ (45)
$S_{\alpha}^{(M)}=\sum_{i}\left\{-\frac{p_{i}^{-\alpha}}{\Gamma(\alpha+1)}\left[\ln p_{i}+\tilde{\psi}\right]\right\}p_{i},$ (46)

where $D^{\alpha}$ denotes a fractional derivative operator, $\tilde{\psi}=\psi(1)-\psi(1-\alpha)$, and $\psi(\cdot)$ represents the digamma function.

The one-parameter fractional entropy (46) fails to obey some of the Khinchin axioms, with the exception of the case $\alpha=0$ that leads to the Shannon entropy [74]. This behavior is in line with what occurs in FC, where fractional derivatives fail to obey some of the properties of integer-order operators [1].
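A sketch of (46), assuming the reconstruction above; SciPy's psi is the digamma function, and α = 0 should return the Shannon value:

```python
# Machado entropy (46); psi_t is the term denoted psi-tilde in the text.
import numpy as np
from scipy.special import gamma, psi

def machado(p, alpha):
    psi_t = psi(1.0) - psi(1.0 - alpha)
    return np.sum(-p ** (-alpha) / gamma(alpha + 1.0) * (np.log(p) + psi_t) * p)

p = np.array([0.5, 0.25, 0.125, 0.125])
print(machado(p, 0.0))  # 1.2130..., the Shannon entropy
print(machado(p, 0.6))  # near the maximum reported in Section 5
```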

Expression (46) was generalized by Jalab et al. [75] in the framework of local FC [76]. Adopting (20), the following expression was proposed:

$S_{\alpha}^{(J)}=\sum_{i}\left\{-\frac{p_{i}^{\,i\alpha}}{\Gamma(i\alpha+1)}\left[\ln p_{i}+\tilde{\psi}\right]\right\}p_{i}.$ (47)

The term $\tilde{\psi}$ in Equation (47) decreases from 1 to $1-\alpha$, for $\alpha\in\,]0,1[$. Therefore, we have:

$S_{\alpha}^{(J)}\cong\sum_{i}\left\{-\frac{p_{i}^{\,i\alpha}}{\Gamma(i\alpha+1)}\left[\ln p_{i}+1-\alpha\right]\right\}p_{i}.$ (48)

In 2016, Karcı [77] proposed the fractional derivative (10), based on the concept of indefinite limit and on l’Hôpital’s rule. Adopting $f(x)=\sum_{i}p_{i}^{x}$ and using (10) in (34), he derived the following expression for the fractional entropy [78]:

$S_{\alpha}^{(K)}=-\,{}^{K}D^{\alpha}f(x)\Big|_{x=1}=-\,{}^{K}D^{\alpha}\sum_{i}p_{i}^{x}\Big|_{x=1}=-\sum_{i}\frac{d(p_{i}^{x})}{dx}\,\frac{\left(p_{i}^{x}\right)^{\alpha-1}}{x^{\alpha-1}}\Big|_{x=1}=-\sum_{i}p_{i}^{\alpha}\ln p_{i}.$ (49)

In 2019, Ferreira and Machado [79] presented a new formula for the entropy based on the works of Abe [65] and Ubriaco [69]. They start from the definition of the left-side Liouville fractional derivative of a function f with respect to another function g, with $g'>0$, given by:

${}^{L}D_{g}^{\alpha}f(x)=\frac{1}{\Gamma(1-\alpha)}\,\frac{1}{g'(x)}\,\frac{d}{dx}\int_{-\infty}^{x}\left[g(x)-g(s)\right]^{-\alpha}g'(s)\,f(s)\,ds,\quad 0<\alpha\leq 1.$ (50)

Choosing $f(x)=p_{i}^{x}$ and $g(x)=e^{x+1}$, expression (50) leads to:

${}^{L}D_{g}^{\alpha}f(x)=\frac{1}{\Gamma(1-\alpha)}\,\frac{1}{e^{x+1}}\,\frac{d}{dx}\int_{-\infty}^{x}\left[e^{x+1}-e^{s+1}\right]^{-\alpha}e^{s+1}\,e^{s\ln(p_{i})}\,ds$ (51)
$=\left[1-\alpha-\ln(p_{i})\right]\,e^{(\alpha-1)(x+1)+x\left[1-\ln(p_{i})\right]+1}\,\frac{\Gamma(1-\ln(p_{i}))}{\Gamma(2-\alpha-\ln(p_{i}))}.$ (52)

Therefore, we have:

${}^{L}D_{g}^{\alpha}f(1)=\left[1-\alpha-\ln(p_{i})\right]p_{i}\,\frac{\Gamma(1-\ln(p_{i}))}{\Gamma(2-\alpha-\ln(p_{i}))},$ (53)

which, applying $\Gamma(x+1)=x\,\Gamma(x)$ for $x>0$, results in:

${}^{L}D_{g}^{\alpha}f(1)=p_{i}\,\frac{\Gamma(1-\ln(p_{i}))}{\Gamma(1-\alpha-\ln(p_{i}))}.$ (54)

Using (54) in (34) gives:

$S_{\alpha}^{(FM)}=\sum_{i}p_{i}\,\frac{\Gamma(1-\ln(p_{i}))}{\Gamma(1-\alpha-\ln(p_{i}))},\quad 0<\alpha\leq 1.$ (55)
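Since Section 5 observes that $S_{\alpha}^{(U)}$ and $S_{\alpha}^{(FM)}$ stay close to each other, the sketch below evaluates both on an illustrative distribution; log-gamma is used to avoid overflow in the ratio of gamma functions:

```python
# Ferreira-Machado entropy (55) versus Ubriaco entropy (41).
import numpy as np
from scipy.special import gammaln

def ferreira_machado(p, alpha):
    lp = np.log(p)
    return np.sum(p * np.exp(gammaln(1.0 - lp) - gammaln(1.0 - alpha - lp)))

def ubriaco(p, alpha):
    return np.sum(p * (-np.log(p)) ** alpha)

p = np.array([0.5, 0.25, 0.125, 0.125])
for alpha in (0.25, 0.5, 0.75, 1.0):
    print(alpha, ferreira_machado(p, alpha), ubriaco(p, alpha))
```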

In 2019, Machado and Lopes [80] proposed two fractional formulations of the Rényi entropy, $S_{q,\alpha}^{(ML1)}$ and $S_{q,\alpha}^{(ML2)}$. Their derivation adopts a general averaging operator, instead of the linear one that is assumed for the Shannon entropy (21). Let us consider a monotonic function $f(x)$ with inverse $f^{-1}(x)$. Therefore, for a set of real values $\{x_{i}\}$, $i=1,2,\ldots$, with probabilities $\{p_{i}\}$, we can define a general mean [81] associated with $f(x)$ as:

$f^{-1}\left(\sum_{i}p_{i}\,f(x_{i})\right).$ (56)

Applying (56) to the Shannon entropy (21) we obtain:

$S=f^{-1}\left(\sum_{i}p_{i}\,f(I(p_{i}))\right),$ (57)

where $f(x)$ is a Kolmogorov–Nagumo invertible function [82]. If the postulate of additivity for independent events is considered in (56), then only two functions $f(x)$ are possible, consisting of $f_{1}(x)=c\cdot x$ and $f_{2}(x)=c\cdot\exp[(1-q)x]$, with $c,q\in\mathbb{R}$. For $f_{1}(x)$ we get the ordinary mean and we verify that $S=S^{(S)}$. For $f_{2}(x)=c\cdot e^{(1-q)x}$ we have the expression:

$S=\frac{1}{1-q}\ln\left(\sum_{i}p_{i}\cdot\exp\left[(1-q)\,I(p_{i})\right]\right),$ (58)

which gives the Rényi entropy:

$S_{q}^{(R)}=\frac{1}{1-q}\ln\sum_{i}p_{i}^{q},\quad q>0,\ q\neq 1.$ (59)

If we combine (45) and (58), then we obtain:

$S_{q,\alpha}^{(ML1)}=\frac{1}{1-q}\ln\left(\sum_{i}p_{i}\cdot\exp\left[(1-q)\,I_{\alpha}(p_{i})\right]\right)=\frac{1}{1-q}\ln\left(\sum_{i}p_{i}\cdot\exp\left[(q-1)\,\frac{p_{i}^{-\alpha}}{\Gamma(\alpha+1)}\left(\ln p_{i}+\tilde{\psi}\right)\right]\right).$ (60)

On the other hand, if we rewrite (22) as:

$S_{q}^{(R)}=\frac{q}{1-q}\ln\left[\left(\frac{1}{N}\sum_{i}p_{i}^{q}\right)^{\frac{1}{q}}\cdot N^{\frac{1}{q}}\right]=\frac{q}{1-q}\ln\left(\overline{p}_{g}\cdot N^{\frac{1}{q}}\right),$ (61)

where $\overline{p}_{g}=\left(\frac{1}{N}\sum_{i}p_{i}^{q}\right)^{\frac{1}{q}}$ is a generalized mean, then we obtain:

$S_{q,\alpha}^{(ML2)}=D^{\alpha}S_{q}^{(R)}=\left(\frac{1}{N}\right)^{\frac{\alpha}{q}}\frac{q}{1-q}\,\frac{\overline{p}_{g}^{\,-\alpha}}{\Gamma(\alpha+1)}\left[\frac{1}{q}\ln N+\ln\overline{p}_{g}+\tilde{\psi}\right].$ (62)

In the limit, when $\alpha\to 0$, both $S_{q,\alpha}^{(ML1)}$ and $S_{q,\alpha}^{(ML2)}$ yield (22).
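A sketch of (62), assuming the reconstruction above; at α = 0 it must match the ordinary Rényi entropy (22), which provides a numerical sanity check:

```python
# Fractional Renyi entropy (62) built on the generalized mean of (61).
import numpy as np
from scipy.special import gamma, psi

def ml2(p, q, alpha):
    N = p.size
    pg = np.mean(p ** q) ** (1.0 / q)            # generalized mean of (61)
    psi_t = psi(1.0) - psi(1.0 - alpha)
    return ((1.0 / N) ** (alpha / q) * q / (1.0 - q) * pg ** (-alpha)
            / gamma(alpha + 1.0) * (np.log(N) / q + np.log(pg) + psi_t))

p = np.array([0.5, 0.25, 0.125, 0.125])
q = 2.0
print(ml2(p, q, 0.0))                        # ~1.0678, equals (22) at alpha = 0
print(np.log(np.sum(p ** q)) / (1.0 - q))    # 1.0678..., direct Renyi (22)
```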

5. Comparison of the Fractional-Order Entropies

In this section, we use the fractional entropy formulas to compute the entropy of both abstract and real-world data series.

5.1. Fractional-Order Entropy of Some Probability Distributions

We calculate the entropy of four well-known probability distributions, namely those of Poisson, Gaussian, Lévy and Weibull. We consider these cases just with the purpose of illustrating the behavior of the different formulations. Obviously, other cases could be considered, but we limit the number for the sake of parsimony. Firstly, we present the results obtained with the one-parameter entropies $S_{\alpha}^{(AS)}$, $S_{\alpha}^{(U)}$, $S_{\alpha}^{(Y)}$, $S_{\alpha}^{(M)}$, $S_{\alpha}^{(J)}$, $S_{\alpha}^{(K)}$ and $S_{\alpha}^{(FM)}$. Then, we consider the two-parameter formulations $S_{q,\alpha}^{(RCJ)}$, $S_{q,\alpha}^{(ML1)}$ and $S_{q,\alpha}^{(ML2)}$. Table 1 summarizes the constants adopted for the distributions and the intervals of variation of the entropy parameters.

Table 1.

The constants adopted for the probability distributions and the intervals of variation of the entropy parameters.

Poisson: $f(z)=\frac{\lambda^{z}e^{-\lambda}}{z!}$, with $\lambda=4$, on the domain $z=0,1,\ldots,50$.
Gaussian: $f(x)=\frac{1}{\sigma\sqrt{2\pi}}\,e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^{2}}$, with $\sigma=4$ and $\mu=0$, on the domain $x\in[-2,2]$.
Lévy: $f(x)=\sqrt{\frac{c}{2\pi}}\,\frac{e^{-\frac{c}{2(x-\mu)}}}{(x-\mu)^{3/2}}$, with $c=4$ and $\mu=0$, on the domain $x\in[0.1,20]$.
Weibull: $f(x)=\frac{k}{\lambda}\left(\frac{x}{\lambda}\right)^{k-1}e^{-(x/\lambda)^{k}}$, with $k=1.5$ and $\lambda=1$, on the domain $x\in[0.01,2.5]$.

For all distributions, the one-parameter entropies are computed for order $\alpha\in[0,1]$, and the two-parameter entropies for $\alpha\in[-0.6,0.6]$ and $q\in[1.2,2.2]$.
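The discretized probability vectors behind Table 1 can be reproduced along the following lines; the grid sizes are assumptions of this sketch, and each sampled density is renormalized to sum to one:

```python
# Discrete probability vectors for the four distributions of Table 1.
import numpy as np
from scipy import stats

poisson = stats.poisson.pmf(np.arange(0, 51), mu=4)              # lambda = 4
gauss = stats.norm.pdf(np.linspace(-2, 2, 200), loc=0, scale=4)  # mu = 0, sigma = 4
levy = stats.levy.pdf(np.linspace(0.1, 20, 200), loc=0, scale=4) # mu = 0, c = 4
weibull = stats.weibull_min.pdf(np.linspace(0.01, 2.5, 200), c=1.5, scale=1)

# Renormalize so that each vector is a valid discrete distribution.
dists = {name: f / f.sum() for name, f in
         [("Poisson", poisson), ("Gaussian", gauss),
          ("Levy", levy), ("Weibull", weibull)]}
```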

Figure 1 depicts the values of $S_{\alpha}^{(AS)}$, $S_{\alpha}^{(U)}$, $S_{\alpha}^{(Y)}$, $S_{\alpha}^{(M)}$, $S_{\alpha}^{(J)}$, $S_{\alpha}^{(K)}$ and $S_{\alpha}^{(FM)}$ versus $\alpha\in[0,1]$. We verify that, in the limits $\alpha\to 0$ or $\alpha\to 1$, the Shannon entropy values are recovered, namely 2.087, 5.866, 4.953 and 5.309 for the four distributions, respectively. Moreover, it can be seen that $S_{\alpha}^{(U)}$ and $S_{\alpha}^{(FM)}$ are very close to each other, $S_{\alpha}^{(AS)}$ does not obey positivity, $S_{\alpha}^{(J)}$ diverges at small values of $\alpha$, and $S_{\alpha}^{(M)}$ has a maximum at values of $\alpha$ close to 0.6 and diverges as $\alpha\to 1$.

Figure 1. The values of $S_{\alpha}^{(AS)}$, $S_{\alpha}^{(U)}$, $S_{\alpha}^{(Y)}$, $S_{\alpha}^{(M)}$, $S_{\alpha}^{(J)}$, $S_{\alpha}^{(K)}$ and $S_{\alpha}^{(FM)}$ versus $\alpha\in[0,1]$ for the (a) Poisson, (b) Gaussian, (c) Lévy and (d) Weibull distributions.

Figure 2 portrays the values of $S_{q,\alpha}^{(RCJ)}$, $S_{q,\alpha}^{(ML1)}$ and $S_{q,\alpha}^{(ML2)}$ versus $\alpha\in[-0.6,0.6]$ and $q\in[1.2,2.2]$. We verify that, in the domain considered, the entropies vary slightly and do not diverge.

Figure 2. The values of $S_{q,\alpha}^{(RCJ)}$, $S_{q,\alpha}^{(ML1)}$ and $S_{q,\alpha}^{(ML2)}$ versus $\alpha\in[-0.6,0.6]$ and $q\in[1.2,2.2]$ for the (a-c) Poisson, (d-f) Gaussian, (g-i) Lévy and (j-l) Weibull distributions.

5.2. Fractional-Order Entropy of Real-World Data

We calculate the entropy of a real-world time series, namely the Dow Jones Industrial Average (DJIA) financial index. The DJIA raw data are available at the Yahoo Finance website (https://finance.yahoo.com/). Herein, we consider the stock closing values in the period from 1 January 1987 up to 24 November 2018, with a one-day sampling interval. Occasional missing values, as well as values corresponding to closing days, are estimated using linear interpolation. The processed DJIA time series, $x=\{x_{1},x_{2},\ldots,x_{T}\}$, with $T=12{,}381$ points, is used to construct a histogram of relative frequencies, $f(x)$, with $N=50$ equally spaced, non-overlapping bins, for estimating the probability distribution of x.
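The preprocessing just described can be sketched as follows; the file name and column label are hypothetical, since the raw data must first be downloaded from Yahoo Finance:

```python
# DJIA preprocessing: daily closes on a calendar grid, linear interpolation of
# the gaps, and a 50-bin histogram as probability estimate.
import numpy as np
import pandas as pd

djia = pd.read_csv("djia.csv", parse_dates=["Date"], index_col="Date")  # hypothetical file
grid = pd.date_range("1987-01-01", "2018-11-24", freq="D")
close = djia["Close"].reindex(grid).interpolate(method="linear")

counts, _ = np.histogram(close.to_numpy(), bins=50)  # N = 50 equally spaced bins
p = counts / counts.sum()
p = p[p > 0]  # drop empty bins before taking logarithms
```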

Figure 3a depicts the values of $S_{\alpha}^{(AS)}$, $S_{\alpha}^{(U)}$, $S_{\alpha}^{(Y)}$, $S_{\alpha}^{(M)}$, $S_{\alpha}^{(J)}$, $S_{\alpha}^{(K)}$ and $S_{\alpha}^{(FM)}$ versus $\alpha\in[0,1]$. We verify that, as shown in Section 5.1, $S_{\alpha}^{(U)}$ and $S_{\alpha}^{(FM)}$ yield similar results, $S_{\alpha}^{(J)}$ diverges for small values of $\alpha$, and $S_{\alpha}^{(M)}$ has a maximum at values of $\alpha$ close to 0.6, diverging when $\alpha\to 1$.

Figure 3. The entropy of the DJIA stock index for daily closing values in the period from 1 January 1987 up to 24 November 2018, with a one-day sampling interval: (a) $S_{\alpha}^{(AS)}$, $S_{\alpha}^{(U)}$, $S_{\alpha}^{(Y)}$, $S_{\alpha}^{(M)}$, $S_{\alpha}^{(J)}$, $S_{\alpha}^{(K)}$ and $S_{\alpha}^{(FM)}$ versus $\alpha\in[0,1]$; (b-d) $S_{q,\alpha}^{(RCJ)}$, $S_{q,\alpha}^{(ML1)}$ and $S_{q,\alpha}^{(ML2)}$ versus $\alpha\in[-0.6,0.6]$ and $q\in[1.2,2.2]$.

Figure 3b-d show the values of $S_{q,\alpha}^{(RCJ)}$, $S_{q,\alpha}^{(ML1)}$ and $S_{q,\alpha}^{(ML2)}$ versus $\alpha\in[-0.6,0.6]$ and $q\in[1.2,2.2]$, yielding results of the same type as before.

6. Impact and Applications of the Fractional-Order Entropies

To assess the impact of the fractional-order entropies on the scientific community, we consider the number of citations received by the nine papers that first proposed them. Table 2 summarizes the results obtained from the Scopus database on 7 November 2020 (www.scopus.com). We verify that those nine papers were cited 218 times by 170 distinct papers, and that the expressions proposed by Ubriaco and Machado received the most attention.

To unravel the main areas of application of the fractional entropies, we use the VOSviewer (https://www.vosviewer.com/), which allows the construction and visualization of bibliometric networks [83]. The bibliometric data of the 170 papers that cite the nine papers presenting the fractional-order entropies were collected from Scopus for constructing Table 2, and are the input information to the VOSviewer. The co-occurrence of the authors’ keywords in the 170 papers is analyzed, with the minimum co-occurrence of each keyword set to 3. Figure 4 depicts the generated map. We verify the emergence of six clusters, $C=\{C_{1},\ldots,C_{6}\}$. At the top, the light-blue cluster, C1, includes the fields of finance and financial time series analysis, while the light-green one, C2, encompasses a variety of areas, such as solvents, fractals, commerce and stochastic systems, tightly connected to some entropy-based complexity measures. On the right, the dark-green cluster, C3, includes the areas of fault detection and image processing. At the bottom of the map, the red cluster, C4, highlights the fields of chromosome and DNA analysis, while the dark-blue one, C5, emphasizes some clustering and visualization techniques, such as multidimensional scaling and hierarchical clustering. On the left, the magenta cluster, C6, includes keywords not related to applications.

Table 2.

Citations received by the nine papers that proposed the fractional-order entropies, according to the Scopus database on 7 November 2020.

Entropy | Equation | Authors | Reference | N. Citations | Year
$S_{\alpha}^{(AS)}$ | (37) | Akimoto and Suzuki | [67] | 5 | 2001
$S_{\alpha}^{(U)}$ | (41) | Ubriaco | [69] | 88 | 2009
$S_{\alpha}^{(Y)}$ | (42) | Yu et al. | [70] | 7 | 2012
$S_{q,\alpha}^{(RCJ)}$ | (43) | Radhakrishnan et al. | [71] | 3 | 2014
$S_{\alpha}^{(M)}$ | (46) | Machado | [73] | 79 | 2014
$S_{\alpha}^{(J)}$ | (47) | Jalab et al. | [75] | 6 | 2019
$S_{\alpha}^{(K)}$ | (49) | Karcı | [77] | 16 | 2016
$S_{\alpha}^{(FM)}$ | (55) | Ferreira and Machado | [79] | 4 | 2019
$S_{q,\alpha}^{(ML1)}$ | (60) | Machado and Lopes | [80] | 5 | 2019
$S_{q,\alpha}^{(ML2)}$ | (62) | Machado and Lopes | [80] | 5 | 2019

Figure 4. The map of co-occurrence of the authors’ keywords in the 170 papers extracted from Scopus for constructing Table 2. The minimum co-occurrence of each keyword is 3. The clusters are represented by $C=\{C_{1},\ldots,C_{6}\}$.

In summary, we conclude that the fractional entropies have been applied to a considerable number of distinct scientific areas, and we may foresee a promising future for their development by exploring the synergies between the two mathematical tools. The prevalence of some proposals, from the point of view of citations, may be due to the time elapsed since their formulation. Indeed, the more recent formulations have not yet had sufficient time to disseminate in the community. Another reason may have to do with the type and audience of the journals where they were published. Nonetheless, a full bibliometric analysis is not the leitmotif of the present paper.

7. Conclusions

This paper reviewed the concept of entropy in the framework of FC. To the best of the authors’ knowledge, the fractional entropies proposed so far were included in this review. The different formulations result from the adopted (i) fractional-order operator or (ii) generating function. In general, such entropies are non-extensive and converge to the classical Shannon entropy for certain values of their parameters. The fractional entropies have found applications in the area of complex systems, where the classical formulations revealed some limitations. FC promises a bright future for further developments of entropy and its applications.

Author Contributions

A.M.L. and J.A.T.M. conceived, designed and performed the experiments, analyzed the data and wrote the paper. Both authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

Footnotes

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  • 1. Oldham K., Spanier J. The Fractional Calculus: Theory and Application of Differentiation and Integration to Arbitrary Order. Academic Press; New York, NY, USA: 1974.
  • 2. Samko S., Kilbas A., Marichev O. Fractional Integrals and Derivatives: Theory and Applications. Gordon and Breach Science Publishers; Amsterdam, The Netherlands: 1993.
  • 3. Miller K., Ross B. An Introduction to the Fractional Calculus and Fractional Differential Equations. John Wiley and Sons; New York, NY, USA: 1993.
  • 4. Kilbas A., Srivastava H., Trujillo J. Theory and Applications of Fractional Differential Equations. Volume 204. Elsevier; Amsterdam, The Netherlands: 2006. North-Holland Mathematics Studies.
  • 5. Plastino A., Plastino A.R. Tsallis entropy and Jaynes’ Information Theory formalism. Braz. J. Phys. 1999;29:50–60. doi: 10.1590/S0103-97331999000100005.
  • 6. Li X., Essex C., Davison M., Hoffmann K.H., Schulzky C. Fractional Diffusion, Irreversibility and Entropy. J. Non-Equilib. Thermodyn. 2003;28:279–291. doi: 10.1515/JNETDY.2003.017.
  • 7. Mathai A., Haubold H. Pathway model, superstatistics, Tsallis statistics, and a generalized measure of entropy. Phys. A Stat. Mech. Appl. 2007;375:110–122. doi: 10.1016/j.physa.2006.09.002.
  • 8. Anastasiadis A. Special Issue: Tsallis Entropy. Entropy. 2012;14:174–176. doi: 10.3390/e14020174.
  • 9. Tenreiro Machado J.A., Kiryakova V. Recent history of the fractional calculus: Data and statistics. In: Kochubei A., Luchko Y., editors. Handbook of Fractional Calculus with Applications: Basic Theory. Volume 1. De Gruyter; Berlin, Germany: 2019. pp. 1–21.
  • 10. Machado J.T., Galhano A.M., Trujillo J.J. On development of fractional calculus during the last fifty years. Scientometrics. 2014;98:577–582. doi: 10.1007/s11192-013-1032-6.
  • 11. Ionescu C. The Human Respiratory System: An Analysis of the Interplay between Anatomy, Structure, Breathing and Fractal Dynamics. Springer; London, UK: 2013. (Series in BioEngineering).
  • 12. Lopes A.M., Machado J.T. Fractional order models of leaves. J. Vib. Control. 2014;20:998–1008. doi: 10.1177/1077546312473323.
  • 13. Hilfer R. Application of Fractional Calculus in Physics. World Scientific; Singapore: 2000.
  • 14. Tarasov V. Fractional Dynamics: Applications of Fractional Calculus to Dynamics of Particles, Fields and Media. Springer; New York, NY, USA: 2010.
  • 15. Parsa B., Dabiri A., Machado J.A.T. Application of Variable order Fractional Calculus in Solid Mechanics. In: Baleanu D., Lopes A.M., editors. Handbook of Fractional Calculus with Applications: Applications in Engineering, Life and Social Sciences, Part A. Volume 7. De Gruyter; Berlin, Germany: 2019. pp. 207–224.
  • 16. Lopes A.M., Machado J.A.T. Fractional-order modeling of electro-impedance spectroscopy information. In: Baleanu D., Lopes A.M., editors. Handbook of Fractional Calculus with Applications: Applications in Engineering, Life and Social Sciences, Part A. Volume 7. De Gruyter; Berlin, Germany: 2019. pp. 21–41.
  • 17. Valério D., Ortigueira M., Machado J.T., Lopes A.M. Continuous-time fractional linear systems: Steady-state behaviour. In: Petráš I., editor. Handbook of Fractional Calculus with Applications: Applications in Engineering, Life and Social Sciences, Part A. Volume 6. De Gruyter; Berlin, Germany: 2019. pp. 149–174.
  • 18. Tarasov V.E. On history of mathematical economics: Application of fractional calculus. Mathematics. 2019;7:509. doi: 10.3390/math7060509.
  • 19. Clausius R. The Mechanical Theory of Heat: With Its Applications to the Steam-Engine and to the Physical Properties of Bodies. Van Voorst J., editor. Creative Media Partners; Sacramento, CA, USA: 1867.
  • 20. Boltzmann L. Vorlesungen über die Principe der Mechanik. Barth J.A., editor. Volume 1. Nabu Press; Charleston, SC, USA: 1897.
  • 21. Shannon C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948;27:379–423, 623–656. doi: 10.1002/j.1538-7305.1948.tb01338.x.
  • 22. Jaynes E.T. Information theory and statistical mechanics. Phys. Rev. 1957;106:620. doi: 10.1103/PhysRev.106.620.
  • 23. Ortigueira M.D., Machado J.T. What is a fractional derivative? J. Comput. Phys. 2015;293:4–13. doi: 10.1016/j.jcp.2014.07.019.
  • 24. Valério D., Trujillo J.J., Rivero M., Machado J.T., Baleanu D. Fractional calculus: A survey of useful formulas. Eur. Phys. J. Spec. Top. 2013;222:1827–1846. doi: 10.1140/epjst/e2013-01967-y.
  • 25. Lopes A.M., Tenreiro Machado J., Galhano A.M. Multidimensional Scaling Visualization Using Parametric Entropy. Int. J. Bifurc. Chaos. 2015;25:1540017. doi: 10.1142/S0218127415400179.
  • 26. Landsberg P.T., Vedral V. Distributions and channel capacities in generalized statistical mechanics. Phys. Lett. A. 1998;247:211–217. doi: 10.1016/S0375-9601(98)00500-3.
  • 27. Beck C. Generalised information and entropy measures in physics. Contemp. Phys. 2009;50:495–510. doi: 10.1080/00107510902823517.
  • 28. Tsallis C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988;52:479–487. doi: 10.1007/BF01016429.
  • 29. Kaniadakis G. Statistical mechanics in the context of special relativity. Phys. Rev. E. 2002;66:056125. doi: 10.1103/PhysRevE.66.056125.
  • 30. Naudts J. Generalized thermostatistics based on deformed exponential and logarithmic functions. Phys. A Stat. Mech. Appl. 2004;340:32–40. doi: 10.1016/j.physa.2004.03.074.
  • 31. Abe S., Beck C., Cohen E.G. Superstatistics, thermodynamics, and fluctuations. Phys. Rev. E. 2007;76:031102. doi: 10.1103/PhysRevE.76.031102.
  • 32. Sharma B.D., Mittal D.P. New nonadditive measures of entropy for discrete probability distributions. J. Math. Sci. 1975;10:28–40.
  • 33. Wada T., Suyari H. A two-parameter generalization of Shannon–Khinchin axioms and the uniqueness theorem. Phys. Lett. A. 2007;368:199–205. doi: 10.1016/j.physleta.2007.04.009.
  • 34. Bhatia P. On certainty and generalized information measures. Int. J. Contemp. Math. Sci. 2010;5:1035–1043.
  • 35. Asgarani S. A set of new three-parameter entropies in terms of a generalized incomplete Gamma function. Phys. A Stat. Mech. Appl. 2013;392:1972–1976. doi: 10.1016/j.physa.2012.12.018.
  • 36. Hanel R., Thurner S. A comprehensive classification of complex statistical systems and an axiomatic derivation of their entropy and distribution functions. EPL (Europhys. Lett.) 2011;93:20006. doi: 10.1209/0295-5075/93/20006.
  • 37. Sharma B.D., Taneja I.J. Entropy of type (α, β) and other generalized measures in information theory. Metrika. 1975;22:205–215. doi: 10.1007/BF01899728.
  • 38. Kaniadakis G. Maximum entropy principle and power-law tailed distributions. Eur. Phys. J. B-Condens. Matter Complex Syst. 2009;70:3–13. doi: 10.1140/epjb/e2009-00161-0.
  • 39. Tarasov V.E. Lattice model with power-law spatial dispersion for fractional elasticity. Cent. Eur. J. Phys. 2013;11:1580–1588. doi: 10.2478/s11534-013-0308-z.
  • 40. Nigmatullin R., Baleanu D. New relationships connecting a class of fractal objects and fractional integrals in space. Fract. Calc. Appl. Anal. 2013;16:911–936. doi: 10.2478/s13540-013-0056-1.
  • 41. Lin J. Divergence measures based on the Shannon entropy. IEEE Trans. Inf. Theory. 1991;37:145–151. doi: 10.1109/18.61115.
  • 42. Cover T.M., Thomas J.A. Entropy, relative entropy and mutual information. Elem. Inf. Theory. 1991;2:1–55.
  • 43. Ebrahimi N., Pflughoeft K., Soofi E.S. Two measures of sample entropy. Stat. Probab. Lett. 1994;20:225–234. doi: 10.1016/0167-7152(94)90046-9.
  • 44. Pincus S.M. Approximate entropy as a measure of system complexity. Proc. Natl. Acad. Sci. USA. 1991;88:2297–2301. doi: 10.1073/pnas.88.6.2297.
  • 45. Bandt C., Pompe B. Permutation entropy: A natural complexity measure for time series. Phys. Rev. Lett. 2002;88:174102. doi: 10.1103/PhysRevLett.88.174102.
  • 46. Pan Y., Chen J., Li X. Spectral entropy: A complementary index for rolling element bearing performance degradation assessment. Proc. Inst. Mech. Eng. Part C J. Mech. Eng. Sci. 2009;223:1223–1231. doi: 10.1243/09544062JMES1224.
  • 47. Fan J.L., Ma Y.L. Some new fuzzy entropy formulas. Fuzzy Sets Syst. 2002;128:277–284. doi: 10.1016/S0165-0114(01)00127-0.
  • 48. Rosso O.A., Blanco S., Yordanova J., Kolev V., Figliola A., Schürmann M., Başar E. Wavelet entropy: A new tool for analysis of short duration brain electrical signals. J. Neurosci. Methods. 2001;105:65–75. doi: 10.1016/S0165-0270(00)00356-3.
  • 49. De Oliveira E.C., Tenreiro Machado J.A. A review of definitions for fractional derivatives and integral. Math. Probl. Eng. 2014;2014:238459. doi: 10.1155/2014/238459.
  • 50. Sousa J.V.D.C., de Oliveira E.C. On the ψ-Hilfer fractional derivative. Commun. Nonlinear Sci. Numer. Simul. 2018;60:72–91. doi: 10.1016/j.cnsns.2018.01.005.
  • 51. Katugampola U.N. Correction to “What is a fractional derivative?” by Ortigueira and Machado [Journal of Computational Physics, Volume 293, 15 July 2015, Pages 4–13. Special issue on Fractional PDEs]. J. Comput. Phys. 2016;321:1255–1257. doi: 10.1016/j.jcp.2016.05.052.
  • 52. Tarasov V.E. No nonlocality. No fractional derivative. Commun. Nonlinear Sci. Numer. Simul. 2018;62:157–163. doi: 10.1016/j.cnsns.2018.02.019.
  • 53. Abdelhakim A.A., Machado J.A.T. A critical analysis of the conformable derivative. Nonlinear Dyn. 2019;95:3063–3073. doi: 10.1007/s11071-018-04741-5.
  • 54. Lenzi E., Mendes R., Da Silva L. Statistical mechanics based on Rényi entropy. Phys. A Stat. Mech. Appl. 2000;280:337–345. doi: 10.1016/S0378-4371(00)00007-8.
  • 55. Parvan A., Biró T. Extensive Rényi statistics from non-extensive entropy. Phys. Lett. A. 2005;340:375–387. doi: 10.1016/j.physleta.2005.04.036.
  • 56. Plastino A., Casas M., Plastino A. A nonextensive maximum entropy approach to a family of nonlinear reaction–diffusion equations. Phys. A Stat. Mech. Appl. 2000;280:289–303. doi: 10.1016/S0378-4371(00)00006-6.
  • 57. Frank T., Daffertshofer A. H-theorem for nonlinear Fokker–Planck equations related to generalized thermostatistics. Phys. A Stat. Mech. Appl. 2001;295:455–474. doi: 10.1016/S0378-4371(01)00146-7.
  • 58. Abe S. A note on the q-deformation-theoretic aspect of the generalized entropies in nonextensive physics. Phys. Lett. A. 1997;224:326–330. doi: 10.1016/S0375-9601(96)00832-8.
  • 59. Khinchin A.I. Mathematical Foundations of Information Theory. Dover; New York, NY, USA: 1957.
  • 60. Shannon C.E., Weaver W. The Mathematical Theory of Communication. University of Illinois Press; Urbana, IL, USA: 1963.
  • 61. Lesche B. Instabilities of Rényi entropies. J. Stat. Phys. 1982;27:419–422. doi: 10.1007/BF01008947.
  • 62. Gell-Mann M., Tsallis C. Nonextensive Entropy: Interdisciplinary Applications. Oxford University Press; Oxford, UK: 2004.
  • 63. Amigó J.M., Balogh S.G., Hernández S. A brief review of generalized entropies. Entropy. 2018;20:813. doi: 10.3390/e20110813.
  • 64. Namdari A., Li Z. A review of entropy measures for uncertainty quantification of stochastic processes. Adv. Mech. Eng. 2019;11:1687814019857350. doi: 10.1177/1687814019857350.
  • 65. Abe S. Nonextensive statistical mechanics of q-bosons based on the q-deformed entropy. Phys. Lett. A. 1998;244:229–236. doi: 10.1016/S0375-9601(98)00324-7.
  • 66. Jackson F.H. On q-functions and a certain difference operator. Earth Environ. Sci. Trans. R. Soc. Edinb. 1909;46:253–281. doi: 10.1017/S0080456800002751.
  • 67. Akimoto M., Suzuki A. Proposition of a New Class of Entropy. J. Korean Phys. Soc. 2001;38:460–463.
  • 68. Abramowitz M., Stegun I.A., editors. Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables. Dover; New York, NY, USA: 1965.
  • 69. Ubriaco M.R. Entropies based on fractional calculus. Phys. Lett. A. 2009;373:2516–2519. doi: 10.1016/j.physleta.2009.05.026.
  • 70. Yu S., Huang T.Z., Liu X., Chen W. Information measures based on fractional calculus. Inf. Process. Lett. 2012;112:916–921. doi: 10.1016/j.ipl.2012.08.019.
  • 71. Radhakrishnan C., Chinnarasu R., Jambulingam S. A Fractional Entropy in Fractal Phase Space: Properties and Characterization. Int. J. Stat. Mech. 2014;2014:460364. doi: 10.1155/2014/460364.
  • 72. Wang Q.A. Extensive generalization of statistical mechanics based on incomplete information theory. Entropy. 2003;5:220–232. doi: 10.3390/e5020220.
  • 73. Machado J.T. Fractional Order Generalized Information. Entropy. 2014;16:2350–2361. doi: 10.3390/e16042350.
  • 74. Bagci G.B. The third law of thermodynamics and the fractional entropies. Phys. Lett. A. 2016;380:2615–2618. doi: 10.1016/j.physleta.2016.06.010.
  • 75. Jalab H.A., Subramaniam T., Ibrahim R.W., Kahtan H., Noor N.F.M. New Texture Descriptor Based on Modified Fractional Entropy for Digital Image Splicing Forgery Detection. Entropy. 2019;21:371. doi: 10.3390/e21040371.
  • 76. Yang X.J. Advanced Local Fractional Calculus and Its Applications. World Science Publisher; New York, NY, USA: 2012.
  • 77. Karcı A. New approach for fractional order derivatives: Fundamentals and analytic properties. Mathematics. 2016;4:30. doi: 10.3390/math4020030.
  • 78. Karcı A. Fractional order entropy: New perspectives. Optik. 2016;127:9172–9177. doi: 10.1016/j.ijleo.2016.06.119.
  • 79. Ferreira R.A., Tenreiro Machado J. An Entropy Formulation Based on the Generalized Liouville Fractional Derivative. Entropy. 2019;21:638. doi: 10.3390/e21070638.
  • 80. Machado J.T., Lopes A.M. Fractional Rényi entropy. Eur. Phys. J. Plus. 2019;134:217. doi: 10.1140/epjp/i2019-12554-9.
  • 81. Beliakov G., Sola H.B., Sánchez T.C. A Practical Guide to Averaging Functions. Springer; Cham, Switzerland: 2016.
  • 82. Xu D., Erdogmuns D. Renyi’s entropy, divergence and their nonparametric estimators. In: Information Theoretic Learning. Springer; Berlin/Heidelberg, Germany: 2010. pp. 47–102.
  • 83. Van Eck N.J., Waltman L. Software survey: VOSviewer, a computer program for bibliometric mapping. Scientometrics. 2010;84:523–538. doi: 10.1007/s11192-009-0146-3.
