Abstract
In this paper, we create three different entropy curves, the Tsallis q-complexity-entropy curve, the Rényi r-complexity-entropy curve, and the Tsallis-Rényi entropy curve, by extending the traditional complexity-entropy causality plane and replacing permutation entropy with power spectral entropy. This kind of method is free of any tuning parameters, and some features that are obscure in the time domain can be extracted in the frequency domain. Results from numerical simulations verify that these three entropy curves can characterize time series efficiently. Chaotic and stochastic time series can be distinguished based on whether the q-complexity-entropy curves are open or closed. The uncorrelated stochastic process has a negative curvature associated with the Rényi r-complexity-entropy curve, whereas there are positive curvatures for correlated cases. In addition, the Tsallis-Rényi entropy curve can display the relationship between the two entropies. Finally, we apply this method to sleep electrocardiogram (ECG) and electroencephalography (EEG) signals. It is shown that these signals share features with long-range correlated 1/f noise. The method is robust enough to exhibit different characteristics for each sleep stage. By using surrogate data sets, the nonlinearity of the simulated chaotic time series and the sleep data can be identified.
The complexity-entropy causality plane (CECP) proposed by Rosso et al. is a powerful tool for describing the dynamic characteristics of a system. In this paper, we first replace permutation entropy with power spectral entropy, which requires no tuning parameters and can extract features in the frequency domain. To further distinguish time series, we consider the Tsallis and Rényi entropies and create three different entropy curves: the Tsallis q-complexity-entropy curve, the Rényi r-complexity-entropy curve, and the Tsallis-Rényi entropy curve. By applying the proposed method to simulated and empirical data, chaotic and stochastic time series can be distinguished successfully and different sleep stages can be efficiently classified.
I. INTRODUCTION
Studying nonlinear dynamical systems through empirical time series is a common task in many research fields.1 In the last several years, a variety of complexity measures that can distinguish deterministic from stochastic systems have been proposed, such as Kolmogorov complexity,2 fractal dimensions,3 and Lyapunov exponents.4 However, these well-studied methods rely on specific algorithms and tuning parameters, which may add difficulties to the estimation. Furthermore, entropy has been widely applied to estimate the complexity of nonlinear systems.5,6 As an information measure that quantifies the uncertainty of a system, many kinds of entropy methods have been developed.7–9 In particular, the entropy rate is regarded as a measure that quantifies dynamic processes. Paluš studied the relation between the Kolmogorov-Sinai entropy of a dynamical process and the entropy rate of a Gaussian process generated by the dynamical process and found that such a relation may exist, although it can change under different conditions.10,11 Recently, a wavelet-based mutual information rate was proposed to give a more general introduction to the classification of dynamical states using entropy rates.12 In addition, Bandt and Pompe proposed permutation entropy (PE), based on comparing neighboring values, which is suitable for any time series.9 Since the algorithm of this technique is simple and it has the ability to distinguish among periodic, chaotic, and stochastic signals, it is extensively utilized in nonlinear dynamic systems.13–15 For frequency-domain signals, the inner information features can be extracted to determine the power spectral entropy (PSE).16
Considering the specific problem of classifying chaotic and stochastic systems, Rosso et al. showed that applying permutation entropy alone is not enough to solve the issue.17 For example, they found that the permutation entropy of the fully chaotic logistic map is close to that of long-range correlated noises, such as 1/f noise. Inspired by this, Rosso et al. combined permutation entropy with another complexity measure, the statistical complexity introduced by López-Ruiz et al.18,19 One of the factors of the statistical complexity is called disequilibrium, which quantifies the distance between the uniform (equal probability) distribution and the system probability distribution. The value of the disequilibrium is different from zero if privileged or more probable states exist. The statistical complexity is thus the product of "information" and "disequilibrium." Having obtained the values of permutation entropy H and statistical complexity C, the diagram of C versus H is constructed from the points (H, C). Its applications have spread widely across various scientific communities for distinguishing noise, chaotic systems, and stochastic processes.20–26
However, despite the great success of the CECP, there exist some drawbacks in several situations. For example, the time delay and embedding dimension must be chosen anew for every system, which is rather troublesome and time-consuming. Moreover, the positions of some periodic and chaotic signals in the CECP lie close together, which may lead to incorrect characterization. Prompted by this phenomenon, we consider the Tsallis q-entropy and the Rényi r-entropy and establish the fractal causality plane in the frequency domain.27,28 Tsallis entropy and Rényi entropy are monoparametric generalizations of the Shannon entropy. The parameters of these two entropies assign different weights to the probabilities of the power spectrum distribution in distinct ways; thus, different dynamical fractal properties of a system can be evaluated by changing the entropic index. In this paper, we create parametric curves of signals from the values of entropy and statistical complexity calculated by varying the parameters q and r. The resulting curve, defined as the complexity-entropy curve, can classify periodic, stochastic, and chaotic time series generated by complex systems.
The rest of the paper is organized as follows. Section II first introduces the methodologies of the statistical complexity measure, power spectral entropy, Tsallis entropy, and Rényi entropy. Then, we integrate these measures and present three different complexity-entropy causality planes based on the theories mentioned. In Sec. III, we select several different kinds of processes to validate the effectiveness of the proposed method and compare the results. Section IV displays the empirical application to sleep data. Finally, the conclusions are summarized in Sec. V.
II. METHODOLOGY
A. Statistical complexity measure
Statistical complexity can be defined to describe a system with a simple structure but complex dynamical characteristics and can also reveal complex patterns that are implicit in its inner dynamics.29 At the same time, the statistical complexity assumes that there are two opposing extremes in a nonlinear dynamic system, namely, complete order and maximum randomness. In both cases, the system structure is very simple and the statistical complexity is zero. Between these two special cases, there is a large number of possible physical structures, which can be reflected by the underlying system probability distribution.30
For a given nonlinear system with an arbitrary discrete probability distribution P = {p_i, i = 1, ..., N} over N possible states, the well-known Shannon entropy is defined as5
S[P] = -\sum_{i=1}^{N} p_i \ln p_i. (2.1)
The value of S[P] quantifies the complexity of the system to some degree. If S[P] = 0, we can predict with certainty that the single possible outcome, whose probability equals 1, will actually happen. On the other hand, the uncertainty reaches its maximum, S_max = ln N, when the distribution is uniform, that is, P_e = {p_i = 1/N, i = 1, ..., N}.
Another complexity measure is the "disequilibrium," denoted by Q[P, P_e], which depicts the distance between a specified probability distribution P and the equilibrium probability distribution P_e. The disequilibrium is defined in terms of the extensive Jensen-Shannon divergence17
Q_J[P, P_e] = Q_0 \, J[P, P_e], (2.2)
where J[P, P_e] is a relative entropy measure (the Jensen-Shannon divergence) between the empirical distribution P and the uniform distribution P_e. This measure can be obtained by symmetrizing the Kullback-Leibler divergence K[P|R] = \sum_i p_i \ln(p_i/r_i), where P and R are two distributions, and is described as
J[P, P_e] = S\!\left[\frac{P + P_e}{2}\right] - \frac{S[P]}{2} - \frac{S[P_e]}{2}, (2.3)
and Q_0 is a normalization constant, equal to the inverse of the maximum possible value of J[P, P_e]. Therefore, the statistical complexity measure is formed through the product
C[P] = Q_J[P, P_e] \, H[P] (2.4)
of the disequilibrium and the normalized Shannon entropy
H[P] = \frac{S[P]}{S_{\max}} = \frac{S[P]}{\ln N}. (2.5)
The statistical complexity measure mirrors the interrelationship between the quantity of information in a dynamic system and its disequilibrium. By applying the Jensen-Shannon divergence instead of the Euclidean distance, this generalized statistical complexity measure has the intensive character found in many thermodynamic quantities. It can thus better reflect the key details of the dynamic characteristics of the system and distinguish between different degrees of periodicity and chaos, information that cannot be recognized by a randomness measure alone.
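To make this construction concrete, the following minimal Python sketch evaluates Eqs. (2.1)-(2.5) for a given probability distribution; the function names are illustrative, and the normalization constant Q_0 is obtained numerically as the inverse of the Jensen-Shannon divergence between a degenerate distribution and the uniform one (its maximum possible value).

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy S[P] = -sum_i p_i ln p_i, with the convention 0 ln 0 = 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def statistical_complexity(p):
    """Statistical complexity C[P] = Q_J[P, P_e] * H[P], following Eqs. (2.2)-(2.5)."""
    p = np.asarray(p, dtype=float)
    n = p.size
    pe = np.full(n, 1.0 / n)                     # uniform (equilibrium) distribution P_e
    h = shannon_entropy(p) / np.log(n)           # normalized Shannon entropy, Eq. (2.5)
    js = (shannon_entropy((p + pe) / 2)
          - shannon_entropy(p) / 2
          - shannon_entropy(pe) / 2)             # Jensen-Shannon divergence, Eq. (2.3)
    delta = np.zeros(n)
    delta[0] = 1.0                               # degenerate distribution maximizing J
    q0 = 1.0 / (shannon_entropy((delta + pe) / 2) - shannon_entropy(pe) / 2)
    return q0 * js * h                           # Eq. (2.4)
```

Both extremes described above yield zero complexity: for a degenerate distribution the normalized entropy vanishes, and for the uniform distribution the divergence vanishes.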
B. Power spectral entropy
Power spectral entropy (PSE) is a kind of information entropy that quantifies the spectral complexity of a signal in the frequency domain from an energy perspective. First, the discrete Fourier transform X(ω_k) of the time-domain signal is obtained by the Fast Fourier Transform (FFT), where ω_k denotes the k-th frequency point. Then, the power spectral density (PSD) is estimated as follows:
P(\omega_k) = \frac{1}{N} |X(\omega_k)|^2, (2.6)
where N is the length of the transformed series. After normalizing by
p_k = \frac{P(\omega_k)}{\sum_{k} P(\omega_k)}, (2.7)
the power spectral density distribution P = {p_k} is obtained. Finally, the power spectral entropy can be defined through Eq. (2.1) as
H_{PSE} = -\sum_{k} p_k \ln p_k. (2.8)
PSE characterizes the spectral structure of a signal in the frequency domain, in other words, the uncertainty of its energy distribution over frequency. If the distribution of energy is more uniform across the whole frequency domain, the signal is more complex and the value of PSE is higher; conversely, the narrower the spectral peak, the smaller the value of PSE, which means that the system is closer to a regular oscillation and less complex. The value of PSE reaches its maximum, ln N, when the energy or power spectrum distribution is flat, that is, when p_k = 1/N for all k.
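A minimal Python sketch of Eqs. (2.6)-(2.8) is given below; a one-sided FFT periodogram without windowing is assumed, since those implementation details are not specified above.

```python
import numpy as np

def power_spectral_entropy(x, normalized=True):
    """Power spectral entropy of a real-valued signal x, following Eqs. (2.6)-(2.8)."""
    spectrum = np.fft.rfft(x)                      # discrete Fourier transform via FFT
    psd = np.abs(spectrum) ** 2 / len(x)           # power spectral density estimate, Eq. (2.6)
    p = psd / psd.sum()                            # normalized spectral distribution, Eq. (2.7)
    nonzero = p[p > 0]
    pse = -np.sum(nonzero * np.log(nonzero))       # Shannon entropy of the spectrum, Eq. (2.8)
    return pse / np.log(p.size) if normalized else pse   # divide by ln N to map onto [0, 1]
```

For a pure sinusoid the spectral energy is concentrated in a single bin and the normalized PSE is close to 0, whereas for white noise it is close to 1.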
C. Tsallis entropy and Rényi entropy
Tsallis q-entropy, a generalization of the Boltzmann-Gibbs-Shannon (BGS) entropy, is a nonextensive entropy that estimates the degree of departure from extensivity.27,31 It is applicable for detecting the multifractal properties of nonlinear systems. Given a discrete probability distribution P = {p_i, i = 1, ..., N}, it is defined as
S_q[P] = \frac{1}{q-1}\left(1 - \sum_{i=1}^{N} p_i^{q}\right), (2.9)
where q is any real number and N is the total number of accessible configurations. As q → 1, S_q[P] reduces to the Shannon entropy. Therefore, the entropic index q evaluates deviations of the Tsallis entropy from the Shannon entropy.32 Equation (2.9) can also be written as
S_q[P] = \sum_{i=1}^{N} p_i \ln_q\!\left(\frac{1}{p_i}\right), (2.10)
where \ln_q(x) = (x^{1-q} - 1)/(1-q) is the q-logarithm function. The maximum value of the Tsallis entropy, attained when the distribution is uniform, is \ln_q N.
The r-th-order Rényi entropy proposed by Rényi is defined as
R_r[P] = \frac{1}{1-r} \ln\!\left(\sum_{i=1}^{N} p_i^{r}\right), (2.11)
where r is the entropic index.28 The Shannon entropy is obtained in the limit r → 1. The maximum value of the Rényi entropy is ln N, the same as in the Shannon entropy case.
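Both generalized entropies can be evaluated directly from a probability distribution, as in the short Python sketch below (a sketch only; the q → 1 and r → 1 cases fall back to the Shannon form).

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis q-entropy, Eq. (2.9); reduces to the Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def renyi_entropy(p, r):
    """Rényi r-entropy, Eq. (2.11); reduces to the Shannon entropy as r -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(r, 1.0):
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** r)) / (1.0 - r)
```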
D. Three extended complexity-entropy causality planes based on power spectral entropy
In order to better describe the intrinsic spectral distribution and classify various sorts of time series, we introduce three distinct extended complexity-entropy causality planes based on power spectral entropy, building on the theories mentioned above. For a given time series, the procedure of each algorithm can be summarized as follows.
1. Tsallis complexity-entropy causality plane
Step 1: For a given time series, calculate the q-th Tsallis PSE and then normalize it by
H_q[P] = \frac{S_q[P]}{\ln_q N}. (2.12)
Step 2: Compute the Tsallis complexity measure
C_q[P] = D_q[P, P_e] \, H_q[P], (2.13)
where P_e is the uniform distribution and
(2.14)
is the entropic measure (a Jensen-Shannon-type divergence in the Tsallis form) between P and P_e, written in terms of the Tsallis generalization of the Kullback-Leibler divergence; its normalization constant is determined as
(2.15)
Step 3: Construct the q-complexity-entropy plane with H_q as the horizontal axis and C_q as the vertical axis.
Suppose that P = {p_i, i = 1, ..., N} is the probability distribution of the power spectrum and n denotes the number of non-zero elements of P. The following statements are obtained:
- (1) H_q → (n − 1)/(N − 1) when q → 0;
- (2) C_q converges to a finite limiting value (given in Appendix A) when q → 0;
- (3) H_q → 1 when q → ∞;
- (4) C_q → 0 when q → ∞.
The proofs of these statements are displayed in Appendix A. Several general properties can be summarized:
(1) If there is only one non-zero component of the power spectrum, in other words, n = 1, the q-complexity-entropy curve collapses onto the point (0, 0);
(2) If all the values of the power spectral density are non-zero, then n = N and the curve ends up as a closed loop at the point (1, 0) as q runs from 0 to ∞.
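As an illustration of Steps 1-3, the sketch below traces a q-complexity-entropy curve. Since Eqs. (2.13)-(2.15) are not written out in full here, the disequilibrium is assumed to be a Jensen-type divergence built from the Tsallis entropy, S_q[(P + P_e)/2] − S_q[P]/2 − S_q[P_e]/2, normalized by its value for a degenerate distribution; the exact definition used in the paper may differ in detail.

```python
import numpy as np

def tsallis(p, q):
    """Tsallis q-entropy of a distribution p (Shannon entropy at q = 1)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def q_complexity_entropy_curve(p, q_values):
    """Trace (H_q, C_q) points as q varies, under the assumed construction."""
    p = np.asarray(p, dtype=float)
    n = p.size
    pe = np.full(n, 1.0 / n)                     # uniform reference distribution
    delta = np.zeros(n)
    delta[0] = 1.0                               # degenerate distribution for normalization
    points = []
    for q in q_values:
        h = tsallis(p, q) / tsallis(pe, q)       # normalized Tsallis PSE, Eq. (2.12)
        d = tsallis((p + pe) / 2, q) - tsallis(p, q) / 2 - tsallis(pe, q) / 2
        d_max = tsallis((delta + pe) / 2, q) - tsallis(pe, q) / 2   # tsallis(delta) = 0
        points.append((h, (d / d_max) * h))      # assumed form of Eq. (2.13)
    return np.array(points)
```

Feeding in the normalized power spectral distribution of a signal and sweeping q over several orders of magnitude produces curves of the kind discussed in Sec. III; under this construction the curve closes into a loop exactly when every spectral component is non-zero, in line with property (2) above.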
2. Rényi complexity-entropy causality plane
Step 1: For a given time series, calculate the r-th Rényi PSE and then normalize it by
H_r[P] = \frac{R_r[P]}{\ln N}. (2.16)
Step 2: Compute the Rényi complexity measure C_r[P]:
C_r[P] = D_r[P, P_e] \, H_r[P], (2.17)
where P_e is the uniform distribution and
(2.18)
is the entropic measure (a Jensen-Shannon-type divergence in the Rényi form) between P and P_e, written in terms of the Rényi generalization of the Kullback-Leibler divergence; its normalization constant is determined as
(2.19)
Step 3: Construct the r-complexity-entropy plane with H_r as the horizontal axis and C_r as the vertical axis.
Assume that P = {p_i, i = 1, ..., N} is the probability distribution of the power spectrum, n denotes the number of non-zero elements of P, and p_max and p_min are the maximum and minimum elements of P, respectively. The following statements are obtained (for detailed proofs, see Appendix B):
- (1) H_r → ln n / ln N when r → 0;
- (2) C_r converges to a finite limiting value (given in Appendix B) when r → 0;
- (3) H_r → ln(1/p_max) / ln N when r → ∞;
- (4) C_r converges to a finite limiting value, which depends on p_max and p_min, when r → ∞.
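The r → 0 and r → ∞ limits of the normalized Rényi entropy stated above follow directly from Eq. (2.11); the short sketch below checks them numerically for a toy distribution (the complexity limits worked out in Appendix B are not reproduced here).

```python
import numpy as np

def renyi_normalized(p, r):
    """Normalized Rényi entropy H_r = R_r[P] / ln N, Eq. (2.16)."""
    p = np.asarray(p, dtype=float)
    nonzero = p[p > 0]
    if np.isclose(r, 1.0):
        return -np.sum(nonzero * np.log(nonzero)) / np.log(p.size)
    return np.log(np.sum(nonzero ** r)) / (1.0 - r) / np.log(p.size)

p = np.array([0.5, 0.3, 0.2, 0.0])     # toy spectral distribution with N = 4, n = 3
n, N, p_max = 3, 4, 0.5
print(renyi_normalized(p, 1e-6), np.log(n) / np.log(N))          # r -> 0 limit: ln n / ln N
print(renyi_normalized(p, 200.0), np.log(1 / p_max) / np.log(N)) # r -> inf limit: ln(1/p_max) / ln N
```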
3. Tsallis and Rényi entropy plane
Step 1: For a given time series, calculate the r-th Rényi PSE and then normalize it by
H_r[P] = \frac{R_r[P]}{\ln N}. (2.20)
Step 2: Calculate the q-th Tsallis PSE and then normalize it by
H_q[P] = \frac{S_q[P]}{\ln_q N}. (2.21)
Step 3: Construct the Tsallis-Rényi entropy plane by plotting the normalized Tsallis entropy against the normalized Rényi entropy.
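A compact sketch of Steps 1-3 for this plane is given below; whether the two entropic indices are swept jointly or independently is not stated explicitly above, so a joint sweep over a single common index is assumed here.

```python
import numpy as np

def tsallis_renyi_plane(p, indices):
    """Pair the normalized Rényi and Tsallis PSEs, Eqs. (2.20)-(2.21), over an index sweep."""
    p = np.asarray(p, dtype=float)
    num_states = p.size
    nonzero = p[p > 0]
    points = []
    for a in indices:
        if np.isclose(a, 1.0):
            h_t = h_r = -np.sum(nonzero * np.log(nonzero)) / np.log(num_states)  # Shannon case
        else:
            h_t = (1.0 - np.sum(nonzero ** a)) / (1.0 - num_states ** (1.0 - a))  # S_a / ln_a(N)
            h_r = np.log(np.sum(nonzero ** a)) / (1.0 - a) / np.log(num_states)   # R_a / ln(N)
        points.append((h_r, h_t))
    return np.array(points)
```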
III. NUMERICAL SIMULATIONS
In this section, we consider several different classes of systems, namely, chaotic systems, stochastic processes, and periodic processes, to illustrate the robustness and effectiveness of the proposed extended CECP (ECECP) based on power spectral entropy. The length of each simulated time series is 10 000, and the parameters q and r are varied over a wide range up to 500 in fixed step sizes.
A. Logistic map
The logistic map is a polynomial mapping of degree 2, which is defined as
x_{t+1} = a \, x_t (1 - x_t), (3.1)
where a is a parameter in the range 0 < a ≤ 4. Here, time series are obtained for four values of a with a fixed initial condition. The logistic map presents distinct features for different choices of a. To be specific, simple periodic dynamics is produced for the smallest value considered, and a = 3.5 yields stable cycles of period four; chaotic dynamics is produced for a = 3.8 and a = 4. It is noticeable that there is a period-three window in the bifurcation diagram of the logistic map at around a ≈ 3.83.
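A short Python sketch of how such series can be generated is given below; the initial condition, transient length, and the particular periodic parameter values are illustrative choices rather than the exact settings used here.

```python
import numpy as np

def logistic_series(a, length=10_000, x0=0.1, discard=1_000):
    """Iterate the logistic map x_{t+1} = a * x_t * (1 - x_t) and drop the transient."""
    x = np.empty(length + discard)
    x[0] = x0
    for t in range(length + discard - 1):
        x[t + 1] = a * x[t] * (1.0 - x[t])
    return x[discard:]

# periodic (a = 3.5) and chaotic (a = 3.8, 4.0) regimes discussed above
series = {a: logistic_series(a) for a in (3.5, 3.8, 4.0)}
```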
Figure 1(a) presents the Tsallis q-complexity-entropy curves of the logistic map for the four parameter values, including a = 3.5, 3.8, and 4. It is obvious that all the curves are open and begin at the point (1, 0). The curves for the periodic time series form a larger opening loop; that is, their values of statistical complexity remain not far from 1 even for large q. The minimum values all occur at the marked points, which are also turning points. The concrete figures can be seen in Table I. From Fig. 1(b), we can observe that the Rényi r-complexity-entropy curves for the periodic time series begin at the point (1, 0) and end at the point (0, 0), whereas the curves for the chaotic time series do not reach the origin, although they all have one significant peak. To be more specific, we calculate the derivative of the statistical complexity with respect to the entropy to visualize the curvature of each curve in Fig. 2. The four curves all increase from negative to positive values over a narrow range of r; however, as r increases further, there is a significant decline back to negative values for the curve of the chaotic logistic map. Figure 1(c) shows the relationship between the Tsallis and Rényi power spectral entropies for the logistic map with different parameters. The parameters q and r appear to matter more for the periodic time series.
FIG. 1.
Three complexity-entropy curves for the logistic map with different parameters a. The two end markers stand for the starting and end points of each curve, respectively. A separate marker represents the point of statistical complexity and entropy at which the parameter q or r equals 1.
TABLE I.
The minimum value of H_q and the maximum value of C_q for the logistic map with different parameters a.
min H_q | 0.0055 | 0.0148 | 0.1598 | 0.3887
max C_q | 0.9471 | 0.9810 | 0.6168 | 0.4777
B. Chaotic systems
In this subsection, we first compare seven different kinds of chaotic systems while varying the parameters q and r, and then analyze the inherent properties of these chaotic systems. The seven systems are listed below; a representative iteration of the first one is sketched after the list.
- (1) the Hénon map33
- (2) the Burgers map34
- (3) the Gingerbreadman map35
- (4) the Tinkerbell map
- (5) the Lorenz system36
- (6) the Rössler system37
- (7) the Chen system38
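As one representative of the list above, the Python sketch below iterates the Hénon map with its classical parameter values a = 1.4 and b = 0.3; the exact parameter settings and initial conditions used for the seven systems in this paper are not reproduced here.

```python
import numpy as np

def henon_series(length=10_000, a=1.4, b=0.3, discard=1_000):
    """Hénon map x_{t+1} = 1 - a*x_t**2 + y_t, y_{t+1} = b*x_t (x-coordinate returned)."""
    x, y = 0.1, 0.1                      # illustrative initial condition
    out = np.empty(length + discard)
    for t in range(length + discard):
        x, y = 1.0 - a * x * x + y, b * x
        out[t] = x
    return out[discard:]                 # drop the transient
```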
Figure 3 illustrates the three different complexity-entropy curves for each chaotic system. In general, none of the Tsallis q-complexity-entropy curves in Fig. 3(a) forms a loop, indicating that there are zero-valued components in the power spectral density distributions. We further examine the minimum value of H_q and the maximum value of C_q, shown in Table II. It is clear that the Rössler system possesses the smallest minimum of H_q and the greatest maximum of C_q. This implies that the Rössler system is much more sensitive to the parameter q, which can also be concluded from the Rényi r-complexity-entropy curves in Fig. 3(b). Unlike the former curves, the r-complexity-entropy curves seem more efficient at characterizing these systems through their different shapes, even though they all begin at the point (1, 0). There is a trough and a peak for the curves of the Hénon map and the Tinkerbell map, whereas the others have only one obvious peak. Furthermore, to classify them more comprehensively, we calculate the derivative of the statistical complexity with respect to the entropy to visualize the curvature for each system, as shown in Fig. 4. The values for the two maps mentioned above increase to positive numbers and then decrease to negative ones, whereas for the other five systems there is a clear rise at the beginning followed by a plateau at a positive value. Figure 3(c) shows the relationship between the Tsallis and Rényi entropies as q and r increase simultaneously. The values of the Tsallis and Rényi entropies are both close to 1 for some values of the indices, whereas for other values the Tsallis entropy remains near 1 while the Rényi entropy departs from it. In a word, these chaotic systems can be distinguished by the shapes and widths of their curves.
FIG. 2.
The derivative of the statistical complexity with respect to the entropy, as a function of the Rényi parameter r, for the logistic map with different parameters a.
FIG. 3.
Three complexity-entropy curves of seven different chaotic time series. The two end markers stand for the starting and end points of each curve, respectively. A separate marker represents the point of statistical complexity and entropy at which the parameter q or r equals 1.
TABLE II.
The minimum value of H_q and the maximum value of C_q for different chaotic time series.
  | Hénon | Burgers | Gingerbreadman | Tinkerbell | Lorenz | Rössler | Chen
---|---|---|---|---|---|---|---
min H_q | 0.7731 | 0.3534 | 0.2736 | 0.6543 | 0.2211 | 0.0743 | 0.2238
max C_q | 0.3338 | 0.7195 | 0.8223 | 0.4375 | 0.8725 | 0.9694 | 0.8452
FIG. 4.
The derivative of the statistical complexity with respect to the entropy, as a function of the Rényi parameter r, for seven different chaotic time series.
C. Stochastic process
In this subsection, we generate uncorrelated Gaussian white noise and long-range correlated 1/f noise; the results are averaged over 50 independent realizations. A sketch of one way to generate both processes is given after this paragraph.
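The Python sketch below shows one common way to generate the two processes; for the 1/f noise a spectral-synthesis construction (white Gaussian Fourier coefficients shaped by f^(-1/2)) is assumed, since the specific generator used here is not stated.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_white_noise(length=10_000):
    """Uncorrelated Gaussian white noise."""
    return rng.standard_normal(length)

def one_over_f_noise(length=10_000, beta=1.0):
    """Long-range correlated 1/f^beta noise by spectral synthesis (beta = 1 gives 1/f noise)."""
    freqs = np.fft.rfftfreq(length)
    amplitude = np.zeros_like(freqs)
    amplitude[1:] = freqs[1:] ** (-beta / 2.0)            # 1/f amplitude profile, zero DC
    coeffs = amplitude * (rng.standard_normal(freqs.size)
                          + 1j * rng.standard_normal(freqs.size))
    x = np.fft.irfft(coeffs, length)
    return (x - x.mean()) / x.std()                       # zero mean, unit variance
```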
Figure 5 shows the three complexity-entropy curves of Gaussian white noise and 1/f noise. Comparing the Tsallis q-complexity-entropy results in Figs. 5(a) and 5(d), the uncorrelated Gaussian white noise forms an entire loop, whereas the curve for the correlated 1/f noise does not close. The results for 1/f noise vary more substantially as the parameter q changes, as summarized in Table III. It is also interesting that, for Gaussian white noise, there is a negative linear relationship between the entropy and the complexity as the Rényi parameter r changes, whereas the Rényi r-complexity-entropy curve for 1/f noise is convex. We can also observe in Fig. 6 that the derivative of the curve for Gaussian white noise is always negative, while the curve for 1/f noise has a positive curvature over part of the range of r. Finally, looking at the relationship between the Tsallis and Rényi power spectral entropies in Figs. 5(c) and 5(f), Gaussian white noise occupies a narrower band, whereas the values for 1/f noise change considerably with the parameters.
FIG. 5.
Three complexity-entropy curves of Gaussian white noise and 1/f noise. The two end markers stand for the starting and end points of each curve, respectively. A separate marker represents the point of statistical complexity and entropy at which the parameter q or r equals 1.
TABLE III.
The minimum value of H_q and the maximum value of C_q for Gaussian white noise and 1/f noise.
  | Gaussian white noise | 1/f noise
---|---|---
min H_q | 0.8879 | 0.6563
max C_q | 0.1911 | 0.4779
FIG. 6.
The derivative of the statistical complexity with respect to the entropy, as a function of the Rényi parameter r, for Gaussian white noise and 1/f noise.
D. Statistical testing
In order to test for nonlinearity in a time series, we apply a statistical method proposed by Theiler et al.39 This approach first assumes a null hypothesis of some linear process, from which so-called surrogate data sets are generated, and then defines a discriminating statistic for the original and every surrogate data set. If the result obtained from the original data is significantly different from the results calculated for the surrogate data sets, the null hypothesis is rejected and nonlinearity is identified; otherwise, the null hypothesis is accepted. Several methods have been introduced to generate surrogate data sets, such as the Fourier Transform (FT) and Amplitude Adjusted Fourier Transform (AAFT) algorithms. Later, Paluš further discussed statistical testing as a tool to characterize nonlinearity and causality in time series and, thus, to detect the physical mechanisms of the underlying dynamical systems.40 In this paper, we utilize the AAFT algorithm to generate the surrogate data sets and define the discriminating statistic as
(3.9) |
After selecting the discriminating statistic, calculate its value for the original data set. Then, determine the number of surrogate data sets, which are random realizations of the null process, and calculate the value of the statistic for each surrogate data set. Finally, identify whether the original value lies in the tail of the empirical distribution of the statistic obtained from the surrogate data sets. In particular, for a two-sided test, if the original value is found among the greatest or the smallest values in the sorted list consisting of the original and surrogate statistics, the null hypothesis is rejected at the corresponding level.
It is noticeable that the number of surrogate data sets is tied to the chosen significance level and is commonly selected so that the test has an exact size. Although the size of the test does not depend on the number of surrogates, several studies have verified that the power of the test improves as this number increases.41,42 In this paper, we choose a fixed number of surrogate data sets and reject or accept the null hypothesis at a fixed significance level.43 Using the Monte Carlo method, we estimate the confidence intervals of the statistic values. A sketch of this procedure is given below.
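The following compact Python sketch implements the AAFT surrogate construction and a rank-based surrogate test; the number of surrogates is an illustrative choice, and the discriminating statistic is left as a user-supplied function, since Eq. (3.9) is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def aaft_surrogate(x):
    """Amplitude Adjusted Fourier Transform surrogate of a time series x (Theiler et al.)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    ranks = np.argsort(np.argsort(x))
    gauss = np.sort(rng.standard_normal(n))[ranks]        # Gaussianized copy with the ranks of x
    spectrum = np.fft.rfft(gauss)
    phases = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, spectrum.size))
    phases[0] = 1.0                                       # keep the mean component
    if n % 2 == 0:
        phases[-1] = 1.0                                  # keep the Nyquist component real
    shuffled = np.fft.irfft(np.abs(spectrum) * phases, n) # phase-randomized Gaussian series
    return np.sort(x)[np.argsort(np.argsort(shuffled))]   # restore the original amplitudes

def surrogate_test(x, statistic, n_surrogates=19):
    """Return the original statistic and its surrogate distribution for a rank-based test."""
    t0 = statistic(x)
    ts = np.array([statistic(aaft_surrogate(x)) for _ in range(n_surrogates)])
    return t0, ts
```

With, for example, the power spectral entropy of Sec. II B as the discriminating statistic, the null hypothesis of a linear process is rejected when the original value falls in the tails of the surrogate distribution.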
Table IV shows the results of the nonlinearity test for nine time series. According to the confidence intervals of the statistic values obtained from the surrogate data sets, we observe that nearly all the time series, including the periodic ones, reject the null hypothesis, meaning that nonlinearity truly exists and these time series are not generated by a linear stochastic process. In addition, we compare the three complexity-entropy curves for the original Rössler system and its surrogate data in Fig. 7. It is obvious that the surrogate time series generated from the Rössler system is very close to a Gaussian stochastic process (compare Fig. 5), which shows that the complexity-entropy curves are able to distinguish a chaotic process from a stochastic one.
TABLE IV.
The statistical testing of the two discriminating statistics for nine time series (R denotes rejection of the null hypothesis and A represents accepting it).
Time series | Statistic | Confidence interval | Judgement | Statistic | Confidence interval | Judgement
---|---|---|---|---|---|---|
Burgers map | 4.7367 | [4.9772, 6.5780] | R | 66.4251 | [34.2054, 36.1410] | R |
Chen system | 18.5241 | [7.7581, 9.7198] | R | 34.1023 | [76.5824, 199.0728] | R |
Gingerbreadman map | 3.6292 | [5.1536, 5.6765] | R | 70.3709 | [35.5459, 38.8218] | R |
Hénon map | 26.5579 | [22.4717, 26.0687] | R | 33.8919 | [23.5640, 31.4141] | R |
Lorenz system | 28.3404 | [34.8307, 38.3528] | R | 42.4013 | [18.7551, 22.7769] | R |
Rössler system | 2.8394 | [2.8420,3.8770] | R | 99.1665 | [43.4283,53.6694] | R |
Tinkerbell map | 13.7198 | [11.1411, 13.2249] | R | 50.9866 | [32.5737, 36.4427] | R |
Logistic map (a = 4) | 4.1260 | [3.9294, 4.3276] | A | 90.8679 | [49.5653, 49.6964] | R |
Logistic map (a = 3.5) | 11.0205 | [3.0577, 3.3210] | R | 196.9760 | [80.6788, 85.8214] | R |
FIG. 7.
The comparison of the three complexity-entropy curves for the original and surrogate Rössler systems.
IV. EMPIRICAL DATA AND ANALYSES
A. Data description
Sleep is an important physiological activity, and it is closely related to health, work, study, and so on. A reasonable staging of sleep is the basis for studying sleep quality and diagnosing sleep disorders. Several physiological signals can be used to describe the different sleep stages.
In this section, one healthy male subject was enrolled, and the participant was asked to sleep in a standard sleep laboratory with all-night Polysomnography (PSG, Compumedics, Australia). The whole research protocol followed the guidelines of the American Academy of Sleep Medicine (AASM). Commonly used recordings include a 6-channel EEG montage, bilateral EOG, chin surface and bilateral anterior tibialis surface EMG, respiratory effort (chest and abdominal excursion), nasal pressure and airflow, a snoring sensor, SpO2, ECG, and body position. The PSG recordings were scored in 30-s epochs during the night and were reviewed by experienced sleep specialists. The sampling frequency of the EEG and ECG signals (channel C4-M1) in our study is 128 Hz.44
According to the different morphological characteristics of the electroencephalogram (EEG), electromyogram (EMG), and electrooculogram (EOG), sleep is generally divided into three categories: Wake, rapid eye movement (REM), and Non-REM. Non-REM can be further divided into three parts: N1, N2, and slow wave sleep (SWS). The depth of sleep ranges from light sleep, through mild sleep, to deep sleep, and different sleep EEGs have different morphological characteristics.45 Specifically, N1 is the stage between wakefulness and sleep, in which comparatively unsynchronized beta (12–30 Hz) and gamma (25–100 Hz) brain waves are transformed into more synchronized but slower alpha waves (8–13 Hz), and then into theta waves (4–7 Hz); hypnagogic jerks frequently happen during Stage N1. N2, dominated by theta waves (4–7 Hz) and accompanied by sleep spindles (12–14 Hz) or K-complexes, is the first unambiguous stage of sleep. Stage SWS is dominated by delta waves (0.5–4 Hz). Stage REM, identified by rapid and random eye movements, possesses high-frequency waves (containing theta, alpha, and even beta) similar to wakefulness, and dreaming commonly happens during this stage. During sleep, the balance of sympathetic and vagal activity is modulated differently in distinct sleep stages.
Here, we use both the EEG and ECG signals and utilize the proposed methods to detect the inherent characteristics of the different sleep stages and classify them efficiently. Considering that the duration of stage N1 is rather short over the whole night, stages N1 and N2 are merged into one stage in the following calculations to obtain more reliable results. The sampling frequency of the EEG and ECG signals is 128 Hz. In addition, we select 5 successive epochs of each stage, so the length of the experimental time series corresponding to each stage is 19 200. The parameters q and r are varied over the same ranges and step sizes as in the simulations.
Figure 8 illustrates one epoch of each stage of the ECG (left panel) and EEG (right panel) signals for the healthy subject. By visually inspecting the ECG plot, it is a little difficult to identify the different stages, since the ECG records cardiac activity and the heart is relatively far away from the head. On the contrary, the EEG signal mirrors more accurate information about sleep, and each stage clearly possesses a distinct oscillation pattern. One can conclude that the wave pattern of stage SWS is much more stable than the others because deep sleep is taking place. On the other hand, the remaining stages vary to different degrees, especially the waves of stages Wake and REM. In the next part, we apply the three entropy curves to distinguish the different stages and detect their complexity.
FIG. 8.
The oscillation of different sleep stages for ECG signal (left panel) and EEG signal (right panel).
B. Results and analyses
Figure 9 demonstrates the three entropy curves for the ECG and EEG signals of the sleep data. In general, the q-complexity-entropy curves for the ECG and EEG signals are not entirely closed, which means that the signals are not generated by a purely stochastic process. By observing the r-complexity-entropy curves for both signals and the derivative of the statistical complexity with respect to the entropy as a function of the Rényi parameter r in Fig. 10, we can conclude that the signals are more similar to long-range correlated 1/f noise, which is consistent with the result reported by Costa et al.46 Moreover, it is noticeable that it is rather difficult to distinguish the different stages when the parameters equal 1 (emphasized by the marker), demonstrating the necessity of the introduced method.
FIG. 9.
Three complexity-entropy curves of the ECG signal and EEG signal of the sleep data. The two end markers stand for the starting and end points of each curve, respectively. A separate marker represents the point of statistical complexity and entropy at which the parameter q or r equals 1.
FIG. 10.
The derivative of the statistical complexity with respect to the entropy, as a function of the Rényi parameter r, for the ECG and EEG signals of the sleep data.
To be specific, in Figs. 9(a)–9(c), the results for the ECG signal imply that the stages can be roughly divided into three parts: Wake, Non-REM, and REM. Without loss of generality, when deep sleep happens, the signal tends to be smoother than in the wake stage; thus, the former signal is likely to have less randomness and more complexity. Furthermore, the stages can be classified in more detail via the EEG signal in Figs. 9(d)–9(f). It is clear that the two Non-REM stages, deep sleep (red line) and light sleep (blue line), are separated. From the detailed values in Tables V and VI, we find that the EEG signal is more relevant to the sleep stages than the ECG signal, since the results from the former are more discriminative. Nevertheless, it seems more comprehensive to apply both signals to analyze sleep data. Finally, we test the nonlinearity of the signals for each stage, with the outcomes shown in Tables VII and VIII. The parameters of the surrogate data sets are the same as those used for the simulated time series. It is intriguing that the null hypothesis is rejected for all the signals, which is consistent with the conclusion obtained from the three entropy curves.
TABLE V.
The minimum value of H_q and the maximum value of C_q for the ECG signal of the sleep data.
  | SWS | N2&N1 | REM | Wake
---|---|---|---|---
min H_q | 0.5522 | 0.5554 | 0.5975 | 0.5747
max C_q | 0.5739 | 0.5744 | 0.5322 | 0.5584
TABLE VI.
The minimum value of H_q and the maximum value of C_q for the EEG signal of the sleep data.
  | SWS | N2&N1 | REM | Wake
---|---|---|---|---
min H_q | 0.3318 | 0.4131 | 0.4837 | 0.5153
max C_q | 0.7921 | 0.7333 | 0.6612 | 0.6401
TABLE VII.
The statistical testing of the two discriminating statistics for the ECG signal of the sleep data (R denotes rejection of the null hypothesis and A represents accepting it).
Stage | Statistic | Confidence interval | Judgement | Statistic | Confidence interval | Judgement
---|---|---|---|---|---|---|
SWS | 7.9157 | [9.8848, 11.8674] | R | 174.3639 | [127.2820, 173.9915] | R |
N2&N1 | 8.4667 | [10.7334, 13.0210] | R | 95.5029 | [96.5566, 140.3590] | R |
REM | 11.0889 | [16.0199, 18.2147] | R | 123.0420 | [76.5428, 108.8258] | R |
Wake | 9.3032 | [16.8049, 18.6870] | R | 156.5809 | [68.4832, 89.8091] | R |
TABLE VIII.
The statistical testing of the two discriminating statistics for the EEG signal of the sleep data (R denotes rejection of the null hypothesis and A represents accepting it).
Stage | Statistic | Confidence interval | Judgement | Statistic | Confidence interval | Judgement
---|---|---|---|---|---|---|
SWS | 7.3760 | [6.4055, 6.5481] | R | 240.7165 | [158.7577, 199.2646] | R |
N2&N1 | 7.6724 | [7.8024, 7.9931] | R | 91.7974 | [76.7778, 91.7657] | R |
REM | 10.9137 | [10.7255, 11.0766] | R | 133.1611 | [93.1217, 105.7768] | R |
Wake | 9.8515 | [10.0942, 10.5263] | R | 123.3705 | [94.3321, 103.7613] | R |
V. CONCLUSIONS
In this paper, we create three different entropy curves, the Tsallis q-complexity-entropy curve, the Rényi r-complexity-entropy curve, and the Tsallis-Rényi entropy curve, by extending the traditional complexity-entropy causality plane and replacing permutation entropy with power spectral entropy. This kind of method is free of any tuning parameters, and some features that are obscure in the time domain can be extracted in the frequency domain.
Results from numerical simulations verify that these three entropy curves can characterize time series efficiently. First, we test the curves on the logistic map with different parameters. The curves for the periodic time series form a larger opening loop than the chaotic ones; that is, their values of statistical complexity remain not far from 1 even for large q. The Rényi r-complexity-entropy curves for the periodic time series begin at the point (1, 0) and end at the point (0, 0), whereas the curves for the chaotic time series do not reach the origin, although they all have one significant peak. To be specific, we calculate the derivative of the statistical complexity with respect to the entropy to visualize the curvature of each curve. The four curves all increase from negative to positive values over a narrow range of r; however, as r increases further, there is a significant decline back to negative values for the curve of the chaotic logistic map. Then, we enumerate seven fully chaotic time series, which display various features via the three entropy curves. We also compare the results for Gaussian white noise and 1/f noise, which exhibit distinct properties: the q-complexity-entropy curve for Gaussian white noise forms a closed loop, but that of 1/f noise does not, and the derivative of the r-complexity-entropy curve of Gaussian white noise is always negative, whereas the derivative for 1/f noise is positive over part of the range of r.
Finally, we apply these entropy curves to the ECG and EEG signals of sleep data from a healthy individual. The results are rather interesting and comprehensive. In the first place, none of the q-complexity-entropy curves is completely closed, and there exist both positive and negative derivatives for the r-complexity-entropy curves, confirming that the signals are likely to have long-range correlations like 1/f noise. Then, the curves of the ECG signal roughly divide the sleep stages into three parts: Wake, Non-REM, and REM. The curves of the EEG signal further separate Non-REM into two detailed stages: deep and light sleep.
In conclusion, the inherent properties of different kinds of time series can be uncovered efficiently by combining these entropy curves.
ACKNOWLEDGMENTS
The financial support from the Fundamental Research Funds for the Central Universities (2018JBZ104, K18RC00010, and KSJB17026536), the National Natural Science Foundation of China (61771035 and 61603029), and the Beijing Natural Science Foundation (4162047) is gratefully acknowledged.
APPENDIX A: THE LIMIT VALUES OF H_q AND C_q WHEN q → 0 AND q → ∞
Suppose that P = {p_i, i = 1, ..., N} is the probability distribution of the power spectrum and n denotes the number of non-zero elements of P. The following statements are obtained:
- (1) H_q → (n − 1)/(N − 1) when q → 0;
- (2) C_q converges to a finite limiting value when q → 0;
- (3) H_q → 1 when q → ∞;
- (4) C_q → 0 when q → ∞.
Proof.
(1) For any given , it is obvious that when . Applying this fact in Eq. (2.10), we acquire
(A1) (2) Reviewing Eq. (2.15), we observe that when . Thus, according to Eq. (2.14)
(A2) Integrating the results calculated above, we obtain that
(A3) (3) For , Eq. (2.10) can be described as
(A4) Thus, when , .
(4) Let from Eq. (2.15)
where
(A5) It is easy to find out that when .
Then,
(A6) Therefore,
(A7) Finally, applying all the results and item (3), we obtain that when .
APPENDIX B: THE LIMIT VALUES OF H_r AND C_r WHEN r → 0 AND r → ∞
Assume that P = {p_i, i = 1, ..., N} is the probability distribution of the power spectrum, n denotes the number of non-zero elements of P, and p_max and p_min are the maximum and minimum elements of P, respectively. The following statements are obtained:
- (1) H_r → ln n / ln N when r → 0;
- (2) C_r converges to a finite limiting value when r → 0;
- (3) H_r → ln(1/p_max) / ln N when r → ∞;
- (4) C_r converges to a finite limiting value, which depends on p_max and p_min, when r → ∞.
Proof.
(1) From Eqs. (2.11) and (2.16), we have
(B1) Hence,
(B2) (2) It is obvious that
(B3) and
(B4) Therefore,
(B5) (3) Since is the maximum element of , we can obtain and thus, .
For ,
(B6) When , according to the Squeeze Theorem, .
(4) The “disequilibrium” is described in detail as
(B7) To calculate the limit value of , we first begin by computing handy upper and lower bounds of it. It is noticeable that
(B8) and then, for ,
(B9) Similarly, we have
(B10) Thus,
(B11) Using the above results and the fact that the denominator of is positive for , we can obtain
(B12) for sufficiently large .
To get a lower bound, we observe that
(B13) and
(B14) Thus,
(B15) for sufficiently large .
Finally, integrating all the results, we can obtain
(B16) and
(B17)
REFERENCES
1. Okaly J. B., Mvogo A., Woulache R. L., and Kofane T. C., Commun. Nonlinear Sci. Numer. Simul. 55, 183 (2017).
2. Kolmogorov A. N., Probl. Inf. Transm. 1, 1 (1965).
3. Mandelbrot B. B. and Wheeler J. A., J. R. Stat. Soc. 147, 468 (1983).
4. Lyapunov A. M., Int. J. Control 31, 353 (1994).
5. Shannon C. E., Bell Syst. Tech. J. 27, 379 (1948).
6. Kullback S. and Leibler R. A., Ann. Math. Stat. 22, 79 (1951). doi:10.1214/aoms/1177729694
7. Pincus S., Proc. Natl. Acad. Sci. U.S.A. 88, 2297 (1991). doi:10.1073/pnas.88.6.2297
8. Richman J. S. and Moorman J. R., Am. J. Physiol. Heart Circ. Physiol. 278, H2039 (2000). doi:10.1152/ajpheart.2000.278.6.H2039
9. Bandt C. and Pompe B., Phys. Rev. Lett. 88, 174102 (2002). doi:10.1103/PhysRevLett.88.174102
10. Paluš M., Phys. Lett. A 227, 301 (1997). doi:10.1016/S0375-9601(97)00079-0
11. Paluš M., Phys. Lett. A 235, 341 (1997). doi:10.1016/S0375-9601(97)00635-X
12. Paluš M., in Advances in Nonlinear Geosciences (Springer, Cham, 2018), pp. 427–463.
13. Zhang Y. and Shang P., Commun. Nonlinear Sci. Numer. Simul. 6, 1659 (2017).
14. Zhao X., Shang P., and Huang J., Europhys. Lett. 102, 40005 (2013). doi:10.1209/0295-5075/102/40005
15. Yin Y. and Shang P., Nonlinear Dyn. 78, 2921 (2014). doi:10.1007/s11071-014-1636-2
16. Wang J., Ai Y., Liu X., and Sun X., in 2010 2nd International Conference on Computer Engineering and Technology (IEEE, 2010), pp. 74–81.
17. Rosso O. A., Larrondo H. A., Martin M. T., Plastino A., and Fuentes M. A., Phys. Rev. Lett. 99, 154102 (2007). doi:10.1103/PhysRevLett.99.154102
18. López-Ruiz R., Mancini H. L., and Calbet X., Phys. Lett. A 209, 321 (1995).
19. Anteneodo C. and Plastino A. R., Phys. Lett. A 223, 348 (1996). doi:10.1016/S0375-9601(96)00756-6
20. Weck P. J., Schaffner D. A., Brown M. R., and Wicks R. T., Phys. Rev. E 91, 023101 (2015). doi:10.1103/PhysRevE.91.023101
21. Zunino L., Zanin M., Tabak B. M., Perez D. G., and Rosso O. A., Phys. A Stat. Mech. Appl. 389, 1891 (2010). doi:10.1016/j.physa.2010.01.007
22. Rosso O. A., Olivares F., Zunino L., De Micco L., Aquino A. L. L., Plastino A., and Larrondo H. A., Eur. Phys. J. B 86, 116 (2013). doi:10.1140/epjb/e2013-30764-5
23. Siddagangaiah S., Li Y., Guo X., Chen X., Zhang Q., Yang K., and Yang Y., Entropy 18, 101 (2016). doi:10.3390/e18030101
24. Ribeiro H. V., Jauregui M., Zunino L., and Lenzi E. K., Phys. Rev. E 95, 062106 (2017). doi:10.1103/PhysRevE.95.062106
25. Zunino L., Tabak B. M., Serinaldi F., Zanin M., Perez D. G., and Rosso O. A., Phys. A Stat. Mech. Appl. 390, 876 (2011). doi:10.1016/j.physa.2010.11.020
26. Ribeiro H. V., Zunino L., Mendes R. S., and Lenzi E. K., Phys. A Stat. Mech. Appl. 391, 2421 (2012). doi:10.1016/j.physa.2011.12.009
27. Tsallis C., J. Stat. Phys. 52, 479 (1988). doi:10.1007/BF01016429
28. Rényi A., "On measures of information and entropy," in Proceedings of the 4th Berkeley Symposium on Mathematics, Statistics and Probability (1960), p. 547.
29. Kantz H., Kurths J., and Mayer-Kress G., Nonlinear Analysis of Physiological Data (Springer, Berlin, 1998), p. 271.
30. Feldman D. P., McTague C. S., and Crutchfield J. P., Chaos 18, 043106 (2008). doi:10.1063/1.2991106
31. Rathie P. N. and Silva S. D., Appl. Math. Sci. 2, 1359–1363 (2008).
32. Johal R. S., e-print arXiv:cond-mat/9803017 (1998).
33. Hénon M., Commun. Math. Phys. 50, 69 (1976). doi:10.1007/BF01608556
34. Whitehead R. R. and MacDonald N., Phys. D Nonlinear Phenom. 13, 401 (1984). doi:10.1016/0167-2789(84)90141-6
35. Devaney R. L., Phys. D Nonlinear Phenom. 10, 387 (1984). doi:10.1016/0167-2789(84)90187-8
36. Lorenz E. N., J. Atmos. Sci. 20, 130 (1963).
37. Rössler O. E., Phys. Lett. A 57, 397 (1976). doi:10.1016/0375-9601(76)90101-8
38. Chen G. and Ueta T., Int. J. Bifurcat. Chaos 9, 1465 (1999). doi:10.1142/S0218127499001024
39. Theiler J., Eubank S., Longtin A., Galdrikian B., and Farmer J. D., Phys. D Nonlinear Phenom. 58, 77 (1992). doi:10.1016/0167-2789(92)90102-S
40. Paluš M., Contemp. Phys. 48, 307 (2007). doi:10.1080/00107510801959206
41. Hope A. C. A., J. R. Stat. Soc. 30, 582 (1968).
42. Marriott F. H. C., J. R. Stat. Soc. 28, 75 (1979).
43. Theiler J. and Prichard D., Phys. D Nonlinear Phenom. 94, 221 (1996). doi:10.1016/0167-2789(96)00050-4
44. Shi W., Shang P., Ma Y., Sun S., and Yeh C. H., Commun. Nonlinear Sci. Numer. Simul. 44, 292 (2016). doi:10.1016/j.cnsns.2015.10.024
45. Hobson J. A., Electroencephalogr. Clin. Neurophysiol. 26, 644 (1969). doi:10.1016/0013-4694(69)90021-2
46. Costa M. D., Goldberger A. L., and Peng C. K., Phys. Rev. E 71, 021906 (2005). doi:10.1103/PhysRevE.71.021906