Entropy. 2020 Sep 8;22(9):1003. doi: 10.3390/e22091003

Composite Multiscale Partial Cross-Sample Entropy Analysis for Quantifying Intrinsic Similarity of Two Time Series Affected by Common External Factors

Baogen Li 1, Guosheng Han 1, Shan Jiang 1, Zuguo Yu 1,2,*
PMCID: PMC7597075  PMID: 33286772

Abstract

In this paper, we propose a new cross-sample entropy, namely the composite multiscale partial cross-sample entropy (CMPCSE), for quantifying the intrinsic similarity of two time series affected by common external factors. First, in order to test the validity of CMPCSE, we apply it to three sets of artificial data. Experimental results show that CMPCSE can accurately measure the intrinsic cross-sample entropy of two simultaneously recorded time series by removing the effects of the third time series. Then CMPCSE is employed to investigate the partial cross-sample entropy of the Shanghai Securities Composite Index (SSEC) and the Shenzhen Stock Exchange Component Index (SZSE) by eliminating the effect of the Hang Seng Index (HSI). Compared with the composite multiscale cross-sample entropy, the results obtained by CMPCSE show that SSEC and SZSE have a stronger similarity. We believe that CMPCSE is an effective tool for studying the intrinsic similarity of two time series.

Keywords: composite multiscale partial cross-sample entropy (CMPCSE), multiscale cross-sample entropy (MCSE), time series, stock indices

1. Introduction

Complex systems with interacting constituents exist in all aspects of nature and society, such as geophysics [1], solid-state physics, climate systems, ecosystems, financial systems [2,3], and so forth. These complex systems constantly generate a large number of time signals. Fortunately, in recent decades, numerous creative methods have been proposed to explore the operating mechanisms of these complex systems. Among them, entropy-based methods are powerful modern analysis techniques. The concept of 'entropy' was first proposed by Clausius to deal with thermodynamic problems; Boltzmann then gave it a microscopic explanation from the perspective of statistical mechanics and proposed the Boltzmann entropy, and Gibbs proposed the Gibbs entropy for systems with uncertainty. In 1948, Shannon introduced the concept of entropy into information theory and put forward the Shannon entropy (information entropy) [4]. Shortly after that, Rényi extended it and proposed the Rényi entropy [5]. In 1988, Tsallis gave a generalization of Boltzmann-Gibbs statistics and proposed the Tsallis entropy [6]. Although the Gibbs entropy and the Shannon entropy share the same mathematical expression, the Shannon entropy has a broader meaning than thermodynamic entropy, as all the basic laws of thermodynamics can be derived from information entropy [7]. Since Shannon entropy was proposed, many entropy-based methods have been developed to explore system complexity by studying the time series that these systems generate [8,9]. In order to quantify the complexity of real, finite time series, Pincus proposed the approximate entropy (ApEn) [10,11,12], which has been used to study biological time series [13,14]. In 2000, Richman and Moorman analyzed the deficiencies of ApEn and proposed the concept of sample entropy (SampEn). Compared with ApEn, SampEn agrees with theoretical results much more closely over a broad range of conditions, and it has been successfully applied in clinical cardiovascular studies [15,16]. The cross-sample entropy (cross-SampEn) was also proposed for comparing two different time series to assess their degree of similarity [15]. In 2010, when Liu et al. studied the correlations of foreign exchange time series, they found that cross-SampEn is superior to the correlation coefficient in describing the correlation between foreign exchange time series [17]. In 2003, Costa et al. observed that an increase in the entropy of a system is usually, but not always, associated with an increase in complexity, so traditional entropy-based algorithms may yield misleading results [18]. To avoid this situation, they introduced the multiscale sample entropy (MSE), which has been successfully used to study various dynamical systems [19,20,21,22,23]. Not long after that, MSE was extended to the multiscale cross-sample entropy (MCSE) to measure the cross-sample entropy over different time scales. Unfortunately, the coarse-graining procedure of the multiscale analysis places a high demand on the length of the time series: when the series is not long enough, the results become inaccurate. In addition, in some cases an insufficient sequence length means that no template vector matches another, and hence the cross-sample entropy cannot be defined. To overcome this shortcoming, Wu et al. successively proposed the composite multiscale sample entropy (CMSE) [24] and the refined composite multiscale entropy (RCMSE) [25].
Inspired by CMSE and RCMSE, Yin et al. introduced the composite multiscale cross-sample entropy (CMCSE) and the refined composite multiscale cross-sample entropy (RCMCSE) [26], which reduce the probability of undefined entropy and have been successfully used to study structural health monitoring systems [27]. In 2018, in order to better study time series from the stock market, Wu and coworkers introduced a modified multiscale sample entropy measure based on symbolic representation and similarity (MSEBSS) [28]. Recently, Wang et al. proposed the multiscale cross-trend sample entropy (MCTSE) to study the similarity of two time series with potential trends [29]. In addition, multivariate multiscale sample entropy algorithms have been proposed to deal with multivariate data [30,31,32]. Recently, Jamin and Humeau-Heurtier offered a state-of-the-art review of cross-entropy measures and their multiscale approaches in [33].

On the other hand, when studying long-range correlations between time series, some scholars found that if two non-stationary time series are driven by a common third-party force or by common external factors, results that do not account for this common force may not reflect the intrinsic relationship between the two series [34,35,36]. Fortunately, Baba et al. [37] showed that if the external factors affect the two time series additively, the level of intrinsic cross-correlation between the two time series can be measured by the partial cross-correlation coefficient. In 2015, Yuan et al. [38] and Qian et al. [39] introduced partial cross-correlation analyses to deal with this kind of situation from different starting points.

Inspired by the above works, in this paper we propose the composite multiscale partial cross-sample entropy (CMPCSE) to measure the intrinsic similarity of two time series that are simultaneously affected by a third, common external factor. We first test CMPCSE on three sets of artificial data and find that it can reveal the intrinsic similarity of the time series generated by the models, and then we apply it to a set of stock market indices.

2. Composite Multiscale Partial Cross-Sample Entropy

In this section, based on CMCSE [26], we propose a new method, the composite multiscale partial cross-sample entropy (CMPCSE), which can be used to quantify the intrinsic similarity of two time series linearly affected by a common external factor.

Consider two simultaneously recorded time series {x(t): t=1,2,\ldots,N} and {y(t): t=1,2,\ldots,N} that are linearly affected by {z(t): t=1,2,\ldots,N}. The main steps of CMPCSE are as follows:

Step 1: First we eliminate the effect of z(t) on x and y, respectively. The additive models for x(t) and y(t) can be written as:

x(t) = \beta_{x,0} + \beta_{x,1} z(t) + r_x(t), \quad y(t) = \beta_{y,0} + \beta_{y,1} z(t) + r_y(t), (1)

where t=1,2,\ldots,N. When using regression analysis to estimate r_x(t) and r_y(t) in a window of length s, we follow the idea of MF-TWXDFA [40] and remove the effect of the sequence z(t) on x(t) and y(t) point by point, as follows. For a given integer s (s \ge 2), the points j contained in the sliding window MW_i corresponding to point i satisfy |i-j| \le s. The value of s depends on the length of the time series and is usually determined by experience. Accordingly, the weight function of the geographically weighted regression model is:

\omega_{ij} = \begin{cases} \left[ 1 - \left( \frac{i-j}{s} \right)^2 \right]^2, & \text{if } |i-j| \le s, \\ 0, & \text{otherwise}. \end{cases} (2)

In the window MW_i, we perform linear regression of \{\omega_{ij} x_j\} on \{z_j\} and of \{\omega_{ij} y_j\} on \{z_j\}, obtaining the regression values \hat{x}(z_i) and \hat{y}(z_i) of x(i) and y(i), respectively. Then we get the corresponding estimates of r_x(t) and r_y(t):

\hat{r}_x(i) = x(i) - \hat{x}(z_i), \quad \hat{r}_y(i) = y(i) - \hat{y}(z_i).

Then the normalized data of \hat{r}_x(t) and \hat{r}_y(t) are defined as \bar{r}_x(t) = (\hat{r}_x(t) - \langle \hat{r}_x(t) \rangle)/\delta_{\hat{r}_x(t)} and \bar{r}_y(t) = (\hat{r}_y(t) - \langle \hat{r}_y(t) \rangle)/\delta_{\hat{r}_y(t)}, respectively, where \langle \cdot \rangle and \delta denote the corresponding mean and standard deviation. Next, we calculate the CMCSE of \bar{r}_x(t) and \bar{r}_y(t).
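As an illustration of Step 1, the following Python sketch removes the linear influence of z on one series point by point using the weighted local regression of Equation (2) and returns the normalized residual series. The function name and the default window half-width s are our own illustrative choices, not part of the original algorithm; the same routine is applied to x and to y.

```python
import numpy as np

def partial_residual(x, z, s=20):
    """Sketch of Step 1: remove the linear influence of z on x point by point
    and return the normalized residual series (window half-width s is illustrative)."""
    x, z = np.asarray(x, float), np.asarray(z, float)
    N = len(x)
    r_hat = np.empty(N)
    for i in range(N):
        lo, hi = max(0, i - s), min(N, i + s + 1)        # sliding window MW_i
        j = np.arange(lo, hi)
        w = (1.0 - ((i - j) / s) ** 2) ** 2              # weights of Eq. (2)
        # weighted linear regression of x_j on z_j inside the window
        slope, intercept = np.polyfit(z[lo:hi], x[lo:hi], 1, w=np.sqrt(w))
        r_hat[i] = x[i] - (intercept + slope * z[i])     # r^_x(i) = x(i) - x^(z_i)
    return (r_hat - r_hat.mean()) / r_hat.std()          # normalization of Step 1
```

Here the weighting is interpreted as a weighted least-squares fit within each window; other interpretations of the local regression in Reference [40] are possible.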

Step 2: Construct coarse-grained time series from the series \bar{r}_x(t) and \bar{r}_y(t) with the scale factor \tau, obtaining \{u_k^\tau(t)\} and \{v_k^\tau(t)\}. Each point of the k-th coarse-grained time series at scale factor \tau is defined as

u_k^\tau(j) = \frac{1}{\tau} \sum_{i=(j-1)\tau+k}^{j\tau+k-1} \bar{r}_x(i), \quad 1 \le j \le \frac{N-k+1}{\tau}, \quad 1 \le k \le \tau, (3)
v_k^\tau(j) = \frac{1}{\tau} \sum_{i=(j-1)\tau+k}^{j\tau+k-1} \bar{r}_y(i), \quad 1 \le j \le \frac{N-k+1}{\tau}, \quad 1 \le k \le \tau. (4)

For scale one (\tau=1), the time series u_1^1 and v_1^1 are the original series \bar{r}_x and \bar{r}_y. For \tau>1, Figure 1 and Figure 2 show two more intuitive examples of the coarse-graining procedure.

Figure 1. Schematic illustration of the coarse-grained procedure of composite multiscale partial cross-sample entropy (CMPCSE) when \tau=2. Modified from Reference [24].

Figure 2. Schematic illustration of the coarse-grained procedure of CMPCSE when \tau=3. Modified from Reference [24].
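To make the coarse-graining of Equations (3) and (4) concrete, here is a minimal Python sketch (the helper name is our own). It returns the k-th coarse-grained series of a signal at scale factor \tau; for \tau = 1 and k = 1 it simply returns the original series.

```python
import numpy as np

def coarse_grain(series, tau, k):
    """k-th coarse-grained series at scale factor tau (1 <= k <= tau), Eqs. (3)-(4)."""
    x = np.asarray(series, dtype=float)[k - 1:]       # shift the starting point by k-1
    n = len(x) // tau                                 # number of complete windows
    return x[:n * tau].reshape(n, tau).mean(axis=1)   # average within each window
```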

Step 3: According to the following formulas, construct vector sequences of length m,

{}^{m}h_k^{\tau}(i) = \big(u_k^\tau(i), u_k^\tau(i+1), \ldots, u_k^\tau(i+m-1)\big), \quad 1 \le i \le \frac{N-k+1}{\tau}-m+1, (5)
{}^{m}w_k^{\tau}(j) = \big(v_k^\tau(j), v_k^\tau(j+1), \ldots, v_k^\tau(j+m-1)\big), \quad 1 \le j \le \frac{N-k+1}{\tau}-m+1, (6)

from \{u_k^\tau(t)\} and \{v_k^\tau(t)\}, respectively. Let {}^{m}n_k^{\tau}(i) be the number of vectors {}^{m}w_k^{\tau}(j) whose distance from {}^{m}h_k^{\tau}(i),

d\big({}^{m}h_k^{\tau}(i), {}^{m}w_k^{\tau}(j)\big) = \max\big\{ |u_k^\tau(i+t) - v_k^\tau(j+t)| : 0 \le t \le m-1 \big\}, (7)

is within the tolerance r. Then {}^{m}n_k^{\tau} = \sum_i {}^{m}n_k^{\tau}(i) is the total number of m-dimensional matched vector pairs obtained from the two k-th coarse-grained time series at scale factor \tau. Similarly, {}^{m+1}n_k^{\tau} is the total number of matches of length m+1. Finally, the CMPCSE is calculated with the equation:

\mathrm{CMPCSE}(x,y,z,\tau,m,r) = \mathrm{CMCSE}(\bar{r}_x,\bar{r}_y,\tau,m,r) = \frac{1}{\tau'} \sum_k \mathrm{CSE}(u_k^\tau, v_k^\tau, m, r) = -\frac{1}{\tau'} \sum_k \ln \frac{{}^{m+1}n_k^{\tau}}{{}^{m}n_k^{\tau}}, (8)

where the sum over k runs over those k for which neither {}^{m+1}n_k^{\tau} nor {}^{m}n_k^{\tau} is zero, so that \ln \frac{{}^{m+1}n_k^{\tau}}{{}^{m}n_k^{\tau}} is well defined, and \tau' is the number of such k at scale factor \tau.
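The matching counts and the average in Equation (8) can be transcribed almost directly into code. The sketch below builds on the partial_residual and coarse_grain helpers introduced above (all names are our own); the tolerance r is passed in explicitly, and None is returned when the entropy is undefined for a pair of coarse-grained series.

```python
import numpy as np

def cross_sample_entropy(u, v, m=2, r=0.15):
    """Cross-sample entropy of two coarse-grained series, following Eqs. (5)-(7)."""
    def total_matches(mm):
        n_u, n_v = len(u) - mm + 1, len(v) - mm + 1
        count = 0
        for i in range(n_u):
            for j in range(n_v):
                # Chebyshev distance between the two template vectors, Eq. (7)
                if max(abs(u[i + t] - v[j + t]) for t in range(mm)) <= r:
                    count += 1
        return count
    n_m, n_m1 = total_matches(m), total_matches(m + 1)
    if n_m == 0 or n_m1 == 0:
        return None                              # entropy undefined for this pair
    return -np.log(n_m1 / n_m)

def cmpcse(x, y, z, tau, m=2, r=0.15, s=20):
    """CMPCSE at scale tau, Eq. (8): average the cross-sample entropy over the
    valid k-th coarse-grained pairs of the residual series."""
    rx, ry = partial_residual(x, z, s), partial_residual(y, z, s)    # Step 1
    values = [cross_sample_entropy(coarse_grain(rx, tau, k),
                                   coarse_grain(ry, tau, k), m, r)   # Steps 2-3
              for k in range(1, tau + 1)]
    values = [v for v in values if v is not None]
    return np.mean(values) if values else None
```

The double loop is the naive O(n^2) matching count; it is kept for clarity rather than speed.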

A more intuitive procedure of CMPCSE is shown in Figure 3.

Figure 3. Flow charts of the CMPCSE algorithms.

In this paper, the entropies are calculated from scale 1 to 20, that is, \tau = 1,2,3,\ldots,20. The cross-sample entropy of each pair of coarse-grained series is calculated with m=2, and the tolerance r is selected from the candidate set \{0.05, 0.1, 0.15, \ldots, 0.95\} according to the criterion proposed by Lake et al. [16].
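Putting the pieces together, a hypothetical driver loop over the 20 scales looks as follows; we fix r = 0.15 for simplicity instead of implementing the selection criterion of Lake et al. [16]:

```python
# x, y, z: equal-length numpy arrays (e.g., normalized return series).
scales = range(1, 21)
entropy_curve = [cmpcse(x, y, z, tau, m=2, r=0.15) for tau in scales]
```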

3. Numerical Experiments for Artificial Time Series

In this section, we use an additive model of x and y, Equation (9), to perform numerical simulations and verify the effectiveness of CMPCSE.

x(t) = 2 + 3z(t) + r_x(t), \quad y(t) = 2 + 3z(t) + r_y(t). (9)

In the following numerical simulations, the series r_x(t) and r_y(t) are generated from bivariate fractional Brownian motions (BFBMs), the two-component ARFIMA process, and multifractal binomial measures, respectively, and the third-party interference factor series z(t) is pink (1/f) noise generated by the DSP System Toolbox in MATLAB 2016. In the experiments, all results for sequences with random terms are averages over 100 repetitions with series length N = 2^{12}.
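For readers without the MATLAB toolbox, the following Python sketch approximates pink noise by spectral shaping of white noise and builds x and y according to Equation (9). Both the generator and the placeholder residual series are our own stand-ins, not the generators used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def pink_noise(n):
    """Approximate 1/f (pink) noise by reshaping the spectrum of white noise."""
    spectrum = np.fft.rfft(rng.standard_normal(n))
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]                    # avoid dividing by zero at the DC bin
    spectrum /= np.sqrt(freqs)             # 1/f power spectrum -> 1/sqrt(f) amplitude
    pink = np.fft.irfft(spectrum, n)
    return (pink - pink.mean()) / pink.std()

N = 2 ** 12
z = pink_noise(N)
rx = rng.standard_normal(N)                # placeholder for BFBM / ARFIMA / binomial series
ry = rng.standard_normal(N)
x, y = 2 + 3 * z + rx, 2 + 3 * z + ry      # additive model of Eq. (9)
```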

3.1. Bivariate Fractional Brownian Motion (BFBMs)

In this subsection, in order to test the performance of CMPCSE, we first use it to calculate the partial cross-sample entropy of BFBMs in two instances of the above additive model (Equation (9)). Here r_x and r_y are the incremental series of the two components of a BFBM with Hurst indices H_{r_x} and H_{r_y}. BFBMs have been studied extensively: a BFBM is a monofractal process and satisfies H_{r_x r_y} = (H_{r_x}+H_{r_y})/2 [41,42,43], and Wei et al. studied the long-range power-law cross-correlations between r_x and r_y in 2017 [40]. In the simulations we set (left) H_{r_x}=0.6, H_{r_y}=0.7, \rho=0.7 and (right) H_{r_x}=0.6, H_{r_y}=0.9, \rho=0.7, where \rho is the cross-correlation coefficient between r_x and r_y.

We apply the CMPCSE method to the series simulated from the BFBMs and pink noise. Figure 4 shows the results for (left) H_{r_x}=0.6, H_{r_y}=0.7, \rho=0.7 and (right) H_{r_x}=0.6, H_{r_y}=0.9, \rho=0.7. From Figure 4 we can see that the entropy values of x-y:z and r_x-r_y are very close at all time scales, whereas there is an obvious discrepancy between the values of x-y:z and x-y except at time scale 1. This indicates that, when r_x and r_y are simultaneously affected by the third-party factor z, the CMPCSE method can capture the intrinsic cross-sample entropy of r_x and r_y by eliminating the influence of z.

Figure 4. The CMPCSE results between the series simulated by the pink noise and bivariate fractional Brownian motion (BFBMs) with (left) H_{r_x}=0.6, H_{r_y}=0.7, \rho=0.7; (right) H_{r_x}=0.6, H_{r_y}=0.9, \rho=0.7.

3.2. TWO-Component ARFIMA Process

The ARFIMA process is a monofractal process [40] and is often used to model power-law auto-correlations in stochastic variables [44]. It is defined as follows:

g(t) = G(d,t) + \varepsilon_g(t), (10)

where d \in (0, 0.5) is a memory parameter, \varepsilon_g is an independent and identically distributed Gaussian variable, and G(d,t) = \sum_{n=1}^{\infty} a_n(d)\, g(t-n), in which a_n(d) is the weight a_n(d) = d\,\Gamma(n-d)/[\Gamma(1-d)\Gamma(n+1)]. The Hurst index H_{GG} is related to the memory parameter [45,46]. For the two-component ARFIMA processes discussed below, we take G = X or Y. The two-component ARFIMA process is defined as follows [47]:

r_x(t) = W X(d_1,t) + (1-W) Y(d_2,t) + \varepsilon_{r_x}(t), \quad r_y(t) = (1-W) X(d_1,t) + W Y(d_2,t) + \varepsilon_{r_y}(t), (11)

where W \in [0.5, 1] quantifies the coupling strength between the two processes r_x(t) and r_y(t). When W=1, r_x(t) and r_y(t) are fully decoupled and become two separate ARFIMA processes as defined in Equation (10). The cross-correlation between r_x(t) and r_y(t) increases as W decreases from 1 to 0.5 [47].
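As a rough illustration of Equation (11), the sketch below generates a coupled pair (r_x, r_y) following the construction of Reference [47], where the memory terms X(d_1,t) and Y(d_2,t) are built from the past values of r_x and r_y themselves. The function names, the truncation of the infinite memory sum at n_max, and the use of independent Gaussian error terms are our own simplifications.

```python
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(1)

def arfima_weights(d, n_max=500):
    """Weights a_n(d) = d * Gamma(n - d) / (Gamma(1 - d) * Gamma(n + 1)), n = 1..n_max."""
    n = np.arange(1, n_max + 1)
    return d * np.exp(gammaln(n - d) - gammaln(1 - d) - gammaln(n + 1))

def two_component_arfima(d1, d2, W, N, n_max=500):
    """Sketch of the coupled process in Eq. (11) with truncated memory sums."""
    a1, a2 = arfima_weights(d1, n_max), arfima_weights(d2, n_max)
    rx = np.zeros(N + n_max)
    ry = np.zeros(N + n_max)
    for t in range(n_max, N + n_max):
        past_x = rx[t - n_max:t][::-1]       # r_x(t-1), ..., r_x(t-n_max)
        past_y = ry[t - n_max:t][::-1]
        X = a1 @ past_x                      # X(d1, t)
        Y = a2 @ past_y                      # Y(d2, t)
        rx[t] = W * X + (1 - W) * Y + rng.standard_normal()
        ry[t] = (1 - W) * X + W * Y + rng.standard_normal()
    return rx[n_max:], ry[n_max:]

# Example: rx, ry = two_component_arfima(0.1, 0.2, W=0.8, N=2 ** 12)
```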

In our calculations, we choose W=0.8 and the ARFIMA parameters (d_1, d_2) as d_1=0.1, d_2=0.2 and d_1=0.1, d_2=0.4, respectively, and the two corresponding error terms \varepsilon_{r_x}(t) and \varepsilon_{r_y}(t) share one independent and identically distributed Gaussian variable with zero mean and unit variance. The CMPCSE method was then applied to the series simulated from the two-component ARFIMA process and pink noise.

Figure 5 also shows that the entropy values of x-y:z and r_x-r_y are very close at all time scales, whereas there is an obvious discrepancy between the values of x-y:z and x-y except at time scale 1. This again means that, when r_x and r_y are simultaneously affected by the third-party factor z, one can use CMPCSE to obtain the intrinsic cross-sample entropy of r_x and r_y.

Figure 5. The CMPCSE results between the series simulated by the pink noise and two-component ARFIMA process with (left) d_1=0.1, d_2=0.2, W=0.8; (right) d_1=0.1, d_2=0.4, W=0.8.

3.3. Multifractal Binomial Measures

In this subsection, the series r_x and r_y to be tested come from binomial measures generated by the p-model with known analytic multifractal properties [40]. We combine them with pink noise to test the performance of CMPCSE. Each binomial measure or multifractal signal can be generated by iteration. We start with iteration k=0, where the data set g(i) consists of one value, g^{(0)}(1)=1. In the k-th iteration, the data set \{g^{(k)}(i), i=1,2,\ldots,2^k\} is obtained from g^{(k)}(2i-1) = p\, g^{(k-1)}(i) and g^{(k)}(2i) = (1-p)\, g^{(k-1)}(i). As k \to \infty, g^{(k)}(i) approaches a binomial measure, and the scaling exponent function H_{gg}(q) is:

H_{gg}(q) = \frac{1}{q} - \frac{\log_2\!\big[p^q + (1-p)^q\big]}{q}. (12)

In our simulation, we iterated 12 times with p_1=0.2, p_2=0.3, p_3=0.4 and obtained three binomial measures g_{p_1}(i), g_{p_2}(i), g_{p_3}(i). In our actual calculation, we set r_x = \mathrm{diff}(g_p(i)), where diff denotes the first-order difference.
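A short sketch of the cascade just described (the function name and the use of numpy are ours); twelve iterations produce 2^{12} values, and the first-order difference is then used as r_x.

```python
import numpy as np

def binomial_measure(p, iterations=12):
    """Binomial measure from the p-model cascade: each value splits into p*g and (1-p)*g."""
    g = np.array([1.0])                                    # iteration k = 0: g^(0)(1) = 1
    for _ in range(iterations):
        g = np.column_stack([p * g, (1 - p) * g]).ravel()  # g^(k)(2i-1), g^(k)(2i)
    return g

rx = np.diff(binomial_measure(0.2))                        # first-order difference series
```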

We present the CMCSE results of the series x-y and r_x-r_y and the CMPCSE results of x-y:z in Figure 6 with p_x=0.2, p_y=0.3 and p_x=0.3, p_y=0.4. From the two panels of Figure 6, we can easily see that the entropy values of x-y:z and r_x-r_y are very close at all time scales, whereas there is an obvious discrepancy between the values of x-y:z and x-y. This again indicates that, when r_x and r_y are simultaneously affected by the third-party factor z, one can use the CMPCSE method to obtain the intrinsic cross-sample entropy of r_x and r_y by eliminating the influence of z on x and y.

Figure 6. The CMPCSE results between the series simulated by the pink noise and the first-order difference series of the binomial measures with (left) p_x=0.2, p_y=0.3; (right) p_x=0.3, p_y=0.4.

4. Application to Stock Market Index

In order to validate the applicability of the CMPCSE method to empirical time series, we apply it to stock market indices. The analyzed data sets consist of three Chinese stock indices: the Shanghai Securities Composite Index (SSEC), the Shenzhen Stock Exchange Component Index (SZSE), and the Hang Seng Index (HSI). All raw data were downloaded from https://finance.yahoo.com/, and the daily closing data for the indices from 26 December 1999 to 17 July 2020 were used. Because of the different trading dates in mainland China and Hong Kong, we exclude the data recorded on non-common dates and reconnect the remaining parts of the original series to obtain time series of the same length. As a result, the final daily closing data length is 5000.

In practice, we usually work with normalized time series. Denoting the closing index on the t-th day as x(t), the daily index return is defined by g(t) = \ln(x(t)) - \ln(x(t-1)). The normalized daily return is then defined as R(t) = (g(t) - \langle g(t) \rangle)/\delta, where \langle g(t) \rangle and \delta are the mean value and standard deviation of the series g(t), respectively.
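In code, this preprocessing step is just a log-difference followed by standardization (a sketch; close stands for a hypothetical array of daily closing prices):

```python
import numpy as np

def normalized_returns(close):
    """Normalized daily returns R(t) = (g(t) - <g>) / delta, with g(t) = ln x(t) - ln x(t-1)."""
    g = np.diff(np.log(np.asarray(close, dtype=float)))
    return (g - g.mean()) / g.std()
```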

In 2015, Shi and Shang studied the multiscale cross-correlation coefficient and multiscale cross-sample entropy between SSEC, SZSE, and HSI [48]. Their results show that there is a strong correlation between the return data of SSEC and SZSE, and that both have a weak correlation with HSI. The results of our estimation and comparison of the cross-sample entropy of the two return time series of SSEC and SZSE, both including and excluding the influence of the HSI index, are shown in Figure 7. From the entropy results for the return data in Figure 7, one can easily see that the entropy values of SSEC-SZSE are always larger than those of SSEC-SZSE:HSI at all scales. This means that if the entropy values of SSEC-SZSE calculated by CMCSE are used to estimate the degree of similarity between SSEC and SZSE, the similarity between them will be underestimated. In other words, the partial cross-sample entropy SSEC-SZSE:HSI delivers a more reasonable and realistic picture of the synchronization between the two return time series of SSEC and SZSE. We believe this result is reasonable, as SSEC and SZSE are the two most important stock indices in mainland China, so their daily return data should have strong synchronicity, especially at large time scales.

Figure 7. Estimation and comparison of the cross-sample entropy between the two return time series of the Shanghai Securities Composite Index (SSEC) and the Shenzhen Stock Exchange Component Index (SZSE) when including and excluding the influence of the Hang Seng Index (HSI).

5. Discussion and Conclusions

In this paper, we proposed CMPCSE for quantifying the intrinsic similarity of two time series affected by common external factors. First, we described the calculation procedure of CMPCSE in detail. Then, in order to test its validity, we applied it to three sets of artificial data, constructed by linearly superposing BFBMs, the two-component ARFIMA process, and multifractal binomial measures with pink (1/f) noise, respectively. The results for each set of artificial data show that CMPCSE can accurately measure the intrinsic cross-sample entropy of two simultaneously recorded time series by removing the effects that come from the pink noise. Finally, CMPCSE was employed to investigate the partial cross-sample entropy of SSEC and SZSE by eliminating the effect of HSI. Compared with the conclusion from CMCSE, the results from CMPCSE show that SSEC and SZSE have a stronger similarity. Because SSEC and SZSE are the two most important stock indices in mainland China, they should have strong consistency, especially at large time scales, so we think this result is reasonable, and it is necessary to consider the partial cross-sample entropy when one wants to measure the similarity of SZSE and SSEC.

On the other hand, we must also note that the first step of the CMPCSE calculation is crucial to its result. There may be other ways to eliminate the influence of the third party on the two time series under study; in our work, we adopted the idea from Reference [40] and obtained satisfactory results on the artificial data examples. At the same time, we noticed that when CMPCSE is used to study the linear combination of the NBVP time series mentioned in Reference [26] with pink noise, constructed in the way described above, we cannot obtain satisfactory results. Therefore, we think that the way of eliminating the third-party influence used in this paper cannot achieve good results for sequences with violent oscillations. Meanwhile, we expect better methods to be developed for dealing with similar time series.

All in all, we think that partial cross-sample entropy analysis is necessary when one wants to measure the similarity of two time series affected by common external factors and that, at present, CMPCSE is a good choice.

Author Contributions

B.L. contributed to the conception and design of the study, developed the method and wrote the manuscript. Z.Y. gave the ideas and supervised the project. G.H. and S.J. analyzed the data and results. All authors discussed the results and reviewed the manuscript, and approved the final manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This project was supported by the National Natural Science Foundation of China (Grant No. 11871061), Collaborative Research project for Overseas Scholars (including Hong Kong and Macau) of the National Natural Science Foundation of China (Grant No. 61828203), the Chinese Program for Changjiang Scholars and Innovative Research Team in University (PCSIRT) (Grant No. IRT_15R58) and the Hunan Provincial Innovation Foundation for Postgraduate (Grant No. CX2017B265).

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Campillo M., Paul A. Long-Range Correlations in the Diffuse Seismic Coda. Science. 2003;299:547–549. doi: 10.1126/science.1078551.
2. Auyang S.Y. Foundations of Complex-System Theories: In Economics, Evolutionary Biology, and Statistical Physics. Cambridge University Press; Cambridge, UK: 1998.
3. Plerou V., Stanley H.E. Stock return distributions: Tests of scaling and universality from three distinct stock markets. Phys. Rev. E. 2008;77:037101. doi: 10.1103/PhysRevE.77.037101.
4. Shannon C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948;27:379–423. doi: 10.1002/j.1538-7305.1948.tb01338.x.
5. Rényi A. On measures of entropy and information. Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability; Berkeley, CA, USA, 20 June–30 July 1960. University of California Press; Berkeley, CA, USA: 1961. pp. 547–561.
6. Tsallis C. Possible Generalization of Boltzmann-Gibbs Statistics. J. Stat. Phys. 1988;52:479–487. doi: 10.1007/BF01016429.
7. Tribus M. The Maximum Entropy Formalism. MIT Press; Cambridge, MA, USA: 1979.
8. Grassberger P., Procaccia I. Estimation of the Kolmogorov entropy from a chaotic signal. Phys. Rev. A. 1983;28:2591–2593. doi: 10.1103/PhysRevA.28.2591.
9. Eckmann J.P., Ruelle D. Ergodic theory of chaos and strange attractors. Rev. Mod. Phys. 1985;57:617–656. doi: 10.1103/RevModPhys.57.617.
10. Pincus S.M. Approximate entropy as a measure of system complexity. Proc. Natl. Acad. Sci. USA. 1991;88:2297–2301. doi: 10.1073/pnas.88.6.2297.
11. Pincus S.M. Approximate entropy (ApEn) as a complexity measure. Chaos. 1995;5:110–117. doi: 10.1063/1.166092.
12. Pincus S.M. Quantifying complexity and regularity of neurobiological systems. Methods Neurosci. 1995;28:336–363.
13. Pincus S.M., Viscarello R.R. Approximate entropy: A regularity measure for fetal heart rate analysis. Obstet. Gynecol. 1992;79:249–255.
14. Schuckers S.A.C. Use of approximate entropy measurements to classify ventricular tachycardia and fibrillation. J. Electrocardiol. 1998;31:101–105. doi: 10.1016/S0022-0736(98)90300-4.
15. Richman J.S., Moorman J.R. Physiological time-series analysis using approximate entropy and sample entropy. Am. J. Physiol. Heart Circ. Physiol. 2000;278:H2039–H2049. doi: 10.1152/ajpheart.2000.278.6.H2039.
16. Lake D.E., Richman J.S., Griffin M.P., Moorman J.R. Sample entropy analysis of neonatal heart rate variability. Am. J. Physiol. Regul. Integr. Comp. Physiol. 2002;283:R789–R797. doi: 10.1152/ajpregu.00069.2002.
17. Liu L.Z., Qian X.Y., Lu H.Y. Cross-sample entropy of foreign exchange time series. Physica A. 2010;389:4785–4792. doi: 10.1016/j.physa.2010.06.013.
18. Costa M., Peng C.K., Goldberger A.L., Hausdorff J.M. Multiscale entropy analysis of human gait dynamics. Physica A. 2003;330:53–60. doi: 10.1016/j.physa.2003.08.022.
19. Costa M., Goldberger A.L., Peng C.K. Multiscale entropy analysis of biological signals. Phys. Rev. E. 2005;71:021906. doi: 10.1103/PhysRevE.71.021906.
20. Thuraisingham R.A., Gottwald G.A. On multiscale entropy analysis for physiological data. Physica A. 2006;366:323–332. doi: 10.1016/j.physa.2005.10.008.
21. Peng C.K., Costa M., Goldberger A.L. Adaptive data analysis of complex fluctuations in physiologic time series. Adv. Adapt. Data Anal. 2009;1:61–70. doi: 10.1142/S1793536909000035.
22. Zhang L., Xiong G.L., Liu H.S., Zou H.J., Guo W.Z. Bearing fault diagnosis using multi-scale entropy and adaptive neuro-fuzzy inference. Expert Syst. Appl. 2010;37:6077–6085. doi: 10.1016/j.eswa.2010.02.118.
23. Lin J.L., Liu J.Y.C., Li C.W., Tsai L.F., Chung H.Y. Motor shaft misalignment detection using multiscale entropy with wavelet denoising. Expert Syst. Appl. 2010;37:7200–7204. doi: 10.1016/j.eswa.2010.04.009.
24. Wu S.D., Wu C.W., Lin S.G., Wang C.C., Lee K.Y. Time series analysis using composite multiscale entropy. Entropy. 2013;15:1069–1084. doi: 10.3390/e15031069.
25. Wu S.D., Wu C.W., Lin S.G., Lee K.Y., Peng C.K. Analysis of complex time series using refined composite multiscale entropy. Phys. Lett. A. 2014;378:1369–1374. doi: 10.1016/j.physleta.2014.03.034.
26. Yin Y., Shang P.J., Feng G.C. Modified multiscale cross-sample entropy for complex time series. Appl. Math. Comput. 2016;289:98–110.
27. Lin T.K., Chien Y.H. Performance evaluation of an entropy-based structural health monitoring system utilizing composite multiscale cross-sample entropy. Entropy. 2019;21:41. doi: 10.3390/e21010041.
28. Wu Y., Shang P.J., Li Y.L. Multiscale sample entropy and cross-sample entropy based on symbolic representation and similarity of stock markets. Commun. Nonlinear Sci. Numer. Simul. 2018;56:49–61. doi: 10.1016/j.cnsns.2017.07.021.
29. Wang F., Zhao W.C., Jiang S. Detecting asynchrony of two series using multiscale cross-trend sample entropy. Nonlinear Dyn. 2020;99:1451–1465. doi: 10.1007/s11071-019-05366-y.
30. Ahmed M.U., Li L., Cao J., Mandic D.P. Multivariate multiscale entropy for brain consciousness analysis. Proceedings of the IEEE Engineering in Medicine and Biology Society (EMBC); Boston, MA, USA, 30 August–3 September 2011. pp. 810–813.
31. Ahmed M.U., Mandic D.P. Multivariate multiscale entropy: A tool for complexity analysis of multichannel data. Phys. Rev. E. 2011;84:061918. doi: 10.1103/PhysRevE.84.061918.
32. Looney D., Adjei T., Mandic D.P. A Novel Multivariate Sample Entropy Algorithm for Modeling Time Series Synchronization. Entropy. 2018;20:82. doi: 10.3390/e20020082.
33. Jamin A., Humeau-Heurtier A. (Multiscale) Cross-Entropy Methods: A Review. Entropy. 2020;22:45. doi: 10.3390/e22010045.
34. Kenett D.Y., Shapira Y., Ben-Jacob E. RMT assessments of the market latent information embedded in the stocks' raw, normalized, and partial correlations. J. Probab. Stat. 2009;2009:249370. doi: 10.1155/2009/249370.
35. Kenett D.Y., Tumminello M., Madi A., Gur-Gershgoren G., Mantegna R.N., Ben-Jacob E. Dominating clasp of the financial sector revealed by partial correlation analysis of the stock market. PLoS ONE. 2010;5:e15032. doi: 10.1371/journal.pone.0015032.
36. Shapira Y., Kenett D.Y., Ben-Jacob E. The index cohesive effect on stock market correlations. Eur. Phys. J. B. 2009;72:657. doi: 10.1140/epjb/e2009-00384-y.
37. Baba K., Shibata R., Sibuya M. Partial correlation and conditional correlation as measures of conditional independence. Aust. N. Z. J. Stat. 2004;46:657–664. doi: 10.1111/j.1467-842X.2004.00360.x.
38. Yuan N.M., Fu Z.T., Zhang H., Piao L., Xoplaki E., Luterbacher J. Detrended Partial-Cross-Correlation Analysis: A New Method for Analyzing Correlations in Complex System. Sci. Rep. 2015;5:8143. doi: 10.1038/srep08143.
39. Qian X.Y., Liu Y.M., Jiang Z.Q., Podobnik B., Zhou W.X., Stanley H.E. Detrended partial cross-correlation analysis of two nonstationary time series influenced by common external forces. Phys. Rev. E. 2015;91:062816. doi: 10.1103/PhysRevE.91.062816.
40. Wei Y.L., Yu Z.G., Zou H.L., Anh V.V. Multifractal temporally weighted detrended cross-correlation analysis to quantify power-law cross-correlation and its application to stock markets. Chaos. 2017;27:063111. doi: 10.1063/1.4985637.
41. Lavancier F., Philippe A., Surgailis D. Covariance function of vector self-similar processes. Stat. Probab. Lett. 2009;79:2415–2421. doi: 10.1016/j.spl.2009.08.015.
42. Coeurjolly J.F., Amblard P.O., Achard S. On multivariate fractional Brownian motion and multivariate fractional Gaussian noise. Proceedings of the 2010 18th European Signal Processing Conference; Aalborg, Denmark, 23–27 August 2010. IEEE; Piscataway, NJ, USA: 2010. pp. 1567–1571.
43. Amblard P.O., Coeurjolly J.F. Identification of the multivariate fractional Brownian motion. IEEE Trans. Signal Process. 2011;59:5152–5168. doi: 10.1109/TSP.2011.2162835.
44. Hosking J.R.M. Fractional differencing. Biometrika. 1981;68:165–176. doi: 10.1093/biomet/68.1.165.
45. Podobnik B., Ivanov P., Biljakovic K., Horvatic D., Stanley H.E., Grosse I. Fractionally integrated process with power-law correlations in variables and magnitudes. Phys. Rev. E. 2005;72:026121. doi: 10.1103/PhysRevE.72.026121.
46. Podobnik B., Stanley H.E. Detrended cross-correlation analysis: A new method for analyzing two nonstationary time series. Phys. Rev. Lett. 2008;100:084102. doi: 10.1103/PhysRevLett.100.084102.
47. Podobnik B., Horvatic D., Ng A.L., Stanley H.E., Ivanov P.C. Modeling long-range cross-correlations in two-component ARFIMA and FIARCH processes. Physica A. 2008;387:3954–3959. doi: 10.1016/j.physa.2008.01.062.
48. Shi W.B., Shang P.J. The multiscale analysis between stock market time series. Int. J. Mod. Phys. C. 2015;26:1550071. doi: 10.1142/S0129183115500710.
