Published in final edited form as: Discrete Math. 2020 Feb 24;343(7):111870. doi: 10.1016/j.disc.2020.111870

Limits of Sums for Binomial and Eulerian Numbers and their Associated Distributions

Meng Li a, Ron Goldman b

Abstract

We provide a probabilistic approach using renewal theory to derive some novel identities involving Eulerian numbers and uniform B-splines. The renewal perspective leads to a unified treatment for the normalized binomial coefficients and the normalized Eulerian numbers when studying their limits of sums, as well as their associated distributions – the binomial distributions (Bernstein polynomials) and the Irwin-Hall distributions (uniform B-splines). We further extend the probabilistic perspective to h-Bernstein polynomials (Pólya-Eggenberger distributions) through conditional renewal processes, and derive new limits for various ways of summing these two families of special numbers and their associated distributions. The proposed probabilistic unification dramatically simplifies the proofs of some identities, which are far from obvious (such as for h-Bernstein polynomials) or do not otherwise even appear promising (such as for Eulerian numbers).

Keywords: Eulerian numbers, h-Bernstein polynomials, Pólya-Eggenberger distributions, renewal theory, uniform B-splines

1. Introduction

Start with Pascal’s triangle, the binomial coefficients $\binom{n}{k}$ arranged in a triangular array as in Figure 1. Now normalize each row to sum to one: for each n divide the nth row by $2^n$. After this normalization we can ask: what are the sums of the columns? The entries in the first column form the geometric progression 1, 1/2, 1/4, … which sums to 2. In fact, it happens that when all the rows are normalized to sum to one, all the columns sum to two. But why 2? Equivalently we could ask: what are the sums of the long diagonals going from the upper left to the lower right? By symmetry the sums along these long diagonals are the same as the sums of the columns, so if we can sum the columns we can sum these long diagonals. But what about the short diagonals – that is, what about the sums along the short diagonals that go from lower left to upper right? If we compute the sums along these diagonals, we see the Fibonacci series: 1, 1, 2, 3, 5, 8, 13, …. But these Fibonacci numbers emerge before we normalize the rows. We could also ask: what are these sums after we normalize the rows to sum to one? Then the series becomes: 1, 1/2, 3/4, 5/8, 11/16, 21/32, …. In fact, we shall show in Section 3.1 that this series converges to 2/3. But why 2/3?

Figure 1: Pascal’s triangle – levels 0 to 8 – before normalization.
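These column and short-diagonal sums are easy to check directly. The following short Python sketch (ours, not part of the original paper) truncates the column sums of the row-normalized triangle and lists the normalized short-diagonal sums, reproducing the sequence 1, 1/2, 3/4, 5/8, 11/16, 21/32, … quoted above:

```python
from fractions import Fraction
from math import comb

# Column sums of the row-normalized Pascal triangle: sum_n C(n, k)/2^n,
# truncated at N terms.  Every column sum is (essentially) 2.
N = 200
for k in range(5):
    print(k, float(sum(Fraction(comb(n, k), 2**n) for n in range(N))))

# Normalized short-diagonal sums: sum_k C(n-k, k)/2^(n-k).
# The first few values are 1, 1/2, 3/4, 5/8, 11/16, 21/32, ...
for n in range(10):
    print(n, sum(Fraction(comb(n - k, k), 2**(n - k)) for k in range(n // 2 + 1)))

# ... and for large n they approach 2/3.
n = 50
print(float(sum(Fraction(comb(n - k, k), 2**(n - k)) for k in range(n // 2 + 1))))
```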

Consider next the Eulerian numbers $\left\langle{n\atop k}\right\rangle$ arranged again in a triangular array as in Figure 2. For the Eulerian numbers we can also normalize each row to sum to one: for each n divide the nth row by $n!$. Now once again we can ask: what are the sums of the columns? After normalization, the sum of the first column is the Taylor expansion of e, but the other columns certainly do not sum to e. What then can we say about the sums of these columns? By symmetry the sums along the long diagonals are the same as the sums of the columns, so if we can sum the columns we can sum these long diagonals. But again what about the short diagonals – that is, what about the sums of the short diagonals that go from lower left to upper right after we normalize each row to sum to one? Are these sums in any way related to the corresponding sums in the normalized version of Pascal’s triangle?

Figure 2: The Eulerian numbers – levels 0 to 8 – before normalization.

The binomial coefficients $\binom{n}{k}$ count the number of subsets of {1, …, n} with exactly k elements, and satisfy the recurrence

$\binom{n}{k}=\binom{n-1}{k}+\binom{n-1}{k-1}.$

In contrast, the Eulerian numbers $\left\langle{n\atop k}\right\rangle$ count the number of permutations of {1, …, n} with exactly k ascents [1] and satisfy the recurrence

$\left\langle{n\atop k}\right\rangle=(k+1)\left\langle{n-1\atop k}\right\rangle+(n-k)\left\langle{n-1\atop k-1}\right\rangle.$

(Here we follow the standard conventions that $\binom{n}{k}$ and $\left\langle{n\atop k}\right\rangle$ are both zero for n < k and that $\binom{0}{0}$ and $\left\langle{0\atop 0}\right\rangle$ are both equal to one.) At first glance then there is no reason to suspect any deep connection between these two triangular arrays of integers. But initial impressions can be deceiving; perhaps we can see some hidden connections if we look at some pictures.

We can depict 2-dimensional arrays of integers in the following fashion: represent each odd integer by a black square and each even integer by a white square. Applying this approach to the binomial coefficients and to the Eulerian numbers generates Figures 3 and 4.

Figure 3: Pascal’s triangle depicted by representing each odd number with a black square and each even number with a white square: left – levels 0 to 3, right – levels 0–127. As the number of levels increases, the Sierpinski triangle appears to emerge.

Figure 4: The Eulerian numbers depicted by representing each odd number with a black square and each even number with a white square: left – levels 0 to 3, right – levels 0–127. Once again as the number of levels increases, the Sierpinski triangle appears to emerge.
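The parity patterns in Figures 3 and 4 are easy to reproduce. The sketch below (ours; it prints '#' for odd entries and a blank for even ones, in place of black and white squares) builds both triangles from the recurrences given above:

```python
from math import comb

LEVELS = 32

# Eulerian triangle built row by row from the recurrence
# <n, k> = (k + 1)<n-1, k> + (n - k)<n-1, k-1>, with <0, 0> = 1.
euler = [[1]]
for n in range(1, LEVELS):
    prev = euler[-1] + [0]          # pad so prev[k] always exists
    euler.append([(k + 1) * prev[k] + (n - k) * (prev[k - 1] if k else 0)
                  for k in range(n)])

def render(rows):
    """Print a triangle with '#' for odd entries and a blank for even ones."""
    for row in rows:
        print("".join("#" if x % 2 else " " for x in row))

print("Pascal's triangle, levels 0 to", LEVELS - 1)
render([[comb(n, k) for k in range(n + 1)] for n in range(LEVELS)])

print("\nEulerian numbers, levels 0 to", LEVELS - 1)
render(euler)
```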

Recurrences often generate fractals. While level for level Figures 3 and 4 are different – compare, for example, levels 0 to 3 of the binomial coefficients with levels 0 to 3 of the Eulerian numbers – in the large both arrays look very much the same: they both appear to incarnate the same fractal, the Sierpinski triangle. This long-term likeness suggests that in some limiting fashion these two arrays exhibit similar behaviors. The goal of this paper is to explore some limiting connections between sums of binomial coefficients and sums of Eulerian numbers along with corresponding results for their associated distributions: the binomial distributions for the binomial coefficients and the Irwin-Hall distributions (uniform B-splines) for the Eulerian numbers. In particular, we are going to provide a unified, probabilistic approach using renewal theory to derive the following limiting identities:

A. Binomial Coefficients
A1 $\lim_{k\to\infty}\sum_{n=0}^{\infty}\frac{1}{2^n}\binom{n}{k}=2$ (Columns)
A2 $\lim_{n\to\infty}\sum_{k\ge 0}\frac{1}{2^{n-k}}\binom{n-k}{k}=\frac{2}{3}$ (Short Diagonals)
A3 $\lim_{k\to\infty}\sum_{n=0}^{\infty}(-1)^n\frac{1}{2^n}\binom{n}{k}=0$ (Alternating Sums)
B. Eulerian Numbers
B1 $\lim_{k\to\infty}\sum_{n=0}^{\infty}\frac{1}{n!}\left\langle{n\atop k}\right\rangle=2$ (Columns)
B2 $\lim_{n\to\infty}\sum_{k\ge 0}\frac{1}{(n-k)!}\left\langle{n-k\atop k}\right\rangle=\frac{2}{3}$ (Short Diagonals)
B3 $\lim_{k\to\infty}\sum_{n=0}^{\infty}(-1)^n\frac{1}{n!}\left\langle{n\atop k}\right\rangle=0$ (Alternating Sums)
C. Bernstein Polynomials – Binomial Distributions
C1 $\lim_{k\to\infty}\sum_{n=0}^{\infty}B_k^n(t)=\frac{1}{t}$ for t ∈ (0, 1) (Columns)
C2 $\lim_{n\to\infty}\sum_{k\ge 0}B_k^{n-k}(t)=\frac{1}{1+t}$ for t ∈ (0, 1) (Short Diagonals)
C3 $\lim_{k\to\infty}\sum_{n=0}^{\infty}(-1)^n B_k^n(t)=0$ for t ∈ (0, 1) (Alternating Sums)
D. h-Bernstein Polynomials – Pólya-Eggenberger Distributions
D1 $\lim_{k\to\infty}\sum_{n=0}^{\infty}B_k^n(t;h)=\frac{1-h}{t-h}$ for 0 < h < t < 1 (Columns)
D2 $\lim_{n\to\infty}\sum_{k\ge 0}B_k^{n-k}(t;h)=\int_0^1\frac{x^{a-1}(1-x)^{b-1}}{(1+x)B(a,b)}\,dx$ for t ∈ (0, 1) and h > 0, where a = t/h, b = (1 − t)/h (Short Diagonals)
  D2a. $\lim_{n\to\infty}\sum_{k\ge 0}B_k^{n-k}(t;1)=2^{-t}$ for t ∈ (0, 1) (h = 1 in D2)
D3 $\lim_{k\to\infty}\sum_{n=0}^{\infty}(-1)^n B_k^n(t;h)=0$ for 0 < h < t < 1 (Alternating Sums)
E. Uniform B-splines – Irwin-Hall Distributions
E1 $\lim_{t\to\infty}\sum_{n=0}^{\infty}N_{0,n}(t)=2$ (Columns)
E2 $\lim_{n\to\infty}\sum_{k\ge 0}N_{0,n-k}(k+t)=\frac{2}{3}$ for t > 0 (Short Diagonals)
E3 $\lim_{t\to\infty}\sum_{n=0}^{\infty}(-1)^n N_{0,n}(t)=0$ (Alternating Sums)

For the identities in B and E, we shall show that the convergence rate is polynomial with an arbitrarily large order, utilizing additional results from renewal theory. For the identities in A, C, and D, we shall show that analogous results hold for any fixed k, again invoking a probabilistic argument based on an infinite sequence of random variables, the same framework we use for the asymptotic identities. Below is a list of the non-asymptotic identities we will derive:

A*. Binomial Coefficients
A1* $\sum_{n=0}^{\infty}\frac{1}{2^n}\binom{n}{k}=2$ (Columns)
A2* $\sum_{k\ge 0}\frac{1}{2^{n-k}}\binom{n-k}{k}=\frac{2}{3}+\frac{1}{3}\left(-\frac{1}{2}\right)^{n}$ (Short Diagonals)
A3* $\sum_{n=0}^{\infty}(-1)^n\frac{1}{2^n}\binom{n}{k}=\frac{(-1)^k\,2}{3^{k+1}}$ (Alternating Sums)
C*. Bernstein Polynomials – Binomial Distributions
C1* $\sum_{n=0}^{\infty}B_k^n(t)=\frac{1}{t}$ (Columns)
C2* $\sum_{k\ge 0}B_k^{n-k}(t)=\frac{1-(-t)^{n+1}}{1+t}$ (Short Diagonals)
C3* $\sum_{n=0}^{\infty}(-1)^n B_k^n(t)=\frac{(-t)^k}{(2-t)^{k+1}}$ (Alternating Sums)
D*. h-Bernstein Polynomials – Pólya-Eggenberger Distributions
D1* $\sum_{n=0}^{\infty}B_k^n(t;h)=\frac{1-h}{t-h}$ for 0 < h < t < 1 (Columns)
D2* $\sum_{k\ge 0}B_k^{n-k}(t;h)=\int_0^1\frac{x^{a-1}(1-x)^{b-1}}{(1+x)B(a,b)}\,dx+(-1)^{n+2}\int_0^1\frac{x^{a+n}(1-x)^{b-1}}{(1+x)B(a,b)}\,dx$ for t ∈ (0, 1) and h > 0, where a = t/h, b = (1 − t)/h (Short Diagonals)
D3* $\sum_{n=0}^{\infty}(-1)^n B_k^n(t;h)=(-1)^k\int_0^1\frac{x^{a+k-1}(1-x)^{b-1}}{(2-x)^{k+1}B(a,b)}\,dx$ for 0 < h < t < 1, where a = t/h, b = (1 − t)/h (Alternating Sums)

Although it is reassuring that these identities involving binomial coefficients and Bernstein polynomials hold for any k and not only in the limit, the use of renewal theory provides an interesting link between binomial coefficients and Eulerian numbers, and explains why the column sums in A1 and B1 are both expected to be 2: 2 is the reciprocal of the mean of both the uniform distribution on [0, 1] and the Bernoulli distribution with success probability 1/2. Likewise, the value 2/3 of the short-diagonal sums in A2 and B2 is the reciprocal of the mean of a shifted version of these two distributions. The combinatorial interpretations of binomial coefficients and Eulerian numbers provide virtually no insight into this remarkable connection. Some identities such as B1 are straightforward once the general theory of renewal processes is invoked, but are far from obvious, and do not even appear promising, by other means (including generating functions and refined approximations of the summand at each n).

The rows of the sequences we shall study form distributions; the columns do not. While it may seem unnatural to study the columns rather than the rows, there has been some recent work to investigate similar column sums with good effect. [2] introduces generating functions for the columns of the binomial distribution and uses these generating functions to derive a variety of identities for the Bernstein polynomials, including formulas for sums, alternating sums, differentiation, degree elevation, and subdivision. [3] introduces generating functions for the columns of the uniform B-splines and then applies these generating functions to derive a collection of identities for uniform B-splines, including the Schoenberg identity, formulas for sums and alternating sums, for moments and reciprocal moments, for differentiation, for Laplace transforms, and for convolutions with monomials. Thus learning about the columns can also provide rich insights into the rows. It is partly in this spirit that we investigate the identities in this paper.

2. Renewal theory

Before we proceed with our proofs, we provide a brief review of renewal theory in stochastic processes [4, pp. 358–373]. A renewal process is a stochastic model for events that occur at random times. Let $X_1, X_2, \ldots$ be independent and identically distributed (IID) non-negative random variables following a distribution F and let $S_n=\sum_{i=1}^{n}X_i$ be their partial sum with $S_0 = 0$. We may interpret each $X_i$ as an interarrival time and $S_n$ as the time of the nth arrival (or renewal). The renewal measure is defined by

$U(A)=\sum_{n=0}^{\infty}P(S_n\in A),$ (1)

for any measurable subset A of [0, ∞). The renewal measure U(A) is the expected number of arrivals in A since

$\sum_{n=0}^{\infty}P(S_n\in A)=\sum_{n=0}^{\infty}E\left(1\{S_n\in A\}\right)=E\left(\sum_{n=0}^{\infty}1\{S_n\in A\}\right)=E\{\text{number of arrivals in }A\},$ (2)

where 1(·) is the indicator function.

Let A = (x, x + δ] where δ > 0 is a fixed constant. The following renewal theorem, also known as Blackwell’s theorem, states that the expected number of arrivals in A is asymptotically proportional to the length of A with proportionality constant 1/μ, where μ is the expectation of the random variables $X_i$. The precise statement of this theorem depends on whether or not the distribution F is arithmetic: a distribution is called arithmetic if it is supported on a set of the form $\{n\lambda : n\in\mathbb{Z}\}$ for some λ > 0, and the largest such λ is called the span of the distribution.

Theorem 1

(Blackwell’s renewal theorem). If the distribution F is not arithmetic,

$\lim_{x\to\infty}U(x,x+\delta]=\delta/\mu$

for any δ > 0.

If the distribution F is arithmetic with span λ,

$\lim_{x\to\infty}U(x,x+\lambda]=\lambda/\mu.$

As we shall see in the next section, a list of identities involving special numbers and special distributions can be derived using a unified probabilistic argument via renewal theory by specifying A in Equation (1) and the distribution F of interarrival times. We consider mainly two classes of distributions: the uniform distribution supported on [a, b] where b > a and the Bernoulli distribution with success probability p ∈ (0, 1). Normalized Eulerian numbers and uniform B-splines correspond to uniform distributions, while normalized binomial coefficients, Bernstein polynomials, and h-Bernstein polynomials correspond to Bernoulli distributions.
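Blackwell’s theorem is also easy to see numerically. The following Monte Carlo sketch (ours; the window location x = 30 and the replication count are arbitrary choices) estimates the expected number of arrivals in (x, x + 1] for the two interarrival distributions used below; both estimates are close to 1/μ = 2:

```python
import random

def arrivals_in_window(x, delta, draw):
    """Simulate one renewal path and count arrivals S_n in (x, x + delta]."""
    s, count = 0.0, 0
    while s <= x + delta:
        s += draw()
        if x < s <= x + delta:
            count += 1
    return count

random.seed(0)
reps, x, delta = 50_000, 30.0, 1.0

# Uniform(0, 1) interarrival times: mean 1/2, so the limit is delta/mu = 2.
unif = sum(arrivals_in_window(x, delta, random.random) for _ in range(reps)) / reps

# Bernoulli(1/2) interarrival times (arithmetic, span 1): the limit is again 2.
bern = sum(arrivals_in_window(x, delta, lambda: float(random.random() < 0.5))
           for _ in range(reps)) / reps
print(unif, bern)     # both close to 2
```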

Notation

We write X ~ F if X is a random variable following a distribution F. We use Uniform(a, b) to denote the uniform distribution supported on [a, b], whose probability density function is $f(x) = 1(x \in [a, b])/(b - a)$ and whose mean is (a + b)/2. We use Bernoulli(p) to denote the Bernoulli distribution with success probability p, where the probability mass function is P(X = 1) = p (the trial succeeds) and P(X = 0) = 1 − p (the trial fails) and the mean is p. The sum of n IID Bernoulli trials drawn from Bernoulli(p) follows the binomial distribution Binomial(n, p).

We place a superscript on $X_i$, $S_n$, and U in a renewal process to emphasize their dependence on the interarrival distribution F. In particular, we use the superscript “[a, b]” when $X_i\sim$ Uniform(a, b) and “(p)” when $X_i\sim$ Bernoulli(p); we use “(p) + 1” when $X_i=X_i^*+1$ where $X_i^*\sim$ Bernoulli(p), i.e., $X_i$ follows Bernoulli(p) shifted by 1, satisfying $P(X_i = 1) = 1 - p$ and $P(X_i = 2) = p$. For example, the reader may see $X_i^{[0,1]}$, $X_i^{[1,2]}$, $X_i^{(1/2)}$ or $X_i^{(1/2)+1}$, and similarly for $S_n$ and U.

3. Binomial coefficients and Eulerian numbers

3.1. Normalized binomial coefficients

The column sums of the normalized binomial coefficients are closely related to a renewal process, due to the well-known probabilistic interpretation of the normalized binomial coefficients using Bernoulli trials. Consider a renewal process in which the interarrival times $X_i \sim$ Bernoulli(1/2). Then

$\frac{1}{2^n}\binom{n}{k}=P\left(S_n^{(1/2)}=k\right)=P\left(S_n^{(1/2)}\in(k-1,k]\right),$

where $S_n^{(1/2)}$ is the time of the nth arrival and follows Binomial(n, 1/2).

Since the Bernoulli distribution is arithmetic with mean μ = 1/2 and span λ = 1, Theorem 1 gives

$\lim_{k\to\infty}\sum_{n=0}^{\infty}\frac{1}{2^n}\binom{n}{k}=\lim_{k\to\infty}U^{(1/2)}(k-1,k]=\frac{1}{1/2}=2.$ (A1)

For the sums of the short diagonals, we consider a new renewal process with shifted Bernoulli interarrival times $X_i^{(1/2)+1}$ that have the same distribution as $X_i^{(1/2)}+1$, i.e., $X_i^{(1/2)+1}=2$ if the ith trial succeeds and $X_i^{(1/2)+1}=1$ if the ith trial fails. Short diagonals are closely related to this new renewal process because

$\frac{1}{2^{n-k}}\binom{n-k}{k}=P\left(X_1^{(1/2)}+\cdots+X_{n-k}^{(1/2)}\in(k-1,k]\right)$
$=P\left((X_1^{(1/2)}+1)+\cdots+(X_{n-k}^{(1/2)}+1)\in(n-1,n]\right)$
$=P\left(X_1^{(1/2)+1}+\cdots+X_{n-k}^{(1/2)+1}\in(n-1,n]\right)=P\left(S_{n-k}^{(1/2)+1}\in(n-1,n]\right).$

Therefore by Theorem 1

$\lim_{n\to\infty}\sum_{k\ge 0}\frac{1}{2^{n-k}}\binom{n-k}{k}=\lim_{n\to\infty}\sum_{k\ge 0}P\left(S_{n-k}^{(1/2)+1}\in(n-1,n]\right)=\lim_{n\to\infty}\sum_{k\ge 0}P\left(S_{k}^{(1/2)+1}\in(n-1,n]\right)$ (3)
$=\lim_{n\to\infty}U^{(1/2)+1}(n-1,n]=\frac{1}{1+\frac{1}{2}}=\frac{2}{3},$ (A2)

where the change of index in Equation (3) is guaranteed by the observation that $P\left(S_k^{(1/2)+1}\in(n-1,n]\right)=0$ for $k > n$.
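As a numerical illustration of this argument (our sketch, not part of the paper), the exact short-diagonal sum for a moderate n can be compared with a Monte Carlo estimate of the probability that the shifted renewal process ever lands exactly at n:

```python
import random
from math import comb

n = 30

# Exact short-diagonal sum for this n.
exact = sum(comb(n - k, k) / 2 ** (n - k) for k in range(n // 2 + 1))

# Monte Carlo for the shifted renewal process: interarrival times are 1 or 2
# with probability 1/2 each; we count paths whose partial sum hits n exactly,
# i.e. an arrival in (n - 1, n].  Blackwell's theorem predicts 1/(3/2) = 2/3.
random.seed(1)
reps, hits = 100_000, 0
for _ in range(reps):
    s = 0
    while s < n:
        s += 1 + (random.random() < 0.5)
    hits += (s == n)
print(exact, hits / reps)     # both close to 2/3
```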

Equation (A1), as well as its variants in A, C, and D, actually holds for any k. We provide a probabilistic proof of their non-asymptotic counterparts in Section 7. The use of renewal theory unites binomial coefficients and Eulerian numbers under the same framework with two interarrival time distributions, and leads to a range of identities involving special distributions. In the next section, we shall elaborate on such connections.

3.2. Normalized Eulerian numbers

Consider a renewal process with random interarrival times $X_i^{[0,1]}\sim$ Uniform(0, 1). [5] provides a probabilistic interpretation for the normalized Eulerian numbers: for each integer k,

$\frac{1}{n!}\left\langle{n\atop k}\right\rangle=P\left(S_n^{[0,1]}\in(k-1,k]\right).$ (4)

It follows by Equation (1) that

$\sum_{n=0}^{\infty}\frac{1}{n!}\left\langle{n\atop k}\right\rangle=\sum_{n=0}^{\infty}P\left(S_n^{[0,1]}\in(k-1,k]\right)=U^{[0,1]}(k-1,k].$ (5)

Equation (5) bridges the quantity $\sum_{n=0}^{\infty}\frac{1}{n!}\left\langle{n\atop k}\right\rangle$ originating from Eulerian numbers with a renewal process, revealing the probabilistic interpretation of this column sum as the expected number of arrivals in the interval (k − 1, k] when the interarrival time is uniformly distributed on [0, 1].

The uniform distribution on [0, 1] is continuous, thus non-arithmetic, and has mean μ = 1/2. Substituting x = k − 1 and δ = 1 in Theorem 1, we obtain

$\lim_{k\to\infty}\sum_{n=0}^{\infty}\frac{1}{n!}\left\langle{n\atop k}\right\rangle=\frac{1}{1/2}=2.$ (B1)

Rate of convergence

We have evaluated $\sum_{n=0}^{\infty}\frac{1}{n!}\left\langle{n\atop k}\right\rangle$ numerically and observed that this sum converges very rapidly to 2. If X ~ Uniform(0, 1), then $E(X^{\alpha+1})=\int_0^1 x^{\alpha+1}\,dx=\frac{1}{\alpha+2}<\infty$ for any α ≥ 0. According to Corollary 5.2 in [6],

$\sum_{n=0}^{\infty}\frac{1}{n!}\left\langle{n\atop k}\right\rangle-2=o(k^{-\alpha}).$

Therefore, the convergence rate in Equation (B1) is polynomial with an arbitrarily large order. Below we calculate the difference $\sum_{n=0}^{\infty}\frac{1}{n!}\left\langle{n\atop k}\right\rangle-2$ for the first several k’s using Mathematica:

k: 0, 1, 2, 3, 4, 5, ⋯, 50
$\sum_{n=0}^{\infty}\frac{1}{n!}\left\langle{n\atop k}\right\rangle-2$: 0.71828, −4.8 × 10^{−2}, −4.2 × 10^{−3}, 3.9 × 10^{−5}, 5.7 × 10^{−5}, 5.1 × 10^{−6}, ⋯, < 1.0 × 10^{−45}
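The first few entries of this table are easy to reproduce without Mathematica; the sketch below (ours) builds the Eulerian triangle from the recurrence in Section 1 and evaluates the truncated column sums in exact rational arithmetic:

```python
from fractions import Fraction

# Eulerian triangle from the recurrence, kept as exact integers.
N = 120                                   # truncation level of the column sums
rows = [[1]]
for n in range(1, N + 1):
    prev = rows[-1] + [0]
    rows.append([(k + 1) * prev[k] + (n - k) * (prev[k - 1] if k else 0)
                 for k in range(n)])

fact = [1]
for n in range(1, N + 1):
    fact.append(fact[-1] * n)

# Truncated column sums minus 2, in exact rational arithmetic;
# compare with the table above.
for k in range(6):
    s = sum(Fraction(rows[n][k], fact[n]) for n in range(N + 1) if k < len(rows[n]))
    print(k, float(s - 2))
```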

Short diagonals

Short diagonals turn out to be related to a renewal process with interarrival times $X_i^{[1,2]}\sim$ Uniform(1, 2), which have the same distribution as $X_i^{[0,1]}+1$. The mean of $X_i^{[1,2]}$ is $EX_i^{[1,2]}=EX_i^{[0,1]}+1=3/2$.

Using Equation (4) we find that

$\frac{1}{(n-k)!}\left\langle{n-k\atop k}\right\rangle=P\left(X_1^{[0,1]}+\cdots+X_{n-k}^{[0,1]}\in(k-1,k]\right)$
$=P\left((X_1^{[0,1]}+1)+\cdots+(X_{n-k}^{[0,1]}+1)\in(n-1,n]\right)$
$=P\left(X_1^{[1,2]}+\cdots+X_{n-k}^{[1,2]}\in(n-1,n]\right)=P\left(S_{n-k}^{[1,2]}\in(n-1,n]\right).$

Therefore,

$\sum_{k\ge 0}\frac{1}{(n-k)!}\left\langle{n-k\atop k}\right\rangle=\sum_{k\ge 0}P\left(S_{n-k}^{[1,2]}\in(n-1,n]\right)=\sum_{k\ge 0}P\left(S_{k}^{[1,2]}\in(n-1,n]\right)=U^{[1,2]}(n-1,n],$

so by Theorem 1

$\lim_{n\to\infty}\sum_{k\ge 0}\frac{1}{(n-k)!}\left\langle{n-k\atop k}\right\rangle=\lim_{n\to\infty}U^{[1,2]}(n-1,n]=\frac{1}{3/2}=\frac{2}{3},$ (B2)

which is the same value as the analogous result for the binomial coefficients.
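A direct numerical check of (B2) (our sketch, reusing the same recurrence for the Eulerian triangle):

```python
from math import factorial

# Eulerian triangle from the recurrence.
rows = [[1]]
for n in range(1, 61):
    prev = rows[-1] + [0]
    rows.append([(k + 1) * prev[k] + (n - k) * (prev[k - 1] if k else 0)
                 for k in range(n)])

# B2: the short-diagonal sums sum_k <n-k, k>/(n-k)! approach 2/3 as n grows.
for n in (5, 10, 20, 40, 60):
    s = sum(rows[n - k][k] / factorial(n - k)
            for k in range(n + 1) if k < len(rows[n - k]))
    print(n, s)
```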

4. Bernstein polynomials and h-Bernstein polynomials

4.1. Bernstein polynomials

We consider a renewal process with random interarrival times $X_i^{(t)}\sim$ Bernoulli(t), where the success probability t ∈ (0, 1). In view of the probabilistic interpretation of the Bernstein polynomials $B_k^n(t)$

$B_k^n(t)=\binom{n}{k}t^k(1-t)^{n-k}=P\left(S_n^{(t)}\in(k-1,k]\right),$ (6)

extensions of Equations (A1) and (A2) to Bernstein polynomials are immediately available. Since Bernoulli(t) is arithmetic with span λ = 1 and mean μ = t, a direct application of Theorem 1 gives

$\lim_{k\to\infty}\sum_{n=0}^{\infty}B_k^n(t)=\lim_{k\to\infty}U^{(t)}(k-1,k]=\frac{1}{t}.$ (C1)

For the sums of the short diagonals in C2, we consider a new renewal process with shifted Bernoulli interarrival times $X_i^{(t)+1}$ that have the same distribution as $X_i^{(t)}+1$, i.e., $X_i^{(t)+1}=2$ if the ith trial succeeds and $X_i^{(t)+1}=1$ if the ith trial fails, with mean $EX_i^{(t)+1}=EX_i^{(t)}+1=t+1$. The sums of the short diagonals follow from the same argument as used in the proof of A2 and B2:

$B_k^{n-k}(t)=P\left(X_1^{(t)}+\cdots+X_{n-k}^{(t)}\in(k-1,k]\right)$
$=P\left((X_1^{(t)}+1)+\cdots+(X_{n-k}^{(t)}+1)\in(n-1,n]\right)$
$=P\left(X_1^{(t)+1}+\cdots+X_{n-k}^{(t)+1}\in(n-1,n]\right)=P\left(S_{n-k}^{(t)+1}\in(n-1,n]\right),$

which by Theorem 1 gives

$\lim_{n\to\infty}\sum_{k\ge 0}B_k^{n-k}(t)=\lim_{n\to\infty}\sum_{k\ge 0}P\left(S_{n-k}^{(t)+1}\in(n-1,n]\right)=\lim_{n\to\infty}\sum_{k\ge 0}P\left(S_{k}^{(t)+1}\in(n-1,n]\right)=\lim_{n\to\infty}U^{(t)+1}(n-1,n]=\frac{1}{t+1},$ (C2)

noting that $P\left(S_k^{(t)+1}\in(n-1,n]\right)=0$ for $k > n$.
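Both (C1) and (C2) are easy to check numerically; the sketch below (ours) truncates the column series and evaluates the short-diagonal sums for a sample value of t:

```python
from math import comb

def bernstein(k, n, t):
    """B_k^n(t) = C(n, k) t^k (1 - t)^(n - k)."""
    return comb(n, k) * t**k * (1 - t) ** (n - k)

t = 0.3

# C1: for fixed k, sum_n B_k^n(t) = 1/t  (series truncated at 400 terms).
for k in (0, 2, 5):
    print(k, sum(bernstein(k, n, t) for n in range(400)), 1 / t)

# C2: sum_k B_k^(n-k)(t) tends to 1/(1 + t) as n grows.
for n in (5, 10, 20, 40):
    print(n, sum(bernstein(k, n - k, t) for k in range(n + 1)), 1 / (1 + t))
```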

4.2. h-Bernstein polynomials

The h-Bernstein polynomials are defined by

$B_k^n(t;h)=\binom{n}{k}\frac{\prod_{i=0}^{k-1}(t+ih)\,\prod_{i=0}^{n-k-1}(1-t+ih)}{\prod_{i=0}^{n-1}(1+ih)},\qquad k=0,1,2,\ldots,n,$

where h ≥ 0. This formula is actually the probability mass function of the Pólya-Eggenberger distribution [7, 8]. This distribution reduces to the ordinary binomial distribution when h = 0, which has been discussed in the preceding section. Now we focus on positive h, when the Pólya-Eggenberger distribution is a beta-binomial distribution with parameters a = t/h and b = (1 − t)/h [9, Ch. 6]. A beta-binomial distribution with parameters (a, b) is the marginal distribution of X if $(X\mid p)\sim$ Binomial(n, p) and p ~ Beta(a, b), where Beta(a, b) is the Beta distribution with probability density function $f(x)=\frac{x^{a-1}(1-x)^{b-1}}{B(a,b)}$ for x ∈ (0, 1), and B(a, b) is the Beta function evaluated at (a, b).

This interpretation of $B_k^n(t;h)$ using a mixture of binomial distributions means that h-Bernstein polynomials correspond to a renewal process with interarrival times $X_i^{(p)}$, where the success probability p is a random draw from Beta(a, b). Therefore,

$B_k^n(t;h)=E_{p\sim\text{Beta}(a,b)}\,P\left(S_n^{(p)}\in(k-1,k]\right)=\int_0^1 P\left(S_n^{(p)}\in(k-1,k]\right)\frac{p^{a-1}(1-p)^{b-1}}{B(a,b)}\,dp.$ (7)

Using the same argument as in the derivation for Bernstein polynomials but conditioning on the random success probability p, it follows that

$\lim_{k\to\infty}\sum_{n=0}^{\infty}B_k^n(t;h)=E_{p\sim\text{Beta}(a,b)}\lim_{k\to\infty}\sum_{n=0}^{\infty}P\left(S_n^{(p)}\in(k-1,k]\right)=E_{p\sim\text{Beta}(a,b)}\lim_{k\to\infty}U^{(p)}(k-1,k]$ (8)
$=E_{p\sim\text{Beta}(a,b)}\left(\frac{1}{p}\right)=\frac{a+b-1}{a-1}=\frac{1/h-1}{t/h-1}=\frac{1-h}{t-h},$ (D1)

where the interchange of expectation and limit in Equation (8) is guaranteed by the dominated convergence theorem [4, p. 111]. Here we require a = t/h > 1 to ensure that the expectation of 1/p exists, which means Equation (D1) holds for 0 < h < t < 1.

Asymptotic formulas for the sums of the short diagonals also hold for the h-Bernstein polynomials:

$\lim_{n\to\infty}\sum_{k\ge 0}B_k^{n-k}(t;h)=E_{p\sim\text{Beta}(a,b)}\left[\lim_{n\to\infty}\sum_{k\ge 0}P\left(S_k^{(p)+1}\in(n-1,n]\right)\right]=E_{p\sim\text{Beta}(a,b)}\left(\frac{1}{p+1}\right)=\int_0^1\frac{x^{a-1}(1-x)^{b-1}}{(1+x)B(a,b)}\,dx,$ (D2)

where 0 < t < 1, h > 0, a = t/h, b = (1 − t)/h.

By Euler’s integral representation of ${}_2F_1$ [10, (1.6.1)], the integral on the right-hand side of Equation (D2) reduces to a hypergeometric function, so

$\lim_{n\to\infty}\sum_{k\ge 0}B_k^{n-k}(t;h)={}_2F_1(1,\,t/h;\,1/h;\,-1).$

When h = 1,

${}_2F_1(1,\,t/h;\,1/h;\,-1)={}_2F_1(1,\,t;\,1;\,-1)={}_1F_0(t;\,;\,-1)=2^{-t},$ (D2a)

for 0 < t < 1.
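For a numerical check of (D1) and (D2a), the sketch below (ours) evaluates $B_k^n(t;h)$ through the beta-binomial representation with a = t/h and b = (1 − t)/h described above, using log-gamma values to avoid overflow; the parameter choices are arbitrary:

```python
from math import comb, exp, lgamma

def h_bernstein(k, n, t, h):
    """B_k^n(t; h) for h > 0, via the beta-binomial form with
    a = t/h, b = (1 - t)/h (the equivalence noted in Section 4.2)."""
    if k < 0 or k > n:
        return 0.0
    a, b = t / h, (1 - t) / h
    log_term = (lgamma(a + k) - lgamma(a)) + (lgamma(b + n - k) - lgamma(b)) \
               - (lgamma(a + b + n) - lgamma(a + b))
    return comb(n, k) * exp(log_term)

# D1: for 0 < h < t < 1, the column sum equals (1 - h)/(t - h).
t, h, k = 0.6, 0.1, 3
print(sum(h_bernstein(k, n, t, h) for n in range(1000)), (1 - h) / (t - h))

# D2a: with h = 1 the short-diagonal sums approach 2**(-t); convergence is
# slow here and the error alternates in sign.
t = 0.2
for n in (20, 80, 320, 1280):
    print(n, sum(h_bernstein(k, n - k, t, 1.0) for k in range(n // 2 + 1)),
          2 ** (-t))
```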

5. B-splines

The application of renewal theory to Eulerian numbers can be extended to uniform B-splines [11, 12]. Let $N_{0,n}(t)$ denote the B-spline of degree n with knots at the integers 0, 1, …, n + 1 and support [0, n + 1]. Let $\chi_{[0,1]}(t)$ be the characteristic function of [0, 1], i.e., $\chi_{[0,1]}(t)=1\{t\in[0,1]\}$. Then $N_{0,n}(t)$ is the (n + 1)-fold convolution of $\chi_{[0,1]}(t)$ with itself. Since $\chi_{[0,1]}(t)$ is the probability density function of Uniform(0, 1), the convolution $N_{0,n}(t)$ is actually the probability density function of $S_{n+1}^{[0,1]}$, or in other words, the probability density function of the Irwin-Hall distribution. Furthermore,

$N_{0,n}(t)=\int_{-\infty}^{\infty}N_{0,n-1}(x)\,\chi_{[0,1]}(t-x)\,dx=\int_{t-1}^{t}N_{0,n-1}(x)\,dx=P\left(S_n^{[0,1]}\in(t-1,t]\right).$ (9)

It follows from Equations (4) and (9) that the normalized Eulerian numbers are the uniform B-splines evaluated at the integers. Moreover, in view of Theorem 1 and Equation (9), it follows that

$\lim_{t\to\infty}\sum_{n=0}^{\infty}N_{0,n}(t)=2.$ (E1)

Sums of short diagonals also have a counterpart for B-splines. Noting that

$N_{0,n-k}(k+t)=P\left(X_1^{[0,1]}+\cdots+X_{n-k}^{[0,1]}\in(k+t-1,k+t]\right)$
$=P\left((X_1^{[0,1]}+1)+\cdots+(X_{n-k}^{[0,1]}+1)\in(n+t-1,n+t]\right)$
$=P\left(X_1^{[1,2]}+\cdots+X_{n-k}^{[1,2]}\in(n+t-1,n+t]\right)=P\left(S_{n-k}^{[1,2]}\in(n+t-1,n+t]\right),$

we have

$\lim_{n\to\infty}\sum_{k\ge 0}N_{0,n-k}(k+t)=\lim_{n\to\infty}U^{[1,2]}(n+t-1,n+t]=\frac{1}{3/2}=\frac{2}{3},$ (E2)

for all t.
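The sketch below (ours) checks (E1) and (E2) numerically. It evaluates $N_{0,n}(t)$ with the standard alternating-sum closed form for the Irwin-Hall density, which is not derived in this paper, and uses exact rational arithmetic because that formula is numerically unstable in floating point:

```python
from fractions import Fraction
from math import comb, factorial

def bspline(n, t):
    """N_{0,n}(t): the Irwin-Hall density of a sum of n+1 IID Uniform(0,1)
    variables, evaluated exactly via the standard alternating-sum formula."""
    t = Fraction(t)
    if t <= 0 or t >= n + 1:
        return Fraction(0)
    return sum((-1) ** j * comb(n + 1, j) * (t - j) ** n
               for j in range(min(int(t), n + 1) + 1)) / factorial(n)

# E1: sum_n N_{0,n}(t) approaches 2 as t grows (series truncated at n = 89).
for t in (Fraction(5, 2), Fraction(17, 2), Fraction(33, 2)):
    print(float(t), float(sum(bspline(n, t) for n in range(90))))

# E2: sum_k N_{0,n-k}(k + 1/2) approaches 2/3 as n grows.
half = Fraction(1, 2)
for n in (5, 10, 20, 40):
    print(n, float(sum(bspline(n - k, k + half) for k in range(n + 1))))
```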

6. Contrasts and alternating sums

Theorem 1 also holds for a delayed renewal process, where $S_0$ is a random variable other than a constant zero. This insight leads to an extension of Equation (B1) to alternating sums, and more generally, contrasts. We call a length-m vector $(c_0, c_1, \ldots, c_{m-1})$ a contrast if $\sum_{i=0}^{m-1}c_i=0$. We use the same notation $c_k$ to denote its periodic extension, which is a sequence $\{c_k : k = 0, 1, 2, \ldots\}$ such that $c_k = c_i$ if k ≡ i (mod m).

For each j = 0, 1, …, m − 1, consider a delayed renewal process with delay $S_0^{(j)}=S_j^{[0,1]}$ and interarrival times $X_i^{(j)}=\sum_{\ell=0}^{m-1}X_{m(i-1)+\ell+j+1}^{[0,1]}=S_{mi+j}^{[0,1]}-S_{m(i-1)+j}^{[0,1]}$ for i = 1, 2, …. The corresponding partial sum $S_n^{(j)}$ satisfies

$S_n^{(j)}=\sum_{i=1}^{n}X_i^{(j)}+S_j^{[0,1]}=\sum_{i=1}^{n}\left(S_{mi+j}^{[0,1]}-S_{m(i-1)+j}^{[0,1]}\right)+S_j^{[0,1]}=S_{mn+j}^{[0,1]},$

and the expectation of the interarrival times is $EX_i^{(j)}=\sum_{\ell=0}^{m-1}EX_{m(i-1)+\ell+j+1}^{[0,1]}=m/2$. A direct application of Theorem 1 to this delayed renewal process gives

$\lim_{k\to\infty}\sum_{n=0}^{\infty}P\left(S_{mn+j}^{[0,1]}\in(k-1,k]\right)=\lim_{k\to\infty}\sum_{n=0}^{\infty}P\left(S_n^{(j)}\in(k-1,k]\right)=\frac{1}{m/2}=\frac{2}{m}.$

Now

$\sum_{n=0}^{\infty}c_n\frac{1}{n!}\left\langle{n\atop k}\right\rangle=\sum_{n=0}^{\infty}c_n P\left(S_n^{[0,1]}\in(k-1,k]\right)=\sum_{j=0}^{m-1}c_j\sum_{n=0}^{\infty}P\left(S_{mn+j}^{[0,1]}\in(k-1,k]\right).$ (10)

Taking the limit of both sides in Equation (10), we obtain

$\lim_{k\to\infty}\sum_{n=0}^{\infty}c_n\frac{1}{n!}\left\langle{n\atop k}\right\rangle=\sum_{j=0}^{m-1}c_j\lim_{k\to\infty}\sum_{n=0}^{\infty}P\left(S_{mn+j}^{[0,1]}\in(k-1,k]\right)=\sum_{j=0}^{m-1}c_j\,\frac{2}{m}=\frac{2}{m}\sum_{j=0}^{m-1}c_j=0.$

A special case is alternating sums of the normalized Eulerian numbers when m = 2 and $(c_0, c_1) = (1, -1)$, i.e.,

$\lim_{k\to\infty}\sum_{n=0}^{\infty}(-1)^n\frac{1}{n!}\left\langle{n\atop k}\right\rangle=0.$ (B3)

Analogous results hold for normalized binomial coefficients following the same argument as above but replacing Uniform(0, 1) by Bernoulli(1/2). Thus

$\lim_{k\to\infty}\sum_{n=0}^{\infty}c_n\frac{1}{2^n}\binom{n}{k}=0,$

and

$\lim_{k\to\infty}\sum_{n=0}^{\infty}(-1)^n\frac{1}{2^n}\binom{n}{k}=0.$ (A3)

Proofs of C3, D3, and E3 are similar to the proofs provided above, either by varying the distribution of interarrival times (use Bernoulli(t) where t ∈ (0, 1) for C3 and a mixture of Bernoulli(p) where p ~ Beta(a = t/h, b = (1 − t)/h) for D3) or by applying Theorem 1 to (t − 1, t] for real-valued t (for E3). Thus we omit these proofs here.
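A numerical check of the contrast result (our sketch): the weighted column sums of the normalized Eulerian numbers shrink toward 0 as k grows, both for the alternating contrast (1, −1) of (B3) and for another contrast such as (1, 1, −2):

```python
from math import factorial

# Eulerian triangle from the recurrence.
rows = [[1]]
for n in range(1, 81):
    prev = rows[-1] + [0]
    rows.append([(k + 1) * prev[k] + (n - k) * (prev[k - 1] if k else 0)
                 for k in range(n)])

def weighted_column_sum(c, k):
    """sum_n c_n <n, k>/n!, where c_n is the periodic extension of the contrast c."""
    return sum(c[n % len(c)] * rows[n][k] / factorial(n)
               for n in range(len(rows)) if k < len(rows[n]))

# Both columns of output shrink toward 0 as k grows.
for k in (0, 1, 2, 4, 8):
    print(k, weighted_column_sum((1, -1), k), weighted_column_sum((1, 1, -2), k))
```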

7. Non-asymptotic identities for normalized binomial coefficients and (h-)Bernstein polynomials

Here we derive non-asymptotic identities for fixed k as a counterpart to the limits in A, C, and D, also through probabilistic arguments.

The random variable $\sum_{n=0}^{\infty}1\{S_n^{(t)}=k\}$ is equal to one plus the number of trials that fail after $S_n^{(t)}$ first reaches k. Because the trials after that time are independent of the trials up to it, $\sum_{n=0}^{\infty}1\{S_n^{(t)}=k\}$ has the same distribution as the number of trials needed to obtain one success in an IID Bernoulli sequence, which follows a geometric distribution with parameter t and has mean 1/t. Consequently, by Equation (2),

$\sum_{n=0}^{\infty}\binom{n}{k}t^k(1-t)^{n-k}=\frac{1}{t},$ (C1*)

for any nonnegative integer k. Substituting t = 1/2 into Equation (C1*) leads to the column sum of the normalized binomial coefficients

$\sum_{n=0}^{\infty}\frac{1}{2^n}\binom{n}{k}=\frac{1}{1/2}=2,$ (A1*)

for any nonnegative integer k. Equation (D1*) holds by using the equivalence between h-Bernstein polynomials and Beta-Binomial distributions as in Section 4.2.
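The visit-counting argument behind (C1*) can also be illustrated by a short simulation (our sketch; the values of t and k are arbitrary):

```python
import random

# Monte Carlo check of the visit-count argument: the number of indices n with
# S_n^{(t)} = k has mean 1/t, because after first reaching level k the walk
# stays there for a geometric number of failed trials.
random.seed(2)
t, k, reps = 0.3, 4, 100_000
total = 0
for _ in range(reps):
    s, visits = 0, 0
    while s <= k:
        visits += (s == k)
        s += (random.random() < t)
    total += visits
print(total / reps, 1 / t)    # both close to 1/t = 3.333...
```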

For the sum of short diagonals,

$\sum_{k\ge 0}B_k^{n-k}(t)=\sum_{k\ge 0}P\left(S_k^{(t)+1}=n\right)=E\left[\sum_{k\ge 0}1\left\{S_k^{(t)+1}=n\right\}\right]=:EZ_n.$

Since $S_k^{(t)+1}$ is strictly increasing in k, there exists at most one value of k such that $S_k^{(t)+1}=n$, so the random variable $Z_n$ follows a Bernoulli distribution. Let the success probability be $a_n = P(Z_n = 1)$. Decomposing the event $\{Z_n = 1\}$ according to whether the step that reaches n is a failure (an increment of 1 from level n − 1) or a success (an increment of 2 from level n − 2), we conclude that $a_n = a_{n-1}(1-t) + a_{n-2}\,t$ for n ≥ 2, where we let $a_0 = 1$ (and note that $a_1 = 1 - t$). Solving this recurrence for $a_n$ gives

$\sum_{k\ge 0}B_k^{n-k}(t)=\frac{1-(-t)^{n+1}}{1+t}.$ (C2*)

Substituting t = 1/2 yields

$\sum_{k\ge 0}\frac{1}{2^{n-k}}\binom{n-k}{k}=\frac{2}{3}+\frac{1}{3}\left(-\frac{1}{2}\right)^{n}.$ (A2*)
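Since the short-diagonal sums are finite sums, (C2*) and (A2*) can be verified exactly for each n; the sketch below (ours) does so in rational arithmetic for a sample value of t:

```python
from fractions import Fraction
from math import comb

def bernstein(k, n, t):
    """B_k^n(t) in exact rational arithmetic."""
    return comb(n, k) * t**k * (1 - t) ** (n - k)

# C2* and A2*: check the closed forms exactly for each n.
t = Fraction(1, 3)
for n in range(8):
    lhs = sum(bernstein(k, n - k, t) for k in range(n // 2 + 1))
    rhs = (1 - (-t) ** (n + 1)) / (1 + t)                         # C2*
    a2 = sum(Fraction(comb(n - k, k), 2 ** (n - k)) for k in range(n // 2 + 1))
    print(n, lhs == rhs,
          a2 == Fraction(2, 3) + Fraction(1, 3) * Fraction(-1, 2) ** n)
```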

In view of $B_k^{n-k}(t;h)=E_{p\sim\text{Beta}(a,b)}B_k^{n-k}(p)$, which follows from Equations (6) and (7) with a = t/h and b = (1 − t)/h, Equation (C2*) leads to

$\sum_{k\ge 0}B_k^{n-k}(t;h)=E_{p\sim\text{Beta}(a,b)}\sum_{k\ge 0}B_k^{n-k}(p)=E_{p\sim\text{Beta}(a,b)}\left[\frac{1-(-p)^{n+1}}{1+p}\right]=\int_0^1\frac{x^{a-1}(1-x)^{b-1}}{(1+x)B(a,b)}\,dx+(-1)^{n+2}\int_0^1\frac{x^{a+n}(1-x)^{b-1}}{(1+x)B(a,b)}\,dx.$ (D2*)

For the alternating sums, we only need to establish the Bernstein polynomial identity C3*; then A3* and D3* follow in the same way that A2* and D2* were derived from C2*. For Equation (C3*), we first rewrite the left-hand side as an expectation

$\sum_{n=0}^{\infty}(-1)^n B_k^n(t)=\sum_{n=0}^{\infty}(-1)^n E\,1\left\{S_n^{(t)}=k\right\}=E\left[\sum_{n=0}^{\infty}(-1)^n 1\left\{S_n^{(t)}=k\right\}\right]=:EZ_k.$

Consider an IID Bernoulli sequence with success probability t and a derived random sequence $\{U_j : j = 1, 2, \ldots\}$, where $U_j$ is the number of trials needed to go from the (j − 1)th success to the jth success. Then the $U_j$’s are IID following a geometric distribution with parameter t. Let $V_j$ be the index of the trial at which the jth success occurs, i.e., $V_j=\sum_{i=1}^{j}U_i$. Then,

$Z_k=\sum_{n=V_k}^{V_{k+1}-1}(-1)^n 1\left\{S_n^{(t)}=k\right\}=\sum_{n=V_k}^{V_{k+1}-1}(-1)^n=(-1)^{V_k}\,1\{V_{k+1}-V_k\text{ is odd}\}=\prod_{j=1}^{k}(-1)^{U_j}\cdot 1\{U_{k+1}\text{ is odd}\},$

which, combined with the fact that the $U_j$’s are IID, leads to

$EZ_k=\prod_{j=1}^{k}E\left[(-1)^{U_j}\right]\,P\{U_{k+1}\text{ is odd}\}=\left\{E\left[(-1)^{U_1}\right]\right\}^{k}P\{U_1\text{ is odd}\}.$

Let $A_1 = \{U_1\text{ is even}\}$ and $B_1=\{U_1\text{ is odd}\}=A_1^c$. Then $E[(-1)^{U_1}]=P(A_1)-P(B_1)$. Moreover it follows easily from the definition of $U_1$ that $P(A_1)=\sum_{i=0}^{\infty}(1-t)^{2i+1}t=\frac{t(1-t)}{1-(1-t)^2}=\frac{1-t}{2-t}$ and $P(B_1) = 1-P(A_1) = \frac{1}{2-t}$. Consequently,

$EZ_k=\left[P(A_1)-P(B_1)\right]^{k}P(B_1)=\frac{(-t)^{k}}{(2-t)^{k+1}}.$ (C3*)
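Finally, (C3*) and its special case (A3*) can be checked against truncated series (our sketch; t = 0.3 and the truncation level are arbitrary choices):

```python
from math import comb

def bernstein(k, n, t):
    return comb(n, k) * t**k * (1 - t) ** (n - k)

# C3*: the alternating column sums equal (-t)^k / (2 - t)^(k+1);
# A3* is the special case t = 1/2.  Truncate each series at N terms.
t, N = 0.3, 200
for k in (0, 1, 2, 5):
    print(k, sum((-1) ** n * bernstein(k, n, t) for n in range(N)),
          (-t) ** k / (2 - t) ** (k + 1))

for k in (0, 1, 2, 5):
    print(k, sum((-1) ** n * comb(n, k) / 2**n for n in range(N)),
          (-1) ** k * 2 / 3 ** (k + 1))
```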

Acknowledgments

We thank Professor Plamen Simeonov for pointing out the connection between D2 and hypergeometric functions. This research was partially supported by an ORAU Ralph E. Powe Junior Faculty Enhancement Award and grant 1R24MH117529 of the BRAIN Initiative of the United States National Institutes of Health.

Footnotes

Declaration of interests

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.


References

  • [1] Graham RL, Knuth DE, Patashnik O, Concrete Mathematics: A Foundation for Computer Science (Second Edition), Addison-Wesley, 1994.
  • [2] Simsek Y, Generating functions for the Bernstein type polynomials: a new approach to deriving identities and applications for the polynomials, Hacettepe Journal of Mathematics and Statistics 43 (2014) 1–14.
  • [3] Goldman R, Generating functions for uniform B-splines, in: Proceedings of the Eighth International Conference on Mathematical Methods for Curves and Surfaces, Springer, 2014, pp. 172–188.
  • [4] Feller W, An Introduction to Probability Theory and Its Applications (Second Edition), volume 2, John Wiley & Sons, 1971.
  • [5] Tanny S, A probabilistic interpretation of Eulerian numbers, Duke Mathematical Journal 40 (1973) 717–722. doi:10.1215/S0012-7094-73-04065-9.
  • [6] Konstantopoulos T, Last G, On the use of Lyapunov function methods in renewal theory, Stochastic Processes and their Applications 79 (1999) 165–178. doi:10.1016/S0304-4149(98)00068-4.
  • [7] Eggenberger F, Pólya G, Über die Statistik verketteter Vorgänge, ZAMM – Zeitschrift für Angewandte Mathematik und Mechanik 3 (1923) 279–289.
  • [8] Pólya G, Sur quelques points de la théorie des probabilités, in: Annales de l’institut Henri Poincaré, volume 1, 1930, pp. 117–161.
  • [9] Johnson NL, Kemp AW, Kotz S, Univariate Discrete Distributions, volume 444, John Wiley & Sons, 2005.
  • [10] Koekoek R, Lesky PA, Swarttouw RF, Hypergeometric Orthogonal Polynomials and Their q-Analogues, Springer Science & Business Media, 2010.
  • [11] Wang R-H, Xu Y, Xu Z-Q, Eulerian numbers: a spline perspective, Journal of Mathematical Analysis and Applications 370 (2010) 486–490.
  • [12] He T-X, Eulerian polynomials and B-splines, Journal of Computational and Applied Mathematics 236 (2012) 3763–3773.
