Entropy. 2024 Dec 16;26(12):1101. doi: 10.3390/e26121101

Symplectic Bregman Divergences

Frank Nielsen
Editors: Stéphane Puechmorel, Florence Nicol
PMCID: PMC11675853  PMID: 39766730

Abstract

We present a generalization of Bregman divergences in finite-dimensional symplectic vector spaces that we term symplectic Bregman divergences. Symplectic Bregman divergences are derived from a symplectic generalization of the Fenchel–Young inequality which relies on the notion of symplectic subdifferentials. The symplectic Fenchel–Young inequality is obtained using the symplectic Fenchel transform, which is defined with respect to the symplectic form. Since symplectic forms can be built generically from pairings of dual systems, we obtain a generalization of Bregman divergences to dual systems via equivalent symplectic Bregman divergences. In particular, when the symplectic form is derived from an inner product, we show that the corresponding symplectic Bregman divergences amount to ordinary Bregman divergences with respect to composite inner products. Some potential applications of symplectic divergences in geometric mechanics, information geometry, and learning dynamics in machine learning are touched upon.

Keywords: dual system, duality product, inner product, symplectic form, symplectic matrix group, symplectic subdifferential, symplectic Fenchel transform, Moreau proximation, geometric mechanics

1. Introduction

Symplectic geometry [1,2,3] was historically pioneered by Lagrange around 1808–1810 [4,5,6]: The motions and dynamics (evolution curves) of a finite set of $m$ point mass particles in a time interval $T$ are analyzed in the phase space by a 1D curve $C=\{c(t)=(q_1,p_1,\ldots,q_m,p_m)(t)\ :\ t\in T\subset\mathbb{R}\}\subset\mathbb{R}^{2n}$, where the $q_i(t)\in\mathbb{R}^n$ denote the point locations at time $t$ and the $p_i(t)\in\mathbb{R}^n$ encode the momenta, i.e., $p_i(t)=m_i\dot{q}_i$ with $\dot{q}_i=\frac{d}{dt}q_i(t)$. See Figure 1. (Notice that Joseph-Louis Lagrange (1736–1813) was 72 years old in 1808, and is famous for his treatise on analytic mechanics [7,8], first published in French in 1788 when he was 52 years old).

Figure 1. The motion of a single point particle $q(t)$ with mass $m$ and momentum $p(t)=m\dot{q}(t)$ on a 1D line can be modeled as a curve $C=\{c(t)=(q(t),p(t))\ :\ t\in T\subset\mathbb{R}\}$ in the phase space $\mathbb{R}^2$.

The Hamiltonian coupled equations [9] governing the system motion are written in the phase space as follows:

$$\frac{dq_i}{dt}=\frac{\partial H}{\partial p_i},\qquad \frac{dp_i}{dt}=-\frac{\partial H}{\partial q_i}, \tag{1}$$

where $H(q,p,t)$ is the Hamiltonian describing the system. Lagrange thereby originated a new kind of calculus, “symplectic calculus”. Symplectic geometry can be thought of as the first discovered non-Euclidean geometric structure, since hyperbolic geometry is usually considered to have been first studied by Lobachevsky and Bolyai around 1820–1830. We refer to the paper entitled “The symplectization of science” [10] for an outreach article on symplectic geometry.
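To make Equation (1) concrete, the following sketch (not part of the paper; all numerical values are illustrative) integrates the 1D harmonic oscillator with Hamiltonian $H(q,p)=\frac{p^2}{2m}+\frac{kq^2}{2}$ using the symplectic (semi-implicit) Euler scheme, a discretization that respects the symplectic structure of the Hamiltonian flow and hence nearly conserves the energy:

```python
def symplectic_euler(q, p, m, k, dt, steps):
    """Integrate dq/dt = p/m, dp/dt = -k*q (harmonic oscillator)
    with the symplectic (semi-implicit) Euler scheme."""
    traj = [(q, p)]
    for _ in range(steps):
        p = p - dt * k * q      # dp/dt = -dH/dq
        q = q + dt * p / m      # dq/dt =  dH/dp (using the updated p)
        traj.append((q, p))
    return traj

# One full period of the unit oscillator (m = k = 1), dt = 0.01.
traj = symplectic_euler(q=1.0, p=0.0, m=1.0, k=1.0, dt=0.01, steps=628)
H = lambda q, p: 0.5 * p * p + 0.5 * q * q
print(abs(H(*traj[-1]) - H(*traj[0])))   # small energy drift
```

A non-symplectic explicit Euler step would exhibit a systematic energy growth instead.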

The adjective “symplectic” stems from Greek: It means “braided together”, conveying the interaction of point mass particle positions with their momenta. Its use in mathematics originated in the work of Hermann Weyl (see §6 on symplectic groups in [11]). A synonymous adjective is “complex”, which has been used to describe the braided numbers $z$ of $\mathbb{C}=\{z=a+ib\ :\ (a,b)\in\mathbb{R}^2\}$; “complex” has its etymological root in Latin. In differential geometry, symplectic structures are closely related to (almost) complex structures on vector spaces and smooth manifolds [2].

In physics, symplectic geometry is not only at the core of classical mechanics (i.e., conservative reversible mechanics) and quantum mechanics [12], but has also recently been used to model and study dynamics of systems exhibiting dissipative terms [13,14] which are irreversible. As a pure geometry, symplectic geometry can be studied on its own by mathematicians, and gave birth to the field of symplectic topology [15]. Thus, symplectic geometry can be fruitfully applied to various areas beyond its original domain of geometric mechanics. For example, symplectic geometry has been considered in machine learning for accelerating numerical optimization methods based on symplectic integrators [16] and in physics-informed neural networks [17,18] (PINNs).

In this paper, we define symplectic Bregman divergences (Definition 5) which recover as special cases Bregman divergences [19] defined with respect to composite inner products. A Bregman divergence induced by a strictly convex and differentiable (potential) function F (called the Bregman generator) between x1 and x2 of X is defined in [19] (1967) by

$$B_F(x_1:x_2)=F(x_1)-F(x_2)-\langle x_1-x_2,\nabla F(x_2)\rangle, \tag{2}$$

where $\langle\cdot,\cdot\rangle$ is an inner product on $X$. Let $\Gamma_0(X)$ denote the set of lower semi-continuous convex functions with non-empty effective domains. The convex conjugate $F^*(x^*)$ obtained by the Legendre–Fenchel transform $F^*(x^*)=\sup_{x\in X}\ \langle x^*,x\rangle-F(x)$ yields a dual Bregman divergence $B_{F^*}$ when the function $F\in\Gamma_0(X)$ is of Legendre type [20,21]:

$$B_{F^*}(x_1^*:x_2^*)=F^*(x_1^*)-F^*(x_2^*)-\langle x_1^*-x_2^*,\nabla F^*(x_2^*)\rangle,$$

such that $B_F(x_1:x_2)=B_{F^*}(x_2^*:x_1^*)$ with

$$F^*(x^*)=\langle x^*,(\nabla F)^{-1}(x^*)\rangle-F\big((\nabla F)^{-1}(x^*)\big).$$
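The Bregman reference duality $B_F(x_1:x_2)=B_{F^*}(x_2^*:x_1^*)$ can be checked numerically; this sketch (my own, not from the paper) uses the scalar generator $F(x)=e^x$, whose conjugate $F^*(y)=y\log y-y$ and gradient maps are known in closed form:

```python
import math

def bregman(F, gradF, x1, x2):
    """B_F(x1:x2) = F(x1) - F(x2) - <x1 - x2, grad F(x2)> (scalar case)."""
    return F(x1) - F(x2) - (x1 - x2) * gradF(x2)

# Generator F(x) = exp(x); conjugate F*(y) = y*log(y) - y, grad F*(y) = log(y).
F      = math.exp
gradF  = math.exp
Fstar  = lambda y: y * math.log(y) - y
gradFs = math.log

x1, x2 = 0.3, -0.5
y1, y2 = gradF(x1), gradF(x2)       # dual coordinates y = F'(x)
lhs = bregman(F, gradF, x1, x2)     # B_F(x1:x2)
rhs = bregman(Fstar, gradFs, y2, y1)  # B_{F*}(y2:y1): arguments are swapped
print(lhs, rhs)                     # equal up to rounding
```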

This paper builds on and extends the work of Buliga and de Saxcé [13,14], which is motivated by geometric irreversible mechanics. In contrast with [13,14], this expository paper targets an audience familiar with Bregman divergences [19] in machine learning and information geometry [22] and does not assume any prior knowledge of geometric mechanics. Furthermore, we consider only finite-dimensional spaces in this study.

The paper is organized as follows: In Section 2, we define symplectic vector spaces and explain the representation of symplectic forms using dual pairings. We then define the symplectic Fenchel transform and the symplectic Fenchel–Young inequality in Section 3. The definitions of symplectic Fenchel–Young divergences (Definition 4) and symplectic Bregman divergences (Definition 5) are reported in Section 4. In particular, we show how to recover Bregman divergences with respect to composite inner products as special cases in Section 5 (Property 1). In general, symplectic Bregman divergences allow one to define Bregman divergences in dual systems equipped with pairing products. Finally, we recall the role of Bregman divergences in dually flat manifolds of information geometry in Section 6, and motivate the introduction of symplectic Bregman divergences in geometric mechanics (e.g., symplectic BEN principle of [13,14]) and learning dynamics in machine learning.

2. Dual Systems, Linear Symplectic Forms, and Symplectomorphisms

2.1. Symplectic Forms Derived from Dual Systems

We begin with two definitions:

Definition 1 

(Dual system). Let $X$ and $Y$ be finite-dimensional vector spaces [23] equipped with a pairing product $b(\cdot,\cdot)$, i.e., a bilinear map

$$b(\cdot,\cdot):X\times Y\rightarrow\mathbb{R},$$

such that all continuous linear functionals on $Y$ and $X$ are expressed as $x^{\#}(\cdot)=b(x,\cdot)$ and $y^{\#}(\cdot)=b(\cdot,y)$, respectively. The triplet $(X,Y,b(\cdot,\cdot))$ forms a dual system.

(Notice that when the type of $X$ differs from the type of $Y$, the bilinear map cannot be symmetric).

Definition 2 

(Symplectic vector space). A symplectic vector space $(V,\omega)$ is a vector space equipped with a map [24] $\omega:Z=V\times V\rightarrow\mathbb{R}$ which is

  • 1. bilinear: $\forall\alpha,\beta,\alpha',\beta'\in\mathbb{R}$, $\forall z_1,z_1',z_2,z_2'\in Z$, we have
$$\omega(\alpha z_1+\alpha' z_1',\ \beta z_2+\beta' z_2')=\alpha\beta\,\omega(z_1,z_2)+\alpha\beta'\,\omega(z_1,z_2')+\alpha'\beta\,\omega(z_1',z_2)+\alpha'\beta'\,\omega(z_1',z_2'),$$

  • 2. skew-symmetric (or alternating): $\omega(z_2,z_1)=-\omega(z_1,z_2)$, and

  • 3. non-degenerate: if for some $z_0$ we have $\omega(z,z_0)=0$ for all $z\in Z$, then $z_0=0$.

Notice that skew-symmetry implies that $\omega(z,z)=0$ for all $z\in Z$ since $\omega(z,z)=-\omega(z,z)$ and hence $2\omega(z,z)=0$. The map $\omega$ is called a linear symplectic form [24,25].

We define the symplectic form ω induced by the pairing product of a dual system as follows:

$$\omega(z_1,z_2)=b(x_1,y_2)-b(x_2,y_1), \tag{3}$$

where $z_1=(x_1,y_1)$ and $z_2=(x_2,y_2)$ belong to $Z=X\times Y$.

Let us report several examples of linear symplectic forms:

  • Let $X=V$ be a finite $n$-dimensional vector space with dual space of linear functionals $Y=V^*$ (space of covectors $l$). The natural pairing $\langle\langle x,l\rangle\rangle=l(x)=\sum_i x^i l_i$ of a vector $x\in V$ with a covector $l\in V^*$ is an example of a dual product. (We use superscript indices for the components of contravariant vectors and subscript indices for the components of covariant vectors [9]). We define the symplectic form $\omega$ induced by the natural pairing of vectors with covectors as follows:
    $$\omega(z_1,z_2)=\langle\langle x_1,l_2\rangle\rangle-\langle\langle x_2,l_1\rangle\rangle=l_2(x_1)-l_1(x_2), \tag{4}$$
    where $z_1=(x_1,l_1)$ and $z_2=(x_2,l_2)$ belong to $Z=V\times V^*$.
  • Consider an inner product space $(X,\langle\cdot,\cdot\rangle)$ of dimension $n$. The product space $Z=X\times X$ of even dimension $2n$ can be equipped with the following map $\omega:Z\times Z\rightarrow\mathbb{R}$ induced by the inner product:
    $$\omega(z_1,z_2)=\langle x_1,y_2\rangle-\langle x_2,y_1\rangle, \tag{5}$$
    where $z_1=(x_1,y_1)\in Z$ and $z_2=(x_2,y_2)\in Z$.

For example, let $X=\mathbb{R}$ and $\langle x_1,x_2\rangle=x_1x_2$. Then $\omega(z_1,z_2)=x_1y_2-x_2y_1$. This symplectic form can be interpreted as the determinant of the matrix $M=\begin{bmatrix}x_1 & x_2\\ y_1 & y_2\end{bmatrix}$, which corresponds geometrically to the signed area of the oriented parallelogram defined by the vectors $z_1=(x_1,y_1)$ and $z_2=(x_2,y_2)$. See Figure 2. (This example indicates the link with integration on 2D manifolds equipped with smoothly varying fields of symplectic forms called differential 2-forms [9]).

Figure 2. Interpreting a 2D symplectic form $\omega(z_1,z_2)$ as the signed area of a parallelogram with first oriented edge $z_1$ (grey). A pair of vectors defines two possible orientations of the parallelogram: The orientation compatible with $z_1$ and the reverse orientation compatible with $z_2$. $\omega$ is called the standard area form.
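The 2D example above can be evaluated directly; this small sketch (not from the paper, with illustrative values) computes the standard area form and exhibits its skew-symmetry:

```python
def omega(z1, z2):
    """Standard symplectic (area) form on R^2: x1*y2 - x2*y1,
    i.e., the determinant of the matrix [[x1, x2], [y1, y2]]."""
    (x1, y1), (x2, y2) = z1, z2
    return x1 * y2 - x2 * y1

z1, z2 = (2.0, 0.0), (1.0, 3.0)
area = omega(z1, z2)      # signed area of the parallelogram spanned by z1, z2
print(area)               # 6.0
print(omega(z2, z1))      # -6.0: reversing the edge order flips the orientation
```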

In a finite-dimensional vector space, we can express the inner product as $\langle x,y\rangle=x^\top Q y$ for a symmetric positive-definite matrix $Q\in\mathbb{R}^{n\times n}$. Let $Q=L^\top L$ be the Cholesky decomposition of $Q$. Then we have

$$\langle x,y\rangle=(Lx)^\top I\,(Ly)=\langle Lx,Ly\rangle_0,$$

where $I$ is the $n\times n$ identity matrix and $\langle x,y\rangle_0=x^\top y$ is the Euclidean inner product. Thus the form $\omega_0$ induced by $\langle\cdot,\cdot\rangle_0$ can be expressed using linear algebra as

$$\omega_0(z_1,z_2)=z_1^\top\begin{bmatrix}0 & I\\ -I & 0\end{bmatrix}z_2=z_1^\top\Omega_0 z_2,$$

where $\Omega_0\in\mathbb{R}^{2n\times 2n}$ is a skew-symmetric matrix: $\Omega_0^\top=-\Omega_0$. More generally, we may consider skew-symmetric matrices of the form $\Omega=\begin{bmatrix}0 & Q\\ -Q & 0\end{bmatrix}$ (with $Q=L^\top L$) to define the symplectic form $\omega_Q$ induced by the inner product $\langle x,y\rangle_Q=x^\top Q y$.
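A quick sketch of this construction in the scalar case $n=1$ (my own illustration; $q$ and the test points are hypothetical values): the inner product is $\langle x,y\rangle_Q=qxy$ with Cholesky factor $L=\sqrt{q}$, and the induced form agrees with its skew-symmetric matrix representation:

```python
import math

q = 2.0                 # "Q" in the n = 1 case: a positive scalar
L = math.sqrt(q)        # Cholesky factor: q = L * L

def inner_Q(x, y):
    """<x,y>_Q = q*x*y."""
    return q * x * y

def omega_Q(z1, z2):
    """Symplectic form induced by <.,.>_Q on Z = X x X."""
    (x1, y1), (x2, y2) = z1, z2
    return inner_Q(x1, y2) - inner_Q(x2, y1)

def omega_matrix(z1, z2):
    """Same form via the skew-symmetric block matrix [[0, q], [-q, 0]]."""
    (x1, y1), (x2, y2) = z1, z2
    return x1 * q * y2 - y1 * q * x2

z1, z2 = (0.5, -1.0), (2.0, 0.25)
print(omega_Q(z1, z2), omega_matrix(z1, z2))             # equal
print(inner_Q(0.3, -0.7), (L * 0.3) * (L * -0.7))        # <x,y>_Q = <Lx,Ly>_0
```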

2.2. Linear Symplectomorphisms and the Groups of Symplectic Matrices

A symplectic form $\omega$ can be expressed as a $2n\times 2n$ matrix $\Omega=[\omega_{ij}]$ with $\omega_{ij}=\omega(b_i,b_j)$, where $b_1=e_1,\ldots,b_n=e_n,b_{n+1}=f_1,\ldots,b_{2n}=f_n$ are the basis vectors, so that $\omega(z_1,z_2)=z_1^\top\Omega z_2$.

The Darboux basis [2] of the canonical form $\omega_0$ of $\mathbb{R}^{2n}$ is such that $\omega(e_i,f_j)=\delta_{ij}$ and $\omega(e_i,e_j)=\omega(f_i,f_j)=0$, where $\delta$ denotes the Kronecker delta function. The canonical form $\omega_0$ of $\mathbb{R}^{2n}$ corresponds to the symplectic matrix $\Omega_0=\begin{bmatrix}0 & I\\ -I & 0\end{bmatrix}\in\mathrm{Sp}(2n)$.

A transformation $t:V\rightarrow V$ is called a linear symplectomorphism when $\omega(t(z_1),t(z_2))=\omega(z_1,z_2)$ (i.e., $t^*\omega=\omega$), i.e., when $T^\top\Omega T=\Omega$ where $T$ is the matrix representation of $t$. In particular, $t$ is a linear symplectomorphism with respect to $\omega_0$ when $T^\top\Omega_0 T=\Omega_0$. Any symplectic vector space $(V,\omega)$ of dimension $2n$ is symplectomorphic to the canonical symplectic space $(\mathbb{R}^{2n},\omega_0)$.

Linear symplectomorphisms can be represented by symplectic matrices of the symplectic group [11,26] Sp(2n):

$$\mathrm{Sp}(2n)=\left\{T\in\mathrm{GL}(2n)\ :\ T^\top\Omega_0 T=\Omega_0\right\}=\left\{T=\begin{bmatrix}A & B\\ C & D\end{bmatrix}\ :\ A^\top C-C^\top A=0,\ A^\top D-C^\top B=I,\ B^\top D-D^\top B=0\right\}.$$

The transpose and inverse of a symplectic matrix are symplectic matrices. The inverse of a symplectic matrix $T$ is given by

$$T^{-1}=\Omega_0^{-1}T^\top\Omega_0=\begin{bmatrix}D^\top & -B^\top\\ -C^\top & A^\top\end{bmatrix}.$$

Symplectic matrices of $\mathrm{Sp}(2n)$ have unit determinant ($\mathrm{Sp}(2n)\subset\mathrm{SL}(2n)\subset\mathrm{GL}(2n)$), and in the particular case $n=1$, $\mathrm{Sp}(2)$ corresponds precisely to the set of $2\times 2$ matrices with unit determinant. Thus the rotation matrices of $\mathrm{SO}(2)$, which have unit determinant, form a subgroup of $\mathrm{Sp}(2)$.
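These facts are easy to verify numerically for $n=1$; the sketch below (not from the paper) checks that a rotation matrix satisfies $T^\top\Omega_0 T=\Omega_0$ and that $\Omega_0^{-1}T^\top\Omega_0$ inverts it (using $\Omega_0^{-1}=-\Omega_0$):

```python
import math

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def transpose(A):
    return [[A[j][i] for j in range(2)] for i in range(2)]

Omega0 = [[0.0, 1.0], [-1.0, 0.0]]

def is_symplectic(T, tol=1e-12):
    """Check the defining condition T^T Omega0 T = Omega0."""
    M = matmul(matmul(transpose(T), Omega0), T)
    return all(abs(M[i][j] - Omega0[i][j]) < tol for i in range(2) for j in range(2))

t = 0.7
R = [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]   # element of SO(2)
print(is_symplectic(R))   # True: SO(2) is a subgroup of Sp(2)

# Inverse formula: T^{-1} = Omega0^{-1} T^T Omega0 = -Omega0 T^T Omega0.
neg = lambda A: [[-a for a in row] for row in A]
Rinv = matmul(matmul(neg(Omega0), transpose(R)), Omega0)
Id = matmul(R, Rinv)
print(Id)   # identity matrix up to rounding
```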

Sesquilinear symplectic forms can also be defined on complex linear spaces [27].

3. Symplectic Fenchel Transform, Symplectic Subdifferentials, and Symplectic Fenchel–Young (in)Equality

Let $F:Z=X\times Y\rightarrow\mathbb{R}\cup\{+\infty\}$ be a convex lower semi-continuous (lsc) function called a potential function.

Definition 3 

(Symplectic Fenchel conjugate). The symplectic Fenchel conjugate $F^{*\omega}(z')$ is defined by

$$F^{*\omega}(z')=\sup_{z\in Z}\ \omega(z',z)-F(z).$$

Notice that since ω is skew-symmetric, the order of the arguments in ω is important: The symplectic Fenchel transform optimizes with respect to the second argument of ω(·,·).
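As a sanity check of this definition (my own sketch, not from the paper), the supremum can be approximated on a grid for the quadratic potential $F(z)=\|z\|^2/2$ on $\mathbb{R}^2$, for which a direct computation shows $F^{*\omega}=F$ (the maximizer is $z=-\Omega_0 z'$ and $\Omega_0^2=-I$):

```python
# Standard form on R^2, in the order used by the conjugate: omega(z', z).
omega = lambda zp, z: zp[0] * z[1] - z[0] * zp[1]
F = lambda z: 0.5 * (z[0] ** 2 + z[1] ** 2)

def sympl_conjugate(zp, grid):
    """F^{*omega}(z') = sup_z { omega(z', z) - F(z) }, approximated on a grid."""
    return max(omega(zp, z) - F(z) for z in grid)

# Coarse grid over [-3, 3]^2 with step 0.02 (contains the exact maximizer).
grid = [(i * 0.02, j * 0.02) for i in range(-150, 151) for j in range(-150, 151)]
zp = (0.8, -0.6)
approx = sympl_conjugate(zp, grid)
print(approx, F(zp))   # both are 0.5: F^{*omega} = F for this quadratic F
```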

The symplectic subdifferential of $F$ at $z$ is defined by

$$\partial^\omega F(z)=\left\{z_1\in Z\ :\ \forall z_2\in Z,\ F(z+z_2)\geq F(z)+\omega(z_1,z_2)\right\}.$$

The operator $\partial^\omega$ is set-valued: it maps a potential function $F$ and a point $z\in Z$ to a subset of $Z$. An element of the symplectic subdifferential of $F$ at $z$ is called a symplectic subgradient.

Remark 1. 

Moreau generalized the Fenchel conjugate using a cost function [28]. In particular, the duality induced by the logarithmic cost function was studied in [29], and led to a generalization of Bregman divergences called the logarithmic divergences, which are canonical divergences of constant sectional curvature manifolds in information geometry.

Remark 2. 

In geometric mechanics [2], the symplectic gradient on a symplectic manifold $(M,\omega)$ is the Hamiltonian vector field, i.e., the vector field $X_H$ such that the Hamiltonian mechanics equation is written concisely as $\omega(X_H,\cdot)=dH$.

Theorem 1 

(Symplectic Fenchel–Young inequality, Theorem 2.3 of [13,14]). Let $F(z)$ be a convex (i.e., $F(z)=F(x,y)$ is jointly convex, i.e., convex with respect to $z=(x,y)$) and lower semi-continuous function. Then the following inequality holds:

$$\forall z,z'\in Z,\quad F(z)+F^{*\omega}(z')\geq\omega(z',z),$$

with equality if and only if $z'\in\partial^\omega F(z)$.

Let us again notice that the argument order in ω(·,·) is important.

Assume that the potential functions are smooth and that the symplectic subdifferentials are singletons. By abuse of language, we shall call in this paper the single element of the symplectic subdifferential the symplectic gradient of $F$, and denote it by $\nabla^\omega F$: $\partial^\omega F(z)=\{\nabla^\omega F(z)\}$. (Our terminology and notation are thus not to be confused with the Hamiltonian vector field $X_H$ of geometric mechanics).

4. Symplectic Fenchel–Young Divergences and Symplectic Bregman Divergences

Divergences are smooth dissimilarity functions (see Section 4.2 of [30]). From the symplectic Fenchel–Young inequality of Theorem 1, we can define the symplectic Fenchel–Young divergence as follows:

Definition 4 

(Symplectic Fenchel–Young divergence). Let $F:Z=X\times Y\rightarrow\mathbb{R}$ be a smooth convex function. Then the symplectic Fenchel–Young divergence is the following non-negative measure of dissimilarity between $z$ and $z'$:

$$Y_F(z,z')=F(z)+F^{*\omega}(z')-\omega(z',z)\geq 0. \tag{6}$$

We have $Y_F(z,z')=0$ if and only if $z'\in\partial^\omega F(z)$, i.e., $z'=\nabla^\omega F(z)$ when $F$ is smooth.
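Both properties (non-negativity, and vanishing exactly at the symplectic gradient) can be probed numerically; this sketch (not from the paper) again uses the quadratic $F(z)=\|z\|^2/2$ on $\mathbb{R}^2$ with the standard form, for which a direct computation gives $F^{*\omega}=F$ and $\nabla^\omega F(x,y)=(y,-x)$:

```python
import random

omega = lambda zp, z: zp[0] * z[1] - z[0] * zp[1]   # omega(z', z), standard form
F = lambda z: 0.5 * (z[0] ** 2 + z[1] ** 2)

def young(z, zp):
    """Y_F(z, z') = F(z) + F^{*omega}(z') - omega(z', z), with F^{*omega} = F here."""
    return F(z) + F(zp) - omega(zp, z)

random.seed(0)
for _ in range(1000):
    z  = (random.uniform(-5, 5), random.uniform(-5, 5))
    zp = (random.uniform(-5, 5), random.uniform(-5, 5))
    assert young(z, zp) >= -1e-12            # non-negativity (6)
    assert young(z, (z[1], -z[0])) < 1e-9    # zero at z' = grad^omega F(z)
print("ok")
```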

Let us now define the symplectic Bregman divergence $B_F^\omega(z_1:z_2)$ as $Y_F(z_1,z_2')$ where $z_2'=\nabla^\omega F(z_2)$. Using the following identity derived from the symplectic Fenchel–Young equality:

$$F^{*\omega}(\nabla^\omega F(z))=\omega(\nabla^\omega F(z),z)-F(z),$$

and the bilinearity of the symplectic form, we obtain:

$$\begin{aligned}B_F^\omega(z_1:z_2)&=Y_F(z_1,z_2'),\\ &=F(z_1)+F^{*\omega}(z_2')-\omega(z_2',z_1),\\ &=F(z_1)+\omega(\nabla^\omega F(z_2),z_2)-F(z_2)-\omega(\nabla^\omega F(z_2),z_1),\\ &=F(z_1)-F(z_2)-\omega(\nabla^\omega F(z_2),z_1-z_2). \end{aligned}\tag{7}$$

Since $\omega$ is skew-symmetric, we can also rewrite Equation (7) equivalently as

$$B_F^\omega(z_1:z_2)=F(z_1)-F(z_2)+\omega(z_1-z_2,\nabla^\omega F(z_2)). \tag{8}$$

Definition 5 

(Symplectic Bregman divergence). Let $(Z=X\times Y,\omega)$ be a symplectic vector space. Then the symplectic Bregman divergence between $z_1$ and $z_2$ of $Z$ induced by a smooth convex potential $F(z)$ is

$$B_F^\omega(z_1:z_2)=F(z_1)-F(z_2)-\omega(\nabla^\omega F(z_2),z_1-z_2),$$

where the symplectic gradient $\nabla^\omega F(z)$ is the single element of the symplectic subdifferential: $\partial^\omega F(z)=\{\nabla^\omega F(z)\}$.

Remark 3. 

The ordinary Bregman divergences (BDs) have been generalized to non-smooth strictly convex potential functions using a subdifferential map in [31,32,33] to choose among several potential subgradients at a given location. Similarly, we can extend symplectic Bregman divergences to non-smooth strictly convex potential functions using a symplectic subdifferential map.

5. Particular Cases Recover Composite Bregman Divergences

When $Y=X$ and $(X,\langle\cdot,\cdot\rangle)$ is an inner-product space, we may consider the composite inner product on $Z=X\times X$:

$$\langle z_1,z_2\rangle=\langle x_1,x_2\rangle+\langle y_1,y_2\rangle,$$

with $z_1=(x_1,y_1)$ and $z_2=(x_2,y_2)$.

Let $I:Z\rightarrow Z$ be the identity map $I(z)=z$ and denote by $J:Z\rightarrow Z$ the linear map defined by

$$J(z)=J(x,y)=(-y,x).$$

Notice that this definition of $J$ makes sense because $X=Y$ and thus $(-y,x)\in Z$. We check that $J^2(x,y)=J(-y,x)=(-x,-y)$, i.e., $J^2=-I$. Furthermore, $g(z_1,z_2)=\omega(z_1,Jz_2)=\langle x_1,x_2\rangle+\langle y_1,y_2\rangle$ is a positive-definite inner product. That is, the automorphism $J$ is a complex structure compatible with $\omega$ ($J$ is a symplectomorphism).

We can express the symplectic form $\omega(z_1,z_2)=\langle x_1,y_2\rangle-\langle x_2,y_1\rangle$ induced by the inner product using the composite inner product as follows:

$$\omega(z_1,z_2)=\langle x_1,y_2\rangle-\langle x_2,y_1\rangle=\langle J(z_1),z_2\rangle,\qquad \omega(Jz_1,z_2)=-\langle z_1,z_2\rangle.$$

Similarly, the symplectic subdifferential of $F$ can be expressed using the ordinary subdifferential (and vice versa) as follows:

$$z'\in\partial^\omega F(z)\iff J(z')\in\partial F(z),\qquad z'\in\partial F(z)\iff J^{-1}(z')=-J(z')\in\partial^\omega F(z).$$

When the subdifferentials are singletons, we thus have

$$J(\nabla^\omega F(z))=\nabla F(z),\qquad \nabla^\omega F(z)=J^{-1}(\nabla F(z))=-J(\nabla F(z)).$$

Last, the symplectic Fenchel conjugate of $F$ is related to the ordinary Fenchel conjugate $F^*$ of $F$ as follows:

$$F^{*\omega}(z')=F^*(J(z')).$$

Thus in that case the symplectic Bregman divergence amounts to an ordinary Bregman divergence:

$$\begin{aligned}B_F^\omega(z_1:z_2)&=F(z_1)-F(z_2)-\omega(\nabla^\omega F(z_2),z_1-z_2),\\ &=F(z_1)-F(z_2)+\omega(z_1-z_2,\nabla^\omega F(z_2)),\\ &=F(z_1)-F(z_2)+\omega(J(\nabla F(z_2)),z_1-z_2),\\ &=F(z_1)-F(z_2)-\langle z_1-z_2,\nabla F(z_2)\rangle,\\ &=B_F(z_1:z_2). \end{aligned}$$

Property 1. 

When the symplectic form $\omega$ is induced by an inner product $\langle\cdot,\cdot\rangle$ of $X$, the symplectic Bregman divergence $B_F^\omega(z_1:z_2)$ between $z_1=(x_1,y_1)$ and $z_2=(x_2,y_2)$ of $Z=X\times X$ amounts to an ordinary Bregman divergence with respect to the composite inner product $\langle z_1,z_2\rangle=\langle x_1,x_2\rangle+\langle y_1,y_2\rangle$:

$$B_F^\omega(z_1:z_2)=B_F(z_1:z_2)=F(z_1)-F(z_2)-\langle z_1-z_2,\nabla F(z_2)\rangle.$$

Furthermore, if the potential function $F(z)$ is separable, i.e., $F(z)=F_1(x)+F_2(y)$ for Bregman generators $F_1$ and $F_2$, then we have $B_F^\omega(z_1:z_2)=B_{F_1}(x_1:x_2)+B_{F_2}(y_1:y_2)$, where the Bregman divergences $B_{F_1}$ and $B_{F_2}$ are defined with respect to the inner product of $X$.
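Property 1 can be verified numerically on $Z=\mathbb{R}\times\mathbb{R}$ with the standard form; this sketch (my own, using the hypothetical non-separable convex potential $F(x,y)=x^2+xy+y^2$) computes the symplectic Bregman divergence via $\nabla^\omega F=-J(\nabla F)$ and compares it with the ordinary Bregman divergence for the composite inner product:

```python
omega = lambda z1, z2: z1[0] * z2[1] - z2[0] * z1[1]   # standard form on R^2
F     = lambda z: z[0] ** 2 + z[0] * z[1] + z[1] ** 2  # non-separable, convex
gradF = lambda z: (2 * z[0] + z[1], z[0] + 2 * z[1])
# Symplectic gradient: grad^omega F(z) = -J(grad F(z)) with J(x, y) = (-y, x).
gradOmegaF = lambda z: (lambda g: (g[1], -g[0]))(gradF(z))

def sympl_bregman(z1, z2):
    """B_F^omega(z1:z2) = F(z1) - F(z2) + omega(z1 - z2, grad^omega F(z2))."""
    u = (z1[0] - z2[0], z1[1] - z2[1])
    return F(z1) - F(z2) + omega(u, gradOmegaF(z2))

def bregman(z1, z2):
    """Ordinary B_F with the composite inner product <z1,z2> = x1*x2 + y1*y2."""
    g = gradF(z2)
    return F(z1) - F(z2) - ((z1[0] - z2[0]) * g[0] + (z1[1] - z2[1]) * g[1])

z1, z2 = (1.0, -0.5), (0.25, 2.0)
print(sympl_bregman(z1, z2), bregman(z1, z2))   # equal, as stated by Property 1
```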

Notice that the symplectic Fenchel–Young inequality can be rewritten using the ordinary Fenchel–Young inequality and the linear map $J$ as:

$$F(z)+F^{*\omega}(z')\geq\omega(z',z)\iff F(z)+F^*(J(z'))\geq\langle J(z'),z\rangle.$$

6. Summary, Discussion, and Perspectives

Since their inception in operations research, Bregman divergences [19] have proven instrumental in many scientific fields including information theory, statistics, and machine learning, to cite a few. Let $(X,\langle\cdot,\cdot\rangle)$ be a Hilbert space, and $F:X\rightarrow\mathbb{R}$ a strictly convex and smooth real-valued function. Then the Bregman divergence induced by $F$ is defined in [19] (1967) by

$$B_F(x_1:x_2)=F(x_1)-F(x_2)-\langle x_1-x_2,\nabla F(x_2)\rangle.$$

In this work, we consider finite-dimensional vector spaces equipped with an inner product.

In information geometry [22,34,35], a smooth dissimilarity $D(p,q)$ between two points $p$ and $q$ on an $n$-dimensional smooth manifold $M$ induces a statistical structure on the manifold [36], i.e., a triplet $(g,\nabla,\nabla^*)$ where the Riemannian metric tensor $g$ and the torsion-free affine connections $\nabla$ and $\nabla^*$ are induced by the divergence $D$. The duality in information geometry is expressed by the fact that the mid-connection $\frac{\nabla+\nabla^*}{2}$ corresponds to the Levi-Civita connection induced by $g$. To build the divergence-based information geometry [37], the divergence $D(p,q)$ is interpreted as a scalar function on the product manifold $M\times M$ of dimension $2n$. Thus, the divergence $D$ is called a contrast function [36] or yoke [38]. Conversely, a statistical structure $(g,\nabla,\nabla^*)$ on an $n$-dimensional manifold $M$ induces a contrast function [39]. When the statistical manifold $(M,g,\nabla,\nabla^*)$ is dually flat, with $\theta(\cdot)$ the global $\nabla$-affine coordinate system and $\eta(\cdot)$ the global $\nabla^*$-affine coordinate system [40], there exist two dual global potential functions $\phi$ and $\phi^*$ on the manifold $M$ such that $\phi(p)=F(\theta(p))$ and $\phi^*(p)=F^*(\eta(p))$, where $F^*(\eta)$ is the Legendre–Fenchel convex conjugate of $F(\theta)$. The canonical dually flat divergence on $M$ is then defined by

$$D(p,q)=F(\theta(p))+F^*(\eta(q))-\sum_{i=1}^n\theta^i(p)\,\eta_i(q),$$

and amounts to a Fenchel–Young divergence, or equivalently a Bregman divergence:

$$D(p,q)=Y_F(\theta(p):\eta(q))=B_F(\theta(p):\theta(q)),$$

where the Fenchel–Young divergence is defined by

$$Y_F(\theta:\eta)=F(\theta)+F^*(\eta)-\sum_{i=1}^n\theta^i\eta_i.$$

The Riemannian metric $g$ of a dually flat space can be expressed as $g=\nabla d\phi=\nabla^* d\phi^*$, or in the $\theta$-coordinates by $g_{ij}(\theta)=\frac{\partial^2}{\partial\theta^i\partial\theta^j}F(\theta)$ and in the $\eta$-coordinates by $g^{ij}(\eta)=\frac{\partial^2}{\partial\eta_i\partial\eta_j}F^*(\eta)$. That is, $g$ is a Hessian metric [40], $(g,\nabla)$ a Hessian structure and $(g,\nabla^*)$ a dual Hessian structure. In differential geometry, $(M,g,\nabla)$ is called a Hessian manifold which admits a dual Hessian structure $(g,\nabla^*)$. In particular, a Hessian manifold is of Koszul type [40] when there exists a closed 1-form $\alpha$ such that $g=\nabla\alpha$.
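The identity $D(p,q)=Y_F(\theta(p):\eta(q))=B_F(\theta(p):\theta(q))$ can be illustrated numerically; this sketch (not from the paper) uses the hypothetical 1D generator $F(\theta)=e^\theta$, for which $\eta=F'(\theta)=e^\theta$ and $F^*(\eta)=\eta\log\eta-\eta$ in closed form:

```python
import math

F      = math.exp                       # generator F(theta) = exp(theta)
eta_of = math.exp                       # gradient map theta -> eta
Fstar  = lambda e: e * math.log(e) - e  # Legendre conjugate F*(eta)

def young(theta, eta):
    """Fenchel-Young divergence Y_F(theta : eta) in mixed coordinates."""
    return F(theta) + Fstar(eta) - theta * eta

def bregman(theta1, theta2):
    """Bregman divergence B_F(theta1 : theta2) in primal coordinates."""
    return F(theta1) - F(theta2) - (theta1 - theta2) * eta_of(theta2)

t1, t2 = 0.4, -1.2
# Canonical dually flat divergence: mixed-coordinate Y_F equals B_F.
print(young(t1, eta_of(t2)), bregman(t1, t2))
```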

Remark 4. 

Notice that the potential functions $F$ and $F^*$ are not defined uniquely, although the potential functions $\phi$ and $\phi^*$ on the manifold are. Indeed, consider the generator $\bar{F}(\theta)=F(A\theta+b)+\langle c,\theta\rangle+d$ for an invertible matrix $A\in\mathrm{GL}(d,\mathbb{R})$, vectors $b,c\in\mathbb{R}^d$ and a scalar $d\in\mathbb{R}$. The gradient of the generator $\bar{F}$ is $\bar\eta=\nabla\bar{F}(\theta)=A^\top\nabla F(A\theta+b)+c$. Solving the equation $\nabla\bar{F}(\theta)=\bar\eta$ yields the reciprocal gradient $\theta(\bar\eta)=A^{-1}\left((\nabla F)^{-1}(A^{-\top}(\bar\eta-c))-b\right)$, from which the Legendre convex conjugate is obtained as $\bar{F}^*(\bar\eta)=\langle\bar\eta,\theta(\bar\eta)\rangle-\bar{F}(\theta(\bar\eta))$. We have $B_F(\theta_1:\theta_2)=B_{\bar{F}}(\bar\theta_1:\bar\theta_2)$ where $\bar\theta=A^{-1}(\theta-b)$.

It has been shown that a divergence $D$ also allows one to define a symplectic structure $\omega$ on a statistical manifold [38,41]. The symplectic vector space $(\mathbb{R}^{2n},\omega_0)$ viewed as a symplectic manifold has symplectic form $\omega_0=\sum_{i=1}^n dx_i\wedge dy_i=-d\left(\sum_{i=1}^n y_i\,dx_i\right)$. There are no local invariants but only global invariants on symplectic manifolds (symplectic topology). That is, a symplectic structure is flat.

In this expository paper, we have defined symplectic Fenchel–Young divergences and equivalent symplectic Bregman divergences by following the study of geometric mechanics reported in [13,14]. The symplectic Bregman divergence between two points $z_1$ and $z_2$ of a symplectic vector space $(Z,\omega)$ induced by a convex potential function $F:Z\rightarrow\mathbb{R}$ is defined by

$$B_F^\omega(z_1:z_2)=F(z_1)-F(z_2)+\omega(z_1-z_2,\nabla^\omega F(z_2)),$$

where $\nabla^\omega F$ has been called the symplectic gradient in this paper, and is assumed to be the unique symplectic subgradient at any $z\in Z$, i.e., $\partial^\omega F(z)=\{\nabla^\omega F(z)\}$. Symplectic Bregman divergences can be used to define Bregman divergences on dual systems (Figure 3). In the particular case of the dual system $(X,X,\langle\cdot,\cdot\rangle)$, we recover ordinary Bregman divergences with composite inner products.

Figure 3. Bregman divergences generalized to dual systems $(X,Y,b(\cdot,\cdot))$: A symplectic form $\omega$ on the space $Z=X\times Y$ is induced by the pairing product. The Bregman divergence on the dual system is then defined as the symplectic Bregman divergence on the symplectic vector space $(Z,\omega)$.

In finite 2n-dimensional symplectic vector spaces, linear symplectic forms ω can be represented by symplectic matrices of the matrix group Sp(2n). Buliga and de Saxcé [13,14] considered geometric mechanics with dissipative terms, and stated the following “symplectic Brezis–Ekeland–Nayroles principle” (SBEN principle for short):

Definition 6 

(SBEN principle [13,14]). The natural evolution path $z(t)=z_{\mathrm{rev}}(t)+z_{\mathrm{irr}}(t)\in Z$ for $t\in[0,T]$ of a geometric mechanical system with convex dissipation potential $\phi(z)$ minimizes $\int_0^T Y_\phi^\omega(\dot{z}(t),\dot{z}_{\mathrm{irr}}(t))\,dt$ among all admissible paths, and satisfies $Y_\phi^\omega(\dot{z}(t),\dot{z}_{\mathrm{irr}}(t))=0$ for all $t\in[0,T]$, where $Y_\phi^\omega$ denotes the symplectic Fenchel–Young divergence induced by $\phi$, and $z_{\mathrm{rev}}(t)$ and $z_{\mathrm{irr}}(t)$ are the reversible and irreversible parts of the particle $z(t)$, respectively.

The decomposition of $z=z_{\mathrm{rev}}+z_{\mathrm{irr}}$ into two parts can be interpreted as Moreau's proximation [42,43] associated to the potential function $\phi$: Indeed, let $F(z)$ be a convex function on $Z=\mathbb{R}^d$. Then for all $z\in\mathbb{R}^d$, we can uniquely decompose $z$ as $z=z'+z^*$ such that $F(z')+F^*(z^*)=\langle z',z^*\rangle$ (Fenchel–Young equality), where $z^*=\nabla F(z')$ (see the Proposition in Section 4 of [42]). The part $z'$ is called the proximation of $z$ with respect to $F$, and the part $z^*$ is the proximation with respect to the convex conjugate $F^*$.
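Moreau's decomposition is easy to exhibit in closed form for a quadratic; this sketch (my own, with hypothetical values) uses $F(z)=\lambda z^2/2$ on $\mathbb{R}$, whose conjugate is $F^*(y)=y^2/(2\lambda)$, so that $z'=z/(1+\lambda)$ and $z^*=\lambda z/(1+\lambda)=F'(z')$:

```python
# Quadratic potential F(z) = lam * z^2 / 2 on R, conjugate F*(y) = y^2 / (2*lam).
lam = 3.0
F     = lambda z: 0.5 * lam * z * z
Fstar = lambda y: y * y / (2.0 * lam)

z = 1.7
z1 = z / (1.0 + lam)          # proximation w.r.t. F  (solves z1 + F'(z1) = z)
z2 = lam * z / (1.0 + lam)    # proximation w.r.t. F*; note z2 = F'(z1)
print(z1 + z2, z)                      # the two parts sum back to z
print(F(z1) + Fstar(z2) - z1 * z2)     # Fenchel-Young equality holds: 0
```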

We may also consider the non-separable potential functions $F(z)=F(x,y)=x\,f(y/x)$ (for $x>0$) obtained from the perspective transform [44,45] of arbitrary convex functions $f(u)$ to define symplectic Bregman divergences. The perspective functions $F(x,y)$ are jointly convex if and only if their corresponding generators $f$ are convex. Such perspective transforms play a fundamental role in information theory [46] and information geometry [22].

In machine learning, symplectic geometry has been used for designing accelerated optimization methods [16,47] (Bregman–Lagrangian framework) and physics-informed neural networks [17,18] (PINNs).

This paper aims to spur interest in either designing or defining symplectic divergences from first principles, and to demonstrate their roles when studying thermodynamics [48] or the learning dynamics of ML and AI systems.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Conflicts of Interest

Author Frank Nielsen is employed by the company Sony Computer Science Laboratories Inc. The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest. The author declares no conflicts of interest.

Funding Statement

This research received no external funding.

Footnotes

Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

References

  • 1.McDuff D. Symplectic structures—A new approach to geometry. Not. AMS. 1998;45:952–960. [Google Scholar]
  • 2.Da Silva A.C. Lectures on Symplectic Geometry. Volume 3575 Springer; Berlin/Heidelberg, Germany: 2001. [Google Scholar]
  • 3.Libermann P., Marle C.M. Symplectic Geometry and Analytical Mechanics. Volume 35 Springer Science & Business Media; Berlin/Heidelberg, Germany: 2012. [Google Scholar]
  • 4.de Lagrange J.L. Mémoire sur la théorie des variations des éléments des planétes, et en particulier des variations des grands axes de leurs orbites. Paris. 1808;VI:713–768. [Google Scholar]
  • 5.Lagrange J.L. Second mémoire sur la théorie générale de la variation des constantes arbitraires dans tous les problemes de la mécanique. Mémoires Prem. Cl. l’Institut Fr. 1810;19:809–816. [Google Scholar]
  • 6.Marle C.M. The inception of symplectic geometry: The works of Lagrange and Poisson during the years 1808–1810. Lett. Math. Phys. 2009;90:3–21. doi: 10.1007/s11005-009-0347-y. [DOI] [Google Scholar]
  • 7.Lagrange J.L. Mécanique Analytique. Mallet-Bachelier; Paris, France: 1811. First Published by La Veuve Desaint, Paris in French in 1788 by Joseph-Louis De La Grange with title “Méchanique analitique”. [Google Scholar]
  • 8.Lagrange J.L. Analytical Mechanics. Volume 191 Springer Science & Business Media; Berlin/Heidelberg, Germany: 2013. First Published in French in 1811. [Google Scholar]
  • 9.Godinho L., Natário J. An Introduction to Riemannian Geometry: With Applications. Springer; Berlin/Heidelberg, Germany: 2012. [Google Scholar]
  • 10.Gotay M.J., Isenberg G. The symplectization of science. Gaz. Mathématiciens. 1992;54:59–79. (In French) [Google Scholar]
  • 11.Weyl H. The Classical Groups: Their Invariants and Representations. Princeton University Press; Princeton, NJ, USA: 1946. Number 1. [Google Scholar]
  • 12.Souriau J.M. Structure of Dynamical Systems: A Symplectic View of Physics. Volume 149 Springer Science & Business Media; Berlin/Heidelberg, Germany: 1997. [Google Scholar]
  • 13.Buliga M., de Saxcé G. A symplectic Brezis–Ekeland–Nayroles principle. Math. Mech. Solids. 2017;22:1288–1302. doi: 10.1177/1081286516629532. [DOI] [Google Scholar]
  • 14.de Saxcé G. Information Geometry. Springer; Berlin/Heidelberg, Germany: 2024. A variational principle of minimum for Navier–Stokes equation and Bingham fluids based on the symplectic formalism; pp. 1–22. [Google Scholar]
  • 15.Audin M. Contact and Symplectic Topology. Springer; Berlin/Heidelberg, Germany: 2014. Vladimir Igorevich Arnold and the invention of symplectic topology; pp. 1–25. [Google Scholar]
  • 16.Jordan M.I. Dynamical, symplectic and stochastic perspectives on gradient-based optimization; Proceedings of the International Congress of Mathematicians: Rio de Janeiro 2018; Rio de Janeiro, Brazil. 1–9 August 2018; Singapore: World Scientific; 2018. pp. 523–549. [Google Scholar]
  • 17.Chen Y., Matsubara T., Yaguchi T. Neural symplectic form: Learning Hamiltonian equations on general coordinate systems. Adv. Neural Inf. Process. Syst. 2021;34:16659–16670. [Google Scholar]
  • 18.Matsubara T., Miyatake Y., Yaguchi T. Symplectic adjoint method for exact gradient of neural ODE with minimal memory. Adv. Neural Inf. Process. Syst. 2021;34:20772–20784. [Google Scholar]
  • 19.Bregman L.M. The relaxation method of finding the common point of convex sets and its application to the solution of problems in convex programming. USSR Comput. Math. Math. Phys. 1967;7:200–217. doi: 10.1016/0041-5553(67)90040-7. [DOI] [Google Scholar]
  • 20.Rockafellar R.T. Conjugates and Legendre transforms of convex functions. Can. J. Math. 1967;19:200–205. doi: 10.4153/CJM-1967-012-4. [DOI] [Google Scholar]
  • 21.Bauschke H.H., Borwein J.M., Combettes P.L. Essential smoothness, essential strict convexity, and Legendre functions in Banach spaces. Commun. Contemp. Math. 2001;3:615–647. doi: 10.1142/S0219199701000524. [DOI] [Google Scholar]
  • 22.Amari S.i. Information Geometry and Its Applications. Springer; Tokyo, Japan: 2016. Applied Mathematical Sciences. [Google Scholar]
  • 23.Horváth J. Topological Vector Spaces and Distributions. Courier Corporation; New York, NY, USA: 2013. [Google Scholar]
  • 24.McInerney A. Undergraduate Texts in Mathematics. Springer; New York, NY, USA: 2013. First Steps in Differential Geometry: Riemannian, Contact, Symplectic. [Google Scholar]
  • 25.Bourguignon J.P. Variational Calculus. Springer; Berlin/Heidelberg, Germany: 2022. [Google Scholar]
  • 26.Siegel C.L. Symplectic Geometry. Elsevier; Amsterdam, The Netherlands: 1964. [Google Scholar]
  • 27.Everitt W., Markus L. Complex symplectic geometry with applications to ordinary differential operators. Trans. Am. Math. Soc. 1999;351:4905–4945. doi: 10.1090/S0002-9947-99-02418-6. [DOI] [Google Scholar]
  • 28.Moreau J.J. Inf-convolution, sous-additivité, convexité des fonctions numériques. [(accessed on 25 August 2024)];J. Mathématiques Pures Appliquées. 1970 Available online: https://hal.science/hal-02162006/ [Google Scholar]
  • 29.Wong T.K.L. Logarithmic divergences from optimal transport and Rényi geometry. Inf. Geom. 2018;1:39–78. doi: 10.1007/s41884-018-0012-6. [DOI] [Google Scholar]
  • 30.Leok M., Zhang J. Connecting information geometry and geometric mechanics. Entropy. 2017;19:518. doi: 10.3390/e19100518. [DOI] [Google Scholar]
  • 31.Kiwiel K.C. Free-steering relaxation methods for problems with strictly convex costs and linear constraints. Math. Oper. Res. 1997;22:326–349. doi: 10.1287/moor.22.2.326. [DOI] [Google Scholar]
  • 32.Gordon G.J. Ph.D. Thesis. Carnegie Mellon University; Pittsburgh, PA, USA: 1999. Approximate Solutions to Markov Decision Processes. [Google Scholar]
  • 33.Iyer R., Bilmes J.A. Submodular-Bregman and the Lovász-Bregman divergences with applications. Adv. Neural Inf. Process. Syst. 2012;25:2933–2941. [Google Scholar]
  • 34.Nielsen F. An elementary introduction to information geometry. Entropy. 2020;22:1100. doi: 10.3390/e22101100. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 35.Nielsen F. The many faces of information geometry. Not. Am. Math. Soc. 2022;69:36–45. doi: 10.1090/noti2403. [DOI] [Google Scholar]
  • 36.Eguchi S. A differential geometric approach to statistical inference on the basis of contrast functionals. Hiroshima Math. J. 1985;15:341–391. doi: 10.32917/hmj/1206130775. [DOI] [Google Scholar]
  • 37.Amari S.i., Cichocki A. Information geometry of divergence functions. Bull. Pol. Acad. Sci. Tech. Sci. 2010;58:183–195. doi: 10.2478/v10175-010-0019-1. [DOI] [Google Scholar]
  • 38.Barndorff-Nielsen O.E., Jupp P.E. Annales de la Faculté des Sciences de Toulouse: Mathématiques. Volume 6. Université Paul Sabatier; Toulouse, France: 1997. Statistics, yokes and symplectic geometry; pp. 389–427. [Google Scholar]
  • 39.Matumoto T. Any statistical manifold has a contrast function: On the C3-functions taking the minimum at the diagonal of the product manifold. Hiroshima Math. J. 1993;23:327–332. doi: 10.32917/hmj/1206128255. [DOI] [Google Scholar]
  • 40.Shima H. The Geometry of Hessian Structures. World Scientific; Singapore: 2007. [Google Scholar]
  • 41.Zhang J. Geometric Theory of Information. Springer; Berlin/Heidelberg, Germany: 2014. Divergence functions and geometric structures they induce on a manifold; pp. 1–30. [Google Scholar]
  • 42.Moreau J.J. Proximité et dualité dans un espace hilbertien. Bull. Société Mathématique Fr. 1965;93:273–299. doi: 10.24033/bsmf.1625. [DOI] [Google Scholar]
  • 43.Rockafellar R. Integrals which are convex functionals. Pac. J. Math. 1968;24:525–539. doi: 10.2140/pjm.1968.24.525. [DOI] [Google Scholar]
  • 44.Dacorogna B., Maréchal P. The role of perspective functions in convexity, polyconvexity, rank-one convexity and separate convexity. J. Convex Anal. 2008;15:271–284. [Google Scholar]
  • 45.Combettes P.L. Perspective functions: Properties, constructions, and examples. Set-Valued Var. Anal. 2018;26:247–264. doi: 10.1007/s11228-017-0407-x. [DOI] [Google Scholar]
  • 46.Csiszár I., Shields P.C. Information theory and statistics: A tutorial. Found. Trends® Commun. Inf. Theory. 2004;1:417–528. doi: 10.1561/0100000004. [DOI] [Google Scholar]
  • 47.Shi B., Du S.S., Su W., Jordan M.I. Acceleration via symplectic discretization of high-resolution differential equations. Adv. Neural Inf. Process. Syst. 2019;32:5744–5752. [Google Scholar]
  • 48.Barbaresco F. Handbook of Statistics. Volume 46. Elsevier; Amsterdam, The Netherlands: 2022. Symplectic theory of heat and information geometry; pp. 107–143. [Google Scholar]
