Notes on Differential Geometry and Lie Groups

Universal Connection and Curvature for Statistical Manifold Geometry

2005

Statistical manifolds are representations of smooth families of probability density functions that allow differential geometric methods to be applied to problems in stochastic processes, mathematical statistics and information theory. It is common to have to consider a number of linear connections on a given statistical manifold and so it is important to know the corresponding universal connection and curvature; then all linear connections and their curvatures are pullbacks. An important class of statistical manifolds is that arising from the exponential families and one particular family is that of gamma distributions, which we showed recently to have important uniqueness properties in stochastic processes. Here we provide formulae for universal connections and curvatures on exponential families and give an explicit example for the manifold of gamma distributions.
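As a concrete illustration of the exponential-family machinery underlying this abstract (a minimal sketch under standard conventions, not code from the paper), the Fisher information metric of the gamma family can be computed symbolically as the Hessian of its log-normalizer; the natural coordinates theta1 = -beta and theta2 = nu below are assumptions of this sketch.

import sympy as sp

# Natural parameters of the gamma family: theta1 = -beta < 0, theta2 = nu > 0,
# sufficient statistics (x, log x), and log-normalizer
# psi(theta) = log Gamma(theta2) - theta2 * log(-theta1).
theta1 = sp.symbols('theta1', negative=True)
theta2 = sp.symbols('theta2', positive=True)

psi = sp.loggamma(theta2) - theta2 * sp.log(-theta1)

# Fisher information metric = Hessian of the log-normalizer in natural coordinates.
G = sp.simplify(sp.hessian(psi, (theta1, theta2)))
print(G)  # Matrix([[theta2/theta1**2, -1/theta1], [-1/theta1, polygamma(1, theta2)]])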

Geometric Structures for Lie Group Machine Learning: Natural Gradient, Koszul-Souriau-Fisher Metric and Lie Groups Thermodynamics

GDR CNRS ISIS Deep Learning Theory, 2019

This talk concerns the elementary geometric structures of machine learning, based on the "natural gradient" of Information Geometry, which makes the learning gradient of neural networks invariant under reparametrization by means of the Fisher matrix. After an introduction presenting the natural gradient [1] and its recent extensions to deep neural networks [2], we develop new methods to extend the approach to more abstract spaces, in particular (matrix) Lie groups. Deep learning has recently been extended with success to graphs, but the emerging theme of "(Matrix) Lie Group Machine Learning" [14][18][15][16][20][21] is a particularly interesting extension for industrial applications: recognition of motions/kinematics (time series of elements of the group SE(3)), recognition of articulated postures/gestures [3] (time series of vectors of elements of the group SO(3)), micro-Doppler recognition [12] (time series of elements of the group SU(1,1)), and robotics (elements of subgroups of the affine group Aff(n)). We present the extension of the notion of Fisher metric by the mathematician Jean-Louis Koszul [4][6][17] to sharp convex cones. For the extension of machine learning to Lie groups, we present tools from statistical physics through the physicist Jean-Marie Souriau's model of "Lie groups thermodynamics" [7][8][19][22], based on symplectic geometry (the moment map, the KKS "Kirillov-Kostant-Souriau" 2-form in the non-equivariant case [5], the Souriau symplectic cocycle, and the coadjoint orbit method [9] from Kirillov's representation theory of Lie groups [10][11]). We conclude with machine-learning illustrations for canonical examples of classical matrix Lie groups such as SU(1,1) [13][23] (equivariant case with vanishing cohomology) and SE(3) (non-equivariant case with non-vanishing cohomology) [13].
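As a small numerical sketch of the natural-gradient update mentioned above (the toy two-parameter Gaussian model and the placeholder gradient are assumptions of this sketch, not material from the talk):

import numpy as np

def natural_gradient_step(theta, grad, fisher, lr=0.1, damping=1e-4):
    """One update theta <- theta - lr * F^{-1} grad, with Tikhonov damping."""
    F = fisher + damping * np.eye(fisher.shape[0])
    return theta - lr * np.linalg.solve(F, grad)

# Toy model: Gaussian N(mu, sigma^2); its Fisher matrix in (mu, sigma)
# coordinates is diag(1/sigma^2, 2/sigma^2), which rescales the raw gradient
# so that the resulting update is invariant under reparametrization.
mu, sigma = 0.5, 2.0
theta = np.array([mu, sigma])
grad = np.array([0.3, -0.1])  # placeholder loss gradient
fisher = np.diag([1.0 / sigma**2, 2.0 / sigma**2])
print(natural_gradient_step(theta, grad, fisher))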

Lie Theory and Applications. III

2008

Geometry and analysis of shape space. The core of shape space, in one of its simplest forms, is the orbit space of the action of the group of diffeomorphisms of the circle S1 (the reparametrization group) on the space of immersions of the circle into the plane R2. The aim is to find good Riemannian metrics which allow applications in pattern recognition and visualization. The contributions of the PI were obtained mainly in collaboration with David Mumford. In the paper [M107], many metrics were investigated via the Hamiltonian approach, together with their conserved quantities (one of them being the reparametrization momentum) and their sectional curvatures. Recall from a result of the predecessor of this project that the L2-metric on the space of immersions has zero geodesic distance on the orbit space under the reparametrization group. These metrics come in 3 flavors: some are derived from the L2-metric by multiplying it with a function of the length of the curve (a conformal change) ...
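For reference, the two families of metrics alluded to above can be written as follows (a sketch in standard notation, where c is an immersion of S^1 into R^2 and h, k are vector fields along c; this is background, not a quotation from the report):

% L^2 (H^0) metric on immersions, with arclength element |c'(\theta)| d\theta:
G^{0}_{c}(h,k) = \int_{S^1} \langle h(\theta), k(\theta)\rangle \, |c'(\theta)| \, d\theta ,
% conformal rescaling by a function \Phi of the curve length \ell(c):
G^{\Phi}_{c}(h,k) = \Phi(\ell(c)) \int_{S^1} \langle h(\theta), k(\theta)\rangle \, |c'(\theta)| \, d\theta .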

Geometry of statistical manifolds

Differential Geometry and its Applications, 1992

A statistical manifold (M, g, ∇) is a Riemannian manifold (M, g) equipped with torsion-free affine connections ∇, ∇' which are dual with respect to g. A point p ∈ M is said to be ∇-isotropic if the sectional curvatures at p all have the same value k(p), and (M, g, ∇) is said to be ∇-isotropic when M consists entirely of ∇-isotropic points. When the difference tensor α of ∇ and the Levi-Civita connection ∇̂ of g is "apolar" with respect to g, Kurose has shown that α ≡ 0, and hence ∇ = ∇' = ∇̂, provided that k(p) = k (constant). His proof relies on the existence of an affine immersion, which may no longer hold when k(p) is not constant. One objective of this paper is to show that Kurose's result remains valid when (M, g, ∇) is assumed only to be ∇-isotropic. We also discuss the case where (M, g) is complete Riemannian.
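For readers unfamiliar with the dual structure, the defining relations used above are, in standard conventions (a background sketch, not specific to this paper):

% Duality of \nabla and \nabla' with respect to g:
X \, g(Y,Z) = g(\nabla_X Y, Z) + g(Y, \nabla'_X Z),
% their average is the Levi-Civita connection of g:
\widehat{\nabla} = \tfrac{1}{2}(\nabla + \nabla'),
% difference tensor and the apolarity condition:
\alpha(X,Y) = \nabla_X Y - \widehat{\nabla}_X Y , \qquad \operatorname{tr}_g \alpha = 0 .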

A Riemannian scalar measure for diffusion tensor images

Pattern Recognition, 2011

We study a well-known scalar quantity in differential geometry, the Ricci scalar, in the context of Diffusion Tensor Imaging (DTI). We explore the relation between the Ricci scalar and the two most popular scalar measures in DTI: Mean Diffusivity and Fractional Anisotropy. We discuss results of computing the Ricci scalar on synthetic as well as real DTI data.
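For context, the two scalar measures compared against the Ricci scalar are computed from the eigenvalues of the 3x3 diffusion tensor; below is a hedged sketch of these standard definitions (not the paper's Ricci-scalar code), with an assumed toy tensor.

import numpy as np

def md_fa(D):
    """Mean Diffusivity and Fractional Anisotropy of a 3x3 diffusion tensor D."""
    lam = np.linalg.eigvalsh(D)  # eigenvalues lambda_1 <= lambda_2 <= lambda_3
    md = lam.mean()
    fa = np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2))
    return md, fa

# Toy tensor: strongly anisotropic diffusion along the x-axis (units mm^2/s).
D = np.diag([1.7e-3, 0.3e-3, 0.3e-3])
print(md_fa(D))  # MD around 0.77e-3, FA around 0.8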

Statistics and Differential Geometry

2005

It has been realised for several decades now, probably since Efron's paper introducing the concept of statistical curvature [Efr75], that most of the main concepts and methods of differential geometry are of substantial interest in connection with the theory of statistical inference. This report describes, in simple cases, the links existing between the two theories. It is based on an article introducing the topic by R. Kass [Kas89]. The focus is on parametric statistical models.

Statistics on diffeomorphisms via tangent space representations

Neuroimage, 2004

In this paper, we present a linear setting for statistical analysis of shape and an optimization approach based on a recent derivation of a conservation of momentum law for the geodesics of diffeomorphic flow. Once a template is fixed, the space of initial momentum becomes an appropriate space for studying shape via geodesic flow since the flow at any point along the geodesic is completely determined by the momentum at the origin through geodesic shooting equations. The space of initial momentum provides a linear representation of the nonlinear diffeomorphic shape space in which linear statistical analysis can be applied. Specializing to the landmark matching problem of Computational Anatomy, we derive an algorithm for solving the variational problem with respect to the initial momentum and demonstrate principal component analysis (PCA) in this setting with three-dimensional face and hippocampus databases.
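A minimal sketch of the linear statistics this representation enables (the momenta array, its dimensions, and the random data are hypothetical, not the paper's experiments):

import numpy as np

# Initial momenta over the template landmarks live in a fixed linear space,
# so ordinary PCA applies directly.
rng = np.random.default_rng(0)
momenta = rng.normal(size=(20, 3 * 50))  # hypothetical: 20 subjects, 50 landmarks in 3D

mean = momenta.mean(axis=0)
X = momenta - mean
U, S, Vt = np.linalg.svd(X, full_matrices=False)  # SVD-based PCA
explained = S**2 / np.sum(S**2)
print("variance explained by the first 3 modes:", explained[:3])
# A new shape is then synthesized by geodesic shooting from the template with
# momentum mean + sum_k a_k * S[k] * Vt[k] for chosen coefficients a_k.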

Computing Bi-Invariant Pseudo-Metrics on Lie Groups for Consistent Statistics

Entropy, 2015

In computational anatomy, organ shapes are often modeled as deformations of a reference shape, i.e., as elements of a Lie group. To analyze the variability of the human anatomy in this framework, we need to perform statistics on Lie groups. A Lie group is a manifold carrying a group structure compatible with its smooth structure. Statistics on Riemannian manifolds have been well studied, but to use the Riemannian statistical framework on Lie groups, one needs to define a Riemannian metric compatible with the group structure: a bi-invariant metric. However, it is known that Lie groups which are not a direct product of compact and abelian groups have no bi-invariant metric. What about bi-invariant pseudo-metrics? In other words: could we remove the positivity assumption on the metric and obtain consistent statistics on Lie groups through the pseudo-Riemannian framework? Our contribution is two-fold. First, we present an algorithm that constructs bi-invariant pseudo-metrics on a given Lie group, when one exists. Then, by running the algorithm on commonly used Lie groups, we show that most of them do not admit any bi-invariant (pseudo-)metric. We thus conclude that the (pseudo-)Riemannian setting is too limited for the definition of consistent statistics on general Lie groups.
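At the Lie algebra level, a bi-invariant (pseudo-)metric corresponds to an ad-invariant symmetric bilinear form, which can be searched for by solving a linear system; the sketch below illustrates that idea for assumed structure constants of so(3), and is not the paper's algorithm.

import numpy as np
from scipy.linalg import null_space

# Structure constants of so(3): [e_i, e_j] = sum_k c[i, j, k] e_k (Levi-Civita symbol).
n = 3
c = np.zeros((n, n, n))
for (i, j, k), s in {(0, 1, 2): 1, (1, 2, 0): 1, (2, 0, 1): 1,
                     (1, 0, 2): -1, (2, 1, 0): -1, (0, 2, 1): -1}.items():
    c[i, j, k] = s

# ad-invariance of a symmetric form B: B([e_a, e_b], e_k) + B(e_b, [e_a, e_k]) = 0.
rows = []
for a in range(n):
    for b in range(n):
        for k in range(n):
            row = np.zeros((n, n))
            for m in range(n):
                row[m, k] += c[a, b, m]  # coefficient of B[m, k]
                row[b, m] += c[a, k, m]  # coefficient of B[b, m]
            rows.append(row.ravel())
# Symmetry constraints B[i, j] = B[j, i].
for i in range(n):
    for j in range(i + 1, n):
        row = np.zeros((n, n))
        row[i, j], row[j, i] = 1.0, -1.0
        rows.append(row.ravel())

basis = null_space(np.array(rows))  # flattened ad-invariant symmetric forms
print("dimension of the space of bi-invariant forms:", basis.shape[1])
B = basis[:, 0].reshape(n, n)
print("eigenvalues (signature):", np.linalg.eigvalsh(B))  # definite for so(3)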