A Breiman Type Theorem for Gibbs Measures
Bulletin of the Brazilian Mathematical Society, New Series, 2021
We investigate in this work some situations where it is possible to estimate or determine the upper and lower q-generalized fractal dimensions D^±_µ(q), q ∈ ℝ, of invariant measures associated with continuous transformations over compact metric spaces. In particular, we present an alternative proof of Young's Theorem [31] for the generalized fractal dimensions of the Bowen-Margulis measure associated with a C^{1+α}-Axiom A system over a two-dimensional compact Riemannian manifold M. We also present estimates, in terms of the metric entropy, for the generalized fractal dimensions of an ergodic measure for which Brin-Katok's Theorem holds pointwise. Furthermore, for expansive homeomorphisms (such as C^1-Axiom A systems), we show that the set of invariant measures with D^+_µ(q) = 0 (q ≥ 1), under a hyperbolic metric, is generic (with respect to the weak topology). We also show that, for each s ∈ [0, 1), D^+_µ(s) is bounded above, up to a constant, by the topological entropy, again under a hyperbolic metric. Finally, we show that, for some dynamical systems, the metric entropy of an invariant measure is typically zero, settling a conjecture posed by Sigmund in [25] for Lipschitz transformations which satisfy the specification property.

Key words and phrases. Expansive homeomorphisms, Hausdorff dimension, packing dimension, invariant measures, generalized fractal dimensions, dynamical systems with specification.

* Work partially supported by CIENCIACTIVA C.G. 176-2015.
† Work partially supported by FAPEMIG (a Brazilian government agency; Universal Project 001/17/CEX-APQ-00352-17).

The most popular of all is the Hausdorff dimension, introduced in 1919 by Hausdorff, which gives a notion of size useful for distinguishing between sets of zero Lebesgue measure. Unfortunately, the Hausdorff dimension of relatively simple sets can be very hard to calculate; besides, the notion of Hausdorff dimension is not completely adapted to the dynamics per se (for instance, if Z is a periodic orbit, then its Hausdorff dimension is zero, regardless of whether the orbit is stable, unstable, or neutral). This fact led to the introduction of other characteristics with which it is possible to estimate the size of irregular sets. For this reason, some of these quantities were also branded as "dimensions" (although some of them lack basic properties satisfied by the Hausdorff dimension, such as σ-stability; see [12]). Several good candidates were proposed, such as the correlation, information, box-counting and entropy dimensions, among others.

Thus, in order to obtain relevant information about the dynamics, one should consider not only the geometry of the measurable set Z ⊂ X (where X is some Borel measurable space), but also the distribution of points on Z under f (which is assumed to be a measurable transformation). That is, one should be interested in how often a given point x ∈ Z visits a fixed subset Y ⊂ Z under f. If µ is an ergodic measure for which µ(Y) > 0, then for a typical point x ∈ Z the asymptotic frequency of visits to Y equals µ(Y). Thus, the orbit distribution is completely determined by the measure µ. On the other hand, the measure µ is completely specified by the distribution of a typical orbit.
This fact is widely used in the numerical study of dynamical systems, where the distributions are, in general, non-uniform and have a clearly visible fine-scaled interwoven structure of hot and cold spots, that is, regions where the frequency of visits is, respectively, much greater or much less than the average.
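To make the two notions just used concrete, the displays below record, first, the form of Birkhoff's ergodic theorem behind the visit-frequency claim and, second, a standard (Hentschel-Procaccia type) convention for the upper and lower q-generalized fractal dimensions D^±_µ(q) mentioned in the abstract; this is a sketch of the usual definitions, and the paper's own normalization may differ in detail.

% Birkhoff's ergodic theorem: for an ergodic measure µ, a set Y with µ(Y) > 0,
% and µ-a.e. x, the asymptotic frequency of visits of the orbit of x to Y equals µ(Y).
\[
  \lim_{n \to \infty} \frac{1}{n}\,\#\bigl\{\, 0 \le k < n : f^{k}(x) \in Y \,\bigr\} \;=\; \mu(Y).
\]

% Usual convention for the upper/lower q-generalized fractal dimensions (q \neq 1),
% with B(x,\varepsilon) the ball of radius \varepsilon centred at x:
\[
  D^{+}_{\mu}(q) \;=\; \limsup_{\varepsilon \to 0}
    \frac{\log \int_{X} \mu\bigl(B(x,\varepsilon)\bigr)^{q-1}\, d\mu(x)}{(q-1)\log \varepsilon},
  \qquad
  D^{-}_{\mu}(q) \;=\; \liminf_{\varepsilon \to 0}
    \frac{\log \int_{X} \mu\bigl(B(x,\varepsilon)\bigr)^{q-1}\, d\mu(x)}{(q-1)\log \varepsilon}.
\]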
Entropy, Hausdorff measures old and new, and limit sets of geometrically finite Kleinian groups
Acta Mathematica, 1984
Given a (closed) set A contained in the plane and a positive function φ(r) (for example r^δ, r^δ(log 1/r)^{δ'}, etc.), one defines the (covering) Hausdorff measure of A relative to φ(r) by considering coverings of A by balls B_1, B_2, … of radii r_1, r_2, …, all less than ε > 0. The (covering) Hausdorff φ-measure of A is the limit as ε → 0 of the infimum over such coverings of the sums Σ_i φ(r_i).
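In symbols (writing H^φ for the resulting measure, a notation assumed here rather than taken from the paper), the definition just described reads:

\[
  \mathcal{H}^{\varphi}(A)
  \;=\; \lim_{\varepsilon \to 0}\;
  \inf\Bigl\{\, \sum_{i} \varphi(r_{i}) \;:\; A \subset \bigcup_{i} B_{i},\ \ B_{i} \text{ a ball of radius } r_{i} < \varepsilon \,\Bigr\},
\]

the limit existing because the infimum is non-decreasing as ε decreases.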
Entropy and the uniform mean ergodic theorem for a family of sets
Transactions of the American Mathematical Society, 2016
We define a notion of entropy for an infinite family C of measurable sets in a probability space. We show that the mean ergodic theorem holds uniformly for C under every ergodic transformation if and only if C has zero entropy. When the entropy of C is positive, we establish a strong converse showing that the uniform mean ergodic theorem fails generically in every isomorphism class, including the isomorphism classes of Bernoulli transformations. As a corollary of these results, we establish that every strong mixing transformation is uniformly strong mixing on C if and only if the entropy of C is zero, and obtain a corresponding result for weak mixing transformations.
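For orientation, "the mean ergodic theorem holds uniformly for C" can be read as the following assertion, stated here in the L^1 (mean) sense; the paper's precise formulation of the convergence may differ:

\[
  \lim_{n \to \infty}\;
  \sup_{C \in \mathcal{C}}\,
  \int_{X} \Bigl|\, \frac{1}{n} \sum_{k=0}^{n-1} \mathbf{1}_{C}\bigl(T^{k}x\bigr) \;-\; \mu(C) \,\Bigr| \, d\mu(x) \;=\; 0 ,
\]

where T is the ergodic transformation and µ the invariant probability measure.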
Hausdorff Dimension of Sets of Generic Points for Gibbs Measures
2002
For a translation invariant Gibbs measure ν on the configuration space X of a lattice finite-spin system, we consider the set X_ν of generic points. Using a Breiman type convergence theorem on the set X_µ of generic points of an arbitrary translation invariant probability measure µ on X, we evaluate the Hausdorff dimension of the set X_ν with respect to any metric out of a wide class of "scale" metrics on X (including Billingsley metrics generated by Gibbs measures).
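The "Breiman type convergence theorem" extends the classical Shannon-McMillan-Breiman (Breiman) theorem from µ-almost every point to every generic point of µ. For orientation, the classical statement reads as follows, with C_n(x) the cylinder of length n containing x and h(µ) the entropy of µ (in the lattice spin-system setting, cylinders over growing boxes play the role of C_n(x)):

\[
  \lim_{n \to \infty} \;-\frac{1}{n}\, \log \mu\bigl(C_{n}(x)\bigr) \;=\; h(\mu)
  \qquad \text{for } \mu\text{-a.e. } x \in X .
\]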