MARKOV CHAINS
Related papers
Markov and the Creation of Markov Chains
2006
We describe the life, times and legacy of Andrei Andreevich Markov (1856-1922), and his writings on what became known as Markov chains. One focus is on his first paper [27] of 1906 on this topic, which already contains important contractivity principles embodied in the Markov-Dobrushin coefficient of ergodicity; in fact the coefficient makes an explicit appearance in that paper. The contractivity principles are shown directly to underpin a number of results of the later theory. The coefficient is especially useful as a condition number in measuring the effect of a perturbation of a stochastic matrix on the stationary distribution (sensitivity analysis). Some recent work in this direction is reviewed from the standpoint of the paper [53], presented at the first of the present series of conferences [63].
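A minimal numerical sketch of the coefficient described above (the transition matrix below is an illustrative assumption, not taken from the paper): the Markov-Dobrushin coefficient tau(P) = (1/2) max_{i,j} sum_k |p_{ik} - p_{jk}| lies in [0, 1] and bounds how much one application of P can shrink the total-variation distance between two distributions.

```python
# Hedged sketch: Markov-Dobrushin coefficient of ergodicity and its
# contraction property in total variation.  P is an illustrative assumption.
import numpy as np

def dobrushin_coefficient(P):
    """tau(P) = (1/2) * max_{i,j} sum_k |P[i,k] - P[j,k]|, a value in [0, 1]."""
    n = P.shape[0]
    return 0.5 * max(np.abs(P[i] - P[j]).sum()
                     for i in range(n) for j in range(n))

P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.7, 0.1],
              [0.1, 0.3, 0.6]])   # row-stochastic matrix (assumption)

tau = dobrushin_coefficient(P)

# Contractivity: for any two initial distributions mu, nu,
#   ||mu P - nu P||_TV <= tau(P) * ||mu - nu||_TV.
mu = np.array([1.0, 0.0, 0.0])
nu = np.array([0.0, 0.0, 1.0])
tv = lambda a, b: 0.5 * np.abs(a - b).sum()
print(tau, tv(mu @ P, nu @ P), tau * tv(mu, nu))  # middle value <= right value
```

The same quantity serves as the condition number mentioned in the abstract: the smaller tau(P), the less a perturbation of P can move the stationary distribution.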
Markov and almost Markov properties in one, two or more directions
2020
In this review-type paper written on the occasion of the Oberwolfach workshop "One-sided vs. Two-sided Stochastic Processes" (February 22-29, 2020), we discuss and compare Markov properties and generalisations thereof in one or more directions, as well as weaker forms of conditional dependence, again in one or more directions. In particular, we discuss in both contexts various extensions of Markov chains and Markov fields and their properties, such as g-measures, Variable Length Markov Chains, Variable Neighbourhood Markov Fields, Variable Neighbourhood (Parsimonious) Random Fields, and Generalized Gibbs Measures.
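To make one of the listed model classes concrete, here is a toy sketch of a Variable Length Markov Chain over the alphabet {0, 1}; the context tree and its probabilities are my own illustrative assumptions, not taken from the paper. The law of the next symbol depends on the past only through the longest suffix found in a finite set of contexts.

```python
# Toy Variable Length Markov Chain (assumed example): the next-symbol
# distribution is looked up from the longest matching past suffix.
import random

# context (most recent symbol last) -> P(next symbol = 1); a complete suffix
# set over {0, 1}: every sufficiently long past matches exactly one context.
CONTEXTS = {"1": 0.2, "10": 0.9, "00": 0.5}   # hypothetical context tree
DEFAULT_P1 = 0.5                               # fallback for very short pasts

def next_symbol(history):
    # scan suffixes from longest to shortest and use the first known context
    for length in range(min(len(history), 2), 0, -1):
        ctx = "".join(map(str, history[-length:]))
        if ctx in CONTEXTS:
            return int(random.random() < CONTEXTS[ctx])
    return int(random.random() < DEFAULT_P1)

random.seed(0)
seq = [0]
for _ in range(20):
    seq.append(next_symbol(seq))
print(seq)
```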
Probability and Mathematical Statistics
The paper gives some insight into the relations between two types of Markov processes – in the strict sense and in the wide sense – as well as into two aspects of periodicity. It concerns Markov processes with finite state space, the elements of which are complex numbers. Firstly, it is shown that under some assumptions this space can be transformed in such a way that the resulting Markov process is also Markov in the wide sense. Next, sufficient conditions are given under which a periodic homogeneous Markov chain is a periodically correlated process.
The Five Greatest Applications of Markov Chains
One hundred years removed from A. A. Markov's development of his chains, we take stock of the field he generated and the mathematical impression he left. As a tribute to Markov, we present what we consider to be the five greatest applications of Markov chains.
Conditional Markov Chains Revisited Part I: Construction and properties
In this paper we continue the study of conditional Markov chains (CMCs) with finite state spaces, that we initiated in Bielecki, Jakubowski and Niewęgłowski (2014a) in an effort to enrich the theory of CMCs that originated in Bielecki and Rutkowski (2004). We provide an alternative definition of a CMC and an alternative construction of a CMC via a change of probability measure. It turns out that our construction produces CMCs that are also doubly stochastic Markov chains (DSMCs), which allows for the study of several properties of CMCs using tools available for DSMCs.
A zero-one law for Markov chains
Stochastics
We prove an analog of the classical Zero-One Law for both homogeneous and nonhomogeneous Markov chains (MC). Its almost precise formulation is simple: given any event A from the tail σ-algebra of the MC (Z_n), for large n, with probability near one, the trajectories of the MC are in states i where P(A | Z_n = i) is either near 0 or near 1. A similar statement holds for the entrance σ-algebra, when n tends to −∞. To formulate this second result, we give detailed results on the existence of nonhomogeneous Markov chains indexed by Z_− or Z in both the finite and countable cases. This extends a well-known result due to Kolmogorov. Further, in our discussion, we note an interesting dichotomy between two commonly used definitions of MCs.
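A hedged numerical illustration of the statement above, using a construction of my own choosing rather than one from the paper: take a random walk on the non-negative integers absorbed at 0, with up-probability p = 0.6, and the tail event A = {the chain is eventually absorbed at 0}. By the gambler's-ruin formula, P(A | Z_n = i) = (q/p)^i, and for large n a typical trajectory sits either at i = 0 (conditional probability 1) or at a large i (conditional probability near 0).

```python
# Hedged illustration (my own toy construction): the conditional probability
# of a tail event, evaluated at time n along simulated paths, is almost
# always near 0 or near 1.
import random

p, q, n_steps, n_paths = 0.6, 0.4, 200, 2000
random.seed(1)

def run(start=1):
    """Simulate the absorbed random walk for n_steps and return its position."""
    x = start
    for _ in range(n_steps):
        if x == 0:
            return 0              # absorbed: stays at 0 forever
        x += 1 if random.random() < p else -1
    return x

positions = [run() for _ in range(n_paths)]
cond_probs = [(q / p) ** i for i in positions]        # P(A | Z_n = i)
frac_extreme = sum(c < 0.05 or c > 0.95 for c in cond_probs) / n_paths
print(f"fraction of paths with P(A | Z_n) near 0 or 1: {frac_extreme:.3f}")
```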
On Markov chains and filtrations
1997
In this paper we rederive some well known results for continuous time Markov processes that live on a finite state space. Martingale techniques are used throughout the paper. Special attention is paid to the construction of a continuous time Markov process, when we start from a discrete time Markov chain. The Markov property here holds with respect to filtrations that need not be minimal.
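One standard way to carry out the construction mentioned above, sketched here under my own assumptions (the paper's martingale-based treatment is more general), is to run the discrete-time chain at the jump times of an independent Poisson process: X_t = Y_{N_t} is then a continuous-time Markov chain with generator λ(P − I).

```python
# Hedged sketch: build a continuous-time chain from a discrete-time chain Y
# by subordination to an independent rate-lam Poisson clock.  The transition
# matrix P and the rate lam are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.5, 0.5, 0.0],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])   # discrete-time transition matrix (assumption)
lam = 2.0                          # Poisson clock rate (assumption)

def sample_ctmc_path(x0, t_max):
    """Return (jump_times, states) of X_t = Y_{N_t} on [0, t_max]."""
    t, x = 0.0, x0
    times, states = [0.0], [x0]
    while True:
        t += rng.exponential(1.0 / lam)        # next tick of the Poisson clock
        if t > t_max:
            return times, states
        x = rng.choice(3, p=P[x])              # one step of the discrete chain
        times.append(t)
        states.append(x)

times, states = sample_ctmc_path(x0=0, t_max=5.0)
print(list(zip(np.round(times, 2), states)))

# Generator of the resulting continuous-time chain:  Q = lam * (P - I)
Q = lam * (P - np.eye(3))
print(Q)
```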
On characterisation of Markov processes via martingale problems
Proceedings Mathematical Sciences, 2006
It is well-known that well-posedness of a martingale problem in the class of continuous (or r.c.l.l.) solutions enables one to construct the associated transition probability functions. We extend this result to the case when the martingale problem is well-posed in the class of solutions which are continuous in probability. This extension is used to improve on a criterion for a probability measure to be invariant for the semigroup associated with the Markov process. We also give examples of martingale problems that are well-posed in the class of solutions which are continuous in probability but for which no r.c.l.l. solution exists.
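For the finite-state case, the invariance criterion mentioned above reduces to a linear-algebra check, illustrated by the following hedged sketch (the generator matrix is an assumption for illustration only; the paper's martingale-problem setting is far more general): a probability vector π is invariant for the semigroup P_t = exp(tQ) exactly when πQ = 0.

```python
# Hedged finite-state illustration: pi Q = 0 is equivalent to invariance of
# pi under P_t = exp(t*Q) for every t.  Q below is an assumed toy generator.
import numpy as np
from scipy.linalg import expm, null_space

Q = np.array([[-1.0,  1.0,  0.0],
              [ 0.5, -1.5,  1.0],
              [ 1.0,  0.0, -1.0]])   # rows sum to 0 (generator matrix)

# Solve pi Q = 0 with pi >= 0 and sum(pi) = 1 via the null space of Q^T.
v = null_space(Q.T)[:, 0]
pi = np.abs(v) / np.abs(v).sum()

print("pi Q =", pi @ Q)                    # approximately the zero vector
for t in (0.5, 1.0, 5.0):
    print(t, pi @ expm(t * Q))             # approximately pi for every t
```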