The Markov Renewal Theorem and Related Results
Related papers
RECURRENCE THEOREMS FOR MARKOV RANDOM WALKS
Let (Mn, Sn)n≥0 be a Markov random walk whose driving chain (Mn)n≥0 with general state space (𝒮, 𝔖) is ergodic with unique stationary distribution ξ. Provided n−1 Sn → 0 in probability under Pξ, it is shown that the recurrence set of (Sn − γ(M0) + γ(Mn))n≥0 forms a closed subgroup of ℝ depending on the lattice-type of (Mn, Sn)n≥0. The so-called shift function γ is bounded and appears in that lattice-type condition. The recurrence set of (Sn)n≥0 itself is also given but may look more complicated depending on γ. The results extend the classical recurrence theorem for random walks with i.i.d. increments and further sharpen results by Berbee, Dekking and others on the recurrence behavior of random walks with stationary increments. AMS 1991 Subject Classifications: 60J05, 60J15, 60K05, 60K15.
On the Markov Renewal Theorem (corrected version)
Let (S, S) be a measurable space with countably generated σ-field S and (M n , X n ) n≥0 a Markov chain with state space S × ℝ and transition kernel IP : S × (S ⊗ B) → [0, 1]. Then (M n , S n ) n≥0 , where S n = X 0 + ... + X n for n ≥ 0, is called the associated Markov random walk. Markov renewal theory deals with the asymptotic behavior of suitable functionals of (M n , S n ) n≥0 , like the Markov renewal measure ∑ n≥0 P((M n , S n ) ∈ A × (t + B)) as t → ∞, where A ∈ S and B denotes a Borel subset of ℝ. It is shown that the Markov renewal theorem as well as a related ergodic theorem for semi-Markov processes hold true if only Harris recurrence of (M n ) n≥0 is assumed. This was proved by purely analytical methods by Shurenkov [16] in the one-sided case where IP(x, S × [0, ∞)) = 1 for all x ∈ S. Our proof uses probabilistic arguments, notably the construction of regeneration epochs for (M n ) n≥0 such that (M n , X n ) n≥0 is at least nearly regenerative, and an extension of Blackwell's renewal theorem.
Stochastic Processes and their Applications, 1994
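The Markov renewal measure described above can be explored numerically. The sketch below simulates a Markov random walk with a made-up two-state driving chain and exponential (hence positive, one-sided) increments, then Monte Carlo-estimates the mass placed by the renewal measure on a window {a} × (t, t + h]. All parameters and function names are illustrative assumptions, not taken from the paper.

```python
import random

# Hypothetical two-state driving chain and state-dependent increment
# means (illustrative choices, not from the paper).
P = {0: [0.7, 0.3], 1: [0.4, 0.6]}   # P[m][j] = P(next state = j | state m)
MEANS = {0: 1.0, 1: 2.0}             # exponential increment means per state

def simulate_path(n_steps, seed=None):
    """Simulate (M_n, S_n): M is the driving chain, S the additive part."""
    rng = random.Random(seed)
    m, s = 0, 0.0
    path = [(m, s)]
    for _ in range(n_steps):
        m = 0 if rng.random() < P[m][0] else 1
        s += rng.expovariate(1.0 / MEANS[m])  # positive increments (one-sided case)
        path.append((m, s))
    return path

def renewal_mass(paths, state, t, h):
    """Monte Carlo estimate of sum_{n>=0} P((M_n, S_n) in {state} x (t, t+h])."""
    total = 0
    for path in paths:
        total += sum(1 for (m, s) in path if m == state and t < s <= t + h)
    return total / len(paths)

paths = [simulate_path(400, seed=i) for i in range(2000)]
# Blackwell-type behavior: the expected number of visits to {state} x (t, t+h]
# stabilizes as t grows (printed for inspection).
for t in (10.0, 50.0, 100.0):
    print(t, round(renewal_mass(paths, state=0, t=t, h=5.0), 3))
```

Here the stationary distribution of the driving chain is (4/7, 3/7), giving a stationary drift of 10/7 per step, so the estimates should level off near h · (4/7)/(10/7) = 2.0 for state 0.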
Uniform Markov renewal theory and ruin probabilities in Markov random walks
The Annals of Applied Probability, 2004
Let {X n , n ≥ 0} be a Markov chain on a general state space X with transition probability P and stationary probability π. Suppose an additive component S n takes values in the real line R and is adjoined to the chain such that {(X n , S n ), n ≥ 0} is a Markov random walk. In this paper, we prove a uniform Markov renewal theorem with an estimate on the rate of convergence. This result is applied to boundary crossing problems for {(X n , S n ), n ≥ 0}. To be more precise, for given b ≥ 0, define the stopping time τ = τ(b) = inf{n : S n > b}. When the drift µ of the random walk S n is 0, we derive a one-term Edgeworth-type asymptotic expansion for the first passage probabilities P π {τ < m} and P π {τ < m, S m < c}, where m ≤ ∞, c ≤ b and P π denotes the probability under the initial distribution π. When µ ≠ 0, Brownian approximations for the first passage probabilities with correction terms are derived. Applications to sequential estimation and truncated tests in random coefficient models and first passage times in products of random matrices are also given.
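The first passage probability P π {τ ≤ m} in the abstract above is easy to approximate by simulation. The following minimal sketch uses a hypothetical two-state driving chain with Gaussian, state-dependent increments; the transition and drift parameters are invented for illustration and do not come from the paper.

```python
import random

# Illustrative Markov random walk (hypothetical parameters):
# TRANS[x] = P(next state = 0 | current state x), DRIFT[x] = increment mean.
TRANS = {0: 0.8, 1: 0.3}
DRIFT = {0: 0.5, 1: -0.2}

def first_passage(b, horizon, n_paths=5000, seed=1):
    """Estimate P(tau <= horizon) where tau = inf{n : S_n > b}."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_paths):
        x, s = 0, 0.0
        for _ in range(horizon):
            x = 0 if rng.random() < TRANS[x] else 1
            s += rng.gauss(DRIFT[x], 1.0)   # unit-variance Gaussian increment
            if s > b:                        # boundary crossed: stop this path
                hits += 1
                break
    return hits / n_paths

print(first_passage(b=5.0, horizon=100))
```

With these parameters the stationary drift is positive (µ ≈ 0.22), so the walk crosses a moderate boundary within 100 steps with high probability; the µ = 0 regime of the paper can be probed by choosing drifts that average to zero under the stationary distribution.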
On the Moments of Markov Renewal Processes
Advances in Applied Probability, 1969
Recently Kshirsagar and Gupta [5] obtained expressions for the asymptotic values of the first two moments of a Markov renewal process. The method they employed involved formal inversion of matrices of Laplace-Stieltjes transforms.
Ladder epochs and ladder chain of a Markov random walk with discrete driving chain
Advances in Applied Probability
Let (Mn,Sn)n≥0 be a Markov random walk with positive recurrent driving chain (Mn)n≥0 on the countable state space 𝒮 with stationary distribution π. Suppose also that lim supn→∞Sn=∞ almost surely, so that the walk has almost-sure finite strictly ascending ladder epochs σn>. Recurrence properties of the ladder chain (Mσn>)n≥0 and a closely related excursion chain are studied. We give a necessary and sufficient condition for the recurrence of (Mσn>)n≥0 and further show that this chain is positive recurrent with stationary distribution π> and 𝔼π>σ1><∞ if and only if an associated Markov random walk (𝑀̂n,𝑆̂n)n≥0, obtained by time reversal and called the dual of (Mn,Sn)n≥0, is positive divergent, i.e. 𝑆̂n→∞ almost surely. Simple expressions for π> are also provided. Our arguments make use of coupling, Palm duality theory, and Wiener‒Hopf factorization for Markov random walks with discrete driving chain.
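As a toy illustration of ladder epochs and the ladder chain studied above, the sketch below simulates a two-state Markov random walk with positive stationary drift (so that lim sup n→∞ S n = ∞ holds almost surely) and records the driving-chain states at strictly ascending ladder epochs. The parameters are invented for illustration and are not from the paper.

```python
import random

# Hypothetical two-state driving chain with state-dependent Gaussian
# increments; the stationary drift is positive, so ladder epochs are
# almost surely finite (illustrative parameters only).
STAY = {0: 0.9, 1: 0.5}     # P(M_{n+1} = M_n | M_n)
MEAN = {0: 0.3, 1: -0.1}    # increment mean in each state

def ladder_chain(n_steps, seed=0):
    """Return the states M_{sigma_k} visited at strictly ascending ladder epochs."""
    rng = random.Random(seed)
    m, s, record = 0, 0.0, 0.0
    ladder_states = []
    for _ in range(n_steps):
        m = m if rng.random() < STAY[m] else 1 - m
        s += rng.gauss(MEAN[m], 1.0)
        if s > record:                 # strictly ascending ladder epoch
            record = s
            ladder_states.append(m)
    return ladder_states

states = ladder_chain(200_000, seed=42)
freq0 = states.count(0) / len(states)
print(len(states), round(freq0, 3))
```

The empirical frequency of state 0 among ladder points approximates the stationary distribution π> of the ladder chain; note it is biased toward the state with the larger drift relative to the stationary distribution π of the driving chain itself.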
A new proof of convergence of MCMC via the ergodic theorem
Statistics & Probability Letters, 2011
A key result underlying the theory of MCMC is that any η-irreducible Markov chain having a transition density with respect to η and possessing a stationary distribution π is automatically positive Harris recurrent. This paper provides a short self-contained proof of this fact using the ergodic theorem in its standard form as the most advanced tool.
Random Walks with Stochastically Bounded Increments: Renewal Theory via Fourier Analysis
The Yokohama Mathematical Journal (Yokohama City University Bulletin, Section D: Mathematics), 1994
This paper develops renewal theory for a rather general class of random walks S n , including linear submartingales with positive drift. The basic assumption on S n is that their conditional increment distribution functions with respect to some filtration are bounded from above and below by integrable distribution functions. Under a further mean stability condition these random walks turn out to be natural candidates for satisfying Blackwell-type renewal theorems. In a companion paper [2], certain uniform lower and upper drift bounds for S n , describing its average growth on finite remote time intervals, have been introduced and shown to be equal in case the aforementioned mean stability condition holds true. With the help of these bounds we give lower and upper estimates for H ∗ U(B), where U denotes the renewal measure of S n , H a suitable delay distribution and B a Borel subset of ℝ. This is then further utilized, in combination with a coupling argument, to prove the principal result, namely an extension of Blackwell's renewal theorem to random walks of the previous type whose conditional increment distributions additionally contain a subsequence with a common component in a certain sense. A number of examples are also presented.
Renewal Theory for Markov Chains on the Real Line
The Annals of Probability, 1982
We generalize this theory to the case where {S n } is a Markov chain on the real line with stationary transition probabilities satisfying a drift condition. The expectations we are concerned with satisfy generalized renewal equations, and in our main theorems we show that these expectations are the unique solutions of the equations they satisfy.
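A generalized renewal equation of the form Z = z + F ∗ Z can be illustrated in the simplest discrete setting, where the unique locally bounded solution is produced by direct recursion. The increment distribution f and forcing term z below are made-up examples, not taken from the paper.

```python
# Discrete renewal equation u_n = z_n + sum_k f_k * u_{n-k}.
# f is a probability distribution on {1, 2, 3}; z is the forcing term.
f = [0.0, 0.5, 0.3, 0.2]      # f[k] = P(increment = k)
z = [1.0] + [0.0] * 30        # forcing term: unit mass at n = 0

u = []
for n in range(len(z)):
    # Convolution over all admissible lags k = 1 .. min(n, 3).
    conv = sum(f[k] * u[n - k] for k in range(1, min(n, len(f) - 1) + 1))
    u.append(z[n] + conv)

# By the elementary renewal theorem, u_n -> 1/mu with
# mu = sum_k k * f_k = 0.5 + 0.6 + 0.6 = 1.7.
print(round(u[-1], 3))
```

The recursion makes the uniqueness transparent: each u_n is determined by z and the previously computed values, mirroring how the expectations in the abstract are pinned down as the unique solutions of the equations they satisfy.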