See Durrett, Section 5.6, for the theory of discrete-time recurrent Markov chains with uncountable state space, as developed following Harris. The general idea is to recognize a suitable regenerative structure, like what happens to a discrete-time, discrete-space Markov chain each time it returns to a fixed point, and then decompose the path into blocks which are i.i.d. This idea can also be applied to continuous-time, discrete-space chains. We now discuss a continuous-time, discrete-space Markov chain with time-homogeneous transition probabilities. Let S denote the state space. The theory is easiest if S is finite; some aspects can be extended to countable S. The book of Karlin and Taylor [5] provides details for most of the following discussion. See also [3...
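To make the regenerative picture concrete, here is a minimal sketch, not taken from the sources above: it simulates a finite-state, time-homogeneous continuous-time chain from a made-up generator matrix Q and splits the path into the cycles between successive returns to a chosen state. The generator, the chosen return state, and all rates are assumptions made only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up 3-state generator: off-diagonal entries are jump rates, rows sum to zero.
Q = np.array([[-1.0,  0.6,  0.4],
              [ 0.5, -1.5,  1.0],
              [ 0.3,  0.7, -1.0]])

def simulate_ctmc(Q, x0, t_max):
    """Simulate the chain up to time t_max; return jump times and visited states."""
    rates = -np.diag(Q)                       # holding-time rates q_i
    P = Q / rates[:, None]                    # embedded jump-chain matrix
    np.fill_diagonal(P, 0.0)
    t, x = 0.0, x0
    times, states = [0.0], [x0]
    while True:
        t += rng.exponential(1.0 / rates[x])  # Exp(q_x) holding time in state x
        if t > t_max:
            break
        x = rng.choice(len(Q), p=P[x])        # jump according to the jump chain
        times.append(t)
        states.append(x)
    return times, states

times, states = simulate_ctmc(Q, x0=0, t_max=200.0)

# Regenerative decomposition: the cycles between successive returns to state 0
# are i.i.d. by the strong Markov property.
return_times = [times[i] for i, s in enumerate(states) if s == 0]
cycle_lengths = np.diff(return_times)
print("number of cycles:", len(cycle_lengths),
      " mean cycle length:", cycle_lengths.mean())
```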
We consider time-inhomogeneous Markov chains on a finite state space, whose transition probabiliti...
A classical random walk $(S_t,\ t \in \mathbb{N})$ is defined by $S_t := \sum_{n=0}^{t} X_n$, where $(X_n)$ are i.i.d. When the inc...
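As a quick illustration of this definition (my own sketch, not drawn from the paper itself), the following simulates such a walk with ±1 increments, a choice of increment law made only to keep the example concrete.

```python
import numpy as np

rng = np.random.default_rng(1)

t_max = 1000
X = rng.choice([-1, 1], size=t_max + 1)   # i.i.d. increments X_0, ..., X_t
S = np.cumsum(X)                          # S_t = sum_{n=0}^{t} X_n
print("S_10 =", S[10], "  S_1000 =", S[t_max])
```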
Time-homogeneous Markov chains with finite state space in discrete time. 1 Theory. The following is a di...
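A basic computation from that theory, sketched here with a made-up 3-state transition matrix P (an assumption for illustration, not an example from the notes): the stationary distribution of a finite, time-homogeneous, discrete-time chain is a normalized left eigenvector of P for eigenvalue 1.

```python
import numpy as np

# Made-up 3-state transition matrix (rows sum to one).
P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.7, 0.1],
              [0.0, 0.3, 0.7]])

# The stationary distribution pi satisfies pi P = pi, i.e. it is a left
# eigenvector of P for eigenvalue 1, normalized to sum to one.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()
print("stationary distribution:", pi)
print("check pi P = pi:", np.allclose(pi @ P, pi))
```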
In this paper we extend the results of Meyn and Tweedie (1992b) from discrete-time parameter to cont...
A rigorous and largely self-contained account of (a) the bread-and-butter concepts and techniques in...
Bounds on convergence rates for Markov chains are a very widely-studied topic, motivated la...
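For context, the quantity such bounds typically control can be computed directly in a toy example; the transition matrix below is my own illustrative choice, not one taken from the cited work.

```python
import numpy as np
from numpy.linalg import matrix_power

# Made-up 3-state transition matrix and its stationary distribution pi.
P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.7, 0.1],
              [0.0, 0.3, 0.7]])
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()

# Total variation distance between the n-step law started at state 0 and pi.
for n in (1, 5, 10, 20, 40):
    law_n = matrix_power(P, n)[0]            # row 0 of P^n = law of X_n given X_0 = 0
    tv = 0.5 * np.abs(law_n - pi).sum()
    print(f"n = {n:3d}   TV distance to pi = {tv:.2e}")
```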
A continuous-time Markov process (CTMP) is a collection of variables indexed by a continuous quantit...
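A small sketch of how such a process is commonly specified on a finite state space (an assumption here, since the snippet is cut off): a generator matrix Q determines the transition matrices P(t) = exp(tQ). The 2-state generator below is invented for illustration.

```python
import numpy as np
from scipy.linalg import expm

# Made-up 2-state generator: rows sum to zero, off-diagonal entries are jump rates.
Q = np.array([[-0.5,  0.5],
              [ 1.0, -1.0]])

# Transition function of the process: P(t)[i, j] = Prob(X_t = j | X_0 = i).
for t in (0.1, 1.0, 10.0):
    P_t = expm(t * Q)
    print(f"t = {t:4.1f}")
    print(P_t.round(4))
```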
This dissertation is devoted to the study of discrete Markov chains with values in a space of ...
Consider a system that is always in one of ...
This book concerns discrete-time homogeneous Markov chains that admit an invar...
Markov chains are a fundamental class of stochastic processes. They are widely used to solve problem...
In a recent paper, van Doorn (1991) explained how quasi-stationary distributions for an absorbing bi...
Contents: 1.1 Discrete-time Markov chains. 1.2 Continuous-time Markov chains ...