This paper deals with the computation of invariant measures and stationary expectations for discrete-time Markov chains governed by a block-structured one-step transition probability matrix. The method generalizes, in some respects, Neuts' matrix-geometric approach to vector-state Markov chains. It reveals a strong relationship between Markov chains and matrix continued fractions, which can provide valuable insight for mastering the growing complexity of real-world applications such as large-scale grid systems and multidimensional level-dependent Markov models. The results obtained are extended to continuous-time Markov chains.
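To make the matrix-geometric starting point concrete, the following is a minimal sketch of Neuts' classical method for a level-independent, discrete-time quasi-birth-death (QBD) chain; it is not the paper's continued-fraction generalization, and the block values, variable names, and boundary structure below are hypothetical illustrative choices (chosen so that A0 + A1 + A2 is stochastic and the mean drift points downward, ensuring positive recurrence).

```python
import numpy as np

# Hypothetical 2x2 blocks of a discrete-time QBD transition matrix.
# Levels k >= 1 use A0 (one level up), A1 (same level), A2 (one level down);
# level 0 uses boundary blocks B00 (stay at 0) and B01 (level 0 -> level 1).
A0 = np.array([[0.1, 0.0], [0.0, 0.1]])
A1 = np.array([[0.4, 0.2], [0.2, 0.4]])
A2 = np.array([[0.3, 0.0], [0.0, 0.3]])
B00 = np.array([[0.7, 0.2], [0.2, 0.7]])
B01 = A0

m = A0.shape[0]
I = np.eye(m)

# R matrix: minimal nonnegative solution of R = A0 + R A1 + R^2 A2,
# computed here by plain fixed-point iteration (simple, but only linearly convergent).
R = np.zeros((m, m))
for _ in range(10_000):
    R_next = A0 + R @ A1 + R @ R @ A2
    if np.max(np.abs(R_next - R)) < 1e-12:
        R = R_next
        break
    R = R_next

# Boundary equations for (pi_0, pi_1), using pi_2 = pi_1 R:
#   pi_0 = pi_0 B00 + pi_1 A2
#   pi_1 = pi_0 B01 + pi_1 (A1 + R A2)
M = np.block([[B00, B01],
              [A2, A1 + R @ A2]])
lhs = (M - np.eye(2 * m)).T

# Normalization: pi_0 1 + pi_1 (I - R)^{-1} 1 = 1.
norm_row = np.concatenate([np.ones(m), np.linalg.solve(I - R, np.ones(m))])
A_sys = np.vstack([lhs, norm_row])
b = np.zeros(2 * m + 1)
b[-1] = 1.0
x, *_ = np.linalg.lstsq(A_sys, b, rcond=None)
pi0, pi1 = x[:m], x[m:]

# Matrix-geometric tail: pi_k = pi_1 R^(k-1) for k >= 1.
total = pi0.sum() + pi1 @ np.linalg.solve(I - R, np.ones(m))
print("pi_0 =", pi0)
print("pi_1 =", pi1)
print("total probability mass =", total)  # should be 1 up to numerical error
```

The fixed-point iteration for R is chosen here only for transparency; in practice, quadratically convergent schemes (e.g., logarithmic reduction) are preferred, and level-dependent blocks of the kind treated in the paper require the continued-fraction machinery rather than a single R matrix.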