We obtain new entropy and mutual information formulae for regenerative stochastic processes. We apply them to Markov channels to generalize the results of Goldsmith and Varaiya [3], and we also obtain tighter bounds on capacity and better algorithms than those in [3].
In this paper, we focus our attention on the Rényi entropy rate of hidden Markov processes under ce...
This book is an updated version of the information theory classic, first published in 1990. About on...
One of the most basic characterizations of the relationship between two random variables, X and Y, i...
The channel capacity of a deterministic system with confidential data is an upper bound on...
This paper presents sufficient conditions for the direct computation of the entropy for functional (...
In this thesis we study quantum mechanical processes with a Markovian character. We focus on matters...
Markov regenerative processes are continuous‐time stochastic processes with more general conditions ...
Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process' intrinsic randomne...
Concentration inequalities are very often a crucial step in deriving many results in statistical ...
MS-Th-E-10: Stochastic Dynamics with Applications, paper no. MS-Th-E-10-1. In this talk, making use ...
In this paper we summarize our recently proposed work on the information theory analysis of regenera...
We study entropy rates of random sequences for general entropy functionals in...