2009 IEEE International Symposium on Information Theory
We consider a finite-state memoryless channel with i.i.d. channel state and the input Markov process supported on a mixing finite-type constraint. We discuss the asymptotic behavior of the entropy rate of the output hidden Markov chain and deduce that the mutual information rate of such a channel is concave with respect to the parameters of the input Markov processes at high signal-to-noise ratio. In principle, the concavity result enables good numerical approximation of the maximum mutual information rate and capacity of such a channel. © 2009 IEEE.
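The abstract above concerns the entropy rate of the output hidden Markov chain that arises when a Markov input passes through a memoryless channel. As an illustrative sketch only (not the paper's method), the snippet below estimates that entropy rate for a toy instance: a binary symmetric Markov input with hypothetical flip probability `p` sent through a BSC with crossover `eps`, using the standard forward recursion and the Shannon-McMillan-Breiman theorem. All parameter values are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (not from the paper): a binary symmetric
# Markov input with flip probability p, sent through a BSC(eps).
p, eps, n = 0.3, 0.1, 200_000

P = np.array([[1 - p, p], [p, 1 - p]])          # input Markov transition matrix
E = np.array([[1 - eps, eps], [eps, 1 - eps]])  # channel emission P(y | x)

# Simulate the input chain and the channel output.
x = np.empty(n, dtype=int)
x[0] = rng.integers(2)
for t in range(1, n):
    x[t] = rng.choice(2, p=P[x[t - 1]])
y = np.where(rng.random(n) < eps, 1 - x, x)     # flip each symbol w.p. eps

# Forward recursion: accumulate log2 p(y_1^n); the normalized negative
# log-probability converges to the output entropy rate H(Y).
pi = np.array([0.5, 0.5])        # predictive distribution of x_t (stationary start)
log_prob = 0.0
for t in range(n):
    joint = pi * E[:, y[t]]      # p(x_t, y_t | y_1^{t-1})
    s = joint.sum()              # p(y_t | y_1^{t-1})
    log_prob += np.log2(s)
    pi = joint / s @ P           # filter, then predict the next state
entropy_rate = -log_prob / n
print(f"estimated output entropy rate: {entropy_rate:.3f} bits/symbol")
```

Subtracting the conditional entropy rate H(Y|X) (here just the BSC's binary entropy h(eps)) from such an estimate gives a Monte Carlo approximation of the mutual information rate, which is the quantity whose concavity the abstract addresses.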
This paper shows the existence of the optimal training, in terms of achievable mutual information ra...
In this thesis we study quantum mechanical processes with a Markovian character. We focus on matters...
Abstract — We study the capacity of Markov channels with causal deterministic partial (quantized) st...
Abstract—The computation of the capacity of a finite-state channel (FSC) is a fundamental and long-s...
The main concerns of this thesis are some special families of channels with memory, which are of gre...
The computation of the capacity of a finite-state channel (FSC) is a fundamental and long-standing o...
Inspired by the ideas from the field of stochastic approximation, we propose a randomized algorithm ...
Inspired by ideas from the field of stochastic approximation, we propose a randomized algorithm to c...
Inspired by ideas from the field of stochastic approximation, we propose a randomized algorithm to...
Abstract. We have no satisfactory capacity formula for most channels with finite states. Here, we co...
Abstract—We consider a class of finite-state Markov channels with feedback. We first introduce a sim...
We study a hidden Markov process which is the result of a transmission of the binary symmetric Marko...
This paper presents sufficient conditions for the direct computation of the entropy for functional (...
Determining the achievable rates at which information can be reliably transmitted across noisy chann...