We consider a finite-state memoryless channel with i.i.d. channel state and an input Markov process supported on a mixing finite-type constraint. We discuss the asymptotic behavior of the entropy rate of the output hidden Markov chain and deduce that the mutual information rate of such a channel is concave with respect to the parameters of the input Markov process at high signal-to-noise ratio. In principle, this concavity result enables good numerical approximation of the maximum mutual information rate and the capacity of such a channel.

1 Channel Model

In this paper, we show that for certain input-restricted finite-state memoryless channels, the mutual information rate, at high SNR, is effectively a concave function of Markov input processes o...
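The reduction underlying this program can be stated explicitly (a standard identity; the notation here is assumed, not taken from the abstract): for a stationary input process X passed through a memoryless channel with output Y,

```latex
I(X;Y) \;=\; \lim_{n\to\infty} \frac{1}{n}\, I\!\left(X_1^n; Y_1^n\right)
       \;=\; H(Y) \;-\; H(Y \mid X),
```

where, by memorylessness, the conditional term $H(Y\mid X)$ reduces to an explicit single-letter function of the input marginals. The analytic difficulty therefore concentrates in $H(Y)$, the entropy rate of the output hidden Markov chain, which is why its asymptotic behavior drives the concavity result above.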
This paper shows the existence of the optimal training, in terms of achievable mutual information ra...
We study the classical problem of noisy constrained capacity in the case of the binary symmetric cha...
Determining the achievable rates at which information can be reliably transmitted across noisy chann...
2009 IEEE International Symposium on Information Theory
We consider a finite-state memoryless channel...
We consider a memoryless channel with an input Markov process supported on a mixing finite-type cons...
The main concerns of this thesis are some special families of channels with memory, which are of gre...
Abstract—The computation of the capacity of a finite-state channel (FSC) is a fundamental and long-s...
Inspired by the ideas from the field of stochastic approximation, we propose a randomized algorithm ...
Abstract. We have no satisfactory capacity formula for most channels with finite states. Here, we co...
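For contrast with the finite-state setting these abstracts address, the memoryless case does have a satisfactory computational answer: the Blahut-Arimoto algorithm. A minimal sketch (the channel matrix and parameter values below are illustrative choices, not taken from any of the papers):

```python
import numpy as np

def kl_rows(W, q):
    """Row-wise KL divergence D(W[x,:] || q) in nats, with 0*log 0 = 0."""
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(W > 0, W * np.log(W / q), 0.0).sum(axis=1)

def blahut_arimoto(W, n_iter=200):
    """Capacity (bits) of a DMC with transition matrix W[x, y] = Pr(y | x)."""
    nx = W.shape[0]
    p = np.full(nx, 1.0 / nx)          # start from the uniform input distribution
    for _ in range(n_iter):
        D = kl_rows(W, p @ W)          # per-input divergence from the output law
        p = p * np.exp(D)              # multiplicative update
        p /= p.sum()
    return float(p @ kl_rows(W, p @ W)) / np.log(2)

# Sanity check on a BSC with crossover 0.1, whose capacity is 1 - h(0.1).
eps = 0.1
W_bsc = np.array([[1 - eps, eps], [eps, 1 - eps]])
C = blahut_arimoto(W_bsc)
```

The iteration is a fixed point of the capacity variational problem for memoryless channels; the finite-state channels considered here defeat it precisely because the mutual information rate is no longer a single-letter expression, which motivates the stochastic-approximation approaches above.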
Abstract—We consider a class of finite-state Markov channels with feedback. We first introduce a sim...
We study a hidden Markov process which is the result of a transmission of the binary symmetric Marko...
This paper presents sufficient conditions for the direct computation of the entropy for functional (...
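When closed-form entropy expressions are unavailable, the entropy rate of such an output hidden Markov chain can still be estimated numerically via the Shannon-McMillan-Breiman theorem and the forward recursion. A small sketch for a binary symmetric Markov source through a BSC (all parameter values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

p, eps = 0.3, 0.1                      # source flip prob., BSC crossover (assumed)
P = np.array([[1 - p, p], [p, 1 - p]])           # source transition matrix
B = np.array([[1 - eps, eps], [eps, 1 - eps]])   # B[i, y] = Pr(Y = y | X = i)

# Simulate the symmetric Markov source: flip the previous symbol with prob. p.
n = 100_000
flips = rng.random(n) < p
flips[0] = False
x = (rng.integers(2) + np.cumsum(flips)) % 2
y = x ^ (rng.random(n) < eps).astype(int)        # pass through the BSC

# Forward recursion: accumulate log Pr(y_1^n) with per-step normalization.
alpha = np.array([0.5, 0.5]) * B[:, y[0]]        # stationary law is uniform
c = alpha.sum()
loglik = np.log(c)
alpha /= c
for t in range(1, n):
    alpha = (alpha @ P) * B[:, y[t]]
    c = alpha.sum()
    loglik += np.log(c)
    alpha /= c

H = -loglik / (n * np.log(2))          # entropy-rate estimate, bits/symbol
```

By Shannon-McMillan-Breiman, -(1/n) log Pr(y_1^n) converges to the entropy rate almost surely; the estimate must land between h(p), the source entropy rate, and 1 bit.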