Abstract. This article surveys applications of convex optimization theory to problems in information theory, including optimal robust algorithms for hypothesis testing; a fresh look at the relationships between channel coding and robust hypothesis testing; and the structure of optimal input distributions in channel coding. A key finding is that the optimal distribution achieving channel capacity is typically discrete, and that the distribution achieving an optimal error exponent for rates below capacity is always discrete. We find that the resulting codes significantly outperform traditional signal constellation schemes such as QAM and PSK.
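As a concrete illustration of the convex-optimization view of channel coding, the capacity-achieving input distribution of a discrete memoryless channel can be computed by the classical Blahut–Arimoto alternating-maximization algorithm. The sketch below is illustrative only (not the method of this article); the channel matrix and tolerances are example choices.

```python
import numpy as np

def blahut_arimoto(W, tol=1e-9, max_iter=10000):
    """Capacity of a discrete memoryless channel.

    W[x, y] = P(y | x); rows index inputs, columns index outputs.
    Returns (capacity in bits, capacity-achieving input distribution).
    """
    n_x = W.shape[0]
    p = np.full(n_x, 1.0 / n_x)  # start from the uniform input distribution
    for _ in range(max_iter):
        q = p @ W  # induced output distribution
        # D[x] = KL divergence D( W(.|x) || q ), with 0 log 0 = 0
        with np.errstate(divide="ignore", invalid="ignore"):
            ratio = np.where(W > 0, W / q, 1.0)
        D = np.sum(np.where(W > 0, W * np.log(ratio), 0.0), axis=1)
        # Multiplicative update: reweight inputs by exp of their divergence
        p_new = p * np.exp(D)
        p_new /= p_new.sum()
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    # Mutual information at the final input distribution, converted to bits
    q = p @ W
    ratio = np.where(W > 0, W / q, 1.0)
    C = np.sum(p * np.sum(np.where(W > 0, W * np.log(ratio), 0.0), axis=1))
    return C / np.log(2), p

# Example: binary symmetric channel with crossover probability 0.1,
# whose capacity is 1 - H2(0.1) ~= 0.531 bits per channel use.
W = np.array([[0.9, 0.1],
              [0.1, 0.9]])
C, p_opt = blahut_arimoto(W)
```

Because the mutual information is concave in the input distribution, this alternating maximization converges to the global optimum, which is the sense in which capacity computation is a convex program.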
In this article, we revisit the classical problem of channel coding and obtain novel results on prop...
This book is an updated version of the information theory classic, first published in 1990. About on...
In this paper, we use entropy functions to characterise the set of rate-capacity tuples achievable w...
This paper concerns the structure of optimal codes for stochastic channel models. An investigation o...
The channel capacity of a deterministic system with confidential data is an upper bound on...
This paper studies the problem of optimal channel design. For a given input probability distribution...
The form of capacity achieving input distribution is specified for a class of finite state channels ...
We evaluate the information capacity of channels for which the noise process is a Gaussian measure o...
Abstract—Convexity properties of error rates of a class of decoders, including the maximum-likelihoo...
Properties of optimal entropy-constrained vector quantizers (ECVQs) are studied for the squared err...
A central question in information theory is to determine the maximum success probability that can be...
We show that the entropy of a message can be tested in a device-independent way. Specifically, we co...
The field of information science has greatly developed, and applications in various fields have emer...