Abstract — The problem of truly-lossless (Pe = 0) distributed source coding [1] requires knowledge of the joint statistics of the sources. In particular, the locations of the zeroes of the probability mass functions (pmfs) are crucial for encoding at rates below (H(X), H(Y)) [2]. We consider the distributed computation of the empirical joint pmf Pn of a sequence of random variable pairs observed at physically separated nodes of a network. We consider both worst-case and average measures of information exchange and treat both exact calculation of Pn and a notion of approximation. We find that in all cases the communication cost grows linearly with the size of the input. Further, we consider the problem of determining whether the empirical...
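To make the central object concrete, the following is a minimal sketch of how the empirical joint pmf Pn (the joint type) of a pair sequence can be computed when both sequences are available at one node; it only illustrates the quantity the abstract refers to, not the distributed protocol analyzed in the paper, and the function name `empirical_joint_pmf` is illustrative rather than from the source.

```python
from collections import Counter

def empirical_joint_pmf(xs, ys):
    """Empirical joint pmf (joint type) P_n of the pair sequence (x^n, y^n).

    Returns a dict mapping each observed symbol pair (x, y) to its relative
    frequency; pairs that never occur have empirical probability 0.
    """
    assert len(xs) == len(ys), "the two sequences must have the same length n"
    n = len(xs)
    counts = Counter(zip(xs, ys))
    return {pair: c / n for pair, c in counts.items()}

# Example: n = 8 observations of a binary pair source.
x_seq = [0, 0, 1, 1, 0, 1, 0, 1]
y_seq = [0, 1, 1, 1, 0, 0, 0, 1]
print(empirical_joint_pmf(x_seq, y_seq))
# e.g. {(0, 0): 0.375, (0, 1): 0.125, (1, 1): 0.375, (1, 0): 0.125}
```

In the distributed setting of the abstract, x^n and y^n reside at separate nodes, so the interesting question is how many bits must be exchanged to obtain Pn (exactly or approximately) rather than how to tabulate it centrally as above.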