Data compression is a widely used technique for reducing the transmission rate of a source signal given the capacity bottleneck of an information channel. Classical rate-distortion theory and the information bottleneck method lay the theoretical foundations for lossy data compression from an information-theoretic perspective. In practice, however, a wide variety of lossy compression applications are constrained by non-information-theoretic bottlenecks, such as privacy, quantization, and dimensionality bottlenecks, and hence fall outside the scope of classical rate-distortion theory and the information bottleneck method. Moreover, more and more applications now deal with sophisticated machine learning (ML) tasks such as inferenc...
A grand challenge in representation learning is the development of computation...
This paper investigates, from information theoretic gr...
While rate distortion theory compresses data under a distortion constraint, information bottleneck (...
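As standard background for the preceding entry (textbook material, not quoted from the truncated abstract): the distortion-constrained compression problem of rate-distortion theory is

\[
R(D) \;=\; \min_{p(\hat{x}\mid x)\,:\,\mathbb{E}[d(X,\hat{X})]\le D} I(X;\hat{X}),
\]

where d is a prescribed per-letter distortion measure; the IB formulation instead constrains how much information the compressed representation retains about a separate relevance variable.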
A grand challenge in representation learning is the development of computational algorithms that lea...
We define the relevant information in a signal x ∈ X as being the information that this signal provi...
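As a reminder for the preceding entry (the standard formulation, not part of the abstract itself): with a relevance variable Y and a stochastic encoder p(t|x) obeying the Markov chain Y \leftrightarrow X \leftrightarrow T, the relevant-information trade-off is posed as minimizing the IB Lagrangian

\[
\mathcal{L}_{\mathrm{IB}}\big[p(t\mid x)\big] \;=\; I(X;T) \;-\; \beta\, I(T;Y),
\]

where the multiplier \beta \ge 0 balances compression of X against preservation of the information that T carries about Y.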
The practical successes of deep neural networks have not been matched by theoretical progress that s...
Submitted to the 2018 International Zurich Seminar on Information and Communication (IZS)...
This paper investigates a multi-terminal source coding problem unde...
This paper addresses the optimization of distributed compression in a sensor network. A direct commu...
This paper investigates a multi-terminal source coding problem under a logarithmic loss fidelity whi...
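For reference (a standard definition, not quoted from the abstract): the logarithmic-loss fidelity criterion scores a "soft" reconstruction \hat{x}, i.e. a probability distribution on the source alphabet, by

\[
d_{\log}(x,\hat{x}) \;=\; \log\frac{1}{\hat{x}(x)},
\]

and with the optimal soft reconstruction the expected log-loss reduces to a conditional entropy of the source given the decoder's observation, which is what links multi-terminal source coding under log-loss to the information bottleneck.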
The information bottleneck (IB) method is a technique for extracting information that is relevant fo...
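To make the preceding entry concrete, the following is a minimal sketch of the classical self-consistent IB iterations on a known discrete joint distribution p(x, y), assuming numpy is available. It is illustrative only and is not taken from any of the works listed here; the function name iterative_ib and all variable names are assumptions of this sketch.

import numpy as np

def iterative_ib(p_xy, n_t, beta, n_iter=200, seed=0):
    # p_xy: joint distribution over (X, Y), shape (|X|, |Y|), entries sum to 1
    # n_t:  cardinality of the bottleneck variable T
    # beta: trade-off parameter; larger beta preserves more relevance I(T; Y)
    rng = np.random.default_rng(seed)
    eps = 1e-12
    p_x = p_xy.sum(axis=1)                      # marginal p(x)
    p_y_given_x = p_xy / (p_x[:, None] + eps)   # conditional p(y|x)

    # random stochastic encoder p(t|x) as the starting point
    q_t_given_x = rng.random((p_xy.shape[0], n_t))
    q_t_given_x /= q_t_given_x.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        # marginal p(t) = sum_x p(x) p(t|x)
        q_t = q_t_given_x.T @ p_x
        # decoder p(y|t) = sum_x p(y|x) p(x|t)
        q_y_given_t = (q_t_given_x * p_x[:, None]).T @ p_y_given_x
        q_y_given_t /= (q_t[:, None] + eps)
        # KL( p(y|x) || p(y|t) ) for every pair (x, t)
        log_ratio = (np.log(p_y_given_x[:, None, :] + eps)
                     - np.log(q_y_given_t[None, :, :] + eps))
        kl = (p_y_given_x[:, None, :] * log_ratio).sum(axis=2)
        # encoder update: p(t|x) proportional to p(t) exp(-beta * KL)
        q_t_given_x = q_t[None, :] * np.exp(-beta * kl)
        q_t_given_x /= q_t_given_x.sum(axis=1, keepdims=True)

    return q_t_given_x  # soft assignment of each x to bottleneck values t

# toy usage: a 4x3 joint distribution compressed to |T| = 2 clusters
p_xy = np.array([[.20, .05, .00],
                 [.15, .10, .00],
                 [.00, .10, .15],
                 [.00, .05, .20]])
encoder = iterative_ib(p_xy, n_t=2, beta=5.0)

The returned encoder p(t|x) softly clusters the source symbols into n_t bottleneck values; sweeping beta traces out the trade-off between I(X;T) and I(T;Y).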
This dissertation develops a method for integrating information theoretic principles in distributed ...