Abstract—Classical rate-distortion theory requires specifying a source distribution. Instead, we analyze rate-distortion properties of individual objects using the recently developed algorithmic rate-distortion theory. The latter is based on the noncomputable notion of Kolmogorov complexity. To apply the theory we approximate the Kolmogorov complexity by standard data compression techniques, and perform a number of experiments with lossy compression and denoising of objects from different domains. We also introduce a natural generalization to lossy compression with side information. To maintain full generality we need to address a difficult search problem. While our solutions are therefore not time efficient, we do observe good denoising...
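The approximation strategy this abstract describes, replacing the noncomputable Kolmogorov complexity with the length of a standard compressor's output, can be sketched minimally in Python with `zlib`. This is an illustrative sketch, not the paper's actual method: the function names, the byte-wise quantizer, and the synthetic input are all assumptions made for the example.

```python
import zlib

def approx_K(data: bytes) -> int:
    # Upper-bound the Kolmogorov complexity K(data) by the length of
    # a zlib-compressed encoding (any real compressor gives an upper bound).
    return len(zlib.compress(data, 9))

def rate_distortion_point(data: bytes, step: int) -> tuple[int, float]:
    # Produce one (rate, distortion) point: coarsely quantize each byte,
    # take the compressed size of the quantized object as the rate, and
    # the mean absolute byte error as the distortion.
    quantized = bytes((b // step) * step for b in data)
    rate_bits = 8 * approx_K(quantized)
    distortion = sum(abs(a - q) for a, q in zip(data, quantized)) / len(data)
    return rate_bits, distortion

# Synthetic "object": sweeping the quantization step traces a crude
# rate-distortion curve for this individual input.
obj = bytes((i * 37 + i % 7) % 256 for i in range(1024))
for step in (1, 8, 32, 128):
    r, d = rate_distortion_point(obj, step)
    print(f"step={step:3d}  rate={r} bits  distortion={d:.2f}")
```

Coarser quantization should trade distortion for rate: the compressed size shrinks as the quantized object becomes more regular, which is the compressor-based stand-in for the algorithmic rate-distortion curve of a single object.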
We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to w...
Building upon a series of recent works on perception-constrained lossy compression, we develop a rat...
Understanding generalization in modern machine learning settings has been one of the major challenge...
We examine the structure of families of distortion balls from the perspective of Kolmogorov complexi...
Abstract—We present two results related to the computational complexity of lossy compression. The fi...
Abstract—Motivated by questions in lossy data compression and by theoretical considerations, the pro...
Abstract — We consider the problem of lossy data compression for data arranged on two-dimensional arr...
The development of a universal lossy data compression model based on a lossy version of the Kraft in...
In the context of lossy compression, Blau \& Michaeli \cite{blau2019rethinking} adopt a mathematical ...
Throughout the years, measuring the complexity of networks and graphs has been of great interest to ...
Kolmogorov complexity is a theory based on the premise that the complexity of a binary string can be...