Physics concepts have often been borrowed and independently developed by other fields of science. In this perspective, a significant example is that of entropy in information theory. The aim of this paper is to provide a short and pedagogical introduction to the use of data compression techniques for estimating the entropy and other relevant quantities in information theory and algorithmic information theory. We consider in particular the LZ77 algorithm as a case study and discuss how a zipper can be used for information extraction.
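To make the compression-based estimate concrete, the minimal sketch below (an illustrative assumption, not the paper's own procedure) uses Python's zlib, whose DEFLATE format is built on LZ77, to bound the entropy per character of a string by the length of its compressed representation.

```python
# Minimal sketch: the compressed length of a sequence, obtained with an
# LZ77-based zipper (here zlib/DEFLATE), upper-bounds its entropy per character.
import math
import zlib


def compressed_entropy_rate(text: str) -> float:
    """Estimate the entropy rate (bits per character) of `text`
    from the length of its zlib-compressed representation."""
    data = text.encode("utf-8")
    compressed = zlib.compress(data, 9)
    # 8 bits per compressed byte, spread over the original characters.
    return 8.0 * len(compressed) / len(data)


if __name__ == "__main__":
    import random

    random.seed(0)
    periodic = "abc" * 10_000                                   # highly regular
    noisy = "".join(random.choice("abc") for _ in range(30_000))  # ~uniform 3-symbol source
    print(f"periodic: {compressed_entropy_rate(periodic):.3f} bits/char")
    print(f"random  : {compressed_entropy_rate(noisy):.3f} bits/char")
    # Reference value: Shannon entropy of a uniform 3-symbol source.
    print(f"log2(3) = {math.log2(3):.3f} bits/char")
```

On the periodic string the estimate falls far below log2(3), while on the roughly uniform random string it stays slightly above it, as expected for an upper bound that includes compression overhead.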
This book is the first one that provides a solid bridge between algorithmic information theory and s...
In this paper, we present a general method for information extraction that exploits the features of ...
Data compression at its base is concerned with how information is organized in data. Understanding t...
In this short note we review the concept of complexity in the context of Information Theory (Shannon...
Information content and compression are tightly related concepts that can be addressed through both ...
In order to find out the limiting speed of solving a specific problem using a computer, this...
As the era of big data arises, people get access to vast amounts of multi-view data. Measuring, ...
In this paper, the role of pattern matching in information theory is motivated and discussed...
This paper describes a method which allows an estimation of information entropy in the sense of ...
This paper explores the idea of information loss through data compression, as occurs in the course o...
Calculations of entropy of a signal or mutual information between two variables are valuable analyti...
We study the relation between Information Theory and Automatic Problem Solving to demonstrate that t...
We study complexity and information and introduce the idea that while complexity is relative...
We investigate how information leakage reduces computational entropy of a random variable X. Recall ...