Classical probability theory considers probability distributions that assign probabilities to all events (at least in the finite case). However, there are natural situations where only part of the process is governed by some probability distribution, while for the other part we know only the set of possibilities, without any probabilities assigned. We adapt the notions of algorithmic information theory (complexity, algorithmic randomness, martingales, a priori probability) to this framework and show that many classical results remain valid.
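For orientation, the classical (fully probabilistic) versions of the notions listed above can be written as follows. This is background only, under the standard definitions with a universal prefix-free machine U; it is not the adapted framework developed in the paper.

% Background sketch of the classical notions (not the paper's generalized versions).
% U is a fixed universal prefix-free machine; x ranges over binary strings.
\[
  K(x) = \min\{\, |p| : U(p) = x \,\}
  \qquad\text{(prefix complexity)}
\]
\[
  \mathbf{m}(x) = \sum_{p \,:\, U(p) = x} 2^{-|p|}
  \qquad\text{(discrete a priori probability)}
\]
\[
  d(x) = \tfrac{1}{2}\bigl(d(x0) + d(x1)\bigr)
  \qquad\text{(martingale fairness condition for the uniform measure)}
\]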