In this paper we introduce a general method of establishing tight linear inequalities between different types of predictive complexity. Predictive complexity is a generalisation of Kolmogorov complexity; it bounds the ability of an algorithm to predict elements of a sequence. Our method relies upon probabilistic considerations and allows us to describe explicitly the sets of coefficients which correspond to true inequalities. We apply this method to two particular types of predictive complexity: logarithmic complexity, which coincides with a variant of Kolmogorov complexity, and square-loss complexity, which is interesting for applications.
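As a rough illustration of what "linear inequalities between predictive complexities" means (the notation below is assumed for illustration and is not quoted from the abstract), one asks for which coefficients such an inequality holds uniformly over all finite binary strings:

```latex
% Sketch of the general shape of the inequalities studied.
% K_1, K_2 denote predictive complexities for two loss functions
% (e.g. logarithmic loss and square loss); |x| is the length of x.
% The question is to describe the set of coefficients (a, b) for which
\[
  K_2(x) \;\le\; a\,K_1(x) \;+\; b\,|x| \;+\; O(1)
  \qquad \text{for all } x \in \{0,1\}^{*}.
\]
```

Tightness then means that the described set of coefficient pairs cannot be enlarged without the inequality failing on some sequence of strings.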
We bound the future loss when predicting any (computably) stochastic sequence online. Solomonoff fin...
The central problem in machine learning (and statistics) is the problem of predicting future...
It is well known in the theory of Kolmogorov complexity that most strings cannot be compress...
The notions of predictive complexity and of corresponding amount of information are consider...
Predictive complexity is a generalization of Kolmogorov complexity. It corresponds to an “op...
This paper studies sequence prediction based on the monotone Kolmogorov complexity Km = -log m,...
Predictive complexity is a generalisation of Kolmogorov complexity motivated by an on-line predictio...
The paper introduces a way of re-constructing a loss function from predictive complexity. We...
Kolmogorov's very first paper on algorithmic information theory (Kolmogorov, Problemy pereda...