Algorithmic information theory gives an idealized notion of compressibility that is often presented as an objective measure of simplicity. It is suggested at times that Solomonoff prediction, or algorithmic information theory in a predictive setting, can deliver an argument to justify Occam’s razor. This article explicates the relevant argument and, by converting it into a Bayesian framework, reveals why it has no such justificatory force. The supposed simplicity concept is better perceived as a specific inductive assumption, the assumption of effectiveness. It is this assumption that is the characterizing element of Solomonoff prediction and wherein its philosophical interest lies.
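For orientation, the formal notions the abstract appeals to can be stated in their standard textbook form; the sketch below uses the usual notation (a universal prefix machine $U$, program length $\ell(p)$, prefix complexity $K$, and Solomonoff's universal prior $M$) and summarizes standard definitions rather than quoting the article.

\[
K(x) = \min\{\,\ell(p) : U(p) = x\,\} \qquad \text{(prefix Kolmogorov complexity of a finite binary string } x\text{)}
\]
\[
M(x) = \sum_{p \,:\, U(p)\ \text{outputs a string beginning with}\ x} 2^{-\ell(p)} \qquad \text{(Solomonoff's universal prior)}
\]
Prediction then proceeds by Bayesian conditioning:
\[
M(x_{n+1} \mid x_1 \dots x_n) = \frac{M(x_1 \dots x_n x_{n+1})}{M(x_1 \dots x_n)},
\]
so that data generated by shorter programs, i.e., more compressible data, receive greater prior weight. It is this compressibility-based weighting that is commonly glossed as a preference for simplicity, and that the article reinterprets as the inductive assumption of effectiveness rather than as a justification of Occam's razor.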
In this paper, we analyze the problem of prediction in physics from the computational viewpoint. We ...
The advent of formal definitions of the simplicity of a theory has important implications for model ...
A long-standing debate in perception concerns the question of whether perceptual organization is gui...
The framework of Solomonoff prediction assigns prior probability to hypotheses inversely proportiona...
Solomonoff’s optimal but noncomputable method for inductive inference assumes that observation s...
Understanding inductive reasoning is a problem that has engaged mankind for thousands of years. This...
I discuss the use of Kolmogorov complexity and Bayes’ theorem in Solomonoff’s inductive method to ex...
In this thesis I investigate the theoretical possibility of a universal method of prediction. A pred...
in terms of minimizing retractions en route to the truth, relative to all deterministic scientific s...
Solomonoff’s inductive learning model is a powerful, universal and highly elegant theory of sequence...
We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We e...
James McAllister’s 2003 article, “Algorithmic randomness in empirical data,” claims that empirical d...
In contrast to statistical entropy, which measures the quantity of information in an average object ...