The paper describes an approach to measuring the convergence of an algorithm to its result in terms of an entropy-like function on partitions of its inputs of a given length. The goal is to view algorithmic data processing as information transformation, in the hope of better understanding how an algorithm works, and perhaps its complexity. Entropy is a measure of uncertainty and does not correspond to our intuitive understanding of information; however, it is what we have in this area. To realize this approach, we introduce a measure on the inputs of a given length based on the Principle of Maximal Uncertainty: all results should be equiprobable to the algorithm at the outset. An algorithm is viewed as a set o...
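The abstract is truncated here, but its central construction can be illustrated with a small sketch: partition the inputs of a given length by the algorithm's result, assign each result class equal probability (one reading of the Principle of Maximal Uncertainty), and take the Shannon entropy of that partition. The function name `partition_entropy` and the parity example are illustrative assumptions, not the paper's own code.

```python
# Illustrative sketch (not from the paper): entropy of the partition of
# inputs induced by an algorithm's result, with equiprobable result classes.
from collections import Counter
from itertools import product
from math import log2

def partition_entropy(inputs, algorithm):
    """Shannon entropy of the partition of `inputs` by result, where every
    result class is given equal probability (Principle of Maximal
    Uncertainty, as sketched here)."""
    classes = Counter(algorithm(x) for x in inputs)
    k = len(classes)
    p = 1.0 / k  # each of the k result classes is equiprobable
    return -sum(p * log2(p) for _ in range(k))

# Example: 3-bit inputs partitioned by parity give 2 classes, i.e. 1 bit.
bits3 = ["".join(b) for b in product("01", repeat=3)]
parity = lambda s: s.count("1") % 2
print(partition_entropy(bits3, parity))  # -> 1.0
```

With equiprobable classes this reduces to log2(k); the interesting behaviour, per the abstract, is how such a quantity evolves as the algorithm converges to its result.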
Given the widespread use of lossless compression algorithms to approximate algorithmic (Kolmogorov-C...
In the field of optimization using probabilistic models of the search space, this thesis identifies ...
Entropy gain is widely used for learning decision trees. However, as we go deeper downward the tree,...
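This abstract is also truncated, but the entropy-gain criterion it opens with is standard and can be sketched as follows (the helper names are illustrative, not from the paper):

```python
# Sketch of the standard entropy-gain (information-gain) criterion used
# when learning decision trees; names are illustrative assumptions.
from collections import Counter
from math import log2

def entropy(labels):
    """Empirical Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(labels, groups):
    """Entropy of the parent node minus the size-weighted entropy of the
    child groups produced by a candidate split."""
    n = len(labels)
    return entropy(labels) - sum(len(g) / n * entropy(g) for g in groups)

# A perfect binary split of a balanced node gains one full bit.
print(information_gain([0, 0, 1, 1], [[0, 0], [1, 1]]))  # -> 1.0
```

Note that deeper in the tree each node holds fewer samples, so these empirical entropy estimates become noisier, which is presumably the difficulty the truncated sentence goes on to discuss.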
Intuitively, an algorithm that is computing the value of a function for a give...
Algorithmic entropy can be viewed as a special case of the entropy studied in statistical mechanics....
Abstract—Entropy rate of sequential data-streams naturally quantifies the complexity of the generati...
Abstract—Some tools to measure convergence properties of genetic algorithms are introduced. A classi...
Convergence of genetic algorithms in the form of asymptotic stability requirements is discussed. Som...
We investigate the properties of a Block Decomposition Method (BDM), which extends the power of a Co...
We investigate the properties of a divide-and-conquer Block Decomposition Method (BDM), which extend...
In order to find out the limiting speed of solving a specific problem using comp...
We study how the Shannon entropy of sequences produced by an information source converges to the sou...
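The convergence this truncated abstract studies is the classical one: for a stationary source, the per-symbol block entropy H_n/n decreases toward the source's entropy rate as the block length n grows. A minimal empirical sketch (function name assumed, not from the paper):

```python
# Illustrative sketch: empirical per-symbol block entropy H_n / n, which
# for a stationary source decreases toward the entropy rate as n grows.
from collections import Counter
from math import log2

def block_entropy_rate(seq, n):
    """Estimate H_n / n from the length-n blocks of `seq`."""
    blocks = [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]
    total = len(blocks)
    h = -sum((c / total) * log2(c / total) for c in Counter(blocks).values())
    return h / n

# A periodic source has entropy rate 0, so the estimate falls with n.
seq = "01" * 100
print(block_entropy_rate(seq, 1))  # close to 1.0
print(block_entropy_rate(seq, 4))  # close to 0.25
```

On a finite sample the estimate is biased downward for large n (few distinct blocks are observed), which is typically the obstacle such convergence analyses address.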
In criticality safety evaluation, effective multiplication factor and its corresponding source distr...
In this paper we prove estimates on the behaviour of the Kolmogorov Sinai entropy relative to a part...