Psychological research has found that human perception of randomness is biased. In particular, people consistently show the overalternating bias: they rate binary sequences of symbols (such as Heads and Tails in coin flipping) with an excess of alternation as more random than prescribed by the normative criterion of Shannon's entropy. Within data mining for medical applications, Marcellin proposed an asymmetric measure of entropy that is well suited to account for this bias and to quantify subjective randomness. We fitted Marcellin's entropy and Rényi's entropy (a generalized form of uncertainty measure comprising many different kinds of entropies) to experimental data found in the literature using the Differential Evolution algorithm. We obser...
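To make the fitting procedure concrete, here is a minimal sketch (my illustration, not the authors' code) that fits the order parameter of Rényi's entropy and the peak parameter of Marcellin's asymmetric entropy to randomness ratings with SciPy's Differential Evolution. The data values, the sum-of-squares objective, and the exact functional forms are assumptions based on the standard definitions, not taken from the paper.

```python
# A minimal sketch, assuming standard textbook forms of the two entropies.
# `p_alt` (alternation rates) and `ratings` are made-up illustrative values,
# not the experimental data used in the paper.
import numpy as np
from scipy.optimize import differential_evolution

p_alt = np.linspace(0.1, 0.9, 9)                   # alternation rate of each stimulus
ratings = np.array([0.20, 0.45, 0.62, 0.80, 0.93,  # hypothetical mean ratings,
                    1.00, 0.90, 0.62, 0.30])       # peaking above 0.5 (overalternation)

def renyi(p, alpha):
    """Binary Renyi entropy of order alpha (alpha != 1), in bits."""
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return np.log2(p**alpha + (1 - p)**alpha) / (1 - alpha)

def marcellin(p, w):
    """Marcellin's asymmetric binary entropy, maximal (= 1) at p = w."""
    return p * (1 - p) / ((1 - 2 * w) * p + w**2)

def sse(theta, model):
    """Sum of squared errors between model predictions and ratings."""
    return np.sum((ratings - model(p_alt, theta[0]))**2)

# Bounds for alpha start above 1 to avoid the removable singularity at alpha = 1.
for name, model, bounds in [("Renyi", renyi, [(1.05, 5.0)]),
                            ("Marcellin", marcellin, [(0.05, 0.95)])]:
    fit = differential_evolution(sse, bounds, args=(model,), seed=0)
    print(f"{name}: parameter = {fit.x[0]:.3f}, SSE = {fit.fun:.4f}")
```

One structural point the sketch makes visible: binary Rényi entropy is symmetric about p = 0.5 for every order, while Marcellin's entropy peaks at p = w, so only the latter can place maximal subjective randomness at an alternation rate above 0.5, which is the signature of the overalternating bias.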
Intelligence is often related to the behavioural complexity an agent can generate. For example, when...
In many tasks, human behavior is far noisier than is optimal. Yet when asked to behave randomly, peo...
We give a survey of the basic statistical ideas underlying the definition of entropy in information...
(A) Data redrawn from Falk (1975) and reported in [http://www.ploscompbiol.org/article/i...
Human randomness perception is commonly described as biased. This is because when generating random ...
This article analyzes the role of entropy in Bayesian statistics, focusing on its use as a tool for ...
In many areas of computer science, it is of primary importance to assess the r...
Psychologists have studied people's intuitive notions of randomness by two kinds of tasks: j...
Searching for information is critical in many situations. In medicine, for instance, careful choice ...
As a measure of randomness or uncertainty, the Boltzmann–Shannon entropy H has become one of the mos...
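For reference, for a discrete distribution $p = (p_1, \dots, p_n)$ this entropy is defined by

$$ H(p) \;=\; -\sum_{i=1}^{n} p_i \log p_i , $$

which is maximal ($\log n$) for the uniform distribution and zero for a degenerate one.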
Shannon entropy is often considered a measure of uncertainty. It is commonly believed that entro...
A variety of conceptualizations of psychological uncertainty exist. From an information-theoretic per...
Many algorithms of machine learning use an entropy measure as optimization criterion. Among the wide...
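As a toy illustration (mine, not the cited work's) of entropy serving as an optimization criterion, the sketch below picks the threshold for a binary split that maximizes Shannon information gain, as decision-tree learners do; the data and the threshold search are made up.

```python
# A toy sketch, not from the cited work: entropy as an optimization criterion.
# We choose the split threshold on a 1-D feature that maximizes Shannon
# information gain. `x` and `y` are made-up data.
import numpy as np

def shannon(labels):
    """Shannon entropy (bits) of a 1-D array of class labels."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def information_gain(y, mask):
    """Entropy reduction from splitting labels y by a boolean mask."""
    n = len(y)
    left, right = y[mask], y[~mask]
    return shannon(y) - (len(left) / n) * shannon(left) \
                      - (len(right) / n) * shannon(right)

x = np.array([1.0, 2.0, 3.0, 10.0, 11.0, 12.0])  # feature values
y = np.array([0, 0, 0, 1, 1, 1])                 # class labels

gain, threshold = max((information_gain(y, x <= t), t) for t in x[:-1])
print(f"best split: x <= {threshold} with gain {gain:.3f} bits")
# -> best split: x <= 3.0 with gain 1.000 bits (classes separated perfectly)
```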
Since its inception, the concept of entropy has been applied in various fields like Computer Science...