We show that the natural scaling of measurement for a particular problem defines the most likely probability distribution of observations taken from that measurement scale. Our approach extends the method of maximum entropy to use measurement scale as a type of information constraint. We argue that a very common measurement scale is linear at small magnitudes grading into logarithmic at large magnitudes, leading to observations that often follow Student's probability distribution, which has a Gaussian shape for small fluctuations from the mean and a power-law shape for large fluctuations from the mean. An inverse scaling often arises in which measures naturally grade from logarithmic to linear as one moves from small to large magnitudes, leading...
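To make the claim concrete, here is a minimal numerical sketch (an illustration, not code from the paper): a maximum-entropy form with a constrained average of the linear-log scale T(y) = log(1 + y²/ν) is p(y) ∝ (1 + y²/ν)^(−λ), which coincides with Student's t distribution when λ = (ν + 1)/2. The choice ν = 3 and the use of scipy are assumptions of the demo.

```python
# Sketch: the maxent density exp(-lam * T(y)) with the linear-log scale
# T(y) = log(1 + y^2/nu) equals Student's t when lam = (nu + 1)/2.
import numpy as np
from scipy import stats
from scipy.integrate import quad

nu = 3.0                      # assumed degrees of freedom for this demo
lam = (nu + 1.0) / 2.0        # exponent that recovers Student's t

f = lambda y: (1.0 + y**2 / nu) ** (-lam)   # unnormalized maxent density
Z, _ = quad(f, -np.inf, np.inf)             # normalizing constant

y = np.linspace(-5.0, 5.0, 11)
# The normalized maxent form matches scipy's Student t to numerical precision
assert np.allclose(f(y) / Z, stats.t.pdf(y, df=nu), atol=1e-8)
print("maxent form matches Student's t for nu =", nu)
```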
This thesis is a formal presentation of entropy and related principles as they relate to probability...
The problem of determining the optimal number of multiple measurements based on the type of the erro...
The relationship between three probability distributions and their maximizable entropy forms is disc...
When we only have partial information about the probability distribution, i.e., when several differ...
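As a concrete instance of inference from partial information, the sketch below (an assumed illustration in the spirit of Jaynes' dice problem, not drawn from this abstract) finds the maximum-entropy distribution over six die faces given only a mean of 4.5, by solving for the Lagrange multiplier.

```python
# Maxent with partial information: over faces {1,...,6} with known mean only,
# the solution is p_k ∝ exp(-lam * k); solve for lam that matches the mean.
import numpy as np
from scipy.optimize import brentq

faces = np.arange(1, 7)
target_mean = 4.5             # the only information assumed about the die

def mean_given(lam):
    w = np.exp(-lam * faces)
    p = w / w.sum()
    return p @ faces

lam = brentq(lambda l: mean_given(l) - target_mean, -5.0, 5.0)
w = np.exp(-lam * faces)
p = w / w.sum()

print("multiplier:", lam)
print("maxent distribution:", np.round(p, 4))   # skewed toward high faces
print("entropy:", -(p * np.log(p)).sum())
```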
Probability distributions can be read as simple expressions of information. Each continuous probabil...
The history of the so called “Benford’s Law”, which concerns the distribution of the first significa...
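For readers unfamiliar with the law itself: first significant digits d = 1, ..., 9 occur with probability log10(1 + 1/d). A quick check (an added example; powers of 2 are a standard Benford-conforming sequence) follows.

```python
# First-digit frequencies of 2^n compared against Benford's prediction.
import numpy as np
from collections import Counter

first_digits = [int(str(2**n)[0]) for n in range(1, 1001)]
counts = Counter(first_digits)

for d in range(1, 10):
    empirical = counts[d] / 1000
    benford = np.log10(1 + 1 / d)
    print(f"d={d}: observed {empirical:.3f}  Benford {benford:.3f}")
```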
Shift and stretch invariance lead to the exponential-Boltzmann probability distribution. Rotational ...
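A small numeric illustration of the shift-invariance idea (a sketch, assuming the abstract means invariance of relative probability under translation): the exponential-Boltzmann form keeps probability ratios fixed under shifts, while, e.g., a Gaussian does not.

```python
# Shift invariance: p(y + a) / p(y' + a) = p(y) / p(y') for all shifts a.
# The exponential-Boltzmann shape satisfies this; a Gaussian serves as a
# counterexample.
import numpy as np

expo = lambda y: np.exp(-y)          # exponential-Boltzmann shape
gauss = lambda y: np.exp(-y**2)      # counterexample

y1, y2 = 1.0, 2.5
for a in [0.0, 0.7, 3.0]:
    r_exp = expo(y1 + a) / expo(y2 + a)
    r_gau = gauss(y1 + a) / gauss(y2 + a)
    print(f"shift {a}: exponential ratio {r_exp:.6f}, Gaussian ratio {r_gau:.6f}")
# The exponential ratio is constant across shifts; the Gaussian ratio is not.
```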
A number of “normalized” measures of entropy have been obtained to measure the “intrinsic” uncertain...
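One common normalization, which may or may not be among those the paper derives, divides Shannon entropy by its maximum ln n, so the result lies in [0, 1] and is comparable across alphabets of different size; a sketch:

```python
# Normalized Shannon entropy: H(p) / ln(n), equal to 1 for the uniform
# distribution and smaller for more concentrated distributions.
import numpy as np

def normalized_entropy(p):
    p = np.asarray(p, dtype=float)
    n = p.size
    nz = p[p > 0]                        # convention: 0 * ln 0 = 0
    H = -(nz * np.log(nz)).sum()
    return H / np.log(n) if n > 1 else 0.0

print(normalized_entropy([0.25, 0.25, 0.25, 0.25]))  # 1.0: maximal uncertainty
print(normalized_entropy([0.7, 0.1, 0.1, 0.1]))      # < 1: more concentrated
```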
Organizations of many variables in nature such as soil moisture and topography exhibit patterns with...
We present a novel derivation of the constraints required to obtain the underlying principles of sta...
Shannon’s entropy is calculated using probabilities $P(i)$, i.e., $S = -\sum_i P(i) \ln P(i)$. A proba...
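The stated formula in runnable form (conventions, such as treating 0 ln 0 as 0, are added here):

```python
# Shannon entropy of a discrete distribution P, S = -sum_i P(i) ln P(i), in nats.
import numpy as np

def shannon_entropy(P):
    P = np.asarray(P, dtype=float)
    nz = P[P > 0]                # convention: 0 * ln 0 = 0
    return -(nz * np.log(nz)).sum()

print(shannon_entropy([0.5, 0.5]))   # ln 2 ≈ 0.693, one fair coin flip
print(shannon_entropy([1.0, 0.0]))   # 0.0, no uncertainty
```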
In this paper an alternative approach to statistical mechanics based on the maximum inform...