We give an effective method for computing the entropy of polynomials orthogonal on a segment of the real axis, which uses as input data only the coefficients of the recurrence relation satisfied by these polynomials. The algorithm is based on a series expression for the mutual energy of two probability measures naturally connected with the polynomials. The particular case of Gegenbauer polynomials is analyzed in detail. These results are also applied to the computation of the entropy of spherical harmonics, which is important for the study of entropic uncertainty relations as well as the spatial complexity of physical systems in central potentials.
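As a rough numerical illustration of the quantity in question, and not of the series-based algorithm described above, the following sketch approximates the Shannon entropy E_n = -∫ p_n(x)^2 ln p_n(x)^2 w(x) dx of orthonormal polynomials directly from the recurrence coefficients: a Gauss quadrature rule is obtained from the Jacobi matrix (Golub-Welsch procedure) and p_n is evaluated by the three-term recurrence. The function names, the quadrature size, and the Chebyshev example at the end are illustrative assumptions, not taken from the paper.

import numpy as np

def gauss_quadrature(a, b, mu0):
    # Golub-Welsch: Gauss nodes/weights from the orthonormal three-term recurrence
    #   x p_k(x) = a[k+1] p_{k+1}(x) + b[k] p_k(x) + a[k] p_{k-1}(x),
    # with a[0] unused and mu0 = total mass of the orthogonality weight.
    n = len(b)
    J = np.diag(b) + np.diag(a[1:n], 1) + np.diag(a[1:n], -1)
    nodes, vecs = np.linalg.eigh(J)
    weights = mu0 * vecs[0, :] ** 2
    return nodes, weights

def eval_orthonormal(n, x, a, b, mu0):
    # Evaluate the orthonormal polynomial p_n at the points x by the recurrence.
    p_prev = np.zeros_like(x)
    p = np.full_like(x, 1.0 / np.sqrt(mu0))   # p_0 = 1/sqrt(mu_0)
    for k in range(n):
        p_next = ((x - b[k]) * p - a[k] * p_prev) / a[k + 1]
        p_prev, p = p, p_next
    return p

def entropy(n, a, b, mu0, quad_size=400):
    # E_n = -int p_n(x)^2 log p_n(x)^2 w(x) dx, approximated by Gauss quadrature.
    # The integrand is not a polynomial, so the rule is only approximate and
    # quad_size should be taken much larger than n.
    x, w = gauss_quadrature(a[:quad_size + 1], b[:quad_size], mu0)
    pn2 = eval_orthonormal(n, x, a, b, mu0) ** 2
    pn2 = np.maximum(pn2, 1e-300)             # guard the log near the zeros of p_n
    return -np.sum(w * pn2 * np.log(pn2))

# Example (assumed data): Chebyshev weight w(x) = 1/sqrt(1 - x^2) on [-1, 1],
# for which a_1 = 1/sqrt(2), a_k = 1/2 (k >= 2), b_k = 0 and mu_0 = pi.
N = 500
a = np.array([0.0, 1.0 / np.sqrt(2.0)] + [0.5] * (N - 1))
b = np.zeros(N)
print(entropy(5, a, b, np.pi))

The same routine applies to any weight on a segment once its recurrence coefficients are known (e.g. Gegenbauer polynomials), though for large n the series expression developed in the paper is the more efficient route.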