Abstract. We obtain rates of strong approximation of the empirical process indexed by functions by a Brownian bridge under random entropy conditions only. The results of Berthet and Mason [P. Berthet, D.M. Mason, Revisiting two strong approximation results of Dudley and Philipp, in: High Dimensional Probability, IMS Lecture Notes–Monograph Series, vol. 51, 2006, pp. 155–172] under bracketing entropy are extended by combining their method with properties of the empirical entropy. Our results show that one can improve the universal rate v_n = o(log log n) from Dudley and Philipp [R.M. Dudley, W. Philipp, Invariance principles for sums of Banach space valued random elements and empirical processes, Z. Wahrsch. Verw. Gebiete 62 (1983) 509–552] into ...
Abstract. We prove a strong invariance principle for the two-parameter empirical process of stationary...
Abstract. Let ω1, ω2, … be a sequence of i.i.d. r.v. with Eω1 ≠ 0 and Var ω1 < ∞, with common distribu...
One of Shannon's intuitive schemes for estimating the entropy of printed English is generalized here...
We establish a Glivenko–Cantelli and a Donsker theorem for a class of random d...
The entropy rate of discrete random sources is a real-valued functional on the space of probability mea...
We give a characterization of Maximum Entropy/Minimum Relative Entropy inference by providing two ‘s...
Abstract. We approximate the empirical process, based on multivariate random samples with an arbitrary...
Abstract. In this note, we give an estimate for the difference between the rate function for empirical...
The concept of entropy rate is well defined in dynamical systems theory but is impossible to apply i...
It is well known that the entropy H(X) of a discrete random variable X is always greater than or equ...
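For context, the entropy H(X) referenced above is the usual Shannon entropy of a discrete distribution. A minimal sketch in Python (the helper name `shannon_entropy` is ours, not from any of the cited papers):

```python
import math

def shannon_entropy(pmf):
    """Shannon entropy H(X) = -sum_x p(x) * log2 p(x) of a discrete pmf.

    Zero-probability terms are skipped, following the convention 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# A fair coin attains the maximum entropy for two outcomes (1 bit);
# any biased coin has strictly smaller entropy.
print(shannon_entropy([0.5, 0.5]))  # → 1.0
print(shannon_entropy([0.9, 0.1]))
```

The maximum of H(X) over distributions on n outcomes is log2(n), attained by the uniform distribution; the fair-coin line above is the n = 2 case.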
Abstract. This paper extends results of Bolthausen and Schmock on the asymptotic behaviour of certai...