This paper compares three methods for producing lower bounds on the minimax risk under quadratic loss. The first uses the bounds of Brown and Gajek. The second also uses the information inequality and yields bounds that are always at least as good as those of the first method. The third is the hardest-linear-family method of Donoho and Liu. The methods are applied in four examples, the last of which concerns a frequently studied problem in nonparametric regression.
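The information-inequality route to minimax lower bounds can be illustrated with a minimal sketch, assuming a Gaussian location model with a Gaussian prior (an example chosen for illustration, not taken from the paper): the van Trees form of the information inequality lower-bounds the Bayes risk by the reciprocal of the total information, and taking the supremum over priors lower-bounds the minimax risk.

```python
def van_trees_bound(n, sigma2, tau2):
    """Information-inequality (van Trees) lower bound on the Bayes risk
    for estimating the mean of N(theta, sigma2) from n observations,
    under a N(0, tau2) prior. Illustrative only; notation is not the paper's."""
    fisher_data = n / sigma2    # Fisher information carried by the sample
    fisher_prior = 1.0 / tau2   # information contributed by the Gaussian prior
    return 1.0 / (fisher_data + fisher_prior)

n, sigma2 = 10, 1.0
# The bound increases with the prior variance tau2; in this conjugate model
# its supremum over tau2 recovers the exact minimax risk sigma2 / n.
bounds = [van_trees_bound(n, sigma2, t) for t in (1.0, 10.0, 1e6)]
```

In this model the bound is sharp: letting the prior variance grow drives the bound up to sigma2 / n, which is the minimax risk for the Gaussian mean.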
Information inequalities for the minimax risk of sequential estimators are derived in the case where...
We show how to derive exact distribution-free nonparametric results for minimax risk when underlyi...
The problem of the minimax estimation of an additive nonparametric regression is considered. The reg...
This paper presents lower bounds for the minimax risk under quadratic loss, derived from information ...
The information inequality has been shown to be an effective tool for providing lower bounds for the...
The paper deals with the problem of nonparametric estimation of the Lp-norm, p ∈ ...
This paper presents lower bounds, derived from the information inequality, for the Bayes risk under ...
In this paper we present a direct and simple approach to obtain bounds on the asymptotic minimax ris...
Lower bounds involving f-divergences between the underlying probability measures are proved ...