Our work concerns inference about Akaike's (1973) AIC (an instance of penalized likelihood), which, as an estimator of the Kullback-Leibler divergence, is intimately related to the maximum likelihood estimator. Within inferential statistics, in the context of hypothesis testing, the Kullback-Leibler divergence and the Neyman-Pearson lemma are two fundamental concepts. Both concern likelihood ratios: the Neyman-Pearson lemma deals with the error rate of the likelihood ratio test, while the Kullback-Leibler divergence is the expectation of the log-likelihood ratio.
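For reference, the two objects this abstract ties together admit a compact statement (standard textbook definitions, sketched here for orientation rather than quoted from the cited works): the Kullback-Leibler divergence is the expected log-likelihood ratio under the true density, and the AIC is a penalized maximum log-likelihood.

% KL divergence between a true density f and a candidate density g
D_{\mathrm{KL}}(f \,\|\, g) \;=\; \mathbb{E}_{f}\!\left[ \log \frac{f(X)}{g(X)} \right],
\qquad
% AIC: penalized maximum log-likelihood
\mathrm{AIC} \;=\; -2 \log L(\hat{\theta}) + 2k,

where \hat{\theta} denotes the maximum likelihood estimate and k the number of freely estimated parameters.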
The Kullback information criterion (KIC) was proposed by Cavanaugh (1999) to serve as an asymptotica...
A goodness-of-fit test for the case of univariate regression is proposed. This test...
We propose a new definition of the Neyman chi-square divergence between distributions. Based...
Kullback-Leibler divergence and the Neyman-Pearson lemma are two fundamental concepts in statistics....
Patrice Bertail (referee), Denis Bosq (committee chair), Michel Delecroix, Dominique Picard, Ya'acov Rit...
We propose new nonparametric Rényi-α and Tsallis-α divergence estimators for continuous...
Recently, Azari et al. (2006) showed that the AIC criterion and its corrected ver...
The Kullback-Leibler (KL) divergence is a fundamental equation of information theory that quantifies...
The Akaike information criterion (AIC) is a widely used tool for model selection. AIC is derived as ...
Kullback-Leibler divergence is a leading measure of similarity or dissimilarity of probability distr...
We consider fitting uncategorized data to a parametric family of distributions...
This paper proposes a new estimation algorithm for the univariate Cox-Ingerso...