defined as the argument of the log in the α-order Renyi entropy, has been successfully used as an information-theoretic criterion for supervised adaptive system training. In this paper, we use the survival function (or equivalently the distribution function) of an absolute-value-transformed random variable to define a new information potential, named the survival information potential (SIP). Compared with the IP, the SIP has several advantages, such as validity over a wide range of distributions, robustness, and simplicity of computation. The properties of the SIP and a simple formula for computing the empirical SIP are given in the paper. Finally, the SIP criterion is applied in adaptive system training, and simulation examples on FIR adaptive ...
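The abstract above mentions "a simple formula for computing the empirical SIP" without stating it. A minimal sketch of one natural estimator, assuming the SIP of order α is defined as the integral of the α-th power of the survival function of |X| (the function name `empirical_sip` and the order-statistic derivation below are this sketch's own, not taken from the paper):

```python
import numpy as np

def empirical_sip(samples, alpha=2.0):
    """Empirical survival information potential (SIP) of order alpha.

    Assumes SIP_alpha(X) = integral_0^inf Fbar_{|X|}(x)^alpha dx, and
    estimates it by plugging in the empirical survival function, which is
    piecewise constant between the sorted absolute samples.
    """
    z = np.sort(np.abs(np.asarray(samples, dtype=float)))  # order statistics of |x|
    n = len(z)
    # On the interval (z_{i-1}, z_i), the empirical survival function
    # equals (n - i + 1) / n (1-indexed), so the integral is a finite sum
    # of gap lengths weighted by those plateau values raised to alpha.
    weights = ((n - np.arange(n)) / n) ** alpha
    gaps = np.diff(np.concatenate(([0.0], z)))
    return float(np.sum(weights * gaps))
```

As a sanity check, for α = 1 this estimator reduces to the sample mean of |x|, since the integral of the survival function of a nonnegative variable is its expectation.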
The most widely used forms of model selection criteria, the Bayesian Information Criterion (BIC) an...
We consider the minimum error entropy (MEE) criterion and an empirical risk minimization learning a...
We review a decision theoretic, i.e., utility-based, motivation for entropy and Kullback-Leibler rel...
Abstract. In our recent studies we have proposed the use of minimum error entropy criterion as an al...
The error-entropy-minimization approach in adaptive system training is addressed in this paper. The ...
Recently we have proposed a recursive estimator for Renyi's quadratic entropy. This estimator c...
Adaptive filtering has gained wide popularity in recent times in nonstationary signal processing env...
In this paper, we propose minimizing the Fisher information of the error in supervised training of l...
Abstract—Recent publications have proposed various information-theoretic learning (ITL) criteria ba...
In this paper, we propose Minimum Error Entropy with self adjusting step-size (MEE-SAS) as an altern...
Information theoretical measures are used to design, from first principles, an objective function th...
First, this paper recalls a recently introduced method of adaptive monitoring of dynamical systems a...
In this paper, information theory is applied for probabilistic sensitivity ana...
In this paper, we propose a Minimum Error Entropy with self adjusting step-size (MEE-SAS) as an al...
In supervised infinite impulse response adaptive filtering, approximate gradient-based approaches ar...