In signal processing, ARMA processes are widely used to model short-memory processes. In various applications, comparing or classifying ARMA processes is required. In this paper, our purpose is to provide analytical expressions of the divergence rates of the Kullback-Leibler divergence, the Rényi divergence (RD) of order α, and their symmetric versions for two Gaussian ARMA processes, by taking advantage of results such as the Yule-Walker equations and notions such as inverse filtering. The divergence rates can be interpreted as the sum of different quantities: the power of one ARMA process filtered by the inverse filter associated with the second ARMA process, the cepstrum, etc. Finally, illustrations show that the ranges of values taken by the divergence...
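As a point of reference (a classical result, not reproduced from the paper itself), the Kullback-Leibler divergence rate between two zero-mean stationary Gaussian processes with power spectral densities $S_1(\omega)$ and $S_2(\omega)$ admits the spectral-domain form

\[
\lim_{n\to\infty}\frac{1}{n}\,D_{KL}\!\left(p_1^{(n)}\,\|\,p_2^{(n)}\right)
= \frac{1}{4\pi}\int_{-\pi}^{\pi}\left[\frac{S_1(\omega)}{S_2(\omega)} - \ln\frac{S_1(\omega)}{S_2(\omega)} - 1\right]\mathrm{d}\omega ,
\]

where $p_i^{(n)}$ denotes the joint density of $n$ consecutive samples of process $i$. The ratio term is proportional (up to the driving-noise variance) to the power of the first process at the output of the inverse, i.e. whitening, filter associated with the second, and the $\ln(S_1/S_2)$ term integrates to a difference of zeroth cepstral coefficients $\frac{1}{2\pi}\int_{-\pi}^{\pi}\ln S_i(\omega)\,\mathrm{d}\omega$, which is consistent with the interpretation sketched in the abstract.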
Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is rel...
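For readers less familiar with the quantity being compared throughout this section, recall the standard definition (textbook material, not a contribution of the cited work): the Rényi divergence of order $\alpha$ between densities $p$ and $q$ is

\[
D_\alpha(p\,\|\,q) = \frac{1}{\alpha-1}\,\ln \int p(x)^{\alpha}\, q(x)^{1-\alpha}\,\mathrm{d}x , \qquad \alpha>0,\ \alpha\neq 1 ,
\]

and it converges to the Kullback-Leibler divergence as $\alpha\to 1$.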
In many applications, such as image retrieval and change detection, we need to assess the similarity...
As a probabilistic distance between two probability density functions, Kullback-Leibler divergence i...
In this paper, we aim at analyzing the differences between three families of divergences used to com...
In this paper we consider the problem of discriminating between stationary Gaussian processes. We ex...
Comparing processes or models is of interest in various applications. Among the existing approaches,...
The purpose of this paper is first to derive the expressions of various divergences that can be expr...
Kullback-Leibler divergence is a leading measure of similarity or dissimilarity of probability distr...
Kullback-Leibler (KL) divergence is one of the most important divergence measures between probabilit...
The Kullback-Leibler (KL) divergence is a fundamental equation of information theory that quantifies...
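Since several of the snippets above invoke it, it may help to state the standard definition explicitly (again, general background rather than a result of any single cited paper): for densities $p$ and $q$,

\[
D_{KL}(p\,\|\,q) = \int p(x)\,\ln\frac{p(x)}{q(x)}\,\mathrm{d}x ,
\]

which is nonnegative and equals zero if and only if $p = q$ almost everywhere.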
A class of processes with a time varying spectral representation is introduced. A time varying spectr...
This article proposes to monitor industrial process faults using Kullback-Leibler (KL) di...
The Kullback-Leibler divergence (KLD) between two multivariate generalized Gau...
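To make the spectral-domain quantities discussed above concrete, here is a minimal numerical sketch (an illustration under stated assumptions, not code from any of the papers summarized above). It evaluates the KL divergence rate between two zero-mean Gaussian AR(1) processes from their power spectral densities; the function names and parameter values are chosen purely for illustration.

import numpy as np

def ar1_psd(a, sigma2, omega):
    # Power spectral density of a zero-mean Gaussian AR(1) process
    # x[t] = a * x[t-1] + e[t], with driving-noise variance sigma2.
    return sigma2 / np.abs(1.0 - a * np.exp(-1j * omega)) ** 2

def kl_divergence_rate(psd1, psd2, n_grid=4096):
    # Numerical KL divergence rate between two stationary Gaussian processes,
    # using the spectral integral (1/4pi) * Int[ S1/S2 - ln(S1/S2) - 1 ] d(omega).
    omega = np.linspace(-np.pi, np.pi, n_grid, endpoint=False)
    ratio = psd1(omega) / psd2(omega)
    integrand = ratio - np.log(ratio) - 1.0
    # mean over the grid approximates (1/2pi) * integral, so divide by 2 for 1/4pi
    return integrand.mean() / 2.0

# Example: two AR(1) processes with different pole locations.
S1 = lambda w: ar1_psd(a=0.5, sigma2=1.0, omega=w)
S2 = lambda w: ar1_psd(a=0.9, sigma2=1.0, omega=w)
print(kl_divergence_rate(S1, S2))
print(kl_divergence_rate(S2, S1))

Swapping the two processes changes the value, which reflects the well-known asymmetry of the KL divergence; a symmetrized version such as the Jeffreys divergence sums the two directed rates.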