This paper deals with the estimation of transfer entropy based on the k-nearest neighbors (k-NN) method. To this end, we first investigate the estimation of Shannon entropy using a rectangular neighboring region, as suggested in the existing literature, and develop two kinds of entropy estimators. Then, applying the widely used error-cancellation approach to these entropy estimators, we propose two novel transfer entropy estimators that incur no extra computational cost compared to existing, similar k-NN algorithms. Experimental simulations allow the new estimators to be compared with the transfer entropy estimators available in free toolboxes, corresponding to two different extensions to transfer entropy estimation of the Krask...
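To make the k-NN machinery concrete, below is a minimal sketch, in Python, of a standard Kozachenko-Leonenko entropy estimator combined into a plug-in transfer entropy via the decomposition TE(Y->X) = H(X_t, X_{t-1}) + H(X_{t-1}, Y_{t-1}) - H(X_t, X_{t-1}, Y_{t-1}) - H(X_{t-1}). This is not the rectangular-neighborhood estimator proposed in the paper; the function names, the SciPy-based implementation, the max-norm neighborhoods, and the lag-1 embedding are illustrative assumptions.

import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def kl_entropy(x, k=4):
    """Kozachenko-Leonenko k-NN entropy estimate (in nats) with max-norm balls."""
    x = np.asarray(x, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    n, d = x.shape
    # Distance from each point to its k-th nearest neighbor (column 0 is the point itself).
    eps = cKDTree(x).query(x, k=k + 1, p=np.inf)[0][:, k]
    eps = np.maximum(eps, 1e-12)  # guard against duplicate samples
    return digamma(n) - digamma(k) + d * np.log(2.0) + d * np.mean(np.log(eps))

def transfer_entropy(x, y, k=4):
    """Plug-in estimate of TE(Y -> X) with lag 1, built from four entropy terms."""
    xt, xp, yp = x[1:], x[:-1], y[:-1]  # X_t, X_{t-1}, Y_{t-1}
    return (kl_entropy(np.column_stack([xt, xp]), k)
            + kl_entropy(np.column_stack([xp, yp]), k)
            - kl_entropy(np.column_stack([xt, xp, yp]), k)
            - kl_entropy(xp, k))

# Illustrative use: X driven by the past of Y should give TE(Y->X) > TE(X->Y).
rng = np.random.default_rng(0)
y = rng.normal(size=5000)
x = 0.8 * np.concatenate(([0.0], y[:-1])) + 0.5 * rng.normal(size=5000)
print(transfer_entropy(x, y, k=4), transfer_entropy(y, x, k=4))

Estimators of the kind shipped in the toolboxes mentioned above (e.g., Kraskov-style ones) typically go beyond this naive plug-in: they fix the neighborhood size in the highest-dimensional joint space and count neighbors within that radius in the lower-dimensional spaces, so that part of the bias of the individual entropy terms cancels, which is the error-cancellation idea the abstract refers to.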
This paper introduces a class of k-nearest neighbor (k-NN) estimators called bi-partite plug-in (BPI...
In this paper, we investigate efficient estimation of differential entropy for multivariate random v...
Information flow between systems can be quantified using measures derived from en...
Nearest neighbour methods are a classical approach in nonparametric statistics. The k-nearest neighb...
The k Nearest Neighbor (kNN) method is an important statistical method. There are several advantages of ...
Entropy estimation is an important problem in information theory and statistical science. Many popul...
A consistent entropy estimator for hyperspherical data is proposed based on the k-nearest neighbor (...
Statistical relationships among the variables of a complex system reveal a lot about its physical be...
Many statistical procedures, including goodness-of-fit tests and methods for independent component a...
A class of estimators of the Rényi and Tsallis entropies of an unknown distribution f in Rm is prese...
In molecular sciences, the estimation of entropies of molecules is important for the understanding o...
This paper deals with the control of bias estimation when estimating mutual in...