Mutual information is one of the most widely used measures for evaluating image similarity. In this paper, we investigate the application of three different Tsallis-based generalizations of mutual information to analyze the similarity between scanned documents. These three generalizations derive from the Kullback–Leibler distance, the difference between entropy and conditional entropy, and the Jensen–Tsallis divergence, respectively. In addition, the ratio between these measures and the Tsallis joint entropy is analyzed. The performance of all these measures is studied for different entropic indices in the context of document classification and registration.

This work has been funded in part by Grant Number TIN2010-21089-C03-01 from the Spanish...
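The abstract above names three Tsallis generalizations of mutual information. Below is a minimal sketch of the first route (the Tsallis analogue of the Kullback–Leibler divergence between the joint grey-level distribution of two documents and the product of its marginals), together with the ratio against the joint Tsallis entropy that the paper analyzes. This is our own illustration, not the authors' code; the helper names (tsallis_entropy, tsallis_mutual_information, normalized_tmi) and the toy histogram are assumptions.

import numpy as np

def tsallis_entropy(p, q):
    # Tsallis entropy H_q(p) = (1 - sum_i p_i^q) / (q - 1); q -> 1 recovers Shannon.
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def tsallis_mutual_information(joint, q):
    # Tsallis relative entropy between the joint distribution and the product
    # of its marginals: (sum_xy p_xy^q (p_x p_y)^(1-q) - 1) / (q - 1).
    joint = joint / joint.sum()
    px = joint.sum(axis=1, keepdims=True)   # marginal of document A
    py = joint.sum(axis=0, keepdims=True)   # marginal of document B
    prod = px * py
    mask = joint > 0                        # joint > 0 implies both marginals > 0
    if np.isclose(q, 1.0):                  # Shannon mutual information in the q -> 1 limit
        return np.sum(joint[mask] * np.log(joint[mask] / prod[mask]))
    return (np.sum(joint[mask] ** q * prod[mask] ** (1.0 - q)) - 1.0) / (q - 1.0)

def normalized_tmi(joint, q):
    # Ratio between the measure and the Tsallis joint entropy, as in the abstract.
    p = (joint / joint.sum()).ravel()
    return tsallis_mutual_information(joint, q) / tsallis_entropy(p, q)

# Toy usage: joint grey-level histogram of two nearly identical "documents".
a = np.random.randint(0, 16, (64, 64))
b = np.clip(a + np.random.randint(-1, 2, a.shape), 0, 15)
joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=16)
print(normalized_tmi(joint, q=1.5))

The second generalization in the abstract replaces the Shannon identity I = H(X) - H(X|Y) with Tsallis entropies; the third substitutes the Jensen–Tsallis divergence for the Jensen–Shannon one.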
Quantitative evaluation of similarity between feature densities of images is an important step in se...
We study two information similarity measures, relative entropy and the similarity metric, and method...
Mutual information of two random variables can be easily obtained from their Shannon entropies. Howe...
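For context, the identity the preceding snippet alludes to expresses mutual information through Shannon entropies:

    I(X;Y) = H(X) + H(Y) - H(X,Y) = H(X) - H(X|Y)

which is also the starting point for the entropy-minus-conditional-entropy generalization in the abstract above.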
We are in the information age where most data is stored in digital format. Thus, the management of d...
Measures of image similarity that inspect the intensity probability distribution of the images have ...
Two new similarity measures for rigid image registration, based on the normalization of Jensen'...
Image similarity and image recognition are modern and rapidly growing technologies because of their ...
We propose a similarity measure for comparing digital images. The technique is based on mutual info...
The Tsallis measure of mutual information is combined with the simultaneous perturbation stochastic ...
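The snippet above combines Tsallis mutual information with SPSA (simultaneous perturbation stochastic approximation). For reference, a textbook SPSA minimization loop looks like the sketch below; this is a generic illustration, not the cited paper's implementation, and spsa_minimize, its gain parameters, and the toy objective are our own choices.

import numpy as np

def spsa_minimize(f, theta, iters=200, a=0.1, c=0.1, alpha=0.602, gamma=0.101, seed=0):
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta, dtype=float)
    for k in range(1, iters + 1):
        ak = a / k ** alpha                                 # decaying step size
        ck = c / k ** gamma                                 # decaying perturbation size
        delta = rng.choice([-1.0, 1.0], size=theta.shape)   # Rademacher directions
        # Two function evaluations estimate the whole gradient, whatever dim(theta) is.
        ghat = (f(theta + ck * delta) - f(theta - ck * delta)) / (2.0 * ck * delta)
        theta = theta - ak * ghat
    return theta

# Toy usage: in registration, f would be a negated similarity measure of the
# transform parameters; here a quadratic stands in for it.
target = np.array([2.0, -3.0])
print(spsa_minimize(lambda t: np.sum((t - target) ** 2), np.zeros(2)))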
Mutual information (MI) is a popular entropy-based similarity measure used in the medical imaging fi...
Matching a reference image to a secondary image extracted from a database of transformed exemplars c...