summary:Neural networks with radial basis functions are considered, and the Shannon information in their output concerning the input. The role of information-preserving input transformations is discussed when the network is specified by the maximum information principle and by the maximum likelihood principle. A transformation is found which simplifies the input structure in the sense that it minimizes the entropy in the class of all information-preserving transformations. Such a transformation need not be unique; under some assumptions it may be any minimal sufficient statistic.
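A minimal sketch (not from the paper) of the quantity this abstract is about: the Shannon mutual information between an input and the output of a radial-basis-function layer, estimated with plug-in histogram entropies via I(X;Y) = H(X) + H(Y) - H(X,Y). The RBF centers, width, output weights, and bin count are illustrative assumptions, not values from the source.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=5000)                      # scalar input samples

centers = np.array([-1.0, 0.0, 1.0])           # assumed RBF centers
width = 0.5                                    # assumed common RBF width
phi = np.exp(-(x[:, None] - centers) ** 2 / (2 * width ** 2))
y = phi @ np.array([1.0, -0.5, 0.8])           # assumed output weights

def entropy(counts):
    """Shannon entropy (bits) of an empirical histogram."""
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log2(p)).sum()

bins = 32                                      # assumed discretization
cxy, _, _ = np.histogram2d(x, y, bins=bins)    # joint histogram
h_x = entropy(cxy.sum(axis=1))                 # H(X) from the marginal
h_y = entropy(cxy.sum(axis=0))                 # H(Y) from the marginal
h_xy = entropy(cxy.ravel())                    # joint entropy H(X, Y)
mi = h_x + h_y - h_xy                          # I(X;Y) = H(X)+H(Y)-H(X,Y)
print(f"estimated I(X;Y) = {mi:.2f} bits")
```

Because the output here is a deterministic function of the input, the estimated mutual information is bounded above only by the marginal entropies of the discretized variables; finer binning raises both.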
The present paper aims to propose a new type of information-theoretic method to maximize mutual inf...
The information channel capacity of neurons is calculated in th...
In this article, we explore the concept of minimization of information loss (MIL) as a target for ...
Many systems in nature process information by transforming inputs from their environments into obser...
The way brain networks maintain high transmission efficiency is believed to be fundamental in unders...
This chapter discusses the role of information theory for analysis of neural networks using differen...
We investigate the use of maximum marginal likelihood of the data to determine some of the critical ...
We investigate the consequences of maximizing information transfer in a simple...
In blind source separation (BSS), two different separation techniques are mainly used: minimal mutual i...
Input nodes of neural networks are usually predetermined by using a priori knowledge or selected by ...
There have been a number of recent papers on information theory and neural networks, especially in a...
We prove that maximization of mutual information between the output and the in...
The overarching purpose of the studies presented in this report is the exploration of the uses of in...
Both multilayer perceptrons (MLP) and Generalized Radial Basis Functions (GRBF) have good approxim...