By extending Schwarz’s (1978) basic idea we derive a Bayesian information criterion which enables us to evaluate models estimated by the maximum penalised likelihood method or the method of regularisation. The proposed criterion is applied to the choice of smoothing parameters and the number of basis functions in radial basis function network models. Monte Carlo experiments were conducted to examine the performance of the nonlinear modelling strategy of estimating the weight parameters by regularisation and then determining the adjusted parameters by the Bayesian information criterion. The simulation results show that our modelling procedure performs well in various situations.
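The strategy described above (fit RBF weights by regularisation, then pick the number of basis functions by an information criterion) can be sketched as follows. This is a minimal illustration, not the paper's criterion: it uses a ridge penalty with a fixed smoothing parameter `lam`, fixed Gaussian centres and width, and the classical BIC formula n·log(RSS/n) + m·log(n) as a stand-in for the penalised-likelihood version derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a noisy sine curve on [0, 1].
x = np.linspace(0.0, 1.0, 80)
y = np.sin(2.0 * np.pi * x) + rng.normal(scale=0.2, size=x.size)

def rbf_design(x, m, width=0.1):
    """Design matrix of m Gaussian radial basis functions with equally spaced centres."""
    centres = np.linspace(0.0, 1.0, m)
    return np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2.0 * width ** 2))

def fit_and_score(x, y, m, lam=1e-3):
    """Regularised (ridge) weight estimate plus a classical BIC-style score.

    The score n*log(RSS/n) + m*log(n) is only a stand-in for the criterion
    derived in the paper for penalised maximum likelihood estimators.
    """
    n = x.size
    B = rbf_design(x, m)
    w = np.linalg.solve(B.T @ B + lam * np.eye(m), B.T @ y)  # ridge solution
    rss = float(np.sum((y - B @ w) ** 2))
    return n * np.log(rss / n) + m * np.log(n)

# Choose the number of basis functions by minimising the criterion.
best_m = min(range(2, 21), key=lambda m: fit_and_score(x, y, m))
```

In practice the smoothing parameter would be selected jointly with the number of basis functions by evaluating the criterion over a grid of both.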
The problem of evaluating the goodness of the predictive distributions developed by the Bayesian mo...
Information of interest can often only be extracted from data by model fitting. When the functional ...
A quantitative and practical Bayesian framework is described for learning of mappings in feedforwar...
Bayesian nonlinear regression modeling based on basis expansions provides efficient methods for anal...
We apply the nonconcave penalized likelihood approach to obtain variable selections as well as shrin...
MI: Global COE Program Education-and-Research Hub for Mathematics-for-Industry...
Factor analysis is one of the most popular methods of multivariate statistical analysis. This techni...
In order to avoid overfitting in neural learning, a regularization term is added to the loss funct...
Gaussian graphical models with sparsity in the inverse covariance matrix are of significant interest...
This article discusses a general framework for smoothing parameter estimation for models with reg...
We describe procedures for Bayesian estimation and testing in cross-sectional, panel data and nonlin...
We propose a hierarchical full Bayesian model for radial basis networks. This model treats the model...
The classical model selection criteria, such as the Bayesian Information Criterion (BIC) or Akaike i...
This paper is concerned with the model selection and model averaging problems in system identificati...