In many classification and prediction problems, it is known that the response variable depends monotonically on certain explanatory variables. Monotone neural networks are powerful tools for building monotone models with better accuracy and lower variance than ordinary, nonmonotone models. Monotonicity is usually obtained by placing constraints on the parameters of the network. In this paper, we clarify some of the theoretical results on monotone neural networks with positive weights, issues that are sometimes misunderstood in the neural network literature. Furthermore, we generalize some of the results obtained by Sill for the so-called min-max networks to the case of partially monotone problems. The method is illustrated in practice.
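As a rough, hypothetical sketch of the positive-weight construction mentioned above (not code from the paper): in a Sill-style min-max network, each group contains linear units whose weights are kept positive, units within a group are combined with a max, and the groups are combined with a min, which yields a function that is non-decreasing in every input. The snippet below, assuming NumPy, illustrates this idea with positivity enforced by exponentiating free parameters.

```python
import numpy as np

# Minimal illustrative sketch of a min-max monotone network:
# positive weights -> each linear unit is non-decreasing,
# max within groups and min across groups preserve monotonicity.

rng = np.random.default_rng(0)
n_inputs, n_groups, n_units = 3, 4, 5

# Unconstrained parameters; positivity of weights is enforced via exp().
z = rng.normal(size=(n_groups, n_units, n_inputs))
b = rng.normal(size=(n_groups, n_units))

def min_max_net(x):
    """Return a response that is non-decreasing in every component of x."""
    w = np.exp(z)                        # positive weights
    activations = w @ x + b             # shape: (n_groups, n_units)
    group_max = activations.max(axis=1) # max over units in each group
    return group_max.min()              # min over groups

x = np.array([0.2, 1.0, -0.5])
print(min_max_net(x))
```

For partially monotone problems, only the weights attached to the monotone inputs would be constrained to be positive; the sketch constrains all of them for simplicity.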
A mixed order hyper network (MOHN) is a neural network in which weights can connect any number of ne...
We consider the existence of fixed points of nonnegative neural networks, i.e., neural networks that...
Supervised Learning in Multi-Layered Neural Networks (MLNs) has been recently proposed through the w...
In many applications, it is a priori known that the target function should satisfy certain constrain...
In many data mining applications, it is a priori known that the target function should satisfy certa...
Monotonicity is a constraint which arises in many application domains. We present a machine learning...
Monotone functions and data sets arise in a variety of applications. We study the interpolation prob...
This article discusses a number of reasons why the use of non-monotonic functions as activation func...
Abstract—Most existing neural networks for solving linear variational inequalities (LVIs) with the m...
In many application areas of machine learning, prior knowledge concerning the monotonicity of relati...
Learning parameters of a probabilistic model is a necessary step in most machine learning modeling t...
The paper considers the full-range (FR) model of cellular neural networks (CNNs) with ideal hard-lim...
We consider training noise in neural networks as a means of tuning the structure of retrieval basins...
In this paper, we propose efficient neural network models for solving a class of variational inequal...