We present a method for learning higher-order polynomial functions from examples using linear regression and feature construction. Regression is used on a set of training instances to produce a weight vector for a linear function over the feature set. If this hypothesis is imperfect, a new feature is constructed by forming the product of the two features that most effectively predict the squared error of the current hypothesis, and the algorithm is then repeated. In an extension to this method, the specific pair of features to combine is selected by measuring their joint ability to predict the hypothesis' error.
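The abstract describes the base loop only at a high level; the following is a minimal sketch of that loop, not the authors' implementation. It assumes ordinary least squares via NumPy, scores each feature by the absolute correlation of its values with the current squared error (a stand-in for "most effectively predict"), and runs for a fixed number of construction rounds; the function and parameter names are hypothetical.

```python
# Minimal sketch (assumptions noted above): iterate linear regression and,
# each round, add the product of the two features whose values best
# predict the current hypothesis' squared error.
import numpy as np


def fit_linear(X, y):
    # Least-squares weight vector for a linear hypothesis over the features.
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w


def construct_polynomial_features(X, y, rounds=5, tol=1e-8):
    X = X.copy()
    for _ in range(rounds):
        w = fit_linear(X, y)
        sq_err = (y - X @ w) ** 2
        if sq_err.mean() < tol:           # hypothesis is (near-)perfect: stop
            break
        # Score each feature by |correlation| with the squared error;
        # constant columns (e.g. a bias term) score zero.
        scores = np.nan_to_num(np.array([
            abs(np.corrcoef(X[:, j], sq_err)[0, 1]) for j in range(X.shape[1])
        ]))
        i, j = np.argsort(scores)[-2:]    # the two most predictive features
        X = np.column_stack([X, X[:, i] * X[:, j]])  # new product feature
    return X, fit_linear(X, y)
```

Selecting the pair by their joint ability to predict the error, as in the extension mentioned above, would replace the per-feature scores with a score for each candidate pair (for example, the fit of a two-feature regression onto the squared error).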