We propose new families of models and algorithms for high-dimensional nonparametric learning with joint sparsity constraints. Our approach is based on a regularization method that enforces common sparsity patterns across different function components in a nonparametric additive model. The algorithms employ a coordinate descent approach that is based on a functional soft-thresholding operator. The framework yields several new models, including multi-task sparse additive models, multi-response sparse additive models, and sparse additive multi-category logistic regression. The methods are illustrated with experiments on synthetic data and gene microarray data.
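The functional soft-thresholding operator mentioned above can be illustrated in finite dimensions as group soft-thresholding: a coordinate's fitted component vector is shrunk toward zero and set exactly to zero when its norm falls below the regularization level, which is how joint sparsity across components arises. The following is a minimal sketch under that finite-dimensional simplification; the function name and interface are ours, not from the source.

```python
import numpy as np

def group_soft_threshold(v, lam):
    """Group soft-thresholding operator S_lam(v) = max(0, 1 - lam/||v||) * v.

    Shrinks the vector v toward the origin; when ||v|| <= lam the whole
    vector is zeroed out, producing the group-level sparsity pattern.
    """
    norm = np.linalg.norm(v)
    if norm <= lam:
        return np.zeros_like(v)
    return (1.0 - lam / norm) * v

# A coordinate descent scheme would apply this operator to each function
# component's fitted values in turn until convergence.
shrunk = group_soft_threshold(np.array([3.0, 4.0]), 1.0)  # norm 5 -> scale 0.8
# shrunk == [2.4, 3.2]; a vector with norm below lam would map to zeros.
```

Because the operator acts on the whole component vector rather than on individual entries, either every value of a component survives (shrunk) or the component is removed entirely, which is the mechanism behind the common sparsity patterns described in the abstract.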
Learning sparsity pattern in high dimension is a great challenge in both implementation and theory. ...
Sparsity plays an essential role in a number of modern algorithms. This thesis examines how we can i...
We develop a highly scalable optimization method called "hierarchical group-thresholding" for sol...
We present a new class of models for high-dimensional nonparametric regression and classification ca...
We present a new class of models for high-dimensional nonparametric regression and classification ca...
We consider the problem of sparse variable selection in nonparametric additive models, with the prio...
In this work we are interested in the problems of supervised learning and variable selection when th...
In this work we are interested in the problems of supervised learning and variable selection when th...
Thesis (Ph.D.)--University of Washington, 2018. Recently, technological advances have allowed us to ga...
We propose a new nonparametric learning method based on multivariate dyadic regression trees (MDRTs)...
The public defense on 14th May 2020 at 16:00 (4 p.m.) will be organized via remote technology. Li...
Distributed statistical learning has become a popular technique for large-scale data analysis. Most ...
Recent advances in machine learning have spawned progress in various fields. In the context of machi...
The development of modern information technology has enabled collecting data of unprecedented siz...
A variable screening procedure via correlation learning was proposed by Fan and Lv (2008) to reduce ...