Huge data sets containing millions of training examples with a large number of attributes (tall, fat data) are relatively easy to gather. However, one of the bottlenecks to successfully inferring useful information from such data is the computational complexity of machine learning algorithms. Most state-of-the-art nonparametric machine learning algorithms have a computational complexity of either O(N^2) or O(N^3), where N is the number of training examples. This has seriously restricted the use of massive data sets. The bottleneck computational primitive at the heart of various algorithms is the multiplication of a structured matrix with a vector, which we refer to as the matrix-vector product (MVP) primitive. The goal of my thesis is to speed up...
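To make the quadratic cost concrete, here is a minimal sketch (not taken from the thesis) of the kind of MVP such algorithms repeatedly evaluate: a Gaussian-kernel Gram matrix applied to a weight vector. The function name, kernel choice, and bandwidth parameter are illustrative assumptions; the point is only that forming and applying the N x N matrix directly costs O(N^2) time and memory, which is the bottleneck described above.

```python
import numpy as np

def naive_kernel_mvp(X, q, bandwidth=1.0):
    """Naive Gaussian-kernel matrix-vector product (illustrative only).

    Forms the full N x N kernel (Gram) matrix explicitly, so both time
    and memory scale as O(N^2) in the number of training examples N.
    """
    # Pairwise squared Euclidean distances between all training points: shape (N, N).
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    # Gaussian kernel matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 h^2)).
    K = np.exp(-sq_dists / (2.0 * bandwidth ** 2))
    # The O(N^2) matrix-vector product itself.
    return K @ q

# Tiny usage example: N = 1000 points in 3 dimensions.
X = np.random.randn(1000, 3)
q = np.random.randn(1000)
y = naive_kernel_mvp(X, q)
```

Approaches that speed up this primitive typically avoid forming K explicitly and instead approximate the product to a user-specified accuracy.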
My research is on theoretical foundations of machine learning. During graduate school, I primarily a...
The massive size of data that needs to be processed by Machine Learning models nowadays sets new cha...
Thesis (Ph.D.)--University of Washington, 2020. We present several novel results on computational prob...
Huge data sets containing millions of training examples with a large number of attributes are relati...
There has been a recent revolution in machine learning based on the following simple idea. Instead o...
Traditional machine learning has been largely concerned with developing techniques for small or mode...
Pervasive and networked computers have dramatically reduced the cost of collecting and distributing ...
The last few years have witnessed the rise of the big data era, which features the prevalence of dat...
University of Technology Sydney. Faculty of Engineering and Information Technology. Machine learning ...
Modern technological advances have prompted massive-scale data collection in many modern fields such ...
2018-01-18. This is the era of big data, where both challenges and opportunities lie ahead for the mac...
Recent improvements in machine learning methods have significantly advanced many fields including ...
With the immense growth of data, there is a great need for solving large-scale machine learning p...
The aim of this thesis is to develop scalable numerical optimization methods that can be used to add...
Nowadays, linear methods like Regression, Principal Component Analysis, and Canonical Correlation Analysis ...