Many state-of-the-art deep learning models rely on dynamic computation logic, making them difficult to optimize. In this thesis, we present a hashing-based algorithm that detects and optimizes computation logic common to different computation graphs. We show that our algorithm integrates seamlessly into popular deep learning frameworks such as TensorFlow, requiring nearly zero code changes on the part of users to apply our optimizations to their programs. Experiments show that our algorithm achieves a 1.35× speedup on a sentiment classification task trained with the popular Tree-LSTM model.
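To illustrate the general idea of hashing-based detection of shared computation logic (this is a minimal sketch under assumed names such as Node, subgraph_hash, and OpCache, not the thesis's actual implementation), one can hash each node of a dynamically built graph by its operation type and the hashes of its inputs, so structurally identical subgraphs arising in different graphs map to the same key and can be cached or batched. In a Tree-LSTM setting, for instance, two parse trees that share a subtree would produce the same hash for that subtree, allowing its result to be reused.

```python
import hashlib
from dataclasses import dataclass


# Hypothetical minimal node type for a dynamically built computation graph.
@dataclass(frozen=True)
class Node:
    op: str              # e.g. "matmul", "add", "tanh"
    inputs: tuple = ()   # child Nodes
    leaf_id: str = ""    # identifies leaf tensors (e.g. an embedding index)


def subgraph_hash(node, memo=None):
    """Merkle-style hash: a node's hash depends on its op and its inputs'
    hashes, so identical subgraphs hash identically across different graphs."""
    if memo is None:
        memo = {}
    if id(node) in memo:
        return memo[id(node)]
    child_hashes = [subgraph_hash(c, memo) for c in node.inputs]
    key = node.op + "|" + node.leaf_id + "|" + ",".join(child_hashes)
    h = hashlib.sha1(key.encode()).hexdigest()
    memo[id(node)] = h
    return h


class OpCache:
    """Caches results keyed by subgraph hash, so computation logic shared
    between graphs is executed only once."""

    def __init__(self):
        self.results = {}

    def evaluate(self, node, compute_fn):
        h = subgraph_hash(node)
        if h not in self.results:
            # compute_fn runs the actual operation for this subgraph.
            self.results[h] = compute_fn(node)
        return self.results[h]
```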