Recent advances in large pre-trained transformer models (GPT-2/3, T5) have found use in program synthesis, generating programs that satisfy a set of input/output examples. However, these models perform poorly on long-horizon and low-data tasks, and often fail to capture the semantics of the languages they generate. We investigate an approach that tackles both of these issues by using attributed context-free grammars of programming languages to generate programs, and then analyzing the generated programs so that they can be annotated with compile-time and runtime attributes, such as types, allowing information about the program to be remembered during long-horizon generation. We first find that synthesized datasets can be made efficien...
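As a rough illustration of the approach sketched above (not the paper's implementation; the grammar, type rules, and function names are invented for the example), the following Python sketch samples programs from a small attributed context-free grammar in which each production synthesizes a type attribute and constrains the types of its children, so every generated program is well-typed and each node's type is available as an annotation:

```python
import random

# A toy attributed grammar for a tiny expression language. Each production
# records the type it synthesizes and the types it requires from its "expr"
# children, so the sampler only emits well-typed programs and every node
# carries a type attribute.
PRODUCTIONS = {
    "expr": [
        # (right-hand side,        child expr types,   synthesized type)
        (["<int>"],                [],                  "int"),
        (["<bool>"],               [],                  "bool"),
        (["expr", "+", "expr"],    ["int", "int"],      "int"),
        (["expr", "<", "expr"],    ["int", "int"],      "bool"),
        (["expr", "&&", "expr"],   ["bool", "bool"],    "bool"),
    ],
}
TERMINALS = {"<int>": ["0", "1", "2"], "<bool>": ["true", "false"]}

def generate(symbol, expected=None, depth=0, max_depth=4):
    """Expand `symbol` into a token list whose type matches `expected`."""
    if symbol in TERMINALS:
        return [random.choice(TERMINALS[symbol])]
    # Keep only productions with the right synthesized type, and stop
    # recursing once the derivation is deep enough.
    candidates = [
        (rhs, child_types)
        for rhs, child_types, synthesized in PRODUCTIONS[symbol]
        if (expected is None or synthesized == expected)
        and (depth < max_depth or "expr" not in rhs)
    ]
    rhs, child_types = random.choice(candidates)
    types = iter(child_types)
    tokens = []
    for s in rhs:
        if s in PRODUCTIONS or s in TERMINALS:
            tokens += generate(s, next(types, None), depth + 1, max_depth)
        else:
            tokens.append(s)  # literal operator token such as "+" or "&&"
    return tokens

print(" ".join(generate("expr")))          # e.g. 0 < 2 && true
print(" ".join(generate("expr", "int")))   # only integer-typed programs
```

In the setting described above, such type attributes could be attached to the generated tokens as auxiliary annotations so that a model can condition on them during long-horizon generation.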
Long Short-Term Memory (LSTM) and Transformers are two popular neural architectures used for natural...
Although the program verification community has developed several techniques for analyzing software ...
Given a large Transformer model, how can we obtain a small and computationally efficient model which...
A key challenge of existing program synthesizers is ensuring that the synthesized program generalize...
Thesis (Master's)--University of Washington, 2021. Transformer models perform well on NLP tasks, but r...
Program synthesis strives to generate a computer program as a solution to a given problem specificat...
Program synthesis is a promising area of research concerned with automatically producing program imp...
Thesis (Ph.D.)--University of Washington, 2015. Program synthesis is a family of techniques that gener...
Transformers are the current state-of-the-art of natural language processing in many domains and are...
Building systems that can synthesize programs from natural specifications (such as examples or langu...
We show how to "compile" human-readable programs into standard decoder-only transformer models. Our ...
Compilers use cost models to choose between different optimization opportunities, and increasingly t...
With the advancement of modern technologies, programming becomes ubiquitous not only among professio...
This document aims to be a self-contained, mathematically precise overview of transformer architectu...
Large Transformer models achieved the state-of-the-art status for Natural Language Understanding tas...