Large language models (LLMs) have become increasingly prominent in academia and industry due to their remarkable performance in diverse applications. As these models evolve with increasing parameters, they excel in tasks like sentiment analysis and machine translation. However, even models with billions of parameters face challenges in tasks demanding multi-step reasoning. Code generation and comprehension, especially in C and C++, emerge as significant challenges. While LLMs trained on code datasets demonstrate competence in many tasks, they struggle with rectifying non-compilable C and C++ code. Our investigation attributes this subpar performance to two primary factors: the quality of the training dataset and the inherent complexity of t...
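To make the repair task concrete, the following is a minimal, hypothetical illustration (not drawn from any study's dataset) of what "rectifying non-compilable C code" involves: an input fragment that fails to compile because a variable is undeclared and a format specifier mismatches its argument, followed by the compilable form a repair model would be expected to produce.

    /* Hypothetical non-compilable input: `n` is never declared and the
       "%s" format specifier does not match an integer argument.
    #include <stdio.h>
    int main(void) {
        n = 5;
        printf("%s\n", n);
        return 0;
    }
    */

    /* A compilable repair: declare `n` and use the matching "%d" specifier. */
    #include <stdio.h>

    int main(void) {
        int n = 5;           /* declaration added */
        printf("%d\n", n);   /* format specifier corrected */
        return 0;
    }

Even a two-line fix of this kind requires the model to relate a compiler diagnostic to the offending declaration and call site, which is the multi-step reasoning the abstract above identifies as a weakness.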
One of the most common solutions adopted by software researchers to address code generation is by tr...
Recent breakthroughs in Large Language Models (LLMs), such as GPT-3 and Codex, now enable software d...
In recent years, significant progress has been made in the field of natural language processing (NLP...
Large Language Models (LLMs) play an ever-increasing role in the field of Artificial Intelligence (A...
Large language models (LMs) of code have recently shown tremendous promise in completing code and sy...
In the challenging field of introductory programming, high enrollments and failure rates drive us to...
In this work, we evaluate 10 open-source instructed LLMs on four representative code comprehension a...
Large Language Models (LLMs) for code are a family of high-parameter, transformer-based neural netwo...
This paper systematically investigates the generation of code explanations by Large Language Models ...
In this study, we present a novel dataset for training machine learning models translating between O...
CompCert is the first commercially available optimizing compiler that is forma...
Large Language Models (LLMs) are a new class of computation engines, "programmed" via prompt engineer...
Predictive modeling using machine learning is an effective method for building compiler heuristics, ...
Thesis (Ph.D.)--University of Washington, 2023. Language models (LMs) are at the core of almost all st...