End-to-end encoder-decoder approaches to data-to-text generation are often black boxes whose predictions are difficult to explain. Breaking up the end-to-end model into sub-modules is a natural way to address this problem. The traditional pre-neural Natural Language Generation (NLG) pipeline provides a framework for breaking up the end-to-end encoder-decoder. We survey recent papers that integrate traditional NLG sub-modules in neural approaches and analyse their explainability. Our survey is a first step towards building explainable neural NLG models.
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Electrical Engineering and Comp...
Natural language generation (NLG) is a subfield of natural language processing (NLP) that is often c...
This paper offers a comprehensive review of the research on Natural Language Generation (NLG) over t...
Neural encoder-decoder models for language generation can be trained to predict words directly from ...
Zarrieß S, Voigt H, Schüz S. Decoding Methods in Neural Language Generation: A Survey. Information. ...
Traditionally, most data-to-text applications have been designed using a modular pipeline architectu...
We introduce the problems of data-to-text generation and the current state of the art, i.e. pretrain...
There are rich opportunities to reduce the language complexity of professional content (either human...
In Natural Language Generation (NLG), End-to-End (E2E) systems trained through...
Deep Neural Networks such as Recurrent Neural Networks and Transformer models are widely adopted for...
Numerical tables are widely employed to communicate or report the classification performance of mach...
Recent advances in deep neural language models combined with the capacity of large scale datasets ha...
Natural Language Generation (NLG) is defined as the systematic approach for producing human understa...