© 2019 Dr. Cong Duy Vu Hoang

Neural sequence models have recently achieved great success across various natural language processing tasks. In practice, neural sequence models require massive amounts of annotated training data to reach their desired performance; however, such data are not always available for the languages, domains, or tasks at hand. Prior and external knowledge provides additional contextual information, potentially improving modelling performance as well as compensating for the lack of large training data, particularly in low-resource situations. In this thesis, we investigate the usefulness of utilising prior and external knowledge for improving neural sequence models. We propose the use of various kinds of prior and external ...
Neural network sequence models have become a fundamental building block for natural language process...
Although many studies have provided evidence that abstract knowledge can be acquired in artificial g...
Abstract Sequence-to-sequence models have achieved impressive results on various tasks. However, the...
In Natural Language Processing (NLP), it is important to detect the relationship between two sequenc...
This thesis studies the introduction of a priori structure into the design of learning systems based...
Neural sequence models, though widely used for modeling sequential data such as in language modelling, h...
Auto-regressive sequence models can estimate the distribution of any type of sequential data. To stu...
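The autoregressive factorisation behind such models can be sketched with a toy example. The bigram table below is a hypothetical stand-in for a neural network's predicted conditional distribution; the factorisation log p(x) = Σ_t log p(x_t | x_<t) itself is standard.

```python
import math

# Hypothetical bigram conditionals p(next | prev), standing in for a
# neural model's softmax output at each step.
BIGRAM = {
    ("<s>", "the"): 0.6, ("<s>", "a"): 0.4,
    ("the", "cat"): 0.5, ("the", "dog"): 0.5,
    ("a", "cat"): 0.3, ("a", "dog"): 0.7,
    ("cat", "</s>"): 1.0, ("dog", "</s>"): 1.0,
}

def sequence_log_prob(tokens):
    """Autoregressive factorisation: log p(x) = sum_t log p(x_t | x_{t-1})."""
    total = 0.0
    prev = "<s>"
    for tok in tokens + ["</s>"]:
        total += math.log(BIGRAM[(prev, tok)])
        prev = tok
    return total

# p("the cat") = 0.6 * 0.5 * 1.0 = 0.3
print(round(math.exp(sequence_log_prob(["the", "cat"])), 4))  # → 0.3
```

A neural autoregressive model replaces the lookup table with a learned network, but scores sequences by exactly this chain-rule product.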
Sequence Labelling is the task of mapping sequential data from one domain to another domain. As we c...
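A minimal illustration of sequence labelling: every input token receives one output label. The rule-based tagger below is a hypothetical stand-in for a learned model, used only to show the input/output shape of the task.

```python
# Sequence labelling: map each token to a label (here, crude POS tags).
# The rules are illustrative heuristics, not a real tagger.
def label_sequence(tokens):
    labels = []
    for tok in tokens:
        if tok in {"the", "a"}:
            labels.append("DET")
        elif tok.endswith("s"):
            labels.append("NOUN")  # crude plural heuristic for illustration
        else:
            labels.append("VERB")  # fallback label for this toy example
    return list(zip(tokens, labels))

print(label_sequence(["the", "cats", "sleep"]))
# → [('the', 'DET'), ('cats', 'NOUN'), ('sleep', 'VERB')]
```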
We develop a comprehensive survey on sequence-to-sequence learning with neural networks and its mod...
2018-08-01: Recurrent neural networks (RNN) have been successfully applied to various Natural Language...
Zarrieß S, Voigt H, Schüz S. Decoding Methods in Neural Language Generation: A Survey. Information. ...
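The simplest decoding method such surveys cover is greedy decoding: at each step, emit the most probable next token. The transition table here is a hypothetical stand-in for a trained model's predictions.

```python
# Hypothetical next-token distributions, standing in for a model's output.
NEXT_PROBS = {
    "<s>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.5, "dog": 0.4, "</s>": 0.1},
    "cat": {"sat": 0.7, "</s>": 0.3},
    "dog": {"</s>": 1.0},
    "sat": {"</s>": 1.0},
}

def greedy_decode(max_len=10):
    """Greedy decoding: repeatedly pick the argmax next token until </s>."""
    tokens, prev = [], "<s>"
    for _ in range(max_len):
        nxt = max(NEXT_PROBS[prev], key=NEXT_PROBS[prev].get)
        if nxt == "</s>":
            break
        tokens.append(nxt)
        prev = nxt
    return tokens

print(greedy_decode())  # → ['the', 'cat', 'sat']
```

Beam search generalises this by keeping the k highest-scoring partial sequences at each step instead of a single argmax.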
With the advent of deep learning, research in many areas of machine learning is converging towards t...
Neural sequence models have been applied with great success to a variety of tasks in natural languag...
Huge neural autoregressive sequence models have achieved impressive performance across different app...
Recently, significant improvements have been achieved in various natural language processing tasks u...