Cross-lingual semantic parsing transfers parsing capability from a high-resource language (e.g., English) to low-resource languages with scarce training data. Previous work has primarily considered silver-standard data augmentation or zero-shot methods; however, exploiting few-shot gold data remains comparatively unexplored. We propose a new approach to cross-lingual semantic parsing by explicitly minimizing cross-lingual divergence between probabilistic latent variables using Optimal Transport. We demonstrate how this direct guidance improves parsing from natural languages using fewer examples and less training. We evaluate our method on two datasets, MTOP and MultiATIS++SQL, establishing state-of-the-art results under a few-shot cross-lingual ...
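The abstract does not spell out the OT objective, so the following is only a minimal sketch of the general idea: an entropic-regularized optimal-transport (Sinkhorn) cost between source-language and target-language latent vectors, which could serve as an auxiliary divergence term. The function name sinkhorn_divergence, the uniform marginals, and the hyperparameters eps and n_iters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sinkhorn_divergence(x, y, eps=0.1, n_iters=100):
    """Entropic-regularized OT cost between two sets of latent vectors.

    x: (n, d) latent states for the high-resource (e.g., English) inputs
    y: (m, d) latent states for the low-resource-language inputs
    eps: entropic regularization strength (assumed hyperparameter)
    """
    n, m = x.shape[0], y.shape[0]
    # Pairwise squared-Euclidean cost between latent vectors.
    cost = ((x[:, None, :] - y[None, :, :]) ** 2).sum(axis=-1)
    # Uniform marginals: every latent vector carries equal mass.
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    K = np.exp(-cost / eps)  # Gibbs kernel
    u = np.ones(n)
    for _ in range(n_iters):  # Sinkhorn fixed-point updates
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]  # approximate transport plan
    return (P * cost).sum()  # transport cost under the plan


# Toy usage: measure divergence between two batches of latents.
rng = np.random.default_rng(0)
src = rng.normal(size=(32, 16))  # stand-in for English latents
tgt = rng.normal(size=(32, 16))  # stand-in for target-language latents
print(sinkhorn_divergence(src, tgt))
```

In a training loop, a scalar like this would typically be added to the parsing loss as an auxiliary term, pulling the two languages' latent distributions toward each other.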
Cross-lingual word embeddings are an increasingly important resource in cross-lingual methods for N...
One thing that needs to change in machine translation is the models' ability to...
Large pre-trained multilingual models such as mBERT and XLM-R enabled effective cross-lingual zero-s...
Modern virtual assistants use internal semantic parsing engines to convert user utterances to action...
Some Transformer-based models can perform cross-lingual transfer learning: those models can be train...
Pre-trained multilingual language models play an important role in cross-lingual natural language un...
Pre-trained multilingual language models show significant performance gains for zero-shot cross-ling...
Word alignment, which aims to extract lexicon translation equivalents between source and target sente...
As numerous modern NLP models demonstrate high performance in various tasks when trained with resour...
Cross-lingual transfer learning with large multilingual pre-trained models can be an effective appro...
Thesis (Master's), University of Washington, 2020. This work presents methods for learning cross-lingu...
While recent work on multilingual language models has demonstrated their capacity for cross-lingual ...
Building machine learning prediction models for a specific NLP task requires sufficient training dat...
In cross-lingual language understanding, machine translation is often utilized to enhance the transf...