Spoken language understanding (SLU) aims to map a user's speech into a semantic frame. Since most previous work relies on semantic structures for SLU, we verify that such structure remains useful even for noisy input. We apply a structured prediction method to the SLU problem and compare it to an unstructured one. In addition, we present a combined method that embeds long-distance dependencies between entities in a cascaded manner. On air travel data, we show that our approach improves performance over baseline models.
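The contrast between unstructured (per-token) and structured prediction for slot filling can be sketched as follows. This is a minimal illustration only, not the paper's actual model: the labels, the toy emission/transition scores, and the example utterance are hypothetical, and a real system would learn such scores (e.g., with a CRF) rather than hard-code them.

```python
# Illustrative sketch: unstructured vs. structured decoding for slot filling.
# All scores and labels below are made up for demonstration purposes.

LABELS = ["O", "B-fromloc", "I-fromloc", "B-toloc", "I-toloc"]

# Hypothetical per-token emission scores for "flights from new york to boston".
EMISSIONS = [
    {"O": 2.0, "B-fromloc": 0.1, "I-fromloc": 0.1, "B-toloc": 0.1, "I-toloc": 0.1},  # flights
    {"O": 1.5, "B-fromloc": 0.2, "I-fromloc": 0.1, "B-toloc": 0.1, "I-toloc": 0.1},  # from
    {"O": 0.2, "B-fromloc": 1.5, "I-fromloc": 0.3, "B-toloc": 1.2, "I-toloc": 0.2},  # new
    {"O": 0.2, "B-fromloc": 0.6, "I-fromloc": 1.2, "B-toloc": 0.7, "I-toloc": 1.3},  # york
    {"O": 1.5, "B-fromloc": 0.1, "I-fromloc": 0.1, "B-toloc": 0.2, "I-toloc": 0.1},  # to
    {"O": 0.2, "B-fromloc": 0.5, "I-fromloc": 0.3, "B-toloc": 1.4, "I-toloc": 0.4},  # boston
]

def transition(prev, cur):
    """Hypothetical transition scores encoding BIO constraints:
    an I-x label should only follow B-x or I-x of the same slot."""
    if cur.startswith("I-"):
        slot = cur[2:]
        return 0.5 if prev in ("B-" + slot, "I-" + slot) else -5.0
    return 0.0

def unstructured_decode(emissions):
    """Unstructured baseline: pick the best label for each token independently."""
    return [max(e, key=e.get) for e in emissions]

def structured_decode(emissions):
    """Structured prediction: Viterbi decoding of the best label sequence
    under emission + transition scores."""
    # best[t][y] = (score of best path ending in label y at position t, backpointer)
    best = [{y: (emissions[0][y], None) for y in LABELS}]
    for t in range(1, len(emissions)):
        column = {}
        for y in LABELS:
            prev, score = max(
                ((p, best[t - 1][p][0] + transition(p, y)) for p in LABELS),
                key=lambda item: item[1],
            )
            column[y] = (score + emissions[t][y], prev)
        best.append(column)
    # Backtrack from the highest-scoring final label.
    y = max(best[-1], key=lambda lab: best[-1][lab][0])
    path = [y]
    for t in range(len(emissions) - 1, 0, -1):
        y = best[t][y][1]
        path.append(y)
    return list(reversed(path))

if __name__ == "__main__":
    tokens = "flights from new york to boston".split()
    print(list(zip(tokens, unstructured_decode(EMISSIONS))))
    print(list(zip(tokens, structured_decode(EMISSIONS))))
```

With these toy scores, the unstructured decoder labels "york" as I-toloc even though "new" was tagged B-fromloc, producing an inconsistent span; the structured decoder's transition scores penalize that combination and recover the coherent B-fromloc/I-fromloc span, which is the kind of benefit structured prediction is meant to provide on noisy input.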
In this paper, we exploit non-local features as an estimate of long-distance dependencies to improve...
Spoken Language Understanding (SLU) is a core task in most human-machine inter...
This paper investigates several approaches to bootstrapping a new spoken language understanding (SLU...
In the past two decades there have been several projects on Spoken Language Understanding (SLU). I...
Spoken language systems (SLS) communicate with users in natural language through speech. There are t...
This work deals with spoken language understanding (SLU) systems in the scenar...
When building spoken dialogue systems for a new domain, a major bottleneck is developing a spoken la...
End-to-end spoken language understanding (SLU) predicts intent directly from a...
Spoken Language Understanding (SLU) is a core task in most human-machine interaction systems. With ...
The paper presents a purely data-driven spoken language understanding (SLU) system. It consists of t...
A new language model for speech recognition inspired by linguistic analysis is presented. The model ...
End-to-end architectures have been recently proposed for spoken language under...
Understanding spoken language is a highly complex problem, which can be decomposed into several simp...