Pretrained language models based on the transformer architecture have shown great success in NLP. Textual training data often comes from the web and is thus tagged with time-specific information, yet most language models ignore this information: they are trained on the text alone, limiting their ability to generalize temporally. In this work, we extend the key component of the transformer architecture, the self-attention mechanism, and propose temporal attention, a time-aware self-attention mechanism. Temporal attention can be applied to any transformer model and requires the input texts to be accompanied by their relevant time points. It allows the transformer to capture this temporal information and create time-specific c...
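The abstract above describes attention scores that are conditioned on each text's time point. As a minimal sketch of what a time-aware self-attention could look like, the toy function below shifts the pre-softmax content scores by the similarity of per-token time embeddings; the names (`temporal_attention`, `t_q`, `t_k`) and the additive combination are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def temporal_attention(Q, K, V, t_q, t_k):
    """Illustrative time-aware scaled dot-product attention.

    Q: (n, d) queries            t_q: (n, d_t) time embeddings for queries
    K, V: (m, d) keys/values     t_k: (m, d_t) time embeddings for keys

    The standard content scores are shifted by a time-similarity term,
    so tokens associated with similar time points attend to each other
    more strongly (an assumed, simplified combination).
    """
    d = Q.shape[-1]
    content = Q @ K.T / np.sqrt(d)   # standard attention scores
    temporal = t_q @ t_k.T           # similarity of time embeddings
    return softmax(content + temporal) @ V
```

In a real model, `t_q` and `t_k` would come from a learned embedding of each input text's time point, and the published method may combine the content and time terms differently (e.g. multiplicatively); this sketch only shows where temporal information can enter the attention computation.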
Natural language processing (NLP) techniques have significantly improved with the introduction of pre-trained l...
Many NLP tasks require processing long contexts beyond the length limit of pretrained models. In ord...
Language is an interface to the outside world. In order for embodied agents to...
Time is an important aspect of text documents, which has been widely exploited in natural language p...
Keeping the performance of language technologies optimal as time passes is of great practical intere...
Large pretrained language models using the transformer neural network architecture are becoming a do...
It is largely unknown how the brain deals with time. The Hidden Markov Model (HMM) has a probab...
Deep neural models (e.g. Transformer) naturally learn spurious features, which create a ``shortcut''...
Recent progress in NLP has witnessed the development of large-scale pre-trained language models (GPT, BE...
When an NLP model is trained on text data from one time period and tested or deployed on data from a...
We explore a semi-supervised approach for improving the portability of time expression recognition t...
The transformer has recently become one of the most popular deep learning models, often utilized for proc...
This thesis explores the temporal analysis of text using the implicit temporal cues present in d...
Understanding time is essential to understanding events in the world. Knowing what has happened, wha...
Transformers have achieved state-of-the-art results across multiple NLP tasks. However, the self-att...