The sequence memoizer is a model for sequence data with state-of-the-art performance on language modeling and compression. We propose a number of improvements to the model and inference algorithm, including an enlarged range of hyperparameters, a memory-efficient representation, and inference algorithms operating on the new representation. Our derivations are based on precise definitions of the various processes that will also allow us to provide an elementary proof of the "mysterious" coagulation and fragmentation properties used in the original paper on the sequence memoizer by Wood et al. (2009). We present some experimental results supporting our improvements.
The on-line sequence modelling algorithm 'Prediction by Partial Matching' (PPM) has set the pe...
Auto-regressive sequence models can estimate the distribution of any type of sequential data. To stu...
This paper gives an overview of our decomposition of a group of existing and novel on-line sequence ...
Probabilistic models of sequences play a central role in most machine translation, automated speech ...
We propose an unbounded-depth, hierarchical, Bayesian nonparametric model for discrete sequence data...
Sequence segmentation is a flexible and highly accurate mechanism for modeling several applications....
In sequence-to-sequence tasks, sentences with heterogeneous semantics or grammatical structures may ...
(HSM) for statistical modeling of sequence data. The HSM generalizes our previous proposal on struct...
Working memory capacity can be improved by recoding the memorized information in a condensed form. H...
We consider problems of sequence processing and propose a solution based on a discrete state model i...
Huge neural autoregressive sequence models have achieved impressive performance across different app...
Most machine learning algorithms require a fixed length input to be able to perform commonly desired...
© 2019 Dr. Cong Duy Vu Hoang. Neural sequence models have recently achieved great success across vario...