This dissertation presents several new methods for supervised and unsupervised learning of word sense disambiguation models. The supervised methods focus on performing model searches through a space of probabilistic models, and the unsupervised methods rely on Gibbs sampling and the Expectation Maximization (EM) algorithm. In both the supervised and unsupervised cases, the Naive Bayesian model is found to perform well. An explanation for this success is presented in terms of learning rates and bias-variance decompositions.
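To make the core technique concrete, the following is a minimal sketch of a Naive Bayes word sense disambiguation classifier over bag-of-words context features. It is not the dissertation's actual implementation; the class name, add-alpha smoothing, and the toy training examples are illustrative assumptions.

# Minimal sketch (assumed, not taken from the dissertation): Naive Bayes WSD
# over bag-of-words context features with add-alpha smoothing.
from collections import Counter, defaultdict
import math

class NaiveBayesWSD:
    def __init__(self, alpha=1.0):
        self.alpha = alpha                       # add-alpha smoothing constant
        self.sense_counts = Counter()            # C(sense)
        self.word_counts = defaultdict(Counter)  # C(word | sense)
        self.vocab = set()

    def train(self, examples):
        # examples: iterable of (context_words, sense) pairs
        for words, sense in examples:
            self.sense_counts[sense] += 1
            for w in words:
                self.word_counts[sense][w] += 1
                self.vocab.add(w)

    def predict(self, words):
        # argmax over senses of log P(sense) + sum_w log P(w | sense)
        total = sum(self.sense_counts.values())
        best_sense, best_score = None, float("-inf")
        for sense, count in self.sense_counts.items():
            score = math.log(count / total)
            denom = sum(self.word_counts[sense].values()) + self.alpha * len(self.vocab)
            for w in words:
                score += math.log((self.word_counts[sense][w] + self.alpha) / denom)
            if score > best_score:
                best_sense, best_score = sense, score
        return best_sense

# Illustrative usage with made-up contexts for the ambiguous word "bank".
train_data = [
    (["river", "water", "fishing"], "bank/shore"),
    (["loan", "money", "account"], "bank/finance"),
    (["deposit", "interest", "cash"], "bank/finance"),
]
clf = NaiveBayesWSD()
clf.train(train_data)
print(clf.predict(["money", "deposit"]))  # expected: bank/finance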
We describe the results of performing text mining on a challenging problem in natural language proce...
Word Sense Disambiguation is a difficult problem to solve in the unsupervised setting. This is becau...
In this paper, a supervised learning system of word sense disambiguation is presented. It is based o...
A statistical word sense disambiguation (WSD) model using the Naive Bayes assumption is developed in thi...
We describe two probabilistic models for unsupervised word-sense disambiguation using parallel cor...
This bachelor's thesis deals with the word sense disambiguation problem using the machine learning tech...
We introduce a generative probabilistic model, the noisy channel model, for unsupervised word sense ...
In this paper, word sense disambiguation (WSD) accuracy achievable by a probabilistic classifier, usi...
We present a corpus-based approach to word-sense disambiguation that only requires information tha...
This paper describes an experimental comparison between two standard supervised learning m...
Word sense disambiguation is a core problem in many tasks related to language processing. In this pa...
We replace the overlap mechanism of the Lesk algorithm with a simple, general-purpose Naive Bayes mo...
This paper describes an experimental comparison between two standard supervised learning methods, na...
We develop latent Dirichlet allocation with WORDNET (LDAWN), an unsupervised probabilistic topic mod...