Abstract

We consider a boosting technique that can be applied directly to multiclass classification problems. Although many boosting algorithms have been proposed, most of them are developed essentially for binary classification, and to handle multiclass problems they must somehow be reduced to binary ones. To avoid such reductions, we introduce a notion of the pseudo-entropy function G, which gives an information-theoretic criterion, called the conditional G-entropy, for measuring the loss of hypotheses. The conditional G-entropy turns out to be useful for defining the weakness of hypotheses that approximate, in some way, a multiclass function in general, so that we can consider the boosting...