Abstract. Fano’s inequality has proven to be an important result in Shannon’s information theory, having found application in numerous proofs of convergence. It also provides a lower bound on the symbol error probability in a communication channel, expressed in terms of Shannon’s definitions of entropy and mutual information. The result is significant in that it suggests how classification performance is influenced by the amount of information transferred through the classifier. We have previously extended Fano’s lower bound on the probability of error to a family of lower and upper bounds based on Rényi’s definitions of entropy and mutual information. These new bounds, however, despite their theoretical appeal, were pra...
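For reference, the classical statement of Fano’s inequality alluded to above can be written as follows; this is the standard textbook form (with logarithms to base 2), not reproduced from the present paper. For a channel with input $X$ over an alphabet $\mathcal{X}$, output $Y$, and symbol error probability $P_e = \Pr[\hat{X}(Y) \neq X]$:

\[
  H(X \mid Y) \;\le\; h_b(P_e) + P_e \log\bigl(|\mathcal{X}| - 1\bigr),
\]

where $h_b(\cdot)$ denotes the binary entropy function. Since $h_b(P_e) \le 1$ and $\log(|\mathcal{X}|-1) \le \log|\mathcal{X}|$, rearranging yields the lower bound on the probability of error mentioned in the abstract:

\[
  P_e \;\ge\; \frac{H(X \mid Y) - 1}{\log |\mathcal{X}|}
  \;=\; \frac{H(X) - I(X;Y) - 1}{\log |\mathcal{X}|},
\]

which makes explicit how the error probability is controlled by the mutual information $I(X;Y)$ transferred through the channel (or, in the classification setting, through the classifier).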