In this paper it was proved that the quantum relative entropy \(D(\rho\|\sigma)\) can be asymptotically attained by the Kullback-Leibler divergences of probability distributions given by a certain sequence of measurements. The sequence of measurements depends on \(\sigma\), but is independent of the choice of \(\rho\). In classical statistical theory the relative entropy \(D(p\|q)\) is an information quantity which measures the statistical efficiency of distinguishing a probability measure \(p\) on a measurable space from another probability measure \(q\) on the same measurable space. The states correspond to measures on a measurable space. When \(p, q\) are discrete probability distributions, the relative entropy (also called information divergence) introduced by Kullback and Leibler is defined by [1]:
\[
D(p\|q) := \sum_i p_i \log \frac{p_i}{q_i}.
\]
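As a quick illustration of the classical definition above, the following is a minimal sketch (not from the paper) that evaluates the discrete Kullback-Leibler divergence numerically; the function name \texttt{kl\_divergence} and the example distributions are assumptions chosen for illustration only.

\begin{verbatim}
import math

def kl_divergence(p, q):
    """Discrete Kullback-Leibler divergence D(p||q) = sum_i p_i log(p_i/q_i).

    Conventions: a term with p_i = 0 contributes 0; if q_i = 0 while
    p_i > 0, the divergence is infinite (p is not absolutely
    continuous with respect to q).
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0.0:
            continue  # 0 * log(0 / q_i) is taken to be 0
        if qi == 0.0:
            return math.inf
        total += pi * math.log(pi / qi)
    return total

# Hypothetical example: two distributions on a three-point space.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # ~ 0.0253 nats
\end{verbatim}

Using the natural logarithm gives the divergence in nats; replacing \texttt{math.log} with a base-2 logarithm gives bits.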