The Schnorr-Stimm dichotomy theorem [Schnorr and Stimm, 1972] concerns finite-state gamblers that bet on infinite sequences of symbols taken from a finite alphabet Σ. The theorem asserts that, for any such sequence S, the following two things are true. (1) If S is not normal in the sense of Borel (meaning that every two strings of equal length appear with the same asymptotic frequency in S), then there is a finite-state gambler that wins money at an infinitely-often exponential rate betting on S. (2) If S is normal, then every finite-state gambler betting on S loses money at an exponential rate. In this paper we use the Kullback-Leibler divergence to formulate the lower asymptotic divergence div(S||α) of a probability measure α on Σ...
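
For concreteness, the following is a hedged sketch of how such a divergence can be formulated in the single-symbol ("1-block") case, writing π_{S↾n} for the empirical distribution of the first n symbols of S. The paper's full definition may additionally quantify over block lengths k; that refinement is omitted here, so this sketch should be read as an illustration of the idea rather than as the paper's exact definition.

    \documentclass{article}
    \usepackage{amsmath,amssymb}
    \begin{document}

    % Hedged sketch (assumption): the 1-block case of the lower asymptotic
    % divergence; the paper's definition may also range over block lengths k.

    % Empirical distribution of the first n symbols of S:
    \[
      \pi_{S \upharpoonright n}(a)
        \;=\; \frac{\bigl|\{\, i < n : S[i] = a \,\}\bigr|}{n},
      \qquad a \in \Sigma .
    \]

    % Kullback-Leibler divergence between distributions pi and alpha on Sigma:
    \[
      D(\pi \,\|\, \alpha)
        \;=\; \sum_{a \in \Sigma} \pi(a) \log \frac{\pi(a)}{\alpha(a)} .
    \]

    % Lower asymptotic divergence of alpha from S (1-block sketch):
    \[
      \operatorname{div}(S \,\|\, \alpha)
        \;=\; \liminf_{n \to \infty}
              D\bigl(\pi_{S \upharpoonright n} \,\|\, \alpha\bigr) .
    \]

    \end{document}

Under this reading, div(S||α) measures how far, in the limit inferior, the observed symbol frequencies of S stay from the measure α; taking α to be the uniform measure on Σ recovers a quantity that is zero exactly when the single-symbol frequencies of S are those of a normal sequence.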