In this paper, we investigate the use of nonlinear distortion of the electrical post-detection signal to design simple, yet very effective, maximum likelihood sequence detection (MLSD) receivers for optical communications with direct photo-detection. This distortion enables the use of standard Euclidean branch metrics in the Viterbi algorithm that implements MLSD. Our results suggest that the nonlinear characteristic can be optimized with respect to the uncompensated chromatic dispersion and other relevant system parameters, such as the extinction ratio. The proposed schemes with optimized distortion exhibit the same performance as more sophisticated MLSD schemes, while allowing a more efficient implementation of the Viterbi algorithm.
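To make the idea concrete, the following is a minimal sketch of a receiver of this kind: a Viterbi detector over a short-memory intersymbol-interference channel that applies a memoryless nonlinearity to both the received post-detection samples and the expected noiseless levels before computing squared-Euclidean branch metrics. The names (`viterbi_mlsd`, the `expected` level table) are illustrative, and the square-root distortion, the memory-1 toy channel, and the noise model are assumptions chosen for the example, not the optimized characteristic discussed in the paper.

```python
import itertools
import numpy as np

def viterbi_mlsd(r, expected, memory, f=np.sqrt):
    """Viterbi MLSD with Euclidean branch metrics on a distorted signal.

    r        : 1-D array of received post-detection samples
    expected : dict mapping a (memory+1)-bit tuple (oldest..newest) to the
               noiseless post-detection level for that bit pattern
    memory   : channel memory in bits
    f        : memoryless nonlinear distortion (sqrt here; an assumption,
               not the optimized characteristic of the paper)
    """
    states = list(itertools.product((0, 1), repeat=memory))
    cost = {s: 0.0 for s in states}      # path metrics
    paths = {s: [] for s in states}      # surviving bit sequences
    for rk in r:
        frk = f(max(float(rk), 0.0))     # distorted received sample
        new_cost, new_paths = {}, {}
        for s in states:
            for b in (0, 1):
                ns = s[1:] + (b,)        # next state after emitting bit b
                # Euclidean branch metric in the distorted domain
                m = cost[s] + (frk - f(expected[s + (b,)])) ** 2
                if ns not in new_cost or m < new_cost[ns]:
                    new_cost[ns] = m
                    new_paths[ns] = paths[s] + [b]
        cost, paths = new_cost, new_paths
    return paths[min(cost, key=cost.get)]

# Toy usage: memory-1 ISI channel with a finite extinction ratio.
rng = np.random.default_rng(0)
eps = 0.1                                # "off" amplitude (extinction ratio)
amp = {0: eps, 1: 1.0}
expected = {(a, b): (0.3 * amp[a] + amp[b]) ** 2
            for a in (0, 1) for b in (0, 1)}
bits = rng.integers(0, 2, 200)
clean = np.array([expected[(bits[k - 1] if k else 0, bits[k])]
                  for k in range(len(bits))])
r = clean + 0.05 * rng.standard_normal(len(bits))
det = viterbi_mlsd(r, expected, memory=1)
print("bit errors:", np.sum(np.array(det) != bits))
```

Note that the distortion `f` enters only the branch-metric computation, so the trellis structure and survivor bookkeeping are those of a standard Euclidean-metric Viterbi detector; this is the implementation simplification the abstract refers to.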