The paper presents a method aimed at improving the reliability of vision-based Simultaneous Localization And Mapping (SLAM) approaches. Classical SLAM approaches treat the camera exposure time as negligible and the recorded frames as sharp and well defined, but this assumption does not hold when the camera moves too fast. In such cases, frames may be severely degraded by motion blur, which makes feature matching difficult. The method presented here is based on a novel approach that combines the benefits of a fully probabilistic SLAM algorithm with the key ideas behind modern motion blur handling algorithms. By means of the Kalman Filter, the new approach predicts the best possible blur Point Spread Function ...
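
To make the core idea concrete, the sketch below illustrates one common way a blur Point Spread Function (PSF) can be predicted from the camera motion estimate that a Kalman-filter-based SLAM system already maintains: the apparent image-plane velocity of a feature, combined with the exposure time, defines a linear blur streak. This is only an illustrative sketch under assumed names and a simple uniform-blur model; it is not the paper's actual formulation.

```python
import numpy as np

def predict_motion_blur_psf(image_velocity_px_per_s, exposure_time_s, psf_size=15):
    """Illustrative sketch (assumed model): build a linear motion-blur PSF from
    the apparent image-plane velocity of a feature, as could be derived from the
    camera motion estimated by the SLAM filter.

    image_velocity_px_per_s : (vx, vy) apparent velocity in pixels/second
    exposure_time_s         : camera exposure (shutter-open) time in seconds
    psf_size                : side length of the square PSF kernel (odd)
    """
    vx, vy = image_velocity_px_per_s
    # Blur streak length (pixels) and direction accumulated during the exposure.
    length = np.hypot(vx, vy) * exposure_time_s
    angle = np.arctan2(vy, vx)

    psf = np.zeros((psf_size, psf_size))
    center = psf_size // 2
    # Rasterise the streak as equally weighted samples along the motion
    # direction (uniform blur over the exposure interval).
    n_samples = max(int(np.ceil(length)) * 4, 1)
    for t in np.linspace(-length / 2.0, length / 2.0, n_samples):
        x = int(round(center + t * np.cos(angle)))
        y = int(round(center + t * np.sin(angle)))
        if 0 <= x < psf_size and 0 <= y < psf_size:
            psf[y, x] += 1.0

    # Normalise so the kernel preserves image brightness when convolved.
    return psf / psf.sum()

# Example: a fast camera sweep producing ~126 px/s apparent motion, 30 ms exposure.
psf = predict_motion_blur_psf((120.0, 40.0), 0.030)
```

A PSF predicted this way can then be used either to deblur the frame or to blur the feature templates before matching, so that features extracted from a sharp map remain comparable with the degraded observations.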