Bayesian optimization is a sequential procedure for finding the global optimum of black-box functions whose true form is not known a priori. Good uncertainty estimates over the shape of the objective function are essential in guiding the optimization process. However, these estimates can be inaccurate if the true objective function violates assumptions made by its model (e.g., Gaussianity). This paper studies which uncertainties are needed in Bayesian optimization models and argues that ideal uncertainties should be calibrated -- i.e., an 80% predictive interval should contain the true outcome 80% of the time. We propose a simple algorithm for enforcing this property and show that it enables Bayesian optimization to arrive at the global optimum.
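To make the calibration criterion concrete, here is a minimal sketch (not the paper's algorithm) of how one might check the empirical coverage of a Gaussian model's 80% predictive intervals on held-out data, followed by a generic quantile-recalibration step. All names and the toy data are assumptions for illustration only.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical held-out data: the model's predictive means/stds and the true outcomes.
mu = rng.normal(size=500)                    # predictive means
sigma = np.full(500, 0.5)                    # predictive std devs (model's belief)
y = mu + rng.normal(scale=1.0, size=500)     # true outcomes (noisier than the model believes)

def empirical_coverage(mu, sigma, y, level=0.8):
    """Fraction of outcomes falling inside the central `level` predictive interval."""
    lo, hi = norm.interval(level, loc=mu, scale=sigma)
    return np.mean((y >= lo) & (y <= hi))

# Well below 0.8 here, i.e., the model is miscalibrated (overconfident).
print(empirical_coverage(mu, sigma, y, level=0.8))

# One generic fix (quantile recalibration): learn a monotone map from the model's
# predicted CDF values to their empirical frequencies, then form intervals using
# the remapped quantile levels.
predicted_cdf = norm.cdf(y, loc=mu, scale=sigma)               # probability integral transform values
grid = np.linspace(0.0, 1.0, 101)
empirical_freq = np.array([(predicted_cdf <= p).mean() for p in grid])

def recalibrated_interval(mu_i, sigma_i, level=0.8):
    """Central interval whose nominal quantile levels are remapped via the learned map."""
    lo_q, hi_q = (1 - level) / 2, 1 - (1 - level) / 2
    # Invert the empirical map: find predicted-CDF levels whose empirical frequency is lo_q / hi_q.
    lo_p = np.interp(lo_q, empirical_freq, grid)
    hi_p = np.interp(hi_q, empirical_freq, grid)
    return norm.ppf(lo_p, loc=mu_i, scale=sigma_i), norm.ppf(hi_p, loc=mu_i, scale=sigma_i)
```

After recalibration, the remapped intervals are wider for this overconfident toy model, so their empirical coverage moves toward the nominal 80%; this is the property the abstract asks of uncertainties used inside Bayesian optimization.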