The implementation of conditional Gaussian Processes (GPs), also known as kriging models, requires the inversion of a covariance matrix. In practice, e.g., when performing optimization with the EGO algorithm, this matrix is often ill-conditioned. The two most classical regularization methods for avoiding degeneracy of the covariance matrix are i) adding a small positive constant to the diagonal (which, with a slight abuse of language, we will refer to as “nugget” regularization) [4] and ii) the pseudoinverse (PI) [1], in which singular values smaller than a threshold are zeroed. This work first provides algebraic calculations that allow comparing PI and nugget regularizations with respect to interpolation properties when the observed points tend tow...
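
The two regularizations mentioned above can be contrasted numerically. The following is a minimal sketch (function names, the squared-exponential kernel, and the tolerance values are illustrative choices, not taken from the paper) showing how a covariance matrix becomes ill-conditioned when two observed points nearly coincide, and how the nugget and PI approaches both yield finite solutions:

```python
import numpy as np

def gaussian_cov(X, length_scale=1.0):
    """Squared-exponential covariance matrix between 1-D points in X."""
    d = X[:, None] - X[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def nugget_solve(K, y, nugget=1e-8):
    """Nugget regularization: add a small positive constant to the diagonal."""
    return np.linalg.solve(K + nugget * np.eye(len(K)), y)

def pi_solve(K, y, rcond=1e-10):
    """PI regularization: zero singular values below rcond * largest singular value."""
    return np.linalg.pinv(K, rcond=rcond) @ y

# Two nearly coincident observed points make K ill-conditioned.
X = np.array([0.0, 0.5, 0.5 + 1e-9, 1.0])
y = np.sin(X)
K = gaussian_cov(X)

alpha_nugget = nugget_solve(K, y)  # finite despite huge condition number of K
alpha_pi = pi_solve(K, y)          # finite: tiny singular values are discarded
```

In this sketch the nugget perturbs the system before solving, while the pseudoinverse truncates the spectrum of `K`; the coefficient vectors they return (and hence the resulting GP predictions) differ, which is the kind of behavior the algebraic comparison in the paper characterizes.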