In this article, we discuss the procedure for computing the values of the unknowns under the condition that the sum of squares of the observation residuals is minimized (the least-squares method), while also taking into account the errors in the unknowns. Many authors have already treated this problem, especially in the fields of regression analysis and the computation of transformation parameters. We present an overview of the theoretical foundations of the least-squares method and of its extension that accounts for errors in the unknowns appearing in the model matrix. This method, which may be called ‘the total least-squares method’, is presented in the paper for the case of fitting a regression line to a set of points and for the case of calculating transformation ...
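As a rough illustration of the total least-squares idea for the line-fitting case mentioned above, orthogonal regression can be computed from the smallest right singular vector of the centered data matrix. This sketch is not taken from the paper; the function name and interface are illustrative only, and it assumes the simple model y = a·x + b with errors in both coordinates.

```python
import numpy as np

def tls_line_fit(x, y):
    """Total least-squares (orthogonal) fit of a line y = a*x + b.

    Unlike ordinary least squares, which minimizes vertical residuals only,
    this minimizes perpendicular distances, treating both x and y as noisy.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Center the data; the TLS line passes through the centroid.
    xc = x - x.mean()
    yc = y - y.mean()
    # The smallest right singular vector of the centered data matrix
    # gives the normal (n1, n2) of the best-fitting line.
    _, _, vt = np.linalg.svd(np.column_stack([xc, yc]))
    n1, n2 = vt[-1]
    slope = -n1 / n2
    intercept = y.mean() - slope * x.mean()
    return slope, intercept
```

For noise-free collinear points the fit is exact; for noisy data the result generally differs from the ordinary least-squares line, since residuals in x are penalized as well.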