The paper is concerned with the problem of automatic detection and correction of errors in massive data sets. As is customary, erroneous data records are detected by formulating a set of rules. Such rules are here encoded as linear inequalities. This makes it possible to check the set of rules for inconsistencies and redundancies by means of a polyhedral mathematics approach. Moreover, it allows erroneous data records to be corrected with the minimum number of changes through an integer linear programming approach. Results of a particularization of the proposed procedure to a real-world case of census data correction are reported.
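The pipeline the abstract describes (rules as linear inequalities, detection of violating records, minimum-change correction) can be illustrated with a small sketch. This is not the paper's implementation: the fields, rules, and brute-force repair below are illustrative assumptions, standing in for the integer linear programming formulation used in the actual procedure.

```python
# Sketch: edit rules encoded as linear inequalities A x <= b, detection of
# violating records, and repair with the fewest field changes (the
# minimum-change criterion). Brute-force search replaces the paper's ILP;
# fields and rules here are hypothetical examples.
import itertools

# Fields: x = (age, married), with married in {0, 1}.
# Rule 0: "a married person is at least 14 years old" -> 14*married - age <= 0
# Rule 1: "age is at most 110"                        -> age <= 110
A = [(-1.0, 14.0),
     (1.0, 0.0)]
b = [0.0, 110.0]

def violated(x):
    """Indices of the edit rules that record x fails."""
    return [i for i, (row, bi) in enumerate(zip(A, b))
            if sum(a * v for a, v in zip(row, x)) > bi]

def minimal_correction(x, domains):
    """(number of changed fields, corrected record) with fewest changes."""
    best = None
    for candidate in itertools.product(*domains):
        if violated(candidate):
            continue
        changes = sum(1 for u, v in zip(x, candidate) if u != v)
        if best is None or changes < best[0]:
            best = (changes, candidate)
    return best

record = (5, 1)                    # age 5, married: inconsistent
domains = [range(0, 111), (0, 1)]  # allowed values per field
print(violated(record))                      # -> [0]
print(minimal_correction(record, domains))   # -> (1, (5, 0))
```

Changing the single field `married` from 1 to 0 already satisfies every rule, so the minimum-change repair touches one field; an ILP solver would find the same answer by minimizing the sum of change indicators subject to the inequalities.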
This paper presents some theoretical findings from our recent methodological research addressing the...
Automatic detection and correction of errors in large data sets is a very relevant task in many a...
Software applications have become an indispensable integral part of this world. In all areas of ever...
The paper is concerned with the problem of automatic detection and correction of inconsistent or out...
In a variety of relevant real world problems, tasks of "data mining" and "knowledge discovery" are r...
The paper is concerned with the problem of automatic detection and correction of erroneous data into...
Abstract: The paper is concerned with the problem of automatic detection and correction of inconsisten...
Data collected by statistical offices generally contain errors, which have to be corrected before re...
Very often real-world databases also contain records which are anomalous, or atypical, in the sense ...
This paper is concerned with the problem of automatic detection and correction of inconsistent or ou...
In the case of large-scale surveys, such as a Census, data may contain errors or missing values. An ...
The Census Bureau’s SPEER editing system applies the Fellegi-Holt model to economic establishment su...