The storage of big data faces significant challenges in long-term reliability. This paper studies how to use the natural redundancy in data for error correction, and how to combine it with error-correcting codes to improve data reliability effectively. It explores several aspects of natural redundancy: the discovery of natural redundancy in compressed data, the efficient decoding of codes with random structures, the capacity of error-correcting codes that contain natural redundancy, and the time-complexity tradeoff between source coding and channel coding.
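As a toy illustration of the core idea (a minimal sketch, not the paper's algorithm): language data keeps structure even after encoding, so a dictionary of valid words can act as an implicit, uncontrolled "code" and correct errors without any added parity bits. Everything here is an assumption for illustration, including the tiny `DICTIONARY`, the single-bit-flip channel, and the function names.

```python
# Minimal sketch of natural-redundancy decoding (illustrative assumptions only):
# a candidate decoding is accepted iff it is a valid dictionary word.

DICTIONARY = {"storage", "coding", "channel", "source", "error", "data"}

def bits_from_text(text: str) -> list[int]:
    """Encode text as a flat list of bits (8 bits per ASCII character)."""
    return [(byte >> i) & 1 for byte in text.encode("ascii") for i in range(7, -1, -1)]

def text_from_bits(bits: list[int]) -> str | None:
    """Decode bits back to text; return None if any byte is not printable ASCII."""
    chars = []
    for i in range(0, len(bits), 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        if not (32 <= byte < 127):
            return None
        chars.append(chr(byte))
    return "".join(chars)

def nr_correct(noisy_bits: list[int]) -> str | None:
    """Correct up to one bit error using only natural redundancy:
    try the received word and every single-bit flip, and accept the
    first candidate that decodes to a dictionary word."""
    candidates = [noisy_bits] + [
        noisy_bits[:i] + [1 - noisy_bits[i]] + noisy_bits[i + 1:]
        for i in range(len(noisy_bits))
    ]
    for cand in candidates:
        text = text_from_bits(cand)
        if text in DICTIONARY:
            return text
    return None

# Usage: flip one bit of "coding" and recover it with no parity bits at all.
bits = bits_from_text("coding")
bits[10] ^= 1                      # simulate a single bit error in the channel
assert nr_correct(bits) == "coding"
```

In a real system of the kind the paper studies, such a natural-redundancy check would not replace an error-correcting code; rather, it would be combined with ECC decoding (for example, to prune candidate codewords), which is what makes the interaction between source coding and channel coding interesting.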