We consider the compression of a continuous real-valued source X using scalar quantizers and average squared-error distortion D. Using lossless compression of the quantizer's output, Gish and Pierce showed that uniform quantization yields the smallest output entropy in the limit D -> 0, resulting in a rate penalty of 0.255 bits/sample above the Shannon Lower Bound (SLB). We present a scalar quantization scheme, named lossy-bit entropy-constrained scalar quantization (Lb-ECSQ), that reduces the D -> 0 gap to the SLB to 0.251 bits/sample by combining lossless and binary lossy compression of the quantizer's output. We also study the low-resolution regime and show that Lb-ECSQ significantly outperforms ECSQ in the case of 1-bit quantization.
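The 0.255 bits/sample figure quoted above follows directly from the high-resolution analysis: a uniform quantizer with step Δ gives distortion D = Δ²/12 and output entropy approximately h(X) - log2(Δ), while the SLB for squared error is h(X) - (1/2) log2(2πeD). The gap between them is independent of Δ and of the source density, and can be checked in a few lines (a minimal sketch; variable names are ours):

```python
import math

# High-resolution gap between entropy-coded uniform quantization and the SLB:
#   entropy  ~ h(X) - log2(Delta),   with  D = Delta^2 / 12
#   SLB(D)   = h(X) - 0.5 * log2(2 * pi * e * D)
# Substituting D = Delta^2 / 12, both h(X) and Delta cancel:
#   gap = 0.5 * log2(2 * pi * e / 12) = 0.5 * log2(pi * e / 6)
gap_bits = 0.5 * math.log2(math.pi * math.e / 6)
print(f"{gap_bits:.4f} bits/sample")  # ~0.2546, rounded to 0.255 in the abstract
```

This is the Gish-Pierce constant; the Lb-ECSQ scheme in the abstract shaves it down to 0.251 bits/sample.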
This article examines the problem of compressing a uniformly quantized independent and identically d...
Entropy-constrained trellis coded quantization (ECTCQ) of memoryless sources is known to be an effic...
The nonnegativity of relative entropy implies that the differential entropy of a random vector X wi...
Entropy-coded scalar quantization (ECSQ) is an efficient method of data compression for analog sourc...
This correspondence analyzes the low-resolution performance of entropy-constrained scalar quantizati...
Abstract: The optimal Entropy Constrained Scalar Quantizer (ECSQ) gives the best possible rate-disto...
The aim of this research is to investigate source coding, the representation of information source o...
For memoryless sources, Entropy-Constrained Scalar Quantizers (ECSQs) can perform closely to the Gis...
We consider optimal scalar quantization with $r$th power distortion and constrained R\'enyi entropy ...
In this paper, we build multiresolution source codes using entropy constrained dithered scalar quant...
This correspondence considers low-resolution scalar quantization for a memoryless Gaussian source wi...
Abstract—The global maximum of an entropy function with different decision levels for a three-level ...