Abstract
Quantization is at the heart of analog-to-digital (A/D) conversion. Two criteria have been widely used for designing quantizers: the minimization of the average distortion due to quantization, and the maximization of information-theoretic entropy, i.e., ensuring that each of the quantization regions is used equally frequently in encoding the input signal [1]. In general, these two criteria are not equivalent, and a particular quantizer is only optimal with respect to a given design criterion [3,4]. To prevent performance degradation for non-stationary input signals, several adaptive quantization schemes have been developed. These schemes attempt to maintain near-optimal performance by matching the quantizer to the short-term characteristics of the input signal [2]. However, they can only compensate for slowly varying input characteristics and are, for example, not suited for adaptive waveform quantization of speech signals.
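The equiprobable criterion admits a direct offline construction: place the interval boundaries at the quantiles of the signal's amplitude distribution. The following minimal sketch is not from the paper; the Laplacian source, sample size, and variable names are illustrative assumptions. It shows that quantile boundaries yield near-equal interval occupancy, i.e., maximal entropy:

```python
import numpy as np

# Equiprobable (maximum-entropy) scalar quantizer built offline: put the
# k-1 interior boundaries at the empirical quantiles of the data so that
# every quantization interval is hit roughly equally often.
rng = np.random.default_rng(0)
x = rng.laplace(size=10_000)    # heavy-tailed stand-in for a speech-like signal
k = 32                          # number of intervals (a 5 bit converter)

levels = np.linspace(0, 1, k + 1)[1:-1]    # interior quantile levels
boundaries = np.quantile(x, levels)
counts = np.bincount(np.searchsorted(boundaries, x), minlength=k)
print(counts.min(), counts.max())   # near-equal occupancy across intervals
```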
In this contribution, a novel unsupervised competitive learning rule, called the Boundary Adaptation Rule (BAR) [4], is presented for performing adaptive non-uniform A/D conversion. BAR differs fundamentally from other unsupervised competitive learning rules in that it maximizes information-theoretic entropy explicitly. In this way, it outperforms other unsupervised learning rules in generating an equiprobable quantization of the analog signal range. Two versions of BAR are introduced, with different computational requirements and speeds of convergence: a simple rule with time complexity O(k), with k the number of quantization intervals, and a fast rule, called fast BAR (FBAR), with time complexity O(1).
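The abstract does not spell out the BAR update itself, so the sketch below is only a hypothetical stand-in for the idea behind it: contract the boundaries of the interval that contains the current sample, so frequently hit intervals shrink and rarely hit ones grow toward equiprobable occupancy. The function name, the learning rate eta, and the initialization are assumptions, not the rule from [4]; note that each update touches at most two boundaries, consistent with the O(1) per-sample cost claimed for FBAR.

```python
import numpy as np

def update_boundaries(b, x, eta=1e-3):
    """Hypothetical O(1) boundary update: b is the sorted array of k-1
    interior boundaries, x the new sample. The winning interval's two
    boundaries are pulled toward x, contracting over-used intervals."""
    i = np.searchsorted(b, x)          # index of the interval containing x
    if i > 0:
        b[i - 1] += eta * (x - b[i - 1])   # raise the lower boundary toward x
    if i < len(b):
        b[i] += eta * (x - b[i])           # lower the upper boundary toward x
    return b

rng = np.random.default_rng(1)
b = np.linspace(-1.0, 1.0, 31)         # 32 intervals for a 5 bit converter
for sample in rng.laplace(scale=0.3, size=50_000):
    b = update_boundaries(b, sample)
```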
Using FBAR, an application to adaptive waveform coding of speech signals is considered. The signals originate from the TIMIT database. For reasons discussed in the presentation, a fixed gain of 10 is used for normalizing the speech signals. The signal-to-quantization-noise ratio (SNR) is determined and compared with that obtained using a uniform and a μ-law quantizer (μ = 255). The average SNR attained with FBAR clearly exceeds those of the other quantizers: 29.61 dB instead of 16.05 dB and 19.57 dB, respectively, for a 5 bit converter. To achieve a similar result, a uniform quantizer would require 7.33 bits and a μ-law quantizer 6.65 bits. Finally, when (non-adaptive) Huffman coding is applied before transmission, the average bit rate of our A/D converter drops to 3.5 bits for an average SNR of 29.61 dB. This clearly demonstrates the advantage of performing A/D conversion adaptively with FBAR.
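For reference, the two baselines can be reproduced in a few lines. The sketch below computes the SNR of a 5 bit uniform quantizer and of a 5 bit μ-law quantizer with μ = 255; the Laplacian test signal, its scale, and all names are assumptions, and the TIMIT data, the gain of 10, and FBAR itself are not reproduced here.

```python
import numpy as np

def quantize_uniform(x, bits, xmax=1.0):
    """Mid-rise uniform quantizer on [-xmax, xmax]."""
    k = 2 ** bits
    step = 2 * xmax / k
    idx = np.clip(np.floor((x + xmax) / step), 0, k - 1)
    return -xmax + (idx + 0.5) * step

def mu_law(x, mu=255.0):       # standard mu-law compressor, |x| <= 1
    return np.sign(x) * np.log1p(mu * np.abs(x)) / np.log1p(mu)

def mu_law_inv(y, mu=255.0):   # matching expander
    return np.sign(y) * np.expm1(np.abs(y) * np.log1p(mu)) / mu

def snr_db(x, xq):
    """Signal-to-quantization-noise ratio in dB."""
    return 10 * np.log10(np.sum(x ** 2) / np.sum((x - xq) ** 2))

rng = np.random.default_rng(2)
x = np.clip(rng.laplace(scale=0.1, size=100_000), -1.0, 1.0)
print("uniform:", snr_db(x, quantize_uniform(x, 5)))
print("mu-law :", snr_db(x, mu_law_inv(quantize_uniform(mu_law(x), 5))))
```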
References
Ahalt, S.C., Krishnamurthy, A.K., Chen, P., & Melton, D.E. (1990). Competitive learning algorithms for vector quantization. Neural Networks, 3, 277–290.
Gersho, A., & Gray, R.M. (1991). Vector quantization and signal compression. Boston: Kluwer.
Ueda, N., & Nakano, R. (1993). A competitive & selective learning method for designing optimal vector quantizers. Proc. 1993 IEEE Int'l Conf. on Neural Networks, San Francisco, Vol. III, pp. 1444–1450.
Van Hulle, M.M., & Martinez, D. (1993). On an unsupervised learning rule for scalar quantization following the maximum entropy principle. Neural Computation (in press).