Abstract
The principal goal of data compression (also known as source coding) is to replace data with a compact representation from which the original data can be reconstructed either perfectly or with sufficiently high accuracy. Typically, the representation takes the form of a sequence of binary digits (bits) suitable for efficient digital transmission or storage.
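As a concrete illustration of the lossy case, here is a minimal sketch of the classical Lloyd (k-means) iteration for designing a fixed-rate scalar quantizer: the source samples, codebook size, and iteration count below are illustrative assumptions, not taken from the chapter. The quantizer maps each sample to its nearest codeword, so a k-level codebook encodes every sample in roughly log2(k) bits at the cost of some squared-error distortion.

```python
import random

def nearest(codebook, x):
    # Index of the codeword minimizing squared-error distortion for x.
    return min(range(len(codebook)), key=lambda i: (codebook[i] - x) ** 2)

def lloyd(samples, k, iters=50):
    # Lloyd iteration: alternate nearest-neighbor partitioning of the
    # training samples and centroid (conditional-mean) codebook updates.
    codebook = random.sample(samples, k)
    for _ in range(iters):
        cells = [[] for _ in range(k)]
        for x in samples:
            cells[nearest(codebook, x)].append(x)
        # Empty cells keep their previous codeword.
        codebook = [sum(c) / len(c) if c else codebook[i]
                    for i, c in enumerate(cells)]
    return sorted(codebook)

random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(2000)]
cb = lloyd(data, k=4)
# Average squared-error distortion of the trained 4-level quantizer.
mse = sum((x - cb[nearest(cb, x)]) ** 2 for x in data) / len(data)
print(cb, mse)
```

Training a quantizer on empirical data in this way, and asking how its distortion relates to the optimum for the true source distribution, is exactly the learning-theoretic question the chapter studies.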
Copyright information
© 2002 Springer-Verlag Wien
Cite this chapter
Linder, T. (2002). Learning-Theoretic Methods in Vector Quantization. In: Györfi, L. (eds) Principles of Nonparametric Learning. International Centre for Mechanical Sciences, vol 434. Springer, Vienna. https://doi.org/10.1007/978-3-7091-2568-7_4
Print ISBN: 978-3-211-83688-0
Online ISBN: 978-3-7091-2568-7
eBook Packages: Springer Book Archive