Abstract
Representation plays a central role in machine-based pattern recognition, AI, and machine learning. In AI-based problem solving, we need to represent states and state transitions appropriately; similarly, in clustering and classification, we need to represent the data points, clusters, and classes.
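As a minimal illustration of the kind of representation the abstract refers to (not an example from the chapter itself): in clustering, data points are commonly represented as feature vectors and clusters by their centroids, so that assigning a point to a cluster reduces to a distance computation. The data values and helper names below are hypothetical.

```python
import math

# Hypothetical data: each point represented as a feature vector (a tuple).
points = [(1.0, 2.0), (1.5, 1.8), (8.0, 8.0), (9.0, 9.5)]

def centroid(vectors):
    """Represent a cluster by the component-wise mean of its members."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(len(vectors[0])))

def nearest(point, centroids):
    """Index of the centroid closest to `point` (Euclidean distance)."""
    return min(range(len(centroids)),
               key=lambda i: math.dist(point, centroids[i]))

c0 = centroid(points[:2])   # centroid of the first cluster
c1 = centroid(points[2:])   # centroid of the second cluster
print(nearest((2.0, 2.0), [c0, c1]))  # → 0 (closer to the first cluster)
```

Here the choice of representation (raw feature vectors with Euclidean distance) directly determines cluster membership; other representations would yield different groupings.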
Copyright information
© 2019 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this chapter
Murty, M.N., Biswas, A. (2019). Representation. In: Centrality and Diversity in Search. SpringerBriefs in Intelligent Systems. Springer, Cham. https://doi.org/10.1007/978-3-030-24713-3_3
DOI: https://doi.org/10.1007/978-3-030-24713-3_3
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-24712-6
Online ISBN: 978-3-030-24713-3
eBook Packages: Computer Science, Computer Science (R0)