Abstract
Several binary rule encoding schemes have been proposed for Pittsburgh-style classifier systems. This paper focuses on analyzing how maximally general and accurate rules, regardless of the encoding, can be evolved in such classifier systems. A theoretical analysis of maximally general and accurate rules under two different binary rule encoding schemes yields results with clear implications for the scalability of any genetics-based machine learning system that uses these encodings. These results are particularly relevant because one of the binary representations studied is widely used in Pittsburgh-style classifier systems, yet exhibits an exponential shrinkage of the fraction of useful rules as the problem size increases. To perform this analysis we use a simple, barebones Pittsburgh classifier system, the compact classifier system (CCS), based on estimation of distribution algorithms.
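The estimation-of-distribution machinery underlying the CCS can be illustrated with a minimal compact genetic algorithm in the style of Harik, Lobo, and Goldberg (1998): a probability vector stands in for an explicit population, two sampled individuals compete, and the vector shifts toward the winner. This is a hedged sketch of the cGA idea only, not the CCS itself; the fitness function, parameter names, and the OneMax test problem are illustrative choices, not taken from the paper.

```python
import random

def compact_ga(fitness, n_bits, virtual_pop=50, max_iters=2000):
    """Minimal compact GA: a probability vector replaces the explicit
    population; each generation, two individuals are sampled, compete,
    and the vector moves toward the winner by 1/virtual_pop per bit."""
    p = [0.5] * n_bits  # p[i] = probability that bit i is 1
    for _ in range(max_iters):
        a = [1 if random.random() < pi else 0 for pi in p]
        b = [1 if random.random() < pi else 0 for pi in p]
        winner, loser = (a, b) if fitness(a) >= fitness(b) else (b, a)
        for i in range(n_bits):
            if winner[i] != loser[i]:
                step = 1.0 / virtual_pop
                if winner[i] == 1:
                    p[i] = min(1.0, p[i] + step)
                else:
                    p[i] = max(0.0, p[i] - step)
        # stop once the model has effectively converged on each bit
        if all(pi < 0.05 or pi > 0.95 for pi in p):
            break
    return [1 if pi > 0.5 else 0 for pi in p]

# Illustrative run on OneMax (count of ones); seeded for reproducibility.
random.seed(1)
solution = compact_ga(fitness=sum, n_bits=8)
```

In the CCS the sampled individuals are rule sets rather than raw bit strings, but the same probability-vector update drives the search.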
Copyright information
© 2007 Springer Berlin Heidelberg
About this paper
Cite this paper
Llorà, X., Sastry, K., Goldberg, D.E. (2007). Binary Rule Encoding Schemes: A Study Using the Compact Classifier System. In: Kovacs, T., Llorà, X., Takadama, K., Lanzi, P.L., Stolzmann, W., Wilson, S.W. (eds) Learning Classifier Systems. IWLCS 2003–2005. Lecture Notes in Computer Science, vol. 4399. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-71231-2_4
DOI: https://doi.org/10.1007/978-3-540-71231-2_4
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-71230-5
Online ISBN: 978-3-540-71231-2
eBook Packages: Computer Science (R0)