Good Error Correcting Output Codes for Adaptive Multiclass Learning

  • Conference paper
Multiple Classifier Systems (MCS 2003)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 2709)


Abstract

In recent work, we introduced a generalization of ECOC learning grounded in the theory of recursive error correcting codes, which we named RECOC (Recursive ECOC) learning. When long output codewords are allowed, as in problems involving a large number of classes, standard recursive codes such as LDPC or Turbo codes can be used. However, when the number of classes is moderate, good LDPC or Turbo codes may not exist because of the small block lengths involved. In this paper, we analyze RECOC learning based on the recently introduced ensemble of Product Accumulate codes. Their design, natively oriented toward data storage applications, allows smaller block lengths; moreover, their simplicity makes all key concepts of RECOC learning easy to explain.
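
To make the setting concrete, below is a minimal sketch of the plain ECOC framework that RECOC generalizes: each class receives a binary codeword (a row of a code matrix), one binary learner is trained per codeword bit, and a test point is decoded to the class whose codeword is nearest in Hamming distance to the predicted bits. The random code matrix, the scikit-learn DecisionTreeClassifier base learner, and the nearest-codeword decoder are illustrative assumptions only; the paper's RECOC scheme instead derives the code matrix from a recursive code (here, a Product Accumulate code) and decodes iteratively by message passing.

```python
# Illustrative plain-ECOC sketch (assumed setup, NOT the paper's RECOC /
# Product Accumulate construction): a random +/-1 code matrix, decision-tree
# base learners, and nearest-codeword (Hamming) decoding rather than
# iterative message-passing decoding over a recursive code.
import numpy as np
from sklearn.tree import DecisionTreeClassifier  # any binary base learner would do

def train_ecoc(X, y, n_bits=15, seed=0):
    """Train one binary classifier per column (bit) of a random code matrix."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)                                # sorted class labels
    M = rng.choice([-1, 1], size=(len(classes), n_bits))  # one codeword (row) per class
    idx = np.searchsorted(classes, y)                     # row of M for each example
    # Each column of M relabels the training set as a binary problem.
    learners = [DecisionTreeClassifier(max_depth=3).fit(X, M[idx, b])
                for b in range(n_bits)]
    return classes, M, learners

def predict_ecoc(X, classes, M, learners):
    """Decode to the class whose codeword is closest in Hamming distance."""
    bits = np.column_stack([clf.predict(X) for clf in learners])  # (n_samples, n_bits)
    dists = (bits[:, None, :] != M[None, :, :]).sum(axis=2)       # (n_samples, n_classes)
    return classes[dists.argmin(axis=1)]
```

With, say, 10 classes and n_bits=15, the code matrix carries more bits than the four strictly needed to separate the classes; that redundancy is what lets the decoder correct some individual binary-learner errors, and choosing a good (recursive) code rather than a random one is exactly the question the paper addresses.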

Copyright information

© 2003 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Tapia, E., González, J.C., García-Villalba, J. (2003). Good Error Correcting Output Codes for Adaptive Multiclass Learning. In: Windeatt, T., Roli, F. (eds) Multiple Classifier Systems. MCS 2003. Lecture Notes in Computer Science, vol 2709. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44938-8_16

  • DOI: https://doi.org/10.1007/3-540-44938-8_16

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-40369-2

  • Online ISBN: 978-3-540-44938-6

  • eBook Packages: Springer Book Archive
