Paladyn, Volume 1, Issue 2, pp 89–98

Enaction as a conceptual framework for developmental cognitive robotics

  • David Vernon
Research Article

Abstract

This paper provides an accessible introduction to the cognitive systems paradigm of enaction and shows how it forms a practical framework for robotic systems that can develop cognitive abilities. The principal idea of enaction is that a cognitive system develops its own understanding of the world around it through its interactions with the environment. Enaction thus entails that the cognitive system operates autonomously and generates its own models of how the world works. A discussion of the five key elements of enaction (autonomy, embodiment, emergence, experience, and sense-making) leads to a core set of functional, organizational, and developmental requirements, which are then used in the design of a cognitive architecture for the iCub humanoid robot.

Keywords

enaction, enactive systems, cognition, autonomy, embodiment, emergence, experience, sense-making



Copyright information

© Versita Warsaw and Springer-Verlag Wien 2010

Authors and Affiliations

  1. Department of Robotics, Brain, and Cognitive Sciences, Italian Institute of Technology, Genoa, Italy
