The role of appearance and motion in action prediction

Abstract

We used a novel stimulus set of human and robot actions to explore the role of humanlike appearance and motion in action prediction. Participants viewed videos of familiar actions performed by three agents: human, android and robot, the former two sharing human appearance, the latter two nonhuman motion. In each trial, the video was occluded for 400 ms. Participants were asked to determine whether the action continued coherently (in-time) after occlusion. The timing at which the action continued was early, late, or in-time (100, 700 or 400 ms after the start of occlusion, respectively). Task performance interacted with the observed agent. For early continuations, accuracy was highest for human, lowest for robot actions. For late continuations, the pattern was reversed. Both android and human conditions differed significantly from the robot condition. Given that the robot and android conditions had the same kinematics, the visual form of the actor appears to affect action prediction. We suggest that the selection of the internal sensorimotor model used for action prediction is influenced by the observed agent’s appearance.
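The occlusion logic described above can be sketched in code. This is a hypothetical illustration of the trial-timing scheme only; the function and constant names are assumptions, not the authors' experimental software:

```python
# Illustrative sketch of the occlusion paradigm: the video is occluded
# for 400 ms, and the action resumes 100, 700, or 400 ms after
# occlusion onset. Only the 400 ms continuation matches the occluded
# interval, so it is the coherent ("in-time") case.

OCCLUSION_MS = 400  # duration of the occlusion in each trial

def classify_continuation(resume_after_ms: int) -> str:
    """Label a continuation relative to occlusion onset.

    A resumption before 400 ms skips ahead of the occluded action
    ("early"); after 400 ms it lags behind ("late"); at exactly
    400 ms the action continues coherently ("in-time").
    """
    if resume_after_ms < OCCLUSION_MS:
        return "early"
    if resume_after_ms > OCCLUSION_MS:
        return "late"
    return "in-time"

for t in (100, 400, 700):
    print(t, "ms ->", classify_continuation(t))
```

Participants' task was effectively to distinguish the "in-time" case from the other two, with accuracy modulated by the observed agent.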

Figs. 1–3 (not reproduced in this preview)


Acknowledgments

SIAD was developed with support from the Kavli Institute for Brain and Mind (Innovative Research Award to APS). The research was additionally supported by the California Institute for Telecommunications and Information Technology (Calit2) Strategic Research Opportunities Program (CSRO) and the Hellman Fellowship Program. WS received funding from the Deutsche Forschungsgemeinschaft (DFG; Project: STA 1076/1-1). Ulrike Riedel and Marcus Daum assisted in data collection. We thank Thierry Chaminade, Wolfgang Prinz, Hiroshi Ishiguro, and the students and staff at the Intelligent Robotics Laboratory at Osaka University.

Author information

Correspondence to Ayse Pinar Saygin.

Cite this article

Saygin, A.P., Stadler, W. The role of appearance and motion in action prediction. Psychological Research 76, 388–394 (2012). https://doi.org/10.1007/s00426-012-0426-z

Keywords

  • Humanoid Robot
  • Artificial Agent
  • Biological Motion
  • Android
  • Action Prediction