
Object and Gesture Recognition to Assist Children with Autism during the Discrimination Training

  • Eduardo Quintana
  • Catalina Ibarra
  • Lizbeth Escobedo
  • Monica Tentori
  • Jesus Favela
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7441)

Abstract

Teachers prompt children with autism to redirect their attention during object discrimination training and to reduce the time they spend “off task”. In this paper, we describe MOBIS, a mobile augmented reality application that enables multimodal interaction to guide students with autism during object discrimination training. The system uses a vision-based object recognition algorithm to associate visual and verbal prompts with the object being discriminated (i.e., the “object of interest”). A performance evaluation shows that the object recognition component achieves an accuracy of 90%, processing an image every 0.5 seconds, and that accelerometers placed on objects of interest detect interaction gestures with an accuracy of 87%. The performance of both algorithms is sufficient to support object discrimination training in real time.
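The two sketches below illustrate, under assumptions not stated in the abstract, the kind of processing the two recognition components involve. The first shows frame-by-frame object recognition via local-feature matching; the paper reports a vision-based algorithm, and ORB is used here only because it ships with stock OpenCV, so the names and thresholds (match_object, MIN_GOOD_MATCHES, the distance cutoff) are illustrative, not the authors' values.

```python
# Minimal sketch: decide whether a reference object appears in a camera frame
# by matching local features. Thresholds are illustrative assumptions.
import cv2

MIN_GOOD_MATCHES = 20                          # assumed acceptance threshold
orb = cv2.ORB_create(nfeatures=500)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def match_object(reference_gray, frame_gray):
    """Return True if the reference object appears to be present in the frame."""
    kp_ref, des_ref = orb.detectAndCompute(reference_gray, None)
    kp_frm, des_frm = orb.detectAndCompute(frame_gray, None)
    if des_ref is None or des_frm is None:
        return False
    matches = matcher.match(des_ref, des_frm)
    good = [m for m in matches if m.distance < 40]   # assumed distance cutoff
    return len(good) >= MIN_GOOD_MATCHES
```

The second sketches one simple way to flag an interaction gesture from a 3-axis accelerometer attached to an object of interest: threshold the short-term variance of the acceleration magnitude. The window size, sampling rate, and threshold are assumptions for illustration, not values from the paper.

```python
# Minimal sketch: report a gesture when recent acceleration-magnitude variance
# exceeds a threshold. Window and threshold values are assumed, not measured.
from collections import deque
import math

WINDOW = 25            # samples (~0.5 s at an assumed 50 Hz sampling rate)
VAR_THRESHOLD = 0.15   # g^2, assumed

window = deque(maxlen=WINDOW)

def on_sample(ax, ay, az):
    """Feed one accelerometer sample; return True when a gesture is detected."""
    window.append(math.sqrt(ax * ax + ay * ay + az * az))
    if len(window) < WINDOW:
        return False
    mean = sum(window) / WINDOW
    var = sum((v - mean) ** 2 for v in window) / WINDOW
    return var > VAR_THRESHOLD
```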

Keywords

Augmented reality · Object recognition · Multimodal interaction

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Eduardo Quintana (1)
  • Catalina Ibarra (2)
  • Lizbeth Escobedo (2)
  • Monica Tentori (1)
  • Jesus Favela (1)

  1. Department of Computer Science, CICESE, México
  2. School of Computer Science, UABC, México
