
Evaluation of the Multimodal Object Oriented Architecture

  • Chapter
Object-Oriented User Interfaces for Personalized Mobile Learning

Part of the book series: Intelligent Systems Reference Library (ISRL, volume 64)


Abstract

This chapter describes an evaluation study of an application of the Object-Oriented architecture for a multimodal mobile system. A system relying on this structure was described in the previous chapter. Here, the authors evaluate the quality of their approach by examining how well it addresses the problems of handling multimodal data in the highly demanding area of mobile affective interaction. The findings reported in this chapter indicate the success of the project and strengthen the authors' belief that the OO paradigm can successfully handle mobile multimodal data.



Author information

Corresponding author

Correspondence to Efthimios Alepis.


Copyright information

© 2014 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Alepis, E., Virvou, M. (2014). Evaluation of the Multimodal Object Oriented Architecture. In: Object-Oriented User Interfaces for Personalized Mobile Learning. Intelligent Systems Reference Library, vol 64. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-53851-3_9


  • DOI: https://doi.org/10.1007/978-3-642-53851-3_9

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-53850-6

  • Online ISBN: 978-3-642-53851-3

  • eBook Packages: Engineering, Engineering (R0)
