Eye-based Direct Interaction for Environmental Control in Heterogeneous Smart Environments


Environmental control is the control, operation, and monitoring of an environment via intermediary technology such as a computer; typically this means control of a domestic home. Within the scope of COGAIN, environmental control concerns the control of the personal environment of a person, with or without a disability. This defines environmental control as the control of a home or domestic setting and the objects within that setting. Thus, we may say that environmental control systems enable anyone to operate a wide range of domestic appliances and other vital functions in the home by remote control. In recent years, the problem of self-sufficiency for older people and people with disabilities has attracted increasing attention and resources. The search for new solutions that can guarantee greater autonomy and a better quality of life has begun to exploit readily available state-of-the-art technology. Personal environmental control can be considered a comprehensive and effective aid, adaptable to the functional abilities of the user and to their desired actions.
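To make the idea of "operating a wide range of domestic appliances by remote control" concrete, the sketch below models a home as a registry of named appliances that all accept a small shared command vocabulary. This is a minimal illustrative abstraction only; the class and method names (`Appliance`, `Home`, `send_command`) are assumptions for this example and do not correspond to any system described in this chapter.

```python
# Hypothetical sketch: a single control interface over many appliances,
# as an environmental control system might expose to its user interface.

class Appliance:
    """One controllable device with a tiny on/off command vocabulary."""

    def __init__(self, name):
        self.name = name
        self.is_on = False

    def send_command(self, command):
        # Interpret the shared command vocabulary; unknown commands fail loudly.
        if command == "on":
            self.is_on = True
        elif command == "off":
            self.is_on = False
        else:
            raise ValueError(f"unknown command: {command}")
        return f"{self.name} is {'on' if self.is_on else 'off'}"


class Home:
    """The domestic setting: a collection of named appliances."""

    def __init__(self):
        self.appliances = {}

    def register(self, appliance):
        self.appliances[appliance.name] = appliance

    def control(self, name, command):
        # Route a user command to the named appliance.
        return self.appliances[name].send_command(command)


home = Home()
home.register(Appliance("lamp"))
home.register(Appliance("heater"))
print(home.control("lamp", "on"))
```

The point of the uniform `send_command` interface is that an input modality (keyboard, switch, or eye gaze) only needs to produce a device name and a command, regardless of which appliance is targeted.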


Keywords: Environmental Control · Spinal Muscular Atrophy · Assistive Technology · Smart Home · Control Interface




Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  1. Politecnico di Torino, Dip. di Automatica e Informatica, Torino, Italy
  2. Applied Vision Research Centre, Loughborough University, Loughborough, UK
  3. Unit for Computer-Human Interaction (TAUCHI), Department of Computer Sciences, FI-33014 University of Tampere, Finland