Graphical User Interface Redefinition Addressing Users’ Diversity

  • José Luís Silva
  • J. C. Silva
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11262)

Abstract

Improvements can still be made in the development of Interactive Computing Systems (ICSs) to ease their use. This is particularly true when trying to address users’ diversity. Most ICSs neither adjust themselves to the user nor consider the user’s particularities. Some, however, provide solutions that better address the specific needs of expert and novice users. Others adjust themselves based on the user’s interaction history, but this does not always lead to improvements in use. One aspect that prevents users’ diversity from being addressed broadly is that most existing ICSs do not provide access to their source code, meaning that only their owners can improve them.

This paper proposes an approach (based on both affective computing and computer vision) to broadly improve design for diversity, without requiring source code access, for both existing ICSs and ICSs yet to be developed. The contributions are twofold: (i) an example of an initial set of design guidelines; and (ii) an approach that opens the way to runtime Graphical User Interface (GUI) redefinition and adjustment based on both the user’s features and emotions, thereby reducing designers’ restrictions when addressing users’ diversity.
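The computer-vision side of such an approach rests on detecting GUI elements directly from screen pixels and then applying an adjustment rule keyed to the user. The following is a minimal illustrative sketch only, not the paper’s implementation: the "screen" is a toy 2D grid of ints, matching is exact rather than a real detector (tools in this space range from Sikuli-style template matching to CNN detectors), and the `low_vision` profile key and 1.5 scaling factor are hypothetical stand-ins for an actual design guideline.

```python
# Sketch: pixel-based widget detection followed by a user-dependent
# redefinition rule. Purely illustrative; real systems operate on
# actual screenshots with approximate matching or learned detectors.

def find_widget(screen, template):
    """Return the (row, col) of the first exact occurrence of
    `template` inside `screen`, or None if it is absent."""
    th, tw = len(template), len(template[0])
    for r in range(len(screen) - th + 1):
        for c in range(len(screen[0]) - tw + 1):
            if all(screen[r + i][c + j] == template[i][j]
                   for i in range(th) for j in range(tw)):
                return (r, c)
    return None

def scale_for_user(widget_size, user_profile):
    """Hypothetical adjustment rule: enlarge widgets for users with
    reduced visual acuity (one possible design-guideline instance)."""
    factor = 1.5 if user_profile.get("low_vision") else 1.0
    return tuple(int(d * factor) for d in widget_size)

# Example: locate a 2x2 "button" pattern in a toy 4x5 screen,
# then compute its adjusted size for a given user profile.
screen = [[0, 0, 0, 0, 0],
          [0, 1, 2, 0, 0],
          [0, 3, 4, 0, 0],
          [0, 0, 0, 0, 0]]
button = [[1, 2],
          [3, 4]]
pos = find_widget(screen, button)                        # -> (1, 1)
new_size = scale_for_user((2, 2), {"low_vision": True})  # -> (3, 3)
```

Because detection works on pixels alone, the same redefinition step applies to closed-source applications, which is what removes the source-code-access barrier discussed above.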

Keywords

Engineering interactive computing systems · Graphical user interface redefinition · Diversity inclusion · Design guidelines · Affective computing

Notes

Acknowledgments

This work was supported by Fundação para a Ciência e a Tecnologia (FCT, Portugal) through project UID/EEA/50009/2013, and by ISTAR-IUL through project UID/MULTI/0446/2013.


Copyright information

© IFIP International Federation for Information Processing 2019

Authors and Affiliations

  1. Madeira-ITI, Funchal, Portugal
  2. Instituto Universitário de Lisboa (ISCTE-IUL), ISTAR-IUL, Lisbon, Portugal
  3. Escola Superior de Tecnologia, 2Ai, Instituto Politécnico do Cávado e do Ave, Barcelos, Portugal