Blind-friendly user interfaces – a pilot study on improving the accessibility of touchscreen interfaces

  • Akif Khan
  • Shah Khusro


Touchscreen devices such as smartphones, smartwatches, and tablets are essential assistive devices that help visually impaired and blind people perform activities of daily living. Vision-alternative accessibility services such as screen readers, multimodal interaction, vibro-tactile and haptic feedback, and gestures help blind people operate touchscreen interfaces. However, today's touchscreen user interfaces still suffer from usability problems, including poor discoverability, navigational complexity, cognitive overload, lack of layout persistency, cumbersome input mechanisms, limited accessibility, and inconsistent cross-device interaction. One solution to these problems is an accessibility-inclusive, blind-friendly user interface framework for performing common activities on a smartphone. The proposed framework re-organizes and re-generates interface components into a simplified blind-friendly user interface based on the user profile and contextual recommendations. The paper reports an improvement in the user experience of blind people performing activities on a smartphone: forty-one blind people participated in this empirical study, which showed improved user and interaction experience in operating a smartphone.


HCI · Usability · Accessibility · User interfaces · Blind-friendly UX



This research work was undertaken by the first author in partial fulfillment of a Ph.D. degree, with the support of the Higher Education Commission (HEC) of Pakistan.

Authors’ contributions

The proposed usable paradigm for blind people was developed through the collective and collaborative efforts of all authors. AK and SK carried out the study design, framework, implementation, and manuscript drafting. AK participated in data collection, analysis, and manuscript editing. Iftikhar Alam helped revise the manuscript, in particular its organization. All authors have read and approved the content and structure of the manuscript.

Compliance with ethical standards

Competing interests

The authors declare that they have no competing interests.



Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. Department of Computer Science, University of Peshawar, Peshawar, Pakistan
