Abstract.
We introduce the preliminary design of HandSight, a novel vision-augmented touch system intended to support activities of daily living (ADLs) by sensing and feeding back non-tactile information about the physical world as it is touched. Although we are interested in supporting a range of ADL applications, here we focus specifically on reading printed text. We discuss our vision for HandSight, describe its current implementation, and report results from an initial performance analysis of finger-based text scanning. We then present a user study with four visually impaired participants (three blind) that explores how to continuously guide a user’s finger across text under three feedback conditions (haptic, audio, and both). Though preliminary, our results show that participants valued the ability to access printed material and that, in contrast to previous findings, audio finger guidance may yield the best reading performance.
© 2015 Springer International Publishing Switzerland
Stearns, L. et al. (2015). The Design and Preliminary Evaluation of a Finger-Mounted Camera and Feedback System to Enable Reading of Printed Text for the Blind. In: Agapito, L., Bronstein, M., Rother, C. (eds) Computer Vision - ECCV 2014 Workshops. ECCV 2014. Lecture Notes in Computer Science(), vol 8927. Springer, Cham. https://doi.org/10.1007/978-3-319-16199-0_43
Print ISBN: 978-3-319-16198-3
Online ISBN: 978-3-319-16199-0