SmartGrip: grip sensing system for commodity mobile devices through sound signals

  • Original Article
  • Published in Personal and Ubiquitous Computing

Abstract

Many studies have attempted to detect the hand posture with which a mobile device is held, so that the posture can serve as a user interface. However, existing approaches either require additional hardware or can differentiate only a limited number of grips, and only when there is a touch event on the mobile device’s screen. In this paper, we propose a novel grip sensing system, called SmartGrip, which allows a mobile device to detect different hand postures without any additional hardware or a screen touch event. SmartGrip emits carefully designed sound signals and differentiates the propagated signals distorted by different user grips. To achieve this, we analyze how a sound signal propagates from the speaker to the microphone of a mobile device and then address three key challenges: sound structure design, volume control, and feature extraction and classification. We implement and evaluate SmartGrip on three Android mobile devices. With six representative grips, SmartGrip achieves 93.1% average accuracy for ten users in an office environment. We also demonstrate that SmartGrip operates with 83.5 to 98.3% accuracy in six different (noisy) locations. To further demonstrate the feasibility of SmartGrip as a user interface, we develop an Android application that exploits it, validating its practical usage.
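To make the pipeline concrete, the sketch below generates a SmartGrip-style probe signal in Python: a 16–24 kHz chirp with short fade-in and fade-out ramps. The band matches the features retained in note 3 below, but the duration, sweep shape, and ramp length are illustrative assumptions, not the authors’ exact sound structure design.

```python
# Minimal sketch of a SmartGrip-style probe: a near-inaudible
# 16-24 kHz linear chirp with short fade-in/fade-out ramps.
# Band taken from the paper's notes; duration, sweep shape, and
# ramp lengths are assumptions for illustration only.
import numpy as np

FS = 48_000                  # sampling rate (Hz); Nyquist band is 0-24 kHz
DURATION = 0.1               # probe length in seconds (assumed)
F_LO, F_HI = 16_000, 24_000  # near-inaudible band kept by the features
FADE = 0.005                 # fade-in/out length in seconds (assumed)

def make_probe() -> np.ndarray:
    t = np.arange(int(FS * DURATION)) / FS
    # Linear chirp: instantaneous frequency sweeps from F_LO to F_HI.
    phase = 2 * np.pi * (F_LO * t + 0.5 * (F_HI - F_LO) / DURATION * t ** 2)
    probe = np.sin(phase)
    # Fade ramps suppress the audible clicks a hard onset would cause.
    n = int(FS * FADE)
    ramp = np.linspace(0.0, 1.0, n)
    probe[:n] *= ramp
    probe[-n:] *= ramp[::-1]
    return probe.astype(np.float32)

if __name__ == "__main__":
    print(f"probe: {len(make_probe())} samples at {FS} Hz")
```

In the actual system, a signal of this kind is played through the device’s speaker while the microphone records the response, which is distorted differently by each grip.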



Notes

  1. https://www.youtube.com/watch?v=FvQ87wmS6kk

  2. Note that the criterion volume depends on the mobile device; for the Samsung Galaxy S8 and Google Pixel, it corresponds to 60%. Calibrating the criterion volume for a target mobile device is not difficult (see the first sketch after these notes).

  3. After applying the FFT, we obtain 512 features covering 0–24 kHz. We remove the features below 16 kHz, as well as additional features related to the fade-in and fade-out, yielding 172 features (see the second sketch after these notes).
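The calibration mentioned in note 2 can be outlined as a simple search: sweep the playback volume upward until the probe band arrives at the microphone with adequate signal-to-noise ratio. The sketch below is an illustration under stated assumptions, not the authors’ procedure: play_and_record is a hypothetical helper (on Android it would be built on AudioTrack and AudioRecord), and the SNR threshold is invented for the example.

```python
# Sketch of criterion-volume calibration: pick the lowest playback
# volume whose received probe-band energy clears an SNR threshold.
# `play_and_record` is a hypothetical stub, and SNR_THRESHOLD_DB is
# an assumed value; neither comes from the paper.
import numpy as np

FS = 48_000
F_LO, F_HI = 16_000, 24_000
SNR_THRESHOLD_DB = 10.0

def band_energy(samples: np.ndarray) -> float:
    # Energy of the recording restricted to the probe band.
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / FS)
    return float(spectrum[(freqs >= F_LO) & (freqs < F_HI)].sum())

def calibrate(play_and_record) -> float:
    noise = band_energy(play_and_record(volume=0.0))  # ambient floor
    for volume in np.arange(0.1, 1.01, 0.1):          # sweep 10%..100%
        received = band_energy(play_and_record(volume=volume))
        if 10 * np.log10(received / noise) >= SNR_THRESHOLD_DB:
            return round(volume, 1)                   # criterion volume
    return 1.0
```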
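Note 3’s feature pipeline can likewise be sketched: a 1024-point FFT of a 48 kHz frame yields 512 positive-frequency bins spanning 0–24 kHz, of which only the bins at or above 16 kHz are kept. The sketch pairs these features with an SVM, since the paper classifies with LIBSVM [20] (scikit-learn’s SVC wraps LIBSVM); the RBF kernel and the omission of the fade-related bookkeeping that yields exactly 172 features are simplifications.

```python
# Sketch of note 3's features: 1024-point FFT -> 512 bins over
# 0-24 kHz, keep only bins >= 16 kHz, then train an SVM classifier
# (scikit-learn's SVC wraps LIBSVM [20]). Frame segmentation and the
# fade-related features that bring the count to 172 are omitted.
import numpy as np
from sklearn.svm import SVC

FS = 48_000
N_FFT = 1024     # 512 positive-frequency bins over 0-24 kHz
F_CUT = 16_000   # the paper discards everything below 16 kHz

def extract_features(frame: np.ndarray) -> np.ndarray:
    mag = np.abs(np.fft.rfft(frame, n=N_FFT))[:N_FFT // 2]  # 512 bins
    freqs = np.fft.rfftfreq(N_FFT, d=1.0 / FS)[:N_FFT // 2]
    return mag[freqs >= F_CUT]                              # 16-24 kHz only

def train_grip_classifier(frames, grip_labels) -> SVC:
    X = np.stack([extract_features(f) for f in frames])
    clf = SVC(kernel="rbf")   # kernel choice is an assumption
    clf.fit(X, grip_labels)
    return clf
```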

References

  1. Creem SH, Proffitt DR (2001) Grasping objects by their handles: a necessary interaction between cognition and action. J Exp Psychol Hum Percept Perform 27(1):218


  2. MacKenzie CL, Iberall T (1994) The grasping hand. Advances in psychology, vol 104. North-Holland, Amsterdam


  3. Yoon D, Hinckley K, Benko H, Guimbretière F, Irani P, Pahud M, Gavriliu M (2015) Sensing tablet grasp + micro-mobility for active reading. In: Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology, ACM, pp 477–487

  4. Wimmer R, Boring S (2009) Handsense: discriminating different ways of grasping and holding a tangible user interface. In: Proceedings of the 3rd International Conference on Tangible and Embedded Interaction, ACM, pp 359–362

  5. Taylor BT, Bove VM Jr (2009) Graspables: grasp-recognition as a user interface. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM, pp 917–926

  6. Cheng L-P, Liang H-S, Wu C-Y, Chen MY (2013) iGrasp: grasp-based adaptive keyboard for mobile devices. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM, pp 3037–3046

  7. Ono M, Shizuki B, Tanaka J (2013) Touch & activate: adding interactivity to existing objects using active acoustic sensing. In: Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology, ACM, pp 31–40

  8. Cheng L-P, Hsiao F-I, Liu Y-T, Chen MY (2013) iRotateGrasp: automatic screen rotation based on grasp of mobile devices. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM, pp 3051–3054

  9. Goel M, Wobbrock J, Patel S (2012) GripSense: using built-in sensors to detect hand posture and pressure on commodity mobile phones. In: Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology, ACM, pp 545–554

  10. Park C, Ogawa T (2015) A study on grasp recognition independent of users’ situations using built-in sensors of smartphones. In: Adjunct Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology, ACM, pp 69–70

  11. Goel M, Jansen A, Mandel T, Patel SN, Wobbrock JO (2013) ContextType: using hand posture information to improve mobile touch screen text entry. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM, pp 2795–2798

  12. Zhou Z, Diao W, Liu X, Zhang K (2014) Acoustic fingerprinting revisited: generate stable device ID stealthily with inaudible sound. In: Proceedings of the 2014 ACM SIGSAC Conference on Computer and Communications Security, ACM, pp 429–440

  13. Ackerman E, Oda F (1962) Acoustic absorption coefficients of human body surfaces. Tech. rep., DTIC Document

  14. Ewins D (2000) Modal testing: theory, practice, and application. Research Studies Press

  15. Schwarz BJ, Richardson MH (1999) Experimental modal analysis. CSI Reliability Week 35(1):1–12


  16. Tung Y-C, Shin KG (2016) Expansion of human-phone interface by sensing structure-borne sound propagation. In: Proceedings of the 14th Annual International Conference on Mobile Systems, Applications, and Services, ACM, pp 277–289

  17. Tung Y-C, Shin KG (2015) Echotag: accurate infrastructure-free indoor location tagging with smartphones. In: Proceedings of the 21st Annual International Conference on Mobile Computing and Networking, ACM, pp 525–536

  18. Plack CJ (2005) The sense of hearing. Lawrence Erlbaum Associates Publishers

  19. Lazik P, Rowe A (2012) Indoor pseudo-ranging of mobile devices using ultrasonic chirps. In: Proceedings of the 10th ACM Conference on Embedded Network Sensor Systems, ACM, pp 99–112

  20. Chang C-C, Lin C-J (2011) LIBSVM: a library for support vector machines. ACM Transactions on Intelligent Systems and Technology 2(3):27


  21. Kim N, Lee J (2017) Towards grip sensing for commodity smartphones through acoustic signature. In: Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing, ACM, pp 1–4


Funding

This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (2019R1A2B5B02001794). Jinkyu Lee is the corresponding author.

Author information

Correspondence to Jinkyu Lee.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

A short, preliminary version of this paper was presented as a 4-page poster [21].


About this article


Cite this article

Kim, N., Lee, J., Whang, J.J. et al. SmartGrip: grip sensing system for commodity mobile devices through sound signals. Pers Ubiquit Comput 24, 643–654 (2020). https://doi.org/10.1007/s00779-019-01337-7

