Human-Drone Interaction: Using Pointing Gesture to Define a Target Object

  • Conference paper
Human-Computer Interaction. Multimodal and Natural Interaction (HCII 2020)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 12182)

Included in the following conference series: HCII: International Conference on Human-Computer Interaction

Abstract

This paper explores an optimal gesture interface for human-drone interaction in a firefighting scenario. To this end, we conducted a preliminary interview and a user study with seven subjects from the Kobe Firefighting Brigade, Japan. Because a drone's flight and locomotion properties shape the user's mental and physical expectations differently from those of ground robots, a careful investigation of user-defined design preferences and interactions is required. This work examines and discusses, with experienced firefighters, human-drone interactions that rely solely on the drone's monocular camera, without other devices such as GPS or external cameras. The user study had three main elements: a drone, a building, and the volunteering firefighters. During the study, each firefighter was asked to specify a window to the drone; the drone would, in principle, use that window to enter the building and perform designated tasks such as information gathering, saving the firefighter time and effort. Results show that all subjects chose pointing gestures and voice commands as the means to communicate a target window to the drone. Based on the resulting gesture, we built a prototype application in which the drone can determine which object the user is pointing at.
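
The prototype's code is not published here, but the core idea admits a compact illustration: given 2D arm keypoints from a pose estimator and object bounding boxes from a detector, cast the shoulder-to-wrist ray in image space and select the first box it enters. The sketch below is a minimal, hypothetical Python implementation under those assumptions; the keypoints, boxes, and helper names are illustrative and are not the authors' implementation.

    import numpy as np

    def pointing_ray(shoulder, wrist):
        # Origin at the wrist, direction along the extended arm (image coordinates).
        origin = np.asarray(wrist, dtype=float)
        direction = origin - np.asarray(shoulder, dtype=float)
        norm = np.linalg.norm(direction)
        if norm == 0.0:
            raise ValueError("shoulder and wrist coincide; pointing direction undefined")
        return origin, direction / norm

    def first_hit(origin, direction, box, max_dist=2000.0, step=2.0):
        # March along the ray; return the first distance at which it enters the
        # box (x_min, y_min, x_max, y_max), or None if the box is never entered.
        x_min, y_min, x_max, y_max = box
        for t in np.arange(0.0, max_dist, step):
            x, y = origin + t * direction
            if x_min <= x <= x_max and y_min <= y <= y_max:
                return t
        return None

    def select_pointed_object(shoulder, wrist, boxes):
        # Index of the detected box nearest along the pointing ray, or None.
        origin, direction = pointing_ray(shoulder, wrist)
        hits = [(first_hit(origin, direction, b), i) for i, b in enumerate(boxes)]
        hits = [(t, i) for t, i in hits if t is not None]
        return min(hits)[1] if hits else None

    # Illustrative values: two candidate windows; the arm points up and to the right.
    windows = [(300, 80, 380, 160), (120, 300, 200, 380)]
    print(select_pointed_object(shoulder=(200, 400), wrist=(240, 340), boxes=windows))  # -> 0

In a full system the shoulder and wrist coordinates would come from a pose estimator and the boxes from a window detector running on the drone's monocular feed; the ray-marching step size trades selection precision for speed.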

Author information

Corresponding author

Correspondence to Anna C. S. Medeiros.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Medeiros, A.C.S., Ratsamee, P., Uranishi, Y., Mashita, T., Takemura, H. (2020). Human-Drone Interaction: Using Pointing Gesture to Define a Target Object. In: Kurosu, M. (ed.) Human-Computer Interaction. Multimodal and Natural Interaction. HCII 2020. Lecture Notes in Computer Science, vol 12182. Springer, Cham. https://doi.org/10.1007/978-3-030-49062-1_48

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-49062-1_48

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-49061-4

  • Online ISBN: 978-3-030-49062-1

  • eBook Packages: Computer Science, Computer Science (R0)
