Augmented Reality for Interactive Robot Control
Robots are widely used to support mission-critical, high-risk, and complex operations. Human supervision and remote control are often required to operate robots in unpredictable and changing scenarios. Typically, robots are teleoperated by technicians via joystick interfaces that require training and experience to use. To make robots more practical and accessible, we propose using augmented reality (AR) to create a more intuitive, less training-intensive means of controlling robots than traditional joystick control. AR is a compelling platform for developing robot control systems because it combines the real world (the environment around the user, the physical robot, etc.) with the digital world (holograms, digital displays, etc.); it can even interpret physical gestures, such as pinching two fingers.
In this research, a Microsoft HoloLens headset is used to create an AR environment for controlling a Yaskawa Motoman SIA5D robot. The control process begins with the user placing an interactable holographic robot in 3D space. The user can then select between two control methods: manual control and automatic control. In manual control, the user moves the end effector of the holographic robot and the physical robot responds immediately. In automatic control, the user moves the end effector of the holographic robot to a desired location, views a holographic preview of the motion, and selects execute if the motion plan is satisfactory. In this preview mode, the user can inspect both the motion of the robot and the torques experienced by the manipulator's joints, giving additional feedback on the planned motion. In this project we succeeded in creating an AR control system that makes controlling a robotic manipulator intuitive and effective.
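The two control modes described above can be sketched as a small dispatch loop: manual commands are executed immediately, while automatic commands are first planned and previewed, then executed only after the user approves. This is a minimal, hypothetical sketch (class and method names are ours, not from the paper's implementation); a real system would replace the placeholder planner and executor with calls to the robot's motion-planning and control stack.

```python
from dataclasses import dataclass


@dataclass
class Pose:
    """A simplified end-effector target position."""
    x: float
    y: float
    z: float


class ARRobotController:
    """Dispatches holographic end-effector poses to the physical robot
    under the two control modes described in the text."""

    def __init__(self):
        self.mode = "manual"      # "manual" or "automatic"
        self.pending_plan = None  # motion plan awaiting user approval

    def on_hologram_moved(self, target: Pose):
        if self.mode == "manual":
            # Manual control: the physical robot follows immediately.
            return self.execute(target)
        # Automatic control: plan first, then show a holographic preview
        # (trajectory plus predicted joint torques) and wait for approval.
        self.pending_plan = self.plan(target)
        return ("preview", self.pending_plan)

    def on_execute_selected(self):
        # Called when the user approves the previewed plan.
        if self.pending_plan is None:
            raise RuntimeError("no previewed plan to execute")
        plan, self.pending_plan = self.pending_plan, None
        return self.execute(plan[-1])

    def plan(self, target: Pose):
        # Placeholder: a real system would invoke a motion planner here.
        return [Pose(0.0, 0.0, 0.0), target]

    def execute(self, target: Pose):
        # Placeholder: a real system would stream commands to the robot.
        return ("executed", target)
```

The key design point is that automatic mode inserts a human-approval step between planning and execution, which is what allows the user to reject an unsatisfactory motion before the physical robot moves.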
Keywords: Augmented reality · Microsoft HoloLens · Robotic arm · Force feedback · Motion planning
We would like to acknowledge the 2018 Dynamics Summer School at Los Alamos National Laboratory for sponsoring this project, as well as James Riback and Anita Jaramillo for their contributions.