
Deep Learning for Smartphone-Based Human Activity Recognition Using Multi-sensor Fusion

  • Conference paper
Wireless Internet (WICON 2018)

Abstract

In the field of ubiquitous computing, machines need to be aware of the present context to enable anticipatory communication with humans. This leads to human-centric applications whose primary objective is to improve the Quality-of-Life (QoL) of their users. One important type of context information for these applications is the current activity of the user, which can be derived from environmental and wearable sensors. Owing to its processing capabilities and the number of sensors it embeds, the smartphone is the most promising among existing technologies for human activity recognition (HAR) research. While machine learning-based solutions have been successful in past HAR studies, several of their design challenges can be resolved with deep learning. In this paper, we investigated Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) networks for dealing with common challenges in smartphone-based HAR, such as dependence on device location and on the subject, and the need for manual feature extraction. We showed that the CNN model accomplished location- and subject-independent recognition with overall accuracies of 98.38% and 90.61%, respectively. The LSTM model also performed location-independent recognition with an accuracy of 97.17%, but achieved a subject-independent recognition accuracy of only 80.02%. Finally, the networks' hyperparameters were tuned with Bayesian Optimization using Gaussian Processes to reach their best performance.
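The multi-sensor fusion described above typically starts by time-aligning the smartphone's accelerometer and gyroscope streams, stacking them channel-wise, and cutting the result into fixed-length overlapping windows that are fed directly to the CNN or LSTM (no manual feature extraction). The sketch below illustrates this preprocessing step; the 50 Hz sampling rate, 128-sample window, and 50% overlap are common HAR choices assumed for illustration, not parameters taken from the paper.

```python
import numpy as np

def segment_windows(acc, gyro, window_len=128, overlap=0.5):
    """Fuse tri-axial accelerometer and gyroscope streams and segment
    them into fixed-length, overlapping windows for a CNN/LSTM.

    acc, gyro: arrays of shape (n_samples, 3), assumed time-aligned.
    Returns an array of shape (n_windows, window_len, 6).
    """
    # Channel-level fusion: 3 accel axes + 3 gyro axes -> 6 channels
    fused = np.concatenate([acc, gyro], axis=1)
    step = int(window_len * (1.0 - overlap))  # hop between window starts
    n = (fused.shape[0] - window_len) // step + 1
    return np.stack([fused[i * step:i * step + window_len] for i in range(n)])

# Example: 10 s of 50 Hz data -> 500 samples per sensor
acc = np.random.randn(500, 3)
gyro = np.random.randn(500, 3)
X = segment_windows(acc, gyro)
print(X.shape)  # (6, 128, 6): 6 windows of 128 samples x 6 fused channels
```

Each window then becomes one training example, with the activity label assigned per window; this is the input shape a 1-D CNN or an LSTM consumes directly.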



Acknowledgement

The authors acknowledge the financial support of the University of the Philippines and Department of Science and Technology through the Engineering for Research and Development for Technology (ERDT) Program.

Author information

Correspondence to Nestor Michael C. Tiglao.


Copyright information

© 2019 ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering

About this paper

Cite this paper

San Buenaventura, C.V., Tiglao, N.M.C., Atienza, R.O. (2019). Deep Learning for Smartphone-Based Human Activity Recognition Using Multi-sensor Fusion. In: Chen, JL., Pang, AC., Deng, DJ., Lin, CC. (eds) Wireless Internet. WICON 2018. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 264. Springer, Cham. https://doi.org/10.1007/978-3-030-06158-6_7


  • DOI: https://doi.org/10.1007/978-3-030-06158-6_7

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-06157-9

  • Online ISBN: 978-3-030-06158-6

  • eBook Packages: Computer Science (R0)
