Human Activities Recognition Using Accelerometer and Gyroscope

  • Anna Ferrari
  • Daniela Micucci
  • Marco Mobilio
  • Paolo Napoletano
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11912)

Abstract

Many supervised machine learning techniques that use accelerometer and gyroscope signals for automatic Human Activity Recognition (HAR) have been proposed in the last decade. According to recent studies, the combination of accelerometer and gyroscope signals, also called multimodal recognition, increases the accuracy of HAR with respect to the use of each signal alone. This paper presents the results of an analysis we performed to compare the effectiveness of machine learning techniques when applied to accelerometer and gyroscope signals separately or jointly. We compare SVM and \(k\)-NN classifiers (combined with hand-crafted features) with a deep residual network on three publicly available datasets. The results show that deep learning used in multimodal mode (i.e., on accelerometer and gyroscope signals jointly) outperforms the other strategies by at least 10%.
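
To illustrate the unimodal versus multimodal setup described in the abstract, the following minimal Python sketch builds hand-crafted features from accelerometer windows, gyroscope windows, and their channel-wise combination, and feeds them to SVM and \(k\)-NN classifiers. This is not the authors' code: the window length (128 samples), the feature set, the classifier settings, and the synthetic data are assumptions made only for illustration, and the deep residual network evaluated in the paper is not reproduced here.

# Illustrative sketch (not the authors' pipeline): unimodal vs. multimodal
# input construction for HAR with hand-crafted features and SVM / k-NN.
# Window length, feature set, and classifier settings are assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def handcrafted_features(window):
    # Simple per-axis statistics over a (samples, axes) window.
    return np.concatenate([
        window.mean(axis=0), window.std(axis=0),
        window.min(axis=0), window.max(axis=0),
    ])

rng = np.random.default_rng(0)
n_windows, win_len = 600, 128                      # assumed 128-sample windows
acc = rng.normal(size=(n_windows, win_len, 3))     # accelerometer (x, y, z), synthetic
gyr = rng.normal(size=(n_windows, win_len, 3))     # gyroscope (x, y, z), synthetic
labels = rng.integers(0, 4, size=n_windows)        # 4 placeholder activity classes

# Multimodal: stack the two sensors along the axis/channel dimension.
multi = np.concatenate([acc, gyr], axis=2)         # shape (n_windows, 128, 6)

for name, data in [("acc only", acc), ("gyro only", gyr), ("acc+gyro", multi)]:
    X = np.stack([handcrafted_features(w) for w in data])
    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
    for clf in (SVC(kernel="rbf"), KNeighborsClassifier(n_neighbors=5)):
        score = accuracy_score(y_te, clf.fit(X_tr, y_tr).predict(X_te))
        print(f"{name:10s} {clf.__class__.__name__:22s} {score:.2f}")

On synthetic noise the accuracies stay near chance; the point of the sketch is only the data layout: stacking accelerometer and gyroscope windows along the channel dimension is the same representation that would be fed, as raw signals, to a 1D residual network in the multimodal configuration.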

Keywords

Inertial sensors · Machine learning · Deep learning · Human Activity Recognition


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. University of Milano - Bicocca, Milan, Italy
