Indoor Activity Recognition by Combining One-vs.-All Neural Network Classifiers Exploiting Wearable and Depth Sensors

  • Conference paper
Advances in Computational Intelligence (IWANN 2013)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 7903)

Included in the following conference series: International Work-Conference on Artificial Neural Networks (IWANN)

Abstract

Activity recognition has recently gained a lot of interest and appears to be a promising approach to help the elderly population pursue independent living. Several methods already exist to detect human activities based either on wearable sensors or on cameras, but few of them combine the two modalities. This paper presents a strategy to enhance the robustness of indoor human activity recognition by combining wearable and depth sensors. To exploit the data captured by these sensors, we used an ensemble of binary one-vs.-all neural network classifiers, each activity-specific model being configured to maximize its performance. The performance of the complete system is comparable to that of lazy learning methods (k-NN), which require access to the whole dataset.
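
As a rough illustration of this one-vs.-all ensemble, the sketch below trains one binary neural network per activity on fused wearable and depth features and labels each window with the activity whose model responds most strongly. It is a minimal sketch only: the activity names, feature dimensionality, per-activity hidden-layer sizes, and the use of scikit-learn's MLPClassifier are assumptions made for illustration, not details taken from the paper.

# Minimal one-vs.-all sketch, assuming per-window feature vectors that
# concatenate wearable (accelerometer) and depth (skeleton) features.
# Activity names, feature sizes and hidden-layer sizes are illustrative only.
import numpy as np
from sklearn.neural_network import MLPClassifier

ACTIVITIES = ["walk", "sit", "stand", "drink"]             # hypothetical labels
HIDDEN = {"walk": 10, "sit": 5, "stand": 5, "drink": 20}   # assumed per-model sizes

def train_one_vs_all(X, y):
    """Train one binary MLP per activity (positive class = that activity)."""
    models = {}
    for activity in ACTIVITIES:
        clf = MLPClassifier(hidden_layer_sizes=(HIDDEN[activity],),
                            max_iter=500, random_state=0)
        clf.fit(X, (y == activity).astype(int))
        models[activity] = clf
    return models

def predict(models, X):
    """Fuse the binary outputs: pick the activity whose model scores highest."""
    scores = np.column_stack([models[a].predict_proba(X)[:, 1] for a in ACTIVITIES])
    return np.array(ACTIVITIES)[scores.argmax(axis=1)]

# Toy usage with random data standing in for fused wearable + depth features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))
y = rng.choice(ACTIVITIES, size=200)
print(predict(train_one_vs_all(X, y), X[:5]))

Training one binary model per activity, rather than a single multiclass network, is what lets each activity-specific model be sized and tuned independently, which is the configuration flexibility the abstract refers to.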

Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Delachaux, B., Rebetez, J., Perez-Uribe, A., Satizábal Mejia, H.F. (2013). Indoor Activity Recognition by Combining One-vs.-All Neural Network Classifiers Exploiting Wearable and Depth Sensors. In: Rojas, I., Joya, G., Cabestany, J. (eds) Advances in Computational Intelligence. IWANN 2013. Lecture Notes in Computer Science, vol 7903. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-38682-4_25

  • DOI: https://doi.org/10.1007/978-3-642-38682-4_25

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-38681-7

  • Online ISBN: 978-3-642-38682-4

  • eBook Packages: Computer Science (R0)
