
A Novel Simulation Based Classifier Using Random Tree and Reinforcement Learning

  • Israr Ahmed
  • Munir Naveed
  • Mohammed Adnan
Conference paper
Part of the Lecture Notes on Data Engineering and Communications Technologies book series (LNDECT, volume 29)

Abstract

In this work, we present a new classification model to solve the Human Activity Recognition (HAR) problem. The new classifier is a hybrid of Random Tree and Monte-Carlo simulations, where the Random Tree is used to select random samples for each simulation. The simulations use a generative model to train a value function that predicts an activity from the sensor values. The classifier trains in an unsupervised learning style and does not require a training example dataset; it builds the value function from the responses of the environment. The experiments are performed on the HAR dataset and compared with state-of-the-art rival techniques. Performance is measured using precision, recall, F-score and accuracy rate. The results show that the new algorithm outperforms its rival techniques in F-score and accuracy. The classifier is also scalable and can generalize non-deterministic behaviours.
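The abstract outlines the approach closely enough to sketch its general shape in code. The Python fragment below is a minimal, illustrative sketch only, not the authors' implementation: a tabular value function over discretized sensor readings and candidate activities is updated by Monte-Carlo episodes, each run on a freshly drawn random sample of readings that loosely plays the role of the Random Tree sample selection. The toy data, the discretization (n_bins), the exploration rate (epsilon) and all other names are assumptions made for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for HAR sensor data: each row is a sensor vector, each label an activity id.
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # two synthetic "activities"

n_activities = 2
n_bins = 5                                       # coarse discretization of each sensor axis (assumed)
edges = np.linspace(X.min(), X.max(), n_bins - 1)

def state_of(x):
    """Map a continuous sensor vector to a discrete, hashable state index."""
    return tuple(np.digitize(x, edges))

value = {}                                       # value[(state, activity)] -> running Monte-Carlo estimate
counts = {}

def mc_episode(sample_idx, epsilon=0.1):
    """One Monte-Carlo simulation over a random sample of readings.
    The 'environment response' is modelled here as a reward of +1 for a correct prediction."""
    for i in sample_idx:
        s = state_of(X[i])
        if rng.random() < epsilon:
            a = int(rng.integers(n_activities))  # explore
        else:
            a = max(range(n_activities), key=lambda k: value.get((s, k), 0.0))
        reward = 1.0 if a == y[i] else 0.0
        c = counts.get((s, a), 0) + 1
        counts[(s, a)] = c
        v = value.get((s, a), 0.0)
        value[(s, a)] = v + (reward - v) / c     # incremental Monte-Carlo average

def classify(x):
    """Predict the activity with the highest learned value for this state."""
    s = state_of(x)
    return max(range(n_activities), key=lambda k: value.get((s, k), 0.0))

# Each simulation works on a freshly drawn random sample of readings,
# loosely mimicking the Random Tree sample-selection step described in the abstract.
for _ in range(300):
    mc_episode(rng.choice(len(X), size=32, replace=False))

acc = np.mean([classify(x) == t for x, t in zip(X, y)])
print(f"training-set accuracy of the sketch: {acc:.2f}")

The incremental averaging in mc_episode is the standard Monte-Carlo value update; in the classifier described by the authors the reward would come from the environment's response rather than from held-out labels as in this toy example.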


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Higher Colleges of Technology, Abu Dhabi, United Arab Emirates
