Motion-Based Gait Identification Using Spectro-temporal Transform and Convolutional Neural Networks

  • Omid Dehzangi
  • Mojtaba Taherisadr (email author)
  • Raghvendar ChangalVala
  • Priyanka Asnani
Conference paper
Part of the Internet of Things book series (ITTCC)

Abstract

The widespread use of wearable sensors such as smartwatches provides access to valuable inertial sensor data that can be used to identify individuals by their gait pattern. Many studies have extracted various heuristic, high-level features from inertial sensor data to find discriminative gait signatures that distinguish a target individual from others. However, the complexity of the collected inertial data and the disconnect between the predictive learning model and the hand-crafted feature extraction module increase the error rate of manual feature extraction. We propose a new method for human gait identification based on a spectro-temporal, two-dimensional expansion of the gait cycle. We then design a deep convolutional neural network (DCNN) that extracts discriminative features from the two-dimensional expanded gait cycles while jointly optimizing the identification model. Our systematic approach to processing non-stationary motion signals for human gait identification has three main elements: (1) gait cycle extraction, (2) spectro-temporal representation of the gait cycle, and (3) deep convolutional learning. We collect motion signals from five inertial sensors placed at different locations: the lower back, chest, right knee, right ankle, and right wrist. We pre-process the acquired raw signals, apply an efficient heuristic segmentation methodology, and extract gait cycles from the segmented, processed data. Spectro-temporal two-dimensional features are then extracted by merging key instantaneous temporal and spectral descriptors within a gait cycle, which characterizes the non-stationarities in the inertial data of each cycle.
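The gait-cycle extraction step described above can be sketched with a simple peak-detection heuristic. This is an illustrative assumption — the chapter's exact segmentation rule is not given in the abstract — using a minimum stride time and a mean-amplitude threshold on the accelerometer magnitude:

```python
import numpy as np
from scipy.signal import find_peaks

def extract_gait_cycles(accel_mag, fs=100, min_stride_s=0.6):
    """Split an accelerometer-magnitude stream into per-cycle segments
    by detecting stride peaks (a common heuristic, assumed here)."""
    peaks, _ = find_peaks(
        accel_mag,
        distance=int(min_stride_s * fs),   # enforce a minimum stride time
        height=float(np.mean(accel_mag)),  # ignore sub-baseline wiggles
    )
    # Each consecutive pair of peaks bounds one gait cycle
    return [accel_mag[a:b] for a, b in zip(peaks[:-1], peaks[1:])]

# Synthetic 5-second walk at 100 Hz: one sharp stride peak per second
fs = 100
rng = np.random.default_rng(0)
ts = np.arange(5 * fs) / fs
walk = np.abs(np.sin(np.pi * ts)) ** 8 + 0.05 * rng.standard_normal(ts.size)
cycles = extract_gait_cycles(walk, fs=fs)
print(len(cycles))  # 4 cycles between 5 detected stride peaks
```

In practice the same routine would run per sensor location, and the per-cycle segments would be resampled to a fixed length before the spectro-temporal expansion.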
The two-dimensional time-frequency representations of the gait cycles extracted from the inertial sensor data of 10 subjects are fed as input to the proposed 10-layer DCNN architecture. In our experimental analysis, we achieved 93.36% accuracy on the subject identification task.
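A minimal sketch of turning a one-dimensional gait-cycle segment into a two-dimensional time-frequency image suitable as CNN input, assuming an STFT-based spectrogram (the chapter merges instantaneous temporal and spectral descriptors, which may use a different time-frequency distribution):

```python
import numpy as np
from scipy.signal import spectrogram

def gait_cycle_tf_image(cycle, fs=100, nperseg=32, noverlap=24):
    """Expand a 1-D gait-cycle segment into a 2-D time-frequency image
    via the short-time Fourier transform, normalized for CNN input."""
    f, t, Sxx = spectrogram(cycle, fs=fs, nperseg=nperseg, noverlap=noverlap)
    Sxx = np.log1p(Sxx)                              # compress dynamic range
    Sxx = (Sxx - Sxx.min()) / (np.ptp(Sxx) + 1e-12)  # scale to [0, 1]
    return f, t, Sxx

# Synthetic one-second gait cycle: 2 Hz stride component plus a harmonic
fs = 100
rng = np.random.default_rng(0)
ts = np.arange(fs) / fs
cycle = (np.sin(2 * np.pi * 2 * ts) + 0.5 * np.sin(2 * np.pi * 6 * ts)
         + 0.1 * rng.standard_normal(fs))
f, t, img = gait_cycle_tf_image(cycle, fs=fs)
print(img.shape)  # (17, 9): frequency bins x time frames
```

Stacking such images across sensors (or treating each sensor as a channel) yields the multi-channel 2-D input that a convolutional network can consume directly.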

Keywords

Spectro-temporal analysis · Motion analysis · Convolutional neural network · Gait identification · Time-frequency representation · Sensor fusion · Motion

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Omid Dehzangi (1)
  • Mojtaba Taherisadr (2), email author
  • Raghvendar ChangalVala (2)
  • Priyanka Asnani (2)
  1. Rockefeller Neuroscience Institute, West Virginia University, Morgantown, USA
  2. University of Michigan, Dearborn, USA