A School Violence Detection Algorithm Based on a Single MEMS Sensor

  • Jifu Shi
  • Liang Ye
  • Hany Ferdinando
  • Tapio Seppänen
  • Esko Alasaarela
Conference paper
Part of the Lecture Notes in Electrical Engineering book series (LNEE, volume 517)

Abstract

School violence has become increasingly frequent in school life and causes great harm to social and educational development in many countries. In this paper, a single MEMS sensor fixed on the waist was used to collect data, and features were extracted from the acceleration and gyroscope signals. Altogether nine kinds of activities were recorded: six daily-life activities and three violence activities. A filter-based Relief-F feature selection algorithm was applied, and a Radial Basis Function (RBF) neural network was used as the classifier. The results showed that the algorithm could distinguish physical violence movements from daily-life movements with an accuracy of 90%.
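The abstract outlines a three-stage pipeline: windowed features extracted from waist-worn accelerometer and gyroscope signals, filter-based Relief-F feature ranking, and an RBF neural network classifier. The Python sketch below is a minimal illustration of that pipeline only; the choice of time-domain features, the simplified Relief-F weighting, the network size, and all names (window_features, relief_f_scores, RBFNet) are assumptions of this sketch, not details taken from the paper.

    import numpy as np
    from sklearn.cluster import KMeans

    def window_features(window):
        """window: (n_samples, 6) array of [ax, ay, az, gx, gy, gz] readings."""
        feats = []
        for axis in window.T:
            feats += [axis.mean(), axis.std(), axis.min(), axis.max()]
        return np.array(feats)                         # 6 axes x 4 statistics = 24 features

    def relief_f_scores(X, y, n_neighbors=5):
        """Simplified filter-based Relief-F: reward features whose values differ
        more for nearest misses (other classes) than for nearest hits (same class)."""
        X = (X - X.min(0)) / (np.ptp(X, 0) + 1e-12)    # scale each feature to [0, 1]
        scores = np.zeros(X.shape[1])
        for i in range(len(X)):
            dist = np.abs(X - X[i]).sum(1)
            dist[i] = np.inf                           # exclude the sample itself
            hits = np.argsort(dist + (y != y[i]) * 1e9)[:n_neighbors]
            misses = np.argsort(dist + (y == y[i]) * 1e9)[:n_neighbors]
            scores += np.abs(X[misses] - X[i]).mean(0) - np.abs(X[hits] - X[i]).mean(0)
        return scores / len(X)

    class RBFNet:
        """RBF network: k-means centres, Gaussian hidden units, least-squares read-out."""
        def __init__(self, n_centers=20, gamma=1.0):
            self.n_centers, self.gamma = n_centers, gamma
        def _phi(self, X):
            d2 = ((X[:, None, :] - self.centers[None]) ** 2).sum(-1)
            return np.exp(-self.gamma * d2)
        def fit(self, X, y):
            self.centers = KMeans(self.n_centers, n_init=10).fit(X).cluster_centers_
            T = np.eye(int(y.max()) + 1)[y]            # one-hot class targets
            self.W, *_ = np.linalg.lstsq(self._phi(X), T, rcond=None)
            return self
        def predict(self, X):
            return np.argmax(self._phi(X) @ self.W, axis=1)

    # Illustrative usage with random data standing in for real sensor windows.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 24))                     # 200 windows x 24 features
    y = rng.integers(0, 9, size=200)                   # 9 activities (6 daily-life + 3 violence)
    keep = np.argsort(relief_f_scores(X, y))[-10:]     # keep the 10 highest-scoring features
    model = RBFNet(n_centers=30, gamma=0.5).fit(X[:, keep], y)
    accuracy = (model.predict(X[:, keep]) == y).mean()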

Keywords

Activity recognition · School violence · Relief-F · MEMS accelerometer · RBF neural network

Acknowledgements

This work was supported by the National Natural Science Foundation of China (61602127), and partly supported by the Directorate General of Higher Education, Indonesia (2142/E4.4/K/2013), and the Finnish Cultural Foundation, North Ostrobothnia Regional Fund. The authors would like to thank all those who helped with the experiments.

Copyright information

© Springer Nature Singapore Pte Ltd. 2020

Authors and Affiliations

  • Jifu Shi (1)
  • Liang Ye (1, 2)
  • Hany Ferdinando (2, 3)
  • Tapio Seppänen (4)
  • Esko Alasaarela (2)

  1. School of Electronic and Information Engineering, Harbin Institute of Technology, Harbin, China
  2. OPEM Unit, Health and Wellness Measurement Research Group, University of Oulu, Oulu, Finland
  3. Department of Electrical Engineering, Petra Christian University, Surabaya, Indonesia
  4. Physiological Signal Analysis Team, University of Oulu, Oulu, Finland
