An Efficient Scheme for Prototyping kNN in the Context of Real-Time Human Activity Recognition

  • Paulo J. S. Ferreira
  • Ricardo M. C. Magalhães
  • Kemilly Dearo Garcia
  • João M. P. Cardoso
  • João Mendes-Moreira
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11871)

Abstract

The kNN classifier is widely used in Human Activity Recognition (HAR) systems. Research efforts have proposed methods to reduce the high computational cost of the original kNN by focusing, e.g., on approximate kNN solutions such as those relying on Locality-Sensitive Hashing (LSH). However, embedded kNN implementations must also address the target device's memory constraints and power/energy consumption. One important aspect is the constraint on the maximum number of instances stored during the kNN learning process, whether offline, or online and incremental. This paper presents simple, energy- and computationally efficient, real-time feasible schemes for maintaining a bounded number of learning instances stored by kNN. Experiments in the context of HAR show the efficiency of our best approaches and their capability to prevent the kNN storage from running out of training instances for a given activity, a situation not prevented by typical default schemes.
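To make the idea of a bounded instance store concrete, the sketch below (not the paper's exact scheme; class names, parameters, and the eviction rule are illustrative assumptions) shows a kNN classifier with a fixed storage budget that, when full, evicts the oldest instance of the most-represented class, so that no activity class can run out of stored training instances.

```python
# Minimal sketch, assuming a class-balanced eviction policy and brute-force
# Euclidean kNN; all names (BoundedKNN, learn, predict, max_instances) are
# hypothetical and do not correspond to the authors' implementation.
from collections import Counter, deque
import math


class BoundedKNN:
    def __init__(self, k=3, max_instances=200):
        self.k = k
        self.max_instances = max_instances
        self.store = deque()  # (feature_vector, label), oldest first

    def learn(self, x, y):
        """Add one labelled instance, evicting if the budget is exceeded."""
        self.store.append((x, y))
        if len(self.store) > self.max_instances:
            counts = Counter(label for _, label in self.store)
            majority = counts.most_common(1)[0][0]
            # Evict the oldest instance of the most-represented class, so
            # minority classes (activities) keep their training instances.
            for i, (_, label) in enumerate(self.store):
                if label == majority:
                    del self.store[i]
                    break

    def predict(self, x):
        """Brute-force kNN majority vote over the stored instances."""
        if not self.store:
            return None
        neighbours = sorted((math.dist(x, xi), yi) for xi, yi in self.store)
        votes = Counter(label for _, label in neighbours[: self.k])
        return votes.most_common(1)[0][0]


# Toy usage with 2-D feature vectors standing in for HAR features:
clf = BoundedKNN(k=3, max_instances=10)
for i in range(20):
    clf.learn((i * 0.1, i * 0.2), "walking" if i % 2 else "sitting")
print(clf.predict((0.5, 1.0)))
```

A default scheme such as plain first-in-first-out eviction, by contrast, can discard all stored instances of an activity that stops appearing in the stream, which is the failure mode the abstract refers to.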

Keywords

k-Nearest Neighbor Classification · kNN prototyping · LSH · Human Activity Recognition (HAR)

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. INESC TEC, Faculty of Engineering, University of Porto, Porto, Portugal
  2. University of Twente, Enschede, The Netherlands