
Mobile Based Prompted Labeling of Large Scale Activity Data

  • Ian Cleland
  • Manhyung Han
  • Chris Nugent
  • Hosung Lee
  • Shuai Zhang
  • Sally McClean
  • Sungyoung Lee
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8277)

Abstract

This paper describes the use of a prompted labeling solution to obtain class labels for user activity and context information on a mobile device. Based on the output from an activity recognition module, the prompted labeling module polls for class transitions from any of the activities (e.g., walking, running) to the standing still activity. Once a transition has been detected, the system prompts the user, via a message on the mobile phone, to provide a label for the activity that was just carried out. This label, along with the raw sensor data, is then stored locally before being uploaded to cloud storage. The paper provides technical details of how and when the system prompts the user for an activity label and discusses the information that can be gleaned from the sensor data. The system allows activity and context information to be collected on a large scale, and the resulting data open new opportunities in data mining and the modeling of user context for a variety of applications.
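The transition-triggered prompting described above can be summarized in a short sketch. The paper does not publish its implementation, so the class and method names below (PromptLabeler, onActivityClassified, and the ActivityLabel set) are illustrative assumptions rather than the authors' code; the sketch shows only the core rule: prompt the user when the recognizer reports a change from any activity into standing still.

```java
// Hypothetical sketch of the prompted labeling rule; names are assumptions,
// not the authors' implementation.
enum ActivityLabel { WALKING, RUNNING, STANDING_STILL, UNKNOWN }

public class PromptLabeler {

    private ActivityLabel previous = ActivityLabel.UNKNOWN;

    /** Called with each classification emitted by the recognition module. */
    public void onActivityClassified(ActivityLabel current) {
        // Prompt only on a transition *into* standing still, so the user
        // is asked about the activity that has just finished.
        if (current == ActivityLabel.STANDING_STILL
                && previous != ActivityLabel.STANDING_STILL
                && previous != ActivityLabel.UNKNOWN) {
            promptUserForLabel(previous);
        }
        previous = current;
    }

    private void promptUserForLabel(ActivityLabel predicted) {
        // On a real device this would post a notification or dialog asking
        // the user to confirm or correct the predicted label; the confirmed
        // label would then be stored locally with the raw sensor data and
        // later uploaded to cloud storage.
        System.out.println("Was your last activity " + predicted + "?");
    }

    public static void main(String[] args) {
        PromptLabeler labeler = new PromptLabeler();
        // Simulated classifier output: walking, then standing still.
        labeler.onActivityClassified(ActivityLabel.WALKING);
        labeler.onActivityClassified(ActivityLabel.STANDING_STILL); // triggers prompt
    }
}
```

Prompting only at transitions into standing still, rather than at fixed intervals, means the user is interrupted when they have plausibly stopped moving and the activity to be labeled has a well-defined end point.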

Keywords

Mobile Device · Cloud Service · Activity Recognition · Context Aware Application · Context Aware Service



Copyright information

© Springer International Publishing Switzerland 2013

Authors and Affiliations

  • Ian Cleland (1)
  • Manhyung Han (2)
  • Chris Nugent (1)
  • Hosung Lee (2)
  • Shuai Zhang (1)
  • Sally McClean (3)
  • Sungyoung Lee (2)
  1. Computer Science Research Institute and School of Computing and Mathematics, University of Ulster, Co. Antrim, Northern Ireland
  2. Dept. of Computer Engineering, Kyung Hee University, Korea
  3. Computer Science Research Institute and School of Computing and Information Engineering, University of Ulster, Coleraine, Northern Ireland
