Aiding Episodic Memory in Lifelog System Focusing on User Status

  • Xin Ye
  • Jiro Tanaka
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11582)

Abstract

A lifelog can be described as a digital library of an individual's life, known for its ability to record daily experience and support memory. The Autographer, a wearable camera that captures images automatically, is often used to aid episodic memory in lifelog systems. To improve the effectiveness of memory retrieval from lifelogs, this paper proposes two novel user-related memory cues for extracting important memories: a special sentiment cue and a special movement cue. By integrating two Autographers with the sensors embedded in an Android smartphone, we implemented a web-based lifelog viewer that lets lifeloggers retrieve memories conveniently. We invited participants to evaluate the usability and efficiency of the system. The preliminary results show the positive potential of our approaches for aiding episodic memory.
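The special movement cue described above is derived from smartphone accelerometer data. The abstract does not specify the detection method, so the following is only a minimal illustrative sketch, assuming a simple approach: flag fixed-size windows of accelerometer samples whose acceleration-magnitude variance exceeds a threshold, marking unusually vigorous movement. The window size and threshold here are hypothetical.

```python
# Hypothetical sketch of a "special movement" cue: threshold the variance
# of the acceleration magnitude over fixed-size windows of (x, y, z)
# accelerometer samples. Window size and threshold are illustrative only.

import math
from statistics import pvariance

def magnitude(sample):
    """Euclidean norm of one (x, y, z) accelerometer reading."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def special_movement_windows(samples, window=5, threshold=1.0):
    """Return start indices of non-overlapping windows whose
    magnitude variance exceeds `threshold`."""
    mags = [magnitude(s) for s in samples]
    flagged = []
    for start in range(0, len(mags) - window + 1, window):
        if pvariance(mags[start:start + window]) > threshold:
            flagged.append(start)
    return flagged

# Example: quiet samples (phone at rest, ~1 g on z) followed by a burst.
quiet = [(0.0, 0.0, 9.8)] * 5
burst = [(3.0, 2.0, 9.8), (9.0, 1.0, 2.0), (0.5, 8.0, 4.0),
         (7.0, 0.2, 9.9), (1.0, 1.0, 9.8)]
print(special_movement_windows(quiet + burst))  # → [5]
```

Windows flagged this way could then be matched by timestamp against the Autographer's image stream to surface the corresponding moments in the lifelog viewer.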

Keywords

Episodic memory · Lifelog · Special movement · Special sentiment

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Waseda University, Kitakyushu, Japan