Event Extraction Using Transportation of Temporal Optical Flow Fields

  • Itaru Gotoh
  • Hiroki Hiraoka
  • Atsushi Imiya
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11134)

Abstract

In this paper, we develop a method to transform a sequence of images into a sequence of events. Optical flow, the vector field of pointwise motion computed from a monocular image sequence, describes pointwise motion in an environment. The method extracts the global smoothness and continuity of the motion field and detects collapses of that smoothness in long image sequences using transportation of the temporal optical flow field.
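The abstract's idea, comparing successive temporal optical flow fields by a transportation (earth mover's) distance and flagging frames where that distance jumps, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the flow fields are assumed to be precomputed `(H, W, 2)` arrays, flows are summarised as magnitude-weighted direction histograms, and a simple linear 1-D earth mover's distance (difference of cumulative sums) stands in for a full transportation distance on the circle. The function names `direction_histogram`, `emd_1d`, and `detect_events` are all hypothetical.

```python
import numpy as np

def direction_histogram(flow, bins=16):
    """Magnitude-weighted histogram of flow directions.

    flow: (H, W, 2) array of (u, v) optical-flow vectors.
    """
    u, v = flow[..., 0], flow[..., 1]
    ang = np.arctan2(v, u)          # directions in (-pi, pi]
    mag = np.hypot(u, v)            # flow magnitudes as weights
    hist, _ = np.histogram(ang, bins=bins, range=(-np.pi, np.pi), weights=mag)
    total = hist.sum()
    return hist / total if total > 0 else hist

def emd_1d(p, q):
    """1-D earth mover's distance between normalised histograms.

    Computed as the mean absolute difference of cumulative sums;
    a linear (non-circular) approximation of the transport cost.
    """
    return np.abs(np.cumsum(p - q)).sum() / len(p)

def detect_events(flows, threshold=0.05):
    """Flag frame indices where the transport distance between
    consecutive flow fields exceeds the threshold."""
    hists = [direction_histogram(f) for f in flows]
    dists = [emd_1d(hists[i], hists[i + 1]) for i in range(len(hists) - 1)]
    events = [i + 1 for i, d in enumerate(dists) if d > threshold]
    return events, dists
```

For example, a sequence of frames all moving rightward yields near-zero distances, while a sudden switch to upward motion produces a spike in `dists` and an entry in `events`; a circular transport distance in the sense of Rabin et al. would refine this by letting mass wrap around the angular axis.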

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. School of Science and Engineering, Chiba University, Chiba, Japan
  2. Institute of Management and Information Technologies, Chiba University, Chiba, Japan