Abstract
Biologically inspired computational models of visual processing often rely on conventional frame-based cameras for data acquisition. In contrast, the Dynamic Vision Sensor (DVS) emulates the main processing stages of the mammalian retina and generates spike trains that encode temporal changes in the luminance distribution of a visual scene. Based on such a sparse input representation, we propose neural mechanisms for initial motion estimation and integration that are functionally related to the dorsal stream of the visual cortical hierarchy. We adapt the spatio-temporal filtering scheme originally suggested by Adelson and Bergen to make it consistent with the input representation generated by the DVS. To regulate the overall activation of single neurons against a pool of neighboring cells, we incorporate a competitive stage that operates over the spatial as well as the feature domain. The impact of this normalization stage is evaluated using information-theoretic measures, and the resulting optical flow estimates are analyzed against synthetic ground-truth data.
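The two mechanisms named in the abstract can be sketched in a few lines: an Adelson-Bergen-style spatio-temporal energy stage applied to a stack of DVS-like event frames, followed by divisive normalization against a pool of responses. This is a minimal illustration, not the authors' implementation; all filter shapes, parameter values, and function names are assumptions.

```python
# Hypothetical sketch: spatio-temporal motion energy on event frames,
# followed by divisive pool normalization. Parameters are illustrative.
import numpy as np

def gabor_pair(size=11, wavelength=6.0, sigma=2.5):
    """Even/odd (quadrature) spatial Gabor filters, horizontally tuned."""
    r = np.arange(size) - size // 2
    x, y = np.meshgrid(r, r)
    env = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    arg = 2 * np.pi * x / wavelength
    return env * np.cos(arg), env * np.sin(arg)

def temporal_pair(n_taps=9, tau=2.0):
    """Fast/slow causal gamma-like temporal kernels (newest tap first)."""
    t = np.arange(n_taps, dtype=float)
    fast = (t / tau) ** 3 * np.exp(-t / tau)
    slow = (t / tau) ** 5 * np.exp(-t / tau)
    return fast / fast.sum(), slow / slow.sum()

def motion_energies(stack):
    """Opponent motion energies at the centre of a T x H x W event-frame
    stack (newest frame first); the sign convention fixes which output
    corresponds to which preferred direction."""
    g_even, g_odd = gabor_pair(size=stack.shape[1])
    f_fast, f_slow = temporal_pair(n_taps=stack.shape[0])
    s_e = np.array([(frame * g_even).sum() for frame in stack])
    s_o = np.array([(frame * g_odd).sum() for frame in stack])
    a, d = f_fast @ s_e, f_slow @ s_e
    c, b = f_fast @ s_o, f_slow @ s_o
    e_one = (a - b) ** 2 + (c + d) ** 2   # energy, one direction
    e_two = (a + b) ** 2 + (c - d) ** 2   # energy, opposite direction
    return e_one, e_two

def pool_normalize(energies, sigma2=0.01):
    """Divisive normalization: each response is rescaled by the pooled
    (mean) activity of its neighbours plus a semi-saturation constant."""
    e = np.asarray(energies, dtype=float)
    return e / (sigma2 + e.mean())

# A temporally static event pattern carries no directional information,
# so both opponent energies coincide.
rng = np.random.default_rng(0)
frame = (rng.random((11, 11)) < 0.2).astype(float)
static = np.stack([frame] * 9)
e_one, e_two = motion_energies(static)
norm = pool_normalize([e_one, e_two])
```

A full model would convolve such filters over all image positions, orientations, and speeds, and normalize each unit against its spatial and feature neighborhood; the sketch computes a single location and one orientation only.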
References
Litzenberger, M., Belbachir, A.N., Donath, N., Gritsch, G., Garn, H., Kohn, B., Posch, C., Schraml, S.: Estimation of vehicle speed based on asynchronous data from a silicon retina optical sensor. In: IEEE Intelligent Transportation Systems Conference, Toronto, Canada, pp. 17–20 (2006)
Liu, S., Delbruck, T.: Neuromorphic sensory systems. Curr. Opin. Neurobiol. 20, 288–295 (2010)
Lichtsteiner, P., Posch, C., Delbruck, T.: A 128 \(\times \) 128 120 dB 15 \(\mu s\) latency asynchronous temporal contrast vision sensor. IEEE J. Solid-State Circuits 43, 566–576 (2008)
Delbruck, T., Lichtsteiner, P.: Fast sensory motor control based on event-based hybrid neuromorphic-procedural system. In: IEEE International Symposium on Circuits and Systems, pp. 845–848 (2007)
Litzenberger, M., Posch, C., Bauer, D., Belbachir, A.N., Schon, P., Kohn, B., Garn, H.: Embedded vision system for real-time object tracking using an asynchronous transient vision sensor. In: IEEE 12th Digital Signal Processing Workshop and 4th Signal Processing Education Workshop (DSPW), pp. 173–178 (2006)
Ni, Z., Pacoret, C., Benosman, R., Ieng, S., Regnier, S.: Asynchronous event-based high speed vision for microparticle tracking. J. Microsc. (2011)
Abdul-Kreem, L.I., Neumann, H.: Bio-inspired model for motion estimation using address event representation. In: 10th International Conference on Computer Vision Theory and Applications, VISIGRAPP, Berlin, Germany, 11–14 March 2015
Adelson, E., Bergen, J.: Spatiotemporal energy models for the perception of motion. J. Opt. Soc. Am. 2, 90–105 (1985)
Brox, T., Bruhn, A., Papenberg, N., Weickert, J.: High accuracy optical flow estimation based on a theory for warping. In: Pajdla, T., Matas, J.G. (eds.) ECCV 2004. LNCS, vol. 3024, pp. 25–36. Springer, Heidelberg (2004)
Drulea, M., Nedevschi, S.: Motion estimation using the correlation transform. IEEE Trans. Image Process. 22 (2013)
Fleet, D., Jepson, A.: Computation of component image velocity from local phase information. Int. J. Comput. Vis. 5, 77–104 (1990)
Horn, B., Schunck, B.: Determining optical flow. Artif. Intell. 17, 185–203 (1981)
Benosman, R., Ieng, S., Clercq, C., Bartolozzi, C., Srinivasan, M.: Asynchronous frameless event-based optical flow. Neural Netw. 27, 32–37 (2012)
Lucas, B.D., Kanade, T.: An iterative image registration technique with an application to stereo vision. In: Proceedings of Image Understanding Workshop, pp. 121–130 (1981)
Tschechne, S., Brosch, T., Sailer, R., Egloffstein, N., Abdul-Kreem, L.I., Neumann, H.: On event-based motion detection and integration. In: Proceedings of the 8th International Conference on Bio-inspired Information and Communication Technologies (BICT), Boston, MA, USA, December 1–3. ACM Digital Library (2014)
Tschechne, S., Sailer, R., Neumann, H.: Bio-inspired optic flow from event-based neuromorphic sensor input. In: El Gayar, N., Schwenker, F., Suen, C. (eds.) ANNPR 2014. LNCS, vol. 8774, pp. 171–182. Springer, Heidelberg (2014)
De Valois, R., Cottaris, N.P., Mahon, L.E., Elfar, S.D., Wilson, J.A.: Spatial and temporal receptive fields of geniculate and cortical cells and directional selectivity. Vis. Res. 40, 3685–3702 (2000)
Brosch, T., Tschechne, S., Neumann, H.: On event-based optical flow detection. Front. Neurosci. 9, Article No. 137, 1–15 (2015)
Ringach, D.L.: Spatial structure and symmetry of simple-cell receptive fields in macaque primary visual cortex. J. Neurophysiol. 88, 455–463 (2002)
Carandini, M., Heeger, D.J.: Normalization as a canonical neural computation. Nat. Rev. Neurosci. 13, 51–62 (2012)
Brosch, T., Neumann, H.: Computing with a canonical neural circuits model with pool normalization and modulating feedback. Neural Comput. 26, 2735–2789 (2014)
Blomfield, S.: Arithmetical operations performed by nerve cells. Brain Res. 69, 115–124 (1974)
Dayan, P., Abbott, L.F.: Theoretical Neuroscience. MIT Press, Cambridge (2001)
Silver, R.A.: Neuronal arithmetic. Nat. Rev. Neurosci. 11, 474–489 (2010)
Grossberg, S.: Nonlinear neural networks: principles, mechanisms, and architectures. Neural Netw. 1, 17–61 (1988)
Bouecke, J., Tlapale, E., Kornprobst, P., Neumann, H.: Neural mechanisms of motion detection, integration, and segregation: from biology to artificial image processing systems. EURASIP J. Adv. Signal Process. 2011, Article ID 781561, 22 pages (2011). doi:10.1155/2011/781561
Lyu, S., Simoncelli, E.P.: Nonlinear extraction of independent components of natural images using radial gaussianization. Neural Comput. 21, 1485–1519 (2009)
Bayerl, P., Neumann, H.: Disambiguating visual motion through contextual feedback modulation. Neural Comput. 16, 2041–2066 (2004)
Yo, C., Wilson, H.: Perceived direction of moving two-dimensional patterns depends on duration, contrast and eccentricity. Vis. Res. 32, 135–147 (1992)
Adelson, E., Movshon, J.: Phenomenal coherence of moving visual patterns. Nature 300, 523–525 (1982)
Simoncelli, E.: Bayesian multiscale differential optical flow. In: Handbook of Computer Vision and Applications, Chap. 14. Academic Press (1999)
Caplovitz, G., Hsieh, P., Tse, P.: Mechanisms underlying the perceived angular velocity of a rigidly rotating object. Vis. Res. 46, 2877–2893 (2006)
Hubel, D.H., Wiesel, T.N.: Receptive fields and functional architecture in two nonstriate visual areas (18 and 19) of the cat. J. Neurophysiol. 28, 229–289 (1965)
Pack, C.C., Livingstone, M.S., Duffy, K.R., Born, R.T.: End-stopping and the aperture problem: two-dimensional motion signals in macaque V1. Neuron 39, 671–680 (2003)
Tsui, J.M.G., Hunter, N., Born, R.T., Pack, C.C.: The role of V1 surround suppression in MT motion integration. J. Neurophysiol. 24, 3123–3138 (2010)
Studený, M., Vejnarová, J.: The multiinformation function as a tool for measuring stochastic dependence. In: Jordan, M.I. (ed.) Learning in Graphical Models. NATO ASI Series, vol. 89, pp. 261–297. Springer, Heidelberg (1998). (Kluwer Academic Publishers)
Cover, T.M., Thomas, J.A.: Elements of Information Theory, 2nd edn. Wiley, Hoboken (2006)
Strout, J.J., Pantle, A., Mills, S.L.: An energy model of interframe interval effects in single-step apparent motion. Vis. Res. 34, 3223–3240 (1994)
Emerson, R.C., Bergen, J.R., Adelson, E.H.: Directionally selective complex cells and the computation of motion energy in cat visual cortex. Vis. Res. 32, 203–218 (1992)
Challinor, K.L., Mather, G.: A motion-energy model predicts the direction discrimination and MAE duration of two-stroke apparent motion at high and low retinal illuminance. Vis. Res. 50, 1109–1116 (2010)
Acknowledgements
L.I.A.-K. has been supported by grants from the Ministry of Higher Education and Scientific Research (MoHESR), Iraq, and from the German Academic Exchange Service (DAAD). H.N. acknowledges support from the DFG within the Collaborative Research Center SFB/TR (A Companion Technology for Cognitive Technical Systems). The authors would like to thank M. Schels for his help in recording biological motion.
Copyright information
© 2016 Springer International Publishing Switzerland
Cite this paper
Abdul-Kreem, L.I., Neumann, H. (2016). Estimating Visual Motion Using an Event-Based Artificial Retina. In: Braz, J., et al. Computer Vision, Imaging and Computer Graphics Theory and Applications. VISIGRAPP 2015. Communications in Computer and Information Science, vol 598. Springer, Cham. https://doi.org/10.1007/978-3-319-29971-6_21
DOI: https://doi.org/10.1007/978-3-319-29971-6_21
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-29970-9
Online ISBN: 978-3-319-29971-6