Abstract
For spiking neural networks to perform computational tasks, benchmark data sets are required for model design, refinement and testing. Classic machine learning benchmark data sets use classification as the dominant paradigm; however, the temporal characteristics of spiking neural networks mean they are likely to be more useful for problems involving sequence data. To support these paradigms, we provide data sets of 11 moving scenes, each with multiple variations, recorded from a dynamic vision sensor (DVS128), comprising high-dimensional (16k pixel) and low-latency (15 microsecond) events. We also present a novel long-range prediction task based on the DVS128 data, and introduce a pilot study of a spiking neural network learning to predict thousands of events into the future.
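As an illustration of the event structure such recordings contain, the following is a minimal sketch of how DVS128 data could be loaded, assuming the files are in the jAER AEDAT 2.0 format (a '#'-prefixed ASCII header followed by big-endian pairs of a 32-bit address word and a 32-bit microsecond timestamp) and the common DVS128 address layout (bit 0 polarity, bits 1-7 x, bits 8-14 y). The file format, function name and bit layout here are assumptions for illustration rather than details taken from the paper, so the actual data set documentation should be consulted.

import struct

def read_dvs128_aedat2(path):
    """Sketch: decode DVS128 events from a jAER AEDAT 2.0 recording.

    Assumes big-endian (address, timestamp) 32-bit word pairs and the
    common DVS128 address layout; verify both against the data set's
    own documentation before use.
    """
    with open(path, 'rb') as f:
        pos = f.tell()
        line = f.readline()
        while line.startswith(b'#'):   # skip the ASCII header lines
            pos = f.tell()
            line = f.readline()
        f.seek(pos)                    # rewind to the first binary record
        data = f.read()

    events = []
    for offset in range(0, len(data) - 7, 8):
        addr, ts = struct.unpack_from('>II', data, offset)
        polarity = addr & 0x1          # ON/OFF contrast change
        x = (addr >> 1) & 0x7F         # column, 0..127
        y = (addr >> 8) & 0x7F         # row, 0..127
        events.append((ts, x, y, polarity))  # ts in microseconds
    return events

Each decoded event is then a (timestamp, x, y, polarity) tuple; for the prediction task described above, it is the microsecond timestamps that define the event sequence a spiking network would be asked to extrapolate.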
Cite this paper
Gibson, T., et al. (2014). Event-Based Visual Data Sets for Prediction Tasks in Spiking Neural Networks. In: Wermter, S., et al. (eds.) Artificial Neural Networks and Machine Learning – ICANN 2014. Lecture Notes in Computer Science, vol. 8681. Springer, Cham. https://doi.org/10.1007/978-3-319-11179-7_80