
Event-based Sensing for Space Situational Awareness

  • Gregory Cohen
  • Saeed Afshar
  • Brittany Morreale
  • Travis Bessell
  • Andrew Wabnitz
  • Mark Rutten
  • André van Schaik

Abstract

A revolutionary type of imaging device, known as a silicon retina or event-based sensor, has recently been developed and is gaining popularity in the field of artificial vision systems. These devices are inspired by the biological retina and operate in a significantly different way from traditional CCD-based imaging sensors. Whereas a CCD produces frames of pixel intensities, an event-based sensor produces a continuous stream of events, each generated when a pixel detects a change in log light intensity. The pixels operate asynchronously and independently, producing an event-based output with high temporal resolution. Because there is no fixed exposure time, each pixel independently achieves a very high dynamic range. These devices also offer high-speed, low-power operation and a sparse spatio-temporal output. As a consequence, their data must be interpreted in a significantly different way from that of traditional imaging sensors, and this paper explores the advantages this technology provides for space imaging. The applicability and capabilities of event-based sensors for space situational awareness (SSA) applications are demonstrated through telescope field trials. The trial results confirm that these devices are capable of observing resident space objects (RSOs) from low Earth orbit (LEO) through to geosynchronous (GEO) orbital regimes. Significantly, observations of RSOs were made during both daytime and night-time (terminator) conditions without modification to the camera or optics. The event-based sensor's ability to image stars and satellites during daytime hours offers a dramatic capability increase for terrestrial optical sensors. This paper presents the field testing and validation of two different architectures of event-based imaging sensors. The asynchronous output of an event-based sensor has an intrinsically low data rate, and together with the low weight, low power consumption, and high speed of these devices, this makes them ideally suited to the demanding requirements of space-based SSA systems. The results from these experiments, and the systems developed, highlight the applicability of event-based sensors to both ground-based and space-based SSA tasks.
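To make the sensing principle above concrete, the following is a minimal Python sketch of event generation from changes in log light intensity. It is an illustrative model only: the threshold value and the use of sampled frames as input are simplifying assumptions made here, whereas a real event-based pixel compares against its reference continuously and asynchronously in analog hardware.

    import numpy as np

    def events_from_frames(frames, timestamps, threshold=0.15):
        """Approximate the event-generation principle of an event-based pixel.

        Each pixel holds a reference log-intensity; whenever the current
        log-intensity deviates from it by more than `threshold`, the pixel
        emits an (x, y, t, polarity) event and resets its reference.
        Real sensors do this per pixel in continuous time; sampling
        discrete frames here is only an illustration.
        """
        log_ref = np.log(frames[0] + 1e-6)       # per-pixel reference log intensity
        events = []                              # (x, y, t, polarity) tuples
        for frame, t in zip(frames[1:], timestamps[1:]):
            log_now = np.log(frame + 1e-6)
            diff = log_now - log_ref
            fired = np.abs(diff) > threshold     # pixels whose change crossed threshold
            ys, xs = np.nonzero(fired)
            for x, y in zip(xs, ys):
                polarity = 1 if diff[y, x] > 0 else -1   # ON if brighter, OFF if darker
                events.append((x, y, t, polarity))
            log_ref[fired] = log_now[fired]      # reset reference only where events fired
        return events

    # Example: a single pixel brightening then darkening, as a point target
    # (e.g. a satellite) crosses it, yields one ON event and one OFF event.
    frames = [np.zeros((4, 4)) for _ in range(3)]
    frames[1][1, 2] = 1.0                        # pixel brightens -> ON event
    frames[2][1, 2] = 0.0                        # pixel darkens   -> OFF event
    print(events_from_frames(frames, timestamps=[0.0, 0.001, 0.002]))

Note how the output is sparse: only the pixels that change produce data, which is the origin of the low intrinsic data rate discussed above.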

Keywords

Neuromorphic · Event-based · SSA · Daytime


Copyright information

© American Astronautical Society 2018

Authors and Affiliations

  1. The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
  2. United States Air Force, Washington, USA
  3. National Security and Intelligence, Surveillance and Reconnaissance Division, Defence Science and Technology Group, Canberra, Australia
