Deep Learning of Multisensory Streaming Data for Predictive Modelling with Applications in Finance, Ecology, Transport and Environment

Part of the book series: Springer Series on Bio- and Neurosystems (SSBN, volume 7)

Abstract

This chapter presents methods for using eSNN and BI-SNN for deep, incremental learning and predictive modelling of streaming data, and for deep knowledge representation. The methods are applied to predictive modelling in the areas of finance, ecology, transport and environment, using the respective multisensory streaming data. Each of these applications requires a specific model design in terms of data preparation, SNN model parameters, experimental setting and validation. Each of the methods is illustrated with case study problems and data, but their applicability extends to a wider class of problems where multisensory streaming data are available.
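As a rough illustration of the input stage shared by these eSNN applications, the sketch below shows Gaussian receptive-field (population) encoding of a single streaming value into spike times, the standard first step before incremental eSNN learning. It is a minimal sketch only: the function name, defaults and parameter values are illustrative assumptions, not code from the NeuCube or eSNN implementations.

```python
import numpy as np

def grf_encode(value, v_min, v_max, n_fields=6, width_factor=1.5, t_max=1.0):
    """Encode one real-valued observation into spike times with Gaussian
    receptive fields: neurons whose centres lie close to the value are
    strongly excited and fire early; distant neurons fire late."""
    centres = v_min + (np.arange(n_fields) + 0.5) * (v_max - v_min) / n_fields
    sigma = width_factor * (v_max - v_min) / n_fields
    excitation = np.exp(-0.5 * ((value - centres) / sigma) ** 2)   # in (0, 1]
    return t_max * (1.0 - excitation)                              # strong excitation -> early spike

# Encode one normalised sample from a streaming variable
print(grf_encode(0.37, v_min=0.0, v_max=1.0))
```

Each incoming sample of each variable would be encoded this way, so a multisensory stream becomes a set of spike trains that the SNN can learn from in one pass.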

Acknowledgements

Parts of the material in this chapter have been previously published, as referenced in the relevant sections of this chapter. I would like to acknowledge the contributions of my co-authors to these publications: Enmei Tu, Josafath Israel Espinosa, Sue Worner, Reggio Hartono, Stefan Marks, Nathan Scott, S. Gulyaev, N. Sengupta, R. Khansam, V. Ravi, A. Gollahalli, Petr Maciak and Imanol Bilbao-Quintana.

Author information

Corresponding author

Correspondence to Nikola K. Kasabov.

Appendices

Appendix 1

Appendix 2

Improved stock market movement prediction with optimised eSNN parameters on the same stock data as in Sect. 19.2 (Fig. 19.24)

Fig. 19.24

A grid optimisation of two eSNN parameters (the number of receptive fields N and the width of the receptive fields) for BSE stock movement prediction. This resulted in a significant improvement in the accuracy of the predicted stock movement (a maximum accuracy of 90% was achieved for N = 11 and width = 1.6) (figure created by Imanol Bilbao-Quintana)
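For readers who want to run a comparable parameter sweep, the sketch below shows a plain grid search over the number of receptive fields N and their width. It is only an illustration of the search procedure: the evaluate_esnn function, the grid ranges and the data placeholder are hypothetical stand-ins, not the code used to produce Fig. 19.24.

```python
import itertools
import numpy as np

def evaluate_esnn(n_fields, width, data):
    """Hypothetical placeholder: a real run would encode the stock series with
    n_fields Gaussian receptive fields of the given width, train an eSNN
    movement classifier, and return its validation accuracy."""
    rng = np.random.default_rng(abs(hash((n_fields, round(float(width), 2)))) % (2**32))
    return rng.uniform(0.5, 0.9)   # placeholder score, for illustration only

data = None                              # stand-in for the BSE movement data set
grid_n = range(3, 21, 2)                 # candidate numbers of receptive fields N
grid_w = np.arange(0.2, 2.01, 0.2)       # candidate receptive-field widths

# Evaluate every (N, width) pair and keep the best-scoring combination
best = max(((n, w, evaluate_esnn(n, w, data))
            for n, w in itertools.product(grid_n, grid_w)),
           key=lambda t: t[2])
print(f"best N = {best[0]}, width = {best[1]:.1f}, accuracy = {best[2]:.2f}")
```

An exhaustive grid of this size is cheap because each eSNN is trained in a single pass over the data; for finer grids or more parameters, a coarse-to-fine or evolutionary search would be the natural next step.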

Copyright information

© 2019 Springer-Verlag GmbH Germany, part of Springer Nature

About this chapter

Cite this chapter

Kasabov, N.K. (2019). Deep Learning of Multisensory Streaming Data for Predictive Modelling with Applications in Finance, Ecology, Transport and Environment. In: Time-Space, Spiking Neural Networks and Brain-Inspired Artificial Intelligence. Springer Series on Bio- and Neurosystems, vol 7. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-57715-8_19
