Bayesian Optimisation of Large-scale Photonic Reservoir Computers


Reservoir computing is a growing paradigm for the simplified training of recurrent neural networks, with high potential for hardware implementations. Numerous experiments in optics and electronics yield performance comparable with that of state-of-the-art digital algorithms. Many recent works in the field focus on large-scale photonic systems, with tens of thousands of physical nodes and arbitrary interconnections. While this trend significantly expands the potential applications of photonic reservoir computing, it also complicates the optimisation of the system's numerous hyper-parameters. In this work, we propose the use of Bayesian optimisation for efficient exploration of the hyper-parameter space in a minimum number of iterations. We test this approach on a previously reported large-scale experimental system, compare it with the commonly used grid search, and report notable improvements in both performance and the number of experimental iterations required to optimise the hyper-parameters. Bayesian optimisation thus has the potential to become the standard method for tuning hyper-parameters in photonic reservoir computing.
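The core idea can be illustrated in a few lines of code. The sketch below is a minimal, hypothetical illustration of Gaussian-process-based Bayesian optimisation, not the authors' experimental pipeline: a toy one-dimensional `objective` stands in for the measured reservoir error (in the experiment, each call would be one hardware evaluation), a Gaussian-process surrogate models the error surface, and a lower-confidence-bound acquisition function selects the next hyper-parameter value to try. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

def objective(x):
    # Toy stand-in for the measured reservoir error (e.g. NMSE) as a
    # function of a single hyper-parameter such as the feedback gain.
    # In a real experiment, each call would be one hardware evaluation.
    return (x - 0.7) ** 2

def rbf_kernel(a, b, length=0.15):
    # Squared-exponential covariance between two sets of 1-D points.
    d = a.reshape(-1, 1) - b.reshape(1, -1)
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_obs, y_obs, x_query, noise=1e-6):
    # Gaussian-process posterior mean and standard deviation at x_query.
    y_mean = y_obs.mean()
    K = rbf_kernel(x_obs, x_obs) + noise * np.eye(len(x_obs))
    Ks = rbf_kernel(x_obs, x_query)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_obs - y_mean))
    mu = Ks.T @ alpha + y_mean
    v = np.linalg.solve(L, Ks)
    var = 1.0 - np.sum(v ** 2, axis=0)  # prior variance is 1 on the diagonal
    return mu, np.sqrt(np.maximum(var, 1e-12))

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 201)   # candidate hyper-parameter values
x_obs = rng.uniform(0.0, 1.0, 3)    # a few random initial evaluations
y_obs = objective(x_obs)

for _ in range(15):
    mu, sigma = gp_posterior(x_obs, y_obs, grid)
    lcb = mu - 2.0 * sigma          # lower-confidence-bound acquisition
    x_next = grid[np.argmin(lcb)]   # most promising point to try next
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, objective(x_next))

best_x = x_obs[np.argmin(y_obs)]
best_y = y_obs.min()
print(best_x, best_y)
```

Because the surrogate interpolates between past measurements, the loop concentrates evaluations near promising regions instead of sampling the whole grid, which is where the savings over grid search come from when each evaluation is a slow experimental run.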






This work was financially supported by AFOSR (grant nos. FA-9550-15-1-0279 and FA-9550-17-1-0072), the Région Grand-Est, and the Volkswagen Foundation via the NeuroQNet Project.

Author information



Corresponding author

Correspondence to Piotr Antonik.

Ethics declarations

Conflict of Interest

The authors declare that they have no conflict of interest.

Additional information

Ethical Approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This article belongs to the Topical Collection: Trends in Reservoir Computing

Guest Editors: Claudio Gallicchio, Alessio Micheli, Simone Scardapane, Miguel C. Soriano


Cite this article

Antonik, P., Marsal, N., Brunner, D. et al. Bayesian Optimisation of Large-scale Photonic Reservoir Computers. Cogn Comput (2021).



Keywords

  • Bayesian optimisation
  • Photonic reservoir computing
  • Large-scale networks
  • Hyper-parameter optimisation