Functional Federated Learning in Erlang (ffl-erl)

  • Gregor Ulm
  • Emil Gustavsson
  • Mats Jirstrand
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11285)


The functional programming language Erlang is well-suited for concurrent and distributed applications, but numerical computing is not seen as one of its strengths. Yet, the recent introduction of Federated Learning, which leverages client devices for decentralized machine learning tasks, while a central server updates and distributes a global model, motivated us to explore how well Erlang is suited to that problem. We present the Federated Learning framework ffl-erl and evaluate it in two scenarios: one in which the entire system has been written in Erlang, and another in which Erlang is relegated to coordinating client processes that rely on performing numerical computations in the programming language C. There is a concurrent as well as a distributed implementation of each case. We show that Erlang incurs a performance penalty, but for certain use cases this may not be detrimental, considering the trade-off between speed of development (Erlang) versus performance (C). Thus, Erlang may be a viable alternative to C for some practical machine learning tasks.
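The core server-side step of Federated Learning described above, averaging client model updates into a new global model, can be sketched in a few lines of Erlang. This is a minimal illustration of Federated Averaging, not the actual ffl-erl API; the module and function names, and the representation of an update as a `{NumSamples, Weights}` pair, are assumptions made for this example.

```erlang
%% Hypothetical sketch of the server-side Federated Averaging step.
%% Each client reports {NumSamples, Weights}, where Weights is a flat
%% list of model parameters; the server returns the sample-weighted
%% mean of the clients' parameter lists.
-module(fedavg_sketch).
-export([average/1]).

average(ClientUpdates) ->
    %% Total number of training samples across all clients.
    Total = lists:sum([N || {N, _W} <- ClientUpdates]),
    %% Scale each client's parameters by its share of the samples.
    Scaled = [[N / Total * X || X <- W] || {N, W} <- ClientUpdates],
    %% Sum the scaled parameter lists element-wise.
    lists:foldl(fun(W, Acc) ->
                        lists:zipwith(fun erlang:'+'/2, W, Acc)
                end,
                hd(Scaled), tl(Scaled)).
```

For example, `fedavg_sketch:average([{2, [1.0, 2.0]}, {2, [3.0, 4.0]}])` averages two equally weighted clients and yields `[2.0, 3.0]`.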


Keywords: Machine learning · Federated Learning · Distributed computing · Functional programming · Erlang



Our research was financially supported by the project On-board/Off-board Distributed Data Analytics (OODIDA) in the funding program FFI: Strategic Vehicle Research and Innovation (DNR 2016-04260), which is administered by VINNOVA, the Swedish Government Agency for Innovation Systems. It was carried out in the Fraunhofer Cluster of Excellence “Cognitive Internet Technologies.” Adrian Nilsson and Simon Smith assisted with the implementation. Melinda Tóth pointed us to Sher’s work. We also thank our anonymous reviewers for their helpful feedback.


References

  1. Allison, L.: Models for machine learning and data mining in functional programming. J. Funct. Program. 15(1), 15–32 (2005)
  2. Bauer, H., Goh, Y., Schlink, S., Thomas, C.: The supercomputer in your pocket. McKinsey on Semiconductors, pp. 14–27 (2012)
  3. Chen, D., Zhao, H.: Data security and privacy protection issues in cloud computing. In: Proceedings of the 2012 International Conference on Computer Science and Electronics Engineering (ICCSEE), vol. 1, pp. 647–651. IEEE (2012)
  4. Coppola, R., Morisio, M.: Connected car: technologies, issues, future trends. ACM Comput. Surv. (CSUR) 49(3), 1–36 (2016)
  5. Cuccu, G., Togelius, J., Cudre-Mauroux, P.: Playing Atari with six neurons. arXiv preprint arXiv:1806.01363 (2018)
  6. Evans-Pughe, C.: The connected car. IEE Rev. 51(1), 42–46 (2005)
  7. Fisher, R., Marshall, M.: Iris Data Set. UC Irvine Machine Learning Repository (1936)
  8. Cybenko, G.: Approximation by superpositions of a sigmoidal function. Math. Control Signals Syst. 2(4), 303–314 (1989)
  9. Hornik, K.: Approximation capabilities of multilayer feedforward networks. Neural Netw. 4(2), 251–257 (1991)
  10. Johansson, E., Pettersson, M., Sagonas, K.: A high performance Erlang system. In: Proceedings of the 2nd ACM SIGPLAN International Conference on Principles and Practice of Declarative Programming, pp. 32–43. ACM (2000)
  11. LeCun, Y., Cortes, C., Burges, C.J.: MNIST handwritten digit database. AT&T Labs (2010)
  12. Lee, J., Kim, C.M.: A roadside unit placement scheme for vehicular telematics networks. In: Kim, T., Adeli, H. (eds.) ACN/AST/ISA/UCMA 2010. LNCS, vol. 6059, pp. 196–202. Springer, Heidelberg (2010)
  13. Löscher, A., Sagonas, K.: The Nifty way to call hell from heaven. In: Proceedings of the 15th International Workshop on Erlang, pp. 1–11. ACM (2016)
  14. McMahan, H.B., Moore, E., Ramage, D., Hampson, S., et al.: Communication-efficient learning of deep networks from decentralized data. arXiv preprint arXiv:1602.05629 (2016)
  15. Nissen, S.: Implementation of a fast artificial neural network library (FANN). Report, Department of Computer Science, University of Copenhagen (DIKU) 31, 29 (2003)
  16. Orr, G.B., Müller, K.R.: Neural Networks: Tricks of the Trade. Springer, Heidelberg (2003)
  17. Sagonas, K., Pettersson, M., Carlsson, R., Gustafsson, P., Lindahl, T.: All you wanted to know about the HiPE compiler (but might have been afraid to ask). In: Proceedings of the 2003 ACM SIGPLAN Workshop on Erlang, pp. 36–42. ACM (2003)
  18. Sher, G.I.: Handbook of Neuroevolution Through Erlang. Springer, Heidelberg (2013)
  19. Srihari, S.N., Kuebert, E.J.: Integration of hand-written address interpretation technology into the United States Postal Service remote computer reader system. In: Proceedings of the Fourth International Conference on Document Analysis and Recognition, vol. 2, pp. 892–896. IEEE (1997)
  20. Tene, O., Polonetsky, J.: Privacy in the age of big data: a time for big decisions. Stan. L. Rev. Online 64, 63–69 (2011)
  21. Ulm, G., Gustavsson, E., Jirstrand, M.: OODIDA: On-board/off-board distributed data analytics for connected vehicles. arXiv preprint arXiv:1902.00319 (2019)
  22. Yu, T., Clack, C.: PolyGP: a polymorphic genetic programming system in Haskell. In: Genetic Programming, vol. 98 (1998)

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Fraunhofer-Chalmers Research Centre for Industrial Mathematics, Gothenburg, Sweden
  2. Fraunhofer Center for Machine Learning, Gothenburg, Sweden
