Parallel Parameter Identification in Industrial Biotechnology



Real-valued black-box optimization of ill-behaved and poorly understood functions is a broad topic in many scientific areas. Possible applications range from maximizing portfolio profits in financial mathematics, through the efficient training of neural networks in computational linguistics, to the parameter identification of metabolic models in industrial biotechnology. This paper compares several global and local optimization strategies applied to the task of efficiently identifying the free parameters of a metabolic network model. Particular attention is paid to how easily these strategies can be adapted to modern, highly parallel architectures. Finally, an outlook on the achievable parallel performance is given.


High performance computing · Black-box optimization · Biotechnology
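The abstract does not commit to a specific algorithm or toolchain, so the following is only a minimal, hypothetical sketch of the kind of task described: fitting the free rate constants of a small kinetic model by minimizing the squared deviation from measured trajectories with a global black-box optimizer, here differential evolution, while candidate parameter sets are evaluated in parallel on local cores via SciPy's workers option. The two-metabolite model, the synthetic data, and all parameter names are invented for illustration and are not the paper's actual metabolic network model.

```python
"""Hypothetical sketch: black-box parameter identification of a toy kinetic model
with parallel objective evaluations. Not the paper's actual model or code."""
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import differential_evolution

# Time points at which "measurements" of the two metabolites are available.
t_obs = np.linspace(0.0, 10.0, 21)
TRUE_PARAMS = (0.8, 0.3)  # assumed ground truth, used only to fake the data


def model_rhs(t, y, k1, k2):
    """Toy two-metabolite chain A -> B -> sink with rate constants k1, k2."""
    a, b = y
    return [-k1 * a, k1 * a - k2 * b]


def simulate(params):
    """Integrate the model and return the concentration trajectories."""
    sol = solve_ivp(model_rhs, (t_obs[0], t_obs[-1]), [1.0, 0.0],
                    t_eval=t_obs, args=tuple(params), rtol=1e-8)
    return sol.y


# Synthetic noisy "measurements" generated from the assumed true parameters.
rng = np.random.default_rng(0)
data = simulate(TRUE_PARAMS) + rng.normal(scale=0.01, size=(2, t_obs.size))


def objective(params):
    """Sum of squared residuals between simulation and measurements."""
    return float(np.sum((simulate(params) - data) ** 2))


if __name__ == "__main__":
    # Differential evolution as one representative global strategy;
    # workers=-1 spreads the population's objective evaluations over all cores.
    result = differential_evolution(objective, bounds=[(0.01, 5.0)] * 2,
                                    workers=-1, updating="deferred",
                                    polish=True, seed=1)
    print("identified parameters:", result.x, "residual:", result.fun)
```

Because each objective evaluation requires an ODE integration, the per-evaluation cost dominates and the population-based evaluation parallelizes naturally; the same pattern extends to MPI-distributed evaluations on a cluster.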



Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

  1. High Performance Computing Center Stuttgart (HLRS), University of Stuttgart, Stuttgart, Germany
