Time Accounting Artificial Neural Networks for Biochemical Process Models

  • Petia Georgieva
  • Luis Alberto Paz Suárez
  • Sebastião Feyo de Azevedo
Part of the Studies in Computational Intelligence book series (SCI, volume 657)


This paper focuses on developing more efficient computational schemes for modeling biochemical processes. A theoretical framework for estimating process kinetic rates based on different temporal (time accounting) artificial neural network (ANN) architectures is introduced. Three ANNs that explicitly account for the temporal aspects of modeling are exemplified: (i) a recurrent neural network (RNN) with global feedback (from the network output to the network input); (ii) a time-lagged feedforward network (TLFN); and (iii) a reservoir computing network (RCN). Crystallization growth rate estimation serves as the benchmark for testing the methodology. The proposed hybrid schemes (dynamical ANN combined with an analytical submodel) are a promising modeling framework when the process is strongly nonlinear, and particularly when input-output data are the only information available.
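Of the three architectures, the reservoir computing network (iii) is the simplest to sketch: a fixed random recurrent "reservoir" provides temporal memory, and only a linear readout is trained. The sketch below is illustrative only; the reservoir size, tanh nonlinearity, spectral-radius scaling, ridge readout, and the toy sine-prediction task are assumptions, not the chapter's actual crystallization setup.

```python
# Minimal reservoir computing (echo state network) sketch.
# Hypothetical configuration: 100-unit tanh reservoir, ridge-regression readout.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100

# Fixed random input and reservoir weights; only the readout W_out is trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # scale spectral radius below 1 (echo state property)

def run_reservoir(u_seq):
    """Drive the reservoir with an input sequence and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave
# (a stand-in for an unknown kinetic rate signal).
t = np.linspace(0, 8 * np.pi, 400)
u, y = np.sin(t[:-1]), np.sin(t[1:])
X = run_reservoir(u)

# Ridge-regression readout: the only trained part of an RCN.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
y_hat = X @ W_out
mse = np.mean((y_hat[50:] - y[50:]) ** 2)  # skip the initial transient
```

Because the recurrent weights stay fixed, training reduces to a single linear least-squares solve, which is what makes RCNs attractive next to fully trained RNNs.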


Keywords: Artificial Neural Network, Recurrent Neural Network, Distillation Column, Observation Error, Crystal Size Distribution



This work was financed by the Portuguese Foundation for Science and Technology within the activity of the Research Unit IEETA-Aveiro, which is gratefully acknowledged.



Copyright information

© Springer International Publishing Switzerland 2017

Authors and Affiliations

  • Petia Georgieva (1)
  • Luis Alberto Paz Suárez (2)
  • Sebastião Feyo de Azevedo (2)
  1. Signal Processing Lab, IEETA, DETI, University of Aveiro, Aveiro, Portugal
  2. Faculty of Engineering, Department of Chemical Engineering, University of Porto, Porto, Portugal
