
Block-Based Neural Network High Speed Optimization

  • Kundo Lee
  • Tomoki Hamagami
Conference paper
Part of the Proceedings in Adaptation, Learning and Optimization book series (PALO, volume 12)

Abstract

A Block-Based Neural Network (BBNN) consists of a 2-D array of memory-based modular component neural networks with flexible structures and internal configurations that can be implemented in reconfigurable hardware such as a field-programmable gate array (FPGA). The network structure and the weights are encoded in bit strings and globally optimized with genetic operators. The asynchronous BBNN (ABBNN), a new BBNN model, achieves higher performance by exploiting parallel computation and a pipeline architecture. The operating frequency of an ABBNN remains stable at every network scale, whereas that of a conventional BBNN decreases as the network grows. The ABBNN architecture therefore makes it possible to process and analyze high-sample-rate data simultaneously. However, optimization by a genetic algorithm is costly, and memory access is one of the factors that degrade training performance. In this paper, we introduce a new algorithm that reduces memory access during BBNN optimization. An ABBNN optimized with the proposed evolutionary algorithm is applied to general classification tasks to verify its effectiveness in reducing memory access.
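The abstract states that the network structure and weights are encoded in bit strings and optimized with genetic operators. The Python sketch below illustrates one possible bit-string encoding of a small block array together with standard crossover and mutation operators; all array sizes, field widths, and helper names are illustrative assumptions, not the authors' implementation or the algorithm proposed in the paper.

```python
import random

# Minimal sketch (assumed encoding): each block in a 2-D array contributes a
# structure code plus a set of fixed-point weights to one flat bit string,
# which standard genetic operators then manipulate.

ROWS, COLS = 2, 3          # assumed size of the 2-D block array
STRUCT_BITS = 2            # bits selecting a block's internal configuration
WEIGHT_BITS = 8            # bits per fixed-point weight
WEIGHTS_PER_BLOCK = 4      # assumed number of weights per block

BLOCK_LEN = STRUCT_BITS + WEIGHTS_PER_BLOCK * WEIGHT_BITS
GENOME_LEN = ROWS * COLS * BLOCK_LEN


def random_genome():
    """Random bit string encoding structure and weights of every block."""
    return [random.randint(0, 1) for _ in range(GENOME_LEN)]


def crossover(a, b):
    """One-point crossover aligned on block boundaries."""
    point = random.randrange(1, ROWS * COLS) * BLOCK_LEN
    return a[:point] + b[point:], b[:point] + a[point:]


def mutate(genome, rate=0.01):
    """Independent bit-flip mutation."""
    return [bit ^ 1 if random.random() < rate else bit for bit in genome]


def decode_block(genome, r, c):
    """Extract one block's structure code and real-valued weights."""
    start = (r * COLS + c) * BLOCK_LEN
    struct_code = int("".join(map(str, genome[start:start + STRUCT_BITS])), 2)
    weights = []
    for k in range(WEIGHTS_PER_BLOCK):
        w0 = start + STRUCT_BITS + k * WEIGHT_BITS
        raw = int("".join(map(str, genome[w0:w0 + WEIGHT_BITS])), 2)
        weights.append(raw / (2 ** (WEIGHT_BITS - 1)) - 1.0)  # map to [-1, 1)
    return struct_code, weights


if __name__ == "__main__":
    parent1, parent2 = random_genome(), random_genome()
    child, _ = crossover(parent1, parent2)
    child = mutate(child)
    print(decode_block(child, 0, 0))
```

In a full evolutionary loop, each genome would be decoded into a hardware configuration, evaluated on the training data to obtain a fitness value, and selected for reproduction; the paper's contribution concerns reducing the memory access incurred during that evaluation, which this sketch does not model.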

Keywords

FPGA · Evolvable hardware · Genetic Algorithm · Block-Based Neural Network


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. Faculty of Engineering, Division of Intelligent Systems Engineering, Yokohama National University, Hodogaya, Yokohama, Japan
  2. Mentor Graphics Japan Co., Ltd., Tokyo, Japan
