
Block-Based Neural Network High Speed Optimization

  • Conference paper
Proceedings of the 23rd Asia Pacific Symposium on Intelligent and Evolutionary Systems (IES 2019)

Part of the book series: Proceedings in Adaptation, Learning and Optimization (PALO, volume 12)


Abstract

The Block-Based Neural Network (BBNN) consists of a 2-D array of memory-based modular component neural networks with flexible structures and internal configurations, and it can be implemented in reconfigurable hardware such as a field-programmable gate array (FPGA). The network structure and the weights are encoded in bit strings and globally optimized using genetic operators. The asynchronous BBNN (ABBNN), a new BBNN model, achieves higher performance than the conventional BBNN by exploiting parallel computation and a pipelined architecture. The operating frequency of an ABBNN remains stable at every network scale, whereas that of a conventional BBNN decreases as the network grows. The ABBNN architecture therefore provides the capability to process and analyze high-sample-rate data simultaneously. However, optimization by a genetic algorithm is a costly task, and memory access is one of the factors that degrade training performance. In this paper, we introduce a new algorithm that reduces memory access during BBNN optimization. An ABBNN optimized with the proposed evolutionary algorithm is applied to general classification problems to verify its effectiveness with regard to the reduction of memory access.
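As a rough sketch of the optimization scheme the abstract describes, the following minimal Python example evolves a bit string that concatenates, for each block in the 2-D grid, a few structure bits and fixed-point weight bits, and improves the population with genetic operators (selection, single-point crossover, bit-flip mutation). The grid size, bit widths, fitness placeholder, and every identifier below are illustrative assumptions, not the authors' implementation.

import random

# Illustrative parameters (assumptions, not taken from the paper).
GRID_ROWS, GRID_COLS = 2, 2   # 2-D array of blocks
STRUCT_BITS = 2               # bits selecting a block's input/output configuration
WEIGHTS_PER_BLOCK = 4         # weights stored in each block
WEIGHT_BITS = 8               # fixed-point resolution per weight

BLOCK_BITS = STRUCT_BITS + WEIGHTS_PER_BLOCK * WEIGHT_BITS
GENOME_LEN = GRID_ROWS * GRID_COLS * BLOCK_BITS

def random_genome():
    return [random.randint(0, 1) for _ in range(GENOME_LEN)]

def fitness(genome):
    # Placeholder: a real fitness function would decode the genome into a
    # BBNN, evaluate it on training data, and return e.g. negative error.
    return -sum(genome)

def crossover(a, b):
    # Single-point crossover on the bit string.
    point = random.randrange(1, GENOME_LEN)
    return a[:point] + b[point:]

def mutate(genome, rate=0.01):
    # Flip each bit with a small probability.
    return [bit ^ 1 if random.random() < rate else bit for bit in genome]

def evolve(pop_size=20, generations=50):
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        elite = population[: pop_size // 2]   # keep the better half
        children = [mutate(crossover(*random.sample(elite, 2)))
                    for _ in range(pop_size - len(elite))]
        population = elite + children
    return max(population, key=fitness)

best = evolve()

In a hardware realization, every fitness evaluation writes the decoded structure and weights into block memories and reads them back during inference; that per-evaluation memory traffic is the cost the paper's proposed algorithm aims to reduce.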



Author information


Corresponding author

Correspondence to Kundo Lee.



Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Lee, K., Hamagami, T. (2020). Block-Based Neural Network High Speed Optimization. In: Sato, H., Iwanaga, S., Ishii, A. (eds) Proceedings of the 23rd Asia Pacific Symposium on Intelligent and Evolutionary Systems. IES 2019. Proceedings in Adaptation, Learning and Optimization, vol 12. Springer, Cham. https://doi.org/10.1007/978-3-030-37442-6_8
