The Journal of Supercomputing, Volume 74, Issue 7, pp 3211–3235

Balancing the learning ability and memory demand of a perceptron-based dynamically trainable neural network

  • Edward Richter
  • Spencer Valancius
  • Josiah McClanahan
  • John Mixter
  • Ali Akoglu


Artificial neural networks (ANNs) have become a popular means of solving complex problems in prediction-based applications such as image and natural language processing. Two challenges prominent in the neural network domain are the practicality of hardware implementation and dynamically training the network. In this study, we address these challenges with a development methodology that balances the hardware footprint and the quality of the ANN. We use the well-known perceptron-based branch prediction problem as a case study for demonstrating this methodology. This problem is well suited for analyzing dynamic hardware implementations of ANNs because the predictor is realized in hardware and trained dynamically. Using our hierarchical configuration search space exploration, we show that we can decrease the memory footprint of a standard perceptron-based branch predictor by 2.3× with only a 0.6% decrease in prediction accuracy.
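The perceptron predictor used as the case study can be illustrated with a minimal sketch in the style of Jimenez and Lin's HPCA 2001 design: each branch indexes a table of small integer weight vectors, the prediction is the sign of a dot product with the global history register, and training adjusts weights only on a misprediction or when confidence is below a threshold. The table size, history length, and threshold below are illustrative choices, not the tuned values from the paper.

```python
# Sketch of a perceptron branch predictor (Jimenez-Lin style).
# HISTORY_LEN, NUM_PERCEPTRONS, and THETA are illustrative parameters.

HISTORY_LEN = 12                      # bits of global branch history
NUM_PERCEPTRONS = 64                  # rows in the weight table
THETA = int(1.93 * HISTORY_LEN + 14)  # training threshold heuristic

# One bias weight plus one weight per history bit, per table row.
weights = [[0] * (HISTORY_LEN + 1) for _ in range(NUM_PERCEPTRONS)]
history = [1] * HISTORY_LEN           # +1 = taken, -1 = not taken

def predict(pc):
    """Return (dot product, predicted-taken) for the branch at pc."""
    w = weights[pc % NUM_PERCEPTRONS]
    y = w[0] + sum(wi * hi for wi, hi in zip(w[1:], history))
    return y, y >= 0                  # predict taken if non-negative

def train(pc, y, taken):
    """Update weights after the branch outcome is known."""
    w = weights[pc % NUM_PERCEPTRONS]
    t = 1 if taken else -1
    # Train only on a misprediction or low-confidence output.
    if (y >= 0) != taken or abs(y) <= THETA:
        w[0] += t
        for i, hi in enumerate(history):
            w[i + 1] += t * hi
    history.pop(0)                    # shift the outcome into history
    history.append(t)
```

The memory footprint the paper optimizes corresponds directly to the `weights` table here: its size is (history length + 1) × number of rows × bits per weight, which is why shrinking history length or weight precision trades accuracy for storage.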


Keywords: Artificial neural network, Branch prediction, Perceptron, SimpleScalar



Research reported in this publication was supported in part by Raytheon Missile Systems under contract 2017-UNI-0008. The content is solely the responsibility of the authors and does not necessarily represent the official views of Raytheon Missile Systems.



Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  • Edward Richter (1)
  • Spencer Valancius (1)
  • Josiah McClanahan (1)
  • John Mixter (1)
  • Ali Akoglu (1)

  1. Department of Electrical and Computer Engineering, University of Arizona, Tucson, USA
