Abstract
Neural networks (NNs) are increasingly used in recognition, mining, and autonomous driving. However, for safety-critical applications such as autonomous driving, the reliability of NNs remains largely unexplored. Fortunately, NNs have inherent fault-tolerance capability; in particular, different neurons exhibit different degrees of fault tolerance. Applying a uniform error-protection mechanism while ignoring this important feature therefore leads to unnecessary energy and performance overheads. In this paper, we propose a neuron vulnerability factor (NVF) that quantifies a neural network's vulnerability to soft errors and can provide good guidance for error-tolerant techniques in NNs. Based on the NVF, we propose a computation scheduling scheme that reduces the lifetime of neurons with high NVF. Experimental results show that our proposed scheme improves the accuracy of the neural network by 12% on average and greatly reduces the fault-tolerance overhead.
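The scheduling idea in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the Neuron class, the NVF values, and the schedule_by_nvf ordering heuristic below are all hypothetical, assuming only that a per-neuron NVF score is available and that computing a high-NVF neuron later shortens the window in which its output can be corrupted by a soft error.

    from dataclasses import dataclass

    @dataclass
    class Neuron:
        name: str
        nvf: float  # hypothetical per-neuron vulnerability score in [0, 1]

    def schedule_by_nvf(neurons):
        # Evaluate low-NVF neurons first, so high-NVF results are
        # produced last and spend the least time resident in
        # error-prone on-chip buffers before they are consumed.
        return sorted(neurons, key=lambda n: n.nvf)

    layer = [Neuron("n0", 0.82), Neuron("n1", 0.15), Neuron("n2", 0.47)]
    for n in schedule_by_nvf(layer):
        print(n.name, n.nvf)  # n1, n2, n0: most vulnerable computed last

In this sketch the only design choice is the sort key: ordering by ascending NVF minimizes the residency time of the most vulnerable values, which is one plausible reading of "reducing the lifetime of neurons with high NVF".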
Change history
16 July 2022
In the version of this paper that was originally published, the affiliation of the third author, Xin Fu, was incorrect. This has now been corrected.
Acknowledgment
This work was supported by the National Natural Science Foundation of China (NSFC) under grant 61772350, the Common Information System Equipment Pre-research Funds (Open Project, JZX2017-0988/Y300), the Construction Plan of Beijing High-level Teacher Team (CIT&TCD201704082, CIT&TCD20170322), and the Open Project of the State Key Laboratory of Computer Architecture (CARCH201607). The work was also supported by the Capacity Building for Sci-Tech Innovation Fundamental Scientific Research Funds (025185305000), the Beijing Nova Program (Z181100006218093), and the Research Fund from Beijing Innovation Center for Future Chips (KYJJ2018008).
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Li, K., Wang, J., Fu, X., Sui, X., Zhang, W. (2020). Reliability Enhancement of Neural Networks via Neuron-Level Vulnerability Quantization. In: Wen, S., Zomaya, A., Yang, L.T. (eds) Algorithms and Architectures for Parallel Processing. ICA3PP 2019. Lecture Notes in Computer Science, vol 11945. Springer, Cham. https://doi.org/10.1007/978-3-030-38961-1_24
Print ISBN: 978-3-030-38960-4
Online ISBN: 978-3-030-38961-1