The Performance of the Stochastic DNN-kWTA Network

  • Conference paper
Neural Information Processing (ICONIP 2014)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 8834)

Abstract

Recently, the dual neural network (DNN) model has been used to synthesize the k-winners-take-all (kWTA) process. The advantage of this DNN-kWTA model is its very simple structure: it contains only 2n + 1 connections. The convergence behavior of the DNN-kWTA model under the noise condition has also been reported. However, there is no analytic expression for the equilibrium point, and hence it is difficult to study how the noise condition affects the model performance. Based on the energy function of the model, this paper proposes an efficient method to study the performance of the DNN-kWTA model under the noise condition.
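
To make the setting concrete, the Python sketch below simulates a generic single-state-variable kWTA network of the kind the abstract refers to, under additive Gaussian input noise. It is illustrative only: the function name kwta_sketch, the Euler step dt/eps, the Heaviside activation, and the zero-mean Gaussian noise model are assumptions for exposition, not the paper's exact formulation or its energy-function method.

import numpy as np

def kwta_sketch(inputs, k, noise_std=0.0, eps=1e-2, dt=1e-4, steps=5000, rng=None):
    # Minimal illustrative simulation (not the paper's exact model): a single
    # scalar state y is driven until exactly k of the n outputs are active.
    # Zero-mean Gaussian noise added to the inputs stands in for the noise condition.
    rng = np.random.default_rng() if rng is None else rng
    u = np.asarray(inputs, dtype=float)
    if noise_std > 0.0:
        u = u + rng.normal(0.0, noise_std, size=u.shape)  # assumed input-noise model
    y = 0.0
    for _ in range(steps):
        x = (u - y > 0.0).astype(float)   # Heaviside outputs x_i = h(u_i - y)
        y += (dt / eps) * (x.sum() - k)   # raise y if too many winners, lower it if too few
    return (u - y > 0.0).astype(int)

# Example: select the 2 largest of 5 inputs, with and without input noise.
u = [0.1, 0.9, 0.4, 0.8, 0.3]
print(kwta_sketch(u, k=2))                  # expected winners: indices 1 and 3
print(kwta_sketch(u, k=2, noise_std=0.05))  # noise may flip decisions near the k-th boundary

A brute-force performance study would repeat such noisy simulations many times; the abstract's claim is that an energy-function-based method can assess the noisy model's behaviour more efficiently than that.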

Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Feng, R., Leung, CS., Ng, KT., Sum, J. (2014). The Performance of the Stochastic DNN-kWTA Network. In: Loo, C.K., Yap, K.S., Wong, K.W., Teoh, A., Huang, K. (eds) Neural Information Processing. ICONIP 2014. Lecture Notes in Computer Science, vol 8834. Springer, Cham. https://doi.org/10.1007/978-3-319-12637-1_35

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-12637-1_35

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-12636-4

  • Online ISBN: 978-3-319-12637-1

  • eBook Packages: Computer Science (R0)
