Abstract
A novel method for constructing optimal feedforward neural networks is proposed. We first define a 0-1 covering matrix and formulate an integer programming method for the minimum sphere set covering problem. We then define an extended covering matrix with a smooth function and relax the objective and constraints to obtain a more general linear programming formulation of the minimum sphere set covering problem, which we call Linear Programming Minimum Sphere set Covering (LPMSC). We then apply LPMSC to neural network construction. With specific smooth functions, optimal neural networks can be obtained via LPMSC without prior knowledge, making the procedure objective and automatic, much like Support Vector Machines (SVM). Finally, we evaluate the proposed method against SVM on UCI benchmark datasets, with the parameters of both methods determined by 5-fold cross-validation. The results show that our method needs fewer neurons than SVM while retaining comparable or even superior performance on the datasets studied.
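The core idea above can be illustrated with a minimal sketch. This is not the authors' implementation: the data, the common sphere radius `r`, and the use of `scipy.optimize.linprog` are all assumptions made purely for illustration. It builds the 0-1 covering matrix (entry `C[i, j] = 1` when point `i` lies inside the candidate sphere centred at point `j`) and solves the LP relaxation of the minimum sphere set covering problem, `min Σ x_j` subject to `C x ≥ 1`, `0 ≤ x ≤ 1`; the spheres with nonzero weight would correspond to retained neurons.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))  # toy data set (hypothetical)
r = 1.0                       # assumed common sphere radius

# 0-1 covering matrix: C[i, j] = 1 iff point i is covered by the
# candidate sphere of radius r centred at point j.
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
C = (D <= r).astype(float)

# LP relaxation of minimum sphere set covering:
#   minimise  sum_j x_j
#   subject to  C @ x >= 1  (every point covered),  0 <= x <= 1.
n = X.shape[0]
res = linprog(c=np.ones(n),
              A_ub=-C, b_ub=-np.ones(n),      # C @ x >= 1 as -C @ x <= -1
              bounds=[(0.0, 1.0)] * n,
              method="highs")

# Spheres with nonzero weight: the candidate centres kept as neurons.
selected = np.flatnonzero(res.x > 1e-6)
print(len(selected), "of", n, "candidate spheres retained")
```

Because every point covers itself (distance zero), the LP is always feasible; the integer programming version of the problem replaces the box constraint with `x_j ∈ {0, 1}`.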
References
Mao, K.Z., Huang, G.-B.: Neuron Selection for RBF Neural Network Classifier Based on Data Structure Preserving Criterion. IEEE Trans. Neural Networks 16(6), 1531–1540 (2005)
Vonk, E., Jain, L.C., Johnson, R.P.: Automatic Generation of Neural Network Architecture Using Evolutionary Computation. World Scientific Publishing, River Edge, NJ (1997)
Rutkowski, L.: Adaptive Probabilistic Neural Networks for Pattern Classification in Time-varying Environment. IEEE Trans. Neural Networks 15(4), 811–827 (2004)
Huang, G.-B., Saratchandran, P., Sundararajan, N.: A Generalized Growing and Pruning RBF (GGAP-RBF) Neural Network for Function Approximation. IEEE Trans. Neural Networks 16(1), 57–67 (2005)
Girosi, F., Jones, M., Poggio, T.: Regularization Theory and Neural Networks Architectures. Neural Computation 7(2), 219–269 (1995)
Tsang, I.W., Kwok, J.T., Cheung, P.-M.: Core Vector Machines: Fast SVM Training on Very Large Data Sets. Journal of Machine Learning Research 6, 363–392 (2005)
Marchand, M., Shawe-Taylor, J.: The Set Covering Machine. Journal of Machine Learning Research 3, 723–746 (2002)
Floyd, S., Warmuth, M.: Sample Compression, Learnability, and the Vapnik-Chervonenkis Dimension. Machine Learning 21, 269–304 (1995)
Shawe-Taylor, J., Bartlett, P.L., Williamson, R.C., Anthony, M.: Structural Risk Minimization over Data-dependent Hierarchies. IEEE Trans. Information Theory 44, 1926–1940 (1998)
Hussain, Z., Szedmak, S., Shawe-Taylor, J.: The Linear Programming Set Covering Machine. In: PASCAL 2004 (2004), http://eprints.pascal-network.org/archive/00001210/01/lp_scm.pdf
Newman, D.J., Hettich, S., Blake, C.L., Merz, C.J.: UCI Repository of Machine Learning Databases (1998), http://www.ics.uci.edu/~mlearn/MLRepository.html
Wei, X.-K., Löfberg, J., Feng, Y., Li, Y.-H., Li, Y.-F.: Enclosing Machine Learning for Class Description. In: Liu, D., et al. (eds.) ISNN 2007. LNCS, vol. 4491, pp. 428–437. Springer, Heidelberg (2007)
Copyright information
© 2007 Springer Berlin Heidelberg
Cite this paper
Wei, X.-K., Li, Y.-H., Li, Y.-F. (2007). Optimum Neural Network Construction Via Linear Programming Minimum Sphere Set Covering. In: Alhajj, R., Gao, H., Li, J., Li, X., Zaïane, O.R. (eds.) Advanced Data Mining and Applications. ADMA 2007. Lecture Notes in Computer Science, vol. 4632. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-73871-8_39
DOI: https://doi.org/10.1007/978-3-540-73871-8_39
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-73870-1
Online ISBN: 978-3-540-73871-8
eBook Packages: Computer Science (R0)