Abstract
Instrumental observation of a natural phenomenon represents a transmission of information which is generally subject to random disturbances. In this article the scattering of empirical data provided by observation of a certain state is described by a probability distribution function. This description is further applied to the estimation of the probability distribution of a compound phenomenon which must be characterized by several possible states. The uncertainty of observation is commonly described by the information entropy. It is shown that the empirical information $I_e$, defined as the difference between the entropies of the compound phenomenon and a single state, characterizes the complexity of the observed phenomenon. With an increasing number of observations $N$, the value of $I_e$ increases more slowly than $\log N$ and converges to a fixed value $I_\infty$. A number of empirical samples sufficient to represent the phenomenon can therefore be estimated as $K_r = \exp I_\infty$, and the redundancy of the empirical observations is defined by the excess complexity $R = \log N - I_\infty$. When the phenomenon is modeled by a radial basis function neural network (NN), a proper number of neurons is given by the parameter $K_r$. An optimal NN structure can be obtained by minimizing an objective function composed of the redundancy of the model and the information divergence between the representative and the empirical distribution.
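The saturation of $I_e$ can be made concrete with a short numerical sketch. The Python fragment below is not from the paper; it is a minimal reading of the abstract, assuming a 1-D Parzen estimator with Gaussian kernels of width $\sigma$ as the model of data scattering, and a Monte-Carlo estimate of the compound entropy. The function names (`log_parzen`, `empirical_information`) are illustrative, not the paper's.

```python
import numpy as np

SQRT2PI = np.sqrt(2.0 * np.pi)

def log_parzen(z, centers, sigma):
    # Log-density of an equal-weight 1-D Gaussian mixture:
    # f(z) = (1/K) * sum_k g(z - c_k), evaluated via log-sum-exp for stability.
    d = (z[:, None] - centers[None, :]) / sigma
    lk = -0.5 * d**2 - np.log(sigma * SQRT2PI)
    m = lk.max(axis=1)
    return m + np.log(np.exp(lk - m[:, None]).sum(axis=1)) - np.log(centers.size)

def empirical_information(x, sigma, n_mc=5000, rng=None):
    # I_e = H(compound) - H(single state).  The single-state entropy is that
    # of one Gaussian kernel, 0.5*log(2*pi*e*sigma^2); the compound entropy is
    # estimated by Monte Carlo, drawing points from the Parzen mixture itself.
    rng = np.random.default_rng() if rng is None else rng
    z = rng.choice(x, size=n_mc) + sigma * rng.standard_normal(n_mc)
    h_compound = -log_parzen(z, x, sigma).mean()
    h_single = 0.5 * np.log(2.0 * np.pi * np.e * sigma**2)
    return h_compound - h_single

rng = np.random.default_rng(0)
states = np.array([-4.0, 0.0, 4.0])        # three well-separated states
for n in (10, 100, 1000):
    x = rng.choice(states, size=n) + 0.1 * rng.standard_normal(n)
    ie = empirical_information(x, sigma=0.3, rng=rng)
    print(f"N={n:5d}  I_e={ie:.2f}  exp(I_e)={np.exp(ie):.1f}  "
          f"R = log N - I_e = {np.log(n) - ie:.2f}")
```

As $N$ grows, $\log N$ keeps increasing but $I_e$ levels off near $\log 3$ (up to kernel-width effects), so $\exp I_e$ approaches $K_r \approx 3$, the number of distinguishable states, while the redundancy $R$ absorbs the rest. This is the saturation behaviour the abstract describes.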
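The final sentence of the abstract suggests a model-selection criterion. The sketch below is one plausible reading, not the paper's algorithm: the cost of a $K$-kernel representative model is taken as its excess complexity $\log K - I_\infty$ plus a Monte-Carlo estimate of the divergence $D(f_{\mathrm{emp}} \| f_{\mathrm{model}})$, with quantile-based centers standing in for whatever training procedure the paper actually uses.

```python
def model_cost(x, centers, sigma, i_inf, n_mc=5000, rng=None):
    # Objective = model redundancy + information divergence.  The divergence
    # D(f_emp || f_model) is estimated by Monte Carlo with points drawn from
    # the empirical Parzen density f_emp; "redundancy" here is our reading of
    # the excess complexity of a K-kernel model, log K - I_inf.
    rng = np.random.default_rng() if rng is None else rng
    z = rng.choice(x, size=n_mc) + sigma * rng.standard_normal(n_mc)
    div = (log_parzen(z, x, sigma) - log_parzen(z, centers, sigma)).mean()
    return np.log(centers.size) - i_inf + div

x = rng.choice(states, size=1000) + 0.1 * rng.standard_normal(1000)
i_inf = empirical_information(x, sigma=0.3, rng=rng)  # saturated I_e as a proxy for I_inf
for k in (1, 2, 3, 5, 10):
    centers = np.quantile(x, (np.arange(k) + 1) / (k + 1))  # crude stand-in for trained RBF centers
    print(f"K={k:2d}  cost={model_cost(x, centers, sigma=0.3, i_inf=i_inf, rng=rng):.2f}")
```

Under these assumptions the cost is minimized near $K = K_r$: fewer kernels incur a large divergence, more kernels incur redundancy without reducing the divergence.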
© 1999 Springer-Verlag Wien
Cite this paper
Grabec, I. (1999). Adaptation of NN Complexity to Empirical Information. In: Artificial Neural Nets and Genetic Algorithms. Springer, Vienna. https://doi.org/10.1007/978-3-7091-6384-9_27