A Universal Approximator Network for Predicting Conditional Probability Densities

  • Dirk Husmeier
Part of the Perspectives in Neural Computing book series (PERSPECT.NEURAL)


The structure of a universal approximator network for predicting conditional probability densities is derived, and it is shown that the resulting architecture can deal with both stochastic and deterministic processes. Two variants, the derivative-of-sigmoid mixture (DSM) and the Gaussian mixture (GM) networks, are presented, and their relation to a stochastic kernel expansion is noted. The chapter concludes with a comparison between these models and several relevant alternative approaches which have recently been introduced to the neural network community.
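As a rough illustration of the GM variant described above, the sketch below shows a network whose outputs parameterise a Gaussian mixture over the target, so that the conditional density p(y|x) is a weighted sum of Gaussian kernels. This is a minimal assumption-laden sketch, not the chapter's derivation: the one-hidden-layer architecture, the softmax/exponential output transformations, and the random (untrained) weights are all illustrative choices.

```python
import numpy as np

def gm_network(x, params):
    """Map an input x to the parameters of a Gaussian mixture over the
    target y. Hypothetical one-hidden-layer architecture; the weights
    below are random placeholders, not a trained model."""
    W1, b1, W2, b2 = params
    h = np.tanh(W1 @ x + b1)                 # hidden layer
    out = W2 @ h + b2                        # raw outputs: 3 values per kernel
    logits, mu, log_sigma = np.split(out, 3)
    alpha = np.exp(logits - logits.max())
    alpha /= alpha.sum()                     # mixture weights via softmax
    sigma = np.exp(log_sigma)                # exponential keeps widths positive
    return alpha, mu, sigma

def conditional_density(y, alpha, mu, sigma):
    """Evaluate p(y|x) as a weighted sum of Gaussian kernels."""
    kernels = np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return float(alpha @ kernels)

# Random weights for a 2-input, 8-hidden-unit, 3-kernel network (illustrative only).
rng = np.random.default_rng(0)
n_in, n_hidden, n_kernels = 2, 8, 3
params = (rng.normal(size=(n_hidden, n_in)), rng.normal(size=n_hidden),
          rng.normal(size=(3 * n_kernels, n_hidden)), rng.normal(size=3 * n_kernels))

alpha, mu, sigma = gm_network(np.array([0.5, -1.0]), params)
p = conditional_density(0.0, alpha, mu, sigma)
```

Because each kernel integrates to one and the weights sum to one, the predicted conditional density is automatically normalised, regardless of the network weights.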


Keywords: Hidden Layer · Gaussian Mixture Model · Expectation Maximisation Algorithm · Conditional Density · Moment Generating Function
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.


Copyright information

© Springer-Verlag London Limited 1999

Authors and Affiliations

  • Dirk Husmeier
    Neural Systems Group, Department of Electrical & Electronic Engineering, Imperial College, London, UK