A Universal Approximator Network for Predicting Conditional Probability Densities
The structure of a universal approximator network for predicting conditional probability densities is derived, and it is shown that the resulting architecture can deal with both stochastic and deterministic processes. Two variants, the derivative-of-sigmoid mixture (DSM) and the Gaussian mixture (GM) networks, are presented, and their relation to a stochastic kernel expansion is noted. The chapter concludes with a comparison between these models and several relevant alternative approaches that have recently been introduced to the neural network community.
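The GM variant described above outputs the parameters of a Gaussian mixture conditioned on the network input. As a minimal illustration (not the chapter's own implementation), the sketch below maps an input x through a sigmoid hidden layer to mixing coefficients, means, and widths of a one-dimensional Gaussian mixture; the class name `GMNetwork`, the layer sizes, and the random initialisation are all assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)


def gm_density(y, weights, means, sigmas):
    """p(y) = sum_k w_k * N(y; mu_k, sigma_k) for a 1-D Gaussian mixture."""
    comps = np.exp(-0.5 * ((y - means) / sigmas) ** 2) / (sigmas * np.sqrt(2.0 * np.pi))
    return float(weights @ comps)


class GMNetwork:
    """Toy Gaussian-mixture network: x -> sigmoid hidden layer -> (w, mu, sigma).

    Hypothetical sketch; sizes and initialisation are illustrative assumptions.
    """

    def __init__(self, n_hidden=8, n_components=3):
        self.W1 = rng.normal(scale=0.5, size=(n_hidden, 1))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(scale=0.5, size=(3 * n_components, n_hidden))
        self.b2 = np.zeros(3 * n_components)

    def conditional_params(self, x):
        # Sigmoid hidden layer, then a linear readout of all mixture parameters.
        h = 1.0 / (1.0 + np.exp(-(self.W1 @ np.atleast_1d(x) + self.b1)))
        out = self.W2 @ h + self.b2
        logits, means, log_sig = np.split(out, 3)
        weights = np.exp(logits - logits.max())
        weights /= weights.sum()      # softmax: valid mixing coefficients
        sigmas = np.exp(log_sig)      # exponential keeps the widths positive
        return weights, means, sigmas

    def density(self, y, x):
        """Predicted conditional density p(y | x)."""
        return gm_density(y, *self.conditional_params(x))
```

Because the mixing coefficients sum to one and each component is a normalised Gaussian, the predicted p(y | x) is a valid density for any input, which is what lets such a network represent both sharply peaked (near-deterministic) and broad (stochastic) conditional distributions.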
Keywords: Hidden Layer; Gaussian Mixture Model; Expectation Maximisation Algorithm; Conditional Density; Moment Generating Function