Abstract
Chapters 4 and 5 discussed the use of decision forests in supervised tasks, i.e. when labeled training data are available. In contrast, this chapter discusses the use of forests in unlabeled scenarios. For instance, one important task is discovering the intrinsic nature and structure of large sets of unlabeled data. This task can be tackled with another probabilistic model, the density forest. Density forests are explained here as an instantiation of our abstract decision forest model, described in Chap. 3.
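To make the idea concrete, here is a minimal sketch of a density forest in the spirit described above: each tree recursively partitions the unlabeled data with randomized axis-aligned splits chosen to maximize an unsupervised, Gaussian-entropy-based information gain, each leaf stores a Gaussian fitted to the points reaching it, and the forest density is the average of the tree densities. All function names and parameters here are illustrative assumptions, not the chapter's actual algorithm; in particular, this sketch ignores the partition-function correction, so the resulting mixture is only approximately normalized.

```python
import numpy as np

def log_det_cov(X):
    """Log-determinant of the regularized sample covariance of X."""
    cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])
    return np.linalg.slogdet(cov)[1]

def make_leaf(X):
    """Leaf node: a Gaussian fitted to the points reaching this node."""
    cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])
    return {"leaf": True, "mean": X.mean(axis=0), "cov": cov, "n": len(X)}

def build_tree(X, rng, depth=0, max_depth=3, min_leaf=20, n_trials=20):
    """Randomized axis-aligned splits, keeping the candidate with the
    largest unsupervised (Gaussian differential-entropy) gain."""
    if depth == max_depth or len(X) < 2 * min_leaf:
        return make_leaf(X)
    best = None
    for _ in range(n_trials):
        d = rng.integers(X.shape[1])                 # one random feature
        t = rng.uniform(X[:, d].min(), X[:, d].max())
        mask = X[:, d] < t
        nl, nr = int(mask.sum()), int((~mask).sum())
        if nl < min_leaf or nr < min_leaf:
            continue
        # information gain up to constants: parent entropy minus the
        # size-weighted entropies of the two children
        gain = log_det_cov(X) - (nl * log_det_cov(X[mask])
                                 + nr * log_det_cov(X[~mask])) / len(X)
        if best is None or gain > best[0]:
            best = (gain, d, t, mask)
    if best is None:
        return make_leaf(X)
    _, d, t, mask = best
    return {"leaf": False, "dim": d, "thr": t,
            "left": build_tree(X[mask], rng, depth + 1, max_depth, min_leaf, n_trials),
            "right": build_tree(X[~mask], rng, depth + 1, max_depth, min_leaf, n_trials)}

def gauss_pdf(x, mean, cov):
    """Multivariate Gaussian density at x."""
    d = len(mean)
    diff = x - mean
    _, logdet = np.linalg.slogdet(cov)
    quad = diff @ np.linalg.solve(cov, diff)
    return np.exp(-0.5 * (d * np.log(2 * np.pi) + logdet + quad))

def tree_density(node, x, n_total):
    """Route x to its leaf; weight the leaf Gaussian by the fraction of
    training points that reached that leaf."""
    while not node["leaf"]:
        node = node["left"] if x[node["dim"]] < node["thr"] else node["right"]
    return (node["n"] / n_total) * gauss_pdf(x, node["mean"], node["cov"])

def forest_density(trees, x, n_total):
    """Forest estimate: average of the individual tree densities."""
    return float(np.mean([tree_density(t, x, n_total) for t in trees]))
```

A quick sanity check of the sketch: fit a small forest to samples from a standard 2-D normal and confirm the estimated density is higher near the mode than far in the tail.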
Notes
1. Better alternatives, perhaps incorporating priors, may be employed here.
2. A split function is usually applied only to a small, selected subset of features ϕ(v), and thus can be computed efficiently, i.e. B is very small.
3. No use is made of the ground truth density in this stage.
4. See “consistent estimator” in Wikipedia for a definition of consistency.
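The point in note 2 above is that a split node's cost depends on the size B of the selected feature subset ϕ(v), not on the full dimensionality of v. A hypothetical weak learner illustrating this (the names, indices, and the linear test are assumptions for illustration, not the chapter's definitions):

```python
import numpy as np

def split(v, feature_idx, weights, threshold):
    """Generic linear split test. Only the B entries of v selected by
    feature_idx are read, so the cost is O(B) regardless of len(v)."""
    phi = v[feature_idx]               # phi(v): the small selected subset
    return bool(phi @ weights < threshold)

# A long feature vector; the split below touches only 2 of its 10,000 entries.
v = np.arange(10_000.0)
goes_left = split(v, np.array([3, 17]), np.array([1.0, -0.5]), 0.0)
```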
Copyright information
© 2013 Springer-Verlag London
Cite this chapter
Criminisi, A., Shotton, J. (2013). Density Forests. In: Criminisi, A., Shotton, J. (eds) Decision Forests for Computer Vision and Medical Image Analysis. Advances in Computer Vision and Pattern Recognition. Springer, London. https://doi.org/10.1007/978-1-4471-4929-3_6
DOI: https://doi.org/10.1007/978-1-4471-4929-3_6
Publisher Name: Springer, London
Print ISBN: 978-1-4471-4928-6
Online ISBN: 978-1-4471-4929-3