
Metamodelle


Part of the book series: VDI-Buch ((VDI-BUCH))

Abstract

The direct analysis and optimization of complex systems is difficult and time-consuming, so it is increasingly carried out with the support of metamodels. Metamodels represent the system under investigation mathematically on the basis of measurement or simulation data and can predict system responses within milliseconds or seconds. Using linear models or polynomials with a rigid upper bound on the complexity to be captured, or a fixed basic form for the relationships, is not expedient: precisely when the system to be modeled is both unknown and highly complex, this leads to false conclusions. A wide range of modeling techniques has therefore been developed that represent complex systems on the basis of space-filling measurement or simulation data without fixed assumptions about the underlying relationships. This chapter presents the foundations and algorithms of various techniques as well as methods for quality control. It starts with long-established methods such as regression and splines, proceeds to widespread techniques such as Kriging, radial basis functions, and artificial neural networks, and finally covers support vector regression and Gaussian process models. The presentation of the algorithms enables a targeted selection and reliable use of these methods in commercial software packages, or a first basic implementation of one's own.
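As a minimal sketch of the idea behind such metamodels, the following Python example fits a radial-basis-function interpolant to a handful of "simulation" samples and queries it at a new point. It assumes a Gaussian kernel with a fixed width parameter `eps`; the function names `rbf_fit` and `rbf_predict` are illustrative and not taken from the chapter.

```python
import numpy as np

def rbf_fit(X, y, eps=1.0):
    """Solve for the interpolation weights of a Gaussian RBF metamodel."""
    # Pairwise Euclidean distances between all training points
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    Phi = np.exp(-(eps * d) ** 2)       # Gaussian basis matrix
    return np.linalg.solve(Phi, y)      # weights so the model interpolates y

def rbf_predict(X, w, Xq, eps=1.0):
    """Evaluate the fitted RBF metamodel at query points Xq."""
    d = np.linalg.norm(Xq[:, None, :] - X[None, :, :], axis=-1)
    return np.exp(-(eps * d) ** 2) @ w

# Toy "simulation data": 8 samples of sin(x) on [0, pi]
X = np.linspace(0.0, np.pi, 8).reshape(-1, 1)
y = np.sin(X).ravel()
w = rbf_fit(X, y)
print(rbf_predict(X, w, np.array([[1.3]])))  # close to sin(1.3)
```

Once fitted, evaluating the model is a single matrix-vector product, which is what makes metamodel predictions fast enough to replace the original simulation inside an optimization loop.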



Author information

Correspondence to Karl Siebertz.


Copyright information

© 2017 Springer-Verlag GmbH Deutschland

About this chapter

Cite this chapter

Siebertz, K., Bebber, D.v., Hochkirchen, T. (2017). Metamodelle. In: Statistische Versuchsplanung. VDI-Buch. Springer Vieweg, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-55743-3_9

  • DOI: https://doi.org/10.1007/978-3-662-55743-3_9

  • Publisher Name: Springer Vieweg, Berlin, Heidelberg

  • Print ISBN: 978-3-662-55742-6

  • Online ISBN: 978-3-662-55743-3

  • eBook Packages: Computer Science and Engineering (German Language)
