Part of the book series: Statistics for Industry and Technology (SIT)

Abstract

This chapter deals with the problem of local sensitivity analysis, that is, how sensitive the results of a statistical analysis are to changes in the data. A closed formula for computing local sensitivities in optimization problems is applied to several optimization problems in statistics, including regression, maximum likelihood estimation, and other settings involving ordered and data-constrained parameters. In addition, a general method for evaluating sensitivities for the method of moments is obtained. The methods are illustrated with several examples.
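To make the notion of local sensitivity concrete, the sketch below illustrates the simplest special case mentioned in the abstract: the sensitivity of least-squares regression estimates to a small change in one response value. It does not reproduce the chapter's general closed formula for constrained optimization problems; it only uses the standard unconstrained OLS result that beta_hat = (X'X)^{-1} X' y, so d(beta_hat)/d(y_i) is the i-th column of (X'X)^{-1} X'. The data are synthetic and all variable names are illustrative.

```python
# Minimal sketch: local sensitivity of OLS coefficients to a perturbation of
# one observed response. For OLS, beta_hat = (X'X)^{-1} X' y, so the analytic
# sensitivity d(beta_hat)/d(y_i) is column i of (X'X)^{-1} X'.
import numpy as np

rng = np.random.default_rng(0)
n, p = 30, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])  # design with intercept
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.3, size=n)

# Analytic sensitivities: column i gives d(beta_hat)/d(y_i)
S = np.linalg.solve(X.T @ X, X.T)          # shape (p, n)

# Check against a finite-difference perturbation of observation i
i, eps = 7, 1e-6
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
y_pert = y.copy()
y_pert[i] += eps
beta_pert = np.linalg.lstsq(X, y_pert, rcond=None)[0]
fd = (beta_pert - beta_hat) / eps

print("analytic   :", S[:, i])
print("finite diff:", fd)
```

The analytic column and the finite-difference estimate agree to numerical precision; the chapter's contribution is a closed formula that extends this kind of derivative to general (possibly constrained) optimization-based estimators, which the unconstrained OLS case above does not cover.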

Copyright information

© 2006 Birkhäuser Boston

About this chapter

Cite this chapter

Castillo, E., Castillo, C., Hadi, A.S., Sarabia, J.M. (2006). Some New Methods for Local Sensitivity Analysis in Statistics. In: Balakrishnan, N., Sarabia, J.M., Castillo, E. (eds) Advances in Distribution Theory, Order Statistics, and Inference. Statistics for Industry and Technology. Birkhäuser Boston. https://doi.org/10.1007/0-8176-4487-3_22
