Analogic Inference Design

  • Roel J. Wieringa
Chapter

Abstract

Analogic inference is generalization by similarity. In our schema of inferences (Fig. 15.1), analogic inference follows abductive inference: what we generalize by analogy is not a description of phenomena, nor a statistical model of a population, but an explanation. In Sect. 15.1 we show that analogic inference can be used in both case-based and sample-based research. In Sect. 15.2 we contrast feature-based similarity with architectural similarity and show that architectural similarity provides a sounder basis for generalization. Analogic generalization proceeds by induction over a series of positive and negative cases, a procedure called analytical induction (Sect. 15.3). We discuss the validity of analogic generalizations in Sect. 15.4 and, in Sect. 15.5, extend the concept of generalization to that of a theory of similitude.
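
The analytical induction mentioned above can be pictured as a simple procedure: a candidate explanation is confronted with a series of positive and negative cases, and any negative case that satisfies the candidate conditions forces a revision of the explanation or a redefinition of the phenomenon. The sketch below is a minimal, hypothetical illustration of that loop, not the chapter's own formalization; the function name, the set-of-conditions representation, and the toy case labels are all assumptions made for this example.

```python
def analytical_induction(cases):
    """Induce conditions jointly sufficient for an effect from a case series.

    cases: list of (conditions, effect) pairs, where conditions is a
    frozenset of condition labels observed in the case and effect is a
    bool recording whether the phenomenon occurred.
    """
    positives = [c for c, effect in cases if effect]
    negatives = [c for c, effect in cases if not effect]

    # Candidate explanation: the conditions shared by every positive case.
    candidate = frozenset.intersection(*positives) if positives else frozenset()

    # A negative case that exhibits all candidate conditions falsifies
    # their sufficiency; the researcher must then add a distinguishing
    # condition or redefine the phenomenon under study.
    counterexamples = [n for n in negatives if candidate <= n]
    return candidate, counterexamples


# Toy case series (hypothetical labels, for illustration only).
cases = [
    (frozenset({"co-located team", "code review"}), True),
    (frozenset({"co-located team", "code review", "legacy code"}), True),
    (frozenset({"code review"}), False),
]
candidate, counterexamples = analytical_induction(cases)
# candidate == frozenset({"co-located team", "code review"}) and
# counterexamples == [], so the explanation survives this case series.
```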

Copyright information

© Springer-Verlag Berlin Heidelberg 2014

Authors and Affiliations

  • Roel J. Wieringa
  1. University of Twente, Enschede, The Netherlands
