
Cybernetic Tenets: Philosophical Considerations

Chapter in The Nature of the Machine and the Collapse of Cybernetics

Abstract

The purpose of this chapter is to philosophically dissect the main tenets of cybernetics in order to unveil the metaphysical commitments behind the enterprise at large. It shows how the entire cybernetic ethos was underpinned by a framework of bold conjectures regarding the nature of a machine. Martin Heidegger’s views on cybernetics are acknowledged and criticized. The chapter presents a novel way to pin down the core of cybernetics, employing cybernetics’ own (though largely overlooked) ascription of features to its notion of a machine: (a) as being possibly teleological, (b) as being possibly immaterial, and (c) as embodying the concretization of a particular theory.

Notes

  1.

    The American Society for Cybernetics lists 46 serious definitions of the term. See www.asc-cybernetics.org/foundations/definitions.htm

  2.

    The connection between cybernetics and Heraclitus has been mentioned time and again in academic circles, but almost always at a colloquial level. A rigorous fleshing out of these possible links lies beyond the scope of this work and remains to be done. The philosopher who addressed the possibility of a strong connection between the two was Martin Heidegger, but he suggested that such a connection is too deep for us to understand at the present time. More on this below.

  3.

    Marcovich 1967, p. 171

  4.

    Ibid., p. 206.

  5.

    Ibid., p. 424.

  6.

    Ibid., p. 449.

  7.

    Ibid., p. 268.

  8.

    Ibid., p. 113.

  9.

    Ibid., p. 105.

  10.

    Ibid., p. 36.

  11.

    Ibid., p. 33.

  12.

    The last two quotes might also be read as saying that circular causality has more explanatory power than linear causality, which is, of course, itself intimately related to information.

  13.

    Chapter 6, Section “The Homeostat: A Living Machine”.

  14.

    Marcovich 1967, p. 145.

  15.

    Heidegger and Fink 1993.

  16.

    At present, we reflect on the phenomenon of steering. This phenomenon has today, in the age of cybernetics, become so fundamental that it occupies and determines the whole of natural science and the behavior of humans so that it is necessary for us to gain more clarity about it (Heidegger and Fink 1993, p. 12).

  17.

    Heraclitean fragment cited above as DK 64 (Marcovich 1967, p. 424).

  18.

    I do not want to allow a misunderstanding to arise from my allusion to modern cybernetics in the course of the discussion about what steering is. Misunderstanding would arise if we restricted ourselves to what is said about steering in Frs. 64 and 41, and if we constructed a connection between Heraclitus and cybernetics. This connection between Heraclitus and cybernetics lies much deeper hidden and is not so easy to grasp. It goes in another direction that we could not discuss in the context of our present awareness of Heraclitus (Heidegger and Fink 1993, p. 16).

  19.

    “The deepest intuitions concerning real-life complex systems date back already to Heraclitus”. Hyötyniemi goes on to state three core ideas that in his view are evidently Heraclitean:

    Everything changes, everything remains the same.

    Everything is based on hidden tensions.

    Everything is steered by all other things.

    Although he does not mention this, one could relate the first one to DK 12, the second one to DK 54 & 123, and the third one to DK 41 & 64 (see Hyötyniemi 2006, p. 7).

  20.

    Dupuy 2000, p. 90. It would seem that this is Dupuy’s personal translation of a part of the Spiegel interview I reproduce below.

  21.

    On May 31st, 1976. The English translation appeared in 1981 (Heidegger 1981).

  22.

    Heidegger 1981, p. 59.

  23.

    Cited in MacDonald 2008, p. 177.

  24.

    Heidegger 1999.

  25.

    Ibid., § 67.

  26.

    The word that Heidegger used, Gestell, is rendered as “enframing”, which gives an active sense to the otherwise more passive connotation of “frame”. See Heidegger 1977a, p. 20.

  27.

    Heidegger 1984.

  28.

    Heidegger 1999, § 61.

  29.

    Quoted in François 2007, p. 432.

  30.

    Heidegger 1977b, p. 376. The quote continues:

    [Philosophy] has found its place in the scientific attitude of socially active humanity. But the fundamental characteristic of this scientific attitude is its cybernetic, that is, technological character. The need to ask about modern technology is presumably dying out to the same extent that technology more definitely characterizes and regulates the appearance of the totality of the world and the position of man in it.

  31.

    When algorithms are defined rigorously in Computer Science literature (which only happens rarely), they are generally identified with abstract machines…this does not square with our intuitions about algorithms and the way we interpret and apply results about them…This problem of defining algorithms is mathematically challenging, as it appears that our intuitive notion is quite intricate and its correct, mathematical modeling may be quite abstract. (Moschovakis 2001, p. 919).

  32.

    The recognized abstract nature of an algorithm makes it ineligible for patenting:

    Determining whether the claim falls within one of the four enumerated categories of patentable subject matter recited in 35 U.S.C. 101 (i.e., process, machine, manufacture or composition of matter) does not end the analysis because claims directed to nothing more than abstract ideas (such as mathematical algorithms), natural phenomena, and laws of nature are not eligible for patent protection (United States Patent and Trademark Office 2014, § 2106, II).

    In the same vein, the patenting of software (arguably a conjunction of algorithms) remains controversial, as one can witness in current news regarding the mutual lawsuits between global technology companies (Google, Apple, Samsung, etc.). However, certain uses of an algorithm, on entities that could qualify as “processes”, are patentable. The United States Patent and Trademark Office (USPTO) designates four realms of reality that could receive patent protection:

    i. Process—an act, or a series of acts or steps…(“A process is a mode of treatment of certain materials to produce a given result. It is an act, or a series of acts, performed upon the subject-matter to be transformed and reduced to a different state or thing.”…) (Ibid., § 2106, I).

    Interestingly, the second patentable realm (a machine) is thus defined:

    ii. Machine—a concrete thing, consisting of parts, or of certain devices and combination of devices…This includes every mechanical device or combination of mechanical powers and devices to perform some function and produce a certain effect or result. (Ibid.)

    It would seem that the USPTO, not without reason, has avoided attaching intrinsic materiality to a machine (“a concrete thing”) and has essentially linked mechanicity to an underpinning machine-structure (“this includes every mechanical device or combination of mechanical powers”).

    The other two areas of reality captured by the USPTO for patent protection are iii. Manufacture and iv. Composition of matter (Ibid.).

  33.

    Turing 1939, p. 150.

  34.

    Chapter 4, Sections “The “Foundational Crisis of Mathematics” and The Response from Formalism” & “A Machinal Understanding of an Algorithm and the Material Liberation of the Machine”.

  35.

    Dupuy 2000, ch. 2.

  36.

    Ashby 1956, p. 2.

  37.

    In contradistinction, positive feedback occurs when feedback has a reverberating effect, reinforcing the process toward the goal without correction. An example would be a radio transmitter and receiver that picks up the signal it has already transmitted, processes it again, and broadcasts it more strongly (See Chapter 2, Section “The AA-Predictor”; see also the illustrative sketch following these notes).

  38.

    Taylor 1950a.

  39.

    For example, forgoing the notion of a final cause while retaining the notion of a pattern of behavior that aims toward a goal.

  40.

    Wiener 1950; Taylor 1950b.

  41.

    Rosenblueth and Wiener 1945.

  42.

    Prigogine and Stengers 1984.

  43.

    See Chapter 2, Section “The Pre-meetings and Founding Articles”. Shannon’s Master’s thesis was an application of Boolean algebra to the workings of such a machine. He went on to pursue a relatively quick Ph.D. in mathematics, also at MIT, proposing an algebra applicable to Mendelian genetics. It was right after earning this degree that he began working for Bell Laboratories.

  44.

    Conway and Siegelman 2005, p. 186.

  45.

    Shannon 1948.

  46.

    Although Warren Weaver was not directly immersed in the cybernetic project, partly due to his role as a science administrator (rather than a practitioner), he was firmly connected to the circle by proxy. However, he was directly involved with some later developments that came out of cybernetics. More on this in Chapter 9, Section “Cybernetics 2.0: The Nano-Bio-Info-Cogno Convergence”.

  47.

    Weaver 1949.

  48.

    Later on, Wiener would become more assertive, even protective, of the role that he himself played in developing “Shannon’s theory”. Cf. Conway and Siegelman 2005, pp. 187–188.

  49.

    Shannon 1948, p. 380.

  50.

    Boden 2006, pp. 204–205; Hayles 1999, pp. 50–57.

  51.

    Chapter 2, Section “The Macy Conferences”.

  52.

    Shannon 1948, p. 379.

  53.

    A recorded conversation between Claude Shannon and his wife Betty, toward the end of his life, seems to point in that direction:

    Betty: In the first place, you called it a theory of communication…You didn’t call it a theory of information.

    Claude: Yes, I thought that communication is a matter of getting bits from here to here, whether they’re part of the Bible or just which way a coin is tossed…

    Betty: It bothered you along the way a few times but by that time it was out of your hands.

    Claude: The theory has to do with just getting bits from here to here…That’s the communication part of it, what the communication engineers were trying to do. Information, where you attach meaning to it, that’s the next thing, that’s a step beyond, and that isn’t a concern of the engineer, although it’s interesting to talk about (Conway and Siegelman 2005, pp. 189–190).

  54.

    Not without some still clinging for a while longer to MacKay’s approach, which was more cumbersome for engineering purposes (Hayles 1999, p. 56).

  55.

    Chapter 2, Section “The Macy Conferences”.

  56.

    Weaver 1949, p. 12.

  57.

    Weaver 1949, p. 9.

  58.

    Ibid., p. 13.

  59.

    Hayles 1999, pp. 63–65.

  60.

    The received view of entropy, as nature’s inherent spiraling into chaos and oblivion, is questioned as the default state of affairs (Prigogine and Stengers 1984). More on this below.

  61.

    “Every process, event, happening—call it what you will; in a word, everything that is going on in Nature means an increase of the entropy of the part of the world where it is going on…An isolated system or a system in a uniform environment…increases its entropy and more or less rapidly approaches the inert state of maximum entropy. We now recognize this fundamental law of physics to be just the natural tendency of things to approach the chaotic state…” (Schrödinger 2012, p. 71).

  62.

    “[A] living organism continually increases its entropy—or, as you may say, produces positive entropy—and thus tends to approach the dangerous state of maximum entropy, which is of death. It can only keep aloof from it, i.e. alive, by continually drawing from its environment negative entropy…What an organism feeds upon is negative entropy. Or, to put it less paradoxically, the essential thing in metabolism is that the organism succeeds in freeing itself from all the entropy it cannot help producing while alive…Thus the device by which an organism maintains itself stationary at a fairly high level of orderliness (= fairly low level of entropy) really consists [in] continually sucking orderliness from its environment.” (Ibid.)

  63.

    “At all levels, be it the level of macroscopic physics, the level of fluctuations, or the microscopic level, nonequilibrium is the source of order. Nonequilibrium brings “order out of chaos” (Prigogine and Stengers 1984, pp. 286–287. Italics original).

  64.

    Prigogine and Stengers 1984, p. 13.

  65.

    Chapter 6.

  66.

    Von Neumann 1951, p. 32.

  67.

    Von Neumann 1945.

  68.

    Von Neumann 1945, p. 5.

  69.

    Electronic Discrete Variable Automatic Computer.

  70.

    This situation, as we will see below, signaled the different orientation that the Americans, in contrast with their British allies, had regarding the war effort, which reverberated in different ways of practicing cybernetic science. More on this in the next Section “The Backlash: Unforeseen Consequences of a Behavior-based Ontology”.

  71.

    Marsalli 2006.

  72.

    See the following section. Also Pickering 2005.

  73.

    Chapter 3, Section “Traditional Explanations for the Collapse of Cybernetics”.

  74.

    The problematic epistemological strategy entailed by the construction of McCulloch and Pitts’ networks will be further addressed in the chapter pertaining to the contribution of John von Neumann (Chapter 7).

  75.

    More on this in Chapter 9, Section “Viconian Constructability as a Criterion of Truth”.

  76.

    The psychoanalytic assessment of what supposedly lurks behind the predominantly male field of Artificial Intelligence, the so-called “womb envy” syndrome, could in fact be subsumed under the more primogenial philosophical rubric referred to in the previous note. See McCorduck 2004, ch. 5.

  77.

    This is likely the legacy of the Royal Society’s tradition of experimentation since Bacon, Boyle, and Newton. I thank Jagdish Hattiangadi for pointing this out.

  78.

    See Chapter 3, Section “The Ratio Club”.

  79.

    Ibid. Also Pickering 2010, p. 10.

  80.

    Cordeschi 2008a.

  81.

    Chapter 3, Section “The Ratio Club”.

  82.

    “…thinking on very much the same lines as Kenneth Craik did, but with much less sparkle and humour” (Husbands and Holland 2008, p. 14).

  83.

    Husbands & Holland (ibid.) share this opinion.

  84.

    Referred to in Chapter 6, Section “The Homeostat: A Living Machine” of this work.

  85.

    Pickering 2010, pp. 37–54.

  86.

    Referred to in Chapter 2, Section “The AA-Predictor” of this work.

  87.

    Referred to above and in Chapter 7, Section “John von Neumann’s Appropriation of McCulloch and Pitts’ Networks”.

  88.

    Hayles 1999, pp. 63–65.

  89.

    As seen in Chapter 4, Section “A Machinal Understanding of an Algorithm and the Material Liberation of the Machine” of this work. Also Galison 1994, pp. 233–252.

  90.

    Galison 1994, p. 249.

  91.

    Rosenblueth and Wiener 1945.

  92.

    Rosenblueth and Wiener 1945, p. 320.

  93.

    Pickering 2010, ch. 2.

  94.

    Chapter 2, Section “The Macy Conferences” and Chapter 3, Section “Norbert Wiener’s Cybernetics”.

  95.

    One physically embodied, the other immaterial.
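
A minimal Python sketch of the contrast drawn in note 37 between negative (error-correcting) and positive (reverberating) feedback. It is only an illustration of the distinction, not anything from the original text; the goal, gain, and step values are assumed toy parameters.

```python
def negative_feedback(goal=20.0, state=5.0, gain=0.3, steps=30):
    """Error-correcting (negative) feedback: each step feeds the discrepancy
    between the goal and the current state back with a corrective sign, so the
    state converges toward the goal (thermostat-style regulation)."""
    history = [state]
    for _ in range(steps):
        error = goal - state      # measure the deviation from the goal
        state += gain * error     # the correction shrinks the deviation
        history.append(state)
    return history


def positive_feedback(state=1.0, gain=0.3, steps=30):
    """Reverberating (positive) feedback: the output is fed back and re-amplified
    without any corrective comparison, like a transmitter re-broadcasting its own
    signal ever more strongly."""
    history = [state]
    for _ in range(steps):
        state += gain * state     # reinforcement amplifies what is already there
        history.append(state)
    return history


if __name__ == "__main__":
    print("negative feedback settles near its goal:", round(negative_feedback()[-1], 2))
    print("positive feedback keeps growing:", round(positive_feedback()[-1], 2))
```

Only the qualitative contrast matters here: under these assumptions the corrective loop settles at its goal, while the self-reinforcing loop grows without bound.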

Bibliography

  • Ashby, W. R. (1956). An Introduction to Cybernetics. London, UK: Chapman and Hall; New York, NY: John Wiley & Sons.

  • Boden, M. A. (2006). Mind as Machine: A History of Cognitive Science. Oxford; New York: Clarendon Press; Oxford University Press.

  • Conway, F., & Siegelman, J. (2005). Dark Hero of the Information Age: In Search of Norbert Wiener, the Father of Cybernetics. New York, NY: Basic Books.

  • Cordeschi, R. (2008a). Steps toward the synthetic method: Symbolic information processing and self-organizing systems in early artificial intelligence modeling. In P. Husbands, O. Holland, & M. Wheeler (Eds.), The Mechanical Mind in History (pp. 219–258). Cambridge, MA: MIT Press.

  • Dupuy, J. P. (2000). The Mechanization of the Mind: On the Origins of Cognitive Science. Princeton, NJ: Princeton University Press.

  • François, D. (2007). The Self-Destruction of the West. Paris, France: Éditions Publibook.

  • Galison, P. (1994). The ontology of the enemy: Norbert Wiener and the cybernetic vision. Critical Inquiry, 21(1), 228–266.

  • Hayles, N. K. (1999). How we Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago, IL: University of Chicago Press.

  • Heidegger, M. (1977a). The question concerning technology. In W. Lovitt (Trans.). The Question Concerning Technology and Other Essays (pp. 3–35). New York, NY: Harper & Row. (Original work published 1954).

  • Heidegger, M. (1977b). The end of philosophy and the task of thinking. In D. F. Krell (Ed.), Basic Writings. From Being and Time (1927) to The Task of Thinking (1964) (pp. 373–392). New York, NY: Harper & Row. (Original work published 1969).

  • Heidegger, M. (1981). “Only a God Can Save Us”: The Spiegel interview. In T. Sheehan (Ed.), Heidegger: The Man and the Thinker (pp. 45–67). Chicago, IL: Precedent. (Original work published 1966).

  • Heidegger, M. (1984). The Metaphysical Foundations of Logic (M. Heim, Trans.). Bloomington, IN: Indiana University Press. (Original work published 1978).

  • Heidegger, M. (1999). Contributions to Philosophy (From Enowning) (P. Emad & K. Maly, Trans.). Bloomington, IN: Indiana University Press. (Original work published 1989).

  • Heidegger, M., & Fink, E. (1993). Heraclitus Seminar (C. Seibert, Trans.). Evanston, IL: Northwestern University Press. (Original work published 1970).

  • Husbands, P., & Holland, O. (2008). The Ratio Club: A hub of British cybernetics. In P. Husbands, O. Holland, & M. Wheeler (Eds.), The Mechanical Mind in History (pp. 91–148). Cambridge, MA: MIT Press.

  • Hyötyniemi, H. (2006, August). Neocybernetics in Biological Systems. Espoo, Finland: Helsinki University of Technology, Control Engineering Laboratory. Retrieved from www.control.hut.fi/rpt/r151isbn9789512286133.pdf

  • MacDonald, I. (2008). Adorno and Heidegger: Philosophical Questions. Stanford, CA: Stanford University Press.

  • Marcovich, M. (1967). Heraclitus. Greek Text with a Short Commentary. Editio maior. Mérida, Venezuela: Universidad de los Andes Press.

  • Marsalli, M. (2006). McCulloch-Pitts neurons (National Science Foundation Grants #9981217 and #0127561). Retrieved from www.mind.ilstu.edu/curriculum/modOverview.php?modGUI=212

  • McCorduck, P. (2004). Machines Who Think: A Personal Inquiry into the History and Prospects of Artificial Intelligence. Natick, MA: A K Peters.

  • Moschovakis, Y. N. (2001). What is an algorithm? In B. Engquist & W. Schmid (Eds.), Mathematics Unlimited – 2001 and Beyond (pp. 919–936). Berlin, Germany: Springer.

  • Pickering, A. (2005). A gallery of monsters: Cybernetics and self-organisation, 1940–1970. In S. Franchi & G. Güzeldere (Eds.), Mechanical Bodies, Computational Minds: Artificial Intelligence from Automata to Cyborgs (pp. 229–245). Cambridge, MA: MIT Press.

  • Pickering, A. (2010). The Cybernetic Brain: Sketches of Another Future. Chicago, IL: University of Chicago Press.

  • Prigogine, I., & Stengers, I. (1984). Order Out of Chaos: Man’s New Dialogue with Nature. New York, NY: Bantam Books.

  • Rosenblueth, A., & Wiener, N. (1945). The role of models in science. Philosophy of Science, 12, 316–321.

  • Schrödinger, E. (2012). What is Life?: With Mind and Matter and Autobiographical Sketches. Cambridge, UK: Cambridge University Press. (Original work published 1944).

  • Shannon, C. E. (1948). A mathematical theory of communication. The Bell System Technical Journal, 27, 379–423.

  • Taylor, R. (1950a). Comments on a mechanistic conception of purposefulness. Philosophy of Science, 17(4), 310–317.

  • Taylor, R. (1950b). Purposeful and non-purposeful behavior: A rejoinder. Philosophy of Science, 17(4), 327–332.

  • Turing, A. M. (1939). Systems of logic based on ordinals. Proceedings of the London Mathematical Society, 45(1), 161–228.

  • United States Patent and Trademark Office. (2014, March). Manual of Patent Examining Procedure. Alexandria, VA: U.S. Government Printing Office.

  • Von Neumann, J. (1945). First Draft of a Report on the EDVAC (Contract No. W-670-ORD-4926, between the United States Army Ordnance Department and the University of Pennsylvania). Philadelphia, PA: Moore School of Electrical Engineering.

  • Von Neumann, J. (1951). The general and logical theory of automata. In L. A. Jeffress (Ed.), Cerebral Mechanisms in Behavior: The Hixon Symposium (pp. 1–41). New York, NY: John Wiley & Sons.

  • Weaver, W. (1949). The mathematics of communication. Scientific American, 181(1), 11–15.

  • Wiener, N. (1950). The Human Use of Human Beings: Cybernetics and Society. Boston, MA: Houghton Mifflin Co. (2nd rev. ed. 1954).

Copyright information

© 2017 The Author(s)

About this chapter

Cite this chapter

Malapi-Nelson, A. (2017). Cybernetic Tenets: Philosophical Considerations. In: The Nature of the Machine and the Collapse of Cybernetics. Palgrave Studies in the Future of Humanity and its Successors. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-319-54517-2_5
