Abstract
Computation is to the processing of data, information, and knowledge what physical work processes are to material transformations. Indeed, computation is a work process of a very special kind, in which energy is consumed to transform messages received into usable forms. Here we will consider several kinds of computational processes and see how they all provide this transformation. Everyone is familiar with the digital computer, which has become one of the most ubiquitous forms of man-made computation. But we will also investigate biological forms of computation, especially how brains perform computations such as transforming sensory data streams into actions. One especially important use of computation (in both machines and brains) is the construction and running of simulation models.
“Calculating machines comprise various pieces of mechanism for assisting the human mind in executing the operations of arithmetic. Some few of these perform the whole operation without any mental attention when once the given numbers have been put into the machine.”
Charles Babbage, 1864 (from Hyman 1989)
“The most important feature that distinguishes biological computations from the work carried out by our personal computers is that biological computations care. … valuing some computations more than others.”
Read Montague, 2006
Notes
- 1.
Throughout the history of neuroscience, the workings of the brain have been compared to, if not explained as, the workings of the best-known technology of the time. They have been treated as complex plumbing systems, telephony switching exchanges, and, most recently, digital computers. Thankfully, we now know that it is none of the above (although in some sense, it is also all of the above!).
- 2.
Computing processes are generally divided into digital and analog, though these labels usually refer to “machines.” Analog computers were used extensively prior to the development of the integrated circuit (the chip), which packed greater computing power into small packages. Human-built analog computers served control and simulation purposes, roles that digital computers were eventually able to take over. In nature, however, there are a large number of evolved analog computing processes, meeting the definition given here, that cannot be easily emulated with algorithms. These are discussed in the chapter.
- 3.
According to David Harel (1987), the field is inappropriately named. He likens the name “computer science” to “toaster science.” The science is not about the machine per se; it is about algorithmic computational processes, which just happen to be implemented in mechanical/electronic machines!
- 4.
See Wikipedia, Boolean Algebra: http://en.wikipedia.org/wiki/Boolean_algebra.
- 5.
For interested readers who would like to learn more about computers from the bits to the programming level, a very accessible textbook is Patt and Patel (2004).
- 6.
For example, natural numbers (integers 0, 1, 2, etc.) are used to represent letters and many different written symbols in various languages. See the American Standard Code for Information Interchange (ASCII), http://en.wikipedia.org/wiki/ASCII. This code has been supplanted in newer computers by Unicode, whose basic multilingual plane alone encodes 65,536 different characters, allowing it to represent characters from nearly all written human languages!
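To make the numeric encoding of characters concrete, here is a small Python sketch (illustrative, not from the chapter) showing the integer code points behind letters and how strings become bytes:

```python
# Characters are stored as integer code points. In Python, ord() and chr()
# convert between a character and its numeric code.
print(ord('A'))   # 65 in ASCII (and in Unicode, which is a superset of ASCII)
print(chr(97))    # 'a'

# A string must be encoded to bytes for storage or transmission:
print('A'.encode('ascii'))   # b'A' -- one byte per character
print('é'.encode('utf-8'))   # b'\xc3\xa9' -- a multi-byte Unicode encoding
```

Note that UTF-8, the most common Unicode encoding, uses one byte for the old ASCII range and more bytes for characters beyond it.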
- 7.
- 8.
Though it is beyond the scope of this book, it should be pointed out that not all functions that might be expressed are actually computable algorithmically. Below, we explore other kinds of computation, less rigid than the deterministic machine types, that can approximate noncomputable functions.
- 9.
This third “state” introduces the notion of trinary (ternary) logic as opposed to binary logic. But all of the process rules still hold.
- 10.
- 11.
- 12.
- 13.
- 14.
- 15.
Computers use pseudorandom number generators to approximate stochastic processes. These allow us to simulate stochastic processes but not to actually emulate them. There are add-on devices that use, for example, line noise to produce truly random values, but these generally produce uniformly distributed output and so are still of limited value in trying to emulate real-life randomness.
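The point that pseudorandomness is simulation, not emulation, can be seen in a linear congruential generator, one of the simplest pseudorandom schemes (a sketch; the multiplier and increment are standard textbook constants, not values from the chapter):

```python
# A linear congruential generator (LCG). It is entirely deterministic:
# the same seed always reproduces exactly the same "random" sequence.
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    state = seed
    while True:
        state = (a * state + c) % m
        yield state / m  # scale to a uniform value in [0, 1)

gen = lcg(seed=42)
sample = [next(gen) for _ in range(3)]
# Reseeding with 42 yields exactly the same three values again --
# the hallmark of pseudorandomness as opposed to true randomness.
```

Because the sequence is a fixed function of the seed, it only mimics a stochastic process; it does not instantiate one.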
- 16.
In the field of psychology, a considerable amount of work has been done showing that human beings do not usually make rational (logical) decisions or judgments but rather use a discernible set of heuristic rules for most judgments. These heuristics can and do lead to consistent and detectable biases and result in judgmental errors. An excellent compendium of the state of research on this can be found in Gilovich et al. (2002).
- 17.
We will save the details for the next chapter but curious readers might want to take a look at the Wikipedia article: http://en.wikipedia.org/wiki/Classical_conditioning.
- 18.
- 19.
A capacitor, an element in electronic circuits that stores a charge, is an example of a leaky integrator. See http://en.wikipedia.org/wiki/Capacitor.
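The leaky-integrator dynamic can be sketched in a few lines of Python using simple Euler integration (the time constant and step size here are arbitrary illustration values):

```python
# A leaky integrator accumulates its input while continuously "leaking"
# a fraction of its stored value: dV/dt = -V/tau + I(t).
def leaky_integrate(inputs, tau=10.0, dt=1.0, v0=0.0):
    v = v0
    trace = []
    for i in inputs:
        v += dt * (-v / tau + i)  # Euler step: leak term plus input
        trace.append(v)
    return trace

# A brief pulse of input charges the integrator; afterwards the stored
# value decays, much as a capacitor charges and then discharges.
trace = leaky_integrate([1.0] * 5 + [0.0] * 20)
```

During the input pulse the stored value rises toward an equilibrium; once input ceases it decays geometrically toward zero, never quite reaching it.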
- 20.
Here is where real brains work quite differently from classical artificial neural networks (ANNs), which are based on what is called distributed representation. In the latter, representations are encoded in a distributed fashion throughout all of the neural connections. Each synapse has a weight value corresponding to the notion of synaptic efficacy, but every synapse participates in every memory trace. In these ANNs, the whole network encodes every pattern and requires extensive training to get all of the weights just right. In real brains, we now know that individual neurons and clusters of neurons represent patterns and objects.
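A toy Hopfield-style network makes the distributed-representation idea concrete: every stored pattern modifies every connection weight, so each "synapse" participates in every memory trace (a sketch of the classical ANN scheme being contrasted here, not of the chapter's own model):

```python
# Hebbian outer-product learning: each pattern adds to EVERY weight.
def train(patterns):
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w

# Recall by repeatedly updating each unit from the weighted sum of the rest.
def recall(w, probe, steps=5):
    s = list(probe)
    for _ in range(steps):
        for i in range(len(s)):
            total = sum(w[i][j] * s[j] for j in range(len(s)))
            s[i] = 1 if total >= 0 else -1
    return s

patterns = [[1, -1, 1, -1, 1, -1], [1, 1, 1, -1, -1, -1]]
w = train(patterns)
# A corrupted probe (last bit flipped) settles back to the stored pattern.
restored = recall(w, [1, -1, 1, -1, 1, 1])
```

Note that no single weight "is" either memory; deleting any one weight degrades both patterns slightly, which is exactly the property that distinguishes this scheme from single-neuron encoding.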
- 21.
See Alkon (1987), ch 16 for a thorough description of the model.
- 22.
- 23.
The mapping from retinal locations, i.e., the two-dimensional array of photosensitive cells in the retinas, is retinotopically preserved. See the Wikipedia article on retinotopy for more details: http://en.wikipedia.org/wiki/Retinotopy. Other sensory modalities have similar topology-preserving layouts, e.g., touch sensors in the skin. Auditory sensing requires a transformation of complex audio signals that breaks each signal into its frequency components. This is accomplished by the hair cells arrayed along the cochlea of the inner ear, where each group of hairs along the linear array is responsive to a specific frequency (range). Primary auditory cortex then uses a very similar mapping scheme to preserve what is called tonotopic registration. Auditory features also include the amplitudes of the various frequencies composing a tone at a specific instant in time.
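The cochlea's decomposition of sound into frequency components is mathematically what the discrete Fourier transform (DFT) does to a sampled signal. A naive pure-Python DFT illustrates the idea (O(n²), for exposition only; real software uses an FFT library):

```python
import cmath
import math

# Compute the magnitude of each frequency component of a sampled signal.
def dft_magnitudes(signal):
    n = len(signal)
    mags = []
    for k in range(n):  # one "bin" per candidate frequency
        acc = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                  for t in range(n))
        mags.append(abs(acc))
    return mags

# A pure tone completing 4 cycles over 32 samples produces a sharp peak
# at bin 4 -- analogous to one group of hair cells along the cochlea
# responding strongly to one specific frequency.
signal = [math.sin(2 * math.pi * 4 * t / 32) for t in range(32)]
mags = dft_magnitudes(signal)
```

Each bin here plays the role of one position along the tonotopic array: the bin index encodes frequency, and the magnitude encodes the amplitude feature mentioned above.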
- 24.
“Face neurons” have actually been detected in human brains! Using a procedure called unit recording (listening to individual neurons or small clusters for higher-than-background activity), neuroscientists have determined that there really are neurons that get excited either with the presentation of specific celebrity faces (like Bill Clinton’s) or when the individual is asked to think about the celebrity. This, along with many similar findings from animal recordings, has revived the theory of neuron encoding (presented here) of features, objects, and concepts, as opposed to the theory called distributed representation (Rumelhart and McClelland 1986, ch 3). For a quick description of how individual neurons can encode specific objects, like grandma’s face, see http://en.wikipedia.org/wiki/Grandmother_cell.
- 25.
Storytelling is really what the brain does. We experience the world as a sequence of events that flow, generally, from one to the next over time. When we try to communicate with others and with ourselves, we actually construct stories. One important aspect of stories is that they take on meaning based on the affective quality of the events and their content. A unicorn might elicit a feeling of mysticism since no one has ever seen one, but the idea that it could exist in a different place or time helps generate a sense of wonder.
- 26.
- 27.
A full accounting of the approach can be found in Mobus G, Foraging Search: Prototypical Intelligence (http://faculty.washington.edu/gmobus/ForagingSearch/Foraging.html. Accessed September 24, 2013).
- 28.
- 29.
See http://en.wikipedia.org/wiki/Bounded_rationality. Take note in this article of how the word information is substituted for knowledge! We warned you that even the experts make this error.
Bibliography and Further Reading
Alkon DL (1987) Memory traces in the brain. Cambridge University Press, Cambridge
Baars BJ, Gage NM (2007) Cognition, brain, and consciousness. Elsevier AP, New York
Gilovich T, Griffin D, Kahneman D (eds) (2002) Heuristics and biases: the psychology of intuitive judgment. Cambridge University Press, New York
Harel D (1987) The science of computing: exploring the nature and power of algorithms. Addison-Wesley Publishing Company, Inc., New York
Hyman A (ed) (1989) Science and reform: selected works of Charles Babbage. Cambridge University Press, Cambridge
Levine DS, Aparicio M (eds) (1994) Neural networks for knowledge representation and inference. Lawrence Erlbaum Associates, Hillsdale
Mobus GE (1994) Toward a theory of learning and representing causal inferences in neural networks. In: Levine DS, Aparicio M (eds) Neural networks for knowledge representation and inference. Lawrence Erlbaum Associates, Hillsdale, ch 13
Montague R (2006) Why choose this book: how we make decisions. Dutton, New York
Patt YN, Patel SJ (2004) Introduction to computing systems: from bits to gates to C & beyond, 2nd edn. McGraw-Hill, New York
Rumelhart DE, McClelland JL (1986) Parallel distributed processing: explorations in the microstructure of cognition. MIT Press, Cambridge
© 2015 Springer Science+Business Media New York
Mobus, G.E., Kalton, M.C. (2015). Computational Systems. In: Principles of Systems Science. Understanding Complex Systems. Springer, New York, NY. https://doi.org/10.1007/978-1-4939-1920-8_8
Print ISBN: 978-1-4939-1919-2
Online ISBN: 978-1-4939-1920-8