Abstract
This chapter derives a standard memory cell, that is, a cell designed to interface with other such cells to constitute a word of long-term memory. Each standard memory cell contains the gates necessary to control signals to and from its own individual memory element.
A word of long-term memory is assumed to be enabled by Rin and Lin (Right-in and Left-in) signals; an associative match is signaled by the emergence of a signal at Rout and Lout (Right-out and Left-out). Cues, or attributes, are applied from the direction of conscious short-term memory (STM) via a cue editor; if there are matches, the contents of the matching words are returned toward conscious STM via a recall editor.
Multiple matches are common when cues are few, necessitating a system of multiple-match resolution. This is accomplished in part by using gates to block additional matches once a particular match is located. Memory search is then resumed using the same cues; to support multiple-match resolution, a simulated qubit in the form of a toggle, initially reset to zero, serves to block matching words that have already been returned.
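The multiple-match resolution scheme can be sketched in software. This is a hypothetical illustration, not the chapter's circuit: each word carries a simulated toggle (the `returned` flag below), reset to zero before the search, that blocks words already returned when the same cues are reapplied.

```python
# Hypothetical sketch of multiple-match resolution. A word is a tuple of
# bits; cues give the required value at selected bit positions.
def matches(word, cues):
    """A word matches when every given cue agrees with its stored bit."""
    return all(word[i] == v for i, v in cues.items())

def search_all(memory, cues):
    """Return matching words one at a time, toggling each off after use."""
    returned = [False] * len(memory)   # simulated toggles, reset to zero
    results = []
    while True:
        # Re-apply the same cues; toggles block words already returned.
        hit = next((i for i, w in enumerate(memory)
                    if not returned[i] and matches(w, cues)), None)
        if hit is None:
            break
        returned[hit] = True           # block this word on later passes
        results.append(memory[hit])
    return results

memory = [(0, 1, 1), (1, 1, 0), (0, 1, 0)]
cues = {1: 1}                          # few cues, so multiple matches
print(search_all(memory, cues))        # all three words match bit 1
```

With only one cue, every word matches, and the toggles let repeated searches walk through the matches one by one instead of returning the same word forever.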
Neural circuits for reading long-term memory are also used to write, that is, to memorize images taken from conscious STM. Memorization is modeled as occurring automatically once certain conditions are met: the image is not already memorized, and it has occurred in consciousness a given number of times. A simplified filter circuit is suggested for images that appear only a couple of times, although in practice several rehearsals would be more realistic for memorization.
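The memorization conditions can be sketched as follows. The threshold of two appearances mirrors the simplified filter described above; the function and variable names are illustrative assumptions, not the chapter's notation.

```python
# Hypothetical sketch of the memorization filter: an image is written to
# long-term memory only if it is not already stored and has appeared in
# conscious STM a threshold number of times (two here, for simplicity;
# several rehearsals would be more realistic).
REHEARSAL_THRESHOLD = 2

def maybe_memorize(image, long_term, counts):
    """Count conscious occurrences; memorize once the threshold is met."""
    if image in long_term:             # already memorized: do nothing
        return False
    counts[image] = counts.get(image, 0) + 1
    if counts[image] >= REHEARSAL_THRESHOLD:
        long_term.add(image)           # automatic write to long-term memory
        return True
    return False

ltm, seen = set(), {}
maybe_memorize("cat", ltm, seen)         # first appearance: not yet
assert maybe_memorize("cat", ltm, seen)  # second appearance triggers write
assert not maybe_memorize("cat", ltm, seen)  # already memorized: no rewrite
```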
A multiwrite system is necessary to ensure that each newly memorized image goes into only one available blank word of long-term memory. This is accomplished below with a stack of long-term memory elements, all cleared to false except one, where the new memory will go. Once the memory is in place, the memory element is set to true and stays true indefinitely, just like the memory itself.
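One plausible reading of the multiwrite scheme can be sketched in code. This is an assumption-laden illustration: a write-enable flag singles out exactly one blank word, the image lands only there, and that word's memory element is then set true permanently while the enable flag is steered to the next blank word.

```python
# Minimal sketch of the multiwrite idea (one plausible reading): the
# enable flag is true for exactly one blank word, so a new image goes
# into only that word; its occupied element is then set true and stays
# true indefinitely, just like the memory itself.
def write_new(words, occupied, enable, image):
    """Write the image into the single enabled word; mark it occupied."""
    i = enable.index(True)             # the one enabled blank word
    words[i] = image
    occupied[i] = True                 # set true, stays true indefinitely
    enable[i] = False
    # steer the single enable flag to the next blank word, if any remains
    for j, occ in enumerate(occupied):
        if not occ:
            enable[j] = True
            break
    return i

words = [None] * 4
occupied = [True, True, False, False]  # two blank words remain
enable = [False, False, True, False]   # exactly one word enabled
slot = write_new(words, occupied, enable, "image A")
assert slot == 2 and occupied[2] and enable[3]
```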
This chapter compares memorization to learning, with an eye to explaining the amazing abilities of savants, who apparently memorize huge amounts of information. Learning filters are explained for state-machine learning, in which sequences are recalled from subconscious long-term memory without having to pass through conscious STM. Savant memorization is proposed to be, in fact, learning, accomplished with the aid of special learning filters. An example filter is given that permits subconscious learning with a very low count of rehearsals.
Copyright information
© 2013 Springer Science+Business Media New York
Cite this chapter
Burger, J.R. (2013). Long-Term Memory Neural Circuits, Fast and Precise. In: Brain Theory From A Circuits And Systems Perspective. Springer Series in Cognitive and Neural Systems, vol 6. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-6412-9_8
Print ISBN: 978-1-4614-6411-2
Online ISBN: 978-1-4614-6412-9