Résumé
Les architectures connexionnistes, utilisant le concept de neurone formel, peuvent présenter des propriétés rappelant certaines fonctions cognitives simples. Elles peuvent donc conduire à des machines nouvelles de traitement du signal, mettant en œuvre des fonctions de classification avec généralisation ou de mémoire associative, par exemple. L'implantation de ces fonctions dans le silicium peut être avantageuse, si l'on arrive à surmonter les problèmes de connectivité et de réalisation des coefficients synaptiques, que ce soit en technologie analogique ou bien en technologie numérique. L'avènement de technologies submicroniques permettra de faire des progrès en direction de réseaux plus importants que ce que l'on sait intégrer actuellement, mais ce n'est probablement que dans l'association de réseaux indépendants travaillant en coopération dans des architectures hiérarchisées que l'on arrivera à implanter des fonctions de plus haut niveau, avec un fort parallélisme.
Abstract
Connectionist architectures based on the formal-neuron concept can exhibit properties reminiscent of certain simple cognitive functions. They may therefore lead to new signal-processing machines implementing, for example, classification with generalization or associative memory. Integrating these functions in silicon can be advantageous, provided the problems of connectivity and of realizing the synaptic coefficients can be overcome, whether in analog or digital technology. Forthcoming submicrometer technologies will allow progress towards networks larger than those that can be integrated today, but it is probably only by associating independent networks cooperating within hierarchical architectures that higher-level functions with a high degree of parallelism will be implemented.
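The associative-memory function mentioned above can be illustrated by the Hopfield model cited in the bibliography. The following is a minimal sketch, assuming ±1 binary neurons, outer-product (Hebbian) storage, and synchronous threshold updates; the function names `hebbian_weights` and `recall` are illustrative, not from the article.

```python
import numpy as np

def hebbian_weights(patterns):
    """Outer-product (Hebbian) storage of +/-1 patterns; no self-coupling."""
    p = np.asarray(patterns, dtype=float)   # shape: (n_patterns, n_neurons)
    w = p.T @ p / p.shape[1]                # synaptic coefficient matrix
    np.fill_diagonal(w, 0.0)                # zero diagonal: no self-feedback
    return w

def recall(w, state, max_steps=20):
    """Iterate synchronous threshold updates until a fixed point is reached."""
    s = np.asarray(state, dtype=float)
    for _ in range(max_steps):
        nxt = np.where(w @ s >= 0.0, 1.0, -1.0)  # threshold each neuron
        if np.array_equal(nxt, s):               # fixed point: stored pattern
            break
        s = nxt
    return s
```

Storing two orthogonal 8-neuron patterns and corrupting one bit of the first, `recall` relaxes the network back to the stored pattern, which is the error-correcting retrieval behaviour of such associative memories.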
Bibliography
References cited in the text
McCulloch (W.S.), Pitts (W.). A logical calculus of the ideas immanent in nervous activity. Bull. Math. Biophys. (1943), 5, pp. 115–133.
Hebb (D.O.). The organization of behavior. Wiley (1949).
Rosenblatt (F.). The perceptron: a probabilistic model for information storage and organization in the brain. Psychological Rev. (1958), 65, pp. 386–408.
Duda (R.O.), Hart (P.E.). Pattern classification and scene analysis. Wiley (1973).
Rumelhart (D.E.), McClelland (J.L.). Parallel distributed processing. MIT Press (1986).
Hopfield (J.J.). Neural networks and physical systems with emergent collective computational abilities. Proc. Natl. Acad. Sci. USA (1982), 79, pp. 2554–2558.
Personnaz (L.), Guyon (I.), Dreyfus (G.). Collective computational properties of neural networks: new learning mechanisms. Phys. Rev. A (1986), 34, pp. 4217–4228.
Personnaz (L.). Étude de réseaux de neurones formels : conception, propriétés et applications. Thèse de Doctorat d'État, Université de Paris VI (1986).
Amit (D.J.), Gutfreund (H.), Sompolinsky (H.). Storing infinite numbers of patterns in a spin-glass model of neural networks. Phys. Rev. Lett. (1985), 55, n° 14, pp. 1530–1533.
Ackley (D.H.), Hinton (G.E.), Sejnowski (T.J.). A learning algorithm for Boltzmann machines. Cognitive Science (1985), 9, pp. 147–169.
Azencott (R.). Synchronous Boltzmann machines and Gibbs fields: learning algorithms. Proc. Les Arcs NATO Advanced Workshop on Neurocomputing, F. Fogelman, J. Hérault, eds. Springer Verlag (1990).
Sompolinsky (H.). Neural networks with nonlinear synapses and a static noise. Phys. Rev. A (1986), 34, pp. 2571–2574.
Hopfield (J.J.). Neurons with graded response have collective computational properties like those of two-state neurons. Proc. Natl. Acad. Sci. USA (1984), 81, pp. 3088–3092.
Steinbuch (K.). Die Lernmatrix. Kybernetik (1961), 1, pp. 36–45.
Schwartz (D.B.), Howard (R.E.), Denker (J.S.), Epworth (R.W.), Graf (H.P.), Hubbard (W.), Jackel (L.D.), Straughn (B.), Tennant (D.M.). Dynamics of microfabricated electronic neural networks. Appl. Phys. Lett. (1987), 50, pp. 1110–1112.
Graf (H.P.), de Vegvar (P.). A CMOS implementation of a neural network model. Proc. Stanford Conf. on Advanced Research in VLSI, P. Losleben, ed. MIT Press (1987), pp. 351–367.
Graf (H.P.), Henderson (D.). A reconfigurable CMOS neural network. IEEE Int. Solid-State Circ. Conf. (1990).
Holler (M.), Tam (S.), Castro (H.), Benson (R.). An electrically trainable artificial neural network with 10240 "floating gate" synapses. Proc. International Joint Conference on Neural Networks, Washington D.C. (1989).
Alspector (J.), Allen (R.B.), Hu (V.), Satyanarayana (S.). Stochastic learning networks and their electronic implementation. Neural Information Processing Systems, Natural and Synthetic, D.Z. Anderson, ed. American Institute of Physics (1988).
Graf (H.P.), Jackel (L.D.), Howard (R.E.), Straughn (B.), Denker (J.S.), Hubbard (W.), Tennant (D.M.), Schwartz (D.). VLSI implementation of a neural network memory with several hundreds of neurons. Neural Networks for Computing, J.S. Denker, ed. American Institute of Physics (1986).
Spencer (E.G.). Programmable bistable switches and resistors for neural networks. Neural Networks for Computing, J.S. Denker, ed. American Institute of Physics (1986).
Vignolle (J.M.), Chambost (E. de), Defrance (M.), Le Pesant (J.P.), Mourey (B.), Robin (Ph.), Micheron (F.). A ferroelectric interconnection matrix for VLSI programmable neural network. Thomson-CSF LCR preprint, February 1989.
Tsividis (Y.P.), Anastassiou (D.). Switched capacitor neural network. Electronics Lett. (1987), 23, pp. 958–959.
Schwartz (D.B.), Howard (R.E.), Hubbard (W.E.). A programmable analog neural network chip. IEEE J. of Solid State Circ. (1989), 24, pp. 313–319.
Sage (J.P.), Thompson (K.), Withers (R.S.). An artificial neural network integrated circuit based on MNOS/CCD principles. In: Neural Networks for Computing, J.S. Denker, ed. American Institute of Physics (1986).
Murray (A.F.), Smith (A.V.W.). Asynchronous VLSI neural networks using pulse-stream arithmetic. IEEE J. of Solid State Circ. (1988), 23, pp. 688–697.
Del Corso (D.), Gregoretti (F.), Pellegrini (C.), Reyneri (L.M.). An artificial neural network based on multiplexed pulse streams. Proc. 1st Workshop on Microelectronics for Neural Networks, K. Goser, U. Ramacher, U. Rückert, eds., Dortmund (1990).
Weinfeld (M.). A fully digital CMOS integrated Hopfield network including the learning algorithm. International Workshop on VLSI for Artificial Intelligence, Oxford (GB), July 1988. In: VLSI for Artificial Intelligence, J.G. Delgado-Frias and W. Moore, eds. Kluwer Academic (1989).
Kung (S.Y.), Hwang (J.N.). Parallel architectures for artificial neural nets. Proc. IEEE International Conf. on Neural Networks (1988), II, pp. 165–172.
Jones (S.), Thomaz (M.), Sammut (K.). Linear systolic neural network machine. IFIP Workshop on Parallel Architectures on Silicon, Grenoble (1989).
Blayo (F.), Hurat (P.). A systolic architecture dedicated to neural networks. Neural Networks, from Models to Applications, L. Personnaz, G. Dreyfus, eds. IDSET (1989).
Yasunaga (M.), Masuda (N.), Asai (M.), Yamada (M.), Hirai (Y.). A wafer scale integration neural network utilizing completely digital circuits. Proc. International Joint Conference on Neural Networks, Washington D.C. (1989).
Duranton (M.), Gobert (J.), Mauduit (N.). A digital VLSI module for neural networks. Neural Networks, from Models to Applications, L. Personnaz, G. Dreyfus, eds. IDSET (1989).
Alla (P.Y.), Dreyfus (G.), Gascuel (J.D.), Johannet (A.), Personnaz (L.), Roman (J.), Weinfeld (M.). Silicon integration of learning algorithm and other auto-adaptive properties in a digital feedback neural network. Proc. 1st Workshop on Microelectronics for Neural Networks, K. Goser, U. Ramacher, U. Rückert, eds., Dortmund (1990).
Sejnowski (T.J.), Rosenberg (C.M.). Parallel networks that learn to pronounce English text. Complex Systems (1987), 1, pp. 145–168.
Personnaz (L.), Johannet (A.), Dreyfus (G.), Weinfeld (M.). Towards a neural network chip: a performance assessment and a simple example. Neural Networks, from Models to Applications, L. Personnaz, G. Dreyfus, eds. IDSET (1989).
Ouali (J.), Saucier (G.), Trilhe (J.). A flexible wafer scale network. ICCD Conference, Cambridge (USA) (1989).
JESSI Blue Book: Advanced Neural Circuits and Networks on Silicon (1990).
Peretto (P.), Niez (J.J.). Long term memory storage capacity of multiconnected neural networks. Biol. Cybernetics (1986), 54, pp. 53–63.
Additional references
Anderson (D.Z.). Neural information processing systems. American Institute of Physics (1988).
Anderson (J.A.), Rosenfeld (E.). Neurocomputing, foundations of research. MIT Press (1988).
Del Corso (D.), Grosspietsch (K.E.), Treleaven (P.). European approaches to VLSI neural networks. IEEE Micro (Dec. 1989), 9, n° 6.
Denker (J.S.). Neural networks for computing. American Institute of Physics (1986).
Kohonen (T.). Self-organization and associative memory. Springer Series in Information Science, Springer Verlag (1984).
Lippmann (R.). An introduction to computing with neural nets. IEEE ASSP Magazine (April 1987), 4, n° 2, pp. 4–22.
Personnaz (L.), Dreyfus (G.). Neural networks: from models to applications. Éditions IDSET, Paris (1989).
Ramacher (U.). Introduction to VLSI design of artificial networks. Kluwer Academic Publishers (1990).
Touretzky (D.S.). Advances in neural information processing systems. Morgan Kaufmann Publishers, vol. 1 (1989), vol. 2 (1990).
Weinfeld, M. Intégration d'architectures à base de réseaux de neurones formels : un défi pour les technologies submicroniques. Ann. Télécommun. 46, 142–155 (1991). https://doi.org/10.1007/BF02995443