Proceedings of the National Academy of Sciences of the United States of America. 1988 Apr;85(7):2141–2145. doi: 10.1073/pnas.85.7.2141

Neural networks counting chimes.

D J Amit 1
PMCID: PMC279945  PMID: 3353371

Abstract

It is shown that the ideas that led to neural networks capable of associatively and asynchronously recalling temporal sequences of patterns can be extended to produce a neural network that automatically counts the cardinal number in a sequence of identical external stimuli. The network is explicitly constructed, analyzed, and simulated. Such a network may account for the cognitive effect of the automatic counting of chimes to tell the hour. A more general implication is that different electrophysiological responses to identical stimuli, at certain stages of cortical processing, do not necessarily imply synaptic modification, à la Hebb. Such differences may arise from the fact that consecutive identical inputs find the network in different stages of an active temporal sequence of cognitive states. These types of networks are then situated within a program for the study of cognition, which assigns the detection of meaning, rather than computation, as the primary role of attractor neural networks, in contrast to the parallel distributed processing attitude to the connectionist project. This interpretation is free of a homunculus, as well as of the criticism raised against the cognitive model of symbol manipulation. Computation is then identified as the syntax of temporal sequences of quasi-attractors.
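The counting mechanism described above can be illustrated with a minimal sketch, not the paper's actual construction: a Hopfield-style symmetric term makes each "count state" pattern a quasi-attractor, and an asymmetric hetero-associative term (in the spirit of the temporal-sequence networks cited) maps pattern μ onto pattern μ+1. The gating strength `lam` and the gating scheme, in which each identical external pulse briefly switches on the sequence term, are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 400, 6  # neurons; count states ("zero chimes" ... "five chimes")

# Random +/-1 patterns, one per count state
xi = rng.choice([-1, 1], size=(P, N))

# Symmetric Hopfield term: makes each pattern a quasi-attractor
J_sym = (xi.T @ xi) / N

# Asymmetric hetero-association: its field at pattern mu points toward mu+1
J_seq = (xi[1:].T @ xi[:-1]) / N

def relax(s, steps=20):
    """Settle into the nearest attractor under the symmetric dynamics."""
    for _ in range(steps):
        s = np.where(J_sym @ s >= 0, 1, -1)
    return s

def chime(s, lam=2.0):
    """One identical external pulse: the pulse transiently gates the
    sequence term (gating scheme is a hypothetical choice), kicking the
    state from attractor mu toward mu+1, after which it relaxes there."""
    s = np.where(J_sym @ s + lam * (J_seq @ s) >= 0, 1, -1)
    return relax(s)

s = relax(xi[0].copy())                 # start in the "zero chimes" state
counts = [int(np.argmax(xi @ s))]       # decode count by largest overlap
for _ in range(P - 1):
    s = chime(s)                        # identical input each time
    counts.append(int(np.argmax(xi @ s)))
print(counts)
```

Each pulse is identical, yet the network's response differs every time because the pulse finds the network in a different attractor of the sequence, which is the abstract's central point about identical stimuli and non-Hebbian response differences.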



