Title :
Self-organization, scaling, and parallelism
Author :
Stassinopoulos, Dimitris
Author_Institution :
Comput. Sci. Div., NASA Ames Res. Center, Moffett Field, CA, USA
Abstract :
The problem of learning in the absence of external intelligence is discussed in the context of a simple model. The model departs from traditional gradient-descent-based approaches to learning by operating at a highly susceptible "critical" state, with low activity and sparse connections between firing neurons. Quantitative studies on two simple association tasks demonstrate that the elements of the model that are essential for self-organized learning are also essential for its scaling performance and parallelism.
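The learning scheme the abstract describes, a global success/failure signal acting on the sparse set of co-firing units instead of per-weight gradients, can be sketched as follows. This is a minimal Python illustration, not the paper's exact model: the network size, initialization, update rule, and the parameters eta and gamma are all assumptions chosen only to show feedback-driven plasticity at low activity.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64                              # number of units (assumed size)
W = rng.uniform(0.0, 0.1, (N, N))   # synaptic strengths (illustrative init)
theta = 1.0                         # global firing threshold (self-regulated)
eta = 0.05                          # plasticity rate (assumed value)
gamma = 0.01                        # threshold adaptation rate (assumed value)

def step(x_in, target):
    """One stimulus-response-feedback cycle.

    x_in   : binary input pattern presented to the network
    target : desired binary pattern on the first len(target) units
    Returns the firing pattern and whether the response was correct.
    """
    global W, theta
    # Activity: a unit fires if its summed input exceeds the threshold.
    h = W @ x_in
    fired = (h > theta).astype(float)

    # Global scalar feedback: +1 on success, -1 on failure. This is the
    # only teaching signal -- no gradients, no per-synapse error terms.
    correct = np.array_equal(fired[:len(target)], target)
    r = 1.0 if correct else -1.0

    # Plasticity acts only on connections between co-active units, so
    # learning stays confined to the sparse firing pathway.
    co_active = np.outer(fired, x_in)
    W += eta * r * co_active
    np.clip(W, 0.0, 1.0, out=W)

    # Self-regulate the threshold toward minimal activity: raise it when
    # many units fire, lower it when almost none do.
    theta += gamma * (fired.sum() - 1.0)
    return fired, correct
```

Under these assumptions, repeatedly calling step with a fixed input/target pair reinforces whichever sparse pathway produced a correct response, while the threshold adaptation keeps the network near the low-activity regime the abstract calls "critical".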
Keywords :
brain models; learning (artificial intelligence); neurophysiology; self-organising feature maps; association tasks; firing neurons; low activity; parallelism; scaling; self-organization; sparse connections; Animals; Artificial neural networks; Biological neural networks; Biological system modeling; Context modeling; Evolution (biology); Neurons; Unsupervised learning
Conference_Title :
1998 IEEE International Joint Conference on Neural Networks Proceedings. IEEE World Congress on Computational Intelligence
Conference_Location :
Anchorage, AK
Print_ISBN :
0-7803-4859-1
DOI :
10.1109/IJCNN.1998.687173