Title :
Transputers and neural networks: an analysis of implementation constraints and performance
Author :
Murre, Jacob M J
Author_Institution :
Unit of Experimental and Theoretical Psychology, Leiden University, Netherlands
Date :
3/1/1993
Abstract :
A performance analysis is presented that focuses on the achievable speedup of a neural network implementation and on the optimal size of a processor network (transputers, or multicomputers that communicate in a comparable manner). For fully and randomly connected neural networks, the topology of the processor network can have only a small, constant effect on the iteration time. With randomly connected neural networks, even severely limiting node fan-in has only a negligible effect on reducing the communication overhead. The class of modular neural networks is studied as a separate case, which is shown to have better implementation characteristics. On the basis of implementation constraints, it is argued that randomly connected neural networks cannot be realistic models of the brain.
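The trade-off the abstract describes, where per-processor computation shrinks as processors are added but communication overhead grows, can be illustrated with a minimal sketch. The model below is purely hypothetical and is not the paper's analysis: it assumes computation divides evenly over p processors and communication cost grows linearly in p, with made-up constants, just to show why an optimal processor-network size exists.

```python
# Illustrative sketch only: a toy iteration-time model, not the paper's exact analysis.
# Assumes compute work splits evenly across p processors and communication cost
# grows linearly with p; both constants below are hypothetical.

import math


def iteration_time(p, compute_work=1.0, comm_cost_per_proc=0.01):
    """Modelled time per iteration on p processors (arbitrary units)."""
    return compute_work / p + comm_cost_per_proc * p


def optimal_processors(compute_work=1.0, comm_cost_per_proc=0.01):
    """Processor count minimising the toy model: p* = sqrt(work / comm_cost)."""
    return math.sqrt(compute_work / comm_cost_per_proc)


if __name__ == "__main__":
    print(f"model optimum: ~{optimal_processors():.0f} processors")
    t1 = iteration_time(1)
    for p in (1, 4, 10, 16, 32):
        t = iteration_time(p)
        print(f"p={p:3d}  T={t:.4f}  speedup={t1 / t:.2f}")
```

In this toy setting the speedup saturates and then declines once communication dominates, which is the qualitative behaviour the abstract attributes to fully and randomly connected networks on transputer-like processor networks.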
Keywords :
network topology; neural nets; performance evaluation; transputers; CALM development systems; implementation constraints; iteration time; performance analysis; randomly connected neural networks; artificial neural networks; biological neural networks; brain modeling; central processing unit; Jacobian matrices; neural networks; neurons; psychology
Journal_Title :
IEEE Transactions on Neural Networks