Title :
Training of an Artificial Neural Network with the Backpropagation Algorithm Using the Notification Oriented Paradigm
Author :
Fernando Schütz;João A. Fabro;Carlos R. E. Lima;Adriano F. Ronszcka;Paulo C. Stadzisz;Jean M. Simão
Author_Institution :
Graduate School in Electrical Engineering and Industrial Informatics (CPGEI), Federal University of Technology - Paraná (UTFPR), Av. Sete de Setembro, 3165, Curitiba-PR, Brazil, 80230-901
Abstract :
When Artificial Neural Networks (ANNs) are implemented in imperative programming languages, the resulting programs are usually highly coupled. This coupling hampers distribution over multiple processors, especially when the ANN executes on general-purpose processors. An emerging technique called the Notification Oriented Paradigm (NOP) facilitates the development of distributed and decoupled systems and reduces the amount of processing by avoiding structural and temporal redundancies in logical-causal evaluation. These advantages seem relevant to parallel systems, and to ANNs in particular. In this sense, this paper presents the development of a Multi-Layer Perceptron trained with the Backpropagation algorithm and implemented using NOP concepts. The overall performance of the NOP implementation was inferior to that of the Imperative Paradigm (IP) implementation, because the current materialization of NOP (its framework) is still single-threaded. Even so, the implementation proved to be viable and decoupled, as well as parallelizable according to the inherent parallelism of ANNs, which is conceptually intrinsic to NOP.
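To make the notification-driven evaluation style described above concrete, the following is a minimal, hypothetical C++ sketch; it is not the authors' NOP framework nor their Multi-Layer Perceptron implementation. It only illustrates the general idea the abstract describes: input "attributes" notify dependent neurons when their values change, so a neuron recomputes its output only on actual change, and neurons remain decoupled from one another.

```cpp
// Hypothetical sketch of notification-driven neuron evaluation (not the NOP framework).
#include <cmath>
#include <cstdio>
#include <functional>
#include <vector>

// An Attribute holds a value and notifies registered listeners when it changes.
struct Attribute {
    double value = 0.0;
    std::vector<std::function<void()>> listeners;

    void set(double v) {
        if (v != value) {                      // notify only on actual change (avoids temporal redundancy)
            value = v;
            for (auto& cb : listeners) cb();
        }
    }
};

// A Neuron subscribes to its input Attributes and exposes its own output Attribute,
// so downstream neurons chain notifications without being directly coupled to it.
struct Neuron {
    std::vector<Attribute*> inputs;
    std::vector<double> weights;
    double bias = 0.0;
    Attribute output;

    void connect(Attribute* in, double w) {
        inputs.push_back(in);
        weights.push_back(w);
        in->listeners.push_back([this] { recompute(); });
    }

    void recompute() {
        double net = bias;
        for (size_t i = 0; i < inputs.size(); ++i)
            net += weights[i] * inputs[i]->value;
        output.set(1.0 / (1.0 + std::exp(-net)));   // sigmoid activation
    }
};

int main() {
    Attribute x1, x2;
    Neuron hidden;
    hidden.connect(&x1, 0.5);
    hidden.connect(&x2, -0.3);

    Neuron out;
    out.connect(&hidden.output, 1.2);

    // Setting an input propagates notifications through the decoupled network.
    x1.set(1.0);
    x2.set(0.8);
    std::printf("output = %f\n", out.output.value);
    return 0;
}
```

Because each neuron reacts only to notifications from its own inputs, independent neurons could in principle be evaluated in parallel, which is the parallelization opportunity the abstract attributes to NOP; weight values, names, and the single-hidden-neuron topology here are illustrative assumptions only.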
Keywords :
"Artificial neural networks","Training","Neurons","Computer architecture","Hardware","Program processors"
Conference_Titel :
2015 Latin America Congress on Computational Intelligence (LA-CCI)
DOI :
10.1109/LA-CCI.2015.7435978