Title of article :
Competition-based neural network for the multiple travelling salesmen problem with minmax objective
Author/Authors :
Samerkae Somhom, Abdolhamid Modares, Takao Enkawa
Issue Information :
Biweekly, consecutively numbered issues; year 1999
Abstract :
In this paper we present the neural network model known as the mixture-of-experts (MOE) and evaluate its accuracy and robustness. We do this by comparing the classification accuracy of MOE, the backpropagation neural network (BPN), Fisher’s discriminant analysis, logistic regression, k-nearest-neighbor, and kernel density estimation on five real-world two-group data sets. Our results lead to three major conclusions: (1) the MOE network architecture is more accurate than BPN; (2) MOE tends to be more accurate than the parametric and non-parametric methods investigated; (3) MOE is a far more robust classifier than the other methods for the two-group problem.
Keywords :
Multiple travelling salesmen problem , Competition-based neural network , Optimization
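For context, the minmax objective named in the title can be stated in its standard form. The symbols below (R_k for the k-th salesman's route and L for route length) are generic notation for illustration, not taken from the paper itself:

```latex
% Minmax mTSP: partition the cities into m routes R_1,...,R_m, all starting
% and ending at a common depot, and minimize the length of the longest route:
\min_{\{R_1,\dots,R_m\}} \; \max_{k=1,\dots,m} L(R_k),
\qquad L(R_k) = \sum_{(i,j)\in R_k} d_{ij}
```

Here d_{ij} is the travel cost between cities i and j; minimizing the maximum route length balances the workload across salesmen, in contrast to the minsum variant, which minimizes total length.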
Journal title :
Computers and Operations Research