Title : 
Optimizing an algebraic perceptron solution
            Author : 
Hanselmann, Thomas; Noakes, Lyle
            Author_Institution : 
Dept. of Electr. & Electron. Eng., Univ. of Western Australia, Nedlands, WA, Australia
            Abstract : 
This paper investigates ways of optimizing a solution obtained by the perceptron algorithm. The perceptron algorithm has been successfully applied to polynomial inner-product kernels equivalent to those of support vector machines (SVMs). The algebraic perceptron can therefore achieve a polynomial separation in the input data space by performing a linear separation in a high-dimensional feature space. Unlike the separating hyperplane of an SVM, a solution found by the algebraic perceptron is, in general, nonoptimal. We improve an algebraic perceptron solution by mapping the data points onto a sphere and then applying a standard C-SVM in the form of the Adatron or of Vijayakumar's (1999) SVM-seq. As a consequence, there is no equality constraint in the C-SVM problem.
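The sketch below (not the authors' code) illustrates the pipeline the abstract describes: a dual-form kernel perceptron trained with a polynomial inner-product kernel, whose kernel matrix is then normalized so that feature vectors lie on the unit sphere, followed by a kernel-Adatron refinement toward the C-SVM maximum-margin solution. All names and parameter values (poly_kernel, sphere_normalize, kernel_perceptron, kernel_adatron, degree, C, eta) are illustrative assumptions, not from the paper.

import numpy as np

def poly_kernel(X, Z, degree=2, c=1.0):
    # Polynomial inner-product kernel K(x, z) = (x . z + c)^degree.
    return (X @ Z.T + c) ** degree

def sphere_normalize(K):
    # Map feature vectors onto the unit sphere:
    # K'(x, z) = K(x, z) / sqrt(K(x, x) * K(z, z)); assumes K is square.
    d = np.sqrt(np.diag(K))
    return K / np.outer(d, d)

def kernel_perceptron(K, y, epochs=100):
    # Dual (kernel) perceptron: on a mistake at point i, alpha_i += 1.
    alpha = np.zeros(len(y))
    for _ in range(epochs):
        mistakes = 0
        for i in range(len(y)):
            if y[i] * (K[i] @ (alpha * y)) <= 0:
                alpha[i] += 1
                mistakes += 1
        if mistakes == 0:  # separable in the feature space: done
            break
    return alpha

def kernel_adatron(K, y, alpha, C=10.0, eta=0.1, epochs=500):
    # Kernel-Adatron-style refinement toward the C-SVM solution.
    # With no bias term there is no equality constraint sum_i alpha_i y_i = 0,
    # so each alpha_i is updated independently and clipped to [0, C].
    for _ in range(epochs):
        for i in range(len(y)):
            margin = y[i] * (K[i] @ (alpha * y))
            alpha[i] = np.clip(alpha[i] + eta * (1.0 - margin), 0.0, C)
    return alpha

# Usage: an XOR-like problem that needs a degree-2 polynomial separation.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.where(X[:, 0] * X[:, 1] > 0, 1.0, -1.0)

K = sphere_normalize(poly_kernel(X, X))  # data mapped onto the sphere
alpha = kernel_perceptron(K, y)          # algebraic-perceptron starting point
alpha = kernel_adatron(K, y, alpha)      # refine toward the maximum margin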
            Keywords : 
algebra; learning automata; optimisation; perceptrons; Adatron; C-SVM; SVM-seq; algebraic perceptron solution optimization; high-dimensional feature space; linear separation; nonoptimal solution; polynomial inner-product kernels; polynomial separation; separating hyperplane; support vector machines; Information processing; Intelligent systems; Kernel; Large-scale systems; Mathematics; Polynomials; Quadratic programming; Software packages; Statistics; Support vector machines
Conference_Title : 
Neural Networks, 2001. Proceedings. IJCNN '01. International Joint Conference on
            Conference_Location : 
Washington, DC
            Print_ISBN : 
0-7803-7044-9
            DOI : 
10.1109/IJCNN.2001.939579