• DocumentCode
    396695
  • Title
    Distribution approximation, combinatorial optimization, and Lagrange-Barrier
  • Author
    Xu, Lei
  • Author_Institution
    Dept. of Comput. Sci. & Eng., Chinese Univ. of Hong Kong, China
  • Volume
    3
  • fYear
    2003
  • fDate
    20-24 July 2003
  • Firstpage
    2354
  • Abstract
    In this paper, typical analog combinatorial optimization approaches, such as the Hopfield net, the Hopfield-Lagrange net, the maximum entropy approach, and the Lagrange-Barrier approach, are systematically examined from the perspective of learning a distribution. The minimization of a combinatorial cost is turned into a procedure of learning a simple distribution that approximates the Gibbs distribution induced by this cost, such that both distributions share the same global peak. From this new perspective, a general guideline is obtained for developing analog combinatorial optimization approaches. Moreover, the Lagrange-Barrier iterative procedure proposed in Xu (1994) is further elaborated, with guaranteed convergence to a feasible solution that satisfies the constraints.
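    The abstract's core idea, learning a simple distribution whose global peak coincides with that of the Gibbs distribution induced by the cost, can be illustrated with a minimal mean-field sketch. The quadratic binary cost E(x) = x'Wx + b'x, the random instance, and the coordinate-wise sigmoid update below are assumptions made for illustration only and are not the paper's Lagrange-Barrier procedure.

    # Illustrative mean-field approximation of the Gibbs distribution induced by a
    # small combinatorial cost (assumed example; not the paper's exact algorithm).
    import itertools
    import numpy as np

    def cost(x, W, b):
        # Quadratic pseudo-Boolean cost E(x) = x^T W x + b^T x over x in {0,1}^n.
        return x @ W @ x + b @ x

    # Tiny random instance over n = 4 binary variables (small enough to enumerate).
    rng = np.random.default_rng(0)
    n = 4
    W = rng.normal(size=(n, n))
    W = (W + W.T) / 2.0
    b = rng.normal(size=n)
    beta = 5.0  # inverse temperature; larger beta sharpens the Gibbs peak

    # Exact Gibbs distribution p(x) proportional to exp(-beta * E(x)), by enumeration.
    configs = np.array(list(itertools.product([0, 1], repeat=n)))
    energies = np.array([cost(x, W, b) for x in configs])
    p = np.exp(-beta * (energies - energies.min()))
    p /= p.sum()

    # Simple factorized distribution q(x) = prod_i m_i^{x_i} (1 - m_i)^{1 - x_i},
    # fitted by the standard coordinate-wise mean-field fixed-point iteration.
    m = np.full(n, 0.5)
    for _ in range(200):
        for i in range(n):
            # Energy change of setting x_i = 1 vs x_i = 0, with the other
            # variables replaced by their current mean values.
            dE = 2.0 * W[i] @ m - 2.0 * W[i, i] * m[i] + W[i, i] + b[i]
            m[i] = 1.0 / (1.0 + np.exp(beta * dE))

    # On small or easy instances the two peaks coincide; in general the factorized
    # q is only an approximation of the Gibbs distribution.
    print("Gibbs argmax:        ", configs[p.argmax()])
    print("Mean-field rounding: ", (m > 0.5).astype(int))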
  • Keywords
    Hopfield neural nets; approximation theory; iterative methods; learning (artificial intelligence); maximum entropy methods; minimisation; Gibbs distribution; Hopfield-Lagrange net; Lagrange-Barrier iterative methods; analog combinatorial optimization; combinatorial cost; distribution approximation; learning distribution; Artificial neural networks; Computer science; Costs; Councils; Entropy; Equations; Guidelines; Lagrangian functions; Traveling salesman problems
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the International Joint Conference on Neural Networks, 2003
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-7898-9
  • Type
    conf
  • DOI
    10.1109/IJCNN.2003.1223780
  • Filename
    1223780