Abstract:
Many multivariate statistical procedures, such as principal component, canonical correlation, correspondence, and discriminant analysis, are based on the solution of a certain matrix approximation problem, one of the most important being the reduced-rank approximation. Its solution hinges on the singular value decomposition or the eigendecomposition of a certain matrix. Numerical techniques for determining eigenvalues and eigenvectors therefore play an important role in these statistical applications. Furthermore, various modifications, generalizations, or refinements of the classical methods lead to some form of nonlinear eigenproblem with no immediately apparent unique solution. Nevertheless, the nonlinear eigenvector approach has proved quite effective in solving such constrained optimization problems in diverse fields of multivariate data analysis. Since a general analysis of the properties and performance of the nonlinear eigenvector algorithm seems long overdue, this paper collects the features common to the applications mentioned above and gives an overview from a higher-level perspective. It unifies the treatment by contrasting the nonlinear eigenvector algorithm with the well-known inverse iteration, iteratively reweighted least squares, and majorization algorithms; extracts the mathematical tools, drawn mainly from convex analysis and matrix algebra, that are needed for the convergence proofs; and discusses the resulting convergence properties.
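As a point of reference, the reduced-rank approximation problem mentioned above admits a standard formulation; the notation here is illustrative and not drawn from the paper itself. Given a data matrix $X \in \mathbb{R}^{n \times p}$ and a target rank $r$, one seeks
\[
  \widehat{X} \;=\; \operatorname*{arg\,min}_{\operatorname{rank}(B) \,\le\, r} \, \lVert X - B \rVert_F^2 ,
\]
and by the Eckart–Young theorem the minimizer is the truncated singular value decomposition $\widehat{X} = \sum_{i=1}^{r} \sigma_i \, u_i v_i^{\top}$, where $X = U \Sigma V^{\top}$ and $\sigma_1 \ge \sigma_2 \ge \cdots$ are the singular values of $X$. This is the sense in which the solution hinges on the singular value or eigendecomposition of a certain matrix.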