
Geometric optimization methods for adaptive filtering

Posted on:1994-11-06
Degree:Ph.D.
Type:Thesis
University:Harvard University
Candidate:Smith, Steven Thomas
Full Text:PDF
GTID:2478390014992380
Subject:Mathematics
Abstract/Summary:
The techniques and analysis presented in this thesis provide new methods to solve optimization problems posed on Riemannian manifolds. These methods are applied to the subspace tracking problem found in adaptive signal processing and adaptive control. A new point of view is offered for the constrained optimization problem. Some classical optimization techniques on Euclidean space are generalized to Riemannian manifolds. Several algorithms are presented and their convergence properties are analyzed employing the Riemannian structure of the manifold. Specifically, two new algorithms, which can be thought of as Newton's method and the conjugate gradient method on Riemannian manifolds, are presented and shown to possess quadratic and superlinear convergence, respectively. These methods are applied to several eigenvalue and singular value problems, which are posed as constrained optimization problems. New efficient algorithms for the eigenvalue problem are obtained by exploiting the special homogeneous space structure of the constraint manifold. It is shown that Newton's method applied to the Rayleigh quotient on a sphere converges cubically, and that the Rayleigh quotient iteration is an efficient approximation of Newton's method. The Riemannian version of the conjugate gradient method applied to this function gives a new algorithm for finding the eigenvectors corresponding to the extreme eigenvalues of a symmetric matrix. The Riemannian version of the conjugate gradient method applied to a generalized Rayleigh quotient yields a superlinearly convergent algorithm for computing the k eigenvectors corresponding to the extreme eigenvalues of an n-by-n matrix. This algorithm requires O(nk^2) operations and O(k) matrix-vector multiplications per step. Several gradient flows are analyzed that solve eigenvalue and singular value problems. The new optimization algorithms are applied to the subspace tracking problem of adaptive signal processing. A new algorithm for subspace tracking is given, which is based upon the conjugate gradient method applied to the generalized Rayleigh quotient. The results of several numerical experiments demonstrating the convergence properties of the new algorithms are given.
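The sketch below is not taken from the thesis; it is a minimal NumPy illustration of the classical Rayleigh quotient iteration that the abstract describes as an efficient approximation of Newton's method on the sphere, shown here only to make the cubic-convergence claim concrete. The function name, tolerance, and test matrix are hypothetical choices of this illustration.

```python
# Illustrative sketch (not code from the thesis): classical Rayleigh quotient
# iteration for a symmetric matrix A. Near an eigenvector the iteration
# converges cubically, which is the behavior the abstract attributes to
# Newton's method applied to the Rayleigh quotient on the sphere.
import numpy as np

def rayleigh_quotient_iteration(A, x0, tol=1e-12, max_iter=50):
    """Return an approximate eigenpair (lambda, x) of the symmetric matrix A."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(max_iter):
        mu = x @ A @ x                      # Rayleigh quotient rho(x) = x^T A x
        try:
            # One step: solve the shifted system and renormalize onto the sphere.
            y = np.linalg.solve(A - mu * np.eye(A.shape[0]), x)
        except np.linalg.LinAlgError:
            # The shift matches an eigenvalue to machine precision: converged.
            break
        x_new = y / np.linalg.norm(y)
        # Eigenvectors are defined up to sign, so compare against both signs.
        if min(np.linalg.norm(x_new - x), np.linalg.norm(x_new + x)) < tol:
            x = x_new
            break
        x = x_new
    return x @ A @ x, x

# Usage example on a random symmetric matrix.
rng = np.random.default_rng(0)
B = rng.standard_normal((6, 6))
A = (B + B.T) / 2
lam, v = rayleigh_quotient_iteration(A, rng.standard_normal(6))
```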
Keywords/Search Tags:Riemannian manifolds, Optimization, Newton's method, Conjugate gradient method, Rayleigh quotient, Eigenvalue problems, Subspace tracking, Adaptive filtering