
    A geometric convergence theory for the preconditioned steepest descent iteration

    Preconditioned gradient iterations for very large eigenvalue problems are efficient solvers with growing popularity. However, sharp non-asymptotic convergence estimates are known only for the simplest preconditioned eigensolver, namely the preconditioned gradient iteration (or preconditioned inverse iteration) with fixed step size, and these estimates require an ideally scaled preconditioner. In this paper a new sharp convergence estimate is derived for the preconditioned steepest descent iteration, which combines the preconditioned gradient iteration with the Rayleigh-Ritz procedure for optimal line-search convergence acceleration. The new estimate always improves on that of the fixed-step-size iteration. The practical importance of the new estimate is that arbitrarily scaled preconditioners can be used; the Rayleigh-Ritz procedure implicitly computes the optimal scaling. (Comment: 17 pages, 6 figures)
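    The iteration described in this abstract can be sketched in a few lines. The following is a minimal illustrative implementation, not code from the paper; the function names, the toy matrix, and the (deliberately mis-scaled) Jacobi preconditioner are our own assumptions. Each step preconditions the eigenvalue residual and then performs Rayleigh-Ritz on the two-dimensional space spanned by the iterate and the preconditioned residual, which realizes the optimal line search.

```python
import numpy as np

def psd_step(A, T_solve, x):
    """One preconditioned steepest descent step with Rayleigh-Ritz
    line search on the two-dimensional space span{x, w}."""
    x = x / np.linalg.norm(x)
    rho = x @ A @ x                          # Rayleigh quotient
    r = A @ x - rho * x                      # eigenvalue residual
    w = T_solve(r)                           # preconditioned residual
    V, _ = np.linalg.qr(np.column_stack([x, w]))
    theta, Y = np.linalg.eigh(V.T @ A @ V)   # 2x2 Ritz problem
    return V @ Y[:, 0], theta[0]             # Ritz pair for the smallest Ritz value

# Toy problem; the preconditioner is deliberately not ideally scaled,
# since the Rayleigh-Ritz step makes the iteration scaling-invariant.
A = np.diag([1.0, 3.0, 5.0, 10.0])
T_solve = lambda r: 0.2 * r / np.diag(A)     # arbitrarily scaled Jacobi preconditioner
x = np.ones(4)
for _ in range(30):
    x, rho = psd_step(A, T_solve, x)
print(round(rho, 8))                         # approaches the smallest eigenvalue 1.0
```

    Note that multiplying the preconditioned residual by 0.2 changes nothing: span{x, w} is unchanged, which is exactly the point about arbitrarily scaled preconditioners.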

    Angle-free cluster robust Ritz value bounds for restarted block eigensolvers

    Convergence rates of block iterations for solving eigenvalue problems typically measure the errors of Ritz values approximating eigenvalues. These errors are commonly bounded in terms of principal angles between the initial or iterative subspace and the invariant subspace associated with the target eigenvalues. Such bounds cannot be applied repeatedly, as needed for restarted block eigensolvers, since their left- and right-hand sides use different terms; they must be combined with additional bounds, which can cause an overestimation. Alternative repeatable bounds that are angle-free and depend only on the errors of the Ritz values were pioneered for Hermitian eigenvalue problems in doi:10.1515/rnam.1987.2.5.371, but only for a single extreme Ritz value. We extend this result to all Ritz values and achieve robustness for clustered eigenvalues by utilizing nonconsecutive eigenvalues. Our new bounds cover the restarted block Lanczos method and its modifications with shift-and-invert and deflation, and are numerically advantageous. (Comment: 24 pages, 4 figures)
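    To make the restarting scenario concrete, here is a hypothetical sketch (not from the paper) of a restarted block iteration: each cycle performs Rayleigh-Ritz on the block Krylov space span{X, AX} and restarts with the Ritz vectors of the smallest Ritz values. The Ritz-value errors decrease monotonically across restarts, which is precisely the kind of repeatable quantity that angle-free bounds can control; the matrix and block size are made-up test data.

```python
import numpy as np

def rayleigh_ritz(A, V):
    """Ritz values and Ritz vectors of A with respect to range(V)."""
    Q, _ = np.linalg.qr(V)
    theta, Y = np.linalg.eigh(Q.T @ A @ Q)
    return theta, Q @ Y

rng = np.random.default_rng(0)
n, k = 50, 3
A = np.diag(np.arange(1.0, n + 1.0))   # eigenvalues 1, 2, ..., 50
X = rng.standard_normal((n, k))        # random initial block

for cycle in range(200):
    theta, Z = rayleigh_ritz(A, np.hstack([X, A @ X]))
    X = Z[:, :k]                       # restart with the k smallest Ritz vectors

print(np.round(theta[:k], 4))          # approaches the eigenvalues 1, 2, 3
```

    Because each restart discards the principal-angle information of the previous cycle, only the Ritz values themselves survive from cycle to cycle, which is why repeatable bounds must be stated in terms of Ritz-value errors alone.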

    On the signal contribution function with respect to different norms

    The signal contribution function (SCF) in multivariate curve resolution evaluates the signal portions of specific components, either in absolute form or in relative form with respect to the integrated signal of all components. In 1999, Gemperline used the summed signal data, and in 2001, Tauler worked with the square-summed relative signal, in order to determine the profiles that minimize and maximize, respectively, the signal contribution. These profiles approximate the bands of all feasible profiles. Here, Gemperline's approach using the entrywise 1-matrix norm is proved to provide accurate bounds for two-component systems. This revives the approach of summed mass or absorption values, with its potentially better chemical interpretability.
    Authors: Klaus Neymeyr (Universität Rostock and Leibniz-Institut für Katalyse, Germany), Mathias Sawall (Universität Rostock, Germany), Alejandro Cesar Olivieri (Instituto de Química Rosario, CONICET - Universidad Nacional de Rosario, Argentina)
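    The two contribution measures contrasted in the abstract can be illustrated numerically. The sketch below is a hypothetical example with simulated profiles (all data made up): for a two-component bilinear model D = C S^T, it computes each component's relative contribution once with the entrywise 1-matrix norm (Gemperline-style summed absolute signal) and once with the squared Frobenius-type norm (Tauler-style square-summed signal), each normalized by the total over all components.

```python
import numpy as np

# Simulated two-component bilinear data D = C @ S.T (illustrative only)
t = np.linspace(0, 1, 200)
C = np.column_stack([np.exp(-5 * t), 1 - np.exp(-5 * t)])   # concentration profiles
x = np.linspace(0, 1, 100)
S = np.column_stack([np.exp(-((x - 0.3) / 0.1) ** 2),       # spectral profiles
                     np.exp(-((x - 0.7) / 0.1) ** 2)])

def scf(C, S, norm):
    """Relative signal contribution of each component under the given norm,
    normalized by the total over all components."""
    parts = [norm(np.outer(C[:, k], S[:, k])) for k in range(C.shape[1])]
    return np.array(parts) / sum(parts)

one_norm = lambda M: np.abs(M).sum()   # entrywise 1-matrix norm (summed signal)
sq_norm = lambda M: (M ** 2).sum()     # square-summed signal
scf_1 = scf(C, S, one_norm)
scf_sq = scf(C, S, sq_norm)
print(scf_1, scf_sq)                   # both vectors sum to 1 by construction
```

    The two norms generally weight the components differently, which is why the choice of norm matters for the resulting feasible-band approximations.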

    Convergence analysis of a block preconditioned steepest descent eigensolver with implicit deflation

    Gradient-type iterative methods for solving Hermitian eigenvalue problems can be accelerated by preconditioning and deflation techniques. A preconditioned steepest descent iteration with implicit deflation (PSD-id) is one such method. The convergence behavior of the PSD-id was recently investigated based on the pioneering work of Samokish on the preconditioned steepest descent method (PSD). The resulting non-asymptotic estimates indicate superlinear convergence of the PSD-id under strong assumptions on the initial guess. The present paper utilizes an alternative convergence analysis of the PSD by Neymeyr under much weaker assumptions. We embed Neymeyr's approach into the analysis of the PSD-id using a restricted formulation of the PSD-id. More importantly, we extend the new convergence analysis to a practically preferred block version of the PSD-id, the BPSD-id, and show the cluster robustness of the BPSD-id. Numerical examples are provided to validate the theoretical estimates. (Comment: 26 pages, 10 figures)
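    The implicit-deflation idea can be sketched as follows (a hypothetical illustration, not the paper's code): once the first eigenvector has converged, the steepest descent iteration for the next eigenpair simply keeps the converged vector inside the Rayleigh-Ritz trial space, rather than projecting it out explicitly, and selects the Ritz pair for the second-smallest Ritz value. All names, the toy matrix, and the Jacobi preconditioner are our own assumptions.

```python
import numpy as np

def psd_id_step(A, T_solve, x1, x):
    """One deflated PSD step: Rayleigh-Ritz on span{x1, x, w}, returning
    the Ritz pair for the second-smallest Ritz value (deflation against x1)."""
    rho = (x @ A @ x) / (x @ x)                   # Rayleigh quotient
    w = T_solve(A @ x - rho * x)                  # preconditioned residual
    V, _ = np.linalg.qr(np.column_stack([x1, x, w]))
    theta, Y = np.linalg.eigh(V.T @ A @ V)        # 3x3 Ritz problem
    return V @ Y[:, 1], theta[1]

A = np.diag([1.0, 2.0, 4.0, 8.0])
x1 = np.array([1.0, 0.0, 0.0, 0.0])               # converged first eigenvector
T_solve = lambda r: r / np.diag(A)                # Jacobi preconditioner
x = np.array([0.0, 1.0, 1.0, 1.0])                # initial guess for the second pair
for _ in range(30):
    x, rho = psd_id_step(A, T_solve, x1, x)
print(round(rho, 6))                              # approaches the second eigenvalue 2.0
```

    Keeping x1 in the trial space makes the deflation implicit: the Rayleigh-Ritz procedure automatically separates the converged eigendirection from the current search space.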

    Gradient flow approach to geometric convergence analysis of preconditioned eigensolvers

    Preconditioned eigenvalue solvers (eigensolvers) are gaining popularity, but their convergence theory remains sparse and complex. We consider the simplest preconditioned eigensolver, the gradient iterative method with a fixed step size, for symmetric generalized eigenvalue problems, where the gradient of the Rayleigh quotient is used as the optimization direction. A sharp convergence rate bound for this method was obtained in 2001-2003; it remains the only known bound of its kind for any method in this class. While the bound is short and simple, its proof is not. We extend the bound to Hermitian matrices in the complex space and present a new self-contained and significantly shorter proof using novel geometric ideas. (Comment: 8 pages, 2 figures. Accepted to SIAM J. Matrix Anal. (SIMAX).)
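    For reference, the fixed-step-size method analyzed in this abstract is short enough to state in full. The following is a minimal sketch under our own assumptions (toy matrices, Jacobi preconditioner, step size tau = 1): each step moves along the preconditioned gradient of the Rayleigh quotient of the pencil (A, B) and renormalizes in the B-inner product.

```python
import numpy as np

def pinvit(A, B, T_solve, x, tau=1.0, iters=100):
    """Preconditioned gradient iteration with fixed step size tau for
    the generalized symmetric eigenvalue problem A x = lambda B x."""
    for _ in range(iters):
        rho = (x @ A @ x) / (x @ B @ x)              # Rayleigh quotient
        x = x - tau * T_solve(A @ x - rho * B @ x)   # fixed-step gradient update
        x = x / np.sqrt(x @ B @ x)                   # B-normalization
    return rho

A = np.diag([1.0, 4.0, 9.0, 16.0])
B = np.eye(4)
T_solve = lambda r: r / np.diag(A)                   # Jacobi preconditioner
rho = pinvit(A, B, T_solve, np.ones(4))
print(round(rho, 6))                                 # approaches the smallest eigenvalue 1.0
```

    In contrast to the steepest descent variant above, the step size tau is fixed here, which is exactly why the known sharp bounds for this method require an appropriately scaled preconditioner.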

    Convergence Analysis of Gradient Iterations for the Symmetric Eigenvalue Problem
