Computable eigenvalue bounds for rank-k perturbations
Related papers
Perturbation Bounds for Eigenvalues and Determinants of Matrices. A Survey
Axioms, 2021
The paper is a survey of the author's recent results on perturbations of matrices. Some of the results presented in the paper are new. In particular, we suggest a bound for the difference of the determinants of two matrices which refines the well-known Bhatia inequality. We also derive new estimates for the spectral variation of a perturbed matrix with respect to a given one, as well as estimates for the Hausdorff and matching distances between the spectra of two matrices. These estimates are formulated in terms of the entries of the matrices and via the so-called departure from normality. In appropriate situations they improve the well-known results. We also suggest a bound for the angular sectors containing the spectra of matrices. In addition, we suggest a new bound for the similarity condition numbers of diagonalizable matrices. The paper also contains a generalization of the famous Kahan inequality on perturbations of Hermitian matrices by non-normal matrices.
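As a small hedged illustration of the kind of determinant perturbation bound the survey refines, the sketch below numerically checks the commonly cited Bhatia-type inequality $|\det A - \det B| \le n \max(\|A\|_2, \|B\|_2)^{n-1} \|A - B\|_2$; the matrix size and the random test matrices are assumptions made only for this example, and the survey's refined bound itself is not reproduced.

```python
# Hedged illustration: numerically check a Bhatia-type determinant perturbation bound
#   |det A - det B| <= n * max(||A||_2, ||B||_2)^(n-1) * ||A - B||_2.
# The random test matrices and the perturbation size are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n))
B = A + 1e-3 * rng.standard_normal((n, n))          # small perturbation of A

lhs = abs(np.linalg.det(A) - np.linalg.det(B))
M = max(np.linalg.norm(A, 2), np.linalg.norm(B, 2))
rhs = n * M**(n - 1) * np.linalg.norm(A - B, 2)

print(f"|det A - det B|   = {lhs:.3e}")
print(f"Bhatia-type bound = {rhs:.3e}")
assert lhs <= rhs
```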
Linear Algebra and its Applications, 2006
For a symmetric positive semi-definite diagonally dominant matrix, if its off-diagonal entries and its diagonally dominant parts for all rows (defined for each row as the diagonal entry minus the sum of the absolute values of the off-diagonal entries in that row) are known to a certain relative accuracy, we show that its eigenvalues are known to the same relative accuracy. Specifically, we prove that if such a matrix is perturbed in a way that each off-diagonal entry and each diagonally dominant part has a relative error bounded by some $\epsilon$, then all its eigenvalues have relative errors bounded by $O(\epsilon)$. The result is extended to the generalized eigenvalue problem.
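A minimal numerical sketch of the claim, assuming a small random symmetric diagonally dominant test matrix: perturb the off-diagonal entries and the diagonally dominant parts by a relative $\epsilon$ and observe that the eigenvalues move by a comparable relative amount. This is only an illustration of the statement, not the paper's proof or its explicit perturbation bound.

```python
# Hedged illustration: reparametrize A by its off-diagonal entries and its diagonally
# dominant parts v_i = a_ii - sum_{j != i} |a_ij|, perturb both by a relative epsilon,
# and compare the relative changes in the eigenvalues. The test matrix and epsilon are
# assumptions chosen for illustration.
import numpy as np

def assemble(offdiag, v):
    """Rebuild A from its off-diagonal part and its diagonally dominant parts v."""
    A = offdiag.copy()
    np.fill_diagonal(A, v + np.sum(np.abs(offdiag), axis=1))
    return A

rng = np.random.default_rng(1)
n = 8
off = -np.abs(rng.standard_normal((n, n)))
off = (off + off.T) / 2
np.fill_diagonal(off, 0.0)
v = np.abs(rng.standard_normal(n))                 # positive diagonally dominant parts

eps = 1e-6
off_p = off * (1 + eps * rng.uniform(-1, 1, size=off.shape))
off_p = (off_p + off_p.T) / 2                      # keep the perturbed matrix symmetric
v_p = v * (1 + eps * rng.uniform(-1, 1, size=n))

lam = np.linalg.eigvalsh(assemble(off, v))
lam_p = np.linalg.eigvalsh(assemble(off_p, v_p))
print("max relative eigenvalue change:", np.max(np.abs(lam_p - lam) / np.abs(lam)))
print("parameter relative error eps  :", eps)
```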
Perturbation Splitting for More Accurate Eigenvalues
SIAM Journal on Matrix Analysis and Applications, 2009
Let T be a symmetric tridiagonal matrix with entries and eigenvalues of different magnitudes. For some T, small entrywise relative perturbations induce small errors in the eigenvalues, independently of the size of the entries of the matrix; this is certainly true when the perturbed matrix can be written as $\widetilde{T} = X^T T X$ with small $\|X^T X - I\|$. Even if it is not possible to express the perturbations in every entry of T in this way, much can be gained by doing so for as many of the larger-magnitude entries as possible. We propose a technique which consists of splitting the perturbations into multiplicative and additive parts to produce new error bounds which, for some matrices, are much sharper than the usual ones. Such bounds may be useful in the development of improved software for the tridiagonal eigenvalue problem, and we describe their role in the context of a mixed precision bisection-like procedure. Using the very same idea of splitting perturbations (multiplicative and additive), we show that when T defines its eigenvalues well, the numerical values of the pivots in the usual decomposition $T - \lambda I = LDL^T$ may be used to compute approximations with high relative precision.
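The sketch below illustrates only the multiplicative ingredient described above, via Ostrowski's classical result: if $\widetilde{T} = X^T T X$, then $|\lambda_k(\widetilde{T}) - \lambda_k(T)| \le \|X^T X - I\|_2\, |\lambda_k(T)|$, a relative bound independent of the sizes of the entries of T. The graded tridiagonal test matrix and the size of $X - I$ are assumptions for illustration; the paper's splitting technique itself is not reproduced.

```python
# Hedged illustration of the multiplicative-perturbation mechanism (Ostrowski's theorem),
# not the paper's splitting technique: relative eigenvalue changes under T~ = X^T T X are
# bounded by ||X^T X - I||_2, regardless of how graded the entries of T are.
# The test matrix and perturbation size are assumptions chosen for illustration only.
import numpy as np

rng = np.random.default_rng(2)
n = 10
d = np.geomspace(1e-3, 1e3, n)                     # graded diagonal entries
e = 0.1 * np.sqrt(d[:-1] * d[1:])                  # off-diagonal couplings
T = np.diag(d) + np.diag(e, 1) + np.diag(e, -1)

X = np.eye(n) + 1e-6 * rng.standard_normal((n, n))  # small multiplicative perturbation
T_tilde = X.T @ T @ X

lam, lam_t = np.linalg.eigvalsh(T), np.linalg.eigvalsh(T_tilde)
print("max relative eigenvalue change:", np.max(np.abs(lam_t - lam) / np.abs(lam)))
print("||X^T X - I||_2               :", np.linalg.norm(X.T @ X - np.eye(n), 2))
```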
Efficient eigenvalue computation for quasiseparable Hermitian matrices under low rank perturbations
Numerical Algorithms, 2008
In this paper we address the problem of efficiently computing all the eigenvalues of a large N × N Hermitian matrix modified by a possibly non-Hermitian perturbation of low rank. Previously proposed fast adaptations of the QR algorithm are considerably simplified by performing a preliminary similarity transformation of the matrix into upper Hessenberg form. The transformed matrix can be specified by a small set of parameters which are easily updated during the QR process. The resulting structured QR iteration can be carried out in linear time using linear memory storage. Moreover, it is proved to be backward stable. Numerical experiments show that the novel algorithm outperforms available implementations of the Hessenberg QR algorithm even for small values of N.
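The structured O(N)-per-iteration algorithm of the paper is not reproduced here; the sketch below only fixes the problem setting (a Hermitian matrix plus a possibly non-Hermitian rank-k term) and computes a dense $O(N^3)$ reference solution of the kind such an algorithm would accelerate. The dimensions and the random model are assumptions for illustration.

```python
# Hedged baseline for the problem class above: eigenvalues of a Hermitian matrix H plus a
# (possibly non-Hermitian) rank-k perturbation U V^T, computed densely in O(N^3).
# This is only a reference computation, not the structured QR iteration of the paper.
import numpy as np

rng = np.random.default_rng(3)
N, k = 200, 2
H = rng.standard_normal((N, N))
H = (H + H.T) / 2                                   # Hermitian (real symmetric) part
U = rng.standard_normal((N, k))
V = rng.standard_normal((N, k))

A = H + U @ V.T                                     # low-rank, non-symmetric perturbation
eigs = np.linalg.eigvals(A)                         # dense reference solver
print("eigenvalue of largest modulus:", eigs[np.argmax(np.abs(eigs))])
```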
Forward stable eigenvalue decomposition of rank-one modifications of diagonal matrices
We present a new algorithm for solving an eigenvalue problem for a real symmetric matrix which is a rank-one modification of a diagonal matrix. The algorithm computes each eigenvalue and all components of the corresponding eigenvector with high relative accuracy in $O(n)$ operations. The algorithm is based on a shift-and-invert approach. Only a single element of the inverse of the shifted matrix eventually needs to be computed with double the working precision. Each eigenvalue and the corresponding eigenvector can be computed separately, which makes the algorithm adaptable for parallel computing. Our results extend to the Hermitian case.
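A minimal sketch of the underlying structure, assuming $\rho > 0$, distinct diagonal entries, and nonzero vector components (so no deflation is needed): the eigenvalues of $D + \rho z z^T$ are the roots of the secular equation $1 + \rho \sum_i z_i^2 / (d_i - \lambda) = 0$, one per gap between consecutive $d_i$ and one above the largest. The plain bisection below only illustrates this interlacing; it is not the paper's forward-stable, high-relative-accuracy shift-and-invert algorithm.

```python
# Hedged sketch (NOT the paper's algorithm): bisection on the secular equation for the
# eigenvalues of diag(d) + rho*z*z^T, assuming rho > 0, distinct d_i, and nonzero z_i.
import numpy as np

def secular_eigs(d, z, rho, tol=1e-14, maxit=200):
    """Eigenvalues of diag(d) + rho*z*z^T (rho > 0) via bisection on the secular equation."""
    idx = np.argsort(d)
    d, z = d[idx], z[idx]
    f = lambda lam: 1.0 + rho * np.sum(z**2 / (d - lam))
    spread = rho * np.sum(z**2)
    # One root per gap (d_i, d_{i+1}), plus one in (d_n, d_n + rho*||z||^2].
    brackets = [(d[i], d[i + 1]) for i in range(len(d) - 1)] + [(d[-1], d[-1] + spread)]
    eigs = []
    for a, b in brackets:
        for _ in range(maxit):
            mid = 0.5 * (a + b)
            if f(mid) > 0.0:           # f is increasing on each bracket
                b = mid
            else:
                a = mid
            if b - a <= tol * max(abs(a), abs(b)):
                break
        eigs.append(0.5 * (a + b))
    return np.array(eigs)

rng = np.random.default_rng(4)
d = rng.standard_normal(6)
z = rng.standard_normal(6)
rho = 0.7
approx = secular_eigs(d, z, rho)
exact = np.linalg.eigvalsh(np.diag(d) + rho * np.outer(z, z))   # dense reference
print("max deviation from dense solver:", np.max(np.abs(approx - exact)))
```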
A Schatten-q Matrix Perturbation Theory via Perturbation Projection Error Bound
ArXiv, 2020
This paper studies the Schatten-$q$ error of low-rank matrix estimation by singular value decomposition under perturbation. Specifically, we establish a tight perturbation bound on the low-rank matrix estimation via a perturbation projection error bound. This new proof technique has provable advantages over the classic approaches. Then, we establish lower bounds to justify the tightness of the upper bound on the low-rank matrix estimation error. Based on the matrix perturbation projection error bound, we further develop a unilateral and user-friendly $\sin\Theta$ bound for singular subspace perturbation. Finally, we demonstrate the advantage of our results over the ones in the literature by simulation.
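The helper below only computes the quantities such bounds are stated in terms of: a rank-$r$ SVD estimate of a perturbed matrix, its Schatten-$q$ estimation error, and a $\sin\Theta$ distance between the true and estimated singular subspaces. The dimensions, rank, and noise level are assumptions for illustration; the paper's upper and lower bounds themselves are not reproduced.

```python
# Hedged helper: Schatten-q error of a rank-r SVD estimate and the largest principal-angle
# sine between the true and estimated left singular subspaces. Sizes and noise level are
# assumptions chosen for illustration only.
import numpy as np

def schatten_norm(M, q):
    s = np.linalg.svd(M, compute_uv=False)
    return np.sum(s**q) ** (1.0 / q)

def sin_theta(U, U_hat):
    """Largest principal-angle sine between column spaces of orthonormal U and U_hat."""
    s = np.linalg.svd(U.T @ U_hat, compute_uv=False)
    return np.sqrt(max(0.0, 1.0 - np.min(s) ** 2))

rng = np.random.default_rng(5)
p1, p2, r = 60, 40, 3
X = rng.standard_normal((p1, r)) @ rng.standard_normal((r, p2))   # true rank-r matrix
Z = 0.05 * rng.standard_normal((p1, p2))                           # perturbation

U, s, Vt = np.linalg.svd(X + Z)
X_hat = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]                      # rank-r SVD estimate
U_true = np.linalg.svd(X)[0][:, :r]

print("Schatten-2 estimation error:", schatten_norm(X_hat - X, 2))
print("sin(Theta) of left subspace:", sin_theta(U_true, U[:, :r]))
```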
ETNA - Electronic Transactions on Numerical Analysis
In this paper, we derive backward error formulas of two approximate eigenpairs of a semisimple eigenvalue with multiplicity two for structured and unstructured matrix pencils. We also construct the minimal structured perturbations with respect to the Frobenius norm such that these approximate eigenpairs become exact eigenpairs of an appropriately perturbed matrix pencil. The structures we consider include T-symmetric/T-skewsymmetric, Hermitian/skew-Hermitian, T-even/T-odd, and H-even/H-odd matrix pencils. Further, we establish various relationships between the backward error of a single approximate eigenpair and the backward error of two approximate eigenpairs of a semisimple eigenvalue with multiplicity two.
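As a point of reference for the unstructured case only, the sketch below evaluates one standard normwise backward error formula for a single approximate eigenpair $(x, \lambda)$ of a pencil $(A, B)$, namely $\eta = \|Ax - \lambda Bx\|_2 / ((\|A\|_F + |\lambda| \|B\|_F) \|x\|_2)$. The structured and two-eigenpair backward errors derived in the paper are not reproduced, and the random pencil is an assumption for illustration.

```python
# Hedged sketch of a standard unstructured normwise backward error for an approximate
# eigenpair of a pencil (A, B); the structured versions in the paper are not reproduced.
import numpy as np

def pencil_backward_error(A, B, lam, x):
    """Normwise backward error of the approximate eigenpair (x, lam) of the pencil (A, B)."""
    r = A @ x - lam * (B @ x)
    scale = (np.linalg.norm(A, "fro") + abs(lam) * np.linalg.norm(B, "fro")) * np.linalg.norm(x)
    return np.linalg.norm(r) / scale

rng = np.random.default_rng(6)
n = 5
A, B = rng.standard_normal((n, n)), rng.standard_normal((n, n))
lam_all, V = np.linalg.eig(np.linalg.solve(B, A))    # eigenpairs of B^{-1} A for the pencil (A, B)
lam, x = lam_all[0], V[:, 0]

print("backward error of computed eigenpair :", pencil_backward_error(A, B, lam, x))
print("backward error after perturbing lam  :", pencil_backward_error(A, B, lam * (1 + 1e-6), x))
```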
Sensitivity and Accuracy of Eigenvalues Relative to Their Perturbation
The main objective of this paper is to study the sensitivity of eigenvalues in their computational domain under perturbations, to provide solid intuition with numerical examples, and to represent them graphically. The sensitivity of the eigenvalues, estimated by the condition number of the matrix of eigenvectors, is discussed with numerical examples. We also demonstrate other approaches that impose some structure on the complex eigenvalues, how this structure affects the perturbed eigenvalues, and what kind of paths they follow in the complex plane.
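A hedged numerical illustration of the sensitivity measure mentioned above: by the Bauer–Fike theorem, for a diagonalizable matrix $A = V \Lambda V^{-1}$ every eigenvalue of $A + E$ lies within $\kappa_2(V)\,\|E\|_2$ of some eigenvalue of $A$, so the condition number of the eigenvector matrix governs the sensitivity. The test matrix and perturbation size below are assumptions for illustration.

```python
# Hedged illustration of the Bauer-Fike bound: eigenvalue displacements of A + E are at most
# kappa_2(V) * ||E||_2, where V is the eigenvector matrix of the diagonalizable matrix A.
# The test matrix and perturbation size are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(7)
n = 6
A = rng.standard_normal((n, n)) + np.diag(np.arange(n) * 5.0)   # generic diagonalizable matrix
E = 1e-6 * rng.standard_normal((n, n))

lam, V = np.linalg.eig(A)
mu = np.linalg.eigvals(A + E)

worst = max(np.min(np.abs(lam - m)) for m in mu)   # largest distance to the nearest eigenvalue of A
bound = np.linalg.cond(V, 2) * np.linalg.norm(E, 2)
print(f"max eigenvalue displacement: {worst:.3e}")
print(f"Bauer-Fike bound           : {bound:.3e}")
```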