You need to know: Probability (denoted by P), mean, variance, normal distribution, the abbreviation i.i.d. for independent identically distributed random variables, Euclidean space $\mathbb{R}^n$, the notation $\|x\|$ for the length of a vector $x$, matrix, matrix multiplication, inverse matrix, and the norm $\|A\|$ of a matrix $A$.
Background: A random variable $X$ is called subgaussian if there exist constants $B$ and $b$ such that $P(|X|>t) \le B e^{-bt^2}$ for all $t>0$. For an integer $n \ge 1$, let $A$ be an $n \times n$ matrix whose entries $a_{ij}$ are i.i.d. with the same distribution as $X$.
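To make the setup concrete, here is a minimal Python sketch (mine, not from the paper; the function name and the choice of NumPy are assumptions) that samples a matrix with i.i.d. $\pm 1$ entries, the simplest subgaussian example, and computes $\|A^{-1}\|$ via the smallest singular value of $A$.

```python
import numpy as np

def inverse_norm_bernoulli(n, rng):
    """Sample an n x n matrix with i.i.d. +/-1 (Rademacher) entries,
    a standard example of subgaussian entries, and return the
    operator norm of its inverse."""
    A = rng.choice([-1.0, 1.0], size=(n, n))
    # ||A^{-1}|| equals 1 / s_min(A), the reciprocal of the
    # smallest singular value of A (A is invertible with
    # probability exponentially close to 1).
    s_min = np.linalg.svd(A, compute_uv=False)[-1]
    return 1.0 / s_min

rng = np.random.default_rng(0)
print(inverse_norm_bernoulli(200, rng))
```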
The Theorem: On 30th June 2005, Mark Rudelson submitted to the Annals of Mathematics a paper in which he proved the following result. Let $X$ be a subgaussian random variable with mean $0$ and variance $1$. Then there are positive constants $C$, $c$, and $n_0$ such that for any integer $n \ge n_0$ and any $\varepsilon > c/\sqrt{n}$, the inequality $\|A^{-1}\| \le C n^{3/2}/\varepsilon$ holds with probability greater than $1-\varepsilon$.
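As a hedged illustration of what the Theorem asserts (my own sketch, not from the paper; the stand-in value C = 1 and the trial count are arbitrary, since the actual constants are not specified here), one can estimate by Monte Carlo how often $\|A^{-1}\|$ exceeds $C n^{3/2}/\varepsilon$; the Theorem says this should happen with probability less than $\varepsilon$ once $n$ is large enough and $\varepsilon > c/\sqrt{n}$.

```python
import numpy as np

def tail_probability(n, eps, C=1.0, trials=200, seed=1):
    """Estimate P(||A^{-1}|| > C * n**1.5 / eps) for an n x n
    matrix with i.i.d. +/-1 entries. C = 1 is a stand-in for the
    unspecified constant in the Theorem."""
    rng = np.random.default_rng(seed)
    threshold = C * n**1.5 / eps
    exceed = 0
    for _ in range(trials):
        A = rng.choice([-1.0, 1.0], size=(n, n))
        s_min = np.linalg.svd(A, compute_uv=False)[-1]
        if 1.0 / s_min > threshold:
            exceed += 1
    return exceed / trials

# For large n the Theorem predicts an estimate below eps.
print(tail_probability(n=100, eps=0.1))
```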
Short context: As a special case, the Theorem applies to matrices whose i.i.d. entries are equal to $\pm 1$ with equal chances (Bernoulli matrices). It is known that such a random matrix $A$ is invertible with probability exponentially close to 1. The Theorem provides a bound for the norm $\|A^{-1}\|$ of the inverse matrix. The bound is polynomial in $n$ and holds with probability greater than $1-\varepsilon$ if $n$ is large enough. Previously, a similar bound was known only if the i.i.d. entries $a_{ij}$ of $A$ follow the normal distribution.
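To echo the last remark, a quick simulation (again my own sketch; the sample sizes are arbitrary) suggests that $\|A^{-1}\|$ behaves similarly for Bernoulli and Gaussian entries, which is the kind of statement the Theorem makes rigorous beyond the previously known Gaussian case.

```python
import numpy as np

rng = np.random.default_rng(2)

def median_inverse_norm(sampler, n, trials=50):
    """Median of ||A^{-1}|| over random n x n matrices from sampler."""
    norms = []
    for _ in range(trials):
        A = sampler(n)
        norms.append(1.0 / np.linalg.svd(A, compute_uv=False)[-1])
    return float(np.median(norms))

for n in (50, 100, 200):
    bern = median_inverse_norm(lambda n: rng.choice([-1.0, 1.0], (n, n)), n)
    gauss = median_inverse_norm(lambda n: rng.standard_normal((n, n)), n)
    print(n, round(bern, 2), round(gauss, 2))
```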
Links: A free arXiv version of the original paper is here, and the journal version is here. See also Section 8.11 of this book for an accessible description of the Theorem.