Every tensor has a tensor-train decomposition

You need to know: matrices; the rank \text{rank}(M) of a matrix M.

Background: Let n_1, n_2, \dots, n_d be positive integers, and let {\cal I} be the set of vectors i=(i_1, i_2, \dots, i_d) such that every i_j is an integer with 1 \leq i_j \leq n_j for j=1,2,\dots,d. A function A:{\cal I} \to {\mathbb R} is called a d-dimensional n_1 \times n_2 \times \dots \times n_d tensor.

Let r_k, k=0,1,2,\dots,d, be positive integers with r_0=r_d=1. We say that A has a tensor-train decomposition (TT decomposition for short) with TT-ranks r_k if A(i_1, i_2, \dots, i_d) = G_1(i_1)G_2(i_2) \dots G_d(i_d), where each G_k(i_k) is an r_{k-1} \times r_k matrix. Since r_0=r_d=1, the product on the right-hand side is a 1 \times 1 matrix, which we identify with a real number.

For k=0,1,\dots,d, the k-th unfolding matrix A_k of A is the (\prod\limits_{s=1}^k n_s) \times (\prod\limits_{s=k+1}^d n_s) matrix with rows indexed by (i_1, \dots, i_k) and columns indexed by (i_{k+1}, \dots, i_d), defined by A_k( (i_1, \dots, i_k), (i_{k+1}, \dots, i_d)) = A(i_1, \dots, i_d).
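To make the definition concrete, here is a minimal sketch in Python with numpy (all names, sizes, and ranks below are illustrative, not from the paper): we store core k as an array whose slice cores[k][i_k] is the r_{k-1} \times r_k matrix G_k(i_k), and evaluate one entry of the tensor as the chain of matrix products.

```python
import numpy as np

# Illustrative example: a 3-dimensional 4 x 5 x 6 tensor given by random TT
# cores. Since r_0 = r_3 = 1, the product G_1(i_1) G_2(i_2) G_3(i_3) is a
# 1 x 1 matrix, i.e. a single number.
rng = np.random.default_rng(0)
n = [4, 5, 6]          # mode sizes n_1, n_2, n_3
r = [1, 2, 3, 1]       # TT-ranks r_0, r_1, r_2, r_3 with r_0 = r_3 = 1

# cores[k] has shape (n_k, r_{k-1}, r_k), so cores[k][i_k] is G_k(i_k).
cores = [rng.standard_normal((n[k], r[k], r[k + 1])) for k in range(3)]

def tt_entry(cores, idx):
    """Evaluate A(i_1, ..., i_d) = G_1(i_1) G_2(i_2) ... G_d(i_d)
    (indices are 0-based here, unlike the 1-based convention in the text)."""
    prod = np.eye(1)
    for G, i in zip(cores, idx):
        prod = prod @ G[i]
    return prod[0, 0]

print(tt_entry(cores, (0, 1, 2)))
```

Note that evaluating one entry costs only a few small matrix-vector products, without ever forming the full 4 x 5 x 6 array.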

The Theorem: On 10 March 2009, Ivan Oseledets submitted to the SIAM Journal on Scientific Computing a paper in which he proved that every d-dimensional tensor A has a TT decomposition with TT-ranks r_k \leq \text{rank}(A_k) for k=0,1,\dots,d.
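The proof is constructive: one peels off one core per mode by taking an SVD of the current unfolding, and the rank found at step k cannot exceed \text{rank}(A_k). Below is a sketch of this procedure (often called TT-SVD) in Python with numpy; it is a simplified illustration under the convention from the previous snippet, not the paper's exact pseudocode.

```python
import numpy as np

def tt_svd(A, tol=1e-12):
    """Sketch of the constructive proof: peel off one TT core per mode via an
    SVD of the current unfolding. The rank r found at step k is bounded by the
    rank of the unfolding matrix A_k, so r_k <= rank(A_k) by construction."""
    d, n = A.ndim, A.shape
    cores = []
    C, r_prev = A, 1
    for k in range(d - 1):
        C = C.reshape(r_prev * n[k], -1)           # current unfolding
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        r = max(1, int(np.sum(s > tol * s[0])))    # numerical rank
        # Store core k so that cores[k][i_k] is the r_{k-1} x r_k matrix G_k(i_k).
        cores.append(U[:, :r].reshape(r_prev, n[k], r).transpose(1, 0, 2))
        C = s[:r, None] * Vt[:r]                   # carry the remainder forward
        r_prev = r
    cores.append(C.reshape(r_prev, n[-1], 1).transpose(1, 0, 2))
    return cores

# Verify the decomposition reproduces a small random tensor exactly.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4, 5))
cores = tt_svd(A)
err = max(
    abs(A[i, j, k] - (cores[0][i] @ cores[1][j] @ cores[2][k])[0, 0])
    for i in range(3) for j in range(4) for k in range(5)
)
print(err)
```

Because no singular values above the tolerance are discarded, the reconstruction error is at the level of machine precision.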

Short context: Tensors are natural generalizations of matrices and are central in many applications. However, storing a d-dimensional tensor directly requires memory that grows exponentially in d, and performing any operation on it takes exponential time. A TT decomposition with low TT-ranks makes it possible to store tensors and compute with them using far less time and memory.
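A quick back-of-the-envelope count makes the savings vivid (the sizes and rank below are an illustrative choice, not taken from the paper): a full n \times n \times \dots \times n tensor needs n^d numbers, while its TT cores need only \sum_k n\, r_{k-1} r_k numbers, which is linear in d for bounded ranks.

```python
# Illustrative storage comparison: full tensor vs. TT cores.
n, d, r = 10, 20, 5                 # mode size, dimension, a uniform TT-rank
full_storage = n ** d               # 10^20 numbers: hopeless to store directly
ranks = [1] + [r] * (d - 1) + [1]   # TT-ranks r_0, ..., r_d with r_0 = r_d = 1
tt_storage = sum(n * ranks[k] * ranks[k + 1] for k in range(d))
print(full_storage, tt_storage)     # TT storage grows linearly in d
```

For this choice the TT format needs a few thousand numbers instead of 10^20.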

Links: The original paper is available here.
