There is a convergent algorithm for minimising the total variation

You need to know: Matrix, the norm ||u|| of a matrix u, convergence of a sequence of matrices to a limit.

Background: For an N \times N matrix u with entries u_{i,j}, define its total variation as J(u) = \sum\limits_{i=1}^N \sum\limits_{j=1}^N \sqrt{(u_{i+1,j}-u_{i,j})^2 + (u_{i,j+1}-u_{i,j})^2}, where u_{N+1,j}=u_{N,j} and u_{i,N+1}=u_{i,N} by convention. For a given N \times N matrix g and \lambda>0, consider the optimisation problem \min\limits_u \left(\frac{||u-g||^2}{2\lambda}+J(u)\right), which looks for a matrix u that is close to g but has small total variation J(u).
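
To make the definition concrete, here is a minimal NumPy sketch that evaluates J(u) under the boundary convention above; the function names grad and total_variation are mine, not notation from the paper.

import numpy as np

def grad(u):
    # Forward differences; the convention u_{N+1,j} = u_{N,j}, u_{i,N+1} = u_{i,N}
    # makes the differences in the last row and last column zero.
    gx = np.zeros_like(u, dtype=float)
    gy = np.zeros_like(u, dtype=float)
    gx[:-1, :] = u[1:, :] - u[:-1, :]   # u_{i+1,j} - u_{i,j}
    gy[:, :-1] = u[:, 1:] - u[:, :-1]   # u_{i,j+1} - u_{i,j}
    return gx, gy

def total_variation(u):
    # J(u): the sum over all (i, j) of the Euclidean length of the discrete gradient.
    gx, gy = grad(u)
    return np.sqrt(gx**2 + gy**2).sum()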

The Theorem: In January 2004, Antonin Chambolle published a paper in the Journal of Mathematical Imaging and Vision in which he presented an algorithm for constructing a sequence u_n of matrices that is guaranteed to converge to the optimal solution u^* of this problem.
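
For a flavour of how such an algorithm can work: Chambolle's scheme iterates not on u itself but on a dual field p = (p^1, p^2) of the same shape as g, recovering u_n = g - \lambda \, \mathrm{div}\, p_n, and he proves convergence for step sizes \tau \le 1/8. The sketch below follows that fixed-point scheme under my own assumptions: ||\cdot|| is taken to be the Euclidean (Frobenius) norm, a fixed iteration count stands in for a proper stopping test, and all function names are illustrative.

import numpy as np

def grad(u):
    # Forward differences with zeros in the last row/column, as in the sketch above.
    gx = np.zeros_like(u, dtype=float)
    gy = np.zeros_like(u, dtype=float)
    gx[:-1, :] = u[1:, :] - u[:-1, :]
    gy[:, :-1] = u[:, 1:] - u[:, :-1]
    return gx, gy

def div(px, py):
    # Discrete divergence, defined so that -div is the adjoint of grad.
    d = np.zeros_like(px)
    d[0, :] += px[0, :]
    d[1:-1, :] += px[1:-1, :] - px[:-2, :]
    d[-1, :] -= px[-2, :]
    d[:, 0] += py[:, 0]
    d[:, 1:-1] += py[:, 1:-1] - py[:, :-2]
    d[:, -1] -= py[:, -2]
    return d

def tv_minimize(g, lam, tau=0.125, n_iter=200):
    # Fixed-point iteration on the dual field p = (px, py), starting from p = 0.
    px = np.zeros_like(g, dtype=float)
    py = np.zeros_like(g, dtype=float)
    for _ in range(n_iter):
        gx, gy = grad(div(px, py) - g / lam)       # gradient of the dual objective
        norm = np.sqrt(gx**2 + gy**2)
        px = (px + tau * gx) / (1.0 + tau * norm)  # semi-implicit step; the
        py = (py + tau * gy) / (1.0 + tau * norm)  # denominator keeps |p| <= 1
    return g - lam * div(px, py)                   # u_n = g - lambda * div p_n

# Example: apply the iteration to a small random "image" g.
rng = np.random.default_rng(0)
g = rng.normal(size=(64, 64))
u = tv_minimize(g, lam=0.5)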

Short context: The optimisation problem described above arises in a variety of applications, including image denoising, image zooming, and the computation of the mean curvature motion of interfaces. The algorithm presented in the paper is guaranteed to converge and runs fast in practical applications.

Links: The original paper is A. Chambolle, "An algorithm for total variation minimization and applications", Journal of Mathematical Imaging and Vision 20, 89–97 (2004).
