We consider sequential, i.e., Gauss-Seidel type, subspace correction methods for the iterative solution of symmetric positive definite variational problems, where the order of the subspace correction steps is not deterministically fixed as in standard multiplicative Schwarz methods. Instead, we greedily choose the subspace with the largest (or at least a relatively large) residual norm for the next update step, a strategy also known as the Gauss-Southwell method. We prove exponential convergence in the energy norm, with a reduction factor per iteration step directly related to the spectral properties, e.g., the condition number, of the underlying space splitting. To avoid the additional computational cost of the greedy pick, we alternatively consider choosing the next subspace randomly, and show similar estimates for the expected error reduction. We give some numerical examples, in particular applications to a Toeplitz system and to multilevel discretizations of an elliptic boundary value problem, which illustrate the theoretical estimates.
This is joint work with Peter Oswald (JU Bremen).
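Below is a minimal sketch, not taken from the work itself, of the two selection rules discussed above (greedy Gauss-Southwell versus random choice) for the simplest possible space splitting, namely one-dimensional coordinate subspaces of an SPD system Ax = b. In that special case the sequential subspace correction step reduces to a coordinate update. The tridiagonal test matrix, iteration count, and right-hand side are illustrative assumptions, not the Toeplitz or multilevel examples mentioned in the abstract.

```python
# Sketch: sequential 1-D subspace correction for an SPD system A x = b,
# comparing the greedy (Gauss-Southwell) pick with a uniformly random pick.
# All problem data below are illustrative choices.
import numpy as np

def subspace_correction(A, b, rule="greedy", n_iter=5000, seed=None):
    """One-dimensional (coordinate) subspace correction; rule is 'greedy' or 'random'."""
    rng = np.random.default_rng(seed)
    n = len(b)
    x = np.zeros(n)
    r = b - A @ x                      # residual, updated incrementally below
    for _ in range(n_iter):
        if rule == "greedy":           # Gauss-Southwell: largest residual component
            i = int(np.argmax(np.abs(r)))
        else:                          # randomized variant: uniform choice of subspace
            i = int(rng.integers(n))
        alpha = r[i] / A[i, i]         # exact correction within span(e_i)
        x[i] += alpha
        r -= alpha * A[:, i]           # residual after the correction step
    return x

# Toy SPD example: 1-D discrete Laplacian (tridiagonal Toeplitz matrix).
n = 200
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x_exact = np.linalg.solve(A, b)
for rule in ("greedy", "random"):
    e = subspace_correction(A, b, rule=rule, seed=0) - x_exact
    print(rule, "energy-norm error:", np.sqrt(e @ A @ e))
```

Running the sketch prints the energy-norm error for both selection rules, which is the quantity the convergence estimates above are stated in.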