
Laurent Debreu
Multigrid solvers and multigrid preconditioners for the solution of variational data assimilation problems

INRIA EPI MOISE and Laboratoire Jean Kuntzmann
51 rue des Mathématiques
38400 Saint Martin d'Hères
FRANCE
Laurent.Debreu@inria.fr
Emilie Neveu
Ehouarn Simon
François-Xavier LeDimet
Arthur Vidard

In order to lower the computational cost of the variational data assimilation process, we investigate the use of multigrid methods to solve the associated optimal control system. On a linear advection equation, we study the impact of the regularization term on the optimal control and the impact of discretization errors on the efficiency of the coarse grid correction step. We show that even though the optimal control problem leads to the solution of an elliptic system, numerical errors introduced by the discretization can impair the efficiency of the multigrid methods. Viewing the multigrid iteration as a preconditioner for a Krylov optimization method leads to a more robust algorithm. A scale-dependent weighting of the multigrid preconditioner and the usual preconditioner based on the background error covariance matrix is proposed and brings significant improvements.

We consider the time evolution of a system governed by the following equation:

$\displaystyle \frac{{\rm d}X}{{\rm d}t}=F(X), \qquad X(t=t_0)={\bf x}$ (1)

$ {\bf x}$ is the initial condition at time $ t=t_0$ and will be our control parameter. The variational data assimilation problem consists in finding the minimum of a cost function $ J({\bf x})$ that measures the distance from the numerical model to the observations and includes a background or regularization term associated with a first guess $ {\bf x}_b$ .

$\displaystyle J({\bf x})=\frac{1}{2}\left({\bf x}-{\bf x}_b\right)^T{\bf B}^{-1}\left({\bf x}-{\bf x}_b\right)+\frac{1}{2}\left(H\left(X({\bf x},t)\right)-y\right)^T{\bf R}^{-1}\left(H\left(X({\bf x},t)\right)-y\right)$ (2)

Here $ y$ denotes the observations, $ H$ is the observation operator mapping the model state to the observation space, and $ {\bf R}$ and $ {\bf B}$ are the observation and background error covariance matrices, respectively. At a minimum $ {\bf x}^\star$ of $ J$ , the gradient vanishes:

$\displaystyle \nabla_{\bf x} J({\bf x}^\star)=0$ (3)

When the model $ F$ and the observation operator $ H$ are linear, the cost function is quadratic and solving (3) is equivalent to solving the linear system

$\displaystyle {\bf A}{\bf x^{\star}}=b$ (4)

where $ {\bf A}$ is the Hessian of the cost function:

$\displaystyle {\bf A}={\bf B}^{-1} + {\bf H}^T {\bf R}^{-1} {\bf H}$

where $ {\bf H}$ is a compact notation that combines the model and the observation operators, and the right-hand side $ b$ is given by

$\displaystyle b={\bf B}^{-1} {\bf x}_b+{\bf H}^T {\bf R}^{-1} y$
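
Indeed, differentiating the quadratic cost function (2) in the linear case gives

$\displaystyle \nabla_{\bf x} J({\bf x})={\bf B}^{-1}\left({\bf x}-{\bf x}_b\right)+{\bf H}^T{\bf R}^{-1}\left({\bf H}{\bf x}-y\right)=\left({\bf B}^{-1}+{\bf H}^T{\bf R}^{-1}{\bf H}\right){\bf x}-\left({\bf B}^{-1}{\bf x}_b+{\bf H}^T{\bf R}^{-1}y\right)={\bf A}{\bf x}-b ,$

so that the optimality condition (3) is exactly the linear system (4).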

The subject of this presentation is the application of multigrid methods to the solution of (4); a generic sketch of one multigrid cycle is given after the list below. On a model problem based on a linear advection equation, the following key points are investigated:
- the impact of the regularization (background) term on the optimal control;
- the impact of discretization errors on the efficiency of the coarse grid correction step.
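
As an illustration of the kind of iteration considered, a minimal two-grid correction cycle for the symmetric positive definite system (4) could be sketched as follows. This is only a generic sketch: the damped Jacobi smoother and the transfer matrices restrict and prolong are illustrative assumptions, not the operators used in the study.

import numpy as np

def smooth(A, x, b, n_iter=3, omega=0.6):
    # Damped Jacobi smoother: cheap iterations that damp high-frequency error.
    d = np.diag(A)
    for _ in range(n_iter):
        x = x + omega * (b - A @ x) / d
    return x

def two_grid_cycle(A, b, x, restrict, prolong):
    # One two-grid correction cycle for A x = b.
    # restrict: fine-to-coarse transfer matrix; prolong: coarse-to-fine transfer matrix.
    x = smooth(A, x, b)                  # pre-smoothing on the fine grid
    r_c = restrict @ (b - A @ x)         # restrict the fine-grid residual
    A_c = restrict @ A @ prolong         # Galerkin coarse-grid operator
    e_c = np.linalg.solve(A_c, r_c)      # coarse solve (recursive call in a full V-cycle)
    x = x + prolong @ e_c                # coarse-grid correction
    x = smooth(A, x, b)                  # post-smoothing
    return x

Repeating such cycles yields the multigrid solver; as stated above, the efficiency of the coarse grid correction step is affected by the discretization errors of the coarse-grid operators.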

We then consider the use of the multigrid cycle as a preconditioner for a conjugate gradient algorithm. The best results are obtained with a hybrid preconditioner written as a combination of the multigrid cycle and the traditional preconditioner based on the background error covariance matrix ($ {\bf B}$ ).
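
A minimal sketch of how such a hybrid preconditioner could be applied inside a preconditioned conjugate gradient iteration is given below. The scalar weight alpha and the function mg_cycle (one multigrid cycle applied to the residual, started from zero) are illustrative assumptions; the actual preconditioner of the study uses a scale-dependent weighting.

import numpy as np

def hybrid_preconditioner(r, B, mg_cycle, alpha=0.5):
    # Weighted combination of the B-based preconditioner (multiplication by B)
    # and one multigrid cycle; alpha is a simple scalar stand-in for the
    # scale-dependent weighting described above.
    return alpha * (B @ r) + (1.0 - alpha) * mg_cycle(r)

def pcg(A, b, apply_Minv, tol=1e-8, max_iter=200):
    # Standard preconditioned conjugate gradient for the SPD system A x = b.
    x = np.zeros_like(b)
    r = b - A @ x
    z = apply_Minv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        step = rz / (p @ Ap)
        x = x + step * p
        r = r - step * Ap
        if np.linalg.norm(r) < tol:
            break
        z = apply_Minv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

For example, passing apply_Minv=lambda r: hybrid_preconditioner(r, B, mg_cycle) gives the hybrid variant, while alpha=1 reduces to the usual B-based preconditioning.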

