1.4
The operational 4D-Var data assimilation system of Météo-France: specific characteristics and behaviour in the special case of the '99 Xmas storms over France
Jean-François Geleyn, Météo-France, Toulouse, France; and D. Banciu, M. Bellus, R. El Khatib, P. Moll, P. Saez, and J. N. Thépaut
The operational data assimilation component of the ARPEGE/IFS global spectral variable-resolution system of Météo-France was upgraded from 3D-Var to 4D-Var on 20/06/2000 (at equal resolution for the deterministic model, at similar resolution for the control variable, and at a roughly trebled computing cost). While similar in concept to its ECMWF IFS/ARPEGE counterpart, the new system relies for cost-efficiency on two novel features: a multi-incremental approach (the inner loops are solved successively at T42, T63 and T95 resolution) and the introduction of digital filter initialisation as a weak constraint inside the minimisation, replacing the normal-mode-based Jc-NMI penalty term of the classical 4D-Var formulation (the technique is named Jc-DFI by analogy, even though the two techniques differ radically, in particular in complexity and cost).
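For concreteness, a generic weak-constraint formulation of the Jc-DFI idea can be sketched as follows; the notation below is illustrative and is not taken from the operational system. The incremental cost function is augmented by a term penalising the distance between the mid-window model state and its digitally filtered counterpart:

```latex
% Illustrative incremental 4D-Var cost function with a Jc-DFI-type
% weak constraint (generic form; weights and notation are assumptions).
\begin{align}
J(\delta x_0) &= \tfrac{1}{2}\,\delta x_0^{\mathrm T}\mathbf B^{-1}\delta x_0
 + \tfrac{1}{2}\sum_i \bigl(\mathbf H_i\,\delta x_i - d_i\bigr)^{\mathrm T}
   \mathbf R_i^{-1}\bigl(\mathbf H_i\,\delta x_i - d_i\bigr) + J_c ,\\
J_c &= \tfrac{\varepsilon}{2}\,
   \Bigl\| x(t_{N/2}) - \sum_{k=0}^{N} h_k\, x(t_k) \Bigr\|^2 ,
\end{align}
```

where the d_i are the innovations, δx_i is the increment propagated to time t_i by the tangent-linear model, the h_k are the digital filter weights over the N+1 time levels of the assimilation window, and ε is a tunable weak-constraint weight. In the multi-incremental setting this minimisation is carried out successively on increments δx_0 truncated at T42, T63 and T95, each inner loop restarting from the previous solution.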
Given the satisfactory performance of the 3D-Var ARPEGE system operational at the time of the devastating "T1" and "T2" storms of 26/12/1999 06 UTC and 27/12/1999 18 UTC over France, it was of paramount importance to assess the performance of the new system on the same cases, taking into account all operational changes made since the event. While the results for the T1 storm oscillated around the same quality (varying with the forecast range), the forecast of the T2 event was dramatically improved by the introduction of 4D-Var. This was not too surprising, since part of the (partial) failure of the operational system at the time had been traced back to a syndrome of exaggerated "first-guess-check" rejection of crucial data once the assimilated trajectory of the deepening low had started to diverge by an amount comparable to its (small) active radius.
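To make the rejection syndrome concrete, here is a minimal sketch of a generic first-guess check, assuming a simple Gaussian tolerance test; the function name, error values and threshold are illustrative and not the operational ARPEGE settings:

```python
import numpy as np

def first_guess_check(y_obs, y_fg, sigma_o, sigma_b, k=5.0):
    """Accept an observation only if its departure from the first guess
    is within k combined standard deviations (illustrative values)."""
    departure = y_obs - y_fg
    tolerance = k * np.sqrt(sigma_o**2 + sigma_b**2)
    return np.abs(departure) <= tolerance

# A correct 970 hPa surface-pressure report is rejected because the
# first-guess trajectory of the low has already drifted to 990 hPa.
print(first_guess_check(y_obs=970.0, y_fg=990.0, sigma_o=1.0, sigma_b=2.0))
# -> False: the crucial datum is thrown away
```

The example shows how a check that is perfectly reasonable in ordinary conditions can discard exactly the data needed to correct a misplaced small-scale low, which is why a more time-continuous procedure cures the problem.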
More interesting was the result of a study aiming to understand (with the additional help of reruns of the previous year's Xmas storm of 20/12/1998) why the ARPEGE system achieved such a high rate of successful forecasts when other operational and research systems had more problems forecasting T1 and T2. Showing the lack of influence of model characteristics in pure forecasting mode was easy, and it brought us back to the data assimilation problem. There it appeared that, once the rejection problems are cured by a more time-continuous procedure, the crucial ingredient lies not in the assimilation technique itself but in the tuning of the model's parameterisation set, and in particular in the computation of turbulent fluxes of heat and moisture in deep stable PBL situations. Work is under way to link this empirical finding with a more in-depth explanation of the influence of such "physical" choices on the use of observed data around the model trajectory inside the variational data assimilation procedure.
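As an illustration of the kind of tuning at stake, the following is a minimal sketch of a Louis-type surface exchange computation, in which a stability function damps the turbulent heat flux as the Richardson number grows in stable conditions; the function form and all coefficients here are illustrative assumptions, not the ARPEGE parameterisation:

```python
import math

def surface_heat_flux(theta_sfc, theta_air, wind_speed, richardson,
                      rho=1.2, cp=1004.0, ch_neutral=1.5e-3,
                      b=5.0, d=5.0):
    """Sensible heat flux (W m-2) with a Louis-type stability function.

    In stable conditions (Ri > 0) the neutral exchange coefficient is
    damped; the strength of that damping (b and d, chosen here for
    illustration) controls how quickly turbulent exchange shuts down
    in a deep stable PBL.
    """
    if richardson > 0.0:
        # stable branch: turbulent exchange suppressed with growing Ri
        f = 1.0 / (1.0 + 2.0 * b * richardson
                   / math.sqrt(1.0 + d * richardson))
    else:
        # neutral/unstable branch kept trivial for brevity
        f = 1.0
    ch = ch_neutral * f
    return rho * cp * ch * wind_speed * (theta_sfc - theta_air)

# Strongly stable case (Ri = 0.5): the downward flux is roughly a
# factor of four weaker than its neutral counterpart for the same
# gradient and wind.
print(surface_heat_flux(theta_sfc=270.0, theta_air=274.0,
                        wind_speed=8.0, richardson=0.5))
```

Small changes in such damping coefficients alter the model trajectory in stable boundary layers, and hence, through the first-guess departures, the way observations are used inside the variational assimilation.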
Session 1, Numerical Data Assimilation Techniques
Monday, 30 July 2001, 1:00 PM-2:20 PM