Given the satisfactory performance of the 3D-Var ARPEGE system operational at the time of the devastating «T1» and «T2» storms of 26/12/1999 06UTC and 27/12/1999 18UTC over France, it was of paramount importance to assess the performance of the new system on the same cases, and to do so for every operational change made since the event. While the results for the T1 storm oscillated around the same quality (with varying outcomes depending on the forecast range), the forecast of the T2 event was dramatically improved by the introduction of 4D-Var. This was not so surprising, since part of the (partial) failure of the then-operational system had been traced back to a syndrome of exaggerated «first-guess-check» rejection of crucial data once the assimilated trajectory of the deepening low had started to diverge by a distance comparable to its (small) active radius. A sketch of this rejection mechanism is given below.
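To make the mechanism concrete, here is a minimal sketch of a generic first-guess (background) check, in which an observation is rejected when its departure from the model background exceeds a multiple of the combined error standard deviations. The function and parameter names (sigma_o, sigma_b, alpha) are illustrative assumptions, not the ARPEGE implementation.

```python
import numpy as np

def first_guess_check(y_obs, h_xb, sigma_o, sigma_b, alpha=3.0):
    """Return True if the observation passes a generic first-guess check.

    y_obs   -- observed value
    h_xb    -- background equivalent H(x_b)
    sigma_o -- observation error standard deviation
    sigma_b -- background error standard deviation
    alpha   -- rejection multiplier (illustrative value)
    """
    departure = y_obs - h_xb                              # innovation y - H(x_b)
    tolerance = alpha * np.sqrt(sigma_o**2 + sigma_b**2)  # combined error bound
    return abs(departure) <= tolerance

# If the background misplaces a small, deep low by roughly its own radius,
# surface pressures observed near the true centre depart strongly from
# H(x_b) and fail the check -- precisely the data needed for a correction.
print(first_guess_check(y_obs=975.0, h_xb=995.0, sigma_o=1.0, sigma_b=2.0))  # False
```

A more time-continuous procedure such as 4D-Var compares each observation with the trajectory at its own valid time, which keeps such departures smaller and the crucial data in play.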
More interesting was the result of a study aiming (also with the help of reruns of the preceding year's Christmas storm of 20/12/1998) to understand the reasons for the high rate of successful forecasts by the ARPEGE system when other operational and research systems had encountered more difficulties with T1 and T2. Showing the lack of influence of model characteristics in pure forecasting mode was easy and brought us back to the data assimilation problem. There it appeared that, once the rejection problems are cured by a more time-continuous procedure, the crucial ingredient lies not in the assimilation technique itself but in the tuning of the model's parameterisation set, in particular the computation of turbulent fluxes of heat and moisture in deep stable PBL situations (see the sketch below). Work is underway to link this empirical finding with a more in-depth explanation of the influence of such «physical» choices on the use of observed data around the model trajectory within the variational data assimilation procedure.
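To show where such tuning enters, here is a minimal sketch of a Louis-type surface heat-flux computation for the stable regime, in which the choice of the stability constant governs how quickly turbulent exchange shuts down with increasing Richardson number. The constants (b, z0, the roughness formulation) are placeholders for illustration, not the ARPEGE values.

```python
import numpy as np

KARMAN = 0.4  # von Karman constant

def surface_heat_flux(u_wind, theta_air, theta_sfc, z=10.0, z0=0.1, b=5.0):
    """Kinematic sensible-heat flux w'theta' (K m/s), stable case only."""
    c_n = (KARMAN / np.log(z / z0))**2             # neutral exchange coefficient
    # Bulk Richardson number between the surface and level z
    ri = 9.81 * z * (theta_air - theta_sfc) / (theta_air * max(u_wind, 0.1)**2)
    ri = max(ri, 0.0)                              # restrict to the stable regime
    f_stable = 1.0 / (1.0 + b * ri)**2             # Louis-type damping function
    c_h = c_n * f_stable
    return -c_h * u_wind * (theta_air - theta_sfc)

# A smaller b keeps fluxes alive in deep stable layers; a larger b shuts
# them down -- the kind of parameterisation tuning at stake here.
print(surface_heat_flux(u_wind=5.0, theta_air=285.0, theta_sfc=280.0))
```

Because the assimilation compares observations against a trajectory driven by these fluxes, such a tuning choice changes the background near the surface and hence how the observed data are used, which motivates the ongoing work mentioned above.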