The VERA analysis scheme is based on the variational principle and does not need any first-guess fields; it is therefore independent of NWP models and can also be used as an unbiased reference for real-time model validation. For downscaling purposes VERA uses a priori knowledge of small-scale physical processes over complex terrain, the so-called fingerprint technique, which transfers information from data-rich to data-sparse regions. Furthermore, it includes a sophisticated quality control tool. This is one of the fundamentals of the method, as good data quality is absolutely necessary to obtain meteorologically and physically reliable analysis fields.
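As a rough illustration of the fingerprint idea (not the actual VERA variational formulation), the sketch below fits weights for a priori fingerprint patterns to the observations by least squares and transfers them to the fine analysis grid; all function and array names are hypothetical.

```python
# Conceptual sketch of fingerprint-based downscaling: the analysed field is
# written as a smooth background plus weighted a priori fingerprint patterns,
# with the weights fitted to the observations. This is only an illustration
# of the idea, not the VERA formulation.
import numpy as np

def fit_fingerprint_weights(obs, background_at_obs, fingerprints_at_obs):
    """Least-squares weights for the fingerprint patterns.

    obs                 : (n_stations,) observed values
    background_at_obs   : (n_stations,) smooth large-scale field at stations
    fingerprints_at_obs : (n_stations, n_fingerprints) a priori patterns
                          (e.g. a thermal fingerprint over complex terrain)
    """
    residual = obs - background_at_obs
    weights, *_ = np.linalg.lstsq(fingerprints_at_obs, residual, rcond=None)
    return weights

def downscale(background_grid, fingerprints_grid, weights):
    """Transfer the fitted weights to the fine analysis grid."""
    return background_grid + fingerprints_grid @ weights
```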
One key challenge is the availability of a sufficiently dense data set. Within the WWRP projects D-PHASE and COPS a joint activity has been started to collect GTS and non-GTS data from the national meteorological services in Central Europe for 2007. Data from more than 11,000 stations allow the analysis to be run with a spatial resolution of 8 km on an hourly basis. For selected COPS IOPs, data from scientific networks have additionally been included, allowing a spatial resolution of 2 km over the COPS area.
When defining errors in (model-independent) analysis fields we have to consider that analyses are not time dependent, so no perturbation method aimed at temporal evolution, such as the growth of singular vectors of propagation matrices, is possible. Furthermore, the method applied should respect the two major sources of analysis errors: observation errors and analysis (interpolation) errors. Cross-validation, which is often used to estimate the potential of an analysis method, does not account for observation errors; likewise, the variance estimates provided by Kriging and spline analysis systems depend more on station separation than on local variations of the observed parameters.
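To make the cross-validation argument concrete, the following minimal leave-one-out sketch shows why such a score measures interpolation skill only: the withheld observation is treated as truth, so its own error never enters. The `analyse` callable is a placeholder for any analysis method (for instance the Barnes scheme sketched further below).

```python
# Minimal leave-one-out cross-validation sketch for an arbitrary analysis
# (interpolation) routine. It quantifies the interpolation error at station
# sites, but the withheld observation itself serves as the reference, so
# the observation error is not accounted for.
import numpy as np

def loo_cross_validation(stations, values, analyse):
    """Return the leave-one-out residual at every station.

    stations : (n, 2) station coordinates
    values   : (n,)   observed values
    analyse  : callable(stations, values, targets) -> analysed values
    """
    residuals = np.empty(len(values))
    for i in range(len(values)):
        keep = np.arange(len(values)) != i
        predicted = analyse(stations[keep], values[keep], stations[i:i + 1])
        residuals[i] = values[i] - predicted[0]
    return residuals  # the RMS of these residuals is the usual CV score
```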
With the concept of an analysis ensemble we hope to gain a more detailed insight into all sources of analysis errors. To compute the VERA analysis ensemble members, a sample of Gaussian random perturbations is produced for each station and parameter. The standard deviation of the perturbations is based on the correction proposals of the VERA QC scheme, which provides natural limits for the ensemble.
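A minimal sketch of this perturbation step is given below, under the assumption that the magnitude of the QC correction proposal is used directly as the per-station standard deviation; the exact scaling in VERA may differ, and all names are illustrative.

```python
# Sketch of the ensemble generation described above: Gaussian perturbations
# per station and parameter, scaled by the QC correction proposals (used
# here, as an assumption, as the perturbation standard deviation). Each
# perturbed data set would then be passed to the unchanged analysis to
# yield one ensemble member.
import numpy as np

def perturb_observations(obs, qc_correction, n_members, seed=0):
    """Return n_members perturbed copies of the observation vector.

    obs           : (n_stations,) quality-controlled observations
    qc_correction : (n_stations,) magnitude of the QC correction proposals
    """
    rng = np.random.default_rng(seed)
    noise = rng.normal(size=(n_members, obs.size)) * np.abs(qc_correction)
    return obs + noise

# ensemble = [vera_analysis(member) for member in
#             perturb_observations(obs, qc_correction, n_members=50)]
```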
Tests concerning the number of ensemble members show visible differences in the resulting patterns between 10 and 50 members, but no additional structure is revealed when comparing a set of 50 with 100 members. Tests are also performed with an equal perturbation standard deviation for all stations and with alternative analysis methods such as Ordinary Kriging and the Barnes analysis scheme.
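For reference, a compact two-pass Barnes (successive correction) scheme of the kind mentioned above might look as follows; the length scale `kappa` and convergence parameter `gamma` are illustrative choices, not the settings used in this study.

```python
# Minimal two-pass Barnes analysis: a Gaussian-weighted first pass on the
# grid, followed by a correction pass on the station residuals with the
# sharper length scale gamma * kappa.
import numpy as np

def barnes_analysis(stations, values, grid, kappa=50.0e3**2, gamma=0.3):
    """stations: (n, 2) in metres, values: (n,), grid: (m, 2) in metres."""
    def weighted_pass(targets, data, kap):
        d2 = ((targets[:, None, :] - stations[None, :, :]) ** 2).sum(-1)
        w = np.exp(-d2 / kap)
        return (w * data).sum(axis=1) / w.sum(axis=1)

    # First pass: Gaussian-weighted mean of the observations.
    first_guess_grid = weighted_pass(grid, values, kappa)
    first_guess_obs = weighted_pass(stations, values, kappa)
    # Second pass: spread the station residuals with a reduced length scale.
    residuals = values - first_guess_obs
    correction = weighted_pass(grid, residuals, gamma * kappa)
    return first_guess_grid + correction
```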
In the first experiments no structure is imposed on the perturbations of the observational data other than an estimate of the observation uncertainties; these are estimates for monthly periods and thus do not represent the prevailing weather regime. In order to put more emphasis on the weather situation, we aim to integrate the main synoptic field structures as weighting factors for the perturbations. In doing so it is assumed that frontal regions with strong gradients or heavy rain showers are associated with higher analysis uncertainty than regions with weak gradients and stratiform precipitation or no rain, respectively.
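One possible, hedged reading of such a weighting is sketched below: a PCA of recent analysed values at the station locations yields a leading spatial pattern whose normalised amplitude scales the perturbation standard deviations, so stations within pronounced synoptic structures are perturbed more strongly. How the weighting factors enter the actual VERA ensemble may differ.

```python
# Hedged sketch of PCA-based weighting of the perturbations; the variable
# names and the way the weights are applied are assumptions.
import numpy as np

def pca_weights(field_history_at_stations, n_components=1):
    """field_history_at_stations: (n_times, n_stations) analysed values."""
    anomalies = field_history_at_stations - field_history_at_stations.mean(0)
    # Leading right singular vectors = leading spatial PCA patterns.
    _, _, vt = np.linalg.svd(anomalies, full_matrices=False)
    pattern = np.abs(vt[:n_components]).sum(axis=0)
    return pattern / pattern.max()          # weights in (0, 1]

def weighted_perturbations(obs, sigma, weights, n_members, seed=0):
    """Gaussian perturbations whose spread is modulated by the PCA weights."""
    rng = np.random.default_rng(seed)
    noise = rng.normal(size=(n_members, obs.size)) * sigma * weights
    return obs + noise
```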
First results of the ensemble approaches described above are given in the presentation, together with a comparison of two approaches: the first based solely on the QC error estimates, and the second combining the error estimates with PCA-based weighting factors.