A computationally efficient backward selection algorithm forms the backbone of the proposed targeting approach. To address the computational burden of evaluating the impact of each measurement choice on the uncertainty reduction at the verification site, the backward selection algorithm exploits the commutativity of mutual information. This allows the contribution of each measurement choice to be computed by propagating information backwards from the verification space/time to the search space/time, which dramatically reduces the number of computationally expensive covariance updates -- equivalently, perturbation ensemble updates -- needed to find the optimal targeting solution. Numerical experiments using an idealized chaos model verify the effectiveness of the algorithm.
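A minimal sketch of the idea behind the backward form is given below (Python/NumPy, assuming jointly Gaussian ensemble statistics and hypothetical index sets `verif` and `cands`; the covariance and sizes are illustrative, not from the paper). Because mutual information is symmetric, conditioning once on the verification variables lets each candidate's score be read off from two scalar variances, instead of performing one covariance update per candidate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ensemble: n_ens perturbations over n_state grid points.
n_ens, n_state = 200, 30
A = rng.standard_normal((n_state, n_state))
Sigma = A @ A.T + n_state * np.eye(n_state)          # SPD "truth" covariance (toy)
X = np.linalg.cholesky(Sigma) @ rng.standard_normal((n_state, n_ens))

P = np.cov(X)                       # ensemble covariance estimate
verif = [0, 1, 2]                   # verification variables (assumed indices)
cands = list(range(5, n_state))     # candidate measurement sites (assumed)

def logdet(M):
    return np.linalg.slogdet(np.atleast_2d(M))[1]

# Forward evaluation: update the verification-block covariance once per candidate.
def mi_forward(i):
    Pvv = P[np.ix_(verif, verif)]
    Pvi = P[np.ix_(verif, [i])]
    Pvv_given_i = Pvv - Pvi @ Pvi.T / P[i, i]        # condition on candidate i
    return 0.5 * (logdet(Pvv) - logdet(Pvv_given_i))

# Backward evaluation: condition on the verification block ONCE; each candidate's
# contribution then needs only two scalar variances (symmetry of mutual information).
Pvv_inv = np.linalg.inv(P[np.ix_(verif, verif)])
def mi_backward(i):
    Piv = P[np.ix_([i], verif)]
    Pii_given_v = P[i, i] - (Piv @ Pvv_inv @ Piv.T).item()
    return 0.5 * (np.log(P[i, i]) - np.log(Pii_given_v))

i = cands[0]
assert np.isclose(mi_forward(i), mi_backward(i))     # I(V; s_i) == I(s_i; V)
```

In the Gaussian case both evaluations give the same score exactly; the saving comes from the backward form amortizing the single conditioning on the verification variables across all candidates.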
Because the ensemble size available for a realistic weather model is limited, a real implementation of the proposed targeting algorithm might suffer from performance degradation. This work performs a sensitivity analysis to quantify the impact that a small ensemble size can have on the performance of ensemble-based targeting. Two new concepts, the range-to-noise ratio (RNR) and the probability of correct decision (PCD), are introduced for this quantification, and their formulae are derived from a statistical analysis of the estimation error of mutual information. The theoretically predicted impact of small ensemble size is shown to be consistent with the numerical results.
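The RNR and PCD formulae themselves are derived analytically in the work; the sketch below is only an empirical Monte Carlo stand-in (Python/NumPy, with an assumed synthetic covariance and index sets) illustrating the quantity PCD captures: the chance that a finite ensemble still ranks the truly best measurement site first.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ground-truth covariance over [verification variables; candidate sites].
n_state = 30
A = rng.standard_normal((n_state, n_state))
Sigma = A @ A.T + n_state * np.eye(n_state)
verif, cands = [0, 1, 2], list(range(5, n_state))

def mi_scores(P):
    """I(s_i; V) for every candidate i, via the backward (conditioned) form."""
    Pvv_inv = np.linalg.inv(P[np.ix_(verif, verif)])
    out = []
    for i in cands:
        Piv = P[np.ix_([i], verif)]
        cond = P[i, i] - (Piv @ Pvv_inv @ Piv.T).item()
        out.append(0.5 * (np.log(P[i, i]) - np.log(cond)))
    return np.array(out)

true_best = int(np.argmax(mi_scores(Sigma)))
L = np.linalg.cholesky(Sigma)

# Empirical stand-in for PCD: fraction of trials in which the site ranked best
# from a finite ensemble matches the true best site.
trials = 300
for n_ens in (20, 50, 200, 1000):
    hits = sum(
        int(np.argmax(mi_scores(np.cov(L @ rng.standard_normal((n_state, n_ens))))) == true_best)
        for _ in range(trials)
    )
    print(n_ens, hits / trials)
```

As the ensemble grows, the sampling noise in the mutual-information estimates shrinks relative to the spread of scores across candidates, and the empirical decision accuracy rises, which is the qualitative behaviour the RNR/PCD analysis quantifies.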