The functional forms of convolved temperature components are clear from the research literature and justify employing deterministic deconvolution to identify the anthropogenic and natural variability components needed for a statistical time series forecast consistent with scientific forecasting principles. This study has developed a deconvolution algorithm for recovering a signal component solution from the noisy historical record. Analyses of power spectral density show that the primary variability modes of interest operate on a multi-decadal basis, so decadal low-pass signal processing filters should be employed to suppress high-frequency noise. This is necessary for accurate application of statistical estimation methods and, importantly, serves to identify the smoothed level of the time series from which forecasts are made. The forecast is then an additive model with cyclical components appropriately phased on a parameterized trend from the level of the series. The underlying anthropogenic forcing trend evolves logarithmically with rising atmospheric CO2 concentration, scaled by the effective climate response (ECR, the temperature change per doubling of CO2). ECR incorporates other greenhouse gas forcings that are highly correlated with CO2, along with all feedback mechanisms and land surface interactions. ECR estimation is obscured by modes of variability whose estimated oscillatory functions must be subtracted from the historical series and filter-smoothed for the deconvolution process to be applied accurately. These estimates are updated iteratively as the ECR calculation develops and the relative importance of the sinusoidal functions becomes clear. Convergence to a solution is typically rapid, and the temperature forecast is completed with a projection of future atmospheric CO2 concentration.
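As a minimal sketch, the additive structure described above (a logarithmic anthropogenic trend scaled by ECR, plus one phased multidecadal sinusoid on the level of the series) might look like the following; the ECR default matches the value reported later in this summary, while `amp`, `phase_year`, and `offset` are illustrative placeholders, not the paper's fitted parameters:

```python
import numpy as np

def forecast_anomaly(co2_ppm, year, ecr=1.45, co2_ref=285.0,
                     amp=0.11, period=64.0, phase_year=1940.0, offset=-0.3):
    """Additive forecast sketch: anomaly = offset + trend(CO2) + cycle(year).

    ecr is in degrees C per CO2 doubling; amp, phase_year, and offset are
    illustrative placeholders rather than fitted values.
    """
    trend = ecr * np.log2(co2_ppm / co2_ref)                        # anthropogenic trend
    cycle = amp * np.sin(2 * np.pi * (year - phase_year) / period)  # natural mode
    return offset + trend + cycle
```

By construction, a doubling of CO2 raises the forecast by exactly the ECR, and the sinusoid modulates the trend without altering the long-run trajectory.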

The annual rate of change in CO2 was found to be well represented by a logistic function whose extrapolation closely approximates the CO2 concentrations of the RCP6.0 scenario in the IPCC AR5 report. Potential demographic, economic, and technological changes over the coming decades could shift the CO2 trajectory toward the RCP4.5 scenario. The RCP2.6 and RCP8.5 scenarios for future CO2 concentrations were found to be mathematically improbable.
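A hedged sketch of the logistic rate form and its forward integration follows; the parameter values (`r_max`, `k`, `t_mid`, and the 1959 starting concentration, roughly the start of the Mauna Loa record) are placeholders chosen for illustration, not the fitted values behind the RCP6.0 comparison:

```python
import numpy as np

def co2_rate(year, r_max=2.2, k=0.03, t_mid=1995.0):
    """Logistic form for the annual CO2 growth rate in ppm/yr;
    r_max, k, and t_mid are illustrative placeholders."""
    return r_max / (1.0 + np.exp(-k * (np.asarray(year, dtype=float) - t_mid)))

def co2_concentration(year, c0=315.0, y0=1959.0):
    """Extrapolate concentration by stepping the logistic rate forward
    annually from c0 ppm at year y0."""
    steps = np.arange(y0, year)          # one step per elapsed year
    return c0 + co2_rate(steps).sum()
```

Integrating the rate this way yields a concentration curve that grows slowly at first and steepens as the rate approaches its asymptote, which is the qualitative behavior the logistic extrapolation relies on.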

The deconvolution algorithm was applied to the Met Office Hadley Centre's HadCRUT4 global temperature anomaly data set, and two multi-decadal cycles were identified: a primary mode with 64-year periodicity and a secondary 20-year mode with 30% of the strength of the primary. ECR was found to be 1.45 °C per CO2 doubling with the primary mode alone and 1.43 °C when employing both. Considering forecasting-principle guidelines (periodicity uncertainty of ±5% versus ±20%, forecast horizon, attribution, and incremental error reduction), only the primary mode was incorporated into the forecasts. Forecasts were calculated to the end of this century spanning the RCP4.5 and RCP6.0 scenarios, although any CO2 scenario can be evaluated. The 64-year cycle has been attributed primarily to the Atlantic multidecadal oscillation (AMO), with supplemental influence of the Pacific decadal oscillation (PDO), which is phase-shifted relative to the AMO. Paleoclimate data spanning the past several centuries and thermodynamic modeling of the Atlantic meridional overturning circulation confirm the predictable nature of the cycle, and only about 1.3 cycles elapse before the end-century forecast horizon. The methodology has also been applied at a regional scale using observational data records. Region-specific ECR values and cycle parameterizations were expected and found, driven by local feedbacks, land surface characteristics, and the relative local influences of the AMO, the PDO, and anthropogenic forcing.
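The iterative scheme (remove the current cycle estimate, fit ECR, re-fit the sinusoid to the residual, repeat) can be sketched as alternating least squares on a fixed 64-year mode. This is a toy stand-in under stated assumptions, not the paper's algorithm: it omits the low-pass filtering stage and the secondary mode, and all parameter choices are illustrative:

```python
import numpy as np

def fit_ecr_with_cycle(years, temps, co2, period=64.0, n_iter=20):
    """Alternate between (1) fitting ECR to the series with the current
    cycle estimate removed and (2) re-fitting the fixed-period sinusoid
    to the trend residual.  A simplified sketch of the iterative idea."""
    forcing = np.log2(co2 / co2[0])          # doublings of CO2 since start
    cycle = np.zeros_like(temps)
    w = 2.0 * np.pi / period
    for _ in range(n_iter):
        # (1) fit temps - cycle = a + ecr * forcing by least squares
        A = np.vstack([np.ones_like(forcing), forcing]).T
        a, ecr = np.linalg.lstsq(A, temps - cycle, rcond=None)[0]
        resid = temps - (a + ecr * forcing)
        # (2) fit sin/cos amplitudes of the fixed-period mode to the residual
        B = np.vstack([np.sin(w * years), np.cos(w * years)]).T
        s, c = np.linalg.lstsq(B, resid, rcond=None)[0]
        cycle = s * np.sin(w * years) + c * np.cos(w * years)
    return ecr, cycle
```

Because the full model is jointly linear in the offset, ECR, and the two sinusoid amplitudes, this alternation converges quickly on clean data, consistent with the rapid convergence noted above.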

Forecast model validation employing hold-out methods has been performed for time horizons in excess of 40 years. Mean errors with the HadCRUT4 global temperature data set are essentially zero at all horizons, demonstrating a ten-fold improvement in forecast accuracy relative to methods based on climate modeling and to persistence forecasts taken from the level of the time series. Those methods show increasing error biases of approximately 0.15 °C per decade, with climate models over-forecasting and the persistence method under-forecasting. Approximately 20% of the climate-model bias was attributable to CO2 forecast error; the remaining 80% originated in internal model structural bias. Mean absolute error and confidence limits of this new forecast methodology are well-constrained relative to historical and anticipated changes in the level of the temperature time series, both at mid-century and at end-century.
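The hold-out scheme can be sketched generically: calibrate on data through a cutoff year, forecast the withheld horizon, and report bias (mean error) and mean absolute error. Here `fit_fn` and `forecast_fn` are hypothetical placeholders for any forecasting method under test:

```python
import numpy as np

def holdout_errors(years, temps, fit_fn, forecast_fn, cutoff):
    """Calibrate on data through `cutoff`, then score the withheld horizon.

    Returns the mean error (bias) and mean absolute error of the forecast;
    fit_fn and forecast_fn stand in for any method being validated.
    """
    train = years <= cutoff
    params = fit_fn(years[train], temps[train])
    err = forecast_fn(years[~train], params) - temps[~train]
    return {"mean_error": float(err.mean()),   # bias over the horizon
            "mae": float(np.abs(err).mean())}  # mean absolute error
```

For example, validating a simple linear fit with `fit_fn=lambda y, t: np.polyfit(y, t, 1)` and `forecast_fn=lambda y, p: np.polyval(p, y)` yields near-zero errors on noise-free linear data; a method with structural bias would instead show a mean error that grows with the horizon.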

Findings show that the temperature record contains a modulation of the anthropogenic trend by internal variability that is likely to continue into the future. Historical periods of warming and cooling are reconciled, including the 1970s cooling period and the current hiatus. The forecasts indicate the hiatus is likely to continue for the next two decades, followed by another warming period similar to that of the 1980s-90s. Temperature forecasts, regionally and globally, lie at the low end of the ranges suggested by IPCC climate model simulations, both at mid-century and at end-century, and are more tightly constrained than the wide range of AR5 projections. Global warming is found to approach, but not exceed, 2 °C of total increase a century from now relative to the pre-industrial era, with approximately half of that increase having already occurred. The demonstrated error improvements of this forecasting methodology and its reconciliation with the observational record indicate that climate model temperature projections can be significantly improved upon by the adoption of statistical forecasting methods, yielding greater accuracy, reduced uncertainty, reconciliation of research findings, and compliance with forecasting principles.