“Environmental modeling involves three basic steps. First, a theoretical model is developed. Next, the model is calibrated by comparing the theoretical model results with actual data and adjusting the model coefficients to fit the data. Finally, the model is verified by comparing the results of the calibrated model with data gathered under new conditions. If the model cannot be verified, it is not suitable for making policy decisions, and it must be revised further to ensure it can predict future conditions.
Theoretical climate models have predicted that global temperatures would rise as a result of increases in CO2 levels. However, contrary to those predictions, actual global temperatures have not risen significantly in the past 10-15 years despite continued increases in CO2 emissions. Therefore, the model coefficients and assumptions need to be adjusted, and the model verified, before it is used to make major policy decisions. Claiming that the actual global temperature data of the past 10-15 years are anomalous, but that the model is right, manipulates the results to achieve predetermined outcomes. The climate modelers are saying: “Don’t confuse me with the facts.”
Unless climate models are verified with actual data, there is no assurance that expenditures to reduce CO2 emissions would result in any environmental benefit.”
Robert J. Foxen, P.E., former chair of a U.S. Environmental Protection Agency task force responsible for deciding on construction-grant funding.
CEO, Global Common, LLC, Garden City, N.Y.
http://online.wsj.com/articles/computer-modeling-and-the-objectivity-of-the-process-letters-to-the-editor-1405884034