A next stage toward radically open modeling is the ability to compare models. This can be done by running multiple simulation models using, to the extent their structures allow, identical input variables.

By comparing the different results produced by multiple models using the same or similar input variables, users can see how differences in the models' assumptions and structure lead to different visions of how the future will look. Comparison across models can be facilitated by storing each run in the scenario library.
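
To make the procedure concrete, the following is a minimal sketch, not drawn from the source, of how two hypothetical models might be run on identical inputs, stored in a simple scenario library, and compared. The model functions, variable names, and numeric values are illustrative assumptions only.

```python
def model_a(inputs):
    # Hypothetical model A: emissions grow linearly with cumulative GDP growth.
    return {"emissions_2050": inputs["emissions_2020"] * (1 + inputs["gdp_growth"] * 30)}

def model_b(inputs):
    # Hypothetical model B: same drivers, but growth compounds annually.
    return {"emissions_2050": inputs["emissions_2020"] * (1 + inputs["gdp_growth"]) ** 30}

# Identical input variables fed to both models.
shared_inputs = {"emissions_2020": 36.0, "gdp_growth": 0.02}

# Each run is stored in a simple scenario library: model name, inputs, outputs.
scenario_library = []
for name, model in [("model_a", model_a), ("model_b", model_b)]:
    scenario_library.append({"model": name,
                             "inputs": dict(shared_inputs),
                             "outputs": model(shared_inputs)})

# Comparing stored runs shows how a structural difference (linear vs. compound
# growth) produces different projections from the same inputs.
for run in scenario_library:
    print(run["model"], round(run["outputs"]["emissions_2050"], 1))
```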

Comparing the results of simulation runs from multiple models that use the same or similar inputs is a technique frequently used by energy and climate modelers.

  • Stanford Energy Modeling Forum and Kyoto
    In the late 1990s, Stanford’s Energy Modeling Forum sponsored an effort in which thirteen teams that had built integrated assessment models (IAMs) ran simulations examining the anticipated effects of implementing the Kyoto Protocol. See John P. Weyant and Jennifer N. Hill, Introduction and Overview, Energy Journal, Special Issue, 1999, vii-xliv.
  • IPCC assessment reports
    These rely heavily on simulation runs of multiple general circulation models (GCMs) in which the same or similar assumptions are used. For example, see Climate Change 2007: The Physical Science Basis, Contribution of Working Group I to the Fourth Assessment Report of the IPCC, Chapter 8, Climate Models and their Evaluation, pp. 589-662.

In a sense, this is a different form of sensitivity testing: the results are examined not just for sensitivity to specific parameter values, but for sensitivity to the underlying model structure.

Return to Steps toward radically open modeling