
Optimization and Complexity: Calibrating the Model [Systems thinking & modelling series]

This is part 59 of a series of articles featuring the book Beyond Connecting the Dots, Modeling for Meaningful Results.

Besides assessing model fit, historical data can also be used to calibrate model parameters. Depending on the model, you may have many parameters whose values you have no good way to determine. Earlier, we discussed using sensitivity testing to assess whether the results are resilient to this uncertainty and to build confidence in the model. Another way to build confidence in your parameter values is, rather than guessing the values of these uncertain parameters, to choose the set of values that produces the best fit between the simulated and historical data. This is a semi-objective criterion that helps remove potential personal biases from the modeling process.
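To make the idea concrete, here is a minimal sketch (not from the book) of calibrating two uncertain parameters of a hypothetical exponential-growth model by minimising the squared error between simulated and historical values. The data, model, and parameter names are illustrative assumptions; the optimiser is SciPy's general-purpose minimize.

```python
# Minimal calibration sketch: pick the parameter values that minimise
# the squared error between simulated and historical data.
# The exponential-growth model and the data below are hypothetical.
import numpy as np
from scipy.optimize import minimize

# Hypothetical historical observations (e.g., a stock measured in years 0..5)
years = np.arange(6)
historical = np.array([100, 132, 170, 228, 295, 390])

def simulate(params, t):
    """Simple exponential-growth model: initial_value * exp(rate * t)."""
    initial_value, rate = params
    return initial_value * np.exp(rate * t)

def sum_squared_error(params):
    """Goodness-of-fit measure comparing simulation to history."""
    simulated = simulate(params, years)
    return np.sum((simulated - historical) ** 2)

# Start from a rough guess and let the optimiser search for the
# best-fitting initial value and growth rate.
initial_guess = [90.0, 0.2]
result = minimize(sum_squared_error, initial_guess, method="Nelder-Mead")

best_initial, best_rate = result.x
print(f"Calibrated initial value: {best_initial:.1f}, growth rate: {best_rate:.3f}")
```

The same approach extends to models with more parameters, though the more parameters you calibrate at once, the more historical data you generally need to pin them down meaningfully.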

Next edition: Optimization and Complexity: Goodness of Fit.

Article sources: Beyond Connecting the Dots, Insight Maker. Reproduced by permission.


Scott Fortmann-Roe and Gene Bellinger

Scott Fortmann-Roe, creator of Insight Maker, and Gene Bellinger, creator of SystemsWiki, have written the innovative interactive book "Beyond Connecting the Dots" to demystify systems thinking and modelling.

