Presenters-0426
From CSDMS
Revision as of 12:36, 8 January 2019
CSDMS3.0 - Bridging Boundaries
Model Calibration with Dakota
Abstract
Many geophysical models require parameters that are not tightly constrained by observational data. Calibration refers to the methods by which these parameters are estimated by minimizing the difference between observational data and their model-simulated equivalents (the objective function). The uncertainty in the estimated parameters is also quantified.
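As a minimal illustration of this idea (not part of the clinic materials, and not using Dakota), the sketch below calibrates two parameters of a hypothetical exponential-decay model by least squares; the model, parameter names, and synthetic observations are all assumptions chosen for demonstration:

```python
# Illustrative sketch: least-squares calibration of a hypothetical model.
import numpy as np
from scipy.optimize import least_squares

# Synthetic "observations": known parameters plus small noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 5.0, 20)
true_a, true_k = 2.5, 0.8                      # amplitude, decay rate
obs = true_a * np.exp(-true_k * t) + 0.01 * rng.standard_normal(t.size)

def residuals(params):
    """Objective-function terms: observation minus model-simulated equivalent."""
    a, k = params
    return obs - a * np.exp(-k * t)

# Minimize the sum of squared residuals, starting from a rough guess.
fit = least_squares(residuals, x0=[1.0, 1.0])
print(fit.x)   # estimates should land near the true (2.5, 0.8)
```

The inverse of the approximate Hessian at the solution (`fit.jac.T @ fit.jac`) is one common route to the parameter-uncertainty estimates mentioned above.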
In this clinic we will cover the basics of model calibration, including: (1) choosing an appropriate objective function, (2) the major classes of calibration algorithms, and (3) interpreting the results.
In the hands-on portion of the clinic, we will apply multiple calibration algorithms to a simple test case. For this, we will use Dakota, a package that supports the application of many different calibration algorithms.
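Dakota studies are configured through a keyword-block input file. The fragment below is a hypothetical sketch only — the parameter names, file names, and choice of the nl2sol least-squares method are illustrative assumptions, not the clinic's actual setup:

```
# Hypothetical Dakota input sketch: calibrate 2 parameters against 10 residual terms.

method
  nl2sol                          # a Gauss-Newton-type least-squares method

variables
  continuous_design = 2
    descriptors    'k'   'n'      # illustrative parameter names
    initial_point   1.0   1.0
    lower_bounds    0.1   0.1
    upper_bounds   10.0  10.0

interface
  fork
    analysis_drivers = 'run_model.sh'   # user script: reads parameters, writes residuals

responses
  calibration_terms = 10
  numerical_gradients
  no_hessians
```

Swapping the `method` block (e.g., to a global or derivative-free method) is how Dakota lets you apply different calibration algorithms to the same test case.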
Please acknowledge the original contributors when you are using this material. If there are any copyright issues, please let us know (CSDMSweb@colorado.edu) and we will respond as soon as possible.