CSDMS 2.0: Moving Forward
March 23-25, 2013, Boulder, Colorado, USA
Registration is now officially closed. Any requests to attend may be sent to firstname.lastname@example.org with subject line: CSDMS Mtg 2013.
Note: Do you want to make changes to your abstract?
- Select your registration record in "participants" and start making changes by clicking "Edit registration".
Helpful Information for Travel Home
SuperShuttle: 303-227-0000 Taxi Cab Service: 303-777-7777
Reminder: The meeting address is UCAR Center Green, Bldg. #CG1, 3080 Center Green Drive, Boulder CO 80301
Objectives and general description
The CSDMS Meeting 2013 is designed to launch CSDMS 2.0 and shape its direction by engaging with the technical and community challenges of the next five years.
The meeting includes: 1) State-of-the-art keynote presentations in earth-surface dynamics and modeling; 2) Hands-on clinics related to community models, tools and approaches; 3) Transformative software products and approaches designed to be accessible, easy to use, and relevant; 4) New community initiatives to advance earth-surface process modeling across many disciplines; 5) Breakout sessions for Working and Focus Research Groups to update their strategic plans and define their long, medium and short term goals; 6) Poster Sessions; and more.
Poster Information: Those who are bringing posters have been assigned to one of the two poster sessions via an email that was sent on March 11th. (If you did not receive that email, contact: email@example.com). The poster boards are configured for 4' wide by 6' tall (portrait orientation) posters. There are only a few spots available for posters with landscape orientation.
Program Schedule (updated March 23rd)
Join online pre-meeting discussions
ARCADIS U.S., Inc.
A Coupled ADCIRC and SWAN model of Hurricane Surge and Waves. This presentation will briefly introduce the formulation, numerics, and parallel implementation of the coastal circulation model ADCIRC, discuss the strategy of coupling with the SWAN wave model, and provide background on recent enhancements of the bottom-friction formulation. Several recent applications of the coupled modeling system will be presented.
Katy Barnhart
University of Colorado
Melting Coasts and Toppled Blocks: Modeling Coastal Erosion in Ice-Rich Permafrost Bluffs, Beaufort Sea, Alaska (with Robert S. Anderson, Irina Overeem, Gary Clow, and Frank Urban)
(thanks to Adam LeWinter and Tim Stanton)
Rates of coastal cliff erosion are a function of the geometry and substrate of the coast; storm frequency, duration, magnitude, and wave field; and regional sediment sources. In the Arctic, the duration of sea ice-free conditions limits the time over which coastal erosion can occur, and sea water temperature modulates erosion rates where ice content of coastal bluffs is high. Predicting how coastal erosion rates in this environment will respond to future climate change requires that we first understand modern coastal erosion rates.
Arctic coastlines are responding rapidly to climate change. Remotely sensed observations of coastline position indicate that the mean annual erosion rate along a 60-km reach of Alaska’s Beaufort Sea coast, characterized by high ice content and small grain size, doubled from 7 m yr-1 for the period 1955-1979 to 14 m yr-1 for 2002-2007. Over the last 30 years the duration of the open water season expanded from ∼45 days to ∼95 days, increasing exposure of permafrost bluffs to seawater by a factor of 2.5. Time-lapse photography indicates that coastal erosion in this environment is a halting process: most significant erosion occurs during storm events in which local water level is elevated by surge, during which instantaneous submarine erosion rates can reach 1-2 m/day. In contrast, at times of low water, or when sea ice is present, erosion rates are negligible.
We employ a 1D coastal cross-section numerical model of the erosion of ice-rich permafrost bluffs to explore the sensitivity of the system to environmental drivers. Our model captures the geometry and style of coastal erosion observed near Drew Point, Alaska, including insertion of a melt-notch, topple of ice-wedge-bounded blocks, and subsequent degradation of these blocks. Using consistent rules, we test our model against the temporal pattern of coastal erosion over two periods: the recent past (~30 years), and a short (~2 week) period in summer 2010. Environmental conditions used to drive model runs for the summer of 2010 include ground-based measurements of meteorological conditions (air temperature, wind speed, wind direction) and coastal waters (water level, wave field, water temperature), supplemented by high temporal frequency (4 frames/hour) time-lapse photography of the coast. Reconstruction of the 30-year coastal erosion history is accomplished by assembling published observations and records of meteorology and sea ice conditions, including both ground and satellite-based records, to construct histories of coastline position and environmental conditions. We model wind-driven water level set-up, the local wave field, and water temperature, and find a good match against the short-term erosion record. We then evaluate which environmental drivers are most significant in controlling the rates of coastal erosion, and which melt-erosion rule best captures the coastal history, with a series of sensitivity analyses. The understanding gained from these analyses provides a foundation for evaluating how continuing climate change may influence future coastal erosion rates in the Arctic.
Penn State University
Modeling The Isotopic “Age” of Water in Hydroecological Systems with PIHM Co-authors: Gopal Bhatt and Evan Thomas
Theories have been proposed using idealized tracer age modeling for ocean ventilation, atmospheric circulation, and soil, stream, and groundwater flow. In this research we develop new models for the dynamic age of water in hydroecological systems. Approaches generally assume a steady flow regime and stationarity in the concentration (tracer) distribution function for age, although recent work shows that this is not a necessary assumption. In this paper a dynamic model for flow, concentration, and age for soil water is presented, including the effect of macropore behavior on the relative age of recharge and transpired water. Several theoretical and practical issues are presented, including some new results for the Shale Hills CZO (G. Bhatt, 2012).
Michael S. Eldred
DAKOTA: An Object-Oriented Framework for Simulation-Based Iterative Analysis The DAKOTA project began in 1994 with the primary objective of reusing software interfaces to design optimization tools. Over nearly 20 years of development, it has grown into an open source toolkit supporting a broad range of iterative analyses, typically focused on high-fidelity modeling and simulation on high-performance computers. Today, DAKOTA provides a delivery vehicle for uncertainty quantification research for both the NNSA and the Office of Science, enabling an emphasis on predictive science for stockpile stewardship, energy, and climate mission areas.
Starting with an overview of the DAKOTA architecture, this presentation will introduce processes for setting up iterative analyses, interfacing with computational simulations, and managing high-fidelity workflows. Algorithmic capabilities in optimization, calibration, sensitivity analysis, and uncertainty quantification (UQ) will be briefly overviewed, with special emphasis given to UQ. Core UQ capabilities include random sampling methods, local and global reliability methods, stochastic expansion methods, and epistemic interval propagation methods. This UQ foundation enables a variety of higher level analyses including design under uncertainty, mixed aleatory-epistemic UQ, and Bayesian inference.
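To give a flavor of the sampling-based UQ capabilities mentioned above, here is a minimal sketch of Monte Carlo uncertainty propagation in Python. This is not DAKOTA itself; the `simulation` function and the input distributions are hypothetical stand-ins for an expensive forward model and its uncertain parameters.

```python
import numpy as np

# Illustrative random-sampling UQ: propagate input uncertainty through a
# forward model and summarize the output distribution.
rng = np.random.default_rng(42)

def simulation(x1, x2):
    # Hypothetical stand-in for an expensive model evaluation.
    return x1**2 + 3.0 * x2

# Uncertain inputs: x1 ~ Normal(1, 0.1), x2 ~ Uniform(0, 1)
x1 = rng.normal(1.0, 0.1, 10_000)
x2 = rng.uniform(0.0, 1.0, 10_000)
y = simulation(x1, x2)

# Propagated statistics: the mean should be near E[x1^2] + 3*E[x2] = 2.51
print(y.mean())
print(np.quantile(y, [0.05, 0.95]))  # 90% uncertainty interval
```

In practice a toolkit like DAKOTA manages exactly this loop (sampling, dispatching model runs, collecting statistics) for simulations far too costly to call ten thousand times naively, which is what motivates the more efficient reliability and stochastic expansion methods listed above.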
Linking Sediment Transport Processes and Biogeochemistry with Application to the Louisiana Continental Shelf Though it enhances the exchange of porewater and solids with the overlying water, the role that sediment resuspension and redeposition play in the biogeochemistry of coastal systems is debated. Numerical models of geochemical processes and diagenesis have traditionally represented relatively long timescales and have rarely attempted to include resuspension. Meanwhile, numerical models developed to represent sediment transport have largely ignored geochemistry. Here, we couple the Community Sediment Transport Modeling System (CSTMS) to a biogeochemical model within the Regional Ocean Modeling System (ROMS). The multi-layered sediment bed model accounts for erosion, deposition, and biodiffusion. It has recently been modified to include dissolved porewater constituents, particulate organic matter, and geochemical reactions.
For this talk, we explore the role that resuspension and redeposition play in biogeochemical cycles within the seabed and in the benthic boundary layer by running idealized, one-dimensional test cases designed to represent a 20-m deep site on the Louisiana Shelf. Results are contrasted with calculations from an implementation similar to a standard diagenesis model. The comparison indicates that resuspension acts to enhance sediment bed oxygen consumption.
University of Texas
Building a Network for Sediment Experimentalists and Modelers Wonsuck Kim, University of Texas at Austin
Leslie Hsu, Lamont-Doherty Earth Observatory, Columbia University
Brandon McElroy, University of Wyoming, Laramie
Raleigh Martin, University of Pennsylvania
In the modeler community, hindcasting (testing models against knowledge of past events) is required of all computer models before they can provide reliable results to users. CSDMS 2.0, “Moving Forward,” has proposed to incorporate benchmarking data into its modeling framework. Data collection in natural systems has advanced significantly, but it still lags the resolution in time and space that models require, and natural systems include variability beyond our understanding, which makes thorough testing of computer models difficult.
In the experimentalist community, research in Earth-surface processes and subsurface stratal development is in a data-rich era, with rapid expansion of high-resolution, digitally based data sets that were not available even a few years ago. Millions of dollars have been spent to build and renovate flume laboratories. Advanced experimental technologies and methodologies allow a greater number of sophisticated, large-scale experiments at fine detail. A joint effort between modelers and experimentalists is a natural step toward synergy between the two communities.
The time for a coherent effort to build a strong global research network for these two communities is now. First, both communities should initiate an effort to establish best practices and metadata standards for data collection. Sediment experimentalists are an example of a “long tail” community, meaning that their data are often collected in one-of-a-kind experimental set-ups and isolated from other experiments. Second, there should be a centralized knowledge base (a web-based repository for data and technology) easily accessible to modelers and experimentalists. Experimentalists also hold a lot of “dark data”: data that are difficult or impossible to access through the Internet. This effort will create tremendous opportunities for productive collaborations.
The new experimentalist and modeler network will be able to advance the current CSDMS goal by providing high-quality benchmark datasets that are well documented and easily accessible.
Underworld: A high-performance, modular long-term tectonics code Co-authors: John Mansour, Steve Quenette and Guillaume Duclaux
The Underworld code was designed for solving (very) long timescale geological deformations accurately, tracking deformation and evolving interfaces to very high strains. It uses a particle-in-cell based finite element method to track the material history accurately and highly-tuned multigrid solvers for fast implicit solution of the equations of motion. The implementation has been fully parallel since the inception of the project, and a plugin/component architecture ensures that extensions can be built without significant exposure to the underlying technicalities of the parallel implementation. We also paid considerable attention to model reproducibility and archiving — each run defines its entire input state and the repository state automatically.
A typical geological problem for which the code was designed is the deformation of the crust and lithospheric mantle by regional plate motions; these result in the formation of localised structures (e.g. faults), basins, and folds, and in the generation of surface topography. The role of surface processes, redistributing surface loads and changing boundary conditions, is known to be significant in modifying the response of the lithosphere to plate-derived forces. Coupling surface process codes to Underworld is feasible, but raises some interesting challenges (and opportunities!), such as the need to track horizontal deformations and to match changes to the topography at different resolutions in each model. We will share some of our insights into this problem.
Growth and Abandonment: Quantifying First-order Controls on Wave-Influenced Deltas. What determines the style of river delta growth? How do deltas change after fluvial sediment supply is cut off? River delta evolution is characterized by the progradation and transgression of individual (deltaic) lobes: the delta cycle. We investigate the behaviour of wave-influenced deltas with a simple shoreline model, and quantitatively relate several first-order controls.
Mark Schmeeckle
Arizona State University
Turbulence- and Particle-Resolving Numerical Modeling of Sediment Transport. Turbulence, bedload, and suspended sediment transport are directly simulated by a coupled large eddy simulation of the fluid and a distinct element method for every sediment grain. This modeling system directly calculates the motion of all grains by resolved turbulence structures. The model directly calculates the modification of the flow and turbulence by the grains, such as the effects of grain momentum extraction and density stratification. Simulations such as these can be used in the future to parameterize sediment transport in large-scale morphodynamic simulations.
Mauro Werder
Simon Fraser University
Modeling channelized and distributed subglacial drainage in 2D This model of the subglacial drainage system simulates the pressurised flow of water at the ice-bed interface of glaciers and ice sheets. It includes both distributed and channelized water flow. Notably, the model determines the geometry of the channel network as part of the solution. The resulting channel network is similar to subaerial stream networks, with channels carving out hydraulic potential "valleys". However, there are some pronounced differences from subaerial drainage: for example, the time for a network to form (and decay) is on the order of weeks to months, and channels originating at point sources can lie on ridges of the hydraulic potential. The model employs a novel finite element approach to solve the parabolic equations for the hydraulic potential simultaneously on the 1D channel network and the 2D distributed system.
Peter Burgess & Chris Jenkins
Royal Holloway, UK & University of Colorado
Three carbonate sedimentation models for CSDMS This workshop will showcase three different models of carbonate sedimentation, produced under the CSDMS umbrella: carboCat for facies, carboCell for guilds, and carboPop for communities. Participants will be able to download and run these models (on their own or provided machines) in Python and Matlab environments, discuss how to select appropriate parameters for them using the various databases being developed in concert with the models, and contribute to plans for further development of the models and databases.
Gary Clow
Introduction to the Weather Research & Forecasting (WRF) System, a High-Resolution Atmospheric Model WRF is a highly parallel state-of-the-art numerical weather prediction model hosted by the National Center for Atmospheric Research (NCAR). This community model was designed from the outset to be fairly flexible, supporting both operational forecasting and atmospheric research needs at scales ranging from meters to thousands of kilometers. Given the model’s physics implementation and its modular design, WRF naturally became the core for a number of more specialized models, including: HWRF (used to forecast the track and intensity of tropical cyclones), WRF-CHEM (simulates the emission, transport, mixing, and chemical transformation of trace gases and aerosols simultaneously with meteorology), Polar WRF (a version of WRF optimized for the polar regions), CWRF and CLWRF (versions of WRF modified to enable regional climate modeling), and planetWRF (a general purpose numerical model for planetary atmospheres used thus far for Mars, Venus, and Titan).
The goal of this clinic is to provide an overview of the WRF model, including: model architecture, physics options, data required to drive the model, standard model output, model applications, and system requirements. Several examples will be presented. A Basic Model Interface (BMI) is currently being developed for WRF to facilitate the coupling of this atmospheric model with other earth system models.
University of Colorado
Introduction to the Basic Model Interface and CSDMS Standard Names In order to simplify conversion of an existing model to a reusable, plug-and-play model component, CSDMS has developed a simple interface called the Basic Model Interface or BMI that model developers are asked to implement. In this context, an interface is a named set of functions with prescribed function names, argument types and return types. By design, the BMI functions are straightforward to implement in any of the languages supported by CSDMS, which include C, C++, Fortran (all versions), Java and Python. Also by design, the BMI functions are noninvasive. A BMI-compliant model does not make any calls to CSDMS components or tools and is not modified to use CSDMS data structures. BMI therefore introduces no dependencies into a model and the model can still be used in a "stand-alone" manner. Any model that provides the BMI functions can be easily converted to a CSDMS plug-and-play component that has a CSDMS Component Model Interface or CMI.
Once a BMI-enabled model has been wrapped by CSDMS staff to become a CSDMS component, it automatically gains many new capabilities. This includes the ability to be coupled to other models even if their (1) programming language, (2) variable names, (3) variable units, (4) time-stepping scheme or (5) computational grid is different. It also gains (1) the ability to write output variables to standardized NetCDF files, (2) a "tabbed-dialog" graphical user interface (GUI), (3) a standardized HTML help page and (4) the ability to run within the CSDMS Modeling Tool (CMT).
This clinic will explain the key concepts of BMI, with step-by-step examples. It will also include an overview of the new CSDMS Standard Names, which provide a standard way to map input and output variable names between component models as part of BMI implementation. Participants are encouraged to read the associated CSDMS wiki pages in advance and bring model code with specific questions. See
1) BMI Page: BMI_Description
2) Standard Names Page: CSDMS_Standard_Names
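To make the BMI idea concrete, below is a minimal sketch of BMI-style functions wrapping a toy 1D diffusion model in Python. The method names follow the pattern described above, but the toy model and the variable name used here are illustrative assumptions; consult the BMI page for the actual specification.

```python
import numpy as np

class DiffusionBMI:
    """Hypothetical toy model (1D heat diffusion) exposing BMI-style
    functions. Illustrative only; see the BMI_Description page for the
    real interface."""

    def initialize(self, config_file=None):
        # A real model would read these settings from config_file.
        self.dx, self.dt, self.kappa = 1.0, 0.2, 1.0
        self.time = 0.0
        self.temperature = np.zeros(21)
        self.temperature[10] = 100.0  # initial heat pulse in the center

    def update(self):
        # One explicit finite-difference step of the diffusion equation.
        T = self.temperature
        r = self.kappa * self.dt / self.dx**2  # 0.2, within stability limit
        T[1:-1] += r * (T[2:] - 2.0 * T[1:-1] + T[:-2])
        self.time += self.dt

    def finalize(self):
        self.temperature = None

    def get_current_time(self):
        return self.time

    def get_value(self, long_var_name):
        # An illustrative, made-up CSDMS-style standard name.
        if long_var_name == "land_surface__temperature":
            return self.temperature
        raise KeyError(long_var_name)

# A driver (or a coupling framework) only needs the interface functions:
model = DiffusionBMI()
model.initialize()
while model.get_current_time() < 1.0:
    model.update()
T = model.get_value("land_surface__temperature")
print(T.max())  # the pulse has spread and its peak has decayed
```

Because the model makes no calls into CSDMS itself, it still runs stand-alone exactly as shown, which is the noninvasive property the text above emphasizes.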
University of Colorado
CMT clinic This clinic will look at the CSDMS Modeling Tool (CMT). We will share the philosophy behind CMT, demo its functionality, and show which models are incorporated into it. New educational material on several models allows scientists and students to more easily use CSDMS models for classes and simple simulations, and we will provide clinic participants with the latest information on these resources. The CMT clinic will be hands-on: we will perform a few simple runs and visualize the results. Finally, we will spend some time discussing common problems and strategic solutions.
Thomas Hauser & Monte Lunacek
University of Colorado
Python for Matlab users clinic This workshop is a hands-on introduction to using Python for computational science. Python is a powerful open source interpreted language that has been adopted widely in many application areas. The goal of this workshop is to teach participants how to use Python as an open source alternative to MATLAB in their computational workflows. While we will demonstrate how to implement MATLAB-based scientific computing workflows in Python, attendees are not required to have MATLAB or Python experience. In the first part of this workshop we will introduce basic Python concepts and IPython, with a focus on migrating from MATLAB to Python. We will show how the Python modules NumPy and SciPy, for scientific computing, and Matplotlib, for plotting, can make Python as capable as MATLAB for computational science research. In the second part of the tutorial we will discuss how to interface Python with compiled languages like C or Fortran to improve the performance of numerical codes. Additionally, we will show how to use distributed parallel computing on a supercomputer from interactive Python notebooks.
This tutorial will be hands on, so we would like you to install python on your laptop before you arrive. The easiest way to get everything you need is to download the FREE Enthought distribution:
The installation is fairly straightforward, but if you have any questions, please feel free to email Monte: Monte.Lunacek@colorado.edu.
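As a small taste of the MATLAB-to-Python translation the clinic covers, here is a sketch (assuming only NumPy, which is included in the Enthought distribution) of a few common MATLAB idioms and their NumPy equivalents:

```python
import numpy as np

# MATLAB: x = linspace(0, 2*pi, 100);
x = np.linspace(0.0, 2.0 * np.pi, 100)

# MATLAB: y = sin(x) .* exp(-x/4);
# (elementwise operations are the default for NumPy arrays)
y = np.sin(x) * np.exp(-x / 4.0)

# MATLAB: A = [1 2; 3 4]; b = A \ [5; 6];
# (the backslash solve becomes np.linalg.solve)
A = np.array([[1.0, 2.0], [3.0, 4.0]])
b = np.linalg.solve(A, np.array([5.0, 6.0]))
print(b)  # [-4.   4.5]

# MATLAB indexing is 1-based; Python is 0-based:
print(y[0])  # first element, MATLAB's y(1)
```

The main adjustments when migrating are the 0-based indexing, explicit `np.` namespacing, and the fact that `*` on arrays is already elementwise (matrix products use `@` or `np.dot`).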
Toward Transparent, Refutable Hydrologic Models in Kansas or Oz. Numerical models are critical to integrating knowledge and data for environmental systems and understanding future consequences of management decisions, weather variability, climate change, and so on. To attain the transparency and refutability needed to understand predictions and uncertainty and use models wisely, this clinic presents a strategy that emphasizes fundamental questions about model adequacy, sensitivity analysis, and uncertainty evaluation, and consistent use of carefully designed metrics. Emphasizing fundamental questions reveals practical similarities in methods with widely varying theoretical foundations and computational demands. In a field where models take seconds to months for one forward run, a credible strategy must include frugal methods for those in Kansas who can only afford 10s to 100s of highly parallelizable model runs in addition to demanding methods for those in Oz who can afford to do 10,000s to 1,000,000s of model runs. Advanced computing power notwithstanding, people may be in Kansas because they have chosen complex, high-dimensional models, want quick insight into individual models, and/or need systematic comparison of many alternative models. This class will briefly review the fundamental questions, demonstrate relations between existing theoretical approaches, and address challenges and limitations. Students will be able to examine a model constructed using FUSE and compare results from computationally frugal method evaluations conducted in class and demanding methods for which results are provided.
During the clinic you will have the opportunity to run an exercise on your laptop. The exercise uses R, which is freely downloadable. The clinic is only an hour, so you will need to have downloaded and installed R before arriving. Do this as follows:
1) Go to http://cran.cnr.berkeley.edu/ and install version 2.15.3. Linux, Mac, and Windows versions are available; you can install with or without administrative privileges.
The R scripts you will be working with and the file with results from Sobol' can be downloaded from ftp://ftpext.cr.usgs.gov/pub/cr/co/boulder/mchill, in case you would like to try it out. Here are the rest of the instructions for doing that, but you can wait and do this in class if you like, as long as you have downloaded R.
2) Open Rgui.exe in the bin subdirectory of the R distribution.
3) Go to File > Open script "Sensitivities_Global_Local_v02.r"
4) Set your current working directory in the R script: setwd("full path") on line 17. This is the directory with the .r files distributed for the class. Change any \ to /. Spaces in the pathname are allowed.
5) Run by using the shortcuts Ctrl+a and Ctrl+r.
PDF files are produced showing plots of results. We will go through what these mean in class.
The Sobol’ results take 6,000,000 model runs and about 12 hours, so they cannot be run in class. They are provided in the file:
Each line presents average results for a bootstrapped Sobol’ sample for a portion of the full parameter space. The averages for the entire range of parameters are on the line with grid index=101.
The R script uses this file to create the plots; it does not perform the runs.
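To illustrate the kind of computationally frugal method discussed above, here is a sketch in Python (the clinic exercise itself uses R) of one-at-a-time, scaled finite-difference sensitivities. The `model` function is a hypothetical stand-in, not FUSE, and the scaling follows the spirit of composite scaled sensitivities rather than any specific package's definition.

```python
import numpy as np

def model(params):
    # Hypothetical stand-in for a hydrologic model: an exponential
    # recession curve with amplitude k and decay rate s.
    k, s = params
    return k * np.exp(-s * np.linspace(0.0, 1.0, 5))

def scaled_sensitivities(model, params, rel_step=0.01):
    """Frugal local sensitivity analysis: needs only len(params)+1 runs."""
    base = model(params)
    sens = []
    for i, p in enumerate(params):
        perturbed = params.copy()
        perturbed[i] = p * (1.0 + rel_step)
        dy_dp = (model(perturbed) - base) / (p * rel_step)
        # Scale the derivative by the parameter value and take the RMS
        # over observations so parameters with different units compare.
        sens.append(float(np.sqrt(np.mean((dy_dp * p) ** 2))))
    return sens

params = np.array([2.0, 0.5])  # [k, s]
print(scaled_sensitivities(model, params))
```

This is the "Kansas" end of the spectrum: n+1 highly parallelizable runs, versus the millions of runs the Sobol' analysis above required.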
UT San Antonio
Modeling of Earth Surface Dynamics and Related Problems using OpenFOAM®. This clinic aims to introduce the open source computational fluid dynamics (CFD) platform OpenFOAM® to the earth surface dynamics research community and to foster collaborations. OpenFOAM® is essentially a computational toolbox that solves general physical models (differential equations) using the finite volume method. This short clinic is tailored to an audience at various levels (from beginners to experienced code developers). It will provide an overview of OpenFOAM®. We will demonstrate its usage in a variety of applications, including hydrodynamics, sedimentation, groundwater flows, buoyant plumes, etc. Participants can also bring problems from their fields of interest and explore ways to solve them in OpenFOAM®. Knowledge of C++, object-oriented programming, and parallel computing is not required but will be helpful.
Eckart Meiburg & students
University of California, SB
TURBINS using PETSc This clinic will provide information on how laboratory-scale and field-scale flows can be simulated by direct numerical simulation (DNS) and large-eddy simulation (LES) using parallel, high-performance computing facilities. DNS results, from the software TURBINS, of gravity and turbidity currents propagating over complex sea floor topography will be discussed. The use of the PETSc software package within the DNS simulations will be highlighted. LES results of high Reynolds number gravity and turbidity currents, and of reversing buoyancy currents over a flat topography, will be discussed. Issues relevant to LES such as grid resolution, grid convergence, subgrid models and wall-layer modeling will also be discussed.
Helena Mitasova
North Carolina State Univ.
Modeling and analysis of evolving landscapes in GRASS GIS This clinic will introduce participants to GRASS 6.4.3, with special focus on terrain modeling, geomorphometry, watershed analysis, and the modeling of landscape processes such as surface water flow and erosion/deposition. The hands-on section will explore lidar-based terrain models, multiple-surface visualization, analysis of coastal lidar time series, and visualization of terrain evolution using a space-time cube. An overview of new capabilities in the GRASS 7 development version will also be provided.
Notice: Participants will be expected to download and install GRASS 6.4.3, as well as the practice data sets, from the provided web site prior to the clinic (see below).
Everything used in the clinic will be available through the following web site: http://courses.ncsu.edu/gis582/common/media/GRASS_clinic2013/GRASS_clinic.html
(I am still working on the material, but the install info is there).
Anytime before the clinic (which is on Monday March 25), please:
- download the data following the instructions for # 3. Data for the practice
- download and install GRASS following # 4. Software
- try opening GRASS following the instructions here, especially the video capture Getting started with GRASS
You don't need to go through the entire video or the instructions - we will do it in Boulder, for now just open GRASS and make sure you can display one of the provided map layers.
Please let Helena know if you have any problems: firstname.lastname@example.org
University of Miami
Dune erosion and overwash with XBeach A short tutorial and hands-on workshop on setting up and running XBeach to predict the morphodynamic response of dune-protected areas under hurricane conditions. We will cover the setup of the computational grid, boundary conditions, model processes, and data analysis.
The XBeach model runs on a Windows platform. If you have a Mac, you can still run the model provided you have software (like Parallels or VMware) that enables you to run Windows programs. To download XBeach, see: http://oss.deltares.nl/
University of Colorado
A very basic introduction to numerical methods for scientific computing I will give an overview of the basic foundations of numerical methods for modeling earth systems described by ordinary and partial differential equations. I will discuss the underlying foundations of finite-difference, finite-volume, and finite-element methods, using diffusion/conduction equations as an example. I will discuss explicit and implicit methods for time-stepping, and stability analysis of time-integration schemes. All numerical methods for ODEs and PDEs in some form arrive at algebraic approximations, translating the differential equations into systems of algebraic equations. I will discuss basic algorithms for solving systems of algebraic equations and how they are incorporated into various software packages, and also emphasize the importance of sparsity in matrix computations. I will include examples derived from practical problems in reactive transport and glacier dynamics to illustrate how basic concepts apply to real-world problems and make a difference when we want to develop efficient and accurate models.
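As a small illustration (not part of the lecture material itself) of how implicit time-stepping produces a sparse algebraic system, here is a Python sketch of backward-Euler diffusion solved with the tridiagonal (Thomas) algorithm, which exploits the sparsity emphasized above to solve each step in O(n) operations:

```python
import numpy as np

def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system with sub-, main, and super-diagonals
    a, b, c. Exploiting sparsity gives O(n) work vs O(n^3) for dense."""
    n = len(b)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Backward-Euler step for u_t = kappa * u_xx with fixed (Dirichlet) ends:
# (I - r L) u_new = u_old, with r = kappa*dt/dx^2. Unlike the explicit
# scheme, this is stable even for r far above the explicit limit of 0.5.
n, r = 21, 5.0
u = np.zeros(n)
u[n // 2] = 1.0  # initial pulse
a = np.full(n, -r); b = np.full(n, 1.0 + 2.0 * r); c = np.full(n, -r)
b[0] = b[-1] = 1.0
a[0] = a[-1] = c[0] = c[-1] = 0.0  # boundary rows keep u fixed
for _ in range(10):
    u = thomas_solve(a, b, c, u)
print(u.max())  # the pulse diffuses smoothly; no instability despite r = 5
```

The same three-diagonal structure is what sparse solver libraries detect and exploit automatically; storing and factoring the full dense matrix would waste nearly all of its zero entries.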
Interested in seeing who registered for the meeting?
Within its budget, CSDMS intends to support member applicants to attend the annual meeting. Toward this goal, we encourage members to fully or partially cover their own expenses if they are able. We additionally thank those in industry and agencies for understanding that 1) we cannot compensate federal agency participants, since our own funding is from NSF, and 2) we request that our industrial/corporate participants cover their own costs, thereby allowing more academic participants to attend.
To the extent possible, CSDMS intends to reimburse the registration fee, lodging (shared rooms at 100% and single rooms at 50% at Millennium Harvest House Hotel), and a limited amount of travel expenses of qualified registrants - those members who have attended all three days of the meeting and are not industry or federal employees.
Important for foreign travelers requesting reimbursement: If you need a visa to travel to the USA, select a business visa. If you need an invitation letter, please email email@example.com as soon as possible. Also indicate whether specific wording is required in the letter. Second, we will need to copy the entry stamp in your passport sometime during the meeting, as proof that you were here on business, as required by US tax law for reimbursement (especially when dealing with airfare).
The application period for the student scholarship is now closed.
Travel, Lodging and Conference Center Information
The meeting will be held at UCAR Conference Center
Lodging for meeting participants is at the Millennium Harvest House Hotel
Please visit the CSDMS contact page for advice on ways to reach Boulder from the Denver Airport.