Property:CSDMS meeting abstract presentation


This is a property of type Text.

'''Derek Nueharth''' - "Evolution of divergent and strike-slip boundaries in response to surface processes" Plate tectonics describes the movement of rigid plates at the surface of the Earth as well as their complex deformation at three types of plate boundaries: 1) divergent boundaries such as rift zones and mid-ocean ridges, 2) strike-slip boundaries where plates grind past each other, such as the San Andreas Fault, and 3) convergent boundaries that form large mountain ranges like the Andes. The generally narrow deformation zones that bound the plates exhibit complex strain patterns that evolve through time. During this evolution, plate boundary deformation is driven by tectonic forces arising from Earth’s deep interior and from within the lithosphere, but also by surface processes, which erode topographic highs and deposit the resulting sediment into regions of low elevation. Through the combination of these factors, the surface of the Earth evolves in a highly dynamic way with several feedback mechanisms. At divergent boundaries, for example, tensional stresses thin the lithosphere, forcing uplift and subsequent erosion of rift flanks, which creates a sediment source. Meanwhile, the rift center subsides and becomes a topographic low where sediments accumulate. This mass transfer from footwall to hanging wall plays an important role during rifting, as it prolongs the activity of individual normal faults. When rifting continues, continents are eventually split apart, exhuming Earth’s mantle and creating new oceanic crust. Because of the complex interplay between deep tectonic forces that shape plate boundaries and mass redistribution at the Earth’s surface, it is vital to understand feedbacks between the two domains and how they shape our planet. Here, we use numerical models to provide insight into how surface processes influence tectonics at divergent and strike-slip boundaries through two studies.
The first study takes a detailed look at the evolution of rift systems using two-dimensional models. Specifically, we extract faults from a range of rift models and correlate them through time to examine how fault networks evolve in space and time. By implementing a two-way coupling between the geodynamic code ASPECT and landscape evolution code FastScape, we investigate how the fault network and rift evolution are influenced by the system’s erosional efficiency, which represents many factors like lithology or climate. The second study uses the two-way numerical coupling between tectonics and landscape evolution to investigate how a strike-slip boundary responds to large sediment loads, and whether this is sufficient to form an entirely new type of flexural strike-slip basin. '''Danghan Xie''' - "Responses of mangrove forests to sea-level rise and human interventions: a bio-morphodynamic modelling study" Co-Authors - Christian Schwarz<sup>2,3</sup>, Maarten G. Kleinhans<sup>4</sup> and Barend van Maanen<sup>5</sup><br> <sup>2</sup>Hydraulics and Geotechnics, Department of Civil Engineering, KU Leuven, Belgium <br> <sup>3</sup>Department of Earth and Environmental Sciences, KU Leuven, Belgium <br> <sup>4</sup>Department of Physical Geography, Utrecht University, Utrecht, the Netherlands<br> <sup>5</sup>Department of Geography, University of Exeter, Exeter, UK <br> Corresponding author: Danghan Xie (danghan@bu.edu) <br> Mangroves preserve valuable coastal resources and services along tropical and subtropical shorelines. However, ongoing and future sea-level rise (SLR) is threatening mangrove habitats by increasing coastal flooding. Changing sediment availability, the development of coastal structures (such as barriers), and coastal restoration strategies (such as mangrove removal) not only constrain the living space of mangrove forests but also affect coastal landscape evolution. Because field studies are limited in the temporal and spatial scales they can capture, insights into mangrove responses to SLR and human interventions thus far remain inconclusive.
Results of bio-morphodynamic model predictions can fill this gap by accounting for interactions between vegetation, hydrodynamic forces, and sediment transport. Here, we present a numerical modeling approach to studying bio-morphodynamic feedbacks within mangrove forests through a coupled model technique using Delft3D and MATLAB. This approach takes into account (1) multiple colonization restrictions that control not only the initial mangrove colonization but also the subsequent response to SLR, (2) the possibility of coastal progradation and seaward mangrove expansion despite SLR under high sediment supply, (3) modulation of tidal currents based on vegetation presence and coastal profile evolution which, in turn, affect mangrove growth and even species distributions, and (4) profile reconfiguration under SLR which may contribute to the infilling of new accommodation space. Our model results display both spatial and temporal variations in sediment delivery across mangrove forests, leading to species replacements arising from landward sediment starvation and prolonged inundation. The strength of bio-morphodynamic feedbacks depends on variations in mangrove root density, which further steers the inundation-accretion decoupling and, as a result, mangrove distribution. Moreover, an extended analysis studying mangrove behaviors is conducted under varying coastal conditions, including varying tidal range, wave action, and sediment supply. The results indicate that mangroves in micro-tidal systems are most vulnerable, even if sediment availability is ample. Ultimately, coastal restoration strategies such as mangrove removal, intended to reduce local mud accumulation, may fail to meet that goal because sediment redistribution after removal can enhance coastal muddification. Further reading: * Xie, D., Schwarz, C., Brückner, M. Z., Kleinhans, M. G., Urrego, D. H., Zhou, Z., & Van Maanen, B. (2020).
Mangrove diversity loss under sea-level rise triggered by bio-morphodynamic feedbacks and anthropogenic pressures. Environmental Research Letters, 15(11), 114033. https://doi.org/10.1088/1748-9326/abc122 * Xie, D., Schwarz, C., Kleinhans, M. G., Zhou, Z., & van Maanen, B. (2022). Implications of Coastal Conditions and Sea‐Level Rise on Mangrove Vulnerability: A Bio‐Morphodynamic Modeling  
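The accretion-versus-SLR balance at the heart of the abstract above can be caricatured in a few lines. The sketch below is a deliberately minimal 0D point model, not the authors' Delft3D/MATLAB coupling; the function name, the linear accretion rule, and all parameter values are illustrative assumptions.

```python
# Minimal 0D sketch of the accretion-vs-SLR balance: accretion scales
# with inundation depth, so ample sediment (high k) lets the platform
# keep pace with sea-level rise while low supply lets it drown.
# NOT the authors' model; parameters are invented for illustration.

def simulate_platform(years, slr_rate, k_accretion, msl0=0.0, z0=0.0):
    """March a platform elevation forward in yearly steps and return
    the final inundation depth (mean sea level minus bed, in m)."""
    z, msl = z0, msl0
    for _ in range(years):
        depth = max(msl - z, 0.0)      # inundation depth (m)
        z += k_accretion * depth       # accretion (m/yr), zero when emergent
        msl += slr_rate                # sea-level rise (m/yr)
    return msl - z

# High sediment supply: depth converges to slr_rate / k = 0.05 m.
shallow = simulate_platform(200, slr_rate=0.005, k_accretion=0.1)
# Low sediment supply: the platform falls ever further behind.
drowned = simulate_platform(200, slr_rate=0.005, k_accretion=0.001)
```

The fixed point of the update rule is depth = slr_rate / k_accretion, which is the familiar equilibrium-depth argument for whether an intertidal platform keeps pace with SLR.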
(Thanks to Adam LeWinter and Tim Stanton)<br><br>Rates of coastal cliff erosion are a function of the geometry and substrate of the coast; storm frequency, duration, magnitude, and wave field; and regional sediment sources. In the Arctic, the duration of sea ice-free conditions limits the time over which coastal erosion can occur, and sea water temperature modulates erosion rates where ice content of coastal bluffs is high. Predicting how coastal erosion rates in this environment will respond to future climate change requires that we first understand modern coastal erosion rates.<br><br>Arctic coastlines are responding rapidly to climate change. Remotely sensed observations of coastline position indicate that the mean annual erosion rate along a 60-km reach of Alaska’s Beaufort Sea coast, characterized by high ice content and small grain size, doubled from 7 m yr<sup>-1</sup> for the period 1955-1979 to 14 m yr<sup>-1</sup> for 2002-2007. Over the last 30 years the duration of the open water season expanded from ∼45 days to ∼95 days, increasing exposure of permafrost bluffs to seawater by a factor of 2.5. Time-lapse photography indicates that coastal erosion in this environment is a halting process: most significant erosion occurs during storm events in which local water level is elevated by surge, during which instantaneous submarine erosion rates can reach 1-2 m/day. In contrast, at times of low water, or when sea ice is present, erosion rates are negligible.<br><br>We employ a 1D coastal cross-section numerical model of the erosion of ice-rich permafrost bluffs to explore the sensitivity of the system to environmental drivers. Our model captures the geometry and style of coastal erosion observed near Drew Point, Alaska, including insertion of a melt-notch, topple of ice-wedge-bounded blocks, and subsequent degradation of these blocks.
Using consistent rules, we test our model against the temporal pattern of coastal erosion over two periods: the recent past (~30 years), and a short (~2 week) period in summer 2010. Environmental conditions used to drive model runs for the summer of 2010 include ground-based measurements of meteorological conditions (air temperature, wind speed, wind direction) and coastal waters (water level, wave field, water temperature), supplemented by high temporal frequency (4 frames/hour) time-lapse photography of the coast. Reconstruction of the 30-year coastal erosion history is accomplished by assembling published observations and records of meteorology and sea ice conditions, including both ground and satellite-based records, to construct histories of coastline position and environmental conditions. We model wind-driven water level set-up, the local wave field, and water temperature, and find a good match against the short-term erosion record. We then evaluate which environmental drivers are most significant in controlling the rates of coastal erosion, and which melt-erosion rule best captures the coastal history, with a series of sensitivity analyses. The understanding gained from these analyses provides a foundation for evaluating how continuing climate change may influence future coastal erosion rates in the Arctic.  
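The halting, melt-limited erosion style described above lends itself to a simple rule-based sketch: erosion happens only when sea ice is absent and surge-elevated water reaches the frozen bluff face, at a rate scaling with water temperature. The rule form and coefficient below are illustrative assumptions, not the calibrated melt-erosion rule from the study.

```python
# Hedged sketch of a melt-limited notch-erosion rule: retreat accrues
# only on ice-free days when water level exceeds the notch base, at a
# rate proportional to water temperature. k_melt is an assumed value.

def notch_retreat(days, k_melt=0.3):
    """days: list of (open_water, water_level_m, water_temp_C) tuples,
    with water level relative to the notch base. Returns retreat (m)."""
    total = 0.0
    for open_water, level, temp in days:
        if open_water and level > 0.0 and temp > 0.0:
            total += k_melt * temp     # m/day per degree C (assumed)
    return total

# Two warm storm-surge days dominate a month of quiescent open water:
season = [(True, -0.2, 1.0)] * 28 + [(True, 0.4, 3.0)] * 2
retreat = notch_retreat(season)
```

The on/off character of the rule reproduces the "halting" behavior described above: a handful of surge days at 1-2 m/day of retreat outweigh weeks of negligible erosion.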
* Project Team 1: '''Exploring the effects of rainstorm sequences on a river hydrograph''', Brooke Hunter presenting (Brooke Hunter, University of Oregon, Celia Trunz, University of Arkansas, Lisa Luna, University of Potsdam, Tianyue Qu, University of Pittsburgh, and Yuval Shmilovitz, The Hebrew University of Jerusalem). * Project Team 2: '''Coupling grids with different geometries and scales: an example from fluvial geomorphology''', Rachel Bosch presenting (Rachel Bosch, University of Cincinnati, Shelby Ahrendt, University of Washington, Francois Clapuyt, Université Catholique de Louvain, Eric Barefoot, Rice University, Mohit Tunwal, Penn State University, Vinicius Perin, North Carolina State University, Edwin Saavedra Cifuentes, Northwestern University, Hima Hassenruck-Gudipati, University of Texas Austin and Josie Arcuri, Indiana University). * Project Team 3: '''Lagrangian particle transport through a tidal estuary''', Rachel Allen presenting (Rachel Allen, UC Berkeley, Ningjie Hu, Duke University, Jayaram Hariharan, University of Texas, Aleja Geiger-Ortiz, Colby College and Collin Roland, University of Wisconsin). * Project Team 4: '''Using Landlab to Model Tectonic Activities in a Landscape Evolution Model''', Gustav Pallisgaard-Olesen presenting (Gustav Pallisgaard-Olesen, Aarhus University, Xiaoni Hu, Penn State University, Eyal Mardar, Colorado State University, Liang Xue, Bowling Green State University, and Chris Sheehan, University of Cincinnati). * Project Team 5: '''Land geomorphology evolution over a continuous permafrost region by applying Ku-model and hillslope diffusion model''', Zhenming Wu presenting (Zhenming Wu, University of Reading, and Fien De Doncker, University of Lausanne).
11:00AM '''Introductions''' '''11:05AM Project Team 1: "Simulating Shoreline Change Using Coupled CoastSat and Coastline Evolution Model (CEM)"''', Ahmed Elghandour, TU Delft, Benton Franklin, UNC, Conner Lester, Duke U, Megan Gillen, MIT/WHOI, Meredith Leung, Oregon State U & Samuel Zapp, LSU. Sandy shorelines are areas of dynamic geomorphic change, evolving on timescales ranging from hours to centuries. As part of the CSDMS ESPIn workshop, this educational lab was designed to allow users to observe firsthand the long-term change of a sandy coast of their choosing and explore the processes driving that change. The CEM was developed by Ashton et al. (2001) as an exploratory model that uses wave climate characteristics to model the evolution of an idealized coastline. In this educational lab, we couple CoastSat, a Python tool that extracts shoreline geometry from satellite imagery (Vos et al., 2019), to the CEM by initializing the model with observed shorelines from anywhere in the world. The CEM is then further driven by an average wave climate derived from local buoy data. This allows users to visualize the evolution of any sandy beach in the world through time. Through an introductory-level coding exercise, users will learn how to extract complex datasets, run a geomorphic model, and explore the impact of different wave climates on a beach they care about. '''11:15AM Project Team 2: "Including wildfires in a landscape evolution model"''', Kevin Pierce, UBC, Laurent Roberge, Tulane U, Nishani Moragoda, U Alabama. Wildfires modify sediment inputs to streams by removing vegetation and encouraging overland flow. Unfortunately, our ability to calculate sediment delivery from wildfires remains limited. Here, we present a stochastic wildfire component we recently developed for Landlab. This work provides a new computational method to relate stream sediment yields to the frequency and magnitude of wildfires.
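A stochastic fire-forcing scheme of the kind Project Team 2 describes can be sketched, in outline, as a Poisson process in time with a random severity per event. The recurrence interval, severity distribution, and function names below are assumptions for illustration, not the team's actual Landlab component.

```python
# Sketch of stochastic wildfire forcing: exponentially distributed
# inter-fire times (a Poisson process) with a random severity drawn
# for each event. All parameters are illustrative assumptions.
import random

def fire_series(years, mean_interval=50.0, seed=42):
    """Return a list of (time_yr, severity) fire events over `years`,
    with exponential waiting times of the given mean (yr)."""
    rng = random.Random(seed)
    events = []
    t = rng.expovariate(1.0 / mean_interval)
    while t < years:
        events.append((t, rng.uniform(0.0, 1.0)))  # severity in [0, 1]
        t += rng.expovariate(1.0 / mean_interval)
    return events

fires = fire_series(1000, mean_interval=50.0)
# Roughly 1000 / 50 = 20 events are expected over the millennium.
```

In a landscape-evolution setting, each event would temporarily raise erodibility or lower vegetation cover, which then recovers between fires; that recovery rule is the part the Landlab component would supply.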
'''11:25AM Project Team 3: "Landscape-Scale Modeling across a variable-slip fault"''', Emery Anderson-Merritt, U Mass, Tamara Aranguiz, UW, Katrina Gelwick, ETH Zurich, Francesco Pavano, Lehigh U, & Josh Wolpert, U Toronto. The accommodation of deformation along a strike-slip fault can result in oblique kinematics featuring along-strike gradients in horizontal and vertical components of movement. While strike-slip fault models often simplify factors such as channel sedimentation, erosion processes and channel geometry, complex rock uplift fields related to oblique faulting may significantly impact the dynamics of a drainage system. With the objective of representing these along-strike kinematic variations commonly observed in strike-slip fault settings, we modify an existing Landlab component for lateral faulting (Reitmann et al., 2019) to incorporate spatially variable rock uplift. Our simulations demonstrate landscape evolution in an oblique faulting setting, highlighting the complicated response of a landscape’s drainage network and other geomarkers. '''11:35AM Project Team 4: "Paleoclimate and Elevation Data Used to Implement the Frost Cracking Window Concept"''', Risa Madoff, U North Dakota, Jacob Hirschberg, Swiss Federal Research Inst, Allie Balter, LDEO/Columbia U. Frost cracking is a key weathering process in cold environments (e.g., Hales & Roering). Concepts from previous work on frost cracking (Anderson, 1998) provide foundations for understanding regional controls on landscape evolution. Recent research applying transient climate records and the frost-cracking model to estimate weathering rates (Marshall et al., 2021) represents ways that computational approaches are being adopted in the community. To bring a frost-cracking model into the CSDMS framework, we combined elevation-scaled PMIP6 paleoclimate data with a soil thermal profile model extant in the CSDMS repository (Tucker, 2020) to estimate frost-cracking intensity at a landscape scale.
Our frost-cracking model is hosted in an EKT Jupyter notebook for instructional and exploratory applications of the thermal diffusion equation and the relationship between temperature and landscape development. In the future, our model could be implemented to compare modeled frost-cracking intensities with contemporary geomorphology in regions with differing climate histories. '''11:45AM Project Team 5: "Make storms, make erosion: How do storm intensity, duration, and frequency influence river channel incision?"''', Angel Monsalve, U Idaho, Sam Anderson, Tulane U, Safiya Alpheus, Penn State U, Muriel Bruckner, U Exeter, Mariel Nelson, UT Austin, Grace Guryan, UT Austin. Erosion in the river bed is usually associated with a representative scale of stream power or shear stress of a given flow discharge. However, on a catchment scale, assuming a constant, steady-state flow of water in channels may not be adequate to represent the erosion process because of the temporal and spatial variability in rainfall. We coupled three different Landlab components (OverlandFlow, DetachmentLimitedErosion, and SpatialPrecipitationDistribution) to create a more realistic representation of topographic evolution at the basin scale and analyze the influence of storm intensity, duration, and frequency on channel incision. '''11:55AM Project Team 6: "Simulation of sediment pulses in Landlab NetworkSedimentTransporter (NST) component"''', Se Jong Cho, USGS, Muneer Ahammad, Virginia Tech, Marius Huber, U de Lorraine, Mel Guirro, Durham U. We synthetically introduce sediment pulses to simulate erosive conditions, such as those caused by fire or landslides in the landscape, and track sediment yield across the river network using the Landlab NetworkSedimentTransporter (NST) component. The goal of the project is to couple existing Landlab models with external drivers of sediment sources and other input conditions that drive sediment transport.
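The sediment-pulse experiment Project Team 6 describes can be caricatured with a chain of linked reaches, each passing a fixed fraction of its stored sediment downstream per time step, so that a sudden pulse translates and disperses toward the outlet. This is a toy sketch, not the NetworkSedimentTransporter; the link count and transfer fraction are arbitrary assumptions.

```python
# Toy routing of a sediment pulse down a chain of river links. Each
# step, every link exports a fixed fraction of its stored sediment to
# the next link downstream; the outlet link exports out of the system.

def route_pulse(n_links=5, steps=40, transfer=0.3, pulse=100.0):
    """Inject `pulse` (arbitrary volume units) at the headwater link and
    return the per-step export series from the outlet link."""
    storage = [0.0] * n_links
    storage[0] = pulse                  # e.g. a landslide- or fire-derived pulse
    outlet_series = []
    for _ in range(steps):
        moved = [transfer * s for s in storage]
        for i in range(n_links):
            storage[i] -= moved[i]
            if i + 1 < n_links:
                storage[i + 1] += moved[i]
        outlet_series.append(moved[-1])  # export leaving the outlet
    return outlet_series

out = route_pulse()
peak_step = max(range(len(out)), key=out.__getitem__)
# The outlet signal is delayed and attenuated relative to the input pulse.
```

The delayed, smeared outlet signal is the qualitative behavior the pulse experiments probe; the NST tracks the same idea with discrete sediment parcels and hydraulics on a real network.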
'''12:05-12:15PM Team 7: "Simulating Craters on Planetary Surfaces"''', Emily Bamber, UT Austin, Gaia Stucky de Quay, Harvard. Impact cratering has been and still is the main geomorphic process on many planetary bodies, and is therefore key to understanding the evolution of planetary surfaces and their habitability. There are existing numerical models of planetary surface evolution that include cratering, but they are written in Fortran. As part of the CSDMS ESPIn 2021 summer workshop, we used the concepts for simulating crater shape and frequency on the surface from Howard (2007), and wrote a Python code to simulate cratering, specifically on Mars. This code is freely available on GitHub, and currently utilises the Landlab model grid, which means our model integrates easily with the numerous landscape evolution modules that already exist as part of Landlab.  An educational lab detailing the approach to simulating craters has also been produced and is available on the CSDMS website.
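The cratering approach described above reduces, in outline, to sampling crater sizes from a production function and stamping bowl-shaped depressions onto an elevation grid. The sketch below is a simplified illustration of that idea, not the team's actual code; the power-law slope, size range, and depth-to-diameter ratio are assumptions.

```python
# Sketch of grid-based crater stamping: sample diameters from an
# assumed D^-2 power-law production function (inverse-CDF sampling)
# and subtract parabolic bowls from a regular elevation grid.
import numpy as np

def add_crater(z, cx, cy, radius, depth_ratio=0.2):
    """Stamp a parabolic bowl of given radius centered at (cx, cy);
    deepest at the center, zero at the rim."""
    ny, nx = z.shape
    y, x = np.ogrid[:ny, :nx]
    r2 = (x - cx) ** 2 + (y - cy) ** 2
    bowl = r2 < radius ** 2
    depth = depth_ratio * radius
    z[bowl] += depth * (r2[bowl] / radius ** 2 - 1.0)
    return z

rng = np.random.default_rng(0)
z = np.zeros((100, 100))
for _ in range(30):
    # Invert the CDF of pdf ∝ D^-2 on [2, 20] pixels (assumed slope):
    u = rng.random()
    d = 1.0 / (1.0 / 2.0 - u * (1.0 / 2.0 - 1.0 / 20.0))
    add_crater(z, rng.integers(100), rng.integers(100), d / 2.0)
```

Because small craters vastly outnumber large ones under a power-law production function, most stamps are near the minimum diameter, which is the size-frequency behavior the Howard (2007) approach is built around.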
<i>Background</i><br>When it comes to building a general, efficient, surface process code, there are a couple of significant challenges that stand in our way. One is to address the interesting operators that appear in the mathematical formulation that are not commonly encountered in computational mechanics. The other is to cater for the many different formulations that have been put forward in the literature as no single, universal set of equations has been agreed upon by the community.<br><br><i>Computational Approach</i><br>We view Quagmire as a community toolbox and acknowledge that this means there is no one best way to formulate any of the landscape evolution models. We instead provide a collection of useful operators and examples of how to assemble various kinds of models. We do assume that:<br><ul><li>the surface is a single-valued height field described by the coordinates on a two-dimensional plane</li><li>the vertical evolution can be described by the time-derivative of the height field</li><li>the horizontal evolution can be described by an externally imposed velocity field</li><li>the formulation can be expressed through (non-linear) operators on the two dimensional plane</li><li>any sub-grid parameterisation (e.g. of stream bed geometry) is expressible at the grid scale</li><li>a parallel algorithm is desirable</li></ul>We don't make any assumptions about:<br><ul><li>the nature of the mesh: triangulation or a regular array of 'pixels'</li><li>the parallel decomposition (except that it is assumed to be general)</li><li>the specific erosion / deposition / transport model</li></ul>Quagmire is a collection of python objects that represent parallel vector and matrix operations on meshes and provide a number of useful surface-process operators in matrix-vector form. The implementation is through PETSc, numpy, and scipy. 
Quagmire is open source and a work in progress.<br><br><i>Mathematical Approach</i><br>Matrix-vector multiplication is the duct tape of computational science: fast, versatile, ubiquitous. Any problem that can be formulated in terms of basic linear algebra operations can usually be rendered into an abstract form that opens up multiple avenues to solve the resulting matrix equations and it is often possible to make extensive use of existing fast, parallel software libraries. Quagmire provides parallel, matrix-based operators on regular Cartesian grids and unstructured triangulations for local operations such as gradient evaluation, diffusion, and smoothing, but also for non-local, graph-based operations that include catchment identification, upstream summation, and flood-filling. <br>The advantage of the formulation is in the degree of abstraction that is introduced, separating the details of the discrete implementation from the solution phase. The matrix-based approach also makes it trivial to introduce concepts such as multiple-pathways in graph traversal / summation operations without altering the structure of the solution phase in any way. Although we have not yet done so, there are obvious future possibilities in developing implicit, non-linear solvers to further improve computational performance and to place the model equations in an inverse modelling framework.
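The matrix-vector idea behind the non-local operators above can be illustrated on a toy network: encode "node i drains to node j" as a 0/1 matrix D, and upstream summation becomes the series a = (I + D + D² + ...) applied to the cell-area vector by repeated matrix-vector products. The five-node example below is purely illustrative and uses small dense NumPy arrays rather than Quagmire's parallel PETSc machinery.

```python
# Upstream accumulation in matrix-vector form on a toy 5-node network:
# 0 -> 2, 1 -> 2, 2 -> 4, 3 -> 4, with node 4 the outlet.
import numpy as np

receivers = {0: 2, 1: 2, 2: 4, 3: 4}   # node i drains to receivers[i]
n = 5
D = np.zeros((n, n))
for i, j in receivers.items():
    D[j, i] = 1.0                      # column i sends to row j

area = np.ones(n)                      # unit cell areas
acc = area.copy()
term = area.copy()
for _ in range(n):                     # any flow path has < n hops
    term = D @ term                    # push contributions one hop downstream
    acc += term

# acc[i] is now the total area draining through node i.
```

On a directed acyclic flow graph the series terminates after at most n terms, so the whole accumulation is a handful of mat-vecs, which is exactly the kind of operation that parallelizes well in a PETSc-style framework.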
A comprehensive understanding of hydrologic processes affecting streamflow is required to effectively manage water resources to meet present and future human and environmental needs. The National Hydrologic Model (NHM), developed by the U.S. Geological Survey, can address these needs with an approach supporting coordinated, comprehensive, and consistent hydrologic modeling at multiple scales for the conterminous United States. The NHM fills knowledge gaps in ungaged areas, providing nationally consistent, locally informed, stakeholder-relevant results. In this presentation, we will introduce the NHM and a publicly available Dockerized version that is currently providing daily operational results of water availability and stream temperature. We finish with a quick demonstration of a new experimental version of PRMS, the NHM's underlying hydrologic model, available through the CSDMS Python Modeling Toolkit (pymt).
A presentation from Phaedra and Greg, presented at the Modeling Collaboratory for Subduction Research Coordination Network Webinar Series, which features conversations between the leaders of successful interdisciplinary collaborations (see also https://www.sz4dmcs.org/webinars).
A range of Earth surface processes may drive rapid ice sheet retreat in the future, contributing to equally rapid global sea level rise. Though the pace of discovering these new feedback processes has accelerated in the past decade, predictions of future evolution of ice sheets are still subject to considerable uncertainty, originating from unknown future carbon emissions and poorly understood ice sheet processes. In this talk, I explain why sea level rise projections past the next few decades are so uncertain, how we are developing new stochastic ice sheet modeling methods to reduce uncertainty in projections, and what the limits of uncertainty reduction are. I also discuss the ongoing debate over whether uncertainty is important to consider at all in developing sea level projections that are usable by coastal planners.
A recent trend in the Earth Sciences is the adoption of so-called “Digital Twins”. In Europe, multi-million and even multi-billion euro projects have been initiated, for example the Digital Twin of the Ocean and the Digital Twin Earth. Many smaller digital-twin projects are also popping up in the fields of city management, tunnels, hydraulic structures, waterways and coastal management. But what are Digital Twins really? Why are they trending now? What makes a Digital Twin different from a serious game, a numerical model or a simulator? In this session we will look at examples of digital twins, compare them to more traditional platforms, and together define our expectations of future digital twins.
A wide variety of hydrological models are used by hydrologists: some differ because they were designed for different applications, some because of personal preferences of the modeller. All of them share the property that, like most scientific research code, it is rather hard to get someone else's model to run. The recently launched eWaterCycle platform takes away the headache of working with each other's models. In eWaterCycle models are run in containers and communicate with the central (Jupyter based) runtime environment through BMI. In this way a user can be talking to a Fortran model from Python without having to know anything about Fortran. Removing this headache allows hydrologists to easily run and couple each other's models, facilitating science questions like the impact of model choice on results, or coupling different (regional, process) models together with ease. In this talk I will highlight (and demonstrate) both the technology behind the eWaterCycle platform as well as the current and future research being done using the platform.
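The BMI pattern that eWaterCycle relies on can be made concrete with a toy model hidden behind the standard calls (initialize / update / get_value / finalize). The sketch below shows the interface style only; it is not eWaterCycle code, and the one-bucket linear-reservoir physics and variable names are invented for illustration.

```python
# A toy hydrological model behind a BMI-style facade: the caller drives
# it through initialize / update / get_value without knowing anything
# about its internals, which is the point of the interface.

class LinearReservoirBMI:
    """One linear reservoir: discharge = k * storage each daily step."""

    def initialize(self, config):
        self.k = config.get("recession", 0.1)         # 1/day (assumed)
        self.storage = config.get("storage", 100.0)   # mm
        self.discharge = 0.0
        self.time = 0.0

    def update(self):
        self.discharge = self.k * self.storage        # mm/day
        self.storage -= self.discharge
        self.time += 1.0

    def get_value(self, name):
        return {"discharge": self.discharge, "storage": self.storage}[name]

    def finalize(self):
        pass

# The calling code is model-agnostic; only the interface matters:
model = LinearReservoirBMI()
model.initialize({"recession": 0.1, "storage": 100.0})
for _ in range(10):
    model.update()
q = model.get_value("discharge")
```

In eWaterCycle the same calls cross a container boundary, so the model behind them may just as well be Fortran or C as Python.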
ANUGA is an open source software package capable of simulating small-scale hydrological processes such as dam breaks, river flooding, storm surges and tsunamis. ANUGA is a Python-language model that solves the Shallow Water Wave Equation on an unstructured triangular grid and can simulate shock waves and rapidly changing flows. It was developed by the Australian National University and Geosciences Australia and has an active developer and user community.<br><br>The package supports discontinuous elevation, or ‘jumps’ in the bed profile between neighbouring cells. This has a number of benefits. Firstly it can preserve lake-at-rest type stationary states with wet-dry fronts. It can also simulate very shallow frictionally dominated flow down sloping topography, as typically occurs in direct-rainfall flood models. A further benefit of the discontinuous-elevation approach, when combined with an unstructured mesh, is that the model can sharply resolve rapid changes in the topography associated with e.g. narrow prismatic drainage channels, or buildings, without the computational expense of a very fine mesh. The boundaries between such features can be embedded in the mesh using break-lines, and the user can optionally specify that different elevation datasets are used to set the elevation within different parts of the mesh (e.g. often it is convenient to use a raster digital elevation model in terrestrial areas, and surveyed channel bed points in rivers). The discontinuous-elevation approach also supports a simple and computationally efficient treatment of river walls. These are arbitrarily narrow walls between cells, higher than the topography on either side, where the flow is controlled by a weir equation and optionally transitions back to the shallow water solution for sufficiently submerged flows. 
This allows modelling of levees or lateral weirs which are much finer than the mesh size.<br><br>This clinic will provide a hands-on introduction to hydrodynamic modeling using ANUGA. We will discuss the structure and capabilities of the model as we build and run increasingly complex simulations involving channels and river walls. No previous knowledge of Python is required. Example input files will be provided and participants will be able to explore the code and outputs at their own pace.  
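The river-wall treatment mentioned above controls overtopping flow with a weir equation; the familiar broad-crested weir form is Q = C · L · H^(3/2). The coefficient below is a generic textbook value in SI units, used here for illustration only, not ANUGA's internal formulation.

```python
# Generic broad-crested weir relation for flow over a wall:
# Q = C * L * H^(3/2), with C an assumed SI coefficient (~1.7),
# L the crest length (m), H the upstream head above the crest (m).

def weir_discharge(head, crest_length, c=1.7):
    """Discharge (m^3/s) over the wall; zero when not overtopped."""
    if head <= 0.0:
        return 0.0
    return c * crest_length * head ** 1.5

q_low = weir_discharge(0.1, 10.0)    # barely overtopped
q_high = weir_discharge(0.4, 10.0)   # 4x the head
# Quadrupling the head multiplies Q by 4^(3/2) = 8.
```

The 3/2-power head dependence is what makes the transition back to the shallow-water solution matter: once the wall is deeply submerged, the weir relation over-controls the flow and a momentum-conserving solution takes over.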
ANUGA is an open source software package capable of simulating small-scale hydrological processes such as dam breaks, river flooding, storm surges and tsunamis. Thanks to its modular structure, we’ve incorporated additional components into ANUGA that allow it to model suspended sediment transport and vegetation drag. ANUGA is a Python-language model that solves the Shallow Water Wave Equation on an unstructured triangular grid and can simulate shock waves and rapidly changing flows. It was developed by the Australian National University and Geosciences Australia and has an active developer and user community.<br><br>This clinic will provide a hands-on introduction to hydrodynamic modeling using ANUGA. We will discuss the structure and capabilities of the model as we build and run increasingly complex simulations. No previous knowledge of Python is required. Example input files will be provided and participants will be able to explore the code and outputs at their own pace.
Accurately characterizing the spatial and temporal variability of water and energy fluxes in many hydrologic systems requires an integrated modeling approach that captures the interactions and feedbacks between groundwater, surface water, and land-surface processes. Increasing recognition that these interactions and feedbacks play an important role in system behavior has led to exciting new developments in coupled surface-subsurface modeling, which is becoming an increasingly useful tool for describing many hydrologic systems.<br><br>This clinic will provide a brief background on the theory of coupled surface-subsurface modeling techniques and parallel applications, followed by examples and hands-on experience using ParFlow, an open-source, object-oriented, parallel watershed flow model. ParFlow includes fully-integrated overland flow; the ability to simulate complex topography, geology and heterogeneity; and coupled land-surface processes including the land-energy budget, biogeochemistry, and snow processes. ParFlow is multi-platform and runs with a common I/O structure from laptop to supercomputer. ParFlow is the result of a long, multi-institutional development history and is now a collaborative effort between CSM, LLNL, UniBonn, and UC Berkeley. Many different configurations related to common hydrologic problems will be discussed through example problems.
Addressing society's water and energy challenges requires sustainable use of the Earth's critical zones and subsurface environment, as well as technological innovations in treatment and other engineered systems. Reactive transport models (RTMs) provide a powerful tool to inform engineering design and provide solutions for these critical challenges. In this keynote, I will showcase the flexibility and value of RTMs using real-world applications that focus on (1) assessing groundwater quality management with respect to nitrate under agricultural managed aquifer recharge, and (2) systematically investigating the physical, chemical and biological conditions that enhance CO<sub>2</sub> drawdown rates in agricultural settings using enhanced weathering. The keynote will conclude with a discussion of the possibilities to advance the use of reactive transport models and future research opportunities therein.
Agent-Based Modeling (ABM) or Individual-Based Modeling is a research method rapidly increasing in popularity, particularly among social scientists and ecologists interested in using simulation techniques to better understand the emergence of interesting system-wide patterns from simple behaviors and interactions at the individual scale. ABM researchers frequently partner with other scientists on a wide variety of topics related to coupled natural and human systems. Human societies impact (and are impacted by) various earth systems across a wide range of spatial and temporal scales, and ABM is a very useful tool for better understanding the effect of individual and social decision-making on various surface processes. The clinic will focus on introducing the basic toolkit needed to understand and pursue ABM research, and consider how ABM work differs from other computational modeling approaches. The clinic will:<ul><li>Explore examples of the kinds of research questions and topics suited to ABM methods.</li><li>(Attempt to) define some key concepts relevant to ABM research, such as emergence, social networks, social dilemmas, and complex adaptive systems.</li><li>Provide an introduction to ABM platforms, particularly focused on NetLogo.</li><li>Discuss approaches to verification, validation, and scale dependency in the ABM world.</li><li>Introduce the Pattern-Oriented Modeling approach to ABM.</li><li>Discuss issues with reporting ABM research (ODD specification, model publishing).</li><li>Brainstorm tips and tricks for working with social scientists on ABM research.</li></ul>
Agent-Based Models (ABMs) can provide important insights into the nonlinear dynamics that emerge from the interactions of individual agents. While ABMs are commonly used in the social and ecological sciences, this rules-based modeling approach has not been widely adopted in the Surface Dynamics Modeling community. In this clinic, I will show how to build mixed models that utilize ABMs for some processes (e.g., forest dynamics and soil production) and numerical solutions to partial differential equations for other processes (e.g., hillside sediment transport). Specifically, I will introduce participants to pyNetLogo, a library that enables coupling between NetLogo ABMs and Python-based Landlab components. Active developers in either the NetLogo or Landlab communities will find this clinic useful, but experience in both programming languages is not needed.  +
Agent-based modeling (ABM) developed as a method to simulate systems that include a number of agents – farmers, households, governments as well as biological organisms – that make decisions and interact according to certain rules. In environmental modeling, ABM is one of the best ways to explicitly account for human behavior, and to quantify cumulative actions of various actors distributed over the spatial landscape. This clinic provides an introduction to ABM and covers such topics as:<ol><li>Modeling heterogeneous agents that vary in attributes and follow different decision-strategies</li><li>Going beyond rational optimization and accommodating bounded rationality</li><li>Designing/representing agents’ interactions and learning</li></ol>The clinic provides hands-on examples using the open-source modeling environment NetLogo https://ccl.northwestern.edu/netlogo. While no prior knowledge of NetLogo is required, participants are welcome to explore its super user-friendly tutorial. The clinic concludes with highlighting the current trends in ABM such as its applications in climate change research, participatory modeling and its potential to link with other types of simulations.  +
Agent-based modeling (ABM) is a powerful simulation tool with applications across disciplines. ABM has also emerged as a useful tool for capturing complex processes and interactions within socio-environmental systems. This workshop will offer a brief introduction to ABM for socio-environmental systems modeling including an overview of opportunities and challenges. Participants will be introduced to NetLogo, a popular programming language and modeling environment for ABM. In groups, participants will have the hands-on opportunity to program different decision-making methods in an existing model and observe how outcomes change. We will conclude with an opportunity for participants to raise questions or challenges they are experiencing with their own ABMs and receive feedback from the group.  +
An abstract was not required for this meeting  +
An overview of what the interagency Working Group stands for.  +
An update of what CSDMS has accomplished so far.  +
An update of what CSDMS has accomplished so far.  +
An update on CoMSES.  +
Answers to scientific questions often involve coupled systems that lie within separate fields of study. An example of this is flexural isostasy and surface mass transport. Erosion, deposition, and moving ice masses change loads on the Earth surface, which induce a flexural isostatic response. These isostatic deflections in turn change topography, which is a large control on surface processes. We couple a landscape evolution model (CHILD) and a flexural isostasy model (Flexure) within the CSDMS framework to understand interactions between these processes. We highlight a few scenarios in which this feedback is crucial for understanding what happens on the surface of the Earth: foredeeps around mountain belts, rivers at the margins of large ice sheets, and the "old age" of decaying mountain ranges. We also show how the response changes from simple analytical solutions for flexural isostasy to numerical solutions that allow us to explore spatial variability in lithospheric strength. This work places the spotlight on the kinds of advances that can be made when members of the broader Earth surface process community design their models to be coupleable, share them, and connect them under the unified framework developed by CSDMS. We encourage Earth surface scientists to unleash their creativity in constructing, sharing, and coupling their models to better learn how these building blocks make up the wonderfully complicated Earth surface system.  +
Are you confused about the best way to make your models and data accessible, reusable, and citable by others? In this clinic we will give you tools, information, and some dedicated time to help make your models and data FAIR - findable, accessible, interoperable and reusable. Models in the CSDMS ecosystem are already well on their way to being more FAIR than models that are not. But here, you will learn more about developments, guidelines, and tools from recent gatherings of publishers, repository leaders, and information technology practitioners at recent FAIR Data meetings, and translate this information into steps you can take to make your scientific models and data FAIR.  +
Are you interested in expanding the reach of your scientific data or models? One way of increasing the FAIRness of your digital resources (i.e., making them more findable, accessible, interoperable, and reusable) is by annotating them with metadata about the scientific variables they describe. In this talk, we provide a simple introduction to the Scientific Variables Ontology (SVO) and show how, with only a small number of design patterns, it can be used to neatly unpack the definitions of even quite complex scientific variables and translate them into machine-readable form.  +
Are you tired of hearing about the FAIR Principles? This clinic is for you then, because after you participate you’ll never need to attend another one!* Good science depends on the careful and meticulous management and documentation of our research process. This includes our computational models, the datasets we use, the data transformation, analysis, and visualization scripts and workflows we build to evaluate and assess our models, and the assumptions and design decisions we make while writing our software. Join us for a Carpentries-style interactive clinic with hands-on exercises where we will provide concrete guidance and examples for how to approach, conceptualize, and transform your computational models of earth systems into FAIR contributions to the scientific record whether they are greenfield projects or legacy code with a focus on existing, open infrastructure (GitHub / GitLab / Zenodo). We’ll also cover containerization (Docker, Apptainer) as a way to transparently document system and software dependencies for your models, and how it can be used to support execution on the Open Science Grid Consortium’s Open Science Pool fair-share access compute resources. Big parallel fun! https://osg-htc.org ∗ individual results may vary, this statement is provably false  +
As agreed at earlier CSDMS forums, the major impediment to using AI for modeling the deep-ocean seafloor is a lack of training data, the data that guide the AI - whichever set of algorithms is chosen. This clinic will expose participants to globally extensive datasets which are available through CSDMS. It will debate the scientific questions of why certain data work well, are appropriate to the processes, and are properly scaled. Participants are encouraged to bring their own AI challenges to the clinic.  +
As global population grows and infrastructure expands, the need to understand and predict processes at and near the Earth’s surface—including water cycling, soil erosion, landsliding, flood hazards, permafrost thaw, and coastal change—becomes increasingly acute. Progress in understanding and predicting these systems requires an ongoing integration of data and numerical models. Advances are currently hampered by technical barriers that inhibit finding, accessing, and operating modeling software and related tools and data sets. To address these challenges, we present CSDMS@HydroShare, a cloud-based platform for accessing and running models, developing model-data workflows, and sharing reproducible results. CSDMS@HydroShare brings together cyberinfrastructure developed by two important community facilities: HydroShare (https://www.hydroshare.org/), which is an online collaboration environment for sharing data, models, and tools, and the CSDMS Workbench (https://csdms.colorado.edu/wiki/Workbench), which is the integrated system of software tools, technologies, and standards for building, interfacing, and coupling models. This workshop demonstrates how to use CSDMS@HydroShare to discover, access, and operate the Python Modeling Tool (PyMT). PyMT is one of the tools from the CSDMS Workbench, which allows users to interactively run and couple numerical models contributed by the community. In PyMT, there are already model components for coastal & permafrost modeling, stratigraphic and subsidence modeling, and terrestrial landscape evolution modeling. It also includes data components to access and download hydrologic and soil datasets from remote servers to feed the model components as inputs. This workshop aims to encourage the community to use existing or develop new model or data components under the PyMT modeling framework and share them through CSDMS@HydroShare to support reproducible research. 
This workshop includes hands-on exercises using tutorial Jupyter Notebooks and provides general steps for how to develop new components.  
At a global scale, deltas significantly concentrate people by providing diverse ecosystem services and benefits for their populations. At the same time, deltas are also recognized as one of the most vulnerable coastal environments, due to a range of adverse drivers operating at multiple scales. These include global climate change and sea-level rise, catchment changes, deltaic-scale subsidence and land cover changes, such as rice to aquaculture. These drivers threaten deltas and their ecosystem services, which often provide livelihoods for the poorest communities in these regions. Responding to these issues presents a development challenge: how to develop deltaic areas in ways that are sustainable, and benefit all residents? In response to this broad question we have developed an integrated framework to analyze ecosystem services in deltas and their linkages to human well-being. The main study area is part of the world’s most populated delta, the Ganges-Brahmaputra-Meghna Delta within Bangladesh. The framework adopts a systemic perspective to represent the principal biophysical and socio-ecological components and their interaction. A range of methods are integrated within a quantitative framework, including biophysical and socio-economic modelling, as well as analysis of governance through scenario development. The approach is iterative, with learning both within the project team and with national policy-making stakeholders. The analysis allows the exploration of biophysical and social outcomes for the delta under different scenarios and policy choices. Some example results will be presented as well as some thoughts on the next steps.  +
Bed material abrasion is a key control on the partitioning of basin scale sediment fluxes between coarse and fine material. While abrasion is traditionally treated as a simple exponential function of transport distance and a rock-specific abrasion coefficient, experimental studies have demonstrated greater complexity in the abrasion process: the rate of abrasion varies with clast angularity, transport rate, and grain size. Yet, few studies have attempted to assess the importance of these complexities in the field setting. Furthermore, existing approaches generally neglect the heterogeneity in size, abrasion potential, and clast density of the source sediment. Combining detailed field measurements and new modeling approaches, we quantify abrasion in the Suiattle River, a basin in the North Cascades of Washington State dominated by a single coarse sediment source: large, recurrent debris flows from a tributary draining Glacier Peak stratovolcano. Rapid downstream strengthening of river bar sediment and a preferential loss of weak, low-density vesicular volcanic clasts relative to non-vesicular ones suggest that abrasion is extremely effective in this system. The standard exponential model for downstream abrasion fails to reproduce observed downstream patterns in lithology and clast strength in the Suiattle, even when accounting for the heterogeneity of source material strength and the underestimate of abrasion rates by tumbler experiments. Incorporating transport-dependent abrasion into our model largely resolves this failure. These findings hint at the importance of abrasion and sediment heterogeneity in the morphodynamics of sediment pulse transport in river networks. A new modeling tool will allow us to tackle these questions: the NetworkSedimentTransporter, a Landlab component to model Lagrangian bed material transport and channel bed evolution. 
This tool will allow for future work on the interplay of bed material abrasion and size selective transport at the basin scale. While a simplified approach to characterizing abrasion is tempting, our work demonstrates that sediment heterogeneity and transport-dependent abrasion are important controls on the downstream fate of coarse sediment in fluvial systems.  
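The contrast drawn above between the standard exponential abrasion law and a transport-dependent variant can be sketched numerically. This is a minimal illustration, not the Suiattle study's model: the coefficients and the form of the transport dependence (an assumed downstream relaxation of clast angularity) are invented for demonstration.

```python
import math

def sternberg_mass(m0, alpha, x):
    """Standard exponential (Sternberg-style) abrasion: clast mass after
    transport distance x (km), given abrasion coefficient alpha (1/km)."""
    return m0 * math.exp(-alpha * x)

def transport_dependent_mass(m0, alpha, x, steps=1000):
    """Hypothetical variant in which the effective abrasion rate decays as
    clasts round downstream (angularity relaxes with distance)."""
    m = m0
    dx = x / steps
    for i in range(steps):
        angularity = math.exp(-0.05 * i * dx)  # assumed rounding length scale
        # effective rate varies between 0.5*alpha (fully rounded) and alpha (angular)
        m -= m * alpha * (0.5 + 0.5 * angularity) * dx
    return m

m0 = 1.0      # initial clast mass, kg (hypothetical)
alpha = 0.02  # abrasion coefficient, 1/km (hypothetical)
x = 50.0      # transport distance, km

standard = sternberg_mass(m0, alpha, x)             # = exp(-1) ~ 0.368 kg
variant = transport_dependent_mass(m0, alpha, x)    # less mass loss downstream
```

Because the variant's effective rate falls as clasts round, it predicts less total mass loss over the same distance than the fixed-coefficient exponential model, which is one way heterogeneity in clast condition can decouple downstream lithology patterns from a single abrasion coefficient.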
Biostabilizing organisms, such as saltmarsh and microphytobenthos, can play a crucial role in shaping the morphology of estuaries and coasts by locally stabilizing the sediment. However, their impact on large-scale morphology, which highly depends on the feedback between spatio-temporal changes in their abundance and physical forcing, remains highly uncertain. We studied the effect of seasonal growth and decay of biostabilizing organisms, in response to field-calibrated physical forcings, on estuarine morphology over decadal timescales using a novel eco-morphodynamic model. The code includes temporal saltmarsh and microphytobenthos growth and aging as well as spatially varying vegetation fractions determined by mortality pressures. Growth representations are empirical and literature-based to avoid prior calibration. Novel natural patterns emerged in this model, revealing that observed density gradients in vegetation are defined by the life-stages that increase vegetation resilience with age. The model revealed that the formation of seasonal and long-term mud layering is governed by a ratio of flow velocity and hydroperiod altered by saltmarsh and microphytobenthos differently, showing that the type of biostabilizer determines the conditions under which mud can settle and be preserved. The results show that eco-engineering effects define emerging saltmarsh patterns from a combination of a positive effect reducing flow velocities and a negative effect enhancing hydroperiod. Consequently, saltmarsh and mud patterns emerge from their bilateral interactions that hence strongly define morphological development.  +
CSDMS 3.0 updates  +
CSDMS Basic Model Interface (BMI) - When equipped with a Basic Model Interface, a model is given a common set of functions for configuring and running the model (as well as getting and setting its state). Models with BMIs can communicate with each other and be coupled in a modeling framework. The coupling of models from different authors in different disciplines may open new paths to scientific discovery. In this first of a set of webinars on the CSDMS BMI, we'll provide an overview of BMI and the functions that define it. This webinar is appropriate for new users of BMI, although experienced users may also find it useful. '''Instructor:''' Mark Piper, Research Software Engineer, University of Colorado, Boulder '''When:''' November 13th, 12PM Eastern Time  +
CSDMS develops and maintains a suite of products and services with the goal of supporting research in the Earth and planetary surface processes community. This includes products such as Landlab, the Basic Model Interface, Data Components, the Model Repository, EKT Labs, and ESPIn. Examples of services include the Help Desk, Office Hours, Roadshows, RSEaaS, and EarthscapeHub. One problem, though, is that if the community doesn't know about these products and services, then they don't get used—and, like the Old Gods in Neil Gaiman's American Gods, they fade into obscurity. Let's break the cycle! Please join us for this webinar where we will present information about all of the products and services offered by CSDMS, and explain how they can help you accelerate your research. Attendees will leave with knowledge of what CSDMS can do for them, which they can bring back to their home institutions and apply to their research and share with their colleagues. <br>  +
CSDMS has developed a Web-based Modeling Tool – the WMT. WMT allows users to select models, to edit model parameters, and run the model on the CSDMS High-Performance Computing System. The web interface makes it straightforward to configure different model components and run a coupled model simulation. Users can monitor progress of simulations and download model output.<br><br> CSDMS has developed educational labs that use the WMT to teach quantitative concepts in geomorphology, hydrology, and coastal evolution. These labs are intended to be used by teaching assistants and faculty alike. Descriptions of 4-hr hands-on labs have been developed for HydroTrend, Plume, Sedflux, CHILD, ERODE and ROMS-Lite. These labs include instructions for students to run the models and explore dominant parameters in sets of simulations. Learning objectives are split between topical concepts, on climate change and sediment transport amongst many others, and modeling strategies, modeling philosophy and critical assessment of model results.<br><br>In this clinic, we will provide an overview of the available models and labs, and their themes and active learning objectives. We will discuss the requirements and logistics of using the WMT in your classroom. We will run some simulations hands-on, and walk through one lab in more detail as a demonstration. Finally, the workshop intends to discuss future developments for undergraduate course use with the participants.  +
CSDMS has developed a Web-based Modeling Tool – the WMT. WMT allows users to select models, to edit model parameters, and run the model on the CSDMS High-Performance Computing System. The web tool makes it straightforward to configure different model components and run a coupled model simulation. Users can monitor progress of simulations and download model output.<br><br>CSDMS has designed educational labs that use the WMT to teach quantitative concepts in geomorphology, hydrology, coastal evolution, and coastal sediment transport. These labs are intended for use by teaching assistants and faculty alike. Descriptions of 2 to 4-hr hands-on labs have been developed for HydroTrend, Plume, Sedflux, CHILD, TOPOFLOW and ROMS-Lite. These labs include instructions for students to run the models and explore dominant parameters in sets of simulations. Learning objectives are split between topical concepts, on climate change and sediment transport amongst many others, and modeling strategies, modeling philosophy and critical assessment of model results.<br><br>In this clinic, we will provide an overview of the available models and labs, and their themes and active learning objectives. We will discuss the requirements and logistics of using the WMT in your classroom. We will run some simulations hands-on, and walk through one lab in more detail as a demonstration. Finally, the workshop intends to discuss future developments for learning assessment tools with the participants.  +
CSDMS has developed the Basic Model Interface (BMI) to simplify the conversion of an existing model in C, C++, Fortran, Java, or Python into a reusable, plug-and-play component. By design, the BMI functions are straightforward to implement. However, in practice, the devil is in the details.<br><br>In this hands-on clinic, we will take a model -- in this case, an implementation of the two-dimensional heat equation in Python -- and together, we will write the BMI functions to transform it into a component. As we develop, we’ll unit test our component with nose, and we’ll explore how to use the component with a Jupyter Notebook. Optionally, we can set up a GitHub repository to store and to track changes to the code we write.<br><br>To get the most out of this clinic, come prepared to code! We have a lot to write in the time allotted. We recommend that clinic attendees have a laptop with the Anaconda Python distribution installed. We also request that you skim:<br><br>⤅ BMI description (https://csdms.colorado.edu/wiki/BMI_Description)<br>⤅ BMI documentation (http://bmi-forum.readthedocs.io/en/latest)<br>⤅ BMI GitHub repo (https://github.com/csdms/bmi-live)<br><br>before participating in the clinic.  +
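The kind of component the clinic builds can be sketched as follows: BMI-style control and getter functions (initialize, update, get_value, finalize) wrapped around a small explicit solver for the 2D heat equation. The class name, variable name, and grid size here are illustrative stand-ins; the clinic itself implements the full BMI specification against the bmipy interface.

```python
class HeatBMI:
    """Explicit finite-difference 2D heat equation behind a BMI-like API.
    A teaching sketch, not the official BMI implementation."""

    def initialize(self, shape=(5, 5), alpha=0.25):
        # alpha is the diffusion number dt*k/dx^2; <= 0.25 keeps the scheme stable
        self._alpha = alpha
        self._time = 0.0
        ny, nx = shape
        # temperature field: zeros with a hot spot at the center
        self._temp = [[0.0] * nx for _ in range(ny)]
        self._temp[ny // 2][nx // 2] = 100.0

    def update(self):
        """Advance the solution by one explicit time step (interior nodes only)."""
        t = self._temp
        ny, nx = len(t), len(t[0])
        new = [row[:] for row in t]
        for i in range(1, ny - 1):
            for j in range(1, nx - 1):
                new[i][j] = t[i][j] + self._alpha * (
                    t[i - 1][j] + t[i + 1][j] + t[i][j - 1] + t[i][j + 1]
                    - 4.0 * t[i][j]
                )
        self._temp = new
        self._time += 1.0

    def get_current_time(self):
        return self._time

    def get_value(self, name):
        # BMI exposes state through standardized variable names
        if name == "plate_surface__temperature":
            return [v for row in self._temp for v in row]
        raise KeyError(name)

    def finalize(self):
        self._temp = None


model = HeatBMI()
model.initialize()
model.update()
# heat is conserved while the hot spot stays in the interior: total remains 100.0
total = sum(model.get_value("plate_surface__temperature"))
```

Because a framework only ever calls these few standardized functions, it can drive this component interchangeably with any other BMI-equipped model, which is the plug-and-play property the clinic targets.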
CSDMS’s newly released Python Modeling Tool (PyMT) is an open-source Python package that provides convenient tools for coupling models that use the Basic Model Interface. Historically, earth-surface process models have often been complex and difficult to work with. To help improve this situation and make the discovery process more efficient, the CSDMS Python Modeling Tool (PyMT) provides an environment in which community-built numerical models and tools can be initialized and run directly from a Python command line or Jupyter notebook. To illustrate how PyMT works and the advantages it provides, we will present a demonstration of two coupled models. By simplifying the process of learning, operating, and coupling models, PyMT frees researchers to focus on exploring ideas, testing hypotheses, and comparing models with data.  +
CSDMS’s newly released Python Modeling Tool (PyMT) is an open source Python package that provides convenient tools for coupling models that use the Basic Model Interface. Historically, earth-surface process models have often been complex and difficult to work with. To help improve this situation and make the discovery process more efficient, PyMT provides an environment in which community-built numerical models and tools can be initialized and run directly from a Python command line or a Jupyter Notebook. To illustrate how PyMT works and the advantages it provides, we will present a demonstration of two coupled models. By simplifying the process of learning, operating, and coupling models, PyMT frees researchers to focus on exploring ideas, testing hypotheses, and comparing models with data. Pre-registration required.<br><br>''See also: https://pymt.readthedocs.io/en/latest/''  +
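The coupling pattern that PyMT automates can be illustrated with two toy BMI-like components advanced in lockstep, with the output of one fed to the other at each step. The component classes, variable names, and values below are invented stand-ins for demonstration, not actual PyMT models or its API.

```python
class Rainfall:
    """Toy 'driver' component producing a rainfall rate each time step."""

    def initialize(self):
        self.time = 0.0

    def update(self):
        self.time += 1.0

    def get_value(self, name):
        assert name == "rainfall__rate"
        # invented alternating signal, mm/hr
        return 2.0 if self.time % 2 == 0 else 1.0


class Discharge:
    """Toy 'receiver' component accumulating rainfall into storage."""

    def initialize(self):
        self.storage = 0.0
        self._rate = 0.0

    def set_value(self, name, value):
        assert name == "rainfall__rate"
        self._rate = value

    def update(self):
        self.storage += self._rate


# The coupled time loop: advance the driver, pass its output to the
# receiver through the shared variable name, then advance the receiver.
rain, flow = Rainfall(), Discharge()
rain.initialize()
flow.initialize()
for _ in range(4):
    rain.update()
    flow.set_value("rainfall__rate", rain.get_value("rainfall__rate"))
    flow.update()
```

The point of the pattern is that the loop never touches either model's internals: any pair of components exposing the same get_value/set_value contract and a shared variable name can be swapped in, which is what lets PyMT couple community-built models generically.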
Changing depth to water table and the associated stored water volume is a crucial component of the global hydrological cycle, with impacts on climate and sea level. However, long-term changes in global water-table distribution are not well understood. Coupled ground- and surface-water models are key to understanding the hydrologic evolution of post-glacial landscapes, the significance of terrestrial water storage, and the interrelationships between freshwater and climate. Here, I present the Water Table Model (WTM), which is capable of computing changes in water table elevation at large spatial scales and over long temporal scales. The WTM comprises groundwater and dynamic lake components to incorporate lakes into water-table elevation estimates. Sample results on both artificial and real-world topographies demonstrate the two-way coupling between dynamic surface-water and groundwater levels and flow.  +
Cheniers are ridges consisting of coarse-grained sediments, resting on top of muddy sediment. Along these muddy coastlines, cheniers provide shelter against wave attack, mitigating erosion or even enhancing accretion. As such, cheniers play an important role in the dynamics of the entire coastal landscape. This research focused on cheniers along mangrove-mud coasts. Therefore, chenier dynamics needed to be understood at the temporal and spatial scales of the mangrove vegetation as well. We developed a hybrid modelling approach, combining the strengths of complex process-based modelling (Delft3D), which allowed us to model the mixed-sediment dynamics at small temporal and spatial scales, with the strengths of a highly idealized profile model, providing low computational cost for larger temporal and spatial scales.  +
Climate and tectonics ultimately drive the physical and chemical surface processes that evolve landscape structure, including the connectivity of landscape portions that facilitate or impede movement of organismal populations. Connectivity controls population spatial distribution, drives speciation where populations spatially fragment, and increases extinction susceptibility of species where their habitat shrinks. Here I demonstrate the role that landscape evolution models can have in exploring these process linkages in investigations of species diversification driven by climatic and tectonic forcings. The models were built with SpeciesEvolver, a tool that constructs lineages in response to environmental change at geologic, macroevolutionary, and landscape scales. I will also suggest how future studies can use landscape evolution models and tools such as SpeciesEvolver to pursue questions regarding the mechanisms by which lineages respond to the drivers and details of landscape evolution, and taxon-specific and region-specific interactions between biotas and their environments.  +
Climate-induced disturbances are expected to increase in frequency and intensity and affect coastal wetland ecosystems mainly by altering their hydrology. Investigating how wetland hydrology responds to climate disturbances is an important first step toward understanding the ecological response of coastal wetlands to these disturbances. In this talk, I will introduce my research on improving the understanding of how the water storage of coastal wetlands in North Carolina, Delaware Bay, and the entire southeast U.S. changes under climatic disturbances. In particular, I will address the uncertainties in estimating water flow through coastal wetlands by considering 1) the regional-scale hydrologic interaction between uplands, coastal wetlands, and the ocean, and 2) the impact of coastal eco-geomorphologic change on the freshwater and saltwater interaction in coastal marshlands.  +
Closing of the meeting  +
Cloud computing is a powerful tool for both analyzing large datasets and running models. This clinic will provide an introduction to approaches for accessing and using cloud resources for research in the Geosciences. During the hands-on portion of this clinic, participants will learn how to use Amazon Web Services (AWS) to open a terminal, analyze model output in python, and run a model, time permitting. This workshop assumes no experience with cloud computing.  +
Coastal Risk is a flood and natural hazard risk assessment technology company. Our mission is to help individuals, businesses and governments in the US and around the world achieve resilience and sustainability.<br>In the past year, Coastal Risk’s Technology supported nearly $2 billion in US commercial real estate investment and development. Coastal Risk’s unique business model combines high-tech, flood, climate and natural hazards risk assessments and high-value, risk communication reports with personalized, resilience-accelerating advice for individuals, corporations and governments. Our risk modeling and reports help save lives and property in the US. In order to take our system around the world, however, we need higher resolution DEMs. The 30m resolution currently available is a big obstacle to going international. This is something that we would like to get from NASA. Also, we are interested in high-resolution, “before-and-after” satellite imagery of flooded areas to compare with our modeling and to help individuals, businesses and governments understand how to better defend against floods.  +
Coastal communities facing shoreline erosion preserve their beaches both for recreation and for property protection. One approach is nourishment, the placement of externally sourced sand to increase the beach’s width, forming an ephemeral protrusion that requires periodic re-nourishment. Nourishments add value to beachfront properties, thereby affecting re-nourishment choices for an individual community. However, the shoreline represents an alongshore-connected system, such that morphodynamics in one community are influenced by actions in neighboring communities. Prior research suggests coordinated nourishment decisions between neighbors were economically optimal, though many real-world communities have failed to coordinate, with geomorphic consequences that remain unknown. Toward understanding this geomorphic-economic relationship, we develop a coupled model representing two neighboring communities and an adjacent non-managed shoreline. Within this framework, we examine scenarios where communities coordinate nourishment choices to maximize their joint net benefit versus scenarios where decision-making is uncoordinated such that communities aim to maximize their independent net benefits. We examine how community-scale property values affect choices produced by each management scheme and the economic importance of coordinating. The geo-economic model produces four behaviors based on nourishment frequency: seaward growth, hold the line, slow retreat, and full retreat. Under current conditions, coordination is strongly beneficial for wealth-asymmetric systems, where less wealthy communities acting alone risk nourishing more than necessary relative to their optimal frequency under coordination. For a future scenario, with increased material costs and background erosion due to sea-level rise, less wealthy communities might be unable to afford nourishing their beach independently and thus lose their beachfront properties.  +
Coastal environments are complex because of the interplay between aeolian and nearshore processes. Waves, currents, tides, and winds drive significant short term (<weekly) changes to coastal landforms which augment longer term (> annual) geomorphic trends. Great strides have been made in recent years regarding our ability to model coastal geomorphic change in this range of societally relevant time scales. However, a great disparity exists in modeling coastal evolution because subaqueous and subaerial processes are typically assessed completely independently of one another. By neglecting the co-evolution of subtidal and supratidal regions within our current framework, we are precluded from fully capturing non-linear dynamics of these complex systems. This has implications for predicting coastal change during both fair weather and storm conditions, hindering our ability to answer important scientific questions related to coastal vulnerability and beach building.<br><br>Recognizing these historic limitations, here we present the outline for a coupled subaqueous (XBeach) and subaerial (Coastal Dune Model) morphodynamic modeling system that is in active development with the goal of exploring coastal co-evolution on daily to decadal timescales. Furthermore, we present recently collected datasets of beach and dune morphology in the Pacific Northwest US that will be used to validate trends observed within the coupled model platform.  +
Coastal flooding and related hazards have become some of the most impactful events as climate change continues to alter the risk they pose. Measuring the change in the risk of a particular flood level has therefore taken on a greater urgency, as historic measurements and statistics are no longer sufficient to quantify the risk to coastal communities. Improving our ability to compute these changes has become a priority as climate adaptation strategies become increasingly critical. This talk will outline some of these challenges and the ways we are attempting to address the problem in a multi-hazard-aware way.  +
Coastal morphological evolution is caused by a wide range of coupled cross-shore and alongshore sediment transport processes associated with short waves, infragravity waves, and wave-induced currents. However, the fundamental transport mechanisms occur within the thin bottom boundary layer and are dictated by turbulence-sediment interaction and inter-granular interactions. In the past decade, significant progress has been made in modeling sediment transport using Eulerian-Eulerian or Eulerian-Lagrangian two-phase flow approaches. However, most of these models are limited to a one-dimensional-vertical (1DV) formulation, which is only applicable to Reynolds-averaged sheet flow conditions. Consequently, complex processes such as instabilities of the transport layer and bedform dynamics cannot be simulated, and turbulence-resolving capability is absent. The main objective of my research was to develop a multi-dimensional, four-way-coupled two-phase model for sediment transport that can be used for Reynolds-averaged modeling in large-scale applications or for turbulence-resolving simulations at small scales.  +
Coastal systems are an environmental sink for a wide range of materials of scientific interest, including sediments, nutrients, plastics, oils, seeds, and wood, to name only a few. Due to differences in material properties such as buoyancy, each of these materials is liable to have characteristic transport pathways which differ from the mean flow and from each other, hydraulically “sorting” these materials in space. However, it remains difficult to quantify these differences in transport, due in part to the use of disparate models and approaches for each respective material. In this talk, I will advance a novel modeling framework for simulating the patterns of transport for a wide range of fluvially-transported materials using a single unified reduced-complexity approach, allowing us to compare and quantify differences in transport between materials. Using a hydrodynamic model coupled with the stochastic Lagrangian particle-routing model “dorado,” we are able to simulate at the process level how local differences in material buoyancy lead to emergent changes in partitioning and nourishment in river deltaic systems. I will show some of the insights we have learned regarding the tendency for materials to be autogenically sorted in space, as well as progress we have made bridging between the process-level framework used in dorado and more physics-based approaches based on transport theory.  +
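The core idea behind stochastic Lagrangian particle routing is a weighted random walk: at each step, a particle chooses among neighboring cells with probabilities weighted by local flow properties. The sketch below illustrates that pattern with plain NumPy; the grid, the depth-based weighting, and all names are illustrative assumptions, not dorado's actual API.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative flow field on a small grid: a deeper "channel" on the
# right half attracts particles via depth-weighted routing.
ny, nx = 20, 20
depth = np.ones((ny, nx))
depth[:, nx // 2:] = 3.0

def route_particle(start, n_steps):
    """Weighted random walk: the particle preferentially steps toward
    deeper neighboring cells (a stand-in for dorado-style routing
    weights, which combine depth, velocity, and material properties)."""
    y, x = start
    path = [(y, x)]
    moves = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    for _ in range(n_steps):
        neighbors = [(y + dy, x + dx) for dy, dx in moves
                     if 0 <= y + dy < ny and 0 <= x + dx < nx]
        weights = np.array([depth[j, i] for j, i in neighbors])
        probs = weights / weights.sum()
        y, x = neighbors[rng.choice(len(neighbors), p=probs)]
        path.append((y, x))
    return path

path = route_particle((10, 0), 50)
print(path[-1])
```

Different materials (buoyant vs. dense) would simply use different weighting rules in `route_particle`, which is how a single reduced-complexity framework can "sort" materials in space.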
Computer models help us explore the consequences of scientific hypotheses at a level of precision and quantification that is impossible for our unaided minds. The process of writing and debugging the necessary code is often time-consuming, however, and this cost can inhibit progress. The code-development barrier can be especially problematic when a field is rapidly unearthing new data and new ideas, as is presently the case in surface dynamics.<br/><br/>To help meet the need for rapid, flexible model development, we have written a prototype software framework for two-dimensional numerical modeling of planetary surface processes. The Landlab software can be used to develop new models from scratch, to create models from existing components, or a combination of the two. Landlab provides a gridding module that allows you to create and configure a model grid in just a few lines of code. Grids can be regular or unstructured, and can readily be used to implement staggered-grid numerical solutions to equations for various types of geophysical flow. The gridding module provides built-in functions for common numerical operations, such as calculating gradients and integrating fluxes around the perimeter of cells. Landlab is written in Python, a high-level language that enables rapid code development and takes advantage of a wealth of libraries for scientific computing and graphical output. Landlab also provides a framework for assembling new models from combinations of pre-built components.<br/><br/>In this clinic we introduce Landlab and its capabilities. We emphasize in particular its flexibility and the speed with which new models can be developed within its framework, introducing the many tools available within Landlab that make development of new functionality and new descriptions of physical processes both easy and fast. 
Participants will finish the clinic with all the knowledge necessary to build, run and visualize 2D models of various types of earth surface systems using Landlab.  
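The staggered-grid operations the abstract mentions (gradients on links between nodes, flux divergence into cells) can be sketched on a regular grid with plain NumPy. This is only an illustration of the pattern; Landlab's actual API (e.g., `RasterModelGrid`, `calc_grad_at_link`) generalizes it to unstructured meshes and is documented at landlab.readthedocs.io.

```python
import numpy as np

# Node elevations (m) on a tiny regular grid; gradients live on the
# "links" between adjacent nodes, as in a staggered-grid scheme.
dx = 10.0
z = np.array([[0.0, 1.0, 2.0],
              [0.0, 2.0, 4.0],
              [0.0, 3.0, 6.0]])

grad_x = (z[:, 1:] - z[:, :-1]) / dx   # gradient on x-directed links
grad_y = (z[1:, :] - z[:-1, :]) / dx   # gradient on y-directed links

# Diffusive flux q = -D * grad(z), then net flux divergence at each
# interior node -- the building block of a hillslope diffusion model.
D = 0.01
qx = -D * grad_x
qy = -D * grad_y
div_q = np.zeros_like(z)
div_q[:, 1:-1] += (qx[:, 1:] - qx[:, :-1]) / dx
div_q[1:-1, :] += (qy[1:, :] - qy[:-1, :]) / dx
print(div_q)
```

In Landlab itself, the grid object supplies these operations as built-in methods, so a model like this reduces to a few lines.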
D-Claw is an extension of the software package GeoClaw (www.clawpack.org) for simulating flows of granular-fluid mixtures with evolving volume fractions. It was developed primarily for landslides, debris flows and related phenomena by incorporating principles of solid, fluid and soil mechanics. However, because the two-phase model accommodates variable phase concentrations, it can also be used to model fluid problems in the absence of solid content (the model equations reduce to the shallow water equations as the solid phase vanishes). We therefore use D-Claw to seamlessly simulate multifaceted problems that involve the interaction of granular-fluid mixtures and bodies of water. This includes a large number of cascading natural hazards, such as debris-avalanches and lahars that enter rivers and lakes, landslide-generated tsunamis, landslide dams and outburst floods that entrain debris, and debris-laden tsunami inundation. I will describe the basis of D-Claw's model equations and highlight some recent applications, including the 2015 Tyndall Glacier landslide and tsunami, potential lahars on Mt. Rainier that displace dammed reservoirs, and a hypothetical landslide-generated lake outburst flood near Sisters, Oregon.  +
DES3D (Dynamic Earth Solver in Three Dimensions) is a flexible, open-source finite element solver that models momentum balance and heat transfer in elasto-visco-plastic material in the Lagrangian form using unstructured meshes. It provides a modeling platform for long-term tectonics as well as various problems in civil and geotechnical engineering. On top of the OpenMP multi-thread parallelism, DES3D has recently adopted CUDA for GPU computing. The CUDA-enabled version shows speedup of two to three orders of magnitude compared to the single-thread performance, making high-resolution 3D models affordable. This clinic will provide an introduction to DynEarthSol3D’s features and capabilities and hands-on tutorials to help beginners start using the code for simple tectonic scenarios. Impact of the two types of parallelization on performance will be demonstrated as well.  +
Dakota (https://dakota.sandia.gov) is an open-source software toolkit, designed and developed at Sandia National Laboratories, that provides a library of iterative systems analysis methods, including sensitivity analysis, uncertainty quantification, optimization, and parameter estimation. Dakota can be used to answer questions such as: * What are the important parameters in my model? * How safe, robust, and reliable is my model? * What parameter values best match my observational data? Dakota has been installed on the CSDMS supercomputer, ''beach.colorado.edu'', and is available to all registered users. The full set of Dakota methods can be invoked from the command line on ''beach''; however, this requires detailed knowledge of Dakota, including how to set up a Dakota input file and how to pass parameters and responses between a model and Dakota. To make Dakota more accessible to the CSDMS community, a subset of its functionality has been configured to run through the CSDMS Web Modeling Tool (WMT; https://csdms.colorado.edu/wmt). WMT currently provides access to Dakota's vector, centered, and multidimensional parameter study methods.<br><br>In this clinic, we'll provide an overview of Dakota, then, through WMT, set up and perform a series of numerical experiments with Dakota on ''beach'', and evaluate the results. Other material can be downloaded from: https://github.com/mdpiper/dakota-tutorial.<br>  +
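Running Dakota from the command line, as described above, requires writing a plain-text input file with `environment`, `method`, `variables`, `interface`, and `responses` blocks. Below is an illustrative sketch of a multidimensional parameter study for a two-parameter model; the driver script name, descriptors, and bounds are placeholders, and the exact keywords should be checked against the Dakota reference manual for your version.

```
method
  multidim_parameter_study
    partitions = 8 8

variables
  continuous_design = 2
    descriptors   'x1' 'x2'
    lower_bounds   0.0  0.0
    upper_bounds   1.0  1.0

interface
  fork
    analysis_drivers = 'run_model.sh'

responses
  response_functions = 1
  no_gradients
  no_hessians
```

WMT hides this file behind a web form, but the same five blocks are what it generates under the hood.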
Dakota is a flexible toolkit with algorithms for parameter optimization, uncertainty quantification, parameter estimation, and sensitivity analysis. In this clinic we will work through examples of using Dakota to compare field observations with model output using methods of sensitivity analysis and parameter optimization. We will also examine how the choice of comparison metrics influences results. Methods will be presented in the context of the Landlab Earth-surface dynamics framework but are generalizable to other models. Participants who are not familiar with Landlab are encouraged (but not required) to sign up for the Landlab clinic, which will take place before this clinic.<br><br>Participants are encouraged to install both Landlab and Dakota on their computers prior to the clinic. Installation instructions for Landlab can be found at: http://landlab.github.io (select "Install" from the menu bar at the top of the page). Installation instructions for Dakota can be found at https://dakota.sandia.gov/content/install-dakota.  +
Dakota is a flexible toolkit with algorithms for parameter optimization, uncertainty quantification, parameter estimation, and sensitivity analysis. In this clinic we will cover the basics of the Dakota framework, work through examples of using Dakota to compare field observations with model output using methods of sensitivity analysis and parameter optimization, and briefly cover the theoretical background of the Dakota methods used. If time permits, we will examine how the choice of comparison metrics influences results. Methods will be presented in the context of the Landlab Earth-surface dynamics framework but are generalizable to other models. Participants who are not familiar with Landlab are encouraged (but not required) to sign up for the Landlab clinic, which will take place before this clinic.<br>Participants do not need to install Landlab or Dakota prior to the clinic but will need to sign up for a Hydroshare account. https://www.hydroshare.org/sign-up/. <br>For those students interested in installing Landlab or Dakota: Installation instructions for Landlab can be found at: http://landlab.github.io (select "Install" from the menu bar at the top of the page). Installation instructions for Dakota can be found at https://dakota.sandia.gov/content/install-dakota.  +
Dakota is an open-source toolkit with several types of algorithms, including sensitivity analysis (SA), uncertainty quantification (UQ), optimization, and parameter calibration. Dakota provides a flexible, extensible interface between computational simulation codes and iterative analysis methods such as UQ and SA methods. Dakota has been designed to run on high-performance computing platforms and handles a variety of parallelism. In this clinic, we will provide an overview of Dakota algorithms, specifically focusing on uncertainty quantification (including various types of sampling, reliability analysis, stochastic expansion, and epistemic methods), sensitivity analysis (including variance-based decomposition methods and design of experiments), and parameter calibration (including nonlinear least squares and Bayesian methods). The tutorial will provide an overview of the methods and discuss how to use them. In addition, we will briefly cover how to interface your simulation code to Dakota.  +
A data component is a software tool that wraps the API for a data source with a Basic Model Interface (BMI). It is designed to provide a consistent way to access various types of datasets, and subsets of them, without needing to know the original data API. Each data component can also interact with numerical models that are wrapped in the pymt modeling framework. This webinar will introduce the data component concept with a demonstration of several examples for time series, raster, and multidimensional space-time data.  +
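BMI is a small set of control functions (initialize, update, finalize) and query functions (get_value, get_current_time, and so on) that give every wrapped model or data source the same outward shape. The toy component below shows the pattern; the class, its "model" (exponential cooling), and the variable name are illustrative stand-ins, not the official `bmipy` base class.

```python
class HeatBMI:
    """Toy BMI-style wrapper around a trivial 'model': exponential
    cooling toward an ambient temperature. A real BMI component wraps
    a numerical model or data source behind this same interface."""

    def initialize(self, config=None):
        self._time = 0.0
        self._dt = 1.0
        self._temp = 100.0      # initial temperature
        self._ambient = 20.0    # ambient temperature
        self._k = 0.1           # cooling rate

    def update(self):
        self._temp += -self._k * (self._temp - self._ambient) * self._dt
        self._time += self._dt

    def get_current_time(self):
        return self._time

    def get_value(self, name):
        if name == "surface__temperature":
            return self._temp
        raise KeyError(name)

    def finalize(self):
        pass

# A caller needs only the interface, never the model internals:
model = HeatBMI()
model.initialize()
while model.get_current_time() < 10.0:
    model.update()
print(model.get_value("surface__temperature"))
model.finalize()
```

Because every component answers the same calls, frameworks like pymt can couple models and data components without knowing anything about their internals.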
Debris flows pose a substantial threat to downstream communities in mountainous regions across the world, and there is a continued need for methods to delineate hazard zones associated with debris-flow inundation. Here we present ProDF, a reduced-complexity debris-flow inundation model. We calibrated and tested ProDF against observed debris-flow inundation from eight study sites across the western United States. While the debris flows at these sites varied in initiation mechanism, volume, and flow characteristics, results show that ProDF is capable of accurately reproducing observed inundation in different settings and geographic areas. ProDF reproduced observed inundation while maintaining computational efficiency, suggesting the model may be applicable in rapid hazard assessment scenarios.  +
Decision framing is a key, early step in any effective decision support engagement in which modelers aim to inform decision and policy making. In this clinic participants will work through and share the results of decision framing exercises for a variety of policy decisions. We will organize the exercise using the XLRM elicitation, commonly used in decision making under deep uncertainty (DMDU) stakeholder engagements. The XLRM framework is useful because it helps organize relevant factors into the components of a decision-centric analysis. The letters X, L, R, and M refer to four categories of factors important to RDM analysis: outcome measures (M) that reflect decision makers’ goals; policy levers (L) that decision makers use to pursue their goals; uncertainties (X) that may affect the connection between policy choices and outcomes; and relationships (R), often instantiated in mathematical simulation models, between uncertainties and levers and outcomes.  +
Deep-learning emulators can dramatically reduce the computational time required to solve physical models. Trained on a state-of-the-art high-order ice flow model, the Instructed Glacier Model (IGM, https://github.com/jouvetg/igm) is an easy-to-use Python code based on the TensorFlow library that can simulate the 3D evolution of glaciers several orders of magnitude faster than the instructor model with minor loss of accuracy. Switching to a Graphics Processing Unit (GPU) permits additional significant speed-ups, especially when modeling large-scale glacier networks and/or high spatial resolutions. Taking advantage of GPUs, IGM can also track a massive number of particles moving within the ice flow, opening new perspectives for modeling debris transport of any size (e.g., erratic boulders). Here I give an overview of IGM, illustrate its potential to simulate paleo and future glacier evolution in the Alps together with particle-tracking applications, and give a quick live demo of the model.  +
Delta morphology  +
Deltas are highly sensitive to local human activities, land subsidence, regional water management, global sea-level rise, and climate extremes. In this talk, I’ll discuss a recently developed risk framework for estimating the sensitivity of deltas to relative sea level rise, and the expected impact on flood risk. We apply this framework to an integrated set of global environmental, geophysical, and social indicators over 48 major deltas to quantify how delta flood risk due to extreme events is changing over time. Although geophysical and relative sea-level rise derived risks are distributed across all levels of economic development, wealthy countries effectively limit their present-day threat by gross domestic product–enabled infrastructure and coastal defense investments. However, when investments do not address the long-term drivers of land subsidence and relative sea-level rise, overall risk can be very sensitive to changes in protective capability. For instance, we show how in an energy-constrained future scenario, such protections will probably prove to be unsustainable, raising relative risks by four to eight times in the Mississippi and Rhine deltas and by one-and-a-half to four times in the Chao Phraya and Yangtze deltas. This suggests that the current emphasis on short-term solutions on the world’s deltas will greatly constrain options for designing sustainable solutions in the long term.  +
Developed barriers are tightly-coupled systems driven by feedbacks between natural processes and human decisions to maintain development. Coastal property markets are dynamically linked to the physical environment: large tax revenues and high-value infrastructure necessitate defensive coastal management through beach nourishment, dune development, overwash removal, and construction of hard structures. In turn, changes to environmental characteristics such as proximity to the beach, beach width, and the height of dunes influence coastal property values. In this talk I will use a new exploratory model framework – the CoAStal Community-lAnDscape Evolution (CASCADE) model – to explore the coupled evolution of coastal real estate markets and barrier landscapes. The framework couples two geomorphic models of barrier evolution (Barrier3D and BRIE) with an agent-based real estate model – the Coastal Home Ownership Model (CHOM). CHOM receives information about the coastal environment and acts on that information to cause change to the environment, including decisions about beach nourishment and dune construction and maintenance. Through this coupled model framework, I will show how the effects of dune and beach management strategies employed in the wake of extreme storms cascade through decades to alter the evolution of barriers, inadvertently inhibiting their resilience to sea level rise and storms, and ultimately unraveling coastal real estate markets.  +
Developers of solvers for PDE-based models and other computationally intensive tasks are confronted with myriad complexity, from science requirements to algorithms and data structures to GPU programming models. We will share a fresh approach that has delivered order of magnitude speedups in computational mechanics workloads, minimizing incidental complexity while offering transparency and extensibility. In doing so, we'll examine the PETSc and libCEED libraries, validate performance models, and discuss sustainable architecture for community development. We'll also check out Enzyme, an LLVM-based automatic differentiation tool that can be used with legacy code and multi-language projects to provide adjoint (gradient) capabilities.  +
Digital twins are increasingly important in many domains, including for understanding and managing the natural environment. Digital twins of the natural environment are fueled by the unprecedented amounts of environmental data now available from a variety of sources from remote sensing to potentially dense deployment of earth-based sensors. Because of this, data science techniques inevitably have a crucial role to play in making sense of this complex, highly heterogeneous data. This webinar will reflect on the role of data science in digital twins of the natural environment, with particular attention on how resultant data models can work alongside the rich legacy of process models that exist in this domain. We will seek to unpick the complex two-way relationship between data and process understanding. By focusing on the interactions, we will end up with a template for digital twins that incorporates a rich, highly dynamic learning process with the potential to handle the complexities and emergent behaviors of this important area.  +
Does permafrost impart topographic signatures, and how does subsequent warming affect hillslope and channel form? Permafrost controls the depth to immobile soil, and tundra vegetation influences infiltration and erosion thresholds. I will use high-resolution maps of arctic landscapes to examine morphometric properties like hillslope length, curvature and drainage density as functions of climate and vegetation. I will then compare these data to existing models of climate-modulated sediment flux and channel incision in Landlab, exploring the effect of more nuanced representations of permafrost flux laws and hydrology. I will also compare modeled landscapes forced with Pleistocene-Holocene climate to mid-latitude landscape form.  +
During a clinic session in the 2013 CSDMS annual meeting, OpenFOAM®, an open source computational fluid dynamics (CFD) platform, was first introduced by Dr. Xiaofeng Liu (now at Penn State University) for modeling general earth surface dynamics. OpenFOAM® provides various libraries, solvers and toolboxes for solving various fluid physics via the finite volume method. The objective of this clinic is to further discuss its recent development and applications to coastal sediment transport. The clinic will start with an overview of a range of coastal applications using OpenFOAM®. We will then focus on a recently released solver, SedFOAM, for modeling sand transport by using an Eulerian two-phase flow methodology. Specifically, we will focus on applying the model to study wave-driven sheet flows and the occurrence of momentary bed failure. The code can be downloaded via the CSDMS code repository and participants will receive hands-on training in the coding style, available numerical schemes in OpenFOAM®, computational domain setup, input/output and model result analysis. Knowledge of C++, object-oriented programming, and parallel computing is not required but will be helpful.  +
During the clinic we'll introduce the new Delft3D Flexible Mesh modeling environment. We'll discuss the basic features and set up a simple 2D morphological model. The ongoing developments and the possibility to use BMI for runtime interaction will be presented as well. The user interface runs on Windows, so make sure that you have a Windows computer or virtual machine available during the meeting. The user interface will be provided precompiled; the computational kernels you'll have to compile yourself. We'll provide instructions on how to compile the FORTRAN/C kernels before the clinic.  +
Earth scientists face serious challenges when working with large datasets. Pangeo is a rapidly growing community initiative and open source software ecosystem for scalable geoscience using Python. Three of Pangeo’s core packages are 1) Jupyter, a web-based tool for interactive computing, 2) Xarray, a data-model and toolkit for working with N-dimensional labeled arrays, and 3) Dask, a flexible parallel computing library. When combined with distributed computing, these tools can help geoscientists perform interactive analysis on datasets up to petabytes in size. In this interactive tutorial we will demonstrate how to employ this platform using real science examples from hydrology, remote sensing, and oceanography. Participants will follow along using Jupyter notebooks to interact with Xarray and Dask running in Google Cloud Platform.  +
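The core trick behind Dask's scalability is that a reduction over a huge array is computed as per-chunk partial results that are then combined. That scheduling idea can be sketched with the standard library alone; the chunking, worker count, and mean-of-partials combination below are illustrative, and Dask's real entry point is `dask.array`, not this code.

```python
from concurrent.futures import ThreadPoolExecutor

# A "dataset" too big to reduce in one pass: 10 chunks of 1,000 values.
chunks = [list(range(i * 1000, (i + 1) * 1000)) for i in range(10)]

def partial_sum_count(chunk):
    """Per-chunk partial result, analogous to a Dask block computation."""
    return sum(chunk), len(chunk)

# Map over chunks in parallel, then combine partials into a global mean.
with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(partial_sum_count, chunks))

total = sum(s for s, _ in partials)
count = sum(n for _, n in partials)
mean = total / count
print(mean)  # mean of 0..9999
```

Xarray layers labeled dimensions on top of such chunked arrays, so the same pattern scales from this toy to petabyte-scale cloud datasets.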
Earth surface processes are modulated by fascinating interactions between climate, tectonics, and biota. These interactions are manifested over diverse temporal and spatial scales ranging from seconds to millions of years, and microns to thousands of kilometers, respectively. Investigations into Earth surface shaping by biota have gained growing attention over the last decades and are a research frontier. In this lecture, I present an integration of new observational and numerical modeling research on the influence of vegetation type and cover on the erosion of mountains. I do this through an investigation of millennial timescale catchment denudation rates measured along the extreme climate and ecologic gradient of the western margin of South America.  +
Earthquakes are the most frequent source of classic tsunami waves. Other processes that generate tsunami waves include landslides, volcanic eruptions and meteorite impacts. Furthermore, atmospheric disturbances can also generate tsunami waves, or at least tsunami-like waves, but we are just at the beginning of understanding their physics and frequency. Classic tsunami waves are long waves, with wavelengths that are much longer than the water depth. For earthquake-generated tsunami waves that is true. However, landslides and meteorite impacts generate tsunami waves that are shorter, which has a profound effect on the tsunami evolution but makes them no less dangerous.<br>Fortunately, tsunamis do not occur frequently enough in any given region to make meaningful predictions of future tsunami hazard based only on recorded history. The geologic record has to be interrogated. The inversion of meaningful and quantitative data from the geologic record is the main goal of my research. However, there are problems with the geologic record. The most important problem is that we often have trouble identifying tsunami deposits. Second, it is very often difficult to separate the tsunami record from the storm record in regions where storms and tsunamis are competing agents of coastal change. Other problems concern the completeness of the deposits; moreover, the fact that the sedimentary environment that existed before the tsunami hit was most likely eroded and is no longer part of the record makes inversion especially tricky. In my research, I assume that the tsunami deposit is identified, but perhaps not complete, and that what we know about the pre-event conditions is limited.<br>My talk will cover how the geologic record is used to invert quantitative information about the causative process. We are going to look at grain sizes from sand to boulders and what we can learn from the transport of these very different grain sizes about tsunamis and their impacts along the respective coastal areas. 
The models that are employed to invert flow characteristics from deposits are based on Monte-Carlo simulations to overcome the issue of not knowing the pre-tsunami conditions with great confidence. If time permits, we also see how sea-level change affects tsunami impact at the coast.  
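The Monte-Carlo strategy described above, sampling the poorly known pre-event conditions and keeping only parameter combinations consistent with the observed deposit, can be sketched in a few lines. The "forward model" below is a deliberately simple stand-in (largest transportable grain scaling with the square of flow speed through an uncertain site coefficient), not the actual inversion model used in this research.

```python
import random

random.seed(1)

def max_grain_size(u, c):
    """Toy forward model: largest grain (m) a flow of speed u (m/s)
    can transport, with c an uncertain site coefficient standing in
    for the unknown pre-tsunami conditions."""
    return c * u ** 2

observed_dmax = 0.05   # m, largest grain measured in the deposit
tolerance = 0.005      # how close a simulation must come to "fit"

accepted_speeds = []
for _ in range(100_000):
    u = random.uniform(0.5, 15.0)    # candidate flow speed
    c = random.uniform(0.001, 0.01)  # sampled pre-event condition
    if abs(max_grain_size(u, c) - observed_dmax) < tolerance:
        accepted_speeds.append(u)

# The accepted ensemble yields a speed *range*, not a single answer,
# which is exactly how the unknown pre-event conditions propagate
# into the inverted flow characteristics.
print(f"flow speed consistent with deposit: "
      f"{min(accepted_speeds):.1f}-{max(accepted_speeds):.1f} m/s")
```

The same accept/reject structure carries over when the forward model is a full sediment-transport calculation and the observation is a grain-size distribution rather than a single maximum.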
Earth’s surface is the living skin of our planet – it connects physical, chemical, & biological systems. Over geological time, this surface evolves with rivers fragmenting the landscape into environmentally diverse range of habitats. These rivers not only carve canyons & form valleys, but also serve as the main conveyors of sediment & nutrients from mountains to continental plains & oceans. Here we hypothesise that it is not just geodynamics or climate, but their interaction, which, by regulating topography and sedimentary flows, determines long-term evolution of biodiversity. As such, we propose that surface processes are a prime limiting factor of diversification of Life on Earth before any form of intrinsic biotic process. To test this hypothesis, we use reconstructions of ancient climates & plate tectonics to simulate the evolution of landscape & sedimentary history over the entire Phanerozoic era, a period of 540 million years. We then compare these results with reconstructions of marine & continental biodiversity over geological times. Our findings suggest that biodiversity is strongly influenced by landscape dynamics, which at any given moment determine the carrying capacity of continental & oceanic domains, i.e., the maximum number of different species they can support at any given time. In the oceans, diversity closely correlates with the sedimentary flow from the continents, providing the necessary nutrients for primary production. Episodes of mass extinctions in the oceans have occurred shortly after a significant decrease in sedimentary flow, suggesting that a nutrient deficit destabilizes biodiversity & makes it particularly vulnerable to catastrophic events. On the continents, it took the gradual coverage of the surface with sedimentary basins for plants to develop & diversify, thanks to the development of more elaborate root systems. This slow expansion of terrestrial flora was further stimulated during tectonic episodes.  
Ecological Network Analysis (ENA) enables quantitative study of ecosystem models by formulating system-wide organizational properties, such as how much nutrient cycling occurs within the system, or how essential a particular component is to the entire ecosystem function. EcoNet is a free online software for modeling, simulation and analysis of ecosystem network models, and compartmental flow-storage type models in general. It combines dynamic simulation with Ecological Network Analysis. EcoNet does not require an installation, and runs on any platform equipped with a standard browser. While it is designed to be easy to use, it does contain interesting features such as discrete and continuous stochastic solution methods.  +
Ecology is largely considered to have its foundations in physics, and indeed physics frames many of the constraints on ecosystem dynamics. Physics has its limitations, however, especially when dealing with strongly heterogeneous systems and with the absence of entities. Networks are convenient tools for dealing with heterogeneity and have a long history in ecology, however most research in networks is dedicated to uncovering the mechanisms that give rise to network types. Causality in complex heterogeneous systems deals more with configurations of processes than it does with objects moving according to laws. Phenomenological observation of ecosystems networks reveals regularities that the laws of physics are unequipped to determine. The ecosystem is not a machine, but rather a transaction between contingent organization and entropic disorder.  +
Economic losses and casualties due to riverine flooding increased in past decades and are most likely to increase further due to global change. To plan effective mitigation and adaptation measures, and since floods often affect large areas showing spatial correlation, several global flood models (GFMs) have been developed. Yet they are based either on hydrologic or on hydrodynamic model codes. This may lower the accuracy of inundation estimates, as large-scale hydrologic models often lack advanced routing schemes, reducing the timeliness of simulated discharge, while hydrodynamic models depend on observed discharge or synthesized flood waves, hampering the representation of intra-domain processes.<br>To overcome this, GLOFRIM was developed. Currently, it allows for coupling one global hydrologic model, producing discharge and runoff estimates, with two hydrodynamic models, which perform the routing of surface water. By employing the Basic Model Interface (BMI) concept, both online and spatially explicit coupling of the models is supported. This way the coupled models remain unaffected, facilitating the separate development, storage, and updating of the models and their schematizations. Additionally, the framework is developed with easy accessibility and extensibility in mind, which allows other models to be added without extensive re-structuring. <br>In this presentation, the main underlying concepts of GLOFRIM as well as its workflow will be outlined, and first results showing the benefit of model coupling will be discussed. Besides, current limitations and the need for future improvements will be pointed out. Last, current developments in code development, applications, and integration with other research fields will be presented and discussed.  +
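Online BMI coupling of the kind GLOFRIM performs reduces to a shared time loop in which, at every step, one model's output becomes the other model's input via the interface's get/set calls. The sketch below shows that loop; both model classes and their internals are schematic stand-ins, not GLOFRIM's actual components.

```python
class RunoffModel:
    """Stand-in hydrologic model: produces runoff each time step."""
    def initialize(self):
        self.t = 0.0
    def update(self):
        self.t += 1.0
    def get_value(self, name):
        return 2.0 + 0.5 * self.t  # runoff, arbitrary units

class RoutingModel:
    """Stand-in hydrodynamic model: routes whatever inflow it is given."""
    def initialize(self):
        self.t = 0.0
        self.storage = 0.0
        self.inflow = 0.0
    def set_value(self, name, value):
        self.inflow = value
    def update(self):
        # Crude linear-reservoir routing of the externally set inflow.
        self.storage += self.inflow - 0.1 * self.storage
        self.t += 1.0

hydro, routing = RunoffModel(), RoutingModel()
hydro.initialize()
routing.initialize()
for _ in range(10):                      # coupled, online time loop
    hydro.update()
    runoff = hydro.get_value("runoff")
    routing.set_value("inflow", runoff)  # exchange at every step
    routing.update()
print(routing.storage)
```

Because the exchange happens through BMI calls rather than shared internals, either model can be swapped for another BMI-compliant code without touching the loop, which is the extensibility the abstract describes.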
Ecosystems are in transition globally with critical societal consequences. Global warming, growing climatic extremes, land degradation, human-introduced herbivores, and climate-related disturbances (e.g., wildfires) drive rapid changes in ecosystem productivity and structure, with complex feedbacks in watershed hydrology, geomorphology, and biogeochemistry. There is a need to develop models that can represent ecosystem changes by incorporating the role of individual plant patches. We developed ecohydrologic components in Landlab that can be coupled to create models to simulate local soil moisture dynamics and plant dynamics with spatially-explicit cellular automaton plant establishment, mortality, fires, and grazing. In this talk, I will present a model developed to explore the interplay between ecosystem state, change in climate, resultant grass connectivity, fire frequency, and topography. A transition from a cool-wet climate to a warm-dry climate leads to shrub expansion due to drought-induced loss of grass connectivity. Shrubs dominate the ecosystem if dry conditions persist longer. The transition back to a tree or grass-dominated ecosystem from a shrub-dominated ecosystem can only happen when climate shifts from dry to wet. The importance of the length of dry or wet spells on ecosystem structure is highlighted. Aspect plays a critical role in providing topographical refugia for trees during dry periods and influences the rate of ecosystem transitions during climate change.  +
Ecosystems present spatial patterns controlled by climate, topography, soils, plant interactions, and disturbances. Geomorphic transport processes mediated by the state of the ecosystem leave biotic imprints on erosion rates and topography. This talk will address the following questions at the watershed scale: What are emergent properties of biotic landscapes, and how do they form? How do biotic landscapes respond to perturbations in space and time? First, the formation of patterns and the rates of ecologic change in response to perturbations in semiarid ecosystems will be investigated using Landlab. Second, we will examine eco-geosphere interactions and outcomes using a landscape evolution model. The role of solar radiation in ecogeomorphic forms, and watershed ecogeomorphic response to climate change, will be elaborated. Finally, reflecting on the findings of previous research, some future directions in numerical modeling for linking ecosphere and geosphere will be discussed.  +
Environmental management decisions increasingly rely on quantitative integrated ecological models to forecast potential outcomes of management actions. These models are becoming increasingly complex through the integration of processes from multiple disciplines (e.g., linking physical process, engineering and ecological models). These integrated modeling suites are viewed by many decision makers as unnecessarily complex black boxes, which can lead to mistrust, misinterpretation and/or misapplication of model results. Numerical models have historically been developed without decision makers and stakeholders involved in model development, which further complicates communication, as diverse project teams have differing levels of understanding of models and their uses. For example, it can be difficult to explain to a group of non-modelers how hydrodynamic model output was aggregated at ecologically relevant scales when they were not exposed to that modeling decision. The mistrust of models and associated outputs can lead to poor decision-making, increase the risk of ineffective decisions, and invite litigation over decisions. Improved integrated ecological model development practices are needed to increase transparency and to include stakeholders and decision makers throughout the entire modeling process, from conceptualization through application. This clinic describes a suite of techniques, best practices, and tools for rapidly developing applied integrated ecological models in conjunction with technical stakeholder audiences and agency practitioners. First, a workshop approach for applied ecosystem modeling problems is described that cultivates a foundational understanding of integrated ecological models through hands-on, interactive model development. In this workshop environment, interdisciplinary and interagency working groups co-develop models in real-time, which demystifies technical issues and educates participants on the modeling process. 
Second, a Toolkit for interActive Modeling (TAM) is presented as a simple platform for rapidly developing index-based ecological models, which we have found useful for developing a strong modeling foundation for large, multidisciplinary teams involved in environmental decision making. Third, the EcoRest R package is described, which provides a library of functions for computing habitat suitability and decision support via cost-effectiveness and incremental cost analysis. Based on 10 workshops over the last 8 years, these techniques facilitated rapid, transparent development and application of integrated ecological models, informed non-technical stakeholders of the complexity facing decision-makers, created a sense of model ownership by participants, built trust among partners, and ultimately increased “buy-in” of eventual management decisions.  
Established in 2005, GEO (http://www.earthobservations.org/) is a voluntary partnership of governments and organizations that envisions “a future wherein decisions and actions for the benefit of humankind are informed by coordinated, comprehensive and sustained Earth observations and information.” GEO Members include 96 nations and the European Commission, along with 87 Participating Organizations, which are international bodies with a mandate in Earth observations. Together, the GEO community is creating a Global Earth Observation System of Systems (GEOSS) that will link Earth observation resources world-wide across multiple Societal Benefit Areas - agriculture, biodiversity, climate, disasters, ecosystems, energy, health, water and weather - and make those resources available for better informed decision-making. Through the GEOSS Common Infrastructure (GCI), GEOSS resources, including Earth observation data (satellite, airborne, in situ, models), information services, standards and best practices, can be searched, discovered and accessed by scientists, policy leaders, decision makers, and those who develop and provide information services across the entire spectrum of users. The presentation will cover the GCI overall architecture and some possible future developments.  +
Exchanges of sediment between marshes and estuaries affect coastal geomorphology, wetland stability and habitat, but can be difficult to predict due to the many processes that influence dynamics in these systems. This study uses a modeling approach to analyze how spatial variability in marsh-edge erosion, vegetation, and hydrodynamic conditions affects sediment fluxes between marshes and estuaries in Barnegat Bay, New Jersey. Specifically, the three-dimensional Coupled Ocean-Atmosphere-Wave-Sediment Transport (COAWST) numerical model was used. Model results showed that marsh-estuarine sediment fluxes varied spatially due to changes in wave thrust, currents, and sediment availability.  +
Exploratory models that simulate landscape change incorporate only the most essential processes that are hypothesized to control a behavior of interest. These “rule-based” models have been used successfully to examine behaviors in natural landscapes over large spatial (many kms) and temporal scales (decades to millennia). In many geomorphic systems, the dynamics of developed landscapes differ significantly from natural landscapes. For example, humans can alter the physical landscape through the introduction of hard infrastructure and removal of vegetation. Humans can also modify the internal and external forces that naturally change landscapes, including flows of water, wind, and sediment as well as climatic factors. As with natural processes, in exploratory models human behavior must be parameterized. However, the level of detail to which human behavior can be reduced while still accurately reproducing feedbacks across the coupled human-natural landscape is a complex, user-based decision. In this clinic, we will work in small groups and through a Jupyter Notebook to parameterize a new human behavior within a modular coastal barrier evolution model (Barrier3D, within the CASCADE modeling framework). The clinic will incorporate discussions and prompts about how to broadly identify important model “ingredients” and reduce model complexity, and will therefore be generalizable to other geomorphic landscapes.  +
Fill-Spill-Merge (FSM) is an algorithm that distributes runoff on a landscape to fill or partially fill depressions. When a depression fills, excess water can overflow into neighbouring depressions or the ocean. In this clinic, we will use FSM to assess changes in a landscape’s hydrology when depressions in a DEM are partially or fully filled with water. We will discuss why it may be important to consider depressions more closely, rather than simply removing them. I will describe the design of the FSM algorithm, and then we will use FSM on a DEM to look at how landscape hydrology changes under different hydrologic conditions. This clinic may be helpful to those interested in topics such as landscape hydrology, landscape evolution, flow routing, hydrologic connectivity, and lake water storage.  +
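A minimal sketch may help fix ideas about what "filling" a depression means. The code below implements a simplified priority-flood depression filler, a common building block for this kind of analysis; it is not the FSM algorithm itself, which additionally tracks how runoff fills, spills between, and merges individual depressions. All values are illustrative.

```python
import heapq
import numpy as np

def priority_flood_fill(dem):
    """Raise cells so every cell can drain to the grid edge.

    Simplified priority-flood sketch: process cells from lowest to
    highest starting at the boundary, never letting an inner cell sit
    below the level of the cell it was reached from.
    """
    nrows, ncols = dem.shape
    filled = dem.astype(float).copy()
    visited = np.zeros(dem.shape, dtype=bool)
    heap = []
    # Seed the priority queue with all edge cells (assumed outlets).
    for r in range(nrows):
        for c in range(ncols):
            if r in (0, nrows - 1) or c in (0, ncols - 1):
                heapq.heappush(heap, (filled[r, c], r, c))
                visited[r, c] = True
    while heap:
        elev, r, c = heapq.heappop(heap)
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < nrows and 0 <= nc < ncols and not visited[nr, nc]:
                visited[nr, nc] = True
                # A depression cell is raised to its spill level.
                filled[nr, nc] = max(filled[nr, nc], elev)
                heapq.heappush(heap, (filled[nr, nc], nr, nc))
    return filled

# Tiny synthetic DEM with an interior pit that spills through the
# elevation-4 saddle on the bottom edge.
dem = np.array([
    [5, 5, 5, 5, 5],
    [5, 1, 1, 4, 5],
    [5, 1, 0, 4, 5],
    [5, 4, 4, 4, 5],
    [5, 5, 5, 4, 5],
])
filled = priority_flood_fill(dem)
# The pit cells are raised to the spill elevation of 4.
```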
Fire temporarily alters soil and vegetation properties, driving increases in runoff and erosion that can dramatically increase the likelihood of debris flows. In the immediate aftermath of fire, debris flows most often initiate when surface water runoff rapidly erodes sediment on steep slopes. Due to the complex interactions between runoff generation, sediment transport, and post-fire debris-flow initiation and growth, models that couple these processes can provide valuable insights into the ways in which topography, burn severity, and post-fire recovery influence debris-flow activity. Here, we describe such a model as well as attempts to parameterize temporal changes in model parameters throughout the post-fire recovery process. Simulations of watershed-scale response to individual rainstorms in several southern California burned areas suggest substantial reductions in debris-flow likelihood and volume within the first 1-2 years following fire. Results highlight the importance of considering local rainfall characteristics and sediment supply when using process-based numerical models to assess debris-flow potential. More generally, results provide a methodology for estimating the intensity and duration of rainfall associated with the initiation of runoff-generated debris flows as well as insights into the persistence of debris-flow hazards following fire.  +
Flood hazard in rivers can evolve from changes in the frequency and intensity of flood-flows (hydrologic effects) and in the channel capacity to carry flood-flows (morphologic effects). However, river morphology is complex and often neglected in flood planning. Here, we separate the impacts of morphology vs. hydrology on flood risk for 48 river gauges in Northwestern Washington State. We find that morphologic vs. hydrologic forcings are comparable but not regionally consistent. Prominent morphologic effects on flood-risk are forced by extreme natural events and anthropogenic disturbances. Based on morphologic changes, we identify five categories of river behavior relevant for flood-risk management.  +
Flood modelling at global scales represents a revolution in hydraulic science and has the potential to transform decision-making and risk management in a wide variety of fields. Such modelling draws on a rich heritage of algorithm and data set development in hydraulic modelling over the last 20 years, and is now beginning to yield new insights into current and future flood risk. This paper reviews this progress and outlines recent efforts to develop a 30m resolution true hydrodynamic model of the entire conterminous US. The model is built using an automated framework which uses the US National Elevation Dataset, the HydroSHEDS river network, regionalised frequency analysis to determine extreme flow and rainfall boundary conditions, and the USACE National Levee Dataset to characterize flood defences. Comparison against FEMA and USGS flood maps shows the continental model to have skill approaching that of bespoke models built with local data. The paper describes the development and testing of the model, and its use to estimate current and future flood risk in the US using high resolution population maps and development projections.  +
Flooding is one of the costliest natural disasters, and recent events, including several hurricanes as well as flash floods, have been particularly devastating. In the US alone, the last few years have been record-breaking in terms of flood disasters and have triggered strong reactions in public opinion. Governments are now reviewing the available information to better mitigate the risks from flooding.<br>Typically, in the US, flood hazard mapping is done by federal agencies (USACE, FEMA and USGS), with, traditionally, little room or need for research model development in flood hazard applications. Now, with the advent of the National Water Model, the status quo of flood hazard prediction in the US may be changing; however, inundation extent and floodplain depths in the National Water Model are still at an early stage of development.<br>This Clinic provides a beginner-level introduction to the latest capabilities in large-scale 2-D modeling using the LISFLOOD-FP model developed by the University of Bristol, with a nearly 20-year code history. The model has a long history in research applications, and the algorithms behind it have also made their way into many existing industry model codes. The session will give participants insights into 2-D flood inundation modeling with LISFLOOD-FP as well as a look at more sophisticated sub-grid channel implementations for large-scale application. More specifically, we will look at the data sets needed by the model and then run a simulation of the annual flooding of the Inner Niger Delta in Mali. The Clinic will also give participants the opportunity to look at some high-resolution LiDAR-based model results.  +
Floodplain construction involves the interplay between channel belt sedimentation and avulsion, overbank deposition of fines, and sediment reworking by channel migration. There has been considerable progress in numerical modelling of these processes over the past few years, for example, by using high resolution flow and sediment transport models to simulate river morphodynamics, albeit over relatively small time and space scales. Such spatially-distributed hydrodynamic models are also regularly used to simulate floodplain inundation and overbank sedimentation during individual floods. However, most existing models of long-term floodplain construction and alluvial architecture do not account for flood hydraulics explicitly. Instead, floodplain sedimentation is typically modelled as an exponential function of distance from the river, and avulsion thresholds are defined using topographic indices (e.g., lateral:downstream slope ratios or metrics of channel belt super-elevation). This presentation aims to provide an overview of these issues, and present results from a hydrodynamically-driven model of long-term floodplain evolution. This model combines a simple network-based model of channel migration with a 2D grid-based model of flood hydrodynamics and overbank sedimentation. The latter involves a finite volume solution of the shallow water equations and an advection-diffusion model for suspended sediment transport. Simulation results are compared with observations from several large lowland floodplains, and the model is used to explore hydrodynamic controls on long-term floodplain evolution and alluvial ridge construction.  +
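The abstract above notes that most long-term models treat floodplain sedimentation as an exponential function of distance from the river. A minimal sketch of that conventional closure is given below; the parameter values and function name are purely illustrative, not taken from the model described in the talk.

```python
import numpy as np

def overbank_deposition(distance_m, d0=5.0, length_scale_m=200.0):
    """Overbank deposition rate (mm/yr) at a distance from the channel.

    Conventional exponential-decay closure:
    d0             -- near-channel deposition rate (mm/yr), illustrative
    length_scale_m -- e-folding distance of the decay (m), illustrative
    """
    distance_m = np.asarray(distance_m, dtype=float)
    return d0 * np.exp(-distance_m / length_scale_m)

distances = np.array([0.0, 200.0, 1000.0])  # metres from the channel
rates = overbank_deposition(distances)
# Deposition is highest at the channel and decays by 1/e every 200 m.
```

Hydrodynamically driven models such as the one presented replace this fixed decay with deposition computed from simulated flood flow and suspended-sediment transport.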
The flow routing map is the cornerstone of spatially distributed hydrologic models. In this clinic we will introduce HexWatershed, a scale-free, mesh-independent flow direction model. It supports DOE’s Energy Exascale Earth System Model (E3SM) to generate hydrologic parameters and river network representations on both structured and unstructured meshes. In this presentation, we will overview the capabilities of HexWatershed with an emphasis on river network representation and flow direction modeling. We will also provide participants with the tools to begin their own research with hydrologic model workflows. Through hands-on tutorials and demonstrations, participants will gain insights into the relationship between meshes and flow direction, and how HexWatershed handles river networks on various meshes. We will also demonstrate how to use the HexWatershed model outputs in the large-scale hydrologic model, Model for Scale Adaptive River Transport (MOSART). Participants will be provided with additional resources that can be used to extend the tutorial problems and gain additional familiarity with the tools and workflows introduced. Participants are welcome to bring and utilize their own computers capable of accessing the internet and running a web browser. Tutorials will involve simple scripting operations in the Python language. The conda utility will be used to install libraries. Both QGIS and VisIt packages will be used for visualization.  +
Fluvial incision since late Miocene time (5 Ma) has shaped the transition between the Central Rocky Mountains and adjacent High Plains. Despite a clear contrast in erodibility between the mountains and plains, erodibility has not been carefully accounted for in previous attempts to model the geomorphic evolution of this region. The focus of this work to date has been to constrain erodibility values with a simplistic, toy model, and to reconstruct the paleosurface of the Miocene Ogallala Formation prior to its dissection beginning at 5 Ma. This surface reconstruction will be used as an initial condition in subsequent modeling.  +
Food security and poverty in Bangladesh are very dependent on natural resources, which fluctuate with a changing environment. The ecosystem services supporting the rural population are affected by several factors including climate change, upstream river flow modifications, commercial fish catches in the Bay of Bengal, and governance interventions. The ESPA Deltas project aims to holistically describe the interaction between the interlinked bio-physical environment and the livelihoods of the rural poorest in coastal Bangladesh, who are highly dependent on natural resources and live generally on less than US$1.50 per day. Here we describe a new integrated model that allows a long-term analysis of the possible changes in this system by linking projected changes in physical processes (e.g. river flows, nutrients), with productivity (e.g. fish, rice), social processes (e.g. access, property rights, migration) and governance (e.g. fisheries, agriculture, water and land use management). Bayesian Networks and Bayesian Processes allow multidisciplinary integration and exploration of specific scenarios. This integrated approach is designed to provide Bangladeshi policy makers with science-based evidence of possible development trajectories. This includes the likely robustness of different governance options on natural resource conservation and poverty levels. Early results highlight the far reaching implications of sustainable resource use and international cooperation to secure livelihoods and ensure a sustainable environment in coastal Bangladesh.  +
From G.K. Gilbert's "The Convexity of Hilltops" to highly-optimized numerical implementations of drainage basin evolution, models of landscape evolution have been used to develop insight into the development of specific field areas, create testable predictions of landform development, demonstrate the consequences of our current theories for geomorphic processes, and spark imagination through hypothetical scenarios. In this talk, I discuss how the types of questions tackled with landscape evolution models have changed as observational data (e.g., high-resolution topography) and computational technology (e.g., accessible high performance computing) have become available. I draw on a natural experiment in postglacial drainage basin incision and a synthetic experiment in a simple tectonic setting to demonstrate how landscape evolution models can be used to identify how much information the topography or other observable quantities provide in inferring process representation and tectonic history. In the natural example, comparison of multiple calibrated models provides insight into which process representations improve our ability to capture the geomorphic history of a site. Projections into the future characterize where in the landscape uncertainty in the model structure dominates over other sources of uncertainty. In the synthetic case, I explore the ability of a numerical inversion to recover geomorphic-process relevant (e.g., detachment vs. transport limited fluvial incision) and tectonically relevant (e.g., date of fault motion onset) system parameters.  +
GCAM is an open-source, global, market equilibrium model that represents the linkages between energy, water, land, climate, and economic systems. One of GCAM's many outputs is projected land cover/use by subregion. Subregional projections provide context and can be used to understand regional land dynamics; however, Earth System Models (ESMs) generally require gridded representations of land at finer scales. Demeter, a land use and land cover disaggregation model, was created to provide this service. Demeter directly ingests land projections from GCAM and creates gridded products that match the desired resolution and land-class requirements of the user.  +
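To make the disaggregation idea concrete, here is a minimal sketch of spreading a projected regional land-class area over grid cells in proportion to a base-year map. This is only the proportional core of such a downscaling; Demeter's actual algorithm additionally applies treatment order, transition priorities, and spatial constraints. All numbers are synthetic.

```python
import numpy as np

# Base-year cropland area in each grid cell of a subregion (km^2);
# values are synthetic, for illustration only.
base_year_cropland = np.array([10.0, 30.0, 0.0, 60.0])

# Projected subregional cropland total for a future year (km^2),
# e.g. as would come from a GCAM land-use projection.
projected_regional_total = 120.0

# Disaggregate proportionally: each cell keeps its base-year share.
weights = base_year_cropland / base_year_cropland.sum()
gridded = projected_regional_total * weights

# The gridded map sums to the regional total and preserves the
# base-year spatial pattern (cells with no cropland stay empty).
```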
GPUs can make models, simulations, machine learning, and data analysis much faster, but how? And when? In this clinic we'll discuss whether you should use a GPU for your work, whether you should buy one, which one to buy, and how to use one effectively. We'll also get hands-on and speed up a landscape evolution model together. This clinic should be of interest both to folks who would like to speed up their code with minimal effort as well as folks who are interested in the nitty gritty of pushing computational boundaries.  +
GeoClaw (http://www.geoclaw.org) is an open-source software package for solving two-dimensional depth-averaged equations over general topography using high-resolution finite volume methods and adaptive mesh refinement. Wetting-and-drying algorithms allow modeling inundation and overland flows. The primary applications where GeoClaw has been used are tsunami modeling and storm surge, although it has also been applied to dam break and other overland flooding problems. The first part of this clinic will present an overview of the capabilities of GeoClaw, including a number of new features added in the past few years: - Depth-averaged Boussinesq-type dispersive equations that better model short-wavelength tsunamis, such as those generated by landslides or asteroid impacts. Solving these equations requires implicit solvers (due to the higher-order derivatives in the equations). This is now working with the adaptive mesh refinement (AMR) algorithms in GeoClaw, which are critical for problems that require high-resolution coastal modeling while also modeling trans-oceanic propagation, for example. - Better capabilities for extracting output at frequent times on a fixed spatial grid by interpolation from the AMR grids during a computation. The resulting output can then be used for making high-resolution animations or for post-processing (e.g. the velocity field at frequent times can be used for particle tracking, as needed when tracking tsunami debris). - Ways to incorporate river flows or tidal currents into a GeoClaw simulation. - Better coupling with the D-Claw code for modeling debris flows, landslides, lahars, and landslide-generated tsunamis. (D-Claw is primarily developed by USGS researchers Dave George and Katy Barnhart). The second part of the clinic will be a hands-on introduction to installing GeoClaw and running some of the examples included in the distribution, with tips on how best to get started on a new project. 
GeoClaw is distributed as part of Clawpack (http://www.clawpack.org), and is available via the CSDMS model repository. For those who wish to install the software in advance on laptops, please see http://www.clawpack.org/installing.html. We will also go through this briefly and help with any issues that arise on your laptop (provided it is a Mac or Linux machine; we do not support Windows.) You may need to install some prerequisites in advance, such as Xcode on a Mac (since we require "make" and other command line tools), a Fortran compiler such as gfortran, and basic scientific Python tools such as NumPy and Matplotlib. See https://www.clawpack.org/prereqs.html.  
GeoClaw (http://www.geoclaw.org) is an open-source software package for solving two-dimensional depth-averaged equations over general topography using high-resolution finite volume methods and adaptive mesh refinement. Wetting-and-drying algorithms allow modeling inundation or overland flows. The primary applications where GeoClaw has been used are tsunami modeling and storm surge, although it has also been applied to dam break problems and other overland floods. This tutorial will give an introduction to setting up a tsunami modeling problem in GeoClaw, including: * Overview of capabilities, * Installing the software, * Using Python tools provided in GeoClaw to acquire and work with topography DEMs and other datasets, * Setting run-time parameters, including specifying adaptive refinement regions, * Using the VisClaw plotting software to visualize results with Python tools or display on Google Earth. GeoClaw is distributed as part of Clawpack (http://www.clawpack.org). Those who wish to install the software in advance on their laptops should see http://www.clawpack.org/installing.html. Tutorials can be found here: https://github.com/clawpack/geoclaw_tutorial_csdms2019  +
GeoClaw is an open source Fortran/Python package based on Clawpack (conservation laws package), which implements high-resolution finite volume methods for solving wave propagation problems with adaptive mesh refinement. GeoClaw was originally developed for tsunami modeling and has been validated via benchmarking workshops of the National Tsunami Hazard Mitigation Program for use in hazard assessment studies funded through this program. Current projects include developing new tsunami inundation maps for the State of Washington and the development of new probabilistic tsunami hazard assessment (PTHA) methodologies. The GeoClaw code has also been extended to the study of storm surge and forms the basis for D-Claw, a debris flow and landslide code being developed at the USGS and recently used to model the 2014 Oso, Washington landslide, for example.  +
Getting usable information out of climate and weather models can be a daunting task. The direct output from the models typically has unacceptable biases on local scales, and as a result a large number of methods have been developed to bias correct or downscale the climate model output. This clinic will describe the range of methods available as well as provide background on the pros and cons of different approaches. This will cover a variety of approaches from relatively simple methods that just rescale the original output, to more sophisticated statistical methods that account for broader weather patterns, to high-resolution atmospheric models. We will focus on methods for which output or code are readily available for end users, and discuss the input data required by different methods. We will follow this up with a practical session in which participants will be supplied a test dataset and code with which to perform their own downscaling. Participants interested in applying these methods to their own region of interest are encouraged to contact the instructor ahead of time to determine what inputs would be required.  +
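Among the approaches mentioned above, the simplest family just rescales the raw model output so its statistics match observations over a calibration period. A minimal sketch of such a multiplicative scaling correction (sometimes used for precipitation) is shown below; the function name and all data are synthetic and illustrative only.

```python
import numpy as np

def scaled_correction(model_hist, obs_hist, model_future):
    """Multiplicative bias correction: rescale model output so its
    mean over the calibration period matches observations."""
    factor = obs_hist.mean() / model_hist.mean()
    return model_future * factor

# Synthetic daily precipitation series (one year each); the "model"
# is given a deliberate 50% wet bias relative to "observations".
rng = np.random.default_rng(0)
obs_hist = rng.gamma(2.0, 2.0, size=365)
model_hist = 1.5 * rng.gamma(2.0, 2.0, size=365)
model_future = 1.5 * rng.gamma(2.2, 2.0, size=365)

corrected = scaled_correction(model_hist, obs_hist, model_future)
# The corrected series keeps the model's future change signal but
# removes the mean bias seen in the historical period.
```

More sophisticated statistical methods (e.g. quantile mapping or weather-pattern-based approaches) adjust the full distribution rather than only the mean, at the cost of needing more calibration data.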
Global models of Earth’s climate have expanded beyond their geophysical heritage to include terrestrial ecosystems, biogeochemical cycles, vegetation dynamics, and anthropogenic uses of the biosphere. Ecological forcings and feedbacks are now recognized as important for climate change simulation, and the models are becoming models of the entire Earth system. This talk introduces Earth system models, how they are used to understand the connections between climate and ecology, and how they provide insight to environmental stewardship for a healthy and sustainable planet. Two prominent examples discussed in the talk are anthropogenic land use and land-cover change and the global carbon cycle. However, there is considerable uncertainty in how to represent ecological processes at the large spatial scale and long temporal scale of Earth system models. Further scientific advances are straining under the ever-growing burden of multidisciplinary breadth, countered by disciplinary chauvinism and the extensive conceptual gap between observationalists developing process knowledge at specific sites and global scale modelers. The theoretical basis for Earth system models, their development and verification, and experimentation with these models requires a new generation of scientists, adept at bridging the disparate fields of science and using a variety of research methodologies including theory, numerical modeling, observations, and data analysis. The science requires a firm grasp of models, their theoretical foundations, their strengths and weaknesses, and how to appropriately use them to test hypotheses of the atmosphere-biosphere system. It requires a reinvention of how we learn about and study nature.  +
Google Earth Engine is a powerful geographic information system (GIS) that brings programmatic access and massively parallel computing to petabytes of publicly-available Earth observation data using Google’s cloud infrastructure. In this live-coding clinic, we’ll introduce some of the foundational concepts of workflows in Earth Engine and lay the groundwork for future self-teaching. Using the JavaScript API, we will practice: raster subsetting, raster reducing in time and space, custom asset (raster and vector) uploads, visualization, mapping functions over collections of rasters or geometries, and basic exporting of derived products.  +
Google Earth Engine (GEE) is a multi-petabyte catalog of satellite imagery and geospatial datasets with planetary-scale analysis capabilities. Now imagine that all you need to work with it is a browser and an internet connection. This hands-on workshop will introduce you to and showcase cloud-native geospatial processing. We will explore the platform’s built-in catalog of 100+ petabytes of geospatial datasets and build some analysis workflows. Additional topics will include uploading and ingesting your own data into Google Earth Engine, time series analysis essential for change monitoring, and data and code principles for effective collaboration. The hope is to introduce a cloud-native geospatial analysis platform and to rethink how we produce and consume data. If you want to follow along, bring your laptop and register for an Earth Engine account here: https://signup.earthengine.google.com P.S. I recommend using a personal account :) you get to keep it  +
Granular materials are ubiquitous in the environment, in industry and in everyday life, and yet are poorly understood. Modelling the behavior of a granular medium is critical to understanding problems ranging from hazardous landslides and avalanches in the Geosciences to the design of industrial equipment. Typical granular systems contain millions of particles, but the underlying equations governing that collective motion are as yet unknown. The search for a theory of granular matter is a fundamental problem in physics and engineering and of immense practical importance for mitigating the risk of geohazards. Direct simulation of granular systems using the Discrete Element Method is a powerful tool for developing theories and modelling granular systems. I will describe the simulation technique and show its application to a diverse range of flows.  +
Great mentors engage early career scientists in research, open doors, speak the ‘unspoken rules’, and inspire the next generation. Yet many of us step into mentoring roles without feeling fully confident in the role, or are uncertain how to create an inclusive environment that allows early career scientists from varied backgrounds to thrive. In this interactive workshop, we will share experiences and explore tools that can help build successful mentoring relationships, create supportive cohorts, and feel confident in becoming a great mentor.  +
Hazard assessment for post-wildfire debris flows, which are common in the steep terrain of the western United States, has focused on the susceptibility of upstream basins to generate debris flows. However, reducing public exposure to this hazard also requires an assessment of hazards in downstream areas that might be inundated during debris flow runout. Debris flow runout models are widely available, but their application to hazard assessment for post-wildfire debris flows has not been extensively tested. I will discuss a study in which we apply three candidate debris flow runout models in the context of the 9 January 2018 Montecito event. We evaluate the relative importance of flow volume and flow material properties in successfully simulating the event. Additionally, I will describe an in-progress user needs assessment designed to understand how professional decision makers (e.g., county emergency managers, floodplain managers, and Burned Area Emergency Response team members) might use post-fire debris flow inundation hazard assessment information. https://agupubs.onlinelibrary.wiley.com/doi/full/10.1029/2021JF006245 Katy Barnhart is a Research Civil Engineer at the U.S. Geological Survey’s Geologic Hazards Science Center. She received her B.S.E. (2008) in Civil and Environmental Engineering from Princeton University and her M.S. (2010) and Ph.D. (2015) in Geological Sciences from the University of Colorado at Boulder. Her research uses numerical modeling to understand past and forecast future geomorphic change on a variety of timescales.  +
Here we present direct numerical simulations of the hysteresis of the Antarctic Ice Sheet and apply linear response theory to these kinds of simulations to project Antarctica's sea level contribution through the end of the century. Related publications: * A. Levermann et al. 2020. Projecting Antarctica's contribution to future sea level rise from basal ice-shelf melt using linear response functions of 16 ice sheet models (LARMIP-2). Earth System Dynamics 11 (2020) 35-76, doi 10.5194/esd-11-35-2020. * J. Garbe, T. Albrecht, A. Levermann, J.F. Donges, R. Winkelmann, 2020. The Hysteresis of the Antarctic Ice Sheet. Nature 585 (2020), 538-544, doi: 10.1038/s41586-020-2727-5.  +
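The linear-response approach used in LARMIP-2 can be illustrated with a short sketch (the response function, forcing, and all numbers below are invented for illustration and are not the actual LARMIP-2 data): the projected sea level contribution is the convolution of an ice-sheet response function with a basal melt forcing.

```python
import numpy as np

# Toy illustration of the linear response idea: the sea level contribution
# S(t) is the convolution of a response function R with a melt forcing m(t).
dt = 1.0                             # time step in years
t = np.arange(0, 80, dt)             # e.g. the decades to 2100
R = 1.0 - np.exp(-t / 50.0)          # illustrative saturating response function
m = np.full_like(t, 0.01)            # constant melt forcing (arbitrary units)

# Discrete convolution integral: S(t) = sum over tau of R(t - tau) * m(tau) * dt
S = np.convolve(m, R)[: t.size] * dt
print(round(S[-1], 3))               # end-of-century contribution (toy units)
```

The point of the method is that once R has been diagnosed from a small set of model experiments, projections for any forcing scenario reduce to this convolution.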
HexWatershed is a hydrologic flow direction model that supports structured and unstructured meshes. It uses state-of-the-art topological relationship-based stream burning and depression-filling techniques to produce high-quality flow-routing datasets across scales. HexWatershed has improved substantially over the past two years, including support for the DGGRID discrete global grid system (DGGS). This presentation will provide an overview of HexWatershed, highlighting its capabilities, new features, and improvements. Through hands-on tutorials and demonstrations, attendees will gain insights into the underlying philosophy of the HexWatershed model and how to use HexWatershed products to run large-scale hydrologic models in watersheds worldwide. Specifically, this tutorial will cover major components of the HexWatershed ecosystem, including the computational mesh generation process, river network representation, and flow direction modeling. We will provide participants with resources to extend the tutorial problems and gain additional familiarity with the tools and workflows introduced. Attendees are encouraged to bring their laptops with internet access and a functional web browser. Tutorials will involve scripting in Python, for example within Jupyter Notebooks. We will use the Conda utility to install dependency libraries and Visual Studio Code to run the notebooks.  +
High-resolution topographic (HRT) data is becoming more easily accessible and prevalent, and is rapidly advancing our understanding of myriad surface and ecological processes. Landscape connectivity is the framework that describes the routing of fluids, sediments, and solutes across a landscape and is a primary control on geomorphology and ecology. Connectivity is not a static parameter, but rather a continuum that dynamically evolves on a range of temporal and spatial scales, and the observation of which is highly dependent on the available methodology. In this clinic we showcase the utility of HRT for the observation and characterization of landscapes and compare results with those of coarser spatial resolution data-sets. We highlight the potential for integrating HRT observations and parameters such as vegetation density, surface relief, and local slope variability with numerical surface process models. Participants will gain an understanding of the basics of HRT, data availability and basic analysis, and the use of HRT parameters in modeling.  +
How can we increase the diversity, richness and value of Spatial Data Infrastructure (SDI) to the Disasters and Natural Hazards community stakeholders? We’ll look at some of the current (and past) Open Geospatial Consortium initiatives to examine exciting work to enable sharing of complex data and models within the community using open standards.  +
Human settlements in dynamic environmental settings face the challenges both of managing their own impact on their surroundings and also adapting to change, which may be driven by a combination of local and remote factors, each of which may involve both human and natural forcings. Impacts of and responses to environmental change play out at multiple scales which involve complex nonlinear interactions between individual actors. These interactions can produce emergent results where the outcome at the community scale is not easily predicted from the decisions taken by individuals within the community. Agent-based simulations can be useful tools to explore the dynamics of both the human response to environmental change and the environmental impacts of human activity. Even very simple models can be useful in uncovering potential for unintended consequences of policy actions. Participatory simulations that allow people to interact with a system that includes simulated agents can be useful tools for teaching and communicating about such unintended consequences. I will report on progress on agent-based simulations of environmentally stressed communities in Bangladesh and Sri Lanka and preliminary results of using a participatory coupled model of river flooding and agent-based real estate markets to teach about unintended consequences of building flood barriers.  +
Humans alter natural geomorphic systems by modifying terrain morphology and through on-going actions that change patterns of sediment erosion, transport, and deposition. Long-term interactions between humans and the environment can be examined using numerical modeling. Human modifications of the landscape such as land cover change and agricultural tillage have been implemented within some landscape evolution models, yet little effort has been made to incorporate agricultural terraces. Terraces of various forms have been constructed for millennia in the Mediterranean, Southeast Asia, and South America; in those regions some terraces have undergone cycles of use, abandonment, and reuse. Existing models implement terraces as static objects that uniformly impact landscape evolution, yet empirical studies have shown that terrace impact depends upon whether the terraces are maintained or abandoned. We previously tested a simple terrace model that included a single terrace wall on a synthetic hillside with a 20% slope for the impacts of maintenance and abandonment. In this research we modify the terrace model to include a wider variety of terrace forms and couple it with a landscape evolution model to test the extent to which terraced terrain morphology is related to terrace form. We also test how landscape evolution after abandonment of terraced fields differs based on the length of time the terraces were maintained. We argue that construction and maintenance of terraces have a significant impact on the spatial patterning of sediment erosion and deposition, and thus landscape evolution modeling of terraced terrain requires coupling with a dynamic model of terrace use.  +
Hurricanes can greatly modify the sedimentary record, but our coastal scientific modeling community has rather limited capability to predict such processes. A three-dimensional sediment transport model was developed in the Regional Ocean Modeling System (ROMS) to study seabed erosion and deposition on the Louisiana shelf in response to Hurricanes Katrina and Rita in the year 2005. Conditions to either side of Hurricane Rita's storm track differed substantially, with the region to the east having stronger winds, taller waves, and thus deeper erosion. This study indicated that major hurricanes can disturb the shelf at centimeter to meter scales on the seabed.  +
Hydrology is a science of extremes: droughts and floods. In either case, the hydrologic response arises from the combination of many factors, such as terrain, land cover, land use, infrastructure, etc. Each has a different, overlapping spatial domain. Superimposed upon these are temporal variations, driven by stochastic weather events that follow seasonal climatic regimes. To calculate risk (expected loss) requires a loss function (damage) and a response domain (flood depths) over which that loss is integrated. The watershed provides the spatial domain that collects all these factors. This talk will discuss the data used to characterize hydrologic response.  +
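The risk calculation described here can be sketched in a few lines (the damage curve, event probabilities, and building value below are invented for illustration): expected annual loss integrates a loss function over the probability-weighted flood response domain.

```python
# Toy expected-loss calculation: risk = sum over events of
# (annual exceedance probability) x (damage at that event's flood depth).

def damage_fraction(depth_m):
    """Illustrative depth-damage curve: no damage below 0.1 m, total at 2 m."""
    return max(0.0, min(1.0, (depth_m - 0.1) / 1.9))

# (annual exceedance probability, flood depth in meters) for a set of events
events = [(0.10, 0.3), (0.02, 1.0), (0.01, 2.5)]
building_value = 250_000  # assumed replacement value, dollars

expected_annual_loss = sum(p * damage_fraction(d) * building_value
                           for p, d in events)
print(round(expected_annual_loss))  # dollars per year
```

In a real study the depth-damage curve would come from an empirical damage function and the depths from hydraulic model output over the watershed.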
I will discuss an application of the Migration, Intensification, and Diversification as Adaptive Strategies (MIDAS) agent-based modeling framework to modeling labor migration across Bangladesh under the stressor of sea-level rise (SLR). With this example, I hope to highlight some hard-to-resolve challenges in representing adaptive decision-making under as-yet unexperienced stressors in models. Drawing together what is more and what is less known in projections for future adaptation, I will discuss strategies for ‘responsible’ presentation and dissemination of model findings.  +
If one system comes to (my) mind where the human element is intertwined with the environment, it is the Louisiana coastal area in the Southern United States. Often referred to as the working coast, coastal Louisiana supports large industries with its ports, navigation channels, oil, and productive fisheries. In addition to that, Louisianians have a significant cultural connection to the coastal wetlands and their natural resources. Unfortunately, the land is disappearing into the sea with coastal erosion rates higher than anywhere else in the US. Due to these high rates of land loss, this system needs rigorous protection and restoration. While the restoration plans are mostly focused on building land, the effects on, for example, fisheries of proposed strategies should be estimated as well before decisions can be made on how to move forward. Through several projects I have been involved in, from small modeling projects to bold coastal design programs, I present how coupled models play a key role in science-based coastal management that considers the natural processes as well as the human element.  +
In dry regions, escarpments are key landforms for exploring landform-rainfall interactions. Here we present a modeling approach for the evolution of arid cliffs and sub-cliff slopes that incorporates rainfall forcing at the scale of individual rainstorms. We used numerical experiments to mechanistically test how arid cliffs and sub-cliff slopes evolve under different geomorphic characteristics and variations in rainstorm properties.  +
In formulating tectono-geomorphic models of landscape evolution, Earth is typically divided into two domains: the surface domain in which “geomorphic” processes are solved for, and a tectonic domain of earth deformation driven generally by differential plate movements. Here we present a single mechanical framework, the Failure Earth Response Model (FERM), that unifies the physical description of dynamics within and between the two domains. FERM is constructed on two basic assumptions about the three-dimensional stress state and rheological memory: I) material displacement, whether tectonic or geomorphic in origin, at or below Earth’s surface, is driven by local forces overcoming local resistance, and II) large displacements, whether tectonic or geomorphic in origin, irreversibly alter Earth material properties, enhancing a long-term strain memory mapped into the topography. In addition to gathering the stresses arising from far-field tectonic processes, topographic relief, and inertial surface processes into a single stress state for every point, the FERM formulation allows explicit consideration of the contributions of pore pressure fluctuations, seismic accelerations, and fault damage to the evolving landscape. Incorporation of these in the FERM model significantly influences the tempo of landscape evolution and leads to highly heterogeneous and anisotropic stress and strength patterns, largely predictable from knowledge of mantle kinematics. The resulting unified description permits exploration of surface-tectonic interactions from outcrop to orogen scales and allows elucidation of the high-fidelity orogenic strain and climate memory contained in topography.  +
In landscape evolution models, climate change is often assumed to be synonymous with changes in rainfall. For many climate changes, however, the dominant driver of landscape evolution is change in vegetation cover. In this talk I review case studies that attempt to quantify the impact of vegetation changes on landscape evolution, including examples from hillslope/colluvial, fluvial, and aeolian environments, spatial scales of ~10 m to whole continents, and time scales from decadal to millennial. Particular attention is paid to how to parameterize models using paleoclimatic and remote sensing data.  +
In software engineering, an interface is a group of functions with prescribed names, argument types, and return types. When a developer implements an interface for a piece of software, they fill out the details for each function while keeping the signatures intact. CSDMS has developed the Basic Model Interface (BMI) for facilitating the conversion of a model written in C, C++, Fortran, Python, or Java into a reusable, plug-and-play component. By design, BMI functions are simple. However, when trying to implement them, the devil is often in the details. In this hands-on clinic, we'll take a simple model of the two-dimensional heat equation, written in Python, and together we'll write the BMI functions to wrap it, preparing it for transformation into a component. As we develop, we’ll explore how to use the wrapped model with a Jupyter Notebook. To get the most out of this clinic, come prepared to code! We'll have a lot to write in the time allotted for the clinic. We recommend that clinic attendees have a laptop with the Anaconda Python distribution installed. We also request that you review the * BMI description (http://csdms.colorado.edu/wiki/BMI_Description), and the * BMI documentation (https://bmi-spec.readthedocs.io) before the start of the clinic.  +
In software engineering, an interface is a set of functions with prescribed names, argument types, and return types. When a developer implements an interface for a piece of software, they fill out the details for each function while keeping the signatures intact. CSDMS has developed the Basic Model Interface (BMI) for facilitating the conversion of an existing model written in C, C++, Fortran, Python or Java into a reusable, plug-and-play component. By design, BMI functions are straightforward to implement. However, when trying to match BMI functions to model behaviors, the devil is often in the details.<br>In this hands-on clinic, we'll take a simple model--an implementation of the two-dimensional heat equation in Python--and together, we'll write the BMI functions to wrap it, preparing it for transformation into a component. As we develop, we’ll explore how to use the wrapped model with a Jupyter Notebook.<br>To get the most out of this clinic, come prepared to code! We'll have a lot to write in the time allotted for the clinic. We recommend that clinic attendees have a laptop with the Anaconda Python distribution installed. We also request that you read over:<br>BMI description (https://csdms.colorado.edu/wiki/BMI_Description)<br>BMI documentation (http://bmi-python.readthedocs.io)<br>before participating in the clinic.  +
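As a preview of the clinic exercise, here is a heavily abridged sketch of what such a wrapper can look like (the solver, grid size, and variable name are illustrative; the full BMI specifies many more functions than shown):

```python
import numpy as np

# A minimal BMI-style wrapper around a 2D heat (diffusion) model.
# Method names follow the BMI spec; the explicit solver is illustrative only.
class HeatBMI:
    def initialize(self, config=None):
        self._alpha, self._dt, self._dx = 1.0, 0.1, 1.0
        self._time = 0.0
        self._temperature = np.zeros((10, 10))
        self._temperature[5, 5] = 100.0            # a hot spot to diffuse

    def update(self):
        T = self._temperature
        # 5-point Laplacian on the interior; boundary values held fixed
        lap = (T[:-2, 1:-1] + T[2:, 1:-1] + T[1:-1, :-2] + T[1:-1, 2:]
               - 4.0 * T[1:-1, 1:-1]) / self._dx ** 2
        T[1:-1, 1:-1] += self._alpha * self._dt * lap
        self._time += self._dt

    def get_current_time(self):
        return self._time

    def get_value(self, name, dest):
        dest[:] = self._temperature.flatten()       # copy into caller's buffer
        return dest

    def finalize(self):
        self._temperature = None

# Usage: the framework, not the model, now controls the time loop.
model = HeatBMI()
model.initialize()
for _ in range(10):
    model.update()
out = model.get_value("plate_surface__temperature", np.empty(100))
```

Note the BMI convention that `get_value` fills a caller-supplied buffer rather than returning a reference to internal state; this is part of what makes wrapped models safely composable.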
In the modeler community, hindcasting (testing models against knowledge of past events) is required of all computer models before they can provide reliable results to users. CSDMS 2.0 “Moving forward” has proposed to incorporate benchmarking data into its modeling framework. Data collection in natural systems has advanced significantly, but it still lags behind the needed resolution in time and space, and it includes natural variability beyond our understanding, which makes thorough testing of computer models difficult.<br><br>In the experimentalist community, research in Earth-surface processes and subsurface stratal development is in a data-rich era with rapid expansion of high-resolution, digitally based data sets that were not available even a few years ago. Millions of dollars have been spent to build and renovate flume laboratories. Advanced experimental technologies and methodologies allow a greater number of sophisticated experiments at large scales and fine detail. A joint effort between modelers and experimentalists is a natural step toward a great synergy between the two communities.<br><br>The time for a coherent effort to build a strong global research network for these two communities is now. First, both communities should initiate an effort to establish best practices and metadata for standardized data collection. Sediment experimentalists are an example of a community in the “long tail,” meaning that their data are often collected in one-of-a-kind experimental set-ups and isolated from other experiments. Second, there should be a centralized knowledge base (a web-based repository for data and technology) easily accessible to modelers and experimentalists. Experimentalists also have a lot of “dark data”: data that are difficult or impossible to access through the Internet. 
This effort will result in tremendous opportunities for productive collaborations.<br><br>The new experimentalist and modeler network will be able to achieve the CSDMS current goal by providing high quality benchmark datasets that are well documented and easily accessible.  
In this clinic I will give an overview of lsdtopotools so that, by the end of the session, you will be able to run and visualise topographic analyses using lsdtopotools and lsdviztools. I will show how to start an lsdtopotools session in google colab in under 4 minutes, and will also give a brief overview for more advanced users of how to use our docker container if you want access to local files. I will then use jupyter notebooks to give example analyses including simple data fetching and hillshading, basin selection, simple topographic metrics and channel extraction. Depending on the audience I will show examples of a) channel steepness analysis for applications in tectonic geomorphology b) calculation of inferred erosion rates based on detrital CRN concentrations c) terrace and valley extraction d) channel-hillslope coupling. In addition I will show our simple visualisation scripts that allow you to generate publication-ready images. All you need prior to the session is a google account that allows you to access colab, and an opentopography account so you can obtain an API key. The latter is not required but will make the session more fun as you can use data from anywhere rather than example datasets. If you are not an advanced user please do not read the next sentence, as you don’t need it and it is nerdy compu-jargon that will put you off the session. If you are an advanced user and wish to try the docker container you should install the docker client for your operating system and use the command “docker pull lsdtopotools/lsdtt_pytools_docker” when you have access to a fast internet connection.  +
In this clinic we will explore how to use the new cloud-based remote sensing platform from Google. Our hands-on clinic will teach you the basics of loading and visualizing data in Earth Engine, sorting through data, and creating different types of composite images. These techniques are a good starting point for more detailed investigations that monitor changes on earth’s surface. Prerequisites:<br>1) Bring your own laptop.<br>2) Chrome installed on your system: It will work with Firefox but has issues.<br>3) An active Google account - Register for an account with Google Earth Engine (https://earthengine.google.com/signup/)  +
In this clinic we will explore how to use the cloud-based remote sensing platform from Google. Our hands-on clinic will teach you the basics of loading and visualizing data in Earth Engine, sorting through data, and creating different types of composite images. These techniques are a good starting point for more detailed investigations that monitor changes on Earth’s surface. Prerequisites include having Chrome installed on your system (Earth Engine will work with Firefox, but with issues) and an active Google account. Once you have those, please register for an account with Google Earth Engine (https://earthengine.google.com/signup/)  +
In this clinic we will first review concepts of glacial isostatic adjustment and the algorithm that is used to solve the sea level equation. We will then provide an overview of the sea level code, which calculates the viscoelastic response of the solid Earth, Earth’s gravity field, and rotation axis to changes in surface load while conserving water between ice sheets and oceans. Participants will run the code, explore manipulating the input ice changes, and investigate its effect on the predicted changes in sea level, solid Earth deformation, and gravity field.  +
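As a much-simplified illustration of the water-conservation step (uniform "eustatic" redistribution only; the sea level code discussed in the clinic solves the full sea level equation, including gravitational, deformational, and rotational effects):

```python
# Back-of-envelope mass conservation between ice sheets and oceans.
RHO_ICE, RHO_WATER = 917.0, 1000.0   # densities, kg/m^3
OCEAN_AREA = 3.61e14                 # m^2, treated as constant here

def eustatic_rise(ice_volume_change_m3):
    """Global-mean sea level rise (m) from a grounded ice volume loss."""
    water_volume = ice_volume_change_m3 * RHO_ICE / RHO_WATER
    return water_volume / OCEAN_AREA

# e.g. losing 3e14 m^3 of ice (roughly a tenth of Greenland's volume)
print(round(eustatic_rise(3e14), 3))  # meters of global-mean rise
```

The clinic's central point is that the real response is far from uniform: gravity and solid Earth deformation make sea level fall near a shrinking ice sheet and rise more than the eustatic value far from it.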
In this clinic, we will explore RivGraph, a Python package for extracting and analyzing fluvial channel networks from binary masks. We will first look at some background and motivation for RivGraph's development, including some examples demonstrating how RivGraph provides the required information for building models, developing new metrics, analyzing model outputs, and testing hypotheses about river network structure. We will then cover--at a high level--some of the logic behind RivGraph's functions. The final portion of this clinic will be spent working through examples showing how to process a delta and a braided river with RivGraph and visualizing results. Please note: This clinic is designed to be accessible to novice Python users, but those with no Python experience may also find value. If you'd like to work through the examples during the workshop, please install RivGraph beforehand, preferably to a fresh Anaconda environment. Instructions can be found here: https://github.com/jonschwenk/RivGraph. It is also recommended that you have a GIS (e.g. QGIS) available for use for easy display/interrogation of results.  +
In this clinic, we will first demonstrate existing interactive computer-based activities used for teaching concepts in sedimentology and stratigraphy. This will be followed by a hands-on session for creating different modules based on the participants’ teaching and research interests. Active learning strategies improve student exam performance, engagement, attitudes, thinking, writing, self-reported participation and interest, and help students become better acquainted with one another (Prince, 2004). Specifically, computer-based active learning is an attractive educational approach for post-secondary educators, because developing these activities takes advantage of existing knowledge and skills the educator is likely to already have. The demonstration portion of the clinic will focus on the existing rivers2stratigraphy (https://github.com/sededu/rivers2stratigraphy) activity, which illustrates basin-scale development of fluvial stratigraphy through adjustments in system kinematics including sandy channel migration and subsidence rates. The activity allows users to change these system properties, so as to drive changing depositional patterns. The module utilizes a rules based model, which produces realistic channel patterns, but simplifies the simulation to run efficiently, in real-time. The clinic will couple rivers2stratigraphy to a conventional laboratory activity which interprets an outcrop photograph of fluvial stratigraphy, and discuss logistics of using the module in the classroom. For the second part of the clinic, familiarity with Python will be beneficial (but is not required); we will utilize existing graphical user interface (GUI) frameworks in developing new activities, aimed to provide a user-friendly means for students to interact with model codes while engaging in geological learning. 
Participants should plan to have Python installed on their personal computers prior to the workshop, and a sample module will be emailed beforehand to let participants begin exploring the syllabus. ''Prince, M. (2004). Does Active Learning Work? A Review of the Research. Journal of Engineering Education, 93(3), 223-231. doi: 10.1002/j.2168-9830.2004.tb00809.x''.  
In this clinic, we will introduce and experiment with open-source tools designed to promote rapid hypothesis testing for river delta studies. We will show how pyDeltaRCM, a flexible Python model for simulating river delta evolution, can be extended to incorporate any arbitrary processes or forcings. We will highlight how object-oriented model design enables community-driven model development, and how this promotes reproducible science. Our clinic will develop an extended model to simulate deltaic evolution into receiving basins with different slopes. Then, the clinic will step through some basic analyses of the model runs, interrogating both surface processes and subsurface structure. Our overall goal is to familiarize you with the tools we are developing and introduce our approach to software design, so that you may adopt these tools or strategies in your research. Please note that familiarity with Python will be beneficial for this clinic, but is not required. Hands-on examples will be made available via an online programming environment (Google CoLab or similar); instructions for local installation on personal computers will be provided prior to the workshop as well.  +
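The object-oriented extension mechanism can be illustrated generically (the class and method names below are invented for illustration and are not the actual pyDeltaRCM API): a base model exposes hooks that subclasses override to add a process or forcing without touching the core code.

```python
# Generic sketch of hook-based model extension.
class DeltaModelBase:
    def __init__(self, basin_slope=0.0):
        self.basin_slope = basin_slope
        self.elevation = 0.0

    def hook_after_timestep(self):
        """Subclasses override this hook to inject extra processes."""

    def update(self):
        self.elevation += 0.1          # stand-in for sediment routing
        self.hook_after_timestep()

class SubsidingDeltaModel(DeltaModelBase):
    """Extended model: adds constant subsidence each timestep via the hook."""
    def hook_after_timestep(self):
        self.elevation -= 0.03

model = SubsidingDeltaModel(basin_slope=0.001)
for _ in range(5):
    model.update()
print(round(model.elevation, 2))       # net aggradation after 5 steps
```

Because the extension lives entirely in the subclass, the modified model remains a drop-in replacement in any workflow, which is what makes this pattern friendly to community development and reproducibility.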
In this clinic, we will provide a brief introduction to a selection of models (USGS and others), including FaSTMECH (2D/3D hydraulic) and PRMS (watershed hydrology), that have implemented a Basic Model Interface (BMI) and are available in the Python Modeling Toolkit (PyMT). We will interactively explore Jupyter Notebook examples of both stand-alone model operation and, as time permits, loosely coupled integrated modeling applications. Participants will need a laptop with a web browser. Knowledge of Python, Jupyter Notebook, and hydrologic/hydraulic modeling is helpful, but not required.  +
In this clinic, we will talk about diversity in a way that makes it approachable and actionable. We advocate that actions in support of diversity can happen at all career levels, so everyone who is interested can partake. We will discuss concrete strategies and opportunities to help you bring a diverse research group together. Creating a diverse group can start with reaching out to undergraduate minority students to engage them in undergraduate research experiences. This can be done from the ground up, i.e. by graduate students in a mentoring role as productively as by faculty in a hiring role. We are all supervisors and mentors in our own ways. We will highlight a number of approaches to engage with underrepresented minority students when recruiting new graduate students, and suggest some concrete adjustments to your recruitment processes to be as inclusive as possible. But being proactive does not stop after recruitment. The clinic will have dedicated discussion time to engage in role play, and will provide stories about situations in which you can be an ally. We will identify some pitfalls, ways to reclaim, and provide ideas for more inclusive meetings and mentoring. Lastly, together we can work on creating an overview of current programs that focus on diversity and inclusion, to apply for funding to take action.  +
In this clinic, we will use flow routing in models of various Earth surface processes, such as river incision. Landlab has several flow routing components that address multiple-flow routing, depression filling, and the diversity of grid types. We'll see how to design a landscape evolution model with relatively rapid flow routing execution times on large grids.  +
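The core idea of single-direction flow routing can be sketched in a toy implementation (a conceptual illustration on a tiny grid, not Landlab's API): each cell drains to its steepest-descent neighbor, and drainage area accumulates downstream when cells are processed from highest to lowest.

```python
import numpy as np

# D8-style steepest-descent flow accumulation on a 3x3 elevation grid.
z = np.array([[5., 4., 5.],
              [4., 3., 4.],
              [5., 2., 1.]])            # outlet at the lower-right corner

nrows, ncols = z.shape
acc = np.ones_like(z)                   # each cell contributes its own area

# Visit cells from highest to lowest so donors are resolved before receivers.
order = np.argsort(z, axis=None)[::-1]
for idx in order:
    r, c = divmod(idx, ncols)
    best, receiver = 0.0, None
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            rr, cc = r + dr, c + dc
            if (dr, dc) != (0, 0) and 0 <= rr < nrows and 0 <= cc < ncols:
                # Slope to neighbor; diagonals use the longer distance.
                drop = (z[r, c] - z[rr, cc]) / np.hypot(dr, dc)
                if drop > best:
                    best, receiver = drop, (rr, cc)
    if receiver is not None:
        acc[receiver] += acc[r, c]      # pass accumulated area downstream

print(acc[2, 2])
```

This prints 9.0: all nine cells drain to the outlet. Production components add the pieces this sketch omits, such as depression filling, boundary handling, and efficient ordering on large structured and unstructured grids.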
In this presentation several modeling efforts in Chesapeake Bay will be reviewed that highlight how we can use 3-dimensional, time-dependent hydrodynamic models to provide insight into biogeochemical and ecological processes in marine systems. Two modeling studies will be discussed which illustrate the application of individual-based modeling approaches to simulate the impact of 3-dimensional currents and mixing on pelagic organisms, and how these interact with behavior to determine the fate of planktonic species. There are many applications of this approach related to the transport and fate of fish and invertebrate (e.g., oyster) larvae, and also plankton, that can be used to inform management efforts.<br><br>A long-term operational modeling project will be discussed that combines mechanistic and empirical modeling approaches to provide nowcasts and short-term forecasts of Sea Nettles, HABs, pathogens, and also physical and biogeochemical properties for research, management and public uses in Chesapeake Bay. This powerful technique can be extended to any marine system that has a hydrodynamic model and to any marine organism for which the habitat can be defined. <br><br>Finally, a new research project will be reviewed where we are assessing the readiness of a suite of existing estuarine community models for determining past, present and future hypoxia events within the Chesapeake Bay, in order to accelerate the transition of hypoxia model formulations and products from academic research to operational centers. This work, which will ultimately provide the ability to do operational oxygen modeling in Chesapeake Bay (e.g., oxygen weather forecasts), can be extended to other coastal water bodies and any biogeochemical property.  +
In this presentation, James Byrne (Lead Research Software Engineer) and Jonathan Smith (Principal Research Scientist) from the British Antarctic Survey will be describing existing digital infrastructure projects and developments happening in and around BAS. They will give a flavour of how technology is influencing the development of environmental and polar science, covering numerous research and operational domains. They will be focusing on the digital infrastructure applied to IceNet, an AI-based deep learning infrastructure. We will then show how generalized approaches to digital infrastructure are being applied to other areas, including cutting-edge Autonomous Marine Operations Planning (AMOP) capabilities. We will end highlighting the challenges that need solving in working towards an Antarctic Digital Twin and how we might approach them.  +
In this talk, I will discuss the need for low carbon and sustainable computing. The current emissions from computing are almost 4% of the world total. This is already more than emissions from the airline industry and ICT emissions are projected to rise steeply over the next two decades. By 2040 emissions from computing alone will account for more than half of the emissions budget to keep global warming below 1.5°C. Consequently, this growth in computing emissions is unsustainable. The emissions from production of computing devices exceed the emissions from operating them, so even if devices are more energy efficient producing more of them will make the emissions problem worse. Therefore we must extend the useful life of our computing devices. As a society we need to start treating computational resources as finite and precious, to be utilized only when necessary, and as effectively as possible. We need frugal computing: achieving our aims with less energy and material.  +
In this webinar, I will present a new framework termed “Bayesian Evidential Learning” (BEL) that streamlines the integration of four components common to building Earth system models: data, model, prediction, decision. This idea is published in a new book, “Quantifying Uncertainty in Subsurface Systems” (Wiley-Blackwell, 2018), and applied to five real case studies in oil/gas, groundwater, contaminant remediation, and geothermal energy. BEL is not a method, but a protocol based on Bayesianism that leads to the selection of relevant methods to solve complex modeling and decision problems. In that sense, BEL focuses on purpose-driven data collection and model-building. One of the important contributions of BEL is that it is a data-scientific approach that circumvents complex inversion modeling, relying instead on machine learning from Monte Carlo simulations with falsified priors. The case studies illustrate how modeling time can be reduced from months to days, making the approach practical for large-scale implementations. In this talk, I will provide an overview of BEL and how it relies on global sensitivity analysis, Monte Carlo methods, model falsification, prior elicitation, and data-scientific methods to implement the stated principles of its Bayesian philosophy. I will cover an extensive case study involving the management of the groundwater system in Denmark.  +
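The prior-falsification step can be sketched schematically (the forward model, prior range, and observed datum below are invented for illustration and are not from the case studies): sample the prior, forward-simulate the observable by Monte Carlo, and check whether the actual observation is consistent with the prior's predictions.

```python
import numpy as np

rng = np.random.default_rng(42)

def forward_model(k):
    """Toy forward model: the observable responds linearly to parameter k."""
    return 2.0 * k + 1.0

prior_samples = rng.uniform(0.5, 2.0, size=1000)   # prior on k
simulated = forward_model(prior_samples)           # Monte Carlo predictions

d_obs = 3.2                                        # observed datum
lo, hi = np.percentile(simulated, [2.5, 97.5])
falsified = not (lo <= d_obs <= hi)                # outside the prior's range?
print(falsified)
```

If the observation falls outside the predictive range, the prior is falsified and must be revisited before any learning of the prediction from the data is attempted.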
In this workshop we will explore publicly available socioeconomic and hydrologic datasets that can be used to inform riverine flood risks under present-day and future climate conditions. We will begin with a summary of different stakeholders’ requirements for understanding flood risk data, through the lens of our experience working with federal, state and local clients and stakeholders. We will then guide participants through the relevant data sources that we use to inform these studies, including FEMA floodplain maps, census data, building inventories, damage functions, and future projections of extreme hydrologic events. We will gather and synthesize some of these data sources, discuss how each can be used in impact analyses, and discuss the limitations of each available data source. We will conclude with a brainstorming session to discuss how the scientific community can better produce actionable information for community planners, floodplain managers, and other stakeholders who might face increasing riverine flood risks in the future.  +
Increased computing power, high resolution imagery, new geologic dating techniques, and a more sophisticated comprehension of the geodynamic and geomorphic processes that shape our planet place us on the precipice of major breakthroughs in understanding links between tectonics and surface processes. In this talk, I will use the University of Washington’s “M9 project” to highlight research progress and challenges in coupled tectonics and surface processes studies over both short (earthquake) and long (mountain range) timescales. A Cascadia earthquake of magnitude 9 (M9) would cause shaking, liquefaction, landslides and tsunamis from British Columbia to northern California. The M9 project explores this risk, resilience, and the mechanics of Cascadia subduction. At the heart of the project are synthetic ground motions generated from 3D finite difference simulations for 50 earthquake scenarios including factors not previously considered, such as the distribution and timing of energy release on the fault, the coherent variation of frequency content of fault motion with fault depth, and the 3D effects of the deep basins along Puget Sound. Coseismic landslides, likely to number in the thousands, represent one of the greatest risks to the millions of people living in Cascadia. Utilizing the synthetic ground motions and a Newmark sliding block analysis, we compute the landscape response for different landslide failure modes. Because an M9 subduction earthquake is well known to have occurred just over 300 years ago, evidence of coseismic landslides triggered by this event should still be present in Washington and Oregon landscapes. We are systematically hunting for these landslides using a combination of radiocarbon dating and surface roughness analysis, a method first developed to study landslides near the site of the 2014 Oso disaster, to develop more robust regional landslide chronologies to compare with model estimates. 
Resolved ground motions and hillslope response for a single earthquake can then be integrated into coupled landscape evolution and geodynamic models to consider the topographic and surface processes response to subduction over millions of years. This example demonstrates the power of an integrative, multidisciplinary approach to provide deeper insight into coupled tectonic and surface processes phenomena over a range of timescales.  
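The Newmark sliding-block analysis mentioned above can be sketched in a few lines: the block accumulates permanent downslope displacement by double-integrating the portion of the ground acceleration that exceeds a critical (yield) acceleration. This is a hedged, purely illustrative toy with a synthetic sinusoidal ground motion, not the M9 project's actual implementation or ground-motion data.

```python
# Newmark sliding-block sketch: integrate sliding velocity while ground
# acceleration exceeds the critical acceleration a_c (values illustrative).
import math

def newmark_displacement(accel, dt, a_c):
    """accel: ground acceleration series (m/s^2); a_c: critical acceleration."""
    v = 0.0          # relative sliding velocity
    d = 0.0          # cumulative permanent displacement
    for a in accel:
        if a > a_c or v > 0.0:
            v += (a - a_c) * dt          # block accelerates or decelerates
            v = max(v, 0.0)              # sliding stops, never reverses
            d += v * dt
    return d

# synthetic 2 Hz, 3 m/s^2 ground motion for 10 s sampled at 100 Hz
dt = 0.01
accel = [3.0 * math.sin(2 * math.pi * 2.0 * i * dt) for i in range(1000)]
d_weak = newmark_displacement(accel, dt, a_c=0.5)    # weak slope: more sliding
d_strong = newmark_displacement(accel, dt, a_c=2.5)  # strong slope: less sliding
```

A lower critical acceleration (a weaker or steeper slope) yields larger permanent displacement for the same shaking, which is the basis for mapping coseismic landslide susceptibility from synthetic ground motions.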
Increasing physical complexity, spatial resolution, and technical coupling of numerical models for various earth systems require increasing computational resources, efficient code bases and tools for analysis, and community co-development. In these arenas, climate technology industries have leapfrogged academic and government science, particularly with regard to adoption of open community code and collaborative development and maintenance. In this talk, I will discuss industry coding practices that I have brought into my workflow for efficient and rapid development; easier maintenance, collaboration, and learning; and reproducibility.  +
Interested in which variables influence your model outcome? SALib (Sensitivity Analysis Library) provides commonly used sensitivity analysis methods implemented as a Python package. In this clinic we will use these methods with example models to apportion uncertainty in model output to model variables. We will use models built with the Landlab Earth-surface dynamics framework, but the analyses can be easily adapted for other model software. No prior experience with Landlab or Python is necessary.  +
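The idea behind apportioning output uncertainty to inputs can be illustrated with a simple one-at-a-time (OAT) perturbation, the most basic of the methods SALib generalizes (SALib's Sobol and Morris methods are far more rigorous). The toy model and parameter names below are made up for illustration and are not SALib's API:

```python
# Conceptual one-at-a-time (OAT) sensitivity sketch: relative change in
# model output per relative change in each input parameter.
def model(K, D):
    # toy stand-in for a model run: "relief" from erodibility K and diffusivity D
    return 1.0 / K + 0.5 * D

def oat_sensitivity(model, base, deltas):
    """Normalized sensitivity of model output to each parameter."""
    y0 = model(**base)
    sens = {}
    for name, frac in deltas.items():
        perturbed = dict(base)
        perturbed[name] = base[name] * (1 + frac)   # perturb one input at a time
        sens[name] = (model(**perturbed) - y0) / (y0 * frac)
    return sens

base = {"K": 1e-4, "D": 0.01}
s = oat_sensitivity(model, base, {"K": 0.1, "D": 0.1})
# here output is dominated by K; D barely matters
```

Variance-based methods such as Sobol sampling extend this idea by exploring the full parameter space and accounting for interactions between inputs, which OAT cannot capture.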
Introduction for the CSDMS 2020 annual meeting, presenting last year's accomplishments and available resources for the community.  +
Introduction for the CSDMS 2021 annual meeting  +
Introduction to the Natural Hazard workshop  +
It is now well established that the evolution of terrestrial species is highly impacted by long-term topographic changes (e.g., the high biodiversity in mountain ranges globally). Recent advances in landscape and biological models have opened the door to deep investigation of the feedbacks between topographic changes and biological processes (e.g., dispersal, adaptation, speciation) over timescales of millions of years. In this clinic, we will use novel codes that couple biological processes with FastScape, a widely used landscape evolution model, and explore biological processes and speciation during and after mountain building under different magnitudes of tectonic rock uplift rates. We will explore and deduce how the magnitude and pace of mountain building impact biodiversity and how such interactions can be tracked in mountain ranges today. Python and Jupyter Notebooks will be used in the clinic, and basic knowledge of Python is desirable.  +
It is well established that coupling and strong feedbacks may occur between solid Earth deformation and surface processes across a wide range of spatial and temporal scales. As both systems on their own encapsulate highly complex and nonlinear processes, fully-coupled simulations require advanced numerical techniques and a flexible platform to explore a multitude of scenarios. Here, we will demonstrate how the Advanced Solver for Problems in Earth's Convection and Tectonics (ASPECT) can be coupled with FastScape to examine feedbacks between lithospheric deformation and landscape evolution. The clinic will cover the fundamental equations being solved, how to design coupled simulations in ASPECT, and examples of coupled continental extension and landscape evolution.  +
JOSS is a developer-friendly, peer-reviewed academic journal for research software packages, providing a path to academic credit for scholarship disseminated via software. I'll give a tour of the journal, its submission/review process, and opportunities to get involved.  +
Jupyter Notebooks can be powerful tools for classroom teaching. This clinic explores different ways to use notebooks in teaching, common pitfalls to avoid, and best practices. It also introduces the CSDMS OpenEarthscape Hub, an online resource for instructors that eliminates the need to install software and gives students direct access to various CSDMS tools.  +
Jupyter notebooks provide a very convenient way to communicate research results: they may contain narrative text, live code, equations and visualizations all in a single document. Beyond notebooks, the Jupyter ecosystem also provides many interactive, graphical components (widgets) that can be used within notebooks to further enhance the user experience. Those widgets serve a variety of purposes such as 2D (Ipympl, Bqplot, Ipycanvas) or 3D (Ipygany) scientific visualization, 2D (Ipyleaflet) or 3D (Pydeck) maps, etc. When the target audience is not familiar with coding, it is possible to turn Jupyter notebooks into interactive dashboards and publish them as stand-alone web applications (using Voilà). In this workshop, we will learn how to leverage this powerful Jupyter environment to build custom, interactive dashboards for exploring models of Earth surface processes in contexts like research, teaching and outreach. After introducing the basics of Jupyter widgets, we will focus on more advanced examples based on Fastscape and/or Landlab. We will also spend some time on hands-on exercises as well as brainstorming dashboard ideas. Clinic materials and installation instructions can be found here: https://github.com/benbovy/jupyter-dash-csdms2021 Related links: - https://github.com/fastscape-lem/gilbert-board - https://github.com/fastscape-lem/ipyfastscape  +
Jurjen will share how FloodTags uses human observations from online media to detect and analyze new (and past) flood events. He also introduces a new approach to citizen engagement via chatbots in instant messengers. With this, local needs are revealed in detail and low-threshold two-way communication about flood risk is possible, even down to community level. How can these new techniques be functional in current flood risk management practices?  +
Landlab  +
Landlab is a Python toolkit for building, coupling, and exploring two-dimensional numerical models of Earth-surface dynamics. HydroShare is an online collaborative environment for sharing data and models that allows users to run models remotely, without needing to install software locally. This clinic will illustrate example Landlab models and how to run them on HydroShare. It will provide an introduction to Landlab’s features and capabilities, including how to create a model grid, populate it with data, and run numerical algorithms for surface hydrology, hillslope sediment transport, and stream incision. We will illustrate how models can be used for both research and teaching purposes.  +
Landlab is a Python-based toolkit for building, coupling, and exploring two-dimensional numerical models of Earth-surface dynamics. This clinic will first provide a short hands-on introduction to Landlab's features and capabilities. We will highlight examples from several existing models built within the Landlab framework, including: coupling of local ecohydrologic processes, spatial plant interactions, and disturbances (fires and grazing); landscape evolution impacted by plants; overland flow impacted by changing soil properties; and effects of topographic structure on species distribution and evolution. Models will be run with various scenarios for climate change and anthropogenic disturbances, and evolution of state variables and fluxes across the landscape will be explored. We will also show the use of gridded climate data products to drive Landlab simulations. Participants are encouraged to install Landlab on their computers prior to the clinic. Installation instructions can be found at: http://landlab.github.io (select "Install" from the menu bar at the top of the page).  +
Landlab is a Python-language programming library that supports efficient creation of two-dimensional (2D) models of diverse earth-surface systems. For those new to Landlab, this clinic will provide a hands-on introduction to Landlab's features and capabilities, including how to create a grid, populate it with data, and run basic numerical algorithms. For experienced Landlab users, we will review some of the new features in this first full-release version, explore how to create integrated models by combining pre-built process components, and learn the basics of writing new components. Participants are encouraged to install Landlab on their computers prior to the clinic. Installation instructions can be found at: http://landlab.github.io (select "Install" from the menu bar at the top of the page). Clinic participants who have particular questions or applications in mind are encouraged to email the conveners ahead of the CSDMS meeting so that we can plan topics and exercises accordingly.  +
Landscape evolution involves manifold processes from different disciplines, including geology, geomorphology and ecohydrology, often interacting nonlinearly at different space-time scales. While this gives rise to fascinating patterns of interconnected networks of ridges and valleys, it also challenges Landscape Evolution Models (LEMs), which typically rely on long-term numerical simulations and mostly have only current topographies for comparison. While adding process complexity (and presumably realism) is certainly useful to overcome some of these challenges, it also exacerbates issues related to proper calibration and simulation. This talk advocates more focus on the theoretical analysis of LEMs to alleviate some of these issues. By focusing on the essential elements that distinguish landscape evolution, the resulting minimalist LEMs become more amenable to dimensional analysis and other methods of nonlinear field equations, used for example in fluid mechanics and turbulence, offering fertile ground to sharpen model formulation (i.e., the stream-power erosion term), unveil distinct dynamic regimes (e.g., unchannelized, from incipient valley formation, transitional and statistically self-similar fractal regimes), and properly formulate questions related to the existence of steady-state solutions (as opposed to a situation of space-time chaos, similar to geomorphological turbulence). We also discuss benchmarks for evaluating numerical simulation and novel avenues for numerical methods, as well as ways to bridge between spatially discrete models (i.e., river networks) and continuous, partial-differential-equation models.  +
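A minimalist LEM of the kind discussed above can be written in a few lines: a 1-D detachment-limited profile evolving under uplift U and stream-power erosion K A^m S^n, with drainage area prescribed by Hack's law. All parameter values here are illustrative, not taken from the talk:

```python
# Minimalist 1-D detachment-limited LEM sketch: dz/dt = U - K A^m S^n.
n_nodes, dx, dt, steps = 50, 100.0, 100.0, 20000
U, K, m, n = 1e-3, 1e-5, 0.5, 1.0

# node 0 is the outlet (fixed baselevel); node n_nodes-1 is the divide;
# drainage area from Hack's law, A ~ (distance from divide)^1.67
A = [1.0 + ((n_nodes - 1 - i) * dx) ** 1.67 for i in range(n_nodes)]
KA = [K * a ** m for a in A]            # precompute K * A^m per node
z = [0.0] * n_nodes                     # start from a flat profile

for _ in range(steps):
    for i in range(1, n_nodes):
        S = max(z[i] - z[i - 1], 0.0) / dx      # downstream slope
        z[i] += dt * (U - KA[i] * S ** n)       # uplift minus incision
    z[0] = 0.0                                   # fixed baselevel

# at steady state each node satisfies S = (U / (K A^m))^(1/n)
```

Even this stripped-down model exhibits the concave-up equilibrium profile and the uplift-erosion balance that dimensional analysis of the stream-power term predicts, which is what makes minimalist formulations attractive for theoretical study.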
Landscape evolution models often generalize hydrology by assuming steady-state discharge to calculate channel incision. While this assumption is reasonable for smaller watersheds or larger precipitation events, non-steady hydrology is a more applicable condition for semi-arid landscapes, which are prone to short-duration, high-intensity storms. In these cases, the impact of a hydrograph (non-steady method) may be significant in determining long-term drainage basin evolution. This project links a two-dimensional hydrodynamic algorithm with a detachment-limited incision component in the Landlab modeling framework. Storms of varying intensity and duration are run across two synthetic landscapes, and incision rate is calculated throughout the hydrograph. For each case, peak discharge and total incision are compared to the values predicted by steady-state to evaluate the impact of the two hydrologic methods. We explore the impact of different critical shear stress values on total incision using the different flow methods. Finally, a watershed will be evolved to topographic steady-state using both the steady- and non-steady flow routing methods to identify differences in overall relief and drainage network configuration. Preliminary testing with no critical shear stress threshold has shown that although non-steady peak discharge is smaller than the peak predicted by the steady-state method, total incised depth from non-steady methods exceeds the steady-state derived incision depth in all storm cases. With the introduction of an incision threshold, we predict there will be cases where the steady-state method overestimates total incised depth compared to the non-steady method. Additionally, we hypothesize that watersheds evolved with the non-steady method will be characterized by decreased channel concavities. 
This work demonstrates that when modeling landscapes characterized by semi-arid climates, the choice of hydrology method can significantly impact the resulting morphology.  
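Why an incision threshold can reverse the steady vs. non-steady comparison can be seen with a deliberately simplified toy: a short steady burst of discharge versus a longer, lower-peaked hydrograph of equal water volume, driving a hypothetical excess-stress incision law (rate ~ max(Q^0.6 - tau_c, 0)). This is an illustration of the general mechanism only, not the Landlab components or parameter values used in the study:

```python
# Toy comparison of total incision from a steady burst vs a hydrograph of
# equal water volume, with and without a critical threshold tau_c.
def total_incision(discharges, dt, tau_c):
    """Hypothetical detachment law: incision rate ~ max(Q^0.6 - tau_c, 0)."""
    return sum(max(q ** 0.6 - tau_c, 0.0) * dt for q in discharges)

dt = 1.0                                        # minutes
steady = [10.0] * 60                            # steady burst: Q = 10 for 60 min
n = 150                                         # hydrograph: lower peak, longer
hydro = [8.0 * (1 - abs(2 * t / (n - 1) - 1)) for t in range(n)]
# both series carry (nearly) the same water volume: ~600 units

no_thr = (total_incision(hydro, dt, 0.0), total_incision(steady, dt, 0.0))
with_thr = (total_incision(hydro, dt, 3.6), total_incision(steady, dt, 3.6))
```

With no threshold, the concave dependence of stress on discharge rewards the longer, lower hydrograph, so the non-steady total exceeds the steady one; with a threshold, the hydrograph's low flows do no work at all and the ranking can flip, mirroring the kind of regime change the abstract predicts.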
Landscapes developed in rock layers of differing erodibility are common on Earth, as well as on other planets. Hillslopes carved into the soft rock are typically characterized by steep, linear-to-concave up slopes or “ramps” mantled with material derived from the resistant layers above, often in the form of large blocks. To better understand the role of sediment size in hillslope evolution, we developed a 1-D numerical model of a hogback. The hybrid continuum-discrete model uses a traditional continuum treatment of soil transport while allowing for discrete, rules-based motion of large blocks of rocks. Our results show that feedbacks between weathering and transport of the blocks and underlying soft rock can create relief over time and lead to the development of concave-up slope profiles in the absence of rilling processes. In addition, the model reaches a quasi-steady state in which the topographic form and length of the ramp remains constant through time. We use an analytic approach to explore the mechanisms by which our model self-organizes to this state, including adjustment of soil depth, erosion rates, and block velocities along the ramp. An agreement of analytic solutions with the model shows that we understand its behavior well, and can carefully explore implications for hillslope evolution in the field. Current work explores the interactions between blocky hillslopes and channels in a 2-D numerical model built in Landlab. Our models provide a framework for exploring the evolution of layered landscapes and pinpoint the processes for which we require a more thorough understanding to predict their evolution over time.  +
Landscapes of the US Central Lowland were repeatedly affected by the Laurentide Ice Sheet. Glacial processes diminished relief and disrupted drainage networks. Deep valleys carved by glacial meltwater were disconnected from the surrounding uplands. The upland area lacking surface water connection to the drainage network is referred to as non-contributing area (NCA). Decreasing fractions of NCA on older surfaces suggest that NCA becomes drained over time. We propose that the integration could occur via: 1) capture of NCA as channels propagate into the upland or, 2) subsurface or intermittent surface connection of NCA to external drainage networks providing increased discharge to promote channel incision. We refer to the two cases as “disconnected” and “connected” since the crucial difference between them is the hydrological connection of the upland to external drainage. We investigate the differences in evolution and morphology of channel networks in low relief landscapes under disconnected and connected regimes using the Landlab landscape evolution modeling platform. We observe substantially faster rates of erosion and integration of the channel network in the connected case. The connected case also creates longer, more sinuous channels than the disconnected case. Sensitivity tests indicate that hillslope diffusivity has little influence on the evolution and morphology. The fluvial erosion coefficient has significant impact on the rate of evolution, and it influences the morphology to a lesser extent. Our results and a qualitative comparison with landscapes of the glaciated US Central Lowland suggest that connection of NCAs is a potential control on the evolution and morphology of post-glacial landscapes.  +
Landslides mobilize tons of sediment in the blink of an eye. From an engineering perspective, one typically looks at topographical relief as a causal factor triggering landslides. From a geomorphological perspective, one could wonder how landslides and landslide derived sediment alter the evolution of landscapes. Curious to find out what landslides do with the evolution of landscapes? Tune in for this webinar to figure out how to use the Landlab HyLands component to address this question.  +
Launched in 2021 through a cooperative agreement with the National Science Foundation’s Coastlines and People (CoPe) Program, the Megalopolitan Coastal Transformation Hub is a partnership among 13 institutions, focused on four intertwined goals: 1) Doing science that is useful and used, specifically by facilitating flexible, equitable, and robust long-term planning to manage climate risk in the urban megaregion spanning Philadelphia, New Jersey, and New York City, 2) Doing science that advances human understanding of how coastal climate hazards, coastal landforms, and human decisions at household, municipal, market, and policy scales interact to shape climate risk, 3) Training the next generation of leaders in transdisciplinary climate research and engagement, 4) Building a sustainable academic/stakeholder co-production partnership model for just, equitable, and inclusive climate action in diverse coastal, urban megaregions around the world. MACH's initial work has focused particularly on Philadelphia and its surroundings. Core themes within this work include: 1) Characterization of compound flood and heat+flood hazard and risk 2) The role of insurance in the interrelated insurance/mortgage/housing markets 3) The impacts of flood risk on municipal finances 4) Improving equity considerations in the design of strategies to manage flood risks 5) Household decision-making regarding flood risk in low-income, renter-dominated neighborhoods This talk will introduce MACH and highlight emerging lessons from MACH's transdisciplinary research and engagement model.  +
Live demonstration  +
Macrobenthic species that live within or on top of estuarine sediments can destabilize local mud deposits through bioturbating activities. The resulting enhanced sediment availability will affect large-scale morphological change. We numerically model two contrasting bioturbating species by means of our novel literature-based eco-morphodynamic model. We find significant effects on local mud accumulation and bed elevation change leading to a large-scale reduction in deposited mud. In turn, the species-dependent mud content redefines their habitat and constrains species abundances. Combined species runs reveal a new ecological feedback facilitating survival of the dominant species as a result of combined eco-engineering activity.  +
Major fault systems are the primary manifestation of localized strain at tectonic plate boundaries. Slip on faults creates topography that is constantly reworked by erosion and sediment deposition. This in turn affects the stress state of the brittle upper crust. Numerical models commonly predict that surface processes can modulate the degree of strain localization, i.e., the partitioning of strain onto a given number of master faults and/or the lifespan of individual faults. The detailed mechanisms, potential magnitude, and geological evidence for such feedbacks however remain debated. We address this problem from the perspective of continental rifts, and at the scale of individual fault-bounded structures. Half-grabens in particular constitute ideal natural laboratories to investigate brittle deformation mechanisms (e.g., fault localization, elasto-plastic flexure...) in relation to continued erosion of the master fault footwall and sediment deposition on the hanging wall. Through an energy balance approach, we show that suppressing relief development in a half-graben can significantly enhance the lifespan of its master fault if the upper crust is moderately strong. Simple geodynamic simulations where tectonic topography is either entirely leveled or perfectly preserved confirm our analytical predictions.<br><br>Natural systems, however, lie somewhere in between these two endmembers. To better represent the true efficiency of surface processes at redistributing surficial masses, we couple a 2-D long-term tectonic code with a landscape evolution model that incorporates stream power erosion, hillslope diffusion, and sediment deposition. We identify a plausible range of landscape evolution parameters through morphological analyses of real normal fault-bounded massifs from the East African Rift and Western United States. This allows us to assess the sensitivity of half-graben evolution to a documented range of rheological, climatic, and lithological conditions. 
We find that half-grabens that reach topographic steady-state after a short amount of extension (~1 km) are more likely to accumulate master fault offsets on par with the thickness of the upper crust. Conversely, a longer phase of topographic growth, for example due to low rock erodibility, will favor the initiation of a new master fault and the abandonment of the initial one. A less erodible crust could thus be more prone to extension on a series of horsts and grabens, while more erodible units would deform as long-lived half-grabens. Lithological controls on erodibility could therefore constitute a form of structural inheritance in all geodynamic contexts.  
Major societal and environmental challenges require forecasting how natural processes and human activities affect one another. There are many areas of the globe where climate affects water resources and therefore food availability, with major economic and social implications. Today, such analyses require significant effort to integrate highly heterogeneous models from separate disciplines, including geosciences, agriculture, economics, and social sciences. Model integration requires resolving semantic, spatio-temporal, and execution mismatches, which are largely done by hand today and may take more than two years. The Model INTegration (MINT) project will develop a modeling environment which will significantly reduce the time needed to develop new integrated models, while ensuring their utility and accuracy. Research topics to be addressed include: 1) New principle-based semiautomatic ontology generation tools for modeling variables, to ground analytic graphs to describe models and data; 2) A novel workflow compiler using abductive reasoning to hypothesize new models and data transformation steps; 3) A new data discovery and integration framework that finds new sources of data, learns to extract information from both online sources and remote sensing data, and transforms the data into the format required by the models; 4) A new methodology for spatio-temporal scale selection; 5) New knowledge-guided machine learning algorithms for model parameterization to improve accuracy; 6) A novel framework for multi-modal scalable workflow execution; and 7) Novel composable agroeconomic models.  +
Man-made objects - 'junk', bombs, artificial reefs, containers - litter the seafloor. Many of the munitions remain active (unstable) and polluting, and are a danger to seabed engineering projects. We numerically modeled how they may move during powerful storms. Kinematic analysis per wave cycle (by period and orbital velocity) created a matrix of movement probabilities, which were convolved with spatial (mapped) values of the same across the German Bight, taking bomb type and sediment type into account. The model can look at historical patterns of migration, and even predict movement in real time as a storm evolves hour by hour.  +
Mangroves are halophytic tree communities distributed along tropical and subtropical coastlines. They provide invaluable services, such as blue carbon storage, coastal protection and habitat for thousands of species. Despite their global importance, their responses to rapid climate change are yet to be fully understood. In particular, it is unclear how mangroves will respond to future increases in net evaporation rates (i.e., evaporation - precipitation), which generally lead to an increase in the concentration of soil stressors such as sulfide and sulfate. We addressed this knowledge gap by collecting remote sensing data from a number of remote mangrove islands across the Caribbean and coupling them with a numerical model that describes mangrove vegetated area as a function of net evaporation rate, outer edge island salinity, and hydraulic conductivity of the soil. We found that this modeling framework can capture the variability observed in our mangrove island database, suggesting that an increase in net evaporation rates leads to significant reductions in mangrove island vegetation. Moreover, based on future net evaporation rate scenarios from global climate models, we find this trend will likely continue and predict that mangrove islands across the Caribbean will experience significant reductions in vegetated area.  +
Many geophysical models require parameters that are not tightly constrained by observational data. Calibration refers to methods by which these parameters are estimated by minimizing the difference between observational data and model-simulated equivalents (the objective function). Additionally, uncertainty in the estimated parameters is determined. In this clinic we will cover the basics of model calibration including: (1) determining an appropriate objective function, (2) major classes of calibration algorithms, (3) interpretation of results. In the hands-on portion of the clinic, we will apply multiple calibration algorithms to a simple test case. For this, we will use Dakota, a package that supports the application of many different calibration algorithms.  +
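The core calibration loop described above (pick an objective function, search parameter space for its minimum) can be sketched with a toy weighted least-squares problem. The exponential-recession model, parameter name, and brute-force search below are hypothetical stand-ins; Dakota supplies far smarter algorithms (gradient-based, derivative-free, Bayesian) behind the same idea:

```python
# Toy calibration: minimize a weighted sum of squared residuals between
# synthetic observations and a one-parameter model.
import math

def simulate(k, times):
    """Toy model: exponential recession, Q(t) = Q0 * exp(-k t)."""
    return [10.0 * math.exp(-k * t) for t in times]

def objective(k, obs, times, weights):
    """Weighted sum of squared residuals (the quantity calibration minimizes)."""
    sim = simulate(k, times)
    return sum(w * (o - s) ** 2 for w, o, s in zip(weights, obs, sim))

times = [0.0, 1.0, 2.0, 3.0, 4.0]
obs = [10.0 * math.exp(-0.3 * t) for t in times]   # synthetic "truth", k = 0.3
weights = [1.0] * len(times)

# brute-force search over the single parameter; real calibrators iterate smartly
best_k = min((i / 1000 for i in range(1, 1000)),
             key=lambda k: objective(k, obs, times, weights))
```

With real data the objective would not reach zero at the optimum, and the curvature of the objective around `best_k` is what parameter-uncertainty estimates are built from.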
Many geoscientists and geoscience organizations vowed to work towards equity and committed to anti-racist action in 2020. But getting started on and staying committed to diversity, equity, and inclusion (DEI) work takes time, energy, and education. This clinic will be a learning and sharing space for everyone who is on a journey towards building a more equitable research unit. Everyone can participate in this clinic, regardless of whether you are just starting your journey or you have travelled many miles and whether your research unit is one person or 100 people. The clinic will begin with discussion and thought exercises about your personal identity. We will then think about what it means for our individual research units to be diverse, equitable, and inclusive. Finally, we will discuss actions you can take to build an anti-racist research unit. Participants will be invited to share their current DEI actions and discuss how they can be adapted for, or expanded in, other settings. The clinic aims to foster an environment in which participants can learn from each other, but participants will not be required to share. Upon completion of this clinic every participant should have a plan for implementing at least one new DEI action, including milestones and accountability checks.  +
Many problems of interest to CSDMS members involve solving systems of conservation laws or balance laws for water wave propagation and inundation, erosion and sediment transport, landscape evolution, or for the flow of overland floods, glaciers, lava, or groundwater. It is often natural to solve these partial differential equations numerically with finite volume methods, in which the domain of interest is divided into finite grid cells and the quantities of interest within each grid cell are updated every time step due to fluxes across the cell boundaries and/or processes within the cell. I will give a brief introduction to some of the general theory of finite volume methods and considerations that affect their accuracy and numerical stability, with illustrations from some of the applications mentioned above.  +
Meandering is one of the most unique processes in Earth surface dynamics. Integrating the Kinoshita high-sinuosity curve describing meander channel planform geometry into a modified version of the Beck equations describing riverbed topography, a prototype for a synthetic riverbed topography generating model is made for idealized meandering rivers. Such a method can readily be extended to apply to any arbitrary river centerline, resulting in the synthetic riverbed topography model, pyRiverBed, presented herein. A meander migration and neck cutoff submodel is also embedded in pyRiverBed; however, unlike existing meander evolution models, the present model emphasizes generating the riverbed topography for each snapshot during the migration process. The present model can help meandering river researchers to interpolate field-measured bathymetry data using the synthetic bed, to design non-flatbed laboratory flumes for experiments, and to initialize hydrodynamic and sediment transport numerical models. It can also provide guidance in stream restoration projects on designing a channel with a bed in morphodynamic equilibrium.  +
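Generating a centerline from the Kinoshita curve amounts to integrating a direction angle along arc length: theta(s) = th0*sin(ks) + th0^3*(Js*cos(3ks) - Jf*sin(3ks)), where th0 controls sinuosity and Js, Jf control skewness and flatness. The coefficient convention below follows the commonly cited form and the parameter values are illustrative; consult pyRiverBed itself for the exact convention it uses:

```python
# Sketch: build (x, y) centerline coordinates by integrating the Kinoshita
# direction-angle series along arc length s (forward Euler on cos/sin theta).
import math

def kinoshita_centerline(th0, Js, Jf, wavelength, n=2000):
    k = 2 * math.pi / wavelength
    ds = 2 * wavelength / n            # trace two wavelengths of centerline
    x, y, s = [0.0], [0.0], 0.0
    for _ in range(n):
        theta = th0 * math.sin(k * s) + th0 ** 3 * (
            Js * math.cos(3 * k * s) - Jf * math.sin(3 * k * s))
        x.append(x[-1] + ds * math.cos(theta))
        y.append(y[-1] + ds * math.sin(theta))
        s += ds
    return x, y

x, y = kinoshita_centerline(th0=1.0, Js=1 / 192, Jf=1 / 128, wavelength=100.0)
```

Because the curve is traced at unit speed along arc length, the along-channel length is fixed (here 200 units) while the end-to-end distance shrinks as th0 grows, which is how sinuosity is controlled; the bed-topography equations are then evaluated in the curvilinear coordinates of this centerline.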
Melting of the Greenland Ice Sheet contributes to rising global sea levels. However, local sea level along much of the Greenland coast is falling due to postglacial rebound and a decrease in gravitational attraction from the ice sheet. This affects Greenlandic coastal communities, which have to adapt their coastal infrastructure, shipping routes, and subsistence fisheries. The “Greenland Rising” project is a collaboration between Lamont-Doherty Earth Observatory and the Greenland Institute of Natural Resources that focuses on assessing and preparing for changing sea level along Greenland’s coastline. While sea level is predicted to fall, the exact magnitude varies widely depending on past and present ice change as well as the viscoelastic properties of the subsurface. I will demonstrate how current sea level change depends on these parameters and how we can integrate numerical models of glacial isostatic adjustment with observations of past sea level and present-day uplift to constrain them. I will further briefly describe the role of co-production in this project, which has allowed us to coordinate bathymetric surveys with local stakeholders from the municipality, industry, and local Hunters and Fishers organization. Combining numerical predictions of sea level change with baseline bathymetry and benthic mapping promises to provide communities with a clearer picture of future environmental change.  +
Model analysis frameworks specify the approaches by which models and data are combined to simulate a system of interest. A given modeling framework will provide methods for model parameterization, data and model error characterization, sensitivity analysis (including identifying observations and parameters important to calibration and prediction), uncertainty quantification, and so on. Some model analysis frameworks suggest a narrow range of methods, while other frameworks try to place a broader range of methods in context. Testing is required to understand how well a model analysis framework is likely to work in practice. Commonly, models are constructed to produce predictions, and here the accuracy and precision of predictions are considered.<br><br>The design of meaningful tests depends in part on the timing of system dynamics. In some circumstances the predicted quantity is readily measured and changes quickly, such as for weather (temperature, wind and precipitation), floods, and hurricanes. In such cases meaningful tests involve comparing predictions and measured values, and tests can be conducted daily, hourly or even more frequently. The benchmarking tests in rainfall-runoff modeling, such as HEPEX, are in this category. The theoretical rating curves of Kean and Smith provide promise for high flow predictions. Though often challenged by measurement difficulties, short timeframe systems provide the simplest circumstance for conducting meaningful tests of model analysis frameworks.<br><br>If measurements are not readily available and(or) the system responds to changes over decades or centuries, as generally occurs for climate change, saltwater intrusion of groundwater systems, and dewatering of aquifers, prediction accuracy needs to be evaluated in other ways. 
For example, in recent work two methods were used to identify the likely accuracy of different methods used to construct models of groundwater systems (including parameterization methods): (1) results of complex and simple models were compared and (2) cross-validation experiments. These and other tests can require massive computational resources for any but the simplest of problems. In this talk we discuss the importance of model framework testing in these longer-term circumstances and provide examples of tests from several recent publications. We further suggest that for these long-term systems, the design and performance of such tests are essential for the responsible development of model frameworks, are critical for models of these environmental systems to provide enduring insights, and are one of the most important uses of high performance computing in natural resource evaluation.  
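The cross-validation idea mentioned above can be made concrete with a toy illustration of my own (not the authors' groundwater workflow): hold out each observation in turn, recalibrate the model on the rest, and score the prediction error on the held-out value.

```python
import numpy as np

# Leave-one-out cross-validation sketch: the "fit" callable stands in for
# model calibration, "predict" for running the calibrated model.
def loo_cv_error(x, y, fit, predict):
    errors = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i            # leave observation i out
        params = fit(x[mask], y[mask])           # calibrate on the rest
        errors.append(predict(params, x[i]) - y[i])
    return np.sqrt(np.mean(np.square(errors)))   # RMS prediction error

# Hypothetical example: a linear "model" fit to noise-free synthetic data.
fit = lambda x, y: np.polyfit(x, y, 1)
predict = lambda p, xi: np.polyval(p, xi)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 * x + 1.0
rmse = loo_cv_error(x, y, fit, predict)
```

For real groundwater models each "fit" call is itself an expensive calibration run, which is why the abstract notes that such tests can require massive computational resources.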
Modelling and simulation are critical approaches to addressing geographic and environmental issues. To date, a great many geo-analysis models have been developed to simulate geographic phenomena and processes, and these can be used to solve environmental, atmospheric and ecological problems. Models developed by different groups are heterogeneous and difficult to share with others. As a result, numerous international groups and organizations have designed standards to unify geo-analysis models, such as OpenMI, BMI and OpenGMS-IS. Models that follow a specific standard can be shared and reused within that standard's framework; however, they still cannot be reused under other standards. Thus, model interoperation may help models be shared and reused across standards. This research aims to design an interoperability solution that helps users reuse geo-analysis models based on other standards. We discussed several solutions for model interoperation and analyzed the features of different standards. First, we developed three solutions for model interoperation between different standards and discussed their advantages and disadvantages. Then, we analyzed the key features of model interoperation, including model field mapping, function conversion, data exchange, and component reorganization. Finally, we developed an interoperability engine for interoperation between models based on OpenMI, BMI, or OpenGMS-IS. We also provide case studies (using e.g. SWMM, FDS, and the Permamodel Frost Number component) to demonstrate model interoperation.  +
Modelling network-scale sediment (dis)connectivity and its response to anthropic pressures provides a foundational understanding of river processes and sediment dynamics that can be used to forecast future trajectories of river form and process. We present the basin-scale, dynamic sediment connectivity model D-CASCADE, which combines concepts of network modelling with empirical sediment transport formulas to quantify spatiotemporal sediment (dis)connectivity in river networks. The D-CASCADE framework describes sediment connectivity in terms of transfer rate through space and time while accounting for several hydro-morphological and anthropic factors affecting sediment transport. Add-ons can be integrated into D-CASCADE to model local changes in river geomorphology driven by sediment-induced variations in channel features. Here, we show an application of D-CASCADE to the well-documented Bega River catchment, NSW, Australia, where major geomorphic changes have occurred in the network after European settlement in the 1850s, including widespread channel erosion and sediment mobilization. By introducing historic drivers of change in the correct chronological sequence, the D-CASCADE model successfully reproduced the timing and magnitude of major phases of sediment transport and associated channel adjustments over the last two centuries. With this confidence, we then ran the model to test how well it performs at estimating future trajectories of basin-scale sediment transport and sediment budgets at the river reach scale.  +
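For flavor, here is one widely used empirical bedload relation of the kind such network models draw on, the Meyer-Peter and Müller (1948) formula. Whether D-CASCADE uses this exact relation is an assumption of this sketch, not a claim about the model:

```python
import numpy as np

# Meyer-Peter & Mueller-type bedload relation (a standard textbook form;
# not necessarily the formula D-CASCADE itself applies).
def mpm_bedload(shields, d50, s=2.65, g=9.81, theta_c=0.047):
    """Bedload transport rate per unit width (m^2/s) from Shields stress."""
    excess = np.maximum(shields - theta_c, 0.0)  # no transport below threshold
    qstar = 8.0 * excess**1.5                    # dimensionless transport rate
    return qstar * np.sqrt((s - 1.0) * g * d50**3)

q = mpm_bedload(shields=0.1, d50=0.002)  # 2 mm gravel at Shields stress 0.1
```

A network model evaluates a relation like this reach by reach and time step by time step, routing the resulting sediment fluxes downstream.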
Modelling river physical processes is of critical importance for flood protection, river management and restoration of riverine environments. Because of the continuous increase in computational power and the development of novel numerical algorithms, numerical models are nowadays widely and routinely used. The freeware BASEMENT is a flexible tool for one- and two-dimensional river process simulations that bundles solvers for hydrodynamics, morphodynamics, scalar advection-diffusion and feedbacks with riparian vegetation. The adoption of a fully cost-free workflow and a lightweight GUI facilitates its broad utilization in research, practice and education. In this seminar I introduce the different tools within the BASEMENT suite, present some domains of application, and describe ongoing developments.  +
Modern photogrammetry allows us to make very accurate three-dimensional models using images from consumer-grade cameras. Multiview stereo photogrammetry, also known as structure from motion (SfM), is now easily accessible. Coupled with drones, this is transformative technology that lets us all make better maps than the National Geodetic Survey could not long ago. This hands-on course will demonstrate the basic tools and provide some tips that will allow you to map your favorite field area with 3 - 5 cm horizontal resolution and vertical RMS errors of less than 10 cm. Even better resolution can be obtained for smaller areas, such as outcrops, archaeological digs, or your daughter's art project.<br>We will use Agisoft Photoscan Pro software...please download the free demo (Pro) version before the class. It works equally well on Mac, Windows, and Linux. If you have a choice, choose a machine with an NVidia graphics card. We encourage you to collect a set of images to bring to the class. Guidelines on how best to take images intended for SfM will be sent around before the meeting.  +
Montane Cloud Forests (MCFs) are globally relevant ecological zones that spend the majority of their growing season in cloud and fog. Prior eco-physiological studies have demonstrated that MCFs are incredibly efficient at assimilating CO2 during photosynthesis. This increased efficiency is attributed to how plants in these ecosystems operate within their unique microclimates. Specifically, MCF trees maintain high photosynthesis rates under fog and low cloud conditions. While this has been observed and quantified in lab and field experiments, current sub-models of plant-atmosphere interactions within Earth systems models (ESMs) cannot recreate enhanced levels of gas exchange measured in ecophysiology studies. This lack of understanding leads to high uncertainty in ESM estimates of evapotranspiration and carbon assimilation rates for MCF ecosystems. It is critical to improve our estimates of MCF hydrologic and photosynthetic processes as these ecosystems are vulnerable to drought and microclimatic conditions are likely to be altered by climate change. This talk will explore the gaps in our process-based understanding of water, energy, and carbon budgets for MCFs, how these gaps lead to uncertainties in ESMs at different spatial and temporal scales, and how we can address these gaps in future work.  +
Natural disasters push the process of scientific discovery to its limits: Their enormous scale makes them difficult to recreate in the lab, their destructive power and rare occurrence limit the possibility of acquiring field data, and their profoundly nonlinear behavior over a wide range of scales poses significant modeling challenges. In this talk, I explore how we can leverage insights from four different natural systems to contribute to our fundamental scientific understanding of the role that multiphase processes play in the onset and evolution of extreme events and to our ability to mitigate associated risks.  +
No abstract  +
No abstract  +
No abstract has been submitted  +
No abstract submitted  +
No abstract was needed for this meeting  +
No abstract was needed for this meeting  +
No abstract was needed for this workshop  +
No abstract was needed for this workshop  +
No abstract was needed for this workshop  +
No abstract was needed for this workshop  +
No abstract was needed for this workshop  +
No abstract was needed for this workshop  +
No abstract was needed for this workshop  +
No abstract was needed for this workshop  +
No abstract was needed for this workshop  +
No abstract was needed for this workshop  +
No abstract was needed for this workshop  +
No abstract was needed for this workshop  +
No abstract was needed for this workshop  +
No abstract was provided for this presentation  +
No abstract was provided for this presentation  +
No abstract was provided for this presentation.  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting  +
No abstract was required for this meeting.  +
No abstract was required for this workshop  +
No abstract was required for this meeting  +
Numerical modeling is at the core of prediction in coastal settings. Observational data are used in tandem with models for a variety of modeling tasks, but perhaps the coupling could be tighter. I will discuss a range of machine learning tools that co-workers and I have integrated with coastal morphodynamic models; these allow for a tight coupling of models and data and provide morphodynamic insight.  +
Numerical models describe the world around us mathematically, allowing us to visualize changes to physical systems through both space and time. These models are essential tools for geoscientists, but writing your own model can be a daunting task. In this clinic, we’ll develop an understanding of what numerical models are, and then we’ll delve into the math that functions as the basis for many models. Participants will learn how to apply basic conservation principles to develop equations that describe a physical system that changes through time. This workshop will expose participants to deriving differential equations and to using basic Python programming to visualize their solutions. Prior experience is not necessary.  +
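The sort of derivation this clinic walks through can be made concrete: "rate of change = input − output" for a reservoir gives dV/dt = I − kV, which a short Euler loop solves. This is a generic illustration of the conservation-principle approach, not the clinic's actual materials:

```python
import numpy as np

# Conservation statement dV/dt = inflow - k*V, discretized with forward
# Euler. The analytic steady state is V = inflow / k.
def reservoir(v0=0.0, inflow=1.0, k=0.5, dt=0.01, steps=2000):
    v = v0
    history = []
    for _ in range(steps):
        v += dt * (inflow - k * v)   # the conservation statement, discretized
        history.append(v)
    return np.array(history)

v = reservoir()
# steady state: dV/dt = 0  =>  V = inflow / k = 2.0
```

Plotting `v` against time shows the exponential approach to equilibrium that participants would visualize in Python.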
Numerical stratigraphic modelling of the impact of paleoclimate changes on earthscape evolution and sedimentary basin stratigraphy is of great value to better understand and predict the impact of global warming and increasingly frequent extreme events on the environment. To illustrate the contribution of stratigraphic modelling, we propose a modular model, ArcaDES (a.k.a. Dionisos), able to simulate geological processes in 3D on large scales of space and time (tens to hundreds of kilometres, and thousands to tens of millions of years). ArcaDES is a 3D software written in C++ and implemented within the Arcane object-oriented high-performance computing platform co-developed by the CEA and IFPEN. This modular code includes three main components to handle hydrology, accommodation space and sediment transport. Taking into account precipitation, evaporation and soil infiltration capacity, the first component calculates steady-state runoff, surface and ground water flows, and water table elevation. The second component considers tectonic subsidence and uplift, flexure, sea level variations and sediment compaction to define the accommodation space. The third component deals with time-averaged physical laws describing erosion, transport by fluvial and marine currents, and deposition of sediments from fluvial to deep-marine systems to calculate sediment distribution and stratigraphic architecture. This stratigraphic forward model is applied to two case studies, the Congo Basin and the Alboran Sea, to illustrate the impact of the last Holocene glaciations on the deep-sea fan of the Congo and the contouritic systems in the Alboran Sea.  +
Observations in coastal environments show that seabed resuspension can impact water quality and biogeochemical dynamics by vertically mixing sediment and water, and by redistributing material that has been entrained into the water column. Yet, ocean models that incorporate both sediment transport and biogeochemical processes are rare. The scientific community frequently utilizes hydrodynamic-sediment transport numerical models, but hydrodynamic-biogeochemical models ignore or simplify sediment processes, and have not directly accounted for the effect of resuspension on oxygen and nutrient dynamics.<br><br>This presentation focuses on development and implementation of HydroBioSed, a coupled hydrodynamic-sediment transport-biogeochemistry model that was developed within the open-source Regional Ocean Modeling System (ROMS) framework. HydroBioSed can account for processes including advection, resuspension, diffusion within the seabed and at the sediment-water interface, organic matter remineralization, and oxidation of reduced chemical species. Implementations of the coupled HydroBioSed model for different locations, including the Rhone River subaqueous delta and the northern Gulf of Mexico, have helped to quantify the effects of both sediment transport and biogeochemical processes. Results indicate that resuspension-induced exposure of anoxic, ammonium-rich portions of the seabed to the more oxic, ammonium-poor water column can significantly affect seabed-water column fluxes of dissolved oxygen and nitrogen. Also, entrainment of seabed organic matter into the water column may significantly draw down oxygen concentrations in some environments. Ongoing work focuses on how resuspension and redistribution of organic matter and sediment may influence oxygen dynamics in the Chesapeake Bay.  +
One of the challenges for modelers is to get their results into the hands of potential users. We do this by creating informative and relevant maps, charts, and indicators. Sometimes we try to go further: we want end-users to 'feel' the model, using techniques like haptic interaction and extended reality. We do this to help the user gain a better understanding (exploration, interaction), to develop a shared concept (by socializing around the model), or to provide the user with an immersive experience (using photorealistic rendering). Using the BMI interface, which we also use for model coupling, we have changed several models from passive to interactive. We integrated these interactive models into different environments, such as the recently developed Virtual River Game and the Coastal Sandbox. Here we present recent developments, technical considerations and the results of the user studies that helped shape our vision towards more effective scientific communication and interaction.  +
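The BMI pattern mentioned here boils down to a small set of control and data functions (initialize, update, get_value, set_value) through which a GUI, game, or another model can step a simulation interactively. The toy model below is my own minimal sketch of that pattern, not the authors' code:

```python
import numpy as np

# A toy 1D heat-diffusion model exposing a BMI-style interface: external
# code drives it through initialize / update / get_value / set_value.
class HeatBMI:
    def initialize(self, n=10):
        self.t = 0.0
        self.temp = np.zeros(n)

    def update(self, dt=0.1, alpha=0.25):
        # explicit diffusion step on interior cells; boundaries held fixed
        self.temp[1:-1] += alpha * np.diff(self.temp, 2)
        self.t += dt

    def get_value(self, name):
        assert name == "temperature"
        return self.temp.copy()

    def set_value(self, name, value):
        assert name == "temperature"
        self.temp[:] = value

model = HeatBMI()
model.initialize()
spike = np.zeros(10)
spike[5] = 1.0                             # a hot spot in the middle
model.set_value("temperature", spike)
for _ in range(5):                          # an interactive client would
    model.update()                          # call update() per frame
temp = model.get_value("temperature")
```

Because the interface is uniform, the same client code can drive any model that implements it, which is what makes the passive-to-interactive conversion described in the abstract practical.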
One of the most intriguing issues in fine sediment transport, including turbidity currents, current-driven transport and wave-driven transport, is that the presence of sediments may significantly attenuate flow turbulence. Depending on the level of turbulence suppression, it may lead to the formation of a lutocline (a sharp negative gradient of sediment concentration) which further encourages offshore-directed gravity flow, or it may cause catastrophic collapse of turbulence and sediment deposition. Through idealized 3D turbulence-resolving simulations of fine sediment (mud) transport in the wave bottom boundary layer based on a pseudo-spectral scheme, our recent studies show that the transition between these flow modes can be caused by varying degrees of sediment-induced stable density stratification. This effort demonstrates the success of using a turbulence-resolving simulation tool to diagnose complex fine sediment transport processes. This talk further reports our recent development of this turbulence-resolving numerical model with the goal of providing a predictive tool for more realistic fine sediment transport applications.<br/><br/>Assuming a small Stokes number (St<0.3), which is appropriate for typical fine sediment, the equilibrium approximation to the Eulerian two-phase flow equations is applied. The resulting simplified equations are solved with a high-accuracy hybrid spectral-compact finite difference scheme. The numerical approach extends the earlier pseudo-spectral model with a sixth-order compact finite difference scheme in the bed-normal direction. The compact finite difference scheme allows easy implementation of flow-dependent sediment properties and complex bottom boundary conditions. Hence, several new capabilities are included in the numerical simulation, such as rheological stress (enhanced viscosity at high sediment concentration), hindered settling, an erodible/depositional bottom boundary, and higher-order inertia terms critical for the fine sand fraction.<br/><br/>In the past decade, the role of the wave bottom boundary layer in delivering fine sediment offshore via wave-supported gravity currents (WSGC) has been well recognized. We hypothesize that the generation, transport and termination of WSGC are directly associated with the flow modes discussed previously. In addition to the well-known Richardson number control (i.e., associated with sediment-induced density stratification), in this talk we will discuss how enhanced viscosity via rheological stress and high erodibility of the mud bed (e.g., low critical shear stress for an unconsolidated mud bed) can trigger catastrophic collapse of turbulence and sediment deposition. The significance of bed erodibility in determining the resulting flow modes motivates future study regarding the effect of sand fraction on fine sediment transport via armoring.  
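The Richardson number control mentioned above can be illustrated with a simple diagnostic. This is my own sketch with assumed profile shapes, not the authors' spectral solver: the gradient Richardson number compares buoyancy frequency squared to shear squared, with values above roughly 0.25 signaling turbulence suppression by stratification.

```python
import numpy as np

# Gradient Richardson number Ri_g = N^2 / S^2 for a sediment-stratified
# boundary layer (illustrative profiles; names and shapes are assumptions).
def gradient_richardson(z, u, rho, g=9.81, rho0=1000.0):
    dudz = np.gradient(u, z)
    drhodz = np.gradient(rho, z)
    n2 = -(g / rho0) * drhodz            # buoyancy frequency squared
    s2 = dudz**2                          # shear squared
    return n2 / np.maximum(s2, 1e-12)     # guard against zero shear

z = np.linspace(0.01, 1.0, 50)            # height above bed (m)
u = 0.5 * np.log(z / 0.005)               # log-law velocity profile
rho = 1000.0 + 5.0 * np.exp(-z / 0.1)     # sediment-laden near-bed layer
ri = gradient_richardson(z, u, rho)
```

In a wave boundary layer, both profiles oscillate through the wave cycle, so the model resolves when and where Ri_g crosses the suppression threshold rather than evaluating it once.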
OpenFOAM® is an open-source computational fluid dynamics platform, built upon a finite-volume framework with the Message Passing Interface (MPI). In the past decade, OpenFOAM® has become increasingly popular among researchers who are interested in fluvial and coastal processes. In this clinic, recent progress in developing OpenFOAM® for several coastal applications will be discussed. In particular, we will focus on three subjects: (1) wave-induced seabed dynamics (pore-pressure response), (2) stratified flow applications, particularly laboratory-scale river plume modeling, and (3) 3D large-eddy simulation of wave-breaking and suspended sediment transport processes.<br>Hands-on exercises will be given for 3D large-eddy simulation of wave-breaking processes to illustrate several important insights on how to use OpenFOAM® to carry out high-quality large-eddy simulations. Some cautionary notes and limitations will also be discussed.  +
Opening of the CSDMS 2023 annual meeting  +
Opening of the CSDMS annual meeting  +
Opening of the meeting  +
Opening of the meeting  +
Opening of the meeting  +
Our extensive transdisciplinary efforts since 2010 in the northern Gulf of Mexico (Mississippi, Alabama, and the Florida panhandle) have resulted in an advanced capability to model and assess hydrodynamic and ecological impacts of climate change at the coastal land margin (visit http://agupubs.onlinelibrary.wiley.com/hub/issue/10.1002/(ISSN)2328-4277.GULFSEARISE1/). The concerted efforts of natural and social scientists as well as engineers have contributed to a paradigm shift that goes well beyond “bathtub” approaches. Potential deleterious effects on barrier islands, shorelines, dunes, marshes, etc., are now better understood. This is because the methodology enables assessment of not just eustatic sea level rise (SLR), but gets to the basis of projections of climate change and the associated impacts, i.e., carbon emission scenarios. The paradigm shift, input from coastal resource managers, and future expected conditions now provide a rationale to evaluate and quantify the ability of natural and nature-based feature (NNBF) approaches to mitigate the present and future effects of surge and nuisance flooding.<br>Over the majority of the 20th century, the largely linear rate of eustatic SLR was realized by thermal expansion of seawater as a function of a gradual increase in the average annual global temperature. Global satellite altimetry indicates that the rate of global mean SLR has accelerated from approximately 1.6 to 3.4 mm/year. While the year-by-year acceleration of the rate of rise cannot be measured adequately, it is reasonable to assume that it was relatively stable throughout the 20th century. For the 21st century, general circulation models project that prescribed atmospheric carbon emission scenarios will result in higher global average temperatures. A warmer global system will introduce new mechanisms (e.g., land ice loss, isostatic adjustments, and changes in land water storage) that will contribute to relatively abrupt changes in sea levels. 
The additions to thermal expansion will drive higher sea levels and the increases in sea level will be attained by further accelerations in the rate of the rise. Because of the nature of the new mechanisms that will govern sea levels, it is unlikely that future accelerations in the rate of rise will be smooth.<br>To further address the complications associated with relatively abrupt changes in SLR and related impacts of climate change at the coastal land margin we intend to: (1) refine, enhance, and extend the coupled dynamic, bio-geo-physical models of coastal morphology, tide, marsh, and surge; (2) advance the paradigm shift for climate change assessments by linking economic impact analysis and ecosystem services valuation directly to these coastal dynamics; (3) pursue transdisciplinary outcomes by engaging a management transition advisory group throughout the entire project process; and (4) deliver our results via a flexible, multi-platform mechanism that allows for region-wide or place-based assessment of NNBFs. This presentation will share examples of our recent efforts and discuss progress to date.  
Our understanding of human systems has been synthesized and advanced by computationally representing human decision-making in agent-based models. Whether representing individuals, households, firms, or larger organizations, agent-based modelling approaches are often used to model processes (e.g., urban growth, agricultural land management) that directly affect and are affected by natural systems. Contemporary efforts coupling models of human and natural systems have demonstrated that results significantly differ from isolated representations of either system. However, coupling models of human and natural systems is conceptually and computationally challenging. In addition to discussing these challenges and approaches to overcoming them, this talk will also suggest that research quantifying natural processes at the decision-making scale of the land user is needed. Using structure-from-motion and unmanned aerial vehicle (UAV) imagery, we can quantify natural processes like soil erosion to a high level of accuracy and show that frequently modelled processes (e.g., flow accumulation) typically differ from reality. Novel data from the field or parcel scale are needed to calibrate and validate our representation of natural processes if we are to advance our representation of feedbacks between natural processes and human decision-making. By improving our representation of both natural processes and human decision-making at the scale of the decision-maker, we add confidence in our ability to scale out to larger spatial extents that are reflective of natural processes (e.g., watershed) or policy driving human decisions from municipal, state, or national governments.  +
Overview and Update of CSDMS accomplishments  +
Panel discussion  +
Panel discussion on AI/ML  +
Parametric insurance represents a major breakthrough in the accessibility of risk financing for natural disasters. Instead of compensating for actual assessed loss, parametric (or index-based) insurance instead uses measurement of the hazard itself as a proxy for loss, paying out a pre-agreed amount for an event with certain intensity, location and, sometimes, duration. This allows for rapid settlement and reduced costs – of claims adjustment / processing and in the margin added by risk takers for uncertainty in projected outcomes. The quantitative, independent and objective nature of EO data, and also its availability in real time, makes it ideal as a basis for parametric insurance, particularly in the developing world where claims data for policy pricing is non-existent. Examples of parametric products based on EO data already in the market include protection against high and low rainfall, use of vegetation greenness indices, and footprint mapping as a basis for flood protection.  +
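The payout mechanism described above — a pre-agreed amount triggered by measured hazard intensity rather than assessed loss — can be sketched in a few lines. The trigger, exit, and limit values below are illustrative assumptions, not from any real product:

```python
def parametric_payout(index_value, trigger, exit_point, limit):
    """Linear index-insurance payout: nothing at or below the trigger,
    the full limit at or above the exit point, scaled linearly between."""
    if index_value <= trigger:
        return 0.0
    if index_value >= exit_point:
        return limit
    return limit * (index_value - trigger) / (exit_point - trigger)

# Hypothetical drought product keyed to an EO-derived rainfall-deficit index (mm)
print(parametric_payout(50, trigger=40, exit_point=80, limit=100_000))  # 25000.0
```

Because settlement depends only on the measured index, no claims adjustment is needed, which is what allows the rapid, low-cost payouts described above.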
Part 1 will focus on the use of Doodler (https://github.com/Doodleverse/dash_doodler), a 'human-in-the-loop' labeling tool for image segmentation (described in this paper: https://agupubs.onlinelibrary.wiley.com/doi/10.1029/2021EA002085). We'll cover the two primary uses of Doodler; a) for relatively rapid image segmentation of a small set of images, and b) for making libraries of labeled imagery for training Machine Learning models to automate the process of image segmentation on larger datasets. We'd ideally like participants to label the same imagery in-class so we can discuss image interpretation and label agreement. This may even result in a publishable dataset; participants would receive co-authorship and could opt-in/out. We will provide example datasets and models, but participants will also be encouraged to bring their own imagery sets. That way, participants will have time to familiarize themselves with the burgeoning Doodleverse tools (https://github.com/Doodleverse) in between classes on their own data.  +
Part 2 will focus on the use of Segmentation Gym (https://github.com/Doodleverse/segmentation_gym), for training and implementing deep-learning-based image segmentation models. Participants will be given datasets and models to use for their own model building and implementation, or optionally they may use their own data, for example label images they made in Part 1. Hardware needs, and common problems and their workarounds will be discussed.  +
Participants in this clinic will learn how to compile and run a Regional Ocean Modeling (ROMS) test case for an idealized continental shelf. The hydrodynamic model that we will use includes wave forcing and suspended sediment transport.<br/><br/>ROMS is an open source, three dimensional primitive equation hydrodynamic ocean model that uses a structured curvilinear horizontal grid and a stretched terrain following vertical grid. For more information see https://www.myroms.org. It currently has more than 4,000 registered users, includes modules for sediment transport and biogeochemistry, and has several options for turbulence closures and numerical schemes. Model input is specified using a combination of ASCII text files and NetCDF (Network Common Data Form) files. Output is written to NetCDF files. In part because ROMS was designed to provide flexibility for the choice of model parameterizations and processes, and to run in parallel, implementing the code can seem daunting, but in this clinic we will present an idealized ROMS model that can be run on the CSDMS cluster.<br/><br/>As a group, we will compile and run an idealized ROMS model on the CSDMS computer, Beach. The group will choose a modification to the standard model. While the modified model runs, we will explore methods for visualizing model output. Participants who have an account on Beach can try to run the model themselves. Clinic participants who have Matlab set up to visualize NetCDF files will be able to browse model output files during the clinic.<br/><br/>Following the clinic, participants should have access to tools for looking at ROMS output, an example ROMS model run, and experience with ROMS input and output files.  +
Participants in this clinic will learn how to run a Regional Ocean Modeling System (ROMS) test case for an idealized continental shelf model domain within the CSDMS Web Modeling Tool (WMT). The model implementation that we will use includes wave forcing, a riverine source, and suspended sediment transport.<br><br>ROMS is an open source, three-dimensional primitive equation hydrodynamic ocean model that uses a structured curvilinear horizontal grid and a stretched terrain following vertical grid. For more information see https://www.myroms.org. It currently has more than 4,000 registered users, and the full model includes modules for sediment transport and biogeochemistry, and several options for turbulence closures and numerical schemes. In part because ROMS was designed to provide flexibility for the choice of model parameterizations and processes, and to run in parallel, implementing the code can seem daunting, but in this clinic, we will present an idealized ROMS model that can be run on the CSDMS cluster via the WMT. One goal is to provide a relatively easy introduction to the numerical modeling process that can be used within upper level undergraduate and graduate classes to explore sediment transport on continental shelves.<br><br>As a group, we will run an idealized ROMS model on the CSDMS computer, Beach. The group will choose a modification to the standard model. While the modified model runs, we will explore methods for visualizing model output. Participants who have access to WMT can run the model themselves. Clinic participants who have access to Matlab and/or Panoply will be able to browse model output files during the clinic.<br><br>Following the clinic, participants should have access to an example ROMS model run, experience running ROMS within the WMT and with ROMS input and output files, and ROMS lesson plans.  +
Participatory modeling (PM) is a collaborative approach to formalize shared representations of a problem and design and test solutions through a joint modeling process. PM is well-suited for addressing complex social and environmental problems like climate change, social and economic injustice, and sustainable resource management. This workshop will introduce and test a prototype version of Fora.ai, a new PM platform developed at Northeastern University. Fora.ai is a simple digital environment that enables groups to collaboratively understand real world problems and create novel solutions. Stakeholders interact through this digital representation with input from other stakeholders, then iteratively revise and test solutions until diverse needs are addressed. Fora.ai provides quick simulation results for data-driven proof of concepts that are ready to be presented, designed, and implemented in the real world, giving everyone in a team the power to share their unique perspective and build the world they want to live in together.  +
Participatory modeling (PM) is a collaborative approach to formalize shared representations of a problem and design and test solutions through a joint modeling process. PM is well-suited for addressing complex social and environmental problems like climate change, social and economic injustice, and sustainable resource management. This workshop will introduce and test a prototype version of Fora.ai, a new PM platform developed at Northeastern University. Fora.ai is a simple digital environment that enables groups to collaboratively understand real world problems and create novel solutions. Stakeholders interact through this digital representation with input from other stakeholders, then iteratively revise and test solutions until diverse needs are addressed. Fora.ai provides quick simulation results for data-driven proof of concepts that are ready to be presented, designed, and implemented in the real world, giving everyone in a team the power to share their unique perspective and build the world they want to live in together.  +
Permafrost is one of the Arctic climate indicators, and feedback of thawing permafrost to the global climate system through the impacts on the carbon cycle remains an important research topic. Observations can assess the current state of permafrost, but models are eventually essential to make predictions of future permafrost state.<br>In this 2hr clinic, we will present a new, easy-to-access and comprehensive cyberinfrastructure for permafrost modeling. The ‘PermaModel Integrated Modeling Toolbox’ includes three permafrost models of increasing complexity. The IMT is embedded within the Community Surface Dynamics Modeling System Web Modeling Tool (WMT). We include multiple sets of sample inputs, representing a variety of climate and soil conditions and locations, to enable immediate use of the IMT.<br>The hands-on clinic teaches students and researchers how to run and use several permafrost models. The presented models are envisioned to be suitable for quick exploration of hypotheses and for teaching purposes.  +
Permafrost is one of the Arctic climate indicators, and feedback of thawing permafrost to earth surface processes and vice versa is a research frontier. Observations can assess the current state of permafrost, but models are eventually essential to make predictions of future permafrost state and impacts on surface processes. In this 2hr clinic, we will present a new, easy-to-access and comprehensive cyberinfrastructure for permafrost modeling. The ‘Permafrost Modeling Toolbox’ includes three permafrost models of increasing complexity. The tools are embedded within the Community Surface Dynamics Modeling System Web Modeling Tool. We include multiple sets of sample inputs, representing a variety of climate and soil conditions and locations, to enable immediate use of the tools.<br>The hands-on clinic teaches students and researchers how to run and use several permafrost models with associated datasets. The presented models are envisioned to be suitable for quick exploration of hypotheses and for teaching purposes. We will also explore options for model coupling, demonstrating an example of a model of coastal/delta sedimentation in permafrost environments.  +
Plastic pollution is a ubiquitous issue impacting the health of marine ecosystems worldwide. Yet, critical knowledge gaps surrounding the fate and transport of plastic once it enters the ocean impede remediation and prevention efforts. Predicting transport is difficult for any particle in the ocean, but microplastics present a particular challenge because their size and density fall outside the regimes of traditionally studied environmental particles such as low-density bubbles and high-density sediment. In this talk I will discuss recent work addressing these challenges with both modelling and experiments.  +
Plate tectonics is the primary process controlling the Earth’s surface topography. In recent years, geodynamicists have emphasised the role that deep mantle flow may play in directly creating long wavelength, low amplitude topography (a so-called “dynamic” contribution to surface topography). In parallel, geomorphologists have investigated how surface processes (erosion, transport and sedimentation) may affect dynamic topography, with the aim of better understanding its signature in the geological record. To achieve this, we have developed a new class of surface processes models that represent the combined effects of physical erosion and chemical alteration within continental interiors. In developing these models, we have paid much attention to maintaining high efficiency and stability such that they could be used to model large continental areas with sufficient spatial resolution to represent the processes at the appropriate scale. I will briefly present these algorithms as well as the results of two separate studies in which we explain the anomalously rapid erosion of surface material during the passage of a continent over a fixed source of dynamic topography driven by upward flow in the mantle. I will also comment on how these models are strongly dependent on precipitation patterns and, ultimately, will need to be fully coupled to climate models to provide more meaningful constraints on the past evolution of surface topography.  +
Predicting long-term Earth surface change, the impacts of short-term natural hazards and biosphere/geosphere dynamics requires computational models. Many existing numerical models quantitatively describe sediment transport processes, predicting terrestrial and coastal change at a great variety of scales. However, these models often address a single process or component of the earth surface system. The Community Surface Dynamics Modeling System is an NSF-funded initiative that supports the open software efforts of the surface processes community. CSDMS distributes >200 models and tools, and provides cyberinfrastructure to simulate lithosphere, hydrosphere, atmosphere, or cryosphere dynamics. Many of the most exciting problems in these fields arise at the interfaces of different environments and through complex interactions of processes. This workshop presents recent cyberinfrastructure tools for hypothesis-driven modeling— the Python Modeling Toolkit (PyMT) and Landlab. PyMT allows users to interactively run and couple numerical models contributed by the community. There are already tools for coastal & permafrost modeling, stratigraphic and subsidence modeling, and terrestrial landscape evolution modeling (including hillslope, overflow, landslide processes, and a suite of erosion processes with vegetation interactions), and these are easy to run and further develop in a Python environment. This 2-part tutorial aims to provide a short overview of the PyMT and Landlab, a demonstration of running a coupled model, and hands-on exercises using Jupyter notebooks in small groups of attendees. The organizers will facilitate break-out groups for discussion of pressing research needs and then have a plenary discussion with reports of each of the breakouts on future frontier applications of coupled landscape/bioscape process modeling. Materials for this clinic can be found at: https://github.com/csdms/csdms-2020  +
Predicting long-term Earth surface change, the impacts of short-term natural hazards and biosphere/geosphere dynamics requires computational models. Many existing numerical models quantitatively describe sediment transport processes, predicting terrestrial and coastal change at a great variety of scales. However, these models often address a single process or component of the earth surface system. The Community Surface Dynamics Modeling System is an NSF-funded initiative that supports the open software efforts of the surface processes community. CSDMS distributes >200 models and tools, and provides cyberinfrastructure to simulate lithosphere, hydrosphere, atmosphere, or cryosphere dynamics. Many of the most exciting problems in these fields arise at the interfaces of different environments and through complex interactions of processes. This workshop presents recent cyberinfrastructure tools for hypothesis-driven modeling— the Python Modeling Toolkit (PyMT) and Landlab. PyMT allows users to interactively run and couple numerical models contributed by the community. There are already tools for coastal & permafrost modeling, stratigraphic and subsidence modeling, and terrestrial landscape evolution modeling (including hillslope, overflow, landslide processes, and a suite of erosion processes with vegetation interactions), and these are easy to run and further develop in a Python environment. This 2-part tutorial aims to provide a short overview of the PyMT and Landlab, a demonstration of running a coupled model, and hands-on exercises using Jupyter notebooks in small groups of attendees. The organizers will facilitate break-out groups for discussion of pressing research needs and then have a plenary discussion with reports of each of the breakouts on future frontier applications of coupled landscape/bioscape process modeling. Materials for this clinic can be found at: https://github.com/csdms/csdms-2020  +
Process-based modeling offers interpretability and physical consistency in many domains of geosciences but struggles to leverage large datasets efficiently. Machine-learning methods, especially deep networks, have strong predictive skills yet are not easily interpretable and are unable to answer specific scientific questions. A recently proposed genre of physics-informed machine learning, called “differentiable” modeling (DM, https://t.co/qyuAzYPA6Y), trains neural networks (NNs) with process-based equations (priors) together in one stage (so-called “end-to-end”) to benefit from the best of both NNs and process-based paradigms. The NNs do not need target variables for training but can be indirectly supervised by observations matching the outputs of the combined model, and differentiability critically supports learning from big data. We propose that differentiable models are especially suitable as global- or continental-scale geoscientific models because they can harvest information from big earth observations to produce state-of-the-art predictions (https://mhpi.github.io/benchmarks/), enable physical interpretation naturally, extrapolate well (due to physical constraints) in space and time, enforce known physical laws and sensitivities, and leverage progress in modern AI computing architecture and infrastructure. Differentiable models can also synergize with existing process-based models in terms of providing to them parameters or identifying optimal processes, learning from the lessons of the community. Differentiable models can answer pressing societal questions on water resources availability, climate change impact assessment, water management, and disaster risk mitigation, among others. We demonstrate the power of differentiable modeling using computational examples in rainfall-runoff modeling, river routing, ecosystem and water quality modeling, and forcing fusion. 
We discuss how to address potential challenges such as implementing gradient tracking for implicit numerical schemes and addressing process tradeoffs. Furthermore, we show how differentiable modeling can enable us to ask fundamental questions in hydrologic sciences and get robust answers from big global data.  
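The end-to-end idea above can be illustrated without any ML framework: run a process-based model forward, carry the derivative of its state with respect to a parameter through the same recurrence (forward-mode differentiation), and use that exact gradient to calibrate against observations. The linear-reservoir model and all numbers below are illustrative assumptions, not the authors' actual models:

```python
import math

def simulate(k, precip):
    """Linear-reservoir model Q_t = k*S_t, S_{t+1} = S_t + P_t - Q_t,
    while also propagating dQ_t/dk exactly through the recurrence."""
    S, dS = 0.0, 0.0          # storage and its sensitivity to k
    Q, dQ = [], []
    for P in precip:
        Q.append(k * S)
        dQ.append(S + k * dS)
        # Tuple assignment uses the old S and dS on the right-hand side:
        S, dS = S + P - k * S, (1.0 - k) * dS - S
    return Q, dQ

precip = [1.0 + 0.5 * math.sin(0.3 * t) for t in range(60)]
Q_obs, _ = simulate(0.3, precip)      # synthetic "observations" (true k = 0.3)

k = 0.6                               # first guess
for _ in range(2000):                 # gradient descent on sum of squared errors
    Q, dQ = simulate(k, precip)
    grad = sum(2.0 * (q - qo) * dq for q, qo, dq in zip(Q, Q_obs, dQ))
    k = min(max(k - 0.005 * grad, 0.01), 0.99)

print(round(k, 3))  # recovers ~0.3
```

In a real differentiable model, the recurrence would be a full process-based simulator, the hand-written derivative would come from automatic differentiation, and a neural network (rather than a scalar) would supply the parameters — but the gradient flow through the physics is the same.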
Process-based models are able to predict velocity fields, sediment transport and associated morphodynamic developments over time. These models can generate realistic morphological patterns and stable morphodynamic developments over time scales of millennia under schematized model settings. However, more realistic case studies raise questions on model skill and confidence levels. Process-based models require detailed information on initial conditions (e.g. sediment characteristics, initial distribution of sediment fractions over the model domain), process descriptions (e.g. roughness and sediment transport formulations) and forcing conditions (e.g. time varying hydrodynamic and sediment forcing). The value of the model output depends to a high degree on the uncertainty associated with these model input parameters.<br/><br/>Our study explores a methodology to quantify model output uncertainty levels and to determine which parameters are responsible for the largest output uncertainty. Furthermore we explore how model skill and uncertainty develop over time. We describe the San Pablo Bay (USA) case study and the Western Scheldt (Netherlands) case study in a 100 year hindcast and a more than 100 year forecast.<br/><br/>Remarkably, model skill and uncertainty levels depend on model input parameter variations only to a limited extent. Model skill is low in the first decades, but increases to become excellent after 70 years. A possible explanation is that the interaction of the major tidal forcing and the estuarine plan form governs morphodynamic development in confined environments to a high degree.  +
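Morphodynamic hindcast skill of this kind is commonly scored with a Brier Skill Score against a no-change baseline (1 is perfect, 0 is no better than assuming the initial bed persists). A minimal sketch with illustrative numbers, not data from the study:

```python
def brier_skill_score(predicted, observed, baseline):
    """BSS = 1 - MSE(model, obs) / MSE(baseline, obs)."""
    mse = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
    return 1.0 - mse(predicted, observed) / mse(baseline, observed)

# Observed bed-level change (m) vs. a model run and a "no change" baseline
observed  = [1.0, 2.0, 3.0]
predicted = [1.1, 2.1, 2.9]
baseline  = [0.0, 0.0, 0.0]
bss = brier_skill_score(predicted, observed, baseline)
print(round(bss, 4))  # 0.9979
```

Skill rising through a hindcast, as described above, corresponds to the BSS climbing toward 1 as the cumulative observed change grows relative to the model's errors.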
Proposed in 2018, the Open Modeling Foundation (OMF) initiative aims to establish an international open science community to enable the next generation modeling of human and natural systems. The OMF is envisioned as an alliance of modeling organizations that develops and administers community-wide open modeling standards and best practices for the social, ecological, environmental, and geophysical sciences. It will support these efforts through informational, data, and technological resources for the scientific communities it serves. This webinar reviews the history of the OMF, its current status, future plans, and how scientists can participate in this initiative.  +
PyMT is the “Python Modeling Toolkit”. It is an Open Source Python package, developed by the Community Surface Dynamics Modeling System, that provides tools used to couple models that expose the Basic Model Interface (BMI). PyMT is: * a toolbox for coupling models of disparate time and space scales, * a collection of Earth-surface models, and * an extensible plug-in framework for user-contributed models. In this hands-on clinic we will use Jupyter Notebooks to explore how to run standalone models within PyMT. Since all PyMT models are based on the BMI, they all share the same user interface, so if you know how to run one model, you know how to run all PyMT models. We will then look at some of the model-coupling tools packaged with PyMT and how they can be used for more complex couplings. Finally, we will run through examples that use these tools to couple models to data as well as to other PyMT models. We highly recommend that clinic attendees come with a laptop with the Anaconda Python distribution installed.  +
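Because every PyMT component exposes the same BMI calls, driving any of them follows one pattern: initialize, step forward, query values, finalize. A minimal pure-Python sketch of that pattern, using a hypothetical toy component rather than the real pymt API:

```python
class ToyDecayModel:
    """A toy BMI-style component: dT/dt = -rate * T, explicit Euler."""

    def initialize(self, value=100.0, rate=0.5, dt=0.1):
        self.T, self.rate, self.dt, self.time = value, rate, dt, 0.0

    def update(self):
        self.T += -self.rate * self.T * self.dt
        self.time += self.dt

    def get_value(self, name):
        assert name == "temperature"
        return self.T

    def finalize(self):
        pass

# The same initialize / update / get_value / finalize loop drives any component
model = ToyDecayModel()
model.initialize()
for _ in range(10):
    model.update()
print(round(model.get_value("temperature"), 2))  # 59.87
model.finalize()
```

Swapping in a different component changes only the class, not the driving loop — which is the point of the shared BMI interface described above.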
Quantitative analysis is often indispensable for making sound policy choices. But when decisionmakers confront today’s conditions of fast-paced, transformative, and even surprising change, they sometimes find that commonly used quantitative methods and tools prove counterproductive or lead them astray. Typically, quantitative analysis provides decisionmakers with information about the future by making predictions. But predictions are often wrong, and relying on them can be dangerous. Moreover, decisionmakers know that predictions are often wrong; this can cause them to discount or ignore the crucial information that quantitative analysis can provide. Fortunately, the combination of new information technology and new insights from the decision sciences now enables innovative ways to support decisions with quantitative analysis. This talk describes how one such approach—Robust Decision Making (RDM)—informs good decisions without requiring confidence in and agreement on predictions and offers examples of its increasing impact in a wide range of policy areas.  +
R has been widely used by ecologists. It is a powerful language to build statistical models. However, R applications in landscape ecology are relatively limited. In this model clinic, we will introduce R programming and two recently developed packages, “NLMR” and “landscapetools”, for generating and visualizing neutral landscapes. Neutral models are useful tools for testing the effect of different spatial processes on observed patterns, as they create landscape patterns in the absence of specific processes. Comparisons between a landscape model and a neutral model simulation will provide insights into how these specific processes affect landscape patterns. Different algorithms exist to generate neutral landscapes and they have traditionally been included in different programs. Now the NLMR package in R integrates all these different algorithms in one place. In addition to providing instructions on how to use R and the “NLMR” and “landscapetools” packages, we will showcase real-world examples of neutral landscapes’ applications in ecology, such as predicting coastal wetland change in response to sea-level rise.  +
Recent additions to Python have made it an increasingly popular language for data analysis. In particular, the pandas library provides an R-like data frame in Python, which is a data structure that resembles a spreadsheet. This provides an efficient way to load, slice, reshape, query, summarize, and visualize your data. Combining this with numpy, matplotlib, and scikit-learn creates a powerful set of tools for data analysis. In this hands-on tutorial, we will cover the basics of numpy, matplotlib, pandas, and introduce scikit-learn.  +
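A taste of the workflow covered in the tutorial — loading a small table into a pandas data frame, then slicing and summarizing it (the column names and numbers are illustrative):

```python
import pandas as pd

# A small, spreadsheet-like table of river discharge measurements
df = pd.DataFrame({
    "site":      ["A", "A", "B", "B"],
    "discharge": [10.0, 14.0, 3.0, 5.0],
})

high = df[df["discharge"] > 4.0]                # query / slice rows
means = df.groupby("site")["discharge"].mean()  # summarize by group
print(means["A"], means["B"])  # 12.0 4.0
```

The same few verbs (select, group, aggregate) carry over directly to much larger datasets loaded from CSV or NetCDF-derived tables.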
Recent technological advances in data collection techniques have yielded opportunities to better quantify stratigraphic stacking patterns, flow processes and sedimentation from outcrops of ancient sediment transport systems. These advancements created opportunities for field geologists to reduce uncertainty in the interpretation of the stratigraphic record and have likewise created data sets against which the efficacy of numerical models and physical experiments can be evaluated. The goals of this presentation are to (1) review some combined outcrop-model based studies, (2) discuss how these integrated studies test model and field-based uncertainty, and (3) share a vision for how field geologists and modelers can learn from each other’s perspectives.<br/><br/>Five examples of studies that bridged the gap between outcrop stratigraphy and experimental and/or numerical models include: (1) documentation of how mineralogy varies spatially in submarine fans, (2) relating flow processes to sedimentation in sinuous submarine channels, (3) evaluating compensational stacking in deltas and submarine fans, (4) relating stratigraphic architecture of deltas to inherited water depth and seafloor gradient, and (5) testing how shelf-edge deltas pipe coarse-grained sediment to submarine fans. These and similarly focused studies are important because they used common workflows and quantitative methods to evaluate similarities and differences between modeled and natural systems, resulting in a more complete view of the processes and products being studied. Whereas common workflows can provide a means to test the efficacy of physical and numerical modeling, it is critical to consider how modeling sheds insight into how one interprets the stratigraphic record from outcrop and subsurface data sets.  +
Recent theoretical work suggests that autogenic processes in sediment transport systems have the capacity to shred signals of environmental and tectonic perturbations prior to transfer to the stratigraphic record. We view this theory as a major conceptual and quantitative breakthrough in long time scale Earth-surface processes and stratigraphy, but the general theory still needs to be adapted to deal with specific types of signals. Many argue that the tug of Relative Sea Level (RSL) change represents the most important boundary condition forcing affecting continental margin transport systems. However, we still lack quantitative theory to explain what properties RSL cycles must have to be stored in stratigraphy, thus limiting the usefulness of stratigraphy for defining paleo-environments. Results from our previously conducted laboratory experiments suggest that RSL cycles with amplitudes less than a channel depth and of periodicities less than the amount of time necessary to deposit, on average, one channel depth of stratigraphy over a delta-top are susceptible to signal shredding. Our hypothesis is supported using existing data sets and new numerical and physical experiments in which the surface process response and preserved record of RSL cycles of varying magnitudes and periodicities is constrained. Quantitative theory and predictions produced from this work is benchmarked against stratigraphy from the Late Miocene to Quaternary stratigraphy of the Mississippi Delta. During this time interval a significant change in the magnitude and periodicity of RSL cycles occurred. RSL cycles in the Late Miocene for the Mississippi Delta are predicted to be shredded, while more recent cycles are predicted to be preserved.  +
Repeated continental glaciation of the US Central Lowlands disrupted pre-Pleistocene fluvial drainage networks by filling valleys, rerouting major rivers, and incising oversize meltwater channels. Post-glacial landscapes are characterized by large fractions of non-contributing area (NCA), which does not contribute flow to external drainage networks under steepest-descent routing algorithms. Analysis of land surfaces most recently glaciated between 130,000 and 10,000 years ago suggests that NCA is lost over time as fluvial networks are reestablished. Low surface slopes combined with significant fractions of NCA make such fluvial network growth difficult to reconcile with standard treatments of flow routing. We develop modules in Landlab that allow for connection of NCA via filling and spilling from closed depressions on the surface and through groundwater flow across subtle surface water divides to explore the impacts of these mechanisms of flow accumulation on the pace of evolution and morphology of resulting river networks. This work highlights the more general need to consider the relationship, or lack of relationship, between topography and river discharge.  +
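Landlab supplies full components for depression handling and flow accumulation; the fill-and-spill idea itself can be sketched with a priority-flood pass over a toy grid (pure Python, assuming 4-neighbor connectivity — an illustration, not the authors' modules):

```python
import heapq

def priority_flood(z):
    """Raise cells in closed depressions to their spill elevation by
    flooding inward from the grid boundary, lowest cells first."""
    nr, nc = len(z), len(z[0])
    filled = [row[:] for row in z]
    seen = [[False] * nc for _ in range(nr)]
    heap = []
    for i in range(nr):
        for j in range(nc):
            if i in (0, nr - 1) or j in (0, nc - 1):  # boundary seeds
                seen[i][j] = True
                heapq.heappush(heap, (filled[i][j], i, j))
    while heap:
        e, i, j = heapq.heappop(heap)
        for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
            if 0 <= ni < nr and 0 <= nj < nc and not seen[ni][nj]:
                seen[ni][nj] = True
                filled[ni][nj] = max(filled[ni][nj], e)  # fill to spill level
                heapq.heappush(heap, (filled[ni][nj], ni, nj))
    return filled

# A pit (elevation 1) rimmed by 4s, with a spillway at 3 and an outlet at 2
z = [[5, 5, 2, 5, 5],
     [5, 4, 3, 4, 5],
     [5, 4, 1, 4, 5],
     [5, 4, 4, 4, 5],
     [5, 5, 5, 5, 5]]
filled = priority_flood(z)
print(filled[2][2])  # 3: the pit fills to its spill elevation and now drains
```

Once depressions are filled to their spill elevations, steepest-descent routing connects the former NCA to the external network — the surface-water half of the mechanism described above.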
A report to the community on what CSDMS has accomplished and what can be expected with CSDMS 3.0  +
Research communities and peer-review journals are increasingly requiring authors to make available the code and data behind computational results reported in published studies. The Whole Tale platform is an open-access and open source system designed to enable researchers to package and archive their code, data, computational workflow, and information about the computational environment to better enable others to assess and repeat their results. During this webinar, we will introduce participants to the concepts of computational reproducibility and transparency and demonstrate core features of the platform.  +
Research in Earth-surface processes and subsurface stratal development is in a data-rich era with rapid expansion of facilities that produce tremendous volumes of digital data with time and space resolution far beyond what we can collect in the field. Despite these advances, sediment experimentalists are an example community in the “long tail”, meaning that their data are often collected in one-of-a-kind experimental set-ups and isolated from other experiments. Experimentalists also have a lot of “dark data” that are difficult or impossible to access through the Internet. The Sediment Experimentalist Network (SEN) was formed to address these challenges. Over the last three years, SEN launched a Knowledge Base website, held international workshops, and provided educational short courses. Through workshops and short courses, SEN has identified and shared experimental data best practices, developed metadata standards for data collection, and fostered data management and sharing efforts within the experimentalist community. '''Now is the time to extend this collaboration toward Earth-surface modelers to advance geoscience research and education.''' We identified three grand challenges for SEN: (1) how best to relate experiments to natural systems and theory, (2) how to ensure comparability of experimental results from disparate facilities, and (3) how to distinguish external versus intrinsic processes observed in experiments. Experimentalist-modeler collaborations are essential for achieving solutions to all of these grand challenges. Theoretical and numerical modeling based on first principles can help to extrapolate insight from experiments to field scales, to compare results from different lab facilities, and to decouple autogenic processes and allogenic forcings in geomorphology and stratigraphy. The experimentalist-modeler collaborative effort will result in tremendous opportunities for overcoming grand challenges in our communities.  +
Researchers and decision makers are increasingly interested in understanding the many ways in which human and Earth systems interact with one another, at scales from local (e.g., a city) to regional to global. For example, how might changes in population, income, or technology development alter crop production, energy demand, or water withdrawals? How do changes in one region's demand for energy affect energy, water, and land in other regions? This session will focus on two models – GCAM and Demeter – that provide capabilities to address these types of questions. GCAM is an open-source, global, market equilibrium model that represents the linkages between energy, water, land, climate, and economic systems. A strength of GCAM is that it can be used to quickly explore, and quantify the uncertainty in, a large number of alternate future scenarios while accounting for multi-sector, human-Earth system dynamics. One of GCAM’s many outputs is projected land cover/use by subregion. Subregional projections provide context and can be used to understand regional land dynamics; however, Earth System Models (ESMs) generally require gridded representations of land at finer scales. Demeter, a land use and land cover disaggregation model, was created to provide this service. Demeter directly ingests land projections from GCAM and creates gridded products that match the desired resolution and land class requirements of the user. This clinic will introduce both GCAM and Demeter at a high level. We will also provide a hands-on walk-through for a reference case so attendees can become familiar with configuring and running these two models. Our goal will be for attendees to leave the clinic with an understanding of 1) the value of capturing a global perspective when informing subregional and local analysis, 2) possibilities to conduct scenario exploration experiments that capture multi-sector/scale dynamics, and 3) a hands-on experience with GCAM and Demeter. 
Researchers and decision makers are increasingly interested in understanding the many ways in which human and Earth systems interact with one another, at scales from local (e.g., a city) to regional to global. For example, how might changes in population, income, or technology cost alter crop production, energy demand, or water withdrawals? How do changes in one region's demand for energy affect energy, water, and land in other regions? This session will focus on two models – GCAM and Demeter – that provide the capability to address these types of questions.<br><br>GCAM is an open-source, global, market equilibrium model that represents the linkages between energy, water, land, climate, and economic systems (Calvin et al. 2019). A strength of GCAM is that it runs fast and can be used to explore, and quantify the uncertainty in, a large number of alternate future scenarios while accounting for multisector, human-Earth system dynamics. One of GCAM’s many outputs is projected land cover/use by subregion. Subregional projections provide context and can be used to understand regional land dynamics; however, Earth System Models (ESMs) generally require gridded representations of land at finer scales. Demeter, a land use and land cover disaggregation model, was created to provide this service (Vernon et al. 2018). Demeter directly ingests land projections from GCAM and creates gridded products that match the desired resolution and land class requirements of the user.<br><br>This clinic will introduce both GCAM and Demeter at a high-level. We will also provide a hands-on walk through for a reference case so attendees can become familiar with setting-up and running these two models. 
Our goal will be for attendees to leave the clinic with an understanding of 1) the value of capturing a global perspective when informing subregional and local analysis, 2) possibilities to conduct scenario exploration experiments that capture multisector/scale dynamics, 3) a hands-on experience with GCAM and Demeter, and 4) the key model assumptions that drive results and the simulated model outputs available.  
River deltas will likely experience significant land loss because of relative sea-level rise (RSLR), but predictions have remained elusive. Here, we use global data of RSLR and river sediment supply to build a validated model of delta response to RSLR for all ~10,000 deltas globally. Applying this model to predict future delta change, we find that all IPCC RCP sea-level scenarios lead to a net delta loss by the end of the 21st century, ranging from -52 ± 36 (1 s.d.) km2 yr-1 for RCP2.6 to -808 ± 80 km2 yr-1 for RCP8.5. We find that river dams, subsidence, and sea-level rise have had a comparable influence on reduced delta growth over the past decades, but that by 2100 under RCP8.5 more than 80% of delta land loss will be caused by climate-change driven sea-level rise.  +
SNAC (StGermaiN Analysis of Continua) is a 3D parallel explicit finite element code for modeling long-term deformation of the lithosphere. It is open source and distributed through the Computational Infrastructure for Geodynamics (http://geodynamics.org/cig/software/snac/) as well as through the CSDMS website (https://csdms.colorado.edu/wiki/Model:SNAC).<br/><br/>This clinic will provide an overview of SNAC and lead participants through a typical work procedure for producing a 3D lithospheric deformation model on a high-performance cluster. Specifically, participants will take the following steps: 0) acquiring an account on the CSDMS HPC (to be done before the clinic); 1) checking out the source code through a version control system; 2) building SNAC on the cluster; 3) getting familiar with SNAC by running a cookbook example in parallel and visualizing outputs; 4) modifying the source code to customize a model.  +
Salt marshes are biogeomorphic features that are under increasing pressure from sea level rise, land use change, and other external stressors. Modeling of salt marshes has traditionally been “stovepiped” into three general disciplines: ecology, geomorphology, and engineering, resulting in contrasting approaches and varying degrees of rigor. I will highlight successes and failures across these efforts, and identify how the three disciplines can move forward using advances from each other.  +
Scientific communities and peer-review journals are increasingly requiring authors to make available the code and data behind computational results reported in published research. This tutorial will introduce participants to the NSF-funded Whole Tale platform, an open-access and open-source system designed to enable authors to package and archive their code, data, computational workflow and information about the computational environment to better enable others to repeat their results. We will walk through the basic features of the platform with hands-on exercises.  +
Seagrass provides a wide range of economically and ecologically valuable ecosystem services, with shoreline erosion control often listed as a key service. But seagrass can also alter the sediment dynamics and waves of back-barrier bays by reducing wave height and attenuating wave and current shear stresses acting on the sediment bed. This suggests that seagrass can play an important role in the evolution of the entire shallow coastal bay, back-barrier marsh, and barrier-island system, yet no study has previously examined these subsystems coupled together. Here we incorporate seagrass dynamics of the back-barrier bay into the existing coupled barrier-marsh model GEOMBEST+. In our new integrated model, bay depth and distance from the marsh edge determine the location of suitable seagrass habitat, and the presence or absence, size, and shoot density of seagrass meadows alters the bathymetry of the bay and wave power reaching the marsh edge. We use this model to run 3 sets of experiments to examine the coupled interactions of the back-barrier bay with both adjacent (marsh) and non-adjacent (barrier) subsystems. While seagrass reduces marsh edge erosion rates and increases progradation rates in many of our model simulations, seagrass surprisingly increases marsh edge erosion rates when sediment export from the back-barrier basin is negligible. Adding seagrass to the bay subsystem leads to increased deposition in the bay, reduced sediment available to the marsh, and enhanced marsh edge erosion until the bay reaches a new, shallower equilibrium depth. In contrast, removing seagrass liberates previously-sequestered sediment that is then delivered to the marsh, leading to enhanced marsh progradation. Lastly, we find that seagrass reduces barrier island migration rates in the absence of back-barrier marsh by filling accommodation space in the bay. 
These model observations suggest that seagrass meadows operate as dynamic sources and sinks of sediment that can influence the evolution of coupled marsh and barrier island landforms in unanticipated ways.  
Seasonal seagrass growth and senescence exert a strong influence on shallow coastal environments. We applied a Delft3D hydrodynamic and sediment transport model that included coupled effects of seagrass on flow, waves, and sediment resuspension in a shallow coastal bay to quantify seasonal seagrass effects on bay dynamics. Simulation results show that seagrass meadows significantly attenuated flow (60%) and waves (20%) and reduced suspended sediment concentration (85%) during the growing season. Although low densities of seagrass in winter had limited effects on flow and wave attenuation, small changes in winter seagrass density could alter the annual sediment budget of these seagrass ecosystems.  +
Sediment diversions costing billions of dollars are planned on deltas globally to mitigate land loss due to rising sea levels and subsidence. Downstream of engineered levee breaks, land building will rely on natural delta processes to disperse sediment. But external factors known to affect natural delta processes vary between possible diversion sites (e.g., wave energy, basin substrate, marsh activity), making it difficult to quantitatively compare land-building potential between sites and optimally allocate engineering resources. We have implemented the pyDeltaRCM numerical model to provide an easily extensible platform for simulating delta evolution under arbitrary environmental factors. With the computationally efficient model, we isolate (and combine) these factors to observe effects on land building, and build a framework to quickly assess land-building potential at different sites. In this presentation, I will describe pyDeltaRCM model design, and show ongoing studies to assess land-building potential of diversions under different forcings. Model computational efficiency enables uncertainty quantification that will benefit diversion planning and resource allocation, by identifying the relative impact of different external factors.  +
Sediment production and transfer processes shape river basins and networks and are driven by variability in precipitation, runoff and temperature. Changes in these hydrological and geomorphological processes are especially difficult to predict in temperature-sensitive environments such as the European Alps. We used a model chain to quantify possible impacts of climate change on sediment transfer and hazard in a debris flow-prone catchment in the Swiss Alps (Illgraben). We combined a stochastic weather generator [1] with downscaled and bias-corrected climate change projections [2] to generate climate simulations. These climate simulations then feed the hillslope-channel sediment cascade model, SedCas [3], which is calibrated against observed debris-flow magnitudes estimated from force plate measurements [4], to make predictions of sediment transfer and debris flow hazard in the Illgraben over the 21st century [5]. The results demonstrate the complex interplay between hydrology, sediment production and elevation in alpine catchment response to climate change. The hydrological potential to transport sediment and generate debris flows will increase, driven by increases in precipitation and air temperature. Indeed, if sediment supply to the channel by landslides were unlimited, this would result in an increase in future sediment yield of 48% by the end of the century. However, sediment transfer is also a function of sediment supply by landslides at the head of the catchment, driven by highly temperature-sensitive freeze-thaw processes [6]. At the elevation of the Illgraben (<2000 m), freeze-thaw processes and thus sediment supply will decrease in a warming climate, resulting in a decrease in sediment yield of 48% by the end of the century. This result and the competition between hydrological debris flow triggering potential and sediment supply is highly elevation dependent.
As we increase mean catchment elevation, sediment production increases due to decreased snow cover and increased exposure of bedrock to freeze-thaw weathering, with implications for the application of findings to other catchments. Although uncertainties in our results are large, we show that these can mostly be attributed to irreducible internal climate variability. Our findings have important implications for the assessment of natural hazards and risks in mountain environments.
References: [1] Fatichi et al., 2011: Simulation of future climate scenarios with a weather generator. [2] National Centre for Climate Services, 2018: CH2018 - Climate Scenarios for Switzerland. [3] Bennett et al., 2014: A probabilistic sediment cascade model of sediment transfer in the Illgraben. [4] McArdell et al., 2007: Field observations of basal forces and fluid pressure in a debris flow. [5] Hirshberg et al., 2021: Climate change impacts on sediment yield and debris flow activity. [6] Bennett et al., 2013: Patterns and controls of sediment production, transfer and yield in the Illgraben.  
Sediment transport in rivers is a key parameter in landscape evolution, fluvial sedimentation, and river engineering. In particular, information on the time-averaged virtual velocity and the channel/floodplain exchange rate of sediment is extremely useful for quantifying long-term sediment transport dynamics. These data are expensive and time-consuming to obtain. A potential solution is to use luminescence, a property of matter normally used for dating. I develop a model based on conservation of energy and sediment mass to explain the patterns of luminescence in river channel sediment. The parameters from the model can then be used to estimate the time-averaged virtual velocity, characteristic transport lengthscales, storage timescales, and floodplain exchange rates of fine sand-sized sediment in a fluvial system. I show that this model can accurately reproduce the luminescence observed in previously published field measurements. I test these predictions in three rivers where the sediment transport information is well known: the South River and Difficult Run in Virginia, and Linganore Creek in Maryland. Each of these rivers tests key predictions of the model, with the South River having favorable conditions, Difficult Run having large amounts of human influence, and Linganore Creek switching from alluvial to bedrock and vice versa along its course. In the South River, the model successfully reproduces the virtual velocity and exchange rates from previously published data. In Difficult Run, we find that the influx of sediment from human development obfuscates the model-predicted pattern as expected. In Linganore Creek, the shift from alluvial to bedrock and back produces a change in the luminescence consistent with the predictions made by the model. From these results, I conclude that when model assumptions are upheld, luminescence can provide a useful method to obtain sediment transport information. 
This finding, coupled with the advent of portable luminescence technology, opens the door for rapid and inexpensive collection of long-term sediment transport data.  
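One way to picture the kind of model described above: if channel sediment exchanges with floodplain material at a constant rate while moving downstream at a time-averaged virtual velocity, channel luminescence relaxes exponentially toward an equilibrium value over a characteristic transport length scale. The sketch below is illustrative only; the functional form and parameter names are assumptions for exposition, not the study's actual equations.

```python
import math

def channel_luminescence(x_km, lum_initial, lum_equilibrium, length_scale_km):
    """Downstream luminescence of channel sediment (illustrative form).

    Luminescence relaxes exponentially from its headwater value toward
    an equilibrium set by channel/floodplain exchange; the length scale
    reflects the ratio of virtual velocity to exchange rate.
    """
    return lum_equilibrium + (lum_initial - lum_equilibrium) * math.exp(
        -x_km / length_scale_km)

# After several length scales, channel sediment carries the
# floodplain-equilibrated signal rather than the headwater one.
print(round(channel_luminescence(50.0, 1.0, 10.0, 10.0), 2))  # 9.94
```

Fitting a downstream profile of measured luminescence to a curve of this shape is what lets a single field campaign constrain transport length scales and exchange rates that would otherwise require decades of tracer monitoring.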
Segmentation, or the classification of pixels (grid cells) in imagery, is ubiquitously applied in the natural sciences. An example close to the CSDMS community might be translating images of the earth surface into arrays of land cover to be used as model initial conditions, or to test model output. Manual segmentation is often prohibitively time-consuming, especially when images have significant spatial heterogeneity of colors or textures. This clinic is focused on demonstrating a machine learning method for image segmentation using two software tools. The first is “Doodler”, a fast, semi-automated method for interactive segmentation of N-dimensional (x,y,N) images into two-dimensional (x,y) label images. It uses human-in-the-loop ML to achieve consensus between the labeler and a model in an iterative workflow. Second, we will demonstrate Segmentation Zoo, a python toolbox that segments imagery with a variety of deep learning models, using output from Doodler with existing models, or training entirely new models. Ideally the clinic will be divided into two separate days. Day 1 would be a short introductory lecture, a live code demo, and then homework — participants will doodle imagery to gain familiarity with the software and create training data for a segmentation model. Day 2 would be a short introductory lecture on machine learning, and a live code demo for how to use doodled images in Segmentation Zoo (i.e., the images that participants doodled). There are two concrete goals for the clinic: 1) demonstrate how participants can use these two tools, and 2) produce a group-authored dataset of doodled images that will be placed in a Zenodo repository with all participants who contribute as coauthors. Doodler preprint: https://doi.org/10.31223/X59K83 Doodler repository: https://github.com/dbuscombe-usgs/dash_doodler Doodler website: https://dbuscombe-usgs.github.io/dash_doodler/ Segmentation Zoo repository: https://github.com/dbuscombe-usgs/segmentation_zoo  
Seismic observations document how substantial amounts of sediment may be transported from the onshore to the offshore during formation of extensional continental margins. Thick sedimentary packages are, for example, found on the margins of Norway, the eastern US coast, and the Gulf of Mexico. In contrast, the Goban Spur, Galicia Bank, and the Red Sea are examples of sediment-starved margins. Such variations in the amount of sediment not only impact the development of offshore sedimentary basins; the changes in mass balance caused by erosion and sedimentation can also interact with extensional tectonic processes. In convergent settings, such feedback relationships between erosion and tectonic deformation have long been highlighted: erosion reduces the elevation and width of mountain belts, and in turn tectonic activity and exhumation are focused at regions of enhanced erosion. But what is the role played by surface processes during formation of extensional continental margins? In this lecture, I will discuss geodynamic experiments that explore the response of continental rifts to erosion and sedimentation from initial rifting to continental break-up. These experiments show how the interaction of extensional tectonics and surface processes can fundamentally alter the width and topography of continent-ocean boundaries.  +
Seismo-acoustic techniques can provide continuous, real-time observations with high temporal resolution and broad spatial coverage for process monitoring, detection and characterization in accessible environments. These capabilities are rapidly advancing with the growing use of distributed acoustic sensing (DAS) systems, which use fiber optic cables to provide continuous records of ground motion comparable to large-N arrays of single-component accelerometers or geophones. Compared to traditional seismic arrays, DAS arrays can be tens of kilometers in length with spatial resolution of meters and sampling frequencies from millihertz to kilohertz. In this clinic, participants will learn about the basics of DAS instrumentation and deployment in an introductory lecture, and be introduced to hands-on DAS data input, analysis and visualization concepts through Jupyter notebooks. The clinic will also provide participants with resources for further exploring and utilizing DAS, including guides to open DAS datasets, and the growing resource lists and GitHub organization managed by the NSF-funded DAS Research Coordination Network (https://www.iris.edu/hq/initiatives/das_rcn).  +
Sequence is a modular 2D (i.e., profile) sequence stratigraphic model that is written in Python and implemented within the Landlab framework. Sequence represents time-averaged fluvial and marine sediment transport via differential equations. The modular code includes components to deal with sea level changes, sediment compaction, local or flexural isostasy, and tectonic subsidence and uplift. Development of the code was spurred by observations of repetitive stratigraphic sequences in western Turkey that are distorted by tectonics.  +
Sinuous channels commonly migrate laterally and interact with banks of different strengths—an interplay that links geomorphology and life, and shapes diverse landscapes from the seafloor to planetary surfaces. To investigate feedbacks between meandering rivers and landscapes over geomorphic timescales, numerical models typically represent bank properties using structured or unstructured grids. Grid-based models, however, implicitly include unintended thresholds for bank migration that can control simulated landscape evolution. I will present a vector-based approach to land surface- and subsurface-material tracking that overcomes the resolution dependence inherent in grid-based techniques by allowing high-fidelity representation of bank-material properties for curvilinear banks and low channel lateral migration rates. The vector-based technique can flexibly track evolving topography and stratigraphy in different environments, including aggrading floodplains and mixed bedrock-alluvial river valleys. Because of its geometric flexibility, the vector-based material tracking approach provides new opportunities for exploring the co-evolution of meandering rivers and surrounding landscapes over geologic timescales.  +
Six years ago, we set out to study how complex systems simulations could support collaborative water planning. We hypothesized that, by allowing participants to see the hidden effects of land- and water-use decisions on water flow, such tools could provide a platform for collective and innovative solution-building to complex environmental problems. We first adopted a developmental and collaborative agent-based approach, where groups of stakeholders learned how to inform and use models to assess the impacts of different implementation strategies. Despite their improved understanding and enhanced exploration of solutions, participants resisted policy innovation beyond familiar strategies. We refined our approach towards facilitated interaction with complex systems models and additional interfaces to help stakeholders provide direct input to the simulations, comprehend model outputs, and negotiate tradeoffs. Participants challenged outdated and false assumptions and identified novel solutions to their water woes. Nevertheless, at times the dissonance between simulation outputs and participants’ expectations was too great to accept and own. We share three stories of the obstacles encountered and offer suggestions to overcome them: keep models and interfaces simple, make both biophysical processes and values visible and tangible, and explicitly structure the social aspects of the simulation’s use. We draw on our experiences to show what aspects of visualization can support participatory planning.  +
Society is facing unprecedented environmental challenges that have pushed us into a world dominated by transients and variability. Informed decision making in this era, at scales from the individual to the globe, requires explicit predictions on management-relevant timescales, based on the best available information, and considering a wide range of uncertainties. As a research community, we are not yet meeting this need. In this talk I will introduce the Ecological Forecasting Initiative (EFI), an international grass-roots research consortium aimed at building a community of practice. I will discuss EFI’s cross-cutting efforts to tackle community-wide bottlenecks in cyberinfrastructure, community standards, methods and tools, education, diversity, knowledge transfer, decision support, and our theoretical understanding of predictability. I will highlight examples of near real-time iterative ecological forecasts across a wide range of terrestrial and aquatic systems, as well as work done by my own group developing PEcAn (a terrestrial ecosystem model-data informatics and forecasting system) and our recent efforts to generalize these approaches to other forecasts. Finally, I will also introduce EFI’s ecological forecasting competition, which relies on a wide range of continually-updated NEON (National Ecological Observatory Network) data.  +
Socio-environmental systems (SES) modeling integrates knowledge and perspectives into conceptual and computational tools that explicitly recognize how human decisions affect the environment. With the advent of new techniques, data sources, and computational power on the one hand, and the growing sustainability challenges on the other, the expectation is that SES modeling should be more widely used to inform decision-making at multiple scales. This presentation will highlight the grand challenges that need to be overcome to accelerate the development and adaptation of SES modeling. These challenges include: bridging epistemologies across disciplines; multi-dimensional uncertainty assessment and management; scales and scaling issues; combining qualitative and quantitative methods and data; furthering the adoption and impacts of SES modeling on policy; capturing structural changes; representing human dimensions in SES; and leveraging new data types and sources. The presentation will outline the steps required to surmount the underpinning barriers and priority research areas in SES modeling and propose clear directions for future generations of models and modeling, to both their developers and users.  +
Software sustainability - the ability for software to continue to function - and the FAIR principles (Findable, Accessible, Interoperable and Reusable) are important features of software used in research. But how do they apply to research into environmental extremes? In this presentation, I will summarise the work of the Software Sustainability Institute, including my work on the FAIR principles for research software, and what we understand about the challenges and benefits of applying software sustainability and FAIR to this area.  +
Soil science has developed as a critical discipline of the biosphere and continues to develop every day; yet state-of-the-art modeling is unable to adequately synthesize many soil processes in applied earth system models. If we agree that soil is a critical life-supporting compartment that supports ecosystem functions (e.g., habitat for biodiversity) and ecosystem services (e.g., water filtration, nutrient management), and that produces food, feed, fiber and energy for our societies, then our inability to integrate soil processes into the broader array of earth system models is an issue that needs solving. Integration is an achievable goal. Other research communities have collaborated intensively over the past decades—specifically the climate modeling community—but even many of their approaches overlook (or over-average) the detailed and advanced shared knowledge of the soil compartment. This represents a gap in how scientific knowledge is implemented. Over the recent decades, a new generation of soil models has been developed, based on a whole-systems approach comprising all physical, mechanical, chemical, and biological processes. These models are needed to fill critical knowledge gaps, contribute to the preservation of ecosystem function, improve our understanding of climate-change feedback processes, bridge basic soil science research and management, and facilitate the communication between science and society. The International Soil Modeling Consortium (ISMC) was formed in 2016 as a new community effort of soil modelers to improve how soil processes are communicated to other scientific communities, from earth dynamics to biogeosciences to global climate modeling. ISMC was formed around three themes: linking data and observations to models; creating the means for soil model intercomparison studies; and connecting our soil-related knowledge between science communities. 
Within less than 12 months of inception, ISMC has warehoused nearly 40 soil-related models, initiated data sets and platforms for modeling studies, and facilitated collaborations with several international groups, including CSDMS. In this discussion, we will describe the motivation and genesis of ISMC, present current status of our research, and seek to create new research partnerships.  
Soils mediate how land use and land cover (LULC) changes affect the global water, energy, and biogeochemical cycles. However, Earth System Models often assume that soil properties stay constant over time, which leaves uncertainties in assessing LULC impacts. This study quantifies the impacts of agriculture, pasture, grazing, vegetation harvest, and secondary vegetation cover on soil organic carbon (SOC), texture, and bulk density through meta-analyses. We showed how LULCs link to different soils and constructed a model to estimate how LULCs change soil properties and how climate and soil conditions alter those impacts. The results provide better land surface characteristics to improve Earth system modeling.  +
Spring School Student Presentations  +
State of CSDMS  +