Property:CSDMS meeting abstract presentation
This is a property of type Text.
Cloud computing is a powerful tool for both analyzing large datasets and running models. This clinic will provide an introduction to approaches for accessing and using cloud resources for research in the geosciences. During the hands-on portion of this clinic, participants will learn how to use Amazon Web Services (AWS) to open a terminal, analyze model output in Python, and, time permitting, run a model. This workshop assumes no experience with cloud computing.
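For a flavor of the kind of workflow such a clinic covers, here is a minimal sketch of pulling model output from AWS object storage and summarizing it locally. The bucket and key names are hypothetical placeholders, not part of the clinic materials.

```python
import boto3
import numpy as np

# Download a (hypothetical) file of model output from S3.
s3 = boto3.client("s3")
s3.download_file("my-model-bucket", "runs/run01/output.npy", "output.npy")

# Load and summarize the output locally.
output = np.load("output.npy")
print(output.shape, output.mean())
```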
Coastal Risk is a flood and natural hazard risk assessment technology company. Our mission is to help individuals, businesses, and governments in the US and around the world achieve resilience and sustainability.

In the past year, Coastal Risk's technology supported nearly $2 billion in US commercial real estate investment and development. Coastal Risk's unique business model combines high-tech flood, climate, and natural-hazard risk assessments and high-value risk communication reports with personalized, resilience-accelerating advice for individuals, corporations, and governments. Our risk modeling and reports help save lives and property in the US. In order to take our system around the world, however, we need higher-resolution DEMs. The 30 m resolution currently available is a big obstacle to going international. This is something that we would like to get from NASA. We are also interested in high-resolution, “before-and-after” satellite imagery of flooded areas to compare with our modeling and to help individuals, businesses, and governments understand how to better defend against floods.
Coastal communities facing shoreline erosion preserve their beaches both for recreation and for property protection. One approach is nourishment, the placement of externally sourced sand to increase the beach's width, forming an ephemeral protrusion that requires periodic re-nourishment. Nourishments add value to beachfront properties, thereby affecting re-nourishment choices for an individual community. However, the shoreline represents an alongshore-connected system, such that morphodynamics in one community are influenced by actions in neighboring communities. Prior research suggests that coordinated nourishment decisions between neighbors are economically optimal, yet many real-world communities have failed to coordinate, and the geomorphic consequences of this failure are unknown. Toward understanding this geomorphic-economic relationship, we develop a coupled model representing two neighboring communities and an adjacent non-managed shoreline. Within this framework, we examine scenarios where communities coordinate nourishment choices to maximize their joint net benefit versus scenarios where decision-making is uncoordinated such that communities aim to maximize their independent net benefits. We examine how community-scale property values affect choices produced by each management scheme and the economic importance of coordinating. The geo-economic model produces four behaviors based on nourishment frequency: seaward growth, hold the line, slow retreat, and full retreat. Under current conditions, coordination is strongly beneficial for wealth-asymmetric systems, where less wealthy communities acting alone risk nourishing more than necessary relative to their optimal frequency under coordination. For a future scenario, with increased material costs and background erosion due to sea-level rise, less wealthy communities might be unable to afford nourishing their beach independently and thus lose their beachfront properties.
Coastal environments are complex because of the interplay between aeolian and nearshore processes. Waves, currents, tides, and winds drive significant short-term (< weekly) changes to coastal landforms which augment longer-term (> annual) geomorphic trends. Great strides have been made in recent years regarding our ability to model coastal geomorphic change in this range of societally relevant time scales. However, a great disparity exists in modeling coastal evolution because subaqueous and subaerial processes are typically assessed completely independently of one another. By neglecting the co-evolution of subtidal and supratidal regions within our current framework, we are precluded from fully capturing the non-linear dynamics of these complex systems. This has implications for predicting coastal change during both fair-weather and storm conditions, hindering our ability to answer important scientific questions related to coastal vulnerability and beach building.

Recognizing these historic limitations, here we present the outline for a coupled subaqueous (XBeach) and subaerial (Coastal Dune Model) morphodynamic modeling system that is in active development with the goal of exploring coastal co-evolution on daily to decadal timescales. Furthermore, we present recently collected datasets of beach and dune morphology in the Pacific Northwest US that will be used to validate trends observed within the coupled model platform.
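The abstract does not specify the coupling mechanics, but the general pattern for such a system is an alternating time loop that exchanges the shared bed elevation between the two models. The sketch below illustrates that pattern; the wrapper classes, config files, and variable names are hypothetical placeholders, not the actual model interfaces.

```python
# Alternate subaqueous (XBeach) and subaerial (Coastal Dune Model) updates,
# handing the evolving bed elevation back and forth each exchange interval.
from wrappers import XBeachBMI, CoastalDuneModelBMI  # hypothetical imports

waves = XBeachBMI()
dunes = CoastalDuneModelBMI()
waves.initialize("xbeach.cfg")
dunes.initialize("cdm.cfg")

t, t_end, dt = 0.0, 86400.0, 3600.0  # one day in hourly exchanges
while t < t_end:
    waves.update_until(t + dt)                  # nearshore morphodynamics
    zb = waves.get_value("bed_elevation")       # updated bathymetry and beach
    dunes.set_value("bed_elevation", zb)        # hand off to the aeolian model
    dunes.update_until(t + dt)                  # dune-building by wind
    waves.set_value("bed_elevation",
                    dunes.get_value("bed_elevation"))
    t += dt

waves.finalize()
dunes.finalize()
```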
Coastal flooding and related hazards are increasingly among the most impactful natural events, as climate change continues to alter the risk they pose. Measuring the change in the risk of a particular flood level has therefore taken on greater urgency, as historical measurements and statistics are no longer sufficient to gauge the risk to coastal communities. Improving our ability to compute these changes has become a priority as adaptation strategies for a changing climate become increasingly critical. This talk will outline some of these challenges and ways we are attempting to address the problem in a multi-hazard-aware way.
Coastal morphological evolution is caused by a wide range of coupled cross-shore and alongshore sediment transport processes associated with short waves, infragravity waves, and wave-induced currents. However, the fundamental transport mechanisms occur within the thin bottom boundary layer and are dictated by turbulence-sediment interaction and inter-granular interactions. In the past decade, significant progress has been made in modeling sediment transport using Eulerian-Eulerian or Eulerian-Lagrangian two-phase flow approaches. However, most of these models are limited to a one-dimensional-vertical (1DV) formulation, which is only applicable to Reynolds-averaged sheet flow conditions. Consequently, complex processes such as instabilities of the transport layer and bedform dynamics cannot be simulated, and turbulence-resolving simulations are not possible. The main objective of my research was to develop a multi-dimensional, four-way-coupled, two-phase model for sediment transport that can be used for Reynolds-averaged modeling in large-scale applications or for turbulence-resolving simulations at small scales.
Coastal systems are an environmental sink for a wide range of materials of scientific interest, including sediments, nutrients, plastics, oils, seeds, and wood, to name only a few. Due to differences in material properties such as buoyancy, each of these materials is liable to have characteristic transport pathways which differ from the mean flow and from each other, hydraulically “sorting” these materials in space. However, it remains difficult to quantify these differences in transport, due in part to the use of disparate models and approaches for each respective material. In this talk, I will advance a novel modeling framework for simulating the patterns of transport for a wide range of fluvially transported materials using a single unified reduced-complexity approach, allowing us to compare and quantify differences in transport between materials. Using a hydrodynamic model coupled with the stochastic Lagrangian particle-routing model “dorado,” we are able to simulate at the process-level how local differences in material buoyancy lead to emergent changes in partitioning and nourishment in river deltaic systems. I will show some of the insights we have learned regarding the tendency for materials to be autogenically sorted in space, as well as progress we have made bridging between the process-level framework used in dorado and more physics-based approaches based on transport theory.
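To make the reduced-complexity idea concrete, here is a minimal NumPy sketch of a flow-weighted random walk with a buoyancy knob, in the spirit of the approach described above. This is a schematic illustration, not dorado's actual API; the flow field, weighting rule, and `buoyancy` parameter are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic flow field: qx, qy are unit-discharge components on a grid.
ny, nx = 50, 50
qx = np.ones((ny, nx))          # uniform flow toward +x
qy = 0.1 * np.ones((ny, nx))    # weak flow toward +y

def step_weights(i, j, buoyancy):
    """Probabilities of moving east/west/north/south, biased by local flow.
    `buoyancy` (0 = bed material, 1 = fully buoyant) is an illustrative knob
    controlling how strongly the walker follows the surface flow."""
    w = np.array([
        max(qx[i, j], 0.0),    # east
        max(-qx[i, j], 0.0),   # west
        max(qy[i, j], 0.0),    # north
        max(-qy[i, j], 0.0),   # south
    ])
    w = w ** (0.5 + buoyancy)  # buoyant material tracks the flow more closely
    w += 1e-6                  # keep a small chance of any move
    return w / w.sum()

moves = [(0, 1), (0, -1), (1, 0), (-1, 0)]
i, j = 25, 0
path = [(i, j)]
for _ in range(100):
    k = rng.choice(4, p=step_weights(i, j, buoyancy=0.8))
    di, dj = moves[k]
    i = int(np.clip(i + di, 0, ny - 1))
    j = int(np.clip(j + dj, 0, nx - 1))
    path.append((i, j))
```

Running ensembles of such walkers with different `buoyancy` values is what lets one quantify how materials partition differently across a delta.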
Computer models help us explore the consequences of scientific hypotheses at a level of precision and quantification that is impossible for our unaided minds. The process of writing and debugging the necessary code is often time-consuming, however, and this cost can inhibit progress. The code-development barrier can be especially problematic when a field is rapidly unearthing new data and new ideas, as is presently the case in surface dynamics.

To help meet the need for rapid, flexible model development, we have written a prototype software framework for two-dimensional numerical modeling of planetary surface processes. The Landlab software can be used to develop new models from scratch, to create models from existing components, or a combination of the two. Landlab provides a gridding module that allows you to create and configure a model grid in just a few lines of code. Grids can be regular or unstructured, and can readily be used to implement staggered-grid numerical solutions to equations for various types of geophysical flow. The gridding module provides built-in functions for common numerical operations, such as calculating gradients and integrating fluxes around the perimeter of cells. Landlab is written in Python, a high-level language that enables rapid code development and takes advantage of a wealth of libraries for scientific computing and graphical output. Landlab also provides a framework for assembling new models from combinations of pre-built components.

In this clinic we introduce Landlab and its capabilities. We emphasize in particular its flexibility and the speed with which new models can be developed under its framework. We will introduce the many tools available within Landlab that make development of new functionality and new descriptions of physical processes both easy and fast. Participants will finish the clinic with all the knowledge necessary to build, run and visualize 2D models of various types of earth surface systems using Landlab.
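As a small taste of the "few lines of code" claim, the sketch below builds a raster grid, attaches an elevation field, and computes gradients on the staggered grid, all using Landlab's public API (field and parameter values here are arbitrary examples).

```python
from landlab import RasterModelGrid

# Create a 4 x 5 node grid with 10 m spacing.
grid = RasterModelGrid((4, 5), xy_spacing=10.0)

# Attach an elevation field to the grid nodes.
z = grid.add_zeros("topographic__elevation", at="node")
z[:] = grid.x_of_node * 0.01       # gentle slope in x

# Built-in staggered-grid operation: gradients live on links.
grad = grid.calc_grad_at_link(z)
print(grad.shape)                  # one gradient value per link
```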
Continental and global water models have long been trapped in slow growth and inadequate predictive power, as they are not able to effectively assimilate information from big data. While Artificial Intelligence (AI) models greatly improve performance, purely data-driven approaches do not provide strong enough interpretability and generalization. One promising avenue is “differentiable” modeling that seamlessly connects neural networks with physical modules and trains them together to deliver real-world benefits in operational systems. Differentiable modeling (DM) can efficiently learn from big data to reach state-of-the-art accuracy while preserving interpretability and physical constraints, promising superior generalization ability, predictions of untrained intermediate variables, and the potential for knowledge discovery. Here we demonstrate the practical relevance of a high-resolution, multiscale water model for operational continental-scale and global-scale water resources assessment (https://bit.ly/3NnqDNB). Not only does it achieve significant improvements in streamflow simulation compared to established national and global water models, but it also produces much more reliable depictions of interannual changes in large river streamflow, freshwater inputs to estuaries, and groundwater recharge. As a related topic, we also showcase the value of foundation AI for global environmental change and its benefits for resource management.
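The core differentiable-modeling idea (a neural network infers physical parameters, a differentiable physics module uses them, and gradients flow through both) can be illustrated with a toy PyTorch sketch. This is a schematic under our own assumptions, not the authors' model: the "physics" is a one-parameter linear reservoir and all data are synthetic.

```python
import torch
import torch.nn as nn

class ParamNet(nn.Module):
    """Neural network mapping catchment attributes to a physical parameter."""
    def __init__(self, n_attrs):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_attrs, 16), nn.ReLU(), nn.Linear(16, 1))

    def forward(self, attrs):
        return torch.sigmoid(self.net(attrs))  # recession coefficient in (0, 1)

def bucket_model(precip, k):
    """Differentiable linear-reservoir runoff: S' = P - k*S, Q = k*S."""
    storage = torch.zeros(precip.shape[0])
    flows = []
    for t in range(precip.shape[1]):
        q = k.squeeze(-1) * storage
        storage = storage + precip[:, t] - q
        flows.append(q)
    return torch.stack(flows, dim=1)

n_basins, n_steps, n_attrs = 8, 30, 5
attrs = torch.rand(n_basins, n_attrs)     # synthetic catchment attributes
precip = torch.rand(n_basins, n_steps)    # synthetic forcing
q_obs = torch.rand(n_basins, n_steps)     # synthetic "observed" streamflow

model = ParamNet(n_attrs)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(100):
    k = model(attrs)                      # learned physical parameter
    q_sim = bucket_model(precip, k)       # physics module, fully differentiable
    loss = ((q_sim - q_obs) ** 2).mean()  # gradients flow through the physics
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because the physical module stays explicit, intermediate states like `storage` remain interpretable, which is the property the abstract highlights.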
Convolutional Neural Networks have driven a revolution in computer vision and "AI" due to their ability to recognize complex spatial patterns. They are also finding more and more use in the geosciences. In this webinar we will go through what a CNN is, how to implement one using the PyTorch library, and some of the ways that we can interpret them to help our science.
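A minimal PyTorch CNN of the kind such a webinar walks through might look like the following: two convolutional layers and a classifier head, applied here to a synthetic batch of single-band 32 x 32 images (the architecture and sizes are illustrative choices, not webinar material).

```python
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 8 * 8, n_classes)

    def forward(self, x):
        x = self.features(x)            # (N, 32, 8, 8) feature maps
        return self.head(x.flatten(1))  # class scores

model = SmallCNN()
images = torch.randn(8, 1, 32, 32)      # synthetic image batch
print(model(images).shape)              # torch.Size([8, 4])
```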
Current software engineering and data management practices amongst different research teams impede collaboration in geomorphology. For example, researchers who create software tools often do not document them, so the tools do not port easily to new systems. Often, tools go unmaintained after publication, so other teams that want to use the tool or conduct the same analysis will rewrite the software rather than reuse the existing code. This clinic will demonstrate several advances of the recently launched sandpiper toolchain initiative that facilitate data reuse and research team collaboration, and reduce research effort duplication.
sandpiper is forging a data standard for regularly gridded, three-dimensional data (i.e., time and two spatial dimensions), and building a software package for data analysis in Earth surface processes research. In this clinic, we will show the features of the data standard, how to create datasets that are compliant with the standard, and how existing datasets can be “rescued” and made findable and reusable. We will also demonstrate the analysis software package and how it is being used for research. Importantly, sandpiper is a growing community of users, and we want you to join. Bring your data problems, and help us build solutions that work for the whole community.
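For illustration, a regularly gridded (time, y, x) data cube of the shape the standard targets can be built and saved as a self-describing file with xarray. The variable name, coordinates, and attributes below are our own illustrative assumptions, not the sandpiper standard itself.

```python
import numpy as np
import xarray as xr

# Build a (time, y, x) cube with explicit coordinates and metadata.
time = np.arange(10)
y = np.linspace(0, 1000, 50)
x = np.linspace(0, 2000, 100)

ds = xr.Dataset(
    {"eta": (("time", "y", "x"), np.random.rand(10, 50, 100))},
    coords={"time": time, "y": y, "x": x},
)
ds["eta"].attrs["units"] = "m"
ds.attrs["title"] = "example surface-elevation cube"

ds.to_netcdf("example_cube.nc")  # self-describing, findable, reusable
```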
D-Claw is an extension of the software package GeoClaw (www.clawpack.org) for simulating flows of granular-fluid mixtures with evolving volume fractions. It was developed primarily for landslides, debris flows and related phenomena by incorporating principles of solid, fluid and soil mechanics. However, because the two-phase model accommodates variable phase concentrations, it can also be used to model fluid problems in the absence of solid content (the model equations reduce to the shallow water equations as the solid phase vanishes). We therefore use D-Claw to seamlessly simulate multifaceted problems that involve the interaction of granular-fluid mixtures and bodies of water. This includes a large number of cascading natural hazards, such as debris-avalanches and lahars that enter rivers and lakes, landslide-generated tsunamis, landslide dams and outburst floods that entrain debris, and debris-laden tsunami inundation. I will describe the basis of D-Claw's model equations and highlight some recent applications, including the 2015 Tyndall Glacier landslide and tsunami, potential lahars on Mt. Rainier that displace dammed reservoirs, and a hypothetical landslide-generated lake outburst flood near Sisters, Oregon.
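For reference, the limit the abstract mentions is the standard two-dimensional shallow water system; as the solid volume fraction vanishes, D-Claw's two-phase equations reduce to this familiar form (h: flow depth, (u, v): depth-averaged velocity, g: gravity, b: basal topography):

```latex
\begin{aligned}
\partial_t h + \partial_x (hu) + \partial_y (hv) &= 0,\\
\partial_t (hu) + \partial_x\!\left(hu^2 + \tfrac{1}{2} g h^2\right) + \partial_y (huv) &= -g h\,\partial_x b,\\
\partial_t (hv) + \partial_x (huv) + \partial_y\!\left(hv^2 + \tfrac{1}{2} g h^2\right) &= -g h\,\partial_y b.
\end{aligned}
```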
DES3D (Dynamic Earth Solver in Three Dimensions) is a flexible, open-source finite element solver that models momentum balance and heat transfer in elasto-visco-plastic material in the Lagrangian form using unstructured meshes. It provides a modeling platform for long-term tectonics as well as various problems in civil and geotechnical engineering. On top of its OpenMP multi-thread parallelism, DES3D has recently adopted CUDA for GPU computing. The CUDA-enabled version shows speedups of two to three orders of magnitude over single-thread performance, making high-resolution 3D models affordable. This clinic will provide an introduction to DES3D's features and capabilities and hands-on tutorials to help beginners start using the code for simple tectonic scenarios. The impact of the two types of parallelization on performance will be demonstrated as well.
Dakota (https://dakota.sandia.gov) is an open-source software toolkit, designed and developed at Sandia National Laboratories, that provides a library of iterative systems analysis methods, including sensitivity analysis, uncertainty quantification, optimization, and parameter estimation. Dakota can be used to answer questions such as:
* What are the important parameters in my model?
* How safe, robust, and reliable is my model?
* What parameter values best match my observational data?
Dakota has been installed on the CSDMS supercomputer, ''beach.colorado.edu'', and is available to all registered users. The full set of Dakota methods can be invoked from the command line on ''beach''; however, this requires detailed knowledge of Dakota, including how to set up a Dakota input file and how to pass parameters and responses between a model and Dakota. To make Dakota more accessible to the CSDMS community, a subset of its functionality has been configured to run through the CSDMS Web Modeling Tool (WMT; https://csdms.colorado.edu/wmt). WMT currently provides access to Dakota's vector, centered, and multidimensional parameter study methods.

In this clinic, we'll provide an overview of Dakota, then, through WMT, set up and perform a series of numerical experiments with Dakota on ''beach'', and evaluate the results.
Other material can be downloaded from https://github.com/mdpiper/dakota-tutorial.
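On the "passing parameters and responses" point above: Dakota ships a `dakota.interfacing` Python module for writing analysis drivers. The sketch below is a minimal driver under our own assumptions; the descriptors `x1`, `x2`, and `f1` are hypothetical and must match whatever is declared in your Dakota input file, and the quadratic stands in for an actual model run.

```python
import dakota.interfacing as di

# Dakota writes a parameters file and expects a results file back;
# read_parameters_file() picks their names up from the command line.
params, results = di.read_parameters_file()

x1 = params["x1"]   # hypothetical parameter descriptors
x2 = params["x2"]

# Stand-in for running a model and computing a metric from its output.
response = (x1 - 1.0) ** 2 + (x2 - 2.0) ** 2

results["f1"].function = response
results.write()     # hand the response back to Dakota
```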
Dakota is a flexible toolkit with algorithms for parameter optimization, uncertainty quantification, parameter estimation, and sensitivity analysis. In this clinic we will work through examples of using Dakota to compare field observations with model output using methods of sensitivity analysis and parameter optimization. We will also examine how the choice of comparison metrics influences results. Methods will be presented in the context of the Landlab Earth-surface dynamics framework but are generalizable to other models. Participants who are not familiar with Landlab are encouraged (but not required) to sign up for the Landlab clinic, which will take place before this clinic.

Participants are encouraged to install both Landlab and Dakota on their computers prior to the clinic. Installation instructions for Landlab can be found at http://landlab.github.io (select "Install" from the menu bar at the top of the page). Installation instructions for Dakota can be found at https://dakota.sandia.gov/content/install-dakota.
Dakota is a flexible toolkit with algorithms for parameter optimization, uncertainty quantification, parameter estimation, and sensitivity analysis. In this clinic we will cover the basics of the Dakota framework, work through examples of using Dakota to compare field observations with model output using methods of sensitivity analysis and parameter optimization, and briefly cover the theoretical background of the Dakota methods used. If time permits, we will examine how the choice of comparison metrics influences results. Methods will be presented in the context of the Landlab Earth-surface dynamics framework but are generalizable to other models. Participants who are not familiar with Landlab are encouraged (but not required) to sign up for the Landlab clinic, which will take place before this clinic.

Participants do not need to install Landlab or Dakota prior to the clinic but will need to sign up for a HydroShare account (https://www.hydroshare.org/sign-up/).

For those students interested in installing Landlab or Dakota: Installation instructions for Landlab can be found at http://landlab.github.io (select "Install" from the menu bar at the top of the page). Installation instructions for Dakota can be found at https://dakota.sandia.gov/content/install-dakota.
Dakota is an open-source toolkit with several types of algorithms, including sensitivity analysis (SA), uncertainty quantification (UQ), optimization, and parameter calibration. Dakota provides a flexible, extensible interface between computational simulation codes and iterative analysis methods such as UQ and SA methods. Dakota has been designed to run on high-performance computing platforms and handles a variety of parallelism schemes. In this clinic, we will provide an overview of Dakota algorithms, specifically focusing on uncertainty quantification (including various types of sampling, reliability analysis, stochastic expansion, and epistemic methods), sensitivity analysis (including variance-based decomposition methods and design of experiments), and parameter calibration (including nonlinear least squares and Bayesian methods). The tutorial will provide an overview of the methods and discuss how to use them. In addition, we will briefly cover how to interface your simulation code to Dakota.
A data component is a software tool that wraps an API for a data source with a Basic Model Interface (BMI). It is designed to provide a consistent way to access various types of datasets, and subsets of them, without needing to know the original data API. Each data component can also interact with numerical models that are wrapped in the pymt modeling framework. This webinar will introduce the data component concept with a demonstration of several examples for time series, raster, and multidimensional space-time data.
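Because every data component exposes the same BMI, access looks the same regardless of the underlying data API. The sketch below uses standard BMI calls (`initialize`, `update`, `get_value`, `finalize`); the component class, import path, and variable name are hypothetical placeholders, not an actual CSDMS data component.

```python
import numpy as np

from my_data_components import SomeDataComponent  # hypothetical import

comp = SomeDataComponent()
comp.initialize("config.yaml")   # point the component at a dataset subset

comp.update()                    # advance to the next time step

# Standard BMI getter: allocate a destination array, then fill it.
grid_id = comp.get_var_grid("precipitation")
dest = np.empty(comp.get_grid_size(grid_id))
comp.get_value("precipitation", dest)

comp.finalize()
```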
Debris flows pose a substantial threat to downstream communities in mountainous regions across the world, and there is a continued need for methods to delineate hazard zones associated with debris-flow inundation. Here we present ProDF, a reduced-complexity debris-flow inundation model. We calibrated and tested ProDF against observed debris-flow inundation from eight study sites across the western United States. While the debris flows at these sites varied in initiation mechanism, volume, and flow characteristics, results show that ProDF is capable of accurately reproducing observed inundation in different settings and geographic areas. ProDF reproduced observed inundation while maintaining computational efficiency, suggesting the model may be applicable in rapid hazard assessment scenarios.
Decades of research have shown that storm runoff in many settings is primarily composed of pre-event groundwater. This groundwater is actively flowing, sometimes against topographic gradients, and in quantities substantial enough to alter the catchment water balance. Such effects have been observed across diverse lithologies and topographic settings, including mountainous environments that fluvial landscape evolution models often intend to capture. Yet to this day, most landscape evolution models represent runoff as a simple overland flow process. To explore the effects of groundwater flow on landscape evolution, we have developed coupled models in Landlab that represent both geomorphic change and surface-subsurface flow processes, in which runoff generated by a distributed hydrological model drives stream power fluvial erosion. We examine (1) hydrological function and topography in headwaters at geomorphic steady-state, and (2) transient dynamics at orogen-scale drainage divides, grounding our work in case studies and large-sample analyses. The results suggest that interactions between the surface and subsurface are often critical to understanding landscape evolution, and that long-term coevolution of hydrological and geomorphic processes may explain certain emergent hydrological traits of watersheds today.
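A minimal sketch of this kind of coupling in Landlab, assuming the `GroundwaterDupuitPercolator` hydrology component feeds seepage runoff into flow routing and stream-power erosion, might look like the following. Grid size, parameter values, and time steps are illustrative only, not the study's configuration.

```python
from landlab import RasterModelGrid
from landlab.components import (
    FastscapeEroder,
    FlowAccumulator,
    GroundwaterDupuitPercolator,
)

# Set up a tilted surface with a flat aquifer base and a full water table.
grid = RasterModelGrid((50, 50), xy_spacing=10.0)
z = grid.add_zeros("topographic__elevation", at="node")
z += 0.1 * grid.x_of_node / grid.x_of_node.max()
grid.add_zeros("aquifer_base__elevation", at="node")
wt = grid.add_zeros("water_table__elevation", at="node")
wt[:] = z

gdp = GroundwaterDupuitPercolator(
    grid, recharge_rate=1e-7, hydraulic_conductivity=1e-4
)
fa = FlowAccumulator(grid, runoff_rate="surface_water__specific_discharge")
sp = FastscapeEroder(grid, K_sp=1e-5, discharge_field="surface_water__discharge")

dt = 100.0  # illustrative geomorphic time step
for _ in range(10):
    gdp.run_with_adaptive_time_step_solver(dt)  # groundwater flow and seepage
    fa.run_one_step()                           # route seepage runoff downslope
    sp.run_one_step(dt)                         # runoff-driven stream-power erosion
```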
