Property:CSDMS meeting abstract presentation


This is a property of type Text.

Showing 100 pages using this property.
Fill-Spill-Merge (FSM) is an algorithm that distributes runoff on a landscape to fill or partially fill depressions. When a depression fills, excess water can overflow into neighbouring depressions or the ocean. In this clinic, we will use FSM to assess changes in a landscape’s hydrology when depressions in a DEM are partially or fully filled with water. We will discuss why it may be important to consider depressions more closely, rather than simply removing them from the DEM. I will describe the design of the FSM algorithm, and then we will use FSM on a DEM to look at how landscape hydrology changes under different hydrologic conditions. This clinic may be helpful to those interested in topics such as landscape hydrology, landscape evolution, flow routing, hydrologic connectivity, and lake water storage.  +
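The depression handling that FSM refines can be contrasted with the classic Priority-Flood approach, in which every depression is simply raised to its spill elevation. Below is a minimal sketch of that baseline technique (this is not the FSM code itself; the function and variable names are illustrative):

```python
import heapq

def fill_depressions(dem):
    """Priority-Flood depression filling on a 2-D grid of elevations.

    Flood inward from the grid edges (the potential outlets), always
    expanding from the lowest cell reached so far; any cell below the
    current spill level cannot drain and is raised to that level.
    """
    nrows, ncols = len(dem), len(dem[0])
    filled = [row[:] for row in dem]
    visited = [[False] * ncols for _ in range(nrows)]
    heap = []
    # Seed the priority queue with all edge cells.
    for r in range(nrows):
        for c in range(ncols):
            if r in (0, nrows - 1) or c in (0, ncols - 1):
                heapq.heappush(heap, (filled[r][c], r, c))
                visited[r][c] = True
    while heap:
        z, r, c = heapq.heappop(heap)
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < nrows and 0 <= nc < ncols and not visited[nr][nc]:
                visited[nr][nc] = True
                # A neighbour below the spill level cannot drain: raise it.
                filled[nr][nc] = max(filled[nr][nc], z)
                heapq.heappush(heap, (filled[nr][nc], nr, nc))
    return filled
```

FSM goes further than this sketch by tracking how much water each depression actually receives, so depressions can remain partially filled rather than always being flooded to their spill point.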
Fire temporarily alters soil and vegetation properties, driving increases in runoff and erosion that can dramatically increase the likelihood of debris flows. In the immediate aftermath of fire, debris flows most often initiate when surface water runoff rapidly erodes sediment on steep slopes. Due to the complex interactions between runoff generation, sediment transport, and post-fire debris-flow initiation and growth, models that couple these processes can provide valuable insights into the ways in which topography, burn severity, and post-fire recovery influence debris-flow activity. Here, we describe such a model as well as attempts to parameterize temporal changes in model parameters throughout the post-fire recovery process. Simulations of watershed-scale response to individual rainstorms in several southern California burned areas suggest substantial reductions in debris-flow likelihood and volume within the first 1-2 years following fire. Results highlight the importance of considering local rainfall characteristics and sediment supply when using process-based numerical models to assess debris-flow potential. More generally, results provide a methodology for estimating the intensity and duration of rainfall associated with the initiation of runoff-generated debris flows as well as insights into the persistence of debris-flow hazards following fire.  +
Flood hazard in rivers can evolve from changes in the frequency and intensity of flood-flows (hydrologic effects) and in the channel capacity to carry flood-flows (morphologic effects). However, river morphology is complex and often neglected in flood planning. Here, we separate the impacts of morphology vs. hydrology on flood risk for 48 river gauges in Northwestern Washington State. We find that morphologic vs. hydrologic forcings are comparable but not regionally consistent. Prominent morphologic effects on flood-risk are forced by extreme natural events and anthropogenic disturbances. Based on morphologic changes, we identify five categories of river behavior relevant for flood-risk management.  +
Flood modelling at global scales represents a revolution in hydraulic science and has the potential to transform decision-making and risk management in a wide variety of fields. Such modelling draws on a rich heritage of algorithm and data set development in hydraulic modelling over the last 20 years, and is now beginning to yield new insights into current and future flood risk. This paper reviews this progress and outlines recent efforts to develop a 30m resolution true hydrodynamic model of the entire conterminous US. The model is built using an automated framework which uses the US National Elevation Dataset, the HydroSHEDS river network, regionalised frequency analysis to determine extreme flow and rainfall boundary conditions, and the USACE National Levee Dataset to characterize flood defences. Comparison against FEMA and USGS flood maps shows the continental model to have skill approaching that of bespoke models built with local data. The paper describes the development and testing of the model, and its use to estimate current and future flood risk in the US using high resolution population maps and development projections.  +
Flooding is one of the costliest natural disasters and recent events, including several hurricanes as well as flash floods, have been particularly devastating. In the US alone, the last few years have been record-breaking in terms of flood disasters and have triggered strong reactions in public opinion. Governments are now reviewing the available information to better mitigate the risks from flooding. Typically, in the US, flood hazard mapping is done by federal agencies (USACE, FEMA and USGS), with, traditionally, little room or need for research model development in flood hazard applications. Now, with the advent of the National Water Model, the status quo of flood hazard prediction in the US may be changing; however, inundation extent and floodplain depths in the National Water Model are still in early-stage development. This Clinic provides a beginner's introduction to the latest capabilities in large-scale 2-D modeling using the LISFLOOD-FP model developed by the University of Bristol, with a nearly 20-year code history. The model has a long history in research applications, and the algorithms behind it have also made their way into many existing industry model codes. The session will give participants insights into 2-D flood inundation modeling with LISFLOOD-FP and also a look at more sophisticated sub-grid channel implementations for large-scale applications. More specifically, we will look at the data sets needed by the model and then run a simulation of the annual flooding on the Inner Niger Delta in Mali. The Clinic will also give participants the opportunity to look at some high-resolution LiDAR-based model results.  +
Floodplain construction involves the interplay between channel belt sedimentation and avulsion, overbank deposition of fines, and sediment reworking by channel migration. There has been considerable progress in numerical modelling of these processes over the past few years, for example, by using high resolution flow and sediment transport models to simulate river morphodynamics, albeit over relatively small time and space scales. Such spatially-distributed hydrodynamic models are also regularly used to simulate floodplain inundation and overbank sedimentation during individual floods. However, most existing models of long-term floodplain construction and alluvial architecture do not account for flood hydraulics explicitly. Instead, floodplain sedimentation is typically modelled as an exponential function of distance from the river, and avulsion thresholds are defined using topographic indices (e.g., lateral:downstream slope ratios or metrics of channel belt super-elevation). This presentation aims to provide an overview of these issues, and present results from a hydrodynamically-driven model of long-term floodplain evolution. This model combines a simple network-based model of channel migration with a 2D grid-based model of flood hydrodynamics and overbank sedimentation. The latter involves a finite volume solution of the shallow water equations and an advection-diffusion model for suspended sediment transport. Simulation results are compared with observations from several large lowland floodplains, and the model is used to explore hydrodynamic controls on long-term floodplain evolution and alluvial ridge construction.  +
The flow routing map is the cornerstone of spatially distributed hydrologic models. In this clinic we will introduce HexWatershed, a scale-free, mesh-independent flow direction model. It supports DOE’s Energy Exascale Earth System Model (E3SM) to generate hydrologic parameters and river network representations on both structured and unstructured meshes. In this presentation, we will overview the capabilities of HexWatershed with an emphasis on river network representation and flow direction modeling. We will also provide participants with the tools to begin their own research with hydrologic model workflows. Through hands-on tutorials and demonstrations, participants will gain insights into the relationship between meshes and flow direction, and how HexWatershed handles river networks on various meshes. We will also demonstrate how to use the HexWatershed model outputs in the large-scale hydrologic model, Model for Scale Adaptive River Transport (MOSART). Participants will be provided with additional resources that can be used to extend the tutorial problems and gain additional familiarity with the tools and workflows introduced. Participants are welcome to bring their own computers with internet access and a web browser. Tutorials will involve simple scripting operations in the Python language. The conda utility will be used to install libraries. Both QGIS and VisIt packages will be used for visualization.  +
Fluvial incision since late Miocene time (5 Ma) has shaped the transition between the Central Rocky Mountains and adjacent High Plains. Despite a clear contrast in erodibility between the mountains and plains, erodibility has not been carefully accounted for in previous attempts to model the geomorphic evolution of this region. The focus of this work to date has been to constrain erodibility values with a simple toy model, and to reconstruct the paleosurface of the Miocene Ogallala Formation prior to its dissection beginning at 5 Ma. This surface reconstruction will be used as an initial condition in subsequent modeling.  +
Food security and poverty in Bangladesh are very dependent on natural resources, which fluctuate with a changing environment. The ecosystem services supporting the rural population are affected by several factors including climate change, upstream river flow modifications, commercial fish catches in the Bay of Bengal, and governance interventions. The ESPA Deltas project aims to holistically describe the interaction between the interlinked bio-physical environment and the livelihoods of the rural poorest in coastal Bangladesh, who are highly dependent on natural resources and live generally on less than US$1.50 per day. Here we describe a new integrated model that allows a long-term analysis of the possible changes in this system by linking projected changes in physical processes (e.g. river flows, nutrients), with productivity (e.g. fish, rice), social processes (e.g. access, property rights, migration) and governance (e.g. fisheries, agriculture, water and land use management). Bayesian Networks and Bayesian Processes allow multidisciplinary integration and exploration of specific scenarios. This integrated approach is designed to provide Bangladeshi policy makers with science-based evidence of possible development trajectories. This includes the likely robustness of different governance options on natural resource conservation and poverty levels. Early results highlight the far-reaching implications of sustainable resource use and international cooperation to secure livelihoods and ensure a sustainable environment in coastal Bangladesh.  +
From G.K. Gilbert's "The Convexity of Hilltops" to highly-optimized numerical implementations of drainage basin evolution, models of landscape evolution have been used to develop insight into the development of specific field areas, create testable predictions of landform development, demonstrate the consequences of our current theories for geomorphic processes, and spark imagination through hypothetical scenarios. In this talk, I discuss how the types of questions tackled with landscape evolution models have changed as observational data (e.g., high-resolution topography) and computational technology (e.g., accessible high performance computing) have become available. I draw on a natural experiment in postglacial drainage basin incision and a synthetic experiment in a simple tectonic setting to demonstrate how landscape evolution models can be used to identify how much information the topography or other observable quantities provide in inferring process representation and tectonic history. In the natural example, comparison of multiple calibrated models provides insight into which process representations improve our ability to capture the geomorphic history of a site. Projections into the future characterize where in the landscape uncertainty in the model structure dominates over other sources of uncertainty. In the synthetic case, I explore the ability of a numerical inversion to recover geomorphic-process relevant (e.g., detachment vs. transport limited fluvial incision) and tectonically relevant (e.g., date of fault motion onset) system parameters.  +
GCAM is an open-source, global, market equilibrium model that represents the linkages between energy, water, land, climate, and economic systems. One of GCAM's many outputs is projected land cover/use by subregion. Subregional projections provide context and can be used to understand regional land dynamics; however, Earth System Models (ESMs) generally require gridded representations of land at finer scales. Demeter, a land use and land cover disaggregation model, was created to provide this service. Demeter directly ingests land projections from GCAM and creates gridded products that match the desired resolution and land-class requirements of the user.  +
GPUs can make models, simulations, machine learning, and data analysis much faster, but how? And when? In this clinic we'll discuss whether you should use a GPU for your work, whether you should buy one, which one to buy, and how to use one effectively. We'll also get hands-on and speed up a landscape evolution model together. This clinic should be of interest both to folks who would like to speed up their code with minimal effort as well as folks who are interested in the nitty gritty of pushing computational boundaries.  +
GeoClaw (http://www.geoclaw.org) is an open-source software package for solving two-dimensional depth-averaged equations over general topography using high-resolution finite volume methods and adaptive mesh refinement. Wetting-and-drying algorithms allow modeling inundation and overland flows. The primary applications where GeoClaw has been used are tsunami modeling and storm surge, although it has also been applied to dam break and other overland flooding problems. The first part of this clinic will present an overview of the capabilities of GeoClaw, including a number of new features that have been added in the past few years. These include: - Depth-averaged Boussinesq-type dispersive equations that better model short-wavelength tsunamis, such as those generated by landslides or asteroid impacts. Solving these equations requires implicit solvers (due to the higher-order derivatives in the equations). This is now working with the adaptive mesh refinement (AMR) algorithms in GeoClaw, which are critical for problems that require high-resolution coastal modeling while also modeling trans-oceanic propagation, for example. - Better capabilities for extracting output at frequent times on a fixed spatial grid by interpolation from the AMR grids during a computation. The resulting output can then be used for making high-resolution animations or for post-processing (e.g. the velocity field at frequent times can be used for particle tracking, as needed when tracking tsunami debris, for example). - Ways to incorporate river flows or tidal currents into GeoClaw simulations. - Better coupling with the D-Claw code for modeling debris flows, landslides, lahars, and landslide-generated tsunamis. (D-Claw is primarily developed by USGS researchers Dave George and Katy Barnhart). The second part of the clinic will be a hands-on introduction to installing GeoClaw and running some of the examples included in the distribution, with tips on how best to get started on a new project. 
GeoClaw is distributed as part of Clawpack (http://www.clawpack.org), and is available via the CSDMS model repository. For those who wish to install the software in advance on laptops, please see http://www.clawpack.org/installing.html. We will also go through this briefly and help with any issues that arise on your laptop (provided it is a Mac or Linux machine; we do not support Windows.) You may need to install some prerequisites in advance, such as Xcode on a Mac (since we require "make" and other command line tools), a Fortran compiler such as gfortran, and basic scientific Python tools such as NumPy and Matplotlib. See https://www.clawpack.org/prereqs.html.  
GeoClaw (http://www.geoclaw.org) is an open-source software package for solving two-dimensional depth-averaged equations over general topography using high-resolution finite volume methods and adaptive mesh refinement. Wetting-and-drying algorithms allow modeling inundation or overland flows. The primary applications where GeoClaw has been used are tsunami modeling and storm surge, although it has also been applied to dam break problems and other overland floods. This tutorial will give an introduction to setting up a tsunami modeling problem in GeoClaw, including: * Overview of capabilities, * Installing the software, * Using Python tools provided in GeoClaw to acquire and work with topography DEMs and other datasets, * Setting run-time parameters, including specifying adaptive refinement regions, * The VisClaw plotting software to visualize results using Python tools or display on Google Earth. GeoClaw is distributed as part of Clawpack (http://www.clawpack.org). For those who wish to install the software in advance on laptops, please see http://www.clawpack.org/installing.html. Tutorials can be found here: https://github.com/clawpack/geoclaw_tutorial_csdms2019  +
GeoClaw is an open source Fortran/Python package based on Clawpack (conservation laws package), which implements high-resolution finite volume methods for solving wave propagation problems with adaptive mesh refinement. GeoClaw was originally developed for tsunami modeling and has been validated via benchmarking workshops of the National Tsunami Hazard Mitigation Program for use in hazard assessment studies funded through this program. Current projects include developing new tsunami inundation maps for the State of Washington and the development of new probabilistic tsunami hazard assessment (PTHA) methodologies. The GeoClaw code has also been extended to the study of storm surge and forms the basis for D-Claw, a debris flow and landslide code being developed at the USGS and recently used to model the 2014 Oso, Washington landslide, for example.  +
Getting usable information out of climate and weather models can be a daunting task. The direct output from the models typically has unacceptable biases on local scales, and as a result a large number of methods have been developed to bias correct or downscale the climate model output. This clinic will describe the range of methods available as well as provide background on the pros and cons of different approaches. This will cover a variety of approaches from relatively simple methods that just rescale the original output, to more sophisticated statistical methods that account for broader weather patterns, to high-resolution atmospheric models. We will focus on methods for which output or code are readily available for end users, and discuss the input data required by different methods. We will follow this up with a practical session in which participants will be supplied a test dataset and code with which to perform their own downscaling. Participants interested in applying these methods to their own region of interest are encouraged to contact the instructor ahead of time to determine what inputs would be required.  +
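One of the "relatively simple" rescaling methods referred to above, empirical quantile mapping, can be sketched in a few lines. This is an illustrative implementation rather than the clinic's own code, and the function and variable names are my own:

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Empirical quantile mapping for bias correction.

    Build a transfer function that maps quantiles of the historical
    model distribution onto the corresponding quantiles of the
    observations, then apply it to future (or out-of-sample) model
    values by linear interpolation.
    """
    quantiles = np.linspace(0.01, 0.99, 99)
    model_q = np.quantile(model_hist, quantiles)  # model quantiles
    obs_q = np.quantile(obs_hist, quantiles)      # observed quantiles
    # Each future value is located within the model quantiles and
    # replaced with the observed value at the same quantile.
    return np.interp(model_future, model_q, obs_q)
```

For a model run with a simple additive bias, this reduces to subtracting that bias; its value comes from also correcting higher moments of the distribution, though values beyond the historical range are only extrapolated.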
Global models of Earth’s climate have expanded beyond their geophysical heritage to include terrestrial ecosystems, biogeochemical cycles, vegetation dynamics, and anthropogenic uses of the biosphere. Ecological forcings and feedbacks are now recognized as important for climate change simulation, and the models are becoming models of the entire Earth system. This talk introduces Earth system models, how they are used to understand the connections between climate and ecology, and how they provide insight to environmental stewardship for a healthy and sustainable planet. Two prominent examples discussed in the talk are anthropogenic land use and land-cover change and the global carbon cycle. However, there is considerable uncertainty in how to represent ecological processes at the large spatial scale and long temporal scale of Earth system models. Further scientific advances are straining under the ever-growing burden of multidisciplinary breadth, countered by disciplinary chauvinism and the extensive conceptual gap between observationalists developing process knowledge at specific sites and global scale modelers. The theoretical basis for Earth system models, their development and verification, and experimentation with these models requires a new generation of scientists, adept at bridging the disparate fields of science and using a variety of research methodologies including theory, numerical modeling, observations, and data analysis. The science requires a firm grasp of models, their theoretical foundations, their strengths and weaknesses, and how to appropriately use them to test hypotheses of the atmosphere-biosphere system. It requires a reinvention of how we learn about and study nature.  +
Google Earth Engine is a powerful geographic information system (GIS) that brings programmatic access and massively parallel computing to petabytes of publicly-available Earth observation data using Google’s cloud infrastructure. In this live-coding clinic, we’ll introduce some of the foundational concepts of workflows in Earth Engine and lay the groundwork for future self-teaching. Using the JavaScript API, we will practice: raster subsetting, raster reducing in time and space, custom asset (raster and vector) uploads, visualization, mapping functions over collections of rasters or geometries, and basic exporting of derived products.  +
Google Earth Engine (GEE) is a multi-petabyte catalog of satellite imagery and geospatial datasets with planetary-scale analysis capabilities. Now imagine all you need to work on it is a browser and an internet connection. This hands-on workshop will introduce you to and showcase cloud-native geospatial processing. We will explore the platform’s built-in catalog of 100+ petabytes of geospatial datasets and build some analysis workflows. Additional topics will also include uploading & ingesting your own data to Google Earth Engine, time series analysis essential for change monitoring, and data and code principles for effective collaboration. The hope is to introduce the cloud-native geospatial analysis platform and to rethink how we produce and consume data. If you want to follow along, bring your laptop and register for an Earth Engine account here: https://signup.earthengine.google.com P.S. I recommend using a personal account :) you get to keep it  +
Granular materials are ubiquitous in the environment, in industry and in everyday life and yet are poorly understood. Modelling the behavior of a granular medium is critical to understanding problems ranging from hazardous landslides and avalanches in the Geosciences, to the design of industrial equipment. Typical granular systems contain millions of particles, but the underlying equations governing that collective motion are as yet unknown. The search for a theory of granular matter is a fundamental problem in physics and engineering and of immense practical importance for mitigating the risk of geohazards. Direct simulation of granular systems using the Discrete Element Method is a powerful tool for developing theories and modelling granular systems. I will describe the simulation technique and show its application to a diverse range of flows.  +
Great mentors engage early career scientists in research, open doors, speak the ‘unspoken rules’, and inspire the next generation. Yet many of us step into mentoring roles without feeling fully confident in the role, or are uncertain how to create an inclusive environment that allows early career scientists from varied backgrounds to thrive. In this interactive workshop, we will share experiences and explore tools that can help build successful mentoring relationships, create supportive cohorts, and feel confident in becoming a great mentor.  +
Hazard assessment for post-wildfire debris flows, which are common in the steep terrain of the western United States, has focused on the susceptibility of upstream basins to generate debris flows. However, reducing public exposure to this hazard also requires an assessment of hazards in downstream areas that might be inundated during debris flow runout. Debris flow runout models are widely available, but their application to hazard assessment for post-wildfire debris flows has not been extensively tested. I will discuss a study in which we apply three candidate debris flow runout models in the context of the 9 January 2018 Montecito event. We evaluate the relative importance of flow volume and flow material properties in successfully simulating the event. Additionally, I will describe an in-progress user needs assessment designed to understand how professional decision makers (e.g., county emergency managers, floodplain managers, and Burned Area Emergency Response team members) might use post-fire debris flow inundation hazard assessment information. https://agupubs.onlinelibrary.wiley.com/doi/full/10.1029/2021JF006245 Katy Barnhart is a Research Civil Engineer at the U.S. Geological Survey’s Geologic Hazards Science Center. She received her B.S.E. (2008) in Civil and Environmental Engineering from Princeton University and her M.S. (2010) and Ph.D. (2015) in Geological Sciences from the University of Colorado at Boulder. Her research uses numerical modeling to understand past and forecast future geomorphic change on a variety of timescales.  +
Here we present direct numerical simulations of the hysteresis of the Antarctic ice sheet and apply linear response theory to these kinds of simulations to project Antarctica's sea level contribution to the end of the century. Related publications: * A. Levermann et al. 2020. Projecting Antarctica's contribution to future sea level rise from basal ice-shelf melt using linear response functions of 16 ice sheet models (LARMIP-2). Earth System Dynamics 11 (2020) 35-76, doi 10.5194/esd-11-35-2020. * J. Garbe, T. Albrecht, A. Levermann, J.F. Donges, R. Winkelmann, 2020. The Hysteresis of the Antarctic Ice Sheet. Nature 585 (2020), 538-544, doi: 10.1038/s41586-020-2727-5.  +
HexWatershed is a hydrologic flow direction model that supports structured and unstructured meshes. It uses state-of-the-art topological relationship-based stream burning and depression-filling techniques to produce high-quality flow-routing datasets across scales. HexWatershed has substantially improved over the past two years, including support for the DGGRID discrete global grid system (DGGS). This presentation will provide an overview of HexWatershed, highlighting its capabilities, new features, and improvements. Through hands-on tutorials and demonstrations, attendees will gain insights into the underlying philosophy of the HexWatershed model, and how to use HexWatershed products to run large-scale hydrologic models in watersheds worldwide. Specifically, this tutorial will cover major components in the HexWatershed ecosystem, including the computational mesh generation process, river network representation, and flow direction modeling. We will provide participants with resources to extend the tutorial problems and gain additional familiarity with the tools and workflows introduced. Attendees are encouraged to bring their laptops with internet access and a functional web browser. Tutorials will involve scripting operations in the Python language, using Jupyter Notebooks. We will use the Conda utility to install dependency libraries and Visual Studio Code to run the notebooks.  +
High-resolution topographic (HRT) data is becoming more easily accessible and prevalent, and is rapidly advancing our understanding of myriad surface and ecological processes. Landscape connectivity is the framework that describes the routing of fluids, sediments, and solutes across a landscape and is a primary control on geomorphology and ecology. Connectivity is not a static parameter, but rather a continuum that dynamically evolves on a range of temporal and spatial scales, the observation of which is highly dependent on the available methodology. In this clinic we showcase the utility of HRT for the observation and characterization of landscapes and compare results with those of coarser spatial resolution data-sets. We highlight the potential for integrating HRT observations and parameters such as vegetation density, surface relief, and local slope variability with numerical surface process models. Participants will gain an understanding of the basics of HRT, data availability and basic analysis, and the use of HRT parameters in modeling.  +
How can we increase the diversity, richness and value of Spatial Data Infrastructure (SDI) to the Disasters and Natural Hazards community stakeholders? We’ll look at some of the current (and past) Open Geospatial Consortium initiatives to examine exciting work to enable sharing of complex data and models within the community using open standards.  +
Human settlements in dynamic environmental settings face the challenges both of managing their own impact on their surroundings and also adapting to change, which may be driven by a combination of local and remote factors, each of which may involve both human and natural forcings. Impacts of and responses to environmental change play out at multiple scales which involve complex nonlinear interactions between individual actors. These interactions can produce emergent results where the outcome at the community scale is not easily predicted from the decisions taken by individuals within the community. Agent-based simulations can be useful tools to explore the dynamics of both the human response to environmental change and the environmental impacts of human activity. Even very simple models can be useful in uncovering potential for unintended consequences of policy actions. Participatory simulations that allow people to interact with a system that includes simulated agents can be useful tools for teaching and communicating about such unintended consequences. I will report on progress on agent-based simulations of environmentally stressed communities in Bangladesh and Sri Lanka and preliminary results of using a participatory coupled model of river flooding and agent-based real estate markets to teach about unintended consequences of building flood barriers.  +
Humans alter natural geomorphic systems by modifying terrain morphology and through on-going actions that change patterns of sediment erosion, transport, and deposition. Long-term interactions between humans and the environment can be examined using numerical modeling. Human modifications of the landscape such as land cover change and agricultural tillage have been implemented within some landscape evolution models, yet little effort has been made to incorporate agricultural terraces. Terraces of various forms have been constructed for millennia in the Mediterranean, Southeast Asia, and South America; in those regions some terraces have undergone cycles of use, abandonment, and reuse. Current implementations of terraces in existing models are as static objects that uniformly impact landscape evolution, yet empirical studies have shown that terrace impact depends upon whether they are maintained or abandoned. We previously tested a simple terrace model that included a single terrace wall on a synthetic hillside with 20% slope for the impacts of maintenance and abandonment. In this research we modify the terrace model to include a wider variety of terrace forms and couple it with a landscape evolution model to test the extent terraced terrain morphology is related to terrace form. We also test how landscape evolution, after abandonment of terraced fields, differs based on length of time the terraces were maintained. We argue that construction and maintenance of terraces has a significant impact on the spatial patterning of sediment erosion and deposition and thus landscape evolution modeling of terraced terrain requires coupling with a dynamic model of terrace use.  +
Hurricanes can greatly modify the sedimentary record, but our coastal scientific modeling community has rather limited capability to predict such processes. A three-dimensional sediment transport model was developed in the Regional Ocean Modeling System (ROMS) to study seabed erosion and deposition on the Louisiana shelf in response to Hurricanes Katrina and Rita in 2005. Conditions to either side of Hurricane Rita's storm track differed substantially, with the region to the east having stronger winds, taller waves, and thus deeper erosion. This study indicated that major hurricanes can disturb the seabed at centimeter to meter scales.  +
Hydrology is a science of extremes: droughts and floods. In either case, the hydrologic response arises from the combination of many factors, such as terrain, land cover, land use, infrastructure, etc. Each has different, overlapping spatial domains. Superimposed upon these are temporal variations, driven by stochastic weather events that follow seasonal climatic regimes. Calculating risk (expected loss) requires a loss function (damage) and a response domain (flood depths) over which that loss is integrated. The watershed provides the spatial domain that collects all these factors. This talk will discuss the data used to characterize hydrologic response.  +
I will discuss an application of the Migration, Intensification, and Diversification as Adaptive Strategies (MIDAS) agent-based modeling framework to modeling labor migration across Bangladesh under the stressor of sea-level rise (SLR). With this example, I hope to highlight some hard-to-resolve challenges in representing adaptive decision-making under as-yet unexperienced stressors in models. Drawing together what is more and what is less known in projections for future adaptation, I will discuss strategies for ‘responsible’ presentation and dissemination of model findings.  +
If one system comes to (my) mind where the human element is intertwined with the environment, it is the Louisiana coastal area in the Southern United States. Often referred to as the working coast, coastal Louisiana supports large industries with its ports, navigation channels, oil, and productive fisheries. In addition to that, Louisianians have a significant cultural connection to the coastal wetlands and their natural resources. Unfortunately, the land is disappearing into the sea with coastal erosion rates higher than anywhere else in the US. Due to these high rates of land loss, this system needs rigorous protection and restoration. While the restoration plans are mostly focused on building land, the effects on, for example, fisheries of proposed strategies should be estimated as well before decisions can be made on how to move forward. Through several projects I have been involved in, from small modeling projects to bold coastal design programs, I present how coupled models play a key role in science-based coastal management that considers the natural processes as well as the human element.  +
In dry regions, escarpments are key landforms for exploring landform-rainfall interactions. Here we present a modeling approach for the evolution of arid cliffs and sub-cliff slopes that incorporates rainfall forcing at the scale of individual rainstorms. We used numerical experiments to mechanistically test how arid cliffs and sub-cliff slopes evolve in response to different geomorphic characteristics and variations in rainstorm properties.  +
In formulating tectono-geomorphic models of landscape evolution, Earth is typically divided into two domains: a surface domain in which “geomorphic” processes are solved for, and a tectonic domain of Earth deformation driven generally by differential plate movements. Here we present a single mechanical framework, the Failure Earth Response Model (FERM), that unifies the physical description of dynamics within and between the two domains. FERM is constructed on two basic assumptions about the three-dimensional stress state and rheological memory: I) material displacement, whether tectonic or geomorphic in origin, at or below Earth’s surface, is driven by local forces overcoming local resistance, and II) large displacements, whether tectonic or geomorphic in origin, irreversibly alter Earth material properties, enhancing a long-term strain memory mapped into the topography. In addition to gathering stresses arising from far-field tectonic processes, topographic relief, and inertial surface processes into a single stress state for every point, the FERM formulation allows explicit consideration of the contributions to the evolving landscape of pore pressure fluctuations, seismic accelerations, and fault damage. Incorporating these in the FERM model significantly influences the tempo of landscape evolution and leads to highly heterogeneous and anisotropic stress and strength patterns, largely predictable from knowledge of mantle kinematics. The resulting unified description permits exploration of surface-tectonic interactions from outcrop to orogen scales and allows elucidation of the high-fidelity orogenic strain and climate memory contained in topography.  +
In landscape evolution models, climate change is often assumed to be synonymous with changes in rainfall. For many climate changes, however, the dominant driver of landscape evolution is change in vegetation cover. In this talk I review case studies that attempt to quantify the impact of vegetation changes on landscape evolution, including examples from hillslope/colluvial, fluvial, and aeolian environments, spatial scales of ~10 m to whole continents, and time scales from decadal to millennial. Particular attention is paid to how to parameterize models using paleoclimatic and remote sensing data.  +
In software engineering, an interface is a group of functions with prescribed names, argument types, and return types. When a developer implements an interface for a piece of software, they fill out the details for each function while keeping the signatures intact. CSDMS has developed the Basic Model Interface (BMI) for facilitating the conversion of a model written in C, C++, Fortran, Python, or Java into a reusable, plug-and-play component. By design, BMI functions are simple. However, when trying to implement them, the devil is often in the details. In this hands-on clinic, we'll take a simple model of the two-dimensional heat equation, written in Python, and together we'll write the BMI functions to wrap it, preparing it for transformation into a component. As we develop, we’ll explore how to use the wrapped model with a Jupyter Notebook. To get the most out of this clinic, come prepared to code! We'll have a lot to write in the time allotted for the clinic. We recommend that clinic attendees have a laptop with the Anaconda Python distribution installed. We also request that you review the * BMI description (http://csdms.colorado.edu/wiki/BMI_Description), and the * BMI documentation (https://bmi-spec.readthedocs.io) before the start of the clinic.  +
In software engineering, an interface is a set of functions with prescribed names, argument types, and return types. When a developer implements an interface for a piece of software, they fill out the details for each function while keeping the signatures intact. CSDMS has developed the Basic Model Interface (BMI) for facilitating the conversion of an existing model written in C, C++, Fortran, Python or Java into a reusable, plug-and-play component. By design, BMI functions are straightforward to implement. However, when trying to match BMI functions to model behaviors, the devil is often in the details.<br>In this hands-on clinic, we'll take a simple model--an implementation of the two-dimensional heat equation in Python--and together, we'll write the BMI functions to wrap it, preparing it for transformation into a component. As we develop, we’ll explore how to use the wrapped model with a Jupyter Notebook.<br>To get the most out of this clinic, come prepared to code! We'll have a lot to write in the time allotted for the clinic. We recommend that clinic attendees have a laptop with the Anaconda Python distribution installed. We also request that you read over:<br>BMI description (https://csdms.colorado.edu/wiki/BMI_Description)<br>BMI documentation (http://bmi-python.readthedocs.io)<br>before participating in the clinic.  +
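As a taste of what the clinic covers, here is a minimal sketch of such a wrapper. The toy `Heat` solver, grid size, and variable name below are invented for illustration; the method names (`initialize`, `update`, `get_value`, `get_current_time`, `finalize`) are genuine BMI functions, though a complete BMI implementation includes many more (grid, variable, and time functions).

```python
class Heat:
    """Toy explicit finite-difference solver for du/dt = alpha * laplacian(u)."""

    def __init__(self, shape=(5, 5), alpha=0.25, dt=0.25):
        self.ny, self.nx = shape
        self.alpha, self.dt = alpha, dt
        self.time = 0.0
        self.u = [[0.0] * self.nx for _ in range(self.ny)]

    def advance(self):
        """Advance one time step; boundary values stay fixed."""
        un = [row[:] for row in self.u]
        for i in range(1, self.ny - 1):
            for j in range(1, self.nx - 1):
                lap = (un[i - 1][j] + un[i + 1][j] + un[i][j - 1]
                       + un[i][j + 1] - 4.0 * un[i][j])
                self.u[i][j] = un[i][j] + self.alpha * self.dt * lap
        self.time += self.dt


class BmiHeat:
    """Wraps Heat with a handful of Basic Model Interface functions."""

    def initialize(self, config_file=None):
        self._model = Heat()

    def update(self):
        self._model.advance()

    def get_current_time(self):
        return self._model.time

    def get_value(self, name, dest):
        # BMI returns values through a caller-supplied flat buffer.
        dest[:] = [v for row in self._model.u for v in row]
        return dest

    def finalize(self):
        self._model = None
```

A caller then drives the model only through the interface, e.g. `m = BmiHeat(); m.initialize(); m.update()`, which is exactly what lets a framework treat any BMI-wrapped model as an interchangeable component.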
In the modeler community, hindcasting (testing models against knowledge of past events) is required of all computer models before they can provide reliable results to users. CSDMS 2.0 “Moving Forward” has proposed to incorporate benchmarking data into its modeling framework. Data collection in natural systems has advanced significantly, but still lags the needed resolution in time and space and includes natural variability beyond our understanding, which makes thorough testing of computer models difficult.<br><br>In the experimentalist community, research in Earth-surface processes and subsurface stratal development is in a data-rich era, with rapid expansion of high-resolution, digitally based data sets that were not available even a few years ago. Millions of dollars have been spent to build and renovate flume laboratories. Advanced experimental technologies and methodologies allow more sophisticated, larger-scale experiments at finer detail. A joint effort between modelers and experimentalists is a natural step toward a great synergy between both communities.<br><br>The time for a coherent effort to build a strong global research network for these two communities is now. First, both communities should work out best practices and metadata standards for data collection. Sediment experimentalists are an example of a community in the “long tail,” meaning that their data are often collected in one-of-a-kind experimental set-ups and isolated from other experiments. Second, there should be a centralized knowledge base (a web-based repository for data and technology) easily accessible to modelers and experimentalists. Experimentalists also have a lot of “dark data”: data that are difficult or impossible to access through the Internet. 
This effort will result in tremendous opportunities for productive collaborations.<br><br>The new experimentalist and modeler network will be able to advance CSDMS's current goal by providing high-quality benchmark datasets that are well documented and easily accessible.  
In this clinic I will give an overview of lsdtopotools so that, by the end of the session, you will be able to run and visualise topographic analyses using lsdtopotools and lsdviztools. I will show how to start an lsdtopotools session in google colab in under 4 minutes, and will also give a brief overview for more advanced users of how to use our docker container if you want access to local files. I will then use jupyter notebooks to give example analyses including simple data fetching and hillshading, basin selection, simple topographic metrics and channel extraction. Depending on the audience I will show examples of a) channel steepness analysis for applications in tectonic geomorphology b) calculation of inferred erosion rates based on detrital CRN concentrations c) terrace and valley extraction d) channel-hillslope coupling. In addition I will show our simple visualisation scripts that allow you to generate publication-ready images. All you need prior to the session is a google account that allows you to access colab, and an opentopography account so you can obtain an API key. The latter is not required but will make the session more fun as you can use data from anywhere rather than example datasets. If you are not an advanced user please do not read the next sentence, as you don’t need it and it is nerdy compu-jargon that will put you off the session. If you are an advanced user and wish to try the docker container you should install the docker client for your operating system and use the command “docker pull lsdtopotools/lsdtt_pytools_docker” when you have access to a fast internet connection.  +
In this clinic we will explore how to use the new cloud-based remote sensing platform from Google. Our hands-on clinic will teach you the basics of loading and visualizing data in Earth Engine, sorting through data, and creating different types of composite images. These techniques are a good starting point for more detailed investigations that monitor changes on earth’s surface. Prerequisites:<br>1) Bring your own laptop.<br>2) Chrome installed on your system: It will work with Firefox but has issues.<br>3) An active Google account - Register for an account with Google Earth Engine (https://earthengine.google.com/signup/)  +
In this clinic we will explore how to use the cloud-based remote sensing platform from Google. Our hands-on clinic will teach you the basics of loading and visualizing data in Earth Engine, sorting through data, and creating different types of composite images. These techniques are a good starting point for more detailed investigations that monitor changes on earth’s surface. Prerequisites include Chrome installed on your system (Earth Engine will work with Firefox but has issues) and an active Google account. Once you have those, please register for an account with Google Earth Engine (https://earthengine.google.com/signup/)  +
In this clinic we will first review concepts of glacial isostatic adjustment and the algorithm that is used to solve the sea level equation. We will then provide an overview of the sea level code, which calculates the viscoelastic response of the solid Earth, Earth’s gravity field, and rotation axis to changes in surface load while conserving water between ice sheets and oceans. Participants will run the code, explore manipulating the input ice changes, and investigate its effect on the predicted changes in sea level, solid Earth deformation, and gravity field.  +
In this clinic, we will explore RivGraph, a Python package for extracting and analyzing fluvial channel networks from binary masks. We will first look at some background and motivation for RivGraph's development, including some examples demonstrating how RivGraph provides the required information for building models, developing new metrics, analyzing model outputs, and testing hypotheses about river network structure. We will then cover--at a high level--some of the logic behind RivGraph's functions. The final portion of this clinic will be spent working through examples showing how to process a delta and a braided river with RivGraph and visualizing results. Please note: This clinic is designed to be accessible to novice Python users, but those with no Python experience may also find value. If you'd like to work through the examples during the workshop, please install RivGraph beforehand, preferably to a fresh Anaconda environment. Instructions can be found here: https://github.com/jonschwenk/RivGraph. It is also recommended that you have a GIS (e.g. QGIS) available for use for easy display/interrogation of results.  +
In this clinic, we will first demonstrate existing interactive computer-based activities used for teaching concepts in sedimentology and stratigraphy. This will be followed by a hands-on session for creating different modules based on the participants’ teaching and research interests. Active learning strategies improve student exam performance, engagement, attitudes, thinking, writing, self-reported participation and interest, and help students become better acquainted with one another (Prince, 2004). Specifically, computer-based active learning is an attractive educational approach for post-secondary educators, because developing these activities takes advantage of existing knowledge and skills the educator is likely to already have. The demonstration portion of the clinic will focus on the existing rivers2stratigraphy (https://github.com/sededu/rivers2stratigraphy) activity, which illustrates basin-scale development of fluvial stratigraphy through adjustments in system kinematics including sandy channel migration and subsidence rates. The activity allows users to change these system properties, so as to drive changing depositional patterns. The module utilizes a rules-based model, which produces realistic channel patterns but simplifies the simulation so that it runs efficiently in real time. The clinic will couple rivers2stratigraphy to a conventional laboratory activity in which participants interpret an outcrop photograph of fluvial stratigraphy, and discuss logistics of using the module in the classroom. For the second part of the clinic, familiarity with Python will be beneficial (but is not required); we will utilize existing graphical user interface (GUI) frameworks in developing new activities, aiming to provide a user-friendly means for students to interact with model codes while engaging in geological learning. 
Participants should plan to have Python installed on their personal computers prior to the workshop, and a sample module will be emailed beforehand to let participants begin exploring the syllabus. ''Prince, M. (2004). Does Active Learning Work? A Review of the Research. Journal of Engineering Education, 93(3), 223-231. doi: 10.1002/j.2168-9830.2004.tb00809.x''.  
In this clinic, we will introduce and experiment with open-source tools designed to promote rapid hypothesis testing for river delta studies. We will show how pyDeltaRCM, a flexible Python model for simulating river delta evolution, can be extended to incorporate any arbitrary processes or forcings. We will highlight how object-oriented model design enables community-driven model development, and how this promotes reproducible science. Our clinic will develop an extended model to simulate deltaic evolution into receiving basins with different slopes. Then, the clinic will step through some basic analyses of the model runs, interrogating both surface processes and subsurface structure. Our overall goal is to familiarize you with the tools we are developing and introduce our approach to software design, so that you may adopt these tools or strategies in your research. Please note that familiarity with Python will be beneficial for this clinic, but is not required. Hands-on examples will be made available via an online programming environment (Google CoLab or similar); instructions for local installation on personal computers will be provided prior to the workshop as well.  +
In this clinic, we will provide a brief introduction to a selection of models (USGS and others), including FaSTMECH (2D/3D hydraulic) and PRMS (watershed hydrology), that have implemented a Basic Model Interface (BMI) and are available in the Python Modeling Toolkit (PyMT). We will interactively explore Jupyter Notebook examples of both stand-alone model operation and, as time permits, loosely coupled integrated modeling applications. Participants will need a laptop with a web browser. Knowledge of Python, Jupyter Notebook, and hydrologic/hydraulic modeling is helpful, but not required.  +
In this clinic, we will talk about diversity in a way that makes it approachable and actionable. We advocate that actions in support of diversity can happen at all career levels, so everyone who is interested can partake. We will discuss concrete strategies and opportunities to help you bring a diverse research group together. Creating a diverse group can start with reaching out to undergraduate minority students to engage them in undergraduate research experiences. This can be done from the ground up: graduate students in a mentoring role can act as productively as faculty in a hiring role. We are all supervisors and mentors in our own ways. We will highlight a number of approaches for engaging with underrepresented minority students when recruiting new graduate students, and suggest some concrete adjustments to your recruitment processes to make them as inclusive as possible. But being proactive does not stop after recruitment. The clinic will have dedicated discussion time to engage in role play, and will provide stories about situations in which you can be an ally. We will identify some pitfalls and ways to reclaim, and provide ideas for more inclusive meetings and mentoring. Lastly, together we can work on creating an overview of current programs that focus on diversity and inclusion, to apply for funding to take action.  +
In this clinic, we will use flow routing, a core ingredient of models of Earth-surface processes such as river incision. Landlab has several flow-routing components that address multiple-flow routing, depression filling, and a diversity of grid types. We'll see how to design a landscape evolution model with relatively rapid flow-routing execution time on large grids.  +
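To illustrate the kind of operation these components perform, here is a bare-bones single-direction (D8) flow router and drainage-area accumulator on a tiny raster grid. This is written from scratch for illustration; it is not Landlab's API, and it omits the depression-filling and multiple-flow-direction variants the clinic will cover.

```python
def d8_receivers(z):
    """For each cell, the steepest-descent neighbor (or the cell itself)."""
    ny, nx = len(z), len(z[0])
    recv = {}
    for i in range(ny):
        for j in range(nx):
            best, best_slope = (i, j), 0.0
            for di in (-1, 0, 1):
                for dj in (-1, 0, 1):
                    ni, nj = i + di, j + dj
                    if (di, dj) == (0, 0) or not (0 <= ni < ny and 0 <= nj < nx):
                        continue
                    dist = (di * di + dj * dj) ** 0.5
                    slope = (z[i][j] - z[ni][nj]) / dist
                    if slope > best_slope:
                        best, best_slope = (ni, nj), slope
            recv[(i, j)] = best
    return recv


def accumulate(z, recv):
    """Drainage area (in cells), passed from high cells to their receivers."""
    order = sorted(recv, key=lambda c: z[c[0]][c[1]], reverse=True)
    area = {c: 1 for c in recv}
    for c in order:  # visit cells from highest to lowest
        r = recv[c]
        if r != c:
            area[r] += area[c]
    return area
```

On a uniform south-dipping ramp, each column of cells drains straight downslope, so drainage area grows linearly toward the bottom row; real landscapes, and real routers, of course get much more interesting than that.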
In this presentation several modeling efforts in Chesapeake Bay will be reviewed that highlight how we can use 3-dimensional, time-dependent hydrodynamic models to provide insight into biogeochemical and ecological processes in marine systems. Two modeling studies will be discussed which illustrate the application of individual-based modeling approaches to simulate the impact of 3-dimensional currents and mixing on pelagic organisms and how these interact with behavior to determine the fate of planktonic species. There are many applications of this approach, related to the transport and fate of fish and invertebrate (e.g., oyster) larvae as well as plankton, that can be used to inform management efforts.<br><br>A long-term operational modeling project will be discussed that combines mechanistic and empirical modeling approaches to provide nowcasts and short-term forecasts of sea nettles, HABs, pathogens, and physical and biogeochemical properties for research, management, and public uses in Chesapeake Bay. This is a powerful technique that can be expanded to any marine system that has a hydrodynamic model and any marine organism for which the habitat can be defined. <br><br>Finally, a new research project will be reviewed in which we are assessing the readiness of a suite of existing estuarine community models for determining past, present and future hypoxia events within the Chesapeake Bay, in order to accelerate the transition of hypoxia model formulations and products from academic research to operational centers. This work, which will ultimately provide the ability to do operational oxygen modeling in Chesapeake Bay (e.g., oxygen weather forecasts), can be extended to other coastal water bodies and any biogeochemical property.  +
In this presentation, James Byrne (Lead Research Software Engineer) and Jonathan Smith (Principal Research Scientist) from the British Antarctic Survey will describe existing digital infrastructure projects and developments happening in and around BAS. They will give a flavour of how technology is influencing the development of environmental and polar science, covering numerous research and operational domains. They will focus on the digital infrastructure applied to IceNet, an AI-based deep learning infrastructure. They will then show how generalized approaches to digital infrastructure are being applied to other areas, including cutting-edge Autonomous Marine Operations Planning (AMOP) capabilities. They will end by highlighting the challenges that need solving in working towards an Antarctic Digital Twin and how we might approach them.  +
In this talk, I will discuss the need for low carbon and sustainable computing. The current emissions from computing are almost 4% of the world total. This is already more than emissions from the airline industry and ICT emissions are projected to rise steeply over the next two decades. By 2040 emissions from computing alone will account for more than half of the emissions budget to keep global warming below 1.5°C. Consequently, this growth in computing emissions is unsustainable. The emissions from production of computing devices exceed the emissions from operating them, so even if devices are more energy efficient producing more of them will make the emissions problem worse. Therefore we must extend the useful life of our computing devices. As a society we need to start treating computational resources as finite and precious, to be utilized only when necessary, and as effectively as possible. We need frugal computing: achieving our aims with less energy and material.  +
In this webinar, I will present a new framework termed “Bayesian Evidential Learning” (BEL) that streamlines the integration of the four components common to building Earth systems: data, model, prediction, and decision. This idea is published in a new book, “Quantifying Uncertainty in Subsurface Systems” (Wiley-Blackwell, 2018), and applied to five real case studies in oil/gas, groundwater, contaminant remediation, and geothermal energy. BEL is not a method but a protocol, based on Bayesianism, that leads to the selection of relevant methods to solve complex modeling and decision problems. In that sense, BEL focuses on purpose-driven data collection and model building. One important contribution of BEL is that it is a data-scientific approach that circumvents complex inversion modeling and instead relies on machine learning from Monte Carlo simulations with falsified priors. The case studies illustrate how modeling time can be reduced from months to days, making the approach practical for large-scale implementations. In this talk, I will provide an overview of BEL and how it relies on global sensitivity analysis, Monte Carlo methods, model falsification, prior elicitation, and data-scientific methods to implement the stated principles of its Bayesian philosophy. I will cover an extensive case study involving the management of a groundwater system in Denmark.  +
In this workshop we will explore publicly available socioeconomic and hydrologic datasets that can be used to inform riverine flood risks under present-day and future climate conditions. We will begin with a summary of different stakeholders’ requirements for understanding flood risk data, through the lens of our experience working with federal, state and local clients and stakeholders. We will then guide participants through the relevant data sources that we use to inform these studies, including FEMA floodplain maps, census data, building inventories, damage functions, and future projections of extreme hydrologic events. We will gather and synthesize some of these data sources, discuss how each data source can be used in impact analyses; and discuss the limitations of each available data source. We will conclude with a brainstorming session to discuss how the scientific community can better produce actionable information for community planners, floodplain managers, and other stakeholders who might face increasing riverine flood risks in the future.  +
Increased computing power, high resolution imagery, new geologic dating techniques, and a more sophisticated comprehension of the geodynamic and geomorphic processes that shape our planet place us on the precipice of major breakthroughs in understanding links among tectonics and surface processes. In this talk, I will use University of Washington’s “M9 project” to highlight research progress and challenges in coupled tectonics and surface processes studies over both short (earthquake) and long (mountain range) timescales. A Cascadia earthquake of magnitude 9 (M9) would cause shaking, liquefaction, landslides and tsunamis from British Columbia to northern California. The M9 project explores this risk, resilience and the mechanics of Cascadia subduction. At the heart of the project are synthetic ground motions generated from 3D finite difference simulations for 50 earthquake scenarios including factors not previously considered, such as the distribution and timing of energy release on the fault, the coherent variation of frequency content of fault motion with fault depth, and the 3D effects of the deep basins along Puget Sound. Coseismic landslides, likely to number in the thousands, represent one of the greatest risks to the millions of people living in Cascadia. Utilizing the synthetic ground motions and a Newmark sliding block analysis, we compute the landscape response for different landslide failure modes. Because an M9 subduction earthquake is well known to have occurred just over 300 years ago, evidence of coseismic landslides triggered by this event should still be present in Washington and Oregon landscapes. We are systematically hunting for these landslides using a combination of radiocarbon dating and surface roughness analysis, a method first developed to study landslides near the Oso 2014 disaster site, to develop more robust regional landslide chronologies to compare with model estimates. 
Resolved ground motions and hillslope response for a single earthquake can then be integrated into coupled landscape evolution and geodynamic models to consider the topographic and surface processes response to subduction over millions of years. This example demonstrates the power of an integrative, multidisciplinary approach to provide deeper insight into coupled tectonic and surface processes phenomena over a range of timescales.  
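The Newmark sliding-block analysis mentioned above can be sketched in a few lines: permanent downslope displacement is the double integral of ground acceleration in excess of the block's critical (yield) acceleration, accumulated while the block is sliding. The rectangular acceleration pulse and parameter values below are invented for illustration and are far simpler than the synthetic M9 ground motions used in the project.

```python
def newmark_displacement(accel, dt, a_crit):
    """Permanent displacement (m) of a rigid block on a slope.

    accel  : ground acceleration time series (m/s^2)
    dt     : time step (s)
    a_crit : critical (yield) acceleration of the block (m/s^2)
    """
    v = 0.0  # velocity of the block relative to the ground
    d = 0.0  # accumulated permanent displacement
    for a in accel:
        # The block starts sliding when ground motion exceeds a_crit and
        # keeps sliding until its relative velocity returns to zero.
        if a > a_crit or v > 0.0:
            v = max(0.0, v + (a - a_crit) * dt)
            d += v * dt
    return d


# A 1 s pulse of 2 m/s^2 against a 1 m/s^2 yield acceleration: the block
# accelerates for 1 s, then decelerates for 1 s, sliding about 1 m total.
accel = [2.0] * 100 + [0.0] * 200
displacement = newmark_displacement(accel, 0.01, 1.0)
```

Lowering `a_crit` (a weaker or wetter slope) or feeding in a longer, stronger record increases the computed displacement, which is how the analysis maps ground-motion scenarios onto landslide likelihood.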
Increasing physical complexity, spatial resolution, and technical coupling of numerical models for various Earth systems require increasing computational resources, efficient code bases and tools for analysis, and community codevelopment. In these arenas, climate technology industries have leapfrogged academic and government science, particularly with regard to the adoption of open community code and collaborative development and maintenance. In this talk, I will discuss industry coding practices I have brought into my workflow for efficient and rapid development, easier maintenance, collaboration and learning, and reproducibility.  +
Interested in which variables influence your model outcome? SALib (Sensitivity Analysis Library) provides commonly used sensitivity analysis methods implemented in a Python programming language package. In this clinic we will use these methods with example models to apportion uncertainty in model output to model variables. We will use models built with the Landlab Earth-surface dynamics framework, but the analyses can be easily adapted for other model software. No prior experience with Landlab or Python is necessary.  +
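As a flavor of what variance-based sensitivity analysis computes, here is a brute-force estimate of first-order indices on a full factorial grid. This is written from scratch for illustration and is far less efficient than SALib's sampling-based estimators (e.g., its Sobol routines), but the quantity computed, Var(E[Y|X_i])/Var(Y), is the same first-order index.

```python
from itertools import product
from statistics import pvariance, mean


def first_order_indices(model, n_vars, levels=21):
    """First-order sensitivity indices by brute force on a factorial grid.

    model  : callable taking a tuple of inputs in [0, 1]
    n_vars : number of input variables
    levels : grid resolution per variable
    """
    grid = [i / (levels - 1) for i in range(levels)]
    points = list(product(grid, repeat=n_vars))
    ys = [model(p) for p in points]
    var_y = pvariance(ys)
    indices = []
    for i in range(n_vars):
        # E[Y | X_i = level], averaged over all the other inputs
        cond_means = [mean(y for p, y in zip(points, ys) if p[i] == lvl)
                      for lvl in grid]
        indices.append(pvariance(cond_means) / var_y)
    return indices
```

For the additive test function y = x1 + 2*x2 (with x3 ignored), the indices come out near 0.2, 0.8, and 0: the variable with the larger coefficient accounts for four times the output variance, and the inert variable accounts for none, which is exactly the kind of apportionment the clinic applies to Landlab models.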
Introduction for the CSDMS 2020 annual meeting, presenting last year's accomplishments and available resources for the community.  +
Introduction for the CSDMS 2021 annual meeting  +
Introduction to the Natural Hazard workshop  +
It is now well established that the evolution of terrestrial species is highly impacted by long-term topographic changes (e.g., high biodiversity in mountain ranges globally). Recent advances in landscape and biological models have opened the gate for deep investigation of the feedback between topographic changes and biological processes over timescales of millions of years (e.g., dispersal, adaptation, speciation). In this clinic, we will use novel codes that couple biological processes with FastScape, a widely used landscape evolution model, and explore biological processes and speciation during and after mountain building under different magnitudes of tectonic rock uplift rates. We will explore and deduce how the magnitude and pace of mountain building impact biodiversity and how such interactions can be tracked in mountain ranges today. Python and Jupyter Notebooks will be used in the clinic, and basic knowledge of Python is desirable.  +
It is well established that coupling and strong feedbacks may occur between solid Earth deformation and surface processes across a wide range of spatial and temporal scales. As both systems on their own encapsulate highly complex and nonlinear processes, fully-coupled simulations require advanced numerical techniques and a flexible platform to explore a multitude of scenarios. Here, we will demonstrate how the Advanced Solver for Problems in Earth's Convection and Tectonics (ASPECT) can be coupled with FastScape to examine feedbacks between lithospheric deformation and landscape evolution. The clinic will cover the fundamental equations being solved, how to design coupled simulations in ASPECT, and examples of coupled continental extension and landscape evolution.  +
JOSS is a developer-friendly, peer-reviewed academic journal for research software packages, providing a path to academic credit for scholarship disseminated via software. I'll give a tour of the journal, its submission/review process, and opportunities to get involved.  +
Jupyter Notebooks can be powerful tools for classroom teaching. This clinic explores different ways to use notebooks in teaching, common pitfalls to avoid, and best practices. It also introduces the CSDMS OpenEarthscape Hub, an online resource that instructors can use that eliminates the need to install software and provides students with direct access to various CSDMS tools.  +
Jupyter notebooks provide a very convenient way to communicate research results: they may contain narrative text, live code, equations and visualizations all in a single document. Beyond notebooks, the Jupyter ecosystem also provides many interactive, graphical components (widgets) that can be used within notebooks to further enhance the user experience. Those widgets serve a variety of purposes such as 2D (Ipympl, Bqplot, Ipycanvas) or 3D (Ipygany) scientific visualization, 2D (Ipyleaflet) or 3D (Pydeck) maps, etc. When the target audience is not familiar with coding, it is possible to turn Jupyter notebooks into interactive dashboards and publish them as stand-alone web applications (using Voilà). In this workshop, we will learn how to leverage this powerful Jupyter environment to build custom, interactive dashboards for exploring models of Earth surface processes in contexts like research, teaching and outreach. After introducing the basics of Jupyter widgets, we will focus on more advanced examples based on Fastscape and/or Landlab. We will also spend some time on hands-on exercises as well as brainstorming dashboard ideas. Clinic materials and installation instructions can be found here: https://github.com/benbovy/jupyter-dash-csdms2021 Related links: - https://github.com/fastscape-lem/gilbert-board - https://github.com/fastscape-lem/ipyfastscape  +
Jurjen will share how FloodTags uses human observations from online media to detect and analyze new (and past) flood events. He also introduces a new approach to citizen engagement via chatbots in instant messengers. With this, local needs are revealed in detail and low-threshold two-way communication about flood risk is possible, even down to community level. How can these new techniques be functional in current flood risk management practices?  +
Landlab  +
Landlab is a Python toolkit for building, coupling, and exploring two-dimensional numerical models of Earth-surface dynamics. HydroShare is an online collaborative environment for sharing data and models that allows users to run models remotely, without needing to install software locally. This clinic will illustrate example Landlab models and show how to run them on HydroShare. It will also provide an introduction to Landlab's features and capabilities, including how to create a model grid, populate it with data, and run numerical algorithms for surface hydrology, hillslope sediment transport, and stream incision. We will illustrate how models can be used for both research and teaching purposes.  +
Landlab is a Python-based toolkit for building, coupling, and exploring two-dimensional numerical models of Earth-surface dynamics. This clinic will first provide a short hands-on introduction to Landlab's features and capabilities. We will highlight examples from several existing models built within the Landlab framework, including: coupling of local ecohydrologic processes, spatial plant interactions, and disturbances (fires and grazing); landscape evolution impacted by plants; overland flow impacted by changing soil properties; and effects of topographic structure on species distribution and evolution. Models will be run with various scenarios for climate change and anthropogenic disturbances, and evolution of state variables and fluxes across the landscape will be explored. We will also show the use of gridded climate data products to drive Landlab simulations. Participants are encouraged to install Landlab on their computers prior to the clinic. Installation instructions can be found at: http://landlab.github.io (select "Install" from the menu bar at the top of the page).  +
Landlab is a Python-language programming library that supports efficient creation of two-dimensional (2D) models of diverse earth-surface systems. For those new to Landlab, this clinic will provide a hands-on introduction to Landlab's features and capabilities, including how to create a grid, populate it with data, and run basic numerical algorithms. For experienced Landlab users, we will review some of the new features in this first full-release version, explore how to created integrated models by combining pre-built process components, and learn the basics of writing new components. Participants are encouraged to install Landlab on their computers prior to the clinic. Installation instructions can be found at: http://landlab.github.io (select "Install" from the menu bar at the top of the page). Clinic participants who have particular questions or applications in mind are encouraged to email the conveners ahead of the CSDMS meeting so that we can plan topics and exercises accordingly.  +
Landscape evolution involves manifold processes from different disciplines, including geology, geomorphology and ecohydrology, often interacting nonlinearly at different space-time scales. While this gives rise to fascinating patterns of interconnected networks of ridges and valleys, it also challenges Landscape Evolution Models (LEMs), which typically rely on long-term numerical simulations and mostly have only current topographies for comparison. While adding process complexity (and presumably realism) is certainly useful to overcome some of these challenges, it also exacerbates issues related to proper calibration and simulation. This talk advocates more focus on the theoretical analysis of LEMs to alleviate some of these issues. By focusing on the essential elements that distinguish landscape evolution, the resulting minimalist LEMs become more amenable to dimensional analysis and other methods of nonlinear field equations, used for example in fluid mechanics and turbulence, offering fertile ground to sharpen model formulation (e.g., the stream-power erosion term), unveil distinct dynamic regimes (e.g., unchannelized, incipient valley formation, transitional, and statistically self-similar fractal regimes), and properly formulate questions related to the existence of steady-state solutions (as opposed to a situation of space-time chaos, akin to geomorphological turbulence). We also discuss benchmarks for evaluating numerical simulation and novel avenues for numerical methods, as well as ways to bridge between spatially discrete models (i.e., river networks) and continuous, partial-differential-equation models.  +
Landscape evolution models often generalize hydrology by assuming steady-state discharge to calculate channel incision. While this assumption is reasonable for smaller watersheds or larger precipitation events, non-steady hydrology is a more applicable condition for semi-arid landscapes, which are prone to short-duration, high-intensity storms. In these cases, the impact of a hydrograph (non-steady method) may be significant in determining long-term drainage basin evolution. This project links a two-dimensional hydrodynamic algorithm with a detachment-limited incision component in the Landlab modeling framework. Storms of varying intensity and duration are run across two synthetic landscapes, and incision rate is calculated throughout the hydrograph. For each case, peak discharge and total incision are compared to the values predicted by steady-state to evaluate the impact of the two hydrologic methods. We explore the impact of different critical shear stress values on total incision using the different flow methods. Finally, a watershed will be evolved to topographic steady-state using both the steady- and non-steady flow routing methods to identify differences in overall relief and drainage network configuration. Preliminary testing with no critical shear stress threshold has shown that although non-steady peak discharge is smaller than the peak predicted by the steady-state method, total incised depth from non-steady methods exceeds the steady-state derived incision depth in all storm cases. With the introduction of an incision threshold, we predict there will be cases where the steady-state method overestimates total incised depth compared to the non-steady method. Additionally, we hypothesize that watersheds evolved with the non-steady method will be characterized by decreased channel concavities. 
This work demonstrates that when modeling landscapes characterized by semi-arid climates, choice of hydrology method can significantly impact the resulting morphology.  
Landscapes developed in rock layers of differing erodibility are common on Earth, as well as on other planets. Hillslopes carved into the soft rock are typically characterized by steep, linear-to-concave-up slopes or “ramps” mantled with material derived from the resistant layers above, often in the form of large blocks. To better understand the role of sediment size in hillslope evolution, we developed a 1-D numerical model of a hogback. The hybrid continuum-discrete model uses a traditional continuum treatment of soil transport while allowing for discrete, rules-based motion of large blocks of rock. Our results show that feedbacks between weathering and transport of the blocks and underlying soft rock can create relief over time and lead to the development of concave-up slope profiles in the absence of rilling processes. In addition, the model reaches a quasi-steady state in which the topographic form and length of the ramp remain constant through time. We use an analytic approach to explore the mechanisms by which our model self-organizes to this state, including adjustment of soil depth, erosion rates, and block velocities along the ramp. The agreement of analytic solutions with the model shows that we understand its behavior well, and can carefully explore implications for hillslope evolution in the field. Current work explores the interactions between blocky hillslopes and channels in a 2-D numerical model built in Landlab. Our models provide a framework for exploring the evolution of layered landscapes and pinpoint the processes for which we require a more thorough understanding to predict their evolution over time.  +
Landscapes of the US Central Lowland were repeatedly affected by the Laurentide Ice Sheet. Glacial processes diminished relief and disrupted drainage networks. Deep valleys carved by glacial meltwater were disconnected from the surrounding uplands. The upland area lacking surface water connection to the drainage network is referred to as non-contributing area (NCA). Decreasing fractions of NCA on older surfaces suggest that NCA becomes drained over time. We propose that the integration could occur via: 1) capture of NCA as channels propagate into the upland, or 2) subsurface or intermittent surface connection of NCA to external drainage networks providing increased discharge to promote channel incision. We refer to the two cases as “disconnected” and “connected” since the crucial difference between them is the hydrological connection of the upland to external drainage. We investigate the differences in evolution and morphology of channel networks in low relief landscapes under disconnected and connected regimes using the Landlab landscape evolution modeling platform. We observe substantially faster rates of erosion and integration of the channel network in the connected case. The connected case also creates longer, more sinuous channels than the disconnected case. Sensitivity tests indicate that hillslope diffusivity has little influence on the evolution and morphology. The fluvial erosion coefficient has significant impact on the rate of evolution, and it influences the morphology to a lesser extent. Our results and a qualitative comparison with landscapes of the glaciated US Central Lowland suggest that connection of NCAs is a potential control on the evolution and morphology of post-glacial landscapes.  +
Landslides mobilize tons of sediment in the blink of an eye. From an engineering perspective, one typically looks at topographical relief as a causal factor triggering landslides. From a geomorphological perspective, one could wonder how landslides and landslide-derived sediment alter the evolution of landscapes. Curious to find out what landslides do to the evolution of landscapes? Tune in for this webinar to figure out how to use the Landlab HyLands component to address this question.  +
Launched in 2021 through a cooperative agreement with the National Science Foundation’s Coastlines and People (CoPe) Program, the Megalopolitan Coastal Transformation Hub is a partnership among 13 institutions, focused on four intertwined goals: 1) Doing science that is useful and used, specifically by facilitating flexible, equitable, and robust long-term planning to manage climate risk in the urban megaregion spanning Philadelphia, New Jersey, and New York City 2) Doing science that advances human understanding of how coastal climate hazards, coastal landforms, and human decisions at household, municipal, market, and policy scales interact to shape climate risk, 3) Training the next generation of leaders in transdisciplinary climate research and engagement, 4) Building a sustainable academic/stakeholder co-production partnership model for just, equitable, and inclusive climate action in diverse coastal, urban megaregions around the world. MACH's initial work has focused particularly on Philadelphia and its surroundings. Core themes within this work include: 1) Characterization of compound flood and heat+flood hazard and risk 2) The role of insurance in the interrelated insurance/mortgage/ housing markets 3) The impacts of flood risk on municipal finances 4) Improving equity considerations in the design of strategies to manage flood risks 5) Household decision-making regarding flood risk in low-income, renter-dominated neighborhoods This talk will introduce MACH and highlight emerging lessons from MACH's transdisciplinary research and engagement model.  +
Live demonstration  +
Macrobenthic species that live within or on top of estuarine sediments can destabilize local mud deposits through bioturbating activities. The resulting enhanced sediment availability will affect large-scale morphological change. We numerically model two contrasting bioturbating species by means of our novel literature-based eco-morphodynamic model. We find significant effects on local mud accumulation and bed elevation change leading to a large-scale reduction in deposited mud. In turn, the species-dependent mud content redefines their habitat and constrains species abundances. Combined species runs reveal a new ecological feedback facilitating survival of the dominant species as a result of combined eco-engineering activity.  +
Major fault systems are the primary manifestation of localized strain at tectonic plate boundaries. Slip on faults creates topography that is constantly reworked by erosion and sediment deposition. This in turn affects the stress state of the brittle upper crust. Numerical models commonly predict that surface processes can modulate the degree of strain localization, i.e., the partitioning of strain onto a given number of master faults and/or the lifespan of individual faults. The detailed mechanisms, potential magnitude, and geological evidence for such feedbacks however remain debated. We address this problem from the perspective of continental rifts, and at the scale of individual fault-bounded structures. Half-grabens in particular constitute ideal natural laboratories to investigate brittle deformation mechanisms (e.g., fault localization, elasto-plastic flexure...) in relation to continued erosion of the master fault footwall and sediment deposition on the hanging wall. Through an energy balance approach, we show that suppressing relief development in a half-graben can significantly enhance the lifespan of its master fault if the upper crust is moderately strong. Simple geodynamic simulations where tectonic topography is either entirely leveled or perfectly preserved confirm our analytical predictions.<br><br>Natural systems, however, lie somewhere in between these two endmembers. To better represent the true efficiency of surface processes at redistributing surficial masses, we couple a 2-D long-term tectonic code with a landscape evolution model that incorporates stream power erosion, hillslope diffusion, and sediment deposition. We identify a plausible range of landscape evolution parameters through morphological analyses of real normal fault-bounded massifs from the East African Rift and Western United States. This allows us to assess the sensitivity of half-graben evolution to a documented range of rheological, climatic, and lithological conditions. 
We find that half-grabens that reach topographic steady-state after a short amount of extension (~1 km) are more likely to accumulate master fault offsets on par with the thickness of the upper crust. Conversely, a longer phase of topographic growth (for example, due to low rock erodibility) will favor the initiation of a new master fault and the abandonment of the initial one. A less erodible crust could thus be more prone to extension on a series of horsts and grabens, while more erodible units would deform as long-lived half-grabens. Lithological controls on erodibility could therefore constitute a form of structural inheritance in all geodynamic contexts.  
Major societal and environmental challenges require forecasting how natural processes and human activities affect one another. There are many areas of the globe where climate affects water resources and therefore food availability, with major economic and social implications. Today, such analyses require significant effort to integrate highly heterogeneous models from separate disciplines, including geosciences, agriculture, economics, and social sciences. Model integration requires resolving semantic, spatio-temporal, and execution mismatches, which are largely done by hand today and may take more than two years. The Model INTegration (MINT) project will develop a modeling environment which will significantly reduce the time needed to develop new integrated models, while ensuring their utility and accuracy. Research topics to be addressed include: 1) New principle-based semiautomatic ontology generation tools for modeling variables, to ground analytic graphs to describe models and data; 2) A novel workflow compiler using abductive reasoning to hypothesize new models and data transformation steps; 3) A new data discovery and integration framework that finds new sources of data, learns to extract information from both online sources and remote sensing data, and transforms the data into the format required by the models; 4) A new methodology for spatio-temporal scale selection; 5) New knowledge-guided machine learning algorithms for model parameterization to improve accuracy; 6) A novel framework for multi-modal scalable workflow execution; and 7) Novel composable agroeconomic models.  +
Man-made objects - 'junk', bombs, artificial reefs, containers - litter the seafloor. Many of the munitions remain active (unstable) and polluting, and are a danger for seabed engineering projects. We numerically modeled how they may move during powerful storms. Kinematic analysis per wave cycle (by period and orbital velocity) created a matrix of movement probabilities, which was convolved with spatial (mapped) values of the same across the German Bight, taking bomb type and sediment type into account. The model can look at historical patterns of migration, and even predict movement in real-time as a storm evolves hour by hour.  +
Mangroves are halophytic tree communities distributed along tropical and subtropical coastlines. They provide invaluable services, such as blue carbon storage, coastal protection and habitat for thousands of species. Despite their global importance, their responses to rapid climate change are yet to be fully understood. Particularly, it is unclear how mangroves will respond to future increases in net evaporation rates (i.e. evaporation - precipitation), which generally lead to an increase in the concentration of soil stressors such as sulfide and sulfate. We addressed this knowledge gap by collecting remote sensing data from a number of remote mangrove islands across the Caribbean and coupling it with a numerical model that describes mangrove vegetated area as a function of net evaporation rate, outer edge island salinity, and hydraulic conductivity of the soil. We found that this modeling framework can capture the variability observed in our mangrove island database, suggesting that an increase in net evaporation rates leads to significant reductions in mangrove island vegetation. Moreover, based on future net evaporation rate scenarios from global climate models we find this trend will likely continue and predict that mangrove islands across the Caribbean will experience significant reduction in vegetated area.  +
Many geophysical models require parameters that are not tightly constrained by observational data. Calibration encompasses methods by which these parameters are estimated by minimizing the difference between observational data and model-simulated equivalents (the objective function). Additionally, uncertainty in estimated parameters is determined. In this clinic we will cover the basics of model calibration, including: (1) determining an appropriate objective function, (2) major classes of calibration algorithms, (3) interpretation of results. In the hands-on portion of the clinic, we will apply multiple calibration algorithms to a simple test case. For this, we will use Dakota, a package that supports the application of many different calibration algorithms.  +
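The core idea, estimating parameters by minimizing a sum-of-squares objective, can be sketched without Dakota itself; this toy example uses SciPy's `least_squares` on a hypothetical two-parameter exponential-decay model fit to synthetic noisy observations (all names and values are illustrative, not clinic material).

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic "observations": an exponential decay with known true
# parameters plus measurement noise.
rng = np.random.default_rng(42)
t = np.linspace(0.0, 10.0, 50)
true_amp, true_rate = 2.0, 0.3
obs = true_amp * np.exp(-true_rate * t) + rng.normal(0.0, 0.02, t.size)

def residuals(p):
    """Objective-function ingredients: model minus data."""
    return p[0] * np.exp(-p[1] * t) - obs

# Calibrate: minimize the sum of squared residuals from a rough guess.
fit = least_squares(residuals, x0=[1.0, 1.0])
print(fit.x)  # estimated (amplitude, rate), close to (2.0, 0.3)
```

In practice, one residual call would wrap a full model run, and packages like Dakota add algorithm choices and uncertainty estimates on top of this same structure.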
Many geoscientists and geoscience organizations vowed to work towards equity and committed to anti-racist action in 2020. But getting started on and staying committed to diversity, equity, and inclusion (DEI) work takes time, energy, and education. This clinic will be a learning and sharing space for everyone who is on a journey towards building a more equitable research unit. Everyone can participate in this clinic, regardless of whether you are just starting your journey or you have travelled many miles and whether your research unit is one person or 100 people. The clinic will begin with discussion and thought exercises about your personal identity. We will then think about what it means for our individual research units to be diverse, equitable, and inclusive. Finally, we will discuss actions you can take to build an anti-racist research unit. Participants will be invited to share their current DEI actions and discuss how they can be adapted for, or expanded in, other settings. The clinic aims to foster an environment in which participants can learn from each other, but participants will not be required to share. Upon completion of this clinic every participant should have a plan for implementing at least one new DEI action, including milestones and accountability checks.  +
Many problems of interest to CSDMS members involve solving systems of conservation laws or balance laws for water wave propagation and inundation, erosion and sediment transport, landscape evolution, or for the flow of overland floods, glaciers, lava, or groundwater. It is often natural to solve these partial differential equations numerically with finite volume methods, in which the domain of interest is divided into finite grid cells and the quantities of interest within each grid cell are updated every time step due to fluxes across the cell boundaries and/or processes within the cell. I will give a brief introduction to some of the general theory of finite volume methods and considerations that affect their accuracy and numerical stability, with illustrations from some of the applications mentioned above.  +
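As a concrete illustration of the update rule described above (cell averages updated by fluxes across cell faces), here is a minimal sketch of a first-order upwind finite volume scheme for linear advection on a periodic domain; this is illustrative, not material from the talk.

```python
import numpy as np

def advect_upwind(u, a, dx, dt, nsteps):
    """First-order upwind finite volume update for u_t + a*u_x = 0
    (a > 0) with periodic boundaries."""
    c = a * dt / dx  # Courant number; stability requires c <= 1
    for _ in range(nsteps):
        # Each cell loses flux through its right face and gains the
        # flux through its left face (upwind value for a > 0).
        u = u - c * (u - np.roll(u, 1))
    return u

# Advect a Gaussian pulse a distance of 0.5 across a unit domain.
nx = 100
dx = 1.0 / nx
x = (np.arange(nx) + 0.5) * dx
u0 = np.exp(-200.0 * (x - 0.3) ** 2)
u1 = advect_upwind(u0, a=1.0, dx=dx, dt=0.5 * dx, nsteps=100)
```

Because the update is written in flux-difference form, total mass `u.sum() * dx` is conserved exactly, a defining property of finite volume methods; the scheme's first-order accuracy shows up as numerical smearing of the pulse.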
Meandering is one of the most distinctive processes in Earth surface dynamics. Integrating the Kinoshita high-sinuosity curve describing meander channel planform geometry into a modified version of the Beck equations describing the riverbed topography, a prototype for a synthetic riverbed topography generating model is made for idealized meandering rivers. This method can be readily extended to apply to any arbitrary river centerline, resulting in the synthetic riverbed topography model, pyRiverBed, presented herein. A meander migration and neck cutoff submodel is also embedded in pyRiverBed; however, unlike existing meander evolution models, the present model places its emphasis on generating the riverbed topography for each snapshot during the migration process. The present model can help meandering river researchers interpolate field-measured bathymetry data using the synthetic bed, design non-flatbed laboratory flumes for experiments, and initialize their hydrodynamics and sediment transport numerical models. It can also provide guidance in stream restoration projects on designing a channel with a bed in morphodynamic equilibrium.  +
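A minimal sketch (not the pyRiverBed code) of generating a Kinoshita high-sinuosity centerline, using the commonly cited form theta(s) = theta0*sin(ks) + theta0^3*(Js*cos(3ks) - Jf*sin(3ks)) and integrating the direction angle to x-y coordinates; the parameter values are illustrative.

```python
import numpy as np

def kinoshita_centerline(theta0=110.0 * np.pi / 180.0, Js=1.0 / 32.0,
                         Jf=1.0 / 192.0, wavelength=10.0, n_bends=3,
                         npts=2000):
    """Return x, y coordinates of a Kinoshita meander centerline.

    theta0 controls sinuosity; Js and Jf are skewness and flatness
    coefficients of the third-harmonic perturbation.
    """
    s = np.linspace(0.0, n_bends * wavelength, npts)  # streamwise arc length
    k = 2.0 * np.pi / wavelength
    theta = theta0 * np.sin(k * s) + theta0**3 * (
        Js * np.cos(3.0 * k * s) - Jf * np.sin(3.0 * k * s))
    ds = s[1] - s[0]
    # Integrate the direction angle to get planform coordinates.
    x = np.cumsum(np.cos(theta)) * ds
    y = np.cumsum(np.sin(theta)) * ds
    return x, y

x, y = kinoshita_centerline()
```

With these parameters the channel is strongly sinuous: the total arc length (30 length units here) is several times the straight-line distance between the endpoints.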
Melting of the Greenland Ice Sheet contributes to rising global sea levels. However, local sea level along much of the Greenland coast is falling due to postglacial rebound and a decrease in gravitational attraction from the ice sheet. This affects Greenlandic coastal communities, which have to adapt their coastal infrastructure, shipping routes, and subsistence fisheries. The “Greenland Rising” project is a collaboration between Lamont-Doherty Earth Observatory and the Greenland Institute of Natural Resources that focuses on assessing and preparing for changing sea level along Greenland’s coastline. While sea level is predicted to fall, the exact magnitude varies widely depending on past and present ice change as well as the viscoelastic properties of the subsurface. I will demonstrate how current sea level change depends on these parameters and how we can integrate numerical models of glacial isostatic adjustment with observations of past sea level and present-day uplift to constrain them. I will further briefly describe the role of co-production in this project, which has allowed us to coordinate bathymetric surveys with local stakeholders from the municipality, industry, and local Hunters and Fishers organization. Combining numerical predictions of sea level change with baseline bathymetry and benthic mapping promises to provide communities with a clearer picture of future environmental change.  +
Model analysis frameworks specify ideas by which models and data are combined to simulate a system of interest. A given modeling framework will provide methods for model parameterization, data and model error characterization, sensitivity analysis (including identifying observations and parameters important to calibration and prediction), uncertainty quantification, and so on. Some model analysis frameworks suggest a narrow range of methods, while other frameworks try to place a broader range of methods in context. Testing is required to understand how well a model analysis framework is likely to work in practice. Commonly models are constructed to produce predictions, and here the accuracy and precision of predictions are considered.<br><br>The design of meaningful tests depends in part on the timing of system dynamics. In some circumstances the predicted quantity is readily measured and changes quickly, such as for weather (temperature, wind and precipitation), floods, and hurricanes. In such cases meaningful tests involve comparing predictions and measured values, and tests can be conducted daily, hourly or even more frequently. The benchmarking tests in rainfall-runoff modeling, such as HEPEX, are in this category. The theoretical rating curves of Kean and Smith provide promise for high flow predictions. Though often challenged by measurement difficulties, short-timeframe systems provide the simplest circumstance for conducting meaningful tests of model analysis frameworks.<br><br>If measurements are not readily available and(or) the system responds to changes over decades or centuries, as generally occurs for climate change, saltwater intrusion of groundwater systems, and dewatering of aquifers, prediction accuracy needs to be evaluated in other ways. 
For example, in recent work two methods were used to identify the likely accuracy of different methods used to construct models of groundwater systems (including parameterization methods): (1) results of complex and simple models were compared and (2) cross-validation experiments. These and other tests can require massive computational resources for any but the simplest of problems. In this talk we discuss the importance of model framework testing in these longer-term circumstances and provide examples of tests from several recent publications. We further suggest that for these long-term systems, the design and performance of such tests are essential for the responsible development of model frameworks, are critical for models of these environmental systems to provide enduring insights, and are one of the most important uses of high performance computing in natural resource evaluation.  
Modelling and simulation are critical approaches to addressing geographic and environmental issues. To date, numerous relevant geo-analysis models have been developed to simulate geographic phenomena and processes that can be used to solve environmental, atmospheric and ecological problems. These models developed by different groups or people are heterogeneous and difficult to share with others. As a result, numerous international groups or organizations have designed and developed standards to unify geo-analysis models, such as OpenMI, BMI and OpenGMS-IS. Models that follow a specific standard can be shared and reused in their own standard framework; however, they still cannot be reused by other standards. Thus, model interoperation may help models be shared and reused across different standards. This research aims to design an interoperability solution that can help users reuse geo-analysis models based on other standards. In this research, we discussed several solutions for model interoperation and analyzed the features of different standards. Firstly, we developed three solutions for model interoperation between different standards and discussed their advantages and disadvantages. Then, we analyzed the key features of model interoperation, including model field mapping, function conversion, data exchange, and component reorganization. Finally, we have developed an interoperability engine for interoperation between models based on OpenMI, BMI, or OpenGMS-IS. We also provided case studies (using e.g. SWMM, FDS, and the Permamodel Frost Number component) to successfully demonstrate the model interoperation.  +
Modelling network-scale sediment (dis)connectivity and its response to anthropic pressures provides a foundational understanding of river processes and sediment dynamics that can be used to forecast future trajectories of river form and process. We present the basin-scale, dynamic sediment connectivity model D-CASCADE, which combines concepts of network modelling with empirical sediment transport formulas to quantify spatiotemporal sediment (dis)connectivity in river networks. The D-CASCADE framework describes sediment connectivity in terms of transfer rate through space and time while accounting for several hydro-morphological and anthropic factors affecting sediment transport. Add-ons can be integrated into D-CASCADE to model local changes in river geomorphology driven by sediment-induced variations in features. Here, we show an application of D-CASCADE to the well-documented Bega River catchment, NSW, Australia, where major geomorphic changes have occurred in the network post-European settlement (ES) after the 1850s, including widespread channel erosion and sediment mobilization. By introducing historic drivers of change in the correct chronological sequence, the D-CASCADE model successfully reproduced the timing and magnitude of major phases of sediment transport and associated channel adjustments over the last two centuries. With this confidence, we then ran the model to test how well it performs at estimating future trajectories of basin-scale sediment transport and sediment budgets at the river reach scale.  +
Modelling river physical processes is of critical importance for flood protection, river management and the restoration of riverine environments. Thanks to continuous increases in computational power and the development of novel numerical algorithms, numerical models are now widely used as standard tools. The freeware BASEMENT is a flexible tool for one- and two-dimensional river process simulations that bundles solvers for hydrodynamics, morphodynamics, scalar advection-diffusion and feedbacks with riparian vegetation. The adoption of a fully cost-free workflow and a lightweight GUI facilitates its broad use in research, practice and education. In this seminar I introduce the different tools within the BASEMENT suite, present some domains of application and discuss ongoing developments.  +
Modern photogrammetry allows us to make very accurate three-dimensional models using images from consumer-grade cameras. Multiview stereo photogrammetry, also known as structure from motion (SfM), is now easily accessible. Coupled with drones, this is transformative technology that lets us all make better maps than the National Geodetic Survey could not long ago. This hands-on course will demonstrate the basic tools and provide some tips that will allow you to map your favorite field area with 3-5 cm horizontal resolution and vertical RMS errors of less than 10 cm. Even better resolution can be obtained for smaller areas, such as outcrops, archaeological digs, or your daughter's art project.<br>We will use Agisoft Photoscan Pro software...please download the free demo (Pro) version before the class. It works equally well on Mac, Windows, and Linux. If you have a choice, choose a machine with an NVidia graphics card. We encourage you to collect a set of images to bring to the class. Guidelines on how best to take images intended for SfM will be sent around before the meeting.  +
Montane Cloud Forests (MCFs) are globally relevant ecological zones that spend the majority of their growing season in cloud and fog. Prior eco-physiological studies have demonstrated that MCFs are remarkably efficient at assimilating CO2 during photosynthesis. This increased efficiency is attributed to how plants in these ecosystems operate within their unique microclimates. Specifically, MCF trees maintain high photosynthesis rates under fog and low-cloud conditions. While this has been observed and quantified in lab and field experiments, current sub-models of plant-atmosphere interactions within Earth system models (ESMs) cannot recreate the enhanced levels of gas exchange measured in ecophysiology studies. This gap in understanding leads to high uncertainty in ESM estimates of evapotranspiration and carbon assimilation rates for MCF ecosystems. It is critical to improve our estimates of MCF hydrologic and photosynthetic processes, as these ecosystems are vulnerable to drought and their microclimatic conditions are likely to be altered by climate change. This talk will explore the gaps in our process-based understanding of water, energy, and carbon budgets for MCFs, how these gaps lead to uncertainties in ESMs at different spatial and temporal scales, and how we can address them in future work.  +
Natural disasters push the process of scientific discovery to its limits: Their enormous scale makes them difficult to recreate in the lab, their destructive power and rare occurrence limit the possibility of acquiring field data, and their profoundly nonlinear behavior over a wide range of scales poses significant modeling challenges. In this talk, I explore how we can leverage insights from four different natural systems to contribute to our fundamental scientific understanding of the role that multiphase processes play in the onset and evolution of extreme events and to our ability to mitigate associated risks.  +
No abstract  +
No abstract  +
No abstract has been submitted  +
No abstract submitted  +
No abstract was needed for this meeting  +
No abstract was needed for this meeting  +
No abstract was needed for this workshop  +