Property:CSDMS meeting abstract presentation

From CSDMS

This is a property of type Text.

Showing 50 pages using this property.
Fill-Spill-Merge (FSM) is an algorithm that distributes runoff on a landscape to fill or partially fill depressions. When a depression fills, excess water can overflow into neighbouring depressions or the ocean. In this clinic, we will use FSM to assess changes in a landscape’s hydrology when depressions in a DEM are partially or fully filled with water. We will discuss why it may be important to treat depressions explicitly rather than simply removing them. I will describe the design of the FSM algorithm, and then we will use FSM on a DEM to look at how landscape hydrology changes under different hydrologic conditions. This clinic may be helpful to those interested in topics such as landscape hydrology, landscape evolution, flow routing, hydrologic connectivity, and lake water storage.  +
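As a point of reference for the abstract above, the classic Priority-Flood approach on which depression-filling algorithms build can be sketched in a few lines of Python. This is an illustrative sketch only, not the FSM implementation (FSM additionally tracks a hierarchy of depressions and routes water volumes between them); all names here are mine.

```python
import heapq

def priority_flood_fill(dem):
    """Fill depressions in a gridded DEM (list of lists of elevations)
    so that every cell can drain to the grid edge. Simplified
    Priority-Flood sketch; not the FSM algorithm itself."""
    rows, cols = len(dem), len(dem[0])
    filled = [row[:] for row in dem]
    visited = [[False] * cols for _ in range(rows)]
    heap = []
    # Seed the priority queue with all edge cells (the outlets).
    for r in range(rows):
        for c in range(cols):
            if r in (0, rows - 1) or c in (0, cols - 1):
                heapq.heappush(heap, (filled[r][c], r, c))
                visited[r][c] = True
    # Grow inward from the lowest unprocessed cell; raise any lower
    # neighbour up to the current water level.
    while heap:
        z, r, c = heapq.heappop(heap)
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and not visited[nr][nc]:
                visited[nr][nc] = True
                filled[nr][nc] = max(filled[nr][nc], z)
                heapq.heappush(heap, (filled[nr][nc], nr, nc))
    return filled
```

On a 3x3 grid with a single interior pit, the pit is raised to the level of its lowest enclosing rim, which is the "fill" end-member of the fill/partial-fill behaviour the clinic explores.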
Fire temporarily alters soil and vegetation properties, driving increases in runoff and erosion that can dramatically increase the likelihood of debris flows. In the immediate aftermath of fire, debris flows most often initiate when surface water runoff rapidly erodes sediment on steep slopes. Due to the complex interactions between runoff generation, sediment transport, and post-fire debris-flow initiation and growth, models that couple these processes can provide valuable insights into the ways in which topography, burn severity, and post-fire recovery influence debris-flow activity. Here, we describe such a model as well as attempts to parameterize temporal changes in model parameters throughout the post-fire recovery process. Simulations of watershed-scale response to individual rainstorms in several southern California burned areas suggest substantial reductions in debris-flow likelihood and volume within the first 1-2 years following fire. Results highlight the importance of considering local rainfall characteristics and sediment supply when using process-based numerical models to assess debris-flow potential. More generally, results provide a methodology for estimating the intensity and duration of rainfall associated with the initiation of runoff-generated debris flows as well as insights into the persistence of debris-flow hazards following fire.  +
Flood hazard in rivers can evolve from changes in the frequency and intensity of flood-flows (hydrologic effects) and in the channel capacity to carry flood-flows (morphologic effects). However, river morphology is complex and often neglected in flood planning. Here, we separate the impacts of morphology vs. hydrology on flood risk for 48 river gauges in Northwestern Washington State. We find that morphologic vs. hydrologic forcings are comparable but not regionally consistent. Prominent morphologic effects on flood-risk are forced by extreme natural events and anthropogenic disturbances. Based on morphologic changes, we identify five categories of river behavior relevant for flood-risk management.  +
Flood modelling at global scales represents a revolution in hydraulic science and has the potential to transform decision-making and risk management in a wide variety of fields. Such modelling draws on a rich heritage of algorithm and data set development in hydraulic modelling over the last 20 years, and is now beginning to yield new insights into current and future flood risk. This paper reviews this progress and outlines recent efforts to develop a 30m resolution true hydrodynamic model of the entire conterminous US. The model is built using an automated framework which uses the US National Elevation Dataset, the HydroSHEDS river network, regionalised frequency analysis to determine extreme flow and rainfall boundary conditions, and the USACE National Levee Dataset to characterize flood defences. Comparison against FEMA and USGS flood maps shows the continental model to have skill approaching that of bespoke models built with local data. The paper describes the development and testing of the model, and its use to estimate current and future flood risk in the US using high resolution population maps and development projections.  +
Flooding is one of the costliest natural disasters, and recent events, including several hurricanes as well as flash floods, have been particularly devastating. In the US alone, the last few years have been record-breaking in terms of flood disasters and have triggered many reactions in public opinion. Governments are now reviewing the available information to better mitigate the risks from flooding.<br>Typically, in the US, flood hazard mapping is done by federal agencies (USACE, FEMA and USGS), with, traditionally, little room or need for research model development in flood hazard applications. Now, with the advent of the National Water Model, the status quo of flood hazard prediction in the US may be changing; however, inundation extent and floodplain depths in the National Water Model are still under early-stage development.<br>This Clinic provides a beginner introduction to the latest capabilities in large-scale 2-D modeling using the LISFLOOD-FP model developed by the University of Bristol, with a nearly 20-year code history. This model has a very long history in research applications, and the algorithms behind the model have also made their way into many existing industry model codes. The session will give participants insights into 2-D flood inundation modeling with LISFLOOD-FP and also a look at more sophisticated sub-grid channel implementations for large-scale application. More specifically, we will look at the data sets needed by the model and then run a simulation of the annual flooding on the Inner Niger Delta in Mali. The Clinic will also give participants the opportunity to look at some high-resolution LiDAR-based model results.  +
Floodplain construction involves the interplay between channel belt sedimentation and avulsion, overbank deposition of fines, and sediment reworking by channel migration. There has been considerable progress in numerical modelling of these processes over the past few years, for example, by using high resolution flow and sediment transport models to simulate river morphodynamics, albeit over relatively small time and space scales. Such spatially-distributed hydrodynamic models are also regularly used to simulate floodplain inundation and overbank sedimentation during individual floods. However, most existing models of long-term floodplain construction and alluvial architecture do not account for flood hydraulics explicitly. Instead, floodplain sedimentation is typically modelled as an exponential function of distance from the river, and avulsion thresholds are defined using topographic indices (e.g., lateral:downstream slope ratios or metrics of channel belt super-elevation). This presentation aims to provide an overview of these issues, and present results from a hydrodynamically-driven model of long-term floodplain evolution. This model combines a simple network-based model of channel migration with a 2D grid-based model of flood hydrodynamics and overbank sedimentation. The latter involves a finite volume solution of the shallow water equations and an advection-diffusion model for suspended sediment transport. Simulation results are compared with observations from several large lowland floodplains, and the model is used to explore hydrodynamic controls on long-term floodplain evolution and alluvial ridge construction.  +
The flow-routing map is the cornerstone of spatially distributed hydrologic models. In this clinic we will introduce HexWatershed, a scale-free, mesh-independent flow direction model. It supports DOE’s Energy Exascale Earth System Model (E3SM) to generate hydrologic parameters and river network representations on both structured and unstructured meshes. In this presentation, we will overview the capabilities of HexWatershed with an emphasis on river network representation and flow direction modeling. We will also provide participants with the tools to begin their own research with hydrologic model workflows. Through hands-on tutorials and demonstrations, participants will gain insights into the relationship between meshes and flow direction, and how HexWatershed handles river networks on various meshes. We will also demonstrate how to use the HexWatershed model outputs in the large-scale hydrologic model, the Model for Scale Adaptive River Transport (MOSART). Participants will be provided with additional resources that can be used to extend the tutorial problems and gain additional familiarity with the tools and workflows introduced. Participants are welcome to bring and utilize their own computers capable of accessing the internet and running a web browser. Tutorials will involve simple scripting operations in the Python language. The conda utility will be used to install libraries. Both QGIS and VisIt packages will be used for visualization.  +
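For readers unfamiliar with flow-direction modeling, the simplest structured-grid analogue of what HexWatershed computes is a steepest-descent (D8) receiver map: each cell drains to its lowest of the eight neighbours. The sketch below is a generic illustration on a regular grid, not HexWatershed code (which generalizes the idea to hexagonal and unstructured meshes); all names are hypothetical.

```python
def d8_flow_direction(dem):
    """Steepest-descent (D8) flow direction on a regular grid.
    Returns a dict mapping (row, col) -> (row, col) of the receiver
    cell, or None for cells with no strictly lower neighbour
    (pits and flats, which real models resolve by depression
    filling or stream burning)."""
    rows, cols = len(dem), len(dem[0])
    receivers = {}
    for r in range(rows):
        for c in range(cols):
            best, best_drop = None, 0.0
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    if dr == dc == 0:
                        continue
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols:
                        # Diagonal neighbours are farther away, so the
                        # drop is scaled by the cell-centre distance.
                        dist = (dr * dr + dc * dc) ** 0.5
                        drop = (dem[r][c] - dem[nr][nc]) / dist
                        if drop > best_drop:
                            best, best_drop = (nr, nc), drop
            receivers[(r, c)] = best
    return receivers
```

Chaining the receiver map from every cell downstream is what yields the river network representation the clinic builds on.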
Fluvial incision since late Miocene time (5 Ma) has shaped the transition between the Central Rocky Mountains and adjacent High Plains. Despite a clear contrast in erodibility between the mountains and plains, erodibility has not been carefully accounted for in previous attempts to model the geomorphic evolution of this region. The focus of this work to date has been to constrain erodibility values with a simple toy model, and to reconstruct the paleosurface of the Miocene Ogallala Formation prior to its dissection beginning at 5 Ma. This surface reconstruction will be used as an initial condition in subsequent modeling.  +
Food security and poverty in Bangladesh are very dependent on natural resources, which fluctuate with a changing environment. The ecosystem services supporting the rural population are affected by several factors including climate change, upstream river flow modifications, commercial fish catches in the Bay of Bengal, and governance interventions. The ESPA Deltas project aims to holistically describe the interaction between the interlinked bio-physical environment and the livelihoods of the rural poorest in coastal Bangladesh, who are highly dependent on natural resources and live generally on less than US$1.50 per day. Here we describe a new integrated model that allows a long-term analysis of the possible changes in this system by linking projected changes in physical processes (e.g. river flows, nutrients), with productivity (e.g. fish, rice), social processes (e.g. access, property rights, migration) and governance (e.g. fisheries, agriculture, water and land use management). Bayesian Networks and Bayesian Processes allow multidisciplinary integration and exploration of specific scenarios. This integrated approach is designed to provide Bangladeshi policy makers with science-based evidence of possible development trajectories. This includes the likely robustness of different governance options for natural resource conservation and poverty levels. Early results highlight the far-reaching implications of sustainable resource use and international cooperation to secure livelihoods and ensure a sustainable environment in coastal Bangladesh.  +
From G.K. Gilbert's "The Convexity of Hilltops" to highly-optimized numerical implementations of drainage basin evolution, models of landscape evolution have been used to develop insight into the development of specific field areas, create testable predictions of landform development, demonstrate the consequences of our current theories for geomorphic processes, and spark imagination through hypothetical scenarios. In this talk, I discuss how the types of questions tackled with landscape evolution models have changed as observational data (e.g., high-resolution topography) and computational technology (e.g., accessible high performance computing) have become available. I draw on a natural experiment in postglacial drainage basin incision and a synthetic experiment in a simple tectonic setting to demonstrate how landscape evolution models can be used to identify how much information the topography or other observable quantities provide in inferring process representation and tectonic history. In the natural example, comparison of multiple calibrated models provides insight into which process representations improve our ability to capture the geomorphic history of a site. Projections into the future characterize where in the landscape uncertainty in the model structure dominates over other sources of uncertainty. In the synthetic case, I explore the ability of a numerical inversion to recover geomorphic-process relevant (e.g., detachment vs. transport limited fluvial incision) and tectonically relevant (e.g., date of fault motion onset) system parameters.  +
GCAM is an open-source, global, market equilibrium model that represents the linkages between energy, water, land, climate, and economic systems. One of GCAM's many outputs is projected land cover/use by subregion. Subregional projections provide context and can be used to understand regional land dynamics; however, Earth System Models (ESMs) generally require gridded representations of land at finer scales. Demeter, a land use and land cover disaggregation model, was created to provide this service. Demeter directly ingests land projections from GCAM and creates gridded products that match the desired resolution and land-class requirements of the user.  +
GPUs can make models, simulations, machine learning, and data analysis much faster, but how? And when? In this clinic we'll discuss whether you should use a GPU for your work, whether you should buy one, which one to buy, and how to use one effectively. We'll also get hands-on and speed up a landscape evolution model together. This clinic should be of interest both to folks who would like to speed up their code with minimal effort as well as folks who are interested in the nitty gritty of pushing computational boundaries.  +
GeoClaw (http://www.geoclaw.org) is an open-source software package for solving two-dimensional depth-averaged equations over general topography using high-resolution finite volume methods and adaptive mesh refinement. Wetting-and-drying algorithms allow modeling inundation and overland flows. The primary applications where GeoClaw has been used are tsunami modeling and storm surge, although it has also been applied to dam break and other overland flooding problems. The first part of this clinic will present an overview of the capabilities of GeoClaw, including a number of new features that have been added in the past few years. These include: - Depth-averaged Boussinesq-type dispersive equations that better model short-wavelength tsunamis, such as those generated by landslides or asteroid impacts. Solving these equations requires implicit solvers (due to the higher-order derivatives in the equations). This is now working with the adaptive mesh refinement (AMR) algorithms in GeoClaw, which are critical for problems that require high-resolution coastal modeling while also modeling trans-oceanic propagation, for example. - Better capabilities for extracting output at frequent times on a fixed spatial grid by interpolation from the AMR grids during a computation. The resulting output can then be used for making high-resolution animations or for post-processing (e.g. the velocity field at frequent times can be used for particle tracking, as needed when tracking tsunami debris). - Ways to incorporate river flows or tidal currents into GeoClaw simulations. - Better coupling with the D-Claw code for modeling debris flows, landslides, lahars, and landslide-generated tsunamis. (D-Claw is primarily developed by USGS researchers Dave George and Katy Barnhart). The second part of the clinic will be a hands-on introduction to installing GeoClaw and running some of the examples included in the distribution, with tips on how best to get started on a new project. 
GeoClaw is distributed as part of Clawpack (http://www.clawpack.org), and is available via the CSDMS model repository. For those who wish to install the software in advance on laptops, please see http://www.clawpack.org/installing.html. We will also go through this briefly and help with any issues that arise on your laptop (provided it is a Mac or Linux machine; we do not support Windows.) You may need to install some prerequisites in advance, such as Xcode on a Mac (since we require "make" and other command line tools), a Fortran compiler such as gfortran, and basic scientific Python tools such as NumPy and Matplotlib. See https://www.clawpack.org/prereqs.html.  
GeoClaw (http://www.geoclaw.org) is an open-source software package for solving two-dimensional depth-averaged equations over general topography using high-resolution finite volume methods and adaptive mesh refinement. Wetting-and-drying algorithms allow modeling inundation or overland flows. The primary applications where GeoClaw has been used are tsunami modeling and storm surge, although it has also been applied to dam break problems and other overland floods. This tutorial will give an introduction to setting up a tsunami modeling problem in GeoClaw, including: * Overview of capabilities, * Installing the software, * Using Python tools provided in GeoClaw to acquire and work with topography DEMs and other datasets, * Setting run-time parameters, including specifying adaptive refinement regions, * The VisClaw plotting software to visualize results using Python tools or display on Google Earth. GeoClaw is distributed as part of Clawpack (http://www.clawpack.org). For those who wish to install the software in advance on laptops, please see http://www.clawpack.org/installing.html. Tutorials can be found here: https://github.com/clawpack/geoclaw_tutorial_csdms2019  +
GeoClaw is an open source Fortran/Python package based on Clawpack (conservation laws package), which implements high-resolution finite volume methods for solving wave propagation problems with adaptive mesh refinement. GeoClaw was originally developed for tsunami modeling and has been validated via benchmarking workshops of the National Tsunami Hazard Mitigation Program for use in hazard assessment studies funded through this program. Current projects include developing new tsunami inundation maps for the State of Washington and the development of new probabilistic tsunami hazard assessment (PTHA) methodologies. The GeoClaw code has also been extended to the study of storm surge and forms the basis for D-Claw, a debris flow and landslide code being developed at the USGS and recently used to model the 2014 Oso, Washington landslide, for example.  +
Getting usable information out of climate and weather models can be a daunting task. The direct output from the models typically has unacceptable biases on local scales, and as a result a large number of methods have been developed to bias correct or downscale the climate model output. This clinic will describe the range of methods available as well as provide background on the pros and cons of different approaches. This will cover a variety of approaches from relatively simple methods that just rescale the original output, to more sophisticated statistical methods that account for broader weather patterns, to high-resolution atmospheric models. We will focus on methods for which output or code are readily available for end users, and discuss the input data required by different methods. We will follow this up with a practical session in which participants will be supplied a test dataset and code with which to perform their own downscaling. Participants interested in applying these methods to their own region of interest are encouraged to contact the instructor ahead of time to determine what inputs would be required.  +
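To give a flavor of the simpler statistical methods the clinic surveys, empirical quantile mapping (a widely used bias-correction technique) can be sketched in a few lines. This is an illustrative toy, not the clinic's code: real implementations interpolate between quantiles, handle seasonality, and extrapolate beyond the calibration range; all names below are mine.

```python
def quantile_map(model_hist, obs_hist, model_future):
    """Empirical quantile mapping: correct a model value by finding
    its quantile in the historical model distribution and reading off
    the observed value at the same quantile. Assumes model_hist and
    obs_hist have equal length (paired calibration samples)."""
    m = sorted(model_hist)
    o = sorted(obs_hist)
    n = len(m)
    corrected = []
    for x in model_future:
        # Empirical rank of x within the historical model values,
        # clamped to the valid index range of the observed sample.
        rank = sum(1 for v in m if v <= x)
        q = max(0, min(n - 1, rank - 1))
        corrected.append(o[q])
    return corrected
```

A model value matching the historical median is replaced by the observed median, which is exactly the "rescale the original output" idea described above, applied quantile by quantile.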
Global models of Earth’s climate have expanded beyond their geophysical heritage to include terrestrial ecosystems, biogeochemical cycles, vegetation dynamics, and anthropogenic uses of the biosphere. Ecological forcings and feedbacks are now recognized as important for climate change simulation, and the models are becoming models of the entire Earth system. This talk introduces Earth system models, how they are used to understand the connections between climate and ecology, and how they provide insight to environmental stewardship for a healthy and sustainable planet. Two prominent examples discussed in the talk are anthropogenic land use and land-cover change and the global carbon cycle. However, there is considerable uncertainty in how to represent ecological processes at the large spatial scale and long temporal scale of Earth system models. Further scientific advances are straining under the ever-growing burden of multidisciplinary breadth, countered by disciplinary chauvinism and the extensive conceptual gap between observationalists developing process knowledge at specific sites and global scale modelers. The theoretical basis for Earth system models, their development and verification, and experimentation with these models requires a new generation of scientists, adept at bridging the disparate fields of science and using a variety of research methodologies including theory, numerical modeling, observations, and data analysis. The science requires a firm grasp of models, their theoretical foundations, their strengths and weaknesses, and how to appropriately use them to test hypotheses of the atmosphere-biosphere system. It requires a reinvention of how we learn about and study nature.  +
Google Earth Engine is a powerful geographic information system (GIS) that brings programmatic access and massively parallel computing to petabytes of publicly-available Earth observation data using Google’s cloud infrastructure. In this live-coding clinic, we’ll introduce some of the foundational concepts of workflows in Earth Engine and lay the groundwork for future self-teaching. Using the JavaScript API, we will practice: raster subsetting, raster reducing in time and space, custom asset (raster and vector) uploads, visualization, mapping functions over collections of rasters or geometries, and basic exporting of derived products.  +
Google Earth Engine (GEE) is a multi-petabyte catalog of satellite imagery and geospatial datasets with planetary-scale analysis capabilities. Now imagine all you need to work on it is a browser and an internet connection. This hands-on workshop will introduce you to and showcase cloud-native geospatial processing. We will explore the platform’s built-in catalog of 100+ petabytes of geospatial datasets and build some analysis workflows. Additional topics will include uploading & ingesting your own data to Google Earth Engine, time series analysis essential for change monitoring, and data and code principles for effective collaboration. The hope is to introduce the cloud-native geospatial analysis platform and to rethink how we work with data as we produce and consume more of it. If you want to follow along, bring your laptop and register for an Earth Engine account here: https://signup.earthengine.google.com P.S. I recommend using a personal account :) you get to keep it  +
Granular materials are ubiquitous in the environment, in industry and in everyday life, and yet are poorly understood. Modelling the behavior of a granular medium is critical to understanding problems ranging from hazardous landslides and avalanches in the Geosciences to the design of industrial equipment. Typical granular systems contain millions of particles, but the underlying equations governing that collective motion are as yet unknown. The search for a theory of granular matter is one of the fundamental problems in physics and engineering and is of immense practical importance for mitigating the risk of geohazards. Direct simulation of granular systems using the Discrete Element Method is a powerful tool for developing theories and modelling granular systems. I will describe the simulation technique and show its application to a diverse range of flows.  +
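To give a concrete sense of the Discrete Element Method mentioned above, the sketch below integrates a single linear spring-dashpot contact between two equal particles on a line, the basic force model at the heart of many DEM codes. It is a minimal illustration under invented parameters, not the simulation software used in the talk; production DEM adds tangential friction, rotation, and neighbour search across millions of particles.

```python
def bounce(x1, x2, v1, v2, radius=0.5, k=1000.0, c=2.0,
           m=1.0, dt=1e-4, steps=20000):
    """Two equal spheres on a line interacting through a linear
    spring-dashpot contact. Semi-implicit Euler time stepping:
    velocities are updated from the contact force, then positions
    from the new velocities. Returns final positions and velocities."""
    for _ in range(steps):
        overlap = 2 * radius - (x2 - x1)
        if overlap > 0:
            # Repulsive spring plus damping on the closing velocity.
            f = k * overlap + c * (v1 - v2)
            a = f / m
        else:
            a = 0.0
        v1 -= a * dt   # equal and opposite forces on the pair
        v2 += a * dt
        x1 += v1 * dt
        x2 += v2 * dt
    return x1, x2, v1, v2
```

With particles launched toward each other, the pair rebounds with slightly reduced speed (the dashpot dissipates energy) while total momentum is conserved, which is the qualitative behaviour a DEM contact law must reproduce.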
Great mentors engage early career scientists in research, open doors, speak the ‘unspoken rules’, and inspire the next generation. Yet many of us step into mentoring roles without feeling fully confident, or are uncertain how to create an inclusive environment that allows early career scientists from varied backgrounds to thrive. In this interactive workshop, we will share experiences and explore tools that can help build successful mentoring relationships, create supportive cohorts, and feel confident in becoming a great mentor.  +
Hazard assessment for post-wildfire debris flows, which are common in the steep terrain of the western United States, has focused on the susceptibility of upstream basins to generate debris flows. However, reducing public exposure to this hazard also requires an assessment of hazards in downstream areas that might be inundated during debris flow runout. Debris flow runout models are widely available, but their application to hazard assessment for post-wildfire debris flows has not been extensively tested. I will discuss a study in which we apply three candidate debris flow runout models in the context of the 9 January 2018 Montecito event. We evaluate the relative importance of flow volume and flow material properties in successfully simulating the event. Additionally, I will describe an in-progress user needs assessment designed to understand how professional decision makers (e.g., county emergency managers, floodplain managers, and Burned Area Emergency Response team members) might use post-fire debris flow inundation hazard assessment information. https://agupubs.onlinelibrary.wiley.com/doi/full/10.1029/2021JF006245 Katy Barnhart is a Research Civil Engineer at the U.S. Geological Survey’s Geologic Hazards Science Center. She received her B.S.E. (2008) in Civil and Environmental Engineering from Princeton University and her M.S. (2010) and Ph.D. (2015) in Geological Sciences from the University of Colorado at Boulder. Her research uses numerical modeling to understand past and forecast future geomorphic change on a variety of timescales.  +
Here we present direct numerical simulations of the hysteresis of the Antarctic Ice Sheet, and use linear response theory to project, from these kinds of simulations, Antarctica's sea level contribution to the end of the century. Related publications: * A. Levermann et al. 2020. Projecting Antarctica's contribution to future sea level rise from basal ice-shelf melt using linear response functions of 16 ice sheet models (LARMIP-2). Earth System Dynamics 11 (2020) 35-76, doi: 10.5194/esd-11-35-2020. * J. Garbe, T. Albrecht, A. Levermann, J.F. Donges, R. Winkelmann, 2020. The Hysteresis of the Antarctic Ice Sheet. Nature 585 (2020), 538-544, doi: 10.1038/s41586-020-2727-5.  +
HexWatershed is a hydrologic flow direction model that supports structured and unstructured meshes. It uses state-of-the-art topological relationship-based stream burning and depression-filling techniques to produce high-quality flow-routing datasets across scales. HexWatershed has substantially improved over the past two years, including support for the DGGRID discrete global grid system (DGGS). This presentation will provide an overview of HexWatershed, highlighting its capabilities, new features, and improvements. Through hands-on tutorials and demonstrations, attendees will gain insights into the underlying philosophy of the HexWatershed model, and how to use HexWatershed products to run large-scale hydrologic models in watersheds worldwide. Specifically, this tutorial will cover major components in the HexWatershed ecosystem, including the computational mesh generation process, river network representation, and flow direction modeling. We will provide participants with resources to extend the tutorial problems and gain additional familiarity with the tools and workflows introduced. Attendees are encouraged to bring their laptops with internet access and a functional web browser. Tutorials will involve scripting operations in the Python language, run in Jupyter Notebooks. We will use the Conda utility to install dependency libraries and Visual Studio Code to run the notebooks.  +
High-resolution topographic (HRT) data is becoming more easily accessible and prevalent, and is rapidly advancing our understanding of myriad surface and ecological processes. Landscape connectivity is the framework that describes the routing of fluids, sediments, and solutes across a landscape and is a primary control on geomorphology and ecology. Connectivity is not a static parameter, but rather a continuum that dynamically evolves on a range of temporal and spatial scales, and the observation of which is highly dependent on the available methodology. In this clinic we showcase the utility of HRT for the observation and characterization of landscapes and compare results with those of coarser spatial resolution data-sets. We highlight the potential for integrating HRT observations and parameters such as vegetation density, surface relief, and local slope variability with numerical surface process models. Participants will gain an understanding of the basics of HRT, data availability and basic analysis, and the use of HRT parameters in modeling.  +
How can we increase the diversity, richness and value of Spatial Data Infrastructure (SDI) to the Disasters and Natural Hazards community stakeholders? We’ll look at some of the current (and past) Open Geospatial Consortium initiatives to examine exciting work to enable sharing of complex data and models within the community using open standards.  +
Human settlements in dynamic environmental settings face the challenges both of managing their own impact on their surroundings and also adapting to change, which may be driven by a combination of local and remote factors, each of which may involve both human and natural forcings. Impacts of and responses to environmental change play out at multiple scales which involve complex nonlinear interactions between individual actors. These interactions can produce emergent results where the outcome at the community scale is not easily predicted from the decisions taken by individuals within the community. Agent-based simulations can be useful tools to explore the dynamics of both the human response to environmental change and the environmental impacts of human activity. Even very simple models can be useful in uncovering potential for unintended consequences of policy actions. Participatory simulations that allow people to interact with a system that includes simulated agents can be useful tools for teaching and communicating about such unintended consequences. I will report on progress on agent-based simulations of environmentally stressed communities in Bangladesh and Sri Lanka and preliminary results of using a participatory coupled model of river flooding and agent-based real estate markets to teach about unintended consequences of building flood barriers.  +
Humans alter natural geomorphic systems by modifying terrain morphology and through on-going actions that change patterns of sediment erosion, transport, and deposition. Long-term interactions between humans and the environment can be examined using numerical modeling. Human modifications of the landscape such as land cover change and agricultural tillage have been implemented within some landscape evolution models, yet little effort has been made to incorporate agricultural terraces. Terraces of various forms have been constructed for millennia in the Mediterranean, Southeast Asia, and South America; in those regions some terraces have undergone cycles of use, abandonment, and reuse. Current implementations of terraces in existing models are as static objects that uniformly impact landscape evolution, yet empirical studies have shown that terrace impact depends upon whether they are maintained or abandoned. We previously tested a simple terrace model, comprising a single terrace wall on a synthetic hillside with 20% slope, to examine the impacts of maintenance and abandonment. In this research we modify the terrace model to include a wider variety of terrace forms and couple it with a landscape evolution model to test the extent to which terraced terrain morphology is related to terrace form. We also test how landscape evolution, after abandonment of terraced fields, differs based on length of time the terraces were maintained. We argue that construction and maintenance of terraces has a significant impact on the spatial patterning of sediment erosion and deposition and thus landscape evolution modeling of terraced terrain requires coupling with a dynamic model of terrace use.  +
Hurricanes can greatly modify the sedimentary record, but our coastal scientific modeling community has rather limited capability to predict such processes. A three-dimensional sediment transport model was developed in the Regional Ocean Modeling System (ROMS) to study seabed erosion and deposition on the Louisiana shelf in response to Hurricanes Katrina and Rita in 2005. Conditions to either side of Hurricane Rita's storm track differed substantially, with the region to the east having stronger winds, taller waves, and thus deeper erosion. This study indicated that major hurricanes can disturb the shelf seabed at centimeter to meter scales.  +
Hydrology is a science of extremes: droughts and floods. In either case, the hydrologic response arises from the combination of many factors, such as terrain, land cover, land use, infrastructure, etc. Each has different, overlapping spatial domains. Superimposed upon these are temporal variations, driven by stochastic weather events that follow seasonal climatic regimes. To calculate risk (expected loss) requires a loss function (damage) and a response domain (flood depths) over which that loss is integrated. The watershed provides the spatial domain that collects all these factors. This talk will discuss the data used to characterize hydrologic response.  +
I will discuss an application of the Migration, Intensification, and Diversification as Adaptive Strategies (MIDAS) agent-based modeling framework to modeling labor migration across Bangladesh under the stressor of sea-level rise (SLR). With this example, I hope to highlight some hard-to-resolve challenges in representing adaptive decision-making under as-yet unexperienced stressors in models. Drawing together what is more and what is less known in projections for future adaptation, I will discuss strategies for ‘responsible’ presentation and dissemination of model findings.  +
If one system comes to (my) mind where the human element is intertwined with the environment, it is the Louisiana coastal area in the Southern United States. Often referred to as the working coast, coastal Louisiana supports large industries with its ports, navigation channels, oil, and productive fisheries. In addition to that, Louisianians have a significant cultural connection to the coastal wetlands and their natural resources. Unfortunately, the land is disappearing into the sea with coastal erosion rates higher than anywhere else in the US. Due to these high rates of land loss, this system needs rigorous protection and restoration. While the restoration plans are mostly focused on building land, the effects on, for example, fisheries of proposed strategies should be estimated as well before decisions can be made on how to move forward. Through several projects I have been involved in, from small modeling projects to bold coastal design programs, I present how coupled models play a key role in science-based coastal management that considers the natural processes as well as the human element.  +
In dry regions, escarpments are key landforms for exploring landform-rainfall interactions. Here we present a modeling approach for the evolution of arid cliffs and sub-cliff slopes, incorporating rainfall forcing at the scale of individual rainstorms. We used numerical experiments to mechanistically test how arid cliffs and sub-cliff slopes evolve according to different geomorphic characteristics and variations in rainstorm properties.  +
In formulating tectono-geomorphic models of landscape evolution, Earth is typically divided into two domains: a surface domain in which “geomorphic” processes are solved for, and a tectonic domain of earth deformation driven generally by differential plate movements. Here we present a single mechanical framework, the Failure Earth Response Model (FERM), that unifies the physical description of dynamics within and between the two domains. FERM is constructed on two basic assumptions about the three-dimensional stress state and rheological memory: I) Material displacement, whether tectonic or geomorphic in origin, at or below Earth’s surface, is driven by local forces overcoming local resistance, and II) Large displacements, whether tectonic or geomorphic in origin, irreversibly alter Earth material properties, enhancing a long-term strain memory mapped into the topography. In addition to gathering the stresses arising from far-field tectonic processes, topographic relief, and inertial surface processes into a single stress state for every point, the FERM formulation allows explicit consideration of the contributions of pore pressure fluctuations, seismic accelerations, and fault damage to the evolving landscape. Incorporating these in the FERM model significantly influences the tempo of landscape evolution and leads to highly heterogeneous and anisotropic stress and strength patterns, largely predictable from knowledge of mantle kinematics. The resulting unified description permits exploration of surface-tectonic interactions from outcrop to orogen scales and allows elucidation of the high-fidelity orogenic strain and climate memory contained in topography.  +
In landscape evolution models, climate change is often assumed to be synonymous with changes in rainfall. In many climate changes, however, the dominant driver of landscape evolution is change in vegetation cover. In this talk I review case studies that attempt to quantify the impact of vegetation changes on landscape evolution, including examples from hillslope/colluvial, fluvial, and aeolian environments, spatial scales of ~10 m to whole continents, and time scales from decadal to millennial. Particular attention is paid to how to parameterize models using paleoclimatic and remote sensing data.  +
In software engineering, an interface is a group of functions with prescribed names, argument types, and return types. When a developer implements an interface for a piece of software, they fill out the details for each function while keeping the signatures intact. CSDMS has developed the Basic Model Interface (BMI) for facilitating the conversion of a model written in C, C++, Fortran, Python, or Java into a reusable, plug-and-play component. By design, BMI functions are simple. However, when trying to implement them, the devil is often in the details. In this hands-on clinic, we'll take a simple model of the two-dimensional heat equation, written in Python, and together we'll write the BMI functions to wrap it, preparing it for transformation into a component. As we develop, we’ll explore how to use the wrapped model with a Jupyter Notebook. To get the most out of this clinic, come prepared to code! We'll have a lot to write in the time allotted for the clinic. We recommend that clinic attendees have a laptop with the Anaconda Python distribution installed. We also request that you review the BMI description (http://csdms.colorado.edu/wiki/BMI_Description) and the BMI documentation (https://bmi-spec.readthedocs.io) before the start of the clinic.  +
In software engineering, an interface is a set of functions with prescribed names, argument types, and return types. When a developer implements an interface for a piece of software, they fill out the details for each function while keeping the signatures intact. CSDMS has developed the Basic Model Interface (BMI) for facilitating the conversion of an existing model written in C, C++, Fortran, Python, or Java into a reusable, plug-and-play component. By design, BMI functions are straightforward to implement. However, when trying to match BMI functions to model behaviors, the devil is often in the details.<br>In this hands-on clinic, we'll take a simple model, an implementation of the two-dimensional heat equation in Python, and together we'll write the BMI functions to wrap it, preparing it for transformation into a component. As we develop, we’ll explore how to use the wrapped model with a Jupyter Notebook.<br>To get the most out of this clinic, come prepared to code! We'll have a lot to write in the time allotted for the clinic. We recommend that clinic attendees have a laptop with the Anaconda Python distribution installed. We also request that you read over:<br>BMI description (https://csdms.colorado.edu/wiki/BMI_Description)<br>BMI documentation (http://bmi-python.readthedocs.io)<br>before participating in the clinic.  +
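As a taste of what this kind of clinic covers, here is a minimal, self-contained sketch of the wrapping pattern: a toy heat-equation model hidden behind a class whose method names follow BMI conventions (`initialize`, `update`, `get_value`, `finalize`). This is an illustration only, not the full specification: real BMI implementations define many more functions (grid, variable, and time getters), and the variable name used below is illustrative.

```python
# Minimal sketch of wrapping a model with BMI-style functions.
# Illustration only: the real BMI spec defines many more functions.

class Heat2D:
    """Toy explicit solver for the 2D heat equation on a small grid."""

    def __init__(self, shape=(5, 5), alpha=0.25, dt=1.0):
        self.alpha = alpha
        self.dt = dt
        self.time = 0.0
        self.temperature = [[0.0] * shape[1] for _ in range(shape[0])]
        self.temperature[shape[0] // 2][shape[1] // 2] = 100.0  # hot spot

    def advance(self):
        t = self.temperature
        rows, cols = len(t), len(t[0])
        new = [row[:] for row in t]
        for i in range(1, rows - 1):
            for j in range(1, cols - 1):
                lap = t[i-1][j] + t[i+1][j] + t[i][j-1] + t[i][j+1] - 4 * t[i][j]
                new[i][j] = t[i][j] + self.alpha * self.dt * lap
        self.temperature = new
        self.time += self.dt


class BmiHeat2D:
    """BMI-style wrapper: fixed method names, model internals hidden."""

    def initialize(self, config=None):
        self._model = Heat2D()

    def update(self):
        self._model.advance()

    def get_current_time(self):
        return self._model.time

    def get_value(self, name):
        # "plate_surface__temperature" is an illustrative variable name.
        if name != "plate_surface__temperature":
            raise KeyError(name)
        # Return a flat copy rather than internal state.
        return [v for row in self._model.temperature for v in row]

    def finalize(self):
        self._model = None
```

The point of the pattern is that a framework (or a Jupyter Notebook) only ever calls the wrapper's fixed method names, so any model wrapped this way becomes interchangeable with any other.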
In the modeler community, hindcasting (testing models against knowledge of past events) is required of computer models before they can provide reliable results to users. CSDMS 2.0 “Moving forward” has proposed incorporating benchmarking data into its modeling framework. Data collection in natural systems has advanced significantly, but it still lags the resolution in time and space that models require, and it includes natural variability beyond our understanding, which makes thorough testing of computer models difficult.<br><br>In the experimentalist community, research in Earth-surface processes and subsurface stratal development is in a data-rich era, with rapid expansion of high-resolution, digitally based data sets that were not available even a few years ago. Millions of dollars have been spent to build and renovate flume laboratories. Advanced experimental technologies and methodologies allow more numerous and sophisticated experiments at large scales and in fine detail. A joint effort between modelers and experimentalists is a natural step toward a great synergy between the two communities.<br><br>The time for a coherent effort to build a strong global research network for these two communities is now. First, both communities should work to establish best practices and metadata standards for data collection. Sediment experimentalists are an example of a community in the “long tail”, meaning that their data are often collected in one-of-a-kind experimental set-ups and isolated from other experiments. Second, there should be a centralized knowledge base (a web-based repository for data and technology) easily accessible to modelers and experimentalists. Experimentalists also hold a lot of “dark data”: data that are difficult or impossible to access through the Internet. 
This effort will result in tremendous opportunities for productive collaborations.<br><br>The new experimentalist-modeler network will be able to advance CSDMS's current goal by providing high-quality benchmark datasets that are well documented and easily accessible.  
In this clinic I will give an overview of lsdtopotools so that, by the end of the session, you will be able to run and visualise topographic analyses using lsdtopotools and lsdviztools. I will show how to start an lsdtopotools session in Google Colab in under 4 minutes, and will also give a brief overview for more advanced users of how to use our docker container if you want access to local files. I will then use Jupyter notebooks to give example analyses including simple data fetching and hillshading, basin selection, simple topographic metrics, and channel extraction. Depending on the audience I will show examples of a) channel steepness analysis for applications in tectonic geomorphology, b) calculation of inferred erosion rates based on detrital CRN concentrations, c) terrace and valley extraction, and d) channel-hillslope coupling. In addition I will show our simple visualisation scripts that allow you to generate publication-ready images. All you need prior to the session is a Google account that allows you to access Colab, and an OpenTopography account so you can obtain an API key. The latter is not required but will make the session more fun as you can use data from anywhere rather than example datasets. If you are not an advanced user please do not read the next sentence, as you don’t need it and it is nerdy compu-jargon that will put you off the session. If you are an advanced user and wish to try the docker container you should install the docker client for your operating system and use the command “docker pull lsdtopotools/lsdtt_pytools_docker” when you have access to a fast internet connection.  +
In this clinic we will explore how to use the new cloud-based remote sensing platform from Google. Our hands-on clinic will teach you the basics of loading and visualizing data in Earth Engine, sorting through data, and creating different types of composite images. These techniques are a good starting point for more detailed investigations that monitor changes on Earth’s surface. Prerequisites:<br>1) Bring your own laptop.<br>2) Chrome installed on your system: It will work with Firefox but has issues.<br>3) An active Google account - Register for an account with Google Earth Engine (https://earthengine.google.com/signup/)  +
In this clinic we will explore how to use the cloud-based remote sensing platform from Google. Our hands-on clinic will teach you the basics of loading and visualizing data in Earth Engine, sorting through data, and creating different types of composite images. These techniques are a good starting point for more detailed investigations that monitor changes on Earth’s surface. Prerequisites include Chrome installed on your system (the platform will work with Firefox, but has issues) and an active Google account. Once you have those, please register for an account with Google Earth Engine (https://earthengine.google.com/signup/)  +
In this clinic we will first review concepts of glacial isostatic adjustment and the algorithm that is used to solve the sea level equation. We will then provide an overview of the sea level code, which calculates the viscoelastic response of the solid Earth, Earth’s gravity field, and rotation axis to changes in surface load while conserving water between ice sheets and oceans. Participants will run the code, explore manipulating the input ice changes, and investigate its effect on the predicted changes in sea level, solid Earth deformation, and gravity field.  +
In this clinic, we will explore RivGraph, a Python package for extracting and analyzing fluvial channel networks from binary masks. We will first look at some background and motivation for RivGraph's development, including some examples demonstrating how RivGraph provides the required information for building models, developing new metrics, analyzing model outputs, and testing hypotheses about river network structure. We will then cover, at a high level, some of the logic behind RivGraph's functions. The final portion of this clinic will be spent working through examples showing how to process a delta and a braided river with RivGraph and visualizing results. Please note: This clinic is designed to be accessible to novice Python users, but those with no Python experience may also find value. If you'd like to work through the examples during the workshop, please install RivGraph beforehand, preferably to a fresh Anaconda environment. Instructions can be found here: https://github.com/jonschwenk/RivGraph. It is also recommended that you have a GIS (e.g. QGIS) available for easy display/interrogation of results.  +
In this clinic, we will first demonstrate existing interactive computer-based activities used for teaching concepts in sedimentology and stratigraphy. This will be followed by a hands-on session for creating different modules based on the participants’ teaching and research interests. Active learning strategies improve student exam performance, engagement, attitudes, thinking, writing, self-reported participation and interest, and help students become better acquainted with one another (Prince, 2004). Specifically, computer-based active learning is an attractive educational approach for post-secondary educators, because developing these activities takes advantage of existing knowledge and skills the educator is likely to already have. The demonstration portion of the clinic will focus on the existing rivers2stratigraphy (https://github.com/sededu/rivers2stratigraphy) activity, which illustrates basin-scale development of fluvial stratigraphy through adjustments in system kinematics, including sandy channel migration and subsidence rates. The activity allows users to change these system properties, so as to drive changing depositional patterns. The module utilizes a rules-based model, which produces realistic channel patterns but simplifies the simulation to run efficiently, in real-time. The clinic will couple rivers2stratigraphy to a conventional laboratory activity that interprets an outcrop photograph of fluvial stratigraphy, and discuss logistics of using the module in the classroom. For the second part of the clinic, familiarity with Python will be beneficial (but is not required); we will utilize existing graphical user interface (GUI) frameworks in developing new activities, aimed to provide a user-friendly means for students to interact with model codes while engaging in geological learning. 
Participants should plan to have Python installed on their personal computers prior to the workshop, and a sample module will be emailed beforehand to let participants begin exploring the syllabus. ''Prince, M. (2004). Does Active Learning Work? A Review of the Research. Journal of Engineering Education, 93(3), 223-231. doi: 10.1002/j.2168-9830.2004.tb00809.x''.  
In this clinic, we will introduce and experiment with open-source tools designed to promote rapid hypothesis testing for river delta studies. We will show how pyDeltaRCM, a flexible Python model for simulating river delta evolution, can be extended to incorporate any arbitrary processes or forcings. We will highlight how object-oriented model design enables community-driven model development, and how this promotes reproducible science. Our clinic will develop an extended model to simulate deltaic evolution into receiving basins with different slopes. Then, the clinic will step through some basic analyses of the model runs, interrogating both surface processes and subsurface structure. Our overall goal is to familiarize you with the tools we are developing and introduce our approach to software design, so that you may adopt these tools or strategies in your research. Please note that familiarity with Python will be beneficial for this clinic, but is not required. Hands-on examples will be made available via an online programming environment (Google CoLab or similar); instructions for local installation on personal computers will be provided prior to the workshop as well.  +
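The object-oriented extension pattern described in the abstract above can be illustrated with a minimal, self-contained sketch. The class and method names below are hypothetical, not pyDeltaRCM's actual API (consult the pyDeltaRCM documentation for its real hook methods); the point is only the pattern: a subclass reuses the base model's update loop unchanged and layers an arbitrary additional process on top.

```python
# Generic sketch of extending a process model by subclassing.
# Names are hypothetical, NOT pyDeltaRCM's actual API.

class DeltaModelBase:
    """Minimal stand-in for a delta evolution model."""

    def __init__(self, basin_slope=0.0):
        self.basin_slope = basin_slope
        self.elevation = 0.0
        self.time = 0

    def deposit_sediment(self):
        # Placeholder process: deposition slows on steeper basins.
        self.elevation += 1.0 / (1.0 + self.basin_slope)

    def update(self):
        self.deposit_sediment()
        self.time += 1


class SubsidingDeltaModel(DeltaModelBase):
    """Extension that adds a new process without touching the base code."""

    def __init__(self, basin_slope=0.0, subsidence_rate=0.1):
        super().__init__(basin_slope)
        self.subsidence_rate = subsidence_rate

    def update(self):
        super().update()                         # keep base processes intact
        self.elevation -= self.subsidence_rate   # add subsidence on top
```

Because the extension lives entirely in a subclass, the base model stays untouched and citable, which is one way object-oriented design supports community-driven development and reproducibility.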
In this clinic, we will provide a brief introduction to a selection of models (USGS and others), including FaSTMECH (2D/3D hydraulic) and PRMS (watershed hydrology), that have implemented a Basic Model Interface (BMI) and are available in the Python Modeling Toolkit (PyMT). We will interactively explore Jupyter Notebook examples of both stand-alone model operation and, as time permits, loosely coupled integrated modeling applications. Participants will need a laptop with a web browser. Knowledge of Python, Jupyter Notebook, and hydrologic/hydraulic modeling is helpful, but not required.  +
In this clinic, we will talk about diversity in a way that makes it approachable and actionable. We advocate that actions in support of diversity can happen at all career levels, so everyone who is interested can partake. We will discuss concrete strategies and opportunities to help you bring a diverse research group together. Creating a diverse group can begin with reaching out to undergraduate minority students to engage them in undergraduate research experiences. This can be done ground-up, i.e. by graduate students in a mentoring role as productively as by faculty in a hiring role. We are all supervisors and mentors in our own ways. We will highlight a number of approaches to engage with underrepresented minority students when recruiting new graduate students, and suggest some concrete adjustments of your recruitment processes to be as inclusive as possible. But being proactive does not stop after recruitment. The clinic will have dedicated discussion time to engage in role play, and provide stories about situations in which you can be an ally. We will identify some pitfalls, ways to reclaim, and provide ideas for more inclusive meetings and mentoring. Lastly, together we can work on creating an overview of current programs that focus on diversity and inclusion, to apply for funding to take action.  +
In this clinic, we will use flow routing in models of earth-surface processes such as river incision. Landlab has several flow-routing components that address multiple-flow-direction routing, depression filling, and a diversity of grid types. We'll see how to design a landscape evolution model with relatively rapid flow-routing execution times on large grids.  +
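The core idea behind the flow-routing components mentioned above can be sketched in a few lines. The sketch below is not Landlab's API (Landlab provides components such as FlowAccumulator for this); it is a minimal pure-Python illustration of single-direction, D8-style routing, where each cell drains to its steepest downhill neighbor and drainage area accumulates downstream.

```python
# Tiny pure-Python sketch of D8-style flow accumulation on a raster.
# Illustration of the concept only, NOT Landlab's implementation.

def flow_accumulation(elev):
    """Each cell sends its accumulated area to its steepest downhill
    neighbor; cells are visited from high to low so every donor is
    processed before the cell that receives its flow."""
    rows, cols = len(elev), len(elev[0])
    acc = [[1.0] * cols for _ in range(rows)]  # each cell starts with itself
    order = sorted(
        ((elev[r][c], r, c) for r in range(rows) for c in range(cols)),
        reverse=True,
    )
    for z, r, c in order:
        best_slope, target = 0.0, None
        # Examine the 8 surrounding cells for the steepest descent.
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                rr, cc = r + dr, c + dc
                if (dr, dc) == (0, 0) or not (0 <= rr < rows and 0 <= cc < cols):
                    continue
                dist = (dr * dr + dc * dc) ** 0.5  # diagonals are farther
                slope = (z - elev[rr][cc]) / dist
                if slope > best_slope:
                    best_slope, target = slope, (rr, cc)
        if target is not None:  # pits and outlets keep their accumulated area
            acc[target[0]][target[1]] += acc[r][c]
    return acc
```

Real components handle the parts this sketch ignores, such as depression filling, boundary conditions, multiple-flow-direction partitioning, and non-raster grids, which is exactly why their design matters for execution time on large grids.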
In this presentation several modeling efforts in Chesapeake Bay will be reviewed that highlight how we can use 3-dimensional, time-dependent hydrodynamic models to provide insight into biogeochemical and ecological processes in marine systems. Two modeling studies will be discussed which illustrate the application of individual-based modeling approaches to simulate the impact of 3-dimensional currents and mixing on pelagic organisms and how these interact with behavior to determine the fate of planktonic species. This approach has many applications related to the transport and fate of fish and invertebrate (e.g., oyster) larvae, as well as plankton, that can be used to inform management efforts.<br><br>A long-term operational modeling project will be discussed that combines mechanistic and empirical modeling approaches to provide nowcasts and short-term forecasts of sea nettles, HABs, pathogens, and also physical and biogeochemical properties for research, management, and public uses in Chesapeake Bay. This powerful technique can be expanded to any marine system that has a hydrodynamic model and to any marine organism for which the habitat can be defined. <br><br>Finally, a new research project will be reviewed where we are assessing the readiness of a suite of existing estuarine community models for determining past, present, and future hypoxia events within the Chesapeake Bay, in order to accelerate the transition of hypoxia model formulations and products from academic research to operational centers. This work, which will ultimately provide the ability to do operational oxygen modeling in Chesapeake Bay (e.g., oxygen weather forecasts), can be extended to other coastal water bodies and any biogeochemical property.  +
In this presentation, James Byrne (Lead Research Software Engineer) and Jonathan Smith (Principal Research Scientist) from the British Antarctic Survey will be describing existing digital infrastructure projects and developments happening in and around BAS. They will give a flavour of how technology is influencing the development of environmental and polar science, covering numerous research and operational domains. They will focus on the digital infrastructure applied to IceNet, an AI-based deep learning system. They will then show how generalized approaches to digital infrastructure are being applied to other areas, including cutting-edge Autonomous Marine Operations Planning (AMOP) capabilities. They will end by highlighting the challenges that need solving in working towards an Antarctic Digital Twin and how we might approach them.  +