Search by property
This page provides a simple browsing interface for finding entities described by a property and a named value. Other available search interfaces include the page property search and the ask query builder.
List of results
- The flow routing map is the cornerstone of spatially distributed hydrologic models. In this clinic we will introduce HexWatershed, a scale-free, mesh-independent flow direction model. It supports DOE’s Energy Exascale Earth System Model (E3SM) to generate hydrologic parameters and river network representations on both structured and unstructured meshes. In this presentation, we will overview the capabilities of HexWatershed with an emphasis on river network representation and flow direction modeling. We will also provide participants with the tools to begin their own research with hydrologic model workflows. Through hands-on tutorials and demonstrations, participants will gain some insights into the relationship between meshes and flow direction, and how HexWatershed handles river networks on various meshes. We will also demonstrate how to use the HexWatershed model outputs in the large-scale hydrologic model, Model for Scale Adaptive River Transport (MOSART). Participants will be provided with additional resources that can be used to extend the tutorial problems and gain additional familiarity with the tools and workflows introduced. Participants are welcome to bring and use their own computers capable of accessing the internet and running a web browser. Tutorials will involve simple scripting operations in the Python language. The conda utility will be used to install libraries. Both QGIS and VisIt packages will be used for visualization.
- Fluvial incision since late Miocene time (5 Ma) has shaped the transition between the Central Rocky Mountains and adjacent High Plains. Despite a clear contrast in erodibility between the mountains and plains, erodibility has not been carefully accounted for in previous attempts to model the geomorphic evolution of this region. The focus of this work to date has been to constrain erodibility values with a simplistic, toy model, and to reconstruct the paleosurface of the Miocene Ogallala Formation prior to its dissection beginning at 5 Ma. This surface reconstruction will be used as an initial condition in subsequent modeling.
- Food security and poverty in Bangladesh are very dependent on natural resources, which fluctuate with a changing environment. The ecosystem services supporting the rural population are affected by several factors including climate change, upstream river flow modifications, commercial fish catches in the Bay of Bengal, and governance interventions. The ESPA Deltas project aims to holistically describe the interaction between the interlinked bio-physical environment and the livelihoods of the rural poorest in coastal Bangladesh, who are highly dependent on natural resources and generally live on less than US$1.50 per day. Here we describe a new integrated model that allows a long-term analysis of the possible changes in this system by linking projected changes in physical processes (e.g. river flows, nutrients), with productivity (e.g. fish, rice), social processes (e.g. access, property rights, migration) and governance (e.g. fisheries, agriculture, water and land use management). Bayesian Networks and Bayesian Processes allow multidisciplinary integration and exploration of specific scenarios. This integrated approach is designed to provide Bangladeshi policy makers with science-based evidence of possible development trajectories. This includes the likely robustness of different governance options on natural resource conservation and poverty levels. Early results highlight the far-reaching implications of sustainable resource use and international cooperation to secure livelihoods and ensure a sustainable environment in coastal Bangladesh.
- Fora.ai is an intuitive digital environment that enables diverse stakeholder groups to collaboratively interact with embedded simulation models to understand real world socio-environmental problems and create novel and impactful solutions. Stakeholders interact with this digital representation and with each other, iteratively creating, revising and testing solutions until diverse needs are addressed. Workshop participants will use fora.ai’s interactive game-board to collectively build green infrastructure solutions to flooding in a neighborhood in Chelsea, Massachusetts. The virtual environment allows for participation in a facilitated process in which users will: 1) input their individual priorities, 2) collaboratively run simulations to understand flooding issues in the neighborhood, 3) co-design green infrastructure scenarios to address these problems, 4) see how their changes affect the simulation, and 5) deliberate on the tradeoffs that arise from each solution due to competing priorities. Participants will be introduced to the flooding model and, with facilitator assistance, engage in multiple iterations of the process of prioritization, solution-building, and reflection on results. This process will allow them to refine their proposed solutions towards a design they would jointly support for implementation, with an understanding of its benefits and drawbacks. The workshop will end with a focus group debrief. Laptops or tablets required.
- From G.K. Gilbert's "The Convexity of Hilltops" to highly-optimized numerical implementations of drainage basin evolution, models of landscape evolution have been used to develop insight into the development of specific field areas, create testable predictions of landform development, demonstrate the consequences of our current theories for geomorphic processes, and spark imagination through hypothetical scenarios. In this talk, I discuss how the types of questions tackled with landscape evolution models have changed as observational data (e.g., high-resolution topography) and computational technology (e.g., accessible high performance computing) have become available. I draw on a natural experiment in postglacial drainage basin incision and a synthetic experiment in a simple tectonic setting to demonstrate how landscape evolution models can be used to identify how much information the topography or other observable quantities provide in inferring process representation and tectonic history. In the natural example, comparison of multiple calibrated models provides insight into which process representations improve our ability to capture the geomorphic history of a site. Projections into the future characterize where in the landscape uncertainty in the model structure dominates over other sources of uncertainty. In the synthetic case, I explore the ability of a numerical inversion to recover geomorphic-process relevant (e.g., detachment vs. transport limited fluvial incision) and tectonically relevant (e.g., date of fault motion onset) system parameters.
- GCAM is an open-source, global, market equilibrium model that represents the linkages between energy, water, land, climate, and economic systems. One of GCAM's many outputs is projected land cover/use by subregion. Subregional projections provide context and can be used to understand regional land dynamics; however, Earth System Models (ESMs) generally require gridded representations of land at finer scales. Demeter, a land use and land cover disaggregation model, was created to provide this service. Demeter directly ingests land projections from GCAM and creates gridded products that match the desired resolution and land class requirements of the user.
- GPUs can make models, simulations, machine learning, and data analysis much faster, but how? And when? In this clinic we'll discuss whether you should use a GPU for your work, whether you should buy one, which one to buy, and how to use one effectively. We'll also get hands-on and speed up a landscape evolution model together. This clinic should be of interest both to folks who would like to speed up their code with minimal effort as well as folks who are interested in the nitty gritty of pushing computational boundaries.
- GeoClaw (http://www.geoclaw.org) is an open-source software package for solving two-dimensional depth-averaged equations over general topography using high-resolution finite volume methods and adaptive mesh refinement. Wetting-and-drying algorithms allow modeling inundation and overland flows. The primary applications where GeoClaw has been used are tsunami modeling and storm surge, although it has also been applied to dam break and other overland flooding problems. The first part of this clinic will present an overview of the capabilities of GeoClaw, including a number of new features that have been added in the past few years. These include: (1) depth-averaged Boussinesq-type dispersive equations that better model short-wavelength tsunamis, such as those generated by landslides or asteroid impacts; solving these equations requires implicit solvers (due to the higher-order derivatives in the equations), which now work with the adaptive mesh refinement (AMR) algorithms in GeoClaw, critical for problems that require high-resolution coastal modeling while also modeling trans-oceanic propagation, for example; (2) better capabilities for extracting output at frequent times on a fixed spatial grid by interpolation from the AMR grids during a computation; the resulting output can then be used for making high-resolution animations or for post-processing (e.g. the velocity field at frequent times can be used for particle tracking, as needed when tracking tsunami debris); (3) ways to incorporate river flows or tidal currents into GeoClaw simulations; and (4) better coupling with the D-Claw code for modeling debris flows, landslides, lahars, and landslide-generated tsunamis (D-Claw is primarily developed by USGS researchers Dave George and Katy Barnhart). The second part of the clinic will be a hands-on introduction to installing GeoClaw and running some of the examples included in the distribution, with tips on how best to get started on a new project. GeoClaw is distributed as part of Clawpack (http://www.clawpack.org), and is available via the CSDMS model repository. For those who wish to install the software in advance on laptops, please see http://www.clawpack.org/installing.html. We will also go through this briefly and help with any issues that arise on your laptop (provided it is a Mac or Linux machine; we do not support Windows). You may need to install some prerequisites in advance, such as Xcode on a Mac (since we require "make" and other command line tools), a Fortran compiler such as gfortran, and basic scientific Python tools such as NumPy and Matplotlib. See https://www.clawpack.org/prereqs.html.
- GeoClaw (http://www.geoclaw.org) is an open-source software package for solving two-dimensional depth-averaged equations over general topography using high-resolution finite volume methods and adaptive mesh refinement. Wetting-and-drying algorithms allow modeling inundation or overland flows. The primary applications where GeoClaw has been used are tsunami modeling and storm surge, although it has also been applied to dam break problems and other overland floods. This tutorial will give an introduction to setting up a tsunami modeling problem in GeoClaw, including: an overview of capabilities; installing the software; using Python tools provided in GeoClaw to acquire and work with topography DEMs and other datasets; setting run-time parameters, including specifying adaptive refinement regions; and the VisClaw plotting software to visualize results using Python tools or display on Google Earth. GeoClaw is distributed as part of Clawpack (http://www.clawpack.org). If you wish to install the software in advance on your laptop, please see http://www.clawpack.org/installing.html. Tutorials can be found here: https://github.com/clawpack/geoclaw_tutorial_csdms2019
- GeoClaw is an open source Fortran/Python package based on Clawpack (conservation laws package), which implements high-resolution finite volume methods for solving wave propagation problems with adaptive mesh refinement. GeoClaw was originally developed for tsunami modeling and has been validated via benchmarking workshops of the National Tsunami Hazard Mitigation Program for use in hazard assessment studies funded through this program. Current projects include developing new tsunami inundation maps for the State of Washington and the development of new probabilistic tsunami hazard assessment (PTHA) methodologies. The GeoClaw code has also been extended to the study of storm surge and forms the basis for D-Claw, a debris flow and landslide code being developed at the USGS and recently used to model the 2014 Oso, Washington landslide, for example.
- Getting usable information out of climate and weather models can be a daunting task. The direct output from the models typically has unacceptable biases on local scales, and as a result a large number of methods have been developed to bias correct or downscale the climate model output. This clinic will describe the range of methods available as well as provide background on the pros and cons of different approaches. This will cover a variety of approaches from relatively simple methods that just rescale the original output, to more sophisticated statistical methods that account for broader weather patterns, to high-resolution atmospheric models. We will focus on methods for which output or code are readily available for end users, and discuss the input data required by different methods. We will follow this up with a practical session in which participants will be supplied a test dataset and code with which to perform their own downscaling. Participants interested in applying these methods to their own region of interest are encouraged to contact the instructor ahead of time to determine what inputs would be required. (An illustrative bias-correction sketch follows this results list.)
- Global models of Earth’s climate have expanded beyond their geophysical heritage to include terrestrial ecosystems, biogeochemical cycles, vegetation dynamics, and anthropogenic uses of the biosphere. Ecological forcings and feedbacks are now recognized as important for climate change simulation, and the models are becoming models of the entire Earth system. This talk introduces Earth system models, how they are used to understand the connections between climate and ecology, and how they provide insight to environmental stewardship for a healthy and sustainable planet. Two prominent examples discussed in the talk are anthropogenic land use and land-cover change and the global carbon cycle. However, there is considerable uncertainty in how to represent ecological processes at the large spatial scale and long temporal scale of Earth system models. Further scientific advances are straining under the ever-growing burden of multidisciplinary breadth, countered by disciplinary chauvinism and the extensive conceptual gap between observationalists developing process knowledge at specific sites and global scale modelers. The theoretical basis for Earth system models, their development and verification, and experimentation with these models requires a new generation of scientists, adept at bridging the disparate fields of science and using a variety of research methodologies including theory, numerical modeling, observations, and data analysis. The science requires a firm grasp of models, their theoretical foundations, their strengths and weaknesses, and how to appropriately use them to test hypotheses of the atmosphere-biosphere system. It requires a reinvention of how we learn about and study nature.
- Google Earth Engine is a powerful geographic information system (GIS) that brings programmatic access and massively parallel computing to petabytes of publicly-available Earth observation data using Google’s cloud infrastructure. In this live-coding clinic, we’ll introduce some of the foundational concepts of workflows in Earth Engine and lay the groundwork for future self-teaching. Using the JavaScript API, we will practice: raster subsetting, raster reducing in time and space, custom asset (raster and vector) uploads, visualization, mapping functions over collections of rasters or geometries, and basic exporting of derived products. (An illustrative Earth Engine sketch, using the equivalent Python API, follows this results list.)
- Google Earth Engine (GEE) is a multi-petabyte catalog of satellite imagery and geospatial datasets with planetary-scale analysis capabilities. Now imagine all you need to work on it is a browser and an internet connection. This hands-on workshop will introduce you to and showcase cloud-native geospatial processing. We will explore the platform’s built-in catalog of 100+ petabytes of geospatial datasets and build some analysis workflows. Additional topics will include uploading and ingesting your own data to Google Earth Engine, time series analysis essential for change monitoring, and data and code principles for effective collaboration. The hope is to introduce the cloud-native geospatial analysis platform and to rethink data as we produce and consume more of it. If you want to follow along, bring your laptop and register for an Earth Engine account here: https://signup.earthengine.google.com. P.S. I recommend using a personal account :) you get to keep it.
- Granular materials are ubiquitous in the environment, in industry and in everyday life and yet are poorly understood. Modelling the behavior of a granular medium is critical to understanding problems ranging from hazardous landslides and avalanches in the Geosciences, to the design of industrial equipment. Typical granular systems contain millions of particles, but the underlying equations governing that collective motion are as yet unknown. The search for a theory of granular matter is a fundamental problem in physics and engineering and of immense practical importance for mitigating the risk of geohazards. Direct simulation of granular systems using the Discrete Element Method is a powerful tool for developing theories and modelling granular systems. I will describe the simulation technique and show its application to a diverse range of flows.
- Great mentors engage early career scientists in research, open doors, speak the ‘unspoken rules’, and inspire the next generation. Yet many of us step into mentoring roles without feeling fully confident in the role, or are uncertain how to create an inclusive environment that allows early career scientists from varied backgrounds to thrive. In this interactive workshop, we will share experiences and explore tools that can help build successful mentoring relationships, create supportive cohorts, and feel confident in becoming a great mentor.
- Have you ever needed to use a software package and it won't build on your machine? Have you ever needed to distribute a set of software packages but your collaborators are grumbling that installing all of them is too much of a pain? These are common problems and there are tools that can help to take the pain away. Docker allows you to (1) prepare operating system images with software pre-installed on them, (2) run code inside these containerized OSes independent of the host machine, and (3) share these images online. Additionally, there are ready-made Docker images available for many popular software packages. In this webinar, I'll show how to use ready-made Docker images, how to make your own images, and how this tool can solve some of the more annoying problems that we encounter in scientific software development. If, like me, you viscerally hate learning to use new software tools, I get it, but I swear this one will get you out of a horrible jam some time.
- Hazard assessment for post-wildfire debris flows, which are common in the steep terrain of the western United States, has focused on the susceptibility of upstream basins to generate debris flows. However, reducing public exposure to this hazard also requires an assessment of hazards in downstream areas that might be inundated during debris flow runout. Debris flow runout models are widely available, but their application to hazard assessment for post-wildfire debris flows has not been extensively tested. I will discuss a study in which we apply three candidate debris flow runout models in the context of the 9 January 2018 Montecito event. We evaluate the relative importance of flow volume and flow material properties in successfully simulating the event. Additionally, I will describe an in-progress user needs assessment designed to understand how professional decision makers (e.g., county emergency managers, floodplain managers, and Burned Area Emergency Response team members) might use post-fire debris flow inundation hazard assessment information. https://agupubs.onlinelibrary.wiley.com/doi/full/10.1029/2021JF006245 Katy Barnhart is a Research Civil Engineer at the U.S. Geological Survey’s Geologic Hazards Science Center. She received her B.S.E. (2008) in Civil and Environmental Engineering from Princeton University and her M.S. (2010) and Ph.D. (2015) in Geological Sciences from the University of Colorado at Boulder. Her research uses numerical modeling to understand past and forecast future geomorphic change on a variety of timescales.
- Here we present direct numerical simulations of the hysteresis of the Antarctic ice sheet and use linear response theory with these kinds of simulations to project Antarctica's sea level contribution to the end of the century. Related publications: A. Levermann et al., 2020. Projecting Antarctica's contribution to future sea level rise from basal ice-shelf melt using linear response functions of 16 ice sheet models (LARMIP-2). Earth System Dynamics 11, 35-76, doi: 10.5194/esd-11-35-2020. J. Garbe, T. Albrecht, A. Levermann, J.F. Donges, R. Winkelmann, 2020. The Hysteresis of the Antarctic Ice Sheet. Nature 585, 538-544, doi: 10.1038/s41586-020-2727-5.
- HexWatershed is a hydrologic flow direction model that supports structured and unstructured meshes. It uses state-of-the-art topological relationship-based stream burning and depression-filling techniques to produce high-quality flow-routing datasets across scales. HexWatershed has substantially improved over the past two years, including support for the DGGRID discrete global grid system (DGGS). This presentation will provide an overview of HexWatershed, highlighting its capabilities, new features, and improvements. Through hands-on tutorials and demonstrations, attendees will gain insights into the underlying philosophy of the HexWatershed model and how to use HexWatershed products to run large-scale hydrologic models in watersheds worldwide. Specifically, this tutorial will cover major components in the HexWatershed ecosystem, including the computational mesh generation process, river network representation, and flow direction modeling. We will provide participants with resources to extend the tutorial problems and gain additional familiarity with the tools and workflows introduced. Attendees are encouraged to bring their laptops with internet access and a functional web browser. Tutorials will involve scripting operations in Python, for example in Jupyter Notebooks. We will use the Conda utility to install dependency libraries and Visual Studio Code to run the notebooks.
- High-resolution topographic (HRT) data is becoming more easily accessible and prevalent, and is rapidly advancing our understanding of myriad surface and ecological processes. Landscape connectivity is the framework that describes the routing of fluids, sediments, and solutes across a landscape and is a primary control on geomorphology and ecology. Connectivity is not a static parameter, but rather a continuum that dynamically evolves on a range of temporal and spatial scales, and the observation of which is highly dependent on the available methodology. In this clinic we showcase the utility of HRT for the observation and characterization of landscapes and compare results with those of coarser spatial resolution data-sets. We highlight the potential for integrating HRT observations and parameters such as vegetation density, surface relief, and local slope variability with numerical surface process models. Participants will gain an understanding of the basics of HRT, data availability and basic analysis, and the use of HRT parameters in modeling. (An illustrative DEM slope-and-relief sketch follows this results list.)
- How can we increase the diversity, richness and value of Spatial Data Infrastructure (SDI) to the Disasters and Natural Hazards community stakeholders? We’ll look at some of the current (and past) Open Geospatial Consortium initiatives to examine exciting work to enable sharing of complex data and models within the community using open standards.
- Human settlements in dynamic environmental settings face the challenges both of managing their own impact on their surroundings and also adapting to change, which may be driven by a combination of local and remote factors, each of which may involve both human and natural forcings. Impacts of and responses to environmental change play out at multiple scales which involve complex nonlinear interactions between individual actors. These interactions can produce emergent results where the outcome at the community scale is not easily predicted from the decisions taken by individuals within the community. Agent-based simulations can be useful tools to explore the dynamics of both the human response to environmental change and the environmental impacts of human activity. Even very simple models can be useful in uncovering potential for unintended consequences of policy actions. Participatory simulations that allow people to interact with a system that includes simulated agents can be useful tools for teaching and communicating about such unintended consequences. I will report on progress on agent-based simulations of environmentally stressed communities in Bangladesh and Sri Lanka and preliminary results of using a participatory coupled model of river flooding and agent-based real estate markets to teach about unintended consequences of building flood barriers.
- Humans alter natural geomorphic systems by modifying terrain morphology and through on-going actions that change patterns of sediment erosion, transport, and deposition. Long-term interactions between humans and the environment can be examined using numerical modeling. Human modifications of the landscape such as land cover change and agricultural tillage have been implemented within some landscape evolution models, yet little effort has been made to incorporate agricultural terraces. Terraces of various forms have been constructed for millennia in the Mediterranean, Southeast Asia, and South America; in those regions some terraces have undergone cycles of use, abandonment, and reuse. Current implementations of terraces in existing models are as static objects that uniformly impact landscape evolution, yet empirical studies have shown that terrace impact depends upon whether they are maintained or abandoned. We previously tested a simple terrace model that included a single terrace wall on a synthetic hillside with 20% slope for the impacts of maintenance and abandonment. In this research we modify the terrace model to include a wider variety of terrace forms and couple it with a landscape evolution model to test the extent to which terraced terrain morphology is related to terrace form. We also test how landscape evolution, after abandonment of terraced fields, differs based on the length of time the terraces were maintained. We argue that construction and maintenance of terraces has a significant impact on the spatial patterning of sediment erosion and deposition, and thus landscape evolution modeling of terraced terrain requires coupling with a dynamic model of terrace use.
- Hurricanes can greatly modify the sedimentary record, but our coastal scientific modeling community has rather limited capability to predict such processes. A three-dimensional sediment transport model was developed in the Regional Ocean Modeling System (ROMS) to study seabed erosion and deposition on the Louisiana shelf in response to Hurricanes Katrina and Rita in the year 2005. Conditions to either side of Hurricane Rita’s storm track differed substantially, with the region to the east having stronger winds, taller waves and thus deeper erosion. This study indicated that major hurricanes can disturb the shelf at centimeter to meter levels on the seabed.
- Hydrology is a science of extremes: droughts and floods. In either case, the hydrologic response arises from the combination of many factors, such as terrain, land cover, land use, infrastructure, etc. Each has different, overlapping spatial domains. Superimposed upon these are temporal variations, driven by stochastic weather events that follow seasonal climatic regimes. To calculate risk (expected loss) requires a loss function (damage) and a response domain (flood depths) over which that loss is integrated. The watershed provides the spatial domain that collects all these factors. This talk will discuss the data used to characterize hydrologic response. (A toy expected-loss calculation follows this results list.)
- I will discuss an application of the Migration, Intensification, and Diversification as Adaptive Strategies (MIDAS) agent-based modeling framework to modeling labor migration across Bangladesh under the stressor of sea-level rise (SLR). With this example, I hope to highlight some hard-to-resolve challenges in representing adaptive decision-making under as-yet unexperienced stressors in models. Drawing together what is more and what is less known in projections for future adaptation, I will discuss strategies for ‘responsible’ presentation and dissemination of model findings.
- If one system comes to (my) mind where the human element is intertwined with the environment, it is the Louisiana coastal area in the Southern United States. Often referred to as the working coast, coastal Louisiana supports large industries with its ports, navigation channels, oil, and productive fisheries. In addition to that, Louisianians have a significant cultural connection to the coastal wetlands and their natural resources. Unfortunately, the land is disappearing into the sea with coastal erosion rates higher than anywhere else in the US. Due to these high rates of land loss, this system needs rigorous protection and restoration. While the restoration plans are mostly focused on building land, the effects on, for example, fisheries of proposed strategies should be estimated as well before decisions can be made on how to move forward. Through several projects I have been involved in, from small modeling projects to bold coastal design programs, I present how coupled models play a key role in science-based coastal management that considers the natural processes as well as the human element.
- In dry regions, escarpments are key landforms for exploring landform-rainfall interactions. Here we present a modeling approach for the evolution of arid cliffs and sub-cliff slopes that incorporates rainfall forcing at the scale of individual rainstorms. We used numerical experiments to mechanistically test how arid cliffs and sub-cliff slopes evolve according to different geomorphic characteristics and variations in rainstorm properties.
- In formulating tectono-geomorphic models of landscape evolution, Earth is typically divided into two domains: the surface domain, in which “geomorphic” processes are solved for, and a tectonic domain of earth deformation driven generally by differential plate movements. Here we present a single mechanical framework, the Failure Earth Response Model (FERM), that unifies the physical description of dynamics within and between the two domains. FERM is constructed on two basic assumptions about the three-dimensional stress state and rheological memory: I) Material displacement, whether tectonic or geomorphic in origin, at or below Earth’s surface, is driven by local forces overcoming local resistance, and II) Large displacements, whether tectonic or geomorphic in origin, irreversibly alter Earth material properties, enhancing a long-term strain memory mapped into the topography. In addition to gathering stresses arising from far-field tectonic processes, topographic relief, and inertial surface processes into a single stress state for every point, the FERM formulation allows explicit consideration of the contributions to the evolving landscape of pore pressure fluctuations, seismic accelerations, and fault damage. Incorporating these in the FERM model significantly influences the tempo of landscape evolution and leads to highly heterogeneous and anisotropic stress and strength patterns, largely predictable from knowledge of mantle kinematics. The resulting unified description permits exploration of surface-tectonic interactions from outcrop to orogen scales and allows elucidation of the high fidelity orogenic strain and climate memory contained in topography.
- In landscape evolution models, climate change is often assumed to be synonymous with changes in rainfall. For many climate changes, however, the dominant driver of landscape evolution is changes in vegetation cover. In this talk I review case studies that attempt to quantify the impact of vegetation changes on landscape evolution, including examples from hillslope/colluvial, fluvial, and aeolian environments, spatial scales of ~10 m to whole continents, and time scales from decadal to millennial. Particular attention is paid to how to parameterize models using paleoclimatic and remote sensing data.
- In response to the CSDMS community’s interest, the Human Dimensions group is excited to host a virtual Coffee Hour on community engagement in earth systems science and policy projects. Please join us for our first Coffee Hour, which will include an engaging panel on the topic: “Engaging diverse stakeholders in earth-systems modeling projects.” We recognize the importance of working collaboratively with stakeholders in scientific projects (e.g., for knowledge co-creation, for guidance, and for implementation of solutions derived from the research), but we are not traditionally trained to do so. Rigorous scientific practices can sometimes be alienating and extractive, eroding the trust between the scientific community and the public that is necessary for the advancement of science, policy, and human wellbeing. We discuss here the challenges involved in community engagement and possible ways to overcome them. Our panelists are Leilah Lyons (NSF), Laura Schmitt Olabisi (Michigan State University) and Mehana Vaughan (University of Hawaii). The Coffee Hour will begin with a short introduction by each panelist, followed by a set of questions by the facilitators, and concluding with a period of open questions and discussion with the audience.
- In software engineering, an interface is a group of functions with prescribed names, argument types, and return types. When a developer implements an interface for a piece of software, they fill out the details for each function while keeping the signatures intact. CSDMS has developed the Basic Model Interface (BMI) for facilitating the conversion of a model written in C, C++, Fortran, Python, or Java into a reusable, plug-and-play component. By design, BMI functions are simple. However, when trying to implement them, the devil is often in the details. In this hands-on clinic, we'll take a simple model of the two-dimensional heat equation, written in Python, and together we'll write the BMI functions to wrap it, preparing it for transformation into a component. As we develop, we’ll explore how to use the wrapped model with a Jupyter Notebook. To get the most out of this clinic, come prepared to code! We'll have a lot to write in the time allotted for the clinic. We recommend that clinic attendees have a laptop with the Anaconda Python distribution installed. We also request that you review the BMI description (http://csdms.colorado.edu/wiki/BMI_Description) and the BMI documentation (https://bmi-spec.readthedocs.io) before the start of the clinic. (A minimal BMI-style wrapper sketch follows this results list.)
- In software engineering, an interface is a set of functions with prescribed names, argument types, and return types. When a developer implements an interface for a piece of software, they fill out the details for each function while keeping the signatures intact. CSDMS has developed the Basic Model Interface (BMI) for facilitating the conversion of an existing model written in C, C++, Fortran, Python or Java into a reusable, plug-and-play component. By design, BMI functions are straightforward to implement. However, when trying to match BMI functions to model behaviors, the devil is often in the details. In this hands-on clinic, we'll take a simple model, an implementation of the two-dimensional heat equation in Python, and together we'll write the BMI functions to wrap it, preparing it for transformation into a component. As we develop, we’ll explore how to use the wrapped model with a Jupyter Notebook. To get the most out of this clinic, come prepared to code! We'll have a lot to write in the time allotted for the clinic. We recommend that clinic attendees have a laptop with the Anaconda Python distribution installed. We also request that you read over the BMI description (https://csdms.colorado.edu/wiki/BMI_Description) and the BMI documentation (http://bmi-python.readthedocs.io) before participating in the clinic.
- In the modeler community, hindcasting (a way to test models based on knowledge of past events) is required for all computer models before providing reliable results to users. CSDMS 2.0 “Moving forward” has proposed to incorporate benchmarking data into its modeling framework. Data collection in natural systems has advanced significantly, but it still lags the needed resolution in time and space and includes natural variability beyond our understanding, which makes thorough testing of computer models difficult. In the experimentalist community, research in Earth-surface processes and subsurface stratal development is in a data-rich era with rapid expansion of high-resolution, digitally based data sets that were not available even a few years ago. Millions of dollars have been spent to build and renovate flume laboratories. Advanced technologies and methodologies in experimentation allow more sophisticated experiments at large scales and in fine detail. A joint effort between modelers and experimentalists is a natural step toward a great synergy between both communities. The time for a coherent effort to build a strong global research network for these two communities is now. First, both communities should initiate an effort to establish best practices and metadata for standardized data collection. Sediment experimentalists are an example of a community in the “long tail”, meaning that their data are often collected in one-of-a-kind experimental set-ups and isolated from other experiments. Second, there should be a centralized knowledge base (a web-based repository for data and technology) easily accessible to modelers and experimentalists. Experimentalists also have a lot of “dark data”: data that are difficult or impossible to access through the Internet. This effort will result in tremendous opportunities for productive collaborations. The new experimentalist and modeler network will be able to achieve the current CSDMS goal by providing high quality benchmark datasets that are well documented and easily accessible.
- In this clinic I will give an overview of lsdtopotools so that, by the end of the session, you will be able to run and visualise topographic analyses using lsdtopotools and lsdviztools. I will show how to start an lsdtopotools session in google colab in under 4 minutes, and will also give a brief overview for more advanced users of how to use our docker container if you want access to local files. I will then use jupyter notebooks to give example analyses including simple data fetching and hillshading, basin selection, simple topographic metrics and channel extraction. Depending on the audience I will show examples of a) channel steepness analysis for applications in tectonic geomorphology b) calculation of inferred erosion rates based on detrital CRN concentrations c) terrace and valley extraction d) channel-hillslope coupling. In addition I will show our simple visualisation scripts that allow you to generate publication-ready images. All you need prior to the session is a google account that allows you to access colab, and an opentopography account so you can obtain an API key. The latter is not required but will make the session more fun as you can use data from anywhere rather than example datasets. If you are not an advanced user please do not read the next sentence, as you don’t need it and it is nerdy compu-jargon that will put you off the session. If you are an advanced user and wish to try the docker container you should install the docker client for your operating system and use the command “docker pull lsdtopotools/lsdtt_pytools_docker” when you have access to a fast internet connection.
- In this clinic we will explore how to use the cloud-based remote sensing platform from Google. Our hands-on clinic will teach you the basics of loading and visualizing data in Earth Engine, sorting through data, and creating different types of composite images. These techniques are a good starting point for more detailed investigations that monitor changes on earth’s surface. Prerequisites include having Chrome installed on your system (it will work with Firefox but has issues) and an active Google account. Once you have those, please register for an account with Google Earth Engine (https://earthengine.google.com/signup/).
- In this clinic we will explore how to use the new cloud-based remote sensing platform from Google. Our hands-on clinic will teach you the basics of loading and visualizing data in Earth Engine, sorting through data, and creating different types of composite images. These techniques are a good starting point for more detailed investigations that monitor changes on earth’s surface. Prerequisites: (1) bring your own laptop; (2) Chrome installed on your system (it will work with Firefox but has issues); (3) an active Google account; register for an account with Google Earth Engine (https://earthengine.google.com/signup/).
- In this clinic we will first review concepts of glacial isostatic adjustment and the algorithm that is used to solve the sea level equation. We will then provide an overview of the sea level code, which calculates the viscoelastic response of the solid Earth, Earth’s gravity field, and rotation axis to changes in surface load while conserving water between ice sheets and oceans. Participants will run the code, explore manipulating the input ice changes, and investigate its effect on the predicted changes in sea level, solid Earth deformation, and gravity field.
- In this clinic, we will explore RivGraph, a Python package for extracting and analyzing fluvial channel networks from binary masks. We will first look at some background and motivation for RivGraph's development, including some examples demonstrating how RivGraph provides the required information for building models, developing new metrics, analyzing model outputs, and testing hypotheses about river network structure. We will then cover, at a high level, some of the logic behind RivGraph's functions. The final portion of this clinic will be spent working through examples showing how to process a delta and a braided river with RivGraph and visualizing results. Please note: This clinic is designed to be accessible to novice Python users, but those with no Python experience may also find value. If you'd like to work through the examples during the workshop, please install RivGraph beforehand, preferably to a fresh Anaconda environment. Instructions can be found here: https://github.com/jonschwenk/RivGraph. It is also recommended that you have a GIS (e.g. QGIS) available for easy display/interrogation of results.
- In this clinic, we will first demonstrate existing interactive computer-based activities used for teaching concepts in sedimentology and stratigraphy. This will be followed by a hands-on session for creating different modules based on the participants’ teaching and research interests. Active learning strategies improve student exam performance, engagement, attitudes, thinking, writing, self-reported participation and interest, and help students become better acquainted with one another (Prince, 2004). Specifically, computer-based active learning is an attractive educational approach for post-secondary educators, because developing these activities takes advantage of existing knowledge and skills the educator is likely to already have. The demonstration portion of the clinic will focus on the existing rivers2stratigraphy (https://github.com/sededu/rivers2stratigraphy) activity, which illustrates basin-scale development of fluvial stratigraphy through adjustments in system kinematics including sandy channel migration and subsidence rates. The activity allows users to change these system properties, so as to drive changing depositional patterns. The module utilizes a rules based model, which produces realistic channel patterns, but simplifies the simulation to run efficiently, in real-time. The clinic will couple rivers2stratigraphy to a conventional laboratory activity which interprets an outcrop photograph of fluvial stratigraphy, and discuss logistics of using the module in the classroom. For the second part of the clinic, familiarity with Python will be beneficial (but is not required); we will utilize existing graphical user interface (GUI) frameworks in developing new activities, aimed to provide a user-friendly means for students to interact with model codes while engaging in geological learning. Participants should plan to have Python installed on their personal computers prior to the workshop, and a sample module will be emailed beforehand to let participants begin exploring the syllabus. ''Prince, M. (2004). Does Active Learning Work? A Review of the Research. Journal of Engineering Education, 93(3), 223-231. doi: 10.1002/j.2168-9830.2004.tb00809.x''.
- In this clinic, we will introduce and experiment with open-source tools designed to promote rapid hypothesis testing for river delta studies. We will show how pyDeltaRCM, a flexible Python model for simulating river delta evolution, can be extended to incorporate any arbitrary processes or forcings. We will highlight how object-oriented model design enables community-driven model development, and how this promotes reproducible science. Our clinic will develop an extended model to simulate deltaic evolution into receiving basins with different slopes. Then, the clinic will step through some basic analyses of the model runs, interrogating both surface processes and subsurface structure. Our overall goal is to familiarize you with the tools we are developing and introduce our approach to software design, so that you may adopt these tools or strategies in your research. Please note that familiarity with Python will be beneficial for this clinic, but is not required. Hands-on examples will be made available via an online programming environment (Google CoLab or similar); instructions for local installation on personal computers will be provided prior to the workshop as well. (A hedged pyDeltaRCM subclassing sketch follows this results list.)
- In this clinic, we will provide a brief introduction to a selection of models (USGS and others), including FaSTMECH (2D/3D hydraulic) and PRMS (watershed hydrology), that have implemented a Basic Model Interface (BMI) and are available in the Python Modeling Toolkit (PyMT). We will interactively explore Jupyter Notebook examples of both stand-alone model operation and, as time permits, loosely coupled integrated modeling applications. Participants will need a laptop with a web browser. Knowledge of Python, Jupyter Notebook, and hydrologic/hydraulic modeling is helpful, but not required.
- In this clinic, we will talk about diversity in a way that makes it approachable and actionable. We advocate that actions in support of diversity can happen at all career levels, so everyone who is interested can partake. We will discuss concrete strategies and opportunities to help you bring a diverse research group together. Creating a diverse group can start with reaching out to undergraduate minority students to engage them in undergraduate research experiences. This can be done ground-up, i.e. by graduate students in a mentoring role as productively as by faculty in a hiring role. We are all supervisors and mentors in our own ways. We will highlight a number of approaches to engage with underrepresented minority students when recruiting new graduate students, and suggest some concrete adjustments of your recruitment processes to be as inclusive as possible. But being proactive does not stop after recruitment. The clinic will have dedicated discussion time to engage in role play, and provide stories about situations in which you can be an ally. We will identify some pitfalls, ways to reclaim, and provide ideas for more inclusive meetings and mentoring. Lastly, together we can work on creating an overview of current programs that focus on diversity and inclusion, to apply for funding to take action.
- In this clinic, we will use flow routing in models to drive earth-surface processes such as river incision. Landlab has several flow routing components that address multiflow routing, depression filling, and the diversity of grid types. We'll see how to design a landscape evolution model with relatively rapid flow-routing execution time on large grids. (An illustrative Landlab flow-routing sketch follows this results list.)
- In this presentation several modeling efforts in Chesapeake Bay will be reviewed that highlight how we can use 3-dimensional, time-dependent hydrodynamic models to provide insight into biogeochemical and ecological processes in marine systems. Two modeling studies will be discussed which illustrate the application of individual-based modeling approaches to simulate the impact of 3-dimensional currents and mixing on pelagic organisms and how these interact with behavior to determine the fate of planktonic species. There are many applications of this approach, related to fish and invertebrate (e.g., oyster) larval transport and fate and also to plankton, that can be used to inform management efforts. A long-term operational modeling project will be discussed that combines mechanistic and empirical modeling approaches to provide nowcasts and short-term forecasts of sea nettles, HABs, pathogens and also physical and biogeochemical properties for research, management and public uses in Chesapeake Bay. This is a powerful technique that can be expanded to any marine system that has a hydrodynamic model and any marine organism for which the habitat can be defined. Finally, a new research project will be reviewed where we are assessing the readiness of a suite of existing estuarine community models for determining past, present and future hypoxia events within the Chesapeake Bay, in order to accelerate the transition of hypoxia model formulations and products from academic research to operational centers. This work, which will ultimately provide the ability to do operational oxygen modeling in Chesapeake Bay (e.g., oxygen weather forecasts), can be extended to other coastal water bodies and any biogeochemical property.
- In this presentation, James Byrne (Lead Research Software Engineer) and Jonathan Smith (Principal Research Scientist) from the British Antarctic Survey will describe existing digital infrastructure projects and developments happening in and around BAS. They will give a flavour of how technology is influencing the development of environmental and polar science, covering numerous research and operational domains. They will focus on the digital infrastructure applied to IceNet, an AI-based deep learning infrastructure. We will then show how generalized approaches to digital infrastructure are being applied to other areas, including cutting-edge Autonomous Marine Operations Planning (AMOP) capabilities. We will end by highlighting the challenges that need solving in working towards an Antarctic Digital Twin and how we might approach them.
- In this talk, I will discuss the need for low carbon and sustainable computing. The current emissions from computing are almost 4% of the world total. This is already more than emissions from the airline industry, and ICT emissions are projected to rise steeply over the next two decades. By 2040 emissions from computing alone will account for more than half of the emissions budget to keep global warming below 1.5°C. Consequently, this growth in computing emissions is unsustainable. The emissions from production of computing devices exceed the emissions from operating them, so even if devices are more energy efficient, producing more of them will make the emissions problem worse. Therefore we must extend the useful life of our computing devices. As a society we need to start treating computational resources as finite and precious, to be utilized only when necessary, and as effectively as possible. We need frugal computing: achieving our aims with less energy and material. Additional links: blog posts on the climate cost of AI (https://wimvanderbauwhede.codeberg.page/articles/the-insatiable-hunger-of-openai/, https://wimvanderbauwhede.codeberg.page/articles/google-search-vs-chatgpt-emissions/, https://wimvanderbauwhede.codeberg.page/articles/climate-cost-of-ai-revolution/) and on frugal computing (https://wimvanderbauwhede.codeberg.page/articles/frugal-computing/, https://wimvanderbauwhede.codeberg.page/articles/frugal-computing-consumer/, https://wimvanderbauwhede.codeberg.page/articles/frugal-computing-developer/); university web site with slides and videos of seminar talks (https://www.gla.ac.uk/schools/computing/research/researchthemes/lowcarbon/); Low Carbon Computing learning and teaching resources (https://codeberg.org/jgrizou/Low-Carbon-Computing-Teaching-Resources).
- In this webinar, I will present a new framework termed “Bayesian Evidential Learning” (BEL) that streamlines the integration of the four components common to building Earth system models: data, model, prediction, decision. This idea is published in a new book, “Quantifying Uncertainty in Subsurface Systems” (Wiley-Blackwell, 2018), and applied to five real case studies in oil/gas, groundwater, contaminant remediation and geothermal energy. BEL is not a method but a protocol based on Bayesianism that leads to the selection of relevant methods to solve complex modeling and decision problems. In that sense BEL focuses on purpose-driven data collection and model-building. One of the important contributions of BEL is that it is a data-scientific approach that circumvents complex inversion modeling and instead relies on machine learning from Monte Carlo simulation with falsified priors. The case studies illustrate how modeling time can be reduced from months to days, making it practical for large scale implementations. In this talk, I will provide an overview of BEL and how it relies on global sensitivity analysis, Monte Carlo, model falsification, prior elicitation and data scientific methods to implement the stated principles of its Bayesian philosophy. I will cover an extensive case study involving the managing of the groundwater system in Denmark. (A toy Monte Carlo prior-falsification sketch follows this results list.)
- In this workshop we will explore publicly available socioeconomic and hydrologic datasets that can be used to inform riverine flood risks under present-day and future climate conditions. We will begin with a summary of different stakeholders’ requirements for understanding flood risk data, through the lens of our experience working with federal, state and local clients and stakeholders. We will then guide participants through the relevant data sources that we use to inform these studies, including FEMA floodplain maps, census data, building inventories, damage functions, and future projections of extreme hydrologic events. We will gather and synthesize some of these data sources, discuss how each data source can be used in impact analyses, and discuss the limitations of each available data source. We will conclude with a brainstorming session to discuss how the scientific community can better produce actionable information for community planners, floodplain managers, and other stakeholders who might face increasing riverine flood risks in the future.
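Illustrative code sketches

The sketches below are editorial illustrations tied to individual results above; they are not taken from the clinics or talks themselves. The first relates to the downscaling clinic ("Getting usable information out of climate and weather models"): a minimal sketch of two simple bias-correction ideas it mentions, a mean rescaling and empirical quantile mapping, applied to synthetic data. All names and numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic daily precipitation (mm/day): "observations" and a wet-biased "model"
obs = rng.gamma(shape=2.0, scale=3.0, size=3650)
mod_hist = rng.gamma(shape=2.0, scale=4.5, size=3650)
mod_future = rng.gamma(shape=2.0, scale=5.0, size=3650)

# 1) Simple scaling: rescale the future run so the historical model mean matches obs
scale = obs.mean() / mod_hist.mean()
mod_scaled = mod_future * scale

# 2) Empirical quantile mapping: map each future value through the quantile it
#    occupies in the historical model run onto the observed distribution
q = np.linspace(0.0, 1.0, 101)
mod_q = np.quantile(mod_hist, q)
obs_q = np.quantile(obs, q)
mod_qm = np.interp(mod_future, mod_q, obs_q)

print(f"obs mean {obs.mean():.2f}  raw future {mod_future.mean():.2f}  "
      f"scaled {mod_scaled.mean():.2f}  quantile-mapped {mod_qm.mean():.2f}")
```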
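For the Google Earth Engine clinics above: the live-coding clinic uses the JavaScript API, but the Python API mirrors it closely, and this hedged sketch shows the same pattern of operations (filtering a collection, reducing in time with a median composite, and reducing in space with a regional mean). The asset ID, band name, and coordinates are examples only, and an authenticated Earth Engine account (and possibly a Cloud project) is assumed.

```python
import ee

ee.Authenticate()  # browser sign-in; typically needed once per environment
ee.Initialize()    # newer client versions may require a Cloud project argument

point = ee.Geometry.Point([-105.27, 40.01])  # example location (Boulder, CO)
region = point.buffer(5000)                  # 5 km region of interest

# Raster subsetting: filter a collection in space and time
collection = (
    ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")  # example Landsat 8 asset
    .filterBounds(region)
    .filterDate("2022-06-01", "2022-09-01")
)

# Reduce in time: a per-pixel median composite for the period
composite = collection.median()

# Reduce in space: mean of one band over the region of interest
stats = composite.select("SR_B5").reduceRegion(
    reducer=ee.Reducer.mean(), geometry=region, scale=30
)
print(stats.getInfo())
```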
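For the high-resolution topography (HRT) clinic above, a small sketch of two of the surface parameters it mentions, local slope and relief, computed from a gridded DEM with NumPy. The DEM here is synthetic; with real data you would load a lidar-derived raster instead.

```python
import numpy as np

dx = 1.0  # 1 m grid spacing, i.e. lidar-like resolution
x, y = np.meshgrid(np.arange(200), np.arange(200))
dem = 0.05 * x + 2.0 * np.sin(y / 15.0) + np.random.rand(200, 200) * 0.2  # synthetic DEM (m)

# Local slope magnitude from finite differences
dz_dy, dz_dx = np.gradient(dem, dx)
slope = np.sqrt(dz_dx ** 2 + dz_dy ** 2)

# Local relief: elevation range within crude, non-overlapping 10 m blocks
win = 10
relief = np.zeros_like(dem)
for i in range(0, dem.shape[0], win):
    for j in range(0, dem.shape[1], win):
        block = dem[i:i + win, j:j + win]
        relief[i:i + win, j:j + win] = block.max() - block.min()

print(f"mean slope {slope.mean():.3f} m/m, mean block relief {relief.mean():.2f} m")
```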
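For the hydrologic-response talk above, a toy illustration of the stated risk calculation: expected loss as the integral of a damage (loss) function over a distribution of flood depths. The depth-damage curve, depth distribution, and building value are all made up for illustration.

```python
import numpy as np

depths = np.linspace(0.0, 5.0, 501)  # flood depth (m)
dz = depths[1] - depths[0]

# Hypothetical depth-damage (loss) function: fraction of building value lost
damage_fraction = np.clip(depths / 3.0, 0.0, 1.0)

# Hypothetical probability density of annual-maximum flood depth (exponential)
pdf = np.exp(-depths / 0.8) / 0.8

building_value = 250_000.0  # assumed structure value (USD)
expected_annual_loss = building_value * np.sum(damage_fraction * pdf) * dz
print(f"expected annual loss: ${expected_annual_loss:,.0f}")
```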
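For the BMI clinics above, a minimal, self-contained sketch of a BMI-style wrapper around a 2D heat-equation model. This is an illustration only: the real BMI specification (see bmipy) requires a configuration-file-based initialize, a destination-array form of get_value, and many additional grid and variable metadata functions that are omitted here, and the variable name used below is an assumption.

```python
import numpy as np

class Heat2DBmi:
    """2D heat equation (explicit finite differences) behind a BMI-flavored API."""

    def initialize(self, shape=(50, 50), alpha=1.0, dx=1.0, dt=0.2):
        # A real BMI initialize() takes a configuration-file path instead.
        self._alpha, self._dx, self._dt = alpha, dx, dt
        self._time = 0.0
        self._temperature = np.zeros(shape)
        self._temperature[shape[0] // 2, shape[1] // 2] = 100.0  # hot spot

    def update(self):
        # One explicit time step; periodic boundaries via np.roll for brevity.
        T = self._temperature
        lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
               np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4.0 * T) / self._dx ** 2
        self._temperature = T + self._alpha * self._dt * lap
        self._time += self._dt

    def get_current_time(self):
        return self._time

    def get_value(self, name):
        # Simplified: real BMI copies values into a caller-supplied array.
        if name == "plate_surface__temperature":
            return self._temperature.copy()
        raise KeyError(name)

    def finalize(self):
        self._temperature = None

model = Heat2DBmi()
model.initialize()
for _ in range(10):
    model.update()
print(model.get_current_time(), model.get_value("plate_surface__temperature").max())
model.finalize()
```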
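For the pyDeltaRCM clinic above, a hedged sketch of the subclassing pattern it describes. The DeltaModel class and its update/finalize methods follow pyDeltaRCM's documented interface as I understand it, but the constructor argument, the configuration file, and the override shown should all be treated as assumptions to check against the package documentation.

```python
from pyDeltaRCM import DeltaModel

class SlopedBasinModel(DeltaModel):
    """Minimal extension: reuse the standard model and hook in custom behavior."""

    def update(self):
        super().update()
        # Custom per-timestep analysis or forcing could be added here.

# "basin_config.yaml" is a hypothetical YAML configuration file; see the
# pyDeltaRCM documentation for the supported parameter keys.
model = SlopedBasinModel(input_file="basin_config.yaml")
for _ in range(10):
    model.update()
model.finalize()
```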
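For the Landlab flow-routing clinic above, a brief sketch of single-direction (D8) flow accumulation with depression handling on a raster grid. The grid size, synthetic topography, and parameter choices are arbitrary.

```python
import numpy as np
from landlab import RasterModelGrid
from landlab.components import FlowAccumulator

grid = RasterModelGrid((100, 100), xy_spacing=10.0)
z = grid.add_zeros("topographic__elevation", at="node")
z += grid.x_of_node * 0.01 + np.random.rand(grid.number_of_nodes) * 0.1  # tilted plane + noise

# D8 single-direction routing with pit/depression handling
fa = FlowAccumulator(
    grid,
    flow_director="FlowDirectorD8",
    depression_finder="DepressionFinderAndRouter",
)
fa.run_one_step()

print("max drainage area (m2):", grid.at_node["drainage_area"].max())
```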
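For the Bayesian Evidential Learning webinar above, a toy numerical illustration of two ingredients it names, Monte Carlo sampling from a prior and falsification of that prior against observed data. The forward model and all numbers are synthetic and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Prior over a single model parameter (e.g., an effective hydraulic property)
prior_samples = rng.normal(loc=2.0, scale=0.5, size=5000)

def forward(theta):
    """Trivial stand-in forward model: predicted datum for each parameter sample."""
    return 3.0 * theta + rng.normal(0.0, 0.2, size=np.shape(theta))

simulated_data = forward(prior_samples)
observed_datum = 6.3  # synthetic observation

# Falsification check: does the observation fall within the spread of data
# simulated from the prior? If not, the prior should be revised before any
# further learning, sensitivity analysis, or prediction.
lo, hi = np.percentile(simulated_data, [0.5, 99.5])
falsified = not (lo <= observed_datum <= hi)
print(f"prior falsified: {falsified} (simulated range {lo:.2f} to {hi:.2f})")
```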