Property:CSDMS meeting abstract presentation

From CSDMS

This is a property of type Text.

Showing 20 pages using this property.
P
GeoClaw (http://www.geoclaw.org) is an open-source software package for solving two-dimensional depth-averaged equations over general topography using high-resolution finite volume methods and adaptive mesh refinement. Wetting-and-drying algorithms allow modeling inundation and overland flows. The primary applications where GeoClaw has been used are tsunami modeling and storm surge, although it has also been applied to dam break and other overland flooding problems. The first part of this clinic will present an overview of the capabilities of GeoClaw, including a number of new features that have been added in the past few years. These include:
* Depth-averaged Boussinesq-type dispersive equations that better model short-wavelength tsunamis, such as those generated by landslides or asteroid impacts. Solving these equations requires implicit solvers (due to the higher-order derivatives in the equations). This is now working with the adaptive mesh refinement (AMR) algorithms in GeoClaw, which are critical for problems that require high-resolution coastal modeling while also modeling trans-oceanic propagation, for example.
* Better capabilities for extracting output at frequent times on a fixed spatial grid by interpolation from the AMR grids during a computation. The resulting output can then be used for making high-resolution animations or for post-processing (e.g., the velocity field at frequent times can be used for particle tracking, as needed when tracking tsunami debris).
* Ways to incorporate river flows or tidal currents into GeoClaw simulations.
* Better coupling with the D-Claw code for modeling debris flows, landslides, lahars, and landslide-generated tsunamis. (D-Claw is primarily developed by USGS researchers Dave George and Katy Barnhart.)
The second part of the clinic will be a hands-on introduction to installing GeoClaw and running some of the examples included in the distribution, with tips on how best to get started on a new project. GeoClaw is distributed as part of Clawpack (http://www.clawpack.org), and is available via the CSDMS model repository. For those who wish to install the software in advance on laptops, please see http://www.clawpack.org/installing.html. We will also go through this briefly and help with any issues that arise on your laptop (provided it is a Mac or Linux machine; we do not support Windows). You may need to install some prerequisites in advance, such as Xcode on a Mac (since we require "make" and other command line tools), a Fortran compiler such as gfortran, and basic scientific Python tools such as NumPy and Matplotlib. See https://www.clawpack.org/prereqs.html.
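For readers preparing in advance, a minimal sketch of a prerequisite check is shown below; it assumes only the tools named in this abstract (make, gfortran, NumPy, Matplotlib, and the clawpack package) and is not part of the official installation instructions.

```python
# Minimal sketch: check that the prerequisites mentioned above are available.
import shutil

# Command-line tools that Clawpack/GeoClaw builds rely on
for tool in ("make", "gfortran"):
    print(f"{tool}: {'found' if shutil.which(tool) else 'MISSING'}")

# Python packages used for setup and plotting
for pkg in ("numpy", "matplotlib", "clawpack"):
    try:
        mod = __import__(pkg)
        print(f"{pkg} {getattr(mod, '__version__', 'unknown version')} importable")
    except ImportError:
        print(f"{pkg}: MISSING -- see https://www.clawpack.org/prereqs.html")
```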
GeoClaw (http://www.geoclaw.org) is an open-source software package for solving two-dimensional depth-averaged equations over general topography using high-resolution finite volume methods and adaptive mesh refinement. Wetting-and-drying algorithms allow modeling inundation or overland flows. The primary applications where GeoClaw has been used are tsunami modeling and storm surge, although it has also been applied to dam break problems and other overland floods. This tutorial will give an introduction to setting up a tsunami modeling problem in GeoClaw, including:
* Overview of capabilities,
* Installing the software,
* Using Python tools provided in GeoClaw to acquire and work with topography DEMs and other datasets (see the sketch below),
* Setting run-time parameters, including specifying adaptive refinement regions,
* The VisClaw plotting software to visualize results using Python tools or display on Google Earth.
GeoClaw is distributed as part of Clawpack (http://www.clawpack.org). For those who wish to install the software in advance on laptops, please see http://www.clawpack.org/installing.html. Tutorials can be found here: https://github.com/clawpack/geoclaw_tutorial_csdms2019  +
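As a flavor of the GeoClaw Python tools mentioned in the bullet above, the sketch below reads and plots a topography DEM with the topotools module; the file name is a placeholder, and the exact class interface should be checked against the GeoClaw documentation.

```python
# Hedged sketch: load and plot a topography DEM with GeoClaw's Python topotools.
# 'etopo_sample.tt3' is a placeholder file name; topo_type=3 is one of GeoClaw's ASCII formats.
import matplotlib.pyplot as plt
from clawpack.geoclaw import topotools

topo = topotools.Topography()
topo.read('etopo_sample.tt3', topo_type=3)  # read the DEM into the Topography object

topo.plot()                                 # quick-look plot of the DEM
plt.savefig('topo_overview.png')
```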
GeoClaw is an open source Fortran/Python package based on Clawpack (conservation laws package), which implements high-resolution finite volume methods for solving wave propagation problems with adaptive mesh refinement. GeoClaw was originally developed for tsunami modeling and has been validated via benchmarking workshops of the National Tsunami Hazard Mitigation Program for use in hazard assessment studies funded through this program. Current projects include developing new tsunami inundation maps for the State of Washington and the development of new probabilistic tsunami hazard assessment (PTHA) methodologies. The GeoClaw code has also been extended to the study of storm surge and forms the basis for D-Claw, a debris flow and landslide code being developed at the USGS and recently used to model the 2014 Oso, Washington landslide, for example.  +
Get updates on Landlab, and meet fellow users! In this session, the development team will provide a briefing on the latest Landlab developments, and answer your questions.  +
Getting usable information out of climate and weather models can be a daunting task. The direct output from the models typically has unacceptable biases on local scales, and as a result a large number of methods have been developed to bias-correct or downscale the climate model output. This clinic will describe the range of methods available, as well as provide background on the pros and cons of different approaches. It will cover a variety of approaches, from relatively simple methods that just rescale the original output, to more sophisticated statistical methods that account for broader weather patterns, to high-resolution atmospheric models. We will focus on methods for which output or code are readily available for end users, and discuss the input data required by different methods. We will follow this up with a practical session in which participants will be supplied a test dataset and code with which to perform their own downscaling. Participants interested in applying these methods to their own region of interest are encouraged to contact the instructor ahead of time to determine what inputs would be required.  +
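As a concrete illustration of the simplest end of the spectrum described above, the sketch below applies an empirical quantile-mapping bias correction using NumPy; the arrays are synthetic placeholders, not the test dataset that will be supplied in the practical session.

```python
# Hedged sketch of empirical quantile mapping, one of the simple
# rescaling-style bias-correction methods mentioned above.
import numpy as np

rng = np.random.default_rng(0)
obs_hist = rng.gamma(2.0, 3.0, size=10_000)   # observations, historical period (synthetic)
mod_hist = rng.gamma(2.0, 4.0, size=10_000)   # model output, historical period (synthetic)
mod_fut  = rng.gamma(2.2, 4.0, size=10_000)   # model output, future period (synthetic)

def quantile_map(x, mod_ref, obs_ref):
    """Map values x onto the observed distribution via their empirical model quantiles."""
    q = np.searchsorted(np.sort(mod_ref), x) / len(mod_ref)  # quantile of x in the model climate
    return np.quantile(obs_ref, np.clip(q, 0.0, 1.0))        # matching value in the observed climate

mod_fut_corrected = quantile_map(mod_fut, mod_hist, obs_hist)
print(f"raw future mean: {mod_fut.mean():.2f}  corrected: {mod_fut_corrected.mean():.2f}")
```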
Global models of Earth’s climate have expanded beyond their geophysical heritage to include terrestrial ecosystems, biogeochemical cycles, vegetation dynamics, and anthropogenic uses of the biosphere. Ecological forcings and feedbacks are now recognized as important for climate change simulation, and the models are becoming models of the entire Earth system. This talk introduces Earth system models, how they are used to understand the connections between climate and ecology, and how they provide insight to environmental stewardship for a healthy and sustainable planet. Two prominent examples discussed in the talk are anthropogenic land use and land-cover change and the global carbon cycle. However, there is considerable uncertainty in how to represent ecological processes at the large spatial scale and long temporal scale of Earth system models. Further scientific advances are straining under the ever-growing burden of multidisciplinary breadth, countered by disciplinary chauvinism and the extensive conceptual gap between observationalists developing process knowledge at specific sites and global scale modelers. The theoretical basis for Earth system models, their development and verification, and experimentation with these models requires a new generation of scientists, adept at bridging the disparate fields of science and using a variety of research methodologies including theory, numerical modeling, observations, and data analysis. The science requires a firm grasp of models, their theoretical foundations, their strengths and weaknesses, and how to appropriately use them to test hypotheses of the atmosphere-biosphere system. It requires a reinvention of how we learn about and study nature.  +
Google Earth Engine is a powerful geographic information system (GIS) that brings programmatic access and massively parallel computing to petabytes of publicly-available Earth observation data using Google’s cloud infrastructure. In this live-coding clinic, we’ll introduce some of the foundational concepts of workflows in Earth Engine and lay the groundwork for future self-teaching. Using the JavaScript API, we will practice: raster subsetting, raster reducing in time and space, custom asset (raster and vector) uploads, visualization, mapping functions over collections of rasters or geometries, and basic exporting of derived products.  +
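For a sense of the raster-reducing operations listed above, here is a rough sketch using Earth Engine's Python API rather than the JavaScript API used in the clinic; the dataset ID, band, location, and parameters are illustrative only.

```python
# Hedged sketch using the Earth Engine Python API; the clinic itself uses the
# JavaScript API, but the concepts (filtering, reducing in time and space) are the same.
import ee

ee.Initialize()  # assumes an authenticated Earth Engine account

point = ee.Geometry.Point([-105.27, 40.01])            # illustrative location
collection = (ee.ImageCollection('COPERNICUS/S2_SR')   # illustrative dataset ID
              .filterBounds(point)
              .filterDate('2022-06-01', '2022-09-01'))

composite = collection.median()                        # reduce in time: per-pixel median

stats = composite.select('B4').reduceRegion(           # reduce in space: mean of one band
    reducer=ee.Reducer.mean(),
    geometry=point.buffer(1000),
    scale=10)
print(stats.getInfo())
```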
Google Earth Engine (GEE) is a multi-petabyte catalog of satellite imagery and geospatial datasets with planetary-scale analysis capabilities. Now imagine all you need to work on it is a browser and an internet connection. This hands-on workshop will introduce you to and showcase cloud-native geospatial processing. We will explore the platform's built-in catalog of 100+ petabytes of geospatial datasets and build some analysis workflows. Additional topics will include uploading & ingesting your own data to Google Earth Engine, time series analysis essential for change monitoring, and data and code principles for effective collaboration. The hope is to introduce this cloud-native geospatial analysis platform and to rethink how we handle data as we produce and consume more of it. If you want to follow along, bring your laptop and register for an Earth Engine account here: https://signup.earthengine.google.com P.S. I recommend using a personal account :) you get to keep it  +
Granular materials are ubiquitous in the environment, in industry and in everyday life, and yet are poorly understood. Modelling the behavior of a granular medium is critical to understanding problems ranging from hazardous landslides and avalanches in the geosciences to the design of industrial equipment. Typical granular systems contain millions of particles, but the underlying equations governing that collective motion are as yet unknown. The search for a theory of granular matter is a fundamental problem in physics and engineering and of immense practical importance for mitigating the risk of geohazards. Direct simulation of granular systems using the Discrete Element Method is a powerful tool for developing theories and modelling granular systems. I will describe the simulation technique and show its application to a diverse range of flows.  +
Great mentors engage early career scientists in research, open doors, speak the ‘unspoken rules’, and inspire the next generation. Yet many of us step into mentoring roles without feeling fully confident in the role, or are uncertain how to create an inclusive environment that allows early career scientists from varied backgrounds to thrive. In this interactive workshop, we will share experiences and explore tools that can help build successful mentoring relationships, create supportive cohorts, and feel confident in becoming a great mentor.  +
Have you ever needed to use a software package and it won't build on your machine? Have you ever needed to distribute a set of software packages but your collaborators are grumbling that installing all of them is too much of a pain? These are common problems and there are tools that can help to take the pain away. Docker allows you to (1) prepare operating system images with software pre-installed on them, (2) run code inside these containerized OSes independent of the host machine, and (3) share these images online. Additionally, there are ready-made Docker images available for many popular software packages. In this webinar, I'll show how to use ready-made Docker images, how to make your own images, and how this tool can solve some of the more annoying problems that we encounter in scientific software development. If, like me, you viscerally hate learning to use new software tools, I get it, but I swear this one will get you out of a horrible jam some time.  +
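As a minimal sketch of points (1)-(3) above, a Dockerfile for a container with a small scientific Python stack pre-installed might look like the following; the base image, package list, image name, and script are placeholders, not a recommendation.

```dockerfile
# Hedged sketch: an image with a scientific Python stack pre-installed (placeholder names).
# Build:  docker build -t yourname/yourmodel .
# Run:    docker run --rm yourname/yourmodel
# Share:  docker push yourname/yourmodel   (after logging in to a registry)
FROM python:3.11-slim
RUN pip install --no-cache-dir numpy matplotlib
COPY . /work
WORKDIR /work
CMD ["python", "run_model.py"]
```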
Hazard assessment for post-wildfire debris flows, which are common in the steep terrain of the western United States, has focused on the susceptibility of upstream basins to generate debris flows. However, reducing public exposure to this hazard also requires an assessment of hazards in downstream areas that might be inundated during debris flow runout. Debris flow runout models are widely available, but their application to hazard assessment for post-wildfire debris flows has not been extensively tested. I will discuss a study in which we apply three candidate debris flow runout models in the context of the 9 January 2018 Montecito event. We evaluate the relative importance of flow volume and flow material properties in successfully simulating the event. Additionally, I will describe an in-progress user needs assessment designed to understand how professional decision makers (e.g., county emergency managers, floodplain managers, and Burned Area Emergency Response team members) might use post-fire debris flow inundation hazard assessment information. https://agupubs.onlinelibrary.wiley.com/doi/full/10.1029/2021JF006245 Katy Barnhart is a Research Civil Engineer at the U.S. Geological Survey's Geologic Hazards Science Center. She received her B.S.E. (2008) in Civil and Environmental Engineering from Princeton University and her M.S. (2010) and Ph.D. (2015) in Geological Sciences from the University of Colorado at Boulder. Her research uses numerical modeling to understand past and forecast future geomorphic change on a variety of timescales.  +
Here we present direct numerical simulations of the hysteresis of the Antarctic ice sheet and use linear response theory, applied to these kinds of simulations, to project Antarctica's sea level contribution through the end of the century (sketched schematically below). Related publications:
* A. Levermann et al., 2020. Projecting Antarctica's contribution to future sea level rise from basal ice-shelf melt using linear response functions of 16 ice sheet models (LARMIP-2). Earth System Dynamics 11 (2020) 35-76, doi: 10.5194/esd-11-35-2020.
* J. Garbe, T. Albrecht, A. Levermann, J.F. Donges, R. Winkelmann, 2020. The Hysteresis of the Antarctic Ice Sheet. Nature 585 (2020), 538-544, doi: 10.1038/s41586-020-2727-5.  +
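Schematically, the linear-response approach referred to above (and developed in LARMIP-2) writes the sea-level contribution as a convolution of a response function with the forcing:

```latex
% Schematic linear-response relation (LARMIP-2 style); symbols:
%   S(t)      cumulative sea-level contribution at time t
%   m(t')     basal-melt (or other) forcing at time t'
%   R(t-t')   linear response function derived from ice-sheet model experiments
S(t) = \int_{0}^{t} R(t - t')\, m(t')\, \mathrm{d}t'
```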
HexWatershed is a hydrologic flow direction model that supports structured and unstructured meshes. It uses state-of-the-art topological relationship-based stream burning and depression-filling techniques to produce high-quality flow-routing datasets across scales. HexWatershed has substantially improved over the past two years, including support for the DGGRID discrete global grid system (DGGS). This presentation will provide an overview of HexWatershed, highlighting its capabilities, new features, and improvements. Through hands-on tutorials and demonstrations, attendees will gain insights into the underlying philosophy of the HexWatershed model, and how to use HexWatershed products to run large-scale hydrologic models in watersheds worldwide. Specifically, this tutorial will cover major components in the HexWatershed ecosystem, including the computational mesh generation process, river network representation, and flow direction modeling. We will provide participants with resources to extend the tutorial problems and gain additional familiarity with the tools and workflows introduced. Attendees are encouraged to bring their laptops with internet access and a functional web browser. Tutorials will involve scripting operations in the Python language, run through Jupyter Notebooks. We will use the Conda utility to install dependency libraries and Visual Studio Code to run the notebooks.  +
High-quality 3D visualizations can effectively engage audiences and aid our understanding of environmental data, yet creating them often feels out of reach for many researchers. The initial effort to overcome the technical complexity of gathering multiple datasets and learning 3D visualization software can be a barrier to producing photorealistic 3D images. This workshop aims to help participants create 3D visualizations of environmental data using Blender, a free, open-source 3D software. Here, we will focus on visualizing digital elevation models (DEMs), but the skills learned can also apply to visualizing other environmental datasets, for example modeling results, field measurements, or experimental data. We will use a Python-based workflow called TopoRivBlender to fetch, process, and visualize topographic, hydrographic, and satellite imagery geospatial data. With just a single command and a few minutes, this workflow will programmatically download data for an area of interest, process and project multiple geospatial datasets, and render photorealistic 3D visualizations with Blender. By the end of this workshop, participants will be able to:
- Create their own Blender images of topography, hydrography, and satellite imagery of any location on Earth
- Modify basic Blender settings for customizing the look of the 3D images
- Describe data pipelines and how TopoRivBlender uses the workflow management tool Snakemake to fetch, process, and visualize the geospatial data (see the sketch below)  +
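To illustrate the Snakemake idea referenced in the last bullet, a stripped-down Snakefile for a fetch, process, render chain might look like the sketch below; the rule names, file paths, and commands are hypothetical and are not TopoRivBlender's actual rules.

```python
# Hedged Snakefile sketch of a fetch -> process -> render pipeline (hypothetical rules and paths).
rule all:
    input: "render/scene.png"

rule fetch_dem:
    output: "data/dem.tif"
    shell: "python scripts/fetch_dem.py -o {output}"

rule reproject:
    input: "data/dem.tif"
    output: "data/dem_utm.tif"
    shell: "gdalwarp -t_srs EPSG:32613 {input} {output}"

rule render:
    input: "data/dem_utm.tif"
    output: "render/scene.png"
    shell: "blender --background --python scripts/render_dem.py -- {input} {output}"
```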
High-resolution topographic (HRT) data is becoming more easily accessible and prevalent, and is rapidly advancing our understanding of myriad surface and ecological processes. Landscape connectivity is the framework that describes the routing of fluids, sediments, and solutes across a landscape and is a primary control on geomorphology and ecology. Connectivity is not a static parameter, but rather a continuum that dynamically evolves on a range of temporal and spatial scales, and the observation of which is highly dependent on the available methodology. In this clinic we showcase the utility of HRT for the observation and characterization of landscapes and compare results with those of coarser spatial resolution data-sets. We highlight the potential for integrating HRT observations and parameters such as vegetation density, surface relief, and local slope variability with numerical surface process models. Participants will gain an understanding of the basics of HRT, data availability and basic analysis, and the use of HRT parameters in modeling.  +
How can we increase the diversity, richness and value of Spatial Data Infrastructure (SDI) to the Disasters and Natural Hazards community stakeholders? We’ll look at some of the current (and past) Open Geospatial Consortium initiatives to examine exciting work to enable sharing of complex data and models within the community using open standards.  +
Human settlements in dynamic environmental settings face the challenges both of managing their own impact on their surroundings and also adapting to change, which may be driven by a combination of local and remote factors, each of which may involve both human and natural forcings. Impacts of and responses to environmental change play out at multiple scales which involve complex nonlinear interactions between individual actors. These interactions can produce emergent results where the outcome at the community scale is not easily predicted from the decisions taken by individuals within the community. Agent-based simulations can be useful tools to explore the dynamics of both the human response to environmental change and the environmental impacts of human activity. Even very simple models can be useful in uncovering potential for unintended consequences of policy actions. Participatory simulations that allow people to interact with a system that includes simulated agents can be useful tools for teaching and communicating about such unintended consequences. I will report on progress on agent-based simulations of environmentally stressed communities in Bangladesh and Sri Lanka and preliminary results of using a participatory coupled model of river flooding and agent-based real estate markets to teach about unintended consequences of building flood barriers.  +
Humans alter natural geomorphic systems by modifying terrain morphology and through on-going actions that change patterns of sediment erosion, transport, and deposition. Long-term interactions between humans and the environment can be examined using numerical modeling. Human modifications of the landscape such as land cover change and agricultural tillage have been implemented within some landscape evolution models, yet little effort has been made to incorporate agricultural terraces. Terraces of various forms have been constructed for millennia in the Mediterranean, Southeast Asia, and South America; in those regions some terraces have undergone cycles of use, abandonment, and reuse. Current implementations of terraces in existing models are as static objects that uniformly impact landscape evolution, yet empirical studies have shown that terrace impact depends upon whether they are maintained or abandoned. We previously tested a simple terrace model that included a single terrace wall on a synthetic hillside with 20% slope for the impacts of maintenance and abandonment. In this research we modify the terrace model to include a wider variety of terrace forms and couple it with a landscape evolution model to test the extent to which terraced terrain morphology is related to terrace form. We also test how landscape evolution, after abandonment of terraced fields, differs based on the length of time the terraces were maintained. We argue that construction and maintenance of terraces has a significant impact on the spatial patterning of sediment erosion and deposition, and thus landscape evolution modeling of terraced terrain requires coupling with a dynamic model of terrace use.  +
Hurricanes can greatly modify the sedimentary record, but our coastal scientific modeling community has rather limited capability to predict such processes. A three-dimensional sediment transport model was developed in the Regional Ocean Modeling System (ROMS) to study seabed erosion and deposition on the Louisiana shelf in response to Hurricanes Katrina and Rita in the year 2005. Conditions to either side of Hurricane Rita's storm track differed substantially, with the region to the east having stronger winds, taller waves and thus deeper erosion. This study indicated that major hurricanes can disturb the shelf seabed at centimeter to meter levels.  +