Property:CSDMS meeting abstract presentation

This is a property of type Text.

Showing 100 pages using this property.
Background

When it comes to building a general, efficient surface-process code, there are a couple of significant challenges that stand in our way. One is to address the interesting operators that appear in the mathematical formulation but are not commonly encountered in computational mechanics. The other is to cater for the many different formulations that have been put forward in the literature, as no single, universal set of equations has been agreed upon by the community.

Computational Approach

We view Quagmire as a community toolbox and acknowledge that this means there is no one best way to formulate any of the landscape evolution models. We instead provide a collection of useful operators and examples of how to assemble various kinds of models. We do assume that:
- the surface is a single-valued height field described by the coordinates on a two-dimensional plane
- the vertical evolution can be described by the time-derivative of the height field
- the horizontal evolution can be described by an externally imposed velocity field
- the formulation can be expressed through (non-linear) operators on the two-dimensional plane
- any sub-grid parameterisation (e.g. of stream bed geometry) is expressible at the grid scale
- a parallel algorithm is desirable

We don't make any assumptions about:
- the nature of the mesh: triangulation or a regular array of 'pixels'
- the parallel decomposition (except that it is assumed to be general)
- the specific erosion / deposition / transport model

Quagmire is a collection of Python objects that represent parallel vector and matrix operations on meshes and provide a number of useful surface-process operators in matrix-vector form. The implementation is through PETSc, numpy, and scipy. Quagmire is open source and a work in progress.

Mathematical Approach

Matrix-vector multiplication is the duct tape of computational science: fast, versatile, ubiquitous. Any problem that can be formulated in terms of basic linear algebra operations can usually be rendered into an abstract form that opens up multiple avenues to solve the resulting matrix equations, and it is often possible to make extensive use of existing fast, parallel software libraries. Quagmire provides parallel, matrix-based operators on regular Cartesian grids and unstructured triangulations for local operations such as gradient evaluation, diffusion, and smoothing, but also for non-local, graph-based operations that include catchment identification, upstream summation, and flood-filling.

The advantage of the formulation is in the degree of abstraction that is introduced, separating the details of the discrete implementation from the solution phase. The matrix-based approach also makes it trivial to introduce concepts such as multiple pathways in graph traversal / summation operations without altering the structure of the solution phase in any way. Although we have not yet done so, there are obvious future possibilities in developing implicit, non-linear solvers to further improve computational performance and to place the model equations in an inverse modelling framework.
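As a flavor of the matrix-vector idea, upstream summation over a flow graph can be cast as repeated sparse matrix-vector products. The following is a generic scipy sketch of that concept, not Quagmire's own API:

```python
# A generic scipy sketch (not Quagmire's own API) of upstream summation
# as repeated sparse matrix-vector products. A[i, j] = 1 means node j
# drains directly into node i; here the flow graph is 3 -> 2 -> 0, 1 -> 0.
import numpy as np
from scipy.sparse import csr_matrix

n = 4
A = csr_matrix(([1, 1, 1], ([0, 0, 2], [1, 2, 3])), shape=(n, n))

local = np.ones(n)       # locally generated quantity, e.g. runoff
total = local.copy()
increment = local.copy()
while increment.any():   # A is nilpotent on a DAG, so this terminates
    increment = A @ increment
    total += increment

print(total)             # [4. 1. 2. 1.]: node 0 collects nodes 1, 2, 3
```

The iteration ends after as many steps as the longest flow path, and the same solution phase works unchanged if multiple downstream pathways are encoded as fractional matrix entries.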
A comprehensive understanding of hydrologic processes affecting streamflow is required to effectively manage water resources to meet present and future human and environmental needs. The National Hydrologic Model (NHM), developed by the U.S. Geological Survey, can address these needs with an approach supporting coordinated, comprehensive, and consistent hydrologic modeling at multiple scales for the conterminous United States. The NHM fills knowledge gaps in ungaged areas, providing nationally consistent, locally informed, stakeholder-relevant results. In this presentation, we will introduce the NHM and a publicly available Dockerized version that is currently providing daily operational results of water availability and stream temperature. We finish with a quick demonstration of a new experimental version of PRMS, the NHM's underlying hydrologic model, available through the CSDMS Python Modeling Toolkit (pymt).
A presentation from Phaedra and Greg, given at the Modeling Collaboratory for Subduction Research Coordination Network Webinar Series, featuring conversations between the leaders of successful interdisciplinary collaborations (see also https://www.sz4dmcs.org/webinars).
A range of Earth surface processes may drive rapid ice sheet retreat in the future, contributing to equally rapid global sea level rise. Though the pace of discovering these new feedback processes has accelerated in the past decade, predictions of the future evolution of ice sheets are still subject to considerable uncertainty, originating from unknown future carbon emissions and poorly understood ice sheet processes. In this talk, I explain why sea level rise projections past the next few decades are so uncertain, how we are developing new stochastic ice sheet modeling methods to reduce uncertainty in projections, and what the limits of uncertainty reduction are. I also discuss the ongoing debate over whether uncertainty is important to consider at all in developing sea level projections that are usable by coastal planners.
A recent trend in the Earth Sciences is the adoption of so-called “Digital Twins”. In Europe, multi-million and even multi-billion projects have been initiated, for example the Digital Twin of the Ocean and the Digital Twin Earth. Many smaller digital-twin projects are also popping up in the fields of city management, tunnels, hydraulic structures, waterways and coastal management. But what are Digital Twins really? Why are they now trending? What makes a Digital Twin different from a serious game, a numerical model or a simulator? In this session we will look at examples of digital twins, compare them to more traditional platforms and together define our expectations of future digital twins.
A wide variety of hydrological models are used by hydrologists: some differ because they were designed for different applications, some because of personal preferences of the modeller. All of them share the property that, like most scientific research code, it is rather hard to get someone else's model to run. The recently launched eWaterCycle platform takes away the headache of working with each other's models. In eWaterCycle, models are run in containers and communicate with the central (Jupyter-based) runtime environment through BMI. In this way a user can be talking to a Fortran model from Python without having to know anything about Fortran. Removing this headache allows hydrologists to easily run and couple each other's models, facilitating science questions like the impact of model choice on results, or coupling different (regional, process) models together with ease. In this talk I will highlight (and demonstrate) both the technology behind the eWaterCycle platform and the current and future research being done using the platform.
ANUGA is an open source software package capable of simulating small-scale hydrological processes such as dam breaks, river flooding, storm surges and tsunamis. ANUGA is a Python-language model that solves the Shallow Water Wave Equation on an unstructured triangular grid and can simulate shock waves and rapidly changing flows. It was developed by the Australian National University and Geosciences Australia and has an active developer and user community.

The package supports discontinuous elevation, or ‘jumps’ in the bed profile between neighbouring cells. This has a number of benefits. Firstly it can preserve lake-at-rest type stationary states with wet-dry fronts. It can also simulate very shallow frictionally dominated flow down sloping topography, as typically occurs in direct-rainfall flood models. A further benefit of the discontinuous-elevation approach, when combined with an unstructured mesh, is that the model can sharply resolve rapid changes in the topography associated with e.g. narrow prismatic drainage channels, or buildings, without the computational expense of a very fine mesh. The boundaries between such features can be embedded in the mesh using break-lines, and the user can optionally specify that different elevation datasets are used to set the elevation within different parts of the mesh (e.g. often it is convenient to use a raster digital elevation model in terrestrial areas, and surveyed channel bed points in rivers). The discontinuous-elevation approach also supports a simple and computationally efficient treatment of river walls. These are arbitrarily narrow walls between cells, higher than the topography on either side, where the flow is controlled by a weir equation and optionally transitions back to the shallow water solution for sufficiently submerged flows. This allows modelling of levees or lateral weirs which are much finer than the mesh size.

This clinic will provide a hands-on introduction to hydrodynamic modeling using ANUGA. We will discuss the structure and capabilities of the model as we build and run increasingly complex simulations involving channels and river walls. No previous knowledge of Python is required. Example input files will be provided and participants will be able to explore the code and outputs at their own pace.
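For a sense of what participants will write, a minimal ANUGA run might look like the following sketch (standard ANUGA Python idioms; the domain size, slope, and run times are arbitrary illustrative choices):

```python
# A minimal ANUGA simulation sketch (standard ANUGA Python idioms;
# domain size, bed slope, and times are arbitrary illustrative values).
import anuga

domain = anuga.rectangular_cross_domain(20, 10, len1=10.0, len2=5.0)
domain.set_quantity('elevation', lambda x, y: -x / 10)       # sloping bed
domain.set_quantity('stage', expression='elevation + 0.1')   # 0.1 m of water

Br = anuga.Reflective_boundary(domain)
domain.set_boundary({'left': Br, 'right': Br, 'top': Br, 'bottom': Br})

for t in domain.evolve(yieldstep=1.0, finaltime=10.0):
    print(domain.timestepping_statistics())
```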
ANUGA is an open source software package capable of simulating small-scale hydrological processes such as dam breaks, river flooding, storm surges and tsunamis. Thanks to its modular structure, we’ve incorporated additional components to ANUGA that allow it to model suspended sediment transport and vegetation drag. ANUGA is a Python-language model that solves the Shallow Water Wave Equation on an unstructured triangular grid and can simulate shock waves and rapidly changing flows. It was developed by the Australian National University and Geosciences Australia and has an active developer and user community.

This clinic will provide a hands-on introduction to hydrodynamic modeling using ANUGA. We will discuss the structure and capabilities of the model as we build and run increasingly complex simulations. No previous knowledge of Python is required. Example input files will be provided and participants will be able to explore the code and outputs at their own pace.
Accurately characterizing the spatial and temporal variability of water and energy fluxes in many hydrologic systems requires an integrated modeling approach that captures the interactions and feedbacks between groundwater, surface water, and land-surface processes. Increasing recognition that these interactions and feedbacks play an important role in system behavior has led to exciting new developments in coupled surface-subsurface modeling, which is becoming an increasingly useful tool for describing many hydrologic systems.

This clinic will provide a brief background on the theory of coupled surface-subsurface modeling techniques and parallel applications, followed by examples and hands-on experience using ParFlow, an open-source, object-oriented, parallel watershed flow model. ParFlow includes fully-integrated overland flow; the ability to simulate complex topography, geology and heterogeneity; and coupled land-surface processes including the land-energy budget, biogeochemistry, and snow processes. ParFlow is multi-platform and runs with a common I/O structure from laptop to supercomputer. ParFlow is the result of a long, multi-institutional development history and is now a collaborative effort between CSM, LLNL, UniBonn, and UC Berkeley. Many different configurations related to common hydrologic problems will be discussed through example problems.
Addressing society's water and energy challenges requires sustainable use of the Earth's critical zones and subsurface environment, as well as technological innovations in treatment and other engineered systems. Reactive transport models (RTMs) provide a powerful tool to inform engineering design and provide solutions for these critical challenges. In this keynote, I will showcase the flexibility and value of RTMs using real-world applications that focus on (1) assessing groundwater quality management with respect to nitrate under agricultural managed aquifer recharge, and (2) systematically investigating the physical, chemical and biological conditions that enhance CO2 drawdown rates in agricultural settings using enhanced weathering. The keynote will conclude with a discussion of the possibilities to advance the use of reactive transport models and future research opportunities therein.
Agent-Based Modeling (ABM) or Individual-Based Modeling is a research method rapidly increasing in popularity -- particularly among social scientists and ecologists interested in using simulation techniques to better understand the emergence of interesting system-wide patterns from simple behaviors and interactions at the individual scale. ABM researchers frequently partner with other scientists on a wide variety of topics related to coupled natural and human systems. Human societies impact (and are impacted by) various earth systems across a wide range of spatial and temporal scales, and ABM is a very useful tool for better understanding the effect of individual and social decision-making on various surface processes. The clinic will focus on introducing the basic toolkit needed to understand and pursue ABM research, and consider how ABM work differs from other computational modeling approaches. The clinic will:
- Explore examples of the kinds of research questions and topics suited to ABM methods.
- (Attempt to) define some key concepts relevant to ABM research, such as emergence, social networks, social dilemmas, and complex adaptive systems.
- Provide an introduction to ABM platforms, particularly focused on NetLogo.
- Discuss approaches to verification, validation, and scale dependency in the ABM world.
- Introduce the Pattern-Oriented Modeling approach to ABM.
- Discuss issues with reporting ABM research (ODD specification, model publishing).
- Brainstorm tips and tricks for working with social scientists on ABM research.
Agent-Based Models (ABMs) can provide important insights into the nonlinear dynamics that emerge from the interactions of individual agents. While ABMs are commonly used in the social and ecological sciences, this rules-based modeling approach has not been widely adopted in the Surface Dynamics Modeling community. In this clinic, I will show how to build mixed models that utilize ABMs for some processes (e.g., forest dynamics and soil production) and numerical solutions to partial differential equations for other processes (e.g., hillside sediment transport). Specifically, I will introduce participants to pyNetLogo, a library that enables coupling between NetLogo ABMs and Python-based Landlab components. While active developers in either the NetLogo or Landlab communities will find this clinic useful, experience in both programming languages is not needed.
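A minimal sketch of the coupling pattern follows (the pyNetLogo calls are as documented; the model file and the 'count trees' reporter are hypothetical):

```python
# Sketch of driving a NetLogo ABM from Python with pyNetLogo;
# 'forest.nlogo' and the 'count trees' reporter are hypothetical.
import pyNetLogo  # imported as 'pynetlogo' in recent releases

netlogo = pyNetLogo.NetLogoLink(gui=False)
netlogo.load_model('forest.nlogo')
netlogo.command('setup')

for tick in range(100):
    netlogo.command('go')                    # advance the ABM one tick
    n_trees = netlogo.report('count trees')  # pull ABM state into Python
    # ...a Landlab component could use n_trees here, then push updated
    # fields back into the ABM with netlogo.command(...)

netlogo.kill_workspace()
```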
Agent-based modeling (ABM) developed as a method to simulate systems that include a number of agents – farmers, households, governments as well as biological organisms – that make decisions and interact according to certain rules. In environmental modeling, ABM is one of the best ways to explicitly account for human behavior, and to quantify cumulative actions of various actors distributed over the spatial landscape. This clinic provides an introduction to ABM and covers such topics as:
1. Modeling heterogeneous agents that vary in attributes and follow different decision-strategies
2. Going beyond rational optimization and accommodating bounded rationality
3. Designing/representing agents’ interactions and learning.
The clinic provides hands-on examples using the open-source modeling environment NetLogo (https://ccl.northwestern.edu/netlogo). While no prior knowledge of NetLogo is required, participants are welcome to explore its super user-friendly tutorial. The clinic concludes with highlighting the current trends in ABM such as its applications in climate change research, participatory modeling and its potential to link with other types of simulations.
Agent-based modeling (ABM) is a powerful simulation tool with applications across disciplines. ABM has also emerged as a useful tool for capturing complex processes and interactions within socio-environmental systems. This workshop will offer a brief introduction to ABM for socio-environmental systems modeling including an overview of opportunities and challenges. Participants will be introduced to NetLogo, a popular programming language and modeling environment for ABM. In groups, participants will have the hands-on opportunity to program different decision-making methods in an existing model and observe how outcomes change. We will conclude with an opportunity for participants to raise questions or challenges they are experiencing with their own ABMs and receive feedback from the group.
An abstract was not required for this meeting
An overview of what the interagency Working Group stands for.
An update of what CSDMS has accomplished so far.
An update of what CSDMS has accomplished so far.
An update on CoMSES.
Answers to scientific questions often involve coupled systems that lie within separate fields of study. An example of this is flexural isostasy and surface mass transport. Erosion, deposition, and moving ice masses change loads on the Earth surface, which induce a flexural isostatic response. These isostatic deflections in turn change topography, which is a large control on surface processes. We couple a landscape evolution model (CHILD) and a flexural isostasy model (Flexure) within the CSDMS framework to understand interactions between these processes. We highlight a few scenarios in which this feedback is crucial for understanding what happens on the surface of the Earth: foredeeps around mountain belts, rivers at the margins of large ice sheets, and the "old age" of decaying mountain ranges. We also show how the response changes from simple analytical solutions for flexural isostasy to numerical solutions that allow us to explore spatial variability in lithospheric strength. This work places the spotlight on the kinds of advances that can be made when members of the broader Earth surface process community design their models to be coupleable, share them, and connect them under the unified framework developed by CSDMS. We encourage Earth surface scientists to unleash their creativity in constructing, sharing, and coupling their models to better learn how these building blocks make up the wonderfully complicated Earth surface system.
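For reference, with uniform lithospheric strength the flexural response described above is governed by the classic thin elastic plate equation (the standard textbook form, not necessarily the exact statement solved by Flexure):

```latex
D \nabla^4 w + (\rho_m - \rho_f)\, g\, w = q(x, y)
```

where w is the deflection, q the applied surface load, g gravity, ρm and ρf the mantle and infill densities, and D the flexural rigidity; letting D vary spatially is what forces the move from analytical to numerical solutions.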
Are you confused about the best way to make your models and data accessible, reusable, and citable by others? In this clinic we will give you tools, information, and some dedicated time to help make your models and data FAIR - findable, accessible, interoperable and reusable. Models in the CSDMS ecosystem are already well on their way to being more FAIR than models that are not. But here, you will learn more about developments, guidelines, and tools from recent gatherings of publishers, repository leaders, and information technology practitioners at recent FAIR Data meetings, and translate this information into steps you can take to make your scientific models and data FAIR.
Are you interested in expanding the reach of your scientific data or models? One way of increasing the FAIRness of your digital resources (i.e., making them more findable, accessible, interoperable, and reusable) is by annotating them with metadata about the scientific variables they describe. In this talk, we provide a simple introduction to the Scientific Variables Ontology (SVO) and show how, with only a small number of design patterns, it can be used to neatly unpack the definitions of even quite complex scientific variables and translate them into machine-readable form.
Are you tired of hearing about the FAIR Principles? This clinic is for you then, because after you participate you’ll never need to attend another one!* Good science depends on the careful and meticulous management and documentation of our research process. This includes our computational models, the datasets we use, the data transformation, analysis, and visualization scripts and workflows we build to evaluate and assess our models, and the assumptions and design decisions we make while writing our software. Join us for a Carpentries-style interactive clinic with hands-on exercises where we will provide concrete guidance and examples for how to approach, conceptualize, and transform your computational models of earth systems into FAIR contributions to the scientific record whether they are greenfield projects or legacy code with a focus on existing, open infrastructure (GitHub / GitLab / Zenodo). We’ll also cover containerization (Docker, Apptainer) as a way to transparently document system and software dependencies for your models, and how it can be used to support execution on the Open Science Grid Consortium’s Open Science Pool fair-share access compute resources. Big parallel fun! https://osg-htc.org
* individual results may vary, this statement is provably false
As agreed at earlier CSDMS forums, the major impediment in using AI for modeling the deep-ocean seafloor is a lack of training data, the data which guides the AI - whichever set of algorithms is chosen. This clinic will expose participants to globally-extensive datasets which are available through CSDMS. It will debate the scientific questions of why certain data work well, are appropriate to the processes, and are properly scaled. Participants are encouraged to bring their own AI challenges to the clinic.
As global population grows and infrastructure expands, the need to understand and predict processes at and near the Earth’s surface—including water cycling, soil erosion, landsliding, flood hazards, permafrost thaw, and coastal change—becomes increasingly acute. Progress in understanding and predicting these systems requires an ongoing integration of data and numerical models. Advances are currently hampered by technical barriers that inhibit finding, accessing, and operating modeling software and related tools and data sets. To address these challenges, we present CSDMS@HydroShare, a cloud-based platform for accessing and running models, developing model-data workflows, and sharing reproducible results. CSDMS@HydroShare brings together cyberinfrastructure developed by two important community facilities: HydroShare (https://www.hydroshare.org/), which is an online collaboration environment for sharing data, models, and tools, and the CSDMS Workbench (https://csdms.colorado.edu/wiki/Workbench), which is the integrated system of software tools, technologies, and standards for building, interfacing, and coupling models. This workshop presents how to use CSDMS@HydroShare to discover, access, and operate the Python Modeling Tool (PyMT). PyMT is one of the tools from the CSDMS Workbench, which allows users to interactively run and couple numerical models contributed by the community. In PyMT, there are already model components for coastal & permafrost modeling, stratigraphic and subsidence modeling, and terrestrial landscape evolution modeling. It also includes data components to access and download hydrologic and soil datasets from remote servers to feed the model components as inputs. This workshop aims to encourage the community to use existing or develop new model or data components under the PyMT modeling framework and share them through CSDMS@HydroShare to support reproducible research. This workshop includes hands-on exercises using tutorial Jupyter Notebooks and provides general steps for how to develop new components.
At a global scale, deltas significantly concentrate people by providing diverse ecosystem services and benefits for their populations. At the same time, deltas are also recognized as one of the most vulnerable coastal environments, due to a range of adverse drivers operating at multiple scales. These include global climate change and sea-level rise, catchment changes, deltaic-scale subsidence and land cover changes, such as rice to aquaculture. These drivers threaten deltas and their ecosystem services, which often provide livelihoods for the poorest communities in these regions. Responding to these issues presents a development challenge: how to develop deltaic areas in ways that are sustainable, and benefit all residents? In response to this broad question we have developed an integrated framework to analyze ecosystem services in deltas and their linkages to human well-being. The main study area is part of the world’s most populated delta, the Ganges-Brahmaputra-Meghna Delta within Bangladesh. The framework adopts a systemic perspective to represent the principal biophysical and socio-ecological components and their interaction. A range of methods are integrated within a quantitative framework, including biophysical and socio-economic modelling, as well as analysis of governance through scenario development. The approach is iterative, with learning both within the project team and with national policy-making stakeholders. The analysis allows the exploration of biophysical and social outcomes for the delta under different scenarios and policy choices. Some example results will be presented as well as some thoughts on the next steps.
Bed material abrasion is a key control on the partitioning of basin scale sediment fluxes between coarse and fine material. While abrasion is traditionally treated as a simple exponential function of transport distance and a rock-specific abrasion coefficient, experimental studies have demonstrated greater complexity in the abrasion process: the rate of abrasion varies with clast angularity, transport rate, and grain size. Yet, few studies have attempted to assess the importance of these complexities in the field setting. Furthermore, existing approaches generally neglect the heterogeneity in size, abrasion potential, and clast density of the source sediment. Combining detailed field measurements and new modeling approaches, we quantify abrasion in the Suiattle River, a basin in the North Cascades of Washington State dominated by a single coarse sediment source: large, recurrent debris flows from a tributary draining Glacier Peak stratovolcano. Rapid downstream strengthening of river bar sediment and a preferential loss of weak, low-density vesicular volcanic clasts relative to non-vesicular ones suggest that abrasion is extremely effective in this system. The standard exponential model for downstream abrasion fails to reproduce observed downstream patterns in lithology and clast strength in the Suiattle, even when accounting for the heterogeneity of source material strength and the underestimate of abrasion rates by tumbler experiments. Incorporating transport-dependent abrasion into our model largely resolves this failure. These findings hint at the importance of abrasion and sediment heterogeneity in the morphodynamics of sediment pulse transport in river networks. A new modeling tool will allow us to tackle these questions: the NetworkSedimentTransporter, a Landlab component to model Lagrangian bed material transport and channel bed evolution. This tool will allow for future work on the interplay of bed material abrasion and size selective transport at the basin scale. While a simplified approach to characterizing abrasion is tempting, our work demonstrates that sediment heterogeneity and transport-dependent abrasion are important controls on the downstream fate of coarse sediment in fluvial systems.  
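For context, the standard exponential abrasion model referred to above is Sternberg's law, in which clast mass decays with transport distance:

```latex
m(x) = m_0 \, e^{-\alpha x}
```

where m0 is the initial clast mass, x the downstream transport distance, and α the rock-specific abrasion coefficient; the transport-dependent formulation explored here effectively lets α vary with transport rate and grain size rather than remaining constant.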
Biostabilizing organisms, such as saltmarsh and microphytobenthos, can play a crucial role in shaping the morphology of estuaries and coasts by locally stabilizing the sediment. However, their impact on large-scale morphology, which highly depends on the feedback between spatio-temporal changes in their abundance and physical forcing, remains highly uncertain. We studied the effect of seasonal growth and decay of biostabilizing organisms, in response to field-calibrated physical forcings, on estuarine morphology over decadal timescales using a novel eco-morphodynamic model. The code includes temporal saltmarsh and microphytobenthos growth and aging as well as spatially varying vegetation fractions determined by mortality pressures. Growth representations are empirical and literature-based to avoid prior calibration. Novel natural patterns emerged in this model, revealing that observed density gradients in vegetation are defined by the life-stages that increase vegetation resilience with age. The model revealed that the formation of seasonal and long-term mud layering is governed by a ratio of flow velocity and hydroperiod altered by saltmarsh and microphytobenthos differently, showing that the type of biostabilizer determines the conditions under which mud can settle and be preserved. The results show that eco-engineering effects define emerging saltmarsh patterns from a combination of a positive effect reducing flow velocities and a negative effect enhancing hydroperiod. Consequently, saltmarsh and mud patterns emerge from their bilateral interactions that hence strongly define morphological development.
CSDMS 3.0 updates
CSDMS Basic Model Interface (BMI) - When equipped with a Basic Model Interface, a model is given a common set of functions for configuring and running the model (as well as getting and setting its state). Models with BMIs can communicate with each other and be coupled in a modeling framework. The coupling of models from different authors in different disciplines may open new paths to scientific discovery. In this first of a set of webinars on the CSDMS BMI, we'll provide an overview of BMI and the functions that define it. This webinar is appropriate for new users of BMI, although experienced users may also find it useful.

Instructor: Mark Piper, Research Software Engineer, University of Colorado, Boulder
When: November 13th, 12PM Eastern Time
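As a sketch, the heart of the interface looks like the following in Python (a subset of the full specification, shown against the bmipy package's abstract base class):

```python
# A sketch of the core BMI control-and-query functions (a subset of
# the full specification), using the bmipy abstract base class.
from bmipy import Bmi

class BmiMyModel(Bmi):
    def initialize(self, config_file):   # set up the model from a file
        ...
    def update(self):                    # advance the model one time step
        ...
    def finalize(self):                  # clean up resources
        ...
    def get_value(self, name, dest):     # copy a variable into dest
        ...
    def set_value(self, name, values):   # overwrite a model variable
        ...
    # ...plus time, grid, and variable-metadata functions
```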
CSDMS develops and maintains a suite of products and services with the goal of supporting research in the Earth and planetary surface processes community. This includes products such as Landlab, the Basic Model Interface, Data Components, the Model Repository, EKT Labs, and ESPIn. Examples of services include the Help Desk, Office Hours, Roadshows, RSEaaS, and EarthscapeHub. One problem, though, is that if the community doesn't know about these products and services, then they don't get used—and, like the Old Gods in Neil Gaiman's American Gods, they fade into obscurity. Let's break the cycle! Please join us for this webinar where we will present information about all of the products and services offered by CSDMS, and explain how they can help you accelerate your research. Attendees will leave with knowledge of what CSDMS can do for them, which they can bring back to their home institutions and apply to their research and share with their colleagues.
CSDMS has developed a Web-based Modeling Tool – the WMT. WMT allows users to select models, edit model parameters, and run the model on the CSDMS High-Performance Computing System. The web interface makes it straightforward to configure different model components and run a coupled model simulation. Users can monitor progress of simulations and download model output.

CSDMS has developed educational labs that use the WMT to teach quantitative concepts in geomorphology, hydrology, and coastal evolution. These labs are intended to be used by teaching assistants and faculty alike. Descriptions of 4-hr hands-on labs have been developed for HydroTrend, Plume, Sedflux, CHILD, ERODE and ROMS-Lite. These labs include instructions for students to run the models and explore dominant parameters in sets of simulations. Learning objectives are split between topical concepts, on climate change and sediment transport amongst many others, and modeling strategies, modeling philosophy and critical assessment of model results.

In this clinic, we will provide an overview of the available models and labs, and their themes and active learning objectives. We will discuss the requirements and logistics of using the WMT in your classroom. We will run some simulations hands-on, and walk through one lab in more detail as a demonstration. Finally, the workshop intends to discuss future developments for undergraduate course use with the participants.
CSDMS has developed a Web-based Modeling Tool – the WMT. WMT allows users to select models, edit model parameters, and run the model on the CSDMS High-Performance Computing System. The web tool makes it straightforward to configure different model components and run a coupled model simulation. Users can monitor progress of simulations and download model output.

CSDMS has designed educational labs that use the WMT to teach quantitative concepts in geomorphology, hydrology, coastal evolution, and coastal sediment transport. These labs are intended for use by teaching assistants and faculty alike. Descriptions of 2 to 4-hr hands-on labs have been developed for HydroTrend, Plume, Sedflux, CHILD, TOPOFLOW and ROMS-Lite. These labs include instructions for students to run the models and explore dominant parameters in sets of simulations. Learning objectives are split between topical concepts, on climate change and sediment transport amongst many others, and modeling strategies, modeling philosophy and critical assessment of model results.

In this clinic, we will provide an overview of the available models and labs, and their themes and active learning objectives. We will discuss the requirements and logistics of using the WMT in your classroom. We will run some simulations hands-on, and walk through one lab in more detail as a demonstration. Finally, the workshop intends to discuss future developments for learning assessment tools with the participants.
CSDMS has developed the Basic Model Interface (BMI) to simplify the conversion of an existing model in C, C++, Fortran, Java, or Python into a reusable, plug-and-play component. By design, the BMI functions are straightforward to implement. However, in practice, the devil is in the details.

In this hands-on clinic, we will take a model -- in this case, an implementation of the two-dimensional heat equation in Python -- and together, we will write the BMI functions to transform it into a component. As we develop, we’ll unit test our component with nose, and we’ll explore how to use the component with a Jupyter Notebook. Optionally, we can set up a GitHub repository to store and to track changes to the code we write.

To get the most out of this clinic, come prepared to code! We have a lot to write in the time allotted. We recommend that clinic attendees have a laptop with the Anaconda Python distribution installed. We also request that you skim:
- BMI description (https://csdms.colorado.edu/wiki/BMI_Description)
- BMI documentation (http://bmi-forum.readthedocs.io/en/latest)
- BMI GitHub repo (https://github.com/csdms/bmi-live)
before participating in the clinic.
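For a sense of the end product, the finished component can be driven from a notebook roughly like this (the module, class, config-file, and variable names are illustrative assumptions, not necessarily the exact names used in the clinic materials):

```python
# Hypothetical notebook driver for the finished heat-equation component;
# module, class, config, and variable names are illustrative assumptions.
import numpy as np
from heat import BmiHeat

model = BmiHeat()
model.initialize('heat.yaml')

while model.get_current_time() < model.get_end_time():
    model.update()

temperature = np.empty(model.get_grid_size(0))   # buffer for the field
model.get_value('plate_surface__temperature', temperature)
model.finalize()
```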
CSDMS’s newly released Python Modeling Tool (PyMT) is an open source Python package that provides convenient tools for coupling models that use the Basic Model Interface. Historically, earth-surface process models have often been complex and difficult to work with. To help improve this situation and make the discovery process more efficient, the CSDMS Python Modeling Tool (PyMT) provides an environment in which community-built numerical models and tools can be initialized and run directly from a Python command line or Jupyter notebook. To illustrate how PyMT works and the advantages it provides, we will present a demonstration of two coupled models. By simplifying the process of learning, operating, and coupling models, PyMT frees researchers to focus on exploring ideas, testing hypotheses, and comparing models with data.
CSDMS’s newly released Python Modeling Tool (PyMT) is an open source Python package that provides convenient tools for coupling models that use the Basic Model Interface. Historically, earth-surface process models have often been complex and difficult to work with. To help improve this situation and make the discovery process more efficient, PyMT provides an environment in which community-built numerical models and tools can be initialized and run directly from a Python command line or a Jupyter Notebook. To illustrate how PyMT works and the advantages it provides, we will present a demonstration of two coupled models. By simplifying the process of learning, operating, and coupling models, PyMT frees researchers to focus on exploring ideas, testing hypotheses, and comparing models with data. Pre-registration required.

See also: https://pymt.readthedocs.io/en/latest/
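The workflow PyMT enables looks roughly like this (a sketch following the pymt documentation's Hydrotrend example; details may differ with the installed version):

```python
# Sketch of the PyMT workflow, after the pymt documentation's
# Hydrotrend example; details may differ with the installed version.
from pymt.models import Hydrotrend

model = Hydrotrend()
model.initialize(*model.setup())   # write a default config, then start

for _ in range(10):                # advance ten time steps
    model.update()

q = model.get_value('channel_exit_water__volume_flow_rate')
model.finalize()
```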
Changing depth to water table and the associated stored water volume is a crucial component of the global hydrological cycle, with impacts on climate and sea level. However, long-term changes in global water-table distribution are not well understood. Coupled ground- and surface-water models are key to understanding the hydrologic evolution of post-glacial landscapes, the significance of terrestrial water storage, and the interrelationships between freshwater and climate. Here, I present the Water Table Model (WTM), which is capable of computing changes in water table elevation at large spatial scales and over long temporal scales. The WTM comprises groundwater and dynamic lake components to incorporate lakes into water-table elevation estimates. Sample results on both artificial and real-world topographies demonstrate the two-way coupling between dynamic surface-water and groundwater levels and flow.
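For orientation, the groundwater side of such a model is commonly built on a Boussinesq-type equation for the water-table elevation h (a generic form given here for context; not necessarily the WTM's exact formulation):

```latex
S_y \frac{\partial h}{\partial t} = \nabla \cdot \left( T \, \nabla h \right) + R
```

with specific yield S_y, transmissivity T, and recharge R; the dynamic lake component then takes over where water rises above the land surface.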
Cheniers are ridges consisting of coarse-grained sediments, resting on top of muddy sediment. Along these muddy coastlines, cheniers provide shelter against wave attack, mitigating erosion or even enhancing accretion. As such, cheniers play an important role in the dynamics of the entire coastal landscape. This research focused on cheniers along mangrove-mud coasts. Therefore, chenier dynamics needed to be understood at the temporal and spatial scales of the mangrove vegetation as well. We developed a hybrid modelling approach, combining the strengths of complex process-based modelling (Delft3D), which allowed us to model the mixed-sediment dynamics at small temporal and spatial scales, with the strengths of a highly idealized profile model, providing low computational efforts for larger temporal and spatial scales.
Climate and tectonics ultimately drive the physical and chemical surface processes that evolve landscape structure, including the connectivity of landscape portions that facilitate or impede movement of organismal populations. Connectivity controls population spatial distribution, drives speciation where populations spatially fragment, and increases extinction susceptibility of a species where its habitat shrinks. Here I demonstrate the role that landscape evolution models can have in exploring these process linkages in investigations of species diversification driven by climatic and tectonic forcings. The models were built with the tool SpeciesEvolver, which constructs lineages in response to environmental change at geologic, macroevolutionary, and landscape scales. I will also suggest how future studies can use landscape evolution models and tools such as SpeciesEvolver to pursue questions regarding the mechanisms by which lineages respond to the drivers and details of landscape evolution, and taxon-specific and region-specific interactions between biotas and their environments.
Climate-induced disturbances are expected to increase in frequency and intensity and affect coastal wetland ecosystems mainly through altering their hydrology. Investigating how wetland hydrology responds to climate disturbances is an important first step to understand the ecological response of coastal wetlands to these disturbances. In this talk, I am going to introduce my research work on improving the understanding of how the water storage of coastal wetlands in North Carolina, Delaware Bay, and the entire southeast U.S. changes under climatic disturbances. In particular, I will address the uncertainties in estimating water flow through coastal wetlands by considering 1) the regional-scale hydrologic interaction between uplands, coastal wetlands, and the ocean and 2) the impact of coastal eco-geomorphologic change on the freshwater and saltwater interaction on coastal marshlands.
Closing of the meeting
Cloud computing is a powerful tool for both analyzing large datasets and running models. This clinic will provide an introduction to approaches for accessing and using cloud resources for research in the Geosciences. During the hands-on portion of this clinic, participants will learn how to use Amazon Web Services (AWS) to open a terminal, analyze model output in Python, and run a model, time permitting. This workshop assumes no experience with cloud computing.
Coastal Risk is a flood and natural hazard risk assessment technology company. Our mission is to help individuals, businesses and governments in the US and around the world achieve resilience and sustainability.

In the past year, Coastal Risk’s technology supported nearly $2 billion in US commercial real estate investment and development. Coastal Risk’s unique business model combines high-tech, flood, climate and natural hazards risk assessments and high-value, risk communication reports with personalized, resilience-accelerating advice for individuals, corporations and governments. Our risk modeling and reports help save lives and property in the US. In order to take our system around the world, however, we need higher resolution DEMs. The 30m resolution currently available is a big obstacle to going international. This is something that we would like to get from NASA. Also, we are interested in high-resolution, “before-and-after” satellite imagery of flooded areas to compare with our modeling and to help individuals, businesses and governments understand how to better defend against floods.
Coastal communities facing shoreline erosion preserve their beaches both for recreation and for property protection. One approach is nourishment, the placement of externally-sourced sand to increase the beach’s width, forming an ephemeral protrusion that requires periodic re-nourishment. Nourishments add value to beachfront properties, thereby affecting re-nourishment choices for an individual community. However, the shoreline represents an alongshore-connected system, such that morphodynamics in one community are influenced by actions in neighboring communities. Prior research suggests coordinated nourishment decisions between neighbors were economically optimal, though many real-world communities have failed to coordinate, and the geomorphic consequences of this failure are unknown. Toward understanding this geomorphic-economic relationship, we develop a coupled model representing two neighboring communities and an adjacent non-managed shoreline. Within this framework, we examine scenarios where communities coordinate nourishment choices to maximize their joint net benefit versus scenarios where decision-making is uncoordinated such that communities aim to maximize their independent net benefits. We examine how community-scale property values affect choices produced by each management scheme and the economic importance of coordinating. The geo-economic model produces four behaviors based on nourishment frequency: seaward growth, hold the line, slow retreat, and full retreat. Under current conditions, coordination is strongly beneficial for wealth-asymmetric systems, where less wealthy communities acting alone risk nourishing more than necessary relative to their optimal frequency under coordination. For a future scenario, with increased material costs and background erosion due to sea-level rise, less wealthy communities might be unable to afford nourishing their beach independently and thus lose their beachfront properties.
Coastal environments are complex because of the interplay between aeolian and nearshore processes. Waves, currents, tides, and winds drive significant short term (< weekly) changes to coastal landforms which augment longer term (> annual) geomorphic trends. Great strides have been made in recent years regarding our ability to model coastal geomorphic change in this range of societally relevant time scales. However, a great disparity exists in modeling coastal evolution because subaqueous and subaerial processes are typically assessed completely independently of one another. By neglecting the co-evolution of subtidal and supratidal regions within our current framework, we are precluded from fully capturing non-linear dynamics of these complex systems. This has implications for predicting coastal change during both fair weather and storm conditions, hindering our ability to answer important scientific questions related to coastal vulnerability and beach building.

Recognizing these historic limitations, here we present the outline for a coupled subaqueous (XBeach) and subaerial (Coastal Dune Model) morphodynamic modeling system that is in active development with the goal of exploring coastal co-evolution on daily to decadal timescales. Furthermore we present recently collected datasets of beach and dune morphology in the Pacific Northwest US that will be used to validate trends observed within the coupled model platform.
Coastal flooding and related hazards have increasingly become some of the most impactful events as climate change continues to alter the risk posed by these events. Measuring the change in the risk of a particular flood level has therefore taken on a greater urgency, as historic measurements and statistics are no longer sufficient to measure the risk to coastal communities. Improving our ability to compute these changes has become the focus as adaptation strategies for the changing climate become increasingly critical. This talk will outline some of these challenges and ways we are attempting to address the problem in a multi-hazard aware way.
Coastal morphological evolution is caused by a wide range of coupled cross-shore and alongshore sediment transport processes associated with short waves, infragravity waves, and wave-induced currents. However, the fundamental transport mechanisms occur within the thin bottom boundary layer and are dictated by turbulence-sediment interaction and inter-granular interactions. In the past decade, significant progress has been made in modeling sediment transport using the Eulerian-Eulerian or Eulerian-Lagrangian two-phase flow approach. However, most of these models are limited to a one-dimensional-vertical (1DV) formulation, which is only applicable to Reynolds-averaged sheet flow conditions. Consequently, complex processes such as instabilities of the transport layer, bedform dynamics and turbulence-resolving capability cannot be simulated. The main objective of my research study was to develop a multi-dimensional four-way coupled two-phase model for sediment transport that can be used for Reynolds-averaged modeling for large-scale applications or for turbulence-resolving simulations at small scale.
Coastal systems are an environmental sink for a wide range of materials of scientific interest, including sediments, nutrients, plastics, oils, seeds, and wood, to name only a few. Due to differences in material properties such as buoyancy, each of these materials is liable to have characteristic transport pathways which differ from the mean flow and each other, hydraulically “sorting” these materials in space. However, it remains difficult to quantify these differences in transport, due in part to the use of disparate models and approaches for each respective material. In this talk, I will advance a novel modeling framework for simulating the patterns of transport for a wide range of fluvially-transported materials using a single unified reduced-complexity approach, allowing us to compare and quantify differences in transport between materials. Using a hydrodynamic model coupled with the stochastic Lagrangian particle-routing model “dorado,” we are able to simulate at the process level how local differences in material buoyancy lead to emergent changes in partitioning and nourishment in river deltaic systems. I will show some of the insights we have learned regarding the tendency for materials to be autogenically sorted in space, as well as progress we have made bridging between the process-level framework used in dorado and more physics-based approaches based on transport theory.
Computer models help us explore the consequences of scientific hypotheses at a level of precision and quantification that is impossible for our unaided minds. The process of writing and debugging the necessary code is often time-consuming, however, and this cost can inhibit progress. The code-development barrier can be especially problematic when a field is rapidly unearthing new data and new ideas, as is presently the case in surface dynamics.

To help meet the need for rapid, flexible model development, we have written a prototype software framework for two-dimensional numerical modeling of planetary surface processes. The Landlab software can be used to develop new models from scratch, to create models from existing components, or a combination of the two. Landlab provides a gridding module that allows you to create and configure a model grid in just a few lines of code. Grids can be regular or unstructured, and can readily be used to implement staggered-grid numerical solutions to equations for various types of geophysical flow. The gridding module provides built-in functions for common numerical operations, such as calculating gradients and integrating fluxes around the perimeter of cells. Landlab is written in Python, a high-level language that enables rapid code development and takes advantage of a wealth of libraries for scientific computing and graphical output. Landlab also provides a framework for assembling new models from combinations of pre-built components.

In this clinic we introduce Landlab and its capabilities. We emphasize in particular its flexibility, and the speed with which new models can be developed under its framework. We will introduce the many tools available within Landlab that make development of new functionality and new descriptions of physical processes both easy and fast. Participants will finish the clinic with all the knowledge necessary to build, run and visualize 2D models of various types of earth surface systems using Landlab.
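For example, creating a grid, attaching a field, and applying the built-in gradient and flux-divergence operators takes only a few lines (current Landlab API; the diffusion coefficient is an arbitrary illustrative value):

```python
# A few lines of Landlab grid usage of the kind described above
# (current Landlab API; the diffusivity value is arbitrary).
import numpy as np
from landlab import RasterModelGrid

grid = RasterModelGrid((4, 5), xy_spacing=10.0)    # 4 x 5 nodes, 10 m apart
z = grid.add_zeros('topographic__elevation', at='node')
z += np.random.rand(grid.number_of_nodes)          # rough initial surface

grad = grid.calc_grad_at_link(z)                   # built-in gradient operator
flux = -0.01 * grad                                # linear diffusion, K = 0.01
dzdt = -grid.calc_flux_div_at_node(flux)           # erosion/deposition rate
```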
D-Claw is an extension of the software package GeoClaw (www.clawpack.org) for simulating flows of granular-fluid mixtures with evolving volume fractions. It was developed primarily for landslides, debris flows and related phenomena by incorporating principles of solid, fluid and soil mechanics. However, because the two-phase model accommodates variable phase concentrations, it can also be used to model fluid problems in the absence of solid content (the model equations reduce to the shallow water equations as the solid phase vanishes). We therefore use D-Claw to seamlessly simulate multifaceted problems that involve the interaction of granular-fluid mixtures and bodies of water. This includes a large number of cascading natural hazards, such as debris-avalanches and lahars that enter rivers and lakes, landslide-generated tsunamis, landslide dams and outburst floods that entrain debris, and debris-laden tsunami inundation. I will describe the basis of D-Claw's model equations and highlight some recent applications, including the 2015 Tyndall Glacier landslide and tsunami, potential lahars on Mt. Rainier that displace dammed reservoirs, and a hypothetical landslide-generated lake outburst flood near Sisters, Oregon.
DES3D (Dynamic Earth Solver in Three Dimensions) is a flexible, open-source finite element solver that models momentum balance and heat transfer in elasto-visco-plastic material in the Lagrangian form using unstructured meshes. It provides a modeling platform for long-term tectonics as well as various problems in civil and geotechnical engineering. On top of the OpenMP multi-thread parallelism, DES3D has recently adopted CUDA for GPU computing. The CUDA-enabled version shows speedup of two to three orders of magnitude compared to the single-thread performance, making high-resolution 3D models affordable. This clinic will provide an introduction to DES3D’s features and capabilities and hands-on tutorials to help beginners start using the code for simple tectonic scenarios. Impact of the two types of parallelization on performance will be demonstrated as well.
Dakota (https://dakota.sandia.gov) is an open-source software toolkit, designed and developed at Sandia National Laboratories, that provides a library of iterative systems analysis methods, including sensitivity analysis, uncertainty quantification, optimization, and parameter estimation. Dakota can be used to answer questions such as:
- What are the important parameters in my model?
- How safe, robust, and reliable is my model?
- What parameter values best match my observational data?
Dakota has been installed on the CSDMS supercomputer, beach.colorado.edu, and is available to all registered users. The full set of Dakota methods can be invoked from the command line on beach; however, this requires detailed knowledge of Dakota, including how to set up a Dakota input file and how to pass parameters and responses between a model and Dakota. To make Dakota more accessible to the CSDMS community, a subset of its functionality has been configured to run through the CSDMS Web Modeling Tool (WMT; https://csdms.colorado.edu/wmt). WMT currently provides access to Dakota's vector, centered, and multidimensional parameter study methods.

In this clinic, we'll provide an overview of Dakota, then, through WMT, set up and perform a series of numerical experiments with Dakota on beach, and evaluate the results. Other material can be downloaded from: https://github.com/mdpiper/dakota-tutorial.
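For orientation, a minimal Dakota input file for one of these parameter studies has the following shape (block keywords follow the Dakota reference manual; the descriptors and driver script are hypothetical):

```
# Hypothetical Dakota input for a vector parameter study; keywords
# follow the Dakota manual, but descriptors and driver are made up.
environment
  tabular_data
    tabular_data_file = 'study.dat'

method
  vector_parameter_study
    final_point = 1.0 2.0
    num_steps = 10

variables
  continuous_design = 2
    descriptors    'T_air'  'P_rain'
    initial_point   0.0      0.0

interface
  fork
    analysis_driver = 'run_model.py'

responses
  response_functions = 1
  no_gradients
  no_hessians
```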
Dakota is a flexible toolkit with algorithms for parameter optimization, uncertainty quantification, parameter estimation, and sensitivity analysis. In this clinic we will work through examples of using Dakota to compare field observations with model output using methods of sensitivity analysis and parameter optimization. We will also examine how the choice of comparison metrics influences results. Methods will be presented in the context of the Landlab Earth-surface dynamics framework but are generalizable to other models. Participants who are not familiar with Landlab are encouraged (but not required) to sign up for the Landlab clinic, which will take place before this clinic.

Participants are encouraged to install both Landlab and Dakota on their computers prior to the clinic. Installation instructions for Landlab can be found at: http://landlab.github.io (select "Install" from the menu bar at the top of the page). Installation instructions for Dakota can be found at https://dakota.sandia.gov/content/install-dakota.
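The glue between Dakota and a model is an analysis driver that reads a Dakota parameters file and writes back a results file. A minimal Python sketch (the model call and the observations below are placeholders):

```python
# Sketch of a Dakota 'fork' analysis driver: Dakota calls this script
# with a parameters file and a results file as arguments. The model
# and the observations below are placeholders.
import sys
import numpy as np

def run_model(**params):
    # Stand-in for a real (e.g., Landlab) simulation.
    return np.full(10, params.get('K', 0.0))

params_file, results_file = sys.argv[1], sys.argv[2]

# The Dakota parameters file lists "value descriptor" pairs.
params = {}
with open(params_file) as fp:
    n_vars = int(fp.readline().split()[0])
    for _ in range(n_vars):
        value, name = fp.readline().split()[:2]
        params[name] = float(value)

observed = np.zeros(10)   # placeholder observations
rmse = np.sqrt(np.mean((run_model(**params) - observed) ** 2))

with open(results_file, 'w') as fp:
    fp.write(f'{rmse} rmse\n')   # one response value per line
```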
Dakota is a flexible toolkit with algorithms for parameter optimization, uncertainty quantification, parameter estimation, and sensitivity analysis. In this clinic we will cover the basics of the Dakota framework, work through examples of using Dakota to compare field observations with model output using methods of sensitivity analysis and parameter optimization, and briefly cover the theoretical background of the Dakota methods used. If time permits, we will examine how the choice of comparison metrics influences results. Methods will be presented in the context of the Landlab Earth-surface dynamics framework but are generalizable to other models. Participants who are not familiar with Landlab are encouraged (but not required) to sign up for the Landlab clinic, which will take place before this clinic.

Participants do not need to install Landlab or Dakota prior to the clinic but will need to sign up for a HydroShare account: https://www.hydroshare.org/sign-up/.

For those students interested in installing Landlab or Dakota: installation instructions for Landlab can be found at http://landlab.github.io (select "Install" from the menu bar at the top of the page). Installation instructions for Dakota can be found at https://dakota.sandia.gov/content/install-dakota.
Dakota is an open-source toolkit with several types of algorithms, including sensitivity analysis (SA), uncertainty quantification (UQ), optimization, and parameter calibration. Dakota provides a flexible, extensible interface between computational simulation codes and iterative analysis methods such as UQ and SA methods. Dakota has been designed to run on high-performance computing platforms and handles several types of parallelism. In this clinic, we will provide an overview of Dakota algorithms, specifically focusing on uncertainty quantification (including various types of sampling, reliability analysis, stochastic expansion, and epistemic methods), sensitivity analysis (including variance-based decomposition methods and design of experiments), and parameter calibration (including nonlinear least squares and Bayesian methods). The tutorial will provide an overview of the methods and discuss how to use them. In addition, we will briefly cover how to interface your simulation code with Dakota.  +
A data component is a software tool that wraps the API for a data source with a Basic Model Interface (BMI). It is designed to provide a consistent way to access various types of datasets, and subsets of them, without needing to know the original data API. Each data component can also interact with numerical models that are wrapped in the pymt modeling framework. This webinar will introduce the data component concept with a demonstration of several examples for time series, raster, and multidimensional space-time data.  +
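As a hedged illustration of the pattern the webinar demonstrates, the sketch below fetches values from a data component through pymt; the component name (Nwis), configuration file, and variable name are placeholders rather than guaranteed pymt identifiers.
<pre>
# Illustrative pymt data-component usage following the BMI
# initialize/update/get_value life cycle.
from pymt.models import Nwis  # placeholder component name

nwis = Nwis()
nwis.initialize("config.yaml")           # e.g., site, variable, date range
nwis.update()                            # advance to the next record
discharge = nwis.get_value("discharge")  # placeholder variable name
nwis.finalize()
</pre>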
Debris flows pose a substantial threat to downstream communities in mountainous regions across the world, and there is a continued need for methods to delineate hazard zones associated with debris-flow inundation. Here we present ProDF, a reduced-complexity debris-flow inundation model. We calibrated and tested ProDF against observed debris-flow inundation from eight study sites across the western United States. While the debris flows at these sites varied in initiation mechanism, volume, and flow characteristics, results show that ProDF is capable of accurately reproducing observed inundation in different settings and geographic areas. ProDF reproduced observed inundation while maintaining computational efficiency, suggesting the model may be applicable in rapid hazard assessment scenarios.  +
Decision framing is a key, early step in any effective decision support engagement in which modelers aim to inform decision and policy making. In this clinic participants will work through and share the results of decision framing exercises for a variety of policy decisions. We will organize the exercise using the XLRM elicitation framework, commonly used in decision making under deep uncertainty (DMDU) stakeholder engagements. The XLRM framework is useful because it helps organize relevant factors into the components of a decision-centric analysis. The letters X, L, R, and M refer to four categories of factors important to robust decision making (RDM) analysis: outcome measures (M) that reflect decision makers’ goals; policy levers (L) that decision makers use to pursue their goals; uncertainties (X) that may affect the connection between policy choices and outcomes; and relationships (R), often instantiated in mathematical simulation models, between uncertainties and levers and outcomes.  +
Deep-learning emulators can dramatically reduce the computational time needed to solve physical models. Trained on a state-of-the-art high-order ice flow model, the Instructed Glacier Model (IGM, https://github.com/jouvetg/igm) is an easy-to-use Python code based on the TensorFlow library that can simulate the 3D evolution of glaciers several orders of magnitude faster than the instructor model, with minor loss of accuracy. Switching to a Graphics Processing Unit (GPU) permits additional significant speed-ups, especially when modeling large-scale glacier networks and/or high spatial resolutions. Taking advantage of GPUs, IGM can also track a massive number of particles moving within the ice flow, opening new perspectives for modeling the transport of debris of any size (e.g., erratic boulders). Here I give an overview of IGM, illustrate its potential to simulate paleo and future glacier evolution in the Alps together with particle-tracking applications, and do a quick live demo of the model.  +
Delta morphology  +
Deltas are highly sensitive to local human activities, land subsidence, regional water management, global sea-level rise, and climate extremes. In this talk, I’ll discuss a recently developed risk framework for estimating the sensitivity of deltas to relative sea level rise, and the expected impact on flood risk. We apply this framework to an integrated set of global environmental, geophysical, and social indicators over 48 major deltas to quantify how delta flood risk due to extreme events is changing over time. Although geophysical and relative sea-level rise derived risks are distributed across all levels of economic development, wealthy countries effectively limit their present-day threat by gross domestic product–enabled infrastructure and coastal defense investments. However, when investments do not address the long-term drivers of land subsidence and relative sea-level rise, overall risk can be very sensitive to changes in protective capability. For instance, we show how in an energy-constrained future scenario, such protections will probably prove to be unsustainable, raising relative risks by four to eight times in the Mississippi and Rhine deltas and by one-and-a-half to four times in the Chao Phraya and Yangtze deltas. This suggests that the current emphasis on short-term solutions on the world’s deltas will greatly constrain options for designing sustainable solutions in the long term.  +
Developed barriers are tightly-coupled systems driven by feedbacks between natural processes and human decisions to maintain development. Coastal property markets are dynamically linked to the physical environment: large tax revenues and high-value infrastructure necessitate defensive coastal management through beach nourishment, dune development, overwash removal, and construction of hard structures. In turn, changes to environmental characteristics such as proximity to the beach, beach width, and the height of dunes influence coastal property values. In this talk I will use a new exploratory model framework – the CoAStal Community-lAnDscape Evolution (CASCADE) model – to explore the coupled evolution of coastal real estate markets and barrier landscapes. The framework couples two geomorphic models of barrier evolution (Barrier3D and BRIE) with an agent-based real estate model – the Coastal Home Ownership Model (CHOM). CHOM receives information about the coastal environment and acts on that information to cause change to the environment, including decisions about beach nourishment and dune construction and maintenance. Through this coupled model framework, I will show how the effects of dune and beach management strategies employed in the wake of extreme storms cascade through decades to alter the evolution of barriers, inadvertently inhibiting their resilience to sea level rise and storms, and ultimately unraveling coastal real estate markets.  +
Developers of solvers for PDE-based models and other computationally intensive tasks are confronted with complexity on many fronts, from science requirements to algorithms and data structures to GPU programming models. We will share a fresh approach that has delivered order-of-magnitude speedups in computational mechanics workloads, minimizing incidental complexity while offering transparency and extensibility. In doing so, we'll examine the PETSc and libCEED libraries, validate performance models, and discuss sustainable architecture for community development. We'll also check out Enzyme, an LLVM-based automatic differentiation tool that can be used with legacy code and multi-language projects to provide adjoint (gradient) capabilities.  +
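As one small, hedged illustration of the kind of building block discussed here, the following petsc4py sketch assembles a tridiagonal system and solves it with a Krylov method; it assumes petsc4py is installed and is not drawn from the talk itself.
<pre>
# Assemble a 1D Laplacian-like matrix and solve A x = b with PETSc.
from petsc4py import PETSc

n = 10
A = PETSc.Mat().createAIJ([n, n])
A.setUp()
for i in range(n):
    A.setValue(i, i, 2.0)
    if i > 0:
        A.setValue(i, i - 1, -1.0)
    if i < n - 1:
        A.setValue(i, i + 1, -1.0)
A.assemble()

b = A.createVecLeft()
b.set(1.0)
x = A.createVecRight()

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setFromOptions()  # pick solvers at run time, e.g. -ksp_type cg
ksp.solve(b, x)
</pre>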
Digital twins are increasingly important in many domains, including for understanding and managing the natural environment. Digital twins of the natural environment are fueled by the unprecedented amounts of environmental data now available from a variety of sources, from remote sensing to potentially dense deployments of Earth-based sensors. Because of this, data science techniques inevitably have a crucial role to play in making sense of this complex, highly heterogeneous data. This webinar will reflect on the role of data science in digital twins of the natural environment, with particular attention to how the resulting data models can work alongside the rich legacy of process models that exist in this domain. We will seek to unpick the complex two-way relationship between data and process understanding. By focusing on the interactions, we will end up with a template for digital twins that incorporates a rich, highly dynamic learning process with the potential to handle the complexities and emergent behaviors of this important area.  +
Does permafrost impart topographic signatures, and how does subsequent warming affect hillslope and channel form? Permafrost controls the depth to immobile soil, and tundra vegetation influences infiltration and erosion thresholds. I will use high-resolution maps of arctic landscapes to examine morphometric properties like hillslope length, curvature and drainage density as functions of climate and vegetation. I will then compare these data to existing models of climate-modulated sediment flux and channel incision in Landlab, exploring the effect of more nuanced representations of permafrost flux laws and hydrology. I will also compare modeled landscapes forced with Pleistocene-Holocene climate to mid-latitude landscape form.  +
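For readers unfamiliar with the Landlab models referred to here, a minimal sketch of hillslope diffusion plus stream-power incision looks like the following; the grid size, rates, and the absence of any permafrost-specific flux law are placeholder simplifications.
<pre>
# Minimal Landlab sketch: uplift, flow routing, channel incision,
# and linear hillslope diffusion on a synthetic grid.
import numpy as np
from landlab import RasterModelGrid
from landlab.components import (FlowAccumulator, LinearDiffuser,
                                StreamPowerEroder)

grid = RasterModelGrid((50, 50), xy_spacing=10.0)
z = grid.add_zeros("topographic__elevation", at="node")
z += np.random.rand(z.size)  # small initial roughness

fa = FlowAccumulator(grid)
diffuse = LinearDiffuser(grid, linear_diffusivity=0.01)  # m2/yr
incise = StreamPowerEroder(grid, K_sp=1e-5)

dt, uplift = 1000.0, 0.001  # yr, m/yr
for _ in range(500):
    z[grid.core_nodes] += uplift * dt
    fa.run_one_step()
    incise.run_one_step(dt)
    diffuse.run_one_step(dt)
</pre>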
During a clinic session at the 2013 CSDMS annual meeting, OpenFOAM®, an open source computational fluid dynamics (CFD) platform, was first introduced by Dr. Xiaofeng Liu (now at Penn State University) for modeling general earth surface dynamics. OpenFOAM® provides various libraries, solvers and toolboxes for solving fluid physics problems via the finite volume method. The objective of this clinic is to further discuss its recent development and applications to coastal sediment transport. The clinic will start with an overview of a range of coastal applications using OpenFOAM®. We will then focus on a recently released solver, SedFOAM, for modeling sand transport by using an Eulerian two-phase flow methodology. Specifically, we will focus on applying the model to study wave-driven sheet flows and the occurrence of momentary bed failure. The code can be downloaded via the CSDMS code repository, and participants will receive hands-on training in the coding style, available numerical schemes in OpenFOAM®, computational domain setup, input/output, and model result analysis. Knowledge of C++, object-oriented programming, and parallel computing is not required but will be helpful.  +
During the clinic we'll introduce the new Delft3D Flexible Mesh modeling environment. We'll discuss the basic features and set up a simple 2D morphological model. The ongoing developments and the possibility to use BMI for runtime interaction will be presented as well. The user interface runs on Windows, so make sure that you have a Windows computer or virtual machine available during the meeting. The user interface will be provided precompiled; you'll have to compile the computational kernels yourself. We'll provide instructions on how to compile the FORTRAN/C kernels before the clinic.  +
Earth scientists face serious challenges when working with large datasets. Pangeo is a rapidly growing community initiative and open source software ecosystem for scalable geoscience using Python. Three of Pangeo’s core packages are 1) Jupyter, a web-based tool for interactive computing, 2) Xarray, a data-model and toolkit for working with N-dimensional labeled arrays, and 3) Dask, a flexible parallel computing library. When combined with distributed computing, these tools can help geoscientists perform interactive analysis on datasets up to petabytes in size. In this interactive tutorial we will demonstrate how to employ this platform using real science examples from hydrology, remote sensing, and oceanography. Participants will follow along using Jupyter notebooks to interact with Xarray and Dask running in Google Cloud Platform.  +
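A minimal example of the pattern the tutorial teaches, with a placeholder file and variable name: Xarray opens the data lazily in Dask chunks, and nothing is computed until .compute() is called.
<pre>
# Lazy, chunked analysis with Xarray + Dask.
import xarray as xr

ds = xr.open_dataset("sst_daily.nc", chunks={"time": 365})
climatology = ds["sst"].groupby("time.month").mean("time")
result = climatology.compute()  # triggers the parallel computation
</pre>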
Earth surface processes are modulated by fascinating interactions between climate, tectonics, and biota. These interactions are manifested over diverse temporal and spatial scales, ranging from seconds to millions of years and from microns to thousands of kilometers, respectively. Investigations into Earth surface shaping by biota have attracted growing attention over recent decades and are a research frontier. In this lecture, I present an integration of new observational and numerical modeling research on the influence of vegetation type and cover on the erosion of mountains. I do this through an investigation of millennial-timescale catchment denudation rates measured along the extreme climatic and ecologic gradient of the western margin of South America.  +
Earthquakes are the most frequent source of classic tsunami waves. Other processes that generate tsunami waves include landslides, volcanic eruptions, and meteorite impacts. Furthermore, atmospheric disturbances can also generate tsunami waves, or at least tsunami-like waves, but we are just beginning to understand their physics and frequency. Classic tsunami waves are long waves, with wavelengths much greater than the water depth. That holds for earthquake-generated tsunami waves. However, landslides and meteorite impacts generate tsunami waves that are shorter, which has a profound effect on tsunami evolution but makes them no less dangerous.<br>Fortunately, tsunamis do not occur frequently enough in any given region to make meaningful predictions of future tsunami hazard based only on recorded history. The geologic record has to be interrogated. The inversion of meaningful, quantitative data from the geologic record is the main goal of my research. However, there are problems with the geologic record. The most important is that we often have trouble identifying tsunami deposits. Second, it is very often difficult to separate the tsunami record from the storm record in regions where storms and tsunamis are competing agents of coastal change. Other problems concern the completeness of the deposits; moreover, the sedimentary environment that existed before the tsunami hit was most likely eroded and is no longer part of the record, which makes inversion especially tricky. In my research, I assume that the tsunami deposit has been identified, but it is perhaps not complete, and what we know about the pre-event conditions is limited.<br>My talk will cover how the geologic record is used to invert quantitative information about the causative process. We are going to look at grain sizes from sand to boulders, and what the transport of these very different grain sizes can teach us about tsunamis and their impacts along the respective coastal areas. The models employed to invert flow characteristics from deposits are based on Monte Carlo simulations to overcome the issue of not knowing the pre-tsunami conditions with great confidence. If time permits, we will also see how sea-level change affects tsunami impact at the coast.  +
Earth’s surface is the living skin of our planet – it connects physical, chemical, & biological systems. Over geological time, this surface evolves with rivers fragmenting the landscape into an environmentally diverse range of habitats. These rivers not only carve canyons & form valleys, but also serve as the main conveyors of sediment & nutrients from mountains to continental plains & oceans. Here we hypothesise that it is not just geodynamics or climate, but their interaction, which, by regulating topography and sedimentary flows, determines the long-term evolution of biodiversity. As such, we propose that surface processes are a prime limiting factor of the diversification of Life on Earth, before any form of intrinsic biotic process. To test this hypothesis, we use reconstructions of ancient climates & plate tectonics to simulate the evolution of landscape & sedimentary history over the entire Phanerozoic era, a period of 540 million years. We then compare these results with reconstructions of marine & continental biodiversity over geological times. Our findings suggest that biodiversity is strongly influenced by landscape dynamics, which at any given moment determine the carrying capacity of continental & oceanic domains, i.e., the maximum number of different species they can support at any given time. In the oceans, diversity closely correlates with the sedimentary flow from the continents, which provides the nutrients necessary for primary production. Episodes of mass extinction in the oceans have occurred shortly after significant decreases in sedimentary flow, suggesting that a nutrient deficit destabilizes biodiversity & makes it particularly vulnerable to catastrophic events. On the continents, it took the gradual coverage of the surface by sedimentary basins for plants to develop & diversify, thanks to the development of more elaborate root systems. This slow expansion of terrestrial flora was further stimulated during tectonic episodes.  +
Ecological Network Analysis (ENA) enables quantitative study of ecosystem models by formulating system-wide organizational properties, such as how much nutrient cycling occurs within the system, or how essential a particular component is to the entire ecosystem function. EcoNet is a free online software package for modeling, simulation and analysis of ecosystem network models, and compartmental flow-storage type models in general. It combines dynamic simulation with Ecological Network Analysis. EcoNet does not require installation, and runs on any platform equipped with a standard browser. While it is designed to be easy to use, it does contain interesting features such as discrete and continuous stochastic solution methods.  +
Ecology is largely considered to have its foundations in physics, and indeed physics frames many of the constraints on ecosystem dynamics. Physics has its limitations, however, especially when dealing with strongly heterogeneous systems and with the absence of entities. Networks are convenient tools for dealing with heterogeneity and have a long history in ecology; however, most network research is dedicated to uncovering the mechanisms that give rise to network types. Causality in complex heterogeneous systems deals more with configurations of processes than it does with objects moving according to laws. Phenomenological observation of ecosystem networks reveals regularities that the laws of physics are unequipped to determine. The ecosystem is not a machine, but rather a transaction between contingent organization and entropic disorder.  +
Economic losses and casualties due to riverine flooding have increased in past decades and are likely to increase further due to global change. To plan effective mitigation and adaptation measures, and since floods often affect large, spatially correlated areas, several global flood models (GFMs) have been developed. Yet they are based on either hydrologic or hydrodynamic model codes. This may lower the accuracy of inundation estimates: large-scale hydrologic models often lack advanced routing schemes, reducing the timeliness of simulated discharge, while hydrodynamic models depend on observed discharge or synthesized flood waves, hampering the representation of intra-domain processes.<br>To overcome this, GLOFRIM was developed. Currently, it allows coupling one global hydrologic model, which produces discharge and runoff estimates, with two hydrodynamic models, which perform the routing of surface water. By employing the Basic Model Interface (BMI) concept, both online and spatially explicit coupling of the models is supported. This way, the coupled models themselves remain unaffected, facilitating the separate development, storage, and updating of the models and their schematizations. Additionally, the framework is developed with easy accessibility and extensibility in mind, which allows other models to be added without extensive re-structuring.<br>In this presentation, the main underlying concepts of GLOFRIM as well as its workflow will be outlined, and first results showing the benefit of model coupling will be discussed. In addition, current limitations and the need for future improvements will be pointed out. Last, ongoing developments in code, applications, and integration with other research fields will be presented and discussed.  +
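The BMI-based exchange at the heart of this design can be sketched as follows; the model objects and variable names are illustrative stand-ins, not GLOFRIM's actual classes, and a real coupling also regrids between the models' meshes.
<pre>
# Conceptual BMI coupling loop: hydrology feeds the hydrodynamic
# model's lateral inflow each time step.
hydrology.initialize("hydrology.cfg")
hydraulics.initialize("hydraulics.cfg")

while hydrology.get_current_time() < hydrology.get_end_time():
    hydrology.update()
    runoff = hydrology.get_value("surface_runoff")   # placeholder name
    hydraulics.set_value("lateral_inflow", runoff)   # placeholder name
    hydraulics.update()

hydrology.finalize()
hydraulics.finalize()
</pre>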
Ecosystems are in transition globally with critical societal consequences. Global warming, growing climatic extremes, land degradation, human-introduced herbivores, and climate-related disturbances (e.g., wildfires) drive rapid changes in ecosystem productivity and structure, with complex feedbacks in watershed hydrology, geomorphology, and biogeochemistry. There is a need to develop models that can represent ecosystem changes by incorporating the role of individual plant patches. We developed ecohydrologic components in Landlab that can be coupled to create models to simulate local soil moisture dynamics and plant dynamics with spatially-explicit cellular automaton plant establishment, mortality, fires, and grazing. In this talk, I will present a model developed to explore the interplay between ecosystem state, change in climate, resultant grass connectivity, fire frequency, and topography. A transition from a cool-wet climate to a warm-dry climate leads to shrub expansion due to drought-induced loss of grass connectivity. Shrubs dominate the ecosystem if dry conditions persist longer. The transition back to a tree or grass-dominated ecosystem from a shrub-dominated ecosystem can only happen when climate shifts from dry to wet. The importance of the length of dry or wet spells on ecosystem structure is highlighted. Aspect plays a critical role in providing topographical refugia for trees during dry periods and influences the rate of ecosystem transitions during climate change.  +
Ecosystems present spatial patterns controlled by climate, topography, soils, plant interactions, and disturbances. Geomorphic transport processes mediated by the state of the ecosystem leave biotic imprints on erosion rates and topography. This talk will address the following questions at the watershed scale: What are emergent properties of biotic landscapes, and how do they form? How do biotic landscapes respond to perturbations in space and time? First, formation of patterns and ecologic rates of change to perturbations in semiarid ecosystems will be investigated using Landlab. Second, we will examine eco-geosphere interactions and outcomes using a landscape evolution model. The role of solar radiation on ecogeomorphic forms, and watershed ecogeomorphic response to climate change will be elaborated. Finally, reflecting on the findings of previous research, some future directions in numerical modeling for linking ecosphere and geosphere will be discussed.  +
Environmental management decisions increasingly rely on quantitative integrated ecological models to forecast potential outcomes of management actions. These models are becoming increasingly complex through the integration of processes from multiple disciplines (e.g., linking physical process, engineering and ecological models). These integrated modeling suites are viewed by many decision makers as unnecessarily complex black boxes, which can lead to mistrust, misinterpretation and/or misapplication of model results. Numerical models have historically been developed without decision makers and stakeholders involved in model development, which further complicates communication as diverse project teams have differing levels of understanding of models and their uses. For example, it can be difficult to explain how hydrodynamic model output was aggregated at ecologically relevant scales to someone who was not exposed to that modeling decision. The mistrust of models and associated outputs can lead to poor decision-making, increase the risk of ineffective decisions, and can lead to litigation. Improved integrated ecological model development practices are needed to increase transparency and to include stakeholders and decision makers throughout the entire modeling process, from conceptualization through application. This clinic describes a suite of techniques, best practices, and tools for rapidly developing applied integrated ecological models in conjunction with technical stakeholder audiences and agency practitioners. First, a workshop approach for applied ecosystem modeling problems is described that cultivates a foundational understanding of integrated ecological models through hands-on, interactive model development. In this workshop environment, interdisciplinary and interagency working groups co-develop models in real-time, which demystifies technical issues and educates participants on the modeling process. Second, a Toolkit for interActive Modeling (TAM) is presented as a simple platform for rapidly developing index-based ecological models, which we have found useful for developing a strong modeling foundation for large, multidisciplinary teams involved in environmental decision making. Third, the EcoRest R package is described, which provides a library of functions for computing habitat suitability and decision support via cost-effectiveness and incremental cost analysis. Based on 10 workshops over the last 8 years, these techniques facilitated rapid, transparent development and application of integrated ecological models, informed non-technical stakeholders of the complexity facing decision-makers, created a sense of model ownership by participants, built trust among partners, and ultimately increased “buy-in” of eventual management decisions.  +
Established in 2005, GEO (http://www.earthobservations.org/) is a voluntary partnership of governments and organizations that envisions “a future wherein decisions and actions for the benefit of humankind are informed by coordinated, comprehensive and sustained Earth observations and information.” GEO Member governments include 96 nations and the European Commission, and 87 Participating Organizations comprised of international bodies with a mandate in Earth observations. Together, the GEO community is creating a Global Earth Observation System of Systems (GEOSS) that will link Earth observation resources world-wide across multiple Societal Benefit Areas - agriculture, biodiversity, climate, disasters, ecosystems, energy, health, water and weather - and make those resources available for better informed decision-making. Through the GEOSS Common Infrastructure (GCI), GEOSS resources, including Earth observation data (satellite, airborne, in situ, models), information services, standards and best practices, can be searched, discovered and accessed by scientists, policy leaders, decision makers, and those who develop and provide information services across the entire spectrum of users. The presentation will cover the GCI overall architecture and some possible future developments.  +
Exchanges of sediment between marshes and estuaries affect coastal geomorphology, wetland stability and habitat, but can be difficult to predict due to the many processes that influence dynamics in these systems. This study uses a modeling approach to analyze how spatial variability in marsh-edge erosion, vegetation, and hydrodynamic conditions affects sediment fluxes between marshes and estuaries in Barnegat Bay, New Jersey. Specifically, the three-dimensional Coupled Ocean-Atmosphere-Wave-Sediment Transport (COAWST) numerical model was used. Model results showed that marsh-estuarine sediment fluxes varied spatially due to changes in wave thrust, currents, and sediment availability.  +
Exploratory models that simulate landscape change incorporate only the most essential processes that are hypothesized to control a behavior of interest. These “rule-based” models have been used successfully to examine behaviors in natural landscapes over large spatial (many kms) and temporal scales (decades to millennia). In many geomorphic systems, the dynamics of developed landscapes differ significantly from natural landscapes. For example, humans can alter the physical landscape through the introduction of hard infrastructure and removal of vegetation. Humans can also modify the internal and external forces that naturally change landscapes, including flows of water, wind, and sediment as well as climatic factors. As with natural processes, in exploratory models human behavior must be parameterized. However, the level of detail to which human behavior can be reduced while still accurately reproducing feedbacks across the coupled human-natural landscape is a complex, user-based decision. In this clinic, we will work in small groups and through a Jupyter Notebook to parameterize a new human behavior within a modular coastal barrier evolution model (Barrier3D, within the CASCADE modeling framework). The clinic will incorporate discussions and prompts about how to broadly identify important model “ingredients” and reduce model complexity, and will therefore be generalizable to other geomorphic landscapes.  +
Fill-Spill-Merge (FSM) is an algorithm that distributes runoff on a landscape to fill or partially fill depressions. When a depression fills, excess water can overflow into neighbouring depressions or the ocean. In this clinic, we will use FSM to assess changes in a landscape’s hydrology when depressions in a DEM are partially or fully filled with water. We will discuss why it may be important to consider depressions more closely, rather than simply removing them. I will describe the design of the FSM algorithm, and then we will use FSM on a DEM to look at how landscape hydrology changes under different hydrologic conditions. This clinic may be helpful to those interested in topics such as landscape hydrology, landscape evolution, flow routing, hydrologic connectivity, and lake water storage.  +
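For orientation, the simpler Priority-Flood idea that FSM builds on can be written compactly; the sketch below raises cells to their spill elevation, whereas FSM instead records the depression hierarchy so water can fill, spill, and merge without flattening the DEM.
<pre>
# Simplified Priority-Flood depression filling (not FSM itself).
import heapq
import numpy as np

def priority_flood_fill(dem):
    nrows, ncols = dem.shape
    filled = dem.astype(float).copy()
    closed = np.zeros(dem.shape, dtype=bool)
    heap = []
    # Seed with edge cells: water can always exit at the DEM boundary.
    for r in range(nrows):
        for c in range(ncols):
            if r in (0, nrows - 1) or c in (0, ncols - 1):
                heapq.heappush(heap, (filled[r, c], r, c))
                closed[r, c] = True
    # Grow inward from the lowest known cell, raising pits as we go.
    while heap:
        z, r, c = heapq.heappop(heap)
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < nrows and 0 <= cc < ncols and not closed[rr, cc]:
                closed[rr, cc] = True
                filled[rr, cc] = max(filled[rr, cc], z)
                heapq.heappush(heap, (filled[rr, cc], rr, cc))
    return filled
</pre>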
Fire temporarily alters soil and vegetation properties, driving increases in runoff and erosion that can dramatically increase the likelihood of debris flows. In the immediate aftermath of fire, debris flows most often initiate when surface water runoff rapidly erodes sediment on steep slopes. Due to the complex interactions between runoff generation, sediment transport, and post-fire debris-flow initiation and growth, models that couple these processes can provide valuable insights into the ways in which topography, burn severity, and post-fire recovery influence debris-flow activity. Here, we describe such a model as well as attempts to parameterize temporal changes in model parameters throughout the post-fire recovery process. Simulations of watershed-scale response to individual rainstorms in several southern California burned areas suggest substantial reductions in debris-flow likelihood and volume within the first 1-2 years following fire. Results highlight the importance of considering local rainfall characteristics and sediment supply when using process-based numerical models to assess debris-flow potential. More generally, results provide a methodology for estimating the intensity and duration of rainfall associated with the initiation of runoff-generated debris flows as well as insights into the persistence of debris-flow hazards following fire.  +
Flood hazard in rivers can evolve from changes in the frequency and intensity of flood-flows (hydrologic effects) and in the channel capacity to carry flood-flows (morphologic effects). However, river morphology is complex and often neglected in flood planning. Here, we separate the impacts of morphology vs. hydrology on flood risk for 48 river gauges in Northwestern Washington State. We find that morphologic vs. hydrologic forcings are comparable but not regionally consistent. Prominent morphologic effects on flood-risk are forced by extreme natural events and anthropogenic disturbances. Based on morphologic changes, we identify five categories of river behavior relevant for flood-risk management.  +
Flood modelling at global scales represents a revolution in hydraulic science and has the potential to transform decision-making and risk management in a wide variety of fields. Such modelling draws on a rich heritage of algorithm and data set development in hydraulic modelling over the last 20 years, and is now beginning to yield new insights into current and future flood risk. This paper reviews this progress and outlines recent efforts to develop a 30m resolution true hydrodynamic model of the entire conterminous US. The model is built using an automated framework which uses the US National Elevation Dataset, the HydroSHEDS river network, regionalised frequency analysis to determine extreme flow and rainfall boundary conditions, and the USACE National Levee Dataset to characterize flood defences. Comparison against FEMA and USGS flood maps shows the continental model to have skill approaching that of bespoke models built with local data. The paper describes the development and testing of the model, and its use to estimate current and future flood risk in the US using high resolution population maps and development projections.  +
Flooding is one of the costliest natural disasters and recent events, including several hurricanes as well as flash floods, have been particularly devastating. In the US alone, the last few years have been record-breaking in terms of flood disasters and triggered many reactions in public opinion. Governments are now reviewing the available information to better mitigate the risks from flooding.<br>Typically, in the US, flood hazard mapping is done by federal agencies (USACE, FEMA and USGS), with, traditionally, little room or need for research model development in flood hazard applications. Now, with the advent of the National Water Model, the status quo of flood hazard prediction in the US may be changing; however, inundation extent and floodplain depths in the National Water Model are still under early-stage development.<br>This clinic provides a beginner-level introduction to the latest capabilities in large-scale 2-D modeling using the LISFLOOD-FP model developed by the University of Bristol, with a nearly 20-year code history. This model has a long history in research applications, and the algorithms behind it have also made their way into many existing industry model codes. The session will give participants insights into 2-D flood inundation modeling with LISFLOOD-FP and also a look at more sophisticated sub-grid channel implementations for large-scale application. More specifically, we will look at the data sets needed by the model and then run a simulation of the annual flooding on the Inner Niger Delta in Mali. The clinic will also give participants the opportunity to look at some high-resolution LiDAR-based model results.  +
Floodplain construction involves the interplay between channel belt sedimentation and avulsion, overbank deposition of fines, and sediment reworking by channel migration. There has been considerable progress in numerical modelling of these processes over the past few years, for example, by using high resolution flow and sediment transport models to simulate river morphodynamics, albeit over relatively small time and space scales. Such spatially-distributed hydrodynamic models are also regularly used to simulate floodplain inundation and overbank sedimentation during individual floods. However, most existing models of long-term floodplain construction and alluvial architecture do not account for flood hydraulics explicitly. Instead, floodplain sedimentation is typically modelled as an exponential function of distance from the river, and avulsion thresholds are defined using topographic indices (e.g., lateral:downstream slope ratios or metrics of channel belt super-elevation). This presentation aims to provide an overview of these issues, and present results from a hydrodynamically-driven model of long-term floodplain evolution. This model combines a simple network-based model of channel migration with a 2D grid-based model of flood hydrodynamics and overbank sedimentation. The latter involves a finite volume solution of the shallow water equations and an advection-diffusion model for suspended sediment transport. Simulation results are compared with observations from several large lowland floodplains, and the model is used to explore hydrodynamic controls on long-term floodplain evolution and alluvial ridge construction.  +
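The exponential overbank rule mentioned above amounts to a one-line model: deposition decays with distance from the channel as D(x) = D0·exp(−x/L). A small sketch with placeholder values:
<pre>
# Exponential overbank deposition versus distance from the channel.
import numpy as np

D0 = 5.0     # near-channel deposition rate (mm/yr), placeholder
L = 200.0    # e-folding length scale (m), placeholder
x = np.linspace(0.0, 2000.0, 201)   # distance from channel (m)
deposition = D0 * np.exp(-x / L)    # deposition rate across the floodplain
</pre>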
The flow routing map is the cornerstone of spatially distributed hydrologic models. In this clinic we will introduce HexWatershed, a scale-free, mesh-independent flow direction model. It supports DOE’s Energy Exascale Earth System Model (E3SM) by generating hydrologic parameters and river network representations on both structured and unstructured meshes. In this presentation, we will overview the capabilities of HexWatershed with an emphasis on river network representation and flow direction modeling. We will also provide participants with the tools to begin their own research with hydrologic model workflows. Through hands-on tutorials and demonstrations, participants will gain insights into the relationship between meshes and flow direction, and how HexWatershed handles river networks on various meshes. We will also demonstrate how to use the HexWatershed model outputs in the large-scale hydrologic model, Model for Scale Adaptive River Transport (MOSART). Participants will be provided with additional resources that can be used to extend the tutorial problems and gain additional familiarity with the tools and workflows introduced. Participants are welcome to bring and use their own computers capable of accessing the internet and running a web browser. Tutorials will involve simple scripting operations in the Python language. The conda utility will be used to install libraries. Both QGIS and VisIt packages will be used for visualization.  +
Fluvial incision since late Miocene time (5 Ma) has shaped the transition between the Central Rocky Mountains and adjacent High Plains. Despite a clear contrast in erodibility between the mountains and plains, erodibility has not been carefully accounted for in previous attempts to model the geomorphic evolution of this region. The focus of this work to date has been to constrain erodibility values with a simple toy model, and to reconstruct the paleosurface of the Miocene Ogallala Formation prior to its dissection beginning at 5 Ma. This surface reconstruction will be used as an initial condition in subsequent modeling.  +
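A hedged sketch of such a toy model: detachment-limited stream-power incision, E = K·A^m·S^n, with a downstream jump in erodibility K standing in for the mountain-to-plains rock-strength contrast. All values are illustrative.
<pre>
# 1D river-profile toy model with spatially variable erodibility.
import numpy as np

n_nodes, dx, dt = 200, 500.0, 100.0          # nodes, m, yr
z = np.linspace(2000.0, 0.0, n_nodes)        # initial profile (m)
A = (np.arange(1, n_nodes + 1) * dx) ** 1.7  # Hack-style drainage area
K = np.where(np.arange(n_nodes) < 100,       # resistant mountains...
             1e-6, 5e-6)                     # ...then erodible plains
m, n = 0.5, 1.0

for _ in range(5000):
    S = np.maximum(-np.diff(z, append=z[-1]) / dx, 0.0)  # downstream slope
    z[:-1] -= K[:-1] * A[:-1] ** m * S[:-1] ** n * dt    # last node = base level
</pre>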
Food security and poverty in Bangladesh are very dependent on natural resources, which fluctuate with a changing environment. The ecosystem services supporting the rural population are affected by several factors including climate change, upstream river flow modifications, commercial fish catches in the Bay of Bengal, and governance interventions. The ESPA Deltas project aims to holistically describe the interaction between the interlinked bio-physical environment and the livelihoods of the rural poorest in coastal Bangladesh, who are highly dependent on natural resources and live generally on less than US$1.50 per day. Here we describe a new integrated model that allows a long-term analysis of the possible changes in this system by linking projected changes in physical processes (e.g. river flows, nutrients), with productivity (e.g. fish, rice), social processes (e.g. access, property rights, migration) and governance (e.g. fisheries, agriculture, water and land use management). Bayesian Networks and Bayesian Processes allow multidisciplinary integration and exploration of specific scenarios. This integrated approach is designed to provide Bangladeshi policy makers with science-based evidence of possible development trajectories. This includes the likely robustness of different governance options on natural resource conservation and poverty levels. Early results highlight the far reaching implications of sustainable resource use and international cooperation to secure livelihoods and ensure a sustainable environment in coastal Bangladesh.  +
From G.K. Gilbert's "The Convexity of Hilltops" to highly-optimized numerical implementations of drainage basin evolution, models of landscape evolution have been used to develop insight into the development of specific field areas, create testable predictions of landform development, demonstrate the consequences of our current theories for geomorphic processes, and spark imagination through hypothetical scenarios. In this talk, I discuss how the types questions tackled with landscape evolution models have changed as observational data (e.g., high-resolution topography) and computational technology (e.g., accessible high performance computing) have become available. I draw on a natural experiment in postglacial drainage basin incision and a synthetic experiment in a simple tectonic setting to demonstrate how landscape evolution models can be used to identify how much information the topography or other observable quantities provide in inferring process representation and tectonic history. In the natural example, comparison of multiple calibrated models provides insight into which process representations improve our ability to capture the geomorphic history of a site. Projections into the future characterize where in the landscape uncertainty in the model structure dominates over other sources of uncertainty. In the synthetic case, I explore the ability of a numerical inversion to recover geomorphic-process relevant (e.g., detachment vs. transport limited fluvial incision) and tectonically relevant (e.g., date of fault motion onset) system parameters.  +
GCAM is an open-source, global, market equilibrium model that represents the linkages between energy, water, land, climate, and economic systems. One of GCAM's many outputs is projected land cover/use by subregion. Subregional projections provide context and can be used to understand regional land dynamics; however, Earth System Models (ESMs) generally require gridded representations of land at finer scales. Demeter, a land use and land cover disaggregation model, was created to provide this service. Demeter directly ingests land projections from GCAM and creates gridded products that match the desired resolution and land class requirements of the user.  +
GPUs can make models, simulations, machine learning, and data analysis much faster, but how? And when? In this clinic we'll discuss whether you should use a GPU for your work, whether you should buy one, which one to buy, and how to use one effectively. We'll also get hands-on and speed up a landscape evolution model together. This clinic should be of interest both to folks who would like to speed up their code with minimal effort as well as folks who are interested in the nitty gritty of pushing computational boundaries.  +
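As a taste of the hands-on portion, here is a hedged sketch of explicit hillslope diffusion on the GPU with CuPy, whose array API mirrors NumPy; it assumes a CUDA-capable GPU and uses periodic boundaries for brevity.
<pre>
# Explicit diffusion stepping on the GPU via CuPy.
import cupy as cp

z = cp.random.random((2048, 2048))   # synthetic topography on the device
kappa, dx, dt = 0.01, 10.0, 1000.0   # diffusivity (m2/yr), m, yr

for _ in range(100):
    lap = (cp.roll(z, 1, 0) + cp.roll(z, -1, 0) +
           cp.roll(z, 1, 1) + cp.roll(z, -1, 1) - 4.0 * z) / dx**2
    z += kappa * lap * dt            # forward-Euler update, all on the GPU

z_host = cp.asnumpy(z)               # copy the result back to the CPU
</pre>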
GeoClaw (http://www.geoclaw.org) is an open-source software package for solving two-dimensional depth-averaged equations over general topography using high-resolution finite volume methods and adaptive mesh refinement. Wetting-and-drying algorithms allow modeling inundation and overland flows. The primary applications where GeoClaw has been used are tsunami modeling and storm surge, although it has also been applied to dam break and other overland flooding problems. The first part of this clinic will present an overview of the capabilities of GeoClaw, including a number of new features that have been added in the past few years. These include:<ul><li>Depth-averaged Boussinesq-type dispersive equations that better model short-wavelength tsunamis, such as those generated by landslides or asteroid impacts. Solving these equations requires implicit solvers (due to the higher-order derivatives in the equations). This now works with the adaptive mesh refinement (AMR) algorithms in GeoClaw, which are critical for problems that require high-resolution coastal modeling while also modeling trans-oceanic propagation, for example.</li><li>Better capabilities for extracting output at frequent times on a fixed spatial grid by interpolation from the AMR grids during a computation. The resulting output can then be used for making high-resolution animations or for post-processing (e.g., the velocity field at frequent times can be used for particle tracking, as needed when tracking tsunami debris).</li><li>Ways to incorporate river flows or tidal currents into a GeoClaw simulation.</li><li>Better coupling with the D-Claw code for modeling debris flows, landslides, lahars, and landslide-generated tsunamis. (D-Claw is primarily developed by USGS researchers Dave George and Katy Barnhart.)</li></ul>The second part of the clinic will be a hands-on introduction to installing GeoClaw and running some of the examples included in the distribution, with tips on how best to get started on a new project. GeoClaw is distributed as part of Clawpack (http://www.clawpack.org), and is available via the CSDMS model repository. For those who wish to install the software in advance on laptops, please see http://www.clawpack.org/installing.html. We will also go through this briefly and help with any issues that arise on your laptop (provided it is a Mac or Linux machine; we do not support Windows). You may need to install some prerequisites in advance, such as Xcode on a Mac (since we require "make" and other command line tools), a Fortran compiler such as gfortran, and basic scientific Python tools such as NumPy and Matplotlib. See https://www.clawpack.org/prereqs.html.  +
GeoClaw (http://www.geoclaw.org) is an open-source software package for solving two-dimensional depth-averaged equations over general topography using high-resolution finite volume methods and adaptive mesh refinement. Wetting-and-drying algorithms allow modeling inundation or overland flows. The primary applications where GeoClaw has been used are tsunami modeling and storm surge, although it has also been applied to dam break problems and other overland floods. This tutorial will give an introduction to setting up a tsunami modeling problem in GeoClaw, including:<ul><li>Overview of capabilities,</li><li>Installing the software,</li><li>Using Python tools provided in GeoClaw to acquire and work with topography DEMs and other datasets,</li><li>Setting run-time parameters, including specifying adaptive refinement regions,</li><li>The VisClaw plotting software to visualize results using Python tools or display on Google Earth.</li></ul>GeoClaw is distributed as part of Clawpack (http://www.clawpack.org). If you wish to install the software in advance on your laptop, please see http://www.clawpack.org/installing.html. Tutorials can be found here: https://github.com/clawpack/geoclaw_tutorial_csdms2019  +
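For example, the topography step typically looks something like the sketch below (the file name and topo_type are placeholders for whatever DEM you bring):
<pre>
# Reading and inspecting a DEM with GeoClaw's Python topotools.
from clawpack.geoclaw import topotools

topo = topotools.Topography()
topo.read("etopo_subset.tt3", topo_type=3)   # topotype-3 ASCII grid
print(topo.Z.shape, topo.x.min(), topo.x.max())
topo.plot()                                   # quick matplotlib preview
</pre>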
GeoClaw is an open source Fortran/Python package based on Clawpack (conservation laws package), which implements high-resolution finite volume methods for solving wave propagation problems with adaptive mesh refinement. GeoClaw was originally developed for tsunami modeling and has been validated via benchmarking workshops of the National Tsunami Hazard Mitigation Program for use in hazard assessment studies funded through this program. Current projects include developing new tsunami inundation maps for the State of Washington and the development of new probabilistic tsunami hazard assessment (PTHA) methodologies. The GeoClaw code has also been extended to the study of storm surge and forms the basis for D-Claw, a debris flow and landslide code being developed at the USGS and recently used to model the 2014 Oso, Washington landslide, for example.  +
Getting usable information out of climate and weather models can be a daunting task. The direct output from the models typically has unacceptable biases on local scales, and as a result a large number of methods have been developed to bias correct or downscale the climate model output. This clinic will describe the range of methods available as well as provide background on the pros and cons of different approaches. This will cover a variety of approaches from relatively simple methods that just rescale the original output, to more sophisticated statistical methods that account for broader weather patterns, to high-resolution atmospheric models. We will focus on methods for which output or code are readily available for end users, and discuss the input data required by different methods. We will follow this up with a practical session in which participants will be supplied a test dataset and code with which to perform their own downscaling. Participants interested in applying these methods to their own region of interest are encouraged to contact the instructor ahead of time to determine what inputs would be required.  +
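At the simple end of that spectrum, an empirical quantile-mapping correction can be written in a few lines; this sketch is generic, not the clinic's supplied code.
<pre>
# Empirical quantile mapping: move model values onto the observed
# distribution by matching quantiles.
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    # Quantile of each future value within the historical model data...
    q = np.searchsorted(np.sort(model_hist), model_future) / len(model_hist)
    q = np.clip(q, 0.0, 1.0)
    # ...evaluated on the observed distribution.
    return np.quantile(obs_hist, q)
</pre>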
Global models of Earth’s climate have expanded beyond their geophysical heritage to include terrestrial ecosystems, biogeochemical cycles, vegetation dynamics, and anthropogenic uses of the biosphere. Ecological forcings and feedbacks are now recognized as important for climate change simulation, and the models are becoming models of the entire Earth system. This talk introduces Earth system models, how they are used to understand the connections between climate and ecology, and how they provide insight into environmental stewardship for a healthy and sustainable planet. Two prominent examples discussed in the talk are anthropogenic land use and land-cover change and the global carbon cycle. However, there is considerable uncertainty in how to represent ecological processes at the large spatial scales and long temporal scales of Earth system models. Further scientific advances are straining under the ever-growing burden of multidisciplinary breadth, countered by disciplinary chauvinism and the extensive conceptual gap between observationalists developing process knowledge at specific sites and global-scale modelers. The theoretical basis for Earth system models, their development and verification, and experimentation with these models require a new generation of scientists, adept at bridging the disparate fields of science and using a variety of research methodologies including theory, numerical modeling, observations, and data analysis. The science requires a firm grasp of models, their theoretical foundations, their strengths and weaknesses, and how to appropriately use them to test hypotheses of the atmosphere-biosphere system. It requires a reinvention of how we learn about and study nature.  +
Google Earth Engine is a powerful geographic information system (GIS) that brings programmatic access and massively parallel computing to petabytes of publicly-available Earth observation data using Google’s cloud infrastructure. In this live-coding clinic, we’ll introduce some of the foundational concepts of workflows in Earth Engine and lay the groundwork for future self-teaching. Using the JavaScript API, we will practice: raster subsetting, raster reducing in time and space, custom asset (raster and vector) uploads, visualization, mapping functions over collections of rasters or geometries, and basic exporting of derived products.  +
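The flavor of those operations, written here with the Python API (which mirrors the JavaScript API used in the clinic); the dataset ID, dates, and coordinates are arbitrary examples.
<pre>
# Filter a collection, reduce it in time, then reduce a band in space.
import ee

ee.Initialize()

point = ee.Geometry.Point([-105.27, 40.01])
collection = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")
              .filterDate("2022-06-01", "2022-09-01")
              .filterBounds(point))
composite = collection.median()          # reduce in time

stats = composite.select("SR_B4").reduceRegion(
    reducer=ee.Reducer.mean(),           # reduce in space
    geometry=point.buffer(5000),
    scale=30)
print(stats.getInfo())
</pre>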
Google Earth Engine (GEE) is a multi-petabyte catalog of satellite imagery and geospatial datasets with planetary-scale analysis capabilities. Now imagine that all you need to work with it is a browser and an internet connection. This hands-on workshop will introduce and showcase cloud-native geospatial processing. We will explore the platform’s built-in catalog of 100+ petabytes of geospatial datasets and build some analysis workflows. Additional topics will include uploading and ingesting your own data to Google Earth Engine, time series analysis essential for change monitoring, and data and code principles for effective collaboration. The hope is to introduce a cloud-native geospatial analysis platform and to rethink how we produce and consume data. If you want to follow along, bring your laptop and register for an Earth Engine account here: https://signup.earthengine.google.com P.S. I recommend using a personal account :) you get to keep it  +
Granular materials are ubiquitous in the environment, in industry and in everyday life, and yet are poorly understood. Modelling the behavior of a granular medium is critical to understanding problems ranging from hazardous landslides and avalanches in the Geosciences, to the design of industrial equipment. Typical granular systems contain millions of particles, but the underlying equations governing that collective motion are as yet unknown. The search for a theory of granular matter is a fundamental problem in physics and engineering and of immense practical importance for mitigating the risk of geohazards. Direct simulation of granular systems using the Discrete Element Method is a powerful tool for developing theories and modelling granular systems. I will describe the simulation technique and show its application to a diverse range of flows.  +
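The core ingredient of such simulations can be sketched in a few lines: a linear spring-dashpot normal contact force between two spheres, which a production DEM code evaluates for millions of particle pairs per step. The parameter values here are arbitrary.
<pre>
# Linear spring-dashpot normal contact force between two particles.
import numpy as np

def normal_contact_force(x1, x2, v1, v2, r1, r2, k=1e4, c=5.0):
    """Force on particle 1 due to contact with particle 2."""
    d = x1 - x2
    dist = np.linalg.norm(d)
    overlap = (r1 + r2) - dist
    if overlap <= 0.0:
        return np.zeros_like(d)        # particles not touching
    n = d / dist                       # unit normal toward particle 1
    vn = np.dot(v1 - v2, n)            # relative normal velocity
    return (k * overlap - c * vn) * n  # elastic repulsion + damping
</pre>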