Property:CSDMS meeting abstract presentation

From CSDMS

This is a property of type Text.

Showing 137 pages using this property.
P
Subduction zones are ever-evolving over a wide range of spatio-temporal scales, and there is a range of interactions between deep mantle flow and surface tectonics that are relevant to processes from the megathrust cycle to the long-term evolution of the Earth. I review studies that seek to understand a number of the involved processes, with a focus on the role of sediments and rheology for subduction rates and deep mantle structure, as well as the thermo-mechanical state of the mantle wedge.  +
Sundaland, the name given to the emerged parts of the Sunda Shelf during low sea level, currently lies approximately 100 m beneath the Java Sea and the southwestern part of the South China Sea. The region is of particular interest in biogeography and biodiversity studies for its position at the junction between two major zoogeographic provinces that extend across the Equator and for its prevailing connection with mainland Southeast Asia. Using landscape evolution and connectivity analysis models, we will investigate how changes induced by drainage basin reorganisation and river captures have transformed the environment into fragmented habitats over the past million years. We will see that physiographic evolution exerts a strong control on the preferential connectivity pathways and triggers successive phases of expansion and contraction of the migratory corridors across the shelf; it is an important mechanism to consider in order to improve our understanding of species richness dynamics in the region.  +
Surface processes are constantly reworking the landscape of our planet with perhaps the most diverse and beautiful patterns of sediment displacement known to humanity. Capturing this diversity is important for advancing our knowledge of these systems, and for sustainable exploitation of natural resources by future generations. From a modeler's perspective, great diversity comes with great uncertainty. Although it is understandably very hard to quantify uncertainty about geological events that happened many years ago, we argue that modeling this uncertainty explicitly is crucial to improve our understanding of subsurface heterogeneity, as stratigraphy is a direct function of surface processes. In this modeling work (and code), we aim to build realistic stratigraphic models that are constrained to local data (e.g. from wells, or geophysics) and that are, at the same time, subject to surface processes reflected in flume records. Experiments have improved tremendously in recent years, and the amount of data that they generate is posing new challenges to the surface processes community, which increasingly asks the question "How do we make use of all this?" Traditional models based on differential equations and constitutive laws are not flexible enough to digest this information, nor were they created with this purpose. The community faces this limitation where the models cannot be conditioned on experiments, and even after exhaustive manual calibration of unobserved input parameters, these models often show poor predictive power. Our choice of inverse modeling and (geo)statistics (a.k.a. data science) was thus made knowing that these disciplines can provide the community with what we need: the ability to condition models of stratigraphy to measurements taken on a flume tank.  +
Surface processes are influenced by viscous coupling of the deforming lithosphere to asthenospheric flow, as well as magma that migrates upward from the upper asthenosphere. Over the past few decades, significant advances have been made in finite element numerical methods that enable modeling of lithospheric deformation, viscous coupling to asthenospheric flow, and melt generation in the upper asthenosphere. In this work, we present new developments based on the NSF Computational Infrastructure for Geodynamics finite element code ASPECT (Advanced Solver for Problems in Earth’s Convection) that allow users to easily investigate these processes in distinct tectonic and geographic locations. Users have the option to constrain their initial temperature and density conditions with laterally varying lithospheric thickness, crustal layer thicknesses, and shear wave seismic velocity models in the sublithospheric mantle. We present case studies from regions along the East African Rift System that demonstrate these capabilities.  +
Terrestrial cosmogenic nuclides (TCN) are commonly used to assess denudation rates in soil-mantled uplands. The estimation of an inferred denudation rate (Dinf) from TCN concentrations typically relies on the assumptions of steady denudation rates during TCN accumulation and negligible impact from soil chemical erosion on soil mineral abundances. However, in many landscapes, denudation rates are not steady, and the composition of soil is markedly affected by chemical erosion, adding complexity to the analysis of TCN concentrations. We introduce a landscape evolution model that computes transient changes in topography, soil thickness, soil mineralogy, and soil TCN concentrations. With this model, we explored TCN responses in transient landscapes by imposing idealized perturbations in tectonically (bedrock uplift rate) and climatically sensitive parameters (soil production efficiency, hillslope transport efficiency, and mineral dissolution rate) on synthetic, steady-state landscapes. The experiments on synthetic landscapes delivered important insights about TCN responses in transient landscapes. Results showed that responses of Dinf to tectonic perturbations differ from those to climatic perturbations, indicating that spatial and temporal trends in Dinf serve as indicators of perturbation type and magnitude. Also, if soil chemical erosion is accounted for, basin-averaged Dinf inferred from TCN in stream sediment closely tracks actual basin-averaged denudation rate, showing that Dinf is a reliable representation of actual denudation rate, even in many transient landscapes. In addition, we demonstrate how this model can be applied to a real landscape in the Oregon Coast Range and how model predictions can be compared to field measurements of cosmogenic nuclides and chemical depletion in sediments. Overall, landscape evolution models infused with cosmogenic nuclides can be used to scrutinize methodological assumptions, reveal potential real-world patterns in transient landscapes, and deepen the comprehension of field data.  
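A minimal sketch of the steady-state inversion that underlies Dinf may help fix ideas. Assuming steady erosion, the surface concentration follows N = P / (λ + ρD/Λ) (Lal, 1991), which inverts to D = (Λ/ρ)(P/N − λ). The parameter values below are generic 10Be numbers, not values from this study, and the sketch deliberately omits the chemical-erosion and transient effects the model above is built to capture:

```python
# Illustrative inversion of a cosmogenic nuclide concentration for an
# inferred denudation rate (Dinf) under the steady-erosion assumption.
# Values are generic 10Be numbers, not site-specific.

LAMBDA_10BE = 4.99e-7   # decay constant (1/yr), half-life ~1.39 Myr
ATTENUATION = 160.0     # spallation attenuation length (g/cm^2)
RHO = 2.7               # rock density (g/cm^3)

def inferred_denudation_rate(concentration, production_rate):
    """Return Dinf (cm/yr) from N (atoms/g) and P (atoms/g/yr)."""
    return (ATTENUATION / RHO) * (production_rate / concentration - LAMBDA_10BE)

# Example: N = 1e5 atoms/g at a site with P = 5 atoms/g/yr
d_inf = inferred_denudation_rate(1e5, 5.0)
print(f"Dinf = {d_inf * 1e4:.1f} m/Myr")   # convert cm/yr to m/Myr
```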
Thawing of permafrost potentially affects the global climate system through the mobilization of greenhouse gases, and poses a risk to human infrastructure in the Arctic. The response of ice-rich permafrost landscapes to a changing climate is particularly uncertain, and challenging to address with numerical models. A main reason for this is the rapidly changing surface topography resulting from melting of ground ice, which is referred to as thermokarst. It is expressed in characteristic landforms which alter the hydrology, the surface energy balance, and the redistribution of snow across entire landscapes. Polygonal patterned tundra, which is underlain by massive ice wedges, is a prototype of a sensitive permafrost system that is increasingly subjected to thermokarst activity throughout the Arctic. In this talk I will present a scalable modeling approach, based on the CryoGrid land surface model, to investigate the degradation of ice wedges. The numerical model takes into account lateral fluxes of heat, water, and snow between different topographic units of polygonal tundra and simulates topographic changes resulting from melting of excess ground ice (i.e., thermokarst) and from lateral erosion of sediment. We applied the model to investigate the influence of hydrological conditions on the development of different types of ice-wedge polygons in a study area in northern Siberia. We further used projections of future climatic conditions to constrain the evolution of ice-wedge polygons in a changing climate, and assessed the amount of organic matter which could thaw under different scenarios. In a related study for a study site in northern Alaska, we demonstrated that the model setup can be used to study the effect of infrastructure on the degradation of ice wedges. Altogether, our modeling approach can be seen as a blueprint to investigate complexly interrelated processes in ice-rich permafrost landscapes, and marks a step towards an improved representation of these landscapes in large-scale land surface models.
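To make the tile coupling concrete, the fragment below sketches conductive lateral heat exchange between two topographic units (say, polygon centre and rim). It is a minimal illustration with invented parameter values, not CryoGrid code; the real model also exchanges water and snow and resolves phase change:

```python
# Minimal two-tile lateral heat exchange sketch (not CryoGrid itself).
# q = k_eff * (T1 - T2) / d, with illustrative parameter values.

def lateral_heat_flux(t_centre, t_rim, k_eff=1.5, distance=5.0):
    """Flux (W/m^2) from polygon centre to rim; k_eff in W/m/K, distance in m."""
    return k_eff * (t_centre - t_rim) / distance

def exchange_step(t_centre, t_rim, dt, heat_capacity=2e6, thickness=1.0):
    """Advance both tile temperatures by dt (s); the energy one tile loses,
    the other gains (equal tile geometry assumed), so heat is conserved."""
    q = lateral_heat_flux(t_centre, t_rim)        # W/m^2
    dT = q * dt / (heat_capacity * thickness)     # K per tile
    return t_centre - dT, t_rim + dT

print(exchange_step(t_centre=-1.0, t_rim=-3.0, dt=3600.0))
```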
The ADCIRC finite element coastal ocean model is used in real time decision support services for coastal and riverine hydrodynamics, tropical cyclone winds, and ocean wave modelling for public sector agencies including NOAA, FEMA, the Coast Guard, and the US Army Corps of Engineers, among others. Recent developments in ADCIRC's real time automation system, the ADCIRC Surge Guidance System (ASGS), have now enabled real time modelling of active flood control scenarios (manipulation of pumps and flood gates) for decision support during riverine floods and tropical cyclone events. During these events, the results are presented to official decision makers with the Coastal Emergency Risks Assessment (CERA) web application, an intuitive and interactive tool that integrates model data with measured data to provide situational awareness across the area of responsibility. Case study events will be described, including official decisions that have been made with ADCIRC in North Carolina (Irene 2011), Louisiana (Mississippi River flooding in 2016), and during the 2017 and 2018 hurricane seasons for Hurricanes Harvey, Irma, Maria, Florence, and Michael.  +
To enable tighter coupling of model components than the Basic Model Interface (BMI) standard allows, we have developed the eXtended Model Interface (XMI), which extends the BMI functionality and enables coupling within the non-linear Picard iteration loop. The XMI subdivides the BMI update function into multiple functions, including prepare_timestep, do_timestep, finalize_timestep, prepare_solve, solve, and finalize_solve. This subdivision allows data from other model components to affect matrix coefficients during each MODFLOW non-linear Picard iteration. We have developed a hypothetical model application that simulates characteristics common to hydrologic conditions in a large part of the Netherlands. The application tightly couples MODFLOW and MetaSWAP using a shared control volume approach and XMI. MetaSWAP is a meta-model that simulates the unsaturated zone using a quasi-steady-state formulation based on Richards’ equation. The coupling procedure consists of the following steps. After every solution of the groundwater heads within the non-linear Picard iteration loop, MetaSWAP determines the unsaturated zone flux and primary storage coefficients while ensuring mass balance for the shared control volume. Both variables (groundwater recharge and storage coefficients) are then communicated to MODFLOW, and this sequence is repeated until the MODFLOW convergence criteria are met for a time step. The hypothetical model application demonstrates that MetaSWAP makes it possible to simulate the unsaturated zone in more detail than possible with the MODFLOW Unsaturated Zone Flow (UZF) Package and to simulate soil moisture-based groundwater irrigation.  +
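The call pattern that XMI enables can be sketched as follows. This is an illustrative pseudo-API: the subdivided XMI functions are those named in the abstract, but the component objects, the convergence reporting, and the variable names ("head", "recharge", update_unsaturated_zone) are placeholders rather than the actual MODFLOW or MetaSWAP bindings.

```python
# Sketch of XMI-style coupling inside the Picard loop (pseudo-API).

def update_coupled_timestep(mf, msw, max_iter=100):
    """One coupled MODFLOW-MetaSWAP time step."""
    mf.prepare_timestep()
    mf.prepare_solve()
    for _ in range(max_iter):
        converged = mf.solve()           # one Picard iteration for heads;
                                         # assume solve() reports convergence
        heads = mf.get_value("head")     # plain BMI-style getter
        # MetaSWAP converts heads to recharge fluxes and storage
        # coefficients for the shared control volumes (mass-conserving).
        recharge, storage = msw.update_unsaturated_zone(heads)
        mf.set_value("recharge", recharge)
        mf.set_value("storage_coefficient", storage)
        if converged:
            break
    mf.finalize_solve()
    mf.finalize_timestep()
```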
The CSDMS Integration Facility develops and maintains a suite of products and services with the goal of supporting research in the Earth and planetary surface processes community. This includes products such as Landlab, the Basic Model Interface, Data Components, the Model Repository, EKT Labs, and ESPIn. Examples of services include the Help Desk, Office Hours, RSEaaS, and the OpenEarthscape JupyterHub. One problem, though, is that if the community doesn't know about these products and services, then they don't get used—and, like the Old Gods in Neil Gaiman's "American Gods", they fade into obscurity. Let's break the cycle! Please join us for this clinic where we will present information about all of the products and services offered by CSDMS, and explain how they can help you accelerate your research. The clinic format will consist of a lecture (what are these products and services?), interactive exercises (how do these things work?), and listening (how can CSDMS provide better products and services?). Attendees will leave with knowledge of what CSDMS can do for them, which they can bring back to their home institutions and apply to their research and share with their colleagues.  +
The CSDMS Web Modeling Tool (WMT) is the web-based successor to the desktop Component Modeling Tool (CMT). WMT presents a drag-and-drop interface that allows users to build and run coupled surface dynamics models from a web browser on a desktop, laptop, or tablet computer.

With WMT, a user can:
* Design a coupled model from a list of available components
* Edit the parameters of the model components
* Save the coupled model to a server, where it can be accessed from any computer
* Set run parameters, including the computer/cluster on which to run the model
* Share saved modeling projects with others in the community
* Submit jobs to the high-performance computing system

Although WMT is web-based, the building and configuration of a model can be done offline. The user can then reconnect to save a model and submit it for a run.
In this clinic we present an overview of WMT, including an explanation of the user interface, a listing of the currently available models, and a discussion of how models can be run in operational mode or in reduced-input mode for teaching. We cap the clinic with a live demonstration of setting up, saving, and running a coupled model on the CSDMS supercomputer system.  +
The CUAHSI Water Data Center (WDC) is a community-governed, NSF-funded facility that enables data access and publication through a web services oriented architecture. The WDC maintains the largest catalog of time series water data in the world, which includes data sources that range from global to local coverage and data sets that describe climate, streams, and soil. This session will touch upon a number of functions of the WDC, including:
* How can I use WDC services to fulfill NSF data management requirements?
* What data are available through the WDC?
* How can I access data?
* How can I write custom software that accesses data published with the WDC?

Participants should anticipate this information to be presented through slides and should expect to leave with a comprehensive understanding of the research support services offered by the WDC.  +
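As a pointer toward the "custom software" question, one community option is the third-party ulmo package, which speaks the CUAHSI WaterOneFlow web services. The sketch below is assumption-laden: the service URL, site code, and variable code are illustrative placeholders, and the exact ulmo call signatures should be checked against its documentation.

```python
# Hedged example of WaterOneFlow access via the third-party ulmo package.
# URL, site code, and variable code below are illustrative placeholders.
import ulmo

wsdl = "http://hydroportal.cuahsi.org/nwisdv/cuahsi_1_1.asmx?WSDL"
sites = ulmo.cuahsi.wof.get_sites(wsdl)              # catalog of sites
values = ulmo.cuahsi.wof.get_values(
    wsdl, site_code="NWISDV:06730500", variable_code="NWISDV:00060",
    start="2013-01-01", end="2013-12-31")
print(len(values["values"]), "daily values retrieved")
```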
The California Delta, where the Sacramento and San Joaquin Rivers come together before flowing into the San Francisco Bay, functions as the heart of California. It is here that water originating from distal parts of the state mixes and is pumped to other far-flung regions, sustaining life, economies, cultures, and one of the biggest agricultural industries in the nation. However, for decades, water allocation planning has been steeped in controversy and legal gridlock, posing challenges for adaptation to rapidly changing climatic conditions, including increasing frequency and severity of droughts and floods and long-term changes in water availability. In other major estuaries such as the Chesapeake Bay and Florida Everglades, stakeholder-engaged, open-science modeling to evaluate multifaceted tradeoffs associated with water management decisions has created inroads through controversy and gridlock. Similar approaches, applied to specific challenges such as Chinook salmon management and localized wetland restoration in the Delta, have likewise promoted adaptive behaviors. In this talk, I highlight lessons learned from those examples and discuss how the Delta science community is incorporating those lessons into a larger-scale vision of “One Delta, one science, one modeling framework.” Fundamental to this vision are commonly held best practices, including widespread adoption of FAIR principles for model, metadata, and data dissemination, common cyberinfrastructure resources, and human resources to support outreach to communities not formerly represented in the use of models and implementation of best practices.  +
The Coastline Evolution Model (CEM) addresses coastline changes that arise from gradients in the net alongshore transport, over timescales that are long compared to storm cycles, and spatial scales that are larger than the cross-shore extent of the shoreface (kilometers on typical open ocean coasts). In the model, coastline morphodynamic feedbacks arise as coastline shapes determine spatial patterns of sediment flux, and gradients in that flux cause changes in shape. In this model system, waves approach from a wide range of directions, and the influences of the whole ‘wave climate’ combine to determine coastline changes and patterns. Wave shadowing—in which protruding coastline features change the local wave climates affecting other parts of the coastline—also plays a key role in coastline evolution in this model. A number of other processes or influences have been added to the model, including: river sediment input and delta evolution; effects of the composition of underlying rocks; two-way interactions between beach sediment and cliff erosion; and human shoreline stabilization.

This clinic will combine 1) explanations of model principles, assumptions, and limitations with 2) the opportunity for participants to gain some familiarity with running the model, by conducting their own simple model experiments.  +
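The flux-gradient principle at the heart of the model can be stated in one line: shoreline change follows the divergence of alongshore sediment flux, dy/dt = -(1/D) dQs/dx, with a flux that depends on wave height and the wave angle relative to the local shoreline. The sketch below is a minimal one-line illustration of that principle with an Ashton-and-Murray-style flux (illustrative coefficient, valid for relative angles under 90 degrees); it is not the CEM source code:

```python
# One-line (1D) coastline sketch of the CEM flux-gradient principle.
import numpy as np

def alongshore_flux(rel_angle, wave_height, k=0.34):
    """CERC-type flux ~ H^2.4 cos(phi)^1.2 sin(phi); rel_angle in radians,
    assumed |rel_angle| < pi/2 so cos() stays positive."""
    return k * wave_height**2.4 * np.cos(rel_angle)**1.2 * np.sin(rel_angle)

def advance_shoreline(y, dx, dt, wave_angle, wave_height, closure_depth=10.0):
    """Update cross-shore positions y (m): dy/dt = -(1/D) dQs/dx."""
    shoreline_angle = np.arctan(np.gradient(y, dx))      # local orientation
    qs = alongshore_flux(wave_angle - shoreline_angle, wave_height)
    return y - dt * np.gradient(qs, dx) / closure_depth
```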
The Community Earth System Model (CESM) is a comprehensive global Earth System model. As such, it requires the representation of many couplings across the various components (land, ice, ocean, atmosphere). In this talk, I will focus mostly on the current and upcoming representations of the physical and chemical couplings across the Earth surface in CESM.  +
The DAKOTA project began in 1994 with the primary objective of reusing software interfaces to design optimization tools. Over nearly 20 years of development, it has grown into an open source toolkit supporting a broad range of iterative analyses, typically focused on high-fidelity modeling and simulation on high-performance computers. Today, DAKOTA provides a delivery vehicle for uncertainty quantification research for both the NNSA and the Office of Science, enabling an emphasis on predictive science for stockpile stewardship, energy, and climate mission areas.

Starting with an overview of the DAKOTA architecture, this presentation will introduce processes for setting up iterative analyses, interfacing with computational simulations, and managing high-fidelity workflows. Algorithmic capabilities in optimization, calibration, sensitivity analysis, and uncertainty quantification (UQ) will be briefly overviewed, with special emphasis given to UQ. Core UQ capabilities include random sampling methods, local and global reliability methods, stochastic expansion methods, and epistemic interval propagation methods. This UQ foundation enables a variety of higher level analyses including design under uncertainty, mixed aleatory-epistemic UQ, and Bayesian inference.  +
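The simplest capability on that list, random-sampling UQ, amounts to the miniature below: draw inputs from their distributions, run the forward model on each draw, and summarize the output statistics. This is a sketch of the idea only, with a made-up toy model, not DAKOTA's input syntax or API:

```python
# Miniature random-sampling UQ loop (toy model; not DAKOTA's interface).
import numpy as np

rng = np.random.default_rng(seed=0)

def simulation(x1, x2):
    """Stand-in for an expensive black-box forward model."""
    return x1**2 + 3.0 * x2

# Uncertain inputs: x1 ~ N(1.0, 0.1), x2 ~ N(2.0, 0.3)
samples = rng.normal(loc=[1.0, 2.0], scale=[0.1, 0.3], size=(10_000, 2))
outputs = np.array([simulation(x1, x2) for x1, x2 in samples])
print(f"mean={outputs.mean():.3f}  std={outputs.std():.3f}  "
      f"p95={np.percentile(outputs, 95):.3f}")
```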
The Delta Dynamics Collaboratory (DDC) is a four-year effort to develop an inter-disciplinary and multi-scale understanding of the interplay among and within the various sub-systems of deltas. It is funded through the National Science Foundation’s “Frontiers in Earth System Dynamics” (FESD) Program. The overall objective of the DDC is to develop tested, high-resolution, quantitative models incorporating morphodynamics, ecology, and stratigraphy to predict river delta dynamics over engineering to geologic time-scales. In this way we hope to specifically address questions of delta system dynamics, resilience, and sustainability. There are two laboratories in the DDC: a field laboratory for discovering process-interactions and testing model predictions (Wax Lake Delta, LA), and a virtual modeling laboratory. Here we report on the progress made to date in advancing models of delta processes and morphodynamic interactions. The models consist of three types: 1) reduced complexity delta models (RCDM); 2) a 2- and 3D eco-geo-morpho-dynamic sediment transport delta model; and 3) vegetation and fish population ecological models. The RCDM are focused on large-scale interactions, and as such offer the opportunity to explore aspects of system dynamics that may be harder to pick out of the details of a high-resolution model. ''DeltaRCM'' is a “2.5-D” cellular delta formation model that computes a depth-averaged flow field and bed topography as the delta evolves in time. The model adopts a Lagrangian view of transport: water and sediment fluxes are treated as a large number of "parcels" that are routed stochastically through a lattice grid. The probability field for routing the parcels is updated through time and is determined by a set of rules abstracting the governing physics of fluid flow and sediment transport. Sediment parcels are treated as "leaking buckets" that lose sediment to the bed by deposition and gain sediment from the bed by erosion. In the current version of the model, sediment parcels represent coarse and fine materials ("sand" and "mud", respectively), which have different rules for routing and conditions for deposition and entrainment. DeltaRCM is able to produce delta morphology at the level of self-organized channel behaviors such as bifurcations and avulsions. The model can also record stratigraphy in terms of grain-size or deposition age. Validation work on the flow routing component of the model (''FlowRCM'') shows that the model gives reasonable channel-to-channel and channel-to-floodplain flow partitioning but falls short in predicting hydrodynamic details at fine (e.g., sub-channel) scales. A second RCDM (Kim et al. 2009) is being modified to include self-formed channels and separate channel and floodplain elevations, treat alluvial-bedrock and bedrock-alluvial transitions in low-slope sand-bed rivers, and exploit new channel geometry closure rules for self-formed alluvial sand-bed channels developed during the course of this study. Along the lines of reduced complexity models, we have also developed a network-based modeling framework for understanding delta vulnerability to change. The deltaic system is mapped into a directed graph composed of a set of nodes (or vertices) and links (or edges) and represented by its connectivity or adjacency matrix. For flux routing a weighted adjacency matrix is used to reflect how fluxes are split downstream and to enforce mass balance. 
Using the proper tree representation, we show that operations on the adjacency matrix quantify several properties of interest, such as immediate or distant connectivity, distinct sub-networks, and downstream regions of influence from any point on the network. We use these representations to construct “vulnerability maps”, e.g., maps of delta locations where an imposed change in water and/or sediment fluxes would most drastically affect sediment and water delivery to the coastal zone outlets or to a specific region of the delta. Dam construction can be emulated by reducing water and sediment downstream by a given fraction, the location and operation of irrigation dykes can be varied, and different alternative management options can be evaluated in a simple yet spatially extensive framework. The current open-source state of the art in 3D delta morphodynamic modeling is Delft3D-FLOW Version 6.00.00.2367, developed by Deltares, an independent, Dutch-based research institute for matters relating to water, soil and the subsurface (http://www.deltares.nl/en). We are using Delft3D 6.0 to test various hypotheses concerning the emergent behaviors of deltas subject to various sediment fluxes, basin depths, and base level variations, and to investigate the specific morphodynamics and sediment retention of Wax Lake Delta. Predictions of sand and mud transport through the various distributaries compare well with data collected by the FESD Wax Lake Team and indicate that total sediment load is rarely split equally at bifurcations, in accordance with earlier predictions. These and other studies have shown that improvements to Delft3D are needed to solve the following problems: 1) morphodynamic simulations of deltas are, in part, an artifact of the underlying orthogonal grid structure; 2) the ecogeomorphic interactions are primitive; 3) the algorithm for eroding channel banks is ad hoc; and 4) simulations are restricted by computational inefficiencies. We are attempting to address these problems in collaboration with Deltares scientists. A mass-conservative, staggered, three-dimensional, immersed-boundary, shallow water Delft3D+ model is under development for flow on complex geometries. It allows channels to evolve independent of the underlying grid, and allows cohesive channel banks to erode laterally according to user-specified bank-erosion rules. The method consists of hybrid cut and ghost cells: ghost cells are used for the momentum equations in order to prescribe the correct boundary condition at the immersed boundary, while cut cells are used in the continuity equation to conserve mass. Results show that the resulting scheme is robust, does not suffer any time step limitation for small cut cells, and conserves fluid mass up to machine precision. Comparisons with analytical solutions and reference numerical solutions on curvilinear grids confirm the quality of the method. To improve ecogeomorphic interactions, we have created a sub-grid vegetation-flow interaction module for Delft3D and Delft3D+ based upon the Baptist et al. (2005) equations. Baptist’s formulation is based on the theory that vegetation can be modeled as rigid cylinders, which influence the momentum calculation and turbulence structure. Vegetation is characterized by plant height, density, stem diameter, and drag coefficient in the model. The vertical flow velocity profile is divided into a zone of constant flow velocity inside the vegetated part and a logarithmic velocity profile above for submerged vegetation. 
Results show that in deltaic freshwater marshes, adding vegetation increases the fraction of sediment deposited inside the marsh, but the vegetative roughness also forces more water into the channels, leading to more erosion in the channels and more water bypassing the marsh surface. Thus under certain conditions, adding vegetation to freshwater marshes can reduce net deposition rates. In addition to the above-ground effects of plants, the role of roots in binding sediment is being modeled in a separate vegetation-root routine by increasing the critical shear stress for erosion. When the combined flow-wave shear stress is larger than a rooted-soil critical value, aggregate or block erosion occurs. The model is tested against cumulative sediment erosion and deposition on Wax Lake Delta during Hurricane Rita in 2005. The simulation shows that roots significantly change the sedimentation-erosion pattern in the marsh area by protecting the vegetated marshes from erosion. A fish dynamics model explores the co-evolution of fish populations, vegetation, and delta morphology. The model simulates the individuals of five fish species on a spatial grid of bathymetry, water levels, vegetated habitat, and basal prey. An existing version of this model uses historical water levels, together with fixed bathymetric maps, to determine water depths on each cell and its vegetation type. Model simulations follow each individual of each species through the processes of growth, reproduction, mortality, and movement. Individuals compete for space and invertebrate prey, and individuals of predatory species consume other model individuals. The sum over individuals for a species yields abundances, and the combination of abundance and growth yields productivity. We use the model to identify strong relationships between morphodynamic features (such as mouth bar hypsometry) and predicted total and species-specific fish productivity. As these models reach maturity in the next two years they will be incorporated into the CSDMS architecture and framework. All models will be open source and made freely available via the CSDMS Repository. If you have a specific immediate request, please email sling@psu.edu  
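The network-based vulnerability framework described above lends itself to a compact illustration. In the sketch below the topology and split fractions are invented; the point is that rows of a flux-splitting weighted adjacency matrix sum to one at bifurcations (mass balance), steady fluxes follow from a single linear solve, and powers of the binary adjacency matrix expose downstream regions of influence:

```python
# Toy delta network: weighted adjacency matrix for flux splitting and
# reachability for downstream influence (invented 6-node topology).
import numpy as np

W = np.zeros((6, 6))                 # nodes: 0 apex; 3, 4, 5 outlets
W[0, 1], W[0, 2] = 0.6, 0.4          # apex bifurcation splits 60/40
W[1, 3], W[1, 4] = 0.5, 0.5
W[2, 5] = 1.0

source = np.zeros(6)
source[0] = 100.0                    # water/sediment input at the apex
flux = np.linalg.solve(np.eye(6) - W.T, source)
print(flux)                          # outlets 3, 4, 5 receive 30, 30, 40

A = (W > 0).astype(int) + np.eye(6, dtype=int)
reach = np.linalg.matrix_power(A, 5)     # paths of length <= 5
print(reach[0] > 0)                      # region of influence of the apex
```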
The Earth’s surface is a boundary layer between internally-driven geodynamics and atmospheric forcing. In much of what we do as landscape modellers, our analysis of the Earth’s surface can be enhanced by consideration and understanding of the substrate acted upon by hillslope, riverine and glacial processes. To explore the influence of crustal strength on patterns of fluvial incision, we use a conservative scaling rule to relate rock erodibility to field measurements of cohesive strength. In other models, grain sizes produced upon the erosion of rock are made a function of field-measured fracture density values. By combining 3D geodynamic codes with landscape evolution models we are able to explore the sensitivity of surface processes to topographic and tectonic stresses, geological history, fault damage, seismic accelerations, pore pressures, and fluid flow. We present several examples where useful interpretations were made by integrating field, lab, and experimental data with geodynamic models, landscape evolution models, or a combination of both. Our examples are biased toward collisional settings – the Himalaya, the Southern Alps and Taiwan – but the approach is equally valid when considering strike-slip or extensional settings.  +
The Earth’s topography is the product of surface vertical motions caused by tectonic processes and modulated by erosional processes that cause redistribution of mass at the Earth’s surface over a wide range of scales. The efficiency of these gravity-driven processes scales with slope (and thus topography). This also implies that the time scale needed for an orogenic system to reach steady-state between tectonic uplift and erosion must scale with the height of the topography, the tectonic uplift rate and the degree of isostatic compensation. Using simple observations from a range of presently active orogenic systems, we estimate that this time scale lies between 1 and 15 Myr. From these estimates, we can also derive a “generic” erosion law based on the Stream Power Law (SPL) to be used in geodynamic models. We also show that the degree of non-linearity of the erosional laws (i.e., the value of the slope exponent) controls the rate at which topography decays once tectonic uplift ceases. This simple behavior of any slope-dependent erosion law may explain the post-orogenic longevity of Earth’s topography.

The same processes that shape the Earth’s topography are, for the most part, functions of the availability of moisture from the atmosphere. This has led to the conclusion that there must be a strong link between the efficiency of surface processes and climate. This link is however difficult to establish from observational evidence. For example, the effect of the Cenozoic cooling of the Earth’s climate on the efficiency of erosion in orogenic systems remains highly debated. We propose that this question is difficult to address because the wide range of erosional processes active at the Earth’s surface (such as fluvial erosion, hillslope processes, glacial abrasion, peri-glacial processes, chemical weathering, etc.) are characterized by different response times to climate perturbations. These response times may also depend on the dimensions of the topographic feature being eroded, the mean slope, the mean precipitation (or accumulation) rate and the nature of the rocks being eroded. It is therefore not surprising that a global correlation between climate change and erosional efficiency is difficult to demonstrate.

Recent work has also shown that erosional efficiency is strongly dependent on the variability of climate and, in particular, of precipitation. We will show how this climate variability has been introduced in fluvial erosional models using a simple stochastic approach. This requires, however, that mean precipitation and precipitation variability (or storminess) be translated into mean discharge and discharge variability. This can be achieved through the use of an eco-hydrological model that requires only a limited number of parameters.

To conclude, we will use a surface processes model to demonstrate how tectonics, surface processes and climate interact with each other over geological time scales to create landforms that will ultimately exert a strong control on biodiversity, species richness and endemism. We will illustrate this point using the island of Madagascar as a case example.  
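The “generic” erosion law referred to above is the stream power law, E = K A^m S^n. A minimal detachment-limited sketch (all parameter values illustrative, explicit scheme, coarse grid) shows the behavior discussed: with uplift switched off, re-running with different slope exponents n changes how quickly the profile decays. Note that the dimensions of K depend on n, so the comparison is qualitative:

```python
# 1D stream-power decay sketch: E = K * A**m * S**n, uplift switched off.
import numpy as np

def decay_profile(n=1.0, m=0.5, K=5e-6, dt=1000.0, nsteps=5000):
    x = np.linspace(1e3, 100e3, 200)        # downstream distance (m)
    area = 1.7 * x**1.8                     # Hack's-law drainage area (m^2)
    z = 1e-3 * (x.max() - x)                # initial ramp toward the outlet
    for _ in range(nsteps):
        slope = np.maximum(-np.gradient(z, x), 1e-8)
        z -= dt * K * area**m * slope**n    # detachment-limited erosion
        z[-1] = 0.0                         # fixed base level at the outlet
    return x, z

x, z_linear = decay_profile(n=1.0)
x, z_nonlin = decay_profile(n=2.0)          # compare decay for different n
print(z_linear.max(), z_nonlin.max())       # remaining relief after 5 Myr
```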
The Ensemble Framework For Flash Flood Forecasting (EF5) was developed to address a critical need for rapidly updating distributed hydrologic models capable of predicting flash floods. In the U.S., EF5 is used to run a 3-member ensemble forced by radar-based precipitation as part of the Flooded Locations And Simulated Hydrographs (FLASH) product suite used by the NWS. As part of the FLASH project, a reanalysis was conducted for 2002-2011 to examine a climatology of flash flood events across the U.S. EF5 is also used by a NASA SERVIR applied science team for capacity building in East Africa. EF5 was designed with this use case in mind, and as such is user-friendly, with helpful error messages, cross-platform support, and an open source codebase.  +
The FAIR Principles mandate that all digital research objects should be findable, accessible, interoperable and reusable. Initially perceived and applied mainly for data, they are becoming increasingly important for research software as well. We will discuss why and how FAIR principles for software differ from those for data, how software FAIRness can be assessed and measured, and what everybody can do to make their software FAIR(er). Finally, we survey community initiatives working towards the development and standardization of FAIR principles for research software, and ways to get involved.  +
The Geoscience Paper of the Future (GPF) Initiative was created to encourage geoscientists to publish papers together with their associated digital research products, following best practices of reproducible articles, open science, and digital scholarship. A GPF includes: 1) data available in a public repository, including metadata, a license specifying conditions of use, and a citation using a unique and persistent identifier; 2) software available in a public repository, with documentation, a license for reuse, and a citation using a unique and persistent identifier; 3) provenance of the results, explicitly describing method steps and their outcomes in a workflow sketch, a formal workflow, or a provenance record. Learn to write a GPF and submit to a special section of AGU’s Earth and Space Sciences journal. More at http://www.ontosoft.org/gpf/.  +
The Integrated Compartment Model (ICM) is a comprehensive and computationally efficient numerical tool that can be used to provide insights about coastal ecosystems and evaluate restoration and protection strategies. It includes physical and ecological processes such as hydrology, nutrients, vegetation, and morphology. The ICM can be used to estimate the individual and cumulative effects of restoration projects or strategies on the landscape and ecosystem and the level of impact/risk to communities. The ICM utilizes habitat suitability indices (HSIs) to predict broad spatial patterns of habitat change. It also provides input parameters to more dynamic fish and shellfish community models to quantitatively predict potential changes in important fishery resources in the future.

The model is also used to examine the impact of climate change and future environmental scenarios (e.g. precipitation, eustatic sea level rise, subsidence, nutrient loading, riverine runoff, storms, etc.) on the landscape and on the effectiveness of restoration or protection strategies.

The ICM is publicly accessible code, and research groups in the coastal ecosystem restoration and protection field are encouraged to explore its utility as a computationally efficient tool to examine ecosystems’ response to physical or ecological changes, either due to future projections or to the implementation of restoration strategies.  +
The Integrated Compartment Model (ICM) was developed as part of the 2017 Coastal Master Plan modeling effort. It is a comprehensive numerical hydrodynamic model coupled to various geophysical process models. Simplifying assumptions related to some of the flow dynamics are applied to increase the computational efficiency of the model. The model can be used to provide insights about coastal ecosystems and evaluate restoration strategies. It builds on existing tools where possible and incorporates newly developed tools where necessary. It can perform decadal simulations (~50 years) across the entire Louisiana coast. It includes several improvements over the approach used to support the 2012 Master Plan, such as: additional processes in the hydrology, vegetation, wetland and barrier island morphology subroutines; increased spatial resolution; and integration of previously disparate models into a single modeling framework. The ICM includes habitat suitability indices (HSIs) to predict broad spatial patterns of habitat change, and it provides an additional integration to a dynamic fish and shellfish community model which quantitatively predicts potential changes in important fishery resources. It can be used to estimate the individual and cumulative effects of restoration and protection projects on the landscape, including a general estimate of water levels associated with flooding. The ICM is also used to examine possible impacts of climate change and future environmental scenarios (e.g. precipitation, eustatic sea level rise, subsidence, tropical storms, etc.) on the landscape and on the effectiveness of restoration projects. The ICM code is publicly accessible, and coastal restoration and protection groups interested in planning-level modeling are encouraged to explore its utility as a computationally efficient tool to examine ecosystem response to future physical or ecological changes, including the implementation of restoration and protection strategies.
The following website contains the technical memos of the 2017 Coastal Master Plan modeling effort: http://coastal.la.gov/our-plan/2017-coastal-master-plan/  
The Mw 7.8 Kaikōura earthquake was a complex one, rupturing ~20 faults on and offshore, including several that had not been recognised as active. It also generated tens of thousands of landslides, hundreds of landslide dams, and damaged hillslopes that are now susceptible to failure during rainstorms and aftershocks. The landslide debris, when mobilised, will create new hazards – further landsliding, rapid aggradation, and increased river channel instability – and will threaten infrastructure into the future. Several large landslides closed State Highway 1 along the Kaikōura coast for over a year, forcing major changes to New Zealand’s main transport route from Wellington to Christchurch. The road has reopened but repair work continues, and it remains vulnerable to further disruption and closures.
These hazards are likely to persist for years to decades, requiring active management but also providing researchers with a natural laboratory with which to quantify post-earthquake landscape dynamics. Researchers in New Zealand and their overseas colleagues started to collect perishable data immediately after the earthquake and are continuing to do so. Repeat LiDAR surveys, ground profiling, field monitoring, laboratory testing and numerical modelling will be integrated to determine how hillslopes and rivers will respond to future forcing events. The goal is to produce an integrated set of predictive tools to manage earthquake and post-earthquake landslide risk.  +
The National Hydrologic Model (NHM) was developed to support coordinated, comprehensive, and consistent hydrologic modeling at multiple scales for the conterminous United States. The NHM development has been driven for the past decade by specific applications to meet stakeholder needs for accessible, adaptable surface water models that address local hydrologic modeling needs. NHM-based applications provide information to scientists, water resource managers, and the public to support advanced scientific inquiry and effective decision-making. The NHM infrastructure supports the execution of the Monthly Water Balance Model (NHM-MWBM) and the daily Precipitation Runoff Modeling System (NHM-PRMS). The NHM-PRMS balances all components of the water budget and can include simulation of stream temperature. Complete local models can be subset from the NHM-PRMS, then adapted and applied with local expertise to address stakeholder needs, providing nationally-consistent, locally informed, stakeholder relevant results. The NHM represents an opportunity for collaboration in the hydrologic community.  +
The National Science Foundation and the National Geospatial-Intelligence Agency have a commercial imagery agreement that provides very high resolution DigitalGlobe, Inc. satellite imagery to the science community. We have leveraged this imagery, high-performance computing, and open source workflows and code to produce tens of thousands of digital surface models. We have successfully produced repeat topography time series for over one fifth of the entire planet. I will show examples of current projects to examine inundation risks at the 20 largest port cities around the globe, landslides and earth ruptures that occurred during the November 2016 Kaikōura earthquake in New Zealand, and assessment of ice mass changes at all latitudes.  +
The Red Cross Red Crescent Climate Centre promotes the integration of climate science into humanitarian policy and practice. We know that if the best available tools aren't utilized, then there will be losses and suffering, as well as missed opportunities to build resilience. With partners, we develop innovative ways to connect forecasting and monitoring products with disaster management teams and disaster risk financing entities around the world. How does this play out on the ground? What challenges do we face with the existing weather and flood tools we utilize? What can we envision for a better future? In this interactive session, we'll collectively explore how the Climate Centre and others are preparing to improve how we link early warnings and early actions.  +
The Regional Ocean Modeling System (ROMS) is an open-source regional ocean model with a large user community. The version distributed as part of the Coupled Ocean–Atmosphere–Wave–Sediment Transport Modeling System (COAWST) includes two-way coupling with the atmosphere model WRF and the wave model SWAN, and contains routines for transport of non-cohesive sediment, biogeochemical processes, seagrass growth, and wave-flow-vegetation interactions. Recently, we implemented algorithms for treating cohesive and mixed sediment. These include the following: floc dynamics (aggregation and disaggregation in the water column); changes in floc characteristics in the seabed; erosion and deposition of cohesive and mixed (combination of cohesive and non-cohesive) sediment; and biodiffusive mixing of bed sediment. Additionally, changes to the sediment bed layering scheme were made that improve the fidelity of the modeled stratigraphic record. In this webinar, we will describe and demonstrate this new functionality and provide examples of these modules implemented in idealized test cases and a realistic application.

'''References''':
Sherwood, C.R., A.L. Aretxabaleta, C.K. Harris, J.P. Rinehimer, R. Verney, and B. Ferré. 2018. Cohesive and mixed sediment in the Regional Ocean Modeling System (ROMS v3.6) implemented in the Coupled Ocean Atmosphere Wave Sediment-Transport Modeling System (COAWST r1179). Geoscientific Model Development; https://doi.org/10.5194/gmd-11-1849-2018.

Tarpley, D.R., C.K. Harris, C.T. Friedrichs, and C.R. Sherwood. 2019. Tidal variation in cohesive sediment distribution and sensitivity to flocculation and bed consolidation in an idealized, partially mixed estuary. Journal of Marine Science and Engineering (JMSE); 7: 334; doi:10.3390/jmse7100334.  +
The Sediment Experimentalist Network (SEN) integrates the efforts of sediment experimentalists to build a Knowledge Base for guidance on best practices for data collection and management. The network facilitates cross-institutional collaborative experiments and communicates with the research community about data and metadata guidelines for sediment-based experiments. This effort aims to improve the efficiency and transparency of sedimentary research for field geologists and modelers as well as experimentalists.

The first part of this clinic will include a hands-on experiment using a desktop flume. We will create a physical model of a delta in a small flume on-site during the meeting. Fitting with the annual meeting theme, we will explore how delta morphology and stratigraphy capture climate change. The major goals will be to discuss the lifecycle of data and data management for experiments and to generate an example dataset for numerical model testing. Discussion will include practical aspects such as metadata requirements and naming variables.

In the second part, participants will learn how to engage in the SEN Knowledge Base and create an entry using either the collected data from the clinic experiment or participants’ own research data. We will focus our data usage and entry activities around the science theme of our experiments and associated model efforts: How do delta morphology and stratigraphy respond to external perturbations generated by climate change? We will explore www.sedexp.net to discover data from the experimentalist community, workflows, laboratory facilities and their capabilities for potential collaborations. This second part will also include discussion about best practices for data preservation and reuse through the current infrastructure (e.g., SEN, SEAD, institutional data repositories). After getting to know the Knowledge Base and other cyberinfrastructure, we will discuss the possibility of experimentalist-modeler collaborations to address our science theme and achieve solutions to grand challenge goals.
The following Google Drive folder contains material for the clinic: http://tinyurl.com/CSDMS-SEN

Enrollees will be contacted a couple weeks prior to the CSDMS meeting to engage in some brief pre-workshop activities to prepare for the clinic. There will be a short survey at the end about how to enhance collaborations between the modeler and experimentalist communities.

'''More about SEN:'''
http://earthcube.org/group/sen
http://sedimentexperiments.blogspot.com
http://dx.doi.org/10.1016/j.geomorph.2015.03.039  
The St. Anthony Falls Laboratory Virtual StreamLab (VSL3D) is a powerful multi-resolution and multi-physics Computational Fluid Dynamics (CFD) model for simulating 3D, unsteady, turbulent flows and sediment transport processes in real-life streams and rivers with arbitrarily complex structures, such as man-made hydraulic structures, woody debris, and even hydrokinetic turbine arrays. The code can handle arbitrarily complex geometry of waterways and embedded structures using novel immersed boundary strategies. Turbulence can be handled either via Reynolds-averaged Navier-Stokes (RANS) turbulence models or via large-eddy simulation (LES) coupled with wall models. Free-surface effects are simulated using a level-set, two-phase flow approach, which can capture complex free-surface phenomena, including hydraulic jumps, over arbitrarily complex bathymetry. A fully-coupled hydro-morphodynamic module has also been developed for simulating bedload and suspended load sediment transport in meandering rivers. A novel dual time-stepping, quasi-synchronized approach has been developed to decouple the flow and sediment transport time scales, enabling efficient simulations of morphodynamic phenomena with long time scales, such as dune migration in rivers. The code is parallelized using MPI. This clinic will present a comprehensive overview of the VSL3D, report extensive grid sensitivity and validation studies with experimental data, and present a series of applications, including: 1) LES and unsteady RANS of turbulent flow and scalar transport in natural meandering streams; 2) LES of sand wave growth and evolution in a laboratory scale flume; 3) unsteady RANS of dune formation and migration in large scale meandering rivers with in-stream rock structures (rock vanes, j-hooks, w-weirs, etc.); 4) LES of free-surface flows in natural and engineered open channels; and 5) LES of gravity currents.

Representative references about the VSL3D code:

1. Khosronejad, A., Hill, C., Kang, S., and Sotiropoulos, F., “Computational and Experimental Investigation of Scour Past Laboratory Models of Stream Restoration Rock Structures,” Advances in Water Resources, Volume 54, Pages 191–207, 2013.

2. Kang, S., and Sotiropoulos, F., “Assessing the predictive capabilities of isotropic, eddy-viscosity Reynolds-averaged turbulence models in a natural-like meandering channel,” Water Resources Research, Volume 48, Article Number W06505, DOI: 10.1029/2011WR011375, 2012.

3. Kang, S., Khosronejad, A., and Sotiropoulos, F., “Numerical simulation of turbulent flow and sediment transport processes in arbitrarily complex waterways,” Environmental Fluid Mechanics, Memorial Volume in Honor of Prof. Gerhard H. Jirka, Eds. W. Rodi & M. Uhlmann, CRC Press (Taylor and Francis group), pp. 123-151, 2012.

4. Kang, S., and Sotiropoulos, F., “Numerical modeling of 3D turbulent free surface flow in natural waterways,” Advances in Water Resources, Volume 40, Pages 23-36, DOI: 10.1016/j.advwatres.2012.01.012, 2012.

5. Kang, S., and Sotiropoulos, F., “Flow phenomena and mechanisms in a field-scale experimental meandering channel with a pool-riffle sequence: Insights gained via numerical simulation,” Journal of Geophysical Research – Earth Surface, Volume 116, Article Number F03011, DOI: 10.1029/2010JF001814, 2011.

6. Khosronejad, A., Kang, S., Borazjani, I., and Sotiropoulos, F., “Curvilinear Immersed Boundary Method For Simulating Coupled Flow and Bed Morphodynamic Interactions due to Sediment Transport Phenomena,” Advances in Water Resources, Volume 34, Issue 7, Pages 829-843, DOI: 10.1016/j.advwatres.2011.02.017, 2011.

7. Kang, S., Lightbody, A., Hill, C., and Sotiropoulos, F., “High-resolution numerical simulation of turbulence in natural waterways,” Advances in Water Resources, Volume 34, Issue 1, January 2011, Pages 98-113.  
The Sustainable Development Goals and the Paris Agreement declare global commitments to climate stabilization and shared prosperity, but specific pathways for their simultaneous achievement remain unclear. On shorter time scales, governmental, agricultural, and economic systems require climate adaptation solutions. Integrated assessment models (IAMs) are essential tools for managing complex systems to meet these simultaneous imperatives, but are subject to theoretical, computational, and personal limitations. In this presentation, I will discuss the role of three IAMs (GLOBIOM, FeliX, and the Resilience Indicator Multihazard Model) in service of decision making under uncertainty for science and policy.  +
The Underworld code was designed for solving (very) long timescale geological deformations accurately, tracking deformation and evolving interfaces to very high strains. It uses a particle-in-cell based finite element method to track the material history accurately, and highly-tuned multigrid solvers for fast implicit solution of the equations of motion. The implementation has been fully parallel since the inception of the project, and a plugin/component architecture ensures that extensions can be built without significant exposure to the underlying technicalities of the parallel implementation. We also paid considerable attention to model reproducibility and archiving — each run records its entire input state and the repository state automatically.
A typical geological problem for which the code was designed is the deformation of the crust and lithospheric mantle by regional plate motions — these result in the formation of localised structures (e.g. faults), basins, folds, and in the generation of surface topography. The role of surface processes (redistributing surface loads and changing boundary conditions) is known to be significant in modifying the response of the lithosphere to the plate-derived forces. The coupling of surface process codes to Underworld is feasible, but raises some interesting challenges (and opportunities!), such as the need to track horizontal deformations and match changes to the topography at different resolutions in each model. We will share some of our insights into this problem.  +
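For readers unfamiliar with the particle-in-cell idea, the fragment below illustrates it in miniature (a schematic illustration, not Underworld code): material points advect through a fixed grid carrying history variables such as accumulated strain, and grid cells recover their material properties from the particles they currently contain.

```python
# Miniature particle-in-cell fragment: particles carry strain history
# through a fixed 1D grid (schematic only, not Underworld).
import numpy as np

n_particles, n_cells, dx = 1000, 50, 1.0
rng = np.random.default_rng(1)
x = rng.uniform(0.0, n_cells * dx, n_particles)   # particle positions
strain = np.zeros(n_particles)                    # history on particles

def advect_and_bin(x, strain, velocity, strain_rate, dt):
    x = (x + velocity * dt) % (n_cells * dx)      # periodic advection
    strain = strain + strain_rate * dt            # accumulate history
    cell = (x / dx).astype(int)                   # cell holding each particle
    counts = np.maximum(np.bincount(cell, minlength=n_cells), 1)
    cell_strain = np.bincount(cell, weights=strain, minlength=n_cells) / counts
    return x, strain, cell_strain                 # grid sees particle average

x, strain, cell_strain = advect_and_bin(x, strain, 0.5, 1e-3, 10.0)
```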
The analysis of river systems under change involves a wide range of spatial and temporal scales. Channel features range from sub-meter to kilometer scale and processes along river networks vary from instantaneous to geologic time scales. In the face of changes in forcings and anthropogenic modifications on the Earth surface, analysis and modeling of river systems is challenged by this vast range of scales and lack of measurements capable of capturing the heterogeneity that characterizes river systems. Remotely sensed data provide access to spatial and temporal information that can be integrated with modeling and field observations. In this presentation, I will show examples of river network studies that integrate multiple sources of data to address issues of flooding and coastal resilience. I will also discuss existing challenges and opportunities for future research.  +
The community WRF-Hydro system has evolved from a basic land surface modeling scheme for atmospheric models into a more comprehensive operational hydrologic prediction system. Key to this evolution was explicit accounting for the need to represent different processes at different scales or with different types of spatial representations. The most recent evolution of the WRF-Hydro system was its implementation as the modeling system supporting the new NOAA National Water Model, which became officially operational in August 2016. This presentation will discuss the different kinds of configurations utilized within the NOAA National Water Model (NWM) and how the WRF-Hydro system was adapted to meet those requirements. Specific emphasis will be placed on describing the spatial transformations and flux passing methods that were required to maintain coupling between different parts of the forecasting system. Also discussed will be future work that is planned to enable new process representations within the NWM and how modeling approaches under CSDMS have influenced this development.  +
The computational geodynamics community uses a diverse suite of methods and software to model solid Earth deformation across a wide range of spatial and temporal scales. Within the discipline of computational tectonics, a wide variety of options exists to model non-linear deformation of the lithosphere and convecting mantle, ranging from CIG-supported software to widely used codes from individual research groups to closed-source proprietary commercial codes. This presentation will review the available methods, open-source community resources, state-of-the-art applications and some of the numerical challenges associated with modeling non-linear tectonic processes. In particular, we will highlight the wide range of difficulties for efficiency, accuracy, computational analysis, storage, reproducibility, uncertainty quantification and verification. Last, we will present some solutions to these challenges in the context of multi-physics simulations and code coupling, with an outlook towards integration between the surface processes and lithospheric dynamics communities.  +
The current operational NOAA-NWS National Water Model applies a uniform formulation to make continental scale flow predictions on the NHD+ drainage network. However, the literature demonstrates that given the spatial variability in dominant runoff generation mechanism and associated uncertainties in processes and parameters, skillful predictions require scientific evaluation of different model formulations in different hydrologic regions. Providing timely inland and coastal continental-scale predictions requires operations in an HPC environment. Legacy water resources models have dissimilar inputs and setup workflows, run-time environments, discretizations, solvers, and required forcing data. The sheer variety of approaches impedes model comparison and interoperability. The WaterML 2.0 Hy_Features standard provides a stable meta-model to describe the hydrologic landscape, and includes four fundamental topological elements: “catchment”, “flowpath”, “water body”, and their “nexus” linkages, which represent internal boundary conditions and provide natural breakpoints between models. The Hy_Features data model standard helps to unify model setup workflows. The Next Generation Water Resources Modeling Framework that is currently under development promotes interoperability, inter-comparison, model-based testing of research hypotheses, and ultimately improved agency-specific operational predictions while incorporating rapid adoption of advancements from the academic and federal research communities. This is achieved by using the drainage network as a graph to organize parallelization, and by extending the CSDMS Basic Model Interface (BMI) to include state-serialization functionality and to accommodate models with parallel formulations. This work in progress uses the open source development paradigm and participation by the research community is welcomed.  +
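The BMI extension mentioned above can be pictured as follows. The serialize/deserialize method names here are assumptions for illustration (the framework's actual signatures may differ); get_current_time and update are standard BMI methods.

```python
# Sketch of a state-serialization extension to BMI (method names assumed).
from bmipy import Bmi   # CSDMS Basic Model Interface base class

class SerializableBmi(Bmi):
    """A BMI model that can also round-trip its internal state, so a
    framework can checkpoint, migrate, or replay it at a network nexus."""

    def serialize(self) -> bytes:
        """Pack all prognostic state into a byte buffer (assumed method)."""
        raise NotImplementedError

    def deserialize(self, state: bytes) -> None:
        """Restore state produced by serialize() (assumed method)."""
        raise NotImplementedError

def run_to_nexus_and_checkpoint(model: "SerializableBmi", t_stop: float) -> bytes:
    """Advance a catchment model to a nexus time, then checkpoint it."""
    while model.get_current_time() < t_stop:
        model.update()                 # standard BMI time stepping
    return model.serialize()
```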
The degree to which subsurface architecture (pore space and connectivity) fluctuates and/or evolves is largely ignored in predictions of how Earth’s critical zone can respond to changes in biotic processes (direct and indirect) in the Anthropocene. Specifically, changes in microbial carbon decomposition rates and root growth can influence the generation of macropores, which account for only ~2% of subsurface porosity but transmit ~70% of water to depth. We argue that the community needs to consider that changes in subsurface structure throttle the partitioning of water, and thus the fluxes of carbon, nutrients, and weathering products. Using empirical data and modeling, we explore this connectivity between biotic processes (e.g., root growth, carbon turnover) and subsurface pore structure from the pedon to the continental scale, quantifying the impact of this interaction on stocks and fluxes of water and nutrients. We then examine how, over longer time periods, this change in hydrologic partitioning can influence the depth to which reaction fronts propagate into the subsurface and the role these changes could play in the trajectory of landscape evolution.  +
The development of quantitative models of landscape evolution requires a rigorous assessment of how well they represent natural system dynamics. For some locations and processes, the models have been shown to provide reasonable representations of landscape dynamics over a range of timescales; however, in a number of cases, rather straightforward tests suggest our simplified approaches may require further development. I discuss some of these lingering issues in quantitative geomorphology with a focus on the ubiquitous ‘stream power model’ and show how different approaches offer potential solutions for implementation in landscape evolution models. I will show examples focusing on how climate and lithology are represented in landscape evolution models, as well as discuss the limitations of representing a river channel with a 1-dimensional model. I conclude by suggesting that an iterative approach of numerical prediction, rigorous field assessment, and adjustment of theory is needed to fully capture and understand the dynamics of landscapes.  +
The earth-surface science community is producing increasing volumes of data in laboratory, numerical, and field studies. What new tools and resources can help us wrangle these data into meaningful scientific knowledge? Wherever you might be in the research lifecycle – planning experiments, preparing a manuscript, or recovering old data from hard drives – this year’s Sediment Experimentalist Network (SEN) clinic will inform you on leading practices for data planning, management, and publication.<br>In recent years, policies and tools have evolved with the potential to greatly increase our efficiency in dealing with scientific data. We’ll cover topics that will help throughout the data and experimental lifecycle, including the SEN Knowledge Base (sedexp.net), the growing collection of research data repositories, and AGU’s Enabling FAIR data project.<br>Enrollees will be contacted a couple weeks prior to the CSDMS meeting to engage in some brief pre-workshop activities to prepare for the clinic. We encourage participants to come prepared with information about their research projects, from which we can engage in practical discussions about data planning and management.  +
The expense and logistics of monitoring streamflow (e.g., stage and discharge) and nearshore waves (e.g., height and period) using in situ instrumentation such as current meters, bubblers, and pressure transducers limit the extent to which such important basic information can be acquired. Machine learning might offer a solution, if such information can be obtained remotely from time-lapse imagery using inexpensive consumer-grade camera installations. To that end, I describe a proof-of-concept study into designing and implementing a single deep learning framework that can be used for both stream gauging and wave gauging from appropriate time series of imagery. I show that it is possible to train the framework to estimate 1) stage and/or discharge from oblique imagery of streams at USGS gaging stations, using existing time-lapse camera infrastructure; and 2) nearshore wave height and period from oblique and rectified imagery from Argus systems. This proof-of-concept technique is based on deep convolutional neural networks (CNNs), which are deep learning models for regression tasks based on automated image feature extraction. The stream/wave gauge model framework consists of an existing generic CNN to extract features from imagery (called a ‘base model’), with additional layers to distill the feature information into lower-dimensional spaces and prevent overfitting, and a final layer of dense neurons to predict continuously varying quantities. Given enough training data, the model can generalize well to a site despite variation in, for example, lighting, weather, snow cover, vegetation, and any transient objects in the scene. This development might offer the potential to train models for imagery at sites based on short deployments of in situ instrumentation, especially useful for sites where instrumentation is difficult or expensive to maintain for long periods. This entirely data-driven technique, at least for now, must be trained separately for each site and quantity, so it would be suitable for very long-term, site-specific estimation of wave or hydraulic parameters from stationary camera installations, subsequent to a training period. Further development might promote low-cost (or even hobbyist) hydrodynamic and hydraulic monitoring anywhere.  
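The regression architecture described (a generic feature-extracting ‘base model’, distillation layers, dropout, and a dense output layer) can be sketched in a few lines of Keras. The choice of MobileNetV2 as the base model, the input size, and the layer widths below are assumptions for illustration; the abstract does not specify them.
<pre>
import tensorflow as tf
from tensorflow.keras import layers

# Generic pretrained CNN used purely as an image feature extractor
# (MobileNetV2 is an assumed choice; any Keras application model works).
base = tf.keras.applications.MobileNetV2(
    include_top=False, weights="imagenet",
    input_shape=(224, 224, 3), pooling="avg",
)
base.trainable = False

model = tf.keras.Sequential([
    base,
    layers.Dense(128, activation="relu"),  # distill features to a lower dimension
    layers.Dropout(0.5),                   # guard against overfitting
    layers.Dense(1),                       # predict one continuous quantity (e.g., stage)
])
model.compile(optimizer="adam", loss="mse")
# model.fit(images, observed_stage, validation_split=0.2, epochs=...)
</pre>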
The interplay between tectonics and surface processes has long been recognized and explored through field observations, laboratory studies, and analogue and numerical modeling. However, the dependencies that link tectonics and the surface are complex and often difficult to unravel and visualize with current methods and concepts. To address these difficulties, it is common to create predictive models with algorithms that simplify these natural processes and limit their dependencies on one another.<br><br>In this clinic, we share some simple methods for isolating two tectonic processes: fault damage and fault slip, and explore how they influence the rates and patterns of surface processes. These tectonic processes are introduced as 3D patterns of rock damage and kinematics in a landscape evolution model using Matlab and CHILD. First, we discuss methods for scaling rock damage to erodibility for use in a stream power model. The erodibility field is based on the generic 3D geometry of planar fault damage zones. Next, we include fault slip by using a 3D kinematic solution for dip-slip, oblique-slip, and strike-slip motion. These models include a single slip plane that divides a block of crust into fixed and mobile components. Finally, we combine the rock damage and kinematic fields to observe their combined influence. In these combined models, rock damage becomes a function of the amount of motion accommodated by the slip plane. Throughout the clinic we will explain our methods, interpret model results, discuss their limitations, and postulate ways to improve upon them. The simple methods we employ in this clinic lay a foundation of understanding that can be broadened by use of dynamic, fully coupled models.  +
The landscape serves as a nexus between the solid Earth with its geodynamic processes and the atmosphere. At many spatial and temporal scales landscape morphology and topography provide a constraint on the tectonics of the deeper Earth and the processes active or previously active within it. In order to unravel these, we need to understand the complex relationships between surface processes, their drivers and the Earth materials on which they act.<br>In my talk, I will explore recent developments in modelling surface processes within a single deformational framework. I will focus on collisional settings such as New Zealand’s Southern Alps, SE Alaska and the Himalaya where rapid uplift combines with vigorous climate regimes to create dynamic landscapes. Topics will include:<ul><li>Exploring the complete stress tensor (tectonic, dynamic, topography, fluvial, glacial)</li><li>Rock strength controls on topography and erosion rates</li><li>Failure Earth Response Model</li><li>Smooth particle hydrodynamics and its application to landscape evolution modelling</li></ul>  +
The last two decades have been a period of enormous growth of agent-based (or individual-based) modeling (ABM) in ecology. ABMs allow mechanistic detail to be represented for many aspects of variation among individual organisms. ABMs are suited to spatially explicit modeling of populations, communities, and ecosystems, taking into account both the complexity of the environment and the physiological and behavioral adaptations of organisms. ABMs can thus include links between environmental factors and their effects on plants and animals, which makes them essential in projecting how climate change will affect ecological systems. Key studies using ABMs to both understand ecological systems and project future changes will be discussed. These ecological applications include forest dynamics, species conservation, and preservation of biodiversity, along with a prognosis of future directions.  +
The macroscopic behavior of granular materials is the result of the self-organizing complexity of the constituent grains. Granular materials are known for their ability to change phase, where each phase is characterized by distinct mechanical properties. This rich generic phenomenology has made it difficult to constrain generalized and adequate mathematical models for their mechanical behavior. Glaciers and ice streams often move by deformation of underlying melt-water saturated sediments. Glacier flow models including subglacial sediment deformation use simplified a priori assumptions for sediment rheology, which limit our ability to predict ice sheet dynamics in a changing climate.<br><br>In this talk I will present the soft-body Discrete Element Method (DEM), a Lagrangian method I use to simulate the unique and diverse nature of granular dynamics in the subglacial environment. The method imposes severe restrictions on the computational time step; however, the majority of steps in the granular dynamics algorithm are massively parallel, which makes the DEM an obvious candidate for exploiting the capabilities of modern GPUs. The granular computations are coupled to a fluid-dynamics solver in order to include grain-fluid feedbacks, which prove to be important for stick-slip behavior of glaciers.<br><br>All code is open source and freely licensed.  +
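For readers unfamiliar with the method, the core operation in soft-sphere DEM is a pairwise contact-force calculation, evaluated for every touching grain pair at every time step. A minimal linear-spring version is sketched below (the stiffness value is assumed); production codes add tangential forces, damping, and fluid coupling, and it is the stiff contact spring that forces the very small time steps noted above.
<pre>
import numpy as np

def normal_contact_force(x_i, x_j, r_i, r_j, k_n=1.0e6):
    """Linear-elastic normal force on grain i from grain j (soft-sphere DEM).

    Grains interact only when their surfaces overlap; the repulsive force
    grows linearly with overlap, with assumed stiffness k_n (N/m).
    """
    branch = x_j - x_i
    dist = np.linalg.norm(branch)
    overlap = (r_i + r_j) - dist
    if overlap <= 0.0:
        return np.zeros(3)              # grains not in contact
    normal = branch / dist
    return -k_n * overlap * normal      # push grain i away from grain j

# The stable explicit time step scales roughly as sqrt(m_min / k_n),
# which is why stiff grains make the method computationally demanding.
</pre>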
The multi-institution project "Shelf-Slope Sediment Exchange in the Northern Gulf of Mexico..." used a chain of models to examine the effect of hurricanes on the seabed in the Gulf of Mexico and their effect on infrastructure. The components were ROMS (Rutgers U) for weather, wave climate and currents; CSTMS (VIMS) for sediment transport; WBMsed for river discharges and hurriSlip for sediment failures and gravity ignitions (U Colorado); and LES/RANS-TURBINS (U California, Santa Barbara) for downslope turbidity currents. The project was an ambitious test of the ability of these models to connect, and achieved most of the goals set out. The test case was 3 years of oceanographic data for Louisiana-Mississippi-Alabama, including storms Dolly, Gustav, and Ike.<br>Connection between the models was a challenge on several levels. (i) Physical dimensionalities of the models ranged across 3D, 2.5D, 2D, and pointwise 0D, and model resolutions also varied. (ii) The models reported daily, 3-hourly, or event-based responses. However, different spatial and temporal scales in nature are observed to show different variances in aspects like sediment and event return-time scalings. These are an added, often unforeseen difficulty in assembling multi-models and validating them with data. (iii) Important process jumps are involved in the total chain of sediment transport, perhaps the most riveting being seabed mass failures and turbidity-current ignitions. Separate modules, often requiring energy considerations, are needed to handle such sharp phase changes: sections of metastable deposits on the seafloor becoming slides, and advecting suspended-sediment layers becoming turbulent gravity flows.  +
The natural elevation of the vast, flat landscape of the lower Ganges-Brahmaputra-Meghna (GBM) remains remarkably stable despite persistent relative sea level rise (rSLR). This stability stems from the tight coupling of the land and tides through a robust negative feedback induced by periodic flooding with sediment-rich water. As water levels increase, the inundation depth and duration also increase, resulting in more sediment deposition. This has a stabilizing effect and largely negates the initial increase in water level, such that the elevation surface appears unchanged. We refer to this stable elevation as the equilibrium elevation. Here, we investigate the strength of the inundation feedback and the resulting equilibrium elevation. We identify three main controls on this feedback: (1) the annual rate of rSLR, (2) the mean tidal range (TR), and (3) the mean suspended sediment concentration (SSC). We explore the realistic parameter space of each using a simple, zero-dimensional mass balance model. Specifically, we ask (1) what equilibrium elevations are feasible, (2) how these equilibrium elevations compare to tides (e.g., relative to mean sea level (MSL) or mean high water (MHW)), and (3) how equilibrium elevation impacts the duration (hydroperiod) and intensity (depth) of a typical inundation cycle. Results show a remarkably robust feedback for most conditions, with the notable exception of low SSCs (< 0.1 g/L). This low, yet realistic, value of SSC represents a tipping point at which the equilibrium elevation drops precipitously. At higher rates of rSLR (> 8 mm/yr) and lower TR (< 2 m), the equilibrium elevation results in complete drowning of the platform.  +
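A minimal sketch of a zero-dimensional mass balance of this kind, assuming deposition scales linearly with inundation depth at high water and using illustrative parameter values (the study's actual formulation and parameters may differ):
<pre>
import numpy as np

TIDES_PER_YEAR = 706      # ~semi-diurnal tides
rslr = 0.005              # relative sea-level rise (m/yr)          [assumed]
tr = 3.0                  # mean tidal range (m)                    [assumed]
ssc = 0.5                 # suspended sediment conc. (kg/m^3)       [assumed]
rho_b = 500.0             # dry bulk density of deposit (kg/m^3)    [assumed]
trap = 0.3                # fraction of delivered sediment retained [assumed]

z, msl = 0.0, 0.0         # platform elevation and mean sea level (m)
for year in range(500):
    msl += rslr
    mhw = msl + tr / 2.0                    # mean high water
    depth = max(mhw - z, 0.0)               # inundation depth at high water
    # Deeper flooding delivers more sediment: the negative feedback
    z += trap * (ssc / rho_b) * depth * TIDES_PER_YEAR

print(f"equilibrium depth below MHW: {mhw - z:.3f} m")
</pre>
In this toy version, the platform settles at a depth below MHW where annual deposition exactly balances rSLR; shrinking SSC or TR, or raising rSLR, deepens that equilibrium until the platform effectively drowns.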
The objective of the clinic is: (1) to introduce the concept of essential terrestrial variables (ETVs) and HydroTerre1 as a continental-scale ETV repository for catchment modeling, and (2) to demonstrate the use of ETVs with the Penn State Integrated Hydrologic Model (PIHM) for simulating the catchment water cycle. PIHM2 is a multi-process, multi-scale hydrologic model in which the hydrologic processes are fully coupled using the semi-discrete finite volume method. PIHMgis3 is an open source, platform-independent, and extensible distributed modeling framework for setting up, executing, and analyzing model simulations. Through the procedural framework of PIHMgis, participants will be introduced to multiple data processing tools and presented with a live demonstration of (i) accessing the HydroTerre ETV service, (ii) the ETV geodata translator for PIHM, (iii) automated ingestion of model parameters from national geospatial databases, (iv) conditional domain decomposition of the watershed into quality triangular mesh elements for numerical simulation, (v) performing multi-state distributed hydrologic model simulations on a desktop, and (vi) visualization of model results as time-series plots and geospatial maps. In the clinic, an application is developed for the small-scale hillslope catchment of the Susquehanna Shale Hills Critical Zone Observatory (SSHCZO), which serves as a guided example of the desktop workflow that can readily be used to develop your own catchment simulation.<br><br>1 http://www.hydroterre.psu.edu/HydroTerre/Help/Ethos.aspx<br><br>2 http://www.pihm.psu.edu/index.html<br><br>3 http://www.pihm.psu.edu/pihmgis_home.html  +
The ocean is an important component of Earth’s climate system and is rapidly changing. Warming, loss of oxygen and sea ice, acidification, and intensifying vertical density stratification critically affect ocean biogeochemistry, including the photosynthetic production of organic matter. The latter supports the entire marine food web and plays a major role in regulating Earth’s climate by sequestering CO2 in the ocean’s interior. Despite the urgent scientific and societal need for quantifying the ocean’s present biogeochemical state and predicting how it is changing, state-of-the-art biogeochemical models are insufficiently validated and poorly constrained by observations. This is primarily due to the insufficient availability of biogeochemical ocean observations, and it is especially problematic because biogeochemical models are not based on first principles, are highly non-linear, and have many poorly known parameters. In this presentation I will illustrate some of these problems and then discuss opportunities that arise from a new global ocean observation initiative referred to as Biogeochemical (BGC) Argo. BGC Argo builds on the highly successful Argo program, which has maintained a global array of almost 4,000 profiling floats that measure physical ocean properties and relay their data in real time. Capitalizing on this success and recent advances in sensor technology, the addition of biogeochemical sensors to Argo floats is now ongoing. By providing a broad suite of observations with unprecedented spatial and temporal coverage, and integrating it into biogeochemical models and data products, the BGC Argo program is likely to transform ocean biogeochemical analysis and prediction. I will present some early examples.  +
The oceans have absorbed a large fraction of anthropogenic carbon dioxide emissions, having consequences for ocean biogeochemistry and ecosystems via ocean acidification. Simulations with Earth System Models can be used to predict the future evolution of ocean carbon uptake and acidification in the coming decades and beyond, but there is substantial uncertainty in these model predictions, particularly on regional scales. Such uncertainty challenges decision makers faced with protecting the future health of ocean ecosystems. Uncertainty can be separated into three component parts: (1) uncertainty due to internal variability, (2) uncertainty due to model structure, and (3) uncertainty due to emission scenario. Here, we isolate and quantify the evolution of these three sources of prediction uncertainty in ocean carbon uptake over the next century using output from two sets of ensembles from the Community Earth System Model (CESM) along with output from models participating in the Fifth Coupled Model Intercomparison Project (CMIP5). We find that the three sources of prediction uncertainty in ocean carbon uptake are not constant, but instead vary with prediction lead time and the scale of spatial averaging. In order to provide valuable predictions to decision makers, we should invest in reducing the main sources of uncertainty.  +
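The three-way partitioning of prediction uncertainty can be illustrated with a Hawkins-and-Sutton-style variance decomposition on a synthetic ensemble; the sketch below is illustrative only and is not the study's analysis code.
<pre>
import numpy as np

rng = np.random.default_rng(42)
# Synthetic carbon-uptake projections shaped (scenario, model, member, year)
n_scen, n_mod, n_mem, n_yr = 3, 10, 5, 90
x = 0.02 * rng.normal(size=(n_scen, n_mod, n_mem, n_yr)).cumsum(axis=-1)
x += np.linspace(0.0, 1.0, n_yr)        # shared forced trend
x += 0.3 * np.linspace(0.0, 1.0, n_yr) * np.arange(n_scen)[:, None, None, None]

internal = x.var(axis=2).mean(axis=(0, 1))       # spread across ensemble members
model = x.mean(axis=2).var(axis=1).mean(axis=0)  # spread across models
scenario = x.mean(axis=(1, 2)).var(axis=0)       # spread across scenarios

frac = np.vstack([internal, model, scenario])
frac /= frac.sum(axis=0)    # fractional contribution at each prediction lead time
</pre>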
The projected increases in the frequency and magnitude of hazards that threaten the coastal hinterland heighten the need for an enhanced understanding of the mechanisms determining mangrove adaptation and their contribution to coastal safety. This research seeks to improve the understanding of the bio-physical processes governing the geomorphological evolution of the mangrove-mudflat system by combining spatially explicit observations of mangrove population dynamics with process-based modelling. Field observations were taken at the Le Ressouvenir-Chateau Margot mangrove-mudflat, within the 300 m wide fringe and on the mudflat extending 6 km offshore, along the Guyana coastline. This coastline resides 1 m below sea level and is subject to a semi-diurnal tidal regime with a maximum tidal range of 3.5 m during spring tide. Using the data collected on elevation, vegetation, water level, flow velocities, sediment concentration and wave heights, we developed a 2D depth-averaged model using a process-based approach. On a high-resolution grid of 10 m, the model predicts the geomorphological development arising from the interaction between the intertidal flow, waves, sediment transport and the temporal and spatial variation in mangrove growth, drag and bio-accumulation. Here, we coupled Delft3D-FM with a mangrove dynamics model capturing the Avicennia germinans and Laguncularia racemosa species under suitable inundation and competition regimes. Waves are critical for the transport of mud into the mangrove belt during high tide. Only when approaching spring tide is the inner part of the fringe inundated, creating a heightened platform which governs species establishment. The channels form the major path for the tidal inflow during the lower tides, while the interior of the forest is an effective sediment sink during the higher tides. Sea level rise scenarios reinforce field observations of mangrove retreat and decay, with tipping points realized after 1.5 m to 2.0 m of rise. Results indicate that mangrove adaptability to climate change and anthropogenic threats hinges on the long-term sedimentation responses and on system conditions that promote the establishment of stable belt widths.  
The properties of hurricanes directly influence storm surges; however, the implications of climate change are unclear. In this work, we use numerical modeling to simulate the storm surges of historical storms under present-day and projected end-of-century climate conditions and assess the impact of climate change on storm surge inundation. We use a convection-permitting regional climate model, WRF, and a high-fidelity storm surge model, ADCIRC, to simulate hurricanes and storm surges that impacted the Gulf of Mexico and Atlantic Coasts of the continental United States from 2000 to 2013. We find that the volume of inundation increases for over half of the simulated storms, and the average change for all storms is +36%. The extent of inundation increases for over half of the simulated storms, and the average change for all storms is +25%. Notable increases in inundation occur near Texas, Mississippi, the Gulf Coast of Florida, the Carolinas, Virginia, and New York. Our calculations of inundation volume and extent suggest that at the end of the century, we can expect hurricanes to produce larger storm surge magnitudes in concentrated areas, as opposed to surges with lower magnitudes that are widespread. This type of modeling has the potential to significantly contribute to urban planning and resilience efforts of coastal communities.  +
The question of ecosystem dynamics is relevant from both a scientific and a management perspective. Knowing the natural tendencies and trajectories of ecosystems will assist in planning for their development and restoration. One key feature is how the ecosystem uses the available energy flows to move further from thermodynamic equilibrium and increase its overall complexity in terms of total biomass, biodiversity, network connectivity, and information. In this presentation, I review some of the main concepts that have been used to identify these dynamic trajectories. Namely, it can be shown using network analysis that a number of ecological goal functions pertaining to energy, exergy, biomass, embodied energy, entropy, and information are complementary, displaying various angles of the same general complexification phenomenon.  +
The spatial distribution of vegetation along the banks and floodplains of a river can drastically affect its geomorphic response to large floods. Plants influence sediment transport dynamics and the resulting patterns of erosion and deposition by steering the flow, changing the scale and intensity of turbulence, and increasing the effective cohesiveness of surface material. Efficiently simulating these interactions over river reaches requires simplifying the small-scale processes into measurable parameters that can reproduce the large-scale behavior of the system.<br/><br/>We present simulations of the evolution of the morphology of vegetated, mobile sand-bed rivers during high flows that were obtained by coupling the existing hydrodynamic model ANUGA with modules for sediment transport and vegetation. This model captures the effects of vegetation on mean flow velocity by treating plant stems as cylinders of specified diameter and spacing and calculating the drag they impart on the flow.<br/><br/>The outputs of this model were tested against a well-constrained natural experiment to determine the accuracy of the model predictions. Multi-temporal airborne lidar datasets capture the topographic change that occurred along a 12-km reach of the Rio Puerco, New Mexico, as a result of a large flood in 2006. The magnitude of deposition on the floodplain was found to correlate with vegetation density as well as distance from the primary sediment source. This relationship is reproduced by the model using only the simplest drag formulation. The local variability in deposit thickness was seen to depend strongly on the dominant species present, suggesting that plant-scale processes are reflected in the patch-scale behavior of the system. This indicates a need for more complex parameters that reflect the changes in turbulent energy and shear stress that result from different plant characteristics.  +
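The "simplest drag formulation" mentioned above treats stems as rigid cylinders, so the drag force per unit volume follows a quadratic drag law on the frontal area per unit volume a = n d. A sketch, with an assumed drag coefficient of order one:
<pre>
import numpy as np

def vegetation_drag(u, stem_diameter, stems_per_m2, cd=1.0, rho=1000.0):
    """Drag force per unit fluid volume (N/m^3) from an array of rigid cylinders.

    Frontal area per unit volume is a = n * d, giving F = 0.5*rho*Cd*a*|u|*u.
    Cd ~ 1 for a cylinder is an assumed value.
    """
    a = stems_per_m2 * stem_diameter
    return 0.5 * rho * cd * a * np.abs(u) * u

# e.g., 0.01 m stems at 50 stems/m^2 in a 1 m/s flood flow
print(vegetation_drag(1.0, 0.01, 50.0))   # -> 250.0 N per m^3 of flow
</pre>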
The theme of this meeting is Modeling Coupled Earth and Human Systems. Since the World3 model and the Limits to Growth report of 1972, there has been a sustained effort toward integrated modeling of human activities and the Earth system. Despite the existence of integrated models, there is an increasing recognition that social science is largely lacking from the modeling efforts. Having worked in both natural science and social science departments, I reflect on the different modeling cultures and the challenges in social science of using simulation models. Building on the work of the CoMSES Net, I also provide some promising examples of agent-based models advancing social science.  +
The uncertainty surrounding the impact of sea-level rise (SLR) and storms, which threaten the coastal hinterland, heightens the need for design guidelines on mangrove adaptation and its use in coastal safety. This research seeks to quantify the bio-physical processes governing the geomorphological evolution of mangrove-mudflat systems by combining spatially explicit observations of mangrove population dynamics with process-based modelling. For calibration purposes, and for increased insight into interactions between hydrodynamics, sediment dynamics and mangroves, field observations were collected along the Guyana coast. Results indicate that mangrove adaptability hinges on the long-term sedimentation responses and on system conditions that promote the establishment of belt widths exceeding 300 m.  +
The variability of the water cycle causes extremes such as droughts and floods, and these have an impact on society. In the past two decades, with the advent of improved satellite sensors, modeling and in-situ observations, quantification of these extremes has become possible. This talk will focus on the characterization of floods and droughts in global continental river basins, which can be used for monitoring and can lead to prediction of their spatial distribution and temporal variability.  +
Theories have been proposed using idealized tracer age modeling for ocean ventilation, atmospheric circulation, soil, stream and groundwater flow. In this research we are developing new models for the dynamic age of water in hydroecological systems. Approaches generally assume a steady flow regime and stationarity in the concentration (tracer) distribution function for age, although recent work shows that this is not a necessary assumption. In this paper, a dynamic model for flow, concentration, and age for soil water is presented, including the effect of macropore behavior on the relative age of recharge and transpired water. Several theoretical and practical issues are presented, including some new results for the Shale Hills CZO (G. Bhatt, 2012).  +
There are few regions of the Earth that change more rapidly and consistently than the coastal zone. Despite this transience and its susceptibility to hazards, the coast continues to attract humans and development. Additionally, coastal deposits can hold important information about environmental changes in Earth's history, such as variations in relative sea level, sediment supply, or tectonics. Accordingly, deeper knowledge of the formative and destructive processes operating at the shore is of both scientific interest and societal importance. In this presentation, I will introduce a moving-boundary framework aimed to advance our quantitative understanding of the key processes that drive the evolution of low-lying coastal landscapes such as barrier islands, fluvial deltas, and continental shelves. I will also provide examples of how this mathematical framework can be applied at both field and laboratory scales.  +
There are many recent additions to Python that make it an excellent programming language for data analysis. This tutorial has two goals. First, we introduce several of the recent Python modules for data analysis. We provide hands-on exercises for manipulating and analyzing data using pandas and scikit-learn. Second, we execute examples using the Jupyter notebook, a web-based interactive development environment that facilitates documentation, sharing, and remote execution. Together these tools create a powerful, new way to approach scientific workflows for data analysis.  +
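A small example of the kind of workflow the tutorial covers, combining pandas for data wrangling with scikit-learn for a simple fit; the file name and column names are hypothetical.
<pre>
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical experiment log with one row per measurement
df = pd.read_csv("flume_runs.csv")
df = df.dropna(subset=["discharge", "sediment_flux"])
runs = df.groupby("run_id")[["discharge", "sediment_flux"]].mean()

# Fit a simple rating between discharge and sediment flux
model = LinearRegression().fit(runs[["discharge"]], runs["sediment_flux"])
print(f"slope={model.coef_[0]:.3f}, intercept={model.intercept_:.3f}")
</pre>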
There are various visions of our future, but most policy-makers and scientists agree that life will be substantially different in the post-fossil era. The cheap and abundant supply of fossil energy has led to unprecedented population growth and to staggering levels of consumption of natural resources, undermining the carrying capacity of nature. Eroding ecosystems, the end of cheap oil and climate change call for new policies to support societal transformations toward low-carbon alternative futures. This understanding has already been expressed in recent EU legislation, which requires that domestic GHG emissions be cut by 80% between 1990 and 2050. Energy is a major driver of change and an important ‘currency’ that runs economic and social systems and influences environmental systems. Being so used to the abundant and uninterrupted supply of fossil energy, we tend to forget the important role that it plays in our everyday lives. Non-marginal, abrupt changes, such as during the Oil Crisis of the 1970s or the sudden sharp rise in oil prices in 2008, remind us how vulnerable societies are with respect to energy. Future transitions and climate-induced changes are also unlikely to be smooth and require new modeling paradigms and methods that can handle step-change dynamics and work across a wide range of spatio-temporal scales, integrating the knowledge of many stakeholder communities.<br/><br/>Here we are operating in a generalized ‘socio-environmental model space’, which includes empirical models, conceptual stakeholder models, complex computer simulations, and data sets, and which can be characterized in several dimensions, such as model complexity, spatial and temporal resolution, disciplinary coverage, bias and focus, sensitivity and uncertainty, usability and relevance. In this space we need a ‘model calculus’ – a set of relationships and operations that can apply to individual models and groups of models. Model integration across disciplinary boundaries faces two big challenges. First, we need to learn to deal with a variety of modeling paradigms and techniques, allowing different types of models to exchange information in a meaningful way (agent-based models talk to system dynamics models, to computable general equilibrium models, to empirical models, etc.). Secondly, we need to provide integration techniques and tools that bring the qualitative, conceptual, mental models of stakeholders together with the quantitative simulation models.<br/><br/>Greater transparency and accessibility can be achieved through enhanced documentation and communication of model functioning and of the strengths and limitations of various models and approaches. This extensive model documentation, following improved and enhanced metamodel standards, is an important first step that makes sure that models (both qualitative and conceptual) ‘talk the same language’ and can exchange information and knowledge at various stages of research. This also helps us create the ontology, which can be further used for computer-aided semantic mediation of models. This semantic mediation should include such functionality as consistency checks (checking for units, concepts, spatio-temporal resolution, etc.). This should also help to explore the different models along the complexity continuum to understand how information from more aggregated qualitative models can be transmitted to more elaborate and detailed quantitative simulations, and vice versa. 
This holds the promise of insight into the complex behavior of non-linear systems, where regime shifts and non-equilibrium dynamics are usually better understood with simple models, while the more complicated models are easier to parametrize with data and can take into account more detailed information about particular systems and situations.  
There is global recognition of the need to push forward mangrove restoration and conservation for climate mitigation and adaptation. Unfortunately, although our understanding of mangrove processes has significantly improved, 80-90% of reported restoration projects have experienced failures. The main reasons are related to a poor understanding of the eco-geomorphological dynamics and of mangrove species-specific ecological requirements. Mangrove restoration guidelines exist; however, they may be site-specific and cannot be easily replicated in other restoration cases. This emphasizes the need for a system understanding of mangrove ecosystem physical and ecological interactions. We developed a hybrid model by coupling the process-based hydro-morphodynamic model Delft3D-FM (DFM) and the individual-based mangrove model MesoFON (MFON). The coupled model (DFMFON) allows us to resolve spatiotemporal processes, including tidal, seasonal, and decadal environmental changes, with full-life-cycle mangrove interactions. The DFMFON model successfully reproduced observed spatiotemporal (seasonal to decadal) mangrove development, such as the age-height relationship, as well as morphodynamic delta features in the prograding Porong Delta, Indonesia.  +
This clinic aims to introduce the open source computational fluid dynamics (CFD) platform OpenFOAM® to the earth surface dynamics research community and to foster collaborations. OpenFOAM® is essentially a computational toolbox which solves general physical models (differential equations) using the finite volume method. This short clinic is tailored to an audience at various levels (from beginners to experienced code developers). It will provide an overview of OpenFOAM®. We will demonstrate its usage in a variety of applications, including hydrodynamics, sedimentation, groundwater flows, buoyant plumes, etc. Participants can also bring problems from their fields of interest and explore ways to solve them in OpenFOAM®. Knowledge of C++, object-oriented programming, and parallel computing is not required but will be helpful.  +
This clinic explores how to fully engage with the Landlab library by creating your own components. It is designed for those who already have some basic familiarity with Landlab and with scientific Python programming (registration for the “Introduction to Landlab” clinic is recommended for those who have not already learned the basics). In this workshop we will cover an overview of object-oriented programming (OOP) and will look at several examples of existing Landlab components to understand how they are written in an OOP framework. We will write a demo component as a group, and will then move on to writing our own components in small groups. Participants should come prepared with an idea of a process model they’d like to implement as a component. It is recommended, but not required, that participants in this workshop also register for the clinic “The Art of Modeling: From Concept to Math with Mass Balance,” in order to be equipped with an understanding of the math that will form the basis of their Landlab component. This workshop will involve coding in Python using the CSDMS JupyterHub. If you don't already have an account, follow the instructions to sign up at: https://csdms.colorado.edu/wiki/JupyterHub.  +
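For orientation, a Landlab component is a Python class that subclasses landlab.Component, declares the grid fields it uses, and exposes a run_one_step method. The schematic skeleton below follows that pattern for a toy hillslope diffuser; the exact field-metadata conventions should be checked against the Landlab development guide.
<pre>
import numpy as np
from landlab import Component, RasterModelGrid


class DiffuserSketch(Component):
    """Schematic hillslope-diffusion component (illustration only)."""

    _name = "DiffuserSketch"
    _unit_agnostic = True
    _info = {
        "topographic__elevation": {
            "dtype": float, "intent": "inout", "optional": False,
            "units": "m", "mapping": "node", "doc": "Land surface elevation",
        },
    }

    def __init__(self, grid, diffusivity=0.01):
        super().__init__(grid)
        self._kd = diffusivity
        self._z = grid.at_node["topographic__elevation"]

    def run_one_step(self, dt):
        grad = self._grid.calc_grad_at_link(self._z)      # slope on links
        flux = -self._kd * grad                           # diffusive flux
        dzdt = -self._grid.calc_flux_div_at_node(flux)    # flux divergence
        self._z[self._grid.core_nodes] += dzdt[self._grid.core_nodes] * dt


grid = RasterModelGrid((10, 10))
grid.add_zeros("topographic__elevation", at="node")
DiffuserSketch(grid).run_one_step(100.0)
</pre>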
This clinic is intended for early career researchers interested in gaining an understanding of basic integrated modeling concepts as they relate to modeling earth science systems. The class will present key literature in the field, core concepts and terminology, and different integrated modeling systems. Past, present, and future trends for designing integrated modeling systems will be discussed. Participants will also gain experience applying integrated modeling concepts using CSDMS for simplified integrated modeling examples.  +
This clinic provides a brief tutorial introduction to the theory and implementation of Landlab for landscape evolution modeling. Topics include grid representation, working with data fields, and using Landlab components to create new integrated models. This clinic is intended for beginners with little to no experience using the Landlab library. Prior experience with Python programming is helpful but not necessary.  +
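To give a flavor of what the clinic covers, here is a minimal Landlab script that builds a grid, attaches a data field, and runs one existing component; the parameter values are illustrative only.
<pre>
import numpy as np
from landlab import RasterModelGrid
from landlab.components import LinearDiffuser

grid = RasterModelGrid((20, 30), xy_spacing=10.0)        # 20 x 30 nodes, 10 m apart
z = grid.add_zeros("topographic__elevation", at="node")  # a data field on nodes
z += np.random.rand(grid.number_of_nodes)                # small initial roughness

diffuser = LinearDiffuser(grid, linear_diffusivity=0.01) # m^2/yr (assumed value)
for _ in range(100):
    diffuser.run_one_step(1000.0)                        # advance 1000 yr per step
</pre>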
This clinic provides a brief tutorial introduction to the theory and implementation of Landlab for landscape evolution modeling. Topics include grid representation, working with data fields, and using Landlab components to create new integrated models. This clinic is intended for beginners with little to no experience using the Landlab library. Prior experience with Python programming is helpful but not necessary.  +
This clinic provides a brief tutorial introduction to the theory and implementation of Landscape Evolution Modeling. Participants will have the opportunity to work with simple models in the TerrainBento package, which provides a set of models that are built on the Landlab library. Topics include grid representation, working with data fields, and using Landlab Components to create new integrated models.

'''To get to the clinic materials:'''
* To load the notebooks in your Hub, go to our GitHub page: https://github.com/csdms/csdms2021_landlab_terrainbento
* On that page, click the link under the agenda.
* Go to notebooks > landlab > Landlab_grids_csdms2021.ipynb for the Landlab clinic.
* Go to notebooks > landlab-terrainbento > Welcome_to_TerrainBento.ipynb for the terrainbento clinic. Note that for the terrainbento clinic, you will have to activate the TTBB kernel.

There is even more material in the folder to practice your skills further.<br> This clinic runs on the CSDMS JupyterHub. If you don't already have an account, follow the instructions to sign up at: https://csdms.colorado.edu/wiki/JupyterHub. Run the lab Notebook by clicking the "start" link under the Run online heading at the top of this page. For more information, please contact us through the CSDMS Help Desk: https://csdms.github.io/help-desk.  +
This clinic will introduce CUAHSI products that support hydrologic modeling and analysis workflows. The HydroShare and JupyterHub platforms will be used to demonstrate the cyberinfrastructure capabilities that have been designed for community modeling and educational purposes. Participants should have a working knowledge of Python and should bring a laptop.  +
This clinic will introduce and demonstrate a powerful, flexible approach used by the Pangeo community for working with large model output, which works effectively on a range of computing systems including local machines, HPC facilities and the Cloud. Xarray is used for working with CF-compliant model output, Dask for parallelization, and Holoviz for interactive visualization in the browser. Rechunking data to improve performance for a variety of analysis use cases will also be covered.  +
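A small sketch of the pattern the clinic demonstrates: opening model output lazily with Dask-backed chunks, rechunking for a time-series use case, and triggering the parallel computation. The file name, variable name, and chunk sizes below are hypothetical.
<pre>
import xarray as xr

# Open CF-compliant model output lazily, backed by Dask chunks
ds = xr.open_dataset("model_output.nc", chunks={"time": 24})

# Time-series analyses favor long chunks in time and smaller chunks in space
temp = ds["temperature"].chunk({"time": -1, "lat": 32, "lon": 32})
monthly = temp.groupby("time.month").mean("time")

result = monthly.compute()    # executes in parallel across the Dask chunks
</pre>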
This clinic will introduce deep learning methods for semantic segmentation of fluvial sedimentary landforms and riparian environments, using high-resolution aerial imagery. Deep neural networks are the current state-of-the-art for discrete classification of remotely sensed imagery from Earth observation platforms. The clinic will guide users through the process of preparing training datasets, training models, and evaluation. A number of different deep convolutional neural network architectures for image feature extraction and pixel-scale classifications will be explored and compared. The clinic will use the keras and tensorflow libraries within the python programming language. This hands-on class will be taught using Google colab through a browser, with the materials hosted on github. Participants will require a working knowledge of python. Some working knowledge of machine learning would be helpful, but we will assume no prior experience with machine/deep learning, neural networks, tensorflow, or keras.  +
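As a taste of the material, here is a minimal fully-convolutional encoder-decoder in Keras that maps image tiles to per-pixel class labels. The clinic explores and compares several real architectures; the tiny network, tile size, and class count below are illustrative assumptions only.
<pre>
import tensorflow as tf
from tensorflow.keras import layers

n_classes = 4                                # e.g., water, sediment, vegetation, other
inputs = layers.Input(shape=(256, 256, 3))   # RGB tiles (size is an assumption)
x = layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)
x = layers.MaxPooling2D()(x)                 # encoder: downsample, extract features
x = layers.Conv2D(64, 3, padding="same", activation="relu")(x)
x = layers.UpSampling2D()(x)                 # decoder: restore full resolution
outputs = layers.Conv2D(n_classes, 1, activation="softmax")(x)  # per-pixel labels

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# model.fit(image_tiles, integer_label_masks, ...)
</pre>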
This clinic will introduce participants to GRASS GIS tools with a focus on applications for coastal hazards analysis including flooding and coastal evolution. We will explain and practice GRASS GIS data management and concepts, and demonstrate them on examples of efficient LiDAR point cloud, raster, and vector data processing. The clinic will begin with a brief introduction to the GRASS GIS software and continue with a hands-on tutorial exploring coastal evolution through a LiDAR timeseries of Bald Head Island in North Carolina, USA. Finally, we will explore some of the inundation and flood modeling tools available in GRASS GIS. The tutorial will be formatted in a series of Jupyter Notebooks executed in a cloud-based (or locally installed) JupyterLab environment, taking advantage of the latest GRASS GIS Python features for Jupyter, including 2D, 3D, webmap, and temporal visualizations. By the end of the clinic, participants will have hands-on experience with: <ul> <li>Setting up GRASS GIS Projects and importing data</li> <li>Creating high-quality DEMs from LiDAR point clouds and computing topographic parameters</li> <li>Deriving shorelines from the DEMs</li> <li>Animating changes in topography over time and computing erosion rates</li> <li>Generating simplified storm surge inundation timeseries</li> </ul>  +
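A sketch of the kind of scripted workflow the notebooks use; the session path, data names, and contour level are placeholders, and the exact session-setup call varies between GRASS GIS versions.
<pre>
import grass.jupyter as gj
import grass.script as gs

# Start a GRASS session (the path is a placeholder for your own setup)
session = gj.init("~/grassdata/coastal_project/PERMANENT")

# Bin a LiDAR point cloud into a 1 m DEM, then extract a shoreline contour
gs.run_command("r.in.pdal", input="bhi_lidar.laz", output="dem",
               resolution=1, method="mean")
gs.run_command("r.contour", input="dem", output="shoreline", levels=0.8)

m = gj.Map()            # inline 2D map display in the notebook
m.d_rast(map="dem")
m.d_vect(map="shoreline")
m.show()
</pre>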
This clinic will look at the CSDMS Modeling Tool (CMT). We will share the philosophy behind CMT, demo its functionality, and show which models are incorporated into it. New educational material on several models allows scientists and students to more easily use CSDMS models for classes and simple simulations, and we will provide clinic participants with the latest information on these resources. The CMT clinic will be hands-on: we will perform a few simple runs and visualize them. Finally, we will spend some time discussing common problems and strategic solutions.  +
This clinic will offer you an introduction to developing food web models using Ecopath with Ecosim software. Ecopath with Ecosim (EwE) is an ecological modeling software suite for personal computers that has been developed and extended for almost thirty years. EwE is the first ecosystem-level simulation model to be widely and freely accessible. EwE is the most applied tool for modeling marine and aquatic ecosystems globally, with over 400 models published to date, making EwE an important modeling approach for exploring ecosystem-related questions in marine science. In addition, Ecopath software was recognized as one of NOAA’s top ten scientific breakthroughs in the last 200 years. In this clinic, we will start with a brief introduction, then download the freeware and start setting up some simple models which we will use in example exercises. Note: the software works in a Windows environment; Mac computers can be used if they are set up with Parallels Desktop or a similar application to run programs in a Windows environment on a Mac.  +
This clinic will offer you an introduction to developing food web models using Ecopath with Ecosim software. Ecopath with Ecosim (EwE) is an ecological modeling software suite for personal computers that has been developed and extended for over thirty-five years. EwE is the first ecosystem-level simulation model to be widely and freely accessible. EwE is the most applied tool for modeling marine and aquatic ecosystems globally, with over 400 models published to date, making EwE an important modeling approach for exploring ecosystem-related questions in marine science. In addition, Ecopath software was recognized as one of NOAA’s top ten scientific breakthroughs in the last 200 years. In this clinic, we will start with a brief introduction, then download the freeware and start setting up some simple models which we will use in example exercises. Note: the software works in a Windows environment; Mac computers can be used if they are set up with Parallels Desktop or a similar application to run programs in a Windows environment on a Mac.  +
This clinic will provide an introduction to the MATLAB-based geodynamic modeling code SiStER (Simple Stokes solver with Exotic Rheologies, available at: https://csdms.colorado.edu/wiki/Model:SiStER), with particular emphasis on problems that couple solid-Earth deformation and surface processes. Attendees will develop and run simulations where fault evolution (in rifts or orogens), lithospheric flexure and/or mantle flow interact with surficial mass redistribution through erosion and sedimentation.  +
This clinic will provide information on how laboratory-scale and field-scale flows can be simulated by direct numerical simulations (DNS) and large-eddy simulations (LES) using parallel, high-performance computing facilities. DNS results from the software TURBINS for gravity and turbidity currents propagating over complex seafloor topography will be discussed. The use of the PETSc software package within the DNS simulations will be highlighted. LES results for high Reynolds number gravity and turbidity currents, and for reversing buoyancy currents over a flat topography, will be discussed. Issues relevant to LES such as grid resolution, grid convergence, subgrid models and wall-layer modeling will also be discussed.  +
This experiential clinic will introduce personal and social components of research communication and have attendees incorporate them into their own work, from design stages to presentations. Participants will be encouraged at the start to try some pre-conference exercises related to professional networking and mentoring that will be examined and discussed during a reflective exercise near the end of the workshop. Comparisons will be made between conventional samples from publications that explain science motivations and the motivations participants might communicate after reflecting on the social relevance of their theoretical or modeling research. Workshop exercises will involve participants making slides, infographics, and workflows, and/or writing text, practicing weaving together the elements of their science with personal and social relevance. Feedback will be exchanged with other participants. The goals are to build confidence amongst participants in sharing aspects of human interest in their science and to build skills they can practice in the future in their collaborations, research design, and presentations. An objective is to use self- and social-awareness skills when developing their science themes and to notice differences compared with when these skills are not used. The possibilities are endless. All career stages are welcome and are expected to benefit from inter-career-stage and inter-subfield communications.  +
This hands-on virtual clinic will go over good practices for scientific software development to help you develop and publish FAIR (Findable, Accessible, Interoperable, and Reusable) scientific software. We will cover basic principles and examples from the field and then dive into common collaboration workflows in Git and GitHub that facilitate comprehension and reuse of your codebases. We will engage in live-coding exercises with test repositories on GitHub and help you develop a clear conceptual model of how Git works and how to keep a codebase commit history clean and comprehensible with branches, merging / rebasing, and pull requests.  +
This interactive clinic will provide attendees with the opportunity to learn and practice some key concepts for communicating technical knowledge to a range of audiences, from the general public to decision-makers. We will explore effective communication methods, messaging, and platforms, including social media and working with the press. This clinic will also provide attendees with the opportunity to workshop ideas for designing more impactful broader impacts or engagement programs. Attendees will leave with refined skills and useful resources for informing their science communication goals. This workshop is suitable for all skill and interest levels, and all career stages. The only requirement is an interest in interacting with your peers, sharing your unique perspective and experiences, and a willingness to support other attendees in building or honing their science communication skills.  +
This is a diversity panel discussion at the CSDMS 2019 annual meeting  +
This model of the subglacial drainage system simulates the pressurised flow of water at the ice-bed interface of glaciers and ice sheets. It includes both distributed and channelized water flow. Notably, the model determines the geometry of the channel network as part of the solution. The resulting channel network is similar to subaerial stream networks, with channels carving out hydraulic potential "valleys". However, there are some pronounced differences from subaerial drainage: for example, the time for a network to form (and decay) is on the order of weeks to months, and channels originating at point sources can lie on ridges of the hydraulic potential. The model employs a novel finite element approach to solve the parabolic equations for the hydraulic potential simultaneously on the 1D channel network and the 2D distributed system.  +
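As a rough, generic illustration of the kind of parabolic problem involved (not the model's actual equations or numerics, which use finite elements on a coupled 1D/2D domain), here is an explicit solve for a hydraulic-potential-like variable on a single 1D channel segment, with all values illustrative.
<pre>
import numpy as np

# d(phi)/dt = d/ds( D d(phi)/ds ) + m  on a 1D segment (explicit, schematic)
n, ds, dt = 101, 100.0, 3600.0     # nodes, spacing (m), time step (s)
D, m = 1.0e-2, 1.0e-6              # diffusivity-like coefficient, source term
phi = np.zeros(n)                  # hydraulic potential anomaly

for _ in range(10_000):
    flux = -D * np.diff(phi) / ds                   # flux on segments
    phi[1:-1] -= dt * (np.diff(flux) / ds - m)      # divergence plus source
    phi[0] = phi[-1] = 0.0                          # fixed-potential ends
</pre>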
This presentation provides an overview of two important concepts in natural hazards: social vulnerability and community resilience. Conceptually, vulnerability and resilience are related, but they are not opposite extensions of one another. Instead they are driven by different questions: 1) what circumstances create the social burdens of risk and how do these affect the distribution of risks and losses (i.e., vulnerability); and 2) what enhances or reduces the ability of communities to prepare for, respond to, recover from, successfully adapt to, or anticipate hazard threats, and how does this vary geographically (i.e., resilience). In order to provide the scientific basis for hazard reduction policies and practices, measurement schemes for social vulnerability and community resilience are required. This presentation reviews an existing tool for measuring social vulnerability, the Social Vulnerability Index or SoVI®, which is widely used in the USA in both hazard mitigation planning and disaster recovery. Emerging metrics for monitoring community resilience are also described, beginning with the Baseline Resilience Indicators for Communities (BRIC) Index. The spatial patterning and temporal variability of the indices, as well as the importance of scale, are described. Practical examples of how BRIC and SoVI have been used in the USA by emergency managers and in hazards (spatial) planning are illustrated.  +
This presentation was part of a mini virtual workshop around coupling of Agent Based Models (ABM) and Grid Based Models, and shows how relatively easy it is to couple Grid Based Models with ABMs. Demonstrated notebooks can be found at: https://github.com/gregtucker/abm-landlab-mini-workshop  +
This presentation was part of a mini virtual workshop around coupling of Agent Based Models (ABM) and Grid Based Models, and shows how relatively easy it is to couple Grid Based Models with ABMs. Demonstrated notebooks can be found at: https://github.com/gregtucker/abm-landlab-mini-workshop  +
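The generic coupling pattern behind these demos can be reduced to agents that read from and write to a gridded field each time step. The toy below (a resource field grazed by random-walking agents) illustrates the pattern only; it is not taken from the workshop notebooks linked above.
<pre>
import numpy as np

rng = np.random.default_rng(7)
NROWS = NCOLS = 20
grass = np.full((NROWS, NCOLS), 5.0)              # grid-based resource field
agents = [{"r": 10, "c": 10} for _ in range(5)]   # agent-based grazers

for step in range(200):
    grass = np.minimum(grass + 0.1, 10.0)         # grid process: capped regrowth
    for ag in agents:                             # agent process: walk and graze
        ag["r"] = (ag["r"] + rng.integers(-1, 2)) % NROWS
        ag["c"] = (ag["c"] + rng.integers(-1, 2)) % NCOLS
        grass[ag["r"], ag["c"]] = max(grass[ag["r"], ag["c"]] - 1.0, 0.0)
</pre>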
This presentation will briefly introduce the formulation, numerics, and parallel implementation of the coastal circulation model ADCIRC, discuss the strategy of coupling with the SWAN wave model, and provide background on recent enhancements of the bottom-friction formulation. Several recent applications of the coupled modeling system will be presented.  +
This tutorial introduces Xarray, a Python library that provides (1) data structures for multi-dimensional labeled arrays and (2) a toolkit for scalable data analysis on large, complex datasets using Dask, which extends the SciPy ecosystem (e.g., NumPy, Pandas, Scikit-Learn) to larger-than-memory or distributed environments. Attendees should be comfortable with basic Python programming (e.g., data structures, functions, etc.). Some prior exposure to Python data science libraries (e.g., NumPy, Pandas) is helpful. No specific domain knowledge is required to effectively participate in this tutorial.  +
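A minimal example of the labeled-array model the tutorial builds on: named dimensions, coordinate-based selection, and reductions by dimension name. The data here are synthetic.
<pre>
import numpy as np
import pandas as pd
import xarray as xr

# Dimensions carry names; axes carry coordinate labels
sst = xr.DataArray(
    np.random.rand(365, 4),
    dims=("time", "station"),
    coords={"time": pd.date_range("2020-01-01", periods=365),
            "station": ["A", "B", "C", "D"]},
    name="sst",
)

# Select by label rather than positional index, reduce by named dimension
summer_mean = sst.sel(time=slice("2020-06-01", "2020-08-31")).mean("time")
monthly = sst.resample(time="MS").mean()    # calendar-aware resampling
</pre>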
This two-part clinic will introduce deep learning methods for semantic segmentation of high-resolution aerial imagery for the purposes of landuse/cover/form classification. The datasets we will use consist of images of shoreline environments, with a focus on general-purpose classification in terrestrial, fluvial and coastal ecology and geomorphology.<br><br>Deep neural networks are the current state-of-the-art for discrete classification of remotely sensed imagery from Earth observation platforms. The clinic will guide users through the process of preparing training datasets, training models, and evaluation. A number of different deep convolutional neural network architectures for image feature extraction and pixel-scale classifications will be explored and compared. The clinic will use the keras and tensorflow libraries within the python programming language. This hands-on class will be taught using Google colab through a browser, with the materials hosted on github. Participants will require a working knowledge of python. Some working knowledge of machine learning would be helpful, but we will assume no prior experience with machine/deep learning, neural networks, tensorflow, or keras.<br><br>Both the concepts and specific software would apply to many similar classification tasks at landscape scales. This clinic is composed of two, 2-hr sessions. You should sign up to both; the first clinic introduces the topic, data, and technology we use to solve the problem, and the second clinic implements these ideas and evaluates the results.

'''Clinic materials can be found at:'''
* https://mardascience.gitlab.io/deep_learning_landscape_classification
* https://colab.research.google.com/drive/1krjeCmKoAng0BWy-4mzHVX-eAqQ9qy22?usp=sharing
* https://colab.research.google.com/drive/1_ddXkrZCRne7qJ2RXHV5l3qOnk98KIyp?usp=sharing  +
This two-part clinic will introduce deep learning methods for semantic segmentation of high-resolution aerial imagery for the purposes of landuse/cover/form classification. The datasets we will use consist of images of shoreline environments, with a focus on general-purpose classification in terrestrial, fluvial and coastal ecology and geomorphology.<br><br>Deep neural networks are the current state-of-the-art for discrete classification of remotely sensed imagery from Earth observation platforms. The clinic will guide users through the process of preparing training datasets, training models, and evaluation. A number of different deep convolutional neural network architectures for image feature extraction and pixel-scale classifications will be explored and compared. The clinic will use the keras and tensorflow libraries within the python programming language. This hands-on class will be taught using Google colab through a browser, with the materials hosted on github. Participants will require a working knowledge of python. Some working knowledge of machine learning would be helpful, but we will assume no prior experience with machine/deep learning, neural networks, tensorflow, or keras.<br><br>Both the concepts and specific software would apply to many similar classification tasks at landscape scales. This clinic is composed of two, 2-hr sessions. You should sign up to both; the first clinic introduces the topic, data, and technology we use to solve the problem, and the second clinic implements these ideas and evaluates the results.

'''Clinic materials can be found at:'''
* https://mardascience.gitlab.io/deep_learning_landscape_classification
* https://colab.research.google.com/drive/1krjeCmKoAng0BWy-4mzHVX-eAqQ9qy22?usp=sharing
* https://colab.research.google.com/drive/1_ddXkrZCRne7qJ2RXHV5l3qOnk98KIyp?usp=sharing  +
This webinar presents an overview of Landlab 2.0, a Python programming toolkit for rapidly building and exploring numerical models of various Earth-surface processes. We’ll look at how to set up a numerical grid in just a few lines of code, and how to populate your grid with fields of data. We will also take a look at some of Landlab’s numerical functions, input-output utilities, and plotting routines. Finally, we will explore Landlab components: what they are, how to assemble them into integrated numerical models, and how to create new ones. Examples include surface-water hydrology, landscape evolution, tidal-marsh flow, and lithosphere flexure, among others.  +
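As a sense of the workflow, the following minimal sketch uses the documented Landlab 2.x API to build a grid, attach a field, and run one component (the grid size, diffusivity, and time step are arbitrary illustration values):

<syntaxhighlight lang="python">
# Build a raster grid, attach an elevation field, and run a Landlab component.
import numpy as np
from landlab import RasterModelGrid
from landlab.components import LinearDiffuser

grid = RasterModelGrid((40, 60), xy_spacing=10.0)        # 40 x 60 nodes, 10 m spacing
z = grid.add_zeros("topographic__elevation", at="node")  # data field attached to grid nodes
z += np.random.rand(grid.number_of_nodes)                # small random initial relief

diffuser = LinearDiffuser(grid, linear_diffusivity=0.01)  # hillslope diffusion, m^2/yr
for _ in range(1000):
    diffuser.run_one_step(dt=100.0)                       # advance 100 yr per step
</syntaxhighlight>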
This webinar presents an overview of the Landlab Toolkit: a Python package that makes it much easier to create two-dimensional grid-based models of various earth-surface processes. The webinar will provide a basic overview of Landlab, and illustrate some of its key capabilities in creating grids and working with modular "process components". The webinar will also present some example applications of Landlab for model-building, and provide pointers to tutorials, user guides, and other resources for those who wish to learn more.  +
This webinar will describe the efforts of our NSF-funded Research Coordination Network '''Building capacity to deepen the critical zone: expanding boundaries and exploring gradients through data-model synergy.''' Our mission is to enhance the diversity of participants and ideas in the critical zone (CZ) community, integrating scientists with broad interests in biology, hydrology, geology, atmospheric science, and computational sciences with the scientific goal of understanding the structure and evolution of the deep CZ through data-model integration across scales, and an equally important outreach goal of increasing diversity and inclusion in the earth sciences. We will hold a series of small conferences, workshops and webinars, focusing on such themes as the co-evolution of the land surface and the CZ “base”, scaling up local observations to global models, the response of the CZ structure to the Anthropocene, and the emerging tools for measuring such processes, among others.<br> This first webinar will give a broad introduction to our RCN and describe opportunities to partner with us if you are developing a proposal for the upcoming NSF CZ Collaboration Network call for proposals (NSF 19-586). We hope you consider working with us as part of your Broader Impacts should your theme be scientifically relevant to our mission.  +
This workshop introduces Swiftscape, a CPU/GPU-hybrid landscape evolution library with C++ and Python interfaces that can run hundreds of times faster than previous models. Participants will gain hands-on experience in both interfaces, offering flexibility and accessibility for diverse applications. Special focus will be given to the model's ability to run many simulations in parallel as well as its utility for solving inverse problems.  +
This workshop will showcase three different models of carbonate sedimentation, produced under the CSDMS umbrella: carboCat for facies, carboCell for guilds, carboPop for communities. Participants will be able to download and run (on their own or on provided machines) these models in Python and Matlab environments, discuss how to select appropriate parameters for them using the various databases being developed in concert with the models, and contribute to plans for further development of the models and databases.  +
Though sediment resuspension and redeposition enhance the exchange of porewater and solids with the overlying water, their role in the biogeochemistry of coastal systems is debated. Numerical models of geochemical processes and diagenesis have traditionally parameterized relatively long timescales, and rarely attempted to include resuspension. Meanwhile, numerical models developed to represent sediment transport have largely ignored geochemistry. Here, we couple the Community Sediment Transport Modeling System (CSTMS) to a biogeochemical model within the Regional Ocean Modeling System (ROMS). The multi-layered sediment bed model accounts for erosion, deposition, and biodiffusion. It has recently been modified to include dissolved porewater constituents, particulate organic matter, and geochemical reactions.<br><br>For this talk, we explore the role that resuspension and redeposition play in biogeochemical cycles within the seabed and in the benthic boundary layer by running idealized, one-dimensional test cases designed to represent a 20-m-deep site on the Louisiana Shelf. Results are contrasted with calculations from an implementation similar to a standard diagenesis model. The comparison indicates that resuspension acts to enhance sediment bed oxygen consumption.  +
TopoFlow is a plug-and-play, spatial hydrologic model distributed as an open-source Python package. The current version includes numerous hydrologic process components (all BMI-compliant), an extensive set of utilities for data preparation, river network delineation, visualization and basic calibration, the EMELI model coupling framework, sample data and a set of Jupyter notebooks for learning about the capabilities. The total package consists of around 90,000 lines of efficient code that uses NumPy and runs in Python 3.*. In this clinic, we will first cover some background information, install the package and then work through several Jupyter notebooks to explore the functionality.  +
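Because the process components are BMI-compliant, driving one follows the standard BMI control pattern. The sketch below illustrates that pattern; the import path, configuration filename, and variable name are assumptions for illustration, not TopoFlow's documented names:

<syntaxhighlight lang="python">
# Generic BMI control loop for a TopoFlow-style component (names are illustrative).
from topoflow.components.snow_degree_day import snow_component  # hypothetical import path

comp = snow_component()
comp.initialize("my_basin.cfg")                  # BMI: read config, allocate state
while comp.get_current_time() < comp.get_end_time():
    comp.update()                                # BMI: advance one time step
swe = comp.get_value("snowpack__liquid-equivalent_depth")  # hypothetical variable name
comp.finalize()                                  # BMI: clean up
</syntaxhighlight>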
TopoFlow is a spatially distributed hydrologic model that includes meteorology, snow melt, evapotranspiration, infiltration and flow routing components. It can model many different physical processes in a watershed with the goal of accurately predicting how various hydrologic variables will evolve in time in response to climatic forcings. In the past year, CSDMS IF staff integrated TopoFlow into the CSDMS Web Modeling Tool (WMT, https://csdms.colorado.edu/wmt) and developed new lesson plans for use with it.<br><br>The first part of this clinic focuses on the technical aspects of working with TopoFlow in WMT, including how to: load and couple components, get information on a component, set parameters, upload data files, save a model, and run a model. We’ll discuss features of the TopoFlow implementation in WMT, and explain choices that were made in bringing TopoFlow to the web.<br><br>In the second part of the clinic, we’ll focus on science and education. We will run several TopoFlow simulations on the CSDMS HPCC through WMT. Participants will explore parameter settings, submit runs, and view netCDF output using NASA’s Panoply tool.<br><br>The learning outcomes of this clinic are better insight into the behavior of TopoFlow components and their implementation in WMT. Participants will learn how to carry out TopoFlow model runs, and will have access to TopoFlow online labs, lesson plans, and other teaching resources.  +
Tsunami deposits can imperfectly record the hydraulic conditions of devastating extreme events. Sand entrainment, advection and deposition in these events occur under strongly disequilibrium conditions in which traditional sediment transport models behave poorly. Quantitative models relating sediment characteristics to flow hydraulics hold the potential to improve coastal hazard assessments. However, data from recent natural tsunamis have rarely been accurate enough, over a range of parameter space, to quantitatively test proposed inverse models for predicting flow characteristics. To better understand how to “read” flow depth and velocity from disequilibrium deposits, we conducted controlled and repeatable laboratory flume experiments in which different grain size distributions (GSDs) of sand were entrained, transported and deposited by hydraulic bores. The bores were created by impounding and instantaneously releasing ~6 m^3 of water with a computer-controlled lift gate. The experiments represent 1/10 to 1/100 scale physical models of large events. Both flow characteristics (including Froude numbers) and suspended sediment transport characteristics (including Rouse numbers and grain size trends) scale consistently with documented natural tsunamis.<br>We use the experimental data to interpret how entrainment, transport and mixing influence deposit GSDs along the flume. Suspension-dominated deposits get finer and thinner in the direction of transport. The data show that two key controls on GSDs along the flume are (a) the size distribution of the sediment source, and (b) turbulent dispersion of grains. First, the influence of source GSDs on deposit GSDs is strongest near the sediment source. Size-dependent suspension and settling become increasingly important farther down the flume. Transport distances of 1-2 advection length scales are required for deposit GSDs to be sensitive to flow hydraulics. Second, turbulent dispersion strongly influences local deposit GSDs. Importantly, intermediate deposit grain size percentiles (e.g. D50) are less sensitive to dispersive transport than either the fine or coarse tails of local deposit GSDs. Using deposit GSDs along the flume, an advection-settling model best predicts flow depths and velocities when calculated for intermediate percentiles (e.g. D50), rather than for coarse size fractions (e.g. D95) as has been assumed in previous works. We also highlight areas where our knowledge and predictive ability are limited and could be improved using experiments, including understanding the degree to which grain size sorting occurs during entrainment into suspension, and also during energetic bedload transport. Overall, the work suggests that physical models of tsunami sediment transport and deposition are invaluable for evaluating equation assumptions, benchmarking model results, and rigorously evaluating model uncertainties.  +
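For reference, the two scales invoked above have standard definitions (our notation; these are textbook formulas, not results specific to these experiments):

<math>l_a = \frac{u\,h}{w_s}, \qquad P = \frac{w_s}{\kappa\,u_*}</math>

where <math>l_a</math> is the advection length, the horizontal distance a grain suspended in a flow of depth <math>h</math> and velocity <math>u</math> travels before settling at velocity <math>w_s</math>; <math>P</math> is the Rouse number, with shear velocity <math>u_*</math> and von Kármán constant <math>\kappa \approx 0.41</math>.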
Turbulence, bedload, and suspended sediment transport are directly simulated by coupling a large eddy simulation of the fluid with a distinct element method for every sediment grain. This modeling system calculates the motion of all grains as driven by resolved turbulence structures, and captures modification of the flow and turbulence by the grains, such as the effects of grain momentum extraction and density stratification. Simulations such as these can be used in the future to parameterize sediment transport in large-scale morphodynamic simulations.  +
Understanding and modeling the evolution of continental ice sheets such as Antarctica and Greenland can be a difficult task because many of the inputs used in transient ice flow models, whether inferred from satellite or in-situ observations, carry large measurement errors that will propagate forward and impact projection assessments. Here, we aim to comprehensively quantify error margins on model diagnostics such as mass outflux at the grounding line, maximum surface velocity and overall ice-sheet volume, applied to major outlet glaciers in Antarctica and Greenland. Our analysis relies on uncertainty quantification methods implemented in the Ice Sheet System Model (ISSM), developed at the Jet Propulsion Laboratory in collaboration with the University of California at Irvine. We focus in particular on sensitivity analysis to understand the local influence of specific inputs on model results, and on sampling analysis to quantify error margins on model diagnostics. Our results demonstrate the expected influence of measurement errors in surface altimetry, bedrock position and basal friction.  +
Understanding and predicting large scale watershed-ecosystem dynamics requires datasets that empower research at both the local and continental scale. Yet, creating, maintaining and delivering diverse harmonized datasets to researchers and decision-makers is costly and a relatively rare endeavor. In our lab, we have been working on two different projects meant to make it easier for anyone to better understand and predict the hydrobiogeochemical behavior of watersheds, big and small. In Macrosheds, we have harmonized all of the small watershed-ecosystem datasets in the LTER, CZO, USFS, and other programs where there is, at a minimum, data on streamflow and concentration of at least one dissolved constituent (e.g. Nitrate). This dataset provides a critical complement to datasets from larger watersheds like CAMELS and CAMELS-Chem, enabling more focused interrogation of watershed behavior at the scale of small streams. Second, we are actively rebuilding and improving on AquaSat - a dataset built to empower broader use of remote sensing for water quality. This data is focused on large rivers and lakes, visible to LandSat satellites (typically wider than 60 meters). Through both of these projects, we have learned critical lessons about what data end-users actually need, how to make their lives easier, the limits of data portals, and the community required to maintain open source software.  +
Understanding and predicting the response of vegetated ecosystems to climate change and quantifying the resulting carbon cycle feedbacks requires a coherent program of field and laboratory experiments, data synthesis and integration, model development and evaluation, characterization of knowledge gaps, and understanding of ecosystem structure and function. The U.S. Department of Energy supports such a program, which produces community data, models, and analysis capabilities aimed at projecting the impacts of environmental change on future atmospheric carbon dioxide levels, predicting changes in extreme events, and assessing impacts on energy production and use. Two computational approaches--one for quantifying representativeness of field sites and one for systematically assessing model performance--will be presented.<br><br>Resource and logistical constraints limit the frequency and extent of observations, particularly in the harsh environments of the arctic and the tropics, necessitating the development of a systematic sampling strategy to maximize coverage and objectively represent variability at desired scales. These regions host large areas of potentially vulnerable ecosystems that are poorly represented in Earth system models (ESMs), motivating two new field campaigns, called Next Generation Ecosystem Experiments (NGEE) for the Arctic and Tropics, funded by the U.S. Department of Energy. We developed a Multivariate Spatio-Temporal Clustering (MSTC) technique to provide a quantitative methodology for stratifying sampling domains, informing site selection, and determining the representativeness of measurement sites and networks. We applied MSTC to model results and data for the State of Alaska to characterize projected changes in ecoregions and to identify field sites for sampling important environmental gradients.<br><br>As ESMs have become more complex, there is a growing need for comprehensive and multi-faceted evaluation, analysis, and diagnosis of model results. The relevance of model predictions hinges in part on the assessment and reduction of uncertainty in predicted biogeochemical cycles, requiring repeatable, automated analysis methods and new observational and experimental data to constrain model results and inform model development. The goal of the International Land Model Benchmarking (ILAMB) project is to assess and improve the performance of land models by confronting ESMs with best-available observational data sets. An international team of ILAMB participants is developing a suite of agreed-upon model evaluation metrics and associated data at site, regional, and global scales. We are developing Open Source software tools for quantifying the fidelity of model performance, allowing modeling groups to assess confidence in the ability of their models to predict responses and feedbacks to global change.  
Understanding the performance of scientific applications can be a challenging endeavor given the constant evolution of architectures, programming models, compilers, numerical methods and the applications themselves. Performance integration testing is still not a reality for the majority of high-performance applications because of the complexity, computational cost, and lack of reliable automation. Hence, as part of the DOE SciDAC program, we are working on creating robust performance analysis workflows that capture application-specific performance issues and can be maintained and extended by the application scientists without requiring an external performance “expert”. The consumers of performance data include application developers, performance models, and autotuners. Once appropriate and sufficient performance data is available, our approach to using it to guide optimization is three-fold: (i) we investigate the most effective way to present performance results to the code developers (ii) we automate the selection of numerical methods based on generic performance models (as part of the NSF-funded Lighthouse project) and (iii) we explore the use of different types of performance models in low-level autotuning systems to reduce the size of the parameter search space. While code generation and autotuning are important for achieving performance portability, the majority of code development (including optimization) is still performed by humans. As part of the DOE IDEAS project, we are developing data-based methodologies to try to understand better how human teams work most effectively in developing high-quality, high-performance, enduring scientific software.  +
Understanding the processes that shape and reshape the earth's surface is a fundamental challenge in the geosciences. Numerical modeling—the glue between data and theory—is a key component of the effort to meet this challenge. The Community Surface Dynamics Modeling System (CSDMS) was formed to provide support for earth-surface dynamics modeling, and to accelerate the pace of discovery through software development, resource sharing, community coordination, knowledge exchange, and technical training.<br>The CSDMS approach is bottom-up: models, typically developed within the community, are nominated by the community for inclusion within the CSDMS Modeling Framework (CMF). The CMF provides loose, two-way coupling in a Python-based framework that can scale from an individual laptop to a high-performance computing environment. The CMF has a web-based front-end, the Web Modeling Tool (WMT), that's available for use by all community members.<br>The CMF is built on four key software technologies:<ul><li>Basic Model Interface. A Basic Model Interface (BMI), consisting of a common set of functions for initializing, running, and finalizing a model, is added to each model to be incorporated into the CMF.</li><li>Standard Names. Given variables from two models, Standard Names provides a semantic matching mechanism for determining whether—and the degree to which—the variables refer to the same quantity.</li><li>Babel. C, C++, Fortran, Java, and Python language bindings for a BMI-enabled model are generated by Babel.</li><li>Python Modeling Toolkit. The Python Modeling Toolkit (PyMT) is the framework part of the CMF, allowing Babel-wrapped models to be coupled and run in a Python environment. PyMT includes tools for time interpolation, grid mapping, data exchange, and visualization.</li></ul><br>In this presentation, I'll provide an overview of these core CSDMS software technologies, describing the problems they solve, how they benefit the community, and how they may accelerate scientific productivity. I'll include a Jupyter Notebook demonstration of using PyMT to interactively couple and run a landscape evolution model with a sediment transport model. I'll conclude with a list of issues still to be addressed by CSDMS.  
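To make the Basic Model Interface concrete, here is a minimal Python skeleton of the function set described above (the full BMI specification also includes grid, time, and metadata functions; this is a sketch, not the complete interface):

<syntaxhighlight lang="python">
# Minimal shape of a BMI: any model exposing these calls can be driven the same way.
class MyModelBmi:
    def initialize(self, config_file: str) -> None:
        """Read configuration and set up the model's initial state."""

    def update(self) -> None:
        """Advance the model by one time step."""

    def finalize(self) -> None:
        """Clean up and release resources."""

    def get_value(self, var_name: str):
        """Return current values of a variable identified by a Standard Name."""
</syntaxhighlight>

Because every wrapped model answers the same calls, PyMT can interleave `update()` calls on two models and pass `get_value()` output from one to the other, which is what makes the coupling demonstration possible.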
Update on the NSF PREEVENTS program  +
Update on what CSDMS has accomplished and what is planned for the coming 5 years.  +
Urban areas located along the coastline face critical choices in the coming decades to respond effectively to climate change, especially with regards to sea level rise (SLR) and intensified ocean storms. These choices include adaptation to let the water in, retreat to avoid new flooded areas, or resilient infrastructure to keep the water out. Nature-based solutions (NBS), which range from restoration of existing ecosystems to infrastructure inspired by natural ecosystems, have the potential to soften the consequences of choosing either hard infrastructure or adaptation. However, in urban environments the lack of available land space may reduce the efficacy of traditional NBS (e.g. living shorelines). Here, we present work to understand and alleviate the problem of NBS efficacy in an urban area with little space to give back to the natural environment. We use coastal hydrodynamic models of the Boston Harbor to show the potential for a range of NBS to protect against storms and SLR with the available area for these kinds of infrastructure projects. We further show how these models can be simplified and used as tools to understand trade-offs between NBS, hard infrastructure, and retreat, which may be as likely to come from an adaptation strategy as from SLR. Finally, we discuss our models of combinations of these solutions, and the current potential for NBS to protect an urban area from climate change.  +
Using CSDMS in the Classroom - Learn about CSDMS software for running a suite of earth surface models through a web-based modeling tool (WMT). This webinar will share improved ways of using this tool in the classroom, give a quick reminder demo, and point in detail to the online resources. '''Instructor:''' Irina Overeem, CSDMS Deputy Director, University of Colorado, Boulder  +
Using the CSDMS tools and resources, we have developed a new model coupling river, floodplain, and coastal processes to explore how interactions between upstream and downstream controls in a fluvio-deltaic system affect river channel processes and large-scale delta morphology. The River Avulsion and Floodplain Evolution Model (RAFEM, written in Python) and Coastline Evolution Model (CEM, written in C) are coupled using the CSDMS Basic Model Interface (BMI) and are available as part of the CSDMS software stack. Using the CSDMS High Performance Computing Cluster and the Dakota toolkit, we have explored how the wave climate (wave heights and offshore approach angles), sea-level rise rate, and the amount of in-channel aggradation required to trigger an avulsion (superelevation) influence avulsion frequency and location, impacting both delta morphology and the resulting stratigraphy. The model is structured modularly to invite further couplings with additional model components in the future.  +
Vegetation in river channels and on floodplains alters mean flow conditions, turbulence, sediment transport rates and local sedimentation patterns. Although many advances have been made to predict the impact of vegetation on flow conditions, relatively few studies have investigated how vegetation influences bedload fluxes. We first investigate how known vegetation impacts on flow turbulence can be used to better predict bedload transport and sedimentation within vegetation patches. To elucidate these mechanics we measured 2D velocity fields using PIV and bedload fluxes using high-speed video in simplified flume experiments. We used these laboratory measurements to test and develop bedload transport equations for vegetated conditions. Bedload transport equations did not accurately predict sediment fluxes unless they accounted for the spatial variability in the near-bed Reynolds stress. We then use this patch scale understanding to better predict how vegetation impacts channel morphology. Specifically, we investigate how vegetation influences point bar growth and shape through coupled laboratory experiments and 2D numerical modeling. We measured bedload fluxes, flow conditions and sedimentation rates on a point bar planted with natural vegetation at the Saint Anthony Falls Outdoor Stream Lab. We then calculated the detailed 2D flow field over the point bar throughout imposed flow hydrographs. Our results demonstrate that vegetation caused significant changes in the bar dimensions and depending on the flow level, led to the development of a side channel between the bar and the inner bank of the meander. Such a side channel could precipitate a change in channel morphology to a multi-thread channel. Accurate predictions of sedimentation caused by vegetation patches not only require an estimate of the spatial variation in shear stress (or velocity) within a patch but also how the vegetation alters the adjacent flow field and bedload sediment supply to the patch.  
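As generic context for why the spatial variability of stress matters (an illustration using a standard transport relation, not the study's fitted equation), consider a Meyer-Peter and Müller-type bedload formula:

<math>q^* = 8\left(\tau^* - \tau^*_c\right)^{3/2}</math>

where <math>q^*</math> is the dimensionless bedload flux, <math>\tau^*</math> the local Shields stress, and <math>\tau^*_c</math> its critical value. Because the relation is convex above threshold, the spatial average <math>\langle q^*(\tau^*)\rangle</math> exceeds <math>q^*(\langle\tau^*\rangle)</math> wherever <math>\tau^*</math> fluctuates (Jensen's inequality), so evaluating a formula with a patch-averaged stress systematically underpredicts the flux within vegetation.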
Vegetation is a critical ecogeomorphic agent within landscapes and is instrumental to many physical, biochemical, and ecological processes that can vary across spatial and temporal scales (e.g., erosion, sediment deposition, primary productivity, and nutrient cycling). Modeling vegetation dynamics can be challenging, not only because of these scale-dependent variations, but also because of the breadth of existing approaches. The purpose of this clinic is to provide a technical overview for incorporating or developing vegetation models for earth surface dynamics modeling questions. The instructors will introduce vegetation processes commonly modeled, existing types of vegetation models, and how to choose an appropriate level of complexity for your system. Attendees will gain hands-on experience with existing vegetation components within and outside the Landlab system. These will include the Cellular Automaton Tree Grass Simulator (CATGraSS), a mechanistic, photosynthesis-driven generalized vegetation model, as well as a demonstration of how to incorporate vegetation models from NetLogo into Landlab. While active developers in the Landlab community will find this clinic useful, advanced programming experience is not needed.  +
Water -- we drink it, we bathe in it, we feed it to our plants, we gaze admiringly as it falls off cliffs -- but how does it get from the sky to your tap? Will it always be free to chisel windingly through the countryside and leap from dazzling heights? The open-source model mosartwmpy (aka "wimpy") offers researchers a bird's eye view of water movement and reservoir operations across the conterminous United States. Wimpy has been translated into Python from its ancestor MOSART-WM (Model Of Scale Adaptive River Transport and Water Management) without sacrificing performance, leading to a more widely accessible and extensible model. By implementing the Basic Model Interface (BMI), Wimpy provides a familiar workflow with interoperability at the heart. This clinic will introduce mosartwmpy at a high-level and provide a hands-on, interactive tutorial demonstrating its capabilities for studying water shortages. Attendees should leave armed with a stronger understanding of the interplay between water movement and reservoir storage, and with the confidence to utilize mosartwmpy in their own research.  +
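Since mosartwmpy implements the BMI, a simulation is driven with the standard calls. A minimal sketch (the configuration filename is a placeholder; consult the mosartwmpy documentation for the exact setup):

<syntaxhighlight lang="python">
# Run mosartwmpy through its Basic Model Interface (config path is illustrative).
from mosartwmpy import Model

model = Model()
model.initialize("./config.yaml")          # BMI: load settings, domain, and forcing
model.update_until(model.get_end_time())   # BMI: run to the configured end time
model.finalize()                           # BMI: write output and clean up
</syntaxhighlight>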
Water – too little, too much – will likely be the biggest future climate challenge for the world. This will be particularly true in vulnerable regions in Africa, where the response of rainfall to increasing greenhouse gas concentrations is a critical socio-economic issue, with implications for water resources, agriculture, and potential conflict. The geological record finds tropical Africa at times hyperarid and at other times covered with large megalakes, with abrupt transitions between these humid and dry states. Climate modeling allows us to explore the processes that combined to produce these past changes. In this talk, I will highlight what has been learned about the glacial-interglacial variations of African hydroclimate from models and data. Together, they provide a perspective on projections of future precipitation changes over tropical Africa.  +
Watersheds are complex natural landscape features that contain hillslopes and channel networks. An entropy-based approach is used to explore the contributions of the channel network and hillslopes to watershed complexity. The structural complexity is evaluated using the width function, which characterizes the spatial arrangement of channels, whereas the incremental area function, which captures the patterns of flux transport, is used to study the functional complexity. Based on several catchments across the United States, our results show that hillslopes add significant complexity to catchments, and suggest the amount of hillslope information needed for accurate predictive modeling of hydrologic processes at the catchment scale.  +
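Concretely, one standard way to score such complexity (our notation, consistent with the entropy-based approach described above) is the Shannon entropy of the normalized width function:

<math>H_W = -\sum_{i=1}^{N} p_i \ln p_i</math>

where <math>p_i</math> is the fraction of channel (or hillslope) locations in flow-distance class <math>i</math> from the outlet; applying the same measure to the incremental area function yields the functional counterpart.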
We examine the distribution of waterbody sizes on lowland arctic deltas and explore whether ephemeral wetlands and perennial lakes have different size distributions. We document that lake areas follow a lognormal distribution, while wetland areas follow a power-law distribution. We propose a mechanistic model of thermokarst lake growth that is consistent with the observed lognormal distribution, and argue that the power-law distribution of wetland areas is consistent with an inundated rough landscape, as observed in temperate wetlands. We conclude by examining the implications of these two contrasting processes for projections of future lake area change.  +
We illustrate results from a Landlab component that uses the framework and geomorphic transport laws developed by Sklar et al. (2017) to model the grain-size distribution resulting from the transformation of rock to soil. The equations model grain-size distribution as a function of weathering and denudation rate. We have implemented these equations to explore controls on sediment grain size in different parts of the Rio Blanco watershed, Puerto Rico.  +
We investigate the feedbacks between surface processes and tectonics in an extensional setting by coupling a 2-D geodynamical model with a landscape evolution law. Focusing on the evolution of a single normal fault, we show that surface processes significantly enhance the amount of horizontal extension a fault can accommodate before being abandoned in favor of a new fault. In simulations with very slow erosion rates, a 15-km-thick brittle layer extends via a succession of crosscutting short-lived faults (heave < 5 km). By contrast, when erosion rates are comparable to the regional extension velocity, deformation is accommodated on long-lived faults (heave > 10 km). Using simple scaling arguments, we quantify the effect of surface mass removal on the force balance acting on a growing normal fault. This leads us to propose that the major range-bounding normal faults observed in many continental rifts owe their large offsets to erosional and depositional processes.  +
We present results from a climate model integration with a multi-scale ocean component capable of locally enhancing resolution. The model is the NCAR Community Earth System Model (CESM), in which the ocean component contains a high-resolution ROMS nest for either the California Current System or the Benguela Current. In this presentation we will show results from century-long integrations showing that the better representation of coastal upwelling has both regional and global ramifications to the climate system. Using a comparative analysis of the two upwelling systems, we will show that enhancing the climate model representation of boundary currents is not simply a matter of enhanced resolution. Finally, we will use our multi-scale setup to distinguish between the role of atmospheric tele-connections and oceanic advection in propagating the upwelling signal.  +
We use numerical modeling to explain how deltaic processes and morphology are controlled by properties of the sediment input to the delta apex. We conducted 36 numerical experiments of delta formation varying the following sediment properties: median grain size, grain-size distribution shape, and percent cohesive sediment. As the dominant grain size increases deltas undergo a morphological transition from elongate with few channels to semi-circular with many channels. This transition occurs because the critical shear stress for erosion and the settling velocity of grains in transport set both the number of channel mouths on the delta and the dominant delta-building process. Together, the number of channel mouths and dominant process – channel avulsion, mouth bar growth, or levee growth – set the delta morphology. Coarse-grained, non-cohesive deltas have many channels that are dominated by avulsion, creating semi-circular planforms with relatively smooth delta fronts. Intermediate-grained deltas have many channels that are dominated by mouth bar growth, creating semi-circular planforms with bifurcated channel networks and rugose delta fronts. Fine-grained, cohesive deltas have a few channels, the majority of which are dominated by levee growth, creating elongate planforms with smooth delta fronts. The process-based model presented here provides a previously lacking mechanistic understanding of the effects of sediment properties on delta channel network and planform morphology.  +
Welcome  +
What are your colleagues doing to make their models FAIR - Findable, Accessible, Interoperable, and Reusable? What do publishers and editors expect, and how can you meet those requirements? There are a wide range of practices in use today. Learn what's going on in the CSDMS community and the broader Earth science community for making models and scientific code FAIR. Pre-registration required.  +
What determines the style of river delta growth? How do deltas change after fluvial sediment supply is cut off? River delta evolution is characterized by the progradation and transgression of individual (deltaic) lobes: the delta cycle. We investigate the behaviour of wave-influenced deltas with a simple shoreline model, and quantitatively relate several first-order controls.  +
When a tree falls into a river it becomes instream large wood, promoting fundamental changes in river hydraulics and morphology and playing a relevant role in river ecology. By interacting with the flow and sediment, instream large wood (i.e., downed trees, trunks, root wads and branches) contributes to maintaining the river's physical and ecological integrity. However, large quantities of wood can be transported and deposited during floods, enhancing the adverse effects of flooding at critical sections such as bridges. Accurate predictions of large wood dynamics in terms of fluxes, depositional patterns, trajectories, and travel distance still need to be improved, and observations remain scarce. Only recently have numerical models begun to help to this end. In contrast to other fluvial components such as fluid flow and sediment, for which numerical models have been extensively developed and applied over decades, numerical modelling of wood transport is still in its infancy. In this talk, I will describe the most recent advances and challenges in the numerical modelling of instream large wood transport in rivers, focusing on the numerical model Iber-Wood. Iber-Wood is a two-dimensional computational fluid dynamics model that couples a Eulerian approach for hydrodynamics and sediment transport with a discrete element (i.e., Lagrangian) approach for wood elements. The model has been widely validated using flume and field observations, has been applied to several case studies, and has been proven to accurately reproduce wood trajectories, patterns of wood deposition, and impacts of wood accumulations during floods.  +
Why is object-oriented programming important? We'll consider this from the perspective of a grad student starting a research project, a postdoc scrambling to publish papers, a researcher starting a collaboration, and a professor leading a group of students and postdocs. (15 min) Where can you find good, reliable information on object-oriented programming? The internet is filled with content from others who have devoted their careers to this topic. We'll show you where we think are the best places to get more information. (5 min) How does object-oriented programming work? We'll give a concrete demonstration of object-oriented programming in Python. Again, others have done this better, but seeing it in action may help you get a jump on using it yourself. (10 min)  +
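As a taste of that demonstration, here is a generic example of our own (not necessarily the one shown in the webinar) of a small class hierarchy in Python:

<syntaxhighlight lang="python">
# A small class hierarchy: shared behavior lives in the base class.
class Sensor:
    def __init__(self, name):
        self.name = name
        self.readings = []

    def record(self, value):
        self.readings.append(value)

    def mean(self):
        return sum(self.readings) / len(self.readings)

class RainGauge(Sensor):
    def total(self):
        return sum(self.readings)  # subclass adds behavior without rewriting Sensor

gauge = RainGauge("site_42")
gauge.record(2.5)
gauge.record(0.5)
print(gauge.mean(), gauge.total())  # 1.5 3.0
</syntaxhighlight>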
Why is unit testing important? We'll consider this from the perspective of a grad student starting a research project, a postdoc scrambling to publish papers, a researcher starting a collaboration, and a professor leading a group of students and postdocs. (15 min) Where can you find good, reliable information on unit testing? The internet is filled with content from others who have devoted their careers to this topic. We'll show you where we think are the best places to get more information. (5 min) How does unit testing work? We'll give a concrete demonstration of unit testing in Python with pytest. Again, others have done this better, but seeing it in action may help you get a jump on using it yourself. (10 min)  +
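As a taste of that demonstration, here is a generic example of our own (not necessarily the one shown in the webinar): save the file as test_hydro.py and run "pytest" in the same directory.

<syntaxhighlight lang="python">
# pytest discovers functions named test_* and reports any failed assertions.
def discharge(width, depth, velocity):
    """Volumetric discharge of a rectangular channel, in m^3/s."""
    return width * depth * velocity

def test_discharge_basic():
    assert discharge(10.0, 2.0, 1.5) == 30.0

def test_discharge_zero_velocity():
    assert discharge(10.0, 2.0, 0.0) == 0.0
</syntaxhighlight>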
Why is version control important? We'll consider this from the perspective of a grad student starting a research project, a postdoc scrambling to publish papers, a researcher starting a collaboration, and a professor leading a group of students and postdocs. (15 min) Where can you find good, reliable information on version control? The internet is filled with content from others who have devoted their careers to this topic. We'll show you where we think are the best places to get more information. (5 min) How does version control work? We'll give a concrete demonstration of version control using GitHub. Again, others have done this better, but seeing it in action may help you get a jump on using it yourself. (10 min)  +
Within our lifetime, climate change has the potential to drastically alter coastal resiliency. Atoll island nations are particularly vulnerable to climate change: from increasing ocean temperatures (causing coral die-off), to ocean acidification (decreasing coral resiliency), to increasing SLR. We must understand what will happen to the atoll islands because they are the inhabited portion of these systems. However, we lack a comprehensive understanding about the primary processes driving atoll island evolution under rising sea levels and varying wave climate. This uncertainty in predictions hinders local communities’ preparation for the future; we must understand how atoll islands respond and evolve with changing environmental forcings on a global scale. To predict the response of these islands to changing climate, we must understand the feedbacks between physical and ecological processes at different temporal and spatial scales. In addition, we must account for the actions and processes taken by humans driving landscape change on these islands. My lab has focused on investigating the feedbacks inherent in these landscapes using numerical modeling and remote sensing.  +
Writing the software to implement a two-dimensional numerical model can be a daunting exercise, even when the underlying discretization and numerical schemes are relatively simple. The challenge is even greater when the desired model includes "advanced" features such as an unstructured grid, a staggered-grid numerical solver, or input/output operations on gridded data. Landlab is a Python-language programming library that makes the process of 2D model-building simpler and more efficient. Landlab's core features include: (1) a gridding engine that lets you create and configure a structured or unstructured grid in just a few lines of code, and attach data directly to the grid; (2) a library of pre-built process components that saves you from having to re-invent the wheel with common geoscience algorithms (such as flow routing on gridded terrain, linear and nonlinear diffusion, and elastic plate flexure); (3) a mechanism for coupling components together to create integrated models; and (4) a suite of tools for input/output and other common operations. Although Landlab's components are primarily related to earth-surface dynamics (including geomorphology and hydrology), its basic framework is applicable to many types of geophysical system. This clinic provides a hands-on tutorial introduction to Landlab. Participants will learn about Landlab's capabilities, and how to use it to build and run simple 2D models. Familiarity with the Python language and the NumPy library is helpful but not critical.  +
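As an example of features (2) and (3), two documented Landlab components can be composed into a simple stream-power landscape model (a minimal sketch; grid size, rates, and time step are arbitrary illustration values):

<syntaxhighlight lang="python">
# Couple flow routing with stream-power incision on a raster grid.
import numpy as np
from landlab import RasterModelGrid
from landlab.components import FlowAccumulator, FastscapeEroder

grid = RasterModelGrid((50, 50), xy_spacing=100.0)
z = grid.add_zeros("topographic__elevation", at="node")
z += np.random.rand(grid.number_of_nodes)      # random initial surface

flow = FlowAccumulator(grid, flow_director="D8")
eroder = FastscapeEroder(grid, K_sp=1e-5, m_sp=0.5, n_sp=1.0)

for _ in range(500):
    z[grid.core_nodes] += 0.001 * 1000.0       # uniform uplift, m per 1000-yr step
    flow.run_one_step()                        # route flow, accumulate drainage area
    eroder.run_one_step(dt=1000.0)             # detachment-limited channel incision
</syntaxhighlight>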
Landslides are key agents of sediment production and transport. Ongoing efforts to map and simulate landslides continuously improve our knowledge of landslide mechanisms. However, understanding sediment dynamics following landslide events is equally crucial for developing hazard mitigation strategies. An outstanding research challenge is to better constrain the dynamic feedbacks between landslides and fluvial processes. Fluvial processes simultaneously (i) act as conveyor belts evacuating landslide-derived sediment and (ii) lower the hillslope's base level, triggering further landsliding. Landslides in turn can choke river channels with sediment, thereby critically altering fluvial responses to external tectonic or climatic perturbations. Here, we present HYLANDS, a hybrid landscape evolution model designed to numerically simulate both landslide activity and sediment dynamics following mass failure. The hybrid nature of the model lies in its capacity to simulate both erosion and deposition at any place in the landscape. This is achieved by coupling the existing SPACE (Stream Power with Alluvium Conservation and Entrainment) model for channel incision with a new module simulating rapid, stochastic mass wasting (landsliding). In this contribution, we first illustrate the capacity of HYLANDS to capture river dynamics ranging from detachment-limited to transport-limited configurations. Subsequently, we apply the model to a portion of the Namche Barwa massif in eastern Tibet and compare simulated and observed landslide magnitude-frequency and area-volume scaling relationships. Finally, we illustrate the relevance of explicitly simulating stochastic landsliding and sediment dynamics over longer timescales for landscape evolution in general and river dynamics in particular, under varying climatic and tectonic configurations. With HYLANDS we provide a hybrid tool to understand both the long- and short-term coupling between stochastic hillslope processes, river incision and source-to-sink sediment dynamics. We further highlight the unique potential of bridging those timescales to generate better assessments of both on-site and downstream landslide risks.  +
icepack is a Python software package for simulating the flow of glaciers and ice sheets. Icepack is built on top of the finite element modeling library Firedrake, which makes it possible to describe physics problems using a domain-specific language (DSL) embedded into Python. This DSL makes the code you write look much more like the underlying math. Using this DSL, we've been able to create an ice flow model that users can easily extend and modify -- for example, substituting in your own sliding law -- while at the same time insulating them from some of the messier aspects of numerical modeling. In this talk, I'll describe some of the design decisions that went into icepack and why we made them as well as how we've involved graduate students in the development process. Finally, I'll give a live demo and discuss some future directions.  +
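To convey the flavor of the resulting user code, here is a loose sketch patterned on the icepack tutorials for an idealized floating ice shelf; names, argument lists, and values are assumptions that may differ across versions, so treat this as illustrative rather than canonical:

<syntaxhighlight lang="python">
# Diagnostic (velocity) solve for an idealized ice shelf, after the icepack tutorials.
import firedrake
import icepack

mesh = firedrake.RectangleMesh(32, 32, 20e3, 20e3)       # 20 km x 20 km domain
Q = firedrake.FunctionSpace(mesh, "CG", 2)               # scalar fields (thickness)
V = firedrake.VectorFunctionSpace(mesh, "CG", 2)         # vector fields (velocity)

x, y = firedrake.SpatialCoordinate(mesh)
h = firedrake.Function(Q).interpolate(500.0 - 0.01 * x)  # thickness thinning seaward
u0 = firedrake.Function(V).interpolate(firedrake.as_vector((90.0 + 0.01 * x, 0.0)))
A = firedrake.Constant(20.0)                             # ice fluidity (assumed value)

model = icepack.models.IceShelf()                        # momentum balance for a floating shelf
solver = icepack.solvers.FlowSolver(model, dirichlet_ids=[1])  # inflow boundary id assumed
u = solver.diagnostic_solve(velocity=u0, thickness=h, fluidity=A)
</syntaxhighlight>

Note how the physics enters through keyword arguments with mathematical meanings (thickness, fluidity), which is the DSL style the talk describes: swapping in your own sliding or rheology law means changing those terms, not the solver machinery.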
meanderpy is a Python implementation of a simple kinematic model of meandering, based on the Howard and Knutson (1984) model. In contrast with previous implementations, we assume a simple linear relationship between curvature and migration rate and, using time-lapse satellite imagery, show that the model predicts 55% of the variance in migration rates in seven rivers of the Amazon Basin. It also predicts the formation of autogenic counter point bars: deposits related to channel segments along which the curvature and migration rate vectors have opposing orientations. These finer-grained deposit types tend to form on the upstream side of high-curvature, short bends that rapidly translate downstream. Wrapping simple geomorphic surfaces around the centerlines allows us to build three-dimensional stratigraphic models of both fluvial and submarine meandering systems.  +
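One common way to write the Howard and Knutson (1984) scheme that this describes (our notation; the linear curvature relation is the simplification noted above):

<math>R_0(s) = k_l\,W\,C(s), \qquad R_1(s) = \Omega\,R_0(s) + \frac{\Gamma}{\int_0^\infty G(\xi)\,\mathrm{d}\xi}\int_0^\infty R_0(s-\xi)\,G(\xi)\,\mathrm{d}\xi</math>

where <math>C(s)</math> is centerline curvature at streamwise coordinate <math>s</math>, <math>W</math> is channel width, <math>k_l</math> is a migration-rate constant, and <math>G(\xi)=e^{-2kC_f\xi/h}</math> weights upstream curvature (with friction factor <math>C_f</math>, flow depth <math>h</math>, and <math>k \approx 1</math>); <math>\Omega</math> and <math>\Gamma</math> are weighting constants (commonly taken as <math>\Omega=-1</math> and <math>\Gamma=2.5</math>). The upstream-weighted convolution is what produces the downstream translation of short, high-curvature bends and hence the autogenic counter point bars.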