Property:CSDMS meeting abstract
From CSDMS
This is a property of type Text.
We are advancing dynamic multi-hazard risk assessment (MHRA) methods for human ecology, using the Kutupalong Rohingya Refugee Camp (KTP) in southeastern Bangladesh as a case study. KTP, home to over 1.1 million refugees within a 15 km² area, represents one of the world’s most densely populated and hazard-prone humanitarian settlements. This research investigates hydro-meteorological risks—primarily shallow landslides and flash floods—before and after refugee settlement, with a focus on landscape changes driven by both anthropogenic and natural processes. We formulated two core hypotheses. The first posits that dynamic hazard modeling, incorporating both geological and anthropogenic factors, more accurately captures the cascading effects of landslides and flash floods than traditional static models. The second hypothesis suggests that hydro-meteorological risk at KTP has declined due to the incremental implementation of slope stabilization and restoration measures.
We began with a landslide hazard assessment using a slope unit (SU)-based approach, building upon previous grid-based models employed at KTP. Our dynamic, time-lapse assessment, which examines pre- and post-refugee influx scenarios, identifies slope units with increasing, decreasing, or unchanged susceptibility over time. A Generalized Additive Model (GAM) applied at the SU scale outperforms conventional machine learning (ML) methods, providing a robust framework for surface hazard modeling. In parallel, we evaluated landscape degradation and recovery through above-ground biomass (AGB) estimation using Sentinel-2A imagery, NASA GEDI LiDAR, and ESA Biomass products. We estimated AGB for 2017 (pre-influx), 2019 (early restoration), and 2023 (ongoing recovery) using Random Forest, SVM, and XGBoost regression models. This integration of remote sensing and ML demonstrates the utility of multi-source data for tracking dynamic land-use change.
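As a concrete illustration of the ML-based AGB estimation just described, the sketch below fits a Random Forest regression to synthetic, per-pixel predictors; the feature construction and biomass values are placeholders, not the Sentinel-2/GEDI data used in the study.

# Illustrative sketch only: a Random Forest regression relating hypothetical
# Sentinel-2-style predictors to reference above-ground biomass (AGB). The
# synthetic data below are placeholders, not the study's inputs.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)
n = 500
# Hypothetical per-pixel reflectance bands (e.g., red, NIR, SWIR1, SWIR2).
X = rng.uniform(0.0, 1.0, size=(n, 4))
ndvi = (X[:, 1] - X[:, 0]) / (X[:, 1] + X[:, 0] + 1e-6)   # vegetation index from red/NIR
agb = 120.0 * np.clip(ndvi, 0, None) + rng.normal(0, 5, n)  # synthetic target (Mg/ha)

X_train, X_test, y_train, y_test = train_test_split(X, agb, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out pixels:", round(r2_score(y_test, model.predict(X_test)), 2))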
Further fieldwork is required to collect more geotechnical soil samples and detailed information on the geometry of the failure plane in selected large landslides. This will enable us to assess the interaction between slope-forming materials and the underlying bedrock interface, as well as model the velocity and volume of sliding materials in the form of run-out. As with landslides, we will dynamically simulate flash flood inundation to extract critical hydrodynamic parameters, including peak flow height, flow velocity, discharge, and flood arrival time, particularly for the 2017 and 2021 monsoon events at KTP. Multi-temporal DEM generation and land cover mapping will be key in this regard.
A key contribution of this research lies in the integration of landslide and flash flood risk data to assess their cascading impacts on human ecology. This integrated risk information will be combined with engineering measures and economic modeling to assess the effectiveness and feasibility of the existing mitigation measures. Risk estimation will be conducted under changing hazard scenarios, comparing conditions immediately before the major refugee influx (2018 and earlier) with those in the post-intervention period (2022–2023). A similar modeling framework will also be applied to explore potential future hazard scenarios under evolving landscape and climate conditions.
We develop a hydroclimatological approach to modeling regional shallow landslide initiation by integrating spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates on a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km². The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.
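A minimal Monte Carlo sketch of this approach is given below; the factor-of-safety and steady-state wetness expressions follow a common form of the coupled infinite-slope/steady-state-flow model, and all distributions and parameter values are illustrative assumptions rather than the Landlab component's configuration.

# Minimal Monte Carlo sketch of infinite-slope stability with steady-state
# subsurface flow. Distributions and parameter values are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 10000                        # number of Monte Carlo iterations

theta = np.radians(40.0)         # slope angle (assumed)
a_per_b = 500.0                  # specific contributing area (m), assumed
rho_w, rho_s, g = 1000.0, 1700.0, 9.81

# Uncertain inputs sampled from assumed distributions.
soil_depth = rng.uniform(0.5, 1.5, n)               # m
phi = np.radians(rng.uniform(30.0, 40.0, n))        # internal friction angle
cohesion = rng.uniform(2e3, 8e3, n)                 # combined root + soil cohesion (Pa)
transmissivity = rng.uniform(1e-3, 1e-2, n)         # m^2/s
recharge = rng.gumbel(2e-7, 1e-7, n).clip(min=0)    # annual-max daily recharge (m/s)

# Steady-state relative wetness (saturated fraction of the soil column).
wetness = np.clip(recharge * a_per_b / (transmissivity * np.sin(theta)), 0.0, 1.0)

# Infinite-slope factor of safety (standard dimensionless form).
c_term = cohesion / (soil_depth * rho_s * g * np.sin(theta) * np.cos(theta))
friction_term = np.cos(theta) * (1.0 - wetness * rho_w / rho_s) * np.tan(phi) / np.sin(theta)
factor_of_safety = c_term + friction_term

prob_failure = np.mean(factor_of_safety < 1.0)
print(f"Annual probability of landslide initiation: {prob_failure:.3f}")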
We have developed a simple approach to modeling how coastal marshes respond to changes in the rate of sea level rise (SLR) and sediment concentration. This approach, rooted in detailed numerical modeling and in field and remotely sensed observations, produces plan view distributions of elevations and the densities of multiple marsh species and bare soil. This modeling approach can be applied to specific marshes to forecast marsh configurations for any combination of SLR rate and suspended sediment concentration in a channel network.
The approach involves techniques to detect the spatial distributions of fractional cover and biomass densities of multiple marsh species from satellite observations. These techniques were developed using a combination of field observations and drone and airborne lidar and multispectral data from the East Coast of the United States and the Venice Lagoon, and can be applied to any coastal environment with similar mixes of vegetation types. Biomass density can be treated as a function of elevation, representing the realized niches of observed vegetation species, or mixtures of species.
The approach also involves modeling demonstrating that, within marsh basins, tidal current velocities and rates of inorganic sediment deposition do not depend on vegetation properties. Given this simplification, and the relationship between biomass density and elevation (realized niches), solving for equilibrium depths and biomass densities as a function of distance from the nearest channel becomes straightforward (Figure 1).
The rate of change of depth D (below high-water level) is given by:
∂D/∂t = R - A_inorg - A_org
where R is the rate of SLR, and A_inorg and A_org are the rates of accretion of inorganic and organic sediment, respectively. A_inorg is equal to D·C, where C is sediment concentration. In Figure 1, equilibrium depths (∂D/∂t=0; R - A_inorg = A_org) are determined graphically.
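The graphical solution of Figure 1 can also be obtained numerically. The sketch below solves R - A_inorg = A_org for the equilibrium depth D under an assumed parabolic realized niche and assumed rate coefficients; none of the values are taken from the study.

# Illustrative numerical version of the graphical solution: find the depth D
# (below high water) at which R - A_inorg(D) = A_org(D). The niche shape and
# rate coefficients are assumed placeholders, not calibrated values.
import numpy as np
from scipy.optimize import brentq

R = 5e-3            # rate of sea level rise (m/yr), assumed
C = 30e-3           # suspended sediment concentration (kg/m^3), assumed
k_inorg = 0.4       # converts D*C into an accretion rate (m/yr per kg/m^2), assumed
k_org = 6e-3        # peak organic accretion rate (m/yr), assumed

D_min, D_max = 0.0, 0.8   # depth range of the realized niche (m), assumed

def biomass_fraction(D):
    """Parabolic realized niche: 0 outside [D_min, D_max], peak at mid-depth."""
    if D < D_min or D > D_max:
        return 0.0
    return 4.0 * (D - D_min) * (D_max - D) / (D_max - D_min) ** 2

def depth_tendency(D):
    """dD/dt = R - A_inorg - A_org at depth D."""
    A_inorg = k_inorg * D * C
    A_org = k_org * biomass_fraction(D)
    return R - A_inorg - A_org

# Bracket a sign change in the tendency, then solve for the equilibrium depth.
D_eq = brentq(depth_tendency, 1e-6, D_max)
print(f"Equilibrium depth below high water: {D_eq:.2f} m")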
Acknowledgments
This work was supported by the US NSF Geomorphology and Land Use Dynamics program (2016068). AA, MM, and SS were also supported by the RETURN Extended Partnership and received funding from the European Union Next-GenerationEU (National Recovery and Resilience Plan – NRRP, Mission 4, Component 2, Investment 1.3 – D.D. 1243 2/8/2022, PE0000005).
We have implemented algorithms for simulating fine and cohesive sediment in the Regional Ocean Modeling System (ROMS). These include: floc dynamics (aggregation and disaggregation in the water column); changes in floc characteristics in the seabed; erosion and deposition of cohesive and mixed (cohesive and non-cohesive) sediment; and biodiffusive mixing of bed sediment. These routines supplement existing non-cohesive sediment routines in ROMS, thereby increasing the model's ability to represent fine-grained environments where aggregation, disaggregation, and consolidation may be important. Additionally, we describe changes to the sediment bed layering scheme that improve the fidelity of the modeled stratigraphic record. This poster provides examples of these modules implemented in idealized test cases and a real-world application.
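As a minimal illustration of the kind of bed exchange such routines compute, the sketch below uses the widely cited Ariathurai-Partheniades erosion and Krone-style deposition formulations with assumed parameter values; it is a generic formulation, not the ROMS/CSTMS code itself.

# Illustrative cohesive-sediment bed exchange using standard formulations
# (Ariathurai-Partheniades erosion, Krone-style deposition). Parameter values
# are assumed for demonstration, not taken from the ROMS implementation.
def erosion_flux(tau_b, tau_ce=0.1, M=5e-4):
    """Erosion flux (kg m-2 s-1) when bed stress tau_b exceeds critical tau_ce (Pa)."""
    return M * max(tau_b / tau_ce - 1.0, 0.0)

def deposition_flux(tau_b, conc, w_s=0.5e-3, tau_cd=0.08):
    """Deposition flux (kg m-2 s-1) of suspended mass conc (kg m-3) settling at w_s (m/s)."""
    return w_s * conc * max(1.0 - tau_b / tau_cd, 0.0)

# Example: net flux to the bed over a range of bed shear stresses.
for tau_b in (0.02, 0.08, 0.15, 0.4):
    net = deposition_flux(tau_b, conc=0.05) - erosion_flux(tau_b)
    print(f"tau_b = {tau_b:4.2f} Pa -> net flux to bed = {net:+.2e} kg m-2 s-1")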
We present a method that reconstructs daily snow thermal conductivities from air and ground temperature measurements, recovering conductivity values over the entire snow season. Using the reconstructed snow conductivities improves modeling of ground surface temperatures, and simulating ground surface temperatures with time-varying snow thermal conductivities could potentially reduce ground temperature modeling uncertainty.
The developed method was applied to four permafrost observation stations in Alaska. Reconstructed snow thermal conductivity time series for the interior Alaska stations revealed low conductivity values that reached their maximum towards the end of the snow season, while the northern stations showed high conductivity values that reached their maximum towards the middle of the snow season. The differences in snow conductivities between interior and northern stations are most likely due to wind compaction, which is more pronounced in the Northern Arctic lowlands of Alaska.
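To illustrate the general idea of recovering an effective snow conductivity from temperature records, the sketch below performs a quasi-steady flux-matching calculation between the snowpack and a shallow ground layer; it is a conceptual illustration with assumed values, not the authors' reconstruction method.

# Conceptual sketch only: estimate an effective snow thermal conductivity by
# matching quasi-steady conductive heat fluxes through the snowpack and a
# shallow ground layer. All values are assumed; this is not the authors' algorithm.
def effective_snow_conductivity(T_air, T_ground_surface, T_ground_depth,
                                snow_depth, ground_depth, k_ground):
    """Solve k_snow*(T_gs - T_air)/h_snow = k_ground*(T_gd - T_gs)/h_ground for k_snow."""
    flux_into_surface = k_ground * (T_ground_depth - T_ground_surface) / ground_depth
    gradient_in_snow = (T_ground_surface - T_air) / snow_depth
    return flux_into_surface / gradient_in_snow

# Example daily values (assumed): cold air, milder ground, 0.4 m of snow.
k_snow = effective_snow_conductivity(T_air=-25.0, T_ground_surface=-5.0,
                                     T_ground_depth=-2.0, snow_depth=0.4,
                                     ground_depth=1.0, k_ground=1.5)
print(f"Effective snow thermal conductivity: {k_snow:.2f} W m-1 K-1")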
We provide a simple introduction to the Scientific Variables Ontology (SVO) and show how it can be used to tag scientific models and data with information about the scientific variables used by or contained in these types of resources. We demonstrate the application of SVO to a variety of domains by providing examples from CSDMS standard names, CF standard names, and NWIS parameter codes.
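A minimal illustration of what such tagging can look like in practice is given below; the internal variable names and the mapping itself are hypothetical examples rather than an authoritative SVO export.

# Hypothetical illustration of tagging a model's internal variables with
# standard-name-style identifiers. The mappings are examples only and are not
# an authoritative SVO, CSDMS, CF, or NWIS listing.
variable_tags = {
    # internal model name     example standard-style name (illustrative)
    "precip_rate": "atmosphere_water__precipitation_leq-volume_flux",   # CSDMS-style
    "surface_temp": "land_surface__temperature",                        # CSDMS-style
    "air_temp_2m": "air_temperature",                                   # CF-style
}

def lookup_tag(internal_name):
    """Return the standard-style name for a model variable, if one is registered."""
    return variable_tags.get(internal_name, "unmapped")

for name in ("precip_rate", "air_temp_2m", "soil_moisture"):
    print(f"{name:>15s} -> {lookup_tag(name)}")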
What are the topographic, thermal and hydrologic conditions setting slope stability in frozen and thawing landscapes? We address this question with past and present records to inform models to predict future landscape change. Relict periglacial landscapes and slope deposits constrain the timing and magnitude of slope instabilities in past glaciations. Using sediment records from these deposits, we show how hillslope denudation varies as a function of climate at both the Last Glacial Maximum and previous Pleistocene glaciations. In central Pennsylvania, organic geochemistry and plant macrofossils provide ecological constraints on depositional environments and climate conditions in an upland bog with periglacial sedimentation. Nearby, we use cosmogenic isotopes to constrain erosion rates and depositional ages of periglacial debris in toeslope deposits. Remote sensing and field surveys in western Alaska in summer 2019 will document the topographic and hydrologic controls on modern slope stability, as well as the accumulation rate of sediment in the past. Coupled landscape evolution and permafrost models should be capable of both hindcasting climate conditions from past sedimentology and forecasting slope stability in modern permafrost landscapes. Such models will require soil mobility to be linked to frozen and unfrozen water content, to be developed in collaboration with CSDMS researchers.
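One conceivable way to link soil mobility to frozen and unfrozen water content, as called for above, is to scale a hillslope transport coefficient by a thaw index; the sketch below is a speculative illustration of that coupling, not an existing CSDMS or Landlab component.

# Speculative sketch: 1-D hillslope diffusion in which the transport coefficient
# scales with the unfrozen (thawed) fraction of the soil column. The scaling and
# all parameter values are assumptions for illustration.
import numpy as np

nx, dx, dt = 101, 2.0, 1.0                 # nodes, spacing (m), time step (yr)
x = np.arange(nx) * dx
z = 10.0 * np.exp(-((x - x.mean()) / 30.0) ** 2)   # initial hill profile (m)
k_max = 0.01                               # transport coefficient when fully thawed (m^2/yr)

def unfrozen_fraction(year):
    """Assumed thaw index in [0, 1]; in practice this would come from a permafrost model."""
    return 0.3 + 0.2 * np.sin(2.0 * np.pi * year / 100.0)

for year in range(5000):
    k = k_max * unfrozen_fraction(year)    # soil mobility limited by frozen ground
    flux = -k * np.diff(z) / dx            # downslope soil flux
    dzdt = -np.diff(flux) / dx             # flux divergence on interior nodes
    z[1:-1] += dzdt * dt                   # fixed elevations at both boundaries

print(f"Hilltop lowered from 10.00 m to {z.max():.2f} m after 5000 yr")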
What is the impact of boat-wake generated waves on lakebed sediment? Are large wakeboarding waves (up to 0.5 m in height) driving significant sediment resuspension or transport? Using Delft3D, we investigate the role of boat-wake waves in comparison to wind-waves in driving sediment transport and deposition in East Pond, Belgrade, ME. We approximate boat wakes using a spatially varying pressure field applied over our simulation period. We validate our numerical model using field measurements of wave heights and near-bed velocity under different boat wakes. We then test the relative importance of boat wakes in driving morphodynamic change of East Pond in comparison to wind-generated waves, given the frequency of use of wakeboarding and water skiing boats compared to the yearly wind climate. Even under the largest boat wakes (wake surfing), velocities at the bed are minimal (< 10 cm/s) in both shallow and deep water (4 or 7 m), but wave heights do reach up to 30 cm in deeper waters. Our analyses provide a method for estimating natural vs anthropogenic wave impacts on lake sedimentation and long-term water quality.
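The reported near-bed velocities can be cross-checked against linear wave theory; the sketch below computes the near-bed orbital velocity for a 30 cm wake wave at the two measurement depths, with the wave period being an assumed value that is not given in the abstract.

# Cross-check with linear wave theory: near-bed orbital velocity for a wake wave.
# The wave period is assumed; only the height and depths come from the text.
import numpy as np

def wavenumber(T, h, g=9.81):
    """Solve the linear dispersion relation omega^2 = g*k*tanh(k*h) by iteration."""
    omega = 2.0 * np.pi / T
    k = omega**2 / g                      # deep-water first guess
    for _ in range(50):
        k = omega**2 / (g * np.tanh(k * h))
    return k

H, T = 0.30, 2.0                          # wave height (m) and assumed period (s)
for h in (4.0, 7.0):                      # shallow and deep measurement depths (m)
    k = wavenumber(T, h)
    u_bed = np.pi * H / (T * np.sinh(k * h))   # near-bed orbital velocity amplitude
    print(f"depth {h:.0f} m: u_bed = {100 * u_bed:.1f} cm/s")

With the assumed 2 s period, both depths give near-bed orbital velocities well below 10 cm/s, consistent with the minimal bed velocities reported above; longer-period wakes would produce larger values.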
When a layer of particle-laden fresh water is placed above clear, saline water, both Rayleigh-Taylor and double-diffusive instabilities may arise. In the absence of salinity, the dominant parameter is the ratio of the particle settling velocity to the viscous velocity scale. As long as this ratio is small, particle settling has a negligible influence on the instability growth. However, when the particles settle more rapidly than the instability grows, the growth rate decreases in inverse proportion to the settling velocity. In the presence of a stably stratified salinity field, this picture changes dramatically. An important new parameter is the ratio of the height of the nose region that contains both salt and particles to the thickness of the salinity interface. If this ratio is small (large), the dominant instability mechanism will be double-diffusive (Rayleigh-Taylor). In contrast to situations without salinity, particle settling can have a destabilizing effect and significantly increase the growth rate. Scaling laws obtained from the linear stability results are seen to be consistent with experimental observations and theoretical arguments put forward by other authors.
When a tree falls into a river, it becomes instream large wood and promotes fundamental changes in river hydraulics and morphology, playing an important role in river ecology. By interacting with the flow and sediment, instream large wood (i.e., downed trees, trunks, root wads and branches) contributes to maintaining the river's physical and ecological integrity. However, large quantities of wood can be transported and deposited during floods, enhancing the adverse effects of flooding at critical sections such as bridges. Accurate predictions of large wood dynamics in terms of fluxes, depositional patterns, trajectories, and travel distance still need to be improved, and observations remain scarce. Only recently have numerical models begun to help in this regard.
In contrast to other fluvial components such as fluid flow and sediment, for which numerical models have been extensively developed and applied over decades, numerical modelling of wood transport is still in its infancy. In this talk, I will describe the most recent advances and challenges related to the numerical modelling of instream large wood transport in rivers, focusing on the numerical model Iber-Wood. Iber-Wood is a two-dimensional computational fluid dynamics model that couples an Eulerian approach for hydrodynamics and sediment transport to a discrete element (i.e., Lagrangian) approach for wood elements. The model has been widely validated against flume and field observations, applied to several case studies, and shown to accurately reproduce wood trajectories, patterns of wood deposition, and impacts of wood accumulations during floods.
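For readers unfamiliar with the Lagrangian side of such a coupling, the sketch below advects a single wood element through a prescribed depth-averaged velocity field using a purely kinematic rule; it is a toy illustration, not the Iber-Wood formulation, which also accounts for drag, friction, incipient motion, and interactions among wood pieces.

# Toy Lagrangian sketch: advect one wood element's center of mass through a
# prescribed depth-averaged flow field. This is not the Iber-Wood dynamics,
# which include drag, bed friction, incipient-motion criteria, and collisions.
import numpy as np

def flow_velocity(x, y):
    """Assumed steady 2-D velocity field (m/s): faster flow toward the channel center."""
    u = 1.0 + 0.5 * np.exp(-((y - 50.0) / 20.0) ** 2)   # streamwise
    v = 0.05 * np.sin(x / 40.0)                          # weak cross-stream component
    return u, v

pos = np.array([0.0, 35.0])     # initial log position (m), assumed
dt, t_end = 1.0, 300.0          # time step and duration (s)

trajectory = [pos.copy()]
for _ in range(int(t_end / dt)):
    # Midpoint (RK2) integration of dx/dt = flow velocity at the log position.
    u1, v1 = flow_velocity(*pos)
    mid = pos + 0.5 * dt * np.array([u1, v1])
    u2, v2 = flow_velocity(*mid)
    pos = pos + dt * np.array([u2, v2])
    trajectory.append(pos.copy())

trajectory = np.array(trajectory)
print(f"Travel distance after {t_end:.0f} s: "
      f"{trajectory[-1, 0] - trajectory[0, 0]:.0f} m downstream")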
When oil spills occur in marine environments, the oil droplets, marine snow, and mineral grains can combine to form Oil Mineral Aggregates (OMAs), which have a wide range of settling velocities and densities. As a result, their properties can strongly influence the eventual fate of the oil. As part of the Consortium for Simulation of Oil-Microbial Interactions in the Ocean (CSOMIO), we evaluated the role of turbidity in partitioning oil into OMAs by incorporating flocculation and aggregation processes into the Community Sediment Transport Modeling System (CSTMS) within the Coupled Ocean-Atmosphere-Wave-and-Sediment Transport (COAWST) modeling framework. Specifically, an existing size-class based aggregation and fragmentation model (FLOCMOD) was adopted to examine the impact of oil on the vertical transport of sediment. FLOCMOD acts as a population balance flocculation model and allows particle exchanges through aggregation, shear breakup, and collision breakup. Our one-dimensional SED_FLOC_TOY model represented a muddy 50-m deep site on the northern Gulf of Mexico continental shelf. It was driven by horizontally uniform, steady currents, salinity, and temperature extracted from a three-dimensional hydrodynamic model. The initial sediment distribution was split among 11 floc size classes (ranging from 1 to 1024 micron diameter). Sediment was input at the top of the water column to represent fallout from a freshwater plume. Flocculation processes removed mass from the smaller and larger classes through aggregation and breakup, which resulted in a net increase in sediment mass in the middle sizes. FLOCMOD’s collision and breakup efficiencies were parameterized to represent the presence or absence of oil. Sensitivity tests indicated that increasing the collision efficiency and decreasing the breakup efficiency decreased total suspended sediment mass by 40%. FLOCMOD was computationally expensive: in this test case, computation slowed by a factor of 1.5 after incorporating the aggregation and fragmentation processes.
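A drastically simplified, mass-conserving caricature of the size-class bookkeeping is sketched below: each step moves a fraction of the mass up one class (aggregation) and down one class (breakup). The exchange rates are arbitrary assumptions and the scheme is far simpler than FLOCMOD's kernels; it only illustrates how mass is shuffled among classes while being conserved.

# Toy population-balance bookkeeping for floc size classes. Mass moves up one
# class by "aggregation" and down one class by "breakup" at assumed constant
# rates; this only illustrates mass conservation, not FLOCMOD's physics.
import numpy as np

n_classes = 11
mass = np.zeros(n_classes)
mass[[0, -1]] = 0.5                 # start with mass in the smallest and largest classes

agg_rate, brk_rate, dt = 0.02, 0.01, 1.0   # assumed fractional exchange rates per step

for _ in range(500):
    to_larger = agg_rate * dt * mass[:-1]   # aggregation: class i -> i+1
    to_smaller = brk_rate * dt * mass[1:]   # breakup: class i -> i-1
    mass[:-1] += to_smaller - to_larger
    mass[1:] += to_larger - to_smaller

print("total mass (conserved):", round(mass.sum(), 6))
print("mass-weighted mean size class:",
      round(np.average(np.arange(n_classes), weights=mass), 2))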
When we build models we create worlds that we hope will inform us about the world in which we live. We hope models will help us understand processes, causes and effects; avoid difficulties; benefit human endeavors; and accommodate and nurture the ecology which has its own beauty and importance, and upon which human existence and our economy depend. Here we discuss how models can be used to achieve these goals by considering the importance of transparency (revealed importance) and refutability (tested hypotheses). We consider models with substantial execution times (for our example, one model run requires 20 minutes) and the transparency and refutability attainable using computationally frugal methods. Challenges of using these methods include model nonlinearity; non-Gaussian errors and uncertainties in observations, parameters, and predictions; and integrating information from multiple data types and expert judgment. A synthetic test case illustrates the importance of transparency and refutability in model development. The test case represents transport of an environmental tracer (CFC) and contaminant (PCE) in a groundwater system with large-scale heterogeneities. Transparency is served by identifying important and unimportant parameters and observations. The frugal methods identified consistently important and unimportant parameters for three sets of parameters for which the sum of squared weighted residuals (SOSWR; dimensionless; constructed with error-based weighting) varies between 5606 and 92. Observations important to the parameter values are largely consistent, but the order varies for results using different parameter values because of model nonlinearity. For each set of parameters these results required 17 model runs. Refutability is served by estimating parameter values that minimize SOSWR and evaluating the resulting model fit and parameter values. The computationally frugal parameter-estimation method reduced SOSWR from 5606 to 92, displayed no evidence of local minima, and required about 100 model runs each of the 10 times it was executed. The similar important parameters and observations for different parameter sets and the performance of parameter estimation suggest the utility of the computationally frugal methods even for models as nonlinear as the one considered here. The value of such insights is highlighted by the tens of thousands to millions of model runs being conducted in many studies to obtain them.
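For reference, the objective function used above (SOSWR with error-based weighting) is a weighted least-squares sum; a minimal sketch with made-up numbers is given below.

# Minimal sketch of the sum of squared weighted residuals (SOSWR) with
# error-based weighting (weight = 1 / standard deviation of observation error).
# Observed and simulated values below are made up for illustration.
import numpy as np

observed = np.array([12.4, 9.8, 15.1, 7.3])      # e.g., concentrations or heads
simulated = np.array([11.9, 10.6, 14.2, 7.9])    # model equivalents
obs_error_std = np.array([0.5, 0.5, 1.0, 0.3])   # observation error standard deviations

weights = 1.0 / obs_error_std
soswr = np.sum((weights * (observed - simulated)) ** 2)
print(f"SOSWR = {soswr:.1f} (dimensionless)")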
While many researchers have mapped and tracked coastal erosion in the Yellow River Delta, determining its cause has proven nearly impossible, because myriad natural and anthropogenic processes are simultaneously affecting the delta. These processes include reduced sediment supply, reduced river discharge, changing tide and current patterns, new seawalls, groundwater withdrawal, substrate compaction, oil extraction, burgeoning urban centers, and rising sea level. Here, we use Interferometric Synthetic Aperture Radar (InSAR) to map surface deformation in the delta between the years 2007 and 2011. We find that rapid, localized subsidence of up to 22 cm/y is occurring along the coast, apparently related to groundwater extraction at aquaculture facilities. This finding has important consequences for the sustainability of the local aquaculture industry. Similar subsidence may also be occurring in deltas like the Mekong, though these signals may be difficult or impossible to measure.
While the findability, accessibility, interoperability, and reusability (FAIR) principles have been well-established for data, their application to research software (RS) remains inconsistent across disciplines. Research software is essential to advancing our understanding of Earth systems; however, without agreement on and consistent application of FAIR principles, sharing, reproducing, or expanding on scientific results is challenging. Aligning research software with FAIR principles enhances its impact and ensures long-term usability. Numerous frameworks exist (e.g., FAIR4RS, FAIRShare, Howfairis, FAIRSoft Evaluator), leveraging approaches that range from questionnaires to semi-automated workflows. Each of these efforts has systematic gaps that hinder practical implementation, and due to the fragmentation of these efforts and the lack of adoption of standards, it is challenging to combine them into a more complete solution. Our work addresses this by first evaluating these existing frameworks, identifying strengths and limitations, and developing methodologies to improve the assessment and adoption of FAIR principles in hydrology-specific research software. In doing so, we aim to address three key questions: 1) How can domain-specific knowledge shape community adoption of FAIR4RS principles? 2) What are the critical metadata components needed to enhance FAIR4RS compliance in the hydrologic sciences? 3) How can automated tools and technologies facilitate FAIR4RS assessments and provide actionable recommendations for improving FAIR4RS compliance? In this work, we establish a methodology for representing FAIR principles in community-adopted, structured metadata representations such as Schema.org and CodeMeta that builds on the aforementioned efforts. We evaluate our approach on community model repositories using a lightweight Python framework that quantifies the FAIR alignment of community-developed research software within the water science community. This enables automated assessment of FAIR principles and generates a quality compliance report that helps identify gaps in metadata representation. By aligning our work with ongoing efforts from the Open Modeling Foundation, community metadata standards groups, and academic research, we offer methods for aligning and evaluating the FAIRness of research software in an extensible manner that can be adopted across multiple science domains.
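A lightweight check of the kind described above can be as simple as testing a CodeMeta record for a set of recommended fields; the field list and scoring below are illustrative choices, not an official FAIR4RS metric or the framework presented in this work.

# Illustrative FAIR-style metadata check: score a CodeMeta record by the
# presence of a few recommended fields. The field list and scoring scheme are
# illustrative choices, not an official FAIR4RS or Schema.org requirement.
import json

RECOMMENDED_FIELDS = [
    "name", "description", "license", "codeRepository",
    "version", "author", "programmingLanguage", "keywords",
]

def score_codemeta(codemeta: dict) -> float:
    """Return the fraction of recommended fields that are present and non-empty."""
    present = [f for f in RECOMMENDED_FIELDS if codemeta.get(f)]
    missing = sorted(set(RECOMMENDED_FIELDS) - set(present))
    if missing:
        print("Missing or empty fields:", ", ".join(missing))
    return len(present) / len(RECOMMENDED_FIELDS)

# Example record (would normally be loaded from a repository's codemeta.json).
example = json.loads("""{
    "name": "example-hydro-model",
    "description": "A demonstration hydrologic model.",
    "license": "https://spdx.org/licenses/MIT",
    "codeRepository": "https://example.org/example-hydro-model",
    "programmingLanguage": "Python"
}""")
print(f"Metadata completeness: {score_codemeta(example):.0%}")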
Wildfires occur across diverse terrestrial landscapes and are widely studied for their socio-economic consequences. However, their role in triggering erosion and controlling sediment dynamics is less well understood. Wildfires reduce canopy and ground cover and disrupt soil infiltration capacity, thus increasing erosion by runoff and channel incision. Wildfire-induced erosion can change soil depth, possibly exposing less weathered material and increasing soil production rates. Despite extensive research into the effects of fire on hydrological and geomorphic processes, studies examining how wildfire-induced changes in soil production rates influence catchment-wide sediment fluxes are still lacking. Here, we propose to use the Landlab modeling framework to explore how wildfire regimes and soil production interact to control sediment mobilization in a landscape. Pulses of sediment can be triggered after a single fire; however, in areas with frequent wildfires, sediment supply may be limited by soil production, controlling the transition from soil-mantled to bedrock landscapes. We investigate how different wildfire regimes influence sediment production and transport over decadal to millennial timescales. We expect our modeling results to help quantify how soil production rates respond to different wildfire regimes, shaping sediment dynamics and ultimately controlling overall soil thickness.
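The soil-production feedback described above can be caricatured with a single-column soil-thickness balance; the exponential production law is a standard form, while the fire timing, erosion pulses, and all parameter values below are assumptions for illustration.

# Single-column caricature of the soil thickness balance under recurring fires:
# dH/dt = production(H) - erosion(t). The exponential production law is a
# standard form; fire timing, erosion pulses, and parameters are assumptions.
import numpy as np

rng = np.random.default_rng(1)
dt, n_years = 1.0, 5000
P0, H_star = 1e-4, 0.5          # max production rate (m/yr) and decay depth (m)
background_E = 3e-5             # background erosion rate (m/yr)
fire_interval = 50              # mean fire recurrence interval (yr)
fire_pulse = 5e-3               # extra erosion in a post-fire year (m)

H = 0.6                         # initial soil thickness (m)
for year in range(n_years):
    production = P0 * np.exp(-H / H_star)
    erosion = background_E
    if rng.random() < 1.0 / fire_interval:      # a fire occurs this year
        erosion += fire_pulse
    H = max(H + (production - erosion) * dt, 0.0)

print(f"Soil thickness after {n_years} yr: {H:.2f} m")

With these assumed rates, the long-term fire-enhanced erosion slightly exceeds the maximum soil production rate, so the column thins toward bedrock, illustrating the supply limitation discussed above.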
Will be sent before April 01, 2023
Wind-swept snow self-organizes into bedforms. These bedforms affect local and global energy fluxes, but have not been incorporated into Earth system models because the conditions governing their development are not well understood. We created statistical classifiers, drawn from 736 hours of time-lapse footage in the Colorado Front Range, that predict bedform presence as a function of wind speed and time since snowfall. These classifiers provide the first quantitative predictions of bedform and sastrugi presence in varying weather conditions.
The flat snow surfaces we saw were all short-lived. The probability that a surface remained flat, rather than bedform-covered, decreased with time and with the average shear stress exerted on the surface by the wind.
The most persistent snow features were an erosional bedform known as sastrugi. The likelihood that a surface was covered by sastrugi increased with time and with the highest wind speeds experienced by the surface.
These results identify the weather variables that have the strongest effect on snow surfaces. We expect that these variables will inform and feature in future process-based models of bedform growth. Our observations therefore represent a first step towards understanding a self-organized process that ornaments 8% of the surface of the Earth.
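A classifier of the kind described can be illustrated with a logistic regression on the two predictors named above (wind speed and time since snowfall); the training data below are synthetic stand-ins, not the time-lapse observations.

# Illustrative classifier: logistic regression predicting bedform presence from
# wind speed and time since snowfall. The training data are synthetic stand-ins
# generated here, not the Colorado Front Range time-lapse observations.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 736
wind_speed = rng.uniform(0.0, 20.0, n)          # m/s
hours_since_snow = rng.uniform(0.0, 72.0, n)    # hours

# Synthetic rule: bedforms become likely with stronger wind and more elapsed time.
logit = 0.4 * wind_speed + 0.05 * hours_since_snow - 5.0
bedforms_present = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([wind_speed, hours_since_snow])
clf = LogisticRegression().fit(X, bedforms_present)

# Predicted probability of bedforms for a calm, fresh surface vs. a windy, aged one.
for ws, hrs in ((3.0, 2.0), (15.0, 48.0)):
    p = clf.predict_proba([[ws, hrs]])[0, 1]
    print(f"wind {ws:4.1f} m/s, {hrs:4.0f} h since snowfall -> P(bedforms) = {p:.2f}")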
Woody Plant Encroachment (WPE), an increase in the density, cover, and biomass of trees or shrubs in native grasslands, has been observed to be a major cause of dramatic changes in the arid and semiarid grasslands of the southwestern US over the last 150 years. Driven by overgrazing, reduced fire frequency, and climate change, WPE is considered a major form of desertification. In Landlab, ecohydrologic plant dynamics, wildland fires, grazing, and resource distribution (erosion/deposition) are represented in separate components. Landlab has two existing cellular automata ecohydrology models, built using these components, to study the impacts of WPE on the evolution of vegetation patterns. In the first model, a physically based vegetation dynamics model is used to simulate biomass production based on local soil moisture and potential evapotranspiration driven by daily simulated weather, coupled with cellular automata plant establishment and mortality rules. In this model, the spatial dynamics of disturbance propagation (e.g., fire spread and intensity) are not explicitly modeled. In the second model, a simple stochastic cellular automata model with two state variables, vegetation cover and soil resource storage, is used to model resultant vegetation patterns based on a probabilistic establishment-mortality interplay, mediated by post-disturbance resource redistribution, while explicit roles of climate are neglected. In this work, we coupled these two models to investigate the role of disturbances (fire and grazing) in a climate-driven, dynamic ecohydrologic context. In the coupled model, a daily-weather-driven, physically based vegetation dynamics model is coupled with a cellular automata plant establishment model that explicitly simulates spatial disturbance dynamics. The effects of encroachment factors and model complexity on resultant vegetation patterns are studied.
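The establishment-mortality interplay at the core of both cellular automata models can be illustrated with a minimal stochastic grid update; the probabilities and neighborhood rule below are arbitrary assumptions, not the parameterizations of the Landlab components.

# Minimal stochastic cellular-automaton sketch of woody plant establishment and
# mortality on a grid. Probabilities and the neighborhood rule are arbitrary
# assumptions, not the parameterizations of the Landlab components.
import numpy as np

rng = np.random.default_rng(3)
side, n_steps = 100, 200
woody = rng.random((side, side)) < 0.05       # True = woody cell, False = grass/bare

p_mortality = 0.02                            # chance an occupied cell dies per step
p_seed_per_neighbor = 0.01                    # establishment chance per woody neighbor

for _ in range(n_steps):
    # Count woody neighbors (4-cell von Neumann neighborhood, non-periodic edges).
    neighbors = np.zeros_like(woody, dtype=int)
    neighbors[1:, :] += woody[:-1, :]
    neighbors[:-1, :] += woody[1:, :]
    neighbors[:, 1:] += woody[:, :-1]
    neighbors[:, :-1] += woody[:, 1:]

    establish = (~woody) & (rng.random(woody.shape) < p_seed_per_neighbor * neighbors)
    die = woody & (rng.random(woody.shape) < p_mortality)
    woody = (woody | establish) & ~die

print(f"Woody cover after {n_steps} steps: {woody.mean():.1%}")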
