Property:CSDMS meeting abstract

From CSDMS

This is a property of type Text.

Showing 20 pages using this property.
Globally, more people are impacted by floods than by all other forms of natural disaster combined. In global megacities, defined by the United Nations as cities with a population of over ten million, increased human exposure to flooding is both ubiquitous and extremely difficult to characterize. Over the past three decades, most of these cities have experienced gradual or very rapid growth as the global population continues to urbanize. As both urban expansion and global climate change contribute to hydrologic intensification, and as more people globally now live in urban areas than rural ones, the need to assess both the drivers and the magnitude of flood risk associated with rapid growth in megacities is of critical humanitarian concern. Through a multitemporal analysis (2000, 2010, and 2020) of urban growth modes and urban landscape change detection using the Landsat dataset (ETM+, OLI), we use machine learning to estimate the growth rates and development patterns in ten global megacities (Guangzhou, Tokyo, Lagos, Jakarta, Delhi, Manila, Mumbai, Seoul, Mexico City, New York) representing different global climate zones. Trends in runoff magnitudes over the period are quantified and associated with urban expansion and non-stationarity in regional historical precipitation patterns. Preliminary results show that all ten cities have experienced major flooding within the last ten years, mostly as a result of heavy rainfall.
Globally, the occurrence of extreme hydrologic events such as flooding is known to be the widespread aftermath of torrential rain, and the impacts are adverse and devastating in built areas in proximity to water bodies. An example is the 2012 and 2022 flooding along the Niger and Benue rivers in Nigeria. While Nigeria experiences seasonal flooding during the rainy season, the decadal interval between these two catastrophic flood events and the similarities between the natural and anthropogenic conditions responsible for their occurrence prompted this study. Additionally, some hydrologic characteristics and attributes of these flood events are yet to be evaluated. Hence, for the 2012 and 2022 floods, we estimated and compared the floodwater depths at different sections of the Niger and Benue Rivers using the Floodwater Depth Estimation Tool (FwDETv2.0 and FwDETv2.1) implemented in Google Earth Engine, Jupyter Notebook, and ArcGIS Pro. Since this algorithm requires only minimal input (a flood inundation map and a Digital Elevation Model), which favors data-sparse regions such as Nigeria, we assessed the potential of the FwDET tool to automatically quantify floodwater depth, an important variable in flood intensity estimation. This tool could be invaluable in flood management and mitigation studies along the rivers.
Graphics Processing Units (GPUs) have been shown to be very successful in accelerating simulation in many fields. When they are used to accelerate simulation of earthquakes and tsunamis, a big challenge comes from the use of adaptive mesh refinement (AMR) in the code, often necessary for capturing dynamically evolving small-scale features without excessive resolution in other regions of the domain. Clawpack is an open source library for solving general hyperbolic wave-propagation problems with AMR. It is the basis for the GeoClaw package used for modeling tsunamis, storm surge, and floods. It has also been used for coupled seismic-tsunami simulations. Recently, we have accelerated the library with GPUs and observe a speed-up of 2.5 in a benchmark problem using AMR on an NVIDIA K20 GPU. Many functions that facilitate the execution of computing kernels are added. Customized, CPU thread-safe memory managers are designed to manage GPU and CPU memory pools, which is essential for eliminating the overhead of memory allocation and de-allocation. A global reduction is conducted on each AMR grid patch for dynamically adjusting the time step. To avoid copying fluxes at cell edges back from GPU memory to CPU memory, the conservation fixes required between patches on different levels are also conducted on the GPU. Some of these kernels are merged into bigger kernels, which greatly reduces the overhead of launching CUDA kernels.
High quality Digital Elevation Models (DEMs) do not exist for coastal wetlands prior to the widespread use of aerial LiDAR beginning in the early 2000s. This makes it difficult to develop models that capture the historical evolution of specific coastal marshes, creating a challenge in communications between the modeling community and wetland managers who seek to understand model outputs in the context of their experience, observations, history of management decisions, and perception of risk. The project team is working with managers at four coastal wetlands to advance a method that will fill this data gap using historical remotely sensed imagery, historical in-situ observations, and machine learning. The team will compile Landsat imagery collected within one year of an existing high quality DEM. The suites of Landsat imagery will be processed to produce maps showing inundation frequency based on the Normalized Difference Water Index (NDWI), and these will be used as training data for a deep learning image segmentation model that relates inundation frequency to wetland elevation. The segmentation model will then be validated with observational data and applied to the period before DEMs are widely available but during which Landsat sensors are consistent with today’s standards (i.e., 1984 to the present).
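The NDWI-to-inundation-frequency step described above can be sketched in a few lines. This is an illustrative reconstruction, not the project's actual processing chain: the band values, the 0.0 water threshold, and the pure-Python data layout are all assumptions.

```python
# Sketch of mapping inundation frequency from NDWI over a stack of Landsat
# scenes. Threshold and reflectance values are illustrative assumptions.

def ndwi(green, nir):
    """Normalized Difference Water Index for one pixel."""
    return (green - nir) / (green + nir)

def inundation_frequency(scenes, threshold=0.0):
    """Fraction of scenes in which each pixel is classified as water.

    scenes: list of (green, nir) band pairs, each a list of pixel reflectances.
    """
    n_pixels = len(scenes[0][0])
    counts = [0] * n_pixels
    for green, nir in scenes:
        for i in range(n_pixels):
            if ndwi(green[i], nir[i]) > threshold:  # NDWI > 0 taken as water
                counts[i] += 1
    return [c / len(scenes) for c in counts]

# Two synthetic scenes, three pixels: pixel 0 is always wet, pixel 2 never is.
scenes = [([0.30, 0.20, 0.05], [0.10, 0.25, 0.40]),
          ([0.28, 0.30, 0.06], [0.12, 0.18, 0.35])]
freq = inundation_frequency(scenes)
```

The per-pixel frequency map produced this way is what would serve as a training target relating inundation frequency to elevation.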
Hourly precipitation for one historical (1991-2000) and two future periods (2031-2040 and 2071-2079) was generated using the Weather Research and Forecasting (WRF) Regional Climate Model (RCM). The climate simulations were conducted for the Southwest region of the United States at an hourly temporal and 10 km spatial grid resolution. The boundary forcing for the WRF model was developed by the Hadley Centre for Climate Prediction and Research/Met Office’s HadCM3 model with the A2 emission scenario. The precipitation from the RCM-WRF model was bias-corrected using observed data, and then used to quantify the impact of climate change on the magnitude and frequency of flood flow in the upper Santa Cruz River watershed (USCRW) in southern Arizona. The Computational Hydraulics and River Engineering two-dimensional (CHRE2D) model, a two-dimensional hydrodynamic and sediment transport model, was adapted for surface flow routing. The CHRE2D model was first calibrated using a storm event on July 15th, 1999, and then applied to the watershed for the three selected periods. The simulated annual maximum discharges in the two future periods were added to the historical records to obtain the flood frequency curve. Results indicate that the peak discharges of the 100-year, 200-year, and 500-year floods only increased slightly, and the increase is within the 90% confidence interval limits. Therefore, the flood magnitude and frequency curve does not change appreciably with the inclusion of projected future climate data for the study watershed.
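Deriving a flood frequency curve from a pooled series of annual maximum discharges can be illustrated with a Gumbel (EV1) fit by the method of moments. The abstract does not state which distribution was fitted, so the distribution choice and the sample data below are assumptions.

```python
import math

def gumbel_quantile(annual_maxima, return_period):
    """Discharge for the given return period from a Gumbel (EV1)
    distribution fitted by the method of moments (illustrative only)."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi   # scale parameter
    u = mean - 0.5772 * beta                # location (Euler-Mascheroni constant)
    p = 1.0 - 1.0 / return_period           # annual non-exceedance probability
    return u - beta * math.log(-math.log(p))

# Hypothetical annual maximum discharges (m^3/s): historical plus future years.
annual_max = [120.0, 95.0, 150.0, 110.0, 130.0, 105.0, 160.0, 90.0, 140.0, 115.0]
q100 = gumbel_quantile(annual_max, 100.0)   # 100-year flood estimate
```

Evaluating the quantile at several return periods (2, 100, 200, 500 years) traces out the flood frequency curve that the two simulated future periods are compared against.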
Hurricanes are among the costliest natural disasters impacting US coastal areas. Recent studies point toward an increase in damages caused by hurricanes, resulting from sea-level rise (SLR), possible hurricane intensification due to a warmer climate, and increasing coastal populations. SLR is one of the most significant aspects of climate change that will impact coastal areas. Besides geometric changes in coastal bays (i.e., deeper water and larger surface area), SLR is also expected to have substantial impacts on the patterns and processes of coastal wetlands, thereby affecting surge generation and propagation inside the bays. We analyzed the impacts of SLR on hurricane storm surges, structural building damage, and the population and businesses affected for coastal bays located on the Texas central coast. To evaluate the effects of SLR on surges, we considered its impacts on changes in land cover and bay geometry. The analyses were conducted using the hydrodynamic model ADCIRC and a wind and pressure field model (PBL) representing the physical properties of the historical Hurricane Bret and hypothetical storms. The effects of land cover change were represented within ADCIRC by changes in the frictional drag at the sea bottom and changes in momentum transfer from the wind to the water column caused by vegetation losses. Simulations were performed using a high-resolution unstructured numerical mesh to study surge response in communities along the coastal bays of Texas. First, we evaluated the impacts of land cover changes due to SLR on the surge response. Second, we evaluated the impacts of neglecting those land cover changes. Finally, we evaluated the overall effect of SLR on the mean maximum surge and the consequent extent of the flooded areas.
Although the overall impacts of SLR on surge (i.e., water elevation above mean water level) are highly dependent on storm conditions and specific locations within the study area, we showed that the mean maximum surge (the spatial average within each bay) increases with SLR. The overall mean maximum surge within the study area increased on average by approximately 0.1 m (SLR of 0.5 m) and 0.7 m (SLR of 2.0 m). Simulations neglecting land cover changes due to SLR significantly underestimated the expected structural damage to buildings. This difference increased with SLR and was affected by the storms' meteorological conditions: stronger and faster storms were associated with greater underestimation. Although considering land cover changes resulted in an overall damage increase, for SLR below 0.5 m this increase was almost negligible. As a result, the land cover changes arising from SLR are important for damage estimation in SLR scenarios of at least 0.5 m. For example, when considering an SLR of 0.6 m, based on the Intergovernmental Panel on Climate Change’s (2007) high emission scenario, we demonstrated a 10% increase in building structural damage. The assimilation of land cover changes is especially important when calculating expected damages for high SLR scenarios: if an SLR of 2.0 m is assumed, a 35% increase in the expected structural damage to buildings is estimated. In summary, the changes in coastal bay geometry and land cover caused by SLR play an important role in the resulting surge response. The variability of the surge response is also greatly affected by location and the characteristics of the storm.
Hydrologic connectivity can change as climate changes, seasonally, or even after a single rain event. Here, I assess the depression structure of the topography of the United States and determine its capacity to hold surface water in lakes. I provide results from a simulation indicating the pre-industrial water level in these depressions and the resulting degree of hydrologic connectivity. I then share results from a series of experimental simulations that modify water levels and the resulting hydrologic connectivity across the country.
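Identifying a DEM's depression structure and the level to which each depression can hold water is commonly done with a priority-flood sweep inward from the grid edges. The toy grid below is a generic sketch of that idea, not the simulation code behind the abstract.

```python
import heapq

def fill_depressions(dem):
    """Priority-flood depression filling on a rectangular grid.
    Returns the water-filled surface; (filled - dem) gives lake depth.
    Generic illustration only."""
    rows, cols = len(dem), len(dem[0])
    filled = [[None] * cols for _ in range(rows)]
    heap = []
    # Seed the priority queue with all edge cells at their own elevation.
    for r in range(rows):
        for c in range(cols):
            if r in (0, rows - 1) or c in (0, cols - 1):
                filled[r][c] = dem[r][c]
                heapq.heappush(heap, (dem[r][c], r, c))
    # Grow inward, always from the lowest frontier cell.
    while heap:
        z, r, c = heapq.heappop(heap)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and filled[nr][nc] is None:
                # Water cannot rest below the spill level reached so far.
                filled[nr][nc] = max(dem[nr][nc], z)
                heapq.heappush(heap, (filled[nr][nc], nr, nc))
    return filled

# A 3x3 grid with a single interior pit of depth 4 below its rim.
dem = [[5, 5, 5],
       [5, 1, 5],
       [5, 5, 4]]
surface = fill_depressions(dem)
```

Raising or lowering the water level relative to each depression's spill elevation is then one way to probe how hydrologic connectivity changes across the landscape.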
IEDA (Integrated Earth Data Applications, www.iedadata.org) is a data facility funded through a contract with the US National Science Foundation to operate data systems and data services for solid earth geoscience data. There are many similarities between IEDA and its community of data producers and users and CSDMS and its community of model creators and users. IEDA has developed a comprehensive suite of data services that are designed to address the concerns and needs of investigators, especially researchers working in the 'Long Tail of Science' (Heidorn 2008). IEDA provides a data publication service, registering data sources (including models) with DOIs to ensure their proper citation and attribution. IEDA works with publishers on advanced linkages between datasets in the IEDA repository and scientific online articles to facilitate access to the data, enhance their visibility, and augment their use and citation. IEDA has also developed comprehensive investigator support that includes tools, tutorials, and virtual or face-to-face workshops that guide and assist investigators with data management planning, data submission, and data documentation. A relationship between IEDA and CSDMS benefits scientists from both communities by providing them with a broader range of tools and data services.
In Arctic landscapes, modern surface warming has significantly altered geomorphic process rates. Along the Beaufort Sea coastline bounding Alaska’s North Slope, the mean annual coastal erosion rate has doubled from ~7 m/yr for 1955-1979 to ~14 m/yr for 2002-2007. Locally the erosion rate reaches 30 m/yr. A robust understanding of the processes that govern the rate of erosion is required in order to predict the response of the coast and its adjacent landscape to a rapidly changing climate, with implications for sediment and carbon fluxes, oilfield infrastructure, and animal habitat. Bluffs in regions of ice-rich, silt-dominated permafrost are abundant on the Beaufort Sea coast. This type of coast is vulnerable to rapid erosion due to its high ice content and the small grain size of the bluff sediment. The bluff material at our study site near Drew Point is 64% ice, making the bluff susceptible to thermal erosion. Liberated sediment is removed from the system in suspension and does not form sheltering beaches or barrier islands that would provide a negative feedback to erosion. During the sea ice-free season, relatively warm waters abut the bluff and melt a notch into the 4-m tall bluffs. The bluffs ultimately fail by the toppling of polygonal blocks bounded by mechanically weak ice wedges that are spaced roughly 10-20 m apart. The blocks then temporarily armor the coast against further attack. We document the style and the drivers of coastal erosion in this region through simultaneous measurements of oceanic and atmospheric conditions and time-lapse imagery. We extract proxies for erosion rate from time-lapse imagery of both a degrading block and a retreating bluff from the summer of 2010, and compare the proxy record with environmental conditions and melt rate models. These observations verify that erosion proceeds dominantly by thermal incision of a notch, toppling of blocks, and subsequent melting of the ice in the blocks.
The annual retreat rate is governed by the length of the sea ice-free season, water and air temperatures, and the water level history, including both storm surge and wave height. Motivated by these observations, we developed a numerical model to capture the evolution of the permafrost bluffs on the North Slope. We honor the high ice content of the bluff materials and the role of toppled blocks in temporarily armoring the coast. We employ a positive degree day algorithm to drive subaerial melt, and a modified iceberg melting algorithm to determine the rate of notch incision. Our model is first applied to the 2010 coastal retreat history, and is then used to address field and remote sensing observations over a variety of timescales. Finally, we employ the model to explore expected changes in coastal retreat rates under a range of climate scenarios that include increases in the duration of sea ice-free conditions, warming ocean temperatures, and changes in storm frequencies.
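The positive degree day component mentioned above can be sketched compactly: subaerial melt is taken as proportional to the sum of daily mean temperatures above freezing. The degree-day factor below is an illustrative value, not the one calibrated in the study.

```python
def pdd_melt(daily_temps_c, degree_day_factor=0.8):
    """Total subaerial melt (cm of ice) from a positive-degree-day sum.

    daily_temps_c: daily mean air temperatures in deg C.
    degree_day_factor: cm of melt per positive degree-day (illustrative).
    """
    # Only temperatures above 0 deg C contribute to the degree-day sum.
    pdd = sum(max(t, 0.0) for t in daily_temps_c)
    return degree_day_factor * pdd

# Five days spanning a freeze-thaw period; negative days contribute nothing.
melt = pdd_melt([-2.0, 1.0, 3.5, 0.5, -1.0])
```

A notch-incision term driven by water temperature and contact time would play the analogous role for the submarine part of the bluff face.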
In China, permafrost is mainly distributed on the Qinghai-Tibet Plateau (QTP), the largest mid-low-latitude permafrost region in the world. Owing to its unique and extremely high altitude, the permafrost area on the QTP amounts to approximately 1.06 million km2. Permafrost on the QTP is one of the most sensitive indicators of global climate change, because it is a product of interactions between the earth and the atmosphere, and the active layer is the interface between them. Understanding the present condition of the active layer and the permafrost thermal state is the foundation for learning about hydrological cycles, infrastructure built on and in permafrost, soil carbon release and uptake, and biogeochemical and ecological processes in cold regions. Observations can depict the present state of permafrost, but models are ultimately essential for predicting its future changes. Although geophysical surveys and boreholes are the most reliable sources of information about permafrost, they are extremely costly and mostly available for relatively small regions. I have implemented the Geophysical Institute Permafrost Lab Version 2 (GIPL2) model on the QTP. The GIPL2 model can provide a fuller picture of the permafrost thermal state than statistical-empirical models. I am interested in applying the GIPL2 model to the QTP in order to determine the thermal state of its permafrost and its response to recent climate changes. Results of our present work using the original version of GIPL2 indicate that, for the whole permafrost area of the QTP, the simulated active layer thickness (ALT) ranges from 0 to 8 m, with an average of 2.30 m. The simulated ALT at 18 sites is generally underestimated compared with observed values, with an MBE of -0.14 m and an RMSE of 0.22 m.
In an ongoing NASA project, our team is producing enhanced global flood hazard maps from advanced modeling, remote sensing, and big data analytics. The innovation is that we couple long-term Water Balance Model (WBM) global-scale hydrologic flow simulations with the 2-D LISFLOOD-FP model to generate continental-scale flood inundation maps, which are then integrated with flood map information from the Dartmouth Flood Observatory (DFO), including their radiometry-based satellite discharge estimations, i.e., “River Watch”. These remotely sensed discharge stations will be employed to associate flow return periods with the DFO satellite flood maps (up to the 25-year floodplain), which can then be cross-validated against frequencies of inundation from the flood model's historic simulations. Furthermore, we collaborate with Google Inc. and use their Earth Engine platform for big data analytics, such as downscaling our model simulations of flood hazard to resolutions adequate for decision-makers. This poster will present first achievements for Australia, Africa, and CONUS, and discuss challenges and perspectives.
In complex systems, emergence occurs when a ‘new’ property arises at higher levels of organization that cannot be directly deduced from the behavior of constituent elements. While many geomorphic systems exhibit emergence, numerical models of surface processes typically address emergence by carefully selecting the appropriate spatio-temporal scale to parameterize the relevant physics, chemistry, and biology that is occurring at lower levels of organization. This is an effective strategy where finer-scale processes are either poorly constrained or intractable to model numerically. The concept of the geomorphic transport law reifies this strategy by adopting a ‘top down’ approach where surface processes are encoded into the set of partial differential equations chosen. However, as data resolution and computational power increase, there are new opportunities to build models that simulate processes from the ‘bottom up’. One such opportunity is in the simulation of biologically driven soil production and sediment transport. Biological systems exhibit some of the most compelling examples of emergence (e.g., insect societies, flocking behavior, fairy circles) that are readily simulated using Agent-Based Models (ABMs). Given that biota drive many of the most widely used geomorphic transport laws, it is worth taking stock of whether ABMs can provide new insights into surface process modeling. We present two promising examples where we think ABMs might provide new, testable predictions of soil production and sediment transport. The first example focuses on tree seeding, recruitment, growth, and death. Rules for soil production via tree root growth monotonically decrease with soil depth. However, because soil production in the model depends not only on individual tree root growth but also the probability of an unstressed tree growing at any given location, humped soil production functions emerge over the long-term. 
The second example focuses on one hypothesized mechanism for mima mound formation. Rules for burrowing organisms allow for preferential upslope transport of sediment into mounds while gravitational processes (i.e., creep) degrade mounds. Both examples highlight how ABMs help make rules for ecological dynamics explicit. Bulk coefficients common to conventional treatments of soil production and sediment transport laws are thus allowed to emerge from the empirically constrained rulesets that are used.  
In gravel-bedded rivers, bed material abrasion is a well-recognized control on the balance between fine and coarse sediment fluxes. We suggest that in some landscapes, abrasion may also be an important control on the morphodynamics of sediment pulses. Here, we employ a simple morphodynamic model to explore the extent to which bed material abrasion controls the downstream fate of sediment pulses in terms of transit time and the magnitude of response in channel bed elevation and grain size change. The Network Sediment Transporter (NST) is a Lagrangian 1-D morphodynamic model component that tracks bed sediment moving and interacting on a river network. The NST is implemented in Landlab, a Python-based package for modeling the Earth’s surface. The NST tracks ‘parcels’ of sediment (collections of grains of homogeneous size, density, etc.) as they are transported through the network, allowing us to explicitly tag and follow sediment originating in the mass wasting deposit and give that sediment unique abrasion characteristics. The model requires inputs describing channel morphology, flow, and bed sediment attributes. Here, we compare the results of a simple sediment pulse simulation without abrasion of the bed material to an identical pulse with abrasion rates equal to measurements made on a volcanic mass wasting deposit in the Cascade Range of Washington. The differences between pulse behavior with and without abrasion have implications for hazards in volcanic terrains, where channels are commonly subject to large mass wasting deposits of heterogeneous sedimentary characteristics. Understanding the fate of these large sediment pulses will improve our understanding of downstream channel aggradation and increases in flood frequency.
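The effect of abrasion on a transported parcel can be illustrated with a Sternberg-style exponential volume loss; the coefficient and parcel attributes below are illustrative values, not the measurements from the volcanic deposit or the NST's actual implementation.

```python
import math

def abrade(volume, diameter, alpha, distance):
    """Sternberg-style abrasion of one sediment parcel moved `distance` (km).

    volume decays exponentially with transport distance; grain diameter
    scales with the cube root of the volume ratio. alpha (1/km) and all
    inputs here are illustrative assumptions.
    """
    new_volume = volume * math.exp(-alpha * distance)
    new_diameter = diameter * (new_volume / volume) ** (1.0 / 3.0)
    return new_volume, new_diameter

# A 1 m^3 parcel of 10 cm gravel carried 5 km with alpha = 0.2 / km.
v, d = abrade(volume=1.0, diameter=0.1, alpha=0.2, distance=5.0)
```

Applying this per-parcel rule only to parcels tagged as originating in the mass wasting deposit is what lets a pulse carry its own abrasion characteristics through the network.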
In many areas of the world, the environment has been engineered to reduce variability (increase robustness) for human development. As much of the agricultural land in the central US is located in arid and semi-arid regions, agricultural practices depend on irrigation. Since the 1960s, thousands of fields have been watered using center-pivot irrigation, each of which requires about 800 gpm (4,361 m3/day) (New and Fipps, 2017). Groundwater-supported irrigation was dependable for decades, but now many areas of the High Plains aquifer, which is partly composed of the Ogallala aquifer, are at risk of depletion, and farming faces difficult circumstances. On the positive side, western Kansas has very high potential capacity for wind power production, but opportunities to use this locally produced energy to improve prospects for the farming community face scientific and engineering challenges, and communities are not aware of many potentially promising alternatives. The Food-Energy-Water (FEW) calculator is a tool designed to introduce new alternatives to these communities and the scientists, engineers, and governmental entities who support them. In this study, Agent-Based Modeling (ABM) is used to coordinate the many types of actors, information, and alternatives relevant to this problem. For more creative agricultural scenarios, a crop model, the Decision Support System for Agrotechnology Transfer (DSSAT), can be used to calculate crop yields and income. The resulting ABM-based FEW calculator provides a more realistic and effective framework for managing the complexity of the coupled human and natural system dynamics.
In most mountainous regions, reconstructed glacial histories are the primary record of past climate and are typically based on unsorted accumulations of debris (moraines) deposited at the terminus of glaciers. Former glacier geometries, preserved as moraines and trim lines, are the primary constraint for extracting paleoclimate estimates using either equilibrium-line altitudes or numerical glacier models. It is an implicit assumption in the glacial geology community that terminal moraines were formed by glaciers responding to the mean value of summer temperature and winter precipitation at the time of formation. In reality, glacier termini oscillate around a mean glacial length even in a steady climate, defined by a constant mean and constant standard deviation. These length oscillations are driven by the alignment of more negative (positive) periods of mass balance that arise out of random year-to-year climate variability. Because glaciers that override moraines almost always destroy them, the terminal moraines furthest from the headwall during the time period of interest represent the maximum excursion of the glacier from its mean length. This implies that paleoclimate estimates based upon the furthest terminal moraine are actually maximum estimates of climate change. We use a linearized glacier model developed by Roe and O’Neal (2009) to determine the mean length of eleven Last Glacial Maximum (LGM) glaciers in the northern Front Range, Colorado. Mean glacier lengths during the LGM were ~15% upvalley from the LGM terminal moraines. In the Colorado Front Range, estimating LGM paleoclimate from the furthest terminal moraine rather than the mean length adds an extra ~1°C of temperature change, or an additional 25% increase in precipitation, to estimates of the difference from the modern climate. Furthermore, it is possible that ‘recessional’ moraines were formed by length oscillations driven by interannual variability.
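The point that terminal moraines record maximum excursions rather than mean lengths can be demonstrated with a first-order (AR1) length response to white-noise mass balance, in the spirit of the linearized model of Roe and O’Neal (2009). All parameter values here are illustrative, not calibrated to the Front Range.

```python
import random

def glacier_length_anomalies(n_years, tau=30.0, beta=50.0, sigma_b=1.0, seed=7):
    """Length anomalies (m) about the mean glacier length from a first-order
    linear response to white-noise annual mass balance.

    tau: response time (yr); beta: length sensitivity (m per m w.e. of
    balance anomaly); sigma_b: interannual balance variability. Illustrative.
    """
    rng = random.Random(seed)
    dt = 1.0
    length = 0.0
    anomalies = []
    for _ in range(n_years):
        balance = rng.gauss(0.0, sigma_b)           # random annual anomaly
        length += dt * (beta * balance - length / tau)
        anomalies.append(length)
    return anomalies

anoms = glacier_length_anomalies(10000)
```

Even with a strictly stationary climate, the largest excursion over a long run substantially exceeds the typical (standard-deviation) fluctuation, which is why the outermost moraine overstates how far downvalley the mean glacier terminus sat.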
In recent years a large number of numerical models have been developed and implemented to study basic and applied problems of river morphodynamics. Some of these models treat the bed material as uniform; others consider the bed material a mixture of sand and gravel. The vast majority of the morphodynamic models that account for the non-uniformity of the bed material size are based on the active layer approximation, i.e., the channel bed deposit is divided into two different regions. The active layer, the topmost part of the bed deposit, is modeled as a mixed layer whose particles can interact with the bed material transport. Particles in the rest of the channel deposit, the substrate, can be exchanged with the bed material transport only when the channel bed aggrades or degrades. Morphodynamic formulations based on the active layer approximation, however, have well-known limitations: (1) they neglect the vertical fluxes within the deposit associated with, e.g., bedform migration; (2) they cannot capture the infiltration of fine sediment or tracer stone dispersal; and (3) they neglect the statistical nature of sediment entrainment. To overcome these limitations, Parker and coauthors in 2000 introduced a continuous, i.e. not layer-based, morphodynamic framework based on a stochastic description of bed surface elevation, entrainment, and deposition. In this framework particle entrainment rates are computed as a function of the flow and sediment characteristics, while particle deposition is estimated with a step length formulation. However, due to the lack of mathematical functions describing the variability of bed elevation, entrainment, and deposition, the continuum framework had never been implemented. Here we present one of the first implementations of the continuum framework at laboratory scale and its validation against laboratory experiments on tracer stone dispersal.
The validated model is then used to investigate the dependence of the model results on different particle step lengths.  
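The mass balance underlying the continuum framework can be written schematically as follows. The notation is ours and deliberately simplified (one grain size, no elevation-dependent entrainment), so this is a sketch of the idea rather than the exact formulation of Parker and coauthors:

```latex
% Entrainment-form Exner balance: bed elevation \eta rises where
% deposition D exceeds entrainment E, scaled by bed porosity \lambda_p,
\frac{\partial \eta}{\partial t} = \frac{D - E}{1 - \lambda_p},
% while deposition at x gathers grains entrained a step length r upstream,
% with f_s(r) the probability density of particle step lengths:
D(x) = \int_0^{\infty} E(x - r)\, f_s(r)\, dr .
```

Changing the step length distribution f_s(r) directly changes where entrained tracers come to rest, which is the dependence the validated model is used to investigate.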
In recent years, seismic signals previously thought of as “noise” have become a subject of study for environmental seismologists. These signals can reveal Critical Zone and geomorphic processes that are traditionally not well constrained, such as the roles biota play in weathering, movement of mass, and landscape evolution. Wind-driven tree sway is central to conceptual models of physical bedrock weathering and subsequent soil production. However, despite documentation, seismic signals of wind-tree interactions have been largely ignored by surface process researchers. Our work focuses on identifying the seismic signature of tree-captured wind by comparing seismic data in areas with little to no vegetation against heavily vegetated areas. Using meteorological and seismic data from the Transportable Array deployed in Alaska, we isolate this vegetation effect on seismicity by selecting periods with high-wind events in the absence of rain. We hypothesize that there is a difference in strength of seismicity that scales with percent tree cover. We use a combination of wind speed and seismic data to explore the impact of vegetation on seismic amplitude and examine the spectral signature of wind moving trees in order to better understand its contribution to soil production and nutrient/carbon cycling in the Critical Zone.
In terrestrial ecosystems, rock fractures have recently been recognized as key unsaturated water reservoirs for vegetation. However, it remains unclear how plant water-use strategies and rock water storage interact. We selected Douglas fir and Engelmann spruce trees growing on both soils and exposed limestone cliffs in the Canadian Rockies, and measured sap flow, stem water potential, and superficial fracture substrate moisture for trees growing in rock fractures and glacial till. Isotopic analysis of precipitation, plant, and soil samples revealed that the trees do not have access to any long-term water source but rather use recent precipitation. To explore the relationships within the system, we built a stock-flow model with three stocks: the surface fracture, the deep fracture, and the tree itself, and observed that cliff trees respond slightly differently to water replenishment due to the cliff architecture. Plant regulation coupled with rock water storage is crucial for correctly modeling water movement through plants in highly water-limited environments. Our study highlights the importance of understanding how trees access rock moisture storage in water-limited environments.
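A three-stock model of the kind described can be sketched as a set of first-order flows integrated with Euler steps. The stock names follow the abstract; the linear-flow assumption and all rate constants are ours, purely for illustration.

```python
def step(surface, deep, tree, rain, dt=1.0,
         k_infil=0.3, k_uptake=0.2, k_transp=0.1):
    """One Euler step of a three-stock water model: surface fracture,
    deep fracture, and tree. Rate constants (1/day) are illustrative."""
    infil = k_infil * surface        # surface fracture -> deep fracture
    uptake = k_uptake * deep         # deep fracture -> tree (root uptake)
    transp = k_transp * tree         # tree -> atmosphere (transpiration)
    surface += (rain - infil) * dt
    deep += (infil - uptake) * dt
    tree += (uptake - transp) * dt
    return surface, deep, tree

# Dry-down experiment: start with stored water and no new precipitation.
s, d, t = 10.0, 5.0, 2.0
for day in range(30):
    s, d, t = step(s, d, t, rain=0.0)
```

Pulsing the `rain` input instead of holding it at zero is the natural way to probe how quickly each stock, and hence the tree, responds to replenishment.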
In the same way that watersheds filter precipitation signals into a time series of flow response, watersheds also filter sediment production signals into a time series of bedload transport. Here, we describe the Mass Wasting Router, a new watershed-scale sediment production and transport model written for Landlab that couples an existing shallow landslide hazard model (LandslideProbability) with an existing network-scale bedload transport model (NetworkSedimentTransporter) by (1) delineating hillslope scale landslides from maps of landslide probability, (2) routing the landslides through the watershed using a “precipiton” or “agent” style model and (3) fluvially eroding the mass wasting deposits and creating parcels for the NetworkSedimentTransporter. Preliminary model runs indicate that variation in soil cohesion and precipitation intensity drive landslide-derived hillslope sediment production rates but valley storage processes, driven by debris flow deposition patterns, modulate bedload transport rates at the basin outlet.
In the southern San Andreas Fault zone, the San Gorgonio Pass (SGP) stands as a region of intricate structural complexity, pivotal for the assessment of seismic hazards due to its potential role in modulating earthquake rupture propagation. This investigation examines the SGP's crucial function in earthquake dynamics amid ongoing discussions of slip partitioning among its fault strands, aiming to fill a substantial knowledge gap concerning fault activity spanning the last 1 to 100 thousand years. The challenge of estimating slip rates, exacerbated by a dearth of datable materials within the SGP's challenging terrain, calls for innovative methodologies to assess uplift rates along previously overlooked fault segments. In our study, we use thermoluminescence (TL) thermochronology to evaluate differential uplift by analyzing bedrock erosion rates. Although apatite (U-Th)/He (AHe) dating sheds light on thermal histories and erosion rates across millions of years, it falls short in detailing the recent uplift history vital for grasping Quaternary fault dynamics. In contrast, cosmogenic 10Be dating proves effective in measuring surface erosion rates over millennial timescales, providing insights into contemporary geological activity. TL dating, with its capacity to discern bedrock exhumation over 10-100 ka, acts as a bridge between the temporal scales of AHe thermochronology (Ma) and cosmogenic 10Be denudation rates (ka). By comparing erosion rates across different faults within the SGP, our research aims to pinpoint active fault segments, thereby enriching our understanding of fault dynamics and seismic risk in the southern San Bernardino Mountains.