Past Projects

Interannual variability of the South Asian monsoon: From improved understanding to skillful predictions of extreme monsoon seasons

(Bordoni)

Understanding and predicting the interannual variability of the South Asian monsoon, the largest monsoon in the Earth's atmosphere, is both a priority and a long-standing challenge in climate science. Extreme monsoon seasons cause droughts, floods, and other natural hazards, affecting millions of people's lives and resulting in huge economic losses. Variability in monsoon strength has been associated with variability either in land surface properties or in sea surface temperatures in the tropical oceans. In this project, we shift focus to the role of large-scale circulations in both the tropics and the extratropics, to isolate large-scale patterns associated with strong or weak monsoon years, in both antecedent and monsoon months. This approach is motivated by a progressive shift in the theoretical understanding of large-scale monsoons, which highlights the potential influence of weather systems in the winter hemisphere on tropical monsoons and suggests the need to revisit these mechanisms in the context of the interannual variability of the South Asian monsoon. Observations, including the most recent reanalysis- and satellite-based estimates of thermodynamic and dynamical parameters, are used extensively to characterize patterns associated with strong or weak monsoon seasons. Numerical experiments with comprehensive general circulation models (GCMs) will be used to investigate the dynamics at play. Finally, the emerging antecedent relationships will be developed into statistical prediction schemes for monsoonal rainfall variability. Our goal is to advance the scientific understanding of the largest monsoon in the Earth's atmosphere and, most importantly, to improve predictive skill for its year-to-year variability, with great societal benefit for the millions of people living in the socially vulnerable economies affected by the monsoon.
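The final statistical-prediction step can be sketched as a simple lagged regression evaluated with leave-one-out cross-validation; the antecedent circulation index and rainfall series below are synthetic placeholders, not the project's actual predictors or data.

```python
import numpy as np

# Synthetic stand-ins for an antecedent (pre-monsoon) circulation index and
# the following season's rainfall anomaly; the two are loosely correlated by
# construction and are NOT the project's actual variables.
rng = np.random.default_rng(0)
n_years = 40
index = rng.standard_normal(n_years)
rainfall = 0.6 * index + 0.8 * rng.standard_normal(n_years)

# Leave-one-out cross-validation: each year's rainfall anomaly is predicted
# from a linear fit that excludes that year, mimicking true forecasting.
predictions = np.empty(n_years)
for i in range(n_years):
    mask = np.arange(n_years) != i
    slope, intercept = np.polyfit(index[mask], rainfall[mask], 1)
    predictions[i] = slope * index[i] + intercept

# Prediction skill as the correlation between predicted and observed anomalies.
skill = np.corrcoef(predictions, rainfall)[0, 1]
print(f"cross-validated correlation skill: {skill:.2f}")
```

Cross-validation matters here because an in-sample correlation would overstate the skill of any antecedent relationship.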

Towards the continuous monitoring of natural hazards from river floods and debris flows from seismic observations

(Tsai and Lamb)

The broad goal of this project is to work towards being able to continuously monitor flood and debris flow hazards using seismic observations. Our specific aims in Year 1 were primarily to conduct laboratory experiments in the Caltech Earth Surface Dynamics Laboratory (ESDL) to improve our mechanistic models of bedload- and water-flow-induced seismic noise. Year 2 goals are to extend our studies to debris flows, including field studies in the San Gabriel Mountains, and large-scale hillside flume experiments.

Exploring the potential for near-real-time monitoring of wildfires with airborne polarimetric SAR

(Simons)


Tracking 3-D displacements and volume changes induced by natural disasters using remote sensing technologies

(Avouac, Desbrun, and Leprince)

While the amount of remote sensing data is increasing rapidly, methods to exploit these data are lagging behind. This challenge needs to be addressed given the societal need for objective information about the impact of environmental changes and natural disasters.

We focus on two main problems using 3-D data, whether extracted from stereo matching of optical imagery or acquired using LiDAR:

  1. The tracking of a 3-D surface to quantify the flux of a moving mass, such as sand dunes, glacier flow, or slow landslides. In these processes, the 3-D surface is expected to undergo a smooth deformation and thus to retain identifiable features that allow for tracking.
  2. The precise evaluation of volume change when the 3-D surface properties are not preserved in time, because of abrupt deformation induced by catastrophic landslides or when automatic assessment of building damage is required.

Differencing high-quality LiDAR data acquired before and after an earthquake reveals the buildings that collapsed during that earthquake (Figure). Although the use of such data looks promising, the measurements are noisy. Because LiDAR data are not always well registered, edge effects due to shadowing and occlusions appear at the edges of buildings, artificially introducing volume-change detections in undamaged buildings. Better registration of LiDAR data and more robust volume-change detection would enable the automatic detection of damaged buildings. We would also like a technique robust enough to be applicable to datasets of lower geometric accuracy than LiDAR, such as DEMs produced from conventional stereogrammetry.


Difference of two LiDAR elevation datasets acquired before and after an earthquake. The area covered is 1 × 1 km². The blue areas are negative elevation changes in excess of 2 m (building collapse), and the red areas are positive elevation changes in excess of 2 m (new buildings, vegetation growth, etc.).
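The core differencing-and-thresholding step behind such a figure can be sketched as follows; the elevation grids, noise levels, and minimum cluster size are illustrative assumptions, and a real pipeline would also require careful co-registration of the two acquisitions.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(1)

# Synthetic stand-ins for co-registered pre- and post-event DEMs (metres);
# real inputs would be gridded LiDAR elevation data.
pre = 10.0 + 0.1 * rng.standard_normal((200, 200))
post = pre + 0.1 * rng.standard_normal((200, 200))  # measurement noise
post[50:70, 50:80] -= 5.0  # a "collapsed building" footprint

# Vertical change; drops beyond 2 m flag candidate collapses, mirroring the
# 2 m threshold used in the figure.
dz = post - pre
mask = dz < -2.0

# Group contiguous flagged pixels (4-neighbour BFS); discarding small
# clusters guards against isolated noise and building-edge artefacts.
def components(mask, min_size=25):
    seen = np.zeros_like(mask, dtype=bool)
    comps = []
    rows, cols = mask.shape
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not seen[r, c]:
                queue, comp = deque([(r, c)]), []
                seen[r, c] = True
                while queue:
                    y, x = queue.popleft()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                if len(comp) >= min_size:
                    comps.append(comp)
    return comps

candidates = components(mask)
print(f"{len(candidates)} candidate collapsed structure(s)")
```

The minimum-cluster-size filter is a crude proxy for the robustness discussed above: misregistration produces thin flagged strips along building edges, which such a filter partially suppresses.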

The Analysis of Dense Urban Seismic Networks

(Clayton, Tsai, and Ampuero)

The purpose of this study is to exploit new dense urban seismic networks that have been installed in the Los Angeles region. With these data we will examine the fine-scale structure of the region and the micro-seismicity that is occurring there. The goal is to better understand the earthquake initiation process and the possible role of tremor. A better image of the structure will also help define areas of anomalous ground shaking.

An example of a dense urban array is shown in Figure 1, and the recording of a small earthquake by this array is shown in Figure 2. The surprising result from this earthquake is the rapid variation in ground shaking over a fairly small area. This earthquake was small, but it indicates what might be expected in a major earthquake.


We plan to exploit the continuous records from this network to do seismic interferometry. This method utilizes natural sources, such as the "noise" generated by ocean waves interacting with the coast. If this can be made practical, it would be possible to do seismic surveys without an active source.
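The essence of noise interferometry can be sketched as a cross-correlation of two synthetic noise records: the lag of the correlation peak recovers the travel time between two stations. The delay, noise level, and sampling rate below are illustrative assumptions, not values from the network.

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 100.0   # sampling rate, Hz (illustrative)
n = 5000     # 50 s of synthetic "noise" per station

# Station A records an ambient-noise wavefield; station B records the same
# wavefield delayed by the inter-station travel time, plus local noise.
true_delay = 1.5                  # s (illustrative)
shift = int(true_delay * fs)
source = rng.standard_normal(n + shift)
rec_a = source[shift:] + 0.2 * rng.standard_normal(n)
rec_b = source[:n] + 0.2 * rng.standard_normal(n)

# Cross-correlate the two records; the lag of the peak estimates the
# inter-station travel time, without any active source.
xcorr = np.correlate(rec_b, rec_a, mode="full")
lags = np.arange(-(n - 1), n) / fs
est_delay = lags[np.argmax(xcorr)]
print(f"estimated inter-station travel time: {est_delay:.2f} s")
```

In practice long records are stacked over many days so that incoherent noise averages out while the coherent inter-station response accumulates.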


Exploiting GPGPU Technology for Rapid Assimilation of Space Geodetic Data: Natural Disaster Science, Response and Recovery

(Simons, Aivazis, Stalzer, and Desbrun)

Our project aims to exploit general-purpose graphical processing units (GPGPUs) in software designed for space geodetic data processing and downstream modeling. We are primarily focused on tools to: 1) monitor regions prone to natural hazards (earthquakes, volcanoes, landslides), 2) construct disaster damage assessments using satellite imaging radar, and 3) generate physically informed models of the underlying events. The expected speedups provided by GPGPUs will enable us to ask new geophysical questions and to further our ability to rapidly respond to natural disasters. Developing this technology now is essential in order to prepare us for the expected deluge of satellite geodetic data over the next 5-7 years. Tools developed under this proposal will also be used by the Advanced Rapid Imaging and Analysis (ARIA) project, a collaboration between the Seismological Laboratory and JPL. The ARIA project aims to make modern space-based data and resulting models available with low enough latency to make them useful for natural hazard response and recovery efforts.


Surface displacements due to the 2011 M9.0 Tohoku-Oki, Japan earthquake obtained by the ARIA project.

Constraining and understanding cloud feedbacks through hierarchical physical modeling

(Schneider and Teixeira)

The largest contributions to uncertainties in climate change projections come from uncertainties about how clouds, and particularly low clouds, respond to climate change. If low cloud cover increases as the climate warms, the increased planetary albedo implies a damping feedback on climate changes; if low cloud cover decreases as the climate warms, the reduced albedo implies an amplifying feedback. Existing theories and models do not even agree on the sign of low cloud cover changes as the climate warms, much less their magnitude. We are using a new approach based on hierarchical physical modeling to constrain and understand cloud feedbacks. Recent advances in computational fluid dynamics (large-eddy simulations) allow us to investigate the turbulent dynamics of clouds on scales of meters, and we integrate insights gained from such high-resolution simulations into large-scale models of Earth's climate. The goal of the project is to provide a physical understanding of how clouds respond to and interact with climate changes, an understanding that we plan to translate into improved representations of clouds in climate models.


Stratocumulus clouds such as here off Baja California cover large ocean areas. (Image credit: NASA/GSFC.)

The Caltech Virtual Shaker


The Caltech Virtual Shaker is a scientific gateway that is envisioned to host the world's largest (and perhaps only) database of building and bridge models for assessing societal risk from earthquakes globally. The database collates and archives models of existing and newly designed structures for use by the earthquake engineering and seismological research and education communities, as well as other stakeholders such as government entities, real-estate owners and developers, insurers and re-insurers, and many more. The unique feature of this online gateway is the facility to remotely analyze the models under earthquake shaking on a high-performance computing cluster (HPCC) at Caltech. This facility empowers stakeholders by freeing them from having to own expensive specialized software or hardware to assess risk. The ability to analyze models remotely is a strong incentive for users to build models and help populate the database. Once a critical mass of structural models and software is attained, the database is expected to grow organically at an exponential rate. A key component of the gateway is the visualization module for the creative rendering of realistic animations of the 3-D response of structures under strong seismic shaking.

The THOR funding is being used to focus specifically on constructing, validating, and testing high-fidelity numerical models of tall buildings in the greater Los Angeles metropolitan region. The long-term goal is to create a prototype rapid damage estimation product for southern California that, in the event of a large regional earthquake, extracts seismic source models from the USGS earthquakes gateway, automatically generates the ground motion at various southern California sites (using a finite-source version of the Caltech ShakeMovie gateway), and estimates the responses of existing tall buildings in the greater Los Angeles region.

Virtual Shaker for the Northridge Earthquake.

Earthquake source processes, debris flows, and soil liquefaction: Physics-based modeling of failure in granular media

(Ampuero, Andrade, Lamb, and Lapusta)

California, as well as numerous other densely populated areas around the world, is affected by earthquakes, debris flows, landslides, and soil liquefaction. Yet physically based models of these hazards, with parameters that can be estimated in the laboratory and evaluated in the field, are currently missing. Such models would allow for understanding the potential interactions among these hazards, predicting their consequences, and creating physics-based hazard maps. Understanding the role of physical processes at the grain scale and inferring key information to better inform macroscopic models is at the heart of the multiscale framework that we propose to develop in this project in conjunction with experiments (Fig. 1). An important future goal is to use the gained predictive understanding to develop strategies for hazard mitigation. In addition to its important societal implications, the proposed research will address a number of fundamental problems in Earth science and in multi-particle, multi-physics systems and their continuum representations.

Our study is focused on four collaborative projects in the dynamics of granular material in geophenomena using a combination of experiments, continuum numerical modeling, and multi-scale modeling.


Schematic showing unified tomography-to-simulation framework across scales (left to right) and integration of characterization and simulation (top to bottom). Areas of relatively established understanding (shown in solid puzzle pieces), such as grain-scale tomography and simulation, are contrasted with focus areas that need major developments. Level sets, GEM, and NURBS are key computational ingredients to enable grain-scale characterization and simulations, which yield particle kinematics and forces. Using such quantities, multiscale methods provide the link between experiments and continuum plasticity models to complete the proposed framework. Multiscale methods hinge on theory and computations to extract central physical parameters such as friction and dilatancy. After Andrade et al., 2012.

  1. Physics-based modeling of debris flows, landslides and avalanches, and comparison to experiments

Landslides, avalanches, and debris flows represent the instability of water-laden granular media on sloping terrain. These failures are major hazards and can result in significant loss of life and property. Yet there remain major knowledge gaps as to when these failures occur and how they evolve downstream. We are tackling landslide and debris flow mechanics, including the conditions for incipient failure and runout length, using a collaborative approach of laboratory experiments, continuum models, and multi-scale models.
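As a point of reference for incipient failure, the textbook infinite-slope analysis gives a factor of safety for a fully saturated, cohesionless layer; this is a standard back-of-the-envelope check, not the project's multi-scale model, and the soil properties below are illustrative.

```python
import math

# Infinite-slope stability for a fully saturated, cohesionless soil:
# FS = (gamma' / gamma_sat) * tan(phi) / tan(theta),
# where gamma' = gamma_sat - gamma_w is the buoyant unit weight.
# Failure initiates when FS drops below 1. Values are illustrative.
GAMMA_W = 9.81       # unit weight of water, kN/m^3
gamma_sat = 19.0     # saturated unit weight of the soil, kN/m^3
phi = math.radians(35.0)  # friction angle of the sediment

def factor_of_safety(slope_deg):
    """Factor of safety of a saturated infinite slope at the given angle."""
    theta = math.radians(slope_deg)
    return ((gamma_sat - GAMMA_W) / gamma_sat) * math.tan(phi) / math.tan(theta)

for slope in (10.0, 15.0, 20.0, 25.0):
    fs = factor_of_safety(slope)
    state = "stable" if fs >= 1.0 else "failing"
    print(f"slope {slope:4.1f} deg: FS = {fs:.2f} ({state})")
```

The calculation illustrates why saturation matters: the buoyant-weight factor roughly halves the frictional resistance, so a slope that is dry-stable can fail once water-laden.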


Experimental setup used to measure and correlate the angle of inclination through the life-cycle of an avalanche and the evolution of dilatancy.

Experimental in-channel bed failure at 47% slope.

  2. Microscopic understanding of earthquake friction laws

Recent studies indicate that slip in large seismic events takes place in a narrow (<10 mm, and perhaps <1 mm) band of finely granulated rock material called fault gouge, often in the presence of fluids. Building a physics-based model of the behavior of the fault core under shear loading and seismic shaking in the presence of fluids is crucial to understanding earthquake nucleation, the triggering of earthquakes by natural and man-made effects, and the nature of seismic radiation produced on faults. Our work includes developing 1) realistic laws for dilatancy during shear and 2) an understanding of the properties of fault gouge materials and structure that contribute to rate-and-state friction laws.

  3. Initiation of soil liquefaction under seismic shaking

One of the major causes of damage induced by earthquakes is soil liquefaction, the loss of shear resistance of loose saturated soils due to complex interactions between deformation of the granular medium and buildup of pore fluid pressure. An increasing number of field and seismological observations, especially collocated recordings of ground motion and fluid pressure on borehole sensor arrays during recent earthquakes, provide a unique opportunity to test and refine models of dynamic soil response, and to close the loop between microscopic and macroscopic scale modeling of the initiation of soil liquefaction. We are implementing advanced soil constitutive equations in spectral element codes for local to regional scale seismic wave propagation, and developing a multi-scale computational framework that couples the continuum and granular descriptions of soil dynamics.
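The effective-stress logic behind liquefaction initiation can be caricatured in a few lines: cyclic shaking ratchets up pore pressure, and shear resistance vanishes as the pore pressure approaches the total stress. The per-cycle pressure increment below is an illustrative placeholder, not one of the advanced constitutive equations being implemented.

```python
import math

# Toy effective-stress picture of liquefaction onset. Shear strength scales
# with the effective stress sigma' = sigma - u (Mohr-Coulomb, cohesionless),
# so it vanishes as pore pressure u approaches the total stress sigma.
# All values are illustrative, not a calibrated soil model.
sigma = 100.0             # total vertical stress, kPa
phi = math.radians(30.0)  # friction angle of the soil
u = 10.0                  # initial (hydrostatic) pore pressure, kPa
du_per_cycle = 6.0        # assumed pore-pressure buildup per loading cycle, kPa

cycles_to_failure = None
for cycle in range(1, 51):
    u = min(u + du_per_cycle, sigma)           # pore pressure cannot exceed sigma
    strength = (sigma - u) * math.tan(phi)     # remaining shear strength, kPa
    if strength <= 0.0:
        cycles_to_failure = cycle
        break

print(f"liquefaction after {cycles_to_failure} cycles")
```

The continuum codes described above replace the fixed per-cycle increment with constitutive equations coupling granular deformation to pore-pressure generation and diffusion.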


Liquefaction at the field scale with micromechanical structure zoomed in as inset.

  4. Effect of particle-scale statics on sediment trapping by vegetation in steep landscapes

Sediment yield from steep mountain terrain often increases by several orders of magnitude following wildfire, leading to devastating debris flows, for reasons that have not been explained. One hypothesis that we are exploring is that, prior to incineration, sediment is trapped behind vegetation on hillslopes that otherwise exceed the angle for sediment stability. We are conducting fieldwork and laboratory experiments to build an empirical model for the sediment storage capacity of plants (Figure 3) (Lamb et al., 2011; DiBiase et al., in review). Ongoing and future work is aimed at using physics-based numerical models that can probe the granular scale to mechanistically understand force-chain and grain-bridging dynamics in these trapped sediment piles.


Community-building and integrative academic activities

Our groups held biweekly meetings in the fall and winter quarters of 2011-2012 to share expertise and develop a common vision among the Ampuero, Andrade, Lamb, and Lapusta groups. These meetings, which involved the co-PIs, their students, and postdocs, enabled the different groups to learn about challenges, opportunities, and tools in experimental, theoretical, and numerical approaches to studies of granular geomaterials.

In addition, in the spring of 2012 we developed a new 6-credit, seminar-style course, Ge 192: Earthquake source processes, debris flows, and soil liquefaction: Physics-based modeling of failure in granular media. The course focuses on granular dynamics and instabilities across geophysics, geology, mechanical engineering, and civil engineering. Specific topics included granular mechanics, discrete element modeling, soil mechanics, landslides, liquefaction, shear localization, and the rate-and-state friction laws used for earthquakes. The course will be offered again in spring 2013 under a new (cross-listed) number, CE/Ge/ME 222.

Mitigating catastrophic hazards on deltas from river avulsions and land loss

(Lamb and Fischer)

River deltas are home to more than half a billion people; however, the sustainability of societies on these coastal landscapes is far from certain due to abrupt and catastrophic changes in the course of a river, known as river avulsions. Much like earthquakes, avulsions occur regularly but infrequently, and can strike with little or no warning, resulting in massive loss of life and property. Many deltas in industrial nations, like the Mississippi Delta, are heavily engineered to prevent river avulsion, but these efforts unintentionally deprive coastal areas of sediment, resulting in wetland destruction, land loss, and increased risk of inundation from hurricanes, river floods, and sea-level rise. To predict avulsion hazards on natural deltas and to develop sustainable solutions for land loss on engineered deltas, we need a mechanistic understanding of the timing and location of river avulsions on deltas. Our goal is to address this knowledge gap over a two-year period through an integrated approach that combines numerical and physical experiments on delta mechanics in the Caltech Earth Surface Dynamics Laboratory, stratigraphic analysis of avulsion patterns and timing in ancient deltas preserved in the sedimentary record, and application to avulsion and land-loss hazards on modern deltas.

Instrumenting a tall building in downtown Los Angeles

(Clayton, Heaton, Chandy, and Kohler)

We propose to place approximately 100 three-component motion sensors in a modern 52-story building in downtown Los Angeles. This will be a unique, showpiece facility for studying the motions of high-rise buildings that could potentially provide the basic observations that lead to a state-of-health monitoring system for earthquake safety in buildings. We will develop a finite-element model for the building that will be calibrated to motion sensor data, and will be used to test the detectability of various damage scenarios. The sensor array will be incorporated into the Community Seismic Network (CSN) project.
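As a rough stand-in for the planned finite-element model, a lumped-mass shear-building idealization already yields natural frequencies that can be compared against sensor data; the uniform story mass and stiffness below are illustrative assumptions, not properties of the actual building.

```python
import numpy as np

# Minimal "shear building" idealization: each story is a lumped mass tied to
# its neighbors by a lateral spring, and natural frequencies follow from the
# eigenproblem K x = omega^2 M x. Story values are illustrative placeholders.
n_stories = 52
m = 1.0e6    # story mass, kg (assumed)
k = 2.0e9    # story lateral stiffness, N/m (assumed)

# Tridiagonal stiffness matrix of the fixed-base spring chain; the top story
# connects to only one spring, hence the smaller diagonal entry.
K = np.zeros((n_stories, n_stories))
for i in range(n_stories):
    K[i, i] = 2.0 * k if i < n_stories - 1 else k
    if i > 0:
        K[i, i - 1] = K[i - 1, i] = -k

# With uniform story mass, M = m * I, so the generalized eigenproblem reduces
# to a standard symmetric one for K / m.
omega_sq = np.linalg.eigvalsh(K / m)
freqs_hz = np.sqrt(omega_sq) / (2.0 * np.pi)
print(f"first three natural frequencies (Hz): {np.round(freqs_hz[:3], 3)}")
```

Comparing such modeled frequencies and mode shapes against the recorded ambient motions is one way a model is calibrated to sensor data; shifts in the identified frequencies after an earthquake are a candidate damage indicator.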



Experimental river delta built by repeated switching of the river channel (avulsion) in the Caltech Earth Surface Dynamics Laboratory.
The entire video represents 40 hours of run time; the delta is 1.5 m long.