EARTH 270: Disasters and Natural Hazards
Estimated study time: 51 minutes
Sources and References
Primary textbook — Abbott, P.L. (2023). Natural Disasters (12th International ed.). McGraw Hill. A comprehensive and accessible survey of all major natural hazard types with rich case study material and updated coverage of climate change linkages.
Supplementary texts — Keller, E.A. and DeVecchio, D.E. (2019). Natural Hazards: Earth’s Processes as Hazards, Disasters, and Catastrophes (5th ed.). Pearson. Bryant, E. (2008). Tsunami: The Underrated Hazard (2nd ed.). Springer.
Online resources — USGS Earthquake Hazards Program (earthquake.usgs.gov); USGS Volcano Hazards Program (volcanoes.usgs.gov); NOAA National Hurricane Center (nhc.noaa.gov); NASA Earth Observatory (earthobservatory.nasa.gov); FEMA Hazus database; Global Volcanism Program (Smithsonian Institution); EM-DAT International Disaster Database (emdat.be).
Chapter 1: Introduction to Natural Hazards and Disasters
Defining Hazard, Risk, and Disaster
The study of natural hazards requires careful attention to terminology, because imprecise language can lead to misunderstanding of both the physical processes involved and the societal factors that determine who is vulnerable to them. A hazard is a natural process or event with the potential to cause harm to people, property, or the environment; risk combines the probability that a hazardous event will occur with the consequences if it does (often summarized as the product of hazard, exposure, and vulnerability); a disaster occurs when a hazardous event overwhelms a community's capacity to cope, producing widespread human, material, or environmental losses. These three concepts are related but distinct, and understanding their relationship is a prerequisite for any serious analysis of how societies should respond to the threat of natural events.
The distinction between a hazard and a disaster is fundamentally sociopolitical, not geological. The same physical event may be a minor inconvenience for a wealthy, well-prepared community and a devastating disaster for a poor, marginalized one. This does not mean that geology is unimportant — the physical characteristics of the hazard (its intensity, duration, spatial extent, speed of onset, and frequency) clearly matter — but it means that reducing disaster risk requires addressing both the physical hazard and the social, economic, and institutional factors that determine vulnerability. This insight has transformed the field of disaster risk reduction since the 1990s, moving away from purely engineering approaches toward integrated frameworks that include community preparedness, land use planning, early warning systems, and post-disaster recovery.
The Role of the Scientific Method in Hazard Assessment
The scientific method is the foundation of natural hazard assessment. It proceeds from systematic observation and data collection, through the formulation of hypotheses about physical processes, through the testing of those hypotheses against new data, to the development of predictive models that can be used to assess future risk. The quality of hazard assessment depends critically on the quality and quantity of observational data: long historical records of past events, modern monitoring networks that track precursors in real time, and laboratory and field experiments that constrain the physical parameters of hazardous processes.
The application of the scientific method to natural hazards has two main products: deterministic hazard assessment (specifying what events are physically possible and characterizing their maximum magnitude, geographic extent, and intensity distribution) and probabilistic hazard assessment (quantifying the likelihood that a hazard of a given intensity will occur at a given location within a specified time period). Probabilistic seismic hazard analysis (PSHA), for example, combines information about the spatial distribution and rates of earthquakes on known faults, the attenuation of ground motion with distance, and the probability distribution of ground-shaking intensities from each source to produce a map showing the probability of exceeding a particular ground acceleration level over a 50-year period. Such maps are the basis for seismic building codes worldwide.
An important and often misunderstood aspect of hazard science is the distinction between prediction and forecasting. A deterministic prediction specifies exactly when, where, and how large an event will be — a standard that is unachievable for most natural hazards given the complexity and chaotic dynamics of the relevant physical systems. A forecast, by contrast, is a probabilistic statement: “there is a 60% probability of a magnitude 6+ earthquake in a specified region within the next 30 years.” The scientific goal is to make such forecasts as accurate as possible, but communicating probabilistic information to the public and to decision-makers is a significant challenge, particularly when the probabilities are framed over long time periods and the hazard is relatively infrequent.
Climate Change and Natural Hazards
A major theme of contemporary hazard science is the influence of human-caused climate change on the frequency, intensity, and distribution of natural hazards. It is essential to distinguish carefully between hazards that are directly driven by the climate system (atmospheric and hydrological hazards: hurricanes, floods, droughts, wildfires) and those that are primarily geological (earthquakes, volcanic eruptions, tsunamis) and not directly controlled by climate. However, even some geological hazards have secondary connections to climate: the retreat of glaciers changes the stress state of underlying bedrock and can affect landslide and volcanic eruption frequency; the rise in sea level increases flood risk from coastal storms and tsunamis; permafrost thaw destabilizes slopes and infrastructure in Arctic regions.
For weather-related hazards, the evidence for climate change impacts is clear and growing. Global average surface temperatures have risen approximately 1.2°C above pre-industrial levels as of the early 2020s, driven by emissions of greenhouse gases (primarily CO2 and CH4) from fossil fuel combustion, deforestation, and agriculture. The consequences for natural hazards include: intensification of the hydrological cycle (more intense precipitation events, more severe droughts), shifts in the geographic distribution of tropical cyclones (poleward expansion of the tropics, intensification of the strongest storms), longer and more severe wildfire seasons, accelerated coastal erosion and increased storm surge flooding from sea level rise, and increased flood risk from glacier and ice sheet melt.
Chapter 2: Plate Tectonics and Geological Hazards
Plate Tectonic Framework
The theory of plate tectonics, developed in the 1960s and 1970s, provides the unifying framework for understanding the spatial distribution of the most destructive geological hazards — earthquakes, volcanic eruptions, and tsunamis. The Earth’s lithosphere (the rigid outer layer, comprising the crust and the uppermost part of the upper mantle) is broken into a mosaic of plates that move relative to one another at rates of a few centimetres per year (roughly 1–10 cm/yr), driven by convection in the underlying mantle. The boundaries between plates are the sites of intense geological activity.
Three types of plate boundaries are recognized. Divergent boundaries occur where plates move apart, allowing hot mantle material to upwell and create new oceanic crust; the mid-ocean ridge system is the most extensive example, stretching more than 60,000 km around the globe. At divergent boundaries, volcanism is ubiquitous (producing basaltic lava along the ridge axis), and earthquakes are generally shallow and moderate in magnitude. The East African Rift Valley is a continental divergent boundary, where the African plate is slowly being torn apart, accompanied by volcanism (Kilimanjaro, Ol Doinyo Lengai) and seismic activity.
Convergent boundaries occur where plates move toward each other. If both plates are oceanic, the denser (older, cooler) one subducts beneath the other, producing a deep oceanic trench, a chain of volcanoes on the overlying plate (an island arc), and some of the world’s most powerful earthquakes in the subduction zone. If one plate is oceanic and the other continental, the oceanic plate subducts, producing a continental volcanic arc (the Andes, the Cascades, Japan) and great megathrust earthquakes at the interface between the subducting and overlying plates. If both plates are continental, neither subducts readily (continental crust is too buoyant), and a collision zone with very high mountains results (the Himalayas, the Alps), with seismic activity distributed over a broad zone but reduced volcanism.
Transform boundaries occur where plates slide horizontally past each other without creating or destroying crust. The San Andreas fault in California is the most famous example, where the Pacific plate moves northwestward past the North American plate at about 5 cm per year. Transform boundaries are sites of frequent, sometimes very large, earthquakes (the 1906 San Francisco earthquake, magnitude 7.9, occurred on the San Andreas fault) but no significant volcanism.
The “Ring of Fire” — the chain of volcanic arcs and convergent plate boundaries encircling the Pacific Ocean — hosts approximately 75% of the world’s active volcanoes and generates about 90% of the world’s earthquakes, including virtually all of the largest (magnitude 9+) earthquakes ever recorded. This geographic concentration of hazard is a direct consequence of the subduction of the Pacific and related plates beneath the surrounding continental and oceanic plates.
Chapter 3: Earthquakes — Mechanics, Effects, and Case Studies
The Physics of Earthquakes
Earthquakes are caused by the sudden rupture and slip along a fault — a fracture in the Earth’s crust or lithosphere along which the two sides have moved relative to each other. The energy driving this slip comes from the slow, continuous movement of tectonic plates, which loads elastic strain energy into the rocks adjacent to the fault over decades to centuries. When the shear stress on the fault exceeds the frictional resistance to sliding (the strength of the fault), the fault slips suddenly, releasing the stored elastic energy as seismic waves that propagate outward through the Earth and produce the shaking felt at the surface.
The mechanism of earthquake generation is best understood through the elastic rebound theory, proposed by Harry Reid following the 1906 San Francisco earthquake. Reid observed that geodetic measurements showed that rocks on both sides of the San Andreas fault had been slowly distorting (shearing) for decades before the earthquake, and that after the earthquake, the strain was released and the rocks “rebounded” to a less-distorted configuration. The cycle of strain accumulation, sudden rupture, and rebound is now understood to be the fundamental mechanism of all tectonic earthquakes.
The point within the Earth where an earthquake originates — where the rupture begins — is the focus or hypocenter. Its projection onto the Earth’s surface directly above is the epicenter. The magnitude of an earthquake is a measure of the energy released, originally quantified by Charles Richter in 1935 using seismograph records from California. The Richter scale is logarithmic: each whole-number increase in magnitude corresponds to roughly a 32-fold increase in energy release and a 10-fold increase in the amplitude of ground motion. The moment magnitude scale (Mw), now preferred by seismologists because it is physically more meaningful and does not saturate for large earthquakes, is defined in terms of the seismic moment \( M_0 \):
\[ M_w = \frac{2}{3}\log_{10}(M_0) - 10.7 \] where \( M_0 = \mu A \bar{d} \) (shear modulus \( \mu \) times fault area \( A \) times average slip \( \bar{d} \), with \( M_0 \) expressed in dyne·cm for this form of the equation). For the 2011 Tohoku earthquake (\( M_w = 9.0 \)), the fault area was approximately 500 km × 200 km, the average slip was about 10–20 m, and the total energy release was equivalent to roughly 600 million Hiroshima-sized atomic bombs.
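To make the arithmetic concrete, the short Python sketch below reproduces a moment magnitude near 9 for a Tohoku-sized rupture. The shear modulus (30 GPa) and the single slip value (15 m, the midpoint of the range above) are assumed for illustration; the Hanks–Kanamori constant of −10.7 applies when \( M_0 \) is in dyne·cm, hence the unit conversion.

import math

# Illustrative sketch: seismic moment and moment magnitude for a Tohoku-sized rupture.
# mu and the single slip value are assumptions; area and slip range come from the text.
mu = 3.0e10            # shear modulus of crustal rock, Pa (~30 GPa, assumed)
area = 500e3 * 200e3   # fault area, m^2 (500 km x 200 km)
slip = 15.0            # average slip, m (midpoint of the 10-20 m range)

M0 = mu * area * slip              # seismic moment, N*m
M0_dyne_cm = M0 * 1e7              # convert N*m to dyne*cm for the -10.7 constant
Mw = (2.0 / 3.0) * math.log10(M0_dyne_cm) - 10.7
print(f"M0 = {M0:.1e} N*m, Mw = {Mw:.2f}")   # ~4.5e22 N*m, Mw ~ 9.1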
Types of Seismic Waves and Their Effects
Earthquakes produce several types of seismic waves that travel through and along the Earth. Body waves travel through the interior of the Earth and include P-waves (primary or compressional waves, in which particle motion is parallel to the direction of wave propagation) and S-waves (secondary or shear waves, in which particle motion is perpendicular to the direction of propagation). P-waves travel faster than S-waves and arrive first at a seismograph station; the S-P time delay is used to calculate distance to the earthquake epicenter. S-waves cannot travel through liquids (because liquids have no shear strength), which is why S-waves do not penetrate Earth’s liquid outer core — an observation that first revealed the existence of the liquid core.
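As a simple illustration of the S−P method, the sketch below converts an S−P delay at a single station into an epicentral distance. The wave speeds (6.0 km/s for P, 3.5 km/s for S) are assumed typical shallow-crust values rather than figures from this chapter; real locations use travel-time tables and arrivals from several stations.

# Minimal sketch: epicentral distance from the S-P arrival-time delay at one station.
# The wave speeds are assumed typical crustal values.
VP_KM_S = 6.0
VS_KM_S = 3.5

def distance_from_sp_delay(sp_delay_s):
    """Distance (km) implied by an S-P delay, assuming straight-ray travel at vp and vs."""
    return sp_delay_s / (1.0 / VS_KM_S - 1.0 / VP_KM_S)

print(distance_from_sp_delay(10.0))   # ~84 km for a 10-second S-P delay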
Surface waves travel along the Earth’s surface and are generally more destructive than body waves because they have larger amplitudes, lower frequencies, and longer duration of shaking. Love waves (horizontal shear motion) and Rayleigh waves (retrograde elliptical motion) both contribute to surface-wave shaking. The long-period (10–100 second) Rayleigh waves from large earthquakes can travel around the Earth multiple times and are used to determine earthquake source parameters.
Liquefaction is a process in which water-saturated, unconsolidated sediments (particularly clean, loose sands) temporarily lose their shear strength during earthquake shaking and behave as a viscous fluid. During liquefaction, the dynamic loading of seismic waves causes an increase in pore water pressure that reduces the effective stress between grains to zero, eliminating friction and allowing the sediment to flow like a liquid. The consequences are spectacular and devastating: buildings sink or tilt, buried tanks and pipes rise buoyantly to the surface, and lateral spreading of saturated ground can carry structures hundreds of metres downslope. Liquefaction was a major cause of damage in the 1964 Alaska earthquake, the 1995 Kobe earthquake, the 2011 Tohoku earthquake, and the 2011 Christchurch earthquake.
The Tohoku earthquake of 11 March 2011 (\( M_w = 9.0 \)) was the most powerful earthquake ever recorded in Japan and the fourth most powerful worldwide since modern seismographic recording began. It occurred along the Japan Trench subduction zone, where the Pacific plate dips beneath the North American plate at approximately 8 cm per year. The earthquake ruptured a fault segment approximately 500 km long and 200 km wide, with average slip of 10–20 m and a maximum slip of over 50 m in some areas near the epicenter. The shaking lasted approximately 6 minutes — extraordinarily long for an earthquake — and caused significant damage to buildings as far away as Tokyo, 370 km from the epicenter.
The earthquake generated a devastating tsunami. The rupture produced a vertical displacement of the seafloor of several metres over an area of nearly 100,000 km², displacing an enormous volume of water and generating a wave train that reached the Japanese coast within 10–40 minutes of the earthquake. Run-up heights (the elevation reached by the tsunami above sea level) exceeded 40 m in some locations — comparable to a 13-story building. The tsunami inundated approximately 500 km² of coastal Japan, destroying entire towns, sweeping away 90,000 buildings, and killing approximately 15,900 people (with an additional 2,500 missing). The tsunami also triggered the Fukushima Daiichi nuclear disaster, when it overwhelmed the seawall protecting the plant and disabled backup generators for the cooling systems, leading to three reactor meltdowns and the release of radioactive materials.
The disaster exposed both the extraordinary effectiveness of Japan’s earthquake preparedness (the shaking itself killed relatively few people, because buildings largely performed well) and the limits of tsunami warning and evacuation systems when a very large tsunami strikes a densely populated coastline with limited run-up time. Post-disaster analysis led to comprehensive revision of tsunami hazard assessment methodology globally, with recognition that maximum credible tsunami heights had been systematically underestimated at many sites.
Seismic Hazard Assessment and Building Codes
Seismic hazard assessment is the quantitative estimation of the likelihood and intensity of earthquake ground motion at a specific location. The modern approach — Probabilistic Seismic Hazard Analysis (PSHA) — was developed by Cornell (1968) and integrates information from: (1) the geometry and activity rates of known faults in the region; (2) the magnitude-frequency relationship (Gutenberg-Richter law: log N = a − bM, where N is the number of earthquakes with magnitude ≥ M per unit time and area, and a and b are constants that vary by region); (3) ground motion prediction equations (GMPEs) that describe how shaking intensity decreases with distance from the source; and (4) the probability model for earthquake recurrence (typically a Poisson process for the time between events).
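A minimal sketch of the Gutenberg-Richter relation follows; the a and b values are hypothetical placeholders rather than values from any particular catalogue, and simply show how the expected rate falls by a factor of ten per magnitude unit when b = 1.

# Illustrative Gutenberg-Richter relation: log10 N = a - b*M.
# a and b are hypothetical regional constants (N = events per year with magnitude >= M).
a, b = 4.0, 1.0

def annual_rate(magnitude):
    """Expected number of earthquakes per year with magnitude >= the given value."""
    return 10.0 ** (a - b * magnitude)

for m in (4, 5, 6, 7):
    print(m, annual_rate(m))   # 1.0, 0.1, 0.01, 0.001 events/yr with these constants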
The output of PSHA is a hazard curve: a plot of the probability of exceeding a given ground acceleration level within a specified time period (commonly 10% probability of exceedance in 50 years, corresponding to a return period of approximately 475 years, used in standard building codes). The national seismic hazard maps produced by the USGS and the Geological Survey of Canada provide the basis for building code requirements across North America. Modern building codes require that structures designed for essential facilities (hospitals, emergency shelters, schools) must resist the Maximum Considered Earthquake (MCE) ground motion without collapse, and that ordinary buildings must have a low probability of collapse when subjected to the design-level ground motion.
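The link between the "10% in 50 years" criterion and the roughly 475-year return period follows directly from the Poisson recurrence model mentioned above. The sketch below is a hand-rolled illustration of that arithmetic, not any agency's implementation.

import math

def prob_exceedance(return_period_yr, window_yr):
    """Probability of at least one exceedance in window_yr under a Poisson model."""
    return 1.0 - math.exp(-window_yr / return_period_yr)

def return_period_for(prob, window_yr):
    """Return period whose exceedance probability over window_yr equals prob."""
    return -window_yr / math.log(1.0 - prob)

print(prob_exceedance(475.0, 50.0))    # ~0.10: the 475-year motion has ~10% chance in 50 years
print(return_period_for(0.10, 50.0))   # ~475 years, the building-code convention cited above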
Chapter 4: Volcanoes — Eruption Styles, Hazards, and Case Studies
The Physical Basis of Volcanism
Volcanic eruptions occur when magma — partially or fully molten rock from the mantle or lower crust — rises through the lithosphere and reaches the surface. The physical character of a volcanic eruption is primarily controlled by the composition of the magma, and particularly by its silica content and volatile (water, CO2, SO2) content. These two factors control the viscosity of the magma and the ease with which volatiles can exsolve and escape.
Mafic magmas (basalts) have low silica content (45–52 wt% SiO2), low viscosity, and relatively low volatile content. Their low viscosity means that dissolved gases can bubble out of the melt relatively easily as the magma rises and decompresses, producing either effusive (lava flow-dominated) eruptions or mildly explosive fire-fountain eruptions. Hawaiian-style eruptions, which produce the spectacular lava fountains and extensive pahoehoe and aa lava flows of the Hawaiian Islands, are the type example. Despite the dramatic visual impact of lava flows, they move slowly enough (centimetres to metres per hour for most pahoehoe flows; faster for channelised aa) that human fatalities from lava flows are rare, though property losses can be enormous.
Felsic magmas (rhyolites, dacites) have high silica content (65–75+ wt% SiO2), very high viscosity, and commonly high dissolved water contents. Their high viscosity prevents gas bubbles from escaping smoothly as the magma rises; instead, the bubbles grow and coalesce until the overpressure within the bubble network exceeds the tensile strength of the melt, causing catastrophic fragmentation of the magma into fine ash and pyroclastic particles. The result is an explosive eruption that can eject material into the stratosphere and produce pyroclastic flows, surges, and ash fall over vast areas. The 1991 Pinatubo eruption in the Philippines (Volcanic Explosivity Index, VEI = 6) and the 1883 Krakatau eruption (VEI = 6) are examples of this type of eruption.
Volcanic Hazards: A Comprehensive Survey
Volcanic eruptions generate a diverse array of hazards, each with distinct physical characteristics, geographic extent, warning time, and potential for harm. Understanding the specific nature of each hazard is essential for developing effective monitoring, warning, and evacuation strategies.
Lava flows are streams of molten rock that emanate from eruptive vents or fissures. Their destructive power comes primarily from the heat (temperatures of 700–1,200°C), which ignites structures and vegetation on contact, and from their ability to bury and destroy anything in their path. The speed of lava flows varies enormously: pahoehoe lava on shallow slopes may advance only centimetres per hour, while channelised aa flows can travel at tens of kilometres per hour on steep slopes, and lava tubes (insulated conduits within older flows) can transport lava tens of kilometres from the vent. The 2018 eruption of Kilauea’s lower East Rift Zone in Hawaii dramatically illustrated the hazard of lava flows in residential areas: approximately 700 homes were destroyed as slow-moving but unstoppable flows advanced through the Leilani Estates subdivision over a period of several months.
Pyroclastic density currents (PDCs) are the most lethal of all volcanic hazards. These are fast-moving, hot (300–700°C), turbulent mixtures of volcanic gas, ash, and rock fragments that flow across the landscape like a fluid under gravity, conforming to topography. They can travel at speeds of 100–700 km/h, making outrunning them impossible for anyone caught in their path. The collapse of an eruption column (when the erupted material becomes too dense to maintain its buoyancy in the atmosphere) generates a PDC directly; alternatively, PDCs can form from the gravitational collapse of a lava dome. The destruction of Pompeii and Herculaneum in 79 CE was caused by a series of PDCs from Vesuvius, which entombed the cities in metres of ash and pyroclastic deposits, preserving them for nearly two thousand years. The 1902 eruption of Mount Pelée on Martinique produced a directed PDC (a “nuée ardente” in the terminology of the era) that killed approximately 30,000 people in the town of St. Pierre in a matter of minutes, one of the deadliest volcanic events in recorded history.
Volcanic ash is ejected from explosive eruptions and can be dispersed by winds over enormous distances. Ash particles are tiny fragments of volcanic glass (not combustion ash) with sharp, angular shapes that make them highly abrasive. Even a few centimetres of ash accumulation can collapse agricultural buildings; 10–20 cm can cause roof collapse of modern buildings; accumulations can exceed 50 cm in proximal areas. Ash is also severely harmful to aviation: it melts at around 1,100°C, well below the operating temperature of jet engine combustors, and re-solidifies on turbine blades as glass, causing engine failure. The 2010 eruption of Eyjafjallajökull in Iceland (VEI 4) disrupted European air travel for six days, cancelling approximately 100,000 flights and affecting 10 million passengers, with economic losses estimated at approximately €1.3 billion.
Lahars (volcanic mudflows or debris flows) are a hazard unique to volcanoes and are responsible for some of the highest volcanic death tolls in recorded history. A lahar is a fast-moving mixture of volcanic debris and water that travels down river valleys, sometimes for hundreds of kilometres from the volcano, burying everything in its path under metres of concrete-hard deposit. Lahars can be triggered by: the interaction of erupted material with a summit snow or ice cap (meltwater lahars during eruptions); heavy rainfall on freshly deposited pyroclastic material on the volcano’s flanks (post-eruption lahars, which can occur months to years after an eruption); or the failure of crater lakes. The 1985 eruption of Nevado del Ruiz in Colombia produced a modest eruption that melted a small portion of the ice cap, generating lahars that travelled 70 km down the Lagunillas River and buried the town of Armero under several metres of mud, killing approximately 23,000 of the town’s 29,000 residents — one of the deadliest volcanic disasters of the twentieth century, made more tragic because it was predicted by volcanologists who were unable to convince authorities to evacuate the town.
Volcanic gases are emitted continuously by active volcanoes and during eruptions. The main volcanic gases are water vapour (H2O, typically 50–90% of total), carbon dioxide (CO2, 5–35%), sulphur dioxide (SO2, 0.5–10%), hydrogen sulphide (H2S), hydrogen chloride (HCl), and hydrogen fluoride (HF). While water vapour is harmless, the other gases pose significant risks. CO2 is heavier than air and can accumulate in low-lying areas to asphyxiating concentrations; in 1986, a limnic eruption at Lake Nyos in Cameroon (the sudden release of CO2 that had accumulated in the lake's deep water) released a cloud of CO2 that rolled down valleys and asphyxiated approximately 1,800 people and 3,500 livestock. SO2 reacts with water in the atmosphere to form sulphuric acid aerosols (H2SO4), which scatter incoming solar radiation and can cause significant global cooling lasting 1–3 years following major eruptions. The 1991 Pinatubo eruption injected an estimated 20 million tonnes of SO2 into the stratosphere, reducing global average temperatures by approximately 0.5°C for 2 years and creating spectacular sunsets observed worldwide.
The 18 May 1980 eruption of Mount St. Helens in Washington State is the most thoroughly documented large volcanic eruption in the modern era, providing an extraordinary dataset for understanding eruption mechanics, hazard zonation, and emergency management. The eruption was triggered by a magnitude 5.1 earthquake beneath the volcano that destabilized the bulging north flank, causing a massive debris avalanche — the largest in recorded history — involving approximately 2.3 km³ of material. The removal of this material suddenly decompressed the magma body beneath, causing a horizontal blast that devastated an area of 600 km², blowing down trees like matchsticks to distances of 25 km north of the summit.
The lateral blast was followed by a vertical eruption column that reached 25 km into the stratosphere and produced an estimated 1.2 km³ of pyroclastic material. The eruption killed 57 people, including the volcanologist David Johnston at the USGS observation post 9 km from the volcano. Johnston’s death, and the effective evacuation that kept the death toll as low as it was, illustrates both the irreducible danger of active volcanism and the life-saving importance of volcanic monitoring and hazard zonation. Prior to the eruption, geologists had established exclusion zones based on precursory activity including increased seismicity, ground deformation measured by tiltmeters and geodetic surveys, and increased fumarolic emission. Thousands of additional lives were saved by these precautionary measures, though legal disputes about property access within the exclusion zone complicated enforcement.
The long-term recovery of the ecosystem surrounding Mount St. Helens has been one of the most closely studied ecological experiments of the twentieth century, demonstrating the remarkable resilience of natural ecosystems to volcanic disturbance. Populations of pocket gophers, lupines, and other pioneering species survived within the blast zone and served as agents of biological recovery, while wind-dispersed seeds and spores colonized the bare pyroclastic deposits within weeks of the eruption.
Chapter 5: Atmospheric Processes and Severe Weather
The Atmospheric Engine and Weather Systems
The Earth’s atmosphere is a fluid dynamical system driven by uneven solar heating of the surface — the equatorial regions receive far more solar radiation per unit area than the poles, creating a temperature gradient that drives global circulation patterns. The interactions between differential solar heating, Earth’s rotation (the Coriolis effect), ocean heat transport, and land-sea contrasts produce the rich variety of atmospheric phenomena that we experience as weather.
The Coriolis effect, arising from Earth’s rotation, deflects moving air masses to the right in the Northern Hemisphere and to the left in the Southern Hemisphere. This deflection organizes atmospheric circulation into persistent patterns: the trade winds blow from east to west in the tropics, the westerlies dominate mid-latitudes, and the polar easterlies characterize high latitudes. Embedded in the mid-latitude westerlies are the jet streams — narrow bands of fast-moving air at tropopause level (8–12 km altitude) — that guide the development and movement of the mid-latitude cyclones (extratropical storms) that bring most of the precipitation to continental interiors.
Thunderstorms are convective storms that form when moist, unstable air rises rapidly, cools, and condenses, releasing latent heat that further accelerates the upward air motion. Individual thunderstorm cells can produce heavy rain, hail, lightning, and strong gusty winds, but they are generally short-lived. The most severe thunderstorms are supercells — organized, rotating convective systems characterized by a persistent mesocyclone (a rotating updraft) that can maintain themselves for hours and travel hundreds of kilometres. Supercell thunderstorms are responsible for the most violent tornadoes, large hail, and extreme wind gusts.
Tornadoes are violently rotating columns of air that extend from the base of a supercell thunderstorm to the Earth’s surface. They are rated on the Enhanced Fujita scale (EF0–EF5) based on damage indicators; EF5 tornadoes have wind speeds exceeding 322 km/h. The tornado with the highest verified wind speed ever measured (484 km/h) was associated with the 1999 Oklahoma City tornado outbreak. Tornadoes are most common in Tornado Alley — the region of the central United States extending from Texas northward through Oklahoma, Kansas, and Nebraska — where the unique combination of warm, moist air from the Gulf of Mexico, cool, dry air from the Rocky Mountains, and strong upper-level wind shear creates ideal conditions for supercell formation.
Chapter 6: Hurricanes and Tropical Cyclones
Formation and Intensification
Tropical cyclones (called hurricanes in the North Atlantic and eastern North Pacific, typhoons in the western North Pacific, and cyclones in the South Pacific and Indian Ocean) are the most powerful individual storms on Earth, capable of generating winds exceeding 300 km/h and affecting areas thousands of kilometres across. They form over warm tropical oceans (sea surface temperature must exceed approximately 26°C to a depth of 50 m) when atmospheric conditions allow sustained deep convection to organize into a rotating circulation centered on a low-pressure area.
The energy source for a tropical cyclone is the evaporation of warm ocean water: as moist air spirals into the low-pressure center and rises, condensation releases enormous amounts of latent heat, which drives further upward motion and strengthens the inflow at the surface. The Coriolis effect organizes this inflow into cyclonic rotation (counterclockwise in the Northern Hemisphere, clockwise in the Southern). The warm core structure — the tropospheric column within the cyclone being warmer than the environment at all levels — distinguishes tropical cyclones from extratropical (frontal) cyclones and is responsible for their circular, symmetric structure.
The Saffir-Simpson Hurricane Wind Scale (SSHWS) classifies Atlantic hurricanes on a 1–5 scale based on sustained wind speed. Category 1 storms have winds of 119–153 km/h and cause minimal damage; Category 5 storms exceed 252 km/h and cause catastrophic destruction to all but the most engineered structures. However, the wind scale has limitations as a hazard descriptor because it does not capture two of the most deadly aspects of hurricanes: storm surge and inland flooding from extreme rainfall.
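As a small worked illustration, the helper below maps a sustained wind speed to a Saffir-Simpson category. The Category 1 and Category 5 thresholds are those quoted above; the intermediate boundaries (154–177, 178–208, and 209–251 km/h) are the standard published ones rather than figures from this chapter.

# Hypothetical helper mapping sustained wind speed (km/h) to a Saffir-Simpson category.
def saffir_simpson_category(wind_kmh):
    if wind_kmh < 119:
        return None   # below hurricane strength (tropical storm or depression)
    for category, upper_kmh in ((1, 153), (2, 177), (3, 208), (4, 251)):
        if wind_kmh <= upper_kmh:
            return category
    return 5

print(saffir_simpson_category(205))   # 3 -- Katrina's landfall intensity, discussed below
print(saffir_simpson_category(260))   # 5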
Storm surge — the abnormal rise in sea level produced by a hurricane — is the leading cause of hurricane fatalities in the United States. The surge results from: (1) the wind-driven pile-up of water against the coast (wind setup); (2) the reduction in atmospheric pressure at the storm center, which allows sea level to rise by approximately 1 cm per hPa of pressure reduction (the inverted barometer effect); and (3) wave set-up from breaking waves. Storm surge magnitude depends strongly on the storm’s intensity and forward speed, the shape of the coastline and seafloor (shallow bathymetry greatly amplifies surge), and the angle at which the storm makes landfall. Hurricane Katrina (2005) produced a storm surge of 8–9 m along the Mississippi coast — among the highest ever recorded in the United States — that inundated coastal communities and contributed to the failure of the New Orleans levee system, ultimately causing more than 1,800 deaths.
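The inverted barometer contribution alone is modest, as the rough sketch below shows (the ambient and central pressures are assumed, illustrative values); most of a surge as large as Katrina's comes from wind setup and wave set-up over shallow bathymetry.

# Rough sketch of the inverted-barometer component of storm surge (~1 cm per hPa).
# Pressure values are assumed for illustration only.
ambient_hpa = 1013.0
central_pressure_hpa = 920.0   # illustrative intense-hurricane central pressure

pressure_setup_m = (ambient_hpa - central_pressure_hpa) * 0.01
print(f"{pressure_setup_m:.2f} m")   # ~0.93 m from the pressure reduction alone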
Hurricane Katrina made landfall on the Louisiana and Mississippi coast on 29 August 2005 as a Category 3 storm (wind speed 205 km/h at landfall, after weakening from Category 4–5 intensity over the Gulf of Mexico). Its storm surge exceeded 8 m in some coastal areas of Mississippi, completely obliterating communities along the coast. In New Orleans, which lies mostly below sea level in a bowl surrounded by Lake Pontchartrain to the north and the Mississippi River to the south, the storm surge overtopped and breached numerous sections of the federally built flood control levee system, inundating approximately 80% of the city with water up to 6 m deep.
The death toll of 1,800+ and property losses exceeding $125 billion (the costliest natural disaster in US history to that point) reflected not just the physical hazard but deep failures of infrastructure, governance, and social equity. The levee system had been designed to withstand a Category 3 storm but had been maintained inadequately by the Army Corps of Engineers; post-disaster investigations found design flaws that likely would have caused failure even without the overtopping. The emergency management response was disorganized and inadequate: the slow Federal Emergency Management Agency (FEMA) response left survivors without water, food, or medical care for days in venues like the Superdome and the Convention Center. The disaster disproportionately affected the poor, elderly, and African-American residents of New Orleans who lacked the resources to evacuate before the storm and who lived in the lowest-lying, most flood-prone neighborhoods.
The aftermath of Katrina drove major reforms in federal emergency management, building codes, levee standards, and evacuation planning. The rebuilt New Orleans levee system, completed in 2011 at a cost of $14 billion, is now designed to withstand a 100-year storm. However, the fundamental vulnerability of a major city built below sea level in a hurricane-prone environment remains, and with sea level rise from climate change, the risk will increase over the coming decades regardless of the levee improvements.
Climate Change and Hurricanes
The relationship between climate change and tropical cyclones is an active area of scientific research with significant policy implications. Theory and models consistently predict that warming sea surface temperatures will allow tropical cyclones to achieve higher maximum intensities (more Category 4–5 storms), carry more water vapour (producing more extreme rainfall), and potentially expand their geographic range poleward as the warm ocean areas suitable for cyclone formation extend toward higher latitudes. However, whether the total number of tropical cyclones will increase or decrease with climate change is less certain, because climate change also affects atmospheric circulation patterns (particularly wind shear, which tends to suppress cyclone development) in complex ways.
The observational record shows clear evidence that the most intense tropical cyclones have become more frequent globally since the 1970s, consistent with theoretical predictions. Rapid intensification events — in which hurricane winds increase by at least 56 km/h in 24 hours, making forecast and response extremely difficult — have become more common over the past several decades, linked to warmer, deeper warm-water layers in the ocean. The 2017 Atlantic hurricane season, in which three exceptionally destructive major hurricanes (Harvey, Irma, and Maria) struck in quick succession, caused unprecedented damage across the Caribbean, Texas, and Florida, and is consistent with the predicted increase in extreme hurricane activity.
Chapter 7: Floods and Dam Failures
The Physics and Hydrology of Floods
Floods are the most frequent, most widespread, and economically most damaging natural hazard globally, affecting more people than any other hazard type and causing approximately $40 billion in economic losses per year on average. They occur when the volume of water entering a river channel or coastal zone exceeds the capacity of that channel or zone to contain it, causing water to overflow onto adjacent land areas (the floodplain). Understanding the physical processes of flooding requires understanding rainfall-runoff relationships, channel hydraulics, and the geomorphological context of the floodplain.
The relationship between rainfall and runoff is controlled by several factors: the intensity and duration of the precipitation, the degree of saturation of the soil (antecedent soil moisture), the permeability and infiltration capacity of the soil, the slope of the terrain, and the density of vegetation. In an undisturbed forest, a significant fraction of rainfall is intercepted by the canopy, absorbed by the forest floor and root zone, and evapotranspired back to the atmosphere; very little becomes surface runoff. In an impervious urban area, nearly all rainfall that exceeds drain capacity becomes runoff immediately, with very little infiltration. Urbanization therefore dramatically increases flood peaks (the maximum flow rate during a storm) and decreases flood recurrence intervals (the average time between floods of a given magnitude).
The return period (or recurrence interval) of a flood is the average number of years between floods of a given magnitude or greater, estimated from historical records using frequency analysis. A 100-year flood has a 1% probability of being equalled or exceeded in any given year; a 10-year flood has a 10% annual probability. These probabilities are important to understand correctly: a 100-year flood does not mean one must wait 100 years after a 100-year flood to see another; rather, there is always a 1% chance in any given year. Designation of land as the “100-year floodplain” (the area inundated by the 100-year flood) is used in many jurisdictions to regulate construction and set insurance requirements, but this approach has been criticized for giving a false sense of security and for not accounting for changes in flood frequency due to urbanization, watershed modification, or climate change.
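The probability statements above are easy to check numerically; the short sketch below computes the chance of experiencing at least one 100-year flood over a few illustrative time windows.

# Probability of at least one flood of a given annual exceedance probability over n years.
def chance_of_at_least_one(annual_prob, years):
    return 1.0 - (1.0 - annual_prob) ** years

print(chance_of_at_least_one(0.01, 1))     # 0.01: the definition of the 100-year flood
print(chance_of_at_least_one(0.01, 30))    # ~0.26: roughly 1-in-4 over a 30-year mortgage
print(chance_of_at_least_one(0.01, 100))   # ~0.63: not a certainty even over 100 years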
The flooding of Offutt Air Force Base near Omaha, Nebraska in March 2019 provides an instructive case study in how multiple contributing factors can combine to produce extreme flooding that overwhelms defence systems designed to protect critical infrastructure. The flooding was part of a broader Missouri River flood event triggered by the interaction of: (1) a rapidly strengthening extratropical cyclone (with surface pressure dropping more than 24 hPa in 24 hours, qualifying as a meteorological “bomb cyclogenesis”) that produced blizzard conditions with heavy snow across the upper Midwest; (2) a rapid warm-up that melted the snow quickly; (3) rain falling on frozen, saturated soils that had essentially zero additional infiltration capacity; and (4) elevated antecedent soil moisture from unusually heavy autumn precipitation in 2018.
The resulting flood was one of the most expensive in Nebraska history, damaging approximately 1 million acres of farmland, washing out roads and bridges, and causing approximately $3 billion in agricultural losses across Nebraska, Iowa, and South Dakota. At Offutt Air Force Base, floodwaters breached a protective berm and inundated approximately 30% of the base, damaging more than 30 buildings and destroying or damaging aircraft. The flooding exposed weaknesses in flood protection design for critical federal infrastructure, particularly the failure to consider compound extreme events (an extreme rainfall-melt event combined with unusual antecedent conditions).
The 2019 Nebraska floods also demonstrated the connection between climate change and Midwestern flooding: multiple studies have shown that precipitation extremes in the central United States have increased significantly over the past several decades, and the combination of a warming climate with periodic extreme events is expected to increase flood frequency and severity across the region throughout the twenty-first century.
Dam Failures and Tailings Dam Collapses
Engineered dams provide critical water storage, flood control, and hydroelectric power, but their failure can produce some of the most catastrophic rapid-onset disasters possible. Dam failures can result from: overtopping (the reservoir level rises above the dam crest, eroding the downstream face in earthfill dams); internal erosion and piping (progressive erosion of internal flow paths through the embankment); foundation failure (failure of the underlying bedrock or soil); slope instability of the reservoir banks causing a displacement wave; or seismic loading that liquefies the embankment or foundation materials.
The Vajont Dam disaster in Italy in 1963 illustrates the hazard of landslide-generated displacement waves in reservoirs. A slow-moving rockslide on the Monte Toc slope above the reservoir had been recognized as a potential hazard before the dam was filled, but monitoring was inadequate and the decision was made to continue filling the reservoir despite evidence of accelerating slope movement. On the evening of 9 October 1963, approximately 260 million m³ of rock slid into the reservoir at extremely high speed, generating a displacement wave that overtopped the 262-m-tall arch dam by more than 100 m. The wave devastated the downstream villages of Longarone, Pirago, Villanova, and Rivalta, killing approximately 2,000 people. The dam itself remained structurally intact — a testament to its engineering — but was made permanently useless by the landslide debris.
Chapter 8: Wildfires and Forest Management
Fire Ecology and the Physics of Wildfire Spread
Wildfire is a natural ecological process that has shaped vegetation communities, soil chemistry, and atmospheric composition for at least 350 million years (since the Carboniferous period, when atmospheric oxygen first rose to levels sufficient to sustain combustion). Many ecosystems — including boreal forests, Mediterranean shrublands, grasslands, and fire-adapted conifer forests — have evolved with fire and depend on periodic burning to maintain their ecological character. The suppression of fire in these ecosystems over the past century has, paradoxically, increased wildfire risk by allowing the accumulation of fuel loads (dead wood, shrub growth, dense understorey vegetation) far beyond the levels that would occur under natural fire regimes.
The spread of a wildfire is governed by three factors: fuel, weather, and topography — the “fire behaviour triangle” of wildland fire science (distinct from the combustion fire triangle of fuel, heat, and oxygen). Fuel characteristics include the quantity, arrangement, moisture content, and chemical composition of combustible material. Weather conditions — wind speed and direction, temperature, relative humidity, and atmospheric instability — are the most variable and often the most critical factors. Topography affects fire spread through its influence on wind patterns and through the chimney effect on slopes, which causes fires to accelerate uphill as preheated air and gases from the burning front rise ahead of the fire.
The Camp Fire, which ignited near the town of Paradise in Butte County, California on 8 November 2018, became the deadliest and most destructive wildfire in California history. Driven by extreme Diablo wind conditions (offshore winds exceeding 80 km/h, relative humidity in single digits, and air temperatures in the mid-20s Celsius despite being November), the fire spread with extraordinary speed, advancing through the town of Paradise at a rate that outpaced the evacuation capacity of the road network. The town of Paradise (population approximately 27,000) was nearly completely destroyed in less than four hours. The fire killed 85 people, destroyed approximately 18,800 structures, and burned approximately 62,000 ha.
The Camp Fire was ignited by a failed electrical transmission line owned by Pacific Gas & Electric (PG&E), exposing deep questions about the liability of utility companies for wildfires ignited by their infrastructure — a problem that has become increasingly acute in California as warming and drying climate conditions extend the peak wildfire season and increase the frequency of extreme fire weather. The fire also exposed severe deficiencies in the road network serving mountain communities and in real-time evacuation planning: the single main road out of Paradise was quickly gridlocked, and dozens of people died in their cars attempting to evacuate.
Post-fire analysis showed that the Camp Fire’s extraordinary destructiveness reflected three overlapping factors: extreme fire weather driven by regional climate change and the persistence of drought; a century of fire suppression that had allowed unprecedented fuel accumulation in the Sierra Nevada foothills; and the expansion of residential development into the wildland-urban interface (WUI) — the zone where houses and other structures intermingle with undeveloped wildland vegetation — which had placed tens of thousands of residents in harm’s way.
Chapter 9: Landslides and Mass Wasting
Classification and Mechanics of Mass Movements
Mass wasting — the downslope movement of rock, soil, or debris under the influence of gravity — encompasses a broad spectrum of processes ranging from the gradual creep of soil over decades to the catastrophic collapse of entire mountain flanks in seconds. Understanding mass wasting requires understanding the balance between the driving force (gravity, resolved along the slope surface) and the resisting forces (the shear strength of the slope material, primarily cohesion and friction), and the conditions that upset this balance.
The stability of a slope can be quantified through the factor of safety (FS), defined as:
\[ FS = \frac{\text{Resisting forces}}{\text{Driving forces}} = \frac{\tau_s}{\tau_d} \] where \( \tau_s \) is the shear strength of the slope material and \( \tau_d \) is the driving shear stress. For FS > 1, the slope is stable; for FS < 1, failure occurs; FS = 1 represents the limit equilibrium condition. For a simple infinite slope model (a planar slope with uniform properties), the factor of safety for a shallow translational slide along a basal plane is:
\[ FS = \frac{c' + (z\gamma\cos^2\beta - u)\tan\phi'}{z\gamma\sin\beta\cos\beta} \] where \( c' \) is effective cohesion, \( z \) is the vertical depth from the ground surface to the slip surface, \( \gamma \) is the unit weight of the slope material, \( \beta \) is the slope angle, \( u \) is pore water pressure at the slip surface, and \( \phi' \) is the effective internal friction angle. Several important lessons emerge from this equation: (1) increasing pore water pressure \( u \) (e.g., from heavy rainfall or snowmelt) reduces FS and can trigger failure on slopes that were previously stable; (2) steeper slopes have higher driving stresses; (3) the presence of a weak, low-cohesion, low-friction material (such as clay-rich fault gouge, expansive clay, or ice-rich permafrost) at the base of the slope drastically reduces resistance.
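The sketch below evaluates this infinite-slope factor of safety for a hypothetical slope, first dry and then with elevated pore water pressure after heavy rain. All parameter values are assumed for illustration and are not taken from any case study in this chapter.

import math

def factor_of_safety(c_eff_pa, z_m, gamma_n_m3, beta_deg, u_pa, phi_eff_deg):
    """Infinite-slope FS for a shallow translational slide at depth z below the surface."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_eff_deg)
    resisting = c_eff_pa + (z_m * gamma_n_m3 * math.cos(beta) ** 2 - u_pa) * math.tan(phi)
    driving = z_m * gamma_n_m3 * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Same slope, dry versus saturated (hypothetical values: c' = 5 kPa, z = 3 m,
# unit weight 18 kN/m^3, slope angle 30 degrees, friction angle 35 degrees).
print(factor_of_safety(5e3, 3.0, 18e3, 30.0, 0.0, 35.0))    # ~1.4: stable
print(factor_of_safety(5e3, 3.0, 18e3, 30.0, 25e3, 35.0))   # ~0.7: failure when u = 25 kPa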
Rock avalanches (also called sturzstroms) are the most dramatic and puzzling type of mass wasting. When very large rock masses collapse (volumes greater than approximately 10⁶ m³), the resulting avalanche travels much farther than simple friction-based models would predict — some travel at speeds exceeding 100 m/s and run out distances 10 times the vertical height of fall. The mechanism of this anomalously long runout is still debated, with proposed explanations including air layer lubrication, acoustic fluidization, rock fragmentation, and frictional heating causing vaporization at the base. Whatever the mechanism, the practical implication is that very large rockslide-avalanches can travel enormous distances at great speed and can devastate areas far beyond what would be expected from simple geometric analysis.
Earthquake-triggered landslides can cause casualties comparable to or exceeding those from the direct seismic shaking. The 1970 Ancash earthquake in Peru triggered a massive ice and rock avalanche from Nevado Huascarán that buried the towns of Yungay and Ranrahirca, killing approximately 18,000 people — a substantial fraction of the earthquake’s total death toll of roughly 70,000. The 2008 Sichuan earthquake in China triggered tens of thousands of landslides that dammed rivers, creating lakes that themselves posed downstream flood hazards when the natural dams failed. Post-earthquake landslide dam outburst floods, analogous to glacial jökulhlaups, are an increasingly recognized secondary hazard that requires specific monitoring and management.
Chapter 10: Coastal Processes, Tsunamis, and Space Impacts
Coastal Hazards and Sea Level Rise
Coastlines are among the most dynamic environments on Earth, constantly shaped by the interplay of waves, tides, currents, sea level change, and sediment supply. They are also among the most densely populated zones — approximately half of the world’s population lives within 60 km of a coast — making coastal hazards of paramount societal importance.
Coastal erosion affects approximately 24% of the world’s sandy beaches at rates exceeding 0.5 m per year. It is caused by a combination of natural processes (wave energy, storm overwash, sea level rise) and human activities (reduction of sediment supply by dam construction, disruption of longshore sediment transport by jetties and breakwaters, removal of coastal vegetation, and direct extraction of sand and gravel). The predicted acceleration of sea level rise due to global warming — from the current rate of approximately 3.7 mm/year to potentially 10–15 mm/year by 2100 under high-emissions scenarios — will dramatically increase the rate of coastal erosion and the frequency of coastal flooding over the next century.
Storm surge flooding from tropical and extratropical cyclones is, as discussed in Chapter 6, the leading cause of hurricane fatalities, but severe coastal flooding is also produced by extratropical winter storms (nor’easters on the US East Coast, windstorms over the North Sea) and by tropical cyclones striking the shallow, densely populated coasts of the Bay of Bengal. The 1953 North Sea flood was caused by a severe extratropical cyclone that combined with spring tides to produce storm surge heights of over 3 m along the coasts of the Netherlands, Belgium, and England, killing more than 2,500 people. This disaster directly motivated the construction of the massive Delta Works flood protection system in the Netherlands — arguably the most sophisticated coastal flood defence system in the world, completed in 1997 after decades of construction.
Tsunamis are not technically coastal hazards in the sense of being generated by coastal processes, but they are experienced as coastal inundation events of extraordinary destructive power. They are generated by any process that produces a rapid large-scale vertical displacement of the seafloor: subduction zone earthquakes are the most common source, but submarine landslides, volcanic eruptions, and (on geological timescales) bolide impacts can also generate tsunamis. The 2004 Indian Ocean tsunami (triggered by the magnitude 9.1 Sumatra-Andaman earthquake) killed approximately 230,000 people in 14 countries across the Indian Ocean basin — the deadliest tsunami in recorded history and one of the deadliest natural disasters of any kind. Many deaths occurred because there was no Indian Ocean Tsunami Warning System at the time (one was established subsequently) and because populations in many affected areas (particularly in Aceh Province, Indonesia) had no knowledge of tsunami warning signs or evacuation routes.
Space Impact Hazards
The Earth is continuously being bombarded by extraterrestrial material, ranging from microscopic dust particles (which account for most of the approximately 100 tonnes of extraterrestrial matter accreted by Earth per day) through metre-scale bolides (which explode with the energy of several kilotons of TNT in the upper atmosphere) to rare but catastrophically large impactors. The frequency of impact events decreases steeply with increasing impactor size: metre-scale events occur every few years, Tunguska-scale events (energy release equivalent to 10–15 megatons of TNT) occur every few hundred to a few thousand years, and extinction-level events (the Chicxulub impactor, 10 km diameter, 66 million years ago) occur on timescales of tens to hundreds of millions of years.
On 15 February 2013, a near-Earth asteroid approximately 20 m in diameter entered Earth’s atmosphere over Chelyabinsk, Russia, at a speed of approximately 19 km/s and at a shallow angle. The aerodynamic forces and heating caused it to fragment and explode at an altitude of approximately 23 km in an airburst releasing energy equivalent to approximately 500 kilotons of TNT (roughly 30 times the Hiroshima atomic bomb). The bright fireball was visible over a large region of Russia and Kazakhstan, and the blast wave arrived at the surface 2–3 minutes after the brightest flash, shattering windows and causing building damage across an area of 80 km radius. Approximately 1,500 people were injured, primarily by broken glass, and several hundred buildings were damaged.
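A back-of-the-envelope kinetic energy estimate, using the size and speed quoted above and an assumed stony-asteroid bulk density of about 3,300 kg/m³, lands in the same few-hundred-kiloton range as the published ~500 kt figure:

import math

# Back-of-the-envelope kinetic energy of the Chelyabinsk impactor.
# The bulk density is an assumed value typical of stony asteroids.
diameter_m = 20.0
density_kg_m3 = 3300.0
speed_m_s = 19000.0

volume_m3 = (4.0 / 3.0) * math.pi * (diameter_m / 2.0) ** 3
mass_kg = density_kg_m3 * volume_m3            # ~1.4e7 kg (about 14,000 tonnes)
energy_j = 0.5 * mass_kg * speed_m_s ** 2
energy_kt = energy_j / 4.184e12                # 1 kiloton TNT = 4.184e12 J

print(f"{energy_kt:.0f} kt of TNT equivalent")  # ~600 kt, same order as the ~500 kt estimate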
The Chelyabinsk event is the largest documented bolide event since the 1908 Tunguska event (in which an approximately 50–80 m diameter object produced an airburst equivalent to approximately 10–15 megatons) and provides a sobering reminder that even relatively small near-Earth objects can produce significant harm. Crucially, the 2013 Chelyabinsk asteroid was not detected by any existing sky survey prior to its impact — it approached from the sunward direction, making it invisible to ground-based telescopes. This near-miss (in terms of warning) accelerated investment in near-Earth object (NEO) detection programs, including the DART mission (Double Asteroid Redirection Test), which in 2022 successfully demonstrated that a kinetic impactor could deflect an asteroid from its original trajectory — the first planetary defense test in history.
Risk Reduction, Preparedness, and the Sendai Framework
The Sendai Framework for Disaster Risk Reduction 2015–2030 is the international policy framework adopted by the United Nations General Assembly following the Hyogo Framework (2005–2015). It establishes four priorities for action: understanding disaster risk; strengthening disaster risk governance; investing in disaster risk reduction; and enhancing disaster preparedness for effective response. It also sets seven global targets, including substantially reducing global disaster mortality, the number of affected people, direct economic losses, and damage to critical infrastructure by 2030.
Central to effective disaster risk reduction is the concept of resilience — the capacity of individuals, communities, institutions, businesses, and systems to anticipate, absorb, accommodate, or recover from the effects of a hazardous event in a timely and efficient manner, including through the preservation and restoration of their essential basic structures and functions. Building resilience requires not just physical infrastructure improvements (better seismic buildings, higher levees, better forest management) but also social investments in education, healthcare, economic equity, and governance. The communities most devastated by natural hazards are almost always those that were already economically and socially marginalized; strengthening their resilience requires addressing those underlying vulnerabilities, not just the physical hazard.
Early warning systems are among the most cost-effective tools for reducing disaster mortality. Studies show that for each dollar invested in early warning systems, $4–36 in disaster losses can be prevented. Effective early warning requires: (1) monitoring of the hazard (seismometers, tide gauges, radar, weather stations, volcanic sensors); (2) analysis and forecasting capability to convert monitoring data into actionable hazard assessments; (3) dissemination of warnings to at-risk populations through multiple channels (sirens, text messages, television and radio broadcasts, social media); and (4) community preparedness such that warning recipients know what action to take. All four components must function for a warning system to save lives. The 2004 Indian Ocean tsunami demonstrated the catastrophic consequences of a gap in any component: even if warnings had been issued, many coastal communities lacked the awareness and infrastructure to respond appropriately.