
No flood gauges, no warning: 99% of US streams are off the radar amid rising flash flood risks – we saw the harm in 2024

December 17, 2024 The Conversation

By researchers at the University of Michigan

The Conversation is an independent and nonprofit source of news, analysis and commentary from academic experts.


Flooding is one of the deadliest and costliest natural disasters in the U.S., causing billions of dollars in damage each year. In 2024 alone, floods destroyed homes in over a dozen states and claimed more than 165 lives.

Southeast Texas was hit by flash flooding repeatedly in the spring, and then hit again by Hurricane Beryl. In one heartbreaking moment, a 4-year-old boy was swept away after his family’s car was submerged during a thunderstorm near Fort Worth.

In the Upper Midwest, days of rainfall in May caused flooding along the Mississippi River and its tributaries. A slow-moving storm in the Northeast in August caused catastrophic flooding in Connecticut.

The mountains of North Carolina and Tennessee saw some of the year’s most devastating flooding as the remnants of Hurricane Helene hit in September. Heavy rain poured down mountains, turning creeks and rivers into torrents that washed away homes and vehicles. More than 100 people died in North Carolina, and six workers drowned when their plastics factory was inundated in Tennessee.

Storms like these are intensifying faster, weakening more slowly and producing more extreme precipitation than the land can absorb. While many coastal areas are becoming better prepared for hurricane and tidal flooding, inland flood risk is less well understood and harder to anticipate.

These disasters underscore the importance of fast, accurate flood warnings. They’re also a reminder that extensive gaps still exist in the systems that monitor U.S. stream levels.

Current coverage is less than 1% of waterways

The National Weather Service uses advanced models to issue flood warnings. These models rely on historical trends, land cover information and a network of over 11,800 streamgages – sensors that provide near-real-time data on precipitation, streamflow and water depth – to simulate water flow. Much of that data is available online in real time.
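For readers curious what that real-time data looks like, here is a minimal sketch of pulling the latest gage-height reading from the USGS Instantaneous Values web service. The site number (a Delaware River gauge at Trenton, New Jersey) and the JSON field names are illustrative assumptions; check the service's documentation before building on them.

```python
# Minimal sketch: fetch a near-real-time gage height from the USGS NWIS
# Instantaneous Values web service. Site 01463500 (Delaware River at
# Trenton, NJ) and the JSON field names are examples, not guarantees.
import requests

USGS_IV_URL = "https://waterservices.usgs.gov/nwis/iv/"

def latest_gage_height(site: str = "01463500") -> tuple[str, float]:
    """Return (timestamp, gage height in feet) for the most recent reading."""
    params = {
        "format": "json",
        "sites": site,
        "parameterCd": "00065",  # 00065 = gage height (ft); 00060 = discharge (cfs)
        "period": "PT2H",        # ask for the last two hours of readings
    }
    resp = requests.get(USGS_IV_URL, params=params, timeout=30)
    resp.raise_for_status()
    series = resp.json()["value"]["timeSeries"][0]
    readings = series["values"][0]["value"]
    newest = readings[-1]                      # most recent observation
    return newest["dateTime"], float(newest["value"])

if __name__ == "__main__":
    when, height_ft = latest_gage_height()
    print(f"Gage height at {when}: {height_ft:.2f} ft")
```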

However, the streamgage network covers less than 1% of the nation’s rivers and streams.

Deploying a single permanent federal gauge costs over US$25,000, and nearly 70% of that cost can fall on communities. These high upfront costs, combined with rising operational expenses, significantly limit sensor coverage, particularly in small and urban watersheds prone to flash floods. The U.S. Geological Survey acknowledges that these sensors alone do not provide enough data at fast enough intervals to fully address flood risk.

Without data, risk is often underestimated

Flood risk can be estimated in waterways without streamgages, but not as accurately.

In these areas, computers use data from similar waterways to estimate stream flow. However, these assumptions, along with limited data and the evolving effects of climate change, introduce uncertainty.

The resulting models often underestimate flow in smaller creeks and overlook the effects of urbanization. In particular, they can miss new risks in fast-developing areas, where changes to the landscape and more pavement can quickly funnel water in risky ways.

These flood models are used for more than warnings. They also guide risk assessments for development, insurance and decisions on building protective infrastructure, so accuracy is important.

A case study in Philadelphia

A July 2023 flash flood in Upper Makefield, a suburb of Philadelphia along the Delaware River, highlights the challenges of insufficient data coverage in urbanizing watersheds.

On July 15, heavy rain transformed Houghs Creek, a small tributary of the Delaware River, into a deadly torrent, washing out Washington Crossing Road and trapping multiple vehicles. Survivors recalled the chaos:

“All of a sudden, 3 inches of water, 4 inches of water, a foot of water just coming at us,” Chloe Weissman said.

“This huge gush of water just came down from … down a hill,” added Eli Weissman. “As it was coming down, cars were starting to float. [We] just tried to survive, laid on our back, feet heading down, grabbing trees, grabbing vines, grabbing whatever we could to stay afloat.”

A map of rainfall totals and flash flood warnings shows how important locally targeted information can be. The flash flooding in Upper Makefield (pink) that washed away cars occurred outside the zones listed in the warning. Julie Arbit/University of Michigan

The National Weather Service issued a flash flood warning at 5:18 p.m., but a phone alert wasn’t triggered until 6:09 p.m. – after the flooding had begun. While the Delaware River has a nearby streamgage, flood models did not predict the rapid flooding along this small tributary.

Urbanization around Houghs Creek has made these events more dangerous and less predictable, as impervious surfaces quickly funnel the water into low-lying areas. The flash flood underscored the need for hyperlocal data to improve predictive models and allow earlier, more accurate warnings.

Expanding coverage of stream flood levels

Addressing data gaps is essential for improving weather forecasting and emergency management.

One promising solution is expanding the streamgage network through public-private partnerships and encouraging state and local governments, small businesses, academic institutions and nonprofits to build and operate their own sensors. Greater coverage enables more accurate and timely flood forecasts, leading to improved warnings, more prepared communities and more effective emergency responses when disasters strike.

Engineers at the University of Michigan Digital Water Lab created one example of a low-cost, easy-to-deploy solution for flood monitoring. At its core is a controller connected to an ultrasonic sensor that measures water levels in a way similar to how bats navigate using sound. The data can be transmitted in real time for fast analysis.
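The principle can be illustrated with a few lines of arithmetic. The sketch below, using made-up mounting-height and temperature values rather than the Digital Water Lab's actual configuration, shows how the echo time from a downward-facing ultrasonic sensor translates into a water depth.

```python
# Simplified sketch of ultrasonic stage sensing: time a pulse's echo off the
# water surface, convert it to a distance, and subtract from the sensor's
# mounting height. Values here are illustrative assumptions only.

def water_depth_m(echo_time_s: float, mount_height_m: float, air_temp_c: float = 20.0) -> float:
    """Estimate water depth below a downward-facing ultrasonic sensor."""
    speed_of_sound = 331.3 + 0.606 * air_temp_c              # m/s, varies with air temperature
    distance_to_surface = speed_of_sound * echo_time_s / 2.0  # round trip -> one-way distance
    return mount_height_m - distance_to_surface               # water depth in the channel

# Example: a sensor mounted 3.0 m above the stream bed hears an echo after 14 ms
print(round(water_depth_m(0.014, 3.0), 2), "m of water")      # roughly 0.6 m
```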

The simplicity and affordability – around US$800 per sensor – of this system allows for widespread deployment, providing critical information to communities. Techniques such as validating readings against precipitation measurements, calibrating sensors with federal monitoring stations and using supervised machine learning can build confidence in the value of this third-party and citizen-generated data.
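As a rough illustration of the calibration idea, the sketch below fits a simple linear correction that maps a low-cost sensor's readings onto a nearby federal gauge's readings taken at the same times. The numbers are invented, and a real workflow would also handle timing offsets, datum differences and sensor drift.

```python
# Hedged sketch: calibrate a low-cost sensor against a co-located federal
# gauge with a least-squares linear fit. All readings below are made up.
import numpy as np

# Paired observations (same timestamps), in feet
sensor_ft = np.array([1.2, 1.8, 2.5, 3.1, 4.0, 4.6])
federal_ft = np.array([1.0, 1.7, 2.3, 3.0, 3.9, 4.4])

# Fit: federal ≈ slope * sensor + offset
slope, offset = np.polyfit(sensor_ft, federal_ft, 1)

def corrected(reading_ft: float) -> float:
    """Apply the fitted correction to a new low-cost sensor reading."""
    return slope * reading_ft + offset

print(f"corrected 3.5 ft -> {corrected(3.5):.2f} ft")
```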

Eventually, nonfederal sensors like these may be integrated into federal flood models.

In the meantime, researchers have created open-source databases that consolidate all known gauge data and allow the public to supply information. These combined datasets support more advanced and robust flood models, such as Google's flood forecasting model, which covers large portions of the country.

Future of flood monitoring

Several universities are working together in a collaboration called FloodAware to develop a system that integrates “floodcams,” social media posts, smart city sensors and more to detect and warn residents of flash floods. Bringing these tools together could greatly expand the data available to meteorologists and emergency managers, improving flood risk assessments and warnings.

Combining diverse sources of data on a shared platform would establish a more comprehensive, accessible flood monitoring system. We believe that would empower communities with the information they need to advocate for protective measures, ultimately enhancing resilience in the face of climate change.




Featured image: Ghost streams. (Photo Credit: Great Lakes Now)
