Meteorologists Can Finally Stop Throwing Out Good Data On Hurricanes
By Nicolas Rivero
News curator
But although we’ve gathered vast troves of storm data since the
late 1950s, at the start of this decade, hurricane models used
almost none of it. As Hurricanes Irene, Isaac, and Sandy passed
by—leaving nearly $100 billion in damage in their wake—the best
meteorologists could do was fit a few dropsonde data points into
their models.
A tempest in a terabyte
Data assimilation is a complicated and resource-intensive
computing problem. NOAA’s Hurricane Ensemble Data
Assimilation System runs on 1,500 processor cores and churns
through 10 terabytes of data every six hours to make sense of
hundreds of thousands of observations. “The big bottleneck is
computing power,” said Jason Sippel, who develops data
assimilation systems for NOAA. “It’s expensive, and there are only
so many dollars to go around.”
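The core idea of ensemble data assimilation is to blend a spread of model forecasts with fresh observations, weighting each by its uncertainty. The sketch below is a toy illustration of that principle for a single scalar variable, loosely in the spirit of an ensemble Kalman filter; it is not NOAA's actual system, and the function name, ensemble values, and observation are all hypothetical.

```python
import random

def enkf_update(ensemble, obs, obs_var):
    """One toy ensemble Kalman filter analysis step for a scalar state.

    Hypothetical illustration -- NOAA's operational system works on
    enormous gridded model states, not a single number.
    """
    n = len(ensemble)
    mean = sum(ensemble) / n
    # Forecast (background) error variance, estimated from ensemble spread
    p = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    # Kalman gain: how much to trust the observation vs. the forecast
    k = p / (p + obs_var)
    # Each member assimilates a perturbed copy of the observation,
    # which keeps the ensemble spread statistically consistent
    return [x + k * (obs + random.gauss(0, obs_var ** 0.5) - x)
            for x in ensemble]

random.seed(0)
# Toy example: five forecasts of a storm's central pressure (hPa)
# and one (hypothetical) dropsonde observation
forecast = [962.0, 958.0, 965.0, 960.0, 957.0]
analysis = enkf_update(forecast, obs=955.0, obs_var=4.0)
```

After the update, the ensemble mean is pulled toward the observation, with the size of the pull set by the relative uncertainties; the real system repeats this kind of blending across the entire model grid every six hours.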