Katrina and Computing


Federal Computer Week’s Aliya Sternstein has an interesting piece in this week’s issue on the role computing technology played in predicting Hurricane Katrina and in mitigating its cost.

Scientists are using a range of technologies to better predict the impact hurricanes can have on the economy and environment to minimize future damage and save lives.
Supercomputers, modeling programs and geographic information systems are some of the technologies scientists use to track the movement of hurricanes and predict damage. Experts warn, however, that skilled professionals are as crucial to accurate forecasting as technology.
Supercomputers aided the National Oceanic and Atmospheric Administration in accurately forecasting Hurricane Katrina’s path. The storm devastated the coastal areas of Alabama, Louisiana and Mississippi.
“Two and a half to three days before the hurricane hit, we were pretty much zoomed in on the Louisiana/Mississippi Gulf Coast as where the hurricane would hit,” said Jack Beven, a hurricane specialist at the NOAA Tropical Prediction Center. “It’s probably not the most accurate we’ve been, but it’s certainly pretty accurate.”

From what I understand, NOAA does a great job with the computing resources it’s been allocated. I’m just not sure it’s been allocated nearly enough. The article points out that NOAA has been able to upgrade its supercomputing capacity from 0.5 teraflops to 1.5 teraflops within the last year. (Update (9/16/2005): This figure is questionable; see the first note below.* More clarification in the second note.**) That’s a great improvement, but given the scale of the problem NOAA faces, I’m not sure it’s adequate.
In its review last year of the state of computational science in the U.S., the President’s Information Technology Advisory Committee (PITAC) (now disbanded, sigh) made a really interesting economic case for increased computational resources in hurricane forecasting. I’ve cited it here once before, but I’ll quote it again:

One nugget I found especially interesting from the presentation [of the PITAC Subcommittee on Computational Science] was an example of both the economic benefit and the health and safety benefit that will arise from more capable modeling enabled by advanced computing. The subcommittee noted that 40 percent of the $10 trillion U.S. economy is impacted by climate and weather. As one example of this, the subcommittee cited the hurricane warnings provided by the National Hurricane Center and the cost of the evacuations that often result. According to the subcommittee, there is $1 million in economic loss for each mile of coastline evacuated. With the current models, the U.S. now “over warns” by a factor of 3, with the average “over-warning” for a hurricane resulting in 200 miles of evacuations — or $200 million in unnecessary loss per event. Improved modeling (better algorithms, better software, more capable hardware, etc.) would improve the accuracy of forecasts, saving lives and resources.
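
To make the subcommittee’s arithmetic concrete, here’s a back-of-the-envelope sketch in Python. The $1 million-per-mile and 3x figures come from the quote; the 100-mile “truly at risk” baseline is implied by them, and the 1.5x “improved” factor is purely hypothetical:

```python
# Back-of-the-envelope version of the PITAC subcommittee's figures.
# The dollar-per-mile and over-warning numbers come from the quote above;
# the function and the hypothetical 1.5x "improved" factor are mine.

COST_PER_MILE = 1_000_000  # ~$1M in economic loss per mile of coastline evacuated

def unnecessary_cost(needed_miles, overwarn_factor):
    """Avoidable loss when forecast uncertainty forces evacuating
    overwarn_factor times the coastline actually at risk."""
    warned_miles = needed_miles * overwarn_factor
    return (warned_miles - needed_miles) * COST_PER_MILE

# Current models: 3x over-warning over a 100-mile at-risk stretch yields
# 200 unnecessary miles, i.e. the $200 million per event cited above.
print(unnecessary_cost(100, 3.0))  # 200000000.0

# A hypothetical improvement to 1.5x would cut that to $50M per event.
print(unnecessary_cost(100, 1.5))  # 50000000.0
```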

While over-warning probably wasn’t much of an issue in Katrina’s case, there are a number of capabilities we currently lack that may have proven useful. Folks in the severe storms community tell me that the operational forecast models NOAA runs suffer from several limitations that work against accurate predictions of hurricane intensity and path. For example, they cite the coarse resolution of the current models, which misses important fine-scale features like rain bands and the eye wall; the lack of coupling between atmospheric, wave and ocean prediction models; and computing resources that can generate only one or a few forecasts (as opposed to large ensembles), which limits NOAA’s ability to improve forecasting skill and quantify uncertainty.
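
That last point about ensembles is worth unpacking. The idea is to run the model many times from slightly perturbed initial conditions and read the forecast uncertainty off the spread of the results, which is exactly what you can’t do when you only have the cycles for one run. Here’s a toy Python sketch of the principle; the “model” and every number in it are invented for illustration and bear no relation to NOAA’s actual systems:

```python
# Toy illustration of ensemble forecasting: run the "model" many times
# from slightly perturbed initial conditions and use the spread of the
# results as an uncertainty estimate. The model below is a stand-in;
# real ensembles perturb full atmospheric states, not a single number.
import random
import statistics

def toy_landfall_model(initial_error_miles):
    """Stand-in for a forecast model: the predicted landfall point drifts
    with initial-condition error plus accumulated model error."""
    drift = initial_error_miles * 5       # error growth over the forecast period
    model_noise = random.gauss(0, 20)     # unresolved fine-scale effects
    return 100 + drift + model_noise      # landfall, in miles along the coast

# With resources for only one run, you get a point forecast and no error bar.
single_run = toy_landfall_model(random.gauss(0, 5))

# With resources for a large ensemble, the spread quantifies the uncertainty.
ensemble = [toy_landfall_model(random.gauss(0, 5)) for _ in range(1000)]
mean = statistics.mean(ensemble)
spread = statistics.stdev(ensemble)
print(f"single run: mile {single_run:.0f}")
print(f"ensemble:   mile {mean:.0f} +/- {spread:.0f} (1 sigma)")
```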
While NOAA’s move to a 1.5-teraflop capacity is a welcome change, it’s still far below what one would consider a “leadership class” computing capacity for the agency, like those available at NSF, NASA and DOE centers. I know it’s a coarse measure, but 1.5 teraflops doesn’t even get you into the top 300 fastest machines, never mind a machine capable of the kind of improvements hinted at above.* And it’s not all about big iron. NOAA needs additional resources to ramp up its infrastructure (software, hardware and personnel) and to boost basic research programs within the agency and the university community. Asking for any increase in resources is obviously very tough in the current budget environment, but the size of the “bump” required here is relatively small given the potential benefit.
But none of this is intended to take away from the job NOAA has done with the resources it already has. Because of NOAA’s forecasts, there was ample warning that this major storm was barreling in on the Gulf Coast, and there were reasonable estimates of what it would do once it got there. But given sufficient resources the models will get even better, which means the forecasts will get better: more accurate, more precise and more timely. How much would it be worth to have the accuracy and precision we now have 24 to 36 hours before a major storm available three days out? Or five?
I know it may seem a bit crass to be talking about boosting funding for computing only days after a tragedy as big as Katrina’s impact on the Gulf Coast, but events like this trigger a reevaluation of national priorities, and it seems to me that computing resources at NOAA haven’t been a national priority for quite a while.
* Update (9/16/2005): Actually, it looks like NOAA has somewhat more computing capacity than the FCW article suggests. According to the Top500 list, NOAA has two machines capable of 4.4 teraflops and two capable of 1.8 teraflops, so I’m not sure what the FCW figure reflects. That’s still quite some distance from “leadership class” computing, trailing machines in Japan, Sweden, Germany, Russia, Korea, China and Australia, but it’s better than the numbers quoted in the article above.
** Update 2 (9/16/2005): Aliya Sternstein writes to note that the 1.5-teraflop figure cited in the FCW piece applies only to the National Weather Service system at the IBM facility in Gaithersburg, MD, not to all of NOAA’s computational capacity.
