San Diego Union Tribune: On Supercomputing
The San Diego Union Tribune has a nice piece today on supercomputing, with a particular focus on the San Diego Supercomputer Center. Here’s a snippet:
Jean-Bernard Minster wants to know how a magnitude-7.7 earthquake would affect Southern California. J. Andrew McCammon wants to find a cure for AIDS. Michael Norman wants to learn how the universe began.
All of them rely on supercomputers in their quest for answers.
Twenty years ago this Monday, the San Diego Supercomputer Center began using what was then the world’s most powerful computer. Now, its data-crunching successors worldwide are indispensable to science, engineering, business, even the war on terrorism.
Fran Berman, director of the San Diego Supercomputer Center, said one way to think about these high-end tools is to compare them to high-performance race cars.
“It’s not easy for you and I to buy an Indy 500 car and to maintain that,” she said. “That’s where it’s important to have government and large-scale investment in these kinds of computers. … And a real concern from the scientific community right now is that (U.S.) leadership is really falling behind.”
In November 2004, Congress passed legislation calling for an additional $165 million a year for research to develop new supercomputers. But President Bush’s fiscal 2006 budget didn’t allocate any funds. Instead, it requested budget cuts for supercomputing research at the Department of Energy.
As we reported on Wednesday, Congress restored some of that funding in the FY 2006 Energy and Water Appropriations bill.
Anyway, the article is called “Supercomputing now indispensable,” and it’s worth a read…