In what could fairly be described as a “love-in,” Thursday’s House Science Committee hearing on HR 4218, the High Performance Computing Revitalization Act of 2004 (HPCRA), featured witnesses from the Administration, industry, universities and federal labs all singing the praises of the committee’s bill to amend the 1991 High Performance Computing and Communications Act. The Committee’s bill, discussed in a previous blog entry, attempts to address concerns within the computing community about interagency coordination in the government-wide Networking and Information Technology Research and Development (NITRD) program generally, and within the high-performance computing community specifically. In essence, the bill tries to do three things:
- Make sure US researchers have access to the best machines available;
- Make sure research moves forward on a broad range of architectures, software, applications, algorithms, etc.; and,
- Assure the interagency planning process really works.
Without exception, the four witnesses called to testify before the committee expressed strong support for the bill. While not going so far as to say the interagency planning process was broken, White House Office of Science and Technology Policy Director John Marburger agreed the bill would help strengthen interagency coordination in high-end computing and offered the Administration’s support for the bill.
Administration support will “grease the wheels” of the legislative process a bit for this particular bill, though it’s by no means an easy path to passage. From talking to various committee staff, it appears the biggest hurdle for the bill is actually on the Senate side. Senator Pete Domenici (R-NM), Chair of the Senate Committee on Energy and Natural Resources, needs to be convinced that the HPCRA doesn’t contain provisions that should be in his Energy bill (S 2095) — otherwise his reluctance to move anything through his committee (to which HPCRA would no doubt be referred) that looks like a piece of the Energy bill will stop the HPCRA in its tracks. On the House side, the path forward for the bill looks relatively clear. The Science Committee plans a “markup” on the bill in early June, and time for consideration on the House floor is already tentatively scheduled in July. Elements of the House Leadership are apparently very interested in making the bill part of an “improving national competitiveness” theme this summer.
Indeed, supercomputing’s role in national competitiveness was a theme of the industry representative on Thursday’s panel. Irving Wladawsky-Berger, VP for Technology and Strategy at IBM (also former co-chair of PITAC, and a founding member of the Computer Science and Telecommunications Board (CSTB) of the National Academies of Sciences) emphasized that supercomputers are essential to US industrial competitiveness. “We are becoming an increasingly integrated information society,” he said, noting that US CEOs are looking for ways to process, in real time, all of the information they have from customers and competitors in order to make the best decisions about their businesses. Wladawsky-Berger also noted the important role basic research plays in US competitiveness. “Innovation remains the key [to US competitiveness],” he said, “and research is the driver of innovation.”
Rick Stevens from Argonne National Lab told the panel that HPC is a critical technology for the nation, noting the enabling role it plays in all branches of science and education, and that its availability to researchers is a “pacing item” for much of science. Stevens also sought to downplay the idea that the Japanese Earth Simulator supercomputer — the current world’s fastest machine — represented a loss of leadership for the US in supercomputing. “The US is the undisputed leader in development of HPC systems, and a leader in education and training of talent,” he said. Stevens suggested that the most significant aspect of the Earth Simulator isn’t the technological accomplishment, but rather that it’s the result of a long-term, rich investment by the Japanese government to see it built. The $400 million price tag is, by way of comparison, 4 times the cost of top-tier US supercomputers.
The committee’s final witness, Dan Reed (former director of the National Center for Supercomputing Applications, and current director of the Renaissance Computing Institute at UNC-Chapel Hill) emphasized this need for continuing, consistent support of HPC research. “Today’s High Performance Computing is reaping the rewards of yesterday’s research,” he told the panel. “We must seed tomorrow’s crop of research ideas today, else tomorrow we will subsist on wild berries.” Reed cited the importance of HPC’s emergence as the third element of the research portfolio — complementing theory and experiment — and noted that the availability of HPC acts as an “intellectual lever” that advances discovery in all of science. Reed spent some time describing how HPC enables research and discovery in a range of disciplines, but focused on the life sciences.
The breadth of these examples highlights a unique aspect of high-performance computing that distinguishes it from other scientific instruments: its universality as an intellectual amplifier. Powerful new telescopes advance astronomy, but not materials science. Powerful new particle accelerators advance high energy physics, but not genetics. In contrast, high-performance computing advances all of science and engineering, because all disciplines benefit from high-resolution model predictions, theoretical validations and experimental data analysis. As new scientific discoveries increasingly lie at the interstices of traditional disciplines, high-performance computing is the research integration enabler.
The question and answer portion of the hearing yielded some interesting exchanges. Science Committee Chairman Sherwood Boehlert began by asking the panel to identify the most important thing the federal government isn’t doing in HPC. Wladawsky-Berger cited the need for more encouragement for industry, universities and federal labs to collaborate on HPC problems. He also thought the grandest of challenges in HPC are in the life sciences — a change from the end of last century, when physics used to drive application development.
Stevens noted that the biggest thing missing from government HPC programs is “sustained development activity over multiple generations of hardware over multiple paths.” Researchers in the field should have a 10-15 year horizon for support, not 3-5 year commitments. He also raised some concerns about the just-released agency HECRTF report, noting that the report seemed to cover the development issues reasonably well, but was relatively silent about how to deploy the systems for researchers — something Marburger claimed would be the focus of a future report. Stevens (along with Reed and Wladawsky-Berger) also complained that NIH isn’t nearly the player it could be in this area. Marburger tried to counter this assertion by noting NIH’s participation in the interagency working group and that NIH was “getting a lot of mileage out of the existing clusters,” but conceded that one of the things his office had to do a better job of was sustaining the attention of the agency leadership on the issue.
HPCRA co-sponsor Lincoln Davis (D-TN) asked whether, given the problems with coordination already noted, Congress should choose one particular agency to provide access to supercomputers. The panel indicated that NSF and DOE should both have leadership roles, because the agencies have complementary skills.
The other co-sponsor of HPCRA, Judy Biggert (R-IL) asked Marburger what he would change to strengthen interagency coordination if he had a “magic wand.” Marburger saw his biggest challenge as “maintaining engagement at a sufficiently high level with the agencies.” All of the witnesses seemed to stress the importance of the differentiation in roles between NSF, DOE and DOD.
All in all, the hearing was non-contentious and basically bipartisan, and the bill appears to be on track to move at the beginning of June. As I’ve noted before, the bill isn’t perfect, but I think it does a decent job of doing what it sets out to do. From a CRA perspective, there are things we’d like to see in the bill that aren’t yet there. Most important, perhaps, is some mention of NSF’s crucial role in developing the next generation of HPC researchers, including researchers from currently underrepresented groups.
Keep an eye on this space to see how the bill shapes up….
Written statements for the hearing, as well as the hearing charter and a link to the new High-End Computing Revitalization Task Force report can be found here.