Gov Reform IT R&D Hearing Back On

The previously scheduled, then postponed IT R&D hearing of the House Government Reform Subcommittee on Technology, Information Policy, Intergovernmental Relations and the Census has been rescheduled for July 7, 2004.
Apparently, same lineup as the first time around. Check here for details.

Sci Com to hold E-voting Hearing Thursday

The House Science Committee will hold a hearing on testing and certification for voting equipment. Here’s an early version of the hearing charter.
Pics from the CNSF Science Exhibition on Capitol Hill

CRA joined 31 other scientific societies and universities yesterday in showing off the results of NSF-sponsored research at the 10th annual Coalition for National Science Funding Science Exhibition and Reception on Capitol Hill. CRA was ably represented at the event by DK Panda and his students (Jiuxing Liu, Pavan Balaji, Ranjit Noronha, and Sayantan Sur) from The Ohio State University, who presented work on software that allows high performance, scalable communication using the InfiniBand networking technology.
The Exhibition was a great opportunity for making the general case for federal support of basic research, especially at NSF. This year’s event was widely attended. Many key congressional staffers, influential Members of Congress, and important members of the Administration and NSF took time out of their schedules to see the exhibits. Here’s the proof! (click for larger images)
The CRA booth. Underneath the table were four PCs clustered together with InfiniBand. Two monitors show the results of some benchmarking apps comparing InfiniBand to Gigabit connections.
NSF Director Arden Bement (left) listens to Professor Panda describe his research.
Ohio congressman Dave Hobson (R-OH), a very influential member of the House Appropriations Committee. Hobson has the distinction of being the only member to serve on the appropriations subcommittees for Defense, VA-HUD-Independent Agencies (home of NSF funding), and Energy and Water (which he chairs).
Hobson takes some time to speak with Panda’s students.

Thanks again to Professor Panda and his students!
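For readers curious what the booth’s InfiniBand-versus-Gigabit comparison actually measures, here is a minimal, illustrative sketch (in Python, over plain TCP sockets) of the ping-pong throughput pattern such benchmarking apps are built on. It is not the OSU group’s actual benchmark (real InfiniBand measurements go through MPI or native verbs interfaces for far lower overhead), and the message size, round count, and host/port defaults are arbitrary placeholders.

# A toy ping-pong throughput test over TCP sockets (illustrative only).
# Real InfiniBand benchmarks use MPI or native verbs for far lower overhead;
# MSG_SIZE, ROUNDS, and the host/port defaults are arbitrary placeholders.
import argparse
import socket
import time

MSG_SIZE = 1 << 20   # 1 MiB per message
ROUNDS = 100         # round trips to time

def recv_exact(sock, n):
    """Read exactly n bytes from the socket."""
    buf = bytearray()
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed connection")
        buf.extend(chunk)
    return bytes(buf)

def server(port):
    with socket.socket() as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("", port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            for _ in range(ROUNDS):
                conn.sendall(recv_exact(conn, MSG_SIZE))  # echo each message back

def client(host, port):
    payload = b"x" * MSG_SIZE
    with socket.create_connection((host, port)) as sock:
        start = time.perf_counter()
        for _ in range(ROUNDS):
            sock.sendall(payload)
            recv_exact(sock, MSG_SIZE)
        elapsed = time.perf_counter() - start
    mb_moved = 2 * ROUNDS * MSG_SIZE / 1e6  # bytes move both ways each round
    print(f"{mb_moved / elapsed:.1f} MB/s effective throughput")

if __name__ == "__main__":
    p = argparse.ArgumentParser()
    p.add_argument("--server", action="store_true")
    p.add_argument("--host", default="localhost")
    p.add_argument("--port", type=int, default=5000)
    args = p.parse_args()
    server(args.port) if args.server else client(args.host, args.port)

Run it with --server on one node and as a client on another; the gap between the numbers reported over an InfiniBand link and over a Gigabit link is the kind of comparison the two monitors were showing.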
Senate Hearing on High End Computing at DOE

The Senate Committee on Energy and Natural Resources held a hearing Wednesday on a bill to authorize the building of a “Leadership Class” supercomputer at DOE. S. 2176, the High End Computing Revitalization Act of 2004, introduced by Senators Lamar Alexander (R-TN) and Jeff Bingaman (D-NM), would authorize $250 million worth of HEC R&D at the Department of Energy through FY 2009 ($40M in FY 05, building to $60M in FY 09); it would authorize $500 million through FY 09 ($100M a year from FY05-FY09) to construct a new supercomputer with “100 times the capability” of the fastest computer in existence at enactment; and it would authorize $50 million through FY09 ($10M a year) for the creation of a High-end Software Development Center. The money authorized would be “new” money, so appropriators would have to come up with additional money to fund it.
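As a quick sanity check on those authorization totals (a sketch only; the bill’s year-by-year schedule isn’t quoted in the post, so the linear ramp is an assumption that happens to match the stated endpoints and totals):

# Sanity check of S. 2176's authorization arithmetic.
# The $40M-to-$60M ramp is assumed linear; the bill text isn't quoted here.
hec_rd = [40 + 5 * i for i in range(5)]   # FY05..FY09, in $M
assert sum(hec_rd) == 250                 # the stated $250M total

machine = [100] * 5                       # $100M a year, FY05..FY09
assert sum(machine) == 500                # the stated $500M total

software_center = [10] * 5                # $10M a year, FY05..FY09
assert sum(software_center) == 50         # the stated $50M total

# For scale: "100 times the capability" of the fastest machine at enactment.
# Japan's Earth Simulator ran Linpack at roughly 36 Tflop/s in 2004, so the
# target would be on the order of 3,600 Tflop/s (rough figures only).
print(100 * 36, "Tflop/s")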
The bill is very similar to a House Science Committee bill introduced by Reps. Judy Biggert (R-IL) and Lincoln Davis (D-TN). Both are loosely based on the recommendations of the HEC Revitalization Task Force Report. The House bill contains fewer authorizations — no specific authorization for a software development center, for example — and more modest authorization levels: only authorizing $50 million in FY 05 for HEC R&D, building to $60 million in FY 07. Administration sources tell us that the President would likely sign the House version if presented to him, but would have serious problems with the more ambitious funding requests in the Senate version. The House version of the bill will likely head to the floor of the House in July for approval (along with the Science Committee’s broader HPC Reauthorization bill). The future of the Senate bill is less certain.
However, the Senate hearing was a good opportunity to get comment on the bill on the record. Testifying before Senators Alexander and Bingaman were James Decker, Principal Deputy Director of DOE’s Office of Science (filling in for Office of Science Director Ray Orbach); Jeff Wadsworth, Director of Oak Ridge National Lab; David Turek, Vice President, Deep Computing at IBM; Dan Reed, CRA Board Member and Director of the Renaissance Computing Institute at UNC-Chapel Hill; Vincent Scarafino, Manager of Numerically Intensive Computing at Ford Motor; and Dimitri Kusnezov, Director of Advanced Simulation and Computing at DOE NNSA.
As tends to happen lately when Members of Congress discuss high-end computing, much of the focus of the hearing centered on what could be done to recapture the “supercomputing lead” from the Japanese and their Earth Systems Simulator machine. In his opening remarks, Alexander noted that the ESS seemed to indicate that “Japan is king” of high-speed computing, a fact that had led both Alexander and Bingaman to travel to Japan to be briefed on the new machine. Alexander said that he learned in the briefing that many US researchers and companies had requested time on the Japanese machine, marking the first time he could recall that US researchers were looking overseas for computational resources — a worrying precedent. His bill, he said, is focused on recapturing that lead.
Bingaman echoed many of Alexander’s comments, noting that he’d always considered supercomputing to be “one of the long poles in the tent for US leadership in science and technology.” He noted later during the question and answer period that high-end computing was “one area I’d prefer us not to have to outsource.”
Of particular concern to both Senators was the availability of the machines to US researchers outside DOE. “We’ve got a secret weapon [in the US] called our research universities,” Alexander said, noting that providing access to those universities should be an important piece of the supercomputing effort at DOE. Decker assured the Senators that time on any new “Leadership Class” machine would be merit-based, peer-reviewed and competitively awarded. Neither he nor ORNL’s Wadsworth specified what percentage of cycles would be available to university or private-sector researchers.
The hearing was largely non-controversial, with both Senators promising to try to move their bill forward. As that process unfolds, check this space for details.
DMCA Reform Gathering Momentum?

Ed Felten’s Freedom to Tinker has the details on the latest development in the effort to reform portions of the Digital Millennium Copyright Act (DMCA) (pdf) to allow the distribution and use of circumvention technologies for non-infringing purposes. Tech giants Sun Microsystems and Intel, along with a number of other powerful tech firms, will announce today that they’ve banded together to form the Personal Technology Freedom Coalition focused on supporting Rep. Rick Boucher’s (D-VA) DMCA Reform Bill, H.R. 107.
The coalition announcement follows last week’s announcement that influential congressman Chris Cox (R-CA) — member of the Energy and Commerce committee and Chair of the Homeland Security committee — had agreed to co-sponsor HR 107.
However, it still appears that HR 107 faces an uphill climb this Congress. While the House Energy and Commerce committee has already held a hearing on the issue at the subcommittee level, it’s not clear whether the full committee will have the time or inclination to review the bill. The bill has also been referred to the House Judiciary committee, which has likewise shown no great interest in moving it. Perhaps the new coalition can bring pressure to bear on both committees and get the bill brought to the House floor….

Update: I guess this qualifies as not showing any great interest. Here’s the statement released by the House Judiciary Committee yesterday concerning HR 107:
Judiciary Committee Leaders Issue Statement on H.R. 107, the Digital Media Consumers’ Rights Act
WASHINGTON, D.C. – House Judiciary Committee Chairman F. James Sensenbrenner, Jr. (R-Wis.), Ranking Member John Conyers, Jr. (D-Mich.), and Judiciary Courts, the Internet, and Intellectual Property Subcommittee Chairman Lamar S. Smith (R-Tex.) issued the following statement regarding H.R. 107, the Digital Media Consumers’ Rights Act.
“We strongly oppose the substance of H.R. 107. This legislation would eviscerate a key provision of the Digital Millennium Copyright Act (DMCA), which is successfully protecting copyrighted works and providing consumers access to more digital content than ever before. In fact, a DVD player is now as common a household item as the VCR was 15 years ago precisely because of the DMCA. H.R. 107 would undo a law that is working and destroy the careful balance in copyright law between consumers’ rights and intellectual property rights.
“Furthermore, our strong objections to the substance of H.R. 107 are matched by our objections to what appears to be a bold jurisdictional power grab. The Judiciary Committee has – and has long had – exclusive jurisdiction over copyright law. Rest assured, we will wholeheartedly oppose this move in a bipartisan fashion, as we would expect Energy and Commerce Committee leaders to do if we attempted to write energy legislation.”
Latest PITAC Highlights

The President’s Information Technology Advisory Committee (PITAC) met for the third time since being reconstituted by President Bush in the Spring of 2003, approving a report (pdf) on Health Information Technology, getting an update on the progress of the subcommittee reviewing (pdf) federal cybersecurity R&D, and taking testimony for the subcommittee charged (pdf) with reviewing the current state of scientific computing.
The committee approved the final version of the subcommittee on health and information technology’s report without much discussion of the 12 recommendations. Progress has already been made in some of the areas highlighted by the committee. The President has already created, by executive order, a “National Health Information Technology Coordinator,” and appointed David J. Brailer, MD, PhD, to fill the position. He’s also requested $100 million in funding for projects that demonstrate the promise of HIT, and has begun integrating events that highlight HIT into his campaign activities (as we’ve noted here previously).
The committee’s report is divided into two basic parts: 8 recommendations under the general theme of “Promoting the Electronic Health Record, Clinical Decision Support and Computerized Provider Order Entry”; and 4 additional recommendations concerning “Promoting secure, private, interoperable health information exchange.” Though the final version isn’t yet on the web, the draft version was approved virtually unchanged. More detail on the report in a future post.
The Cybersecurity panel didn’t have much to report as they’re still digesting the public testimony they took at the last PITAC meeting. They plan another public meeting of the subcommittee on July 29th, at which they’ll take testimony from representatives of DHS’s Homeland Security Advanced Research Projects Agency (HSARPA — DHS’s version of DARPA), NIST, DARPA, Gartner, FBI, DOD’s DDR&E, and NSA. They’ll also hold a public “town hall meeting” in conjunction with the GovSec 2004 conference in Washington, DC, on July 29th.
The third portion — really the bulk of the meeting — was devoted to the first taking of testimony by the subcommittee on computational science. The committee heard from four witnesses: Erik Jakobsson, Director of the Center for Bioinformatics and Computational Biology at NIH; Michael Strayer, Director of SciDAC at DOE’s Office of Science; Arden Bement, Interim Director of NSF; and Ken Kennedy, Rice University professor and former co-chair of the previous incarnation of PITAC. All three government witnesses provided some detail about their agency’s efforts in computational science.
Jakobsson explained why computational science is so important at NIH — it’s essential for understanding biological complexity. He quoted NIH Director Elias Zerhouni’s defense of computational science at NIH: “You can’t translate what you don’t understand.” The main issue with computational biology at NIH is that while its success has made it integral to research supported by the agency, deficiencies in the software (primarily) and the hardware (less so) are now the “rate-limiters” in biological innovation and discovery. PITAC Co-chair Ed Lazowska sought to understand whether NIH’s appreciation of the importance of computational science in its research was reflected in support for research in fundamental computer science — something NIH had been loath to support in the past. Jakobsson replied that some recent solicitations by the agency were designed to attract domain-independent computational science research, but that the verdict on whether they were successful in that endeavor was not yet in.
DOE Office of Science Director Ray Orbach was unable to attend, so Michael Strayer, Director of the Office’s Scientific Discovery through Advanced Computing (SciDAC) program, attended in his stead. Strayer made a couple of interesting points about the character of DOE Office of Science research: there is “no filter” at SciDAC — no prohibitions against the participation of foreign-born students in SciDAC-related research; more than 50 percent of SciDAC funding goes to university researchers; and 65 percent of requests for time on DOE systems come from university-based researchers. Demand far exceeds available cycles, Strayer said, but DOE Office of Science is making the cycles available even at the sacrifice of some of its “core” DOE programs.
Arden Bement gave the standard NSF presentation on Cyberinfrastructure, but noted that NSF is concerned with finding the right balance between computer science, domain science and general research. Specifically, NSF hopes to grow current cyber efforts, including:
$60 million in FY 05 for supercomputer operations;
adding up to 50 teraflops in capacity to PSC in FY 05-06;
$10 million in training and mentoring grants;
restarting the HEC-University Research Activity; and
developing domain specific and generic computational science activities.
On this last point, Bement said he hopes NSF will expand its domain-specific activities, and that domain-independent research will continue to see its funding in CISE. Bement was also asked to compare the US’s current efforts to those of the Japanese — especially in light of the Japanese Earth Simulator. Bement noted he thought the Japanese have “reached the peak of what vector machines can do” and that “surpassing it will take hybrid machines.” Bement said efforts in CISE will focus beyond the frontiers — quantum and DNA-based computing, for example.
Regarding the agency’s cyberinfrastructure plan, CISE AD Peter Freeman noted that the Atkins report, which included a lofty $1 billion recommended increase for cyberinfrastructure programs at NSF, was an important guiding document, but probably an unrealistic request given the current fiscal environment. However, he said, if you consider the Atkins report to be “broader than just NSF,” then the aggregate efforts of the various agencies have a chance of reaching that number — but the agencies must coordinate better to leverage the resources of the federal government.
The final witness was Ken Kennedy, Rice University professor and former co-chair of the PITAC committee at the time of the release of the 1999 Information Technology Research: Investing in Our Future report. He delivered a report card on the federal government’s achievements in IT R&D since that report. In short, his report raises some concerns about the composition, management and increasingly short-term focus of the federal effort. His slides are an excellent overview of the current situation, so I’ve posted them here with his permission. Well worth a read.
The next full meeting of PITAC will be October 12, 2004. At that time, I think we’ll see a draft report from the Cybersecurity subcommittee, and perhaps some additional activity by the computational science committee. Whatever the case, keep it tuned here for the details as they happen.
House Science Committee Approves Two Computing Measures

The House Committee on Science today approved by voice vote two bills authorizing and codifying policy for High Performance Computing research and development at agencies under the committee’s jurisdiction. H.R. 4218, the High-Performance Computing Revitalization Act of 2004, attempts to address concerns within the computing community about interagency coordination in the government-wide Networking and Information Technology Research and Development (NITRD) program generally, and within the high-performance computing community specifically. In essence, the bill tries to do three things:
Make sure US researchers have access to the best machines available;
Make sure research moves forward on a broad range of architectures, software, applications, algorithms, etc.; and,
Assure the interagency planning process really works.
We’ve covered the bill previously here.
There was only one amendment to the bill, offered (and then withdrawn) by Rep. Brad Sherman (D-CA), that would have required a study of the moral and legal implications of computing research that might lead to machine cognition “equal or greater than human cognition.” Citing alternate possible futures for the research, Sherman invoked both Data from Star Trek: The Next Generation and HAL 9000 from 2001: A Space Odyssey while raising concerns about the implications of supercomputing efforts leading to machines smarter than humans in “as few as 25 years.” Committee Chairman Sherwood Boehlert (R-NY) opposed the amendment, objecting primarily to its broad language. “While I appreciate the gentleman’s intent,” Boehlert said, “the amendment is so broadly written it would bring computer science research to a halt.” Would a computer that could beat a man at chess qualify, Boehlert wondered? Sherman agreed that the amendment could use some refinement, but also indicated it was a significant step from his original amendment, which would have prohibited research in any area of computer science that could lead to machine cognition on par with human cognition. Pointing out that the amendment was not likely to pass, Boehlert prevailed upon Sherman to withdraw it rather than ask for a recorded vote, which would have required calling back the members of the committee who had left the markup for other engagements. Sherman agreed, and the bill passed by voice vote without further amendment.
Bill #2 was H.R. 4516, the Department of Energy High End Computing Act, introduced by Reps. Judy Biggert (R-IL) and Lincoln Davis (D-TN), which would authorize high-end computing R&D at the Department of Energy. The bill is similar to a Senate bill introduced by Sens. Jeff Bingaman (D-NM) and Lamar Alexander (R-TN), which is loosely modeled on the recommendations of the report (pdf) from the High End Computing Revitalization Task Force workshop CRA hosted last June. That bill would authorize $250 million worth of HEC R&D at the Department of Energy through FY 2009 ($40M in FY 05, building to $60M in FY 09); it would authorize $500 million through FY 09 ($100M a year from FY05-FY09) to construct a new supercomputer with “100 times the capability” of the fastest computer in existence at enactment; and it would authorize $50 million through FY09 ($10M a year) for the creation of a High-end Software Development Center. The money authorized would be “new” money, so appropriators would have to come up with additional money to fund it.
H.R. 4516 is a bit more modest than the Senate version, only authorizing $50 million in FY 05 for HEC R&D, building to $60 million in FY 07.
Having been approved by the Science Committee, both bills should head to the House floor soon, perhaps as early as next week. Watch this space for further details.
Administration says more Cyber Security Research and IT Security Personnel Needed

Thanks to Jeff Grove of ACM for pointing out this story (subscription req’d) by William New in National Journal’s Tech Daily, covering remarks by Department of Homeland Security Chief Security Officer Jack Johnson, DHS Chief Information Officer Steve Cooper, and FAA Deputy Director Thomas O’Keefe suggesting the great need for information security professionals in government and increased cyber security research and development. Some choice quotes:
“There is an incredibly shrinking pool of IT security professionals in government,” said Jack Johnson, chief security officer at the Homeland Security Department. “The bench is not just thin; the bench is non-existent,” he added in a sports reference to backup players. “We need to train the next generation” of IT professionals.
Johnson said Homeland Security does not have the IT workforce to build the systems it needs and is “absolutely dependent” on help from the research and academic communities. The department contracts a lot of work outside government, he said, but there are a limited number of cleared contractors and high turnover of personnel.
…
Thomas O’Keefe, deputy director of the Federal Aviation Administration (FAA) office of information systems security, said more research and development, and more collaboration among researchers and industry, is needed on cyber security.
“The sharing amongst bad guys is growing,” he said at a SecureE-Biz.net conference. “The sharing amongst the good guys on procurement, technology and approach needs to grow at an equal or greater rate. My observation is we’re just not as good at it.”
O’Keefe said firms are reluctant to mention their vulnerabilities because it may “unnecessarily put concern in people’s minds.” His office is working with the National Science Foundation to boost cyber-security research, as it is “still very small,” he said. He and others on the panel predicted continually growing cyber attacks. “You’ve got to expect cyber storms,” he said.
The president last year signed a law authorizing a significant increase in cyber-security R&D funding, but it was not requested in the fiscal 2005 White House budget proposal.
NSF Study Finds 2/3 of Federal R&D Funding in the Last Decade went to the Life Sciences

The profile of federally funded R&D at universities and colleges that emerges from this analysis raises issues of proportionality. Specifically, in the current funding profile, approximately two-thirds of the federal funds going to universities and colleges for the conduct of R&D is focused on only one field of science (life science), and federal R&D funding is concentrated at only a few research universities. These findings raise questions about whether other critical national needs that have substantial R&D components (such as environment, energy, homeland security, and education) are receiving the investment they require and whether the concentration of dollars at a few institutions is shortchanging science students at institutions that receive little or no federal R&D funding.
This finding is from a recently released report (pdf) by the Science and Technology Policy Institute for the National Science Foundation.
Richard Jones of the American Institute of Physics has a good summation of the report and the questions it raises about the federal R&D portfolio here.
Gov Reform Committee Plans, then Postpones IT R&D Hearing

The House Government Reform Subcommittee on Technology, Information Policy, Intergovernmental Relations and the Census planned to hold a hearing this Wednesday on “Defining Federal Information Technology Research and Development: Who? Where? What? Why? and How Much?” However, the events surrounding former President Ronald Reagan’s memorial here in DC have resulted in the hearing’s postponement. No news on a new date.
The committee had planned to hear testimony from two panels including NSF CISE Assistant Director Peter Freeman, DOE Office of Science Director Ray Orbach, NCO/IT R&D Director Dave Nelson, and CRA Government Affairs co-Chair and PITAC co-Chair Ed Lazowska.
In the process of putting together his testimony, Lazowska developed a really nice set of bullet points making the case for federal support of IT R&D. So despite the hearing postponement, I’ve decided to post them here. In the next day or so I’ll add them to the Government Affairs site proper, but for now, here they are:
Advances in information technology (IT) are changing our lives, driving our economy, and transforming the conduct of science.
America is the world leader in IT innovation because of a complex interplay of universities, industry, and the federal government.
Essentially every aspect of IT upon which we rely today – every billion-dollar sub-category of the IT industry – bears the clear stamp of federally-supported university-based research. These relatively modest investments have played an essential role in the past, and will play an essential role in the future. [see figure 1]
Don’t confuse the IT industry’s research and development (R&D) expenditures with fundamental research that’s guiding our way to the future. The vast majority of corporate R&D in IT – far more than 95% – involves the engineering of the next version of the product. This development is essential. But the transforming ideas – and our nation’s long-term leadership – come from research. IT companies do very little of that. It is a natural and essential role of government to support fundamental research – R&D that looks out 5, 10, or 15 years, rather than just one product cycle.
An important aspect of federally-supported university-based research is that it produces people, as well as ideas. There is a huge projected shortfall in IT workers over the next 10 years – the vast majority of the entire projected workforce shortfall in all of science and engineering is in information technology. And these are jobs that require a bachelor’s-level education or greater. [see figure 2 (pdf 48kb)]
While the overall federal investment in research has been increasing over the past 30 years, the vast majority of this increase has been in the biomedical fields. Compared to that, all other fields have been flat-lined. [see figure 3]
Recent increases in federal support for IT research, while important, have fallen far short of the level recommended by PITAC in 1999. The overall level of support continues to be dangerously inadequate in the context of the importance of the field and the opportunity for further advances. [see figure 4]
While many federal agencies are engaged in supporting IT R&D, two of these agencies have played by far the dominant role in driving IT innovation over the past 50 years: NSF and DARPA. No other agencies come close.
The research community has significant concerns about the continued low level of funding for the CISE Directorate at NSF. Additionally, the research community has significant concerns about several aspects of DARPA’s programs that discourage university participation in defense-related IT research.
There are additional concerns about the Department of Homeland Security’s failure to invest in cybersecurity R&D. Of DHS’s new R&D budget of nearly $1 billion, less than 2% is being invested in cybersecurity R&D. And even this shockingly low level of investment was the result of a Congressional outcry – DHS initially proposed less than 1% (see the quick arithmetic after this list). IT systems constitute the control loop of most other elements of our nation’s critical infrastructure (e.g., the electric power grid, the air traffic control grid, the financial grid, the telecommunications grid), and constitute a significant vulnerability.
The track record is clear: the relatively modest federal IT R&D investment pays enormous dividends: changing our lives, driving our economy, and transforming the conduct of science.
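For scale, here is the quick arithmetic behind the DHS percentages above (a rough sketch; the bullet gives only “nearly $1 billion” and the percentage bounds, so the dollar figures are estimates):

# Rough dollar figures implied by the DHS percentages (estimates only).
dhs_rd_budget = 1_000_000_000  # "nearly $1 billion"
print(f"under 2% -> less than ${0.02 * dhs_rd_budget / 1e6:.0f}M for cybersecurity R&D")
print(f"under 1% -> less than ${0.01 * dhs_rd_budget / 1e6:.0f}M initially proposed")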