Appropriations Roundup
In: Funding / by Peter Harsha

Here's some of the agency-by-agency wrap-up in the wake of the FY 2005 Omnibus Bill. We've detailed some of the blow-by-blow in the lead-up to final passage here. All figures include the 0.80 percent across-the-board cut imposed on all non-defense agencies to pay for additional spending in other parts of the bill.

NSF: NSF will lose $105 million for FY 05 (compared to FY 04), a cut of 1.9%. The largest cut is to the Education and Human Resources Directorate ($98 million, 10%), with most of that cut falling on the Graduate Education and Research, Evaluation & Communication accounts. The Major Research Equipment account will see an increase of about $19 million over FY 04. Research and Related Activities (home of CISE) was to be held essentially flat for FY 05, but will lose $30 million (0.7%) as a result of the across-the-board cut. Here's the breakout:
FY 2005 NSF Appropriations (in millions)

| Account | FY 2004 Level | FY 05 Budget Request | FY 2005 House Mark | FY 2005 Senate Mark | FY 2005 Final Approps* | $ Change, Final vs FY 04 | % Change, Final vs FY 04 |
|---|---|---|---|---|---|---|---|
| Research and Related Activities | $4,251 | $4,452 | $4,152 | $4,402 | $4,221 | -$30 | -0.7% |
| Major Research Equipment | $155 | $213 | $208 | $130 | $174 | $19 | 12% |
| Education and Human Resources | $939 | $771 | $843 | $929 | $841 | -$98 | -10% |
| Salaries and Expenses | $219 | $294 | $250 | $269 | $223 | $4 | 1.8% |
| National Science Board | $4 | $4 | $4 | $4 | $4 | $0 | 0% |
| Inspector General | $10 | $10 | $10 | $10 | $10 | $0 | 0% |
| Total | $5,578 | $5,745 | $5,467 | $5,745 | $5,473 | -$105 | -1.9% |

*includes 0.80 percent across-the-board cut
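For readers who want to double-check the table, the change columns are just FY 2005 final minus FY 2004, and that difference as a share of FY 2004. A quick sketch (figures transcribed from the table above):

```python
# FY 2004 level and FY 2005 final appropriation, in millions of dollars,
# transcribed from the table above.
accounts = {
    "Research and Related Activities": (4251, 4221),
    "Major Research Equipment":        (155,  174),
    "Education and Human Resources":   (939,  841),
    "Salaries and Expenses":           (219,  223),
    "National Science Board":          (4,    4),
    "Inspector General":               (10,   10),
}

changes = {}
for name, (fy04, fy05) in accounts.items():
    dollars = fy05 - fy04                 # $ change, in millions
    percent = 100.0 * dollars / fy04      # % change vs FY 04
    changes[name] = (dollars, percent)

total_04 = sum(fy04 for fy04, _ in accounts.values())   # $5,578M
total_05 = sum(fy05 for _, fy05 in accounts.values())   # $5,473M
# Total change: -$105M, or about -1.9 percent.
```

The account-level figures sum exactly to the table's totals, which is a decent sanity check on the transcription.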
Department of Energy Office of Science: The Office of Science received a 2.8 percent increase over FY 2004, to $3.6 billion. Included in the increase was $30 million for the development of a "Leadership Class" supercomputer at DOE ($25 million for hardware, $5 million for software development). Some additional details here.

NIST Labs: The Labs faced a dire funding situation as a result of last year's omnibus appropriation, but got some of that funding back this year in the form of a 10 percent increase, to $379 million. Not as good as the Senate-approved level of $384 million, but better than the House-approved level of $375 million.

NASA: The NASA budget will increase 4.6 percent for FY 2005, to $16.1 billion, thanks in part to $800 million in additional funding targeted for the President's Moon and Mars initiative. The $800 million was necessary to avoid a veto from the President and to ensure the support of House Majority Leader Rep. Tom DeLay. Unfortunately, given the strict funding constraints placed on the appropriations committee by the congressional leadership and the Administration, the additional funding had to come at the expense of other agencies within the bill.

National Institutes of Health: The National Institutes of Health (NIH) budget of $28.6 billion is just 2 percent above last year's funding level, well off the 15 percent annual increases of 1998 through 2003. Most NIH institutes will receive increases of between 1.6 and 2.5 percent.

Rep. Vern Ehlers (R-MI) was among the first to issue a press release condemning the decrease in funding for the National Science Foundation in the Omnibus Bill. His press release can be found after the jump.
NSF FY 2005 Appropriations Update
In: Funding / by Peter Harsha

(Scroll down for the latest updates)
Conflicting rumors abound regarding the outlook for NSF in the FY 05 appropriations process. As House and Senate negotiators attempt to put the finishing touches on an omnibus appropriations bill by Friday or Saturday, word comes that NSF will likely not fare well in the bill. GovExec.com reports that a bit of rule-bending employed by the Senate to "find" an additional $1.2 billion in funding in their version of the VA-HUD-Independent Agencies appropriations bill, which includes funding for NSF, isn't acceptable to the House leadership or the White House budget office. So in order to stay within the budget cap, appropriators will have to offset any increase in spending with funding from elsewhere in the bill.
In order to fund the President’s lunar/Mars initiative at NASA, it appears other agencies in the bill will bear the brunt. GovExec.com reports that NSF is slated for a $60 million cut overall compared to the agency’s FY 2004 funding level, but that “research funding” — presumably the agency’s Research and Related Activities account, which contains funding for NSF CISE — will “remain frozen” at FY 2004 levels.
A second rumor making the rounds suggests that the situation at NSF may be even more grim, with funding levels below the levels approved by the House appropriations committee. That level, you may recall, would be a 2.0 percent reduction in NSF’s budget compared to FY 2004.
We continue to press Congress on the importance of supporting funding at NSF at adequate levels. Keep an eye on this space for the latest details….

Update (11/19 12:30pm): Now hearing that the $60 million cut to NSF's non-research accounts is in addition to an across-the-board cut of 0.75 percent to all agencies, which would translate into another $41 million from NSF. Also hearing the bill will be released at 2 pm today.

Update (11/20 11:30am): We've got the final numbers for NSF in the bill. $98 million cut from the Education and Human Resources account (plus an increase of $19 million to the Major Research Equipment account), and an across-the-board cut of 0.75 percent. Here's a copy of the bill language (pdf, 360kb) for NSF. Here's the final breakout:
FY 2005 NSF Appropriations (in millions)

| Account | FY 2004 Level | FY 05 Budget Request | FY 2005 House Mark | FY 2005 Senate Mark | FY 2005 Final Approps* | $ Change, Final vs FY 04 | % Change, Final vs FY 04 |
|---|---|---|---|---|---|---|---|
| Research and Related Activities | $4,251 | $4,452 | $4,152 | $4,402 | $4,221 | -$30 | -0.7% |
| Major Research Equipment | $155 | $213 | $208 | $130 | $174 | $19 | 12% |
| Education and Human Resources | $939 | $771 | $843 | $929 | $841 | -$98 | -10% |
| Salaries and Expenses | $219 | $294 | $250 | $269 | $223 | $4 | 1.8% |
| National Science Board | $4 | $4 | $4 | $4 | $4 | $0 | 0% |
| Inspector General | $10 | $10 | $10 | $10 | $10 | $0 | 0% |
| Total | $5,578 | $5,745 | $5,467 | $5,745 | $5,473 | -$105 | -1.9% |

*includes 0.80 percent across-the-board cut
Last Update: We've got the joint statement from the conferees regarding the NSF funding levels online now.

I lied. One More Update: Two things. One, the Energy and Water Appropriations bill did make it into the Omnibus, and it did contain the $30 million for DOE's Leadership Class Supercomputer ($25 million for hardware, $5 million for software development) we've covered recently. Two, the across-the-board cut was actually 0.8%, not 0.75% as I reported above. I'll make the corrections soon, but at NSF's level of resolution, it shouldn't change things too much. Ok, chart is updated (11/22).
Supercomputing Authorization Heads to President
In: Funding / by Peter Harsha

The House today re-passed HR 4516, the High End Computing Revitalization Act of 2004, which would authorize the creation of a "leadership class" supercomputer at DOE and a "High-end Software Development Center." The House action means that the bill will now head to the President, who is expected to sign it.
We’ve covered the bill in detail in this space previously. Because it’s an authorization, it doesn’t actually include any money (just “authorizes” sums to be spent should the money get appropriated). Funding for a “leadership class” computer ($30 million, including $25 million for hardware) is included in the House version of the FY 2005 Energy and Water appropriations bill. However, it’s unlikely that bill will make it into the Omnibus Appropriations bill expected to be considered later this week because portions dealing with the proposed Yucca Mountain nuclear waste repository are deemed too contentious to get resolved before Congress adjourns. This means those agencies funded under the Energy and Water bill may not get an appropriation for FY 05 and may instead operate under a special “continuing resolution.” It’s not clear at this point what that continuing resolution might look like and whether or not it would contain any funding for the proposed supercomputer.
We’ll have a better idea by Thanksgiving when the 108th Congress is expected to adjourn for good.
The House Science Committee issued a press release marking the passage of HR 4516, but it doesn’t appear to be on their website yet. You can find it after the jump. Update: The Chronicle of Higher Ed has more (sub req’d), including a quote from CRA board member Dan Reed:
Daniel A. Reed, vice chancellor for information technology at the University of North Carolina at Chapel Hill, said that the law would increase the political visibility of supercomputing in the United States. Mr. Reed and other supporters of the bill say that the American supercomputing industry has lost its competitiveness and is not making products that can be used for cutting-edge research.
“This will help put it back on the front burner,” Mr. Reed said.
Update (11/22): The Energy and Water appropriations bill referred to above did get included in the Omnibus Appropriations bill, and it did include $30 million for DOE's Leadership Class computing effort — $25 million for hardware, $5 million for software development.

Update (11/30): The President has signed the bill!
NY Times on the DOD's "War Net"
In: Research / by Peter Harsha

Tim Weiner has an interesting piece in today's New York Times about the Defense Department's efforts to build its own Internet — the Global Information Grid. From the article:

The goal is to give all American commanders and troops a moving picture of all foreign enemies and threats – "a God's-eye view" of battle.
This “Internet in the sky,” Peter Teets, under secretary of the Air Force, told Congress, would allow “marines in a Humvee, in a faraway land, in the middle of a rainstorm, to open up their laptops, request imagery” from a spy satellite, and “get it downloaded within seconds.”
The total cost of the project is expected to run to $24 billion over the next five years, plus an additional $5 billion for data encryption technologies.
Weiner quotes Vint Cerf in the piece, who is consulting on the project:
Vint Cerf, one of the fathers of the Internet and a Pentagon consultant on the war net, said he wondered if the military’s dream was realistic. “I want to make sure what we realize is vision and not hallucination,” Mr. Cerf said.
“This is sort of like Star Wars, where the policy was, ‘Let’s go out and build this system,’ and technology lagged far behind,” he said. “There’s nothing wrong with having ambitious goals. You just need to temper them with physics and reality.”
As we've noted before, DOD funding policies — especially at DARPA — have likely hamstrung some of the technological progress that will be required to make full use of DOD's network-centric strategy. University researchers, who played an important role in the development of the ARPANET, are increasingly unable to participate in DARPA-led networking research because much of that work is classified. Additionally, the style of the DARPA-sponsored research — more short-term than long-term — and a milestone-based approach to awarding the funding, with go/no-go decisions at 12 to 18 month intervals, aren't well-suited to a university research setting. Because researchers are unwilling to propose work that can't demonstrate results in 12-18 months, what's proposed tends to be evolutionary, incremental research rather than revolutionary proposals. And it looks like the new network may need some revolutionary proposals to reach its full potential:
To realize this vision, the military must solve a persistent problem. It all boils down to bandwidth.
Bandwidth measures how much data can flow between electronic devices. Too little for civilians means a Web page takes forever to load. Too little for soldiers means the war net will not work.
The bandwidth requirements seem bottomless. The military will need 40 or 50 times what it used at the height of the Iraq war last year, a Rand Corporation study estimates – enough to give front-line soldiers bandwidth equal to downloading three feature-length movies a second.
The Congressional Research Service said the Army, despite plans to spend $20 billion on the problem, may wind up with a tenth of the bandwidth it needs. The Army, in its “lessons learned” report from Iraq, published in May, said “there will probably never be enough resources to establish a complete and functioning network of communications, sensors, and systems everywhere in the world.”
The bottleneck is already great. In Iraq, front-line commanders and troops fight frequent software freezes. "To make net-centric warfare a reality," said Tony Montemarano, the Defense Information Systems Agency's bandwidth expansion chief, "we will have to precipitously enhance bandwidth."

Anyway, an interesting piece. Read the whole thing.
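As a rough sanity check on the Rand figure quoted above, "three feature-length movies a second" translates into raw bandwidth roughly as follows (a back-of-envelope sketch; the 1.5 GB movie size is my assumption for illustration, not a figure from the article):

```python
# Back-of-envelope: what does "three feature-length movies a second"
# mean in network terms? Assume a movie is roughly 1.5 GB (an assumed
# figure; the article gives no file size).
movie_gigabytes = 1.5
movies_per_second = 3
bits_per_byte = 8

gigabits_per_second = movie_gigabytes * movies_per_second * bits_per_byte
# ~36 Gbit/s delivered to a single front-line soldier -- thousands of
# times the capacity of a typical 2004-era broadband connection.
```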
VA-HUD Appropriations Update…Not Good
In: Funding / by Peter Harsha

As we've reported recently, the House and Senate Appropriations Committees have approved two markedly different versions of the FY 05 VA-HUD-Independent Agencies Appropriations bill that contains funding for NSF and NASA. The House bill, which stuck strictly to House-approved budget caps, cut NSF by 2.0 percent across the board. The Senate bill employed some rule-bending and freed up enough funding in the bill to provide NSF with a 3 percent increase (the President's requested level), including a 4.1% increase for CISE. Neither bill made it far enough in the appropriations process to get approval from either chamber.
It now appears that the VA-HUD bill will get folded in to the omnibus appropriations bill expected to be assembled when Congress returns on Nov 16th, but will include numbers far more similar to the House levels than the Senate.
In response to the original House bill, CRA activated its Computing Research Advocacy Network (CRAN) to urge the Senate to adopt higher numbers for NSF, and for the Computer and Information Science and Engineering (CISE) directorate in particular. CRAN's effort was reasonably successful: CISE's increase in the Senate bill was the largest of any of the major directorates.
In response to the latest developments, CRA is once again calling on CRAN to get involved. Members of the appropriations conference committee need to hear from CRAN members, especially those whose representatives sit on the House and Senate Appropriations Committees (who will serve as the conferees), about the importance of supporting NSF at the highest possible level. And they need to hear before November 16th!
We’ve updated the CRAN Alert page to reflect the new situation and changed our sample letters as well. If you’re a member of CRAN, please contact your Senators and Representative in the House. If you’re not, please join!
We’ll have more details on the effort and the outcome as they emerge.
CSTB Releases Supercomputing Report
In: Funding, Research / by Peter Harsha

Just in time for the Supercomputing '04 conference, the National Academies' Computer Science and Telecommunications Board has released its report on the needs for U.S. supercomputing, Getting Up to Speed: The Future of Supercomputing.
Study chairs Susan Graham, UC Berkeley, and Marc Snir, UIUC (and a CRA board member), will present the report here at SC 04 on Friday, November 12, at 8:30 am.
The report concludes
that the demands for supercomputing to strengthen U.S. defense and national security cannot be satisfied with current policies and levels of spending. The federal government should provide stable, long-term funding and support multiple supercomputing hardware and software vendors in order to give scientists and policy-makers better tools to solve problems in areas such as intelligence, nuclear stockpile stewardship, and climate change.
John Markoff of the New York Times has more on the report in a story today. Here's a snippet:

"Our situation has deteriorated during the past 10 years," said Susan L. Graham, a computer scientist at the University of California, Berkeley, who was co-chairwoman of the panel.
The authors of the report, which was prepared for the Energy Department, said they were recommending that the federal government spend $140 million annually on new supercomputing technologies. The federal government currently spends about $42 million each year, according to a recent report of the High End Computing Revitalization Task Force, a federal government working group.
“If we don’t start doing something about this now there will be nothing available in 10 years when we really need these systems, ” Ms. Graham said.
DOE and IBM Supercomputer Now the World's Fastest
In: Research / by Peter Harsha

IBM's Blue Gene/L, being built for the National Nuclear Security Administration at Lawrence Livermore National Lab, attained 70.72 teraflops in recent testing, more than twice as fast as the current top machine on the Top500.org supercomputers list. Secretary of Energy Spencer Abraham made the announcement today, noting that in its final form, Blue Gene/L will be about 9 times faster than the current #1 machine, the Japanese Earth Simulator.
From the release:
"High performance computing is the backbone of the nation's science and technology enterprise," [Abraham said,] "which is why the Department has made supercomputing a top priority investment. Breakthroughs in applied scientific research are possible with the tremendous processing capabilities provided by extremely scalable computer systems such as BlueGene/L."
The New Scientist has more on the story here.
We noted Blue Gene/L’s first record-breaking performance figures back in September.
The next version of the official Top500 list, with Blue Gene/L expected at the top, will be released at next week’s Supercomputing 2004 conference.
The President's Information Technology Advisory Committee met "virtually" today to hear an update on the efforts of the panel's subcommittee on computational science. Dan Reed, who does just about everything at the University of North Carolina (Chancellor's Eminent Professor, Vice Chancellor for IT and CIO, and Director of the Renaissance Computing Institute — not to mention a current CRA board member), chairs the subcommittee and led the discussion of the subcommittee's efforts. His slides (pdf) provide a pretty good summary of his talk. (Check slide 5 for a pic of Dan — back row, beneath the seal, with the beard.)
The Subcommittee has been tasked with figuring out:
1. How well is the federal government targeting the right research areas in computational science and are current agency priorities appropriate?
2. How well is federal funding for computational science balanced between short and long-term research, and low and high-risk research? Which areas of research have the greatest promise?
3. How well is funding balanced between the underlying techniques of computational science vs. applications in the science and engineering domains? Which areas have the greatest promise?
4. How well is computational science training and research integrated into the scientific domains that rely on computational science?
5. How effectively do federal agencies coordinate?
6. How has the federal investment kept up with the changing technology?
7. What barriers hinder realizing the highest potential of computational science?

Dan's presentation has more detail, but in short, the subcommittee has made some progress toward answering those questions and has already gotten some good input from the community (but is still looking for more). It looks like the final report will emphasize how crucial computing has become to the progress of science, as well as to U.S. competitiveness and national security. The subcommittee makes the point that computing has become the third component of scientific discovery, complementing theory and experiment, and that it's so integral that its limitations constrain scientific discovery.
Additionally, the subcommittee notes that complex multidisciplinary problems, from public policy through national security to scientific discovery and economic competitiveness, have emerged as new drivers of computational science.
One nugget I found especially interesting from the presentation was an example of both the economic benefit and the health and safety benefit that would arise from more capable modeling enabled by advanced computing. The subcommittee noted that 40 percent of the $10 trillion U.S. economy is affected by climate and weather. As one example, the subcommittee cited the hurricane warnings provided by the National Hurricane Center and the cost of the evacuations that often result. According to the subcommittee, there is $1 million in economic loss for each mile of coastline evacuated. With current models, the U.S. now "over-warns" by a factor of 3, with the average over-warning for a hurricane resulting in 200 miles of evacuations — or $200 million in unnecessary loss per event. Improved modeling (better algorithms, better software, more capable hardware, etc.) would improve the accuracy of forecasts, saving lives and resources. As someone tasked with making "the case for IT R&D" to Hill and Administration policymakers, I can tell you that these sorts of examples really resonate.
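The subcommittee's $200 million figure falls straight out of its two inputs, as a quick sketch shows (figures from the presentation as described above):

```python
# Figures cited by the PITAC subcommittee: roughly $1M in economic loss
# per mile of coastline evacuated, and an average of 200 miles of
# unnecessary evacuation per hurricane warning (over-warning by ~3x).
loss_per_mile_usd = 1_000_000
avg_overwarned_miles = 200

unnecessary_loss_per_event = loss_per_mile_usd * avg_overwarned_miles
# -> $200,000,000 in avoidable loss per warning event, matching the
# subcommittee's $200M estimate.
```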
The presentation has the full scoop, so I encourage you to read it and, even better, provide your input to the subcommittee. Dan’s contact information is in the presentation, or I’d be happy to forward input to the subcommittee as well. Additionally, the subcommittee will hold a “town hall” meeting at next week’s Supercomputing 2004 conference in Pittsburgh. So if you’re headed to the conference, plan on making it to the November 10th BOF session they’ve scheduled.
The subcommittee will then spend November and December gathering further input and drafting the report. They’ll present a draft at a January 2005 PITAC meeting, with the final draft hopefully approved by the full committee in March 2005.
With the current Administration now certainly in place for the next four years, the subcommittee’s report has the potential to be fairly influential in shaping federal support for computational science over the long term, so it’s definitely worth contributing to the effort.
Computer Scientists David Dill, Ed Felten, Joe Hall, Avi Rubin, Barbara Simons, Adam Stubblefield, and Dan Wallach have joined forces at evoting-experts.com to post news and commentary on e-voting issues (just in time for election day). The site has only been up a day or two and already has some good commentary on reports of voting problems in Texas, as well as a bunch of handy links.
If chaos does ensue on Tuesday (and even if it doesn’t), the site looks like it will be a great place to check in and get the scoop with a technical perspective.
The quality and trustworthiness of commercial software has become a matter of increasing concern to NSA officials, who are responsible for the security of Defense Department and intelligence software. NSA officials anticipate that many companies on whose software DOD and intelligence users rely will be moving significant portions of their commercial software development overseas within a few years.
NSA officials cannot force companies to develop software a certain way, Wolf said, “but we would like to get them to a point where they are producing commercial products that meet the needs of our users.” About 95 percent of the agency’s desktop PCs run Microsoft’s Windows operating system, Wolf said.
Appropriations Roundup
/In: Funding /by Peter HarshaHere’s some of the agency-by-agency wrap-up in the wake of the FY 2005 Omnibus Bill. We’ve detailed some of the blow-by-blow in the lead-up to final passage here. All figures include the 0.80 percent across-the-board cut imposed on all non-defense agencies to pay for additional spending in other parts of the bill.
NSF: NSF will lose $105 million for FY 05 (compared to FY 04), a cut of 1.9%. The largest cut is to the Education and Human Resources Directorate ($98 million, 10%), with most of that cut falling on the Graduate Education and Research, Evaluation & Communication accounts. The Major Research Equipment account will see an increase of about $19 million over FY04. Research and Related Activities (home of CISE) was to be held essentially flat for FY05, but will lose $30 million (0.7%) as the result of the across-the-board cut. Here’s the breakout:
Department of Energy Office of Science: The Office of Science received a 2.8 percent increase over FY 2004, to $3.6 billion. Included in the increase was $30 million for the development of a “Leadership Class” supercomputer at DOE ($25 million for hardware, $5 million for software development). Some additional details here.
NIST Labs: The Labs faced a dire funding situation as a result of last year’s omnibus appropriation, but received some of that back this year in the form of a 10 percent increase, to $379 million. Not as good as the Senate appropriation level of $384 million, but better than the House approved level of $375 million.
NASA: The NASA budget will increase 4.6 percent for FY 2005 to $16.1 billion, thanks in part to $800 million in additional funding targeted for the President’s Moon and Mars initiative. The $800 million was necessary to avoid a veto from the President and to ensure the support of GOP majority whip Rep. Tom Delay. Unfortunately, given the strict funding constraints placed on the appropriations committee by the congressional leadership and the Administration, the additional funding had to come at the expense of other agencies within the bill.
National Institutes of Health: The National Institutes of Health (NIH) budget of $28.6 billion is just 2 percent above last year’s funding level, well off the 15 percent annual increases between 1998 and 2003. Most NIH institutes will receive increases between 1.6 and 2.5 percent.
Rep. Vern Ehlers (R-MI) was among the first to issue a press release condemning the decrease in funding for the National Science Foundation in the Omnibus Bill. His press release can be found after the jump.
Read more →
NSF FY 2005 Appropriations Update
/In: Funding /by Peter Harsha(Scroll down for the latest updates)
Conflicting rumors abound regarding the outlook for NSF in the FY 05 appropriations process. As House and Senate negotiators attempt to put the finishing touches on an omnibus appropriations bill by Friday or Saturday, word comes that NSF will likely not fare well in the bill. GovExec.com reports that a bit of rule-bending employed by the Senate to “find” an additional $1.2 billion in funding in their version of the VA-HUD-Independent agencies appropriation bill, which includes funding for NSF, isn’t acceptable to the House leadership or the White House budget office. So in order to stay within the budget cap, appropriators will have offset any increase in spending with funding from elsewhere in the bill.
In order to fund the President’s lunar/Mars initiative at NASA, it appears other agencies in the bill will bear the brunt. GovExec.com reports that NSF is slated for a $60 million cut overall compared to the agency’s FY 2004 funding level, but that “research funding” — presumably the agency’s Research and Related Activities account, which contains funding for NSF CISE — will “remain frozen” at FY 2004 levels.
A second rumor making the rounds suggests that the situation at NSF may be even more grim, with funding levels below the levels approved by the House appropriations committee. That level, you may recall, would be a 2.0 percent reduction in NSF’s budget compared to FY 2004.
We continue to press Congress on the importance of supporting funding at NSF at adequate levels. Keep an eye on this space for the latest details….
Update (11/19 12:30pm): Now hearing that the $60 million cut to NSF’s non-research account is in addition to an across-the-board 0.75 percent to all agencies, which would translate into another $41 million from NSF. Also hearing the bill will be released at 2 pm today.
Update (11/20 11:30am): We’ve got the final numbers for NSF in the bill. $98 million cut from the Education and Human Resources account (plus an increase of $19 million to the Major Research Equipment account), and an across-the-board cut of 0.75 percent. Here’s a copy of <a href=bill language (pdf, 360kb) for NSF. Here’s the final breakout:
(in millions)
Level
Budget Request
House Mark
Senate Mark
Final Approps*
FY 05 Final vs FY 04
FY 05 Final vs FY 04
Last Update: We’ve got the joint statement from the conferees regarding the NSF funding levels online now.
I lied. One More Update: Two things. One, the Energy and Water Appropriations bill did make it into the Omnibus, and it did contain $30 million for DOE’s Leadership Class Supercomputer ($25 million for hardware, $5 million for software development) we’ve covered recently.
Two, the across the board cut was actually 0.8%, not 0.75% as I reported above. I’ll make the corrections soon, but at NSF’s level of resolution, it shouldn’t change things too much.Ok, chart is updated (11/22).Supercomputing Authorization Heads to President
/In: Funding /by Peter HarshaThe House today re-passed HR 4516, the High End Computing Revitalization Act of 2004, which would authorize the creation of a “leadership class” supercomputer at DOE and a “High-end Software Development Center.” The House action means that the bill will now head to the President, who is expected to sign it.
We’ve covered the bill in detail in this space previously. Because it’s an authorization, it doesn’t actually include any money (just “authorizes” sums to be spent should the money get appropriated). Funding for a “leadership class” computer ($30 million, including $25 million for hardware) is included in the House version of the FY 2005 Energy and Water appropriations bill. However, it’s unlikely that bill will make it into the Omnibus Appropriations bill expected to be considered later this week because portions dealing with the proposed Yucca Mountain nuclear waste repository are deemed too contentious to get resolved before Congress adjourns. This means those agencies funded under the Energy and Water bill may not get an appropriation for FY 05 and may instead operate under a special “continuing resolution.” It’s not clear at this point what that continuing resolution might look like and whether or not it would contain any funding for the proposed supercomputer.
We’ll have a better idea by Thanksgiving when the 108th Congress is expected to adjourn for good.
The House Science Committee issued a press release marking the passage of HR 4516, but it doesn’t appear to be on their website yet. You can find it after the jump.
Update: The Chronicle of Higher Ed has more (sub req’d), including a quote from CRA board member Dan Reed:
Update (11/22): The Energy and Water appropriations bill referred to above did get included in the Omnibus Appropriations bill, and it did include $30 million for DOE’s Leadership Class computing effort — $25 million for hardware, $5 million for software development.
Update (11/30): The President has signed the bill!
Read more →
NY Times on the DOD’s “War Net”
/In: Research /by Peter HarshaTim Weiner has an interesting piece in today’s New York Times about the Defense Department’s efforts to build it’s own Internet — the Global Information Grid. From the article:
The total cost of the project is expected to run to $24 billion over the next five years, plus an additional $5 billion for data encryption technologies.
Weiner quotes Vint Cerf in the piece, who is consulting on the project:
As we’ve noted before, DOD funding policies — especially at DARPA — have likely hamstrung some of technological progress that will be required to make full use of DOD’s network-centric strategy. University researchers, who played an important role in the development of the ARPANET, are increasingly unable to participate in DARPA-led networking research because much of that work is classified. Additionally, the style of the DARPA-sponsored research — more short-term rather than long-term — and a milestone-based approach to awarding the funding, with go/no-go decisions at 12 to 18 month intervals, isn’t well-suited to a university research setting. Because researchers are unwilling to propose work that can’t demonstrate results in 12-18 months, what’s proposed tends to be evolutionary, incremental research, rather than revolutionary proposals. And it looks like the new network may need some revolutionary proposals to reach its full potential:
Anyway, an interesting piece. Read the whole thing.
VA-HUD Appropriations Update…Not Good
/In: Funding /by Peter HarshaAs we’ve reported recently, the House and Senate Appropriations Committees have approved two markedly different versions of the FY 05 VA-HUD-Independent Agencies Appropriations bill that contains funding for NSF and NASA. The House bill, which stuck strictly to House approved budget caps, cut NSF by 2.0 percent across the board. The Senate bill employed some rule-bending and freed up enough funding in the bill to provide NSF with a 3 percent increase (the President’s requested level), including a 4.1% increase for CISE. Neither bill made it far enough in the appropriations process to get approval from either chamber.
It now appears that the VA-HUD bill will get folded into the omnibus appropriations bill expected to be assembled when Congress returns on November 16th, but with numbers far closer to the House levels than to the Senate's.
In response to the original House bill, CRA activated its Computing Research Advocacy Network (CRAN) to urge the Senate to adopt higher numbers for NSF, and for the Computer and Information Science and Engineering (CISE) directorate in particular. CRAN's effort was reasonably successful: CISE's increase in the Senate bill was the largest of any of the major directorates.
In response to the latest developments, CRA is once again calling on CRAN to get involved. Members of the appropriations conference committee need to hear from CRAN members, especially those whose Senators and Representatives sit on the House and Senate Appropriations Committees (whose members will serve as the conferees), about the importance of supporting NSF at the highest possible level. And they need to hear before November 16th!
We’ve updated the CRAN Alert page to reflect the new situation and changed our sample letters as well. If you’re a member of CRAN, please contact your Senators and Representative in the House. If you’re not, please join!
We’ll have more details on the effort and the outcome as they emerge.
CSTB Releases Supercomputing Report
In: Funding, Research | by Peter Harsha
Just in time for the Supercomputing '04 conference, the National Academies' Computer Science and Telecommunications Board has released its report on U.S. supercomputing needs, Getting Up to Speed: The Future of Supercomputing.
Study chairs Susan Graham (UC Berkeley) and Marc Snir (UIUC, and a CRA board member) will present the report here at SC 04 on Friday, November 12, at 8:30 am.
The report concludes:
John Markoff of the New York Times has more on the report in a story today. Here’s a snippet:
DOE and IBM Supercomputer Now the World’s Fastest
In: Research | by Peter Harsha
IBM's Blue Gene/L, being built for the National Nuclear Security Administration at Lawrence Livermore National Lab, attained 70.72 teraflops in recent testing, more than twice as fast as the current top machine on the Top500.org supercomputers list. Secretary of Energy Spencer Abraham made the announcement today, noting that in its final form Blue Gene/L will be about 9 times faster than the current #1 machine, the Japanese Earth Simulator.
From the release:
The New Scientist has more on the story here.
We noted Blue Gene/L’s first record-breaking performance figures back in September.
The next version of the official Top500 list, with Blue Gene/L expected at the top, will be released at next week’s Supercomputing 2004 conference.
PITAC Focuses on Computational Science
In: Policy | by Peter Harsha
The President's Information Technology Advisory Committee met "virtually" today to hear an update on the efforts of the panel's subcommittee on computational science. Dan Reed, who does just about everything at the University of North Carolina (Chancellor's Eminent Professor, Vice Chancellor for IT and CIO, and Director of the Renaissance Computing Institute, not to mention a current CRA board member), chairs the subcommittee and led the discussion of its efforts. His slides (pdf) provide a pretty good summary of his talk. (Check slide 5 for a pic of Dan: back row, beneath the seal, with the beard.)
The Subcommittee has been tasked with figuring out:
Dan's presentation has more detail, but in short, the subcommittee has made some progress toward answering those questions and has already gotten good input from the community (though it is still looking for more). It looks like the final report will emphasize how crucial computing has become to the progress of science, as well as to U.S. competitiveness and national security. The subcommittee makes the point that computing has become the third component of scientific discovery, complementing theory and experiment, and that it is so integral that its limitations now constrain scientific discovery.
Additionally, the subcommittee notes that complex multidisciplinary problems, from public policy through national security to scientific discovery and economic competitiveness, have emerged as new drivers of computational science.
One nugget I found especially interesting from the presentation was an example of both the economic benefit and the health and safety benefit that would arise from more capable modeling enabled by advanced computing. The subcommittee noted that 40 percent of the $10 trillion U.S. economy is affected by climate and weather. As one example, the subcommittee cited the hurricane warnings provided by the National Hurricane Center and the cost of the evacuations that often result. According to the subcommittee, each mile of coastline evacuated represents $1 million in economic loss. With current models, the U.S. now "over-warns" by a factor of 3, with the average over-warning for a hurricane resulting in 200 miles of evacuations, or $200 million in unnecessary loss per event. Improved modeling (better algorithms, better software, more capable hardware, etc.) would improve the accuracy of forecasts, saving lives and resources. As someone tasked with making "the case for IT R&D" to Hill and Administration policymakers, I can tell you that these sorts of examples really resonate.
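The subcommittee's over-warning arithmetic is easy to verify. Here's a minimal back-of-the-envelope sketch in Python, using only the figures reported above (the variable names are mine, not the subcommittee's):

```python
# Back-of-the-envelope check of the subcommittee's hurricane over-warning figures.
# Inputs are the numbers reported in the presentation, as cited above.

loss_per_mile = 1_000_000    # ~$1M economic loss per mile of coastline evacuated
over_warned_miles = 200      # average unnecessary evacuation miles per hurricane

unnecessary_cost = loss_per_mile * over_warned_miles
print(f"Unnecessary loss per event: ${unnecessary_cost / 1e6:.0f} million")
# prints "Unnecessary loss per event: $200 million"
```

The point of the example is the sensitivity: any reduction in the over-warning factor translates linearly into avoided evacuation cost, which is why better models pay for themselves quickly.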
The presentation has the full scoop, so I encourage you to read it and, even better, provide your input to the subcommittee. Dan’s contact information is in the presentation, or I’d be happy to forward input to the subcommittee as well. Additionally, the subcommittee will hold a “town hall” meeting at next week’s Supercomputing 2004 conference in Pittsburgh. So if you’re headed to the conference, plan on making it to the November 10th BOF session they’ve scheduled.
The subcommittee will then spend November and December gathering further input and drafting the report. They’ll present a draft at a January 2005 PITAC meeting, with the final draft hopefully approved by the full committee in March 2005.
With the current Administration now certainly in place for the next four years, the subcommittee’s report has the potential to be fairly influential in shaping federal support for computational science over the long term, so it’s definitely worth contributing to the effort.
New E-voting Blog
In: Policy | by Peter Harsha
Computer scientists David Dill, Ed Felten, Joe Hall, Avi Rubin, Barbara Simons, Adam Stubblefield, and Dan Wallach have joined forces at evoting-experts.com to post news and commentary on e-voting issues, just in time for election day. The site has only been up a day or two and already has some good commentary on reports of voting problems in Texas, as well as a bunch of handy links.
If chaos does ensue on Tuesday (and even if it doesn’t), the site looks like it will be a great place to check in and get the scoop with a technical perspective.
NSA Decides Commercial Software Needs Security Help, Will Open Center
In: Research | by Peter Harsha
According to this piece in Federal Computer Week, the National Security Agency plans to create a government-funded research center devoted to "improving the security of commercial software." The effort would include researchers at NSA and NIST, as well as researchers funded by DARPA and the Department of Homeland Security.
From the article:
Read the whole thing here.