Computing Research Policy Blog

NY Times on the DOD’s “War Net”


Tim Weiner has an interesting piece in today’s New York Times about the Defense Department’s efforts to build its own Internet — the Global Information Grid. From the article:

The goal is to give all American commanders and troops a moving picture of all foreign enemies and threats – “a God’s-eye view” of battle.
This “Internet in the sky,” Peter Teets, under secretary of the Air Force, told Congress, would allow “marines in a Humvee, in a faraway land, in the middle of a rainstorm, to open up their laptops, request imagery” from a spy satellite, and “get it downloaded within seconds.”

The total cost of the project is expected to run to $24 billion over the next five years, plus an additional $5 billion for data encryption technologies.
In the piece, Weiner quotes Vint Cerf, who is consulting on the project:

Vint Cerf, one of the fathers of the Internet and a Pentagon consultant on the war net, said he wondered if the military’s dream was realistic. “I want to make sure what we realize is vision and not hallucination,” Mr. Cerf said.
“This is sort of like Star Wars, where the policy was, ‘Let’s go out and build this system,’ and technology lagged far behind,” he said. “There’s nothing wrong with having ambitious goals. You just need to temper them with physics and reality.”

As we’ve noted before, DOD funding policies — especially at DARPA — have likely hamstrung some of the technological progress that will be required to make full use of DOD’s network-centric strategy. University researchers, who played an important role in the development of the ARPANET, are increasingly unable to participate in DARPA-led networking research because much of that work is classified. Additionally, the style of DARPA-sponsored research — short-term rather than long-term, with a milestone-based approach to awarding funding and go/no-go decisions at 12 to 18 month intervals — isn’t well-suited to a university research setting. Because researchers are unwilling to propose work that can’t demonstrate results within 12 to 18 months, what gets proposed tends to be evolutionary, incremental research rather than revolutionary proposals. And it looks like the new network may need some revolutionary proposals to reach its full potential:

To realize this vision, the military must solve a persistent problem. It all boils down to bandwidth.
Bandwidth measures how much data can flow between electronic devices. Too little for civilians means a Web page takes forever to load. Too little for soldiers means the war net will not work.
The bandwidth requirements seem bottomless. The military will need 40 or 50 times what it used at the height of the Iraq war last year, a Rand Corporation study estimates – enough to give front-line soldiers bandwidth equal to downloading three feature-length movies a second.
The Congressional Research Service said the Army, despite plans to spend $20 billion on the problem, may wind up with a tenth of the bandwidth it needs. The Army, in its “lessons learned” report from Iraq, published in May, said “there will probably never be enough resources to establish a complete and functioning network of communications, sensors, and systems everywhere in the world.”
The bottleneck is already great. In Iraq, front-line commanders and troops fight frequent software freezes. “To make net-centric warfare a reality,” said Tony Montemarano, the Defense Information Systems Agency’s bandwidth expansion chief, “we will have to precipitously enhance bandwidth.”
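For a rough sense of the scale that Rand estimate implies, here’s a back-of-envelope calculation. The per-movie file size is my own assumption (the article doesn’t specify one), so treat this as a sketch:

```python
# Rough translation of "three feature-length movies a second" into a data rate.
# Assumption (mine, not the article's): one movie is about 1.5 GB,
# roughly a DVD-quality compressed file.
MOVIE_SIZE_GB = 1.5
MOVIES_PER_SECOND = 3
BITS_PER_GB = 8e9  # 8 billion bits per (decimal) gigabyte

bandwidth_bps = MOVIE_SIZE_GB * MOVIES_PER_SECOND * BITS_PER_GB
print(f"Implied per-soldier bandwidth: {bandwidth_bps / 1e9:.0f} Gbit/s")
# Prints 36 Gbit/s -- orders of magnitude beyond any fielded tactical
# radio link, which helps explain Cerf's skepticism.
```

Even if the assumed movie size is off by a factor of four, the implied requirement still dwarfs anything deployable to a Humvee in a rainstorm.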

Anyway, an interesting piece. Read the whole thing.

VA-HUD Appropriations Update…Not Good


As we’ve reported recently, the House and Senate Appropriations Committees have approved two markedly different versions of the FY 05 VA-HUD-Independent Agencies Appropriations bill that contains funding for NSF and NASA. The House bill, which stuck strictly to House-approved budget caps, cut NSF by 2.0 percent across the board. The Senate bill employed some rule-bending that freed up enough funding to provide NSF with a 3 percent increase (the President’s requested level), including a 4.1 percent increase for CISE. Neither bill made it far enough in the appropriations process to win approval from its full chamber.
It now appears that the VA-HUD bill will get folded into the omnibus appropriations bill expected to be assembled when Congress returns on Nov 16th, but will include numbers far closer to the House levels than the Senate’s.
In response to the original House bill, CRA activated its Computing Research Advocacy Network (CRAN) to urge the Senate to adopt higher numbers for NSF, and for the Computer and Information Science and Engineering (CISE) directorate in particular. CRAN’s effort was reasonably successful: CISE’s increase in the Senate bill was the largest of any of the major directorates.
In response to the latest developments, CRA is once again calling on CRAN to get involved. Members of the appropriations conference committee need to hear from CRAN members — especially those whose representatives sit on the House and Senate Appropriations Committees (and will serve as the conferees) — about the importance of supporting NSF at the highest possible level. And they need to hear before November 16th!
We’ve updated the CRAN Alert page to reflect the new situation and changed our sample letters as well. If you’re a member of CRAN, please contact your Senators and Representative in the House. If you’re not, please join!
We’ll have more details on the effort and the outcome as they emerge.

CSTB Releases Supercomputing Report


Just in time for the Supercomputing ’04 conference, the National Academies Computer Science and Telecommunications Board has released its report on the needs for U.S. supercomputing, Getting Up to Speed: The Future of Supercomputing.
Study co-chairs Susan Graham, UC Berkeley, and Marc Snir, UIUC (and a CRA board member), will present the report here at SC 04 on Friday, November 12, at 8:30 am.
The report concludes

that the demands for supercomputing to strengthen U.S. defense and national security cannot be satisfied with current policies and levels of spending. The federal government should provide stable, long-term funding and support multiple supercomputing hardware and software vendors in order to give scientists and policy-makers better tools to solve problems in areas such as intelligence, nuclear stockpile stewardship, and climate change.

John Markoff of the New York Times has more on the report in a story today. Here’s a snippet:

“Our situation has deteriorated during the past 10 years,” said Susan L. Graham, a computer scientist at the University of California, Berkeley, who was co-chairwoman of the panel.
The authors of the report, which was prepared for the Energy Department, said they were recommending that the federal government spend $140 million annually on new supercomputing technologies. The federal government currently spends about $42 million each year, according to a recent report of the High End Computing Revitalization Task Force, a federal government working group.
“If we don’t start doing something about this now there will be nothing available in 10 years when we really need these systems, ” Ms. Graham said.

DOE and IBM Supercomputer Now the World’s Fastest


IBM’s Blue Gene/L, being built for the National Nuclear Security Administration at Lawrence Livermore National Lab, attained 70.72 teraflops in recent testing, more than twice as fast as the current top machine on the Top500.org supercomputers list. Secretary of Energy Spencer Abraham made the announcement today, noting that in its final form, Blue Gene/L will be about 9 times faster than the current #1 machine, the Japanese Earth Simulator.
From the release:


“High performance computing is the backbone of the nation’s science and technology enterprise,” [Abraham said,] “which is why the Department has made supercomputing a top priority investment. Breakthroughs in applied scientific research are possible with the tremendous processing capabilities provided by extremely scalable computer systems such as BlueGene/L.”

The New Scientist has more on the story here.
We noted Blue Gene/L’s first record-breaking performance figures back in September.
The next version of the official Top500 list, with Blue Gene/L expected at the top, will be released at next week’s Supercomputing 2004 conference.

PITAC Focuses on Computational Science


The President’s Information Technology Advisory Committee met “virtually” today to hear an update on the efforts of the panel’s subcommittee on computational science. Dan Reed, who does just about everything at the University of North Carolina (Chancellor’s Eminent Professor, Vice-Chancellor for IT and CIO, and Director of the Renaissance Computing Institute — not to mention a current CRA board member) chairs the subcommittee and led the discussion of the subcommittee’s efforts. His slides (pdf) provide a pretty good summary of his talk. (Check slide 5 for a pic of Dan — back row, beneath the seal, with the beard.)
The Subcommittee has been tasked with figuring out:

1. How well is the federal government targeting the right research areas in computational science and are current agency priorities appropriate?
2. How well is federal funding for computational science balanced between short- and long-term research, and low- and high-risk research? Which areas of research have the greatest promise?
3. How well is funding balanced between the underlying techniques of computational science vs. applications in the science and engineering domains? Which areas have the greatest promise?
4. How well is computational science training and research integrated into the scientific domains that rely on computational science?
5. How effectively do federal agencies coordinate?
6. How has the federal investment kept up with the changing technology?
7. What barriers hinder realizing the highest potential of computational science?
Dan’s presentation has more detail, but in short, the subcommittee has made some progress toward answering those questions and has already gotten some good input from the community (though it’s still looking for more). It looks like the final report will emphasize how crucial computing has become to the progress of science, as well as to U.S. competitiveness and national security. The subcommittee makes the point that computing has become the third component of scientific discovery, complementing theory and experiment, and that it’s so integral that its limitations constrain scientific discovery.
    Additionally, the subcommittee notes that complex multidisciplinary problems, from public policy through national security to scientific discovery and economic competitiveness, have emerged as new drivers of computational science.
One nugget I found especially interesting from the presentation was an example of both the economic benefit and the health and safety benefit that would arise from more capable modeling enabled by advanced computing. The subcommittee noted that 40 percent of the $10 trillion U.S. economy is affected by climate and weather. As one example, the subcommittee cited the hurricane warnings provided by the National Hurricane Center and the cost of the evacuations that often result. According to the subcommittee, there is $1 million in economic loss for each mile of coastline evacuated. With current models, the U.S. now “over-warns” by a factor of 3, with the average over-warning for a hurricane resulting in 200 miles of unnecessary evacuations — or $200 million in unnecessary loss per event. Improved modeling (better algorithms, better software, more capable hardware, etc.) would improve the accuracy of forecasts, saving lives and resources. As someone tasked with making “the case for IT R&D” to Hill and Administration policymakers, I can tell you that these sorts of examples really resonate.
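The arithmetic behind that $200 million figure is simple enough to check, and it also shows how sensitive the cost is to forecast accuracy. Here’s a quick sketch; the “improved model” scenario at the end is my own hypothetical, not the subcommittee’s:

```python
# The subcommittee's back-of-envelope hurricane numbers, worked through.
COST_PER_MILE = 1e6    # $1M economic loss per mile of coastline evacuated
OVERWARN_MILES = 200   # average unnecessary evacuation per hurricane, in miles

overwarn_cost = COST_PER_MILE * OVERWARN_MILES
print(f"Unnecessary loss per event: ${overwarn_cost / 1e6:.0f} million")  # $200 million

# Hypothetical (mine): if better models cut the over-warning factor from 3x
# to 2x, the excess evacuation -- and its cost -- would shrink by half.
improved_miles = OVERWARN_MILES * (2 - 1) / (3 - 1)  # assumed proportional scaling
improved_cost = COST_PER_MILE * improved_miles
print(f"Under the assumed improvement: ${improved_cost / 1e6:.0f} million per event")
```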
    The presentation has the full scoop, so I encourage you to read it and, even better, provide your input to the subcommittee. Dan’s contact information is in the presentation, or I’d be happy to forward input to the subcommittee as well. Additionally, the subcommittee will hold a “town hall” meeting at next week’s Supercomputing 2004 conference in Pittsburgh. So if you’re headed to the conference, plan on making it to the November 10th BOF session they’ve scheduled.
    The subcommittee will then spend November and December gathering further input and drafting the report. They’ll present a draft at a January 2005 PITAC meeting, with the final draft hopefully approved by the full committee in March 2005.
    With the current Administration now certainly in place for the next four years, the subcommittee’s report has the potential to be fairly influential in shaping federal support for computational science over the long term, so it’s definitely worth contributing to the effort.

New E-voting Blog


Computer scientists David Dill, Ed Felten, Joe Hall, Avi Rubin, Barbara Simons, Adam Stubblefield, and Dan Wallach have joined forces at evoting-experts.com to post news and commentary on e-voting issues (just in time for Election Day). The site has only been up a day or two and already has some good commentary on reports of voting problems in Texas, as well as a bunch of handy links.
    If chaos does ensue on Tuesday (and even if it doesn’t), the site looks like it will be a great place to check in and get the scoop with a technical perspective.

    NSA Decides Commercial Software Needs Security Help, Will Open Center


According to this piece in Federal Computer Week, the National Security Agency plans to create a government-funded research center devoted to “improving the security of commercial software.” The effort would include researchers at NSA and NIST, as well as researchers funded by DARPA and the Department of Homeland Security.
    From the article:

    The quality and trustworthiness of commercial software has become a matter of increasing concern to NSA officials, who are responsible for the security of Defense Department and intelligence software. NSA officials anticipate that many companies on whose software DOD and intelligence users rely will be moving significant portions of their commercial software development overseas within a few years.
    NSA officials cannot force companies to develop software a certain way, Wolf said, “but we would like to get them to a point where they are producing commercial products that meet the needs of our users.” About 95 percent of the agency’s desktop PCs run Microsoft’s Windows operating system, Wolf said.

    Read the whole thing here.

    CRA and USACM Urge Congress to Support NIST Labs


CRA and ACM’s U.S. Public Policy Committee (USACM) today urged members of the House and Senate to adopt Senate-approved funding levels for the NIST Labs as part of the expected negotiation over omnibus appropriations legislation for FY 2005. As we’ve covered previously, NIST finds itself in dire funding straits as a result of decisions made by appropriators to cut $22 million in funding for the Labs in last year’s funding bill.
    Both the House and Senate appropriations committees have completed work on their respective bills, with the Senate bill coming closer to addressing the funding shortfall. The Senate bill would funnel more funding to the NIST Labs than the House version, adding $43 million to the FY 2004 number for a total of $384 million for FY 2005. In contrast, the House version would provide $375 million for FY 2005. Both versions are still well short of the Administration’s request of $423 million.
    CRA and USACM joined in writing to members of the House and Senate expected to be involved in the negotiations over the FY 05 Omnibus:


    October 29, 2004
    Dear Conferee:
    As representatives of two leading computing societies representing more than two hundred computing research institutions and over 85,000 computing professionals, we write to express our immense concern over the current funding level for the National Institute of Standards and Technology (NIST) Laboratory Program, and to urge you to support the program at the more appropriate level approved by the Senate in the Commerce, State, Justice and Judiciary Appropriations bill or higher.
The NIST Labs have played an important role in the continuing progress of computing research that has, in turn, enabled the “new” economy. Advances in information technology have driven significant improvements in product design, development, and distribution for American industry, provided instant communications for people worldwide, and led to new scientific disciplines like bioinformatics and nanotechnology that show great promise in improving a wide range of health and communications technologies.
    Within NIST’s Labs, the Computer Security Division (CSD) has played a crucial role in computer security by conducting research on security issues concerning emerging technologies, by promoting security assessment techniques, by providing security management guidance, and by generating greater awareness of the need for security. In particular, the CSD has demonstrated its ability to meld science and technology with commerce by working with industry and the cryptographic community to develop an Advanced Encryption Standard (AES). The CSD’s work on AES and its numerous other contributions have assisted the U.S. government, information technology industry, research enterprise, and the overall security of the Internet.
    Current work underway at the NIST labs will have profound effects on the nation’s cybersecurity, as many Federal agencies rely on NIST’s expertise and recommendations. Other areas where NIST’s work is crucial to the nation include electronic voting technologies and standards, as well as research into semiconductor manufacturing and nanotechnology that hold the promise for significant advancements in computing.
    Unfortunately, this work and NIST’s efforts to recruit talented researchers are in jeopardy as a result of the inadequate funding levels enacted as part of the FY 2004 appropriations process. To avoid jeopardizing NIST’s ability to produce materials trusted by the community, impairing its ability to conduct research, and detracting from some of its vital standards-oriented work, we urge you to make this funding a priority for FY 2005.
    As a neutral third party, NIST provides an invaluable setting for industry, academia, and government to work together on crucial technical issues. As a result, NIST and its work have tremendous credibility. The underfunding of NIST will adversely affect this credibility as well as NIST’s ability to function, and will have serious long-term consequences.
    The Computing Research Association (CRA) and the U.S. Public Policy Committee of the Association for Computing Machinery (USACM) stand ready to assist you as you address this important issue. We appreciate your continued support for research and development funding and would be pleased to answer any questions you or your staff might have.
    Sincerely,
    James D. Foley, CRA Chair
    Eugene Spafford, USACM Chair

    Previous CRA/USACM joint letter here.

    CSTB Calls for E-Voting White Papers


    Herb Lin sends word that the Computer Science and Telecommunications Board (CSTB) is seeking comments and white papers “relevant to the use of electronic voting systems.” Serious comments and white papers need to be received by November 22, 2004.

    CALL FOR INPUT — NRC Project on Electronic Voting
    A large number of American voters will be using electronic voting systems for the first time in the 2004 election.  Many issues and concerns have been raised about their use.  Recognizing this, the National Research Council (NRC) of the National Academies (which include the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine) has launched a project to develop a framework for understanding the technology, social, and operational issues relevant to decision-making about electronic voting.  Specifically, this project is intended to inform policy makers, election officials, and the interested public about the questions they should be asking about electronic voting systems in order to be better and more informed consumers of such systems.
    For purposes of this study, “electronic voting systems” are construed broadly to include any electronic device that plays (or that could play) an important role in any part of the voting system, from voter registration to ballot casting to the final certification of election results.
    To facilitate its study, the NRC’s Committee on Electronic Voting seeks white papers and other comments from interested parties on topics relevant to the use of electronic voting systems.  The papers received will serve to inform the committee, and will be posted on the project web site for the duration of the project. 
    At the discretion of the NRC, selected authors may be invited to address the committee about their analysis, findings, and conclusions, and some papers may be included as appendixes to the Committee’s final report.  The Committee may also reference the submitted materials, and include them or portions of them in its report.
    Individuals not wishing to submit a paper are invited to send any comments they wish on this subject to EVOTING-INPUT@NAS.EDU.  Serious comments will be transmitted to the committee.

    Full details after the jump…


    Rare Presidential Campaign Post: Presidential Candidates Address Tech Policy


CompTIA, the Computing Technology Industry Association, recently put a series of technology policy questions to both Sen. John Kerry and President George W. Bush, both of whom, to their credit, actually answered the questions.
I don’t want to descend too far into the mire of dissecting the answers, but I wanted to highlight two that are particularly relevant to CRA’s interests. The first is “What should the federal government do to address the issue of cyber security?”:

    Bush response:
Given the enormous importance of e-commerce, Internet-based communications, and the use of cyberspace to control portions of our physical infrastructure, cyber security is critical. The investments being made today in securing our Nation’s cyber infrastructure and in cyber security R&D are working to ensure that future generations of network software and hardware are less vulnerable to an attack and can maintain critical operations even when compromised.
    I announced the National Strategy to Secure Cyberspace in February 2003. This plan, which complements the National Strategy for the Physical Protection of Critical Infrastructure and Key Assets, depends on both public and private efforts to secure the many elements that comprise the national information infrastructure, including routers, switches, fiber-optic cables, and tens of millions of interconnected computers. The strategy provides five national cyber security priorities: a national security response system; vulnerability reduction program; an awareness and training program; a government cyberspace security program; and national security and international cyberspace security cooperation.
    Kerry response:
    In particular, worms and viruses are causing economic losses of billions of dollars a year. Experts have argued that future worms could allow attackers to rapidly control millions of Internet-connected computers. They could then use those computers to launch “denial of service attacks,” or steal and corrupt large quantities of sensitive information. Moreover, these worms could reach most vulnerable targets in an hour or less. We need a president who is actively supportive of developing technologies that will automatically detect and respond to these kinds of attacks.
    We need a president who will devote the energy of the White House to making our networks – our 21st century infrastructure – stronger and more secure. That means supporting a cyber security intelligence system ready to detect these threats. I will implement global standards and best practices so that weak links are strengthened. And we need a real partnership between the public and private sectors. Most of the infrastructure we need to protect doesn’t belong to government – and neither government nor business can fix these problems alone.

The second is: “How can the federal government better encourage investment in both basic and applied research and development?”

    Bush response:
    America’s economy leads the world because our system of private enterprise rewards innovation. Entrepreneurs, scientists, and skilled workers create and apply the technologies that are changing our world. I believe that government must work to help create a new generation of American innovation and an atmosphere where innovation thrives. That is why it is crucial that we make the R&D tax credit permanent to spur private sector innovation.
    Science has always been an important priority in my Administration. My 2005 budget provides a record $132 billion for Federal R&D funding – a 44% increase over 2001 levels. I have committed 13.5% of total discretionary spending to R&D, which is a level of investment not seen since the height of the Apollo Space program in 1968. Basic research is supported with $26.8 billion – a 26% increase from 2001.
I completed the doubling of the budget for the National Institutes of Health (NIH) and increased the National Science Foundation’s (NSF) budget by 30%. Since 2001, funding for nanotechnology R&D doubled to $1 billion and funding for information technology R&D is up to $2 billion. My Hydrogen Fuel Initiative provides $228 million for hydrogen energy research in 2005 alone – more than triple what it was in 2001. And contrary to the myth propagated by my opponent, I am the first president to provide Federal funding for human embryonic stem cell research. Since 2001, my Administration has provided $35.5 million for stem cell research, and in 2003, the NIH funded $190 million in adult stem cell research.
    Kerry response:
Federal support for long-term research that is beyond the time horizons of individual companies has played a critical role in creating high-tech products, services, and industries. This is particularly true for basic research at our nation’s universities, where we have the dual benefit of research and advanced training of our future scientists and engineers. The contribution of government-funded university research, however, is often critical for igniting the process of innovation. I want America to be the world leader in innovation and discovery, and I am committed to increasing the federal government’s investment in research and innovation.
    Among other things, I will boost support for the physical sciences and engineering by increasing research investments in agencies such as the National Science Foundation, the National Institutes of Health, the Department of Energy, the National Institute of Standards and Technology, and the National Aeronautics and Space Administration. This funding will help with the broad areas of science and technology that will provide the foundations for economic growth and prosperity in the 21st century, including advancements in nanotechnology, advanced manufacturing, IT, life sciences, clean energy and industrial biotechnology.

I’ll give both candidates credit for voicing support for increased funding at NSF (in the latter answer) and for cyber security R&D (in the former). If I’m going to quibble, I’d question the intensity of the President’s support for cyber security R&D by citing the total amount the Department of Homeland Security’s Science and Technology directorate will spend on cyber security R&D in FY 2004 — $18 million out of a total S&T budget of about $1 billion — as well as concerns we’ve raised in the past about the current state of cyber security R&D. But it’s easier to quibble with the President because he’s the only one who has actually had to implement his priorities. Sen. Kerry’s answers are cast in the right direction, I think, but lack enough specificity to really know how they’d fare through the budget process.
When it comes down to it, funding for science — especially fundamental research — tends to be a fairly bipartisan endeavor. Just to illustrate, I whipped up this little graph showing how IT research and development funding has fared through the various administrations:
[Chart: Federal IT R&D funding in constant dollars across presidential administrations, annotated with possibly relevant events]
I put together the chart from NSF data, using the OMB FY 2004 deflators to convert to constant dollars. The years indicated are fiscal years, not calendar years, and the administrations are placed on the timeline so they cover the budgets for which they were responsible. For example, Reagan entered office in January 1981, but his first budget (released in February 1981) was for the 1982 fiscal year. Also, the events placed on the chart are just ones that occurred to me as possibly relevant while I was plotting it out; they’re by no means exhaustive. I’d be really interested to hear feedback (harsha [at] cra.org) about other events readers might consider relevant.
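For anyone curious about the mechanics of the constant-dollar conversion: you divide each year’s nominal figure by a deflator indexed to the base year (FY 2004 here). A minimal sketch — the deflator and funding values below are made-up placeholders, not the actual NSF/OMB data behind the chart:

```python
# Convert nominal (current-dollar) funding to constant FY 2004 dollars
# using a deflator series indexed so that FY 2004 = 1.000.
# All numbers here are illustrative placeholders, not the real data.

deflators = {1999: 0.902, 2000: 0.921, 2001: 0.942,
             2002: 0.957, 2003: 0.975, 2004: 1.000}  # hypothetical index

nominal_funding = {1999: 1.30, 2000: 1.45, 2001: 1.70,
                   2002: 1.85, 2003: 1.95}  # $ billions, hypothetical

for fy, dollars in nominal_funding.items():
    constant = dollars / deflators[fy]  # inflate to FY 2004 dollars
    print(f"FY {fy}: ${dollars:.2f}B nominal = ${constant:.2f}B in FY 2004 dollars")
```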
The numbers beneath the names of the presidents on the graph represent the percentage increase in funding for IT R&D over each president’s term. The graph only goes out to FY 2003, but President Bush’s numbers don’t improve much for FY 2004 — about 3.0 percent, not much higher than the rate of inflation. It’s hard to know whether a President Kerry would be able to manage anything different given the current budget constraints: ongoing costs for the war on terror, increased pressure to constrain domestic spending to address the deficit, resistance to increased taxes, and an appropriations process that continues to pit science funding head-to-head with funding for veterans and federal housing programs.
