Computing Research Policy Blog

“Computers, Freedom and Privacy 2004” and Privacy R&D

I’m back from the 2004 edition of ACM’s Computers, Freedom and Privacy Conference, held this year at the Claremont Hotel in Berkeley, California. This is the second time I’ve attended, and I’ve enjoyed it each time. The conference’s focus on the intersection between technology and civil rights brings together a fascinating blend of personalities — from EFF founder John Gilmore, to Rachel Brand of the Office of Legal Policy at the Department of Justice, to Bill Scannell, to Nuala O’Connor Kelly, Chief Privacy Officer at the Department of Homeland Security. The sessions are always lively and thought-provoking.
A few issues seemed to get the most attention at this year’s conference — the perils of “Direct Recording Electronic” (DRE) voting systems, government profiling using TIA-like systems, and civil liberties issues surrounding Google services. Of these, I was particularly frustrated by the government profiling discussions. A number of speakers made the point (though Doug Tygar probably made it most emphatically) that the government spends a disproportionate amount of its IT privacy and security research funding on security rather than privacy. Given the current state of funding for federal cyber security R&D (see previous blog entry), that’s a sobering thought. But the frustrating part for me is that many of the same people at CFP who are now clamoring for more federal R&D for privacy-related research were among the loudest voices calling for cancellation of DARPA’s TIA project (I’m not including Tygar in this, as I don’t know where he stood on TIA). Let me explain.
DARPA’s Total Information Awareness (pdf) project was an attempt to “design a prototype network that integrates innovative information technologies for detecting and preempting foreign terrorist activities against Americans.” In order to do this, DARPA was funding research into a range of technologies including real-time translation tools, data mining applications, and “privacy enhancing technologies” including development of a “privacy appliance” that would protect the identities of all individuals within any of the databases being searched until the government had the appropriate court order to reveal them. At CFP, Philippe Golle, from Xerox’s Palo Alto Research Center, described one such project at PARC (led by Teresa Lunt), that DARPA agreed to fund for 3 years as part of TIA. The plan was to create a “privacy appliance” that owners of commercial databases of interest to the government could deploy that would control government access to the databases using inference control (deciding what types of queries — individually or in aggregate — might divulge identifying information), access control and an immutable audit trail to protect individual privacy. Really neat stuff.
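To make the inference-control idea concrete, here is a minimal sketch of how such a gatekeeper might behave. This is my own illustration, not PARC’s actual design; the threshold, the field names, and the API are all invented for the example:

```python
# Illustrative sketch only -- not PARC's actual design. A "privacy appliance"
# sits between the analyst and a commercial database: it rejects queries whose
# result sets are small enough to single out individuals (a crude form of
# inference control), strips identifying fields from released results, and
# records every decision in an append-only audit trail. With the appropriate
# court order (the `authorized` flag here), identities can be unmasked.

K_ANONYMITY_THRESHOLD = 5          # deny results covering fewer than 5 people
IDENTIFYING_FIELDS = {"name", "ssn"}

audit_log = []                     # stands in for an immutable audit trail

def query_appliance(database, predicate, authorized=False):
    """Run predicate over database rows, enforcing inference control."""
    matches = [row for row in database if predicate(row)]
    if len(matches) < K_ANONYMITY_THRESHOLD and not authorized:
        audit_log.append(("DENIED", len(matches)))
        return None                # too few matches: could identify individuals
    audit_log.append(("ALLOWED", len(matches)))
    if authorized:                 # e.g., a court order unmasks identities
        return matches
    # Strip identifying fields before releasing results.
    return [{k: v for k, v in row.items() if k not in IDENTIFYING_FIELDS}
            for row in matches]
```

In this toy version, a broad query (“everyone in Berkeley”) comes back anonymized, a narrow query that would match one person is refused outright, and both decisions land in the audit log. The real research problems — deciding when a *sequence* of individually innocuous queries divulges identity, and making the audit trail genuinely immutable — are exactly the hard parts the PARC work was tackling.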
Anyway, the idea that the government might one day deploy a TIA-like system before all of the privacy and security challenges had been sorted out — and thereby imperil American civil liberties and security — was worrying to a great many people and organizations, including CRA. However, there were a number of different approaches among the various people and organizations for dealing with those concerns. There was a vocal contingent that believed Congress should cancel TIA outright — the threat the research posed was greater than any possible good. CFP participant Jim Harper addressed this approach directly at the conference, saying the reason groups like his try to kill government programs while they’re still small and in R&D is that they’re too hard to kill once they get big.
CRA had a more nuanced view, I believe, that argued that the challenges that needed to be overcome before any TIA-esque system would ever be fit for deployment were large and that CRA would oppose any deployment until concerns about privacy and security were met. However, we also argued that the research required to address those concerns was worthy of continued support — the problems of privacy and security (as well as the challenge of ever making something like TIA actually work) were truly difficult research problems…”DARPA hard” problems — and so we opposed any research moratorium.
Unsurprisingly, the “nuanced” position failed to carry the day once Congress got involved. At about the same time Congress was deciding TIA’s fate, stories broke in the press about DARPA’s FutureMAP project — which attempted to harness the predictive nature of markets to glean information about possible terrorist activities — and JetBlue airline’s release of customer data to the Defense Department (in violation of its privacy policies), which helped cement opinion that DARPA was out of control. It also didn’t help that the TIA program resided in DARPA’s Information Awareness Office, headed by the controversial Adm. John Poindexter. TIA’s fate was sealed. Congress voted to cut all funding for the program and eliminate the IAO, the DARPA office that housed it.
However, Congress also recognized that some of the technologies under development might have a role to play in the war against terrorism. It included language in the appropriations bill (Sec 8131(a)) that allowed work on the technologies to continue at unspecified intelligence agencies, provided that work was focused on non-US citizens. As a result, much of the research that had been funded by DARPA has been taken up by the Advanced Research and Development Activity (ARDA), the research arm of the intelligence agencies. Because it’s classified, we have no way of knowing how much of TIA has been resurrected under ARDA. We also have no way of overseeing the research, no way of questioning the approach or implementation, no way of questioning the security or privacy protections (if any) included. In short, those who argued in support of a research moratorium merely succeeded in driving the research underground.
Finally, one thing we do know about current TIA-related research efforts is that PARC’s work on privacy-enhancing technologies is no longer being funded.

Blogging on the Fly — Intel’s CEO Urges More Basic Research Funding

I’m currently enjoying the Computers, Freedom and Privacy Conference, but thought I’d quickly point out a story in Tech Daily (subscription required), about Intel CEO Craig Barrett’s comments as part of the Task Force on the Future of American Innovation event on the Hill yesterday. Here are some of the choice bits:

As Intel CEO Craig Barrett announced a joint task force of technology companies and academia that favors more government funding of basic research, on Tuesday he criticized U.S. budget priorities and called on policymakers to “make investments in the 21st century.”
While emphasizing that the task force’s mission is not about “physical research versus agricultural subsidies,” Barrett took exception to the $30 billion in annual agriculture subsidies appropriated by Congress. He also said some of the $250 billion in the transportation bill pending before Congress would be better spent on physicists and engineers than roads and bridges.
“It’s a choice between a bridge to an island or a bridge to the future,” Barrett said at a news conference. He said the huge sums appropriated in such bills represent “an investment in the industries of the 19th century.”

Noting that funding for basic research has remained flat in inflation-adjusted dollars and has been cut by 37 percent as a share of gross domestic product over the past 30 years, the task force said such a trend will have dire consequences for American economic growth, global stability and prosperity.
They noted that technologies like the World Wide Web, fiber optics and magnetic resonance imaging all originated as basic research projects. Funding of such work at U.S. universities helped launch thousands of spin-off companies, employed hundreds of thousands of workers and generated billions of dollars in sales, the task force concluded.
The semiconductor industry alone had global sales of $166 billion last year, the task force stated.

The San Francisco Chronicle also has the story.

Congress Prepares Computing Research Authorization Bill

The House Science Committee is circulating a draft (pdf) of a bill to amend portions of the High Performance Computing Act of 1991 to address issues about coordination among federal agencies doing IT R&D. This is an important bill for a couple of reasons. First, the original HPC Act — besides being the bill, sponsored by then-Senator Al Gore, that was behind the infamous “I took the initiative in creating the Internet” line from the 2000 presidential campaign — established the current structure for the now-$2.0 billion a year federal investment in IT R&D and has done much to shape the discipline and the enormous amount of innovation that has resulted…innovation that, in turn, has driven the new economy. So any alteration of the bill bears the weight of all of that success.
Second, the new bill is important for the message it sends. At a time when the overall budget for federal IT R&D has been basically flat (some agencies up, others down) for several years (graph) and when the Director of the White House’s Office of Science and Technology Policy has defended the flat budgets by claiming IT is a “mature” field, without the same complexity as the life sciences, it’s important to have Congress note that IT R&D is still vital to the nation for a whole host of reasons and remains rich with challenges to solve.
The bill is also important because it attempts to address concerns within the computing community about interagency coordination in the NITRD program generally, and specifically within the high-performance computing community. If there’s one important point to take away from the Atkins Cyberinfrastructure Report, it’s that the cyberinfrastructure revolution is already underway. Agencies are already planning their own cyberinfrastructure strategies — NSF has reorganized and created a division of Shared Cyberinfrastructure, DOE has its own plan, NIH is considering its own health sciences network, NASA is considering its own geosciences network. It’s not clear, however, that there’s a whole lot of coordination going on between the agencies, despite the NITRD interagency coordination plan already in place. While a diversity of funding agencies is probably desirable in this space, coordination between the agencies can help ensure that funding is spent in the most effective ways possible. The Science Committee bill addresses this by requiring that the President’s Information Technology Advisory Committee (PITAC) review the NITRD coordination effort every two years and report to the President and Congress.
The bill also amends the HPC Act to direct the agencies under the Science Committee’s jurisdiction (NSF, NASA, DOE, NIST, NOAA, and EPA) to do a number of things consistent with their missions. You can see the complete list of responsibilities by looking at this section-by-section analysis of the bill (provided by the Committee staff). The Committee also provided this short-form summary of the bill’s intent:

Assuring U.S. Researchers Access to the Most Advanced High-Performance Computing Systems Available: the bill requires the High-Performance Computing Research and Development Program to “provide sustained access by the research community in the United States to high-performance computing systems that are among the most advanced in the world in terms of performance in solving scientific and engineering problems, including provision for technical support for users of such systems.” The bill also specifically requires the National Science Foundation and the Office of Science at the Department of Energy to provide U.S. researchers with access to “world class” high-performance computing systems.
Assuring Balanced Progress on All Aspects of High-Performance Computing: in addition to assuring U.S. leadership in hardware infrastructure, the bill requires the program to support all aspects of high-performance computing for scientific and engineering applications, including software, algorithm and applications development, development of technical standards, development of new computer models for science and engineering problem solving, and education and training in all the disciplines that support advanced computing.
Assuring an Adequate Interagency Planning Process to Maintain Continued U.S. Leadership: the bill requires the Director of the Office of Science and Technology Policy to “develop and maintain a research, development, and deployment roadmap for the provision of high-performance computing systems for use by the research community in the United States.” This and other provisions in the bill are designed to ensure a robust ongoing planning and coordination process so that our national high-performance computing effort is not allowed to lag in the future.

The only negative I can see with the bill is the lack of authorized funding levels. The committee doesn’t set out any recommended funding amounts for any of the agencies within the bill. This is largely, I think, a political necessity to avoid the difficulties that authorization bills with large funding levels encounter in the legislative process. The previous attempt to authorize NITRD programs (HR 3400, introduced by Research Subcommittee chair Nick Smith (R-MI)) failed to receive floor consideration in the 107th Congress because the House Leadership — already worried about a growing deficit — balked at the funding levels authorized in the bill.
Failing to include the numbers is a disappointment, mainly because a set of authorized funding levels gives the community a useful target to use in advocating for appropriations from year to year. It’s also important symbolically, as an expression of the strength of support for the programs in Congress. However, not having any funding recommendations will likely help the bill sail through the legislative process.
CRA is interested in your feedback on the bill. Please give it a look and add your comment. The Committee plans to hold a hearing on the bill on April 29. Scheduled to testify are: OSTP Director John Marburger; Bob Bishop, CEO of SGI; Rick Stevens of Argonne National Laboratory; and Dan Reed, CRA board member, of the University of North Carolina. The bill is on a fast track — the committee plans to mark it up and pass it out the following week.

PITAC Meeting Highlights

The President’s Information Technology Advisory Committee (PITAC) met yesterday in their second public session since being reconstituted last year after nearly two years of inactivity. The two items on the agenda were a report on the draft recommendations (pdf) of PITAC’s subcommittee on Health and IT, and the first taking of public testimony by the subcommittee on cyber security. CRA is well-represented on the Committee. Ed Lazowska, the co-chair, Dan Reed, and Gene Spafford are all current members of CRA’s Board of Directors, and committee member Dave Patterson is a former CRA board member and past Chair.
The cyber security portion of the meeting featured testimony from a number of agency officials that elicited some interesting give and take with the committee. Amit Yoran, Director of the National Cyber Security Division at the Department of Homeland Security, raised some eyebrows with committee members when he suggested that venture capital, not the government, could better fund security research. Lazowska stopped him and pointed out that the private sector generally funds technologies that are, at most, a couple of years out. He noted that it was the federal government’s role to look 5 and 10 years out, and that venture capital plays an important role at the end of that pipeline. The exchange led Yoran to conclude that perhaps the committee, in its review of federal cyber R&D, should recommend DHS fund long-term, strategic investments in cyber security R&D.
This approach would mark a change from the agency’s current focus on short-term — six months or less — almost-ready-for-deployment technologies. But in his testimony later in the session, Simon Szykman, Director of Cyber Security R&D at DHS, insisted the Department will continue to focus on short-term research — the “low-hanging fruit” — for at least the next couple of years. In the future, he said, he hoped the department might devote up to 20 percent of its overall R&D portfolio to long-range research. For now, Yoran and Szykman said, the department is dependent upon the good work of agencies like NSF and DARPA for long-range research.
This presents a bit of a problem in that NSF and DARPA have their own issues regarding cyber security R&D. For NSF, the problem is primarily financial. NSF’s Carl Landwehr, a program director in CISE, testified that the agency receives far more good proposals in the area than it can fund. The recent $30 million Cyber Trust solicitation generated over 230 “small” proposals, of which the agency can fund about 30; 125 “medium” proposals, of which the agency can fund 6 or 8; and 30 large-scale proposals, of which just 1 or 2 might receive funding. PITAC member Tom Leighton questioned whether that approximately 5-10 percent approval rate was typical of NSF programs, and asked how many proposals Landwehr thought would prove worthy of funding after peer review if the agency had the money. Landwehr said the funding rate wasn’t unusual for CISE programs, noting that the ITR program had a similar funding rate (NSF-wide the rate is probably closer to 30 percent), and that he expected 25 percent of the proposals received would likely be worthy of funding. In other words, NSF could easily spend 2.5 times its current cyber security R&D budget on good proposals if it had the funding.
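The arithmetic behind that 2.5x figure is easy to check. Using the proposal counts quoted above, and taking midpoints where Landwehr gave a range:

```python
# Proposal counts from NSF's Cyber Trust solicitation, as reported at the meeting.
proposals = {"small": 230, "medium": 125, "large": 30}
# Midpoints of the funded ranges cited (about 30 small, 6-8 medium, 1-2 large).
funded = {"small": 30, "medium": 7, "large": 1.5}

total_proposals = sum(proposals.values())       # 385 proposals received
total_funded = sum(funded.values())             # ~38.5 can be funded
funding_rate = total_funded / total_proposals   # ~10 percent overall

# Landwehr's estimate: 25 percent of proposals were worthy of funding.
worthy = 0.25 * total_proposals                 # ~96 worthy proposals
shortfall_factor = worthy / total_funded        # 2.5
```

A roughly 10 percent funding rate against an estimated 25 percent of proposals worth funding is where the factor of 2.5 comes from.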
This is a markedly different story than the one told by DARPA Director Tony Tether, who noted during his testimony that he thought DARPA program managers were “idea starved, not money starved” when it came to funding cyber security research. Tether also took considerable flak for the agency’s increased use of classification to limit the dissemination and discussion of its ongoing cyber security research. Tether defended the policy by noting that the Department of Defense is increasingly reliant on networking for its warfighting capability, and that it is therefore in the interest of national security to restrict any research that might expose a vulnerability or reveal a capability. However, since an estimated 85 percent of the DOD’s communications travel across commercial communications networks, much of the research aimed at defending those networks is subject to restriction. The effects of this policy are numerous. For one, it significantly limits the contribution of university-based researchers to the DARPA research community — a community that has, historically, been vital to the advancement of computing (in part because of the inclusion of university researchers). It also means that the fruits of this research are unavailable both to the vitally important US commercial sector — which depends on secure networks for trillions of dollars of activity annually — and to the other agencies of government, including DHS. Tether acknowledged this problem and suggested that perhaps there ought to be two parallel efforts — an unclassified track, funded by NSF and DHS, and a classified one supported by DARPA and the security agencies.
Funding is also currently a problem at DHS. Szykman testified that the agency will likely have just over $1 billion in R&D funding in FY 05, but told Lazowska under questioning that cyber security R&D will account for just $18 million of that. Szykman didn’t try to defend the funding level, other than to suggest that the needs of other directorates within the department dictated the priorities in the Science and Technology directorate, and that the funding levels are the product of thinking that’s now over 18 months old. Future budgets, he suggested, will include more robust cyber security funding.
The plan for the subcommittee on cyber security at this point is to do some further fact-finding and develop a set of draft recommendations in time for the next meeting of PITAC in June. At the same time, the subcommittee on Health and IT will refine the draft recommendations (pdf) it presented at the meeting based on feedback from the committee and produce the first report on the issue. June will also likely mark the start of the third PITAC subcommittee’s work, on the current state of scientific computing, headed by Dan Reed.
Stay tuned here for details….

White House Responds to UCS Complaint About US Science Policy

The White House last Friday released its response to a report by the Union of Concerned Scientists that claims

the administration is distorting and censoring scientific findings that contradict its policies; manipulating the underlying science to align results with predetermined political decisions; and undermining the independence of science advisory panels by subjecting panel nominees to political litmus tests that have little or no bearing on their expertise; nominating non-experts or underqualified individuals from outside the scientific mainstream or with industry ties; as well as disbanding science advisory committees altogether.

The administration response, authored by the Director of the White House Office of Science and Technology Policy, John Marburger, is a 20-page, point-by-point rebuttal. The gist:

In this Administration, science strongly informs policy. It is important to remember, however, that even when the science is clear – and often it is not – it is but one input into the policy process.

IEEE Gets Favorable OFAC Ruling, Publications Activities No Longer Restricted

Ed Felten’s Freedom to Tinker has a bit on the recent Treasury Department ruling in favor of IEEE regarding U.S. restrictions in place against copy-editing and publishing scientific papers whose authors come from countries under US trade embargoes. IEEE had been working the issue with the Treasury Department’s Office of Foreign Assets Control for about a year before receiving the favorable ruling. IEEE had stopped providing some services to its members in embargoed countries while it sought exemption from the trade rules. The Treasury Department’s decision means IEEE can resume peer-reviewing, editing, and publishing scholarly works from its members in embargoed countries such as Iran and Cuba.
Here’s the press release (pdf, 56kb) from IEEE.

CRA, USACM Urge Support for NIST Labs

In response to the dire funding situation for the National Institute of Standards and Technology (NIST) Labs program in FY 2004 and beyond, CRA and ACM’s U.S. Public Policy Committee (USACM) have joined in a letter to leaders in Congress calling for increased support in the FY 2005 appropriations process.
Among the labs most likely impacted by the cuts — cuts enacted as part of the FY 2004 Omnibus Appropriations bill passed in January — is NIST’s Computer Security Division, which has played a historic role in computer security by conducting security research on emerging technologies, promoting security assessment techniques, providing security management and guidance, and facilitating a greater awareness of the need for security.
The extended entry below has the full letter, or you can download a PDF version here (335k).


GrepLaw Interview With Spaf

GrepLaw, run by the Berkman Center for Internet and Society at Harvard, has an interview with CRA board member Gene Spafford (“Spaf”), on what it’s like to testify before Congress, the current spate of virus and worm attacks, his favorite operating systems, and his suggested reading list for “geeky legal types who want to become involved in the prevention, investigation, or prosecution of computer-related crimes.”

CRA Analysis of Computing Research in the FY 2005 Budget Request

As part of the American Association for the Advancement of Science (AAAS) annual review of R&D in the President’s Budget Request, CRA provides an analysis of computing research in the request. This is essentially a look at the current status of the Networking and Information Technology Research and Development (NITRD) initiative — the government-wide program that encompasses all federal IT R&D activities. In short, the President’s request would keep things pretty much steady-state: a slight decline in overall funding, made up of slight increases at some agencies and slight declines at others. But the overall funding requested still falls well short of the amount recommended by the President’s Information Technology Advisory Committee (PITAC) in its last comprehensive review of federal IT R&D funding back in 1999.
Here are the highlights from the report:


  • Networking and Information Technology Research and Development (NITRD) funding would fall 0.7 percent in FY 2005 to $2.00 billion across eleven federal agencies, under the President’s budget request.
  • The President’s request would increase funding for computing research at the National Science Foundation (NSF), the lead agency in the NITRD initiative, to $761 million in FY 2005, an increase of 0.9 percent.
  • Concerns about interagency coordination of large-scale “cyberinfrastructure” investments in FY 2005 will likely lead to greater congressional oversight of NITRD programs in 2004.
Read on to get the full scoop…
