Computing Research Policy Blog

DOD Technology and Privacy Advisory Committee Releases Final Report


Just a quick note to link to the final report (3.4 MB PDF) of the Technology and Privacy Advisory Committee (TAPAC), Safeguarding Privacy in the Fight Against Terrorism. The committee was chartered by the Secretary of Defense in the wake of the Terrorism Information Awareness controversy to “ensure the application of [TIA] or any like technology developed within DOD is carried out in accordance with U.S. law and American values related to privacy.” I’ll have more on the contents of the report in a future blog entry. Today’s Washington Post has an opinion piece from Heather Mac Donald of the Manhattan Institute taking particular issue with TAPAC’s recommendation that DOD seek Foreign Intelligence Surveillance Court authorization before revealing any personally identifiable information known to, or reasonably likely to, concern US persons. I haven’t read enough of the report yet to know whether this recommendation really means what Mac Donald says it does.
Of course, as I’ve noted before, the irony of the attack on TIA is that research on privacy-protecting technologies — the kinds of technologies that might allow DOD to do a significant amount of data mining without revealing personally identifiable data — is no longer being funded as a result of the TIA shutdown, while many of the other areas of TIA-related research are apparently moving forward “in the black” (as classified research in other agencies).
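To make the idea of privacy-protecting data mining a little more concrete, here’s a minimal sketch — entirely my own illustration, not any actual DOD or TIA system, and the key-custodian arrangement is hypothetical — of one such technique: replacing personally identifiable fields with keyed-hash pseudonyms, so records can still be linked and mined for patterns while raw identities remain recoverable only by whoever holds the key.

```python
import hashlib
import hmac

# Hypothetical: the key is held by a trusted custodian (a court or a
# privacy officer, say), not by the analysts doing the mining.
SECRET_KEY = b"held-by-a-trusted-custodian"

def pseudonymize(identifier: str) -> str:
    """Map a PII field (a name, SSN, etc.) to a stable pseudonym."""
    digest = hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

# Toy records: analysts see only pseudonyms, but the same person still
# correlates across records, so pattern analysis works without
# exposing raw identities.
records = [
    {"name": "Alice Example", "flight": "IAD->LHR"},
    {"name": "Alice Example", "flight": "LHR->ISB"},
    {"name": "Bob Example", "flight": "JFK->CDG"},
]

for record in records:
    print(pseudonymize(record["name"]), record["flight"])
```

It’s exactly this kind of work — how far can the analysis go before anyone needs the key? — that lost its funding in the shutdown.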
Anyway, I’ll share my thoughts on the report in another post in the near future….

Regulating Gmail


As a Gmail account holder (peter.harsha), I’ve got mixed feelings about news that the California State Senate has approved Sen. Liz Figueroa’s (D) bill placing restrictions on Google’s web-based e-mail service in order to prevent Google, Figueroa says, from “secretly oogling private e-mails.” While I’m happy on the one hand that government appears to be getting the message that privacy is an important issue — one maybe not so well understood by most consumers — I’m a bit nervous about the California legislature intervening.
I was especially nervous about Figueroa’s original bill, which would have “forbid Google from secretly scanning the actual content of e-mails for the purpose of placing targeted direct marketing ads” and required the company to “obtain the informed consent of every individual whose e-mails would be ‘oogled’.” By “every individual” Figueroa meant not only the Gmail account holder, but any person who e-mailed a Gmail account holder, or (presumably) anyone whose original e-mail message may have been forwarded by a third party to a Gmail account holder. I was primarily nervous because it seemed to me that the hurdle this restriction posed would effectively kill Gmail, and I was kind of intrigued by the service (despite Ed Felten’s objections). 🙂
Though the bill passed by the CA Senate (SB 1822) appears to have been amended heavily — gone is the outright prohibition against scanning e-mail without consent for marketing purposes, replaced with language noting the many legitimate uses of e-mail scanning (spam filters, translation into audio for the blind, automatic sorting and forwarding, blocking image ads and web bugs, stripping HTML for handhelds) — it still notes that

In the context of electronic mail and instant messaging communications where electronic mail is scanned for purposes other than those [exceptions listed above], full and informed consent or notification of parties to the electronic mail communication is both appropriate and necessary.

The bill also places restrictions on how, even if granted consent, Google can make use of the e-mail scanning: it can only use automated scanning to serve contemporaneous ads — which I believe was Google’s plan all along. But it also means Google can’t keep, for any purpose, any information or “user characteristics” it gleans from my e-mail — even if that purpose might provide me some great benefit (I don’t know what exactly…great deals on products I’d like? pointers to information I might find useful?).
Don’t get me wrong, I realize that there are plenty of nefarious things Google might be able to do with a monstrous database full of user data. But there might also be plenty of good things it could do — things I might even want them to do — in the future. This bill, it seems to me, would ensure Google won’t have an opportunity to innovate at all in that area. What I worry about with this CA Senate action is the same thing I was worried about in the Total (Terrorism) Information Awareness debate and the ongoing P2P filesharing debate: locking down technologies because some uses might be illegitimate can kill areas of legitimate research and innovation (or send them underground). I really worry that the legislative hammer is just too blunt an instrument for tinkering with these technologies. Rather than artificially constraining the technologies because there’s a hypothetical chance they might be used for something nefarious, maybe the effort would be better focused on stopping those who are actually doing nefarious things.
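For what it’s worth, the “contemporaneous ads” model is simple enough to sketch in a few lines. Here’s a toy illustration — the keywords and ad copy are invented, and this is emphatically not Google’s actual system — of scanning a message at display time, matching it against an ad inventory, and retaining nothing afterward:

```python
# Toy ad inventory: keywords and ad copy invented for illustration.
AD_INVENTORY = {
    "camera": "Ad: Spring sale on digital cameras",
    "flight": "Ad: Compare airfares instantly",
    "resume": "Ad: Professional resume review",
}

def ads_for(message: str) -> list:
    """Return ads matching the message text. Nothing is stored,
    so each call is a fresh, stateless scan."""
    words = {word.strip(".,!?").lower() for word in message.split()}
    return [ad for keyword, ad in AD_INVENTORY.items() if keyword in words]

print(ads_for("I booked a flight and bought a new camera."))
# ['Ad: Spring sale on digital cameras', 'Ad: Compare airfares instantly']
```

The policy question in the bill is precisely the gap between a stateless matcher like this and one that logs what it learns about me — a one-line change in code, but a big difference for privacy.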
Update: The San Jose Mercury News makes the same point about stifling innovation in an editorial.
Update 2: Gene Spafford sends an interesting e-mail with his perspective:

I think the best way to look at any of these issues is through the lens of the Fair Information Principles. They have been refined over the years, and enacted into the laws of countries around the world (including Canada). They also are consistent with standard ethics as practiced in a number of fields.
One of the standard ideas is that of informed consent. Information should be given only with consent, and then only after the uses of the information have been fully disclosed. Gmail doesn’t do that — if I send email to your gmail account, I have not been fully informed nor have I given consent. The California law restores that. You are correct that the law probably goes too far.
I think the TIA issue is addressed the same way. If you apply the fair information principles, then it was an unethical use of personal information.

Agency Funding Only Going to Get Worse?


Federal agencies, including NSF, will face funding cuts in FY 2006, according to budget planning guidance for agencies from the White House Office of Management and Budget, the Washington Post reports today.
It’s important to point out that this is one of the very first steps in the budget process. The agencies will craft their budgets over the next 4-6 months keeping the OMB guidance in mind, then submit them to OMB for final approval before they become part of the President’s Budget Request in February 2005. And then Congress will take its crack at them during the 2005 legislative year. The numbers can and will change significantly before they’re finalized. However, the lower the number given to the agencies at the start of the process, the harder it is to raise it through the remainder of the process — so this guidance doesn’t bode well for some science agencies in FY 06.
From the story:

The funding levels referred to in the memo would be a tiny slice out of the federal budget — $2.3 billion, or 0.56 percent, out of the $412.7 billion requested for fiscal 2005 for domestic programs and homeland security that is subject to Congress’s annual discretion.
But the cuts are politically sensitive, targeting popular programs that Bush has been touting on the campaign trail. The Education Department; a nutrition program for women, infants and children; Head Start; and homeownership, job-training, medical research and science programs all face cuts in 2006.

The administration has widely touted a $1.7 billion increase in discretionary funding for the Education Department in its 2005 budget, but the 2006 guidance would pare that back by $1.5 billion. The Department of Veterans Affairs is scheduled to get a $519 million spending increase in 2005, to $29.7 billion, and a $910 million cut in 2006 that would bring its budget below the 2004 level.
Also slated for cuts are the Environmental Protection Agency, the National Science Foundation, the Small Business Administration, the Transportation Department, the Social Security Administration, the Interior Department and the Army Corps of Engineers.

Given OMB’s guidance, it’s easy to see why NSF’s Arden Bement was less than enthusiastic about future funding levels for his agency. The memo also apparently includes a proposed cut of 2.1 percent to the National Institutes of Health….

NSF Funding Outlook Grim, But Cyberinfrastructure’s a Priority, Says NSF Director


On Tuesday, National Science Foundation Director Arden Bement met with the Coalition for National Science Funding (of which CRA is a member) and warned the science community CNSF represents to lower its expectations of increased funding for the agency in the near term, saying the expectation of budget-doubling, as authorized by Congress and the President in 2002, “isn’t reasonable.”
“The NSF budget is correlated with the other non-defense research agencies,” Bement said, “and those are not scheduled to grow very much [in the President’s future budgets].” The Administration’s priorities are very clear, Bement said: secure the homeland, fight the war on terror, and reduce the budget deficit.
In light of the tough times ahead, Bement said the 3 percent increase in NSF funding requested by the President (to $5.745 billion for FY05) should be seen as a symbolic show of support for the agency, especially as other agencies are seeing their budgets stay flat or decline relative to inflation.
Given the relatively bleak outlook, Bement said the agency would pay special attention to three areas in the FY05 and FY06 budgets: improving productivity of researchers by increasing average grant size and duration; strengthening the national research infrastructure by investing in cyberinfrastructure research and development; and strengthening NSF management.
It was encouraging to hear Bement talk specifically of cyberinfrastructure in his remarks, especially as there was (and is) some concern in the computing community, after the departure of Rita Colwell from NSF, that the new NSF regime just might not “get” cyberinfrastructure. Bement, I think, is saying the right things, noting that research today is increasingly complex and “network-oriented” and that a crucial part of the enterprise relies on cyberinfrastructure. For FY05, Bement said, NSF would spend roughly $400 million on cyberinfrastructure-related R&D foundation-wide, and that funding would go to research beyond just “boxes and network” — research into algorithm development, the architecture of the net, software, etc.
The other two priority areas — increasing grant size and duration, and strengthening NSF management — are not particularly new challenges for the agency. Bement says he hopes to see average grant size grow beyond $142,000 (the FY05 level) to $160,000, and hopes that increasing grant duration will lead to proposals that take on more risk. He also noted that he’s growing concerned that the proposal success rate is dropping significantly, citing the increased number of proposals the agency receives (over 40,000 last year, compared to an average of 22,000-33,000 previously) and theorizing that other agencies’ declining budgets and changes in funding regimes (see here and here for more) may be to blame for the increased pressure on NSF.
Bement also indicated he’s seeking $20 million in FY05 for more staff at NSF. “We need more program officers,” he said, noting that proposals are now more complex than ever before, the peer-review process is more complex to administer, and there are a whole lot more proposals to consider. “While the opportunities have never been greater,” he said, “the number of claimants has never been larger.”
It’s not clear how long Bement will remain in the driver’s seat at NSF. His appointment lasts only until the end of the year, so his job status is probably heavily dependent upon what happens in the elections in November. The grapevine, which had been chattering pretty loudly in March about a possible candidate for the permanent job (well, the six-year term), seems to have quieted down considerably. It seems as though Bement will remain in charge at least through November.

Kalil on Google and America’s Innovation Policy


Tom Kalil has a nice column explaining the importance of federal support for fundamental research in the creation of Google (and making the case that current US policy is hurting the environment that allows companies like Google to spawn and grow). The Google story is just one of the more recent examples of long-term, government-supported fundamental research helping develop and grow billion-dollar industries and markets. It’s a story that has been repeated often in information technology. The National Academies’ Computer Science and Telecommunications Board even put together this somewhat hard-to-read graphic showing 19 different IT-related technologies that, with government support, each grew into billion-dollar industries. (Note to self: redesign the CSTB chart to make it clearer to read.)
Kalil’s article notes some warning signs — we’re not producing enough students with science and engineering degrees, we’re relying too much on foreign students to fill the gap while tighter visa restrictions are affecting the supply, and the US share of publications in top science journals is declining — but he doesn’t delve into some of the specific causes, other than to note that in the President’s most recent budget “science funding in 21 of 24 science agencies would be cut over the next five years…including NSF, NIH, and DOE Office of Science.”
I’d add that I think the problems go beyond raw funding levels. We’re approaching the funding of fundamental research differently than in years past, especially in IT R&D, and especially at the Department of Defense. DOD and DARPA have always been crucially important to the development and advancement of computer science, and university researchers, in turn, have been crucially important to DOD and DARPA. However, changes in the way DARPA does business — from its moves to classify most of its computer security research, to its recent shift to a ‘milestone’-based approach to funded research, where programs are evaluated on a 12- to 18-month cycle with ‘go/no go’ decisions at each step — have had the effect of discouraging university researchers from participating in DARPA-sponsored research. This is significant for a couple of reasons. First, it means some of the brightest minds in the country won’t or can’t work on DARPA’s important research problems. Second, it means university researchers have a hard time participating in maybe the most important aspect of DARPA-sponsored research: the community-building around particular problems.
Computing research (and the country as a whole, I’d argue) has been well-served historically by having two significant, diverse sources of funding in NSF and DARPA. NSF continues to be primarily a place for the single investigator — modest grants for small numbers of individual researchers. DARPA’s real strength historically, however, was different. DARPA program managers could identify a particular problem, then bring together and nurture communities of really smart people devoted to working on it. It was a very successful approach — DARPA is credited with between a third and a half of all the major innovations in computer science and technology (according to Michael Dertouzos). Between them, the NSF and DARPA models have led to everything from graphical user interfaces to the Internet to, well, Google.
So it concerns me that DARPA is discouraging (intentionally or unintentionally) university-based researchers from participating in its programs…maybe even more than the declining share of basic research in the DOD science and technology portfolio concerns me. And I think Kalil is right to be concerned with what we may reap in the future as a result of these policies today.

Administration Releases “Federal Plan for High End Computing” Report


Coinciding with yesterday’s House Science Committee hearing on HPC (see entry below), the White House released the latest report of the High-End Computing Revitalization Task Force (HECRTF), spelling out the Administration’s “forward looking plan” for high-end computing with three components:

  • an interagency R&D roadmap for high-end computing core technologies;
  • a federal high-end computing capacity and accessibility improvement plan; and,
  • recommendations relating to federal procurement of high-end computing systems.

The report is available as a pdf from the National Coordination Office for IT R&D.
In theory, this report will help shape the FY 2006 agency budgets, which are already being prepared. It’s hot off the presses, so I haven’t gotten all the way through it yet, but I’d be interested in your thoughts.

Highlights from the House Science Committee HPC Hearing


In what could fairly be described as a “love-in,” Thursday’s House Science Committee hearing on HR 4218, the High Performance Computing Revitalization Act of 2004 (HPCRA), featured witnesses from the Administration, industry, universities, and federal labs, all singing the praises of the committee’s bill to amend the 1991 High Performance Computing and Communications Act. The committee’s bill, discussed in a previous blog entry, attempts to address concerns within the computing community about interagency coordination in the government-wide Networking and Information Technology Research and Development (NITRD) program generally, and within the high-performance computing community specifically. In essence, the bill tries to do three things:

  • Make sure US researchers have access to the best machines available;
  • Make sure research moves forward on a broad range of architectures, software, applications, algorithms, etc.; and,
  • Ensure the interagency planning process really works.

Without exception, the four witnesses called to testify before the committee expressed strong support for the bill. While not going so far as to say the interagency planning process was broken, White House Office of Science and Technology Policy Director John Marburger agreed the bill would help strengthen interagency coordination in high-end computing and offered the Administration’s support for the bill.
Administration support will “grease the wheels” of the legislative process a bit for this particular bill, though it’s by no means an easy path to passage. From talking to various committee staff, it appears the biggest hurdle for the bill is actually on the Senate side. Senator Pete Domenici (R-NM), Chair of the Senate Committee on Energy and Natural Resources, needs to be convinced that the HPCRA doesn’t contain provisions that should be in his Energy bill (S 2095) — otherwise, his reluctance to move anything through his committee (to which HPCRA would no doubt be referred) that looks like a piece of the Energy bill will stop the HPCRA in its tracks. On the House side, the path forward for the bill looks relatively clear. The Science Committee plans a “markup” on the bill in early June, and time for consideration on the House floor is already tentatively scheduled in July. Elements of the House Leadership are apparently very interested in making the bill part of an “improving national competitiveness” theme this summer.


“Bad Laws, Bad Code, Bad Behavior”


Declan McCullagh has an interesting piece at CNET News.com that describes some of the difficulties Congress has trying to regulate technologies it doesn’t really understand. In their efforts to regulate things like peer-to-peer clients, spyware, and chat clients, members of Congress often cast their nets way too broadly, drafting bills that would affect far broader swaths of the Internet than they perhaps anticipated. Most of this, McCullagh argues, is because members lack the expertise required to understand the implications of their legislation for technology. It’s a quick read, and I think it does a good job of demonstrating how important it is for groups like CRA, ACM, IEEE-CS, etc., to continue to offer to serve as resources for members confronting technology issues.
