Computing Research Policy Blog

Business Week Notes DMCA, Induce Act’s Chill on Innovation


Heather Green has a great piece in this week’s issue of Business Week on the chilling effect of copyright legislation on research. Here’s a snippet:

Scientists like to probe the unknown and pioneer useful technologies. But in the spring of 2001, Edward W. Felten discovered that such efforts aren’t always welcome. A computer scientist at Princeton University, Felten took part in a contest sponsored by the Recording Industry Association of America to test technology for guarding music against piracy. He and his students quickly found flaws in the new antipiracy software and prepared to publish their results. But when the RIAA learned of the plan, it threatened to sue under the Digital Millennium Copyright Act (DMCA). Congress passed the law in 1998 to block hackers from breaking copy protection, but it included a provision designed to let researchers such as Felten carry out their work. Still, the RIAA deemed Felten’s line of study too sensitive.
Ultimately, faced with Felten’s countersuit, the RIAA backed off. But by that time news of the confrontation had rocked the tech community. The lesson many scientists drew was that copyright protection takes priority over research. “The legal tools that are being used to rein in bad behavior are so blunt that they block a lot of perfectly benign behavior,” Felten says. “That worries me.”
It’s a concern that reverberates broadly in tech circles at a time when Congress is considering tough new antipiracy legislation. Most people agree that the music and film industries have the right to defend themselves against illegal copying. But society needs to consider the potential impact on innovation. Many high-tech business leaders fear that new laws could hobble researchers who are trying to come up with inventions such as next-generation TV systems or even the electronic components for those inventions.

It’s a good read. Check out the whole thing. Felten has some additional commentary here, too.

No Compromise Reached on INDUCE, But It’s Still Moving


Thanks to David Padgham (and USACM’s spiffy new blog) for pointing out this Wired story with the latest on sputtering talks to reach a compromise on the Induce Act.
It appears the tech community and the entertainment industry are still far apart on consensus language for the bill — originally designed to create a new form of secondary liability for copyright infringement that would hold technology makers and service providers liable for copyright violations by end users, even if they never knew of, contemplated, or intended to facilitate the infringement. Nevertheless, the Senate Judiciary Committee is still scheduled to consider the legislation at markup this morning.
We’ve covered this bill previously, but we’ll have more details as they emerge.
Update: Postponed again.
Another Update: Ernest Miller says it’s dead (for now) and has some additional commentary and links….

Senate Poised to Enable Terror Data Mining


Wired reports that the Senate could enable, as part of its National Intelligence Reform Act, work on a system “that would let government counter-terrorist investigators instantly query a massive system of interconnected commercial and government databases that hold billions of records on Americans.”

The proposed network is based on the Markle Foundation Task Force’s December 2003 report, which envisioned a system that would allow FBI and CIA agents, as well as police officers and some companies, to quickly search intelligence, criminal and commercial databases. The proposal is so radical, the bill allocates $50 million just to fund the system’s specifications and privacy policies.

In contrast to the PR battle surrounding a similar previous effort — DARPA’s Terrorism Information Awareness project — privacy and civil liberties protections are being touted prominently in advance. CMU Distinguished Professor of Computer Science Dave Farber, a member of the Markle Task Force, has posted an open letter (which he authored, along with Esther Dyson and Tara Lemmey) on his influential Interesting People e-mail list endorsing the proposed system provided the recommendations of the Task Force were implemented (“as looks likely”).

During the course of the debate in Congress over the implementation of the 9/11 Commission recommendations, valid questions have been raised over civil liberty concerns and role of such an information sharing network. We grappled with these same questions as we worked through our recommendations for the Task Force. We also learned important lessons from the problems of other efforts like the Total Information Awareness program (TIA) and MATRIX, both of which have raised serious privacy concerns. We eventually determined that you can achieve a balance between security and privacy if you ensure that strong guidelines, transparency, accountability and oversight are built into the network from the start.
In addition to the approach of building policy into the design of the network, the Task Force also designed the network not as a centralized database, but as a set of pointers and directories that allow only authorized users to gain access to information. The system also calls for regular and robust internal audits of how information is collected, stored, and used. Privacy technologies such as anonymization, permission controls, and audit trails are built into the design of the network to prevent abuse. In addition, the Task Force also calls for a phased implementation to allow for appropriate public comment and a strong civil liberties board to oversee the system and ensure that privacy and civil liberties are protected.
The SHARE network capability, if implemented properly, would give us the ability to overcome the systematic barriers to information sharing that so seriously constrained our intelligence agencies prior to the 9/11 terrorist attacks, and that unfortunately still exist today. It would also provide us with the best opportunity not only to balance security and privacy, but to enhance them both as well.
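
The pointer-and-directory idea is easier to see in miniature. Here is a purely illustrative Python sketch (the names and data structures are hypothetical; no SHARE specification existed at this writing) of a directory that stores only pointers to agency-held records, enforces permission controls, and logs every query to an audit trail:

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Pointer:
    source_db: str   # the agency database that actually holds the record
    record_id: str   # an opaque ID; the directory never stores the record itself

@dataclass
class Directory:
    index: dict = field(default_factory=dict)        # search term -> list of Pointers
    permissions: dict = field(default_factory=dict)  # user -> set of databases they may see
    audit_log: list = field(default_factory=list)

    def query(self, user, term):
        allowed = self.permissions.get(user, set())
        hits = [p for p in self.index.get(term, []) if p.source_db in allowed]
        # Every lookup is logged, hit or miss, so an oversight board can
        # audit how the system is actually being used.
        self.audit_log.append({"user": user, "term": term,
                               "results": len(hits),
                               "time": datetime.now(timezone.utc).isoformat()})
        return hits

d = Directory()
d.index["j. smith"] = [Pointer("fbi_case_db", "rec-001"), Pointer("dmv_ny", "rec-002")]
d.permissions["agent_a"] = {"fbi_case_db"}   # this user is not cleared for DMV data
print(d.query("agent_a", "j. smith"))        # returns only the FBI pointer

The point of such a design is that the directory never copies records out of the source databases; it hands back pointers, so the originating agency keeps control of its data and every access leaves an entry for auditors to review.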

CRA has argued in the past for the need to move forward with this sort of research and has faulted Congress for taking a heavy-handed approach in prohibiting similar work. Perhaps this new approach will allow some real progress in developing technologies valuable in the war on terror, while also enabling the critical research needed to ensure that privacy and civil liberties concerns are addressed.

Financial Times Notes US Emphasis on Supercomputing


Thanks to Tom Jones for pointing out this story in the Financial Times on the increasing attention paid to supercomputing in the wake of the Japanese Earth Simulator’s two-year reign at the top of the Top500.
Here’s a bit:

Hard drive by lobbyists helps US take supercomputer lead
By Simon London
An almost audible sigh of relief arose from Washington last week as Blue Gene/L, a computer built by International Business Machines, claimed the title of the world’s fastest supercomputer. Science and technology policymakers have spent the past two years fretting that the US was losing its lead in high-performance computing, with potentially serious implications for national competitiveness.
“We believe that to out-compete, we must out-compute,” said Deborah Wince-Smith, president of the Council on Competitiveness, one of many lobby groups pressing federal agencies to spend more on supercomputer research.
The lobbying campaign was sparked by the success of the Earth Simulator, a supercomputer built to model climate change by NEC, the Japanese electronics group. When full details of the Earth Simulator’s performance emerged in early 2002 it was clear that Japan had not only overtaken the US in terms of raw computing speed but done so by a metaphorical mile.

You can read the whole thing here.
And here’s our coverage of Blue Gene/L’s rise to the top.

DHS Cyber Security Chief Abruptly Resigns


Thanks to Rodney Petersen of Educause for pointing this out:

U.S. cybersecurity chief abruptly resigns, cites frustration
By TED BRIDIS, AP Technology Writer
WASHINGTON (AP) The government’s cybersecurity chief has abruptly resigned after one year with the Department of Homeland Security, confiding to industry colleagues his frustration over what he considers a lack of attention paid to computer security issues within the agency.
Amit Yoran, a former software executive from Symantec Corp., informed the White House about his plans to quit as director of the National Cyber Security Division and made his resignation effective at the end of Thursday, giving just a single day’s notice of his intention to leave.
Yoran said Friday he “felt the timing was right to pursue other opportunities.” It was not immediately clear who might succeed him, even temporarily. Yoran’s deputy is Donald “Andy” Purdy, a former senior adviser to the White House on cybersecurity issues.
Yoran has privately described frustrations in recent months to colleagues in the technology industry, according to lobbyists who recounted these conversations on condition they not be identified because the talks were personal.

We’ve been harping on DHS and the Administration for not taking cyber security — especially cyber security R&D — seriously enough, but this still comes as a surprise.
More details as we figure them out.
Update: Rodney Petersen has more at the Educause blog on the suddenness of Yoran’s departure and its implications for Higher Ed.

ACM Adopts Policy on E-Voting


From the ACM [a CRA affiliate organization] press release:


New York, September 27, 2004 — Seeking to bolster security, accessibility, and public confidence in the voting process, ACM’s elected leadership has approved a public statement on the deployment and use of computer-based electronic voting (e-voting) systems for public elections. ACM’s position is that while computer-based e-voting systems have the potential to improve the electoral process, such systems must embody careful engineering, strong safeguards, and rigorous testing in both their design and operation.
“The use of computer-based systems to improve voting is a continuing process that will demand the ongoing involvement of technical experts, usability professionals, voting rights advocates, and dedicated election officials in the U.S. and other countries,” said ACM President David Patterson. “As a leading voice in computing matters, ACM looks forward to working with all stakeholders in ensuring the integrity, security, and usability of systems used in public elections.”
Experts from the computing community have identified a variety of risks and vulnerabilities in many e-voting systems stemming from poor design, inferior software engineering processes, mediocre protective measures, and a lack of comprehensive testing. As a result, ACM has recommended that e-voting systems enable voters to inspect a physical (e.g., paper) record to verify the accuracy of their vote, and to serve as an independent check on the record produced and stored by the system. In addition, those records should be made permanent, not based solely in computer memory, to allow for an accurate recount.
ACM past president Barbara Simons, a prominent figure in the computing community on e-voting issues, agrees. “It is crucial that any computerized voting system provide a voter-verified audit trail that can be checked for accuracy by the voter when the vote is cast, and cannot be altered after the vote is cast,” she said.
The ACM statement on e-voting reflects the values in its long-held Code of Ethics and Professional Conduct. The Code states that computing professionals have a responsibility to share technical knowledge and expertise with the public by encouraging understanding of computing, including the impacts of computer systems and their limitations.
Prior to approving the statement, ACM engaged its membership, bringing the issue to their attention and soliciting their feedback in an online poll to gauge their support for the statement. Of the nearly 4,600 members from around the world who shared their opinions, 95 percent indicated their agreement with the statement. ACM continues to strengthen its position, visibility, and participation in government policy formulation by educating and informing policymakers on key issues in computing and information technology.

And here’s the actual recommendation:

ACM Statement on Voting Systems
Virtually all voting systems in use today (punch-cards, lever machines, hand counted paper ballots, etc.) are subject to fraud and error, including electronic voting systems, which are not without their own risks and vulnerabilities. In particular, many electronic voting systems have been evaluated by independent, generally-recognized experts and have been found to be poorly designed; developed using inferior software engineering processes; designed without (or with very limited) external audit capabilities; intended for operation without obvious protective measures; and deployed without rigorous, scientifically-designed testing.
To protect the accuracy and impartiality of the electoral process, ACM recommends that all voting systems — particularly computer-based electronic voting systems — embody careful engineering, strong safeguards, and rigorous testing in both their design and operation. In addition, voting systems should enable each voter to inspect a physical (e.g., paper) record to verify that his or her vote has been accurately cast and to serve as an independent check on the result produced and stored by the system. Making those records permanent (i.e., not based solely in computer memory) provides a means by which an accurate recount may be conducted. Ensuring the reliability, security, and verifiability of public elections is fundamental to a stable democracy. Convenience and speed of vote counting are no substitute for accuracy of results and trust in the process by the electorate.
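
As a toy illustration of the independent-check property the statement describes (a sketch of the concept only, not any real voting system’s design), consider how voter-verified paper records let a recount validate, or contradict, the electronic tally:

from collections import Counter

class ToyVotingMachine:
    def __init__(self):
        self.electronic_tally = Counter()   # the result stored in computer memory
        self.paper_records = []             # permanent records each voter inspected

    def cast_vote(self, choice, voter_approved):
        # The vote counts only if the voter verified the printed record.
        if not voter_approved:
            return False
        self.electronic_tally[choice] += 1
        self.paper_records.append(choice)   # survives even if memory is lost or altered
        return True

    def recount_matches(self):
        # A recount tallies the paper independently and compares it with
        # what the machine's memory claims.
        return Counter(self.paper_records) == self.electronic_tally

Because the paper tally comes from records the voters themselves verified, rather than from the software, a recount mismatch flags a bug or tampering in the electronic record; that is the independent check that computer memory alone cannot provide.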

US Back on Top in Supercomputing


According to the Washington Post, IBM will announce today that its Blue Gene/L supercomputer has moved ahead of Japan’s Earth Simulator in speed, posting a working speed of 36.01 teraflops (versus the Earth Simulator’s 35.86 teraflops).
As we’ve covered a few times here on the blog, the spectre of having the Japanese in the top position of the Top 500 Supercomputing Sites list has focused a lot of attention in Congress and the Administration on the current state of high-end computing in the US. Some of the hand-wringing was a little overblown, I think. But the real positive to come out of all that attention is a growth in understanding amongst policymakers of how crucial a robust computing research community is to national and economic competitiveness. Between the Administration’s efforts behind the “High End Computing Revitalization Task Force” and the Congressional efforts behind the DOE Supercomputing Authorization, policymakers have heard repeatedly from members of the research community (in computing, and in all the other disciplines HEC enables and amplifies) and members of industry about the importance of a sustained commitment to HEC research and development. Let’s hope that they’re still receptive now that the US is back in the top slot.

INDUCE Could Find Its Way Into Spending Bill


Reuters reports that Senate backers of the INDUCE Act — including Senate Majority Leader Bill Frist and Minority Leader Tom Daschle — may attempt to attach the bill to a must-pass spending bill in an effort to secure passage for the controversial (and ill-conceived) legislation.
The Senate Judiciary Committee could take up the bill on Thursday, according to the Reuters report.
We’ve covered the bill here previously. The bill attempts to create a new form of secondary liability for copyright infringement that would hold technology makers and service providers liable for copyright violations by end users, even if they never knew of, contemplated, or intended to facilitate the infringement.
The U.S. Public Policy Committee of ACM has been keeping a close eye on the legislation and has more information.
Update: Much better coverage from Wired here.
Another Update: It now appears that the consensus within the technology community is strongly against the current version of the Induce Act. As Jason Schultz from EFF explains, the Business Software Alliance (BSA) (Microsoft, Apple, HP, IBM, Intel, etc.), the Computer Systems Policy Project (Dell, HP, IBM, Intel, NCR, Motorola, etc.), and the Information Technology Industry Council (Accenture, Canon, Cisco, Kodak, Oracle, Sun, etc.) have all weighed in against the measure. EFF has links and analysis.
One More Update (Thursday, Sept 30, 2004): The Senate markup of the bill scheduled for today has been postponed.

House Leadership Wants Cyber Security Back In White House?


The AP reports today (via USA Today) that the House Republican leadership will propose moving the cyber security offices of the Department of Homeland Security back to the White House as part of the House version of the intelligence reorganization. According to the article, the change reflects “frustration among some Republican lawmakers about what they view as a lack of attention paid to cybersecurity by the Department of Homeland Security (DHS).”
CRA has certainly shared that frustration, especially with DHS’s inadequate funding for cyber security research and development efforts at the agency. As we’ve noted before, cyber security gets a very small share of a $1 billion science and technology budget at DHS — $18 million in FY 04 (and even that is double the amount the administration initially proposed). However, it’s not clear to me — having only seen the proposal summarized in news reports — that this new effort would have any effect on the current level of support for cyber security R&D, or that it would address any of the concerns we’ve raised (pdf) about cyber R&D efforts at other agencies.
Judging from the responses of the industry folks cited in the article, it doesn’t sound like many people were consulted before this got put on the fast track.
More details as we figure them out….
