Perspectives on Bush Administration and S&T Funding
In: Funding / by Andy Bernat
(from The American Institute of Physics Bulletin of Science Policy News
Number 55: April 28, 2004)
long, but interesting
Two Perspectives: the Bush Administration and S&T Funding
“Science policy entails more than setting budgets, but that is a
major bottom line of the policy process.”- OSTP Director John
Marburger
Last week’s 29th Annual AAAS Forum on Science and Technology Policy
opened with two divergent views of the Bush Administration’s funding
of science and technology. The keynote speaker was John Marburger,
Director of the White House Office of Science and Technology
Policy. He was followed by Senate Minority Leader Tom Daschle
(D-SD). While the speakers agreed on the importance of science and
technology to the nation, they had different perspectives on the
Administration’s funding of science and technology. Selections
from their addresses on funding issues follow. Several other issues
were discussed; the full text of their remarks (with a series of
charts that Marburger referred to in his presentation) can be
accessed here.
In viewing the recent funding of federal science and technology,
the often-repeated phrase is relevant: “The President proposes, and
the Congress disposes.” As Marburger stated, “I do want to
acknowledge that Congress has treated science well in its
appropriations . . . .”
MARBURGER:
“President Bush has made it abundantly clear that his budget
priorities have been to protect the nation, secure the homeland, and
revitalize the economy. His budget proposals to Congress are in line
with vigorous actions in each category. Increases in expenditures
for homeland security, in particular, have dominated changes in the
discretionary budget during this Administration, and we have seen
the emergence of a significant new science and technology agency
within the Department of Homeland Security (DHS). The current budget
proposal for the DHS Science and Technology function is $1.2
billion, with an estimated total of $3.6 billion in homeland
security related R&D in all agencies. The science and engineering
communities exerted a significant influence on the structure of the
new department, particularly through the National Research Council
report ‘Making the Nation Safer.’
“Each of the three overarching Presidential priorities has strong
science and technology components. The President has sought, and
Congress has appropriated, substantial increases in Research and
Development budgets not only for homeland security, but also for
defense and for key areas of science and technology related to long
term economic strength.”
“R&D expenditures in this Administration are up 44% over the past
four years to a record $132 billion proposed for 2005 compared to
$91 billion in FY 2001, and the non-defense share is up 26%. The
President’s FY2005 Federal R&D budget request is the greatest share
of GDP in over 10 years, and its share of the domestic discretionary
budget, at 13.5% is the highest level in 37 years. Non-defense R&D
funding is the highest percentage of GDP since 1982. Total U.S. R&D
expenditures, including the private sector was at 2.65% of GDP in
2002, the most recent year for which I have data. I suspect it is
above that today. Its historical high was 2.87% in 1964 as NASA was
ramping up for the Apollo program.”
“The FY 2005 request commits 5.7% of total discretionary outlays to
non-defense R&D, the third highest level in the past 25 years.
“While the President has proposed to reduce the overall growth in
non-defense, non-homeland security spending to 0.5% this year to
address overall budget pressures, his budget expresses a commitment
to “non-security” science with a considerably higher growth rate at
2.5%.”
“During the current Administration, funding for basic research has
increased 26% to an all-time high of $26.8 billion in the FY 2005
budget request.”
“What Congress will do with the Presidential requests for science . . . is at this point an open question. I do want to acknowledge that
Congress has treated science well in its appropriations, and the
good figures for science during this Administration represent a
strong consensus between the Legislative and Executive branches that
science is important to our nation’s future.
“As I emphasized in 2002, priorities for these large expenditures
respond to two important phenomena that have shaped the course of
society and are affecting the relationship of society to science,
namely the rapid growth of technology, particularly information
technology, as the basis for a global economy, and the emergence of
terrorism as a destabilizing movement of global consequence.”
Later, in a section entitled “Priority Highlights,” Marburger cited
the following:
Health Sciences “Funding during these four years to NIH has
increased more than 40%, to $28.6 billion. In response to this
unprecedented National commitment, NIH as a whole has adopted an
important new roadmap for transforming new knowledge from its
research programs into tangible benefits for society. Emerging
interdisciplinary issues such as nutrition and aging together with
revolutionary capabilities for understanding the molecular origins
of disease, health, and biological function will continue to drive
change within NIH.
National Science Foundation “In four years the NSF budget has
increased 30% over FY 2001 to $5.7 billion. Much of this funding has
gone to enhance the physical sciences and mathematics programs,
where advances often provide the foundation for achievements in
other areas, as well as increases to the social sciences and to the
NSF education programs.
“NASA has increased 13%, largely for exploration science that will
spur new discoveries, enhance technology development, and excite the
next generation of scientists and engineers.”
“DOE Science and technology programs have increased 10%, in such
important areas as basic physical science and advanced computing. As
the agency sponsoring the largest share of physical science, DOE’s
Office of Science is increasingly viewed as a high leverage area for
investment. DOE has engaged in years of intense planning,
culminating recently in a multi-year facilities roadmap that assigns
specific priorities to a spectrum of new projects.
Energy and Environment “This Administration is investing heavily
in technologies for producing and using energy in environmentally
friendly ways, from shorter term demonstration projects for
carbon-free power plants, to the very long term promise of nuclear
fusion for clean, scalable power generation. In the intermediate
term, technologies associated with the use of hydrogen as a medium
for energy transport and storage are receiving a great deal of
attention, not only in the U.S. but internationally. The President’s
Hydrogen Fuel initiative is a $1.2 billion, five-year program aimed
at developing the fuel cell and hydrogen infrastructure technologies
needed to make pollution-free hydrogen fuel cell cars widely
available by 2020.”
DASCHLE:
“Regrettably, rather than strengthening this [government – science]
partnership, I fear that the Bush Administration has allowed it to
erode in two critical ways. First, the Administration is abdicating
its responsibility to provide scientists with the funding
cutting-edge research demands. As you know, the federal government
has seen its R&D investments steadily decline as a share of the U.S.
economy, bringing the federal investment down to levels not seen
since the mid-60s. Public-sector investments in advanced research
have declined sharply, relative to our economic growth rate, and
barely kept pace with inflation. This year, federal funding for
research is set to increase 4.7 percent. However, the entire
increase would go to the Departments of Defense and Homeland
Security for the development of weapons systems and counterterrorism
technology. Make no mistake, these are necessary investments that
will make our nation safer. But the remaining federal R&D budget
that supports research into health, environmental, biological, and
other sciences, will all see funding reduced.
“In my home state of South Dakota, for instance, the Earth Resources
Observation Systems Data Center is facing the possibility of deep cuts in staff
due to cuts to their budget. Their work helps us become more
responsible stewards of the environment, while increasing the yields
of farmers all over the world. And yet, this work is endangered due
to draconian budget cuts.”
“But we should be honest with ourselves. Outside the scientific
community, there is no hue and cry for more government funding of
R&D. There is no widespread public outrage when the Administration
disregards the unequivocal judgment of the scientific community. And
it’s unlikely that the science gap growing between the United States
and other developed nations will become a major issue in the
upcoming Presidential campaign.
“This represents a failure on our part. We have not done enough to
show the American people the connection between the work underway in
your laboratories and the problems that affect their lives. This
must change. The stakes simply could not be higher. What future
challenge will we fail to meet because America’s scientists were not
given the tools they need to discover new answers to old questions?
When rumors of a Nazi bomb program reached President Roosevelt, he
said simply, ‘Whatever the enemy may be planning, American science
will be equal to the challenge.’ Will future presidents be able to
speak with such confidence?”
AP Coverage of University Visa Issues
In: Policy / by Peter Harsha
From AP:
A steep decline in graduate school applications from foreign students has university administrators pushing the federal government to reform the visa process. Their argument: The trend could cost U.S. schools much-needed revenue and research help, and make America seem isolated in the eyes of the world.
Here’s the full story: Universities Lobby for Simpler Visa Process
PITAC Work Finding its Way Into Presidential Campaign
In: Policy / by Peter Harsha
At a stop at a Veterans Affairs hospital in Baltimore yesterday, President Bush apparently drew from the forthcoming report of his IT Advisory Committee (PITAC) when he noted that, in health care, “the 21st-century is using a 19th-century paperwork system.” He’s calling for the digitization of most of America’s medical records within the decade.
He also says he’ll name a “Federal Coordinator For IT,” but I haven’t found any additional details.
Anyway, here’s the article from TheBostonChannel.com. Bush seems to draw from the work of the PITAC Subcommittee on Health IT, which announced its draft recommendations two weeks ago. Here’s a bit more:
The result is that files get misplaced and problems with drug interactions aren’t systematically checked, among other problems.
“These old methods of keeping records are real threats to patients and their safety and are incredibly costly,” he said.
Implementing a system where everyone has their own personal electronic medical record will protect patients, improve care and reduce cost, he said.
Bush acknowledged that patient privacy is a concern and a top priority.
“Computer Freedom and Privacy 2004” and Privacy R&D
In: Policy / by Peter Harsha
I’m back from the 2004 edition of ACM’s Computer Freedom and Privacy Conference, held this year at the Claremont Hotel in Berkeley, California. This is the second time I’ve attended, and I’ve enjoyed it each time. The conference’s focus on the intersection between technology and civil rights brings together a fascinating blend of personalities — from EFF co-founder John Gilmore, to Rachel Brand of the Office of Legal Policy at the Department of Justice, to Bill Scannell of DontSpyOn.us, to Nuala O’Connor Kelly, Chief Privacy Officer of the Department of Homeland Security. The sessions are always lively and thought-provoking.
A few issues seemed to get the most attention at this year’s conference — the perils of “Direct Recording Electronic” (DRE) voting systems, government profiling using TIA-like systems, and civil liberties issues surrounding Google services. Of these, I was particularly frustrated by the government profiling discussions. A number of speakers made the point (though Doug Tygar probably made it most emphatically) that the government spends a disproportionate amount of its IT privacy and security research funding on security rather than privacy. Given the current state of funding for federal cyber security R&D (see previous blog entry), that’s a sobering thought. But the frustrating part for me is that many of the same people at CFP who are now clamoring for more federal R&D for privacy related research were among the loudest voices calling for cancellation of DARPA’s TIA project (I’m not including Tygar in this, as I don’t know where he stood on TIA). Let me explain.
DARPA’s Total Information Awareness (pdf) project was an attempt to “design a prototype network that integrates innovative information technologies for detecting and preempting foreign terrorist activities against Americans.” In order to do this, DARPA was funding research into a range of technologies including real-time translation tools, data mining applications, and “privacy enhancing technologies” including development of a “privacy appliance” that would protect the identities of all individuals within any of the databases being searched until the government had the appropriate court order to reveal them. At CFP, Philippe Golle, from Xerox’s Palo Alto Research Center, described one such project at PARC (led by Teresa Lunt), that DARPA agreed to fund for 3 years as part of TIA. The plan was to create a “privacy appliance” that owners of commercial databases of interest to the government could deploy that would control government access to the databases using inference control (deciding what types of queries — individually or in aggregate — might divulge identifying information), access control and an immutable audit trail to protect individual privacy. Really neat stuff.
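The description above boils down to three mechanisms: inference control over queries, access control, and an immutable audit trail. As a rough illustration only, here is a minimal sketch of how such a gatekeeper might work; the class, field names, and rules below are hypothetical stand-ins for the concepts described, not PARC’s or DARPA’s actual design.

```python
import hashlib
from datetime import datetime, timezone

class PrivacyAppliance:
    """Toy model of the three mechanisms described above (hypothetical, not PARC's design)."""

    def __init__(self, authorized_agencies):
        self.authorized_agencies = set(authorized_agencies)
        self.audit_log = []            # append-only audit trail
        self._last_hash = "0" * 64     # hash chain makes tampering with history detectable

    def _audit(self, entry):
        record = f"{self._last_hash}|{datetime.now(timezone.utc).isoformat()}|{entry}"
        self._last_hash = hashlib.sha256(record.encode()).hexdigest()
        self.audit_log.append((record, self._last_hash))

    def _inference_safe(self, query):
        # Inference control (illustrative rule): identifying fields are only
        # released if the query carries a court order.
        identifying = {"name", "ssn", "address"}
        return not (identifying & set(query["fields"])) or bool(query.get("court_order"))

    def run_query(self, agency, query):
        self._audit(f"request by {agency}: {query}")
        if agency not in self.authorized_agencies:   # access control
            self._audit("denied: unauthorized agency")
            return None
        if not self._inference_safe(query):          # inference control
            self._audit("denied: would divulge identifying information")
            return None
        self._audit("allowed")
        return f"results for fields {query['fields']}"  # stand-in for the real database lookup

appliance = PrivacyAppliance(authorized_agencies={"DHS"})
print(appliance.run_query("DHS", {"fields": ["travel_date", "flight_no"]}))   # allowed
print(appliance.run_query("DHS", {"fields": ["name", "travel_date"]}))        # denied, no court order
```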
Anyway, the idea that the government might one day deploy a TIA-like system before all of the privacy and security challenges had been sorted out and thereby imperil American civil liberties and security was worrying to a great many people and organizations, including CRA. However, there seemed to be a number of different approaches among the various people and organizations to deal with the concerns. There was a vocal contingent that believed Congress should cancel TIA outright — the threat the research posed was greater than any possible good. CFP participant Jim Harper, of Privacilla.org, addressed this approach directly at the conference, saying the reason groups like his try to kill government programs when they’re still in R&D and small is because they’re too hard to kill when they get big.
CRA had a more nuanced view, I believe, that argued that the challenges that needed to be overcome before any TIA-esque system would ever be fit for deployment were large and that CRA would oppose any deployment until concerns about privacy and security were met. However, we also argued that the research required to address those concerns was worthy of continued support — the problems of privacy and security (as well as the challenge of ever making something like TIA actually work) were truly difficult research problems…”DARPA hard” problems — and so we opposed any research moratorium.
Unsurprisingly, the “nuanced” position failed to carry the day once Congress got involved. At about the same time Congress was deciding TIA’s fate, stories broke in the press about DARPA’s FutureMAP project — which attempted to harness the predictive nature of markets to glean information about possible terrorist activities — and JetBlue airline’s release of customer data to the Defense Department (in violation of its privacy policies), stories that helped cement the opinion that DARPA was out of control. It also didn’t help that the TIA program resided in DARPA’s Information Awareness Office, headed by the controversial Adm. John Poindexter. TIA’s fate was sealed. Congress voted to cut all funding for the program and eliminate the IAO at DARPA that housed it.
However, Congress also recognized that some of the technologies under development might have a role to play in the war against terrorism. They included language in the appropriations bill (Sec 8131(a)) that allowed work on the technologies to continue at unspecified intelligence agencies, provided that work was focused on non-US citizens. As a result, much of the research that had been funded by DARPA has been taken up by the Advanced Research and Development Activity (ARDA), the research arm of the intelligence agencies. Because it’s classified, we have no way of knowing how much of TIA has been resurrected under ARDA. We also have no way of overseeing the research, no way of questioning the approach or implementation, no way of questioning the security or privacy protections (if any) included. In short, those who argued in support of a research moratorium just succeeded in driving the research underground.
Finally, one thing we do know about current TIA-related research efforts is that PARC’s work on privacy-enhancing technologies is no longer being funded.
Blogging on the Fly — Intel’s CEO Urges More Basic Research Funding
In: Funding / by Peter Harsha
I’m currently enjoying the Computers, Freedom and Privacy Conference, but thought I’d quickly point out a story in Tech Daily (subscription required), about Intel CEO Craig Barrett’s comments as part of the Task Force on the Future of American Innovation event on the Hill yesterday. Here are some of the choice bits:
Announcing a joint task force of technology companies and academia that favors more government funding of basic research, Intel CEO Craig Barrett on Tuesday criticized U.S. budget priorities and called on policymakers to “make investments in the 21st century.”
While emphasizing that the task force’s mission is not about “physical research versus agricultural subsidies,” Barrett took exception to the $30 billion in annual agriculture subsidies appropriated by Congress. He also said some of the $250 billion in the transportation bill pending before Congress would be better spent on physicists and engineers than roads and bridges.
“It’s a choice between a bridge to an island or a bridge to the future,” Barrett said at a news conference. He said the huge sums appropriated in such bills represent “an investment in the industries of the 19th century.”
…
Noting that funding for basic research has remained flat in inflation-adjusted dollars and has fallen by 37 percent as a share of gross domestic product over the past 30 years, the task force said such a trend will have dire consequences for American economic growth, global stability and prosperity.
They noted that technologies like the World Wide Web, fiber optics and magnetic resonance imaging all originated as basic research projects. Funding of such work at U.S. universities helped launch thousands of spin-off companies, employed hundreds of thousands of workers and generated billions of dollars in sales, the task force concluded.
The semiconductor industry alone had global sales of $166 billion last year, the task force stated.
The San Francisco Chronicle also has the story.
Congress Prepares Computing Research Authorization Bill
In: Policy / by Peter Harsha
The House Science Committee is circulating a draft (pdf) of a bill to amend portions of the High Performance Computing Act of 1991 to address issues about coordination among federal agencies doing IT R&D. This is an important bill for a couple of reasons. First, the original HPC Act — besides being the bill sponsored by then-Senator Al Gore that was the reason behind the infamous “I took the initiative in creating the Internet” line from the 2000 presidential campaign — established the current structure for the now $2.0 billion a year federal investment in IT R&D and has done much to shape the discipline and the enormous amount of innovation that has resulted…innovation that, in turn, has driven the new economy. So any alteration of the bill bears the weight of all of that success.
Second, the new bill is important for the message it sends. At a time when the overall budget for federal IT R&D has been basically flat (some agencies up, others down) for several years (graph) and when the Director of the White House’s Office of Science and Technology Policy has defended the flat budgets by claiming IT is a “mature” field, without the same complexity as the life sciences, it’s important to have Congress note that IT R&D is still vital to the nation for a whole host of reasons and is rich with challenges to solve.
The bill is also important because it attempts to address concerns within the computing community about interagency coordination in the NITRD program generally, and specifically within the high-performance computing community. If there’s one important point to take away from the Atkins Cyberinfrastructure Report, it’s that the cyberinfrastructure revolution is already underway. Agencies are already planning their own cyberinfrastructure strategies — NSF has reorganized and created a division of Shared Cyberinfrastructure, DOE has its own plan, NIH is considering its own health sciences network, NASA is considering its own geosciences network. It’s not clear, however, that there’s a whole lot of coordination going on between the agencies, despite the NITRD interagency coordination plan already in place. While a diversity of funding agencies is probably desirable in this space, coordination between the agencies can help ensure that funding is spent in the most effective ways possible. The Science Committee bill addresses this by requiring that the President’s Information Technology Advisory Committee (PITAC) review the NITRD coordination effort every two years and report to the President and Congress.
The bill also amends the HPC Act to direct the agencies under the Science Committee’s jurisdiction (NSF, NASA, DOE, NIST, NOAA, and EPA) to do a number of things consistent with their missions. You can see the complete list of responsibilities by looking at this section-by-section analysis of the bill (provided by the Committee staff). The Committee also provided this short-form summary of the bill’s intent:
Assuring U.S. Researchers Access to the Most Advanced High-Performance Computing Systems Available: the bill requires the High-Performance Computing Research and Development Program to “provide sustained access by the research community in the United States to high-performance computing systems that are among the most advanced in the world in terms of performance in solving scientific and engineering problems, including provision for technical support for users of such systems.” The bill also specifically requires the National Science Foundation and the Office of Science at the Department of Energy to provide U.S. researchers with access to “world class” high-performance computing systems.
Assuring Balanced Progress on All Aspects of High-Performance Computing: in addition to assuring U.S. leadership in hardware infrastructure, the bill requires the program to support all aspects of high-performance computing for scientific and engineering applications, including software, algorithm and applications development, development of technical standards, development of new computer models for science and engineering problem solving, and education and training in all the disciplines that support advanced computing.
Assuring an Adequate Interagency Planning Process to Maintain Continued U.S. Leadership: the bill requires the Director of the Office of Science and Technology Policy to “develop and maintain a research, development, and deployment roadmap for the provision of high-performance computing systems for use by the research community in the United States.” This and other provisions in the bill are designed to ensure a robust ongoing planning and coordination process so that our national high-performance computing effort is not allowed to lag in the future.
The only negative that I can see with the bill is the lack of authorization funding amounts. The committee doesn’t set out any recommended funding levels for any of the agencies within the bill. This is largely, I think, a political necessity to avoid the difficulties that authorization bills with large funding levels encounter in the legislative process. The previous attempt to authorize NITRD programs (HR 3400, introduced by Research Subcommittee chair Nick Smith (R-MI)) failed to receive floor consideration in the 107th Congress because the House Leadership — already worried about a growing deficit — balked at the funding levels authorized in the bill.
Failing to include the numbers is a disappointment, mainly because having a set of authorized funding levels gives the community a useful target to use in advocating for appropriations from year to year. It’s also important symbolically, as an expression of the strength of support for the programs in Congress. However, not having any funding recommendations will likely help the bill sail through the legislative process.
CRA is interested in your feedback on the bill. Please give it a look and add your comment. The Committee plans to hold a hearing on the bill on April 29. Scheduled to testify are: OSTP Director John Marburger; Bob Bishop, CEO of SGI; Rick Stevens of Argonne National Laboratory; and Dan Reed, CRA Board member, of the University of North Carolina. The bill is on a fast track — the committee plans to mark it up and pass it out the following week.
NPR Discusses Decline in CompSci Enrollments
In: People / by Peter Harsha
More media coverage of CRA’s Taulbee Survey results that show a 23 percent decrease in undergraduate computer science majors in 2003. This time it’s a piece on NPR’s Morning Edition. You’ll need Real Player or Windows Media Player to listen.
Other stories on the subject are linked here.
PITAC Meeting Highlights
In: Policy / by Peter Harsha
The President’s Information Technology Advisory Committee (PITAC) met yesterday in its second public session since being reconstituted last year after nearly two years of inactivity. The two items on the agenda were a report on the draft recommendations (pdf) of PITAC’s subcommittee on Health and IT, and the first taking of public testimony by the subcommittee on cyber security. CRA is well-represented on the Committee. Ed Lazowska, the co-chair, Dan Reed, and Gene Spafford are all current members of CRA’s Board of Directors, and committee member Dave Patterson is a former CRA board member and past Chair.
The cyber security portion of the meeting featured testimony from a number of agency officials that elicited some interesting give and take with the committee. Amit Yoran, Director of the National Cyber Security Division at the Department of Homeland Security, raised some eyebrows with committee members when he suggested that venture capital, not the government, could better fund security research. Lazowska stopped him and pointed out that the private sector generally funds technologies that are, at most, a couple of years out. He noted that it was the federal government’s role to look 5 and 10 years out, and that venture capital plays an important role at the end of that pipeline. The exchange led Yoran to conclude that perhaps the committee, in its review of federal cyber R&D, should recommend DHS fund long-term, strategic investments in cyber security R&D.
This approach would mark a change in the agency’s current focus on short-term — six months or less — almost-ready-for-deployment technologies. But in his testimony later in the session, Simon Szykman, Director of Cyber Security R&D at DHS, insisted the Department will continue to focus on the short-term research — the “low-hanging fruit” — for at least the next couple of years. In the future, he said, he hoped the department might one day include long-range research in up to 20 percent of its overall R&D portfolio. For now, Yoran and Szykman said the department is dependent upon the good work of agencies like NSF and DARPA for long-range research.
This presents a bit of a problem in that NSF and DARPA have their own issues regarding cyber security R&D. For NSF, the problem is primarily financial. NSF’s Carl Landwehr, a program director in CISE, testified that the agency receives far more good proposals in the area than it can fund. The recent $30 million Cyber Trust solicitation generated over 230 “small” proposals, of which the agency can fund about 30; 125 “medium” proposals, of which the agency can fund 6 or 8; and 30 large scale proposals, of which just 1 or 2 might receive funding. PITAC member Tom Leighton questioned whether that approximately 5-10 percent approval rate was typical of NSF programs and how many Landwehr thought would be determined to be good enough to fund after peer-review, if the agency had the funding. Landwehr said the funding rate wasn’t unusual for CISE programs, noting that the ITR program had a similar funding rate (NSF-wide the rate is probably closer to 30 percent), and that he expected that 25 percent of the proposals they received would likely be worthy of funding if NSF had the funds. In other words, NSF could easily fund 2.5 times their current cyber security R&D budget on good proposals if they had the funding.
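For readers who want to check the arithmetic behind that last claim, here is a quick sketch using the award counts Landwehr cited; the midpoints of the “6 or 8” and “1 or 2” ranges are my own reading of his testimony, not official NSF statistics.

```python
# Back-of-the-envelope check of the Cyber Trust figures cited above.
proposals = {"small": 230, "medium": 125, "large": 30}
awards    = {"small": 30,  "medium": 7,   "large": 1.5}   # midpoints of "6 or 8" and "1 or 2"

funding_rate = sum(awards.values()) / sum(proposals.values())   # 38.5 / 385 = ~0.10
worthy_rate  = 0.25                                             # Landwehr's estimate of fund-worthy proposals

print(f"Overall funding rate: {funding_rate:.0%}")                                                  # ~10%
print(f"Implied headroom if all worthy proposals were funded: {worthy_rate / funding_rate:.1f}x")   # ~2.5x
```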
This is a markedly different story than the one told by DARPA Director Tony Tether, who noted during his testimony that he thought DARPA program managers were “idea starved, not money starved” when it came to funding cyber security research. Tether also took considerable flak for the agency’s increased use of classification to limit the dissemination and discussion of its ongoing cyber security research. Tether defended the policy by noting that the Department of Defense is increasingly reliant on networking for its warfighting capability, and that it is therefore in the interest of national security to restrict any research that might expose a vulnerability or reveal a capability. However, since an estimated 85 percent of the DOD’s communications travel across commercial communications networks, this means that much of the research aimed at defending these networks is subject to restriction. The effects of this policy are numerous. For one, this significantly limits the contribution of university-based researchers in the DARPA research community — a community that has, historically, been vital to the advancement of computing (in part due to the inclusion of university researchers). However, this also means that the fruits of this research are unavailable to both the vitally important US commercial sector — which is heavily dependent upon secure networks for trillions of dollars of activity annually — and the other agencies of government, including DHS. Tether acknowledged this problem and suggested that perhaps there ought to be two parallel efforts — an unclassified track, funded by NSF and DHS, and a classified one supported by DARPA and the security agencies.
Funding is also currently a problem at DHS. Szykman testified that the agency will likely have just over $1 billion in R&D funding in FY 05, but told Lazowska under questioning that cyber security R&D will account for just $18 million of that. Szykman didn’t try to defend the funding, other than to suggest that the needs of other directorates within the department dictated the priorities in the Science and Technology directorate, and to suggest that the funding levels are the product of thinking that’s now over 18 months old. Future budgets, he suggested, will include more robust cyber security funding.
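To put that number in perspective, a one-line calculation (treating the “just over $1 billion” figure as a round $1 billion, so the result slightly overstates the share) shows how small a slice of the DHS R&D budget cyber security represents:

```python
dhs_rd_total = 1_000_000_000   # FY 05 DHS R&D, "just over $1 billion" per Szykman's testimony
cyber_rd     = 18_000_000      # FY 05 cyber security R&D
print(f"Cyber security share of DHS R&D: {cyber_rd / dhs_rd_total:.1%}")   # ~1.8%
```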
The plan for the subcommittee on cyber security at this point is to do some further fact-finding and develop a set of draft recommendations in time for the next meeting of PITAC in June. At the same time, the subcommittee on Health and IT will refine the draft recommendations (pdf) it presented at the meeting based on feedback from the committee and produce the first report on the issue. June will also likely mark the start of the third PITAC subcommittee’s work on the current state of scientific computing, headed by Dan Reed.
Stay tuned here for details….
White House Responds to UCS Complaint About US Science Policy
In: Policy / by Peter Harsha
The White House last Friday released its response to a report by the Union of Concerned Scientists that claims the administration is distorting and censoring scientific findings that contradict its policies; manipulating the underlying science to align results with predetermined political decisions; and undermining the independence of science advisory panels by subjecting panel nominees to political litmus tests that have little or no bearing on their expertise; nominating non-experts or underqualified individuals from outside the scientific mainstream or with industry ties; as well as disbanding science advisory committees altogether.
The administration response, authored by the Director of the White House Office of Science and Technology Policy, John Marburger, is a 20-page, point-by-point rebuttal. The gist:
In this Administration, science strongly informs policy. It is important to remember, however, that even when the science is clear (and often it is not), it is but one input into the policy process.
IEEE Gets Favorable OFAC Ruling, Publications Activities No Longer Restricted
In: Policy / by Peter Harsha
Ed Felten’s Freedom to Tinker has a bit on the recent Treasury Department ruling in favor of IEEE regarding U.S. restrictions in place against copy-editing and publishing scientific papers whose authors come from countries under U.S. trade embargoes. IEEE had been working the issue with the Treasury Department’s Office of Foreign Assets Control for about a year before receiving the favorable ruling. IEEE had stopped providing some services to its members in embargoed countries while it sought exemption from the trade rules. The Treasury Department’s decision means IEEE can resume peer-reviewing, editing, and publishing scholarly works from its members in embargoed countries such as Iran and Cuba.
Here’s the press release (pdf, 56kb) from IEEE.