CRA Comments on DOD Export Regulations

The Department of Defense has proposed a change to its regulations that would strengthen requirements for the control of export-controlled technologies for DOD research contractors, including universities. The proposed rule would require contractors to have an export control program that includes regular audits and training, segregated workplaces for export-controlled technologies, and “unique badging requirements for foreign nationals and foreign persons.”
While we at CRA understand and support the need for export control and deemed export regulation, we’re concerned that the proposed rule as it stands would subject fundamental research to novel restrictions that could seriously impair the ability of colleges, universities, and industrial and federal research labs to conduct that research, with significant ramifications for America’s economic competitiveness and technological leadership in the world.
This concern stems in large part from the fact that the proposed rule doesn’t reference the fundamental research exemption, as found in a Reagan-era National Security Decision Directive still in effect (NSDD-189). Though it appears that the authors of the proposed rule didn’t intend to add new burdens for universities — rather, they intended the change to clarify existing regulations — the worry is that without an explicit mention of NSDD-189, DOD agencies might incorrectly interpret the compliance requirements to require access controls in all instances, even when fundamental research is being performed. There’s also a concern that the new rule would prompt DOD program managers and contract officers to include overly restrictive language in DOD contracts in order to protect themselves from any potential liability or culpability. As the Association of American Universities points out (pdf), this likely would exacerbate the already significant problems that universities have experienced with troublesome clauses in contracts from industry.
So, as we did when the Department of Commerce announced they were considering a similar rule change (and still haven’t decided), CRA filed comments (pdf) with DOD, along with more than 100 other respondents. USACM also filed comments regarding the proposed rule, which you can find — along with a good blog post from David Padgham highlighting some of the other comments received — over at the USACM Technology Policy Blog.
We’ll keep track of the rule-making process as it moves forward at both Commerce and DOD in the coming weeks….
Two News Pieces: PCAST and Cyber Security

A quick pointer to two interesting not-directly-related pieces running today. First is Aliya Sternstein’s article in Federal Computer Week that fleshes out the PITAC to PCAST switch we noted back on September 30th. She quotes CRA Chair Dan Reed and ITAA president Harris Miller:
Former PITAC member Dan Reed, vice chancellor of IT and chief information officer at the University of North Carolina at Chapel Hill, applauded [PCAST co-Chair Floyd] Kvamme’s idea to examine the federal government’s commitment to IT R&D.
“IT pervades so many aspects of science, technology and education that examining it in a holistic context has great value,” he said.
“PCAST is really the pre-eminent scientific advisory group to the president,” Reed said. “In some ways, this elevates the IT issues to a higher level.”
Some industry observers displayed mixed emotions about the turn of events, saying they will hold their breath until PCAST’s new lineup materializes and follows through on its promises.
“Having PITAC become part of PCAST is better than nothing, but frankly, I don’t think it’s an adequate solution,” said Harris Miller, president of the IT Association of America, which represents high-tech companies.
Although PCAST is more prestigious and well-regarded by the administration, the members already have too much on their plates, he said, adding that they likely cannot handle PCAST’s huge program plus all the items that the PITAC docket would add.
(There’s a brief comment from me in there as well.)
The other interesting piece is by ZDNet News’ Declan McCullagh and Anne Broache. It’s titled “U.S. cybersecurity due for FEMA-like calamity?” and it covers the lack of adequate attention the Department of Homeland Security has paid to cyber threats to critical infrastructures.
Auditors had warned months before Hurricane Katrina that FEMA’s internal procedures for handling people and equipment dispatched to disasters were lacking. In an unsettling parallel, government auditors have been saying that Homeland Security has failed to live up to its cybersecurity responsibilities and may be “unprepared” for emergencies.
“When you look at the events of Katrina, you kind of have to ask yourself the question, ‘Are we ready?'” said Paul Kurtz, president of the Cyber Security Industry Alliance, a public policy and advocacy group. “Are we ready for a large-scale cyberdisruption or attack? I believe the answer is clearly no.”
The article also features a nice quote from CRA government affairs committee co-Chair Ed Lazowska that sums up the concerns about the agency’s research efforts:
But the right tools and funding have to be in place, too, said Ed Lazowska, a computer science professor at the University of Washington. He co-chaired the president’s Information Technology Advisory Committee, which published a report in February that was critical of federal cybersecurity efforts.
“DHS has an appropriately large focus on weapons of mass destruction but an inappropriately small focus on critical infrastructure protection, and particularly on cybersecurity,” Lazowska said in an e-mail interview.
The department is currently spending roughly $17 million of its $1.3 billion science-and-technology budget on cybersecurity, he said. His committee report calls for a $90 million increase in National Science Foundation funding for cybersecurity research and development.
Until then, Lazowska said, “the nation is applying Band-Aids, rather than developing the inherently more secure information technology that our nation requires.”

Both are worth a read!
Gingrich and Markoff at CSTB

Former House Speaker Newt Gingrich joined yesterday’s meeting of the National Academies’ Computer Science and Telecommunications Board, ostensibly to talk about health care and IT — though he probably only spent a couple of minutes total on the topic. Instead, the board and those of us in the audience got Gingrich’s take on what’s wrong with America’s innovation ecosystem and his plan for addressing it. The presentation was very interesting — Gingrich is a remarkable extemporaneous speaker, even in front of an audience that I suspect was not full of Gingrich “fans.” I jotted down some brief notes as he outlined his recommendations, and I reprint them here just because I thought it was a nicely structured approach. According to Gingrich, we need to:
1) Dramatically, radically overhaul math and science education by:
- paying students in grades 7-12 a wage equivalent to what they’d make at McDonald’s if they earn “B’s” or better
- eliminating regulation that prevents those with subject expertise from teaching that subject in schools (retired scientists and engineers, for example)

2) Triple the size of NSF
- The Administration’s budget priorities are wrong. Congress is wrong. He says his biggest mistake as Speaker was not tripling NSF when Congress doubled NIH.

3) Establish a national library of science similar to PubMed
- especially needed for adults looking to further their education

4) Dramatically deregulate our markets (presumably telecom)
- we need to have the highest capital investment in new technologies of any country in the world

5) Have “a vision of a dynamic successful future” in order to recruit the next generation of scientists and engineers
- the President has the right instinct with moon/mars, but the wrong program
- there’s no coherent vision now of a scientifically exciting future
While he says it’s important to have a positive vision of the future for attracting future scientists and engineers, policymakers need to be motivated by the negatives. The current budget situation is a total mess, he said, but messes can be great opportunities. Increasing federal support for fundamental R&D is a really large change and “really large change is a long-wave process.” CEOs need to say to policymakers “here is what you have to do” and then communicate the downside:
We will lose without investment in NSF – “Do you want US to be the new Europe?”
The US is in a dominating position, but that position is not permanent. “We are temporarily and briefly the most powerful country in the world.”
Unfortunately, making the case is like the challenge of convincing relatively healthy people they should eat healthy and exercise. They don’t see a pressing need, even though the change would help them live longer, healthier lives. The US can “decay elegantly forever.” The challenge is to reverse that.
I thought it was a very interesting talk.
John Markoff, tech reporter for the NY Times (we’ve covered a few of his stories, including this really important one, here in this space) also participated in the meeting, running through his history of the rise of the personal computer, as told in his book What the Dormouse Said. Markoff also talked a bit about his frustration with what’s happening with tech coverage in journalism and at the Times — a move to cover much more of the business side of technology, with less emphasis on the exciting stories about the science — but he understood the pressures facing publishers given the absolutely grim financial situations newspapers find themselves in at the moment. We’ve seen this in the advocacy community: the one “case” for the need to support fundamental research that seems to get the most traction both in the press and among policymakers at the moment is the “innovation” case — that is, the linkage between fundamental research performed by the nation’s universities and federal labs and innovation in U.S. industries. I suppose that’s not surprising. But it would be nice, I think — especially if one of our goals is inspiring the next generation of scientists and engineers — to see more stories covering the excitement of the path to discovery, the quest for knowledge….
Anyway, on the whole, I thought it was a very enjoyable morning at the National Academies.
Lazowska on Cyber Security and the Failure of the Administration

Ben Worthen has a great interview with former President’s IT Advisory Committee co-Chair Ed Lazowska in CIO Magazine in which Lazowska, freed from his role as presidential advisor after the President allowed PITAC’s charter to expire, pulls no punches describing the failure of the Administration to adequately support and prioritize cyber security research and development. Here’s a snippet:
[Lazowska:] Long-range R&D has always been the role of the national government. And the trend, despite repeated denials from the White House to the Department of Defense, has been decreased funding for R&D. And of the R&D that does get funded, more and more of it is on the development side as opposed to longer-range research, which is where the big payoffs are in the long term. That’s a more fundamental problem that CIOs aren’t responsible for.

[Worthen:] You feel strongly that the government’s treatment of cybersecurity R&D has been particularly neglectful.
[Lazowska:] PITAC found that the government is currently failing to fulfill this responsibility. (The word failing was edited out of our report, but it was the committee’s finding.) Let me talk very quickly about three federal agencies that you might think are focusing on this but are not:
» Most egregiously, the Department of Homeland Security simply doesn’t get cybersecurity. DHS has a science and technology (S&T) budget of more than a billion dollars annually. Of this, [only] $18 million is devoted to cybersecurity. For FY06, DHS’s S&T budget is slated to go up by more than $200 million, but the allocation to cybersecurity will decrease to $17 million! It’s also worth noting that across DHS’s entire S&T budget, only about 10 percent is allocated to anything that might reasonably be called “research” rather than “deployment.”
» Defense Advanced Research Projects Agency (DARPA) is investing in cybersecurity, but has classified all of its recent new program starts in this field. It’s fine to do classified research, but we must also recognize the negative consequences, and we should (but don’t) fund nonclassified research to make up for it. One negative consequence is that classified research is very slow to impact commercial IT systems, on which the entire nation, and even much of the Department of Defense, relies. Another negative consequence is that the nation’s university-based researchers cannot participate, because universities do not perform classified research. This eliminates many of the nation’s best cybersecurity researchers. It also means that students are not trained in cybersecurity; the training of students is an important byproduct of research.

There’s also a great sidebar: Blame the Internet

And the main editorial for the issue: Who Owns Security?

All worth the read.
PCAST to Assume PITAC’s Role

President Bush ordered today that the President’s Council of Advisors on Science and Technology shall now serve as the President’s Information Technology Advisory Committee (PITAC), answering the question of what would become of PITAC after the President allowed that committee’s charter to expire last June. I’m not sure how the new responsibilities will be handled by PCAST — presumably the committee will be expanded somewhat to handle the load, but we’ll see.
I’m of two minds about the move. On the one hand, the membership of PCAST is top-notch. Having advisors of that stature become interested and invested in some of the issues of great concern to the IT community (like the overall level of federal support and the changing landscape for computing research) would add even more weight to our position. But I’m worried that the committee, which has a much broader charter than PITAC’s narrow focus on IT issues, won’t be able to examine the issues with the same depth that an independent IT advisory committee may have.
Anyway, we’ll keep a close eye on developments and report them here.

Update: (five minutes after I posted the above) The National Coordinating Office for IT is calling this an elevation of the role of external information technology advice in the White House. Here’s the OSTP press release (pdf).
The release points out that PCAST also serves as the National Nanotechnology Advisory Panel and that the committee established a “technical advisory group” of “about 50 top government and private sector nanotechnology scientists” that has proved “highly beneficial” to PCAST’s NNI assessments. They plan to do something similar for IT.
As more details are revealed, I’m thinking the positives outweigh the negatives. …
FY 06 Appropriations Update: A biggie

On September 15th, the Senate approved the FY 2006 Commerce, Science, Justice appropriations bill (its version of the House’s Science, State, Justice, Commerce bill, which the House passed back in June), setting funding levels for a number of science agencies for the coming year. As we noted back in June, the Senate indicated it was going to be less generous than the House for some key science agencies, and that indication held true. With the Senate’s action, we now have a substantial piece of the puzzle that is the annual appropriations process for science, so it seems like an appropriate time to summarize where we stand. After the agency-by-agency summaries, see the “Outlook” section for the reasons why things will probably get even worse.
Latest Agency Funding Levels

National Science Foundation (in millions of dollars)

| | FY05 | FY06 Budget Request | FY06 House | FY06 Senate | House vs. FY05 (%) | Senate vs. FY05 (%) |
|---|---|---|---|---|---|---|
| Research and Related Activities | 4,220.6 | 4,333.5 | 4,310 | 4,345.2 | 2.1% | 3.0% |
| MREFC | 173.7 | 250 | 193.4 | 193.4 | 11.3% | 11.3% |
| Education and Human Resources | 841.4 | 737 | 807 | 747 | -4.1% | -11.2% |
| Salaries and Expenses | 223.2 | 269 | 250 | 229.9 | 12.0% | 3.0% |
| National Science Board | 4 | 4 | 4 | 4 | 0% | 0% |
| Office of the Inspector General | 10 | 11.5 | 11.5 | 11.5 | 15% | 15% |
| Total NSF | 5,472.9 | 5,605 | 5,643.3 | 5,531 | 3.1% | 1.1% |
While the Senate increase of 1.1 percent for FY 2006 would be well below the expected rate of inflation over the next year (meaning the increase is actually a small cut in real-dollar terms), even the slightly more reasonable increase approved by the House would still leave NSF $114 million below the FY 2004 level in real dollars, marking the second straight year of real-dollar budget cuts to the only federal agency focused exclusively on basic research.
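For those who want to check this sort of real-dollar arithmetic themselves, here is a minimal Python sketch of the deflation calculation. The appropriation figures come from the NSF table above; the 2.5 percent inflation rate is an illustrative assumption, not a number from this post.

```python
# Back-of-envelope check: a nominal increase below the inflation rate is a
# real-dollar cut. Figures are from the NSF table above; the inflation rate
# is an assumption for illustration, not a number from this post.

def real_change(fy05, fy06, inflation_rate):
    """Return the FY06 level expressed in FY05 dollars, and the real change."""
    fy06_in_fy05_dollars = fy06 / (1 + inflation_rate)
    return fy06_in_fy05_dollars, fy06_in_fy05_dollars - fy05

nsf_fy05_total = 5472.9       # millions, from the table
nsf_fy06_senate = 5531.0      # millions, from the table
assumed_inflation = 0.025     # 2.5% -- an assumption, adjust as you see fit

deflated, delta = real_change(nsf_fy05_total, nsf_fy06_senate, assumed_inflation)
print(f"Senate FY06 level in FY05 dollars: ${deflated:,.1f}M "
      f"(real change: ${delta:+,.1f}M)")
# With any inflation assumption above ~1.1%, the "increase" is a real cut.
```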
National Institute of Standards and Technology (in millions of dollars)

| | FY05 | FY06 Budget Request | FY06 House | FY06 Senate | House vs. FY05 (%) | Senate vs. FY05 (%) |
|---|---|---|---|---|---|---|
| STRS (NIST Labs) | 378.7 | 426.3 | 397.7 | 399.9 | 5.0% | 5.6% |
| Computer Science and Applied Mathematics | 65.4 | | | | | |
| Industrial Tech Services | 247.9 | 46.8 | 106 | 246 | -57.2% | -0.8% |
| MEP | 107.5 | 46.8 | 106 | 106 | -1.4% | -1.4% |
| ATP | | | 0 | 140 | | |
| Construction of Research Facilities | 72.5 | 58.9 | 45 | 198.6 | -37.9% | 173.9% |
| Total NIST | 699.1 | 532 | 548.7 | 844.5 | -21.5% | 20.8% |
These numbers are subject to significant change during conference, as conferees will have to reconcile the $140 million discrepancy between the House and Senate numbers for NIST’s controversial Advanced Technology Program. Unfortunately, either NIST Labs or construction of research facilities will likely bear the brunt of the reconciliation.
National Aeronautics and Space Administration (in millions of dollars)

| | FY05 | FY06 Budget Request | FY06 House | FY06 Senate | House vs. FY05 (%) | Senate vs. FY05 (%) |
|---|---|---|---|---|---|---|
| Science, Aeronautics and Exploration | 7,806.1 [1] | 9,661 | 9,725.8 | 9,761 | 24.6% | 25.0% |
| Exploration Capabilities | 8,358.4 | 6,763 | 6,712.9 | 6,603 | -19.7% | -21.0% |
| Office of the Inspector General | 31.3 | 32.4 | 32.4 | 32.4 | 3.5% | 3.5% |
| Total NASA | 16,195.8 | 16,456.4 | 16,471.1 | 16,396.4 | 1.7% | 1.2% |

[1] Includes $126.0 million in supplemental appropriations
Note: NASA moved some programmatic funds between the Science and the Exploration Capabilities accounts for FY 06, resulting in the big net changes shown in the figures above. It’s not clear yet how supercomputing at NASA Ames would be further affected by these numbers….
National Oceanic and Atmospheric Administration (in millions of dollars)

| | FY05 | FY06 Budget Request | FY06 House | FY06 Senate | House vs. FY05 (%) | Senate vs. FY05 (%) |
|---|---|---|---|---|---|---|
| Operations, Research and Facilities | 2,793.6 [2] | 2,531.2 | 2,447 | 3,203 | -12.4% | 14.7% |
| Total NOAA | 3,925.1 [3] | 3,581.2 | 3,429 | 4,476 | -12.6% | 14.0% |

[2] Includes $24.0 million in supplemental appropriations
[3] Includes $58.9 million in supplemental appropriations
The House and Senate funding levels are so far apart for NOAA that it’s hard to imagine where the final number will land after conference, or what other agency funding will look like as a result. It’s possible that the Senate could back off the NOAA number and use the difference to increase funding for NSF (the Senate did include some glowing praise for NSF in its report, just no significant funding), but it’s not clear that anyone knows what the final outcome will be.
Outlook
In theory, with the Senate’s consideration of the Commerce, Science, Justice bill we should have a more complete view of how federal science agencies should fare in the FY 06 appropriations process. But in reality, even with the Senate and House numbers, there are other factors at play that make predicting a “final” number impossible at this point.
The most significant “X” factor is the impact of the federal response to Hurricane Katrina — ultimately expected to exceed $200 billion in emergency spending. As that number grows, so does the chorus of voices calling for cuts in other federal spending to mitigate the blow to the budget (including efforts in the blogosphere gaining some attention). The House Republican Study Committee — which numbers about 100 GOP congressmen — has already announced “Operation Offset,” a list of budget cuts (pdf) that could be used to help offset the predicted spending. Included in the recommendations are cuts to the NSF Math and Science Program (saving $188 million this year), canceling NASA’s Moon/Mars Initiative (saving $1.5 billion), and eliminating ATP and MEP ($140 million and $110 million, respectively).
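For scale, here is a quick back-of-envelope sum of the science-related items on that list, set against the roughly $200 billion in projected Katrina-related spending cited above; all of the figures are the ones quoted in this post.

```python
# Science-related items from the RSC's "Operation Offset" list, as cited above
# (figures in millions of dollars), compared with the ~$200 billion in
# projected Katrina-related emergency spending also cited above.

offsets = {
    "NSF Math and Science Program": 188,
    "NASA Moon/Mars Initiative": 1500,
    "ATP": 140,
    "MEP": 110,
}

total = sum(offsets.values())
katrina_spending = 200_000  # millions

print(f"Science-related offsets: ${total:,}M")
print(f"Share of projected Katrina spending: {total / katrina_spending:.1%}")
```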
While it’s not likely that budget cuts will happen line-by-line as the RSC suggests, it’s possible that the leadership could adopt an across-the-board rescission for all federal agencies as it did last year — which cost NSF (and every other science agency, save NASA’s shuttle and moon/mars programs) 2% of its FY 05 appropriation. Another, slightly less likely, possibility is that the appropriators decide to punt on FY 06 funding and pass a Continuing Resolution that lasts the duration of the fiscal year, freezing funding across the board.
With ten appropriations bills still unfinished and the start of the 2006 fiscal year only five days away, Congress will have to pass a Continuing Resolution to keep the federal government operating after October 1st. The first CR will hit the House floor on Wednesday and will likely keep things running through early November, absent the resolution of all outstanding appropriations bills. Funding in the CR will be set at the lower of the Senate- or House-approved levels, or at the current level, with no new starts (and no programs cancelled). There is some interest within the congressional leadership in sidestepping some of the concerns about Katrina offsets and the effect on the budget reconciliation process by passing a CR that would be in effect for all of FY 06.
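For concreteness, here is a trivial sketch of that funding rule as it is usually formulated (each account runs at the lowest of the House-passed level, the Senate-passed level, or the current-year level), using the NSF totals from the table above as inputs. The “lowest of the three” reading is my interpretation of standard CR practice, not language from the bill.

```python
# Typical continuing-resolution rule (my reading, not bill text): fund each
# account at the lowest of the House level, the Senate level, or the current
# (FY05) level, with no new program starts. NSF totals from the table above.

nsf_levels = {"FY05 (current)": 5472.9, "House": 5643.3, "Senate": 5531.0}

cr_level = min(nsf_levels.values())
source = min(nsf_levels, key=nsf_levels.get)
print(f"NSF would run at ${cr_level:,.1f}M under a CR (the {source} level)")
```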
Any of the scenarios above puts science funding in jeopardy for FY 06 and would prove very difficult for the science community to combat. The decision will be purely political, based on the extent of the leadership’s fears that they’ll be punished by conservatives in the 06 elections for spending so freely.
That noted, the science community still needs to weigh in on the appropriations debate. CRA’s Computing Research Advocacy Network will be doing its part with an activity targeted at urging House members to support in conference — at the very least — the increases they’ve already approved for NSF, NIST, and DOE Science (including ASC), and urging Senate members to adopt the House numbers. If you haven’t yet joined CRAN, now’s your chance. CRAN members, keep your eyes peeled for the next CRAN alert, which will provide all the details.
Where the Jobs are and Students aren’t

The Globe and Mail has an interesting article today on the disconnect between the perception of the computing job market (bad) and the reality (good).
David Kellam can do, but he’s opting to teach.
He graduated from Queen’s University last year with a degree in computing. But he’s turning away from the tech sector as a long-term career prospect. Instead, the 23-year-old went back to Queen’s and enrolled in the faculty of education.
“I see no need to get myself stuck in a grey box somewhere pounding out code that may or may not be used inside some whale of an application,” Mr. Kellam says.
He is among a growing number of North American students and grads steering away from tech-sector jobs, presuming the industry is still in a post-bubble slump, with little in the way of employment opportunities.
How wrong they are, according to industry experts, who point to strong evidence that the tech industry is on the rise again and facing a supply-and-demand hiring disconnect.
It’s a good read.
The article cites data from CRA’s Taulbee Survey. If you haven’t checked out the new CRA Bulletin (now in blog form!), it’s worth a look: it’s compiled by CRA’s Manager of Membership and Information Services Jay Vegso, who, along with Surveys and Evaluation Coordinator Betsy Bizot, helps pull together all the Taulbee data every year. Jay has a number of informative posts on the IT workforce debate, the Taulbee data, workforce projections, and overall high-tech employment. A worthy addition to your blog roll….
Zuckerman in US News: Investing in Tomorrow

Mort Zuckerman, editor-in-chief of US News and World Report, uses his latest column to berate the Administration for cutting the federal investment in scientific research:
The American century, as the 20th century was known, was built on scientific progress. American corporations were the first to develop major in-house research labs and the first to work closely with academic institutions. After the Soviets launched Sputnik, we went into the overdrive that put a man on the moon.
In the second half of the 20th century, we reaped the harvest: fiber optics, integrated circuits, wireless communications, lasers, the Web, global positioning satellites, hybrid automobiles, video games, computers, and an enormous variety of medical technologies and drugs. All these inventions and discoveries transformed daily life around the world because American know-how and entrepreneurial energy married them to venture capital, then produced and marketed them.
…
Today, however, this is all being reversed. Why? Two reasons. The first is the cutback in federal support for advanced science. The second, many researchers believe, is that the Bush administration is fostering an antiscience culture. President Bush paved the way to double the National Science Foundation’s budget over five years, then, just two years later, he allowed Congress to cut the projected budget by $2 billion. Cut budgets for research and training, and we won’t have the economic growth tomorrow that we had yesterday. And this when we face, for the first time in our history, competition from low-wage, high-human-capital communities in China, India, and Asia. At the very least, it means fewer American jobs.
We must find the money to reverse this trend. It is not so much a current expenditure as an investment in our future. But money has to be accompanied by a recommitment to basing policy on professional analysis and scientific data from responsible agencies. An administration that packs advisory committees with industry representatives and disbands panels that provide advice unacceptable to political ideology is shortchanging the future of all of us.
Zuckerman also makes the case for the reestablishment of the Congressional Office of Technology Assessment — an office set up during the Nixon Administration to provide non-partisan advice to lawmakers on scientific and technical matters, but eliminated in FY 96 as part of congressional belt-tightening. While I agree that the current Administration appears to have issues with scientific advisory bodies that offer advice that conflicts with its policy goals, I’m not sure reconstituting OTA would help. As a veteran of the House Science Committee staff (though after OTA was disbanded), I can attest to the value of having direct contact between Members of Congress and researchers and technologists. I’m sympathetic to arguments that OTA, by virtue of the “buffer” it created between scientists and legislators, encouraged a “bureaucratic” approach to science policy, and I think the most critical functions of the office are probably well-tended by entities like the Congressional Research Service, the National Academies, and the Government Accountability Office. Plus, as a science advocate now, I appreciate that organizations like CRA are relied upon more by key members of Congress and staff to provide input on science and technology policy.
But otherwise, I think Zuckerman’s piece is on the money. He’s certainly right about the importance of looking at federal support for research as an investment in the future of the country. Read the whole thing.
House Science Cyber Security and Critical Infrastructures Hearing Wrapup

As mentioned previously, the House Science Committee met yesterday to focus on the threat cyber security vulnerabilities pose to various sectors of the Nation’s critical infrastructure. Representatives from the oil and gas, chemical, electrical and communications sectors all testified that their industries are becoming more and more dependent upon public networks, that those networks are under serious threat from cyber attack, and that the federal government has a clear role both in supporting information exchange and coordination among all the industry stakeholders and in supporting a research agenda aimed at addressing the threat, primarily in the long term. I’m not sure there’s much more I need to add to that, other than to point to the archived video, the hearing charter (pdf), and the testimony of the five witnesses.
A few observations:
Committee chairman Sherwood Boehlert (R-NY) set the tone for the hearing in his opening statement by declaring that despite everything else that was taking place on the Hill that day — including the Roberts confirmation hearing and the party caucus meeting to choose a new Chairman of the Homeland Security Committee (Rep. Peter King (R-NY) was the choice) — he couldn’t think of another event more important than this hearing on cyber security.
We shouldn’t have to wait for the cyber equivalent of a Hurricane Katrina – or even a Hurricane Ophelia might serve – to realize that we are inadequately prepared to prevent, detect and respond to cyber attacks.
And a cyber attack can affect a far larger area at a single stroke than can any hurricane. Not only that, given the increasing reliance of critical infrastructures on the Internet, a cyber attack could result in deaths as well as in massive disruption to the economy and daily life.
…
So our goal this morning is to help develop a cybersecurity agenda for the federal government, especially for the new Assistant Secretary. I never want to have to sit on a special committee set up to investigate why we were unprepared for a cyber attack. We know we are vulnerable; it’s time to act.
Despite federally-supported research and development in cyber security being cited as a critical need by each one of the industry witnesses, the only federal witness — Andy Purdy, Director of the National Cyber Security Division at DHS — didn’t mention R&D in his oral remarks other than to hope that he’d get a chance to talk about it during questioning (alas, he didn’t). In his written testimony, Purdy noted that DHS’ R&D goals are almost exclusively short-term:
- Perform R&D aimed at improving the security of existing deployed technologies and to ensure the security of new emerging systems;
- Develop new and enhanced technologies for the detection of, prevention of, and response to cyber attacks on the nation’s critical infrastructure; and
- Facilitate the transfer of these technologies into the national infrastructure as a matter of urgency.
Of course, as PITAC found in its review of the nation’s cyber security R&D portfolio, even this narrow commitment to the short-term suffers from a severe lack of priority within the agency. The agency has requested only $17 million for FY 06 ($1 million less than last year) for cyber security research, out of a total S&T budget of over a billion dollars. I was disappointed that the members of the committee didn’t spend more time questioning DHS’ priority when it comes to funding cyber security R&D.
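To make the scale of that commitment concrete, here is a quick calculation using only figures cited in this post and in the Lazowska interview above: a DHS S&T budget of roughly $1.3 billion, $18 million for cyber security last year, and $17 million requested for FY 06.

```python
# Rough share of DHS's S&T budget devoted to cyber security R&D, using the
# figures cited in this post: a ~$1.3 billion S&T budget (per the ZDNet piece
# quoted earlier), $18 million for cyber security last year, and $17 million
# requested for FY06.

st_budget_millions = 1300.0        # DHS Science & Technology budget (approx.)
cyber_last_year = 18.0             # millions
cyber_fy06_request = 17.0          # millions

share = cyber_fy06_request / st_budget_millions
print(f"Cyber security R&D share of DHS S&T budget: {share:.1%}")
print(f"Year-over-year change in cyber R&D: ${cyber_fy06_request - cyber_last_year:+.0f}M")
```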
The hearing was well-attended by members of the committee. Despite lots of other events on the Hill, the hearing drew at least 23 different Members of Congress, with many sticking around to ask questions. There was plenty of room in the audience and in the sections reserved for press, however, which led Chairman Boehlert to complain that cyber security is still greeted with a “muffled yawn” outside his committee room and that he hoped it wasn’t going to take a “cyber Katrina” to wake people up about the dangerous threat.
I was pleased that Boehlert took a few minutes out of the question period to suggest to the industry representatives (SBC, British Petroleum, Dow Chemical, and American Electric Power were all represented) that they make use of their exceptionally persuasive “hired guns” in DC to advocate for more R&D and better coordination. The lobbyists need to be out there putting focus on the importance of this subject, he said.
Finally, an odd tack during the question-and-answer portion of the hearing: Rep. Roscoe Bartlett (R-MD) used his five minutes to berate DHS and the industry representatives for failing to plan and prepare adequately for the “ultimate low-probability, high-impact event” threatening the nation: a nuclear electromagnetic pulse attack. An EMP attack (detonating a large-yield nuclear weapon many miles in the atmosphere above the US) would potentially render every non-hardened microprocessor in the country completely inoperable, which, given the ubiquity of microprocessors in just about everything, would have a devastating effect on the country. Bartlett was especially interested in hearing how the energy companies would cope, given that every transformer they operate would likely be destroyed, including ones we no longer have the ability to manufacture domestically. None of the witnesses could point to any significant preparation in their sectors.
Katrina and Computing

Federal Computer Week’s Aliya Sternstein has an interesting piece in this week’s issue on the role of computing technology in helping predict hurricanes and mitigate costs like those of Hurricane Katrina.
Scientists are using a range of technologies to better predict the impact hurricanes can have on the economy and environment to minimize future damage and save lives.
Supercomputers, modeling programs and geographic information systems are some of the technologies scientists use to track the movement of hurricanes and predict damage. Experts warn, however, that skilled professionals are as crucial to accurate forecasting as technology.
Supercomputers aided the National Oceanic and Atmospheric Administration in accurately forecasting Hurricane Katrina’s path. The storm devastated the coastal areas of Alabama, Louisiana and Mississippi.
“Two and a half to three days before the hurricane hit, we were pretty much zoomed in on the Louisiana/Mississippi Gulf Coast as where the hurricane would hit,” said Jack Beven, a hurricane specialist at the NOAA Tropical Prediction Center. “It’s probably not the most accurate we’ve been, but it’s certainly pretty accurate.”
From what I understand, NOAA does a great job with the computing resources it’s been allocated. I’m just not sure they’ve been allocated nearly enough. The article points out that NOAA has been able to upgrade its supercomputing capacity from 0.5 teraflops to 1.5 teraflops within the last year. (Update (9/16/2005): This is questionable; see the notes below.* **) That’s a great improvement, but given the scale of the problem they face, I’m not sure it’s adequate.
In its look at the state of computational science in the U.S. in the last year, the President’s Information Technology Advisory Committee (PITAC) (now disbanded, sigh) came up with a really interesting economic case for the need for increased computational resources in hurricane forecasting. I’ve cited it here once previously, but I’ll quote it again:
One nugget I found especially interesting from the presentation [of the PITAC Subcommittee on Computational Science] was an example of both the economic benefit and the health and safety benefit that will arise from more capable modeling enabled by advanced computing. The subcommittee noted that 40 percent of the $10 trillion U.S. economy is impacted by climate and weather. As one example of this, the subcommittee cited the hurricane warnings provided by the National Hurricane Center and the cost of the evacuations that often result. According to the subcommittee, there is $1 million in economic loss for each mile of coastline evacuated. With the current models, the U.S. now “over warns” by a factor of 3, with the average “over-warning” for a hurricane resulting in 200 miles of evacuations — or $200 million in unnecessary loss per event. Improved modeling (better algorithms, better software, more capable hardware, etc) would improve the accuracy of forecasts, saving lives and resources.
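The arithmetic behind that estimate is simple enough to sketch. The cost-per-mile and over-warning figures below are the ones the subcommittee cited; the 300-mile warned length is an assumption chosen so the numbers reproduce the subcommittee’s 200-mile example, so treat this as a back-of-envelope illustration rather than an official methodology.

```python
# Back-of-envelope version of the PITAC subcommittee's over-warning estimate.
# Cited figures: ~$1M in economic loss per mile of coastline evacuated, and
# current models that over-warn by roughly a factor of 3. The 300-mile warned
# length below is an assumption chosen so the numbers line up with the
# subcommittee's "200 miles of unnecessary evacuation" example.

cost_per_mile_musd = 1.0      # $ millions per mile evacuated (cited)
over_warn_factor = 3.0        # warned coastline / coastline actually at risk (cited)
miles_warned = 300.0          # assumed total warned miles, for illustration

miles_needed = miles_warned / over_warn_factor
unnecessary_miles = miles_warned - miles_needed
unnecessary_cost = unnecessary_miles * cost_per_mile_musd

print(f"Necessary evacuation: {miles_needed:.0f} miles")
print(f"Unnecessary evacuation: {unnecessary_miles:.0f} miles "
      f"(~${unnecessary_cost:.0f}M per event)")
# Cutting the over-warning factor from 3 to 2 would cut the unnecessary cost
# from ~$200M to ~$100M per event in this toy setup.
```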
While over-warning probably wasn’t much of an issue in Katrina’s case, there are a number of capabilities that we currently lack that may have proven useful. Folks in the severe storms community tell me that current operational forecast models run by NOAA suffer from a number of limitations that work against obtaining accurate predictions of hurricane intensity and path. For example, they cite the lack of resolution in the current models that misses important fine-scale features like rain bands and the eye wall; the lack of coupling between atmospheric, wave and ocean prediction models; and computing resources that can generate only one or a few forecasts (as opposed to large ensembles), which impacts NOAA’s ability to improve forecasting skill and quantify uncertainty. While NOAA’s move to a 1.5 teraflop capacity is a welcome change, it’s still far below what one would consider a “leadership class” computing capacity for the agency — like those available at NSF, NASA and DOE centers. I know it’s a coarse measure, but 1.5 teraflops doesn’t even get you in the top 300 fastest machines — never mind a machine capable of the kind of improvements hinted at above.* And it’s not all about big iron. NOAA needs additional resources to ramp up its infrastructure — software, hardware and personnel — and to boost basic research programs within the agency and the university community. Asking for any increase in resources anywhere is obviously very tough in the current budget environment, but the size of the “bump” required here is relatively small, given the potential benefit.
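To illustrate the ensemble point in the simplest possible terms, here is a toy sketch (emphatically not any NOAA model) of why running many perturbed forecasts, rather than a single one, lets forecasters quantify uncertainty: small, realistic errors in the initial state spread into a distribution of outcomes, and the width of that distribution is itself useful forecast information. Every number in it is made up for illustration.

```python
import random

# Toy illustration of ensemble forecasting (not a real hurricane model).
# Each "member" starts from a slightly perturbed initial storm position and
# accumulates small random steering errors; the spread of predicted landfall
# positions is a crude measure of forecast uncertainty.

random.seed(42)

def toy_track(initial_pos_km, hours, drift_km_per_hr=5.0, steering_error_km=2.0):
    """Advance a 1-D 'along-coast' storm position with noisy steering."""
    pos = initial_pos_km
    for _ in range(hours):
        pos += drift_km_per_hr + random.gauss(0.0, steering_error_km)
    return pos

def ensemble_forecast(n_members=50, init_uncertainty_km=30.0, hours=72):
    landfalls = []
    for _ in range(n_members):
        start = random.gauss(0.0, init_uncertainty_km)  # imperfect initial fix
        landfalls.append(toy_track(start, hours))
    mean = sum(landfalls) / len(landfalls)
    spread = (sum((x - mean) ** 2 for x in landfalls) / len(landfalls)) ** 0.5
    return mean, spread

mean_landfall, spread = ensemble_forecast()
print(f"Ensemble mean landfall: {mean_landfall:.0f} km along the coast")
print(f"Ensemble spread (1 sigma): {spread:.0f} km")
# A single deterministic run would give one number with no sense of how far
# off it might be; more computing buys more members, higher resolution, or both.
```

Real operational ensembles are, of course, vastly more expensive: each member is a full atmospheric (and ideally coupled ocean and wave) model run, which is exactly why the computing capacity question matters.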
But none of this is intended to take away from the job NOAA has done with the resources it already has. Because of NOAA’s forecasts, there was ample warning that this major storm was barreling in on the Gulf Coast and there were reasonable estimates of what it was going to do once it got there. But given sufficient resources the models will get even better, which means the forecasts will get better — more accurate, more precise, and more timely. How much would it be worth to have the accuracy and precision we have now at 24-36 hours before a major storm available 3 days out? Or five days out?
I know it may seem a bit crass to be talking about boosting funding for computing only days after a tragedy as big as Katrina’s impact on the gulf coast, but events like this are a trigger for the reevaluation of national priorities, and it seems to me that computing resources at NOAA haven’t been a national priority for quite a while.

* Update: (9/16/2005) Actually, it looks like NOAA has slightly more adequate computing resources than the FCW article suggests. According to the Top500 list, NOAA has two machines capable of 4.4 teraflops and two capable of 1.8 teraflops. So I’m not sure what the FCW article reflects. That’s still quite some distance from “leadership class” computing, trailing machines in Japan, Sweden, Germany, Russia, Korea, China, and Australia, but it’s better than the figures quoted in the article above.

** Update 2: (9/16/2005) Aliya Sternstein writes to note that the 1.5 teraflop measurement cited in the FCW piece applies to the NWS system at the IBM facility in Gaithersburg, MD, not all of NOAA’s computational capacity.