Computing Research Policy Blog

PCAST to Assume PITAC’s Role


President Bush ordered today that the President’s Council of Advisors on Science and Technology (PCAST) shall now serve as the President’s Information Technology Advisory Committee (PITAC), answering the question of what would become of PITAC after the President allowed that committee’s charter to expire last June. I’m not sure how PCAST will handle the new responsibilities — presumably the committee will be expanded somewhat to handle the load, but we’ll see.
I’m of two minds about the move. On the one hand, the membership of PCAST is top-notch. Having advisors of that stature become interested and invested in some of the issues of great concern to the IT community (like the overall level of federal support and the changing landscape for computing research) would add even more weight to our position. But I’m worried that the committee, which has a much broader charter than PITAC’s narrow focus on IT issues, won’t be able to examine the issues in the same depth that an independent IT advisory committee could.
Anyway, we’ll keep a close eye on developments and report them here.
Update (five minutes after I posted the above): The National Coordinating Office for IT is calling this an elevation of the role of external information technology advice in the White House. Here’s the OSTP press release (pdf).
The release points out that PCAST is also the National Nanotechnology Advisory Panel and that the committee established a “technical advisory group” comprising “about 50 top government and private sector nanotechnology scientists” that has proved “highly beneficial” to PCAST’s NNI assessments. They plan to do something similar for IT.
As more details are revealed, I’m thinking the positives outweigh the negatives. …

FY 06 Appropriations Update: A biggie


On September 15th, the Senate approved the FY 2006 Commerce, Science, Justice appropriations bill (its version of the House’s Science, State, Justice, Commerce bill, which the House passed back in June), setting funding levels for a number of science agencies for the coming year. As we noted back in June, the Senate indicated it was going to be less generous than the House toward some key science agencies, and that indication held true. With the Senate’s action, we now have a substantial piece of the puzzle that is the annual appropriations process for science, so it seems like an appropriate time to summarize where we stand. After the agency-by-agency summaries, see the “Outlook” section for the reasons why things will probably get even worse.

Latest Agency Funding Levels
NSF:

National Science Foundation
(in millions of dollars)
Account | FY05 | FY06 Request | FY06 House | FY06 Senate | House vs. FY05 (%) | Senate vs. FY05 (%)
Research and Related Activities | 4,220.6 | 4,333.5 | 4,310 | 4,345.2 | 2.1% | 3.0%
MREFC | 173.7 | 250 | 193.4 | 193.4 | 11.3% | 11.3%
Education and Human Resources | 841.4 | 737 | 807 | 747 | -4.1% | -11.2%
Salaries and Expenses | 223.2 | 269 | 250 | 229.9 | 12.0% | 3.0%
National Science Board | 4 | 4 | 4 | 4 | 0% | 0%
Office of the Inspector General | 10 | 11.5 | 11.5 | 11.5 | 15% | 15%
Total NSF | 5,472.9 | 5,605 | 5,643.3 | 5,531 | 3.1% | 1.1%

While the Senate’s 1.1 percent increase for FY 2006 would be well below the expected rate of inflation over the next year (meaning its increase is actually a small cut in real-dollar terms), even the slightly more reasonable increase approved by the House would still leave NSF $114 million below its FY 2004 level in real dollars, marking the second straight year of real-dollar budget cuts to the only federal agency focused exclusively on basic research.
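
To make the real-dollar comparison concrete, here’s a minimal sketch (in Python) of the arithmetic behind the “vs. FY05” columns above and the inflation adjustment; the roughly 3 percent inflation rate is my own illustrative assumption, not a figure from the appropriations reports.

```python
# Sketch of the percent-change arithmetic used in the tables above, plus a
# simple inflation adjustment. The 3% inflation rate is an illustrative
# assumption, not a number from the appropriations bills.

FY05_TOTAL = 5472.9        # NSF total, FY 2005 ($ millions)
HOUSE_FY06 = 5643.3        # House-approved FY 2006 total
SENATE_FY06 = 5531.0       # Senate-approved FY 2006 total
ASSUMED_INFLATION = 0.03   # hypothetical ~3% inflation over the year

def pct_change(new, old):
    """Nominal percent change, as in the 'House/Senate vs. FY05 (%)' columns."""
    return 100.0 * (new - old) / old

def real_change(new, old, inflation):
    """Dollar change after deflating the new-year figure by the assumed inflation."""
    return new / (1.0 + inflation) - old

for label, fy06 in [("House", HOUSE_FY06), ("Senate", SENATE_FY06)]:
    print(f"{label} vs. FY05: {pct_change(fy06, FY05_TOTAL):+.1f}% nominal, "
          f"{real_change(fy06, FY05_TOTAL, ASSUMED_INFLATION):+.1f}M in real terms")
```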

NIST:

National Institute of Standards and Technology
(in millions of dollars)
Account | FY05 | FY06 Request | FY06 House | FY06 Senate | House vs. FY05 (%) | Senate vs. FY05 (%)
STRS (NIST Labs) | 378.7 | 426.3 | 397.7 | 399.9 | 5.0% | 5.6%
    Computer Science and Applied Mathematics | 65.4 | | | | |
Industrial Tech Services | 247.9 | 46.8 | 106 | 246 | -57.2% | -0.8%
    MEP | 107.5 | 46.8 | 106 | 106 | -1.4% | -1.4%
    ATP | | | 0 | 140 | |
Construction of Research Facilities | 72.5 | 58.9 | 45 | 198.6 | -37.9% | 173.9%
Total NIST | 699.1 | 532 | 548.7 | 844.5 | -21.5% | 20.8%

These numbers are subject to significant change during conference, as conferees will have to reconcile the $140 million discrepancy between the House and Senate numbers for NIST’s controversial Advanced Technology Program. Unfortunately, either NIST Labs or construction of research facilities will likely bear the brunt of the reconciliation.

Energy:

Department of Energy
(in millions of dollars)
Account | FY05 | FY06 Request | FY06 House | FY06 Senate | House vs. FY05 (%) | Senate vs. FY05 (%)
Office of Science | 3,599.9 | 3,462.7 | 3,666 | 3,702 | 1.8% | 2.9%
    Basic Energy Science | 1,104.6 | 1,146 | 1,173.1 | 1,241 | 6.2% | 12.3%
    Advanced Scientific Computing | 232 | 207 | 246 | 207 | 6.0% | -10.8%

NASA:

National Aeronautics and Space Administration
(in millions of dollars)
Account | FY05 | FY06 Request | FY06 House | FY06 Senate | House vs. FY05 (%) | Senate vs. FY05 (%)
Science, Aeronautics and Exploration | 7,806.1 (3) | 9,661 | 9,725.8 | 9,761 | 24.6% | 25.0%
Exploration Capabilities | 8,358.4 | 6,763 | 6,712.9 | 6,603 | -19.7% | -21.0%
Office of the Inspector General | 31.3 | 32.4 | 32.4 | 32.4 | 3.5% | 3.5%
Total NASA | 16,195.8 | 16,456.4 | 16,471.1 | 16,396.4 | 1.7% | 1.2%
(3) Includes $126.0 million in supplemental appropriations
Note: NASA moved some programmatic funds between the Science and the Exploration Capabilities account for FY 06, resulting in the big net changes shown in the figures above.

It’s not clear yet how supercomputing at NASA Ames would be further affected by these numbers….

NOAA:

National Oceanic and Atmospheric Administration
(in millions of dollars)
Account | FY05 | FY06 Request | FY06 House | FY06 Senate | House vs. FY05 (%) | Senate vs. FY05 (%)
Operations, Research and Facilities | 2,793.6 (1) | 2,531.2 | 2,447 | 3,203 | -12.4% | 14.7%
Total NOAA | 3,925.1 (2) | 3,581.2 | 3,429 | 4,476 | -12.6% | 14.0%
(1) Includes $24.0 million in supplemental appropriations
(2) Includes $58.9 million in supplemental appropriations

The House and Senate funding levels for NOAA are so far apart that it’s hard to imagine where the final number will end up after conference, or what funding for other agencies will look like as a result. It’s possible that the Senate could back off its NOAA number and use the difference to increase funding for NSF (the Senate did include some glowing praise for NSF in its report, just no significant funding), but it’s not clear anyone knows what the final outcome will be.

Outlook

In theory, with the Senate’s consideration of the Commerce, Science, Justice bill, we should now have a more complete view of how federal science agencies will fare in the FY 06 appropriations process. But in reality, even with both the Senate and House numbers in hand, there are other factors at play that make predicting a “final” number impossible at this point.

The most significant “X” factor is the impact of the federal response to Hurricane Katrina — ultimately expected to exceed $200 billion in emergency spending. As that number grows, so does the chorus of voices calling for cuts elsewhere in federal spending to mitigate the blow to the budget (including efforts in the blogosphere that are gaining some attention). The House Republican Study Committee — which numbers about 100 GOP congressmen — has already announced “Operation Offset,” a list of budget cuts (pdf) that could be used to help offset the predicted spending. Included in the recommendations are cuts to the NSF Math and Science Program (saving $188 million this year), canceling NASA’s Moon/Mars initiative (saving $1.5 billion), and eliminating ATP and MEP ($140 million and $110 million, respectively).

While it’s not likely that budget cuts will happen line-by-line as the RSC suggests, it’s possible that the leadership could adopt an across-the-board rescission for all federal agencies, as it did last year — a move that cost NSF (and every other science agency, save NASA’s shuttle and Moon/Mars programs) 2 percent of its FY 05 appropriation. Another, slightly less likely, possibility is that the appropriators decide to punt on FY 06 funding and pass a Continuing Resolution that lasts the duration of the fiscal year, freezing funding across the board.

With ten appropriations bills still unfinished and the start of the 2006 fiscal year only five days away, Congress will have to pass a Continuing Resolution to keep the federal government operating after October 1st. The first CR will hit the House floor on Wednesday and will likely keep things running through early November, absent the resolution of all outstanding appropriations bills. Funding in the CR will be set at the lower of the Senate- or House-approved levels, or at the current level, with no new starts (and no programs cancelled). There is some interest within the congressional leadership in sidestepping some of the concerns about Katrina offsets and their effect on the budget reconciliation process by passing a CR that would be in effect for all of FY 06.

Any of the scenarios above puts science funding in jeopardy for FY 06 and would prove very difficult for the science community to combat. The decision will be purely political, based on the extent of the leadership’s fears that it will be punished by conservatives in the ’06 elections for spending so freely.

That noted, the science community still needs to weigh in on the appropriations debate. CRA’s Computing Research Advocacy Network will be doing its part with an activity urging House members to support in conference — at the very least — the increases they’ve already approved for NSF, NIST, and DOE Science (including ASC), and urging Senators to adopt the House numbers. If you haven’t yet joined CRAN, now’s your chance. CRAN members, keep your eyes peeled for the next CRAN alert, which will provide all the details.

Where the Jobs are and Students aren’t


The Globe and Mail has an interesting article today on the disconnect between the perception of the computing job market (bad) and the reality (good).

David Kellam can do, but he’s opting to teach.
He graduated from Queen’s University last year with a degree in computing. But he’s turning away from the tech sector as a long-term career prospect. Instead, the 23-year-old went back to Queen’s and enrolled in the faculty of education.
“I see no need to get myself stuck in a grey box somewhere pounding out code that may or may not be used inside some whale of an application,” Mr. Kellam says.
He is among a growing number of North American students and grads steering away from tech-sector jobs, presuming the industry is still in a post-bubble slump, with little in the way of employment opportunities.
How wrong they are, according to industry experts, who point to strong evidence that the tech industry is on the rise again and facing a supply-and-demand hiring disconnect.

It’s a good read.
The article cites data from CRA’s Taulbee Survey. If you haven’t checked out the new CRA Bulletin (now in blog form!), you should — it’s compiled by CRA’s Manager of Membership and Information Services, Jay Vegso, who, along with Surveys and Evaluation Coordinator Betsy Bizot, helps pull together all the Taulbee data every year. Jay has a number of informative posts on the IT workforce debate, the Taulbee data, workforce projections, and overall high-tech employment. A worthy addition to your blogroll….

Zuckerman in US News: Investing in Tomorrow


Mort Zuckerman, editor-in-chief of US News and World Report, uses his latest column to berate the Administration for cutting the federal investment in scientific research:

The American century, as the 20th century was known, was built on scientific progress. American corporations were the first to develop major in-house research labs and the first to work closely with academic institutions. After the Soviets launched Sputnik, we went into the overdrive that put a man on the moon.
In the second half of the 20th century, we reaped the harvest: fiber optics, integrated circuits, wireless communications, lasers, the Web, global positioning satellites, hybrid automobiles, video games, computers, and an enormous variety of medical technologies and drugs. All these inventions and discoveries transformed daily life around the world because American know-how and entrepreneurial energy married them to venture capital, then produced and marketed them.

Today, however, this is all being reversed. Why? Two reasons. The first is the cutback in federal support for advanced science. The second, many researchers believe, is that the Bush administration is fostering an antiscience culture. President Bush paved the way to double the National Science Foundation’s budget over five years, then, just two years later, he allowed Congress to cut the projected budget by $2 billion. Cut budgets for research and training, and we won’t have the economic growth tomorrow that we had yesterday. And this when we face, for the first time in our history, competition from low-wage, high-human-capital communities in China, India, and Asia. At the very least, it means fewer American jobs.
We must find the money to reverse this trend. It is not so much a current expenditure as an investment in our future. But money has to be accompanied by a recommitment to basing policy on professional analysis and scientific data from responsible agencies. An administration that packs advisory committees with industry representatives and disbands panels that provide advice unacceptable to political ideology is shortchanging the future of all of us.

Zuckerman also makes the case for reestablishing the Congressional Office of Technology Assessment — an office set up during the Nixon Administration to provide non-partisan advice to lawmakers on scientific and technical matters, but eliminated in FY 96 as part of congressional belt-tightening. While I agree that the current Administration appears to have issues with scientific advisory bodies that offer advice conflicting with its policy goals, I’m not sure reconstituting OTA will help. As a veteran of the House Science Committee staff (though after OTA was disbanded), I can attest to the value of direct contact between Members of Congress and researchers and technologists. I’m sympathetic to arguments that OTA, by virtue of the “buffer” it created between scientists and legislators, encouraged a “bureaucratic” approach to science policy, and I think the most critical functions of the office are probably well tended to by entities like the Congressional Research Service, the National Academies, and the Government Accountability Office. Plus, as a science advocate now, I appreciate that key members of Congress and their staff rely more on organizations like CRA for input on science and technology policy.
But otherwise, I think Zuckerman’s piece is on the money. He’s certainly right about the importance of looking at federal support for research as an investment in the future of the country. Read the whole thing.

House Science Cyber Security and Critical Infrastructures Hearing Wrapup


As mentioned previously, the House Science Committee met yesterday to focus on the threat cyber security vulnerabilities pose to various sectors of the Nation’s critical infrastructure. Representatives from the oil and gas, chemical, electrical and communications sectors all testified that their industries are becoming more and more dependent upon public networks, that those networks are under serious threat from cyber attack, and that the federal government has a clear role both in supporting information exchange and coordination among all the industry stakeholders and in supporting a research agenda aimed at addressing the threat, primarily in the long term. I’m not sure there’s much more I need to add to that, other than to point to the archived video, the hearing charter (pdf), and the testimony of the five witnesses.
A few observations:

  • Committee chairman Sherwood Boehlert (R-NY) set the tone for the hearing in his opening statement by declaring that despite everything else that was taking place on the Hill that day — including the Roberts confirmation hearing and the party caucus meeting to choose a new Chairman of the Homeland Security Committee (Rep. Peter King (R-NY) was the choice) — he couldn’t think of another event more important than this hearing on cyber security.

    We shouldn’t have to wait for the cyber equivalent of a Hurricane Katrina – or even a Hurricane Ophelia might serve – to realize that we are inadequately prepared to prevent, detect and respond to cyber attacks.
    And a cyber attack can affect a far larger area at a single stroke than can any hurricane. Not only that, given the increasing reliance of critical infrastructures on the Internet, a cyber attack could result in deaths as well as in massive disruption to the economy and daily life.

    So our goal this morning is to help develop a cybersecurity agenda for the federal government, especially for the new Assistant Secretary. I never want to have to sit on a special committee set up to investigate why we were unprepared for a cyber attack. We know we are vulnerable, it’s time to act.

  • Despite federally-supported research and development in cyber security being cited as a critical need by each one of the industry witnesses, the only federal witness — Andy Purdy, Director of the National Cyber Security Division at DHS — didn’t mention R&D in his oral remarks other than to hope that he’d get a chance to talk about it during questioning (alas, he didn’t). In his written testimony, Purdy noted that DHS’ R&D goals are almost exclusively short-term:

  • Perform R&D aimed at improving the security of existing deployed technologies and to ensure the security of new emerging systems;
  • Develop new and enhanced technologies for the detection of, prevention of, and response to cyber attacks on the nation’s critical infrastructure; and
  • Facilitate the transfer of these technologies into the national infrastructure as a matter of urgency.
  • Of course, as PITAC found in its review of the nation’s cyber security R&D portfolio, even this narrow commitment to the short-term suffers from a severe lack of priority within the agency. The agency has requested only $17 million for FY 06 ($1 million less than last year) for cyber security research, out of a total S&T budget of over a billion dollars. I was disappointed that the members of the committee didn’t spend more time questioning DHS’ priority when it comes to funding cyber security R&D.

  • The hearing was well-attended by members of the committee. Despite lots of other events on the Hill, the hearing drew at least 23 different Members of Congress, with many sticking around to ask questions. There was plenty of room in the audience and in the sections reserved for the press, however, which led Chairman Boehlert to complain that cyber security is still greeted with a “muffled yawn” outside his committee room and to express his hope that it wasn’t going to take a “cyber Katrina” to wake people up to the dangerous threat.
  • I was pleased that Boehlert took a few minutes out of the question period to suggest to the industry representatives (SBC, British Petroleum, Dow Chemical, and American Electric Power were all represented) that they make use of their exceptionally persuasive “hired guns” in DC to advocate for more R&D and better coordination. The lobbyists need to be out there putting focus on the importance of this subject, he said.
  • Finally, an odd tack during the question and answer portion of the hearing: Rep. Roscoe Bartlett (R-MD) used his five minutes to berate DHS and the industry representatives for failing to plan and prepare adequately for the “ultimate low-probability, high-impact event” threatening the nation: a nuclear electromagnetic pulse attack. An EMP attack (the detonation of a large-yield nuclear weapon many miles above the US) would potentially render every non-hardened microprocessor in the country completely inoperable, which, given the ubiquity of microprocessors in just about everything, would have a devastating effect on the country. Bartlett was especially interested in hearing how the energy companies would cope, given that every transformer they operate would likely be destroyed, including ones we no longer have the ability to manufacture domestically. None of the witnesses could point to any significant preparation in their sectors.
Katrina and Computing


    Federal Computer Week’s Aliya Sternstein has an interesting piece in this week’s issue on the role of computing technology in helping predict and mitigate the cost of Hurricane Katrina.

    Scientists are using a range of technologies to better predict the impact hurricanes can have on the economy and environment to minimize future damage and save lives.
    Supercomputers, modeling programs and geographic information systems are some of the technologies scientists use to track the movement of hurricanes and predict damage. Experts warn, however, that skilled professionals are as crucial to accurate forecasting as technology.
    Supercomputers aided the National Oceanic and Atmospheric Administration in accurately forecasting Hurricane Katrina’s path. The storm devastated the coastal areas of Alabama, Louisiana and Mississippi.
    “Two and a half to three days before the hurricane hit, we were pretty much zoomed in on the Louisiana/Mississippi Gulf Coast as where the hurricane would hit,” said Jack Beven, a hurricane specialist at the NOAA Tropical Prediction Center. “It’s probably not the most accurate we’ve been, but it’s certainly pretty accurate.”

    From what I understand, NOAA does a great job with the computing resources it’s been allocated. I’m just not sure they’ve been allocated nearly enough. The article points out that NOAA has been able to upgrade its supercomputing capacity from 0.5 teraflops to 1.5 teraflops within the last year. (Update (9/16/2005): This figure is questionable — see the notes below.**) That’s a great improvement, but given the scale of the problem they face, I’m not sure it’s adequate.
    In its look at the state of computational science in the U.S. over the last year, the President’s Information Technology Advisory Committee (PITAC) (now disbanded, sigh) came up with a really interesting economic case for the need for increased computational resources in hurricane forecasting. I’ve cited it here once previously, but I’ll quote it again:

    One nugget I found especially interesting from the presentation [of the PITAC Subcommittee on Computational Science] was an example of both the economic benefit and the health and safety benefit that will arise from more capable modeling enabled by advanced computing. The subcommittee noted that 40 percent of the $10 trillion U.S. economy is impacted by climate and weather. As one example of this, the subcommittee cited the hurricane warnings provided by the National Hurricane Center and the cost of the evacuations that often result. According to the subcommittee, there is $1 million in economic loss for each mile of coastline evacuated. With the current models, the U.S. now “over warns” by a factor of 3, with the average “over-warning” for a hurricane resulting in 200 miles of evacuations — or $200 million in unnecessary loss per event. Improved modeling (better algorithms, better software, more capable hardware, etc) would improve the accuracy of forecasts, saving lives and resources.
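
    To make the subcommittee’s numbers concrete, here’s a back-of-the-envelope sketch (in Python) of the over-warning arithmetic quoted above; the inputs are the figures from the PITAC presentation, and the “halved over-warning” scenario is purely an illustrative assumption of mine.

```python
# Back-of-the-envelope version of the PITAC subcommittee's over-warning
# arithmetic quoted above. The inputs are the subcommittee's figures; the
# improved-model scenario is an illustrative assumption.

LOSS_PER_MILE = 1.0      # ~$1 million in economic loss per mile of coastline evacuated
OVER_WARNED_MILES = 200  # average unnecessary evacuation per hurricane, in miles

waste_per_event = LOSS_PER_MILE * OVER_WARNED_MILES
print(f"Unnecessary loss per event: ~${waste_per_event:.0f} million")

# Hypothetical: if better models halved the over-warned coastline, each
# event would waste roughly half as much.
print(f"If over-warning were halved: ~${waste_per_event / 2:.0f} million saved per event")
```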

    While over-warning probably wasn’t much of an issue in Katrina’s case, there are a number of capabilities that we currently lack that may have proven useful. Folks in the severe storms community tell me that current operational forecast models run by NOAA suffer from a number of limitations that work against obtaining accurate predictions of hurricane intensity and path. For example, they cite the lack of resolution in the current models that misses important fine-scale features like rain bands and the eye wall; the lack of coupling between atmospheric, wave and ocean prediction models; and computing resources that can generate only one or a few forecasts (as opposed to large ensembles), which impacts NOAA’s ability to improve forecasting skill and quantify uncertainty.
    While NOAA’s move to a 1.5 teraflop capacity is a welcome change, it’s still far below what one would consider a “leadership class” computing capacity for the agency — like those available at NSF, NASA and DOE centers. I know it’s a coarse measure, but 1.5 teraflops doesn’t even get you in the top 300 fastest machines — never mind a machine capable of the kind of improvements hinted at above.* And it’s not all about big iron. NOAA needs additional resources to ramp up its infrastructure — software, hardware and personnel — and to boost basic research programs within the agency and the university community. Asking for any increase in resources anywhere is obviously very tough in the current budget environment, but the size of the “bump” required here is relatively small, given the potential benefit.
    But none of this is intended to take away from the job NOAA has done with the resources it already has. Because of NOAA’s forecasts, there was ample warning that this major storm was barreling in on the Gulf Coast and there were reasonable estimates of what it was going to do once it got there. But given sufficient resources the models will get even better, which means the forecasts will get better — more accurate, more precise, and more timely. How much would it be worth to have the accuracy and precision we have now at 24-36 hours before a major storm available 3 days out? Or five days out?
    I know it may seem a bit crass to be talking about boosting funding for computing only days after a tragedy as big as Katrina’s impact on the gulf coast, but events like this are a trigger for the reevaluation of national priorities, and it seems to me that computing resources at NOAA haven’t been a national priority for quite a while.
    * Update: (9/16/2005) Actually, it looks like NOAA has slightly more adequate computing resources than the FCW article suggests. According to the Top500 list, NOAA has two machines capable of 4.4 teraflops and two capable of 1.8 teraflops. So I’m not sure what the FCW article reflects. That’s still quite some distance from “leadership class” computing, trailing machines in Japan, Sweden, Germany, Russia, Korea, China, and Australia, but it’s better than the figures quoted in the article above.
    ** Update 2: (9/16/2005) Aliya Sternstein writes to note that the 1.5 teraflop measurement cited in the FCW piece applies to the NWS system at the IBM facility in Gaithersburg, MD, not all of NOAA’s computational capacity.

    Things Will Get Busier…


    Apologies for the dearth of timely updates recently. As many readers familiar with the congressional calendar are aware, Congress disappears for the entire month of August so that members can find their way back to their home districts, partake in a few county fairs and local parades, and generally get a longer-than-usual glimpse of how people outside the Beltway actually live. Consequently, you can see the tumbleweeds blow through the streets of DC until about Labor Day.
    Now that Congress is back in town and focused on confirming a Chief Justice, dealing with the aftermath of Katrina, and finishing all the must-pass appropriations bills — ideally before the end of the fiscal year on Sept 30th (they’ve finished just 2 of 12) — things are already heating up quickly, so expect this space to get a bit busier as well.
    For example, three events worthy of note are scheduled for this Thursday (September 15th). First, at 10 am, the House Science Committee will revisit federal support for cyber security R&D in a hearing that will focus on the risk cyber vulnerabilities pose to critical industries in the U.S. and what the federal government can do to help. Scheduled to testify are:

  • Mr. Donald “Andy” Purdy, Acting Director, National Cyber Security Division, Department of Homeland Security;
  • Mr. John Leggate, Chief Information Officer, British Petroleum Inc.;
  • Mr. David Kepler, Corporate Vice President, Shared Services, and Chief Information Officer, The Dow Chemical Company;
  • Mr. Andrew Geisse, Chief Information Officer, SBC Services Inc.; and
  • Mr. Gerald Freese, Director, Enterprise Information Security, American Electric Power.
    Presumably, the committee hopes to hear from the industry representatives how significant the cyber threat is to their industries and what the Department of Homeland Security is doing about it. Hopefully, the committee and the industry witnesses will press DHS about its minimal efforts to engage in long-range research to counter the threats. The hearing, like all Science Committee hearings, will be webcast live (10 am to noon) and archived on the Science Committee website.
    Also on Thursday are two policy lunches on Capitol Hill relevant to federal support for R&D. The Forum on Technology and Innovation, an offshoot of the Council on Competitiveness and co-chaired by Sen. John Ensign (R-NV) and Sen. Blanche Lincoln (D-AR), will hold a policy briefing on “Basic Research — The Foundation of the Innovation Economy.” Scheduled to speak are George Scalise, president of the Semiconductor Industry Association; Carl A. Batt, Director of the Cornell University/Ludwig Institute for Cancer Research Partnership; and Brian Halla, Chairman of the Board and CEO of National Semiconductor. The event is scheduled from 12:30 pm – 2:00 pm, in the Senate Hart building, room 209. Readers in DC can register to attend here. It looks like the forum archives video of their events, so those unable to attend might want to check afterwards for the video stream.
    Over on the House side, unfortunately at exactly the same time, is a briefing put on by the House R&D Caucus (CRA is a member of the advisory committee for the caucus) focused on the R&D tax credit. The event is sponsored by the R&D Credit Coalition, which is chock full of industry representatives. From the invite:

    Microwaves, laptops, car airbags, life-saving medical technologies and even your MP3 player have one thing in common.
    U.S.-based research helped create these innovative products. Research makes our lives better.

    Come learn how we can encourage U.S.-based research through the strengthening and extension of the R&D Credit. See real examples of how research continues to improve America.

    The briefing will be in 2325 Rayburn House Office Building, from noon – 1:30 pm. DC-area folks wishing to attend can find the RSVP info here (pdf). Apparently attendees can also sign up to drive “the latest hydrogen fuel cell cars,” which could be fun.
    The presence of so many U.S. manufacturers and companies on the panels and sponsor-cards for the briefings should add a little heft to the message of both events. I only wish that they hadn’t been scheduled for almost exactly the same time….

    Bay Area Industry, University, and Lab Group Urges Increased Fundamental IT Research


    In a letter (pdf) to John Marburger, Director of the White House Office of Science and Technology Policy, the Bay Area Science and Innovation Consortium — a group that includes representatives from IBM, HP, SIA, Lockheed-Martin, and Bay Area universities and federal labs — urged the Administration to address concerns about federal support for fundamental research in IT. The letter makes a case that should be very familiar to readers of this blog — namely, that “at a time when the U.S. faces enormous challenges to its scientific and technological leadership, U.S. policy is headed in the wrong direction.”

    For example, the Defense Advanced Research Projects Agency is reducing university participation by: (1) classifying research, even in broad, enabling areas such as embedded software for wireless networks; (2) focusing more on shorter-term deliverables, and dramatically reducing its traditional levels of investment in high-risk, high-return research; and (3) evaluating success of projects on one-year time-scales. Between 1999 and 2004, DARPA’s research funding at the top-ranked computer science departments (Berkeley, Carnegie Mellon University, MIT, and Stanford) declined by 38-54 percent. These trends are not limited to IT research, but are evident in a broad range of fields.

    In fact, beyond just the top schools, the overall DARPA investment in university-led IT research has declined precipitously since FY 2001, falling from $199 million to $108 million in FY 2004 (in constant dollars).
    The letter goes on to point out the burden placed on NSF as a result of DARPA’s “retrenchment,” noting the precipitous fall in proposal success rates and the impact that has on the peer-review process — it becomes more conservative, resulting in proposals that tend not to be as high-risk and potentially high-return as those we need to be supporting to keep the U.S. at the cutting edge of technological innovation.
    BASIC makes two specific recommendations:

    1. DARPA should be given a clear mandate to dramatically increase its support of high-risk, unclassified, university-based research.

    2. The National Science Foundation should be given additional funding in the Administration’s FY 2007 budget for a “Pioneer Award” for IT research.

    These ~$500k awards would be for “individual scientists of exceptional creativity who propose pioneering approaches to major contemporary challenges.” The coalition urges an immediate funding increase for NSF to fund at least 25-50 of these pioneer awards, with an eventual “steady state” of 100-150 awards.
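
    For a rough sense of scale, here’s a small sketch of what that recommendation would cost annually at the award size and counts given in the letter; treating ~$500K as the per-award annual cost is my assumption for illustration, since the letter may structure the awards differently.

```python
# Rough annual cost of the proposed NSF "Pioneer Awards" at the award size
# and counts given in the BASIC letter. Treating ~$500K as an annual
# per-award cost is an assumption for illustration.

AWARD_SIZE_M = 0.5  # ~$500K per award, in $ millions

for label, low, high in [("Initial (25-50 awards)", 25, 50),
                         ("Steady state (100-150 awards)", 100, 150)]:
    print(f"{label}: ${low * AWARD_SIZE_M:.1f}M to ${high * AWARD_SIZE_M:.1f}M per year")
```
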
    It’s an interesting approach, and it makes essentially the same case we’ve been making about IT research — and that many other groups have been making about the physical sciences and engineering generally. But the more groups that make this case — especially groups with significant industry membership like BASIC, the Task Force on the Future of American Innovation, the Council on Competitiveness, the American Electronics Association, the Telecommunications Industry Association, the Business Roundtable, and many others — the harder it is for the Administration to ignore the message.
    You can read the full letter here (pdf).

    NSF’s New Networking Initiative in the News


    Last Thursday, NSF’s Computer and Information Science and Engineering directorate (CISE) officially unveiled their Global Environment for Networking Investigations (GENI) initiative, a program designed to “advance significantly the capabilities provided by networking and distributed systems.” As NSF points out in their fact sheet covering the program:

    The GENI Research Program will build on many years of knowledge and experience, encouraging researchers and designers to: reexamine all networking assumptions; reinvent where needed; design for intended capabilities; deploy and validate architectures; build new services and applications; encourage users to participate in experimentation; and take a system-wide approach to the synthesis of new architectures.

    The unveiling of the initiative did not go unnoticed in the press. Wired ran with the story on Friday, quoting CRA board member Jen Rexford and UCLA’s Len Kleinrock. Federal Computer Week also had coverage Friday. And today, the New York Times’ John Markoff takes a look.
    The program has the goal of supporting both a research program and a new “global experimental test facility” — all for an estimated $300 million. That’s a very ambitious budget number in the current environment. But making progress on the challenges posed — how do you design new networking and distributed system architectures that build in security, protect privacy, are robust and easy to use? — could make that $300 million seem like one of the better investments taxpayers have made. As Bob Kahn pointed out in his interview with C-Span last week, the original investment in the research behind what would become the Internet turned out to be a pretty good deal….
    In any case, we’ll follow the progress of the initiative as it moves forward. Any “new start” of this magnitude will require substantial effort and support from the community to demonstrate to policymakers the need addressed and the opportunity presented by the new program. And we’ll be right there.

    Wall Street Journal on H-1B Visas


    The Wall Street Journal editorial page leads today (subscription required) by arguing that Congress should lift the cap on H-1B visas and let the market dictate skilled-labor immigration policy. Let’s see how much I can quote and claim a fair use exemption:

    [The H1-B visa cap means that] any number of fields dependent on high-skilled labor could be facing worker shortages: science, medicine, engineering, computer programming. It also means that tens of thousands of foreigners — who’ve graduated from U.S. universities and applied for the visas to stay here and work for American firms — will be shipped home to start companies or work for our global competitors.

    Congress sets the H-1B cap and could lift it as it has done in the past for short periods. Typically, however, that’s a years-long political process and cold comfort to companies that in the near term may be forced to look outside the U.S. to hire. Rather than trying to guess the number of foreign workers our economy needs year-to-year, Congress would be better off removing the cap altogether and letting the market decide.

    Contrary to the assertions of many opponents of immigration, from Capitol Hill to CNN, the size of our foreign workforce is mainly determined by supply and demand, not Benedict Arnold CEOs or a corporate quest for “cheap” labor. As the nearby table shows, since the H-1B quota was first enacted in 1992 there have been several years amid a soft economy in which it hasn’t been filled. When U.S. companies can find domestic workers to fill jobs, they prefer to hire them.

    And let’s not forget that these immigrant professionals create jobs, as the founders of Intel, Google, Sun Microsystems, Oracle, Computer Associates, Yahoo and numerous other successful ventures can attest. The Public Policy Institute of California did a survey of immigrants to Silicon Valley in 2002 and found that 52% of “foreign-born scientists and engineers have been involved in founding or running a start-up company either full-time or part-time.”

    They also include a handy (and condescending) table of H-1B visa figures, the “nearby table” referenced in the excerpt above.

    The August void has been filled, to some degree, by discussion about immigration of skilled and unskilled foreign workers; among other things, the governors of Arizona and New Mexico have declared “states of emergency” along their borders and a debate in Herndon, Virginia over the establishment of a day laborer gathering site has brought immigration into the spotlight in the Washington newspapers and has spilled over into the Virginia gubernatorial race.

    So if there is a coming national debate about immigration of both skilled and unskilled workers, the computing research community has to be ready to voice our side and claim a seat at the table.
