Computing Research Policy Blog

Things Will Get Busier…


Apologies for the dearth of timely updates recently. As many readers familiar with the congressional calendar are aware, Congress disappears for the entire month of August so that members can find their way back to their home districts, partake in a few county fairs and local parades, and generally get a longer-than-usual glimpse of how people outside the Beltway actually live. Consequently, you can see the tumbleweeds blow through the streets of DC until about Labor Day.
Now that Congress is back in town and focused on confirming a Chief Justice, dealing with the aftermath of Katrina, and finishing all the must-pass appropriations bills — ideally before the end of the fiscal year on Sept 30th (they’ve finished just 2 of 12) — things are already heating up quickly, so expect this space to get a bit busier as well.
For example, three events worthy of note are scheduled for this Thursday (September 15th). First, at 10 am, the House Science Committee will revisit federal support for cyber security R&D in a hearing that will focus on the risk cyber vulnerabilities pose to critical industries in the U.S. and what the federal government can do to help. Scheduled to testify are:

  • Mr. Donald “Andy” Purdy, Acting Director, National Cyber Security Division, Department of Homeland Security;
  • Mr. John Leggate, Chief Information Officer, British Petroleum Inc.;
  • Mr. David Kepler, Corporate Vice President, Shared Services, and Chief Information Officer, The Dow Chemical Company;
  • Mr. Andrew Geisse, Chief Information Officer, SBC Services Inc.; and
  • Mr. Gerald Freese, Director, Enterprise Information Security, American Electric Power.

Presumably, the committee hopes to hear from the industry representatives how significant the cyber threat is to their industries and what the Department of Homeland Security is doing about it. Hopefully, the committee and the industry witnesses will press DHS about its minimal efforts to engage in long-range research to counter the threats. The hearing, like all Science Committee hearings, will be webcast live (10 am to noon) and archived on the Science Committee website.
    Also on Thursday are two policy lunches on Capitol Hill relevant to federal support for R&D. The Forum on Technology and Innovation, an offshoot of the Council on Competitiveness and co-chaired by Sen. John Ensign (R-NV) and Sen. Blanche Lincoln (D-AR), will hold a policy briefing on “Basic Research — The Foundation of the Innovation Economy.” Scheduled to speak are George Scalise, president of the Semiconductor Industry Association; Carl A. Batt, Director of the Cornell University/Ludwig Institute for Cancer Research Partnership; and Brian Halla, Chairman of the Board and CEO of National Semiconductor. The event is scheduled from 12:30 pm – 2:00 pm, in the Senate Hart building, room 209. Readers in DC can register to attend here. It looks like the forum archives video of their events, so those unable to attend might want to check afterwards for the video stream.
    Over on the House side, unfortunately at exactly the same time, is a briefing put on by the House R&D Caucus (CRA is a member of the advisory committee for the caucus) focused on the R&D tax credit. The event is sponsored by the R&D Credit Coalition, which is chock full of industry representatives. From the invite:

    Microwaves, laptops, car airbags, life-saving medical technologies and even your MP3 player have one thing in common.
    U.S.-based research helped create these innovative products. Research makes our lives better.

    Come learn how we can encourage U.S.-based research through the strengthening and extension of the R&D Credit. See real examples of how research continues to improve America.

The briefing will be in 2325 Rayburn House Office Building, from noon – 1:30 pm. DC-area folks wishing to attend can find the RSVP info here (pdf). Apparently attendees can also sign up to drive “the latest hydrogen fuel cell cars,” which could be fun.
    The presence of so many U.S. manufacturers and companies on the panels and sponsor-cards for the briefings should add a little heft to the message of both events. I only wish that they hadn’t been scheduled for almost exactly the same time….

    Bay Area Industry, University, and Lab Group Urges Increased Fundamental IT Research


In a letter (pdf) to John Marburger, Director of the White House Office of Science and Technology Policy, the Bay Area Science and Innovation Consortium — a group that includes representatives from IBM, HP, SIA, Lockheed-Martin, and from Bay Area universities and federal labs — urged the Administration to address concerns about federal support for fundamental research in IT. The letter makes a case that should be very familiar to readers of this blog — namely, that “at a time when the U.S. faces enormous challenges to its scientific and technological leadership, U.S. policy is headed in the wrong direction.”

    For example, the Defense Advanced Research Projects Agency is reducing university participation by: (1) classifying research, even in broad, enabling areas such as embedded software for wireless networks; (2) focusing more on shorter-term deliverables, and dramatically reducing its traditional levels of investment in high-risk, high-return research; and (3) evaluating success of projects on one-year time-scales. Between 1999 and 2004, DARPA’s research funding at the top-ranked computer science departments (Berkeley, Carnegie Mellon University, MIT, and Stanford) declined by 38-54 percent. These trends are not limited to IT research, but are evident in a broad range of fields.

    In fact, beyond just the top schools, the overall DARPA investment in university-led IT research has declined precipitously since FY 2001, falling from $199 million to $108 million in FY 2004 (in constant dollars).
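To put that drop in percentage terms, here is a quick back-of-envelope check (the dollar figures are the letter's; the arithmetic is ours, not part of the letter):

```python
# Back-of-envelope check of the decline cited in the BASIC letter.
# Figures are the letter's (constant dollars, in $ millions).
fy2001 = 199.0  # DARPA university-led IT research funding, FY 2001
fy2004 = 108.0  # same category, FY 2004

decline_pct = (fy2001 - fy2004) / fy2001 * 100
print(f"Decline: {decline_pct:.0f}%")  # roughly a 46% drop
```

That overall figure sits right in the middle of the 38-54 percent range the letter reports for the top-ranked departments.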
The letter goes on to point out the burden placed on NSF as a result of DARPA’s “retrenchment,” noting the sharp fall in proposal success rates and the impact that has on the peer-review process: reviewing becomes more conservative, and funded proposals tend not to be as high-risk and potentially high-return as we need them to be to keep the U.S. at the cutting edge of technological innovation.
    BASIC makes two specific recommendations:

    1. DARPA should be given a clear mandate to dramatically increase its support of high-risk, unclassified, university-based research.

    2. The National Science Foundation should be given additional funding in the Administration’s FY 2007 budget for a “Pioneer Award” for IT research.

    These ~$500k awards would be for “individual scientists of exceptional creativity who propose pioneering approaches to major contemporary challenges.” The coalition urges an immediate funding increase for NSF to fund at least 25-50 of these pioneer awards, with an eventual “steady state” of 100-150 awards.
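For a rough sense of the dollars involved, the award counts below are the coalition's and the multiplication is just back-of-envelope (assuming “~$500k” means $500,000 per award):

```python
# Rough cost of the proposed NSF "Pioneer Award" program, using the
# letter's figures: ~$500k per award, 25-50 awards initially, and
# 100-150 awards at an eventual "steady state".
award = 0.5  # $ millions per award (assumed from "~$500k")

initial_low, initial_high = 25 * award, 50 * award
steady_low, steady_high = 100 * award, 150 * award

print(f"Initial increase: ${initial_low:.1f}M-${initial_high:.1f}M")
print(f"Steady state: ${steady_low:.1f}M-${steady_high:.1f}M")
```

In other words, the immediate ask is on the order of $12.5-25 million, a small fraction of NSF's budget, with the steady-state program in the $50-75 million range.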
It’s an interesting approach, and it makes essentially the same case we’ve been making about IT research — and many other groups have been making about the physical sciences and engineering generally. But the more groups that make this case — especially groups with significant industry membership like BASIC, the Task Force on the Future of American Innovation, the Council on Competitiveness, the American Electronics Association, the Telecommunications Industry Association, and the Business Roundtable, among many others — the harder it is for the Administration to ignore the message.
    You can read the full letter here (pdf).

    NSF’s New Networking Initiative in the News


Last Thursday, NSF’s Computer and Information Science and Engineering directorate (CISE) officially unveiled its Global Environment for Networking Investigations (GENI) initiative, a program designed to “advance significantly the capabilities provided by networking and distributed systems.” As NSF points out in its fact sheet covering the program:

    The GENI Research Program will build on many years of knowledge and experience, encouraging researchers and designers to: reexamine all networking assumptions; reinvent where needed; design for intended capabilities; deploy and validate architectures; build new services and applications; encourage users to participate in experimentation; and take a system-wide approach to the synthesis of new architectures.

    The unveiling of the initiative did not go unnoticed in the press. Wired ran with the story on Friday, quoting CRA board member Jen Rexford and UCLA’s Len Kleinrock. Federal Computer Week also had coverage Friday. And today, the New York Times’ John Markoff takes a look.
    The program has the goal of supporting both a research program and a new “global experimental test facility” — all for an estimated $300 million. That’s a very ambitious budget number in the current environment. But making progress on the challenges posed — how do you design new networking and distributed system architectures that build in security, protect privacy, are robust and easy to use? — could make that $300 million seem like one of the better investments taxpayers have made. As Bob Kahn pointed out in his interview with C-Span last week, the original investment in the research behind what would become the Internet turned out to be a pretty good deal….
    In any case, we’ll follow the progress with the initiative as it moves forward. Any “new start” of this magnitude will require substantial effort and support from the community to demonstrate to policymakers the need addressed and opportunity presented by the new program. And we’ll be right there.

Wall Street Journal on H-1B Visas


The Wall Street Journal editorial page leads today (subscription required) by arguing that Congress should lift the cap on H-1B visas and that the market should dictate skilled labor immigration policy. Let’s see how much I can quote and claim a fair use exemption:

    [The H1-B visa cap means that] any number of fields dependent on high-skilled labor could be facing worker shortages: science, medicine, engineering, computer programming. It also means that tens of thousands of foreigners — who’ve graduated from U.S. universities and applied for the visas to stay here and work for American firms — will be shipped home to start companies or work for our global competitors.

    Congress sets the H-1B cap and could lift it as it has done in the past for short periods. Typically, however, that’s a years-long political process and cold comfort to companies that in the near term may be forced to look outside the U.S. to hire. Rather than trying to guess the number of foreign workers our economy needs year-to-year, Congress would be better off removing the cap altogether and letting the market decide.

    Contrary to the assertions of many opponents of immigration, from Capitol Hill to CNN, the size of our foreign workforce is mainly determined by supply and demand, not Benedict Arnold CEOs or a corporate quest for “cheap” labor. As the nearby table shows, since the H-1B quota was first enacted in 1992 there have been several years amid a soft economy in which it hasn’t been filled. When U.S. companies can find domestic workers to fill jobs, they prefer to hire them.

    And let’s not forget that these immigrant professionals create jobs, as the founders of Intel, Google, Sun Microsystems, Oracle, Computer Associates, Yahoo and numerous other successful ventures can attest. The Public Policy Institute of California did a survey of immigrants to Silicon Valley in 2002 and found that 52% of “foreign-born scientists and engineers have been involved in founding or running a start-up company either full-time or part-time.”

They also include a handy and condescending guide to H-1B visa figures.

    The August void has been filled, to some degree, by discussion about immigration of skilled and unskilled foreign workers; among other things, the governors of Arizona and New Mexico have declared “states of emergency” along their borders and a debate in Herndon, Virginia over the establishment of a day laborer gathering site has brought immigration into the spotlight in the Washington newspapers and has spilled over into the Virginia gubernatorial race.

    So if there is a coming national debate about immigration of both skilled and unskilled workers, the computing research community has to be ready to voice our side and claim a seat at the table.

    NY Times on the “new” Computer Science Majors


The New York Times has a great piece today by reporter Steve Lohr on computer science majors — what they do (“It’s so not programming,” one says), what the job market for their skills is like (pretty strong), and what some schools are doing to get the message out.

    On campuses today, the newest technologists have to become renaissance geeks. They have to understand computing, but they also typically need deep knowledge of some other field, from biology to business, Wall Street to Hollywood. And they tend to focus less on the tools of technology than on how technology is used in the search for scientific breakthroughs, the development of new products and services, or the way work is done.

    Edward D. Lazowska, a professor at the University of Washington, points to students like Mr. Michelson [who is going to medical school at Columbia after earning a computer science degree at Washington] as computer science success stories. The real value of the discipline, Mr. Lazowska said, is less in acquiring a skill with technology tools – the usual definition of computer literacy – than in teaching students to manage complexity; to navigate and assess information; to master modeling and abstraction; and to think analytically in terms of algorithms, or step-by-step procedures.

    The piece would be a great read even without the quotes from CRA’s Government Affairs co-Chair Lazowska and current board Chair Dan Reed. And it’s a good antidote to the more dour pieces we’ve seen recently about the future of the field.
    Give it a read: A Techie, Absolutely, and More

    NY Times on Supercomputing Arms Race


The New York Times’ John Markoff, who launched much of the media and congressional attention on computer science this year with his April 2005 piece “Pentagon Redirects Its Research Dollars,” is still on the computing beat. His most recent is today’s “A New Arms Race to Build the World’s Mightiest Computer.” Here’s a sample:

    A global race is under way to reach the next milestone in supercomputer performance, many times the speed of today’s most powerful machines.
    And beyond the customary rivalry in the field between the United States and Japan, there is a new entrant – China – eager to showcase its arrival as an economic powerhouse.
    The new supercomputers will not be in use until the end of the decade at the earliest, but they are increasingly being viewed as crucial investments for progress in science, advanced technologies and national security.

    The article highlights the recent announcements of long-term commitments by a number of countries to fund efforts to develop petaflop-scale computing systems. France, China and Japan have all initiated multi-year investments in programs designed to produce petaflop machines in the next decade. While support for supercomputing research and development here in the U.S. continues to “remain a priority” in the Administration’s plans, our commitment to long-term support for the development of these leadership class machines isn’t as stellar as it could be. PITAC’s June 2005 report on the state of computational science in the U.S. put it a bit more bluntly:

    Yet, despite the great opportunities and needs, universities and the Federal government have not effectively recognized the strategic significance of computational science in either their organizational structures or their research and educational planning. These inadequacies compromise U.S. scientific leadership, economic competitiveness, and national security.

    As the Council on Competitiveness is fond of noting, in order to compete in the global economy, you must be able to out-compute your rivals. The U.S. needs to ensure that it maintains a commitment to the long-term R&D that will continue to “prime the pump” for the innovations in high-end computing that will allow us to keep pace with our international competitors. Adopting PITAC’s recommendations (pdf) would be a good place to start.

    Bob Kahn Talks to C-Span


An interesting interview with Turing Award co-winner (and CRA board member) Robert Kahn by C-Span’s Brian Lamb ran yesterday, covering everything from the birth of the Internet to his role at DARPA to whether he was a geek in high school and college. As is usually the case with C-Span programs, it’s pretty in-depth and worth watching. It’s viewable online, and there’s a written transcript as well.
    Here’s a snippet:

    LAMB: Today, or even in history, how much has the taxpayer, through the government, paid for, do you think, to create this Internet?
    KAHN: You know, I think, I don’t know the exact numbers and there may be no way to know the exact numbers, but I bet it’s the biggest bargain that the American taxpayer and the economy has ever had.
    In fact, I remember in the late 1990s when the Clinton administration was riding a big economic boom, they had come out with some numbers that said one-third of all the growth in the economy was due to Internet-related activities of one sort or another.
    I remember that when we built ARPANET, the very first of the networks, the actual money that was spent on the network piece of it was a few millions of dollars. I don’t have the exact number, but it was less than 10 million.
    And if you took into account the amount of money that was spent on the research community to help them get their computers up and develop applications, maybe over its lifecycle a few tens of millions, that would be my guess, I don’t have the exact numbers, and maybe they are not findable anymore, but it was a number like that back in the early ’70s.
    If you were to look at all the other monies that were spent in other agencies of the government, the Department of Energy had a major program, NASA had a major program in networking. Of course, you have all the National Science Foundation expenditures, you know, where money is spent on building other kinds of nets. I mentioned the satellite and radio net.
But, you know, if you compare that with what private industry is putting in even in one year today, private industry contributions dwarf everything that the federal government probably put in over its lifetime.
    And so that has got to be one of the biggest or most successful investments that has ever been made.

(“Mad props” to Tom Jones for the heads-up!)

    MSNBC Highlights NCWIT and Computing’s Image Problem


A nice follow-up to last week’s post on the “science gap” and some of the ways the computing community is dealing with its “image problem” can be found today over at MSNBC, in a piece focusing on the new National Center for Women and Information Technology (CRA and CRA-W form one “hub” of NCWIT — other hubs include the Anita Borg Institute for Women and Technology, ACM, The Colorado Coalition for Gender and IT, Georgia Tech, The Girl Scouts of the USA, and The University of California). The piece is called “Fewer women find their way into tech” and here’s a tease:

    The number of women considering careers in information technology has dropped to its lowest level since the mid-1970s — and one local nonprofit organization intends to do something about it.
    Based at the University of Colorado in Boulder, the National Center for Women and Information Technology (NCWIT) wants to know why women are losing interest in technology — and what can be done to bring them back.

    Read the whole thing.

    Thoughts on the “Science Gap” and the Appeal of Computing


    The Washington Post’s Politics Columnist (and resident contrarian) Robert Samuelson has an interesting Op-Ed in yesterday’s edition dealing with the fact that the U.S. is producing “a shrinking share of the world’s technological talent.” After noting that there’s a pay disparity between science and engineering PhDs and other “elites” like MBAs, doctors and lawyers that probably leads to the production disparity, Samuelson rightly points out that the simple fact that other countries are producing more S&E PhDs doesn’t mean that we necessarily lose.

    Not every new Chinese or Indian engineer and scientist threatens an American, through outsourcing or some other channel. Actually, most don’t. As countries become richer, they need more scientists and engineers simply to make their societies work: to design bridges and buildings, to maintain communications systems, and to test products. This is a natural process. The U.S. share of the world’s technology workforce has declined for decades and will continue to do so. By itself, this is not dangerous.
    The dangers arise when other countries use new technologies to erode America’s advantage in weaponry; that obviously is an issue with China. We are also threatened if other countries skew their economic policies to attract an unnatural share of strategic industries — electronics, biotechnology and aerospace, among others. That is an issue with China, some other Asian countries and Europe (Airbus).
    What’s crucial is sustaining our technological vitality. Despite the pay, America seems to have ample scientists and engineers. But half or more of new scientific and engineering PhDs are immigrants; we need to remain open to foreign-born talent. We need to maintain spectacular rewards for companies that succeed in commercializing new products and technologies. The prospect of a big payoff compensates for mediocre pay and fuels ambition. Finally, we must scour the world for good ideas. No country ever had a monopoly on new knowledge, and none ever will.

Putting aside the fact that Samuelson apparently unwittingly puts his finger on the need for producing more US-born and naturalized S&E PhDs — after all, given current agency practices, they are essentially the only ones able to do the defense-related research that will preserve “America’s advantage in weaponry” — he’s generally right on. The simple fact that other countries are producing S&E PhDs at rates higher than U.S. production isn’t the worry. The worry is when America’s global competition uses that newly-developed capacity for innovation and technological achievement to target sectors traditionally important to America’s strategic industries. IT is one such crucial sector.
As Samuelson points out, one way to ensure the U.S. remains dominant, especially in a sector like IT, is to make sure the U.S. continues to attract the best minds in the world to come study and work here. Unfortunately, as we’ve noted frequently over the last couple of years, the environment for foreign students in the U.S. is not nearly as welcoming as it once was.
Another is to nurture and grow our own domestically-produced talent in the discipline. But the challenges here are also tall. The most recent issue of the Communications of the ACM contains a very interesting (and on point) piece (pdf) about whether the computing community in the U.S. needs to do a better job of evangelizing what’s truly exciting about the discipline to combat dropping enrollment rates and dropping interest in computing. The piece, by Sanjeev Arora and Bernard Chazelle (thanks to Lance Fortnow for pointing it out on his excellent Computational Complexity blog), identifies the challenge:

Part of the problem is the lack of consensus in the public at large on what computer science actually is. The Advanced Placement test is mostly about Java, which hurts the field by reducing it to programming. High school students know that the wild, exotic beasts of physics (black holes, antimatter, Big Bang) all roam the land of a deep science. But who among them is even aware that the Internet and Google also arose from an underlying science? Their list of computing “Greats” probably begins with Bill Gates and ends with Steve Jobs.

We feel that computer science has a compelling story to tell, which goes far beyond spreadsheets, Java applets, and the joy of mouse clicking (or even Artificial Intelligence and robots). Universality, the duality between program and data, abstraction, recursion, tractability, virtualization, and fault tolerance are among its basic principles. No one would dispute that the very idea of computing is one of the greatest scientific and technological discoveries of the 20th century. Not only has it had huge societal and commercial impact but its conceptual significance is increasingly being felt in other sciences. Computer science is a new way of thinking.

A recent study by the Pew Internet Project demonstrates that American teenagers are tied to computing technology: 89 percent send or read e-mail; 84 percent visit websites about TV, music or sport stars; 81 percent play online games; 76 percent read online news; 75 percent send or receive instant messages. Yet that increasing use of technology doesn’t appear to make them any more interested in studying the science behind the technology. Maybe that’s not surprising — the fact that most teenagers probably have access to and use cars doesn’t appear to be swelling the ranks of automotive engineers. Maybe there’s a perception among bright teenagers that computing is a “solved” problem — or, as John Marburger, the President’s science advisor, put it at a hearing before the House Science Committee early in his tenure, maybe it’s a “mature” discipline now, perhaps not worthy of the priority placed on other, more “breakthrough” areas of study like nanotechnology. I think Arora and Chazelle do a good job of debunking that perception, demonstrating that computing is thick with challenges and rich science “indispensable to the nation,” enough to occupy bright minds for years to come.
    But the perception persists. Computing has an image problem. Fortunately, the computing community isn’t standing still in trying to address it (though maybe it’s only just stood up). At the Computing Leadership Summit convened by CRA last February, a large and diverse group of stakeholders — including all the major computing societies, representatives from PITAC, NSF and the National Academies, and industry reps from Google, HP, IBM, Lucent, Microsoft, Sun, TechNet and others (complete list and summary here (pdf)) — committed to addressing two key issues facing computing: the current concerns of research funding support and computing’s “image” problem. Task forces have been formed, chairmen named (Edward Lazowska of U of Washington heads the research funding task force; Rick Rashid of Microsoft heads the “image” task force), and the work is underway. As the summary of the summit demonstrates, no ideas or possible avenues are off the table…. We’ll report more on the effort as it moves forward.
    As Arora, Chazelle and Samuelson all point out, the challenges are tall, but the stakes for the country (never mind the discipline) are even higher.
