Computing Research Policy Blog

NY Times on the “new” Computer Science Majors


The New York Times has a great piece today by reporter Steve Lohr on computer science majors — what they do (“It’s so not programming,” one says), what the job market for their skills is like (pretty strong), and what some schools are doing to get the message out.

On campuses today, the newest technologists have to become renaissance geeks. They have to understand computing, but they also typically need deep knowledge of some other field, from biology to business, Wall Street to Hollywood. And they tend to focus less on the tools of technology than on how technology is used in the search for scientific breakthroughs, the development of new products and services, or the way work is done.

Edward D. Lazowska, a professor at the University of Washington, points to students like Mr. Michelson [who is going to medical school at Columbia after earning a computer science degree at Washington] as computer science success stories. The real value of the discipline, Mr. Lazowska said, is less in acquiring a skill with technology tools – the usual definition of computer literacy – than in teaching students to manage complexity; to navigate and assess information; to master modeling and abstraction; and to think analytically in terms of algorithms, or step-by-step procedures.

The piece would be a great read even without the quotes from CRA’s Government Affairs co-Chair Lazowska and current board Chair Dan Reed. And it’s a good antidote to the more dour pieces we’ve seen recently about the future of the field.
Give it a read: “A Techie, Absolutely, and More”

NY Times on Supercomputing Arms Race


The New York Times’ John Markoff, who launched much of the media and congressional attention on computer science this year with his April 2005 piece “Pentagon Redirects Its Research Dollars,” is still on the computing beat. His most recent is today’s “A New Arms Race to Build the World’s Mightiest Computer.” Here’s a sample:

A global race is under way to reach the next milestone in supercomputer performance, many times the speed of today’s most powerful machines.
And beyond the customary rivalry in the field between the United States and Japan, there is a new entrant – China – eager to showcase its arrival as an economic powerhouse.
The new supercomputers will not be in use until the end of the decade at the earliest, but they are increasingly being viewed as crucial investments for progress in science, advanced technologies and national security.

The article highlights the recent announcements of long-term commitments by a number of countries to fund efforts to develop petaflop-scale computing systems. France, China and Japan have all initiated multi-year investments in programs designed to produce petaflop machines in the next decade. While support for supercomputing research and development here in the U.S. continues to “remain a priority” in the Administration’s plans, our commitment to long-term support for the development of these leadership-class machines isn’t as stellar as it could be. PITAC’s June 2005 report on the state of computational science in the U.S. put it a bit more bluntly:

Yet, despite the great opportunities and needs, universities and the Federal government have not effectively recognized the strategic significance of computational science in either their organizational structures or their research and educational planning. These inadequacies compromise U.S. scientific leadership, economic competitiveness, and national security.

As the Council on Competitiveness is fond of noting, in order to compete in the global economy, you must be able to out-compute your rivals. The U.S. needs to ensure that it maintains a commitment to the long-term R&D that will continue to “prime the pump” for the innovations in high-end computing that will allow us to keep pace with our international competitors. Adopting PITAC’s recommendations (pdf) would be a good place to start.

Bob Kahn Talks to C-SPAN


An interesting interview with Turing Award co-winner (and CRA board member) Robert Kahn by C-SPAN’s Brian Lamb ran yesterday, covering everything from the birth of the Internet to his role at DARPA to whether he was a geek in high school and college. As is usually the case with C-SPAN programs, it’s pretty in-depth and worth watching. It’s viewable online, and there’s a written transcript as well.
Here’s a snippet:

LAMB: Today, or even in history, how much has the taxpayer, through the government, paid for, do you think, to create this Internet?
KAHN: You know, I think, I don’t know the exact numbers and there may be no way to know the exact numbers, but I bet it’s the biggest bargain that the American taxpayer and the economy has ever had.
In fact, I remember in the late 1990s when the Clinton administration was riding a big economic boom, they had come out with some numbers that said one-third of all the growth in the economy was due to Internet-related activities of one sort or another.
I remember that when we built ARPANET, the very first of the networks, the actual money that was spent on the network piece of it was a few millions of dollars. I don’t have the exact number, but it was less than 10 million.
And if you took into account the amount of money that was spent on the research community to help them get their computers up and develop applications, maybe over its lifecycle a few tens of millions, that would be my guess, I don’t have the exact numbers, and maybe they are not findable anymore, but it was a number like that back in the early ’70s.
If you were to look at all the other monies that were spent in other agencies of the government, the Department of Energy had a major program, NASA had a major program in networking. Of course, you have all the National Science Foundation expenditures, you know, where money is spent on building other kinds of nets. I mentioned the satellite and radio net.
But, you know, if you compare that with what private industry is putting in even in one year today, private industry contributions dwarf everything that the federal government probably put in over its lifetime.
And so that has got to be one of the biggest or most successful investments that has ever been made.

(“Mad props” to Tom Jones for the heads-up!)

MSNBC Highlights NCWIT and Computing’s Image Problem


A nice follow-up to last week’s post on the “science gap” and some of the ways the computing community is dealing with its “image problem” can be found today over at MSNBC in a piece focusing on the new National Center for Women in IT (CRA and CRA-W form one “hub” of NCWIT — other hubs include the Anita Borg Institute for Women and Technology, ACM, The Colorado Coalition for Gender and IT, Georgia Tech, The Girl Scouts of the USA, and The University of California). The piece is called “Fewer women find their way into tech,” and here’s a tease:

The number of women considering careers in information technology has dropped to its lowest level since the mid-1970s — and one local nonprofit organization intends to do something about it.
Based at the University of Colorado in Boulder, the National Center for Women and Information Technology (NCWIT) wants to know why women are losing interest in technology — and what can be done to bring them back.

Read the whole thing.

Thoughts on the “Science Gap” and the Appeal of Computing


The Washington Post’s Politics Columnist (and resident contrarian) Robert Samuelson has an interesting Op-Ed in yesterday’s edition dealing with the fact that the U.S. is producing “a shrinking share of the world’s technological talent.” After noting that there’s a pay disparity between science and engineering PhDs and other “elites” like MBAs, doctors and lawyers that probably leads to the production disparity, Samuelson rightly points out that the simple fact that other countries are producing more S&E PhDs doesn’t mean that we necessarily lose.

Not every new Chinese or Indian engineer and scientist threatens an American, through outsourcing or some other channel. Actually, most don’t. As countries become richer, they need more scientists and engineers simply to make their societies work: to design bridges and buildings, to maintain communications systems, and to test products. This is a natural process. The U.S. share of the world’s technology workforce has declined for decades and will continue to do so. By itself, this is not dangerous.
The dangers arise when other countries use new technologies to erode America’s advantage in weaponry; that obviously is an issue with China. We are also threatened if other countries skew their economic policies to attract an unnatural share of strategic industries — electronics, biotechnology and aerospace, among others. That is an issue with China, some other Asian countries and Europe (Airbus).
What’s crucial is sustaining our technological vitality. Despite the pay, America seems to have ample scientists and engineers. But half or more of new scientific and engineering PhDs are immigrants; we need to remain open to foreign-born talent. We need to maintain spectacular rewards for companies that succeed in commercializing new products and technologies. The prospect of a big payoff compensates for mediocre pay and fuels ambition. Finally, we must scour the world for good ideas. No country ever had a monopoly on new knowledge, and none ever will.

Putting aside the fact that Samuelson apparently unwittingly puts his finger on the need for producing more US-born and naturalized S&E PhDs — after all, given current agency practices, they are essentially the only ones able to do the defense-related research that will preserve “America’s advantage in weaponry” — he’s generally right on. The simple fact that other countries are producing S&E PhDs at rates higher than U.S. production isn’t the worry. The worry is when America’s global competition uses that newly-developed capacity for innovation and technological achievement to target sectors traditionally important to America’s strategic industries. IT is one such crucial sector.
As Samuelson points out, one way to ensure the U.S. remains dominant, especially in a sector like IT, is to make sure the U.S. continues to attract the best minds in the world to come study and work here. Unfortunately, as we’ve noted frequently over the last couple of years, the environment for foreign students in the U.S. is not nearly as welcoming as it once was.
Another is to nurture and grow our own domestically-produced talent in the discipline. But the challenges here are also tall. The most recent issue of the Communications of the ACM contains a very interesting (and on-point) piece (pdf) about whether the computing community in the U.S. needs to do a better job of evangelizing what’s truly exciting about the discipline to combat dropping enrollment rates and dropping interest in computing. The piece by Sanjeev Arora and Bernard Chazelle (thanks to Lance Fortnow for pointing it out on his excellent Computational Complexity blog), identifies the challenge:

Part of the problem is the lack of consensus in the public at large on what computer science actually is. The Advanced Placement test is mostly about Java, which hurts the field by reducing it to programming. High school students know that the wild, exotic beasts of physics (black holes, antimatter, Big Bang) all roam the land of a deep science. But who among them are even aware that the Internet and Google also arose from an underlying science? Their list of computing “Greats” probably begins with Bill Gates and ends with Steve Jobs.

We feel that computer science has a compelling story to tell, which goes far beyond spreadsheets, Java applets, and the joy of mouse clicking (or even Artificial Intelligence and robots). Universality, the duality between program and data, abstraction, recursion, tractability, virtualization, and fault tolerance are among its basic principles. No one would dispute that the very idea of computing is one of the greatest scientific and technological discoveries of the 20th century. Not only has it had huge societal and commercial impact but its conceptual significance is increasingly being felt in other sciences. Computer science is a new way of thinking.
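For readers who want a concrete taste of what principles like these look like, here is a minimal sketch (my illustration, not from the Arora and Chazelle piece) of two of them in Python: recursion, and the duality between program and data. The nested tuple below is just data, yet a short recursive function executes it like a program.

```python
def evaluate(expr):
    """Recursively evaluate an expression tree: either a plain number,
    or a tuple of the form (operator, left, right)."""
    if isinstance(expr, (int, float)):
        return expr  # base case: a number evaluates to itself
    op, left, right = expr
    a = evaluate(left)   # recursive case: evaluate each subtree,
    b = evaluate(right)  # then combine the results
    if op == "+":
        return a + b
    if op == "*":
        return a * b
    raise ValueError(f"unknown operator: {op}")

# (1 + 2) * (3 + 4), represented purely as data:
program = ("*", ("+", 1, 2), ("+", 3, 4))
print(evaluate(program))  # prints 21
```

A dozen lines like these are, in miniature, what every interpreter and compiler does — which is one reason the quoted authors argue the field is about far more than any particular programming language.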

A recent study by the Pew Internet Project demonstrates that American teenagers are tied to computing technology: 89 percent send or read e-mail; 84 percent visit websites about TV, music or sports stars; 81 percent play online games; 76 percent read online news; 75 percent send or receive instant messages. Yet that increasing use of technology doesn’t appear to make them any more interested in studying the science behind the technology. Maybe that’s not surprising — the fact that most teenagers probably have access to and use cars doesn’t appear to be swelling the ranks of automotive engineers. Maybe there’s a perception among bright teenagers that computing is a “solved” problem — or as John Marburger, the President’s science advisor, put it at a hearing before the House Science Committee early in his tenure, maybe it’s a “mature” discipline now, perhaps not worthy of the priority placed on other more “breakthrough” areas of study like nanotechnology. I think Arora and Chazelle do a good job of debunking that perception, demonstrating that computing is thick with challenges and rich science “indispensable to the nation” to occupy bright minds for years to come.
But the perception persists. Computing has an image problem. Fortunately, the computing community isn’t standing still in trying to address it (though maybe it’s only just stood up). At the Computing Leadership Summit convened by CRA last February, a large and diverse group of stakeholders — including all the major computing societies, representatives from PITAC, NSF and the National Academies, and industry reps from Google, HP, IBM, Lucent, Microsoft, Sun, TechNet and others (complete list and summary here (pdf)) — committed to addressing two key issues facing computing: current concerns about research funding support, and computing’s “image” problem. Task forces have been formed, chairmen named (Edward Lazowska of U of Washington heads the research funding task force; Rick Rashid of Microsoft heads the “image” task force), and the work is underway. As the summary of the summit demonstrates, no ideas or possible avenues are off the table…. We’ll report more on the effort as it moves forward.
As Arora, Chazelle and Samuelson all point out, the challenges are tall, but the stakes for the country (never mind the discipline) are even higher.

…Or you sleep with the dropped packets


It will come as a surprise to no reader of this blog that gangs and organized crime have moved into cyberspace. And it will also come as no surprise that the media, legislative staff, and elected officials are usually a bit slow to grasp advances in technologies and their commensurate threats. (Let us not forget House Majority Leader Tom DeLay’s invective aimed at Justice Kennedy for the heinous practice of “[doing] his own research on the Internet.” Which of the many “Internets” it was, DeLay did not specify.)

The tech world has been abuzz for some time now over the role of organized crime and street gangs on the internet. Finally, after much pushing and prodding, it appears that the media may be paying attention.

Today’s New York Times includes an article entitled “The Rise of the Digital Thugs.”

Stealing and selling data has become so lucrative, analysts say, that corporate espionage, identity theft and software piracy have mushroomed as profit centers for criminal groups. Analysts say cyberextortion is the newest addition to the digital Mafia’s bag of tricks.

“Generally speaking, it’s pretty clear it’s on the upswing, but it’s hard to gauge how big of an upswing because in a lot of cases it seems companies are paying the money,” said Robert Richardson, editorial director of the Computer Security Institute, an organization in San Francisco that trains computer security professionals. “There’s definitely a group of virus writers and hackers in Russia and in the Eastern European bloc that the Russian mob has tapped into.”

[…]

Among 639 of the survey’s respondents, the average loss from unauthorized data access grew to $303,234 in 2004 from $51,545 in 2003; average losses from information theft rose to $355,552 from $168,529. The respondents suffered total losses in the two categories of about $62 million last year. While many cyberextortionists and cyberstalkers may be members of overseas crime groups, several recent prosecutions suggest that they can also be operating solo and hail from much less exotic climes – like the office building just down the street.

Additionally, a story in the March/April 2005 issue of Foreign Policy discusses the role of street gangs online and hints at their potential to bring gang-related financial dealings online. What starts as cybertagging will likely end up becoming something much worse as gangs increasingly become sophisticated business entities.

This is something that the community needs to address proactively in Congress and in the states. Cybercrime is being committed by organized enterprises here and abroad, and it costs businesses millions, if not billions, of dollars annually in lost revenue, protection money, theft, and damaged reputations.

Bereft of Life, PITAC Rests in Peace… but still garners attention


Gene Spafford passed on an article from VARBusiness which illustrates the technical media’s attention to PITAC even two months after its expiry. The article speaks glowingly of PITAC, which it describes as “a group of technology-industry luminaries and academics assembled to act as a council [sic] to the president, Congress, and the federal agencies that are involved in [NITRD].” Adjectives used in describing the committee and its work include “insightful,” “expert,” and “valuable.” The article quotes Harris Miller, president of the ITAA, at some length:


“It’s really disappointing,” says Harris Miller, president of the Information Technology Association of America…. “What you had was a group of leading people in the IT arena who came together to provide advice and thoughts on critical topics, and they’d really done some interesting and thoughtful work. It’s unfortunate.”

Harris, whose background falls on the public-policy side, speculates that some of the group’s recommendations may not have been taken well by the administration. Although he doesn’t know exactly why the group was dissolved, he says that, “If you want honest advice, you have to realize it’s sometimes not going to be praiseworthy.” And while the group might someday be reinstated, Harris says he hasn’t picked up on any indication that it will happen soon. “Obviously, the cybersecurity report had some pretty strong language about some shortcomings,” Harris says. “But it wasn’t like others weren’t saying the same things.”

The bigger point here is this: while PITAC may be dormant, it is still getting extremely favorable attention from the tech and mainstream media. In addition, the media seem to be inclined to believe that a major reason for PITAC’s current hibernation is its frank and well-founded criticisms of current policy. This is encouraging and, with sustained pressure, may mean that PITAC will someday return to doing its “insightful,” “expert,” “valuable” work.

University of Texas Hosts Computer Camp to Pique Girls’ Interest


Interesting article today in the University of Texas’ The Daily Texan about efforts at the school to encourage the participation of women in computer science. The school runs a one-week summer camp for junior and senior high school girls to expose them to the world of computer science, which, as the article points out, is heavily dominated by men. From the article:

First Bytes is not a “fat camp,” as some boys who saw the welcome signs in Jester had originally thought of the one-week UT summer camp for junior and senior high school girls that focuses on getting its attendees interested in computer science, a field heavily dominated by men.
The girls spent their week listening to math- and science-themed technical lectures and participating in interactive events. Non-computer-science fun was also added to the mix, including yoga classes, bowling and watching movies. “It’s not just about studying and being in school, but about being well-rounded,” said program coordinator Mary Esthel Middleton.
There are 1,175 computer science students at UT, only 147 of whom are women, according to statistics cited by the department.
The First Bytes program, currently in its third year, aims to help correct that problem, Middleton said. The purpose of First Bytes, she said, was to “dispel the myth that computer science is only for guys,” and to ensure the girls understand that math and science careers are beneficial, that they can and do apply to a wide range of fields, including medicine and business.

Kudos to corporate sponsors IBM and Microsoft for supporting efforts like this and the goal of increasing participation of women and minorities in computer science (including the efforts of groups like CRA’s Committee on the Status of Women in Computing Research (CRA-W)). The most recent data suggests that the popularity of computer science as a major among freshmen women is at an all-time low, so there is obviously much work to be done.
