This article is published in the November 2005 issue.

Computing Education and Careers: Perceptions, Myth, and Reality

Musings from the Chair

If one believes the popular press, computer science careers are going the way of the passenger pigeon and the woolly mammoth. Of course, we know better. First, we’ve “seen this movie before” as enrollments dipped in the 1980s, before skyrocketing again during the dot-com boom. Some degree of oscillation is inevitable in a field where the core technologies evolve so rapidly.

Second, computing really is the consummate liberal arts education for the 21st century. We live in an information-rich world, which is nonetheless facing complex problems with subtle dependencies that cross diverse domains of science, engineering, business, ethics, and social and public policy. In such an environment, flexible problem-solving skills, information management acumen, and the ability to interact effectively with experts in multiple disciplines define the quintessential computing expert.

The burgeoning need for individuals who combine deep computing skills with domain knowledge and the willingness to work collaboratively with interdisciplinary teams belies the popular belief that computer scientists are a vanishing species. The change is subtle but important. We have long known that relatively few of our graduates regularly write compilers, build operating systems, derive complexity results, or design new chips. Instead, most of them apply their computing skills to advance business, scientific, or policy goals.

In the recent President’s IT Advisory Committee (PITAC) report on computational science, Computational Science: Ensuring America’s Competitive Future,1 which I had the privilege to help write, we discussed the university organizational structures and knowledge silos that often make it difficult for us to train students to work in groups whose members are drawn from multiple disciplines. Bill Gates has made similar points in his tours of college campuses—we need more information technology workers, not fewer, but they must have the right knowledge base and competitive skills.

In light of the changing marketplace and shifting enrollments, perhaps it is time for us to rethink some of our long-held assumptions about the computing curriculum and its relevance. This is a delicate balance: overreaction to short-term trends is bad, but so are ossification and resistance to change.

Potential questions include:

  • What should every computing major know and how should this evolve? This is not merely a question about course contents and curricula, but rather a philosophical question about the core elements of computing education.
  • How do we increase the ability of our students to communicate and to work in multidisciplinary groups?
  • What are best practices in building collaborations with other disciplines, in both education and research?
  • How can we better publicize and market our successful collaborative engagements with other disciplines? Simply put, how do we increase the awareness of potential students that a computing education is a passport to rich intellectual engagement in the arts, humanities, science, engineering, business, and public policy?

The National Science Foundation will soon hold a series of invitation-only regional workshops on these and other issues related to computing education, “Integrative Computing Education and Research (ICER): Preparing IT Graduates for 2010 and Beyond.” I encourage all of us to think about these critical issues.
Dan Reed, CRA’s Board Chair, is the Chancellor’s Eminent Professor at the University of North Carolina at Chapel Hill and Director of the interdisciplinary Renaissance Computing Institute (RENCI).


1 This and other PITAC reports are available at
