This article appears in the September 2006 issue.

Argonne National Lab Celebrates 60th Anniversary


Argonne National Laboratory is a direct descendant of the Manhattan Project, where Enrico Fermi and his colleagues created the world’s first controlled nuclear chain reaction.

Chartered in 1946, Argonne National Laboratory celebrates its 60th anniversary as one of the largest U.S. Department of Energy research centers. Argonne has earned many honors and awards, and its researchers have included Nobel laureates Enrico Fermi, Maria Goeppert Mayer, and Alexei Abrikosov.

Argonne’s mission is to serve the Department of Energy (DOE) and national security by creating an environment where scientists can come together to build a brighter future. Its staff represents more than 60 nations and includes scientists from every scientific discipline. The laboratory seeks the best and brightest in their fields, and its more than 1,500 scientists and engineers work together in an atmosphere that is diverse, dynamic, and creative. Argonne’s continual innovation has led to more than 750 patents and the development of several spin-off organizations, facilitating the transfer and use of technology by industry for public benefit and economic growth.

The lab is situated on 1,500 wooded acres about 25 miles southwest of Chicago, enabling easy access to its facilities for the many outside researchers from industry, academia, and other government laboratories. These world-class facilities, including the Advanced Photon Source, the Center for Nanoscale Materials, the Intense Pulsed Neutron Source, and the Argonne Tandem-Linac Accelerator Facility, are visited by thousands every year, promoting the open exchange of ideas and collaboration.

Mathematics and Computer Science Division (MCS)

The basic mission of Argonne’s MCS Division is to increase scientific productivity by providing intellectual and technical leadership in the computing sciences. As early as the 1970s, Argonne spearheaded a series of software engineering projects that culminated in the release of EISPACK, LINPACK, FUNPACK, and MINPACK. Today, MCS researchers are continuing this tradition, with an added emphasis on portability and scalability.

Projects in the division range from algorithm development and software design in core areas such as optimization and automatic differentiation, to exploration of new technologies such as distributed (Grid) computing and bioinformatics, to numerical simulations in challenging areas such as magnetohydrodynamics. Thousands of researchers in academia and industry use MCS software in applications that include computational chemistry, protein structure, vortex dynamics, astrophysics, climate modeling, computational fluid dynamics, and reservoir simulation. Many of these activities involve collaboration or partnerships with universities, industry and other research institutions worldwide.

Four Thrusts

One major thrust of the division is applied mathematics and the incorporation of new numerical methods into portable, high-performance, open-source software. MCS researchers design robust optimization algorithms, multiscale solvers for linear and nonlinear systems, and automatic differentiation techniques for sensitivity analysis. These techniques are then incorporated into numerical toolkits that application scientists use to solve large-scale problems. For example, PETSc (Portable, Extensible Toolkit for Scientific Computation) is widely used in applications including transonic flow, vortex dynamics in high-temperature superconductors, parallelization of a 3-D magnetostatics code, and the study of compressible flows at low and transonic Mach numbers.
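The sketch below gives a flavor of how an application code calls such a toolkit. It is a minimal, generic PETSc example, not code from any of the applications above: it assembles a small 1-D Laplacian as a stand-in for a real problem and solves it with a Krylov method. It assumes a recent PETSc release; exact signatures (for example, KSPSetOperators) differ in older versions.

```c
/* Minimal PETSc sketch: build a tridiagonal system and solve Ax = b with KSP.
   Hypothetical stand-in for an application matrix; assumes a recent PETSc release. */
#include <petscksp.h>

int main(int argc, char **argv)
{
    Mat      A;                 /* system matrix */
    Vec      x, b;              /* solution and right-hand side */
    KSP      ksp;               /* Krylov solver context */
    PetscInt i, Istart, Iend, n = 100;

    PetscInitialize(&argc, &argv, NULL, NULL);

    /* Create and fill a 1-D Laplacian, each process filling its own rows. */
    MatCreate(PETSC_COMM_WORLD, &A);
    MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
    MatSetFromOptions(A);
    MatSetUp(A);
    MatGetOwnershipRange(A, &Istart, &Iend);
    for (i = Istart; i < Iend; i++) {
        if (i > 0)     MatSetValue(A, i, i - 1, -1.0, INSERT_VALUES);
        if (i < n - 1) MatSetValue(A, i, i + 1, -1.0, INSERT_VALUES);
        MatSetValue(A, i, i, 2.0, INSERT_VALUES);
    }
    MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

    /* Right-hand side of all ones; solution vector with the same layout. */
    VecCreate(PETSC_COMM_WORLD, &b);
    VecSetSizes(b, PETSC_DECIDE, n);
    VecSetFromOptions(b);
    VecDuplicate(b, &x);
    VecSet(b, 1.0);

    /* Solver and preconditioner are chosen at run time with options
       such as -ksp_type cg -pc_type jacobi. */
    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetOperators(ksp, A, A);
    KSPSetFromOptions(ksp);
    KSPSolve(ksp, b, x);

    KSPDestroy(&ksp);
    MatDestroy(&A);
    VecDestroy(&x);
    VecDestroy(&b);
    PetscFinalize();
    return 0;
}
```

The point of the design is visible even in this toy example: the application never commits to a particular solver or preconditioner in the source code, so the same program can be tuned for very different problems and machines at run time.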

MCS researchers also work to create new technology that makes application scientists more productive. For example, as part of the Common Component Architecture project, MCS is collaborating with computational chemists to ensure that their scientific software components are interoperable and reusable. MCS researchers are also exploring mathematical programs with equilibrium constraints, which arise, for example, in the modeling of electricity markets. A new technique developed at MCS now routinely solves problems that are orders of magnitude larger and more complex than previously possible, extending the scope of this important computational paradigm.
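In generic form, and using illustrative notation not tied to any particular MCS formulation, a mathematical program with equilibrium constraints couples an ordinary optimization problem with a complementarity condition that encodes the equilibrium:

```latex
% Generic MPEC: x are decision variables (e.g., a market operator's choices),
% y are the equilibrium variables, and the last line is the complementarity
% condition expressing the equilibrium. Illustrative notation only.
\begin{align*}
  \min_{x,\,y}\quad        & f(x, y) \\
  \text{subject to}\quad   & g(x, y) \le 0, \\
                           & 0 \le y \;\perp\; F(x, y) \ge 0 .
\end{align*}
```

Here the complementarity condition means that y >= 0, F(x, y) >= 0, and y^T F(x, y) = 0; this combinatorial structure is what makes such problems difficult to solve at large scale.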

System software is another key thrust in the MCS Division. The major challenge here is to develop the technologies needed to fully exploit parallel supercomputers. Research covers a broad spectrum—from parallel programming and performance visualization tools, to high-performance I/O, to operating systems and runtime systems software for data management on petascale computers. When the MPI standard for message passing was under development, MCS computer scientists developed an implementation, called MPICH, that enabled rapid acceptance of the new paradigm; MPICH2, the latest release of this software, has been widely adopted by major computer vendors and users and recently won an R&D 100 award. One of the most exciting new projects at MCS is ZeptoOS, a collaboration between Argonne and the University of Oregon to develop very efficient and customized Linux kernels for petascale architectures with 10,000 to 1 million CPUs.
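To make the message-passing model mentioned above concrete, here is a minimal, generic MPI program in C, a textbook-style sketch rather than code drawn from MPICH itself: each process reports its rank, and rank 0 sends an integer to rank 1.

```c
/* Simple illustration of the MPI message-passing model implemented by MPICH.
   Compile with an MPI wrapper compiler (e.g., mpicc) and launch with mpiexec. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size, token = 0;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* this process's ID */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* total number of processes */

    if (size > 1) {
        if (rank == 0) {
            token = 42;
            /* Send a single integer from rank 0 to rank 1. */
            MPI_Send(&token, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {
            MPI_Recv(&token, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            printf("rank 1 received %d from rank 0\n", token);
        }
    }

    printf("hello from rank %d of %d\n", rank, size);
    MPI_Finalize();
    return 0;
}
```

Because the program is written against the MPI standard rather than a particular machine, the same source runs unchanged on a laptop or a parallel supercomputer, which is precisely the portability that made MPICH's early availability so important to the standard's adoption.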

A third research thrust in MCS is distributed, or Grid, computing. Identified as one of the “ten technologies that will change the world” by MIT Technology Review in 2003, Grid computing seeks to facilitate scientific collaborations over long distances. MCS is attacking Grid computing from two angles. For group-to-group collaboration, MCS researchers developed the Access Grid. Designed to give participants the feeling of being in a single room, the Access Grid is a wall-sized tiled display with 10 million pixels, allowing geographically dispersed audiences to participate in workshops, college courses, seminars, and scientific projects. For example, the Access Grid has been used by doctors and specialists to examine patient scans simultaneously at multiple sites, enriching diagnostics and biomedical applications. For distance collaboration on a larger scale, MCS researchers have spearheaded the development of Grid middleware. The Globus Toolkit, now the de facto standard for Grid computing, enables international groups such as the GriPhyN project to share computer power, databases, and other online tools securely across corporate, institutional, and geographic boundaries.

The fourth thrust within the MCS Division is scientific simulation, a natural outgrowth and complement of other MCS work in advanced software and middleware. MCS has a growing bioinformatics group that is exploring high-throughput comparative and evolutionary analysis of genomes and metabolic networks. MCS bioinformaticists are working with researchers at the Pacific Northwest National Laboratory to find an organism that can clean up radioactive materials that have seeped into the ground under tanks at the Hanford site, which produced nuclear materials. The bioinformatics group also provides valuable resources for the National Microbial Pathogen Data Resource at the National Institutes of Health and the Microbial Genomes Program at the Department of Energy. A second simulation group at MCS focuses on climate modeling. A major accomplishment of this group is the development of the Model Coupling Toolkit, a set of open-source software tools for creating coupled models. The toolkit is being used by the Community Climate System Model, a state-of-the-art climate model developed by DOE and NSF and used by the U.S. Climate Change Science Program.

Preparing for the Petascale Revolution

Petascale computing systems promise new frontiers for research and computing applications. MCS’s driving goal is to carry out the research needed to make petaflop performance widespread as soon as the hardware is available. The scientific opportunities enabled by these advanced machines include large-scale simulation studies, such as materials dynamics with million- to billion-atom clusters; nuclear reactor core design and performance with 3-D geometry and coupled thermal hydraulics and neutronics; photon-nanocluster interactions for understanding catalysis and ultrafast reaction dynamics; protein/DNA complexes to understand DNA repair and gene regulation; the evolutionary history of protein families to aid in protein engineering; and modeling of the nucleosynthesis pathways of the heavy elements in nuclear astrophysics. Concurrently, MCS has launched an ambitious program to develop system software that can work effectively on petascale-sized systems, to devise generalized methods for scaling up broad classes of code, and to formulate performance models that will enable researchers to address scalability bottlenecks.

For example, MCS has teamed with IBM and Lawrence Livermore National Laboratory to drive the design and requirements for the next IBM Blue Gene, which is projected to be available in 2008. In addition, MCS established the Blue Gene Consortium. Now numbering approximately 50 members from national laboratories, universities, industry, and other research institutions, the consortium explores the capabilities of the Blue Gene/L architecture, sharing software and pooling the expertise and experience of researchers worldwide.

MCS is also leading the Grid Infrastructure Group for TeraGrid, a National Science Foundation-funded project that provides an extraordinarily large and fast distributed infrastructure for open scientific research. Linked by networks operating at tens of gigabits per second, TeraGrid integrates high-performance computers, data resources, and tools, including more than 102 teraflops of computing capability and more than 15 petabytes (quadrillions of bytes) of online and archival data storage. TeraGrid supports rapid access and retrieval, enabling researchers to reach over 100 discipline-specific databases around the globe. Argonne coordinates the entire multi-million-dollar, multi-institutional project as a joint effort with the University of Chicago. These resources are used for computationally intensive projects ranging from severe weather prediction and earthquake modeling to detailed modeling of blood circulation and studies of the human brain.

A Small Group with a Wide Impact

While MCS is relatively small, with just under a hundred permanent staff positions, its impact is great. The division measures its success by the number of users of MCS software and the diversity of their applications. Argonne’s computer science research is used by a broad community of thousands, enabling them to solve complex problems and take advantage of parallel architectures.

Gail Pieper is Coordinator of Technical Editing and Writing at Argonne National Laboratory.
