Microsoft and HPC
In: Misc. / by Peter Harsha

Microsoft is apparently planning a new OS version targeted at HPC clusters, called “Windows Server HPC Edition.” ZDNet has the story.
NSF Funding Outlook Grim But Cyberinfrastructure’s a Priority, says NSF Director
In: Research / by Peter Harsha

On Tuesday, National Science Foundation Director Arden Bement met with the Coalition for National Science Funding (of which CRA is a member) and warned the science community CNSF represents to lower its expectations of increased funding for the agency in the near term, saying the expectation of budget-doubling, as authorized by Congress and the President in 2002, “isn’t reasonable.”
“The NSF budget is correlated with the other non-defense research agencies,” Bement said, “and those are not scheduled to grow very much [in the President’s future budgets].” The Administration’s priorities are very clear, Bement said: secure the homeland, fight the war on terror, and reduce the budget deficit.
In light of the tough times ahead, Bement said the 3 percent increase in NSF funding requested by the President (to $5.745 billion for FY05) should be seen as a symbolic show of support for the agency, especially as other agencies are seeing their budgets stay flat or decline relative to inflation.
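For context, a quick back-of-the-envelope check (my arithmetic, not NSF’s) of what a 3 percent increase to $5.745 billion implies about the FY04 base:

\[ \text{FY04 base} \approx \frac{\$5.745\ \text{billion}}{1.03} \approx \$5.58\ \text{billion} \]

In other words, a bump of roughly $167 million, a far cry from the doubling track authorized in 2002.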
Given the relatively bleak outlook, Bement said the agency would pay special attention to three areas in the FY05 and FY06 budgets: improving productivity of researchers by increasing average grant size and duration; strengthening the national research infrastructure by investing in cyberinfrastructure research and development; and strengthening NSF management.
It was encouraging to hear Bement talk specifically of cyberinfrastructure in his remarks, especially as there was (and is) some concern in the computing community, after the departure of Rita Colwell from NSF, that the new NSF regime just might not “get” cyberinfrastructure. Bement, I think, is saying the right things, noting that research today is increasingly complex and “network-oriented” and that a crucial part of the enterprise relies on cyberinfrastructure. For FY05, Bement said, NSF would spend roughly $400 million on cyberinfrastructure-related R&D foundation-wide, and that funding would go to research beyond just “boxes and network” — research into algorithm development, the architecture of the net, software, etc.
The other two priority areas — increasing grant size and duration, and strengthening NSF management — are not particularly new challenges for the agency. Bement says he hopes to see the average grant size grow beyond $142,000 (where it will stand in FY05) to $160,000, and hopes that increasing grant duration will encourage proposals that take on more risk. He also noted that he’s growing concerned about the significant drop in the proposal success rate, citing the increased number of proposals the agency receives (over 40,000 last year, compared to an average of 22-33k previously) and theorizing that other agencies’ declining budgets and changing funding regimes (see here and here for more) may be to blame for the increased pressure on NSF.
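To see how quickly that success-rate pressure compounds, here’s a purely illustrative calculation (the award count is hypothetical, not an NSF figure): if the agency funds roughly the same number of awards each year while submissions climb from about 30,000 to 40,000, the success rate falls mechanically:

\[ \frac{10{,}000\ \text{awards}}{30{,}000\ \text{proposals}} \approx 33\% \qquad \text{vs.} \qquad \frac{10{,}000\ \text{awards}}{40{,}000\ \text{proposals}} = 25\% \]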
Bement also indicated he’s seeking $20 million in FY05 for more staff at NSF. “We need more program officers,” he said, noting that proposals are more complex than ever before, administering the peer review process is more complex, and there are a whole lot more proposals to consider. “While the opportunities have never been greater,” he said, “the number of claimants has never been larger.”
It’s not clear how long Bement will remain in the driver’s seat at NSF. His appointment lasts only until the end of the year, so his job status probably depends heavily on what happens in the elections in November. The grapevine, which had been chattering pretty loudly in March about a possible candidate for the permanent job (well, the six-year term), seems to have quieted down considerably. It seems as though Bement will remain in charge at least through November.
Event: Panel on Women Entering Comp Sci (at Google, June 2)
In: People / by Peter Harsha

Spreading the word about an event to be held at Google on June 2nd, in conjunction with the Anita Borg Institute: a panel discussion focusing on nontraditional routes into computer science, especially (but not exclusively) for women.
Here’s the official release: It’s Never Too Late: Careers in Computer Science.
Kalil on Google and America’s Innovation Policy
In: Policy / by Peter Harsha

Tom Kalil has a nice column that explains the importance of federal support for fundamental research in the creation of Google (and makes the case that current US policy is hurting the environment that allows companies like Google to spawn and grow). The Google story is just one of the more recent examples of long-term, government-supported fundamental research helping develop and grow billion-dollar industries and markets. It’s a story that has been repeated often in information technology. The National Academies’ Computer Science and Telecommunications Board even put together this somewhat hard-to-read graphic showing 19 different IT-related technologies that, with government support, each grew into billion-dollar industries. (Note to self: redesign the CSTB chart to make it easier to read.)
Kalil’s article notes some warning signs — we’re not producing enough students with science and engineering degrees, we’re relying too much on foreign students to fill the gap while tighter visa restrictions are squeezing that supply, and the US share of publications in top science journals is declining — but he doesn’t delve into some of the specific causes, other than to note that in the President’s most recent budget “science funding in 21 of 24 science agencies would be cut over the next five years…including NSF, NIH, and DOE Office of Science.” I’d add that I think the problems go beyond raw funding levels. I think we’re approaching the funding of fundamental research differently than in years past, especially in IT R&D, and especially at the Department of Defense.

DOD and DARPA have always been crucially important to the development and advancement of computer science, and university researchers, in turn, have been crucially important to DOD and DARPA. However, changes in the way DARPA does business — from its moves to classify most of its computer security research, to its recent shift to a “milestone”-based approach to funded research, in which programs are evaluated on a 12- to 18-month cycle with “go/no-go” decisions at each step — have had the effect of discouraging university researchers from participating in DARPA-sponsored research. This is significant for a couple of reasons. First, it means some of the brightest minds in the country won’t, or can’t, work on DARPA’s important research problems. Second, it means university researchers have a hard time participating in perhaps the most important aspect of DARPA-sponsored research: the community-building around particular problems.
Computing research (and the country as a whole, I’d argue) has been well served historically by having two significant, diverse sources of funding in NSF and DARPA. NSF continues to be primarily a place for the single investigator — modest grants for small numbers of individual researchers. DARPA’s real strength historically, however, was different. DARPA program managers could identify a particular problem, then bring together and nurture communities of really smart people devoted to working on it. It was a very successful approach — DARPA is credited with between a third and a half of all the major innovations in computer science and technology (according to Michael Dertouzos). Between the two of them, the NSF and DARPA models have led to everything from graphical user interfaces to the Internet to, well, Google.
So it concerns me that DARPA is discouraging (intentionally or unintentionally) university-based researchers from participating in its programs…maybe even more than the declining share of basic research in the DOD science and technology portfolio concerns me. And I think Kalil is right to be concerned about what we may reap in the future as a result of these policies today.
Administration Releases “Federal Plan for High End Computing” Report
In: Policy / by Peter Harsha

Coinciding with yesterday’s House Science Committee hearing on HPC (see entry below), the White House released the latest report of the High-End Computing Revitalization Task Force (HECRTF), spelling out the Administration’s “forward looking plan” for high-end computing with three components:

- an interagency R&D roadmap for high-end computing core technologies;
- a federal high-end computing capacity and accessibility improvement plan; and,
- recommendations relating to federal procurement of high-end computing systems.
The report is available as a pdf from the National Coordination Office for IT R&D.
In theory, this report will help shape the FY 2006 agency budgets, which are already being prepared. It’s hot off the presses, so I haven’t gotten all the way through it yet, but I’d be interested in your thoughts.
Highlights from the House Science Committee HPC Hearing
In: Policy / by Peter Harsha

In what could fairly be described as a “love-in,” Thursday’s House Science Committee hearing on HR 4218, the High Performance Computing Revitalization Act of 2004 (HPCRA), featured witnesses from the Administration, industry, universities, and federal labs all singing the praises of the committee’s bill to amend the 1991 High Performance Computing and Communications Act. The Committee’s bill, discussed in a previous blog entry, attempts to address concerns within the computing community about interagency coordination in the government-wide Networking and Information Technology Research and Development (NITRD) program generally, and within the high-performance computing community specifically. In essence, the bill tries to do three things:

- make sure US researchers have access to the best machines available;
- make sure research moves forward on a broad range of architectures, software, applications, algorithms, etc.; and,
- assure the interagency planning process really works.
Without exception, the four witnesses called to testify expressed strong support for the bill. While not going so far as to say the interagency planning process was broken, White House Office of Science and Technology Policy Director John Marburger agreed the bill would help strengthen interagency coordination in high-end computing and offered the Administration’s support.
Administration support will “grease the wheels” of the legislative process a bit for this particular bill, though it’s by no means an easy path to passage. From talking to various committee staff, it appears the biggest hurdle for the bill is actually on the Senate side. Senator Pete Domenici (R-NM), Chair of the Senate Committee on Energy and Natural Resources, needs to be convinced that the HPCRA doesn’t contain provisions that should be in his Energy bill (S 2095) — otherwise his reluctance to move anything through his committee (to which the HPCRA would no doubt be referred) that looks like a piece of the Energy bill will stop the HPCRA in its tracks. On the House side, the path forward for the bill looks relatively clear. The Science Committee plans a “markup” on the bill in early June, and time for consideration on the House floor is already tentatively scheduled in July. Elements of the House Leadership are apparently very interested in making the bill part of an “improving national competitiveness” theme this summer.
“Bad Laws, Bad Code, Bad Behavior”
In: Policy / by Peter Harsha

Declan McCullagh has an interesting piece at CNET News.com that describes some of the difficulties Congress has in trying to regulate technologies it doesn’t really understand. In their efforts to regulate things like peer-to-peer clients, spyware, and chat clients, members of Congress often cast their net way too broadly, drafting bills that would affect far broader swaths of the internet than they perhaps anticipated. Most of this, McCullagh argues, is because the members lack the expertise required to understand the implications of their legislation on technology. It’s a quick read, and I think it does a good job of demonstrating how important it is for groups like CRA, ACM, IEEE-CS, etc., to continue to offer to serve as resources for members confronting technology issues.
House Science Committee Meets Today to Discuss High Performance Computing
In: Policy / by Peter Harsha

I’ll be at the House Committee on Science hearing today on high performance computing and the committee’s bill, the High Performance Computing Act of 2004. White House Office of Science and Technology Policy Director John Marburger will be testifying, along with Irving Wladawsky-Berger, VP of Tech and Strategy for IBM and former co-chair of PITAC; Rick Stevens of Argonne Nat’l Lab; and CRA Board Member Dan Reed of UNC.
The hearing will be webcast live beginning at 10:30 am ET. The webcast will then be available online at the Science Committee website.
I’ll have a full report after the hearing.
Pics from the CRA-W Presidential Award Ceremony
In: CRA, People / by Peter Harsha

(Clicking the thumbnail gets you a larger version)
Accepting the Award
From left: NSF Director Arden Bement, CRA-W Co-Chair Mary Jean Harrold, former CRA-W Co-Chair Jan Cuny, White House OSTP Director John Marburger
Group shot of all PAESMEM Awardees
Harrold and Cuny in the top row, third and fourth from the left, respectively
Post Awards
From left: Harrold, Cuny, Marburger, CRA-W Co-Founder and ACM President Maria Klawe, former CRA Chair and NSF AD for CISE Peter Freeman
Post Awards 2
From left: Freeman, Cuny, Klawe, Revi Sterling from Microsoft, and Harrold
President Honors CRA-W with Award for Mentoring
In: CRA, People / by Peter Harsha

I try to avoid gratuitous plugging of CRA or CRA activities here, but sometimes something is just too good not to mention. Today the President announced that CRA’s Committee on the Status of Women in Computing Research (CRA-W) had won a 2004 Presidential Award for Excellence in Science, Mathematics, and Engineering Mentoring (PAESMEM) for its long-running work to address the underrepresentation of women in computer science and engineering.
So today I got to spend a good part of the day hanging around the Eisenhower Executive Office Building on the grounds of the White House with CRA-W representatives Jan Cuny (also a CRA-W board member) and CRA-W Co-Chair Mary Jean Harrold (a CRA board member to be) as they received CRA-W’s award in a ceremony headlined by White House Office of Science and Technology Policy Director John Marburger. Also attending was CRA-W co-founder, and now ACM President, Maria Klawe. The President was, unfortunately, not able to attend as he was meeting with King Abdullah Bin Al Hussein of Jordan.
In any case, it was a great day for CRA-W. The President, in a note to the awardees read by Marburger, made it very clear that he believes the innovation necessary to keep the nation flourishing can only be sustained by tapping into a broad, diverse, and educated workforce, and that programs like the ones honored today would serve as role models. Marburger himself called the organizations honored “exemplars” and leaders in the national effort to more fully develop the Nation’s human resources in science, mathematics, and engineering. It was worthy praise for the women of CRA-W, who have been working since 1991 to “increase the number of women involved in computer science and engineering, increase the degree of success they experience, and provide a forum for addressing problems that often fall disproportionately within women’s domain.”
Watch this space for pictures of the event as soon as they are available. The extended entry contains the official CRA press release marking the award. Congrats to all the CRA-W participants, past and present!
OSTP also has a press release (pdf).
Update: NSF now has their press release online.