Senator John McCain released his technology agenda this week. He supports several ideas of high importance to the S&T community. McCain's plan includes:
Making the R&D tax credit permanent
Lowering the corporate tax rate to 25%
Allowing companies to write off new equipment and technology in the first year
Keeping the Internet tax-free
Limiting taxes on wireless services
Fully funding the America COMPETES Act
Expanding H-1B visas
Cracking down on piracy
Increasing funding for the Patent Office
Protecting intellectual property around the world
Increasing broadband to underserved areas
Increasing S&T expertise and use in government
Of course, a big focus of the computing research advocacy community has been seeing the funding commitments approved as part of the COMPETES Act fully realized; those commitments include doubling the budgets of three key federal science agencies — NSF, NIST, and DOE Science — over the next seven years. And it appears that McCain supports that goal. However, his senior policy staff has sent mixed messages. Douglas Holtz-Eakin, a senior policy advisor to McCain, told NPR that “Doubling is a nice fun number for political purposes. It’s clean, it’s smooth. But it doesn’t reflect a balancing of political priorities. There will be competing demands for funds.”
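As a quick aside, it is worth spelling out what “doubling over the next seven years” implies in annual terms. Here is a minimal back-of-the-envelope sketch; the starting budget in it is purely a placeholder, not an actual agency figure.

```python
# Doubling a budget over seven years implies an average annual increase of
# 2**(1/7) - 1, or roughly 10.4 percent per year.
years = 7
annual_growth = 2 ** (1 / years) - 1
print(f"Implied annual increase: {annual_growth:.1%}")

# Purely illustrative: compound a hypothetical $6.0B budget at that rate.
budget = 6.0  # billions of dollars; a placeholder, not an actual agency figure
for year in range(1, years + 1):
    budget *= 1 + annual_growth
    print(f"Year {year}: ${budget:.2f}B")
```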
Senator Barack Obama has had a technology agenda on his campaign web site for a while, but now that McCain has come out with his, the comparisons can begin. There are several areas of agreement between the two, such as making the R&D tax credit permanent, increasing broadband to underserved areas, and increasing the protection of intellectual property around the world. Obama, however, supports Net neutrality (though he doesn't use that term) and flatly states he would double the funding for basic science research. Obama's plan also calls for allowing foreign students who earn degrees at US higher education institutions to stay in the country and earn citizenship, and emphasizes the need to increase the number of American students, particularly women and minorities, who obtain undergraduate degrees in STEM fields. Also, in addition to increasing intellectual property protections around the world, Obama calls for reforming the patent system.
While most of this sounds great, there is very little in either plan about how to accomplish these goals. Still, knowing that science and technology issues are being discussed at the highest levels of campaign politics means the messages the community is sending are getting through.
As mentioned in this space on Wednesday, the House Science and Technology Committee held a hearing Thursday morning to review the federal Networking and Information Technology Research and Development program (NITRD — alternately pronounced “NIGHT-erd” or “NIGHTER-dee”), the 13-agency, $3.3 billion budget activity that represents the federal government’s investment in IT research and development. The hearing mainly focused on the recommendations issued last year by the President’s Council of Advisors on Science and Technology (PCAST) in their review of the federal IT R&D ecosystem, Leadership Under Challenge: Information Technology R&D in a Competitive World (pdf) (which we’ve also covered here). The hearing represents the first step in a process that will result in legislation next year that will attempt to once again amend the High Performance Computing Act of 1991 (most recently amended as part of the America COMPETES Act, passed in August 2007) to codify some of those recommendations.
(You can watch an archived webcast of the hearing and see copies of each witness’s written testimony at the House S&T Committee website.)
Testifying before the members were Chris Greer, Director of the NITRD National Coordination Office; Dan Reed, CRA Board Chair; Craig Stewart, Associate Dean of Research Technologies at Indiana University and representing the Coalition for Academic Scientific Computation (CASC); and Don C. Winter, VP of Engineering and IT at Boeing’s Phantom Works. Greer was there to talk about what the NITRD NCO is doing and intends to do about acting on the recommendations of the PCAST report; Reed was there as both someone who was deeply involved in writing the PCAST recommendations and who also has a strong connection to the computing research community; Stewart was there to speak for the academic HPC users and researchers; and Winter was there to bring a corporate/private sector perspective to the panel. All filled their assigned roles well.
Chairman Bart Gordon (D-TN) opened the hearing by noting his favorable impression of the NITRD program. From his opening remarks:
I believe the NITRD program has been largely a success. It has made a substantial contribution to moving computation to an equal place alongside theory and experiment for conducting research in science and engineering.
In addition, it has developed the computing and networking infrastructure needed to support leading edge research and to drive the technology forward for a range of commercial applications that benefit society broadly.
The technical advances that led to today’s computing devices and networks, and the software that drives them, evolved from past research sponsored by industry and government, often in partnership, and conducted by industry, universities, and federal labs.
Greer used his opening remarks to detail the efforts the NITRD NCO has already undertaken in response to the PCAST recommendations (though he indicated that they would probably have embarked on the process even without a recommendation), including a strategic planning process that will produce a plan for NITRD for release in 2009. Greer also didn’t take issue with any of the PCAST recommendations — in fact, no witness (or Member of Congress) took issue with the recommendations in general — and largely agreed that the program needs to improve its interagency planning.
Reed emphasized a few concerns about the overall IT R&D ecosystem in his remarks, noting in particular that the federal portfolio for IT R&D has lost a key piece of what made it such a success with the withdrawal of DARPA support for much university computer science research. Historically, the diversity of funding approaches and mission needs at both DARPA and NSF drove some truly innovative research in computing. With DARPA’s absence, university computing research has become a “monoculture” of research supported by a single agency: NSF. Indeed, NSF now supports 86 percent of federal obligations for computer science research in U.S. universities. As a result, Reed argued, the process has gotten more conservative — more incremental and evolutionary rather than revolutionary research proposals. This lack of diversity in approaches and mission needs threatens to constrain the robust pace of innovation in the space, he noted. (Dan posts some additional thoughts on his testimony on his blog today.)
Stewart opened by fully endorsing the PCAST recommendations on behalf of CASC, but focused some of his remarks specifically on the workforce issues faced by the field. The declining interest of U.S. students in S&E — and particularly IT fields — represents a huge challenge for America’s future competitiveness, he argued. Programs that could increase the participation of American students in the STEM fields (Science, Technology, Engineering and Mathematics) should be a strong focus of the committee, and he commended the Members for their work in getting such strong emphasis on STEM education in the America COMPETES Act.
Winter focused his remarks on the importance of the PCAST recommendation to emphasize cyber-physical systems (CPS) as a research area in the IT research and development portfolio. CPS are very important to the aviation industry, he argued, and the industry badly needs advances in technology and tool development in the space and is reliant on the research community to get that work done.
The member questions tended to focus on how best to get NITRD agencies to collaborate on research agendas and how to set priorities given limited funding. Of particular interest to Chairman Gordon was how the NITRD program could embrace the PCAST recommendation that the program ought to be rebalanced to emphasize more high-risk, long-range research efforts. Would this require new money, he asked? Greer thought that through better coordination, the agencies could do a lot to re-prioritize existing funding, but that new money was also likely required. Reed noted that it’s not just an agency problem; there’s also a cultural component within the computing research community that needs to change. Researchers need to think more audaciously in their research proposals, and reviewers need to be willing to reward those proposals that are high-risk but potentially high-payoff. More funding would ease some of the pressure to award conservative proposals rather than risky ones, of course, but this still requires a mindset change within the computing community — something Reed said the community is starting to focus on.
Rep. Jerry McNerney (D-CA) raised a question related to Reed’s testimony about the undesirability of a research monoculture in the long-term part of the IT R&D portfolio. Wouldn’t a single agency, assuming it’s well run, manage and coordinate the long-range research better than if that research were spread among different agencies, he asked? Reed explained that, while it’s true that a single agency could certainly take on that piece of the portfolio by itself, historically, having a diversity of funding models and agency missions available to researchers has proven to be an incredibly productive way to enable innovation in the IT sector. NSF is very good at individual investigator-initiated research, for example, and DARPA was very good at placing big bets on hard problems and hand-picking communities of researchers to focus on them. Between just these two approaches, an enormous number of innovations resulted.
There was also a recurring focus on cyber security in the member questions, in part spurred by the discussion about the ubiquity of computing devices and the increased access we now have to them. Winter pointed out that cyber security wasn’t always a concern for a company like Boeing, despite widespread use of embedded computing devices in things like avionics systems. But now, these systems increasingly communicate with the world outside the airplane — exchanging data with other aircraft and other assets in the battlespace, enhancing the effectiveness of the systems, but also increasing their vulnerability to cyber attacks. There is much research to be done, the panelists agreed, on understanding how to secure these cyber-physical systems, and there were great concerns expressed that the current and projected workforce in the area is inadequate to the task ahead. Support for research in the area helps produce that workforce, the panelists noted.
Finally, there was also brief discussion of Reed’s recommendation, made as someone who served on both PCAST and the President’s Information Technology Advisory Committee (PITAC) before the latter was folded into PCAST, that PITAC be reconstituted in order to provide adequate oversight of the NITRD program. Though there are some within the Administration who oppose the push to reconstitute PITAC, there was no objection from the committee members to the suggestion — in fact, Chairman Gordon pointed out that their reauthorization of HPCC in the America COMPETES Act actually called for the same thing. So perhaps we can look forward to the return of PITAC in the next Administration.
And that was about it. Despite a good turnout among Members of Congress for the hearing (I counted 11 present at various times), the committee managed to wrap up its review of the program in just 56 minutes — a record, in my experience, for a full-committee hearing of the House S&T Committee. I take that as a good sign, however. The issues confronting the program are pretty clear, and the steps required to address them aren’t terribly controversial; it just remains to do them. In the next few weeks and months, we hope to see the direction the committee plans to take regarding the PCAST recommendations.
As always, we’ll have all the details here….
Yesterday, Yahoo, Hewlett-Packard, and Intel announced they are partnering with three universities and their governments in the United States, Germany, and Singapore to build a new cloud-computing research initiative. Google and IBM launched a similar program last fall, centered on six American universities. (Microsoft and Intel also launched a different university research partnership earlier this year.)
The new program will provide researchers with six test-bed data centers (one at each of the university and industry partners), each furnished with between 1,000 and 4,000 processors. The University of Illinois at Urbana-Champaign will represent American academia in the partnership, and will be supported in part by the NSF.
As the New York Times’ Steve Lohr points out, “This is competition at its best.”
Tomorrow Dan Reed, CRA's Board Chair, will testify before the House Science and Technology Committee on the state of the Networking and Information Technology Research and Development (NITRD) program. Dan is a member of the President's Council of Advisors on Science and Technology (PCAST), which released a report last summer on the state of NITRD.
Testifying alongside Dan will be Dr. Chris Greer, Director of the National Coordination Office of NITRD; Dr. Craig Stewart, Associate Dean of Research Technologies at Indiana University, representing the Coalition for Academic Scientific Computation; and Don C. Winter, Vice President of Engineering & Information Technology at Phantom Works, a Boeing company.
The hearing charter is available online and the witness testimony should be posted soon. The hearing will be webcast, so you can watch it live at 10 am. We'll bring you highlights here after the hearing.
A couple of small announcements:
First, those of you who attended CRA’s biennial conference at Snowbird last week already heard this call, but for those who didn’t (or who need to be reminded), we want your research highlights! CRA and the Computing Community Consortium are in the process of gathering recent computing research highlights to feature prominently in CRA and CCC publications — on the web, in our advocacy efforts, and in our print publications — and we’d like yours.
What we’re asking is that you add this e-mail address — highlights@cra.org — to any press release distribution list your department or institution may have to publicize your exciting research results. We’re gathering those interesting stories, putting them into a searchable database, and then highlighting selected ones on the CRA and CCC websites. The model here is something like the very popular Astronomy Picture of the Day, where each day a new photo or graphic (or video) having something to do with astronomy is featured along with a nice succinct description. While we don’t anticipate being able to feature new computing research daily, we hope to refresh it frequently enough (weekly?) to make it worth checking back often. But, to do that, we need your highlights.
To fill the pot, we’re accepting any release your department or institution may have sent in the last 24 months or so. Obviously, we’d like to feature the most timely ones, but we don’t mind pushing the clock back a bit for anything truly exciting. So, please submit yours today, and make sure your press offices have highlights@cra.org on their distribution lists.
In other news, we’ve created some new CRA-related “groups” on two popular social networking sites: LinkedIn and Facebook. Both are for those involved in, or just fans of, CRA. To join the LinkedIn one, go here and we’ll approve you. On Facebook, you can find us here. We hope you’ll take a look!
Last Tuesday, NYT science commentator John Tierney discussed how Congress has recently ramped up enforcement of Title IX among universities’ science departments. Will a “quota system”–an idea Tierney floats in the third paragraph of his piece–be an outcome of Title IX enforcement?
So far, the increased enforcement has only consisted of periodic compliance reviews, which had been long-neglected by the NSF, Department of Energy, and NASA, according to a 2004 Government Accountability Office report. These reviews are intended to make sure grantee departments are not discriminatory.
Of course, since some fields like computer science have many more men than women–both among students and faculty–there is concern that the government might start considering everyone “discriminatory” using the yardstick of proportionality and quotas. For athletics departments, such rigorous Title IX enforcement has led to a huge increase in the participation and achievement of women athletes, but at the expense of some male sports.
The sciences are not necessarily in the same boat as sports: although most would agree that women face an uphill battle in the sciences, how much of the gap can be explained by discrimination remains an open question. “60 percent of biology majors and 70 percent of psychology Ph.D.’s” are women, raising the possibility that more women simply prefer other fields, as psychologist Susan Pinker argues.
Another possibility is that if discrimination is having any effect, most of it happens before girls reach college. One study suggests that differences at adolescence explain different outcomes 20 years later.
For now, though, the compliance reviews haven’t rocked any boats. But the threat of a Title IX bludgeon hanging over departments’ heads is sure to add urgency to debates about the shortage of women in fields like computer science and what to do about it.
Voters' ballots may be more partisan than ever, but the vast majority of Americans agree that we need to invest in science and technology, according to a recent poll. Seventy-one percent of polled voters said they would be more likely to vote for a candidate who is committed to making sure the federal budget invests in scientific research, and a whopping 86 percent said they would be more likely to vote for a candidate committed to public investments in science and technology education.
Such investments have majority support among Democrats as well as Republicans (and independents, too), demonstrating the broad bipartisan consensus behind funding for science. Hat tip: Gene Spafford
Ed Lazowska, Chair of the Computing Community Consortium, has a passionate post today on the CCC Blog about what the latest numbers from CRA’s Taulbee Survey really mean. The news is not, he points out, that computer science bachelor’s degrees show another year of decline — that was completely predictable from the enrollment statistics for freshman CS majors published four years ago in the survey. The real news (as we noted back in March) is that for the first time in many years, freshman interest in CS as a major increased and enrollments have stabilized — indicating that perhaps we have turned a corner. What’s responsible for the turnaround? According to Lazowska:
[B]y far the most important factors are (a) the job market (or people's sense of the job market), and (b) the level of buzz associated with the field.
Let's start by considering graduate enrollment, rather than undergraduate enrollment. For the past 15 years, the number of Ph.D.s granted annually in computer science has been in the 900-1100 range. Suddenly, though, in the past 2 years, it has climbed to 1800. Why is this? The answer is totally obvious:
In 2001, lots of startup companies went bust.
This dumped onto the job market a number of the best bachelor's graduates from a few years before, who now had two or three years of experience under their belts.
This made it hard for some excellent new bachelor's graduates of 2001 and 2002 to get the super-exciting jobs they had anticipated: they were competing with people whose academic records were every bit as good as theirs, but who also had 2 or 3 years of experience working at a hot startup.
Because these great new bachelor's graduates couldn't get exciting jobs, they went to graduate school instead.
And, mirabile dictu, 6 years later, they're emerging with Ph.D.s.
This is not a news flash: it didn't take a genius to predict, a few years ago, that it was going to happen, and it doesn't take a genius to explain it, either.
Similarly for bachelor's degrees. Starting in about 2002, there was lots of news about the tech bust. Tech was no longer sexy. Jobs were no longer plentiful. Subsequently, there was a lot of misleading information about the impact of offshoring. And the newspapers never bothered to report that by late 2004, US IT employment was back to the 2000-2001 level; we had fully recovered from the bust, but somehow that wasn't considered newsworthy. So it's not surprising that interest in bachelor's programs decreased sharply, and that 4 and 5 years later, the number of degrees granted precisely mirrored this decline.
Also, it's not surprising that things are turning around. Google is hot. Tech in general is hot. There are startups everywhere. It's clear to anyone that there are plenty of jobs. (By the way, given the incredible state of today's bachelor's job market, it doesn't take a genius to predict that the number of Ph.D. graduates in 2014 will show a decline. When you read the scary headlines 6 years from now, remember that you heard it here first!)
Ed also talks about the experience at his institution, the University of Washington, tries to put the “crisis” in computer science in perspective by offering up some comparisons to the other science and engineering disciplines, and emphasizes the bright outlook suggested by various Dept. of Labor workforce projections (pdf). In typical Lazowska style, it’s a forceful but accurate refutation of the standard story on CS enrollments we’ve seen for the last few years. It’s definitely worth a read (and comment!) over at the excellent CCC Blog. (Disclaimer: CCC is an activity of CRA, but that doesn’t make it any less awesome.)
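For readers who want to see the lag argument spelled out, here is a minimal toy model of the pipeline Lazowska describes. All of the enrollment figures below are invented for illustration, and the flat four-year lag is a simplification of his argument:

```python
# Toy model of the enrollment-to-degree pipeline: bachelor's degrees granted in
# year t roughly mirror freshman enrollments in year t - 4, so a dip in freshman
# interest shows up in the degree counts four to five years later.
# All figures below are invented for illustration only.

freshman_enrollments = {
    2000: 16000, 2001: 15500, 2002: 13000, 2003: 11000,
    2004: 9500, 2005: 9000, 2006: 9200, 2007: 10000,
}

BACHELORS_LAG = 4  # years from entering as a freshman to finishing a degree

def projected_degrees(year):
    """Degrees granted in `year`, assuming they track enrollments four years earlier."""
    return freshman_enrollments.get(year - BACHELORS_LAG)

for year in range(2004, 2012):
    degrees = projected_degrees(year)
    if degrees is not None:
        print(f"{year}: ~{degrees} degrees (from the {year - BACHELORS_LAG} freshman cohort)")
```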
The Emergency Supplemental for FY08 — the last chance to rectify the appropriations shortfall for science caused by the FY 08 Omnibus Appropriation — has been signed by the President and is now law. Though science funding made it into the supplemental — one of the few non-defense items in the bill — the win for the science community is somewhat symbolic. The amount included ($400 million — see here for a breakdown) is only about a third of the total shortfall of the FY08 appropriations, but it is nevertheless a sign that Congress and the White House understand the importance of research funding and are willing to back up their vocal support with some additional funding.
Meanwhile, the FY 09 appropriations process marches on, with some better news for science. As always, stay tuned here for the latest as the appropriations cycle moves forward (or not) this year.
The Coalition for National Science Funding held another successful Science Exposition on Capitol Hill last night, and once again CRA played a part. Manning this year’s booth for CRA was Dr. R. Michael Young from North Carolina State University, who did a fantastic job showing his work using the underlying technology of video games for more serious educational and research purposes. The exhibit received a great deal of attention from Congressional staff, Members of Congress, and other exhibitors. The event, a sort of science fair for Congress and staff, had 32 booths manned by researchers representing universities and scientific societies, featuring some of the important research funded by the National Science Foundation. NSF showed its support for the event with staff coming out en masse, including NSF Director Arden Bement, shown here with Rep. David Price (NC).
Several hundred attendees roamed the room this year, including a number of Members of Congress, such as Rep. Price and Rep. Vernon Ehlers (MI), shown here with Dr. Young at the CRA exhibit. Other federal agencies that attended were NASA and OMB.
As we’ve noted before in this space, personal visits to members of Congress and their staff are vital to getting the message about the importance of computing research out. If you are coming to Washington and would like to visit your Representative and Senators, let us know and we’ll be happy to help with appointments and provide materials for your use!