UPDATE #1 1/29: Yesterday afternoon, the Trump Administration released a new memo to clarify the earlier one on pausing federal grant and other financial assistance programs. Punch Bowl has the memo itself and several other news outlets are summarizing its contents and directions from the White House. It says that “any program not implicated by the President’s Executive Orders is not subject to the pause,” and then specifically lists the executive orders from the original memo that apply. It continues by saying that “any program that provides direct benefits to individuals is not subject to the pause,” and that the “guidance establishes a process for agencies to work with OMB to determine quickly whether any program is inconsistent” with the listed Executive Orders. The memo also argues that this is not a freeze on all Federal financial assistance programs and that it is not impoundment. While this clarification is helpful, it does not clear up all the confusion from the original order about which specific programs are impacted and for how long.
As well, just before the 5pm Tuesday deadline for the original memo to take effect, a federal judge in Washington DC temporarily blocked the Trump administration from enforcing this directive. This halts the freeze on funding until Monday, February 3rd. Additionally, a coalition of 21 state attorneys general announced legal action against the White House budget office over this matter.
In related news, NSF has created an Executive Order Implementation webpage to ensure the widest dissemination of information and updates on the subject. The other research agencies are likely to set up similar pages soon.
CRA is continuing to actively monitor the situation and we are engaging with our stakeholders to stay informed. We will continue to provide updates as the situation changes.
Original post: Last night, the Trump Administration issued a memo ordering a “Temporary Pause of Agency Grant, Loan, and Other Financial Assistance Programs.” The order from the Office of Management and Budget (OMB) requires all federal agencies to “temporarily pause all activities related to obligation or disbursement of all Federal financial assistance.” The stated purpose is “to review agency programs and determine the best uses of the funding for those programs consistent with the law and the President’s priorities.” The pause is ordered to take effect today, January 28th, at 5pm and is to remain in effect until February 10th, though it could be extended.
The OMB memo requires every federal agency to pause:
(i) issuance of new awards;
(ii) disbursement of Federal funds under all open awards; and
(iii) other relevant agency actions that may be implicated by the executive orders, to the extent permissible by law.
OMB also requires all “Federal agencies to pause all activities associated with open NOFOs (Notice of Funding Opportunity), such as conducting merit review panels.” The order explicitly exempts mandatory spending like Social Security and Medicare, as well as “assistance provided directly to individuals.” This order will likely impact much, if not all, of the discretionary funding that the Federal Government spends, including research funding. However, there is significant confusion about how the memo will be implemented and whether it is legal.
This is likely the first move in the expected impoundment actions that the Trump Administration has been telegraphing. Impoundment is when the President does not spend federal dollars that have been legally appropriated by Congress. With a few exceptions, the Executive Branch is required by the Constitution to spend funds as the Legislative Branch directs (this is where the term “power of the purse” comes from). Legally, this situation will turn on whether this is a temporary pause to perform a review or not. Already, key members of Congress have called on the Administration to stop this action and “ensure all federal resources are delivered in accordance with the law.”
CRA is monitoring the situation closely and will report out any developments. In our conversations with policymakers, we will explain the damaging impacts this order will have on the nation’s computing research community. As well, we strongly encourage all members of our community to contact their Senators and Representatives in Congress and tell them how this action will impact their constituents and districts.
On January 20th, the inaugural day of the Trump Administration, President Trump unveiled a series of executive orders and actions. Executive orders are documents or actions issued by the president to manage the operations of the Federal Government, specifically the executive branch departments and agencies. While these EOs have the force of law, they remain in force only until they are rescinded, either by the issuing administration or a succeeding one, or are deemed unconstitutional by the courts. Almost all of these orders were a fulfillment of campaign promises made by Trump during the 2024 election. Several of these EOs impact the computing research community, either directly or indirectly.
The “Ending Radical and Wasteful Government DEI Programs and Preferencing” EO orders the Director of the Office of Management and Budget (OMB), assisted by the US Attorney General and the Director of OPM, to “coordinate the termination of all discriminatory programs, including illegal DEI and “diversity, equity, inclusion, and accessibility” (DEIA) mandates, policies, programs, preferences, and activities in the Federal Government, under whatever name they appear.” Within 60 days, each agency, department, or commission shall terminate “all DEI, DEIA, and “environmental justice” offices and positions (including but not limited to “Chief Diversity Officer” positions); all “equity action plans,” “equity” actions, initiatives, or programs, “equity-related” grants or contracts; and all DEI or DEIA performance requirements for employees, contractors, or grantees.” Additionally, they are to provide the White House with a list of:
(A) agency or department DEI, DEIA, or “environmental justice” positions, committees, programs, services, activities, budgets, and expenditures in existence on November 4, 2024, and an assessment of whether these positions, committees, programs, services, activities, budgets, and expenditures have been misleadingly relabeled in an attempt to preserve their pre-November 4, 2024, function;
(B) Federal contractors who have provided DEI training or DEI training materials to agency or department employees; and
(C) Federal grantees who received Federal funding to provide or advance DEI, DEIA, or “environmental justice” programs, services, or activities since January 20, 2021.
There is an expectation that all federal employees who work in DEI offices will be furloughed or reassigned by the end of this week. For all intents and purposes, this order outlaws DEI initiatives within the federal government.
The order that is widest in scope, and that will also impact the community directly, is Initial Rescission of Harmful Executive Orders and Actions. This order revokes a large number of executive orders issued by President Biden, the most significant being Executive Order 14110 of October 30, 2023 (AKA: “Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence” or the AI EO). The full impact of rescinding the AI EO is unclear at the moment, as several actions ordered in the EO have already occurred. But the fate of such things as the NAIRR pilot project at NSF and the AI Safety Institute at NIST is unclear at this time. Both programs enjoy wide, bipartisan support in Congress and the tech industry; if they are required to shut down, that could be an unintended outcome of revoking the original order. CRA is consulting with our friends and allies in Congress and the Executive Branch to get more information on the full impact of revoking this order.
This order also requires the National Security Advisor to review all National Security Memos (NSMs) that were issued by the Biden Administration and make recommendations on which ones should be revoked or kept in force. This could impact the NSM covering artificial intelligence.
There are several executive orders that were issued that will impact the computing research community in less direct or obvious ways. A select list of these orders is:
Establishing and Implementing the President’s “Department of Government Efficiency” – This order sets up the so-called “Department of Government Efficiency” within the executive office of the President (i.e., the White House). It does this by renaming and re-tasking the United States Digital Service, which was an office set up by President Obama to provide consultation services to federal agencies on information technology. The DOGE office is given an 18-month timeline to do its work and will terminate on July 4, 2026. The effort, which has had an unclear directive until now, is tasked with “modernizing Federal technology and software to maximize efficiency and productivity.” Intriguingly, the DOGE effort is not tasked with recommending the removal of regulations or funding for specific programs; that had been a stated goal but seems to have fallen by the wayside. There is always a possibility that its scope could change with a future EO though.
Restoring Accountability to Policy-Influencing Positions within the Federal Workforce and Hiring Freeze – These two orders will likely have a number of unintended impacts, particularly at the federal research funding agencies. With the “Restoring Accountability EO,” the likely impact will be shifting most, if not all, government scientists and rotators into this new “Schedule Policy/Career” (formerly called Schedule F) classification of the federal workforce. This has been a concern since Schedule F was issued toward the end of the first Trump Administration. This action is meant to classify career federal employees more like political appointees, removing civil service protections and allowing for easier removal from their positions. This could have a serious impact on how career government scientists perform their jobs. As for the hiring freeze EO, it is initially set as a 90-day period, while OMB and the DOGE office develop a workforce reduction plan. However, it’s easy to see that being extended for an indeterminate amount of time. An extended freeze could impact rotators at the research agencies and the processes that they oversee.
Protecting the United States from Foreign Terrorists and other National Security and Public Safety Threats – This EO covers immigration and could impact the high-skilled immigration changes taken by the Biden Administration in 2022. The order calls on the Departments of State, Justice, and Homeland Security, as well as the Director of National Intelligence, to conduct a review of all immigration actions taken by the Biden Administration and reestablish a “uniform baseline” that existed before that Administration came into office in 2021. It also orders those same departments to identify countries for which “vetting and screening information is so deficient as to warrant a partial or full suspension on the admission of nationals from those countries.” It further orders an evaluation of “all visa programs to ensure that they are not used by foreign nation-states or other hostile actors to harm the security, economic, political, cultural, or other national interests of the United States.” Given the importance of high-skilled immigration to the computing, IT, and CS research communities, any changes in this area could have a significant impact.
This is just the first slate of Executive Orders from the Trump Administration. We are expecting more in the coming weeks and months. As well, these orders may be expanded upon or have unintended impacts that have not been accounted for yet. CRA is still in the process of reviewing these orders and monitoring the unfolding situation. Please check back for more updates.
The Computing Research Association (CRA) launched the first set of white papers in its 2024-2025 Quadrennial Papers series. We publish this series every four years to explore areas and issues in computing research that have the potential to address national priorities. These Quad Papers aim to provide a comprehensive overview of the computing research field, detailing potential research directions, challenges, and recommendations for policymakers and the computing research community.
Topics covered in this first set of papers include:
CRA released six papers in this initial wave and plans to release additional papers in the coming weeks.
The computing research community’s active engagement with elected officials and research agency policymakers ensures that these stakeholders remain informed about the latest breakthroughs and challenges in the field. This engagement represents one of the many ways that the CRA interacts with policymakers in the science and technology policy arena.
As the 2024 calendar year was winding down, Congress rushed to avoid a lapse in funding authority, which would have led to a government shutdown. Despite many political complications, cooler heads prevailed and a continuing resolution was passed into law. The new CR allows the federal departments and agencies to continue to operate, at the previous year’s funding levels, until March 14th. Unfortunately, that is the only certainty that the situation created.
As we have mentioned before while tracking the Fiscal Year 2025 (FY25) budget process, the outcome of the November 2024 elections is dictating how the budget process will conclude. Since Republicans now have unified control of the federal government, they will be able to decide on final funding levels for FY25 and likely beyond. They are telegraphing a budget reconciliation strategy which has the potential to pass into law sweeping changes in federal spending. Details are sparse at the moment, but significant, across-the-board cuts to non-defense funding are very likely.
We mention “non-defense funding” because almost all of the federal research agencies, such as NSF, DOE SC, NASA, NIH, and NIST, that support researchers in the computing fields are categorized as non-defense spending.
However, in order for these funding cuts to come to pass, Congressional Republicans, particularly in the closely divided House of Representatives, must remain united. If they stay together, they will be able to pass their agenda into law; if they continue to operate as they have for the last two years, they are much more likely to falter and rely on Democratic votes to handle must-pass legislation. Time will tell, but the vote for the Speakership of the House will be an early indicator of how Congress will operate for the next two years.
This outlook does not cover the administration of incoming President Trump. The potential challenges, complications, and opportunities of the next administration will be covered in a future Policy Blog post. Put simply, there are many areas of concern but there are also potential positives for the computing research community. Again, time will tell.
CRA is monitoring the budget situation very closely and we will report out any developments over the next several months. And we will continue to impress upon elected officials and policymakers of both parties that federal support for fundamental computing research is essential for the well-being of the nation.
Just before the holidays, President-Elect Trump announced that Michael Kratsios will be nominated as the Director of the Office of Science & Technology Policy (OSTP). Informally known as the President’s Science Advisor, the Director of OSTP oversees the office tasked with providing advice to the President, and the Executive Office of the President, on matters related to science and technology, as well as coordinating the Administration’s science and tech policy among the assorted federal research agencies. Mr. Kratsios comes into the position with extensive experience in the first Trump Administration, where he served both at OSTP and the Defense Department in senior research policy roles. He will also serve as an assistant to the President for science and technology.
It was also announced that computer scientist Lynne Parker, of the University of Tennessee, Knoxville, was named counselor to Kratsios and executive director of the President’s Council of Advisors on Science and Technology (PCAST). Dr. Parker previously served in OSTP from 2018 to 2022, in both the Trump and Biden Administrations, and at NSF in the CISE Directorate as Division Director for Information and Intelligent Systems from 2015 to 2016. In her previous role at OSTP, she led national AI policy efforts as Deputy Chief Technology Officer of the United States, the founding Director of the National AI Initiative Office, and Assistant Director for AI.
Other high level OSTP staff announced were Sriram Krishnan, who will serve as Senior Policy Advisor for Artificial Intelligence, and Bo Hines, who will be the Executive Director of the new Presidential Council of Advisers for Digital Assets (informally dubbed the “Crypto Council”). This follows up on the news earlier in December that David Sacks will serve as advisor to the president for AI and cryptocurrency and co-chair to PCAST.
These advisor and staff announcements are encouraging news, as this means that OSTP’s leadership will be in place much sooner than in Trump’s first term, when it took over two years for the office to have a confirmed director. The news about Mr. Kratsios and Dr. Parker, in particular, is good news, as both are well known to the computing research community and are well respected in their fields. Exactly when they will be in place is hard to tell at the moment; Kratsios’ nomination will need to be confirmed by the Senate before he can take up his position full time. But the other positions should be able to start soon after the Administration is sworn into office on January 20th.
This news also drives home that artificial intelligence and blockchain technology will be important issues in the second Trump Administration’s science and technology portfolio. CRA will continue to impress upon elected officials and policymakers the essential role that fundamental research will play in the development and advancement of both technologies, as well as the contributions other computing research disciplines will make to the future prosperity of the country.
The Bipartisan House Task Force on Artificial Intelligence released its long-awaited report today. The report contains 66 key findings and 89 recommendations, organized into 15 chapters that cover a diverse array of policy areas, all to “ensure America continues to lead the world in responsible AI innovation.” It represents the culmination of almost a year of consultations with “business leaders, government officials, technical experts, academics, and legal scholars.” This report is similar to the Senate AI Roadmap Report released in May.
The Task Force adopted several high-level principles to frame their policy analysis:
Identify AI Issue Novelty
Promote AI Innovation
Protect Against AI Risks and Harms
Empower Government with AI
Affirm the use of a Sectoral Regulatory Structure
Take an Incremental Approach
Keep Humans at the Center of AI Policy
These principles are reflected in the chapter on “Research, Development, & Standards,” which is of most immediate importance to CRA’s members. In it, the Task Force recommends that Congress continue to monitor and evaluate the impact that AI will have on different industries and the nation as a whole. Additionally, it calls for supporting fundamental R&D in the field of AI at American universities. However, it does not provide a specific amount or target that Congress should aim for to support this field; in comparison, the Senate report recommended at least $32 billion a year to support non-defense AI research.
The chapter on research also contains several wide-ranging recommendations that can impact the AI research community. They cover such topics as increasing tech transfer at universities; promoting the development of infrastructure and data to enable AI research; continuing engagement in international standards development; and promoting public-private partnerships for AI R&D.
The report ranges from sections on policy areas of concern around AI (such as civil rights and civil liberties, education and workforce, intellectual property, and open and closed systems) to sector-specific chapters (healthcare, small business, agriculture, and energy usage and data centers). At over 250 pages, it is a wide-ranging report.
The release of this report is a good step in the direction of Congress tackling the complexities of AI. However, it is not the last action needed. As the Task Force says in their report, this is “certainly not the final word on AI issues for Congress. Instead, it should be viewed as a tool for identifying and evaluating AI policy proposals.”
As with the Senate’s May report, the House Task Force is sending the right signals and saying the right things about the importance of research in handling the impact of artificial intelligence on the nation. What is needed now is to follow up this report with legislative action. CRA will continue to monitor any developments in this space and will advocate for the important role that the research community plays in any policy discussions around artificial intelligence.
Today, the Biden Administration released a National Security Memorandum (NSM) on artificial intelligence. At a high policy level, the NSM directs the executive branch to take steps to “(1) ensure that the United States leads the world’s development of safe, secure, and trustworthy AI; (2) harness cutting-edge AI technologies to advance the U.S. Government’s national security mission; and (3) advance international consensus and governance around AI.” According to the White House, the NSM’s “fundamental premise is that advances at the frontier of AI will have significant implications for national security and foreign policy in the near future.”
This memo has been long expected, as it was mentioned at the end of the AI Executive Order, which was released at the end of October 2023.
The NSM takes several steps to advance the use and adoption of AI for national security purposes. These range from directing improvements to the security and diversity of the semiconductor chip supply chain; to “formally designating the AI Safety Institute” at NIST as the primary point of contact between US industry and the federal government; to doubling down on the NAIRR pilot; to directing the creation of a “Framework to Advance AI Governance and Risk Management in National Security;” to directing the US Government to collaborate with allies and partner nations to “establish a stable, responsible, and rights-respecting governance framework” so that AI is used in ways that adhere to international law and respect human rights. More details can be read in the Fact Sheet released by the White House.
This is a good step to continue the adoption of AI in the operations of the federal government, as well as to do so in a socially responsible way. However, more remains to be done. CRA will monitor the actions taken around AI at the federal level and we will continue to advocate that fundamental AI research, as well as research into its use and applications, must be a key part of any national strategy around this technology.
Update 9/26/24: The Senate approved the CR last night and sent the bill to President Biden to be signed into law.
Original Post: With less than a week until the beginning of Fiscal Year 2025 (October 1st), Congress is rushing to pass a continuing resolution (or CR) to avoid a government shutdown. Over the weekend, Congressional leaders announced a bicameral, bipartisan compromise to pass a relatively clean CR that would extend funding authority, at FY24 levels, to December 20th. This is likely the best-case scenario given the situation, as House Speaker Johnson (R-LA) originally floated a March 2025 CR with a politically toxic voting amendment attached to it. A clean CR, which does not contain politically contentious policy provisions or funding changes, avoids a messy political fight and the possibility of a government shutdown until after the election.
However, confidence that Congress will finalize FY25 by December 20th would be misplaced. As we have said repeatedly during the budget process, the outcome of the November elections will directly impact how FY25 is closed out. If the political calculus is relatively unchanged after the election (meaning a split in control of Congress and Democrats retain the White House), it is likely that this Congress will want to close out FY25 to give the 119th Congress a clean slate. However, if either side gains a clear advantage in the election, the likely outcome will be punting the matter into 2025, where that side will have more say over the funding details. We will have to wait for events to play out.
On September 11th, the House Science, Space, and Technology Committee passed nine pieces of legislation covering a diverse set of topics around artificial intelligence (AI). The bills now go to the full House of Representatives for consideration and potential passage into law.
NAIRR, a cyberinfrastructure resource proposed by a Congressionally established task force of the same name, was started as a pilot program by NSF at the beginning of 2024. The goal of NAIRR, which will be run by NSF and overseen by an interagency steering committee, is to provide “free or low-cost access to datasets and computing resources for development of AI workflows,” helping to democratize the development and use of artificial intelligence. The bill would authorize NAIRR to receive $430 million a year for five years (FY2025 to FY2030).
Of all the bills the Science Committee moved on the 11th, this is the most likely to become law. That is because there is a Senate version that was passed by the Senate Commerce Committee over the summer and it enjoys wide, bipartisan support in both chambers of Congress. However, expectations on that happening should be kept in check, as there are few legislative working days left in the year and Congress does not have many legislative vehicles to move this to passage. If it were to move, it would likely do so as an amendment in the annual defense policy bill. There are also rumors of an AI supplemental funding bill during the lame duck session of Congress; however, this author would caution that until we know the outcome of the November elections, such an effort is unlikely.
H.R. 9402, the NSF AI Education Act of 2024
This bill supports NSF’s education and professional development mission related to AI. First, it allows NSF to support scholarships and fellowships in AI, and specifically includes community colleges for support. It also directs NSF to support professional development for students, teachers, faculty, and industry professionals. This includes supplements to students and faculty to attain skills, training, or education in AI, as well as fellowships for industry and school professionals.
Next, this legislation establishes a “Centers of AI Excellence” program at NSF. The agency is directed to run the program in coordination with NIST’s Regional Technology Hubs, while also leveraging the NSF Engines program and other NSF efforts it deems necessary and useful. The establishment of such a center is to “enhance educational outcomes and drive workforce development by integrating artificial intelligence into teaching, learning, and community engagement.” The bill’s language specifically adds community colleges and “area center and technical educational schools” to the list of groups who can take part in the Center program.
The legislation also directs NSF to make awards to promote research regarding teaching models and materials for AI and its integration into classrooms, teaching, and learning for Pre-K through grade 12 students who are from low-income, rural, or Tribal populations. Finally, the legislation incorporates AI skills development into the National STEM Teacher Corps.
H.R. 9211, the LIFT AI Act
The LIFT AI Act would allow NSF to make awards to institutions of higher education or nonprofit organizations to support research activities to develop educational curricula and evaluation methods for AI literacy at the K-12 level. It also allows NSF to carry out these activities through new or existing funding; it does not provide additional funding authorizations.
There is also a sense of Congress clause that talks about the importance of AI literacy in K-12 for the nation’s future workforce and how it underpins the country’s economic and national security. Such clauses don’t have much legal strength, but they provide a useful rhetorical point for the research agencies to say “Congress deems this subject important and that’s why we’re funding these activities.”
H.R. 9403, the Expanding AI Voices Act
This bill directs NSF to broaden participation and capacity in AI research, education, and workforce development. It does this through competitive awards to institutions that are not among the top 100 institutions (as determined by Federal R&D expenditures during the 3-year period prior to the year of the award), as well as to HBCUs, MSIs, Tribal colleges or universities, or consortia of any of these entities.
An eligible institution may use the funds to carry out:
Development or expansion of research programs in AI and related disciplines.
Faculty recruitment and professional development in AI.
Bridge programs focused on preparing post-baccalaureate students for graduate programs in AI.
Provision or brokering of access to research resources, including computing resources, networking, data facilities, and software engineering support for AI R&D.
Community building activities to foster mutually beneficial public-private collaboration with Federal research agencies, industry, Federal laboratories, academia, and nonprofit organizations.
Development and hosting of intra- or inter-institutional workshops to broaden workforce participation in AI.
Activities to integrate ethical and responsible practices and principles into education programs in AI.
Other activities necessary to build research capacity, education pathways, and workforce development pathways in AI.
NSF is then directed, when performing outreach to the community about this program, to take into account all regions of the country, and to especially consider people from underserved communities and groups historically underrepresented in STEM. No additional funding authorization is provided.
H.R. 9215, the Workforce for AI Trust Act
This bill is meant to “facilitate a workforce of trained experts to build trustworthy AI systems.” The legislative language is split between sections for NSF and NIST.
For NSF, it allows the agency to support graduate and postdoc research fellowships across disciplines. The language specifically includes the humanities and social sciences among the fields that should be covered. The fellowships are meant for the “integration of ethical and responsible practices and principles into the design, development, training, deployment, evaluation, and understanding of artificial intelligence systems.” The bill also directs NSF to make awards for the development and hosting of intra- and inter-institutional workshops on integrating perspectives and skills from multiple disciplines into the deployment, evaluation, and understanding of AI systems. Finally, it directs NSF to integrate these perspectives into the agency’s peer review process.
For the NIST section of the bill, it amends NIST’s AI mission to have the agency support education and workforce development activities to expand the AI workforce. This includes careers related to helping organizations govern, map, measure, and manage AI related risks, including testing, evaluation, verification, and validation of AI systems.
H.R. 9466, the AI Development Practices Act
This bill directs NIST to “catalog and evaluate emerging practices and norms for communicating certain characteristics of artificial intelligence systems, including relating to transparency, robustness, resilience, security, safety, and usability, and for other purposes.” This is perhaps the most esoteric bill that the committee considered, but it covers an essential function of the federal agency whose mission is to assemble standards for industry. The language makes clear that any guidance that NIST develops must remain voluntary.
H.R. 9497, the AI Advancement and Reliability Act
This bill establishes a “Center for AI Advancement and Reliability” at NIST in order to ensure US leadership in, “research, development, and evaluation of the reliability, robustness, resilience, security, and safety of artificial intelligence systems.” The center is to coordinate with NSF, OSTP, DOE, DOD, DHS, and other departments or agencies as it considers appropriate. The bill also directs NIST to establish a consortium of stakeholders from academia, industry, and civil society. The center is authorized to receive $10 million for Fiscal Year 2025.
H.R. 9197, the Small Business Artificial Intelligence Advancement Act
This is a small business assistance bill that directs NIST to provide guidance to help such entities utilize advances in the AI marketplace.
H.R. 9194, the Nucleic Acid Screening for Biosecurity Act
Finally, the shortest bill the Science Committee considered is a biological sciences bill that touches on the potential impact AI could have on nucleic acid screening.
Today six leading organizations — AAAI, ACM, CRA, IEEE-USA, SIAM, and USENIX — representing more than 400,000 people in computing, information technology, science, and innovation across US industry, academia, and government, joined together to call on Congressional leaders of both parties to fully fund the research agencies contained in the CHIPS and Science Act of 2022:
September 11, 2024
The Honorable Mike Johnson
Speaker of the House
United States House of Representatives
Washington, DC 20515
The Honorable Mitch McConnell
Republican Leader
United States Senate
Washington, DC 20515
The Honorable Chuck Schumer
Majority Leader
United States Senate
Washington, DC 20515
The Honorable Hakeem Jeffries
Democratic Leader
United States House of Representatives
Washington, DC 20515
Dear Speaker Johnson, Leader Schumer, Leader McConnell, and Leader Jeffries:
As six leading organizations representing more than 400,000 people in computing, information technology, science, and innovation across US industry, academia, and government, we urge Congress to appropriate the vital funding levels authorized in the CHIPS and Science Act of 2022 for the National Science Foundation (NSF), the Department of Energy’s (DOE) Office of Science, the National Institute of Standards & Technology (NIST), and other critical technology and innovation-driven agencies.
For several years and across both parties, Congress has rightly prioritized U.S. leadership in the global future of critical technologies and industries. Investments in fundamental R&D across the federal government foster innovative breakthroughs, drive job growth, and ensure the country’s national and economic security amidst growing global competition.
The U.S. is the current world leader in R&D of critical and emerging technologies because of our dynamic research ecosystem, a key component of which is the federal investment in fundamental research. We have seen time and time again, from the Internet to exascale computing to artificial intelligence, that breakthroughs supported by the federal research agencies pay huge dividends to the country and help launch new sectors of the economy. By one measure, the return on investment of non-defense government R&D is between 150 and 300 percent. All Americans should be proud of that record of accomplishment and the return on their investment.
But the nation’s commitment to fundamental research, particularly in the computing and IT fields, has been slipping. Significant cuts to the budgets of NSF, NIST, and DOE in FY2024 will have a serious impact on the future of U.S. leadership in technological innovation. Fewer good ideas and new research visions will see investments, fewer graduate students will be produced, and the innovation economy driven by what the National Science Board has described as the “extraordinarily productive interplay between academia, government, and industry” will move forward more slowly – at a time when our global adversaries are increasing their investments. A nation that is striving to lead the world in artificial intelligence, quantum computing, high performance computing, and related fields cannot do that with cuts or flat funding at NSF, DOE, NIST, and the other research agencies.
Congress should fund these agencies at the highest amount possible in the Fiscal Year 2025 budget and look at opportunities for supplemental funding, such as in artificial intelligence legislation, to meet our emerging technology and computing research, workforce, and infrastructure needs. We are not alone in calling on the nation’s leaders to prioritize the commitments made to the country’s research efforts at NSF, DOE, NIST, and the other federal research agencies, at the levels authorized in the CHIPS and Science Act of 2022 – the National Science Board, the Science and Technology Action Committee, and chief technology officers from across industry have recently made similar calls. This is not a partisan issue; it must be an American priority. We cannot risk missing this moment and losing our hard-fought position as the world leader in scientific research to our competitors.
Association for the Advancement of Artificial Intelligence (AAAI) aaai.org
Trump Administration Orders Pause in All Federal Grants
/In: Funding, Impediments to Research Highlights, Information Technology R&D Highlights, Policy, R&D in the Press, Research /by Brian Mosley
UPDATE #2 1/29: The Trump Administration has rescinded the original order for the funding freeze.
UPDATE #1 1/29: Yesterday afternoon, the Trump Administration released a new memo to clarify the earlier one on pausing federal grant and other financial assistance programs. Punchbowl News has the memo itself and several other news outlets are summarizing its contents and directions from the White House. It says that, “any program not implicated by the President’s Executive Orders is not subject to the pause,” and then specifically enumerates the executive orders from the original memo that apply. It continues by saying, “any program that provides direct benefits to individuals is not subject to the pause,” and that the, “guidance establishes a process for agencies to work with OMB to determine quickly whether any program is inconsistent,” with the listed Executive Orders. The memo also makes the argument that this is not a freeze on all Federal financial assistance programs and that it is not impoundment. While this clarification is helpful, it does not clear up all the confusion from the original order about which specific programs are impacted and for how long.
As well, just before the 5pm Tuesday deadline for the original memo to take effect, a federal judge in Washington, DC temporarily blocked the Trump administration from enforcing this directive. This halts the freeze on funding until Monday, February 3rd. Additionally, a coalition of 21 state attorneys general announced legal action against the White House budget office over this matter.
In related news, NSF has created an Executive Order Implementation webpage to ensure the widest dissemination of information and updates on the subject. The other research agencies are likely to set up similar pages soon.
CRA is continuing to actively monitor the situation and we are engaging with our stakeholders to stay informed. We will continue to provide updates as the situation changes.
Original post: Last night, the Trump Administration issued a memo titled “Temporary Pause of Agency Grant, Loan, and Other Financial Assistance Programs.” The order from the Office of Management and Budget (OMB) requires all federal agencies to, “temporarily pause all activities related to obligation or disbursement of all Federal financial assistance.” The stated purpose is, “to review agency programs and determine the best uses of the funding for those programs consistent with the law and the President’s priorities.” The pause is ordered to take effect today, January 28th, at 5pm and is to remain in effect until February 10th, though it could be extended.
The OMB memo requires every federal agency to pause:
OMB also requires all, “Federal agencies to pause all activities associated with open NOFOs (Notice of Funding Opportunity), such as conducting merit review panels.” The order explicitly exempts mandatory spending like Social Security and Medicare, as well as, “assistance provided directly to individuals.” This order will likely impact much, if not all, discretionary funding that the Federal Government spends, including research funding. However, there is significant confusion about how the memo will be implemented and whether it is legal.
This is likely the first move in the expected impoundment actions that the Trump Administration has been telegraphing. Impoundment is when the President does not spend federal dollars that have been legally appropriated by Congress. With a few exceptions, the Executive Branch is required by the Constitution to spend funds as the Legislative Branch directs (this is where the term “power of the purse” comes from). Legally, this situation will turn on whether this is a temporary pause to perform a review or not. Already key members of Congress have called on the Administration to stop this action and, “ensure all federal resources are delivered in accordance with the law.”
CRA is monitoring the situation closely and will report out any developments. In our conversations with policymakers, we will explain the damaging impacts this order will have on the nation’s computing research community. As well, we strongly encourage all members of our community to contact their Senators and Representatives in Congress and tell them how this action will impact their constituents and districts.
First Slate of Trump Administration Executive Orders and Actions Released; Several Impact the Computing Research Community
/In: Impediments to Research Highlights, Information Technology R&D Highlights, People, Policy, R&D in the Press, Research /by Brian Mosley
On January 20th, the inaugural day of the Trump Administration, President Trump unveiled a series of executive orders and actions. Executive orders are documents or actions issued by the president to manage the operations of the Federal Government, specifically the executive branch departments and agencies. While these EOs have the force of law, they remain in effect only until they are rescinded, either by the issuing administration or a succeeding one, or are deemed unconstitutional by the courts. Almost all of these orders fulfill campaign promises made by Trump during the 2024 election. Several of these EOs impact the computing research community, either directly or indirectly.
The “Ending Radical and Wasteful Government DEI Programs and Preferencing” EO orders the Director of the Office of Management and Budget (OMB), assisted by the US Attorney General and the Director of OPM, to “coordinate the termination of all discriminatory programs, including illegal DEI and “diversity, equity, inclusion, and accessibility” (DEIA) mandates, policies, programs, preferences, and activities in the Federal Government, under whatever name they appear.” Within 60 days each agency, department, or commission shall terminate, “all DEI, DEIA, and “environmental justice” offices and positions (including but not limited to “Chief Diversity Officer” positions); all “equity action plans,” “equity” actions, initiatives, or programs, “equity-related” grants or contracts; and all DEI or DEIA performance requirements for employees, contractors, or grantees.” Additionally, they are to provide the White House with a list of:
There is an expectation that all federal employees who work in DEI offices will be furloughed or reassigned by the end of this week. For all intents and purposes, this order outlaws DEI initiatives within the federal government.
The order that is widest in scope will also impact the community directly: Initial Rescission of Harmful Executive Orders and Actions. This order revokes a large number of executive orders issued by President Biden, the most significant being Executive Order 14110 of October 30, 2023 (AKA: “Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence” or the AI EO). The full impact of rescinding the AI EO is unclear at the moment, as several actions ordered in the EO have already occurred. But the fate of such things as the NAIRR pilot project at NSF and the AI Safety Institute at NIST is unclear at this time. Both programs enjoy wide, bipartisan support in Congress and the tech industry; if they are required to shut down, that could be an unintended outcome of revoking the original order. CRA is consulting with our friends and allies in Congress and the Executive Branch to get more information on the full impact of revoking this order.
This order also requires the National Security Advisor to review all National Security Memos (NSMs) issued by the Biden Administration and make recommendations on which should be revoked or kept in force. This could impact the NSM covering artificial intelligence.
Several other executive orders were issued that will impact the computing research community in less direct or obvious ways. A select list of these orders is:
Establishing and Implementing the President’s “Department of Government Efficiency” – This order sets up the so-called “Department of Government Efficiency” within the executive office of the President (i.e., the White House). It does this by renaming and re-tasking the United States Digital Service, which was an office set up by President Obama to provide consultation services to federal agencies on information technology. The DOGE office is given an 18-month timeline to do its work and will terminate on July 4, 2026. The effort, which has had an unclear directive until now, is tasked with “modernizing Federal technology and software to maximize efficiency and productivity.” Intriguingly, the DOGE effort is not tasked with recommending the removal of regulations or funding for specific programs; that had been a stated goal but seems to have fallen by the wayside. There is always a possibility that its scope could change with a future EO, though.
Restoring Accountability to Policy-Influencing Positions within the Federal Workforce and Hiring Freeze – These two orders will likely have a number of unintended impacts, particularly at the federal research funding agencies. With the “Restoring Accountability” EO, the likely impact will be shifting most, if not all, government scientists and rotators into the new “Schedule Policy/Career” (formerly called Schedule F) classification of the federal workforce. This has been a concern since Schedule F was issued toward the end of the first Trump Administration. This action is meant to classify career federal employees more like political appointees, removing civil service protections and allowing for easier removal from their positions. This could have a serious impact on how career government scientists perform their jobs. As for the hiring freeze EO, it is initially set as a 90-day period, while OMB and the DOGE office develop a workforce reduction plan. However, it’s easy to see that being extended for an indeterminate amount of time. An extended freeze could impact rotators at the research agencies and the processes that they oversee.
Protecting the United States from Foreign Terrorists and other National Security and Public Safety Threats – This EO covers immigration and could impact the high-skilled immigration changes taken by the Biden Administration in 2022. The order calls on the Departments of State, Justice, and Homeland Security, as well as the Director of National Intelligence, to conduct a review of all immigration actions taken by the Biden Administration and reestablish a “uniform baseline” that existed before that Administration came into office in 2021. It also orders those same departments to identify countries for which, “vetting and screening information is so deficient as to warrant a partial or full suspension on the admission of nationals from those countries.” It further orders an evaluation of, “all visa programs to ensure that they are not used by foreign nation-states or other hostile actors to harm the security, economic, political, cultural, or other national interests of the United States.” Given the importance of high-skilled immigration to the computing, IT, and CS research communities, any changes in this area could have a significant impact.
This is just the first slate of Executive Orders from the Trump Administration. We are expecting more in the coming weeks and months. As well, these orders may be expanded upon or have unintended impacts that have not been accounted for yet. CRA is still in the process of reviewing these orders and monitoring the unfolding situation. Please check back for more updates.
CRA Launches 2024-2025 Quadrennial Papers Which Explore Computing Research Issues that have the Potential to Address National Priorities
/In: Computing Community Consortium (CCC), Computing Education, CRA, People, Policy, R&D in the Press, Research /by Brian Mosley
The Computing Research Association (CRA) launched the first set of white papers in its 2024-2025 Quadrennial Papers series. We publish this series every four years to explore areas and issues in computing research that have the potential to address national priorities. These Quad Papers aim to provide a comprehensive overview of the computing research field, detailing potential research directions, challenges, and recommendations for policymakers and the computing research community.
Topics covered in this first set of papers include:
CRA released six papers in this initial wave and plans to release additional papers in the coming weeks.
The computing research community’s active engagement with elected officials and research agency policymakers ensures that these stakeholders remain informed about the latest breakthroughs and challenges in the field. This engagement represents one of the many ways that the CRA interacts with policymakers in the science and technology policy arena.
FY25 Appropriations Update: Budget Punted Till March, Final Funding Numbers Remain Uncertain
/In: Defense R&D Highlights, Funding, FY25 Appropriations, Information Technology R&D Highlights, Research /by Brian Mosley
As the 2024 calendar year was winding down, Congress rushed to avoid a lapse in funding authority, which would have led to a government shutdown. Despite many political complications, cooler heads prevailed and a continuing resolution was passed into law. The new CR allows the federal departments and agencies to continue to operate, at the previous year’s funding levels, until March 14th. Unfortunately, that is the only certainty that the situation created.
As we have mentioned before while tracking the Fiscal Year 2025 (FY25) budget process, the outcome of the November 2024 elections is dictating the final outcome for the budget process. Since Republicans now have unified control of the federal government, they will be able to decide on final funding levels for FY25 and likely beyond. They are telegraphing a budget reconciliation strategy which has the potential to pass into law sweeping changes in federal spending. Details are sparse at the moment but the prospects for significant, across-the-board cuts to non-defense funding are very likely.
We mention “non-defense funding” because almost all of the federal research agencies, such as NSF, DOE SC, NASA, NIH, and NIST, that support researchers in the computing fields are categorized as non-defense spending.
However, in order for these funding cuts to come to pass, Congressional Republicans, particularly in the closely divided House of Representatives, must remain united. If they stay together, they will be able to pass their agenda into law; if they continue to operate as they have for the last two years, they are much more likely to falter and rely on Democratic votes to handle must-pass legislation. Time will tell, but the vote for the Speakership of the House will be an early indicator of how Congress will operate for the next two years.
This outlook does not cover the administration of incoming President Trump. The potential challenges, complications, and opportunities of the next administration will be covered in a future Policy Blog post. Put simply, there are many areas of concern but there are also potential positives for the computing research community. Again, time will tell.
CRA is monitoring the budget situation very closely and we will report out any developments over the next several months. And we will continue to impress on elected officials and policymakers of both parties that federal support for fundamental computing research is essential for the well-being of the nation.
President-Elect Trump Names OSTP Director and Other High Level Science & Tech Policy Staff
/In: Artificial Intelligence, Information Technology R&D Highlights, People, Policy, R&D in the Press, Research /by Brian Mosley
Just before the holidays, President-Elect Trump announced that Michael Kratsios will be nominated as the Director of the Office of Science & Technology Policy (OSTP). Informally known as the President’s Science Advisor, the Director of OSTP oversees the office tasked with providing advice to the President, and the Executive Office of the President, on matters related to science and technology, as well as coordinating the Administration’s science and tech policy among the assorted federal research agencies. Mr. Kratsios comes into the position with extensive experience in the first Trump Administration, where he served both at OSTP and the Defense Department in senior research policy roles. He will also serve as an assistant to the President for science and technology.
It was also announced that computer scientist Lynne Parker, of the University of Tennessee, Knoxville, was named counselor to Kratsios and executive director of the President’s Council of Advisors on Science and Technology (PCAST). Dr. Parker previously served in OSTP from 2018 to 2022, in both the Trump and Biden Administrations, and at NSF in the CISE Directorate as Division Director for Information and Intelligent Systems from 2015 to 2016. In her previous role at OSTP, she led national AI policy efforts as Deputy Chief Technology Officer of the United States, the founding Director of the National AI Initiative Office, and Assistant Director for AI.
Other high level OSTP staff announced were Sriram Krishnan, who will serve as Senior Policy Advisor for Artificial Intelligence, and Bo Hines, who will be the Executive Director of the new Presidential Council of Advisers for Digital Assets (informally dubbed the “Crypto Council”). This follows up on the news earlier in December that David Sacks will serve as advisor to the president for AI and cryptocurrency and co-chair to PCAST.
These advisor and staff announcements are encouraging news, as this means that OSTP’s leadership will be in place much sooner than in Trump’s first term, when it was over 2 years before the office had a confirmed director. The news about Mr. Kratsios and Dr. Parker, in particular, is good news, as both are well known to the computing research community and are well respected in their fields. Exactly when they will be in place is hard to tell at the moment; Kratsios’ nomination will need to be confirmed by the Senate before he can take up his position full time. But the other positions should be able to start soon after the Administration is sworn into office on January 20th.
This news also drives home that artificial intelligence and blockchain technology will be important issues in the second Trump Administration’s science and technology portfolio. CRA will continue to impress upon elected officials and policymakers the essential role that fundamental research will play in the development and advancement of both technologies, as well as the role other computing research disciplines will play in the future prosperity of the country.
Bipartisan House Task Force on Artificial Intelligence Releases Wide-Ranging Policy Report
/In: Artificial Intelligence, Information Technology R&D Highlights, Policy, R&D in the Press, Research /by Brian Mosley
The Bipartisan House Task Force on Artificial Intelligence released its long-awaited report today. The report contains 66 key findings and 89 recommendations, organized into 15 chapters that cover a diverse array of policy areas, all to, “ensure America continues to lead the world in responsible AI innovation.” It represents the culmination of almost a year of consultations with, “business leaders, government officials, technical experts, academics, and legal scholars.” This report is similar to the Senate AI Roadmap Report released in May.
The Task Force adopted several high-level principles to frame their policy analysis:
These principles are reflected in the chapter on, “Research, Development, & Standards,” which is of most immediate importance to CRA’s members. In it, the Task Force recommends that Congress continue to monitor and evaluate the impact that AI will have on different industries and the nation as a whole. Additionally, it calls to support fundamental R&D in the field of AI at American universities. However, it does not provide a specific amount or target that Congress should aim for to support this field; in comparison, the Senate report recommended at least $32 billion a year to support non-defense AI research.
The chapter on research also contains several wide-ranging recommendations that can impact the AI research community. They cover such topics as increasing tech transfer at universities; promoting the development of infrastructure and data to enable AI research; continuing engagement in international standards development; and promoting public-private partnerships for AI R&D.
The report ranges from sections on policy areas of concern around AI (such as civil rights and civil liberties, education and workforce, intellectual property, and open and closed systems) to sector-specific chapters (healthcare, small business, agriculture, and energy usage and data centers). At over 250 pages, it is a comprehensive report.
The release of this report is a good step in the direction of Congress tackling the complexities of AI. However, it is not the last action. As the Task Force says in their report, this is, “certainly not the final word on AI issues for Congress. Instead, it should be viewed as a tool for identifying and evaluating AI policy proposals.”
As with the Senate’s May report, the House Task Force is sending the right signals and saying the right things about the importance of research in handling the impact of artificial intelligence on the nation. What is needed now is to follow up this report with legislative action. CRA will continue to monitor any developments in this space and will advocate for the important role that the research community plays in any policy discussions around artificial intelligence.
White House Releases National Security Memorandum on Artificial Intelligence
/In: Artificial Intelligence, Defense R&D Highlights, Policy, R&D in the Press /by Brian Mosley
Today, the Biden Administration released a National Security Memorandum (NSM) on artificial intelligence. At a high policy level, the NSM directs the executive branch to take steps to, “(1) ensure that the United States leads the world’s development of safe, secure, and trustworthy AI; (2) harness cutting-edge AI technologies to advance the U.S. Government’s national security mission; and (3) advance international consensus and governance around AI.” According to the White House, the NSM’s, “fundamental premise is that advances at the frontier of AI will have significant implications for national security and foreign policy in the near future.”
This memo has been long expected, as it was mentioned at the end of the AI Executive Order, which was released at the end of October 2023.
The NSM takes several steps to advance the use and adoption of AI for national security purposes. These range from directing improvements to the security and diversity of the semiconductor supply chain; to “formally designating the AI Safety Institute” at NIST as the primary point of contact between US industry and the federal government; to doubling down on the NAIRR pilot; to directing the creation of a “Framework to Advance AI Governance and Risk Management in National Security;” to directing the US Government to collaborate with allies and partner nations to, “establish a stable, responsible, and rights-respecting governance framework,” to use AI in ways that adhere to international law and respect human rights. More details can be read in the Fact Sheet released by the White House.
This is a good step to continue the adoption of AI in the operations of the federal government, as well as to do so in a socially responsible way. However, more remains to be done. CRA will monitor the actions taken around AI at the federal level and we will continue to advocate that fundamental AI research, as well as research into its use and applications, must be a key part of any national strategy around this technology.
FY25 Appropriations Update: With End of Fiscal Year Approaching, Congress Rushes to Pass a Stopgap
/In: Funding, FY25 Appropriations, Research /by Brian Mosley
Update 9/26/24: The Senate approved the CR last night and sent the bill to President Biden to be signed into law.
Original Post: With less than a week till the beginning of Fiscal Year 2025 (October 1st), Congress is rushing to pass a continuing resolution (or CR) to avoid a government shutdown. Over the weekend, Congressional leaders announced a bicameral, bipartisan compromise to pass a relatively clean CR that would extend funding authority, at FY24 levels, to December 20th. This is likely the best-case scenario given the situation, as House Speaker Johnson (R-LA) originally floated a March 2025 CR with a politically toxic voting amendment attached to it. A clean CR, which does not contain politically contentious policy provisions or funding changes, avoids a messy political fight and the possibility of a government shutdown until after the election.
However, confidence that Congress will finalize FY25 by December 20th would be misplaced. As we have said repeatedly during the budget process, the outcome of the November elections will directly impact how FY25 is closed out. If the political calculus is relatively unchanged after the election (meaning a split in control of Congress and Democrats retain the White House), it is likely that this Congress will want to close out FY25 to give the 119th Congress a clean slate. However, if either side gains a clear advantage in the election, the likely outcome will be punting the matter into 2025, where that side will have more say over the funding details. We will have to wait for events to play out.
House Science Committee Passes Nine Bills to Support the Advancement of Artificial Intelligence
/In: Artificial Intelligence, Diversity in Computing, Policy, Research, STEM /by Brian Mosley
On September 11th, the House Science, Space, and Technology Committee passed nine pieces of legislation covering a diverse set of topics around artificial intelligence (AI). The bills now move to the full House of Representatives for consideration and potential passage into law.
The nine bills are:
H.R. 5077, the CREATE AI Act
This is perhaps the piece of legislation of most importance to the computing research community, as it would establish the National Artificial Intelligence Research Resource (NAIRR), provide a governing structure for the resource, and authorize it to receive funding. If you’re a regular reader of the Policy Blog, you’ll recognize this bill from previous posts on AI legislation; it was originally introduced in 2023.
NAIRR, a cyberinfrastructure resource proposed by a Congressionally established task force of the same name, was started up as a pilot program by NSF at the beginning of 2024. The goal of NAIRR, which will be run by NSF and overseen by an interagency steering committee, is to provide, “free or low-cost access to datasets and computing resources for development of AI workflows,” helping to democratize the development and use of artificial intelligence. The bill would authorize NAIRR to receive $430 million a year from FY2025 through FY2030.
Of all the bills the Science Committee moved on the 11th, this is the most likely to become law. That is because a Senate version was passed by the Senate Commerce Committee over the summer, and the bill enjoys wide, bipartisan support in both chambers of Congress. However, expectations should be kept in check, as there are few legislative working days left in the year and Congress does not have many legislative vehicles to move this to passage. If it were to move, it would likely do so as an amendment to the annual defense policy bill. There are also rumors of an AI supplemental funding bill during the lame duck session of Congress; however, this author would caution that until we know the outcome of the November elections, such an effort is unlikely.
H.R. 9402, the NSF AI Education Act of 2024
This bill supports NSF’s education and professional development mission related to AI. First, it allows NSF to support scholarships and fellowships in AI, and specifically includes community colleges for support. It also directs NSF to support professional development for students, teachers, faculty, and industry professionals. This includes supplements to students and faculty to attain skills, training, or education in AI, as well as fellowships for industry and school professionals.
Next, this legislation establishes a “Centers of AI Excellence” program at NSF. The agency is directed to run the program in coordination with NIST’s Regional Technology Hubs, while also leveraging the NSF Engines program and other NSF efforts it deems necessary and useful. The establishment of such a center is to, “enhance educational outcomes and drive workforce development by integrating artificial intelligence into teaching, learning, and community engagement.” The bill’s language specifically adds community colleges and “area center and technical educational schools” to the list of groups that can take part in the Center program.
The legislation also directs NSF to make awards to promote research regarding teaching models and materials for AI and its integration into classrooms, teaching, and learning for Pre-K through grade 12 students who are from low-income, rural, or Tribal populations. Finally, the legislation incorporates AI skills development into the National STEM Teacher Corps.
H.R. 9211, the LIFT AI Act
The LIFT AI Act would allow NSF to make awards to institutions of higher education or nonprofit organizations to support research activities to develop educational curricula and evaluation methods for AI literacy at the K-12 level. It also allows NSF to carry out these activities through new or existing funding; it does not provide additional funding authorizations.
There is also a sense of Congress clause that talks about the importance of AI literacy in K-12 for the nation’s future workforce and how it underpins the country’s economic and national security. Such clauses don’t have much legal strength, but they provide a useful rhetorical point for the research agencies to say “Congress deems this subject important and that’s why we’re funding these activities.”
H.R. 9403, the Expanding AI Voices Act
This bill directs NSF to broaden participation and capacity in AI research, education, and workforce development. It does this through competitive awards to institutions outside the top 100 in Federal R&D expenditures (as measured over the 3-year period prior to the year of the award), HBCUs, MSIs, tribal colleges or universities, or consortia of any of these entities.
An eligible institution may use the funds to carry out a range of activities specified in the bill.
NSF is then directed, when performing outreach to the community about this program, to take into account all regions of the country, and to especially consider people from underserved communities and groups historically underrepresented in STEM. No additional funding authorization is provided.
H.R. 9215, the Workforce for AI Trust Act
This bill is meant to, “facilitate a workforce of trained experts to build trustworthy AI systems.” The legislative language is split between sections for NSF and NIST.
For NSF, it allows the agency to support graduate and postdoc research fellowships across disciplines. The language specifically names the humanities and social sciences among the fields to be included. The fellowships are meant for the, “integration of ethical and responsible practices and principles into the design, development, training, deployment, evaluation, and understanding of artificial intelligence systems.” The bill also directs NSF to make awards for the development and hosting of intra- and inter-institutional workshops on integrating perspectives and skills from multiple disciplines toward the deployment, evaluation, and understanding of AI systems. Finally, it directs NSF to integrate these perspectives into the agency’s peer review process.
For the NIST section of the bill, it amends NIST’s AI mission to have the agency support education and workforce development activities to expand the AI workforce. This includes careers related to helping organizations govern, map, measure, and manage AI-related risks, including testing, evaluation, verification, and validation of AI systems.
H.R. 9466, the AI Development Practices Act
This bill directs NIST to “catalog and evaluate emerging practices and norms for communicating certain characteristics of artificial intelligence systems, including relating to transparency, robustness, resilience, security, safety, and usability, and for other purposes.” This is perhaps the most esoteric bill that the committee considered, but it covers an essential function of the federal agency whose mission is to assemble standards for industry. The language makes clear that any guidance that NIST develops must remain voluntary.
H.R. 9497, the AI Advancement and Reliability Act
This bill establishes a “Center for AI Advancement and Reliability” at NIST in order to ensure US leadership in, “research, development, and evaluation of the reliability, robustness, resilience, security, and safety of artificial intelligence systems.” The center is to coordinate with NSF, OSTP, DOE, DOD, DHS, and other departments or agencies as it considers appropriate. The bill also directs NIST to establish a consortium of stakeholders from academia, industry, and civil society. The center is authorized to receive $10 million for Fiscal Year 2025.
H.R. 9197, the Small Business Artificial Intelligence Advancement Act
This is a small business assistance bill that directs NIST to provide guidance to help small businesses utilize advances in the AI marketplace.
H.R. 9194, the Nucleic Acid Screening for Biosecurity Act
Finally, the shortest bill the Science Committee considered is a biological sciences measure addressing the potential impact AI could have on nucleic acid screening.
Six Leading Computing Organizations Call on Congress to Fully Fund the CHIPS & Science Act
/In: CRA, Funding, R&D in the Press, Research, Statements /by Brian Mosley
Today six leading organizations — AAAI, ACM, CRA, IEEE-USA, SIAM, and USENIX — representing more than 400,000 people in computing, information technology, science, and innovation across US industry, academia, and government, joined together to call on Congressional leaders of both parties to fully fund the research agencies contained in the CHIPS and Science Act of 2022:
Download the letter.