CCC at AAAS 2020

The Computing Community Consortium (CCC) has attended and hosted sessions at the American Association for the Advancement of Science (AAAS) Annual Meeting since 2013. Below you can find links to slides and resources from the 2020 sessions and links to related CCC white papers and resources. To learn more about the 2020 AAAS Meeting visit the webpage, and find recaps of the 2020 AAAS sessions here on the CCC blog.

New Approaches to Fairness in Automated Decision Making

Friday, February 14, 8:00 – 9:30 AM

Synopsis: Critical decisions are increasingly being made by machine-learning algorithms based on the massive data trails that we all leave behind. Such decisions affect issues from college admissions and bank loans, to sentencing and police deployment. Concerns have been raised about the interpretability, transparency, and fairness of these algorithms. In response, an exciting mathematical theory of fairness is emerging that addresses topics such as defining fairness, ways of designing decision-making algorithms to incorporate fairness requirements, and incentivizing decision makers to be fair. The topics discussed in this session will provide a precise understanding of frictions to fairness arising from causes such as mislabeled training data, use of inappropriate features, insufficient data, feedback loops, and the computational difficulty of being fair. This understanding will further inform attendees on how to achieve fair outcomes by avoiding obvious pitfalls and providing appropriate incentives.
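As an illustration of one widely used fairness definition (an example added here, not taken from the session itself), demographic parity asks that the rate of positive decisions be similar across demographic groups. A minimal sketch with made-up decision data:

```python
# Demographic parity: compare positive-decision rates across groups.
# Hypothetical decisions (1 = approve, 0 = deny) keyed by group.
decisions = {
    "group_a": [1, 1, 0, 1, 0, 1, 1, 0],
    "group_b": [0, 1, 0, 0, 1, 0, 0, 0],
}

def positive_rate(outcomes):
    """Fraction of positive (1) decisions."""
    return sum(outcomes) / len(outcomes)

rates = {g: positive_rate(d) for g, d in decisions.items()}

# Demographic parity difference: 0 means equal rates across groups.
parity_gap = max(rates.values()) - min(rates.values())
print(rates, parity_gap)
```

Variants of this gap (ratios rather than differences, or conditioning on qualifications) underlie many of the formal fairness criteria the session examines.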

You can find a recap of the session here on the CCC blog.

Speakers:

Moritz Hardt

University of California, Berkeley

New Approaches to Fairness in Automated Decision Making

What does it mean for decisions to be made fairly? This question has become especially urgent as crucial decisions about our lives are being made by algorithmic procedures using data collected about each of us. This talk will describe different goals of fairness and how algorithms should be designed to meet these goals.

Sampath Kannan

University of Pennsylvania

Decision Making by Machine Learning Algorithms

This talk will introduce machine learning. Specifically it will describe machine-learning approaches to classification and resource allocation – two tasks where fairness is an important goal.

Toniann Pitassi

University of Toronto

Recognizing and Overcoming Frictions to Fairness

This talk will develop a deeper understanding of the potential pitfalls of using machine-learning algorithms for tasks where fairness is a goal. Is the data used to train these algorithms appropriate for the desired goals? Even if sensitive attributes such as race and gender are explicitly excluded from the data, are they implicitly present in other features, and can we tell if the decisions made by these algorithms are unfair?
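One common diagnostic for the proxy problem the abstract raises (sketched here as an illustration, not as the speaker's method) is to test how well the excluded sensitive attribute can be recovered from the remaining features; high recovery accuracy suggests the attribute is implicitly present. A toy version using a per-value majority rule:

```python
from collections import Counter, defaultdict

# Proxy check: can an excluded sensitive attribute be recovered from
# a remaining feature? Hypothetical (feature, sensitive) records.
records = [
    ("zip_1", "A"), ("zip_1", "A"), ("zip_1", "B"),
    ("zip_2", "B"), ("zip_2", "B"), ("zip_2", "B"),
]

# Majority sensitive value observed for each feature value.
by_feature = defaultdict(Counter)
for feature, sensitive in records:
    by_feature[feature][sensitive] += 1
predict = {f: c.most_common(1)[0][0] for f, c in by_feature.items()}

# Recovery accuracy: a high value means the feature is a strong proxy
# for the sensitive attribute even though the attribute was excluded.
hits = sum(predict[f] == s for f, s in records)
accuracy = hits / len(records)
print(accuracy)
```

In practice the majority rule would be replaced by a trained classifier, but the logic is the same: if other features predict race or gender well, simply dropping those columns does not remove them from the data.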

Moderator:

Ann Schwartz Drobnis

Director, Computing Community Consortium

Discussant:

Jamie Morgenstern

University of Washington

Related Resources:

Using Computing to Sustainably Feed a Growing Population

Friday, February 14, 3:30 – 5:00 PM

Synopsis: In the coming decades, the world population is projected to grow significantly, increasing the demand for food in the face of climate change, workforce aging and shortage, and environmental degradation. To ensure long-term food security, it is imperative to explore emerging computing innovations such as big data, artificial intelligence, the internet of things, and cloud computing in working towards the next agricultural revolution.

Computing has already transformed agriculture. Precision agriculture uses cyber-physical systems and data science to increase yields while reducing fertilizer and pesticide runoff. Global Agricultural Monitoring uses satellite imagery to watch major crops for stress or crop failure, enabling timely interventions that reduce disruptions in the global food supply. This is only a start, and compelling new opportunities lie ahead. For example, big data may help synthesize new agricultural knowledge, support predictive decision making, and foster data-supported innovation.
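Satellite crop monitoring of the kind described above often relies on vegetation indices; NDVI, computed from near-infrared and red reflectance, is a standard example (NDVI is an illustrative addition here, not named in the synopsis):

```python
# NDVI (Normalized Difference Vegetation Index): values near 1 indicate
# dense healthy vegetation; values near 0 suggest bare soil or stress.
def ndvi(nir, red):
    return (nir - red) / (nir + red)

# Hypothetical per-pixel reflectances (nir, red) for one field.
pixels = [(0.50, 0.08), (0.45, 0.10), (0.20, 0.15)]
values = [ndvi(n, r) for n, r in pixels]

# Crude stress flag: pixels below an illustrative NDVI threshold.
stressed = [v for v in values if v < 0.3]
print(values, stressed)
```

Monitoring systems track indices like this over time and across fields, flagging anomalies for follow-up rather than treating a single threshold as definitive.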

This panel will feature the most promising computing advances to sustainably increase food production, based on the recent US Department of Agriculture’s Food and Agriculture Cyberinformatics and Tools Initiative; the congressional research service report on Big Data in U.S. Agriculture; and workshops such as the National Science Foundation’s Midwest Big Data Hub on Machine Learning from Farm to Table and Innovations at the Nexus of Food, Energy and Water Systems Data Science Workshop.

You can find a recap of the session here on the CCC blog.

Speakers:

Chandra Krintz

University of California, Santa Barbara

SmartFarm – Hybrid Cloud IoT Systems for Future Agriculture

SmartFarm investigates a novel, unifying, open-source approach to agricultural analytics and precision farming. It integrates disparate environmental sensor technologies into an on-premises, private-cloud software infrastructure that provides farmers with a secure, easy-to-use, low-cost data analysis system. SmartFarm enables farmers to extract actionable insights from their data, quantify the impact of their decisions, and identify opportunities for increasing productivity.

Ranveer Chandra

Microsoft Azure Global

AI and Internet of Things for Agriculture

Dr. Chandra will describe the Microsoft FarmBeats system, which provides an end-to-end approach to data-driven farming. We believe that data, coupled with the farmer’s knowledge, can help increase farm productivity and reduce costs. With FarmBeats we are building several unique solutions using low-cost sensors, drones, and vision and machine learning algorithms. The goal is to overcome technology adoption challenges such as limited power and Internet connectivity on farms.

Patrick Schnable

Iowa State University

Advancing Plant Science with Predictive Models and Large-scale Phenotyping

Prof. Schnable’s goal is to develop models that predict crop performance in diverse environments. Crop phenotypes such as yield and drought tolerance are controlled by genotype, environment, and their interactions. The volumes of phenotypic data needed to understand these genotype-by-environment interactions, however, remain a limiting factor. To address this limitation, we are building new sensors and robots to automatically collect large volumes of phenotypic data.
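The genotype-by-environment interaction described above can be illustrated with a toy additive model whose coefficients are invented purely for this sketch:

```python
# Toy phenotype model: yield = base + genotype effect + environment
# effect + interaction term. All coefficients are made up.
def predicted_yield(g, e, base=100.0, bg=5.0, be=3.0, bge=2.0):
    return base + bg * g + be * e + bge * g * e

# The same genotype difference produces different yield gaps in
# different environments because of the interaction term; this is
# why phenotype data must be collected across many environments.
gap_dry = predicted_yield(1, 0) - predicted_yield(0, 0)
gap_wet = predicted_yield(1, 1) - predicted_yield(0, 1)
print(gap_dry, gap_wet)
```

Estimating the interaction coefficient reliably is what drives the demand for large-scale phenotyping that the abstract describes.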

Moderator:
Lucas Joppa

Microsoft

Related Resources:

Artificial Intelligence Research: A Community Roadmap

Saturday, February 15, 8:00 – 9:30 AM

Synopsis: Decades of artificial intelligence research have produced formidable technologies that are providing immense benefit to industry, government, and society. AI systems can now translate across multiple languages, identify objects in images and video, converse about order placement, and control cars. The ubiquitous deployment of AI systems has created a trillion-dollar industry that is projected to quadruple in three years, while also exposing the need to make AI systems fair and trustworthy, as well as more competent about the world in which they and we operate. Future AI systems have the potential for transformative impact on society and will rightfully be expected to handle complex tasks and responsibilities, engage in meaningful communication, and improve awareness through experience. There are also concerns about the future of work in light of AI advancements, which demand improved public communication and adjustments to the education and training of the workforce in order to leverage the new types of jobs being created by AI technologies.

A recent study of leading AI experts, carried out by the Computing Community Consortium and the Association for the Advancement of Artificial Intelligence, concluded that achieving the full potential of AI technologies poses research challenges that will require significant sustained investment and a radical transformation of the AI research enterprise. This session presents the resulting roadmap for AI research and development over the next twenty years.

You can find a recap of the session here on the CCC blog.

Speakers:

Yolanda Gil

University of Southern California

The Transformative Potential of AI for Science

Artificial intelligence is at a critical point: we see and use AI systems regularly in daily life, but we are only seeing the tip of the iceberg. To realize the full potential of AI systems in the future, we need not only to continue but to increase AI research across all disciplines, discovering new ways for AI systems to be incorporated. This talk will discuss the need for continued research and some of the possibilities for AI in the future.

Bart Selman

Cornell University

The Need for a National AI Research Infrastructure Initiative

AI research has been under way since the early 1950s. Recent changes in the ecosystem indicate that we are at a crucial point in time where new advances, built on years of research, are arriving regularly. As the many disciplines that touch on AI continue to advance, the possibilities for progress in AI are ever increasing. This talk will cover the new paradigms affecting the AI research ecosystem and how they can affect the broader research ecosystem going forward.

Daniel Lopresti

Lehigh University

The 20-Year AI Roadmap and Its Impacts

Advances in AI are already having a tremendous impact in science and throughout all aspects of our society. This talk will present an overview of the activity that led to the 20-Year Community Roadmap for AI Research, along with its major conclusions and recommendations. It will highlight some of the significant technical challenges we face and the visions that drive them.

Moderator:
Ann Schwartz Drobnis

Director, Computing Community Consortium

Related Resources:

Detecting, Combating, and Identifying Dis and Mis-information

Saturday, February 15, 10:00 – 11:30 AM

Synopsis: The democratization of information and broad interconnectivity have had a wide range of positive, transformative impacts on society. Through social networks, individuals can stay connected and share information, medical professionals can reach patients, and access to news and scholarly publications, from both the consumer and producer perspectives, has significantly increased. At the same time, there has been a marked rise in the manipulation of information, leading to the spread of disinformation across media modalities including text, imagery, and video. This session brings together experts from social science, computer science, and journalism. Panelists will discuss computational training for journalists, the development of new technology to better identify and detect disinformation before it spreads, automated fact-checking systems, and methods for propagating corrections to misinformation. The session is structured specifically to address the need for an interdisciplinary approach, and attendees will gain an understanding of the latest technologies that can be leveraged to detect deep fakes and reveal the truth.

You can find a recap of the session here on the CCC blog.

Speakers:

Emma Spiro

University of Washington

Misinformation in the Context of Emergencies and Disaster Events

This talk looks at the structure and dynamics of interpersonal and organizational networks in both online and offline environments.

Dan Gillmor

Arizona State University

Journalism and Misinformation

Journalism sometimes amplifies deceit, but journalists have come to realize that they must help their audiences be more savvy about what they watch, listen to, and read. Journalists need help from the public and CS (and others) to help improve news/media literacy.

John Beieler

Office of the Director of National Intelligence

Vulnerability of AI Systems

What happens when fake-news propagators advance their techniques? At the same time, you do not want to get rid of all fake news: reading irrelevant things can be helpful, and from an analyst’s standpoint it is important not to narrow the aperture of what people can see.

Moderator:
Nadya Bliss

Arizona State University

Discussant:
Juliana Freire

New York University

Related Resources:

The Debrief: Detecting, Combating, and Identifying Dis and Mis-information on YouTube: https://www.youtube.com/watch?v=vwdyAUKgp8U

Catalyzing Computing Podcast Recording

Saturday, February 15, 12:00 – 1:00 PM

The Computing Community Consortium’s (CCC) official podcast, Catalyzing Computing, features interviews with researchers and policymakers about their background and experiences in the computing community. The podcast also offers recaps of visioning workshops and other events hosted by the Consortium. If you want to learn about some of the computing community’s most influential members or keep tabs on the latest areas of interest, then this is the podcast for you.

Episode 26: Science and Technology for National Intelligence with John Beieler (Live from AAAS 2020)

This episode of the podcast was recorded live at the “This Study Shows” Sci-Mic stage at the 2020 AAAS Annual Meeting in Seattle, Washington. Khari Douglas interviews Dr. John Beieler, a former program manager at IARPA and currently the Director of Science and Technology in the Office of the Director of National Intelligence. In this episode they discuss working in national security and the technical challenges the intelligence community is facing.

Stream in the embedded player below or find the podcast on Apple Podcasts | Spotify | Stitcher | Google Play | Blubrry | iHeartRadio | YouTube.

Download the episode transcript here.

Guest:
John Beieler

Office of the Director of National Intelligence

Host:
Khari Douglas

Computing Community Consortium

Next Generation Computer Hardware

Saturday, February 15, 3:30 – 5:00 PM

Synopsis: It is undeniable that powerful computing has led to fundamental advances in science and engineering. Rapid, powerful computing in small-scale devices such as phones and laptops has also revolutionized the global economy and offers the promise of AI assistants, smart health systems, and augmented reality. Unfortunately, this progress is soon to come to a screeching halt. The CMOS-based computers that enabled the growth of computing through the 20th century have reached their limits: Moore’s law and Dennard scaling, the observed doubling of transistor counts in microchips and the constant power density of shrinking transistors, are ending. In order to continue progress in science and engineering research, it is essential to find novel computers capable of meeting the community’s future needs.
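The compounding implied by Moore’s law’s observed doubling is easy to quantify (a back-of-the-envelope illustration, assuming the commonly cited two-year doubling period):

```python
# Moore's law as commonly stated: transistor counts double roughly
# every two years. Project a count forward under that assumption.
def projected_transistors(start_count, years, doubling_period=2.0):
    return start_count * 2 ** (years / doubling_period)

# Twenty years of doubling every two years is ten doublings, a
# 1024x increase; losing this trend is why new hardware matters.
growth_factor = projected_transistors(1e9, 20) / 1e9
print(growth_factor)
```

The same arithmetic run in reverse shows why no amount of software optimization can substitute for the exponential gains that CMOS scaling used to deliver.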

A new way of designing computation is emerging: thermodynamic computing. Borrowing from the natural world and the proposition that thermodynamics drives the self-organization and evolution of natural systems, thermodynamic computing could lead to powerful and highly efficient analog computational systems that use self-organization to perform calculation. Leaders in physics, computational biology, and computer science came together in a recent Computing Community Consortium workshop to outline a research agenda for building such systems. Reversible computing, related in theory to thermodynamic computing, also offers the possibility of increasing energy efficiency while maintaining traditional digital computing systems.

You can find a recap of the session here on the CCC blog.

Speakers:

Tom Conte

Georgia Tech

Into The Wild: Radically New Computing Methods for Science

The end of Moore’s law, coupled with the increasing compute demands of science research, spells the end for current computer designs. But there is hope in novel ways to compute. I will review emerging “fringe techniques,” including leveraging open-system thermodynamics, physical processes that “optimize” naturally, and computing using quantum physics, and discuss their applicability to science research.

Todd Hylton

University of California, San Diego

The Future of Computing: It’s All About Energy

What could be more relevant to the future of computing than thermodynamics, the science of energy and change? After all, it is thermodynamics that explains energy efficiency, explains change, and dominates the development of machine learning, electron devices and system architectures today. In this talk I review the foundations, articulate a vision, and present a model neural network that illustrates the potential for thermodynamic models of computation.

Moderator:
Mark Hill

University of Wisconsin-Madison

Presentation Slides

Related Resources:

Episode 3: What is Thermodynamic Computing? Part 1

In January 2019, the CCC hosted a visioning workshop on Thermodynamic Computing in Honolulu, Hawaii. This episode of the Catalyzing Computing podcast features an interview with workshop organizers Tom Conte (Georgia Tech) and Todd Hylton (UC San Diego) to discuss their reasons for proposing the workshop, what thermodynamic computing is, and the potential impact that thermodynamic computing could have on future technology. Workshop participant Christof Teuscher (Portland State University) also shares his thoughts on the workshop and his work with new models of computation, including computing with DNA. Stream in the embedded player below or find the podcast on iTunes | Spotify | Stitcher | Google Play | Blubrry | iHeartRadio | YouTube.

A report summarizing the discussions and conclusions from the workshop is now available here.

Download the episode transcript here.

Episode 4: What is Thermodynamic Computing? Part 2

In January 2019, the CCC hosted a visioning workshop on Thermodynamic Computing in Honolulu, Hawaii. This episode of the Catalyzing Computing podcast features an interview with workshop organizer Natesh Ganesh, a PhD student at the University of Massachusetts Amherst who is interested in the physical limits to computing, brain-inspired hardware, non-equilibrium thermodynamics, and the emergence of intelligence in self-organized systems. He won the best paper award at IEEE ICRC’17 for the paper A Thermodynamic Treatment of Intelligent Systems. I also speak with workshop participant Gavin Crooks, formerly a Senior Scientist at Rigetti Quantum Computing, who developed algorithms for near-term quantum computers. Gavin is a world expert on non-equilibrium thermodynamics and the physics of information. Stream in the embedded player below or find the podcast on iTunes | Spotify | Stitcher | Google Play | Blubrry | iHeartRadio | YouTube.

A report summarizing the discussions and conclusions from the workshop is now available here.

Download the episode transcript here.

The Debrief: Detecting, Combating, and Identifying Dis and Mis-information

Sunday, February 16, 10:00 – 10:30 AM

Synopsis: “The Debrief” is an opportunity for a twenty-minute public interview of a scientific session’s speakers by a burgeoning journalist, for both a physical and a virtual audience. Interviews will be held on our Expo Stage. Nadya Bliss (Arizona State) and Dan Gillmor (Arizona State) will present the key takeaways from the Detecting, Combating, and Identifying Dis and Mis-information session.

Watch the full video of the debrief on YouTube here.

Speakers:

Nadya Bliss

Arizona State University