This article is published in the September 2015 issue.

CCC Uncertainty in Computation Workshop Report



The Computing Community Consortium (CCC) is excited to release a report titled Quantification, Communication, and Interpretation of Uncertainty in Simulation and Data Science, the result of the Uncertainty in Computation Visioning Workshop, held in Washington, D.C., in mid-October 2014.

The workshop brought together more than 40 scientists from different disciplines including simulation and data science, engineering, statistics, applied mathematics, visualization, decision science, and psychology. The overarching goal of the workshop was to open a discussion between experts with diverse scientific backgrounds about the topic of uncertainty/risk and its communication. Workshop participants identified significant shortcomings in the ways we currently process, present, and interpret uncertain data.

Specific recommendations on a research agenda for the future were made in four areas: uncertainty quantification in large-scale computational simulations, uncertainty quantification in data science, software support for uncertainty computation, and better integration of uncertainty quantification and communication to stakeholders.

The recommendations are below:

  • There is growing concern that the statistical models currently used to quantify uncertainty in the outputs of simulations won’t scale, particularly to large, heterogeneous computational models. This leads to a critical need to transition research in uncertainty quantification of computational systems from the analysis of components to the analysis of large-scale systems of interacting components.
  • The emerging field of data science largely lacks generalizable methods for quantifying the uncertainty in the output of analysis systems. As a result, a major new research initiative needs to be launched in this area. Since data science programs are just getting established in universities, this effort needs to be accompanied by relevant curriculum development.
  • The increasing use of large-scale computational and data-based analyses in decision support, and the increased importance of considering uncertainty in such systems, will create substantial burdens for software developers. A major new effort needs to go into building generally applicable, easy-to-use software development tools supporting the representation and analysis of uncertainty.
  • The fragmented nature of expertise in quantification, communication, and interpretation of uncertainty will become more and more problematic as the scale of problems, the scale of computational resources, and the scale of data continue to increase. It is essential that a major new research initiative be undertaken in communicating uncertainty about large-scale systems to stakeholders in a comprehensive and integrated manner.
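To give a flavor of what "uncertainty quantification" means in practice, a common baseline technique is Monte Carlo propagation: sample a simulation's uncertain inputs many times, run the model on each sample, and summarize the spread of the outputs. The toy model and parameter values below are purely illustrative and are not taken from the report; they are a minimal sketch of the idea, not a method the workshop endorses.

```python
import random
import statistics

def simulate(x):
    # Hypothetical toy "simulation": a simple nonlinear function of
    # one uncertain input, standing in for a real computational model.
    return x ** 2 + 2 * x

def propagate_uncertainty(mean, stddev, n_samples=10_000, seed=42):
    # Monte Carlo propagation: draw samples of the uncertain input,
    # run the simulation on each, and report the output mean and
    # standard deviation as a summary of output uncertainty.
    rng = random.Random(seed)
    outputs = [simulate(rng.gauss(mean, stddev)) for _ in range(n_samples)]
    return statistics.mean(outputs), statistics.stdev(outputs)

# Input believed to be about 1.0, give or take 0.1 (illustrative values).
out_mean, out_std = propagate_uncertainty(mean=1.0, stddev=0.1)
print(f"output ~ {out_mean:.3f} +/- {out_std:.3f}")
```

Methods like this become expensive and statistically fragile at the scale the report worries about, which is exactly why the recommendations call for research beyond component-level analysis.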

The current state of affairs in the quantification, communication, and interpretation of uncertainty in simulation and data science is creating critical challenges, but it also presents important opportunities.

See the full report and the workshop website for more information.