H. Appendix: Methodology

In Fall 2015, the CRA Enrollment Committee Institutional Subgroup sought to understand the scope and impact of recent increases in undergraduate computer science programs by collecting data from academic units[1] in the United States and Canada. This section describes the collection and analysis of the academic unit data obtained via the CRA Enrollment Survey.

Survey Scope

The survey was sent to units responsible for serving bachelor-level majors in computer science. We explicitly omitted units whose programs were only in computer engineering, information science, or information technology.

U.S. doctoral-granting, U.S. non-doctoral-granting, and Canadian doctoral-granting units were surveyed.

Survey Design

The survey questions covered the following areas:

  • Unit context (degrees offered, recent changes in degrees, admission of students to the major, responsibility for nonmajor courses)
  • Unit’s perception of the trend in demand for introductory, mid-level, and upper-level courses from majors and nonmajors
  • Impacts the unit sees or does not see on students, faculty, staff, physical space, and diversity
  • Actions the unit has considered or taken to deal with increases, including actions to limit impacts on faculty and actions to mitigate potential impacts on diversity
  • Course-level enrollment data for representative courses at the introductory, mid-level, and upper-level for 2005, 2010, and 2015; and also numbers of majors, women, and underrepresented minorities enrolled in the selected courses

Questions were revised based on a pilot survey with 10 units, including one Canadian and three non-doctoral granting units.

The final survey is available at https://cra.org/wp-content/uploads/2017/02/2015-Enrollment_Survey-Sample.pdf.

Data Collection

The survey used the format of the CRA Taulbee Survey for doctoral-granting units and the ACM NDC Study for non-doctoral-granting (bachelor’s and master’s) units, both of which survey their populations annually. Doctoral or non-doctoral status for the Taulbee and NDC surveys is based on whether a unit grants doctorates in computing.

For the doctoral-granting units, the CRA Enrollment Survey was set up as a separate survey under the Taulbee umbrella. All doctoral units with undergraduate CS programs were invited to participate via an email to the academic unit head and, later, to the primary Taulbee contact within the unit (if different from the academic unit head). Surveys were conducted online. Data collection began in October 2015 and concluded in January 2016.

For the non-doctoral units, the CRA Enrollment Survey was set up as a separate survey under the ACM NDC Study umbrella. The CRA Enrollment and ACM NDC surveys were launched concurrently in January 2016. An email was sent to the ACM NDC Study contacts of all non-doctoral granting units announcing the launch of the surveys; the email indicated that the CRA Enrollment Survey was only for those units with an undergraduate CS program. Data collection concluded at the end of March 2016.

Members of the CRA Enrollment Committee Institutional Subgroup were concerned that units might be unable to provide the requested course-level enrollment data. While not all units provided this data, Table H.1 shows that many units were able to do so.

Table H.1: CRA Enrollment Survey Response Rates

Type of Unit   | Surveyed | Responded # (%) | Provided Intro Majors Course Data for 2015 | Provided Intro Majors Course Data for 2010 | Provided Course Data at All Levels for 2005, 2010, and 2015
Doctoral       | 190      | 134 (70.5%)     | 70                                         | 59                                         | 49
Non-Doctoral   | 706      | 93 (13.2%)      | 25                                         | 23                                         | 20

Not all responding units answered every question. Thus, analyses and figures in different sections of this report may be based on different numbers of responses.

Data Analysis

Types of Analyses

This Generation CS report primarily includes descriptive statistics about the extent of the enrollment surge and its impact on academic units. Significance testing was employed in two places. In [B. Growth of CS Majors], we report that there was no significant difference in the percent change in enrollment over time between small and large academic units, which was tested using a paired sample t-test. In [D. Impact on Diversity], some correlations are reported between actions and diversity outcomes (percent of women or underrepresented minority students) with their associated p values.
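
As an illustration of the second type of test, the sketch below computes one such correlation and its p value. It is a minimal sketch only: the variable names and values are hypothetical, and this is not the code used to produce the report's results.

```python
# Minimal sketch (not the report's analysis code) of reporting a correlation
# with its p value; the variable names and values are hypothetical.
from scipy import stats

# Hypothetical per-unit values: whether any diversity action was taken (1 = yes)
# and the percent of women in the selected representative course.
any_diversity_action = [1, 0, 1, 1, 0, 1, 0, 1]
pct_women = [22.0, 15.5, 30.1, 18.2, 12.4, 25.0, 16.7, 28.3]

# Pearson correlation and associated p value, as reported in D. Impact on Diversity.
r, p = stats.pearsonr(any_diversity_action, pct_women)
print(f"r = {r:.2f}, p = {p:.3f}")
```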

Categorizing Academic Units

In the analysis within this Generation CS report, units are categorized in several different ways:

  • Doctoral- vs. non-doctoral granting. The CRA Taulbee Survey, the ACM NDC Study, and the CRA Enrollment Survey classify units as doctoral or not based on whether the unit grants doctorates in computing. The Integrated Postsecondary Education Data System (IPEDS) categorizes institutions as doctoral or not based on whether they grant more than 20 research or scholarship doctorates per year in any field, following the Carnegie Classification of Institutions of Higher Education. Most sections of this report only use data from the CRA Enrollment Survey. However, both [D. Impact on Diversity] and [G. IPEDS Data] present some analyses of IPEDS data for context and comparison.
  • Public vs. private. Includes only U.S. doctoral units. There are responses from 92 public (74% of the responses) and 32 private (26%) doctoral units.
  • Large vs. small. Units with 25 or more tenured or tenure-track faculty FTE are considered large; those with fewer than 25 are considered small (see the sketch following this list). Four units did not provide faculty size; of the 130 units that did, there are responses from 68 large (52%) and 62 small (48%) units.
  • Canadian vs. U.S. Ten Canadian doctoral-granting units responded to the survey; no Canadian non-doctoral-granting units were surveyed. Canadian units are not included in the public vs. private comparisons but are included in all other analyses for which they provided data.
  • Non-doctoral granting. All responses from non-doctoral granting units are grouped together. While subgroup differences would be of interest (e.g. liberal arts schools vs. large master’s granting units), not enough units responded to the CRA Enrollment Survey to properly analyze these differences.
  • MSIs (minority-serving institutions). The doctoral-granting respondents include four minority-serving institutions: one HBCU and three Hispanic-serving institutions. The MSI data is included with the other units except in correlations between the percent of underrepresented minority (URM) students and other variables, such as diversity actions taken. The non-doctoral-granting respondents included one Hispanic-serving institution and one all-women institution.
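
The large vs. small rule above is simple enough to state as a threshold; the sketch below encodes it as a check on the definition. The function and its handling of missing values are illustrative only.

```python
# Illustrative sketch of the large vs. small categorization rule described above.
# The function name is hypothetical; units with no reported faculty size are
# left uncategorized (four such units responded to the survey).
from typing import Optional

def size_category(faculty_fte: Optional[float]) -> Optional[str]:
    """Return 'large' for >= 25 tenured/tenure-track faculty FTE, 'small' otherwise."""
    if faculty_fte is None:
        return None
    return "large" if faculty_fte >= 25 else "small"

assert size_category(40) == "large"
assert size_category(12) == "small"
assert size_category(None) is None
```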

Missing Data

Because of missing data, not all analyses using the same unit breakdown have the same number of academic units.

Analyses of course enrollment data across time use only responses of units that provided data for all years under analysis.
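
A rough sketch of this complete-case rule is shown below, assuming a hypothetical long-format table with one row per unit and year; only units with data in every year under analysis survive the filter. The data frame and column names are not from the actual survey data.

```python
# Sketch of the complete-case rule for across-time analyses; the table and
# column names are hypothetical.
import pandas as pd

enroll = pd.DataFrame({
    "unit": ["A", "A", "A", "B", "B", "C", "C", "C"],
    "year": [2005, 2010, 2015, 2010, 2015, 2005, 2010, 2015],
    "intro_majors_enrollment": [120, 180, 310, 90, 150, 60, 85, 140],
})

years = {2005, 2010, 2015}
# A unit is kept only if it reported data for every year under analysis.
complete_units = [
    unit for unit, grp in enroll.groupby("unit")
    if years <= set(grp["year"])
]
subset = enroll[enroll["unit"].isin(complete_units)]
print(sorted(complete_units))   # ['A', 'C'] in this toy example; unit B lacks 2005
```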

Variables

Course-Level Enrollment

The survey asked for an “intro-level course for (mainly) nonmajors” and an “intro-level course for (mainly) CS majors.” In most cases this is synonymous with an intro course not required for the major and an intro course required for the major. Such courses are handled differently by different units. For example, not all units have an intro-level course for nonmajors. Furthermore, in some units, the intro course primarily for majors is taken before a student can officially declare a major. Also, some units could only report whether the student who took a course was a CS major when they left the institution, not their major status at the time they took the course. Clearly, some students taking the intro-level course for nonmajors later become majors, and not all students taking the intro-level course for majors become majors.

There is less ambiguity about majors and nonmajors in the mid-level and upper-level courses. However, in interpreting this data, note that the CRA Enrollment Survey asked for data on a representative course at each level, not for complete data for the unit. For example, the enrollment figures (and thus the percent of women and URMs) labeled “Upper-Level” are for a single selected upper-level course for each unit, which is not a total picture of the unit’s enrollment in all upper-level courses. Each unit made its own choice of representative courses, but within the constraint that the course had been offered with similar goals and curriculum since 2005. Therefore, unit differences in course enrollment at the mid-level and upper-level may reflect differences between the topics of the selected representative courses as well as differences between institutions.

Definition of Underrepresented Minorities (URMs)

For the course-level enrollment data, the CRA Enrollment Survey asked units to provide the following data for each chosen representative course: total student enrollment in the course the most recent time it was offered, number of CS majors, number of women, number of international students, and number of URMs. The instructions for URM status were to “aggregate the following classifications: Black/African American, American Indian/Alaska Native, and Hispanic/Latino.”  Although we recognize that there may be differences in the trends and experiences of different groups, we did not collect data to break this aggregate down further.

Analyses using the Taulbee Survey data and IPEDS data group the same classifications into URMs. Students reported as multiracial are not counted as URMs because we do not have information on which races they identify with.
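
The aggregation rule can be stated compactly; the following sketch applies it to hypothetical per-course counts. The category labels follow the survey instructions quoted above, but the numbers do not come from any actual unit.

```python
# Sketch of the URM aggregation described above; counts are hypothetical.
course_counts = {
    "Black/African American": 6,
    "American Indian/Alaska Native": 1,
    "Hispanic/Latino": 9,
    "Asian": 42,
    "White": 95,
    "Multiracial": 4,      # not counted as URM (race breakdown unknown)
    "International": 20,
}

URM_CATEGORIES = [
    "Black/African American",
    "American Indian/Alaska Native",
    "Hispanic/Latino",
]

urm_total = sum(course_counts[c] for c in URM_CATEGORIES)
print(urm_total)   # 16 in this hypothetical example
```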

Diversity Action Composites

The Diversity Considered in Decisions composite was computed as the sum of the first three items in Question F6 of the CRA Enrollment Survey. These are yes/no items asking whether diversity impacts are explicitly considered when discussing possible actions, whether actions were considered but not taken because of concerns about their impact on diversity, and whether actions were chosen specifically to reduce the potential impact on underrepresented groups.

Any Diversity Action Taken was coded as yes if the unit responded yes to any of the three diversity action questions, and no otherwise.
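
A minimal sketch of how the two composites might be coded is shown below. The item keys are hypothetical placeholders; the actual item wording is in Question F6 and the diversity action questions of the survey instrument.

```python
# Sketch of the two diversity composites; item keys are hypothetical stand-ins
# for the first three yes/no items of Question F6 and for the three diversity
# action questions.
def diversity_composites(responses: dict) -> tuple:
    f6_items = [
        "f6_considered_in_decisions",
        "f6_action_not_taken_due_to_diversity",
        "f6_action_chosen_to_reduce_impact",
    ]
    # Diversity Considered in Decisions: sum of the three yes/no items (yes = 1).
    considered = sum(1 for item in f6_items if responses.get(item) == "yes")

    # Any Diversity Action Taken: yes if any diversity action question was answered yes.
    action_items = ["action_1", "action_2", "action_3"]
    any_action = "yes" if any(responses.get(q) == "yes" for q in action_items) else "no"
    return considered, any_action

print(diversity_composites({
    "f6_considered_in_decisions": "yes",
    "f6_action_not_taken_due_to_diversity": "no",
    "f6_action_chosen_to_reduce_impact": "yes",
    "action_1": "yes",
}))   # (2, 'yes')
```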


[1] We use the term “academic unit” or “unit” to denote the administrative division responsible for the CS bachelor’s program. Often, but not always, this is an academic department.