
A New Endeavor: Introducing the Fifth District Survey of Community College Outcomes

Regional Matters
May 12, 2022

Community colleges play a major role in workforce and economic development in communities across the Fifth District. Because they educate individuals and connect them to jobs, understanding their outcomes is important to how we think about the best path to maximum employment and sustainable economic growth.

The primary data source for understanding postsecondary outcomes in the United States is the Integrated Postsecondary Education Data System (IPEDS) from the U.S. Department of Education. Although IPEDS data are useful for understanding outcomes at four-year institutions of higher education, the data are fraught with issues for other institutions. The primary issue with IPEDS is that it was built for four-year schools and does not account for the differing structure and purpose of community colleges. For example, the traditional IPEDS cohort, used to measure graduation rates, only includes first-time, full-time students. This works relatively well for four-year schools: A majority of their students attend full time, and most who begin as freshmen are enrolling for the first time. However, many community college students attend part time or enroll after dropping out of a four-year institution or after attending another community college. Additionally, IPEDS only measures outcomes of students in for-credit programs, which leaves out many community college students, including those who, for example, complete a CDL (Commercial Driver's License) program or a noncredit certificate in phlebotomy. This article introduces a new data collection method directed specifically at community colleges and presents some early results. In the end, we believe that IPEDS underreports true community college success.

Why a Survey?

Over the past few years, the Richmond Fed has had dozens of conversations with community college administrators about how community colleges serve students and communities. Community colleges engage in valuable activities such as training the next generation of workers for in-demand jobs, partnering with local firms and industries to train workers, and teaching high school students via dual enrollment. Unfortunately, existing data collection methods on postsecondary outcomes do not provide a quantitative assessment of the true outcomes of community colleges. Individual community colleges use many metrics to gauge their success, but these can vary from school to school, which makes it hard to compare across institutions and track outcomes consistently over time. Thus, we decided to use the Richmond Fed's survey tools to create a quantitative, consistent system for measuring the outcomes of community colleges.

Creating the Survey

One critical disadvantage of the current system for collecting community college outcomes (IPEDS) is that it was created for traditional four-year institutions. When the same metrics and definitions (e.g., graduation rates) are used to assess community colleges, they appear to be doing a relatively poor job.

To better understand what should be measured when thinking about community college success, we conducted in-depth 90-minute conversations with 10 community colleges in our district. It was important for us to create a measurement system that came directly from the schools and reflected their definitions of success. To achieve this, we were intentional in selecting the schools we spoke with, choosing a diverse set across our district based on urban/rural status, location, and size.

Table 1. Profile of Pilot Community Colleges

Demographic  Count
State*
 Maryland  1
 North Carolina  2
 South Carolina  3
 Virginia  2
 West Virginia  2
School Type
 Rural  5
 Urban  5
School Size
 Fewer than 3,000 students  5
 3,000 to 9,999 students  3
 10,000 or more students  2
Total Schools  10

* The District of Columbia has no public community colleges.

The schools used many metrics to understand their performance. Fortunately, several metrics were common across all 10 schools. In our conversations, we learned that in addition to outcome measures such as completion or transfer rates, we should look at enrollment data — especially since the IPEDS cohort is narrow and excludes many students enrolled at the school. The table below shows the metrics that we believe better reflect how community colleges are performing, as reported to us by the schools and confirmed in follow-up conversations.

Table 2. Our Survey’s Community College Success Metrics

Credit Students
 Enrollment: Includes any student enrolled in a credit program (including part-time students and those who have previously been enrolled in other programs)
 Student Success: Includes any student who received an award (associate degrees, licensures, certificates, or industry credentials), transferred to a four-year institution, or is currently in good standing
 Total Awards Granted: Includes the total number of associate degrees, licensures, certificates, and industry credentials granted
 Total and Successful Credit Hours: Includes the total number of credit hours attempted and the number of credit hours that were successful
 Retention Rates: Measures the share of students retained year-over-year, fall semester to spring semester, and spring semester to fall semester

Noncredit Students
 Enrollment: Includes all students who are enrolled in the school but are not taking credit classes
 Total Awards Granted: Includes the total number of licensures, certificates, and industry credentials
 Total Contact Hours: Includes all the hours students are in class

High School Students in Early or Middle College Programs
 Enrollment: Includes all high school students who are enrolled in the school for early or middle college programs
 Total and Successful Credit Hours: Includes the total number of credit hours attempted and the number of credit hours that were successful
 Total Awards Granted: Includes the total number of associate degrees, licensures, certificates, and industry credentials

High School Dual Enrollment or Dual Credit Students
 Enrollment: Includes all high school students who are enrolled in the school for dual enrollment or dual credit courses
 Total and Successful Credit Hours: Includes the total number of credit hours attempted and the number of credit hours that were successful
 Total Awards Granted: Includes the total number of associate degrees, licensures, certificates, and industry credentials

Industry Partnerships
 Total Number of Companies Served: The number of partnerships the school has with companies
 Total Number of Employees from Companies Served: The number of employees from companies enrolled at the school

Schools already spend a significant amount of time reporting data to the government, accreditation bodies, and internal groups. The goal of this data collection program is to collect relevant metrics without overburdening the schools. We attempted to keep the survey as short as possible by focusing on the metrics that each school cited as important and not collecting less important metrics. The pilot schools reported that although there will be some upfront time spent in data preparation, the survey should not be overly burdensome in the longer run.

Although our proposed measurement program improves on existing sources, this survey is not perfect. Not every community college has all of the data readily available, especially for noncredit students. We hope that as this research is socialized, more schools will collect and retain information on their students' success.

What We’ve Learned So Far

At this point, we have received data from nine of our 10 pilot schools. Although the pilot data are still preliminary, some patterns are immediately apparent.

First, IPEDS greatly undercounts the success of students served in for-credit community college programs. Our cohort measure, which includes all students who entered the school during the 2016-17 school year to take for-credit classes, is larger than the IPEDS cohort in every case. The difference is especially noteworthy at urban schools, which enroll more part-time students and more students who are not first-time students. It is not surprising, therefore, that our student success metric, which we define as the share of all students in our cohort who graduate, transfer, receive a licensure or certificate, or persist in enrollment, is also higher in almost every case than the reported IPEDS outcome measure. In some cases, the IPEDS data significantly undercount what we believe to be the true success rate of for-credit students.
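To make the difference concrete, here is a minimal sketch — using made-up student records, not our pilot data — of how a broader cohort and a broader success definition can both change the headline rate:

```python
# Made-up student records, not pilot data. Each record: the student's entry
# status and their outcome at the school.
students = [
    ("first_time_full_time", "graduated"),
    ("first_time_full_time", "transferred"),   # not a "graduate" under IPEDS
    ("part_time",            "certificate"),   # excluded from the IPEDS cohort
    ("transfer_in",          "persisting"),    # excluded from the IPEDS cohort
    ("first_time_full_time", "dropped_out"),
]

# The survey's broader success definition: graduate, transfer, earn a
# licensure/certificate, or persist in enrollment.
SURVEY_SUCCESS = {"graduated", "transferred", "certificate", "persisting"}

# IPEDS-style: graduation rate among first-time, full-time students only.
cohort = [outcome for status, outcome in students
          if status == "first_time_full_time"]
ipeds_rate = sum(o == "graduated" for o in cohort) / len(cohort)

# Survey-style: broader success definition over every entering student.
survey_rate = sum(o in SURVEY_SUCCESS for _, o in students) / len(students)

print(f"IPEDS-style graduation rate: {ipeds_rate:.0%}")  # 33%
print(f"Survey success rate: {survey_rate:.0%}")         # 80%
```

The specific records and rates above are illustrative only; the point is that the two design choices (who counts in the cohort, and what counts as success) each move the measured rate.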

Second, the data that community colleges have on their noncredit programs and students are very messy. We knew from our conversations with community colleges that data collection on the noncredit side was not as robust as on the for-credit side, partly because IPEDS doesn't include data on noncredit programs, so community colleges have less incentive to collect that information in an organized way. One exception is Maryland, where the state's community college funding formula funds for-credit and workforce-related noncredit programs equally; schools must therefore collect data on noncredit programs and students to receive full funding. Other Fifth District states, on the other hand, either don't fund noncredit programs via state appropriations or fund them differently from for-credit programs, so the same extent of data collection is not mandated. Even though we expected the noncredit data to be less reliable than the for-credit data, we were disappointed in the level of data we were able to obtain from our pilot colleges on noncredit programs and students. This is an area where great strides can be made in the future, and we hope that our survey will help lead the charge.

Finally, Fifth District community colleges are doing a tremendous amount to educate local high school students via dual enrollment and dual credit programs. The nine schools in our pilot enrolled over 11,700 high school students during the 2020-21 school year. Additionally, the nine schools granted a total of 949 degrees, licensures, or certificates to high school students.

Table 3: High School Students Served by Pilot Institutions

School  Unduplicated headcount of high school students in dual enrollment or dual credit programs  Percentage of successful credit hours
School 1  11  100%
School 2  3,110  83%
School 3  501  70%
School 4  2,123  95%
School 5  2,108  87%
School 6  931  94%
School 7  829  81%
School 8  858  85%
School 9  1,307  86%

Note: We do not have permission to publicly release institution-level data, so we masked the name of each school.

Interestingly, the preliminary pilot data indicates that high school students have a higher course success rate than other students at community colleges. Over 86 percent of credit hours attempted by high school students at our nine schools were earned successfully, while the overall course success rate was 77.8 percent.
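The course success rate here is simply successful credit hours divided by attempted credit hours. A small illustrative sketch (the hours below are hypothetical, not pilot data):

```python
def course_success_rate(successful_hours: float, attempted_hours: float) -> float:
    """Share of attempted credit hours that were completed successfully."""
    if attempted_hours <= 0:
        raise ValueError("attempted_hours must be positive")
    return successful_hours / attempted_hours

# Hypothetical example: 8,600 of 10,000 attempted credit hours succeeded.
rate = course_success_rate(8_600, 10_000)
print(f"{rate:.1%}")  # 86.0%
```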

What’s Next in the Survey of Community College Outcomes

In spite of the program's current limitations, our pilot schools universally acknowledged the need for this sort of data collection system. The next step is to enroll more schools in the measurement program. Our end goal is for this program to serve as a supplement to IPEDS. Nationwide, we need a measurement system that reflects all the ways community colleges provide value, so that prospective students, school counselors, parents, legislators, and researchers can better understand and articulate the role community colleges play in the educational and workforce systems. While IPEDS has made strides toward this, we believe a completely separate set of metrics is appropriate for community colleges due to their notable differences from four-year institutions.

We are grateful for the time and partnership of each school in this endeavor. We are equally grateful for the open and transparent dialogue we had with each school when creating this measurement system. The data reported in this article are just the beginning; as this program develops further, we expect continued iteration on how to define and measure the success of community colleges.

We will be hosting an event in early August to discuss our results in much greater detail and to explain our plans moving forward. All community colleges in the Fifth District, as well as state system offices, will be invited. If you would like to attend, please let us know.