Speaking of the Economy
March 18, 2026

Tracking Pathways to Success at Community Colleges

Audiences: Students, Educators, Workforce Sector Leaders, General Public

Stephanie Norris and Davy Sell review the latest results of the Federal Reserve Bank of Richmond's Survey of Community College Outcomes, which examined the variety of ways that these colleges served different types of students and communities in the Fifth District as well as in five states outside of this region. Norris is a regional economist and Sell is a regional analyst, both at the Richmond Fed.

Transcript


Tim Sablik: My guests today are Stephanie Norris and Davy Sell. Stephanie is a regional economist and Davy is a research analyst, both at the Richmond Fed. Both are involved in the Bank's Survey of Community College Outcomes, which is the focus of today's conversation. Stephanie and Davy, thank you for joining me.

Stephanie Norris: Thanks for having us, Tim.

Davy Sell: It's great to be here.

Sablik: I'll start by congratulating both of you on the recent release of the 2025 Survey of Community College Outcomes results. I know a lot of work went into that project.

Regular listeners may recall that Stephanie was on the show last year to talk about the 2024 survey results. That release was notable because the number of participants had expanded from 10 schools when the survey first began in 2021 to 121 schools, covering nearly all the community colleges in the Richmond Fed's district.

For 2025, you expanded the survey again. It now includes responses from 189 schools, including some community colleges from states outside of our district. What was your motivation for expanding the survey?

Norris: One of the reasons we started this project was to develop a community college measure that was consistent across states. Every state is unique in how it approaches higher education in terms of funding, system structure, and policy priorities. Where community colleges fit in the broader education and workforce ecosystem in each state varies as well. As a result, community college data coverage and quality can look quite different in different states.

One key challenge for our data collection is ensuring that we are asking for data in each state that is consistent but still reflects the critical roles community colleges play in their states. We have worked through this process for a few years with the states in our immediate footprint, but we were eager to see if our approach translated to other states that differ from our original states in important ways.

Sablik: Davy, can you give us an overview of which states are represented in the survey and how you collected that data?

Sell: In 2025, we collected institution-level data from 10 states to expand our understanding of how community colleges serve their students and what those successes look like beyond our region. This, of course, included schools in the Fifth District states: Virginia, North Carolina, South Carolina, West Virginia, and Maryland. (We also serve D.C., but there are no community colleges in D.C.) We also expanded to collect data from state higher education systems in Iowa and Arkansas, as well as from individual institutions in Massachusetts and New Jersey and two schools in Texas.

Who we collect the data from varies by state. In some states — Maryland, New Jersey, Massachusetts, and Texas — we collected data from institutional researchers at each institution. For the other states, we collected data from a state entity, either a community college system or a state higher education agency.

Norris: Once we identify data providers, we walk them through the data elements we collect and talk through how their definitions and terminology for certain things might differ from ours and from other states'. Some data elements are trickier than others. For example, many community colleges serve high school students. We call these students dual-enrollment students. But what those students are called varies by state, sometimes varies by college within a state, and sometimes even varies within a college itself, depending on what program a student is enrolled in. After conversations with data providers, we design a data collection instrument and guidance for each state, then work with each data provider throughout the data collection and validation process.

As Davy mentioned, we collect data that reflects what community colleges do. So, we collect data on enrollment. We look at traditional degree-seeking students, but also students who are taking classes but not seeking a degree. We also look at students who are taking non-credit workforce courses and we collect a lot of these elements across student dimensions like gender, race, and age. We also collect data on student outcomes, like how many students earn degrees and credentials in a given year. And, of course, we collect data on a cohort of students to construct success rates for each institution.

Sablik: Yeah, that's a great overview, and gets to my point about all the work that went into this. People might not realize it's not just sending a list of questions to schools and waiting for their answers.

You mentioned a key result from this survey is the Richmond Fed Success Rate measure. We've talked about it on the show in the past, but can you provide a refresher on what this metric captures and how it compares to traditional measures of college success?

Sell: The Richmond Fed Success Rate is a measure of student success that takes into consideration the unique way that community colleges help their students find success and tries to capture pathways that traditional metrics might exclude. In order to do that, we broaden both our definition of success and the cohort of students who are being considered.

Similar to traditional metrics, we start by including award-seeking credit students. We expand our definition of who is considered a first-time student to include students who may have previous postsecondary experience but are enrolled for the first time at the institution of interest.

We also include students who enroll any time during the cohort entry year — not just the fall semester — to allow for more flexibility. That flexibility matters because a significant part of our larger cohort size comes from our inclusion of part-time students as well as full-time students. To better reflect these students' varied timelines, we measure success over four years rather than the traditional three-year period used for evaluating two-year institutions.

For the 2025 survey, we are looking at a cohort of students who entered their institution for the first time during the 2020-2021 academic or fiscal year. We asked data providers to report outcomes on these students through the end of 2023-2024. With the inclusion of all these different kinds of students in 2025, we ended up with an overall Richmond Fed success cohort that was three times larger than the traditional cohort. We're pretty proud of that.

Norris: In addition to looking at the traditional measures of success — like earning a degree or a long-term certificate — we also collect data on students who don't complete a degree or certificate but do earn some sort of workforce-recognized credential. For example, a student might start an associate degree program in welding but leave partway through after they pass an industry certification exam and get a full-time job offer. In the school's graduation rate, that student looks like a non-success. But from a workforce perspective, the college has served that student well and we want to count that student as a success.

These students are incredibly difficult to capture because these credentials are typically offered by third-party entities like an industry association or a state licensing board. These entities do not share data back with community colleges — the system just isn't set up that way. So, the data that we do have represent just a fraction of these students who follow these pathways. We still ask for it and we will keep asking for it because it's important and it's a growing segment in higher education. But we recognize that it's a significant undercount at many schools.

Another pathway that we capture is students who transfer to a four-year institution prior to earning a degree. This is a large segment of students at some institutions. Often, a student will go to a community college for just a year and then transfer to a four-year college to finish a bachelor's degree. That makes sense for that student. But for the community college, if that student did not earn an associate degree, they will also look like a non-success in the graduation rate. We count those students as successes, so they end up in the numerator of our success rate.

We also capture students who persist. These are students who have not graduated, earned a workforce credential, or transferred to a four-year institution at the end of four years but are enrolled in good standing and have earned at least 30 credit hours over the four years enrolled. These are students who maybe have taken some part-time courses, have taken time off. Research shows that students who are still enrolled four years after starting are persisting because they want to finish and are likely to do so. So, we count these students as successes.
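The four pathways described above — completing an award, earning a workforce credential, transferring to a four-year institution, and persisting in good standing with at least 30 credit hours — can be sketched in code. This is a minimal illustration of that logic, not the survey's actual methodology; the field names, `Student` record, and 30-hour threshold are assumptions made for the example.

```python
# Illustrative sketch of the success-rate logic described above.
# Field names and the Student record are hypothetical, not the survey's schema.
from dataclasses import dataclass

@dataclass
class Student:
    earned_award: bool = False           # degree or long-term certificate
    earned_credential: bool = False      # third-party workforce credential
    transferred_four_year: bool = False  # transferred before earning an award
    enrolled_good_standing: bool = False # still enrolled at end of year four
    credit_hours: int = 0                # credit hours earned over four years

def is_success(s: Student, min_persist_hours: int = 30) -> bool:
    """A student counts as a success via any one of the four pathways."""
    persisted = s.enrolled_good_standing and s.credit_hours >= min_persist_hours
    return (s.earned_award or s.earned_credential
            or s.transferred_four_year or persisted)

def success_rate(cohort: list) -> float:
    """Share of the cohort that hit at least one success pathway."""
    return sum(is_success(s) for s in cohort) / len(cohort)

cohort = [
    Student(earned_award=True),                              # graduate
    Student(transferred_four_year=True),                     # early transfer
    Student(enrolled_good_standing=True, credit_hours=36),   # persister
    Student(credit_hours=12),                                # non-success
]
print(f"{success_rate(cohort):.0%}")  # prints 75%
```

The point of the sketch is that the traditional graduation rate would count only the first student, while a pathway-based measure counts three of the four.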

Sablik: With all of that in mind, what was the overall Richmond Fed Success Rate for your sample compared to traditional graduation rates in 2025?

Sell: The aggregate Richmond Fed Success Rate across the 189 institutions that participated in the survey this year was 49.8 percent, which was 16 percentage points higher than the traditional graduation rate of 33.8 percent for those same schools, even while looking at a broader group of students. By including these more comprehensive pathways of how students are able to achieve, we were able to capture substantially more student successes with our measure.

Sablik: Did the Richmond Fed Success Rate reveal any notable differences across states or between schools located in urban versus rural areas?

Norris: We do see some differences across states in both the aggregate success rate numbers and the composition of those success rates. Our goal was really to get a measure that we could compare across states. But even within states, there's a lot of variation.

State policies certainly play a role. States prioritize different populations and different programs. In some states, for example, we see far more students included in the success rate who transfer to a four-year institution prior to an award because that's just how the state model is set up. South Carolina has a lot of these "1+3" programs — you do one year at a community college and then transfer to a four-year [college] to finish. So, a lot of the students in the South Carolina bridge programs don't earn an associate degree before transferring. Other schools have a "2+2" setup, so they have more students who earn an associate degree before transferring, and including transfer students in our success rate adds less in those states.

State aggregate success rates do mask the differences that we see within states, too, because each community college has a specific service area. They serve different students. There are different labor markets around them. Even though we do see the influence of state policy on aggregate success rates, there are a lot of interesting patterns that we see when we look within states as well.

Sell: When looking across locale categories — rural, town, suburban, and city — there was a 4 percentage point difference between the highest and lowest success rates. The highest aggregate rate was observed at community colleges in towns, while the lowest occurred at colleges in cities. It's important to add some context here: the town category represented a Richmond Fed cohort of around 35,000 total students, while the city category had an overall cohort size that exceeded 115,000 students. So, there's a magnitude difference there.

The key takeaway for us when considering locale is how these students are finding that success. Schools located in cities lead in the share of success coming from transferring students, while colleges located in towns lead in associate degrees and workforce credentials.

Sablik: Yeah, so a major advantage of your success rate measure is that it allows you to think about these different student populations, one of those being part-time students who make up a large share of a lot of community college student bodies compared to four-year institutions. What did you learn about part-time students in the 2025 survey results?

Norris: I'll let Davy talk about the success rate outcomes. But I want to emphasize what you mentioned about these students making up a large share of community college students. This is something that really does differentiate them from four-year colleges.

In our survey, we define part-time students based on how many credit hours they take in their first semester of enrollment. So, any student taking fewer than 12 credit hours, or roughly four classes, for that first semester is classified as part time in our cohort.
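That classification rule is simple enough to state directly in code. This is just an illustration of the threshold described above; the function name and the handling of the 12-hour cutoff are assumptions for the example.

```python
# Illustrative version of the part-time rule described above:
# fewer than 12 credit hours in the first semester -> part time.
FULL_TIME_THRESHOLD = 12  # credit hours, roughly four classes

def enrollment_status(first_semester_credit_hours: float) -> str:
    """Classify a cohort student by first-semester credit load (hypothetical helper)."""
    if first_semester_credit_hours >= FULL_TIME_THRESHOLD:
        return "full-time"
    return "part-time"

print(enrollment_status(15))  # prints full-time
print(enrollment_status(9))   # prints part-time
```

Note that the rule keys off the first semester only, so a student who later switches to a full course load would still be counted as part time in the cohort.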

Across 189 schools, roughly 53 percent of the cohort students are classified as part time. But the range is large. We have a college in our sample for which just 9 percent of the cohort is part time and a college with an 80 percent part-time cohort, and everything in between.

Just like full-time students, part-time students are not a monolith. A student may enroll part time, particularly in that first semester, for a variety of reasons. Students who start out part time might switch to full time and vice versa. Sometimes this happens more than once throughout their college career. This isn't necessarily a bad thing. It's often necessary for students with significant responsibilities outside of school.

One thing we hear consistently is that community college students often define themselves as something other than a student first — for example, a parent, a caregiver, a mechanic. This isn't an indicator that they don't value their education. They just have responsibilities that demand their time in ways that a traditional student would not.

Flexibility is something we hear frequently as being an important offering that community college students seek. From a workforce perspective, it's valuable to have postsecondary options that can accommodate a student who needs or wants to work while investing further in their career. So, even though these students tend to have lower success rates overall [and] lower graduation rates, we think it's important to capture them.

Sell: The data showed that the aggregate success rate among full-time students was 56.8 percent, whereas part-time students achieved a lower success rate of 46.5 percent. As I previously mentioned, the comparable traditional graduation rate was 33.8 percent. So, while the part-time success rate is lower than the full-time [rate], it's not necessarily low.

We found that part-time students were more likely to transfer prior to receiving an award, particularly in suburban and urban settings where a four-year institution may be more accessible. They also obtained fewer diplomas, certificates, and associate degrees than full-time counterparts. Associate degree attainment continues to be the primary driver differentiating full-time and part-time student success, which is a trend we also saw last year.

Contrary to our expectations, persisters were not more prevalent among part-time successes. Some of this might be driven by our imperfect measure of "part-timeness," but there could be other reasons. Maybe a lot of these part-time students enrolled in trades or workforce-oriented degree programs and they might leave once they get a certification outside of the school from a third party. Then, we wouldn't be able to see them in our success rate.

Sablik: That's really key to mention that your measure captures more successes than the graduation rate, but you still might not be capturing everything. Some of these students are still hard to track.

What do you have planned next for this work with the survey?

Norris: We have a lot of data and we've only just scratched the surface in this first release. Throughout the year, we're going to be writing posts and sharing results through the Community College Insights blog and Regional Matters on richmondfed.org. We'll be digging deeper into the success rate data and exploring more deeply how state policies and funding models influence student success and institutional capacity.

There are two growing segments of community college enrollment that we consistently hear about but for which there's no consistent, readily available data. One is the dual-enrollment high school students I mentioned before. They make up the majority of enrollment at some community colleges and, in some cases, are earning certificates and degrees before or right when they graduate from high school. We'll be looking at the scope and scale of high school enrollment at community colleges and exploring what this might mean for the future of higher education.

The other segment is students enrolled in short-term, non-credit workforce programs, which are increasing in popularity but not tracked or funded consistently. Particularly with Workforce Pell around the corner, there's going to be increased interest in these programs and the data associated with them. We do collect data on non-credit workforce programs. It is very difficult to get it consistent across states, but we're excited to share what we've learned and look at how state policies influence that as well.

And, of course, we do want to be able to track differences over time in the states in our sample. So, we are gearing up to collect a new round of data.

Sell: We are deep in planning for our 2026 data collection kickoff, and we're in contact with our data partners in each state. We're very excited to keep moving forward and continuing this work to equip our stakeholders across education, workforce, and community leadership with a more holistic way to consider community college outcomes and their role in the workforce pipeline.

Sablik: Stephanie and Davy, thank you so much for joining me today.