Kevin Carey is the director of the education policy program at New America Foundation, where his research topics include higher education reform, improving college graduation rates, and online education. Prior to joining New America, he worked as the policy director of Education Sector and as an analyst at both the Education Trust and the Center on Budget and Policy Priorities.
Carey’s writing has received two Education Writers Association awards for commentary. He has published articles in The New York Times, Slate, The New Republic, Washington Monthly, The American Prospect, The Chronicle of Higher Education, and Democracy, among others. He also appears frequently on media outlets including CNN, C-SPAN, and NPR.
Carey has authored a forthcoming book titled The End of College: Creating the Future of Learning and the University of Everywhere. “It’s about how the traditional higher education system and organizational model we’ve relied on for the last 100 years is colliding with the information technology revolution in all kinds of interesting and somewhat provocative ways, and what that’s going to mean for colleges and students in the future,” he says. The book is to be published by Riverhead Books in March 2015.
Carey holds a bachelor’s degree in political science from Binghamton University (New York) and a master’s degree in public administration from The Ohio State University.
Kevin, could you first tell us a little bit about the work you do and how you developed a passion for making it your career?
I direct the education policy program at the New America Foundation. New America is a relatively new think tank (for Washington, D.C.), at about 15 years old. We work on a whole range of issues—everything from foreign policy to early childhood, from asset building to health policy. I’ve worked for a variety of nonprofit research and advocacy organizations, but I started my career working in public policy. I was an analyst for Indiana’s state senate, and I also worked as the assistant state budget director in the late 1990s, mostly with a focus on education finance. I began with questions of how we decide how much money different schools and public universities are going to get. So that was my exposure to how public higher education policy looks in a typical state, with a range of institutions from community colleges all the way up to flagship universities.
And then my interest here in Washington began eight or nine years ago, when the federal government released its first comprehensive set of graduation rate measures. It had been tracking them for a while, but this was the first time it showed for every college what the cohort graduation rate looked like—not just a raw number, but segmented for men and women, racial/ethnic minorities, etc. What I noticed in looking at the data were some pretty troubling numbers at some institutions—I mean, graduation rates in the 40%, 30%, 20% range, and sometimes even less. And then you could also look at a group of institutions with very similar kinds of missions and levels of funding and academic profiles of incoming students, yet see pretty different graduation rates. And if you talked to the institutions and asked them why, usually the reason behind it was that some institutions were just a lot more focused on trying to be good at that than others. At the time, the national higher education policy conversation was more limited; it was mostly around federal financial aid research, which still tends to be a lot of what we talk about. But since then, I think there’s been a growing awareness among policymakers and the public at large that there are some big-picture issues that need to be addressed if we’re going to meet our national goals for developing human capital and providing opportunities—and that if we don’t address those goals, we really run the danger that higher education will fall away from its historic role as an agent of economic mobility and instead become a system that actually reinforces inequality rather than acts against it.
As you look at things from a public policy perspective, colleges and universities often look at things from a business perspective, and students and families of course view things from a personal perspective. So is your work very much impacted by the question of what college is actually for?
I don’t think colleges ask themselves that question very often. It’s been a long time. I think the last serious conversation we had in this country about what college is for was probably the 1870s and 1880s, when we more or less decided that it would be for everything. We would create an organizational model that could accommodate the graduate research mission and the undergraduate, more liberal arts, mission, plus our various needs for workforce training, and just put them all in one place and kind of let them loose to act in their own interests. And that was, in a lot of ways, a very successful set of choices we made. But it had some long-term consequences, and I think we’re really feeling them now. We’ve created institutions that have a very strong sense of self—and a very strong sense of self-interest—but are trying to do many, many things, even as they inhabit a higher education culture of prestige that’s built around selectivity and research prowess. As a result, all of those institutions collectively made choices that start to run against the public interest.
Trends in college pricing and student debt are the most obvious manifestations of that. Nobody wants students to have to borrow more and more money to go to college. And while, to some extent, that’s a function of states’ choosing to pull money out of public colleges and universities, the same thing is happening in private universities. So you could put all of the onus on state lawmakers or the after-effects of the Great Recession, and I do think institutions are sort of struggling to figure out how to survive from year to year. But that leads them to approach their missions often without a real strong sense of self-reflection about some of the core purposes—particularly when it comes to undergraduate learning, which I think often gets short shrift in both the conversations and, more importantly, institutional choices.
Do you get much push-back from colleges and universities when you point out this lack of focus on teaching undergraduates?
It’s interesting. I get a fair amount of push-back from people in public, and then in private, people say, “Oh, yeah, I think that’s all true.”
In a lot of ways, I think my job is easy. All I really do is take the things about higher education that people within higher education are too polite to say about one another and say them. Because I don’t work in a college or have a career in academia, there’s no downside for me. Still, it’s amazing to me the number of things that everyone just admits. Like, for example, I can’t tell you how many people I’ve run into who, at some point in their career getting their doctorate and moving into academia, have said, “Yeah, you know, I tried really hard when I was a graduate student to do a really good job of teaching my courses, until one of the older faculty members said, ‘Hey, you really need to spend less time trying to do a good job teaching, because that’s not what’s important for your career right now. You need to be publishing.’ ” And that’s just a common refrain. I think lots of people have had this explicit message of “research is what matters; teaching doesn’t.” And in fact, if you do too good a job teaching, that is seen as prima facie evidence that you are not serious enough about your scholarship. Well, I just don’t think it can be any more obvious than that! I don’t think that requires any further explanation or comment. It just sort of obviously speaks to a set of institutions that are not adequately focused on trying to do the best job they can in helping these students learn. And every time anybody tries to somehow measure how much students learn in college, they don’t get results that would be very encouraging to anybody. That’s true of the “Academically Adrift” research, it’s true of the new data that OECD [Organization for Economic Cooperation and Development] released last year . . . one can argue about all those studies in terms of the methods they used to measure student learning, but it’s not as if there’s some other set of studies that had different results.
And the thing is, everybody in the system is more or less just a creature of the system. Nobody who lives today created the university as we know it. Those systems were created a long time ago. By and large, most people—not all, but most people—who end up in academia have, in some way or another, a strong commitment to the idea of education. They’re all creatures of education and people who were very successful as students, and that’s why they ended up doing what they’re doing. And they’d like to do a good job, but they also want to have a career—and be paid, and have job security, and be recognized by their peers—so they make compromises. It’s one of those things that’s not any one person’s fault. And usually those are the great tragedies: the systems in which it’s not a function of one person doing something wrong (because I think most people are good people), but rather everyone collectively making rational choices that get us to a place that no one would want to be.
Does that apply even at small residential colleges that have more of a liberal arts orientation?
A small liberal arts college should have the least problem figuring out what it’s about. It’s not a research university. It’s not a huge for-profit endeavor. It by definition can’t be large. There’s a long tradition of what a liberal arts curriculum looks like (although surprisingly few liberal arts colleges are really as committed to that as they would say they are), and the liberal arts sort of replaced organized religion as the spiritual underpinning of higher education. All of our colleges and universities were explicitly religious institutions until about 130 years ago; all of the college presidents would come from the clergy. That all faded away with modernity and the broader secularization of our society. So in order to maintain their exalted position in society, a lot of colleges swapped in the liberal arts ideal as the “higher purpose” of higher education—which I think is wonderful, and I think representing those values is one of the important things that colleges do. But if you look at how they actually educate, and how they teach, and what students are required to do (or more importantly, not required to do) in order to get a diploma, I don’t see anything like a kind of on-the-ground commitment that matches the rhetoric. Most places, you can easily get a diploma without engaging in any serious or sustained way with difficult intellectual ideas or the broader intellectual tradition. It’s not hard.
So I actually think, in some ways, liberal arts colleges ought to be better positioned than others for the future because they can plausibly say, “Our model requires us to be small and kind of expensive.” There’s no substitute for living in a genuine community of scholars and teachers. There’s no substitute for the totality of our commitment to teaching liberal arts—if that’s what they’re actually going to do. So in that sense, I think their path forward is clear. It’s actually the less selective and newer comprehensive universities where it’s much less obvious why they need to exist in the future.
Do you sense that colleges and universities overall are now more predisposed to thinking about doing things differently in order to be sustainable?
They should be. But at the same time, there are many smart people within universities who study organizational theory professionally, and if you wade around in the literature, it doesn’t suggest that organizations that have been one way for a long time are going to just up and be a different way. That’s simply not the way organizations behave—and particularly not organizations that are as settled into their culture and their particular sense of interest as colleges and universities are.
I don’t mean to be fatalistic when I say that. But colleges are also protected in many ways by public subsidy and by regulation from certain kinds of competition in the marketplace. So they have continued to make a set of rational mid-term choices about whether or not they really have to change and be entrepreneurial, or whether they can just keep trying to persist with the business model they have. And so far, the latter hasn’t been a bad choice. We continue to now and then lose institutions that are very weak financially and on the margins of the sector, but it’s not like college is the newspaper industry, where you see vast layoffs and once-august institutions disappearing from the face of the earth over the course of a few years.
But I think to some extent, that choice of trying to persist is delaying the inevitable. It’s just knowing when the inevitable is going to arrive, exactly—and if it’s something that’s going to happen on your watch—that is a tricky estimate to have to make. And I think people are being conservative (and somewhat in a rational way) about that.
Because students and parents have their eyes open, and they understand what it takes to get ahead (or at least not fall behind) in the modern economy. And that means a higher education credential. That fact, by the way, is what has kept the entire college industry alive and thriving for the last 50-odd years.
Ironically, though, employers and new graduates alike seem dissatisfied with the manner in which a college education provides job preparation. So do you foresee more partnerships between employers and colleges to enhance workforce training?
I have a broad sense that the old career center is not good enough. By that, I mean the place somewhere on campus that you could find if you really tried hard to get there, where there were a few people who could give you some broad sort of résumé writing tips, and that was the end of it. That was the reality of the intersection between higher education and the workforce for a lot of people for a long time. Now I think colleges are recognizing that they’ve got to up their game.
Interestingly, I recently talked to the CEO of a company called Koru. What they do is provide a post-graduation experience that lets graduates of liberal arts colleges spend three or four months in internship-like early career training experiences with technology companies, where they learn the basics of what it takes to work in a fast-paced company that does interesting things. And the college is actually paying Koru in order to provide this service to its new graduates. This is something the liberal arts college is paying for so it can reassure parents who are thinking about sending their kids there but are worried: “Hey, am I going to spend $200,000 and then get in return for that a medieval studies major who is going to move back into his or her bedroom? Because I don’t want to do that. I have plans for that bedroom!” And so the colleges are actually recognizing they’re really not set up to prepare people—particularly at the high-end, more elite part of higher education, where they’re not set up to train people to go into careers at all. It actually is worth their while to engage with a third party to do that.
You also asked about employers’ views. Sure, companies are constantly dissatisfied with the graduates they get. There is a gulf between the self-perception of college administrators in terms of how well they think they’re doing in preparing their students, and your average employer in terms of their assessment of new college graduates. At the same time, companies really don’t want to spend any money training people, and they’ve been pulling back on the investments they used to make in their own workforce. Because people are mobile now and change jobs, companies don’t want to spend a lot of money training someone to be good at their next job for a competitor. So it really is leaving students in a tough place, where they’re having a hard time making this transition from their credential—which wasn’t really job-oriented—to a good enough job where they can kind of progress on their own. There’s been a situation of the colleges not being set up to be good at that and the employers wanting their employees to be ready to contribute on Day One. I think colleges and institutions that can make a credible claim of helping to solve that problem will be well positioned in the future.
Do you think the way forward will involve collaboration with peer institutions, as well, to eliminate redundancies and move away from trying to be everything to everyone?
Well, the logic of colleges has always been, all the way back to the University of Bologna 900 years ago, that if you wanted to learn, you needed to be in the place where all the other smart people and books were. And that was it. That’s how you learned. You learned by being with other students, by being in proximity to a scholar, and being near a library full of books. And just for reasons of pure scarcity and logistics, there couldn’t ever be very many places like that. And you would locate them in centers of commerce, normally—wherever all the roads came together, that’s where you’d put it, where there was a big enough economy to support a level of civic society. And so that idea of colleges as scarce and expensive places kind of drove colleges to be what they are (because it cost a lot of money to build a building big enough to put all the books in, and because there were only so many brilliant scholars in any given field so there were virtues in locating them all in one place).
At the same time, I think the genius of the American system of higher education was to unleash the ambitions of individual colleges and give them space to try to be as successful as they knew how. And that very much led institutions to try to be as many things to as many people as possible. The bigger you could build your walled city-state of learning, the better it was. The more books and smart people you could bring in and the more program offerings you could have, the better and more attractive you were, and the more well positioned you were to compete with the other academic city-states to attract people to you. The logic of that made sense more or less all the way until around the turn of the century. I went to college in the late ’80s and early ’90s, and it still made sense then. I mean, there were computers, but it wasn’t as if the computers were a replacement for the library. The possibility of this being really different is a post-Internet possibility that we’re only really starting to grapple with now.
But now that we live in a world where you can log onto your computer and take, for free, classes offered online by the world’s most famous universities, it’s not at all obvious why your university then also has to have that class. There is no answer to that, because there’s no reason for it to do that. So I think one of the things absolutely every university is going to have to do in the future is really figure out what it is good at and try very hard to be good at that, and only that. And not everybody needs an English department—they just don’t.
Are there additional ways that technology has the potential to change the organizational model of higher education? Perhaps by doing away with the credit hour, for example?
The question that looms out there is, “What is the thing that will lead to the destruction of higher education?” And clearly, it’s not MOOCs. Just having free classes from great universities is not enough. And the reason is because people don’t go to college just to learn. They go to college to get a credential. In fact, that’s the main reason to go to college.
A credential needs to have some kind of rational basis, and right now the credit hour is that rational basis. And the history of the credit hour is fascinating. It wasn’t created in order to measure student learning; it was created essentially in order to measure faculty workload, for the purposes of figuring out who was a full-time faculty member, in turn for the purposes of figuring out who was eligible for a pension. That was a long time ago, when colleges had to grow enormously in order to accommodate the great post-G.I. Bill surge, American prosperity, and changes in the economy that sent 75 percent of high school graduates to college. There was just a strong need for bureaucratic systems that could regularize that whole process, and that’s what gave us the credit hour.
The problem is, the credit hour makes everything about time. It means that all of our degrees are expressed in terms of time. What does a four-year degree say? It says you went to college for four years. Explicitly, it’s a four-year degree; it’s not a “here’s how much I learned” degree. Consequently, it reinforced this inattention to the question of how much students were learning in college. And it very much made sense in the context of the scarce, expensive academic city-state, because the diploma said, “I was at University of Such-and-Such for a certain amount of time.”
Well, now that there are ways to credibly learn and form academic communities outside of those city-states, it doesn’t make sense to have credentials that are just focused on how long you were there, since you can learn a lot without ever going anywhere near one of those places. All of which means that we need to have credentials that are based in what they’re supposed to be about—which is student learning. If we can create a credentialing architecture that has credibility in the labor market that people can then start to organize around, that could be revolutionary.
Do any specific colleges jump out at you as being truly innovative in other ways that are particularly responsive to market conditions?
There are thousands of colleges and I’ve only been to a few of them, so I don’t like to represent myself as someone who can say, “Of all colleges, this is the one that’s doing things.” Earlier this year I spent some time at Davidson College in North Carolina, which struck me as being interesting because it is in many ways a very traditional, elite liberal arts college. It continues to have an authentic commitment to undergraduate liberal arts education—there’s a long tradition there of students taking a multi-semester humanities curriculum (although not as many are taking it as used to). But they’re also very much engaging with opportunities for higher education online. They were one of the first liberal arts colleges to join the EdX consortium. They have a lot of professors there who are very excited about the idea of engaging with the creation of these very large online communities.
And the thing is, in the United States right now we’ve created the greatest collection of higher education resources in history—but the system of walled city-states only allows a tiny sliver of humanity to access it. Harvard is never really going to be any bigger than the number of freshmen they can fit into dorms or Harvard Yard. That’s what Harvard kind of means. But there are hundreds of millions of people around the world that would love to engage with the intellectual resources of Harvard University. So institutions that are forging ahead to try to simultaneously be good at what a real residential education looks like and also create new kinds of learning communities using technology—I think that’s got to be the way forward, both from the standpoint of relevance but also economically.
If you can reach a lot of people, you only have to get a little bit of money from them in order to make that an economically worthwhile thing to do. Whereas, I think a lot of colleges right now, they begin with the number of people they want—maybe it’s a thousand, maybe it’s two thousand; every enrollment manager in America has a number that probably keeps them up at night. Basically they say, “I need this many students, and those students need to collectively give us this much money.” And every year the number stays about the same, but the amount of money gets bigger. And that’s the dilemma they have.
Well, what if all of a sudden you can extend your resources to four or five or six orders of magnitude more people than that number? Most of them can pay you nothing, and maybe some of them can pay you a little bit, and still that can be a worthwhile thing to do—and will help you continue to do this for the conceivable future.
Is that perhaps a way to deal with “disruption” in the higher education industry?
I think Clay Christensen’s disruption thesis is relevant to how we think about higher education. Any broad theory is going to vary in exactly how explanatory it is—and no, colleges are not exactly like the integrated steel industry or what have you. But the underlying dynamic that Christensen has been talking about for a while is that competition starts with low-cost companies competing for the customers that the incumbents don’t really want. And it seems it’s not hard to fit the present circumstances of higher education to that progression. Organizations like colleges that have a combination of a lot of public resources (in the way of regulatory protections and subsidies) and also are very deeply embedded in the culture (in a way that steel manufacturers aren’t) have a lot more resources to push away the forces of disruption—for a while, anyway.
Now, that can play out in one of two ways. It may mean that they’re able to sustain their business model for even longer before the inevitable day of reckoning (which makes the day of reckoning even tougher to handle), or that the future is some kind of more interesting combination of things—which seems most likely to me. To me, the biggest thing is not so much that the landscape of higher education is wiped out and replaced with something, so much as that technology changes the economic logic of creating new higher education organizations. Whatever percentage of all college students are enrolled in traditional colleges today, in the future that percentage is going to be substantial, but also substantially smaller. We will have a much more varied organizational landscape, we’ll have a much more varied credentialing landscape, and traditional colleges will not have a monopoly.
Every student of a certain age feels like they need a degree in order to have the kind of life they want to have. And in the near future, traditional colleges are going to have to compete harder with a much broader and more varied set of organizations that can credibly claim to provide that.