The Four-Year Degree Was Built for a Different America
A degree once promised economic security — now it's an expensive gamble
I make my living inside higher education.
I teach politics at George Washington University — a private, well-branded, expensive institution planted squarely in the middle of Washington, DC. From our campus windows, students can see the White House, the World Bank, and the State Department. Many of them will go on to work in those buildings. In my world, a college degree — and often a graduate degree — isn’t optional window dressing. It’s the price of entry.
From inside the system, it’s clear that higher education is facing pressures it waited far too long to confront. And the old assumptions — that college is always worth it, that demand will always rise, that families will always find a way to pay — no longer hold.
When I tell people where I work, it’s often followed by a pause and a question that didn’t exist a decade ago: Is college still worth it? Not your college specifically — but the whole enterprise. The four-year degree. The graduate school path. The price tag. The debt. The payoff.
The brutal, honest answer is sometimes yes, sometimes no — and the difference matters more than higher education has been willing to admit.
College still offers things that are impossible to quantify neatly on a spreadsheet. It gives young people space to grow up without immediately being optimized for productivity. It creates friendships, mentors, and professional networks that last decades. It teaches students how to live alongside people unlike them. Those experiences complicate any clean return-on-investment calculation, and they should.
But cost complicates everything.
Tuition has soared. Borrowing limits haven’t, and neither have the wages needed to pay back the loans. Families are asked to take on debt that would have been unthinkable a generation ago. Meanwhile, cheaper and faster alternatives — trade schools, apprenticeships, microcredentials (e.g., accounting, Excel, or HR certifications), employer-led training, even AI — have gotten better, more legitimate, and more aligned with today’s labor market. For many students, especially those not aiming for credential-heavy professions, the traditional model now looks less like a launchpad and more like a gamble.
Higher education isn’t irrevocably broken. But it is misaligned with the economic, technological, and political realities of the country it serves.
From where I sit, the real challenges aren’t the caricatures that dominate cable news or social media. They’re structural. Financial. Demographic. Cultural. And deeply intertwined. Colleges are trying to serve more students with fewer resources, defend public trust while navigating political crossfire, adapt to AI without hollowing out learning, and justify costs in a world that no longer assumes college is the default path to adulthood.
What follows is not a takedown of higher education. It’s an honest field report from someone who believes in the institution enough to say where it’s struggling. Colleges still matter, perhaps more than ever. But the model that carried them through the last half century is straining under the weight of a very different world.
1. The Cost Spiral — and the Quiet Role of Borrowing Limits
Let’s start with the elephant in every admissions office: college has never been more expensive. The average annual cost of attendance (tuition, fees, room, and board) at private, nonprofit four-year institutions now exceeds $60,000, while public in-state attendance runs about $25,850 per year — more than double the 1990s figure in real terms. The sticker price for an incoming freshman at George Washington University is estimated at just over $92,000 per year (assuming no scholarships).
Part of the long-term rise in tuition is undeniably linked to the expansion of federal student lending itself. Easy access to loans helped subsidize the purchase of higher education, increased demand, and reduced the immediate price sensitivity of families — giving institutions more room to raise prices over decades.
But here’s a critical insider detail: federal borrowing limits haven’t come close to keeping pace with tuition increases. Dependent undergraduates can borrow only $5,500 to $7,500 annually in federal loans, limits essentially unchanged since 2008 and far below actual costs at places like GWU. For most students, the difference must be made up elsewhere, often via Parent PLUS loans, private debt (with worse loan terms), or family help. That gap between tuition and borrowing limits leaves many students vulnerable after graduation, especially when starting salaries in many occupations (hello, my beloved educators) aren’t exactly making new graduates rich.
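To make that arithmetic concrete, here’s a minimal back-of-the-envelope sketch. The cost of attendance and the federal limit come from the figures above; the grant and family-contribution numbers are pure assumptions for illustration, not GWU data.

```python
# Rough sketch of the annual funding gap for a dependent undergraduate.
# The aid and family figures below are illustrative assumptions only.
cost_of_attendance = 92_000    # GWU sticker price cited above
federal_loan_limit = 7_500     # top dependent-undergrad federal loan limit
institutional_grant = 35_000   # assumed scholarship/grant aid
family_contribution = 20_000   # assumed out-of-pocket payment

gap = cost_of_attendance - federal_loan_limit - institutional_grant - family_contribution
print(f"Left to cover: ${gap:,}")  # $29,500 via PLUS or private loans
```

Change the assumed aid numbers however you like; at these prices, the federal loan limit barely dents the total.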
In total, US student loan debt hit $1.814 trillion in 2025, with the average borrower owing $42,673. Black graduates owe an average of $3,800 more than their white peers, and 48% of Black borrowers owe more than they originally borrowed four years after graduation because of accruing interest. The steady climb in tuition — far outpacing wage growth for most entry-level jobs — coupled with stagnant federal loan limits has left many students rightfully pausing to weigh whether decades of debt truly represent their best path forward.
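That last statistic is just compounding at work. Here’s a minimal sketch, assuming (purely for illustration) a 6.5% rate and a $200 monthly payment, of how a balance can grow even while a borrower keeps paying:

```python
# How a loan balance grows when payments fall short of accrued interest.
# The rate and payment here are illustrative assumptions, not survey data.
balance = 42_673            # average borrower balance cited above
monthly_rate = 0.065 / 12   # assumed 6.5% annual interest
payment = 200               # assumed payment, below the ~$231 first month's interest

for _ in range(48):         # four years after graduation
    balance += balance * monthly_rate - payment

print(f"Balance after four years: ${balance:,.0f}")  # roughly $44,400
```

Nearly $10,000 in payments, and the borrower owes more than they started with.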
And when enrollment declines, most institutions look to make up the revenue shortfall by raising tuition on those still committing to attend. The cycle then continues.
2. The Endowment Paradox: Wealthy Schools, Rising Prices
Let’s stick to the money theme. Students ask me every year: “How can universities with massive endowments keep raising tuition? Can’t they tap the endowments to lower costs?” It’s a good question. Several elite institutions hold endowments larger than small nations’ GDPs. Harvard tops the list with an endowment of $53 billion in fiscal year 2025. One hundred and forty-nine higher ed institutions report endowments of over $1 billion.
But despite being thought of as university slush funds, endowments aren’t checking accounts. In fact, about 55% of endowment assets at private colleges are donor-restricted, earmarked for specific purposes such as research initiatives or campus construction projects rather than general operations. At Harvard, less than 5% of assets are unrestricted.
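A quick back-of-the-envelope calculation shows why the headline number misleads. Using the figures above, plus an assumed (and fairly typical) 5% annual payout rate:

```python
# How much of a headline endowment is actually flexible money.
# The payout rate is an assumption for illustration.
endowment = 53_000_000_000   # Harvard, fiscal year 2025, cited above
unrestricted_share = 0.05    # "less than 5%" per the figure above
payout_rate = 0.05           # assumed annual endowment spending rate

flexible = endowment * unrestricted_share
print(f"Unrestricted assets: ${flexible / 1e9:.2f}B")                # $2.65B
print(f"Spendable per year:  ${flexible * payout_rate / 1e6:.0f}M")  # ~$132M
```

Real money, yes, but a small fraction of the operating budget of a major research university.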
At GWU, as at nearly all colleges, we rely on tuition as the default revenue lever when costs rise for health care, compliance, technology, and student services. In the end, this mismatch between total wealth and operational realities deepens the affordability crisis, leaving institutions vulnerable and families footing an ever-growing bill.
3. Return-on-Investment Anxiety — and the Outdated Four-Year-Degree Model
For most Americans, the assumption has long been that a college degree is all but required to land a good job. Now the return-on-investment (ROI) question is no longer whispered — it’s shouted: Will this degree lead to a job that justifies the cost?
On average, graduates still earn more over their lifetimes, but those figures mask significant variation across majors and industries, with some fields offering strong wage premiums while others leave debt-burdened alumni struggling. Students who borrowed for their bachelor’s degree in 2025 took out an average of $35,639 in education loans, and public sentiment reflects a growing suspicion that they won’t get their money’s worth on the debt: only 35% of Americans now consider college “very important,” per a 2025 Gallup poll, a record low. That’s down from 51% in 2019 and 75% in 2010, just 15 years ago.
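One way to frame the ROI question is a simple break-even calculation. In this minimal sketch, only the $35,639 average borrowing comes from the data above; every other input is an assumption for illustration:

```python
# Years for a graduate wage premium to recoup the full cost of a degree.
# Only avg_debt comes from the text; the rest are illustrative assumptions.
avg_debt = 35_639             # average 2025 borrowing cited above
out_of_pocket = 40_000        # assumed family contribution over four years
foregone_wages = 4 * 32_000   # assumed earnings skipped while enrolled
annual_premium = 20_000       # assumed graduate vs. non-graduate wage gap

total_cost = avg_debt + out_of_pocket + foregone_wages
print(f"Break-even: {total_cost / annual_premium:.1f} years")  # about 10.2 years
```

Halve the premium, as happens in plenty of fields, and the break-even horizon stretches past two decades. That spread is exactly what the averages hide.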
The traditional four-year, residential model was designed for a more stable economy, complete with employer-provided training, long career tenures, and clearer lines of career advancement. A key pillar of this structure is the requirement that students complete a wide variety of general education classes — often 30–60 credits — in the humanities, sciences, and social sciences to become well-rounded graduates capable of critical thinking and adaptability. But with costs soaring, many students are questioning whether footing the bill for these broad requirements is truly worth it, especially when they delay entry into major-specific coursework and the workforce.
That’s especially true in today’s gig economy and skills-driven job market. Trade school enrollment is projected to grow 6.6% annually from 2025 to 2030, driven largely by Gen Z, while registered apprenticeships have surged to about 680,000 in fiscal year 2024, more than doubling from a decade earlier. Schools in Indiana, Utah, Idaho, and Maine are offering or piloting three-year bachelor’s programs to trim time and expenses.
Graduate programs continue to proliferate as institutional revenue-boosters, but freshman enrollment for 18-year-olds declined 5% year over year in fall 2024, amid rising tuition and skepticism. In my classes at GWU, I increasingly hear students compare traditional graduate school with coding bootcamps, employer-backed certificates, and industry microcredentials that promise faster, cheaper, and more targeted returns. For many, the old path feels increasingly misaligned with the labor market they’re entering.
Rethinking degrees as modular, stackable credentials doesn’t mean eliminating general education or turning college into narrow job training. It means reimagining the structure itself. Instead of front-loading two years of broad requirements disconnected from career goals, institutions could integrate general education into majors, shorten time-to-degree, and allow students to “stack” credentials over time — earning a credential after one year, another after two, and returning later as careers (and technology) evolve. In this model, breadth still matters, but flexibility becomes the feature, not the exception. The challenge is cultural as much as curricular: abandoning the assumption that learning must occur in one uninterrupted, four-year block to count as legitimate.
4. Teaching vs. Research: A Conflation with Consequences
I’ll bet you can think back to a class (or six) you had in college and vividly remember how clear it was that the professor didn’t want to be there, or how ineffective they were at building relationships and actually teaching the material. They didn’t care.
This tension grows out of a structural choice universities rarely admit out loud. The issue isn’t that research and teaching are confused — it’s that institutions insist the same person excel at both, even though they demand entirely different skill sets.
Research drives university prestige, secures grants, and boosts rankings — as in the Times Higher Education World University Rankings, where research quality accounts for 30% of the score, including 15% specifically for how often the schools’ scholars are cited by other academics.
Effective teaching, on the other hand, emphasizes student growth, engagement, and skill-building. Often, the best researchers are the worst teachers and the worst researchers are the best teachers.
Research primarily advances the institution; teaching serves the students. Requiring faculty to maximize both may be administratively convenient, but it often comes at the expense of the classroom.
Inside the institutions, faculty promotion and tenure decisions overwhelmingly pivot on publication records and research output, not teaching effectiveness or innovation. Put simply, for many professors teaching is a job requirement, but research is what gets celebrated and rewarded.
In practice, this means brilliant scholars are thrust into teaching roles simply because it’s part of the job, while standout educators — who might excel at fostering real learning — are frequently undervalued or overlooked for lacking a robust publication portfolio. Reforming tenure processes to equally value teaching and research could bridge the gap, but it carries the risk of alienating top researchers who bring in the grants that sustain operations.
5. AI’s Double-Edged Sword: Integrating Tech While Preserving Original Thought
Outside of COVID, nothing has proven more disruptive during my tenure than the explosion of AI. It has fundamentally reshaped higher education, and we won’t ever go back to the before-times. While there are clear benefits to the technology — tutoring, editing and brainstorming, even personalized debates for deeper thinking — most students see AI as a cheat code that does the work for them, in their voice, in a matter of seconds. (Yes, the tech isn’t perfect, but it’s getting better fast, in no small part because students are training it to stop making dumb mistakes.)
And here’s the scary part: the software universities invest in to detect AI-produced work is behind the technology itself, leaving professors mostly on their own to spot potential academic fraud based on a few telltale signs (like regular use of em dashes, which really sucks because I love them!). I guarantee we miss more than we catch.
Adoption rates underscore the scale of the challenge facing us. Forbes reports that 90% of college students are using AI for academic purposes just two years after ChatGPT’s launch. Two recent studies highlight that many college students are offloading higher-order thinking to AI, relying on chatbots to handle complex tasks like idea generation and analysis, potentially stunting their ability to engage deeply with material.
Overreliance on these tools risks eroding critical thinking: students who let a chatbot do the reasoning may never develop the problem-solving and analytical abilities independent thought requires. One survey found that 55% of AI-using students report impacts on their learning and critical thinking abilities.
Forward-thinking universities won’t pretend this genie can be shoved back into the bottle. They’ll do the harder work of redesigning courses, assignments, and expectations on the assumption that AI is present — and then teach students how to use it responsibly, transparently, and critically. This means treating AI not as a shortcut to avoid thinking but as a tool that demands better thinking.
The institutions that get this right won’t just police cheating more effectively; they’ll graduate students who know how to collaborate with powerful technology without surrendering their own judgment. In a world saturated with AI, that may be one of higher education’s most important — and defensible — value propositions.
6. Demographics and Enrollment Declines: The Unavoidable Cliff
This challenge has nothing to do with branding, politics, or campus culture wars. It’s math.
Beginning in 2008, with the onset of the Great Recession, the US birth rate fell sharply — and it never fully recovered. That drop is now rippling through higher education as smaller cohorts of students reach college age. According to the National Center for Education Statistics, the number of high school graduates is projected to decline through at least 2033, with especially steep drops in the Northeast and Midwest. This is what administrators grimly call the “demographic cliff.” It is not just a metaphor. It is a countdown.
The effects are already visible. In 2024 alone, at least 20 US colleges permanently closed, most of them small, tuition-dependent private institutions without large endowments or national draw. And many schools have decided to shutter certain majors, such as history, literature, and economics, because of a drop in student demand.
Enrollment data tells the same story. First-year undergraduate enrollment fell by roughly 5% nationally in fall 2024, even as institutions aggressively expanded recruitment efforts and tuition discounting. For many colleges, the problem wasn’t demand — it was simply that there were fewer students to recruit.
International students, long a stabilizing force for enrollment and revenue, are no longer a reliable backstop. Visa backlogs, geopolitical tensions, and policy uncertainty have taken a very real toll, with international student enrollment down 17% from 2024 to 2025.
As the pool shrinks, competition intensifies. Colleges are offering steeper tuition discounts, larger merit awards, and more generous aid packages just to keep seats filled. That helps students — but it crushes margins. The National Association of College and University Business Officers reports that tuition discount rates at private colleges now routinely exceed 50%, meaning many institutions collect barely half of their sticker price.
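The discount-rate arithmetic is brutal in its simplicity. A minimal sketch, with the sticker price assumed for illustration:

```python
# Net tuition revenue under a discount rate; the sticker price is illustrative.
sticker_price = 60_000
discount_rate = 0.52    # NACUBO: private-college rates now routinely exceed 50%

net_per_student = sticker_price * (1 - discount_rate)
print(f"Collected per student: ${net_per_student:,.0f}")  # $28,800
```

Every point of discounting buys enrollment at the direct expense of revenue, and the fixed costs of running a campus don’t discount themselves.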
The reality is clear: There are fewer students. No amount of marketing can fix that. Institutions that survive this decade will do so not because they won an ideological battle — but because they confronted arithmetic early, cut costs honestly, collaborated aggressively, and accepted that growth is no longer the default condition of higher education.
7. Trust, Politics, and the Culture War Erosion
For much of the 20th century, higher education enjoyed broad public support as a trusted institution. That confidence has fractured in the 21st century — particularly along partisan lines — and today politics shapes perceptions of American colleges and universities as much as pedagogy and outcomes do.
As recently as 2020, 69% of Americans believed that colleges and universities have a positive effect on the country, with 80% of Democrats and 60% of Republicans in agreement. Five short years later, only 39% of Republicans said the same, and Democrats had slipped to 74%. A 2024 Gallup survey echoed this trend, showing Republicans’ confidence collapsed from 56% in 2015 to about 20%, with half now reporting little or no confidence in higher education.
This distrust has not only weakened public goodwill — it has invited political action.
Under the second Trump administration, the federal government launched an aggressive campaign to reshape higher education policy. Executive orders directed federal agencies to terminate diversity, equity, and inclusion (DEI) initiatives and related grants and contracts, arguing they amounted to illegal preferences rather than lawful programs. The administration also advanced orders targeting campus antisemitism — including visa revocations and enforcement actions aimed at foreign students involved in protests — forcing universities to choose between compliance and legal challenges.
Federal funding has become a flashpoint. The Department of Education froze more than $2.3 billion in federal funds to Harvard University after it resisted federal demands to overhaul policies related to diversity, hiring, and student discipline, a move widely criticized as political interference in academic independence. Columbia, Northwestern, Cornell, and other schools faced similar freezes and have recently settled with the administration to get their funding back.
Still other schools, led by faculty, students, and unions from the University of California system, sued the federal government, arguing that its actions threaten academic freedom and unlawfully leverage funding to pressure campuses on admissions, speech, and curricula.
The result of the administration’s actions is that universities today must navigate not only academic challenges but a political landscape where public trust is uneven, ideology shapes funding battles, and any perceived departure from neutrality becomes fodder for legislative intervention, donor hesitation, or legal confrontation. For many institutions — especially expensive private universities — articulating civic value and academic mission in a polarized era is as consequential as balancing budgets or updating curricula.
So Is College Still Worth It?
From inside the system, my answer is an unsatisfying but honest one: college still works — but not automatically, not equally, and not in the way it did in the past. For many careers (including mine), a degree — and often a graduate degree — is still the price of admission. Colleges remain one of the few places where young people are given time and space to wrestle with ideas, test identities, build relationships, and learn how to think rather than just what to do.
But the trade-offs are no longer abstract. Cost, debt, uncertain returns, cheaper and faster alternatives, political pressure, demographic decline, and technological disruption have all exposed how long higher education delayed updating a model built for a very different economy and society. Students feel that gap acutely. Parents do too. And institutions that pretend otherwise are the ones most at risk.
The future of higher education won’t be decided by whether college is “good” or “bad.” It will be decided by whether colleges can change fast enough to justify the trust — and the tuition — they ask of students. Because today, the question isn’t whether learning still matters. It’s whether colleges are brave enough to prove they are still the best place to do it. The era of the unquestioned value of a college degree is over — and it’s not coming back.

Thank you for this piece, Casey. This is something I’ve been thinking about for a long time, and it’s got me reflecting on how we might need to disentangle two things we’ve conflated: college as a place to learn skills and college as the place where young people figure out who they are. The price tag becomes impossible to justify when 18-year-olds who are just starting to ask the big questions about life are thrust into decisions about the entire trajectory of their careers.
One alternative that I think deserves another look was floated during Pete Buttigieg’s 2020 presidential campaign: a national service program that would give young people options beyond military service or expensive education. The idea was to create opportunities in areas like community health, conservation, and infrastructure alongside traditional military routes. At the time, the only conversation I heard about it was people laughing and calling it out of touch, but I think it was ahead of its time.
It addresses the problem of what kids can do to buy some time before they figure out what they want and who they are. It would give young people a chance to grow up, contribute to their communities, and explore without the pressure to make the “right” choice while committing to six figures of debt or a military career that by definition isn’t for everyone.
The current system completely ignores the kids who know college is out of reach and know military isn’t right for them. They can end up without the kinds of networks and relationships you described as one of college’s intangible benefits, left only with whatever connections they happen to make at jobs they land without a degree.
And perhaps this national service idea could also educate students in some of the areas where we most desperately need better understanding: media literacy, civics, and emotional intelligence. As a supervisor in charge of hiring, I can tell you that someone who has spent even a year learning basic life skills is a lot more valuable than someone who spent four years studying a subject that may or may not align with the job requirements.
A national service option could fill that gap. I hope Mr. Buttigieg is still keeping that idea in his pocket if he’s considering a 2028 run.
Great analysis. Unless colleges and universities change their model to reflect changes in jobs, culture, and economics, they will continue to struggle. Burgat's third point about the outdated four-year-degree model was especially good. Making certifications and degrees more modular and stackable is brilliant.