
How Community Colleges Changed the Whole Idea of Education in America


In January of 2015, President Obama unveiled his “America’s College Promise” program – a plan to make two years of community college education available free of charge to “everyone who’s willing to work for it.” In offering the proposal, the president did not just venture a partial solution to the student debt crisis. He joined a growing community of thinkers who see the community college as central to solving a wide variety of problems in higher education, from cost and inclusivity to career-preparedness and community engagement.

The role of problem-solver is one that community colleges are well-equipped to play. Just over a century old, community colleges have been at the forefront of nearly every major development in higher education since their inception. To appreciate the role that community colleges can be expected to play in reforming higher education today, Americans would do well to consider their long history of innovation.

Community colleges, the United States’ unique contribution to higher education worldwide, were the humbler half of the wave of late-19th- and early-20th-century innovation that brought America the modern research university and revolutionized the dusty classicism of the country’s existing colleges. Born at the heady crosswinds of Gilded Age and Progressive Era politics, two-year institutions of higher learning were designed to address two very different sets of problems in postsecondary education.

Just what were these problems? For creative elitists, the foremost bugbear of higher learning was one of quality. True, Gilded Age America had been blessed with a raft of new institutions, from land-grant universities like Penn State to private research institutions like the University of Chicago. But these universities, in reformers’ eyes, were prevented from realizing their true potential by an outmoded institutional model that compelled them to teach generalist subjects to freshmen and sophomores and specialized topics to juniors and seniors.

Instead, they advocated a new model, based on the German system, in which the first two years of college would be separated from the final, research-oriented years of postsecondary study. This model would ‘purify’ research universities, helping these new schools achieve their mission of stirring the Ivies and other old colleges from their classical obsolescence. It would also guard the gates of the budding research universities, with the new junior colleges, as they were then known, allowing only their finest students to transfer credits to four-year institutions.

To sell the idea of a comprehensive junior college system to taxpayers, however, advocates like University of Chicago president William Rainey Harper had to tailor the system to public concerns. The foremost of these, for the populist-minded public, was not quality but access. While the 19th century had witnessed the creation of a large number of new institutions – including the public, land-grant universities noted above – many were far from the small towns and urban neighborhoods that anchored turn-of-the-century life, and the majority were incredibly expensive, with yearly tuition typically in excess of the average worker’s annual income.

Clamoring for both physical and economic access to college learning at a moment when advanced education was becoming key to social mobility (sound familiar?), Americans of a populist persuasion were responsible for the egalitarian streak of the junior colleges that opened beginning in 1901. Inexpensive, often publicly funded, and open to a wider cross-section of Americans than many of their four-year counterparts, these junior colleges were celebrated as “people’s colleges.” Though a far cry from full inclusivity, these male-dominated, majority-white schools nevertheless catered to a broader swath of working-class Americans than nearly any other contemporary educational institution.

Accessibility was not junior colleges’ only innovation, however. They were also instrumental in expanding the practical training offered by early land-grant colleges. This function became particularly important as the comprehensive system of junior colleges envisioned by William Rainey Harper and other elitist boosters failed to materialize. Forced to compete with better-known and better-funded institutions for liberal arts students, junior-college educators began to look beyond their role in preparing students for transfer, and instead imagined a position for themselves as vocational trainers.

Junior colleges’ embrace of vocational training began in earnest during the 1930s. Though present in many institutions’ curricula from their earliest days, vocational instruction assumed a profound new importance against the backdrop of the Great Depression. Confronted with thousands of unemployed students who had entered junior colleges when hard times struck, administrators and educators rapidly expanded their ‘practical’ offerings. These vocational training programs included not just handicrafts and manual arts, but white-collar courses of study like business, accounting, finance, civil engineering, nursing and marketing.

Though these vocational programs anticipated several of the 21st century’s fastest-growing college-level courses of study, they were rejected by many students as a distraction from their goal of attaining a B.A. and deplored by scholars as an effort to moderate students’ ambitions. More recently, however, students have warmed to vocational programs, as new research shows that associate degree holders in these fields not only out-earn their counterparts in the liberal arts, but transfer to four-year institutions at a higher rate as well. Vocational training, in other words, has been shown to offer students both security and flexibility – while foreshadowing the trajectory of undergraduate learning as a whole.

In the decades following the Depression, junior colleges – which were quickly renaming themselves community colleges to reflect their intimate relationship with their surrounding regions – underwent a period of unprecedented growth and innovation. Stimulated by favorable notice from educational policy makers in the Truman administration and a flood of World War II and Korean War veterans eager to use their G.I. Bill benefits, two-year schools grew dramatically during the 1940s and 1950s. But their most impressive period of growth arrived in the 1960s. Opening at an average rate of one per week during this decade, community colleges not only absorbed and educated a considerable portion of the Boomer generation; they also inaugurated many of the core features of the 21st century college while pioneering a revolutionary open-door admissions policy.

An even more extraordinary innovation during this period was community colleges’ embrace of a diverse student body. Though haunted by a lackluster early record on minority admission, community colleges desegregated more fully and more aggressively than their four-year counterparts, incorporating members of minority groups into a student body that already included large numbers of young, white working-class men and women; non-traditional adult students; and returning combat veterans. Thanks to these efforts, community colleges now boast African American, Latino/a, and immigrant enrollment rates that roughly parallel these groups’ representation in American society as a whole.


To support and accommodate this diverse student body, community colleges pioneered everything from innovative course formats to campus social services. Indeed, from summer classes and distance learning programs to campus counseling centers, much of the infrastructure that now supports four-year institutions’ diversity campaigns was first tested in the community college crucible.

And more recently, two-year institutions have added another kind of diversity, too. Confronting a quickly changing economy, many community colleges turned their attention to adult education and workforce retraining. Community college enrollees in the latter half of the 20th century came to include everyone from curious retirees eager to learn a new skill, to victims of mid-career layoffs in need of a new skillset.

Community colleges are not, of course, beyond reproach. Chronically underfunded, they rely even more heavily upon exploitative adjunct labor arrangements than B.A.-granting institutions. And, in their earnest desire to ensure positive career outcomes for students, many schools have risked becoming publicly funded training centers for private concerns.

But thanks to these institutions’ long history of innovation, they are well-positioned to redress many of higher education’s most pressing problems today, including heightened institutional inequality, skyrocketing student debt and waning undergraduate interest in subjects that do not promise financial rewards. In response to these issues, community colleges offer diversity and affordability—and are thus among the last places in America where students can afford to take a class just because they want to.

Once more, then, community colleges may prove the saving grace of college-level learning in America.

The Long View

Historians explain how the past informs the present

Sean Trainor has a Ph.D. in History & Women’s, Gender, and Sexuality Studies from Penn State University. He blogs at seantrainor.org.
