It’s an old saw: higher education can never change because, after all, it hasn’t in the 2,500 years since Socrates paced the Academy. There’s only one problem. That’s not true. Higher education has transformed itself before, and it can do so again — especially now.
Between 1860 and 1925, the old Puritan college designed to train ministers underwent massive changes to become the modern American university we know today. Industrialization was the motor. Alongside the movement for compulsory public grade school that trained farmers to be factory workers, a cadre of educators and business titans banded together to redesign higher education to shape an emerging professional-managerial class.
Charles Eliot, Harvard’s longest-serving president, led the charge to create what he dubbed “the new education.” He and his colleagues enlarged the curriculum and then structured knowledge into discrete, specialized majors, minors and disciplines. Academic experts established a new infrastructure of entrance exams, credit hours, grades, bell curves, class rankings, school rankings and accreditation agencies for tabulating, assessing and certifying expertise in each field. They built research universities like Johns Hopkins, the University of Chicago and Stanford; pioneered graduate school; and developed post-baccalaureate professional schools that set the standards and status of new fields.
This system worked for most of the twentieth century. It makes less sense in our post-industrial and post-Internet world, in which the boundaries between work and home are far less distinct and work itself is unstable, even for high earners. How do you train a professional-managerial class at a time of accelerating automation when seemingly any profession could be “Uberized” tomorrow?
That’s the challenge of the “new education” we must invent today. Take the major. Virtually every study of workplace advancement, such as the annual surveys conducted by the National Association of Colleges and Employers, underscores the importance of general skills in communication, collaboration and basic cultural knowledge. CEOs like to complain that the best undergrads come with specialized skills but lack these basics. But it’s no wonder: the system we’ve inherited ranks basics as low-prestige, general-ed requirements typically taught by junior or part-time instructors. Students get these “out of the way” before they dive into majors where rigor is equated with the most required courses in specialized topics. Electives can feel like afterthoughts rather than survival skills.
Some forward-looking universities are beginning to change this. At Arizona State University, mathematician Sha Xin Wei presides over a new School for Arts, Media, and Engineering that combines the basics with cutting-edge, cross-disciplinary thinking that challenges students to think not only about what they learn but how. Students learn what Sha calls “synthesis,” the ability to combine knowledge and insights from everyone and everything, including from constraints. For example, the central question in one basic science course in this new major is: “How do bodies work?” Engineers and pre-med students work with art history students to analyze drawings by Michelangelo or mobiles by Calder to understand biomechanics. They also work with actual stroke patients who are no longer able to gain biofeedback information from their own limbs and who use art, meditation or music to learn to move again. This is at a far remove from Charles Eliot’s emphasis on specialization. Yet the stroke patient has more to teach students about how bodies work than any computer simulation or the dusty skeleton in the back of the bio lab — and far more about succeeding against odds.
Most colleges have not yet undergone such transformations. But any student — or concerned parent — seeking a relevant education can use the principle of synthesis to find a pathway through even the most traditional institution. The principle can be reduced to an adage: Make the major minor.
That means getting the major requirements “out of the way” in order to ensure graduation (a degree still gets one past the screening algorithms for a first job), and then concentrating on the synthetic skills that can help one succeed in everything else. It means treating college like total immersion in a foreign country, exploring courses and extracurricular opportunities beyond one’s normal range of interests and experiences. If a school offers pass-fail options, a student can take classes where they have more interest than skill in order to connect with people unlike themselves, whether Python programmers or spoken-word poets. What makes them tick? There is no better way to hone the ability to work with others who complement rather than mirror one’s own skills.
A student entering college this fall could be in the workforce until 2065 or 2070. No one can predict what the world will look like then, except that it will bear little resemblance to today’s, or reward today’s college major in the same way. Being able to synthesize knowledge is a life-changing lesson that one can fall back on whenever life changes. And — we can predict — change it will.