Bubble Trouble for Standardized Testing

A few weeks ago, a new class of nearly 800 students arrived at Wesleyan University to begin their first year. Wesleyan is consistently ranked as one of the nation’s top liberal-arts colleges, and admissions are competitive.

But those who have made it to the school’s leafy campus in Middletown, Conn., look pretty much like their newly minted peers at any other college in the U.S. The interesting difference is one that doesn’t show. Nearly one-third of Wesleyan’s incoming class was admitted without an SAT score. Last year, the school dropped its requirement that applicants submit a standardized-entrance-exam score, and hundreds of would-be Wesleyan students from a wide range of backgrounds took advantage of the option.

In abolishing the mandate, Wesleyan joined a growing cadre of selective schools that includes other recent defectors such as George Washington University and Wake Forest. Today, nearly 200 of the roughly 3,000 degree-granting four-year colleges in the U.S. no longer require the SAT or the ACT, while 600 more have reduced the exams’ role in admissions, according to the antitesting organization FairTest. Those ranks include not only elite and expensive colleges like Bates and Smith but also major state schools like the University of Arizona and regional campuses including Montclair State in New Jersey and Weber State in Utah.

College without the SAT? For generations of American graduates who will never forget the anxiety around that all-important Scholastic Aptitude Test and the quest to achieve the “perfect” score of 1600 (or, since a 2005 update, 2400), the idea might sound hard to believe. Yet even as a new crop of high school students sharpen their No. 2 pencils for the SAT’s first fall sitting on Oct. 3, its relevance–and that of its Midwestern cousin, the ACT–is fading at many schools, including some of the nation’s most selective.

One reason for the change is a belief among some administrators that the tests’ predictive power for college success is overrated. “People make too much of test-score differences,” says William Fitzsimmons, Harvard’s dean of admissions. “People with the very highest test scores coming into Harvard do a little better than those with the lowest test scores, but they don’t do a lot better.”

Yet in a little-noticed shift, even as the tests’ sway over college admissions has waned, they are looming ever larger on the K-12 education landscape. The organizations behind the SAT and the ACT are now locked in a battle for lucrative state-funded contracts to offer the tests to public high school students. These contracts benefit families by having taxpayers fund a college-admission test–but states are buying into the concept because the exams do double duty as assessment tests required of public schools. The ACT, which was originally known as the American College Testing Program, has signed nearly 20 of these contracts in the past 15 years, part of the reason the ACT surpassed the SAT in 2012 as the most popular college-admission test in the U.S.

Now the SAT is fighting back. Under the leadership of its president, David Coleman–an architect of the controversial Common Core standards–the College Board is about to launch a massively overhauled SAT that will compete head-on with the ACT for state contracts. The new test, which arrives in March, has schools, students, parents and test-prep companies all scrambling to anticipate the changes. And it has already won one convert: the state of Michigan, which had been offering the ACT. Starting in 2016, all Michigan high school students will take the SAT as a required state assessment exam.

The implications of these changes are significant for students of all ages. Both the ACT and the SAT have designs on testing kids who are years away from thinking about a college-entrance exam. As part of this strategy, the College Board has begun offering tests to students as young as eighth graders to help them address weaknesses early.

“We need to stop thinking about the SAT as a single transaction,” says Douglas Christiansen, chair of the board of trustees at the College Board. Instead of the tests’ being thought of only as a gateway to college, he envisions them playing a role throughout a student’s “educational journey.”

Beyond the practical impact for students, there are broader issues at stake. The SAT began as an “aptitude” test that allowed Ivy League colleges to discover the potential in students who didn’t attend elite New England prep schools. Reacting to criticisms about bias in aptitude measurement, the College Board swapped “aptitude” for “assessment” in 1994 and abandoned the idea of initials altogether three years later, insisting that SAT doesn’t stand for anything. Now, as the College Board chases the K-12 assessment business, the SAT is morphing into an achievement test similar to the ACT. But colleges are increasingly struggling with how to assess and nurture the creativity and analytical skills graduates need in the 21st century economy. As long as a degree remains the gateway to the best jobs and a bright future, is filling in a bunch of bubbles with a pencil really the best way to size up a student’s future?

If you want to understand the upheaval in college-entrance exams, a good place to begin is with Glenn “Max” McGee, a lanky, silver-haired educator with glasses, a Ph.D. and a friendly smile. When McGee was appointed Illinois school superintendent in 1998, he began seeking a way to help more disadvantaged kids get to college. It was a problem McGee knew well. His wife, a teacher trainer at a local college, had met kids from Chicago’s low-income North Lawndale neighborhood through her work. She started driving them to ACT tests on Saturdays and was struck by how few could afford to pay the fees.

Meanwhile, Illinois’s high school assessment exam wasn’t required for graduation or necessary for college admission, and as a result, many students just stopped taking it. So McGee came up with an idea to fix both problems at once: he would make the ACT part of the state test, and the state would pick up the tab, letting public-school students take a college-admission test for free.

McGee invited the ACT and the College Board to bid on a state contract to administer their tests to every public high school junior in the state. In June 2000, the Illinois state board of education voted to make the ACT part of Illinois’s Prairie State Achievement Exam, at a cost of $19 million over five years, giving students an incentive to take the assessment test. And in the spring of 2001, 130,000 Illinois public-school students took the ACT free of charge.

Fifteen years later, McGee’s idea has had more impact than he could have imagined. Eighteen states–including Colorado, Alabama and Wisconsin–now require or pay for 11th-graders to take the ACT, reflecting a perception that it is a more honest measure than the SAT of what a student knows, with fewer tricks or hidden biases. Since the ACT eclipsed the SAT in 2012, its dominance has grown as more colleges dispel the myth that they have a preference for the SAT. In 2014, 1.8 million American high school students took the ACT, compared with 1.7 million for the SAT. A third of the ACT’s growth in the number of tests taken has come from state contracts.

But a bigger consequence of that Illinois deal was demonstrating that college-admission tests like the ACT and SAT–which had traditionally been seen as gauges of “aptitude” or “intelligence”–could double as K-12 achievement exams that measured students’ mastery of curriculum, a kind of testing that became required by George W. Bush’s No Child Left Behind Act in 2001. The law mandated that students be tested annually in math and reading in grades 3 through 8, and once again in high school. Today, states spend more than $1.7 billion a year on K-12 assessment testing, according to research from the Brookings Institution.

As the market has grown, both the ACT, headquartered in Iowa City, and the New York City–based College Board have moved to capitalize. Both are nonprofit organizations, but that doesn’t mean they aren’t big businesses. IRS filings show that, taken together, the two groups bring in revenue of $1 billion–and top officials are paid upwards of $500,000.

“The marketing reps at the ACT pitched the test as a multifunction test. It was not only good for college admissions, it was a way to measure how your schools were performing,” says Jed Applerouth, a private test-prep tutor in Atlanta. “After No Child Left Behind, states said to themselves, ‘We can’t afford all these tests. Why don’t we use one test for all these purposes–to test students’ achievement as well as get them ready for college? We can kill two birds with one stone.'”

In 2012, the College Board brought in a new president. A Rhodes scholar and former business consultant from McKinsey, David Coleman is the prime example of a type of educator that has emerged in the past decade: a high-achieving product of the corporate world determined to bring entrepreneurial thinking to the education market. Coleman, now 45, was also a driving force in the Obama Administration–backed Common Core standards, which have influenced the content of the standardized tests used by many school districts. If the SAT were going to push into high school achievement testing, he was an obvious choice to lead the charge.

Coleman’s primary challenge was to win back the market share the SAT had lost to the ACT. First he opened an office in Iowa City near the ACT’s headquarters and poached key ACT staff–including Cyndie Schmeiser, now the College Board’s chief of testing. Then he announced that the board would launch a redesigned test in March 2016. The College Board said the new SAT would shed the relics of its IQ-test origins, which made it so conducive to expensive test prep, and become an evaluation that reflects what students are already learning in high school. “It is time for an admissions assessment that makes it clear that the road to success is not last-minute tricks or cramming but the learning students do over years,” Coleman said in a speech in Austin in March 2014.

But to many observers, the overhaul seemed to have a lot more to do with business than with education. Many of the planned changes make the new SAT look more like the ACT. Among them: students will no longer lose a quarter-point for guessing the wrong answer, math questions will incorporate science, and the infamous “SAT words” of an expansive vocabulary will be banished. “In just about everything that was different about the two, they’ve embraced the ACT position,” says John Katzman, founder of the Princeton Review. “A bunch of us have tried to kill the SAT for decades, and I guess congratulations are in order for David Coleman for doing it.”
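For readers curious about the arithmetic behind one of those changes, here is a minimal sketch in Python of how the guessing penalty worked; the function names and numbers are invented for illustration and are not drawn from any official scoring code.

```python
# Hypothetical illustration of the scoring change, not College Board code.
# Old SAT raw scoring: +1 per correct answer, -0.25 per wrong answer, 0 for blanks.
# Redesigned SAT (like the ACT): +1 per correct answer, no penalty for wrong answers.

def old_sat_raw(correct: int, wrong: int) -> float:
    """Raw score under the pre-2016 quarter-point guessing penalty."""
    return correct - 0.25 * wrong

def rights_only_raw(correct: int, wrong: int) -> int:
    """Raw score under rights-only scoring, the approach shared with the ACT."""
    return correct

# A student who blindly guesses on 8 five-choice questions and gets 2 right:
print(old_sat_raw(correct=2, wrong=6))      # 0.5 -- guessing barely paid off
print(rights_only_raw(correct=2, wrong=6))  # 2   -- every lucky guess now counts in full
```

The old quarter-point deduction was calibrated so that blind guessing on a five-choice question gained a student nothing on average; dropping it brings the SAT in line with the ACT’s simpler rights-only approach.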

If there’s an entrance-exam revolution in progress, you wouldn’t guess it from looking at the discussions on CollegeConfidential.com, a favorite website for angst-ridden high school students and their lurking parents. One of the most popular threads is called “What Are My Chances?” Teens go there to post the bare facts about their high school careers–grades, test scores and “ECs” or extracurriculars–and ask their peers to rate their chances of getting into various colleges.

The guesstimated responses are blunt and occasionally brutal. Shortly after 1 a.m. on a Sunday in late September, a user named wolverinexci signed on, asking “Chance me please!” and offering a list of colleges and these particulars: an 1800 SAT score and a 3.278 GPA at a competitive California high school. The first reply was frank: “Your GPA will keep you out of many of the schools on your list,” wrote Gumbymom. After schools considered safe bets were deemed toss-ups by fellow users, wolverinexci responded: “wow i can’t believe i suck that much” and then, in the same thread, pleaded, “Could a couple more people please chance me as well?”

The test-score torment is everywhere in the postings of these anonymous teens. Malcolmx99 has a 1700 on his SAT and hopes to raise his score to 2100 out of a perfect 2400. If he preps for four hours a day over the next three months, he wonders, is that doable? Others describe intense regimens of test-prep courses, tutoring and hours of mock tests. Peculiar0Pencil asks the group, “Are There Standardized Tests That Don’t Matter?” The conventional wisdom from the group is, essentially, no.

But many college officials tell a much different story. Harvard’s Fitzsimmons says it’s rare that an entrance-exam score will reveal something about an applicant that isn’t already apparent from a high school transcript. “There’s always been a very direct relationship between the quality of the schooling that you’ve had and how well you do, not just on SAT but any of the standardized tests,” he says. Indeed, rigor and curriculum were consistent refrains among the many other admissions officials I spoke with. In other words: getting good grades in tough classes is a solid way to impress a college.

“Their grades, their GPA, the strength of the curriculum and the strength of the course work in the field they were interested in is the best predictor” of college success, says Kedra Ishop, an admissions officer at the University of Michigan. “If there were two AP courses available and you didn’t take them, that’s going to be looked at more closely than 20 points on the SAT.” Many colleges rely on a “rubric,” an algorithm that spits out a ranking calculated from GPA, test scores and extra points to represent things like AP courses. The weight of each factor in the rubric depends on the college.
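As a concrete illustration of how such a rubric might combine those factors, here is a minimal sketch; the weights, scales and applicant numbers below are hypothetical, since real formulas vary by college and are generally not published.

```python
# A hypothetical rubric of the kind described above. The weights, scales and
# applicant numbers are invented for illustration; real formulas vary by
# college and are generally not published.

def admissions_index(gpa: float, sat_total: int, ap_courses: int,
                     sat_max: int = 1600, gpa_weight: float = 0.6,
                     test_weight: float = 0.3, ap_weight: float = 0.1) -> float:
    """Collapse GPA, test score and AP course load into a single ranking number (0-100)."""
    gpa_part = gpa / 4.0                # normalize a 4.0-scale GPA to 0-1
    test_part = sat_total / sat_max     # normalize the test score to 0-1
    ap_part = min(ap_courses, 10) / 10  # cap AP credit so it can't dominate
    return 100 * (gpa_weight * gpa_part + test_weight * test_part + ap_weight * ap_part)

# Echoing Ishop's point: a modest SAT edge is easily outweighed by grades and course load.
print(admissions_index(gpa=3.9, sat_total=1280, ap_courses=4))  # ~86.5
print(admissions_index(gpa=3.5, sat_total=1300, ap_courses=2))  # ~78.9
```

With grade-heavy weights like these, the hypothetical applicant with the stronger GPA and more AP courses handily outranks the one holding a 20-point SAT edge, which is Ishop’s point.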

But after a deluge of research over the past decade documenting the limits of standardized tests, it’s clear many colleges are digging deeper. “What is a good score depends on your given context,” says Jim Bock, dean of admissions at Swarthmore College. “Did the school you prepped at do a lot of critical reading? Was your score one of the highest in your school district?”

In 2014, William Hiss, a former Bates College admissions dean, and researcher Valerie Franks published results of a study of 123,000 student records from 33 colleges with test-optional admission policies, analyzing the high school GPAs and the graduation rates of the two groups: matriculants who had supplied an entrance-exam score and those who had opted not to. Their conclusion: high school GPA–even at poor high schools with easy curriculums–was better at predicting success in college than any standardized test. Nancy Hargrave Meislahn, the admissions dean at Wesleyan, told me those findings helped persuade the school to go test-optional.

Colleges have taken the data to heart and see less value in the scores, says Kim Reid, a higher-education analyst at research firm Eduventures–although the omnipresent “top college” ranking lists, which often use schools’ average ACT and SAT scores in their calculations, force them to pay some attention. The tests “do matter in ways that they probably shouldn’t,” Reid says. “If you could get rid of [their use in rankings], the colleges and universities would walk away from them even more.”

For Coleman, the vision isn’t less testing. It’s more–a lot more. He aims for the College Board to deliver more value to students, and he has expanded efforts to identify low-income kids and provide them with fee waivers so they can send their results to more colleges for free. In June, the College Board launched a partnership with Khan Academy, the popular online educational-video provider, and the Boys & Girls Clubs of America to offer free test prep to kids who can’t afford it. And Coleman wants more schools to offer the College Board’s SAT-warm-up test, the PSAT, to eighth- and 10th-graders to help identify gaps in students’ education earlier. “The real work,” he says, “is building opportunity for kids.”

Listening to all that fervor, it’s hard not to hear the voice of the Common Core evangelist. Coleman was a key engineer of those benchmarks, which aimed to have states adopt student achievement standards that met national goals. Supporters saw Common Core as a way to ensure that all students were getting a fair shake, but opponents have criticized it for narrowing curriculums and relying too heavily on standardized tests to evaluate learning. As the battle has become increasingly charged, the College Board does not want families to dwell on the connection between Common Core and its current chief. Says Christiansen, the College Board chair, who is also dean of admissions at Vanderbilt University: “I get asked a lot, ‘Did you hire David Coleman because he worked for Common Core?’ I’ll tell you, absolutely not.”

But Common Core has reinforced the demand for K-12 testing, and as a result some sort of collision between Common Core and the SAT–and the ACT–seems inevitable. The strategies of both companies, educators say, put them in direct competition with the creators of Common Core testing. Indeed, Jon Erickson, who joined ACT in 1984 and recently retired as president, says the company always believed that its “test could serve a greater purpose than admissions screening.” That worries some educators, who fear that the presence of the SAT and ACT in the K-12 testing market could stifle innovation.

The exams also affect the things students are asked to learn. In 2015, after 15 years of using the ACT to assess high school students, Illinois dropped the ACT requirement in favor of a new Common Core assessment called PARCC, created by the for-profit education company Pearson. Illinois’s board of education set aside $14 million for districts to let students take the ACT on the state’s dime, in addition to the PARCC test. But the prospect of readying kids for two different tests has left some public-school educators feeling frustrated. Jeff Feucht, an assistant superintendent in Glenbard Township, a large suburban district outside Chicago, says one test is enough, and he would rather prepare students for an exam they can use to get into college. “I think [PARCC] is a better test, and if colleges would accept it for admissions, I’d prefer it to the ACT and the SAT, but I work at a school district. I work for taxpayers. I care about kids getting into college,” he says. “I don’t want to serve two masters.”

Meanwhile, college admissions officials are still searching for better tools to understand applicants. They are increasingly focused on, for lack of a better description, trying to measure what isn’t being measured. Two years ago, the MIT admissions office started soliciting something it calls a “maker portfolio,” a way for talented students to submit videos showing off things they’ve created–from computer software and robots to glow-in-the-dark socks. As Dawn Wendell, a mechanical-engineering lecturer at MIT who worked in the school’s admissions office, said in a recent presentation, “We recognize that you are not fully captured in the numbers, so we are looking at you as a whole person.”

They particularly want more evaluations that they consider open-ended–that is, exercises for which there is no single precise right answer and which can’t be distilled to a multiple-choice question. It’s another reason that students’ achievement on Advanced Placement tests is coveted by admission officers. Those tests are highly open-ended (and more expensive to conduct, since humans must grade the answers). In that regard, the College Board’s decision to take the essay question added to the SAT in 2005 and make it optional in the relaunched test next year is seen by some as a step backward.

“We keep coming back to this multiple-choice testing strategy that was modern technology in 1950,” says Linda Darling-Hammond, faculty director at Stanford University’s Center for Opportunity Policy in Education and co-author of Beyond the Bubble Test: How Performance Assessments Support 21st Century Learning. “But it is antiquated now. It cannot measure higher-order thinking skills.” The College Board, for its part, says its redesigned test will include open-ended short answers on the math portion of the exam.

For a different approach, consider how students are assessed in Singapore–which topped the most recent global school rankings from the Organisation for Economic Co-operation and Development. The government-run test for college-bound students requires them to complete a group project over several weeks that is meant to measure their ability to collaborate, apply knowledge and communicate–all skills both educators and employers say are critical for the future economy.

Among those who believe the U.S. must do better is Max McGee, the onetime Illinois state superintendent who helped trigger the big shift in the ACT/SAT business model. McGee is now in Silicon Valley, running the Palo Alto, Calif., school system, and he says he dreams of a better assessment. “It needs to look like a portfolio students generate over time that reflects their passion, their purpose in life, their sense of wonder, and that demonstrates their resilience and persistence and some intellectual rigor,” he says. He’s reminded of that need by his wife, who still works with the low-income kids in North Lawndale who often lack the resources to take–let alone prepare for–the SAT and ACT.

“When you see them interact, you realize our future is bright,” McGee says, “to the extent that we can provide opportunities to show what they can really do.”

***

The SAT has been synonymous with the college-admission process for generations of Americans. Test your knowledge of how the test has changed over the years, from its creation by college presidents to the newest version, which launches in March.

In what year did 12 university presidents form the College Entrance Examination Board in order to create a uniform college-admission test in essay format?

A. 1920

B. 1905

C. 1900

D. 1890

The first IQ test given to a large group of Americans, the Army Alpha, was administered to identify soldiers qualified to fight in which war?

A. World War II

B. World War I

C. Korean War

D. The Civil War

Carl Brigham, a Princeton psychology professor and eugenics advocate, adapted the Army Alpha into the Scholastic Aptitude Test during which period?

A. 1912–15

B. 1882–85

C. 1965–68

D. 1923–26

When did Harvard president James Bryant Conant first use the SAT to identify gifted Midwestern scholarship students who did not attend Eastern boarding schools?

A. 1912

B. 1874

C. 1965

D. 1934

In what year did the College Board drop “aptitude” from the SAT’s name?

A. 1989

B. 1994

C. 1978

D. 2006

After grading tests by hand, the SAT introduced machine scoring in which year?

A. 1924

B. 1952

C. 1939

D. 1975

When did Stanley “Cram King” Kaplan begin SAT tutoring in his parents’ Brooklyn home?

A. 1946

B. 1942

C. 1965

D. 1958

What was the first year that more than 500,000 students took the test?

A. 1949

B. 1957

C. 1965

D. 1982

When did high schools start sharing test results with their students?

A. 1970

B. 1987

C. 1965

D. 1958

University of Iowa education professor E.F. Lindquist launched the American College Testing Program in which year?

A. 1959

B. 1898

C. 1964

D. 1971

When did the University of California begin requiring the SAT for admission?

A. 1954

B. 1920

C. 1978

D. 1960

When did the College Board start offering fee waivers to low-income families?

A. 1973

B. 1950

C. 1969

D. 1987

In what year did Bowdoin stop requiring the SAT, becoming a pioneer of the test-optional movement?

A. 1930

B. 1969

C. 1975

D. 1990

In what year did the FTC determine that Kaplan’s test prep could raise students’ scores, following an investigation into his marketing?

A. 1965

B. 1958

C. 1979

D. 1986

When did the SAT add a writing section, eliminate vocabulary analogies and change the top score from 1600 to 2400?

A. 1984

B. 1998

C. 2005

D. 2000

In what year did the College Board announce that it would drop penalties for wrong SAT answers, scrap esoteric vocabulary words and make the essay optional?

A. 2014

B. 1999

C. 2004

D. 2012
