MONEY Best colleges

Get College Money by Playing Video Games

This university offers video game scholarships to some of its students. Really.

Chicago’s Robert Morris University is the first college to offer video gaming scholarships for students. The program is part of the school’s athletic department, and students are required to attend practices and competitions just like any other student athlete. The team plays League of Legends, a popular computer game in which players work in teams of five to destroy the opposing side’s home base. School officials say rather than distracting from the academic curriculum, students’ involvement with the e-gaming team helps students build teamwork and communication skills.

See the entire list of MONEY’s Best Colleges here.


How MONEY Ranked Colleges: An In-Depth Look at Our Methodology

College is now the second-largest financial expenditure for many families, exceeded only by buying a home. So it isn’t surprising that parents and students are taking a harder look at the costs and payoffs of any college they consider.

To help, MONEY has drawn on the research and advice of dozens of the nation’s top experts on education quality, financing, and value to develop a new, uniquely practical analysis of more than 700 of the nation’s best-performing colleges.

MONEY’s Best Colleges for Your Money rankings are the first to combine the most accurate pricing estimates available with students’ likely earnings after graduation and a unique analysis of how much “value” a college adds.

We estimate a college’s “value add” by calculating its performance on important measures such as graduation rates, student loan default rates, and post-graduation earnings, after adjusting for the types of students it admits. We believe this analysis gives students and parents a much better indication of which colleges will provide real value for their tuition dollars.

We developed our ratings in partnership with one of the nation’s leading experts on higher education data and accountability metrics, Mark Schneider. The former commissioner of the National Center for Education Statistics, he is currently a vice president at the American Institutes for Research (AIR) and president of College Measures, a for-profit partnership of AIR and Optimity Advisors, which collects and publishes public data comparing a student’s educational record and later earnings.

The final methodology decisions were made by the MONEY editorial team, in consultation with Schneider and College Measures.

In building our rankings, MONEY focused on the three basic factors that surveys show are the most important to parents and students:

  • Quality of education
  • Affordability
  • Outcomes

Because these three factors are so interrelated and crucial to families, we gave them equal weights in our ranking.

In each of these three major categories, we consulted with our advisers to identify the most reliable and useful data to assess a school’s performance. We also balanced the basic data in each category with at least one “value-added” measure.

To gauge how well a college is performing relative to its peers, we gathered federal and college data on the average test scores and grade point averages of students at each college, the percentage of each graduating class receiving degrees in each major, and the percentage of students with incomes low enough to qualify for Pell Grants (about 90% of which go to families with incomes below $50,000). We then used the statistical technique of regression analysis to determine the impact of a student’s test scores, economic background, and college major on key factors, such as graduation rates and future earnings. That enables us to see how much better or worse a particular college performed than would be expected given the characteristics of its student body.

We used this estimate of relative performance by the college as an important part of the ranking, as you’ll see below.
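The regression-and-residual approach described above can be sketched in a few lines. This is an illustrative toy with a single predictor and invented numbers; MONEY's actual model uses several student-body characteristics at once.

```python
# Illustrative value-added calculation: regress an outcome (six-year
# graduation rate) on one student-body trait (average test score), then
# treat each school's residual as its "value add." All data are invented.
scores = [1200, 1050, 1400, 980]        # average admitted-student test scores
grad_rates = [0.78, 0.55, 0.93, 0.50]   # observed six-year graduation rates

n = len(scores)
mean_x = sum(scores) / n
mean_y = sum(grad_rates) / n

# ordinary least-squares slope and intercept
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(scores, grad_rates))
         / sum((x - mean_x) ** 2 for x in scores))
intercept = mean_y - slope * mean_x

# value add = how far each school's actual rate sits above its predicted rate
predicted = [intercept + slope * x for x in scores]
value_add = [actual - pred for actual, pred in zip(grad_rates, predicted)]
```

A positive residual means the school graduates more students than its mix of admitted students would predict; a negative one means it underperforms its expected rate.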

MONEY assigned each indicator a weight based on our analysis of the importance of the factor to families, the reliability of the data, and our view of the value of the information the data provided.

To avoid overloading readers with too many choices or too much data, and to ensure that we fairly compared apples with apples, we decided to analyze only those colleges that, in our view, passed these minimal quality and data tests. We decided that to be included in our rankings, a college must meet these four criteria:

  1. Be a public or not-for-profit four-year college or university.
  2. Have enough cost, quality, and outcomes data available to provide at least moderate confidence in our assessment.
  3. Not be in financial trouble, as indicated by a below-investment-grade rating for its bonds by Moody’s, or by inclusion on the U.S. Department of Education’s list of schools under the strictest level of “Heightened Cash Monitoring” because of indications of low “financial responsibility.”
  4. Have a graduation rate at or above the median for its type of school (public or private), or if the rate is below the median, have a graduation rate at least 25% above what would be expected given the incomes and test scores of its students.

This eliminated some colleges that may be good values, but might be facing temporary financial difficulties or have too few alumni reporting their incomes to PayScale, our salary data source, for us to evaluate them. But it left us with a robust universe of more than 700 colleges. In our view, even the lowest-ranked of the schools on our list demonstrate that they provide at least some value for your tuition dollars.

We then used the following data and methodologies in our three basic categories to create our rankings:

QUALITY OF EDUCATION: 33.3% weighting

For this factor, we used six indicators that provide meaningful information about the quality of a school’s instruction, weighted as shown:

(1) Graduation rates: 35%. Education experts and college officials generally agree that one of the most important reflections of a college’s quality is its graduation rate. (The American Association of State Colleges and Universities calls it “a legitimate indicator” of college quality.) Many rankings use this commonly cited federal statistic on the percentage of freshmen who graduate within six years. Because of its importance and wide acceptance, we assigned this measure a comparatively heavy weight.

(2) Value-added graduation rate: 35%. Many education experts and college officials point out that the basic graduation rate, while useful, is an insufficient indicator of a college’s value because research shows that wealthier students and students who got good grades in high school are more likely to finish whatever college they attend. So elite, expensive schools such as, say, Harvard, would be expected to have high graduation rates. For that reason, we also calculated each school’s relative performance after accounting for the economic background and academic preparation of its students. The higher a school’s graduation rate was above the rate that would be predicted for a school with that particular mix of students, the more value that college is assumed to have added. This “value-added” graduation rate analysis is widely accepted. (A 2013 OECD paper notes that such “value-added measurement provides a ‘fairer’ estimate of the contribution of educational institutions.”) Because of its reliability and acceptance, we weighted this factor heavily.

(3 & 4) Peer quality: 15%. Decades of research have shown that undergraduates have a major impact on their peers. Students who room with better students get better grades, for example. By contrast, students surrounded by less conscientious peers—for example, heavy drinkers—study less and get worse grades. And students who room or socialize with more successful students tend to get better jobs upon graduation.

Our peer quality measure consists of these two indicators:

  • Academic preparation of students (10%). We gathered the federal data on accepted students’ high school grade point averages and their scores on the ACT and SAT. We analyzed the overall relationship between test scores and grade point averages for all colleges that reported both data points, and then used those averages to fill in an estimated test score for schools that don’t make their students’ test scores public. This is an imperfect measure, and there is controversy over the usefulness and validity of standardized tests to begin with. However, the SAT and ACT tests currently provide the only nationally comparable data on student abilities. In addition, many studies have found a high correlation between test scores and academic success.
  • Yield (5%). The federally reported “yield” is the percentage of accepted students who enroll in a given college. The higher the yield, the more likely it is that the school was the student’s first choice, or best option, and that applicants perceive the college’s quality as high.

(5 & 6) Faculty quality: 15%. Research shows that students who get more individual attention from faculty tend to achieve more, both in college and after graduation. (See, for example, the recent Gallup-Purdue Index.) While there are no nationally comparable data on the number and quality of student interactions with faculty, we believe these two data points are useful indicators:

  • Student-faculty ratio (10%). This is a standard, federally published metric used by many ranking organizations.
  • Quality of professors (5%). MONEY asked RateMyProfessors.com, the nation’s largest independent source of student ratings of professors, to calculate the average overall rating for all professors at each school for helpfulness, clarity, and quality. We did not include students’ ratings of the professors’ “hotness” or “easiness,” which are also collected by the site. Although research has found that students do tend to give higher marks to easier professors, independent investigators have also found that RateMyProfessors quality ratings are generally reliable and provide students “with useful information about quality of instruction.”

Change from our 2014 rankings: For 2015, we shifted a small amount of weight from the other measures to graduation rates because of the declining reliability of data such as test scores and GPA. As more colleges go “test optional,” for example, only students with high test scores will submit them, making a school’s average test score appear artificially high. Graduation rates, on the other hand, reflect many aspects of a school’s quality, since students who are unhappy with their experience will vote with their feet and drop or transfer out.

AFFORDABILITY: 33.3% weighting

For this factor we also used six indicators, weighted as shown:

(1) Net Price of a degree: 30%. MONEY has developed a unique and, many experts tell us, more accurate estimate of college prices. We started with the “sticker” price provided by the college to the federal government. The full cost of attendance includes tuition, fees, room, board, books, travel, and miscellaneous costs. For public colleges, we used the in-state tuition and fees.

We then subtracted the average amount of institutional aid provided per student by the college, including need-based grants, merit aid, and athletic scholarships. (The aid data are provided by colleges on the Common Data Set, which we accessed through Peterson’s.) That gave us an estimate of the net price the college charged an average student for the most recent academic year.

Next, we used federal data on the percentage of students who graduate from that college in four, five, and six years to calculate an average time to degree, which for the vast majority of schools is now more than 4 years.

Judith Scott-Clayton of Columbia University, for example, has found that the average college graduate pays for 4.5 years of college, not just 4. However, because MONEY’s 2015 ranking includes only those schools with the best graduation rates, the average time to degree for all of the schools on our list is just 4.3 years.

We multiplied the net price of a single year by the average number of years it typically takes students to finish at that school (which ranges from four to six years, depending on the school) and added in an inflation factor (since tuition prices typically rise every year) to estimate the total net price of a degree for freshmen entering that school in the fall of 2015.
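Put as arithmetic, the total net price is the one-year net price summed over the average time to degree, with each later year inflated. A minimal sketch of that calculation; the inflation rate and dollar figures are illustrative, not MONEY's actual inputs:

```python
def degree_net_price(sticker_price, institutional_aid, avg_years, inflation=0.03):
    """Estimate the total net price of a degree for an entering freshman.

    The average time to degree may be fractional (e.g., 4.3 years);
    the final partial year is charged proportionally.
    """
    yearly_net = sticker_price - institutional_aid
    total, year, remaining = 0.0, 0, avg_years
    while remaining > 0:
        fraction = min(1.0, remaining)                  # partial final year
        total += fraction * yearly_net * (1 + inflation) ** year
        year += 1
        remaining -= 1
    return total
```

For example, a $30,000 sticker price with $10,000 in average institutional aid, over a 4.3-year average time to degree at zero inflation, comes to $86,000 rather than the $80,000 a four-year assumption would suggest.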

College counselors and financial aid experts have told us that the MONEY calculation is more realistic and helpful to parents and students than other popular price estimates, most of which use the cost of a single year, not the full degree. Sandy Baum, a senior fellow at the Urban Institute and one of the nation’s leading researchers on college costs and aid, says our estimates of the net prices of individual institutions “provide good benchmarks.” She added an important caveat, however: “Students at each institution face a wide range of net prices, so no individual student should assume that the schools on this list with highest net prices would end up being most expensive for them.”

The federal net price estimate for a year’s costs is lower than MONEY’s estimate because the Department of Education subtracts almost all grants—federal, state and institutional—while we only subtract aid provided by the school, for reasons described below.

MONEY gives the net price sub-factor a very heavy weighting because surveys show that the cost of college is now one of families’ biggest worries. A 2013 Harvard Institute of Politics survey reported, for example, that 70% of young adults say finances were a major factor in their college decision.

(2 & 3) Educational debt: 30%. Surveys show that debt is another of families’ biggest worries. We weighted this factor equally with net price because of those concerns and because we believe it is an indicator of how fairly colleges distribute their grants and scholarships. Schools with comparatively low net prices but high borrowing are likely giving grant aid to wealthier students while shorting the packages of needier students, thus forcing them to borrow more. The higher the debt factor, the lower the school ranked.

Our educational debt assessment is based on these two indicators:

  • Student borrowing (20%). The federal government reports the percentage of freshmen who borrow, as well as their total average amount of federal and private student loans. We combined these data to estimate the average debt per student. We then multiplied that number by the average number of years it takes students at that college to earn a bachelor’s degree. The result is our estimate of total indebtedness for graduating seniors.
  • Parent borrowing (10%). The federal government reports the total amount of parent PLUS loans awarded to parents at each college each year. We divided this number by the school’s enrollment to calculate an average parent PLUS debt per student. While other organizations generally don’t include parental debt in their rankings, MONEY believes parent educational borrowing is a financial burden, and should be an important consideration.
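Both debt indicators reduce to simple arithmetic on the federally reported figures. A sketch, with invented example inputs:

```python
def graduating_debt(pct_borrowing, avg_annual_loan, avg_years_to_degree):
    # average freshman borrowing per year, scaled over the time to degree,
    # gives an estimate of total indebtedness at graduation
    return pct_borrowing * avg_annual_loan * avg_years_to_degree

def plus_debt_per_student(total_plus_loans, enrollment):
    # total parent PLUS dollars spread across the whole student body
    return total_plus_loans / enrollment
```

So a school where 60% of freshmen borrow an average of $7,000 a year, with a 4.3-year average time to degree, would show roughly $18,060 in estimated debt per graduating senior.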

(4 & 5) Ability to repay: 30%. The ability to repay loans taken out to finance a college education is another indication of a school’s affordability for its students.

We evaluated ability to repay using these two indicators:

  • Student loan default risk index (15%). Each year, the federal government publishes the number of former students who left college three years ago and have since defaulted on their federal student loans. Using a methodology proposed by The Institute for College Access and Success (TICAS), MONEY adjusts these numbers for the share of students at the college who take out federal student loans. TICAS says this is a fairer and more accurate indicator of the odds that any particular student at the college will end up defaulting on a student loan.
  • Student loan default risk index value-added (15%). We calculated the relationship between schools’ SLDRI and average student test scores and the percentage of the student body from low-income households. Schools that had a lower default rate than would be expected given their student population were ranked more highly.
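One plausible reading of the TICAS-style adjustment described above is to scale the federal cohort default rate (which counts defaults only among borrowers) by the share of students who borrow at all, yielding the risk that an arbitrary student ends up defaulting. A sketch under that assumption:

```python
def default_risk_index(cohort_default_rate, pct_students_borrowing):
    """Estimated odds that an arbitrary student (borrower or not) defaults.

    Assumption: the adjustment multiplies the published default rate,
    measured among borrowers only, by the share of students who take
    out federal loans.
    """
    return cohort_default_rate * pct_students_borrowing
```

Under this reading, a school where 10% of borrowers default but only half of students borrow gets an index of 5%, while a school with the same default rate and universal borrowing gets 10%.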

(6) Affordability for low- and moderate-income families: 10%. This year, MONEY added a new affordability factor to the rankings, using data colleges report to the federal government on the average net price paid in the 2012-13 academic year by students in families with annual incomes of $48,000 or less. We did this to reflect how affordable the college is for disadvantaged students and working-class families (the median family income in the U.S. in 2013 was $52,000). Another reason for focusing on this income group is that its data is far more reliable than the government’s net price data for higher income groups. The government’s net price data is reported only for students who receive federal aid, which means the data covers almost all students with incomes below $50,000, but only about half of students from families in the highest quartile.

Changes from our 2014 rankings: To make room for the new affordability indicator, we reduced the weighting of our net price indicator. In addition, because PLUS loans are an imperfect indicator of how much parents may be borrowing to fund their children’s tuition (there are no data on how much parents of students at each college are borrowing from private lenders, or against their retirement plans or home equity, for example), we slightly reduced our weight on the PLUS indicator and shifted it to the default indicator.

OUTCOMES: 33.3% weighting

For our third category, we used a total of nine indicators, weighted as shown below.

(One data point we could not use: the percentage of students who find jobs within a year of graduation. Although many colleges claim that a high percentage of their new grads are employed, the data are generally not considered reliable enough to compare one college against another. That’s because each college has a different method of surveying graduates, and unemployed grads may be less likely to answer such surveys, which would tend to make a college’s employment rate look better than it really is.)

(1 & 2) College career services: 15%. We used one quantitative and one qualitative indicator to assess the value of a college’s career services:

  • Caseload of career services staffers (10%). Surveys show that the single most important reason students now give for attending college is to increase their odds of landing a good job and launching a career. Unfortunately, most colleges don’t provide much career coaching or job assistance. At the typical four-year college, there is only one professional career services staffer for every 1,000 undergraduates, which equates to an average availability of just one hour of personal attention per undergraduate per year. While we couldn’t get reliable data on the quality of the career services that schools provide, we could at least tell parents how overworked the staff is. So MONEY included in its ranking the number of full-time equivalent staffers in each college’s career services office as reported to Peterson’s. (We called several hundred schools that hadn’t reported the data to Peterson’s to fill in the missing numbers.) We then calculated a caseload per staffer, based on each school’s enrollment. The lower the caseload, the higher the college is ranked.
  • Formal programs linking alumni with job-seeking students (5%). Personal referrals provide a huge advantage in the job market. (A 2010 Federal Reserve study found, for example, that job applicants who received a personal referral from an employee were more likely to get interviews, get hired, and be offered higher starting salaries.) Colleges that help students make connections with their working alumni provide a valuable service.

(3, 4 & 5) Post-graduation earning power: 35%. We used three indicators to assess this factor:

  • Early-career earnings (20%). We considered the salaries reported by graduates of the class of 2009 and later. The typical person in this group has two years of work experience. PayScale provided MONEY with aggregated data on more than 1.4 million people who in the last three years have filled out an online form that asks what their job is, what their salary is, where they went to college, what they majored in, and when they graduated, the largest such database available. Matthew Chingos, a Brookings Institution economist who has studied the PayScale data, notes that it is currently “the only game in town” for anyone who wants to compare colleges’ outcomes nationwide. (Find more details on PayScale’s data here.) We did not consider the earnings of anyone with a graduate degree, because there is no way to isolate the effect of the undergraduate degree on the eventual earnings of, say, a lawyer or doctor.
  • Mid-career earnings (5%). Here we used the average earnings reported by those who graduated before 2004. The typical worker in this group has 15 years of work experience. We weighted early earnings more heavily than mid-career earnings because Trey Miller, a RAND Corporation economist who has studied the relationship between college choice and earnings, noted that a college choice has a much stronger impact on the type and pay of a first job after graduation than it does on job type and pay for a person who has been in the workforce for, say, 10 years. By then, experience and skills acquired since graduation will also play an important role.
  • Estimated market value of alumni skills (10%). In April 2015, the Brookings Institution published an analysis of new data shedding light on the value added by each college. One of Brookings’ indicators was an estimate of the market value of the 25 most commonly cited skills listed by alumni of each college in their LinkedIn profiles. Jonathan Rothwell, the author of the Brookings study, said this new measure is based on millions of LinkedIn profiles and is a new and potentially better way to discover earning potential. In addition, the data is for all graduates, including those who have earned additional degrees, so it captures the earnings of some schools’ higher earners.

(6, 7, 8 & 9) Earnings value-add: 50%. We used four numbers to calculate how much more or less the graduates of each school are earning, compared with graduates of similar colleges.

  • Early-career PayScale earnings after adjusting for majors (20%). In other analyses of which colleges seem to produce the highest earners, engineering and tech-oriented colleges dominate because computer scientists and engineers tend to earn very high salaries. But what if your child isn’t fated to be an engineer? Which school produces the highest earners for other majors? MONEY used a regression analysis to estimate the average impact on reported earnings of each of three “buckets” of majors: Business, STEM (science, technology, engineering, and math) and “everything else,” which is mostly made up of humanities, education, visual and performing arts, and behavioral and social sciences. We calculated the average earnings for each group of majors. Then, using federal data on the number of graduates in each major at a college, we calculated what the expected earnings would be for that school if each student earned an average salary for his or her field of study. Colleges where graduates earn more than the predicted salary are ranked more highly.
  • Mid-career PayScale earnings after adjusting for majors (5%). We weighed mid-career earnings less heavily because, as previously noted, where a person went to school generally has a greater impact on salary earlier in a career.
  • Value-added early-career PayScale earnings (20%). We also conducted the same kind of “value-added” analysis we did in the educational quality and affordability categories. To find colleges that accomplish what education is supposed to do—help hardworking students from any background get ahead—we calculated the impact of test scores and of coming from a low-income family on graduates’ earnings. Then, using federal data on the test scores and economic background of each school’s student body, we calculated what the expected earnings would be for each school. Colleges where graduates earned more than would be predicted, given their student body, are ranked more highly.
  • Value-added mid-career PayScale earnings (5%).
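The majors adjustment described in the bullets above amounts to a weighted expected salary for each school's mix of majors. A sketch; the bucket averages here are invented placeholders, not MONEY's regression estimates:

```python
# invented average salaries for MONEY's three buckets of majors
AVG_SALARY = {"business": 55_000, "stem": 65_000, "other": 45_000}

def expected_earnings(major_shares):
    # expected average pay if every graduate earned the mean for their field;
    # major_shares maps each bucket to its share of the graduating class
    return sum(share * AVG_SALARY[bucket]
               for bucket, share in major_shares.items())

def earnings_value_add(actual_avg_earnings, major_shares):
    # positive => graduates out-earn what their mix of majors predicts
    return actual_avg_earnings - expected_earnings(major_shares)
```

A school that is 20% business, 30% STEM, and 50% everything else would have expected earnings of $53,000 under these placeholder averages, so actual average earnings of $56,000 would be a $3,000 value add.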

Changes from our 2014 rankings: To make room for the new indicators, we slightly reduced the weighting we had given to mid-career earnings, because, as cited above, a college typically has the largest impact on early-career earnings.

How we calculated these rankings: For each of our data points, for each college, we calculated a “Z-score”—a statistical technique that turns lots of seemingly different kinds of data into a standard set of numbers that are easily comparable. Each Z-score tells you how far any particular number—such as, say, a graduation rate of 75%—is from the average for the entire group under consideration. We added up each school’s Z-score for each of the factors we used, and normalized that score into an easily understandable five-point scale. We then ranked the schools according to their total score.
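In code, the Z-score step and the five-point rescaling might look like the sketch below. The linear min-max rescaling is an assumption; the article does not spell out exactly how the summed scores were normalized.

```python
from statistics import mean, pstdev

def z_scores(values):
    # how many standard deviations each value sits from the group average
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma for v in values]

def to_five_point(totals, lo=1.0, hi=5.0):
    # assumed linear rescaling of summed Z-scores onto a 1-to-5 scale
    t_min, t_max = min(totals), max(totals)
    return [lo + (t - t_min) * (hi - lo) / (t_max - t_min) for t in totals]

# each school's total is the sum of its per-indicator Z-scores
grad_rate_z = z_scores([0.75, 0.60, 0.90])
net_price_z = z_scores([-120_000, -90_000, -150_000])  # negated: cheaper is better
totals = [a + b for a, b in zip(grad_rate_z, net_price_z)]
final_scores = to_five_point(totals)
```

Note the sign flip on net price: for indicators where lower is better, the value is negated before scoring so that a higher Z-score always means better performance.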


While we believe our rankings are better than any others out there, we know they are not perfect. Here are some caveats that we are aware of, and hope to address in future rankings.

Data on student learning. What students actually learn is a big unanswered question in higher education. Very few colleges try to measure learning, and very few of the ones that do release the results publicly. In addition, we were not able to find good data on basic indicators of academic rigor, such as the number of pages of writing or reading required per assignment. We will continue to explore the data in hopes of finding useful indicators of student learning.

Geographical adjustment of wages. Some colleges in low-wage areas, such as parts of the South and Midwest, get lower rankings because we have not adjusted the PayScale data for cost of living. Wages are higher in New York, for example, because rents and other living costs are much higher, but that doesn’t mean the graduate’s lifestyle is better. We will consider making geographic adjustments of earnings data in the future.

Alumni satisfaction. The information that’s currently available on alumni satisfaction—based on surveys and donation rates—is incomplete for many of the colleges on our list, so we were unable to include it as a measure. We are looking for ways to improve the alumni data and make it part of future rankings.

Out-of-state-public college tuition. Many students are interested in attending public colleges out of state. But public colleges charge higher tuition to out-of-state students. We will consider developing a cost and value comparison for out-of-state students.

Graduate earnings. The PayScale data is admittedly imperfect. It does not reflect unemployed or part-time workers, which means that the numbers we see may be skewed higher than the real average for all graduates. Offsetting that, at least in part: We are using data only on those who stopped their education at a bachelor’s degree, thus excluding high earners such as doctors, lawyers, and MBAs.

Net prices. MONEY’s estimated net price is likely to be higher than the average price actually paid by most families. It is crucial to understand that while the MONEY net price estimate is the average price charged by the college, you and your family will pay less than that if your student receives any federal, state, or private scholarships. As an analogy, if you’re buying a can of soup, you have to pay what the grocery store charges, unless you have a coupon. Just as coupons can be used at competing supermarkets, most federal, state, and private scholarships can be used at many competing colleges. So we help you identify which college has the lowest net price at which you can apply any additional scholarships.

In addition, our net price is based on the average student’s time-to-degree. Your student may finish in four years. And while many students take more than four years to finish a degree, they aren’t necessarily paying full tuition for the five or six years before they graduate, since they may, for example, take a year off to work. MONEY attempted to account for this by adjusting the estimated time to degree for all schools with large and established co-op work programs, such as Northeastern University.

Finally, MONEY is not adding to the cost of a degree any amount for “opportunity cost,” which is the earnings a student loses by not finishing a degree on time and delaying entry into the higher-paying job market of college graduates. So, while we may, in some cases, be overestimating the price of a degree, we are also underestimating the total economic expense to a student of schools that don’t speed them to graduation.


David Morton and Gareth Harper of College Measures collected and analyzed our data.

Katie Bardaro, the lead economist for PayScale, produced unique earnings reports and analyses for MONEY, and advised us on ways to account for the impact of majors on earnings.

Michael DeLeon, a graduate student at Teachers College, Columbia University, served as Money’s research assistant, analyzing data and conducting statistical tests.

Among the many experts who volunteered advice and suggestions:

  • Anthony Carnevale, director and research professor, Georgetown University Center on Education and the Workforce.
  • Trey Miller, associate economist, RAND Corporation
  • Jennifer Lewis Priestley, statistics professor, Kennesaw State University
  • Matt Reed, program director, The Institute for College Access and Success
  • Joseph Yeado, higher education research and policy analyst, The Education Trust

Among the dozens of experts we interviewed or consulted:

  • Beth Akers and Matthew Chingos, Brookings Institution education researchers
  • Robert Archibald, economics professor, William & Mary
  • Jeffrey Arnett, research psychology professor, Clark University
  • Sandy Baum, senior fellow, Urban Institute
  • Douglas Bennett, former president, Earlham College
  • Charles Blaich, director, Center of Inquiries at Wabash College. Also at the Center, Kathleen Wise, assistant director.
  • Brandon Busteed, executive director, Gallup Education
  • Corbin Martin Campbell, associate professor, Columbia University’s Teachers College
  • Jesse Cunha, assistant professor, Graduate School of Business and Public Policy Naval Postgraduate School
  • John Curtis, director of research and public policy, American Association of University Professors
  • William Destler, president, Rochester Institute of Technology
  • Marilyn Emerson, past president, Independent Educational Consultants Association
  • Greg Fenves, provost, University of Texas
  • David Figlio, director, Northwestern University Institute for Policy Research
  • Douglas Harris, associate economics professor, Tulane University
  • Terry Hartle, associate vice president of the American Council on Education. Also at ACE: senior vice president Dan Madzelan
  • David Hawkins, director of public policy and research, National Association for College Admission Counseling. Also at NACAC: president Kay Murphy and president-elect Jeffrey Fuller
  • Kerry Healy, president, Babson College
  • Lisa Heffernan, author,
  • Bradon Hosch, assistant vice president for institutional research, planning & effectiveness, Stony Brook University
  • Mark Kantrowitz, publisher,
  • Michael McPherson, president, Spencer Foundation
  • Ben Miller, senior policy analyst, New America Foundation
  • National College Advising Corps (several staffers)
  • Josipa Roksa, associate director, Center for Advanced Study of Teaching and Learning in Higher Education, University of Virginia
  • Joyce Serido, research professor, University of Arizona
  • Michael Viollt, president, Robert Morris University Illinois
  • Carl Wieman, professor in the physics and education departments at Stanford University, Nobel Prize-winner, and former associate director for science at the White House Office of Science and Technology Policy


TIME White House

Education Department Dials Back Plan to Rate Colleges

Getty Images

The Department of Education announced this week that it’s backing off its ambitious and controversial plan to rate all of the nation’s colleges and universities, marking a win for institutions and the vast higher education lobby that vehemently opposed the idea.

Administration officials promised nearly two years ago that they would roll out a new federal ratings plan, the Post-Secondary Institution Rating System (PIRS), to help push students toward high-quality schools that would give them the best return on their money. President Obama also suggested that the system could eventually be used as a tool to hold institutions accountable, by tying federal financial aid to institutions’ ratings.

The Education Department announced yesterday that it would instead release a different, significantly less ambitious “ratings tool” that will simply provide information about all of the more than 7,500 colleges and universities in the country, so students can “reach their own conclusions about a college’s value.” The new tool will not explicitly rate the institutions based on any measures of quality nor tie federal aid to a school’s performance. (The announcement prompted a cheeky discussion on Twitter about how, exactly, that could be called a “ratings tool” at all.)

Administration officials insisted that the Education Department’s decision to back off on the ratings system did not mark a significant policy shift: the original rating plan was designed primarily as a consumer-facing tool, to help students make informed decisions; the new tool will play precisely that role.

Still, many advocates were disappointed. Ben Miller, the senior director of post-secondary education at the Center for American Progress, said the move was “a decent step back from putting colleges on notice.”

“The problem I have is that anyone can create a consumer tool” that provides information about schools to students, he said. The Education Department’s College Scorecard and the National Center for Education Statistics’ College Navigator already do some of that.

“What the Education Department does have is an accountability role over every college and university in the country,” he said. “So that’s my disappointment. I wish it would use that unique role more and not do something anyone can do.”

Rachel Fishman, a policy analyst with New America’s Education Policy Program, saw the Education Department’s reversal this week as a “major win for institutions,” which, along with the higher education lobby and a coalition of mostly Republican lawmakers, opposed the ratings plan from the start. They argued that it was little more than a government-led effort to publicly shame certain schools on the basis of incomplete federal data and biased formulas that would reward schools for doing things like, say, admitting high percentages of low-income students.

The higher education lobby argued that PIRS, which was never completed, would be inherently unfair, “since it would be based on incomplete federal data on student achievement,” Fishman said. “They’re right that there’s incomplete data, but the reason for that is because the higher education lobby fought for a ban on that data,” she said. (The government’s ability to collect student records is currently very limited.)

Andrew Kelly, the director of the Center on Higher Education Reform at the American Enterprise Institute, saw the Education Department’s reversal on its rating plan as an indictment of the plan itself. “It’s easy to chalk this up to the higher education lobby’s power, but that implicitly suggests that the policy itself was sound and was the right way to go, and I think that’s not correct,” he said. “I think the notion of the federal government rating colleges wasn’t particularly appropriate in the first place. Where they wound up is probably where they should have started.”

Administration officials argued that it hasn’t dropped the ball on holding institutions accountable; it’s just using other tools. For example, on Tuesday this week, a federal court judge threw out a lawsuit brought by for-profit colleges that attempted to overturn the federal government’s new “gainful employment rules,” which will require for-profit and a very limited number of other colleges to meet certain benchmarks of quality—like whether alumni get jobs that pay them well enough to repay their loans—in order to receive federal financial aid. The new rule are now set to go into effect next week.

MONEY colleges

Department of Education Backs Away from Plan to Rate Colleges

stacks of unlabeled books
Grant Faint—Getty Images

The Obama administration's plan to officially rate colleges appears to be cancelled.

The U.S. Department of Education appears to have scrapped plans to create a college rating system, according to an announcement on the agency’s official website.

In a blog posting Thursday, Deputy Under Secretary for Postsecondary Education Jamienne Studley said the department would be releasing a tool this summer that will “provide students with more data than ever before to compare college costs and outcomes” in order to “help students to reach their own conclusions about a college’s value.” The new system will still provide information on colleges, but refrain from assigning a ranking.

That policy differs starkly from the president’s original plan, announced in August of 2013, to develop a rating system for colleges and tie federal financial aid to each institution’s performance. The ratings would have been based on factors like better access for lower-income students, affordability, and outcomes such as graduation rate and graduate earnings.

However, the ratings initiative met stiff opposition from educators who accused the administration of embracing a one-size-fits-all approach.

“Applying a sledgehammer to the whole system isn’t going to work,” Robert G. Templin Jr., president of Northern Virginia Community College, told the New York Times last year. “They think their vision of higher education is the only one.”

While some officials initially claimed the creation of a college ratings system would be a relatively simple endeavor—Deputy Under Secretary Studley previously compared rating colleges to “rating a blender”—the department seems to have come around to Templin’s position.

“Through our research and our conversations with the field, we have found that the needs of students are very diverse and the criteria they use to choose a college vary widely,” wrote Studley on Thursday. “By providing a wealth of data–including many important metrics that have not been published before–students and families can make informed comparisons and choices based on the criteria most important to them.”

In an interview with the Chronicle of Higher Education, Under Secretary of Education Ted Mitchell did not explicitly say the ratings plan had been cancelled, but admitted the department would “be focusing on the consumer-focused tool for this year’s project.” The ratings system was originally scheduled for release before the 2015 school year.

Even though the government has pulled back from providing official college ratings, the new system will be built to assist third parties in creating their own rankings. Studley’s blog post included a promise to “provide open data to researchers, institutions and the higher education community to help others benchmark institutional performance.”

Last summer, MONEY debuted its own “Best Colleges” rankings. Those rankings used a number of metrics to measure a college’s quality, affordability, and post-graduation outcomes, and then composed an overall ranking based on those features.

TIME India

You Now Need to Score 99% to Study English at One of India’s Top Colleges

Admissions at Delhi University
Ramesh Sharma—India Today Group/Getty Images Students arriving at Delhi University to fill in their admission forms on June 9, 2015

It's literally only accepting the top 1%

St. Stephen’s College in New Delhi has long been one of India’s premier and most sought-after educational institutions, counting notable personalities from fields as diverse as politics (including the former Presidents of three different countries), science, business, writing and acting among its alumni.

The elite college has always been notoriously difficult to get into, but took its entry criteria to near-farcical proportions this year by setting a cutoff of 99% for students applying for its English honors course from the commerce stream.

Students applying from the science fields have it slightly better, needing to score above 97.75% to be eligible, while humanities students can get in with 97.5%, the Indian Express newspaper reported. The Economics honors course comes in a close second, with the bar set at 98.5%, 97.5% and 97% for commerce, science and humanities students, respectively.

And that’s just the beginning — successful students will then face an interview, as well as a 30-minute aptitude test introduced this year, in order to be deemed worthy of studying at St. Stephen’s.

The high bar set by the reputed New Delhi school, which operates as part of the University of Delhi and is consistently ranked among the nation’s top five, reflects the increasingly cutthroat competition to get into elite institutions in the South Asian nation.

“We have received the maximum applications this year,” college spokesperson Karen Gabriel told the Express. “While 27,000 candidates had applied in 2013, this year’s figure is the highest in the college’s history.”

A record 32,100 applications came in this year, of which only 400 will eventually become “Stephenians.”

[Indian Express]


Texas Lawmakers Pass Bill to Allow Concealed Carry at Public Colleges

Mike Schofield
Eric Gay—AP Rep. Mike Schofield packs up his desk after the House adjourned on the final day of the legislative session in the House Chamber at the Texas Capitol on June 1, 2015, in Austin, TX.

If governor signs bill, openly carrying guns on campus would remain prohibited

The Texas state legislature passed a bill Monday that would allow people to carry concealed guns in buildings on public college campuses, and Governor Greg Abbott is expected to sign it.

The new law would remove a blanket ban on guns on campus at public Texas colleges, though school administrators could still ban guns from specific buildings, CNN reports. Backers of the bill claim it will provide increased individual protection to properly licensed gun-owners, but opponents argue campus shootings could increase and schools will have to spend more on security.

Texas is the first state to have a campus-carry bill reach the governor’s desk to be signed. Private universities won’t be affected by the new bill, and openly carrying a weapon on a public college campus would remain prohibited.


Read next: Vince Vaughn Says Banning Guns ‘Like Banning Forks’

TIME Innovation

Five Best Ideas of the Day: April 2

The Aspen Institute is an educational and policy studies organization based in Washington, D.C.

1. McDonald’s is raising wages for 90,000 employees. That’s a good start, and a strong message to other fast food outlets.

By Shan Li and Tiffany Hsu in the Los Angeles Times

2. “It must be right:” The human instinct to trust the authority of machines can be dangerous when life is on the line.

By Bob Wachter in Backchannel

3. As college acceptance letters roll in, women should ask about sexual assault prevention on campus.

By Veena Trehan at Nation of Change

4. When corporate values clash with policy in conservative states, big business has a powerful veto tool.

By Eric Garland in Medium

5. Amazon’s Dash button isn’t a hoax. It’s a step toward a true “Internet of Things.”

By Nathan Olivarez-Giles in the Wall Street Journal

The Aspen Institute is an educational and policy studies organization based in Washington, D.C.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email

TIME Innovation

Five Best Ideas of the Day: March 26

1. Al Qaeda and ISIS are locked in an ideological war, and for once, it’s good to be their mutual enemy.

By Daniel Byman and Jennifer Williams in Lawfare

2. For the millions left behind by America’s new economy, disability claims — legitimate or otherwise — are skyrocketing.

By Chana Joffe-Walt in Planet Money by National Public Radio

3. Maybe universities shouldn’t measure prestige by the number of applicants they turn away.

By Jon Marcus in the Hechinger Report

4. When younger women have heart attacks, they’re twice as likely to die as their male counterparts. Is medicine’s gender bias to blame?

By Maya Dusenbery in Pacific Standard

5. Can the triumph and tragedy of soccer help Harvard students appreciate the humanities?

By Colleen Walsh in the Harvard Gazette

The Aspen Institute is an educational and policy studies organization based in Washington, D.C.

TIME Innovation

Five Best Ideas of the Day: December 17

1. Independent and third party candidates could break D.C. gridlock — if they can get to Washington.

By Tom Squitieri in the Hill

2. A new software project has surgeons keeping score as a way to improve performance and save lives.

By James Somers in Medium

3. The New American Workforce: In Miami, local businesses are helping legal immigrants take the final steps to citizenship.

By Wendy Kallergis in Miami Herald

4. Policies exist to avoid the worst results of head injuries in sports. We must follow them to save athletes’ lives.

By Christine Baugh in the Chronicle of Higher Education

5. Sal Khan: Use portfolios instead of transcripts to reflect student achievement.

By Gregory Ferenstein at VentureBeat
By Gregory Ferenstein at VentureBeat
