U.S. News & World Report Rankings
September 16, 2016

HIGHER ED

New College Rankings Are Out: NPR Ed Rates The Rankings!

September 13, 2016 · 6:00 AM ET
Illustration: a college ranking on a grade scale from F to A. LA Johnson/NPR

College presidents from High Point, N.C., to Laie, Hawaii, are sitting up a little straighter, because the 2017 U.S. News & World Report rankings are out today. Published every year since 1983, they’ve become perhaps the most famous and influential college rankings. But they’re no longer the only game in town.

There are more than 7,000 accredited colleges and universities in the U.S., and 20.5 million students enrolled in them this fall. That’s potentially 20.5 million opinions about what makes a school “the best.”

With so many variables — cost, location, curriculum, reputation — parents and students often turn to rankings and reviews. Besides U.S. News, Washington Monthly magazine has put out an extensive set of rankings for the past 11 years. There are also rankings in Money and Forbes, plus published guides by Princeton Review, Barron’s, the Fiske Guide To Colleges, and The College Board.

And last year, the U.S. Department of Education got into the game with the introduction of the College Scorecard. Using information from the student loan program and the IRS, the Scorecard gives far more information than previously available on outcomes like average earnings post-graduation.

The penalty for the wrong choice can be huge. The cost of a degree continues to soar, student loan debt is at a record $1.3 trillion, and there’s been a recent rise in defaults — meaning more students are having trouble paying those loans back, even as the government introduces new options meant to make repayment more affordable. Meanwhile, graduation rates and other expected outcomes vary widely from college to college.

Before you choose a college, then, the first step may be choosing which rankings or reviews to trust. And that’s where NPR Ed comes in.

Each of these ratings systems and guides has its own particular recipe. Selectivity? Prestige? Graduation rates? All of these criteria have strengths and weaknesses — they may be more or less useful, more or less fair and more or less easily gameable by the colleges.

As you’ll see, this is our subjective judgment based on limited evidence, but, hey, that also describes these rankings systems themselves.

1) Selectivity

For decades, universities were ranked primarily by which ones graduated the “best men.” From the 1930s to the 1950s, for example, Prentice and Kunkel published a guide that listed colleges based on how many of their alumni appeared in the social bible Who’s Who.

This method was simple and transparent.

It also largely mirrored social class, and it was somewhat circular: The “best colleges” were where the “best men” went, so the “best men” (most, back then, were rich, white men) kept going there.

In culinary terms, selectivity is a measure of the quality of the raw ingredients, not the skills of the chef.

U.S. News makes selectivity, or “student excellence,” 12.5 percent of its formula. They used to rely heavily on the acceptance rate, a metric that colleges can and do mess with by soliciting more applications. They’ve now de-emphasized that factor, giving more weight to the SAT or ACT scores and class rank of the entering students.

Selectivity, or excellence, indirectly influences other U.S. News measures too, like academic reputation, graduation and retention rates (the percentage of students who return from year to year), and even alumni giving (the “best men” tend to write bigger checks), which counts for 5 percent of the rankings.
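For readers who want to see how weights like these combine, here is a minimal sketch of a weighted composite score. Only the selectivity (12.5 percent), reputation (22.5 percent) and alumni-giving (5 percent) weights come from this article; the catch-all "other factors" bucket and every factor score below are hypothetical placeholders, not U.S. News’s actual data or full methodology.

```python
# Minimal sketch of a weighted composite ranking score.
# Assumption: the weights for selectivity (12.5%), reputation (22.5%) and
# alumni giving (5%) are those cited in the article; "other_factors" and all
# factor scores below are hypothetical placeholders, not U.S. News's data.

WEIGHTS = {
    "selectivity": 0.125,    # SAT/ACT scores and class rank of entering students
    "reputation": 0.225,     # peer-assessment surveys
    "alumni_giving": 0.05,   # share of alumni who donate
    "other_factors": 0.60,   # graduation/retention rates, resources, etc. (placeholder)
}

def composite_score(factor_scores: dict) -> float:
    """Weighted sum of factor scores, each on a 0-100 scale."""
    return sum(WEIGHTS[name] * factor_scores[name] for name in WEIGHTS)

# A hypothetical college that is highly selective but middling elsewhere.
example = {
    "selectivity": 90.0,
    "reputation": 70.0,
    "alumni_giving": 40.0,
    "other_factors": 65.0,
}

print(round(composite_score(example), 1))  # 68.0
```

The point of the sketch is simply that small changes to the weights, not to the colleges themselves, can reshuffle a ranking.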

But here’s the problem: the vast majority of American college students today go to nonselective institutions that admit just about everybody. And so what use can they make of this information?

We give “selectivity” one mortarboard out of four.


2) Reputation

How good would you say that Princeton’s undergraduate business program is? That’s a trick question: There is no such program. Yet when other college presidents were asked this question, they gave the nonexistent program top marks. That’s known as the “halo effect.”

U.S. News bases 22.5 percent of its formula for ranking undergraduate institutions on reputation, in part by asking leaders at peer institutions for their opinions. As the halo effect shows, that’s a less-than-foolproof process. Our rating, again:

One out of four.


3) Learning

It sure would be nice, in theory, if we could directly measure what students actually learn in college. But the kinds of standardized tests given by states in each grade, which form the basis of K-12 accountability, are becoming less common at colleges. And even though most colleges do try to assess student learning in various ways, they rarely release that information in a way that makes it easy to compare across schools.

Two caps in theory, but in practice?

One out of four.


4) Earnings

One of the biggest reasons people go to college is to get a better-paying job. So why not judge colleges, at least in part, by their graduates’ incomes? It’s a simple calculation of return on investment, especially given the amounts some students are borrowing to go to school.

The Education Department has now made that data a whole lot easier to access through the College Scorecard.

U.S. News doesn’t use this criterion. Money magazine does. Washington Monthly uses the government’s data to make a “Bang for the Buck” list that weighs graduates’ earnings against net prices. Comparing their No. 1 picks across regions is instructive: In the Northeast is Harvard, which boasts both unmatched reputation and oodles of financial aid to give away; in the West is California State University, Bakersfield, part of one of the country’s strongest public systems. Missouri’s College of the Ozarks and Kentucky’s Berea College, both small private liberal arts colleges with religious roots where students work to stave off debt, also top the lists in their respective regions.
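As a rough illustration of what "weighing earnings against net prices" can look like, here is a sketch that ranks colleges by the ratio of median post-graduation earnings to average annual net price. The college names and figures are invented, and Washington Monthly’s actual methodology is more involved than a single ratio.

```python
# Hypothetical "bang for the buck" comparison: median earnings after graduation
# divided by average annual net price. Illustration only; the names and numbers
# are invented and this is not Washington Monthly's actual formula.

colleges = {
    "College A": {"median_earnings": 52_000, "net_price": 13_000},
    "College B": {"median_earnings": 61_000, "net_price": 32_000},
    "College C": {"median_earnings": 45_000, "net_price": 9_000},
}

def bang_for_buck(stats: dict) -> float:
    return stats["median_earnings"] / stats["net_price"]

for name in sorted(colleges, key=lambda c: bang_for_buck(colleges[c]), reverse=True):
    print(f"{name}: {bang_for_buck(colleges[name]):.1f}x earnings per dollar of net price")
# College C: 5.0x, College A: 4.0x, College B: 1.9x
```

On a measure like this, a low-cost regional public campus can easily beat a famous private university, which is exactly the reshuffling the Washington Monthly list produces.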

Of course, money isn’t everything. Some have argued that the earnings method is inherently biased against colleges that educate a lot of future teachers, social workers and artists, while favoring tech- and engineering-heavy schools. Robert Kelchen, a professor at Seton Hall University who helps compile the Washington Monthly rankings, says he wishes they had better data, broken down by program.

Our rating: 2 out of 4 mortarboards. Good effort.


5) Broader Outcomes

Income and employment don’t tell us everything we need to know about the value of higher education. Far from it.

Educated people, by the numbers, are healthier, live longer, are more likely to vote and have stronger marriages. Of course, that’s to say nothing of the intangible benefits to individuals of a liberal arts education and, to society, of having an educated and informed citizenry.

There must be colleges that do a better or worse job of developing those qualities. Unfortunately, what we don’t have are agreed-upon ways of measuring them.

For the past few years, Purdue University, in partnership with Gallup, has been trying to get at the question through surveys. They’ve asked tens of thousands of college graduates about their levels of life satisfaction, to try to determine some of the important factors connecting experiences in college to life outcomes.

Unfortunately for purveyors of other ratings, the researchers found that going to a top-rated school had no impact on later success or happiness.

And while that survey challenged other definitions of “prestige,” the pollsters did find a strong link between great teaching and learning experiences in college and alumni who were happy and engaged in their work years later.

“If you are a graduate who was emotionally supported during college, it more than doubles your odds of being engaged in work and triples your odds of thriving,” says Brandon Busteed, director of Gallup’s education practice. “So we’re talking about life-altering differences.”

Gallup and Purdue hope to market some of their services to universities, but it’s a little ways down the road. If reliable ratings that include broader long-term outcomes, whether using surveys or some other indicators, were to emerge, we’d give them:

3 out of 4 mortarboards.


Here’s how we ranked past years’ rankings!

