MBA Rankings – Take with a Grain of Salt?
Note: this article was published as a feature article in Poets & Quants’ 2014 Rankings Guide
As prospective students begin thinking about the business school admissions ordeal, one of the first things they often do is market research. Just as if you were to buy an automobile, you might look at surveys and expert opinions before choosing which car to test drive. Similarly, you might look at rankings to get a sense of where you might fit, or perhaps more to the point, where you might get in.
Read the Warning Label!
Rankings are deceptive. In fact, because they are based on statistics, they lure you into believing not only that the numbers are correct, but that they should drive your decision. I cannot deny that rankings matter, but they really should come with a warning label like the ones attached to vices such as red velvet cupcakes, home-brewed beer, or salt, for that matter: overreliance on rankings can be dangerous for your health.
Still, everybody uses them. But every ranking has its flaws: as University of Texas McCombs’ market researcher Matt Turner points out, they are plagued by imprecise questions, inconsistent data, and uneven salary figures. Even Poets & Quants’ John Byrne, who launched the first-ever B-school ranking at BusinessWeek, admits that some of the analyses suffer from sample-size problems, rendering the data meaningless. “Too many applicants take [rankings] at face value, not understanding that the results of any ranking might be based on little more than the subjective judgments of a journalist,” writes Lauren Everitt, herself a journalist at Poets & Quants.
The Case of Tuck
So if you are understandably confused at this point, you’d be in good company. How can different rankings vary so widely? Look at Tuck, for example, which has been ranked #2, #6, #9, #12, and #20, depending on which survey you consult (The Economist, Forbes, U.S. News, BusinessWeek and the Financial Times, respectively). Tuck’s Assistant Dean Penny Paquette notes that certain rankings rely more on hard data, while others focus on opinions. And naturally, the value of those opinions depends on whom you ask…
Rankings are particularly tricky for a school like Tuck, because it’s one of those places that either fits you or it doesn’t. And as a gut check, it feels odd to me to think of Tuck as not being in the “top 10.” That’s why you, too, want to consider a mix of hard data (things like career services reports on your target industry, or simply a school’s location) and the opinions of students and alumni, from friends as well as strangers.
Don’t Split Hairs
Speaking of the top 10, at least 15 schools can claim that position, when you combine the results of any three rankings. So everybody’s a winner!
But please, don’t get hung up on comparing one school to another based on a narrow difference in ranking. I’ve had conversations with students debating whether Chicago or Wharton is the #3 school in the U.S., and personally, I think it is unrealistic to expect a ranking to settle the question between schools so different in location, temperament, and philosophy.
That’s because the business school experience is different for everyone, and it’s very hard for a rank to capture just how good a school is for you. Rankings can never fully reflect the rich complexity of the business school experience, and they are no substitute for what matters most: finding the schools that best fit you.
Bottom line: read rankings, enjoy them, but consume in moderation.