MBA Program Rankings Are Both Rank and King (2010 Businessweek Edition)

Businessweek released its 2010 rankings of MBA programs today. As always, any set of rankings has to be taken with a grain of salt: the methodology behind a ranking determines whether it evaluates the criteria you actually care about, and whether it does so in a way that is accurate and sensible. To my mind, the schools at the top of an MBA ranking should be those whose graduating classes have the best career prospects and qualifications (however that is measured). Still, particularly with well-known publications such as Businessweek or US News, there is going to be some self-fulfilling impact on people’s perception of a school’s prestige (and in turn on its students’ career prospects), whether deserved or not.

So, let’s take a look at BW’s 2010 ranking methodology. According to BW, the 2010 rankings are composed of the following weighted components (a back-of-the-envelope sketch of how these weights might combine follows the list):
- 45% student response (of which 50% is based on a survey of the Class of 2010 and 25% each on the Classes of 2008 and 2006)
- 45% recruiter response (similarly, 50% is based on a survey of recruiters in 2010 and 25% each on those from 2008 and 2006)
- 10% “intellectual-capital” rating (i.e., the number of articles published by a school’s faculty in 20 publications)
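
To make the arithmetic concrete, here is a minimal sketch of how those published weights could combine into a single composite score. This is not BW’s actual code; the function name, the 0–100 normalization, and the example scores are all illustrative assumptions.

```python
# Hypothetical sketch of BW's 2010 composite weighting. The weights come
# from BW's published methodology; everything else here is made up.

def composite_score(student, recruiter, intellectual_capital):
    """Combine the three components using BW's published 2010 weights.

    `student` and `recruiter` are dicts keyed by survey year; each
    component score is assumed to be normalized to a 0-100 scale.
    """
    student_avg = 0.50 * student[2010] + 0.25 * student[2008] + 0.25 * student[2006]
    recruiter_avg = 0.50 * recruiter[2010] + 0.25 * recruiter[2008] + 0.25 * recruiter[2006]
    return 0.45 * student_avg + 0.45 * recruiter_avg + 0.10 * intellectual_capital

# Example with invented scores for one school:
print(composite_score(
    student={2010: 88.0, 2008: 85.0, 2006: 90.0},
    recruiter={2010: 75.0, 2008: 80.0, 2006: 78.0},
    intellectual_capital=60.0,
))  # -> 0.45*87.75 + 0.45*77.0 + 0.10*60.0 = 80.1375
```

One takeaway from writing it out this way: the intellectual-capital term moves the composite by at most 10 points, so large swings in research output barely register next to the survey components.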

Digging deeper, we see that the student survey asks students to rate their schools on different aspects of the business school experience (e.g., teaching quality and career services) across 50 questions, each on a scale of 1-10. At the very least, this measures how happy students are with their school. It’s not clear, though, how those ratings translate into a ranking (e.g., do students at Chicago give their school a 10 in all categories, whereas students at the lowest-ranked school give theirs a 1 in all categories?). In any case, it would be useful to generate a customized ranking that shows how a school stacks up on the particular subset of questions I care about, rather than on an aggregate category (a toy sketch of that idea follows this paragraph). That’s not an option, but it is quite interesting to read the individual comments from graduates of a particular school, even though it’s not clear whether all comments were included (doubtful) or how representative the chosen comments are.
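
Here is what such a customized ranking might look like, assuming per-question scores were published. The school names, question labels, and scores below are all made up for illustration.

```python
# Hypothetical sketch of the "customized ranking" idea: re-rank schools
# using only the survey questions a reader cares about.

schools = {
    "School A": {"teaching": 9.1, "career_services": 7.2, "facilities": 8.0},
    "School B": {"teaching": 8.4, "career_services": 8.9, "facilities": 7.5},
    "School C": {"teaching": 7.8, "career_services": 9.3, "facilities": 8.8},
}

def custom_ranking(schools, questions):
    """Rank schools by their average score on a chosen subset of questions."""
    avg = {
        name: sum(scores[q] for q in questions) / len(questions)
        for name, scores in schools.items()
    }
    return sorted(avg.items(), key=lambda kv: kv[1], reverse=True)

# A reader who only cares about career outcomes might rank on one question:
print(custom_ranking(schools, ["career_services"]))
# -> [('School C', 9.3), ('School B', 8.9), ('School A', 7.2)]
```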

Moving on to the recruiter portion of the rankings, BW surveyed 514 recruiters (of whom 215, or 41.8%, responded) on the perceived quality of graduates and their company’s experience with MBAs. I think the relatively small sample size and potential self-selection bias make this portion of the rankings questionable. While I see the value in surveying the people who do the hiring out of a professional school, it’s not clear who responded to this survey. Was it mostly investment banks and management consulting firms? Or mostly firms hiring in an area that doesn’t particularly interest me, such as marketing? A more useful ranking here would slice by industry or function, showing which schools are most successful at placing their graduates in each. Part of the problem could be that not all firms and schools are willing to share this information.

Lastly, there’s the “intellectual-capital” rating, which I would throw out. The tension here is between good teachers and good academics. In a professional school where I am trying to learn the skills I need to apply on the job, I care much more about the former than the latter. The student survey already (supposedly) captures a school’s teaching effectiveness, so I don’t see much utility in including this measure. At least it only counts for 10% of the total ranking. Looking at the 2010 rankings, what does it mean when the top three schools in “intellectual capital” are Duke, Maryland, and Wake Forest, which rank 6th, 42nd, and 48th, respectively, in the overall BW rankings? Maybe it means that the faculty at those schools should focus on teaching their students, who don’t seem to be as happy or to have as good career prospects as students elsewhere, instead of churning out research.

Love it or hate it, the BW rankings are here to stay. Overall, I think BW has the right idea in surveying student and recruiter satisfaction, but since students are often interested in different aspects of a school or a professional field, it would be more beneficial to generate a set of customized rankings from the data BW collected. Then again, I can see why BW doesn’t want to do the legwork of collecting its data just to give it away to potential competitors. It is also worth noting that an aggregate set of rankings, such as the one found at Poets and Quants, seems closer to what one would think of as the “real” ranking. As they point out, statistical differences between schools are not always significant, so rankings are better read as groupings of similarly ranked schools rather than as a statement that #4 is way better than #5 (a toy sketch of that grouping idea follows).
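
As a rough illustration of grouping schools into tiers rather than strict ranks, here is a minimal sketch that starts a new tier whenever the score gap between consecutively ranked schools exceeds a threshold. The scores and the 2.0-point threshold are invented, not drawn from BW or Poets and Quants.

```python
# Hypothetical sketch of the "tiers, not ranks" idea: group consecutively
# ranked schools whose scores sit close together.

def tier_schools(ranked, threshold=2.0):
    """Split a score-ordered list of (school, score) pairs into tiers.

    A new tier starts whenever the gap to the previous school's score
    exceeds `threshold`.
    """
    tiers, current = [], [ranked[0]]
    for prev, curr in zip(ranked, ranked[1:]):
        if prev[1] - curr[1] > threshold:
            tiers.append(current)
            current = []
        current.append(curr)
    tiers.append(current)
    return tiers

ranked = [("A", 92.1), ("B", 91.4), ("C", 88.0), ("D", 87.6), ("E", 80.2)]
print(tier_schools(ranked))
# -> [[('A', 92.1), ('B', 91.4)], [('C', 88.0), ('D', 87.6)], [('E', 80.2)]]
```

Read this way, arguing over whether A or B is “#1” misses the point; the meaningful claim is that A and B sit in a different tier than E.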
