by JS O’Brien
If US News holds true to form, it will publish its 2009 undergraduate college rankings in August 2008, just in time to drill its way into the heads of all those eager new high school seniors who have to decide where to apply for early decision before November 1, and for regular decision before January 1.
Not to mention what the rankings do for their parents’ bragging rights.
The US News rankings are controversial, especially among those colleges that aren’t highly ranked. They complain that the magazine doesn’t measure what actually goes on in the classroom or the learning outcomes at various universities, and they’re right. Of course, the schools themselves don’t know that stuff either. No one knows that stuff. I can’t even find a college that clearly defines exactly what skills and knowledge an undergrad should have before getting a degree, nor can I find one that tests to make sure its graduates have what the school hasn’t yet defined.
I think US News does a pretty decent job of measuring those things that tend to be proxies for the real measurement that doesn’t exist and which most faculty members I know resist to their last breaths. Basically, the magazine measures:
- What peer colleges think of other colleges on a scale from 1 to 5, with 5 being “distinguished” and 1 being “marginal.” In other words, 3 is average, 4 is a step better than average, and 2 is below average.
- The quality of the student body, measured mostly by standardized test scores and the percentage graduating at the tops of their high school classes
- The amount of money a school spends on things that benefit students
- Class size
- How many alumni donate money to the school (as a proxy for customer satisfaction)
- The quality of the faculty (full-time, PhDs, etc.)
- How many students graduate vs. predicted graduation, and
- Faculty resources
Quite a lot of the bickering over US News’ rankings centers on whether the factors are appropriate and how they are weighted. Note once again, if you will, that colleges aren’t offering alternative or better numbers. They’re just offering complaints.
To stop all the complaints and come up with the final and definitive college rankings, I’m publishing my first annual JS O’Brien undergraduate college rankings, right here. My methodology is simple. I’ve already gone on record about the mess a poor-quality student body can make of a college experience, so it seems clear to me that one ingredient in a good college education mix has to be the quality of the students.
That’s factor one.
Factor two comes from the fact that the best students in the world cannot influence the course of a lecture delivered to 750 kids seated in a huge auditorium. They might as well be watching the lecture on television. So, the second factor in a superior college education is “class size,” with smaller being better.
Factor three is faculty quality which, of course, we have no data for because most college faculty members are quite convinced that everything in the world can be measured except their own job performances. But that’s OK. If you look closely at the US News peer rankings, they fall almost in lockstep with overall faculty reputation. It’s not a perfect measure but, then, we’ve already established that such things don’t exist.
Putting it all together, it comes to a logical conclusion: We don’t have real data on educational performance, so we’re going to assume that when you put great teachers in small classrooms with highly skilled kids, good things tend to happen. That’s not to say that they always happen, but the odds are better than decent.
The following rankings work pretty simply. I ranked every college from 1 to 40 on peer review score (as published in US News), class size, and standardized test scores. For standardized test scores, I used the 25th percentile and broke ties by ranking the school with the higher 75th percentile higher. For class size, I used the percentage of classes with fewer than 20 students, and broke ties with the percentage of classes of 50 or more. If a school didn’t finish in the top 40 in all categories, it didn’t get ranked. Why?
Because I say so.
The lower the score, the better. The perfect score would be a 3, which would mean that school had the highest peer review score, the smallest classes, and the most able undergrads. The highest possible score would be 120, meaning a school finished 40th in all three categories.
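For the curious, the rank-sum scoring described above is easy to sketch in code. Here is a minimal Python toy using three hypothetical schools with entirely invented numbers (none of these figures are real US News data): each school gets a rank of 1 to N in each of the three categories, with the stated tie-breakers, and the final score is the sum of the three ranks, lowest total winning.

```python
# Toy sketch of the rank-sum scoring: three categories, sum of ranks,
# lower total is better. All numbers below are invented for illustration.

schools = {
    # name: (peer_score, pct_classes_under_20, pct_classes_50_plus,
    #        sat_25th_pctile, sat_75th_pctile)
    "Alpha U": (4.8, 75, 8, 1400, 1560),
    "Beta U":  (4.6, 75, 12, 1420, 1540),
    "Gamma U": (4.9, 60, 10, 1380, 1550),
}

def rank(names, key):
    """Return {name: rank}, with 1 for the best school under `key`."""
    ordered = sorted(names, key=key)
    return {name: i + 1 for i, name in enumerate(ordered)}

names = list(schools)

# Peer review: a higher score ranks better.
peer = rank(names, lambda n: -schools[n][0])
# Class size: more small classes ranks better; ties broken by
# having fewer large (50+) classes.
size = rank(names, lambda n: (-schools[n][1], schools[n][2]))
# Test scores: higher 25th percentile ranks better; ties broken
# by the higher 75th percentile.
tests = rank(names, lambda n: (-schools[n][3], -schools[n][4]))

# The real version would first drop any school outside the top 40
# in any category; with three toy schools, everyone qualifies.
totals = {n: peer[n] + size[n] + tests[n] for n in names}
for name, total in sorted(totals.items(), key=lambda kv: kv[1]):
    print(name, total)
```

With these made-up inputs, Alpha U edges out Beta U because its class-size tie is broken by fewer large lectures, even though Beta U has the better test scores.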
Only large, national universities are ranked, so if you’re applying this year, don’t count out Swarthmore, Reed, Vassar, Amherst, Pomona, and the like.
And away we go!
- Yale (9)
- Caltech (11)
- Princeton (17)
- Harvard * (19)
- Stanford * (19)
- Duke (28)
- Penn (30)
- Chicago (31)
- Columbia (32)
- Washington University (35)
- Northwestern (36)
- MIT (38)
- Brown (40)
- Dartmouth (48)
- Johns Hopkins (51)
- Carnegie Mellon (58)
- Emory (63)
- Cornell (66)
- Rice (67)
- Vanderbilt (68)
- Berkeley (77)
- USC (80)
- NYU (109)
So, there you have it. The perfect rankings.
No need to thank me, but it would be appreciated.
* Ties are settled based on who has ever hired me and paid me consulting fees. If both institutions have paid me consulting fees, ties are broken by which university was the least pain in the ass to work with. If they were both terrible pains in the ass, ties are broken by the quality of brick construction on campus.