Friday, July 16, 2010

Businessweek publishes an embarrassingly bad article on the college wage premium


Bloomberg Businessweek published a "special report" on the return on investment at individual colleges that can charitably be described as misleading at best and irresponsible at worst.  Titled "College: Big Investment, Paltry Return," the article purports to demonstrate that students attending some colleges earn a higher return on their investment than students attending others.  Businessweek used a consulting firm that collects self-reported salary information on its website; the firm used that salary information to calculate median earnings for the graduates of each college and compared those figures to the earnings of people who hold only high school diplomas.

The consulting firm then subtracted the current cost of attendance at each institution, adjusted for the school's six-year graduation rate, and used the result to calculate an expected annual ROI over 30 years.  And as is ever so popular these days, it created a ranking of 554 four-year colleges and universities across the country, from highest to lowest ROI (the methodology is described here).  The results?  MIT has the highest ROI in the country at 12.6%, and Black Hills State University (in South Dakota) the lowest, at 4.3%.  The article also publishes a shame list of schools with high tuition but a low ROI.
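
To make the mechanics concrete, here is a minimal sketch in Python of the general shape of such a calculation.  All figures are hypothetical, and the consulting firm's actual formula (see the methodology link above) is more involved:

    # Rough sketch of an annualized-ROI calculation of the kind described
    # above.  All numbers are hypothetical illustrations, not the
    # consulting firm's actual inputs or formula.

    def annualized_roi(earnings_premium_30yr, cost, grad_rate, years=30):
        """Annualized return implied by the earnings premium net of cost."""
        # Weight the 30-year earnings premium over a high school graduate
        # by the chance of actually finishing (the six-year graduation rate).
        expected_gain = earnings_premium_30yr * grad_rate
        # Solve cost * (1 + r) ** years = expected_gain for r.
        return (expected_gain / cost) ** (1 / years) - 1

    # Hypothetical school: $150,000 cost of attendance, a $1.5 million
    # 30-year earnings premium, and an 85% six-year graduation rate.
    print(f"{annualized_roi(1_500_000, 150_000, 0.85):.1%}")  # about 7.4%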

So what's wrong with this "special report"?  Here are just a few examples:
  1. Probably the most egregious error is that the article makes the same mistake so many articles in the popular press make:  it confuses correlation with causation.  The article implies that any random student out there attending a higher-ranked institution will have a higher ROI than attending a lower-ranked one.  But of course students are not randomly distributed across those 554 institutions.  If you took all the MIT students and sent them to Black Hills State University, the Black Hills ROI would skyrocket.  Why?  Because the typical MIT student has a higher level of human capital (intelligence, skills, aptitude, motivation - whatever you want to call it) than does the average student attending Black Hills State.

    So MIT may very well impart some added value to its attendees, as I'm sure it does, but it can't claim full credit for their post-college earnings.  Another way to think about it is to consider an experiment.  Take all the students who were admitted to MIT and to Black Hills, randomly assign some to attend their chosen institution, and bar the others from enrolling, sending them into the labor market without the benefit of a college degree.  Under Businessweek's methodology, one would expect all of those non-attendees to have the same earnings.  But of course this wouldn't be true; the students admitted to MIT but barred from attending would likely earn much more, even without a college degree, than the students accepted to Black Hills but not attending.  This is because many of the same traits that got those students admitted to MIT are also valued in the labor market and thus rewarded with higher wages.

    In statistical terms, the Businessweek methodology suffers from strong selection bias (the toy simulation after this list makes the point concrete).

  2. The earnings data used in the study are entirely self-reported on the consulting firm's website and are by no means guaranteed to be representative of the population of graduates of each institution.  Thus, the earnings data used in the calculations could very well be biased upward or downward - there is no way to tell.

  3. The cost-of-attendance data used in the study are based on sticker prices and don't account for the financial aid students receive.  Given the important role that institutional financial aid plays in subsidizing the cost of college today (see for example this study I did a few years ago, or more recently a study with my Penn State colleagues John Cheslock, Rodney Hughes, and Rachel Frick-Cardelle), the failure to account for institutional aid leads to downwardly biased ROI figures for many institutions (the second sketch after this list illustrates the effect).

  4. The ROI figures are going to be highly dependent upon the mix of majors at each college.  Thus, Businessweek's list of "best bargains" is skewed toward public institutions that enroll large numbers of students in STEM (science, technology, engineering, and math) fields, e.g., Colorado School of Mines, Georgia Tech, Virginia Tech, Cal Poly, Purdue, and Missouri University of Science & Technology.

  5. It has been well documented that jobs requiring college degrees generally carry much better benefits, particularly pension and health care benefits, than jobs held by people with only a high school diploma.  Thus, by focusing only on earnings and ignoring differences in benefits, the study again biases the returns to college downward.

Most of the evidence from labor economists (see the work of Card and Krueger, or Ehrenberg) points to the fact that differences in the returns to college are driven more by within-college variation (i.e., differences in the choice of majors or academic experiences once enrolled at a particular college) than by differences between colleges.  What this means is that the decisions students make about what to major in, what courses to take, and what other experiences to pursue in college have much more influence on their post-college earnings than does the choice of which college to attend.
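
To see how severe the selection problem in point 1 can be, consider a toy simulation in Python.  All numbers are invented: the two hypothetical schools add exactly the same value, but they enroll students of different average ability:

    # Toy simulation of selection bias.  Ability raises earnings whether
    # or not a student attends a given school, but students sort into
    # schools by ability, so a naive school-level comparison credits the
    # school with the ability effect.  All numbers are invented.
    import random
    from statistics import mean

    random.seed(0)

    TRUE_VALUE_ADDED = 10_000  # both schools add the same value by construction

    def earnings(ability):
        # Baseline earnings driven by ability (a stand-in for intelligence,
        # skills, and motivation), plus the school's true value added.
        return 30_000 + 40_000 * ability + TRUE_VALUE_ADDED

    # The selective school enrolls high-ability students; the other doesn't.
    selective = [earnings(random.gauss(1.5, 0.2)) for _ in range(10_000)]
    open_admit = [earnings(random.gauss(0.5, 0.2)) for _ in range(10_000)]

    print(f"Selective school mean earnings: ${mean(selective):,.0f}")  # ~$100,000
    print(f"Open-admission mean earnings:   ${mean(open_admit):,.0f}")  # ~$60,000
    # The $40,000 gap reflects who enrolls, not what the schools add --
    # their true value added is identical by construction.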
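
Point 3 is also easy to quantify.  Reusing the hypothetical figures from the first sketch, swapping the sticker price for a net price that accounts for institutional grant aid changes the computed ROI noticeably:

    # Hypothetical illustration of point 3: computing ROI from the sticker
    # price rather than the net price (after institutional grant aid)
    # biases the figure downward.  All numbers are invented.
    def annualized_roi(expected_gain, cost, years=30):
        return (expected_gain / cost) ** (1 / years) - 1

    expected_gain = 1_275_000  # hypothetical 30-year earnings premium
    sticker_price = 150_000    # hypothetical published cost of attendance
    grant_aid = 50_000         # hypothetical institutional aid over four years

    print(f"Sticker-price ROI: {annualized_roi(expected_gain, sticker_price):.1%}")              # ~7.4%
    print(f"Net-price ROI:     {annualized_roi(expected_gain, sticker_price - grant_aid):.1%}")  # ~8.9%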

The article concludes with this ridiculous comment:
Over the past 30 years, the S&P 500 Index averaged about 11 percent a year. Only 88 schools out of the 554 in the study had a better return than the S&P. Everywhere else, students would have been better off—financially, at least—if they invested the money they spent on their college educations and never set foot in a classroom.

"For almost every school on the list," writes Lee [the director of quantitative analysis at the consulting firm used by Businessweek] in an e-mail, "prospective students paying full price would probably have been better off investing in the stock market 30 years ago rather than spending their money on a college education."
So the message here is that unless you attend one of those 88 schools, you are better off skipping college and instead investing the money in the stock market.
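
It is worth spelling out the assumption buried in that comparison: the arithmetic only works if the full cost of attendance exists as a lump sum that an 18-year-old could invest for 30 years.  With a hypothetical cost and the article's cited return:

    # The stock-market comparison implicitly assumes the full cost of
    # attendance is available up front as an investable lump sum.  Using
    # a hypothetical $150,000 cost and the article's cited 11% average
    # annual return over 30 years:
    cost = 150_000
    sp500_return = 0.11

    lump_sum_value = cost * (1 + sp500_return) ** 30
    print(f"${lump_sum_value:,.0f}")  # roughly $3.4 million

    # Few students hold that lump sum; much of the "cost" is paid through
    # grants and loans that could never be diverted into an index fund
    # instead -- a point the comments below also make.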

There have been many egregious examples over the years of the misuse of quantitative data to create rankings of colleges and universities.  But this is perhaps the worst I have ever seen.

4 comments:

  1. Even if the analysis was spot-on, the suggestion that students would have been better off investing the money makes the patently false assumption that students would have access to that money for investment purposes. How many students (or really, their parents) actually have the cash to pay for tuition? A good chunk of tuition is paid for using scholarships, grants, or education loans.

  2. Great point, Turducken. Much of the money "paid" for their education is in the form of institutional grant aid and student loans that wouldn't be available for investing in the S&P 500 as an alternative.

  3. Yes, BW reports on education have been absolutely atrocious for some time. I've written to their award-winning chief economist, Mike Mandel, on his blog about his own poor use of data. They rarely adjust list price for the fourfold increase in student aid. They often focus on "full-time worker" earnings, ignoring that part of the benefit of jobs among the college educated is greater job security. As Don notes, they often focus solely on earnings, ignoring the fact that the college educated get more of their total compensation from non-earnings sources. That includes benefits, but in later years it also includes investment income, partially generated by their higher salaries.

    I'm not sure what the reasoning is, but BW has a very clear corporate agenda against higher education.

  4. Moralhazard: I'm not sure it's an agenda against higher education, though you're right that they've published other critical articles. But I think it's likely motivated at least in part by an interest in selling magazines/page views, as they know that "rankings" of this type are of great interest to readers.

    I haven't checked, but I wonder if any of the colleges that come out well promoted their position through press releases. If so, it would be pretty self-serving of them.
