Monday, July 26, 2010

A more detailed analysis of ED's proposed gainful employment rules


Now that I've had a chance to actually peruse (note I didn't say "read," since they are 93 pages long) the Education Department's proposed "gainful employment rules" for proprietary colleges, which were just published in the Federal Register today, I feel a little more qualified to comment on them.  Pay attention, folks:  comments are due to ED by September 9.

A few interesting observations about the proposed rules:
  1. Rather than basing eligibility for participation in the federal Title IV (student financial aid) programs solely on loan default rates, as do the current regs, the new rules use a two-part test: loan repayment rates and earnings-to-student loan repayment ratios.  Thus, institutions with programs that prepare students for gainful employment (i.e., vocationally-oriented programs) would have to demonstrate that students were both making enough money to pay back their loans, and were actually paying them back at acceptable rates.  No jokes please about programs that do not prepare people for gainful employment, i.e., most bachelor's degree programs.

  2. The application of the default and earnings-to-repayment ratios would result in institutions falling into three categories:
    -- Fully eligible to participate in Title IV
    -- Ineligible to participate in Title IV
    -- Partially eligible, but would have to curtail their growth and provide certain information to consumers about the risks of excessive borrowing
    The ED estimates that 5 percent of proprietary institutions would fall into the ineligible category, and 55 percent would become partially eligible.

  3. Despite the press's focus on proprietary (for-profit) colleges, the new rules are not limited to them: they apply to any higher education institution that offers gainful employment programs.  The ED estimates that the new regulations will affect the following numbers of institutions:
    -- For-profit: 22.7%, or 474 institutions
    -- Private, not-for-profit: 15.0%, or 36 institutions
    -- Public: 11.8%, or 252 institutions
    Note that the percentages are based only on the number of institutions in each sector that offer gainful employment programs.  Harvard, for example, would not be included in the denominator of the private, not-for-profit category.

  4. The Dept. of Education obviously has good data on default rates on federal student loans, from the National Student Loan Data System.  But the mystery was where it was going to get the data to calculate the earnings-to-repayment ratios.  Would it rely on the institutions to survey their graduates?  Would it survey graduates?  Or perhaps rely on a third party?  Well, the answer is found on page 43623 of the Federal Register:

    "The Department would calculate the average annual earnings by using most currently available actual, average annual earnings, obtained from the Social Security Administration (SSA) or another Federal agency,. . ."
    So the Department will most likely use Social Security Numbers to match student loan repayment amounts with individuals' earnings, as recorded in the Social Security system.
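The two-part test described in items 1 and 2 above can be sketched as a simple classification rule.  This is only my illustrative reconstruction: the threshold values below are hypothetical placeholders, not the actual cutoffs in the NPRM (see the Federal Register notice for those), and the function name is my own.

```python
# Illustrative sketch of the proposed two-part gainful employment test.
# All four threshold values are HYPOTHETICAL placeholders, not the
# figures from the proposed rule -- consult the Federal Register
# notice for the actual cutoffs.

FULL_REPAYMENT_RATE = 0.45   # hypothetical: repayment rate for full eligibility
MIN_REPAYMENT_RATE = 0.35    # hypothetical: floor below which a program is ineligible
OK_DEBT_TO_EARNINGS = 0.08   # hypothetical: annual loan payments as a share of earnings
MAX_DEBT_TO_EARNINGS = 0.12  # hypothetical: ceiling on that share

def classify_program(repayment_rate, debt_to_earnings):
    """Map a program's two metrics onto the three eligibility tiers."""
    # Pass either prong comfortably -> fully eligible for Title IV.
    if repayment_rate >= FULL_REPAYMENT_RATE or debt_to_earnings <= OK_DEBT_TO_EARNINGS:
        return "fully eligible"
    # Fail both prongs badly -> ineligible for Title IV.
    if repayment_rate < MIN_REPAYMENT_RATE and debt_to_earnings > MAX_DEBT_TO_EARNINGS:
        return "ineligible"
    # Everything in between -> eligible, but growth is curtailed and the
    # institution must warn consumers about the risks of excessive borrowing.
    return "restricted"
```

For example, under these placeholder thresholds a program with a 30 percent repayment rate whose graduates spend 15 percent of earnings on loan payments would land in the ineligible tier.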
Not surprisingly, the proprietary sector has come out guns blazing against the regs, issuing a press release on the night they were announced titled, "Career College Association Rejects Metrics-Based Approach to Gainful Employment," calling the new rules "unwise, unnecessary, unproven and is likely to harm students, employers, institutions and taxpayers."  They're clearly not going to go down without a huge fight on this one.  We'll have to wait to see whether they'll have any assistance from the other sectors, which obviously will not be as threatened by the regs as the for-profit schools are.  Bloomberg reported today that its index of 12 stocks of publicly-traded universities is down 2.1 percent, with DeVry leading the pack down 5.7 percent as of 3:00pm today.

It is too early to tell whether the regulations will survive largely in the form ED has proposed them.  One sign that they likely will is that Congress has grabbed onto this issue and doesn't appear ready to let go.  So some form of tighter rules will likely occur, and how much teeth they have will be determined in large part by the lobbying (and political donation) strength of the for-profit sector.

Stay tuned.

Friday, July 23, 2010

Department of Education finally issues the gainful employment rules for proprietary colleges


The ED finally issued the rules for measuring gainful employment in proprietary colleges.  The rules will impose a two-part test for graduates of vocational programs, involving both earnings-to-loan repayment ratios and default rates.  ED estimates that under the rules, 5 percent of proprietaries will be forced out of the Title IV federal student aid programs entirely, and another 55 percent will see their growth restricted by the regs.

Marketplace covered it this morning, including a short quote from me (here's a link to the audio of the story), as did the Chronicle of Higher Education and Inside Higher Ed.

Friday, July 16, 2010

Businessweek publishes an embarrassingly bad article on the college wage premium


Bloomberg Businessweek published a "special report" on the return on investment at individual colleges that can charitably be described as misleading at best, irresponsible at worst.  Titled "College: Big Investment, Paltry Return," the article purports to demonstrate how students attending some colleges earn a higher return on their investment than students attending others.  Businessweek used a consulting firm that collected self-reported salary information on its website, calculated from it the median earnings for graduates of each college, and compared those figures to what people with only high school diplomas earn.

The consulting firm then subtracted the current cost of attendance for each institution, adjusted for the school's six-year graduation rate, and used the result to calculate an expected annual ROI over 30 years.  And as is ever so popular these days, it created a ranking of 554 four-year colleges and universities across the country, from highest to lowest ROI (the methodology is described here).  The results?  MIT has the highest ROI in the country at 12.6%, and Black Hills State University (in South Dakota) the lowest, at 4.3%.  The article also publishes a shame list of schools with high tuition but a low ROI.
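To make the arithmetic being criticized here concrete, this is roughly what that calculation looks like.  It is my own stylized reconstruction, not the consulting firm's actual formula; the function name, inputs, and annualization approach are all assumptions.

```python
# Stylized reconstruction of a Businessweek-style ROI calculation:
# take the 30-year earnings premium of graduates over high-school-only
# workers, haircut it by the graduation rate, net out the (sticker-price)
# cost of attendance, and annualize over 30 years.  This is an
# illustration of the genre, not the firm's published methodology.

def annualized_roi(grad_earnings, hs_earnings, total_cost, grad_rate, years=30):
    """Expected annualized ROI of a degree under the naive premium model."""
    premium = (grad_earnings - hs_earnings) * years   # lifetime wage premium
    expected_gain = premium * grad_rate - total_cost  # discount for non-completion
    if expected_gain <= 0 or total_cost <= 0:
        return 0.0
    # Annual rate that grows total_cost into total_cost + expected_gain over `years`.
    return ((total_cost + expected_gain) / total_cost) ** (1 / years) - 1
```

Note that every input here is exactly where the article goes wrong: the earnings gap is attributed entirely to the college, the cost is the sticker price, and the same high-school baseline is used for every institution's students.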

So what's wrong with this "special report"?  Here are just a few examples:
  1. Probably the most egregious error is that the article makes the same mistake so many articles in the popular press make:  it confuses correlation with causality.  The article implies that any given student attending a higher-ranked institution will earn a higher ROI than he or she would at a lower-ranked one.  But of course students are not randomly distributed across those 554 institutions.  If you took all the MIT students and sent them to Black Hills State University, the Black Hills ROI would skyrocket.  Why?  Because the typical MIT student has a higher level of human capital (intelligence, skills, aptitude, motivation - whatever you want to call it) than does the average student attending Black Hills State.

    So MIT may very well impart some added value to its attendees, as I'm sure it does, but it can't claim full credit for their post-college earnings.  Another way to think about it is to consider an experiment.  Take all the students admitted to MIT and to Black Hills, randomly assign some to attend their chosen institution, and bar the others from enrolling, sending them into the labor market without the benefit of a college degree.  Under Businessweek's methodology, one would expect all of the non-attendees to have the same earnings.  But of course this wouldn't be true: the students admitted to MIT but not allowed to attend would likely earn much more in the labor market, even without attending college, than would the students accepted to Black Hills but not attending.  This is because many of the same traits that got those students admitted to MIT are also valued in labor markets and thus rewarded with higher wages.

    In statistical terms, the Businessweek methodology suffers from severe selection bias.

  2. The earnings data used in the study are entirely self-reported on the consulting firm's website and are by no means guaranteed to be representative of the population of graduates of each institution.  Thus, the earnings figures used in the calculations could very well be biased upward or downward - there is no way to tell.

  3. The cost-of-attendance data used in the study are based on sticker prices, and don't account for financial aid students receive.  Given the important role that institutional financial aid plays in subsidizing the cost of college today (see for example this study I did a few years ago, or more recently a study with my Penn State colleagues John Cheslock, Rodney Hughes, and Rachel Frick-Cardelle), the lack of accounting for institutional aid leads to downwardly biased ROI figures for many institutions.

  4. The ROI figures are going to be highly dependent upon the mix of majors in each college.  Thus, Businessweek's list of "best bargains" is skewed toward public institutions that enroll a large number of students in STEM (science, technology, engineering, and math) fields, e.g., Colorado School of Mines, Georgia Tech, Virginia Tech, Cal Poly, Purdue, and Missouri University of Science & Technology.

  5.  It has been well documented that jobs that require college degrees generally have much better benefits, particularly pension and health care benefits, than those held by people with only a high school diploma.  Thus, by focusing on earnings only, and not accounting for benefits differences, the returns to college are again biased downward.
Most of the evidence from labor economists (see the work of Card and Krueger, or Ehrenberg) points to the conclusion that differences in the returns to college are driven more by within-college variation (i.e., differences in the choice of major or in academic experiences once enrolled in a particular college) than by differences between colleges.  What this means is that the decisions students make about what to major in, what courses to take, and what other experiences to pursue in college have much more influence on their post-college earnings than does the choice of which college to attend.

The article concludes with this ridiculous comment:
Over the past 30 years, the S&P 500 Index averaged about 11 percent a year. Only 88 schools out of the 554 in the study had a better return than the S&P. Everywhere else, students would have been better off—financially, at least—if they invested the money they spent on their college educations and never set foot in a classroom.

"For almost every school on the list," writes Lee [the director of quantitative analysis at the consulting firm used by Businessweek] in an e-mail, "prospective students paying full price would probably have been better off investing in the stock market 30 years ago rather than spending their money on a college education."
So the message here is that unless you attend one of those 88 schools, you are better off skipping college and instead investing the money in the stock market.

There have been many egregious examples over the years of the misuse of quantitative data to create rankings of colleges and universities.  But this is perhaps the worst I have ever seen.

Wednesday, July 7, 2010