April 16, 2006

Details count for graduation rates

Do you ever realize belatedly that you should have paid more attention to something when you were distracted? I'm definitely getting that feeling today, as I've been working on projects that don't require the data-that-with-luck-is-in-recovery from the hard disk crash Thursday morning. One of those is an article manuscript that will be up at Education Policy Analysis Archives in a few days, and another is a closer look at Florida's official graduation-rate calculation.

More after the jump...

When Florida changed its method of calculating graduation rates in the late 1990s, I didn't pay much attention, largely because such measures are a dime a dozen, because I was focusing on other projects at the time, and because the first few numbers seemed in line with other figures at the time. But now Florida's measure is the model for the National Governors Association-approved measure (see p. 18), and a sharp rise in the graduation rate between the late 1990s and 2003 (the latest published rate) is being touted as evidence of the success of Governor Bush's education policy.

The empirical puzzle observers have noted about the official rate is that it is considerably higher than the other measures researchers have proposed, whether the Boston College simple methods, Jay Greene's, or Rob Warren's. I'll illustrate this with one of the standard measures used for years, one that doesn't depend on counting students in each grade: the ratio of high school graduates to the older-teen population one might expect to graduate that year. I've calculated this measure in four different ways for Florida: with and without private-school graduates included, and compared in each case both to the 17-year-old population estimate for Florida in the graduating year and to the average of the 17-year-old population the year before and the 18-year-old population in the graduating year (in other words, an estimate of the number of 18th birthdays in the academic year of graduation). More details after the image (a larger version behind the thumbnail):
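The two denominators described above can be sketched as simple functions. The numbers in the usage example are hypothetical placeholders, not actual Florida counts:

```python
def ratio_to_17(graduates, pop17_grad_year):
    """Graduates divided by the 17-year-old population in the graduating year."""
    return graduates / pop17_grad_year

def ratio_to_birthdays(graduates, pop17_prior_year, pop18_grad_year):
    """Graduates divided by an estimate of 18th birthdays during the academic
    year: the average of last year's 17-year-old population and this year's
    18-year-old population."""
    return graduates / ((pop17_prior_year + pop18_grad_year) / 2)

# Hypothetical illustration: 120,000 graduates, 150,000 seventeen-year-olds.
print(ratio_to_17(120_000, 150_000))                    # 0.8
print(ratio_to_birthdays(120_000, 150_000, 152_000))    # about 0.795
```

Running each calculation with and without private-school graduates in the numerator yields the four trend lines in the chart.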


A comparison of different graduation rates for Florida, 1989-2003

The figure above shows four trend lines for Florida from 1989 in contrast to the higher ratio for the U.S. as a whole (including private-school graduates) and then, starting in 1998-99, the official Florida graduation rate.

The graduate:teen ratio has been heading up for both Florida and the country as a whole over the last few years, so the upward trend in the official rate isn't surprising. What is notable is the implicit claim in the official rate that the class of 2003 was about 9 percent more likely to graduate than the class of 1999; that dramatic multi-year trend is inconsistent with every other data source available.

Looking at the official manual for the state, I can see a few troubling issues:

  • The inclusion of alternatives to standard diplomas in the graduation numbers, with no public disaggregation
  • The exclusion of alleged transfers and movers from the base (creating an adjusted cohort) without any data quality checks to ensure that transfers really show up at a private school or in another state
  • The exclusion from the base (adjusted cohort) of students who drop out and immediately enroll in GED programs (as transfers to adult programs)

The last one is especially troubling and highly misleading. Note: I am not claiming that there is deliberate fraud involved in either the construction of this definition (which was piloted before Jeb Bush became governor) or in schools' manipulation of student records. But I think I need to follow up on this and see if the state has kept decent records on what adjustments have been made, precisely, on which basis.

(Gory details for the chart: Public-school details from Common Core of Data; private-school graduates for various years from the Private School Survey, with other years interpolated with a spline function and extrapolated before 1992 and after 2001 with linear trend lines; estimates of 17- and 18-year-old populations from the Census Bureau.)
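The interpolation and extrapolation steps in the chart notes can be sketched as follows. The survey years and counts below are hypothetical placeholders, not actual Private School Survey data, and `scipy` is assumed to be available:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical private-school graduate counts for survey years (placeholders).
years = np.array([1992, 1994, 1996, 1998, 2000])
grads = np.array([9000.0, 9400.0, 9700.0, 10100.0, 10400.0])

# Spline interpolation fills the gap years between surveys.
spline = CubicSpline(years, grads)
gap_year_estimate = float(spline(1995))

# A linear trend line extrapolates outside the survey range
# (here, before the first survey year).
slope, intercept = np.polyfit(years, grads, 1)
pre_survey_estimate = slope * 1990 + intercept
```

The spline reproduces the survey counts exactly at the survey years and smooths between them, while the linear fit avoids the wild swings cubic splines can produce when pushed beyond their data.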

Update (4/17/06, 3:40 pm): Andrew Rotherham takes a few minutes from caring for babies to comment, but gave the subscription URL for the Greene, Winters, and Swanson piece in Ed Week, a column that is also available for free at the Manhattan Institute site (yes, at that link above with their names). More discussion at the Ed Week forum on dropout rates, to which I contributed just now, almost three weeks later. Pay attention to the remarks of Cliff Adelman: while I disagree that he's made a convincing case that NELS shows the CPS figures are correct (and he doesn't actually say that, though a sloppy reader might assume it), it's important to note what type of documentation NELS's staff had available. Neither the CCD nor the CPS has such a check on data quality.

Posted in Education policy on April 16, 2006 8:47 PM