Thursday, March 22, 2012

Bad science from the New York Times

I'm always on the lookout for bad science/bad math (and the terrible journalism that it produces), and this week I came across a real gem. This one was noteworthy because it came from the New York Times, the "paper of record" that's obviously not struggling with its business model at all. In an article entitled "For 2nd Year, a Sharp Drop in Law School Entrance Tests", author David Segal writes the following:
Legal diplomas are apparently losing luster. 
The organization behind the Law School Admission Test reported that the number of tests it administered this year dropped by more than 16 percent, the largest decline in more than a decade. 
The Law School Admission Council reported that the LSAT was given 129,925 times in the 2011-12 academic year. That was well off the 155,050 of the year before and far from the peak of 171,514 in the year before that. In all, the number of test takers has fallen by nearly 25 percent in the last two years. 
The decline reflects a spreading view that the legal market in the United States is in terrible shape and will have a hard time absorbing the roughly 45,000 students who are expected to graduate from law school in each of the next three years. And the problem may be deep and systemic.
Segal goes on to write a long and sometimes-rambling article about how law school may be entering a serious long-term downward correction, as students begin to shy away from high levels of debt and uncertain long-term income prospects.

For what it's worth, I happen to agree with some of the overall conclusions that Segal draws--student loan debt is reaching frighteningly unsustainable levels, and the positive "return on investment" that so many in my generation have taken on faith is no longer quite so clear. There's only one problem: the tipping point hasn't yet arrived (though it will), and the data that Segal presents to "support" his thesis in fact does nothing of the sort. This is just one more example of a journalistic outlet massaging the data to fit its argument, rather than the other way around.

The statistical crime that Segal has committed here is his failure to provide enough context to give his data any real meaning. His LSAT statistics cover only the last three years--he tells us that 2009-10 was a "peak" for total tests administered, but tells us nothing about the overall trend that led to that "peak". Luckily for us, the generous folks at LSAC make those statistics available to the public, so we can do a little bit of fact-checking on our friend Segal.
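Before we even get to the full series, the arithmetic in the quoted passage is easy enough to verify. Here's a quick Python sketch using just the three figures the Times quotes:

```python
# Sanity-checking the Times' own numbers (tests administered, per LSAC).
lsats = {
    "2009-10": 171514,  # the "peak" year
    "2010-11": 155050,
    "2011-12": 129925,
}

def pct_change(old, new):
    """Percent change from old to new."""
    return (new - old) / old * 100.0

one_year = pct_change(lsats["2010-11"], lsats["2011-12"])
two_year = pct_change(lsats["2009-10"], lsats["2011-12"])

print(f"One-year decline: {one_year:.1f}%")  # -16.2%, i.e. "more than 16 percent"
print(f"Two-year decline: {two_year:.1f}%")  # -24.2%, i.e. "nearly 25 percent"
```

So the figures themselves check out. The problem isn't the numbers--it's the missing context around them.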

When we look at the overall data series (dating back to 1987-88), we see a pretty different story. Yes, tests taken are at their lowest level in three years, but the number is nowhere near the trough we saw during the economic boom years of 1995 to 2000. Generally speaking, LSATs administered spike during recessions and recede during recoveries, a pattern that makes intuitive sense. Let's look at this graphically, using some more publicly available data (for this chart, I plotted the annual percent change in LSATs administered against the annual percent change in the unemployment rate--so a move from 4.0% unemployment to 5.0% unemployment registers as a 25% increase in unemployment):

[Chart: annual % change in LSATs administered vs. annual % change in the unemployment rate]
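If you want to reproduce that chart yourself, here's roughly how it comes together in Python (a sketch--it assumes you've merged the LSAC counts with annual unemployment rates into a CSV; the file name and column names here are my own invention, not LSAC's):

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical file: one row per academic year, with columns "year",
# "lsats" (tests administered), and "unemployment" (rate, in percent).
df = pd.read_csv("lsat_unemployment.csv")

# Annual percent change for both series. Note the unemployment figure is a
# percent change in the *rate*, so 4.0% -> 5.0% shows up here as +25%.
df["lsat_pct"] = df["lsats"].pct_change() * 100
df["unemp_pct"] = df["unemployment"].pct_change() * 100

plt.scatter(df["unemp_pct"], df["lsat_pct"])
plt.xlabel("Annual % change in unemployment rate")
plt.ylabel("Annual % change in LSATs administered")
plt.show()
```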
Viewed another way (just using gross numbers rather than % change, and plotting on dual axes):

[Chart: LSATs administered and the unemployment rate over time, on dual y-axes]
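Same hypothetical data file as above; matplotlib's twinx() handles the second axis:

```python
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("lsat_unemployment.csv")  # same hypothetical file as above

fig, ax1 = plt.subplots()

# LSAT counts on the left axis...
ax1.plot(df["year"], df["lsats"], color="tab:blue")
ax1.set_ylabel("LSATs administered")
ax1.set_xlabel("Academic year")

# ...and the unemployment rate on a second y-axis sharing the same x-axis.
ax2 = ax1.twinx()
ax2.plot(df["year"], df["unemployment"], color="tab:red")
ax2.set_ylabel("Unemployment rate (%)")

plt.show()
```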
That's a pretty strong and obvious relationship (for the math nerds among you: an r-squared of .486 on the first chart and .666 on the second), but Segal and the Times don't bother to show you that part of the picture. They have a story to tell, and dammit they're gonna make the statistics fit that story.
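For anyone who wants to check those r-squared figures, a simple linear regression on the same data does the trick--scipy's linregress reports the correlation coefficient r, which you square. (Again a sketch on my hypothetical CSV; your numbers should land near mine if your series matches.)

```python
import pandas as pd
from scipy.stats import linregress

df = pd.read_csv("lsat_unemployment.csv")  # same hypothetical file as above
df["lsat_pct"] = df["lsats"].pct_change() * 100
df["unemp_pct"] = df["unemployment"].pct_change() * 100

# First chart: percent change vs. percent change (drop the first year,
# which has no prior year to compute a change from).
clean = df.dropna(subset=["lsat_pct", "unemp_pct"])
slope, intercept, r1, p1, se1 = linregress(clean["unemp_pct"], clean["lsat_pct"])
print(f"r-squared, percent changes: {r1 ** 2:.3f}")  # ~.486 on my data

# Second chart: gross numbers on both axes.
slope, intercept, r2, p2, se2 = linregress(df["unemployment"], df["lsats"])
print(f"r-squared, gross numbers: {r2 ** 2:.3f}")    # ~.666 on my data
```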

This trend is troubling, because it takes advantage of the innumeracy that plagues America--most people can't tell when statistics are being manipulated, so they take authors' and researchers' conclusions largely on faith. That's fine when those authors are operating in good faith and using proper statistical analysis, but in the Times' case that's clearly not what's happening. Segal has taken a dramatic two-year decline in LSATs administered and pretended it's the only thing in the data series that matters. For the purposes of his story, any context for his data is apparently beside the point.

What I actually find most interesting about the data (when viewed in full) is that LSAT volume occasionally creeps higher BEFORE the unemployment rate follows in kind. In other words, the number of people taking the LSAT could be seen as a leading economic indicator--people start taking the LSAT in anticipation of being laid off, because they can see business conditions beginning to deteriorate.
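If you wanted to poke at that idea, the obvious first step is a lagged correlation--line up this year's LSAT count against unemployment one and two years out and see whether the relationship tightens. (Again, just a sketch on the same hypothetical file.)

```python
import pandas as pd

df = pd.read_csv("lsat_unemployment.csv")  # same hypothetical file as above

# Correlate this year's LSAT count with the unemployment rate 0, 1, and 2
# years later. If LSATs really lead unemployment, the correlation should
# peak at a positive lag rather than at lag 0.
for lag in (0, 1, 2):
    future_unemp = df["unemployment"].shift(-lag)
    corr = df["lsats"].corr(future_unemp)
    print(f"unemployment {lag} year(s) ahead: correlation = {corr:.3f}")
```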

Am I actually suggesting that such a thesis is reasonable, and worth writing about? No, of course not--there's nowhere near enough data to support it, so this is mostly just me spitballing. Without statistical justification, proposing that thesis wouldn't meet my journalistic standards. But then, I don't write for the Times, do I?

[NY Times]
