For decades survey research has provided trusted data about political attitudes and voting behavior, the economy, health, education, demography and many other topics. But political and media surveys are facing significant challenges as a consequence of societal and technological changes.
It has become increasingly difficult to contact potential respondents and to persuade them to participate. The percentage of households in a sample that are successfully interviewed – the response rate – has fallen dramatically. At Pew Research, the response rate of a typical telephone survey was 36% in 1997 and is just 9% today.
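The response-rate arithmetic above is simple enough to sketch in a few lines. Note that the sample sizes below are hypothetical, chosen only for illustration; the 36% and 9% figures are the Pew numbers quoted in the text.

```python
# Minimal sketch of the response-rate calculation described above.
# Only the rates (36% and 9%) come from the quoted Pew figures;
# the household counts are made up for illustration.

def response_rate(completed_interviews: int, sampled_households: int) -> float:
    """Fraction of sampled households that were successfully interviewed."""
    return completed_interviews / sampled_households

# Hypothetical samples of 1,000 households each:
rate_1997 = response_rate(360, 1000)  # 0.36
rate_today = response_rate(90, 1000)  # 0.09
print(f"1997: {rate_1997:.0%}, today: {rate_today:.0%}")
```

The point of the toy numbers: at a 9% rate, a pollster must now sample roughly four times as many households as in 1997 to complete the same number of interviews.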
The general decline in response rates is evident across nearly all types of surveys, in the United States and abroad. At the same time, greater effort and expense are required to achieve even the diminished response rates of today. These challenges have led many to question whether surveys are still providing accurate and unbiased information. Although response rates have decreased in landline surveys, the inclusion of cell phones – necessitated by the rapid rise of households with cell phones but no landline – has further contributed to the overall decline in response rates for telephone surveys.

The folks at the Pew Center also included this handy chart, which shows just how much deterioration has occurred in just the last 15 years:
There's a lot more to the article than just that blurb, and I'll admit that I'm not fully doing it justice by focusing only on this one dynamic. But I think it's an incredibly important trend, and one that most people are completely unaware of--nobody really knows these days how their statistical sausage is made (let alone their actual sausage, but let's just leave that for another day).
The problem with unreliable public opinion surveys, much like unreliable scientific studies, is that their results often impact future behavior. For example, if I'm considering voting for Ron Paul in an upcoming election, but opinion surveys show that he is polling at only about 10%, I'll likely reconsider my vote because he looks "unelectable", regardless of whether this is actually the case. On the flip side, a poll that overstates a candidate's popularity could perversely make him more popular among other voters (in fact, I'm pretty sure this is what happened with Rick Santorum). Public opinion surveys can therefore become self-fulfilling prophecies, which is always a bit of a problem.
But response rate isn't the only problem that opinion polls currently face. I've recently participated in multiple phone surveys (one political, one economic), and both times I found myself laughing at the questions as I heard them, because they were structured in ways that couldn't possibly reflect my actual thoughts or beliefs--they demanded overly simplistic answers, or else falsely limited my universe of possible answers. (An example: one caller asked if I aligned myself more closely with the Republican or Democratic party platform. I responded "neither", and she told me that wasn't an option. I asked her if she was serious. She said she was. I laughed and said fine, flip a coin for me. She refused. So I flipped a coin. It came up tails. I went with the Democrats.)
Our problem is that we've all become increasingly obsessed with data, even as the quality of that data has steadily declined. But if you demand that your data tell a story, they'll tell one for you, even if it's an imaginary story. With response rates this low, that's basically what public opinion surveys have become--imaginary stories to be reported in national news outlets as if they were real news. It's sort of sad, really.
[Pew Research Center]