Saturday, October 02, 2010

Why Polls Are Mostly Worthless

Every election season, and election off-season for that matter, we are inundated with polling data. Polling firms act like the Delphic Oracle when their polls are really nothing more than educated guesses.
  • The margin of error is a statistical fudge factor. Assuming the data collected is accurate, the real numbers will fall anywhere within the margin (95% of the time). SurveyUSA's poll of the California governor's race has Brown beating Whitman 46% to 43% with a margin of error of ±4%. That means Brown's true number is anywhere between 42% and 50%, Whitman's anywhere between 39% and 47%. So the poll is telling us that Brown is winning, unless he is losing.
  • Sample size. When pollsters break a poll down into subgroups, the sample sizes get tiny. That same SurveyUSA poll surveyed 610 voters, 6% of them black. That's 36.6 people. Whitman gets 27% of the black vote, says SurveyUSA; 27% of 36.6 people is about 10. Ten black respondents said they were voting for Whitman. If it were 11, her share of the black vote would jump to 30%. And with tiny sample sizes the margin of error skyrockets: the margin of error for the black subset is about 16%.
  • Weighting. The problem is they probably didn't reach exactly 36.6 black voters. Maybe they contacted only 25, maybe they talked to 50. Either way, the pollster scales the actual responses to hit that 36.6-voter target. If they reached only 25 black voters, each response would be multiplied by about 1.46 (36.6 ÷ 25). Whitman's ten weighted black voters may have been only seven live ones.
  • Polling questions. This problem can't be quantified, but it can be immense. Polling responses vary widely depending on the questions asked and the order of the questions. Push polls are a notorious example. Even changing the order of the questions in the SurveyUSA poll would have changed the results in unpredictable ways.
  • Non-responses. Two out of every three people polled hang up or don't answer. Pollsters cling to the hope that this isn't a statistically significant factor but they don't have any, you know, actual statistics to back up that hope. For example, do angry (Tea Party) voters hang up more or less frequently than bored voters?
  • How they count undecideds. Many pollsters push respondents to commit to somebody; they refuse to take "I don't know" for an answer. SurveyUSA says that only 3% of Californians are undecided in the Lieutenant Governor race. That is absurdly small and a sign they did a lot of arm-twisting to get responses. Better polls use the Very/Somewhat/Not Very construct, but that takes more time, costs more money, and almost no one does it.
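The arithmetic in the first three bullets can be sketched in a few lines of Python. The formula is the standard 95% margin of error for a simple random sample; the numbers are the SurveyUSA figures cited above, and the 25-voter scenario is the same hypothetical used in the weighting bullet.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p estimated from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

n_total = 610                                   # full SurveyUSA sample
moe_total = margin_of_error(n_total)
print(f"full-sample MOE: {moe_total:.1%}")      # ~4.0%, matching the reported +/-4

# Brown 46%, Whitman 43%: the two intervals overlap, so the lead
# is inside the margin of error ("winning, unless he is losing").
overlap = (0.46 - moe_total) < (0.43 + moe_total)
print(f"lead inside the margin? {overlap}")     # True

n_black = 0.06 * n_total                        # 6% of 610 = 36.6 "people"
moe_black = margin_of_error(n_black)
print(f"black-subset MOE: {moe_black:.1%}")     # ~16%

# Weighting: if only 25 black voters were actually reached (a
# hypothetical, as above), each response is scaled up to hit
# the 36.6-person target.
weight = n_black / 25
print(f"weight per response: {weight:.2f}")     # ~1.46
print(f"Whitman's 10 weighted voters = {10 / weight:.0f} live ones")  # ~7
```

Note that the margin shrinks only with the square root of the sample size, which is why the top line of a poll can be respectable while every subgroup number inside it is close to noise.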
This is not to say that polling is wildly inaccurate. Polls are, remember, educated guesses. They are almost always right in landslides. In close elections they are right a little better than half the time. On the whole, public opinion polling is better than reading Tarot cards. But not by much.

Read also: The Good, the Bad, and the Ugly of Polling

2 comments:

PoliShifter said...

Another issue that is raised occasionally by some is the fact that many of the so-called reputable polls are conducted via land lines...

who still has a landline? Older white tea baggers for one which is why sometimes I think the media may be caught in their own trap reporting on a reality that doesn't exist.

Anonymous said...

We have a landline (my job provides me a cell phone which I reserve for work, or calling my wife). More than half the time I refuse to participate in phone polls, especially the automated ones (when there is a human, I sometimes use it as an opportunity to educate the poll taker). Those "couple minutes of your time" the poll takers promise are usually under-estimates. They want my time for free, for their benefit. Nope. My time has value to me.