Monday, November 15, 2004

Social scientists weigh in on poll results

I haven't given much credence to the claims that Bush operatives rigged the election. The fact that the Kerry team chose not to contest the results seemed the best evidence that Kerry lost the election. Those guys were closer to the data, and had more invested in the results, than any of the rest of us.

However, a serious conundrum remains: why did swing-state exit polls -- which are generally an extremely reliable predictor of election results -- show Kerry so far ahead early on, only to see him lose?

So far this topic has been the subject, mainly, of rumor-mongering and innuendo -- a typical result of poor data and a lack of methodological rigor. On the left we've heard people blathering about Bush rigging the election in advance, or bugs in electronic voting machines, or what have you. On the right we've heard anti-scientific snarks about being unable to rely on "experts" and statistics.

Now the social scientists are starting to weigh in, showing just how strange the divergence between the exit polls and the final results really was. Money graf, referring to Ohio:
Given that the exit poll revealed that Kerry received 52.1% of the vote, we are 95% sure that the true percentage he received was between 49.8% and 54.4%. And because half of the 1 in 20 cases that fall outside the interval would be high rather than low, we're 97.5 percent sure that the true percentage he received was at least 49.8%. We are 99.5% sure that the true percentage he received was at least 49.2%. It turns out that the likelihood that he would have received only 48.5% of the vote is less than one in a thousand.
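The interval arithmetic in that paragraph can be reproduced in a few lines of Python. This is only a sketch under a standard assumption -- that the quoted 95% interval (49.8% to 54.4%) is a symmetric ±1.96-standard-error band around the exit-poll figure; the paper's exact method may differ:

```python
from math import erf, sqrt

def norm_cdf(z):
    """Standard normal CDF via the error function (no SciPy needed)."""
    return 0.5 * (1 + erf(z / sqrt(2)))

exit_poll = 52.1                      # Kerry's exit-poll share in Ohio (%)
ci_half_width = (54.4 - 49.8) / 2     # half-width of the quoted 95% interval: 2.3 points
se = ci_half_width / 1.96             # implied standard error, about 1.17 points

official = 48.5                       # Kerry's official share (%)
z = (exit_poll - official) / se       # about 3.07 standard errors below the poll
p_one_sided = 1 - norm_cdf(z)         # chance the true share was that low: about 0.001

print(f"SE = {se:.2f}, z = {z:.2f}, one-sided p = {p_one_sided:.4f}")
```

The one-sided probability comes out around one in a thousand, which matches the paper's "less than one in a thousand" characterization.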

Moreover, the probability of the divergence that took place in Florida was less than 3 in 1000 and in Pennsylvania the likelihood of the divergence was less than 2 in 1000. As the article concludes, "The odds against all three [of these divergences] occurring together are 250 million to one."

250 million to one? In other words, there's basically no chance that this was a random skew.
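For a sense of where a number like that comes from: if the three state-level divergences are treated as independent events, their probabilities simply multiply. Here's a back-of-the-envelope check using the rounded figures quoted above (the paper's 250-million-to-one figure presumably comes from the unrounded p-values, so this only matches in order of magnitude):

```python
# Rounded per-state probabilities quoted above
p_ohio = 1 / 1000
p_florida = 3 / 1000
p_pennsylvania = 2 / 1000

# Under independence, the chance all three divergences occur together
# is the product of the individual probabilities.
p_all_three = p_ohio * p_florida * p_pennsylvania

print(f"Combined probability: {p_all_three:.1e}")          # 6.0e-09
print(f"Odds against: about 1 in {1 / p_all_three:,.0f}")  # about 1 in 166,666,667
```

The rounded inputs give roughly 1 in 170 million -- the same ballpark as the article's 250 million to one.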

So what happened? Did the polling samples overrepresent women (who lean Democratic)? Were Republican voters less willing to be interviewed? Did the poll-takers express a systematic partisan bias (i.e., preferring to interview people wearing Kerry/Edwards buttons over those wearing Bush/Cheney buttons)? Were early/absentee voters much more heavily pro-Republican than usual? Also: why were the exit polls accurate on almost everything except the Presidential tally? And why were the divergences so much larger in swing states than in non-swing states?

The truth is, we really don't know what happened. Like the author of the paper, I'm not suggesting that Bush stole the election. But there's certainly a problem here that needs explaining.
