Nov 02, 2008

Opinion polling: the other dismal science

So, I open up my web browser this weekend to check the news, and I see the following three polls, all on the same page:

  • Rasmussen: Obama up by 5 points
  • Gallup: Obama up by 10 points
  • Zogby: McCain up by 1 point

These can’t all be right, can they?

Actually, they can. Or, at least, they can all be properly conducted, and just lead to wildly different results.

The only way to get a perfect result is to interview everyone in the country. (In fact, that’s exactly what we do on Election Day.) But that takes so much time and money that no individual pollster can do it. Instead, they interview several hundred people, maybe a couple thousand, and from there extrapolate what the country as a whole will do.
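
To see what that extrapolation looks like, here is a minimal sketch in Python. Everything in it is made up for illustration -- the 52% "true" support figure, the sample size -- it is not any pollster's actual method.

    import random

    # Hypothetical electorate: 52% support Candidate X (a made-up number).
    TRUE_SUPPORT = 0.52

    def run_poll(sample_size):
        """Interview sample_size random voters; return the share backing X."""
        votes = sum(random.random() < TRUE_SUPPORT for _ in range(sample_size))
        return votes / sample_size

    random.seed(1)
    for poll in range(5):
        print(f"Poll {poll + 1}: {run_poll(1000):.1%} support X")

    # Five properly conducted polls of the same electorate can print
    # anything from roughly 50% to 54%. Nobody cheated; chance did it.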

Now, mathematically, you can do this. You just can’t be sure of your answer. Here are a few of the reasons why.

Margin of error

Most opinion polls will state the margin of error. For example, they may say that Candidate X is ahead by 5 points, with a margin of error of plus-or-minus 3 points. Meaning, the real answer could be as high as 8 points or as low as 2 points.

(Sometimes, the margin of error is actually larger than the result. The poll shows Candidate X leading by 2 points, but with a margin of error of 4 points. Meaning, he could be ahead by 6, or he could actually be behind by 2! This seems to have happened a lot this year.)

A range of a few percentage points, when applied to a country with over 100 million voters, can lead to some pretty huge differences.
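
For readers who want to peek behind the curtain: the margin of error comes from a standard statistical formula, not from anything specific to the poll's questions. A quick sketch, assuming a simple random sample (which real polls only approximate):

    import math

    def margin_of_error(sample_size, z=1.96, p=0.5):
        """95% margin of error for a proportion; p = 0.5 is the worst case."""
        return z * math.sqrt(p * (1 - p) / sample_size)

    for n in (500, 1000, 2000):
        print(f"n = {n}: plus-or-minus {margin_of_error(n):.1%}")

    # n = 500:  plus-or-minus 4.4%
    # n = 1000: plus-or-minus 3.1%
    # n = 2000: plus-or-minus 2.2%

Note that this is the margin on one candidate's share of the vote. The margin on the gap between two candidates is roughly twice as large, which is part of why leads evaporate so easily.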

Confidence level

In addition to reporting a margin of error, polls also report a confidence level, usually 90% or 95%. This means that, according to the laws of mathematics, if you ran the same poll over and over, 95% of the results would land within the margin of error of the true answer.

But what about the other 5% or 10% of the time? Well, the folks reporting the numbers don’t like to tell you this, but, mathematically speaking, the poll can do everything right, and still be completely wrong, as much as 10% of the time.

There have been over 700 polls released this election season, and over 200 in October alone. At 95% confidence, chance alone predicts that roughly 35 of those 700 will miss, even if every single one is conducted properly. No doubt, many of the polls you have heard about fall into this category.
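
You can watch this happen in a simulation. The sketch below (made-up numbers again) runs 700 honest polls of the same electorate and counts how many land outside their own margin of error:

    import math
    import random

    TRUE_SUPPORT = 0.52                   # made-up "real" support level
    N = 1000                              # interviews per poll
    MOE = 1.96 * math.sqrt(0.25 / N)      # about 3.1 points at 95% confidence

    random.seed(2)
    misses = 0
    for _ in range(700):
        result = sum(random.random() < TRUE_SUPPORT for _ in range(N)) / N
        if abs(result - TRUE_SUPPORT) > MOE:
            misses += 1

    print(f"{misses} of 700 well-run polls missed")  # expect roughly 35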

Weighting

In most elections, more women vote than men. If you conduct a survey and talk to 100 men and 100 women, you will have to give the women's answers more weight to accurately reflect the Election Day results. (A bare-bones sketch of that arithmetic appears after the list below.)

How much more weight? That depends. Do you think this election will be pretty much the same as previous years? Is there something happening this year that will make a lot more women come out to vote? Or, perhaps, something that will attract a lot more men?

The fact is, nobody knows. Weighting is just educated guesswork. And this year, it is more complicated than usual:

  • Black voters are expected to come out in record numbers to support Barack Obama. How many will actually vote? Nobody knows.
  • Young people generally do not vote as much as other groups. But many analysts expect more young people to vote this year. How many more? Nobody knows.
  • Democrats, having lost the last two elections, are likely to turn out in larger numbers. How much larger? Nobody knows.
  • New voters. There has been a great push to register new voters, many of them poor or members of minority groups. These groups tend to vote Democratic. But the NY Times reports that as many as 60% of those registrations may be fake. If you are a poll taker, you really have no idea how many new voters there actually are.
  • Likely voters. Every pollster ends up talking to some people who will not vote on Election Day. Different polls use different methods of figuring out who is likely to actually vote—based on whether they voted last time, how much interest they have in the election, or just taking the voter’s word for it.
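
Here is the bare-bones weighting sketch promised above. All the numbers are hypothetical; the point is only that the pollster's turnout guess goes straight into the arithmetic.

    # Hypothetical raw sample: equal numbers of men and women interviewed,
    # but the pollster guesses women will be 53% of Election Day voters.
    respondents = [
        # (gender, supports Candidate X?)
        ("F", True), ("F", True), ("F", False), ("F", True),
        ("M", False), ("M", False), ("M", True), ("M", False),
    ]

    assumed_turnout = {"F": 0.53, "M": 0.47}  # the educated guess
    sample_share = {"F": 0.50, "M": 0.50}     # who was actually interviewed

    weighted_support = sum(
        (assumed_turnout[g] / sample_share[g]) * supports
        for g, supports in respondents
    ) / len(respondents)

    print(f"Weighted support for X: {weighted_support:.1%}")  # 51.5%
    # Unweighted, the same interviews would show exactly 50%. Change the
    # turnout guess, and the "result" moves without a single new interview.

Real pollsters weight on many variables at once (age, race, region, party identification), but the logic is the same.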

The different weighting factors used by the different polls probably account for most of the variability we see in the results.

Human factors

Let’s face it – humans are complicated and sometimes uncooperative beings. There are lots of ways they can foul up a perfectly good poll.

  • Lying. People have a tendency to tell a pollster what they think he wants to hear. Maybe they just want to be nice; maybe they want to avoid an argument. This skewing has been found in many, many types of polls.
  • Refusal. In any poll, a certain number of people refuse to participate—they don’t want to be bothered, or they don’t want to talk politics with a stranger. If refusals are more likely to come from one party than the other, this can skew results.
  • Hard-to-reach folks. For years, pollsters called people on the telephone. But today more and more people have cell phones, or call screening, and are hard to reach. Again, if people with such gadgets are more likely to support one candidate or the other, it will skew the results. (One blogger has noticed that John McCain does better in polls conducted during the week than on the weekend, and speculates that's when McCain supporters are home.)
  • Bias. Poll takers are only human. They have their own thoughts and opinions. And while they take great pains to be neutral, those opinions sometimes come through in the questions asked or the way the answers are interpreted. Most of the major polling companies are based in big cities like New York, Los Angeles, Boston, or Washington, DC – all cities that are heavily Democratic. This may explain why, in the last two elections, the Democrats did better in the polls than they did on Election Day.

So, with all these problems, how can we figure out who is going to win the election? Well, never fear – there is one sure-fire way to find out the winner:

Read the newspaper Wednesday morning.

And don’t forget to vote!

Your Comments, Thoughts, Questions, Ideas

Linda says:

Opinion polling is one way to gauge consensus - but it is only as good as your sample and sample size. And depending on your sample, it can get one-sided.

posted on Mon, 11/03/2008 - 12:30pm
bryan kennedy says:

Looks like a majority of the polls out there were correct in predicting the presidential election. I will be digging around in the coming weeks for an analysis of the more detailed polling results. I am specifically interested to see how the work of Nate Silver, the pollster-elite of the moment, pans out. It seems like he took a different approach, averaging many polls and data points into a large model to predict the outcomes. I will be curious to see how accurate his more detailed state-by-state predictions were.

posted on Wed, 11/05/2008 - 11:26am
Gene says:

There was a great variety in the poll predictions. According to this article, Rasmussen and Pew were right on the money. They list 23 polls in order of accuracy. Interestingly, the majority of the polls that were way off the mark were conducted by media companies -- TV networks and newspapers / magazines. This is irrefutable proof that either a) the media has a liberal bias; or b) the media wants a splashy headline and so they produce outrageous results; or c) the media should leave polling to the professionals. ;-)

As for state-by-state, I looked at RealClearPolitics.com, a non-partisan site that compiles all available polls and averages them out. They only got three states wrong -- Indiana, North Carolina and North Dakota. (As I write this, Missouri is still undecided.) Of the three, two were very close and could have gone either way, both in the polls and in the election. Only North Dakota was way off -- they gave it to Obama by 0.6 points; McCain won by 8.

The average state poll on RCP was off by about 3.5 points -- within the standard 4-point margin of error. 33 of the 50 state polls ended up being within 4 points of the actual tallies. And of those that weren't, only two -- Arizona and, again, North Dakota -- were expected to be close and ended up being comfortable wins.

So, it would appear that the pollsters have learned from their difficulties of the last few years -- though it is best to take media-driven polls with a grain or two of salt.

posted on Fri, 11/07/2008 - 2:08pm
kali0082 says:

Getting a sample that is representative of the entire population is a very common problem within many fields, including science and statistics. The sample that is observed or studied - whether it is a group being polled, or a water sample pulled from the lake - is not guaranteed to represent the whole population.

posted on Wed, 11/05/2008 - 4:56pm
DO says:

No, but with polling we can calculate the margin of error for a (properly drawn) sample of a given size. For opinion polling in the US, 1,350 is the most efficient size if you also want to look at subgroups like religion or education.

posted on Fri, 11/21/2008 - 4:30pm
iowaboy says:

i believe in that old adage that 'less is more'. that article was too long.

the best thing to say about polls is that the only one that counts is on election day.

and the best thing to say after that is people should be smart enough not to listen to everything the media tells them. they should think for themselves and make up their own minds instead of letting others do it for them just because 'it's easier that way'.

posted on Wed, 11/12/2008 - 12:24pm
Jackie Rabideau says:

A lot of the presidential polls are different, most likely because of how the information is collected. I don't remember the details, but way back when phones were just introduced, a poll was conducted via phone. The results showed the Republican candidate in a huge lead. Then, this candidate lost by a large margin. The poll was off because only affluent people had phones back then, and these people were more likely to vote Republican. So, where do the people that take polls get their database these days? And what type of people are even likely to take a poll?

posted on Fri, 11/21/2008 - 12:52pm
DO says:

Most polls taken today by phone use random digit dialing as a way to sample all phones in use, including unlisted numbers. Most of the suppliers that do that kind of research have built up databanks so they can avoid business and unused exchanges. Well, you ask, what about cell phones? Surveys have been done just of cell phone users and, it is claimed, the results closely parallel those of land-line users of similar ages and other demographics. Drawing a good, representative sample is relatively easy -- the much harder part is developing good questions that yield usable information and then interpreting what the data means. In case you are wondering, I have a PhD in political/consumer behavior and have done many a survey.

posted on Fri, 11/21/2008 - 3:54pm
