Why Does Polling Go Wrong?

Over the last few weeks, a number of polls have been published about byelections happening in Bourassa, Toronto Centre, Brandon-Souris and Provencher, and we’ve received many questions about polls conducted with automated dialing, also known as IVR (interactive voice response).

How can these polls, which are usually very accurate, go so wrong?

If you want to analyze a poll, here are some things to look at:

When was the poll conducted?

One thing that always makes us wary is polling conducted immediately before a vote, which is why our firm observes a blackout period of 48 hours prior to election day. In our experience, polling that close to the opening of the polls can produce skewed responses and leaves no time to re-enter the field to collect additional information if the response rate was low. Always check when a poll was conducted, and look at what events were happening before and during the polling period for a better understanding of the public mood.
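As a rough illustration of that timing check, here is a minimal sketch; the dates and the 48-hour window are hypothetical placeholders, not values from any particular poll.

```python
from datetime import datetime, timedelta

# Hypothetical field-end and election dates, for illustration only.
election_day = datetime(2013, 11, 25)
field_end = datetime(2013, 11, 24, 20, 0)  # when the poll left the field

# A 48-hour blackout window before election day, per the policy described above.
blackout_start = election_day - timedelta(hours=48)

if field_end >= blackout_start:
    print("In the field during the 48-hour blackout window; read with caution.")
else:
    print("Field dates fall outside the blackout window.")
```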

Who was called?

It’s tough to find out who was called if you didn’t commission or execute the poll, but understanding how firms decide whom to call is important for judging reliability. On Twitter, some voters reported being called repeatedly by the same polling firm during the past byelections, even though the published ‘random’ sample in those polls was relatively small. This led some to ask: was the polling really random? Check the methodology statement to see whom the polling firm says it called.

Here at Mainstreet we try to eliminate multiple calls in two ways. First, we knock out all answering-machine calls: the platform detects answering machines and terminates those calls immediately, on average within 10 to 12 seconds of dialing. Second, when we do a second round of calls, we dial only numbers that previously reached an answering machine or went unanswered. No one who answered a live call is called again. All our polls use a ‘three pass’ approach: we dial the list three times, each time removing the numbers of anyone who picks up.
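To make the three-pass logic concrete, here is a minimal sketch of how such a call list could be managed. The outcome labels and the dial callback are our own illustration, not Mainstreet’s actual dialing platform.

```python
from enum import Enum

class Outcome(Enum):
    ANSWERED = "answered"    # live respondent picked up
    MACHINE = "machine"      # answering machine detected, call terminated
    NO_ANSWER = "no_answer"  # no pickup

def run_three_passes(numbers, dial):
    """Dial the list three times, removing anyone who picks up live.

    `dial` is a callable that places one call and returns an Outcome.
    """
    remaining = list(numbers)
    results = {}
    for _ in range(3):
        next_round = []
        for number in remaining:
            outcome = dial(number)
            results[number] = outcome
            # Only machines and no-answers are eligible for another pass;
            # a live answer permanently removes the number from the list.
            if outcome in (Outcome.MACHINE, Outcome.NO_ANSWER):
                next_round.append(number)
        remaining = next_round
    return results
```

Because a live answer removes a number permanently, no respondent is polled twice, which is exactly the property described above.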

What’s the sample size?

The larger the sample size the better, but the more expensive the poll is to conduct. Polling companies are charged for connection time, meaning that each dial costs more the longer the respondent stays connected (i.e., to answer questions). We try to base our results on a minimum of 600 respondents. Response rates are generally between 8 and 10%, meaning that to get 600 completed surveys we need to call a list of 6,000 numbers or more. A sample of 600 respondents corresponds to a margin of error of roughly +/- 4.0 percentage points, 19 times out of 20, while 400 respondents corresponds to roughly +/- 4.9 points. In the Ontario byelections that occurred earlier this year, the sample size for our polls was over 1,000 respondents each. Generally speaking, the smaller the sample size of a poll, the more wary you should be of the results, because the margin of error is higher.
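The arithmetic above follows the standard margin-of-error formula for a simple random sample. A minimal sketch, assuming the conventional 95% confidence level and the most conservative proportion p = 0.5:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of size n,
    using the most conservative proportion p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

def required_dials(completes, response_rate):
    """Numbers that must be dialed to reach a target count of completes."""
    return math.ceil(completes / response_rate)

print(f"n=600: +/- {margin_of_error(600):.1%}")  # +/- 4.0%
print(f"n=400: +/- {margin_of_error(400):.1%}")  # +/- 4.9%
print(required_dials(600, 0.10))  # 6000 dials at a 10% response rate
print(required_dials(600, 0.08))  # 7500 dials at an 8% response rate
```

The square-root relationship is why the gains flatten out quickly: halving the margin of error requires quadrupling the sample, and therefore roughly quadrupling the dialing cost.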

What are the standards of the polling firm?

Here at Mainstreet we have decades of political experience as veterans of countless campaigns. When it comes to polling and research, we have conducted over 150 polls. Our clients may not always have been happy with the results, but they have always been impressed with our accuracy. In the previous British Columbia provincial election, we were one of a handful of organizations to publicly predict a win by the BC Liberal Party.

Over the last few days, IVR polling has received a lot of criticism. We can certainly understand why, but we would caution both our clients and the public that IVR polls, when conducted diligently, remain an accurate way to gauge public opinion, to say nothing of the significant cost savings. We were in the field for the latest round of byelections; we were comfortable with the results a week ago, and we are comfortable with them today.

Quito Maggi is the president of Mainstreet Research.