Answering a poll question is NOT the same as having an opinion

August 10th, 2018

The above tweet from former Labour Party pollster James Morris strikes me as being very apposite and goes to the heart of how we use polling outcomes. For unless there is some effort within the poll to ascertain how strongly people feel about a subject, it can be hard to interpret results.

We know that with voting intention surveys almost all pollsters now try to ascertain how certain it is that respondents will actually vote, but what about other findings? Those sampled might have a view when pressed, but how strongly do they feel about it?

Are, for instance, ordinary voters really going round saying how angry they are about Chequers, or do they not feel that strongly about it? Clearly those who are hostile, like the MP named in the tweet, are going to give an interpretation to a polling outcome that most supports their own position. A better measure, I’d argue, is the TMay leader ratings.

I’m sceptical of polling questions which require a very long preamble to explain the issue on which an opinion is being sought. If the issue isn’t known to the respondent and doesn’t come over simply, then you cannot assume that people really have the knowledge to make a judgement.

The other thing that annoys me is when the pollster knocks out all the don’t knows and refusers and gives you a net number so the total adds up to 100%. We really do need to know what the other figure is so we can pass judgement. I tend to ignore these polls.

I’d argue that the fact that 40% of those sampled now have no view on whether Corbyn or May is the best prime minister says more than saying that Corbyn is 12 points behind on this question.

I like long-term tracker questions where there are sufficient data points to see if there is a trend, provided the same question has been asked in the same format.

Mike Smithson