Before the 2017 UK general election, I wrote two blog entries (here and here) about how I was testing an intuition that instead of asking people for predictions about their OWN future behaviour, we might be better off asking them to predict the future behaviour of a FRIEND. In short, I had a hunch that we might know the minds of others better than we know our own!

Here is the summary, to this point:

    • I mostly work in healthcare market research, but used voting intention to test my hunch because it involves asking a simple question that can be easily repeated and, crucially, could be validated against the general election outcome. It is difficult to find robust real-world outcomes against which to test, so this was a golden opportunity, especially for an election junkie!
    • I used Google Surveys to independently field the two questions below. Initially, between the 29th April and 1st May (at the start of the campaign) and again, unchanged, between the 2nd and 4th June, just ahead of the election on the 8th June. That gave four data sets in total, two for each question.
      • A) How will you vote in the UK general election on the 8th June? Single-choice, answer list comprising: Conservative, Labour, Liberal Democrat, SNP (in Scotland) / Plaid Cymru (in Wales), Other (e.g. UKIP, Green, Independent), I haven’t decided yet / I don’t know, I probably (or definitely) won’t vote.¹
      • B) Please think about someone you know well and regard as a friend. How do you think THEY will vote in the UK general election on the 8th June? Single-choice, answer list comprising: Conservative, Labour, Liberal Democrat, SNP (in Scotland) / Plaid Cymru (in Wales), Other (e.g. UKIP, Green, Independent), I honestly don’t know how they will vote, I don’t think they will vote at all.¹
    • The aim was, definitively, NOT to predict the political outcome. Best to keep politics out of business I reckon. Also, I confirmed during the process that Google’s sampling has a strong natural bias to the political “Left”, no doubt resulting from its heavy reliance on intercept surveys on online news websites.
    • Instead, I focused on predictions of turnout (i.e. do people intend to vote themselves / do they expect their friend to vote), and confidence in the voting decision (i.e. levels of certainty about how they will personally vote / how they think a friend will vote).
    • The reason for my hunch? Behavioural science tells us – variously – that we can be unreliable witnesses to our own motivations; that we can be over-confident in our ability to stick to our intentions; that we are naturally expert at observing other people; that we often tell little white lies to protect our own ego; and that as a consequence of all that (and more) we are notoriously poor forecasters.

So, did my intuition prove correct? 

Asking about a friend was the overall more accurate measure

Firstly, and unsurprisingly, actual turnout was lower than either the self-declared or friend-prediction measure predicted. That’s because I didn’t attempt to weight for it like the pollsters do (not my expertise, nor relevant to my purpose). Besides, it is a well-documented market research / polling effect: answering a survey requires no physical action and happens in the moment, whereas voting requires both remembering to vote and then actually voting. For the purpose of testing the hypothesis, what matters is the relative comparison: the friend-prediction measure was overall 3 points closer to actual turnout than the self-declared measure, based on aggregate results from the ‘start’ and ‘end’ of campaign surveys. The friend-prediction measure was also closest to actual turnout in five of the six age categories, and in the other one (18-24) it was only 1 point higher.

*Turnout amongst all registered voters according to Ipsos MORI

Asking about a friend was the only measure to predict that enthusiasm for voting might be starting to wane amongst older voters

Generally, in both versions of the question, the expression of likelihood to vote increased over the course of the election campaign. However, in the friend-prediction data we saw a distinct 3 point decrease in intention to vote amongst voters aged 65+, which stood out because the three brackets covering respondents aged 35-64 all indicated a modest increase in voting intention over the course of the campaign, and the two brackets covering those aged 18-34 showed markedly increased levels of enthusiasm for voting. The best data we have to compare against on this score is from Ipsos MORI, comparing the change in turnout by age between the UK general elections of 2015 and 2017. This clearly shows that turnout amongst younger voters was much higher in 2017 than in 2015 (as we all now know… it was one of the first observations to emerge on election night), but that turnout amongst older voters was, unexpectedly, down. I say unexpectedly because electoral history shows that older voters, especially those aged 65+, can usually be steadfastly relied upon to vote (and mostly for the Conservatives). Nobody, so far as I’m aware, predicted that they might not do so this time. The suggestion from the friend-prediction data is that whilst the campaign energised many younger voters, it had a negative effect on the voting intention of some older voters. I will leave readers to reach their own conclusions as to why.

Asking about a friend was much the more sensitive measure

Until this election result, pollsters and pundits used to say that election campaigns didn’t much matter to the outcome. However, this time, every single pollster identified a substantial narrowing of the Conservative lead over the course of the campaign: from a comfortable 20 point lead in voting intention share at the outset, to a slender 2.4 point lead in actual vote share on the night. In our data, the %-point increase in Labour’s share of voting intention over the campaign was greatest between the two friend-prediction surveys (+15 points), and whilst the self-declared measures showed the same direction of travel, they significantly under-reported the change that was afoot (+11.9 points). Returning to turnout, the friend-prediction survey data moved 4.7 points overall over the course of the campaign, whereas the self-declared data moved by less than 1 point. That is, the friend-prediction measure picked up changes in turnout intention amongst younger and older voters during the campaign, whereas the self-declared measure hardly changed. Given what transpired, the former seems to have been better attuned to reality.

Responses when asking about a friend were given with greater confidence

Self-declared survey responses for how confident people were in HOW they would vote firmed up over the course of the campaign – on average, respondents expressed more confidence in their voting choice close to election day than they had at the start of the campaign. That seems like common sense; a fit with conventional wisdom. By contrast, results from the two friend-prediction surveys hardly deviated, suggesting that the degree of confidence respondents have in their perceived insight into the mind of a friend doesn’t waver. What is more, the absolute levels suggest that respondents also felt more confident predicting the voting choice of a friend than predicting their own choice! It may seem against the grain, if not distinctly unnerving, to reflect on the possibility that our ability to mind-read a friend is a more effective predictor than our own conjecture, and yet that seems to be what is happening here.

SELF-DECLARED: How will you vote in the UK General Election on the 8th June?
% selecting "I haven't decided yet / I don't know" at the START of the campaign: 31.3%
% selecting "I haven't decided yet / I don't know" at the END of the campaign: 23.0%
% point change: -8.3
"Start" base = 1,230 UK adults (weighted), via Google Surveys (29/4/17 - 1/5/17)
"End" base = 1,314 UK adults (weighted), via Google Surveys (2/6/17 - 4/6/17)

FRIEND-PREDICTION: Please think about someone you know well and regard as a friend. How do you think THEY will vote in the UK General Election on the 8th June?
% selecting "I honestly don't know how they will vote" at the START of the campaign: 20.0%
% selecting "I honestly don't know how they will vote" at the END of the campaign: 19.4%
% point change: -0.5
"Start" base = 1,257 UK adults (weighted), via Google Surveys (29/4/17 - 1/5/17)
"End" base = 1,288 UK adults (weighted), via Google Surveys (2/6/17 - 4/6/17)
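The "% point change" figures above are simply wave-over-wave differences between the two survey fieldings. As a minimal sketch of that calculation (the `point_change` helper is my own illustration, not part of the original analysis):

```python
def point_change(start_pct: float, end_pct: float) -> float:
    """Percentage-point change between two survey waves (end minus start)."""
    return round(end_pct - start_pct, 1)

# Self-declared: "I haven't decided yet / I don't know"
print(point_change(31.3, 23.0))  # -8.3

# Friend-prediction: "I honestly don't know how they will vote"
# (rounds to -0.6 from the rounded table figures; the reported -0.5
# presumably reflects unrounded underlying data)
print(point_change(20.0, 19.4))  # -0.6
```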

Personally, I am satisfied – based on these findings – that my intuition did prove correct (but always happy to debate).

Further, I think the technique would work just as well if the word ‘colleague’ replaced ‘friend’ in a predictive question of this type when applied to business and professional settings, such as my home sector of healthcare and pharmaceuticals. The important thing, I suspect, is that the subject of the question should be a person whose relevant personal or professional behaviour the participant might reasonably be expected to know well.

Why does it work?

  • It forces us to think: anticipating the behaviour of another is tangential, effortful, and potentially fun (when compared to answering about ourselves)
  • We are natural-born observers of others, with an innate ability to read verbal and non-verbal cues
  • The technique frees us from our own ego. Specifically: we can bypass any self-doubt regarding our own intentions; we needn’t worry about exposing views or intentions that we’d rather not admit to (even anonymously); and we sidestep over-confidence bias – behavioural science tells us that we tend to be over-confident about our own intentions, but more realistic when judging the intentions of others

I think it follows that we should seriously consider putting this approach into active market research practice. Predictive questions are part of the market research furniture, despite their attendant problems. Even research buyers who are aware of the pitfalls include them, because marketing and brand teams everywhere love to forecast and set KPIs! I can’t see that changing, so if we can do it better – however achieved – then I think we should.

Any and all feedback welcome!

¹ Google Surveys limits answer lists to seven items only, hence the ‘SNP / Plaid Cymru’ and ‘Other’ groupings.