Prediction or projection? How bias skews data

Many of us will usher 2016 out of the door with a sigh of relief. The year has been a tumultuous one, and its two biggest political forecasts failed to come to pass. In June, the UK shocked the world by voting to leave the EU, and political pundits were left nursing their wounds this month after Donald Trump’s victory in the US election.

The statistician Nate Silver, who has built a successful business around his forecasting website FiveThirtyEight, correctly predicted the previous two US presidential elections. But on this one – and on the primaries – he got it wrong.

To be sure, Silver’s model built in more uncertainty about the outcome of the presidential campaign than most. While mainstream news outlets like The New York Times and The Washington Post gave Clinton an almost certain chance of winning, FiveThirtyEight pointed out that the race was close – too close to be 99% sure of a Clinton win.

Nonetheless, the events of the past few months have been a reality check for prediction. It’s tempting to believe in a reassuring world where algorithms and data can provide all the answers. But as we discussed in our recent article on workplace prediction, data is valuable only when we stay wary of this way of thinking.

Even when statisticians build objective, data-driven models, human bias can still creep in. Ironically, given that predictive analytics is designed to reduce human error, algorithms and forecasts can end up reflecting the biases of their creators.

“Software making decisions based on data can reflect, or even amplify, the results of historical discrimination” – The Atlantic.

Looking at Brexit and the US election, it’s likely that cognitive bias played a part in polling inaccuracies. Humans are easily swayed, often subconsciously, and this can lead to beliefs being shaped by the ‘bubble’ surrounding them.

As well as influencing voters, this can distort political forecasts. Nate Silver’s colleague Harry Enten calls this ‘herding’: pollsters produce results that are suspiciously similar to one another’s, but in effect they are ‘the blind leading the blind’ – as the sketch below illustrates.
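To make the idea concrete, here is a minimal Python sketch of herding. It is purely illustrative: the true support level, the sample size, the biased first poll and the ‘shade halfway towards the consensus’ rule are all invented assumptions, not a description of how any real pollster operates.

```python
import random
import statistics

# Illustrative toy model of pollster 'herding' - all numbers invented.
random.seed(42)

TRUE_SUPPORT = 0.48   # hypothetical true share for one candidate
SAMPLE_SIZE = 1000    # respondents per poll
NUM_POLLS = 20

def run_poll(bias=0.0):
    """Sample voters and report the observed share, plus any house bias."""
    hits = sum(random.random() < TRUE_SUPPORT for _ in range(SAMPLE_SIZE))
    return hits / SAMPLE_SIZE + bias

# An early, influential poll overstates support by three points.
first = run_poll(bias=0.03)

# Independent pollsters ignore each other; averaging washes the outlier out.
independent = [first] + [run_poll() for _ in range(NUM_POLLS - 1)]

# Herding pollsters shade each new result halfway towards the running
# consensus, so the early error gets baked into every later poll.
herded = [first]
for _ in range(NUM_POLLS - 1):
    consensus = statistics.mean(herded)
    herded.append(0.5 * run_poll() + 0.5 * consensus)

print(f"true support:        {TRUE_SUPPORT:.3f}")
print(f"independent average: {statistics.mean(independent):.3f}")
print(f"herded average:      {statistics.mean(herded):.3f}")
print(f"independent spread:  {statistics.stdev(independent):.4f}")
print(f"herded spread:       {statistics.stdev(herded):.4f}")
# Typical run: the herded polls show a much smaller spread, yet their
# average inherits part of the first poll's error - consensus is not
# the same as accuracy.
```

In a typical run, the herded polls agree with each other far more closely than the independent ones, while their average stays anchored to the first poll’s error – exactly the false confidence Enten warns about.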

And that’s before we consider all the other human and statistical factors that contribute to polling inaccuracy – the lack of a truly random sample, low response rates, margins of error, voter unpredictability and last-minute changes of mind.
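One of those factors is easy to quantify. Here is the textbook 95% margin of error for a simple random sample – a hedged illustration only, since the sample size of 1,000 is an assumed, typical figure and real polls use weighting that makes their effective uncertainty larger:

```python
import math

# Textbook 95% margin of error for a simple random sample, at the
# worst case p = 0.5. Illustrative only: real polls weight their
# samples, so the true uncertainty is larger than this.
def margin_of_error(n, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

n = 1000  # assumed, typical national poll size
print(f"n = {n}: margin of error = +/-{margin_of_error(n) * 100:.1f} points")
# n = 1000: margin of error = +/-3.1 points
```

In other words, a candidate’s two-point ‘lead’ in a single 1,000-person poll is well within sampling noise – before any of the human factors above are counted.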

In short, there are few straightforward answers, but one thing is certain: humans can’t resist trying to predict what will happen. We don’t cope well with uncertainty, so I’ll leave you with one way to approach the future:

“I know that history is going to be dominated by an improbable event, I just don’t know what that event will be.” – Nassim Nicholas Taleb, The Black Swan: The Impact of the Highly Improbable.

Want to read more thought-provoking articles on employee engagement, data and workplace trends? Request a copy of thinkBox, our annual publication, by getting in touch with us at thinkBox@karianandbox.com

Posted on 30th November 2016 in Insight