Why Are We Suddenly So Bad At Predicting the Future?

Imagine that a monkey got into the control room of the universe and spent the year clicking random buttons. Imagine him hopping about on the ‘one musician less’ button, stomping on the ‘auto-destruct’ lever and gurgling while he thumped repeatedly on the ‘introduce chaos’ switch. Close your eyes and picture him dropping a giant poo on the bright red panel marked ‘do not touch under any circumstances’. That, my friends, is the only way to think about 2016 – after all, it was the Year of the Monkey in the Chinese zodiac. It was the year when rational thinking took a beating, when meritocracy became a bad word, when liberalism escaped from the battlefield to a cave in the mountains to lick its wounds. And, not surprisingly, a year when projections, predictions and polls made as much sense in the real world as an episode of Game of Thrones on steroids.
Given that much of our lives are spent productively engaging with the future, making big and small decisions based on assumptions about how it might unfold, this last point matters for more than the schadenfreude of laughing at pollsters and would-be intellectuals. The present passes too quickly, so every decision you’ve ever made in your life counts on future events turning out in ways that are favourable. Getting this wrong is therefore injurious to health, to put it mildly. And yet our ability to predict the future has never been under such a cloud in living memory. Why is this so?

Fundamentally, we’re wired to think linearly in time, space and even line of sight. We are taught compound interest, but we get it intellectually rather than viscerally. When you first encountered the classic rice-grains-and-chessboard problem, as a smart person, you knew it would be a big number, but hand on heart, can you say you got the order of magnitude right? That the total amount of rice on the chessboard would be roughly a thousand times the world’s rice production of 2010? Approximately 461,168,602,000 metric tons? This compounding of effects is incredibly hard to truly appreciate, even before you start to factor in all the myriad issues that bump the rate of change up or down, or the moments when the curve hits a point of inflexion. The Bill Gates quote – ‘we over-estimate the impact of technology in 2 years, and under-estimate the impact over 10’ – is a direct reframing of this inability to think in a compound manner.
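The chessboard arithmetic is easy to check in a few lines of code. Here is a minimal sketch; the physical constants are my assumptions, not the author’s (a grain of rice weighing about 25 mg, and 2010 world milled rice production of roughly 450 million metric tons):

```python
# One grain on the first square, doubling on each of the 64 squares:
# total grains = 1 + 2 + 4 + ... + 2**63 = 2**64 - 1.
GRAIN_WEIGHT_MG = 25                  # assumed weight of one grain of rice
WORLD_PRODUCTION_TONS_2010 = 450e6    # assumed 2010 milled rice output, tons

total_grains = sum(2**square for square in range(64))
total_tons = total_grains * GRAIN_WEIGHT_MG / 1e9   # mg -> metric tons

print(f"grains: {total_grains:,}")    # 18,446,744,073,709,551,615
print(f"tons:   {total_tons:,.0f}")   # 461,168,601,843
print(f"multiple of 2010 output: {total_tons / WORLD_PRODUCTION_TONS_2010:.0f}x")
```

The point of running it is the visceral shock: the doubling that starts with a single grain ends at roughly a thousand years’ worth of global production, which is exactly the order-of-magnitude intuition we tend to get wrong.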

Then there’s the matter of space and line of sight. The way the future unfolds is dramatically shaped by network effects. The progress of an idea depends on its cross-fertilisation across fields, geographies and disciplines, across any number of people, networks and collaborations. These collaborations can be engineered up to a point, or are the result of fortuitous clustering of minds. In his book ‘Linked’, Albert-László Barabási talks about the mathematician Paul Erdős, who spent his life nomadically, travelling from one associate’s home to another discussing mathematics and, fittingly, network theory. Not surprisingly, a lifestyle also practiced for many years by a young Bob Dylan, if you substitute music for mathematics. Or consider the story of the serial entrepreneur in the Rhineland in the 1400s, as told by Steven Johnson in ‘Where Good Ideas Come From’. Having failed with a business in mirrors, he was working in the wine industry, where the mechanical pressing of grapes had transformed the economics of winemaking. He took the wine press and married it with a Chinese invention – movable type – to create the world’s first printing press. His name, of course, was Johannes Gutenberg. Leaps like these are not easy to predict, not just because of the discontinuity they represent (more on that later), but also because of these network effects. Our education system blinkers us into compartmentalised thinking which stays with us through our lives. Long ago, a student of my mother’s once answered a question about the boiling point of water by saying “in Chemistry, it’s 100 degrees Centigrade, but in Physics, I’m not sure”. We are trained to be specialists, becoming narrower and narrower as we progress through our academic careers, ending up more or less as stereotypes of our profession. Yet human progress is driven by thousands of these networked, collaborative, and often serendipitous leaps.
And we live today in a world of ever-expanding connections, so it’s not surprising that we have fallen significantly behind in our ability to understand how these network effects play out.

If you want to study the way we typically make predictions, you should look no further than sport. In the UK, football is a year-round sport, so there are games every weekend for nine months and midweek for half the year. And with gambling being legal, there is an entire industry around football betting. Yet the average punter, fan or journalist makes predictions which are at best wilfully lazy. There is an apocryphal story about our two favourite fictitious sardars – Santa Singh and Banta Singh – who decide to fly a plane. Santa, the pilot, asks Banta, the co-pilot, to check if the indicators are working. Banta looks out over the wing and says “yes they are, no they aren’t, yes they are, no they aren’t…” – this is how a lot of predictions are made in the world of Premier League football today. Any team that loses three games is immediately in a ‘crisis’, while a team that wins a couple of games is deemed to be on its way to glory. Alan Hansen, an otherwise insightful pundit and former great player, will always be remembered for one comment – “You can’t win anything with kids” – which he made after watching a young Manchester United side lose to Aston Villa at the start of the 1995-96 season. Manchester United, of course, went on to win the title that season and dominate the league for the next decade and a half. Nobody predicted a Leicester City win in 2016 either, but win they did. The continuous and vertiginous increase in TV income for football clubs has levelled the playing field when it comes to global scouting networks, so a great player can pop up in any team and surprise the league. Yet we find it hard to see the underlying trends, and often find ourselves guilty of treating incidents as trends.

The opposite, amazingly, is also true. We are so caught up with trends that we don’t factor in the kinks in the curve – or, to use Steve Jobs’ phrase, the ding in the universe. You could say that an iPhone-like device was sure to come along sooner or later. But given the state of the market – with Nokia’s dominance and 40% global market share – you would have bet your house on Nokia producing the next breakthrough device. Nobody saw the iPhone coming, but when it did it created a discontinuous change that rippled across almost every industry over the next decade. The thing is, we like trends. Trends are rational, and they form a kind of reassuring continuity, so that events fit our narratives, which in turn reaffirm our world view. And unless we’re close to the event, or perennial change-seekers and nomads ourselves, it’s hard to think of counter-cyclical events. It’s now easy to see how, in 2016, we were so caught up in the narrative of progressive liberalisation and the unstoppable path to globalisation that we failed to spot the counter-cyclical events and cues that were right there in our path.

In fact, there are any number of cognitive biases we are guilty of on an everyday basis. This article just lists a dozen of them. My favourites in this list are the confirmation bias and the negativity bias, both of which are exacerbated by social and digital media. While social media has led us into echo chambers – the hallmark of 2016 – our confirmation bias is also accentuated by our ability to choose whichever media we want to consume in the digital world, where access is the easy part. Similarly, bad news spreads faster on social networks and digital media today than at any time in history. Is it possible that, despite knowing and guarding against these biases in the past, we’ve been caught out by spikes in the impact and incidence of a couple of them in the digital environment we live in today?
To be fair, not everybody got everything wrong. Plenty of people I know called the Donald Trump victory early in the game. And among others, John Battelle got more than his share of predictions right. There is no reason to believe that 2017 will be any less volatile or unpredictable than 2016, but will our ability to deal with that volatility improve? One of the more cynical tricks of the prediction game is to make lots of predictions on many different occasions; people won’t remember all your bad calls, but you can pick out the ones you got right at leisure! This is your chance, then, to make your predictions for 2017. Be bold, be counter-cyclical. And shout it out! Don’t be demure. The monkey is history, after all. This is the Year of the Rooster!

Revolution By A Thousand Digital Cuts

You can say that with every new wave of technology, the format, structure and even the content of media changes. Or you can simply say that the medium is the message. Either way, publishing just isn’t what it used to be. Here’s my wide-angle view of some of the many changes bubbling through the publishing world.

Self-publishing in all its many forms continues to flourish and grow. Just when you thought that between WordPress and Blogspot that was all there was to it, you now have a raft of new platforms, including Medium and Hi. I’m still working out the real value of these tools, beyond the fact that they are beautifully laid out and easy to set up for the infrequent writer. I don’t yet feel that the writing on these platforms is inherently better than that on any other blogging platform. But clearly there’s a big market for self-expression, and that is really a good thing.

If blogging is the selfie of publishing, then social media must be its new mirror. As much as social media is a means of connecting with others, in an increasingly perverse way it also seems to represent how we see ourselves through their eyes. This kind of voyeuristic narcissism is reflected in many ways, but none more so than the strange and growing trend of self-categorisation through arbitrary and meaningless ontologies – abbreviated for the purpose of this discussion as ‘scamo’. Facebook is full of scamos. What colour is your aura? Which Downton Abbey character are you? Which superhero are you? Which city are you? Where will it end? Which Lego piece are you? (I got the flat green base.) Which paper size are you? (I got A3 – gutted!) Which Bollywood item girl are you? (I got Yana Gupta, but I can’t see the resemblance.) And definitely, which type of common cold virus are you? (I got adenovirus – so distinguished!) See how easy it is? You can make your own lists up as you go. You can see where this will end, right? Which scamo are you?

The disturbing trends don’t end there. I pray every morning that the phrase “what happens next will xxxx you” (insert appropriate transitive verb) could be banned from headlines. “This surgeon started to operate on his patient. What happened next will amaze you!” “This man tried to drink printing ink. What happened next will stun you.” “This man tried to click on a link in Facebook. What happened next will baffle you.” But of course, in the click economy, it’s all fair game. As long as it can make a few (million) people click on the link, it’s a successful headline.

To be fair, the original story headlines were competing with other headlines in the same newspaper, because typically you would already have bought, or be reading, that paper. Or at best your lead story would be competing with the main headlines of other papers. Today you’re competing with a stream-of-consciousness flow of headlines, curated, ad-inserted, thrust upon you by friends and family, and wished upon yourself by those links you clicked unsuspectingly last week. Forget 15 minutes of fame – each headline gets about 1.5 seconds before it’s history in your eyes.

Facebook and Twitter have definitely become strong sources of news curation for me, much more so than TV or any single media organisation. But in the lead are still aggregation apps such as Flipboard and Zite. I was quite gutted to learn that Zite had sold itself to Flipboard, as the two represented very distinct approaches, both valuable, and I fret that the result will be a bit of both but neither at its optimum. Circa is quite interesting (thanks @maria_axente!) because it allows you to track a single story as it evolves.

But then you have tools like Paper.li and others which allow the aggregation itself to be automated in a democratised format. This is basically software eating media eating software. Pretty soon the only real value will lie in original content. This is why players like the Economist, the New York Times or HBR are likely to have long lives: they are effectively becoming the HBOs of the publishing space. Everybody else is aggregating, sorting, distributing, dicing and slicing. It’s worth pointing out that players like Business Insider, Outbrain and Bleacher Report are trying hard to build businesses that truly exploit the new distribution in different ways. But while Business Insider, which was founded by Henry Blodget and counts Bezos as an investor, does focus on good and original content, the others seem to want to flood you with content under those titillating headlines, so they can financially surf the click economy.
One of the outcomes of the smartphone-enabled, Instagram- and Facebook-fuelled environment we live in is that I’ve come to expect pictures wherever a picture is required. As a consequence, I struggle through long descriptive paragraphs in books. I appreciate that it would have been hard for Dickens to simply Instagram a picture of foggy London, and perhaps we’re the better for it. But with the increasing democratisation of the tools and knowledge of image and video creation, I think a new wave of storytelling is just around the corner. A book created for electronic consumption should have pictures, videos, hyperlinks and more, all creatively and effectively used to enhance the storytelling. This is not to denounce the metaphor as a bedrock of great content, but to provide an experience that truly exploits the medium. Just imagine Gibson’s Neuromancer told through this kind of multimedia!

Certainly, there’s plenty of excitement and enough funding for new ventures in news and publishing, as this piece from the FT points out. There is ongoing innovation, as start-ups like Blendle demonstrate. I personally think we are just at the point where we start to appreciate the value of news as distinct from entertainment, and stop clubbing the two business models together. Certainly, when ‘rock star’ geeks such as Bezos and Nate Silver are getting into the game, there must be plenty left to play for!

In the meantime, I’m off to post my selfie to Facebook and Twitter, so I can see what others say about it, and figure out which mythically egocentric character I resemble most.