Why Are We Suddenly So Bad At Predicting the Future?

Imagine that a monkey got into the control room of the universe and spent the year clicking random buttons. Imagine him hopping about on the ‘one musician less’ button, stomping on the ‘auto-destruct’ lever and gurgling as he thumped repeatedly on the ‘introduce chaos’ switch. Close your eyes and picture him dropping a giant poo on the bright red panel marked ‘do not touch under any circumstances’. That, my friends, is the only way to think about 2016 – after all, it was the year of the monkey in the Chinese zodiac. It was the year when rational thinking took a beating, when meritocracy became a bad word, and when liberalism escaped from the battlefield to a cave in the mountains to lick its wounds. Not surprisingly, it was also a year when projections, predictions and polls made as much sense in the real world as an episode of Game of Thrones on steroids.
Given that much of our lives is spent productively engaging with the future – making decisions, big and small, on the strength of how we expect it to unfold – this last point matters more than the mere schadenfreude of laughing at pollsters and would-be intellectuals. The present passes too quickly, so every decision you’ve ever made in your life counts on future events turning out favourably. Getting this wrong is therefore injurious to health, to put it mildly. And yet our ability to predict the future has never been under such a cloud in living memory. Why is this so?

Fundamentally, we’re wired to think linearly in time, space and even line of sight. We are taught compound interest, but we grasp it intellectually rather than viscerally. When you first encountered the classic rice-grains-and-chessboard problem, you knew, as a smart person, that the answer would be a big number – but hand on heart, can you say you got the order of magnitude right? That the total amount of rice on the chessboard – approximately 461 billion metric tons – would be several hundred times the world’s entire rice production of 2010? This compounding of effects is incredibly hard to truly appreciate, even before you start to factor in all the myriad issues that bump the rate of change up or down, or the moments when the curve hits a point of inflexion. The Bill Gates quote – ‘we over-estimate the impact of technology in 2 years, and under-estimate the impact over 10’ – is a direct reframing of this inability to think in a compound manner.
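A quick back-of-the-envelope calculation makes the scale visceral. The sketch below assumes a grain mass of about 25 mg and 2010 world rice production of roughly 700 million tonnes – round numbers of my own, not figures from this post:

```python
# Rice-and-chessboard: one grain on the first square, doubling on each
# of the 64 squares. The total is the geometric series 1 + 2 + ... + 2^63.
total_grains = sum(2**square for square in range(64))
assert total_grains == 2**64 - 1  # closed form of the series

# Assumed figures (not from the post): ~25 mg per grain of rice,
# ~700 million tonnes of world rice production in 2010.
GRAIN_MASS_TONNES = 25e-9      # 25 mg expressed in metric tonnes
WORLD_PRODUCTION_2010 = 700e6  # tonnes, approximate

total_tonnes = total_grains * GRAIN_MASS_TONNES
print(f"{total_grains:,} grains")
print(f"≈ {total_tonnes:,.0f} tonnes")
print(f"≈ {total_tonnes / WORLD_PRODUCTION_2010:,.0f}x the 2010 world harvest")
```

Doubling 63 times turns a single grain into about 18.4 quintillion grains – roughly 461 billion tonnes, or several hundred years’ worth of 2010-level production.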

Then there’s the matter of space and line of sight. The way the future unfolds is dramatically shaped by network effects. The progress of an idea depends on its cross-fertilisation across fields, geographies and disciplines, across any number of people, networks and collaborations. These collaborations can be engineered up to a point, or are the result of a fortuitous clustering of minds. In his book ‘Linked’, Albert-László Barabási describes the mathematician Paul Erdős, who lived nomadically, travelling from one associate’s home to another discussing mathematics and, fittingly, network theory. Not surprisingly, it was a lifestyle also practised for many years by a young Bob Dylan, if you substitute music for mathematics. Or consider the story of the serial entrepreneur in the Rhineland in the 1400s, as told by Steven Johnson in ‘Where Good Ideas Come From’. Having failed with a business in mirrors, he was working in the wine industry, where the mechanical pressing of grapes had transformed the economics of winemaking. He took the wine press and married it with a Chinese invention – movable type – to create the world’s first printing press. His name, of course, was Johannes Gutenberg. Leaps like this are hard to predict, not just because of the discontinuity they represent (more on that later), but also because of these networked effects.

Our education system blinkers us into compartmentalised thinking that stays with us through our lives. Long ago, a student of my mother’s answered a question about the boiling point of water by saying “in Chemistry, it’s 100 degrees Centigrade, but in Physics, I’m not sure”. We are trained to be specialists, becoming narrower and narrower as we progress through our academic careers, ending up more or less as stereotypes of our profession. Yet human progress is driven by thousands of these networked, collaborative and often serendipitous connections.
And we live today in a world of ever-expanding connections, so it’s not surprising that we have fallen significantly behind in our ability to understand how network effects play out.

If you want to study the way we typically make predictions, look no further than sport. In the UK, football runs year-round: there are games every weekend for nine months, plus midweek fixtures for half the year. And with gambling being legal, an entire industry has grown up around football betting. Yet the average punter, fan or journalist makes predictions that are at best wilfully lazy. There is an apocryphal story about our two favourite fictitious sardars, Santa Singh and Banta Singh, who decide to fly a plane. Santa, the pilot, asks Banta, the co-pilot, to check whether the indicators are working. Banta looks out over the wing and says “yes they are, no they aren’t, yes they are, no they aren’t…” – and this is how a lot of predictions are made in Premier League football today. Any team that loses three games is immediately in a ‘crisis’, while a team that wins a couple is deemed to be on its way to glory. Alan Hansen, a former great player and otherwise insightful pundit, will always be remembered for one comment – “You can’t win anything with kids” – made after watching a young Manchester United side lose to Aston Villa at the start of the 1995–96 season. Manchester United, of course, went on to win the league that season and dominate it for the next decade and a half. And nobody predicted Leicester City’s title in 2016, but win it they did. The continuous and vertiginous increase in TV income for football clubs has produced a relatively level playing field in global scouting, so a great player can pop up at any club and surprise the league. Yet we find it all too easy to ignore the underlying trends, and often find ourselves guilty of treating incidents as trends.

The opposite, amazingly, is also true. We get so caught up with trends that we don’t factor in the kinks in the curve – or, to use Steve Jobs’ phrase, the ding in the universe. You could argue that an iPhone-like device was bound to come along sooner or later. But given the state of the market, with Nokia’s dominance and 40% global market share, you would have bet your house on Nokia producing the next breakthrough device. Nobody saw the iPhone coming, yet when it arrived it created a discontinuous change that rippled across almost every industry over the next decade. The thing is, we like trends. Trends are rational, and they form a reassuring continuity that lets events fit our narratives, which in turn reaffirm our world view. Unless we’re close to the event, or are perennial change-seekers and nomads ourselves, it’s hard to think of counter-cyclical events. It’s now easy to see how in 2016 we were so caught up in the narrative of progressive liberalisation and the unstoppable march of globalisation that we failed to spot the counter-cyclical events and cues that lay right there in our path.

In fact, there are any number of cognitive biases we are guilty of on an everyday basis – this article lists a dozen of them. My favourites in the list are the confirmation bias and the negativity bias, both of which are exacerbated by social and digital media. Social media has led us into the echo chambers that were the hallmark of 2016, while our confirmation bias is accentuated by our ability to consume whichever media we choose in a digital world where access is the easy part. Similarly, bad news spreads faster on social networks and digital media today than at any time in history. Is it possible that, despite knowing about and guarding against these biases in the past, we’ve been caught out by spikes in the impact and incidence of a couple of them in the digital environment we now live in?
To be fair, not everybody got everything wrong. Plenty of people I know called the Donald Trump victory early in the game, and amongst others, John Battelle got more than his share of predictions right. There is no reason to believe that 2017 will be any less volatile or unpredictable than 2016, but will our ability to deal with that volatility improve? One of the more cynical tricks of the prediction game is to make lots of predictions on many different occasions: people won’t remember all your bad calls, and you can pick out the ones you got right at leisure! This is your chance, then, to make your predictions for 2017. Be bold, be counter-cyclical. And shout it out – don’t be demure. The monkey is history, after all. This is the year of the rooster!

Digi-Tally: Through My Digital Viewfinder, Jan 21, 2015

This is still the week of the CES after-glow, and there’s a lot of reflection in the media about what we saw at the show. In a nutshell, as summarised in the TWiT podcast: no tablets, and more cars. Autonomous vehicles are one of the areas that have moved forward quicker than most of us anticipated, and they may have great positive externalities by enabling a sharing economy for transport. In much the same way as the word “television” has been redefined in the past decade, we may be entering a similarly transformative phase for ‘automobiles’. It may well be a reaffirmation of the name – a self-driving car is, after all, truly auto-mobile. But increasingly we may start to see the car as a network, a node on a larger network, or a collection of smart (and interchangeable) components. Meanwhile, the broader IoT examples keep mushrooming, and we’ll no doubt be exposed to weird and wonderful demonstrations of the power of the Internet of Things – from smart security to home automation, and from wearable health and wellness monitors to self-managed environments.

It’s also been the week for spotlighting the great transition between technology eras. As we move from the PC and desktop era into the untethered, wireless, mobile and ubiquitous computing era, the struggles of Intel and IBM, amongst the behemoths of the ’90s and ’00s, are sharply in focus. Intel shipped over 100m chips, but remain dramatically dependent on the shrinking PC market. They’ve made an entry into the wearables, tablets and sensors space (interestingly, by acquiring a Chinese firm), but the numbers are still small and analysts aren’t convinced yet. IBM have just announced an 11th straight quarter of declining revenues. The slowdown is precipitated by the hardware business, with the digital arms, including mobile, security and cloud, showing strong growth but from very low bases. Overall, 16% growth in “digital” is probably not good enough, and the combined weight of 27% is impressive, but you sense that the bits that are big aren’t growing fast enough, and the parts that are growing well aren’t big enough, to create an overall positive outlook for IBM just yet. At Cognizant, we often speak about the shift of “S-curves” we are currently in, from the Web-era S-curve (dominated by fixed-line connectivity and PCs) to a digital S-curve, dominated by ubiquitous nano-computing and wireless connectivity. Intel’s and IBM’s challenges are symptomatic of the difficulty of carrying success from one wave to the next. But to state the blindingly obvious, they will not be alone. How will your business make this jump?

I continue to believe that 2015 will be a year of digital infrastructure – broadly speaking, cloud, middleware and security, for most large enterprises. Of these, security has been much in the spotlight of late, with Sony obviously being the most high-profile victim. But arguably, despite the political intrigue and the alleged involvement of North Korea, the hacking of US Central Command should be the more awkward incident, geopolitically speaking. This list of famous hacks from The Telegraph has some fascinating nuggets, from the unintentional Morris worm (Morris is now a professor at MIT) to the Target and Sony hacks of the last 12 months. Two trends stand out. The first is the increasingly political colour of the hacks, indicating that this is now a serious form of warfare and international espionage. The second is the simplicity of many of them: DDoS and phishing aren’t particularly sophisticated attacks, but they indicate that we humans often represent the biggest threats and weakest links in the security environments of our organisations.

The HBR carried this great example of the success of Nordstrom’s digital strategy. I think all success stories tend to get over-simplified to an impractical level in our hunt for an easy formula. Usually there are dozens, if not hundreds, of things that need to go right for a major project to work well, and it only takes a few of them going wrong to produce a limited result or an outright failure – which is why we have many more failures than successes, of course. So while I agree with the arguments in this piece, I would hesitate to treat them as a necessary and sufficient condition for digital success. Nordstrom’s strategy comprises a focus on customer experience and the extensive use of digital (SMACIT) tools across the length and breadth of the business, to effectively create a new business model. As always, both God and the devil lie in the details.

And what should we make of Google’s change of tack on Google Glass? It was initially interpreted as Google pulling the plug on a venture with mixed success, which it has a history of doing. But it seems apparent now that Google are taking a leaf out of Apple’s book and going design-first. By handing the product to Tony Fadell, of Nest and iPod fame, Google seem to be acknowledging that the technology (which works) needs to be nested inside a highly usable, and ideally beautiful, product. This is hardly a revelation, but if this is indeed the thinking, then it’s wonderful to see Google, the spiritual home of engineers, acknowledge the role of design and user experience.

Also at CES, there was much buzz about more wearables – watches from Sony and HTC, among other devices. Smart watches look like being the wearable de l’année, but the hunt for the killer app is still on. Any guesses? What would you use a smart watch for? What problem could you solve, or what wonderful new benefit could you imagine? Like many others now, I don’t wear a watch to begin with, so it would take a compelling benefit to make me wear one again (one more device to manage!).

It would be remiss of me not to mention this video from Ola Cabs in India, which a colleague kindly sent me. It’s refreshing to see such a stark focus on user experience from an engineering point of view, rather than design alone. Anybody looking to build a product should see it.

And finally, on a lighter note: this set of maps is yet another example of the emotive power of data in our lives. My favourite is the first map, on second languages spoken in the boroughs of London. Amongst other things, it shows you the patterns of immigration and the abundance of Indian and Polish people in London. Maybe there needs to be a new alliance of the IPOs (people of Indian and Polish Origins) – a microcosm of a geopolitical shift, a trading bloc and a platform for cultural enrichment hitherto overlooked. I mean, all this technology, data and understanding should bring us closer, right?