Ranting At The C-Word!

So…

I was at a Creative Industries event organised by the Technology Strategy Board yesterday. An event designed to bring together the creative and technology disciplines. And one of the many issues touched upon was the need to break out of the ‘creative industries’ straitjacket and to explore creative roles and jobs in other industries. Which got me thinking about one of my favourite subjects:

The rampant misuse of the C-word. 

In fact, it’s a very close call as to which word is more bastardised: ‘creative’ or ‘innovation’. But today I’m talking about the former. What does the word mean to you? Is it a skill? Is it a profession? Is it a role? Is it perhaps just a trait?

For many years I have experienced at close quarters the clichéd and lazy labelling of creative work within the technology industry. This usually assumes that (a) all ‘creatives’ are the same, making no distinction between, for example, architectural or conceptual skills and execution or tool-level skills; and (b) phrases like ‘UX’, ‘UI’, ‘creative’, ‘design’ and anything else which smells of visual design are all synonyms and can be used interchangeably.

This isn’t one-sided. This kind of apathetic adjectivisation is common even among creative communities when it comes to others. You’re a techie, or a suit. Or you’re a quant… you need classifying, not for your own good but so that people can pigeonhole you in their mental cubicles.

I like to think of creativity as a trait: a problem-solving capability which looks at the same problem differently. By this definition Newton, Gauss and Einstein would be among the most creative people the world has known. Serial inventors, entrepreneurs and technologists are creative. The best enterprise or solution architects in technology companies demonstrate this kind of creativity on an everyday basis.

What we commonly call ‘creative’ is actually a reference to either design or art. And again, these are two very distinct worlds. Design is a science, a discipline with a clear deliverable against a defined need. You design a house, a doorknob, a website or a logo with a view to delivering some fairly clearly defined objectives. Consequently, good design often depends on the effective articulation of those objectives.

Art is also a discipline, and involves technique and often structure, but by definition the objective of art is defined by the creator, and as such it becomes a form of expression. The creator might want to make a point, raise an issue or support a cause, but equally, she might pander to a whim, be overcome by a subliminal urge, be provocatively abstract or seek no meaning at all.

You can, therefore, have creative people who have nothing to do with design or art. You can have designers who are good but not particularly creative. Unusual but true. You can have great artists who would make bad designers and vice versa. Usually on the basis of their willingness or ability to work within the structure of a brief and the tyranny of an objective.

If you wrote a book with the purpose of making money, creating a bestseller or selling a movie script, you would be designing. If you wrote a book that wasn’t governed by the outcome, you would be an artist. You might be a great wordsmith but poor at plots; those are the techniques you need in either discipline.

Advertising, therefore, is more design than art. Except for those wonderful ads which are brilliant, except that you can’t remember the product or brand. That’s good art, not good design. Or those ads which are made for awards. Or perhaps that, too, should be called design.

In the context of film-making, who is the creative brain? Definitely the director, and to a lesser but important extent, the editor, the cinematographer and the choreographer, to name a few. But usually not the actors. They are usually following the director’s brief. They are, in effect, designing a performance.

Of course, I’m stretching a point. In every one of these examples, there is a need for creativity and artistic expression, which may well make the difference between good and mediocre, and between making history and being history. I’m simply driving these giant imaginary wedges between art, design, and creativity, to make the point bluntly.

And to state the blindingly obvious, design is greatly enhanced by creativity. You only have to look at some of the best design to see the magic touch of a creative insight or treatment. This chair, this ambulance redesign, this wheelchair, this plug, and this folding wheel, all have a creative spine which makes them stand taller than their peers. It’s just important to distinguish between the terms for better results, especially when you’re in the results (read: design) business. 

And so, every time I see people at work lumping these terms together, I bite my tongue and stop my fingers from typing that shouty email. The irony, of course, is that when you’re designing a mobile app, which is a highly constrained experience in so many ways, the creativity often needs to come from the technologist, and the discipline from the designer.

As to art, you can always find it in the gallery. 

Surviving The Information Age

I was recently asked (as I often am), “what devices will we use in x years?” Usually x is 2 years, 5 years or 10 years, depending on the ambition of the question. This is often a cue for animated debates about Google Glass, Apple Watches and the next big thing in wearable computing, and whether you would want a phone chip installed in your ear. This kind of argument, whichever flavour of device you’re rooting for, completely misses the point about the real challenge facing us: how to survive the information age. In fact, to stretch a point, it is like being faced with an energy crisis and arguing about whether the batteries we use should be square or round.
 
I am personally petrified of how ill-equipped I am to deal with this information-driven era we increasingly find ourselves in. By all apparent measures, you would think I’m reasonably information savvy. All my files are on Dropbox and accessible over the cloud. My phones are always backed up. My photographs are on Flickr, Picasa or Apple’s Photo Stream. My music is on Spotify or iTunes. I use Google Docs extensively to collaborate professionally. I maintain 4 different levels of passwords to keep my data safe. And yet, I would give myself a 3 out of 10 in terms of being ready for the information world.
 
In my last blog I touched on the problem of “Dom’s MacBook in Iran”. That was just one example. Professor Gerd Kortuem of Lancaster University’s High Wire programme spoke at a session I attended a few years ago about an example where little meters were added to the drills used by roadworks teams, in order to measure the level of vibration and alert the supervisor if it rose above safety limits. This created a huge problem for supervisors, as they now had new information they needed to act on. Earlier, they just took a gut call and that was it. Now they had to review the information and decide what to do if the readings were too high. Stop work? Look for alternatives? Inform the office? Increasingly, we find ourselves in possession of information which actually creates new challenges for decision making.
 
On the other hand, this superabundance of information creates a responsibility of its own. It is an act of negligence today not to do the due diligence on any important decision. Whether it’s researching a hotel you want to book for a holiday, or the person you are about to meet; whether you are unwell and need to know about the symptoms and possible causes, or you’re checking the nutritional values of the things you’re buying at the supermarket, small and big decisions are now made much easier by the availability of information. The bottom line is that you can make better decisions, and a series of better decisions should lead to a higher quality of life and work.
 
Which brings us to the first of my three key survival skills, which collectively explain why we aren’t all equally good at handling and using this information. I’m talking about the ability to find information effectively. How to use Google (or any other) search effectively. How to find information on lean thinking without being flooded by results for lean meat. Information search skills should be taught in primary school.
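To make that concrete, here is a minimal sketch, entirely my own illustration, of what searching effectively boils down to: exact-phrase quotes, exclusion terms and a site restriction. The operators are standard search syntax; the helper function and the example domain are assumptions for illustration only.

```python
# A small illustration of standard search operators: quotes for an exact
# phrase, a minus sign to exclude noise, and site: to restrict the source.
def build_query(phrase, exclude=None, site=None):
    """Assemble a search query string from a phrase, exclusions and a site filter."""
    parts = [f'"{phrase}"']                             # exact phrase match
    parts += [f"-{term}" for term in (exclude or [])]   # filter out the noise
    if site:
        parts.append(f"site:{site}")                    # restrict to one domain
    return " ".join(parts)

# Prints: "lean thinking" -meat -diet site:lean.org
print(build_query("lean thinking", exclude=["meat", "diet"], site="lean.org"))
```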
 
A personal peeve of mine is the phrase “new research shows…”, often heard on television news programmes. Usually it’s accompanied by fresh perspectives on whether something is or isn’t good for your health, running contrary to previously held ideas. However, very rarely are we told the source of the data, and more critically, who sponsored the research. We know well that it’s easy for interest groups to “create” research with favourable outcomes. So if research suggesting that broccoli can cure the common cold is backed by the Broccoli Growers Association, you would do well to dig deeper into the evidence. Mostly this kind of “newsworthy” research suffers from sponsorship bias, or often just news bias: the need to make a story. The recent BBC story about surviving on £1 per day is riddled with palpably bad research and poor homework, as is well documented here, but it made for a good story. And if you aren’t up to date with the Daily Mail’s ongoing obsession with things that cause and/or cure cancer, you can get a quick summary here.
 
The second information survival skill, therefore, which every 10-year-old should know today, is to be able to validate the source of data. You can get a slew of answers to almost any question on the internet. But which one do you trust, and how do you establish the quality of the source? Or remove sponsorship bias? Equally, when the fallback option for most people is Wikipedia, it’s important to note what is and isn’t best crowdsourced. To put it simply, in a quiz show, a question about a character on a soap is a good one for an audience poll. Not so much a question about, say, the isotopes of neon.
 
Sooner or later, that pesky question pops up again: what is information, and how does it differ from data? My favourite answer is context. Let’s take two people, Mary and Max. Max is navigating his way through a jungle, with no access to provisions except what he can find and eat in the jungle. Mary is playing football in a tournament and is about to take a penalty. Both are given the same two pieces of data. First, that most goalkeepers tend to dive to the left or right, so statistically, hitting a penalty straight down the middle has the highest chance of scoring. Second, that if you dig a hole in a muddy area, it takes 20 minutes for the sediment to settle and the water to become drinkable. Now, clearly, for Max and Mary, one of these pieces of data is information. The other is irrelevant. (To see the ludicrousness of information without context, see this Fry & Laurie sketch.)
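If you want the Max-and-Mary point in a nutshell, here is a toy sketch, purely my own illustration: the same facts sit in a pile of data, and the context filter is what turns one of them into information.

```python
# The same two facts are data for everyone; context decides which one is information.
facts = [
    {"context": "penalty",
     "tip": "Goalkeepers usually dive left or right; straight down the middle scores most often."},
    {"context": "survival",
     "tip": "Water in a freshly dug hole in muddy ground is drinkable after about 20 minutes of settling."},
]

def information_for(context):
    """Filter the raw data down to what is relevant in a given context."""
    return [f["tip"] for f in facts if f["context"] == context]

print(information_for("penalty"))    # information for Mary, noise for Max
print(information_for("survival"))   # information for Max, noise for Mary
```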
 
But Max and Mary might find themselves in each other’s shoes at some point. Will they still retain the “irrelevant” information they were given?
 
Which means that the third key survival skill is the ability to continuously build and reference your data gathering, so that your personal library and signposts enable you to marry information to context all the time. Our education system was historically built to provide information you had to store in your head and use for the rest of your life. That has obviously changed, but do you have a reliable library system to replace it? Plenty of tools (bit.ly, for example) enable tagging and marking of content, and a combination of ubiquitous access and smart devices makes this library always accessible. You are your own librarian. Pay attention to your filing system.
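For what it’s worth, the library system doesn’t need to be anything grander than tags you can search by later. Here is a hedged sketch, with made-up entries and a hypothetical URL, of the kind of filing that bit.ly, a bookmarking service or a notes app does for you.

```python
from collections import defaultdict

library = defaultdict(list)   # tag -> saved items

def save(url, note, tags):
    """File a piece of content under every tag you might search by later."""
    for tag in tags:
        library[tag].append({"url": url, "note": note})

def recall(tag):
    """Bring back everything filed under a tag - your signpost back to context."""
    return library[tag]

# Hypothetical entry, filed under two tags for later retrieval.
save("https://example.org/virtual-value-chain", "Exploiting the Virtual Value Chain",
     tags=["digital-trail", "strategy"])
print(recall("digital-trail"))
```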
 
To put this in an enterprise perspective, the role of all systems is to deliver the right information in the right context, whether it’s customer data to a salesperson or risk information to a project manager. This is the bottom line. Once you strip away all the IT jargon and the systems-speak, this is the simple objective. Every time your business doesn’t deliver the relevant information at the point of decision making, that’s an area of improvement for information technology.
 
But of course, you can only deliver the data you have, so data capture becomes the next challenge. The best source of enterprise data is transactions (and the worst are probably areas where employees are expected to make an extra effort just to provide the data). What’s very interesting is that as more and more activities go digital, we’re seeing the emergence of the digital trail that comes straight out of the transaction. 
 
Increasingly, every activity, from using an Oyster card on the Tube to a meter reading, and from checking your bank account to measuring your blood pressure, is a digital activity and leaves a digital trail. This trail of data is currently divergent and disaggregated, but if harnessed, it could be extremely powerful. Even within your business, the ability to mine the digital trail creates a new source of information. The HBR article Exploiting the Virtual Value Chain is a must-read to understand how this could work.
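To show what mining the digital trail can look like in the smallest possible way, here is a hedged sketch with invented records. Each touch-in is captured as a side effect of the journey itself, and a trivial aggregation turns the trail into information about behaviour; the card numbers, stations and field names are all assumptions for illustration.

```python
from collections import Counter
from datetime import datetime

# Invented Oyster-style touch-in records; in reality these fall out of the transaction itself.
trail = [
    {"card": "card-123", "station": "Holborn", "time": "2013-05-01T08:42"},
    {"card": "card-123", "station": "Holborn", "time": "2013-05-02T08:47"},
    {"card": "card-123", "station": "Bank",    "time": "2013-05-02T18:10"},
]

def usual_morning_station(records):
    """Infer the likely commute origin from nothing more than timestamps in the trail."""
    mornings = [r["station"] for r in records
                if datetime.fromisoformat(r["time"]).hour < 10]
    return Counter(mornings).most_common(1)[0][0]

print(usual_morning_station(trail))  # "Holborn": information mined from a by-product of travel
```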
 
Sometime in the late 90s, I read a very interesting story about a transport company in India which was having trouble tracking its vehicles as they made their way across the vast hinterland of the country, with no real communication system in place to track them. Some of the journeys were over 7 days long each way. The company hit upon the idea of doing a deal with a specific set of petrol pumps whereby, in exchange for the volume of business, the company got some benefits and information on every truck that refuelled at any of the stations. In one step, the company had created an information network which would report back on each truck whenever it refuelled. A great example of the digital trail at work.
 
The Hailo taxi application is another great example of this. Hailo digitises the process of calling a taxi and even paying for it. But it also creates a digital information trail which allows you to track which taxi you used and when. The company now markets this as a feature: you can always trace the cab if you’ve left something behind.
 
However, one of the great unanswered questions which is sure to be debated hotly over the next few years is about ownership of the information. As we generate digital trails about our purchases, our health, our travel, our energy consumption, this creates a huge and valuable information cloud. Who owns this?
 
In case you’re not convinced about how valuable information can be, consider the fact that smart meters can reveal detailed information about the devices and appliances being used at home, including when and for how long. Which in turn provides very meaningful clues about the lifestyles of the people in the house: how many there are, when they are at home and what they do there. Clearly this is a gold mine for marketers.
 
So is it us, as the individual owners of our own data? Is it each service provider? Will there be a role for an information intermediary who can hold our data and monetise it on our behalf? Who should this be? Google? The government? Cooperatives? Richard Seymour, founder of Seymour Powell, has an idea about a digital surrogate on “our side of the glass” which acts as the repository and identity manager for us. But there is also an aggregation value to the information which needs to be realised.
 
To summarise: as individuals, we need to build the three basic skills of finding information, verifying the source, and creating our own reference and library system. As companies, we need to tap into the digital trail of transactions, find creative ways of extracting value and meaning from it, and deliver information in the right context. Finally, as a society, we need to answer questions about the ownership, curation and exploitation of data at an individual and collective level. For me, these are the basics of survival in the information age.