I’ve just finished a rather excellent book, The Innovators by Walter Isaacson.
It takes a wonderfully long-term view of digital – starting with Charles Babbage and Ada Lovelace and ending with Larry Page and Sergey Brin in more recent times. I thoroughly recommend it.
At the end is a magnificent concluding chapter, identifying the lessons that the history of computing offers to those wanting to innovate themselves. Some of them may surprise you.
Firstly, creativity is a collaborative process. The notion of the lone innovator is essentially a myth. Individuals do have ideas, of course, but to make them happen they need to be part of a team. Solo brilliance just isn’t enough.
A good (or, rather, bad) example of this in the book is William Shockley, a brilliant mind who made several vital intellectual leaps in the development of the transistor, but whose personal style meant he could not take people with him and made him a poor collaborator.
The team needs to bring together people with different skills and complementary styles. One common arrangement pairs a visionary with a great operator – the two Steves at Apple are a classic example. Jobs supplied the vision; Wozniak did the engineering.
Another lesson from the book is that truly innovative ideas take root when nobody seeks the credit – whether as organisations or as individuals. The internet is a great example of this. When Vint Cerf and his colleagues were writing the specifications for the protocols that would establish the internet, they called them ‘Requests for Comments’ rather than anything more formal. It made all those involved feel ownership of the process and so encouraged them to get behind the standards that emerged.
Thirdly, money is not always the primary motivation in digital innovation. The digital revolution was spurred on by three main groups – government, private enterprise and communities – and each was as important as the others in the story.
The US government in particular, through the military, provided the funding for the research that developed new ideas. Private enterprises then sprang up to turn those ideas into products. Meanwhile, others, through what Yochai Benkler refers to as “commons-based peer production”, helped to figure out what all this meant for individuals and communities, creating the use cases for the technology and developing new ideas of their own.
Only one of those groups has profit as its primary motivation. That’s not to denigrate private enterprise’s role at all – it is vital, and few, if any, of the innovations would have worked without it. However, societal and community-focused benefits matter just as much and are an important part of the mix that generates disruptive change.
Finally, and perhaps most interestingly, many of those involved in the digital revolution were not “just” engineers or scientists. A common thread is an appreciation of the arts as much as the sciences. Perhaps best exemplified by Steve Jobs’ idea of Apple sitting at the “intersection of technology and the liberal arts”, artistic creativity is a core driver of innovation. This takes us all the way back to Ada Lovelace, who described her approach to Babbage’s early computer as “poetical science”.
This creativity is still the one thing humans can do much better than machines, which leads us to another of Isaacson’s lessons: the real sweet spot in digital innovation lies in people and computers working together in a kind of symbiosis, rather than in artificial intelligence. Instead of trying to make machines that act like humans, we should leave computers to do what they are good at – crunching through data and calculations – which frees people to do the creative, intuitive work that machines struggle with so much.
Great innovation, then, is a balance – between art and science, between individual brilliance and collaboration, between humans and machines. Something well worth bearing in mind as we head into a new year.