
How Innovation Works—A Review

A review of How Innovation Works: And Why It Flourishes in Freedom by Matt Ridley, Harper (May 19th, 2020), 416 pages.

If you are reading this, then you are taking advantage of the global information network we call the Internet and a piece of electronic hardware known as a computer. For much of the world, access to these technologies is commonplace enough to be taken for granted, and yet they only emerged in the last century. A few hundred thousand years ago, mankind was born into a Hobbesian state of destitution, literally and figuratively naked. Yet we have since solved an enormous sequence of problems to reach the heights of the modern world, and we continue to do so (global extreme poverty currently stands at an all-time low of just nine percent). But what are the necessary ingredients for solving such problems, for improving the condition of humanity? In How Innovation Works, author Matt Ridley investigates the nature of progress by documenting the stories behind some of the developments that make our modern lives possible.

Early in the book, Ridley writes that “Innovation, like evolution, is a process of constantly discovering ways of rearranging the world into forms that are unlikely to arise by chance… The resulting entities are… more ordered, less random, than their ingredients were before.” He emphasizes that mere invention is not enough for some newly created product or service to reach mass consumption—it must be made “sufficiently practical, affordable, reliable, and ubiquitous to be worth using.” This distinction is not intuitively obvious, and Ridley goes on to support his case with historical examples from transportation, medicine, food, communication, and other fields.

The story of humanity is often told as a series of singular heroes and geniuses, and yet most innovations are both gradual and impossible to attribute to any one person or moment. Ridley offers the computer as an illustrative example, the origin of which can be traced to no definitive point in time. While the ENIAC, the Electronic Numerical Integrator and Computer, began running at the University of Pennsylvania in 1945, it was decimal rather than binary. Furthermore, one of its three creators, John Mauchly, was found guilty of “stealing” the idea from an engineer who built a similar machine in Iowa years before. And just a year before the ENIAC was up and running, the famous Colossus began running as a digital, programmable computer, although it was not intended to be a general-purpose machine.

Moreover, neither the ENIAC nor the Colossus was the product of one mind—they were unambiguously collaborative creations. Assigning credit for the invention of the computer becomes even more difficult when we consider that the very idea of a machine that can compute was developed theoretically by Alan Turing, Claude Shannon, and John von Neumann. And, as Ridley documents, the back-tracing does not stop there. There is simply no single inventor or thinker to whom we may credit the birth of the computer, nor are its origins fixed at a single point in time.

Interestingly, Ridley argues that inventions often precede scientific understanding of the relevant physical principles. In 1716, Lady Mary Wortley Montagu witnessed women in Turkey perform what she called “engrafting”—the introduction of a small amount of pus drawn from a mild smallpox blister into the body of a healthy person via a scratch in the skin of the arm or leg. This extremely crude form of vaccination, it was discovered, led to fewer people falling ill from smallpox. Lady Mary boldly engrafted her own children, which was not well received by some of her fellow British citizens. In a case of parallel innovation, a similar practice of inoculation emerged in colonial Boston in 1721, when physician Zabdiel Boylston tried it on some 300 individuals. A mob tried to kill him for this creative endeavor. These precursors to modern vaccination demonstrate just how much of innovation is a matter of trial and error. Neither Lady Mary nor Zabdiel Boylston understood the true nature of viruses, an understanding that would not emerge for more than a century, nor did they know why introducing a small amount of infectious material would produce immunity. Nevertheless, it seemed to work, and that was enough. It was not until the late 19th century that Louis Pasteur explained why vaccination succeeds.

Ridley points out that there have always been opponents of innovation. Such people often have an interest in maintaining the status quo but justify their objections with reference to the precautionary principle. Even margarine was once called an abomination by the governor of Minnesota—in 1886, the United States government implemented inhibitive regulations in order to reduce its sales. The burgeoning popularity of coffee in the 16th and 17th centuries was met with fierce and moralistic condemnation by both rulers and winemakers, though for different reasons. King Charles II of England and Scotland was uncomfortable with the idea of caffeinated patrons of coffee houses gathering to criticize the ruling elite. The winemakers, meanwhile, correctly viewed this strange new black beverage as a competitor, and lent their support to academics who claimed that coffee gave those who consumed it “violent energy.” Ridley regards the war against coffee as emblematic of the resistance to novelty, wherein “we see all the characteristic features of opposition to innovation: an appeal to safety; a degree of self-interest among vested interests; and a paranoia among the powerful.”

Another feature of many of the innovations examined in Ridley’s book is that they were often made simultaneously by independent actors. Thomas Edison wasn’t the only one to have invented the light bulb. Apparently, 21 different people had come up with the design of—or had made crucial improvements to—the incandescent light bulb by 1880. Nor is the light bulb the only example of convergent invention—“six different people invented or discovered the thermometer, five the electric telegraph, four decimal fractions, three the hypodermic needle, two natural selection.” Ridley conjectures that this is the rule rather than the exception, and goes on to apply it to scientific theories. Even Einstein’s theory of relativity, Ridley argues, would likely have been arrived at soon afterwards by Hendrik Lorentz had Einstein not got there first. And although Watson and Crick are world-famous for their discovery of the structure of DNA, evidence suggests that they may have only narrowly beaten other scientists who were on similar trails.

As fundamental as innovation is to the story of humanity, Ridley argues that it predates our species by a few billion years. Touching on an almost spiritual note, he writes that “The beginning of life on earth was the first innovation: the first arrangement of atoms and bytes into improbable forms that could harness energy to a purpose…” Of course, there are differences between the creativity of a human mind and that of the biosphere, but many of the same principles govern both: the necessity of trial and error, a tendency toward incremental improvement, unpredictability, and trends of increasing complexity and specialization.

What is the relationship between our anticipation of innovation and its actual process and products? Ridley adheres to Amara’s Law, which states that “people tend to overestimate the impact of a new technology in the short run, but to underestimate it in the long run.” He tracks the intense excitement around the Internet in the 1990s, which faced the disappointing reality of the dotcom bust of 2000, but was then vindicated by the indisputable digital explosion of the 2010s. Human genome sequencing has followed the same pattern—after its disappointing early years, the technology may be gearing up for a period of success. If we indeed initially overestimate the impact of innovation and eventually underestimate it, then, as Ridley deduces, we must get it about right somewhere in the middle. He chalks this up to one of his central theses, namely that inventing something is only the first step—the creation must then be innovated until it’s affordable for mass consumption. Applying Amara’s Law to contemporary trends, Ridley thinks that artificial intelligence is in the “underestimated” phase while blockchain is in the “overestimated” phase.

Ridley dispels many of the falsehoods surrounding innovation, from “regulations help protect consumers” to “such-and-such innovation will cause mass unemployment.” In doing so, one does not get the sense that he approaches these issues from an ideological perspective, but rather that he has first studied history and then deemed such common beliefs to be misplaced. Similarly, How Innovation Works explores the relationship between innovation and empire, concluding that as ruling elites expand their territory and centralize their control, the creative engine of their people stalls out and technological progress stagnates. This, in turn, contributes to those empires’ eventual downfall. In contrast to such ossified behemoths, regions of multiple, fragmented city-states were often the home of intense innovation, such as those of Renaissance Italy, ancient Greece, or China during its “warring states” period. In short, larger states tended to be more wasteful, bureaucratic, and legislatively restrictive than smaller ones, and freedom, as the book’s subtitle indicates, is critical to progress.

In How Innovation Works, Matt Ridley is trying to solve a problem—that “innovation is the most important fact about the modern world, but one of the least well understood. It is the reason most people today live lives of prosperity and wisdom compared with their ancestors, [and] the overwhelming cause of the great enrichment of the past few centuries…” Many people are simply unaware that generations of creativity have been necessary for humanity to progress to its current state (and of course, problems will always remain to be solved).

Ridley’s book is his valiant attempt to change that. There are many more lessons, stories, and patterns expounded upon in How Innovation Works that I’ve not touched on here, and Ridley weaves them all together in a kind of historical love letter to our species. It is the latest contribution in a growing list of recent books defending a rational case for optimism and progress, including Michael Shermer’s The Moral Arc, Steven Pinker’s Enlightenment Now, and David Deutsch’s The Beginning of Infinity. Unlike those other authors, however, Ridley focuses primarily on technological progress, and his passion and gratitude for it are infectious:

For the entire history of humanity before the 1820s, nobody had travelled faster than a galloping horse, certainly not with a heavy cargo; yet in the 1820s suddenly, without an animal in sight, just a pile of minerals, a fire and a little water, hundreds of people and tons of stuff are flying along at breakneck speed. The simplest ingredients—which had always been there—can produce the most improbable outcome if combined in ingenious ways… just through the rearrangement of molecules and atoms in patterns far from thermodynamic equilibrium.

Ridley presents an inspiring view of history, because we are not merely the passive recipients of thousands of years of innovations. We can contribute to this endless chain of progress, if we so choose.

 

Logan Chipkin is a Philadelphia writer and tutor. He holds a master’s in biology and a BA in physics. His writing focuses on science, philosophy, economics, and culture. You can follow him on Twitter @ChipkinLogan.


Comments

  1. “In short, larger states tended to be more wasteful, bureaucratic, and legislatively restrictive than smaller ones and freedom, as the book’s subtitle indicates, is critical to progress.”
    Is that true, particularly in the present? A great many innovations, from the internet and duct tape to tampons and cat’s eyes on the road (I could also mention GPS, Jeeps, microwaves, drones, etc.), have their roots in military technology. The military is most definitely bureaucratic, and superpowers tend to be larger, with more funds available for development. These things were invented in pursuit of a military advantage.
    Also, while innovation is presented in this article as an unmitigated good, the reality is trickier. Innovations have pros and cons. Nuclear energy is great if you ask me, but the atom bomb not so much. Being able to download obscure Japanese movies is awesome but the surveillance state that is gaining momentum does not thrill me. Also, the wrong technology is often promoted. Hundreds of millions of dollars were pumped into Theranos, and sweet FA came out. Vaccines are awesome but the ethics of pharmaceutical companies might well not be.

  2. “Even margarine was once called an abomination by the governor of Minnesota—in 1886, the United States government implemented inhibitive regulations in order to reduce its sales.”

    Yellow-colored margarine was banned in Wisconsin until 1967, the last ban of its kind. The law was overturned only after one state senator (who was particularly anti-oleo) agreed to a blind taste test. He chose margarine over butter and allowed the law to be repealed. It turns out that, for health reasons, his family had been secretly serving him margarine smuggled in from another state, which is why he thought it was butter!

  3. “most people today live lives of prosperity and wisdom compared with their ancestors”

    Yes and no. Yes, we live longer lives, for example. But as regards wisdom, most of us did not create the consumer innovations we rely on. We say the earth is round because other, smarter minds figured that out, but we are not intrinsically smarter. Plenty of educated people fall for claptrap like The Secret. And innovations do not necessarily make us smarter: it’s not as if Twitter has immeasurably improved the standard of debate. Prosperity is also somewhat up for debate. Some machinery and innovations create new jobs, while others take jobs away. I’m not anti-innovation, but I do think we should recognize innovation as a double-edged sword.

  4. For the opposite view favouring states:


    Worth remembering that in arguments like these, the question is one of tendency: the extent to which state structure contributes. That is a systemic question, complex in and of itself. Extensive data of the kind Mazzucato relies on doesn’t by itself paint a reliable picture, because it would only ever be a relative one.

    Worth noting, too, that all countries are now closely interwoven through treaties and financial relationships, so we can’t observe stark differences the way we could in the past, when sovereignty meant something.

  5. I think you’re not quite getting what the author means by wisdom in that quote. Today, most people know that diseases are caused by germs and that personal hygiene can help avoid disease. Most people know that weather is a product of natural forces and that sacrificing children or animals won’t change it. No, not everyone is a genius, and most people have only the tiniest, most cursory understanding of how the technology in the world we live in works, but overall our understanding of how the world works is light years beyond where it was even a few hundred years ago.

    I do get that the choice of the word wisdom is potentially ambiguous, but I think part of the author’s goal is to focus on the positive side of innovation (his first example of a problem addressed was world poverty). My sense is that the phrase “living in wisdom” means living under the mostly benign leadership of intelligent rulers using good information to help solve problems.

  6. “Even margarine was once called an abomination by the governor of Minnesota.”

    Margarine IS an abomination…

    Ridley is always an engaging read, though the topic of innovation may be a bit obvious.

  7. How Innovation Works (…) is the latest contribution in a growing list of recent books defending a rational case for optimism and progress, including Michael Shermer’s The Moral Arc, Steven Pinker’s Enlightenment Now, and David Deutsch’s The Beginning of Infinity.

    At a time when it is increasingly difficult not to doubt the reasonableness and sanity of a considerable number of our fellow human beings, the optimism and belief in the future that prevails in such books can only be reassuring and comforting.

    This is all the more true when the progress described is about useful technological and practical innovations rather than the latest “innovative breakthroughs” in postmodern theory or gender studies, a kind of progress hardly anyone would miss.

  8. There aren’t any “innovative breakthroughs” in those fields, though. The tactics and the rhetorical depth are staid, repetitive riffs across millennia. Ours is a time of decadence, a globally interconnected one.

  9. In one way, yes, people’s general stock of information has increased, but it has also decreased. For example, early American pioneers would often know how to hunt, skin, and cook game, the best way to chop down a tree, and how to build a log cabin. This is also science of a sort. We’re also no different in superstition: they had the witch trials, we had the Satanic panic. The belief systems we hold now will be regarded as ridiculous perhaps just two decades from now, but humanity will have a fresh stupidity in store by then.

  10. I completely agree. I’ve added quotation marks for clarity.

  11. I believe that Ridley correctly observed that innovation starts with an invention, which then has to be innovated into a usable and affordable product. I think you correctly note that many of those inventions were military in origin or, at the very least, state-funded. Many inventions find their origin in universities, the army, or other laboratories where states are by far the biggest investors.

    I think it’s worth noting the strange role contemporary capitalism plays in this mechanism. While the state, and therefore the taxpayers, are the primary investors and risk-takers in many of these inventions, they receive only a minuscule share of the profit once a private company like Apple commercializes that technology. Even worse, a lot of these companies outsource their production to low-income countries where the products are made under slavery-like conditions so that a couple of speculating stockholders can profit maximally. In other words, the world does the investing so that a small number of people, who have never innovated anything in their lives, can profit. I’d say that system is not optimal.

  12. The clue is in the title: how innovation works. Ideas are two-a-penny, but the real stars of the innovation process are those who find a practical application and make it work at scale. Alan Turing might have developed the concept and the maths for breaking Enigma, but Tommy Flowers was just as important to the process, as his MBE attests. Although it’s my personal theory that whilst we British have made great advances in the sciences, philosophy, and civic innovation, beyond the initial spurt of the industrial revolution we have always suffered from an inability to monetise it. That’s the American genius: the ability to both innovate and monetise.

    I have a theory that we lost this ability to monetise somewhere in the nineteenth century, as a result of the first bout of financialising our economy. If this theory is true, it doesn’t bode well for America. In 1890 our economy had gone from strength to strength, but German industrial output had outpaced Britain’s by a factor of five. In the modern context, Germany makes four times as much ‘stuff’ as America. Offshoring might seem like a good idea in the short term, but in the long run it’s important to remember that the genesis of innovation begins in a hobbyist’s garage or an engineer’s shed, outside of normal working hours. The service sector is unlikely to provide as much innovation going forward as the tradables sector has done in the past, and unfortunately gaining a degree is less likely to make you an inventor (although I would posit that this is not true for the STEM fields).

  13. Again, I think you’re missing the point of the article and the idea of intellectual development. It is an important step forward for humanity in general to move past thinking of disease as a punishment from the gods for unrighteous behavior and toward understanding the basic principles of germs and infections. Of course there still are people who believe disease is a sign of divine displeasure, but humanity as a whole has clearly progressed.

    I happen to be one of those old guys who hangs out on save-our-skills websites, teaching folks how to do all sorts of things that people don’t do any more. I get your argument: if a zombie apocalypse happens, everyone where I live is gonna be fighting to get into my backyard. :)

    But aside from a zombie apocalypse, there are lots of things we’ve forgotten, like how to bridle an ox to pull a plow, or how to balance a stick with two buckets of water hanging from it as you walk back from the river. Are those skills to reminisce about?

    The idea that we’re no different in superstition is obviously not quite right. In the witch trials, people were burned at the stake; in what you call the Satanic panic, people went on TV shows and complained, wrote magazine articles and books, and made mediocre horror movies.

    It is an intriguing question how people two centuries in the future will regard our current belief systems. It seems obvious that in two centuries folks will look back and think us primitive technologically, just as we do looking back two centuries. But my best sense is that, in two centuries, people will still believe in a natural understanding of existence and focus on science to help make sense of it. Now if we discover a divine alien being that wants us to do roller disco as our required form of worship …

  14. “Similarly, How Innovation Works explores the relationship between innovation and empire, concluding that as ruling elites expand their territory and centralize their control, the creative engine of their people stalls out and technological progress stagnates. This, in turn, contributes to those empires’ eventual downfall. In contrast to such ossified behemoths, regions of multiple, fragmented city-states were often the home of intense innovation, such as those of Renaissance Italy, ancient Greece, or China during its ‘warring states’ period.”

    For this reason, I’m sure Google has become a zombie (I said as much to a young colleague about eight years ago).

  15. You might just as easily be talking about recent developments in academia, especially in relation to research committees. Tenure doesn’t mean shit in those fields which rely on funding for research, if the prevailing ideology stifles the generation of unpopular knowledge…
