View From the Mountaintop

Meet a brilliant strategist who is skeptical of strategic planning, the stock market, and PowerPoint presentations.

Strategy and innovation have much in common, not all of it pretty. They are the questing beasts of mythology let loose in the world of business, which most of the time is concerned with process, maintenance, the status quo—what Richard Rumelt calls “doorknob polishing.” Rumelt is a professor of strategy at the Anderson School of Management at UCLA. Considered “the strategist’s strategist,” he was interviewed by the global management-consulting firm McKinsey & Company.

The high ground is natural to Rumelt. As a mountaineer he achieved several first ascents, and as an academic he was the first to find a link between strategy and profitability: moderately diversified companies outperform more diversified ones. He has also shown that being good at what you do matters more than what industry you are in.

Rumelt’s central themes: Most corporate strategic planning is an annual, highly structured process, frustrating to many executives because it adds little value. To Rumelt, this exercise is really the setting of rolling multi-year budgets. While important, it is not strategy. In his view, the term “strategic planning” doesn’t even make sense. Planning is about how things stay the same: taking the patterns of the past and extrapolating them into the future. The intention is to make incremental progress.

The goal of strategy, on the other hand, is to significantly improve performance. There are only two ways to do this, says Rumelt: to invent something new, which is a gamble; and to exploit a change in the environment (“take a position by investing in resources that will be made more valuable by the changes happening”). As a strategist, he clearly prefers the second option. (We in Atlantic Canada should take note. Around here, strategic innovation is mostly thought of as invention.)

Starting at the top, it is the CEO’s role to “absorb ambiguity” by defining the broad problem in narrower terms that the managers and business units can handle. By its nature, strategy involves a certain leap of faith. “Speculative judgments are the essence of strategic thinking—a substitute for having the clear connections between the positions we take and their economic outcomes,” says Rumelt. “They help us take a position in a world that is confusing and uncertain. You can’t get rid of ambiguity and uncertainty; they are the flip side of opportunity.”

Strategy “work” (his term) is best done by a small group of smart people. “Doing this kind of work is hard,” he says. “A strategic insight is essentially the solution to a puzzle, which is more likely to be done by an individual or a small team.”

Rumelt uses creative language, metaphors. Take a predatory posture, he says. Leap through the window of opportunity. In his descriptions, strategy appears episodic, unpredictable, and nonlinear—like the process of evolution. It is not something that can be put neatly on the calendar. During the interview, the word “insight” comes up again and again.

Entrepreneurial strategy is distinguished by a focus on the big wins and not on maintenance activities, he says. “There is no substitute for entrepreneurial insight, but almost all insight flows from the unexpected combination of two or more things.” The key for the company is to possess the skills to combine these elements effectively. Indeed, the great strategists in any field possess a rare gift: the ability to blend vision and creativity with current skills and abilities.

Although he has worked with a who’s who of corporate America, in his illustrations Rumelt keeps coming back to maverick Steve Jobs of Apple. A wonderful example of the “unexpected combination” is the iPod, which is the result of Apple’s ability to combine knowledge of three things: the music industry, hardware, and the Web.

To this mountaineer, the economic environment is “terrain,” with high ground and low ground. Today as never before, the terrain can be quickly shifted and even reversed by “earthquakes,” so strategy needs to be dynamic. In earlier decades it was usually static: performance was improved mostly by improving processes, not by changing direction.

The “locus of success” is a place that is difficult to replicate; it is based on intangible resources, such as how people do things. Competencies are created by activity, by doing: “They create advantages, but today advantages evaporate quickly.”

Rumelt also weighs in on what not to do. When companies get too diversified, some of the units are inevitably unprofitable and are carried by the others. “No one wants the unrewarding task of weeding the garden,” he says.

Rumelt also cautions the CEOs of public companies about the temptation of managing for stock prices, which respond to changes in expectations, not performance. His logical mind considers this a mild form of insanity. What’s more, the irrational volatility of the market produces a “signal-to-noise ratio that is very low,” says Rumelt, who began life as an electrical engineer. “For a CEO, living with the stock market as a constant factor in your life takes iron nerve and an ability to be detached.”

Finally, Rumelt loathes PowerPoint presentations. Putting things in point form hides contradictions, he asserts. Better to try to write three short clear paragraphs. Good writing is good thinking, I say. Amen to that.

Strategy column in Progress magazine, 2008.

Lessons from the Bell Curve

Strategy and innovation, the highest functions of reason and creativity, combine careful planning with improvisation—and the ability to turn on a dime.

In my last column, I quoted the definition of strategy given by Richard Rumelt, a professor of business strategy at UCLA’s Anderson School of Management. To paraphrase: the essence of strategic thinking is speculation. The key word is “speculation.” Success usually appears during periods of stability, when the systems we have designed are producing the outcomes we desire. During these prosperous times, we may forget the earlier stages when we planted the seeds of future opportunity. There was uncertainty, soul searching, and trial and error—a certain fumbling around. A mathematician friend of mine came up with the phrase “data momentum” to describe most of what happens in our lives: 90% is driven by what happened yesterday, by forces that have already been set in motion.

This is true at all levels: in nature, in human society, in organizations, and within ourselves. Most of the time, we are pushed along by data momentum. We ride a wave, following well-known trends or processes. We are comfortable with predictable inputs and outputs. It’s like our body, where most processes are controlled by lower-brain functions. We don’t have to concentrate for it to work. Strategy and innovation belong to the other 10%, the stuff that requires focus, the times when we must make choices. They are the levers that allow us to set new goals and move in new directions. It may seem paradoxical at first that much of this conscious goal-oriented behaviour is in fact based on guesswork.

Strategy is setting a direction where there is not yet a clear link between behaviour and outcome. In other words, you are making a bet on the future where you will expend resources without being sure of the result. It is the same with innovation. In business or technology, an innovator pursues an opportunity but does not know in advance what the outcome will be or whether it will be profitable. Strategy and innovation both depend on a combination of intention and rational analysis on the one hand, and intuition and luck on the other. They also depend on an ability to suspend disbelief and consider multiple hypothetical scenarios, to ask “what if?”

Consider a bell curve, fat in the middle and tailing off at either end. In an established company, for example, most of the activity is tried-and-true process and procedure, the stuff in the middle that takes place over the medium term. Strategy and innovation are the tails on either end. One tail is the zone of long-range planning; on the other lies improvising, the art of the short term. Strategy and innovation are also inherently self-destructive, in that they aim to replace their hypothetical selves with proven processes and procedures.

They share another important quality, which is the essence of a true experiment: the ability to be wrong. Most of us don’t like that one! Our egos prefer to sit comfortably on a little perch of expertise and experience, where we can look straight ahead and judge everything that comes our way as being either right or wrong. This is why some of the most brilliant stuff comes from the outlying regions, where there is not so much invested in the status quo.

As a strategist, the outcast Genghis Khan learned from every conquest and systematically combined the tactics and technology of his foes. Coming from nothing, he wasn’t wedded to any one particular way of doing things. As innovators, the Wright brothers were true to the painstaking hands-on approach of their bicycle shop. They made systematic wind-tunnel experiments inspired by the shape of birds’ wings, while the top physicists of the day declared manned flight to be impossible based on theoretical calculations.

Great generals learn at least as much from battles lost as battles won. Edison was proud of his methodical, trial and (mostly) error approach. Einstein ransacked libraries until he chanced upon a mathematics that described multidimensional spaces. Modern software developers intentionally release flawed beta versions of new programs so early adopters can identify the bugs.

The ever-shifting balance between planning and improvising is well illustrated by the Normandy landings on D-Day. The plan called for the Allies’ naval vessels to lie well offshore, out of reach of the German heavy guns. But when the ships’ commanders realized the troops were being mowed down on the shoreline, they disregarded orders and improvised, moving in closer so their guns could take out key German gun emplacements. This act of disobedience allowed Allied troops to establish a beachhead—the beginning of the end of the Second World War.

In the early stages, a start-up company is mostly about strategy: speculative ideas, concepts, and goals. There are few resources or processes. The middle of the bell curve is mostly empty; the activity, and the value, is at both ends. If the company is able to attract financing and people, it is able to create processes: detailed plans, departments, products or services, a path to market. It fills in the middle of the curve. Strategy and innovation have done their jobs and can retire, at least for the moment, into the background.


The power of fear

Kayaking in the Bay of Fundy, I saw a giant shark fin coming at me out of the mist. My heart raced. My brain snapped into high gear. I was, in a word, afraid.

Actually, this never happened.

What really happened was that I recalled what two fishermen had told me the last time I was out paddling. Some other fishermen had told them they had seen a huge fin out there. One guy alone in his boat had been afraid enough to get the hell out of there.

Some great whites have been tracked off the coast of Nova Scotia this summer. Maybe this had been one of them.

I hadn’t actually seen the fin. Alone on the water, I had just remembered what the fishermen had told me the week before. That was enough to get the heart pumping.

Instead, I had seen a few kayakers, the usual fishermen in boats and on shore, lots of gulls and a couple of eagles. I’d felt the rush of the powerful Fundy tide. In other words, business as usual on the bay. This nameless fear brought on by my imagination is a good metaphor for a lot of what goes on nowadays.

By modern standards my home base of Halifax, Nova Scotia is a boring place. There is some downtown construction making it hard to get around, too much government on the one hand and too few enlightened policies on the other, with consequences like a shortage of doctors and a lot of debt. The rural economy is struggling and woodland is being burned up as low-value biofuel, a source of so-called “green energy.”

Still, the province is a tame neighbourhood compared to much of the world, at least according to the news. From here, the big world looks like a scary place. There is a seemingly steady stream of terrorist attacks and wars, with refugees fleeing for their lives.

South of the border, the Great Democratic Experiment envisioned by Jefferson, Hamilton and their visionary cohort is degenerating into a sandbox battle of spoiled special interests. North Korea’s nuclear threat highlights two world leaders willing to sacrifice the world in service of their own inflamed egos – or so it seems. Their eyes have that righteous bulged-out look you get from too much adrenaline, the fuel behind those two high emotions, anger and fear.

The message behind all the adrenaline out there is simple: be afraid, be very afraid. All those words, images and sounds are designed to zero in on the amygdala and associated territory – your brain’s fear centers.

Emotion drives much of human behavior. It quickly shuts down the higher brain. You might think our modern era, all shiny and high tech, would transcend our primitive origins, but it is almost the opposite.

The power of emotion to sway even the most intelligent among us has been demonstrated by two psychologists: the late Amos Tversky (who was also a mathematician) and his colleague Danny Kahneman, who won the Nobel Prize in Economics, although he is not an economist.

Evolution has wired fear deep into our subconscious for good reason. It protects us from immediate threats — when the luxury of taking time to think may be fatal. But this advantage is double-edged. Fear hijacks our latest evolutionary indulgence, the cerebral cortex and its capacity for reflection, creativity and problem solving.

“Fear” is the single cover line on the summer 2017 edition of Lapham’s Quarterly. The cover image is Shield with the Head of Medusa by Arnold Böcklin (1897). In ancient mythology Medusa was considered so frightening that merely looking at her would turn you to stone. The anthology tunes with deadly accuracy into the sense of uncertainty and even terror that lurks beneath all civilizations.

In his preamble, Editor Lewis Lapham quoted President Franklin D. Roosevelt’s famous 1933 speech: “The belief that the only thing we have to fear is fear itself… the unjustified terror which paralyzes needed efforts to convert retreat into advance.”

The president’s goal, Lapham wrote, was to bolster “the national resolve needed to emerge from the Depression.” Later, in 1941–45, the same resolve was needed to win a world war. So far so good.

Then Lapham switched gears: “… and in the years since to bring forth the wealthiest society and the most heavily armed nation-state known in the history of mankind.”

What is the nation protecting itself from now?

“The armour is protection against the real enemy – once again it is fear itself. Yet fear is also a tool – malleable and subject to misuse… Fear is America’s top-selling consumer product. Our leading politicians and think-tank operatives regard mental paralysis as the premium state of securitized being. [There is] a rarefied awareness of nameless, unreasoning terror as evidence of superior sensibility and soul.”

It’s Freud who made the modern distinction between “real fear” (a rational response to a clear and present danger) and “neurotic fear,” a sort of free-floating anxiety.

Lapham ticked off the recent wars “born in the cradle of expectant anxiety” — the Cold War, the wars in Vietnam and Iraq. Following the fall of the Berlin Wall in 1989, the challenge was “to defend, honor and protect the cash flow of the nation’s military-industrial complex.”

Now we’re talking manipulation.

The term “military-industrial complex” was first used by President (and five-star general) Dwight D. Eisenhower, during his Farewell Address to the Nation in 1961. He noted how the military was needed to keep the peace:

“Our arms must be mighty, ready for instant action, so that no potential aggressor may be tempted to risk his own destruction,” he continued. “[Yet] in the councils of government, we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military–industrial complex. The potential for the disastrous rise of misplaced power exists. We must never let the weight of this combination endanger our liberties or democratic processes.”

Lapham pointed out that this military-industrial complex can survive only amid a climate of fear. Post 9/11, the nation “took up arms against a figment of the imagination—Saddam Hussein’s weapons of mass destruction.”

The greatest threat turned out to be not a missile-wielding foreign despot but something more insidious — the loss of internal freedom: “Unable to erect a secure perimeter around the life and landscape of a free society, the government departments of public safety solve the technical problem [of protecting wealth and privilege] by seeing to it that society becomes less free.”

Sixteen years later, from “the collective fear and loathing of the American people,” Donald Trump became the President of the United States.

A few weeks after the publication of the “Fear” edition of the Quarterly, President Trump and “The Great Leader” of North Korea circled each other, two alpha males in their prime – actually slightly past it. That is the most dangerous stage, when, according to history, younger men and even civilians may be sacrificed on the unquenchable altar of ego.

Driving to your death

“Be afraid” is a beguiling mantra that works well for politicians, the media, sales people and marketers—indeed, for persuaders of all stripes.

In The Science of Fear (Dutton 2008), Daniel Gardner, a Canadian journalist, burrowed into our susceptibility to this entrancing bogeyman. The author recalled that on September 12, 2001, US President George W. Bush called the attacks of the day before “more than an act of terror, they were acts of war… Freedom and democracy are under attack.”

Before this, George W. Bush was “a weak leader with a flimsy mandate. He had lost the popular vote and had mediocre approval ratings. Afterwards, he was a hero.”

Gardner noted that 9/11 had a high “signal value,” a term used by researchers to describe how much an event seems to inform us of future dangers. A poll in mid-October 2001 found 85% of Americans thought more attacks were likely over the next few weeks. Five years later, 50% still thought an immediate attack was likely.

The author then presented the statistical risks to the health and welfare of the ordinary citizen, drawn from actuarial tables based on recent history. (I round off most of the numbers.)

On 9/11 almost 3,000 people were killed from a population of 285 million. This equals an annual risk of 1 in 93,000. Compare this to a 1 in 88,000 risk of drowning, a 1 in 48,000 risk of a pedestrian being killed by a car, or the 1 in 6,000 risk of dying in a car accident.

Worldwide there were 10,000 international terrorism incidents between 1968 and 2007, killing 14,790 people for an average worldwide death toll of 379 people per year. Taking the anomaly of Israel out of the equation, the lifetime risk of being injured or killed by terrorism in the world was between 1 in 10,000 and 1 in 1 million.

In comparison, an American’s lifetime risk of being killed by lightning is 1 in 80,000, and of drowning in a bathtub 1 in 11,000. The probability of committing suicide is 1 in 119; of dying in a car crash, 1 in 84. Indeed, in North America the greatest threat to your life is driving your car within half an hour of home, because that is where you do most of your driving. How many of us think of that as we reach for the keys?
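For readers who want to check the arithmetic, the risk figures reduce to simple division: an annual risk of “1 in N” means N equals population divided by annual deaths. A minimal back-of-the-envelope sketch in Python, using the figures quoted above (the small gap from Gardner’s published 1-in-93,000 comes from rounding):

```python
# Back-of-the-envelope risk arithmetic, using the figures quoted above.
# An annual risk of "1 in N" means N = population / annual deaths.

def one_in(population: float, annual_deaths: float) -> int:
    """Return N such that the annual risk is about 1 in N."""
    return round(population / annual_deaths)

US_POPULATION = 285_000_000  # approximate US population in 2001

# Treating the 9/11 toll as a one-year figure:
print(f"terrorism (9/11 year): 1 in {one_in(US_POPULATION, 3_000):,}")

# Worldwide terrorism, 1968-2007: 14,790 deaths over 39 years
print(f"worldwide average: {14_790 / 39:.0f} deaths per year")
```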

Do we want the government to reduce this mortal risk by, say, forcing us to use self-driving cars, which according to all evidence are far safer than letting our impetuous and distractible species take the wheel?

In contrast, if the statistical risk posed by terrorism were considered in a public health context, Gardner wrote, it would be considered de minimis, “too small for concern.”

He looked at other quantifiable mortality factors. At the time, 14% of Americans, or 41 million people, did not have health insurance. According to the Institute of Medicine, this caused 18,000 unnecessary deaths each year, costing the US between $60B and $130B annually.

On the international front, malaria in Africa could be controlled for $2B to $3B per year, but only a tiny fraction of that was being spent. The disease “will likely continue to kill 67 times more people each year than the almost 15,000 killed by international terrorism over the last four decades.”

While many commentators decry the risks of terrorism and other security-related concerns, Gardner noted that we in the West were living in the healthiest, wealthiest and safest period in history. This fact is lost on most of us.

For example, until the 1920s there was no check on the caprice of infectious diseases that ravaged communities with no regard for income, power or social status. Antibiotics, vaccinations and an emphasis on hygiene, considered the greatest single advance in medicine, have only recently combined to protect our species.

Children had been especially vulnerable. A diphtheria vaccine was created in 1923, but before that not even the mighty were spared. Queen Victoria’s daughter and granddaughter both died of the disease during an outbreak in 1878.

In 1725, the average life expectancy in the US was 50 years. By the end of the 20th century it had risen to 78 years. In 1900, 20% of children born in the US died before they were five years old. By 2002, this figure had fallen to less than 1%.

Preoccupied by the drama of our own circumstances, we underestimate the uncertainty of previous eras. Generation after generation, millennium after millennium, life had been tenuous, fragile, susceptible at any time to the dark angels of illness, famine, poverty and war.

With the dramatic exception of the mass insanity of the two world wars, the 20th century was a turning point. Medicine protected us and communication and transportation connected us. Yet beguiled by the tug of “free-floating anxiety” and fear, our age stays locked in its own thought prison.

The Cold War pitted the Soviet Union against the West in a resource-guzzling build-up of arms. Nuclear Armageddon was postponed by the unhinged but effective insurance policy of Mutual Assured Destruction (MAD). In 1985 the Soviet Union and the US possessed enough nuclear weapons to kill half of the human race and reduce the rest to scavengers. But the Cold War ended peacefully, Gardner noted, and the Soviet Union dissolved within a few years.

While later instability provided an opportunity for the ex-KGB hand Vladimir Putin to rise to power, the Russian threat pales in comparison to that of the Cold War.

Boiling the frog


We are wired to respond to the immediate, the sensational, the loud, the unexpected. Subtle but potent long-term threats escape our attention.

Ecological Intelligence by Daniel Goleman (Ballantine Books 2009) was published a year after The Science of Fear. Goleman recalled growing up during the Cold War, when “bomb drills at school reminded us that we could be blown to bits in a nuclear war. Children today face what may prove, over the long term, to be an even more dire threat: the specter of drastic disruptions of life from global warming and the other ecological disasters we may have already set in motion.”


David Wallace-Wells takes us to this simmering apocalypse in his article “The Uninhabitable Earth,” the cover story in New York magazine, July 10, 2017. “Absent a significant adjustment to the way billions of humans conduct their lives, parts of the earth will become close to uninhabitable, and other parts horrifically inhospitable, as soon as the end of the century,” he wrote.

He cited Wallace Smith Broecker, the oceanographer who coined the term global warming, who calls the planet “an angry beast.” Wallace-Wells piles on the evidence of slow-moving but relentless scenarios, stories that unfold over decades and centuries, although the slopes of the curves are steepening faster than even doomsayers were predicting a short while ago. Recent satellite data show that since 1998 global warming has progressed more than twice as fast as scientists had predicted. Moreover:

Arctic permafrost contains 1.8 trillion tons of carbon, more than twice as much as is currently suspended in the atmosphere. When the permafrost thaws, the carbon may evaporate as methane, which is 34 times as powerful a greenhouse gas as carbon dioxide.

The UN Intergovernmental Panel on Climate Change projects 4° of warming by the beginning of the next century. The last time the planet was 4° warmer, all but one species of European primate was wiped out.

Since 1980 the planet has experienced a 50-fold increase in the number of places experiencing dangerous or extreme heat. Cereal crop yields decline with heat—and animal protein even more. The tropics are already too hot to grow grain efficiently.

Drought may be a worse problem than heat. By 2080, without dramatic reductions in emissions, southern Europe will be in permanent extreme drought. The same will be true in Iraq and Syria and much of the Middle East. Experts estimate that 800 million people are undernourished globally, with increasing famines in Africa and the Middle East.

The concentration of carbon dioxide in the atmosphere has just crossed 400 ppm and is expected to hit 1,000 ppm by 2100.

The warmer the planet gets, the more ozone forms, increasing smog. More than 10,000 people die each day from the small particles emitted by fossil fuel burning. In 2013 smog was responsible for a third of all deaths in China.

Violence in society increases as people become hotter. Forced migration is already at a record high, with at least 65 million displaced people wandering the planet.

Wallace-Wells presented an explanation of the modern economy—the driver of global warming—that is simple and direct: cheap energy that we have not properly employed. While neoliberalism prevailed between the end of the Cold War in 1989 and the onset of the Great Recession in 2008, “historians suggest that the entire history of swift economic growth which began in the 18th century is not the result of innovation or trade but simply our discovery of fossil fuels.”

Following this logic, researchers predict a 23% loss in per capita earnings globally by the end of the century, resulting from changes in agriculture, crime, storms, energy, mortality and labour. “Imagine what the world would look like today with an economy which is half as big and which will produce half as much value.”

After the statistics and the science, Wallace-Wells ended on a poetic note.

“Early naturalists often talked about ‘deep time,’ the profound slowness of nature… What lies in store for us is more like what Victorian anthropologists identified as dream time or ‘everywhen’ — the semi-mystical experience described by aboriginal Australians of encountering in the present moment an out-of-time past when ancestors, heroes and demigods crowded an epic stage, a feeling of history happening all at once.

“The carbon-burning processes that began in 18th-century England lit the fuse of everything that followed. But more than half of the carbon humanity has exhaled into the atmosphere in its entire history has been emitted in just the past three decades. Since the end of World War II the figure is 85%.”

What do our big thinkers propose?

Stephen Hawking says our species needs to colonize other planets in the next century to survive. Elon Musk plans to build a Mars habitat.

Wally Broecker, now 84, puts his faith in carbon capture, the untested technology to extract carbon dioxide from the atmosphere, which he estimates would cost several trillion dollars in geo-engineering. Climatologist Jim Hansen, former head of research at NASA, filed a lawsuit against the federal government charging that inaction on warming will impose massive burdens on future generations.

The challenge is clear. To meet the goals of the Paris Accord, carbon emissions from energy and industry, which are still rising, must fall by half each decade through 2050. The writer ended on a quasi-hopeful note, the forced but seemingly necessary optimism of the climatologists who think we can engineer our way out of this problem that we created by our own naïve engineering.
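Halving per decade compounds quickly. A tiny illustrative calculation in Python (assuming, purely for the sketch, a 2020 baseline, which the article does not specify):

```python
# If emissions halve every decade, the fraction remaining after n decades is 0.5**n.
baseline_year = 2020  # assumed baseline year, for illustration only

for decade in range(1, 4):
    remaining = 0.5 ** decade
    print(f"{baseline_year + 10 * decade}: {remaining:.1%} of baseline emissions")

# 2030: 50.0%, 2040: 25.0%, 2050: 12.5% -- an eighth of the baseline by mid-century
```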

Our short-term preoccupation with terrorism, security, policing and war is our ancient brain at work. Television images and bombastic leaders channel our attention away from more insidious threats like global warming. Our brains are programmed to latch on to threats that are immediate and local, not longer-term, global and infinitely more dangerous.

In the short term, over-reliance on fossil fuels causes warming of the planet and other environmental issues. This leads to droughts and famines that in turn lead to forced migrations and a motley soup of social and political instability.

In the longer term, our current civilization is based on the cheap energy and reckless spending of the fossil fuel economy. This engine has lifted billions into a higher standard of living and now threatens the ecology that supports us all.

These threats are gradual and, in comparison to the images of terrorism and war on the television news, not very dramatic. Then again, one of the predictions of global warming is more weather extremes, including storms. Houston under water and a coastline of oil infrastructure under threat make good TV as well.

Wallace-Wells’ scenarios are bold and disturbing. They are based on precise measurements, validated scientific theories and sober predictions. It’s not “fake news.”

The scenarios are alarming. They should make us afraid, in a rational sort of way. Let’s put that recent innovation, the cerebral cortex, to work and think our way out collaboratively. Here on earth, we’re all in this together.

Let’s clean up the mess we have made.

Beyond the challenges – social, economic, technical – let’s end with that giant emotional filter we began with: fear.

Like the Dalai Lama, Chögyam Trungpa Rinpoche escaped from Tibet after the Chinese occupation. He eventually settled in Halifax. Fear was one of his favorite subjects.

“When you are frightened by something, you have to relate with fear, explore why you are frightened, and develop some sense of conviction,” he wrote. “You can actually look at fear. Then fear ceases to be the dominant situation that is going to defeat you. Fear can be conquered. You can be free from fear, if you realize that fear is not the ogre. You can step on fear, and therefore you attain what is known as fearlessness. But that requires that, when you see fear, you smile.”







Claude Shannon’s world: Fun, games and the search for truth


It’s a small sample, but the brilliant people I know and have known share two underappreciated traits: a sense of humor and a quirky passion for following their curiosity, no matter what. These two qualities can get you into a lot of trouble.

The most interesting people, too, have their own unique motivations and their own ways of slicing and dicing what interests them. Put them into the usual mold, and they will probably break.

The article below, by the authors of the new book A Mind at Play: How Claude Shannon Invented the Information Age, touches on both qualities.

Shannon was a mathematician and puzzle solver who proved theorems and built gadgets with the same combination of passion and precision. His concept of information was the elusive quantity that lies behind the entire digital age. It was similar to that of entropy, at least mathematically, and he called his home Entropy House, a metaphor any homeowner can appreciate.
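Shannon’s measure is simple to state: the information in a source, in bits, is H = −Σ p·log₂(p), summed over the probabilities of its symbols, the same mathematical shape as thermodynamic entropy. A minimal sketch in Python (my illustration, not Shannon’s):

```python
import math

def shannon_entropy(probs):
    """Shannon's H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally surprising: one full bit per toss.
print(shannon_entropy([0.5, 0.5]))  # 1.0

# A heavily biased coin carries much less information per toss.
print(shannon_entropy([0.9, 0.1]))  # about 0.469
```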

I used to visit Entropy House in the 1970s and ’80s and recall the eccentric unicycle and the bald patch of grass where Shannon practiced juggling.

His hobbies tended to combine a sense of play, some physical challenges and — often but not always — a deeper theoretical insight. He built gadgets that were both amusing and instructive. One was a machine that turned itself off.

He made a machine that solved the Rubik’s Cube and wrote a funny poem about the elusive device. He built Theseus the mechanical mouse, which, along with its maze, was considered the first practical demonstration of artificial intelligence.

After building several scale models, he turned a small school bus into an elaborate camper van. He played the clarinet and loved Dixieland Jazz.

He investigated the mathematics of the stock market but made more money by investing in a few tech companies. Hanging out in casinos, he co-invented the first wearable computer.

Back at Entropy House, the rule was that you paid attention to something because you were curious about it, not because it was practical or promised some conventional reward like financial gain, although that could ensue later on.

This something could be a puzzle, a scientific mystery, a technical or physical challenge, the draft of a poem, or a gadget to build or fix. Tools included unicycles, juggling balls and clubs, Erector Sets, mechanical devices of all kinds, musical instruments, computers both analog and digital, mathematics, chess sets, frisbees, books – and, well, tools.

Shannon was known for a killer intuition, paring away a problem to get at its essence, then building it up again. This could irritate brilliant people who didn’t have the knack.

His many interests and hobbies cross-fertilized each other. Procrastination was a way to avoid the quick but simplistic solution. He would spend years on a difficult problem. When he had solved it to his satisfaction, he moved on.

Reclusive and not career oriented by normal standards, Shannon was employed by Bell Labs, MIT and the US government, with stops at the Princeton Institute for Advanced Study and Stanford.

During the Second World War he put cryptography on a mathematical footing and did research that led to his groundbreaking work on what was to become known as information theory.

At Bell Labs he sometimes got in trouble for unicycling and juggling – at the same time – but as senior researcher and respected all-purpose genius, he got away with it.

Not only did he juggle, but he also came up with a mathematical theory of juggling. This is interesting partly because many math types have juggled over the generations, but he was the first one to deconstruct the practice.

At Bell Labs he preferred to hang out with the guys at the machine shop while other theoreticians wondered why he was so fascinated with gadgets. He liked to build things with his own hands, but wasn’t above hiring others to help.

Shannon combined his passions for machines and chess in early research into what we now call artificial intelligence, but at the time it seemed more like a sidetrack.

He rarely collaborated. The “bit” was his idea, but he asked his buddies to come up with a word for it.

He loved games of all kinds and although he described himself as a frail man, he was athletic. He loved music, Alice in Wonderland and the sounds of words.

In recent years several books have brought a new prominence to this shy scientist and inventor. A film is coming soon.

There is no Nobel for the fields in which he worked — math, engineering and computer science — so it was fitting that he won the first Kyoto Prize.

His work was often decades ahead of its time.

When asked if machines would ever be able to think, he replied: “Well, I’m a machine and you’re a machine and we both think, don’t we?” [I think I got that right.]

He was friends with Alan Turing and spent time with John Von Neumann. His mentor at MIT was Vannevar Bush, the practical academic who invented modern science policy, fostering research into atomic weapons during the war and the Internet in peacetime. Our modern tech billionaires owe a lot to Bush, but not all of them know it.

Entropy House had a great vibe. It was a sort of Disneyland for talented people who didn’t take themselves too seriously.

His wife Betty was a collaborator. She was also a math whiz, a reader, a puzzle solver and a master weaver. She bought him his first Erector Set. The next generation is talented too.

With all his technical and scientific achievements, I asked what Claude Shannon was most proud of. “Lasting 42 moves with the world chess champion,” replied a family member.

From the article below:

Reflecting on the arc of his career, Shannon confessed, “I don’t think I was ever motivated by the notion of winning prizes, although I have a couple of dozen of them in the other room. I was more motivated by curiosity. Never by the desire for financial gain. I just wondered how things were put together. Or what laws or rules govern a situation, or if there are theorems about what one can’t or can do. Mainly because I wanted to know myself.”


Creating new worlds vs. the myth of problem solving

By David Holt

Despite its popularity, the current obsession with problem solving is way off the mark. Ditto finding out what your customers want and giving it to them. This is fine for fine-tuning and tweaking what already works, but it’s not what changes the world. Not even close.

As Henry Ford reportedly said, “If I’d listened to my customers, I would have invented a faster horse.”

Here’s why. The problem-solving orientation is based on accepting someone else’s view of the world, like that of your fourth grade teacher. A problem is, by my definition, a situation that has already been defined, usually by someone else.

School is a good example. Here’s a problem, says the teacher. Solve it. Get an A. Please the teacher. This is really about social acceptance—not objectivity and creativity, the two forces that change the world.

The people who change the world in all fields aren’t trying to get an A from the powers that be, including the apparent rules of the current marketplace. These people are driven by curiosity to understand and to create.

Mozart had his moments but was buried in a pauper’s grave. The careers of Van Gogh and Toulouse-Lautrec didn’t kick in until after they died. Shakespeare made his money as an investor in the Globe Theatre, not as a playwright and actor.

Gregor Mendel’s genetics experiments were ignored for decades. Nikola Tesla was often out-maneuvered by Edison, a ruthless character who staged the public execution of a circus elephant in a misleading stunt to malign the concept of alternating current.

It was Alexander Graham Bell’s wife Mabel who insisted on the patent application for the telephone. Bell was a serial inventor, a lone genius and brilliant collaborator. His wife was the business mind.

What creative people are doing is creating new worlds. They realize, more than the rest of us, that reality is not fixed.

This is the key insight in a recent article in Fortune magazine by Michael Puett and Christine Gross-Loh: “This Ancient Chinese Text Is the Manual for Business Disruptors (And no, it’s not Sun Tzu’s ‘Art of War’)” [April 1, 2016].

A few extracts:

When disruption became the rallying cry for innovators a decade ago, they seized on an ancient work of Chinese philosophy to prove their point. In Sun Tzu’s Art of War, a new class of business disrupters claimed to have found the original manual.

They were right about ancient Chinese philosophy, but wrong about the manual.

As it turns out, another text from China, the Laozi, actually offers a much more expansive—and revolutionary—vision of innovation. Like the Art of War, the Laozi is a 2000-year-old text.

The Art of War says that victory comes to a general—read, a business leader—who avoids following conventional strategies and instead uses surprising tactics to unsettle a seemingly dominant opponent… But it assumes that the disrupter has to take into account things like the actual terrain on which he is fighting and that he must treat his adversary as stable and unchanging.

The Laozi, by contrast, questions the very idea that we should try to come up with innovative strategies within a defined, predictable arena… Instead, the Laozi assumes a world in constant flux and motion.

The key lies in understanding one basic idea: although we tend to think of things as stable because that makes them easier to grasp, every situation that ever arises actually results from interactions between sets of constantly shifting, interweaving worlds.

Laozian innovation comes from an awareness that if everything is composed of moving parts, subtle actions allow one to alter or even make the world into something new.


[With CNN] Ted Turner … crushed the idea that the news belonged to a few TV channels and a few hours of the day, and thus helped pave the way for the day when everyone can assume news is delivered instantly and around the clock.

[With Amazon] Jeff Bezos did not disrupt the book industry or even several industries. He created a new world where those industries became less relevant because a large amount of shopping would now be done through this one website.

With the iPhone, Steve Jobs created a new world simply through completely reinventing a daily object [the phone] we already used and carried around all the time.

It’s that we don’t realize that we are entering these new worlds created by others that makes them so powerful.

Indeed, these new creations are more platforms or delivery vehicles than products or services. CNN carries “the news.” Amazon is really the ultimate logistics company. Apple provides the opportunity to create.

Besides delivering atoms and bits, these platforms are their own worlds. In addition to efficiency, they offer something else we humans love. Call it a feeling, a certain tone. We line up to buy.