Learning to think in deep-security terms means largely abandoning our idea that we can deter the threats we face and, instead, pressing to make our societies more resilient so we can absorb whatever strikes us. Resilience will be the defining concept of [the] twenty-first century…, as crucial for your fast-changing job as it is for the nation.
Joshua Cooper Ramo, The Age Of The Unthinkable
I think there is an interesting parallel between the rise of the concept of the social business and sustainability. They share the premise that we live in an interconnected world, and that steps must be taken to find a balance that allows for the continuation of life: the life of the organic world in the environmental context, and the life of the business in the business setting. But the desire for a steady state — in both cases — may be blinding us to the realities of the world we are living in.
Read the rest at GigaOM.
A post by Bijan included a quote from John Lilly: ‘Design like you’re right; listen like you’re wrong’. He suggested that it was derived from Bob Sutton’s one-liner: “Fight like you’re right, listen like you’re wrong.”
But after doing some research, I found that Sutton credits Paul Saffo for the original germ of this thought, which is his advice to approach the future with ‘strong opinions, weakly held’:
A couple years ago, I was talking with the Institute [For The Future]’s Bob Johansen about wisdom, and he explained that – to deal with an uncertain future and still move forward – they advise people to have “strong opinions, which are weakly held.” They’ve been giving this advice for years, and I understand that it was first developed by Institute Director Paul Saffo. Bob explained that weak opinions are problematic because people aren’t inspired to develop the best arguments possible for them, or to put forth the energy required to test them. Bob explained that it was just as important, however, to not be too attached to what you believe because, otherwise, it undermines your ability to “see” and “hear” evidence that clashes with your opinions. This is what psychologists sometimes call the problem of “confirmation bias.”
The design case turns out to be a specific example of a more general mindset, which is, in fact, what defines wisdom. More importantly, in a world changing at the pace of ours, it is the core premise of resilience.
Loosely defined, resilience is the capacity of a system—be it an individual, a forest, a city, or an economy—to deal with change and continue to develop. It is both about withstanding shocks and disturbances (like climate change or financial crisis) and using such events to catalyze renewal, novelty, and innovation. In human systems, resilience thinking emphasizes learning and social diversity. And at the level of the biosphere, it focuses on the interdependence of people and nature, the dynamic interplay of gradual and rapid change. Resilience, above all, is about turning crisis into opportunity.
Why is surprise the permanent condition of the U.S. political and economic elite? In 2007-8, when the global financial system imploded, the cry that no one could have seen this coming was heard everywhere, despite the existence of numerous analyses showing that a crisis was unavoidable. It is no surprise that one hears precisely the same response today regarding the current turmoil in the Middle East. The critical issue in both cases is the artificial suppression of volatility — the ups and downs of life — in the name of stability. It is both misguided and dangerous to push unobserved risks further into the statistical tails of the probability distribution of outcomes and allow these high-impact, low-probability “tail risks” to disappear from policymakers’ fields of observation. What the world is witnessing in Tunisia, Egypt, and Libya is simply what happens when highly constrained systems explode.
Complex systems that have artificially suppressed volatility tend to become extremely fragile, while at the same time exhibiting no visible risks. In fact, they tend to be too calm and exhibit minimal variability as silent risks accumulate beneath the surface. Although the stated intention of political leaders and economic policymakers is to stabilize the system by inhibiting fluctuations, the result tends to be the opposite. These artificially constrained systems become prone to “Black Swans” — that is, they become extremely vulnerable to large-scale events that lie far from the statistical norm and were largely unpredictable to a given set of observers.
Such environments eventually experience massive blowups, catching everyone off-guard and undoing years of stability or, in some cases, ending up far worse than they were in their initial volatile state. Indeed, the longer it takes for the blowup to occur, the worse the resulting harm in both economic and political systems.
Seeking to restrict variability seems to be good policy (who does not prefer stability to chaos?), so it is with very good intentions that policymakers unwittingly increase the risk of major blowups. And it is the same misperception of the properties of natural systems that led to both the economic crisis of 2007-8 and the current turmoil in the Arab world. The policy implications are identical: to make systems robust, all risks must be visible and out in the open — fluctuat nec mergitur (it fluctuates but does not sink) goes the Latin saying.
Just as a robust economic system is one that encourages early failures (the concepts of “fail small” and “fail fast”), the U.S. government should stop supporting dictatorial regimes for the sake of pseudostability and instead allow political noise to rise to the surface. Making an economy robust in the face of business swings requires allowing risk to be visible; the same is true in politics.
- Nassim Nicholas Taleb and Mark Blyth, The Black Swan of Cairo | Foreign Affairs
And is the same true in business? Businesses often have ‘stability at all costs’ as an unstated operating principle. To achieve that goal, management prohibits risk taking and avoids entering into activities that could lead to fluctuations in business operations.
Business management is just as likely to be blindsided by high-impact, low-probability events as governments are, and for the same systemic reasons, one of which is that they do not naturally allow the ‘noise’ from work to rise to the surface, into the strategic realms of management.
Just as we should aspire to make our economic systems robust and resilient at the macroeconomic level, so too must businesses, at the microeconomic level, take a different operating stance to do the same.