Posts tagged with ‘taleb’
I just finished reading Antifragile, by Nassim Nicholas Taleb, which is, without question, the best book I’ve ever read. I am blogging less since I get most of the same benefits from Twitter (interesting conversation, thought-leadership attribution, etc.) at a much lower cost: it takes considerably less time to tweet than to blog. That said, I thought I’d share some of my takeaways from Antifragile in long form, since they are worth the extra characters.
Below is a summary of the crux of the book, by Dr. Shaiy Pilpel:
Everything gains or loses from volatility. Fragility is what loses from volatility and uncertainty.
To be antifragile, and thus benefit from volatility and uncertainty (norms in today’s macro-environment), one must be able to identify things that “love volatility” and things that “hate volatility”. Taleb says that to be antifragile, one must learn to “not be a turkey”. What he means is that turkeys think life is good on average because, on most days, they are well fed and allowed to roam free. That is, of course, until Thanksgiving, when they are slaughtered. Turkeys are not free; they are fed only to be slaughtered. Turkeys are in the matrix.
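The turkey parable is, at bottom, a claim about averages under tail risk, and it can be sketched numerically. This is an illustration of my own, not from the book; the numbers are arbitrary:

```python
import numpy as np

def turkey_wellbeing(days=1000, seed=1):
    """Daily 'wellbeing' of a turkey: comfortably positive every day,
    until one catastrophic day wipes everything out."""
    rng = np.random.default_rng(seed)
    series = rng.normal(loc=1.0, scale=0.1, size=days)  # pleasant, low-variance days
    series[-1] = -1000.0  # Thanksgiving: a single tail event
    return series

s = turkey_wellbeing()
print(round(float(s[:-1].mean()), 2))  # average of the first 999 days looks excellent
print(round(float(s.mean()), 2))       # full-sample average is dominated by one day
```

Every summary statistic computed before the last day says life is good; the full series says otherwise, which is exactly why the average is useless to something fragile to variability.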
To free oneself of the matrix, one must understand that the fragile breaks with time and thus embrace, and love, volatility. Taleb has a recipe for translating this theory to practice:
(i) Look for optionality; in fact, rank things according to optionality, (ii) preferably with open-ended, not closed-ended, payoffs; (iii) Do not invest in business plans but in people, so look for someone capable of changing six or seven times over his career, or more (an idea that is part of the modus operandi of the venture capitalist Marc Andreessen); one gets immunity from the backfit narratives of the business plan by investing in people. It is simply more robust to do so; (iv) Make sure you are barbelled, whatever that means in your business.
As in the turkey example, the notion of an average is of no significance when one is fragile to variability (e.g. Thanksgiving). Taleb suggests one should not spend any energy on mundane events and should focus only on Thanksgiving-type events (not necessarily Thanksgiving itself, since a turkey could not know about Thanksgiving, but only that something was likely to change significantly, i.e., variability). He uses iatrogenics as an example.
…we need to focus on high-symptom conditions and ignore, I mean really ignore, other situations in which the patient is not very ill.
He also uses what he calls the “tragedy of big data” as an example.
The more variables, the more correlations that can show significance in the hands of a “skilled” researcher. Falsity grows faster than information; it is nonlinear (convex) with respect to data.
Taleb’s solution is to look only at very large changes in data or conditions, never at small ones. Because the commercial world only benefits from addition, not subtraction, one can identify things that hate volatility as things that would be harmed if, as the result of changing conditions, they would no longer exist. As for me, I plan to become more vigilant about things in my life that “hate volatility”, and will seek more opportunities to subtract.
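The “tragedy of big data” quoted above can be demonstrated with a small simulation of my own (not from the book): feed a correlation screen nothing but independent noise, and the count of apparently “significant” pairwise correlations still grows roughly with the square of the number of variables.

```python
import numpy as np

def spurious_significant_pairs(n_vars, n_obs=100, alpha=0.05, seed=0):
    """Count variable pairs whose correlation looks 'significant'
    even though every variable is pure, independent noise."""
    rng = np.random.default_rng(seed)
    data = rng.standard_normal((n_obs, n_vars))
    corr = np.corrcoef(data, rowvar=False)
    # Rough two-sided significance threshold for a Pearson correlation
    # at level alpha=0.05 with n_obs samples (normal approximation).
    threshold = 1.96 / np.sqrt(n_obs)
    upper = corr[np.triu_indices(n_vars, k=1)]  # each pair counted once
    return int(np.sum(np.abs(upper) > threshold))

# True information here is exactly zero, yet the 'findings' pile up
# as the number of variables grows.
for n in (10, 50, 200):
    print(n, spurious_significant_pairs(n))
```

With 10 variables there are 45 pairs to test; with 200 variables there are 19,900, so at a 5% false-positive rate the noise alone yields hundreds of “discoveries”. Falsity grows convexly with data, exactly as the quote claims.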
Happy new year everyone!
Thanks for the CliffsNotes version! I am waiting to get a copy from the library. I’m like 15th in line.
Why is surprise the permanent condition of the U.S. political and economic elite? In 2007-8, when the global financial system imploded, the cry that no one could have seen this coming was heard everywhere, despite the existence of numerous analyses showing that a crisis was unavoidable. It is no surprise that one hears precisely the same response today regarding the current turmoil in the Middle East. The critical issue in both cases is the artificial suppression of volatility — the ups and downs of life — in the name of stability. It is both misguided and dangerous to push unobserved risks further into the statistical tails of the probability distribution of outcomes and allow these high-impact, low-probability “tail risks” to disappear from policymakers’ fields of observation. What the world is witnessing in Tunisia, Egypt, and Libya is simply what happens when highly constrained systems explode.
Complex systems that have artificially suppressed volatility tend to become extremely fragile, while at the same time exhibiting no visible risks. In fact, they tend to be too calm and exhibit minimal variability as silent risks accumulate beneath the surface. Although the stated intention of political leaders and economic policymakers is to stabilize the system by inhibiting fluctuations, the result tends to be the opposite. These artificially constrained systems become prone to “Black Swans” — that is, they become extremely vulnerable to large-scale events that lie far from the statistical norm and were largely unpredictable to a given set of observers.
Such environments eventually experience massive blowups, catching everyone off-guard and undoing years of stability or, in some cases, ending up far worse than they were in their initial volatile state. Indeed, the longer it takes for the blowup to occur, the worse the resulting harm in both economic and political systems.
Seeking to restrict variability seems to be good policy (who does not prefer stability to chaos?), so it is with very good intentions that policymakers unwittingly increase the risk of major blowups. And it is the same misperception of the properties of natural systems that led to both the economic crisis of 2007-8 and the current turmoil in the Arab world. The policy implications are identical: to make systems robust, all risks must be visible and out in the open — fluctuat nec mergitur (it fluctuates but does not sink) goes the Latin saying.
Just as a robust economic system is one that encourages early failures (the concepts of “fail small” and “fail fast”), the U.S. government should stop supporting dictatorial regimes for the sake of pseudostability and instead allow political noise to rise to the surface. Making an economy robust in the face of business swings requires allowing risk to be visible; the same is true in politics.
- Nassim Nicholas Taleb and Mark Blyth, The Black Swan of Cairo | Foreign Affairs
And is the same true in business? Businesses often have ‘stability at all costs’ as an unstated operating principle. To achieve that goal, management prohibits risk-taking and avoids entering into activities that could lead to fluctuations in business operations.
Business management is just as likely to be blindsided by high-impact, low-probability events as governments are, and for the same systemic reasons, one of which is that they do not naturally allow the ‘noise’ from day-to-day work to rise to the surface, into the strategic realms of management.
Just as we should aspire to make our economic systems robust and resilient at the macroeconomic level, so too must businesses, at the microeconomic level, take a different operating stance to do the same.