The book discusses randomness, the mathematics associated with it, and the psychological biases that get in the way of thinking about life stochastically.


  1. Probability is not about odds but about a belief in the existence of an alternative outcome.
    • The right way to judge performance in any field is not by the results but by the cost of the alternatives. A million dollars earned through dentistry is more valuable than a million dollars won playing Russian roulette: the alternative outcomes of Russian roulette are far worse.
    • Our habitat has evolved faster than our ability to adapt to it. Concepts like probability do not come naturally to us.
  2. When a random process is scaled up (“repeated”), its results regress toward the mean. A 15% annual return with 10% volatility implies a 93% chance of a positive year (=> 7% bad years), but only a 67% chance of a positive month (=> 33% bad months) and a 50.02% chance of a positive second (=> roughly every other second shows a negative return).
    • Over a short horizon, one sees the variability of the portfolio, not the returns. Over a long horizon, one sees the returns, not the variability.
    • The wise man listens to the meaning; the fool only gets the noise.
    • Law of large numbers – A population consisting of a large number of bad managers is virtually guaranteed to produce some with amazing track records. Moreover, the greatness of those records depends more on the size of the initial sample than on any ability to produce results.
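The year/month/second probabilities above follow from standard normal scaling of returns. A minimal sketch, assuming i.i.d. normally distributed returns so the mean scales with t and the volatility with √t:

```python
from math import erf, sqrt

def p_positive(mu_annual, sigma_annual, horizon_years):
    """P(return > 0) over a horizon, assuming i.i.d. normal returns:
    the mean scales with t, the volatility with sqrt(t)."""
    z = (mu_annual * horizon_years) / (sigma_annual * sqrt(horizon_years))
    return 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF at z

for label, t in [("year", 1), ("month", 1 / 12), ("second", 1 / (365 * 24 * 3600))]:
    print(f"positive {label}: {p_positive(0.15, 0.10, t):.2%}")
```

The same 15% return and 10% volatility produce wildly different experiences at different observation frequencies: almost-certain yearly gains, coin-flip seconds.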
  3. In real life, many events have skewed payoffs. The likelihood of an event can be low while the corresponding payoff (or damage) is much higher. Maximizing the probability of winning does not translate into maximizing the expected payoff.
    • An option seller makes a continuous stream of “small” gains, while an option buyer loses money steadily and makes it back in one shot when a rare event occurs. Selling options delivers frequent psychological kicks; buying them can be the economically better bet.
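A toy illustration of the asymmetry, with made-up numbers (the $10 premium, 1% event probability, and $1,500 payout are hypothetical): the seller wins almost every month yet holds the worse expectation.

```python
# Hypothetical numbers: a seller collects a $10 premium each month;
# a rare event with 1% monthly probability forces a $1,500 payout.
p_event = 0.01
premium, payout = 10.0, 1500.0

# Seller: pockets the premium in quiet months, pays out net of it otherwise.
seller_ev = (1 - p_event) * premium - p_event * (payout - premium)
buyer_ev = -seller_ev  # the buyer is the zero-sum mirror image

print(f"seller wins {1 - p_event:.0%} of months, EV = {seller_ev:+.2f}/month")
print(f"buyer wins {p_event:.0%} of months, EV = {buyer_ev:+.2f}/month")
```

Frequency of winning and expected value point in opposite directions here, which is exactly the trap the note describes.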
  4. Path-dependent outcomes – Computer keyboards are QWERTY because typewriters were QWERTY, and that layout was chosen to slow typists down and keep the machines from jamming. It is not rational to keep it today, but path-dependent outcomes are a cornerstone of life.
    • The same effect explains why Microsoft’s Windows was able to win by creating a network effect around its operating system.
    • The endowment effect is a manifestation of the same phenomenon.
    • A Polya process is a more accurate model of the real world than the “independent events” approach. Economists fail to realize that.
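A minimal Polya-urn sketch of path dependence (the urn size, step count, and seeds are arbitrary): every draw reinforces itself, so identical starting conditions drift to very different long-run shares depending purely on early luck.

```python
import random

def polya_urn(steps, seed=None):
    """Start with one red and one black ball; draw one at random and
    return it along with an extra ball of the same colour.
    Early draws get reinforced, so luck compounds."""
    rng = random.Random(seed)
    red, black = 1, 1
    for _ in range(steps):
        if rng.random() < red / (red + black):
            red += 1
        else:
            black += 1
    return red / (red + black)

# Identical urns, different random histories -> very different outcomes.
shares = [polya_urn(10_000, seed=s) for s in range(5)]
print([round(s, 2) for s in shares])
```

Under independence every run would hover near 0.5; under reinforcement the final share can settle almost anywhere, which is the QWERTY/Windows story in miniature.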


  1. We have two systems of reasoning – one for fast decision making, the other for slow decision making (more details in Thinking, Fast and Slow).
    • One major implication is that ideas do not sink in when emotions come into play.
    • Consumers consider a 75% fat-free hamburger to be different from a 25% fat hamburger. Mathematically, they are the same.
  2. Affect heuristic – the emotions associated with an outcome determine the perceived probability of that outcome in our mind.
  3. Attribution bias – People ascribe their successes to skill and their failures to random unfortunate events.
  4. Simulation heuristic – playing alternative scenarios in one’s head. “I was about to exit the market right before the 2008 crash, I just missed it.”
  5. Hindsight bias – When an incident happens, we believe we knew it was going to happen all along, even though, had we been asked beforehand about our belief in its occurrence, we would not have been that certain. The worst aspect is that it fools us into believing we can predict the future.
    • History appears deterministic even though it is just one realization of the many potential random paths.
    • Unlike hard sciences, one cannot experiment in history.
  6. Survivorship bias implies that the highest-performing realization is the most visible, since losers never speak up (or do not survive).
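A quick simulation of that point (the counts are arbitrary): give 10,000 managers with zero skill a fair-coin year, and pure chance still mints a handful of flawless ten-year track records — the "stars" we end up hearing about.

```python
import random

def perfect_records(n_managers, n_years, seed=0):
    """Count managers whose every year is 'up', where each year is a
    fair coin flip -- i.e. skill plays no role at all."""
    rng = random.Random(seed)
    return sum(
        all(rng.random() < 0.5 for _ in range(n_years))
        for _ in range(n_managers)
    )

# Expected about 10_000 / 2**10 ≈ 10 flawless ten-year records from luck alone.
stars = perfect_records(10_000, 10)
print(stars)
```

The survivors' records look like evidence of skill only because the 9,990 losing coin-flippers are invisible.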
  7. Availability heuristic – We assign probabilities based on how easily instances of an incident can be recalled.
    • People don’t like to insure against something abstract; only vivid risks merit their attention. It is therefore easier to sell travel insurance against terrorist acts than travel insurance against loss of life from any cause.
  8. Representativeness heuristic/Conjunction fallacy – We assign the probability of a person belonging to a particular group based on how similar the person looks to that group.
    • A feminist student is deemed more likely to be a feminist bank teller than a bank teller, even though the latter group is a superset of the former.
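The conjunction rule P(A and B) ≤ P(A) can be checked on any toy population; the 5% and 30% base rates below are made up, and independence is assumed only for simplicity.

```python
import random

# Made-up base rates: 5% of people are bank tellers, 30% are feminists,
# drawn independently for simplicity.
rng = random.Random(42)
people = [(rng.random() < 0.05, rng.random() < 0.30) for _ in range(100_000)]

p_teller = sum(teller for teller, _ in people) / len(people)
p_both = sum(teller and feminist for teller, feminist in people) / len(people)

# Every feminist bank teller is a bank teller, so the conjunction
# can never be the more probable event.
print(f"P(teller) = {p_teller:.3f}, P(feminist and teller) = {p_both:.3f}")
```

No matter how the base rates are chosen, the conjunction count is a subset of the bank-teller count, so the fallacy is visible in one comparison.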
  9. Wittgenstein’s ruler – If you use a ruler whose accuracy you are not confident of to measure a table, you are measuring the ruler as much as you are measuring the table. More formally, unless the source of a statement is a qualified author, the statement reveals more about its author than about the information he intends to convey.
  10. Sandpile effect – “The last straw that broke the camel’s back” is an example of how a linear change can have a non-linear impact on a complex system (causing a collapse).
  11. Firehouse effect – A group of people with a similar mindset, after spending too much time together, can come to conclusions that are ludicrous to an outsider.
  12. Psychologically, the frequency of positive events matters more than their magnitude. A negative event is also more devastating than a positive event of the same magnitude.
  13. The inverse skills problem – The higher up a person is on the corporate ladder, the less repeatable their work, and hence the weaker the actual evidence of their contribution. A person engaged in repeatable work is easy to judge; one engaged in non-repeatable work cannot be, since their results might be a pure manifestation of randomness.
  14. Humans, and even non-humans, see patterns in randomness and develop superstitions around how those patterns can benefit or harm them.


  1. Rational thinking has little to do with risk avoidance; most of it is about rationalizing one’s actions by fitting some logic to them.
  2. Some details of our daily life, like career decisions and investments, can harm us or even threaten our survival; it is good to be rational and scientific about them. Other, more mundane details, like the choice of religion, can remain irrational.
  3. It does not matter how frequently something succeeds if the failure is too costly to bear.
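A few lines make the point concrete (the 99% success rate per round is an arbitrary example): if failure means ruin, repeated exposure drives the probability of survival toward zero no matter how favourable each round looks.

```python
# A bet that succeeds 99% of the time but wipes you out on the 1%:
# the chance of still being solvent after n rounds is 0.99**n.
p_survive_once = 0.99
for n in (10, 100, 1000):
    print(f"survive {n} rounds: {p_survive_once ** n:.2%}")
```

A 99% per-round success rate sounds safe, yet fewer than 4 in 10 players survive 100 rounds, and essentially none survive 1,000 — frequency of success is irrelevant once ruin is on the table.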
  4. Common sense is nothing but a collection of misconceptions acquired by the age of 18.
  5. Depending on the use case, extreme values are either noise or devastating signals. The average temperature (excluding extremes) is great for deciding the next vacation destination, but for climate scientists it is the extremes that matter. Similarly, extremely rare events can bankrupt a company.
  6. As per Karl Popper, there are two types of theories: those that have been falsified and those yet to be falsified. Something that cannot be falsified is not a theory.
  7. More knowledge does not always lead to more information, sometimes, it just leads to a stronger belief in meaningless noise.
  8. “Markets always go up over 20 years” more or less holds, but only a few markets have really survived over time, and it was not obvious without hindsight which ones would survive (Germany, Imperial Russia, and Argentina blew up completely).
  9. Most humans stop when they are satisficed (satisfy + suffice) with a result rather than working toward the optimal outcome. Satisficers are happier; optimizers end up more successful on traditional metrics of success. The causality is not clear, though.
  10. At any given point in the market, the most successful traders are likely to be those best fitted to the latest cycle. This does not apply to dentists, since that profession is more immune to randomness.


  1. People become leaders not because of the skills they possess but because of the superficial impressions they make on others (“charisma”).
  2. When people merely work hard, they lose focus and intellectual energy. Work ethics, however, draw people toward the signal rather than the noise.
  3. Extreme empiricism, an absence of logical structure, and competitiveness can be quite an explosive combination.
  4. Someone’s raw performance and personal wealth can sometimes (but not always) be indicators of their skills.
  5. We learn from mistakes by making them, not by reading or hearing about them. Lessons from history cannot be acquired through reading alone either.
  6. Listening to or reading the current news neither provides one with any predictive ability nor improves one’s knowledge of the current world.
  7. Spontaneous remission can suddenly cure cancer, and the patient might think that whatever pill they consumed in the meanwhile has cancer-killing properties.
  8. Unpredictability of behavior is a deterrent. Sometimes governments have to overreact to small things so that others cannot figure out the precise limits of their tolerance.