I’ve made a huge mistake — the dangers of ignoring expected value

Written by samiron.ray | Published 2015/10/07
Tech Story Tags: startup | probability | risk | investing | history


For fans of the show Arrested Development, the character Gob provides endless comedic relief as he bumbles through life and his magic tricks, ahem, illusions, with enthusiasm and idiocy. Each time he makes the same kind of mistake, he utters a deadpan “I’ve made a huge mistake.”

After making a decision that turned out poorly, it’s tempting for us to take after Gob and lament our mistakes. But this can be short-sighted. We often confuse realized value with expected value and learn the wrong lessons from our decisions. This confusion has critical implications for decision-making in investing, diplomacy, and even daily life.

For the purposes of our discussion, realized value signifies the end result, positive or negative, of how a decision actually turned out in real life after it was made. Expected value signifies the probability-weighted average value of a decision at the time it was made, assuming long-run repetitions of the decision.

To illustrate with a very simple example, let’s say your friend offers you the following bet on a single flip of a fair coin. If the coin lands on heads, your friend pays you $2. If the coin lands on tails, you pay your friend $1. The expected value of this simple bet is (0.5 × $2) + (0.5 × −$1) = $0.50. Intuitively and mathematically, this is a good bet for you.

If you take the bet and the coin lands on tails, you might lament your decision. But as you know, this is a foolhardy response, since in this particular case the realized value simply happened to fall short of the expected value of the decision.

Let’s flip the terms of the bet. You now have to pay your friend $2 if the coin lands on heads, but you get paid $1 if the coin lands on tails. The expected value of the bet is (0.5 × −$2) + (0.5 × $1) = −$0.50. If you get lucky and the coin lands on tails, you don’t deserve to celebrate your win here, because your initial decision to take the bet was misguided.
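To make the “long-run repetitions” framing concrete, here is a minimal Python sketch (the function name and trial count are my own, not from the article) that simulates both bets many times. Any single flip can go either way, but the average payoff converges on the expected value.

```python
import random

def simulate_bet(heads_payoff, tails_payoff, trials=100_000):
    """Play repeated fair-coin bets and return the average payoff per flip."""
    total = 0.0
    for _ in range(trials):
        if random.random() < 0.5:   # heads
            total += heads_payoff
        else:                       # tails
            total += tails_payoff
    return total / trials

# The favorable bet: win $2 on heads, lose $1 on tails.
print(simulate_bet(2, -1))   # converges on +0.50

# The flipped bet: lose $2 on heads, win $1 on tails.
print(simulate_bet(-2, 1))   # converges on -0.50
```

Run it a few times: individual flips swing between wins and losses, but the per-flip average settles near the expected values computed above.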

This simple example is representative of the confusion around many decisions in investing, politics, and life.

Take the decision to invest or not in any given startup, for example. Many Silicon Valley VCs lament turning down seed-stage opportunities once whatever hot startup they initially passed on has finally achieved success years later. But this line of reasoning can be fallacious.

What if the successful startup they initially turned down had a low expected chance of success due to factors like unlikely FDA approval or high capital intensity?

If we run “startup history” a hundred times, analogous to repeatedly taking the coin-flip bets above, perhaps this particular startup succeeds massively only once, thanks to eleventh-hour luck, and fails the other ninety-nine times. A rough sketch of this thought experiment appears below.

One might recall the last-minute government loans Tesla received in 2010 to save it from the throes of bankruptcy as a relevant deus ex machina, albeit with Tesla starting out of the gate with an unusually high expected chance of success because of Elon Musk.

The investor who passed on an opportunity for good reasons should not be so quick to regret their decision, then. In the other ninety-nine possible worlds, they may have lost their money. The key thing for them to examine is the quality and thoroughness of their decision-making process.
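To put rough numbers on the “hundred histories” thought experiment, here is a hedged Python sketch. The figures (a 1% chance of a 50x return, a 99% chance of a total loss) are invented for illustration and are not from the article.

```python
import random

# Hypothetical figures, purely for illustration: a 1% chance the startup
# returns 50x the investment, and a 99% chance it goes to zero.
p_success = 0.01
success_multiple = 50.0
failure_multiple = 0.0

# Analytic expected value per $1 invested (gross return).
ev = p_success * success_multiple + (1 - p_success) * failure_multiple
print(f"Expected return per $1 invested: ${ev:.2f}")  # $0.50 back, a net loss

# "Run startup history" 100 times: in most runs, the company simply fails.
histories = [success_multiple if random.random() < p_success else failure_multiple
             for _ in range(100)]
print(f"Successes across 100 alternate histories: {sum(h > 0 for h in histories)}")
```

In the rare history where the startup succeeds, the investor who passed looks foolish in hindsight, yet under these assumptions passing was the positive-expected-value decision. That is exactly the realized-versus-expected confusion the coin flips illustrate.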

Similar confusion of expected value with realized value often affects our evaluations of major historical events. The relatively bloodless confrontation between the US and the Soviet Union during the Cold War is often held up as a good example of the theory of mutually assured destruction (“MAD”) working well in practice, given the lack of a major nuclear conflict between the two countries.

However, this analysis overlooks the negative expected value of the MAD policy due to the chance of catastrophic miscommunication. What if we’re living in the one possible world out of ten in which the US and the Soviet Union didn’t annihilate each other completely with nuclear weapons? In fact, I think that we are. This list of nuclear close calls and the story of the “man who saved the world” should be enough to give us pause.

Evaluations of our own life decisions can suffer from analogous fuzzy thinking. While it’s tempting to judge ourselves on outcomes, this can be unwise on both psychological and epistemic levels. Outcomes are a relevant data point, but a focus on outcomes can both paralyze our performance and obscure the truth about why things turned out as they did.

So how should we evaluate our decisions, then? A good start is to focus on the decision-making process rather than just the outcome, and to evaluate the quality, thoroughness, and preparation behind the process itself. More to come in later articles on how best to do this, but this book on decision making and this book on checklists can provide initial inspiration.

