Bayes’ Theorem. I never understood this at school. But, now I do. And I am taken aback by its importance in life and investing. Simple, yet profound.

Bayes’ theorem basically helps us to update our current beliefs based on new information.

Let’s start with an example:

- Imagine that Mr.A is *diagnosed* with cancer.
- Assume that tests to detect cancer are 99% accurate.

**Question:** What is the probability that Mr.A *actually has* cancer?

Your first answer would be “99%” – after all, the test is 99% accurate, right?

Wrong!

Why?

- Let’s look at some more information: cancer affects 0.1% of the population (just an assumption). That is, out of 1000 people, only one person will get cancer. So, 999 people do not have cancer.
- Since the test is 99% accurate, 1% of the 999 people who do not have cancer – about 10 people – will receive a wrong diagnosis. That is, 10 of the 999 people will receive a diagnosis that they have cancer, though they don’t have it.
- In total, 1 person (who has cancer) and 10 other people (who do not have cancer) will receive positive test results (Total=11).
- Out of the total 11, only one person has cancer. This means that there is only a 9% chance (1/11) that Mr.A has cancer, despite getting a positive test result.
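The arithmetic above can be sketched in a few lines of Python, using the same assumed numbers (0.1% base rate, 99% test accuracy):

```python
# Worked example: probability of cancer given a positive test result.
population = 1000
base_rate = 0.001   # cancer affects 0.1% of the population (assumption)
accuracy = 0.99     # the test is 99% accurate

sick = population * base_rate                # 1 person actually has cancer
healthy = population - sick                  # 999 people do not

true_positives = sick * accuracy             # roughly 1 correct positive result
false_positives = healthy * (1 - accuracy)   # roughly 10 wrong positive results

# Correct positives divided by all positives, correct or not
p_cancer_given_positive = true_positives / (true_positives + false_positives)
print(round(p_cancer_given_positive, 2))     # about 0.09, i.e. 9%
```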

Got it?

Let’s figure this out again, using a tree diagram (this helps me a lot to figure out probabilities):

In the below diagram, only the green cells are the correct test results (99%), while the red ones are wrong (1%).

Remember that Mr.A got a positive test result. Now, we are trying to determine the probability that Mr.A has cancer, after getting a positive test result. To find this, divide the “correct positive test results” by the “total positive test results”. In the below tree diagram, the “total positive test results” are indicated by the thick orange lines:

Let me use absolute numbers instead of percentages:

Let’s now do the math:

Probability that Mr.A has cancer

= Probability of getting correct positive test results / Probability of getting positive test results, be it correct or incorrect

= 1/(1+10)

**= 9%**

Want a formula to calculate it easily? Here it is (Bayes’ theorem):

P(A|B) = P(B|A) × P(A) / P(B)

*where,*

A means: Having cancer

B means: Testing positive

Mr. A doesn’t have too much to worry about! There is only a 9% chance that he actually has cancer, despite receiving a positive test result. We initially thought it was 99%, but it is actually just 9%.
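The same 9% falls straight out of the formula. Here is a minimal sketch (the function name and parameter names are mine, not from the article); P(B) is expanded over both ways of testing positive:

```python
def bayes(prior, likelihood, false_positive_rate):
    """P(A|B) = P(B|A) * P(A) / P(B).

    prior               = P(A),   the base rate
    likelihood          = P(B|A), chance of a positive test if you have cancer
    false_positive_rate = P(B|not A), chance of a positive test if you don't
    """
    # P(B): all the ways a positive result can happen
    p_b = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / p_b

# A = having cancer, B = testing positive
posterior = bayes(prior=0.001, likelihood=0.99, false_positive_rate=0.01)
print(round(posterior, 3))  # about 0.09, matching the tree diagram
```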

Let’s get back to theory now (Oh, please don’t sleep.. Stay with me!):

As I said in the beginning, Bayes theorem is all about updating **current beliefs** based on **new information**. Let’s split this into two:

- **Current beliefs:** These are called priors, or base rates. They are our assessment of the probability of an event, based on statistical evidence.
- **New information:** We should adjust our current beliefs based on new information.

Going back to our example – the reason why we all shouted “99%” is that:

- We neglected the “**priors**”. This is also called “**base rate neglect**”. Here, the base rate is that only 0.1% of the population will have cancer. True, it was not given in the question, but that does not prevent us from seeking it. Always ask: “What is the base rate of this happening?”
- We did not **update the priors based on new information**. Here, we should have adjusted for the probability that some of those who received positive results might not actually have cancer. It is very important that you assess the strength or validity of new information before updating your beliefs or base rates.

**Think about base rates; Update with new information**

In his phenomenally popular book “**Thinking Fast and Slow**“, Daniel Kahneman uses the terms “Outside view and Inside view”. “Outside view” refers to the base rate, while “Inside view” refers to the specifics of the situation (or new information).

Philip Tetlock explains this in his famous book “**Superforecasting**“:

Consider a family in the U.S. – husband, wife, a five-year-old kid, and the husband’s mother. What is the probability that they have a pet?

Ordinary mortals like us will answer: “Oh, definitely they will have a pet. Because the kid does not have any siblings to play with, isn’t it?”.

Superforecasters think differently. Their first question will be: “how many households in the U.S. have pets?”. This is the base rate, from which they start answering. Next, the base rate is adjusted for the uniqueness of the family situation.

In short, we are biased towards the narrative, rather than looking at hard numbers.

Humans also have a tendency to stick to initial beliefs, and adjust too little even if new information is strong enough to change it. This is anchoring and adjustment bias. In “Superforecasting”, Tetlock noted that superforecasters continuously assessed the strength of new information, and adjusted their beliefs if needed.
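This continuous updating has a natural expression in Bayes’ theorem: today’s posterior becomes tomorrow’s prior. A small sketch, assuming three independent positive test results from our earlier example (the numbers are the same assumed base rate and accuracy):

```python
def update(prior, likelihood, false_positive_rate):
    """One Bayesian update: the posterior from this round
    becomes the prior for the next round."""
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

belief = 0.001  # start from the base rate, not from the narrative
for test_number in range(1, 4):  # three positive results in a row (assumption)
    belief = update(belief, likelihood=0.99, false_positive_rate=0.01)
    print(test_number, round(belief, 4))
```

Note how the first positive test only moves the belief to about 9%, but each further confirmation moves it sharply higher: neither ignoring new evidence (anchoring) nor ignoring the base rate survives repeated updating.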

**Applicability in investing**

Say, you are looking at a company in the airline industry. What question will you ask first?

Exactly! “What is the **base rate** of success in this business?”

Then, we will ask: “Considering the specific situations of this company, how much should I adjust the base rate, to arrive at the probability of success of this company?”.

**Conclusion**

To conclude, let us remember the key lessons here:

- Always think in terms of probabilities
- Ask: what is the base rate here?
- Study whether the new information is strong enough to update current beliefs. If yes, update the base rates.

**Additional materials on Bayes’ Theorem **

**Julia Galef** on A visual guide to Bayesian thinking | Think Rationally via Bayes’ Rule

**CIA:** Bayes’ Theorem for Intelligence Analysis
