Psychology of Intelligence Analysis Annotated: Biases

We are always on a journey to better design our job. That has been my mood during the current Brazilian crisis. I (almost) always come down to a few qualitative key issues that deter me from further exploring investment ideas, many of them related to people and incentives. The Brazilian government is no exception: Brazil is like a company with a critical governance problem. But I am not that into politics, so let the media, politicians, consultants and lobbyists do their jobs.

However, today I wanted to discuss the last topic of the book Psychology of Intelligence Analysis by Richard Heuer, namely biases. I consider it an interesting subject because its very definition is already curious: if anyone can be biased, it is because neutrality must exist somewhere, somehow. Yet, by that standard, there is no truly “neutral” behavior, and that’s why I find the concept ingenious. It is what Lee Ross, a psychologist at Stanford University, calls “naive realism”: the conviction that we see the world as it truly is, without bias or error.

Anyhow, the best discussion of the topic I have stumbled upon so far is Charlie Munger’s classic lecture, The Psychology of Human Misjudgement, which I strive to reread yearly. As time goes by, it gets easier to grasp its thoughtful concepts and internalize them. If you prefer to listen rather than read, you can find the full lecture right below.

In Heuer’s book, however, there is a list of interesting behaviors the CIA has learned to watch for along the way; simple little tricks that clear the path to sharper thinking:

Biases

  • Specifying in advance what would cause you to change your mind will also make it more difficult for you to rationalize such developments, if they occur, as not really requiring any modification of your judgment
  • Statistical data, in particular, lack the rich and concrete detail to evoke vivid images, and they are often overlooked, ignored or minimized
  • Consistency can also be deceptive
  • People retain residual inaccurate information even after being told it is inaccurate
  • Be cautious with causation. Remember variance and/or randomness exist
  • The tendency to reason according to similarity of cause and effect is frequently found in conjunction with the bias toward inferring centralized direction. Together, they explain the persuasiveness of conspiracy theories
  • People have better intuitive understanding of odds than of percentages
  • Base-rate fallacy: numerical data are ignored unless they illuminate causality
  • People tend to underestimate both how much they learn from new information and the extent to which new information permits them to make correct judgments with greater confidence
  • Fighting hindsight: if the outcome had been the opposite, would I have been surprised, based on the previous report?
  • Consciously avoid any prior judgment as a starting point
  • The act of constructing a detailed scenario for a possible future event makes that event more readily imaginable and, therefore, increases its perceived probability
  • Evidence that conflicts with expectations or theory is unlikely to be given great weight; it tends to be dismissed as unreliable, erroneous, unrepresentative, or the product of contaminating third-variable influences
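The base-rate fallacy in the list above can be made concrete with the classic taxicab problem (a standard textbook illustration with hypothetical numbers, not taken verbatim from Heuer's list): a witness who is right 80% of the time says a cab involved in an accident was blue, but only 15% of the city's cabs are blue. Most people anchor on the 80% reliability and ignore the base rate; Bayes' rule says otherwise.

```python
# Base-rate fallacy, illustrated with the classic taxicab problem
# (hypothetical numbers): 85% of cabs are green, 15% are blue, and a
# witness identifies colors correctly 80% of the time. The witness
# says the cab was blue. How likely is it that it actually was?

p_blue = 0.15        # base rate: prior probability the cab is blue
p_green = 0.85       # prior probability the cab is green
p_correct = 0.80     # witness reliability

# Total probability the witness says "blue": correct on a blue cab,
# or mistaken on a green cab.
p_said_blue = p_correct * p_blue + (1 - p_correct) * p_green

# Bayes' rule: P(blue | witness says "blue")
posterior = p_correct * p_blue / p_said_blue

print(round(posterior, 3))  # 0.414 -- far below the witness's 80% reliability
```

Despite the seemingly reliable witness, the low base rate drags the posterior down to roughly 41%: the numerical base rate gets ignored precisely because it does not "illuminate causality" the way an eyewitness account does.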

Psychology of Intelligence Analysis Annotated: Linchpin Assumptions

This is the fourth in a series of posts in which I try to lay out the most relevant lessons from the book Psychology of Intelligence Analysis by Richard Heuer.

In my opinion, the next topic lies at the core of investment cases. It is not the investment thesis per se but a subset of it: linchpin assumptions are a company’s core value drivers. These are the KPIs that executives and investors should understand and follow closely. For investors, those KPIs are mostly lagging indicators; executives work with both leading and lagging indicators. Internally, middle management needs to act on the leading KPIs that should yield the desired outcome for investors and the company-wide goals set by executives (lagging indicators).

For instance, take a retailer. Gross margin is the lagging indicator. The leading indicator could be the share of sales made at a markdown. And the leading indicator of that leading indicator could be a new COO who redesigns the supply chain or even basic store processes. Many analysts consider gross margin to be the value driver. To be right before Mr. Market, you have to decipher that the COO is the value driver.
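The lagging/leading link in the retailer example can be sketched numerically. This is a toy illustration with made-up numbers, not data from any real retailer: assume full-price sales carry a 50% gross margin and marked-down sales a 20% gross margin, so the markdown share of sales (leading) mechanically drives the blended gross margin (lagging).

```python
# Hypothetical numbers: a retailer earns a 50% gross margin at full
# price and a 20% gross margin on marked-down items. The leading
# indicator (markdown share of sales) mechanically drives the lagging
# indicator (blended gross margin) that analysts watch.

FULL_PRICE_MARGIN = 0.50
MARKDOWN_MARGIN = 0.20

def blended_gross_margin(markdown_share: float) -> float:
    """Blended gross margin as a function of the markdown share of sales."""
    return (1 - markdown_share) * FULL_PRICE_MARGIN + markdown_share * MARKDOWN_MARGIN

print(round(blended_gross_margin(0.10), 3))  # 0.47 with 10% of sales marked down
print(round(blended_gross_margin(0.30), 3))  # 0.41 -- the lagging KPI deteriorates
```

By the time the blended margin prints 41% in the quarterly report, the markdown share had already told the story; a COO who fixes the supply chain acts one step further upstream still.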

Linchpin Assumptions

  • “Analysts actually use much less of the available information than they think they do.”
  • “People’s mental models are simpler than they think, and the analyst is typically unaware not only of which variables should have the greatest influence, but also which variables actually are having the greatest influence.”
  • “When analysis turns out to be wrong, it is often because of key assumptions that went unchallenged and proved invalid.”

It’s easy to get lost in too many pieces of information. One must be sharp in separating what matters from a fundamental-value perspective (what will eventually be translated into the stock price) from pure data garbage. You must pick one, two, no more than three value drivers for your investment case. If you haven’t identified them yet, or simply have too many of them, maybe you still don’t have an investment case. From previous posts, remember to make them visible and challenge them. Try to disprove your hypothesis and formulate new ones with their respective value drivers.