We are always on a journey to better design our job. That has been my mood during the current Brazilian crisis. I (almost) always come back to key qualitative issues that deter me from further exploring investment ideas, many of them related to people and incentives. The Brazilian government is no exception: Brazil is a company with a critical governance problem. But I am not that into politics, so let the media, politicians, consultants and lobbyists do their jobs.
Today, however, I wanted to discuss the last topic of the book Psychology of Intelligence Analysis by Richard Heuer, namely biases. I find the subject interesting because its very definition is curious: if anyone has a bias, then neutrality must exist somewhere, somehow. But there is no truly “neutral” behavior, and that is what makes the idea ingenious. It is what Lee Ross, a psychologist at Stanford University, calls “naive realism”: the conviction that we see the world as it truly is, without bias or error.
Anyhow, the best treatment of the topic I have stumbled upon so far is Charlie Munger’s classic lecture The Psychology of Human Misjudgment, which I strive to reread yearly. As time goes by, it gets easier to grasp its thoughtful concepts and internalize them. If you prefer listening to reading, you can find the full lecture right below.
In Heuer’s book, however, there’s a list of interesting behaviors the CIA has learned about along the way; simple little tricks that clear the path to clearer thinking:
- Specifying in advance what would cause you to change your mind also makes it more difficult, should such developments occur, to rationalize them as not really requiring any modification of your judgment
- Statistical data, in particular, lack the rich and concrete detail to evoke vivid images, and they are often overlooked, ignored or minimized
- Consistency can also be deceptive
- People retain residual inaccurate information even after being told it is inaccurate
- Be cautious with causation. Remember variance and/or randomness exist
- The tendency to reason according to similarity of cause and effect is frequently found in conjunction with the bias toward inferring centralized direction. Together, they explain the persuasiveness of conspiracy theories
- People have better intuitive understanding of odds than of percentages
- Base-rate fallacy: numerical data are ignored unless they illuminate causality
- People tend to underestimate both how much they learn from new information and the extent to which new information permits them to make correct judgments with greater confidence
- Fighting hindsight: if the opposite had happened, would I be surprised, given the previous report?
- Consciously avoid any prior judgment as a starting point
- The act of constructing a detailed scenario for a possible future event makes that event more readily imaginable and, therefore, increases its perceived probability
- Data at odds with expectations or theory are unlikely to be given great weight; they tend to be dismissed as unreliable, erroneous, unrepresentative, or the product of contaminating third-variable influences
This is the fourth in a series of posts in which I try to lay out the most relevant lessons from the book Psychology of Intelligence Analysis by Richard Heuer.
In my opinion, the next topic lies at the core of investment cases. It is not the investment thesis per se, but a subset of it. Linchpin assumptions are a company’s core value drivers: the KPIs executives and investors should understand and follow closely. For investors, those KPIs are simply lagging indicators; executives work with both leading and lagging indicators. Internally, middle management needs to act on the leading KPIs that should yield the desired outcome for investors and the global goals set by the company’s executives (the lagging indicators).
For instance, take a retailer. Gross margin is the lagging indicator. The leading indicator could be the share of sales made at a markdown. And the leading indicator of that leading indicator could be a new COO who redesigns the supply chain, or even basic store processes. Many analysts consider gross margin to be the value driver. To be right before Mr. Market, you have to decipher that the COO is the value driver.
- “Analysts actually use much less of the available information than they think they do.”
- “People’s mental models are simpler than they think, and the analyst is typically unaware not only of which variables should have the greatest influence, but also which variables actually are having the greatest influence.”
- “When analysis turns out to be wrong, it is often because of key assumptions that went unchallenged and proved invalid.”
It’s easy to get lost among too many pieces of information. One must be sharp, separating what matters from a fundamental-value perspective (what will eventually be translated into the stock price) from pure data garbage. Pick one, two, no more than three value drivers for your investment case. If you haven’t identified them yet, or you have too many of them, maybe you still don’t have an investment case. From previous posts, remember to make them visible and challenge them. Try to disprove your hypothesis and formulate new ones, with their respective value drivers.
This is the third in a series of posts in which I try to lay out the most relevant lessons from the book Psychology of Intelligence Analysis by Richard Heuer.
I believe this topic, perspective, or better yet, peripheral vision, is one of the trickiest for analysts. It’s easy to literally get lost, as research per se is simply never-ending. The “data collection” pitfall can play a big role in security analysis.
Perspective or Peripheral Vision
- “By studying similar phenomena in many countries, one can generate and evaluate hypotheses concerning root causes that may not even be considered by an analyst who is dealing only with the logic of a single situation.”
- There are many market segmentation exercises that may yield interesting questions: geographic, demographic, social, ethnic, economic, political, etc. This introduces the concept of horizontality into business-environment analysis. Conclusions here are dangerous, since causality is difficult to fathom;
- “Historical analogy often precedes, rather than follows, a careful analysis of the situation. The most productive use of comparative analysis is to suggest hypotheses and to highlight differences, not to draw conclusions.”
- Verticality: understanding how a specific firm and its sector weathered prior crises and demand peaks is a great expectation calibrator. A little history, coupled with sector-specific operational know-how, covers the analysis from this particular angle;
- “When faced with an analytical problem, people are either unable or simply do not take the time to identify the full range of potential answers.”
- Combining horizontality and verticality, with a little luck and a sharp mind, you can build a holistic view of the subject company. After spending a good amount of time imagining possible scenarios and their impact on the key value drivers, you are likely to have an investment-case-oriented research project;
Additionally, mind that company PR/IR departments exist for a couple of reasons. One is to serve as the regular communication channel with investors. Another crucial part of their job description is storytelling: the story the company tells investors during meetings. Beware: if you do not own the framework, you become susceptible to whatever story you are told.
The bottom line is that extensive sweeping for comparisons propels better questioning, enabling one to ask sharper questions and to ponder new possibilities that previously weren’t even considered. For that, peripheral vision is required.
I recently finished reading Psychology of Intelligence Analysis by Richard Heuer, a compilation of declassified articles from the CIA’s Center for the Study of Intelligence, prepared for intelligence analysts and CIA directors. My previous post, on the analyst’s job, opened a series in which I try to lay out the most relevant lessons from the book.
Moving forward, one must grasp the whole to notice the missing parts. If the whole weren’t visible, you likely wouldn’t have noticed the constellations drawn in the picture above. Once they are noticed, they become obvious. And that’s what I wanted to talk about today.
Making Thinking Visible
- “Intelligence analysts should be self-conscious about their reasoning process.”
- “The analyst operates from a set of assumptions about human nature and what drives people and groups. Those are likely to remain implicit in the analysis.”
- “The question is not whether one’s prior assumptions and expectations influence analysis, but only whether these influences are made explicit or remain implicit.”
- “One’s attention tends to focus on what is reported rather than what is NOT reported. It requires a conscious effort to think about what is missing but should be present if a given hypothesis were true.”
- “Assumptions are fine as long as they are made explicit in your analysis and you analyze the sensitivity of your conclusions to those assumptions.”
All of those excerpts from the book point in a single direction, which is the name of the topic I picked: make things visible (explicit). That should be the mantra. Acknowledging it is the first and most important step, but some tools can help make it a routine.
Writing is a topic I have already discussed here and one I consider of the utmost importance. Sharing research material with colleagues in advance enriches the discussion and gives other people time to criticize your thinking. Colleagues should not only read the material before the meeting, but also look for complementary material that could reinforce or invalidate preconceived hypotheses. The ultimate step would be for the whole company to have a single investment thesis documenting the team’s work on the subject company, not the lead analyst’s view of it.
Finally, when reviewing the material, always use two questions to check the rationale behind every assumption: “Is it? If so, why is it?” These are brilliantly simple questions, capable of enabling information sharing among team members and even of debunking pseudo-robust hypotheses.
There’s much more to develop on this topic, but I believe these are a good first step in a profitable direction.
“Seeing should not always be believing.”
What entertains me the most is learning and the journey of discovery. Sometimes you are able to assemble the puzzle; sometimes you have to live with no answers and simply deal with it. And that’s why I find the topic of intelligence analysis so interesting. I have recently read Psychology of Intelligence Analysis by Richard Heuer, a compilation of declassified articles from the CIA’s Center for the Study of Intelligence, prepared for intelligence analysts and CIA directors. I intend to write a series of short posts containing what I deem most relevant from the book, categorizing topics and commenting on them to fit my, and hopefully our, needs.
The Analyst Job
- “The human mind has limitations dealing with ambiguous information, multiple players and fluid circumstances”
- This is what the real world is all about. Acknowledge it and be humble. Consider improbable hypotheses. Deconstruct boundaries. Fathom scenarios.
- “We must battle against bureaucratic and ideological biases. The other guy most likely has a different cultural background, life premises and values. People’s built-in systems are not the same. One must understand people’s values and assumptions, and even their misperceptions and misunderstandings.”
- Remember that companies are actually groups of people working towards theoretical global goals and many individual goals. Consider each key member and their respective background. Motivations are crucial. Mind that they are not explicit most of the time.
- “It always involves an analytical leap, from the known to the uncertain. And still, you are not going to be certain. It’s a matter of odds and sense-making. The intelligence analyst’s function might be described as transcending the limits of incomplete information through the exercise of analytical judgment”
- Some questions are simple puzzles; others lead analysts to mysteries. Grasp that not even executives know what lies down the road for their companies. Be skeptical and proceed with caution.
- “When dealing with a new and unfamiliar subject, the uncritical and relatively non-selective accumulation and review of information is an appropriate first step. But this is a process of absorbing information, not analyzing it. Analysis begins when the analyst consciously inserts himself into the process to select, sort and organize information.”
- It’s usually more interesting to begin by collecting unbiased information rather than biased information (company filings vis-à-vis sell-side reports). The extensive work of number-crunching, checking people’s backgrounds and reading competitor-related articles is exploratory. One starts creating value when one begins to think, i.e., lays down an opinion in a summary (the investment case).
- “Major intelligence failures are usually caused by failures of analysis not collection. Relevant information is discounted, misinterpreted, ignored, rejected or overlooked because it fails to fit a prevailing mental model.”
- We must see the whole to notice the missing parts. Additionally, we must recognize the linchpin pieces of what could be a potential investment. No doubt this is easier said than done. Experience helps a lot here. History too. Shortcuts? Read, read, read…
- “Analysts will often find, to their surprise, that they can construct a quite plausible scenario for an event they had previously thought unlikely.”
- This could be translated as “Hey, please take a step back and think it over.” Second-level thinking helps a lot, i.e. extensive sense-making.
- “When one recognizes the importance of proceeding by eliminating rather than confirming hypotheses, it becomes apparent that any written argument for a certain judgment is incomplete unless it also discusses alternative judgments that were considered and why they were rejected.”
- This trick helps a lot in dealing with confirmation bias. Being extensive in your preliminary research helps eliminate biased theses early. Asking “what if” helps tremendously when fathoming investment cases.
- “What is difficult to find, and is most significant when found, is hard evidence that is clearly inconsistent with a reasonable hypothesis.”
- One can never be too certain.
“The historian uses imagination to construct a coherent story out of fragments of data.” – Richard Heuer
Analogies between chess and investing are somewhat frequent. What they usually fail to address is that chess is a closed system (masters memorize thousands of game set-ups, like an algorithm, so they can outplay opponents), while analysts and investors are required to constantly assess an ever-changing business environment. Taking a step back, sometimes even the definition of the business environment is not that clear-cut. At the end of the day, investing is a matter of perspective and framing, not data collection and processing per se.
Additionally, it’s worth emphasizing what Heuer put brilliantly when he addressed the downside of mental models: they are the principal source of inertia in recognizing and adapting to a changing environment.
There is a crucial difference between the chess master and the master intelligence analyst. Although the chess master faces a different opponent in each match, the environment in which each contest takes place remains stable and unchanging: the permissible moves of the diverse pieces are rigidly determined, and the rules cannot be changed without the master’s knowledge. Once the chess master develops an accurate schema, there is no need to change it. The intelligence analyst, however, must cope with a rapidly changing world. (…) Schemata that were valid yesterday may no longer be functional tomorrow.
Learning new schemata often requires the unlearning of existing ones, and this is exceedingly difficult. It is always easier to learn a new habit than to unlearn an old one. Schemata in long-term memory that are so essential to effective analysis are also the principal source of inertia in recognizing and adapting to a changing environment.
From Psychology of Intelligence Analysis, by Richard Heuer
Imagine you own the only (and irreplicable) machine in the world that manufactures product X. Demand for X is virtually infinite, so you know almost all your inventory will be sold with little effort. And in cash. As the product is in such demand, you don’t even need a salesforce to sell it. But the best is yet to come: input costs are nearly zero because you own a vertically integrated business. Ah, and there are no taxes, as you are rebuilding the world and no government has been structured so far. Isn’t it the perfect business? A virtually 100% net margin business with no working capital and little PP&E. Indeed. Yet, when valuing it, I bet you will not get much above the risk-free rate (Rf). Check the example below:
For the sake of simplicity, I have assumed you haven’t exerted your monopoly pricing power (a sustainable price increase of X% per year would still work for the price-tagging example; we just need a fixed number). Suppose the available money base does not support price hikes, or simply that people do not accept prices higher than 1, otherwise they come and destroy your dream machine.
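The original example table is not reproduced here, but the arithmetic behind it can be sketched as a simple perpetuity. The inputs below are my own assumptions (an annual free cash flow of 85 and a 10% risk-free rate), chosen only so the result matches the 850 price tag that the argument turns on:

```python
# Minimal sketch of valuing the "dream machine" as a level perpetuity.
# The inputs are illustrative assumptions, not the post's original table:
# a free cash flow of 85 per year and a 10% risk-free discount rate.

def perpetuity_value(fcf: float, discount_rate: float) -> float:
    """Present value of a level perpetual cash flow: PV = FCF / r."""
    if discount_rate <= 0:
        raise ValueError("discount rate must be positive")
    return fcf / discount_rate

fcf = 85.0  # assumed annual free cash flow (100% net margin, no reinvestment)
rf = 0.10   # assumed risk-free rate, used as the discount rate
print(perpetuity_value(fcf, rf))  # 850.0
```

With zero growth, the value is just the cash flow capitalized at the discount rate; since nothing in the business can improve (costs are already zero), the discount rate is the only lever left.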
So here is the key issue: how could a potential buyer agree to pay more than the 850 price tag for your company? He would have to either:
- Increase free cash flow: (a) by under-investing in maintenance of the machine, which would hurt its future free cash flow; not that smart! Or (b) by being fortunate enough to replicate the irreplicable dream machine, but that would require R&D investment and expansion capex which cannot be estimated today, so we will ignore it. Moreover, input costs are already zero, so he wouldn’t benefit from scale; he would merely increase the available cash at the end of the period
- Accept a lower rate of return: pretty much what we are seeing in today’s global equity markets
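To make the second option concrete: whoever pays more than the perpetuity value at Rf is locking in a return below Rf. A minimal sketch, using the same illustrative assumptions as before (a free cash flow of 85 and a 10% risk-free rate):

```python
# Implied perpetuity yield for the marginal buyer of the dream machine.
# The cash flow (85) and the sample prices are illustrative assumptions.

def implied_return(fcf: float, price: float) -> float:
    """Yield the buyer locks in when paying `price` for a level perpetuity."""
    if price <= 0:
        raise ValueError("price must be positive")
    return fcf / price

print(implied_return(85.0, 850.0))   # at the 850 tag: the 10% risk-free rate
print(implied_return(85.0, 1000.0))  # above the tag: 8.5%, below risk-free
```

The design point is simply that price and return are two views of the same fraction: with the cash flow fixed, any price above 850 mechanically pushes the buyer’s return below the risk-free rate.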
So what is the takeaway? Yes, competitive advantages (moats) are what we look for in a company, but the perfect company doesn’t exist, and if it did, it would likely be priced to perfection and thus wouldn’t be a great investment, unless you could manage to increase its value over time. The perfect company has little “investment value”. The marginal buyer can’t bid it up unless he accepts a rate of return lower than the risk-free rate; by the way, why work if you can already earn Rf?! Thus, addressing which moats are in place and how to sustain them in the long run is of the utmost importance, but understanding how a company can expand its existing moats, or even create new ones (i.e., the leading indicators of increased long-term free cash flow), is what analysts should fathom.