How Spies Think
by David Omand
At first glance, agency or professional services businesses don't have too much in common with international intelligence agencies.
For a start we're no good at keeping secrets (in fact it's generally our job to spread information far and wide).
On deeper examination, though, you find that the gathering and analysis of information is similarly at the core of both lines of business. Intelligence agencies have a much more systematic and thoughtful approach to this process, of course — and that means there's much for agencies to learn.
David Omand is the former Director of GCHQ, at the heart of the UK's intelligence gathering and analysis. In How Spies Think, Omand describes the intelligence world's methods of managing information and working out what is going to happen.
This can be applied by agencies in helping clients understand their markets and the wider world, or within the agency itself in planning more strategically.
The Big Idea
Leaders are constantly making decisions. The quality of those decisions is directly proportional to the quality of the information they are able to consider before deciding.
But there is rarely perfect information: “Our knowledge of the world is always fragmentary and incomplete, and is sometimes wrong,” says Omand.
Therefore it is crucial that we have good methods for being able to analyse complex information effectively.
GCHQ and other intelligence agencies have systematic approaches to do this, and Omand shares a framework that he has taught to intelligence analysts: the SEES model.
The SEES model is:
Situational awareness: gaining an understanding of what is happening and what we are faced with.
Explanations and motives: why we are seeing this situation, and what the motivations are of those involved.
Estimates and forecasts: how events may unfold under different assumptions.
Strategic notice: future issues that may come to challenge us in the longer term, based on what may happen here.
Key Points
This is a highly detailed book, summarising the author's learnings from decades working in intelligence. It sets out the frameworks the intelligence agencies use, and their ways of thinking, and also provides real-world examples of their use.
These are the key points, which all fit under sections of the SEES framework outlined above.
Situational Awareness
Omand describes the first part of the SEES model — gaining an understanding of what is happening and what we are faced with.
Understand the limitations of what we know
However much information we have (and intelligence agencies invest vast fortunes in gathering seemingly endless amounts of it), it's never the whole picture.
Our knowledge of the world is always fragmentary and incomplete, and is sometimes wrong
As a consequence, Omand argues, we should actively seek new information on an ongoing basis, and our analysis and beliefs should be constantly evolving as we get new information.
We should subject the information we have to some analysis for quality and confidence:
What is the source of this information?
How credible/reliable is this source usually?
Is this backed up by any other source?
How relevant is this information to what we are working on?
The author recommends being open and honest about the limitations of what we know, especially in public.
Buddhists teach that there are three poisons that cripple the mind — anger, attachment and ignorance.
We have to be careful of how emotions such as anger can distort our perception of what is true and what is false.
Attachment to our old thinking gives us false reassurance and blinds us to changing reality, leading to nasty surprises.
But ignorance is the most dangerous.
Be aware of our own biases
When we receive information that doesn't fit with our existing view of the world, or ‘common sense’ about how things are, it's natural human instinct to try to find reasons to explain away or discard this new information, to preserve our world-view.
Intelligence analysts have a construct of the world based on their experiences and training, leading to assumptions and expectations.
The training manual for CIA officers states, “these constructs strongly influence what information analysts will accept — that is, data that are in accordance with analysts' unconscious mental models are more likely to be perceived and remembered than information that is at odds with them.”
The author says that special caution is needed therefore when the source seems to be showing you what you had most hoped to see.
A lesson from experience (including that of assessing Iraq's holdings of chemical and biological weapons in 2002) is that once we have chosen our favoured explanation we become unconsciously resistant to changing our mind. Conflicting information that arrives is then too easily dismissed as unreliable or ignored as an anomaly.
He recommends starting by identifying the key assumptions you have that might bias you. A good question to ask yourself is, ‘If I believed something different, how much would my weighting and analysis of this information change?’
After Israeli intelligence failed to give warning before the 1973 Yom Kippur War, because they discounted information that did not fit their assumptions, the government set up a 'devil's advocate' group. This was staffed by the best analysts, briefed to take a view contrary to the mainstream intelligence thinking and make the case for it. This ‘red team’ can get direct access to the prime minister if needed, in order to present alternate viewpoints where they strongly disagree with the official line.
Leaders should make it clear they are open to hearing things they don't want to hear, and value alternate views.
(The author provides more detail about biases and how to consider them — you can get further detail about that from our book summary of ‘Thinking, Fast and Slow’ by Daniel Kahneman.)
Work from competing hypotheses
Instead of getting a piece of information and directly deciding what it tells us about what's going to happen (thinking forwards), intelligence agencies take a subtly, but importantly, different approach, Omand tells us.
They first look at the situation they are analysing, and develop a range of hypotheses about what is going on. This needs to be the full range of possible explanations, referred to as exhausting the solution space. Some hypotheses might be further divided into sub-explanations — variations on the main theme.
In the intelligence world this may concern a country's intentions with nuclear materials, say. There could be several reasonable hypotheses:
‘They are working on nuclear weapons,’ or
‘They are working only on nuclear power,’ or
‘They have no nuclear activity.’
Analysts start by assessing, on the basis of what is currently known, the likelihood of each hypothesis being true. This is called the analysis of competing hypotheses.
Then, when new information is received — a leaked report, satellite imagery, reports from a mole in a target organisation, and so on — they measure that information against each hypothesis. At that point, they ask a vital question:
If this hypothesis were true, what is the likelihood that we would be able to see this piece of information?
The answer to that question will then alter your belief in the likelihood of each hypothesis being true.
It's a constant process of adjustment of beliefs in hypotheses — taking the intelligence and working backwards — rather than simply looking at the information and trying to directly figure out what will happen next — working forwards. It's a subtle but important shift.
Whereas most people tend to support the hypothesis that has the most information supporting it, intelligence analysts place favour on the hypothesis that has the least evidence pointing against it.
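The 'working backwards' update described above is, in essence, Bayesian inference: for each hypothesis, multiply your current degree of belief by how likely the new evidence would be if that hypothesis were true, then renormalise. Here is a minimal sketch using the nuclear example; the hypotheses mirror the ones above, but all the numbers are purely illustrative, not from the book.

```python
def update_beliefs(priors, likelihoods):
    """Bayes' rule: multiply each prior belief by P(evidence | hypothesis),
    then renormalise so the beliefs sum to 1."""
    unnormalised = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnormalised.values())
    return {h: v / total for h, v in unnormalised.items()}

# Prior degree of belief in each hypothesis about the nuclear programme.
beliefs = {"weapons": 0.3, "power only": 0.5, "no activity": 0.2}

# New evidence arrives (say, imagery of an enrichment facility). For each
# hypothesis we ask: 'If this hypothesis were true, what is the likelihood
# that we would be able to see this?' (illustrative numbers)
likelihoods = {"weapons": 0.8, "power only": 0.4, "no activity": 0.05}

beliefs = update_beliefs(beliefs, likelihoods)
for h, p in sorted(beliefs.items(), key=lambda kv: -kv[1]):
    print(f"{h}: {p:.2f}")
```

Each new piece of intelligence triggers another pass through the same update, so beliefs evolve continuously rather than being decided once.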
Explanations and Motives
Omand turns now to the second part of the SEES model — why we are seeing this situation, and what the motivations are of those involved.
Answer mysteries rather than just uncover secrets
Intelligence agencies can discover secrets in all sorts of ways — bugging, informers, interception, and so on. Secrets can be found out.
Their main objective, though, is to find the answers to mysteries, and that is more complex. A mystery could be a puzzle about why a particular government is doing certain things, or whether a certain leader will actually honour a commitment.
For this you can often only draw inferences from a range of other information, as you can't get inside people's heads. You have to see things from their point of view.
Looking at what information you have, and what you assess the most likely hypothesis to be, intelligence analysts have to try to explain why. Tell the story behind the hypothesis. Why are people acting as they are in this context? What might they be thinking and planning?
Consider complexity
People are complex. Groups of people even more so. And this is only amplified when competing groups interact.
People don't just act based on what they think and want, and plan to do — they act based on what they think others think and want, and plan to do. As a consequence, as the assumptions become greater on each side, the actions can become more unpredictable.
Additionally, analysts can fall into the trap of drawing a direct line from understanding the situation as it is to predicting what will happen, by assuming trends will continue and assumptions will remain the same.
Instead, analysts need to consider how the multiple variables will interact. This is their explanatory model.
Estimates and Forecasts
Next Omand addresses the third part of the SEES model — how events may unfold under different assumptions.
You're looking at probabilities, not predictions
Intelligence analysts don't like to use the term ‘prediction’, because it seems to imply a level of certainty. Instead they talk about their ‘degree of belief’ in the various hypotheses, an estimate of the probability of being right.
In UK intelligence they keep the scale on which they express this degree of belief purposefully imprecise:
| Percentage Range | Description of Probability ('degree of belief') |
| --- | --- |
| 0-5% | Remote chance |
| 10-20% | Highly unlikely |
| 25-35% | Unlikely |
| 40-50% | Realistic possibility |
| 55-75% | Likely |
| 80-90% | Highly likely |
| 95-100% | Almost certain |
Note the gaps between each step on the scale — it helps avoid the idea of false precision, that a one percentage point change could make the difference. Note also that the bands are not all the same sizes.
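A small sketch of how such a yardstick might be applied in practice. The band edges follow the table above; how to report a value that falls in one of the deliberate gaps is our own assumption (we hedge between the two neighbouring descriptions), not something the book specifies.

```python
# Verbal probability bands, as (low %, high %, description).
BANDS = [
    (0, 5, "Remote chance"),
    (10, 20, "Highly unlikely"),
    (25, 35, "Unlikely"),
    (40, 50, "Realistic possibility"),
    (55, 75, "Likely"),
    (80, 90, "Highly likely"),
    (95, 100, "Almost certain"),
]

def degree_of_belief(percent):
    """Map a numeric probability (in percent) to a verbal description."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    for i, (lo, hi, label) in enumerate(BANDS):
        if lo <= percent <= hi:
            return label
        if percent < lo:
            # Falls in a deliberate gap: hedge between neighbouring bands
            # rather than pretend to one-point precision.
            return f"{BANDS[i - 1][2]} / {label}"

print(degree_of_belief(65))  # → Likely
print(degree_of_belief(7))   # → Remote chance / Highly unlikely
```

The point of the function is what it refuses to do: it never returns a bare number, which keeps the deliberate imprecision of the scale intact.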
Strategic Notice
Lastly, Omand looks at the fourth part of the SEES model — future issues that may come to challenge us in the longer term, based on what may happen here.
Look ahead
Gathering and analysing intelligence is only useful if we then use that to look into the future and try to get a strategic advantage by identifying what might happen. From that we can get an idea of what we can prepare for, in terms of mitigating threats, or of exploiting opportunities before others can take them.
In terms of threats, intelligence officers talk about two kinds of surprises that we ideally want to avoid:
Strategic surprise: when we're not expecting the scenario at all, and are caught completely unprepared. 9/11 was a strategic surprise (which is a bit of an understatement), as there had been no expectation that passenger aircraft would be used for attacks.
Tactical surprise: when we're expecting the scenario, but we just don't know when. The London tube bombings were a tactical surprise, as attacks on public transport had been expected and the alert was high, but nobody knew when or specifically where it would happen. Similarly, the coronavirus pandemic is another example of a tactical surprise — not unexpected, but the specific time or location of the initial outbreak was not known.
Of course, we'd all like no surprise at all, but strategic surprises are usually the most damaging. If we can at least avoid strategic surprises we should be able to understand things that might happen, and prepare for them.
Understand and reduce the risks of potential threats
Once you have a view on the likely situations that your hypotheses may present you can begin to assess the threats to your interests that they involve. It's then important to understand the level of risk that each threat presents.
Intelligence agencies in the UK do this with a simple calculation (not necessarily by using numbers, just by understanding that these are multiplying factors):
Risk = likelihood x vulnerability x impact
Assess each of the threats in this way to identify the different levels of risk.
Then you can adopt a strategy that presents what military planners would describe as a 'layered defence', by taking multiple actions to reduce each part of the equation. As an illustration, the author sets out the UK strategy to reduce the risk of terrorism:
Reduce the likelihood by pursuing terrorist networks
Reduce the likelihood by preventing radicalisation and recruitment
Reduce vulnerability by improving protection of potential targets
Reduce impact by preparing to manage an initial response rapidly and effectively
Reduce impact by preparing to shorten the period of disruption (improving resilience)
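The risk equation and the layered-defence idea above can be sketched in a few lines. The 1-5 scoring scales and the percentage reductions are illustrative assumptions of ours, not figures from the book; the point is only that partial reductions to each factor multiply together.

```python
def risk(likelihood, vulnerability, impact):
    """Risk = likelihood x vulnerability x impact (the factors multiply)."""
    return likelihood * vulnerability * impact

# Score a threat on simple 1-5 scales (illustrative numbers).
baseline = risk(likelihood=4, vulnerability=3, impact=5)

# Layered defence: several partial reductions, one per factor, e.g.
# pursue networks (-25% likelihood), protect targets (-33% vulnerability),
# prepare the response (-20% impact).
mitigated = risk(likelihood=4 * 0.75, vulnerability=3 * 0.67, impact=5 * 0.8)

print(baseline, round(mitigated, 1))  # → 60 24.1
```

Because the factors multiply, modest reductions across all three layers cut the overall risk far more than a large reduction to any single one — which is the rationale for a layered defence.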
Insights for Agency Leaders
Agencies, by their nature, operate in an uncertain environment, yet few take a structured approach to understanding the scenarios presented by emerging risks and opportunities.
For an agency, these scenarios could be about winning a client, a client deciding to put their account up for review, a competitor's strategy, the prospects for an acquisition, a societal/technology change, or some big political/economic event that might impact the agency's market.
Without a structured approach, the agency's planning falls back on gut feel, and on reacting when unexpected things happen. That feels exciting, but doesn't make for a good, sustainable business.
Adopting the SEES model to guide a more considered approach to gathering intelligence about your market and analysing it could help you see further ahead and plan better.
Aside from gaining better intelligence, there is value in having a better picture about what you don't know, where the gaps in your knowledge lie.
You can develop hypotheses of what could happen, and then assess your level of belief in what might happen, based on probability.
Forecasting is not prediction, and the language of probability should be kept deliberately imprecise. There is a learning point here about how to communicate future events (such as schedules, roadmaps, etc.) with clients too.
Once you have a view on things that might happen you can take three categories of active steps to reduce the level of risk: reduce the likelihood of the risk occurring, reduce your vulnerability to a bad outcome, or reduce the impact that your business would suffer if the risk materialised (by preparing both a response and a recovery).
Given that clients often rely on their agencies for strategic input and future direction, having a solid intelligence-led approach could give you a distinct competitive advantage.
From Words to Actions
Here are some ideas from the Convivio team for turning the inspiration from this book, How Spies Think, into action in your agency.
Improve your intelligence sources
You can ensure that a wider range of information is coming across your desk:
What blogs, newsletters, slack groups and social media accounts should you subscribe to? Search for those relevant to your sector, your market, your profession, your craft.
Consider subscribing to a 'serious' paper with international reach (eg the Financial Times, the New York Times, the Economist).
What trade publications (print or online) are there for your market?
It can be worth subscribing to jobs boards/newsletters for your market, simply to see who is hiring for what. That can tell you a lot.
Set up Google Alerts for keywords about your market and profession, such as key client names, competitor names, services and so on.
What meetups and events are good in your market? Make sure your agency has people at them. It's in the social chat afterwards that you pick up valuable information.
Consider how you'll manage the inflow of all this intelligence: aim to scan a high percentage of it, then read a few key things each week. What time do you need to block out, and when?
Develop a process to evaluate intelligence
A lot of the information you read will just be 'of interest' and you can file it away in the back of your brain somewhere. But some will be 'intelligence' — information that is of importance to strategic decision making in your agency.
First you need to figure out what you care about, your issues of interest. These might be wider issues like 'What will happen with the economy over the next year?', or questions very specific to your agency like 'Will client X renew the contract with us?'
Then, for each of those, play out what all the possible answers are, your hypotheses about things that could happen. For the question about Client X, those hypotheses could include 'Yes, with an increase in work', 'Yes, with no change', 'Yes, but with a decrease in work', 'No, account put up for review', 'No, contract ends'. The key is, as the book describes, you want to 'exhaust the solution space'. Write these hypotheses down for each issue of interest.
You can use a Heuer table, named after the CIA analyst who invented the method. This is described in the book, and might look like this for our Client X example:
| Evidence | Source / Credibility / Relevance | Hypothesis 1: Yes, increase | Hypothesis 2: Yes, static | Hypothesis 3: Yes, decrease | Hypothesis 4: No, review | Hypothesis 5: Ends |
| --- | --- | --- | --- | --- | --- | --- |
| High client satisfaction | Client contact / Medium / Medium | High | High | Medium | Low | Low |
| New Marketing Director | Client contact / Medium / High | Low | High | High | Medium | Low |
| New Marketing Director | Job ad / High / High | Low | Low | Medium | High | High |
| Other agency meetings | Rumour / Low / Low | Low | Low | Medium | High | High |
You can see a story emerges from this simple table.
That story can form the basis for a more sensible discussion: at board meetings for higher strategic issues, or at management or account meetings in a case like this. It also helps you get past bias. You might be inclined to believe a client telling you what you want to hear in an email; the picture changes when you add in that the job ad for the new marketing director includes a fair few references to creating an entirely fresh approach and selecting agencies to bring in new eyes and ideas, and that you heard at an agency event that another agency has been invited in for a casual coffee and a ‘get to know you’ session.
From the discussion of this table you'd be likely to update your level of belief in the probability of each hypothesis being true. It might then be wise to start preparing for a review of the account: watching for who is appointed as marketing director and working out what matters to them, building the relationship, and starting to float new ideas with the client. You might also want to start some new business activity that could replace a possible loss of income within the next few months.
Having a methodical, analytical approach will help you anticipate risks and make more strategic decisions.
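To make the 'least evidence against' principle from earlier concrete, here is a short sketch that reads the Client X table above and scores each hypothesis by the weight of evidence inconsistent with it. The ratings mirror the illustrative table; the numeric credibility weights are our own assumption, not a method from the book.

```python
HYPOTHESES = ["Yes, increase", "Yes, static", "Yes, decrease", "No, review", "Ends"]

# Each row: (evidence, source credibility, consistency with each hypothesis).
# 'Low' means the evidence argues against that hypothesis.
TABLE = [
    ("High client satisfaction", "Medium", ["High", "High", "Medium", "Low", "Low"]),
    ("New Marketing Director (client contact)", "Medium", ["Low", "High", "High", "Medium", "Low"]),
    ("New Marketing Director (job ad)", "High", ["Low", "Low", "Medium", "High", "High"]),
    ("Other agency meetings", "Low", ["Low", "Low", "Medium", "High", "High"]),
]

CRED_WEIGHT = {"High": 3, "Medium": 2, "Low": 1}  # illustrative weights

def evidence_against(table):
    """Weighted count of inconsistent ('Low') ratings per hypothesis."""
    scores = {h: 0 for h in HYPOTHESES}
    for _evidence, credibility, ratings in table:
        for h, rating in zip(HYPOTHESES, ratings):
            if rating == "Low":
                scores[h] += CRED_WEIGHT[credibility]
    return scores

scores = evidence_against(TABLE)
# Favour the hypothesis with the LEAST evidence against it,
# not the most evidence in its favour.
best = min(scores, key=scores.get)
print(scores, best)
```

Note that with these toy numbers the middle hypothesis ('Yes, decrease') attracts the least contradicting evidence, because it is broadly compatible with everything. In practice you would also weigh relevance, and treat the output as an input to discussion rather than an answer.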
Actively monitor risks
From your analysis of issues of interest, and from other information you have about your business, you can begin to build up a picture of the risks the agency faces.
At Convivio, we've developed a simple tool to use for tracking risks — the Risk Radar.
The idea is simple:
Have a highly visible display like a radar screen.
Capture risks onto sticky notes.
Move the notes around to reflect movement of risks.
The radar board becomes a prompt for the conversations around proximity and changes in risks.
Think a few steps ahead
The intelligence you have might point to a few things that could happen, and that's where most people stop in their thinking. Smart people do more, though — they take the issues or risks and game out the action a further few steps ahead.
If this thing was to happen, how could we react?
How might our reaction change things?
And how might we react to that?
How might others react?
So, in this scenario above, you might think about how you could get in with, and win over, the new marketing director as soon as they get into the role. You might think about which other agencies might be brought in to chat, and what they would be suggesting to the client. You might set up a 'red team' to make the case for why you should be dropped from the account — in doing so, you identify weaknesses that competitors might highlight, and you're able to fix them or head off the criticism in advance.
Feed the insights into strategic discussions at board meetings
It's key to understand that these methods don't give you clear answers or advice on what next steps to take, just the information to feed into a discussion and decision-making forum.
Intelligence analysts produce what they call 'intelligence product', which we would simply call reports. These are then provided to political leaders to decide what to do.
In your agency, bigger strategic issues should be decided at a reasonably well structured regular board meeting. More operational issues can be addressed at management or other meetings within the business.
For a board meeting, although it feels odd or silly at first in a creative business, it really is worth having a short written report summing up the intelligence you have, the hypotheses you envisage, your degree of belief in each hypothesis, why you think things are happening as they are, and what options the business has to prepare and respond. Have a standard format you all use. Stick to it. You'll be amazed at the increased quality of discussion and decision making that results.
Use the insights for your clients, too
If you do all the work to ensure that your leadership team is better briefed and decisions are made strategically based on intelligence, then how much value could you offer to clients through preparing intelligence reports on some of the issues they face, or their market faces? It'll make you stand out so far from other agencies. It could even become a chargeable service.