Georgetown University Structured Analysis Techniques Discussion

Determining when to use which SAT depends largely on the type of information that you are trying to uncover: the ‘what’ or ‘who’, the ‘why now’ or ‘how’, or the ‘what next’ aspects of your story. It also depends on the type and amount of data that you have, as some techniques work best with large amounts of data, while other techniques are premised upon the existence of contradictory data.

Draw from the materials to identify which technique(s) would be most appropriate for uncovering the key ‘unknowns’ listed below.

  1. Why is Macedonia’s Prime Minister pushing for Macedonia to join the EU?
  2. Who was responsible for the June 2017 attack involving an explosive device on the US Embassy in Kiev, Ukraine?
  3. What effect has the death penalty had on murder rates nationally?
  4. How will ISIL’s tactics in the Middle East change as the size of the territory that they control is reduced?
  5. Why are Apple customers so loyal to the company?
  6. How might al-Qa’ida carry out another large-scale attack in the United States?
  7. How will the full implementation of Saudi Arabia’s economic reform plan—dubbed “Vision 2030”—affect global oil prices?
  8. Who was responsible for the OPM data breach of 2014?
  9. What types of support do Yemen’s Huthi rebels receive from foreign patrons?
  10. Why have homicide rates in El Salvador spiked in recent years?

More Advance Praise for Structured Analytic Techniques for Intelligence Analysis
“Structured Analytic Techniques is an indispensable companion for all analysts and
policy-makers who strive to instill transparency, rigor, and foresight into their
everyday analytic routines, provide sound argumentation for policy decisions, and
avoid surprise. As the authors so vividly point out: Our brain is not always our
friend. Good analysis and foresight mean that we need to constantly keep
questioning ourselves. This book tells us how to do this in a structured and
systematic manner. It is easy to use, practice-oriented, and convinces even sceptics
of the necessity of applying structured techniques. The 55 techniques in this book
will undoubtedly contribute to advancing foresight and critical thinking skills in
Germany’s policy community. Randy Pherson’s extraordinary experience in
teaching SATs has already contributed significantly to the training aspects of our
efforts to enhance strategic foresight at the federal level in Germany.”
—Kathrin Brockmann, Head of the Government Foresight Project at the Berlin-based Stiftung Neue Verantwortung and Analyst at the Futures Analysis Section,
German Armed Forces Planning Office
“A hearty thanks to Heuer and Pherson for writing—and updating—this key work
for the aspiring intelligence analyst—or anyone interested in sound approaches to
analyzing complex, ambiguous information. This was the only textbook we
considered when developing our Intelligence Analysis Techniques course because
it is comprehensive, well-organized, and practical. The second edition has several
valuable improvements that will greatly help both instructors and students. Of
particular note is the discussion of dual-process thinking and cognitive biases and
how structured analytic techniques can help analysts avoid such biases. The second
edition should provide an unsurpassed learning opportunity for our students,
particularly when used in conjunction with Beebe and Pherson’s Case Studies in
Intelligence Analysis.”
—Alan More, Adjunct Professor, Intelligence Studies, George Mason University
“Competitive Intelligence has been struggling with the Information Cycle for more
than two decades. With this book, Richards J. Heuer Jr. and Randolph Pherson are
releasing us from the traditional, and mostly intuitive, methods that go with it.
They lay the foundation for a new approach to intelligence in business if we take
off our blinders and investigate new methods in other fields. It provides an
unprecedented and practical baseline for developing a new culture of information
sharing in intelligence activities writ large.”
—Dr. Pascal Frion, President of Acrie Competitive Intelligence Network and 2013
recipient of the French Competitive Intelligence Academy Award
“Heuer and Pherson have written a book that provides law enforcement
intelligence and crime analysts with numerous techniques to assist in homeland
security and crime prevention. The book is a must read for analysts in the law
enforcement community responsible for analyzing intelligence and crime data.
Analysis of Competing Hypotheses is but one non-traditional example of a tool
that helps them challenge assumptions, identify investigative leads and trends, and
anticipate future developments.”
—Major Jesse McLendon (ret.), North Kansas City Police, North Kansas City, Missouri
“Heuer and Pherson’s Structured Analytic Techniques for Intelligence Analysis has
become a classic in intelligence literature. Already a standard text in numerous
universities and government agencies around the world, the 2nd edition will
continue to be required reading for Denmark’s current and future intelligence
analysts. Its techniques are taught at the University of Copenhagen and the book
represents the core literature for analysis simulation exercises, in which graduate
students at the Department of Political Science practice the art and science of
intelligence analysis under the supervision of senior government intelligence officials.”
—Morten Hansen, Lecturer in intelligence studies, Department of Political
Science, University of Copenhagen
“Heuer and Pherson’s Structured Analytic Techniques is the standard text for
learning how to conduct intelligence analysis. This handbook provides a panoply
of critical thinking methodologies suitable to any issue that intelligence analysts
may encounter. Used by both government practitioners and intelligence studies
students throughout the world, the book’s techniques have redefined critical thinking.”
—Dr. Melissa Graves, Associate Director, Center for Intelligence and Security
Studies, The University of Mississippi
“Heuer and Pherson are the leading practitioners, innovators, and teachers of the
rigorous use of structured analytic techniques. Their work stands out above all
others in explaining and evaluating the utility of such methods that can appreciably
raise the standards of analysis. The methods they present stimulate the imagination,
enhance the rigor, and apply to hard intelligence problems as well as other areas
requiring solid analysis. This new, expanded edition is a must-have resource for
any serious analyst’s daily use as well as one’s professional bookshelf.”
—Roger Z. George and James B. Bruce, adjunct professors, Center for Security
Studies, Georgetown University and co-editors, Analyzing Intelligence: National
Security Practitioners’ Perspectives
“The science of reasoning has grown considerably over the past 40-odd years.
Among the many fascinating aspects of the human intellect is the ability to amplify
our own capabilities by creating analytic tools. The tools in this book are for those
whose profession often requires making judgments based on incomplete and
ambiguous information. You hold in your hands the toolkit for systematic analytic
methods and critical thinking. This is a book you can read and then actually apply
to accomplish something. Like any good toolkit, it has some simple tools that
explain themselves, some that need explanation and guidance, and some that
require considerable practice. This book helps us in our quest to enrich our
expertise and expand our reasoning skill.”
—Robert R. Hoffman, Institute for Human & Machine Cognition
CQ Press, an imprint of SAGE, is the leading publisher of books, periodicals, and
electronic products on American government and international affairs. CQ Press
consistently ranks among the top commercial publishers in terms of quality, as
evidenced by the numerous awards its products have won over the years. CQ Press
owes its existence to Nelson Poynter, former publisher of the St. Petersburg Times,
and his wife Henrietta, with whom he founded Congressional Quarterly in 1945.
Poynter established CQ with the mission of promoting democracy through
education and in 1975 founded the Modern Media Institute, renamed The Poynter
Institute for Media Studies after his death. The Poynter Institute is a nonprofit organization dedicated to training journalists and media leaders.
In 2008, CQ Press was acquired by SAGE, a leading international publisher of
journals, books, and electronic media for academic, educational, and professional
markets. Since 1965, SAGE has helped inform and educate a global community of
scholars, practitioners, researchers, and students spanning a wide range of subject
areas, including business, humanities, social sciences, and science, technology, and
medicine. A privately owned corporation, SAGE has offices in Los Angeles,
London, New Delhi, and Singapore, in addition to the Washington, D.C., office of
CQ Press.
Copyright © 2015 by CQ Press, an Imprint of SAGE Publications, Inc. CQ Press is a registered trademark
of Congressional Quarterly Inc.
All rights reserved. No part of this book may be reproduced or utilized in any form or by any means,
electronic or mechanical, including photocopying, recording, or by any information storage and retrieval
system, without permission in writing from the publisher.
Printed in the United States of America
Library of Congress Cataloging-in-Publication Data
Heuer, Richards J.
Structured analytic techniques for intelligence analysis / by Richards J. Heuer Jr. and Randolph H.
Pherson.—Second Edition.
pages cm
ISBN 978-1-4522-4151-7
1. Intelligence service—United States. 2. Intelligence service—Methodology. I. Pherson, Randolph H. II. Title.
JK468.I6H478 2015
327.12—dc23 2014000255
This book is printed on acid-free paper.
14 15 16 17 18 10 9 8 7 6 5 4 3 2 1
CQ Press
An Imprint of SAGE Publications, Inc.
2455 Teller Road
Thousand Oaks, California 91320
SAGE Publications Ltd.
1 Oliver’s Yard
55 City Road
London EC1Y 1SP
United Kingdom
SAGE Publications India Pvt. Ltd.
B 1/I 1 Mohan Cooperative Industrial Area
Mathura Road, New Delhi 110 044
SAGE Publications Asia-Pacific Pte. Ltd.
3 Church Street
#10-04 Samsung Hub
Singapore 049483
Acquisitions Editors: Sarah Calabi, Charisse Kiino
Editorial Assistant: Davia Grant
Production Editor: David C. Felts
Typesetter: C&M Digitals (P) Ltd.
Copy Editor: Talia Greenberg
Proofreader: Sally Jaskold
Cover Designer: Glenn Vogel
Interior Graphics Designer: Adriana M. Gonzalez
Marketing Manager: Amy Whitaker
Foreword by John McLaughlin
1. Introduction and Overview
1.1 Our Vision
1.2 Two Types of Thinking
1.3 Dealing with Bias
1.4 Role of Structured Analytic Techniques
1.5 Value of Team Analysis
1.6 History of Structured Analytic Techniques
1.7 Selection of Techniques for This Book
1.8 Quick Overview of Chapters
2. Building a System 2 Taxonomy
2.1 Taxonomy of System 2 Methods
2.2 Taxonomy of Structured Analytic Techniques
3. Choosing the Right Technique
3.1 Core Techniques
3.2 Making a Habit of Using Structured Techniques
3.3 One Project, Multiple Techniques
3.4 Common Errors in Selecting Techniques
3.5 Structured Technique Selection Guide
4. Decomposition and Visualization
4.1 Getting Started Checklist
4.2 AIMS (Audience, Issue, Message, and Storyline)
4.3 Customer Checklist
4.4 Issue Redefinition
4.5 Chronologies and Timelines
4.6 Sorting
4.7 Ranking, Scoring, and Prioritizing
4.7.1 The Method: Ranked Voting
4.7.2 The Method: Paired Comparison
4.7.3 The Method: Weighted Ranking
4.8 Matrices
4.9 Venn Analysis
4.10 Network Analysis
4.11 Mind Maps and Concept Maps
4.12 Process Maps and Gantt Charts
5. Idea Generation
5.1 Structured Brainstorming
5.2 Virtual Brainstorming
5.3 Nominal Group Technique
5.4 Starbursting
5.5 Cross-Impact Matrix
5.6 Morphological Analysis
5.7 Quadrant Crunching™
6. Scenarios and Indicators
6.1 Scenarios Analysis
6.1.1 The Method: Simple Scenarios
6.1.2 The Method: Cone of Plausibility
6.1.3 The Method: Alternative Futures Analysis
6.1.4 The Method: Multiple Scenarios Generation
6.2 Indicators
6.3 Indicators Validator™
7. Hypothesis Generation and Testing
7.1 Hypothesis Generation
7.1.1 The Method: Simple Hypotheses
7.1.2 The Method: Multiple Hypotheses Generator™
7.1.3 The Method: Quadrant Hypothesis Generation
7.2 Diagnostic Reasoning
7.3 Analysis of Competing Hypotheses
7.4 Argument Mapping
7.5 Deception Detection
8. Assessment of Cause and Effect
8.1 Key Assumptions Check
8.2 Structured Analogies
8.3 Role Playing
8.4 Red Hat Analysis
8.5 Outside-In Thinking
9. Challenge Analysis
9.1 Premortem Analysis
9.2 Structured Self-Critique
9.3 What If? Analysis
9.4 High Impact/Low Probability Analysis
9.5 Devil’s Advocacy
9.6 Red Team Analysis
9.7 Delphi Method
10. Conflict Management
10.1 Adversarial Collaboration
10.2 Structured Debate
11. Decision Support
11.1 Decision Trees
11.2 Decision Matrix
11.3 Pros-Cons-Faults-and-Fixes
11.4 Force Field Analysis
11.5 SWOT Analysis
11.6 Impact Matrix
11.7 Complexity Manager
12. Practitioner’s Guide to Collaboration
12.1 Social Networks and Analytic Teams
12.2 Dividing the Work
12.3 Common Pitfalls with Small Groups
12.4 Benefiting from Diversity
12.5 Advocacy versus Objective Inquiry
12.6 Leadership and Training
13. Validation of Structured Analytic Techniques
13.1 Limits of Empirical Analysis
13.2 Establishing Face Validity
13.3 A Program for Empirical Validation
13.4 Recommended Research Program
14. The Future of Structured Analytic Techniques
14.1 Structuring the Data
14.2 Key Drivers
14.3 Imagining the Future: 2020
System 1 and System 2 Thinking
Eight Families of Structured Analytic Techniques
Value of Using Structured Techniques to Perform Key Tasks
The Five Habits of the Master Thinker
Issue Redefinition Example
Timeline Estimate of Missile Launch Date
Paired Comparison Matrix
Weighted Ranking Matrix
Rethinking the Concept of National Security: A New Ecology
Venn Diagram of Components of Critical Thinking
Venn Diagram of Invalid and Valid Arguments
Venn Diagram of Zambrian Corporations
Zambrian Investments in Global Port Infrastructure Projects
4.10a Social Network Analysis: The September 11 Hijackers
4.10b Social Network Analysis: September 11 Hijacker Key Nodes
4.10c Social Network Analysis
4.11a Concept Map of Concept Mapping
4.11b Mind Map of Mind Mapping
4.12 Gantt Chart of Terrorist Attack Planning
Picture of Structured Brainstorming
Starbursting Diagram of a Lethal Biological Event at a Subway Station
Cross-Impact Matrix
Morphological Analysis: Terrorist Attack Options
5.7a Classic Quadrant Crunching™: Creating a Set of Stories
5.7b Terrorist Attacks on Water Systems: Flipping Assumptions
5.7c Terrorist Attacks on Water Systems: Sample Matrices
5.7d Selecting Attack Plans
6.1.1 Simple Scenarios
6.1.2 Cone of Plausibility
6.1.3 Alternative Futures Analysis: Cuba
6.1.4a Multiple Scenarios Generation: Future of the Iraq Insurgency
6.1.4b Future of the Iraq Insurgency: Using Spectrums to Define Potential
6.1.4c Selecting Attention-Deserving and Nightmare Scenarios
Descriptive Indicators of a Clandestine Drug Laboratory
Using Indicators to Track Emerging Scenarios in Zambria
Zambria Political Instability Indicators
Indicators Validator™ Model
Indicators Validator™ Process
Simple Hypotheses
Multiple Hypotheses Generator™: Generating Permutations
Quadrant Hypothesis Generation: Four Hypotheses on the Future of Iraq
Creating an ACH Matrix
Coding Relevant Information in ACH
Evaluating Levels of Disagreement in ACH
Argument Mapping: Does North Korea Have Nuclear Weapons?
Key Assumptions Check: The Case of Wen Ho Lee
Using Red Hat Analysis to Catch Bank Robbers
Inside-Out Analysis versus Outside-In Approach
Mount Brain: Creating Mental Ruts
Structured Self-Critique: Key Questions
What If? Scenario: India Makes Surprising Gains from the Global Financial Crisis
High Impact/Low Probability Scenario: Conflict in the Arctic
Delphi Technique
Decision Matrix
Pros-Cons-Faults-and-Fixes Analysis
Force Field Analysis: Removing Abandoned Cars from City Streets
SWOT Analysis
Impact Matrix: Identifying Key Actors, Interests, and Impact
Variables Affecting the Future Use of Structured Analysis
Traditional Analytic Team
Special Project Team
Wikis as Collaboration Enablers
Advocacy versus Inquiry in Small-Group Processes
Effective Small-Group Roles and Interactions
Three Approaches to Evaluation
Variables Affecting the Future Use of Structured Analysis
John McLaughlin
Senior Research Fellow, Paul H. Nitze School of Advanced International Studies,
Johns Hopkins University
Former Deputy Director, Central Intelligence Agency and Acting Director of
Central Intelligence
As intensively as America’s Intelligence Community has been studied and
critiqued, little attention has typically been paid to intelligence analysis. Most
assessments focus on such issues as overseas clandestine operations and covert
action, perhaps because they accord more readily with popular images of the
intelligence world.
And yet, analysis has probably never been a more important part of the
profession—or more needed by policymakers. In contrast to the bipolar dynamics
of the Cold War, this new world is strewn with failing states, proliferation dangers,
regional crises, rising powers, and dangerous nonstate actors—all at play against a
backdrop of exponential change in fields as diverse as population and technology.
To be sure, there are still precious secrets that intelligence collection must
uncover—things that are knowable and discoverable. But this world is equally rich
in mysteries having to do more with the future direction of events and the
intentions of key actors. Such things are rarely illuminated by a single piece of
secret intelligence data; they are necessarily subjects for analysis.
Analysts charged with interpreting this world would be wise to absorb the
thinking in this book by Richards Heuer and Randy Pherson and in Heuer’s earlier
work The Psychology of Intelligence Analysis. The reasons are apparent if one
considers the ways in which intelligence analysis differs from similar fields of
intellectual endeavor.
Intelligence analysts must traverse a minefield of potential errors.
First, they typically must begin addressing their subjects where others have
left off; in most cases the questions they get are about what happens next,
not about what is known.
Second, they cannot be deterred by lack of evidence. As Heuer pointed out
in his earlier work, the essence of the analysts’ challenge is having to deal
with ambiguous situations in which information is never complete and
arrives only incrementally—but with constant pressure to arrive at conclusions.
Third, analysts must frequently deal with an adversary that actively seeks to
deny them the information they need and is often working hard to deceive them.
Finally, for all of these reasons, analysts live with a high degree of risk—
essentially the risk of being wrong and thereby contributing to ill-informed
policy decisions.
The risks inherent in intelligence analysis can never be eliminated, but one way
to minimize them is through more structured and disciplined thinking about
thinking. On that score, I tell my students at the Johns Hopkins School of
Advanced International Studies that the Heuer book is probably the most important
reading I give them, whether they are heading into the government or the private
sector. Intelligence analysts should reread it frequently. In addition, Randy
Pherson’s work over the past six years to develop and refine a suite of structured
analytic techniques offers invaluable assistance by providing analysts with specific
techniques they can use to combat mindsets, groupthink, and all the other potential
pitfalls of dealing with ambiguous data in circumstances that require clear and
consequential conclusions.
The book you now hold augments Heuer’s pioneering work by offering a clear
and more comprehensive menu of more than fifty techniques to build on the
strategies he earlier developed for combating perceptual errors. The techniques
range from fairly simple exercises that a busy analyst can use while working alone
—the Key Assumptions Check, Indicators Validator™, or What If? Analysis—to
more complex techniques that work best in a group setting—Structured
Brainstorming, Analysis of Competing Hypotheses, or Premortem Analysis.
The key point is that all analysts should do something to test the conclusions
they advance. To be sure, expert judgment and intuition have their place—and are
often the foundational elements of sound analysis—but analysts are likely to
minimize error to the degree they can make their underlying logic explicit in the
ways these techniques demand.
Just as intelligence analysis has seldom been more important, the stakes in the
policy process it informs have rarely been higher. Intelligence analysts these days
therefore have a special calling, and they owe it to themselves and to those they
serve to do everything possible to challenge their own thinking and to rigorously
test their conclusions. The strategies offered by Richards Heuer and Randy Pherson
in this book provide the means to do precisely that.
The investigative commissions that followed the terrorist attacks of 2001 and the
erroneous 2002 National Intelligence Estimate on Iraq’s weapons of mass
destruction clearly documented the need for a new approach to how analysis is
conducted in the U.S. Intelligence Community. Attention focused initially on the
need for “alternative analysis”—techniques for questioning conventional wisdom
by identifying and analyzing alternative explanations or outcomes. This approach
was later subsumed by a broader effort to transform the tradecraft of intelligence
analysis by using what have become known as structured analytic techniques.
Structured analysis involves a step-by-step process that externalizes an individual
analyst’s thinking in a manner that makes it readily apparent to others, thereby
enabling it to be shared, built on, and critiqued by others. When combined with the
intuitive judgment of subject-matter experts, such a structured and transparent
process can significantly reduce the risk of analytic error.
Our current high-tech, global environment increasingly requires collaboration
among analysts with different areas of expertise and different organizational
perspectives. Structured analytic techniques are ideal for this interaction. Each step
in a technique prompts relevant discussion and, typically, this generates more
divergent information and more new ideas than any unstructured group process.
The step-by-step process of structured analytic techniques organizes the interaction
among analysts in a small analytic group or team in a way that helps to avoid the
multiple pitfalls and pathologies that often degrade group or team performance.
Progress in the development and use of structured analytic techniques has been
steady since the publication of the first edition of this book in 2011. By defining
the domain of structured analytic techniques, providing a manual for using and
testing these techniques, and outlining procedures for evaluating and validating
these techniques, the first edition laid the groundwork for continuing improvement
in how analysis is done within the U.S. Intelligence Community and a growing
number of foreign intelligence services. In addition, the techniques have made
significant inroads into academic curricula and the business world.
This second edition of the book includes five new techniques—AIMS
(Audience, Issue, Message, and Storyline) and Venn Analysis in chapter 4, on
Decomposition and Visualization; Cone of Plausibility in chapter 6, on Scenarios
and Indicators; and Decision Trees and Impact Matrix in chapter 11, on Decision
Support. We have also split the Quadrant Crunching™ technique into two parts—
Classic Quadrant Crunching™ and Foresight Quadrant Crunching™, as described
in chapter 5, on Idea Generation—and made significant revisions to four other
techniques: Getting Started Checklist, Customer Checklist, Red Hat Analysis, and
Indicators Validator™.
In the introductory chapters, we have included a discussion of System 1 and
System 2 thinking (intuitive versus analytic approaches to thinking) as they relate
to structured analysis and have revised the taxonomy of analytic procedures to
show more clearly where structured analytic techniques fit in. Chapter 3 includes a
new section describing how to make the use of structured analytic techniques a
habit. We have also expanded the discussion of how structured analytic techniques
can be used to deal with cognitive biases and intuitive traps most often encountered
by intelligence analysts. In addition, we substantially revised chapter 13, on
strategies to validate structured analytic techniques, and chapter 14, which projects
our vision of how structured techniques may be used in the future.
As the use of structured analytic techniques becomes more widespread, we
anticipate that the ways these techniques are used will continue to change. Our goal
is to keep up with these changes in future editions, so we welcome your
suggestions, at any time, for updating this second edition or otherwise enhancing
its utility. To facilitate the use of these techniques, CQ Press/SAGE published a
companion book, Cases in Intelligence Analysis: Structured Analytic Techniques in
Action, with twelve case studies and detailed exercises and lesson plans for
learning how to use and teach twenty-four of the structured analytic techniques. A
second edition of that book will be published simultaneously with this one,
containing five additional case studies and including new or updated exercises and
lesson plans for six structured techniques.
This book is for practitioners, managers, teachers, and students in the intelligence,
law enforcement, and homeland security communities, as well as in academia,
business, medicine, and the private sector. Managers, policymakers, corporate
executives, strategic planners, action officers, and operators who depend on input
from analysts to help them achieve their goals will also find it useful. Academics
and consulting companies who specialize in qualitative methods for dealing with
unstructured data will be interested in this pathbreaking book as well.
Many of the techniques described here relate to strategic intelligence, but there
is ample information on techniques of interest to law enforcement,
counterterrorism, and competitive intelligence analysis, as well as to business
consultants and financial planners with a global perspective. Many techniques
developed for these related fields have been adapted for use in intelligence
analysis, and now we are starting to see the transfer of knowledge going in the
other direction. Techniques such as Analysis of Competing Hypotheses (ACH),
Key Assumptions Check, Quadrant Crunching™, and the Indicators Validator™
developed specifically for intelligence analysis are now being adapted for use in
other fields. New techniques that the authors developed to fill gaps in what is
currently available for intelligence analysis are being published for the first time in
this book and have broad applicability.
The first three chapters describe structured analysis in general, how it fits into the
spectrum of methods used by analysts, and how to select which techniques are
most suitable for your analytic project. The next eight chapters describe when,
why, and how to use each of the techniques contained in this volume. The final
three chapters discuss the integration of these techniques in a collaborative team
project, validation strategies, and a vision of how these techniques are likely to be
used in the year 2020.
We designed the book for ease of use and quick reference. The spiral binding
allows analysts to have the book open while they follow step-by-step instructions
for each technique. We grouped the techniques into logical categories based on a
taxonomy we devised. Tabs separating each chapter contain a table of contents for
the selected chapter. Each technique chapter starts with a description of that
technique category and then provides a brief summary of each technique covered
in that chapter.
Richards J. Heuer Jr. is best known for his book Psychology of Intelligence
Analysis and for developing and then guiding automation of the Analysis of
Competing Hypotheses (ACH) technique. Both are being used to teach and train
intelligence analysts throughout the Intelligence Community and in a growing
number of academic programs on intelligence or national security. Long retired
from the Central Intelligence Agency (CIA), Mr. Heuer has nevertheless been
associated with the Intelligence Community in various roles for more than five
decades and has written extensively on personnel security, counterintelligence,
deception, and intelligence analysis. He has a B.A. in philosophy from Williams
College and an M.A. in international relations from the University of Southern
California, and has pursued other graduate studies at the University of California at
Berkeley and the University of Michigan.
Randolph H. Pherson is president of Pherson Associates, LLC; CEO of
Globalytica, LLC; and a founding director of the nonprofit Forum Foundation for
Analytic Excellence. He teaches advanced analytic techniques and critical thinking
skills to analysts in the government and private sector. Mr. Pherson collaborated
with Richards Heuer in developing and launching the Analysis of Competing
Hypotheses software tool, and he developed several analytic techniques for the
CIA’s Sherman Kent School, many of which were incorporated in his Handbook of
Analytic Tools and Techniques. He coauthored Critical Thinking for Strategic
Intelligence with Katherine Hibbs Pherson, Cases in Intelligence Analysis:
Structured Analytic Techniques in Action with Sarah Miller Beebe, and the
Analytic Writing Guide with Louis M. Kaiser. He has developed a suite of
collaborative web-based analytic tools, TH!NK Suite®, including a collaborative
version of ACH called Te@mACH®. Mr. Pherson completed a twenty-eight-year
career in the Intelligence Community in 2000, last serving as National Intelligence
Officer (NIO) for Latin America. Previously at the CIA, Mr. Pherson managed the
production of intelligence analysis on topics ranging from global instability to
Latin America, served on the Inspector General’s staff, and was chief of the CIA’s
Strategic Planning and Management Staff. He is the recipient of the Distinguished
Intelligence Medal for his service as NIO and the Distinguished Career Intelligence
Medal. Mr. Pherson received his B.A. from Dartmouth College and an M.A. in
international relations from Yale University.
The authors greatly appreciate the contributions made by Mary Boardman, Kathrin
Brockmann and her colleagues at the Stiftung Neue Verantwortung, Nick Hare and
his colleagues at the UK Cabinet Office, Mary O’Sullivan, Kathy Pherson, John
Pyrik, Todd Sears, and Cynthia Storer to expand and improve the chapters on
analytic techniques, as well as the graphics design and editing support provided by
Adriana Gonzalez and Richard Pherson.
Both authors also recognize the large contributions many individuals made to
the first edition, reviewing all or large portions of the draft text. These include J.
Scott Armstrong, editor of Principles of Forecasting: A Handbook for Researchers
and Practitioners and professor at the Wharton School, University of
Pennsylvania; Sarah Miller Beebe, a Russian specialist who previously served as a
CIA analyst and on the National Security Council staff; Jack Davis, noted teacher
and writer on intelligence analysis, a retired senior CIA officer, and now an
independent contractor with the CIA; Robert R. Hoffman, noted author of books on
naturalistic decision making, Institute for Human & Machine Cognition; Marilyn
B. Peterson, senior instructor at the Defense Intelligence Agency, former president
of the International Association of Law Enforcement Intelligence Analysts, and
former chair of the International Association for Intelligence Education; and
Cynthia Storer, a counterterrorism specialist and former CIA analyst now
associated with Pherson Associates, LLC. Their thoughtful critiques,
recommendations, and edits as they reviewed this book were invaluable.
Valuable comments, suggestions, and assistance were also received from many
others during the development of the first and second editions, including Todd
Bacastow, Michael Bannister, Aleksandra Bielska, Arne Biering, Jim Bruce, Hriar
Cabayan, Ray Converse, Steve Cook, John Donelan, Averill Farrelly, Stanley
Feder, Michael Fletcher, Roger George, Jay Hillmer, Donald Kretz, Terri Lange,
Darci Leonhart, Mark Lowenthal, Elizabeth Manak, Stephen Marrin, William
McGill, David Moore, Mary O’Sullivan, Emily Patterson, Amanda Pherson, Kathy
Pherson, Steve Rieber, Grace Scarborough, Alan Schwartz, Marilyn Scott,
Gudmund Thompson, Kristan Wheaton, and Adrian “Zeke” Wolfberg. We also
thank Jonathan Benjamin-Alvarado, University of Nebraska–Omaha; Lawrence D.
Dietz, American Public University System; Bob Duval, West Virginia University;
Chaka Ferguson, Florida International University; Joseph Gordon, National
Intelligence University; Kurt Jensen, Carleton University; and Doug Watson,
George Mason University.
Richards Heuer is grateful to William Reynolds of Least Squares Software for
pointing out the need for a taxonomy of analytic methods and generating financial
support through the ODNI/IARPA PAINT program for the initial work on what
subsequently evolved into chapters 1 and 2 of the first edition. He is also grateful
to the CIA’s Sherman Kent School for Intelligence Analysis for partial funding of
what evolved into parts of chapters 3 and 12 of the first edition. This book as a
whole, however, has not been funded by the Intelligence Community.
The CQ Press team headed by editorial director Charisse Kiino did a marvelous
job in managing the production of the second edition of this book and getting it out
on schedule. Copy editor Talia Greenberg, editorial assistant Davia Grant,
production editor David C. Felts, and designer Glenn Vogel all deserve praise for
the quality of their work.
The ideas, interest, and efforts of all the above contributors to this book are
greatly appreciated, but the responsibility for any weaknesses or errors rests solely
on the shoulders of the authors.
All statements of fact, opinion, or analysis expressed in this book are those of the
authors and do not reflect the official positions of the Office of the Director of
National Intelligence (ODNI), the Central Intelligence Agency (CIA), or any other
U.S. government agency. Nothing in the contents should be construed as asserting
or implying U.S. government authentication of information or agency endorsement
of the authors’ views. This material has been reviewed by the ODNI and the CIA
only to prevent the disclosure of classified information.
Introduction and Overview
1.1 Our Vision
1.2 Two Types of Thinking
1.3 Dealing with Bias
1.4 Role of Structured Analytic Techniques
1.5 Value of Team Analysis
1.6 History of Structured Analytic Techniques
1.7 Selection of Techniques for This Book
1.8 Quick Overview of Chapters
Analysis as practiced in the intelligence, law enforcement, and business
communities is steadily evolving from a mental activity done
predominantly by a sole analyst to a collaborative team or group
activity. The driving forces behind this transition include the following:
The growing complexity of international issues and the consequent
requirement for multidisciplinary input to most analytic products.2
The need to share more information more quickly across organizational
boundaries.
The dispersion of expertise, especially as the boundaries between analysts,
collectors, operators, and decision makers become blurred.
The need to identify and evaluate the validity of alternative mental models.
This transition is being enabled by advances in technology, such as new
collaborative networks and communities of interest, and the mushrooming growth
of social networking practices among the upcoming generation of analysts. The
transition is being facilitated by the increasing use of structured analytic techniques
to guide the exchange of information and reasoning among analysts in ways that
identify and eliminate a wide range of cognitive biases and other shortfalls of
intuitive judgment.
This book defines the role and scope of structured analytic techniques as a distinct
analytic methodology that provides a step-by-step process for dealing with the
kinds of incomplete, ambiguous, and sometimes deceptive information with which
analysts must work. Structured analysis is a mechanism by which internal thought
processes are externalized in a systematic and transparent manner so that they can
be shared, built on, and easily critiqued by others. Each technique leaves a trail that
other analysts and managers can follow to see the basis for an analytic judgment.
These techniques are used by individual analysts but are perhaps best utilized in a
collaborative team or group effort in which each step of the analytic process
exposes participants to divergent or conflicting perspectives. This transparency
helps ensure that differences of opinion among analysts are heard and seriously
considered early in the analytic process. Analysts tell us that this is one of the most
valuable benefits of any structured technique.
Structured analysis helps analysts ensure that their analytic framework—the
foundation upon which they form their analytic judgments—is as solid as possible.
By helping break down a specific analytic problem into its component parts and
specifying a step-by-step process for handling these parts, structured analytic
techniques help to organize the amorphous mass of data with which most analysts
must contend. Such techniques make our thinking more open and available for
review and critique by ourselves as well as by others. This transparency enables the
effective communication at the working level that is essential for intraoffice and
interagency collaboration.
These are called “techniques” because they usually guide the analyst in thinking
about a problem rather than provide the analyst with a definitive answer, as one
might expect from a method. Structured analytic techniques in general, however,
do form a methodology—a set of principles and procedures for qualitative analysis
of the kinds of uncertainties that many analysts must deal with on a daily basis.
In the last twenty years, important gains have been made in psychological research
on human judgment. Dual process theory has emerged as the predominant
approach, positing two systems of decision making called System 1 and System 2.3
The basic distinction between System 1 and System 2 is intuitive versus analytical
thinking.
System 1 is intuitive, fast, efficient, and often unconscious. It draws naturally on
available knowledge, past experience, and often a long-established mental model
of how people or things work in a specific environment. System 1 thinking is very
alluring, as it requires little effort, and it allows people to solve problems and make
judgments quickly and efficiently. It is often accurate, but intuitive thinking is also
a common source of cognitive biases and other intuitive mistakes that lead to faulty
analysis. Cognitive biases are discussed in the next section of this chapter.
System 2 thinking is analytic. It is slow, deliberate, conscious reasoning. It
includes all types of analysis, such as critical thinking and structured analytic
techniques, as well as the whole range of empirical and quantitative methods. The
introductory section of each of this book’s eight chapters on structured analytic
techniques describes how the type of analytic technique discussed in that chapter
helps to counter one or more types of cognitive bias and other common intuitive
mistakes associated with System 1 thinking.
There are many types of bias, all of which might be considered cognitive biases, as
they are all formed and expressed through System 1 activity in the brain. Potential
causes of bias include professional experience leading to an ingrained analytic
mindset, training or education, the nature of one’s upbringing, type of personality,
a salient personal experience, or personal equity in a particular decision.
All biases, except perhaps the personal self-interest bias, are the result of fast,
unconscious, and intuitive thinking (System 1)—not the result of thoughtful
reasoning (System 2). System 1 thinking is usually correct, but frequently
influenced by various biases as well as insufficient knowledge and the inherent
unknowability of the future. Structured analytic techniques are a type of System 2
thinking designed to help identify and overcome the analytic biases inherent in
System 1 thinking.
Behavioral scientists have studied the impact of cognitive biases on analysis and
decision making in many fields such as psychology, political science, medicine,
economics, business, and education ever since Amos Tversky and Daniel
Kahneman introduced the concept of cognitive biases in the early 1970s.4 Richards
Heuer’s work for the CIA in the late 1970s and the 1980s, subsequently followed
by his book Psychology of Intelligence Analysis, first published in 1999, applied
Tversky and Kahneman’s insights to problems encountered by intelligence
analysts.5 Since the publication of Psychology of Intelligence Analysis, other
authors associated with the U.S. Intelligence Community (including Jeffrey Cooper
and Rob Johnston) have identified cognitive biases as a major cause of analytic
failure at the CIA.6
This book is a logical follow-on to Psychology of Intelligence Analysis, which
described in detail many of the biases that influence intelligence analysis.7 Since
then hundreds of cognitive biases have been described in the academic literature
using a wide variety of terms. As Heuer noted many years ago, “Cognitive biases
are similar to optical illusions in that the error remains compelling even when one
is fully aware of its nature. Awareness of the bias, by itself, does not produce a
more accurate perception.”8 This is why cognitive biases are exceedingly difficult
to overcome. For example, Emily Pronin, Daniel Y. Lin, and Lee Ross observed in
three different studies that people see the existence and operation of cognitive and
motivational biases much more in others than in themselves.9 This explains why so
many analysts believe their own intuitive thinking (System 1) is sufficient.
An extensive literature exists on cognitive biases, sometimes called “heuristics,”
that explains how they affect a person’s thinking in many fields. What is unique
about our book is that it provides guidance on how to overcome many of these
biases. Each of the fifty-five structured analytic techniques described in this book
provides a roadmap for avoiding one or more specific cognitive biases as well as
other common intuitive pitfalls. The introduction and overview in each of the eight
chapters on structured analytic techniques identifies and describes the diverse
System 1 errors that this category of structured analytic techniques is designed to
avoid. While these techniques are helpful, they too carry no guarantee.
Structured analytic techniques are debiasing techniques. They do not replace
intuitive judgment. Their role is to question intuitive judgments by identifying a
wider range of options for analysts to consider. For example, a Key Assumptions
Check requires the identification and consideration of additional assumptions.
Analysis of Competing Hypotheses requires identification of alternative
hypotheses, a focus on refuting rather than confirming hypotheses, and a more
systematic analysis of the evidence. All structured techniques described in this
book have a Value Added section that describes how this technique contributes to
better analysis and helps mitigate cognitive biases and intuitive traps often made by
intelligence analysts and associated with System 1 thinking. For many techniques,
the benefit is self-evident. None purports to always give the correct answer. They
identify alternatives that merit serious consideration.
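The refutation-focused logic of Analysis of Competing Hypotheses described above can be illustrated with a short sketch. The hypotheses, evidence items, and consistency ratings below are entirely hypothetical, chosen only to show the scoring mechanics, not drawn from the book's worked examples:

```python
# Minimal sketch of Analysis of Competing Hypotheses (ACH) scoring.
# Each piece of evidence is rated against each hypothesis:
#   "C" = consistent, "I" = inconsistent, "N" = neutral/not applicable.
# ACH emphasizes refutation rather than confirmation: the working score
# for each hypothesis is its count of inconsistencies, and the hypothesis
# with the FEWEST inconsistencies survives best.

evidence_ratings = {
    # evidence item: {hypothesis: rating}
    "E1": {"H1": "C", "H2": "I", "H3": "C"},
    "E2": {"H1": "N", "H2": "I", "H3": "I"},
    "E3": {"H1": "C", "H2": "C", "H3": "I"},
}

def inconsistency_counts(ratings):
    """Count 'I' ratings per hypothesis across all evidence items."""
    counts = {}
    for item in ratings.values():
        for hypothesis, rating in item.items():
            counts.setdefault(hypothesis, 0)
            if rating == "I":
                counts[hypothesis] += 1
    return counts

counts = inconsistency_counts(evidence_ratings)
# Rank hypotheses from least to most refuted by the evidence.
ranking = sorted(counts, key=counts.get)
print(counts)   # {'H1': 0, 'H2': 2, 'H3': 2}
print(ranking)  # H1 is least inconsistent with the evidence
```

The point of the sketch is the direction of the test: analysts look for evidence that would eliminate hypotheses, not evidence that merely fits a favored one.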
No formula exists, of course, for always getting it right, but the use of structured
techniques can reduce the frequency and severity of error. These techniques can
help analysts mitigate the proven cognitive limitations, sidestep some of the known
analytic biases, and explicitly confront the problems associated with unquestioned
mental models or mindsets. They help analysts think more rigorously about an
analytic problem and ensure that preconceptions and assumptions are not taken for
granted but are explicitly examined and, when possible, tested.10
The most common criticism of structured analytic techniques is, “I don’t have
enough time to use them.” The experience of many analysts shows that this
criticism is not justified. Many techniques take very little time. Anything new does
take some time to learn; but, once learned, the use of structured analytic techniques
saves analysts time. It can enable individual analysts to work more efficiently,
especially at the start of a project, when the analyst may otherwise flounder a bit in
trying to figure out how to proceed. Structured techniques aid group processes by
improving communication as well as enhancing the collection and interpretation of
evidence. And, in the end, a structured technique produces a product in which the
reasoning behind the conclusions is more transparent and more readily accepted
than one derived from other methods. This saves time by expediting review by
supervisors and editors and thereby compressing the coordination process.11
Analytic methods are important, but method alone is far from sufficient to
ensure analytic accuracy or value. Method must be combined with substantive
expertise and an inquiring and imaginative mind. And these, in turn, must be
supported and motivated by the organizational environment in which the analysis
is done.
Our vision for the future of intelligence analysis dovetails with that of the Director
of National Intelligence’s Vision 2015, in which intelligence analysis increasingly
becomes a collaborative enterprise, with the focus shifting “away from
coordination of draft products toward regular discussion of data and hypotheses
early in the research phase.”12 This is a major change from the traditional concept
of intelligence analysis as largely an individual activity and coordination as the
final step in the process.
In a collaborative enterprise, structured analytic techniques are a process
through which collaboration occurs. Just as these techniques provide structure to
our individual thought processes, they can also structure the interaction of analysts
within a small team or group. Because the thought process in these techniques is
transparent, each step in the technique prompts discussion within the team. Such
discussion can generate and evaluate substantially more divergent information and
new information than can a group that does not use a structured process. When a
team is dealing with a complex issue, the synergy of multiple minds using
structured analysis is usually more effective than is the thinking of a lone analyst.
Structured analytic techniques when paired with collaborative software can also
provide a framework to guide interagency collaboration and coordination,
connecting team members in different offices, agencies, parts of traffic-congested
metropolitan areas, and even around the world.
Team-based analysis can, of course, bring with it a new set of challenges
equivalent to the cognitive biases and other pitfalls faced by the individual analyst.
However, the well-known group process problems are minimized by the use of
structured techniques that guide the interaction among members of a team or
group. This helps to keep discussions from getting sidetracked and facilitates the
elicitation of alternative views from all team members. Analysts have also found
that use of a structured process helps to depersonalize arguments when there are
differences of opinion. This is discussed further in chapter 12. Also, today’s
technology and social networking programs make structured collaboration much
easier than it has ever been in the past.
The first use of the term “structured analytic techniques” in the U.S. Intelligence
Community was in 2005. However, the origin of the concept goes back to the
1980s, when the eminent teacher of intelligence analysis, Jack Davis, first began
teaching and writing about what he called “alternative analysis.”13 The term
referred to the evaluation of alternative explanations or hypotheses, better
understanding of other cultures, and analysis of events from the other country’s
point of view rather than by mirror imaging. In the mid-1980s some initial efforts
were made to initiate the use of more alternative analytic techniques in the CIA’s
Directorate of Intelligence. Under the direction of Robert Gates, then CIA Deputy
Director for Intelligence, analysts employed several new techniques to generate
scenarios of dramatic political change, track political instability, and anticipate
military coups. Douglas MacEachin, Deputy Director for Intelligence from 1993 to
1996, supported new standards for systematic and transparent analysis that helped
pave the path to further change.14
The term “alternative analysis” became widely used in the late 1990s after Adm.
David Jeremiah’s postmortem analysis of the U.S. Intelligence Community’s
failure to foresee India’s 1998 nuclear test, a U.S. congressional commission’s
review of the Intelligence Community’s global missile forecast in 1998, and a
report from the CIA Inspector General that focused higher-level attention on the
state of the Directorate of Intelligence’s analytic tradecraft. The Jeremiah report
specifically encouraged increased use of what it called “red-team” analysis.
When the Sherman Kent School for Intelligence Analysis at the CIA was created
in 2000 to improve the effectiveness of intelligence analysis, John McLaughlin,
then Deputy Director for Intelligence, tasked the school to consolidate techniques
for doing what was then referred to as “alternative analysis.” In response to
McLaughlin’s tasking, the Kent School developed a compilation of techniques, and
the CIA’s Directorate of Intelligence started teaching these techniques in a class
that later evolved into the Advanced Analytic Tools and Techniques Workshop.
The course was subsequently expanded to include analysts from the Defense
Intelligence Agency and other elements of the U.S. Intelligence Community.
Wisdom begins with the definition of terms.
—Socrates, Greek philosopher
The various investigative commissions that followed the surprise terrorist
attacks of September 11, 2001, and then the erroneous analysis of Iraq’s possession
of weapons of mass destruction, cranked up the pressure for more alternative
approaches to intelligence analysis. For example, the Intelligence Reform Act of
2004 assigned to the Director of National Intelligence “responsibility for ensuring
that, as appropriate, elements of the intelligence community conduct alternative
analysis (commonly referred to as ‘red-team’ analysis) of the information and
conclusions in intelligence analysis.”
Over time, however, analysts who misunderstood or resisted this approach came
to interpret alternative analysis as simply meaning an alternative to the normal way
that analysis is done, implying that these alternative procedures are needed only
occasionally in exceptional circumstances when an analysis is of critical
importance. Kent School instructors had to explain that the techniques are not
alternatives to traditional analysis, but that they are central to good analysis and
should be integrated into the normal routine—instilling rigor and structure into the
analysts’ everyday work process.
In 2004, when the Kent School decided to update its training materials based on
lessons learned during the previous several years and publish A Tradecraft
Primer,15 Randolph H. Pherson and Roger Z. George were among the drafters.
“There was a sense that the name ‘alternative analysis’ was too limiting and not
descriptive enough. At least a dozen different analytic techniques were all rolled
into one term, so we decided to find a name that was more encompassing and
suited this broad array of approaches to analysis.”16 Kathy Pherson is credited with
coming up with the name “structured analytic techniques” during a dinner table
conversation with her husband, Randy. Roger George organized the techniques
into three categories: diagnostic techniques, contrarian techniques, and imagination
techniques. The term “structured analytic techniques” became official in June
2005, when updated training materials were formally approved.
The Directorate of Intelligence’s senior management became a strong supporter
of structured analytic techniques and took active measures to facilitate and promote
this approach. The term is now used throughout the U.S. Intelligence Community
—and increasingly in academia and many intelligence services around the globe.
One thing cannot be changed, however, in the absence of new legislation. The
Director of National Intelligence (DNI) is still responsible under the Intelligence
Reform Act of 2004 for ensuring that elements of the U.S. Intelligence Community
conduct alternative analysis, which it now describes as the inclusion of alternative
outcomes and hypotheses in analytic products. We view “alternative analysis” as
covering only a part of what now is regarded as structured analytic techniques and
recommend avoiding use of the term “alternative analysis” to avoid any confusion.
The techniques described in this book are limited to ones that meet our definition
of structured analytic techniques, as discussed in chapter 2. Although the focus is
on techniques for strategic intelligence analysis, many of the techniques described
in this book have wide applicability to tactical military analysis, law enforcement
intelligence analysis, homeland security, business consulting, financial planning,
and complex decision making in any field. The book focuses on techniques that
can be used by a single analyst working alone or preferably with a small group or
team of analysts. Techniques that require sophisticated computing, or complex
projects of the type usually outsourced to an outside expert or company, are not
included. Several interesting techniques that were recommended to us were not
included for this reason.
From the several hundred techniques that might have been included here, we
identified a core group of fifty-five techniques that appear to be most useful for the
intelligence profession, but also useful for those engaged in related analytic
pursuits in academia, business, law enforcement, finance, and medicine.
Techniques that tend to be used exclusively for a single type of analysis in fields
such as law enforcement or business consulting, however, have not been included.
This list is not static. It is expected to increase or decrease as new techniques are
identified and others are tested and found wanting. In fact, we have dropped two
techniques from the first edition and added five new ones to the second edition.
Some training programs may have a need to boil down their list of techniques to
the essentials required for one particular type of analysis. No one list will meet
everyone’s needs. However, we hope that having one fairly comprehensive list and
common terminology available to the growing community of analysts now
employing structured analytic techniques will help to facilitate the discussion and
use of these techniques in projects involving collaboration across organizational
boundaries.
In this collection of techniques we build on work previously done in the U.S.
Intelligence Community, but also include a few techniques developed and used by
our British, Canadian, Spanish, and Australian colleagues. To select the most
appropriate additional techniques, Heuer reviewed a large number of books and
websites dealing with intelligence analysis methodology, qualitative methods in
general, decision making, problem solving, competitive intelligence, law
enforcement intelligence, forecasting or futures research, and social science
research in general. Given the immensity of this literature, there can be no
guarantee that nothing was missed.
About half of the techniques described here were previously incorporated in
training materials used by the Defense Intelligence Agency, the Office of
Intelligence and Analysis in the Department of Homeland Security, or other
intelligence agencies. We have revised or refined those techniques for this book.
Many of the techniques were originally developed or refined by one of the authors,
Randolph H. Pherson, when he was teaching structured techniques to intelligence
analysts, students, and practitioners in the private sector. Twenty-five techniques were newly
created or adapted to intelligence analyst needs by Richards Heuer or Randy
Pherson to fill perceived needs and gaps.
We provide specific guidance on how to use each technique, but this guidance is
not written in stone. Many of these techniques can be implemented in more than
one way, and some techniques are known by several different names. An
experienced government analyst told one of the authors that he seldom uses a
technique the same way twice. He adapts techniques to the requirements of the
specific problem, and his ability to do that effectively is a measure of his
experience.
The names of some techniques are normally capitalized, while many are not. For
consistency and to make them stand out, the names of all techniques described in
this book are capitalized.
Chapter 2 (“Building a System 2 Taxonomy”) defines the domain of structured
analytic techniques by describing how it differs from three other major categories
of intelligence analysis methodology. It presents a taxonomy with eight distinct
categories of structured analytic techniques. The categories are based on how each
set of techniques contributes to better intelligence analysis.
Chapter 3 (“Choosing the Right Technique”) describes the criteria we used for
selecting techniques for inclusion in this book, discusses which techniques might
be learned first and used the most, and provides a guide for matching techniques to
analysts’ needs. Analysts using this guide answer twelve abbreviated questions
about what the analyst wants or needs to do. An affirmative answer to any question
directs the analyst to the appropriate chapter(s), where the analyst can quickly zero
in on the most appropriate technique(s).
Chapters 4 through 11 each describe a different category of technique, which
taken together cover fifty-five different techniques. Each of these chapters starts
with a description of that particular category of technique and how it helps to
mitigate known cognitive biases or intuitive traps. It then provides a brief overview
of each technique. This is followed by a discussion of each technique, including
when to use it, the value added, description of the method, potential pitfalls when
noteworthy, relationship to other techniques, and origins of the technique.
Readers who go through these eight chapters of techniques from start to finish
may perceive some overlap. This repetition is for the convenience of those who use
this book as a reference guide and seek out individual sections or chapters. The
reader seeking only an overview of the techniques as a whole can save time by
reading the introduction to each technique chapter, the brief overview of each
technique, and the full descriptions of only those specific techniques that pique the
reader’s interest.
Highlights of the eight chapters of techniques are as follows:
Chapter 4 (“Decomposition and Visualization”) covers the basics such as
checklists, sorting, ranking, classification, several types of mapping,
matrices, and networks. It includes two new techniques, Venn Analysis and
AIMS (Audience, Issue, Message, and Storyline).
Chapter 5 (“Idea Generation”) presents several types of brainstorming. These
include Nominal Group Technique, a form of brainstorming that rarely has
been used in the U.S. Intelligence Community but should be used when there
is concern that a brainstorming session might be dominated by a particularly
aggressive analyst or constrained by the presence of a senior officer. A
Cross-Impact Matrix supports a group learning exercise about the
relationships in a complex system.
Chapter 6 (“Scenarios and Indicators”) covers four scenario techniques and
the indicators used to monitor which scenario seems to be developing. The
Indicators Validator™ developed by Randy Pherson assesses the diagnostic
value of the indicators. The chapter includes a new technique, Cone of
Plausibility.
Chapter 7 (“Hypothesis Generation and Testing”) includes the following
techniques for hypothesis generation: Diagnostic Reasoning, Analysis of
Competing Hypotheses, Argument Mapping, and Deception Detection.
Chapter 8 (“Assessment of Cause and Effect”) includes the widely used Key
Assumptions Check and Structured Analogies, which comes from the
literature on forecasting the future. Other techniques include Role Playing,
Red Hat Analysis, and Outside-In Thinking.
Chapter 9 (“Challenge Analysis”) helps analysts break away from an
established mental model to imagine a situation or problem from a different
perspective. Two important techniques developed by the authors, Premortem
Analysis and Structured Self-Critique, give analytic teams viable ways to
imagine how their own analysis might be wrong. What If? Analysis and
High Impact/Low Probability Analysis are tactful ways to suggest that the
conventional wisdom could be wrong. Devil’s Advocacy, Red Team
Analysis, and the Delphi Method can be used by management to actively
seek alternative answers.
Chapter 10 (“Conflict Management”) explains that confrontation between
conflicting opinions is to be encouraged, but it must be managed so that it
becomes a learning experience rather than an emotional battle. It describes a
family of techniques grouped under the umbrella of Adversarial
Collaboration and an original approach to Structured Debate in which
debaters refute the opposing argument rather than support their own.
Chapter 11 (“Decision Support”) includes several techniques, including
Decision Matrix, that help managers, commanders, planners, and
policymakers make choices or trade-offs between competing goals, values,
or preferences. This chapter describes the Complexity Manager developed
by Richards Heuer and two new techniques, Decision Trees and the Impact
Matrix.
As previously noted, analysis done across the global intelligence community is
in a transitional stage from a mental activity performed predominantly by a sole
analyst to a collaborative team or group activity. Chapter 12, entitled
“Practitioner’s Guide to Collaboration,” discusses, among other things, how to
include in the analytic process the rapidly growing social networks of area and
functional specialists who often work from several different geographic locations.
It proposes that most analysis be done in two phases: a divergent analysis or
creative phase with broad participation by a social network using a wiki, followed
by a convergent analysis phase and final report done by a small analytic team.
How can we know that the use of structured analytic techniques does, in fact,
improve the overall quality of the analytic product? As we discuss in chapter 13
(“Validation of Structured Analytic Techniques”), there are two approaches to
answering this question—logical reasoning and empirical research. The logical
reasoning approach starts with the large body of psychological research on the
limitations of human memory and perception and pitfalls in human thought
processes. If a structured analytic technique is specifically intended to mitigate or
avoid one of the proven problems in human thought processes, and that technique
appears to be successful in doing so, that technique can be said to have “face
validity.” The growing popularity of many of these techniques would imply that
they are perceived by analysts—and their customers—as providing distinct added
value in a number of different ways.
Another approach to evaluation of these techniques is empirical testing. This is
often done by constructing experiments that compare analyses in which a specific
technique is used with comparable analyses in which the technique is not used. Our
research found that such testing done outside the intelligence profession is
generally of limited value, as the experimental conditions varied significantly from
the conditions under which the same techniques are used by most intelligence
analysts. Chapter 13 proposes a broader approach to the validation of structured
analytic techniques. It calls for structured interviews, observation, and surveys in
addition to experiments conducted under conditions that closely simulate how
these techniques are used by intelligence analysts. Chapter 13 also recommends
formation of a separate organizational unit to conduct such research as well as
other tasks to support the use of structured analytic techniques.
Chapter 14 (“The Future of Structured Analytic Techniques”) employs one of
the techniques in this book, Complexity Manager, to assess the prospects for
continued growth in the use of structured analytic techniques. It asks the reader to
imagine it is 2020 and answers the following questions based on an analysis of ten
variables that could support or hinder the growth of structured analytic techniques
during this time period: Will structured analytic techniques gain traction and be
used with greater frequency by intelligence agencies, law enforcement, and the
business sector? What forces are spurring the increased use of structured analysis?
What obstacles are hindering its expansion?
1. Vision 2015: A Globally Networked and Integrated Intelligence Enterprise (Washington, DC: Director
of National Intelligence, 2008).
2. National Intelligence Council, Global Trends 2025: A Transformed World (Washington, DC: U.S.
Government Printing Office, November 2008).
3. For further information on dual process theory, see the research by Jonathan Evans and Keith Frankish,
In Two Minds: Dual Processes and Beyond (Oxford, UK: Oxford University Press, 2009); and Pat
Croskerry, “A Universal Model of Diagnostic Reasoning,” Academic Medicine 84, no. 8 (August 2009).
4. Amos Tversky and Daniel Kahneman, “Judgment under Uncertainty: Heuristics and Biases,” Science
185, no. 4157 (1974): 1124–1131.
5. Psychology of Intelligence Analysis was republished by Pherson Associates, LLC, in 2007, and can be
purchased on its website:
6. Jeffrey R. Cooper, Curing Analytic Pathologies: Pathways to Improved Intelligence Analysis
(Washington, DC: CIA Center for the Study of Intelligence, 2005); and Rob Johnston, Analytic Culture in
the U.S. Intelligence Community: An Ethnographic Study (Washington, DC: CIA Center for the Study of
Intelligence, 2005).
7. Richards J. Heuer Jr., Psychology of Intelligence Analysis (Washington, DC: CIA Center for the Study
of Intelligence, 1999; reprinted by Pherson Associates, LLC, 2007).
8. Ibid., 112.
9. Emily Pronin, Daniel Y. Lin, and Lee L. Ross, “The Bias Blind Spot: Perceptions of Bias in Self versus
Others,” Personality and Social Psychology Bulletin 28, no. 3 (2002): 369–381.
10. Judgments in this and the next sections are based on our personal experience and anecdotal evidence
gained in work with or discussion with other experienced analysts. As we will discuss in chapter 13, there
is a need for systematic research on these and other benefits believed to be gained through the use of
structured analytic techniques.
11. Again, these statements are our professional judgments based on discussions with working analysts
using structured analytic techniques. Research by the U.S. Intelligence Community on the benefits and
costs associated with all aspects of the use of structured analytic techniques is strongly recommended, as
discussed in chapter 13.
12. Vision 2015: A Globally Networked and Integrated Intelligence Enterprise (Washington, DC:
Director of National Intelligence, 2008), 13.
13. Information on the history of the terms “structured analytic techniques” and “alternative analysis” is
based on information provided by Jack Davis, Randolph H. Pherson, and Roger Z. George, all of whom
were key players in developing and teaching these techniques at the CIA.
14. See Heuer, Psychology of Intelligence Analysis, xvii–xix.
15. A Tradecraft Primer: Structured Analytic Techniques for Improving Intelligence Analysis, 2nd ed.
16. Personal communication to Richards Heuer from Roger Z. George, October 9, 2007.
Building a System 2 Taxonomy
2.1 Taxonomy of System 2 Methods
2.2 Taxonomy of Structured Analytic Techniques
A taxonomy is a classification of all elements of some domain of
information or knowledge. It defines the domain by identifying,
naming, and categorizing all the various objects in this domain. The
objects are organized into related groups based on some factor
common to each object in the group. The previous chapter distinguished
between two types of thinking, System 1 and System 2.
System 1 thinking is intuitive, fast, efficient, and often unconscious. Such
intuitive thinking is often accurate, but it is also a common source of
cognitive biases and other intuitive mistakes that lead to faulty analysis.
System 2 thinking is analytic. It is slow, deliberate, and conscious, the result
of thoughtful reasoning. In addition to structured analytic techniques,
System 2 thinking encompasses critical thinking and the whole range of
empirical and quantitative analysis.
Intelligence analysts have largely relied on intuitive judgment—a System 1
process—in constructing their line of analysis. When done well, intuitive judgment
—sometimes referred to as traditional analysis—combines subject-matter expertise
with basic thinking skills. Evidentiary reasoning, historical method, case study
method, and reasoning by analogy are examples of this category of analysis.1 The
key characteristic that distinguishes intuitive judgment from structured analysis is
that it is usually an individual effort in which the reasoning remains largely in the
mind of the individual analyst until it is written down in a draft report. Training in
this type of analysis is generally provided through postgraduate education,
especially in the social sciences and liberal arts, and often along with some country
or language expertise.
This chapter presents a taxonomy that defines the domain of System 2 thinking.
Figure 2.0 distinguishes System 1 or intuitive thinking from the four broad
categories of analytic methods used in System 2 thinking. It describes the nature of
these four categories, one of which is structured analysis. The others are critical
thinking, empirical analysis, and quasi-quantitative analysis. As discussed in
section 2.2, structured analysis consists of eight different categories of structured
analytic techniques. This chapter describes the rationale for these four broad
categories and identifies the eight families of structured analytic techniques.
The word “taxonomy” comes from the Greek taxis, meaning arrangement,
division, or order, and nomos, meaning law. Classic examples of a taxonomy are
Carolus Linnaeus’s hierarchical classification of all living organisms by kingdom,
phylum, class, order, family, genus, and species that is widely used in the
biological sciences, and the periodic table of elements used by chemists. A library
catalogue is also considered a taxonomy, as it starts with a list of related categories
that are then progressively broken down into finer categories.
Figure 2.0 System 1 and System 2 Thinking
Development of a taxonomy is an important step in organizing knowledge and
furthering the development of any particular discipline. Rob Johnston developed a
taxonomy of variables that influence intelligence analysis but did not go into any
depth on analytic techniques or methods. He noted that “a taxonomy differentiates
domains by specifying the scope of inquiry, codifying naming conventions,
identifying areas of interest, helping to set research priorities, and often leading to
new theories. Taxonomies are signposts, indicating what is known and what has
yet to be discovered.”2
Robert Clark has described a taxonomy of intelligence sources.3 He also
categorized some analytic techniques commonly used in intelligence analysis, but
not to the extent of creating a taxonomy. To the best of our knowledge, a taxonomy
of analytic methods for intelligence analysis has not previously been developed,
although taxonomies have been developed to classify research methods used in
forecasting,4 operations research,5 information systems,6 visualization tools,7
electronic commerce,8 knowledge elicitation,9 and cognitive task analysis.10
After examining taxonomies of methods used in other fields, we found that there
is no single right way to organize a taxonomy—only different ways that are more
or less useful in achieving a specified goal. In this case, our goal is to gain a better
understanding of the domain of structured analytic techniques, investigate how
these techniques contribute to providing a better analytic product, and consider
how they relate to the needs of analysts. The objective has been to identify various
techniques that are currently available, identify or develop additional potentially
useful techniques, and help analysts compare and select the best technique for
solving any specific analytic problem. Standardization of terminology for
structured analytic techniques will facilitate collaboration across agency and
international boundaries during the use of these techniques.
Intelligence analysts employ a wide range of methods to deal with an even wider
range of subjects. Although this book focuses on the field of structured analysis, it
is appropriate to identify some initial categorization of all the methods in order to
see where structured analysis fits in. Many researchers write of only two general
approaches to analysis, contrasting qualitative with quantitative, intuitive with
empirical, or intuitive with scientific. Others might claim that there are three
distinct approaches: intuitive, structured, and scientific. In our taxonomy, we have
sought to address this confusion by describing two types of thinking (System 1 and
System 2) and defining four categories of System 2 thinking.
The first step of science is to know one thing from another. This
knowledge consists in their specific distinctions; but in order that it
may be fixed and permanent, distinct names must be given to different
things, and those names must be recorded and remembered.
—Carolus Linnaeus, Systema Naturae (1738)
Whether intelligence analysis is, or should be, an art or science is one of the
long-standing debates in the literature on intelligence analysis. As we see it,
intelligence analysis has aspects of both spheres. The range of activities that fall
under the rubric of intelligence analysis spans the entire range of human cognitive
abilities, and it is not possible to divide it into just two categories—art and science
—or to say that it is only one or the other. The extent to which any part of
intelligence analysis is either art or science is entirely dependent upon how one
defines “art” and “science.”
The taxonomy described here posits four functionally distinct methodological
approaches to intelligence analysis. These approaches are distinguished by the
nature of the analytic methods used, the type of quantification if any, the type of
data that is available, and the type of training that is expected or required. Although
each method is distinct, the borders between them can be blurry.
Critical thinking: Critical thinking, as defined by longtime intelligence
methodologist and practitioner Jack Davis, is the application of the processes
and values of scientific inquiry to the special circumstances of strategic
intelligence.11 Good critical thinkers will stop and reflect on who is the key
customer, what is the question, where can they find the best information,
how can they make a compelling case, and what is required to convey their
message effectively. They recognize that this process requires checking key
assumptions, looking for disconfirming data, and entertaining multiple
explanations as long as possible. Most students are exposed to critical
thinking techniques at some point in their education—from grade school to
university—but few colleges or universities offer specific courses to develop
critical thinking and writing skills.
Structured analysis: Structured analytic techniques involve a step-by-step
process that externalizes the analyst’s thinking in a manner that makes it
readily apparent to others, thereby enabling it to be reviewed, discussed, and
critiqued piece by piece, or step by step. For this reason, structured analysis
usually becomes a collaborative effort in which the transparency of the
analytic process exposes participating analysts to divergent or conflicting
perspectives. This type of analysis is believed to mitigate some of the
adverse impacts of a single analyst’s cognitive limitations, an ingrained
mindset, and the whole range of cognitive and other analytic biases.
Frequently used techniques include Structured Brainstorming, Scenarios
Analysis, Indicators, Analysis of Competing Hypotheses, and Key
Assumptions Check. Structured techniques are taught at the college and
graduate school levels and can be used by analysts who have not been
trained in statistics, advanced mathematics, or the hard sciences.
Quasi-quantitative analysis using expert-generated data: Analysts often
lack the empirical data needed to analyze an intelligence problem. In the
absence of empirical data, many methods have been designed that rely on experts to
fill the gaps by rating key variables as High, Medium, Low, or Not Present,
or by assigning a subjective probability judgment. Special procedures are
used to elicit these judgments, and the ratings usually are integrated into a
larger model that describes that particular phenomenon, such as the
vulnerability of a civilian leader to a military coup, the level of political
instability, or the likely outcome of a legislative debate. This category
includes methods such as Bayesian inference, dynamic modeling, and
simulation. Training in the use of these methods is provided through
graduate education in fields such as mathematics, information science,
operations research, business, or the sciences.
Empirical analysis using quantitative data: Quantifiable empirical data are
so different from expert-generated data that the methods and types of
problems the data are used to analyze are also quite different. Econometric
modeling is one common example of this method. Empirical data are
collected by various types of sensors and are used, for example, in analysis
of weapons systems. Training is generally obtained through graduate
education in statistics, economics, or the hard sciences.
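The quasi-quantitative category described above can be sketched in a few lines: expert High/Medium/Low ratings are mapped to numbers and averaged into a composite score that substitutes for missing empirical data. The rating scale, variables, and ratings below are hypothetical, offered only to show the mechanics.

```python
# Illustrative sketch of quasi-quantitative analysis with expert-generated
# data: subjective High/Medium/Low ratings become numbers and are averaged.
# The scale, variables, and ratings are hypothetical placeholders.

RATING_SCALE = {"High": 3, "Medium": 2, "Low": 1, "Not Present": 0}

def composite_score(ratings):
    """ratings: {variable: [one rating string per expert]}.
    Returns the mean numeric score for each variable."""
    return {
        var: sum(RATING_SCALE[r] for r in rs) / len(rs)
        for var, rs in ratings.items()
    }

# Hypothetical political-instability model rated by three experts.
ratings = {
    "elite fragmentation": ["High", "High", "Medium"],
    "street protests": ["Medium", "Low", "Low"],
}
scores = composite_score(ratings)
```

In practice such scores feed a larger model of the phenomenon being assessed; the sketch shows only the elicitation-to-number step.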
None of these four methods is better or more effective than another. All are
needed in various circumstances to optimize the odds of finding the right answer.
The use of multiple methods during the course of a single analytic project should
be the norm, not the exception. For example, even a highly quantitative technical
analysis may entail assumptions about motivation, intent, or capability that are best
handled with critical thinking approaches and/or structured analysis. One of the
structured techniques for idea generation might be used to identify the variables to
be included in a dynamic model that uses expert-generated data to quantify these variables.
Of these four methods, structured analysis is the new kid on the block, so to
speak, so it is useful to consider how it relates to System 1 thinking. System 1
thinking combines subject-matter expertise and intuitive judgment in an activity
that takes place largely in an analyst’s head. Although the analyst may gain input
from others, the analytic product is frequently perceived as the product of a single
analyst, and the analyst tends to feel “ownership” of his or her analytic product.
The work of a single analyst is particularly susceptible to the wide range of
cognitive pitfalls described in Psychology of Intelligence Analysis and throughout
this book.12
Structured analysis follows a step-by-step process that can be used by an
individual analyst, but it is done more commonly as a group process, as that is how
the principal benefits are gained. As we discussed in the previous chapter,
structured techniques guide the dialogue between analysts with common interests
as they work step by step through an analytic problem. The critical point is that this
approach exposes participants with various types and levels of expertise to
alternative ideas, evidence, or mental models early in the analytic process. It can
help the experts avoid some of the common cognitive pitfalls. The structured group
process that identifies and assesses alternative perspectives can also help to avoid
“groupthink,” the most common problem of small-group processes.
When used by a group or a team, structured techniques can become a
mechanism for information sharing and group learning that helps to compensate
for gaps or weaknesses in subject-matter expertise. This is especially useful for
complex projects that require a synthesis of multiple types of expertise.
Structured techniques have been used by U.S. Intelligence Community
methodology specialists and some analysts in selected specialties for many years,
but the broad and general use of these techniques by the average analyst is a
relatively new approach to intelligence analysis. The driving forces behind the
development and use of these techniques in the intelligence profession are (1) an
increased appreciation of cognitive limitations and biases that make intelligence
analysis so difficult, (2) prominent intelligence failures that have prompted
reexamination of how intelligence analysis is generated, (3) policy support and
technical support for intraoffice and interagency collaboration, and (4) a desire by
policymakers who receive analysis that it be more transparent as to how the
conclusions were reached.
Considering that the U.S. Intelligence Community started focusing on structured
techniques in order to improve analysis, it is fitting to categorize these techniques
by the various ways they can help achieve this goal (see Figure 2.2). Structured
analytic techniques can mitigate some of the human cognitive limitations, sidestep
some of the well-known analytic pitfalls, and explicitly confront the problems
associated with unquestioned assumptions and mental models. They can ensure
that assumptions, preconceptions, and mental models are not taken for granted but
are explicitly examined and tested. They can support the decision-making process,
and the use and documentation of these techniques can facilitate information
sharing and collaboration.
Figure 2.2 Eight Families of Structured Analytic Techniques
A secondary goal when categorizing structured techniques is to correlate
categories with different types of common analytic tasks. This makes it possible to
match specific techniques to individual analysts’ needs, as will be discussed in
chapter 3. There are, however, quite a few techniques that fit comfortably in
several categories because they serve multiple analytic functions.
The eight families of structured analytic techniques are described in detail in
chapters 4–11. The introduction to each chapter describes how that specific
category of techniques helps to improve analysis.
1. Reasoning by analogy can also be a structured technique called Structured Analogies, as described in
chapter 8.
2. Rob Johnston, Analytic Culture in the U.S. Intelligence Community (Washington, DC: CIA Center for
the Study of Intelligence, 2005), 34.
3. Robert M. Clark, Intelligence Analysis: A Target-Centric Approach, 2nd ed. (Washington, DC: CQ
Press, 2007), 84.
4. Forecasting Principles website:
5. Russell W. Fenske, “A Taxonomy for Operations Research,” Operations Research 19, no. 1 (January–
February 1971).
6. Kai R. T. Larsen, “A Taxonomy of Antecedents of Information Systems Success: Variable Analysis
Studies,” Journal of Management Information Systems 20, no. 2 (Fall 2003).
7. Ralph Lengler and Martin J. Eppler, “A Periodic Table of Visualization Methods,” undated,
8. Roger Clarke, Appropriate Research Methods for Electronic Commerce (Canberra, Australia: Xamax
Consultancy Pty Ltd., 2000),
9. Robert R. Hoffman, Nigel R. Shadbolt, A. Mike Burton, and Gary Klein, “Eliciting Knowledge from
Experts,” Organizational Behavior and Human Decision Processes 62 (May 1995): 129–158.
10. Robert R. Hoffman and Laura G. Militello, Perspectives on Cognitive Task Analysis: Historical
Origins and Modern Communities of Practice (Boca Raton, FL: CRC Press/Taylor and Francis, 2008); and
Beth Crandall, Gary Klein, and Robert R. Hoffman, Working Minds: A Practitioner’s Guide to Cognitive
Task Analysis (Cambridge, MA: MIT Press, 2006).
11. See Katherine Hibbs Pherson and Randolph H. Pherson, Critical Thinking for Strategic Intelligence
(Washington, DC: CQ Press, 2013), xxii.
12. Richards J. Heuer Jr., Psychology of Intelligence Analysis (Washington, DC: CIA Center for the Study
of Intelligence, 1999; reprinted by Pherson Associates, LLC, 2007).
Choosing the Right Technique
3.1 Core Techniques
3.2 Making a Habit of Using Structured Techniques
3.3 One Project, Multiple Techniques
3.4 Common Errors in Selecting Techniques
3.5 Structured Technique Selection Guide
This chapter provides analysts with a practical guide to identifying the various
techniques that are most likely to meet their needs. It also does the following:
Identifies a set of core techniques that are used or should be used most
frequently and should be in every analyst’s toolkit. Instructors may also want
to review this list in deciding which techniques to teach.
Describes five habits of thinking that an analyst should draw upon when
under severe time pressure to deliver an analytic product.
Discusses the value of using multiple techniques for a single project.
Reviews the importance of identifying which structured analytic techniques
are most effective in helping analysts overcome or at least mitigate ingrained
cognitive biases and intuitive traps that are common to the intelligence profession.
Lists common mistakes analysts make when deciding which technique or
techniques to use for a specific project.
A key step in considering which structured analytic technique to use to solve a
particular problem is to ask: Which error or trap do I most need to avoid or at least
to mitigate when performing this task? Figure 3.0 highlights fifteen of the most
common intuitive traps analysts are likely to encounter depending on what task
they are trying to do. As illustrated by the chart, our experience is that intelligence
analysts are particularly susceptible to the following five traps: (1) failing to
consider multiple hypotheses or explanations, (2) ignoring inconsistent evidence,
(3) rejecting evidence that does not support the lead hypothesis, (4) lacking
sufficient bins or alternative hypotheses for capturing key evidence, and (5)
improperly projecting past experience. Most of these traps are also mitigated by
learning and applying the Five Habits of the Master Thinker, discussed in section 3.2.
The average analyst is not expected to know how to use every technique in this
book. All analysts should, however, understand the functions performed by various
types of techniques and recognize the analytic circumstances in which it is
advisable to use them. An analyst can gain this knowledge by reading the
introductions to each of the technique chapters and the overviews of each
technique. Tradecraft or methodology specialists should be available to assist when
needed in the actual implementation of many of these techniques. In the U.S.
Intelligence Community, for example, the CIA and the Department of Homeland
Security have made good progress supporting the use of these techniques through
the creation of analytic tradecraft support cells. Similar units have been established
by other intelligence analysis services and have proven effective.
All analysts should be trained to use the core techniques discussed here because
they are needed so frequently and are widely applicable across the various types of
analysis—strategic and tactical, intelligence and law enforcement, and cyber and
business. They are identified and described briefly in the following paragraphs.
Structured Brainstorming (chapter 5): Perhaps the most commonly used
technique, Structured Brainstorming is a simple exercise often employed at the
beginning of an analytic project to elicit relevant information or insight from a
small group of knowledgeable analysts. The group’s goal might be to identify a list
of such things as relevant variables, driving forces, a full range of hypotheses, key
players or stakeholders, available evidence or sources of information, potential
solutions to a problem, potential outcomes or scenarios, or potential responses by
an adversary or competitor to some action or situation; or, for law enforcement,
potential suspects or avenues of investigation. Analysts should be aware of
Nominal Group Technique as an alternative to Structured Brainstorming when
there is concern that a regular brainstorming session may be dominated by a senior
officer or that junior personnel may be reluctant to speak up.
Cross-Impact Matrix (chapter 5): If the brainstorming identifies a list of
relevant variables, driving forces, or key players, the next step should be to create a
Cross-Impact Matrix and use it as an aid to help the group visualize and then
discuss the relationship between each pair of variables, driving forces, or players.
This is a learning exercise that enables a team or group to develop a common base
of knowledge about, for example, each variable and how it relates to each other
variable. It is a simple but effective learning exercise that will be new to most
intelligence analysts.
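The Cross-Impact Matrix described above amounts to an N x N grid in which every ordered pair of variables gets its own cell, and the group discusses each cell in turn. The sketch below shows one minimal way to represent it, using "+" for a reinforcing effect, "-" for an inhibiting one, and "?" for a pair not yet discussed; the variables and judgments are hypothetical.

```python
# Minimal Cross-Impact Matrix sketch: one cell per ordered pair (a, b),
# recording the group's judgment of how variable a affects variable b.
# Variables and the filled-in judgments are hypothetical placeholders.
from itertools import permutations

def empty_matrix(variables):
    """Initialize every ordered pair to '?' so each cell must be discussed."""
    return {(a, b): "?" for a, b in permutations(variables, 2)}

variables = ["oil price", "political stability", "protest activity"]
matrix = empty_matrix(variables)

# The group fills in cells as it works through the pairs.
matrix[("oil price", "political stability")] = "+"
matrix[("protest activity", "political stability")] = "-"

undiscussed = [pair for pair, v in matrix.items() if v == "?"]
```

Initializing every cell to "?" is the point of the exercise: the matrix makes visible which pairwise relationships the group has not yet examined.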
Key Assumptions Check (chapter 8): One of the most commonly used
techniques is the Key Assumptions Check, which requires analysts to explicitly list
and question the most important working assumptions underlying their analysis.
Any explanation of current events or estimate of future developments requires the
interpretation of incomplete, ambiguous, or potentially deceptive evidence. To fill
in the gaps, analysts typically make assumptions about such things as the relative
strength of political forces, another country’s intentions or capabilities, the way
governmental processes usually work in that country, the trustworthiness of key
sources, the validity of previous analyses on the same subject, or the presence or
absence of relevant changes in the context in which the activity is occurring. It is
important that such assumptions be explicitly recognized and questioned.
Indicators (chapter 6): Indicators are observable or potentially observable
actions or events that are monitored to detect or evaluate change over time. For
example, they might be used to measure changes toward an undesirable condition
such as political instability, a pending financial crisis, or a coming attack.
Indicators can also point toward a desirable condition such as economic or
democratic reform. The special value of Indicators is that they create an awareness
that prepares an analyst’s mind to recognize the earliest signs of significant change
that might otherwise be overlooked. Developing an effective set of Indicators is
more difficult than it might seem. The Indicator Validator™ helps analysts assess
the diagnosticity of their Indicators.
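An Indicators list in use can be pictured as a simple checklist: predefined observable events are checked off as reporting arrives, and some threshold flags when the set warrants analytic attention. The indicators and the threshold below are hypothetical, chosen only to illustrate the mechanics; real indicator sets also require the diagnosticity testing noted above.

```python
# Illustrative Indicators checklist: flag when enough predefined observable
# events have been seen. Indicators and the threshold are hypothetical.

INDICATORS = [
    "troop movements near border",
    "reserve call-up announced",
    "hospital beds cleared",
    "state media rhetoric escalates",
]

def alert(observed, threshold=3):
    """observed: set of indicator strings seen in reporting so far.
    Returns (threshold reached?, list of indicators observed)."""
    hits = [i for i in INDICATORS if i in observed]
    return len(hits) >= threshold, hits

triggered, hits = alert({"troop movements near border",
                         "reserve call-up announced"})
```

A fixed numeric threshold is a deliberate oversimplification; in practice analysts weigh indicators by how diagnostic each one is, not merely by count.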
Analysis of Competing Hypotheses (chapter 7): This technique requires
analysts to start with a full set of plausible hypotheses rather than with a single
most likely hypothesis. Analysts then take each item of relevant information, one at
a time, and judge its consistency or inconsistency with each hypothesis. The idea is
to refute hypotheses rather than confirm them. The most likely hypothesis is the
one with the least relevant information that would argue against it, not the most
relevant information that supports it. This process applies a key element of
scientific method to intelligence analysis. Software recommended for using this
technique is discussed in chapter 7.
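The scoring step of Analysis of Competing Hypotheses can be sketched directly from the description above: each item of evidence is judged against every hypothesis, and hypotheses are ranked by how much evidence argues *against* them, fewest inconsistencies first. The matrix values below are hypothetical; real ACH tools also weight evidence by credibility and diagnosticity, which this sketch omits.

```python
# Bare-bones ACH scoring sketch. Each evidence item is judged consistent
# ("C"), inconsistent ("I"), or not applicable ("N") with each hypothesis.
# The surviving hypothesis is the one with the FEWEST inconsistencies,
# not the most consistencies. Matrix contents are hypothetical.

def rank_hypotheses(matrix):
    """matrix: {hypothesis: {evidence_id: 'C' | 'I' | 'N'}}.
    Returns (hypothesis, inconsistency count) pairs, fewest 'I' first."""
    counts = {
        h: sum(1 for judgment in evidence.values() if judgment == "I")
        for h, evidence in matrix.items()
    }
    return sorted(counts.items(), key=lambda kv: kv[1])

# Hypothetical matrix: H2 has supporting evidence but more refuting evidence.
matrix = {
    "H1": {"e1": "C", "e2": "N", "e3": "C"},
    "H2": {"e1": "C", "e2": "I", "e3": "I"},
}
ranked = rank_hypotheses(matrix)
```

Counting only the "I" judgments encodes the refutation logic the text emphasizes: consistent evidence often fits several hypotheses at once, so it carries little diagnostic weight.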
Premortem Analysis and Structured Self-Critique (chapter 9): These two
easy-to-use techniques enable a small team of analysts who have been working
together on any type of future-oriented analysis to challenge effectively the
accuracy of their own conclusions. Premortem Analysis uses a form of reframing,
in which restating the question or problem from another perspective enables one to
see it in a different way and come up with different answers. Imagine yourself
several years in the future. You suddenly learn from an unimpeachable source that
your original estimate was wrong. Then imagine what could have happened to
cause your estimate to be wrong. Looking back to explain something that has
happened is much easier than looking into the future to forecast what will happen.
With the Structured Self-Critique, analysts respond to a list of questions about a
variety of factors, including sources of uncertainty, analytic processes that were
used, critical assumptions, diagnosticity of evidence, information gaps, and the
potential for deception. Rigorous use of both of these techniques can help prevent a
future need for a postmortem.
What If? Analysis (chapter 9): In conducting a What If? Analysis, one
imagines that an unexpected event has happened and then, with the benefit of
“hindsight,” analyzes how it could have happened and considers the potential
consequences. This type of exercise creates an awareness that prepares the
analyst’s mind to recognize early signs of a significant change, and can enable
decision makers to plan ahead for that contingency. A What If? Analysis can be a
tactful way of alerting decision makers to the possibility that they may be wrong.
Analysts sometimes express concern that they do not have enough time to use
structured analytic techniques. The experience of most analysts and particularly
managers of analysts is that this concern is unfounded. In fact, if analysts stop to
consider how much time it takes not just to research an issue and draft a report, but
also to coordinate the analysis, walk the paper through the editing process, and get
it approved, they will usually discover that the use of structured techniques almost
always speeds the process.
Many of the techniques, such as a Key Assumptions Check, Indicators
Validation, or Venn Analysis, take little time and substantially improve the
rigor of the analysis.
Some take a little more time to learn, but once learned, often save analysts
considerable time over the long run. Analysis of Competing Hypotheses
and Red Hat Analysis are good examples of this phenomenon.
Techniques such as the Getting Started Checklist; AIMS (Audience,
Issue, Message, and Storyline); or Issue Redefinition force analysts to stop
and reflect on how to be more efficient over time.
Premortem Analysis and Structured Self-Critique usually take more time
but offer major rewards if errors in the original analysis are discovered and corrected.
Figure 3.2 The Five Habits of the Master Thinker
Source: Copyright 2013 Pherson Associates, LLC.
When working on quick-turnaround items such as a current situation report or an
intelligence assessment that must be produced the same day, a credible argument
can be made that a structured analytic technique cannot be applied properly in the
available time. When deadlines are short, gathering the right people in a small
group to employ a structured technique can prove impossible.
The best response to this valid observation is to practice using the core
techniques when deadlines are less pressing. In so doing, analysts will ingrain new
habits of thinking critically in their minds. If they and their colleagues practice how
to apply the concepts embedded in the structured techniques when they have time,
they will be more capable of applying these critical thinking skills instinctively
when under pressure. The Five Habits of the Master Thinker are described in
Figure 3.2. Each habit can be mapped to one or more structured analytic
techniques.
Key Assumptions: In a healthy work environment, challenging assumptions
should be commonplace, ranging from “Why do you assume we all want pepperoni
pizza?” to “Won’t increased oil prices force them to reconsider their export
strategy?” If you expect your colleagues to challenge your key assumptions on a
regular basis, you will become more sensitive to your own assumptions, and you
will increasingly ask yourself if they are well founded.
Alternative Explanations: When confronted with a new development, the first
instinct of a good analyst is to develop a hypothesis to explain what has occurred
based on the available evidence and logic. A master thinker goes one step further
and immediately asks whether any alternative explanations should be considered.
If envisioning one or more alternative explanations is difficult, then a master
thinker will simply posit a single alternative: that the initial or lead hypothesis is
not true. While at first glance these alternatives may appear much less likely, over time
as new evidence surfaces they may evolve into the lead hypothesis. Analysts who
do not generate a set of alternative explanations at the start and lock on to a
preferred explanation will often fall into the trap of Confirmation Bias—focusing
on the data that are consistent with their explanation and ignoring or rejecting other
data that are inconsistent.
Inconsistent Data: Looking for inconsistent data is probably the hardest habit to
master of the five, but it is the one that can reap the most benefits in terms of time
saved when conducting an investigation or researching an issue. The best way to
train your brain to look for inconsistent data is to conduct a series of Analysis of
Competing Hypotheses (ACH) exercises. Such practice helps the analyst learn how
to more readily identify what constitutes compelling contrary evidence. If an
analyst encounters an item of data that is inconsistent with one of the hypotheses in
a compelling fashion (for example, a solid alibi), then that hypothesis can be
quickly discarded, saving the analyst time by redirecting his or her attention to
more likely solutions.
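The ACH practice described above amounts to building a matrix: each item of evidence is rated as consistent, inconsistent, or neutral against every hypothesis, and hypotheses are ranked by how much compelling evidence contradicts them. The sketch below illustrates that scoring logic; the hypotheses, evidence items, and ratings are invented for illustration and are not drawn from the text.

```python
# Minimal Analysis of Competing Hypotheses (ACH) matrix sketch.
# Hypotheses and evidence below are hypothetical examples. Ratings follow
# the usual ACH convention: "C" = consistent, "I" = inconsistent,
# "N" = neutral / not applicable. Hypotheses are ranked by how many
# evidence items are inconsistent with them (fewer is better).

hypotheses = ["H1: insider", "H2: outside group", "H3: accident"]

# Each evidence item maps hypothesis -> rating.
evidence = {
    "signs of forced entry":   {"H1: insider": "I", "H2: outside group": "C", "H3: accident": "I"},
    "suspect has solid alibi": {"H1: insider": "I", "H2: outside group": "N", "H3: accident": "N"},
    "device fragments found":  {"H1: insider": "C", "H2: outside group": "C", "H3: accident": "I"},
}

def inconsistency_scores(hypotheses, evidence):
    """Count the evidence items rated inconsistent with each hypothesis."""
    return {h: sum(1 for ratings in evidence.values() if ratings[h] == "I")
            for h in hypotheses}

scores = inconsistency_scores(hypotheses, evidence)
# The hypothesis with the fewest inconsistencies best survives scrutiny;
# one with a compelling inconsistency (e.g., a solid alibi) can be discarded.
ranked = sorted(hypotheses, key=lambda h: scores[h])
print(ranked[0])  # → "H2: outside group"
```

Note that ACH ranks hypotheses by disconfirming evidence rather than by how much evidence supports them, which is exactly the guard against Confirmation Bias the passage describes: consistent evidence often fits several hypotheses at once, while a single compelling inconsistency can eliminate one.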
Key Drivers: Asking at the outset what key drivers best explain what has
occurred or will foretell what is about to happen is a key attribute of a master
thinker. If key drivers are quickly identified,…
