Brief Analysis of Stephane
misinterpreted, faults in analysis, lack of collection by the intelligence agencies, a poor grasp of reality, and factual error. Scholars argue that failures in intelligence are inevitable (e.g. Froscher,
2010; Betts, 1978). This supports Lefebvre’s view with regard to factual error: if any elements of a prediction are incorrect, the analytical assessment as a whole is flawed (Lefebvre, 2004).
Lefebvre considers how intelligence analysis uses, and has become dependent on, technology, and how alluring the analysis of freely obtainable big data has become. The Chief
Technology Officer of the CIA strongly supports the use of big data (Hunt, 2018). Many
agencies like the CIA, NSA and GCHQ have established big data teams devoted to analytics,
research and development. Lefebvre argues analysts must be skilled in their use of technology as
analysis “involves assessing the reliability and credibility of the data, and comparing it with the
knowledge base available” (Lefebvre, 2004). Technology such as algorithms can strip out much of the extraneous material, but information gathered concerning the intentions of foreign leaders remains best identified by humans. Lefebvre is critical of collecting data that is never used (Lefebvre, 2004). The
National Academies (2015) agree with Lefebvre that a substantial proportion of the data collected is extraneous or counts as bulk. It is impossible for analysts to keep pace with the vast quantity of incoming data on threats, all of which needs consideration before recommendations are made.
Katter, Montgomery, and Thompson (1979) reasoned that intelligence analysis is conceptually
driven rather than data driven. Lefebvre agrees, claiming that qualified and competent analysts, linguists, and collection personnel are needed in order to exploit the data.
concluding that initial training is not enough and that hands-on training is necessary to improve analysts’ performance. Hulnick (1999) agrees that analysts develop largely within their specialist job. However, this conclusion lacks practical application, since it is unlikely that intelligence agencies can accommodate a plethora of intelligence analysts for ‘on the job’ learning.
related to an analyst’s intellect and organisational behaviour. This is consistent with Heuer’s (1999) finding that humans have difficulty dealing with inherent and induced uncertainty. Lefebvre argues that cognitive biases lead analysts to confirm beliefs they already hold, creating difficulties when dealing with uncertainty. Tversky and Kahneman (1974) argue that such biases can cause errors in judgment. The British intelligence community is conscious of the capacity for
bias, as seen in regard to the Iraq war (Morrison, 2011), when Tony Blair favoured information confirming his own biases and hypotheses (Grieves, 2018). This is particularly noticeable in analysts returning to a specific conflict zone after being absent for some time, utilising their previous understanding. Improving analysis also depends on understanding the analyst’s intrinsic strengths and weaknesses in processing information (Moore,
2007). However, it is unlikely that a set of rules alone will ensure accurate analysis of the material presented to analysts.
References
Betts, R. K. (1978) ‘Analysis, War, and Decision: Why Intelligence Failures are Inevitable’, World Politics, 31(1), pp. 61–89.
Grieves, M.R. (2018) ‘How useful is the IC as a way of understanding the work of intelligence
Heuer, R. J. (1999) Psychology of Intelligence Analysis. Washington, DC: Center for the Study of Intelligence. Available at: https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-
Hulnick, A. S. (1999) Fixing the Spy Machine: Preparing American Intelligence for the Twenty-First Century. Westport, CT: Praeger.
Hunt, I. J. (2018) ‘Big Data is a Big Deal at the CIA’, Central Intelligence Agency. Available at:
https://www.cia.gov/news-information/featured-story-archive/2012-featured-story-
Katter, R. V., Montgomery, C. A. and Thompson, J. R. (1979) ‘Human processes in intelligence analysis: phase 1 overview’, U.S. Army Research Institute for the Behavioral and Social Sciences.
Lefebvre, S. (2004) ‘A Look at Intelligence Analysis’, International Journal of Intelligence and CounterIntelligence, 17(2), pp. 231–264.
Moore, D. T. (2007) Critical Thinking and Intelligence Analysis. Washington, DC: National Defense Intelligence College.
Morrison, J. N. (2011) ‘British Intelligence Failures in Iraq’, Intelligence and National Security,
National Academies of Sciences, Engineering and Medicine (2015) ‘New Report Says No
Technological Replacement Exists for Bulk Data Collection’, 15 January. Available at:
http://www8.nationalacademies.org/onpinews/newsitem.aspx?recordid=19414 (Accessed: 2018).
Tversky, A. & Kahneman, D. (1974) ‘Judgment under uncertainty: Heuristics and biases’, Science, 185(4157), pp. 1124–1131.