SOAR Report 2018
The second annual state of incident response report
Executive Summary
Today’s business landscape is a delicate balancing act between technological advancement and
security. Workplace changes and technical innovations have made it easier to do business and live
our lives, but securing these manifold developments is a mammoth task that falls upon already
overworked security teams.
There is already a wealth of research that highlights the unending growth in security alerts, a
widening security skills gap, and the ensuing fatigue that is heaped upon understaffed security
teams. Demisto conducted a large study to delve deeper into these issues, their manifestations, and
possible solutions. Our results yielded fascinating insights into the state of cybersecurity in
businesses of all sizes.
Shift to SOAR
For this second annual report, we decided to shift our focus from incident response to SOAR
(Security Orchestration, Automation, and Response) as we believe SOAR to be a more
all-encompassing lens through which to view the security posture of a business. We will present a
brief overview of SOAR and its drivers before discussing our research findings.
Personnel Challenge
Organizations continue to face challenges in hiring, training, and retaining security personnel. Our
research found that it took an average of 8 months to train new security analysts; despite this, a
quarter of employees were likely to end up leaving within 2 years. In this scenario, SOAR tools should
aim to both fill personnel gaps and make existing analysts’ jobs easier and more fruitful.
Willingness to Automate
One of the key findings from our research was an increase in the number of respondents who
indicated a strong ‘readiness to automate’. Besides the growing market validation of automation, this
increase in willingness is likely connected to the fact that all four major security challenges revealed
by research participants were related to human capital shortages.
1. Understanding SOAR
To understand the reason for our shift from incident response to SOAR as a focal point, it is
important to understand what SOAR is and how it encompasses incident response. Incident
response focuses primarily on addressing issues after they have been identified. However, an
incident’s lifecycle involves many more stages: aggregation, enrichment, correlation, and investigation
being some of them. SOAR, unlike incident response, addresses all these stages and more. Here’s a
brief description of the building blocks that make up SOAR.
• Orchestration refers to the act of integrating disparate technologies, usually through workflows, so that they can function together. This means using security-specific and non-security-specific technologies simultaneously in a way that eases coordination.
• Automation refers to the process of machines executing tasks hitherto performed by humans.
In the context of SOAR, automation is ideally seen as human enhancement and not human
replacement. Automation of repeatable, low-level tasks acts in concert with human
decision-making for overall acceleration of incident investigations.
• Incident management and response is still a crucial element in SOAR. Fundamentally, SOAR
seeks to foster a comprehensive, end-to-end understanding of incidents by security teams,
resulting in better, more informed response.
• Dashboards and reports form a critical part of SOAR. One of the ways to achieve unified
response is by providing data visualizations where incidents can be easily seen, correlated,
triaged, documented, and measured.
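To make these building blocks more concrete, here is a minimal, hypothetical sketch of how a playbook might tie them together in code. The tool objects and their methods (threat_intel.lookup, siem.quarantine_host, ticketing.create_ticket) are illustrative assumptions, not references to any specific product’s API.

```python
# Hypothetical sketch: a SOAR-style playbook tying the building blocks together.
# The integrations (siem, threat_intel, ticketing) are illustrative stand-ins,
# not real product APIs.

from dataclasses import dataclass, field


@dataclass
class Incident:
    """Central case record: the incident-management building block."""
    incident_id: str
    source_alert: dict
    indicators: list = field(default_factory=list)
    status: str = "new"


def enrich(incident, threat_intel):
    """Automation: machine-driven enrichment of the alert's indicators."""
    for ioc in incident.source_alert.get("indicators", []):
        verdict = threat_intel.lookup(ioc)          # assumed integration call
        incident.indicators.append({"ioc": ioc, "verdict": verdict})


def run_playbook(alert, siem, threat_intel, ticketing):
    """Orchestration: disparate tools coordinated in a single workflow."""
    incident = Incident(incident_id=alert["id"], source_alert=alert)
    enrich(incident, threat_intel)                  # automated step
    if any(i["verdict"] == "malicious" for i in incident.indicators):
        siem.quarantine_host(alert["host"])         # assumed integration call
        incident.status = "contained"
    ticketing.create_ticket(incident)               # documentation / reporting
    return incident
```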
Figure 1: The relationship between SOAR and security incident response platforms.
2. SOAR Drivers
The development of SOAR as a cybersecurity practice has been driven chiefly by the shortcomings of
conventional tactics:
• Unattended Alerts: The sheer volume of alerts far outpaces the security team’s capacity to
examine them. Consequently, serious threats can be left unaddressed because security teams
are too busy wading through the sea of events on their screens.
• A Paucity of Proactivity: Since security teams are so busy dealing with day-to-day alert prioritization and response, they don’t have time to proactively hunt for threats in their environment before those threats escalate. This inability to read early warning signs feeds a vicious cycle in which even more alerts are generated down the line.
• Lack of Central Context: The process of determining if an alert constitutes a serious threat
requires cross-referencing multiple data sources for complete context. While a large number of
security tools provide unique threat insights, teams find it tough to collate and correlate
intelligence at a central location, leading to variance in investigation quality.
3. Security Challenges
To better understand how cybersecurity shortcomings are affecting businesses in the real world, we
asked several questions in our study. These questions helped us understand two things. Firstly, they helped us verify whether our survey recipients faced the same challenges as those expounded by general research. Secondly, they helped us understand the priority and magnitude of each issue.
Figure 2: SOC/IR challenge level by activity (partial data: documenting incidents, 57.30% / 42.69%; creating reports for management, 55.56% / 44.44%).
The pattern that stands out starkly from these results is that the security skills gap continues to be
a challenge. Improving processes and results, coordinating among multiple products, improving team
collaboration, and avoiding effort duplication ranked highly among organizational security challenges,
with the human capital crunch being a common underlying thread.
While these results were insightful in isolation, we also asked recipients to provide subjective
comments and additional information that allowed us to better understand their specific challenges.
We were able to identify a clear consensus among respondents as they further explained their
issues with staffing. We got comments from businesses worried about “fewer people doing more
work” and analysts having “a lot of other responsibilities and finding it difficult to devote enough time
to properly documenting incidents, resolutions, etc.”
Figure 3: SOC/IR challenge level (continued).
Respondents also opened up about how these issues could be resolved. Many emphasized the
importance of proper communication, observing how a lack of single-source communication could
lead to unnecessary duplication of efforts. They also emphasized the importance of creating and
implementing new processes to address these issues.
EMPLOYEE RETENTION
To further explore the issue of employee retention, we asked several questions to help gauge the
health of hiring and turnover in information security.
Figure 4: How long employees stay in their information security roles (answer options: 0-1 years, 1-2 years, 3-4 years, and 5+ years; response shares of 6.9%, 19.5%, 33.0%, and 40.6%).
From these results, we can see that 67% of employees leave within 3-4 years and a quarter of
employees leave within two years. With the notoriously steep learning curves and training routines
that define a security analyst’s onboarding, this retention rate is a concerning figure.
Figure 5
Respondent commentary on the matter provided further insight into the hiring dilemma. It was noted
that attracting talent to the public sector was often a difficult challenge. Many professed an inability
to find qualified candidates, given the large discrepancy between the skills candidates list on their résumés and what they actually know.
Lastly, there seemed to be a broad agreement on the fact that money/budget was now a secondary
issue after time. Time was by far the most limiting factor facing many of the businesses we
interviewed.
ATTRITION
While interviewing employees, we wanted to find out what was causing the high rate of attrition
within information security.
Figure 6
Unsurprisingly, a substantial salary raise was far and away the number one reason for employee
departure. This is congruent with the security skills gap: if there’s a dearth of new security
personnel on the market and it takes time to train security teams, companies will try to outdo each
other in terms of remuneration to lure away experienced security analysts.
To delve further into this information, we asked employees a more generalized question about what
was important to them in their field.
Figure 7
Respondent data made it clear that salary was the chief driving factor. Taking this information in
concert with other security challenges, it’s understandable that overworked security analysts – who
are in demand and have irreplaceable knowledge – expect to be compensated for their skills and
efforts.
Looking beyond the monetary reasons, respondents also cited a desire for work-life balance and
skills improvement as desirable aspects they looked for in a job opportunity and employer.
HIRING + TRAINING
Turning back to employers, we wanted to ascertain how long it took for them to acquire and train a
new employee and better understand the time/cost concerns associated with high attrition rates.
Figures 8 and 9: Average time to hire and to train a new team member (answer options: 1-2 months, 3-4 months, 5-6 months, 6-12 months, and more than 12 months).
Almost half of our respondents said that it takes more than 8 months to properly train a new team
member. When this figure is contrasted with low employee retention rates, it paints a picture of lost
investment and time without requisite return.
Additional comments indicated that a lack of prior corporate experience among new hires inhibited the training procedure, invariably extending it beyond conventionally ‘normal’ limits. As previously
stated in this report, employers also expressed deep concerns when it came to employee experience,
citing a discrepancy in ‘on-paper’ experience versus ‘actual experience’.
4. The State of Security Orchestration
Security orchestration – the process of interweaving security tools, teams, and processes together for
coordinated incident response – seeks to solve user pain points including product proliferation,
unstructured processes, and swivel-chair investigations.
We asked respondents for their thoughts on orchestration drivers and security tool challenges to
verify, validate, and evolve industry perception on their pain points and how orchestration can help
bridge the gap.
ORCHESTRATION DRIVERS
To figure out which security tasks were the most mundane and distracting for information security
staff, we asked them to select options from the list below:
Figure 10
The results show that most respondents were concerned with the high number of false positives, followed by a concern with alert volume in general. This insight is consistent with the ‘tool sensitivity’ debate gripping the security industry. If security tools are not sensitive enough, real and dangerous alerts can slip through the cracks and cause organizational havoc. Erring on the side of caution, security tools are instead made ‘too sensitive’ and spin up a multitude of alerts, enveloping security teams in false positives.
SOAR tools can mitigate this challenge through branching workflows that execute different actions based on an alert’s maliciousness and severity, ensuring that security teams only investigate high-priority alerts and aren’t mired in false positives.
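As a rough illustration of such a branching workflow (not any vendor’s actual playbook format), the sketch below auto-closes likely false positives, queues mid-tier alerts, and escalates only high-confidence threats; the score thresholds and helper objects are assumptions.

```python
# Illustrative branching triage workflow: low-confidence alerts are auto-closed,
# mid-tier alerts are enriched and queued, and only high-severity, likely-malicious
# alerts reach an analyst. Thresholds and helper objects are hypothetical.

def triage(alert, reputation_service, queue):
    score = reputation_service.score(alert["indicator"])   # assumed enrichment call

    if score < 10:
        # Likely false positive: close automatically with an audit note.
        return {"action": "auto_close", "reason": f"low reputation score {score}"}

    if score < 70:
        # Suspicious but not urgent: park for scheduled review.
        queue.put({"alert": alert, "score": score, "priority": "medium"})
        return {"action": "queued", "priority": "medium"}

    # High-confidence threat: escalate straight to an analyst.
    queue.put({"alert": alert, "score": score, "priority": "high"})
    return {"action": "escalated", "priority": "high"}
```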
In addition to gauging which pressing issues were affecting security teams, we also wanted to know
how many security tools an analyst needed to learn for effective security operations and incident
response.
Figure 11
Figure 12: Information security tools in use in the organization, as reported by management respondents (answer options: 1-2, 3-5, 6-10, and 10+ tools; response shares of 6.7%, 20.8%, 32.2%, and 40.3%).
As we can see, there are plenty of tools to learn and many of these tools need to be personally
managed. This result helps us contextualize the high training times needed for security analysts
discussed earlier in this report. But with the number of tools constantly on the rise, high training
times and attrition rates truly spell out the gravity of the human capital challenge facing the industry
today.
5. The State of Security Automation
Automation has been one of the practices at the forefront of security innovation for a while. The two
governing criteria when deciding upon candidates for automation are
1. How long the task takes, and
2. How often the task must be performed.
We sought to refine these initial thoughts and asked our respondents about incident loads,
processes, and perceived automation benefits.
AUTOMATION DRIVERS
An important goal of our study was to find and validate linkages between high incident loads, high response times, and the desire for automation. Thus, we asked respondents to give us a ‘best guess’ estimate of how many incidents they examined per week.
Figure 13
These results indicate that information security staff are overwhelmed by the tasks at hand. The average number of alerts they review on a weekly basis is roughly 12,024. This sobering statistic, coupled with the fact that it takes more than 4 days to resolve an alert, makes it clear that there’s a vicious cycle in effect: alert volume leads to increased MTTR, which in turn leads to even more alert volume.
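To make this cycle concrete, a back-of-the-envelope calculation helps; the weekly alert figure comes from the survey, while the team size and per-analyst throughput below are assumptions chosen purely for illustration.

```python
# Back-of-the-envelope illustration of the alert backlog cycle. The weekly alert
# figure comes from the survey; team size and per-analyst throughput are assumptions.

WEEKLY_ALERTS = 12_024              # average reported by respondents
ANALYSTS = 5                        # assumed team size
ALERTS_PER_ANALYST_PER_DAY = 140    # assumed sustainable manual throughput
WORKDAYS_PER_WEEK = 5

weekly_capacity = ANALYSTS * ALERTS_PER_ANALYST_PER_DAY * WORKDAYS_PER_WEEK
weekly_shortfall = WEEKLY_ALERTS - weekly_capacity

print(f"Weekly capacity:  {weekly_capacity} alerts")          # 3,500
print(f"Weekly shortfall: {weekly_shortfall} alerts")          # 8,524 unreviewed
print(f"Backlog after 4 weeks: {4 * weekly_shortfall} alerts") # backlog compounds
```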
ARE SECURITY PROCESSES WELL DEFINED?
For automation to be successful, the tasks which the automation system takes over must be properly
defined, preferably with sequential workflows. Defined processes are also important as a knowledge
repository that security teams can both feed into and learn from.
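By way of example only, a ‘properly defined’ sequential process can be captured as an ordered list of steps that both analysts and an automation engine can read; the step names and structure below are illustrative assumptions rather than a prescribed format.

```python
# Hypothetical example of a codified, sequential response process. Representing the
# process as data makes it both executable by an automation engine and readable as
# a knowledge repository for the team. Step names are illustrative.

PHISHING_PLAYBOOK = [
    {"step": "extract_indicators", "automated": True},
    {"step": "check_indicator_reputation", "automated": True},
    {"step": "search_mailboxes_for_campaign", "automated": True},
    {"step": "decide_containment", "automated": False},   # analyst decision point
    {"step": "quarantine_emails", "automated": True},
    {"step": "document_and_close", "automated": True},
]


def run(playbook, actions, context):
    """Execute automated steps in order; pause at the first manual decision point."""
    for step in playbook:
        if not step["automated"]:
            context["pending_decision"] = step["step"]
            break
        actions[step["step"]](context)   # each action is a callable supplied by the team
    return context
```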
To understand the extent to which processes were defined, we asked respondents about the tasks
they felt would benefit most from automation.
Figure 14: Tasks respondents felt would benefit from automation (three response shares shown per task):
• Security operations (e.g. periodic checks for security products’ updates): 64.60%, 20.00%, 15.40%
• Threat hunting: 61.50%, 19.50%, 19.10%
• Tracking incident response metrics such as MTTR, % of incidents addressed, % responded to, etc.: 61.20%, 20.40%, 18.50%
Results show a robust automation base (and a willingness to automate further) on both the proactive
and reactive front. Proactively, security operations and threat hunting ranked high on the ‘automation
candidates’ list, highlighting security teams’ desire for automation to assist them in identifying
incipient threats and vulnerabilities. Reactively, incident response, tracking IR metrics, and case
management were seen as good candidates for partial or full automation.
PROCESS DOCUMENTATION: INCIDENT RESPONSE
PLAYBOOK/RUNBOOK/PROCESSES
In addition to understanding what tasks should be automated, we also wanted clarity over
documentation processes and whether they varied across security departments. The process of
automation implies a degree of uniformity, and the presence of disparate documentation sources
would make it that much more difficult to kickstart structured automation in a security department.
Figure 15
Our results show clear agreement in this regard. Ticketing was by far the most popular
documentation process followed by our respondents. Since ticketing systems often span across
teams (IT, security, support, and so on), it makes sense for central documentation to occur on these
platforms. A SOAR platform that integrates robustly with existing ticketing tools while providing its
own ‘security documentation center’ would be a good start for automation and process definition.
However, an overreliance on ticketing platforms for security operations has its own problems, which
we will cover later in the report.
PROCESS UPDATE FREQUENCY: INCIDENT RESPONSE
PLAYBOOK/RUNBOOK/PROCESSES
One relatively overlooked side effect of the alert fatigue and day-to-day firefighting that security
teams face is the stagnancy of security processes. When analysts are strapped for time, they find it
difficult to capture the gaps in current processes and update them on an ongoing basis.
We wanted to see if this hypothesis rang true with our respondent base and asked them about the
frequency with which they updated their incident response processes.
Figure 16: How frequently incident response processes are updated (answer options: quickly enough to capture and incorporate best practices as they are identified; on a regular weekly or monthly schedule; infrequently, once every year or so; never or almost never. Partial response shares: 8.8%, 15.70%, and 31.00%).
Results show that over 50% of respondents either didn’t update IR processes at all or updated
them infrequently. SOAR platforms will not only improve process speed through automation, but
also enable the iterative improvement of processes through proper metric capturing and visibility of
process gaps and potential improvements.
6. The State of Incident Management
Currently, incident management is done almost exclusively by people. While it’s important for teams
to retain control over incident management, this task becomes tougher with a wider geographical
dispersion of teams. To verify the extent of personnel dispersal, we asked respondents whether their
IR function was in-house/outsourced and centrally located/distributed.
Figure 17: How the incident response function is staffed (in-house: 50.60%; in-house, augmented by consultants as needed: 33.70%; partially outsourced, with Tier-2 and Tier-3 in-house: 13.80%).
Although 84% of respondents stated that their IR functions were largely in-house, those in-house
teams seem to be widely dispersed both nationally and internationally. Around 68% of respondents
had security teams either dispersed across locations in one country or observing a Follow-the-Sun
model across multiple countries.
Whether it’s due to scale, labor pressures, or the need for customer proximity, security teams are
dispersed. This dispersal makes it much harder to conduct unified incident management.
In the following sections, we will go through various components of incident management and
discuss what our research insights project for these practices.
Figure 18: Respondent agreement with statements beginning ‘Our organization is…’ (leading response shares: 33.60% and 31.60%).
Research results conclusively convey the trouble security teams have with processing and
contextualizing alerts. From simple alert volume and resolution stats (Figure 13) to false positive
concerns (Figure 10), the state of today’s threat landscape has resulted in a proliferation of alerts that
are tough to prioritize in a timely manner.
From the insights we have in the report so far, we can infer that security teams are dispersed and
use a variety of security tools to conduct incident response. This presents a problem during evidence
collection and post-incident documentation. Consider the following cases:
• An attack campaign manifests over a long period of time and records of what took place are
lost or only faintly remembered by a few people.
• There are multiple security analysts working on the same incident.
• Incidents involve multiple teams. For instance, if an incident involves a data breach of
personal information, regulatory mandates require that breach notifications be sent out to affected individuals.
None of the above cases can be resolved successfully at scale using the current incident
management setup that organizations possess.
Our research shows two main challenges with existing case management and workflows.
Firstly, over 50% of our respondents stated that they rarely updated processes (Figure 16),
highlighting a lack of both analyst time and process intelligence to make updates happen.
Secondly, most respondents use ticketing platforms to document incident response actions
(Figure 15). Since ticketing platforms are designed to be ‘static’ and capture moment-in-time
comments and flows, they prevent the dynamic, fast workflow changes that are necessary in the face
of sophisticated attacks.
In our view, SOAR platforms should be capable of both integrating with third-party ticketing tools,
as well as providing their own, more modular and flexible case management that’s better suited to
security use cases.
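A hedged sketch of that dual approach might look like the following, where the SOAR case keeps its own dynamic timeline and mirrors a static summary into an external ticketing system; the TicketingClient interface is an assumption, not a specific product’s API.

```python
# Hypothetical sketch: a SOAR case record that keeps its own flexible, security-
# specific state while mirroring a summary into an external ticketing system.
# The ticketing client interface is an assumption, not a real product API.

from datetime import datetime, timezone


class Case:
    def __init__(self, title, severity):
        self.title = title
        self.severity = severity
        self.timeline = []            # dynamic, append-only investigation timeline
        self.external_ticket_id = None

    def log(self, entry):
        self.timeline.append({"at": datetime.now(timezone.utc), "entry": entry})

    def sync_to_ticketing(self, ticketing_client):
        """Mirror a static summary into the shared ticketing platform."""
        summary = f"{self.title} (severity: {self.severity}, events: {len(self.timeline)})"
        if self.external_ticket_id is None:
            self.external_ticket_id = ticketing_client.create_ticket(summary)
        else:
            ticketing_client.update_ticket(self.external_ticket_id, summary)
```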
Threat intelligence tools form a vital cog in the wheel of incident response and security operations.
Our research found a healthy mix of respondents using open-source and commercially available
tools. With each threat intelligence platform (TIP) providing a unique slice and perspective on threat data, it becomes critical to
have an action hub that can aggregate data from different third-party sources for central visibility
and informed response.
SOAR tools are uniquely placed to leverage multiple threat intelligence platforms (both open-source
and commercially available), aggregate indicator reputation, and provide security teams with the
means to drive threat data to action.
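A minimal sketch of that aggregation step, assuming hypothetical feed clients that each return a coarse verdict for an indicator, could look like this:

```python
# Illustrative aggregation of indicator reputation across several threat intel
# sources. The feed clients and their lookup() method are assumptions; real TIPs
# expose their own APIs and scoring scales.

VERDICT_WEIGHT = {"malicious": 2, "suspicious": 1, "unknown": 0, "benign": -1}


def aggregate_reputation(indicator, feeds):
    """Combine per-feed verdicts into one coarse, central verdict."""
    verdicts = {}
    for feed in feeds:
        verdicts[feed.name] = feed.lookup(indicator)   # assumed integration call

    score = sum(VERDICT_WEIGHT.get(v, 0) for v in verdicts.values())
    if score >= 2:
        overall = "malicious"
    elif score >= 1:
        overall = "suspicious"
    else:
        overall = "benign_or_unknown"

    return {"indicator": indicator, "per_feed": verdicts, "overall": overall}
```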
Figure 19: Threat intelligence sources in use, select all that apply (answer options: open source, industry leaders, and commercial threat intelligence providers; response shares of 51.30%, 53.30%, and 57.90%).
7. The State of SOAR Budgets and Measurement
To get a better grasp of SOAR’s acceptance among its potential user base, we aimed to find out more
information about the measurement of security metrics as well as any budgetary exclusivity assigned
to SOAR tools by executives.
Measuring security results is the cornerstone upon which future improvements are actioned. With
this in mind, we asked respondents what metrics they usually measured during incident response.
Results show either a lack of formal measurement or rudimentary measurement of incident metrics.
Around 48% of respondents stated that they measured the number of incidents, which is a good
start but not an ideal end state for SOC efficiency measurement.
Close to 42% of respondents cited that they didn’t have a formal system to measure security
metrics at all; this fact, when coupled with increasing alert volume, leads to a dangerous ‘double
whammy’. Not only are teams inundated with alerts, but they’re also unable to verify the accuracy
and efficacy of their response to those alerts.
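For teams starting from little or no formal measurement, even a simple computation over incident records yields the basics (incident counts and MTTR); the record format assumed below is purely illustrative.

```python
# Minimal SOC-metrics sketch: incident count and mean time to resolution (MTTR)
# computed from incident records. The record format is an assumption.

from datetime import datetime


def soc_metrics(incidents):
    """incidents: list of dicts with ISO-8601 'created' and optional 'resolved' fields."""
    resolved = [i for i in incidents if i.get("resolved")]
    durations = [
        datetime.fromisoformat(i["resolved"]) - datetime.fromisoformat(i["created"])
        for i in resolved
    ]
    mttr_hours = (
        sum(d.total_seconds() for d in durations) / len(durations) / 3600
        if durations else None
    )
    return {
        "incidents_total": len(incidents),
        "incidents_resolved": len(resolved),
        "mttr_hours": mttr_hours,
    }
```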
Usually, a security market segment being assigned its own budget line is a strong indicator of
maturity. We wanted to find out if SOAR tools had their own budgetary allotments and, if not, what
other security concerns they had to compete with.
Figure 21: Do SOAR tools have their own budget?
Results show that the SOAR space is not mature enough to demand its own budget line, but it’s
growing at an appreciable pace. Around 38% of respondents stated that, while SOAR tools didn’t
have a dedicated budget, they were a part of the overall security budget. A further 15% of respondents planned to include SOAR tools in their budgets the following year.
SOAR’s emergent nature is further highlighted by around 20% of our respondents being unsure about where to include SOAR in their budgets. A growing acknowledgement of SOAR in security
budgets will come with increased awareness and continued verifiable benefits in existing SOAR
deployments.
8. SOAR Benefits
Figure 22
When asked about challenges that SOAR tools could help solve, respondents identified applications across multiple areas.
• Alert triage: Respondents cited ‘reduction of false positives’, ‘identification and classification
of threats’, ‘risk determination’, and ‘incident prioritization’ as major areas where SOAR tools
could help. SOAR tools that ingest alerts from multiple sources and automate enrichment and
triage rules across products will help alleviate the ‘context chaos’ currently engulfing security
teams.
• Faster response: Around 70% of respondents stated that SOAR tools would be beneficial
in ‘automating response’. For simpler use cases, security teams can use SOAR playbooks to
automate repeatable steps and achieve quick resolution. For complex use cases, playbooks can
include manual checks and balances that give the analysts final control.
• Tool coordination: Around 69% of respondents cited a ‘reduction in need to learn vast number
of security tools’ as a benefit of SOAR deployment. While learning more tools is always
helpful for personal development, SOAR platforms can remove the need to manually perform ‘basic’ but important tasks hundreds of times per day.
9. A Look Into the Future
SOAR APPLICATIONS
As SOAR tools continue to gain market acceptance, their functionalities and use cases will grow.
Currently, based on our industry knowledge and research results, we posit the following applications
of SOAR tools:
• Proactive threat hunting: Since threat hunting usually requires analysts to rapidly coordinate
among multiple security tools, it presents a great opportunity for orchestration with immediate
impact. SOAR tools can enable security teams to ingest third-party threat feeds and automate
‘search and destroy’ workflows that scan for potential vulnerabilities across environments (a brief sketch of such a workflow follows this list).
• Standardize and iterate incident processes: It’s vital for security teams to minimize ‘quality
variance’ in incident management and response. SOAR workflows are a great first step in this
direction, allowing for partial/full codification of best-practice processes and guaranteeing that
security analysts don’t have to start from scratch each time they encounter a specific incident.
With deployment maturity, SOAR tools will also allow teams to quickly iterate upon these
processes by spotting gaps and areas for improvement.
• Improve investigation quality: Multiple data points in this report suggest that security teams
struggle with gathering incident context and leveraging full visibility of data at their disposal.
SOAR tools can help improve investigation quality by enabling faster resolution of false
positives, prioritizing incidents and risk through correlated information from multiple tools, and
freeing up analyst time by obviating the need to learn the detailed vernacular of many security
products.
• Accelerate and scale incident response: SOAR offers coordinated automation to an
industry that is currently beset by important but repeatable, high-quantity tasks. SOAR tools
allow SOCs to rely on automation for the grunt-work and leverage rich, correlated information
for decision-making and investigation.
• Security operations and maintenance: In addition to automating repetitive response tasks,
SOAR tools can also help security teams simplify system checks, maintenance, upgrades, and
general security operations. These practices rely on workflows as much as (if not more than) response does, and standardized, automated execution will increase accuracy and better plug gaps that leave systems vulnerable.
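Returning to the proactive threat hunting item above, a hypothetical ‘search and destroy’ sweep might look like the sketch below; the ioc_feed, edr, and case_log objects are illustrative stand-ins for real integrations, not actual product APIs.

```python
# Hypothetical threat-hunting sweep: pull fresh indicators from a feed, search the
# environment for matches, and isolate affected hosts pending analyst review.
# The ioc_feed, edr, and case_log objects are illustrative stand-ins.

def hunt(ioc_feed, edr, case_log):
    findings = []
    for ioc in ioc_feed.fetch_recent():             # assumed feed call
        hits = edr.search(indicator=ioc)            # assumed endpoint query
        for host in hits:
            edr.isolate(host)                       # containment step
            findings.append({"indicator": ioc, "host": host, "action": "isolated"})

    # Document the sweep so results feed back into process improvement.
    case_log.record({"type": "threat_hunt", "findings": findings})
    return findings
```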
MAIN DISRUPTOR: MACHINE LEARNING
As with any industry, the growing maturity of SOAR tools will result in a baseline of features that all
major solutions in the space will be expected to provide. With clear product differentiators expected
to diminish with time, machine learning will emerge as a critical facet through which end users can
separate out the leading SOAR products from the rest.
Considering the results of this report and existing industry knowledge, we posit that machine learning sits at the intersection of many of the problems security teams face today. Robust machine learning algorithms, if fed with relevant data, can help address a number of these problems.
10. Who we surveyed
To illustrate the depth and breadth of our study, we wanted to document the broad demographics
of our respondents. These details include company size, company location, and job roles.
COMPANY SIZE
Figure 23: Number of employees (answer options: less than 500; 500-1,999; 2,000-4,999; 5,000-9,999; 10,000-14,999; and 15,000-19,999; response shares include 3.1%, 9.2%, 13.4%, 17.6%, 19.2%, and 20.3%).
We tried to maintain an appropriate dispersion of business sizes so that the results and insights cater to the
widest possible user base. As can be seen from the results, we were able to represent a wide variety
of businesses and avoid any biases resulting from niche, insulated samples.
RESPONDENT INFORMATION
Figure 24: Respondent roles at a high level (management: 42.92%; individual contributors: 57.08%).
Figure 25: Respondent roles in detail, spanning managers and directors of information security, individual contributors, security team engineers, VPs and directors with duties that include information security oversight, SOC managers or directors, and security consultants (response shares between 7.70% and 24.50%).
We wanted to get the perspective of both employees and managers and tried to represent the
opinions of both sets in this research. Like any industry, cybersecurity is reliant on both the strategic
vision of managers and the tactical execution of employees, and the report managed to represent a
healthy spread across this aisle.
COMPANY LOCATION
Figure 26: Company location (North America: 84.7%; Europe: 7.7%; Asia/Pacific: 4.2%; the remainder split among South America, Central America, the Middle East, and Africa at 1.5% or less each).
We wanted to have an international respondent spread in our research if possible and avoid any
locational biases in responses. Our respondents ended up being mainly from North America. We
concede that responses might be ‘localized’ as a result, but considering that North America is one
of the forerunners in terms of information security, we hope the results and insights from this report
provide value to readers across the globe.