
Index

Click on any section name to proceed directly to that part of the assessment
Domain        Section                                   Questions remaining
Introduction  1. Introduction                           N/A
              2. Usage                                  N/A
              3. Change notes                           N/A
General       1. Profile                                N/A
              2. Scope                                  N/A
Business      1. Business drivers                       0/5
              2. Customers                              0/6
              3. Charter                                0/4
              4. Governance                             0/9
              5. Privacy & Policy                       0/10
People        1. Employees                              0/8
              2. Roles and Hierarchy                    0/8
              3. People Management                      0/14
              4. Knowledge Management                   0/8
              5. Training & Education                   0/7
Process       1. SOC Management                         0/7
              2. Operations and Facilities              0/31
              3. Reporting & Communication              0/17
              4. Use Case Management                    0/20
              5. Detection Engineering & Validation     0/18
Technology    1. SIEM / UEBA                            0/60
              2. NDR                                    0/0
              3. EDR                                    0/71
              4. SOAR                                   0/46
Services      1. Security Monitoring                    0/41
              2. Security Incident Management           0/50
              3. Security Analysis and Forensics        0/38
              4. Threat Intelligence                    44/44
              5. Threat Hunting                         17/35
              6. Vulnerability Management               30/33
              7. Log Management                         1/33
Results       1. Results                                N/A
              2. NIST CSF Scoring                       N/A
              3. Results Sharing                        N/A
Next steps    1. Next steps                             N/A
Introduction
1. Introduction
2. Usage
3. Change notes

General information
Author Rob van Os
Site https://www.soc-cmm.com/
Contact info@SOC-CMM.com
Version 2.3.3, basic version
Date April 19th, 2024
Assessment training https://www.soc-cmm.com/services/training/

Background

The SOC-CMM model is a capability maturity model that can be used to perform a self-assessment of your Security Operations Center (SOC). The model is based on a review of literature regarding SOC setup and existing SOC models, as well as literature on specific elements within a SOC. The literature analysis was then validated by questioning several Security Operations Centers in different sectors and on different maturity levels to determine which elements were actually in place. The output from the survey, combined with the initial analysis, is the basis for this self-assessment.

For more information regarding the scientific background and the literature used to create the SOC-CMM self-assessment tool, please refer to the thesis document available through: https://www.soc-cmm.com/

If you have any questions or comments regarding the contents of this document, please use the above information to contact SOC-CMM.

Purpose and intended audience

The purpose of the SOC-CMM is to gain insight into the strengths and weaknesses of the SOC. This enables the SOC management to make informed decisions about which elements of the SOC require additional attention and/or budget. By regularly assessing the SOC for maturity and capability, progress can be monitored.

Besides the primary purpose of performing an assessment of the SOC, the assessment can also be used for extensive discussions about the SOC and can thus provide valuable insights.

This tool is intended for use by SOC and security managers, experts within the SOC, and SOC consultants.

Navigation

Navigation through this tool is done using the navigation bar at the top of each page. Each of the numbered sections can be clicked to proceed directly to that section. Furthermore, the icons can be used to navigate through sections within a domain and between domains. The icons are as follows:

- navigate to previous domain
- navigate to previous section within the domain
- navigate to index
- navigate to next section within the domain
- navigate to next domain
- navigate directly to results

Assessment Model

The assessment model consists of 5 domains and 25 aspects. All domains are evaluated for maturity (blue); only technology and services are evaluated for both maturity and capability (purple).

Maturity Levels

CMMI defines maturity as a means for an organization "to characterize its performance" for a specific entity (here: the SOC).
The SOC-CMM calculates a maturity score using 6 maturity levels:
- Level 0: non-existent
- Level 1: initial
- Level 2: managed
- Level 3: defined
- Level 4: quantitatively managed
- Level 5: optimizing

These maturity levels are measured across 5 domains: business, people, process, technology and services. The maturity levels as implemented in this tool are not staged with prerequisites for each level. Instead, every element adds individually to the maturity score: a continuous maturity model.

Capability Levels

Capabilities are indicators of completeness. In essence, capabilities can support maturity.
The SOC-CMM calculates a capability score using 4 capability levels, similar to CMMI:
- Level 0: incomplete
- Level 1: performed
- Level 2: managed
- Level 3: defined

These capability levels have a strong technical focus and are measured across 2 domains: technology and services. Similar to maturity levels, there are no prerequisites for advancing to a higher capability level; capability growth is continuous as well.

Disclaimer

The SOC-CMM is provided without warranty of any kind. The author of the document cannot assure its accuracy and is not liable for any cost as a result of decisions based on the output of this tool. The usage of this tool does not in any way entitle the user to support or consultancy. By using this tool, you agree to these conditions.

License

Copyright (C) 2024 - SOC-CMM

The SOC-CMM advanced version is part of the SOC-CMM.

The SOC-CMM assessment tool is free software, released under the CC BY-SA license: https://creativecommons.org/licenses/by-sa/4.0/

You are free to:

Share — copy and redistribute the material in any medium or format
Adapt — remix, transform, and build upon the material for any purpose, even commercially.

Under the following terms:

Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
ShareAlike — If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original.

No additional restrictions — You may not apply legal terms or technological measures that legally restrict others from doing anything the license permits.

This license is acceptable for Free Cultural Works. The licensor cannot revoke these freedoms as long as you follow the license terms.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
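The continuous maturity model described above (every element contributes individually to the score, with no staged prerequisites between levels) can be sketched as follows. This is a minimal illustration under the assumption that aggregation is a plain average of element scores; it is not the tool's actual Excel formulas, and the function names are illustrative.

```python
# Continuous maturity scoring sketch (assumption: plain averaging).
# Each element is scored 0-5; an aspect score is the mean of its
# elements, and a domain score is the mean of its aspects. No level
# acts as a prerequisite for another.

def aspect_maturity(element_scores):
    """Average element scores (each 0-5) into one aspect maturity score."""
    if not element_scores:
        return 0.0
    return sum(element_scores) / len(element_scores)

def domain_maturity(aspect_scores):
    """Average aspect maturity scores into a domain maturity score."""
    if not aspect_scores:
        return 0.0
    return sum(aspect_scores) / len(aspect_scores)

# A low score on one element does not block a high score on another;
# the average simply reflects both.
charter = aspect_maturity([5, 2.5, 3.75, 1.25])   # -> 3.125
```

This is what "continuous" means in contrast to a staged model: there is no gate such as "all level-2 elements must be met before level 3 counts".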


How to use the SOC-CMM


The SOC-CMM has an embedded workflow that guides the assessment. First, the profile sheet is filled in and the scope for assessment is selected. Then, the 5 domains of the SOC-CMM (i.e. Business, People, Process, Technology and Services) are each evaluated in separate sections of this tool.

The evaluation is based on questions that can be answered using a drop-down that presents a 5-point scale. This scale relates to the maturity level as explained below under 'Scoring mechanism'. This tool should be used by assessing each sheet in order. When all domains are completed, the sheet 'Results' will provide you with the total scoring and detailed scoring for each domain. A sheet 'Next steps' is also included to provide pointers for follow-up.

In the advanced version only, there is also a weighing mechanism in place. For each question, the importance of that element can be changed. The standard importance is 'normal', which means that the score is not modified. Changing the importance to 'low' will cause the element to have less impact on the score. Changing it to 'High' or 'Critical' will cause the element to have more impact on the score. Setting it to 'none' will ignore the element in scoring entirely, as explained under 'Weighing mechanism'. This feature should be used with care.

Some additional remarks regarding the usage of the SOC-CMM:

1. Some elements are not used directly for scoring (this is also indicated), but are a guideline for answering other questions. These elements have a lighter colour. For example, question 3.1 (part of maturity score) can be answered by using the elements in 3.2 (not part of maturity score) as a guideline.
2. Elements with a green colour are calculated fields. These will be filled in automatically by filling in those parts of the assessment.
3. The Services and Technology domains evaluate both maturity and capability. These capabilities do not have a 5-point scale and an importance, but use a 6-point scale instead. This is to reduce the amount of clicks and answers. The sixth element in the scale is 'not required'. Use this if you feel that you do not need that particular capability and want to exclude it from scoring.
4. Every sheet has a part where you can fill in some comments or remarks. Discussing the questions in this self-assessment will likely uncover some improvements. This is added value for a self-assessment, so it is worthwhile to create notes.
5. The weighing mechanism allows for manipulation of the maturity score. Therefore, it is important to strongly consider and possibly document why you wish to deviate from the standard. The goal of the SOC-CMM is to provide insight into strengths and weaknesses and to improve the SOC, not to obtain the highest score possible.
6. The NIST score is calculated automatically, as explained below.
7. Performing a full SOC-CMM assessment can take a significant amount of time, depending on the level of detail you put into the assessment. Before you start, ensure that you have allocated sufficient time. A way to reduce effort is to have a single knowledgeable SOC employee perform a quick scan and subsequently focus on areas that are debatable. Also, reducing scope for an initial assessment is a way to reduce the assessment effort.

Scoring mechanism

Each question that is part of the maturity scoring can be answered by selecting one of 5 options. These options vary based on the type of question. For example, for questions regarding completeness, the following applies:
- Incomplete, score: 0
- Partially complete, score: 1.25
- Averagely complete, score: 2.5
- Mostly complete, score: 3.75
- Fully complete, score: 5
As indicated, the score can be modified by using the weighing mechanism (use with care).

Guidance

For each of the maturity questions, guidance is available. When a value is selected from the dropdown box, guidance for that value is shown under the guidance column. This guidance can be used to help determine the correct level. Note that this is truly meant as guidance on interpretation and scoring, not as mandatory and prescriptive.

Weighing mechanism (advanced version only)

The weighing mechanism in the tool works by applying a factor to the element score as follows:
- Importance 'None', factor = 0 (not included in scoring)
- Importance 'Low', factor = 0.5 (score divided by 2)
- Importance 'Normal', factor = 1 (score not affected)
- Importance 'High', factor = 2 (score doubled)
- Importance 'Critical', factor = 4 (score quadrupled)

NIST Cyber Security Framework scoring

A detailed mapping between the SOC-CMM and the NIST CSF was created to allow for granular scoring. The exact mapping can be found on the SOC-CMM site as a separate download.

Customization

The SOC-CMM is built using standard Excel features without macros. The sheets are not locked or password protected. Therefore, adding columns and rows and applying other changes such as changing guidance or adding elements is possible. To show columns and rows, go to 'View' --> 'Show' and check 'Headings'. To show all the tabs underlying the SOC-CMM, go to 'File' --> 'Options' --> 'Advanced' --> 'Display options' and check 'Show sheet tabs'. Customizing calculations will require understanding of the calculations done in the '_Output' sheet. However, guidance for this type of customization is not provided, as it requires an understanding of the way the SOC-CMM sheets are connected.
Change notes

Version 2.3.x

Business domain:
Governance
4.3.13: included SOC risk management into governance elements
4.7: Added question on governance meetings
4.7 - 4.10: renumbered to 4.8 - 4.11
Privacy & Policy:
5.2: SOC policy added as a new question, 5.3 outlines the elements in the policy
5.4 - 5.11: renumbered from 5.2 - 5.9
People domain:
Roles & hierarchy
2.2.3: changed security specialist to forensic analyst
2.2.12: added detection engineer
2.2.13: added automation engineer

Knowledge management:
Skill matrix questions aggregated into two questions
Knowledge matrix questions aggregated into two questions
Renumbering applied to section

Process domain:
SOC management:
1.2: guidance updated for this question
1.6: added a question on continuous improvement
1.7: added a question on quality assurance
1.8: added 3 questions on SOC architecture
Operations and facilities:
2.1: security operations exercises turned into a separate section, additional questions inserted
2.1.1: renumbered to 2.1.3
2.1.2 - 2.5.2: renumbered to 2.2.1 - 2.6.2

Reporting & Communication


Report types aggregated into a single question
Metrics types aggregated into a single question
3.9 - 3.12: renumbered to 3.11 - 3.14

Detection Engineering & Validation:


5.2.1: changed wording, guidance and remark for this question. Added some example resources
5.2.7 - 5.2.8: added questions on validation of integrity of data ingestion and log source coverage

Technology domain:
All technologies: confidentiality section changed to access control
All technologies: break glass procedure added to access control
All technologies: maintenance (x.4) updated to 'maintenance & configuration', question x.4.3 updated to reflect this change

SIEM / UEBA (formerly: SIEM)


Aggregated Security Analytics (v2.2) and SIEM (v2.2) capabilities into a single SIEM/UEBA technology
Capabilities restructured

NDR (formerly: IDPS)


Completely reworked all capabilities

EDR (new)
New section for the SOC-CMM

SOAR
Capabilities restructured
Capabilities from previous version renumbered and re-ordered

Services domain:
All services: removed CMMI level references, as they were somewhat confusing and did not add any value
Security incident response
2.17.6: changed question from 'password reset procedure' to more generic 'incident containment procedures', including a reference to the RE&CT framework

Navigation improvements
Scrolling error fixed for sheets that did not allow scrolling with the scroll wheel
Clicking on SOC-CMM elements in the results section navigates directly to that section
Added lines in each comments section to document the rationale for choosing a certain value

Backend improvements:
Index calculations improved to exclude questions with importance set to 'none' and provide a better overview of remaining questions
Guidance added for new questions
Generic guidance modified to better reflect capability levels

Bug fixes & typos:


Conditional formatting errors fixed
Fixed guidance for several questions
Fixed answer selection for several questions
Fixed 'questions remaining' showing for questions set to importance 'none'
Fixed calculation error for section Reporting/Communication

Version 2.3.1
Fixed description in the profile section

Version 2.3.2
Fixed reference bug in guidance for 1 capability

Version 2.3.3

Content changes
Business domain\Privacy & Policy
Added comments fields for the last 2 questions

Process domain\SOC Management


Added support questions to continuous improvement
Added support questions to quality assurance
Converted architecture questions into a single maturity question with supporting questions
Process domain\detection engineering
Rephrased M 5.2.5 to include automated attack validation in the delivery pipeline

Technology domain\EDR
Removed duplicate capability (attack surface management), applied renumbering

Services domain\Security monitoring


Added OT monitoring capability
Reordered capabilities to optimize assessment flow
Provided specific guidance to monitoring capabilities

Results section
Mapping to NIST CSF 2.0 implemented
Results sharing resource embedded into assessment sheets

Bug fixes & typos:


Fixed several typos
Fixed several remarks fields
Fixed several alignment issues
Fixed link to NICE framework and added additional resources

Version 2.2

Business domain:
Governance
question 4.10 added (external SOC cooperation)
Privacy & Policy:
questions 5.1, 5.2 and 5.3 added (security policy)
question 5.4: additional NIST mapping applied
People domain:
Employees:
questions 1.9 and 1.10 added (KSAOs)
People management:
questions 3.5 and 3.6 added, renumbering applied (team goals and tracking of goals)
questions 3.13 and 3.14 added (multi-team systems and team performance)
Knowledge management:
question 4.4.1 added, renumbering applied (employee abilities)

Process domain:
Operations and facilities:
question 2.1.6 added (OPSEC program)
questions 2.3.2, 2.3.5, 2.3.9 added, renumbering applied (war room, physical storage, remote working)
question 2.4.2 added, renumbering applied (vigilance)
Reporting: (changed to reporting & communication)
question 3.8.6 added (proactive & reactive metrics)
questions 3.10.1 and 3.10.2 added (education & awareness)
question 3.11 added (communication)
Use case management:
question 4.1.9 (testing use cases) moved to detection engineering, renumbering applied
section 4.2 added (MITRE ATT&CK®)
section 4.3 added (visibility)
Detection Engineering & Validation:
completely new section

Technology domain:
All technologies: maintenance and support removed from capabilities, and moved to maturity (section x.4), renumbering applied

Services domain:
All services: question about onboarding procedure included
Threat Intelligence
question 4.14.25 added, renumbering applied (threat landscaping)
question 4.14.31 added (CTI infrastructure management)

Backend improvements:
calculations improved and simplified
Index updated from percentage completed to remaining questions
generic guidance applied for all capabilities (technology & services domain)
guidance added for new questions

Bug fixes & typos:


Typos fixed where found
conditional formatting error fixed

License updated:
CC BY-SA 4.0 license replaces previous GPLv3 license
Profile
1. Profile
2. Scope

Please fill in the information below to create a short profile of the SOC and the assessment

Assessment Details
Date of assessment 5/11/2024

Name(s)

Department(s) Estrategias

Intended purpose of the assessment 5/18/2024

Scope

Assessment type
Assessment style

Organisation & SOC Profile


Business size (FTE) 1.000-4.999
Sector Consulting
Number of years of SOC operations 3
SOC size (FTEs) 5
SOC organisational model Distributed SOC
SOC region South America
Geographic operation Regional

Target Maturity (optional; indicate a score from 1 to 5, decimals can be used)


Target maturity level business domain 3
Target maturity level people domain 3
Target maturity level process domain 3
Target maturity level technology domain 3
Target maturity level services domain 3
Target overall maturity level 3

Target Capability (optional; indicate a score from 1 to 3, decimals can be used)


Target capability level technology domain 2
Target capability level services domain 1
Target overall capability level 1.5

Notes or comments

Please select the services and technologies that should be included in the assessment. Excluding a service or technology here will exclude it from scoring. Note: changes to these values take some time to process.

SOC Tooling (Technology domain)


SIEM / UEBA
NDR
EDR
SOAR

SOC Services (services domain)


Security Monitoring
Security Incident Management
Security Analysis
Threat Intelligence
Threat Hunting
Vulnerability Management
Log Management
sment. Excluding a service or technology here will exclude it from scoring. Note: changes to these values take some time to process

Remarks
Security Information and Event Management tooling. Used to gather logging information from company assets and correlate events. Also includes User and Entity Behaviour Analytics (UEBA)
Network security solution, used to detect network exploits and anomalous network activity and perform network forensics
End-point security solution, used to prevent, detect and respond to threats on end-points
Used to automate workflows and SOC actions, support incident response and orchestrate between different security products

Remarks
The security monitoring service aims at detecting security incidents and events
The security incident management service aims at responding to security incidents in a timely, accurate and organized fashion
The security analysis service supports security monitoring and security incident management. Analysis includes event analysis and forensic analysis
The threat intelligence service provides information about potential threats that can be used in security monitoring, security incident response, security analysis and threat hunting
The hunting service takes a proactive approach to finding threats in the infrastructure. Threat intelligence is often used to guide hunting efforts
The vulnerability management service is used to detect vulnerabilities in assets by discovery and actively scanning assets for known vulnerabilities
The log management service is used to collect, store and retain logging. Can be used for compliance purposes as well as investigation purposes
Business
1. Business Drivers
2. Customers
3. Charter
4. Governance
5. Privacy & Policy

1 Business Drivers
1.1 Have you identified the main business drivers?
1.2 Have you documented the main business drivers?
1.3 Do you use business drivers in the decision making process?
1.4 Do you regularly check if the current service catalogue is aligned with business drivers?
1.5 Have the business drivers been validated with business stakeholders?

Comments and/or Remarks


1.6 Specify rationale for chosen values or any additional comments

Guidance
1.1: Most business drivers have been identified
1.2: Basic documentation of business drivers
1.3: Business drivers are occasionally used in decisions
1.4: Every change in the catalogue is checked against drivers
1.5: Alignment of SOC drivers with stakeholders is performed

Remarks
1.1: Example business drivers: cyber crime prevention, risk reduction, law / regulation, audit / compliance, business continuity
1.2: Documentation of business drivers is important for demonstrable business alignment
1.3: e.g. to determine priorities or make decisions regarding the on-boarding of new services or operations
1.4: i.e. do you check for services or operations that are outside the scope of business drivers?
1.5: Business stakeholders can be C-level management

2 Customers
2.1 Have you identified the SOC customers?
2.2 Please specify your customers:
2.2.1 Legal
2.2.2 Audit
2.2.3 Engineering / R&D
2.2.4 IT
2.2.5 Business
2.2.6 External customers
2.2.7 (Senior) Management
2.2.8 Other customers:

2.3 Have you documented the main SOC customers?


2.4 Do you differentiate output towards these specific customers?
2.5 Do you have service level agreements with these customers?
2.6 Do you regularly send updates to your customers?
2.7 Do you actively measure and manage customer satisfaction?

Comments and/or Remarks


2.8 Specify rationale for chosen values or any additional comments

Guidance
2.1: All customers are identified, including relevance and context
2.3: Single document, full description of SOC customers
2.4: All important customers receive differentiated output
2.5: Contract signed, approved by and regularly reviewed with customers
2.6: Periodical updates sent and discussed with all customers
2.7: Customer satisfaction fully managed and improved over time

Remarks
2.1: Types of customers, customer requirements / expectations, etc.
2.2: Use this as a guideline for answering 2.1. This is also potentially useful for insights and comparison with previous assessments
2.2.1: Legal department, may be a stakeholder for privacy, or may request forensic investigation from the SOC
2.2.2: The audit department can be supported by logging provided by the SOC
2.2.3: The engineering departments deal with Intellectual Property that may require additional access monitoring
2.2.4: IT departments can be supported by monitoring for anomalies in their infrastructure and systems
2.2.5: Business should be the most important customer, as all SOC activities ultimately support business processes
2.2.6: External customers mostly apply to managed service providers
2.2.7: Senior management may be a direct SOC customer, depending on organization hierarchy
2.2.8: Specify any additional customers
2.3: Formal registration of customer contact details, place in the organization, geolocation, etc.
2.4: For example, are communication style and contents to Business customers different than those to IT?
2.5: Service level agreements are used to provide standardized services operating within known boundaries
2.6: For example: changes in service scope or delivery. Can also be reports, dashboards, etc.
2.7: Understanding customer satisfaction will help to better align with business needs

3 Charter
3.1 Does the SOC have a formal charter document in place?
3.2 Please specify elements of the charter document:
3.2.1 Mission
3.2.2 Vision
3.2.3 Strategy
3.2.4 Service Scope
3.2.5 Deliverables
3.2.6 Responsibilities
3.2.7 Accountability
3.2.8 Operational Hours
3.2.9 Stakeholders
3.2.10 Objectives / Goals
3.2.11 Statement of success
Completeness
3.3 Is the SOC charter document regularly updated?
3.4 Is the SOC charter document approved by the business / CISO?
3.5 Are all stakeholders familiar with the SOC charter document contents?

Comments and/or Remarks


3.6 Specify rationale for chosen values or any additional comments

Answer
3.2 Completeness: Mostly complete

Guidance
3.1: Single charter, full description of SOC strategic elements
3.3: Charter is updated on major changes in business strategy
3.4: Business / CISO approves of the content, but not formally
3.5: All stakeholders are aware of the charter and its contents

Remarks
3.1: See 3.2 for charter document elements
3.2.1: A SOC mission should be established to provide insight into the reason for existence of the SOC
3.2.2: A vision should be created to determine long-term goals for the SOC
3.2.3: A strategy should be in place to show how to meet goals and targets set by mission and vision
3.2.4: Service scope is documented to provide insight into SOC service delivery
3.2.5: The output provided by the SOC, for example: reports, incidents, investigations, advisories, etc.
3.2.6: Responsibilities of the SOC
3.2.7: Accountability of the SOC for actions taken
3.2.8: Operational hours of the SOC
3.2.9: All relevant stakeholders for the SOC
3.2.10: Objectives and goals should be concrete and measurable so that they are fit for reporting purposes
3.2.11: A statement of success is used to determine when the SOC is successful. Should be aligned with goals and objectives
Completeness: Use this outcome as a guideline to determine the score for 3.1
3.3: Regularity should be matched to your own internal policy. At least yearly is recommended
3.4: Approval from the relevant stakeholders will aid in business support for SOC operations
3.5: Making stakeholders aware of the contents helps in getting organizational support for security operations
Business
1. Business Drivers 5. Privacy & Policy
2. Customers
3. Charter
4. Governance

4 Governance
4.1 Does the SOC have a governance process in place?
4.2 Have all governance elements been identified?
4.3 Please specify identified governance elements
4.3.1 Business Alignment
4.3.2 Accountability
4.3.3 Sponsorship
4.3.4 Mandate
4.3.5 Relationships & Third Party Management
4.3.6 Vendor Engagement
4.3.7 Service Commitment
4.3.8 Project / Program Management
4.3.9 Continual Improvement
4.3.10 Span of control / federation governance
4.3.11 Outsourced service management
4.3.12 SOC KPIs & Metrics
4.3.13 SOC risk management
4.3.14 Customer Engagement / Satisfaction
Completeness
4.4 Is cost management in place?
4.5 Please specify cost management elements
4.5.1 People cost
4.5.2 Process cost
4.5.3 Technology cost
4.5.4 Services cost
4.5.5 Facility cost
4.5.6 Budget forecasting
4.5.7 Budget alignment
4.5.8 Return on investment
Completeness
4.6 Are all governance elements formally documented?
4.7 Are SOC governance meetings regularly held?
4.8 Is the governance process regularly reviewed?
4.9 Is the governance process aligned with all stakeholders?
4.10 Is the SOC regularly audited or subjected to (external) assessments?
4.11 Is there an active cooperation with other SOCs (external)?

Comments and/or Remarks


4.12 Specify rationale for chosen values or any additional comments

Answer

Fully complete
Fully complete

4.1
4.2
4.4
4.6
4.7
4.8
4.9
4.10
4.11
Guidance
Several governance elements are in place, but not structurally
Some governance elements are identified and governed actively

Costs fully managed and formally aligned with business stakeholders


Governance document completed, approved and formally published
Governance meetings held in an ad-hoc fashion
Process is regularly and formally reviewed and updated with findings
Some stakeholders are aware of the process and its details
The SOC is regularly and informally assessed
Information exchanged regularly, cooperation not formalized
Remarks
A governance process is required to determine the way the SOC should be managed
Possible governance elements can be found under 4.3

Aligning SOC operations to business needs


Note that this can be part of the SOC charter document. This does not automatically make it part of the governance process
Can be part of stakeholder management
Mandate for the SOC should be established so that the SOC can take action in crisis situations
Both management of internal and external relationships
For example: active involvement of vendors in the creation of a vision and strategy for the SOC
For example: service level agreements and IT controls
Project management for individual projects within the SOC / program management for larger transitions
Improvement of the SOC and of SOC management
Especially important for SOC setups where multiple SOCs exist within the same company
Especially important for hybrid SOC setups. When using outsourcing, SLAs and oversight should be in place
These are discussed in more detail in the Process section regarding reporting
Identification of- and dealing with risk (business, people, process and technology risk) within the SOC
Are customers an integral part of your security operations? Is their satisfaction with SOC services ever inquired about?
Use this outcome as a guideline to determine the score for 4.2
Managing costs is required to justify budget allocation for the SOC and ensure continued service delivery in the future

Costs associated with employees. Should be managed to prove FTE requirements to stakeholders
Cost associated with processes. Should be managed to ensure process elements can be delivered
Cost associated with technology. Should be managed to prove budget requirements for new technology or replacement
Cost associated with service delivery. Especially important for managed service providers to ensure a healthy business model
Cost associated with facilities used by the SOC
Forecasting of required budget over time. Should be aligned with business needs; increased spending must be justified
Alignment of budget with business requirements and drivers to ensure balanced spending on the SOC
Prove the return on investment to stakeholders to ensure continued budget allocation
Use this outcome as a guideline to determine the score for 4.4
Formal documentation should be signed off and stored in a quality management system
Meetings at different levels (operational, tactical, strategic) should be formalised in Terms of Reference (ToR) and driven by an agenda
Frequency should be matched to your own internal policy. At least yearly is recommended
Alignment will help the SOC obtain required mandate, budget and management support
Frequency should be matched to the SOC policy. At least yearly is recommended. 3rd party assessments have a higher objectivity
Exchange of best practices, intelligence and actions on threats with other SOCs is vital for improving cyber defence
Business
1. Business Drivers 5. Privacy & Policy
2. Customers
3. Charter
4. Governance

5 Privacy & Policy


5.1 Is there an information security policy in place that supports the SOC activities?
5.2 Has a SOC policy been created?
5.3 Please specify elements of the SOC policy
5.3.1 Code of conduct
5.3.2 Rules of engagement & responsibilities
5.3.3 Review frequency of documentation
5.3.4 SOC assessment frequency and type
5.3.5 Knowledge exchange and maintenance
5.3.6 Exercise frequency
5.3.7 Usage of TLP
5.3.8 Working agreements
Completeness
5.4 Is the SOC consulted in the creation and updates of operational security policy?
5.5 Is a reporting policy for security incidents in place?
5.6 Is a privacy policy regarding security monitoring of employees in place?
5.7 Does the SOC operate in compliance with all applicable privacy laws and regulations?
5.8 Does the SOC cooperate with legal departments regarding privacy matters?
5.9 Are specific procedures in place for dealing with privacy related investigations?
5.10 Is the SOC aware of all information that it processes and is subject to privacy regulations?
5.11 Is a Privacy Impact Assessment (PIA) regularly conducted?

Comments and/or Remarks


5.12 Specify rationale for chosen values or any additional comments

Answer

Averagely complete



5.1
5.2
5.4
5.5
5.6
5.7
5.8
5.9
5.10
5.11
Guidance
Policy in place, SOC activities mentioned in detail with mandate
Some ad-hoc information across documents

SOC informed of policy creation and updates only


Policy in place, no mention of the SOC
A policy exists, but has not been accepted formally
Some regulations are known and the SOC is non-compliant
There is some ad-hoc cooperation between SOC and legal
Some ad-hoc information across documents
The SOC is fully aware, some information is formally identified
PIAs are conducted using a structured approach in an ad-hoc fashion
Remarks
A clear security policy that supports SOC operations provides guidance and helps to enforce mandate for the SOC in the organization
A SOC policy outlines the rules to which SOC personnel and SOC management need to adhere

How to behave in the SOC, mandatory and optional meetings, SOC culture, repercussions for non-compliance, etc.
What activities can and can not be performed as part of the job
What documentation is subjected to review and how often that documentation needs to be reviewed
How often the SOC is assessed and in what manner (self-assessment, audit, external assessment, etc.)
The means (meetings and platforms) for knowledge exchange and rules for maintenance of knowledge bases
Frequency and type (table top, cyber range, red team, etc.) of exercises in the SOC
Information exchange protocols. Especially important for collaborations outside the organisation
Agreements on length of meetings, length of agile sprints, transparency of completed work, etc.
Use this outcome as a guideline to determine the score for 5.2
Consulting the SOC in the creation of security policy will ensure that SOC activities are properly mentioned and enforceable
A reporting policy for security incidents will aid the SOC in identifying incidents and threats in the organization
A privacy policy should state that monitoring of employees is possible within acceptable limits
Local laws and regulations as well as company policy may apply and should all be considered
Cooperation will ensure that the SOC is enabled to perform activities, rather than blocked
Privacy related issues require careful examination, especially those potentially leading to court cases
Such information includes IP addresses, customer identifiers, user names, host names (for personally owned devices), etc.
Can be used to determine the impact of monitoring on privacy, and can help uncover potential violations
People
1. Employees 5. Training & Education
2. Roles and Hierarchy
3. People Management
4. Knowledge Management

1 Employees
1.1 How many FTE’s are in your SOC?
1.2 Do you use external employees / contractors in your SOC?
1.2.1 If yes, specify the number of external FTE's
1.3 Does the current size of the SOC meet FTE requirements?
1.4 Does the SOC meet requirements for internal to external employee FTE ratio?
1.5 Does the SOC meet requirements for internal to external employee skillset?
1.6 Are all positions filled?
1.7 Do you have a recruitment process in place?
1.8 Do you have a talent acquisition process in place?
1.9 Do you have specific KSAOs established for SOC personnel?
1.10 Do you actively seek to create a psychologically safe environment for SOC personnel?

Comments and/or Remarks


1.11 Specify rationale for chosen values or any additional comments

[1] See the CSIRT social maturity handbook, pages 134-135


[2] See the CSIRT social maturity handbook, pages 41-42
[3] See the NIST NICE framework, specifically the Cyber Defense Analyst and Incident Responder roles:
https://www.nist.gov/itl/applied-cybersecurity/nice/nice-framework-resource-center/nice-framework-current-versions

Answer
4


1.3
1.4
1.5
1.6
1.7
1.8
1.9
1.10
Guidance

The SOC is somewhat overstaffed or understaffed

There are too few or too many external employees

Some required skills are not present internally, and not transferred

All key positions filled

A full recruitment process is in place and performing effectively

A full acquisition process is in place, but not performing effectively

A full KSAO is created, but not actively used in staffing

A safe and actively managed environment exists and is evaluated


Remarks
Include both internal and external FTE's
External employees can be hired experts to fill in vacant positions or perform project activities
Current ratio: 125%
i.e. is the SOC size sufficient to realize business goals?
Note: requirements do not need to be explicit. Set importance to 'None' if you have no external employees.
i.e. Are there any crucial skills amongst external employees? Set importance to 'None' if you have no external employees
Unfilled positions may be due to deficiencies in the recruitment process
A recruitment process is required to obtain new employees in a market where talent is scarce [1]
Talent recruitment can be vital for SOC success, but talent retaining is equally important
Knowledge, Skills, Abilities and Other attributes (KSAOs) should be in place: technical, cognitive, social and character [2], [3]
A psychologically safe environment is an environment where everyone is able to speak their mind and feel valued
People
1. Employees 5. Training & Education
2. Roles and Hierarchy
3. People Management
4. Knowledge Management

2 Roles and Hierarchy


2.1 Do you formally differentiate roles within the SOC?
2.2 Which of the following roles are present in your SOC?
2.2.1 Security Analyst
2.2.2 Security / Systems Engineer
2.2.3 Forensic Analyst
2.2.4 Security Architect
2.2.5 Threat Intelligence Analyst
2.2.6 Data Scientist
2.2.7 SOC Manager
2.2.8 Team Leader
2.2.9 Incident Handler
2.2.10 Incident Manager
2.2.11 Penetration Tester
2.2.12 Detection engineer
2.2.13 Automation engineer
2.2.14 Others, specify:
2.3 Do you differentiate tiers within these roles?
2.4 Are all roles sufficiently staffed?
2.5 Is there a role-based hierarchy in your SOC?
2.6 Have you formally documented all SOC roles?
2.7 Please specify elements in the role documentation:
2.7.1 Role description
2.7.2 Role tasks
2.7.3 Role responsibilities
2.7.4 Role expectations
2.7.5 Required technical skills
2.7.6 Required soft skills
2.7.7 Required educational level
2.7.8 Required or preferred certifications
Completeness
2.8 Are responsibilities for each role understood?
2.9 Have you documented career progression requirements for each of these roles?
2.10 Do you regularly revise or update the role descriptions?

Comments and/or Remarks


2.11 Specify rationale for chosen values or any additional comments

Answer
Fully complete


2.1
2.3
2.4
2.5
2.6
2.8
2.9
2.10
Guidance
All roles are fully in use and formalized

All relevant roles are tiered and formalized

All roles fully meet FTE requirements

A full hierarchy is in place and formalized

Single document, full description of SOC roles


Full understanding of responsibilities formalized in training sessions

Single document, full description of career progression for roles

Documentation is reviewed ad-hoc, using a structured approach


Remarks
Use the roles in 2.2 to determine if you have all roles required in the SOC

Primarily responsible for triage and analysis of security alerts


Primarily responsible for technical / functional maintenance of security systems
Primarily responsible for in-depth analysis and security projects
Primarily responsible for technical architecture for security systems used within the SOC
Primarily responsible for analysis of threat intelligence
Primarily responsible for big data security analytics
Primarily responsible for managing SOC services
Primarily responsible for leading a team of other, for example, analysts and engineers
Primarily responsible for executing security incident management workflows
Primarily responsible for ensuring correct and timely management and escalation of security incidents
Primarily responsible for testing applications and systems for security weaknesses
Primarily responsible for creating and updating detection analytics
Primarily responsible for automation of repetitive SOC tasks
Specify any additional roles
If you have no tiers, and you feel this is not a restriction, select importance 'None'
Consider the staffing levels (desired FTE count) as well as knowledge and experience for all roles
If you have no hierarchy, and you feel this is not a restriction, select importance 'None'
Possible documentation elements can be found under 2.7

A formal description of the role


A description of tasks that are part of the role
The responsibilities of the role
This is an extension of responsibilities. Example expectation: take a pro-active leading role in case of security incidents
e.g. experience with specific technologies or products
e.g. communication skills, presentation skills
e.g. university college, university
e.g. technical security certifications or security management certifications
Use this outcome as a guideline to determine the score for 2.6
Responsibilities for each role should be clearly understood by all SOC personnel
Career progression for roles can be documented through training, certification, experience and soft skills requirements
To revise is to review and verify whether the documentation is still correct or requires an update
People
1. Employees 5. Training & Education
2. Roles and Hierarchy
3. People Management
4. Knowledge Management

3 People Management
3.1 Do you have a job rotation plan in place?
3.2 Do you have a career progression process in place?
3.3 Do you have a talent management process in place?
3.4 Do you have team diversity goals?
3.5 Have you established team goals?
3.6 Do you document and track individual team member goals?
3.7 Do you periodically evaluate SOC employees?
3.8 Do you have a 'new hire' process in place?
3.9 Are all SOC employees subjected to screening?
3.10 Do you measure employee satisfaction for improving the SOC?
3.11 Are there regular 1-on-1 meetings between the SOC manager and the employees?
3.12 Do you perform regular teambuilding exercises?
3.13 Do you perform regular teambuilding exercises with other teams relevant to the SOC?
3.14 Do you periodically evaluate team performance?

Comments and/or Remarks


3.15 Specify rationale for chosen values or any additional comments

Answer



3.1
3.2
3.3
3.4
3.5
3.6
3.7
3.8
3.9
3.10
3.11
3.12
3.13
3.14
Guidance
A plan covering some roles is in place, but not operational
A process covering some roles is in place and operational
No talent management process in place
Diversity goals have been formally defined and are not met
Team goals are determined, approved and tracked regularly
Individual goals are determined, approved and tracked regularly
Periodic evaluation is performed in an ad-hoc fashion
A process is in place, but does not cover all aspects
Basic screening procedure in place, applied structurally
Employee satisfaction is measured, not used for improvement
1-on-1 meetings are regularly held and used for coaching and growth
Exercises are regularly done and focused on improving team dynamics
MTS exercises are usually performed, but not embedded in processes
Periodic evaluation is performed, results are used for team growth
Remarks
Job rotation can be used to train employees in a variety of tasks and avoid too much routine
Career development, promotion, etc.
Talent should be adequately managed to retain such staff and fully develop their potential.
e.g. background diversity, ethnic diversity, gender diversity, etc.
Team goals help to bring focus to the team and monitor progress
Individual team member goals should be set to help grow the employee to full potential
Can also be included in the regular organization evaluation process
i.e. a defined process to quickly let new employees find their place and perform well in the SOC
Personnel screening is performed to avoid infiltration or misbehaviour by SOC employees
Employee satisfaction should be taken seriously as lack of satisfaction may lead to key personnel leaving
Such informal 1-on-1 conversations are used to coach employees and help the SOC manager gain insight into personal challenges
Teambuilding exercises are used to promote collaboration between individuals in the team and to raise team spirit
In multi-team systems (MTS), the SOC collaborates with other teams. Use cross-team teambuilding to maximize performance
Besides individual performance, team performance and dynamics are also important to measure and improve on
People
1. Employees 5. Training & Education
2. Roles and Hierarchy
3. People Management
4. Knowledge Management

4 Knowledge Management
4.1 Do you have a formal knowledge management process in place?
4.2 Do you have a skill matrix in place?
4.3 Please specify elements of the skill matrix:
4.3.1 All SOC employees
4.3.2 Hard skills
4.3.3 Soft skills
4.3.4 Skill levels (novice, intermediate, expert)
Completeness
4.4 Is the skill matrix actively used for team and personal improvement?
4.5 Do you have a knowledge matrix in place?
4.6 Please specify elements of the knowledge matrix:
4.6.1 All SOC employees
4.6.2 All relevant knowledge areas
4.6.3 Knowledge levels (novice, intermediate, expert)
Completeness
4.7 Is the knowledge matrix actively used to determine training and education needs?
4.8 Have you documented SOC team member abilities?
4.9 Do you regularly assess and revise the knowledge management process?
4.10 Is there effective tooling in place to support knowledge documentation and distribution?

Comments and/or Remarks


4.11 Specify rationale for chosen values or any additional comments

Answer

Fully complete

Fully complete


4.1
4.2
4.4
4.5
4.7
4.8
4.9
4.10
Guidance
A formal process is in place, covering all knowledge aspects
A complete skill matrix is in place and approved, not regularly updated

Matrix used to improve personal and team results, improvements tracked


A complete knowledge matrix is in place, approved and regularly updated

Matrix used to identify all training needs, but not tracked for execution
All employee abilities documented, but is not regularly updated
Documentation is regularly and informally reviewed and updated
Tooling is in place and use of the tool is embedded in processes
Remarks
Formal knowledge management helps to optimize knowledge creation and distribution
A matrix may consist of: SOC skills, SOC employees and skill levels (novice, intermediate, expert)

The skill matrix should cover all SOC employees, both internal and external
e.g. ability to effectively use analysis tools
e.g. communication skills
Determining and documenting skill levels helps to identify areas where limited expertise is available
Use this outcome as a guideline to determine the score for 4.2
Personal improvement is essential, team improvement requires insight in team dynamics and skill distribution

The knowledge matrix should cover all SOC employees, both internal and external
Knowledge for service delivery: technical (i.e. support), functional (i.e. configuration) and foundational (e.g. networking, encryption)
Determining and documenting knowledge levels helps to identify areas where limited expertise is available
Use this outcome as a guideline to determine the score for 4.5
The matrix should be used as a means to identify and resolve knowledge gaps
Besides knowledge and skills, team member abilities are also important to document
This refers to the knowledge management process as a whole
Such tooling can help to avoid investigating similar issues multiple times by integrating into the security monitoring process
People
1. Employees 5. Training & Education
2. Roles and Hierarchy
3. People Management
4. Knowledge Management

5 Training and Education


5.1 Do you have a training program in place?
5.2 Please specify elements of the training program:
5.2.1 Training on the Job
5.2.2 Product-specific training
5.2.3 Internal company training
5.2.4 Role-based specific training
5.2.5 Soft-skill training
5.2.6 Formal education
Completeness
5.3 Do you have a certification program in place?
5.4 Please specify elements of the certification program:
5.4.1 Internal certification track
5.4.2 External certification track
5.4.3 Re-certification track (continuous education)
Completeness
5.5 Is the training and certification program connected to evaluation and career progression?
5.6 Is there a reserved budget for education and training?
5.7 Is there a reserved amount of time for education and training?
5.8 Do you have regular workshops for knowledge development?
5.9 Do you regularly revise and update the training and certification programs?

Comments and/or Remarks


5.10 Specify rationale for chosen values or any additional comments

Answer

Mostly complete

Incomplete

5.1
5.3
5.5
5.6
5.7
5.8
5.9
Guidance
A training program covering all roles is in place, but not formalized

A certification program is not in place

The programs are formally embedded in evaluation and progression

Sufficient budget is allocated for the team as a whole

Employees have sufficient time, and encouraged to attend training

Workshops are held in an ad-hoc fashion

Programs are reviewed ad-hoc, not using a structured approach


Remarks
A training program is used to ensure a minimal level of knowledge for employees

Training on the job can be done internally by senior employees or using external consultants
Product-specific training may be required for new technologies or complex solutions
e.g. training on internal policies
For example: security analysis training for the security analyst role
To complement hard skills, soft skills should be trained as well
Formal education may be university or university college degrees
Use this outcome as a guideline to determine the score for 5.1
A certification program is used to provide a demonstrable minimum level of knowledge and skills

Internal certifications may be in place to demonstrate knowledge of company processes and policies
Certification track with external certification organizations (e.g. ISACA, (ISC)2, SANS)
Permanent education (PE) may be part of the certification itself
Use this outcome as a guideline to determine the score for 5.3
e.g. certain training and certifications are required to grow from a junior-level function to a more senior-level function
i.e. a fixed percentage of the total SOC budget that is allocated for education and cannot be used for other purposes
This is an extension of education budget
Workshops are an informal way of distributing knowledge
Training and certification must be a relevant reflection of SOC knowledge and skill requirements
Process
1. SOC Management 5. Detection Engineering & Validation
2. Operations & Facilities
3. Reporting & Communication
4. Use Case Management

1 Management
1.1 Is there a SOC management process in place?
1.2 Are SOC management elements formally identified and documented?
1.3 Please specify identified SOC management elements:
1.3.1 Internal relationship management
1.3.2 External relationship management
1.3.3 Vendor management
1.3.4 Continuous service improvement
1.3.5 Project methodology
1.3.6 Process documentation and diagrams
1.3.7 RACI matrix
1.3.8 Service Catalogue
1.3.9 Service on-boarding procedure
1.3.10 Service off-loading procedure
Completeness
1.4 Is the SOC management process regularly reviewed?
1.5 Is the SOC management process aligned with all stakeholders?
1.6 Have you implemented a process for continuous improvement (CI)?
1.7 Specify elements of the continuous improvement program:
1.7.1 Daily progress tracking
1.7.2 Weekly planning
1.7.3 Backlog management
1.7.4 Work item effort estimation
1.7.5 Work item prioritisation
1.7.6 Refinement
1.7.7 Capacity for change
Completeness
1.8 Have you implemented a process to manage SOC quality assurance (QA)?
1.9 Please specify elements of the quality assurance program:
1.9.1 Ticket quality assurance
1.9.2 Incident quality assurance
1.9.3 Service quality assurance
1.9.4 Process quality assurance
1.9.5 Report quality assurance
Completeness
1.10 Have you implemented a SOC architecture process?
1.11 Please specify elements of the SOC architecture:
1.11.1 SOC process architecture
1.11.2 SOC technology architecture
1.11.3 SOC service architecture
1.11.4 Architecture diagrams
1.11.5 Architecture principles
Completeness

Comments and/or Remarks


1.12 Specify rationale for chosen values or any additional comments

Answer

Averagely complete

Averagely complete
Averagely complete

Mostly complete

1.1
1.2
1.4
1.5
1.6
1.8
1.9
1.10
Guidance
An informal process is in place that covers most aspects
Single document, full description of SOC management process

Process is regularly and formally reviewed and updated with findings


All stakeholders are aware, not all stakeholders know its details
CI conducted structurally, not documented

QA conducted structurally, not documented


Basic documentation of SOC architecture, principles defined
Remarks
A SOC management process is used to manage all aspects of SOC service delivery and quality
Possible SOC management elements can be found under 1.3

Relationship management within the organization


Relationship management outside of the organization
Relationship management with relevant vendors for SOC technologies
A methodology for continuously improving on SOC service delivery and internal processes supporting service delivery
For example: LEAN or agile project approach
Any documentation on SOC processes or services. May contain diagrams explaining relationships between processes
A description of all SOC responsibilities, accountabilities and cases in which the SOC is informed or consulted
A description of all SOC services and service levels
Procedure for intake, evaluation and move-to-production for requests for new services or customers
Procedure to remove existing services and customers from service delivery
Use this outcome as a guideline to determine the score for 1.2
Regular review of the SOC management process ensures optimal performance
Alignment with stakeholders will ensure the SOC delivers services that meet customer expectations
Continuous improvement is a vital process for SOCs to evolve their capabilities

Daily progress tracking is used to identify (blocking) issues, determine priorities and request help for certain activities
Weekly planning is done to create a balanced improvement workload for the team
Managing the backlog includes structuring the backlog and grooming the backlog
Work item estimation (through t-shirt sizing, or more accurate estimation) is essential to realistic planning
Prioritisation of work items should follow a defined prioritisation method and be done by the owner of the backlog
Refinement of items, including a definition of ready, is required to ensure all team members understand the task at hand
Having a reserved capacity for change ensures the improvement is continuous, and not overtaken by operational tasks
Use this outcome as a guideline to determine the score for 1.6
Quality assurance is aimed at ensuring SOC processes, technology and services meet their quality requirements
Correct and timely analysis of alerts, including correct usage of playbooks
Correct and timely follow-up of incidents, including correctly following the incident response procedures
Delivery of services in accordance with established quality criteria
Execution of processes in accordance with established quality criteria
Correct and timely report provisioning
Use this outcome as a guideline to determine the score for 1.8
A SOC architecture describes how the different SOC components interact and integrate

A process architecture outlines the different processes within the SOC and how they interact / integrate
A technology architecture outlines the different technologies used within the SOC and how they interact / integrate
A service architecture outlines the different services used within the SOC and how they interact / integrate
Architecture diagrams are visualisations of components and integrations
Architecture principles are guidelines for implementing processes, technology & services
Use this outcome as a guideline to determine the score for 1.10
Process
1. SOC Management 5. Detection Engineering & Validation
2. Operations & Facilities
3. Reporting & Communication
4. Use Case Management

2 Operations and Facilities


2.1 Security operations exercises
2.1.1 Do you have a documented exercise plan?
2.1.2 Please specify types of exercises included in the plan
2.1.2.1 Table-top exercises
2.1.2.2 Playbook drills
2.1.2.3 Cyber range
2.1.2.4 Capture the flag
2.1.2.5 Purple/Red/Black team exercises
2.1.2.6 Public exercises [1]
Completeness
2.1.3 Do you perform security operations exercises regularly?
2.1.4 Are the results from exercises documented?
2.1.5 Is the output from exercises actively used to improve security operations?
2.2 Service delivery standardization
2.2.1 Do you have standard operating procedures?
2.2.2 Do you use checklists for recurring activities?
2.2.3 Do you use documented workflows?
2.2.4 Do you have a SOC operational handbook?
2.2.5 Have you established an Operational Security (OPSEC) program?
2.3 Process integration
2.3.1 Is the configuration management process integrated in the SOC?
2.3.2 Is the change management process integrated in the SOC?
2.3.3 Is the problem management process integrated in the SOC?
2.3.4 Is the incident management process integrated in the SOC?
2.3.5 Is the asset management process integrated in the SOC?
2.4 SOC Facilities
2.4.1 Do you have a dedicated physical SOC location?
2.4.2 Do you have a war room for the SOC?
2.4.3 Do you have a dedicated network for the SOC?
2.4.4 Do you have physical access control to the SOC location?
2.4.5 Do you have a secure physical storage location?
2.4.6 Do you have a video wall for monitoring purposes?
2.4.7 Do you have a call-center capability for the SOC?
2.4.8 Do you have specialized analyst workstations?
2.4.9 Have you optimized secure remote working capabilities for SOC employees?
2.5 Operational shifts
2.5.1 Do you use shift schedules?
2.5.2 Have schedules been created to optimize vigilance during shifts?
2.5.3 Do you have a shift log?
2.5.4 Do you have a formally described shift turnover procedure?
2.5.5 Do you have a daily SOC operational stand-up?
2.5.6 Do you have stand-by arrangements with employees within the SOC?
2.6 Knowledge & document management
2.6.1 Do you have a Document Management System in place?
2.6.2 Do you have a knowledge & collaboration platform in place?

Comments and/or Remarks


2.7 Specify rationale for chosen values or any additional comments

[1] Public exercises may include:


- FS-ISAC CAPS (payment systems)
https://www.fsisac.com/resilience/exercises
- ENISA Cyber Europe
https://www.enisa.europa.eu/topics/training-and-exercises/cyber-exercises/
- CISA Cyber Storm
https://www.cisa.gov/cyber-storm-securing-cyber-space

Answer

Averagely complete
2.1.1
2.1.3
2.1.4
2.1.5
2.2.1
2.2.2
2.2.3
2.2.4
2.2.5
2.3.1
2.3.2
2.3.3
2.3.4
2.3.5
2.4.1
2.4.2
2.4.3
2.4.4
2.4.5
2.4.6
2.4.7
2.4.8
2.4.9
2.5.1
2.5.2
2.5.3
2.5.4
2.5.5
2.5.6
2.6.1
2.6.2
Guidance

Basic description of SOC exercises

Exercises are sometimes performed in a structured manner


Results documented in a single document
Improvements determined, formally assigned, implementation tracked

Most procedures are in place


Checklists are in place, but not used consistently
Single document, full description of workflows
Single document, full description of SOC tasks & rules

Consistent OPSEC documentation, no program

Configuration management is mostly automated


Change management process in place, structurally executed
Problem management process in place, structurally executed
Incident management process in place, structurally executed
All asset management updates reflected in CMDB and security tooling

SOC established on single floor


Room available, not equipped and not dedicated for the SOC
Most SOC equipment in separate network, basic access controls in place
Dedicated access control in place using badges, access restricted
Dedicated secure storage in place, granular access control, no monitoring
Multiple screens in place, showing prioritized events and alerts
Some basic communication means in place
Dedicated analyst workstations, toolset not standardized
Remote working facilitated, fully secured, not actively monitored

Shift schedules in place, guaranteeing full shift coverage for all roles
Shift schedule optimized for vigilance, but not regularly improved
Shift log in place, fully accurate and up to date
Basic shift turnover procedure created
Stand-up carried out regularly, but not in structured fashion
Stand-by arrangement in place, not supported by tooling and not tested

Documentation centralized on file shares


Knowledge & collaboration performed in an ad-hoc fashion
Remarks

An exercise plan lists the types of exercises to be conducted, their frequency and the exercise goals

Table-top exercises are an easy way to go through a scenario and determine if all procedures and actions are clear
Playbook drills are low-level exercises to test the accuracy, effectiveness, and efficiency of playbooks
Cyber range exercises provide a simulated environment that is used to train analysts. A simulation of the actual IT environment is preferred
A capture the flag event is a gamification for security analysts, in which they must achieve a specified goal
These are types of exercises to conduct an actual attack against the organisation, without malicious intent
Public exercises are exercises that the organisation can participate in. Often, these are large-scale exercises
Use this outcome as a guideline to determine the score for 2.1.1
Regularity should be matched to your own internal policy
Results from exercises should be structurally documented for future reference and identification of improvements
Exercise output should be used to structurally improve security operations

Standard operating procedures are used to provide consistent output


Checklists can be useful to prevent recurring activities from being overlooked
Workflows are used to standardize steps in, for example, security analysis
A SOC operational handbook contains an overview of SOC tasks, as well as rules of engagement and expected behavior
An OPSEC program dictates security rules to ensure the integrity and confidentiality of SOC processes, tools and information

SOC services and procedures should be aligned and integrated with the organization's configuration management process
SOC services and procedures should be aligned and integrated with the organization's change management process
SOC services and procedures should be aligned and integrated with the organization's problem management process
SOC services and procedures should be aligned and integrated with the organization's incident management process
SOC services and procedures should be aligned and integrated with the organization's asset management process

A dedicated physical location decreases likelihood of unauthorized access and provides confidentiality for security incident handling
A dedicated facility for coordination of major security incidents
Given the confidentiality of the SOC and the importance of monitoring, it is recommended to use a separate network
e.g. key cards (badges) for access with access logging
Secure storage facilities can be used to store evidence collected during investigations or other operational security purposes
A video wall can be used to display the real-time security status and can be used for decision making as well as PR
Since communication and coordination are important features of a SOC, call-center capability may be required
e.g. multiple screen setup, virtual machines, etc.
Secure working enabled means secure access (MFA, encryption, etc.); secure working facilitated also means properly equipped

Shift schedules are used to ensure full shift coverage


Schedules should be created to optimize vigilance. Thus, shifts should include mandatory breaks and should not be too long
A shift log covers all exceptions found during the shift, running investigations, etc.
i.e. a procedure for handing over a shift and exchanging information regarding running tasks or issues for further follow-up
This can also be a call in case physical attendance is not possible for all attendees
i.e. is there a formal stand-by function that obligates employees to be able to be reached within a certain time?

The system should support different file types, authorizations and version management; possibly even encryption
e.g. a wiki space or SharePoint that allows collaboration and supports team efforts

3 Reporting & communication


3.1 Do you regularly provide reports?
3.2 Are these reports tailored to the recipients?
3.3 Are the report contents approved by or reviewed by the recipients?
3.4 Do you have established reporting lines within the organization?
3.5 Do you regularly revise and update the report templates?
3.6 Do you have formal agreements with the recipients regarding reports?
3.7 Do you provide different types of reports to your recipients?
3.8 Please specify SOC report types:
3.8.1 Technical security reports
3.8.2 Executive security reports
3.8.3 Operational reports
3.8.4 Incident reports
3.8.5 Newsletter or digest
3.8.6 KPI reports
3.8.7 Trend reports
3.8.8 Real-time reporting dashboards
Completeness
3.9 Do you use different types of metrics in your reports?
3.10 Please specify SOC metric types[1]
3.10.1 Are quantitative metrics used in reports?
3.10.2 Are qualitative metrics used in reports?
3.10.3 Are incident & case metrics used in reports?
3.10.4 Are timing metrics used in reports?
3.10.5 Are metrics regarding SLAs used in reports?
3.10.6 Are proactive and reactive metrics used in reports?
Completeness
3.11 Advisories
3.11.1 Do you provide advisories to the organization regarding threats and vulnerabilities?
3.11.2 Do you perform risk / impact assessments of these advisories?
3.11.3 Do you perform follow-up of these advisories?
3.12 Education and Awareness
3.12.1 Do you provide education and security awareness to the organization?
3.12.2 Do you measure the effect of education and security awareness efforts?
3.13 Communication
3.13.1 Do you use communication templates?
3.13.2 Do you have a communication matrix in place?
3.13.3 Is communication training (verbal/written) available for SOC personnel?
3.13.4 Are communication skills element of SOC role descriptions?

Comments and/or Remarks


3.14 Specify rationale for chosen values or any additional comments

[1] SOC-CMM has published a metrics suite that can serve as a starting point:
https://www.soc-cmm.com/products/metrics/

Answer

Averagely complete

Averagely complete

3.1
3.2
3.3
3.4
3.5
3.6
3.7
3.9
3.11.1
3.11.2
3.11.3
3.12.1
3.12.2
3.13.1
3.13.2
3.13.3
3.13.4
Guidance
Reports are provided regularly, standardized and regularly optimized
Reports fully tailored to recipients, manual customization required
Reports regularly reviewed, not formally signed off by recipients
Report dissemination through standard and approved reporting lines
Report templates regularly revised and updated
Formal agreements exists, not measured
Required reporting types provided, not regularly evaluated

Required metric types used, not regularly evaluated

Advisories provided regularly, format discussed but not approved


Unstructured risk & impact assessments performed
Follow-up performed for most advisories, aligned with ITSM processes

E&A provided through an established program


Efforts measured in a structured fashion, output used in improvement

Templates created, but not used consistently


Communication information available in a single matrix
Communication training formal part of employee onboarding and evaluation
Communication skills documented in role description
Remarks
Regular reports help to keep customers informed of SOC activities
e.g. management reports for senior management, technical reports for the IT organization
formal sign-off can be part of a larger service delivery sign-off
e.g. reporting lines could be: SOC management, IT management, senior management
Report templates should be regularly optimized to ensure continued relevance
For example: timelines of delivery, report contents, etc.
Different types of reports provide more insight into security operations

i.e. reports regarding technical issues or technical solutions to security issues


i.e. reports aimed at senior executives to inform them of SOC services
i.e. reports regarding security operations in general
Ad-hoc reports created to provide insight into incidents. This can also be part of incident management
A newsletter can be an informal way to provide updates to the organization
KPI reports are used to measure service performance
Trend reports can be used to determine changes over time
Real-time reporting dashboards provide immediate insight into the current performance level
Use this outcome as a guideline to determine the score for 3.7
Different types of metrics provide more insight into security operations

Event count, false-positive rate, number of service requests, etc.


i.e. risk level, customer satisfaction
e.g. the number of cases and incidents, number of incidents detected by SOC, average cost per incident, etc.
Time to detect, time to contain, time to eradicate
e.g. service availability, incidents handled within agreed time period, etc.
Proactive metrics can help to show how the team is actively preventing incidents from occurring
Use this outcome as a guideline to determine the score for 3.9

Advisories are used to inform customers of security threats and vulnerabilities


i.e. do you add organizational context to these advisories?
i.e. do you assist in coordination when required?

The SOC may be involved in security awareness and education to make users in the organisation aware of their role in cyber security
Measuring the efforts is necessary for improvement of the security awareness function

Communication templates help to standardize and professionalize SOC communication to stakeholders


A communication matrix can be used to decide when, how and with whom to communicate
Communication training helps SOC personnel to communicate effectively
Communication skills are important for analysts to explain their findings and recommendations
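The timing metrics mentioned in the remarks above (time to detect, time to contain) can be derived directly from incident timestamps. The sketch below is illustrative only; the record fields and timestamps are hypothetical examples, not a SOC-CMM schema.

```python
from datetime import datetime
from statistics import mean

# Hypothetical incident records; field names are illustrative.
incidents = [
    {"occurred": "2024-03-01T08:00", "detected": "2024-03-01T08:45", "contained": "2024-03-01T10:00"},
    {"occurred": "2024-03-05T14:00", "detected": "2024-03-05T14:10", "contained": "2024-03-05T16:40"},
]

def _minutes(start, end):
    """Elapsed minutes between two ISO-like timestamps."""
    fmt = "%Y-%m-%dT%H:%M"
    return (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds() / 60

# Mean time to detect (MTTD) and mean time to contain (MTTC), in minutes
mttd = mean(_minutes(i["occurred"], i["detected"]) for i in incidents)
mttc = mean(_minutes(i["detected"], i["contained"]) for i in incidents)
print(f"MTTD: {mttd} min, MTTC: {mttc} min")
```

The same pattern extends to time-to-eradicate or SLA metrics by adding the corresponding timestamps per incident.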

4 Use Case Management


4.1 Use Case Management
4.1.1 Is there a use case management process or framework in place?
4.1.2 Are use cases formally documented?
4.1.3 Are use cases approved by relevant stakeholders?
4.1.4 Is the use case management process aligned with other important processes?
4.1.5 Are use cases created using a standardized process?
4.1.6 Are use cases created using a top-down approach?
4.1.7 Can use cases be traced from high-level drivers to low-level implementation?
4.1.8 Can use cases be traced from low-level implementation to high-level drivers?
4.1.9 Are use cases measured for implementation and effectiveness?
4.1.10 Are use cases scored and prioritized based on risk levels?
4.1.11 Are use cases regularly revised and updated?

4.2 Mitre ATT&CK® / Threat Intelligence


4.2.1 Do you measure use cases against the MITRE ATT&CK® framework for gap analysis purposes?
4.2.2 Are monitoring rules tagged with MITRE ATT&CK® framework identifiers? [2]
4.2.3 Have you created a MITRE ATT&CK® risk profile for your organization?
4.2.4 Have you prioritized MITRE ATT&CK® techniques for relevance? [3]
4.2.5 Is use case output (alerts) used in threat intelligence activities?
4.2.6 Is threat intelligence used for the creation and updates of use cases?

4.3 Visibility
4.3.1 Do you determine and document visibility requirements for each use case?
4.3.2 Do you measure visibility status for your use cases for gap analysis purposes?
4.3.3 Do you map data source visibility to the MITRE ATT&CK® framework? [4]

Comments and/or Remarks


4.4 Specify rationale for chosen values or any additional comments
[1] The MaGMa Use Case Framework is a framework and tool for use case management created by the Dutch
financial sector and can be obtained from the following location:
https://www.betaalvereniging.nl/en/safety/magma/

[2] The OSSEM Detection Model can be useful:


https://github.com/OTRF/OSSEM
[3] A useful resource can be the Mitre Assistant
https://ma-insights.vercel.app/overview
[4] For example, using the DeTT&CT tooling:
https://github.com/rabobank-cdc/DeTTECT

Answer


4.1.1
4.1.2

4.1.3


4.1.4
4.1.5
4.1.6
4.1.7
4.1.8
4.1.9
4.1.10
4.1.11
4.2.1
4.2.2
4.2.3
4.2.4
4.2.5
4.2.6
4.3.1
4.3.2
4.3.3
Guidance

Basic process in place, not applied to all phases of the use case lifecycle
Single repository, full description of use cases
Use cases not approved, all critical use cases known to stakeholders
Alignment done structurally and regularly with relevant processes
Use cases mostly created in a structured and documented fashion
Use cases created in a structured top-down way, SOC context only
Traceability is possible for all use cases, but requires manual effort
Traceability is possible for all use cases, but requires manual effort
Metrics applied to all use cases, no risk-based feedback loop
All use cases scored and prioritized, validated & reviewed by stakeholders
All use cases are regularly and formally reviewed and updated

All use cases continuously measured, output used in improvement


High- and medium risk monitoring rules tagged
MITRE ATT&CK® profile created, not validated or maintained
ATT&CK® techniques prioritized, validated and regularly updated
Formal process in place, connecting use case output to TI activities
TI used in use case creation / updates for high-risk threats

Visibility requirements determined for most use cases, not documented


Visibility status measured frequently, output used in improvement
Data sources continuously mapped to ATT&CK®, output used in improvement
Remarks

A framework, such as MaGMa UCF[1], can be used to guide the use case lifecycle and document use cases in a standardized format
Formal documentation may include use case documentation templates
e.g. business stakeholders, IT stakeholders, CISOs, audit & compliance, risk management, etc.
e.g. integration with the threat / risk management process to revise use cases when the threat landscape changes
i.e. a standardized approach to derive use cases from threats or business requirements
e.g. use cases can be derived from business requirements, risk assessments, threat management / intelligence
Top-down traceability is important to determine completeness of implementation and demonstrable risk reduction
Bottom-up traceability is important for contextualizing use case output and business alignment
Metrics can be applied to use cases to determine growth and maturity by measuring effectiveness and implementation
Risks can be (cyber)threats, but also non-compliance or penalties (laws & regulations)
Use cases should be subjected to life cycle management and may require updates or may be outdated and decommissioned

By measuring use cases against MITRE ATT&CK®, it is possible to determine strengths and weaknesses in your layered detection
Tagging monitoring rules with MITRE ATT&CK® identifiers allows for reporting on sightings of attack techniques
The creation of a risk profile in MITRE ATT&CK® can help to identify relevant attack techniques
Using organizational context (protection and detection mechanisms), ATT&CK® techniques can be prioritized
Using MITRE ATT&CK®, it is possible to connect alerts to specific threat actors, or potentially even active campaigns
Threat intelligence can provide input into security monitoring, especially when using MITRE ATT&CK® to connect both

Data source requirements should be part of use case design


Measuring visibility status of use cases will help to identify use cases that require additional data sources for detection optimization
Mapping available data sources to MITRE ATT&CK® data source requirements will help identify gaps in visibility of attack techniques
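Measuring use cases against MITRE ATT&CK® for gap analysis (4.2.1) comes down to comparing the technique tags on deployed monitoring rules with a prioritized technique list from the risk profile. A minimal sketch, with hypothetical rule names and technique IDs:

```python
# Hypothetical data: technique IDs prioritized from a risk profile, and the
# ATT&CK tags attached to deployed monitoring rules. IDs are illustrative.
prioritized = {"T1059", "T1021", "T1566", "T1078"}
rule_tags = {
    "rule_powershell_exec": ["T1059"],
    "rule_phishing_attachment": ["T1566"],
}

# Flatten all tags into the set of covered techniques
covered = {t for tags in rule_tags.values() for t in tags}
gaps = sorted(prioritized - covered)
print("Covered:", sorted(covered))
print("Gaps:", gaps)  # prioritized techniques without any detection rule
```

Dedicated tooling such as DeTT&CT performs this kind of mapping at scale, including data source scoring, but the underlying set comparison is the same.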

5 Detection Engineering & Validation


5.1 Detection Engineering
5.1.1 Do you have a detection engineering process in place?
5.1.2 Is the detection engineering process formally documented?
5.1.3 Are there specific roles and requirements for detection engineers?
5.1.4 Is there active cooperation between the SOC analysts and the detection engineers?
5.1.5 Is there active cooperation between the Threat Intelligence analysts and detection engineers?
5.1.6 Is there a formal hand-over to the analyst team?
5.1.7 Is there a testing environment to test and validate detections before deploying them?
5.1.8 Is there a formal release process in place for new detections?
5.1.9 Do you apply a versioning system to detections?
5.1.10 Do you have a roll-back procedure in place in case of problems with detections?

5.2 Detection validation


5.2.1 Do you perform adversary emulation or automated detection testing? [1]
5.2.2 Do you test for detection of MITRE ATT&CK® techniques?
5.2.3 Do you test detection analytics not directly associated with MITRE ATT&CK®?
5.2.4 Do you test response playbooks?
5.2.5 Is detection validation fully integrated in the detection engineering process / pipeline?
5.2.6 Is the outcome from detection validation used as input into monitoring and detection engineering?
5.2.7 Do you monitor the data ingestion status for data sources?
5.2.8 Do you actively measure and improve data source coverage?

Comments and/or Remarks


5.3 Specify rationale for chosen values or any additional comments

[1] example tools / resources include:


- Atomic red team
- MITRE Caldera
- MAAD-AF
- Breach & Attack Simulation (BAS) tools
- CTID Adversary Emulation Library

Answer


5.1.1
5.1.2
5.1.3
5.1.4
5.1.5
5.1.6
5.1.7
5.1.8
5.1.9
5.1.10
5.2.1
5.2.2
5.2.3
5.2.4
5.2.5
5.2.6
5.2.7
5.2.8
Guidance

Basic process in place, not applied to all use cases

Basic documentation of detection engineering process

Requirements identified, role defined but not documented

SOC analysts are informed and review outcomes

Threat analysts are informed and review outcomes

Handover performed, process documentation in place

Testing environment used, testing process documented

Releases done structurally, process not documented or formalised

Versioning system used for some detections

Roll-back capability in place and documented

Validation activities performed structurally following a documented process

All use cases tested, visibility and detection targets used in improvements

All use cases tested, process formalized, detection targets set

Some response playbooks tested, no formal process

Release process triggers ADT/AE for all use cases, documented process

ADT/AE outcome used, no documented process

Data ingestion status monitored, not complete for all data source types

Data source coverage measured, not complete for all data source types
Remarks

A detection engineering process supports the creation and deployment of detection rules for security monitoring purposes
Formal documentation supports process standardisation, and allows for faster training of new engineers
Detection engineers have a skillset that is different from security analysts and security engineers
SOC analysts deal with alerts resulting from detections created by engineers, so tight interaction is required to optimize the detections
Threat intelligence is a major input into the creation or updating of detection rules
Once the detections are created, they must be operationalized. This should be done with a formal hand-over to production
A testing environment allows for thorough testing of new detections, which ensures a higher level of quality
A formal release process includes automated deployment of rules and adheres to organizational change management processes
A versioning system allows reverting to previous versions of detections
A roll-back procedure enables reverting back to a known-good state if a deployment has an adverse effect on security monitoring

These validation activities provide insight into how well security monitoring is able to detect certain adversaries or attack techniques
Testing for MITRE ATT&CK® techniques can augment mapping of use cases and visibility in MITRE ATT&CK®
Not all use cases and risks have a relationship to MITRE ATT&CK®. These use cases should be tested as well
Testing both detection and response provides a more complete view of SOC capabilities
When deploying new or updated detections, automated detection testing should be executed as a quality gate
Output should lead to updates in detections and new detections, as well as instructions for SOC analysts
Monitoring data ingestion is used to identify data ingestion problems or inactive data sources
Data source coverage should be optimized to avoid blind spots in monitoring
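The release, versioning and roll-back remarks above can be combined into a simple quality gate in the detection engineering pipeline. This is an illustrative sketch only; the detection fields and validation results are hypothetical, not a prescribed SOC-CMM format.

```python
# Hypothetical detection record; a real pipeline would pull these fields from
# the rule repository and the adversary-emulation / detection-testing tooling.
detection = {
    "name": "suspicious_service_install",
    "version": "1.2.0",
    "attack_ids": ["T1543"],
    "previous_version": "1.1.0",  # retained to support roll-back
}
validation_results = {"suspicious_service_install": "pass"}

def release_allowed(det, results):
    """Gate: only versioned detections with a passing validation run may deploy."""
    return bool(det.get("version")) and results.get(det["name"]) == "pass"

if release_allowed(detection, validation_results):
    print(f"deploy {detection['name']} {detection['version']}")
else:
    print(f"roll back to {detection['previous_version']}")
```

In a real setting the gate would sit in CI/CD, trigger the automated detection tests (5.2.1) on each release, and record the outcome for the analyst hand-over (5.1.6).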
Technology
1. SIEM / UEBA
2. NDR
3. EDR
4. SOAR

1 SIEM tooling

Maturity
1.1 Accountability
1.1.1 Has functional ownership of the solution been formally assigned?
1.1.2 Has technical ownership of the solution been formally assigned?
1.2 Documentation
1.2.1 Has the solution been technically documented?
1.2.2 Has the solution been functionally documented?
1.3 Personnel & support
1.3.1 Is there dedicated personnel for support?
1.3.2 Is the personnel for support formally trained?
1.3.3 Is the personnel for support certified?
1.3.4 Is there a support contract for the solution?
1.4 Maintenance & configuration
1.4.1 Is the system regularly maintained?
1.4.2 Is remote maintenance on the system managed?
1.4.3 Are maintenance & configuration updates executed through the change management process?
1.4.4 Have you established maintenance windows?
1.4.5 Is maintenance performed using authorised and trusted tooling?
1.5 Availability & Integrity
1.5.1 Is there high availability (HA) in place for the solution?
1.5.2 Is there data backup / replication in place for the solution?
1.5.3 Is there configuration backup / replication in place for the solution?
1.5.4 Is there a Disaster Recovery plan in place for this solution?
1.5.5 Is the Disaster Recovery plan regularly tested?
1.5.6 Is there a separate development / test environment for this solution?
1.6 Access management
1.6.1 Is access to the solution limited to authorized personnel?
1.6.2 Are access rights regularly reviewed and revoked if required?
1.6.3 Is a break glass procedure in place?

Capability
1.7 Specify which technological capabilities and artefacts are present and implemented:
Technical capabilities
1.7.1 Subtle event detection
1.7.2 Automated alerting
1.7.3 Alert acknowledgement
1.7.4 Case management system
1.7.5 Network model
1.7.6 Detailed audit trail of analyst activities
1.7.7 Historical activity detection
1.7.8 Flexible and scalable architecture
1.7.9 MITRE ATT&CK® identifier tagging

Data ingestion and processing


1.7.10 Aggregation
1.7.11 Normalisation
1.7.12 Correlation
1.7.13 Multi-stage correlation
1.7.14 Custom parsing
1.7.15 API Integration
1.7.16 Secure Event Transfer
1.7.17 Support for multiple event transfer technologies

Integrations (technical & process)


1.7.18 Asset management integration
1.7.19 Business context integration
1.7.20 Identity context integration
1.7.21 Asset context integration
1.7.22 Vulnerability context integration
1.7.23 Threat Intelligence integration
1.7.24 Threat hunting integration
1.7.25 Security incident management integration
1.7.26 SOAR integration
Rule-based detection
1.7.27 Standard detection rules
1.7.28 Custom detection rules

Anomaly detection
1.7.29 User anomalies
1.7.30 Application anomalies
1.7.31 Device anomalies
1.7.32 Network anomalies

Visualisation and output


1.7.33 Reporting
1.7.34 Dashboards
1.7.35 Data visualization techniques
1.7.36 Data drilldowns
1.7.37 Central analysis console
1.7.38 Advanced searching and querying
Completeness (%)

Comments and/or Remarks


1.8 Specify rationale for chosen values or any additional comments
Answer



74

Maturity 1.1.1
1.1.2
1.2.1
1.2.2
1.3.1
1.3.2
1.3.3
1.3.4
1.4.1
1.4.2
1.4.3
1.4.4
1.4.5
1.5.1
1.5.2
1.5.3
1.5.4
1.5.5
1.5.6
1.6.1
1.6.2
1.6.3

Capability 1.7.1
1.7.2
1.7.3
1.7.4
1.7.5
1.7.6
1.7.7
1.7.8
1.7.9
1.7.10
1.7.11
1.7.12
1.7.13
1.7.14
1.7.15
1.7.16
1.7.17
1.7.18
1.7.19
1.7.20
1.7.21
1.7.22
1.7.23
1.7.24
1.7.25
1.7.26
1.7.27
1.7.28
1.7.29
1.7.30
1.7.31
1.7.32
1.7.33
1.7.34
1.7.35
1.7.36
1.7.37
1.7.38
Guidance

Functional ownership fully described and assigned, not approved

Technical ownership fully described and assigned, not approved

Single document, full technical description of SIEM system

Single document, full functional description of SIEM system

Sufficient dedicated personnel available, documented and formalized

All personnel formally trained

All personnel formally certified

Support contract in place, covering most SOC requirements

System maintenance done structurally, following procedures

Remote maintenance controlled, in a documented process

All maintenance executed through changes, no formal approval

Established, formally approved & aligned with change management

Authorized & updated tooling used, regularly evaluated

Fully automated HA in place, not aligned with business continuity plans

Daily backup routine in place

Daily backup routine in place

Full DR plan in place, not approved by business continuity stakeholders

DR plan tested, but not formally

Test environment not in place, testing performed in ad-hoc fashion

Granular access rights implemented and monitored, not audited


Access rights reviewed periodically and structurally

Break glass account and defined procedure in place

Mostly implemented, documented and approved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Mostly implemented, documented and approved

Fully implemented, documented, approved, actively improved

Mostly implemented, documented and approved

Fully implemented, documented, approved, actively improved

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Averagely implemented, partially documented

Averagely implemented, partially documented

Mostly implemented, documented and approved

Fully implemented, documented, approved, actively improved

Averagely implemented, partially documented

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Averagely implemented, partially documented

Partially implemented, incomplete

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Mostly implemented, documented and approved


Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Averagely implemented, partially documented

Averagely implemented, partially documented

Averagely implemented, partially documented

Averagely implemented, partially documented

Fully implemented, documented, approved, actively improved

Averagely implemented, partially documented

Mostly implemented, documented and approved

Fully implemented, documented, approved, actively improved

Mostly implemented, documented and approved

Mostly implemented, documented and approved


Remarks

Functional ownership includes functional accountability


Technical ownership includes technical accountability

A technical description of the SIEM system components and configuration


A description of the SIEM functional configuration (rules, filters, lists, etc.)

Dedicated personnel should be in place to ensure that support is always available. Can also be staff with outsourced providers
Training helps to jump start new hires, and to learn a proper way of working with the tool
Certification demonstrates ability to handle the tooling properly
A support contract may cover on-site support, support availability, response times, escalation and full access to resources

Systems should be regularly maintained to keep up with the latest features and bug fixes
Remote maintenance by a third party may be part of system maintenance procedures
Maintenance and configuration updates should follow the formal change management process
Setting maintenance windows helps to structure the maintenance process and make it more predictable
Performing maintenance with authorised and trusted tooling helps to ensure security and integrity of the system

Can be fully implemented HA, partially implemented, hot spare, etc.


May not be feasible for all SIEM solutions
Configuration synchronization could be part of a HA setup
A DR plan is required to restore service in case of catastrophic events
Testing the DR plan is required to ensure that it is complete and functional, and that tasks and responsibilities for involved personnel are understood
A separate test environment allows for testing of new configurations before deployment in production

The system will contain confidential information and information that possibly impacts employee privacy
Revocation is part of normal employee termination. Special emergency revocation should be in place for suspected misuse
A break glass procedure and account is required to gain access to the tooling even in case of major IT outages

Capability to detect slight changes in systems, applications or network that may indicate malicious behavior
Alerting based on different alerting mechanisms (SMS, mail, etc.)
Capability to acknowledge alerts so other analysts know the alert is being investigated
A case management system that supports SOC analyst workflows
A full network model in which zones and segments are defined
The audit trail can be used to report on analyst activities and to uncover potential abuse of the big data solution
Capability of detecting historical activity for recently uncovered threats
A flexible and scalable architecture supports the SOC as it grows in size (FTE) and data (coverage)
Add MITRE ATT&CK® tags to rules / analytics for mapping purposes
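The ATT&CK® tagging remark above can be illustrated with a short sketch: technique tags are attached to detection rules and then summarised per tactic for coverage mapping. The rule names, tags and the technique-to-tactic table below are illustrative assumptions, not part of any particular SIEM product (a real mapping would come from the ATT&CK STIX data):

```python
# Sketch: tag detection rules with MITRE ATT&CK technique IDs and
# summarise how many rules cover each tactic (coverage mapping).
from collections import defaultdict

rules = [
    {"name": "Suspicious PowerShell download", "attack": ["T1059.001", "T1105"]},
    {"name": "Brute-force logon attempts", "attack": ["T1110"]},
    {"name": "New scheduled task created", "attack": ["T1053.005"]},
]

# Minimal technique -> tactic lookup, hard-coded for this sketch only.
TACTIC = {
    "T1059.001": "execution",
    "T1105": "command-and-control",
    "T1110": "credential-access",
    "T1053.005": "persistence",
}

def coverage(rules):
    """Count how many rules reference each ATT&CK tactic."""
    counts = defaultdict(int)
    for rule in rules:
        for technique in rule["attack"]:
            counts[TACTIC[technique]] += 1
    return dict(counts)

print(coverage(rules))
# {'execution': 1, 'command-and-control': 1, 'credential-access': 1, 'persistence': 1}
```

A coverage summary like this is what makes the tags useful beyond labelling: it shows which tactics have no detection rules at all.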

Capability to aggregate data and optimize the event flow


Normalization of data is required for advanced searching and comparison of events from different sources
Capability to correlate multiple events
Capability to detect attacks across multiple attack stages
Capability to create and maintain custom parsers for parsing and normalization needs
Both export of information / commands and import of data from API sources (such as cloud)
Support for secure event transfer and the actual implementation of secure transfer (e.g. regular syslog is not secure)
The SIEM should support event transfer technologies for all possible data sources
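The normalization remark above can be sketched briefly: two sources report the same kind of failed-logon event in different shapes, and a per-source parser maps each into one common schema so the events can be searched and correlated together. The field names and source formats are illustrative assumptions, not any vendor's schema:

```python
# Sketch: normalize differently-shaped auth-failure events from two
# sources into one common schema for cross-source searching.

def parse_windows(raw: dict) -> dict:
    return {
        "source": "windows",
        "event_type": "auth_failure",
        "user": raw["TargetUserName"].lower(),
        "src_ip": raw["IpAddress"],
    }

def parse_linux(raw: str) -> dict:
    # e.g. "Failed password for alice from 10.0.0.5 port 2222 ssh2"
    parts = raw.split()
    return {
        "source": "linux",
        "event_type": "auth_failure",
        "user": parts[3].lower(),
        "src_ip": parts[5],
    }

events = [
    parse_windows({"TargetUserName": "Alice", "IpAddress": "10.0.0.5"}),
    parse_linux("Failed password for alice from 10.0.0.5 port 2222 ssh2"),
]

# After normalization a single query works across both sources.
assert all(e["user"] == "alice" and e["src_ip"] == "10.0.0.5" for e in events)
```

Custom parsers (as mentioned above) fill this role for sources the SIEM does not support out of the box.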

Integration into the asset management process for automated adding of assets to the SIEM for monitoring
Integration of business context (business function, asset classification, etc.)
Integration of identity information into the SIEM for enhanced monitoring of users and groups
Integration of asset management information into the SIEM (asset owner, asset location, etc.)
Integration of vulnerability management information into SIEM assets to determine risk levels for assets
Integration of threat intelligence information (observables / IoCs) into the security monitoring tooling
Integration of the tooling into the threat hunting process to support threat hunting investigations
Integration of the security incident management process to support incident investigation
Integration with the SOAR tooling for automation purposes
Use of standard content packs in the security monitoring solution
Use of custom content (correlation rules, etc.) in the security monitoring solution

Anomalous pattern detection for users


Anomalous pattern detection for applications
Anomalous pattern detection for devices
Anomalous pattern detection for network
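The anomalous-pattern remarks above can be illustrated for the user case: build a per-user baseline of typical logon hours, then flag logons far outside it. A real UEBA product uses far richer features and models; the threshold and data here are illustrative assumptions only:

```python
# Sketch: per-user logon-hour baseline; flag logons more than
# `threshold` standard deviations from the user's historical mean.
from statistics import mean, pstdev

def is_anomalous(history_hours, new_hour, threshold=3.0):
    mu, sigma = mean(history_hours), pstdev(history_hours)
    if sigma == 0:
        return new_hour != mu  # no variance seen: any deviation is anomalous
    return abs(new_hour - mu) / sigma > threshold

office_hours = [9, 9, 10, 8, 9, 10, 9, 8]   # historical logon hours
print(is_anomalous(office_hours, 3))        # True: a 03:00 logon is far off-baseline
print(is_anomalous(office_hours, 9))        # False: within normal pattern
```

The same baseline-then-deviate pattern generalises to applications, devices and network entities.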

Custom reports for SOC customers and SOC analysts


Custom dashboards used by analysts and managers
Graphing capabilities to support analysis
Drilldowns on graphs to quickly 'zoom in' on details of visual anomalies
A central console used by SOC analysts
Searching capabilities that support finding specific information based on characteristics
Technology
1. SIEM / UEBA
2. NDR
3. EDR
4. SOAR

2 NDR Tooling

Maturity
2.1 Accountability
2.1.1 Has functional ownership of the solution been formally assigned?
2.1.2 Has technical ownership of the solution been formally assigned?
2.2 Documentation
2.2.1 Has the solution been technically documented?
2.2.2 Has the solution been functionally documented?
2.3 Personnel & support
2.3.1 Is there dedicated personnel for support?
2.3.2 Is the personnel for support formally trained?
2.3.3 Is the personnel for support certified?
2.3.4 Is there a support contract for the solution?
2.4 Maintenance & configuration
2.4.1 Is the system regularly maintained?
2.4.2 Is remote maintenance on the system managed?
2.4.3 Are maintenance & configuration updates executed through the change management process?
2.4.4 Have you established maintenance windows?
2.4.5 Is maintenance performed using authorised and trusted tooling?
2.5 Availability & Integrity
2.5.1 Is there high availability (HA) in place for the solution?
2.5.2 Is there data backup / replication in place for the solution?
2.5.3 Is there configuration backup / replication in place for the solution?
2.5.4 Is there a Disaster Recovery plan in place for this solution?
2.5.5 Is the Disaster Recovery plan regularly tested?
2.5.6 Is there a separate development / test environment for this solution?
2.6 Access Management
2.6.1 Is access to the solution limited to authorized personnel?
2.6.2 Are access rights regularly reviewed and revoked if required?
2.6.3 Is a break glass procedure in place?

Capability
2.7 Specify which technological capabilities and artefacts are present and implemented:
Technical capabilities
2.7.1 Encrypted traffic analysis
2.7.2 IDS signature matching
2.7.3 Supervised machine learning
2.7.4 Unsupervised machine learning
2.7.5 Traffic blocking
2.7.6 Unauthorised device detection
2.7.7 MITRE ATT&CK® identifier tagging
2.7.8 Deep packet inspection
2.7.9 Correlation

Data ingestion and processing


2.7.10 Full packet capture
2.7.11 Flow data ingestion

Monitoring capabilities
2.7.12 Monitoring north - south network traffic
2.7.13 Monitoring east - west network traffic
2.7.14 Monitoring classified network segments
2.7.15 Monitoring cloud environments
2.7.16 Monitoring ICS/SCADA networks
2.7.17 Monitoring DNS traffic

Integrations (technical & process)


2.7.18 Business context integration
2.7.19 Identity context integration
2.7.20 Threat Intelligence integration
2.7.21 Threat hunting integration
2.7.22 Security incident management integration
2.7.23 SIEM integration
2.7.24 Malware sandbox integration
Rule-based detection
2.7.25 Standard detection rules
2.7.26 Custom detection rules

Anomaly detection
2.7.27 Traffic baselining
2.7.28 Pattern analysis

Visualisation and output


2.7.29 Reporting
2.7.30 Dashboards
2.7.31 Data visualization techniques
2.7.32 Data drilldowns
2.7.33 Central analysis console
2.7.34 Advanced searching and querying
Completeness (%)

Comments and/or Remarks


2.8 Specify rationale for chosen values or any additional comments
Answer



0

Maturity 2.1.1
2.1.2
2.2.1
2.2.2
2.3.1
2.3.2
2.3.3
2.3.4
2.4.1
2.4.2
2.4.3
2.4.4
2.4.5
2.5.1
2.5.2
2.5.3
2.5.4
2.5.5
2.5.6
2.6.1
2.6.2
2.6.3

Capability 2.7.1
2.7.2
2.7.3
2.7.4
2.7.5
2.7.6
2.7.7
2.7.8
2.7.9
2.7.10
2.7.11
2.7.12
2.7.13
2.7.14
2.7.15
2.7.16
2.7.17
2.7.18
2.7.19
2.7.20
2.7.21
2.7.22
2.7.23
2.7.24
2.7.25
2.7.26
2.7.27
2.7.28
2.7.29
2.7.30
2.7.31
2.7.32
2.7.33
2.7.34
Guidance
Remarks

Functional ownership includes functional accountability


Technical ownership includes technical accountability

A technical description of the NDR system components and configuration


A description of the NDR functional configuration (rules, alerts, thresholds, etc.)

Dedicated personnel should be in place to ensure that support is always available. Can also be staff with outsourced providers
Training helps to jump start new hires, and to learn a proper way of working with the tool
Certification demonstrates ability to handle the tooling properly
A support contract may cover on-site support, support availability, response times, escalation and full access to resources

Systems should be regularly maintained to keep up with the latest features and bug fixes
Remote maintenance by a third party may be part of system maintenance procedures
Maintenance and configuration updates should follow the formal change management process
Setting maintenance windows helps to structure the maintenance process and make it more predictable
Performing maintenance with authorised and trusted tooling helps to ensure security and integrity of the system

Can be fully implemented HA, partially implemented, hot spare, etc.


Data may include logs and PCAP files
Configuration synchronization could be part of a HA setup
A DR plan is required to restore service in case of catastrophic events
Testing the DR plan is required to ensure that it is complete and functional, and that tasks and responsibilities for involved personnel are understood
A separate test environment allows for testing of new configurations before deployment in production

The NDR system will contain confidential information and possibly information that impacts employee privacy
Revocation is part of normal employee termination. Special emergency revocation should be in place for suspected misuse
A break glass procedure and account is required to gain access to the tooling even in case of major IT outages

Helps to identify potentially malicious traffic in encrypted communications, using fingerprinting or certificate analysis
The ability to use IDS signatures (e.g. YARA) in network monitoring
Machine learning trained on a predefined data set with known good and bad traffic
Machine learning trained without a predefined data set
In-line appliances can block malicious traffic as part of their response capability
Detection of unauthorised devices accessing the network
Add MITRE ATT&CK® tags to rules / analytics for mapping and hunting purposes
Detailed inspection of data sent across the network
Correlation of anomalies with previously detected anomalies or detection rules
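The signature-matching remark above can be reduced to its core idea: each signature is a byte pattern plus a label, and any payload containing the pattern is flagged. Real NDR/IDS engines (Suricata rules, YARA) are far richer; the patterns below are illustrative only:

```python
# Toy sketch of signature matching on network payloads.
SIGNATURES = [
    (b"/etc/passwd", "possible path traversal"),
    (b"\x4d\x5a\x90\x00", "PE executable header in traffic"),
]

def match(payload: bytes):
    """Return labels of all signatures found in the payload."""
    return [label for pattern, label in SIGNATURES if pattern in payload]

print(match(b"GET /../../etc/passwd HTTP/1.1"))
# ['possible path traversal']
```

Deep packet inspection is what makes the payload available for this kind of matching in the first place.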

Full packet capture supports network forensic capabilities


Flow data (such as NetFlow, IPFIX, etc.) ingestion

Monitoring capabilities for traffic crossing the perimeter


Monitoring capabilities for traffic traversing the internal network
Monitoring capabilities for classified network segments
Monitoring capabilities for cloud connections / environments
Monitoring capabilities for ICS/SCADA networks
Monitoring capabilities for DNS queries

Integration of business context (business function, asset classification, etc.)


Integration of identity information into the NDR for enhanced monitoring of users and groups
Integration of threat intelligence information (observables / IoCs) into the NDR tooling for reputation-based monitoring
Integration of the tooling into the threat hunting process to support threat hunting investigations
Integration of the security incident management process to support incident investigation
Integration with the SIEM tooling for centralised correlation
Detonate potential malware samples in a sandbox environment
Use of standard content packs in the security monitoring solution
Use of custom content (correlation rules, etc.) in the security monitoring solution

Creation of network baselines for anomaly detection purposes


Detection of anomalous patterns, for example in (encrypted) sessions, using machine learning
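The traffic-baselining remark above can be sketched as follows: keep a moving average of outbound bytes per host and flag a host that suddenly exceeds a multiple of its own baseline (e.g. possible exfiltration). The smoothing factor, threshold and traffic figures are illustrative assumptions:

```python
# Sketch: exponentially weighted per-host traffic baseline with a
# simple spike alert for anomaly detection.

def update_baseline(baseline, host, observed, alpha=0.2, factor=10):
    """Update the host's baseline; return (baseline, alert?)."""
    prev = baseline.get(host)
    if prev is None:
        baseline[host] = observed           # first observation seeds the baseline
        return baseline, False
    alert = observed > factor * prev        # spike well above baseline
    baseline[host] = (1 - alpha) * prev + alpha * observed
    return baseline, alert

baseline = {}
for volume in [1_000, 1_200, 900, 1_100]:   # normal daily volumes
    baseline, alert = update_baseline(baseline, "10.0.0.7", volume)
baseline, alert = update_baseline(baseline, "10.0.0.7", 50_000)  # sudden spike
print(alert)   # True
```

Pattern analysis in commercial NDR tooling replaces this simple threshold with learned models, but the baseline-and-deviate principle is the same.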

Custom reports for SOC customers and SOC analysts


Custom dashboards used by analysts and managers
Graphing capabilities to support analysis
Drilldowns on graphs to quickly 'zoom in' on details of visual anomalies
A central console used by SOC analysts
Searching capabilities that support finding specific information based on characteristics, especially useful for network threat hunting
Technology
1. SIEM / UEBA
2. NDR
3. EDR
4. SOAR

3 EDR Tooling

Maturity
3.1 Accountability
3.1.1 Has functional ownership of the solution been formally assigned?
3.1.2 Has technical ownership of the solution been formally assigned?
3.2 Documentation
3.2.1 Has the solution been technically documented?
3.2.2 Has the solution been functionally documented?
3.3 Personnel & support
3.3.1 Is there dedicated personnel for support?
3.3.2 Is the personnel for support formally trained?
3.3.3 Is the personnel for support certified?
3.3.4 Is there a support contract for the solution?
3.4 Maintenance & configuration
3.4.1 Is the system regularly maintained?
3.4.2 Is remote maintenance on the system managed?
3.4.3 Are maintenance & configuration updates executed through the change management process?
3.4.4 Have you established maintenance windows?
3.4.5 Is maintenance performed using authorised and trusted tooling?
3.5 Availability & Integrity
3.5.1 Is there high availability (HA) in place for the solution?
3.5.2 Is there data backup / replication in place for the solution?
3.5.3 Is there configuration backup / replication in place for the solution?
3.5.4 Is there a Disaster Recovery plan in place for this solution?
3.5.5 Is the Disaster Recovery plan regularly tested?
3.5.6 Is there a separate development / test environment for this solution?
3.6 Confidentiality
3.6.1 Is access to the solution limited to authorized personnel?
3.6.2 Are access rights regularly reviewed and revoked if required?
3.6.3 Is a break glass procedure in place?

Capability
3.7 Specify which technological capabilities and artefacts are present and implemented:
Technical capabilities
3.7.1 OS support
3.7.2 Mobile device support
3.7.3 Physical, virtual & cloud deployment
3.7.4 Vulnerability patching
3.7.5 Forensic information preservation
3.7.6 Historic data retention
3.7.7 MITRE ATT&CK® identifier tagging
3.7.8 Memory analysis
3.7.9 Correlation

Prevention capabilities
3.7.10 Exploit prevention
3.7.11 Fileless malware protection
3.7.12 Application allowlisting
3.7.13 Ransomware protection
3.7.14 Attack surface reduction

Detection capabilities [1]


3.7.15 Vulnerability detection
3.7.16 Process execution monitoring
3.7.17 File system monitoring
3.7.18 Task & service monitoring
3.7.19 Network connection monitoring
3.7.20 Registry monitoring
3.7.21 User activity monitoring
3.7.22 Configuration monitoring
3.7.23 Air-gapped end-point monitoring
3.7.24 File reputation service
3.7.25 Deception techniques
Remediation capabilities
3.7.26 URL filtering / blocking
3.7.27 Web content filtering
3.7.28 Machine isolation
3.7.29 Process termination / suspension
3.7.30 File / registry key deletion
3.7.31 Forced user logoff

Integrations
3.7.32 Threat Intelligence integration
3.7.33 Vulnerability intelligence integration
3.7.34 Threat hunting integration - TTPs
3.7.35 Threat hunting integration - Tools & artifacts
3.7.36 Threat hunting integration - Technical indicators
3.7.37 Security incident management integration
3.7.38 SIEM integration
3.7.39 Malware sandbox integration

Rule-based detection
3.7.40 Online signature-based detection
3.7.41 Offline signature-based detection
3.7.42 Custom rules

Anomaly detection
3.7.43 Behavioural detection

Visualisation & output


3.7.44 Reporting
3.7.45 Dashboards
3.7.46 Data visualization techniques
3.7.47 Data drilldowns
3.7.48 Central analysis console
3.7.49 Advanced searching and querying
Completeness (%)

Comments and/or Remarks


3.8 Specify rationale for chosen values or any additional comments

[1] see:
https://github.com/tsale/EDR-Telemetry
Answer



71
Maturity 3.1.1
3.1.2
3.2.1
3.2.2
3.3.1
3.3.2
3.3.3
3.3.4
3.4.1
3.4.2
3.4.3
3.4.4
3.4.5
3.5.1
3.5.2
3.5.3
3.5.4
3.5.5
3.5.6
3.6.1
3.6.2
3.6.3

Capability 3.7.1
3.7.2
3.7.3
3.7.4
3.7.5
3.7.6
3.7.7
3.7.8
3.7.9
3.7.10
3.7.11
3.7.12
3.7.13
3.7.14
3.7.15
3.7.16
3.7.17
3.7.18
3.7.19
3.7.20
3.7.21
3.7.22
3.7.23
3.7.24
3.7.25
3.7.26
3.7.27
3.7.28
3.7.29
3.7.30
3.7.31
3.7.32
3.7.33
3.7.34
3.7.35
3.7.36
3.7.37
3.7.38
3.7.39
3.7.40
3.7.41
3.7.42
3.7.43
3.7.44
3.7.45
3.7.46
3.7.47
3.7.48
3.7.49
3.7.50
Guidance

Functional ownership fully described and assigned, not approved

Technical ownership fully described and assigned, not approved

Single document, full technical description of analytics system

Single document, full functional description of analytics system

Sufficient dedicated personnel available, not documented

All personnel formally trained

All personnel formally certified

Support contract in place, covering all SOC requirements

System maintenance done structurally, following procedures

Remote maintenance controlled, in a documented process

All maintenance executed through changes, with formal approval

Established, formally approved & aligned with change management

Authorized & updated tooling used, regularly evaluated

Manual actions required for achieving redundancy

Weekly backup routine in place

Daily backup routine in place

Basic DR plan in place

DR plan tested, but not formally

Separate test environment in place, not used structurally

Granular access rights implemented, not monitored


Access right review documented, but not executed structurally

Break glass account and defined procedure in place, formally approved

Averagely implemented, partially documented

Averagely implemented, partially documented

Averagely implemented, partially documented

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Fully implemented, documented, approved, actively improved

Averagely implemented, partially documented

Fully implemented, documented, approved, actively improved

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Partially implemented, incomplete

Averagely implemented, partially documented

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Averagely implemented, partially documented

Averagely implemented, partially documented

Averagely implemented, partially documented


Averagely implemented, partially documented

Fully implemented, documented, approved, actively improved

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Fully implemented, documented, approved, actively improved

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Partially implemented, incomplete

Averagely implemented, partially documented

Averagely implemented, partially documented

Averagely implemented, partially documented

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Averagely implemented, partially documented

Averagely implemented, partially documented

Mostly implemented, documented and approved

Averagely implemented, partially documented

Fully implemented, documented, approved, actively improved

Mostly implemented, documented and approved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Mostly implemented, documented and approved


Remarks

Functional ownership includes functional accountability


Technical ownership includes technical accountability

A technical description of the security analytics system components and configuration


A description of the EDR functional configuration (rules, alerts, thresholds, etc.)

Dedicated personnel should be in place to ensure that support is always available. Can also be staff with outsourced providers
Training helps to jump start new hires, and to learn a proper way of working with the tool
Certification demonstrates ability to handle the tooling properly
A support contract may cover on-site support, support availability, response times, escalation and full access to resources

Systems should be regularly maintained to keep up with the latest features and bug fixes
Remote maintenance by a third party may be part of system maintenance procedures
Maintenance and configuration updates should follow the formal change management process
Setting maintenance windows helps to structure the maintenance process and make it more predictable
Performing maintenance with authorised and trusted tooling helps to ensure security and integrity of the system

Can be fully implemented HA, partially implemented, hot spare, etc. May not be applicable
May not be applicable to all EDR solutions
Configuration synchronization could be part of a HA setup
A DR plan is required to restore service in case of catastrophic events
Testing the DR plan is required to ensure that it is complete and functional, and that tasks and responsibilities for involved personnel are understood
A separate test environment allows for testing of new configurations before deployment in production

The analytics system will contain confidential information and information that possibly impacts employee privacy
Revocation is part of normal employee termination. Special emergency revocation should be in place for suspected misuse
A break glass procedure and account is required to gain access to the tooling even in case of major IT outages

Support for common OS's (e.g. Windows, Linux, OSX, etc.)


Support for mobile devices (e.g. Android, IOS)
Deployment across physical and virtual devices and cloud-based workloads increases coverage and visibility
Remote patching of uncovered vulnerabilities. May be fully automated
Forensic information preservation is a capability that supports security incident management and forensic investigation
Historic data retention is especially important in threat hunting investigations. Multiple months of data should be retained
Add MITRE ATT&CK® tags to rules / analytics for mapping and hunting purposes
Analysis of end-point memory
Correlation of events with previously detected events

Prevention of exploits being executed on the system


Protection against malware running only in memory
Prevention of running unauthorised applications by means of allow listing
Protection against ransomware, for example with the use of controlled access to sensitive files and folders
Attack surface reduction can be used to decrease the likelihood of compromise

Detection of vulnerabilities on the local end-point


Detection of execution of potentially malicious processes and scripts
Detection of file creation, deletion and or modification (including encryption)
Detection of creation or modification of services and scheduled tasks
Detection of potentially malicious network traffic
Detection of modifications to the system registry
Detection of user activity, including logon and logoff
Detection of unauthorised changes to the local end-point configuration, including modifications to registry, files, services, etc.
Monitoring capabilities for end-points not connected to the corporate network
Detection of potentially malicious files using an (online) file reputation service
Detection of adversaries by employing deception techniques (such as honey tokens) on end-points
Block traffic to specific URL
Block specific web content
Isolate a machine from the network
Terminate or suspend a running process
Delete local files or registry key (in case of Windows)
Force a user to log off to terminate any user-associated processes

Integration of threat intelligence information (observables / IoCs) into the EDR tooling for reputation-based monitoring
Integration of vulnerability intelligence information into the EDR for vulnerability monitoring purposes
Integration of the tooling into the threat hunting process to support threat hunting investigations on the TTP level
Integration of the tooling into the threat hunting process to support threat hunting investigations on the Tools & artifacts level
Integration of the tooling into the threat hunting process to support threat hunting investigations on the indicator level (IP addresses, domains, hashes, etc.)
Integration of the security incident management process to support incident investigation
Integration with the SIEM tooling for centralised correlation
Detonate potential malware samples in a sandbox environment

Signature-based malware detection requiring an active internet connection


Signature-based malware detection not requiring an active internet connection
Creation of custom rules for the detection of specific Indicators of Attack
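The custom-rule remark above (3.7.42) can be illustrated with a classic Indicator-of-Attack pattern: an Office process spawning a shell. The process names and event shape are assumptions for the sketch, not any vendor's rule syntax:

```python
# Sketch: a custom Indicator-of-Attack rule flagging suspicious
# parent/child process relationships on an end-point.
SUSPICIOUS_PARENTS = {"winword.exe", "excel.exe", "outlook.exe"}
SHELLS = {"powershell.exe", "cmd.exe", "wscript.exe"}

def matches_ioa(event: dict) -> bool:
    """True if an Office process spawns a shell interpreter."""
    return (event["parent"].lower() in SUSPICIOUS_PARENTS
            and event["child"].lower() in SHELLS)

print(matches_ioa({"parent": "WINWORD.EXE", "child": "powershell.exe"}))  # True
print(matches_ioa({"parent": "explorer.exe", "child": "cmd.exe"}))        # False
```

Rules like this target attacker behaviour rather than specific file hashes, which is what distinguishes IoA detection from signature-based detection.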

Detection of potentially malicious behaviour using machine learning technologies

Custom reports for SOC customers and SOC analysts


Custom dashboards used by analysts and managers
Graphing capabilities to support analysis
Drilldowns on graphs to quickly 'zoom in' on details of visual anomalies
A central console used by SOC analysts
Searching capabilities that support finding specific information based on characteristics, especially useful for end-point threat hunting
Technology
1. SIEM / UEBA
2. NDR
3. EDR
4. SOAR

4 Automation & Orchestration Tooling

Maturity
4.1 Accountability
4.1.1 Has functional ownership of the solution been formally assigned?
4.1.2 Has technical ownership of the solution been formally assigned?
4.2 Documentation
4.2.1 Has the solution been technically documented?
4.2.2 Has the solution been functionally documented?
4.3 Personnel & support
4.3.1 Is there dedicated personnel for support?
4.3.2 Is the personnel for support formally trained?
4.3.3 Is the personnel for support certified?
4.3.4 Is there a support contract for the solution?
4.4 Maintenance & configuration
4.4.1 Is the system regularly maintained?
4.4.2 Is remote maintenance on the system managed?
4.4.3 Are maintenance & configuration updates executed through the change management process?
4.4.4 Have you established maintenance windows?
4.4.5 Is maintenance performed using authorised and trusted tooling?
4.5 Availability & Integrity
4.5.1 Is there high availability (HA) in place for the solution?
4.5.2 Is there data backup / replication in place for the solution?
4.5.3 Is there configuration backup / replication in place for the solution?
4.5.4 Is there a Disaster Recovery plan in place for this solution?
4.5.5 Is the Disaster Recovery plan regularly tested?
4.5.6 Is there a separate development / test environment for this solution?
4.6 Confidentiality
4.6.1 Is access to the solution limited to authorized personnel?
4.6.2 Are access rights regularly reviewed and revoked if required?
4.6.3 Is a break glass procedure in place?

Capability
4.7 Specify which technological capabilities and artefacts are present and implemented:
Technical capabilities
4.7.1 Historical event matching
4.7.2 Risk-based event prioritization
4.7.3 Ticket workflow support

Data integrations
4.7.4 SIEM data integration
4.7.5 Threat intelligence integration
4.7.6 Asset context integration
4.7.7 Identity context integration
4.7.8 Vulnerability management integration

Response integrations
4.7.9 Knowledge base integration
4.7.10 Firewall integration
4.7.11 NDR integration
4.7.12 EDR integration
4.7.13 Email protection integration
4.7.14 Malware protection integration
4.7.15 Sandbox integration
4.7.16 Active Directory / IAM integration
4.7.17 SIEM integration

Playbooks [1]
4.7.18 Standard playbooks
4.7.19 Customised playbooks
4.7.20 Playbook automation
4.7.21 Playbook development process

Visualisation and output


4.7.22 Reporting
4.7.23 Dashboards
4.7.24 Performance tracking
Completeness (%)

Comments and/or Remarks


4.8 Specify rationale for chosen values or any additional comments

[1] the SOAR maturity model is a helpful resource for understanding levels of playbook application:
https://chronicle.security/blog/posts/SOAR-adoption-maturity-model/
Answer



71

4.1.1
4.1.2
4.2.1
4.2.2
4.3.1
4.3.2
4.3.3
4.3.4
4.4.1
4.4.2
4.4.3
4.4.4
4.4.5
4.5.1
4.5.2
4.5.3
4.5.4
4.5.5
4.5.6
4.6.1
4.6.2
4.6.3

4.7.1
4.7.2
4.7.3
4.7.4
4.7.5
4.7.6
4.7.7
4.7.8
4.7.9
4.7.10
4.7.11
4.7.12
4.7.13
4.7.14
4.7.15
4.7.16
4.7.17
4.7.18
4.7.19
4.7.20
4.7.21
4.7.22
4.7.23
4.7.24
Guidance

Functional ownership fully described and assigned, not approved

Technical ownership fully described and assigned, not approved

Single document, full technical description of the automation & orchestration system

Single document, full functional description of the automation & orchestration system

Sufficient dedicated personnel available, not documented

Individual training, not part of the training program

All personnel formally certified

Support contract in place, covering all SOC requirements

System maintenance done structurally, not following procedures

Remote maintenance controlled, not documented

All major maintenance executed through change management

Established, formally approved & aligned with change management

Authorized & updated tooling used, regularly evaluated

Fully automated HA in place, not aligned with business continuity plans

Daily backup routine in place

Daily backup routine in place

Full DR plan in place, not approved by business continuity stakeholders

DR plan regularly and fully tested, results not formally published

Separate test environment with formal procedures in place

Granular access rights implemented, monitored and subjected to audit


Access rights reviewed periodically and structurally

Break glass account and defined procedure in place, formally approved

Mostly implemented, documented and approved


Mostly implemented, documented and approved
Fully implemented, documented, approved, actively improved

Mostly implemented, documented and approved


Mostly implemented, documented and approved
Mostly implemented, documented and approved
Mostly implemented, documented and approved
Averagely implemented, partially documented

Mostly implemented, documented and approved


Mostly implemented, documented and approved

Mostly implemented, documented and approved

Partially implemented, incomplete

Averagely implemented, partially documented

Averagely implemented, partially documented

Mostly implemented, documented and approved

Averagely implemented, partially documented

Averagely implemented, partially documented

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Automation used as decision support for remediation activities

Mostly implemented, documented and approved

Fully implemented, documented, approved, actively improved


Mostly implemented, documented and approved

Fully implemented, documented, approved, actively improved


Remarks

Functional ownership includes functional accountability


Technical ownership includes technical accountability

A technical description of the automation & orchestration system components and configuration
A description of the automation & orchestration system functional configuration (workflows, integrations, etc.)

Dedicated personnel should be in place to ensure that support is always available. Can also be staff with outsourced providers
Training helps to jump start new hires, and to learn a proper way of working with the tool
Certification demonstrates ability to handle the tooling properly
A support contract may cover on-site support, support availability, response times, escalation and full access to resources

Systems should be regularly maintained to keep up with the latest features and bug fixes
Remote maintenance by a third party may be part of system maintenance procedures
Maintenance and configuration updates should follow the formal change management process
Setting maintenance windows helps to structure the maintenance process and make it more predictable
Performing maintenance with authorised and trusted tooling helps to ensure security and integrity of the system

Can be fully implemented HA, partially implemented, hot spare, etc.


May not be required for this particular solution
Configuration synchronization could be part of a HA setup
A DR plan is required to restore service in case of catastrophic events
Testing the DR plan is required to ensure that it is complete and functional, and that tasks and responsibilities are clear to the personnel involved
A separate test environment allows for testing of new configurations before deployment in production

The automation system may have automated actions that can impact the usage of systems and should be restricted
Revocation is part of normal employee termination. Special emergency revocation should be in place for suspected misuse
A break glass procedure and account is required to gain access to the tooling even in case of major IT outages

Contextualize potential incidents using similar historical events


Risk-based prioritization of security events using contextualized information
Automated ticket creation and workflow support

The automation & orchestration tool receives events from the SIEM system
Contextualize potential incidents using threat intelligence
Contextualize potential incidents using asset information
Contextualize potential incidents using user information
Contextualize potential incidents using vulnerability management information

Automatically update the knowledge base using event information


Automated remediation by blocking attackers on the firewall
Automated remediation by blocking attackers in the network
Automated remediation by blocking attackers on the end-point
Automated remediation by blocking email senders
Automated remediation by quarantining malware and scanning end-points for malware threats
Automated delivery of malware samples to sandbox environments for extensive analysis
Automated locking and suspension of user accounts or revocation of access rights based on event outcome
Querying SIEM to obtain information for ticket context or automated triage

Application of standard (out of the box) response playbooks in the SOAR


Application of customized response playbooks in the SOAR
Application of automation in playbooks
A development, refinement and life cycle management process for SOC playbooks for usage in SOAR

Custom reports for SOC customers and SOC analysts


Custom dashboards used by analysts and managers
Application of KPIs and metrics to ticket workflow
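The enrichment and prioritization remarks above (threat intelligence, asset, user and vulnerability context feeding risk-based prioritization of security events) can be sketched as a small scoring routine. All field names, weights and priority thresholds below are hypothetical illustrations, not part of the SOC-CMM model or of any specific SOAR product:

```python
# Illustrative risk-based prioritization of a security event using
# contextual enrichment. Field names and weights are hypothetical.

def prioritize_event(event, asset_db, intel_db, vuln_db):
    """Return (score, priority_label) for an enriched event."""
    score = event.get("base_severity", 1)        # 1 (low) .. 5 (critical)

    asset = asset_db.get(event["host"], {})
    score += asset.get("criticality", 0)         # 0 .. 3 from asset management

    if event.get("indicator") in intel_db:       # threat intelligence match
        score += 3

    # Open vulnerabilities on the affected host, capped at 2 points
    score += min(len(vuln_db.get(event["host"], [])), 2)

    if score >= 8:
        return score, "P1"
    if score >= 5:
        return score, "P2"
    return score, "P3"
```

A SOAR workflow would typically run such a step after the contextualization actions listed above and before automated ticket creation.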
Services
1. Security Monitoring 5. Threat Hunting
2. Security Incident Management 6. Vulnerability Management
3. Security Analysis & Forensics 7. Log Management
4. Threat Intelligence

1 Security Monitoring

Maturity
1.1 Have you formally described the security monitoring service?
1.2 Please specify elements of the security monitoring service document:
1.2.1 Key performance indicators
1.2.2 Quality indicators
1.2.3 Service dependencies
1.2.4 Service levels
1.2.5 Hours of operation
1.2.6 Service customers and stakeholders
1.2.7 Purpose
1.2.8 Service input / triggers
1.2.9 Service output / deliverables
1.2.10 Service activities
1.2.11 Service roles & responsibilities
Completeness
1.3 Is the service measured for quality?
1.4 Is the service measured for service delivery in accordance with service levels?
1.5 Are customers and/or stakeholders regularly updated about the service?
1.6 Is there a contractual agreement between the SOC and the customers?
1.7 Is sufficient personnel allocated to the process to ensure required service delivery?
1.8 Is the service aligned with other relevant processes?
1.9 Is there an incident resolution / service continuity process in place for this service?
1.10 Has a set of procedures been created for this service?
1.11 Is there an onboarding and offloading procedure for this service?
1.12 Are best practices applied to the service?
1.13 Are use cases used in the security monitoring service?
1.14 Is process data gathered for prediction of service performance?
1.15 Is the service continuously being improved based on improvement goals?

Capability
1.16 Please specify capabilities of the security monitoring service:
1.16.1 Early detection
1.16.2 Intrusion detection
1.16.3 Exfiltration detection
1.16.4 Subtle event detection
1.16.5 Malware detection
1.16.6 Anomaly detection
1.16.7 Real-time detection
1.16.8 Alerting & notification
1.16.9 False-positive reduction
1.16.10 Continuous tuning
1.16.11 Coverage management
1.16.12 Status monitoring
1.16.13 Perimeter monitoring
1.16.14 Host monitoring
1.16.15 Network & traffic monitoring
1.16.16 Access & usage monitoring
1.16.17 User / identity monitoring
1.16.18 Application & service monitoring
1.16.19 Behavior monitoring
1.16.20 Database monitoring
1.16.21 Data loss monitoring
1.16.22 Device loss / theft monitoring
1.16.23 Third-party monitoring
1.16.24 Physical environment monitoring
1.16.25 Cloud monitoring
1.16.26 Mobile device monitoring
1.16.27 OT monitoring
Completeness (%)

Comments and/or Remarks


1.17 Specify rationale for chosen values or any additional comments
Management

Answer

Averagely complete


84

Maturity 1.1
1.3
1.4
1.5
1.6
1.7
1.8
1.9
1.10
1.11
1.12
1.13
1.14
1.15

Capability 1.16.1
1.16.2
1.16.3
1.16.4
1.16.5
1.16.6
1.16.7
1.16.8
1.16.9
1.16.10
1.16.11
1.16.12
1.16.13
1.16.14
1.16.15
1.16.16
1.16.17
1.16.18
1.16.19
1.16.20
1.16.21
1.16.22
1.16.23
1.16.24
1.16.25
1.16.26
Guidance

Single document, full description of service

Metrics formalized and used in regular reports

SLA compliance reported periodically, not discussed with customers

Periodical updates sent to all customers/stakeholders

Contract signed, but not regularly reviewed

Sufficient dedicated personnel available, not fully trained and capable

Alignment done regularly, but not in a structured fashion

Basic service continuity process in place

Procedures in place, operational and used structurally

Procedures in place, operational and used structurally

Best practices applied to service architecture and service delivery

Use cases embedded in the security monitoring processes

Goals set for service performance, measured structurally and formally


Goals formally defined and pursued structurally and periodically

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Averagely implemented, partially documented

Fully implemented, documented, approved, actively improved

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Mostly implemented, documented and approved

Use cases, playbooks and procedures defined and implemented

Use cases, playbooks and procedures defined and implemented

Fully implemented, performance measured and improved

Fully implemented, performance measured and improved

Fully implemented, performance measured and improved

Fully implemented, performance measured and improved

Fully implemented, performance measured and improved

Use cases, playbooks and procedures defined and implemented

Fully implemented, performance measured and improved

Fully implemented, performance measured and improved

Use cases, playbooks and procedures defined and implemented

Specific use cases defined and operationalised

Fully implemented, performance measured and improved

Fully implemented, performance measured and improved

Fully implemented, performance measured and improved

Fully implemented, performance measured and improved


Remarks

A service description should be in place

Indicators to establish the performance of the service


Indicators to establish the quality of service delivery
A clear understanding of which people / process / technologies are required for adequate service delivery
Agreements on minimum performance, capacity, availability, etc.
The operational hours for this service (e.g. 24/7)
The customers and stakeholders for this service (e.g. IT management)
The purpose and objectives for this service
The service input: what triggers this service to run?
The service output: what does the service deliver? Can be tangible (e.g. reports) or intangible (e.g. situational awareness)
Which activities are carried out within the scope of the service?
Which roles and responsibilities apply to this service?
Use this outcome to determine the score for 1.2
Are the quality indicators from the previous questions used for reporting on the service?
Service levels should be used to formally commit the SOC to service delivery
Changes to the service scope, delivery, etc.
Contractual agreements should also cover penalties
Allocation of dedicated personnel will ensure highest service quality
e.g. alignment with configuration management, incident management, etc.
Service continuity is important to comply with contractual agreements, even in case of major incidents
Procedures support process standardization and quality. Personnel should be trained to use procedures correctly and structurally
Customer onboarding and offloading procedures support efficient service delivery and ensure customers are (dis)connected correctly
Best practices should be used to optimize this service
e.g. user login brute-force, denial of service, non-compliance, etc.
Service performance measurement requires establishment of performance goals
Improvement based on evaluation, (maturity) assessment, tests, etc.

Capability to detect incidents in an early stage


Capability to detect intrusion attempts
Capability to detect information leaving the organization
Capability to detect slight changes in systems, applications or network that may indicate malicious behavior
Capability to detect malware in the infrastructure
Capability to detect anomalies
Can also be near real-time (e.g. 15 minutes delay)
Capability to automatically send alerts for all security monitoring components
A process for reducing the number of false-positives
A continuous tuning process for the correlation rules
Coverage indicates how well the service covers the assets in your environment
Monitoring of the status of the system
Monitoring of the network perimeter for attempted intrusions and exfiltration
Monitoring of endpoints in the networks (servers, clients, etc.)
Monitoring of network and traffic flows and anomalies
Monitoring of access attempts
Monitoring of user actions
Monitoring of applications & services
Monitoring of behavior against baselines (can be host, network and user behavior)
Monitoring of databases
Monitoring for loss of information
Monitoring for loss or theft of company assets
Monitoring of trusted third-parties to detect possible breach attempts through the supply chain
Monitoring of the physical environment to detect cyber security incidents
Monitoring of private and public cloud environments: SaaS, IaaS and PaaS
Monitoring of corporate owned mobile devices or mobile devices containing corporate information
Monitoring of Operational Technology environments, including ICS, SCADA, DCS and PLC systems
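Several of the maturity questions above (quality indicators, service levels, process data gathered for prediction of service performance) commonly build on ticket timestamps. A minimal sketch of computing mean time to detect (MTTD) and mean time to respond (MTTR); the ticket field names are assumptions for illustration only:

```python
from datetime import datetime
from statistics import mean

FMT = "%Y-%m-%dT%H:%M:%S"  # assumed timestamp format in the ticket system

def _minutes(later, earlier):
    """Difference between two timestamps, in minutes."""
    delta = datetime.strptime(later, FMT) - datetime.strptime(earlier, FMT)
    return delta.total_seconds() / 60

def mttd_mttr(tickets):
    """Mean time to detect / respond over a list of tickets (illustrative fields)."""
    mttd = mean(_minutes(t["detected"], t["occurred"]) for t in tickets)
    mttr = mean(_minutes(t["resolved"], t["detected"]) for t in tickets)
    return mttd, mttr
```

Such metrics only become meaningful once performance goals are set, as noted in the remark on service performance measurement.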
Services
1. Security Monitoring 5. Threat Hunting
2. Security Incident Management 6. Vulnerability Management
3. Security Analysis & Forensics 7. Log Management
4. Threat Intelligence

2 Security Incident Management

Maturity
2.1 Have you adopted a maturity assessment methodology for Security Incident Management?
2.1.1 If yes, please specify the methodology
2.1.2 If yes, please specify the maturity level (can have up to 2 digits)
If yes, skip directly to 2.17 (capabilities)
2.2 Have you adopted a standard for the Security Incident Management process?
2.3 Have you formally described the security incident management process?
2.4 Please specify elements of the security incident management document:
2.4.1 Security incident definition
2.4.2 Service levels
2.4.3 Workflow
2.4.4 Decision tree
2.4.5 Hours of operation
2.4.6 Service customers and stakeholders
2.4.7 Purpose
2.4.8 Service input / triggers
2.4.9 Service output / deliverables
2.4.10 Service activities
2.4.11 Service roles & responsibilities
Completeness
2.5 Is the service measured for quality?
2.6 Is the service measured for service delivery in accordance with service levels?
2.7 Are customers and/or stakeholders regularly updated about the service?
2.8 Is there a contractual agreement between the SOC and the customers?
2.9 Is sufficient personnel allocated to the process to ensure required service delivery?
2.10 Is the service aligned with other relevant processes?
2.11 Is the incident response team authorized to perform (invasive) actions when required?
2.12 Is there an onboarding and offloading procedure for this service?
2.13 Are best practices applied to the service?
2.14 Is the service supported by predefined workflows or scenarios?
2.15 Is process data gathered for prediction of service performance?
2.16 Is the service continuously being improved based on improvement goals?

Capability
2.17 Please specify capabilities and artefacts of the security incident management service:
2.17.1 Incident logging procedure
2.17.2 Incident resolution procedure
2.17.3 Incident investigation procedure
2.17.4 Escalation procedure
2.17.5 Evidence collection procedure
2.17.6 Incident containment procedures
2.17.7 IR Training
2.17.8 Table-top exercises
2.17.9 Red team / blue team exercises
2.17.10 RACI matrix
2.17.11 Response authorization
2.17.12 Incident template
2.17.13 Incident tracking system
2.17.14 False-positive reduction
2.17.15 Priority assignment
2.17.16 Severity assignment
2.17.17 Categorization
2.17.18 Critical bridge
2.17.19 War room
2.17.20 Communication plan & email templates
2.17.21 Backup communication technology
2.17.22 Secure communication channels
2.17.23 (dedicated) information sharing platform
2.17.24 Change management integration
2.17.25 Malware extraction & analysis
2.17.26 On-site incident response
2.17.27 Remote incident response
2.17.28 Third-party escalation
2.17.29 Evaluation template
2.17.30 Reporting template
2.17.31 Incident closure
2.17.32 Lessons learned extraction for process improvement
2.17.33 External security incident support agreements
2.17.34 Exercises with other incident response teams
2.17.35 Root Cause Analysis
2.17.36 Restore integrity verification

Completeness (%)

Comments and/or Remarks


2.18 Specify rationale for chosen values or any additional comments

RE&CT framework:
https://atc-project.github.io/atc-react/
Management

Answer


Partially complete


56

Maturity 2.2
2.3
2.5
2.6
2.7
2.8
2.9
2.10
2.11
2.12
2.13
2.14
2.15
2.16

Capability 2.17.1
2.17.2
2.17.3
2.17.4
2.17.5
2.17.6
2.17.7
2.17.8
2.17.9
2.17.10
2.17.11
2.17.12
2.17.13
2.17.14
2.17.15
2.17.16
2.17.17
2.17.18
2.17.19
2.17.20
2.17.21
2.17.22
2.17.23
2.17.24
2.17.25
2.17.26
2.17.27
2.17.28
2.17.29
2.17.30
2.17.31
2.17.32
2.17.33
2.17.34
2.17.35
Guidance

Standard fully adopted, process set up and executed using standard


Single document, full description of service

Metrics defined, applied in an ad-hoc fashion


SLA defined, measured periodically but not reported
Frequent updates sent to most customers/stakeholders
Basic contract in place, not formally signed off
Personnel allocated, but not sufficient for required service delivery
Alignment done in an ad-hoc fashion
Mandate informally given, not supported by all stakeholders
Full procedures in place, operational but not used structurally
Best practices identified, but not applied
Basic workflows in place, not covering all incident types
Goals set for service performance, measured structurally but informally
Goals defined and pursued structurally, but not formalized

Averagely implemented, partially documented


Averagely implemented, partially documented
Mostly implemented, documented and approved
Mostly implemented, documented and approved
Fully implemented, documented, approved, actively improved
Mostly implemented, documented and approved
Averagely implemented, partially documented
Mostly implemented, documented and approved
Averagely implemented, partially documented
Mostly implemented, documented and approved
Mostly implemented, documented and approved
Mostly implemented, documented and approved
Averagely implemented, partially documented
Mostly implemented, documented and approved
Mostly implemented, documented and approved
Mostly implemented, documented and approved
Averagely implemented, partially documented
Fully implemented, documented, approved, actively improved
Mostly implemented, documented and approved
Averagely implemented, partially documented
Partially implemented, incomplete
Partially implemented, incomplete
Averagely implemented, partially documented
Averagely implemented, partially documented
Averagely implemented, partially documented
Partially implemented, incomplete
Not in place
Partially implemented, incomplete
Partially implemented, incomplete
Averagely implemented, partially documented
Averagely implemented, partially documented
Averagely implemented, partially documented
Averagely implemented, partially documented
Averagely implemented, partially documented

Averagely implemented, partially documented


Remarks

e.g., SOC-CMM 4CERT, SIM3, CREST, etc.


Please convert to a 5-point scale if required. For example: 3.6 on a 4-point scale = 4.5 on a 5-point scale
The score in 2.1.2 overrules any maturity scoring in this section
e.g. NIST SP 800-61r2, CERT handbook, etc.
A service description should be in place

A clear and unambiguous definition of a security incident


e.g. response times
The process steps that are part of the security incident management process (e.g. detection, triage, etc.)
Decision tree for escalation and starting of the process
When can the security incident response process be started?
The customers and stakeholders for this service (e.g. IT management)
The purpose and objectives for this service
The service input: what triggers this service to run?
The service output: what does the service deliver? Can be tangible (e.g. reports) or intangible (e.g. situational awareness)
Which activities are carried out within the scope of the service?
Which roles and responsibilities apply to this service?
Use this outcome to determine the score for 2.4
Are the quality indicators from the previous questions used for reporting on the service?
Service levels should be used to formally commit the SOC to service delivery
Changes to the service scope, delivery, etc.
Contractual agreements should also cover penalties
Allocation of dedicated personnel will ensure highest service quality
e.g. alignment with configuration management, incident management, etc.
This is a mandate issue. The team should have a mandate beforehand to optimize incident response times
Customer onboarding and offloading procedures support efficient service delivery and ensure customers are (dis)connected correctly
Best practices should be used to optimize this service
Workflows and scenarios can be used to structure follow-up and determine expected incident progression
Service performance measurement requires establishment of performance goals
Improvement based on evaluation, (maturity) assessment, tests, etc.

Part of preparation procedures


Part of preparation procedures. Likely involves a checklist and a workflow for incident handling
Part of preparation procedures. Includes triage procedure and investigation / analysis procedures
Part of preparation procedures
Part of preparation procedures
Part of preparation procedures. Can be based on RE&CT framework[1]
Preparation exercises to determine service effectiveness
Preparation exercises to determine service effectiveness
Preparation exercises to determine service effectiveness
Matrix with Responsibility, Accountability and Consulted and Informed entities for the process
Authorization from senior management to take any action required for incident mitigation (e.g. disconnect systems)
Templates for security incident management registration
A system that supports the security incident management workflow. If possible dedicated, or supporting ticket confidentiality
A procedure to avoid false-positives in the security incident management process
Assignment of priority to the incident, part of impact and magnitude assessment
Assignment of severity to the incident, part of impact and magnitude assessment
Categorization of the incident. For example, the VERIS framework could be used for classification
A communication bridge for continuous alignment of employees involved in security incident management
A dedicated facility for coordination of security incidents
Standardized plans and templates for communication. Includes reachability in case of emergency and outreach to customers
Backup communication technology in case of failure of primary means. Includes internet access, email systems and phones
Encrypted and secure communications (includes email and phones) that can be used during incident response
A platform for sharing information regarding the security incident
Integration with the change management process for any actions taken in the security incident management process
Extraction and analysis of malware
Localized incident response capability
Remote incident response capability
Escalation process to third parties (vendors, partners, etc.)
A template for post-incident evaluation
A template for reporting on the security incident
Formal closure of the incident, including debriefing sessions
Continuous improvement based on previous experiences
Retainer for incident response as a service in case of major breaches
Exercises with other IR teams, for example outsourcing partners and other teams in the sector
Investigation and reporting on the root cause of the incident. Required for optimizing lessons learned for the organization
Verification that backups and restored assets do not contain IoCs or backdoors used in the initial incident
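Restore integrity verification (2.17.36) means checking that backups and restored assets do not contain known IoCs. A minimal sketch that compares file hashes under a restored directory against a set of known-bad SHA-256 values; the function name and approach are illustrative assumptions, not a prescribed SOC-CMM artefact:

```python
import hashlib
from pathlib import Path

def find_ioc_matches(root, ioc_sha256):
    """Return paths under `root` whose SHA-256 matches a known IoC hash.

    `ioc_sha256` is a set of lowercase hex digests, e.g. exported from
    a threat intelligence platform after the incident.
    """
    matches = []
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            if digest in ioc_sha256:
                matches.append(str(path))
    return sorted(matches)
```

In practice this would be one of several checks (file hashes alone do not catch modified backdoors), but it illustrates how IoCs from the incident feed the restore verification step.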
Services
1. Security Monitoring 5. Threat Hunting
2. Security Incident Management 6. Vulnerability Management
3. Security Analysis & Forensics 7. Log Management
4. Threat Intelligence

3 Security Analysis & Forensics

Maturity
3.1 Have you formally described the security analysis & forensics service?
3.2 Please specify elements of the security analysis service document:
3.2.1 Key performance indicators
3.2.2 Quality indicators
3.2.3 Service dependencies
3.2.4 Service levels
3.2.5 Hours of operation
3.2.6 Service customers and stakeholders
3.2.7 Purpose
3.2.8 Service input / triggers
3.2.9 Service output / deliverables
3.2.10 Service activities
3.2.11 Service roles & responsibilities
Completeness
3.3 Is the service measured for quality?
3.4 Is the service measured for service delivery in accordance with service levels?
3.5 Are customers and/or stakeholders regularly updated about the service?
3.6 Is there a contractual agreement between the SOC and the customers?
3.7 Is sufficient personnel allocated to the process to ensure required service delivery?
3.8 Is the service aligned with other relevant processes?
3.9 Is there an incident resolution / service continuity process in place for this service?
3.10 Has a set of procedures been created for this service?
3.11 Is there an onboarding and offloading procedure for this service?
3.12 Are best practices applied to the service?
3.13 Is the service supported by predefined workflows or scenarios?
3.14 Is process data gathered for prediction of service performance?
3.15 Is the service continuously being improved based on improvement goals?

Capability
3.16 Please specify capabilities and artefacts of the security analysis process:
3.16.1 Event analysis
3.16.2 Event analysis toolkit
3.16.3 Trend analysis
3.16.4 Incident analysis
3.16.5 Visual analysis
3.16.6 Static malware analysis
3.16.7 Dynamic malware analysis
3.16.8 Tradecraft analysis
3.16.9 Historic analysis
3.16.10 Network analysis
3.16.11 Memory analysis
3.16.12 Mobile device analysis
3.16.13 Volatile information collection
3.16.14 Remote evidence collection
3.16.15 Forensic hardware toolkit
3.16.16 Forensic analysis software toolkit
3.16.17 Dedicated analysis workstations
3.16.18 Security analysis & forensics handbook
3.16.19 Security analysis & forensics workflows
3.16.20 Case management system
3.16.21 Report templates
3.16.22 Evidence seizure procedure
3.16.23 Evidence transport procedure
3.16.24 Chain of custody preservation procedure

Completeness (%)

Comments and/or Remarks


3.17 Specify rationale for chosen values or any additional comments
Management

Answer

Fully complete


96

Maturity 3.1
3.3
3.4
3.5
3.6
3.7
3.8
3.9
3.10
3.11
3.12
3.13
3.14
3.15

Capability 3.16.1
3.16.2
3.16.3
3.16.4
3.16.5
3.16.6
3.16.7
3.16.8
3.16.9
3.16.10
3.16.11
3.16.12
3.16.13
3.16.14
3.16.15
3.16.16
3.16.17
3.16.18
3.16.19
3.16.20
3.16.21
3.16.22
3.16.23
3.16.24
Guidance

Document completed, approved and formally published

Formal and approved metrics in place, feedback used for improvement

SLA compliance discussed with customers regularly for improvement

Periodical updates sent to all customers/stakeholders

Contract signed, but not regularly reviewed

Sufficient dedicated personnel available, trained and fully capable

Alignment done structurally & regularly with all relevant processes

Full process in place, formally approved by relevant stakeholders

Procedures in place, formally published and fully operationalized

Procedures in place, operational and used structurally

Best practices applied and adherence checked regularly

Formal workflows created, approved and published for all incident types

Continuous measurement to determine progress & adjust process


Continuous improvement based on targets and feedback loops

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Averagely implemented, partially documented

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Fully implemented, documented, approved, actively improved

Averagely implemented, partially documented


Remarks

A service description should be in place

Indicators to establish the performance of the service


Indicators to establish the quality of service delivery
A clear understanding of which people / process / technologies are required for adequate service delivery
Agreements on minimum performance, capacity, availability, etc.
The operational hours for this service (e.g. 24/7)
The customers and stakeholders for this service (e.g. IT management)
The purpose and objectives for this service
The service input: what triggers this service to run?
The service output: what does the service deliver? Can be tangible (e.g. reports) or intangible (e.g. situational awareness)
Which activities are carried out within the scope of the service?
Which roles and responsibilities apply to this service?
Use this outcome to determine the score for 3.2
Are the quality indicators from the previous questions used for reporting on the service?
Service levels should be used to formally commit the SOC to service delivery
Changes to the service scope, delivery, etc.
Contractual agreements should also cover penalties
Allocation of dedicated personnel will ensure highest service quality
e.g. alignment with configuration management, incident management, etc.
Service continuity is important to comply with contractual agreements, even in case of major incidents
Procedures support process standardization and quality. Personnel should be trained to use procedures correctly and structurally
Customer onboarding and offloading procedures support efficient service delivery and ensure customers are (dis)connected correctly
Best practices should be used to optimize this service
Use cases can be used to guide the analysis workflows
Service performance measurement requires establishment of performance goals
Improvement based on evaluation, (maturity) assessment, tests, etc.

Analysis of detailed events


A combination of internal and external tools that can be used for security event analysis purposes
Analysis of trends in events or incidents
Analysis of security incidents
Visualization tools for data analysis
Reverse engineering and disassembly of malware
Running malware in a controlled environment to determine its characteristics
Analysis of the tradecraft of the attacker. This includes the tools, tactics, techniques and procedures used by attackers
Analysis of historic information based on new insights. APTs can span multiple months or years
Analysis of network traffic patterns and packets
Analysis of end-point memory, for example fileless malware
Capability to perform forensic analysis of mobile devices
Collection of volatile information (such as memory; see RFC3227) requires swift response, as evidence may be lost quickly
Capability to remotely collect evidence (files, disk images, memory dumps, etc. ) from target systems
Hardware toolkits will likely at least consist of write-blockers for disk imaging
Software tools used in forensic analysis
Dedicated workstations loaded with specialized tools should be used to make investigations more efficient
A handbook that describes security analysis workflows, tools, exceptions, known issues, etc.
An established workflow for performing security analysis
A case management system that supports the analyst workflow
Report templates for standardization of investigation reporting
Procedure for seizure of evidence in forensic analysis
Procedure for trusted transport of evidence (e.g. laptops) that preserve the chain of custody
Procedures to correctly process evidence, while preserving the chain of custody
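The evidence seizure, transport and chain-of-custody procedures (3.16.22–3.16.24) all depend on verifiable evidence integrity. A minimal sketch of hashing evidence at seizure and re-verifying it after transport or analysis; the record layout is an assumption for illustration, not a prescribed format:

```python
import hashlib
from datetime import datetime, timezone

def seize(evidence_bytes, handler, note=""):
    """Create a chain-of-custody record for an evidence item (illustrative format)."""
    return {
        "sha256": hashlib.sha256(evidence_bytes).hexdigest(),
        "handler": handler,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "note": note,
    }

def verify(evidence_bytes, record):
    """True if the evidence still matches the hash recorded at seizure."""
    return hashlib.sha256(evidence_bytes).hexdigest() == record["sha256"]
```

Real procedures add signatures and a handover log for every transfer; the hash comparison above is the core integrity check they build on.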
Services
1. Security Monitoring 5. Threat Hunting
2. Security Incident Management 6. Vulnerability Management
3. Security Analysis & Forensics 7. Log Management
4. Threat intelligence

4 Threat Intelligence

Maturity
4.1 Have you formally described the threat intelligence service?
4.2 Please specify elements of the threat intelligence service document:
4.2.1 Key performance indicators
4.2.2 Quality indicators
4.2.3 Service dependencies
4.2.4 Service levels
4.2.5 Hours of operation
4.2.6 Service customers and stakeholders
4.2.7 Purpose
4.2.8 Service input / triggers
4.2.9 Service output / deliverables
4.2.10 Service activities
4.2.11 Service roles & responsibilities
Completeness
4.3 Is the service measured for quality?
4.4 Is the service measured for service delivery in accordance with service levels?
4.5 Are customers and/or stakeholders regularly updated about the service?
4.6 Is there a contractual agreement between the SOC and the customers?
4.7 Is sufficient personnel allocated to the process to ensure required service delivery?
4.8 Is the service aligned with other relevant processes?
4.9 Is there an incident resolution / service continuity process in place for this service?
4.10 Has a set of procedures been created for this service?
4.11 Is there an onboarding and offloading procedure for this service?
4.12 Are best practices applied to the service?
4.13 Is process data gathered for prediction of service performance?
4.14 Is the service continuously being improved based on improvement goals?
Capability
4.15 Please specify capabilities and artefacts of the threat intelligence process:
Collection
4.15.1 Continuous intelligence gathering
4.15.2 Automated intelligence gathering & processing
4.15.3 Centralized collection & distribution
4.15.4 Intelligence collection from open / public sources
4.15.5 Intelligence collection from closed communities
4.15.6 Intelligence collection from intelligence provider
4.15.7 Intelligence collection from business partners
4.15.8 Intelligence collection from mailing lists
4.15.9 Intelligence collection from internal sources
Processing
4.15.10 Structured data analysis
4.15.11 Unstructured data analysis
4.15.12 Past incident analysis
4.15.13 Trend analysis
4.15.14 Automated alerting
4.15.15 Adversary movement tracking
4.15.16 Attacker identification
4.15.17 Threat identification
4.15.18 Threat prediction
4.15.19 TTP extraction
4.15.20 Deduplication
4.15.21 Enrichment
4.15.22 Contextualization
4.15.23 Prioritization
4.15.24 Threat intelligence reporting
4.15.25 Threat landscaping
4.15.26 Forecasting
Dissemination
4.15.27 Sharing within the company
4.15.28 Sharing with the industry
4.15.29 Sharing outside the industry
4.15.30 Sharing in standardized format (e.g. STIX)
Infrastructure Management
4.15.31 Management of the CTI infrastructure (Threat Intelligence Platform)
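Several of the processing capabilities above (deduplication, enrichment, contextualization, prioritization) can be sketched as a small IoC pipeline. This is a minimal illustration, not part of the SOC-CMM itself; all feed names, trust values and the asset database are invented for the example.

```python
# Minimal IoC processing sketch: deduplicate a feed, enrich with internal
# context, and prioritize. Data and field names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Indicator:
    value: str                 # e.g. an IP address or file hash
    source: str                # feed the indicator came from
    source_trust: float        # 0..1, trustworthiness of the source
    sector_relevant: bool = False
    context: dict = field(default_factory=dict)

def deduplicate(indicators):
    """Keep one indicator per value, preferring the most trusted source."""
    best = {}
    for ind in indicators:
        cur = best.get(ind.value)
        if cur is None or ind.source_trust > cur.source_trust:
            best[ind.value] = ind
    return list(best.values())

def enrich(ind, asset_db):
    """Attach internal context (e.g. asset criticality) to an indicator."""
    ind.context["asset_criticality"] = asset_db.get(ind.value, "unknown")
    return ind

def priority(ind):
    """Simple priority: trusted, sector-relevant indicators first."""
    return ind.source_trust + (0.5 if ind.sector_relevant else 0.0)

feed = [
    Indicator("203.0.113.7", "osint", 0.4),
    Indicator("203.0.113.7", "provider", 0.9, sector_relevant=True),
    Indicator("198.51.100.2", "partner", 0.7),
]
asset_db = {"203.0.113.7": "high"}  # hypothetical internal sightings database

unique = deduplicate(feed)
ranked = sorted((enrich(i, asset_db) for i in unique), key=priority, reverse=True)
print([i.value for i in ranked])  # most relevant indicator first
```

A real implementation would normally live in a Threat Intelligence Platform; the point here is only that each capability in the list maps to a concrete processing step.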

Completeness (%)

Comments and/or Remarks


4.16 Specify rationale for chosen values or any additional comments
Answer: Incomplete
Completeness (%): 0
Remarks

A service description should be in place

Indicators to establish the performance of the service


Indicators to establish the quality of service delivery
A clear understanding of which people / process / technologies are required for adequate service delivery
Agreements on minimum performance, capacity, availability, etc.
The operational hours for this service (e.g. 24/7)
The customers and stakeholders for this service (e.g. IT management)
The purpose and objectives for this service
The service input: what triggers this service to run?
The service output: what does the service deliver? Can be tangible (e.g. reports) or intangible (e.g. situational awareness)
Which activities are carried out within the scope of the service?
Which roles and responsibilities apply to this service?
Use this outcome to determine the score for 4.2
Are the quality indicators from the previous questions used for reporting on the service?
Service levels should be used to formally commit the SOC to service delivery
Changes to the service scope, delivery, etc.
Contractual agreements should also cover penalties
Allocation of dedicated personnel will ensure highest service quality
e.g. the security monitoring process, and mainly the security incident management process
Service continuity is important to comply with contractual agreements, even in case of major incidents
Procedures support process standardization and quality. Personnel should be trained to use procedures correctly and structurally
Customer onboarding and offloading procedures support efficient service delivery and ensure customers are (dis)connected correctly
Best practices should be used to optimize this service
Service performance measurement requires establishment of performance goals
Improvement based on evaluation, (maturity) assessment, tests, etc.
A process for continuously gathering relevant intelligence information
An automated system that collects and processes security intelligence information
A central 'hub' for distributing indicators of compromise to other systems for further processing
The use of public sources in the security intelligence process
The use of closed trusted communities in the security intelligence process
The use of intelligence providers as a source for the security intelligence process
The use of business partners as a source for the security intelligence process
The use of mailing lists as a source for the security intelligence process
The use of internal intelligence sources for the security intelligence process

The capability to analyze structured information


The capability to analyze unstructured information
The capability of using past incidents in the threat intelligence process. e.g. connecting new IoCs to past threats
Analyzing trends in the threat intelligence IoCs observed within the company
Automated alerting of sightings of observables
Tracking the movement of attackers to keep track of new tools, tactics, techniques and procedures
Identification of adversaries based on correlating intelligence indicators and incidents
Identification of threats related to attacker groups
Prediction of threats based on the information gathered in the threat intelligence process
Extraction of Tactics, Techniques and Procedures (TTP) from observables within the infrastructure
Deduplication of threat intelligence feeds to avoid duplicate events
Enrichment of information with additional sources for a higher level of confidence
Addition of context to the threat intelligence process. Context can be vulnerability context, asset criticality, etc.
Prioritization of threat intelligence based on trustworthiness of source, sector relevance, geographic relevance, timeliness, etc.
Reporting on threat intelligence findings and activities
Creation of a landscape of current and emerging threats for strategic purposes
Forecasting based on trends and incidents

Sharing of information with relevant parties within the company


Sharing of information with relevant parties within the same industry
Sharing of information with relevant parties outside the industry
Sharing of information in standardized exchange formats, such as STIX
Managing the TIP to optimally support TI efforts
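Sharing in a standardized exchange format (capability 4.15.30) usually means producing STIX objects. The sketch below hand-builds a minimal STIX 2.1 indicator bundle using only the standard library; a production setup would typically use a library such as stix2 or the TIP's own export. The indicator value and name are invented.

```python
# Hedged sketch: a minimal STIX 2.1 bundle with one indicator object.
import json
import uuid
from datetime import datetime, timezone

def stix_indicator(pattern, name):
    """Build a STIX 2.1 indicator dict with the required properties."""
    now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")
    return {
        "type": "indicator",
        "spec_version": "2.1",
        "id": f"indicator--{uuid.uuid4()}",
        "created": now,
        "modified": now,
        "name": name,
        "pattern": pattern,
        "pattern_type": "stix",
        "valid_from": now,
    }

bundle = {
    "type": "bundle",
    "id": f"bundle--{uuid.uuid4()}",
    "objects": [
        stix_indicator("[ipv4-addr:value = '203.0.113.7']",
                       "C2 address observed in phishing campaign"),
    ],
}
print(json.dumps(bundle, indent=2))
```

Serialized like this, the bundle can be exchanged over TAXII or any other transport the sharing community agrees on.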

5 Threat Hunting

Maturity
5.1 Do you use a standardized threat hunting methodology?
5.2 Have you formally described the threat hunting service?
5.3 Please specify elements of the threat hunting service document:
5.3.1 Key performance indicators
5.3.2 Quality indicators
5.3.3 Service dependencies
5.3.4 Service levels
5.3.5 Hours of operation
5.3.6 Service customers and stakeholders
5.3.7 Purpose
5.3.8 Service input / triggers
5.3.9 Service output / deliverables
5.3.10 Service activities
5.3.11 Service roles & responsibilities
Completeness
5.4 Is the service measured for quality?
5.5 Is the service measured for service delivery in accordance with service levels?
5.6 Are customers and/or stakeholders regularly updated about the service?
5.7 Is there a contractual agreement between the SOC and the customers?
5.8 Is sufficient personnel allocated to the process to ensure required service delivery?
5.9 Is the service aligned with other relevant processes?
5.10 Is there an incident resolution / service continuity process in place for this service?
5.11 Has a set of procedures been created for this service?
5.12 Is there an onboarding and offloading procedure for this service?
5.13 Are best practices applied to the service?
5.14 Is process data gathered for prediction of service performance?
5.15 Is the service continuously being improved based on improvement goals?

Capability
5.16 Please specify capabilities and artefacts of the threat hunting process:
5.16.1 Hash value hunting
5.16.2 IP address hunting
5.16.3 Domain name hunting
5.16.4 Network artefact hunting
5.16.5 Host-based artefact hunting
5.16.6 Adversary tools hunting
5.16.7 Adversary TTP hunting
5.16.8 Inbound threat hunting
5.16.9 Outbound threat hunting
5.16.10 Internal threat hunting
5.16.11 Outlier detection
5.16.12 Hunting coverage
5.16.13 Leveraging of existing tooling
5.16.14 Custom hunting scripts and tools
5.16.15 Dedicated hunting platform
5.16.16 Continuous hunting data collection
5.16.17 Historic hunting
5.16.18 Automated hunting
5.16.19 Hunt alerting
5.16.20 Vulnerability information integration
5.16.21 Threat intelligence integration

Completeness (%)

Comments and/or Remarks


5.17 Specify rationale for chosen values or any additional comments

[1] The TaHiTI threat hunting methodology is a methodology for conducting threat hunting investigations created
by the Dutch financial sector and can be obtained from the following location:

https://www.betaalvereniging.nl/en/safety/tahiti/
Answer: Mostly complete
Completeness (%): 12

Guidance (selected answers):
SLA defined, measured periodically but not reported
Periodical updates sent to all customers/stakeholders
Basic contract in place, not formally signed off
All procedures in place, operational but not used structurally
Full procedures in place, operational but not used structurally
Best practices applied, but not structurally
Goals set for service performance, measured structurally but informally
Averagely implemented, partially documented
Partially implemented, incomplete
Averagely implemented, partially documented
Averagely implemented, partially documented
Averagely implemented, partially documented
Averagely implemented, partially documented
Averagely implemented, partially documented
Averagely implemented, partially documented
Averagely implemented, partially documented
Averagely implemented, partially documented
Partially implemented, incomplete

Remarks

Can be an internally developed approach or a publicly available methodology, such as TaHiTI [1]
A service description should be in place

Indicators to establish the performance of the service


Indicators to establish the quality of service delivery
A clear understanding of which people / process / technologies are required for adequate service delivery
Agreements on minimum performance, capacity, availability, etc.
The operational hours for this service (e.g. 24/7)
The customers and stakeholders for this service (e.g. IT management)
The purpose and objectives for this service
The service input: what triggers this service to run?
The service output: what does the service deliver? Can be tangible (e.g. reports) or intangible (e.g. situational awareness)
Which activities are carried out within the scope of the service?
Which roles and responsibilities apply to this service?
Use this outcome to determine the score for 5.3
Are the quality indicators from the previous questions used for reporting on the service?
Service levels should be used to formally commit the SOC to service delivery
Changes to the service scope, delivery, etc.
Contractual agreements should also cover penalties
Allocation of dedicated personnel will ensure highest service quality
e.g. threat intelligence, security monitoring, security incident response
Service continuity is important to comply with contractual agreements, even in case of major incidents
Procedures support process standardization and quality. Personnel should be trained to use procedures correctly and structurally
Customer onboarding and offloading procedures support efficient service delivery and ensure customers are (dis)connected correctly
Best practices should be used to optimize this service
Service performance measurement requires establishment of performance goals
Improvement based on evaluation, (maturity) assessment, tests, etc.

'Trivial', lowest added value as these change swiftly


'Easy', low added value
'Simple', somewhat higher added value
'Annoying', includes network flow, packet capture, proxy logs, active network connections, historic connections, ports and services
'Annoying', includes users, processes, services, drivers, files, registry, hardware, memory, disk activity, network connections
'Challenging', includes dual-use tools
'Tough!', requires detailed knowledge of adversaries and their modus operandi
Hunting for inbound threats such as inbound connections, DNS zone transfers, inbound emails
Hunting for outbound threats such as C&C traffic, outbound emails
Hunting for threats inside the organization. Hunting may focus on lateral movement or anomalous network connections
Using statistical methods to detect outliers, such as least frequency of occurrence
How well does the hunting process cover your environment? All assets & network traffic, or only partially? Scalability is key
Existing tools may include the SIEM system, firewall analysis tools, etc.
Any custom scripts to assist the hunting process. May include scripts to scan end-points for particular artifacts
Dedicated tooling for the hunting process
Continuous collection of information can be used to alert on indicators and to preserve system state
Hunting for indicators that may have been present on end-points in the past. Requires some sort of saved state
Fully automated hunting capability
Automated alerting based on queries performed in the hunting process
Integration of vulnerability information into the hunting process to provide additional context
Integration of threat intelligence information into the hunting process, threat hunts should be driven by threat intelligence
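The outlier-detection remark above (5.16.11) mentions least frequency of occurrence: artefacts seen on very few hosts are hunt candidates. A minimal sketch of that statistical method, with invented EDR-style telemetry:

```python
# Least-frequency-of-occurrence outlier detection: rank processes by the
# number of hosts they appear on and flag the rarest. Event data is invented.
from collections import defaultdict

# (hostname, process_name) observations, e.g. from EDR telemetry
events = [
    ("host1", "svchost.exe"), ("host2", "svchost.exe"), ("host3", "svchost.exe"),
    ("host1", "chrome.exe"),  ("host2", "chrome.exe"),
    ("host3", "mimik.exe"),   # rare process: appears on a single host
]

hosts_per_process = defaultdict(set)
for host, proc in events:
    hosts_per_process[proc].add(host)

# Least frequent first; flag anything seen on only one host
ranked = sorted(hosts_per_process.items(), key=lambda kv: len(kv[1]))
outliers = [proc for proc, hosts in ranked if len(hosts) == 1]
print(outliers)  # ['mimik.exe']
```

In practice the same query runs directly in the SIEM or EDR console (a "rare values" aggregation); the Python version just makes the logic explicit.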

6 Vulnerability Management

Maturity
6.1 Have you formally described the vulnerability management service?
6.2 Please specify elements of the vulnerability management service document:
6.2.1 Key performance indicators
6.2.2 Quality indicators
6.2.3 Service dependencies
6.2.4 Service levels
6.2.5 Hours of operation
6.2.6 Service customers and stakeholders
6.2.7 Purpose
6.2.8 Service input / triggers
6.2.9 Service output / deliverables
6.2.10 Service activities
6.2.11 Service roles & responsibilities
Completeness
6.3 Is the service measured for quality?
6.4 Is the service measured for service delivery in accordance with service levels?
6.5 Are customers and/or stakeholders regularly updated about the service?
6.6 Is there a contractual agreement between the SOC and the customers?
6.7 Is sufficient personnel allocated to the process to ensure required service delivery?
6.8 Is the service aligned with other relevant processes?
6.9 Is there an incident resolution / service continuity process in place for this service?
6.10 Has a set of procedures been created for this service?
6.11 Is there an onboarding and offloading procedure for this service?
6.12 Are best practices applied to the service?
6.13 Is process data gathered for prediction of service performance?
6.14 Is the service continuously being improved based on improvement goals?
Capability
6.15 Please specify capabilities and artefacts of the vulnerability management process:
6.15.1 Network mapping
6.15.2 Vulnerability identification
6.15.3 Risk identification
6.15.4 Risk acceptance
6.15.5 Security baseline scanning
6.15.6 Authenticated scanning
6.15.7 Incident management integration
6.15.8 Asset management integration
6.15.9 Configuration management integration
6.15.10 Patch management integration
6.15.11 Trend identification
6.15.12 Enterprise vulnerability repository
6.15.13 Enterprise application inventory
6.15.14 Vulnerability Management procedures
6.15.15 Scanning policy tuning
6.15.16 Detailed Vulnerability Reporting
6.15.17 Management Reporting
6.15.18 Scheduled scanning
6.15.19 Ad-hoc specific scanning
6.15.20 Vulnerability information gathering & analysis

Completeness (%)

Comments and/or Remarks


6.16 Specify rationale for chosen values or any additional comments
Answer: Partially complete
Guidance (selected answers):
Basic procedures in place, used in an ad-hoc fashion
Full procedures in place, operational but not used structurally
Best practices applied, but not structurally


Remarks

A service description should be in place

Indicators to establish the performance of the service


Indicators to establish the quality of service delivery
A clear understanding of which people / process / technologies are required for adequate service delivery
Agreements on minimum performance, capacity, availability, etc.
The operational hours for this service (e.g. 24/7)
The customers and stakeholders for this service (e.g. IT management)
The purpose and objectives for this service
The service input: what triggers this service to run?
The service output: what does the service deliver? Can be tangible (e.g. reports) or intangible (e.g. situational awareness)
Which activities are carried out within the scope of the service?
Which roles and responsibilities apply to this service?
Use this outcome to determine the score for 6.2
Are the quality indicators from the previous questions used for reporting on the service?
Service levels should be used to formally commit the SOC to service delivery
Changes to the service scope, delivery, etc.
Contractual agreements should also cover penalties
Allocation of dedicated personnel will ensure highest service quality
e.g. the security monitoring process, and mainly the security incident management process
Service continuity is important to comply with contractual agreements, even in case of major incidents
Procedures support process standardization and quality. Personnel should be trained to use procedures correctly and structurally
Customer onboarding and offloading procedures support efficient service delivery and ensure customers are (dis)connected correctly
Best practices should be used to optimize this service. Adoption of a standard like NIST 800-40r4 is part of best practices
Service performance measurement requires establishment of performance goals
Improvement based on evaluation, (maturity) assessment, tests, etc.
The capability to map the entire network
Capability of identification of vulnerabilities on all types of assets: systems, network components, databases, etc.
Identification of the risk associated with each of these vulnerabilities
Vulnerabilities that are not mitigated must be formally accepted and documented as such
Scanning of systems for compliance with a security baseline (e.g. CIS baselines)
Scanning of systems using credentials for higher confidence and additional vulnerabilities
Integration of the vulnerability management process with the incident management process
Integration of the vulnerability management process with the asset management process
Integration of the vulnerability management process with the configuration management process
Integration of the vulnerability management process with the patch management process
Identification of vulnerability trends across the whole population of systems
A repository or database that holds all vulnerability information. Can be used for analysis
An inventory of all applications used in the enterprise and the vulnerability status for each of those applications
Procedures supporting the vulnerability management process
Continuous tuning of the scanning policy to include new threats and vulnerabilities
Detailed reporting of vulnerable assets and mitigation strategies
A management report that contains an overview of the vulnerability status in the organization
A scheduling engine that allows for scanning at predefined times and insight into all available scans
e.g. capability to scan for specific vulnerabilities. May require consent and other processes to be in place
Gathering & analyzing information from internal and external sources, including external researchers, bulletins and other feeds
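Risk identification (6.15.3) typically combines a vulnerability's severity with the criticality of the affected asset. A minimal sketch of that prioritization; the weights and findings below are illustrative assumptions, not a standard scheme:

```python
# Toy risk ranking: CVSS base score weighted by asset criticality.
# Weights and findings are invented for illustration.
CRITICALITY_WEIGHT = {"low": 0.5, "medium": 1.0, "high": 1.5}

findings = [
    {"asset": "db01",  "cve": "CVE-2021-44228", "cvss": 10.0, "criticality": "high"},
    {"asset": "kiosk", "cve": "CVE-2021-44228", "cvss": 10.0, "criticality": "low"},
    {"asset": "web01", "cve": "CVE-2019-0708",  "cvss": 9.8,  "criticality": "medium"},
]

def risk(finding):
    """Weighted risk score: same CVE ranks higher on a critical asset."""
    return finding["cvss"] * CRITICALITY_WEIGHT[finding["criticality"]]

for f in sorted(findings, key=risk, reverse=True):
    print(f"{risk(f):5.1f}  {f['asset']:6}  {f['cve']}")
```

Note how the same CVE on a high-criticality database outranks it on a low-criticality kiosk, which is exactly the context the remark on contextualization asks for.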

7 Log Management

Maturity
7.1 Have you formally described the log management service?
7.2 Please specify elements of the log management service document:
7.2.1 Key performance indicators
7.2.2 Quality indicators
7.2.3 Service dependencies
7.2.4 Service levels
7.2.5 Hours of operation
7.2.6 Service customers and stakeholders
7.2.7 Purpose
7.2.8 Service input / triggers
7.2.9 Service output / deliverables
7.2.10 Service activities
7.2.11 Service roles & responsibilities
Completeness
7.3 Is the service measured for quality?
7.4 Is the service measured for service delivery in accordance with service levels?
7.5 Are customers and/or stakeholders regularly updated about the service?
7.6 Is there a contractual agreement between the SOC and the customers?
7.7 Is sufficient personnel allocated to the process to ensure required service delivery?
7.8 Is the service aligned with other relevant processes?
7.9 Is there an incident resolution / service continuity process in place for this service?
7.10 Has a set of procedures been created for this service?
7.11 Is there an onboarding and offloading procedure for this service?
7.12 Are best practices applied to the service?
7.13 Is process data gathered for prediction of service performance?
7.14 Is the service continuously being improved based on improvement goals?
Capability
7.15 Please specify capabilities and artefacts of the log management process:
7.15.1 End-point log collection
7.15.2 Application log collection
7.15.3 Database log collection
7.15.4 Network flow data collection
7.15.5 Network device log collection
7.15.6 Security device log collection
7.15.7 Centralized aggregation and storage
7.15.8 Multiple retention periods
7.15.9 Secure log transfer
7.15.10 Support for multiple log formats
7.15.11 Support for multiple transfer techniques
7.15.12 Data normalization
7.15.13 Log searching and filtering
7.15.14 Alerting
7.15.15 Reporting and dashboards
7.15.16 Log tampering detection
7.15.17 Log collection policy
7.15.18 Logging policy
7.15.19 Data retention policy
7.15.20 Privacy and Sensitive data handling policy

Completeness (%)

Comments and/or Remarks


7.16 Specify rationale for chosen values or any additional comments
Answer: Fully complete
Completeness (%): 91

Guidance (selected answers):
7.1  Document completed, approved and formally published
7.3  Formal and approved metrics in place, feedback used for improvement
7.4  SLA compliance discussed with customers regularly for improvement
7.5  Periodical updates sent to all customers/stakeholders
7.6  Contract signed, approved by and regularly reviewed with customers
7.7  Sufficient dedicated personnel available, trained and fully capable
7.8  Alignment done structurally & regularly with all relevant processes
7.9  Basic service continuity process in place
7.10 Procedures in place, formally published and fully operationalized
7.11 Procedures in place, formally published and regularly reviewed
7.12 Best practices applied and adherence checked regularly
7.13 Continuous measurement to determine progress & adjust process
7.14 Continuous improvement based on targets and feedback loops
Capability answers (7.15.x, in order):
Fully implemented, documented, approved, actively improved
Fully implemented, documented, approved, actively improved
Fully implemented, documented, approved, actively improved
Mostly implemented, documented and approved
Fully implemented, documented, approved, actively improved
Fully implemented, documented, approved, actively improved
Mostly implemented, documented and approved
Fully implemented, documented, approved, actively improved (all remaining capability answers)


Remarks

A service description should be in place

Indicators to establish the performance of the service


Indicators to establish the quality of service delivery
A clear understanding of which people / process / technologies are required for adequate service delivery
Agreements on minimum performance, capacity, availability, etc.
The operational hours for this service (e.g. 24/7)
The customers and stakeholders for this service (e.g. IT management)
The purpose and objectives for this service
The service input: what triggers this service to run?
The service output: what does the service deliver? Can be tangible (e.g. reports) or intangible (e.g. situational awareness)
Which activities are carried out within the scope of the service?
Which roles and responsibilities apply to this service?
Use this outcome to determine the score for 7.2
Are the quality indicators from the previous questions used for reporting on the service?
Service levels should be used to formally commit the SOC to service delivery
Changes to the service scope, delivery, etc.
Contractual agreements should also cover penalties
Allocation of dedicated personnel will ensure highest service quality
e.g. the security monitoring process, and mainly the security incident management process
Service continuity is important to comply with contractual agreements, even in case of major incidents
Procedures support process standardization and quality. Personnel should be trained to use procedures correctly and structurally
Customer onboarding and offloading procedures support efficient service delivery and ensure customers are (dis)connected correctly
Best practices should be used to optimize this service
Service performance measurement requires establishment of performance goals
Improvement based on evaluation, (maturity) assessment, tests, etc.
Collection of logs from servers and clients
Collection of application logs
Collection of database logs
Collection of netflow (or equivalent) information
Collection of logs from network devices (switches, routers, etc.)
Collection of logs from security devices (firewall, remote access gateways, etc.)
A central physical or logical entity for processing and aggregating collected logging
e.g. short period for large-quantity logging (proxy logging), long period for security logging
Support for encryption and (client or server) authentication
Support for different log formats (plain text, XML, Windows Event Log, etc.)
e.g. syslog, WMI, etc.
i.e. assignment of severity, category, priority
The capability to search in large quantities of logging using search expressions and filter expressions
Basic alerting functions based on log contents or normalized information (severity, etc.)
Reports and dashboards for visualization of log information
Detection of tampering with the logging information. This can be part of techniques applied to cover tracks
A policy that enforces log collection from all systems
Policy to enforce the generation of a minimum set of operational and security logs (e.g. authentication, authorization)
A policy that defines how long logging should (or may) be stored
A policy that describes how to deal with sensitive information that may exist in the security monitoring systems
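Log tampering detection (7.15.16) is often built on the idea that each log record commits cryptographically to the one before it, so rewriting an entry in place breaks every later link. A minimal hash-chain sketch; real deployments use signed or forward-secure logging schemes rather than this toy:

```python
# Toy hash chain over log records: any in-place edit invalidates the chain.
import hashlib

def chain(records):
    """Return the running SHA-256 chain digests for a list of log records."""
    digests, prev = [], b""
    for rec in records:
        prev = hashlib.sha256(prev + rec.encode()).digest()
        digests.append(prev)
    return digests

def verify(records, digests):
    """Recompute the chain and compare against the stored digests."""
    return chain(records) == digests

log = ["login alice", "sudo alice", "logout alice"]
proof = chain(log)

assert verify(log, proof)
log[1] = "sudo bob"            # attacker rewrites one entry...
assert not verify(log, proof)  # ...and verification fails from that point on
```

Storing the periodic chain digests on a separate, write-once system is what makes this useful against an attacker trying to cover their tracks.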
Results
1. Results
2. NIST CSF Scoring
3. Results Sharing

Domain      Aspect                                  Maturity score  Maturity target  Capability score  Capability target  In scope?
Business    1. Business Drivers                     3.5
Business    2. Customers                            4.58
Business    3. Charter                              3.75
Business    4. Governance                           3.47
Business    5. Privacy & policy                     1.88
Business    overall Business                        3.44            3                N/A               N/A
People      1. Employees                            3.13
People      2. Roles and Hierarchy                  4.38
People      3. People Management                    3.21
People      4. Knowledge Management                 4.22
People      5. Training & Education                 2.68
People      overall People                          3.52            3                N/A               N/A
Process     1. SOC Management                       3.39
Process     2. Operations & Facilities              3.06
Process     3. Reporting & Communication            3.6
Process     4. Use Case Management                  3.5
Process     5. Detection Engineering & Validation   3.19
Process     overall Process                         3.35            3                N/A               N/A
Technology  1. SIEM / UEBA                          3.81                             2.21                                 Yes
Technology  2. NDR                                  0                                0                                    No
Technology  3. EDR                                  3.64                             2.13                                 Yes
Technology  4. SOAR                                 3.81                             2.12                                 Yes
Technology  overall Technology                      3.75            3                2.15              2
Services    1. Security Monitoring                  3.57                             2.53                                 Yes
Services    2. Security Incident Management         2.41                             1.69                                 Yes
Services    3. Security Analysis & Forensics        4.73                             2.87                                 Yes
Services    4. Threat Intelligence                  0                                0                                    Yes
Services    5. Threat Hunting                       0.71                             0.36                                 Yes
Services    6. Vulnerability Management             0                                0                                    Yes
Services    7. Log Management                       4.71                             2.74                                 Yes
Services    overall Services                        2.3             3                1.46              1
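The overall domain scores appear to be the arithmetic mean of the aspect scores, rounded to two decimals; this reconstruction reproduces the published Business value, though the exact SOC-CMM weighting may differ.

```python
# Reconstruction sketch: overall Business maturity as the mean of its five
# aspect scores (3.5, 4.58, 3.75, 3.47, 1.88 from the table above).
business = [3.5, 4.58, 3.75, 3.47, 1.88]
overall = round(sum(business) / len(business), 2)
print(overall)  # 3.44
```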
[Radar and bar charts: maturity and capability scores per aspect and per domain, plotted against the target maturity and target capability scores.]
NIST CSF 1.1 Scoring

Domain    Aspect                                                    Maturity score  Capability score
Identify  Asset Management (ID.AM)                                  4.46            N/A
Identify  Business Environment (ID.BE)                              3.61            N/A
Identify  Governance (ID.GV)                                        2.63            3
Identify  Risk Assessment (ID.RA)                                   2.69            0
Identify  Risk Management Strategy (ID.RM)                          3.75            N/A
Identify  Supply Chain Risk Management (ID.SC)                      N/A             1.5
Identify  overall Identify                                          3.43            1.5
Protect   Access Control (PR.AC)                                    3.3             0
Protect   Awareness and Training (PR.AT)                            4.08            1.5
Protect   Data Security (PR.DS)                                     2.75            2.68
Protect   Information Protection Processes and Procedures (PR.IP)   1.66            0
Protect   Maintenance (PR.MA)                                       3.21            0
Protect   Protective Technology (PR.PT)                             4.31            3
Protect   overall Protect                                           3.22            1.2
Detect    Anomalies and Events (DE.AE)                              3.75            2.32
Detect    Security Continuous Monitoring (DE.CM)                    3.75            1.24
Detect    Detection Processes (DE.DP)                               3.83            2.08
Detect    overall Detect                                            3.78            1.88
Respond   Response Planning (RS.RP)                                 2.5             2.25
Respond   Communications (RS.CO)                                    2.5             1.85
Respond   Analysis (RS.AN)                                          5               2.25
Respond   Mitigation (RS.MI)                                        2.5             0.61
Respond   Improvements (RS.IM)                                      2.19            1.31
Respond   overall Respond                                           2.94            1.65
Recover   Recovery Planning (RC.RP)                                 N/A             N/A
Recover   Improvements (RC.IM)                                      N/A             N/A
Recover   Communications (RC.CO)                                    N/A             N/A
Recover   overall Recover                                           N/A             N/A
[Radar and bar charts: NIST CSF 1.1 maturity and capability scores per category and per function.]
Results
1. Results NIST CSF 2.0
2. NIST CSF Scoring NIST CSF 1.1
3. Results sharing

Function  Category                                                          Maturity score  Capability score
Govern    Organizational Context (GV.OC)                                    3.37            3
          Risk Management Strategy (GV.RM)                                  3.13            N/A
          Roles, Responsibilities, and Authorities (GV.RR)                  3.94            N/A
          Policy (GV.PO)                                                    2.86            3
          Oversight (GV.OV)                                                 1.25            N/A
          Cybersecurity Supply Chain Risk Management (GV.SC)                4.38            1.5
          overall Govern                                                    3.16            2.5
Identify  Asset Management (ID.AM)                                          4.04            0.75
          Risk Assessment (ID.RA)                                           1.76            0
          Improvement (ID.IM)                                               2.64            1.35
          overall Identify                                                  2.81            0.7
Protect   Identity Management, Authentication, and Access Control (PR.AA)   3.38            N/A
          Awareness and Training (PR.AT)                                    3.75            2
          Data Security (PR.DS)                                             3.54            2.81
          Platform Security (PR.PS)                                         4.17            3
          Technology Infrastructure Resilience (PR.IR)                      2.57            2.25
          overall Protect                                                   3.48            2.52
Detect    Continuous Monitoring (DE.CM)                                     3.67            1.99
          Adverse Event Analysis (DE.AE)                                    3.75            2.27
          overall Detect                                                    3.71            2.13
Respond   Incident Management (RS.MA)                                       3.5             1.98
          Incident Analysis (RS.AN)                                         5               2.54
          Incident Response Reporting and Communication (RS.CO)             2.5             1.82
          Incident Mitigation (RS.MI)                                       2.5             0.92
          overall Respond                                                   3.38            1.82
Recover   Incident Recovery Plan Execution (RC.RP)                          N/A             2.63
          Incident Recovery Communication (RC.CO)                           N/A             N/A
          overall Recover                                                   N/A             2.63
[Charts: NIST CSF 2.0 results — radar charts of maturity and capability scores per category, and a bar chart of scores per function (Govern, Identify, Protect, Detect, Respond, Recover)]
Assessment results
Personal information
May we contact you regarding your scoring?
If yes: please provide your email address

SOC & organisational Profile


Assessment date 2024-05-11 YYYY-MM-DD format
Select assessment type 0
Select assessment style 0
What is your organization's size (FTE)? 1.000-4.999
What is your organization's sector? Consulting
Number of years of SOC operations 3
What is your SOC size (FTE)? 5
What region is your SOC located in? South America
What is your SOC organisational model? Distributed SOC
What is your SOC geographical scope? Regional

SOC assessment scores Maturity score Maturity target Capability score Capability target
Business domain 3.44 3
Business drivers 3.5
Customers 4.58
Charter 3.75
Governance 3.47
Privacy & policy 1.88
People domain 3.52 3
Employees 3.13
Roles & hierarchy 4.38
People & team management 3.21
Knowledge management 4.22
Training & Education 2.68
Process 3.35 3
SOC management 3.39
Operations & facilities 3.06
Reporting & communication 3.6
Use case management 3.5
Detection engineering & validation 3.19
Technology 3.75 3 2.15 2 In scope?
SIEM / UEBA tooling 3.81 2.21 Yes
NDR tooling 0 0 No
EDR tooling 3.64 2.13 Yes
SOAR tooling 3.81 2.12 Yes
Services 2.3 3 2.15 1 In scope?
Security monitoring 3.57 2.53 Yes
Security incident management 2.41 1.69 Yes
Security analysis & forensics 4.73 2.87 Yes
Threat intelligence 0 0 Yes
Threat hunting 0.71 0.36 Yes
Vulnerability management 0 0 Yes
Log management 4.71 2.74 Yes
Next steps
1. Next steps for improvement

Maturity improvement
With the SOC-CMM assessment completed, the next step is to determine the areas to improve. This requires some analysis of the results. The results should be analysed top-down. First, determine which domains score less than the target maturity level, then drill down into those domains using the graphs. If a target maturity level was not used, choose the domains that underperform in comparison to the other domains. The next step is to determine which aspects of those domains yield the lowest scores.

When the domains and the respective aspects that require improvement have been identified, detailed information is required to determine the exact improvements that need to be made. The sheets for those domains provide the detailed information that is required for improvement. Use the scoring mechanism as described in the 'Usage' sheet to determine which of the individual elements is negatively contributing to the overall score. Those elements are candidates for improvement. Improvement can be as simple as creating and maintaining the appropriate documentation, or as complex as introducing new management elements to the SOC. The SOC-CMM does not provide guidance on how to execute the improvement; this should be determined by internal experts or external consultants. Alternatively, it is possible to purchase a licensed and supported version of the SOC-CMM, which contains a number of consultancy hours.

Capability improvement
Capabilities apply to services and technologies and indicate how capable a service or technology is of reaching its goals. To determine which specific capabilities need to be improved, the first question to ask is: which service or technology is negatively impacted the most by a lack of capabilities? That service or technology is the first candidate for improvement.

Similar to maturity improvement, the detailed information is provided in the sheets for those domains. The elements that score the lowest are the elements that need to be addressed. It is recommended to search for groups of elements that share the same underlying reason (root cause) for underscoring; this way, improvement of capabilities can be optimised. A common root cause is lack of documentation and formalisation.

Comparison
When a second assessment is performed, the results should be compared to the previous assessment to determine the growth and evolution of the SOC. This includes both the high-level and the detailed information about the improvement. Use the result tables to determine the differences, then drill down to those specific parts of the assessment to see where actual improvement was made and whether it is in line with the goals set for improvement.
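The top-down gap analysis described above can be sketched in a few lines of Python. This is an illustrative aid only: the dict structure and target value are assumptions, with scores copied from this assessment's results table; the SOC-CMM workbook itself performs this analysis in Excel.

```python
# Hypothetical sketch of the top-down gap analysis: find domains below the
# target maturity level, then list their lowest-scoring aspects.
TARGET = 3.0  # target maturity level used in this assessment

domains = {
    "Business": {"score": 3.44, "aspects": {
        "Business drivers": 3.5, "Customers": 4.58, "Charter": 3.75,
        "Governance": 3.47, "Privacy & policy": 1.88}},
    "Services": {"score": 2.30, "aspects": {
        "Security monitoring": 3.57, "Security incident management": 2.41,
        "Security analysis & forensics": 4.73, "Threat intelligence": 0.0,
        "Threat hunting": 0.71, "Vulnerability management": 0.0,
        "Log management": 4.71}},
}

# Step 1: domains scoring below the target maturity level.
below_target = {d: v for d, v in domains.items() if v["score"] < TARGET}

# Step 2: within those domains, the lowest-scoring aspects are the first
# candidates for improvement.
for name, data in sorted(below_target.items(), key=lambda kv: kv[1]["score"]):
    worst = sorted(data["aspects"].items(), key=lambda kv: kv[1])[:3]
    print(f"{name} ({data['score']}): improve {[a for a, _ in worst]}")
```

With the values above, only the Services domain falls below the target, and its zero-scoring aspects (threat intelligence, vulnerability management) surface as the first candidates.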
in scope type answer importance
SOC-CMM - Business Domain
B1 - Business Drivers
B 1.1 1 M 4 3
B 1.2 1 M 3 3
B 1.3 1 M 3 3
B 1.4 1 M 5 3
B 1.5 1 M 4 3
SUM 19 15

B2 - Customers
B 2.1 1 M 5 3
B 2.2
B 2.2.1 2
B 2.2.2 2
B 2.2.3 1
B 2.2.4 2
B 2.2.5 1
B 2.2.6 2
B 2.2.7 2
B 2.2.8
B 2.3 1 M 4 3
B 2.4 1 M 4 3
B 2.5 1 M 5 3
B 2.6 1 M 5 3
B 2.7 1 M 5 3
SUM 28 18

B3 - SOC Charter
B 3.1 1 M 4 3
B 3.2 Mostly complete
B 3.2.1 2
B 3.2.2 2
B 3.2.3 1
B 3.2.4 2
B 3.2.5 1
B 3.2.6 2
B 3.2.7 2
B 3.2.8 2
B 3.2.9 2
B 3.2.10 2
B 3.2.11 2
B 3.3 1 M 3 3
B 3.4 1 M 4 3
B 3.5 1 M 5 3
SUM 16 12
B4 - Governance
B 4.1 1 M 3 3
B 4.2 1 M 3 3
B 4.3 Fully complete
B 4.3.1 2
B 4.3.2 2
B 4.3.3 2
B 4.3.4 2
B 4.3.5 2
B 4.3.6 2
B 4.3.7 2
B 4.3.8 2
B 4.3.9 2
B 4.3.10 2
B 4.3.11 2
B 4.3.12 2
B 4.3.13 2
B 4.3.14 2
B 4.4 1 M 5 3
B 4.5 Fully complete
B 4.5.1 2
B 4.5.2 2
B 4.5.3 2
B 4.5.4 2
B 4.5.5 2
B 4.5.6 2
B 4.5.7 2
B 4.5.8 2
B 4.8 1 M 5 3
B 4.9 1 M 5 3
B 4.10 1 M 3 3
B 4.11 1 M 4 3
Maturity SUM 34 27

B5 - Privacy & Policy


B 5.6 1 M 3 3
B 5.7 1 M 2 3
B 5.7 1 M 2 3
B 5.7 1 M 2 3
B 5.8 1 M 2 3
B 5.9 1 M 2 3
B 5.10 1 M 3 3
B 5.11 1 M 3 3
Maturity SUM 25 30

SOC-CMM - People Domain


P1 - SOC Employees
P 1.1 4
P 1.2 2
P 1.2.1 0
P 1.3 1 M 3 3
P 1.4 1 M 2 3
P 1.5 1 M 2 3
P 1.6 1 M 3 3
P 1.7 1 M 5 3
P 1.8 1 M 4 3
Maturity SUM 28 24

P2 - SOC Roles and Hierarchy


P 2.1 1 M 5 3
P 2.1 1 M 5 3
P 2.1 1 M 5 3
P 2.2
P 2.2.1 2
P 2.2.2 2
P 2.2.3 1
P 2.2.4 2
P 2.2.5 2
P 2.2.6 2
P 2.2.7 2
P 2.2.8 1
P 2.2.9 2
P 2.2.10 1
P 2.2.11 1
P 2.2.12 1
P 2.3 1 M 5 3
P 2.3 1 M 5 3
P 2.4 1 M 5 3
P 2.4.1
P 2.5 1 M 5 3
P 2.6 1 M 4 3
P 2.6 1 M 4 3
P 2.6 1 M 4 3
P 2.7
P 2.7.1 2
P 2.7.2 2
P 2.7.3 2
P 2.7.4 2
P 2.7.5 2
P 2.7.6 2
P 2.7.7 2
P 2.7.8 2
P 2.8 1 M 5 3
P 2.8 1 M 5 3
P 2.8 1 M 5 3
P 2.9 1 M 4 3
P 2.10 1 M 3 3
Maturity SUM 36 24

P3 - People Management
P 3.1 1 M 2 3
P 3.2 1 M 3 3
P 3.3 1 M 1 3
P 3.4 1 M 4 3
P 3.7 1 M 2 3
P 3.8 1 M 3 3
P 3.9 1 M 3 3
P 3.10 1 M 4 3
P 3.11 1 M 5 3
P 3.12 1 M 5 3
Maturity SUM 50 42

P4 - Knowledge Management
P 4.1 1 M 5 3
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
P 4.9 1 M 4 3
P 4.10 1 M 4 3
Maturity SUM 35 24

P5 - Training & Education


P 5.1 1 M 4 3
P 5.2
P 5.2.1 2
P 5.2.2 2
P 5.2.3 2
P 5.2.4 2
P 5.2.5 2
P 5.2.6 1
P 5.3 1 M 1 3
P 5.4
P 5.4.1 1
P 5.4.2 1
P 5.4.3 1
P 5.5 1 M 5 3
P 5.6 1 M 3 3
P 5.7 1 M 5 3
P 5.8 1 M 2 3
P 5.9 1 M 2 3
Maturity SUM 22 21

SOC-CMM - Process Domain


M1 - SOC Management
M 1.1 1 M 4 3
M 1.2 1 M 4 3
M 1.3
M 1.3.1 2
M 1.3.2 1
M 1.3.3 1
M 1.3.4 2
M 1.3.5 1
M 1.3.6 2
M 1.3.7 2
M 1.3.8 1
M 1.3.9 2
M 1.3.10 1
M 1.4 1 M 5 3
M 1.5 1 M 4 3
Maturity SUM 26 21

M2 - Security Operations & Facilities


M 2.1
M 2.1.3 1 M 3 3
M 2.2.1 1 M 3 3
M 2.2.2 1 M 3 3
M 2.2.3 1 M 4 3
M 2.2.4 1 M 4 3
M 2.3
M 2.3.1 1 M 4 3
M 2.3.2 1 M 4 3
M 2.3.3 1 M 4 3
M 2.3.4 1 M 4 3
M 2.3.5 1 M 5 3
M 2.4
M 2.4.1 1 M 3 3
M 2.4.3 1 M 3 3
M 2.4.4 1 M 3 3
M 2.4.6 1 M 4 3
M 2.4.7 1 M 2 3
M 2.4.8 1 M 3 3
M 2.5
M 2.5.1 1 M 5 3
M 2.5.3 1 M 5 3
M 2.5.4 1 M 3 3
M 2.5.5 1 M 3 3
M 2.5.6 1 M 3 3
M 2.6
M 2.6.1 1 M 2 3
M 2.6.2 1 M 2 3
Maturity SUM 107 93

M3 - Reporting & Communication


M 3.1 1 M 5 3
M 3.2 1 M 4 3
M 3.3 1 M 4 3
M 3.4 1 M 5 3
M 3.5 1 M 3 3
M 3.6 1 M 4 3
M 3.7
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
M 3.8
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
M 3.9
M 3.9.1 1 M 4 3
M 3.9.2 1 M 3 3
M 3.9.3 1 M 4 3
Maturity SUM 66 51

M4 - Use Case Management


M 4.1.1 1 M 3 3
M 4.1.1 1 M 3 3
M 4.1.1 1 M 3 3
M 4.1.2 1 M 4 3
M 4.1.2 1 M 4 3
M 4.1.2 1 M 4 3
M 4.1.3 1 M 3 3
M 4.1.3 1 M 3 3
M 4.1.3 1 M 3 3
M 4.1.4 1 M 4 3
M 4.1.4 1 M 4 3
M 4.1.4 1 M 4 3
M 4.1.4 1 M 4 3
M 4.1.5 1 M 3 3
M 4.1.5 1 M 3 3
M 4.1.5 1 M 3 3
M 4.1.6 1 M 3 3
M 4.1.6 1 M 3 3
M 4.1.6 1 M 3 3
M 4.1.7 1 M 3 3
M 4.1.7 1 M 3 3
M 4.1.7 1 M 3 3
M 4.1.8 1 M 3 3
M 4.1.8 1 M 3 3
M 4.1.8 1 M 3 3
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
M 4.1.9 1 M 4 3
M 4.1.9 1 M 4 3
M 4.1.9 1 M 4 3
M 4.1.10 1 M 5 3
M 4.1.10 1 M 5 3
M 4.1.10 1 M 5 3
M 4.1.11 1 M 5 3
M 4.1.11 1 M 5 3
M 4.1.11 1 M 5 3
Maturity SUM 76 60

SOC-CMM - Technology Domain


T1 - SIEM Technology
T 1 - Scope 2
T 1.1
T 1.1.1 1 M 4 3
T 1.1.2 1 M 4 3
T 1.2
T 1.2.1 1 M 4 3
T 1.2.2 1 M 4 3
T 1.3
T 1.3.1 1 M 5 3
T 1.3.2 1 M 5 3
T 1.3.3 1 M 5 3
T 1.3.4 1 M 4 3
T 1.5
T 1.5.1 1 M 4 3
T 1.5.2 1 M 4 3
T 1.5.3 1 M 4 3
T 1.5.4 1 M 4 3
T 1.5.5 1 M 3 3
T 1.5.6 1 M 2 3
T 1.6
T 1.6.1 1 M 4 3
T 1.6.1 1 M 4 3
T 1.6.2 1 M 4 3
T 1.6.2 1 M 4 3
T 1.7
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Capability SUM 150 114
Maturity SUM 89 66

T2 - NDR Tooling
T 2 - Scope 1
T 2.1
T 2.1.1 0 M 0 3
T 2.1.2 0 M 0 3
T 2.2
T 2.2.1 0 M 0 3
T 2.2.2 0 M 0 3
T 2.3
T 2.3.1 0 M 0 3
T 2.3.2 0 M 0 3
T 2.3.3 0 M 0 3
T 2.3.4 0 M 0 3
T 2.5
T 2.5.1 0 M 0 3
T 2.5.2 0 M 0 3
T 2.5.3 0 M 0 3
T 2.5.4 0 M 0 3
T 2.5.5 0 M 0 3
T 2.5.6 0 M 0 3
T 2.6
T 2.6.1 0 M 0 3
T 2.6.1 0 M 0 3
T 2.6.2 0 M 0 3
T 2.6.2 0 M 0 3
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Capability SUM 0 0
Maturity SUM 0 0

T3 - Security Analytics
T 3 - Scope 2
T 3.1
T 3.1.1 1 M 4 3
T 3.1.2 1 M 4 3
T 3.2
T 3.2.1 1 M 4 3
T 3.2.2 1 M 4 3
T 3.3 1
T 3.3.1 1 M 3 3
T 3.3.2 1 M 5 3
T 3.3.3 1 M 5 3
T 3.3.4 1 M 5 3
T 3.5
T 3.5.1 1 M 3 3
T 3.5.2 1 M 3 3
T 3.5.3 1 M 4 3
T 3.5.4 1 M 3 3
T 3.5.5 1 M 3 3
T 3.5.6 1 M 3 3
T 3.6
T 3.6.1 1 M 3 3
T 3.6.1 1 M 3 3
T 3.6.2 1 M 3 3
T 3.6.2 1 M 3 3
T 3.7
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Capability SUM 188 147
Maturity SUM 86 66

T4 - Security Automation & Orchestration


T 4 - Scope 2
T 4.1
T 4.1.1 1 M 4 3
T 4.1.2 1 M 4 3
T 4.2
T 4.2.1 1 M 4 3
T 4.2.2 1 M 4 3
T 4.3
T 4.3.1 1 M 3 3
T 4.3.2 1 M 3 3
T 4.3.3 1 M 5 3
T 4.3.4 1 M 5 3
T 4.5
T 4.5.1 1 M 4 3
T 4.5.2 1 M 4 3
T 4.5.3 1 M 4 3
T 4.5.4 1 M 4 3
T 4.5.5 1 M 4 3
T 4.5.6 1 M 5 3
T 4.6
T 4.6.1 1 M 5 3
T 4.6.1 1 M 5 3
T 4.6.2 1 M 4 3
T 4.6.2 1 M 4 3
T 4.7
T 4.7.4 1 C 4 3
T 4.7.5 1 C 4 3
T 4.7.6 1 C 4 3
T 4.7.7 1 C 4 3
T 4.7.8 1 C 3 3
T 4.7.1 1 C 4 3
T 4.7.9 1 C 4 3
T 4.7.2 1 C 4 3
T 4.7.10 1 C 4 3
T 4.7.11 1 C 4 3
T 4.7.13 1 C 3 3
T 4.7.14 1 C 3 3
T 4.7.15 1 C 4 3
T 4.7.16 1 C 3 3
T 4.7.3 1 C 5 3
Removed, keep lines for backwards compatibility
Renumbered, keep lines for backwards compatibility
Renumbered, keep lines for backwards compatibility
T 4.7.24 1 C 5 3
Removed, keep lines for backwards compatibility
Capability SUM 92 72
Maturity SUM 89 66

SOC-CMM - Services Domain


S1 - Security Monitoring
S 1 - Scope 2
S 1.1 1 M 4 3
S 1.1 1 M 4 3
S 1.1 1 M 4 3
S 1.1 1 M 4 3
S 1.1 1 M 4 3
S 1.1 1 M 4 3
S 1.1 1 M 4 3
S 1.1 1 M 4 3
S 1.1 1 M 4 3
S 1.2
S 1.2.1 2
S 1.2.2 2
S 1.2.3 2
S 1.2.4 1
S 1.2.5 2
S 1.2.6 1
S 1.2.7 1
S 1.2.8 2
S 1.2.9 1
S 1.2.10 1
S 1.2.11 1
S 1.3 1 M 4 3
S 1.4 1 M 4 3
S 1.5 1 M 4 3
S 1.6 1 M 4 3
S 1.7 1 M 4 3
S 1.8 1 M 3 3
S 1.9 1 M 3 3
S 1.9 1 M 3 3
S 1.10 1 M 4 3
S 1.12 1 M 4 3
S 1.13 1 M 4 3
S 1.13 1 M 4 3
S 1.13 1 M 4 3
S 1.13 1 M 4 3
S 1.13 1 M 4 3
S 1.13 1 M 4 3
S 1.13 1 M 4 3
S 1.13 1 M 4 3
S 1.13 1 M 4 3
S 1.14 1 M 4 3
S 1.15 1 M 4 3
S 1.16
S 1.16.1 1 C 4 3
S 1.16.2 1 C 4 3
S 1.16.3 1 C 4 3
S 1.16.4 1 C 4 3
S 1.16.5 1 C 4 3
S 1.16.5 1 C 4 3
S 1.16.6 1 C 4 3
S 1.16.7 1 C 3 3
S 1.16.8 1 C 5 3
S 1.16.12 1 C 4 3
S 1.16.13 1 C 4 3
S 1.16.14 1 C 5 3
S 1.16.15 1 C 5 3
S 1.16.16 1 C 5 3
S 1.16.17 1 C 5 3
S 1.16.18 1 C 5 3
S 1.16.19 1 C 4 3
S 1.16.20 1 C 5 3
S 1.16.21 1 C 5 3
S 1.16.22 1 C 4 3
S 1.16.23 1 C 3 3
S 1.16.24 1 C 5 3
S 1.16.9 1 C 4 3
S 1.16.10 1 C 4 3
S 1.16.11 1 C 4 3
S 1.17
Capability SUM 118 81
Maturity SUM 54 42

S2 - Security Incident Management


S 2 - Scope 2
S 2.1 1
S 2.1.1 0
S 2.1.2 0
S 2.2 1 M 5 3
S 2.3 1 M 4 3
S 2.4
S 2.4.1 1
S 2.4.2 2
S 2.4.3 1
S 2.4.4 1
S 2.4.5 1
S 2.4.6 1
S 2.4.7 1
S 2.4.8 2
S 2.4.9 1
S 2.4.10 1
S 2.4.11 2
S 2.5 1 M 2 3
S 2.6 1 M 3 3
S 2.7 1 M 3 3
S 2.7 1 M 3 3
S 2.7 1 M 3 3
S 2.7 1 M 3 3
S 2.8 1 M 3 3
S 2.9 1 M 2 3
S 2.10 1 M 2 3
S 2.11 1 M 3 3
S 2.11 1 M 3 3
S 2.11 1 M 3 3
S 2.13 1 M 2 3
S 2.14 1 M 3 3
S 2.15 1 M 3 3
S 2.16 1 M 3 3
S 2.16 1 M 3 3
S 2.17
S 2.17.1 1 C 3 3
S 2.17.2 1 C 3 3
S 2.17.3 1 C 4 3
S 2.17.4 1 C 4 3
S 2.17.5 1 C 5 3
S 2.17.6 1 C 4 3
S 2.17.7 1 C 3 3
S 2.17.8 1 C 4 3
S 2.17.9 1 C 3 3
S 2.17.10 1 C 4 3
S 2.17.11 1 C 4 3
S 2.17.12 1 C 4 3
S 2.17.13 1 C 3 3
S 2.17.14 1 C 4 3
S 2.17.15 1 C 4 3
S 2.17.15 1 C 4 3
S 2.17.16 1 C 4 3
S 2.17.16 1 C 4 3
S 2.17.17 1 C 3 3
S 2.17.18 1 C 5 3
S 2.17.19 1 C 4 3
S 2.17.20 1 C 3 3
S 2.17.21 1 C 2 3
S 2.17.22 1 C 2 3
S 2.17.23 1 C 3 3
S 2.17.24 1 C 3 3
S 2.17.25 1 C 3 3
S 2.17.26 1 C 2 3
S 2.17.26 1 C 2 3
S 2.17.27 1 C 1 3
S 2.17.27 1 C 1 3
S 2.17.28 1 C 2 3
S 2.17.28 1 C 2 3
S 2.17.29 1 C 2 3
S 2.17.30 1 C 3 3
S 2.17.31 1 C 3 3
S 2.17.32 1 C 3 3
S 2.17.32 1 C 3 3
S 2.18
Capability SUM 117 108
Maturity SUM 41 42

S 3 - Security Analysis
S 3 - Scope 2
S 3.1 1 M 5 3
S 3.1 1 M 5 3
S 3.1 1 M 5 3
S 3.1 1 M 5 3
S 3.2
S 3.2.1 2
S 3.2.2 2
S 3.2.3 2
S 3.2.4 2
S 3.2.5 2
S 3.2.6 2
S 3.2.7 2
S 3.2.8 2
S 3.2.9 2
S 3.2.10 2
S 3.2.11 2
S 3.3 1 M 5 3
S 3.4 1 M 5 3
S 3.5 1 M 4 3
S 3.6 1 M 4 3
S 3.7 1 M 5 3
S 3.8 1 M 5 3
S 3.9 1 M 5 3
S 3.9 1 M 5 3
S 3.10 1 M 5 3
S 3.12 1 M 5 3
S 3.13 1 M 5 3
S 3.13 1 M 5 3
S 3.13 1 M 5 3
S 3.13 1 M 5 3
S 3.14 1 M 5 3
S 3.15 1 M 5 3
S 3.16
S 3.16.1 1 C 5 3
S 3.16.2 1 C 5 3
S 3.16.3 1 C 3 3
S 3.16.4 1 C 5 3
S 3.16.5 1 C 5 3
S 3.16.6 1 C 5 3
S 3.16.7 1 C 5 3
S 3.16.8 1 C 5 3
S 3.16.9 1 C 5 3
S 3.16.10 1 C 5 3
S 3.16.11 1 C 5 3
S 3.16.12 1 C 5 3
S 3.16.13 1 C 5 3
S 3.16.14 1 C 5 3
S 3.16.15 1 C 5 3
S 3.16.16 1 C 5 3
S 3.16.17 1 C 5 3
S 3.16.18 1 C 5 3
S 3.16.19 1 C 5 3
S 3.16.20 1 C 5 3
S 3.16.21 1 C 5 3
S 3.16.22 1 C 5 3
S 3.16.23 1 C 5 3
S 3.16.24 1 C 3 3
S 3.17
Capability SUM 116 72
Maturity SUM 67 42

S4 - Threat Intelligence
S 4 - Scope 2
S 4.1 1 M 0 3
S 4.2
S 4.2.1 1
S 4.2.2 1
S 4.2.3 1
S 4.2.4 1
S 4.2.5 1
S 4.2.6 1
S 4.2.7 1
S 4.2.8 1
S 4.2.9 1
S 4.2.10 1
S 4.2.11 1
S 4.3 1 M 0 3
S 4.4 1 M 0 3
S 4.5 1 M 0 3
S 4.6 1 M 0 3
S 4.7 1 M 0 3
S 4.8 1 M 0 3
S 4.9 1 M 0 3
S 4.9 1 M 0 3
S 4.10 1 M 0 3
S 4.12 1 M 0 3
S 4.13 1 M 0 3
S 4.14 1 M 0 3
S 4.15
S 4.15.1 1 C 0 3
S 4.15.2 1 C 0 3
S 4.15.3 1 C 0 3
S 4.15.4 1 C 0 3
S 4.15.5 1 C 0 3
S 4.15.6 1 C 0 3
S 4.15.7 1 C 0 3
S 4.15.8 1 C 0 3
S 4.15.9 1 C 0 3
S 4.15.10 1 C 0 3
S 4.15.11 1 C 0 3
S 4.15.12 1 C 0 3
S 4.15.13 1 C 0 3
S 4.15.14 1 C 0 3
S 4.15.15 1 C 0 3
S 4.15.16 1 C 0 3
S 4.15.17 1 C 0 3
S 4.15.18 1 C 0 3
S 4.15.19 1 C 0 3
S 4.15.20 1 C 0 3
S 4.15.21 1 C 0 3
S 4.15.22 1 C 0 3
S 4.15.23 1 C 0 3
S 4.15.24 1 C 0 3
S 4.15.25 1 C 0 3
S 4.15.26 1 C 0 3
S 4.15.27 1 C 0 3
S 4.15.28 1 C 0 3
S 4.15.29 1 C 0 3
S 4.16
Capability SUM 0 93
Maturity SUM 0 39

S5 - Hunting
S 5 - Scope 2
S 5.1 1 M 0 3
S 5.2 1 M 0 3
S 5.3
S 5.3.1 2
S 5.3.2 2
S 5.3.3 2
S 5.3.4 2
S 5.3.5 2
S 5.3.6 2
S 5.3.7 1
S 5.3.8 2
S 5.3.9 2
S 5.3.10 2
S 5.3.11 1
S 5.4 1 M 0 3
S 5.5 1 M 3 3
S 5.6 1 M 4 3
S 5.7 1 M 3 3
S 5.8 1 M 0 3
S 5.9 1 M 0 3
S 5.10 1 M 0 3
S 5.10 1 M 0 3
S 5.11 1 M 3 3
S 5.13 1 M 3 3
S 5.14 1 M 3 3
S 5.15 1 M 0 3
S 5.16
S 5.16.1 1 C 0 3
S 5.16.2 1 C 3 3
S 5.16.3 1 C 0 3
S 5.16.4 1 C 2 3
S 5.16.5 1 C 0 3
S 5.16.6 1 C 3 3
S 5.16.7 1 C 0 3
S 5.16.8 1 C 3 3
S 5.16.9 1 C 0 3
S 5.16.10 1 C 3 3
S 5.16.11 1 C 3 3
S 5.16.12 1 C 0 3
S 5.16.13 1 C 0 3
S 5.16.14 1 C 0 3
S 5.16.15 1 C 0 3
S 5.16.16 1 C 0 3
S 5.16.17 1 C 3 3
S 5.16.18 1 C 3 3
S 5.16.19 1 C 3 3
S 5.16.20 1 C 3 3
S 5.16.21 1 C 2 3
S 5.17
Capability SUM 31 63
Maturity SUM 22 42

S6 - Vulnerability Management
S 6 - Scope 2
S 6.1 1 M 0 3
S 6.1 1 M 0 3
S 6.2
S 6.2.1 1
S 6.2.2 2
S 6.2.3 1
S 6.2.4 1
S 6.2.5 1
S 6.2.6 1
S 6.2.7 1
S 6.2.8 2
S 6.2.9 1
S 6.2.10 1
S 6.2.11 1
S 6.3 1 M 0 3
S 6.4 1 M 0 3
S 6.5 1 M 0 3
S 6.6 1 M 0 3
S 6.7 1 M 0 3
S 6.8 1 M 0 3
S 6.9 1 M 0 3
S 6.9 1 M 0 3
S 6.10 1 M 2 3
S 6.10 1 M 2 3
S 6.12 1 M 3 3
S 6.13 1 M 0 3
S 6.14 1 M 0 3
S 6.15
S 6.15.1 1 C 0 3
S 6.15.1 1 C 0 3
S 6.15.2 1 C 0 3
S 6.15.2 1 C 0 3
S 6.15.3 1 C 0 3
S 6.15.3 1 C 0 3
S 6.15.4 1 C 0 3
S 6.15.5 1 C 0 3
S 6.15.6 1 C 0 3
S 6.15.7 1 C 0 3
S 6.15.8 1 C 0 3
S 6.15.9 1 C 0 3
S 6.15.10 1 C 0 3
S 6.15.10 1 C 0 3
S 6.15.10 1 C 0 3
S 6.15.11 1 C 0 3
S 6.15.11 1 C 0 3
S 6.15.12 1 C 0 3
S 6.15.13 1 C 0 3
S 6.15.13 1 C 0 3
S 6.15.14 1 C 0 3
S 6.15.15 1 C 0 3
S 6.15.16 1 C 0 3
S 6.15.17 1 C 0 3
S 6.15.18 1 C 0 3
S 6.15.19 1 C 0 3
S 6.16
Capability SUM 0 60
Maturity SUM 8 39

S7 - Log Management
S 7 - Scope 2
S 7.1 1 M 5 3
S 7.2
S 7.2.1 2
S 7.2.2 2
S 7.2.3 2
S 7.2.4 2
S 7.2.5 2
S 7.2.6 2
S 7.2.7 2
S 7.2.8 2
S 7.2.9 2
S 7.2.10 2
S 7.2.11 2
S 7.3 1 M 5 3
S 7.4 1 M 5 3
S 7.5 1 M 4 3
S 7.6 1 M 5 3
S 7.7 1 M 5 3
S 7.8 1 M 5 3
S 7.9 1 M 3 3
S 7.9 1 M 3 3
S 7.10 1 M 5 3
S 7.12 1 M 5 3
S 7.13 1 M 5 3
S 7.14 1 M 5 3
S 7.15
S 7.15.1 1 C 0 3
S 7.15.2 1 C 5 3
S 7.15.3 1 C 5 3
S 7.15.4 1 C 5 3
S 7.15.5 1 C 4 3
S 7.15.6 1 C 5 3
S 7.15.7 1 C 5 3
S 7.15.8 1 C 4 3
S 7.15.9 1 C 5 3
S 7.15.10 1 C 5 3
S 7.15.11 1 C 5 3
S 7.15.12 1 C 5 3
S 7.15.13 1 C 5 3
S 7.15.14 1 C 5 3
S 7.15.15 1 C 5 3
S 7.15.16 1 C 5 3
S 7.15.17 1 C 5 3
S 7.15.18 1 C 5 3
S 7.15.19 1 C 5 3
S 7.15.19 1 C 5 3
S 7.15.20 1 C 5 3
S 7.16
Capability SUM 93 60
Maturity SUM 62 39

ALL ENTRIES BEYOND THIS POINT ARE FOR BACKWARDS COMPATIBILITY


Additions in version 2.1
S 1.16.23 1 C 3 3
S 1.16.25 1 C 5 3
S 1.16.9 1 C 4 3
S 1.16.26 1 C 5 3
S 2.17.28 1 C 2 3
S 2.16.33 1 C 3 3
S 2.16.34 1 C 3 3
S 2.16.34 1 C 3 3
S 2.16.35 1 C 3 3
S 6.14.20 1 C 0 3

Additions in version 2.2


Business
B 4.11 1 M 4 3
B 4.11 1 M 4 3
B 5.1 1 M 4 3
B 5.1 1 M 4 3
B 5.4 1 M 2 3
B 5.4 1 M 2 3
B 5.5 1 M 2 3
B 5.5 1 M 2 3
B 5.6 1 M 3 3
People
P 1.9 1 M 4 3
P 1.10 1 M 5 3
P 3.5 1 M 5 3
P 3.6 1 M 5 3
P 3.13 1 M 3 3
P 3.14 1 M 5 3
P 4.8 1 M 4 3
Process
M 2.1.6 1 M 3 3
M 2.3.2 1 M 2 3
M 2.3.5 1 M 4 3
M 2.3.9 1 M 4 3
M 2.4.2 1 M 4 3
Removed, keep lines for backwards compatibility
M 3.12.1 1 M 4 3
M 3.12.2 1 M 4 3
M 3.13.1 1 M 3 3
M 3.13.2 1 M 3 3
M 3.13.3 1 M 5 3
M 3.13.4 1 M 3 3
M 4.2.1 1 M 5 3
M 4.2.2 1 M 3 3
M 4.2.3 1 M 3 3
M 4.2.4 1 M 5 3
M 4.2.5 1 M 5 3
M 4.2.6 1 M 3 3
M 4.3.1 1 M 3 3
M 4.3.2 1 M 4 3
M 4.3.3 1 M 5 3

M5 - Detection Engineering
M 5.1.1 1 M 3 3
M 5.1.2 1 M 3 3
M 5.1.3 1 M 3 3
M 5.1.4 1 M 4 3
M 5.1.5 1 M 4 3
M 5.1.6 1 M 4 3
M 5.1.7 1 M 4 3
M 5.1.8 1 M 3 3
M 5.1.9 1 M 3 3
M 5.1.10 1 M 4 3
M 5.2.1 1 M 4 3
M 5.2.2 1 M 5 3
M 5.2.3 1 M 4 3
M 5.2.4 1 M 3 3
M 5.2.5 1 M 4 3
M 5.2.6 1 M 3 3
Maturity SUM 64 54

Technology
T 1.4.1 1 M 4 3
T 1.4.2 1 M 4 3
T 1.4.3 1 M 4 3
T 1.4.4 1 M 5 3
T 1.4.5 1 M 5 3
T 2.4.1 0 M 0 3
T 2.4.2 0 M 0 3
T 2.4.3 0 M 0 3
T 2.4.4 0 M 0 3
T 2.4.5 0 M 0 3
T 3.4.1 1 M 4 3
T 3.4.2 1 M 4 3
T 3.4.3 1 M 5 3
T 3.4.4 1 M 5 3
T 3.4.5 1 M 5 3
T 4.4.1 1 M 3 3
T 4.4.2 1 M 3 3
T 4.4.3 1 M 3 3
T 4.4.4 1 M 5 3
T 4.4.5 1 M 5 3
Services
S 1.11 1 M 4 3
S 2.12 1 M 3 3
S 3.11 1 M 4 3
S 4.11 1 M 0 3
S 5.12 1 M 3 3
S 6.11 1 M 3 3
S 7.11 1 M 5 3
S 4.14.25 1 C 0 3
S 4.14.31 1 C 0 3

Additions in version 2.3


Business
B 4.7 1 M 2 3
B 5.2 1 M 2 3
B 5.2 1 M 2 3
B 5.3.1 2
B 5.3.2 2
B 5.3.3 1
B 5.3.4 2
B 5.3.5 1
B 5.3.6 2
B 5.3.7 1
B 5.3.8 2
People
P 2.2.12 1
P 2.2.13 1
P 4.2 1 M 4 3
P 4.3.1 2
P 4.3.2 2
P 4.3.3 2
P 4.3.4 2
P 4.4 1 M 5 3
P 4.5 1 M 5 3
P 4.6.1 2
P 4.6.2 2
P 4.6.3 2
P 4.7 1 M 4 3
Process
M 1.6 1 M 3 3
M 1.8 1 M 3 3
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
M 2.1.1 1 M 3 3
M 2.1.2.1 2
M 2.1.2.2 1
M 2.1.2.3 2
M 2.1.2.4 1
M 2.1.2.5 2
M 2.1.2.6 1
M 2.1.4 1 M 3 3
M 2.1.5 1 M 5 3
M 3.7 1 M 4 3
M 3.8.1 2
M 3.8.2 1
M 3.8.3 2
M 3.8.4 2
M 3.8.5 1
M 3.8.6 2
M 3.8.7 1
M 3.8.8 2
M 3.9 1 M 4 3
M 3.10.1 2
M 3.10.2 1
M 3.10.3 2
M 3.10.4 2
M 3.10.5 2
M 3.10.6 1
M 5.2.7 1 M 3 3
M 5.2.7 1 M 3 3
M 5.2.8 1 M 3 3
M 5.2.8 1 M 3 3
Technology
T 1.6.3 1 M 3 3
T 2.6.3 0 M 0 3
T 3.6.3 1 M 4 3
T 4.6.3 1 M 4 3
T 1.7.1 1 C 4 3
T 1.7.2 1 C 5 3
T 1.7.3 1 C 5 3
T 1.7.4 1 C 4 3
T 1.7.5 1 C 5 3
T 1.7.6 1 C 4 3
T 1.7.7 1 C 5 3
T 1.7.8 1 C 4 3
T 1.7.9 1 C 4 3
T 1.7.10 1 C 3 3
T 1.7.11 1 C 3 3
T 1.7.12 1 C 4 3
T 1.7.13 1 C 5 3
T 1.7.14 1 C 3 3
T 1.7.15 1 C 4 3
T 1.7.16 1 C 4 3
T 1.7.17 1 C 4 3
T 1.7.18 1 C 4 3
T 1.7.19 1 C 4 3
T 1.7.20 1 C 4 3
T 1.7.21 1 C 4 3
T 1.7.22 1 C 3 3
T 1.7.23 1 C 2 3
T 1.7.24 1 C 4 3
T 1.7.25 1 C 4 3
T 1.7.26 1 C 4 3
T 1.7.27 1 C 5 3
T 1.7.28 1 C 5 3
T 1.7.29 1 C 3 3
T 1.7.29 1 C 3 3
T 1.7.29 1 C 3 3
T 1.7.30 1 C 3 3
T 1.7.30 1 C 3 3
T 1.7.31 1 C 3 3
T 1.7.31 1 C 3 3
T 1.7.32 1 C 3 3
T 1.7.32 1 C 3 3
T 1.7.32 1 C 3 3
T 1.7.33 1 C 5 3
T 1.7.34 1 C 3 3
T 1.7.35 1 C 4 3
T 1.7.36 1 C 5 3
T 1.7.37 1 C 4 3
T 1.7.38 1 C 4 3
T 2.7.1 0 C 0 3
T 2.7.2 0 C 0 3
T 2.7.3 0 C 0 3
T 2.7.4 0 C 0 3
T 2.7.5 0 C 0 3
T 2.7.6 0 C 0 3
T 2.7.7 0 C 0 3
T 2.7.8 0 C 0 3
T 2.7.9 0 C 0 3
T 2.7.10 0 C 0 3
T 2.7.11 0 C 0 3
T 2.7.12 0 C 0 3
T 2.7.12 0 C 0 3
T 2.7.13 0 C 0 3
T 2.7.13 0 C 0 3
T 2.7.14 0 C 0 3
T 2.7.14 0 C 0 3
T 2.7.15 0 C 0 3
T 2.7.15 0 C 0 3
T 2.7.16 0 C 0 3
T 2.7.16 0 C 0 3
T 2.7.17 0 C 0 3
T 2.7.17 0 C 0 3
T 2.7.18 0 C 0 3
T 2.7.19 0 C 0 3
T 2.7.20 0 C 0 3
T 2.7.21 0 C 0 3
T 2.7.22 0 C 0 3
T 2.7.23 0 C 0 3
T 2.7.24 0 C 0 3
T 2.7.25 0 C 0 3
T 2.7.26 0 C 0 3
T 2.7.27 0 C 0 3
T 2.7.28 0 C 0 3
T 2.7.29 0 C 0 3
T 2.7.30 0 C 0 3
T 2.7.31 0 C 0 3
T 2.7.32 0 C 0 3
T 2.7.33 0 C 0 3
T 2.7.34 0 C 0 3
T 3.7.1 1 C 3 3
T 3.7.2 1 C 3 3
T 3.7.3 1 C 3 3
T 3.7.4 1 C 4 3
T 3.7.5 1 C 4 3
Removed, keep lines for backwards compatibility
T 3.7.6 1 C 5 3
T 3.7.7 1 C 3 3
T 3.7.8 1 C 5 3
T 3.7.9 1 C 4 3
T 3.7.10 1 C 4 3
T 3.7.11 1 C 4 3
T 3.7.11 1 C 4 3
T 3.7.12 1 C 4 3
T 3.7.13 1 C 4 3
T 3.7.14 1 C 4 3
T 3.7.15 1 C 4 3
T 3.7.15 1 C 4 3
T 3.7.16 1 C 5 3
T 3.7.16 1 C 5 3
T 3.7.17 1 C 5 3
T 3.7.17 1 C 5 3
T 3.7.18 1 C 5 3
T 3.7.18 1 C 5 3
T 3.7.19 1 C 2 3
T 3.7.19 1 C 2 3
T 3.7.20 1 C 3 3
T 3.7.20 1 C 3 3
T 3.7.21 1 C 4 3
T 3.7.21 1 C 4 3
T 3.7.22 1 C 4 3
T 3.7.22 1 C 4 3
T 3.7.23 1 C 3 3
T 3.7.23 1 C 3 3
T 3.7.24 1 C 3 3
T 3.7.24 1 C 3 3
T 3.7.25 1 C 3 3
T 3.7.25 1 C 3 3
T 3.7.26 1 C 3 3
T 3.7.27 1 C 5 3
T 3.7.28 1 C 4 3
T 3.7.29 1 C 4 3
T 3.7.30 1 C 4 3
T 3.7.31 1 C 5 3
T 3.7.32 1 C 4 3
T 3.7.33 1 C 4 3
T 3.7.34 1 C 2 3
T 3.7.35 1 C 3 3
T 3.7.36 1 C 3 3
T 3.7.37 1 C 3 3
T 3.7.38 1 C 4 3
T 3.7.39 1 C 4 3
T 3.7.40 1 C 3 3
T 3.7.41 1 C 3 3
T 3.7.42 1 C 4 3
T 3.7.43 1 C 3 3
T 3.7.44 1 C 5 3
T 3.7.45 1 C 4 3
T 3.7.46 1 C 5 3
T 3.7.47 1 C 5 3
T 3.7.48 1 C 5 3
T 3.7.49 1 C 4 3
T 4.7.12 1 C 2 3
T 4.7.17 1 C 3 3
T 4.7.18 1 C 4 3
T 4.7.19 1 C 4 3
T 4.7.20 1 C 4 3
T 4.7.21 1 C 4 3
T 4.7.22 1 C 5 3
T 4.7.23 1 C 4 3
Additions in version 2.3.3
M 1.7
M 1.7.1 2
M 1.7.2 2
M 1.7.3 1
M 1.7.4 1
M 1.7.5 2
M 1.7.6 1
M 1.7.7 2
M 1.9
M 1.9.1 2
M 1.9.2 1
M 1.9.3 2
M 1.9.4 1
M 1.9.5 2
M 1.10 1 M 3 3
M 1.11
M 1.11.1 2
M 1.11.2 2
M 1.11.3 2
M 1.11.4 2
M 1.11.5 1
S 2.16.36 1 C 4 3
S 1.16.12 1 C 4 3
S 1.16.13 1 C 4 3
S 1.16.14 1 C 5 3
S 1.16.15 1 C 5 3
S 1.16.16 1 C 5 3
S 1.16.17 1 C 5 3
S 1.16.18 1 C 5 3
S 1.16.19 1 C 4 3
S 1.16.20 1 C 5 3
S 1.16.21 1 C 5 3
S 1.16.22 1 C 4 3
S 1.16.23 1 C 3 3
S 1.16.24 1 C 5 3
S 1.16.25 1 C 5 3
S 1.16.26 1 C 5 3
S 1.16.27 1 C 5 3
T 2.7.12 0 C 0 3
S 1.9 1 M 3 3
S 1.9 1 M 3 3
S 1.16.21 1 C 5 3
S 1.16.21 1 C 5 3
S 2.17.1 1 C 3 3
S 2.17.5 1 C 5 3
S 2.17.15 1 C 4 3
S 2.17.16 1 C 4 3
T 1.4.1 1 M 4 3
T 1.4.2 1 M 4 3
T 1.4.3 1 M 4 3
T 1.4.4 1 M 5 3
T 1.4.5 1 M 5 3
T 2.4.1 0 M 0 3
T 2.4.2 0 M 0 3
T 2.4.3 0 M 0 3
T 2.4.4 0 M 0 3
T 2.4.5 0 M 0 3
T 3.4.1 1 M 4 3
T 3.4.2 1 M 4 3
T 3.4.3 1 M 5 3
T 3.4.4 1 M 5 3
T 3.4.5 1 M 5 3
T 4.4.1 1 M 3 3
T 4.4.2 1 M 3 3
T 4.4.3 1 M 3 3
T 4.4.4 1 M 5 3
T 4.4.5 1 M 5 3
S 1.7 1 M 4 3
S 2.9 1 M 2 3
S 3.7 1 M 5 3
S 6.7 1 M 0 3
S 7.7 1 M 5 3
P 2.4 1 M 5 3
S 7.15.18 1 C 5 3
B 4.11 1 M 4 3
T 1.4.2 1 M 4 3
T 2.4.2 0 M 0 3
T 3.4.2 1 M 4 3
T 4.4.2 1 M 3 3
3
NIST mapping (CSF 1.1)    NIST mapping (CSF 2.0)    in scope factor (SUM = MIN score)

ID.BE-5 ID.BE-5 GV.OC-04 GV.OC-04 1


ID.BE-5 ID.BE-5 GV.OC-04 GV.OC-04 1
ID.BE-5 ID.BE-5 GV.OC-04 GV.OC-04 1
ID.BE-5 ID.BE-5 GV.OC-04 GV.OC-04 1
ID.BE-4 ID.BE-4 GV.OC-04 GV.OC-04 1
0 0 5

ID.AM-6 ID.AM-6 GV.OC-02 GV.OC-02 1

ID.AM-6 ID.AM-6 GV.OC-02 GV.OC-02 1


ID.AM-6 ID.AM-6 GV.OC-02 GV.OC-02 1
ID.AM-6 ID.AM-6 GV.OC-02 GV.OC-02 1
ID.AM-6 ID.AM-6 GV.OC-02 GV.OC-02 1
ID.AM-6 ID.AM-6 GV.OC-02 GV.OC-02 1
0 0 6

ID.BE-3 ID.BE-3 GV.OC-04 GV.OC-04 1

ID.BE-3 ID.BE-3 GV.OC-04 GV.OC-04 1


ID.BE-3 ID.BE-3 GV.OC-04 GV.OC-04 1
ID.BE-3 ID.BE-3 GV.OC-04 GV.OC-04 1
0 0 4
ID.GV-3 ID.GV-3 GV.RM-01 GV.RM-01 1
ID.GV-1 ID.GV-1 GV.PO-01 GV.PO-01 1

ID.GV-1 ID.GV-1 GV.PO-02 GV.PO-02 1


ID.GV-3 ID.GV-3 GV.PO-02 GV.PO-02 1
ID.GV-2 ID.GV-2 ID.IM-01 ID.IM-01 1
ID.GV-4 ID.GV-4 GV.OC-02 GV.OC-02 1
0 0 9

ID.GV-3 ID.GV-3 GV.OC-03 GV.OC-03 1


ID.GV-3 ID.GV-3 GV.OC-03 GV.OC-03 1
PR.IP-6 PR.IP-6 1
PR.DS-5 PR.DS-5 1
ID.GV-3 ID.GV-3 GV.OC-03 GV.OC-03 1
ID.GV-3 ID.GV-3 GV.OC-03 GV.OC-03 1
ID.GV-3 ID.GV-3 GV.OC-03 GV.OC-03 1
ID.GV-3 ID.GV-3 GV.OC-03 GV.OC-03 1
0 0 10
GV.RR-03 GV.RR-03 1
1
1
1
1
1
0 0 8

ID.AM-6 ID.AM-6 GV.SC-02 GV.SC-02 1


ID.GV-2 ID.GV-2 1
DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1

ID.AM-6 ID.AM-6 GV.SC-02 GV.SC-02 1


DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1
ID.AM-6 ID.AM-6 GV.SC-02 GV.SC-02 1

ID.AM-6 ID.AM-6 GV.SC-02 GV.SC-02 1


ID.AM-6 ID.AM-6 GV.SC-02 GV.SC-02 1
ID.GV-2 ID.GV-2 1
DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1

ID.AM-6 ID.AM-6 GV.SC-02 GV.SC-02 1


PR.AT-5 PR.AT-5 1
DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1
ID.AM-6 ID.AM-6 GV.SC-02 GV.SC-02 1
ID.AM-6 ID.AM-6 GV.SC-02 GV.SC-02 1
0 0 8

1
1
1
1
1
PR.AT-1 PR.AT-1 PR.AT-01 PR.AT-01 1
PR.IP-11 PR.IP-11 GV.RR-04 GV.RR-04 1
1
1
1
0 0 14

1
1
0 0 8

PR.AT-1 PR.AT-1 PR.AT-02 PR.AT-02 1

1
PR.AT-1 PR.AT-1 PR.AT-02 PR.AT-02 1
PR.AT-1 PR.AT-1 PR.AT-02 PR.AT-02 1
PR.AT-1 PR.AT-1 PR.AT-02 PR.AT-02 1
1
PR.AT-1 PR.AT-1 PR.AT-02 PR.AT-02 1
0 0 7

1
1

1
1
0 0 7

PR.IP-10 PR.IP-10 ID.IM-02 ID.IM-02 1


1
1
1
1

1
PR.IP-3 PR.IP-3 PR.PS-01 PR.PS-01 1
1
1
ID.AM-08 ID.AM-08 1

PR.IP-5 PR.IP-5 PR.IR-02 PR.IR-02 1


PR.AC-5 PR.AC-5 PR.IR-01 PR.IR-01 1
PR.AC-2 PR.AC-2 PR.AA-06 PR.AA-06 1
1
1
1
1
1
1
1
1

1
1
0 0 31

1
1
1
1
1
1

1
1
1
0 0 17

ID.RA-3 ID.RA-3 ID.RA-03 ID.RA-03 1


ID.RA-4 ID.RA-4 ID.RA-04 ID.RA-04 1
ID.RA-5 ID.RA-5 ID.RA-05 ID.RA-05 1
ID.RA-3 ID.RA-3 ID.RA-03 1
ID.RA-4 ID.RA-4 ID.RA-04 ID.RA-04 1
ID.RA-5 ID.RA-5 ID.RA-05 ID.RA-05 1
ID.RA-3 ID.RA-3 ID.RA-03 1
ID.RA-4 ID.RA-4 ID.RA-04 ID.RA-04 1
ID.RA-5 ID.RA-5 ID.RA-05 ID.RA-05 1
ID.RA-3 ID.RA-3 ID.RA-03 1
ID.RA-4 ID.RA-4 ID.RA-04 ID.RA-04 1
ID.RA-5 ID.RA-5 ID.RA-05 ID.RA-05 1
ID.RM-1 ID.RM-1 GV.RM-03 GV.RM-03 1
ID.RA-3 ID.RA-3 ID.RA-03 1
ID.RA-4 ID.RA-4 ID.RA-04 ID.RA-04 1
ID.RA-5 ID.RA-5 ID.RA-05 ID.RA-05 1
ID.RA-3 ID.RA-3 ID.RA-03 1
ID.RA-4 ID.RA-4 ID.RA-04 ID.RA-04 1
ID.RA-5 ID.RA-5 ID.RA-05 ID.RA-05 1
ID.RA-3 ID.RA-3 ID.RA-03 1
ID.RA-4 ID.RA-4 ID.RA-04 ID.RA-04 1
ID.RA-5 ID.RA-5 ID.RA-05 ID.RA-05 1
ID.RA-3 ID.RA-3 ID.RA-03 1
ID.RA-4 ID.RA-4 ID.RA-04 ID.RA-04 1
ID.RA-5 ID.RA-5 ID.RA-05 ID.RA-05 1

ID.RA-3 ID.RA-3 ID.RA-03 1


ID.RA-4 ID.RA-4 ID.RA-04 ID.RA-04 1
ID.RA-5 ID.RA-5 ID.RA-05 ID.RA-05 1
ID.RA-3 ID.RA-3 ID.RA-03 1
ID.RA-4 ID.RA-4 ID.RA-04 ID.RA-04 1
ID.RA-5 ID.RA-5 ID.RA-05 ID.RA-05 1
ID.RA-3 ID.RA-3 ID.RA-03 1
ID.RA-4 ID.RA-4 ID.RA-04 ID.RA-04 1
ID.RA-5 ID.RA-5 ID.RA-05 ID.RA-05 1
0 0 20

DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1


DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1

1
1

PR.IR-04 PR.IR-04 1
PR.AT-5 PR.AT-5 PR.AT-02 PR.AT-02 1
PR.AT-5 PR.AT-5 PR.AT-02 PR.AT-02 1
ID.SC-3 ID.SC-3 GV.SC-04 GV.SC-04 1

PR.PT-5 PR.PT-5 PR.IR-03 PR.IR-03 1


PR.IP-4 PR.IP-4 PR.DS-11 PR.DS-11 1
PR.IP-4 PR.IP-4 PR.DS-11 PR.DS-11 1
PR.IP-9 PR.IP-9 ID.IM-04 ID.IM-04 1
PR.IP-10 PR.IP-10 ID.IM-02 ID.IM-02 1
PR.DS-7 PR.DS-7 PR.IR-01 PR.IR-01 1

1
PR.AC-4 PR.AC-4 PR.AA-05 PR.AA-05 1
1
PR.AC-4 PR.AC-4 PR.AA-05 PR.AA-05 1

0 0 38
0 0 22

DE.DP-1 GV.RR-02 1
DE.DP-1 GV.RR-02 1

1
1
PR.IR-04 1
PR.AT-5 PR.AT-02 1
PR.AT-5 PR.AT-02 1
ID.SC-3 GV.SC-04 1

PR.PT-5 PR.IR-03 1
PR.IP-4 PR.DS-11 1
PR.IP-4 PR.DS-11 1
PR.IP-9 ID.IM-04 1
PR.IP-10 ID.IM-02 1
PR.DS-7 PR.IR-01 1

PR.PT-3 PR.PS-04 1
PR.AC-4 PR.AA-05 1
PR.PT-3 PR.PS-04 1
PR.AC-4 PR.AA-05 1

0 0 0
0 0 0

DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1


DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1

1
1

PR.IR-04 PR.IR-04 1
PR.AT-5 PR.AT-5 PR.AT-02 PR.AT-02 1
PR.AT-5 PR.AT-5 PR.AT-02 PR.AT-02 1
ID.SC-3 ID.SC-3 GV.SC-04 GV.SC-04 1

PR.PT-5 PR.PT-5 PR.IR-03 PR.IR-03 1


PR.IP-4 PR.IP-4 PR.DS-11 PR.DS-11 1
PR.IP-4 PR.IP-4 PR.DS-11 PR.DS-11 1
PR.IP-9 PR.IP-9 ID.IM-04 ID.IM-04 1
PR.IP-10 PR.IP-10 ID.IM-02 ID.IM-02 1
PR.DS-7 PR.DS-7 PR.IR-01 PR.IR-01 1

PR.PT-3 PR.PT-3 PR.PS-04 PR.PS-04 1


PR.AC-4 PR.AC-4 PR.AA-05 PR.AA-05 1
PR.PT-3 PR.PT-3 PR.PS-04 PR.PS-04 1
PR.AC-4 PR.AC-4 PR.AA-05 PR.AA-05 1

0 0 49
0 0 22

DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1


DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1

1
1

PR.IR-04 PR.IR-04 1
PR.AT-5 PR.AT-5 PR.AT-02 PR.AT-02 1
PR.AT-5 PR.AT-5 PR.AT-02 PR.AT-02 1
ID.SC-3 ID.SC-3 GV.SC-04 GV.SC-04 1

PR.PT-5 PR.PT-5 PR.IR-03 PR.IR-03 1


PR.IP-4 PR.IP-4 PR.DS-11 PR.DS-11 1
PR.IP-4 PR.IP-4 PR.DS-11 PR.DS-11 1
PR.IP-9 PR.IP-9 ID.IM-04 ID.IM-04 1
PR.IP-10 PR.IP-10 ID.IM-02 ID.IM-02 1
PR.DS-7 PR.DS-7 PR.IR-01 PR.IR-01 1

PR.PT-3 PR.PT-3 PR.PS-04 PR.PS-04 1


PR.AC-4 PR.AC-4 PR.AA-05 PR.AA-05 1
PR.PT-3 PR.PT-3 PR.PS-04 PR.PS-04 1
PR.AC-4 PR.AC-4 PR.AA-05 PR.AA-05 1

1
DE.AE-07 DE.AE-07 1
1
1
1
1
1
1
1
1
1
1
1
1
1

0 0 24
0 0 22
DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
DE.CM-2 DE.CM-2 DE.CM-02 DE.CM-02 1
DE.CM-3 DE.CM-3 DE.CM-03 DE.CM-03 1
DE.CM-4 DE.CM-4 DE.CM-01 DE.CM-01 1
DE.CM-5 DE.CM-5 DE.CM-09 DE.CM-09 1
DE.CM-6 DE.CM-6 DE.CM-06 DE.CM-06 1
DE.CM-7 DE.CM-7 1
DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1

DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-4 DE.DP-4 DE.AE-06 DE.AE-06 1
DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1
DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1
DE.DP-2 DE.DP-2 1
PR.IP-9 PR.IP-9 ID.IM-04 ID.IM-04 1
PR.MA-1 PR.MA-1 ID.AM-08 ID.AM-08 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
DE.CM-2 DE.CM-2 DE.CM-02 DE.CM-02 1
DE.CM-3 DE.CM-3 DE.CM-03 DE.CM-03 1
DE.CM-4 DE.CM-4 DE.CM-01 DE.CM-01 1
DE.CM-5 DE.CM-5 DE.CM-09 DE.CM-09 1
DE.CM-6 DE.CM-6 DE.CM-06 DE.CM-06 1
DE.CM-7 DE.CM-7 1
DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1
DE.DP-2 DE.DP-2 1
DE.DP-5 DE.DP-5 ID.IM-03 ID.IM-03 1

DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1


DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
DE.CM-4 DE.CM-4 DE.CM-09 DE.CM-09 1
DE.CM-5 DE.CM-5 1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
DE.AE-5 DE.AE-5 DE.AE-06 DE.AE-06 1
DE.AE-5 DE.AE-5 DE.AE-06 DE.AE-06 1
PR.DS-4 PR.DS-4 PR.IR-04 PR.IR-04 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
DE.CM-7 DE.CM-7 DE.CM-03 DE.CM-03 1
DE.CM-1 DE.CM-1 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
PR.DS-5 PR.DS-5 PR.DS-01 PR.DS-01 1
PR.DS-5 PR.DS-5 PR.DS-01 PR.DS-01 1
DE.CM-6 DE.CM-6 DE.CM-06 DE.CM-06 1
DE.CM-2 DE.CM-2 DE.CM-02 DE.CM-02 1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1

0 0 27
0 0 14

1
RS.CO-1 RS.CO-1 PR.AT-01 PR.AT-01 1

RS.IM-1 RS.IM-1 ID.IM-03 ID.IM-03 1


1
RS.CO-2 RS.CO-2 RS.CO-02 RS.CO-02 1
RS.CO-3 RS.CO-3 RS.CO-03 RS.CO-03 1
RS.CO-4 RS.CO-4 RS.MA-01 RS.MA-01 1
RS.CO-5 RS.CO-5 RS.MA-01 RS.MA-01 1
1
RS.CO-1 RS.CO-1 PR.AT-01 PR.AT-01 1
1
RS.CO-1 RS.CO-1 PR.AT-01 PR.AT-01 1
RS.MI-1 RS.MI-1 RS.MI-01 RS.MI-01 1
RS.MI-2 RS.MI-2 RS.MI-02 RS.MI-02 1
1
RS.RP-1 RS.RP-1 RS.MA-01 RS.MA-01 1
RS.IM-1 RS.IM-1 ID.IM-03 ID.IM-03 1
RS.IM-1 RS.IM-1 ID.IM-03 ID.IM-03 1
RS.IM-2 RS.IM-2 1

RS.CO-2 RS.CO-2 RS.CO-02 RS.CO-02 1


RS.MI-2 RS.MI-2 RS.MI-02 RS.MI-02 1
RS.AN-1 RS.AN-1 RS.MA-02 RS.MA-02 1
RS.AN-2 RS.AN-2 RS.MA-04 RS.MA-04 1
RS.AN-3 RS.AN-3 RS.AN-03 RS.AN-03 1
RS.MI-1 RS.MI-1 RS.MI-01 RS.MI-01 1
PR.AT-5 PR.AT-5 PR.AT-02 PR.AT-02 1
RS.RP-1 RS.RP-1 RS.MA-01 RS.MA-01 1
DE.DP-3 DE.DP-3 ID.IM-02 ID.IM-02 1
RS.CO-1 RS.CO-1 PR.AT-01 PR.AT-01 1
RS.CO-1 RS.CO-1 PR.AT-01 PR.AT-01 1
RS.CO-2 RS.CO-2 RS.CO-02 RS.CO-02 1
RS.CO-2 RS.CO-2 RS.CO-02 RS.CO-02 1
RS.AN-1 RS.AN-1 RS.MA-02 RS.MA-02 1
RS.AN-2 RS.AN-2 RS.MA-02 RS.MA-02 1
DE.AE-4 DE.AE-4 DE.AE-04 DE.AE-04 1
RS.AN-2 RS.AN-2 RS.MA-02 RS.MA-02 1
DE.AE-4 DE.AE-4 DE.AE-04 DE.AE-04 1
RS.AN-4 RS.AN-4 RS.MA-03 RS.MA-03 1
RS.CO-4 RS.CO-4 RS.MA-01 RS.MA-01 1
RS.CO-4 RS.CO-4 RS.MA-01 RS.MA-01 1
RS.CO-2 RS.CO-2 RS.CO-02 RS.CO-02 1
RS.CO-4 RS.CO-4 RS.MA-01 RS.MA-01 1
RS.CO-4 RS.CO-4 RS.MA-01 RS.MA-01 1
RS.CO-2 RS.CO-2 RS.CO-02 RS.CO-02 1
1
RS.AN-3 RS.AN-3 RS.AN-03 RS.AN-03 1
RS.MI-1 RS.MI-1 RS.MI-01 RS.MI-01 1
RS.MI-2 RS.MI-2 RS.MI-02 RS.MI-02 1
RS.MI-1 RS.MI-1 RS.MI-01 RS.MI-01 1
RS.MI-2 RS.MI-2 RS.MI-02 RS.MI-02 1
RS.MI-1 RS.MI-1 RS.MI-01 RS.MI-01 1
RS.MI-2 RS.MI-2 RS.MI-02 RS.MI-02 1
RS.IM-1 RS.IM-1 ID.IM-03 ID.IM-03 1
RS.CO-2 RS.CO-2 RS.CO-02 RS.CO-02 1
RS.MI-2 RS.MI-2 RS.MI-02 RS.MI-02 1
RS.IM-1 RS.IM-1 ID.IM-03 ID.IM-03 1
RS.IM-2 RS.IM-2 1

0 0 36
0 0 14

DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1


RS.AN-1 RS.AN-1 RS.MA-02 RS.MA-02 1
RS.AN-3 RS.AN-3 RS.AN-03 RS.AN-03 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1

DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-4 DE.DP-4 DE.AE-06 DE.AE-06 1
DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1
DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1
DE.DP-2 DE.DP-2 1
PR.IP-9 PR.IP-9 ID.IM-04 ID.IM-04 1
PR.MA-1 PR.MA-1 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
RS.AN-1 RS.AN-1 RS.MA-02 RS.MA-02 1
RS.AN-3 RS.AN-3 RS.AN-03 RS.AN-03 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.DP-2 DE.DP-2 1
DE.DP-5 DE.DP-5 ID.IM-03 ID.IM-03 1

DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1


DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
RS.AN-3 RS.AN-3 RS.AN-03 RS.AN-03 1
RS.AN-3 RS.AN-3 RS.AN-03 RS.AN-03 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
RS.AN-3 RS.AN-3 RS.AN-03 RS.AN-03 1
RS.AN-3 RS.AN-3 RS.AN-03 RS.AN-03 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
RS.CO-2 RS.CO-2 RS.CO-02 RS.CO-02 1
RS.AN-3 RS.AN-3 RS.AN-03 RS.AN-03 1
RS.AN-3 RS.AN-3 RS.AN-03 RS.AN-03 1
RS.AN-3 RS.AN-3 RS.AN-03 RS.AN-03 1

0 0 24
0 0 14

ID.RA-3 ID.RA-3 ID.RA-03 ID.RA-03 1

1
1
1
1
PR.IR-04 PR.IR-04 1
ID.RA-5 ID.RA-5 1
PR.IP-9 PR.IP-9 ID.IM-04 ID.IM-04 1
PR.MA-1 PR.MA-1 1
1
ID.RA-3 ID.RA-3 ID.RA-03 ID.RA-03 1
1
1

ID.RA-2 ID.RA-2 ID.RA-02 ID.RA-02 1


ID.RA-2 ID.RA-2 ID.RA-02 ID.RA-02 1
ID.RA-2 ID.RA-2 ID.RA-02 ID.RA-02 1
ID.RA-2 ID.RA-2 ID.RA-02 ID.RA-02 1
ID.RA-2 ID.RA-2 ID.RA-02 ID.RA-02 1
ID.RA-2 ID.RA-2 ID.RA-02 ID.RA-02 1
ID.RA-2 ID.RA-2 ID.RA-02 ID.RA-02 1
ID.RA-2 ID.RA-2 ID.RA-02 ID.RA-02 1
ID.RA-2 ID.RA-2 ID.RA-02 ID.RA-02 1
ID.RA-3 ID.RA-3 ID.RA-03 ID.RA-03 1
ID.RA-3 ID.RA-3 ID.RA-03 ID.RA-03 1
ID.RA-3 ID.RA-3 ID.RA-03 ID.RA-03 1
ID.RA-3 ID.RA-3 ID.RA-03 ID.RA-03 1
1
ID.RA-3 ID.RA-3 ID.RA-03 ID.RA-03 1
ID.RA-3 ID.RA-3 ID.RA-03 ID.RA-03 1
ID.RA-3 ID.RA-3 ID.RA-03 ID.RA-03 1
ID.RA-3 ID.RA-3 ID.RA-03 ID.RA-03 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
1
1
1
ID.RA-5 ID.RA-5 ID.RA-05 ID.RA-05 1
1
1
ID.RA-2 ID.RA-2 ID.RA-02 ID.RA-02 1
ID.RA-2 ID.RA-2 ID.RA-02 ID.RA-02 1
ID.RA-2 ID.RA-2 ID.RA-02 ID.RA-02 1
ID.RA-2 ID.RA-2 ID.RA-02 ID.RA-02 1

0 0 31
0 0 13

1
ID.RA-3 ID.RA-3 ID.RA-03 ID.RA-03 1
1
1
1
1
PR.IR-04 PR.IR-04 1
1
PR.IP-9 PR.IP-9 ID.IM-04 ID.IM-04 1
PR.MA-1 PR.MA-1 1
1
1
1
DE.DP-5 DE.DP-5 ID.IM-03 ID.IM-03 1

DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1


DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
1
1
1
DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1
DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1
DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1
1
1
1
1

0 0 21
0 0 14

PR.IP-12 PR.IP-12 1
ID.RA-1 ID.RA-1 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
PR.IP-9 PR.IP-9 ID.IM-04 ID.IM-04 1
PR.MA-1 PR.MA-1 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
ID.RA-1 ID.RA-1 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1

DE.CM-8 DE.CM-8 ID.RA-01 ID.RA-01 1


ID.AM-1 ID.AM-1 ID.AM-01 ID.AM-01 1
DE.CM-8 DE.CM-8 1
ID.RA-1 ID.RA-1 ID.RA-01 ID.RA-01 1
ID.RA-5 ID.RA-5 ID.RA-05 ID.RA-05 1
ID.RA-1 ID.RA-1 ID.RA-01 ID.RA-01 1
RS.MI-3 RS.MI-3 ID.RA-06 ID.RA-06 1
DE.CM-8 DE.CM-8 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
ID.RA-5 ID.RA-5 ID.RA-01 ID.RA-01 1
ID.RA-1 ID.RA-1 ID.RA-01 ID.RA-01 1
RS.MI-3 RS.MI-3 ID.RA-06 ID.RA-06 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
ID.RA-1 ID.RA-1 ID.RA-01 ID.RA-01 1
ID.AM-2 ID.AM-2 ID.AM-02 ID.AM-02 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
ID.RA-1 ID.RA-1 ID.RA-01 ID.RA-01 1
DE.CM-8 DE.CM-8 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
DE.CM-8 DE.CM-8 ID.RA-01 ID.RA-01 1
DE.CM-8 DE.CM-8 ID.RA-01 ID.RA-01 1
DE.CM-8 DE.CM-8 ID.RA-01 ID.RA-01 1

0 0 20
0 0 13

PR.PT-1 PR.PT-1 PR.PS-04 PR.PS-04 1

PR.PT-1 PR.PT-1 PR.PS-04 PR.PS-04 1


PR.PT-1 PR.PT-1 PR.PS-04 PR.PS-04 1
PR.PT-1 PR.PT-1 PR.PS-04 PR.PS-04 1
PR.PT-1 PR.PT-1 PR.PS-04 PR.PS-04 1
PR.PT-1 PR.PT-1 PR.PS-04 PR.PS-04 1
PR.PT-1 PR.PT-1 PR.PS-04 PR.PS-04 1
PR.IP-9 PR.IP-9 ID.IM-04 ID.IM-04 1
PR.MA-1 PR.MA-1 1
PR.PT-1 PR.PT-1 PR.PS-04 PR.PS-04 1
PR.PT-1 PR.PT-1 PR.PS-04 PR.PS-04 1
PR.PT-1 PR.PT-1 PR.PS-04 PR.PS-04 1
PR.PT-1 PR.PT-1 PR.PS-04 PR.PS-04 1

DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1


DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1
DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1
DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1
DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1
DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1
DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1
PR.DS-4 PR.DS-4 PR.DS-01 PR.DS-01 1
PR.DS-2 PR.DS-2 PR.DS-02 PR.DS-02 1
DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1
PR.DS-2 PR.DS-2 PR.DS-01 PR.DS-01 1
DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1
1
1
1
PR.DS-1 PR.DS-1 PR.DS-01 PR.DS-01 1
ID.GV-3 ID.GV-3 GV.PO-01 GV.PO-01 1
PR.PT-1 PR.PT-1 PR.PS-04 PR.PS-04 1
ID.GV-3 ID.GV-3 GV.PO-01 GV.PO-01 1
PR.IP-6 PR.IP-6 1
ID.GV-3 ID.GV-3 GV.OC-03 GV.OC-03 1

0 0 20
0 0 13

BACKWARDS COMPATIBILITY WITH VERSION 2.0


ID.SC-4 ID.SC-4 GV.SC-07 GV.SC-07 1
DE.CM-6 DE.CM-6 DE.CM-06 DE.CM-06 1
ID.SC-4 ID.SC-4 GV.SC-07 GV.SC-07 1
DE.CM-6 DE.CM-6 DE.CM-06 DE.CM-06 1
ID.SC-5 ID.SC-5 GV.SC-08 GV.SC-08 1
ID.SC-3 ID.SC-3 GV.SC-04 GV.SC-04 1
PR.IP-10 PR.IP-10 ID.IM-02 ID.IM-02 1
ID.SC-5 ID.SC-5 GV.SC-08 GV.SC-08 1
RS.IM-1 RS.IM-1 ID.IM-03 ID.IM-03 1
RS.AN-5 RS.AN-5 ID.RA-08 ID.RA-08 1

1
ID.RA-2 ID.RA-2 ID.RA-02 ID.RA-02 1
1
ID.GV-1 ID.GV-1 GV.PO-01 GV.PO-01 1
1
ID.GV-1 ID.GV-1 GV.PO-01 GV.PO-01 1
1
ID.GV-1 ID.GV-1 GV.PO-01 GV.PO-01 1
ID.GV-1 ID.GV-1 GV.OC-03 GV.OC-03 1

1
1
1
1
1
1
1

1
1
1
1
1

PR.AT-1 PR.AT-1 PR.AT-01 PR.AT-01 1


1
1
1
PR.AT-1 PR.AT-1 PR.AT-01 PR.AT-01 1
1
1
1
1
1
1
DE.DP-5 DE.DP-5 ID.IM-03 ID.IM-03 1
1
DE.DP-5 DE.DP-5 ID.IM-03 ID.IM-03 1
DE.DP-5 DE.DP-5 ID.IM-03 ID.IM-03 1

1
1
DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1
1
1
1
PR.DS-7 PR.DS-7 PR.IR-01 PR.IR-01 1
1
1
1
DE.DP-3 DE.DP-3 ID.IM-02 ID.IM-02 1
DE.DP-3 DE.DP-3 ID.IM-02 ID.IM-02 1
DE.DP-3 DE.DP-3 ID.IM-02 ID.IM-02 1
DE.DP-3 DE.DP-3 ID.IM-02 ID.IM-02 1
1
DE.DP-5 DE.DP-5 ID.IM-03 ID.IM-03 1
0 0 18

PR.MA-1 PR.MA-1 ID.AM-08 ID.AM-08 1


PR.MA-2 PR.MA-2 ID.AM-08 ID.AM-08 1
PR.MA-1 PR.MA-1 ID.AM-08 ID.AM-08 1
PR.MA-1 PR.MA-1 ID.AM-08 ID.AM-08 1
PR.MA-1 PR.MA-1 ID.AM-08 ID.AM-08 1
PR.MA-1 ID.AM-08 1
PR.MA-2 ID.AM-08 1
PR.MA-1 ID.AM-08 1
PR.MA-1 ID.AM-08 1
PR.MA-1 ID.AM-08 1
PR.MA-1 PR.MA-1 ID.AM-08 ID.AM-08 1
PR.MA-2 PR.MA-2 ID.AM-08 ID.AM-08 1
PR.MA-1 PR.MA-1 ID.AM-08 ID.AM-08 1
PR.MA-1 PR.MA-1 ID.AM-08 ID.AM-08 1
PR.MA-1 PR.MA-1 ID.AM-08 ID.AM-08 1
PR.MA-1 PR.MA-1 ID.AM-08 ID.AM-08 1
PR.MA-2 PR.MA-2 ID.AM-08 ID.AM-08 1
PR.MA-1 PR.MA-1 ID.AM-08 ID.AM-08 1
PR.MA-1 PR.MA-1 ID.AM-08 ID.AM-08 1
PR.MA-1 PR.MA-1 ID.AM-08 ID.AM-08 1

1
1
1
1
1
1
1
ID.RA-5 ID.RA-5 ID.RA-05 ID.RA-05 1
1

GV.OV-03 GV.OV-03 1
ID.GV-1 ID.GV-1 GV.PO-02 GV.PO-02 1
ID.GV-2 ID.GV-2 1

1
1

1
ID.IM-03 ID.IM-03 1
DE.DP-5 DE.DP-5 ID.IM-03 ID.IM-03 1

PR.IP-9 PR.IP-9 ID.IM-04 ID.IM-04 1

PR.IP-10 PR.IP-10 ID.IM-02 ID.IM-02 1


PR.IP-10 PR.IP-10 ID.IM-02 ID.IM-02 1
1

DE.DP-3 DE.DP-3 ID.IM-02 ID.IM-02 1


DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1
DE.DP-3 DE.DP-3 ID.IM-02 ID.IM-02 1
DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1

PR.AC-4 PR.AC-4 PR.AA-05 PR.AA-05 1


PR.AC-4 PR.AA-05 1
PR.AC-4 PR.AC-4 PR.AA-05 PR.AA-05 1
PR.AC-4 PR.AC-4 PR.AA-05 PR.AA-05 1
DE.AE-5 DE.AE-5 DE.AE-06 DE.AE-06 1
DE.AE-5 DE.AE-5 DE.AE-06 DE.AE-06 1
1
1
1
1
1
1
1
1
1
DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1
DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1
1
1
DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1
DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1
1
1
1
1
1
DE.AE-07 DE.AE-07 1
1
1
1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
DE.AE-1 DE.AE-1 ID.AM-03 ID.AM-03 1
DE.CM-3 DE.CM-3 DE.CM-03 DE.CM-03 1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
DE.AE-1 DE.AE-1 ID.AM-03 ID.AM-03 1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
DE.AE-1 DE.AE-1 ID.AM-03 ID.AM-03 1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
DE.AE-1 DE.AE-1 ID.AM-03 ID.AM-03 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
1
1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
1
1
1
1
1
DE.CM-7 ID.AM-08 1
1
1
1
PR.PS-05 1
PR.PS-05 1
DE.CM-1 DE.CM-01 1
DE.DP-2 DE.AE-02 1
DE.CM-1 DE.CM-01 1
DE.DP-2 DE.AE-02 1
DE.CM-1 DE.CM-01 1
DE.DP-2 DE.AE-02 1
DE.CM-1 DE.CM-01 1
DE.DP-2 DE.AE-02 1
DE.CM-1 DE.CM-01 1
DE.DP-2 DE.AE-02 1
DE.CM-1 DE.CM-01 1
DE.DP-2 DE.AE-02 1
1
1
DE.AE-07 1
1
1
DE.AE-3 DE.AE-03 1
1
1
1
DE.AE-1 ID.AM-03 1
1
1
1
DE.AE-2 DE.AE-02 1
DE.AE-2 DE.AE-02 1
DE.AE-2 DE.AE-02 1
DE.AE-2 DE.AE-02 1
1
1
1
1
1

1
1
1
1
1
DE.CM-4 DE.CM-4 DE.CM-09 DE.CM-09 1
DE.CM-5 DE.CM-5 1
1
DE.CM-4 DE.CM-4 DE.CM-09 DE.CM-09 1
1
1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
1
1
1
1
1
1
DE.AE-07 DE.AE-07 1
1
1
1
DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1
1
1
1
1
1
1
1
1
1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
1
1
1
1
1
1
1
1

RC.RP-03 RC.RP-03 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 RC.RP-03 1
DE.AE-02 1
PR.PS-03 PR.PS-03 1
PR.PS-02 PR.PS-02 1
PR.DS-02 PR.DS-02 1
PR.DS-10 PR.DS-10 1
RS.AN-06 RS.AN-06 1
RS.AN-07 RS.AN-07 1
RS.AN-08 RS.AN-08 1
RS.AN-08 RS.AN-08 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 1
PR.PS-03 1
PR.PS-03 1
PR.PS-03 1
PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.IR-04 PR.IR-04 1
PR.IR-04 PR.IR-04 1
PR.IR-04 PR.IR-04 1
PR.IR-04 PR.IR-04 1
PR.IR-04 PR.IR-04 1
GV.RR-03 GV.RR-03 1
GV.PO-01 GV.PO-01 1
GV.SC-02 GV.SC-02 1
DE.CM-06 DE.CM-06 1
DE.CM-06 1
DE.CM-06 DE.CM-06 1
DE.CM-06 DE.CM-06 1
total score MAX score final score

4 5 75
3 5 50
3 5 50
5 5 100
4 5 75
19 25 70
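The subtotal rows in this column group (such as `19 25 70` above) pair a summed total of 1–5 maturity answers with the section maximum and a normalized final score. The percentages are consistent with rescaling the average answer from the 1–5 scale onto 0–100 (an average of 1 scores 0, an average of 5 scores 100). A minimal sketch of that normalization, assuming this interpretation — the function name is illustrative, not part of the SOC-CMM workbook:

```python
def final_score(total, max_score):
    """Rescale a section's summed 1-5 maturity answers to a 0-100 score.

    The average answer is total / max_score * 5; mapping the 1-5 range
    onto 0-100 gives (average - 1) / 4 * 100.
    """
    if max_score == 0:
        return None  # mirrors the #DIV/0! cells for sections with no questions
    average = total / max_score * 5
    return round((average - 1) / 4 * 100, 2)

# Reproduces subtotal rows from this sheet:
# final_score(19, 25)   -> 70.0
# final_score(34, 45)   -> 69.44
# final_score(107, 155) -> 61.29
```

The same formula reproduces the other subtotal rows below (e.g. 28/30 → 91.67, 25/50 → 37.5), which suggests the workbook applies it uniformly per section.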

5 5

4 5
4 5
5 5
5 5
5 5
28 30 91.67

4 5

3 5
4 5
5 5
16 20 75
3 5
3 5

5 5

5 5
5 5
3 5
4 5
34 45 69.44

3 5
2 5
2 5
2 5
2 5
2 5
3 5
3 5
25 50 37.5
3 5
2 5
2 5
3 5
5 5
4 5
28 40 62.5

5 5
5 5
5 5

5 5
5 5
5 5

5 5
4 5
4 5
4 5

5 5
5 5
5 5
4 5
3 5
36 40 87.5

2 5
3 5
1 5
4 5
2 5
3 5
3 5
4 5
5 5
5 5
50 70 64.29

5 5

4 5
4 5
35 40 84.38

4 5

1 5
5 5
3 5
5 5
2 5
2 5
22 35 53.57

4 5
4 5

5 5
4 5
26 35 67.86

3 5
3 5
3 5
4 5
4 5

4 5
4 5
4 5
4 5
5 5

3 5
3 5
3 5
4 5
2 5
3 5
5 5
5 5
3 5
3 5
3 5

2 5
2 5
107 155 61.29

5 5
4 5
4 5
5 5
3 5
4 5

4 5
3 5
4 5
66 85 72.06

3 5
3 5
3 5
4 5
4 5
4 5
3 5
3 5
3 5
4 5
4 5
4 5
4 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5

4 5
4 5
4 5
5 5
5 5
5 5
5 5
5 5
5 5
76 100 70

4 5
4 5

4 5
4 5

5 5
5 5
5 5
4 5

4 5
4 5
4 5
4 5
3 5
2 5

4 5
4 5
4 5
4 5

150 190 73.68


89 110 76.14

0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5

0 0 N/A
0 0 N/A

4 5
4 5

4 5
4 5

3 5
5 5
5 5
5 5

3 5
3 5
4 5
3 5
3 5
3 5

3 5
3 5
3 5
3 5

188 245 70.92


86 110 72.73

4 5
4 5

4 5
4 5

3 5
3 5
5 5
5 5

4 5
4 5
4 5
4 5
4 5
5 5

5 5
5 5
4 5
4 5

4 5
4 5
4 5
4 5
3 5
4 5
4 5
4 5
4 5
4 5
3 5
3 5
4 5
3 5
5 5

5 5

92 120 70.83
89 110 76.14
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5

4 5
4 5
4 5
4 5
4 5
3 5
3 5
3 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5

4 5
4 5
4 5
4 5
4 5
4 5
4 5
3 5
5 5
4 5
4 5
5 5
5 5
5 5
5 5
5 5
4 5
5 5
5 5
4 5
3 5
5 5
4 5
4 5
4 5

118 135 84.26


54 70 71.43

5 5
4 5

2 5
3 5
3 5
3 5
3 5
3 5
3 5
2 5
2 5
3 5
3 5
3 5
2 5
3 5
3 5
3 5
3 5

3 5
3 5
4 5
4 5
5 5
4 5
3 5
4 5
3 5
4 5
4 5
4 5
3 5
4 5
4 5
4 5
4 5
4 5
3 5
5 5
4 5
3 5
2 5
2 5
3 5
3 5
3 5
2 5
2 5
1 5
1 5
2 5
2 5
2 5
3 5
3 5
3 5
3 5

117 180 56.25


41 70 48.21

5 5
5 5
5 5
5 5

5 5
5 5
4 5
4 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5

5 5
5 5
3 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
3 5

116 120 95.83


67 70 94.64

0 5

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5

0 155 0
0 65 0

0 5
0 5
0 5
3 5
4 5
3 5
0 5
0 5
0 5
0 5
3 5
3 5
3 5
0 5

0 5
3 5
0 5
2 5
0 5
3 5
0 5
3 5
0 5
3 5
3 5
0 5
0 5
0 5
0 5
0 5
3 5
3 5
3 5
3 5
2 5

31 105 11.9
22 70 14.29

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
2 5
2 5
3 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5

0 100 0
8 65 0

5 5

5 5
5 5
4 5
5 5
5 5
5 5
3 5
3 5
5 5
5 5
5 5
5 5

0 5
5 5
5 5
5 5
4 5
5 5
5 5
4 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5

93 100 91.25
62 65 94.23

3 5
5 5
4 5
5 5
2 5
3 5
3 5
3 5
3 5
0 5

4 5
4 5
4 5
4 5
2 5
2 5
2 5
2 5
3 5

4 5
5 5
5 5
5 5
3 5
5 5
4 5

3 5
2 5
4 5
4 5
4 5

4 5
4 5
3 5
3 5
5 5
3 5
5 5
3 5
3 5
5 5
5 5
3 5
3 5
4 5
5 5

3 5
3 5
3 5
4 5
4 5
4 5
4 5
3 5
3 5
4 5
4 5
5 5
4 5
3 5
4 5
3 5
64 90 63.89

4 5
4 5
4 5
5 5
5 5
0 5
0 5
0 5
0 5
0 5
4 5
4 5
5 5
5 5
5 5
3 5
3 5
3 5
5 5
5 5

4 5
3 5
4 5
0 5
3 5
3 5
5 5
0 5
0 5

2 5
2 5
2 5

4 5

5 5
5 5

4 5
3 5
3 5

3 5

3 5
5 5
4 5

4 5

3 5
3 5
3 5
3 5

3 5
0 5
4 5
4 5
4 5
5 5
5 5
4 5
5 5
4 5
5 5
4 5
4 5
3 5
3 5
4 5
5 5
3 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
3 5
2 5
4 5
4 5
4 5
5 5
5 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
5 5
3 5
4 5
5 5
4 5
4 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
3 5
3 5
3 5
4 5
4 5

5 5
3 5
5 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
5 5
5 5
5 5
5 5
5 5
5 5
2 5
2 5
3 5
3 5
4 5
4 5
4 5
4 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
5 5
4 5
4 5
4 5
5 5
4 5
4 5
2 5
3 5
3 5
3 5
4 5
4 5
3 5
3 5
4 5
3 5
5 5
4 5
5 5
5 5
5 5
4 5
2 5
3 5
4 5
4 5
4 5
4 5
5 5
4 5

3 5

4 5
4 5
4 5
5 5
5 5
5 5
5 5
5 5
4 5
5 5
5 5
4 5
3 5
5 5
5 5
5 5
5 5
0 5
3 5
3 5
5 5
5 5
3 5
5 5
4 5
4 5
4 5
4 5
4 5
5 5
5 5
0 5
0 5
0 5
0 5
0 5
4 5
4 5
5 5
5 5
5 5
3 5
3 5
3 5
5 5
5 5
4 5
2 5
5 5
0 5
5 5
5 5
5 5
4 5
4 5
0 5
4 5
3 5
remarks

not used in calculations, but to determine 2.1

not used in calculations, but to determine 3.1


not used in calculations, but to determine 4.2

NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
Not part of scoring

NIST MAPPING

NIST MAPPING
NIST MAPPING

NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING

NIST MAPPING
NIST MAPPING

NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING

NIST MAPPING
NIST MAPPING

NIST MAPPING
NIST MAPPING

NIST MAPPING
NIST MAPPING

NIST MAPPING
NIST MAPPING

NIST MAPPING
NIST MAPPING

NIST MAPPING
NIST MAPPING

NIST MAPPING
NIST MAPPING
NIST MAPPING

NIST MAPPING
NIST MAPPING

NIST MAPPING
NIST MAPPING

NIST MAPPING
NIST MAPPING

NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING

NIST MAPPING

NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING

NIST MAPPING
NIST MAPPING
NIST MAPPING

NIST MAPPING
NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING
NIST MAPPING

Note, maturity score can be overruled in S 2.2.2

NIST MAPPING
NIST MAPPING
NIST MAPPING

NIST MAPPING

NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING

NIST MAPPING
NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING
NIST MAPPING

NIST MAPPING

NIST MAPPING
NIST MAPPING
NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING

NIST MAPPING
NIST MAPPING
NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING
NIST MAPPING
NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING
NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING

NIST MAPPING
not used in calculations, but to determine 1.6

not used in calculations, but to determine 1.8

NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING

NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST CSF
NIST CSF version Function Category

1.1 IDENTIFY (ID) Asset Management (ID.AM)


1.1 IDENTIFY (ID) Asset Management (ID.AM)
1.1 IDENTIFY (ID) Asset Management (ID.AM)
1.1 IDENTIFY (ID) Asset Management (ID.AM)
1.1 IDENTIFY (ID) Asset Management (ID.AM)
1.1 IDENTIFY (ID) Asset Management (ID.AM)

1.1 IDENTIFY (ID) Business Environment (ID.BE)


1.1 IDENTIFY (ID) Business Environment (ID.BE)
1.1 IDENTIFY (ID) Business Environment (ID.BE)
1.1 IDENTIFY (ID) Business Environment (ID.BE)
1.1 IDENTIFY (ID) Business Environment (ID.BE)

1.1 IDENTIFY (ID) Governance (ID.GV)


1.1 IDENTIFY (ID) Governance (ID.GV)
1.1 IDENTIFY (ID) Governance (ID.GV)
1.1 IDENTIFY (ID) Governance (ID.GV)

1.1 IDENTIFY (ID) Risk Assessment (ID.RA)


1.1 IDENTIFY (ID) Risk Assessment (ID.RA)
1.1 IDENTIFY (ID) Risk Assessment (ID.RA)
1.1 IDENTIFY (ID) Risk Assessment (ID.RA)
1.1 IDENTIFY (ID) Risk Assessment (ID.RA)
1.1 IDENTIFY (ID) Risk Assessment (ID.RA)

1.1 IDENTIFY (ID) Risk Management Strategy (ID.RM)


1.1 IDENTIFY (ID) Risk Management Strategy (ID.RM)
1.1 IDENTIFY (ID) Risk Management Strategy (ID.RM)

1.1 IDENTIFY (ID) Supply Chain Risk Management (ID.SC)


1.1 IDENTIFY (ID) Supply Chain Risk Management (ID.SC)
1.1 IDENTIFY (ID) Supply Chain Risk Management (ID.SC)
1.1 IDENTIFY (ID) Supply Chain Risk Management (ID.SC)
1.1 IDENTIFY (ID) Supply Chain Risk Management (ID.SC)

1.1 PROTECT (PR) Access Control (PR.AC)


1.1 PROTECT (PR) Access Control (PR.AC)
1.1 PROTECT (PR) Access Control (PR.AC)
1.1 PROTECT (PR) Access Control (PR.AC)
1.1 PROTECT (PR) Access Control (PR.AC)
1.1 PROTECT (PR) Access Control (PR.AC)
1.1 PROTECT (PR) Access Control (PR.AC)

1.1 PROTECT (PR) Awareness and Training (PR.AT)


1.1 PROTECT (PR) Awareness and Training (PR.AT)
1.1 PROTECT (PR) Awareness and Training (PR.AT)
1.1 PROTECT (PR) Awareness and Training (PR.AT)
1.1 PROTECT (PR) Awareness and Training (PR.AT)

1.1 PROTECT (PR) Data Security (PR.DS)


1.1 PROTECT (PR) Data Security (PR.DS)
1.1 PROTECT (PR) Data Security (PR.DS)
1.1 PROTECT (PR) Data Security (PR.DS)
1.1 PROTECT (PR) Data Security (PR.DS)
1.1 PROTECT (PR) Data Security (PR.DS)
1.1 PROTECT (PR) Data Security (PR.DS)
1.1 PROTECT (PR) Data Security (PR.DS)

1.1 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)


1.1 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)
1.1 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)
1.1 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)
1.1 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)
1.1 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)
1.1 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)
1.1 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)
1.1 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)
1.1 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)
1.1 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)
1.1 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)

1.1 PROTECT (PR) Maintenance (PR.MA)


1.1 PROTECT (PR) Maintenance (PR.MA)

1.1 PROTECT (PR) Protective Technology (PR.PT)


1.1 PROTECT (PR) Protective Technology (PR.PT)
1.1 PROTECT (PR) Protective Technology (PR.PT)
1.1 PROTECT (PR) Protective Technology (PR.PT)
1.1 PROTECT (PR) Protective Technology (PR.PT)

1.1 DETECT (DE) Anomalies and Events (DE.AE)


1.1 DETECT (DE) Anomalies and Events (DE.AE)
1.1 DETECT (DE) Anomalies and Events (DE.AE)
1.1 DETECT (DE) Anomalies and Events (DE.AE)
1.1 DETECT (DE) Anomalies and Events (DE.AE)

1.1 DETECT (DE) Security Continuous Monitoring (DE.CM)


1.1 DETECT (DE) Security Continuous Monitoring (DE.CM)
1.1 DETECT (DE) Security Continuous Monitoring (DE.CM)
1.1 DETECT (DE) Security Continuous Monitoring (DE.CM)
1.1 DETECT (DE) Security Continuous Monitoring (DE.CM)
1.1 DETECT (DE) Security Continuous Monitoring (DE.CM)
1.1 DETECT (DE) Security Continuous Monitoring (DE.CM)
1.1 DETECT (DE) Security Continuous Monitoring (DE.CM)

1.1 DETECT (DE) Detection Processes (DE.DP)


1.1 DETECT (DE) Detection Processes (DE.DP)
1.1 DETECT (DE) Detection Processes (DE.DP)
1.1 DETECT (DE) Detection Processes (DE.DP)
1.1 DETECT (DE) Detection Processes (DE.DP)

1.1 RESPOND (RS) Response Planning (RS.RP)

1.1 RESPOND (RS) Communications (RS.CO)


1.1 RESPOND (RS) Communications (RS.CO)
1.1 RESPOND (RS) Communications (RS.CO)
1.1 RESPOND (RS) Communications (RS.CO)
1.1 RESPOND (RS) Communications (RS.CO)

1.1 RESPOND (RS) Analysis (RS.AN)


1.1 RESPOND (RS) Analysis (RS.AN)
1.1 RESPOND (RS) Analysis (RS.AN)
1.1 RESPOND (RS) Analysis (RS.AN)
1.1 RESPOND (RS) Analysis (RS.AN)

1.1 RESPOND (RS) Mitigation (RS.MI)


1.1 RESPOND (RS) Mitigation (RS.MI)
1.1 RESPOND (RS) Mitigation (RS.MI)

1.1 RESPOND (RS) Improvements (RS.IM)


1.1 RESPOND (RS) Improvements (RS.IM)

1.1 RECOVER (RC) Recovery Planning (RC.RP)

1.1 RECOVER (RC) Improvements (RC.IM)


1.1 RECOVER (RC) Improvements (RC.IM)

1.1 RECOVER (RC) Communications (RC.CO)


1.1 RECOVER (RC) Communications (RC.CO)
1.1 RECOVER (RC) Communications (RC.CO)

NIST CSF 2.0


2.0 Govern (GV) Organizational Context (GV.OC)
2.0 Govern (GV) Organizational Context (GV.OC)
2.0 Govern (GV) Organizational Context (GV.OC)
2.0 Govern (GV) Organizational Context (GV.OC)
2.0 Govern (GV) Organizational Context (GV.OC)

2.0 Govern (GV) Risk Management Strategy (GV.RM)


2.0 Govern (GV) Risk Management Strategy (GV.RM)
2.0 Govern (GV) Risk Management Strategy (GV.RM)
2.0 Govern (GV) Risk Management Strategy (GV.RM)
2.0 Govern (GV) Risk Management Strategy (GV.RM)
2.0 Govern (GV) Risk Management Strategy (GV.RM)
2.0 Govern (GV) Risk Management Strategy (GV.RM)

2.0 Govern (GV) Roles, Responsibilities, and Authorities (GV.RR)


2.0 Govern (GV) Roles, Responsibilities, and Authorities (GV.RR)
2.0 Govern (GV) Roles, Responsibilities, and Authorities (GV.RR)
2.0 Govern (GV) Roles, Responsibilities, and Authorities (GV.RR)

2.0 Govern (GV) Policy (GV.PO)


2.0 Govern (GV) Policy (GV.PO)

2.0 Govern (GV) Oversight (GV.OV)


2.0 Govern (GV) Oversight (GV.OV)
2.0 Govern (GV) Oversight (GV.OV)

2.0 Govern (GV) Cybersecurity Supply Chain Risk Management (GV.SC)


2.0 Govern (GV) Cybersecurity Supply Chain Risk Management (GV.SC)
2.0 Govern (GV) Cybersecurity Supply Chain Risk Management (GV.SC)
2.0 Govern (GV) Cybersecurity Supply Chain Risk Management (GV.SC)
2.0 Govern (GV) Cybersecurity Supply Chain Risk Management (GV.SC)
2.0 Govern (GV) Cybersecurity Supply Chain Risk Management (GV.SC)
2.0 Govern (GV) Cybersecurity Supply Chain Risk Management (GV.SC)
2.0 Govern (GV) Cybersecurity Supply Chain Risk Management (GV.SC)
2.0 Govern (GV) Cybersecurity Supply Chain Risk Management (GV.SC)
2.0 Govern (GV) Cybersecurity Supply Chain Risk Management (GV.SC)

2.0 IDENTIFY (ID) Asset Management (ID.AM)


2.0 IDENTIFY (ID) Asset Management (ID.AM)
2.0 IDENTIFY (ID) Asset Management (ID.AM)
2.0 IDENTIFY (ID) Asset Management (ID.AM)
2.0 IDENTIFY (ID) Asset Management (ID.AM)
2.0 IDENTIFY (ID) Asset Management (ID.AM)
2.0 IDENTIFY (ID) Asset Management (ID.AM)

2.0 IDENTIFY (ID) Risk Assessment (ID.RA)


2.0 IDENTIFY (ID) Risk Assessment (ID.RA)
2.0 IDENTIFY (ID) Risk Assessment (ID.RA)
2.0 IDENTIFY (ID) Risk Assessment (ID.RA)
2.0 IDENTIFY (ID) Risk Assessment (ID.RA)
2.0 IDENTIFY (ID) Risk Assessment (ID.RA)
2.0 IDENTIFY (ID) Risk Assessment (ID.RA)
2.0 IDENTIFY (ID) Risk Assessment (ID.RA)
2.0 IDENTIFY (ID) Risk Assessment (ID.RA)
2.0 IDENTIFY (ID) Risk Assessment (ID.RA)

2.0 IDENTIFY (ID) Improvement (ID.IM)


2.0 IDENTIFY (ID) Improvement (ID.IM)
2.0 IDENTIFY (ID) Improvement (ID.IM)
2.0 IDENTIFY (ID) Improvement (ID.IM)

2.0 PROTECT (PR) Identity Management, Authentication, and Access Control (PR.AA)

2.0 PROTECT (PR) Identity Management, Authentication, and Access Control (PR.AA)

2.0 PROTECT (PR) Identity Management, Authentication, and Access Control (PR.AA)

2.0 PROTECT (PR) Identity Management, Authentication, and Access Control (PR.AA)

2.0 PROTECT (PR) Identity Management, Authentication, and Access Control (PR.AA)

2.0 PROTECT (PR) Identity Management, Authentication, and Access Control (PR.AA)

2.0 PROTECT (PR) Awareness and Training (PR.AT)


2.0 PROTECT (PR) Awareness and Training (PR.AT)

2.0 PROTECT (PR) Data Security (PR.DS)


2.0 PROTECT (PR) Data Security (PR.DS)
2.0 PROTECT (PR) Data Security (PR.DS)
2.0 PROTECT (PR) Data Security (PR.DS)

2.0 PROTECT (PR) Platform Security (PR.PS)


2.0 PROTECT (PR) Platform Security (PR.PS)
2.0 PROTECT (PR) Platform Security (PR.PS)
2.0 PROTECT (PR) Platform Security (PR.PS)
2.0 PROTECT (PR) Platform Security (PR.PS)
2.0 PROTECT (PR) Platform Security (PR.PS)

2.0 PROTECT (PR) Technology Infrastructure Resilience (PR.IR)


2.0 PROTECT (PR) Technology Infrastructure Resilience (PR.IR)
2.0 PROTECT (PR) Technology Infrastructure Resilience (PR.IR)
2.0 PROTECT (PR) Technology Infrastructure Resilience (PR.IR)

2.0 DETECT (DE) Continuous Monitoring (DE.CM)


2.0 DETECT (DE) Continuous Monitoring (DE.CM)
2.0 DETECT (DE) Continuous Monitoring (DE.CM)
2.0 DETECT (DE) Continuous Monitoring (DE.CM)
2.0 DETECT (DE) Continuous Monitoring (DE.CM)

2.0 DETECT (DE) Adverse Event Analysis (DE.AE)


2.0 DETECT (DE) Adverse Event Analysis (DE.AE)
2.0 DETECT (DE) Adverse Event Analysis (DE.AE)
2.0 DETECT (DE) Adverse Event Analysis (DE.AE)
2.0 DETECT (DE) Adverse Event Analysis (DE.AE)
2.0 DETECT (DE) Adverse Event Analysis (DE.AE)

2.0 RESPOND (RS) Incident Management (RS.MA)


2.0 RESPOND (RS) Incident Management (RS.MA)
2.0 RESPOND (RS) Incident Management (RS.MA)
2.0 RESPOND (RS) Incident Management (RS.MA)
2.0 RESPOND (RS) Incident Management (RS.MA)

2.0 RESPOND (RS) Incident Analysis (RS.AN)


2.0 RESPOND (RS) Incident Analysis (RS.AN)
2.0 RESPOND (RS) Incident Analysis (RS.AN)
2.0 RESPOND (RS) Incident Analysis (RS.AN)

2.0 RESPOND (RS) Incident Response Reporting and Communication (RS.CO)


2.0 RESPOND (RS) Incident Response Reporting and Communication (RS.CO)

2.0 RESPOND (RS) Incident Mitigation (RS.MI)


2.0 RESPOND (RS) Incident Mitigation (RS.MI)

2.0 RECOVER (RC) Incident Recovery Plan Execution (RC.RP)


2.0 RECOVER (RC) Incident Recovery Plan Execution (RC.RP)
2.0 RECOVER (RC) Incident Recovery Plan Execution (RC.RP)
2.0 RECOVER (RC) Incident Recovery Plan Execution (RC.RP)
2.0 RECOVER (RC) Incident Recovery Plan Execution (RC.RP)
2.0 RECOVER (RC) Incident Recovery Plan Execution (RC.RP)

2.0 RECOVER (RC) Incident Recovery Communication (RC.CO)


2.0 RECOVER (RC) Incident Recovery Communication (RC.CO)
Maturity
Subcategory Applicable? Maturity (MIN) Maturity (TOTAL) Maturity (MAX)
ID.AM-1 0 0 0 0
ID.AM-2 0 0 0 0
ID.AM-3 0 0 0 0
ID.AM-4 0 0 0 0
ID.AM-5 0 0 0 0
ID.AM-6 14 14 64 70
SUM 14 64 70
ID.BE-1 0 0 0 0
ID.BE-2 0 0 0 0
ID.BE-3 4 4 16 20
ID.BE-4 1 1 4 5
ID.BE-5 4 4 15 20
SUM 9 35 45
ID.GV-1 7 7 21 35
ID.GV-2 4 4 14 20
ID.GV-3 8 8 23 40
ID.GV-4 1 1 4 5
SUM 20 62 100
ID.RA-1 2 2 2 10
ID.RA-2 1 1 4 5
ID.RA-3 14 14 40 70
ID.RA-4 11 11 40 55
ID.RA-5 12 12 40 60
ID.RA-6 0 0 0 0
SUM 40 126 200
ID.RM-1 1 1 4 5
ID.RM-2 0 0 0 0
ID.RM-3 0 0 0 0
SUM 1 4 5
ID.SC-1 0 0 0 0
ID.SC-2 0 0 0 0
ID.SC-3 3 3 14 15
ID.SC-4 0 0 0 0
ID.SC-5 0 0 0 0
SUM 3 14 15
Total
PR.AC-1 0 0 0 0
PR.AC-2 1 1 3 5
PR.AC-3 0 0 0 0
PR.AC-4 9 9 34 45
PR.AC-5 1 1 3 5
PR.AC-6 0 0 0 0
PR.AC-7 0 0 0 0
SUM 11 40 55
PR.AT-1 8 8 31 40
PR.AT-2 0 0 0 0
PR.AT-3 0 0 0 0
PR.AT-4 0 0 0 0
PR.AT-5 7 7 33 35
SUM 15 64 75
PR.DS-1 0 0 0 0
PR.DS-2 0 0 0 0
PR.DS-3 0 0 0 0
PR.DS-4 0 0 0 0
PR.DS-5 1 1 2 5
PR.DS-6 0 0 0 0
PR.DS-7 4 4 14 20
PR.DS-8 0 0 0 0
SUM 5 16 25
PR.IP-1 0 0 0 0
PR.IP-2 0 0 0 0
PR.IP-3 1 1 4 5
PR.IP-4 6 6 23 30
PR.IP-5 1 1 3 5
PR.IP-6 1 1 2 5
PR.IP-7 0 0 0 0
PR.IP-8 0 0 0 0
PR.IP-9 10 10 25 50
PR.IP-10 6 6 21 30
PR.IP-11 1 1 3 5
PR.IP-12 11 11 5 55
SUM 37 86 185
PR.MA-1 18 18 64 90
PR.MA-2 3 3 11 15
SUM 21 75 105
PR.PT-1 11 11 54 55
PR.PT-2 0 0 0 0
PR.PT-3 4 4 15 20
PR.PT-4 0 0 0 0
PR.PT-5 3 3 11 15
SUM 18 80 90
Total
DE.AE-1 0 0 0 0
DE.AE-2 2 2 10 10
DE.AE-3 4 4 14 20
DE.AE-4 0 0 0 0
DE.AE-5 0 0 0 0
SUM 6 24 30
DE.CM-1 2 2 8 10
DE.CM-2 2 2 8 10
DE.CM-3 2 2 8 10
DE.CM-4 2 2 8 10
DE.CM-5 2 2 8 10
DE.CM-6 2 2 8 10
DE.CM-7 2 2 8 10
DE.CM-8 0 0 0 0
SUM 14 56 70
DE.DP-1 17 17 72 85
DE.DP-2 14 14 62 70
DE.DP-3 6 6 22 30
DE.DP-4 2 2 8 10
DE.DP-5 8 8 27 40
SUM 47 191 235
Total
RS.RP-1 1 1 3 5
SUM 1 3 5
RS.CO-1 3 3 9 15
RS.CO-2 1 1 3 5
RS.CO-3 1 1 3 5
RS.CO-4 1 1 3 5
RS.CO-5 1 1 3 5
SUM 7 21 35
RS.AN-1 2 2 10 10
RS.AN-2 0 0 0 0
RS.AN-3 2 2 10 10
RS.AN-4 0 0 0 0
RS.AN-5 0 0 0 0
SUM 4 20 20
RS.MI-1 1 1 3 5
RS.MI-2 1 1 3 5
RS.MI-3 0 0 0 0
SUM 2 6 10
RS.IM-1 3 3 8 15
RS.IM-2 1 1 3 5
SUM 4 11 20
Total
RC.RP-1 0 0 0 0
SUM 0 0 0
RC.IM-1 0 0 0 0
RC.IM-2 0 0 0 0
SUM 0 0 0
RC.CO-1 0 0 0 0
RC.CO-2 0 0 0 0
RC.CO-3 0 0 0 0
SUM 0 0 0
Total

GV.OC-01 0 0 0 0
GV.OC-02 7 7 32 35
GV.OC-03 7 7 18 35
GV.OC-04 9 9 35 45
GV.OC-05 0 0 0 0
SUM 23 85 115
GV.RM-01 1 1 3 5
GV.RM-02 0 0 0 0
GV.RM-03 1 1 4 5
GV.RM-04 0 0 0 0
GV.RM-05 0 0 0 0
GV.RM-06 0 0 0 0
GV.RM-07 0 0 0 0
SUM 2 7 10
GV.RR-01 0 0 0 0
GV.RR-02 17 17 72 85
GV.RR-03 2 2 8 10
GV.RR-04 1 1 3 5
SUM 20 83 100
GV.PO-01 4 4 11 20
GV.PO-02 3 3 12 15
SUM 7 23 35
GV.OV-01 0 0 0 0
GV.OV-02 0 0 0 0
GV.OV-03 1 1 2 5
SUM 1 2 5
GV.SC-01 0 0 0 0
GV.SC-02 9 9 40 45
GV.SC-03 0 0 0 0
GV.SC-04 3 3 14 15
GV.SC-05 0 0 0 0
GV.SC-06 0 0 0 0
GV.SC-07 0 0 0 0
GV.SC-08 0 0 0 0
GV.SC-09 0 0 0 0
GV.SC-10 0 0 0 0
SUM 12 54 60
Total
ID.AM-01 0 0 0 0
ID.AM-02 0 0 0 0
ID.AM-03 0 0 0 0
ID.AM-04 0 0 0 0
ID.AM-05 0 0 0 0
ID.AM-07 0 0 0 0
ID.AM-08 17 17 72 85
SUM 17 72 85
ID.RA-01 12 12 7 60
ID.RA-02 1 1 4 5
ID.RA-03 4 4 3 20
ID.RA-04 11 11 40 55
ID.RA-05 11 11 40 55
ID.RA-06 0 0 0 0
ID.RA-07 0 0 0 0
ID.RA-08 0 0 0 0
ID.RA-09 0 0 0 0
ID.RA-10 0 0 0 0
SUM 39 94 195
ID.IM-01 1 1 3 5
ID.IM-02 12 12 43 60
ID.IM-03 12 12 38 60
ID.IM-04 10 10 25 50
SUM 35 109 175
Total
PR.AA-01 0 0 0 0
PR.AA-02 0 0 0 0
PR.AA-03 0 0 0 0
PR.AA-04 0 0 0 0
PR.AA-05 9 9 34 45
PR.AA-06 1 1 3 5
SUM 10 37 50
PR.AT-01 6 6 21 30
PR.AT-02 11 11 47 55
SUM 17 68 85
PR.DS-01 0 0 0 0
PR.DS-02 0 0 0 0
PR.DS-10 0 0 0 0
PR.DS-11 6 6 23 30
SUM 6 23 30
PR.PS-01 1 1 4 5
PR.PS-02 1 1 3 5
PR.PS-03 16 16 67 80
PR.PS-04 15 15 69 75
PR.PS-05 0 0 0 0
PR.PS-06 0 0 0 0
SUM 33 143 165
PR.IR-01 5 5 17 25
PR.IR-02 1 1 3 5
PR.IR-03 3 3 11 15
PR.IR-04 10 10 27 50
SUM 19 58 95
Total
DE.CM-01 4 4 16 20
DE.CM-02 2 2 8 10
DE.CM-03 2 2 8 10
DE.CM-06 5 5 19 25
DE.CM-09 2 2 8 10
SUM 15 59 75
DE.AE-02 2 2 10 10
DE.AE-03 4 4 14 20
DE.AE-04 0 0 0 0
DE.AE-06 2 2 8 10
DE.AE-07 0 0 0 0
DE.AE-08 0 0 0 0
SUM 8 32 40
Total
RS.MA-01 3 3 9 15
RS.MA-02 2 2 10 10
RS.MA-03 0 0 0 0
RS.MA-04 0 0 0 0
RS.MA-05 0 0 0 0
SUM 5 19 25
RS.AN-03 2 2 10 10
RS.AN-06 0 0 0 0
RS.AN-07 0 0 0 0
RS.AN-08 0 0 0 0
SUM 2 10 10
RS.CO-02 1 1 3 5
RS.CO-03 1 1 3 5
SUM 2 6 10
RS.MI-01 1 1 3 5
RS.MI-02 1 1 3 5
SUM 2 6 10
Total
RC.RP-01 0 0 0 0
RC.RP-02 0 0 0 0
RC.RP-03 0 0 0 0
RC.RP-04 0 0 0 0
RC.RP-05 0 0 0 0
RC.RP-06 0 0 0 0
SUM 0 0 0
RC.CO-03 0 0 0 0
RC.CO-04 0 0 0 0
SUM 0 0 0
Total
Maturity
Category maturity Category applicability Function maturity Applicable?

1
1
0
0
0
0
89.29 1 SUM
0
0
0
0
0
72.22 1 SUM
0
0
3
0
52.5 1 SUM
5
13
8
0
4
0
53.75 1 SUM
0
0
0
75 1 SUM
0
0
1
2
2
91.67 1 SUM
434.43 6 72.41 Total
0
0
0
0
0
0
0
65.91 1 SUM
0
0
0
0
1
81.67 1 SUM
1
2
0
2
2
0
0
0
55 1 SUM
0
0
0
0
0
1
0
0
0
1
0
8
33.11 1 SUM
0
0
64.29 1 SUM
1
0
0
0
0
86.11 1 SUM
386.09 6 64.35 Total
4
25
17
2
4
75 1 SUM
20
1
1
3
2
3
1
7
75 1 SUM
0
25
1
0
0
76.6 1 SUM
226.6 3 75.53 Total
1
50 1 SUM
2
7
0
4
0
50 1 SUM
2
3
9
1
1
100 1 SUM
4
5
2
50 1 SUM
3
1
43.75 1 SUM
293.75 5 58.75 Total
0
0 0 SUM
0
0
0 0 SUM
0
0
0
0 0 SUM
0 0 0 Total
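The category and function maturity percentages in the tables above are consistent with a simple normalization: the summed answer score (TOTAL) is rescaled over the attainable range between the floor (MIN, one point per applicable question) and the ceiling (MAX), and a function score is the average of its applicable categories. A minimal sketch, assuming this is the formula the workbook uses (the function and parameter names are illustrative, not taken from the workbook):

```python
def category_maturity(min_score: float, total: float, max_score: float) -> float:
    """Rescale a summed answer score to a 0-100 percentage.

    MIN is the floor (1 point per applicable 1-5 question), so the
    attainable range is MAX - MIN. Assumption: this reproduces the
    category maturity column in the sheet.
    """
    if max_score == min_score:
        return 0.0  # category not applicable
    return round((total - min_score) / (max_score - min_score) * 100, 2)

def function_maturity(category_scores: list[float]) -> float:
    """Average the maturity percentages of the applicable categories."""
    return round(sum(category_scores) / len(category_scores), 2)

# Cross-checks against rows in the sheet:
# ID.AM SUM (14, 64, 70) -> 89.29; ID.GV SUM (20, 62, 100) -> 52.5
```

The formula matches several rows above, e.g. ID.BE (9, 35, 45) yields 72.22 and the DETECT function average of (75, 75, 76.6) yields 75.53.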

0
0
1
0
0
67.39 1 SUM
0
0
0
0
0
0
0
62.5 1 SUM
0
0
0
0
78.75 1 SUM
3
0
57.14 1 SUM
0
0
0
25 1 SUM
0
0
0
1
0
0
2
2
0
0
87.5 1 SUM
378.28 6 63.05 Total
1
1
4
0
0
0
0
80.88 1 SUM
20
13
8
0
3
2
0
1
0
0
35.26 1 SUM
0
2
3
0
52.86 1 SUM
169 3 56.33 Total

0
67.5 1 SUM
2
1
75 1 SUM
5
2
1
0
70.83 1 SUM
0
0
0
1
0
0
83.33 1 SUM
0
0
0
1
51.32 1 SUM
347.98 5 69.6 Total
19
1
2
3
18
73.33 1 SUM
50
17
2
4
3
0
75 1 SUM
148.33 2 74.17 Total
5
4
1
1
0
70 1 SUM
9
1
1
2
100 1 SUM
7
0
50 1 SUM
4
5
50 1 SUM
270 4 67.5 Total
0
0
2
0
0
0
0 0 SUM
0
0
0 0 SUM
0 0 0 Total
Capability
Subcategory capability (MIN) Subcategory capability (TOTAL) Subcategory capability (MAX) Category capability

1 0 5
1 0 5
0 0 0
0 0 0
0 0 0
0 0 0
2 0 10 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0 0
0 0 0
0 0 0
3 15 15
0 0 0
3 15 15 100
5 0 25
13 0 65
8 0 40
0 0 0
4 0 20
0 0 0
30 0 150 0
0 0 0
0 0 0
0 0 0
0 0 0 0
0 0 0
0 0 0
1 3 5
2 7 10
2 5 10
5 15 25 50
150
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
1 3 5
1 3 5 50
1 5 5
2 10 10
0 0 0
2 8 10
2 9 10
0 0 0
0 0 0
0 0 0
7 32 35 89.29
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
1 5 5
0 0 0
0 0 0
0 0 0
1 3 5
0 0 0
8 0 40
10 8 50 0
0 0 0
0 0 0
0 0 0 0
1 5 5
0 0 0
0 0 0
0 0 0
0 0 0
1 5 5 100
239.29
4 12 20
25 114 125
17 62 85
2 8 10
4 17 20
52 213 260 77.4
20 55 100
1 5 5
1 3 5
3 12 15
2 8 10
3 13 15
1 5 5
7 0 35
38 101 190 41.45
0 0 0
25 95 125
1 3 5
0 0 0
0 0 0
26 98 130 69.23
188.08
1 4 5
1 4 5 75
2 8 10
7 24 35
0 0 0
4 13 20
0 0 0
13 45 65 61.54
2 8 10
3 12 15
9 41 45
1 3 5
1 0 5
16 64 80 75
4 9 20
5 11 25
2 0 10
11 20 55 20.45
3 8 15
1 3 5
4 11 20 43.75
275.74
0 0 0
0 0 0 0
0 0 0
0 0 0
0 0 0 0
0 0 0
0 0 0
0 0 0
0 0 0 0
0

0 0 0
0 0 0
1 5 5
0 0 0
0 0 0
1 5 5 100
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0 0
3 15 15
0 0 0
3 15 15 100
0 0 0
0 0 0
0 0 0
0 0 0 0
0 0 0
0 0 0
0 0 0
1 3 5
0 0 0
0 0 0
2 7 10
2 5 10
0 0 0
0 0 0
5 15 25 50
250
1 0 5
1 0 5
4 12 20
0 0 0
0 0 0
0 0 0
0 0 0
6 12 30 25
20 0 100
13 0 65
8 0 40
0 0 0
3 0 15
2 0 10
0 0 0
1 0 5
0 0 0
0 0 0
47 0 235 0
0 0 0
2 6 10
3 8 15
0 0 0
5 14 25 45
70

0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0 0
2 8 10
1 3 5
3 11 15 66.67
5 23 25
2 10 10
1 5 5
0 0 0
8 38 40 93.75
0 0 0
0 0 0
0 0 0
1 5 5
0 0 0
0 0 0
1 5 5 100
0 0 0
0 0 0
0 0 0
1 4 5
1 4 5 75
335.42
19 50 95
1 5 5
2 8 10
3 13 15
18 81 90
44 161 220 66.48
50 209 250
17 62 85
2 8 10
4 17 20
3 10 15
0 0 0
76 306 380 75.66
142.14
5 17 25
4 16 20
1 3 5
1 4 5
0 0 0
11 40 55 65.91
9 41 45
1 3 5
1 5 5
2 8 10
13 57 65 84.62
7 24 35
0 0 0
7 24 35 60.71
4 9 20
5 11 25
9 20 45 30.56
241.8
0 0 0
0 0 0
2 9 10
0 0 0
0 0 0
0 0 0
2 9 10 87.5
0 0 0
0 0 0
0 0 0 0
87.5
Category applicability Function capability

1
4 37.5

0
1

1
4 59.82

1
1

1
3 62.69

1
5 55.15

0
0 0
1

1
3 83.33

1
1

1
3 23.33

1
4 83.86
1

1
2 71.07

1
4 60.45

0
1 87.5
Charter document completeness
11 Incomplete
12 Partially complete
13 Partially complete
14 Partially complete
15 Averagely complete
16 Averagely complete
17 Averagely complete
18 Averagely complete
19 Mostly complete
20 Mostly complete
21 Mostly complete
22 Fully complete

Governance elements completeness


14 Incomplete
15 Partially complete
16 Partially complete
17 Partially complete
18 Partially complete
19 Averagely complete
20 Averagely complete
21 Averagely complete
22 Averagely complete
23 Averagely complete
24 Mostly complete
25 Mostly complete
26 Mostly complete
27 Mostly complete
28 Fully complete

Cost management elements completeness


8 Incomplete
9 Partially complete
10 Partially complete
11 Averagely complete
12 Averagely complete
13 Averagely complete
14 Mostly complete
15 Mostly complete
16 Fully complete

SOC Management elements completeness


10 Incomplete
11 Partially complete
12 Partially complete
13 Partially complete
14 Averagely complete
15 Averagely complete
16 Averagely complete
17 Mostly complete
18 Mostly complete
19 Mostly complete
20 Fully complete

Role documentation completeness


8 Incomplete
9 Partially complete
10 Partially complete
11 Averagely complete
12 Averagely complete
13 Averagely complete
14 Mostly complete
15 Mostly complete
16 Fully complete

Training program completeness


6 Incomplete
7 Partially complete
8 Averagely complete
9 Averagely complete
10 Averagely complete
11 Mostly complete
12 Fully complete

Certification program completeness


3 Incomplete
4 Partially complete
5 Mostly complete
6 Fully complete

General documentation completeness


11 Incomplete
12 Partially complete
13 Partially complete
14 Partially complete
15 Averagely complete
16 Averagely complete
17 Averagely complete
18 Averagely complete
19 Mostly complete
20 Mostly complete
21 Mostly complete
22 Fully complete
General Maturity indicators completeness
12 Incomplete
13 Partially complete
14 Partially complete
15 Partially complete
16 Averagely complete
17 Averagely complete
18 Averagely complete
19 Averagely complete
20 Averagely complete
21 Mostly complete
22 Mostly complete
23 Mostly complete
24 Fully complete

Security Incident Management documentation completeness


11 Incomplete
12 Partially complete
13 Partially complete
14 Partially complete
15 Averagely complete
16 Averagely complete
17 Averagely complete
18 Averagely complete
19 Mostly complete
20 Mostly complete
21 Mostly complete
22 Fully complete

Security Incident Management Maturity indicators completeness


11 Incomplete
12 Partially complete
13 Partially complete
14 Partially complete
15 Averagely complete
16 Averagely complete
17 Averagely complete
18 Averagely complete
19 Mostly complete
20 Mostly complete
21 Mostly complete
22 Fully complete

Threat Hunting maturity indicators completeness


11 Incomplete
12 Partially complete
13 Partially complete
14 Partially complete
15 Averagely complete
16 Averagely complete
17 Averagely complete
18 Averagely complete
19 Mostly complete
20 Mostly complete
21 Mostly complete
22 Fully complete

SOC policy completeness


8 Incomplete
9 Partially complete
10 Partially complete
11 Averagely complete
12 Averagely complete
13 Averagely complete
14 Mostly complete
15 Mostly complete
16 Fully complete

SOC exercises completeness


6 Incomplete
7 Partially complete
8 Averagely complete
9 Averagely complete
10 Averagely complete
11 Mostly complete
12 Fully complete

Report type completeness


8 Incomplete
9 Partially complete
10 Partially complete
11 Averagely complete
12 Averagely complete
13 Averagely complete
14 Mostly complete
15 Mostly complete
16 Fully complete

Report metrics completeness


6 Incomplete
7 Partially complete
8 Averagely complete
9 Averagely complete
10 Averagely complete
11 Mostly complete
12 Fully complete
Skill matrix completeness
4 Incomplete
5 Partially complete
6 Averagely complete
7 Mostly complete
8 Fully complete

Knowledge matrix completeness


3 Incomplete
4 Partially complete
5 Mostly complete
6 Fully complete

Continuous improvement completeness


7 Incomplete
8 Partially complete
9 Partially complete
10 Averagely complete
11 Averagely complete
12 Mostly complete
13 Mostly complete
14 Fully complete

Quality assurance completeness


5 Incomplete
6 Partially complete
7 Averagely complete
8 Averagely complete
9 Mostly complete
10 Fully complete

Architecture completeness
5 Incomplete
6 Partially complete
7 Averagely complete
8 Averagely complete
9 Mostly complete
10 Fully complete
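Each completeness table above maps an element count to a label, with band boundaries that differ per table. A threshold lookup with `bisect` covers all of them from a single list of lower bounds; the sketch below uses the Charter document table as its example, and assumes counts below the first listed value also read "Incomplete" (the tables only start at the highest "Incomplete" count):

```python
from bisect import bisect_right

# Lower bound of each band, taken from the Charter document table above.
# Counts below the first threshold are assumed to also be "Incomplete".
CHARTER_BANDS = [
    (12, "Partially complete"),
    (15, "Averagely complete"),
    (19, "Mostly complete"),
    (22, "Fully complete"),
]

def completeness(count: int, bands: list[tuple[int, str]] = CHARTER_BANDS) -> str:
    """Return the completeness label for a given element count."""
    thresholds = [lower for lower, _ in bands]
    i = bisect_right(thresholds, count)  # number of thresholds <= count
    return "Incomplete" if i == 0 else bands[i - 1][1]
```

Any of the other tables (governance elements, training program, skill matrix, etc.) can be expressed as a different `bands` list passed to the same function.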
SOC-CMM - Business Domain
B1 - Business Drivers answer
B 1.1 0
1
2
3
4
5
B 1.2 0
1
2
3
4
5
B 1.3 0
1
2
3
4
5
B 1.4 0
1
2
3
4
5
B 1.5 0
1
2
3
4
5
B2 - Customers answer
B 2.1 0
1
2
3
4
5
B 2.3 0
1
2
3
4
5
B 2.4 0
1
2
3
4
5
B 2.5 0
1
2
3
4
5
B 2.6 0
1
2
3
4
5
B 2.7 0
1
2
3
4
5
B3 - SOC Charter answer
B 3.1 0
1
2
3
4
5
B 3.3 0
1
2
3
4
5
B 3.4 0
1
2
3
4
5
B 3.5 0
1
2
3
4
5
B4 - Governance answer
B 4.1 0
1
2
3
4
5
B 4.2 0
1
2
3
4
5
B 4.4 0
1
2
3
4
5
B 4.6 0
1
2
3
4
5
B 4.7 0
1
2
3
4
5
B 4.8 0
1
2
3
4
5
B 4.9 0
1
2
3
4
5
B 4.10 0
1
2
3
4
5
B 4.11 0
1
2
3
4
5
B5 - Privacy answer
B 5.1 0
1
2
3
4
5
B 5.2 0
1
2
3
4
5
B 5.4 0
1
2
3
4
5
B 5.5 0
1
2
3
4
5
B 5.6 0
1
2
3
4
5
B 5.7 0
1
2
3
4
5
B 5.8 0
1
2
3
4
5
B 5.9 0
1
2
3
4
5
B 5.10 0
1
2
3
4
5
B 5.11 0
1
2
3
4
5

SOC-CMM - People Domain


P1 - SOC Employees answer
P 1.3 0
1
2
3
4
5
P 1.4 0
1
2
3
4
5
P 1.5 0
1
2
3
4
5
P 1.6 0
1
2
3
4
5
P 1.7 0
1
2
3
4
5
P 1.8 0
1
2
3
4
5
P 1.9 0
1
2
3
4
5
P 1.10 0
1
2
3
4
5
P2 - SOC Roles and Hierarchy answer
P 2.1 0
1
2
3
4
5
P 2.3 0
1
2
3
4
5
P 2.4 0
1
2
3
4
5
P 2.5 0
1
2
3
4
5
P 2.6 0
1
2
3
4
5
P 2.8 0
1
2
3
4
5
P 2.9 0
1
2
3
4
5
P 2.10 0
1
2
3
4
5

P3 - People Management answer


P 3.1 0
1
2
3
4
5
P 3.2 0
1
2
3
4
5
P 3.3 0
1
2
3
4
5
P 3.4 0
1
2
3
4
5
P 3.5 0
1
2
3
4
5
P 3.6 0
1
2
3
4
5
P 3.7 0
1
2
3
4
5
P 3.8 0
1
2
3
4
5
P 3.9 0
1
2
3
4
5
P 3.10 0
1
2
3
4
5
P 3.11 0
1
2
3
4
5
P 3.12 0
1
2
3
4
5
P 3.13 0
1
2
3
4
5
P 3.14 0
1
2
3
4
5

P4 - Knowledge Management answer


P 4.1 0
1
2
3
4
5
P 4.2 0
1
2
3
4
5
P 4.4 0
1
2
3
4
5
P 4.5 0
1
2
3
4
5
P 4.7 0
1
2
3
4
5
P 4.8 0
1
2
3
4
5
P 4.9 0
1
2
3
4
5
P 4.10 0
1
2
3
4
5

P5 - Training & Education answer


P 5.1 0
1
2
3
4
5
P 5.3 0
1
2
3
4
5
P 5.5 0
1
2
3
4
5
P 5.6 0
1
2
3
4
5
P 5.7 0
1
2
3
4
5
P 5.8 0
1
2
3
4
5
P 5.9 0
1
2
3
4
5

SOC-CMM - Process Domain


M1 - SOC Management answer
M 1.1 0
1
2
3
4
5
M 1.2 0
1
2
3
4
5
M 1.4 0
1
2
3
4
5
M 1.5 0
1
2
3
4
5
M 1.6 0
1
2
3
4
5
M 1.8 0
1
2
3
4
5
M 1.10 0
1
2
3
4
5
M2 - Security Operations & Facilities answer
M 2.1.1 0
1
2
3
4
5
M 2.1.3 0
1
2
3
4
5
M 2.1.4 0
1
2
3
4
5
M 2.1.5 0
1
2
3
4
5
M 2.2.1 0
1
2
3
4
5
M 2.2.2 0
1
2
3
4
5
M 2.2.3 0
1
2
3
4
5
M 2.2.4 0
1
2
3
4
5
M 2.2.5 0
1
2
3
4
5
M 2.3.1 0
1
2
3
4
5
M 2.3.2 0
1
2
3
4
5
M 2.3.3 0
1
2
3
4
5
M 2.3.4 0
1
2
3
4
5
M 2.3.5 0
1
2
3
4
5
M 2.4.1 0
1
2
3
4
5
M 2.4.2 0
1
2
3
4
5
M 2.4.3 0
1
2
3
4
5
M 2.4.4 0
1
2
3
4
5
M 2.4.5 0
1
2
3
4
5
M 2.4.6 0
1
2
3
4
5
M 2.4.7 0
1
2
3
4
5
M 2.4.8 0
1
2
3
4
5
M 2.4.9 0
1
2
3
4
5
M 2.5.1 0
1
2
3
4
5
M 2.5.2 0
1
2
3
4
5
M 2.5.3 0
1
2
3
4
5
M 2.5.4 0
1
2
3
4
5
M 2.5.5 0
1
2
3
4
5
M 2.5.6 0
1
2
3
4
5
M 2.6.1 0
1
2
3
4
5
M 2.6.2 0
1
2
3
4
5

M3 - Reporting answer
M 3.1 0
1
2
3
4
5
M 3.2 0
1
2
3
4
5
M 3.3 0
1
2
3
4
5
M 3.4 0
1
2
3
4
5
M 3.5 0
1
2
3
4
5
M 3.6 0
1
2
3
4
5
M 3.7 0
1
2
3
4
5
M 3.9 0
1
2
3
4
5
M 3.11.1 0
1
2
3
4
5
M 3.11.2 0
1
2
3
4
5
M 3.11.3 0
1
2
3
4
5
M 3.12.1 0
1
2
3
4
5
M 3.12.2 0
1
2
3
4
5
M 3.13.1 0
1
2
3
4
5
M 3.13.2 0
1
2
3
4
5
M 3.13.3 0
1
2
3
4
5
M 3.13.4 0
1
2
3
4
5

M4 - Use Case Management answer


M 4.1 0
1
2
3
4
5
M 4.2 0
1
2
3
4
5
M 4.3 0
1
2
3
4
5
M 4.4 0
1
2
3
4
5
M 4.5 0
1
2
3
4
5
M 4.6 0
1
2
3
4
5
M 4.7 0
1
2
3
4
5
M 4.8 0
1
2
3
4
5
M 4.9 0
1
2
3
4
5
M 4.10 0
1
2
3
4
5
M 4.11 0
1
2
3
4
5
M 4.2.1 0
1
2
3
4
5
M 4.2.2 0
1
2
3
4
5
M 4.2.3 0
1
2
3
4
5
M 4.2.4 0
1
2
3
4
5
M 4.2.5 0
1
2
3
4
5
M 4.2.6 0
1
2
3
4
5
M 4.3.1 0
1
2
3
4
5
M 4.3.2 0
1
2
3
4
5
M 4.3.3 0
1
2
3
4
5

M5 - Detection Engineering answer


M 5.1.1 0
1
2
3
4
5
M 5.1.2 0
1
2
3
4
5
M 5.1.3 0
1
2
3
4
5
M 5.1.4 0
1
2
3
4
5
M 5.1.5 0
1
2
3
4
5
M 5.1.6 0
1
2
3
4
5
M 5.1.7 0
1
2
3
4
5
M 5.1.8 0
1
2
3
4
5
M 5.1.9 0
1
2
3
4
5
M 5.1.10 0
1
2
3
4
5
M 5.2.1 0
1
2
3
4
5
M 5.2.2 0
1
2
3
4
5
M 5.2.3 0
1
2
3
4
5
M 5.2.4 0
1
2
3
4
5
M 5.2.5 0
1
2
3
4
5
M 5.2.6 0
1
2
3
4
5
M 5.2.7 0
1
2
3
4
5
M 5.2.8 0
1
2
3
4
5

SOC-CMM - Technology Domain


T1 - SIEM Technology answer
T 1.1.1 0
1
2
3
4
5
T 1.1.2 0
1
2
3
4
5
T 1.2.1 0
1
2
3
4
5
T 1.2.2 0
1
2
3
4
5
T 1.3.1 0
1
2
3
4
5
T 1.3.2 0
1
2
3
4
5
T 1.3.3 0
1
2
3
4
5
T 1.3.4 0
1
2
3
4
5
T 1.4.1 0
1
2
3
4
5
T 1.4.2 0
1
2
3
4
5
T 1.4.3 0
1
2
3
4
5
T 1.4.4 0
1
2
3
4
5
T 1.4.5 0
1
2
3
4
5
T 1.5.1 0
1
2
3
4
5
T 1.5.2 0
1
2
3
4
5
T 1.5.3 0
1
2
3
4
5
T 1.5.4 0
1
2
3
4
5
T 1.5.5 0
1
2
3
4
5
T 1.5.6 0
1
2
3
4
5
T 1.6.1 0
1
2
3
4
5
T 1.6.2 0
1
2
3
4
5
T 1.6.3 0
1
2
3
4
5

T2 - IDPS Tooling answer


T 2.1.1 0
1
2
3
4
5
T 2.1.2 0
1
2
3
4
5
T 2.2.1 0
1
2
3
4
5
T 2.2.2 0
1
2
3
4
5
T 2.3.1 0
1
2
3
4
5
T 2.3.2 0
1
2
3
4
5
T 2.3.3 0
1
2
3
4
5
T 2.3.4 0
1
2
3
4
5
T 2.4.1 0
1
2
3
4
5
T 2.4.2 0
1
2
3
4
5
T 2.4.3 0
1
2
3
4
5
T 2.4.4 0
1
2
3
4
5
T 2.4.5 0
1
2
3
4
5
T 2.5.1 0
1
2
3
4
5
T 2.5.2 0
1
2
3
4
5
T 2.5.3 0
1
2
3
4
5
T 2.5.4 0
1
2
3
4
5
T 2.5.5 0
1
2
3
4
5
T 2.5.6 0
1
2
3
4
5
T 2.6.1 0
1
2
3
4
5
T 2.6.2 0
1
2
3
4
5
T 2.6.3 0
1
2
3
4
5

T3 - Security Analytics answer


T 3.1.1 0
1
2
3
4
5
T 3.1.2 0
1
2
3
4
5
T 3.2.1 0
1
2
3
4
5
T 3.2.2 0
1
2
3
4
5
T 3.3.1 0
1
2
3
4
5
T 3.3.2 0
1
2
3
4
5
T 3.3.3 0
1
2
3
4
5
T 3.3.4 0
1
2
3
4
5
T 3.4.1 0
1
2
3
4
5
T 3.4.2 0
1
2
3
4
5
T 3.4.3 0
1
2
3
4
5
T 3.4.4 0
1
2
3
4
5
T 3.4.5 0
1
2
3
4
5
T 3.5.1 0
1
2
3
4
5
T 3.5.2 0
1
2
3
4
5
T 3.5.3 0
1
2
3
4
5
T 3.5.4 0
1
2
3
4
5
T 3.5.5 0
1
2
3
4
5
T 3.5.6 0
1
2
3
4
5
T 3.6.1 0
1
2
3
4
5
T 3.6.2 0
1
2
3
4
5
T 3.6.3 0
1
2
3
4
5

T4 - Security Automation & Orchestration answer


T 4.1.1 0
1
2
3
4
5
T 4.1.2 0
1
2
3
4
5
T 4.2.1 0
1
2
3
4
5
T 4.2.2 0
1
2
3
4
5
T 4.3.1 0
1
2
3
4
5
T 4.3.2 0
1
2
3
4
5
T 4.3.3 0
1
2
3
4
5
T 4.3.4 0
1
2
3
4
5
T 5.4.1 0
1
2
3
4
5
T 5.4.2 0
1
2
3
4
5
T 5.4.3 0
1
2
3
4
5
T 5.4.4 0
1
2
3
4
5
T 5.4.5 0
1
2
3
4
5
T 4.5.1 0
1
2
3
4
5
T 4.5.2 0
1
2
3
4
5
T 4.5.3 0
1
2
3
4
5
T 4.5.4 0
1
2
3
4
5
T 4.5.5 0
1
2
3
4
5
T 4.5.6 0
1
2
3
4
5
T 4.6.1 0
1
2
3
4
5
T 4.6.2 0
1
2
3
4
5
T 4.6.3 0
1
2
3
4
5
T 4.7.19 0
1
2
3
4
5
6

SOC-CMM - Services Domain


S1 - Security Monitoring answer
S 1.1 0
1
2
3
4
5
S 1.3 0
1
2
3
4
5
S 1.4 0
1
2
3
4
5
S 1.5 0
1
2
3
4
5
S 1.6 0
1
2
3
4
5
S 1.7 0
1
2
3
4
5
S 1.8 0
1
2
3
4
5
S 1.9 0
1
2
3
4
5
S 1.10 0
1
2
3
4
5
S 1.11 0
1
2
3
4
5
S 1.12 0
1
2
3
4
5
S 1.13 0
1
2
3
4
5
S 1.14 0
1
2
3
4
5
S 1.15 0
1
2
3
4
5

S 2 - Security incident Management answer


S 2.2 0
1
2
3
4
5
S 2.3 0
1
2
3
4
5
S 2.5 0
1
2
3
4
5
S 2.6 0
1
2
3
4
5
S 2.7 0
1
2
3
4
5
S 2.8 0
1
2
3
4
5
S 2.9 0
1
2
3
4
5
S 2.10 0
1
2
3
4
5
S 2.11 0
1
2
3
4
5
S 2.12 0
1
2
3
4
5
S 2.13 0
1
2
3
4
5
S 2.14 0
1
2
3
4
5
S 2.15 0
1
2
3
4
5
S 2.16 0
1
2
3
4
5
S 3 - Security Analysis answer
S 3.1 0
1
2
3
4
5
S 3.3 0
1
2
3
4
5
S 3.4 0
1
2
3
4
5
S 3.5 0
1
2
3
4
5
S 3.6 0
1
2
3
4
5
S 3.7 0
1
2
3
4
5
S 3.8 0
1
2
3
4
5
S 3.9 0
1
2
3
4
5
S 3.10 0
1
2
3
4
5
S 3.11 0
1
2
3
4
5
S 3.12 0
1
2
3
4
5
S 3.12 0
1
2
3
4
5
S 3.13 0
1
2
3
4
5
S 3.14 0
1
2
3
4
5

S4 - Threat Intelligence answer


S 4.1 0
1
2
3
4
5
S 4.3 0
1
2
3
4
5
S 4.4 0
1
2
3
4
5
S 4.5 0
1
2
3
4
5
S 4.6 0
1
2
3
4
5
S 4.7 0
1
2
3
4
5
S 4.8 0
1
2
3
4
5
S 4.9 0
1
2
3
4
5
S 4.10 0
1
2
3
4
5
S 4.11 0
1
2
3
4
5
S 4.12 0
1
2
3
4
5
S 4.13 0
1
2
3
4
5
S 4.14 0
1
2
3
4
5

S5 - Hunting answer
S 5.1 0
1
2
3
4
5
S 5.2 0
1
2
3
4
5
S 5.4 0
1
2
3
4
5
S 5.5 0
1
2
3
4
5
S 5.6 0
1
2
3
4
5
S 5.7 0
1
2
3
4
5
S 5.8 0
1
2
3
4
5
S 5.9 0
1
2
3
4
5
S 5.10 0
1
2
3
4
5
S 5.11 0
1
2
3
4
5
S 5.12 0
1
2
3
4
5
S 5.13 0
1
2
3
4
5
S 5.14 0
1
2
3
4
5
S 5.15 0
1
2
3
4
5

S6 - Vulnerability Management answer


S 6.1 0
1
2
3
4
5
S 6.3 0
1
2
3
4
5
S 6.4 0
1
2
3
4
5
S 6.5 0
1
2
3
4
5
S 6.6 0
1
2
3
4
5
S 6.7 0
1
2
3
4
5
S 6.8 0
1
2
3
4
5
S 6.9 0
1
2
3
4
5
S 6.10 0
1
2
3
4
5
S 6.11 0
1
2
3
4
5
S 6.12 0
1
2
3
4
5
S 6.13 0
1
2
3
4
5
S 6.14 0
1
2
3
4
5

S7 - Log Management answer


S 7.1 0
1
2
3
4
5
S 7.3 0
1
2
3
4
5
S 7.4 0
1
2
3
4
5
S 7.5 0
1
2
3
4
5
S 7.6 0
1
2
3
4
5
S 7.7 0
1
2
3
4
5
S 7.8 0
1
2
3
4
5
S 7.9 0
1
2
3
4
5
S 7.10 0
1
2
3
4
5
S 7.11 0
1
2
3
4
5
S 7.12 0
1
2
3
4
5
S 7.13 0
1
2
3
4
5
S 7.14 0
1
2
3
4
5

Generic capability guidance

Generic capability guidance 0


1
2
3
4
5
6

Monitoring capability guidance 0


1
2
3
4
5
6
SOC-CMM - Business Domain
guidance

Business drivers are unknown


Basic awareness of business drivers
Some business drivers have been identified
Most business drivers have been identified
All business drivers are well known within the SOC

No documentation in place
Some ad-hoc information across documents
Basic documentation of business drivers
Single document, full description of business drivers
Document completed, approved and formally published

Business drivers are not part of decision making


Business drivers are referred to on an ad-hoc basis
Business drivers are occasionally used in decisions
Business drivers are used in most decisions
Business drivers are used in all relevant decisions

Service catalogue has not been checked for alignment


Alignment is performed on an ad-hoc basis
Alignment was performed but not maintained
Alignment is performed and maintained regularly
Every change in the catalogue is checked against drivers

Business drivers have not been validated


Basic awareness of SOC drivers exists among stakeholders
Stakeholders informally informed of business drivers
Alignment of SOC drivers with stakeholders is performed
Business drivers are formally validated by stakeholders
guidance

SOC customers are not known


Basic awareness of SOC customers
Some customers have been identified
Customers have mostly been identified
All customers are identified, including relevance and context

No documentation in place
Some ad-hoc information across documents
Basic documentation of SOC customers
Single document, full description of SOC customers
Document completed, approved and formally published

Output is the same for all customers


Output is somewhat contextualized
Some customers receive differentiated output
All important customers receive differentiated output
All customers receive specific output based on context and type

Contractual agreements not in place


No contract in place, ad-hoc agreements made
Basic contract in place, not formally signed off
Contract signed, but not regularly reviewed
Contract signed, approved by and regularly reviewed with customers

No updates sent to customers


Ad-hoc updates sent to some customers
Frequent updates sent to most customers
Periodical updates sent to all customers
Periodical updates sent and discussed with all customers

Customer satisfaction not measured or managed


Customer satisfaction managed in ad-hoc fashion
Customer satisfaction metrics defined, not applied structurally
Customer satisfaction measured structurally, not actively managed
Customer satisfaction fully managed and improved over time
guidance

No charter document in place


Some ad-hoc information across documents
Basic charter document created
Single charter, full description of SOC strategic elements
Charter completed, approved and formally published

Charter is never updated


Charter is updated on ad-hoc basis
Charter is updated on major changes in business strategy
Charter is regularly updated
Charter periodically updated and realigned with business strategy

Charter is not approved


Business / CISO has basic awareness of the charter
Business / CISO has full awareness of the charter
Business / CISO approves of the content, but not formally
Charter is formally approved by the business / CISO

Stakeholders are unfamiliar with the charter


Some stakeholders are aware of the charter, but not its contents
Some stakeholders are aware of the charter and its contents
All stakeholders are aware, not all stakeholders know its contents
All stakeholders are aware of the charter and its contents
guidance
SOC governance process is not in place
SOC governance is done in an ad-hoc fashion
Several governance elements are in place, but not structurally
Formal governance process is in place that covers most SOC aspects
Formal governance process is in place and covers all SOC aspects

No governance elements have been identified


Some governance elements are identified and governed ad-hoc
Some governance elements are identified and governed actively
Most governance elements are identified and actively governed
All elements are identified and actively governed

Cost management not in place


Costs visible, basic budget allocation in place
Costs fully visible and mostly managed, forecasting in place
Costs fully managed, not formally aligned with business stakeholders
Costs fully managed and formally aligned with business stakeholders

No governance document in place


Some ad-hoc information across documents
Basic governance document created
Single document, full description of governance elements
Governance document completed, approved and formally published

No governance meetings held


Governance meetings held in an ad-hoc fashion
Governance meetings regularly scheduled
Governance meetings at different levels scheduled and structured
Governance meetings at different levels scheduled, ToR formalised

Governance process is not reviewed


Governance process is reviewed in an ad-hoc fashion
Process is reviewed using a structured approach in an ad-hoc fashion
Process is regularly and informally reviewed and updated
Process is regularly and formally reviewed and updated with findings

Stakeholders are unfamiliar with the process


Some stakeholders are aware of the process, but not its details
Some stakeholders are aware of the process and its details
All stakeholders are aware, not all stakeholders know its details
All stakeholders are aware of the process and its details

No assessments are performed


The SOC is assessed in an ad-hoc fashion
The SOC is assessed using a structured approach in an ad-hoc fashion
The SOC is regularly and informally assessed
The SOC is regularly and formally assessed by a third party
No cooperation with other SOCs
Some ad-hoc informal information exchange
Some information exchange with external SOCs
Information exchanged regularly, cooperation not formalized
Continuous formalized active cooperation and information exchange
guidance

No formal information security policy in place


Information security policy in place, no mention of SOC activities
Policy in place, SOC activities mentioned without mandate
Policy in place, SOC activities mentioned in detail with mandate
Policy with SOC activities and mandate in place, actively communicated

No SOC policy in place


Some ad-hoc information across documents
Basic SOC policy document created
Single document, full description of SOC policy elements
SOC policy document completed, approved and formally published

SOC not consulted or informed of policy creation or policy updates


SOC informed of policy creation and updates only
SOC consulted before policy creation and updates, not actively involved
SOC consulted before policy creation and updates, performs reviews
Full involvement of the SOC in the creation of operational security policy

No reporting policy in place


Policy in place, no mention of the SOC
Policy in place, role of the SOC in security incidents mentioned
Policy in place with all SOC activities, no central point for reporting
Policy & central point for reporting in place, SOC part of the workflow

No policy is in place
Information regarding privacy is scattered across documents
A policy exists, but has not been accepted formally
A formal policy exists, its contents are known to all employees
A formal policy exists, its contents are accepted by all employees

Regulations are not known and the SOC is non-compliant


Some regulations are known and the SOC is non-compliant
Most regulations are known and the SOC is partially compliant
Regulations are fully known and the SOC is mostly compliant
Regulations are fully known and the SOC is fully compliant

There is no cooperation between the SOC and legal


There is some ad-hoc cooperation between SOC and legal
There is structural cooperation between SOC and legal
Alignment exists between SOC and legal
Full and regular alignment exists between SOC and legal

No privacy procedures in place


Some ad-hoc information across documents
Basic privacy procedure created
Single document, full description of privacy investigations
Procedure completed, approved and formally published

The SOC is unaware of any information


The SOC is aware of such information, no formal identification
The SOC is fully aware, some information is formally identified
Most privacy related information is identified and documented
All privacy related information is identified and documented

PIAs are not conducted


PIAs are conducted in an ad-hoc fashion
PIAs are conducted using a structured approach in an ad-hoc fashion
PIAs are conducted informally and regularly
PIAs are conducted formally and regularly

SOC-CMM - People Domain



The SOC is either heavily overstaffed or understaffed


The SOC is overstaffed or understaffed
The SOC is somewhat overstaffed or understaffed
The SOC mostly meets FTE requirements
The SOC is staffed to full satisfaction in terms of FTE requirements

There are either way too few or too many external employees
There are too few or too many external employees
The SOC has somewhat too many or too few external employees
The SOC mostly meets requirements for external employee FTE count
The external employee ratio meets all requirements

There are too many skills only present within the external employees
Some required skills are not present internally, and not transferred
Some required skills are not present internally, but being transferred
Most skills are covered with internal employees
All required skills are covered with internal employees as well

Not all positions filled, service delivery cannot be assured


Sufficient positions filled to ensure service delivery
All key positions filled
All positions currently filled, not meeting external ratio requirements
All positions currently filled, meeting external ratio requirements

There is no recruitment process in place


Recruitment is performed on an ad-hoc basis
A basic recruitment process is in place
A full recruitment process is in place, but not performing effectively
A full recruitment process is in place and performing effectively

No talent acquisition process in place


Talent acquisition is performed on an ad-hoc basis
A basic talent acquisition process is in place
A full acquisition process is in place, but not performing effectively
A full acquisition process is in place and performing effectively

KSAOs have not been created


KSAOs are used ad-hoc in staffing attempts
A basic standardized KSAO set is created
A full KSAO set is created, but not actively used in staffing
A full KSAO set is created, regularly updated and actively used in staffing

A safe environment is not actively created


Basic awareness of a safe environment exists, but is not implemented
A safe environment has been established, but not actively managed
A safe environment has been established, and actively managed
A safe and actively managed environment exists and is evaluated

No roles are used in the SOC


Some roles exist, but are not actively being used
Some roles exist, and are actively being used
All roles are fully in use, but not formalized
All roles are fully in use and formalized

No tiers exist within these roles


Some tiers exist, but are not actively being used
Some tiers exist, and are actively being used
All tiers are fully in use, but not formalized
All relevant roles are tiered and formalized

None of the roles meets FTE requirements


Some roles meet FTE requirements
Vital roles meet FTE requirements
All vital roles and most other roles meet FTE requirements
All roles fully meet FTE requirements

No hierarchy exists
A basic hierarchy exists, but is not fully operational
A basic hierarchy is in place and fully operational
A full hierarchy is in place, but not formalized
A full hierarchy is in place and formalized
No documentation in place
Some ad-hoc information across documents
Basic documentation of SOC roles
Single document, full description of SOC roles
Document completed, approved and formally published

Responsibilities not understood


Basic awareness of responsibilities
Responsibilities for some roles understood and adhered to
Responsibilities for all roles mostly understood and adhered to
Full understanding of responsibilities formalized in training sessions

No documentation in place
Some ad-hoc information across documents
Basic documentation of career progression for roles
Single document, full description of career progression for roles
Document completed, approved and formally published

Documentation is not reviewed


Documentation is reviewed ad-hoc, not using a structured approach
Documentation is reviewed ad-hoc, using a structured approach
Documentation is regularly and informally reviewed and updated
Documentation is regularly and formally reviewed and updated


No plan exists
A plan covering some roles is in place, but not operational
A plan covering some roles is in place and operational
A plan covering all roles is in place, but not formalized
A plan covering all roles is in place and formalized

No career progression process is in place


A process covering some roles is in place, but not operational
A process covering some roles is in place and operational
A process covering all roles is in place, but not formalized
A process covering all roles is in place and formalized

No talent management process in place


Talent management is performed on an ad-hoc basis
A basic talent management process is in place
A full process is in place, but not performing effectively
A full talent management process is in place and performing effectively

No diversity goals exist


Diversity goals are recognized but not defined
Diversity goals are defined but not formalized
Diversity goals have been formally defined but are not met
Diversity goals have been formally defined and are met

Team goals are not determined


Team goals are determined, but not formally documented
Team goals are determined, and formally documented
Team goals are determined and documented, but not tracked
Team goals are determined, approved and tracked regularly

Individual goals are not determined


Individual goals are determined, but not formally documented
Individual goals are determined, and formally documented
Individual goals are determined and documented, but not tracked
Individual goals are determined, approved and tracked regularly

No periodic evaluation is performed


Periodic evaluation is performed in an ad-hoc fashion
Periodic evaluation is performed in a structured fashion
Periodic evaluation is performed, but results are not used structurally
Periodic evaluation is performed, results are used for personal growth

No new hire process in place


New hire training is done in an ad-hoc fashion
A process is in place, but does not cover all aspects
An informal process covering people, process and technology is in place
A formal process covering people, process and technology is in place

Screening not performed


Basic screening performed in ad-hoc fashion
Basic screening procedure in place, applied structurally
Full screening procedure in place, applied structurally, not formalized
Formal screening procedure and background checks applied structurally

Employee satisfaction is not measured


Employee satisfaction is measured in an ad-hoc fashion
Satisfaction is usually measured, but not embedded in processes
Employee satisfaction is measured, not used for improvement
Employee satisfaction measured periodically and used for improvement

1-on-1 meetings are not held within the SOC


1-on-1 meetings are held on an ad-hoc basis
Informal 1-on-1 meetings are held periodically
Formal 1-on-1 meetings are regularly held, results are not structured
1-on-1 meetings are regularly held and used for coaching and growth

No team building exercises are performed


Exercises are performed in an ad-hoc fashion
Exercises are usually performed, but not embedded in processes
Exercises are regularly done, but not focused on improvement
Exercises are regularly done and focused on improving team dynamics

No MTS team building exercises are performed


MTS exercises are performed in an ad-hoc fashion
MTS exercises are usually performed, but not embedded in processes
MTS exercises are regularly done, but not focused on improvement
MTS exercises done regularly and focused on improving team dynamics

No periodic evaluation is performed


Periodic evaluation is performed in an ad-hoc fashion
Periodic evaluation is performed in a structured fashion
Periodic evaluation is performed, but results are not used structurally
Periodic evaluation is performed, results are used for team growth

Do you have a formal knowledge management process in place?
A knowledge management process is not in place
Knowledge management is done in an ad-hoc fashion
A basic process is in place, that covers some knowledge aspects
An informal process is in place that covers most knowledge aspects
A formal process is in place, covering all knowledge aspects
Do you have a skill matrix in place?
A skill matrix is not in place
A basic skill matrix is in place, but incomplete
A complete skill matrix is in place, not approved
A complete skill matrix is in place and approved, not regularly updated
A complete skill matrix is in place, approved and regularly updated
Is the skill matrix actively used for team and personal improvement?
Matrix not used for improvement
Matrix used for improvement in an ad-hoc fashion
Matrix used to improve some personal and team results
Matrix used to improve all personal and team results
Matrix used to improve personal and team results, improvements tracked
Do you have a knowledge matrix in place?
A knowledge matrix is not in place
A basic knowledge matrix is in place, but incomplete
A complete knowledge matrix is in place, not approved
A complete knowledge matrix is in place and approved, not regularly updated
A complete knowledge matrix is in place, approved and regularly updated
Is the knowledge matrix actively used to determine training needs?
Matrix not used for identification of training needs
Matrix used for training identification in an ad-hoc fashion
Matrix used to identify training needs, but not for all employees
Matrix used to identify all training needs, but not tracked for execution
Matrix structurally used to identify training needs, training tracked
Have you documented SOC team member abilities?
Documentation is not in place
Documentation only covers some employees' abilities
Documentation covers the most relevant abilities for the team
All employee abilities documented, but is not regularly updated
All employee abilities documented, and regularly updated
Do you regularly assess and revise the knowledge documentation?
Documentation is not reviewed
Documentation is reviewed ad-hoc, not using a structured approach
Documentation is reviewed ad-hoc, using a structured approach
Documentation is regularly and informally reviewed and updated
Documentation is regularly and formally reviewed and updated
Is there effective tooling in place to support knowledge management?
Tooling is not in place
Tooling is in place, but used in an ad-hoc fashion
Tooling is in place, and used regularly
Tooling is in place and use of the tool is embedded in processes
Tooling is in place and optimized for knowledge management purposes


A training program is not in place


A training program covering some roles is in place, but not operational
A training program covering some roles is in place and operational
A training program covering all roles is in place, but not formalized
A training program covering all roles is in place and formalized

A certification program is not in place


A certification program covering some roles is in place, but not operational
A certification program covering some roles is in place and operational
A certification program covering all roles is in place, but not formalized
A certification program covering all roles is in place and formalized

The programs are not connected


The programs are connected in an ad-hoc fashion
The programs are regularly used, but not embedded in processes
The programs are mostly aligned, but not formally
The programs are formally embedded in evaluation and progression

No budget is allocated
Insufficient budget is allocated for the team as a whole
Sufficient budget is allocated for the team as a whole
Employees have sufficient budget, not encouraged to attend training
Employees have sufficient budget, encouraged to attend training

No time is allocated
Insufficient time is allocated for the team as a whole
Sufficient time is allocated for the team as a whole
Employees have sufficient time, but not encouraged to attend training
Employees have sufficient time, and encouraged to attend training
Workshops are not held
Workshops are held in an ad-hoc fashion
Workshops are held periodically
Workshops are held regularly, not aligned with knowledge & training
Workshops are held regularly and aligned with knowledge & training

Programs are not reviewed


Programs are reviewed ad-hoc, not using a structured approach
Programs are reviewed ad-hoc, using a structured approach
Programs are regularly and informally reviewed and updated
Programs are regularly and formally reviewed and updated

SOC-CMM - Process Domain


Is there a SOC management process in place?
A SOC management process is not in place
SOC management is done in an ad-hoc fashion
A basic process is in place, that covers some aspects
An informal process is in place that covers most aspects
A formal process is in place, covering all aspects
Are SOC management elements formally identified and documented?
No documentation in place
Some ad-hoc information across documents
Basic documentation of SOC management process
Single document, full description of SOC management process
Document completed, approved and formally published
Is the SOC management process regularly reviewed?
SOC management process is not reviewed
SOC management process is reviewed in an ad-hoc fashion
Process is reviewed using a structured approach in an ad-hoc fashion
Process is regularly and informally reviewed and updated
Process is regularly and formally reviewed and updated with findings
Is the SOC management process aligned with relevant stakeholders?
Stakeholders are unfamiliar with the process
Some stakeholders are aware of the process, but not its details
Some stakeholders are aware of the process and its details
All stakeholders are aware, not all stakeholders know its details
All stakeholders are aware of the process and its details
Have you implemented a process for continuous improvement?
CI process not implemented
CI conducted in an ad-hoc fashion
CI conducted structurally, not documented
CI conducted structurally, following a defined process
CI conducted structurally, following an approved and measured process
Have you implemented a process to manage SOC quality assurance?
QA process not implemented
QA conducted in an ad-hoc fashion
QA conducted structurally, not documented
QA conducted structurally, following a defined process
QA conducted structurally, following an approved and measured process
Have you implemented a process to manage SOC architecture?
Architecture process not in place
Architecture conducted in an ad-hoc fashion
Basic documentation of SOC architecture, principles defined
SOC architecture process completed, not consistently used in decisions
SOC architecture process approved and used actively in decision making
Do you have a documented exercise plan?
No exercise plan in place
Some ad-hoc information across documents
Basic description of SOC exercises
Single document, full description of SOC exercises
Exercises plan completed, approved and formally published
Do you perform security operations exercises?
No security operations exercises are performed
Exercises are performed on an ad-hoc basis
Exercises are sometimes performed in a structured manner
Informal structured exercises are performed regularly
Formal structured exercises are performed regularly
Are the results from exercises documented?
Results not documented
Results documented in an ad-hoc fashion
Results documented in a single document
Results documented using a defined template
Results documented, reviewed and formally approved
Is the output from exercises actively used to improve the SOC?
Output not used for improvement purposes
Improvements done in an ad-hoc fashion
Improvements determined and documented
Improvements determined and formally approved
Improvements determined, formally assigned, implementation tracked

No standard operating procedures are in place


Only vital procedures are in place
Most procedures are in place
All procedures are in place, not optimized through feedback
All procedures are in place, up to date and optimized for performance

No checklists are in place


A basic checklist is used in an ad-hoc fashion
Checklists are in place, but not used consistently
Checklists are used consistently, but not formally signed off
Checklists are used consistently and formally signed off

Workflows are not in place


Some ad-hoc information across documents
Basic documentation of workflows
Single document, full description of workflows
Workflows are completed, approved and formally published

An operational handbook is not in place


Some ad-hoc information across documents
Basic documentation of SOC tasks & rules
Single document, full description of SOC tasks & rules
Handbook is completed, approved and formally published

An OPSEC program is not in place


Some ad-hoc information across documents regarding OPSEC
Consistent OPSEC documentation, no program
Comprehensive OPSEC program exists, not consistently enforced/maintained
OPSEC program is formally approved, enforced and regularly maintained

Process not integrated


Configuration management is executed in an ad-hoc fashion
Baselines established and documented
Configuration management is mostly automated
All configuration updates reflected in CMDB and security tooling

Process not integrated


Change management is executed in an ad-hoc fashion
Change management process in place, not structurally executed
Change management process in place, structurally executed
SOC follows change management, all changes embedded in monitoring

Process not integrated


Problem management is executed in an ad-hoc fashion
Problem management process in place, not structurally executed
Problem management process in place, structurally executed
Problem management is executed and reviewed for all problems

Process not integrated


Incident management is executed in an ad-hoc fashion
Incident management process in place, not structurally executed
Incident management process in place, structurally executed
Incident management is executed and reviewed for all incidents

Process not integrated


Asset management is executed in an ad-hoc fashion
Asset management is executed structurally, but not automated
Asset management is mostly automated
All asset management updates reflected in CMDB and security tooling

No dedicated physical location


Floorplan in place, not operationalized
SOC established on single floor
Dedicated but insecure location established
Dedicated secure location established, fully optimized for sec ops

No war room available


Room available, not equipped and not dedicated for the SOC
Room available, somewhat equipped but not dedicated for the SOC
Dedicated room available, not equipped for major incidents
Dedicated and fully equipped room available to the SOC

No dedicated network
Critical SOC components placed in separate network
Most SOC equipment in separate network, basic access controls in place
All SOC equipment in separate network, full access control in place
Dedicated SOC network in place, fully protected and monitored

Physical access controls not in place


Physical access controls in place, not dedicated for SOC
Dedicated access control in place using badges, access restricted
Dedicated access control in place using badges, access not reviewed
Access secured through badges, authorizations restricted and monitored

No secure storage facilities in place


Secure storage facilities in place, not dedicated for SOC
Dedicated secure storage in place, basic access control
Dedicated secure storage in place, granular access control, no monitoring
Access secured, granular access, regularly reviewed and monitored

No video wall in place


Single screen in place showing basic security information
Multiple screens in place showing basic static security information
Multiple screens in place, showing prioritized events and alerts
Video wall in place, fully optimized for real-time monitoring

Call-center capability not in place


Some basic communication means in place
Communication means in place, not separate from regular comms
Dedicated communication in place, separate from regular comms
Call-center in place, fully optimized for coordination & communication

No dedicated workstations in place


Workstations customized by individual analysts
Dedicated analyst workstations, toolset not standardized
Dedicated analyst workstations, toolset standardized but incomplete
Dedicated analyst workstations, optimized for monitoring & analysis

Remote working not enabled


Remote working enabled, no additional security measures for SOC
Remote working facilitated, additional security measures taken for SOC
Remote working facilitated, fully secured, not actively monitored
Remote working fully facilitated: secured, monitored and controlled

Shift schedules not in place


Basic schedule in place, not applied structurally
Basic schedule in place, applied structurally
Shift schedules in place, coverage mostly guaranteed for SOC roles
Shift schedules in place, guaranteeing full shift coverage for all roles

Shift schedules not created with vigilance in mind


Vigilance requirements understood, but not implemented
Vigilance requirements implemented, but not optimized
Shift schedule optimized for vigilance, but not regularly improved
Shift schedule optimized for vigilance, regularly evaluated and improved

No shift log in place


No central shift log in place, notes are kept, not always disseminated
Central shift log in place, but not used structurally
Shift log in place, used structurally but not checked for accuracy
Shift log in place, fully accurate and up to date

No shift turnover procedures in place


Some ad-hoc information across documents
Basic shift turnover procedure created
Single document, full description of shift turnover handling
Procedure completed, approved and formally published

No daily stand-up procedure in place


Stand-up carried out in an ad-hoc fashion and not regularly
Stand-up carried out regularly, but not in structured fashion
Structured stand-up procedure in place, not optimized
Stand-up procedure in place, executed daily, optimized for efficiency

No stand-by arrangements exist


Best-effort stand-by arrangement in place
Stand-by arrangement in place, not supported by tooling and not tested
Stand-by arrangements in place, supported by tooling, but not tested
Stand-by arrangements in place, supported by tooling and tested

No DMS in place
Documentation centralized on file shares
DMS in place, documentation updates not enforced
DMS in place, documentation updates and versions enforced
DMS in place, fully supporting SOC documentation requirements

No knowledge & collaboration platform in place


Knowledge & collaboration performed in an ad-hoc fashion
Platform in place, not dedicated, not restricted to SOC
Platform in place, not dedicated, restricted to SOC
Dedicated platform, fully supporting sec ops, integrated in ITSM process

importance

No reports are provided


Reports are provided in an ad-hoc fashion
Reports are provided regularly, not standardized
Reports are provided regularly and standardized using quality criteria
Reports are provided regularly, standardized and regularly optimized

Reports not tailored


Only basic customizations for customers applied
Customizations applied structurally to customer reports
Reports fully tailored to recipients, manual customization required
Reports fully tailored to recipients using automated templates

Reports not approved or reviewed


Informal report review conducted
Structural report review conducted
Reports regularly reviewed, not formally signed off by recipients
Reports regularly reviewed and formally signed off by recipients

No established reporting lines


Reports have limited dissemination
Reports have a standard distribution list
Report dissemination through standard reporting lines, not approved
Report dissemination through standard and approved reporting lines

Report templates not updated


Report templates updated in an ad-hoc fashion
Report templates regularly revised and updated
Report templates revised and updated using customer feedback
Report templates regularly updated and formally approved

No agreements exist
Informal agreements made, not applied structurally
Informal agreements made, applied structurally
Formal agreements exist, not measured
Formal agreements exist, metrics applied to reporting
Do you provide different types of reports to your customers?
Different reporting types not provided
Some reporting types provided
Most required reporting types provided
Required reporting types provided, not regularly evaluated
Required reporting types provided and regularly evaluated
Do you use different types of metrics in your reports?
Different metrics types not used
Some metric types used
Most required metric types used
Required metric types used, not regularly evaluated
Required metric types used and regularly evaluated

Advisories not provided


Advisories provided in an ad-hoc fashion
Advisories provided regularly
Advisories provided regularly, format discussed but not approved
Advisories provided regularly and format formally approved

Risk & impact assessment not performed


Risk & impact assessments performed in an ad-hoc fashion
Unstructured risk & impact assessments performed
Informal structured risk & impact assessment performed for advisories
Formal risk & impact assessment performed for all advisories

Follow-up of advisories not performed


Follow-up of advisories performed in an ad-hoc fashion
Follow-up performed for critical advisories
Follow-up performed for most advisories, aligned with ITSM processes
Follow-up performed for all advisories, aligned with ITSM processes

E&A not provided


E&A provided in an ad-hoc fashion
E&A provided in a structured manner
E&A provided through an established program
E&A provided and improved through an established program

Efforts not measured


Efforts measured in an ad-hoc fashion
Efforts measured in a structured fashion, output not used in improvement
Efforts measured in a structured fashion, output used in improvement
Efforts continuously being measured and optimized
Do you use communication templates?
No communication templates or standardization in place
Some communication standardized, no templates
Templates created, but not used consistently
Communication templates used consistently, embedded in technology
Communication templates formally approved and regularly reviewed
Do you have a communication matrix in place?
Communication matrix not in place
Communication information available across documents
Communication information available in a single matrix
Communication matrix formally published and known to stakeholders
Communication matrix in place, formally published and regularly reviewed
Is communication training (verbal/written) available?
Communication training not available
The need for training is identified, but no training has been selected
Training selected and made available to SOC personnel
Training selected and made available, actively promoted to SOC personnel
Communication training formal part of employee onboarding and evaluation

Are communication skills part of SOC role descriptions?
Communication skills not identified
Communication skills identified, but not documented
Communication skills documented in role description
Communication skills documented and approved, not evaluated
Communication skills formally documented and evaluated for employees


A use case management process is not in place


Use case management is done in an ad-hoc fashion
Basic process in place, not applied to all phases of the use case lifecycle
Informal process in place covering all aspects of the use case lifecycle
Formal process in place, covering all aspects of the use case lifecycle

No documentation in place
Some ad-hoc information across documents
Basic documentation of use cases
Single repository, full description of use cases
Repository completed, approved and actively maintained

Use cases not approved


Use cases not approved, but some are known to stakeholders
Use cases not approved, all critical use cases known to stakeholders
All important use cases approved by relevant stakeholders
All use cases formally approved by relevant stakeholders

Use case management process not aligned


Alignment done in an ad-hoc fashion
Alignment done regularly, but not in a structured fashion
Alignment done structurally and regularly with relevant processes
Use case management process fully aligned with relevant processes

Use case not created using a standardized approach


Use cases created in a structured but undocumented fashion
Use cases mostly created in a structured and documented fashion
All use cases created using a standardized but unapproved approach
All use cases created using a standardized and approved approach

Use cases not created using a top-down approach


Use case creation performed in an ad-hoc fashion
Use cases created in a structured top-down way, SOC context only
Use cases created top-down, based on risk and business context
All use cases created top-down, full risk and business alignment

No traceability exists
Traceability is possible for some use cases, but requires manual effort
Traceability is possible for all use cases, but requires manual effort
Full traceability exists in documentation, not validated by stakeholders
Full traceability exists in documentation, validated by stakeholders

No traceability exists
Traceability is possible for some use cases, but requires manual effort
Traceability is possible for all use cases, but requires manual effort
Full traceability exists in documentation, not validated by stakeholders
Full traceability exists in documentation, validated by stakeholders

No metrics applied to use cases


Some ad-hoc measurements regarding use cases take place
Basic quantitative metrics in place for critical use cases
Metrics applied to all use cases, no risk-based feedback loop
Metrics applied to all use cases, used to guide risk-based use case growth

No scoring or prioritization applied


Scoring and prioritization applied in an ad-hoc manner
Scoring and prioritization applied structurally to critical use cases
Scoring and prioritization applied structurally to all use cases
All use cases scored and prioritized, validated & reviewed by stakeholders

Use cases are not reviewed


Use cases are reviewed ad-hoc, not using a structured approach
All critical use cases are reviewed using a structured approach
All use cases are regularly and informally reviewed and updated
All use cases are regularly and formally reviewed and updated
Do you measure use cases against the MITRE ATT&CK® framework?
Use cases not measured against MITRE ATT&CK®
High risk use cases measured against MITRE ATT&CK®
High- and medium risk use cases measured against MITRE ATT&CK®
All use cases frequently measured, output used in improvement
All use cases continuously measured, output used in improvement
Are monitoring rules tagged with MITRE ATT&CK® techniques?
Monitoring rules not tagged
High risk monitoring rules tagged
High- and medium risk monitoring rules tagged
All monitoring rules tagged, not regularly revised
All monitoring rules tagged and regularly revised
Have you created a MITRE ATT&CK® risk profile?
MITRE ATT&CK® profile not created
MITRE ATT&CK® analysis performed, no formal profile in place
MITRE ATT&CK® profile created, not validated or maintained
MITRE ATT&CK® profile created and validated, not regularly maintained
MITRE ATT&CK® profile created, validated and regularly maintained
Have you prioritized MITRE ATT&CK® techniques?
ATT&CK® techniques not prioritized
ATT&CK® analysis performed, no formal prioritization created
ATT&CK® techniques prioritized, not validated or maintained
ATT&CK® techniques prioritized and validated, not regularly maintained
ATT&CK® techniques prioritized, validated and regularly updated
Is use case output (alerts) used in threat intelligence activities?
Use case output not used in TI activities
Use case output used in TI activities in an ad-hoc fashion
Use case output used in TI activities for high-risk alerts
Use case output used in TI activities for most alerts, no formal process
Formal process in place, connecting use case output to TI activities
Is threat intelligence used for the creation and updating of use cases?
TI not used in use case creation / updates
TI used in use case creation / updates in an ad-hoc fashion
TI used in use case creation / updates for high-risk threats
TI used in use case creation / updates for most threats, no formal process
Formal process in place, connecting TI activities to use cases
Do you determine and document visibility requirements for your use cases?
Visibility requirements not determined
Visibility requirements determined for some use cases
Visibility requirements determined for most use cases, not documented
Visibility requirements formally determined and documented, not reviewed
Visibility requirements formally determined, documented and reviewed
Do you measure visibility status for your use cases?
Visibility status not measured
Visibility status measured in an ad-hoc fashion
Visibility status measured, not actively used in improvement
Visibility status measured frequently, output used in improvement
Visibility status continuously measured, output used in improvement
Do you map data source visibility to the MITRE ATT&CK® framework?
Data sources not mapped to MITRE ATT&CK®
Data source mapping to ATT&CK® performed in an ad-hoc fashion
Data sources mapped to ATT&CK®, not actively used in improvement
Data sources frequently mapped to ATT&CK®, output used in improvement
Data sources continuously mapped to ATT&CK®, output used in improvement
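The data source mapping described above can be expressed as a simple coverage calculation: each ATT&CK® technique requires certain data source types, and a technique is covered only when all of its required sources are actually ingested. A minimal sketch follows; the technique-to-source requirements and source names are illustrative assumptions, not taken from the official ATT&CK® data-source mappings.

```python
# Sketch: compute MITRE ATT&CK(R) technique coverage from a data source
# mapping. The required-source sets below are illustrative assumptions.
required_sources = {
    "T1059": {"process-creation"},                      # Command and Scripting Interpreter
    "T1021": {"authentication-logs", "network-flow"},   # Remote Services
    "T1566": {"email-gateway"},                         # Phishing
}
ingested = {"process-creation", "authentication-logs", "network-flow"}

# A technique is covered when all of its required sources are ingested.
covered = {t for t, srcs in required_sources.items() if srcs <= ingested}
coverage = len(covered) / len(required_sources)
print(sorted(covered), f"{coverage:.0%}")  # → ['T1021', 'T1059'] 67%
```

At higher maturity levels, this calculation would run continuously against the live log pipeline, and the uncovered techniques would feed the improvement backlog.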

Do you have a detection engineering process in place?
A detection engineering process is not in place
Detection engineering is done in an ad-hoc fashion
Basic process in place, not applied to all use cases
Informal process in place covering all use cases
Formal process in place, covering all use cases
Is the detection engineering process formally documented?
No documentation in place
Some ad-hoc information across documents
Basic documentation of detection engineering process
Single document, full description of detection engineering process
Document completed, approved and formally published
Are there specific roles and requirements for detection engineering?
No specific roles and requirements
Requirements identified, not formalised in roles
Requirements identified, role defined but not documented
Requirements identified, role defined and documented
Roles formally documented, approved and regularly revised
Is there active cooperation between the SOC analysts and the detection engineers?
No cooperation between teams
Cooperation between teams on an ad-hoc basis
SOC analysts are informed, no further cooperation
SOC analysts are informed and review outcomes
SOC analyst are actively involved in the detection engineering process
Is there active cooperation between the threat analysts and the detection engineers?
No cooperation between teams
Cooperation between teams on an ad-hoc basis
Threat analysts are informed, no further cooperation
Threat analysts are informed and review outcomes
Threat analyst are actively involved in the detection engineering process
Is there a formal hand-over to the analyst team?
Formal handover not in place
Handover performed in an ad-hoc manner
Handover performed, process not documented or formalised
Handover performed, process documentation in place
Formal handover procedure in place, documented and regularly evaluated
Is there a testing environment to test and validate new detections?
Testing environment not in place
Testing environment in place, not actively used for detection engineering
Testing environment used, testing process not documented or formalised
Testing environment used, testing process documented
Testing environment used, process documented and regularly evaluated
Is there a formal release process in place for new detections?
Release process not in place
Releases performed in an ad-hoc manner
Releases done structurally, process not documented or formalised
Releases done structurally, process documentation in place
Formal release procedure in place, documented and regularly evaluated
Do you apply a versioning system to detections?
Versioning system not in place
Versioning system in place, not actively used
Versioning system used for some detections
Versioning system used for all detections, no formal commit procedure
Versioning system used for all detections, commit procedure formalised
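The versioning levels above assume a detection-as-code workflow: each rule lives in version control, and every change bumps a version as part of a formal commit procedure. A minimal sketch, assuming a hypothetical rule format and a simple semantic-versioning convention (logic changes bump MINOR, tuning fixes bump PATCH) — not part of the SOC-CMM itself:

```python
# Minimal detection-as-code sketch: each detection rule carries a semantic
# version that is bumped on every change, mimicking a formal commit procedure.
# The rule format, field names and versioning convention are assumptions.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class DetectionRule:
    rule_id: str
    query: str
    version: str  # semantic version: MAJOR.MINOR.PATCH

def bump_version(version: str, change: str) -> str:
    """Bump the version: 'logic' changes are MINOR, tuning fixes are PATCH."""
    major, minor, patch = (int(p) for p in version.split("."))
    if change == "logic":
        return f"{major}.{minor + 1}.0"
    return f"{major}.{minor}.{patch + 1}"

def commit_change(rule: DetectionRule, new_query: str, change: str) -> DetectionRule:
    """Return a new rule revision; in a real workflow this would be a git commit."""
    return replace(rule, query=new_query, version=bump_version(rule.version, change))

rule = DetectionRule("ps-encoded-cmd", "process == 'powershell.exe'", "1.0.0")
rule = commit_change(rule, "process == 'powershell.exe' and cmdline contains '-enc'", "logic")
print(rule.version)  # → 1.1.0
```

Keeping every revision immutable, as the frozen dataclass does here, is also what makes the roll-back procedure in the next question straightforward: rolling back is simply redeploying the previous version.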
Do you have a roll-back procedure in place in case a release fails?
Roll-back procedure not in place
Roll-back procedure requirements understood, but not operationalized
Roll-back capability in place, but not documented
Roll-back capability in place and documented
Formal roll-back capability in place, documented and regularly tested
Do you perform adversary emulation?
Validation activities not performed
Validation activities performed in an ad-hoc fashion
Validation activities performed structurally, no documented process
Validation activities performed structurally following a documented process
Validation activities fully aligned with TI and continuously improved
Do you test for detection of MITRE ATT&CK® techniques?
Use case testing not in place
Use case testing performed in ad-hoc fashion, no detection targets set
Some use case testing performed, detection targets set, no formal process
All use cases tested, process formalized, detection targets set
All use cases tested, visibility and detection targets used in improvements
Do you test use cases not directly associated with MITRE ATT&CK® techniques?
Use case testing not in place
Use case testing performed in ad-hoc fashion, no detection targets set
Some use case testing performed, detection targets set, no formal process
All use cases tested, process formalized, detection targets set
All use cases tested, visibility and detection targets used in improvements
Do you test response playbooks?
Response playbooks not tested
Response playbooks tested in an ad-hoc fashion
Some response playbooks tested, no formal process
Response playbooks tested structurally following a documented process
All response playbooks formally tested, output used for improvements
Is ADT/AE fully integrated in the detection engineering process?
New releases do not trigger ADT/AE
New releases trigger ADT/AE in an ad-hoc fashion
Release process triggers ADT/AE for some use cases, not documented
Releases process triggers ADT/AE for all use cases, documented process
Full integration into release process, formalized and (partly) automated
Is the outcome from the ADT/AE tests used as input for improvement?
ADT/AE outcome not used
ADT/AE outcome used in an ad-hoc fashion
ADT/AE outcome used, no documented process
ADT/AE outcome used, documented process
ADT/AE outcome used, process documented and regularly evaluated
Do you monitor the data ingestion status for log sources?
Data ingestion status not monitored
Data ingestion status monitored in an ad-hoc fashion
Data ingestion status monitored, not complete for all data source types
Data ingestion status monitored for all log sources, failures result in alerts
Data ingestion status monitored, following a defined resolution process
Do you actively monitor and improve data source coverage?
Data source coverage not measured
Data source coverage measured in an ad-hoc fashion
Data source coverage measured, not complete for all data source types
Data source coverage structurally measured and improved on
Data source coverage structurally improved on following a defined process
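The ingestion monitoring described above boils down to a freshness check: for each log source, compare the timestamp of its most recent event against an allowed lag, and alert on anything stale. A minimal sketch, in which the source names and thresholds are illustrative assumptions:

```python
# Sketch of a data ingestion freshness check: flag log sources whose most
# recent event is older than an allowed lag, so failures result in alerts.
from datetime import datetime, timedelta

def stale_sources(last_event: dict, now: datetime, max_lag: timedelta) -> list:
    """Return the log sources whose latest event exceeds the allowed lag."""
    return sorted(src for src, ts in last_event.items() if now - ts > max_lag)

now = datetime(2024, 4, 19, 12, 0)
last_event = {
    "firewall": now - timedelta(minutes=5),          # healthy
    "domain-controller": now - timedelta(hours=3),   # stale -> should alert
    "edr": now - timedelta(minutes=1),               # healthy
}
print(stale_sources(last_event, now, max_lag=timedelta(hours=1)))
# → ['domain-controller']
```

At the higher maturity levels, each flagged source would additionally trigger a defined resolution process rather than just an alert, and the per-source lag thresholds would be tuned to the expected event rate of each source type.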

SOC-CMM - Technology Domain


guidance

Functional ownership not assigned


Some elements of ownership identified, not described
All elements of ownership described, not assigned
Functional ownership fully described and assigned, not approved
Functional ownership fully described, assigned and formally approved

Technical ownership not assigned


Some elements of ownership identified, not described
All elements of ownership described, not assigned
Technical ownership fully described and assigned, not approved
Technical ownership fully described, assigned and formally approved

No documentation in place
Some ad-hoc information across documents
Basic documentation of the SIEM system in place
Single document, full technical description of SIEM system
Document completed, approved and formally published

No documentation in place
Some ad-hoc information across documents
Basic documentation of the SIEM system in place
Single document, full functional description of SIEM system
Document completed, approved and formally published

No personnel for SIEM support


Personnel for support available, not dedicated or sufficient
Sufficient dedicated personnel available, not documented
Sufficient dedicated personnel available & documented, not formalized
Sufficient dedicated personnel available, documented and formalized

Personnel not formally trained


Product training identified, no training currently in place
Individual training, not part of the training program
Training part of training program, all key personnel trained
All personnel formally trained

Personnel not formally certified


Product certification identified, no certification currently in place
Individual certification, not part of the certification program
Certification part of certification program, all key personnel certified
All personnel formally certified
Support contract not in place
Basic support contract in place, not covering SOC requirements
Support contract in place, covering basic SOC requirements
Support contract in place, covering most SOC requirements
Support contract in place, covering all SOC requirements
Is the system regularly maintained?
System maintenance not performed
System maintenance done in an ad-hoc fashion
System maintenance done structurally, not following procedures
System maintenance done structurally, following procedures
Maintenance executed following approved procedures, regularly reviewed
Is remote maintenance on the system managed?
Remote maintenance not managed
Remote maintenance done in an ad-hoc fashion
Remote maintenance controlled, not documented
Remote maintenance controlled, in a documented process
Remote maintenance controlled & monitored, formally documented
Is maintenance executed through the change management process?
Maintenance performed without changes
Some maintenance executed through change management
All major maintenance executed through change management
All maintenance executed through changes, no formal approval
All maintenance executed through changes, with formal approval
Have maintenance windows been established?
Maintenance windows not established
Maintenance windows used sometimes
Maintenance windows established, not formally approved
Maintenance windows established and formally approved
Established, formally approved & aligned with change management
Is maintenance performed using authorised and trusted tooling?
Maintenance not performed using authorized & trusted tooling
Standardized tooling set used for most maintenance
Standardized tooling set used for all maintenance, not formally authorized
Authorized tooling used, updated before maintenance
Authorized & updated tooling used, regularly evaluated

HA not in place
HA requirements identified, not implemented
Manual actions required for achieving redundancy
Fully automated HA in place, not aligned with business continuity plans
Fully automated HA in place, aligned with business continuity plans

Data backup or replication not in place


Data backed up in ad-hoc fashion
Weekly backup routine in place
Daily backup routine in place
Real-time replication of data implemented
Configuration backup or replication not in place
Configuration backed up in ad-hoc fashion
Configuration backed up manually after each change
Daily backup routine in place
Real-time replication of configuration implemented

DR plan not in place


DR requirements identified, plan not yet in place
Basic DR plan in place
Full DR plan in place, not approved by business continuity stakeholders
Full DR plan in place, approved by business continuity stakeholders

DR plan not tested


DR plan tested on ad-hoc basis
DR plan tested, but not formally
DR plan regularly and fully tested, results not formally published
DR plan regularly tested, results formally published and followed up

Test environment not in place, testing not performed


Test environment not in place, testing performed in ad-hoc fashion
Separate test environment in place, not used structurally
Separate test environment with informal procedures in place
Separate test environment with formal procedures in place

Access to the system not restricted


Basic access control in place
Granular access rights implemented, not monitored
Granular access rights implemented and monitored, not audited
Granular access rights implemented, monitored and subjected to audit

Review of access rights not performed


Review of access rights performed in ad-hoc fashion
Access right review documented, but not executed structurally
Access rights reviewed periodically and structurally
Access rights reviewed periodically and after each change in employees

Break glass procedure and account not in place


Break glass account in place, no defined procedure
Break glass account and defined procedure in place
Break glass account and defined procedure in place, formally approved
Break glass account and procedure in place, approved & regularly tested

guidance

Functional ownership not assigned


Some elements of ownership identified, not described
All elements of ownership described, not assigned
Functional ownership fully described and assigned, not approved
Functional ownership fully described, assigned and formally approved

Technical ownership not assigned


Some elements of ownership identified, not described
All elements of ownership described, not assigned
Technical ownership fully described and assigned, not approved
Technical ownership fully described, assigned and formally approved

No documentation in place
Some ad-hoc information across documents
Basic documentation of the IDPS system in place
Single document, full technical description of IDPS system
Document completed, approved and formally published

No documentation in place
Some ad-hoc information across documents
Basic documentation of the IDPS system in place
Single document, full functional description of IDPS system
Document completed, approved and formally published

No personnel for IDPS support


Personnel for support available, not dedicated or sufficient
Sufficient dedicated personnel available, not documented
Sufficient dedicated personnel available & documented, not formalized
Sufficient dedicated personnel available, documented and formalized

Personnel not formally trained


Product training identified, no training currently in place
Individual training, not part of the training program
Training part of training program, all key personnel trained
All personnel formally trained

Personnel not formally certified


Product certification identified, no certification currently in place
Individual certification, not part of the certification program
Certification part of certification program, all key personnel certified
All personnel formally certified

Support contract not in place


Basic support contract in place, not covering SOC requirements
Support contract in place, covering basic SOC requirements
Support contract in place, covering most SOC requirements
Support contract in place, covering all SOC requirements
Is the system regularly maintained?
System maintenance not performed
System maintenance done in an ad-hoc fashion
System maintenance done structurally, not following procedures
System maintenance done structurally, following procedures
Maintenance executed following approved procedures, regularly reviewed
Is remote maintenance on the system managed?
Remote maintenance not managed
Remote maintenance done in an ad-hoc fashion
Remote maintenance controlled, not documented
Remote maintenance controlled, in a documented process
Remote maintenance controlled & monitored, formally documented
Is maintenance executed through the change management process?
Maintenance performed without changes
Some maintenance executed through change management
All major maintenance executed through change management
All maintenance executed through changes, no formal approval
All maintenance executed through changes, with formal approval
Have maintenance windows been established?
Maintenance windows not established
Maintenance windows used sometimes
Maintenance windows established, not formally approved
Maintenance windows established and formally approved
Established, formally approved & aligned with change management
Is maintenance performed using authorised and trusted tooling?
Maintenance not performed using authorized & trusted tooling
Standardized tooling set used for most maintenance
Standardized tooling set used for all maintenance, not formally authorized
Authorized tooling used, updated before maintenance
Authorized & updated tooling used, regularly evaluated

HA not in place
HA requirements identified, not implemented
Manual actions required for achieving redundancy
Fully automated HA in place, not aligned with business continuity plans
Fully automated HA in place, aligned with business continuity plans

Data backup or replication not in place


Data backed up in ad-hoc fashion
Weekly backup routine in place
Daily backup routine in place
Real-time replication of data implemented

Configuration backup or replication not in place


Configuration backed up in ad-hoc fashion
Configuration backed up manually after each change
Daily backup routine in place
Real-time replication of configuration implemented

DR plan not in place


DR requirements identified, plan not yet in place
Basic DR plan in place
Full DR plan in place, not approved by business continuity stakeholders
Full DR plan in place, approved by business continuity stakeholders

DR plan not tested


DR plan tested on ad-hoc basis
DR plan tested, but not formally
DR plan regularly and fully tested, results not formally published
DR plan regularly tested, results formally published and followed up

Test environment not in place, testing not performed


Test environment not in place, testing performed in ad-hoc fashion
Separate test environment in place, not used structurally
Separate test environment with informal procedures in place
Separate test environment with formal procedures in place

Access to the system not restricted


Basic access control in place
Granular access rights implemented, not monitored
Granular access rights implemented and monitored, not audited
Granular access rights implemented, monitored and subjected to audit

Review of access rights not performed


Review of access rights performed in ad-hoc fashion
Access right review documented, but not executed structurally
Access rights reviewed periodically and structurally
Access rights reviewed periodically and after each change in employees

Break glass procedure and account not in place


Break glass account in place, no defined procedure
Break glass account and defined procedure in place
Break glass account and defined procedure in place, formally approved
Break glass account and procedure in place, approved & regularly tested

guidance

Functional ownership not assigned


Some elements of ownership identified, not described
All elements of ownership described, not assigned
Functional ownership fully described and assigned, not approved
Functional ownership fully described, assigned and formally approved

Technical ownership not assigned


Some elements of ownership identified, not described
All elements of ownership described, not assigned
Technical ownership fully described and assigned, not approved
Technical ownership fully described, assigned and formally approved

No documentation in place
Some ad-hoc information across documents
Basic documentation of the analytics system in place
Single document, full technical description of analytics system
Document completed, approved and formally published

No documentation in place
Some ad-hoc information across documents
Basic documentation of the analytics system in place
Single document, full functional description of analytics system
Document completed, approved and formally published

No personnel for analytics support


Personnel for support available, not dedicated or sufficient
Sufficient dedicated personnel available, not documented
Sufficient dedicated personnel available & documented, not formalized
Sufficient dedicated personnel available, documented and formalized

Personnel not formally trained


Product training identified, no training currently in place
Individual training, not part of the training program
Training part of training program, all key personnel trained
All personnel formally trained

Personnel not formally certified


Product certification identified, no certification currently in place
Individual certification, not part of the certification program
Certification part of certification program, all key personnel certified
All personnel formally certified

Support contract not in place


Basic support contract in place, not covering SOC requirements
Support contract in place, covering basic SOC requirements
Support contract in place, covering most SOC requirements
Support contract in place, covering all SOC requirements
Is the system regularly maintained?
System maintenance not performed
System maintenance done in an ad-hoc fashion
System maintenance done structurally, not following procedures
System maintenance done structurally, following procedures
Maintenance executed following approved procedures, regularly reviewed
Is remote maintenance on the system managed?
Remote maintenance not managed
Remote maintenance done in an ad-hoc fashion
Remote maintenance controlled, not documented
Remote maintenance controlled, in a documented process
Remote maintenance controlled & monitored, formally documented
Is maintenance executed through the change management process?
Maintenance performed without changes
Some maintenance executed through change management
All major maintenance executed through change management
All maintenance executed through changes, no formal approval
All maintenance executed through changes, with formal approval
Have maintenance windows been established?
Maintenance windows not established
Maintenance windows used sometimes
Maintenance windows established, not formally approved
Maintenance windows established and formally approved
Established, formally approved & aligned with change management
Is maintenance performed using authorised and trusted tooling?
Maintenance not performed using authorized & trusted tooling
Standardized tooling set used for most maintenance
Standardized tooling set used for all maintenance, not formally authorized
Authorized tooling used, updated before maintenance
Authorized & updated tooling used, regularly evaluated

HA not in place
HA requirements identified, not implemented
Manual actions required for achieving redundancy
Fully automated HA in place, not aligned with business continuity plans
Fully automated HA in place, aligned with business continuity plans

Data backup or replication not in place


Data backed up in ad-hoc fashion
Weekly backup routine in place
Daily backup routine in place
Real-time replication of data implemented

Configuration backup or replication not in place


Configuration backed up in ad-hoc fashion
Configuration backed up manually after each change
Daily backup routine in place
Real-time replication of configuration implemented

DR plan not in place


DR requirements identified, plan not yet in place
Basic DR plan in place
Full DR plan in place, not approved by business continuity stakeholders
Full DR plan in place, approved by business continuity stakeholders

DR plan not tested


DR plan tested on ad-hoc basis
DR plan tested, but not formally
DR plan regularly and fully tested, results not formally published
DR plan regularly tested, results formally published and followed up

Test environment not in place, testing not performed


Test environment not in place, testing performed in ad-hoc fashion
Separate test environment in place, not used structurally
Separate test environment with informal procedures in place
Separate test environment with formal procedures in place

Access to the system not restricted


Basic access control in place
Granular access rights implemented, not monitored
Granular access rights implemented and monitored, not audited
Granular access rights implemented, monitored and subjected to audit

Review of access rights not performed


Review of access rights performed in ad-hoc fashion
Access right review documented, but not executed structurally
Access rights reviewed periodically and structurally
Access rights reviewed periodically and after each change in employees

Break glass procedure and account not in place


Break glass account in place, no defined procedure
Break glass account and defined procedure in place
Break glass account and defined procedure in place, formally approved
Break glass account and procedure in place, approved & regularly tested

guidance

Functional ownership not assigned


Some elements of ownership identified, not described
All elements of ownership described, not assigned
Functional ownership fully described and assigned, not approved
Functional ownership fully described, assigned and formally approved

Technical ownership not assigned


Some elements of ownership identified, not described
All elements of ownership described, not assigned
Technical ownership fully described and assigned, not approved
Technical ownership fully described, assigned and formally approved

No documentation in place
Some ad-hoc information across documents
Basic documentation of the analytics system in place
Single document, full technical description of analytics system
Document completed, approved and formally published

No documentation in place
Some ad-hoc information across documents
Basic documentation of the analytics system in place
Single document, full functional description of analytics system
Document completed, approved and formally published

No personnel for security automation & orchestration support
Personnel for support available, not dedicated or sufficient
Sufficient dedicated personnel available, not documented
Sufficient dedicated personnel available & documented, not formalized
Sufficient dedicated personnel available, documented and formalized

Personnel not formally trained
Product training identified, no training currently in place
Individual training, not part of the training program
Training part of training program, all key personnel trained
All personnel formally trained

Personnel not formally certified
Product certification identified, no certification currently in place
Individual certification, not part of the certification program
Certification part of certification program, all key personnel certified
All personnel formally certified

Support contract not in place
Basic support contract in place, not covering SOC requirements
Support contract in place, covering basic SOC requirements
Support contract in place, covering most SOC requirements
Support contract in place, covering all SOC requirements
Is the system regularly maintained?
System maintenance not performed
System maintenance done in an ad-hoc fashion
System maintenance done structurally, not following procedures
System maintenance done structurally, following procedures
Maintenance executed following approved procedures, regularly reviewed
Is remote maintenance on the system managed?
Remote maintenance not managed
Remote maintenance done in an ad-hoc fashion
Remote maintenance controlled, not documented
Remote maintenance controlled, in a documented process
Remote maintenance controlled & monitored, formally documented
Is maintenance executed through the change management process?
Maintenance performed without changes
Some maintenance executed through change management
All major maintenance executed through change management
All maintenance executed through changes, no formal approval
All maintenance executed through changes, with formal approval
Have maintenance windows been established?
Maintenance windows not established
Maintenance windows used sometimes
Maintenance windows established, not formally approved
Maintenance windows established and formally approved
Established, formally approved & aligned with change management
Is maintenance performed using authorised and trusted tooling?
Maintenance not performed using authorized & trusted tooling
Standardized tooling set used for most maintenance
Standardized tooling set used for all maintenance, not formally authorized
Authorized tooling used, updated before maintenance
Authorized & updated tooling used, regularly evaluated

HA not in place
HA requirements identified, not implemented
Manual actions required for achieving redundancy
Fully automated HA in place, not aligned with business continuity plans
Fully automated HA in place, aligned with business continuity plans

Data backup or replication not in place


Data backed up in ad-hoc fashion
Weekly backup routine in place
Daily backup routine in place
Real-time replication of data implemented

Configuration backup or replication not in place


Configuration backed up in ad-hoc fashion
Configuration backed up manually after each change
Daily backup routine in place
Real-time replication of configuration implemented

DR plan not in place


DR requirements identified, plan not yet in place
Basic DR plan in place
Full DR plan in place, not approved by business continuity stakeholders
Full DR plan in place, approved by business continuity stakeholders

DR plan not tested


DR plan tested on ad-hoc basis
DR plan tested, but not formally
DR plan regularly and fully tested, results not formally published
DR plan regularly tested, results formally published and followed up

Test environment not in place, testing not performed


Test environment not in place, testing performed in ad-hoc fashion
Separate test environment in place, not used structurally
Separate test environment with informal procedures in place
Separate test environment with formal procedures in place

Access to the security automation system not restricted


Basic access control in place
Granular access rights implemented, not monitored
Granular access rights implemented and monitored, not audited
Granular access rights implemented, monitored and subjected to audit
Review of access rights not performed
Review of access rights performed in ad-hoc fashion
Access right review documented, but not executed structurally
Access rights reviewed periodically and structurally
Access rights reviewed periodically and after each change in employees

Break glass procedure and account not in place


Break glass account in place, no defined procedure
Break glass account and defined procedure in place
Break glass account and defined procedure in place, formally approved
Break glass account and procedure in place, approved & regularly tested

No automation in playbooks
Enrichment playbooks only
Automation of triage activity
Automation used as decision support for remediation activities
Fully automated playbooks where possible
Not required for security operations

SOC-CMM - Services Domain


guidance

No documentation in place
Some ad-hoc information across documents
Basic documentation of service in place
Single document, full description of service
Document completed, approved and formally published

Service not measured for quality


Metrics defined, applied in an ad-hoc fashion
Metrics defined, applied in a structured but informal fashion
Metrics formalized and used in regular reports
Formal and approved metrics in place, feedback used for improvement

Service not measured


SLA defined, measured in an ad-hoc fashion
SLA defined, measured periodically but not reported
SLA compliance reported periodically, not discussed with customers
SLA compliance discussed with customers regularly for improvement

No updates sent to customers/stakeholders


Ad-hoc updates sent to some customers/stakeholders
Frequent updates sent to most customers/stakeholders
Periodical updates sent to all customers/stakeholders
Periodical updates sent and discussed with all customers/stakeholders

Contractual agreements not in place


No contract in place, ad-hoc agreements made
Basic contract in place, not formally signed off
Contract signed, but not regularly reviewed
Contract signed, approved by and regularly reviewed with customers

No personnel allocated
Personnel allocated, but not sufficient for required service delivery
Personnel allocated, not dedicated for this service
Sufficient dedicated personnel available, not fully trained and capable
Sufficient dedicated personnel available, trained and fully capable

Process not aligned


Alignment done in an ad-hoc fashion
Alignment done regularly, but not in a structured fashion
Alignment done structurally & regularly with most relevant processes
Alignment done structurally & regularly with all relevant processes

No such process in place


Continuity requirements identified, process not yet in place
Basic service continuity process in place
Full process in place, not approved by relevant stakeholders
Full process in place, formally approved by relevant stakeholders

No procedures in place
Basic procedures in place, used in an ad-hoc fashion
All procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and regularly reviewed

No procedures in place
Basic procedures in place, used in an ad-hoc fashion
Full procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and regularly reviewed

Best practices not applied


Best practices identified, but not applied
Best practices applied, but not structurally
Best practices applied to service architecture and service delivery
Best practices applied and adherence checked regularly

Use cases not used


Use cases undocumented and used in an ad-hoc fashion
Use cases documented and applied structurally
Use cases embedded in the security monitoring processes
Use cases fully embedded, tuning and Life Cycle Management applied

Service performance not measured


Goals set for service performance, measured ad-hoc
Goals set for service performance, measured structurally but informally
Goals set for service performance, measured structurally and formally
Continuous measurement to determine progress & adjust process

Improvement not done


Goals defined, but not pursued
Goals defined and pursued structurally, but not formalized
Goals formally defined and pursued structurally and periodically
Continuous improvement based on targets and feedback loops

guidance

Standard not adopted


Awareness of standards, used in ad-hoc fashion
Standard used structurally as reference during incident response
Many elements adopted, not fully aligned
Standard fully adopted, process set up and executed using standard

No documentation in place
Some ad-hoc information across documents
Basic documentation of service in place
Single document, full description of service
Document completed, approved and formally published

Service not measured for quality


Metrics defined, applied in an ad-hoc fashion
Metrics defined, applied in a structured but informal fashion
Metrics formalized and used in regular reports
Formal and approved metrics in place, feedback used for improvement

Service not measured


SLA defined, measured in an ad-hoc fashion
SLA defined, measured periodically but not reported
SLA compliance reported periodically, not discussed with customers
SLA compliance discussed with customers regularly for improvement

No updates sent to customers/stakeholders


Ad-hoc updates sent to some customers/stakeholders
Frequent updates sent to most customers/stakeholders
Periodical updates sent to all customers/stakeholders
Periodical updates sent and discussed with all customers/stakeholders

Contractual agreements not in place


No contract in place, ad-hoc agreements made
Basic contract in place, not formally signed off
Contract signed, but not regularly reviewed
Contract signed, approved by and regularly reviewed with customers
No personnel allocated
Personnel allocated, but not sufficient for required service delivery
Personnel allocated, not dedicated for this service
Sufficient dedicated personnel available, not fully trained and capable
Sufficient dedicated personnel available, trained and fully capable

Process not aligned


Alignment done in an ad-hoc fashion
Alignment done regularly, but not in a structured fashion
Alignment done structurally & regularly with most relevant processes
Alignment done structurally & regularly with all relevant processes

No mandate
Mandate requested in ad-hoc fashion during incident response
Mandate informally given, not supported by all stakeholders
Mandate given and supported by all stakeholders, not formalized
Full mandate, formally documented, approved and published

No procedures in place
Basic procedures in place, used in an ad-hoc fashion
Full procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and regularly reviewed

Best practices not applied


Best practices identified, but not applied
Best practices applied, but not structurally
Best practices applied to service architecture and service delivery
Best practices applied and adherence checked regularly

No workflows or scenarios in place


Some ad-hoc information across documents
Basic workflows in place, not covering all incident types
Workflows created for all incident types, not formalized
Formal workflows created, approved & published for all incident types

Service performance not measured


Goals set for service performance, measured ad-hoc
Goals set for service performance, measured structurally but informally
Goals set for service performance, measured structurally and formally
Continuous measurement to determine progress & adjust process

Improvement not done


Goals defined, but not pursued
Goals defined and pursued structurally, but not formalized
Goals formally defined and pursued structurally and periodically
Continuous improvement based on targets and feedback loops
guidance

No documentation in place
Some ad-hoc information across documents
Basic documentation of service in place
Single document, full description of service
Document completed, approved and formally published

Service not measured for quality


Metrics defined, applied in an ad-hoc fashion
Metrics defined, applied in a structured but informal fashion
Metrics formalized and used in regular reports
Formal and approved metrics in place, feedback used for improvement

Service not measured


SLA defined, measured in an ad-hoc fashion
SLA defined, measured periodically but not reported
SLA compliance reported periodically, not discussed with customers
SLA compliance discussed with customers regularly for improvement

No updates sent to customers/stakeholders


Ad-hoc updates sent to some customers/stakeholders
Frequent updates sent to most customers/stakeholders
Periodical updates sent to all customers/stakeholders
Periodical updates sent and discussed with all customers/stakeholders

Contractual agreements not in place


No contract in place, ad-hoc agreements made
Basic contract in place, not formally signed off
Contract signed, but not regularly reviewed
Contract signed, approved by and regularly reviewed with customers

No personnel allocated
Personnel allocated, but not sufficient for required service delivery
Personnel allocated, not dedicated for this service
Sufficient dedicated personnel available, not fully trained and capable
Sufficient dedicated personnel available, trained and fully capable

Process not aligned


Alignment done in an ad-hoc fashion
Alignment done regularly, but not in a structured fashion
Alignment done structurally & regularly with most relevant processes
Alignment done structurally & regularly with all relevant processes

No such process in place


Continuity requirements identified, process not yet in place
Basic service continuity process in place
Full process in place, not approved by relevant stakeholders
Full process in place, formally approved by relevant stakeholders

No procedures in place
Basic procedures in place, used in an ad-hoc fashion
All procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and fully operationalized

No procedures in place
Basic procedures in place, used in an ad-hoc fashion
Full procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and regularly reviewed

Best practices not applied


Best practices identified, but not applied
Best practices applied, but not structurally
Best practices applied to service architecture and service delivery
Best practices applied and adherence checked regularly

No workflows or scenarios in place


Some ad-hoc information across documents
Basic workflows in place, not covering all incident types
Workflows created for all incident types, not formalized
Formal workflows created, approved and published for all incident types

Service performance not measured


Goals set for service performance, measured ad-hoc
Goals set for service performance, measured structurally but informally
Goals set for service performance, measured structurally and formally
Continuous measurement to determine progress & adjust process

Improvement not done


Goals defined, but not pursued
Goals defined and pursued structurally, but not formalized
Goals formally defined and pursued structurally and periodically
Continuous improvement based on targets and feedback loops

guidance

No documentation in place
Some ad-hoc information across documents
Basic documentation of service in place
Single document, full description of service
Document completed, approved and formally published

Service not measured for quality


Metrics defined, applied in an ad-hoc fashion
Metrics defined, applied in a structured but informal fashion
Metrics formalized and used in regular reports
Formal and approved metrics in place, feedback used for improvement

Service not measured


SLA defined, measured in an ad-hoc fashion
SLA defined, measured periodically but not reported
SLA compliance reported periodically, not discussed with customers
SLA compliance discussed with customers regularly for improvement

No updates sent to customers/stakeholders


Ad-hoc updates sent to some customers/stakeholders
Frequent updates sent to most customers/stakeholders
Periodical updates sent to all customers/stakeholders
Periodical updates sent and discussed with all customers/stakeholders

Contractual agreements not in place


No contract in place, ad-hoc agreements made
Basic contract in place, not formally signed off
Contract signed, but not regularly reviewed
Contract signed, approved by and regularly reviewed with customers

No personnel allocated
Personnel allocated, but not sufficient for required service delivery
Personnel allocated, not dedicated for this service
Sufficient dedicated personnel available, not fully trained and capable
Sufficient dedicated personnel available, trained and fully capable

Process not aligned


Alignment done in an ad-hoc fashion
Alignment done regularly, but not in a structured fashion
Alignment done structurally & regularly with most relevant processes
Alignment done structurally & regularly with all relevant processes

No such process in place


Continuity requirements identified, process not yet in place
Basic service continuity process in place
Full process in place, not approved by relevant stakeholders
Full process in place, formally approved by relevant stakeholders

No procedures in place
Basic procedures in place, used in an ad-hoc fashion
All procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and fully operationalized

No procedures in place
Basic procedures in place, used in an ad-hoc fashion
Full procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and regularly reviewed

Best practices not applied


Best practices identified, but not applied
Best practices applied, but not structurally
Best practices applied to service architecture and service delivery
Best practices applied and adherence checked regularly

Service performance not measured


Goals set for service performance, measured ad-hoc
Goals set for service performance, measured structurally but informally
Goals set for service performance, measured structurally and formally
Continuous measurement to determine progress & adjust process

Improvement not done


Goals defined, but not pursued
Goals defined and pursued structurally, but not formalized
Goals formally defined and pursued structurally and periodically
Continuous improvement based on targets and feedback loops

guidance

Methodology not adopted


Awareness of methodologies, used in ad-hoc fashion
Methodologies used structurally as reference during hunting activities
Many elements adopted, not fully aligned
Methodology fully adopted, process set up and executed accordingly

No documentation in place
Some ad-hoc information across documents
Basic documentation of service in place
Single document, full description of service
Document completed, approved and formally published

Service not measured for quality


Metrics defined, applied in an ad-hoc fashion
Metrics defined, applied in a structured but informal fashion
Metrics formalized and used in regular reports
Formal and approved metrics in place, feedback used for improvement

Service not measured


SLA defined, measured in an ad-hoc fashion
SLA defined, measured periodically but not reported
SLA compliance reported periodically, not discussed with customers
SLA compliance discussed with customers regularly for improvement
No updates sent to customers/stakeholders
Ad-hoc updates sent to some customers/stakeholders
Frequent updates sent to most customers/stakeholders
Periodical updates sent to all customers/stakeholders
Periodical updates sent and discussed with all customers/stakeholders

Contractual agreements not in place


No contract in place, ad-hoc agreements made
Basic contract in place, not formally signed off
Contract signed, but not regularly reviewed
Contract signed, approved by and regularly reviewed with customers

No personnel allocated
Personnel allocated, but not sufficient for required service delivery
Personnel allocated, not dedicated for this service
Sufficient dedicated personnel available, not fully trained and capable
Sufficient dedicated personnel available, trained and fully capable

Process not aligned


Alignment done in an ad-hoc fashion
Alignment done regularly, but not in a structured fashion
Alignment done structurally & regularly with most relevant processes
Alignment done structurally & regularly with all relevant processes

No such process in place


Continuity requirements identified, process not yet in place
Basic service continuity process in place
Full process in place, not approved by relevant stakeholders
Full process in place, formally approved by relevant stakeholders

No procedures in place
Basic procedures in place, used in an ad-hoc fashion
All procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and fully operationalized

No procedures in place
Basic procedures in place, used in an ad-hoc fashion
Full procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and regularly reviewed

Best practices not applied


Best practices identified, but not applied
Best practices applied, but not structurally
Best practices applied to service architecture and service delivery
Best practices applied and adherence checked regularly
Service performance not measured
Goals set for service performance, measured ad-hoc
Goals set for service performance, measured structurally but informally
Goals set for service performance, measured structurally and formally
Continuous measurement to determine progress & adjust process

Improvement not done


Goals defined, but not pursued
Goals defined and pursued structurally, but not formalized
Goals formally defined and pursued structurally and periodically
Continuous improvement based on targets and feedback loops

guidance

No documentation in place
Some ad-hoc information across documents
Basic documentation of service in place
Single document, full description of service
Document completed, approved and formally published

Service not measured for quality


Metrics defined, applied in an ad-hoc fashion
Metrics defined, applied in a structured but informal fashion
Metrics formalized and used in regular reports
Formal and approved metrics in place, feedback used for improvement

Service not measured


SLA defined, measured in an ad-hoc fashion
SLA defined, measured periodically but not reported
SLA compliance reported periodically, not discussed with customers
SLA compliance discussed with customers regularly for improvement

No updates sent to customers/stakeholders


Ad-hoc updates sent to some customers/stakeholders
Frequent updates sent to most customers/stakeholders
Periodical updates sent to all customers/stakeholders
Periodical updates sent and discussed with all customers/stakeholders

Contractual agreements not in place


No contract in place, ad-hoc agreements made
Basic contract in place, not formally signed off
Contract signed, but not regularly reviewed
Contract signed, approved by and regularly reviewed with customers

No personnel allocated
Personnel allocated, but not sufficient for required service delivery
Personnel allocated, not dedicated for this service
Sufficient dedicated personnel available, not fully trained and capable
Sufficient dedicated personnel available, trained and fully capable

Process not aligned


Alignment done in an ad-hoc fashion
Alignment done regularly, but not in a structured fashion
Alignment done structurally & regularly with most relevant processes
Alignment done structurally & regularly with all relevant processes

No such process in place


Continuity requirements identified, process not yet in place
Basic service continuity process in place
Full process in place, not approved by relevant stakeholders
Full process in place, formally approved by relevant stakeholders

No procedures in place
Basic procedures in place, used in an ad-hoc fashion
All procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and fully operationalized

No procedures in place
Basic procedures in place, used in an ad-hoc fashion
Full procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and regularly reviewed

Best practices not applied


Best practices identified, but not applied
Best practices applied, but not structurally
Best practices applied to service architecture and service delivery
Best practices applied and adherence checked regularly

Service performance not measured


Goals set for service performance, measured ad-hoc
Goals set for service performance, measured structurally but informally
Goals set for service performance, measured structurally and formally
Continuous measurement to determine progress & adjust process

Improvement not done


Goals defined, but not pursued
Goals defined and pursued structurally, but not formalized
Goals formally defined and pursued structurally and periodically
Continuous improvement based on targets and feedback loops

guidance

No documentation in place
Some ad-hoc information across documents
Basic documentation of service in place
Single document, full description of service
Document completed, approved and formally published

Service not measured for quality


Metrics defined, applied in an ad-hoc fashion
Metrics defined, applied in a structured but informal fashion
Metrics formalized and used in regular reports
Formal and approved metrics in place, feedback used for improvement

Service not measured


SLA defined, measured in an ad-hoc fashion
SLA defined, measured periodically but not reported
SLA compliance reported periodically, not discussed with customers
SLA compliance discussed with customers regularly for improvement

No updates sent to customers/stakeholders


Ad-hoc updates sent to some customers/stakeholders
Frequent updates sent to most customers/stakeholders
Periodical updates sent to all customers/stakeholders
Periodical updates sent and discussed with all customers/stakeholders

Contractual agreements not in place


No contract in place, ad-hoc agreements made
Basic contract in place, not formally signed off
Contract signed, but not regularly reviewed
Contract signed, approved by and regularly reviewed with customers

No personnel allocated
Personnel allocated, but not sufficient for required service delivery
Personnel allocated, not dedicated for this service
Sufficient dedicated personnel available, not fully trained and capable
Sufficient dedicated personnel available, trained and fully capable

Process not aligned


Alignment done in an ad-hoc fashion
Alignment done regularly, but not in a structured fashion
Alignment done structurally & regularly with most relevant processes
Alignment done structurally & regularly with all relevant processes

No such process in place


Continuity requirements identified, process not yet in place
Basic service continuity process in place
Full process in place, not approved by relevant stakeholders
Full process in place, formally approved by relevant stakeholders

No procedures in place
Basic procedures in place, used in an ad-hoc fashion
All procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and fully operationalized

No procedures in place
Basic procedures in place, used in an ad-hoc fashion
Full procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and regularly reviewed

Best practices not applied


Best practices identified, but not applied
Best practices applied, but not structurally
Best practices applied to service architecture and service delivery
Best practices applied and adherence checked regularly

Service performance not measured


Goals set for service performance, measured ad-hoc
Goals set for service performance, measured structurally but informally
Goals set for service performance, measured structurally and formally
Continuous measurement to determine progress & adjust process

Improvement not done


Goals defined, but not pursued
Goals defined and pursued structurally, but not formalized
Goals formally defined and pursued structurally and periodically
Continuous improvement based on targets and feedback loops

Not in place
Partially implemented, incomplete
Averagely implemented, partially documented
Mostly implemented, documented and approved
Fully implemented, documented, approved, actively improved
Not required for SOC operations

Not in place
Log sources connected, basic monitoring
Specific use cases defined and operationalised
Use cases, playbooks and procedures defined and implemented
Fully implemented, performance measured and improved
Not required for SOC operations
knowledge management process in place?

atrix in place?

vely used for team and personal improvement?

edge matrix in place?

trix actively used to determine training and education needs?

d SOC team member abilities?


ss and revise the knowledge management process?

ing in place to support knowledge documentation and distribution?


ement process in place?

t elements formally identified and documented?

ent process regularly reviewed?

ent process aligned with all stakeholders?

ed a process for continuous improvement?

ed a process to manage SOC quality assurance?


ed a process to manage SOC quality assurance?

mented exercise plan?

rity operations exercises regularly?

exercises documented?

ercises actively used to improve security operations?


ent types of reports to your recipients?

types of metrics in your reports?


cation templates?

unication matrix in place?

ining (verbal/written) available for SOC personnel?


kills part of SOC role descriptions?
cases against the MITRE ATT&CK® framework for gap analysis purposes?

tagged with MITRE ATT&CK® framework identifiers?

MITRE ATT&CK® risk profile for your organization?


MITRE ATT&CK® techniques for relevance?

erts) used in threat intelligence activities?

used for the creation and updates of use cases?

d document visibility requirements for each use case?

ility status for your use cases for gap analysis purposes?

rce visibility to the MITRE ATT&CK® framework?

tion engineering process in place?

neering process formally documented?


es and requirements for detection engineers?

ration between the SOC analysts and the detection engineers?

ration between the Threat Intelligence analysts and detection engineers?

d-over to the analyst team?

roment to test and validate detections before deploying them?

ase process in place for new detections?

ning system to detections?

ck procedure in place in case of problems with detections?


rsary emulation?

tion of MITRE ATT&CK®techniques?

s not directly associated with MITRE ATT&CK®?

playbooks?

ated in the detection engineering release process?

the ADT/AE tests used as input into monitoring and detection engineering?

ata ingestion status for log sources?

toring and improve data source coverage?


y maintained?

ce on the system managed?

uted through the change management process?

indows been established?

rmed using authorised and trusted tooling?


y maintained?
ce on the system managed?

uted through the change management process?

indows been established?

rmed using authorised and trusted tooling?


y maintained?

ce on the system managed?

uted through the change management process?


indows been established?

rmed using authorised and trusted tooling?


y maintained?

ce on the system managed?

uted through the change management process?

indows been established?

rmed using authorised and trusted tooling?


question type answer options
Yes/No 1 No
2 Yes
optional 3 Not required

Yes/No/Unknown answer options


1 No
2 Unknown
3 Yes
4 Not required

Detailed 1 No
2 Partially
3 Averagely
4 Mostly
5 Fully
Optional 6 Not required

Completeness 1 Incomplete
2 Partially complete
3 Averagely complete
4 Mostly complete
5 Fully complete

Importance 1 None
2 Low
3 Normal
4 High
5 Critical

Weighing 1 x1
2 x2
3 x3
4 x4
5 x5

Occurrence 1 Never
2 Sometimes
3 Averagely
4 Mostly
5 Always

Satisfaction 1 No
2 Somewhat
3 Averagely
4 Mostly
5 Fully

Assessment type quick scan


scoped assessment
initial assessment (baseline)
progress assessment

Assessment style self-assessment


guided self-assessment
3rd party assessment

Business size 1-49


50-249
250-999
1,000-4,999
5,000-9,999
10,000-50,000
50,000+

Sector Aerospace industry


Agriculture
Chemical / pharmaceutical
Computer / software
Consulting
Defense
Education
Energy / Utilities
Entertainment
Health Care
Finance
Insurance
Service Provider
Telecommunications
Other

SOC size 1-5


5-10
10-50
50-100
100+

SOC region Africa


Asia
Australia/New Zealand
Canada
Europe
South America
North America
Middle East

SOC model Distributed SOC


Centralised SOC
Federated SOC
Coordinating SOC
Hierarchical SOC
National SOC
MSSP SOC
Hybrid SOC

SOC operations Regional


National
Continental
Global
score matrix 1 - criticality versus occurrence
never sometimes mostly always
1 2 3 4
none 1 0 0 0 0
low 2 -1 0 0 1
normal 3 -2 -1 1 2
high 4 -4 -2 2 4
critical 5 -8 -4 4 8

score matrix 2 - criticality versus completeness


incomplete partially mostly fully
1 2 3 4
none 1 0 0 0 0
low 2 -1 0 0 1
normal 3 -2 -1 1 2
high 4 -4 -2 2 4
critical 5 -8 -4 4 8

score matrix 3 - criticality versus existence


no yes
1 2
none 1 0 0
low 2 -1 1
normal 3 -2 2
high 4 -4 4
critical 5 -8 8

score matrix 4 - scoring factors for criticality


factor
none 1 0
low 2 0.5
normal 3 1
high 4 2
critical 5 4
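The score matrices above are plain lookup tables: a criticality level (rows 1-5) combined with an occurrence, completeness, or existence rating (columns) yields a positive or negative score. As an illustration only (a sketch of the lookup mechanic, not the workbook's actual formulas), score matrix 1 could be coded as:

```python
# Sketch of a score-matrix lookup (assumed mechanic, not the official
# SOC-CMM workbook formulas). Row/column labels mirror the tables above.

CRITICALITY = {"none": 1, "low": 2, "normal": 3, "high": 4, "critical": 5}
OCCURRENCE = {"never": 1, "sometimes": 2, "mostly": 3, "always": 4}

# Score matrix 1 - criticality (rows) versus occurrence (columns).
SCORE_MATRIX_1 = [
    [0,  0,  0, 0],   # none
    [-1, 0,  0, 1],   # low
    [-2, -1, 1, 2],   # normal
    [-4, -2, 2, 4],   # high
    [-8, -4, 4, 8],   # critical
]

def score(criticality: str, occurrence: str) -> int:
    """Return the matrix-1 score for a criticality/occurrence pair."""
    row = CRITICALITY[criticality] - 1
    col = OCCURRENCE[occurrence] - 1
    return SCORE_MATRIX_1[row][col]

print(score("critical", "never"))  # -8: a critical item never in place
print(score("normal", "mostly"))   # 1
```

Matrices 2 and 3 follow the same pattern with completeness and existence as the column axis, and matrix 4 applies a multiplicative factor (0 to 4) per criticality level.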

You might also like