Index
Click on any section name to proceed directly to that part of the assessment.

Domain        Section                                    Questions remaining
Introduction  1. Introduction                            N/A
              2. Usage                                   N/A
              3. Change notes                            N/A
General       1. Profile                                 N/A
              2. Scope                                   N/A
Business      1. Business drivers                        0/5
              2. Customers                               0/6
              3. Charter                                 0/4
              4. Governance                              0/9
              5. Privacy & Policy                        0/10
People        1. Employees                               0/8
              2. Roles and Hierarchy                     0/8
              3. People Management                       0/14
              4. Knowledge Management                    0/8
              5. Training & Education                    0/7
Process       1. SOC Management                          0/7
              2. Operations and Facilities               0/31
              3. Reporting & Communication               0/17
              4. Use Case Management                     0/20
              5. Detection Engineering & Validation      0/18
Technology    1. SIEM / UEBA                             0/60
              2. NDR                                     0/0
              3. EDR                                     0/71
              4. SOAR                                    0/46
Services      1. Security Monitoring                     0/41
              2. Security Incident Management            0/50
              3. Security Analysis and Forensics         0/38
              4. Threat Intelligence                     44/44
              5. Threat Hunting                          17/35
              6. Vulnerability Management                30/33
              7. Log Management                          1/33
Results       1. Results                                 N/A
              2. NIST CSF Scoring                        N/A
              3. Results Sharing                         N/A
Next steps    1. Next steps                              N/A
Introduction
1. Introduction
2. Usage
3. Change notes
General information
Author Rob van Os
Site https://www.soc-cmm.com/
Contact info@SOC-CMM.com
Version 2.3.3, basic version
Date April 19th, 2024
Assessment training https://www.soc-cmm.com/services/training/
Background
The SOC-CMM is a capability maturity model that can be used to perform a self-assessment of your Security Operations Center (SOC). The model is based on a review conducted on literature regarding SOC setup and existing SOC models, as well as literature on specific elements of a SOC. The initial analysis was validated by questioning several Security Operations Centers in different sectors and on different maturity levels. The output from the survey, combined with the initial analysis, is the basis for this self-assessment.
For more information regarding the scientific background and the literature used to create the SOC-CMM self-assessment tool, please refer to the thesis document available through: https://www.soc-cmm.com/
If you have any questions or comments regarding the contents of this document, please use the above contact information.
Purpose and intended audience
The purpose of the SOC-CMM is to gain insight into the strengths and weaknesses of the SOC. This enables the SOC to determine which elements of the SOC require additional attention and/or budget. By regularly assessing the SOC for maturity and capability, progress can be tracked over time.
Besides the primary purpose of performing an assessment of the SOC, the assessment can also be used for extensive discussions about the SOC and can thus provide valuable insights.
This tool is intended for use by SOC and security managers, experts within the SOC and SOC consultants.
Navigation
Navigation through this tool is done using the navigation bar at the top of each page. Each of the numbered sections can be clicked to proceed directly to that section. Furthermore, the icons can be used to navigate through sections within a domain and between domains.
Assessment Model
The assessment model consists of 5 domains and 25 aspects. All domains are evaluated for maturity (blue); only technology and services are evaluated for both maturity and capability (purple).
Maturity Levels
CMMI defines maturity as a means for an organization "to characterize its performance" for a specific entity (here: the SOC).
The SOC-CMM calculates a maturity score using 6 maturity levels:
- Level 0: non-existent
- Level 1: initial
- Level 2: managed
- Level 3: defined
- Level 4: quantitatively managed
- Level 5: optimizing
These maturity levels are measured across 5 domains: business, people, process, technology and services. The maturity levels as implemented in this tool are not staged with pre-requisites for each level. Instead, every element adds individually to the maturity score: a continuous maturity model.
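To illustrate how a continuous (non-staged) maturity score behaves, the sketch below averages element scores and labels the result with the nearest level name. This is a hedged illustration only: the actual SOC-CMM calculations live in the Excel workbook's '_Output' sheet, and the function names here are hypothetical.

```python
# Illustrative sketch only; the SOC-CMM tool performs its calculations in Excel.
# Assumption: each element is scored 0-5 and the continuous maturity score is
# the plain average of element scores (no prerequisites per level).

MATURITY_LEVELS = {
    0: "non-existent",
    1: "initial",
    2: "managed",
    3: "defined",
    4: "quantitatively managed",
    5: "optimizing",
}

def continuous_maturity(element_scores):
    """Average element scores (each 0-5) into one continuous maturity score."""
    if not element_scores:
        return 0.0
    return sum(element_scores) / len(element_scores)

def nearest_level(score):
    """Name the maturity level closest to a continuous score."""
    return MATURITY_LEVELS[min(5, max(0, round(score)))]

score = continuous_maturity([3.75, 2.5, 5.0, 1.25])
print(round(score, 3), nearest_level(score))  # 3.125 defined
```

Because the model is continuous, a single weak element lowers the score gradually instead of capping the level, which matches the "every element adds individually" behaviour described above.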
Capability Levels
Capabilities are indicators of completeness. In essence, capabilities can support maturity.
The SOC-CMM calculates a capability score using 4 capability levels, similar to CMMi:
- Level 0: incomplete
- Level 1: performed
- Level 2: managed
- Level 3: defined
These capability levels have a strong technical focus and are measured across 2 domains: technology and services. Like the maturity score, the capability level is continuous. There are no prerequisites for advancing to a higher level; thus, capability growth is also continuous.
Disclaimer
The SOC-CMM is provided without warranty of any kind. The author of the document cannot assure its accuracy and is not liable for any cost as a result of decisions based on the output of this tool. The usage of this tool does not in any way entitle the user to support or consultancy. By using this tool, you agree to these conditions.
License
Copyright (C) 2024 - SOC-CMM
The SOC-CMM assessment tool is free software, released under the CC BY-SA license: https://creativecommons.org/licenses/by-sa/4.0/
You are free to share and adapt the material for any purpose, even commercially.
Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
ShareAlike — If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original.
No additional restrictions — You may not apply legal terms or technological measures that legally restrict others from doing anything the license permits.
This license is acceptable for Free Cultural Works. The licensor cannot revoke these freedoms as long as you follow the license terms.
This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
The evaluation is based on questions that can be answered using a drop-down that presents a 5-point scale. This scale relates to the maturity level as explained below under 'Scoring mechanism'. This tool should be used by assessing each sheet in order: first, the profile sheet is filled in and the scope for assessment is selected; then, the 5 domains are each evaluated in separate sections of this tool. When all domains are completed, the sheet 'Results' will provide you with the total scoring and detailed scoring for each domain. A sheet 'Next steps' is also included to provide pointers for follow-up.
In the advanced version only, there is also a weighing mechanism in place. For each question, the importance of that element can be changed. The standard importance is 'normal', which means that the score is not modified. Changing the importance to 'low' will cause the element to have less impact on the score. Changing it to 'High' or 'Critical' will cause the element to have more impact on the score. Setting it to 'none' will ignore the element in scoring entirely, as explained under 'Weighing mechanism'. This feature should be used with care.
Scoring mechanism
Each question that is part of the maturity scoring can be answered by selecting one of 5 options. These options vary based on the type of question. For example, for questions regarding completeness, the following applies:
- Incomplete, score: 0
- Partially complete, score: 1.25
- Averagely complete, score: 2.5
- Mostly complete, score: 3.75
- Fully complete, score: 5
As indicated, the score can be modified by using the weighing mechanism (use with care).
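The interaction between the answer scale above and the advanced-version weighing mechanism could be sketched as a weighted average. The weight values below are assumptions for illustration; the advanced workbook's exact weighting formula is not reproduced here.

```python
# Hypothetical sketch; the real weighing lives in the advanced Excel workbook
# and its exact weight values may differ from the assumptions below.

ANSWER_SCORES = {
    "Incomplete": 0.0,
    "Partially complete": 1.25,
    "Averagely complete": 2.5,
    "Mostly complete": 3.75,
    "Fully complete": 5.0,
}

# Assumed weights: 'low' reduces impact, 'high'/'critical' increase it,
# and 'none' removes the question from scoring entirely.
IMPORTANCE_WEIGHTS = {"none": 0.0, "low": 0.5, "normal": 1.0, "high": 1.5, "critical": 2.0}

def weighted_maturity(answers):
    """answers: (answer_text, importance) pairs -> weighted average score, 0-5."""
    total = weight_sum = 0.0
    for answer, importance in answers:
        weight = IMPORTANCE_WEIGHTS[importance]
        if weight == 0.0:  # importance 'none': question is ignored
            continue
        total += ANSWER_SCORES[answer] * weight
        weight_sum += weight
    return total / weight_sum if weight_sum else 0.0

print(weighted_maturity([
    ("Fully complete", "normal"),
    ("Partially complete", "high"),
    ("Incomplete", "none"),  # excluded, as described for importance 'none'
]))  # 2.75
```

Note how the 'none' answer drops out of both the numerator and the denominator, so an ignored question neither raises nor lowers the result.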
Guidance
For each of the maturity questions, guidance is available. When a value is selected from the dropdown box, guidance for that value is shown under the guidance column. This guidance can be used to help determine the correct level. Note that this is truly meant as guidance on interpretation and scoring, not as mandatory and prescriptive.
Customization
The SOC-CMM is built using standard Excel features without macros. The sheets are not locked or password protected. Therefore, adding columns and rows and applying other changes such as changing guidance or adding elements is possible. To show columns and rows, go to 'View' --> 'Show' and check 'Headings'. To show all the tabs underlying the SOC-CMM, go to 'File' --> 'Options' --> 'Advanced' --> 'Display options' and check 'Show sheet tabs'. Customizing calculations will require understanding of the calculations done in the '_Output' sheet. However, guidance for this type of customization is not provided, as it requires an understanding of the way the SOC-CMM sheets are connected.
Some questions serve as a guideline for answering other questions. These elements have a lighter colour. For example, the completeness of the elements in 3.2 (not part of the maturity score) is used as a guideline for scoring 3.1. Completeness outcomes are calculated automatically by filling in those parts of the assessment.
Capabilities do not have a 5-point scale and an importance, but use a 6-point scale. The additional point in the scale is 'not required'. Use this if you do not feel like you need that particular capability.
Answering the questions in this self-assessment will likely uncover some improvements. Therefore, it is important to strongly consider and possibly document why you wish to deviate on certain elements. The goal is to gain insight into strengths and weaknesses and to improve the SOC, not to obtain the highest score.
The time required varies depending on the level of detail you put into the assessment. Before you start, ensure sufficient time is reserved. Alternatively, have a knowledgeable SOC employee perform a quick scan and subsequently focus on areas that need attention to reduce the assessment effort.
The answer options map to scores to allow for granular scoring; the exact mapping can be found on the SOC-CMM site.
Introduction
1. Introduction
2. Usage
3. Change notes
Version 2.3.x
Business domain:
Governance
4.3.13: included SOC risk management into governance elements
4.7: Added question on governance meetings
4.7 - 4.10: renumbered to 4.8 - 4.11
Privacy & Policy:
5.2: SOC policy added as a new question, 5.3 outlines the elements in the policy
5.4 - 5.11: renumbered from 5.2 - 5.9
People domain:
Roles & hierarchy
2.2.3: changed security specialist to forensic analyst
2.2.12: added detection engineer
2.2.13: added automation engineer
Knowledge management:
Skill matrix questions aggregated into two questions
Knowledge matrix questions aggregated into two questions
Renumbering applied to section
Process domain:
SOC management:
1.2: guidance updated for this question
1.6: added a question on continuous improvement
1.7: added a question on quality assurance
1.8: added 3 questions on SOC architecture
Operations and facilities:
2.1: security operations exercises turned into a separate section, additional questions inserted
2.1.1: renumbered to 2.1.3
2.1.2 - 2.5.2: renumbered to 2.2.1 - 2.6.2
Technology domain:
All technologies: confidentiality section changed to access control
All technologies: break glass procedure added to access control
All technologies: maintenance (x.4) updated to 'maintenance & configuration', question x.4.3 updated to
EDR (new)
New section for the SOC-CMM
SOAR
Capabilities restructured
Capabilities from previous version renumbered and re-ordered
Services domain:
All services: removed CMMI level references, as they were somewhat confusing and did not add any value
Security incident response
2.17.6: changed question from 'password reset procedure' to more generic 'incident containment'
Navigation improvements
Scrolling error fixed for sheets that did not allow scrolling with the scroll wheel
Clicking on SOC-CMM elements in the results section navigates directly to that section
Added lines in each comments section to document the rationale for choosing a certain value
Backend improvements:
Index calculations improved to exclude questions with importance set to 'none' and provide a better overview
Guidance added for new questions
Generic guidance modified to better reflect capability levels
Version 2.3.1
Fixed description in the profile section
Version 2.3.2
Fixed reference bug in guidance for 1 capability
Version 2.3.3
Content changes
Business domain\Privacy & Policy
Added comments fields for the last 2 questions
Technology domain\EDR
Removed duplicate capability (attack surface management), applied renumbering
Results section
Mapping to NIST CSF 2.0 implemented
Results sharing resource embedded into assessment sheets
Version 2.2
Business domain:
Governance
question 4.10 added (external SOC cooperation)
Privacy & Policy:
questions 5.1, 5.2 and 5.3 added (security policy)
question 5.4: additional NIST mapping applied
People domain:
Employees:
questions 1.9 and 1.10 added (KSAOs)
People management:
questions 3.5 and 3.6 added, renumbering applied (team goals and tracking of goals)
questions 3.13 and 3.14 added (multi-team systems and team performance)
Knowledge management:
question 4.4.1 added, renumbering applied (employee abilities)
Process domain:
Operations and facilities:
question 2.1.6 added (OPSEC program)
questions 2.3.2, 2.3.5, 2.3.9 added, renumbering applied (war room, physical storage, remot
question 2.4.2 added, renumbering applied (vigilance)
Reporting: (changed to reporting & communication)
question 3.8.6 added (proactive & reactive metrics)
questions 3.10.1 and 3.10.2 added (education & awareness)
question 3.11 added (communication)
Use case management:
question 4.1.9 (testing use cases) moved to detection engineering, renumbering applied
section 4.2 added (MITRE ATT&CK®)
section 4.3 added (visibility)
Detection Engineering & Validation:
completely new section
Technology domain:
All technologies: maintenance and support removed from capabilities, and moved to maturity (section x.4)
Services domain:
All services: question about onboarding procedure included
Threat Intelligence
question 4.14.25 added, renumbering applied (threat landscaping)
question 4.14.31 added (CTI infrastructure management)
Backend improvements:
calculations improved and simplified
Index updated from percentage completed to remaining questions
generic guidance applied for all capabilities (technology & services domain)
guidance added for new questions
License updated:
CC BY-SA 4.0 license replaces previous GPLv3 license
Please fill in the information below to create a short profile of the SOC and the assessment
Assessment Details
Date of assessment 5/11/2024
Name(s)
Department(s) Estrategias
Scope
Assessment type
Assessment style
Notes or comments
5/18/2024
1.000-4.999
Consulting
3
5
Distributed SOC (click on the dropdown box for more information)
South America
Regional
3, 3, 3, 3, 3 (indicate a score from 1 to 5; decimals can be used)
Please select the services and technologies that should be included into the assessment. Excluding a service or technology will exclude it from scoring. Note: changes to these values take some time to process.
Remarks
Security Information and Event Management tooling. Used to gather logging information from company assets and correlate events. Also includes User and Entity Behaviour Analytics (UEBA)
Network security solution, used to detect network exploits and anomalous network activity and perform network forensics
End-point security solution, used to prevent, detect and respond to threats on end-points
Used to automate workflows and SOC actions, support incident response and orchestrate between different security products
Remarks
The security monitoring service aims at detecting security incidents and events
The security incident management service aims at responding to security incidents in a timely, accurate and organized fashion
The security analysis service supports security monitoring and security incident management. Analysis includes event analysis and forensic analysis
The threat intelligence service provides information about potential threats that can be used in security monitoring, security incident response, security analysis and threat hunting
The hunting service takes a proactive approach to finding threats in the infrastructure. Threat intelligence is often used to guide hunting efforts
The vulnerability management service is used to detect vulnerabilities in assets by discovery and actively scanning assets for known vulnerabilities
The log management service is used to collect, store and retain logging. Can be used for compliance purposes as well as investigation purposes
1 Business Drivers
1.1 Have you identified the main business drivers?
1.2 Have you documented the main business drivers?
1.3 Do you use business drivers in the decision making process?
1.4 Do you regularly check if the current service catalogue is aligned with business drivers?
1.5 Have the business drivers been validated with business stakeholders?
Answer
1.1
1.2
1.3
1.4
1.5
Guidance
Most business drivers have been identified
Basic documentation of business drivers
Business drivers are occasionally used in decisions
Every change in the catalogue is checked against drivers
Alignment of SOC drivers with stakeholders is performed
Remarks
Example business drivers: cyber crime prevention, risk reduction, law / regulation, audit / compliance, business continuity
Documentation of business drivers is important for demonstrable business alignment
e.g. to determine priorities or make decisions regarding the on-boarding of new services or operations
i.e. do you check for services or operations that are outside the scope of business drivers?
Business stakeholders can be C-level management
Business
1. Business Drivers
2. Customers
3. Charter
4. Governance
5. Privacy & Policy
2 Customers
2.1 Have you identified the SOC customers?
2.2 Please specify your customers:
2.2.1 Legal
2.2.2 Audit
2.2.3 Engineering / R&D
2.2.4 IT
2.2.5 Business
2.2.6 External customers
2.2.7 (Senior) Management
2.2.8 Other customers:
Answer
2.1
2.3
2.4
2.5
2.6
2.7
Guidance
All customers are identified, including relevance and context
Formal registration of customer contact details, place in the organization, geolocation, etc.
For example, are communication style and contents to Business customers different than that to IT?
Service level agreements are used to provide standardized services operating within known boundaries
For example: changes in service scope or delivery. Can also be reports, dashboards, etc.
Understanding customer satisfaction will help to better align with business needs
3 Charter
3.1 Does the SOC have a formal charter document in place?
3.2 Please specify elements of the charter document:
3.2.1 Mission
3.2.2 Vision
3.2.3 Strategy
3.2.4 Service Scope
3.2.5 Deliverables
3.2.6 Responsibilities
3.2.7 Accountability
3.2.8 Operational Hours
3.2.9 Stakeholders
3.2.10 Objectives / Goals
3.2.11 Statement of success
Completeness
3.3 Is the SOC charter document regularly updated?
3.4 Is the SOC charter document approved by the business / CISO?
3.5 Are all stakeholders familiar with the SOC charter document contents?
Answer
Mostly complete
3.1
3.3
3.4
3.5
Guidance
Single charter, full description of SOC strategic elements
A SOC mission should be established to provide insight into the reason for existence of the SOC
A vision should be created to determine long-term goals for the SOC
A strategy should be in place to show how to meet goals and targets set by mission and vision
Service scope is documented to provide insight into SOC service delivery
The output provided by the SOC, for example: reports, incidents, investigations, advisories, etc.
Responsibilities of the SOC
Accountability for the SOC for actions taken
Operational hours of the SOC
All relevant stakeholders for the SOC
Objectives and goals should be concrete and measurable so that they are fit for reporting purposes
A statement of success is used to determine when the SOC is successful. Should be aligned with goals and objectives
Use this outcome as a guideline to determine the score for 3.1
Regularity should be matched to your own internal policy. At least yearly is recommended
Approval from the relevant stakeholders will aid in business support for SOC operations
Making stakeholders aware of the contents helps in getting organizational support for security operations
4 Governance
4.1 Does the SOC have a governance process in place?
4.2 Have all governance elements been identified?
4.3 Please specify identified governance elements
4.3.1 Business Alignment
4.3.2 Accountability
4.3.3 Sponsorship
4.3.4 Mandate
4.3.5 Relationships & Third Party Management
4.3.6 Vendor Engagement
4.3.7 Service Commitment
4.3.8 Project / Program Management
4.3.9 Continual Improvement
4.3.10 Span of control / federation governance
4.3.11 Outsourced service management
4.3.12 SOC KPIs & Metrics
4.3.13 SOC risk management
4.3.14 Customer Engagement / Satisfaction
Completeness
4.4 Is cost management in place?
4.5 Please specify cost management elements
4.5.1 People cost
4.5.2 Process cost
4.5.3 Technology cost
4.5.4 Services cost
4.5.5 Facility cost
4.5.6 Budget forecasting
4.5.7 Budget alignment
4.5.8 Return on investment
Completeness
4.6 Are all governance elements formally documented?
4.7 Are SOC governance meetings regularly held?
4.8 Is the governance process regularly reviewed?
4.9 Is the governance process aligned with all stakeholders?
4.10 Is the SOC regularly audited or subjected to (external) assessments?
4.11 Is there an active cooperation with other SOCs (external)?
Answer
Fully complete
Fully complete
4.1
4.2
4.4
4.6
4.7
4.8
4.9
4.10
4.11
Guidance
Several governance elements are in place, but not structurally
Some governance elements are identified and governed actively
Costs associated with employees. Should be managed to prove FTE requirements to stakeholders
Cost associated with processes. Should be managed to ensure process elements can be delivered
Cost associated with technology. Should be managed to prove budget requirements for new technology or replacement
Cost associated with service delivery. Especially important for managed service providers to ensure a healthy business model
Cost associated with facilities used by the SOC
Forecasting of required budget over time. Should be aligned with business needs; increased spending must be justified
Alignment of budget with business requirements and drivers to ensure balanced spending on the SOC
Prove the return on investment to stakeholders to ensure continued budget allocation
Use this outcome as a guideline to determine the score for 4.4
Formal documentation should be signed off and stored in a quality management system
Meetings at different levels (operational, tactical, strategic) should be formalised in Terms of Reference (ToR) and driven b
Frequency should be matched to your own internal policy. At least yearly is recommended
Alignment will help the SOC obtain required mandate, budget and management support
Frequency should be matched to the SOC policy. At least yearly is recommended. 3rd party assessments have a higher objectivity
Exchange of best practices, intelligence and actions on threats with other SOCs is vital for improving cyber defence
5 Privacy & Policy
Answer
Averagely complete
5.1
5.2
5.4
5.5
5.6
5.7
5.8
5.9
5.10
5.11
Guidance
Policy in place, SOC activities mentioned in detail with mandate
Some ad-hoc information across documents
How to behave in the SOC, mandatory and optional meetings, SOC culture, repercussions for non-compliance, etc.
What activities can and can not be performed as part of the job
What documentation is subjected to review and how often that documentation needs to be reviewed
How often the SOC is assessed and in what manner (self-assessment, audit, external assessment, etc.)
The means (meetings and platforms) for knowledge exchange and rules for maintenance of knowledge bases
Frequency and type (table top, cyber range, red team, etc.) of exercises in the SOC
Information exchange protocols. Especially important for collaborations outside the organisation
Agreements on length of meetings, length of agile sprints, transparency of completed work, etc.
Use this outcome as a guideline to determine the score for 5.2
Consulting the SOC in the creation of security policy will ensure that SOC activities are properly mentioned and enforceable
A reporting policy for security incidents will aid the SOC in identifying incidents and threats in the organization
A privacy policy should state that monitoring of employees is possible within acceptable limits
Local laws and regulations as well as company policy may apply and should all be considered
Cooperation will ensure that the SOC is enabled to perform activities, rather than blocked
Privacy related issues require careful examination, especially those potentially leading to court cases
Such information includes IP addresses, customer identifiers, user names, host names (for personally owned devices), etc.
Can be used to determine the impact of monitoring on privacy, and can help uncover potential violations
People
1. Employees
2. Roles and Hierarchy
3. People Management
4. Knowledge Management
5. Training & Education
1 Employees
1.1 How many FTEs are in your SOC?
1.2 Do you use external employees / contractors in your SOC?
1.2.1 If yes, specify the number of external FTE's
1.3 Does the current size of the SOC meet FTE requirements?
1.4 Does the SOC meet requirements for internal to external employee FTE ratio?
1.5 Does the SOC meet requirements for internal to external employee skillset?
1.6 Are all positions filled?
1.7 Do you have a recruitment process in place?
1.8 Do you have a talent acquisition process in place?
1.9 Do you have specific KSAOs established for SOC personnel?
1.10 Do you actively seek to create a psychologically safe environment for SOC personnel?
Answer
4
1.3
1.4
1.5
1.6
1.7
1.8
1.9
1.10
dent Responder roles:
-resource-center/nice-framework-current-versions
Guidance
Some required skills are not present internally, and not transferred
Answer
Fully complete
2.1
2.3
2.4
2.5
2.6
2.8
2.9
2.10
Guidance
All roles are fully in use and formalized
3 People Management
3.1 Do you have a job rotation plan in place?
3.2 Do you have a career progression process in place?
3.3 Do you have a talent management process in place?
3.4 Do you have team diversity goals?
3.5 Have you established team goals?
3.6 Do you document and track individual team member goals?
3.7 Do you periodically evaluate SOC employees?
3.8 Do you have a 'new hire' process in place?
3.9 Are all SOC employees subjected to screening?
3.10 Do you measure employee satisfaction for improving the SOC?
3.11 Are there regular 1-on-1 meetings between the SOC manager and the employees?
3.12 Do you perform regular teambuilding exercises?
3.13 Do you perform regular teambuilding exercises with other teams relevant to the SOC?
3.14 Do you periodically evaluate team performance?
Answer
3.1
3.2
3.3
3.4
3.5
3.6
3.7
3.8
3.9
3.10
3.11
3.12
3.13
3.14
Guidance
A plan covering some roles is in place, but not operational
A process covering some roles is in place and operational
No talent management process in place
Diversity goals have been formally defined and are not met
Team goals are determined, approved and tracked regularly
Individual goals are determined, approved and tracked regularly
Periodic evaluation is performed in an ad-hoc fashion
A process is in place, but does not cover all aspects
Basic screening procedure in place, applied structurally
Employee satisfaction is measured, not used for improvement
1-on-1 meetings are regularly held and used for coaching and growth
Exercises are regularly done and focused on improving team dynamics
MTS exercises are usually performed, but not embedded in processes
Periodic evaluation is performed, results are used for team growth
Remarks
Job rotation can be used to train employees in a variety of tasks and avoid too much routine
Career development, promotion, etc.
Talent should be adequately managed to retain such staff and fully develop their potential.
e.g. background diversity, ethnic diversity, gender diversity, etc.
Team goals help to bring focus to the team and monitor progress
Individual team member goals should be set to help grow the employee to full potential
Can also be included in the regular organization evaluation process
i.e. a defined process to quickly let new employees find their place and perform well in the SOC
Personnel screening is performed to avoid infiltration or misbehaviour by SOC employees
Employee satisfaction should be taken seriously as lack of satisfaction may lead to key personnel leaving
Such informal 1-on-1 conversations are used to coach employees and help the SOC manager gain insight into personal challenges
Teambuilding exercises are used to promote collaboration between individuals in the team and to raise team spirit
In multi-team systems (MTS), the SOC collaborates with other teams. Use cross-team teambuilding to maximize performance
Besides individual performance, team performance and dynamics are also important to measure and improve on
4 Knowledge Management
4.1 Do you have a formal knowledge management process in place?
4.2 Do you have a skill matrix in place?
4.3 Please specify elements of the skill matrix:
4.3.1 All SOC employees
4.3.2 Hard skills
4.3.3 Soft skills
4.3.4 Skill levels (novice, intermediate, expert)
Completeness
4.4 Is the skill matrix actively used for team and personal improvement?
4.5 Do you have a knowledge matrix in place?
4.6 Please specify elements of the knowledge matrix:
4.6.1 All SOC employees
4.6.2 All relevant knowledge areas
4.6.3 Knowledge levels (novice, intermediate, expert)
Completeness
4.7 Is the knowledge matrix actively used to determine training and education needs?
4.8 Have you documented SOC team member abilities?
4.9 Do you regularly assess and revise the knowledge management process?
4.10 Is there effective tooling in place to support knowledge documentation and distribution?
Answer
Fully complete
Fully complete
Guidance
A formal process is in place, covering all knowledge aspects
A complete skill matrix is in place and approved, not regularly updated
Matrix used to identify all training needs, but not tracked for execution
All employee abilities documented, but not regularly updated
Documentation is regularly and informally reviewed and updated
Tooling is in place and use of the tool is embedded in processes
Remarks
Formal knowledge management helps to optimize knowledge creation and distribution
A matrix may consist of: SOC skills, SOC employees and skill levels (novice, intermediate, expert)
The skill matrix should cover all SOC employees, both internal and external
e.g. ability to effectively use analysis tools
e.g. communication skills
Determining and documenting skill levels helps to identify areas where limited expertise is available
Use this outcome as a guideline to determine the score for 4.2
Personal improvement is essential, team improvement requires insight in team dynamics and skill distribution
The knowledge matrix should cover all SOC employees, both internal and external
Knowledge for service delivery: technical (i.e. support), functional (i.e. configuration) and foundational (e.g. networking, en
Determining and documenting knowledge levels helps to identify areas where limited expertise is available
Use this outcome as a guideline to determine the score for 4.5
The matrix should be used as a means to identify and resolve knowledge gaps
Besides knowledge and skills, team member abilities are also important to document
This refers to the knowledge management process as a whole
Such tooling can help to avoid investigating similar issues multiple times by integrating into the security monitoring process
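The skill and knowledge matrices described above can be represented very simply in code. The sketch below is a minimal illustration, assuming hypothetical employee and skill names; the levels follow the novice / intermediate / expert scale used in the questions.

```python
# Minimal skill-matrix sketch with gap analysis.
# Employee names and skills are hypothetical examples.
LEVELS = {"novice": 1, "intermediate": 2, "expert": 3}

skill_matrix = {
    "analyst_a": {"log_analysis": "expert", "forensics": "novice"},
    "analyst_b": {"log_analysis": "intermediate", "forensics": "novice"},
}

def skill_gaps(matrix, minimum="intermediate"):
    """Return skills where no team member reaches the minimum level."""
    threshold = LEVELS[minimum]
    skills = {skill for person in matrix.values() for skill in person}
    return sorted(
        skill for skill in skills
        if max(LEVELS[person.get(skill, "novice")] for person in matrix.values()) < threshold
    )
```

Running a gap analysis like this against the matrix directly supports questions 4.4 and 4.7: the resulting list of under-covered skills can feed training and education planning.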
Answer
Mostly complete
Incomplete
Guidance
A training program covering all roles is in place, but not formalized
Training on the job can be done internally by senior employees or using external consultants
Product-specific training may be required for new technologies or complex solutions
e.g. training on internal policies
For example: security analysis training for the security analyst role
To complement hard skills, soft skills should be trained as well
Formal education may be university or university college degrees
Use this outcome as a guideline to determine the score for 5.1
A certification program is used to provide a demonstrable minimum level of knowledge and skills
Internal certifications may be in place to demonstrate knowledge of company processes and policies
Certification track with external certification organizations (e.g. ISACA, (ISC)2, SANS)
Permanent education (PE) may be part of the certification itself
Use this outcome as a guideline to determine the score for 5.3
e.g. certain training and certifications are required to grow from a junior level function to a more senior level function
i.e. a fixed percentage of the total SOC budget that is allocated for education and cannot be used for other purposes
This is an extension of education budget
Workshops are an informal way of distributing knowledge
Training and certification must be a relevant reflection of SOC knowledge and skill requirements
Process
1. SOC Management 5. Detection Engineering & Validation
2. Operations & Facilities
3. Reporting & Communication
4. Use Case Management
1 SOC Management
1.1 Is there a SOC management process in place?
1.2 Are SOC management elements formally identified and documented?
1.3 Please specify identified SOC management elements:
1.3.1 Internal relationship management
1.3.2 External relationship management
1.3.3 Vendor management
1.3.4 Continuous service improvement
1.3.5 Project methodology
1.3.6 Process documentation and diagrams
1.3.7 RACI matrix
1.3.8 Service Catalogue
1.3.9 Service on-boarding procedure
1.3.10 Service off-loading procedure
Completeness
1.4 Is the SOC management process regularly reviewed?
1.5 Is the SOC management process aligned with all stakeholders?
1.6 Have you implemented a process for continuous improvement (CI)?
1.7 Specify elements of the continuous improvement program:
1.7.1 Daily progress tracking
1.7.2 Weekly planning
1.7.3 Backlog management
1.7.4 Work item effort estimation
1.7.5 Work item prioritisation
1.7.6 Refinement
1.7.7 Capacity for change
Completeness
1.8 Have you implemented a process to manage SOC quality assurance (QA)?
1.9 Please specify elements of the quality assurance program:
1.9.1 Ticket quality assurance
1.9.2 Incident quality assurance
1.9.3 Service quality assurance
1.9.4 Process quality assurance
1.9.5 Report quality assurance
Completeness
1.10 Have you implemented a SOC architecture process?
1.11 Please specify elements of the SOC architecture:
1.11.1 SOC process architecture
1.11.2 SOC technology architecture
1.11.3 SOC service architecture
1.11.4 Architecture diagrams
1.11.5 Architecture principles
Completeness
Answer
Averagely complete
Averagely complete
Averagely complete
Mostly complete
Guidance
An informal process is in place that covers most aspects
Single document, full description of SOC management process
Daily progress tracking is used to identify (blocking) issues, determine priorities and request help for certain activities
Weekly planning is done to create a balanced improvement workload for the team
Managing the backlog includes structuring the backlog and grooming the backlog
Work item estimation (through t-shirt sizing, or more accurate estimation) is essential to realistic planning
Prioritisation of work items should follow a defined prioritisation method and be done by the owner of the backlog
Refinement of items, including a definition of ready, is required to ensure all team members understand the task at hand
Having a reserved capacity for change ensures the improvement is continuous, and not overtaken by operational tasks
Use this outcome as a guideline to determine the score for 1.6
Quality assurance is aimed at ensuring SOC processes, technology and services meet their quality requirements
Correct and timely analysis of alerts, including correct usage of playbooks
Correct and timely follow-up of incidents, including correctly following the incident response procedures
Delivery of services in accordance with established quality criteria
Execution of processes in accordance with established quality criteria
Correct and timely report provisioning
Use this outcome as a guideline to determine the score for 1.8
A SOC architecture describes how the different components of the SOC interact and integrate
A process architecture outlines the different processes within the SOC and how they interact / integrate
A technology architecture outlines the different technologies used within the SOC and how they interact / integrate
A service architecture outlines the different services used within the SOC and how they interact / integrate
Architecture diagrams are visualisations of components and integrations
Architecture principles are guidelines for implementing processes, technology & services
Use this outcome as a guideline to determine the score for 1.10
Answer
Averagely complete
Guidance
Shift schedules in place, guaranteeing full shift coverage for all roles
Shift schedule optimized for vigilance, but not regularly improved
Shift log in place, fully accurate and up to date
Basic shift turnover procedure created
Stand-up carried out regularly, but not in structured fashion
Stand-by arrangement in place, not supported by tooling and not tested
An exercise plan lists the type of exercises to be conducted, the frequency and the exercise goals
Table-top exercises are an easy way to go through a scenario and determine if all procedures and actions are clear
Playbook drills are low-level exercises to test the accuracy, effectiveness, and efficiency of playbooks
Cyber range exercises provide a simulated environment that is used to train analysts. A simulation of the actual IT environment is preferred
A capture the flag event is a gamification for security analysts, in which they must achieve a specified goal
These are types of exercises to conduct an actual attack against the organisation, without malicious intent
Public exercises are exercises that the organisation can participate in. Often, these are large-scale exercises
Use this outcome as a guideline to determine the score for 2.1.1
Regularity should be matched to your own internal policy
Results from exercises should be structurally documented for future reference and identification of improvements
Exercise output should be used to structurally improve security operations
SOC services and procedures should be aligned and integrated with the organization's configuration management process
SOC services and procedures should be aligned and integrated with the organization's change management process
SOC services and procedures should be aligned and integrated with the organization's problem management process
SOC services and procedures should be aligned and integrated with the organization's incident management process
SOC services and procedures should be aligned and integrated with the organization's asset management process
A dedicated physical location decreases likelihood of unauthorized access and provides confidentiality for security incident
A dedicated facility for coordination of major security incidents
Given the confidentiality of the SOC and the importance of monitoring, it is recommended to use a separate network
e.g. key cards (badges) for access with access logging
Secure storage facilities can be used to store evidence collected during investigations or other operational security purposes
A video wall can be used to display the real-time security status and can be used for decision making as well as PR
Since communication and coordination are important features of a SOC, call-center capability may be required
e.g. multiple screen setup, virtual machines, etc.
Secure working enabled means secure access (MFA, encryption, etc.), secure working facilitated also means equipped and
The system should support different file types, authorizations and version management; possibly even encryption
e.g. a wiki space or SharePoint that allows collaboration and supports team efforts
[1] SOC-CMM has published a metrics suite that can serve as a starting point:
https://www.soc-cmm.com/products/metrics/
Answer
Averagely complete
Averagely complete
Guidance
Reports are provided regularly, standardized and regularly optimized
Reports fully tailored to recipients, manual customization required
Reports regularly reviewed, not formally signed off by recipients
Report dissemination through standard and approved reporting lines
Report templates regularly revised and updated
Formal agreements exist, not measured
Required reporting types provided, not regularly evaluated
The SOC may be involved in security awareness and education to make users in the organisation aware of their role in cyber security
Measuring the efforts is necessary for improvement of the security awareness function
4.3 Visibility
4.3.1 Do you determine and document visibility requirements for each use case?
4.3.2 Do you measure visibility status for your use cases for gap analysis purposes?
4.3.3 Do you map data source visibility to the MITRE ATT&CK® framework? [4]
Answer
Basic process in place, not applied to all phases of the use case lifecycle
Single repository, full description of use cases
Use cases not approved, all critical use cases known to stakeholders
Alignment done structurally and regularly with relevant processes
Use cases mostly created in a structured and documented fashion
Use cases created in a structured top-down way, SOC context only
Traceability is possible for all use cases, but requires manual effort
Traceability is possible for all use cases, but requires manual effort
Metrics applied to all use cases, no risk-based feedback loop
All use cases scored and prioritized, validated & reviewed by stakeholders
All use cases are regularly and formally reviewed and updated
A framework, such as MaGMa UCF[1], can be used to guide the use case lifecycle and document use cases in a standardized format
Formal documentation may include use case documentation templates
e.g. business stakeholders, IT stakeholders, CISOs, audit & compliance, risk management, etc.
e.g. integration with the threat / risk management process to revise use cases when the threat landscape changes
i.e. a standardized approach to derive use cases from threats or business requirements
e.g. use cases can be derived from business requirements, risk assessments, threat management / intelligence
Top-down traceability is important to determine completeness of implementation and demonstrable risk reduction
Bottom-up traceability is important for contextualizing use case output and business alignment
Metrics can be applied to use cases to determine growth and maturity by measuring effectiveness and implementation
Risks can be (cyber)threats, but also non-compliance or penalties (laws & regulations)
Use cases should be subjected to life cycle management and may require updates, or may be outdated and decommissioned
By measuring use cases against MITRE ATT&CK®, it is possible to determine strengths and weaknesses in your layered detection
Tagging monitoring rules with MITRE ATT&CK® identifiers allows for reporting on sightings of attack techniques
The creation of a risk profile in MITRE ATT&CK® can help to identify relevant attack techniques
Using organizational context (protection and detection mechanisms), ATT&CK® techniques can be prioritized
Using MITRE ATT&CK®, it is possible to connect alerts to specific threat actors, or potentially even active campaigns
Threat intelligence can provide input into security monitoring, especially when using MITRE ATT&CK® to connect both
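Tagging monitoring rules with ATT&CK® identifiers, as described above, can be sketched in a few lines. The rule names below are hypothetical; the T-numbers are real ATT&CK technique IDs used purely as examples.

```python
# Sketch: tagging detection rules with MITRE ATT&CK technique IDs,
# then deriving coverage and sightings. Rule names are hypothetical.
from collections import Counter

rules = {
    "susp_powershell": ["T1059.001"],            # PowerShell
    "pass_the_hash":   ["T1550.002"],            # Pass the Hash
    "encoded_cmdline": ["T1059.001", "T1027"],   # PowerShell, Obfuscation
}

def technique_coverage(rules):
    """Set of ATT&CK techniques covered by at least one detection rule."""
    return {t for techniques in rules.values() for t in techniques}

def sightings(fired_alerts, rules):
    """Count technique sightings from fired alerts (each alert names its rule)."""
    counts = Counter()
    for rule_name in fired_alerts:
        counts.update(rules.get(rule_name, []))
    return counts
```

The coverage set supports gap analysis against a risk profile, while the sightings counter enables the reporting on observed attack techniques mentioned in the guidance.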
Answer
Guidance
All use cases tested, visibility and detection targets used in improvements
Release process triggers ADT/AE for all use cases, documented process
Data ingestion status monitored, not complete for all data source types
Data source coverage measured, not complete for all data source types
Remarks
A detection engineering process supports the creation and deployment of detection rules for security monitoring purposes
Formal documentation supports process standardisation, and allows for faster training of new engineers
Detection engineers have a skillset that is different from security analysts and security engineers
SOC analysts deal with alerts resulting from detections created by engineers, so a tight interaction is required to optimize the detections
Threat intelligence is a major input into the creation or updating of detection rules
Once the detections are created, they must be operationalized. This should be done with a formal hand-over to production
A testing environment allows for thorough testing of new detections, which ensures a higher level of quality
A formal release process includes automated deployment of rules and adheres to organizational change management processes
A versioning system allows reverting to previous versions of detections
A roll-back procedure enables reverting back to a known good state if a deployment has an adverse effect on security monitoring
These validation activities provide insight into how well security monitoring is able to detect certain adversaries or attacks
Testing for MITRE ATT&CK® techniques can augment mapping of use cases and visibility in MITRE ATT&CK®
Not all use cases and risks have a relationship to MITRE ATT&CK®. These use cases should be tested as well
Testing both detection and response provides a more complete view of SOC capabilities
When deploying new or updated detection, automated detection testing should be executed as a quality gate
Output should lead to updates in detections and new detections, as well as instructions for SOC analysts
Monitoring data ingestion is used to identify data ingestion problems or inactive data sources
Data source coverage should be optimized to avoid blind spots in monitoring
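Monitoring data ingestion for inactive data sources, as recommended above, can be as simple as tracking a last-seen timestamp per source. The sketch below is a minimal illustration; source names and the one-hour silence threshold are hypothetical and should be tuned per data source type.

```python
# Sketch: flagging inactive data sources via last-seen timestamps.
# Source names and the silence threshold are hypothetical examples.
from datetime import datetime, timedelta

def inactive_sources(last_seen, now, max_silence=timedelta(hours=1)):
    """Return data sources that have been silent for longer than max_silence."""
    return sorted(src for src, ts in last_seen.items() if now - ts > max_silence)

now = datetime(2024, 4, 19, 12, 0)
last_seen = {
    "firewall": now - timedelta(minutes=5),           # healthy
    "domain_controller": now - timedelta(hours=3),    # silent for too long
}
```

In practice each source type warrants its own threshold, since a quiet DNS sensor and a quiet firewall log feed have very different expected event rates.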
Technology
1. SIEM / UEBA
2. NDR
3. EDR
4. SOAR
1 SIEM tooling
Maturity
1.1 Accountability
1.1.1 Has functional ownership of the solution been formally assigned?
1.1.2 Has technical ownership of the solution been formally assigned?
1.2 Documentation
1.2.1 Has the solution been technically documented?
1.2.2 Has the solution been functionally documented?
1.3 Personnel & support
1.3.1 Is there dedicated personnel for support?
1.3.2 Is the personnel for support formally trained?
1.3.3 Is the personnel for support certified?
1.3.4 Is there a support contract for the solution?
1.4 Maintenance & configuration
1.4.1 Is the system regularly maintained?
1.4.2 Is remote maintenance on the system managed?
1.4.3 Are maintenance & configuration updates executed through the change management process?
1.4.4 Have you established maintenance windows?
1.4.5 Is maintenance performed using authorised and trusted tooling?
1.5 Availability & Integrity
1.5.1 Is there high availability (HA) in place for the solution?
1.5.2 Is there data backup / replication in place for the solution?
1.5.3 Is there configuration backup / replication in place for the solution?
1.5.4 Is there a Disaster Recovery plan in place for this solution?
1.5.5 Is the Disaster Recovery plan regularly tested?
1.5.6 Is there a separate development / test environment for this solution?
1.6 Access management
1.6.1 Is access to the solution limited to authorized personnel?
1.6.2 Are access rights regularly reviewed and revoked if required?
1.6.3 Is a break glass procedure in place?
Capability
1.7 Specify which technological capabilities and artefacts are present and implemented:
Technical capabilities
1.7.1 Subtle event detection
1.7.2 Automated alerting
1.7.3 Alert acknowledgement
1.7.4 Case management system
1.7.5 Network model
1.7.6 Detailed audit trail of analyst activities
1.7.7 Historical activity detection
1.7.8 Flexible and scalable architecture
1.7.9 MITRE ATT&CK® identifier tagging
Anomaly detection
1.7.29 User anomalies
1.7.30 Application anomalies
1.7.31 Device anomalies
1.7.32 Network anomalies
Guidance
Dedicated personnel should be in place to ensure that support is always available. Can also be staff with outsourced providers
Training helps to jump start new hires, and to learn a proper way of working with the tool
Certification demonstrates ability to handle the tooling properly
A support contract may cover on-site support, support availability, response times, escalation and full access to resources
Systems should be regularly maintained to keep up with the latest features and bug fixes
Remote maintenance by a third party may be part of system maintenance procedures
Maintenance and configuration updates should follow the formal change management process
Setting maintenance windows helps to structure the maintenance process and make it more predictable
Performing maintenance with authorised and trusted tooling helps to ensure security and integrity of the system
The system will contain confidential information and information that possibly impacts employee privacy
Revocation is part of normal employee termination. Special emergency revocation should be in place for suspected misuse
A break glass procedure and account is required to gain access to the tooling even in case of major IT outages
Capability to detect slight changes in systems, applications or network that may indicate malicious behavior
Alerting based on different alerting mechanisms (SMS, mail, etc.)
Capability to acknowledge alerts so other analysts know the alert is being investigated
A case management system that supports SOC analyst workflows
A full network model in which zones and segments are defined
The audit trail can be used to report on analyst activities and to uncover potential abuse of the big data solution
Capability of detecting historical activity for recently uncovered threats
A flexible and scalable architecture supports the SOC as it grows in size (FTE) and data (coverage)
Add MITRE ATT&CK® tags to rules / analytics for mapping purposes
Integration into the asset management process for automated adding of assets to the SIEM for monitoring
Integration of business context (business function, asset classification, etc.)
Integration of identity information into the SIEM for enhanced monitoring of users and groups
Integration of asset management information into the SIEM (asset owner, asset location, etc.)
Integration of vulnerability management information into SIEM assets to determine risk levels for assets
Integration of threat intelligence information (observables / IoCs) into the security monitoring tooling
Integration of the tooling into the threat hunting process to support threat hunting investigations
Integration of the security incident management process to support incident investigation
Integration with the SOAR tooling for automation purposes
Use of standard content packs in the security monitoring solution
Use of custom content (correlation rules, etc.) in the security monitoring solution
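The historical activity detection capability listed above (1.7.7) amounts to sweeping stored events for indicators that were published after the events were collected. A minimal sketch, with hypothetical event records and IoC values:

```python
# Sketch: historical activity detection ("retro-matching") -
# sweep past events for newly received indicators of compromise.
# Event fields and IoC values are hypothetical examples.
def retro_match(events, iocs):
    """Return past events whose destination matches a newly received IoC."""
    bad = set(iocs)
    return [event for event in events if event["dest"] in bad]

events = [
    {"ts": "2024-04-01T10:00:00Z", "src": "10.0.0.5", "dest": "203.0.113.7"},
    {"ts": "2024-04-02T11:30:00Z", "src": "10.0.0.9", "dest": "198.51.100.20"},
]
new_iocs = ["203.0.113.7"]  # e.g. a C2 address from a fresh threat feed
```

This is also where the threat intelligence integration (1.7.16) pays off: each newly ingested indicator can automatically trigger such a sweep over the retention window.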
2 IDPS Tooling
Maturity
2.1 Accountability
2.1.1 Has functional ownership of the solution been formally assigned?
2.1.2 Has technical ownership of the solution been formally assigned?
2.2 Documentation
2.2.1 Has the solution been technically documented?
2.2.2 Has the solution been functionally documented?
2.3 Personnel & support
2.3.1 Is there dedicated personnel for support?
2.3.2 Is the personnel for support formally trained?
2.3.3 Is the personnel for support certified?
2.3.4 Is there a support contract for the solution?
2.4 Maintenance & configuration
2.4.1 Is the system regularly maintained?
2.4.2 Is remote maintenance on the system managed?
2.4.3 Are maintenance & configuration updates executed through the change management process?
2.4.4 Have you established maintenance windows?
2.4.5 Is maintenance performed using authorised and trusted tooling?
2.5 Availability & Integrity
2.5.1 Is there high availability (HA) in place for the solution?
2.5.2 Is there data backup / replication in place for the solution?
2.5.3 Is there configuration backup / replication in place for the solution?
2.5.4 Is there a Disaster Recovery plan in place for this solution?
2.5.5 Is the Disaster Recovery plan regularly tested?
2.5.6 Is there a separate development / test environment for this solution?
2.6 Access Management
2.6.1 Is access to the solution limited to authorized personnel?
2.6.2 Are access rights regularly reviewed and revoked if required?
2.6.3 Is a break glass procedure in place?
Capability
2.7 Specify which technological capabilities and artefacts are present and implemented:
Technical capabilities
2.7.1 Encrypted traffic analysis
2.7.2 IDS signature matching
2.7.3 Supervised machine learning
2.7.4 Unsupervised machine learning
2.7.5 Traffic blocking
2.7.6 Unauthorised device detection
2.7.7 MITRE ATT&CK® identifier tagging
2.7.8 Deep packet inspection
2.7.9 Correlation
Monitoring capabilities
2.7.12 Monitoring north - south network traffic
2.7.13 Monitoring east - west network traffic
2.7.14 Monitoring classified network segments
2.7.15 Monitoring cloud environments
2.7.16 Monitoring ICS/SCADA networks
2.7.17 Monitoring DNS traffic
Anomaly detection
2.7.27 Traffic baselining
2.7.28 Pattern analysis
Guidance
Remarks
Dedicated personnel should be in place to ensure that support is always available. Can also be staff with outsourced providers
Training helps to jump start new hires, and to learn a proper way of working with the tool
Certification demonstrates ability to handle the tooling properly
A support contract may cover on-site support, support availability, response times, escalation and full access to resources
Systems should be regularly maintained to keep up with the latest features and bug fixes
Remote maintenance by a third party may be part of system maintenance procedures
Maintenance and configuration updates should follow the formal change management process
Setting maintenance windows helps to structure the maintenance process and make it more predictable
Performing maintenance with authorised and trusted tooling helps to ensure security and integrity of the system
The IDPS system will contain confidential information and possibly information that impacts employee privacy
Revocation is part of normal employee termination. Special emergency revocation should be in place for suspected misuse
A break glass procedure and account is required to gain access to the tooling even in case of major IT outages
Helps to identify potentially malicious traffic in encrypted communications, using fingerprinting or certificate analysis
The ability to use IDS signatures (e.g. YARA) in network monitoring
Machine learning trained on a predefined data set with known good and bad traffic
Machine learning trained without a predefined data set
In-line appliances can block malicious traffic as part of their response capability
Detection of unauthorised devices accessing the network
Add MITRE ATT&CK® tags to rules / analytics for mapping and hunting purposes
Detailed inspection of data sent across the network
Correlation of anomalies with previously detected anomalies or detection rules
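The traffic baselining capability (2.7.27) can be illustrated with a deliberately simple statistical sketch: flag intervals whose volume deviates strongly from a historical baseline. The byte counts and the three-sigma threshold below are hypothetical; real NDR tooling uses far richer models.

```python
# Sketch: simple traffic baselining - flag observation intervals whose
# byte count exceeds the baseline mean by k standard deviations.
# All values and the threshold k are hypothetical examples.
from statistics import mean, stdev

def traffic_anomalies(baseline, observed, k=3.0):
    """Indices of observed samples more than k std deviations above the baseline mean."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [i for i, value in enumerate(observed) if value > mu + k * sigma]

# Per-interval byte counts (in KB) for one network segment.
baseline = [100, 110, 95, 105, 100, 90, 108, 102]
observed = [104, 500, 99]  # the second interval is a clear outlier
```

Correlating such volume anomalies with the pattern analysis capability (2.7.28) and with previously detected anomalies, as the guidance notes, reduces false positives from legitimate bursts such as backups.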
Maturity
3.1 Accountability
3.1.1 Has functional ownership of the solution been formally assigned?
3.1.2 Has technical ownership of the solution been formally assigned?
3.2 Documentation
3.2.1 Has the solution been technically documented?
3.2.2 Has the solution been functionally documented?
3.3 Personnel & support
3.3.1 Is there dedicated personnel for support?
3.3.2 Is the personnel for support formally trained?
3.3.3 Is the personnel for support certified?
3.3.4 Is there a support contract for the solution?
3.4 Maintenance & configuration
3.4.1 Is the system regularly maintained?
3.4.2 Is remote maintenance on the system managed?
3.4.3 Are maintenance & configuration updates executed through the change management process?
3.4.4 Have you established maintenance windows?
3.4.5 Is maintenance performed using authorised and trusted tooling?
3.5 Availability & Integrity
3.5.1 Is there high availability (HA) in place for the solution?
3.5.2 Is there data backup / replication in place for the solution?
3.5.3 Is there configuration backup / replication in place for the solution?
3.5.4 Is there a Disaster Recovery plan in place for this solution?
3.5.5 Is the Disaster Recovery plan regularly tested?
3.5.6 Is there a separate development / test environment for this solution?
3.6 Confidentiality
3.6.1 Is access to the solution limited to authorized personnel?
3.6.2 Are access rights regularly reviewed and revoked if required?
3.6.3 Is a break glass procedure in place?
Capability
3.7 Specify which technological capabilities and artefacts are present and implemented:
Technical capabilities
3.7.1 OS support
3.7.2 Mobile device support
3.7.3 Physical, virtual & cloud deployment
3.7.4 Vulnerability patching
3.7.5 Forensic information preservation
3.7.6 Historic data retention
3.7.7 MITRE ATT&CK® identifier tagging
3.7.8 Memory analysis
3.7.9 Correlation
Prevention capabilities
3.7.10 Exploit prevention
3.7.11 Fileless malware protection
3.7.12 Application allowlisting
3.7.13 Ransomware protection
3.7.14 Attack surface reduction
Integrations
3.7.32 Threat Intelligence integration
3.7.33 Vulnerability intelligence integration
3.7.34 Threat hunting integration - TTPs
3.7.35 Threat hunting integration - Tools & artifacts
3.7.36 Threat hunting integration - Technical indicators
3.7.37 Security incident management integration
3.7.38 SIEM integration
3.7.39 Malware sandbox integration
Rule-based detection
3.7.40 Online signature-based detection
3.7.41 Offline signature-based detection
3.7.42 Custom rules
Anomaly detection
3.7.43 Behavioural detection
[1] see:
https://github.com/tsale/EDR-Telemetry
Answer
Guidance
Dedicated personnel should be in place to ensure that support is always available. Can also be staff with outsourced providers
Training helps to jump start new hires, and to learn a proper way of working with the tool
Certification demonstrates ability to handle the tooling properly
A support contract may cover on-site support, support availability, response times, escalation and full access to resources
Systems should be regularly maintained to keep up with the latest features and bug fixes
Remote maintenance by a third party may be part of system maintenance procedures
Maintenance and configuration updates should follow the formal change management process
Setting maintenance windows helps to structure the maintenance process and make it more predictable
Performing maintenance with authorised and trusted tooling helps to ensure security and integrity of the system
Can be fully implemented HA, partially implemented, hot spare, etc. May not be applicable
May not be applicable to all EDR solutions
Configuration synchronization could be part of a HA setup
A DR plan is required to restore service in case of catastrophic events
Testing the DR plan is required to ensure that it is complete and functional, and that tasks and responsibilities for involved personnel are clear
A separate test environment allows for testing of new configurations before deployment in production
The analytics system will contain confidential information and information that possibly impacts employee privacy
Revocation is part of normal employee termination. Special emergency revocation should be in place for suspected misuse
A break glass procedure and account is required to gain access to the tooling even in case of major IT outages
Integration of threat intelligence information (observables / IoCs) into the EDR tooling for reputation-based monitoring
Integration of vulnerability intelligence information into the EDR for vulnerability monitoring purposes
Integration of the tooling into the threat hunting process to support threat hunting investigations on the TTP level
Integration of the tooling into the threat hunting process to support threat hunting investigations on the tools & artefacts level
Integration of the tooling into the threat hunting process to support threat hunting investigations on the indicator level (IP addresses, domains, file hashes)
Integration of the security incident management process to support incident investigation
Integration with the SIEM tooling for centralised correlation
Detonate potential malware samples in a sandbox environment
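The threat intelligence integration described in the guidance above (pushing observables into the EDR for reputation-based monitoring) can be sketched as a small client that submits IoCs to an EDR watchlist API. The endpoint, token, and payload shape below are hypothetical placeholders, not any specific vendor's API:

```python
import json
import urllib.request

# Hypothetical endpoint and token; every EDR vendor exposes its own API shape.
EDR_API = "https://edr.example.internal/api/v1/indicators"
API_TOKEN = "REDACTED"

def build_indicator_payload(iocs):
    """Serialise a list of observables (hashes, IPs) for the EDR watchlist."""
    return json.dumps({"indicators": iocs}).encode("utf-8")

def push_iocs(iocs):
    """POST the observables to the (hypothetical) EDR indicator endpoint."""
    req = urllib.request.Request(
        EDR_API,
        data=build_indicator_payload(iocs),
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:  # network call; sketch only
        return resp.status

iocs = [
    {"type": "sha256",
     "value": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"},
    {"type": "ipv4", "value": "198.51.100.7"},
]
```

In practice this push is usually driven automatically from the Threat Intelligence Platform rather than run ad hoc.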
Maturity
4.1 Accountability
4.1.1 Has functional ownership of the solution been formally assigned?
4.1.2 Has technical ownership of the solution been formally assigned?
4.2 Documentation
4.2.1 Has the solution been technically documented?
4.2.2 Has the solution been functionally documented?
4.3 Personnel & support
4.3.1 Is there dedicated personnel for support?
4.3.2 Is the personnel for support formally trained?
4.3.3 Is the personnel for support certified?
4.3.4 Is there a support contract for the solution?
4.4 Maintenance & configuration
4.4.1 Is the system regularly maintained?
4.4.2 Is remote maintenance on the system managed?
4.4.3 Are maintenance & configuration updates executed through the change management process?
4.4.4 Have you established maintenance windows?
4.4.5 Is maintenance performed using authorised and trusted tooling?
4.5 Availability & Integrity
4.5.1 Is there high availability (HA) in place for the solution?
4.5.2 Is there data backup / replication in place for the solution?
4.5.3 Is there configuration backup / replication in place for the solution?
4.5.4 Is there a Disaster Recovery plan in place for this solution?
4.5.5 Is the Disaster Recovery plan regularly tested?
4.5.6 Is there a separate development / test environment for this solution?
4.6 Confidentiality
4.6.1 Is access to the solution limited to authorized personnel?
4.6.2 Are access rights regularly reviewed and revoked if required?
4.6.3 Is a break glass procedure in place?
Capability
4.7 Specify which technological capabilities and artefacts are present and implemented:
Technical capabilities
4.7.1 Historical event matching
4.7.2 Risk-based event prioritization
4.7.3 Ticket workflow support
Data integrations
4.7.4 SIEM data integration
4.7.5 Threat intelligence integration
4.7.6 Asset context integration
4.7.7 Identity context integration
4.7.8 Vulnerability management integration
Response integrations
4.7.9 Knowledge base integration
4.7.10 Firewall integration
4.7.11 NDR integration
4.7.12 EDR integration
4.7.13 Email protection integration
4.7.14 Malware protection integration
4.7.15 Sandbox integration
4.7.16 Active Directory / IAM integration
4.7.17 SIEM integration
Playbooks [1]
4.7.18 Standard playbooks
4.7.19 Customised playbooks
4.7.20 Playbook automation
4.7.21 Playbook development process
[1] the SOAR maturity model is a helpful resource for understanding levels of playbook application:
https://chronicle.security/blog/posts/SOAR-adoption-maturity-model/
Guidance
A technical description of the automation & orchestration system components and configuration
A description of the automation & orchestration system functional configuration (workflows, integrations, etc.)
Dedicated personnel should be in place to ensure that support is always available. This can also be staff from an outsourced provider
Training helps to jump start new hires, and to learn a proper way of working with the tool
Certification demonstrates ability to handle the tooling properly
A support contract may cover on-site support, support availability, response times, escalation and full access to resources
Systems should be regularly maintained to keep up with the latest features and bug fixes
Remote maintenance by a third party may be part of system maintenance procedures
Maintenance and configuration updates should follow the formal change management process
Setting maintenance windows helps to structure the maintenance process and make it more predictable
Performing maintenance with authorised and trusted tooling helps to ensure security and integrity of the system
The automation system may have automated actions that can impact the usage of systems and should be restricted
Revocation is part of normal employee termination. Special emergency revocation should be in place for suspected misuse
A break glass procedure and account is required to gain access to the tooling even in case of major IT outages
The automation & orchestration tool receives events from the SIEM system
Contextualize potential incidents using threat intelligence
Contextualize potential incidents using asset information
Contextualize potential incidents using user information
Contextualize potential incidents using vulnerability management information
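The contextualisation steps in the guidance above (asset, identity, and vulnerability enrichment plus risk-based prioritisation) can be sketched as a single SOAR enrichment function. The lookup tables are stand-ins for real CMDB, IAM, and vulnerability-scanner integrations; all names and values are illustrative:

```python
# Hypothetical lookup tables standing in for CMDB / IAM / vuln-scanner feeds.
ASSETS = {"10.0.0.5": {"hostname": "fin-db01", "criticality": "high"}}
IDENTITIES = {"jdoe": {"department": "Finance", "privileged": True}}
VULNS = {"fin-db01": ["CVE-2024-1234"]}

def enrich_alert(alert):
    """Attach asset, identity and vulnerability context to a raw SIEM alert."""
    enriched = dict(alert)
    asset = ASSETS.get(alert.get("src_ip"), {})
    enriched["asset"] = asset
    enriched["identity"] = IDENTITIES.get(alert.get("user"), {})
    enriched["open_vulns"] = VULNS.get(asset.get("hostname"), [])
    # Simple risk-based prioritisation: escalate when the asset is critical
    # or the account is privileged.
    enriched["priority"] = "P1" if (
        asset.get("criticality") == "high"
        or enriched["identity"].get("privileged")
    ) else "P3"
    return enriched

alert = {"src_ip": "10.0.0.5", "user": "jdoe", "rule": "Suspicious login"}
enriched = enrich_alert(alert)
```

In a real playbook each lookup would be an API call to the respective system, with the enriched alert feeding the ticket workflow.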
1 Security Monitoring
Maturity
1.1 Have you formally described the security monitoring service?
1.2 Please specify elements of the security monitoring service document:
1.2.1 Key performance indicators
1.2.2 Quality indicators
1.2.3 Service dependencies
1.2.4 Service levels
1.2.5 Hours of operation
1.2.6 Service customers and stakeholders
1.2.7 Purpose
1.2.8 Service input / triggers
1.2.9 Service output / deliverables
1.2.10 Service activities
1.2.11 Service roles & responsibilities
Completeness
1.3 Is the service measured for quality?
1.4 Is the service measured for service delivery in accordance with service levels?
1.5 Are customers and/or stakeholders regularly updated about the service?
1.6 Is there a contractual agreement between the SOC and the customers?
1.7 Is sufficient personnel allocated to the process to ensure required service delivery?
1.8 Is the service aligned with other relevant processes?
1.9 Is there an incident resolution / service continuity process in place for this service?
1.10 Has a set of procedures been created for this service?
1.11 Is there an onboarding and offloading procedure for this service?
1.12 Are best practices applied to the service?
1.13 Are use cases used in the security monitoring service?
1.14 Is process data gathered for prediction of service performance?
1.15 Is the service continuously being improved based on improvement goals?
Capability
1.16 Please specify capabilities of the security monitoring service:
1.16.1 Early detection
1.16.2 Intrusion detection
1.16.3 Exfiltration detection
1.16.4 Subtle event detection
1.16.5 Malware detection
1.16.6 Anomaly detection
1.16.7 Real-time detection
1.16.8 Alerting & notification
1.16.9 False-positive reduction
1.16.10 Continuous tuning
1.16.11 Coverage management
1.16.12 Status monitoring
1.16.13 Perimeter monitoring
1.16.14 Host monitoring
1.16.15 Network & traffic monitoring
1.16.16 Access & usage monitoring
1.16.17 User / identity monitoring
1.16.18 Application & service monitoring
1.16.19 Behavior monitoring
1.16.20 Database monitoring
1.16.21 Data loss monitoring
1.16.22 Device loss / theft monitoring
1.16.23 Third-party monitoring
1.16.24 Physical environment monitoring
1.16.25 Cloud monitoring
1.16.26 Mobile device monitoring
1.16.27 OT monitoring
Completeness (%)
Answer: Averagely complete. Completeness: 84%.
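Anomaly detection (1.16.6) is often bootstrapped with simple statistical baselining before investing in dedicated UEBA tooling. A minimal z-score sketch over hourly event counts, with illustrative data:

```python
from statistics import mean, stdev

def zscore_outliers(counts, threshold=2.0):
    """Return indices of counts deviating > `threshold` std devs from the mean."""
    mu, sigma = mean(counts), stdev(counts)
    if sigma == 0:
        return []
    return [i for i, c in enumerate(counts) if abs(c - mu) / sigma > threshold]

# Hourly logon counts; hour 6 contains a burst worth triaging.
logons_per_hour = [12, 15, 11, 14, 13, 12, 240, 13]
```

A single large spike also inflates the standard deviation, so production baselining typically uses robust statistics (median / MAD) or per-entity baselines instead.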
Guidance
Maturity
2.1 Have you adopted a maturity assessment methodology for Security Incident Management?
2.1.1 If yes, please specify the methodology
2.1.2 If yes, please specify the maturity level (can have up to 2 digits)
If yes, skip directly to 2.17 (capabilities)
2.2 Have you adopted a standard for the Security Incident Management process?
2.3 Have you formally described the security incident management process?
2.4 Please specify elements of the security incident management document:
2.4.1 Security incident definition
2.4.2 Service levels
2.4.3 Workflow
2.4.4 Decision tree
2.4.5 Hours of operation
2.4.6 Service customers and stakeholders
2.4.7 Purpose
2.4.8 Service input / triggers
2.4.9 Service output / deliverables
2.4.10 Service activities
2.4.11 Service roles & responsibilities
Completeness
2.5 Is the service measured for quality?
2.6 Is the service measured for service delivery in accordance with service levels?
2.7 Are customers and/or stakeholders regularly updated about the service?
2.8 Is there a contractual agreement between the SOC and the customers?
2.9 Is sufficient personnel allocated to the process to ensure required service delivery?
2.10 Is the service aligned with other relevant processes?
2.11 Is the incident response team authorized to perform (invasive) actions when required?
2.12 Is there an onboarding and offloading procedure for this service?
2.13 Are best practices applied to the service?
2.14 Is the service supported by predefined workflows or scenarios?
2.15 Is process data gathered for prediction of service performance?
2.16 Is the service continuously being improved based on improvement goals?
Capability
2.17 Please specify capabilities and artefacts of the security incident management service:
2.17.1 Incident logging procedure
2.17.2 Incident resolution procedure
2.17.3 Incident investigation procedure
2.17.4 Escalation procedure
2.17.5 Evidence collection procedure
2.17.6 Incident containment procedures
2.17.7 IR Training
2.17.8 Table-top exercises
2.17.9 Red team / blue team exercises
2.17.10 RACI matrix
2.17.11 Response authorization
2.17.12 Incident template
2.17.13 Incident tracking system
2.17.14 False-positive reduction
2.17.15 Priority assignment
2.17.16 Severity assignment
2.17.17 Categorization
2.17.18 Critical bridge
2.17.19 War room
2.17.20 Communication plan & email templates
2.17.21 Backup communication technology
2.17.22 Secure communication channels
2.17.23 (dedicated) information sharing platform
2.17.24 Change management integration
2.17.25 Malware extraction & analysis
2.17.26 On-site incident response
2.17.27 Remote incident response
2.17.28 Third-party escalation
2.17.29 Evaluation template
2.17.30 Reporting template
2.17.31 Incident closure
2.17.32 Lessons learned extraction for process improvement
2.17.33 External security incident support agreements
2.17.34 Exercises with other incident response teams
2.17.35 Root Cause Analysis
2.17.36 Restore integrity verification
Completeness (%)
RE&CT framework:
https://atc-project.github.io/atc-react/
Answer: Partially complete. Completeness: 56%.
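Priority and severity assignment (2.17.15 / 2.17.16) is commonly implemented as an impact-by-urgency matrix. A minimal sketch with illustrative labels and mappings:

```python
# Impact x urgency priority matrix; labels and mappings are illustrative.
PRIORITY = {
    ("high", "high"): "P1",
    ("high", "low"): "P2",
    ("low", "high"): "P2",
    ("low", "low"): "P3",
}

def assign_priority(impact, urgency):
    """Map an incident's assessed impact and urgency to a priority label."""
    return PRIORITY[(impact, urgency)]
```

Codifying the matrix keeps prioritisation consistent across analysts and makes it testable as part of the incident logging procedure.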
Guidance
Maturity
3.1 Have you formally described the security analysis & forensics service?
3.2 Please specify elements of the security analysis service document:
3.2.1 Key performance indicators
3.2.2 Quality indicators
3.2.3 Service dependencies
3.2.4 Service levels
3.2.5 Hours of operation
3.2.6 Service customers and stakeholders
3.2.7 Purpose
3.2.8 Service input / triggers
3.2.9 Service output / deliverables
3.2.10 Service activities
3.2.11 Service roles & responsibilities
Completeness
3.3 Is the service measured for quality?
3.4 Is the service measured for service delivery in accordance with service levels?
3.5 Are customers and/or stakeholders regularly updated about the service?
3.6 Is there a contractual agreement between the SOC and the customers?
3.7 Is sufficient personnel allocated to the process to ensure required service delivery?
3.8 Is the service aligned with other relevant processes?
3.9 Is there an incident resolution / service continuity process in place for this service?
3.10 Has a set of procedures been created for this service?
3.11 Is there an onboarding and offloading procedure for this service?
3.12 Are best practices applied to the service?
3.13 Is the service supported by predefined workflows or scenarios?
3.14 Is process data gathered for prediction of service performance?
3.15 Is the service continuously being improved based on improvement goals?
Capability
3.16 Please specify capabilities and artefacts of the security analysis process:
3.16.1 Event analysis
3.16.2 Event analysis toolkit
3.16.3 Trend analysis
3.16.4 Incident analysis
3.16.5 Visual analysis
3.16.6 Static malware analysis
3.16.7 Dynamic malware analysis
3.16.8 Tradecraft analysis
3.16.9 Historic analysis
3.16.10 Network analysis
3.16.11 Memory analysis
3.16.12 Mobile device analysis
3.16.13 Volatile information collection
3.16.14 Remote evidence collection
3.16.15 Forensic hardware toolkit
3.16.16 Forensic analysis software toolkit
3.16.17 Dedicated analysis workstations
3.16.18 Security analysis & forensics handbook
3.16.19 Security analysis & forensics workflows
3.16.20 Case management system
3.16.21 Report templates
3.16.22 Evidence seizure procedure
3.16.23 Evidence transport procedure
3.16.24 Chain of custody preservation procedure
Completeness (%)
Answer: Fully complete. Completeness: 96%.
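Chain of custody preservation (3.16.24) hinges on hashing evidence at acquisition and logging who handled it, when, and why. A stdlib-only sketch of a custody record:

```python
import hashlib
from datetime import datetime, timezone

def custody_entry(evidence_bytes, handler, action):
    """Build one chain-of-custody record with a SHA-256 of the evidence."""
    return {
        "sha256": hashlib.sha256(evidence_bytes).hexdigest(),
        "handler": handler,
        "action": action,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

# Example: record the acquisition of a disk image by an analyst.
entry = custody_entry(b"raw disk image bytes",
                      handler="analyst-1", action="acquired")
```

Each subsequent transfer or analysis step appends a new entry; re-hashing the evidence at every step proves integrity across handovers.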
Guidance
Formal workflows created, approved and published for all incident types
4 Threat Intelligence
Maturity
4.1 Have you formally described the threat intelligence service?
4.2 Please specify elements of the threat intelligence service document:
4.2.1 Key performance indicators
4.2.2 Quality indicators
4.2.3 Service dependencies
4.2.4 Service levels
4.2.5 Hours of operation
4.2.6 Service customers and stakeholders
4.2.7 Purpose
4.2.8 Service input / triggers
4.2.9 Service output / deliverables
4.2.10 Service activities
4.2.11 Service roles & responsibilities
Completeness
4.3 Is the service measured for quality?
4.4 Is the service measured for service delivery in accordance with service levels?
4.5 Are customers and/or stakeholders regularly updated about the service?
4.6 Is there a contractual agreement between the SOC and the customers?
4.7 Is sufficient personnel allocated to the process to ensure required service delivery?
4.8 Is the service aligned with other relevant processes?
4.9 Is there an incident resolution / service continuity process in place for this service?
4.10 Has a set of procedures been created for this service?
4.11 Is there an onboarding and offloading procedure for this service?
4.12 Are best practices applied to the service?
4.13 Is process data gathered for prediction of service performance?
4.14 Is the service continuously being improved based on improvement goals?
Capability
4.15 Please specify capabilities and artefacts of the threat intelligence process:
Collection
4.15.1 Continuous intelligence gathering
4.15.2 Automated intelligence gathering & processing
4.15.3 Centralized collection & distribution
4.15.4 Intelligence collection from open / public sources
4.15.5 Intelligence collection from closed communities
4.15.6 Intelligence collection from intelligence provider
4.15.7 Intelligence collection from business partners
4.15.8 Intelligence collection from mailing lists
4.15.9 Intelligence collection from internal sources
Processing
4.15.10 Structured data analysis
4.15.11 Unstructured data analysis
4.15.12 Past incident analysis
4.15.13 Trend analysis
4.15.14 Automated alerting
4.15.15 Adversary movement tracking
4.15.16 Attacker identification
4.15.17 Threat identification
4.15.18 Threat prediction
4.15.19 TTP extraction
4.15.20 Deduplication
4.15.21 Enrichment
4.15.22 Contextualization
4.15.23 Prioritization
4.15.24 Threat intelligence reporting
4.15.25 Threat landscaping
4.15.26 Forecasting
Dissemination
4.15.27 Sharing within the company
4.15.28 Sharing with the industry
4.15.29 Sharing outside the industry
4.15.30 Sharing in standardized format (e.g. STIX)
Infrastructure Management
4.15.31 Management of the CTI infrastructure (Threat Intelligence Platform)
Completeness (%)
Answer: Incomplete. Completeness: 0%.
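Sharing in standardized format (4.15.30) usually means STIX. A minimal STIX 2.1 indicator bundle can be assembled with the standard library alone (the `stix2` package adds object validation on top). The pattern and name below are illustrative:

```python
import json
import uuid
from datetime import datetime, timezone

def stix_indicator(pattern, name):
    """Minimal STIX 2.1 indicator object (stdlib only; no schema validation)."""
    now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")
    return {
        "type": "indicator",
        "spec_version": "2.1",
        "id": f"indicator--{uuid.uuid4()}",
        "created": now,
        "modified": now,
        "name": name,
        "pattern": pattern,
        "pattern_type": "stix",
        "valid_from": now,
    }

def stix_bundle(objects):
    """Wrap STIX objects in a bundle for dissemination."""
    return {"type": "bundle", "id": f"bundle--{uuid.uuid4()}", "objects": objects}

bundle_json = json.dumps(
    stix_bundle([stix_indicator("[ipv4-addr:value = '198.51.100.7']",
                                "Suspected C2 address")]),
    indent=2,
)
```

Bundles like this are what platforms such as MISP or a TAXII server exchange when sharing within or outside the industry.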
Guidance
Remarks
5 Threat Hunting
Maturity
5.1 Do you use a standardized threat hunting methodology?
5.2 Have you formally described the threat hunting service?
5.3 Please specify elements of the threat hunting service document:
5.3.1 Key performance indicators
5.3.2 Quality indicators
5.3.3 Service dependencies
5.3.4 Service levels
5.3.5 Hours of operation
5.3.6 Service customers and stakeholders
5.3.7 Purpose
5.3.8 Service input / triggers
5.3.9 Service output / deliverables
5.3.10 Service activities
5.3.11 Service roles & responsibilities
Completeness
5.4 Is the service measured for quality?
5.5 Is the service measured for service delivery in accordance with service levels?
5.6 Are customers and/or stakeholders regularly updated about the service?
5.7 Is there a contractual agreement between the SOC and the customers?
5.8 Is sufficient personnel allocated to the process to ensure required service delivery?
5.9 Is the service aligned with other relevant processes?
5.10 Is there an incident resolution / service continuity process in place for this service?
5.11 Has a set of procedures been created for this service?
5.12 Is there an onboarding and offloading procedure for this service?
5.13 Are best practices applied to the service?
5.14 Is process data gathered for prediction of service performance?
5.15 Is the service continuously being improved based on improvement goals?
Capability
5.16 Please specify capabilities and artefacts of the threat hunting process:
5.16.1 Hash value hunting
5.16.2 IP address hunting
5.16.3 Domain name hunting
5.16.4 Network artefact hunting
5.16.5 Host-based artefact hunting
5.16.6 Adversary tools hunting
5.16.7 Adversary TTP hunting
5.16.8 Inbound threat hunting
5.16.9 Outbound threat hunting
5.16.10 Internal threat hunting
5.16.11 Outlier detection
5.16.12 Hunting coverage
5.16.13 Leveraging of existing tooling
5.16.14 Custom hunting scripts and tools
5.16.15 Dedicated hunting platform
5.16.16 Continuous hunting data collection
5.16.17 Historic hunting
5.16.18 Automated hunting
5.16.19 Hunt alerting
5.16.20 Vulnerability information integration
5.16.21 Threat intelligence integration
Completeness (%)
[1] The TaHiTI threat hunting methodology is a methodology for conducting threat hunting investigations created
by the Dutch financial sector and can be obtained from the following location:
https://www.betaalvereniging.nl/en/safety/tahiti/
Answer: Mostly complete. Completeness: 12%.
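Hash value hunting (5.16.1) is the simplest hunt type: sweep filesystems and compare digests against known-bad hashes supplied by the threat intelligence service. The IoC value below is a placeholder:

```python
import hashlib
from pathlib import Path

# Placeholder set; in practice populated from the threat intelligence feed.
IOC_HASHES = {
    "0" * 64,  # hypothetical known-bad SHA-256
}

def hunt_hashes(root):
    """Return paths under `root` whose SHA-256 matches a known-bad hash."""
    hits = []
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            if digest in IOC_HASHES:
                hits.append(str(path))
    return hits
```

At scale this is delegated to EDR telemetry queries rather than live filesystem sweeps, but the matching logic is the same.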
Guidance
Can be an internally developed approach or a publicly available methodology, such as TaHiTI [1]
A service description should be in place
6 Vulnerability Management
Maturity
6.1 Have you formally described the vulnerability management service?
6.2 Please specify elements of the vulnerability management service document:
6.2.1 Key performance indicators
6.2.2 Quality indicators
6.2.3 Service dependencies
6.2.4 Service levels
6.2.5 Hours of operation
6.2.6 Service customers and stakeholders
6.2.7 Purpose
6.2.8 Service input / triggers
6.2.9 Service output / deliverables
6.2.10 Service activities
6.2.11 Service roles & responsibilities
Completeness
6.3 Is the service measured for quality?
6.4 Is the service measured for service delivery in accordance with service levels?
6.5 Are customers and/or stakeholders regularly updated about the service?
6.6 Is there a contractual agreement between the SOC and the customers?
6.7 Is sufficient personnel allocated to the process to ensure required service delivery?
6.8 Is the service aligned with other relevant processes?
6.9 Is there an incident resolution / service continuity process in place for this service?
6.10 Has a set of procedures been created for this service?
6.11 Is there an onboarding and offloading procedure for this service?
6.12 Are best practices applied to the service?
6.13 Is process data gathered for prediction of service performance?
6.14 Is the service continuously being improved based on improvement goals?
Capability
6.15 Please specify capabilities and artefacts of the vulnerability management process:
6.15.1 Network mapping
6.15.2 Vulnerability identification
6.15.3 Risk identification
6.15.4 Risk acceptance
6.15.5 Security baseline scanning
6.15.6 Authenticated scanning
6.15.7 Incident management integration
6.15.8 Asset management integration
6.15.9 Configuration management integration
6.15.10 Patch management integration
6.15.11 Trend identification
6.15.12 Enterprise vulnerability repository
6.15.13 Enterprise application inventory
6.15.14 Vulnerability Management procedures
6.15.15 Scanning policy tuning
6.15.16 Detailed Vulnerability Reporting
6.15.17 Management Reporting
6.15.18 Scheduled scanning
6.15.19 Ad-hoc specific scanning
6.15.20 Vulnerability information gathering & analysis
Completeness (%)
Answer: Partially complete.
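Risk identification (6.15.3) typically combines scanner severity with asset context rather than ranking on CVSS alone. A minimal prioritisation sketch; the weights and findings are illustrative:

```python
def prioritise(findings):
    """Rank findings by CVSS base score weighted by asset criticality."""
    weight = {"high": 1.5, "medium": 1.0, "low": 0.5}
    return sorted(
        findings,
        key=lambda f: f["cvss"] * weight.get(f["criticality"], 1.0),
        reverse=True,
    )

findings = [
    {"host": "web01", "cve": "CVE-2024-0001", "cvss": 9.8, "criticality": "low"},
    {"host": "db01", "cve": "CVE-2024-0002", "cvss": 7.5, "criticality": "high"},
]
```

Here the lower-CVSS finding on the critical asset outranks the critical-severity finding on the low-value one, which is the point of asset-management integration.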
Guidance
7 Log Management
Maturity
7.1 Have you formally described the log management service?
7.2 Please specify elements of the log management service document:
7.2.1 Key performance indicators
7.2.2 Quality indicators
7.2.3 Service dependencies
7.2.4 Service levels
7.2.5 Hours of operation
7.2.6 Service customers and stakeholders
7.2.7 Purpose
7.2.8 Service input / triggers
7.2.9 Service output / deliverables
7.2.10 Service activities
7.2.11 Service roles & responsibilities
Completeness
7.3 Is the service measured for quality?
7.4 Is the service measured for service delivery in accordance with service levels?
7.5 Are customers and/or stakeholders regularly updated about the service?
7.6 Is there a contractual agreement between the SOC and the customers?
7.7 Is sufficient personnel allocated to the process to ensure required service delivery?
7.8 Is the service aligned with other relevant processes?
7.9 Is there an incident resolution / service continuity process in place for this service?
7.10 Has a set of procedures been created for this service?
7.11 Is there an onboarding and offloading procedure for this service?
7.12 Are best practices applied to the service?
7.13 Is process data gathered for prediction of service performance?
7.14 Is the service continuously being improved based on improvement goals?
Capability
7.15 Please specify capabilities and artefacts of the log management process:
7.15.1 End-point log collection
7.15.2 Application log collection
7.15.3 Database log collection
7.15.4 Network flow data collection
7.15.5 Network device log collection
7.15.6 Security device log collection
7.15.7 Centralized aggregation and storage
7.15.8 Multiple retention periods
7.15.9 Secure log transfer
7.15.10 Support for multiple log formats
7.15.11 Support for multiple transfer techniques
7.15.12 Data normalization
7.15.13 Log searching and filtering
7.15.14 Alerting
7.15.15 Reporting and dashboards
7.15.16 Log tampering detection
7.15.17 Log collection policy
7.15.18 Logging policy
7.15.19 Data retention policy
7.15.20 Privacy and Sensitive data handling policy
Completeness (%)
Answer: Fully complete. Completeness: 91%.
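Log tampering detection (7.15.16) can be approximated with a hash chain: each record's digest covers the previous digest, so any later edit invalidates every subsequent link. A stdlib sketch, assuming append-only log lines:

```python
import hashlib

def chain_logs(lines, seed="log-chain-seed"):
    """Build a hash chain over log lines; any later edit breaks the chain."""
    prev = hashlib.sha256(seed.encode()).hexdigest()
    chain = []
    for line in lines:
        prev = hashlib.sha256((prev + line).encode()).hexdigest()
        chain.append(prev)
    return chain

def verify(lines, chain, seed="log-chain-seed"):
    """Recompute the chain and compare against the stored one."""
    return chain_logs(lines, seed) == chain
```

Storing the chain (or just its final digest) on a separate, write-once system is what makes tampering on the log source detectable.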
Guidance
[Radar charts: maturity and capability scores for the Business, People, Process, Technology and Services domains, and per service/technology aspect; the underlying values appear in the assessment results table below.]
NIST CSF 1.1 scoring

Domain | Aspect | Maturity score | Capability score
Identify | Asset Management (ID.AM) | 4.46 | N/A
Identify | Business Environment (ID.BE) | 3.61 | N/A
Identify | Governance (ID.GV) | 2.63 | 3
Identify | Risk Assessment (ID.RA) | 2.69 | 0
Identify | Risk Management Strategy (ID.RM) | 3.75 | N/A
Identify | Supply Chain Risk Management (ID.SC) | N/A | 1.5
Identify | overall Identify | 3.43 | 1.5
Protect | Access Control (PR.AC) | 3.3 | 0
Protect | Awareness and Training (PR.AT) | 4.08 | 1.5
Protect | Data Security (PR.DS) | 2.75 | 2.68
Protect | Information Protection Processes and Procedures (PR.IP) | 1.66 | 0
Protect | Maintenance (PR.MA) | 3.21 | 0
Protect | Protective Technology (PR.PT) | 4.31 | 3
Protect | overall Protect | 3.22 | 1.2
Detect | Anomalies and Events (DE.AE) | 3.75 | 2.32
Detect | Security Continuous Monitoring (DE.CM) | 3.75 | 1.24
Detect | Detection Processes (DE.DP) | 3.83 | 2.08
Detect | overall Detect | 3.78 | 1.88
Respond | Response Planning (RS.RP) | 2.5 | 2.25
Respond | Communications (RS.CO) | 2.5 | 1.85
Respond | Analysis (RS.AN) | 5 | 2.25
Respond | Mitigation (RS.MI) | 2.5 | 0.61
Respond | Improvements (RS.IM) | 2.19 | 1.31
Respond | overall Respond | 2.94 | 1.65
Recover | Recovery Planning (RC.RP) | N/A | N/A
Recover | Improvements (RC.IM) | N/A | N/A
Recover | Communications (RC.CO) | N/A | N/A
Recover | overall Recover | N/A | N/A

[Radar charts: NIST CSF 1.1 maturity and capability scores per function and per category]
Results
1. Results
2. NIST CSF Scoring (NIST CSF 2.0 / NIST CSF 1.1)
3. Results sharing
NIST CSF 2.0 scoring

Function | Category | Maturity score | Capability score
Govern | Organizational Context (GV.OC) | 3.37 | 3
Govern | Risk Management Strategy (GV.RM) | 3.13 | N/A
Govern | Roles, Responsibilities, and Authorities (GV.RR) | 3.94 | N/A
Govern | Policy (GV.PO) | 2.86 | 3
Govern | Oversight (GV.OV) | 1.25 | N/A
Govern | Cybersecurity Supply Chain Risk Management (GV.SC) | 4.38 | 1.5
Govern | overall Govern | 3.16 | 2.5
Identify | Asset Management (ID.AM) | 4.04 | 0.75
Identify | Risk Assessment (ID.RA) | 1.76 | 0
Identify | Improvement (ID.IM) | 2.64 | 1.35
Identify | overall Identify | 2.81 | 0.7
Protect | Identity Management, Authentication, and Access Control (PR.AA) | 3.38 | N/A
Protect | Awareness and Training (PR.AT) | 3.75 | 2
Protect | Data Security (PR.DS) | 3.54 | 2.81
Protect | Platform Security (PR.PS) | 4.17 | 3
Protect | Technology Infrastructure Resilience (PR.IR) | 2.57 | 2.25
Protect | overall Protect | 3.48 | 2.52
Detect | Continuous Monitoring (DE.CM) | 3.67 | 1.99
Detect | Adverse Event Analysis (DE.AE) | 3.75 | 2.27
Detect | overall Detect | 3.71 | 2.13
Respond | Incident Management (RS.MA) | 3.5 | 1.98
Respond | Incident Analysis (RS.AN) | 5 | 2.54
Respond | Incident Response Reporting and Communication (RS.CO) | 2.5 | 1.82
Respond | Incident Mitigation (RS.MI) | 2.5 | 0.92
Respond | overall Respond | 3.38 | 1.82
Recover | Incident Recovery Plan Execution (RC.RP) | N/A | 2.63
Recover | Incident Recovery Communication (RC.CO) | N/A | N/A
Recover | overall Recover | N/A | 2.63

[Radar charts: NIST CSF 2.0 maturity and capability scores per function and per category]
Assessment results
Personal information
May we contact you regarding your scoring?
If yes: please provide your email address
SOC assessment scores Maturity score Maturity target Capability score Capability target
Business domain 3.44 3
Business drivers 3.5
Customers 4.58
Charter 3.75
Governance 3.47
Privacy & policy 1.88
People domain 3.52 3
Employees 3.13
Roles & hierarchy 4.38
People & team management 3.21
Knowledge management 4.22
Training & Education 2.68
Process 3.35 3
SOC management 3.39
Operations & facilities 3.06
Reporting & communication 3.6
Use case management 3.5
Detection engineering & validation 3.19
Technology 3.75 3 2.15 2 In scope?
SIEM / UEBA tooling 3.81 2.21 Yes
NDR tooling 0 0 No
EDR tooling 3.64 2.13 Yes
SOAR tooling 3.81 2.12 Yes
Services 2.3 3 2.15 1 In scope?
Security monitoring 3.57 2.53 Yes
Security incident management 2.41 1.69 Yes
Security analysis & forensics 4.73 2.87 Yes
Threat intelligence 0 0 Yes
Threat hunting 0.71 0.36 Yes
Vulnerability management 0 0 Yes
Log management 4.71 2.74 Yes
Next steps
1. Next steps for improvement
Maturity improvement
With the SOC-CMM assessment completed, the next steps are to determine the areas to improve. This requires som
analysed top-down. First, determine which domains are scoring less than the target maturity level. Then, drill down
maturity level was not used, then the domains should be chosen that underperform in comparison to the other dom
those domains yield the lowest scores.
When the domains and the respective aspects that require improvement have been identified, detailed information
that need to be made. The sheets for those domains provide the detailed information that is required for improvem
'Usage' sheet to determine which of the individual elements is negatively contributing to the overall score. Those ele
Improvement can as simple as creating and maintaining the appropriate documentation or as complex as introducin
CMM does not provide guidance on how to execute the improvement. This should be determined by internal expert
to purchase a licensed and supported version of the SOC-CMM. This licensed and supported version contains a num
Capability improvement
Capabilities apply to services and technologies and indicate how capable a service or technology is to reach its goals … be improved, the first question to ask is: which service or technology is negatively impacted the most by lack of capabilities? … candidate for improvement.
Similar to maturity improvement, the detailed information is provided in the sheets for those domains. The elements that score the lowest are the elements that need to be addressed. It is recommended to search for groups of elements that perhaps have the same underlying reason (root cause) for underscoring. This way, improvement of capabilities can be optimised. A common root cause is lack of documentation and formalisation.
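The element-level drill-down and root-cause grouping can be sketched as below. The element IDs echo the sheet's numbering style, but the scores, root-cause labels, and the threshold are invented for illustration:

```python
from collections import defaultdict

# Hypothetical per-element capability scores (0-5) with an assumed
# root-cause label, as one might annotate after reviewing the sheets.
elements = [
    ("S 5.16.1", 0, "documentation"),
    ("S 5.16.3", 0, "documentation"),
    ("S 5.16.4", 2, "tooling"),
    ("S 5.16.2", 3, "process"),
]

# Group the lowest-scoring elements by shared root cause, so related
# shortcomings can be addressed together rather than one by one.
by_cause = defaultdict(list)
for element, score, cause in elements:
    if score <= 2:  # assumed threshold for "needs to be addressed"
        by_cause[cause].append(element)

print(dict(by_cause))
# {'documentation': ['S 5.16.1', 'S 5.16.3'], 'tooling': ['S 5.16.4']}
```

A cluster under a single cause (here, documentation) suggests one improvement action can lift several element scores at once.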
Comparison
When a second assessment is performed, the results should be compared to the previous assessment to determine the growth and evolution of the SOC. This includes both the high-level and the detailed information about the improvement. Use the result tables to determine the differences and then drill down to those specific parts of the assessment to see where actual improvement was made, and if this is in line with goals set for improvement.
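Comparing two assessments amounts to diffing the result tables; a minimal sketch (all scores invented):

```python
# Hypothetical domain maturity scores from two consecutive assessments.
previous = {"Business": 2.8, "Services": 2.0, "Technology": 3.5}
current = {"Business": 3.2, "Services": 2.3, "Technology": 3.4}

# Positive deltas show growth; negative deltas flag regressions worth
# drilling into in the detailed domain sheets.
deltas = {d: round(current[d] - previous[d], 2) for d in previous}
print(deltas)  # {'Business': 0.4, 'Services': 0.3, 'Technology': -0.1}
```

The same diff can be repeated at aspect and element level to check whether the measured growth matches the improvement goals that were set.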
in scope type answer importance
SOC-CMM - Business Domain
B1 - Business Drivers
B 1.1 1 M 4 3
B 1.2 1 M 3 3
B 1.3 1 M 3 3
B 1.4 1 M 5 3
B 1.5 1 M 4 3
SUM 19 15
B2 - Customers
B 2.1 1 M 5 3
B 2.2
B 2.2.1 2
B 2.2.2 2
B 2.2.3 1
B 2.2.4 2
B 2.2.5 1
B 2.2.6 2
B 2.2.7 2
B 2.2.8
B 2.3 1 M 4 3
B 2.4 1 M 4 3
B 2.5 1 M 5 3
B 2.6 1 M 5 3
B 2.7 1 M 5 3
SUM 28 18
B3 - SOC Charter
B 3.1 1 M 4 3
B 3.2 Mostly complete
B 3.2.1 2
B 3.2.2 2
B 3.2.3 1
B 3.2.4 2
B 3.2.5 1
B 3.2.6 2
B 3.2.7 2
B 3.2.8 2
B 3.2.9 2
B 3.2.10 2
B 3.2.11 2
B 3.3 1 M 3 3
B 3.4 1 M 4 3
B 3.5 1 M 5 3
SUM 16 12
B4 - Governance
B 4.1 1 M 3 3
B 4.2 1 M 3 3
B 4.3 Fully complete
B 4.3.1 2
B 4.3.2 2
B 4.3.3 2
B 4.3.4 2
B 4.3.5 2
B 4.3.6 2
B 4.3.7 2
B 4.3.8 2
B 4.3.9 2
B 4.3.10 2
B 4.3.11 2
B 4.3.12 2
B 4.3.13 2
B 4.3.14 2
B 4.4 1 M 5 3
B 4.5 Fully complete
B 4.5.1 2
B 4.5.2 2
B 4.5.3 2
B 4.5.4 2
B 4.5.5 2
B 4.5.6 2
B 4.5.7 2
B 4.5.8 2
B 4.8 1 M 5 3
B 4.9 1 M 5 3
B 4.10 1 M 3 3
B 4.11 1 M 4 3
Maturity SUM 34 27
P3 - People Management
P 3.1 1 M 2 3
P 3.2 1 M 3 3
P 3.3 1 M 1 3
P 3.4 1 M 4 3
P 3.7 1 M 2 3
P 3.8 1 M 3 3
P 3.9 1 M 3 3
P 3.10 1 M 4 3
P 3.11 1 M 5 3
P 3.12 1 M 5 3
Maturity SUM 50 42
P4 - Knowledge Management
P 4.1 1 M 5 3
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
P 4.9 1 M 4 3
P 4.10 1 M 4 3
Maturity SUM 35 24
T2 - NDR Tooling
T 2 - Scope 1
T 2.1
T 2.1.1 0 M 0 3
T 2.1.2 0 M 0 3
T 2.2
T 2.2.1 0 M 0 3
T 2.2.2 0 M 0 3
T 2.3
T 2.3.1 0 M 0 3
T 2.3.2 0 M 0 3
T 2.3.3 0 M 0 3
T 2.3.4 0 M 0 3
T 2.5
T 2.5.1 0 M 0 3
T 2.5.2 0 M 0 3
T 2.5.3 0 M 0 3
T 2.5.4 0 M 0 3
T 2.5.5 0 M 0 3
T 2.5.6 0 M 0 3
T 2.6
T 2.6.1 0 M 0 3
T 2.6.1 0 M 0 3
T 2.6.2 0 M 0 3
T 2.6.2 0 M 0 3
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Capability SUM 0 0
Maturity SUM 0 0
T3 - Security Analytics
T 3 - Scope 2
T 3.1
T 3.1.1 1 M 4 3
T 3.1.2 1 M 4 3
T 3.2
T 3.2.1 1 M 4 3
T 3.2.2 1 M 4 3
T 3.3 1
T 3.3.1 1 M 3 3
T 3.3.2 1 M 5 3
T 3.3.3 1 M 5 3
T 3.3.4 1 M 5 3
T 3.5
T 3.5.1 1 M 3 3
T 3.5.2 1 M 3 3
T 3.5.3 1 M 4 3
T 3.5.4 1 M 3 3
T 3.5.5 1 M 3 3
T 3.5.6 1 M 3 3
T 3.6
T 3.6.1 1 M 3 3
T 3.6.1 1 M 3 3
T 3.6.2 1 M 3 3
T 3.6.2 1 M 3 3
T 3.7
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Removed, keep lines for backwards compatibility
Capability SUM 188 147
Maturity SUM 86 66
S 3 - Security Analysis
S 3 - Scope 2
S 3.1 1 M 5 3
S 3.1 1 M 5 3
S 3.1 1 M 5 3
S 3.1 1 M 5 3
S 3.2
S 3.2.1 2
S 3.2.2 2
S 3.2.3 2
S 3.2.4 2
S 3.2.5 2
S 3.2.6 2
S 3.2.7 2
S 3.2.8 2
S 3.2.9 2
S 3.2.10 2
S 3.2.11 2
S 3.3 1 M 5 3
S 3.4 1 M 5 3
S 3.5 1 M 4 3
S 3.6 1 M 4 3
S 3.7 1 M 5 3
S 3.8 1 M 5 3
S 3.9 1 M 5 3
S 3.9 1 M 5 3
S 3.10 1 M 5 3
S 3.12 1 M 5 3
S 3.13 1 M 5 3
S 3.13 1 M 5 3
S 3.13 1 M 5 3
S 3.13 1 M 5 3
S 3.14 1 M 5 3
S 3.15 1 M 5 3
S 3.16
S 3.16.1 1 C 5 3
S 3.16.2 1 C 5 3
S 3.16.3 1 C 3 3
S 3.16.4 1 C 5 3
S 3.16.5 1 C 5 3
S 3.16.6 1 C 5 3
S 3.16.7 1 C 5 3
S 3.16.8 1 C 5 3
S 3.16.9 1 C 5 3
S 3.16.10 1 C 5 3
S 3.16.11 1 C 5 3
S 3.16.12 1 C 5 3
S 3.16.13 1 C 5 3
S 3.16.14 1 C 5 3
S 3.16.15 1 C 5 3
S 3.16.16 1 C 5 3
S 3.16.17 1 C 5 3
S 3.16.18 1 C 5 3
S 3.16.19 1 C 5 3
S 3.16.20 1 C 5 3
S 3.16.21 1 C 5 3
S 3.16.22 1 C 5 3
S 3.16.23 1 C 5 3
S 3.16.24 1 C 3 3
S 3.17
Capability SUM 116 72
Maturity SUM 67 42
S4 - Threat Intelligence
S 4 - Scope 2
S 4.1 1 M 0 3
S 4.2
S 4.2.1 1
S 4.2.2 1
S 4.2.3 1
S 4.2.4 1
S 4.2.5 1
S 4.2.6 1
S 4.2.7 1
S 4.2.8 1
S 4.2.9 1
S 4.2.10 1
S 4.2.11 1
S 4.3 1 M 0 3
S 4.4 1 M 0 3
S 4.5 1 M 0 3
S 4.6 1 M 0 3
S 4.7 1 M 0 3
S 4.8 1 M 0 3
S 4.9 1 M 0 3
S 4.9 1 M 0 3
S 4.10 1 M 0 3
S 4.12 1 M 0 3
S 4.13 1 M 0 3
S 4.14 1 M 0 3
S 4.15
S 4.15.1 1 C 0 3
S 4.15.2 1 C 0 3
S 4.15.3 1 C 0 3
S 4.15.4 1 C 0 3
S 4.15.5 1 C 0 3
S 4.15.6 1 C 0 3
S 4.15.7 1 C 0 3
S 4.15.8 1 C 0 3
S 4.15.9 1 C 0 3
S 4.15.10 1 C 0 3
S 4.15.11 1 C 0 3
S 4.15.12 1 C 0 3
S 4.15.13 1 C 0 3
S 4.15.14 1 C 0 3
S 4.15.15 1 C 0 3
S 4.15.16 1 C 0 3
S 4.15.17 1 C 0 3
S 4.15.18 1 C 0 3
S 4.15.19 1 C 0 3
S 4.15.20 1 C 0 3
S 4.15.21 1 C 0 3
S 4.15.22 1 C 0 3
S 4.15.23 1 C 0 3
S 4.15.24 1 C 0 3
S 4.15.25 1 C 0 3
S 4.15.26 1 C 0 3
S 4.15.27 1 C 0 3
S 4.15.28 1 C 0 3
S 4.15.29 1 C 0 3
S 4.16
Capability SUM 0 93
Maturity SUM 0 39
S5 - Hunting
S 5 - Scope 2
S 5.1 1 M 0 3
S 5.2 1 M 0 3
S 5.3
S 5.3.1 2
S 5.3.2 2
S 5.3.3 2
S 5.3.4 2
S 5.3.5 2
S 5.3.6 2
S 5.3.7 1
S 5.3.8 2
S 5.3.9 2
S 5.3.10 2
S 5.3.11 1
S 5.4 1 M 0 3
S 5.5 1 M 3 3
S 5.6 1 M 4 3
S 5.7 1 M 3 3
S 5.8 1 M 0 3
S 5.9 1 M 0 3
S 5.10 1 M 0 3
S 5.10 1 M 0 3
S 5.11 1 M 3 3
S 5.13 1 M 3 3
S 5.14 1 M 3 3
S 5.15 1 M 0 3
S 5.16
S 5.16.1 1 C 0 3
S 5.16.2 1 C 3 3
S 5.16.3 1 C 0 3
S 5.16.4 1 C 2 3
S 5.16.5 1 C 0 3
S 5.16.6 1 C 3 3
S 5.16.7 1 C 0 3
S 5.16.8 1 C 3 3
S 5.16.9 1 C 0 3
S 5.16.10 1 C 3 3
S 5.16.11 1 C 3 3
S 5.16.12 1 C 0 3
S 5.16.13 1 C 0 3
S 5.16.14 1 C 0 3
S 5.16.15 1 C 0 3
S 5.16.16 1 C 0 3
S 5.16.17 1 C 3 3
S 5.16.18 1 C 3 3
S 5.16.19 1 C 3 3
S 5.16.20 1 C 3 3
S 5.16.21 1 C 2 3
S 5.17
Capability SUM 31 63
Maturity SUM 22 42
S6 - Vulnerability Management
S 6 - Scope 2
S 6.1 1 M 0 3
S 6.1 1 M 0 3
S 6.2
S 6.2.1 1
S 6.2.2 2
S 6.2.3 1
S 6.2.4 1
S 6.2.5 1
S 6.2.6 1
S 6.2.7 1
S 6.2.8 2
S 6.2.9 1
S 6.2.10 1
S 6.2.11 1
S 6.3 1 M 0 3
S 6.4 1 M 0 3
S 6.5 1 M 0 3
S 6.6 1 M 0 3
S 6.7 1 M 0 3
S 6.8 1 M 0 3
S 6.9 1 M 0 3
S 6.9 1 M 0 3
S 6.10 1 M 2 3
S 6.10 1 M 2 3
S 6.12 1 M 3 3
S 6.13 1 M 0 3
S 6.14 1 M 0 3
S 6.15
S 6.15.1 1 C 0 3
S 6.15.1 1 C 0 3
S 6.15.2 1 C 0 3
S 6.15.2 1 C 0 3
S 6.15.3 1 C 0 3
S 6.15.3 1 C 0 3
S 6.15.4 1 C 0 3
S 6.15.5 1 C 0 3
S 6.15.6 1 C 0 3
S 6.15.7 1 C 0 3
S 6.15.8 1 C 0 3
S 6.15.9 1 C 0 3
S 6.15.10 1 C 0 3
S 6.15.10 1 C 0 3
S 6.15.10 1 C 0 3
S 6.15.11 1 C 0 3
S 6.15.11 1 C 0 3
S 6.15.12 1 C 0 3
S 6.15.13 1 C 0 3
S 6.15.13 1 C 0 3
S 6.15.14 1 C 0 3
S 6.15.15 1 C 0 3
S 6.15.16 1 C 0 3
S 6.15.17 1 C 0 3
S 6.15.18 1 C 0 3
S 6.15.19 1 C 0 3
S 6.16
Capability SUM 0 60
Maturity SUM 8 39
S7 - Log Management
S 7 - Scope 2
S 7.1 1 M 5 3
S 7.2
S 7.2.1 2
S 7.2.2 2
S 7.2.3 2
S 7.2.4 2
S 7.2.5 2
S 7.2.6 2
S 7.2.7 2
S 7.2.8 2
S 7.2.9 2
S 7.2.10 2
S 7.2.11 2
S 7.3 1 M 5 3
S 7.4 1 M 5 3
S 7.5 1 M 4 3
S 7.6 1 M 5 3
S 7.7 1 M 5 3
S 7.8 1 M 5 3
S 7.9 1 M 3 3
S 7.9 1 M 3 3
S 7.10 1 M 5 3
S 7.12 1 M 5 3
S 7.13 1 M 5 3
S 7.14 1 M 5 3
S 7.15
S 7.15.1 1 C 0 3
S 7.15.2 1 C 5 3
S 7.15.3 1 C 5 3
S 7.15.4 1 C 5 3
S 7.15.5 1 C 4 3
S 7.15.6 1 C 5 3
S 7.15.7 1 C 5 3
S 7.15.8 1 C 4 3
S 7.15.9 1 C 5 3
S 7.15.10 1 C 5 3
S 7.15.11 1 C 5 3
S 7.15.12 1 C 5 3
S 7.15.13 1 C 5 3
S 7.15.14 1 C 5 3
S 7.15.15 1 C 5 3
S 7.15.16 1 C 5 3
S 7.15.17 1 C 5 3
S 7.15.18 1 C 5 3
S 7.15.19 1 C 5 3
S 7.15.19 1 C 5 3
S 7.15.20 1 C 5 3
S 7.16
Capability SUM 93 60
Maturity SUM 62 39
M5 - Detection Engineering
M 5.1.1 1 M 3 3
M 5.1.2 1 M 3 3
M 5.1.3 1 M 3 3
M 5.1.4 1 M 4 3
M 5.1.5 1 M 4 3
M 5.1.6 1 M 4 3
M 5.1.7 1 M 4 3
M 5.1.8 1 M 3 3
M 5.1.9 1 M 3 3
M 5.1.10 1 M 4 3
M 5.2.1 1 M 4 3
M 5.2.2 1 M 5 3
M 5.2.3 1 M 4 3
M 5.2.4 1 M 3 3
M 5.2.5 1 M 4 3
M 5.2.6 1 M 3 3
Maturity SUM 64 54
Technology
T 1.4.1 1 M 4 3
T 1.4.2 1 M 4 3
T 1.4.3 1 M 4 3
T 1.4.4 1 M 5 3
T 1.4.5 1 M 5 3
T 2.4.1 0 M 0 3
T 2.4.2 0 M 0 3
T 2.4.3 0 M 0 3
T 2.4.4 0 M 0 3
T 2.4.5 0 M 0 3
T 3.4.1 1 M 4 3
T 3.4.2 1 M 4 3
T 3.4.3 1 M 5 3
T 3.4.4 1 M 5 3
T 3.4.5 1 M 5 3
T 4.4.1 1 M 3 3
T 4.4.2 1 M 3 3
T 4.4.3 1 M 3 3
T 4.4.4 1 M 5 3
T 4.4.5 1 M 5 3
Services
S 1.11 1 M 4 3
S 2.12 1 M 3 3
S 3.11 1 M 4 3
S 4.11 1 M 0 3
S 5.12 1 M 3 3
S 6.11 1 M 3 3
S 7.11 1 M 5 3
S 4.14.25 1 C 0 3
S 4.14.31 1 C 0 3
1
1
1
1
1
PR.AT-1 PR.AT-1 PR.AT-01 PR.AT-01 1
PR.IP-11 PR.IP-11 GV.RR-04 GV.RR-04 1
1
1
1
0 0 14
1
1
0 0 8
1
PR.AT-1 PR.AT-1 PR.AT-02 PR.AT-02 1
PR.AT-1 PR.AT-1 PR.AT-02 PR.AT-02 1
PR.AT-1 PR.AT-1 PR.AT-02 PR.AT-02 1
1
PR.AT-1 PR.AT-1 PR.AT-02 PR.AT-02 1
0 0 7
1
1
1
1
0 0 7
1
PR.IP-3 PR.IP-3 PR.PS-01 PR.PS-01 1
1
1
ID.AM-08 ID.AM-08 1
1
1
0 0 31
1
1
1
1
1
1
1
1
1
0 0 17
1
1
PR.IR-04 PR.IR-04 1
PR.AT-5 PR.AT-5 PR.AT-02 PR.AT-02 1
PR.AT-5 PR.AT-5 PR.AT-02 PR.AT-02 1
ID.SC-3 ID.SC-3 GV.SC-04 GV.SC-04 1
1
PR.AC-4 PR.AC-4 PR.AA-05 PR.AA-05 1
1
PR.AC-4 PR.AC-4 PR.AA-05 PR.AA-05 1
0 0 38
0 0 22
DE.DP-1 GV.RR-02 1
DE.DP-1 GV.RR-02 1
1
1
PR.IR-04 1
PR.AT-5 PR.AT-02 1
PR.AT-5 PR.AT-02 1
ID.SC-3 GV.SC-04 1
PR.PT-5 PR.IR-03 1
PR.IP-4 PR.DS-11 1
PR.IP-4 PR.DS-11 1
PR.IP-9 ID.IM-04 1
PR.IP-10 ID.IM-02 1
PR.DS-7 PR.IR-01 1
PR.PT-3 PR.PS-04 1
PR.AC-4 PR.AA-05 1
PR.PT-3 PR.PS-04 1
PR.AC-4 PR.AA-05 1
0 0 0
0 0 0
1
1
PR.IR-04 PR.IR-04 1
PR.AT-5 PR.AT-5 PR.AT-02 PR.AT-02 1
PR.AT-5 PR.AT-5 PR.AT-02 PR.AT-02 1
ID.SC-3 ID.SC-3 GV.SC-04 GV.SC-04 1
0 0 49
0 0 22
1
1
PR.IR-04 PR.IR-04 1
PR.AT-5 PR.AT-5 PR.AT-02 PR.AT-02 1
PR.AT-5 PR.AT-5 PR.AT-02 PR.AT-02 1
ID.SC-3 ID.SC-3 GV.SC-04 GV.SC-04 1
1
DE.AE-07 DE.AE-07 1
1
1
1
1
1
1
1
1
1
1
1
1
1
0 0 24
0 0 22
DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
DE.CM-2 DE.CM-2 DE.CM-02 DE.CM-02 1
DE.CM-3 DE.CM-3 DE.CM-03 DE.CM-03 1
DE.CM-4 DE.CM-4 DE.CM-01 DE.CM-01 1
DE.CM-5 DE.CM-5 DE.CM-09 DE.CM-09 1
DE.CM-6 DE.CM-6 DE.CM-06 DE.CM-06 1
DE.CM-7 DE.CM-7 1
DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-4 DE.DP-4 DE.AE-06 DE.AE-06 1
DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1
DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1
DE.DP-2 DE.DP-2 1
PR.IP-9 PR.IP-9 ID.IM-04 ID.IM-04 1
PR.MA-1 PR.MA-1 ID.AM-08 ID.AM-08 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
DE.CM-2 DE.CM-2 DE.CM-02 DE.CM-02 1
DE.CM-3 DE.CM-3 DE.CM-03 DE.CM-03 1
DE.CM-4 DE.CM-4 DE.CM-01 DE.CM-01 1
DE.CM-5 DE.CM-5 DE.CM-09 DE.CM-09 1
DE.CM-6 DE.CM-6 DE.CM-06 DE.CM-06 1
DE.CM-7 DE.CM-7 1
DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1
DE.DP-2 DE.DP-2 1
DE.DP-5 DE.DP-5 ID.IM-03 ID.IM-03 1
0 0 27
0 0 14
1
RS.CO-1 RS.CO-1 PR.AT-01 PR.AT-01 1
0 0 36
0 0 14
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-4 DE.DP-4 DE.AE-06 DE.AE-06 1
DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1
DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1
DE.DP-2 DE.DP-2 1
PR.IP-9 PR.IP-9 ID.IM-04 ID.IM-04 1
PR.MA-1 PR.MA-1 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
RS.AN-1 RS.AN-1 RS.MA-02 RS.MA-02 1
RS.AN-3 RS.AN-3 RS.AN-03 RS.AN-03 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.DP-2 DE.DP-2 1
DE.DP-5 DE.DP-5 ID.IM-03 ID.IM-03 1
0 0 24
0 0 14
1
1
1
1
PR.IR-04 PR.IR-04 1
ID.RA-5 ID.RA-5 1
PR.IP-9 PR.IP-9 ID.IM-04 ID.IM-04 1
PR.MA-1 PR.MA-1 1
1
ID.RA-3 ID.RA-3 ID.RA-03 ID.RA-03 1
1
1
0 0 31
0 0 13
1
ID.RA-3 ID.RA-3 ID.RA-03 ID.RA-03 1
1
1
1
1
PR.IR-04 PR.IR-04 1
1
PR.IP-9 PR.IP-9 ID.IM-04 ID.IM-04 1
PR.MA-1 PR.MA-1 1
1
1
1
DE.DP-5 DE.DP-5 ID.IM-03 ID.IM-03 1
0 0 21
0 0 14
PR.IP-12 PR.IP-12 1
ID.RA-1 ID.RA-1 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
PR.IP-9 PR.IP-9 ID.IM-04 ID.IM-04 1
PR.MA-1 PR.MA-1 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
ID.RA-1 ID.RA-1 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
PR.IP-12 PR.IP-12 ID.RA-01 ID.RA-01 1
0 0 20
0 0 13
0 0 20
0 0 13
1
ID.RA-2 ID.RA-2 ID.RA-02 ID.RA-02 1
1
ID.GV-1 ID.GV-1 GV.PO-01 GV.PO-01 1
1
ID.GV-1 ID.GV-1 GV.PO-01 GV.PO-01 1
1
ID.GV-1 ID.GV-1 GV.PO-01 GV.PO-01 1
ID.GV-1 ID.GV-1 GV.OC-03 GV.OC-03 1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
DE.DP-1 DE.DP-1 GV.RR-02 GV.RR-02 1
1
1
1
PR.DS-7 PR.DS-7 PR.IR-01 PR.IR-01 1
1
1
1
DE.DP-3 DE.DP-3 ID.IM-02 ID.IM-02 1
DE.DP-3 DE.DP-3 ID.IM-02 ID.IM-02 1
DE.DP-3 DE.DP-3 ID.IM-02 ID.IM-02 1
DE.DP-3 DE.DP-3 ID.IM-02 ID.IM-02 1
1
DE.DP-5 DE.DP-5 ID.IM-03 ID.IM-03 1
0 0 18
1
1
1
1
1
1
1
ID.RA-5 ID.RA-5 ID.RA-05 ID.RA-05 1
1
GV.OV-03 GV.OV-03 1
ID.GV-1 ID.GV-1 GV.PO-02 GV.PO-02 1
ID.GV-2 ID.GV-2 1
1
1
1
ID.IM-03 ID.IM-03 1
DE.DP-5 DE.DP-5 ID.IM-03 ID.IM-03 1
1
1
1
1
1
DE.CM-4 DE.CM-4 DE.CM-09 DE.CM-09 1
DE.CM-5 DE.CM-5 1
1
DE.CM-4 DE.CM-4 DE.CM-09 DE.CM-09 1
1
1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
DE.CM-1 DE.CM-1 DE.CM-01 DE.CM-01 1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
1
DE.DP-2 DE.DP-2 DE.AE-02 DE.AE-02 1
1
1
1
1
1
1
DE.AE-07 DE.AE-07 1
1
1
1
DE.AE-3 DE.AE-3 DE.AE-03 DE.AE-03 1
1
1
1
1
1
1
1
1
1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
DE.AE-2 DE.AE-2 DE.AE-02 DE.AE-02 1
1
1
1
1
1
1
1
1
RC.RP-03 RC.RP-03 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 DE.CM-09 1
DE.CM-09 RC.RP-03 1
DE.AE-02 1
PR.PS-03 PR.PS-03 1
PR.PS-02 PR.PS-02 1
PR.DS-02 PR.DS-02 1
PR.DS-10 PR.DS-10 1
RS.AN-06 RS.AN-06 1
RS.AN-07 RS.AN-07 1
RS.AN-08 RS.AN-08 1
RS.AN-08 RS.AN-08 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 1
PR.PS-03 1
PR.PS-03 1
PR.PS-03 1
PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.PS-03 PR.PS-03 1
PR.IR-04 PR.IR-04 1
PR.IR-04 PR.IR-04 1
PR.IR-04 PR.IR-04 1
PR.IR-04 PR.IR-04 1
PR.IR-04 PR.IR-04 1
GV.RR-03 GV.RR-03 1
GV.PO-01 GV.PO-01 1
GV.SC-02 GV.SC-02 1
DE.CM-06 DE.CM-06 1
DE.CM-06 1
DE.CM-06 DE.CM-06 1
DE.CM-06 DE.CM-06 1
total score MAX score final score
4 5 75
3 5 50
3 5 50
5 5 100
4 5 75
19 25 70
5 5
4 5
4 5
5 5
5 5
5 5
28 30 91.67
4 5
3 5
4 5
5 5
16 20 75
3 5
3 5
5 5
5 5
5 5
3 5
4 5
34 45 69.44
3 5
2 5
2 5
2 5
2 5
2 5
3 5
3 5
25 50 37.5
3 5
2 5
2 5
3 5
5 5
4 5
28 40 62.5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
4 5
4 5
4 5
5 5
5 5
5 5
4 5
3 5
36 40 87.5
2 5
3 5
1 5
4 5
2 5
3 5
3 5
4 5
5 5
5 5
50 70 64.29
5 5
4 5
4 5
35 40 84.38
4 5
1 5
5 5
3 5
5 5
2 5
2 5
22 35 53.57
4 5
4 5
5 5
4 5
26 35 67.86
3 5
3 5
3 5
4 5
4 5
4 5
4 5
4 5
4 5
5 5
3 5
3 5
3 5
4 5
2 5
3 5
5 5
5 5
3 5
3 5
3 5
2 5
2 5
107 155 61.29
5 5
4 5
4 5
5 5
3 5
4 5
4 5
3 5
4 5
66 85 72.06
3 5
3 5
3 5
4 5
4 5
4 5
3 5
3 5
3 5
4 5
4 5
4 5
4 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
4 5
4 5
4 5
5 5
5 5
5 5
5 5
5 5
5 5
76 100 70
4 5
4 5
4 5
4 5
5 5
5 5
5 5
4 5
4 5
4 5
4 5
4 5
3 5
2 5
4 5
4 5
4 5
4 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 0 N/A
0 0 N/A
4 5
4 5
4 5
4 5
3 5
5 5
5 5
5 5
3 5
3 5
4 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
4 5
4 5
4 5
4 5
3 5
3 5
5 5
5 5
4 5
4 5
4 5
4 5
4 5
5 5
5 5
5 5
4 5
4 5
4 5
4 5
4 5
4 5
3 5
4 5
4 5
4 5
4 5
4 5
3 5
3 5
4 5
3 5
5 5
5 5
92 120 70.83
89 110 76.14
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
3 5
3 5
3 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
3 5
5 5
4 5
4 5
5 5
5 5
5 5
5 5
5 5
4 5
5 5
5 5
4 5
3 5
5 5
4 5
4 5
4 5
5 5
4 5
2 5
3 5
3 5
3 5
3 5
3 5
3 5
2 5
2 5
3 5
3 5
3 5
2 5
3 5
3 5
3 5
3 5
3 5
3 5
4 5
4 5
5 5
4 5
3 5
4 5
3 5
4 5
4 5
4 5
3 5
4 5
4 5
4 5
4 5
4 5
3 5
5 5
4 5
3 5
2 5
2 5
3 5
3 5
3 5
2 5
2 5
1 5
1 5
2 5
2 5
2 5
3 5
3 5
3 5
3 5
5 5
5 5
5 5
5 5
5 5
5 5
4 5
4 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
3 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
3 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 155 0
0 65 0
0 5
0 5
0 5
3 5
4 5
3 5
0 5
0 5
0 5
0 5
3 5
3 5
3 5
0 5
0 5
3 5
0 5
2 5
0 5
3 5
0 5
3 5
0 5
3 5
3 5
0 5
0 5
0 5
0 5
0 5
3 5
3 5
3 5
3 5
2 5
31 105 11.9
22 70 14.29
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
2 5
2 5
3 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 100 0
8 65 0
5 5
5 5
5 5
4 5
5 5
5 5
5 5
3 5
3 5
5 5
5 5
5 5
5 5
0 5
5 5
5 5
5 5
4 5
5 5
5 5
4 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
5 5
93 100 91.25
62 65 94.23
3 5
5 5
4 5
5 5
2 5
3 5
3 5
3 5
3 5
0 5
4 5
4 5
4 5
4 5
2 5
2 5
2 5
2 5
3 5
4 5
5 5
5 5
5 5
3 5
5 5
4 5
3 5
2 5
4 5
4 5
4 5
4 5
4 5
3 5
3 5
5 5
3 5
5 5
3 5
3 5
5 5
5 5
3 5
3 5
4 5
5 5
3 5
3 5
3 5
4 5
4 5
4 5
4 5
3 5
3 5
4 5
4 5
5 5
4 5
3 5
4 5
3 5
64 90 63.89
4 5
4 5
4 5
5 5
5 5
0 5
0 5
0 5
0 5
0 5
4 5
4 5
5 5
5 5
5 5
3 5
3 5
3 5
5 5
5 5
4 5
3 5
4 5
0 5
3 5
3 5
5 5
0 5
0 5
2 5
2 5
2 5
4 5
5 5
5 5
4 5
3 5
3 5
3 5
3 5
5 5
4 5
4 5
3 5
3 5
3 5
3 5
3 5
0 5
4 5
4 5
4 5
5 5
5 5
4 5
5 5
4 5
5 5
4 5
4 5
3 5
3 5
4 5
5 5
3 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
3 5
2 5
4 5
4 5
4 5
5 5
5 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
5 5
3 5
4 5
5 5
4 5
4 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
3 5
3 5
3 5
4 5
4 5
5 5
3 5
5 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
4 5
5 5
5 5
5 5
5 5
5 5
5 5
2 5
2 5
3 5
3 5
4 5
4 5
4 5
4 5
3 5
3 5
3 5
3 5
3 5
3 5
3 5
5 5
4 5
4 5
4 5
5 5
4 5
4 5
2 5
3 5
3 5
3 5
4 5
4 5
3 5
3 5
4 5
3 5
5 5
4 5
5 5
5 5
5 5
4 5
2 5
3 5
4 5
4 5
4 5
4 5
5 5
4 5
3 5
4 5
4 5
4 5
5 5
5 5
5 5
5 5
5 5
4 5
5 5
5 5
4 5
3 5
5 5
5 5
5 5
5 5
0 5
3 5
3 5
5 5
5 5
3 5
5 5
4 5
4 5
4 5
4 5
4 5
5 5
5 5
0 5
0 5
0 5
0 5
0 5
4 5
4 5
5 5
5 5
5 5
3 5
3 5
3 5
5 5
5 5
4 5
2 5
5 5
0 5
5 5
5 5
5 5
4 5
4 5
0 5
4 5
3 5
remarks
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
Not part of scoring
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
not used in calculations, but to determine 1.6
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST MAPPING
NIST CSF
NIST CSF version Function Category
2.0 PROTECT (PR) Identity Management, Authentication, and Access Control (PR.AA)
2.0 PROTECT (PR) Identity Management, Authentication, and Access Control (PR.AA)
2.0 PROTECT (PR) Identity Management, Authentication, and Access Control (PR.AA)
2.0 PROTECT (PR) Identity Management, Authentication, and Access Control (PR.AA)
2.0 PROTECT (PR) Identity Management, Authentication, and Access Control (PR.AA)
2.0 PROTECT (PR) Identity Management, Authentication, and Access Control (PR.AA)
GV.OC-01 0 0 0 0
GV.OC-02 7 7 32 35
GV.OC-03 7 7 18 35
GV.OC-04 9 9 35 45
GV.OC-05 0 0 0 0
SUM 23 85 115
GV.RM-01 1 1 3 5
GV.RM-02 0 0 0 0
GV.RM-03 1 1 4 5
GV.RM-04 0 0 0 0
GV.RM-05 0 0 0 0
GV.RM-06 0 0 0 0
GV.RM-07 0 0 0 0
SUM 2 7 10
GV.RR-01 0 0 0 0
GV.RR-02 17 17 72 85
GV.RR-03 2 2 8 10
GV.RR-04 1 1 3 5
SUM 20 83 100
GV.PO-01 4 4 11 20
GV.PO-02 3 3 12 15
SUM 7 23 35
GV.OV-01 0 0 0 0
GV.OV-02 0 0 0 0
GV.OV-03 1 1 2 5
SUM 1 2 5
GV.SC-01 0 0 0 0
GV.SC-02 9 9 40 45
GV.SC-03 0 0 0 0
GV.SC-04 3 3 14 15
GV.SC-05 0 0 0 0
GV.SC-06 0 0 0 0
GV.SC-07 0 0 0 0
GV.SC-08 0 0 0 0
GV.SC-09 0 0 0 0
GV.SC-10 0 0 0 0
SUM 12 54 60
Total
ID.AM-01 0 0 0 0
ID.AM-02 0 0 0 0
ID.AM-03 0 0 0 0
ID.AM-04 0 0 0 0
ID.AM-05 0 0 0 0
ID.AM-07 0 0 0 0
ID.AM-08 17 17 72 85
SUM 17 72 85
ID.RA-01 12 12 7 60
ID.RA-02 1 1 4 5
ID.RA-03 4 4 3 20
ID.RA-04 11 11 40 55
ID.RA-05 11 11 40 55
ID.RA-06 0 0 0 0
ID.RA-07 0 0 0 0
ID.RA-08 0 0 0 0
ID.RA-09 0 0 0 0
ID.RA-10 0 0 0 0
SUM 39 94 195
ID.IM-01 1 1 3 5
ID.IM-02 12 12 43 60
ID.IM-03 12 12 38 60
ID.IM-04 10 10 25 50
SUM 35 109 175
Total
PR.AA-01 0
0 0 0
PR.AA-02 0
0 0 0
PR.AA-03 0
0 0 0
PR.AA-04 0
0 0 0
PR.AA-05 9
9 34 45
PR.AA-06 1
1 3 5
SUM 10 37 50
PR.AT-01 6 6 21 30
PR.AT-02 11 11 47 55
SUM 17 68 85
PR.DS-01 0 0 0 0
PR.DS-02 0 0 0 0
PR.DS-10 0 0 0 0
PR.DS-11 6 6 23 30
SUM 6 23 30
PR.PS-01 1 1 4 5
PR.PS-02 1 1 3 5
PR.PS-03 16 16 67 80
PR.PS-04 15 15 69 75
PR.PS-05 0 0 0 0
PR.PS-06 0 0 0 0
SUM 33 143 165
PR.IR-01 5 5 17 25
PR.IR-02 1 1 3 5
PR.IR-03 3 3 11 15
PR.IR-04 10 10 27 50
SUM 19 58 95
Total
DE.CM-01 4 4 16 20
DE.CM-02 2 2 8 10
DE.CM-03 2 2 8 10
DE.CM-06 5 5 19 25
DE.CM-09 2 2 8 10
SUM 15 59 75
DE.AE-02 2 2 10 10
DE.AE-03 4 4 14 20
DE.AE-04 0 0 0 0
DE.AE-06 2 2 8 10
DE.AE-07 0 0 0 0
DE.AE-08 0 0 0 0
SUM 8 32 40
Total
RS.MA-01 3 3 9 15
RS.MA-02 2 2 10 10
RS.MA-03 0 0 0 0
RS.MA-04 0 0 0 0
RS.MA-05 0 0 0 0
SUM 5 19 25
RS.AN-03 2 2 10 10
RS.AN-06 0 0 0 0
RS.AN-07 0 0 0 0
RS.AN-08 0 0 0 0
SUM 2 10 10
RS.CO-02 1 1 3 5
RS.CO-03 1 1 3 5
SUM 2 6 10
RS.MI-01 1 1 3 5
RS.MI-02 1 1 3 5
SUM 2 6 10
Total
RC.RP-01 0 0 0 0
RC.RP-02 0 0 0 0
RC.RP-03 0 0 0 0
RC.RP-04 0 0 0 0
RC.RP-05 0 0 0 0
RC.RP-06 0 0 0 0
SUM 0 0 0
RC.CO-03 0 0 0 0
RC.CO-04 0 0 0 0
SUM 0 0 0
Total
Maturity
Category Category Function maturity Applicable?
maturity applicability
1
1
0
0
0
0
89.29 1 SUM
0
0
0
0
0
72.22 1 SUM
0
0
3
0
52.5 1 SUM
5
13
8
0
4
0
53.75 1 SUM
0
0
0
75 1 SUM
0
0
1
2
2
91.67 1 SUM
434.43 6 72.41 Total
0
0
0
0
0
0
0
65.91 1 SUM
0
0
0
0
1
81.67 1 SUM
1
2
0
2
2
0
0
0
55 1 SUM
0
0
0
0
0
1
0
0
0
1
0
8
33.11 1 SUM
0
0
64.29 1 SUM
1
0
0
0
0
86.11 1 SUM
386.09 6 64.35 Total
4
25
17
2
4
75 1 SUM
20
1
1
3
2
3
1
7
75 1 SUM
0
25
1
0
0
76.6 1 SUM
226.6 3 75.53 Total
1
50 1 SUM
2
7
0
4
0
50 1 SUM
2
3
9
1
1
100 1 SUM
4
5
2
50 1 SUM
3
1
43.75 1 SUM
293.75 5 58.75 Total
0
0 0 SUM
0
0
0 0 SUM
0
0
0
0 0 SUM
0 0 0 Total
0
0
1
0
0
67.39 1 SUM
0
0
0
0
0
0
0
62.5 1 SUM
0
0
0
0
78.75 1 SUM
3
0
57.14 1 SUM
0
0
0
25 1 SUM
0
0
0
1
0
0
2
2
0
0
87.5 1 SUM
378.28 6 63.05 Total
1
1
4
0
0
0
0
80.88 1 SUM
20
13
8
0
3
2
0
1
0
0
35.26 1 SUM
0
2
3
0
52.86 1 SUM
169 3 56.33 Total
0
67.5 1 SUM
2
1
75 1 SUM
5
2
1
0
70.83 1 SUM
0
0
0
1
0
0
83.33 1 SUM
0
0
0
1
51.32 1 SUM
347.98 5 69.6 Total
19
1
2
3
18
73.33 1 SUM
50
17
2
4
3
0
75 1 SUM
148.33 2 74.17 Total
5
4
1
1
0
70 1 SUM
9
1
1
2
100 1 SUM
7
0
50 1 SUM
4
5
50 1 SUM
270 4 67.5 Total
0
0
2
0
0
0
0 0 SUM
0
0
0 0 SUM
0 0 0 Total
Capability
Subcategory Subcategory Subcategory Category
capability MIN capability TOTAL capability MAX capability
1 0 5
1 0 5
0 0 0
0 0 0
0 0 0
0 0 0
2 0 10 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0 0
0 0 0
0 0 0
3 15 15
0 0 0
3 15 15 100
5 0 25
13 0 65
8 0 40
0 0 0
4 0 20
0 0 0
30 0 150 0
0 0 0
0 0 0
0 0 0
0 0 0 0
0 0 0
0 0 0
1 3 5
2 7 10
2 5 10
5 15 25 50
150
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
1 3 5
1 3 5 50
1 5 5
2 10 10
0 0 0
2 8 10
2 9 10
0 0 0
0 0 0
0 0 0
7 32 35 89.29
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
1 5 5
0 0 0
0 0 0
0 0 0
1 3 5
0 0 0
8 0 40
10 8 50 0
0 0 0
0 0 0
0 0 0 0
1 5 5
0 0 0
0 0 0
0 0 0
0 0 0
1 5 5 100
239.29
4 12 20
25 114 125
17 62 85
2 8 10
4 17 20
52 213 260 77.4
20 55 100
1 5 5
1 3 5
3 12 15
2 8 10
3 13 15
1 5 5
7 0 35
38 101 190 41.45
0 0 0
25 95 125
1 3 5
0 0 0
0 0 0
26 98 130 69.23
188.08
1 4 5
1 4 5 75
2 8 10
7 24 35
0 0 0
4 13 20
0 0 0
13 45 65 61.54
2 8 10
3 12 15
9 41 45
1 3 5
1 0 5
16 64 80 75
4 9 20
5 11 25
2 0 10
11 20 55 20.45
3 8 15
1 3 5
4 11 20 43.75
275.74
0 0 0
0 0 0 0
0 0 0
0 0 0
0 0 0 0
0 0 0
0 0 0
0 0 0
0 0 0 0
0
0 0 0
0 0 0
1 5 5
0 0 0
0 0 0
1 5 5 100
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0 0
3 15 15
0 0 0
3 15 15 100
0 0 0
0 0 0
0 0 0
0 0 0 0
0 0 0
0 0 0
0 0 0
1 3 5
0 0 0
0 0 0
2 7 10
2 5 10
0 0 0
0 0 0
5 15 25 50
250
1 0 5
1 0 5
4 12 20
0 0 0
0 0 0
0 0 0
0 0 0
6 12 30 25
20 0 100
13 0 65
8 0 40
0 0 0
3 0 15
2 0 10
0 0 0
1 0 5
0 0 0
0 0 0
47 0 235 0
0 0 0
2 6 10
3 8 15
0 0 0
5 14 25 45
70
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0 0
2 8 10
1 3 5
3 11 15 66.67
5 23 25
2 10 10
1 5 5
0 0 0
8 38 40 93.75
0 0 0
0 0 0
0 0 0
1 5 5
0 0 0
0 0 0
1 5 5 100
0 0 0
0 0 0
0 0 0
1 4 5
1 4 5 75
335.42
19 50 95
1 5 5
2 8 10
3 13 15
18 81 90
44 161 220 66.48
50 209 250
17 62 85
2 8 10
4 17 20
3 10 15
0 0 0
76 306 380 75.66
142.14
5 17 25
4 16 20
1 3 5
1 4 5
0 0 0
11 40 55 65.91
9 41 45
1 3 5
1 5 5
2 8 10
13 57 65 84.62
7 24 35
0 0 0
7 24 35 60.71
4 9 20
5 11 25
9 20 45 30.56
241.8
0 0 0
0 0 0
2 9 10
0 0 0
0 0 0
0 0 0
2 9 10 87.5
0 0 0
0 0 0
0 0 0 0
87.5
Category applicability Function capability
1
4 37.5
0
1
1
4 59.82
1
1
1
3 62.69
1
5 55.15
0
0 0
1
1
3 83.33
1
1
1
3 23.33
1
4 83.86
1
1
2 71.07
1
4 60.45
0
1 87.5
Charter document completeness
11 Incomplete
12 Partially complete
13 Partially complete
14 Partially complete
15 Averagely complete
16 Averagely complete
17 Averagely complete
18 Averagely complete
19 Mostly complete
20 Mostly complete
21 Mostly complete
22 Fully complete
Architecture completeness
5 Incomplete
6 Partially complete
7 Averagely complete
8 Averagely complete
9 Mostly complete
10 Fully complete
SOC-CMM - Business Domain
B1 - Business Drivers answer
B 1.1 0
1
2
3
4
5
B 1.2 0
1
2
3
4
5
B 1.3 0
1
2
3
4
5
B 1.4 0
1
2
3
4
5
B 1.5 0
1
2
3
4
5
B2 - Customers answer
B 2.1 0
1
2
3
4
5
B 2.3 0
1
2
3
4
5
B 2.4 0
1
2
3
4
5
B 2.5 0
1
2
3
4
5
B 2.6 0
1
2
3
4
5
B 2.7 0
1
2
3
4
5
B3 - SOC Charter answer
B 3.1 0
1
2
3
4
5
B 3.3 0
1
2
3
4
5
B 3.4 0
1
2
3
4
5
B 3.5 0
1
2
3
4
5
B4 - Governance answer
B 4.1 0
1
2
3
4
5
B 4.2 0
1
2
3
4
5
B 4.4 0
1
2
3
4
5
B 4.6 0
1
2
3
4
5
B 4.7 0
1
2
3
4
5
B 4.8 0
1
2
3
4
5
B 4.9 0
1
2
3
4
5
B 4.10 0
1
2
3
4
5
B 4.11 0
1
2
3
4
5
B5 - Privacy answer
B 5.1 0
1
2
3
4
5
B 5.2 0
1
2
3
4
5
B 5.4 0
1
2
3
4
5
B 5.5 0
1
2
3
4
5
B 5.6 0
1
2
3
4
5
B 5.7 0
1
2
3
4
5
B 5.8 0
1
2
3
4
5
B 5.9 0
1
2
3
4
5
B 5.10 0
1
2
3
4
5
B 5.11 0
1
2
3
4
5
M3 - Reporting answer
M 3.1 0
1
2
3
4
5
M 3.2 0
1
2
3
4
5
M 3.3 0
1
2
3
4
5
M 3.4 0
1
2
3
4
5
M 3.5 0
1
2
3
4
5
M 3.6 0
1
2
3
4
5
M 3.7 0
1
2
3
4
5
M 3.9 0
1
2
3
4
5
M 3.11.1 0
1
2
3
4
5
M 3.11.2 0
1
2
3
4
5
M 3.11.3 0
1
2
3
4
5
M 3.12.1 0
1
2
3
4
5
M 3.12.2 0
1
2
3
4
5
M 3.13.1 0
1
2
3
4
5
M 3.13.2 0
1
2
3
4
5
M 3.13.3 0
1
2
3
4
5
M 3.13.4 0
1
2
3
4
5
S5 - Hunting answer
S 5.1 0
1
2
3
4
5
S 5.2 0
1
2
3
4
5
S 5.4 0
1
2
3
4
5
S 5.5 0
1
2
3
4
5
S 5.6 0
1
2
3
4
5
S 5.7 0
1
2
3
4
5
S 5.8 0
1
2
3
4
5
S 5.9 0
1
2
3
4
5
S 5.10 0
1
2
3
4
5
S 5.11 0
1
2
3
4
5
S 5.12 0
1
2
3
4
5
S 5.13 0
1
2
3
4
5
S 5.14 0
1
2
3
4
5
S 5.15 0
1
2
3
4
5
No documentation in place
Some ad-hoc information across documents
Basic documentation of business drivers
Single document, full description of business drivers
Document completed, approved and formally published
No documentation in place
Some ad-hoc information across documents
Basic documentation of SOC customers
Single document, full description of SOC customers
Document completed, approved and formally published
No policy is in place
Information regarding privacy is scattered across documents
A policy exists, but has not been accepted formally
A formal policy exists, its contents are known to all employees
A formal policy exists, its contents are accepted by all employees
There are way too few or way too many external employees
There are too few or too many external employees
The SOC has somewhat too many or too few external employees
The SOC mostly meets requirements for external employee FTE count
The external employee ratio meets all requirements
There are too many skills only present within the external employees
Some required skills are not present internally, and not transferred
Some required skills are not present internally, but being transferred
Most skills are covered with internal employees
All required skills are covered with internal employees as well
No hierarchy exists
A basic hierarchy exists, but is not fully operational
A basic hierarchy is in place and fully operational
A full hierarchy is in place, but not formalized
A full hierarchy is in place and formalized
No documentation in place
Some ad-hoc information across documents
Basic documentation of SOC roles
Single document, full description of SOC roles
Document completed, approved and formally published
No documentation in place
Some ad-hoc information across documents
Basic documentation of career progression for roles
Single document, full description of career progression for roles
Document completed, approved and formally published
No plan exists
A plan covering some roles is in place, but not operational
A plan covering some roles is in place and operational
A plan covering all roles is in place, but not formalized
A plan covering all roles is in place and formalized
Do you have a formal knowledge management process in place?
A knowledge management process is not in place
Knowledge management is done in an ad-hoc fashion
A basic process is in place, that covers some knowledge aspects
An informal process is in place that covers most knowledge aspects
A formal process is in place, covering all knowledge aspects
Do you have a skill matrix in place?
A skill matrix is not in place
A basic skill matrix is in place, but incomplete
A complete skill matrix is in place, not approved
A complete skill matrix is in place and approved, not regularly updated
A complete skill matrix is in place, approved and regularly updated
Is the skill matrix actively used for team and personal improvement?
Matrix not used for improvement
Matrix used for improvement in an ad-hoc fashion
Matrix used to improve some personal and team results
Matrix used to improve all personal and team results
Matrix used to improve personal and team results, improvements tracked
Do you have a knowledge matrix in place?
A knowledge matrix is not in place
A basic knowledge matrix is in place, but incomplete
A complete knowledge matrix is in place, not approved
A complete knowledge matrix is in place and approved, not regularly updated
A complete knowledge matrix is in place, approved and regularly updated
Is the knowledge matrix actively used to determine training needs?
Matrix not used for identification of training needs
Matrix used for training identification in an ad-hoc fashion
Matrix used to identify training needs, but not for all employees
Matrix used to identify all training needs, but not tracked for execution
Matrix structurally used to identify training needs, training tracked
Have you documented SOC team member abilities?
Documentation is not in place
Documentation only covers some employees' abilities
Documentation covers the most relevant abilities for the team
All employee abilities documented, but is not regularly updated
All employee abilities documented, and regularly updated
Do you regularly assess and revise the knowledge documentation?
Documentation is not reviewed
Documentation is reviewed ad-hoc, not using a structured approach
Documentation is reviewed ad-hoc, using a structured approach
Documentation is regularly and informally reviewed and updated
Documentation is regularly and formally reviewed and updated
Is there effective tooling in place to support knowledge management?
Tooling is not in place
Tooling is in place, but used in an ad-hoc fashion
Tooling is in place, and used regularly
Tooling is in place and use of the tool is embedded in processes
Tooling is in place and optimized for knowledge management purposes
No budget is allocated
Insufficient budget is allocated for the team as a whole
Sufficient budget is allocated for the team as a whole
Employees have sufficient budget, not encouraged to attend training
Employees have sufficient budget, encouraged to attend training
No time is allocated
Insufficient time is allocated for the team as a whole
Sufficient time is allocated for the team as a whole
Employees have sufficient time, but not encouraged to attend training
Employees have sufficient time, and encouraged to attend training
Workshops are not held
Workshops are held in an ad-hoc fashion
Workshops are held periodically
Workshops are held regularly, not aligned with knowledge & training
Workshops are held regularly and aligned with knowledge & training
No dedicated network
Critical SOC components placed in separate network
Most SOC equipment in separate network, basic access controls in place
All SOC equipment in separate network, full access control in place
Dedicated SOC network in place, fully protected and monitored
No DMS in place
Documentation centralized on file shares
DMS in place, documentation updates not enforced
DMS in place, documentation updates and versions enforced
DMS in place, fully supporting SOC documentation requirements
No agreements exist
Informal agreements made, not applied structurally
Informal agreements made, applied structurally
Formal agreements exist, not measured
Formal agreements exist, metrics applied to reporting
Do you provide different types of reports to your stakeholders?
Different reporting types not provided
Some reporting types provided
Most required reporting types provided
Required reporting types provided, not regularly evaluated
Required reporting types provided and regularly evaluated
Do you use different types of metrics in your reporting?
Different metrics types not used
Some metric types used
Most required metric types used
Required metric types used, not regularly evaluated
Required metric types used and regularly evaluated
Are communication skills part of SOC role descriptions?
Communication skills not identified
Communication skills identified, but not documented
Communication skills documented in role description
Communication skills documented and approved, not evaluated
Communication skills formally documented and evaluated for employees
No documentation in place
Some ad-hoc information across documents
Basic documentation of use cases
Single repository, full description of use cases
Repository completed, approved and actively maintained
No traceability exists
Traceability is possible for some use cases, but requires manual effort
Traceability is possible for all use cases, but requires manual effort
Full traceability exists in documentation, not validated by stakeholders
Full traceability exists in documentation, validated by stakeholders
No traceability exists
Traceability is possible for some use cases, but requires manual effort
Traceability is possible for all use cases, but requires manual effort
Full traceability exists in documentation, not validated by stakeholders
Full traceability exists in documentation, validated by stakeholders
Do you have a detection engineering process in place?
A detection engineering process is not in place
Detection engineering is done in an ad-hoc fashion
Basic process in place, not applied to all use cases
Informal process in place covering all use cases
Formal process in place, covering all use cases
Is the detection engineering process formally documented?
No documentation in place
Some ad-hoc information across documents
Basic documentation of detection engineering process
Single document, full description of detection engineering process
Document completed, approved and formally published
Are there specific roles and requirements for detection engineering?
No specific roles and requirements
Requirements identified, not formalised in roles
Requirements identified, role defined but not documented
Requirements identified, role defined and documented
Roles formally documented, approved and regularly revised
Is there active cooperation between the SOC analysts and the detection engineers?
No cooperation between teams
Cooperation between teams on an ad-hoc basis
SOC analysts are informed, no further cooperation
SOC analysts are informed and review outcomes
SOC analysts are actively involved in the detection engineering process
Is there active cooperation between the Threat Intelligence team and the detection engineers?
No cooperation between teams
Cooperation between teams on an ad-hoc basis
Threat analysts are informed, no further cooperation
Threat analysts are informed and review outcomes
Threat analysts are actively involved in the detection engineering process
Are there formal hand-overs to the analyst team?
Formal handover not in place
Handover performed in an ad-hoc manner
Handover performed, process not documented or formalised
Handover performed, process documentation in place
Formal handover procedure in place, documented and regularly evaluated
Is there a testing environment to test and validate new detections?
Testing environment not in place
Testing environment in place, not actively used for detection engineering
Testing environment used, testing process not documented or formalised
Testing environment used, testing process documented
Testing environment used, process documented and regularly evaluated
Is there a formal release process in place for new detections?
Release process not in place
Releases performed in an ad-hoc manner
Releases done structurally, process not documented or formalised
Releases done structurally, process documentation in place
Formal release procedure in place, documented and regularly evaluated
Do you apply a versioning system to detections?
Versioning system not in place
Versioning system in place, not actively used
Versioning system used for some detections
Versioning system used for all detections, no formal commit procedure
Versioning system used for all detections, commit procedure formalised
Do you have a roll-back procedure in place in case a release fails?
Roll-back procedure not in place
Roll-back procedure requirements understood, but not operationalized
Roll-back capability in place, but not documented
Roll-back capability in place and documented
Formal roll-back capability in place, documented and regularly tested
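The versioning and roll-back questions above both amount to keeping every released detection recoverable. As an illustration only (the model prescribes no tooling; the class and rule names below are hypothetical), a minimal in-memory sketch:

```python
class DetectionStore:
    """Toy versioned store for detection rules: commit new versions, roll back on failure."""

    def __init__(self):
        self.history = {}  # rule name -> list of rule bodies, newest last

    def commit(self, name, body):
        """Record a new version of the named detection."""
        self.history.setdefault(name, []).append(body)

    def rollback(self, name):
        """Discard the newest version and return the previous (last known good) one."""
        versions = self.history[name]
        if len(versions) < 2:
            raise ValueError(f"no earlier version of {name} to roll back to")
        versions.pop()
        return versions[-1]

store = DetectionStore()
store.commit("T1059-powershell", "rule v1")
store.commit("T1059-powershell", "rule v2 (faulty)")
print(store.rollback("T1059-powershell"))  # -> rule v1
```

In practice the same discipline is usually implemented with a version control system and a formal commit/release procedure, which is exactly what the level-5 answers describe.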
Do you perform adversary emulation?
Validation activities not performed
Validation activities performed in an ad-hoc fashion
Validation activities performed structurally, no documented process
Validation activities performed structurally following a documented process
Validation activities fully aligned with TI and continuously improved
Do you test for detection of MITRE ATT&CK® techniques?
Use case testing not in place
Use case testing performed in ad-hoc fashion, no detection targets set
Some use case testing performed, detection targets set, no formal process
All use cases tested, process formalized, detection targets set
All use cases tested, visibility and detection targets used in improvements
Do you test use cases not directly associated with MITRE ATT&CK® techniques?
Use case testing not in place
Use case testing performed in ad-hoc fashion, no detection targets set
Some use case testing performed, detection targets set, no formal process
All use cases tested, process formalized, detection targets set
All use cases tested, visibility and detection targets used in improvements
Do you test response playbooks?
Response playbooks not tested
Response playbooks tested in an ad-hoc fashion
Some response playbooks tested, no formal process
Response playbooks tested structurally following a documented process
All response playbooks formally tested, output used for improvements
Is ADT/AE fully integrated in the detection engineering process?
New releases do not trigger ADT/AE
New releases trigger ADT/AE in an ad-hoc fashion
Release process triggers ADT/AE for some use cases, not documented
Releases process triggers ADT/AE for all use cases, documented process
Full integration into release process, formalized and (partly) automated
Is the outcome from the ADT/AE tests used as input into monitoring and detection engineering?
ADT/AE outcome not used
ADT/AE outcome used in an ad-hoc fashion
ADT/AE outcome used, no documented process
ADT/AE outcome used, documented process
ADT/AE outcome used, process documented and regularly evaluated
Do you monitor the data ingestion status for log sources?
Data ingestion status not monitored
Data ingestion status monitored in an ad-hoc fashion
Data ingestion status monitored, not complete for all data source types
Data ingestion status monitored for all log sources, failures result in alerts
Data ingestion status monitored, following a defined resolution process
Do you actively monitor and improve data source coverage?
Data source coverage not measured
Data source coverage measured in an ad-hoc fashion
Data source coverage measured, not complete for all data source types
Data source coverage structurally measured and improved on
Data source coverage structurally improved on following a defined process
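A structural ingestion check of the kind these questions probe can be sketched as follows; the source names and the 24-hour silence threshold are illustrative assumptions, not part of the model:

```python
from datetime import datetime, timedelta, timezone

def stalled_sources(last_event, now, max_silence=timedelta(hours=24)):
    """Return log sources whose ingestion has been silent longer than max_silence."""
    return sorted(s for s, ts in last_event.items() if now - ts > max_silence)

# Illustrative per-source health data, as a SIEM source-status query might return it.
now = datetime(2024, 4, 19, 12, 0, tzinfo=timezone.utc)
last_event = {
    "firewall": now - timedelta(minutes=5),
    "domain-controller": now - timedelta(hours=30),
}
for source in stalled_sources(last_event, now):
    print(f"ALERT: no events from '{source}' for more than 24h")  # flags domain-controller
```

Reaching the level-4/5 answers means such alerts fire automatically and feed a defined resolution process rather than ad-hoc checks.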
No documentation in place
Some ad-hoc information across documents
Basic documentation of the SIEM system in place
Single document, full technical description of SIEM system
Document completed, approved and formally published
No documentation in place
Some ad-hoc information across documents
Basic documentation of the SIEM system in place
Single document, full functional description of SIEM system
Document completed, approved and formally published
HA not in place
HA requirements identified, not implemented
Manual actions required for achieving redundancy
Fully automated HA in place, not aligned with business continuity plans
Fully automated HA in place, aligned with business continuity plans
No documentation in place
Some ad-hoc information across documents
Basic documentation of the IDPS system in place
Single document, full technical description of IDPS system
Document completed, approved and formally published
No documentation in place
Some ad-hoc information across documents
Basic documentation of the IDPS system in place
Single document, full functional description of IDPS system
Document completed, approved and formally published
HA not in place
HA requirements identified, not implemented
Manual actions required for achieving redundancy
Fully automated HA in place, not aligned with business continuity plans
Fully automated HA in place, aligned with business continuity plans
No documentation in place
Some ad-hoc information across documents
Basic documentation of the analytics system in place
Single document, full technical description of analytics system
Document completed, approved and formally published
No documentation in place
Some ad-hoc information across documents
Basic documentation of the analytics system in place
Single document, full functional description of analytics system
Document completed, approved and formally published
HA not in place
HA requirements identified, not implemented
Manual actions required for achieving redundancy
Fully automated HA in place, not aligned with business continuity plans
Fully automated HA in place, aligned with business continuity plans
No documentation in place
Some ad-hoc information across documents
Basic documentation of the analytics system in place
Single document, full technical description of analytics system
Document completed, approved and formally published
No documentation in place
Some ad-hoc information across documents
Basic documentation of the analytics system in place
Single document, full functional description of analytics system
Document completed, approved and formally published
Is the system regularly maintained?
No personnel for security automation & orchestration support
Personnel for support available, not dedicated or sufficient
Sufficient dedicated personnel available, not documented
Sufficient dedicated personnel available & documented, not formalized
Sufficient dedicated personnel available, documented and formalized
Is remote maintenance on the system managed?
Personnel not formally trained
Product training identified, no training currently in place
Individual training, not part of the training program
Training part of training program, all key personnel trained
All personnel formally trained
Is maintenance executed through the change management process?
Personnel not formally certified
Product certification identified, no certification currently in place
Individual certification, not part of the certification program
Certification part of certification program, all key personnel certified
All personnel formally certified
Have maintenance windows been established?
Support contract not in place
Basic support contract in place, not covering SOC requirements
Support contract in place, covering basic SOC requirements
Support contract in place, covering most SOC requirements
Support contract in place, covering all SOC requirements
Is maintenance performed using authorised a
System maintenance not performed
System maintenance done in an ad-hoc fashion
System maintenance done structurally, not following procedures
System maintenance done structurally, following procedures
Maintenance executed following approved procedures, regularly reviewed
HA not in place
HA requirements identified, not implemented
Manual actions required for achieving redundancy
Fully automated HA in place, not aligned with business continuity plans
Fully automated HA in place, aligned with business continuity plans
No automation in playbooks
Enrichment playbooks only
Automation of triage activity
Automation used as decision support for remediation activities
Fully automated playbooks where possible
Not required for security operations
No documentation in place
Some ad-hoc information across documents
Basic documentation of service in place
Single document, full description of service
Document completed, approved and formally published
No personnel allocated
Personnel allocated, but not sufficient for required service delivery
Personnel allocated, not dedicated for this service
Sufficient dedicated personnel available, not fully trained and capable
Sufficient dedicated personnel available, trained and fully capable
No procedures in place
Basic procedures in place, used in an ad-hoc fashion
All procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and regularly reviewed
No procedures in place
Basic procedures in place, used in an ad-hoc fashion
Full procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and regularly reviewed
No documentation in place
Some ad-hoc information across documents
Basic documentation of service in place
Single document, full description of service
Document completed, approved and formally published
No mandate
Mandate requested in ad-hoc fashion during incident response
Mandate informally given, not supported by all stakeholders
Mandate given and supported by all stakeholders, not formalized
Full mandate, formally documented, approved and published
No procedures in place
Basic procedures in place, used in an ad-hoc fashion
Full procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and regularly reviewed
No documentation in place
Some ad-hoc information across documents
Basic documentation of service in place
Single document, full description of service
Document completed, approved and formally published
No personnel allocated
Personnel allocated, but not sufficient for required service delivery
Personnel allocated, not dedicated for this service
Sufficient dedicated personnel available, not fully trained and capable
Sufficient dedicated personnel available, trained and fully capable
No procedures in place
Basic procedures in place, used in an ad-hoc fashion
All procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and fully operationalized
No procedures in place
Basic procedures in place, used in an ad-hoc fashion
Full procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and regularly reviewed
No documentation in place
Some ad-hoc information across documents
Basic documentation of service in place
Single document, full description of service
Document completed, approved and formally published
No personnel allocated
Personnel allocated, but not sufficient for required service delivery
Personnel allocated, not dedicated for this service
Sufficient dedicated personnel available, not fully trained and capable
Sufficient dedicated personnel available, trained and fully capable
No procedures in place
Basic procedures in place, used in an ad-hoc fashion
All procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and fully operationalized
No procedures in place
Basic procedures in place, used in an ad-hoc fashion
Full procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and regularly reviewed
No documentation in place
Some ad-hoc information across documents
Basic documentation of service in place
Single document, full description of service
Document completed, approved and formally published
No personnel allocated
Personnel allocated, but not sufficient for required service delivery
Personnel allocated, not dedicated for this service
Sufficient dedicated personnel available, not fully trained and capable
Sufficient dedicated personnel available, trained and fully capable
No procedures in place
Basic procedures in place, used in an ad-hoc fashion
All procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and fully operationalized
No procedures in place
Basic procedures in place, used in an ad-hoc fashion
Full procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and regularly reviewed
No documentation in place
Some ad-hoc information across documents
Basic documentation of service in place
Single document, full description of service
Document completed, approved and formally published
No personnel allocated
Personnel allocated, but not sufficient for required service delivery
Personnel allocated, not dedicated for this service
Sufficient dedicated personnel available, not fully trained and capable
Sufficient dedicated personnel available, trained and fully capable
No procedures in place
Basic procedures in place, used in an ad-hoc fashion
All procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and fully operationalized
No procedures in place
Basic procedures in place, used in an ad-hoc fashion
Full procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and regularly reviewed
No documentation in place
Some ad-hoc information across documents
Basic documentation of service in place
Single document, full description of service
Document completed, approved and formally published
No personnel allocated
Personnel allocated, but not sufficient for required service delivery
Personnel allocated, not dedicated for this service
Sufficient dedicated personnel available, not fully trained and capable
Sufficient dedicated personnel available, trained and fully capable
No procedures in place
Basic procedures in place, used in an ad-hoc fashion
All procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and fully operationalized
No procedures in place
Basic procedures in place, used in an ad-hoc fashion
Full procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and regularly reviewed
Not in place
Partially implemented, incomplete
Averagely implemented, partially documented
Mostly implemented, documented and approved
Fully implemented, documented, approved, actively improved
Not required for SOC operations
Not in place
Log sources connected, basic monitoring
Specific use cases defined and operationalised
Use cases, playbooks and procedures defined and implemented
Fully implemented, performance measured and improved
Not required for SOC operations
Answer scales
Detailed:     1 No, 2 Partially, 3 Averagely, 4 Mostly, 5 Fully (Optional: 6 Not required)
Completeness: 1 Incomplete, 2 Partially complete, 3 Averagely complete, 4 Mostly complete, 5 Fully complete
Importance:   1 None, 2 Low, 3 Normal, 4 High, 5 Critical
Weighing:     1 x1, 2 x2, 3 x3, 4 x4, 5 x5
Occurrence:   1 Never, 2 Sometimes, 3 Averagely, 4 Mostly, 5 Always
Satisfaction: 1 No, 2 Somewhat, 3 Averagely, 4 Mostly, 5 Fully
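The Importance and Weighing scales imply an importance-weighted rollup of answers; a minimal sketch of such a computation (the official SOC-CMM workbook formula may differ):

```python
def weighted_maturity(items):
    """items: list of (answer 1-5, importance 1-5) pairs; importance n weighs x(n).
    Returns the importance-weighted average maturity, or None if nothing is answered."""
    scored = [(a, w) for a, w in items if a > 0]  # 0 = unanswered, excluded from the score
    if not scored:
        return None
    return sum(a * w for a, w in scored) / sum(w for _, w in scored)

# Example: two answered questions (importance 3 and 5) and one unanswered.
print(weighted_maturity([(4, 3), (2, 5), (0, 4)]))  # -> 2.75
```

A higher-importance question thus pulls the section score toward its answer, which is why the model asks for importance to be set per element before interpreting results.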