SOC-CMM 2.0 - Basic


Columns: in scope, type, answer, importance

SOC-CMM - Business Domain


B1 - Business Drivers
B 1.1 1 M 0 3
B 1.2 1 M 0 3
B 1.3 1 M 0 3
B 1.4 1 M 0 3
B 1.5 1 M 0 3
SUM 0 15

B2 - Customers
B 2.1 1 M 0 3
B 2.2
B 2.2.1 1
B 2.2.2 1
B 2.2.3 1
B 2.2.4 1
B 2.2.5 1
B 2.2.6 1
B 2.2.7 1
B 2.2.8
B 2.3 1 M 0 3
B 2.4 1 M 0 3
B 2.5 1 M 0 3
B 2.6 1 M 0 3
B 2.7 1 M 0 3
SUM 0 18

B3 - SOC Charter
B 3.1 1 M 0 3
B 3.2 Incomplete
B 3.2.1 1
B 3.2.2 1
B 3.2.3 1
B 3.2.4 1
B 3.2.5 1
B 3.2.6 1
B 3.2.7 1
B 3.2.8 1
B 3.2.9 1
B 3.2.10 1
B 3.2.11 1
B 3.3 1 M 0 3
B 3.4 1 M 0 3
B 3.5 1 M 0 3
SUM 0 12
B4 - Governance
B 4.1 1 M 0 3
B 4.2 1 M 0 3
B 4.3 Incomplete
B 4.3.1 1
B 4.3.2 1
B 4.3.3 1
B 4.3.4 1
B 4.3.5 1
B 4.3.6 1
B 4.3.7 1
B 4.3.8 1
B 4.3.9 1
B 4.3.10 1
B 4.3.11 1
B 4.3.12 1
B 4.3.13 1
B 4.4 1 M 0 3
B 4.5 Incomplete
B 4.5.1 1
B 4.5.2 1
B 4.5.3 1
B 4.5.4 1
B 4.5.5 1
B 4.5.6 1
B 4.5.7 1
B 4.5.8 1
B 4.6 1 M 0 3
B 4.7 1 M 0 3
B 4.8 1 M 0 3
B 4.9 1 M 0 3
Maturity SUM 0 18

B5 - Privacy
B 5.1 1 M 0 3
B 5.2 1 M 0 3
B 5.2 1 M 0 3
B 5.2 1 M 0 3
B 5.3 1 M 0 3
B 5.4 1 M 0 3
B 5.5 1 M 0 3
B 5.6 1 M 0 3
Maturity SUM 0 18

SOC-CMM - People Domain


P1 - SOC Employees
P 1.1 0
P 1.2 1
P 1.2.1 0
P 1.3 1 M 0 3
P 1.4 1 M 0 3
P 1.5 1 M 0 3
P 1.6 1 M 0 3
P 1.7 1 M 0 3
P 1.8 1 M 0 3
Maturity SUM 0 18

P2 - SOC Roles and Hierarchy


P 2.1 1 M 0 3
P 2.1 1 M 0 3
P 2.1 1 M 0 3
P 2.2
P 2.2.1 1
P 2.2.2 1
P 2.2.3 1
P 2.2.4 1
P 2.2.5 1
P 2.2.6 1
P 2.2.7 1
P 2.2.8 1
P 2.2.9 1
P 2.2.10 1
P 2.2.11 1
P 2.2.12 1
P 2.3 1 M 0 3
P 2.3 1 M 0 3
P 2.4 1 M 0 3
P 2.4.1
P 2.5 1 M 0 3
P 2.6 1 M 0 3
P 2.6 1 M 0 3
P 2.6 1 M 0 3
P 2.7
P 2.7.1 1
P 2.7.2 1
P 2.7.3 1
P 2.7.4 1
P 2.7.5 1
P 2.7.6 1
P 2.7.7 1
P 2.7.8 1
P 2.8 1 M 0 3
P 2.8 1 M 0 3
P 2.8 1 M 0 3
P 2.9 1 M 0 3
P 2.10 1 M 0 3
Maturity SUM 0 24

P3 - People Management
P 3.1 1 M 0 3
P 3.2 1 M 0 3
P 3.3 1 M 0 3
P 3.4 1 M 0 3
P 3.5 1 M 0 3
P 3.6 1 M 0 3
P 3.7 1 M 0 3
P 3.8 1 M 0 3
P 3.9 1 M 0 3
P 3.10 1 M 0 3
Maturity SUM 0 30

P4 - Knowledge Management
P 4.1 1 M 0 3
P 4.2
P 4.2.1 1 M 0 3
P 4.2.2 1 M 0 3
P 4.2.3 1 M 0 3
P 4.2.4 1 M 0 3
P 4.2.5 1 M 0 3
P 4.2.6 1 M 0 3
P 4.3
P 4.3.1 1 M 0 3
P 4.3.2 1 M 0 3
P 4.3.3 1 M 0 3
P 4.3.4 1 M 0 3
P 4.3.5 1 M 0 3
P 4.4 1 M 0 3
P 4.5 1 M 0 3
Maturity SUM 0 42

P5 - Training & Education


P 5.1 1 M 0 3
P 5.2
P 5.2.1 1
P 5.2.2 1
P 5.2.3 1
P 5.2.4 1
P 5.2.5 1
P 5.2.6 1
P 5.3 1 M 0 3
P 5.4
P 5.4.1 1
P 5.4.2 1
P 5.4.3 1
P 5.5 1 M 0 3
P 5.6 1 M 0 3
P 5.7 1 M 0 3
P 5.8 1 M 0 3
P 5.9 1 M 0 3
Maturity SUM 0 21

SOC-CMM - Process Domain


M1 - SOC Management
M 1.1 1 M 0 3
M 1.2 1 M 0 3
M 1.3
M 1.3.1 1
M 1.3.2 1
M 1.3.3 1
M 1.3.4 1
M 1.3.5 1
M 1.3.6 1
M 1.3.7 1
M 1.3.8 1
M 1.3.9 1
M 1.3.10 1
M 1.4 1 M 0 3
M 1.5 1 M 0 3
Maturity SUM 0 12

M2 - Security Operations & Facilities


M 2.1
M 2.1.1 1 M 0 3
M 2.1.2 1 M 0 3
M 2.1.3 1 M 0 3
M 2.1.4 1 M 0 3
M 2.1.5 1 M 0 3
M 2.2
M 2.2.1 1 M 0 3
M 2.2.2 1 M 0 3
M 2.2.3 1 M 0 3
M 2.2.4 1 M 0 3
M 2.2.5 1 M 0 3
M 2.3
M 2.3.1 1 M 0 3
M 2.3.2 1 M 0 3
M 2.3.3 1 M 0 3
M 2.3.4 1 M 0 3
M 2.3.5 1 M 0 3
M 2.3.6 1 M 0 3
M 2.4
M 2.4.1 1 M 0 3
M 2.4.2 1 M 0 3
M 2.4.3 1 M 0 3
M 2.4.4 1 M 0 3
M 2.4.5 1 M 0 3
M 2.5
M 2.5.1 1 M 0 3
M 2.5.2 1 M 0 3
Maturity SUM 0 69

M3 - Reporting
M 3.1 1 M 0 3
M 3.2 1 M 0 3
M 3.3 1 M 0 3
M 3.4 1 M 0 3
M 3.5 1 M 0 3
M 3.6 1 M 0 3
M 3.7
M 3.7.1 1 M 0 3
M 3.7.2 1 M 0 3
M 3.7.3 1 M 0 3
M 3.7.4 1 M 0 3
M 3.7.5 1 M 0 3
M 3.7.6 1 M 0 3
M 3.7.7 1 M 0 3
M 3.7.8 1 M 0 3
M 3.8
M 3.8.1 1 M 0 3
M 3.8.2 1 M 0 3
M 3.8.3 1 M 0 3
M 3.8.4 1 M 0 3
M 3.8.5 1 M 0 3
M 3.9
M 3.9.1 1 M 0 3
M 3.9.2 1 M 0 3
M 3.9.3 1 M 0 3
Maturity SUM 0 66

M4 - Use Case Management


M 4.1 1 M 0 3
M 4.1 1 M 0 3
M 4.1 1 M 0 3
M 4.2 1 M 0 3
M 4.2 1 M 0 3
M 4.2 1 M 0 3
M 4.3 1 M 0 3
M 4.3 1 M 0 3
M 4.3 1 M 0 3
M 4.4 1 M 0 3
M 4.4 1 M 0 3
M 4.4 1 M 0 3
M 4.4 1 M 0 3
M 4.5 1 M 0 3
M 4.5 1 M 0 3
M 4.5 1 M 0 3
M 4.6 1 M 0 3
M 4.6 1 M 0 3
M 4.6 1 M 0 3
M 4.7 1 M 0 3
M 4.7 1 M 0 3
M 4.7 1 M 0 3
M 4.8 1 M 0 3
M 4.8 1 M 0 3
M 4.8 1 M 0 3
M 4.9 1 M 0 3
M 4.9 1 M 0 3
M 4.9 1 M 0 3
M 4.10 1 M 0 3
M 4.10 1 M 0 3
M 4.10 1 M 0 3
M 4.11 1 M 0 3
M 4.11 1 M 0 3
M 4.11 1 M 0 3
M 4.12 1 M 0 3
M 4.12 1 M 0 3
M 4.12 1 M 0 3
Maturity SUM 0 36

SOC-CMM - Technology Domain


T1 - SIEM Technology
T 1 - Scope 2
T 1.1
T 1.1.1 1 M 0 3
T 1.1.2 1 M 0 3
T 1.2
T 1.2.1 1 M 0 3
T 1.2.2 1 M 0 3
T 1.3
T 1.3.1 1 M 0 3
T 1.3.2 1 M 0 3
T 1.3.3 1 M 0 3
T 1.3.4 1 M 0 3
T 1.4
T 1.4.1 1 M 0 3
T 1.4.2 1 M 0 3
T 1.4.3 1 M 0 3
T 1.4.4 1 M 0 3
T 1.4.5 1 M 0 3
T 1.4.6 1 M 0 3
T 1.5
T 1.5.1 1 M 0 3
T 1.5.1 1 M 0 3
T 1.5.2 1 M 0 3
T 1.5.2 1 M 0 3
T 1.6
T 1.6.1 1 C 0 3
T 1.6.2 1 C 0 3
T 1.6.3 1 C 0 3
T 1.6.4 1 C 0 3
T 1.6.5 1 C 0 3
T 1.6.6 1 C 0 3
T 1.6.7 1 C 0 3
T 1.6.8 1 C 0 3
T 1.6.9 1 C 0 3
T 1.6.10 1 C 0 3
T 1.6.11 1 C 0 3
T 1.6.12 1 C 0 3
T 1.6.13 1 C 0 3
T 1.6.14 1 C 0 3
T 1.6.15 1 C 0 3
T 1.6.16 1 C 0 3
T 1.6.17 1 C 0 3
T 1.6.18 1 C 0 3
T 1.6.19 1 C 0 3
T 1.6.20 1 C 0 3
T 1.6.21 1 C 0 3
T 1.6.22 1 C 0 3
T 1.6.23 1 C 0 3
T 1.6.23 1 C 0 3
T 1.6.24 1 C 0 3
T 1.6.25 1 C 0 3
T 1.6.26 1 C 0 3
Capability SUM 0 78
Maturity SUM 0 48

T2 - IDPS Tooling
T 2 - Scope 2
T 2.1
T 2.1.1 1 M 0 3
T 2.1.2 1 M 0 3
T 2.2
T 2.2.1 1 M 0 3
T 2.2.2 1 M 0 3
T 2.3
T 2.3.1 1 M 0 3
T 2.3.2 1 M 0 3
T 2.3.3 1 M 0 3
T 2.3.4 1 M 0 3
T 2.4
T 2.4.1 1 M 0 3
T 2.4.2 1 M 0 3
T 2.4.3 1 M 0 3
T 2.4.4 1 M 0 3
T 2.4.5 1 M 0 3
T 2.4.6 1 M 0 3
T 2.5
T 2.5.1 1 M 0 3
T 2.5.1 1 M 0 3
T 2.5.2 1 M 0 3
T 2.5.2 1 M 0 3
T 2.6
T 2.6.1 1 C 0 3
T 2.6.2 1 C 0 3
T 2.6.3 1 C 0 3
T 2.6.4 1 C 0 3
T 2.6.5 1 C 0 3
T 2.6.6 1 C 0 3
T 2.6.7 1 C 0 3
T 2.6.8 1 C 0 3
T 2.6.9 1 C 0 3
T 2.6.10 1 C 0 3
T 2.6.11 1 C 0 3
T 2.6.12 1 C 0 3
T 2.6.13 1 C 0 3
T 2.6.14 1 C 0 3
T 2.6.14 1 C 0 3
T 2.6.15 1 C 0 3
T 2.6.16 1 C 0 3
T 2.6.17 1 C 0 3
Capability SUM 0 51
Maturity SUM 0 48

T3 - Security Analytics
T 3 - Scope 2
T 3.1
T 3.1.1 1 M 0 3
T 3.1.2 1 M 0 3
T 3.2
T 3.2.1 1 M 0 3
T 3.2.2 1 M 0 3
T 3.3 1
T 3.3.1 1 M 0 3
T 3.3.2 1 M 0 3
T 3.3.3 1 M 0 3
T 3.3.4 1 M 0 3
T 3.4
T 3.4.1 1 M 0 3
T 3.4.2 1 M 0 3
T 3.4.3 1 M 0 3
T 3.4.4 1 M 0 3
T 3.4.5 1 M 0 3
T 3.4.6 1 M 0 3
T 3.5
T 3.5.1 1 M 0 3
T 3.5.1 1 M 0 3
T 3.5.2 1 M 0 3
T 3.5.2 1 M 0 3
T 3.6
T 3.6.1 1 C 0 3
T 3.6.2 1 C 0 3
T 3.6.3 1 C 0 3
T 3.6.4 1 C 0 3
T 3.6.5 1 C 0 3
T 3.6.6 1 C 0 3
T 3.6.7 1 C 0 3
T 3.6.8 1 C 0 3
T 3.6.9 1 C 0 3
T 3.6.10 1 C 0 3
T 3.6.11 1 C 0 3
T 3.6.12 1 C 0 3
T 3.6.13 1 C 0 3
T 3.6.14 1 C 0 3
T 3.6.15 1 C 0 3
T 3.6.16 1 C 0 3
T 3.6.17 1 C 0 3
T 3.6.18 1 C 0 3
T 3.6.19 1 C 0 3
T 3.6.20 1 C 0 3
T 3.6.21 1 C 0 3
T 3.6.22 1 C 0 3
T 3.6.23 1 C 0 3
T 3.6.23 1 C 0 3
T 3.6.24 1 C 0 3
Capability SUM 0 72
Maturity SUM 0 48

T4 - Security Automation & Orchestration


T 4 - Scope 2
T 4.1
T 4.1.1 1 M 0 3
T 4.1.2 1 M 0 3
T 4.2
T 4.2.1 1 M 0 3
T 4.2.2 1 M 0 3
T 4.3
T 4.3.1 1 M 0 3
T 4.3.2 1 M 0 3
T 4.3.3 1 M 0 3
T 4.3.4 1 M 0 3
T 4.4
T 4.4.1 1 M 0 3
T 4.4.2 1 M 0 3
T 4.4.3 1 M 0 3
T 4.4.4 1 M 0 3
T 4.4.5 1 M 0 3
T 4.4.6 1 M 0 3
T 4.5
T 4.5.1 1 M 0 3
T 4.5.1 1 M 0 3
T 4.5.2 1 M 0 3
T 4.5.2 1 M 0 3
T 4.6
T 4.6.1 1 C 0 3
T 4.6.2 1 C 0 3
T 4.6.3 1 C 0 3
T 4.6.4 1 C 0 3
T 4.6.5 1 C 0 3
T 4.6.6 1 C 0 3
T 4.6.7 1 C 0 3
T 4.6.8 1 C 0 3
T 4.6.9 1 C 0 3
T 4.6.10 1 C 0 3
T 4.6.11 1 C 0 3
T 4.6.12 1 C 0 3
T 4.6.13 1 C 0 3
T 4.6.14 1 C 0 3
T 4.6.15 1 C 0 3
T 4.6.16 1 C 0 3
T 4.6.17 1 C 0 3
T 4.6.17 1 C 0 3
T 4.6.18 1 C 0 3
T 4.6.19 1 C 0 3
Capability SUM 0 57
Maturity SUM 0 48

SOC-CMM - Services Domain


S1 - Security Monitoring
S 1 - Scope 2
S 1.1 1 M 0 3
S 1.1 1 M 0 3
S 1.1 1 M 0 3
S 1.1 1 M 0 3
S 1.1 1 M 0 3
S 1.1 1 M 0 3
S 1.1 1 M 0 3
S 1.1 1 M 0 3
S 1.1 1 M 0 3
S 1.2
S 1.2.1 1
S 1.2.2 1
S 1.2.3 1
S 1.2.4 1
S 1.2.5 1
S 1.2.6 1
S 1.2.7 1
S 1.2.8 1
S 1.2.9 1
S 1.2.10 1
S 1.2.11 1
S 1.3 1 M 0 3
S 1.4 1 M 0 3
S 1.5 1 M 0 3
S 1.6 1 M 0 3
S 1.7 1 M 0 3
S 1.8 1 M 0 3
S 1.9 1 M 0 3
S 1.9 1 M 0 3
S 1.10 1 M 0 3
S 1.11 1 M 0 3
S 1.12 1 M 0 3
S 1.12 1 M 0 3
S 1.12 1 M 0 3
S 1.12 1 M 0 3
S 1.12 1 M 0 3
S 1.12 1 M 0 3
S 1.12 1 M 0 3
S 1.12 1 M 0 3
S 1.12 1 M 0 3
S 1.13 1 M 0 3
S 1.14 1 M 0 3
S 1.15
S 1.15.1 1 C 0 3
S 1.15.2 1 C 0 3
S 1.15.3 1 C 0 3
S 1.15.4 1 C 0 3
S 1.15.5 1 C 0 3
S 1.15.5 1 C 0 3
S 1.15.6 1 C 0 3
S 1.15.7 1 C 0 3
S 1.15.8 1 C 0 3
S 1.15.9 1 C 0 3
S 1.15.10 1 C 0 3
S 1.15.11 1 C 0 3
S 1.15.12 1 C 0 3
S 1.15.13 1 C 0 3
S 1.15.14 1 C 0 3
S 1.15.15 1 C 0 3
S 1.15.16 1 C 0 3
S 1.15.17 1 C 0 3
S 1.15.18 1 C 0 3
S 1.15.19 1 C 0 3
S 1.15.20 1 C 0 3
S 1.15.21 1 C 0 3
S 1.15.22 1 C 0 3
S 1.15.23 1 C 0 3
S 1.15.24 1 C 0 3
S 1.16
Capability SUM 0 72
Maturity SUM 0 39

S2 - Security Incident Management


S 2 - Scope 2
S 2.1 1
S 2.1.1 0
S 2.1.2 0
S 2.2 1 M 0 3
S 2.3 1 M 0 3
S 2.4
S 2.4.1 1
S 2.4.2 1
S 2.4.3 1
S 2.4.4 1
S 2.4.5 1
S 2.4.6 1
S 2.4.7 1
S 2.4.8 1
S 2.4.9 1
S 2.4.10 1
S 2.4.11 1
S 2.5 1 M 0 3
S 2.6 1 M 0 3
S 2.7 1 M 0 3
S 2.7 1 M 0 3
S 2.7 1 M 0 3
S 2.7 1 M 0 3
S 2.8 1 M 0 3
S 2.9 1 M 0 3
S 2.10 1 M 0 3
S 2.11 1 M 0 3
S 2.11 1 M 0 3
S 2.11 1 M 0 3
S 2.12 1 M 0 3
S 2.13 1 M 0 3
S 2.14 1 M 0 3
S 2.15 1 M 0 3
S 2.15 1 M 0 3
S 2.16
S 2.16.1 1 C 0 3
S 2.16.2 1 C 0 3
S 2.16.3 1 C 0 3
S 2.16.4 1 C 0 3
S 2.16.5 1 C 0 3
S 2.16.6 1 C 0 3
S 2.16.7 1 C 0 3
S 2.16.8 1 C 0 3
S 2.16.9 1 C 0 3
S 2.16.10 1 C 0 3
S 2.16.11 1 C 0 3
S 2.16.12 1 C 0 3
S 2.16.13 1 C 0 3
S 2.16.14 1 C 0 3
S 2.16.15 1 C 0 3
S 2.16.15 1 C 0 3
S 2.16.16 1 C 0 3
S 2.16.16 1 C 0 3
S 2.16.17 1 C 0 3
S 2.16.18 1 C 0 3
S 2.16.19 1 C 0 3
S 2.16.20 1 C 0 3
S 2.16.21 1 C 0 3
S 2.16.22 1 C 0 3
S 2.16.23 1 C 0 3
S 2.16.24 1 C 0 3
S 2.16.25 1 C 0 3
S 2.16.26 1 C 0 3
S 2.16.26 1 C 0 3
S 2.16.27 1 C 0 3
S 2.16.27 1 C 0 3
S 2.16.28 1 C 0 3
S 2.16.28 1 C 0 3
S 2.16.29 1 C 0 3
S 2.16.30 1 C 0 3
S 2.16.31 1 C 0 3
S 2.16.32 1 C 0 3
S 2.16.32 1 C 0 3
S 2.17
Capability SUM 0 96
Maturity SUM 0 39

S3 - Security Analysis
S 3 - Scope 2
S 3.1 1 M 0 3
S 3.1 1 M 0 3
S 3.1 1 M 0 3
S 3.1 1 M 0 3
S 3.2
S 3.2.1 1
S 3.2.2 1
S 3.2.3 1
S 3.2.4 1
S 3.2.5 1
S 3.2.6 1
S 3.2.7 1
S 3.2.8 1
S 3.2.9 1
S 3.2.10 1
S 3.2.11 1
S 3.3 1 M 0 3
S 3.4 1 M 0 3
S 3.5 1 M 0 3
S 3.6 1 M 0 3
S 3.7 1 M 0 3
S 3.8 1 M 0 3
S 3.9 1 M 0 3
S 3.9 1 M 0 3
S 3.10 1 M 0 3
S 3.11 1 M 0 3
S 3.12 1 M 0 3
S 3.12 1 M 0 3
S 3.12 1 M 0 3
S 3.12 1 M 0 3
S 3.13 1 M 0 3
S 3.14 1 M 0 3
S 3.15
S 3.15.1 1 C 0 3
S 3.15.2 1 C 0 3
S 3.15.3 1 C 0 3
S 3.15.4 1 C 0 3
S 3.15.5 1 C 0 3
S 3.15.6 1 C 0 3
S 3.15.7 1 C 0 3
S 3.15.8 1 C 0 3
S 3.15.9 1 C 0 3
S 3.15.10 1 C 0 3
S 3.15.11 1 C 0 3
S 3.15.12 1 C 0 3
S 3.15.13 1 C 0 3
S 3.15.14 1 C 0 3
S 3.15.15 1 C 0 3
S 3.15.16 1 C 0 3
S 3.15.17 1 C 0 3
S 3.15.18 1 C 0 3
S 3.15.19 1 C 0 3
S 3.15.20 1 C 0 3
S 3.15.21 1 C 0 3
S 3.15.22 1 C 0 3
S 3.15.23 1 C 0 3
S 3.15.24 1 C 0 3
S 3.16
Capability SUM 0 72
Maturity SUM 0 39

S4 - Threat Intelligence
S 4 - Scope 2
S 4.1 1 M 0 3
S 4.2
S 4.2.1 1
S 4.2.2 1
S 4.2.3 1
S 4.2.4 1
S 4.2.5 1
S 4.2.6 1
S 4.2.7 1
S 4.2.8 1
S 4.2.9 1
S 4.2.10 1
S 4.2.11 1
S 4.3 1 M 0 3
S 4.4 1 M 0 3
S 4.5 1 M 0 3
S 4.6 1 M 0 3
S 4.7 1 M 0 3
S 4.8 1 M 0 3
S 4.9 1 M 0 3
S 4.9 1 M 0 3
S 4.10 1 M 0 3
S 4.11 1 M 0 3
S 4.12 1 M 0 3
S 4.13 1 M 0 3
S 4.14
S 4.14.1 1 C 0 3
S 4.14.2 1 C 0 3
S 4.14.3 1 C 0 3
S 4.14.4 1 C 0 3
S 4.14.5 1 C 0 3
S 4.14.6 1 C 0 3
S 4.14.7 1 C 0 3
S 4.14.8 1 C 0 3
S 4.14.9 1 C 0 3
S 4.14.10 1 C 0 3
S 4.14.11 1 C 0 3
S 4.14.12 1 C 0 3
S 4.14.13 1 C 0 3
S 4.14.14 1 C 0 3
S 4.14.15 1 C 0 3
S 4.14.16 1 C 0 3
S 4.14.17 1 C 0 3
S 4.14.18 1 C 0 3
S 4.14.19 1 C 0 3
S 4.14.20 1 C 0 3
S 4.14.21 1 C 0 3
S 4.14.22 1 C 0 3
S 4.14.23 1 C 0 3
S 4.14.24 1 C 0 3
S 4.14.25 1 C 0 3
S 4.14.26 1 C 0 3
S 4.14.27 1 C 0 3
S 4.14.28 1 C 0 3
S 4.14.29 1 C 0 3
S 4.15
Capability SUM 0 87
Maturity SUM 0 36

S5 - Hunting
S 5 - Scope 2
S 5.1 1 M 0 3
S 5.2 1 M 0 3
S 5.3
S 5.3.1 1
S 5.3.2 1
S 5.3.3 1
S 5.3.4 1
S 5.3.5 1
S 5.3.6 1
S 5.3.7 1
S 5.3.8 1
S 5.3.9 1
S 5.3.10 1
S 5.3.11 1
S 5.4 1 M 0 3
S 5.5 1 M 0 3
S 5.6 1 M 0 3
S 5.7 1 M 0 3
S 5.8 1 M 0 3
S 5.9 1 M 0 3
S 5.10 1 M 0 3
S 5.10 1 M 0 3
S 5.11 1 M 0 3
S 5.12 1 M 0 3
S 5.13 1 M 0 3
S 5.14 1 M 0 3
S 5.15
S 5.15.1 1 C 0 3
S 5.15.2 1 C 0 3
S 5.15.3 1 C 0 3
S 5.15.4 1 C 0 3
S 5.15.5 1 C 0 3
S 5.15.6 1 C 0 3
S 5.15.7 1 C 0 3
S 5.15.8 1 C 0 3
S 5.15.9 1 C 0 3
S 5.15.10 1 C 0 3
S 5.15.11 1 C 0 3
S 5.15.12 1 C 0 3
S 5.15.13 1 C 0 3
S 5.15.14 1 C 0 3
S 5.15.15 1 C 0 3
S 5.15.16 1 C 0 3
S 5.15.17 1 C 0 3
S 5.15.18 1 C 0 3
S 5.15.19 1 C 0 3
S 5.15.20 1 C 0 3
S 5.15.21 1 C 0 3
S 5.16
Capability SUM 0 63
Maturity SUM 0 39

S6 - Vulnerability Management
S 6 - Scope 2
S 6.1 1 M 0 3
S 6.1 1 M 0 3
S 6.2
S 6.2.1 1
S 6.2.2 1
S 6.2.3 1
S 6.2.4 1
S 6.2.5 1
S 6.2.6 1
S 6.2.7 1
S 6.2.8 1
S 6.2.9 1
S 6.2.10 1
S 6.2.11 1
S 6.3 1 M 0 3
S 6.4 1 M 0 3
S 6.5 1 M 0 3
S 6.6 1 M 0 3
S 6.7 1 M 0 3
S 6.8 1 M 0 3
S 6.9 1 M 0 3
S 6.9 1 M 0 3
S 6.10 1 M 0 3
S 6.10 1 M 0 3
S 6.11 1 M 0 3
S 6.12 1 M 0 3
S 6.13 1 M 0 3
S 6.14
S 6.14.1 1 C 0 3
S 6.14.1 1 C 0 3
S 6.14.2 1 C 0 3
S 6.14.2 1 C 0 3
S 6.14.3 1 C 0 3
S 6.14.3 1 C 0 3
S 6.14.4 1 C 0 3
S 6.14.4 1 C 0 3
S 6.14.5 1 C 0 3
S 6.14.6 1 C 0 3
S 6.14.7 1 C 0 3
S 6.14.8 1 C 0 3
S 6.14.9 1 C 0 3
S 6.14.10 1 C 0 3
S 6.14.10 1 C 0 3
S 6.14.10 1 C 0 3
S 6.14.11 1 C 0 3
S 6.14.11 1 C 0 3
S 6.14.12 1 C 0 3
S 6.14.13 1 C 0 3
S 6.14.13 1 C 0 3
S 6.14.14 1 C 0 3
S 6.14.15 1 C 0 3
S 6.14.16 1 C 0 3
S 6.14.17 1 C 0 3
S 6.14.18 1 C 0 3
S 6.15
Capability SUM 0 57
Maturity SUM 0 36

S7 - Log Management
S 7 - Scope 2
S 7.1 1 M 0 3
S 7.2
S 7.2.1 1
S 7.2.2 1
S 7.2.3 1
S 7.2.4 1
S 7.2.5 1
S 7.2.6 1
S 7.2.7 1
S 7.2.8 1
S 7.2.9 1
S 7.2.10 1
S 7.2.11 1
S 7.3 1 M 0 3
S 7.4 1 M 0 3
S 7.5 1 M 0 3
S 7.6 1 M 0 3
S 7.7 1 M 0 3
S 7.8 1 M 0 3
S 7.9 1 M 0 3
S 7.9 1 M 0 3
S 7.10 1 M 0 3
S 7.11 1 M 0 3
S 7.12 1 M 0 3
S 7.13 1 M 0 3
S 7.14
S 7.14.1 1 C 0 3
S 7.14.2 1 C 0 3
S 7.14.3 1 C 0 3
S 7.14.4 1 C 0 3
S 7.14.5 1 C 0 3
S 7.14.6 1 C 0 3
S 7.14.7 1 C 0 3
S 7.14.8 1 C 0 3
S 7.14.9 1 C 0 3
S 7.14.10 1 C 0 3
S 7.14.11 1 C 0 3
S 7.14.12 1 C 0 3
S 7.14.13 1 C 0 3
S 7.14.14 1 C 0 3
S 7.14.15 1 C 0 3
S 7.14.16 1 C 0 3
S 7.14.17 1 C 0 3
S 7.14.18 1 C 0 3
S 7.14.19 1 C 0 3
S 7.14.19 1 C 0 3
S 7.14.20 1 C 0 3
S 7.15
Capability SUM 0 60
Maturity SUM 0 36
Columns: NIST mapping (CSF 1.0), NIST in scope (CSF 1.0), NIST mapping (CSF 1.1), NIST in scope (CSF 1.1), factor (SUM = MIN score)

ID.BE-5 ID.BE-5 1
ID.BE-5 ID.BE-5 1
ID.BE-5 ID.BE-5 1
ID.BE-5 ID.BE-5 1
ID.BE-5 ID.BE-5 1
5

ID.AM-6 ID.AM-6 1

ID.AM-6 ID.AM-6 1
ID.AM-6 ID.AM-6 1
ID.AM-6 ID.AM-6 1
ID.AM-6 ID.AM-6 1
ID.AM-6 ID.AM-6 1
6

ID.BE-3 ID.BE-3 1

ID.BE-3 ID.BE-3 1
ID.BE-3 ID.BE-3 1
ID.BE-3 ID.BE-3 1
4
ID.GV-3 ID.GV-3 1
ID.GV-1 ID.GV-1 1

ID.BE-4 ID.BE-4 1

ID.GV-1 ID.GV-1 1
ID.GV-3 ID.GV-3 1
ID.GV-2 ID.GV-2 1
ID.GV-4 ID.GV-4 1
6

ID.GV-3 ID.GV-3 1
ID.GV-3 ID.GV-3 1
PR.IP-6 PR.IP-6 1
PR.DS-5 PR.DS-5 1
ID.GV-3 ID.GV-3 1
ID.GV-3 ID.GV-3 1
ID.GV-3 ID.GV-3 1
ID.GV-3 ID.GV-3 1
6
1
1
1
1
1
1
6

ID.AM-6 ID.AM-6 1
ID.GV-2 ID.GV-2 1
DE.DP-1 DE.DP-1 1

ID.AM-6 ID.AM-6 1
DE.DP-1 DE.DP-1 1
ID.AM-6 ID.AM-6 1

ID.AM-6 ID.AM-6 1
ID.AM-6 ID.AM-6 1
ID.GV-2 ID.GV-2 1
DE.DP-1 DE.DP-1 1

ID.AM-6 ID.AM-6 1
PR.AT-5 PR.AT-5 1
DE.DP-1 DE.DP-1 1
ID.AM-6 ID.AM-6 1
ID.AM-6 ID.AM-6 1
8

1
1
1
1
1
PR.AT-1 PR.AT-1 1
PR.IP-11 PR.IP-11 1
1
1
1
10

1
1
1
1
1
1

1
1
1
1
1
1
1
14

PR.AT-1 PR.AT-1 1

1
PR.AT-1 PR.AT-1 1
PR.AT-1 PR.AT-1 1
PR.AT-1 PR.AT-1 1
1
PR.AT-1 PR.AT-1 1
7

1
1

1
1
4

PR.IP-10 PR.IP-10 1
1
1
1
1

1
PR.IP-3 PR.IP-3 1
1
1
1

PR.IP-5 PR.IP-5 1
PR.AC-5 PR.AC-5 1
PR.AC-2 PR.AC-2 1
1
1
1

1
1
1
1
1

1
1
23

1
1
1
1
1
1

1
1
1
1
1
1
1
1

1
1
1
1
1

1
1
1
22

ID.RA-3 ID.RA-3 1
ID.RA-4 ID.RA-4 1
ID.RA-5 ID.RA-5 1
ID.RA-3 ID.RA-3 1
ID.RA-4 ID.RA-4 1
ID.RA-5 ID.RA-5 1
ID.RA-3 ID.RA-3 1
ID.RA-4 ID.RA-4 1
ID.RA-5 ID.RA-5 1
ID.RA-3 ID.RA-3 1
ID.RA-4 ID.RA-4 1
ID.RA-5 ID.RA-5 1
ID.RM-1 ID.RM-1 1
ID.RA-3 ID.RA-3 1
ID.RA-4 ID.RA-4 1
ID.RA-5 ID.RA-5 1
ID.RA-3 ID.RA-3 1
ID.RA-4 ID.RA-4 1
ID.RA-5 ID.RA-5 1
ID.RA-3 ID.RA-3 1
ID.RA-4 ID.RA-4 1
ID.RA-5 ID.RA-5 1
ID.RA-3 ID.RA-3 1
ID.RA-4 ID.RA-4 1
ID.RA-5 ID.RA-5 1
ID.RA-3 ID.RA-3 1
ID.RA-4 ID.RA-4 1
ID.RA-5 ID.RA-5 1
ID.RA-3 ID.RA-3 1
ID.RA-4 ID.RA-4 1
ID.RA-5 ID.RA-5 1
ID.RA-3 ID.RA-3 1
ID.RA-4 ID.RA-4 1
ID.RA-5 ID.RA-5 1
ID.RA-3 ID.RA-3 1
ID.RA-4 ID.RA-4 1
ID.RA-5 ID.RA-5 1
12

DE.DP-1 DE.DP-1 1
DE.DP-1 DE.DP-1 1

DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1

1
PR.AT-5 PR.AT-5 1
PR.AT-5 PR.AT-5 1
1

PR.IP-4 PR.IP-4 1
PR.IP-4 PR.IP-4 1
PR.IP-4 PR.IP-4 1
PR.IP-9 PR.IP-9 1
PR.IP-10 PR.IP-10 1
PR.DS-7 PR.DS-7 1

PR.PT-3 PR.PT-3 1
PR.AC-4 PR.AC-4 1
PR.PT-3 PR.PT-3 1
PR.AC-4 PR.AC-4 1

DE.AE-3 DE.AE-3 1
DE.AE-3 DE.AE-3 1
DE.AE-3 DE.AE-3 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.AE-3 DE.AE-3 1
DE.AE-3 DE.AE-3 1
DE.DP-2 DE.DP-2 1
DE.AE-3 DE.AE-3 1
DE.AE-3 DE.AE-3 1
DE.AE-3 DE.AE-3 1
DE.AE-3 DE.AE-3 1
DE.AE-3 DE.AE-3 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
PR.AC-4 PR.AC-4 1
PR.MA-1 PR.MA-1 1
PR.MA-2 PR.MA-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
26
16

DE.DP-1 DE.DP-1 1
DE.DP-1 DE.DP-1 1

DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1

1
PR.AT-5 PR.AT-5 1
PR.AT-5 PR.AT-5 1
1

PR.IP-4 PR.IP-4 1
PR.IP-4 PR.IP-4 1
PR.IP-4 PR.IP-4 1
PR.IP-9 PR.IP-9 1
PR.IP-10 PR.IP-10 1
PR.DS-7 PR.DS-7 1

PR.PT-3 PR.PT-3 1
PR.AC-4 PR.AC-4 1
PR.PT-3 PR.PT-3 1
PR.AC-4 PR.AC-4 1

DE.CM-1 DE.CM-1 1
DE.CM-1 DE.CM-1 1
PR.DS-6 PR.DS-6 1
DE.CM-7 DE.CM-7 1
DE.CM-1 DE.CM-1 1
DE.CM-1 DE.CM-1 1
DE.AE-1 DE.AE-1 1
DE.DP-2 DE.DP-2 1
DE.AE-3 DE.AE-3 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
PR.AC-4 PR.AC-4 1
PR.MA-1 PR.MA-1 1
PR.MA-2 PR.MA-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
17
16

DE.DP-1 DE.DP-1 1
DE.DP-1 DE.DP-1 1

DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1

1
PR.AT-5 PR.AT-5 1
PR.AT-5 PR.AT-5 1
1

PR.IP-4 PR.IP-4 1
PR.IP-4 PR.IP-4 1
PR.IP-4 PR.IP-4 1
PR.IP-9 PR.IP-9 1
PR.IP-10 PR.IP-10 1
PR.DS-7 PR.DS-7 1

PR.PT-3 PR.PT-3 1
PR.AC-4 PR.AC-4 1
PR.PT-3 PR.PT-3 1
PR.AC-4 PR.AC-4 1

DE.DP-2 DE.DP-2 1
DE.AE-3 DE.AE-3 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
PR.PT-1 PR.PT-1 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.AE-1 DE.AE-1 1
DE.AE-1 DE.AE-1 1
DE.AE-1 DE.AE-1 1
DE.AE-1 DE.AE-1 1
DE.AE-1 DE.AE-1 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
PR.AC-4 PR.AC-4 1
PR.MA-1 PR.MA-1 1
PR.MA-2 PR.MA-2 1
DE.DP-2 DE.DP-2 1
24
16

DE.DP-1 DE.DP-1 1
DE.DP-1 DE.DP-1 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1

1
PR.AT-5 PR.AT-5 1
PR.AT-5 PR.AT-5 1
1

PR.IP-4 PR.IP-4 1
PR.IP-4 PR.IP-4 1
PR.IP-4 PR.IP-4 1
PR.IP-9 PR.IP-9 1
PR.IP-10 PR.IP-10 1
PR.DS-7 PR.DS-7 1

PR.PT-3 PR.PT-3 1
PR.AC-4 PR.AC-4 1
PR.PT-3 PR.PT-3 1
PR.AC-4 PR.AC-4 1

1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
PR.AC-4 PR.AC-4 1
PR.MA-1 PR.MA-1 1
PR.MA-2 PR.MA-2 1
1
1
19
16

DE.DP-1 DE.DP-1 1
DE.CM-1 DE.CM-1 1
DE.CM-2 DE.CM-2 1
DE.CM-3 DE.CM-3 1
DE.CM-4 DE.CM-4 1
DE.CM-5 DE.CM-5 1
DE.CM-6 DE.CM-6 1
DE.CM-7 DE.CM-7 1
DE.AE-3 DE.AE-3 1

DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-4 DE.DP-4 1
DE.DP-1 DE.DP-1 1
DE.DP-1 DE.DP-1 1
DE.DP-2 DE.DP-2 1
PR.IP-9 PR.IP-9 1
PR.MA-1 PR.MA-1 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.CM-1 DE.CM-1 1
DE.CM-2 DE.CM-2 1
DE.CM-3 DE.CM-3 1
DE.CM-4 DE.CM-4 1
DE.CM-5 DE.CM-5 1
DE.CM-6 DE.CM-6 1
DE.CM-7 DE.CM-7 1
DE.AE-3 DE.AE-3 1
DE.DP-2 DE.DP-2 1
DE.DP-5 DE.DP-5 1

DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.CM-4 DE.CM-4 1
DE.CM-5 DE.CM-5 1
DE.DP-2 DE.DP-2 1
DE.AE-5 DE.AE-5 1
DE.AE-5 DE.AE-5 1
PR.DS-4 PR.DS-4 1
DE.CM-1 DE.CM-1 1
DE.CM-1 DE.CM-1 1
DE.CM-1 DE.CM-1 1
DE.CM-7 DE.CM-7 1
DE.CM-1 DE.CM-1 1
DE.CM-1 DE.CM-1 1
DE.CM-1 DE.CM-1 1
DE.CM-1 DE.CM-1 1
PR.DS-5 PR.DS-5 1
PR.DS-5 PR.DS-5 1
DE.CM-6 DE.CM-6 1
DE.CM-2 DE.CM-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1

24
13

1
RS.CO-1 RS.CO-1 1

RS.IM-1 RS.IM-1 1
1
RS.CO-2 RS.CO-2 1
RS.CO-3 RS.CO-3 1
RS.CO-4 RS.CO-4 1
RS.CO-5 RS.CO-5 1
1
RS.CO-1 RS.CO-1 1
1
RS.CO-1 RS.CO-1 1
RS.MI-1 RS.MI-1 1
RS.MI-2 RS.MI-2 1
1
RS.RP-1 RS.RP-1 1
RS.IM-1 RS.IM-1 1
RS.IM-1 RS.IM-1 1
RS.IM-2 RS.IM-2 1

RS.CO-2 RS.CO-2 1
RS.MI-2 RS.MI-2 1
RS.AN-1 RS.AN-1 1
RS.AN-2 RS.AN-2 1
RS.AN-3 RS.AN-3 1
1
PR.AT-5 PR.AT-5 1
RS.RP-1 RS.RP-1 1
DE.DP-3 DE.DP-3 1
RS.CO-1 RS.CO-1 1
RS.CO-1 RS.CO-1 1
RS.CO-2 RS.CO-2 1
RS.CO-2 RS.CO-2 1
RS.AN-1 RS.AN-1 1
RS.AN-2 RS.AN-2 1
DE.AE-4 DE.AE-4 1
RS.AN-2 RS.AN-2 1
DE.AE-4 DE.AE-4 1
RS.AN-4 RS.AN-4 1
RS.CO-4 RS.CO-4 1
RS.CO-4 RS.CO-4 1
RS.CO-2 RS.CO-2 1
RS.CO-4 RS.CO-4 1
RS.CO-4 RS.CO-4 1
RS.CO-2 RS.CO-2 1
1
RS.AN-3 RS.AN-3 1
RS.MI-1 RS.MI-1 1
RS.MI-2 RS.MI-2 1
RS.MI-1 RS.MI-1 1
RS.MI-2 RS.MI-2 1
RS.MI-1 RS.MI-1 1
RS.MI-2 RS.MI-2 1
RS.IM-1 RS.IM-1 1
RS.CO-2 RS.CO-2 1
RS.MI-2 RS.MI-2 1
RS.IM-1 RS.IM-1 1
RS.IM-2 RS.IM-2 1

32
13

DE.DP-1 DE.DP-1 1
RS.AN-1 RS.AN-1 1
RS.AN-3 RS.AN-3 1
DE.AE-2 DE.AE-2 1

DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-4 DE.DP-4 1
DE.DP-1 DE.DP-1 1
DE.DP-1 DE.DP-1 1
DE.DP-2 DE.DP-2 1
PR.IP-9 PR.IP-9 1
PR.MA-1 PR.MA-1 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
DE.DP-2 DE.DP-2 1
RS.AN-1 RS.AN-1 1
RS.AN-3 RS.AN-3 1
DE.AE-2 DE.AE-2 1
DE.DP-2 DE.DP-2 1
DE.DP-5 DE.DP-5 1

DE.AE-2 DE.AE-2 1
DE.AE-2 DE.AE-2 1
DE.AE-2 DE.AE-2 1
DE.AE-2 DE.AE-2 1
DE.AE-2 DE.AE-2 1
DE.AE-2 DE.AE-2 1
DE.AE-2 DE.AE-2 1
DE.AE-2 DE.AE-2 1
DE.AE-2 DE.AE-2 1
DE.AE-2 DE.AE-2 1
DE.AE-2 DE.AE-2 1
DE.AE-2 DE.AE-2 1
DE.AE-2 DE.AE-2 1
DE.AE-2 DE.AE-2 1
RS.AN-3 RS.AN-3 1
RS.AN-3 RS.AN-3 1
DE.AE-2 DE.AE-2 1
RS.AN-3 RS.AN-3 1
RS.AN-3 RS.AN-3 1
DE.AE-2 DE.AE-2 1
RS.CO-2 RS.CO-2 1
RS.AN-3 RS.AN-3 1
RS.AN-3 RS.AN-3 1
RS.AN-3 RS.AN-3 1

24
13

ID.RA-3 ID.RA-3 1

1
1
1
1
1
ID.RA-5 ID.RA-5 1
PR.IP-9 PR.IP-9 1
PR.MA-1 PR.MA-1 1
1
ID.RA-3 ID.RA-3 1
1
1
ID.RA-2 ID.RA-2 1
ID.RA-2 ID.RA-2 1
ID.RA-2 ID.RA-2 1
ID.RA-2 ID.RA-2 1
ID.RA-2 ID.RA-2 1
ID.RA-2 ID.RA-2 1
ID.RA-2 ID.RA-2 1
ID.RA-2 ID.RA-2 1
ID.RA-2 ID.RA-2 1
ID.RA-3 ID.RA-3 1
ID.RA-3 ID.RA-3 1
ID.RA-3 ID.RA-3 1
ID.RA-3 ID.RA-3 1
1
ID.RA-3 ID.RA-3 1
ID.RA-3 ID.RA-3 1
ID.RA-3 ID.RA-3 1
ID.RA-3 ID.RA-3 1
DE.AE-2 DE.AE-2 1
1
1
1
ID.RA-5 ID.RA-5 1
1
1
ID.RA-2 ID.RA-2 1
ID.RA-2 ID.RA-2 1
ID.RA-2 ID.RA-2 1
ID.RA-2 ID.RA-2 1

29
12

1
ID.RA-3 ID.RA-3 1
1
1
1
1
1
1
PR.IP-9 PR.IP-9 1
PR.MA-1 PR.MA-1 1
1
1
1
DE.DP-5 DE.DP-5 1

DE.CM-1 DE.CM-1 1
DE.CM-1 DE.CM-1 1
DE.CM-1 DE.CM-1 1
DE.CM-1 DE.CM-1 1
DE.CM-1 DE.CM-1 1
DE.CM-1 DE.CM-1 1
DE.CM-1 DE.CM-1 1
DE.CM-1 DE.CM-1 1
DE.CM-1 DE.CM-1 1
DE.CM-1 DE.CM-1 1
DE.CM-1 DE.CM-1 1
1
1
1
DE.AE-3 DE.AE-3 1
DE.AE-3 DE.AE-3 1
DE.AE-3 DE.AE-3 1
1
1
1
1

21
13

PR.IP-12 PR.IP-12 1
ID.RA-1 ID.RA-1 1
PR.IP-12 PR.IP-12 1
PR.IP-12 PR.IP-12 1
PR.IP-12 PR.IP-12 1
PR.IP-12 PR.IP-12 1
PR.IP-12 PR.IP-12 1
PR.IP-12 PR.IP-12 1
PR.IP-9 PR.IP-9 1
PR.MA-1 PR.MA-1 1
PR.IP-12 PR.IP-12 1
ID.RA-1 ID.RA-1 1
PR.IP-12 PR.IP-12 1
PR.IP-12 PR.IP-12 1
PR.IP-12 PR.IP-12 1

DE.CM-8 DE.CM-8 1
ID.AM-1 ID.AM-1 1
DE.CM-8 DE.CM-8 1
ID.RA-1 ID.RA-1 1
ID.RA-5 ID.RA-5 1
ID.RA-1 ID.RA-1 1
RS.MI-3 RS.MI-3 1
DE.CM-8 DE.CM-8 1
DE.CM-8 DE.CM-8 1
PR.IP-12 PR.IP-12 1
PR.IP-12 PR.IP-12 1
PR.IP-12 PR.IP-12 1
PR.IP-12 PR.IP-12 1
ID.RA-5 ID.RA-5 1
ID.RA-1 ID.RA-1 1
RS.MI-3 RS.MI-3 1
PR.IP-12 PR.IP-12 1
ID.RA-1 ID.RA-1 1
ID.AM-2 ID.AM-2 1
PR.IP-12 PR.IP-12 1
ID.RA-1 ID.RA-1 1
DE.CM-8 DE.CM-8 1
PR.IP-12 PR.IP-12 1
PR.IP-12 PR.IP-12 1
DE.CM-8 DE.CM-8 1
DE.CM-8 DE.CM-8 1
19
12

PR.PT-1 PR.PT-1 1

PR.PT-1 PR.PT-1 1
PR.PT-1 PR.PT-1 1
PR.PT-1 PR.PT-1 1
PR.PT-1 PR.PT-1 1
PR.PT-1 PR.PT-1 1
PR.PT-1 PR.PT-1 1
PR.IP-9 PR.IP-9 1
PR.MA-1 PR.MA-1 1
PR.PT-1 PR.PT-1 1
PR.PT-1 PR.PT-1 1
PR.PT-1 PR.PT-1 1
PR.PT-1 PR.PT-1 1

DE.AE-3 DE.AE-3 1
DE.AE-3 DE.AE-3 1
DE.AE-3 DE.AE-3 1
DE.AE-3 DE.AE-3 1
DE.AE-3 DE.AE-3 1
DE.AE-3 DE.AE-3 1
DE.AE-3 DE.AE-3 1
PR.DS-4 PR.DS-4 1
PR.DS-2 PR.DS-2 1
DE.AE-3 DE.AE-3 1
PR.DS-2 PR.DS-2 1
DE.AE-3 DE.AE-3 1
1
1
1
PR.DS-1 PR.DS-1 1
ID.GV-3 ID.GV-3 1
PR.PT-1 PR.PT-1 1
ID.GV-3 ID.GV-3 1
PR.IP-6 PR.IP-6 1
ID.GV-3 ID.GV-3 1

20
12
Columns: total score, MAX score, final score

0 5 0
0 5 0
0 5 0
0 5 0
0 5 0
0 25 0

0 5

0 5
0 5
0 5
0 5
0 5
0 30 0

0 5

0 5
0 5
0 5
0 20 0
0 5
0 5

0 5

0 5
0 5
0 5
0 5
0 30 0

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 30 0
0 5
0 5
0 5
0 5
0 5
0 5
0 30 0

0 5
0 5
0 5

0 5
0 5
0 5

0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 40 0

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 50 0

0 5

0 5
0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 70 0

0 5

0 5
0 5
0 5
0 5
0 5
0 5
0 35 0

0 5
0 5

0 5
0 5
0 20 0

0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 115 0

0 5
0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 110 0

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 60 0

0 5
0 5

0 5
0 5

0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 130 0
0 80 0

0 5
0 5

0 5
0 5

0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 85 0
0 80 0

0 5
0 5

0 5
0 5

0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 120 0
0 80 0

0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 95 0
0 80 0

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5

0 120 0
0 65 0

0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5

0 160 0
0 65 0

0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5

0 120 0
0 65 0

0 5

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5

0 145 0
0 60 0

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5

0 105 0
0 65 0

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 95 0
0 60 0

0 5

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5

0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5
0 5

0 100 0
0 60 0
remarks

not used in calculations, but to determine 2.1

not used in calculations, but to determine 3.1


not used in calculations, but to determine 4.2

Scoring connected to B 5.2


Scoring connected to B 5.2
Scoring connected to P 2.1
Scoring connected to P 2.1
Not part of scoring

Scoring connected to P 2.3

Scoring connected to P 2.6


Scoring connected to P 2.6

Scoring connected to P 2.8


Scoring connected to P 2.8
Scoring connected to M 4.1
Scoring connected to M 4.1

Scoring connected to M 4.2


Scoring connected to M 4.2

Scoring connected to M 4.3


Scoring connected to M 4.3
Scoring connected to M 4.4
Scoring connected to M 4.4
Scoring connected to M 4.4

Scoring connected to M 4.5


Scoring connected to M 4.5

Scoring connected to M 4.6


Scoring connected to M 4.6

Scoring connected to M 4.7


Scoring connected to M 4.7

Scoring connected to M 4.8


Scoring connected to M 4.8

Scoring connected to M 4.9


Scoring connected to M 4.9

Scoring connected to M 4.10


Scoring connected to M 4.10

Scoring connected to M 4.11


Scoring connected to M 4.11

Scoring connected to M 4.12


Scoring connected to M 4.12
Scoring connected to T 1.5.1

Scoring connected to T 1.5.2

Scoring connected to T 1.6.23


Scoring connected to T 2.5.1

Scoring connected to T 2.5.2

Scoring connected to T 2.6.14


Scoring connected to T 3.5.1

Scoring connected to T 3.5.2

Scoring connected to T 3.6.23


Scoring connected to T 4.5.1

Scoring connected to T 4.5.2

Scoring connected to T 4.6.17


Scoring connected to S 1.1
Scoring connected to S 1.1
Scoring connected to S 1.1
Scoring connected to S 1.1
Scoring connected to S 1.1
Scoring connected to S 1.1
Scoring connected to S 1.1
Scoring connected to S 1.1

Scoring connected to S 1.9

Scoring connected to S 1.12


Scoring connected to S 1.12
Scoring connected to S 1.12
Scoring connected to S 1.12
Scoring connected to S 1.12
Scoring connected to S 1.12
Scoring connected to S 1.12
Scoring connected to S 1.12

Scoring connected to S 1.15.5


Scoring connected to S 2.7
Scoring connected to S 2.7
Scoring connected to S 2.7
Scoring connected to S 2.11
Scoring connected to S 2.11

Scoring connected to S 2.15

Scoring connected to S 2.16.15

Scoring connected to S 2.16.16

Scoring connected to S 2.16.26

Scoring connected to S 2.16.27

Scoring connected to S 2.16.28


Scoring connected to S 2.16.32

Note, maturity score can be overruled in S 2.2.2

Scoring connected to S 3.1


Scoring connected to S 3.1
Scoring connected to S 3.1

Scoring connected to S 3.9

Scoring connected to S 3.12


Scoring connected to S 3.12
Scoring connected to S 3.12
Scoring connected to S 4.9
Scoring connected to S 5.10

Scoring connected to S 6.1


Scoring connected to S 6.9

Scoring connected to S 6.10

Scoring connected to S 6.14.1

Scoring connected to S 6.14.2

Scoring connected to S 6.14.3

Scoring connected to S 6.14.10


Scoring connected to S 6.14.10

Scoring connected to S 6.14.11

Scoring connected to S 6.14.13


Scoring connected to S 7.9
Scoring connected to S 7.14.19
Index

Click on any section name to proceed directly to that part of the assessment
Domain Section % complete
Introduction 1. Introduction N/A
2. Usage N/A

General 1. Profile N/A


2. Scope N/A

Business 1. Business drivers 0


2. Customers 0
3. Charter 0
4. Governance 0
5. Privacy 0

People 1. Employees 0
2. Roles and Hierarchy 0
3. People Management 0
4. Knowledge Management 0
5. Training and Education 0

Process 1. Management 0
2. Operations and Facilities 0
3. Reporting 0
4. Use Case Management 0

Technology 1. SIEM Tooling 0


2. IDPS Tooling 0
3. Security Analytics Tooling 0
4. Security Automation & Orchestration tooling 0

Services 1. Security Monitoring 0


2. Security Incident Management 0
3. Security Analysis and Forensics 0
4. Threat Intelligence 0
5. Threat Hunting 0
6. Vulnerability Management 0
7. Log Management 0

Results 1. Results N/A


2. NIST CSF Scoring N/A

Next steps 1. Next steps N/A


Introduction
1. Introduction
2. Usage

General information
Author Rob van Os
Site https://www.soc-cmm.com/
Contact info [at] SOC-CMM.com
Version 2.0, basic version
Date April 25th, 2018

Background

The SOC-CMM is a capability maturity model that can be used to perform a self-assessment of your Security Operations Center (SOC). The model is based on a review conducted on literature regarding SOC setup and existing SOC models as well as literature on specific elements within a SOC. The literature analysis was then validated by questioning several Security Operations Centers in different sectors and on different maturity levels to determine which elements were actually in place. The output from the survey, combined with the initial analysis, is the basis for this self-assessment.

For more information regarding the scientific background and the literature used to create the SOC-CMM self-assessment tool, please refer to the thesis document, available through: https://www.soc-cmm.com/

If you have any questions or comments regarding the contents of this document, please use the above information to contact me.

Purpose and intended audience


The purpose of the SOC-CMM is to gain insight into the strengths and weaknesses of the SOC. This enables the SOC management to make informed decisions about which elements of the SOC require additional attention and/or budget. By regularly assessing the SOC for maturity and capability, progress can be monitored.

Besides the primary purpose of performing an assessment of the SOC, the assessment can also be used for extensive discussions about the SOC and can thus provide valuable insights.

This tool is intended for use by SOC and security managers, experts within the SOC and SOC consultants.

Navigation
Navigation through this tool is done using the navigation bar at the top of each page. Each of the numbered sections can be clicked to proceed directly to that section. Furthermore, the icons can be used to navigate through sections within a domain and between domains. The icons are as follows:

- navigate to previous domain
- navigate to previous section within the domain
- navigate to index
- navigate to next section within the domain
- navigate to next domain
- navigate directly to results

Assessment Model
The assessment model consists of 5 domains and 25 aspects. All domains are evaluated for maturity (blue); only technology and services are evaluated for both maturity and capability (purple).
Maturity Levels
CMMI defines maturity as a means for an organization "to characterize its performance" for a specific entity (here: the SOC).
The SOC-CMM calculates a maturity score using 6 maturity levels:
- Level 0: non-existent
- Level 1: initial
- Level 2: managed
- Level 3: defined
- Level 4: quantitatively managed
- Level 5: optimizing

These maturity levels are measured across 5 domains: business, people, process, technology and services. The maturity levels as implemented in this tool are not staged with pre-requisites for each level. Instead, every element adds individually to the maturity score: a continuous maturity model.
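
To illustrate the continuous model described above, the short Python sketch below simply averages per-element scores into a domain maturity level. The element scores are hypothetical and the averaging step is an assumption used for illustration, not the exact formula used in the workbook.

# Illustration only (assumed averaging, not the workbook's exact calculation):
# every element contributes its own 0-5 score; no level is a prerequisite for the next.
def domain_maturity(element_scores):
    """Continuous maturity level for a domain: the mean of its 0-5 element scores."""
    return sum(element_scores) / len(element_scores)

# Hypothetical example: three elements scored 2.5, 3.75 and 5.0
print(domain_maturity([2.5, 3.75, 5.0]))  # 3.75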

Capability Levels
Capabilities are indicators of completeness. In essence, capabilities can support maturity.
The SOC-CMM calculates a capability score using 4 capability levels, similar to CMMI:
- Level 0: incomplete
- Level 1: performed
- Level 2: managed
- Level 3: defined

These capability levels have a strong technical focus and are measured across 2 domains: technology and services. Similar to maturity levels, progress to a higher capability level is continuous. There are no prerequisites for advancing to a higher level, thus the capability growth is continuous as well.

Disclaimer
The SOC-CMM is provided without warranty of any kind. The author of the document cannot assure its accuracy and is not liable for any cost as a result of decisions based on the output of this tool. The usage of this tool does not in any way entitle the user to support or consultancy. By using this tool, you agree to these conditions.

License
The SOC-CMM basic version is part of the SOC-CMM.

The SOC-CMM is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with this program. If not, see <http://www.gnu.org/licenses/>.

Introduction
1. Introduction
2. Usage

How to use the SOC-CMM


The SOC-CMM has an embedded workflow that guides the assessment. First, the profile sheet is filled in and the scope for assessment is selected. Then, the 5 domains of the SOC-CMM (i.e. Business, People, Process, Technology and Services) are each evaluated in separate sections of this tool.

The evaluation is based on questions that can be answered using a drop-down that presents a 5-point scale. This scale relates to the maturity level as explained below under 'Scoring mechanism'. This tool should be used by assessing each sheet in order. When all domains are completed, the sheet 'Results' will provide you with the total scoring and detailed scoring for each domain. A sheet 'Next steps' is also included to provide pointers for follow-up.

There is also a weighing mechanism in place. For each question, the importance of that element can be changed. The standard importance is 'Normal', which means that the score is not modified. Changing the importance to 'Low' will cause the element to have less impact on the score. Changing it to 'High' or 'Critical' will cause the element to have more impact on the score. Setting it to 'None' will ignore the element in scoring entirely, as explained under 'Weighing mechanism'. This feature should be used with care.

Some additional remarks regarding the usage of the SOC-CMM:


1. Some elements are not used directly for scoring (this is also indicated), but are a guideline for answering other questions. These elements have a lighter color. For example, question 3.1 (part of the maturity score) can be answered by using the elements in 3.2 (not part of the maturity score) as a guideline.
2. Elements with a green color are calculated fields. These will be filled in automatically by filling in those parts of the assessment.
3. The Services and Technology domains evaluate both maturity and capability. These capabilities do not have a 5-point scale and an importance, but use a 6-point scale instead. This is to reduce the amount of clicks and answers. The sixth element in the scale is 'not required'. Use this if you feel you do not need that particular capability and want to exclude it from scoring.
4. Every sheet has a part where you can fill in some comments or remarks. Discussing the questions in this self-assessment will likely uncover some improvements. This is added value for a self-assessment, so it is worthwhile to create notes.
5. The weighing mechanism allows for manipulation of the maturity and capability score. Therefore, it is important to strongly consider and possibly document why you wish to deviate from the standard importance. The goal of the SOC-CMM is to provide insight into strengths and weaknesses and to improve the SOC, not to obtain the highest score possible.
6. The NIST score is calculated automatically as explained below.
7. Performing a full SOC-CMM assessment can take a significant amount of time, depending on the level of detail you put into the assessment. Before you start, ensure that you have allocated sufficient time. A way to reduce effort is to have a single knowledgeable SOC employee perform a quick scan and subsequently focus on areas that are debatable. Also, reducing scope for an initial assessment is a way to reduce the assessment effort.

Scoring mechanism
Each question that is part of the maturity scoring can be answered by selecting one of 5 options. These options vary based on the type of question. For example, for questions regarding completeness, the following applies:
- Incomplete, score: 0
- Partially complete, score: 1.25
- Averagely complete, score: 2.5
- Mostly complete, score: 3.75
- Fully complete, score: 5
As indicated, the score can be modified by using the weighing mechanism (use with care).
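
Written out as a small lookup table (Python, illustrative only), the completeness scale above maps each drop-down answer to its numeric score:

# Sketch of the completeness scale listed above; not code taken from the workbook.
ANSWER_SCORES = {
    "Incomplete": 0.0,
    "Partially complete": 1.25,
    "Averagely complete": 2.5,
    "Mostly complete": 3.75,
    "Fully complete": 5.0,
}

def answer_score(answer):
    """Translate a completeness answer into its 0-5 maturity score."""
    return ANSWER_SCORES[answer]

print(answer_score("Mostly complete"))  # 3.75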

Guidance
For each of the maturity questions, guidance is available. When a value is selected from the dropdown box, guidance for that value is shown under the guidance column. This guidance can be used to help determine the correct level. Note that this is truly meant as guidance on interpretation and scoring, not as the single truth.

Weighing mechanism
The weighing mechanism in the tool works by applying a factor to the element score as follows:
- Importance 'None', factor = 0 (not included in scoring)
- Importance 'Low', factor = 0.5 (score divided by 2)
- Importance 'Normal', factor = 1 (score not affected)
- Importance 'High', factor = 2 (score doubled)
- Importance 'Critical', factor = 4 (score quadrupled)
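
A minimal sketch of how such a factor could be applied, assuming the factor simply multiplies the element score before it is added to the domain total; the function name and the aggregation step are illustrative, not taken from the workbook:

# Importance factors as listed above; the multiplication step is an assumption.
IMPORTANCE_FACTORS = {
    "None": 0.0,      # element excluded from scoring
    "Low": 0.5,       # score divided by 2
    "Normal": 1.0,    # score unchanged
    "High": 2.0,      # score doubled
    "Critical": 4.0,  # score quadrupled
}

def weighted_score(element_score, importance="Normal"):
    """Apply the importance factor to a 0-5 element score."""
    return element_score * IMPORTANCE_FACTORS[importance]

print(weighted_score(3.75, "High"))  # 7.5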

NIST Cyber Security Framework scoring


A detailed mapping between the SOC-CMM and the NIST CSF was created to allow for granular scoring. The exact mapping can be found on the SOC-CMM site as a separate download.
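
The published mapping is the authoritative source. As a hedged reading of the '(SUM = MIN score)' column header used in the scoring sheet, the Python sketch below scores one NIST CSF subcategory as the minimum of the SOC-CMM element scores mapped to it; this interpretation is an assumption made for illustration, not the documented formula.

def csf_subcategory_score(mapped_element_scores):
    """Assumed aggregation: a CSF subcategory is only as strong as its weakest mapped SOC-CMM element."""
    return min(mapped_element_scores)

# Hypothetical example: three SOC-CMM elements mapped to the same subcategory
print(csf_subcategory_score([2.5, 3.75, 5.0]))  # 2.5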
Profile
1. Profile
2. Scope

Please fill in the information below to create a short profile of the SOC and the assessment

Assessment Details
Date of assessment
Name(s)

Department(s)
Intended purpose of the assessment

Scope

SOC Profile
Number of years in operation
Number of FTE's
SOC model
Geographic operation

Target Maturity (optional)


Target maturity level business domain 1
Target maturity level people domain 1
Target maturity level process domain 1
Target maturity level technology domain 1
Target maturity level services domain 1
Target overall maturity level 1

Target Capability (optional)


Target capability level technology domain 1
Target capability level services domain 1
Target overall capability level 1

Notes or comments
Follow the sun, hybrid (partially outsourced), centralized, multiple individual SOCs, multi-tiered SOC model
Regional, National, Continental, Global

Indicate a score from 1 to 5. Decimals can be used


Indicate a score from 1 to 5. Decimals can be used
Indicate a score from 1 to 5. Decimals can be used
Indicate a score from 1 to 5. Decimals can be used
Indicate a score from 1 to 5. Decimals can be used
Profile
1. Profile
2. Scope

Please select the services and technologies that should be included in the assessment. Excluding a service or technology here will exclude it from scoring.

SOC Tooling (Technology domain) Remarks

SIEM Tooling: Security Information and Event Management tooling. Used to gather logging information from company assets and correlate events
IDPS Tooling: Intrusion Detection and Prevention tooling. Used to detect in-line exploits and anomalous network activity
Security Analytics Tooling: Big data security solution. Used to gather structured and unstructured security information and find anomalies using statistical and data analysis techniques
Automation & Orchestration Tooling: Used to automate workflows and SOC actions, support incident response and orchestrate between different security products

SOC Services (Services domain) Remarks

Security Monitoring: The security monitoring service aims at detecting security incidents and events
Security Incident Management: The security incident management service aims at responding to security incidents in a timely, accurate and organized fashion
Security Analysis: The security analysis service supports security monitoring and security incident management. Analysis includes event analysis and forensic analysis
Threat Intelligence: The threat intelligence service provides information about potential threats that can be used in security monitoring, security incident response, security analysis and threat hunting
Threat Hunting: The hunting service takes a proactive approach to finding threats in the infrastructure. Threat intelligence is often used to guide hunting efforts
Vulnerability Management: The vulnerability management service is used to detect vulnerabilities in assets by discovery and actively scanning assets for known vulnerabilities
Log Management: The log management service is used to collect, store and retain logging. Can be used for compliance purposes as well as investigation purposes


Business
1. Business Drivers 5. Privacy
2. Customers
3. Charter
4. Governance

1 Business Drivers
1.1 Have you identified the main business drivers?
1.2 Have you documented the main business drivers?
1.3 Do you use business drivers in the decision making process?
1.4 Do you regularly check if the current service catalogue is aligned with business drivers?
1.5 Have the business drivers been validated with business stakeholders?

Comments and/or Remarks


1.6 Specify any comments or remarks you feel are important to this part of the assessment
Answer Guidance
Remarks
Example business drivers: cyber crime prevention, risk reduction, law / regulation, audit / compliance, business continuity
Documentation of business drivers is important for demonstrable business alignment
e.g. to determine priorities or make decisions regarding the on-boarding of new services or operations
i.e. do you check for services or operations that fall outside the scope of business drivers?
Business stakeholders can be C-level management
Business
1. Business Drivers 5. Privacy
2. Customers
3. Charter
4. Governance

2 Customers
2.1 Have you identified the SOC customers?
2.2 Please specify your customers:
2.2.1 Legal
2.2.2 Audit
2.2.3 Engineering / R&D
2.2.4 IT
2.2.5 Business
2.2.6 External customers
2.2.7 (Senior) Management
2.2.8 Other customers:

2.3 Have you documented the main SOC customers?


2.4 Do you differentiate output towards these specific customers?
2.5 Do you have service level agreements with these customers?
2.6 Do you regularly send updates to your customers?
2.7 Do you actively measure and manage customer satisfaction?

Comments and/or Remarks


2.8 Specify any comments or remarks you feel are important to this part of the assessment
Answer Guidance
Remarks
Types of customers, customer requirements / expectations, etc.
Use this as a guideline for answering 2.1. This is also potentially useful for insights and comparison with previous assessments
Legal department, may be a stakeholder for privacy, or may request forensic investigation to the SOC
The audit department can be supported by logging provided by the SOC
The engineering departments deal with Intellectual Property that may require additional access monitoring
IT departments can be supported by monitoring for anomalies in their infrastructure and systems
Business should be the most important customer, as all SOC activities ultimately support business processes
External customers mostly apply to managed service providers
Senior management may be a direct SOC customer, depending on organization hierarchy
Specify any additional customers

Formal registration of customer contact details, place in the organization, geolocation, etc.
For example, are communication style and contents for Business customers different from those for IT?
Service level agreements are used to provide standardized services operating within known boundaries
For example: changes in service scope or delivery. Can also be reports, dashboards, etc.
Business
1. Business Drivers 5. Privacy
2. Customers
3. Charter
4. Governance

3 Charter
3.1 Does the SOC have a formal charter document in place?
3.2 Please specify elements of the charter document:
3.2.1 Mission
3.2.2 Vision
3.2.3 Strategy
3.2.4 Service Scope
3.2.5 Deliverables
3.2.6 Responsibilities
3.2.7 Accountability
3.2.8 Operational Hours
3.2.9 Stakeholders
3.2.10 Objectives / Goals
3.2.11 Statement of success
Completeness
3.3 Is the SOC charter document regularly updated?
3.4 Is the SOC charter document approved by the business / CISO?
3.5 Are all stakeholders familiar with the SOC charter document contents?

Comments and/or Remarks


3.6 Specify any comments or remarks you feel are important to this part of the assessment
Answer Guidance

Incomplete
Remarks
See 3.2 for charter document elements

A SOC mission should be established to provide insight into the reason for existence of the SOC
A vision should be created to determine long-term goals for the SOC
A strategy should be in place to show how to meet goals and targets set by mission and vision
Service scope is documented to provide insight into SOC service delivery
The output provided by the SOC, for example: reports, incidents, investigations, advisories, etc.
Responsibilities of the SOC
Accountability for the SOC for actions taken
Operational hours of the SOC
All relevant stakeholders for the SOC
Objectives and goals should be concrete and measurable so that they are fit for reporting purposes
A statement of success is used to determine when the SOC is successful. Should be aligned with goals and objectives
Use this outcome as a guideline to determine the score for 3.1
Regularity should be matched to your own internal policy. At least yearly is recommended
Approval from the relevant stakeholders will aid in business support for SOC operations
Making stakeholders aware of the contents can help in
Business
1. Business Drivers 5. Privacy
2. Customers
3. Charter
4. Governance

4 Governance
4.1 Does the SOC have a governance process in place?
4.2 Have all governance elements been identified?
4.3 Please specify identified governance elements
4.3.1 Business Alignment
4.3.2 Accountability
4.3.3 Sponsorship
4.3.4 Mandate
4.3.5 Relationships
4.3.6 Vendor Engagement
4.3.7 Service Commitment
4.3.8 Project / Program Management
4.3.9 Continual Improvement
4.3.10 Span of control / federation governance
4.3.11 Outsourced service management
4.3.12 SOC KPIs & Metrics
4.3.13 Customer Engagement / Satisfaction
Completeness
4.4 Is cost management in place?
4.5 Please specify cost management elements
4.5.1 People cost
4.5.2 Process cost
4.5.3 Technology cost
4.5.4 Services cost
4.5.5 Facility cost
4.5.6 Budget forecasting
4.5.7 Budget alignment
4.5.8 Return on investment
Completeness
4.6 Are all governance elements formally documented?
4.7 Is the governance process regularly reviewed?
4.8 Is the governance process aligned with all stakeholders?
4.9 Is the SOC regularly audited or subjected to external assessments?

Comments and/or Remarks


4.10 Specify any comments or remarks you feel are important to this part of the assessment
Answer Guidance

Incomplete
Incomplete
Remarks
A governance process is required to determine the way the SOC should be managed
Possible governance elements can be found in under 4.3

Aligning SOC operations to business needs


Note that this can be part of the SOC charter document. This does not automatically make it part of the governance process
Can be part of stakeholder management
Mandate for the SOC should be established so that the SOC can take action in crisis situations
Both management of internal and external relationships
For example: active involvement of vendors in the creation of a vision and strategy for the SOC
For example: service level agreements and IT controls
Project management for individual projects within the SOC / program management for larger transitions
Improvement of the SOC and of SOC management
Especially important for SOC setups where multiple SOCs exist within the same company
Especially important for hybrid SOC setups. When using outsourcing, SLAs and oversight should be in place
These are discussed in more detail in the Process section regarding reporting
Are customers an integral part of your security operations? Is their satisfaction with SOC services ever inquired about?
Use this outcome as a guideline to determine the score for 4.2
Managing costs is required to justify budget allocation for the SOC and ensure continued service delivery in the future

Costs associated with employees. Should be managed to prove FTE requirements to stakeholders
Cost associated with processes. Should be managed to ensure process elements can be delivered
Cost associated with technology. Should be managed to prove budget requirements for new technology or replacement
Cost associated with service delivery. Especially important for managed service providers to ensure a healthy business model
Cost associated with facilities used by the SOC
Forecasting of required budget over time. Should be aligned with business needs; increased spending must be justified
Alignment of budget with business requirements and drivers to ensure balanced spending on the SOC
Prove the return on investment to stakeholders to ensure continued budget allocation
Use this outcome as a guideline to determine the score for 4.4
Formal documentation should be signed off and stored in a quality management system
Regularity should be matched to your own internal policy. At least yearly is recommended
Alignment will help the SOC obtain required mandate, budget and management support
Frequency should be matched to your own internal policy. At least yearly is recommended
Business
1. Business Drivers 5. Privacy
2. Customers
3. Charter
4. Governance

5 Privacy
5.1 Is a privacy policy regarding security monitoring of employees in place?
5.2 Does the SOC operate in compliance with all applicable privacy laws and regulations?
5.3 Does the SOC cooperate with legal departments regarding privacy matters?
5.4 Are specific procedures in place for dealing with privacy related investigations?
5.5 Is the SOC aware of all information that it processes and is subject to privacy regulations?
5.6 Is a Privacy Impact Assessment (PIA) regularly conducted?

Comments and/or Remarks


5.7 Specify any comments or remarks you feel are important to this part of the assessment
Answer Guidance
Remarks
A privacy policy should state that monitoring of employees is possible within acceptable limits
Local laws and regulations as well as company policy may apply and should all be considered
Cooperation will ensure that the SOC is enabled to perform activities, rather than blocked
Privacy related issues require careful examination, especially those potentially leading to court cases
Such information includes IP addresses, customer identifiers, user names, host names (for personally owned devices), etc.
Can be used to determine the impact of monitoring on privacy, and can help uncover potential violations
People
1. Employees 5. Training and Education
2. Roles and Hierarchy
3. People Management
4. Knowledge Management

1 Employees
1.1 How many FTE’s are in your SOC?
1.2 Do you use external employees / contractors in your SOC?
1.2.1 If yes, specify the number of external FTE's
1.3 Does the current size of the SOC meet FTE requirements?
1.4 Does the SOC meet requirements for internal to external employee FTE ratio?
1.5 Does the SOC meet requirements for internal to external employee skillset?
1.6 Are all positions filled?
1.7 Do you have a recruitment process in place?
1.8 Do you have a talent acquisition process in place?

Comments and/or Remarks


1.9 Specify any comments or remarks you feel are important to this part of the assessment
Answer Guidance
0
Remarks
Include both internal and external FTEs
External employees can be hired experts to fill in vacant positions or perform project activities
Current ratio: 0%
i.e. is the SOC size sufficient to realize business goals?
Note: requirements do not need to be explicit. Set importance to 'None' if you have no external employees.
i.e. Are there any crucial skills amongst external employees? Set importance to 'None' if you have no external employees
Unfilled positions may be due to deficiencies in the recruitment process
A recruitment process is required to obtain new employees in a market where talent is scarce
Talent recruitment can be vital for SOC success, but talent retaining is equally important
People
1. Employees
2. Roles and Hierarchy
3. People Management
4. Knowledge Management
5. Training and Education

2 Roles and Hierarchy


2.1 Do you formally differentiate roles within the SOC?
2.2 Which of the following roles are present in your SOC?
2.2.1 Security Analyst
2.2.2 Security Engineer
2.2.3 Security Specialist
2.2.4 Security Architect
2.2.5 Threat Intelligence Analyst
2.2.6 Data Scientist
2.2.7 SOC Manager
2.2.8 Team Leader
2.2.9 Incident Handler
2.2.10 Incident Manager
2.2.11 Penetration Tester

2.2.12 Others, specify:

2.3 Do you differentiate tiers within these roles?


2.4 Are all roles sufficiently staffed?

2.4.1 Comments / remarks

2.5 Is there a role-based hierarchy in your SOC?


2.6 Have you formally documented all SOC roles?
2.7 Please specify elements in the role documentation:
2.7.1 Role description
2.7.2 Role tasks
2.7.3 Role responsibilities
2.7.4 Role expectations
2.7.5 Required technical skills
2.7.6 Required soft skills
2.7.7 Required educational level
2.7.8 Required or preferred certifications
Completeness
2.8 Are responsibilities for each role understood?
2.9 Have you documented career progression requirements for each of these roles?
2.10 Do you regularly revise or update the role descriptions?

Comments and/or Remarks


2.11 Specify any comments or remarks you feel are important to this part of the assessment
Answer
Incomplete
Remarks
Use the roles in 2.2 to determine if you have all roles required in the SOC

Primarily responsible for triage and analysis of security alerts


Primarily responsible for technical / functional maintenance of security systems
Primarily responsible for in-depth analysis and security projects
Primarily responsible for technical vision for security systems used within the SOC
Primarily responsible for analysis of threat intelligence
Primarily responsible for big data security analytics
Primarily responsible for managing SOC services
Primarily responsible for leading a team of others, for example analysts and engineers
Primarily responsible for executing security incident management workflows
Primarily responsible for ensuring correct and timely management and escalation of security incidents
Primarily responsible for testing applications and systems for security weaknesses

Specify any additional roles

If you have no tiers, and you feel this is not a restriction, select importance 'None'
Consider the staffing levels (desired FTE count) as well as knowledge and experience for all roles

Any comments or remarks regarding 2.3

If you have no hierarchy, and you feel this is not a restriction, select importance 'None'
Possible documentation elements can be found under 2.7

A formal description of the role


A description of tasks that are part of the role
The responsibilities of the role
This is an extension of responsibilities. Example expectation: take a pro-active leading role in case of security incidents
e.g. experience with specific technologies or products
e.g. communication skills, presentation skills
e.g. university college, university
e.g. technical security certifications or security management certifications
Use this outcome as a guideline to determine the score for 2.6
Responsibilities for each role should be clearly understood by all SOC personnel
Career progression for roles can be documented through training, certification, experience and soft skills requirements
To revise is to review and verify whether the documentation is still correct or requires an update
People
1. Employees
2. Roles and Hierarchy
3. People Management
4. Knowledge Management
5. Training and Education

3 People Management
3.1 Do you have a job rotation plan in place?
3.2 Do you have a career progression process in place?
3.3 Do you have a talent management process in place?
3.4 Do you have team diversity goals?
3.5 Do you perform a periodic evaluation of SOC employees?
3.6 Do you have a 'new hire' process in place?
3.7 Are all SOC employees subjected to screening?
3.8 Do you measure employee satisfaction for improving the SOC?
3.9 Are there regular 1-on-1 meetings between the SOC manager and the employees?
3.10 Do you perform regular teambuilding exercises?

Comments and/or Remarks


3.11 Specify any comments or remarks you feel are important to this part of the assessment
Answer
Remarks
Job rotation can be used to train employees in a variety of tasks and avoid too much routine
Career development, promotion, etc.
Talent should be adequately managed to retain such staff and fully develop their potential.
e.g. background diversity, ethnic diversity, gender diversity, etc.
Can also be included in the regular organization evaluation process
i.e. a defined process to quickly let new employees find their place and perform well in the SOC
Personnel screening is performed to avoid infiltration or misbehavior by SOC employees
Employee satisfaction should be taken seriously as lack of satisfaction may lead to key personnel leaving
Such informal 1-on-1 conversations are used to coach employees and help the SOC manager gain insight into personal challenges
Teambuilding exercises are used to promote collaboration between individuals in the team and to raise team spirit
People
1. Employees
2. Roles and Hierarchy
3. People Management
4. Knowledge Management
5. Training and Education

4 Knowledge Management
4.1 Do you have a formal knowledge management process in place?
4.2 SOC skill matrix:
4.2.1 Does the skill matrix cover hard skills?
4.2.2 Does the skill matrix cover soft skills?
4.2.3 Is the skill matrix fully covered by current SOC personnel?
4.2.4 Is a skill assessment regularly carried out?
4.2.5 Are the results from skill assessments used for team and personal improvement?
4.2.6 Is the skill assessment process regularly updated with new skills?
4.3 SOC knowledge matrix:
4.3.1 Does the knowledge matrix cover all employees?
4.3.2 Does the knowledge matrix cover all relevant knowledge areas?
4.3.3 Is the knowledge matrix fully covered by current SOC personnel?
4.3.4 Is the knowledge matrix used to determine training and education needs?
4.3.5 Is the knowledge matrix regularly updated?
4.4 Do you regularly assess and revise the knowledge management process?
4.5 Is there effective tooling in place to support knowledge documentation and distribution?

Comments and/or Remarks


4.6 Specify any comments or remarks you feel are important to this part of the assessment
Answer

0
Formal knowledge management helps to optimize knowledge creation and distribution

e.g. ability to effectively use analysis tools


e.g. communication skills
Gaps and shortages in the skill matrix may impair SOC capabilities and service delivery
The interval depends on organizational requirements, changes to the SOC and SOC maturity level
Personal improvement is straightforward; team improvement requires insight into team dynamics and knowledge distribution
New skills may become relevant when new technologies are introduced

Include both internal and external employees


The knowledge matrix should cover all relevant knowledge to carry out SOC tasks
Similar to gaps in the skill matrix, gaps and shortages in the knowledge matrix may impair SOC capabilities and service delivery
The matrix should be used as a means to identify and resolve knowledge gaps; a small gap-identification sketch follows this guidance block
The interval depends on organizational requirements, changes to the SOC and SOC maturity level
This refers to the knowledge management process as a whole
Such tooling can help avoid investigating similar issues multiple times by integrating into the security monitoring process
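As an illustration of how a knowledge matrix can be used to identify gaps, the sketch below represents the matrix as a mapping from knowledge areas to the employees who cover them; the areas and names are hypothetical and this is not part of the SOC-CMM itself.

# Minimal sketch: a knowledge matrix as a mapping from knowledge area to covering employees.
# Knowledge areas and employee names are hypothetical examples.
knowledge_matrix = {
    "Malware analysis": ["analyst_a"],
    "Network forensics": ["analyst_a", "analyst_b"],
    "SIEM content development": [],
    "Threat intelligence": ["analyst_c"],
}

# Gaps: areas not covered by any current SOC employee (candidate training or hiring needs).
gaps = [area for area, people in knowledge_matrix.items() if not people]

# Single points of failure: areas covered by only one person.
single_points = [area for area, people in knowledge_matrix.items() if len(people) == 1]

print("Gaps:", gaps)                       # ['SIEM content development']
print("Single coverage:", single_points)   # ['Malware analysis', 'Threat intelligence']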
People
1. Employees
2. Roles and Hierarchy
3. People Management
4. Knowledge Management
5. Training and Education

5 Training and Education


5.1 Do you have a training program in place?
5.2 Please specify elements of the training program:
5.2.1 Training on the Job
5.2.2 Product-specific training
5.2.3 Internal company training
5.2.4 Role-based specific training
5.2.5 Soft-skill training
5.2.6 Formal education
Completeness
5.3 Do you have a certification program in place?
5.4 Please specify elements of the certification program:
5.4.1 Internal certification track
5.4.2 External certification track
5.4.3 Re-certification track (continuous education)
Completeness
5.5 Is the training and certification program connected to evaluation and career progression?
5.6 Is there a reserved budget for education and training?
5.7 Is there a reserved amount of time for education and training?
5.8 Do you have regular workshops for knowledge development?
5.9 Do you regularly revise and update the training and certification programs?

Comments and/or Remarks


5.10 Specify any comments or remarks you feel are important to this part of the assessment
Answer

Incomplete

Incomplete
Remarks
A training program is used to ensure a minimal level of knowledge for employees

Training on the job can be done internally by senior employees or using external consultants
Product-specific training may be required for new technologies or complex solutions
e.g. training on internal policies
For example: security analysis training for the security analyst role
To complement hard skills, soft skills should be trained as well
Formal education may be university or university college degrees
Use this outcome as a guideline to determine the score for 5.1
A certification program is used to provide a demonstrable minimum level of knowledge and skills

Internal certifications may be in place to demonstrate knowledge of company processes and policies
Certification track with external certification organizations (e.g. ISACA, (ISC)2, SANS)
Permanent education (PE) may be part of the certification itself
Use this outcome as a guideline to determine the score for 5.3
e.g. certain training and certifications are required to grow from a junior level function to a more senior level function
i.e. a fixed percentage of the total SOC budget that is allocated for education and cannot be used for other purposes
This is an extension of education budget
Workshops are an informal way of distributing knowledge
Training and certification must be a relevant reflection of SOC knowledge and skill requirements
Process
1. Management
2. Operations and Facilities
3. Reporting
4. Use Case Management

1 Management
1.1 Is there a SOC management process in place?
1.2 Are SOC management elements formally identified and documented?
1.3 Please specify identified SOC management elements:
1.3.1 Internal relationship management
1.3.2 External relationship management
1.3.3 Vendor management
1.3.4 Continuous service improvement
1.3.5 Project methodology
1.3.6 Process documentation and diagrams
1.3.7 RACI matrix
1.3.8 Service Catalogue
1.3.9 Service on-boarding procedure
1.3.10 Service off-loading procedure
Completeness
1.4 Is the SOC management process regularly reviewed?
1.5 Is the SOC management process aligned with all stakeholders?

Comments and/or Remarks


1.6 Specify any comments or remarks you feel are important to this part of the assessment
Answer Guidance

Incomplete
Remarks
A SOC management process is used to manage all aspects of SOC service delivery and quality
Possible SOC management elements can be found under 1.3

Relationship management within the organization


Relationship management outside of the organization
Relationship management with relevant vendors for SOC technologies
A methodology for continuously improving on SOC service delivery and internal processes supporting service delivery
For example: LEAN or agile project approach
Any documentation on SOC processes or services. May contain diagrams explaining relationships between processes
A description of all SOC responsibilities, accountabilities and cases in which the SOC is informed or consulted
A description of all SOC services and service levels
Procedure for intake, evaluation and move-to-production for requests for new services or customers
Procedure to remove existing services and customers from service delivery
Use this outcome as a guideline to determine the score for 1.2
Regular review of the SOC management process ensures optimal performance
Alignment with stakeholders will ensure the SOC delivers services that meet customer expectations
Process
1. Management
2. Operations and Facilities
3. Reporting
4. Use Case Management

2 Operations and Facilities


2.1 Service delivery standardization
2.1.1 Do you perform security operations exercises regularly?
2.1.2 Do you have standard operating procedures?
2.1.3 Do you use checklists for recurring activities?
2.1.4 Do you use documented workflows?
2.1.5 Do you have a SOC operational handbook?
2.2 Process integration
2.2.1 How well is the configuration management process integrated in the SOC?
2.2.2 How well is the change management process integrated in the SOC?
2.2.3 How well is the problem management process integrated in the SOC?
2.2.4 How well is the incident management process integrated in the SOC?
2.2.5 How well is the asset management process integrated in the SOC?
2.3 SOC Facilities
2.3.1 Do you have a dedicated physical SOC location?
2.3.2 Do you have a dedicated network for the SOC?
2.3.3 Do you have physical access control to the SOC location?
2.3.4 Do you have a video wall for monitoring purposes?
2.3.5 Do you have a call-center capability for the SOC?
2.3.6 Do you have specialized analyst workstations?
2.4 Operational shifts
2.4.1 Do you use shift schedules?
2.4.2 Do you have a shift log?
2.4.3 Do you have a formally described shift turnover procedure?
2.4.4 Do you have a daily SOC operational stand-up?
2.4.5 Do you have stand-by arrangements with employees within the SOC?
2.5 Knowledge & document management
2.5.1 Do you have a Document Management System in place?
2.5.2 Do you have a knowledge & collaboration platform in place?

Comments and/or Remarks


2.6 Specify any comments or remarks you feel are important to this part of the assessment
Answer Guidance
Remarks

Regularity should be matched to your own internal policy


Standard operating procedures are used to provide consistent output
Checklists can be useful to avoid recurring activities from being overlooked
Workflows are used to standardize steps in, for example, security analysis
A SOC operational handbook contains an overview of SOC tasks, as well as rules of engagement and expected behavior

Are SOC services and procedures aligned and integrated with the organization's configuration management process?
Are SOC services and procedures aligned and integrated with the organization's change management process?
Are SOC services and procedures aligned and integrated with the organization's problem management process?
Are SOC services and procedures aligned and integrated with the organization's incident management process?
Are SOC services and procedures aligned and integrated with the organization's asset management process?

A dedicated physical location decreases likelihood of unauthorized access and provides confidentiality for security incident handling
Given the confidentiality of the SOC and the importance of monitoring, it is recommended to use a separate network
e.g. key cards (badges) for access with access logging
A video wall can be used to display the real-time security status and can be used for decision making as well as PR
Since communication and coordination are important features of a SOC, call-center capability may be required
e.g. multiple screen setup, virtual machines, etc.

Shift schedules are used to ensure full shift coverage


A shift log covers all exceptions found during the shift, running investigations, etc.
i.e. a procedure for handing over a shift and exchanging information regarding running tasks or issues for further follow-up
This can also be a call in case physical attendance is not possible for all attendees
i.e. is there a formal stand-by function that obligates employees to be able to be reached within a certain time?

The system should support different file types, authorizations and version management; possibly even encryption
e.g. a wiki space or SharePoint that allows collaboration and supports team efforts
Process
1. Management
2. Operations and Facilities
3. Reporting
4. Use Case Management

3 Reporting
3.1 Do you regularly provide reports?
3.2 Are these reports tailored to the recipients?
3.3 Are the report contents approved by or reviewed by the recipients?
3.4 Do you have established reporting lines within the organization?
3.5 Do you regularly revise and update the report templates?
3.6 Do you have formal agreements with the recipients regarding reports?
3.7 Report types
3.7.1 Do you provide technical security reports?
3.7.2 Do you provide executive security reports?
3.7.3 Do you provide operational reports?
3.7.4 Do you provide incident reports?
3.7.5 Do you provide a newsletter or digest?
3.7.6 Do you provide KPI reports?
3.7.7 Do you provide trend reports?
3.7.8 Do you have real-time reporting dashboards available to SOC customers?
3.8 Metric types
3.8.1 Are quantitative metrics used in reports?
3.8.2 Are qualitative metrics used in reports?
3.8.3 Are incident & case metrics used in reports?
3.8.4 Are timing metrics used in reports?
3.8.5 Are metrics regarding SLAs used in reports?
3.9 Advisories
3.9.1 Do you provide advisories to the organization regarding threats and vulnerabilities?
3.9.2 Do you perform risk / impact assessments of these advisories?
3.9.3 Do you perform follow-up of these advisories?

Comments and/or Remarks


3.10 Specify any comments or remarks you feel are important to this part of the assessment
Answer Guidance
Remarks
Regular reports help to keep customers informed of SOC activities
e.g. management reports for senior management, technical reports for the IT organization
Formal sign-off can be part of a larger service delivery sign-off
e.g. reporting lines could be: SOC management, IT management, senior management
Report templates should be regularly optimized to ensure continued relevance to the recipients
For example: timelines of delivery, report contents, etc.

i.e. reports regarding technical issues or technical solutions to security issues


i.e. reports aimed at senior executives to inform them of SOC services
i.e. reports regarding security operations in general
Ad-hoc reports created to provide insight into incidents. This can also be part of incident management
A newsletter can be an informal way to provide updates to the organization
KPI reports are used to measure service performance
Trend reports can be used to determine changes over time
Real-time reporting dashboards provide immediate insight into the current threat level

Event count, false-positive rate, number of service requests, etc.


i.e. risk level, customer satisfaction
e.g. the number of cases and incidents, number of incidents detected by SOC, average cost per incident, etc.
Time to detect, time to contain, time to eradicate; a short calculation sketch follows this guidance block
e.g. service availability, incidents handled within agreed time period, etc.

Advisories are used to inform customers of security threats and vulnerabilities


i.e. do you add organizational context to these advisories?
i.e. do you assist in coordination when required?
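For the timing metrics mentioned above (time to detect, contain, eradicate), a small worked sketch may help; the incident records and field names below are hypothetical and not prescribed by the SOC-CMM.

from datetime import datetime

# Hypothetical incident records with timestamps per phase.
incidents = [
    {"occurred": datetime(2024, 1, 1, 8, 0), "detected": datetime(2024, 1, 1, 9, 30),
     "contained": datetime(2024, 1, 1, 12, 0)},
    {"occurred": datetime(2024, 1, 5, 14, 0), "detected": datetime(2024, 1, 5, 14, 20),
     "contained": datetime(2024, 1, 5, 18, 0)},
]

def mean_hours(records, start_field, end_field):
    # Average elapsed time between two phases, expressed in hours.
    deltas = [(r[end_field] - r[start_field]).total_seconds() / 3600 for r in records]
    return sum(deltas) / len(deltas)

print(f"Mean time to detect:  {mean_hours(incidents, 'occurred', 'detected'):.1f} h")
print(f"Mean time to contain: {mean_hours(incidents, 'detected', 'contained'):.1f} h")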
Process
1. Management
2. Operations and Facilities
3. Reporting
4. Use Case Management

4 Use Case Management


4.1 Is there a use case management process or framework in place?
4.2 Are use cases formally documented?
4.3 Are use cases approved by relevant stakeholders?
4.4 Is the use case management process aligned with other important processes?
4.5 Are use cases created using a standardized process?
4.6 Are use cases created using a top-down approach?
4.7 Can use cases be traced from high-level drivers to low-level implementation?
4.8 Can use cases be traced from low-level implementation to high-level drivers?
4.9 Do you perform tests to verify correct use case operation?
4.10 Are use cases measured for implementation and effectiveness?
4.11 Are use cases scored and prioritized based on risk levels?
4.12 Are use cases regularly revised and updated?

Comments and/or Remarks


4.13 Specify any comments or remarks you feel are important to this part of the assessment

[1] The MaGMa Use Case Framework is a framework and tool for use case management created by the Dutch
financial sector and can be obtained from the following location:
https://www.betaalvereniging.nl/en/safety/magma/
Answer Guidance



Remarks
A framework, such as MaGMa UCF [1], can be used to guide the use case lifecycle and document use cases in a standardized format
Formal documentation may include use case documentation templates
e.g. business stakeholders, IT stakeholders, CISOs, audit & compliance, risk management, etc.
e.g. integration with the threat / risk management process to revise use cases when the threat landscape changes
i.e. a standardized approach to derive use cases from threats or business requirements
e.g. use cases can be derived from business requirements, risk assessments, threat management / intelligence
Top-down traceability is important to determine completeness of implementation and demonstrable risk reduction
Bottom-up traceability is important for contextualizing use case output and business alignment; a minimal traceability sketch follows this guidance block
Tests can be training exercises, table-top tests or any other form of formal testing
Metrics can be applied to use cases to determine growth and maturity by measuring effectiveness and implementation
Risks can be (cyber)threats, but also loss of license (compliance) or penalties (laws & regulations)
Use cases should be subjected to life cycle management and may require updates or may be outdated and decommissioned
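To make the top-down and bottom-up traceability discussed above more concrete, the following sketch links each use case both to the driver it serves and to the rules implementing it; the structure and field names are hypothetical and not prescribed by MaGMa or the SOC-CMM.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class UseCase:
    # Hypothetical structure: one use case links a high-level driver to low-level rules.
    name: str
    driver: str                                      # business driver or threat it addresses
    rules: List[str] = field(default_factory=list)   # SIEM/IDPS rule identifiers implementing it

use_cases = [
    UseCase("Brute-force detection", driver="Protect customer accounts",
            rules=["SIEM-R-101", "SIEM-R-102"]),
    UseCase("Data exfiltration detection", driver="Prevent data loss", rules=[]),
]

# Top-down: which use cases lack implemented rules (incomplete implementation)?
unimplemented = [uc.name for uc in use_cases if not uc.rules]

# Bottom-up: given an alerting rule, find the driver it ultimately supports.
def driver_for_rule(rule_id: str) -> Optional[str]:
    for uc in use_cases:
        if rule_id in uc.rules:
            return uc.driver
    return None

print(unimplemented)                   # ['Data exfiltration detection']
print(driver_for_rule("SIEM-R-101"))   # 'Protect customer accounts'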
Technology
1. SIEM tooling
2. IDPS tooling
3. Security Analytics tooling
4. Automation & Orchestration tooling

1 SIEM tooling

1.1 Accountability
1.1.1 Has functional ownership of the solution been formally assigned?
1.1.2 Has technical ownership of the solution been formally assigned?
1.2 Documentation
1.2.1 Has the solution been technically described?
1.2.2 Has the solution been functionally described?
1.3 Personnel & support
1.3.1 Is there dedicated personnel for support?
1.3.2 Is the personnel for support formally trained?
1.3.3 Is the personnel for support certified?
1.3.4 Is there a support contract for the solution?
1.4 Availability & Integrity
1.4.1 Is there high availability (HA) in place for the solution?
1.4.2 Is there data backup / replication in place for the solution?
1.4.3 Is there configuration backup / replication in place for the solution?
1.4.4 Is there a Disaster Recovery plan in place for this solution?
1.4.5 Is the Disaster Recovery plan regularly tested?
1.4.6 Is there a separate development / test environment for this solution?
1.5 Confidentiality
1.5.1 Is access to the solution limited to authorized personnel?
1.5.2 Are access rights regularly reviewed and revoked if required?
1.6 Specify which technological capabilities and artefacts are present:
1.6.1 Aggregation
1.6.2 Correlation
1.6.3 Custom parsing
1.6.4 Threat Intelligence integration
1.6.5 Subtle event detection
1.6.6 Automated alerting
1.6.7 Alert acknowledgement
1.6.8 Automated threat response
1.6.9 Multi-stage correlation
1.6.10 Pattern detection
1.6.11 Case management system
1.6.12 Asset management integration
1.6.13 Business context integration
1.6.14 Identity context integration
1.6.15 Asset context integration
1.6.16 Vulnerability context integration
1.6.17 Standard rules
1.6.18 Custom rules
1.6.19 Network model
1.6.20 Customized SIEM reports
1.6.21 Customized SIEM dashboards
1.6.22 Granular access control
1.6.23 Controlled and monitored maintenance / support
1.6.24 API Integration
1.6.25 Secure Event Transfer
1.6.26 Support for multiple event transfer technologies
Completeness (%)
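The completeness percentage is a derived figure. As a rough illustration of how such a figure can be computed (an assumption for clarity, not the workbook's own formula, which may weight capabilities differently), the share of capabilities marked as present could be calculated as follows; the capability names and answers are hypothetical.

# Hypothetical sketch: derive a completeness percentage from capability answers.
capabilities = {
    "Aggregation": True,
    "Correlation": True,
    "Custom parsing": False,
    "Threat Intelligence integration": True,
}

def completeness(answers: dict) -> float:
    # Completeness = share of capabilities marked as present, expressed as a percentage.
    return 100.0 * sum(answers.values()) / len(answers)

print(f"Completeness: {completeness(capabilities):.0f}%")   # -> Completeness: 75%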

Comments and/or Remarks


1.7 Specify any comments or remarks you feel are important to this part of the assessment
Answer Guidance
0
Remarks

Functional ownership includes functional accountability


Technical ownership includes technical accountability

A technical description of the SIEM system components and configuration


A description of the SIEM functional configuration (rules, filters, lists, etc.)

Dedicated personnel should be in place to ensure that support is always available. This can also be staff from an outsourced provider
Training helps to jump start new hires, and to learn a proper way of working with the tool
Certification demonstrates ability to handle the tooling properly
A support contract may cover on-site support, support availability, response times, escalation and full access to resources

Can be fully implemented HA, partially implemented, hot spare, etc.


May not be feasible for all SIEM solutions
Configuration synchronization could be part of a HA setup
A DR plan is required to restore service in case of catastrophic events
DR plans should be tested to ensure restoring services and resuming normal operations is possible
A separate test environment allows for testing of new configurations before deployment in production

The SIEM system will contain confidential information and information that possibly impacts employee privacy
Revocation is part of normal employee termination. Special emergency revocation should be in place for suspected misuse

Capability to aggregate the raw event flow


Capability to correlate multiple events
Capability to create and maintain custom parsers for parsing and normalization needs
Integration of threat intelligence information (observables / IoCs) into the security monitoring tooling
Capability to detect slight changes in systems, applications or network that may indicate malicious behavior
Alerting based on different alerting mechanisms (SMS, mail, etc.)
Capability to acknowledge alerts so other analysts know the alert is being investigated
For example: roll-out of intrusion prevention rules, closing firewall ports, etc.
Capability to feed correlated events back into the engine for further processing
Detection of anomaly patterns in SIEM data
A case management system that supports SOC analyst workflows
Integration into the asset management process for automated adding of assets to the SIEM for monitoring
Integration of business context (business function, asset classification, etc.)
Integration of identity information into the SIEM for enhanced monitoring of users and groups
Integration of asset management information into the SIEM (asset owner, asset location, etc.)
Integration of vulnerability management information into SIEM assets to determine risk levels for assets
Use of standard content packs in the SIEM
Use of custom content (correlation rules, etc.) in the SIEM
A full network model in which zones and segments are defined
Automated SIEM reports for SOC customers and SOC analysts
Custom SIEM dashboards used by analysts and managers
Allows the principle of least privilege to be applied to the configuration of user accounts
Only trusted tools used for maintenance, remote maintenance / support monitored and controlled
Both export of information / commands and import of information
Support for secure event transfer and the actual implementation of secure transfer (e.g. regular syslog is not secure); a short transfer sketch follows this guidance block
The SIEM should support event transfer technologies for all possible data sources
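As an illustration of what secure event transfer can look like in practice, the sketch below replaces plain UDP syslog with a TLS-wrapped TCP connection. It assumes a TLS-capable syslog collector listening on TCP port 6514; the host name and CA file path are placeholders, not part of the SOC-CMM or any specific SIEM product.

import socket
import ssl
from datetime import datetime, timezone

def send_event_tls(message: str, host: str = "siem.example.org", port: int = 6514) -> None:
    # Placeholder CA bundle used to verify the collector's certificate.
    context = ssl.create_default_context(cafile="/etc/ssl/certs/siem-ca.pem")
    with socket.create_connection((host, port)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
            timestamp = datetime.now(timezone.utc).isoformat()
            # RFC 5424-style syslog line sent over the encrypted channel.
            tls_sock.sendall(f"<134>1 {timestamp} soc-host app - - - {message}\n".encode("utf-8"))

send_event_tls("Test event: secure transfer check")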
Technology
1. SIEM tooling
2. IDPS tooling
3. Security Analytics tooling
4. Automation & Orchestration tooling

2 IDPS Tooling

2.1 Accountability
2.1.1 Has functional ownership of the solution been formally assigned?
2.1.2 Has technical ownership of the solution been formally assigned?
2.2 Documentation
2.2.1 Has the solution been technically described?
2.2.2 Has the solution been functionally described?
2.3 Personnel & support
2.3.1 Is there dedicated personnel for support?
2.3.2 Is the personnel for support formally trained?
2.3.3 Is the personnel for support certified?
2.3.4 Is there a support contract for the solution?
2.4 Availability & Integrity
2.4.1 Is there high availability (HA) in place for the solution?
2.4.2 Is there data backup / replication in place for the solution?
2.4.3 Is there configuration backup / replication in place for the solution?
2.4.4 Is there a Disaster Recovery plan in place for this solution?
2.4.5 Is the Disaster Recovery plan regularly tested?
2.4.6 Is there a separate development / test environment for this solution?
2.5 Confidentiality
2.5.1 Is access to the solution limited to authorized personnel?
2.5.2 Are access rights regularly reviewed and revoked if required?
2.6 Specify which technological capabilities and artefacts are present:
2.6.1 Network-based intrusion detection
2.6.2 Host-based intrusion detection
2.6.3 File integrity checking
2.6.4 Application whitelisting
2.6.5 Honeypots
2.6.6 Custom signatures
2.6.7 Anomaly detection
2.6.8 Automated alerting
2.6.9 Central Management Console
2.6.10 Full Packet Capture for inbound / outbound internet traffic
2.6.11 Full Packet Capture for high-value internal network segments
2.6.12 Full Packet Capture for other internal networks
2.6.13 Granular access control
2.6.14 Controlled and monitored maintenance / support
2.6.15 SIEM integration
2.6.16 API integration
2.6.17 Threat Intelligence integration
Completeness (%)

Comments and/or Remarks


2.7 Specify any comments or remarks you feel are important to this part of the assessment
Answer Guidance
0
Remarks

Functional ownership includes functional accountability


Technical ownership includes technical accountability

A technical description of the IDPS system components and configuration


A description of the IDPS functional configuration (rules, alerts, etc.)

Dedicated personnel should be in place to ensure that support is always available. This can also be staff from an outsourced provider
Training helps to jump start new hires, and to learn a proper way of working with the tool
Certification demonstrates ability to handle the tooling properly
A support contract may cover on-site support, support availability, response times, escalation and full access to resources

Can be fully implemented HA, partially implemented, hot spare, etc.


Data may include logs and PCAP files
Configuration synchronization could be part of a HA setup
A DR plan is required to restore service in case of catastrophic events

A separate test environment allows for testing of new configurations before deployment in production

The IDPS system will contain confidential information and possibly information that impacts employee privacy
Revocation is part of normal employee termination. Special emergency revocation should be in place for suspected misuse

i.e. an intrusion detection / prevention capability in the network


i.e. an intrusion detection / prevention capability on the end-point
i.e. a host-based intrusion detection system, specific for monitoring alteration of files
i.e. a host-based intrusion prevention system aimed to prevent unauthorized files from execution
Honeypot systems to attract potential hackers. Coverage is an indicator of how well the feature is implemented
The ability to implement custom detection rules
Capability to detect network anomalies based on statistical deviations instead of pre-defined rules; a minimal statistical sketch follows this guidance block
Alerting based on different alerting mechanisms (SMS, mail, etc.)
A central management console for administration of decentralized IDPS equipment
Full packet capture of any anomalies uncovered

Allows the principle of least privilege to be applied to the configuration of user accounts


Only trusted tools used for maintenance, remote maintenance / support monitored and controlled
Send alert logs to SIEM for security monitoring integration
e.g. used for automated deployment of custom signatures
Integration of the tool with a threat intelligence process or platform (e.g. to deploy shared YARA rules)
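To make the statistical anomaly detection capability above more concrete, the sketch below flags values that deviate strongly from a learned baseline; this is a minimal illustration under assumed data, not how any particular IDPS implements the feature.

from statistics import mean, stdev

# Hypothetical baseline: bytes transferred per hour (in MB) for one host over recent history.
baseline = [120, 135, 110, 140, 125, 130, 118, 122, 133, 127]

def is_anomalous(observation: float, history: list, threshold: float = 3.0) -> bool:
    # Flag observations more than `threshold` standard deviations from the historical mean.
    mu, sigma = mean(history), stdev(history)
    return abs(observation - mu) > threshold * sigma

print(is_anomalous(128, baseline))   # False: within normal variation
print(is_anomalous(900, baseline))   # True: statistical deviation, no pre-defined rule needed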
Technology
1. SIEM tooling
2. IDPS tooling
3. Security Analytics tooling
4. Automation & Orchestration tooling

3 Security Analytics Tooling

3.1 Accountability
3.1.1 Has functional ownership of the solution been formally assigned?
3.1.2 Has technical ownership of the solution been formally assigned?
3.2 Documentation
3.2.1 Has the solution been technically described?
3.2.2 Has the solution been functionally described?
3.3 Personnel & support
3.3.1 Is there dedicated personnel for support?
3.3.2 Is the personnel for support formally trained?
3.3.3 Is the personnel for support certified?
3.3.4 Is there a support contract for the solution?
3.4 Availability & Integrity
3.4.1 Is there high availability (HA) in place for the solution?
3.4.2 Is there data backup / replication in place for the solution?
3.4.3 Is there configuration backup / replication in place for the solution?
3.4.4 Is there a Disaster Recovery plan in place for this solution?
3.4.5 Is the Disaster Recovery plan regularly tested?
3.4.6 Is there a separate development / test environment for this solution?
3.5 Confidentiality
3.5.1 Is access to the solution limited to authorized personnel?
3.5.2 Are access rights regularly reviewed and revoked if required?
3.6 Specify which technological capabilities and artefacts are present:
3.6.1 Scalable analytics engine
3.6.2 Automated data normalization
3.6.3 Pattern-based analysis
3.6.4 Integration of security incident management
3.6.5 Integration of security monitoring
3.6.6 External threat intelligence integration
3.6.7 Advanced searching and querying
3.6.8 Data visualization techniques
3.6.9 Data drilldowns
3.6.10 Detailed audit trail of analyst activities
3.6.11 Historical activity detection
3.6.12 Structured data collection
3.6.13 Unstructured data collection
3.6.14 User baselines
3.6.15 Application baselines
3.6.16 Infrastructure baselines
3.6.17 Network baselines
3.6.18 System baselines
3.6.19 Central analysis console
3.6.20 Security data warehouse
3.6.21 Flexible data architecture
3.6.22 Granular access control
3.6.23 Controlled and monitored maintenance / support
3.6.24 API Integration
Completeness (%)

Comments and/or Remarks

3.7 Specify any comments or remarks you feel are important to this part of the assessment
Answer Guidance
0
Remarks

Functional ownership includes functional accountability


Technical ownership includes technical accountability

A technical description of the security analytics system components and configuration


A description of the security analytics functional configuration (rules, filters, lists, etc.)

Dedicated personnel should be in place to ensure that support is always available. This can also be staff from an outsourced provider
Training helps to jump start new hires, and to learn a proper way of working with the tool
Certification demonstrates ability to handle the tooling properly
A support contract may cover on-site support, support availability, response times, escalation and full access to resources

Can be fully implemented HA, partially implemented, hot spare, etc.


May not be feasible for all analytics solutions
Configuration synchronization could be part of a HA setup
A DR plan is required to restore service in case of catastrophic events

A separate test environment allows for testing of new configurations before deployment in production

The analytics system will contain confidential information and information that possibly impacts employee privacy
Revocation is part of normal employee termination. Special emergency revocation should be in place for suspected misuse

An analytics engine that is capable of supporting growing volumes of information


Normalization of data is required for advanced searching and comparison of events from different sources
Analysis of patterns in large volumes of information
Process integration in which information from the analytics process can be followed up by security incident management
Process integration in which anomalies uncovered in the analytics process are used to create new monitoring rules
Integration of threat intelligence information into the system for analysis and hunting purposes
Searching capabilities that support extraction of specific information based on characteristics
Graphing capabilities to support anomaly detection
Drilldowns on graphs to quickly 'zoom in' on details of visual anomalies
The audit trail can be used to report on analyst activities and to uncover potential abuse of the big data solution
Capability of detecting historical activity for recently uncovered threats
Collection of structured information (e.g. log files)
Collection of unstructured information (e.g. documents in different formats)
Baselines of 'regular' user behavior
Baselines of 'regular' application behavior
Baselines of 'regular' infrastructure behavior
Baselines of 'regular' network behavior
Baselines of 'regular' system behavior
A central console that allows access for analysts
A data warehouse for security events that is dedicated for the analytics solution
Use of a data architecture (e.g. Lambda) that is flexible in accommodating different kinds and large volumes of information
Allows the principle of least privilege to be applied to the configuration of user accounts
Only trusted tools used for maintenance, remote maintenance / support monitored and controlled
API integration to import and export information (such as IoCs, YARA rules or suspicious files)
Technology
1. SIEM tooling
2. IDPS tooling
3. Security Analytics tooling
4. Automation & Orchestration tooling

4 Automation & Orchestration Tooling

4.1 Accountability
4.1.1 Has functional ownership of the solution been formally assigned?
4.1.2 Has technical ownership of the solution been formally assigned?
4.2 Documentation
4.2.1 Has the solution been technically described?
4.2.2 Has the solution been functionally described?
4.3 Personnel & support
4.3.1 Is there dedicated personnel for support?
4.3.2 Is the personnel for support formally trained?
4.3.3 Is the personnel for support certified?
4.3.4 Is there a support contract for the solution?
4.4 Availability & Integrity
4.4.1 Is there high availability (HA) in place for the solution?
4.4.2 Is there data backup / replication in place for the solution?
4.4.3 Is there configuration backup / replication in place for the solution?
4.4.4 Is there a Disaster Recovery plan in place for this solution?
4.4.5 Is the Disaster Recovery plan regularly tested?
4.4.6 Is there a separate development / test environment for this solution?
4.5 Confidentiality
4.5.1 Is access to the solution limited to authorized personnel?
4.5.2 Are access rights regularly reviewed and revoked if required?
4.6 Specify which technological capabilities and artefacts are present:
4.6.1 SIEM Integration
4.6.2 Threat intelligence integration
4.6.3 Asset management integration
4.6.4 User management integration
4.6.5 Vulnerability management integration
4.6.6 Historical event matching
4.6.7 Knowledge base integration

4.6.8 Risk-based event prioritization


4.6.9 Firewall integration
4.6.10 IDPS integration
4.6.11 Email protection integration
4.6.12 Malware protection integration
4.6.13 Sandbox integration
4.6.14 Active Directory / IAM integration
4.6.15 Ticket workflow support
4.6.16 Granular access control
4.6.17 Controlled and monitored maintenance / support
4.6.18 Performance tracking
4.6.19 Runbook support
Completeness (%)

Comments and/or Remarks

4.7 Specify any comments or remarks you feel are important to this part of the assessment
Answer Guidance
0
Remarks

Functional ownership includes functional accountability


Technical ownership includes technical accountability

A technical description of the automation & orchestration system components and configuration
A description of the automation & orchestration system functional configuration (workflows, integrations, etc.)

Dedicated personnel should be in place to ensure that support is always available. This can also be staff from an outsourced provider
Training helps to jump start new hires, and to learn a proper way of working with the tool
Certification demonstrates ability to handle the tooling properly
A support contract may cover on-site support, support availability, response times, escalation and full access to resources

Can be fully implemented HA, partially implemented, hot spare, etc.


May not be required for this particular solution
Configuration synchronization could be part of a HA setup
A DR plan is required to restore service in case of catastrophic events

A separate test environment allows for testing of new configurations before deployment in production

The automation system may have automated actions that can impact the usage of systems and should be restricted
Revocation is part of normal employee termination. Special emergency revocation should be in place for suspected misuse

The automation & orchestration tool receives events from the SIEM system
Contextualize potential incidents using threat intelligence
Contextualize potential incidents using asset information
Contextualize potential incidents using user information
Contextualize potential incidents using vulnerability management information
Contextualize potential incidents using similar historical events
Automatically update the knowledge base using event information
Risk-based prioritization of security events using contextualized information; an illustrative scoring sketch follows this guidance block
Automated remediation by blocking attackers on the firewall
Automated remediation by blocking attackers in the network
Automated remediation by blocking email senders
Automated remediation by quarantining malware and scanning end-points for malware threats
Automated delivery of malware samples to sandbox environments for extensive analysis
Automated locking and suspension of user accounts or revocation of access rights based on event outcome
Automated ticket creation and workflow support
Allows the principle of least privilege to be applied to the configuration of user accounts
Only trusted tools used for maintenance, remote maintenance / support monitored and controlled
Application of KPIs and metrics to ticket workflow
Support for runbooks that allow for automated decision making based on predefined parameters
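As a purely illustrative example of risk-based event prioritization using contextualized information, the sketch below combines enrichments into a single priority score; the weights and field names are assumptions, not part of the SOC-CMM or of any specific orchestration product.

def prioritize(event: dict) -> float:
    # Combine contextual enrichments into a single priority score (0-100).
    # Weights are illustrative assumptions only.
    score = 10.0 * event.get("severity", 1)                     # base severity 1-5
    score += 25.0 if event.get("asset_critical") else 0.0       # asset management context
    score += 20.0 if event.get("known_vulnerable") else 0.0     # vulnerability management context
    score += 25.0 if event.get("threat_intel_match") else 0.0   # threat intelligence context
    return min(score, 100.0)

event = {"severity": 3, "asset_critical": True, "known_vulnerable": False, "threat_intel_match": True}
print(prioritize(event))   # 80.0 -> handled before lower-scoring events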
Services
1. Security Monitoring
2. Security Incident Management
3. Security Analysis & Forensics
4. Threat Intelligence
5. Threat Hunting
6. Vulnerability Management
7. Log Management

1 Security Monitoring

Maturity
1.1 Have you formally described the security monitoring service?
1.2 Please specify elements of the security monitoring service document:
1.2.1 Key performance indicators
1.2.2 Quality indicators
1.2.3 Service dependencies
1.2.4 Service levels
1.2.5 Hours of operation
1.2.6 Service customers and stakeholders
1.2.7 Purpose
1.2.8 Service input / triggers
1.2.9 Service output / deliverables
1.2.10 Service activities
1.2.11 Service roles & responsibilities
Completeness
1.3 Is the service measured for quality?
1.4 Is the service measured for service delivery in accordance with service levels?
1.5 Are customers and/or stakeholders regularly updated about the service?
1.6 Is there a contractual agreement between the SOC and the customers?
1.7 Is sufficient personnel allocated to the process to ensure required service delivery?
1.8 Is the service aligned with other relevant processes?
1.9 Is there an incident resolution / service continuity process in place for this service?
1.10 Has a set of procedures been created for this service?
1.11 Are best practices applied to the service?
1.12 Are use cases used in the security monitoring service?
1.13 Is process data gathered for prediction of service performance?
1.14 Is the service continuously being improved based on improvement goals?
Capability
1.15 Please specify capabilities of the security monitoring service:
1.15.1 Early detection
1.15.2 Intrusion detection
1.15.3 Exfiltration detection
1.15.4 Subtle event detection
1.15.5 Malware detection
1.15.6 Anomaly detection
1.15.7 Real-time detection
1.15.8 Alerting & notification
1.15.9 Status monitoring
1.15.10 Perimeter monitoring
1.15.11 Host monitoring
1.15.12 Network & traffic monitoring
1.15.13 Access & usage monitoring
1.15.14 User monitoring
1.15.15 Application & service monitoring
1.15.16 Behavior monitoring
1.15.17 Database monitoring
1.15.18 Data loss monitoring
1.15.19 Device loss / theft monitoring
1.15.20 Third-party monitoring

1.15.21 Physical environment monitoring


1.15.22 False-positive reduction
1.15.23 Continuous tuning
1.15.24 Coverage
Completeness (%)

Comments and/or Remarks

1.16 Specify any comments or remarks you feel are important to this part of the assessment

Answer CMMI level

CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
Incomplete
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 4
CMMI level 5
0
Guidance
Remarks

A service description should be in place

Indicators to establish the performance of the service


Indicators to establish the quality of service delivery
A clear understanding of which people / process / technologies are required for adequate service delivery
Agreements on minimum performance, capacity, availability, etc.
The operational hours for this service (e.g. 24/7)
The customers and stakeholders for this service (e.g. IT management)
The purpose and objectives for this service
The service input: what triggers this service to run?
The service output: what does the service deliver? Can be tangible (e.g. reports) or intangible (e.g. situational awareness)
Which activities are carried out within the scope of the service?
Which roles and responsibilities apply to this service?
Use this outcome to determine the score for 1.2
Are the quality indicators from the previous questions used for reporting on the service?
Service levels should be used to formally commit the SOC to service delivery
Changes to the service scope, delivery, etc.
Contractual agreements should also cover penalties
Allocation of dedicated personnel will ensure highest service quality
e.g. alignment with configuration management, incident management, etc.
Service continuity is important to comply with contractual agreements, even in case of major incidents
Procedures support process standardization and quality. Personnel should be trained to use procedures correctly and in a structured manner
Best practices should be used to optimize this service
e.g. user login brute-force, denial of service, non-compliance, etc.
Service performance measurement requires establishment of performance goals
Improvement based on evaluation, (maturity) assessment, tests, etc.
Capability to detect incidents in an early stage
Capability to detect intrusion attempts
Capability to detect information leaving the organization
Capability to detect slight changes in systems, applications or network that may indicate malicious behavior
Capability to detect malware in the infrastructure
Capability to detect anomalies
Can also be near real-time (e.g. 15 minutes delay)
Capability to automatically send alerts for all security monitoring components
Monitoring of the status of the system
Monitoring of the network perimeter for attempted intrusions and exfiltration
Monitoring of endpoints in the networks (servers, clients, etc.)
Monitoring of network and traffic flows and anomalies
Monitoring of access attempts
Monitoring of user action
Monitoring of applications & services
Monitoring of behavior against baselines (can be host, network and user behavior)
Monitoring of databases
Monitoring for loss of information
Monitoring for loss or theft of company assets
Monitoring of trusted third-parties to detect possible breach attempts through the supply chain

Monitoring of the physical environment to detect cyber security incidents


A process for reducing the amount of false-positives
A continuous tuning process for the correlation rules
How well does the security monitoring service cover your assets? This includes SIEM asset coverage and IDPS coverage
Services
1. Security Monitoring
2. Security Incident Management
3. Security Analysis & Forensics
4. Threat Intelligence
5. Threat Hunting
6. Vulnerability Management
7. Log Management

2 Security Incident Management

Maturity
2.1 Have you adopted a maturity assessment methodology for Security Incident Management?
2.1.1 If yes, please specify the methodology
2.1.2 If yes, please specify the maturity level (can have up to 2 digits)
If yes, skip directly to 2.7
2.2 Have you adopted a standard for the Security Incident Management process?
2.3 Have you formally described the security incident management process?
2.4 Please specify elements of the security incident management document:
2.4.1 Security incident definition
2.4.2 Service levels
2.4.3 Workflow
2.4.4 Decision tree
2.4.5 Hours of operation
2.4.6 Service customers and stakeholders
2.4.7 Purpose
2.4.8 Service input / triggers
2.4.9 Service output / deliverables
2.4.10 Service activities
2.4.11 Service roles & responsibilities
Completeness
2.5 Is the service measured for quality?
2.6 Is the service measured for service delivery in accordance with service levels?
2.7 Are customers and/or stakeholders regularly updated about the service?
2.8 Is there a contractual agreement between the SOC and the customers?
2.9 Is sufficient personnel allocated to the process to ensure required service delivery?
2.10 Is the service aligned with other relevant processes?
2.11 Is the incident response team authorized to perform (invasive) actions when required?
2.12 Are best practices applied to the service?
2.13 Is the service supported by predefined workflows or scenarios?
2.14 Is process data gathered for prediction of service performance?
2.15 Is the service continuously being improved based on improvement goals?

Capability
2.16 Please specify capabilities and artefacts of the security incident management service:
2.16.1 Incident logging procedure
2.16.2 Incident resolution procedure
2.16.3 Incident investigation procedure
2.16.4 Escalation procedure
2.16.5 Evidence collection procedure
2.16.6 Password change procedure
2.16.7 IR Training
2.16.8 Table-top exercises
2.16.9 Red team / blue team exercises
2.16.10 RACI matrix
2.16.11 Response authorization
2.16.12 Incident template
2.16.13 Incident tracking system
2.16.14 False-positive reduction
2.16.15 Priority assignment
2.16.16 Severity assignment
2.16.17 Categorization
2.16.18 Critical bridge
2.16.19 War room
2.16.20 Communication plan & email templates
2.16.21 Backup communication technology
2.16.22 Secure communication channels
2.16.23 (dedicated) information sharing platform
2.16.24 Change management integration
2.16.25 Malware extraction & analysis
2.16.26 On-site incident response
2.16.27 Remote incident response
2.16.28 Third-party escalation
2.16.29 Evaluation template
2.16.30 Reporting template
2.16.31 Incident closure
2.16.32 Lessons learned extraction for process improvement

Completeness (%)

Comments and/or Remarks

2.17 Specify any comments or remarks you feel are important to this part of the assessment

Answer CMMI level


CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
Incomplete
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 4
CMMI level 5
0
Guidance
Remarks

e.g., SIM3, CREST, etc.


Please convert to a 5-point scale if required. For example: 3.6 on a 4-point scale = 4.5 on a 5-point scale (a short conversion sketch follows this guidance block)
The score in 2.1.2 overrules any maturity scoring in this section
e.g. NIST SP 800-61, CERT handbook, etc.
A service description should be in place

A clear and unambiguous definition of a security incident


e.g. response times
The process steps that are part of the security incident management process (e.g. detection, triage, etc.)
Decision tree for escalation and starting of the process
When can the security incident response process be started?
The customers and stakeholders for this service (e.g. IT management)
The purpose and objectives for this service
The service input: what triggers this service to run?
The service output: what does the service deliver? Can be tangible (e.g. reports) or intangible (e.g. situational awareness)
Which activities are carried out within the scope of the service?
Which roles and responsibilities apply to this service?
Use this outcome to determine the score for 2.4
Are the quality indicators from the previous questions used for reporting on the service?
Service levels should be used to formally commit the SOC to service delivery
Changes to the service scope, delivery, etc.
Contractual agreements should also cover penalties
Allocation of dedicated personnel will ensure highest service quality
e.g. alignment with configuration management, incident management, etc.
This is a mandate issue. The team should have mandate beforehand to optimize incident response times
Best practices should be used to optimize this service
Workflows and scenarios can be used to structure follow-up and determine expected incident progression
Service performance measurement requires establishment of performance goals
Improvement based on evaluation, (maturity) assessment, tests, etc.

Part of preparation procedures


Part of preparation procedures
Part of preparation procedures. Includes triage procedure and workflow
Part of preparation procedures
Part of preparation procedures
Part of preparation procedures
Preparation exercises to determine service effectiveness
Preparation exercises to determine service effectiveness
Preparation exercises to determine service effectiveness
Matrix defining Responsible, Accountable, Consulted and Informed entities for the process
Authorization from senior management to take any action required for incident mitigation (e.g. disconnect systems)
Templates for security incident management registration
A system that supports the security incident management workflow. If possible dedicated, or supporting ticket confidentiality
A procedure to avoid false-positives in the security incident management process
Assignment of priority to the incident, part of impact assessment
Assignment of severity to the incident, part of impact assessment
Categorization of the incident
A communication bridge for continuous alignment of employees involved in security incident management
A dedicated facility for coordination of security incidents
Standardized plans and templates for communication. Includes reachability in case of emergency and outreach to customers
Backup communication technology in case of failure of primary means. Includes internet access, email systems and phones
Encrypted and secure communications (includes email and phones) that can be used during incident response
A platform for sharing information regarding the security incident
Integration with the change management process for any actions taken in the security incident management process
Extraction and analysis of malware
Localized incident response capability
Remote incident response capability
Escalation process to third parties (vendors, partners, etc.)
A template for post-incident evaluation
A template for reporting on the security incident
Formal closure of the incident, including debriefing sessions
Continuous improvement based on previous experiences
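For the scale conversion mentioned in the guidance for 2.1.2, the sketch below shows the proportional rescaling implied by the example given there (3.6 on a 4-point scale equals 4.5 on a 5-point scale); it assumes both scales start at zero.

def rescale(score: float, from_max: float = 4.0, to_max: float = 5.0) -> float:
    # Proportional rescaling, assuming both maturity scales start at zero.
    return score * to_max / from_max

print(rescale(3.6))   # 4.5, matching the example in the guidance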
Services
1. Security Monitoring
2. Security Incident Management
3. Security Analysis & Forensics
4. Threat Intelligence
5. Threat Hunting
6. Vulnerability Management
7. Log Management

3 Security Analysis & Forensics

Maturity
3.1 Have you formally described the security analysis & forensics service?
3.2 Please specify elements of the security analysis service document:
3.2.1 Key performance indicators
3.2.2 Quality indicators
3.2.3 Service dependencies
3.2.4 Service levels
3.2.5 Hours of operation
3.2.6 Service customers and stakeholders
3.2.7 Purpose
3.2.8 Service input / triggers
3.2.9 Service output / deliverables
3.2.10 Service activities
3.2.11 Service roles & responsibilities
Completeness
3.3 Is the service measured for quality?
3.4 Is the service measured for service delivery in accordance with service levels?
3.5 Are customers and/or stakeholders regularly updated about the service?
3.6 Is there a contractual agreement between the SOC and the customers?
3.7 Is sufficient personnel allocated to the process to ensure required service delivery?
3.8 Is the service aligned with other relevant processes?
3.9 Is there an incident resolution / service continuity process in place for this service?
3.10 Has a set of procedures been created for this service?
3.11 Are best practices applied to the service?
3.12 Is the service supported by predefined workflows or scenarios?
3.13 Is process data gathered for prediction of service performance?
3.14 Is the service continuously being improved based on improvement goals?
Capability
3.15 Please specify capabilities and artefacts of the security analysis process:
3.15.1 Event analysis
3.15.2 Event analysis toolkit
3.15.3 Trend analysis
3.15.4 Incident analysis
3.15.5 Visual analysis
3.15.6 Static malware analysis
3.15.7 Dynamic malware analysis
3.15.8 Tradecraft analysis
3.15.9 Historic analysis
3.15.10 Network analysis
3.15.11 Memory analysis
3.15.12 Mobile device analysis
3.15.13 Volatile information collection
3.15.14 Remote evidence collection
3.15.15 Forensic hardware toolkit
3.15.16 Forensic analysis software toolkit
3.15.17 Dedicated analysis workstations
3.15.18 Security analysis & forensics handbook
3.15.19 Security analysis & forensics workflows
3.15.20 Case management system
3.15.21 Report templates
3.15.22 Evidence seizure procedure
3.15.23 Evidence transport procedure
3.15.24 Chain of custody preservation procedure

Completeness (%)

Comments and/or Remarks

3.16 Specify any comments or remarks you feel are important to this part of the assessment

Answer CMMI level

CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
Incomplete
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 4
CMMI level 5
0
Guidance
Remarks

A service description should be in place

Indicators to establish the performance of the service


Indicators to establish the quality of service delivery
A clear understanding of which people / process / technologies are required for adequate service delivery
Agreements on minimum performance, capacity, availability, etc.
The operational hours for this service (e.g. 24/7)
The customers and stakeholders for this service (e.g. IT management)
The purpose and objectives for this service
The service input: what triggers this service to run?
The service output: what does the service deliver? Can be tangible (e.g. reports) or intangible (e.g. situational awareness)
Which activities are carried out within the scope of the service?
Which roles and responsibilities apply to this service?
Use this outcome to determine the score for 3.2
Are the quality indicators from the previous questions used for reporting on the service?
Service levels should be used to formally commit the SOC to service delivery
Changes to the service scope, delivery, etc.
Contractual agreements should also cover penalties
Allocation of dedicated personnel will ensure highest service quality
e.g. alignment with configuration management, incident management, etc.
Service continuity is important to comply with contractual agreements, even in case of major incidents
Procedures support process standardization and quality. Personnel should be trained to use procedures correctly and structurally
Best practices should be used to optimize this service
Use cases can be used to guide the analysis workflows
Service performance measurement requires establishment of performance goals
Improvement based on evaluation, (maturity) assessment, tests, etc.
Analysis of detailed events
A combination of internal and external tools that can be used for security event analysis purposes
Analysis of trends in events or incidents
Analysis of security incidents
Visualization tools for data analysis
Reverse engineering and disassembly of malware
Running malware in a controlled environment to determine its characteristics
Analysis of the tradecraft of the attacker. This includes the tools, tactics, techniques and procedures used by attackers
Analysis of historic information based on new insights. APTs can span multiple months or years
Analysis of network traffic patterns and packets

Capability to perform forensic analysis of mobile devices


Collection of volatile information (such as memory; see RFC3227) requires swift response, as evidence may be lost quickly
Capability to remotely collect evidence (files, disk images, memory dumps, etc.) from target systems
Hardware toolkits will likely at least consist of write-blockers for disk imaging
Software tools used in forensic analysis
Dedicated workstations loaded with specialized tools should be used to make investigations more efficient
A handbook that describes security analysis workflows, tools, exceptions, known issues, etc.
An established workflow for performing security analysis
A case management system that supports the analyst workflow
Report templates for standardization of investigation reporting
Procedure for seizure of evidence in forensic analysis
Procedure for trusted transport of evidence (e.g. laptops) that preserve the chain of custody
Procedures to correctly process evidence, while preserving the chain of custody
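To make the evidence handling procedures above more concrete, the sketch below shows one common building block: hashing an evidence file and appending a timestamped custody record. It is a minimal illustration under assumed file and handler names, not the procedure prescribed by the SOC-CMM:

# Minimal sketch: record a chain-of-custody entry with a SHA-256 hash of the evidence file
import hashlib, json, datetime

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def record_custody(path: str, handler: str, action: str, log_path: str = "custody_log.jsonl") -> None:
    entry = {
        "file": path,
        "sha256": sha256_of(path),          # integrity reference for later verification
        "handler": handler,
        "action": action,                   # e.g. "seized", "transported", "analysed"
        "timestamp": datetime.datetime.utcnow().isoformat() + "Z",
    }
    with open(log_path, "a") as log:
        log.write(json.dumps(entry) + "\n")

# record_custody("disk_image.dd", "analyst.jdoe", "seized")  # hypothetical file and handler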
Services
1. Security Monitoring 5. Threat Hunting
2. Security Incident Management 6. Vulnerability Management
3. Security Analysis & Forensics 7. Log Management
4. Threat Intelligence

4 Threat Intelligence

Maturity
4.1 Have you formally described the threat intelligence service?
4.2 Please specify elements of the threat intelligence service document:
4.2.1 Key performance indicators
4.2.2 Quality indicators
4.2.3 Service dependencies
4.2.4 Service levels
4.2.5 Hours of operation
4.2.6 Service customers and stakeholders
4.2.7 Purpose
4.2.8 Service input / triggers
4.2.9 Service output / deliverables
4.2.10 Service activities
4.2.11 Service roles & responsibilities
Completeness
4.3 Is the service measured for quality?
4.4 Is the service measured for service delivery in accordance with service levels?
4.5 Are customers and/or stakeholders regularly updated about the service?
4.6 Is there a contractual agreement between the SOC and the customers?
4.7 Is sufficient personnel allocated to the process to ensure required service delivery?
4.8 Is the service aligned with other relevant processes?
4.9 Is there an incident resolution / service continuity process in place for this service?
4.10 Has a set of procedures been created for this service?
4.11 Are best practices applied to the service?
4.12 Is process data gathered for prediction of service performance?
4.13 Is the service continuously being improved based on improvement goals?

Capability
4.14 Please specify capabilities and artefacts of the threat intelligence process:
4.14.1 Continuous intelligence gathering
4.14.2 Automated intelligence gathering & processing
4.14.3 Centralized collection & distribution
4.14.4 Intelligence collection from open / public sources
4.14.5 Intelligence collection from closed communities
4.14.6 Intelligence collection from intelligence provider
4.14.7 Intelligence collection from business partners
4.14.8 Intelligence collection from mailing lists
4.14.9 Intelligence collection from internal sources
4.14.10 Structured data analysis
4.14.11 Unstructured data analysis
4.14.12 Past incident analysis
4.14.13 Trend analysis
4.14.14 Automated alerting
4.14.15 Adversary movement tracking
4.14.16 Attacker identification
4.14.17 Threat identification
4.14.18 Threat prediction
4.14.19 TTP extraction
4.14.20 Deduplication
4.14.21 Enrichment
4.14.22 Contextualization
4.14.23 Prioritization
4.14.24 Threat intelligence reporting
4.14.25 Forecasting
4.14.26 Sharing within the company
4.14.27 Sharing with the industry
4.14.28 Sharing outside the industry
4.14.29 Sharing in standardized format (e.g. STIX)

Completeness (%)

Comments and/or Remarks

4.15 Specify any comments or remarks you feel are important to this part of the assessment

Answer CMMI level

CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
Incomplete
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 4
CMMI level 5
0
Guidance
Remarks

A service description should be in place

Indicators to establish the performance of the service


Indicators to establish the quality of service delivery
A clear understanding of which people / process / technologies are required for adequate service delivery
Agreements on minimum performance, capacity, availability, etc.
The operational hours for this service (e.g. 24/7)
The customers and stakeholders for this service (e.g. IT management)
The purpose and objectives for this service
The service input: what triggers this service to run?
The service output: what does the service deliver? Can be tangible (e.g. reports) or intangible (e.g. situational awareness)
Which activities are carried out within the scope of the service?
Which roles and responsibilities apply to this service?
Use this outcome to determine the score for 4.2
Are the quality indicators from the previous questions used for reporting on the service?
Service levels should be used to formally commit the SOC to service delivery
Changes to the service scope, delivery, etc.
Contractual agreements should also cover penalties
Allocation of dedicated personnel will ensure highest service quality
e.g. the security monitoring process, and mainly the security incident management process
Service continuity is important to comply with contractual agreements, even in case of major incidents
Procedures support process standardization and quality. Personnel should be trained to use procedures correctly and structurally
Best practices should be used to optimize this service
Service performance measurement requires establishment of performance goals
Improvement based on evaluation, (maturity) assessment, tests, etc.
A process for continuously gathering relevant intelligence information
An automated system that collects and processes security intelligence information
A central 'hub' for distributing indicators of compromise to other systems for further processing
The use of public sources in the security intelligence process
The use of closed trusted communities in the security intelligence process
The use of intelligence providers as a source for the security intelligence process
The use of business partners as a source for the security intelligence process
The use of mailing lists as a source for the security intelligence process
The use of internal intelligence sources for the security intelligence process
The capability to analyze structured information
The capability to analyze unstructured information
The capability of using past incidents in the threat intelligence process. e.g. connecting new IoCs to past threats
Analyzing trends in the threat intelligence IoCs observed within the company
Automated alerting of sightings of observables
Tracking the movement of attackers to keep track of new tools, tactics, techniques and procedures
Identification of adversaries based on correlating intelligence indicators and incidents
Identification of threats related to attacker groups
Prediction of threats based on the information gathered in the threat intelligence process
Extraction of Tactics, Techniques and Procedures (TTP) from observables within the infrastructure
Deduplication of threat intelligence feeds to avoid duplicate events
Enrichment of information with additional sources for a higher level of confidence
Addition of context to the threat intelligence process. Context can be vulnerability context, asset criticality, etc.
Prioritization of threat intelligence based on trustworthiness of source, sector relevance, geographic relevance, timeliness, etc.
Reporting on threat intelligence findings and activities
Forecasting based on trends and incidents
Sharing of information with relevant parties within the company
Sharing of information with relevant parties within the same industry
Sharing of information with relevant parties outside the industry
Sharing of information in standardized exchange formats, such as STIX
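As an illustration of sharing in a standardized format, the sketch below assembles a single, abbreviated STIX 2.1 indicator as a plain Python dictionary (the domain and name are placeholders); real exchanges would typically use a dedicated STIX library and a transport such as TAXII:

# Minimal sketch: a single STIX 2.1 indicator expressed as a plain dictionary (abbreviated field set)
import json, uuid, datetime

now = datetime.datetime.utcnow().strftime("%Y-%m-%dT%H:%M:%S.000Z")
indicator = {
    "type": "indicator",
    "spec_version": "2.1",
    "id": f"indicator--{uuid.uuid4()}",
    "created": now,
    "modified": now,
    "name": "Example malicious domain",            # illustrative content
    "pattern": "[domain-name:value = 'evil.example.com']",
    "pattern_type": "stix",
    "valid_from": now,
}
print(json.dumps(indicator, indent=2))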
Services
1. Security Monitoring 5. Threat Hunting
2. Security Incident Management 6. Vulnerability Management
3. Security Analysis & Forensics 7. Log Management
4. Threat Intelligence

5 Threat Hunting

Maturity
5.1 Do you use a standardized threat hunting approach?
5.2 Have you formally described the threat hunting service?
5.3 Please specify elements of the threat hunting service document:
5.3.1 Key performance indicators
5.3.2 Quality indicators
5.3.3 Service dependencies
5.3.4 Service levels
5.3.5 Hours of operation
5.3.6 Service customers and stakeholders
5.3.7 Purpose
5.3.8 Service input / triggers
5.3.9 Service output / deliverables
5.3.10 Service activities
5.3.11 Service roles & responsibilities
Completeness
5.4 Is the service measured for quality?
5.5 Is the service measured for service delivery in accordance with service levels?
5.6 Are customers and/or stakeholders regularly updated about the service?
5.7 Is there a contractual agreement between the SOC and the customers?
5.8 Is sufficient personnel allocated to the process to ensure required service delivery?
5.9 Is the service aligned with other relevant processes?
5.10 Is there an incident resolution / service continuity process in place for this service?
5.11 Has a set of procedures been created for this service?
5.12 Are best practices applied to the service?
5.13 Is process data gathered for prediction of service performance?
5.14 Is the service continuously being improved based on improvement goals?
Capability
5.15 Please specify capabilities and artefacts of the threat hunting process:
5.15.1 Hash value hunting
5.15.2 IP address hunting
5.15.3 Domain name hunting
5.15.4 Network artefact hunting
5.15.5 Host-based artefact hunting
5.15.6 Adversary tools hunting
5.15.7 Adversary TTP hunting
5.15.8 Inbound threat hunting
5.15.9 Outbound threat hunting
5.15.10 Internal threat hunting
5.15.11 Outlier detection
5.15.12 Hunting coverage
5.15.13 Leveraging of existing tooling
5.15.14 Custom hunting scripts and tools
5.15.15 Dedicated hunting platform
5.15.16 Continuous hunting data collection
5.15.17 Historic hunting
5.15.18 Automated hunting
5.15.19 Hunt alerting
5.15.20 Vulnerability information integration
5.15.21 Threat intelligence integration

Completeness (%)

Comments and/or Remarks

5.16 Specify any comments or remarks you feel are important to this part of the assessment

Answer CMMI level

CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
Incomplete
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 4
CMMI level 5
0
Guidance
Remarks

Given the fact that little public information is available, this can also be an internally developed approach
A service description should be in place

Indicators to establish the performance of the service


Indicators to establish the quality of service delivery
A clear understanding of which people / process / technologies are required for adequate service delivery
Agreements on minimum performance, capacity, availability, etc.
The operational hours for this service (e.g. 24/7)
The customers and stakeholders for this service (e.g. IT management)
The purpose and objectives for this service
The service input: what triggers this service to run?
The service output: what does the service deliver? Can be tangible (e.g. reports) or intangible (e.g. situational awareness)
Which activities are carried out within the scope of the service?
Which roles and responsibilities apply to this service?
Use this outcome to determine the score for 5.3
Are the quality indicators from the previous questions used for reporting on the service?
Service levels should be used to formally commit the SOC to service delivery
Changes to the service scope, delivery, etc.
Contractual agreements should also cover penalties
Allocation of dedicated personnel will ensure highest service quality
e.g. threat intelligence, security monitoring, security incident response
Service continuity is important to comply with contractual agreements, even in case of major incidents
Procedures support process standardization and quality. Personnel should be trained to use procedures correctly and structurally
Best practices should be used to optimize this service
Service performance measurement requires establishment of performance goals
Improvement based on evaluation, (maturity) assessment, tests, etc.
'Trivial', lowest added value as these change swiftly
'Easy', low added value
'Simple', somewhat higher added value
'Annoying', includes network flow, packet capture, proxy logs, active network connections, historic connections, ports and services
'Annoying', includes users, processes, services, drivers, files, registry, hardware, memory, disk activity, network connections
'Challenging', includes dual-use tools
'Tough!', requires detailed knowledge of adversaries and their modus operandi
Hunting for inbound threats such as inbound connections, DNS zone transfers, inbound emails
Hunting for outbound threats such as C&C traffic, outbound emails
Hunting for threats inside the organization. Hunting may focus on lateral movement or anomalous network connections
Using statistical methods to detect outliers, such as least frequency of occurrence (a minimal sketch follows this list)
How well does the hunting process cover your environment? All assets & network traffic, or only partially? Scalability is key
Existing tools may include the SIEM system, firewall analysis tools, etc.
Any custom scripts to assist the hunting process. May include scripts to scan end-points for particular artifacts
Dedicated tooling for the hunting process
Continuous collection of information can be used to alert on indicators and to preserve system state
Hunting for indicators that may have been present on end-points in the past. Requires some sort of saved state
Fully automated hunting capability
Automated alerting based on queries performed in the hunting process
Integration of vulnerability information into the hunting process to provide additional context
Integration of threat intelligence information into the hunting process
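The sketch below illustrates two of the capabilities above in a few lines: least-frequency-of-occurrence outlier detection over collected process names, and flagging any process that matches a threat intelligence indicator list. The data and indicators are purely illustrative:

# Minimal sketch: least-frequency-of-occurrence outliers plus a threat-intel match (illustrative data)
from collections import Counter

observed_processes = [
    "svchost.exe", "svchost.exe", "explorer.exe", "svchost.exe",
    "explorer.exe", "rundl132.exe",          # note the typo-squatted name
]
ioc_process_names = {"mimikatz.exe", "rundl132.exe"}   # hypothetical threat-intel feed

counts = Counter(observed_processes)
rarity_threshold = 1                                    # tune to the size of the data set

for name, count in counts.items():
    if count <= rarity_threshold:
        print(f"outlier (seen {count}x): {name}")
    if name in ioc_process_names:
        print(f"threat-intel match: {name}")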
Services
1. Security Monitoring 5. Threat Hunting
2. Security Incident Management 6. Vulnerability Management
3. Security Analysis & Forensics 7. Log Management
4. Threat Intelligence

6 Vulnerability Management

Maturity
6.1 Have you formally described the vulnerability management service?
6.2 Please specify elements of the vulnerability management service document:
6.2.1 Key performance indicators
6.2.2 Quality indicators
6.2.3 Service dependencies
6.2.4 Service levels
6.2.5 Hours of operation
6.2.6 Service customers and stakeholders
6.2.7 Purpose
6.2.8 Service input / triggers
6.2.9 Service output / deliverables
6.2.10 Service activities
6.2.11 Service roles & responsibilities
Completeness
6.3 Is the service measured for quality?
6.4 Is the service measured for service delivery in accordance with service levels?
6.5 Are customers and/or stakeholders regularly updated about the service?
6.6 Is there a contractual agreement between the SOC and the customers?
6.7 Is sufficient personnel allocated to the process to ensure required service delivery?
6.8 Is the service aligned with other relevant processes?
6.9 Is there an incident resolution / service continuity process in place for this service?
6.10 Has a set of procedures been created for this service?
6.11 Are best practices applied to the service?
6.12 Is process data gathered for prediction of service performance?
6.13 Is the service continuously being improved based on improvement goals?

Capability
6.14 Please specify capabilities and artefacts of the vulnerability management process:
6.14.1 Network mapping
6.14.2 Vulnerability identification
6.14.3 Risk identification
6.14.4 Risk acceptance
6.14.5 Security baseline scanning
6.14.6 Authenticated scanning
6.14.7 Incident management integration
6.14.8 Asset management integration
6.14.9 Configuration management integration
6.14.10 Patch management integration
6.14.11 Trend identification
6.14.12 Enterprise vulnerability repository
6.14.13 Enterprise application inventory
6.14.14 Vulnerability Management procedures
6.14.15 Scanning policy tuning
6.14.16 Detailed Vulnerability Reporting
6.14.17 Management Reporting
6.14.18 Scheduled scanning
6.14.19 Ad-hoc specific scanning

Completeness (%)

Comments and/or Remarks

6.15 Specify any comments or remarks you feel are important to this part of the assessment

Answer CMMI level

CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
Incomplete
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 4
CMMI level 5
0
Guidance
Remarks

A service description should be in place

Indicators to establish the performance of the service


Indicators to establish the quality of service delivery
A clear understanding of which people / process / technologies are required for adequate service delivery
Agreements on minimum performance, capacity, availability, etc.
The operational hours for this service (e.g. 24/7)
The customers and stakeholders for this service (e.g. IT management)
The purpose and objectives for this service
The service input: what triggers this service to run?
The service output: what does the service deliver? Can be tangible (e.g. reports) or intangible (e.g. situational awareness)
Which activities are carried out within the scope of the service?
Which roles and responsibilities apply to this service?
Use this outcome to determine the score for 6.2
Are the quality indicators from the previous questions used for reporting on the service?
Service levels should be used to formally commit the SOC to service delivery
Changes to the service scope, delivery, etc.
Contractual agreements should also cover penalties
Allocation of dedicated personnel will ensure highest service quality
e.g. the security monitoring process, and mainly the security incident management process
Service continuity is important to comply with contractual agreements, even in case of major incidents
Procedures support process standardization and quality. Personnel should be trained to use procedures correctly and structurally
Best practices should be used to optimize this service
Service performance measurement requires establishment of performance goals
Improvement based on evaluation, (maturity) assessment, tests, etc.
The capability to map the entire network
Capability of identification of vulnerabilities on all types of assets: systems, network components, databases, etc.
Identification of the risk associated with each of these vulnerabilities (a scoring sketch follows this list)
Vulnerabilities that are not mitigated must be formally accepted and documented as such
Scanning of systems for compliance with security baselines (e.g. CIS baselines)
Scanning of systems using credentials for higher confidence and additional vulnerabilities
Integration of the vulnerability management process with the incident management process
Integration of the vulnerability management process with the asset management process
Integration of the vulnerability management process with the configuration management process
Integration of the vulnerability management process with the patch management process
Identification of vulnerability trends across the whole population of systems
A repository or database that holds all vulnerability information. Can be used for analysis
An inventory of all applications used in the enterprise and the vulnerability status for each of those applications
Procedures supporting the vulnerability management process
Continuous tuning of the scanning policy to include new threats and vulnerabilities
Detailed reporting of vulnerable assets and mitigation strategies
A management report that contains an overview of the vulnerability status in the organization
A scheduling engine that allows for scanning at predefined times and insight into all available scans
e.g. capability to scan for specific vulnerabilities. May require consent and other processes to be in place
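As a minimal illustration of risk identification, the sketch below combines a finding's CVSS base score with an asset-criticality weight; the findings, weights and CVE identifiers are placeholders, and real implementations would draw these values from the scanner and the asset inventory:

# Minimal sketch: derive a risk value from CVSS base score and asset criticality (illustrative weights)
findings = [
    {"host": "db01", "cve": "CVE-0000-0001", "cvss": 9.8},   # hypothetical findings
    {"host": "ws17", "cve": "CVE-0000-0002", "cvss": 5.3},
]
asset_criticality = {"db01": 1.0, "ws17": 0.4}               # 0.0 (low) .. 1.0 (critical)

for f in findings:
    risk = f["cvss"] * asset_criticality.get(f["host"], 0.5)  # unknown assets get a medium weight
    f["risk"] = round(risk, 1)

for f in sorted(findings, key=lambda f: f["risk"], reverse=True):
    print(f"{f['host']} {f['cve']}: risk {f['risk']}")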
Services
1. Security Monitoring 5. Threat Hunting
2. Security Incident Management 6. Vulnerability Management
3. Security Analysis & Forensics 7. Log Management
4. Threat Intelligence

7 Log Management

Maturity
7.1 Have you formally described the log management service?
7.2 Please specify elements of the log management service document:
7.2.1 Key performance indicators
7.2.2 Quality indicators
7.2.3 Service dependencies
7.2.4 Service levels
7.2.5 Hours of operation
7.2.6 Service customers and stakeholders
7.2.7 Purpose
7.2.8 Service input / triggers
7.2.9 Service output / deliverables
7.2.10 Service activities
7.2.11 Service roles & responsibilities
Completeness
7.3 Is the service measured for quality?
7.4 Is the service measured for service delivery in accordance with service levels?
7.5 Are customers and/or stakeholders regularly updated about the service?
7.6 Is there a contractual agreement between the SOC and the customers?
7.7 Is sufficient personnel allocated to the process to ensure required service delivery?
7.8 Is the service aligned with other relevant processes?
7.9 Is there an incident resolution / service continuity process in place for this service?
7.10 Has a set of procedures been created for this service?
7.11 Are best practices applied to the service?
7.12 Is process data gathered for prediction of service performance?
7.13 Is the service continuously being improved based on improvement goals?

Capability
7.14 Please specify capabilities and artefacts of the log management process:
7.14.1 End-point log collection
7.14.2 Application log collection
7.14.3 Database log collection
7.14.4 Network flow data collection
7.14.5 Network device log collection
7.14.6 Security device log collection
7.14.7 Centralized aggregation and storage
7.14.8 Multiple retention periods
7.14.9 Secure log transfer
7.14.10 Support for multiple log formats
7.14.11 Support for multiple transfer techniques
7.14.12 Data normalization
7.14.13 Log searching and filtering
7.14.14 Alerting
7.14.15 Reporting and dashboards
7.14.16 Log tampering detection
7.14.17 Log collection policy
7.14.18 Logging policy
7.14.19 Data retention policy
7.14.20 Privacy and Sensitive data handling policy

Completeness (%)

Comments and/or Remarks

7.15 Specify any comments or remarks you feel are important to this part of the assessment

Answer CMMI level

CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
Incomplete
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 2
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 3
CMMI level 4
CMMI level 5
0
Guidance
Remarks

A service description should be in place

Indicators to establish the performance of the service


Indicators to establish the quality of service delivery
A clear understanding of which people / process / technologies are required for adequate service delivery
Agreements on minimum performance, capacity, availability, etc.
The operational hours for this service (e.g. 24/7)
The customers and stakeholders for this service (e.g. IT management)
The purpose and objectives for this service
The service input: what triggers this service to run?
The service output: what does the service deliver? Can be tangible (e.g. reports) or intangible (e.g. situational awareness)
Which activities are carried out within the scope of the service?
Which roles and responsibilities apply to this service?
Use this outcome to determine the score for 7.2
Are the quality indicators from the previous questions used for reporting on the service?
Service levels should be used to formally commit the SOC to service delivery
Changes to the service scope, delivery, etc.
Contractual agreements should also cover penalties
Allocation of dedicated personnel will ensure highest service quality
e.g. the security monitoring process, and mainly the security incident management process
Service continuity is important to comply with contractual agreements, even in case of major incidents
Procedures support process standardization and quality. Personnel should be trained to use procedures correctly and structurally
Best practices should be used to optimize this service
Service performance measurement requires establishment of performance goals
Improvement based on evaluation, (maturity) assessment, tests, etc.
Collection of logs from servers and clients
Collection of application logs
Collection of database logs
Collection of netflow (or equivalent) information
Collection of logs from network devices (switches, routers, etc.)
Collection of logs from security devices (firewall, remote access gateways, etc.)
A central physical or logical entity for processing and aggregating collected logging
e.g. short period for large-quantity logging (proxy logging), long period for security logging
Support for encryption and (client or server) authentication
Support for different log formats (plain text, XML, Windows Event Log, etc.)
e.g. syslog, WMI, etc.
i.e. assignment of severity, category, priority
The capability to search in large quantities of logging using search expressions and filter expressions
Basic alerting functions based on log contents or normalized information (severity, etc.)
Reports and dashboards for visualization of log information
Detection of tampering with the logging information. This can be part of techniques applied to cover tracks (a hash-chain sketch follows this list)
A policy that enforces log collection from all systems
Policy to enforce the generation of a minimum set of operational and security logs (e.g. authentication, authorization)
A policy that defines how long logging should (or may) be stored
A policy that describes how to deal with sensitive information that may exist in the security monitoring systems
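As a minimal illustration of log tampering detection, the sketch below chains a hash over successive log lines so that modifying or removing an earlier line changes every later digest; it is illustrative only and not a substitute for signed or write-once log storage:

# Minimal sketch: hash-chain log lines so tampering with earlier entries is detectable
import hashlib

def chain_digests(lines, seed: bytes = b"log-chain-seed"):
    # Yield (line, digest) pairs where each digest covers all preceding lines
    prev = seed
    for line in lines:
        digest = hashlib.sha256(prev + line.encode("utf-8")).hexdigest()
        prev = bytes.fromhex(digest)
        yield line, digest

log_lines = ["user alice logged in", "config changed by bob", "user alice logged out"]
for line, digest in chain_digests(log_lines):
    print(digest[:16], line)
# Re-running over a modified copy of the lines produces different digests from the first mismatch onward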
Results
1. Results
2. NIST Scoring

Domain Aspect Maturity Score Maturity Target


Business 1. Business Drivers 0
2. Customers 0
3. Charter 0
4. Governance 0
5. Privacy 0
overall Business 0 1
People 1. Employees 0
2. Roles and Hierarchy 0
3. People Management 0
4. Knowledge Management 0
5. Training and Education 0
overall People 0 1
Process 1. Management 0
2. Operations and Facilities 0
3. Reporting 0
4. Use Case Management 0
overall Process 0 1
Technology 1. SIEM tooling 0
2. IDPS tooling 0
3. Security Analytics tooling 0
4. Automation & Orchestration tooling 0
overall Technology 0 1
Services 1. Security Monitoring 0
2. Security Incident Management 0
3. Security Analysis & Forensics 0
4. Threat Intelligence 0
5. Threat Hunting 0
6. Vulnerability Management 0
7. Log Management 0
overall Services 0 1
Remaining columns of the results table above (Maturity target / Capability score / Capability target / In scope?):
overall Business, overall People, overall Process: maturity target 1, capability score N/A, capability target N/A
Technology aspects (SIEM, IDPS, Security Analytics, Automation & Orchestration tooling): capability score 0, in scope: Yes
overall Technology: maturity target 1, capability score 0, capability target 1
Services aspects (Security Monitoring through Log Management): capability score 0, in scope: Yes
overall Services: maturity target 1, capability score 0, capability target 1
(Radar charts: per-aspect maturity score versus target maturity score for the Business, People, Process, Technology and Services domains, and an overall chart comparing maturity score, target maturity score, capability score and target capability score per domain.)


Results
1. Results
2. NIST Scoring

Domain Aspect Maturity Score


Identify Asset Management (ID.AM) 0
Business Environment (ID.BE) 0
Governance (ID.GV) 0
Risk Assessment (ID.RA) 0
Risk Management Strategy (ID.RM) 0
overall Identify 0
Protect Access Control (PR.AC) 0
Awareness and Training (PR.AT) 0
Data Security (PR.DS) 0
Information Protection Processes and Procedures (PR.IP) 0
Maintenance (PR.MA) 0
Protective Technology (PR.PT) 0
overall Protect 0
Detect Anomalies and Events (DE.AE) 0
Security Continuous Monitoring (DE.CM) 0
Detection Processes (DE.DP) 0
overall Detect 0
Respond Response Planning (RS.RP) 0
Communications (RS.CO) 0
Analysis (RS.AN) 0
Mitigation (RS.MI) 0
Improvements (RS.IM) 0
overall Respond 0
Recover Recovery Planning (RC.RP) N/A
Improvements (RC.IM) N/A
Communications (RC.CO) N/A
overall Recover N/A
(Capability score column: 0 or N/A per NIST category. Radar charts: maturity score and capability score per NIST CSF category and per function: Identify, Protect, Detect, Respond, Recover.)
Next steps
1. Next steps for improvement

Maturity improvement
With the SOC-CMM assessment completed, the next steps are to determine the areas to improve. This requires some analysis of the results. The results should be analysed top-down. First, determine which domains are scoring less than the target maturity level. Then, drill down into those domains using the graphs. If a target maturity level was not used, then the domains should be chosen that underperform in comparison to the other domains. The next step is to determine which aspects of those domains yield the lowest scores.
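As a minimal illustration of this top-down gap analysis (the scores and targets below are hypothetical, not taken from the workbook), the following sketch sorts aspects by the distance to their target maturity level so the largest gaps surface first:

# Minimal sketch: rank aspects by their gap to the target maturity level (illustrative data only)
scores = {
    ("Business", "Business Drivers"): 2.1,
    ("Business", "Governance"): 1.4,
    ("People", "Training and Education"): 1.8,
    ("Process", "Use Case Management"): 2.6,
}
targets = {key: 3.0 for key in scores}  # assume a uniform target of 3.0 for this example

gaps = sorted(((targets[key] - score, key) for key, score in scores.items()), reverse=True)
for gap, (domain, aspect) in gaps:
    if gap > 0:
        print(f"{domain} / {aspect}: score {scores[(domain, aspect)]:.1f}, gap {gap:.1f}")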

When the domains and the respective aspects that require improvement have been identified, detailed information is required to determine the exact improvements that need to be made. The sheets for those domains provide the detailed information that is required for improvement. Use the scoring mechanism as described in the 'Usage' sheet to determine which of the individual elements is negatively contributing to the overall score. Those elements are candidates for improvement. Improvement can be as simple as creating and maintaining the appropriate documentation or as complex as introducing new management elements to the SOC. The SOC-CMM does not provide guidance on how to execute the improvement. This should be determined by internal experts or external consultants.

Capability improvement
Capabilities apply to services and technologies and indicate how capable a service or technology is to reach its goals. To determine which specific capabilities need to be improved, the first question to ask is: which service or technology is negatively impacted the most by lack of capabilities? That service or technology is the first candidate for improvement.

Similar to maturity improvement, the detailed information is provided in the sheets for those domains. The elements that score the lowest are the elements that need to be addressed. It is recommended to search for groups of elements that perhaps have the same underlying reason (root cause) for underscoring. This way, improvement of capabilities can be optimised. A common root cause is lack of documentation and formalisation.

Comparison
When a second assessment is performed, the results should be compared to the previous assessment to determine the growth and evolution of the SOC. This includes both the high-level and the detailed information about the improvement. Use the result tables to determine the differences and then drill down to those specific parts of the assessment to see where actual improvement was made, and if this is in line with the goals set for improvement.
SOC-CMM - Business Domain
B1 - Business Drivers answer
B 1.1 0
1
2
3
4
5
B 1.2 0
1
2
3
4
5
B 1.3 0
1
2
3
4
5
B 1.4 0
1
2
3
4
5
B 1.5 0
1
2
3
4
5
B2 - Customers answer
B 2.1 0
1
2
3
4
5
B 2.3 0
1
2
3
4
5
B 2.4 0
1
2
3
4
5
B 2.5 0
1
2
3
4
5
B 2.6 0
1
2
3
4
5
B 2.7 0
1
2
3
4
5
B3 - SOC Charter answer
B 3.1 0
1
2
3
4
5
B 3.3 0
1
2
3
4
5
B 3.4 0
1
2
3
4
5
B 3.5 0
1
2
3
4
5
B4 - Governance answer
B 4.1 0
1
2
3
4
5
B 4.2 0
1
2
3
4
5
B 4.4 0
1
2
3
4
5
B 4.5 0
1
2
3
4
5
B 4.7 0
1
2
3
4
5
B 4.8 0
1
2
3
4
5
B 4.9 0
1
2
3
4
5
B5 - Privacy answer
B 5.1 0
1
2
3
4
5
B 5.2 0
1
2
3
4
5
B 5.3 0
1
2
3
4
5
B 5.4 0
1
2
3
4
5
B 5.5 0
1
2
3
4
5
B 5.6 0
1
2
3
4
5

SOC-CMM - People Domain


P1 - SOC Employees answer
P 1.3 0
1
2
3
4
5
P 1.4 0
1
2
3
4
5
P 1.5 0
1
2
3
4
5
P 1.6 0
1
2
3
4
5
P 1.7 0
1
2
3
4
5
P 1.8 0
1
2
3
4
5

P2 - SOC Roles and Hierarchy answer


P 2.1 0
1
2
3
4
5
P 2.3 0
1
2
3
4
5
P 2.4 0
1
2
3
4
5
P 2.5 0
1
2
3
4
5
P 2.6 0
1
2
3
4
5
P 2.8 0
1
2
3
4
5
P 2.9 0
1
2
3
4
5
P 2.10 0
1
2
3
4
5

P3 - People Management answer


P 3.1 0
1
2
3
4
5
P 3.2 0
1
2
3
4
5
P 3.3 0
1
2
3
4
5
P 3.4 0
1
2
3
4
5
P 3.5 0
1
2
3
4
5
P 3.6 0
1
2
3
4
5
P 3.7 0
1
2
3
4
5
P 3.8 0
1
2
3
4
5
P 3.9 0
1
2
3
4
5
P 3.10 0
1
2
3
4
5

P4 - Knowledge Management answer


P 4.1 0
1
2
3
4
5
P 4.2.1 0
1
2
3
4
5
P 4.2.2 0
1
2
3
4
5
P 4.2.3 0
1
2
3
4
5
P 4.2.4 0
1
2
3
4
5
P 4.2.5 0
1
2
3
4
5
P 4.2.6 0
1
2
3
4
5
P 4.3.1 0
1
2
3
4
5
P 4.3.2 0
1
2
3
4
5
P 4.3.3 0
1
2
3
4
5
P 4.3.4 0
1
2
3
4
5
P 4.3.5 0
1
2
3
4
5
P 4.4 0
1
2
3
4
5
P 4.5 0
1
2
3
4
5

P5 - Training & Education answer


P 5.1 0
1
2
3
4
5
P 5.3 0
1
2
3
4
5
P 5.5 0
1
2
3
4
5
P 5.6 0
1
2
3
4
5
P 5.7 0
1
2
3
4
5
P 5.8 0
1
2
3
4
5
P 5.9 0
1
2
3
4
5

SOC-CMM - Process Domain


M1 - SOC Management answer
M 1.1 0
1
2
3
4
5
M 1.2 0
1
2
3
4
5
M 1.4 0
1
2
3
4
5
M 1.5 0
1
2
3
4
5

M2 - Security Operations & Facilities answer


M 2.1.1 0
1
2
3
4
5
M 2.1.2 0
1
2
3
4
5
M 2.1.3 0
1
2
3
4
5
M 2.1.4 0
1
2
3
4
5
M 2.1.5 0
1
2
3
4
5
M 2.2.1 0
1
2
3
4
5
M 2.2.2 0
1
2
3
4
5
M 2.2.3 0
1
2
3
4
5
M 2.2.4 0
1
2
3
4
5
M 2.2.5 0
1
2
3
4
5
M 2.3.1 0
1
2
3
4
5
M 2.3.2 0
1
2
3
4
5
M 2.3.3 0
1
2
3
4
5
M 2.3.4 0
1
2
3
4
5
M 2.3.5 0
1
2
3
4
5
M 2.3.6 0
1
2
3
4
5
M 2.4.1 0
1
2
3
4
5
M 2.4.2 0
1
2
3
4
5
M 2.4.3 0
1
2
3
4
5
M 2.4.4 0
1
2
3
4
5
M 2.4.5 0
1
2
3
4
5
M 2.5.1 0
1
2
3
4
5
M 2.5.2 0
1
2
3
4
5
M3 - Reporting answer
M 3.1 0
1
2
3
4
5
M 3.2 0
1
2
3
4
5
M 3.3 0
1
2
3
4
5
M 3.4 0
1
2
3
4
5
M 3.5 0
1
2
3
4
5
M 3.6 0
1
2
3
4
5
M 3.7.1 0
1
2
3
4
5
M 3.7.2 0
1
2
3
4
5
M 3.7.3 0
1
2
3
4
5
M 3.7.4 0
1
2
3
4
5
M 3.7.5 0
1
2
3
4
5
M 3.7.6 0
1
2
3
4
5
M 3.7.7 0
1
2
3
4
5
M 3.7.8 0
1
2
3
4
5
M 3.8.1 0
1
2
3
4
5
M 3.8.2 0
1
2
3
4
5
M 3.8.3 0
1
2
3
4
5
M 3.8.4 0
1
2
3
4
5
M 3.8.5 0
1
2
3
4
5
M 3.9.1 0
1
2
3
4
5
M 3.9.2 0
1
2
3
4
5
M 3.9.3 0
1
2
3
4
5

M4 - Use Case Management answer


M 4.1 0
1
2
3
4
5
M 4.2 0
1
2
3
4
5
M 4.3 0
1
2
3
4
5
M 4.4 0
1
2
3
4
5
M 4.5 0
1
2
3
4
5
M 4.6 0
1
2
3
4
5
M 4.7 0
1
2
3
4
5
M 4.8 0
1
2
3
4
5
M 4.9 0
1
2
3
4
5
M 4.10 0
1
2
3
4
5
M 4.11 0
1
2
3
4
5
M 4.12 0
1
2
3
4
5

SOC-CMM - Technology Domain


T1 - SIEM Technology answer
T 1.1.1 0
1
2
3
4
5
T 1.1.2 0
1
2
3
4
5
T 1.2.1 0
1
2
3
4
5
T 1.2.2 0
1
2
3
4
5
T 1.3.1 0
1
2
3
4
5
T 1.3.2 0
1
2
3
4
5
T 1.3.3 0
1
2
3
4
5
T 1.3.4 0
1
2
3
4
5
T 1.4.1 0
1
2
3
4
5
T 1.4.2 0
1
2
3
4
5
T 1.4.3 0
1
2
3
4
5
T 1.4.4 0
1
2
3
4
5
T 1.4.5 0
1
2
3
4
5
T 1.4.6 0
1
2
3
4
5
T 1.5.1 0
1
2
3
4
5
T 1.5.2 0
1
2
3
4
5
T 1.6.1 0
T 1.6.2 0
T 1.6.3 0
T 1.6.4 0
T 1.6.5 0
T 1.6.6 0
T 1.6.7 0
T 1.6.8 0
T 1.6.9 0
T 1.6.10 0
T 1.6.11 0
T 1.6.12 0
T 1.6.13 0
T 1.6.14 0
T 1.6.15 0
T 1.6.16 0
T 1.6.17 0
T 1.6.18 0
T 1.6.19 0
T 1.6.20 0
T 1.6.21 0
T 1.6.22 0
T 1.6.23 0
T 1.6.24 0
T 1.6.25 0

T2 - IDPS Tooling answer


T 2.1.1 0
1
2
3
4
5
T 2.1.2 0
1
2
3
4
5
T 2.2.1 0
1
2
3
4
5
T 2.2.2 0
1
2
3
4
5
T 2.3.1 0
1
2
3
4
5
T 2.3.2 0
1
2
3
4
5
T 2.3.3 0
1
2
3
4
5
T 2.3.4 0
1
2
3
4
5
T 2.4.1 0
1
2
3
4
5
T 2.4.2 0
1
2
3
4
5
T 2.4.3 0
1
2
3
4
5
T 2.4.4 0
1
2
3
4
5
T 2.4.5 0
1
2
3
4
5
T 2.4.6 0
1
2
3
4
5
T 2.5.1 0
1
2
3
4
5
T 2.5.2 0
1
2
3
4
5
T 2.6.1 0
T 2.6.2 0
T 2.6.3 0
T 2.6.4 0
T 2.6.5 0
T 2.6.6 0
T 2.6.7 0
T 2.6.8 0
T 2.6.9 0
T 2.6.10 0
T 2.6.11 0
T 2.6.12 0
T 2.6.13 0

T3 - Security Analytics answer


T 3.1.1 0
1
2
3
4
5
T 3.1.2 0
1
2
3
4
5
T 3.2.1 0
1
2
3
4
5
T 3.2.2 0
1
2
3
4
5
T 3.3.1 0
1
2
3
4
5
T 3.3.2 0
1
2
3
4
5
T 3.3.3 0
1
2
3
4
5
T 3.3.4 0
1
2
3
4
5
T 3.4.1 0
1
2
3
4
5
T 3.4.2 0
1
2
3
4
5
T 3.4.3 0
1
2
3
4
5
T 3.4.4 0
1
2
3
4
5
T 3.4.5 0
1
2
3
4
5
T 3.4.6 0
1
2
3
4
5
T 3.5.1 0
1
2
3
4
5
T 3.5.2 0
1
2
3
4
5
T 3.6.1 0
T 3.6.2 0
T 3.6.3 0
T 3.6.4 0
T 3.6.5 0
T 3.6.6 0
T 3.6.7 0
T 3.6.8 0
T 3.6.9 0
T 3.6.10 0
T 3.6.11 0
T 3.6.12 0
T 3.6.13 0
T 3.6.14 0
T 3.6.15 0
T 3.6.16 0
T 3.6.17 0
T 3.6.18 0
T 3.6.19 0
T 3.6.20 0
T 3.6.21 0
T 3.6.22 0

T4 - Security Automation & Orchestration answer


T 4.1.1 0
1
2
3
4
5
T 4.1.2 0
1
2
3
4
5
T 4.2.1 0
1
2
3
4
5
T 4.2.2 0
1
2
3
4
5
T 4.3.1 0
1
2
3
4
5
T 4.3.2 0
1
2
3
4
5
T 4.3.3 0
1
2
3
4
5
T 4.3.4 0
1
2
3
4
5
T 4.4.1 0
1
2
3
4
5
T 4.4.2 0
1
2
3
4
5
T 4.4.3 0
1
2
3
4
5
T 4.4.4 0
1
2
3
4
5
T 4.4.5 0
1
2
3
4
5
T 4.4.6 0
1
2
3
4
5
T 4.5.1 0
1
2
3
4
5
T 4.5.2 0
1
2
3
4
5
T 4.6.1
T 4.6.2
T 4.6.3
T 4.6.4
T 4.6.5
T 4.6.6
T 4.6.7
T 4.6.8
T 4.6.9
T 4.6.10
T 4.6.11
T 4.6.12
T 4.6.13
T 4.6.14
T 4.6.15
T 4.6.16
T 4.6.17

SOC-CMM - Services Domain


S1 - Security Monitoring answer
S 1.1 0
1
2
3
4
5
S 1.3 0
1
2
3
4
5
S 1.4 0
1
2
3
4
5
S 1.5 0
1
2
3
4
5
S 1.6 0
1
2
3
4
5
S 1.7 0
1
2
3
4
5
S 1.8 0
1
2
3
4
5
S 1.9 0
1
2
3
4
5
S 1.10 0
1
2
3
4
5
S 1.11 0
1
2
3
4
5
S 1.12 0
1
2
3
4
5
S 1.13 0
1
2
3
4
5
S 1.14 0
1
2
3
4
5
S 1.15
S 1.15.1
S 1.15.2
S 1.15.3
S 1.15.4
S 1.15.5
S 1.15.6
S 1.15.7
S 1.15.8
S 1.15.9
S 1.15.10
S 1.15.11
S 1.15.12
S 1.15.13
S 1.15.14
S 1.15.15
S 1.15.16
S 1.15.17
S 1.15.18
S 1.15.19
S 1.15.20
S 1.16

S 2 - Security Incident Management answer


S 2.2 0
1
2
3
4
5
S 2.3 0
1
2
3
4
5
S 2.5 0
1
2
3
4
5
S 2.6 0
1
2
3
4
5
S 2.7 0
1
2
3
4
5
S 2.8 0
1
2
3
4
5
S 2.9 0
1
2
3
4
5
S 2.10 0
1
2
3
4
5
S 2.11 0
1
2
3
4
5
S 2.12 0
1
2
3
4
5
S 2.13 0
1
2
3
4
5
S 2.14 0
1
2
3
4
5
S 2.15 0
1
2
3
4
5
S 2.16
S 2.16.1 0
S 2.16.2 0
S 2.16.3 0
S 2.16.4 0
S 2.16.5 0
S 2.16.6 0
S 2.16.7 0
S 2.16.8 0
S 2.16.9 0
S 2.16.10 0
S 2.16.11 0
S 2.16.12 0
S 2.16.13 0
S 2.16.14 0
S 2.16.15 0
S 2.16.16 0
S 2.16.17 0
S 2.16.18 0
S 2.16.19 0
S 2.16.20 0
S 2.16.21 0
S 2.16.22 0
S 2.16.23 0
S 2.16.24 0
S 2.16.25 0
S 2.16.26 0
S 2.16.27 0
S 2.16.28 0
S 2.16.29 0
S 2.16.30 0
S 2.17

S 3 - Security Analysis answer


S 3.1 0
1
2
3
4
5
S 3.3 0
1
2
3
4
5
S 3.4 0
1
2
3
4
5
S 3.5 0
1
2
3
4
5
S 3.6 0
1
2
3
4
5
S 3.7 0
1
2
3
4
5
S 3.8 0
1
2
3
4
5
S 3.9 0
1
2
3
4
5
S 3.10 0
1
2
3
4
5
S 3.11 0
1
2
3
4
5
S 3.12 0
1
2
3
4
5
S 3.13 0
1
2
3
4
5
S 3.14 0
1
2
3
4
5
S 3.15
S 3.15.1
S 3.15.2
S 3.15.3
S 3.15.4
S 3.15.5
S 3.15.6
S 3.15.7
S 3.15.8
S 3.15.9
S 3.15.10
S 3.15.11
S 3.15.12
S 3.15.13
S 3.15.14
S 3.15.15
S 3.15.16
S 3.15.17
S 3.16

S4 - Threat Intelligence answer


S 4.1 0
1
2
3
4
5
S 4.3 0
1
2
3
4
5
S 4.4 0
1
2
3
4
5
S 4.5 0
1
2
3
4
5
S 4.6 0
1
2
3
4
5
S 4.7 0
1
2
3
4
5
S 4.8 0
1
2
3
4
5
S 4.9 0
1
2
3
4
5
S 4.10 0
1
2
3
4
5
S 4.11 0
1
2
3
4
5
S 4.12 0
1
2
3
4
5
S 4.13 0
1
2
3
4
5
S 4.14
S 4.14.1 0
S 4.14.2 0
S 4.14.3 0
S 4.14.4 0
S 4.14.5 0
S 4.14.6 0
S 4.14.7 0
S 4.14.8 0
S 4.14.9 0
S 4.14.10 0
S 4.14.11 0
S 4.14.12 0
S 4.14.13 0
S 4.14.14 0
S 4.14.15 0
S 4.14.16 0
S 4.14.17 0
S 4.14.18 0
S 4.14.19 0
S 4.14.20 0
S 4.14.21 0
S 4.14.22 0
S 4.14.23 0
S 4.14.24 0
S 4.14.25 0
S 4.14.26 0
S 4.15

S5 - Hunting answer
S 5.1 0
1
2
3
4
5
S 5.2 0
1
2
3
4
5
S 5.4 0
1
2
3
4
5
S 5.5 0
1
2
3
4
5
S 5.6 0
1
2
3
4
5
S 5.7 0
1
2
3
4
5
S 5.8 0
1
2
3
4
5
S 5.9 0
1
2
3
4
5
S 5.10 0
1
2
3
4
5
S 5.11 0
1
2
3
4
5
S 5.12 0
1
2
3
4
5
S 5.13 0
1
2
3
4
5
S 5.14 0
1
2
3
4
5

S 5.15
S 5.15.1
S 5.15.2
S 5.15.3
S 5.15.4
S 5.15.5
S 5.15.6
S 5.15.7
S 5.15.8
S 5.15.9
S 5.15.10
S 5.15.11
S 5.15.12
S 5.15.13
S 5.15.14
S 5.15.15
S 5.15.16
S 5.15.17
S 5.15.18
S 5.15.19
S 5.15.20

S6 - Vulnerability Management answer


S 6.1 0
1
2
3
4
5
S 6.3 0
1
2
3
4
5
S 6.4 0
1
2
3
4
5
S 6.5 0
1
2
3
4
5
S 6.6 0
1
2
3
4
5
S 6.7 0
1
2
3
4
5
S 6.8 0
1
2
3
4
5
S 6.9 0
1
2
3
4
5
S 6.10 0
1
2
3
4
5
S 6.11 0
1
2
3
4
5
S 6.12 0
1
2
3
4
5
S 6.13 0
1
2
3
4
5
S 6.14
S 6.14.1 0
S 6.14.2 0
S 6.14.3 0
S 6.14.4 0
S 6.14.5 0
S 6.14.6 0
S 6.14.7 0
S 6.14.8 0
S 6.14.9 0
S 6.14.10 0
S 6.14.11 0
S 6.14.12 0
S 6.14.13 0
S 6.14.14 0
S 6.14.15 0
S 6.14.16 0
S 6.14.17 0
S 6.14.18 0

S7 - Log Management answer


S 7.1 0
1
2
3
4
5
S 7.3 0
1
2
3
4
5
S 7.4 0
1
2
3
4
5
S 7.5 0
1
2
3
4
5
S 7.6 0
1
2
3
4
5
S 7.7 0
1
2
3
4
5
S 7.8 0
1
2
3
4
5
S 7.9 0
1
2
3
4
5
S 7.10 0
1
2
3
4
5
S 7.11 0
1
2
3
4
5
S 7.12 0
1
2
3
4
5
S 7.13 0
1
2
3
4
5

S 7.14 0
S 7.14.1 0
S 7.14.2 0
S 7.14.3 0
S 7.14.4 0
S 7.14.5 0
S 7.14.6 0
S 7.14.7 0
S 7.14.8 0
S 7.14.9 0
S 7.14.10 0
S 7.14.11 0
S 7.14.12 0
S 7.14.13 0
S 7.14.14 0
S 7.14.15 0
S 7.14.16 0
S 7.14.17 0
S 7.14.18 0
S 7.14.19 0
S 7.14.20 0

For backwards compatibility, add any guidance implemented after version 1.0 hereafter
SOC-CMM - Business Domain
guidance

Business drivers are unknown


Basic awareness of business drivers
Some business drivers have been identified
Most business drivers have been identified
All business drivers are well known within the SOC

No documentation in place
Some ad-hoc information across documents
Basic documentation of business drivers
Single document, full description of business drivers
Document completed, approved and formally published

Business drivers are not part of decision making


Business drivers are referred to on an ad-hoc basis
Business drivers are occasionally used in decisions
Business drivers are used in most decisions
Business drivers are used in all relevant decisions

Service catalogue has not been checked for alignment


Alignment is performed on an ad-hoc basis
Alignment was performed but not maintained
Alignment is performed and maintained regularly
Every change in the catalogue is checked against drivers

Business drivers have not been validated


Basic awareness of SOC drivers exists among stakeholders
Stakeholders informally informed of business drivers
Alignment of SOC drivers with stakeholders is performed
Business drivers are formally validated by stakeholders
guidance

SOC customers are not known


Basic awareness of SOC customers
Some customers have been identified
Customers have mostly been identified
All customers are identified, including relevance and context

No documentation in place
Some ad-hoc information across documents
Basic documentation of SOC customers
Single document, full description of SOC customers
Document completed, approved and formally published

Output is the same for all customers


Output is somewhat contextualized
Some customers receive differentiated output
All important customers receive differentiated output
All customers receive specific output based on context and type

Contractual agreements not in place


No contract in place, ad-hoc agreements made
Basic contract in place, not formally signed off
Contract signed, but not regularly reviewed
Contract signed, approved by and regularly reviewed with customers

No updates sent to customers


Ad-hoc updates sent to some customers
Frequent updates sent to most customers
Periodical updates sent to all customers
Periodical updates sent and discussed with all customers

Customer satisfaction not measured or managed


Customer satisfaction managed in ad-hoc fashion
Customer satisfaction metrics defined, not applied structurally
Customer satisfaction measured structurally, not actively managed
Customer satisfaction fully managed and improved over time
guidance

No charter document in place


Some ad-hoc information across documents
Basic charter document created
Single charter, full description of SOC strategic elements
Charter completed, approved and formally published

Charter is never updated


Charter is updated on ad-hoc basis
Charter is updated on major changes in business strategy
Charter is regularly updated
Charter periodically updated and realigned with business strategy

Charter is not approved


Business / CISO has basic awareness of the charter
Business / CISO has full awareness of the charter
Business / CISO approves of the content, but not formally
Charter is formally approved by the business / CISO

Stakeholders are unfamiliar with the charter


Some stakeholders are aware of the charter, but not its contents
Some stakeholders are aware of the charter and its contents
All stakeholders are aware, not all stakeholders know its contents
All Stakeholders are aware of the charter and its contents
guidance
SOC governance process is not in place
SOC governance is done in an ad-hoc fashion
Several governance elements are in place, but not structurally
Formal governance process is in place that covers most SOC aspects
Formal governance process is in place and covers all SOC aspects

No governance elements have been identified


Some governance elements are identified and governed ad-hoc
Some governance elements are identified and governed actively
Most governance elements are identified and actively governed
All elements are identified and actively governed

No governance document in place


Some ad-hoc information across documents
Basic governance document created
Single document, full description of governance elements
Governance document completed, approved and formally published

Cost management not in place


Cost visible, basic budget allocation in place
Costs fully visible and mostly managed, forecasting in place
Costs fully managed, not formally aligned with business stakeholders
Costs fully managed and formally aligned with business stakeholders

Governance process is not reviewed


Governance process is reviewed in an ad-hoc fashion
Process is reviewed using a structured approach in an ad-hoc fashion
Process is regularly and informally reviewed and updated
Process is regularly and formally reviewed and updated with findings

Stakeholders are unfamiliar with the process


Some stakeholders are aware of the process, but not its details
Some stakeholders are aware of the process and its details
All stakeholders are aware, not all stakeholders know its details
All stakeholders are aware of the process and its details

No assessments are performed


The SOC is assessed in an ad-hoc fashion
The SOC is assessed using a structured approach in an ad-hoc fashion
The SOC is regularly and informally assessed
The SOC is regularly and formally assessed by a third party
guidance

No policy is in place
Information regarding privacy is scattered across documents
A policy exists, but has not been accepted formally
A formal policy exists, its contents are known to all employees
A formal policy exists, its contents are accepted by all employees

Regulations are not known and the SOC is non-compliant


Some regulations are known and the SOC is non-compliant
Most regulations are known and the SOC is partially compliant
Regulations are fully known and the SOC is mostly compliant
Regulations are fully known and the SOC is fully compliant

There is no cooperation between the SOC and legal


There is some ad-hoc cooperation between SOC and legal
There is structural cooperation between SOC and legal
Alignment exists between SOC and legal
Full and regular alignment exists between SOC and legal

No privacy procedures in place


Some ad-hoc information across documents
Basic privacy procedure created
Single document, full description of privacy investigations
Procedure completed, approved and formally published

The SOC is unaware of any information


The SOC is aware of such information, no formal identification
The SOC is fully aware, some information is formally identified
Most privacy related information is identified and documented
All privacy related information is identified and documented

PIAs are not conducted


PIAs are conducted in an ad-hoc fashion
PIAs are conducted using a structured approach in an ad-hoc fashion
PIAs are conducted informally and regularly
PIAs are conducted formally and regularly

SOC-CMM - People Domain


guidance

The SOC is either heavily overstaffed or understaffed


The SOC is overstaffed or understaffed
The SOC is somewhat overstaffed or understaffed
The SOC mostly meets FTE requirements
The SOC is staffed to full satisfaction in terms of FTE requirements

There are either way too few or too many external employees
There are too few or too many external employees
The SOC has somewhat too many or too few external employees
The SOC mostly meets requirements for external employee FTE count
The external employee ratio meets all requirements

Too many required skills are present only among external employees
Some required skills are not present internally, and not transferred
Some required skills are not present internally, but being transferred
Most skills are covered with internal employees
All required skills are covered with internal employees as well

Not all positions filled, service delivery cannot be assured


Sufficient positions filled to ensure service delivery
All key positions filled
All positions currently filled, not meeting external ratio requirements
All positions currently filled, meeting external ratio requirements

There is no recruitment process in place


Recruitment is performed at an ad-hoc basis
A basic recruitment process is in place
A full recruitment process is in place, but not performing effectively
A full recruitment process is in place and performing effectively

No talent acquisition process in place


Talent acquisition is performed at an ad-hoc basis
A basic talent acquisition process is in place
A full acquisition process is in place, but not performing effectively
A full acquisition process is in place and performing effectively

guidance

No roles are used in the SOC


Some roles exist, but are not actively being used
Some roles exist, and are actively being used
All roles are fully in use, but not formalized
All roles are fully in use and formalized

No tiers exist within these roles


Some tiers exist, but are not actively being used
Some tiers exist, and are actively being used
All tiers are fully in use, but not formalized
All relevant roles are tiered and formalized

None of the roles meets FTE requirements


Some roles meet FTE requirements
Vital roles meet FTE requirements
All vital roles and most other roles meet FTE requirements
All roles fully meet FTE requirements

No hierarchy exists
A basic hierarchy exists, but is not fully operational
A basic hierarchy is in place and fully operational
A full hierarchy is in place, but not formalized
A full hierarchy is in place and formalized
No documentation in place
Some ad-hoc information across documents
Basic documentation of SOC roles
Single document, full description of SOC roles
Document completed, approved and formally published

Responsibilities not understood


Basic awareness of responsibilities
Responsibilities for some roles understood and adhered to
Responsibilities for all roles mostly understood and adhered to
Full understanding of responsibilities formalized in training sessions

No documentation in place
Some ad-hoc information across documents
Basic documentation of career progression for roles
Single document, full description of career progression for roles
Document completed, approved and formally published

Documentation is not reviewed


Documentation is reviewed ad-hoc, not using a structured approach
Documentation is reviewed ad-hoc, using a structured approach
Documentation is regularly and informally reviewed and updated
Documentation is regularly and formally reviewed and updated

guidance

No such plan is in place
A plan covering some roles is in place, but not operational
A plan covering some roles is in place and operational
A plan covering all roles is in place, but not formalized
A plan covering all roles is in place and formalized

No career progression process is in place


A process covering some roles is in place, but not operational
A process covering some roles is in place and operational
A process covering all roles is in place, but not formalized
A process covering all roles is in place and formalized

No talent management process in place


Talent management is performed on an ad-hoc basis
A basic talent management process is in place
A full process is in place, but not performing effectively
A full talent management process is in place and performing effectively

No diversity goals exist


Diversity goals are recognized but not defined
Diversity goals are defined but not formalized
Diversity goals have been formally defined, but are not yet met
Diversity goals have been formally defined and are met

No periodic evaluation is performed


Periodic evaluation is performed in an ad-hoc fashion
Periodic evaluation is performed in a structured fashion
Periodic evaluation is performed, but results are not used structurally
Periodic evaluation is performed, results are used for personal growth

No new hire process in place


New hire training is done in an ad-hoc fashion
A process is in place, but does not cover all aspects
An informal process covering people, process and technology is in place
A formal process covering people, process and technology is in place

Screening not performed


Basic screening performed in ad-hoc fashion
Basic screening procedure in place, applied structurally
Full screening procedure in place, applied structurally, not formalized
Formal screening procedure and background checks applied structurally

Employee satisfaction is not measured


Employee satisfaction is measured in an ad-hoc fashion
Satisfaction is usually measured, but not embedded in processes
Employee satisfaction is measured, not used for improvement
Employee satisfaction measured periodically and used for improvement

1-on-1 meetings are not held within the SOC


1-on-1 meetings are held on ad-hoc basis
Informal 1-on-1 meetings are held periodically
Formal 1-on-1 meetings are regularly held, results are not structured
1-on-1 meetings are regularly held and used for coaching and growth

No team building exercises are performed


Exercises are performed in an ad-hoc fashion
Exercises are usually performed, but not embedded in processes
Exercises are regularly done, but not focused on improvement
Exercises are regularly done and focused on improving team dynamics

guidance

A knowledge management process is not in place


Knowledge management is done in an ad-hoc fashion
A basic process is in place that covers some knowledge aspects
An informal process is in place that covers most knowledge aspects
A formal process is in place, covering all knowledge aspects

Hard skills are not being evaluated


Only basic hard skills are being evaluated
All vital hard skills are being evaluated
Most hard skills are formally evaluated
All hard skills are formally evaluated

Soft skills are not being evaluated


Only basic soft skills are being evaluated
All vital soft skills are being evaluated
Most soft skills are formally evaluated
All soft skills are formally evaluated

Significant gaps or shortages exist in the skill matrix


No gaps exist, but shortages are present
No gaps exist, shortages only exist in non-vital skillsets
No current gaps or shortages, but coverage is minimal for some skills
No gaps or shortages in the skill matrix

Skill assessment is not conducted


Skill assessment is conducted in an ad-hoc fashion
Skills are usually assessed, but not embedded in processes
Skill assessment is regularly and informally conducted
Skill assessment is regularly and formally conducted

Results are not used for improvement


Results are used for improvement in an ad-hoc fashion
Results are used to improve personal results
Results are used to improve team results
Results are used to improve personal and team results

The assessment process is never updated


The skill assessment process is updated in an ad-hoc fashion
The assessment process is updated, but not embedded in processes
The process is informally and regularly reviewed and updated
The process is formally and regularly reviewed and updated

A knowledge matrix is not in place


The matrix only covers some employees
The matrix only covers all vital employees
The matrix covers all employees, but is not regularly updated
The matrix covers all employees and is regularly updated

A knowledge matrix is incomplete


A knowledge matrix covering only basic skills is in place
A knowledge matrix covering only vital skills is in place
A full knowledge matrix is in place, but not regularly updated
A full knowledge matrix is in place and regularly updated

Significant gaps or shortages exist in the knowledge matrix


No gaps exist, but shortages are present
No gaps exist, shortages only exist in non-vital knowledge areas
No current gaps or shortages, but coverage is minimal for some topics
No gaps or shortages in the knowledge matrix
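
The skill and knowledge matrix answer options above distinguish gaps (an area nobody in the SOC covers) from shortages (an area covered by fewer people than needed). A minimal Python sketch of how that distinction could be derived from such a matrix; the matrix layout, proficiency scale and coverage threshold are illustrative assumptions, not part of SOC-CMM:

from typing import Dict, List

# Rows are employees, columns are skills or knowledge areas;
# the value is an assumed proficiency score (0 = no coverage).
Matrix = Dict[str, Dict[str, int]]

def gaps_and_shortages(matrix: Matrix, required_coverage: int = 2) -> Dict[str, List[str]]:
    """Return areas nobody covers (gaps) and areas covered by too few people (shortages)."""
    areas = {area for skills in matrix.values() for area in skills}
    gaps, shortages = [], []
    for area in sorted(areas):
        holders = sum(1 for skills in matrix.values() if skills.get(area, 0) > 0)
        if holders == 0:
            gaps.append(area)
        elif holders < required_coverage:
            shortages.append(area)
    return {"gaps": gaps, "shortages": shortages}

if __name__ == "__main__":
    demo: Matrix = {
        "analyst_a": {"malware analysis": 3, "scripting": 2},
        "analyst_b": {"scripting": 1, "forensics": 0},
    }
    print(gaps_and_shortages(demo))
    # {'gaps': ['forensics'], 'shortages': ['malware analysis']}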

The knowledge matrix is not used to determine training needs


The matrix is used in an ad-hoc fashion
Training is based on the knowledge matrix but not regularly updated
Training is based on the knowledge matrix and regularly updated
The matrix is fully aligned with the training & certification programs

The knowledge matrix is never updated


The knowledge matrix is updated in an ad-hoc fashion
The matrix is updated, but not embedded in processes
The matrix is informally and regularly reviewed and updated
The matrix is formally and regularly reviewed and updated

Documentation is not reviewed


Documentation is reviewed ad-hoc, not using a structured approach
Documentation is reviewed ad-hoc, using a structured approach
Documentation is regularly and informally reviewed and updated
Documentation is regularly and formally reviewed and updated

Tooling is not in place


Tooling is in place, but used in an ad-hoc fashion
Tooling is in place, and used regularly
Tooling is in place and use of the tool is embedded in processes
Tooling is in place and optimized for knowledge management purposes

guidance

A training program is not in place


A training program covering some roles is in place, but not operational
A training program covering some roles is in place and operational
A training program covering all roles is in place, but not formalized
A training program covering all roles is in place and formalized

A certification program is not in place


A certification program covering some roles is in place, but not operational
A certification program covering some roles is in place and operational
A certification program covering all roles is in place, but not formalized
A certification program covering all roles is in place and formalized

The programs are not connected


The programs are connected in an ad-hoc fashion
The programs are regularly used, but not embedded in processes
The programs are mostly aligned, but not formally
The programs are formally embedded in evaluation and progression
No budget is allocated
Insufficient budget is allocated for the team as a whole
Sufficient budget is allocated for the team as a whole
Employees have sufficient budget, not encouraged to attend training
Employees have sufficient budget, encouraged to attend training

No time is allocated
Insufficient time is allocated for the team as a whole
Sufficient time is allocated for the team as a whole
Employees have sufficient time, but not encouraged to attend training
Employees have sufficient time, and encouraged to attend training

Workshops are not held


Workshops are held in an ad-hoc fashion
Workshops are held periodically
Workshops are held regularly, not aligned with knowledge & training
Workshops are held regularly and aligned with knowledge & training

Programs are not reviewed


Programs are reviewed ad-hoc, not using a structured approach
Programs are reviewed ad-hoc, using a structured approach
Programs are regularly and informally reviewed and updated
Programs are regularly and formally reviewed and updated

SOC-CMM - Process Domain


guidance

A SOC management process is not in place


SOC management is done in an ad-hoc fashion
A basic process is in place that covers some aspects
An informal process is in place that covers most aspects
A formal process is in place, covering all aspects

No documentation in place
Some ad-hoc information across documents
Basic documentation of business drivers
Single document, full description of business drivers
Document completed, approved and formally published

Governance process is not reviewed


Governance process is reviewed in an ad-hoc fashion
Process is reviewed using a structured approach in an ad-hoc fashion
Process is regularly and informally reviewed and updated
Process is regularly and formally reviewed and updated with findings

Stakeholders are unfamiliar with the process


Some stakeholders are aware of the process, but not its details
Some stakeholders are aware of the process and its details
All stakeholders are aware, not all stakeholders know its details
All stakeholders are aware of the process and its details

guidance

No security operations exercises are performed


Exercises are performed on ad-hoc basis
Exercises are sometimes performed in a structured manner
Informal structured exercises are performed regularly
Formal exercises are performed regularly, reported and improved on

No standard operating procedures are in place


All vital procedures are in place
Most procedures are in place
All procedures are in place, not optimized through feedback
All procedures are in place, up to date and optimized for performance

No checklists are in place


A basic checklist is used in an ad-hoc fashion
Checklists are in place, but not used consistently
Checklists are used consistently, but not formally signed off
Checklists are used consistently and formally signed off

Workflows are not in place


Some ad-hoc information across documents
Basic documentation of workflows
Single document, full description of workflows
Workflows are completed, approved and formally published

An operational handbook is not in place


Some ad-hoc information across documents
Basic documentation of SOC tasks & rules
Single document, full description of SOC tasks & rules
Handbook is completed, approved and formally published

Process not integrated


Configuration management is executed in an ad-hoc fashion
Baselines established and documented
Configuration management is mostly automated
All configuration updates reflected in CMDB and security tooling

Process not integrated


Change management is executed in an ad-hoc fashion
Change management process in place, not structurally executed
Change management process in place, structurally executed
SOC follows change management, all changes embedded in monitoring
Process not integrated
Problem management is executed in an ad-hoc fashion
Problem management process in place, not structurally executed
Problem management process in place, structurally executed
Problem management is executed and reviewed for all problems

Process not integrated


Incident management is executed in an ad-hoc fashion
Incident management process in place, not structurally executed
Incident management process in place, structurally executed
Incident management is executed and reviewed for all incidents

Process not integrated


Asset management is executed in an ad-hoc fashion
Asset management is executed structurally, but not automated
Asset management is mostly automated
All asset management updates reflected in CMDB and security tooling

No dedicated physical location


Floorplan in place, not operationalized
SOC established on single floor
Dedicated location established, but not secured
Dedicated secure location established, fully optimized for sec ops

No dedicated network
Critical SOC components placed in separate network
Most SOC equipment in separate network, basic access controls in place
All SOC equipment in separate network, full access control in place
Dedicated SOC network in place, fully protected and monitored

Physical access controls not in place


Physical access controls in place, not dedicated for SOC
Dedicated access control in place using badges, access restricted
Dedicated access control in place using badges, access not reviewed
Access secured through badges, authorizations restricted and monitored

No video wall in place


Single screen in place showing basic security information
Multiple screens in place showing basic static security information
Multiple screens in place, showing prioritized events and alerts
Video wall in place, fully optimized for real-time monitoring

Call-center capability not in place


Some basic communication means in place
Communication means in place, not separate from regular comms
Dedicated communication in place, separate from regular comms
Call-center in place, fully optimized for coordination & communication
No dedicated workstations in place
Workstations customized by individual analysts
Dedicated analyst workstations, toolset not standardized
Dedicated analyst workstations, toolset standardized but incomplete
Dedicated analyst workstations, optimized for monitoring & analysis

Shift schedules not in place


Basic schedule in place, not applied structurally
Basic schedule in place, applied structurally
Shift schedules in place, coverage mostly guaranteed for SOC roles
Shift schedules in place, guaranteeing full shift coverage for all roles

No shift log in place


No central shift log in place, notes are kept, not always disseminated
Central shift log in place, but not used structurally
Shift log in place, used structurally but not checked for accuracy
Shift log in place, fully accurate and up to date

No shift turnover procedures in place


Some ad-hoc information across documents
Basic shift turnover procedure created
Single document, full description of shift turnover handling
Procedure completed, approved and formally published

No daily stand-up procedure in place


Stand-up carried out in an ad-hoc fashion and not regularly
Stand-up carried out regularly, but not in structured fashion
Structured stand-up procedure in place, not optimized
Stand-up procedure in place, executed daily, optimized for efficiency

No stand-by arrangements exist


Best-effort stand-by arrangement in place
Stand-by arrangement in place, not supported by tooling and not tested
Stand-by arrangements in place, supported by tooling, but not tested
Stand-by arrangements in place, supported by tooling and tested

No DMS in place
Documentation centralized on file shares
DMS in place, documentation updates not enforced
DMS in place, documentation updates and versions enforced
DMS in place, fully supporting SOC documentation requirements

No knowledge & collaboration platform in place


Knowledge & collaboration performed in an ad-hoc fashion
Platform in place, not dedicated, not restricted to SOC
Platform in place, not dedicated, restricted to SOC
Dedicated platform, fully supporting sec ops, integrated in ITSM process
importance

No reports are provided


Reports are provided in an ad-hoc fashion
Reports are provided regularly, not standardized
Reports are provided regularly and standardized using quality criteria
Reports are provided regularly, standardized and regularly optimized

Reports not tailored


Only basic customizations for customers applied
Customizations applied structurally to customer reports
Reports fully tailored to recipients, manual customization required
Reports fully tailored to recipients using automated templates

Reports not approved or reviewed


Informal report review conducted
Structural report review conducted
Reports regularly reviewed, not formally signed off by recipients
Reports regularly reviewed and formally signed off by recipients

No established reporting lines


Reports have limited dissemination
Reports have a standard distribution list
Report dissemination through standard reporting lines, not approved
Report dissemination through standard and approved reporting lines

Report templates not updated


Report templates updated in an ad-hoc fashion
Report templates regularly revised and updated
Report templates revised and updated using customer feedback
Report templates regularly updated and formally approved

No agreements exist
Informal agreements made, not applied structurally
Informal agreements made, applied structurally
Formal agreements exist, not measured
Formal agreements exist, metrics applied to reporting

Report type not provided


Report type provided in an ad-hoc fashion
Report type provided regularly
Report type provided regularly, contents discussed but not approved
Report type provided regularly and contents formally approved

Report type not provided


Report type provided in an ad-hoc fashion
Report type provided regularly
Report type provided regularly, contents discussed but not approved
Report type provided regularly and contents formally approved

Report type not provided


Report type provided in an ad-hoc fashion
Report type provided regularly
Report type provided regularly, contents discussed but not approved
Report type provided regularly and contents formally approved

Report type not provided


Report type provided in an ad-hoc fashion
Report type provided regularly
Report type provided regularly, contents discussed but not approved
Report type provided regularly and contents formally approved

Report type not provided


Report type provided in an ad-hoc fashion
Report type provided regularly
Report type provided regularly, contents discussed but not approved
Report type provided regularly and contents formally approved

Report type not provided


Report type provided in an ad-hoc fashion
Report type provided regularly
Report type provided regularly, contents discussed but not approved
Report type provided regularly and contents formally approved

Report type not provided


Report type provided in an ad-hoc fashion
Report type provided regularly
Report type provided regularly, contents discussed but not approved
Report type provided regularly and contents formally approved

Dashboards not provided


Dashboards provided, updated ad-hoc
Dashboards provided, updated periodically
Real-time dashboard provided, contents discussed but not approved
Real-time dashboards provided and contents formally approved

Metric type not used


Metric type used in an ad-hoc fashion
Metric type applied to some SOC services
Metric type consistently applied to most SOC services
Metric type fully and consistently applied to all SOC services

Metric type not used


Metric type used in an ad-hoc fashion
Metric type applied to some SOC services
Metric type consistently applied to most SOC services
Metric type fully and consistently applied to all SOC services

Metric type not used


Metric type used in an ad-hoc fashion
Metric type applied to some SOC services
Metric type consistently applied to most SOC services
Metric type fully and consistently applied to all SOC services

Metric type not used


Metric type used in an ad-hoc fashion
Metric type applied to some SOC services
Metric type consistently applied to most SOC services
Metric type fully and consistently applied to all SOC services

Metric type not used


Metric type used in an ad-hoc fashion
Metric type applied to some SOC services
Metric type consistently applied to most SOC services
Metric type fully and consistently applied to all SOC services

Advisories not provided


Advisories provided in an ad-hoc fashion
Advisories provided regularly
Advisories provided regularly, format discussed but not approved
Advisories provided regularly and format formally approved

Risk & impact assessment not performed


Risk & impact assessments performed in an ad-hoc fashion
Unstructured risk & impact assessments performed
Informal structured risk & impact assessment performed for advisories
Formal risk & impact assessment performed for all advisories

Follow-up of advisories not performed


Follow-up of advisories performed in an ad-hoc fashion
Follow-up performed for critical advisories
Follow-up performed for most advisories, aligned with ITSM processes
Follow-up performed for all advisories, aligned with ITSM processes

importance

A use case management process is not in place


Use case management is done in an ad-hoc fashion
Basic process in place, not applied to all phases of the use case lifecycle
Informal process in place covering all aspects of the use case lifecycle
Formal process in place, covering all aspects of the use case lifecycle

No documentation in place
Some ad-hoc information across documents
Basic documentation of business drivers
Single document, full description of business drivers
Document completed, approved and formally published

Use cases not approved


Use cases not approved, but some are known to stakeholders
Use cases not approved, all critical use cases known to stakeholders
All important use cases approved by relevant stakeholders
All use cases formally approved by relevant stakeholders

Use case management process not aligned


Alignment done in an ad-hoc fashion
Alignment done regularly, but not in a structured fashion
Alignment done structurally and regularly with relevant processes
Use case management process fully aligned with relevant processes

Use cases not created using a standardized approach


Use cases created in a structured but undocumented fashion
Use cases mostly created in a structured and documented fashion
All use cases created using a standardized but unapproved approach
All use cases created using a standardized and approved approach

Use cases not created using a top-down approach


Use case creation performed in an ad-hoc fashion
Use cases created in a structured top-down way, SOC context only
Use cases created top-down, based on risk and business context
All use cases created top-down, full risk and business alignment

No traceability exists
Traceability is possible for some use cases, but requires manual effort
Traceability is possible for all use cases, but requires manual effort
Full traceability exists in documentation, not validated by stakeholders
Full traceability exists in documentation, validated by stakeholders

No traceability exists
Traceability is possible for some use cases, but requires manual effort
Traceability is possible for all use cases, but requires manual effort
Full traceability exists in documentation, not validated by stakeholders
Full traceability exists in documentation, validated by stakeholders

No tests are performed


Testing is performed in an ad-hoc fashion
Testing is performed structurally, covers only a small part of all rules
All critical rules are tested regularly in a manual effort
Automated tests are performed regularly to verify correct functioning

No metrics applied to use cases


Some ad-hoc measurements regarding use cases take place
Basic quantitative metrics in place for critical use cases
Metrics applied to all use cases, no risk-based feedback loop
Metrics applied to all use cases, used to guide risk-based use case growth

No scoring or prioritization applied


Scoring and prioritization applied in an ad-hoc manner
Scoring and prioritization applied structurally to critical use cases
Scoring and prioritization applied structurally to all use cases
All use cases scored and prioritized, validated & reviewed by stakeholders

Use cases are not reviewed


Use cases are reviewed ad-hoc, not using a structured approach
All critical use cases are reviewed using a structured approach
All use cases are regularly and informally reviewed and updated
All use cases are regularly and formally reviewed and updated
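
The use case answer options above refer to traceability towards business context, metrics, and risk-based scoring and prioritization. As a purely illustrative sketch, a use case record supporting those properties could look as follows; the field names, risk scale and priority formula are assumptions, not a SOC-CMM-prescribed format:

from dataclasses import dataclass, field
from typing import List

@dataclass
class UseCase:
    # Illustrative structure only.
    name: str
    business_driver: str                                       # traceability towards business context
    detection_rules: List[str] = field(default_factory=list)  # traceability towards implemented rules
    risk_score: int = 0                                        # e.g. likelihood x impact on an assumed 1-25 scale
    false_positive_rate: float = 0.0                           # example metric fed back into tuning

    def priority(self) -> int:
        """Simple risk-based priority bucket: higher risk means higher priority."""
        return 1 if self.risk_score >= 16 else 2 if self.risk_score >= 9 else 3

if __name__ == "__main__":
    uc = UseCase("Brute-force detection", "Protect customer portal",
                 ["rule_auth_failures"], risk_score=12, false_positive_rate=0.05)
    print(uc.priority())  # 2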

SOC-CMM - Technology Domain


guidance

Functional ownership not assigned


Some elements of ownership identified, not described
All elements of ownership described, not assigned
Functional ownership fully described and assigned, not approved
Functional ownership fully described, assigned and formally approved

Technical ownership not assigned


Some elements of ownership identified, not described
All elements of ownership described, not assigned
Technical ownership fully described and assigned, not approved
Technical ownership fully described, assigned and formally approved

No documentation in place
Some ad-hoc information across documents
Basic documentation of the SIEM system in place
Single document, full technical description of SIEM system
Document completed, approved and formally published

No documentation in place
Some ad-hoc information across documents
Basic documentation of the SIEM system in place
Single document, full functional description of SIEM system
Document completed, approved and formally published

No personnel for SIEM support


Personnel for support available, not dedicated or sufficient
Sufficient dedicated personnel available, not documented
Sufficient dedicated personnel available & documented, not formalized
Sufficient dedicated personnel available, documented and formalized
Personnel not formally trained
Product training identified, no training currently in place
Individual training, not part of the training program
Training part of training program, all key personnel trained
All personnel formally trained

Personnel not formally certified


Product certification identified, no certification currently in place
Individual certification, not part of the certification program
Certification part of certification program, all key personnel certified
All personnel formally certified

Support contract not in place


Basic support contract in place, not covering SOC requirements
Support contract in place, covering basic SOC requirements
Support contract in place, covering most SOC requirements
Support contract in place, covering all SOC requirements

HA not in place
HA requirements identified, not implemented
Manual actions required for achieving redundancy
Fully automated HA in place, not aligned with business continuity plans
Fully automated HA in place, aligned with business continuity plans

Data backup or replication not in place


Data backed up in ad-hoc fashion
Weekly backup routine in place
Daily backup routine in place
Real-time replication of data implemented

Configuration backup or replication not in place


Configuration backed up in ad-hoc fashion
Configuration backed up manually after each change
Daily backup routine in place
Real-time replication of configuration implemented

DR plan not in place


DR requirements identified, plan not yet in place
Basic DR plan in place
Full DR plan in place, not approved by business continuity stakeholders
Full DR plan in place, approved by business continuity stakeholders

DR plan not tested


DR plan tested on ad-hoc basis
DR plan tested, but not formally
DR plan regularly and fully tested, results not formally published
DR plan regularly tested, results formally published and followed up
Test environment not in place, testing not performed
Test environment not in place, testing performed in ad-hoc fashion
Separate test environment in place, not used structurally
Separate test environment with informal procedures in place
Separate test environment with formal procedures in place

Access to the system not restricted


Basic access control in place
Granular access rights implemented, not monitored
Granular access rights implemented and monitored, not audited
Granular access rights implemented, monitored and subjected to audit

Review of access rights not performed


Review of access rights performed in ad-hoc fashion
Access right review documented, but not executed structurally
Access rights reviewed periodically and structurally
Access rights reviewed periodically and after each change in employees
(Importance weight per question: 3.)
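
Each question in this assessment carries an importance weight (the value 3 noted above) and an answer on the recurring scale from "not in place" up to formally established and optimized. A minimal sketch of how weighted answers could be rolled up into a section maturity score; the 0-5 level mapping and the weighted-average formula are assumptions for illustration, not the official SOC-CMM calculation:

from dataclasses import dataclass
from typing import List

@dataclass
class Question:
    identifier: str      # hypothetical question identifier
    answer_level: int    # 0 = lowest answer option, 5 = highest (assumed scale)
    importance: int = 3  # weight, as in the importance column above

def section_score(questions: List[Question]) -> float:
    """Importance-weighted average of the answer levels for one section."""
    total_weight = sum(q.importance for q in questions)
    if total_weight == 0:
        return 0.0
    return round(sum(q.answer_level * q.importance for q in questions) / total_weight, 2)

if __name__ == "__main__":
    demo = [Question("Q1", 3), Question("Q2", 4), Question("Q3", 2)]
    print(section_score(demo))  # 3.0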

guidance

Functional ownership not assigned


Some elements of ownership identified, not described
All elements of ownership described, not assigned
Functional ownership fully described and assigned, not approved
Functional ownership fully described, assigned and formally approved

Technical ownership not assigned


Some elements of ownership identified, not described
All elements of ownership described, not assigned
Technical ownership fully described and assigned, not approved
Technical ownership fully described, assigned and formally approved

No documentation in place
Some ad-hoc information across documents
Basic documentation of the IDPS system in place
Single document, full technical description of IDPS system
Document completed, approved and formally published

No documentation in place
Some ad-hoc information across documents
Basic documentation of the IDPS system in place
Single document, full functional description of IDPS system
Document completed, approved and formally published

No personnel for IDPS support


Personnel for support available, not dedicated or sufficient
Sufficient dedicated personnel available, not documented
Sufficient dedicated personnel available & documented, not formalized
Sufficient dedicated personnel available, documented and formalized

Personnel not formally trained


Product training identified, no training currently in place
Individual training, not part of the training program
Training part of training program, all key personnel trained
All personnel formally trained

Personnel not formally certified


Product certification identified, no certification currently in place
Individual certification, not part of the certification program
Certification part of certification program, all key personnel certified
All personnel formally certified

Support contract not in place


Basic support contract in place, not covering SOC requirements
Support contract in place, covering basic SOC requirements
Support contract in place, covering most SOC requirements
Support contract in place, covering all SOC requirements

HA not in place
HA requirements identified, not implemented
Manual actions required for achieving redundancy
Fully automated HA in place, not aligned with business continuity plans
Fully automated HA in place, aligned with business continuity plans

Data backup or replication not in place


Data backed up in ad-hoc fashion
Weekly backup routine in place
Daily backup routine in place
Real-time replication of data implemented

Configuration backup or replication not in place


Configuration backed up in ad-hoc fashion
Configuration backed up manually after each change
Daily backup routine in place
Real-time replication of configuration implemented

DR plan not in place


DR requirements identified, plan not yet in place
Basic DR plan in place
Full DR plan in place, not approved by business continuity stakeholders
Full DR plan in place, approved by business continuity stakeholders

DR plan not tested


DR plan tested on ad-hoc basis
DR plan tested, but not formally
DR plan regularly and fully tested, results not formally published
DR plan regularly tested, results formally published and followed up

Test environment not in place, testing not performed


Test environment not in place, testing performed in ad-hoc fashion
Separate test environment in place, not used structurally
Separate test environment with informal procedures in place
Separate test environment with formal procedures in place

Access to the system not restricted


Basic access control in place
Granular access rights implemented, not monitored
Granular access rights implemented and monitored, not audited
Granular access rights implemented, monitored and subjected to audit

Review of access rights not performed


Review of access rights performed in ad-hoc fashion
Access right review documented, but not executed structurally
Access rights reviewed periodically and structurally
Access rights reviewed periodically and after each change in employees
guidance

Functional ownership not assigned


Some elements of ownership identified, not described
All elements of ownership described, not assigned
Functional ownership fully described and assigned, not approved
Functional ownership fully described, assigned and formally approved

Technical ownership not assigned


Some elements of ownership identified, not described
All elements of ownership described, not assigned
Technical ownership fully described and assigned, not approved
Technical ownership fully described, assigned and formally approved

No documentation in place
Some ad-hoc information across documents
Basic documentation of the analytics system in place
Single document, full technical description of analytics system
Document completed, approved and formally published

No documentation in place
Some ad-hoc information across documents
Basic documentation of the analytics system in place
Single document, full functional description of analytics system
Document completed, approved and formally published

No personnel for analytics support


Personnel for support available, not dedicated or sufficient
Sufficient dedicated personnel available, not documented
Sufficient dedicated personnel available & documented, not formalized
Sufficient dedicated personnel available, documented and formalized

Personnel not formally trained


Product training identified, no training currently in place
Individual training, not part of the training program
Training part of training program, all key personnel trained
All personnel formally trained
Personnel not formally certified
Product certification identified, no certification currently in place
Individual certification, not part of the certification program
Certification part of certification program, all key personnel certified
All personnel formally certified

Support contract not in place


Basic support contract in place, not covering SOC requirements
Support contract in place, covering basic SOC requirements
Support contract in place, covering most SOC requirements
Support contract in place, covering all SOC requirements

HA not in place
HA requirements identified, not implemented
Manual actions required for achieving redundancy
Fully automated HA in place, not aligned with business continuity plans
Fully automated HA in place, aligned with business continuity plans

Data backup or replication not in place


Data backed up in ad-hoc fashion
Weekly backup routine in place
Daily backup routine in place
Real-time replication of data implemented

Configuration backup or replication not in place


Configuration backed up in ad-hoc fashion
Configuration backed up manually after each change
Daily backup routine in place
Real-time replication of configuration implemented

DR plan not in place


DR requirements identified, plan not yet in place
Basic DR plan in place
Full DR plan in place, not approved by business continuity stakeholders
Full DR plan in place, approved by business continuity stakeholders

DR plan not tested


DR plan tested on ad-hoc basis
DR plan tested, but not formally
DR plan regularly and fully tested, results not formally published
DR plan regularly tested, results formally published and followed up

Test environment not in place, testing not performed


Test environment not in place, testing performed in ad-hoc fashion
Separate test environment in place, not used structurally
Separate test environment with informal procedures in place
Separate test environment with formal procedures in place
Access to the system not restricted
Basic access control in place
Granular access rights implemented, not monitored
Granular access rights implemented and monitored, not audited
Granular access rights implemented, monitored and subjected to audit

Review of access rights not performed


Review of access rights performed in ad-hoc fashion
Access right review documented, but not executed structurally
Access rights reviewed periodically and structurally
Access rights reviewed periodically and after each change in employees

guidance

Functional ownership not assigned


Some elements of ownership identified, not described
All elements of ownership described, not assigned
Functional ownership fully described and assigned, not approved
Functional ownership fully described, assigned and formally approved

Technical ownership not assigned


Some elements of ownership identified, not described
All elements of ownership described, not assigned
Technical ownership fully described and assigned, not approved
Technical ownership fully described, assigned and formally approved
No documentation in place
Some ad-hoc information across documents
Basic documentation of the security automation & orchestration system in place
Single document, full technical description of the automation & orchestration system
Document completed, approved and formally published

No documentation in place
Some ad-hoc information across documents
Basic documentation of the security automation & orchestration system in place
Single document, full functional description of the automation & orchestration system
Document completed, approved and formally published

No personnel for security automation & orchestration support


Personnel for support available, not dedicated or sufficient
Sufficient dedicated personnel available, not documented
Sufficient dedicated personnel available & documented, not formalized
Sufficient dedicated personnel available, documented and formalized

Personnel not formally trained


Product training identified, no training currently in place
Individual training, not part of the training program
Training part of training program, all key personnel trained
All personnel formally trained

Personnel not formally certified


Product certification identified, no certification currently in place
Individual certification, not part of the certification program
Certification part of certification program, all key personnel certified
All personnel formally certified

Support contract not in place


Basic support contract in place, not covering SOC requirements
Support contract in place, covering basic SOC requirements
Support contract in place, covering most SOC requirements
Support contract in place, covering all SOC requirements

HA not in place
HA requirements identified, not implemented
Manual actions required for achieving redundancy
Fully automated HA in place, not aligned with business continuity plans
Fully automated HA in place, aligned with business continuity plans

Data backup or replication not in place


Data backed up in ad-hoc fashion
Weekly backup routine in place
Daily backup routine in place
Real-time replication of data implemented
Configuration backup or replication not in place
Configuration backed up in ad-hoc fashion
Configuration backed up manually after each change
Daily backup routine in place
Real-time replication of configuration implemented

DR plan not in place


DR requirements identified, plan not yet in place
Basic DR plan in place
Full DR plan in place, not approved by business continuity stakeholders
Full DR plan in place, approved by business continuity stakeholders

DR plan not tested


DR plan tested on ad-hoc basis
DR plan tested, but not formally
DR plan regularly and fully tested, results not formally published
DR plan regularly tested, results formally published and followed up

Test environment not in place, testing not performed


Test environment not in place, testing performed in ad-hoc fashion
Separate test environment in place, not used structurally
Separate test environment with informal procedures in place
Separate test environment with formal procedures in place

Access to the security automation system not restricted


Basic access control in place
Granular access rights implemented, not monitored
Granular access rights implemented and monitored, not audited
Granular access rights implemented, monitored and subjected to audit

Review of access rights not performed


Review of access rights performed in ad-hoc fashion
Access right review documented, but not executed structurally
Access rights reviewed periodically and structurally
Access rights reviewed periodically and after each change in employees
SOC-CMM - Services Domain
guidance

No documentation in place
Some ad-hoc information across documents
Basic documentation of service in place
Single document, full description of service
Document completed, approved and formally published

Service not measured for quality


Metrics defined, applied in an ad-hoc fashion
Metrics defined, applied in a structured but informal fashion
Metrics formalized and used in regular reports
Formal and approved metrics in place, feedback used for improvement

Service not measured


SLA defined, measured in an ad-hoc fashion
SLA defined, measured periodically but not reported
SLA compliance reported periodically, not discussed with customers
SLA compliance discussed with customers regularly for improvement
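
The SLA answer options above run from ad-hoc measurement to regular discussion of compliance with customers. As a purely illustrative sketch, SLA compliance for a monitoring service could be computed as the share of alerts handled within an agreed response time; the target value and names below are assumptions, not SOC-CMM requirements:

from datetime import timedelta
from typing import List

def sla_compliance(response_times: List[timedelta],
                   target: timedelta = timedelta(minutes=30)) -> float:
    """Percentage of responses that met the agreed response-time target."""
    if not response_times:
        return 100.0
    met = sum(1 for t in response_times if t <= target)
    return round(100.0 * met / len(response_times), 1)

if __name__ == "__main__":
    samples = [timedelta(minutes=12), timedelta(minutes=45), timedelta(minutes=28)]
    print(sla_compliance(samples))  # 66.7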

No updates sent to customers/stakeholders


Ad-hoc updates sent to some customers/stakeholders
Frequent updates sent to most customers/stakeholders
Periodical updates sent to all customers/stakeholders
Periodical updates sent and discussed with all customers/stakeholders

Contractual agreements not in place


No contract in place, ad-hoc agreements made
Basic contract in place, not formally signed off
Contract signed, but not regularly reviewed
Contract signed, approved by- and regularly reviewed with customers

No personnel allocated
Personnel allocated, but not sufficient for required service delivery
Personnel allocated, not dedicated for this service
Sufficient dedicated personnel available, not fully trained and capable
Sufficient dedicated personnel available, trained and fully capable

Process not aligned


Alignment done in an ad-hoc fashion
Alignment done regularly, but not in a structured fashion
Alignment done structurally & regularly with most relevant processes
Alignment done structurally & regularly with all relevant processes

No such process in place


Continuity requirements identified, process not yet in place
Basic service continuity process in place
Full process in place, not approved by relevant stakeholders
Full process in place, formally approved by relevant stakeholders

No procedures in place
Basic procedures in place, used in an ad-hoc fashion
All procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and fully operationalized

Best practices not applied


Best practices identified, but not applied
Best practices applied, but not structurally
Best practices applied to service architecture and service delivery
Best practices applied and adherence checked regularly

Use cases not used


Use cases undocumented and used in an ad-hoc fashion
Use cases documented and applied structurally
Use cases embedded in the security monitoring processes
Use cases fully embedded, tuning and Life Cycle Management applied

Service performance not measured


Goals set for service performance, measured ad-hoc
Goals set for service performance, measured structurally but informally
Goals set for service performance, measured structurally and formally
Continuous measurement to determine progress & adjust process

Improvement not done


Goals defined, but not pursued
Goals defined and pursued structurally, but not formalized
Goals formally defined and pursued structurally and periodically
Continuous improvement based on targets and feedback loops
guidance

Standard not adopted


Awareness of standards, used in ad-hoc fashion
Standard used structurally as reference during incident response
Many elements adopted, not fully aligned
Standard fully adopted, process set up and executed using standard

No documentation in place
Some ad-hoc information across documents
Basic documentation of service in place
Single document, full description of service
Document completed, approved and formally published

Service not measured for quality


Metrics defined, applied in an ad-hoc fashion
Metrics defined, applied in a structured but informal fashion
Metrics formalized and used in regular reports
Formal and approved metrics in place, feedback used for improvement

Service not measured


SLA defined, measured in an ad-hoc fashion
SLA defined, measured periodically but not reported
SLA compliance reported periodically, not discussed with customers
SLA compliance discussed with customers regularly for improvement

No updates sent to customers/stakeholders


Ad-hoc updates sent to some customers/stakeholders
Frequent updates sent to most customers/stakeholders
Periodical updates sent to all customers/stakeholders
Periodical updates sent and discussed with all customers/stakeholders

Contractual agreements not in place


No contract in place, ad-hoc agreements made
Basic contract in place, not formally signed off
Contract signed, but not regularly reviewed
Contract signed, approved by- and regularly reviewed with customers

No personnel allocated
Personnel allocated, but not sufficient for required service delivery
Personnel allocated, not dedicated for this service
Sufficient dedicated personnel available, not fully trained and capable
Sufficient dedicated personnel available, trained and fully capable

Process not aligned


Alignment done in an ad-hoc fashion
Alignment done regularly, but not in a structured fashion
Alignment done structurally & regularly with most relevant processes
Alignment done structurally & regularly with all relevant processes

No mandate
Mandate requested in ad-hoc fashion during incident response
Mandate informally given, not supported by all stakeholders
Mandate given and supported by all stakeholders, not formalized
Full mandate, formally documented, approved and published

Best practices not applied


Best practices identified, but not applied
Best practices applied, but not structurally
Best practices applied to service architecture and service delivery
Best practices applied and adherence checked regularly

No workflows or scenarios in place


Some ad-hoc information across documents
Basic workflows in place, not covering all incident types
Workflows created for all incident types, not formalized
Formal workflows created, approved & published for all incident types

Service performance not measured


Goals set for service performance, measured ad-hoc
Goals set for service performance, measured structurally but informally
Goals set for service performance, measured structurally and formally
Continuous measurement to determine progress & adjust process

Improvement not done


Goals defined, but not pursued
Goals defined and pursued structurally, but not formalized
Goals formally defined and pursued structurally and periodically
Continuous improvement based on targets and feedback loops
guidance

No documentation in place
Some ad-hoc information across documents
Basic documentation of service in place
Single document, full description of service
Document completed, approved and formally published

Service not measured for quality


Metrics defined, applied in an ad-hoc fashion
Metrics defined, applied in a structured but informal fashion
Metrics formalized and used in regular reports
Formal and approved metrics in place, feedback used for improvement

Service not measured


SLA defined, measured in an ad-hoc fashion
SLA defined, measured periodically but not reported
SLA compliance reported periodically, not discussed with customers
SLA compliance discussed with customers regularly for improvement
No updates sent to customers/stakeholders
Ad-hoc updates sent to some customers/stakeholders
Frequent updates sent to most customers/stakeholders
Periodical updates sent to all customers/stakeholders
Periodical updates sent and discussed with all customers/stakeholders

Contractual agreements not in place


No contract in place, ad-hoc agreements made
Basic contract in place, not formally signed off
Contract signed, but not regularly reviewed
Contract signed, approved by- and regularly reviewed with customers

No personnel allocated
Personnel allocated, but not sufficient for required service delivery
Personnel allocated, not dedicated for this service
Sufficient dedicated personnel available, not fully trained and capable
Sufficient dedicated personnel available, trained and fully capable

Process not aligned


Alignment done in an ad-hoc fashion
Alignment done regularly, but not in a structured fashion
Alignment done structurally & regularly with most relevant processes
Alignment done structurally & regularly with all relevant processes

No such process in place


Continuity requirements identified, process not yet in place
Basic service continuity process in place
Full process in place, not approved by relevant stakeholders
Full process in place, formally approved by relevant stakeholders

No procedures in place
Basic procedures in place, used in an ad-hoc fashion
All procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and fully operationalized

Best practices not applied


Best practices identified, but not applied
Best practices applied, but not structurally
Best practices applied to service architecture and service delivery
Best practices applied and adherence checked regularly

No workflows or scenarios in place


Some ad-hoc information across documents
Basic workflows in place, not covering all incident types
Workflows created for all incident types, not formalized
Formal workflows created, approved and published for all incident types
Service performance not measured
Goals set for service performance, measured ad-hoc
Goals set for service performance, measured structurally but informally
Goals set for service performance, measured structurally and formally
Continuous measurement to determine progress & adjust process

Improvement not done


Goals defined, but not pursued
Goals defined and pursued structurally, but not formalized
Goals formally defined and pursued structurally and periodically
Continuous improvement based on targets and feedback loops

guidance

No documentation in place
Some ad-hoc information across documents
Basic documentation of service in place
Single document, full description of service
Document completed, approved and formally published

Service not measured for quality


Metrics defined, applied in an ad-hoc fashion
Metrics defined, applied in a structured but informal fashion
Metrics formalized and used in regular reports
Formal and approved metrics in place, feedback used for improvement

Service not measured


SLA defined, measured in an ad-hoc fashion
SLA defined, measured periodically but not reported
SLA compliance reported periodically, not discussed with customers
SLA compliance discussed with customers regularly for improvement

No updates sent to customers/stakeholders


Ad-hoc updates sent to some customers/stakeholders
Frequent updates sent to most customers/stakeholders
Periodical updates sent to all customers/stakeholders
Periodical updates sent and discussed with all customers/stakeholders

Contractual agreements not in place


No contract in place, ad-hoc agreements made
Basic contract in place, not formally signed off
Contract signed, but not regularly reviewed
Contract signed, approved by- and regularly reviewed with customers

No personnel allocated
Personnel allocated, but not sufficient for required service delivery
Personnel allocated, not dedicated for this service
Sufficient dedicated personnel available, not fully trained and capable
Sufficient dedicated personnel available, trained and fully capable

Process not aligned


Alignment done in an ad-hoc fashion
Alignment done regularly, but not in a structured fashion
Alignment done structurally & regularly with most relevant processes
Alignment done structurally & regularly with all relevant processes

No such process in place


Continuity requirements identified, process not yet in place
Basic service continuity process in place
Full process in place, not approved by relevant stakeholders
Full process in place, formally approved by relevant stakeholders

No procedures in place
Basic procedures in place, used in an ad-hoc fashion
All procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and fully operationalized

Best practices not applied


Best practices identified, but not applied
Best practices applied, but not structurally
Best practices applied to service architecture and service delivery
Best practices applied and adherence checked regularly

Service performance not measured


Goals set for service performance, measured ad-hoc
Goals set for service performance, measured structurally but informally
Goals set for service performance, measured structurally and formally
Continuous measurement to determine progress & adjust process

Improvement not done


Goals defined, but not pursued
Goals defined and pursued structurally, but not formalized
Goals formally defined and pursued structurally and periodically
Continuous improvement based on targets and feedback loops

guidance

Methodology not adopted


Awareness of methodologies, used in ad-hoc fashion
Methodologies used structurally as reference during hunting activities
Many elements adopted, not fully aligned
Methodology fully adopted, process set up and executed accordingly

No documentation in place
Some ad-hoc information across documents
Basic documentation of service in place
Single document, full description of service
Document completed, approved and formally published

Service not measured for quality


Metrics defined, applied in an ad-hoc fashion
Metrics defined, applied in a structured but informal fashion
Metrics formalized and used in regular reports
Formal and approved metrics in place, feedback used for improvement

Service not measured


SLA defined, measured in an ad-hoc fashion
SLA defined, measured periodically but not reported
SLA compliance reported periodically, not discussed with customers
SLA compliance discussed with customers regularly for improvement

No updates sent to customers/stakeholders


Ad-hoc updates sent to some customers/stakeholders
Frequent updates sent to most customers/stakeholders
Periodical updates sent to all customers/stakeholders
Periodical updates sent and discussed with all customers/stakeholders

Contractual agreements not in place


No contract in place, ad-hoc agreements made
Basic contract in place, not formally signed off
Contract signed, but not regularly reviewed
Contract signed, approved by- and regularly reviewed with customers

No personnel allocated
Personnel allocated, but not sufficient for required service delivery
Personnel allocated, not dedicated for this service
Sufficient dedicated personnel available, not fully trained and capable
Sufficient dedicated personnel available, trained and fully capable

Process not aligned


Alignment done in an ad-hoc fashion
Alignment done regularly, but not in a structured fashion
Alignment done structurally & regularly with most relevant processes
Alignment done structurally & regularly with all relevant processes

No such process in place


Continuity requirements identified, process not yet in place
Basic service continuity process in place
Full process in place, not approved by relevant stakeholders
Full process in place, formally approved by relevant stakeholders

No procedures in place
Basic procedures in place, used in an ad-hoc fashion
All procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and fully operationalized

Best practices not applied


Best practices identified, but not applied
Best practices applied, but not structurally
Best practices applied to service architecture and service delivery
Best practices applied and adherence checked regularly

Service performance not measured


Goals set for service performance, measured ad-hoc
Goals set for service performance, measured structurally but informally
Goals set for service performance, measured structurally and formally
Continuous measurement to determine progress & adjust process

Improvement not done


Goals defined, but not pursued
Goals defined and pursued structurally, but not formalized
Goals formally defined and pursued structurally and periodically
Continuous improvement based on targets and feedback loops

guidance

No documentation in place
Some ad-hoc information across documents
Basic documentation of service in place
Single document, full description of service
Document completed, approved and formally published

Service not measured for quality


Metrics defined, applied in an ad-hoc fashion
Metrics defined, applied in a structured but informal fashion
Metrics formalized and used in regular reports
Formal and approved metrics in place, feedback used for improvement

Service not measured


SLA defined, measured in an ad-hoc fashion
SLA defined, measured periodically but not reported
SLA compliance reported periodically, not discussed with customers
SLA compliance discussed with customers regularly for improvement

No updates sent to customers/stakeholders


Ad-hoc updates sent to some customers/stakeholders
Frequent updates sent to most customers/stakeholders
Periodic updates sent to all customers/stakeholders
Periodic updates sent and discussed with all customers/stakeholders

Contractual agreements not in place


No contract in place, ad-hoc agreements made
Basic contract in place, not formally signed off
Contract signed, but not regularly reviewed
Contract signed, approved by and regularly reviewed with customers

No personnel allocated
Personnel allocated, but not sufficient for required service delivery
Personnel allocated, not dedicated for this service
Sufficient dedicated personnel available, not fully trained and capable
Sufficient dedicated personnel available, trained and fully capable

Process not aligned


Alignment done in an ad-hoc fashion
Alignment done regularly, but not in a structured fashion
Alignment done structurally & regularly with most relevant processes
Alignment done structurally & regularly with all relevant processes

No such process in place


Continuity requirements identified, process not yet in place
Basic service continuity process in place
Full process in place, not approved by relevant stakeholders
Full process in place, formally approved by relevant stakeholders

No procedures in place
Basic procedures in place, used in an ad-hoc fashion
All procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and fully operationalized

Best practices not applied


Best practices identified, but not applied
Best practices applied, but not structurally
Best practices applied to service architecture and service delivery
Best practices applied and adherence checked regularly

Service performance not measured


Goals set for service performance, measured ad-hoc
Goals set for service performance, measured structurally but informally
Goals set for service performance, measured structurally and formally
Continuous measurement to determine progress & adjust process

Improvement not done


Goals defined, but not pursued
Goals defined and pursued structurally, but not formalized
Goals formally defined and pursued structurally and periodically
Continuous improvement based on targets and feedback loops

guidance

No documentation in place
Some ad-hoc information across documents
Basic documentation of service in place
Single document, full description of service
Document completed, approved and formally published
Service not measured for quality
Metrics defined, applied in an ad-hoc fashion
Metrics defined, applied in a structured but informal fashion
Metrics formalized and used in regular reports
Formal and approved metrics in place, feedback used for improvement

Service not measured


SLA defined, measured in an ad-hoc fashion
SLA defined, measured periodically but not reported
SLA compliance reported periodically, not discussed with customers
SLA compliance discussed with customers regularly for improvement

No updates sent to customers/stakeholders


Ad-hoc updates sent to some customers/stakeholders
Frequent updates sent to most customers/stakeholders
Periodic updates sent to all customers/stakeholders
Periodic updates sent and discussed with all customers/stakeholders

Contractual agreements not in place


No contract in place, ad-hoc agreements made
Basic contract in place, not formally signed off
Contract signed, but not regularly reviewed
Contract signed, approved by and regularly reviewed with customers

No personnel allocated
Personnel allocated, but not sufficient for required service delivery
Personnel allocated, not dedicated for this service
Sufficient dedicated personnel available, not fully trained and capable
Sufficient dedicated personnel available, trained and fully capable

Process not aligned


Alignment done in an ad-hoc fashion
Alignment done regularly, but not in a structured fashion
Alignment done structurally & regularly with most relevant processes
Alignment done structurally & regularly with all relevant processes

No such process in place


Continuity requirements identified, process not yet in place
Basic service continuity process in place
Full process in place, not approved by relevant stakeholders
Full process in place, formally approved by relevant stakeholders

No procedures in place
Basic procedures in place, used in an ad-hoc fashion
All procedures in place, operational but not used structurally
Procedures in place, operational and used structurally
Procedures in place, formally published and fully operationalized
Best practices not applied
Best practices identified, but not applied
Best practices applied, but not structurally
Best practices applied to service architecture and service delivery
Best practices applied and adherence checked regularly

Service performance not measured


Goals set for service performance, measured ad-hoc
Goals set for service performance, measured structurally but informally
Goals set for service performance, measured structurally and formally
Continuous measurement to determine progress & adjust process

Improvement not done


Goals defined, but not pursued
Goals defined and pursued structurally, but not formalized
Goals formally defined and pursued structurally and periodically
Continuous improvement based on targets and feedback loops

mented after version 1.0 hereafter


NIST CSF
NIST CSF version Function Category

1.0 IDENTIFY (ID) Asset Management (ID.AM)


1.0 IDENTIFY (ID) Asset Management (ID.AM)
1.0 IDENTIFY (ID) Asset Management (ID.AM)
1.0 IDENTIFY (ID) Asset Management (ID.AM)
1.0 IDENTIFY (ID) Asset Management (ID.AM)
1.0 IDENTIFY (ID) Asset Management (ID.AM)

1.0 IDENTIFY (ID) Business Environment (ID.BE)


1.0 IDENTIFY (ID) Business Environment (ID.BE)
1.0 IDENTIFY (ID) Business Environment (ID.BE)
1.0 IDENTIFY (ID) Business Environment (ID.BE)
1.0 IDENTIFY (ID) Business Environment (ID.BE)

1.0 IDENTIFY (ID) Governance (ID.GV)


1.0 IDENTIFY (ID) Governance (ID.GV)
1.0 IDENTIFY (ID) Governance (ID.GV)
1.0 IDENTIFY (ID) Governance (ID.GV)

1.0 IDENTIFY (ID) Risk Assessment (ID.RA)


1.0 IDENTIFY (ID) Risk Assessment (ID.RA)
1.0 IDENTIFY (ID) Risk Assessment (ID.RA)
1.0 IDENTIFY (ID) Risk Assessment (ID.RA)
1.0 IDENTIFY (ID) Risk Assessment (ID.RA)
1.0 IDENTIFY (ID) Risk Assessment (ID.RA)

1.0 IDENTIFY (ID) Risk Management Strategy (ID.RM)


1.0 IDENTIFY (ID) Risk Management Strategy (ID.RM)
1.0 IDENTIFY (ID) Risk Management Strategy (ID.RM)

1.0 PROTECT (PR) Access Control (PR.AC)


1.0 PROTECT (PR) Access Control (PR.AC)
1.0 PROTECT (PR) Access Control (PR.AC)
1.0 PROTECT (PR) Access Control (PR.AC)
1.0 PROTECT (PR) Access Control (PR.AC)

1.0 PROTECT (PR) Awareness and Training (PR.AT)


1.0 PROTECT (PR) Awareness and Training (PR.AT)
1.0 PROTECT (PR) Awareness and Training (PR.AT)
1.0 PROTECT (PR) Awareness and Training (PR.AT)
1.0 PROTECT (PR) Awareness and Training (PR.AT)

1.0 PROTECT (PR) Data Security (PR.DS)


1.0 PROTECT (PR) Data Security (PR.DS)
1.0 PROTECT (PR) Data Security (PR.DS)
1.0 PROTECT (PR) Data Security (PR.DS)
1.0 PROTECT (PR) Data Security (PR.DS)
1.0 PROTECT (PR) Data Security (PR.DS)
1.0 PROTECT (PR) Data Security (PR.DS)

1.0 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)


1.0 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)
1.0 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)
1.0 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)
1.0 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)
1.0 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)
1.0 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)
1.0 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)
1.0 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)
1.0 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)
1.0 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)
1.0 PROTECT (PR) Information Protection Processes and Procedures (PR.IP)

1.0 PROTECT (PR) Maintenance (PR.MA)


1.0 PROTECT (PR) Maintenance (PR.MA)

1.0 PROTECT (PR) Protective Technology (PR.PT)


1.0 PROTECT (PR) Protective Technology (PR.PT)
1.0 PROTECT (PR) Protective Technology (PR.PT)
1.0 PROTECT (PR) Protective Technology (PR.PT)

1.0 DETECT (DE) Anomalies and Events (DE.AE)


1.0 DETECT (DE) Anomalies and Events (DE.AE)
1.0 DETECT (DE) Anomalies and Events (DE.AE)
1.0 DETECT (DE) Anomalies and Events (DE.AE)
1.0 DETECT (DE) Anomalies and Events (DE.AE)

1.0 DETECT (DE) Security Continuous Monitoring (DE.CM)


1.0 DETECT (DE) Security Continuous Monitoring (DE.CM)
1.0 DETECT (DE) Security Continuous Monitoring (DE.CM)
1.0 DETECT (DE) Security Continuous Monitoring (DE.CM)
1.0 DETECT (DE) Security Continuous Monitoring (DE.CM)
1.0 DETECT (DE) Security Continuous Monitoring (DE.CM)
1.0 DETECT (DE) Security Continuous Monitoring (DE.CM)
1.0 DETECT (DE) Security Continuous Monitoring (DE.CM)

1.0 DETECT (DE) Detection Processes (DE.DP)


1.0 DETECT (DE) Detection Processes (DE.DP)
1.0 DETECT (DE) Detection Processes (DE.DP)
1.0 DETECT (DE) Detection Processes (DE.DP)
1.0 DETECT (DE) Detection Processes (DE.DP)
1.0 RESPOND (RS) Response Planning (RS.RP)

1.0 RESPOND (RS) Communications (RS.CO)


1.0 RESPOND (RS) Communications (RS.CO)
1.0 RESPOND (RS) Communications (RS.CO)
1.0 RESPOND (RS) Communications (RS.CO)
1.0 RESPOND (RS) Communications (RS.CO)

1.0 RESPOND (RS) Analysis (RS.AN)


1.0 RESPOND (RS) Analysis (RS.AN)
1.0 RESPOND (RS) Analysis (RS.AN)
1.0 RESPOND (RS) Analysis (RS.AN)

1.0 RESPOND (RS) Mitigation (RS.MI)


1.0 RESPOND (RS) Mitigation (RS.MI)
1.0 RESPOND (RS) Mitigation (RS.MI)

1.0 RESPOND (RS) Improvements (RS.IM)


1.0 RESPOND (RS) Improvements (RS.IM)

1.0 RECOVER (RC) Recovery Planning (RC.RP)

1.0 RECOVER (RC) Improvements (RC.IM)


1.0 RECOVER (RC) Improvements (RC.IM)

1.0 RECOVER (RC) Communications (RC.CO)


1.0 RECOVER (RC) Communications (RC.CO)
1.0 RECOVER (RC) Communications (RC.CO)
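
The listing above pairs each mapped SOC-CMM item with its NIST CSF v1.0 Function and Category. As an illustration only, the same mapping could be held in a small data structure for automated processing; the class and field names below are assumptions made for this sketch and are not part of the workbook.

# Illustrative sketch (Python): the NIST CSF v1.0 mapping as a data structure.
# Names and layout are assumptions for this example only.
from dataclasses import dataclass

@dataclass
class CsfSubcategory:
    function: str      # e.g. "IDENTIFY (ID)"
    category: str      # e.g. "Asset Management (ID.AM)"
    subcategory: str   # e.g. "ID.AM-6"

# A few entries transcribed from the listing above
CSF_MAPPING = [
    CsfSubcategory("IDENTIFY (ID)", "Asset Management (ID.AM)", "ID.AM-6"),
    CsfSubcategory("IDENTIFY (ID)", "Business Environment (ID.BE)", "ID.BE-3"),
    CsfSubcategory("DETECT (DE)", "Detection Processes (DE.DP)", "DE.DP-1"),
]
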
Maturity
Subcategory   Applicable?   Subcategory maturity MIN   Subcategory maturity TOTAL   Subcategory maturity MAX
ID.AM-1 0 0 0 0
ID.AM-2 0 0 0 0
ID.AM-3 0 0 0 0
ID.AM-4 0 0 0 0
ID.AM-5 0 0 0 0
ID.AM-6 14 14 0 70
SUM 14 0 70
ID.BE-1 0 0 0 0
ID.BE-2 0 0 0 0
ID.BE-3 4 4 0 20
ID.BE-4 1 1 0 5
ID.BE-5 5 5 0 25
SUM 10 0 50
ID.GV-1 2 2 0 10
ID.GV-2 3 3 0 15
ID.GV-3 8 8 0 40
ID.GV-4 1 1 0 5
SUM 14 0 70
ID.RA-1 2 2 0 10
ID.RA-2 0 0 0 0
ID.RA-3 15 15 0 75
ID.RA-4 12 12 0 60
ID.RA-5 13 13 0 65
ID.RA-6 0 0 0 0
SUM 42 0 210
ID.RM-1 1 1 0 5
ID.RM-2 0 0 0 0
ID.RM-3 0 0 0 0
SUM 1 0 5
Total
PR.AC-1 0 0 0 0
PR.AC-2 1 1 0 5
PR.AC-3 0 0 0 0
PR.AC-4 8 8 0 40
PR.AC-5 1 1 0 5
SUM 10 0 50
PR.AT-1 6 6 0 30
PR.AT-2 0 0 0 0
PR.AT-3 0 0 0 0
PR.AT-4 0 0 0 0
PR.AT-5 9 9 0 45
SUM 15 0 75
PR.DS-1 0 0 0 0
PR.DS-2 0 0 0 0
PR.DS-3 0 0 0 0
PR.DS-4 0 0 0 0
PR.DS-5 1 1 0 5
PR.DS-6 0 0 0 0
PR.DS-7 4 4 0 20
SUM 5 0 25
PR.IP-1 0 0 0 0
PR.IP-2 0 0 0 0
PR.IP-3 1 1 0 5
PR.IP-4 12 12 0 60
PR.IP-5 1 1 0 5
PR.IP-6 1 1 0 5
PR.IP-7 0 0 0 0
PR.IP-8 0 0 0 0
PR.IP-9 10 10 0 50
PR.IP-10 5 5 0 25
PR.IP-11 1 1 0 5
PR.IP-12 11 11 0 55
SUM 42 0 210
PR.MA-1 6 6 0 30
PR.MA-2 0 0 0 0
SUM 6 0 30
PR.PT-1 11 11 0 55
PR.PT-2 0 0 0 0
PR.PT-3 8 8 0 40
PR.PT-4 0 0 0 0
SUM 19 0 95
Total
DE.AE-1 0 0 0 0
DE.AE-2 2 2 0 10
DE.AE-3 2 2 0 10
DE.AE-4 0 0 0 0
DE.AE-5 0 0 0 0
SUM 4 0 20
DE.CM-1 2 2 0 10
DE.CM-2 2 2 0 10
DE.CM-3 2 2 0 10
DE.CM-4 2 2 0 10
DE.CM-5 2 2 0 10
DE.CM-6 2 2 0 10
DE.CM-7 2 2 0 10
DE.CM-8 0 0 0 0
SUM 14 0 70
DE.DP-1 18 18 0 90
DE.DP-2 22 22 0 110
DE.DP-3 0 0 0 0
DE.DP-4 2 2 0 10
DE.DP-5 3 3 0 15
SUM 45 0 225
Total
RS.RP-1 1 1 0 5
SUM 1 0 5
RS.CO-1 3 3 0 15
RS.CO-2 1 1 0 5
RS.CO-3 1 1 0 5
RS.CO-4 1 1 0 5
RS.CO-5 1 1 0 5
SUM 7 0 35
RS.AN-1 2 2 0 10
RS.AN-2 0 0 0 0
RS.AN-3 2 2 0 10
RS.AN-4 0 0 0 0
SUM 4 0 20
RS.MI-1 1 1 0 5
RS.MI-2 1 1 0 5
RS.MI-3 0 0 0 0
SUM 2 0 10
RS.IM-1 3 3 0 15
RS.IM-2 1 1 0 5
SUM 4 0 20
Total
RC.RP-1 0 0 0 0
SUM 0 0 0
RC.IM-1 0 0 0 0
RC.IM-2 0 0 0 0
SUM 0 0 0
RC.CO-1 0 0 0 0
RC.CO-2 0 0 0 0
RC.CO-3 0 0 0 0
SUM 0 0 0
Total
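
Reading the rows above, each subcategory's Applicable? value appears to be the number of SOC-CMM questions mapped to it, with MIN and MAX presumably that count multiplied by the lowest (1) and highest (5) possible answer, and TOTAL the sum of the answers actually given (all zero in this empty assessment). The sketch below illustrates that interpretation; the formula is an assumption inferred from the numbers shown, not taken from the workbook's own definitions.

# Assumed roll-up per NIST CSF subcategory (Python sketch).
# Assumption: each mapped question is answered on a 1-5 scale, so
# MIN = mapped_questions * 1 and MAX = mapped_questions * 5, while
# TOTAL is the sum of the answers actually given.
def subcategory_rollup(answers, mapped_questions):
    # answers: list of 1-5 scores for the questions mapped to this subcategory
    return {
        "applicable": mapped_questions,
        "MIN": mapped_questions * 1,
        "TOTAL": sum(answers),
        "MAX": mapped_questions * 5,
    }

# Example matching the ID.AM-6 row above: 14 mapped questions, none answered yet
print(subcategory_rollup([], 14))   # {'applicable': 14, 'MIN': 14, 'TOTAL': 0, 'MAX': 70}
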
Maturity
Category maturity   Category applicability   Function maturity   Applicable?

1
1
0
0
0
0
0 1 SUM
0
0
0
0
0
0 1 SUM
0
0
3
0
0 1 SUM
5
13
8
0
3
0
0 1 SUM
0
0
0
0 1 SUM
0 5 0 Total
0
0
0
4
0
0 1 SUM
0
0
0
0
1
0 1 SUM
1
2
0
2
2
1
0
0 1 SUM
0
0
0
0
0
1
0
0
0
0
0
8
0 1 SUM
4
4
0 1 SUM
2
0
0
0
0 1 SUM
0 6 0 Total
6
17
24
2
2
0 1 SUM
22
1
0
1
1
1
2
7
0 1 SUM
0
44
1
0
0
0 1 SUM
0 3 0 Total
1
0 1 SUM
2
7
0
4
0
0 1 SUM
2
3
9
1
0 1 SUM
3
5
2
0 1 SUM
2
1
0 1 SUM
0 5 0 Total
0
0 0 SUM
0
0
0 0 SUM
0
0
0
0 0 SUM
0 0 0 Total
Capability
Subcategory capability MIN   Subcategory capability TOTAL   Subcategory capability MAX   Category capability

1 0 5
1 0 5
0 0 0
0 0 0
0 0 0
0 0 0
2 0 10 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0 0
0 0 0
0 0 0
3 0 15
0 0 0
3 0 15 0
5 0 25
13 0 65
8 0 40
0 0 0
3 0 15
0 0 0
29 0 145 0
0 0 0
0 0 0
0 0 0
0 0 0 0
0
0 0 0
0 0 0
0 0 0
4 0 20
0 0 0
4 0 20 0
0 0 0
0 0 0
0 0 0
0 0 0
1 0 5
1 0 5 0
1 0 5
2 0 10
0 0 0
2 0 10
2 0 10
1 0 5
0 0 0
8 0 40 0
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
1 0 5
0 0 0
0 0 0
0 0 0
0 0 0
0 0 0
8 0 40
9 0 45 0
4 0 20
4 0 20
8 0 40 0
2 0 10
0 0 0
0 0 0
0 0 0
2 0 10 0
0
6 0 30
17 0 85
24 0 120
2 0 10
2 0 10
51 0 255 0
22 0 110
1 0 5
0 0 0
1 0 5
1 0 5
1 0 5
2 0 10
7 0 35
35 0 175 0
0 0 0
44 0 220
1 0 5
0 0 0
0 0 0
45 0 225 0
0
1 0 5
1 0 5 0
2 0 10
7 0 35
0 0 0
4 0 20
0 0 0
13 0 65 0
2 0 10
3 0 15
9 0 45
1 0 5
15 0 75 0
3 0 15
5 0 25
2 0 10
10 0 50 0
2 0 10
1 0 5
3 0 15 0
0
0 0 0
0 0 0 0
0 0 0
0 0 0
0 0 0 0
0 0 0
0 0 0
0 0 0
0 0 0 0
0
Category applicability Function capability

0
3 0

1
1

1
6 0

1
3 0

1
5 0

0
0 0
Question type   Answer value   Answer option
Yes/No 1 No
2 Yes
Optional 3 Not required

Yes/No/Unknown answer options


1 No
2 Unknown
3 Yes
4 Not required

Detailed 1 No
2 Partially
3 Averagely
4 Mostly
5 Fully
Optional 6 Not required

Completeness 1 Incomplete
2 Partially complete
3 Averagely complete
4 Mostly complete
5 Fully complete

Importance 1 None
2 Low
3 Normal
4 High
5 Critical

Weighing 1 x1
2 x2
3 x3
4 x4
5 x5

Occurrence 1 Never
2 Sometimes
3 Averagely
4 Mostly
5 Always

Satisfaction 1 No
2 Somewhat
3 Averagely
4 Mostly
5 Fully
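
The scales above define, for each question type, which numeric answers are allowed and what label each value carries. A minimal lookup-table sketch of a few of these scales is shown below; the dictionary layout and function name are illustrative assumptions, not part of the workbook.

# Answer scales as lookup tables (Python sketch, illustrative only)
ANSWER_SCALES = {
    "Yes/No":     {1: "No", 2: "Yes", 3: "Not required"},        # 3 only for optional questions
    "Detailed":   {1: "No", 2: "Partially", 3: "Averagely",
                   4: "Mostly", 5: "Fully", 6: "Not required"},   # 6 only for optional questions
    "Importance": {1: "None", 2: "Low", 3: "Normal", 4: "High", 5: "Critical"},
    "Weighing":   {1: "x1", 2: "x2", 3: "x3", 4: "x4", 5: "x5"},
}

def answer_label(question_type, answer):
    # Translate a stored numeric answer back into its label
    return ANSWER_SCALES[question_type][answer]

print(answer_label("Detailed", 4))   # "Mostly"
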
Charter document completeness
11 Incomplete
12 Partially complete
13 Partially complete
14 Partially complete
15 Averagely complete
16 Averagely complete
17 Averagely complete
18 Averagely complete
19 Mostly complete
20 Mostly complete
21 Mostly complete
22 Fully complete
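
One plausible reading of this scale (an assumption, not stated in the workbook) is that each of the eleven charter elements is answered as No = 1 or Yes = 2, so the summed answers range from 11 to 22 and that sum is translated into the label listed above. A short sketch of that interpretation follows; the same pattern would apply to the other completeness scales below, with the range shifting with the number of elements.

# Charter completeness lookup (Python sketch; the interpretation is an assumption)
CHARTER_COMPLETENESS = {
    11: "Incomplete",
    12: "Partially complete", 13: "Partially complete", 14: "Partially complete",
    15: "Averagely complete", 16: "Averagely complete",
    17: "Averagely complete", 18: "Averagely complete",
    19: "Mostly complete", 20: "Mostly complete", 21: "Mostly complete",
    22: "Fully complete",
}

def charter_completeness(element_answers):
    # element_answers: one value per charter element, 1 = No, 2 = Yes
    return CHARTER_COMPLETENESS[sum(element_answers)]

# Example: 6 of the 11 charter elements present -> sum = 6*2 + 5*1 = 17
print(charter_completeness([2] * 6 + [1] * 5))   # "Averagely complete"
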

Governance elements completeness


13 Incomplete
14 Partially complete
15 Partially complete
16 Partially complete
17 Partially complete
18 Averagely complete
19 Averagely complete
20 Averagely complete
21 Averagely complete
22 Mostly complete
23 Mostly complete
24 Mostly complete
25 Mostly complete
26 Fully complete

Cost management elements completeness


8 Incomplete
9 Partially complete
10 Partially complete
11 Averagely complete
12 Averagely complete
13 Averagely complete
14 Mostly complete
15 Mostly complete
16 Fully complete

SOC Management elements completeness


10 Incomplete
11 Partially complete
12 Partially complete
13 Partially complete
14 Averagely complete
15 Averagely complete
16 Averagely complete
17 Mostly complete
18 Mostly complete
19 Mostly complete
20 Fully complete

Role documentation completeness


8 Incomplete
9 Partially complete
10 Partially complete
11 Averagely complete
12 Averagely complete
13 Averagely complete
14 Mostly complete
15 Mostly complete
16 Fully complete

Training program completeness


6 Incomplete
7 Partially complete
8 Averagely complete
9 Averagely complete
10 Averagely complete
11 Mostly complete
12 Fully complete

Certification program completeness


3 Incomplete
4 Partially complete
5 Mostly complete
6 Fully complete

General documentation completeness


11 Incomplete
12 Partially complete
13 Partially complete
14 Partially complete
15 Averagely complete
16 Averagely complete
17 Averagely complete
18 Averagely complete
19 Mostly complete
20 Mostly complete
21 Mostly complete
22 Fully complete

General Maturity indicators completeness


12 Incomplete
13 Partially complete
14 Partially complete
15 Partially complete
16 Averagely complete
17 Averagely complete
18 Averagely complete
19 Averagely complete
20 Averagely complete
21 Mostly complete
22 Mostly complete
23 Mostly complete
24 Fully complete

Security Incident Management documentation completeness


11 Incomplete
12 Partially complete
13 Partially complete
14 Partially complete
15 Averagely complete
16 Averagely complete
17 Averagely complete
18 Averagely complete
19 Mostly complete
20 Mostly complete
21 Mostly complete
22 Fully complete

Security Incident Management Maturity indicators completeness


11 Incomplete
12 Partially complete
13 Partially complete
14 Partially complete
15 Averagely complete
16 Averagely complete
17 Averagely complete
18 Averagely complete
19 Mostly complete
20 Mostly complete
21 Mostly complete
22 Fully complete

Threat Hunting maturity indicators completeness


11 Incomplete
12 Partially complete
13 Partially complete
14 Partially complete
15 Averagely complete
16 Averagely complete
17 Averagely complete
18 Averagely complete
19 Mostly complete
20 Mostly complete
21 Mostly complete
22 Fully complete
