2. About me
Vice president of the ISSA Spain chapter.
www.issa-spain.org
Vice president of the FIST Conferences
association.
www.fistconference.org
Author of a number of articles:
Google: vaceituno wikipedia
Director of the ISM3 Consortium
The consortium promotes ISM3, an ISMS standard
ISM3 is the main source for this presentation.
www.ism3.com
3. Management vs Engineering
Security Engineering: Design and build systems
that can be used securely.
Security Management: Employ people and
systems (that can be well or badly engineered)
safely.
4. Targets vs Outcomes
Activity and Targets are weakly linked.
Targets:
+Security / -Risk
Trust
Activity:
Keep systems updated
Assign user accounts
Inform users of their rights
5. Definition
Metrics are quantitative measurements that can be
interpreted in the context of a series of previous or
equivalent measurements.
Metrics make management possible:
1. Measurement – Some call this “metrics” too.
2. Interpretation – Some call this “indicator”.
3. Investigation – (When appropriate, logs are key here)
Common cause
Special cause
4. Rationalization
5. Informed Decision
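The measurement-to-decision loop above can be sketched in code. This is a minimal, hypothetical illustration (the incident counts and the 3-sigma control rule are assumptions, not part of ISM3): a metric is interpreted against a series of previous measurements, and values outside normal common-cause variation are flagged for investigation as possible special causes.

```python
# Sketch of the measure -> interpret -> investigate loop, using a
# simple Shewhart-style control rule; the weekly counts are made-up data.
from statistics import mean, stdev

weekly_incidents = [12, 15, 11, 14, 13, 16, 12, 14, 13, 41, 15, 12]

baseline = weekly_incidents[:9]      # earlier measurements give the context
centre = mean(baseline)
limit = 3 * stdev(baseline)          # band of normal, common-cause variation

# Points outside the band suggest a special cause worth investigating.
flagged = [week for week, value in enumerate(weekly_incidents, start=1)
           if abs(value - centre) > limit]
print("special-cause weeks:", flagged)
```

Note that the "normal" band is learned from the data itself rather than set beforehand, which is the point made in the Interpretation slide below.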
6. Qualitative vs Quantitative Measurement
William Thomson (Lord Kelvin): “I often say that when you can
measure what you are speaking about, and express it in numbers, you know something
about it; but when you cannot express it in numbers, your knowledge is of a meager and
unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely, in your
thoughts, advanced to the stage of science, whatever the matter may be”:
Meaning: “What can’t be measured, can’t be managed”
7. Interpretation
It doesn’t make sense to set thresholds beforehand. You
have to learn what is normal to find out what is abnormal.
Thresholds can be fuzzy, producing false positives and false
negatives.
Example: 1000 students tested for HIV, 10 have it.
                        Have HIV   Don't have HIV
Test positive for HIV       9            99
Test negative for HIV       1           891
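Working through the table's numbers shows why false positives matter when the condition being tested for is rare (the base-rate effect): most positive results are false, even though the test itself is fairly accurate.

```python
# Counts taken from the HIV-test table above (1000 students, 10 infected).
true_pos, false_pos = 9, 99
false_neg, true_neg = 1, 891

precision = true_pos / (true_pos + false_pos)   # P(have HIV | test positive)
fp_rate = false_pos / (false_pos + true_neg)    # false-positive rate
fn_rate = false_neg / (false_neg + true_pos)    # false-negative rate

print(f"P(HIV | positive test) = {precision:.1%}")  # only ~8.3%
```

The same arithmetic applies to security alerts: an alarm with a 10% false-positive rate against mostly benign traffic will drown its true positives.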
9. Interpretation
Are outcomes better fit to their purpose?
Are outcomes getting closer or further from target?
Are we getting fewer false positives and false negatives?
Are we using resources more efficiently?
10. Rationalization
Is the correction/change working?
Is it cost effective?
Can we meet our targets with the resources we
have?
Are we getting the same outputs with fewer
resources?
12. Good Metrics are SMARTIED
S.M.A.R.T
Specific: The metric is relevant to the process being measured.
Measurable: Metric measurement is feasible with reasonable cost.
Actionable: It is possible to act on the process to improve the metric.
Relevant: Improvements in the metric meaningfully enhance the
contribution of the process towards the goals of the management
system.
Timely: The metric measurement is fast enough to be used effectively.
+Interpretable: Interpretation is feasible (there is comparable
data) with reasonable cost (false positives or false negatives
rates are low enough)
+Enquirable: Investigation is feasible with reasonable cost.
+Dynamic: The metric values change over time.
13. Fashion vs Results
Real Time vs Continuous Improvement
Management is far more than Incident Response.
Risk Assessment as a Metric
Only as useful as Investigation results.
Certification / Audit
Compliant / Not compliant is NOT a Metric.
14. What are good Metrics?
Activity: The number of outcomes produced in a time
period;
Scope: The proportion of the environment or system that
is protected by the process.
Update: The time since the last update or refresh of
process outcomes. (Are outcomes recent enough to be valid?)
Availability: The time during which a process performs as
expected upon demand (uptime), the frequency and
duration of interruptions, and the time interval between
interruptions.
Efficiency / ROSI: Ratio of outcomes to the cost of the
investment in the process. (Are we getting the same outcomes
with fewer resources? Are we getting more/better outcomes with the
same resources?)
15. What are good Metrics?
Efficacy / Benchmark: Ratio of outcomes
produced in comparison to the theoretical
maximum. Measuring the efficacy of a process implies
comparison against a baseline. (Are outputs better fit to
their purpose? Compare against industry/peers to show relative
position.)
Load: Ratio of available resources in actual use to
produce the outcomes, such as CPU load, repository
capacity, bandwidth, licenses and overtime hours
per employee.
Accuracy: Rate of false positives and false
negatives.
16. Examples
Activity:
Number of successful access attempts
Scope:
% of resources protected with Access Control
Update:
Time elapsed since the last successful access attempt
Availability:
% of time Access Control is available
Efficiency / ROSI:
Successful access attempts per euro
Efficacy / Benchmark:
Failed vs successful malicious access attempts.
Failed vs successful legitimate access attempts.
Load:
Mean and peak % of GB, Mb/s, CPU and licenses in use.
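A sketch of how the access-control metrics on this slide might be computed from log data. The field names and figures are purely illustrative assumptions, not part of ISM3 or any real log format:

```python
# Hypothetical access-control log: (attempt_succeeded, was_malicious).
attempts = [
    (True, False), (True, False), (False, False),
    (False, True), (False, True), (True, True),
]
protected_resources, total_resources = 180, 200   # assumed inventory counts
uptime_hours, period_hours = 716, 720             # assumed availability data
cost_eur = 5000                                   # assumed process cost

activity = sum(1 for ok, _ in attempts if ok)     # successful attempts
scope = protected_resources / total_resources     # share protected
availability = uptime_hours / period_hours        # share of time available
efficiency = activity / cost_eur                  # outcomes per euro

malicious_blocked = sum(1 for ok, mal in attempts if mal and not ok)
malicious_total = sum(1 for _, mal in attempts if mal)
efficacy = malicious_blocked / malicious_total    # blocked vs attempted

print(f"Scope {scope:.0%}, availability {availability:.1%}")
```

Each quantity here is a ratio or count over a time period, which is what makes it comparable against previous measurements and interpretable as a trend.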
17. Metrics and Capability
Undefined. The process might be used, but it is
not defined.
Defined. The process is documented and used.
Managed. The process is Defined and the
results of the process are used to fix and
improve the process.
Controlled. The process is Managed, and
milestones and resource needs are accurately
predicted.
Optimized. The process is Controlled, and
improvement leads to savings in resources.
19. Capability: Defined
Measurement - None
Interpretation - None
Investigation – (When appropriate, logs are key here)
Common cause (changes in the environment, results
of management decisions)
Special cause (incidents)
Rationalization for use of time, budget, people
and other resources – Not possible
Informed Decision – Not possible
20. Capability: Managed
Measurement: Scope, Activity, Availability
Interpretation:
Normal?, Successful?, Trends?
Benchmarking, How does it compare?
Efficacy.
Investigation (Common cause, Special cause)
Find faults before they produce incidents.
Rationalization… – Possible
Informed Decision – Possible
21. Capability: Controlled
Measurement: Load, Update
Interpretation
Can we meet our targets in time with the
resources we have?
What resources and time are necessary to
meet our targets?
Investigation
Find bottlenecks.
Rationalization…- Possible
Informed Decision, Planning – Possible
23. Metric Specification
Name of the metric;
Description of what is measured;
How the metric is measured;
How often the measurement is taken;
How the thresholds are calculated;
Range of values considered normal for the metric;
Best possible value of the metric;
Units of measurement.
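The specification fields above map naturally onto a structured record. The dataclass and the example values below are an illustrative assumption, not a format defined by ISM3:

```python
# One possible way to capture the metric-specification fields as a record;
# the class shape and the "patching lag" example are hypothetical.
from dataclasses import dataclass

@dataclass
class MetricSpec:
    name: str
    description: str
    measurement_method: str
    frequency: str
    threshold_rule: str
    normal_range: tuple      # (low, high), in `units`
    best_value: float
    units: str

patching_lag = MetricSpec(
    name="Update (patching)",
    description="Time since systems in scope were last patched",
    measurement_method="Query the patch-management inventory",
    frequency="Weekly",
    threshold_rule="Mean + 3 standard deviations over the last 12 weeks",
    normal_range=(0, 14),
    best_value=0,
    units="days",
)
print(patching_lag.name, patching_lag.normal_range, patching_lag.units)
```

Writing each metric down in this form makes the Interpretable and Enquirable criteria from the SMARTIED slide checkable before any data is collected.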
26. Metrics Representation
[Charts: "Access Rights Granted" per week, three views of the same
weekly series (weeks 1–46) at different axis scales.]
27. Using Metrics
[Chart: "Cumulative recommendations per responsible person (sum of
days)", by month (January–December), for Mr Blue, Mr Pink, Mr Yellow,
Mr Purple, Mr Soft Blue, Mr Red, Mr Green and Mr Orange.]
29. Creative Commons
Attribution-NoDerivs 2.0
You are free:
•to copy, distribute, display, and perform this work
Under the following conditions:
Attribution. You must give the original author credit.
No Derivative Works. You may not alter, transform, or build upon this
work.
For any reuse or distribution, you must make clear to others the license terms of this work.
Any of these conditions can be waived if you get permission from the author.
Your fair use and other rights are in no way affected by the above.
This work is licensed under the Creative Commons Attribution-NoDerivs License. To view a copy of
this license, visit http://creativecommons.org/licenses/by-nd/2.0/ or send a letter to Creative
Commons, 559 Nathan Abbott Way, Stanford, California 94305, USA.
30. With the sponsorship of:
THANK YOU
www.fistconference.org