This document discusses key performance indicators (KPIs) for measuring the success of application security initiatives. It provides example metrics in six areas: product security quality and risk exposure, security development lifecycle (SDLC) maturity, application security testing, consulting, training, and DevSecOps. The document recommends starting by measuring a few basic metrics and improving data over time. It emphasizes clear roles and accountability, and communicating risks financially rather than through complex assessments.
4. rGrupe:|:applicationsecurity
INTRO: Management Concerns
What managers want, or need, to know is …
•How does the performance of our security practice compare with other organizations?
•Truthfully, how good or bad is the security quality of our applications in production, and of our development process, right now?
•Is the security quality of our applications getting better or worse?
•What really needs to be fixed ASAP, and what’s important to address next?
•What don’t we know: what are we not actively monitoring and managing that we should be?
•Are we paying enough attention to our production-deployed applications to ensure that they are adequately hardened, to the best of our ability, against malicious attacks and unintended sensitive data exposure?
•What isn’t as effective as we thought it would be, and why is that?
•What should we do more of, less of, or stop doing?
WHY: Executive Managers
• What is the organization’s current accepted financial risk exposure from
malicious attack or unintentional sensitive information disclosure?
• What is our confidence that the security performance of all application
sources is being monitored?
◦ What % is unknown: not identified, monitored, or quantifiable?
◦ Which applications and components are not currently being actively managed?
• Is security risk improving or getting worse?
◦ Current exploits in production
◦ New exploits being added by new applications and update releases
• Are our security investments cost-effective and delivering value as
forecasted?
◦ Reduced cost
◦ Improved productivity/velocity
◦ Improved security quality / reduced security risk
WHY: Teams
•Software Development Teams
◦ What is our Insecurity Tech Debt?
◦ What exploitable vulnerabilities are in production in the applications and components we own?
◦ What is our Release Security Quality, for each of our production releases?
•DevOps Teams
◦ Are our tooling and process changes improving or worsening:
◦ security attack risks?
◦ release velocity?
◦ costs?
•Information Security Teams
◦ What are the ROIs and KPIs of security quality processes and initiatives
◦ (with tooling costs, including maintenance and support)?
THE WHAT: AppSec & DevSecOps KPI Metrics
To be reviewed monthly:
1. Product Security Quality & Business Financial Risk Exposure
2. SSDLC Maturity Organizational Performance
3. AppSec QA Testing
4. AppSec Consulting
5. AppSec Training
6. DevSecOps
WHAT: Product Security Quality & Business Financial Risk Exposure
•Data Sources: Vulnerability and non-compliant process issues from …
• Compliance audit findings (internal and external)
• Penetration testers (internal and external)
• Bug bounty / independent issue report submissions
•Product Security Quality Metrics
• % releases with penetration tests (full, not partial quick checks)
• # Vulnerabilities by type category
(filterable by severity and finding source)
• e.g. Injection, Cross-Site Scripting, etc.
• $$ Insecurity Tech Debt
• Production vulnerability remediation costs $$
• Calculated average for the organization (~$2,000? each, including management/overheads)
• $$ SSDLC Avoidance Lost Opportunity
• Cost to fix had the issue been detected earlier within the SSDLC (shift-left)
• $$ Financial Risk Exposure (lost business, legal redress, compliance fines, etc.)
• Calculated by GRC per product/component (review and update as needed quarterly)
NOTES:
1. Any Audit or Pen Test finding indicates a failure of security requirements
definition from GRC or SSDLC practice.
2. For vulnerability types, use CWE Software Development categories.
OWASP Top Ten survey report categories change every few years.
MITRE CWEs (Common Weakness Enumerations) provide mapping to OWASP Top Ten,
and also includes others not in the Top Ten.
3. A vulnerability detected before code is deployed into production is usually not a
business risk, because the dev team still has an opportunity to fix it prior to exposure.
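The dollar metrics above reduce to simple multiplications. A minimal sketch, assuming the deck's placeholder figures (~$2,000 average remediation cost per production vulnerability, ~$1,000 if caught earlier in the SSDLC); both constants and function names are illustrative, not an established formula:

```python
# Sketch of "Insecurity Tech Debt" and the shift-left lost-opportunity cost.
# Dollar values are the document's illustrative placeholders, not benchmarks.

AVG_REMEDIATION_COST = 2_000   # per production vulnerability, incl. overheads (~$2,000?)
AVG_SHIFT_LEFT_COST = 1_000    # assumed cost if caught earlier in the SSDLC (~$1,000?)

def insecurity_tech_debt(open_prod_vulns: int) -> int:
    """Estimated cost to remediate all open production vulnerabilities."""
    return open_prod_vulns * AVG_REMEDIATION_COST

def shift_left_lost_opportunity(open_prod_vulns: int) -> int:
    """Extra cost incurred because issues were not caught earlier (shift-left)."""
    return open_prod_vulns * (AVG_REMEDIATION_COST - AVG_SHIFT_LEFT_COST)

print(insecurity_tech_debt(40))          # → 80000
print(shift_left_lost_opportunity(40))   # → 40000
```

Update the two constants quarterly with the organization's own averages, per the GRC review cadence described above.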
WHAT: SSDLC Maturity Organizational Performance
What is the maturity of the organization’s compliance with its
Secure Software Development Life Cycle (SSDLC) process?
• E.g. Design Threat Assessment, SAST, DAST, Security Code Review,
Security UAT, Production Penetration Test?
•Data sources:
• Project management ticketing system (e.g. Jira tasks)
• Software CI/CD release orchestration (e.g. Jenkins logs)
•Production Releases SSDLC Compliance Maturity Metrics
(OWASP SAMM framework as starting point for org)
• % Level 0 NONE/NON-COMPLIANT – or not currently measured
• % Level 1 LOW – some SSDLC tasks with at least 1 AppSec test type
• % Level 2 PARTIAL - some SSDLC tasks with more than 1 AppSec test
• % Level 3 FULL – evidence for all SSDLC applicable tasks
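The four levels above can be assigned per release from its SSDLC evidence. A hypothetical classifier sketch; the input fields (task counts, test-type counts, a "all applicable tasks done" flag) are assumptions about what the ticketing/CI data would yield, not a defined schema:

```python
# Hypothetical mapping of a release's SSDLC evidence to the four
# compliance maturity levels (OWASP SAMM as a starting point).
from collections import Counter

def maturity_level(ssdlc_tasks_done: int, appsec_test_types: int,
                   all_applicable_tasks_done: bool) -> int:
    if all_applicable_tasks_done:
        return 3  # FULL: evidence for all applicable SSDLC tasks
    if ssdlc_tasks_done > 0 and appsec_test_types > 1:
        return 2  # PARTIAL: some tasks, more than one AppSec test type
    if ssdlc_tasks_done > 0 and appsec_test_types >= 1:
        return 1  # LOW: some tasks, at least one AppSec test type
    return 0      # NONE/NON-COMPLIANT, or not measured

# Aggregate the % of production releases at each level.
releases = [(0, 0, False), (4, 1, False), (4, 2, False), (6, 3, True)]
counts = Counter(maturity_level(*r) for r in releases)
pct = {lvl: counts[lvl] / len(releases) for lvl in range(4)}
print(pct)   # → {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}
```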
WHAT: AppSec QA Testing
AppSec QA Testing in the SSDLC
• Types
• Manual Methods: Design Threat Assessment, Security Code Review
• Automated Tools: SAST (code), DAST (running UI), and OSA (Open Source Analysis)
• Data sources:
• Manual: Project management ticketing system (e.g. Jira tasks)
• Automation: Software CI/CD release orchestration (e.g. Jenkins logs)
• Metrics
• Usage
• % Production Releases that performed each type of AppSec QA test
• Effectiveness:
• # Vulnerability Bugs Found by Severity and Category
• # bugs (true positives) = # potential issues detected - # false positives
• $$ Cost Savings Value: (pre-Pen Test SSDLC shift-left value ~$1,000??)
• # Vulnerabilities Released into Production (e.g. not fixed)
• Reporting: Important information to include
• Confidence
• Specify any exclusions
• Pipelines or repository types
• File types (e.g. languages), large file sizes, types, etc.
• Sensitivity
• Tool sensitivity setting used: (max, medium, low)
• include on metrics reports to clarify
• Higher sensitivity creates more false positives
NOTES:
KPIs should be based on bugs, not on raw tool-reported findings, because:
a) false positives are eliminated, and
b) it ensures issues are in teams’ backlogs for fix planning.
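The effectiveness formulas on this slide can be sketched directly. A minimal example using the deck's placeholder ~$1,000 shift-left value per bug; the constant and function names are illustrative:

```python
# Sketch of the testing-effectiveness metrics:
# true positives = potential issues detected - false positives,
# cost savings  = bugs fixed pre-release * placeholder shift-left value.

SHIFT_LEFT_VALUE = 1_000  # placeholder pre-pen-test savings per bug (~$1,000??)

def true_positive_bugs(issues_detected: int, false_positives: int) -> int:
    """Confirmed bugs after triage removes false positives."""
    return issues_detected - false_positives

def shift_left_savings(bugs_fixed_pre_release: int) -> int:
    """Estimated value of catching bugs before penetration testing."""
    return bugs_fixed_pre_release * SHIFT_LEFT_VALUE

bugs = true_positive_bugs(120, 45)
print(bugs, shift_left_savings(bugs))   # → 75 75000
```

Note that the true-positive count depends on the tool sensitivity setting reported alongside the metric: higher sensitivity raises both the detected count and the false-positive count.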
WHAT: AppSec Consulting
•Metrics
• Design Threat Assessments
• # Releases
• # Security requirements added to project backlog(s) by AppSec team members
• E.g. missed secure-design “bugs” detected by manual inspection
• $$ SME Cost Savings
• SSDLC early identification (shift-left) (??$3,000 each)
• $$ Business Financial Risk Avoidance
• As calculated by GRC for the application, for each release/project (??$10,000)
• Consulting (Design Threat Assessments, SSDLC process, training, etc.)
• # Teams supported by AppSec team members’ consulting
• # Hours/story-points of AppSec team members’ consulting
• $$ G&A services value contribution (staff hourly rates)
•Data sources:
• Manual: Project management ticketing system (e.g. Jira tasks)
WHAT: AppSec Training
•Metrics: Training Program (SSDLC & Secure Coding Compliance)
• For Individuals
• # Dev team members trained/certified
• Programmers, testers, and project/delivery managers
• % staff trained/certified
• Producers, managers (1st level), executives (2nd + levels)
• Because Security is everyone’s responsibility, with sponsors’ accountability.
• For Dev Teams
• # Dev Teams with 1+ trained programmer
• % Dev Teams with 1+ trained programmer
•Reporting Trend Analysis
• Dev Team Performance Effectiveness
• # Pen Test Findings trend for product/project/team
• Types of vulnerabilities to focus training and consulting
• # Vulnerabilities Released into Production
• SSDLC compliance maturity
• Measure only currently employed trained/certified staff
• not staff who have departed
• Exclude expired certifications (e.g. ??2-year renewal period)
•Data sources:
• Project management ticketing system (e.g. Jira tasks)
• Learning Management System (LMS)
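The team-coverage metrics above (# and % of dev teams with at least one currently certified programmer, excluding departed staff and expired certifications) can be sketched as below. The data shape and the 2-year renewal period are assumptions taken from the deck's own hedged placeholder:

```python
# Sketch of training-coverage metrics per dev team, excluding expired
# certifications. Data shape is an illustrative assumption, not an LMS schema.
from datetime import date, timedelta

RENEWAL_PERIOD = timedelta(days=730)  # assumed ??2-year renewal period

def cert_valid(cert_date: date, today: date) -> bool:
    """A certification counts only within the renewal period."""
    return today - cert_date <= RENEWAL_PERIOD

def team_coverage(teams: dict, today: date):
    """teams: {team_name: [cert_date or None, per programmer]}.
    Returns (# teams with 1+ valid cert, % of teams covered)."""
    covered = sum(
        any(d and cert_valid(d, today) for d in members)
        for members in teams.values())
    return covered, covered / len(teams)

teams = {"alpha": [date(2024, 6, 1), None],   # one current cert
         "beta":  [date(2020, 1, 1)]}         # cert expired
print(team_coverage(teams, date(2025, 1, 1)))   # → (1, 0.5)
```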
WHAT: DevSecOps
KPI Metrics for every software deployment into Production
(Alignment with Google DORA State of DevOps survey reports key metrics.)
•Data sources:
• Software CI/CD release orchestration (e.g. Jenkins logs)
•Metrics
1. FDR (Failed Deployment Rate):
# releases blocked from deployment to production due to SSDLC non-compliance
2. LTC (Lead Time for Changes):
HH:mm:ss from initial code commit to running in production
3. DER (Defect Escape Rate):
% security bugs released into production (by severity)
4. DV (Defect Volume):
# security bugs released into production
$$ increased financial risk
5. SLA (Service Level Agreement) Compliance:
% SSDLC compliance (full, partial, none)
6. CTV (Customer Ticket Volume):
# GRC UAT (à la Pen Test) findings (by severity)
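Several of these deployment KPIs can be computed straight from release records. A minimal sketch under assumed field names (`blocked`, `lead_time`, `security_bugs_escaped` are hypothetical, not a real CI/CD log schema):

```python
# Sketch computing FDR, DER, DV, and mean LTC from hypothetical
# per-release records pulled from CI/CD orchestration logs.
from datetime import timedelta

releases = [
    {"blocked": False, "lead_time": timedelta(hours=5), "security_bugs_escaped": 1},
    {"blocked": True,  "lead_time": None,               "security_bugs_escaped": 0},
    {"blocked": False, "lead_time": timedelta(hours=3), "security_bugs_escaped": 0},
]

total = len(releases)
fdr = sum(r["blocked"] for r in releases) / total                    # Failed Deployment Rate
der = sum(r["security_bugs_escaped"] > 0 for r in releases) / total  # Defect Escape Rate
dv = sum(r["security_bugs_escaped"] for r in releases)               # Defect Volume
deployed = [r for r in releases if not r["blocked"]]
ltc = sum((r["lead_time"] for r in deployed), timedelta()) / len(deployed)  # mean Lead Time

print(f"FDR={fdr:.0%} DER={der:.0%} DV={dv} LTC={ltc}")
# → FDR=33% DER=33% DV=1 LTC=4:00:00
```

Blocked releases are excluded from LTC since they never reached production; whether to count them differently is an organizational choice.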
WHAT: DevOps Security
NOTE:
The DevOps pipeline is a potential vector for malicious exploits,
and should be subjected to the same level of security review and
management scrutiny as in-house developed applications.
Controls to prevent misuse and malware being injected into
application code/builds:
• Settings & configuration file changes – security code reviews
• Merge/build/release actions – separation of duties
• Customized SAST scanning for security requirements compliance
verification and non-compliance notification
• Based on a threat analysis of the complete pipeline and all its components (e.g. SCM,
orchestration, etc.)
HOW: Where to Start
“A journey of a thousand miles begins with a single step.”
Don’t get overwhelmed, or try to implement all of these at once.
Start with what you have and know now;
then prioritize, divide, and deliver.
Don’t get stuck in “analysis paralysis” for $$ estimates.
Start right now with 60-second initial guesses;
then update values with better estimates when available.
HOW: Organizational Responsibilities & Accountability
Ensure Clarification of Expectations: Identify and Document
•Who does the board of directors hold accountable
for the security quality of production products?
•Who is responsible for prioritizing security hardening
initiatives and funding allocation?
Initiatives Roles Matrix
to be reviewed quarterly (to update initiatives & names),
in RASIC format:
• Responsible – for delivery management
• Accountable – approval authority - sponsor
• Supporting – teams doing work
• Informed – input and status, but not responsible
• Consulted – input, acceptance criteria, influence
HOW: A Word About “Risk”
The terms security “risk” and “risk assessment”
are not intrinsically or universally understood.
• Managers and staff are not sure what simplified security risk
“classifications” really mean to the business.
• Military-style alert-level classifications
• Red / Orange / Yellow / Green
• Level 1 / 2 / 3 / 4
• etc.
• “What is the potential business impact?” – not sure; someone else’s problem
• InfoSec professionals’ understanding and definitions of “risk assessments” vary
• Practiced differently across organizations
• Traditional legacy InfoSec risk assessment practices and calculations
are based on high-governance, Waterfall-managed projects.
• Manual calculation and review takes too long and isn’t scalable for Agile CI/CD:
• multiple releases per week, per product/component
• requires trained risk-analyst specialists engaged with every project and release for analysis and
calculation
Instead, use “Financial Risk Exposure” $$:
• Puts risks into financial terms understood by management, accounting for
insurance/financial reserve requirements and impact.
• HOW TO compute: GRC dept. quarterly review and assignment of
• $$ for security incident response and potential loss – per application/component
• $ for High/Medium/Low production vulnerabilities
• HOW TO use
• Include with product/project financial status summaries, so it can be used for
• ROI business-case performance evaluation
• remediation prioritization and funding decisions
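The "HOW TO compute" steps above reduce to a per-application rollup: a GRC-assigned base exposure plus per-severity values for open production vulnerabilities. A minimal sketch; all dollar values are invented placeholders standing in for the GRC department's quarterly assignments:

```python
# Sketch of the Financial Risk Exposure rollup. The per-severity values
# and base exposure are placeholder assumptions for GRC-assigned figures.

SEVERITY_EXPOSURE = {"high": 10_000, "medium": 2_500, "low": 500}  # assumed $

def financial_risk_exposure(app_base_exposure: int, open_vulns: dict) -> int:
    """app_base_exposure: GRC-assigned incident-response/potential-loss estimate.
    open_vulns: {"high": n, "medium": n, "low": n} currently in production."""
    return app_base_exposure + sum(
        SEVERITY_EXPOSURE[sev] * count for sev, count in open_vulns.items())

print(financial_risk_exposure(50_000, {"high": 2, "medium": 4, "low": 10}))
# → 85000
```

Because every input is a dollar figure, the result drops directly into product/project financial summaries for ROI evaluation and remediation prioritization.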
HOW: Finally, Organization Reporting
•Create a single URL page
• 24/7 access with latest status
• Accessible by all owners and stakeholders
•Use an enterprise graphical reporting tool to
• Aggregate data from multiple data sources
• Drill down with filtering by
• Product/application components
• Organization-chart product/component ownership
• Filtering options for analytic insights
• Vulnerability: quantity, severity, financial risk exposure,
cost, cost savings, etc.
• Include
• Unknown/not-measured data (or average approximations)
• Provides full-scale context
• Exclusions & sensitivity information notes
• Provides data-confidence insights (or enhancement needs)
• Use 12-month trailing trend graphs
• Use stacked histograms to visually show high/medium/low
• Include Critical with High ??
• With the previous year as background for seasonal trend variances.
Presentation Title:
AppSec & DevSecOps Metrics: Key Performance Indicators (KPIs) to Measure Success
Description:
This session is for organizational executive managers and security teams who want to know the effectiveness and performance of their organization’s application security initiatives.
Introductory performance KPI metrics covered for:
Product Security Quality & Business Financial Risk Exposure
SSDLC Maturity Organizational Performance
AppSec QA Testing
AppSec Consulting
AppSec Training
DevSecOps
Tags (500 characters max):
AppSec DevSecOps Metrics Key Performance Indicators KPIs ,
Robert Grupe, Grupe, CISSP, CSSLP, PMP,
training, how to, tutorial, Vlog,
agile, lean, scrum, kanban,
appsec, devsecops, Infosec, Cyber Security, SSDLC, OWASP, Security, Protection, CISO,
SDLC, SSDLC, “best practice”, metrics, measurement, KPI, KPIs,
Compliance, PCI, HIPAA, GDPR
This presentation available on:
SlideShare @ https://www.slideshare.net/rgrupe
YouTube Channel “AppSec & DevSecOps” @ https://www.youtube.com/channel/UCZf4TvI-FIWUyBYTTvDhiuQ
Lots of claims are made about software application security (AppSec) and
applying security into software development CI/CD pipeline operations (DevSecOps),
but how does anyone really know if those initiatives are delivering real business value?
Can your organization easily answer the following questions, within 5 minutes, without assistance from an SME?
NOTE: Any vulnerability or risk that exists in production systems has been accepted by the organization.
These metrics are specific to organization developed applications by in-house programmers,
not anything acquired and used from 3rd parties (e.g. not COTS: Commercial Off The Shelf software).
The following are some starting baseline metrics that executive and software development management should review monthly for
financial risk acceptance and
continuous improvement initiative sponsorship, prioritization, and investment.
SAST (Static Application Security Testing), DAST (Dynamic Application Security Testing), and OSA (Open Source Analysis)
Super important:
What should we be doing?
Ensure, top-down, that everyone is on the same page.
Quarterly reviews, because of:
staff/responsibility changes, and
prioritization changes (pop-up additions).
NOTE: In large organizations this may be difficult to do within a few weeks.
Don’t give up:
work on implementing metrics in parallel;
include the best current draft in quarterly management status reporting;
build familiarity and demonstrate value;
get input on missing information.
Thank you for your time. I hope you’ve found this interesting or helpful.
This presentation is available on my
SlideShare @ https://www.slideshare.net/rgrupe
YouTube Channel “AppSec & DevSecOps” @ https://www.youtube.com/channel/UCZf4TvI-FIWUyBYTTvDhiuQ
Please
Like and share,
Leave a comment, or
Contact me via email
If you are interested in receiving presentation updates
and a summary roundup of weekly AppSec & DevSec articles that I found interesting,
Subscribe to my AppSecNewsBits e-newsletter.