This deck was presented during a webinar session yesterday by Joe Basirico, VP of Security Services at Security Innovation.
I built it to build on the use cases presented in the part 1 session, present a five-step best-practices approach to prioritizing your software testing strategy, and then show how SI's TeamMentor product maps to remediating vulnerable code with the prescriptive guidance and code snippets that ship as part of TeamMentor's content libraries.
3. TODAY’S AGENDA
Development and Security teams are looking for a better process to fix
software vulnerabilities.
1. Challenges organizations face in identifying, verifying and fixing vulnerable software code.
2. Four defined use cases: where does your organization fit in?
3. Five key best practices to consider in determining your approach.
4. Practical demonstration:
‣ A series of simulated tests
‣ Measuring the impact of those results
‣ Interpreting test results
‣ Correlating results through TeamMentor for remediation
4. WHO WE ARE
Application Security Experts
• 10+ Years vulnerability research
• Security Testing Methodology adopted by
SAP, Microsoft, Symantec
• Authors of 8+ books
Products and Services
• Standards - Best Practices
• Education - CBT & Instructor-Led
• Assessment - Software and SDLC
Reducing Application Security Risk
• Critical Vulnerability Discovery
• Secure SDLC Rollout
• Internal Competency Development
5. OUR APPROACH
• Standards: Create security
policies, align dev activities with
standards and compliance
requirements, fix vulnerabilities.
• Education: Create internal
expertise through eLearning,
Instructor-led and virtual
classroom training.
• Assessment: Assess software
apps against online and other
threats and recommend
remediation techniques.
6. COMMON USE CASES
1. Development teams don’t know where to go for best-practices guidance on software vulnerabilities.
2. There’s a need to communicate and share intelligence around specific vulnerabilities with your team.
3. Teams need to fix vulnerabilities and map them to internal policies.
4. There’s a market need to make more sense of static analysis results and get to full-circle remediation.
7. WHERE CAN DEVELOPERS GO FOR
THE GUIDANCE THEY NEED?
Use Case I - Security Team
• A software vulnerability has been identified.
• You need to verify it and need more information about it.
• What do you do, and where do you go for guidance?
8. HOW CAN YOU SHARE THE
INFORMATION?
Use Case II - Security Team
• You’ve verified a software vulnerability.
• You need to communicate the details of that vulnerability or set of vulnerabilities to your team.
• How is this accomplished most effectively?
9. INTEGRATING WITH WHAT YOU
ALREADY HAVE
Use Case III - Development Team
• You’ve verified a given vulnerability, and can now prioritize it.
• You have internal knowledge, or security policies you need to map to.
• How can you do this in a streamlined way?
10. DOING MORE WITH YOUR
TEST RESULTS
Use Case IV - Development Team with Tools
• The tool reports findings.
• You need to make more sense of the results.
• The findings point to guidance specific to each finding.
• Fix what you’ve found. Re-scan.
11. DETERMINE YOUR RISK TOLERANCE
Understand your level of risk first; determine your applications second.
• Take an inventory of your high-risk applications.
• Determine the business criticality of those applications.
• What’s your attack probability, and how do you define your attack surface?
• Consider the overall business impact, security threats and compliance mandates.
• Rank your applications accordingly.
• Start thinking about the most effective set of testing tools.
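The ranking step above can be sketched as a simple weighted score. This is a hypothetical illustration, not part of the deck: the application names, factor scales (1-10), and weights are all assumptions.

```python
# Hypothetical risk-ranking sketch: score each application on business
# criticality, attack probability, and compliance impact (each 1-10),
# weight the factors, then rank. Names and weights are illustrative only.

def risk_score(criticality, attack_probability, compliance_impact,
               weights=(0.5, 0.3, 0.2)):
    """Weighted risk score; higher means test sooner and more deeply."""
    w_crit, w_attack, w_comp = weights
    return (criticality * w_crit
            + attack_probability * w_attack
            + compliance_impact * w_comp)

apps = {
    "payments-api":    risk_score(9, 8, 9),
    "intranet-wiki":   risk_score(3, 2, 1),
    "customer-portal": risk_score(7, 9, 6),
}

# Rank applications from highest to lowest risk.
ranked = sorted(apps, key=apps.get, reverse=True)
print(ranked)
```

The weights encode the slide's emphasis: business impact first, then attack probability, then compliance mandates; adjust them to your organization's risk tolerance.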
12. DEFINE DATA AND APPLICATIONS
Classify your data relative to sensitivity, usage and risk metrics.
Then prioritize your applications.
• How sensitive is the data in a given application?
• Does that data fall under internal mandates or federal regulations?
• Threat modeling can identify threats and attacks, and the frequency and severity with which they are executed.
• Rank and prioritize your applications accordingly.
• Compile the most effective set of testing tools.
13. PRIORITIZE YOUR APPLICATIONS
Rank your applications using a formulaic approach to measuring risk.
Application Criteria

Threat Rating | Sensitive Data | Lifespan | Compliance Stringency | Customer-Facing
Tier 1        | Restricted     | Long     | High                  | Yes
Tier 2        | Private        | Mid      | Medium                | Yes
Tier 3        | Public         | Short    | N/A                   | No
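One way to operationalize the criteria table above is a small classification function. The decision rules below are an assumed reading of the table (Restricted data or high compliance stringency forces Tier 1, and so on), not a rule set stated in the deck.

```python
# Hypothetical tier-assignment sketch based on the criteria table:
# sensitive-data class, compliance stringency, and customer exposure
# decide the threat-rating tier. The precedence rules are assumptions.

def assign_tier(sensitive_data, compliance, customer_facing):
    """Return the threat-rating tier for one application."""
    if sensitive_data == "Restricted" or compliance == "High":
        return "Tier 1"
    if sensitive_data == "Private" or customer_facing:
        return "Tier 2"
    return "Tier 3"

print(assign_tier("Restricted", "High", True))
print(assign_tier("Private", "Medium", True))
print(assign_tier("Public", "N/A", False))
```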
14. MAP ACTIVITY TO YOUR CRITERIA
Implement your security testing strategy.
Depth, Breadth, Frequency (each cell: Completeness/Frequency)

Threat Rating | Static Analysis             | Dynamic Analysis            | Manual Pen Test        | Threat Modeling
Tier 1        | Required/Major code changes | Required/Major code changes | Required/Per Milestone | Required/Per Release
Tier 2        | Suggested/Monthly           | Required/Quarterly          | Required/Per Release   | Suggested/Per Release
Tier 3        | Optional/Quarterly          | Required/Annually           | Optional/As Needed     | Optional/As Needed
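The activity matrix above can also be kept as data, so tooling or scripts can look up what each tier requires. The completeness/frequency values are copied from the slide; the dictionary structure itself is an illustrative assumption.

```python
# The slide's activity matrix expressed as a lookup table:
# tier -> activity -> (completeness, frequency).

ACTIVITY_MATRIX = {
    "Tier 1": {
        "Static Analysis":  ("Required", "Major code changes"),
        "Dynamic Analysis": ("Required", "Major code changes"),
        "Manual Pen Test":  ("Required", "Per Milestone"),
        "Threat Modeling":  ("Required", "Per Release"),
    },
    "Tier 2": {
        "Static Analysis":  ("Suggested", "Monthly"),
        "Dynamic Analysis": ("Required", "Quarterly"),
        "Manual Pen Test":  ("Required", "Per Release"),
        "Threat Modeling":  ("Suggested", "Per Release"),
    },
    "Tier 3": {
        "Static Analysis":  ("Optional", "Quarterly"),
        "Dynamic Analysis": ("Required", "Annually"),
        "Manual Pen Test":  ("Optional", "As Needed"),
        "Threat Modeling":  ("Optional", "As Needed"),
    },
}

completeness, frequency = ACTIVITY_MATRIX["Tier 2"]["Dynamic Analysis"]
print(f"{completeness}/{frequency}")  # Required/Quarterly
```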
15. SELECT YOUR TOOLS
Selecting your tool(s) should be the final step before you start testing.
• Apply your rankings to your tool selection.
• Determine your combination of automated vs. manual tools.
- Consider how many applications, how much code, and time-to-result.
- Do you need them to run on their own, or are they better used for a singular, manual purpose?
- Assume that automated tools cannot target business logic attacks.
• Interpret your scan results with remediation in mind.
16. SECURE DEVELOPMENT GUIDANCE
A Real-Time In-Practice Companion Containing 4500+ Articles
of Prescriptive Guidance and Code
17. TRY TEAMMENTOR TODAY!
Evaluation Version:
• OWASP Guidance Library (Creative Commons content)
• Install locally or use web version
• Watch a video: http://bit.ly/Vra3OS
• Download it: https://teammentor.net/
Enterprise and Partner Versions:
• Full set of guidance libraries (4500+ articles)
• Single user, cloud instance, business unit, and enterprise-wide
pricing available
• Partner organization licensing
• Contact us: getsecure@securityinnovation.com