While DevOps is becoming the new norm for most companies, security typically still lags behind. The new architectures create a number of new process considerations and technical issues. In this practical talk, we present an overview of the practical issues that go into making security a part of DevOps processes. We will cover incorporating security into existing CI/CD pipelines and the tools DevOps professionals need to know to implement automation and adhere to secure coding practices.
Join Stepan Ilyin, Chief Product Officer at Wallarm, for an engaging conversation where you’ll learn:
Methodologies and tooling for dynamic and static security testing
Software composition and OSS license analysis benefits
Secrets analysis and secrets management approaches in distributed applications
Security automation and integration in CI/CD
Protection of apps, APIs, and workloads in cloud-native, K8s-enabled environments
2. Whoami
Stepan Ilyin
● Co-founder and Chief Product Officer of Wallarm
● Based in SF
● Working on several products for F500
and web scale companies to
○ protect cloud-native applications and APIs
○ automate security testing in CI/CD pipelines
3. Agenda
● It’s not a vendor talk!
● Different approaches to automate security testing in CI/CD
● A recommended set of DevOps-friendly tools you can adopt
● Best practices for implementing them: how to make them work?
● Examples of the workflows you can apply
5. Trends and challenges
Trends
● Agile and DevOps
○ Short timelines and frequent changes
○ Automated pipelines
● Containers
● Cloud-hosted applications
● Open source
● APIs
Challenges
● Application Security Testing (AST) is too slow and requires too many manual steps
● False positives
● Hard to achieve complete testing coverage
● Limited remediation advice
● Hard to prioritize issues
9. AST criteria — What to keep in mind
● Integration
○ How easy is it to integrate into CI/CD?
● Accuracy
○ How many false positives does it produce?
● Speed
○ How fast is it? Does it slow down pipeline execution?
● Actionability
○ Signal-to-noise ratio; clear remediation guidance
10. Static testing (aka SAST)
● Scan code to identify insecure patterns and potential vulnerabilities
● Challenges
○ False positives and a lot of noise; requires tuning
○ Hard to distinguish exploitable issues from non-exploitable issues
○ Doesn’t have any runtime context (connections to other services, DBs, etc.)
● Deployment
○ Developer machine (as far left as possible)
■ IDE checks that work like a spell-checker
○ As part of CI
■ Scan diffs
■ Run full scans of the source code
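The core idea behind these checks is pattern matching over source text. A toy illustration of a SAST-style scan (the rule set and messages are invented for this sketch; real SAST engines use parsers and data-flow analysis, not bare regexes):

```python
import re

# Toy rule set: regex -> finding description (illustrative only)
RULES = {
    r"\beval\s*\(": "use of eval() can allow code injection",
    r"\bos\.system\s*\(": "shell command execution can allow command injection",
    r"""password\s*=\s*["'][^"']+["']""": "hard-coded password",
}

def scan_source(source: str):
    """Return (line_number, message) pairs for lines matching an insecure pattern."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for pattern, message in RULES.items():
            if re.search(pattern, line):
                findings.append((lineno, message))
    return findings

sample = 'user = "bob"\npassword = "hunter2"\nos.system("rm -rf /tmp/x")\n'
print(scan_source(sample))
```

Note how the result is an exact line number but no proof of exploitability, which is exactly the SAST trade-off described above.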
11. Static testing (aka SAST) — Pros and Cons
● Integration
○ Easy
● Accuracy
○ A lot of false positives
● Speed
○ Minutes to hours
● Actionability
○ Points to the exact line of code, but it’s hard to tell which issues are real.
13. Dynamic testing (aka DAST)
● Sends HTTP requests to the application under test
○ Library of payloads (SQL injections, XSS, etc.)
○ Fuzzing
● Good stuff
○ Finds exploitable issues (really exploitable ones)
○ Has runtime context (the application runs as-is, with connections to DBs, etc.)
● Challenges
○ Takes more time than SAST
○ Most products can’t scan APIs and single-page apps (Wallarm FAST can)
○ Most DAST tools are hard to integrate into CI/CD
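The payload-library approach can be sketched as: take a baseline request and re-issue it with each parameter replaced by an attack payload, one at a time. A minimal illustration (the payload list and URL are made up; real scanners ship far larger payload libraries and inspect the responses):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Tiny illustrative payload library
PAYLOADS = ["' OR '1'='1", "<script>alert(1)</script>", "../../etc/passwd"]

def fuzz_url(url: str):
    """Yield one mutated URL per (parameter, payload) pair, altering one param at a time."""
    parts = urlparse(url)
    params = parse_qsl(parts.query)
    for i, (name, _) in enumerate(params):
        for payload in PAYLOADS:
            mutated = params.copy()
            mutated[i] = (name, payload)  # swap in the attack payload
            yield urlunparse(parts._replace(query=urlencode(mutated)))

for test_url in fuzz_url("http://app.local/search?q=shoes&page=1"):
    print(test_url)
```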
14. Dynamic testing (aka DAST) — Requirements for CI/CD
● The longest-established class of tools on the market
● Most of the tools are built
○ For pentesters (meant to be driven manually)
○ For old-fashioned apps (when websites were easy to crawl; not anymore with SPAs)
● Requirements
○ Does it support integration into CI?
○ Can it test APIs (and SPAs)?
○ Speed
15. Dynamic testing (aka DAST) — CI/CD tool landscape
● OWASP Zap (OSS)
○ Integration: Console
○ API Testing: Challenging
● Burp Enterprise (Commercial)
○ Integration: API
○ API Testing: Challenging
● Wallarm FAST (Commercial) — DAST + Fuzzing
○ Integration: API
○ API Testing: Strong
16. DAST uses traffic of your existing tests
Improves security test coverage
● Tests SPAs and APIs
● Detects security issues, including OWASP Top 10
● Expandable without coding
● Fine-grained control via policy
Automates security testing
● Auto-generates tests using unit and functional tests as baselines
● Application-specific fuzzing
● Testing cycles optimized for time
● Configured and run by CI/CD
17. Dynamic testing (aka DAST) — Pros and Cons
● Integration
○ Test Automation
● Accuracy
○ High; requires less configuration
● Speed
○ Usually hours
● Actionability
○ Findings are usually relevant
○ But you still need to pinpoint the issues in the code
18. Interactive Application Security Testing (IAST)
● Runtime code analysis using instrumentation
● Looks at the code as it’s executed
● Can be deployed for 1-10% of your traffic
● Challenges:
○ Coverage is limited to what is executed
(Test automation scripts needed to drive application behavior)
○ Requires integration into CI/CD
○ Bound by source programming language and runtime environment
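The instrumentation idea, and the coverage limitation above, can be illustrated with Python's built-in tracing hook: the "agent" only sees code that actually executes, which is why test automation is needed to drive application behavior. A toy sketch (function names are invented; real IAST agents hook security sinks and track tainted data, not just calls):

```python
import sys

executed = set()  # function names observed at runtime

def tracer(frame, event, arg):
    # Record every Python function call the instrumented code makes
    if event == "call":
        executed.add(frame.f_code.co_name)
    return tracer

def build_query(user_input):
    # Insecure string concatenation: a real IAST agent would flag this SQL sink
    return "SELECT * FROM users WHERE name = '" + user_input + "'"

def unused_helper():
    return "never called"

sys.settrace(tracer)   # "instrument" the runtime
build_query("alice")
sys.settrace(None)     # detach the agent

# Coverage is limited to what actually ran: unused_helper is invisible to the agent
print("build_query" in executed, "unused_helper" in executed)
```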
19. Interactive Application Security Testing (IAST) — Tools for CI/CD
● Most of the solutions are commercial
○ Synopsys Seeker
○ Contrast Security Assess
20. Interactive testing (aka IAST) — Pros and Cons
● Integration
○ Quick, but requires support for your language/stack; needs test automation
● Accuracy
○ High; runtime context gives benefits
● Speed
○ Quick
● Actionability
○ Findings are usually relevant
21. Software Composition Analysis (SCA)
● Use SCA to reduce risk from third-party dependencies
● Map the dependency tree and find known vulnerabilities (CVEs) in all OSS dependencies
● Tools
○ Snyk
○ GitHub Security Alerts
○ SourceClear
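The core SCA operation is simple to sketch: resolve pinned dependencies and compare each version against an advisory database. The advisory table below is a tiny hand-made excerpt (the two CVEs are real, but a real tool pulls a full advisory feed and walks transitive dependencies too):

```python
# Hypothetical excerpt of an advisory DB: package -> (fixed-in version, CVE id)
ADVISORIES = {
    "requests": ("2.20.0", "CVE-2018-18074"),
    "flask": ("0.12.3", "CVE-2018-1000656"),
}

def parse_version(v: str):
    return tuple(int(x) for x in v.split("."))

def audit(requirements):
    """Return CVE hits for pinned 'name==version' requirement lines."""
    hits = []
    for line in requirements:
        name, version = line.split("==")
        if name in ADVISORIES:
            fixed_in, cve = ADVISORIES[name]
            if parse_version(version) < parse_version(fixed_in):
                hits.append((name, cve))
    return hits

print(audit(["requests==2.19.1", "flask==1.0.2"]))
```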
22. Secret detection
● Scan source code to find secrets hard-coded by developers
○ API Keys
○ AWS Keys
○ OAuth Client Secrets
○ SSH Private Keys
○ …
● Tools:
○ detect-secrets from Yelp (github.com/Yelp/detect-secrets)
○ git-secrets from awslabs (github.com/awslabs/git-secrets)
23. Detect secrets from Yelp
● Integration:
○ Pre-commit hook
○ CI to scan all repos
● Language agnostic
○ Python, Puppet, JavaScript, PHP, Java, etc.
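Under the hood, detectors of this kind are mostly regular expressions plus entropy heuristics. A minimal illustration with two well-known patterns (the detector set is a toy, far smaller than what detect-secrets ships; the sample key is AWS's documented example key, not a real credential):

```python
import re

# Illustrative detectors; real tools add entropy checks and many more patterns
DETECTORS = {
    "AWS Access Key ID": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "Private Key": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
}

def find_secrets(text: str):
    """Return (line_number, detector_name) pairs for suspected secrets."""
    findings = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for kind, pattern in DETECTORS.items():
            if pattern.search(line):
                findings.append((lineno, kind))
    return findings

sample = 'aws_key = "AKIAIOSFODNN7EXAMPLE"\nmsg = "hello"\n'
print(find_secrets(sample))
```

Wired into a pre-commit hook, a non-empty result would block the commit before the secret ever reaches the repo.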
24. Containers testing
● Performs detailed analysis of container images
● Lists all packages, files, and software artifacts, such as Ruby gems and Node.js modules
○ Package lists
○ Software installed manually (pip, rake, ...)
○ Leaked credentials
○ Hashes of known vulnerable files
○ Static binaries
25. Containers testing
● Anchore Engine (https://github.com/anchore/anchore-engine)
○ Jenkins plugin
○ REST API
○ CLI
● Clair from CoreOS team (https://github.com/coreos/clair)
● Banyan Collector (https://github.com/banyanops/collector)
● Klar (https://github.com/optiopay/klar)
○ Clair && Docker registry
● Snyk
● Red Hat OpenScap
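Whichever scanner you pick, the CI job typically ends by triaging its report. A sketch of that step over a simplified, hypothetical report format (real Anchore/Clair output is richer and shaped differently; the listed CVEs are real OpenSSL/curl advisories used as sample data):

```python
import json
from collections import Counter

# Hypothetical, simplified scan report shape for illustration only
report_json = """
{
  "image": "myapp:1.4.2",
  "vulnerabilities": [
    {"id": "CVE-2021-3711", "package": "openssl", "severity": "High"},
    {"id": "CVE-2020-8169", "package": "curl", "severity": "Medium"},
    {"id": "CVE-2019-5482", "package": "curl", "severity": "High"}
  ]
}
"""

def severity_counts(report: dict) -> Counter:
    """Tally image vulnerabilities by severity for pipeline gating."""
    return Counter(v["severity"] for v in report["vulnerabilities"])

report = json.loads(report_json)
counts = severity_counts(report)
print(report["image"], dict(counts))
```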
29. Prioritize. Or how to avoid backlog overload?
● Prioritize which vulnerabilities represent the highest risk and which may
be acceptable risks
● Avoid duplicate tickets → use tools to deduplicate and filter the findings
(vulnerability correlation and security orchestration tools)
○ DefectDojo (OSS)
○ Altran (Aricent), Code Dx, Denim Group, we45, ZeroNorth
30. Red flags vs Orange flags
● Security issues were found. Now what?
● Establish red flags and orange flags
Red Flag: really severe (e.g., SQL injection found by DAST)
● Stop the pipeline (fail).
● Do not deploy.
Orange Flag: less severe (e.g., a potential issue from SAST)
● Continue pipeline execution.
● Pull issue details into the backlog.
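This policy is easy to encode as a small gate step at the end of the pipeline. A sketch with invented finding types (map your own scanners' severities onto the two buckets):

```python
# Invented finding types for illustration; substitute your scanners' severities
RED = {"sql_injection", "rce", "auth_bypass"}         # fail the pipeline
ORANGE = {"weak_hash", "verbose_error", "sast_hint"}  # ticket it, keep going

def gate(findings):
    """Return (exit_code, backlog): exit code 1 blocks the deploy, 0 continues."""
    backlog = [f for f in findings if f["type"] in ORANGE]
    blocked = any(f["type"] in RED for f in findings)
    return (1 if blocked else 0), backlog

exit_code, backlog = gate([
    {"type": "weak_hash", "source": "SAST"},
    {"type": "sql_injection", "source": "DAST"},
])
print(exit_code, len(backlog))
```

In a CI job, the script would exit with that code so a red flag fails the build, while orange findings are pushed to the issue tracker.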
31. Infrastructure as Code
● Immutable instances / infrastructure
● Replace instead of patching
● CloudFormation and Terraform
○ Everything (infrastructure stack, network, subnets, instances inside subnets, bridges, NAT gateways) is defined in JSON/text
● Servers / instances — Chef, Ansible, Salt
● Containers — Docker files
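As a flavor of the "everything defined in text" point, a minimal Terraform fragment (all resource names and the AMI id are placeholders) describing a network and an instance:

```hcl
# Minimal sketch: a VPC, a subnet, and an instance defined entirely as text
resource "aws_vpc" "main" {
  cidr_block = "10.0.0.0/16"
}

resource "aws_subnet" "app" {
  vpc_id     = aws_vpc.main.id
  cidr_block = "10.0.1.0/24"
}

resource "aws_instance" "web" {
  ami           = "ami-0123456789abcdef0" # placeholder AMI id
  instance_type = "t3.micro"
  subnet_id     = aws_subnet.app.id
}
```

Because the whole stack lives in version control, patching means changing this text and replacing instances rather than mutating running servers.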