The Thing That Should Not Be

  1. The Thing That Should Not Be: A glimpse into the dark future of web application security. Bruno Morisson <bm@integrity.pt>, IBWAS’10
  2. About me
     •  Consultant & Partner - INTEGRITY, Consulting & Advisory
     •  ~12 years in Information Security
     •  CISSP-ISSMP / CISA / ISO27001 Lead Auditor
     •  Background as a Linux/Unix sysadmin
     •  Background as a C developer
  3. Warning! This is all rather unscientific! Really. Consider yourself warned.
  4. If wishes were ponies…
     …security would be inherent to the applications.
     …there would be no (security) bugs.
     …we would all get along just fine.
  5. This is how they see us
  6. This is how we see them
  7. We’re all skewed! Security practitioners have a skewed vision of reality: we’re usually what regular people would call paranoid. Developers have a skewed vision of reality: they usually don’t care about (or understand) security issues.
  8. We’re all skewed! We believe everyone should care about security at least as much as we do. WE’RE WRONG!
  9. We’re all skewed!
  10. Security Mindset: “Good engineering involves thinking about how things can be made to work; the security mindset involves thinking about how things can be made to fail.” (Bruce Schneier)
  11. “We have a firewall on our internets”
  12. “We use usernames and passwords to access our web application”
  13. SSL
  14. Proof. Source: Cenzic Web Application Security Trends Report, Q1-Q2 2010, Cenzic Inc.
  15. More proof. Source: Cenzic Web Application Security Trends Report, Q1-Q2 2010, Cenzic Inc.
  16. Even more proof. Source: Verizon Data Breach Report 2010
  17. OWASP Top Ten
     •  Injection
     •  XSS
     •  Broken Authentication and Session Management
     •  Insecure Direct Object Reference
     •  CSRF
     •  Security Misconfiguration
     •  Insecure Cryptographic Storage
     •  Failure to Restrict URL Access
     •  Insufficient Transport Layer Protection
     •  Unvalidated Redirects and Forwards
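An editorial aside, not part of the deck: the first item on the list, injection, is the easiest to show in code. A minimal sketch follows, using Python’s built-in sqlite3 module; the table, column names, and input value are illustrative assumptions.

     # Hypothetical example: the same query built unsafely vs. with parameter binding.
     import sqlite3

     conn = sqlite3.connect(":memory:")
     conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
     conn.execute("INSERT INTO users VALUES ('alice', 's3cr3t')")

     user_input = "alice' OR '1'='1"

     # Vulnerable: attacker-controlled input becomes part of the SQL statement.
     #   conn.execute("SELECT secret FROM users WHERE name = '" + user_input + "'")

     # Safer: the driver binds the value, so it is never parsed as SQL.
     rows = conn.execute("SELECT secret FROM users WHERE name = ?", (user_input,))
     print(rows.fetchall())  # [] -- the injection attempt matches nothing

The point is not this specific API: every mainstream database driver offers parameter binding, and the secure call is no harder to write than the insecure one.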
  18. How are we solving this? The typical approach is forcing developers to solve all of these problems. But the question is: who are the developers? Do they understand the problem? Most of them know nothing about security. Some of them know little about web development.
  19. (image-only slide)
  20. Render Unto Caesar… Security practitioners are not web developers. Why should web developers be security practitioners?
  21. Flashback
  22. Let’s party like it’s 1999. Most security vulnerabilities had to do with services:
     •  HTTP (IIS, Apache)
     •  FTP (wu-ftpd, IIS)
     •  POP3 (Qpopper)
     •  SMTP (Sendmail)
     •  DNS (BIND)
     •  Telnet
     •  SSH
     •  …
     Buffer overflows, format strings, and integer overflows were the flavor of the decade…
  23. What happened? Security vulnerabilities had global impact. Few companies/groups produced that software: Microsoft, Apache, Sun, Sendmail, the Linux community/vendors. Some built security into the process (Secure SDL), mainly Microsoft. Tools started gaining security features (from bounds checking to static and dynamic code analysis). Operating system security was improved (there was no ASLR or DEP back then).
  24. Back to the future
  25. And now? The impact of a vulnerability is limited to that company (or the set of companies that use that particular software). Anyone develops web applications. There is a myriad of development languages. Point-and-click frameworks automagically create code…
  26. Looking into the future… Let’s break this down into 4 areas:
     •  Compliance
     •  Processes
     •  People
     •  Tools
  27. Compliance. Unless there’s a business requirement, don’t expect anyone to implement security. Examples: PCI-DSS, data privacy laws, …
  28. Processes. If security is done ad hoc, it will most surely fail.
     •  Embed security in the SDL
     •  Create internal processes for dealing specifically with security (e.g. risk assessment, engineering, testing, etc.)
  29. (image-only slide)
  30. People. Developers won’t build security into the apps unless it’s a requirement… They need to:
     •  understand the security impact
     •  know how to solve the problem
     •  know how to use the tools…
     Developers won’t become security gurus.
  31. Tools. People fail. Tools/frameworks should become more idiot-proof. Have security built in by default. Force insecurity to be explicit.
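An editorial aside, not part of the deck: a concrete instance of “secure by default, insecure only when explicit” is output escaping in templating engines. A minimal sketch, assuming the Jinja2 library (not mentioned in the slides):

     # With autoescaping on, the safe path needs no extra effort from the developer.
     from jinja2 import Environment

     env = Environment(autoescape=True)

     tmpl = env.from_string("<p>Hello {{ name }}</p>")
     print(tmpl.render(name="<script>alert(1)</script>"))
     # -> <p>Hello &lt;script&gt;alert(1)&lt;/script&gt;</p>

     # Emitting raw HTML requires an explicit opt-out, which stands out in code review.
     raw = env.from_string("<p>{{ snippet | safe }}</p>")
     print(raw.render(snippet="<b>only if this is truly trusted</b>"))
     # -> <p><b>only if this is truly trusted</b></p>

Forcing the dangerous case to be spelled out (the "| safe" filter) is exactly the “force insecurity to be explicit” property the slide asks for.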
  32. (image-only slide)
  33. Thank You! Q&A?
     Bruno Morisson, CISSP-ISSMP, CISA, ISO27001 LA
     [email]: bm@integrity.pt
     [work]: http://www.integrity.pt/
     [fun]: http://genhex.org/~mori/
