As legislators continue to expand the scope of the laws governing information security, we will take a look at some of the new European-level laws in this area from an open source perspective, and consider their impact on OSS management practices. The session will focus on the General Data Protection Regulation, not only because it applies to everyone, but also because its requirements are in many ways the most detailed and prescriptive. During the session we will also touch on some industry-specific developments like the Network and Information Security Directive and the Electronic Identification Regulation. Dan will cover what the new laws say (and perhaps more importantly what they don’t say), how to go about applying them to your OSS management regime, and what you might need to think about changing as a result.
New Security Legislation & Its Implications for OSS Management
1. New EU security legislation and
open source management
Dan Hedley, Irwin Mitchell LLP
2. Some scene setting
79,000 known OSS vulnerabilities, 30,000 reported since 2000
96% of applications scanned contained open source components, with an average of 147 unique
components per application
67% of applications scanned had known open source vulnerabilities
Those vulnerabilities known on average for more than four years
• e.g. almost 200,000 devices with the Heartbleed vulnerability still on the Internet (Shodan
2017)
OSS vulnerabilities are attractive targets:
• OSS is ubiquitous
• Victim often doesn’t know OSS is there
3. • For everyone:
• General Data Protection Regulation
• For some businesses:
• NIS Directive
• Electronic Identification Regulation
Why this matters – some key new legislation
4. • Comprehensive reform of data privacy law
• In force 25 May 2018
• NEW – extra-territorial effect
• NEW – direct obligations for processors
• NEW – more detailed and prescriptive security requirements
• NEW – obligation to document security decisions and processes
• NEW – mandatory detailed breach reporting for everyone (most of the time)
GDPR – what it is and what it does
5. APPLIES TO:
“the processing of personal data wholly or partly by automated means and to the processing other
than by automated means of personal data which form part of a filing system or are intended to form
part of a filing system”
“processing” = “any operation or set of operations which is performed on personal data or on sets of
personal data, whether or not by automated means, such as collection, recording, organisation,
structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission,
dissemination or otherwise making available, alignment or combination, restriction, erasure or
destruction”
“personal data” = “any information relating to an identified or identifiable natural person (‘data
subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular
by reference to an identifier such as a name, an identification number, location data, an online
identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic,
cultural or social identity of that natural person”
GDPR – subject matter scope (art 2)
6. • Applies if processing takes place in the context of the
activities of an establishment in a member state (regardless
of data location).
• ALSO applies if NO establishment in a member state BUT:
• Offering goods or services to data subjects located in
member states (no payment required)
• Monitoring behaviour of data subjects in member states
GDPR – territorial scope (art 3)
7. • Under current law, data processors (e.g. IT service providers) have no
direct liability to data subjects or regulators.
• Only exposure is contractual, to the controller
• Failure to deal with contractually = controller’s problem
• Under GDPR, processors have direct obligations and direct liability for a
range of matters, including obligations to secure data, to ensure a
compliant contract with the controller governing that data is in place, & to
cooperate with the regulator.
GDPR – application to data processors
8. “Personal data shall be processed in a manner that ensures appropriate
security of the personal data, including protection against unauthorised
or unlawful processing and against accidental loss, destruction or
damage, using appropriate technical or organisational measures”
• Fleshed out more in art 32 (next slides)
• Burden of proof of compliance on data controller
GDPR – security principle - art 5(1)(f)
9. 1. Taking into account the state of the art, the costs of implementation and the nature, scope, context and
purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural
persons, the controller and the processor shall implement appropriate technical and organisational measures to
ensure a level of security appropriate to the risk, including inter alia as appropriate:
(a) the pseudonymisation and encryption of personal data;
(b) the ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and
services;
(c) the ability to restore the availability and access to personal data in a timely manner in the event of a physical
or technical incident;
(d) a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational
measures for ensuring the security of the processing.
2. In assessing the appropriate level of security account shall be taken in particular of the risks that are
presented by processing, in particular from accidental or unlawful destruction, loss, alteration, unauthorised
disclosure of, or access to personal data transmitted, stored or otherwise processed.
GDPR – security (art 32)
10. • Core obligation to “implement appropriate technical and
organisational measures”
• Requires a risk assessment
• So, must know what the risks are!
• Balance risk against state of the art, cost
• Risks to consider include “accidental or unlawful destruction,
loss, alteration, unauthorised disclosure of, or access to personal
data transmitted, stored or otherwise processed”
Unpacking article 32 /1
11. Then a list of measures the organisation must consider and either
implement or have a reasoned basis to reject, incl.
• “the ability to ensure the ongoing confidentiality, integrity,
availability and resilience of processing systems and services”
• “a process for regularly testing, assessing and evaluating the
effectiveness of technical and organisational measures for
ensuring the security of the processing”
Unpacking article 32 /2
12. • Controllers and processors must comply with record keeping
obligations.
• INCLUDES record of security measures and decisions taken
under art 32
• Make avail to regulator on request
• Only exception = <250 staff AND only occasional processing,
no likely risk to DSs, no “special categories” of data
GDPR – obligation to document (art 30)
13. • Controller – to regulator UNLESS unlikely to result in a risk to rights and
freedoms
• 72 hours unless not “feasible” (basically, have a v good reason)
• Time runs from “awareness”
• WP29 guidance that awareness of processor = awareness of controller!
• Processor – to controller
• Without undue delay – means “as soon as possible”
• Controller – to data subjects IF high risk to rights and freedoms
• Without undue delay
• Information to be provided to regulator includes
• Nature of the breach (i.e. how it happened, who affected etc.)
• Likely consequences of the breach
• Mitigation and remediation measures
GDPR – mandatory breach reporting
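The 72-hour clock described above can be sketched as a simple deadline calculation. This is illustrative only (the `awareness_time` value and the handling of the "not feasible" exception are assumptions); determining when "awareness" actually starts is a legal question, as the WP29 guidance on processor awareness shows.

```python
from datetime import datetime, timedelta, timezone

# GDPR art 33: the controller must notify the regulator within 72 hours of
# becoming "aware" of the breach, unless notification is not feasible.
NOTIFICATION_WINDOW = timedelta(hours=72)

def regulator_deadline(awareness_time: datetime) -> datetime:
    """Latest time an art 33 report to the regulator can be filed."""
    return awareness_time + NOTIFICATION_WINDOW

# Illustrative: awareness on 15 August implies a report by 18 August.
aware = datetime(2017, 8, 15, tzinfo=timezone.utc)
print(regulator_deadline(aware))  # 2017-08-18 00:00:00+00:00
```

Note that time runs from awareness, not from the breach itself, so logging when awareness was reached is itself part of the art 30 documentation story.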
14. • Much of it devoted to establishment of agencies and a cross-border infosec
cooperation framework
• But ch IV & V also addnl obligations for some businesses:
• “Operators of essential services”
• Energy, transport, banking, financial infrastructure, healthcare, water supply, digital infrastructure
• “Digital service providers”
• Online marketplace, online search engine, cloud computing platforms
• Transposition deadline May 2018
Network and Information Security Directive /1
15. • From a security perspective, covers a lot of the same ground
• Applies based on activities and characteristics of ENTITY, not characteristics of
affected DATA
• If GDPR-compliant, prob. most of the way there BUT devil is in the detail esp.
notification requirements
• Micro and small business exception for digital service providers
• Additional regulators
• OES – by sector
NIS Directive – what does it add to GDPR?
16. “Operators of essential services”
•Designated by the state
•“appropriate and proportionate technical and organisational measures”
to “manage the risks posed” to security of networks and information
systems
•Mandatory breach notification “without undue delay”
Network and Information Security Directive /2
17. “Digital Service Providers”
•“appropriate and proportionate technical and organisational measures to manage the risks posed to
the security of network and information systems which they use in the context of offering …”
[marketplace, search engine, cloud services etc.]
•“A level of security of network and information systems appropriate to the risk posed” & taking into
account a range of factors including
• “the security of systems and facilities”
• “monitoring, auditing and testing”
• Mandatory notification without undue delay of “any incident having a substantial impact on the
provision of a [digital service] that they offer in the Union”
Network and Information Security Directive /3
18. In the UK, govt issued draft “high level principles” and NCSC has issued
initial generic guidance.
Of interest:
• A.2 Risk Management
“There should be efforts to seek an understanding of potential system vulnerabilities that the identified threats might attempt to take
advantage of. This might include technical vulnerabilities, misuse of legitimate business processes or anything else that could impact
the essential service.”
• B.4 System Security
“Software should be supported and up to date with security patches applied. Where patching is technically problematic there are other
possible mitigations but these should be viewed as sub-optimal and care must be taken to ensure that they are effective.”
Network and Information Security Directive /4
19. • Applies to “trust service providers”
“appropriate technical and organisational measures to manage the risks posed to the security
of the trust services they provide. Having regard to the latest technological developments,
those measures shall ensure that the level of security is commensurate to the degree of risk.
In particular, measures shall be taken to prevent and minimise the impact of security
incidents and inform stakeholders of the adverse effects of any such incidents.”
• Mandatory breach notification within 24 hours if “a significant impact on the trust
service provided or on the personal data maintained therein”
•Potentially to several bodies
Electronic Identification Regulation
20. • Legislation is technology neutral
• Compliance is self-assessed at the time, retrospectively re-assessed by
regulators post breach
• Strong legal obligations to secure
• Unlikely that 3P vendors will take much if any liability for OSS
• It is for the breached party to show that its security was
“adequate”
Relevance to OSS management /1
21. • From 2014 guidance published by the ICO, the UK data privacy regulator (emphasis added):
“It is ... important that any software you use to process personal data is subject to an appropriate
security updates policy ... you must also ensure that no relevant components are ignored. This is a
common risk where responsibility for updates is split between multiple people, or where third-party
libraries or frameworks are used.”
• The UK ICO at least has fined people specifically for failure to do this.
•E.g. Gloucester City Council
•& under GDPR, fines potentially get much much bigger …
• Reminder: 67% of applications scanned by Black Duck in 2016 contained unpatched OSS
vulnerabilities.
Relevance to OSS management /2
22. How does it get into org:
• From vendor, due diligence and ongoing dialogue as to patch
and security management
• Contractual? Sometimes. Starting to see in regulated industries e.g.
finance
• Clarity as to who is responsible for what is key
• Patching reporting and SLA?
• From own code base, check-in processes and scanning tools
• Other sessions covering this in some detail
Relevance to OSS management /3
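The check-in gate mentioned above can be sketched as follows. This is a minimal illustration, not a real SCA tool: the component names, versions, and advisory set are hypothetical, and production scanners would match against a live CVE feed rather than a hard-coded list.

```python
# Minimal sketch of a check-in gate that flags OSS components with known
# vulnerabilities. Advisory data here is a hard-coded, hypothetical stand-in
# for a live vulnerability feed.
known_vulnerable = {
    ("openssl", "1.0.1f"),       # e.g. a Heartbleed-era release
    ("struts2-core", "2.3.31"),  # e.g. a pre-patch Apache Struts release
}

def audit_components(components):
    """Return the sorted subset of (name, version) pairs with advisories."""
    return sorted(set(components) & known_vulnerable)

# Hypothetical build manifest for a check-in.
build_manifest = [("openssl", "1.0.1f"), ("left-pad", "1.3.0")]
flagged = audit_components(build_manifest)
if flagged:
    print("Check-in blocked, vulnerable components:", flagged)
```

The key design point, for art 30 purposes, is that each run (and each decision to accept or reject a flagged component) leaves an auditable record.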
23. • Potential for very large fines (up to EUR20mn / 4% of global turnover under
GDPR)
• NB turnover of “undertaking” - in EU case law tends to mean an economic unit,
not legal person, so potential for measurement by reference to whole group
• Reputational damage (e.g. TalkTalk, Yahoo!)
• Damages claims by data subjects
• Regulatory intervention
• Possibility under GDPR of class actions led by charities and campaign groups
Consequences of failure
24.
25. • Equifax happened pre-GDPR
• Full facts yet to be established
• Based on what we know … about 700,000 EU citizens affected
• Might be investigated by one or more regulators, depending on who is affected (i.e. geography)
and “context of establishment”
• Regulators have power to carry out joint ops.
• Issues would be:
1. Whether the fact of the breach would mean a failure to comply with article 32
2. Whether the delay in reporting would be a breach of articles 33 and/or 34
3. If so, whether that breach is symptomatic of systemic problems leading to the failure to notify which
themselves amount to a breach of article 32
4. What the data of EU residents was doing on Equifax’s US servers in the first place, and whether that data
export had been done lawfully
How would Equifax have played out under GDPR?
26. • Breach “made possible” through Apache Struts vuln notified on 8 March
• Equifax had a patching policy, backed with some kind of scanning tool.
• Both failed. Vuln never patched.
• Parallels with Gloucester City Council:
• Data controller aware of vuln
• Knows it must patch
• Fails to do so
• Bad Things Happen
• Attackers apparently had access to data from 13 May, Equifax failed to detect until 29 July
• Data not encrypted at rest – but would that have helped?
• Response/mitigation botched? (art 83(2)(c) GDPR explicitly makes mitigation a factor)
Issue 1 – breach of article 32?
27. • To authorities, 72 hours unless unlikely to result in risk to people concerned
• Low threshold, clearly met in this case
• Time runs from “awareness” – when was Equifax aware?
• WP29 – “reasonable degree of certainty” that a breach has occurred and personal data
compromised as a result
• Investigation to get to that point must still be “prompt”
• 15 August(ish) – so report by 18 August
• Phased reporting acceptable (so incomplete investigation does not obviate need to report)
• To data subjects, “without undue delay”
• 7 September i.e. ~1 month
• Might reas. take the view that scale, impact of going public and necessity to prepare would
allow that delay? Authority would have power to instruct
Issue 2 – breach of notification obligations?
28. • Issue 3 – not enough facts yet to assess
• Issue 4:
• Equifax’s EU systems apparently not affected
• EU citizens’ data present on US systems “because of an oversight”
• Enough to catch Equifax’s Irish entity?
• Suggests export was not lawful in the first place
• PLUS but for the unlawful export, would they have been affected at all?
Issues 3 and 4
29. • General point about fines stands i.e. not going to leap to the maximum for
every breach
• BUT authorities (incl. ICO) lobbied for these powers for “the most serious
breaches”
• Articles 32, 33 and 34 are 2%/10mn infringements
• BUT infringement of article 5(1)(f) basic security principle is a 4% / 20mn
infringement
• Fines for multiple breaches capped at the maximum for most serious breach
(art 83(3) GDPR).
Consequences – administrative fines
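The fine-cap arithmetic above can be made concrete with a short sketch. The figures are the statutory maxima only; an actual fine would depend on the art 83(2) factors, and the turnover figure here is a hypothetical example.

```python
# Illustrative sketch of the GDPR administrative fine caps.
def max_fine(worldwide_turnover_eur: float, tier: str) -> float:
    """Statutory maximum fine for a single infringement.

    tier "upper": e.g. art 5(1)(f) - greater of EUR 20m or 4% of turnover.
    tier "lower": e.g. arts 32-34 - greater of EUR 10m or 2% of turnover.
    """
    if tier == "upper":
        return max(20_000_000, 0.04 * worldwide_turnover_eur)
    return max(10_000_000, 0.02 * worldwide_turnover_eur)

# Art 83(3): linked infringements are capped at the maximum for the most
# serious one, e.g. an art 32 breach plus an art 5(1)(f) breach together.
turnover = 3_000_000_000  # hypothetical EUR 3bn group turnover
cap = max(max_fine(turnover, "lower"), max_fine(turnover, "upper"))
print(cap)  # 120000000.0
```

Note the turnover used is that of the "undertaking", which, as above, case law tends to read as the whole economic unit rather than the single legal person.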
30. • Mandatory arbitration clauses and adverse compulsory
venue clauses are basically unenforceable against
consumers in EU (cf USA)
• So, prospect of action in EU courts by affected people
• Consumer group class action provisions
• Remains to be seen how e.g. US courts might treat judgment
of an EU court on this (and I’m open to views on the subject
from US lawyers!)
Consequences - right of action for affected people
31. • The law has caught up with infosec risks
• Compliance is self-assessed at the time, but its adequacy is
retrospectively considered by the authorities post breach – so you
need to have a good story to tell
• OSS is not a special case and is not a get-out clause
• Failure to manage OSS vulnerabilities is unlikely to be accepted as an
excuse by regulators
• Financial and other consequences can be very serious
Conclusions and takeaways