As part of the NIST Big Data Public Working Group, we examine technologies that can support ethics in systems design. In particular, we review issues raised by the IEEE P7000 community regarding ethics for autonomous systems and robotics. Possible adaptations to the NBDPWG reference model are considered for the third and final version of SP1500.
Technologies in Support of Big Data Ethics
1. Technology of Big Data Ethics
Overview of big data-related concerns from IEEE P70nn Working Groups @IEEESA http://sites.ieee.org/sagroups-7000/
Mark Underwood @knowlengr | Synchrony | Views my own | dark@computer.org | v1.2
2. IEEE P7000: Marquis Group Charter
“Scope: The standard establishes a process model by which engineers and technologists can address ethical consideration throughout the various stages of system initiation, analysis and design. Expected process requirements include management and engineering view of new IT product development, computer ethics and IT system design, value-sensitive design, and stakeholder involvement in ethical IT system design. . . . The purpose of this standard is to enable the pragmatic application of this type of Value-Based System Design methodology which demonstrates that conceptual analysis of values and an extensive feasibility analysis can help to refine ethical system requirements in systems and software life cycles.”
3. Related IEEE P70nn Groups
IEEE P7000 Ethical Systems Design
IEEE P7001 Transparency of Autonomous Systems
IEEE P7002 Data Privacy Process
IEEE P7003 Algorithmic Bias Considerations
IEEE P7004 Standard for Child and Student Data Governance
IEEE P7005 Standard for Transparent Employer Data Governance
IEEE P7006 Standard for Personal AI Agent
IEEE P7007 Ontological Standard for Ethically Driven Robotics and Automation Systems
IEEE P7008 Standard for Ethically Driven Nudging for Robotic, Intelligent and Autonomous Systems
IEEE P7009 Standard for Fail-Safe Design of Autonomous and Semi-Autonomous Systems
IEEE P7010 Wellbeing Metrics Standard for Ethical Artificial Intelligence and Autonomous Systems
IEEE P7011 SSIE Standard for Trustworthiness of News Media
IEEE P7012 SSIE Machine Readable Personal Privacy Terms
IEEE P7013 Facial Analysis
4. Key References
Focus: artificial intelligence and autonomous systems.
Havens asks, “How will machines know what we value if we don’t know ourselves?”
5. Recent Case Study Opportunities: Case Study 1
“Faster, Higher, Farther chronicles a corporate scandal that rivals those at Enron and Lehman Brothers—one that will cost Volkswagen more than $22 billion in fines and settlements.” –Publisher
6. Case Study 2
“Equifax said that about 38,000 driver’s licenses and 3,200 passport details had been uploaded to the portal that was hacked. (http://bit.ly/2jF3VTh) Equifax said in September that hackers had stolen personally identifiable information of U.S., British and Canadian consumers. The company confirmed that information on about 146.6 million names, 146.6 million dates of birth, 145.5 million social security numbers, 99 million addresses and 209,000 payment card numbers and expiration dates was stolen in the cyber security incident.” – Yahoo Finance
7. Case Study 3
It will be remembered as “a breach,” but the Facebook–Cambridge Analytica incident was about big data.
Adjectives to remember: “Tiny” + “Big”
8. Case Study 4
Finding: Hispanic-owned and managed Airbnb properties, controlled for other aspects, receive less revenue than other groups.
Response from Airbnb when contacted by reporters: We already provide tools to help price listings.
Source: American Public Media Marketplace 8-May-2018
Related story: Dan Gorenstein, “Airbnb cracks down on bias – but at what cost?” Marketplace, 2018-09-08.
9. Case Study 5
A “charity” was used to subsidize
payments to Medicare patients in order
to boost drug sales. Multiple
manufacturers are involved.
10. Case Study 6
“Value-added measures for teacher evaluation, called the Education Value-Added Assessment System, or EVAAS, in Houston, is a statistical method that uses a student’s performance on prior standardized tests to predict academic growth in the current year. This methodology—derided as deeply flawed, unfair and incomprehensible—was used to make decisions about teacher evaluation, bonuses and termination. It uses a secret computer program based on an inexplicable algorithm (above).
In May 2014, seven Houston teachers and the Houston Federation of Teachers brought an unprecedented federal lawsuit to end the policy, saying it reduced education to a test score, didn’t help improve teaching or learning, and ruined teachers’ careers when they were incorrectly terminated. Neither HISD nor its contractor allowed teachers access to the data or computer algorithms so that they could test or challenge the legitimacy of the scores, creating a ‘black box.’”
http://kbros.co/2EvxjU9
11. Case Study 7
A radiologist sends a message to a provider. It is never received, and critical care is not delivered, probably resulting in a patient’s death. Whom would you blame?
What’s in your stack?
“Apache Flink is an open-source framework for distributed stream processing that provides results that are accurate, even in the case of out-of-order or late-arriving data. Some of its features are: (1) it is stateful and fault-tolerant and can seamlessly recover from failures while maintaining exactly-once application state; (2) it performs at large scale, running on thousands of nodes with excellent throughput and latency characteristics; (3) its streaming data flow execution engine, APIs and domain-specific libraries for Batch, Streaming, Machine Learning, and Graph Processing.”
Or . . . ? “Apache Kafka solves the situation where the producer is generating messages faster than the consumer can consume them in a reliable way.”
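The delivery-guarantee question behind the radiologist scenario can be made concrete with a small sketch. This is not Kafka or Flink code; it is a toy at-least-once delivery loop (all names such as `Broker` and `send_with_retry` are invented for illustration) showing why a message that is never acknowledged need never be silently lost.

```python
# Toy at-least-once delivery: resend until the broker acknowledges with an
# offset. Broker and send_with_retry are illustrative names, not a real API.
import itertools

class Broker:
    """Simulated broker; the first append attempt fails like a flaky network."""
    def __init__(self, fail_first_n=1):
        self.log = []                        # durable, append-only message log
        self._attempts = itertools.count()
        self.fail_first_n = fail_first_n

    def append(self, msg):
        if next(self._attempts) < self.fail_first_n:
            raise ConnectionError("transient network fault")
        self.log.append(msg)
        return len(self.log) - 1             # returned offset acts as the ack

def send_with_retry(broker, msg, max_attempts=3):
    """Producer side: retry until acknowledged, else escalate to a human."""
    for _ in range(max_attempts):
        try:
            return broker.append(msg)
        except ConnectionError:
            continue                         # no ack received: safe to resend
    raise RuntimeError("delivery failed after retries; alert a human")

broker = Broker()
offset = send_with_retry(broker, "radiology-report-123")
print(offset, broker.log)
```

The ethical point: a "fire and forget" send has no place in the loop above; either the ack arrives or someone is alerted.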
12. Related Decks
NIST Big Data Public Working Group – Overview for Cloud Native SAFE
Stakeholders for Ethical Systems Design
13. My Perspective
Chair, Ontology/Taxonomy subgroup for P7000
Occasional participant in IEEE Standards WGs P7007, P7003, P7002, P7010
IEEE Standard P2675 WG, Security for DevOps
Finance large enterprise: supply chain risk, complex playbooks, many InfoSec tools, workflow automation, big data logging; risks include fraud and regulatory #fail
14. “Old” Buzzword: But Big Data Still Matters
Each V fronts a collection of ethical hazards
Credit: “Ten V’s of Big Data” from XenonStack. https://kbros.co/2rMX0v0
15. IEEE Society on Social Implications
of Technology
16. IEEE Product Safety Engineering Society
• “Do no harm.” – It’s not so easy.
• Do you know a system is safe before it’s been fully scaled up -- & possibly federated?
• What constitutes “a reasonable explanation”?
17. IEEE Reliability Society
See the free reliability analytics toolkit; some items are useful for Big Data DevOps.
https://kbros.co/2rugRij
18. IEEE Shill? No.
Active communities are small.
Standards documents are not free, though participation for IEEE members is.
Heavily weighted toward late career participants.
Despite “Engineering” in title, often not “engineering.”
19. But IEEE has . . .
IEEE Digital Library (with cross reference to ACM digital library)
Multinational reach and engagement
Reasonable internal advocacy and oversight
Diversity
Sometimes good awareness of NIST work
Often best work in lesser-known conference publications (e.g., vs. IEEE Security)
20. State of Computing Profession Ethics
@ACM_Ethics
ACM Code of Ethics
(Draft 3, 2018) https://www.acm.org/about-acm/code-of-ethics
21. Highlights of ACM Ethics v3
“minimize negative consequences of computing, including threats to health, safety, personal security, and privacy.”
When the interests of multiple groups conflict, the needs of the least advantaged should be given increased attention and priority.
Computing professionals should promote environmental sustainability both locally and globally.
“. . . the consequences of emergent systems and data aggregation should be carefully analyzed. Those involved with pervasive or infrastructure systems should also consider Principle 3.7 (Standard of care when a system is integrated into the infrastructure of society).”
Mark Underwood @knowlengr | Synchrony | Views my own | dark@computer.org | v1.2
23. Joint ACM IEEE Software Engr Code
https://www.computer.org/web/education/code-of-ethics
1. PUBLIC - Software engineers shall act consistently with the public interest.
2. CLIENT AND EMPLOYER - Software engineers shall act in a manner that is in the best interests of their client and employer consistent with the public interest.
3. PRODUCT - Software engineers shall ensure that their products and related modifications meet the highest professional standards possible.
4. JUDGMENT - Software engineers shall maintain integrity and independence in their professional judgment.
5. MANAGEMENT - Software engineering managers and leaders shall subscribe to and promote an ethical approach to the management of software development and maintenance.
6. PROFESSION - Software engineers shall advance the integrity and reputation of the profession consistent with the public interest.
7. COLLEAGUES - Software engineers shall be fair to and supportive of their colleagues.
8. SELF - Software engineers shall participate in lifelong learning regarding the practice of their profession and shall promote an ethical approach to the practice of the profession.
24. Human Computer Interaction
NBDPWG System Communicator
Usability for web and mobile content
Substitutes for old school manuals
“Privacy text” for disclosures, policy, practices
Central to much of the click-based economy
“User” feedback, recommendations
Recommendation engines
25. Natural Language Tooling
Hyperlinks to artifacts
Chatbots
Live agent
Speech to text support
Text mining
Enterprise search (workflow-enabled artifacts)
Some of the indexed artifacts may approach big data status
SaaS Text Analytics
26. Dependency Management
Big Data configuration management
Across organizations
Needed for critical infrastructure
See NIST critical sector efforts
Dependencies may not be human-intelligible
Special issues with machine-to-machine transactions
27. Traceability & Requirements Engineering
Define what constitutes an ethical requirement
Possible: big data ethical fabric (transparency, usage)
Audit
Traceability requirements
Can an ethical responsibility be inherited like PII-tagged data elements?
What about synthetic, algorithm-defined elements?
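The inheritance question above can be sketched in code. This is a minimal model of my own devising, not an NBDPWG artifact: every element derived from PII-tagged inputs inherits their tags, including synthetic, algorithm-defined elements, and records lineage for traceability. The names (`DataElement`, `derive`) are hypothetical.

```python
# Sketch: ethical/PII tags propagate to derived elements, with lineage kept
# for audit. DataElement and derive are illustrative names, not a real API.
from dataclasses import dataclass

@dataclass(frozen=True)
class DataElement:
    name: str
    tags: frozenset = frozenset()        # e.g. {"PII"}, {"PHI"}
    lineage: tuple = ()                  # provenance for audit/traceability

def derive(name, *sources, extra_tags=()):
    """A synthetic element inherits the union of its sources' tags."""
    tags = frozenset(extra_tags).union(*(s.tags for s in sources))
    lineage = tuple(s.name for s in sources)
    return DataElement(name, tags, lineage)

ssn   = DataElement("ssn", frozenset({"PII"}))
zip_  = DataElement("zip_code")
score = derive("risk_score", ssn, zip_, extra_tags={"algorithmic"})

print(score.tags)      # the obligation attached to ssn follows the derivative
print(score.lineage)   # ('ssn', 'zip_code')
```

Whether such inheritance should ever be severable (e.g., after provable anonymization) is exactly the open requirements-engineering question.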
28. Special Populations
Disadvantaged
By regulation (e.g., 8(a), SBIR, disability)
By “common sense” (“fairness” and “equity”)
By economic / sector (“underserved”)
Internet Bandwidth inequity
Children
“Criminals” / Malware Designers
29. Transparency
What does it mean to be “transparent” about ethics?
What connection to IEEE /ACM professional ethics?
ACM: “The entire computing profession benefits when the ethical decision making process is accountable to and transparent to all stakeholders. Open discussions about ethical issues promote this accountability and transparency.”
ACM: “A computing professional should be transparent and provide full disclosure of all pertinent system limitations and potential problems. Making deliberately false or misleading claims, fabricating or falsifying data, and other dishonest conduct are violations of the Code.”
ACM: “Computing professionals should establish transparent policies and procedures that allow individuals to give informed consent to automatic data collection, review their personal data, correct inaccuracies, and, where appropriate, remove data.”
ACM: “Organizational procedures and attitudes oriented toward quality, transparency, and the welfare of society reduce harm to the public and raise awareness of the influence of technology in our lives. Therefore, leaders should encourage full participation of all computing professionals in meeting social responsibilities and discourage tendencies to do otherwise.”
30. Algorithms
“Why am I locked out while she is permitted?”
“Why isn’t my FICO score changing?”
“How can I know when I have explained our algorithm?”
“Is there an ‘explain-ability’ metric?”
What is different about machine-to-machine algorithms?
“Can an algorithm be abusive?”
“Is ‘bias’ the new breach?” https://kbros.co/2I2sxDO
31. Audience, Alerts, Audits: Monitoring
Support multiple “stakeholders”
Not all are paying customers (“public interest”, regulators, suppliers)
Traceability requirements vary across stakeholder groups
In addition to those specified by product owners:
Alerts for citizens, infrastructure managers, CEOs, CIOs, CISOs, industry peers
May be the same, or may vary
Monitoring may need to be specialized according to each “V” | Live “seed” testing
Cautionary Tales: “Tin Can on the Wedding Car,” toddlers eating button batteries
(Opinion: Need to resurrect Complex Event Processing design patterns)
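The Complex Event Processing pattern the opinion above refers to can be sketched minimally: fire an alert when enough failure events for one stakeholder group land inside a sliding time window. All names (`WindowAlert`, the stakeholder labels) are invented for illustration, not drawn from any CEP product.

```python
# Minimal CEP-style sliding-window alert: N events within window_secs => fire.
from collections import defaultdict, deque

class WindowAlert:
    def __init__(self, threshold=3, window_secs=60):
        self.threshold, self.window = threshold, window_secs
        self.events = defaultdict(deque)   # stakeholder -> event timestamps

    def observe(self, stakeholder, ts):
        q = self.events[stakeholder]
        q.append(ts)
        while q and ts - q[0] > self.window:
            q.popleft()                    # expire events outside the window
        return len(q) >= self.threshold    # True => raise an alert

cep = WindowAlert(threshold=3, window_secs=60)
alerts = [cep.observe("regulator", t) for t in (0, 10, 20, 200)]
print(alerts)
```

Real deployments would key windows per "V" (e.g., tighter windows for high-velocity streams) and route the fired alert per stakeholder group, as the slide suggests.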
32. Big Data Simulation
New: DevOps Scalability
Simulation Interoperability Standards Organization (SISO)
Scale for the V’s (see SISO)
NIST Big Data S&P Appendix A high conformance
33. Big Data Operational Intelligence
Big Data often needed to manage applications
Managing pay-as-you-go computing resources => OpIntel
Related: Managing OpSec
Related: Alerts and Logging
Tradeoffs and utility models
Transparency, traceability, “documentation”
34. Test Engineering and DevOps
Continuous Pipeline concepts applied to IoT / Edge / Distributed
Each platform (or stack “layer”) may introduce different types of ethical concerns
E.g., Identity Management for children
Infectious disease statistics -> break glass for public health
Autonomous vehicles response to fog conditions (see http://web.media.mit.edu/~guysatat/fog/)
Reliance on less reliable hardware or bandwidth (e.g., cheap sensors, residential wi-fi)
Left- and right-shift of safety, reliability, regulatory constraints (remember case studies)
New meaning for “interoperability” – “inter-responsibility”
35. Forensics
Big Data may be needed for full stack playback
Full-stack After Action Review is still immature among forensics professionals
Even large firms may not be staffed with forensics specialists
Big surprise may be in store when breach or litigation occurs
36. Federation & Supply Chain
Facebook/Cambridge Analytica scenario was forecast in V1
Supply chains that have been managed casually need upgrades
Risk often increases as organizational size decreases
Cost of “keeping data around” dangerously close to zero
Conventional systems taxed to handle volume of identity management
Access is infrequently leased
Simplistic network zones fail to isolate subcomponents important to domain experts
37. Corporate Initiatives
Environmental Social Governance
Transparency within employee groups, departments, subsidiaries (See P7005)
Computing decisions that affect carbon footprint (green data centers, etc.)
Individual practitioners have greater influence than before
Disclaimers in developer contract work
Offshore culture: some workers may be afraid to question requirements, risk-taking
Whistle-blower (a la Bug Bounty) not working well yet
38. Who Decides?
Some Opinions
Requirements Engineering may need a refresher, uplift
System Architects must continuously place controls in hands of domain experts
This is counter to the “sysadmin” design pattern
Risks multiply in part due to the commercial deprecation of documentation, manuals
Boundaries of safe & manageable release pipelines may have already been exceeded (mobile)
“Explain this” mentality partly offsets the DIY developer syndrome
Good for self-education, but the problem is not defining “ethics”
On-demand microlearning must accompany microservices deployment
AI Agents: Can ask, “Why?” “Who?” and nudge ethical considerations
39. Value Chain – Reference Model
40. Bibliography
Bo Brinkman, Catherine Flick, Don Gotterbarn, Keith Miller, Kate Vazansky, and Marty J. Wolf. 2017. Listening to professional voices: draft 2 of the ACM code of ethics and professional conduct. Commun. ACM 60, 5 (April 2017), 105-111. DOI: https://doi.org/10.1145/3072528
41. Related Work
NIST 800-53 Rev 5 and others, NIST Cloud Security, NIST RMF
Building and Auto Automation: ISO 29481, 16739, 12006
https://www.buildingsmart.org/about/what-is-openbim/ifc-introduction
Uptane
Ethics and Societal Considerations ISO 26000, IEEE P70nn
DevOps Security IEEE P2675
Microsegmentation and NFV IEEE P1915.1
Safety orientation
Infrastructure as code
E.g., security tooling is code, playbooks are code
42. This deck is released under
Creative Commons
Attribution-Share Alike.
Portions of the work summarized were developed by multiple contributors through the NIST open public working group framework under the leadership of Wo Chang, but this document represents my views alone. https://bigdatawg.nist.gov
43. Background: NIST Big Data PWG
Other insights from the NIST Big Data Public Working Group
44. What’s Different about Big Data
(OLD NEWS)
Multiple security schemes, attack vectors, countermeasures
May have streamed data frameworks + data at rest
Sensor Sensibility
Unintended uses and deanonymization
Often multi-organizational (most standards built for single-org adoption)
Problems of scale and complexity, veracity, content, provenance, jurisdiction
Data and code shared across organizations
Big data power wielded by smaller organizations with weak governance, training, regs
45. Fluff
Security and privacy are affected by all dimensions:
Volume
Velocity
Variety
Veracity (Provenance)
Volatility
Cloud
46. Less Fluffy
Big Data partly side effect of SDLC shifts
Agile
API-First
Microservices / Containerization
Deprecated but not forgotten: Components, Composable Services
SDN, 5G
Left Shift (DevOps)
DevSecOps
Model portability: CRISP-DM (IBM SPSS link), OMG DOL (Distributed Ontology, Modeling and Specification Language, link)
IoT (Distributed Computing c. 1970-present)
47. Key Trends
Cloud (centralization, scale, code-sharing)
IoT, especially health & safety related
Mobility and pervasive human-computer interactions (Alexa, etc.)
Data Center automation (scripting -> DevOps code, “left-shift”)
Trust and Federation (related: Blockchain)
Domain automation (E.g., smart buildings, autonomous vehicles, FIBO)
ABAC more than RBAC
48. Use Cases
Network Protection
Systems Health & Management (AWS metrics, billing, performance)
Education
Cargo Shipping
Aviation (safety)
UAV, UGV regulation
Regulated Government Privacy (FERPA, HIPAA, COPPA, GDPR, PCI, etc.)
Healthcare Consent Models
HL7 FHIR Security and Privacy link
49. Liaison
NIST (mostly 1:1 contacts, catalog of cited SPs and standards)
IEEE P2675 Security for DevOps
IEEE P1915.1 NFV and SDN Security, 5G (1:1 via AT&T)
IEEE P7000-P7010 (S&P in robotics: algorithms, student data, safety & resilience, etc.)
ISO 20546 20547 Big Data
IEEE Product Safety Engineering Society
IEEE Reliability Society
IEEE Society on Social Implications of Technology
HL7 FHIR Security Audit WG
Cloud Native SAFE Computing (Kubernetes-centric)
Academic cryptography experts
50. Contributions of this SP
Checklists
Deep bibliography
Consent and Break-Glass after HL7
Centrality of Domain Models
Simulation
Security/Privacy modeled after Safety frameworks
E.g., data / code toxicity (after Material Safety Data Sheet standard link)
“System Communicator”
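The "Consent and Break-Glass after HL7" contribution above can be illustrated with a sketch. This is my own toy model, not HL7 FHIR API code: access without consent is denied, unless an emergency break-glass override is asserted, in which case the access succeeds but is mandatorily audit-logged for after-the-fact review.

```python
# Toy consent check with break-glass override; names are illustrative only.
audit_log = []

def access(record, requester, consented, break_glass=False, reason=None):
    if consented:
        return record
    if break_glass:
        audit_log.append({"who": requester, "why": reason})  # mandatory trail
        return record
    raise PermissionError("no consent on file")

# Normal path: no consent, no override -> denied.
try:
    access("chart-42", "dr_a", consented=False)
except PermissionError as e:
    print(e)

# Emergency path: override granted, but the access is recorded.
print(access("chart-42", "dr_a", consented=False,
             break_glass=True, reason="ER: patient unconscious"))
print(audit_log)
```

The design point is that break-glass trades prior consent for guaranteed accountability: the override cannot occur without leaving evidence.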
51. Value Chain – Reference Model
52. ACM Computing Classification
Security & Privacy Topics
Database and storage security
Data anonymization and sanitization
Management and querying of encrypted data
Information accountability and usage control
Database activity monitoring
Software and application security
Software security engineering
Web application security
Social network security and privacy
Domain-specific security and privacy architectures
Software reverse engineering
Human and societal aspects of security and privacy
Economics of security and privacy
Social aspects of security and privacy
Privacy protections
Usability in security and privacy
53. Conceptual Taxonomy
Security and Privacy Conceptual Taxonomy:
Data Confidentiality
Provenance
System Health
Public Policy, Social, and Cross-Organizational Topics
54. Operational Taxonomy
Security and Privacy Operational Taxonomy:
Device and Application Registration
Identity and Access Management
Data Governance
Infrastructure Management
Risk and Accountability
55. NBD SP Security & Privacy Safety: Conformance Levels
General approach: ISO 17021, 17067, 17023 Conformity Assessment
Sets forth suggested levels of conformance:
Safety Level 1, 2 & 3
Self-administered
Mechanics at Level 3
Automated use of domain models for Security Operations
Security and privacy risks driven to IDE
Continuous Test (left- & right-shift of code)
56. Value of Security Ontologies
(Obrst, Chase, & Markeloff, 2012) note that systematic use of ontologies could enable information security tools to process standardized information streams from third parties, using methods such as the Security Content Automation Protocol (SCAP). This model could enable automated reasoning to address potential breaches closer to real time, including breaches with indirect effects on networks or applications that require a mixture of human and machine cognition.
57. Privacy and Security Fabric
“Fabric” notion adopted by several organizations
Fabric to cover multiple layers, facets, technologies
Dissolving distinction between security and privacy
58. Snips from NBDPWG V2 Appendix A
Best practices for ABAC
Integration of legacy RBAC with ABAC
Derivation of ABAC from other model formats
Kubernetes walkthrough
Container and Microservice ABAC
Log analysis for Splunk Security Operations / Application design patterns
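The ABAC practices listed above can be sketched in a few lines. This is not Kubernetes or NIST ABAC reference code; it is a minimal illustration (attribute names like `dept`, `clearance`, `hour` are invented) of the core idea: a decision is a predicate over subject, resource, and environment attributes, rather than a role lookup as in RBAC.

```python
# Minimal ABAC decision sketch: policies are predicates over attributes.
def abac_permit(subject, resource, env):
    rules = [
        # Stewards may read their own department's PII during business hours.
        lambda s, r, e: ("PII" in r["tags"]
                         and s["dept"] == r["dept"]
                         and s["clearance"] >= 2
                         and 8 <= e["hour"] < 18),
        # Anyone may read untagged public data.
        lambda s, r, e: not r["tags"],
    ]
    return any(rule(subject, resource, env) for rule in rules)

steward = {"dept": "radiology", "clearance": 2}
record  = {"dept": "radiology", "tags": {"PII"}}

print(abac_permit(steward, record, {"hour": 10}))   # in hours: permitted
print(abac_permit(steward, record, {"hour": 23}))   # out of hours: denied
```

Note how the environment attribute (`hour`) participates in the decision, something plain RBAC cannot express; integrating a legacy role is then just one more subject attribute.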
59. Appendix A
There is more . . . Refer to Appendix A in the full document. The preceding slides were an excerpt.
71. Cloud Native Computing Foundation
Safe Access For Everyone (SAFE)
https://github.com/cn-security/safe