This document outlines an agenda for a workshop on privacy and risk mitigation for emerging technologies. The workshop aims to counter an optimism bias by imagining how technologies could be misused or have unintended consequences. Participants will break into groups to generate hypothetical future scenarios where risks have occurred, then propose potential solutions to mitigate the risks identified in each story. The goal is to have a realistic discussion of risks in order to increase user protection, anticipate problems, and improve product design.
2. IT’S UNCLEAR HOW TECHNOLOGY WILL BE USED IN THE FUTURE
IDENTIFY POTENTIAL SOCIAL IMPACTS & UNINTENDED USES
OVERVIEW
3. BIAS TOWARDS OPTIMISM
4. MAKING THE EFFORT TO IMAGINE ADVERSE OUTCOMES LEADS TO:
MORE REALISTIC ASSESSMENT OF RISK
INCREASED ABILITY TO ANTICIPATE PROBLEMS
AWARENESS OF BLIND SPOTS
INCREASED USER PROTECTION
IMPROVED PRODUCT DESIGN
COUNTERING A BIAS TOWARDS OPTIMISM
6. GENERATE A STORY IN YOUR SMALL GROUP
CONSTRUCT A FUTURE WHERE RISKS HAVE OCCURRED
Risk prompts (Users Do Not Control Data; Surveillance):
- Data collected and sold without consent*
- Data follows users through their lifetime, affects reputation, freedom or opportunity
- Sensitive insights about users are extracted from data
- Bad actors leak/steal data
- The government uses data to infringe on the rights of citizens
- Companies with data are sold or shut down
- People near an XR device wearer are not aware they are being monitored
*Prompts modified from The Ethical OS Toolkit available at www.ethicalos.org
7. GENERATE A STORY IN YOUR SMALL GROUP
CONSTRUCT A FUTURE WHERE RISKS HAVE OCCURRED
1-2 MINUTE STORY OF A WORST-CASE SCENARIO
PICK A PROTAGONIST (PERSON, COMPANY, NATION, ETC.)
WHICH RISKS HAPPENED?
WHAT WERE THE CONSEQUENCES?
10. PROPOSE POTENTIAL SOLUTIONS FOR EACH GROUP’S STORY
EXAMPLES
TRANSPARENT TERMS OF SERVICE
AWARDS
INDUSTRY ETHICS
INDEPENDENT WATCHDOG
REGULATIONS / POLICY
RISK MITIGATION STRATEGIES
12. WHICH IDENTIFIED RISKS ARE A PRIORITY?
WHAT STRATEGIES CAN HELP MITIGATE THESE RISKS?
WHERE AND HOW TO START?
RECAP
Better product development, faster deployment, and more impactful innovation, while minimizing technical and reputational risks. - Ethical OS
As people, we tend to be optimistic about technology; if we weren’t, we would likely find other industries to work in.
Describe the pre-mortem exercise.
Divide people into groups of four.
What are the costs of doing nothing? Given everything you learned in the morning sessions, tell a story of a worst-case scenario.
It is important to pay attention to other groups’ presentations: each group must solve a problem that a different group presents, and the original team is allowed to add on to the proposed solution.
Transparent terms of service could educate users. Perhaps the watchdog could be the place whistleblowers go, and could organize ethical bounties?
Give everyone a starting point of things to do that would safeguard user data. Gather potential solutions from the audience, then synthesize them.
At the end, ask people to write down 1-2 things they are going to do now. Have them share those 1-2 things with the people in their group. Then ask for volunteers who want to share with everyone.