This document summarizes privacy issues in the esports industry. It discusses how game developers, platforms, and broadcasters collect extensive personal and gameplay data with little transparency. This allows for mass dataveillance and profiling of players and viewers. Additionally, automated surveillance systems and terms of service give companies hierarchical control over communities with little oversight or player consent. Going forward, regulations like GDPR may help address these power imbalances and privacy risks in esports.
1. PRIVACY IN ESPORTS: MAIN
AREAS OF HIGH RISK
Trilateral Research Ltd.
Crown House
72 Hammersmith Road
W14 8TH, London
+ 44 (0)20 7559 3550
@Trilateral_UK
Dr Jedrzej Czarnota
2. COMPANY PROFILE
Trilateral Research is a leading London-based multidisciplinary research,
consulting and technology development company.
Our team collaborates across social sciences and technology to bring insights
from each into supporting data-driven innovation.
Small and medium-sized enterprise (SME): ≈ 30 staff members.
Running 20-25 projects at any given time.
Almost all research and technical staff have postdoctoral experience
(≈90% have PhDs).
Extensive publication list and excellent international profile:
http://trilateralresearch.co.uk/full-list-of-publications/
3. ETHICS, PRIVACY & DATA PROTECTION
SELECT CLIENTS
A structured approach
11+ years practitioner experience
Deep understanding of ICO requirements and GDPR compliance
Experts in PIA and EIA theory and practice
Well-versed in research ethics: informed consent, ethics monitoring, regulatory
compliance.
Contributing to standards in this area: ISO/IEC 29134 (WD 29134), a near-final
draft on PIA guidance
6. ESPORTS – BRAVE NEW WORLD?
New field; rapid development driven by strong industry needs,
Closely linked to advertising and innovative revenue streams,
Sporting activity itself is proprietary (unlike majority of sports),
Reported cutthroat practices (Taylor, 2012),
Audience considerations – many young users (a minimum age of 13 to play) and vulnerable people
(Boellstorff, 2008),
Player protection – both of professional users (unions) and regular users (low power),
eSports = first parties (videogame developers) + second parties (platforms) + third parties (broadcasters),
All actors collect information,
Playing the game means automatic consent.
Regulations are still developing – a need for best practices in PIA (is TRUSTe too close to the
industry?)
7. THIRD PARTY INFO USE
Details of your visits to our site including, but not limited to, transactions, traffic data, location
data, weblogs and other communication data.
We may collect information about your computer, including where available your IP address,
operating system and browser type.
We may obtain information about your general internet usage by using cookies.
Transactions, IP Address, a unique user ID, device type, device identifiers, browser types and
language, referring and exit page addresses, platform type, version of software installed, system
type, the content and pages accessed, the number of clicks made, the amount of time spent on
pages, the dates and times, and other similar information.
The data that we collect from you may be transferred to, and stored at, a destination outside
the European Economic Area (“EEA”),
Accessing social media profiles if linked (account creation or feature use).
8. FIRST PARTY INFO USE
Use of personal info to: expand business and measure the effectiveness of operations,
Players relinquish any expectation of privacy through using forums and in-game chat,
Personal info shared with subsidiaries and affiliates, vendors and service providers, including
payment info,
Contact, authentication, payment, including the amount of money player spends,
Demographic and preference info, computer information including RAM (‘consent to
monitor’),
Personal info as a transferable asset (merger, sale, joint venture, bankruptcy,…),
Non-personal info (aggregated, anonymized, de-identified) used to manage player
relationships and shared with second-party developers,
Game websites not responsive to DNT (do not track) signals,
Partial coverage by TRUSTe program.
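The DNT signal mentioned above is simply an HTTP request header; a site that chose to respect it would check for it before setting any tracking cookie. A minimal sketch of that check, assuming a hypothetical handler and cookie name (neither is taken from any real game website):

```python
# Sketch of honouring the "DNT: 1" request header before setting a
# tracking cookie. Handler and cookie name are illustrative only.

def should_track(headers):
    """Return False when the browser sends a Do Not Track signal."""
    # Per the W3C Tracking Preference Expression draft, "DNT: 1"
    # expresses the user's preference not to be tracked.
    return headers.get("DNT") != "1"

def handle_request(headers, response_cookies):
    """Attach a tracking cookie only when the user has not opted out."""
    if should_track(headers):
        response_cookies["analytics_id"] = "generated-unique-id"
    return response_cookies

print(handle_request({"DNT": "1"}, {}))  # no tracking cookie is set
print(handle_request({}, {}))            # tracking cookie is set
```

The point of the sketch is how little code compliance would take; the slide's observation is that game websites generally do not implement even this check.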
12. MAIN QUESTIONS
Are the players and viewers made aware of the privacy policies and
practices of the game sponsors in an easily comprehensible way?
If any personal data is being processed, is the processing secure?
Is the data protected satisfactorily?
Do players and viewers have access to their data? Can they amend it if
necessary?
Do the broadcasters, platforms, sponsors, or developers share or sell
the data they capture?
13. INFORMATION PRIVACY AND
SURVEILLANCE
Privacy protection is a process of finding appropriate balances between
privacy and multiple competing interests (Clarke, 2017).
Surveillance: the systematic investigation or monitoring of the actions or
communications of one or more persons.
Dataveillance: the gathering of information from personal data systems.
Personal dataveillance vs. mass dataveillance,
Profiling of users (for advertising, sales, service use prediction,
development),
Data gathered by different actors: videogame developers, platform owners,
broadcasting services.
14. NEW SURVEILLANCE
Enforced by the game software architecture (server communication),
Automatic monitoring software to prevent and detect disruptive behaviour
(cheating),
Negative form of dataveillance (Fuchs, 2011) and panopticon dynamic
(Gandy, 2012; Clarke, 2017),
Downplaying of the negative sides of dataveillance by the industry.
Emergent surveillant assemblage – non-human elements (code, legal docs)
are connected to human elements (players, moderators).
16. BUSINESS-DRIVEN
DYNAMICS
Do players agree to the recording of their play, their personal details,
and keylogging?
Is any personal data captured from the viewers?
Incentive for VG companies to invest in governance structures –
establishment of a transparent and fair sport,
Low power of a player as the cost of moving to another game is
high (Castronova, 2005),
Early days: the community decides which game becomes an
esport game (Kerr et al., 2013),
Today: influenced by sponsorship money and broadcasters.
17. LONG-TERM TRENDS
Privacy as contextual integrity (Nissenbaum, 2004).
Privacy breach - when information is supplied in one context and then
extracted from it and used in another context, i.e., disrupting the integrity
of context.
Example: RAM scanning in search of cheaters. Anti-cheating tools are
intrusive technologies as they collect information from all players’
computers.
Example: lack of transparency and visibility of the surveillance mechanisms
in the eSports scene.
Shift towards hierarchical surveillance (as opposed to lateral).
This is opposite to the general trend on the Internet (Kerr et al., 2013).
19. KEY PROBLEMS
Toxic social environments, cheating – in many online games,
Need to establish and enforce social norms,
Companies free to set their own rules within a game as long as they conform to broader
legal frameworks:
protection of minors, data protection, general privacy, and intellectual property.
Balance between right to privacy and the right to property (IP),
Consent to record & broadcast from all players in a session – is gameplay public?
Players’ identity – players may wish to remain anonymous,
Shoutcasting – when private and sensitive data becomes public,
Professional shoutcasters vs. amateur YouTube or Twitch users,
Underage users (13 years old),
Information disclosure: social media and related networks.
21. POWER, TECHNOLOGY,
ETHICS
Increasing surveillance power is enabled by big data
technologies,
Panspectron – information gathered about everything (Braman,
2006),
Ethical issues of discrimination against some players (e.g.,
Tribunal monitoring, behaviour change, and conformity),
Deviant behaviour constantly monitored – justified by economic
goals and a limited view of quality of service and community.
Companies move towards more hierarchical and legalistic forms
of governance and less transparent forms of automated
dataveillance (Kerr et al., 2013).
22. ENFORCEMENT OF SURVEILLANCE
Hierarchical surveillance
Game architecture, software code, and game rules,
Policy documents (PEGI, ISFE, ICDP, EULAs, TOS)
Company community management practices,
Company-produced paratexts,
Lateral surveillance
Community management,
Peer and participatory surveillance, and player-produced paratexts,
Driven by business model considerations: monetization, business expansion,
Ensuring a cheating-free environment (thus confirming the status as an eSport).
23. IN-BUILT BIASES AND SHORTCOMINGS
Automated code is a designed agent and bears the biases of its
designers – an example of algorithmic surveillance (Gillespie, 2014),
ISFE and ICDP concerned that GDPR might impose new restrictions on
what game companies can do with player data,
Prioritizing values: punishing cheaters, but also tension between paying
players and banning them from community,
Certain actions taken by game companies are not transparent and
infringe on the consumer rights of the players (e.g., the right of appeal).
24. GDPR
The General Data Protection Regulation will come into effect on 25 May 2018,
replacing Directive 95/46/EC and a patchwork of national regulations.
Includes (among others):
Data Protection Officers to monitor large scale data gathering,
Data Protection Impact Assessment for data gathering “likely to result in a high risk to the rights
and freedoms of natural persons” (Article 35(1, 3, 4)):
Criteria examples: data transfer across the borders of the EU, systematic monitoring, automated
decision-making with legal or similar effect, sensitive data, and others.
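The Article 35 criteria listed above can be read as a screening checklist: if any applies, a DPIA is likely required. The sketch below turns that reading into an illustrative pre-screening function; the criterion names and the "any one criterion triggers a DPIA" rule are simplifying assumptions for illustration, not a legal test:

```python
# Illustrative DPIA pre-screening based on the Article 35 criteria
# listed above. Criterion names and the any-one-triggers rule are
# simplifying assumptions, not legal advice.

DPIA_TRIGGER_CRITERIA = (
    "transfers_outside_eu",
    "systematic_monitoring",
    "automated_decisions_with_legal_effect",
    "sensitive_data",
)

def dpia_required(processing: dict) -> bool:
    """Return True if any high-risk criterion applies to the processing."""
    return any(processing.get(criterion, False)
               for criterion in DPIA_TRIGGER_CRITERIA)

# An esports platform that systematically monitors gameplay and chat:
print(dpia_required({"systematic_monitoring": True}))  # True
print(dpia_required({}))                               # False
```

For an esports operator, systematic monitoring alone (anti-cheat scanning, chat moderation) would already tick one of these boxes, which is why the regulation is directly relevant to the practices discussed in the earlier slides.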
Our foundation is in privacy, data protection and ethics, and what we bring is technology development that uses our expertise in this area to deliver privacy-compliant services or develop tools with an awareness of the potential impacts of those data collection and analysis processes.
ISFE – Interactive Software Federation of Europe
PEGI – Pan-European Game Information
ICDP – Industry Coalition for Data Protection