Presentation at the 2017 Ethicomp conference discussing issues of editorial responsibility of online platforms arising from personalization algorithms that mediate the information that people see online.
3. Personalisation Algorithms (PAs)
• Personalisation Algorithms (PAs) aim to ensure that the content
presented to the user is relevant and engaging
• PAs are designed to prioritise content, making some information
more visible than other information
• However, there is no transparency in the criteria used to
rank this information
• What are the consequences for users?
• Limiting exposure to attitude-challenging information
• Intellectual isolation
• Filter bubble/echo chamber effects
• Personal agency
6. User understanding of social media algorithms:
Facebook News Feed
Of 40 interviewed participants, more than 60% of
Facebook users were entirely unaware of any algorithmic
curation on Facebook: "They believed every single
story from their friends and followed pages appeared in
their news feed".
Published at: CHI 2015
7. How do the algorithms work?
PAs typically identify personalised
interest patterns based on assumptions
about user behaviour (e.g. time spent
and topics engaged with)
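The inference described above can be sketched in a few lines. This is a hypothetical, minimal illustration (the engagement log, topics, and items are invented for the example, not drawn from any real platform): time spent per topic is turned into an interest profile, which then ranks candidate content.

```python
from collections import defaultdict

# Hypothetical engagement log: (topic, seconds spent) per viewed item.
engagement_log = [
    ("sport", 120), ("politics", 15), ("sport", 90),
    ("music", 40), ("politics", 10),
]

# Infer an interest profile: share of total viewing time per topic.
time_per_topic = defaultdict(float)
for topic, seconds in engagement_log:
    time_per_topic[topic] += seconds
total = sum(time_per_topic.values())
interest = {t: s / total for t, s in time_per_topic.items()}

# Rank candidate items by the inferred interest in their topic.
candidates = [("sport", "Match report"),
              ("politics", "Budget vote"),
              ("music", "Album review")]
ranked = sorted(candidates, key=lambda c: interest.get(c[0], 0.0),
                reverse=True)
print([title for _, title in ranked])
# The "Budget vote" story ends up last: content the user rarely
# dwells on becomes less visible, which is exactly the opacity
# concern raised on the previous slide.
```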
8. Positioning effect in News Feed
E. Bakshy, S. Messing & L.A. Adamic, "Exposure to ideologically diverse news and opinion on Facebook", Science, 348, 1130-1132, 2015
9. Personalised recommendations
• Content-based – results similar to past results the user liked
• Collaborative filtering – results that similar users liked (people
with statistically similar tastes/interests)
• Community-based – results that people in the same social
network liked (people who are linked on a social network,
e.g. 'friends')
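As a concrete illustration of the collaborative-filtering approach listed above, here is a minimal sketch using invented ratings data (the users, items, and scores are hypothetical): unseen items are scored by the ratings of statistically similar users, weighted by cosine similarity.

```python
from collections import defaultdict

# Toy user-item ratings matrix (hypothetical data).
ratings = {
    "alice": {"a": 5, "b": 4, "c": 1},
    "bob":   {"a": 4, "b": 5, "d": 4},
    "carol": {"c": 5, "d": 2},
}

def similarity(u, v):
    """Cosine similarity between two users over co-rated items."""
    common = set(ratings[u]) & set(ratings[v])
    if not common:
        return 0.0
    num = sum(ratings[u][i] * ratings[v][i] for i in common)
    du = sum(r * r for r in ratings[u].values()) ** 0.5
    dv = sum(r * r for r in ratings[v].values()) ** 0.5
    return num / (du * dv)

def recommend(user):
    """Score items the user has not seen by similarity-weighted
    ratings of other users, most recommended first."""
    scores = defaultdict(float)
    for other in ratings:
        if other == user:
            continue
        w = similarity(user, other)
        for item, r in ratings[other].items():
            if item not in ratings[user]:
                scores[item] += w * r
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("alice"))
```

A content-based variant would instead compare item features to the user's past likes, and a community-based variant would restrict the loop over `other` users to the user's declared social connections.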
10. Editorial responsibility… who? Me?
Typical platform responses:
• "We do not alter content"
• "We just reshape content"
• "We are a technological company"
• "We are NOT a media company and therefore we do not hold
editorial responsibility"
11. What about Responsible Research and
Innovation (RRI)?
• Given their global reach and their potential to promote the
exercise and enjoyment of human rights and fundamental
freedoms, social media companies must arbitrate between:
• Public interest vs. Personal privacy (e.g., the right to be forgotten)
• Rights of freedom of expression vs. Protection of harm (e.g. hate
speech)
• Accordingly, social media platforms should take editorial-like
responsibility towards the public and adopt a code of ethics to
promote corporate social responsibility and a healthy
democratic discourse
• Including concepts such as public interest in their
optimisation algorithms
• This would imply applying principles of RRI and developing
mutual learning via multi-stakeholder involvement
12. Responsible Editorial Approach
• Ethical obligation to serve the public interest is based
on trusteeship, where policy makers and media
organisations apply normative principles of social
responsibility
• By contrast, social media platforms exhibit a
marketplace approach
• The platform enables an environment in which users take
responsibility when accepting the terms and conditions (T&Cs)
• In 2012 the Reuters Institute for the Study of
Journalism found that digital intermediaries act as
gatekeepers who exert editorial-like judgements:
• Content ranking manipulates information accessibility
• Lack of transparency in how the relevance of content is defined
13. To conclude
• In keeping with:
• ACM Principles of Algorithmic Transparency and
Accountability
• IEEE Vision for Prioritising Human Wellbeing with AI and
Autonomous Systems
• 2012 recommendation of the Council of Europe on the
protection of human rights with regard to social
networking services
SOCIAL MEDIA PLATFORMS SHOULD BE ACCOUNTABLE
FOR THE EDITORIAL-LIKE CONTROL EXERTED BY THEIR PAs
ON THE CONTENT VISIBILITY EXPERIENCED BY USERS