1. Bus356 Facebook Ethics
2. BACKGROUND INFO
• All Facebook users see a newsfeed when they log in, comprised of friends' posts and other trending news
• A complex algorithm sifts through roughly 1,500 "newsworthy" items and chooses approximately 300 of them to display in your newsfeed at any given time
• What goes into your newsfeed is supposed to be determined by your likes, your posts, popular topics among your friends' posts, etc.
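The filtering idea above can be sketched in a few lines of code. This is a purely illustrative toy, not Facebook's actual (proprietary) algorithm: the `Story` fields, the scoring function, and its weights are all invented for demonstration; only the "rank ~1,500 candidates, keep the top ~300" shape comes from the slide.

```python
# Toy sketch of newsfeed selection. The real ranking signals and weights
# are proprietary; everything below is a hypothetical stand-in.
from dataclasses import dataclass


@dataclass
class Story:
    author_is_close_friend: bool   # assumed signal
    likes: int                     # assumed signal
    matches_user_interests: bool   # assumed signal


def score(story: Story) -> float:
    # Hypothetical weights; the slide only says likes, your posts, and
    # popular topics among friends feed into the ranking somehow.
    s = 0.0
    if story.author_is_close_friend:
        s += 2.0
    s += story.likes * 0.1
    if story.matches_user_interests:
        s += 1.0
    return s


def build_feed(candidates: list[Story], limit: int = 300) -> list[Story]:
    # From ~1,500 candidate items, keep the ~300 highest-scoring ones.
    return sorted(candidates, key=score, reverse=True)[:limit]
```

The experiment described on the next slide amounts to biasing a step like `score()` toward positive or negative content for the selected users.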
3. SO WHAT'S THE ISSUE?
• Turns out Facebook conducted a huge experiment on 689,003 randomly selected users, with neither their knowledge nor their consent
• It altered their newsfeeds, inundating some users with negative posts and others with positive posts
• The goal: to see what effect the altered feeds would have on what the users subsequently posted themselves
4. CRITICS SAY...
• Academic researchers are required to get permission from people before doing psychological research on them; Facebook's experiment falls under this umbrella
• Facebook is way off base here: it is a social media site, not a psychology department
• The experiment showed blatant disregard for people's safety and mental health. What if an already depressed person was randomly selected to receive a negative newsfeed? Facebook should have considered the possible repercussions more carefully.
5. FACEBOOK'S SIDE:
• Every one of Facebook's 1.28 billion users agreed to the terms and conditions when they created an account, so consent was given; it's not Facebook's fault if you didn't read them
• Picture sizes, ad placements, and other elements of your newsfeed are altered all the time to see what is most effective, and this is no different
• Facebook didn't legally need explicit permission, and asking for it would have made people more self-conscious and altered the results
• Facebook claims the experiment was for users' benefit: it wanted to see if there was any truth to the concern that seeing friends' positive posts makes people feel bad or left out and want to stop using Facebook altogether
6. THE RESULTS?
• "MOODS ARE CONTAGIOUS"
• People who saw a positive newsfeed → posted more positive things
• People who saw a negative newsfeed → posted more negative things
• Moods can be spread online, not just face to face
7. QUESTIONS
• The general consensus is that the experiment wasn't illegal, but was it ethical? Should Facebook have asked the users selected for the experiment for consent?
• Was this a valuable study, or psychological manipulation? Did it pose any risks to participants, or was it essentially harmless?
• Google monitors your searches, and Yahoo tracks which articles you read. Is this OK? Is it any different from what Facebook did?
8. WORK CITED
Goel, Vindu. "Facebook Tinkers With Users' Emotions in News Feed Experiment, Stirring Outcry." The New York Times 29 June 2014, Technology sec.: B1. Web. 14 Sept. 2015. <http://www.nytimes.com/2014/06/30/technology/facebook-tinkers-with-users-emotions-in-news-feed-experiment-stirring-outcry.html?_r=1>.