Claudia Wagner Oxford, May 2015
Inequalities and Biases
in socio-technical systems
Doleac & Stein,
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1615149
How to detect biases?
• Audit Studies / Field Experiments
• Observational Study / Secondary
Analysis of Data
Gender Inequalities on Wikipedia
4.4 times more likely to be reported in articles about women
3.8 times more likely to be reported in articles about women
What do they have in common?
The Smurfette Principle
What about other stuff on Wikipedia?
Unequal Chances
Thank You
claudia.wagner@gesis.org
Slam about "Discrimination and Inequalities in socio-computational systems"
Invited Slam at the 1st Science Slam at ICWSM2015 https://www.sg.ethz.ch/activities-events/workshops/1st-icwsm-science-slam/

  • Welcome to my talk about biases and inequalities in socio-technical systems. The organizers told me that the main purpose of this invited talk is not to be funny; the main purpose is to do something against inequalities. Since we have had only one female slammer so far, I am here to push that number. That's good for me, since I achieved the goal simply by walking up on stage. That was easy. And it's also good for me because the topic of my talk is actually not the funniest topic in the world; it's in fact a bit depressing.
    But I'll try to focus on the less serious stuff for now.
  • So let's look at some examples of how biases manifest in different systems. Let me introduce you to José Zamora, who was desperately looking for a job online. Since he was not very successful with his online profile, he decided to simply remove the "s" from his name. So José became Joe Zamora, and a week later, he says, his inbox was full. Another example comes from Stanford, where researchers tried to sell exactly the same product on an e-commerce site: once the product was held in a black hand and once in a white hand. It turned out that whatever you hold in a white hand sells for more money. These two examples show how pre-existing biases also manifest in the online world.
    But online is even worse than offline. Besides these pre-existing biases, technical biases and emergent biases also exist in the online world. An example of a technical bias is the face-recognition and motion-tracking software from HP that could not detect black people. Emergent biases arise from the interplay between algorithms and people, i.e., when algorithms adapt their behavior based on the usage patterns they encounter. For example, an algorithm might learn that on Airbnb hosts with darker skin color are less popular and therefore recommend them less often.
  • So how can we detect those biases? Audit studies are typically field experiments in which researchers intrude into the social process or system that is studied (e.g., sock-puppet audits, scraping audits, code audits, …).
    We can do what José, or Joe, did: run small self-experiments. If we are a bit more ambitious, we can run medium-scale or large-scale experiments. However, if you decide to do that, I need to warn you. I recently had a chat with Aniko, who studied price discrimination and personalization with her colleagues at Northeastern, and she told me a bit about her experience with running these medium-scale experiments for detecting biases, which often involves creating fake accounts.
    That meant she had to create several hundred Google accounts. Her first strategy was to email all her Facebook friends (some of whom eventually decided to unfriend her). When she then tried creating these accounts automatically, the whole IP range of the university got banned from Google (again, people really love you when that happens and it's your fault). Finally they were desperate enough to go to the Mac store, since they thought there are all these laptops one can use. Unfortunately those are all in the same IP range too, and that's why the Mac store IP also got banned. So you see, running these experiments is a fairly adventurous process.
  • In our work we could be a bit less adventurous, since we were interested in exploring gender inequalities on Wikipedia. We could simply grab articles about notable men and women and compare them.

    The good news is that we found it's very easy for women to make it to Wikipedia.
    You don't even need to have a name (or at least a first name is for sure sufficient). Just make sure you marry a really famous guy.
    And if, on top of a famous husband, you also have a famous brother, you are notable for sure.
  • So it looks like it's easy to make it to Wikipedia if you are a woman, but what do they then write about us? Well, they deeply care about the relationship status of women (and that has of course nothing to do with the fact that they are mainly male).
    When comparing articles about men and women on a lexical level, we found that if an article on Wikipedia mentions that a person is divorced, it's 4.4 times more likely to be about a woman, and an article which mentions that a person is married is 3.8 times more likely to be about a woman.
    So either notable women get married and divorced more frequently, or the Wikipedia editor community considers different aspects important depending on whether they document the life of a man or a woman.
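The lexical comparison described above can be sketched roughly as follows. The tiny corpus, the word lists and the add-one smoothing are illustrative assumptions of mine, not the study's actual data or estimator:

```python
def likelihood_ratio(articles, word):
    """Ratio P(word in article | about a woman) / P(word in article | about a man),
    with add-one smoothing so the ratio is defined even for rare words."""
    mentions = {"f": 0, "m": 0}
    totals = {"f": 0, "m": 0}
    for gender, tokens in articles:
        totals[gender] += 1
        if word in tokens:
            mentions[gender] += 1
    p_f = (mentions["f"] + 1) / (totals["f"] + 2)
    p_m = (mentions["m"] + 1) / (totals["m"] + 2)
    return p_f / p_m

# Toy corpus: (gender of the article's subject, set of words in the article).
articles = [
    ("f", {"she", "actress", "married"}),
    ("f", {"she", "singer", "married", "divorced"}),
    ("m", {"he", "footballer", "club"}),
    ("m", {"he", "politician", "married"}),
]

print(likelihood_ratio(articles, "married"))   # 1.5 on this toy data
print(likelihood_ratio(articles, "divorced"))  # 2.0 on this toy data
```

A ratio above 1 means the word is over-represented in articles about women; the 4.4 and 3.8 figures from the talk are exactly this kind of quantity, computed over the real biography corpus.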
  • So I have a quiz question for you guys which seems to be off-topic but actually is not!
    What do the Muppet Show, the Smurfs and Star Wars have in common? There is only one woman in the entire galactic empire, the entire Smurf village and the whole Muppet Show.
  • Katha Pollitt called this observation, that mainly boys define a group and that girls in fiction often exist only in relation to boys, the Smurfette Principle (or token minority). That principle applies to fiction editors, but what about Wikipedia editors?
    We constructed a network of articles about men and women by extracting links between articles and compared the k-coreness distribution of male and female articles. It turns out that more men are part of large and well-connected groups.
    So the Smurfette Principle also seems to hold on Wikipedia.
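The k-coreness comparison can be sketched like this; the peeling implementation and the toy graph are mine, while the study of course used the full Wikipedia link network:

```python
def core_numbers(adj):
    """k-core number of every node in an undirected graph given as
    {node: set_of_neighbors}: repeatedly peel off nodes of degree <= k."""
    degree = {u: len(vs) for u, vs in adj.items()}
    remaining = set(adj)
    core = {}
    k = 0
    while remaining:
        peel = [u for u in remaining if degree[u] <= k]
        if not peel:
            k += 1           # no node removable at this level: go one core deeper
            continue
        while peel:
            u = peel.pop()
            if u not in remaining:
                continue      # may have been queued twice
            core[u] = k
            remaining.discard(u)
            for v in adj[u]:  # removing u lowers its neighbors' degrees
                if v in remaining:
                    degree[v] -= 1
                    if degree[v] <= k:
                        peel.append(v)
    return core

# Toy link graph: a triangle of well-linked articles plus one pendant article.
links = {
    "A": {"B", "C", "D"},
    "B": {"A", "C"},
    "C": {"A", "B"},
    "D": {"A"},
}
cores = core_numbers(links)  # triangle members: 2-core; pendant "D": 1-core
```

Comparing the distribution of these core numbers for articles about men versus articles about women is the comparison described above: higher core numbers mean membership in larger, more densely linked groups.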
  • So maybe you think now: OK, but this male bias only becomes visible in the biography articles on Wikipedia, right?
    Let me give you one other example: articles about professions.
    Some colleagues and I recently went for dinner, and just for fun we were looking through Wikipedia articles about different professions (you might wonder why, but it's not uncommon; people in academia do that every two years). What we saw on Wikipedia was pretty shocking. It turned out to be so difficult to find an article about a profession with a picture of a woman that we started making a game out of it while waiting for food. Everyone had one guess. For example, I would say "hairdresser, German Wikipedia", and if that's correct I get a point. If you ever play this game, I can give you some tips: hairdresser, model, number girl, bar girl and hostess work in many language editions.
    Well, we had a lot of fun, and the next day we crawled all pictures from all professions in all language editions and set up a CrowdFlower task.
  • In German, a male and a female title exist for many professions. The German Wikipedia community has decided to use the male title as the default and redirects from the female profession title to the male one. However, they keep a list of professions for which only a female form exists or which are predominantly practiced by women. The list is short and contains things like: number girl, bar girl, hostess, prostitute.
  • When analyzing all pictures from articles about professions (which we found by matching a list of around 100 professions to different language editions of Wikipedia), we found that more than two thirds of all images that depict one or several people are showing men (or depict several people among whom the men are dominant).

    900 pictures in total.
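As a rough check on how much a "more than two thirds" figure can move around with a sample of 900 images, one can attach a binomial confidence interval to it. The Wilson interval below is a standard formula; the exact 600-of-900 split is an illustrative assumption of mine, not the study's reported count:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return center - half, center + half

# Illustrative: 600 of 900 profession images depicting (mostly) men.
low, high = wilson_interval(600, 900)
```

With 900 images the interval stays narrow (roughly 0.64 to 0.70), so an imbalance of this size is not a small-sample fluke.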
  • Let's look at individual language editions. We found that Italian girls will not find much evidence for the fact that women have jobs too.
    So let's hope that Italian girls don't try to make career decisions based on Wikipedia.
    German, French and English Wikipedia are slightly better. What's the most gender-balanced language edition? Any guesses?
    The most gender-balanced Wikipedia edition turns out to be Esperanto, the most widely spoken constructed language in the world!
    It has 9 female and 17 male pictures in our sample of professions. Italian has 42 male pictures.

    So it looks like we have two options: change Wikipedia or teach our kids Esperanto (if you have girls).

    So, to sum up: what's the take-away message of my talk?
    If you are female, enjoy the fact that you have the entire galactic empire to yourself. Make sure that you marry the most famous guy in the empire; caring about notability is enough. You will probably get divorced anyway, and the whole world will know it. If you have kids, make sure that your little girls learn Esperanto, and don't allow them to watch the Smurfs or the Muppet Show.
