The “algorithm” is now an entity, a subject society has been talking about a lot lately. In 2015, a photo app automatically tagged two African American friends as gorillas. In 2016, a bot called Tay learned, in 12 hours, to be racist, to deny the Holocaust, and to say that feminists “should all die and burn in hell”; in less than 24 hours, it was shut down. Machine learning algorithms are unpredictable when confronted with real people. How much bias can machine learning algorithms introduce? How much comes from the data used to train the algorithms, and how much comes from the algorithms themselves? How can we create products based on machine learning while avoiding gender, race, age, or culture bias, and avoiding harm to those groups?
Yates (Communications of the ACM, June 2018) said that “any remedy for bias must start with awareness that bias exists.” Page (The Difference, 2007) proposed that identity diversity (our gender, race, religion, etc.) leads to cognitive diversity (the way we think and solve problems), mainly in tasks such as prediction and problem-solving. A 2014 study by McKinsey & Company found that diversity fosters innovation and improves financial results. So workplace diversity can help in different ways, including helping to detect and reduce bias in algorithm design and execution.
How much can agile teams, from the beginning of the software development chain, help to minimize bias and reduce backlash for the end user? What is the role of agile when teams are built to work in a machine learning world? The Agile Manifesto values individuals and interactions over processes and tools; agile teams are built on that. Recently, Modern Agile also set two of its four values around people: make people awesome and make safety a prerequisite. Not as causation but, maybe, as correlation, agile values are good evidence that we can build development environments that better support diversity. Once we have more diverse teams, we can expect better (less biased) outputs from machine learning algorithms.
2. Head of Operations, Porto Alegre Office
Product Manager for the Identity Provider Teams (Devices and LGPD)
PhD Student in Computer Science/Software Engineer
Karina Kohl
karina.kohl@gmail.com
MSc in Computer Science - UFRGS
BSc in Computer Science - UFRGS
17 years in IT Industry
10 working in Agile Teams
6. 2019… “The algorithm” still there…
At the end of the day… what is the problem with that?
When I enter my car on a Sunday morning and my phone automatically notifies me
how long it will take to arrive at my parents’ home…
My friends’ posts in social media that I see in my timeline… or don’t see…
When I log in to my preferred streaming video app and it recommends me a new
Hospital TV Show because I marathoned my preferred one the weekend before…
And I can be a little bit more neurotic: “what if my phone (or my assistant) is
listening to my conversations?”
😱
🚑
🚘
🤔
7.
8. Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification
Joy Buolamwini and Timnit Gebru, 2018
"Artificial Intelligence (AI) is rapidly infiltrating every aspect of society. From helping
determine who is hired, fired, granted a loan, or how long an individual spends in prison,
decisions that have traditionally been performed by humans are rapidly made by algorithms.
Many AI systems, e.g. face recognition tools, rely on machine learning algorithms
that are trained with labeled data. It has recently been shown that algorithms
trained with biased data have resulted in algorithmic discrimination."
(Gender and skin type balanced PPB dataset at gendershades.org)
9. Biased and wrong? Facial recognition tech in the dock - Matthew Wall
BBC News, July 2019 (https://www.bbc.com/news/business-48842750)
"Police and security forces around the world are testing out automated facial recognition
systems as a way of identifying criminals and terrorists.
But how accurate is the technology, and how easily could it - and the artificial
intelligence (AI) it is powered by - become tools of oppression?
With black Americans making up 37.5% of the US prison population (source: Federal
Bureau of Prisons) despite the fact that they make up just 13% of the US population -
badly written algorithms fed these datasets might predict that black people are more
likely to commit crime."
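The feedback loop the article describes can be illustrated with a small synthetic sketch. All numbers here are invented for illustration (`TRUE_RATE`, `SAMPLING_BIAS`, and the group labels are hypothetical, not real crime statistics): two groups have the same underlying offence rate by construction, but one group's offences are recorded three times as often, so any model trained on the recorded data "learns" a group difference that does not exist in reality.

```python
import random

random.seed(0)

# Hypothetical setup: identical true offence rate for both groups,
# but group B's offences are recorded 3x as often (heavier policing).
TRUE_RATE = 0.05
BASE_RECORDING_PROB = 0.2
SAMPLING_BIAS = {"A": 1.0, "B": 3.0}

def make_dataset(n=100_000):
    """Simulate (group, arrested) records with biased data collection."""
    rows = []
    for _ in range(n):
        group = random.choice(["A", "B"])
        offended = random.random() < TRUE_RATE
        # The bias lives here: only the *recording* differs by group.
        recorded = random.random() < BASE_RECORDING_PROB * SAMPLING_BIAS[group]
        rows.append((group, offended and recorded))
    return rows

def arrest_rate(rows, group):
    """Fraction of a group's records marked as arrests."""
    in_group = [arrested for g, arrested in rows if g == group]
    return sum(in_group) / len(in_group)

data = make_dataset()
# A model fit to these records would see group B as roughly 3x "riskier",
# although both groups offend at the same true rate by construction.
print(f"recorded arrest rate, group A: {arrest_rate(data, 'A'):.3f}")
print(f"recorded arrest rate, group B: {arrest_rate(data, 'B'):.3f}")
```

The point of the sketch is that no step of the "algorithm" is malicious: the skew is injected entirely by the data-collection process, which is exactly why bias in the training data is so hard to spot from inside the model.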
15. https://www.nytimes.com/2008/01/08/science/08conv.html
Question: "The term “diversity” has
become a code word for inclusion of racial,
ethnic and sexual minorities. Is that what
you’re talking about?"
Answer: “I mean differences in how
people think. Two people can look quite
different and think similarly. Having said
that, there’s certainly a lot of evidence
that people’s identity groups — ethnic,
racial, sexual, age — matter when it
comes to diversity in thinking.” (Scott E.
Page)
16. Cognitive Diversity
The difference between how we interpret, reason and
solve problems - how we think.
Identity Diversity
It is determined by affiliation with a social group such as
gender, culture, ethnicity, religion, sexual orientation,
etc.
*Scott E. Page - The Difference - 2007.
22. But… Why does it matter?
Identity diversity is the right and ethical thing to do… But it
is also the right thing to do for your business…
Agile is built on people and how
they interact!