
Hybrid Intelligence: AI Design for Humans

Presenter: Clare Corthell, summer.ai

Artificial intelligence (AI) now predicts a remarkable share of an individual's experience. AI software can pick your stocks, score a customer's churn risk, and find a date who's your type. These inferential technologies help users make better choices among a massive number of options. But as software makes more decisions on the user's behalf, how do we, the creators of this software, expand the user's world instead of constraining it? How do we ensure that individuals are not equated with the collective, but treated as people with individual needs, desires, and the agency to choose? Spanning design, neuroscience, artificial intelligence, and ethics, this talk illustrates how creators can elevate a user's experience while leveraging the power of AI.

Hybrid Intelligence: AI Design for Humans

1. Wrangle 2015: Designing Artificial Intelligence for Humans. Clare Corthell, summer.ai
2. (image slide, no text)
3. We build algorithms that make inferences about the real world.
4. Potential to make life better
5. Potential to do harm, in the real world
6. Let's get really uncomfortable
7. Goal: how to mitigate harm when using data about the real world
8. (image slide, no text)
9. Reasonable Fear
10. How harm came about; where harm occurs; action we can take
11. How does harm come about?
12. How: Digital Technology and Scale
13. How: Inference Technology and Scale. Human-like decisions; humans being "automated away"; decisions made faster, at scale, and without humans.
14. Computers do everything ∴ rainbows and cornucopias, right?
15. Where: Bias and prescription (not everyone gets their cornucopia of rainbows)
16. Where: Bias, holding prejudicial favor, usually learned implicitly and socially
17. Where: Bias in data, skews that we can't directly observe
18. Where: Bias, scale, harm. Inference technology scales human decisions; any flawed decisions or biases it is built on are scaled, too.
19. Where: Descriptive, predictive, prescriptive. What happened? What will happen? What should happen? (A minimal sketch follows the transcript.)
20. Where: Prescription, when inferences are used to decide what should happen in the real world
21. Where: Prescription is power, possible and likely to reinforce biases that already existed in the real world
22. Where: Examples: online dating, future founders, loan assessment
23. Where: Dating, subtle bias and race
24. Where: Future startup founders, institutional bias: college education, computer science major, years of experience, last position title, approximate age, work experience in a venture-backed company
25. Where: Loan assessment, the long history of bias
26. Beyond criticism, to action
27. The world isn't perfect, but you could make it better
28. The future is unwritten, yet sometimes we forget that.
29. Design for the world you want to live in.
30. Finding Bias: bias is difficult to understand because it lives deep within your data. (A sketch of surfacing it follows the transcript.)
31. Action: Working against bias: (1) talk to users, impacted people, and stakeholders; (2) construct fairness
32. Action: Talk to people. Seek to understand who they are, what they value, what they need, and what potential harm can affect them.
33. Action: Construct fairness. For example: to mitigate gender bias, include gender so you can actively enforce fairness. (A sketch follows the transcript.)
34. Designing algorithms is creative (data does not speak for itself)
35. We should value what we do not simply by the accuracy of our models, but by the benefit for humans and the lack of negative impact
36. A facial recognition thought experiment
37. If you don't build it, maybe no one will
38. You have agency, and a market price
39. We're at the forefront of a new age governed by algorithms
40. Remember the people on the other side of the algorithm
41. Because whether it's their next song, their next date, or their next loan, you're designing their future. Make it a future you want to live in.
42. Thank you. @clarecorthell, clare@summer.ai. Sources of note: "Fairness Through Awareness," Dwork et al.; "Fortune-Tellers, Step Aside: Big Data Looks for Future Entrepreneurs," NPR; Harvard Implicit Bias Test.
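
To make slide 19's distinction concrete, here is a minimal sketch in Python. Everything in it is a hypothetical illustration rather than anything from the talk: the repayment history, the base-rate stand-in for a predictive model, and the decide_loan threshold are all invented for the example.

```python
# Descriptive vs. predictive vs. prescriptive, in miniature.
import statistics

# Hypothetical loan outcomes (1 = repaid, 0 = defaulted).
history = [1, 1, 0, 1, 1, 1, 0, 1]

# Descriptive: what happened?
repayment_rate = statistics.mean(history)  # 0.75

# Predictive: what will happen? (A stand-in "model" that just
# returns the base rate; a real model would use applicant features.)
def predict_repayment_probability(applicant):
    return repayment_rate

# Prescriptive: what *should* happen? The threshold is a human
# choice, and this is where inference starts acting on the world.
def decide_loan(applicant, threshold=0.7):
    p = predict_repayment_probability(applicant)
    return "approve" if p >= threshold else "deny"

print(repayment_rate, decide_loan({"income": 50_000}))
```

The point of the sketch is that prescription layers a value judgment (the threshold) on top of a prediction, which is exactly where any bias in the data turns into action in the real world.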
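For slide 30, one simple way to start surfacing bias that "lives deep within your data" is to compare outcome rates across groups. The records, group labels, and field names in this sketch are hypothetical.

```python
# Surface skew by comparing outcome rates across groups.
from collections import defaultdict

# Hypothetical records; real data would have many more rows and fields.
records = [
    {"group": "A", "approved": True},
    {"group": "A", "approved": True},
    {"group": "A", "approved": False},
    {"group": "B", "approved": True},
    {"group": "B", "approved": False},
    {"group": "B", "approved": False},
]

totals = defaultdict(int)
approvals = defaultdict(int)
for r in records:
    totals[r["group"]] += 1
    approvals[r["group"]] += r["approved"]

# A large gap between groups is a signal worth investigating,
# not proof of bias on its own; it tells you where to look.
for g in sorted(totals):
    print(g, approvals[g] / totals[g])
```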
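For slide 33's "construct fairness" example, one concrete reading is: keep the sensitive attribute visible so you can measure and correct the model's behavior per group, rather than dropping it and hoping bias disappears. The sketch below equalizes approval rates across groups with per-group thresholds (demographic parity). That is only one fairness definition among many, offered in the spirit of Dwork et al.'s "Fairness Through Awareness"; the scores, group labels, and target rate are all assumptions.

```python
# Enforce roughly equal approval rates across groups (demographic
# parity) by choosing a decision threshold per group.

scored = [  # hypothetical (model score, gender) pairs
    (0.91, "f"), (0.62, "f"), (0.55, "f"),
    (0.88, "m"), (0.79, "m"), (0.40, "m"),
]

def group_threshold(scores, target_rate):
    """Threshold that approves roughly target_rate of the group."""
    ranked = sorted(scores, reverse=True)
    k = max(1, round(target_rate * len(ranked)))
    return ranked[k - 1]

target = 0.5  # desired approval rate; a deliberate human choice

by_group = {}
for score, gender in scored:
    by_group.setdefault(gender, []).append(score)

thresholds = {g: group_threshold(s, target) for g, s in by_group.items()}

for score, gender in scored:
    decision = "approve" if score >= thresholds[gender] else "deny"
    print(gender, score, decision)
```

Note that this only works because gender is present in the data at decision time, which is the slide's point: you cannot actively enforce fairness along a dimension you have hidden from yourself.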
