This document discusses challenges with AI governance and the potential for MLOps tools to help address them. It notes that while the AI market is growing rapidly, many machine learning projects fail or have issues with performance, bias, privacy, reproducibility and risk. MLOps aims to automate governance processes to make them easier and more standardized, as was done with DevOps, in order to enable faster and better model development and deployment while properly managing risks. Key governance goals that MLOps tools can help with include data and model management, monitoring, explainability and reproducibility.
1. Challenges for AI in Prod
An Intro to AI Governance
Ryan Dawson - Seldon
2. Intro
- AI Rush
- But lots of Governance Concerns:
- Performance
- Bias
- Privacy
- Reproducibility
- Rise of Guidelines
- MLOps for faster and better AI
4. Famous failures
‘Watson for Oncology’ failed to the tune of $62M
Microsoft Chatbot turns racist
Apple face recognition fooled by mask (robustness failure)
First fatality involving self-driving car
10. Bias
Google Photos labelled black people as gorillas
There are data points we should not use for certain purposes. E.g. using race in
automated parole recommendations
The data we use might itself contain bias
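One way to catch bias like this is to compare outcomes across groups before release. Below is a minimal sketch of a demographic parity check; the data, group labels, and tolerance are all hypothetical, and real bias auditing involves many more metrics than this one.

```python
# Hypothetical sketch of a demographic parity check: compare the rate of
# positive predictions across groups and flag large gaps for review.
def positive_rate_by_group(predictions, groups):
    """Return {group: fraction of positive predictions} for paired lists."""
    totals, positives = {}, {}
    for pred, group in zip(predictions, groups):
        totals[group] = totals.get(group, 0) + 1
        positives[group] = positives.get(group, 0) + (1 if pred else 0)
    return {g: positives[g] / totals[g] for g in totals}

# Made-up predictions and group memberships for illustration.
preds  = [1, 0, 1, 1, 0, 0, 1, 0]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]
rates = positive_rate_by_group(preds, groups)

# If the gap between groups exceeds a chosen tolerance, escalate for review.
disparity = max(rates.values()) - min(rates.values())
```

A check like this is cheap to run on every candidate model, which is exactly the kind of step governance processes ask teams to evidence.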
11. Privacy
The Facebook and Cambridge Analytica scandal has highlighted privacy issues.
A predictor is likely to predict similarly to its nearest data points in the training set.
Say you’re predicting voting and somebody asks for a prediction for retired female
voters in a given district.
One might not expect that to reveal much about who was surveyed for the training
data - but it might if there are only a handful of retired female voters in that district.
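The sparse-subgroup leak can be made concrete with a toy example. Everything below is invented: a tiny survey with one retired female voter in a district, and a naive predictor that answers subgroup queries by majority vote over matching records.

```python
# Toy illustration of how a prediction for a sparse subgroup can echo the
# training data. All records and field names are hypothetical.
training = [
    {"district": "D1", "retired": True,  "female": True,  "vote": "party_x"},
    {"district": "D1", "retired": False, "female": True,  "vote": "party_y"},
    {"district": "D1", "retired": True,  "female": False, "vote": "party_y"},
]

def predict_vote(district, retired, female):
    """Naive subgroup predictor: majority vote of exactly-matching records."""
    matches = [r["vote"] for r in training
               if (r["district"], r["retired"], r["female"]) == (district, retired, female)]
    if not matches:
        return None
    return max(set(matches), key=matches.count)

# Only one retired female voter in D1 was surveyed, so this "aggregate"
# prediction is exactly that individual's recorded vote.
prediction = predict_vote("D1", retired=True, female=True)
```

Real models interpolate rather than look up records, but the effect is the same in spirit: predictions over thinly populated slices of the input space can reveal individual training points.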
12. Risk
So things do go seriously wrong.
Imagine something goes wrong and you can’t quickly fix it… or show that you took
safeguards.
This could mean legal risk, but especially reputational and financial risk.
13. Governance processes
Can be really manual
Form a dedicated ML QA team?
Code reviews and questionnaires
Sign-offs before release e.g.
“Name the business owner who signed off the data for use.”
“State any bias checking or reasons why bias monitoring is not needed.”
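Questionnaires like these can also be encoded so a pipeline enforces them. The sketch below is a hypothetical shape for such a check, not any particular platform's format: the field names are invented, and a real system would also validate who filled them in.

```python
# Hypothetical sketch: encode release sign-off questions as a machine-checkable
# record so a deployment pipeline can block releases with missing answers.
REQUIRED_FIELDS = ["data_owner_signoff", "bias_check"]

def signoff_complete(record):
    """A release record passes only if every required field has a non-empty answer."""
    return all(record.get(field) for field in REQUIRED_FIELDS)

# A complete record passes; a record with an empty or missing field is blocked.
release = {"data_owner_signoff": "jane.doe", "bias_check": "parity report attached"}
ok = signoff_complete(release)
blocked = signoff_complete({"bias_check": ""})
```

Moving the questionnaire into the pipeline is the same shift DevOps made: the manual form still exists conceptually, but the gate is automated.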
14. MLOps to the rescue?
There are tools, but to achieve governance nirvana you’d have to do a lot of
configuration: especially on data and model management, but also some on
monitoring.
‘DevOps’ used to be really manual too
Release teams, release managers, artifact stores, questionnaires and detailed
documents
15. Reproducibility
This is something you want if things go wrong… But you kinda want it anyway.
And it’s challenging at multiple levels:
- Common tooling and team processes
- Dependency management
- Data management
- Artifact tracking through the lifecycle
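One concrete piece of data management and artifact tracking is fingerprinting inputs and recording versions per run. The manifest below is an illustrative sketch, not a specific tool's schema; the field names and example values are made up.

```python
import hashlib
import json

# Illustrative sketch of artifact tracking: fingerprint the training data and
# record dependency versions so a run can be matched to its inputs later.
def data_fingerprint(rows):
    """Deterministic SHA-256 over a canonical JSON encoding of the data."""
    canonical = json.dumps(rows, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()

def run_manifest(rows, dependencies, model_version):
    """Bundle everything needed to reproduce (or audit) a training run."""
    return {
        "data_sha256": data_fingerprint(rows),
        "dependencies": dependencies,       # e.g. pinned package versions
        "model_version": model_version,
    }

# Hypothetical run: one data row, one pinned dependency, a version label.
manifest = run_manifest(
    rows=[{"age": 39, "income": ">50K"}],
    dependencies={"scikit-learn": "1.4.2"},
    model_version="income-clf-003",
)
```

Because the fingerprint is deterministic, a later rerun can verify it is training on identical inputs, which is most of what reproducibility asks for at the data level.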
16. Not One-Size-Fits-All
Sometimes you need long-running experiments. Sometimes CI is enough.
Sometimes the model is small.
Sometimes old predictions are ‘throwaway’
Not everyone needs explainability.
17. Explainability
Let’s say you want explainability
This is a data science task in itself. There are libraries.
But it also requires you to know exactly what the request was and what version
was running.
Let’s look at a quick example using an income classifier trained on US census
data. We’ll step into its request log and see why it made a particular prediction.
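To show the flavour of per-request explanation without depending on any particular library, here is a toy occlusion-style attribution over a hand-written linear income classifier. The weights, features, and request are all invented for illustration; this is not the method used in the talk's demo.

```python
# Illustrative per-feature attribution on a made-up linear income classifier:
# measure how much the score drops when each feature is zeroed out (occlusion).
WEIGHTS = {"age": 0.02, "education_years": 0.15, "hours_per_week": 0.05}
BIAS = -2.5

def score(features):
    """Linear score; higher means more likely to be classed as high income."""
    return BIAS + sum(WEIGHTS[k] * v for k, v in features.items())

def attributions(features):
    """Change in score when each feature is set to zero."""
    base = score(features)
    return {k: base - score({**features, k: 0}) for k in features}

# A hypothetical logged request; in production you'd also log the request ID
# and the exact model version that served it.
request = {"age": 40, "education_years": 13, "hours_per_week": 50}
attr = attributions(request)
```

The governance point is the second half of the slide: an attribution is only useful if you can tie it back to the exact request and model version that produced the prediction.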
19. Good Governance Needs to Get Easier
This means MLOps needs to get easier
And more pluggable (even if buying a whole platform from one provider)
And we have to better understand what we want from it
20. DevOps Now
DevOps tools are now surprisingly well-delineated
We know what a CI server is, or a container orchestrator, etc.
MLOps getting there
22. Faster and Better?
Faster and better governance is possible… with automation
Flexible automation like we have with DevOps requires standardization. That can’t
happen with a single innovation. It also requires collective alignment, which takes
time.
At Seldon we’re proud to be playing our part
23. AI Governance in 2020
The range of AI Governance concerns can be overwhelming.
MLOps provides tools to help.
Projects have to choose which apply to their case.
Platform teams need to think about the range of cases in their organisation.