2. Agenda
● My Journey as a PM
● Introduction to Data Science
● Applications of AI/ML in E-commerce
● Journey of an AI/ML product
● PM Responsibilities
● When to use AI/ML
● Mini Case Study
● Challenges & learnings
● Q&A
3. Who am I?
2009 - Graduated as a CS Engineer; joined Amdocs, built Billing & CRM software
2011 - Joined GlobalScholar (EdTech); built e-learning applications
2013 - Started my MBA from IIM Bangalore; majors in Finance
2015 - Started my PM journey at Flipkart; led Catalog Platformization
2017 - Started working on engagement constructs - UGC, Wishlists & Collections
2019 - Moved to the Marketplace Team (Seller side); leading Selection Design
● Engineer turned Product Manager
● Experience across domains - E-commerce, EdTech, Telecom
● Experience building both consumer-facing products and platform products
● Have worked extensively with Data Scientists to solve key customer & business problems at scale
4. My DS Journey
● Identifying fraudulent activity (building intelligence)
● Auto-answering of customer queries (creating customer delight)
● Content Quality & Ranking (building intelligence)
● Demand-supply gap analysis (solving for business needs)
● Feedback summarisation (creating customer delight)
● Selection Benchmarking & Assessment (solving for business needs)
● Auto-moderation of content (automation)
6. Highlights
● Vast & complex
● Fast evolving
● Lots of misconceptions
Common Queries
● How much Data Science should a PM know?
● What are the artefacts that a PM produces while building a data science product?
● How much time does it take to build a Data Science product?
● What does your typical day look like?
● How does data science form a part of product strategy?
7. Introduction to Data Science
8. Journey of an AI/ML Product
Problem Definition (PM)
● Define the business problem
● Formulate hypotheses
● Translate it into one or more DS problem statements
● Identify the right metrics & establish clear success criteria

Data Exploration (PM/DS)
● Identify/create the underlying datasets
● Identify feature sets (domain knowledge comes in handy)
● Identify different user segments, corner cases, etc.

Modelling & Optimisation (P - DS, S - PM)
● Modularise
● Explore various models/techniques
● Train the models & iterate
● Measure the model metrics, make tradeoffs
● Validate through business/ops

Deployment & Experimentation (P - DS/Engg, S - PM)
● Integrate with the core Tech stack
● Implement a fallback flow & have feature flags
● Instrument the necessary data signals
● Perform an A/B experiment
● Design the UX

Scale up & Maintenance (P - Engg, S - PM/DS)
● Enable logging & debuggability
● Set up alerts & dashboards for health metrics
● Perform periodic checks to identify the need to retrain the models
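The fallback flow and feature flags mentioned on this slide can be sketched roughly as follows. This is a minimal, assumed shape - the function names (`moderate`, `model_predict`, `rules_predict`) and the content-moderation scenario are illustrative, not part of the deck:

```python
# Hypothetical sketch: serving an ML prediction behind a feature flag,
# with a rule-based fallback when the flag is off or the model fails.

def rules_predict(item):
    # Simple deterministic fallback, e.g. reject items with banned words.
    banned = {"spam", "fake"}
    return "reject" if banned & set(item["text"].lower().split()) else "accept"

def model_predict(item):
    # Stand-in for a call to the deployed ML model service.
    raise RuntimeError("model service unavailable")  # simulate an outage

def moderate(item, flag_enabled):
    if not flag_enabled:
        return rules_predict(item)   # feature flag off -> rules path
    try:
        return model_predict(item)   # primary ML path
    except Exception:
        return rules_predict(item)   # graceful degradation on failure

print(moderate({"text": "great product"}, flag_enabled=True))  # falls back, prints "accept"
```

The point of the pattern is that the user-facing flow keeps working (and the flag lets you ramp traffic gradually for the A/B experiment) even when the model path is broken.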
9. Problem Definition - Mini Case Study
Aspect Ratings
● Define the business problem & the product to be built
  ● User Need - to research the product & its features
  ● Business Problem - enable faster & more convenient decision making for the user
  ● Target product - summarise customer feedback at an aspect level
● Translate it into DS problems
  ● Identify relevant aspects automatically
  ● Tag feedback to aspects
  ● Grade feedback from positive to negative
  ● Summarise feedback at product level
● Business metrics vs. DS metrics
  ● Business metrics - engagement, conversion
  ● DS metrics - coverage, accuracy
● Challenges
  ● Linguistic complexities - incorrect grammar, use of Hinglish, etc.
  ● Optimising for category nuances
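Two of the DS problem statements above - tagging feedback to aspects and grading it from positive to negative - can be sketched with a deliberately naive keyword approach. The aspect keywords and sentiment lexicon below are illustrative assumptions, not how a production system (which must handle the linguistic complexities noted above) would work:

```python
# Naive keyword-based sketch of aspect tagging + sentiment grading.
# Aspect names and word lists are made up for illustration only.

ASPECTS = {
    "battery": {"battery", "charge", "backup"},
    "camera": {"camera", "photo", "picture"},
}
POSITIVE = {"great", "good", "excellent"}
NEGATIVE = {"poor", "bad", "terrible"}

def tag_feedback(review):
    words = set(review.lower().split())
    # Tag every aspect whose keyword set overlaps the review.
    aspects = [a for a, kw in ASPECTS.items() if words & kw]
    # Grade by counting positive vs. negative cue words.
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    sentiment = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    return aspects, sentiment

print(tag_feedback("Great battery backup but poor camera"))
# → (['battery', 'camera'], 'neutral')
```

Even this toy version shows why the problem is hard: a review that praises one aspect and criticises another averages out to "neutral", which is exactly why grading per aspect, not per review, is part of the problem statement.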
10. Applying your domain knowledge - I
● Identify relevant feature sets
  ● Proxy data
  ● Signals from the broader ecosystem
● Handling data anomalies
  ● Filtering out noise
  ● Handling data quality issues
● Ensuring data quality
  ● Check-the-checker flows
● Critique the models
  ● Question assumptions
  ● Play the devil's advocate
  ● Test out both happy & unhappy flows
11. Applying your domain knowledge - II
● Make the right trade-offs
  ● Accuracy vs. interpretability
  ● Precision vs. recall
● Identify commonalities across products
  ● Modularise & re-use
12. Examples - Tradeoffs
1. Filtering out profanity from user-generated content - what do you prioritise?
   a. Precision
   b. Recall
2. Aiding law & court proceedings - what do you prioritise?
   a. Accuracy
   b. Explainability
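The precision vs. recall question above comes down to which error type is costlier. A quick computation from a confusion matrix makes the two metrics concrete; the counts here are made up for illustration:

```python
# Precision vs. recall from a hypothetical profanity-filter confusion
# matrix. The counts are invented purely for illustration.
tp = 80   # profane items correctly flagged
fp = 5    # clean items wrongly flagged
fn = 20   # profane items missed

precision = tp / (tp + fp)  # of what we flagged, how much was truly profane
recall = tp / (tp + fn)     # of all profane content, how much we caught

print(f"precision={precision:.2f}, recall={recall:.2f}")
# → precision=0.94, recall=0.80
```

Which one to prioritise depends on the relative cost of each error: missing profanity (low recall) harms other users, while over-flagging (low precision) censors legitimate content; the trade-off is a product call, not a modelling one.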
13. User Experience Design - I
Capture user preferences
● Design your onboarding experience
● Ask explicitly
● Allow users to update their preferences
● Allow users to blacklist
Create feedback loops
● Validate your output regularly
● Respond to new data
14. User Experience Design - II
Collect data intelligently
● Make it playful
● Solve a customer need
Communicate effectively
● Build trust with the user
● Tell the why part
Interact naturally
● Make it look & sound human
● Simulate interactions & review
Google Duplex phone calls -
15. Examples - Whether to use AI/ML?
1. Shortlisting resumes for a job profile?
   a. Yes
   b. No
2. Taking decisions during a medical surgery?
   a. Yes
   b. No
16. When to use AI/ML?
Guiding principles:
● Recurring needs which are too costly or time consuming to do manually (e.g. content moderation)
● When rules are not enough - either there are too many, or they are too complex to define objectively (e.g. address intelligence)
● The scale of data to analyse & predict is huge (e.g. recommendations)
● The underlying data keeps changing over time (e.g. user preferences)
How to get started:
● Manual -> Rules -> DS
When not to use DS?
● Rules work reasonably well
● Mission-critical systems with no scope for errors, where decisions are irreversible
● Explainability is crucial
17. Mini Case Study
Recommendation Engines & Personalisation
● Why do we need AI/ML?
  ● Too many choices
  ● Too many users with varying preferences
● Dimensions of Personalisation
  ● Level: None -> Cohort level -> User level -> User * context level
  ● Aspects: language, genre, content format, etc.
● Data Inputs
  ● Implicit signals - browse/watch history, completion rates
  ● Explicit signals - selected interests during onboarding
  ● Context - user context vs. session context
● Techniques
  ● Collaborative Filtering
  ● Facet Similarity
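Collaborative filtering, listed as a technique above, can be shown in miniature: recommend items a user hasn't seen, weighted by ratings from users with similar taste. The tiny ratings matrix and user names below are made up for illustration:

```python
# Minimal user-based collaborative filtering sketch.
# The ratings matrix and names are invented for illustration.
import math

ratings = {                      # user -> {item: rating}
    "alice": {"A": 5, "B": 3, "C": 4},
    "bob":   {"A": 5, "B": 3, "D": 5},
    "carol": {"A": 1, "B": 5, "C": 2},
}

def cosine(u, v):
    # Cosine similarity over the items both users rated.
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[i] * v[i] for i in common)
    return dot / (math.sqrt(sum(x * x for x in u.values())) *
                  math.sqrt(sum(x * x for x in v.values())))

def recommend(user, k=1):
    # Score unseen items by similarity-weighted ratings of other users.
    seen = ratings[user]
    scores = {}
    for other, theirs in ratings.items():
        if other == user:
            continue
        sim = cosine(seen, theirs)
        for item, r in theirs.items():
            if item not in seen:
                scores[item] = scores.get(item, 0.0) + sim * r
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("alice"))  # → ['D'] - driven by bob's similar taste
```

This also illustrates why AI/ML is needed here: with millions of users and items, neither the similarity computation nor the scoring can be expressed as a manageable set of hand-written rules.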
18. Challenges & Learnings
● DS is non-deterministic
  ● Stakeholders want predictability
  ● Predicting the likelihood & extent of success is almost impossible in the beginning
  ● Start small, iterate and scale up
● Business decisions
  ● Build vs. buy vs. license
  ● TTM, strategic importance, skill availability
  ● Time-bounding the research & development process
● DS models lack explainability
  ● Prioritise between accuracy & explainability
  ● Communicate & create transparency
  ● Leave scope for manual overrides
● User data is no longer secure
  ● Put the control in the hands of the user
  ● Anonymise the data
  ● Build organisational firewalls to restrict access
  ● Communicate the benefits of using the data
19. Examples - PM Decisions
1. You built a feature using AI/ML. The model has a pretty high accuracy of 95%, yet the feature, when launched, led to a drop in conversion. What do you do?
   a. Launch the feature
   b. Kill the feature
   c. Reimagine the feature