
1440 horrobin using our laptop

#PAWFINDAY1

Published in: Data & Analytics

  1. Analytics Capstone Projects: Embedding Analytics Throughout Your Organization
     Rob Horrobin, AVP, Head of Ops Optimization & Decision Analytics, John Hancock
     Mike Thurber, Lead Data Scientist, Elder Research, Inc.
  2. Question: When Are You a Data Scientist?
     ▪ When did you know you were a Data Scientist?
     ▪ “Put your helmet on…now you’re a football player”
  3. Need for Change
     ▪ “Good process serves you so you can serve customers. But if you’re not watchful, the process can become the thing.” (Jeff Bezos)
     ▪ Customer expectations are changing, and customers measure you by the sum of their experiences
     ▪ Opportunity: push data-driven decision-making to the customer interaction point
  4. Step 1: Put the Customer First!
     Internal Focus:
     ▪ Functional decomposition
     ▪ Optimization of sub-tasks
     ▪ Measuring against ourselves
     ▪ Regulatory or operationally based SLAs
     ▪ Result: potential rework, hand-offs
     Customer Focus:
     ▪ Functionally linked, end-to-end
     ▪ Customer-experience based
     ▪ Measuring against everyone
     ▪ SLAs based on what the customer expects
     ▪ Result: rapid adaptation to customer wants
  5. Step 2: Assess Practice Models
     Centralized Practice:
     + Analytics expertise
     + Economies of scale
     + Organizational commitment
     - Time to solution
     - Lack of business knowledge
     - Candor challenge
     Distributed Practice:
     + Business context
     + Time to implement
     - Lack of objectivity
     - Day job vs. moonlighting
     - May lack expertise in one method
     [Diagram: a centralized practice serving multiple Ops/Customer Service areas vs. distributed practices embedded in each Ops/Customer Service area]
  6. How Do We Leverage Scale and Customer Touchpoints?
     [Image: World Health Organization, © Yann Forget / Wikimedia]
     ▪ Customized help
     ▪ Inferential learnings
  7. Answer: A Hybrid Model of Citizen Data Scientists
     + Domain expertise
     + Economies of scale
     + Organizational commitment
     + Cross-collaboration
     + Faster decision-making
     + Skills-matching
     + Time to solution
     + Skills to fit scope
     + Embedded talent
     + Community
     [Diagram: a centralized practice connected to distributed practices embedded in the Ops/Customer Service areas]
  8. Step 3: Design a Curriculum
     Objective: accelerate, enhance, and simplify decision-making at the moment of truth, using near-real-time information.
     ▪ Better, more timely decisions
     ▪ Lower risk (quantitative support)
     ▪ Lower cost (access to information)
     ▪ Better customer experience (time to decision)
     ▪ Invest in existing analytics practitioners and supercharge their experience (employee benefit)
     DAP = Decision Analytics Professional Program; 5 graduates, June 2016 to June 2017
     Phase 1: Large data sets, statistics, MS Access, statistical Excel functions
     Phase 2: Simulations, general linear models, advanced table/query generation
     Phase 3: Intro to SAS (JMP), SQL queries, business interaction models (CRISP-DM), introduction to RStudio
     Phase 4: Capstone project (expectation to self-fund a full year of the program)
  9. Step 4: Identify Data Talent
     ▪ Aptitude: reporting analysts in operations areas, Finance staff
     ▪ Opportunity: areas going through change with some set of data on hand, a data-driven culture, value in numbers
     ▪ Interest: leverage word of mouth, the network of leadership, information on web portals, the power of scarcity
  10. Step 5: Negotiate with Leadership
     ▪ Set expectations: 20% of staff time, real projects
     ▪ Legitimacy: structured curriculum with outside experts
     ▪ WIIFT (what's in it for them):
       ▪ Split the cost
       ▪ Employee engagement
       ▪ Will advance their area and times to solution
     ▪ No poaching!
  11. Step 6: Select Meaningful Projects
     ▪ Key criteria:
       ▪ Applicable to the business area
       ▪ Implementable recommendations
       ▪ Direct impact on customer satisfaction, expense, or de-risking the business
       ▪ Data accessible BUT requires cleansing and building an Analytics Base Table
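The Analytics Base Table named in the criteria is the one-row-per-entity table that modeling tools expect. A minimal sketch in Python, pivoting a hypothetical raw event log into per-customer feature counts (the customer IDs and event types are invented):

```python
from collections import defaultdict

# Hypothetical cleansed event log: (customer_id, event_type) pairs.
events = [
    ("C1", "call"), ("C1", "claim"), ("C1", "call"),
    ("C2", "claim"), ("C2", "policy_change"),
]

# Analytics Base Table: one row per customer, one column per event type.
abt = defaultdict(lambda: {"call": 0, "claim": 0, "policy_change": 0})
for cust, event in events:
    abt[cust][event] += 1

print(dict(abt))
```

The cleansing step the slide warns about happens before this pivot: deduplicating events, reconciling IDs, and fixing timestamps is usually the bulk of the work.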
  12. Instructor’s View: Projects Translate a Business Need into an Inductive Reasoning Question
     ▪ How can we reduce errors and improve the customer experience with application and claim processes? → Audit Optimization Model
     ▪ What do the data tell us about how to reduce unnecessary repeat calls from the customer? → First Call Resolution Model
     ▪ How can we forecast policy completions when product and policy lifecycle stages are constantly shifting, and allocate staff and other resources appropriately? → Sales Forecast Model
     ▪ Can we leverage data and analytics to identify potential fraud in our processes? → Fraud Analytics Model
     ▪ How can we understand what customers expect from their insurer in terms of handling their policy changes? → Transaction Processing Model
  13. The Class: Motivated Students with Questions
     ▪ Armed with aptitude, interest, and opportunity, they came with their work, and with concerns
     ▪ Still had their “day job”
     ▪ “Teach me how to use this magic toolbox of machine learning algorithms!”
     ▪ “How exactly can data science deliver value in my actual work?”
     ▪ “I am not sure I am quite prepared…”
  14. Instructor’s View: Learning by Example
     ▪ First Call Resolution Model: call centers would love to tackle the problem of return calls that occur because the previous call did not lead to full resolution. There are many hurdles, especially since the voice part of a call is not cleanly tied to the IVR.
     ▪ Policy Submissions/Staffing Model: forecasting policy completions accurately is critical to staff planning across the new policy/sales organization, balancing expenses and ensuring adequate capacity to process applications.
     ▪ Fraud Analytics Model: certain activities are correlated with fraudulent activity but would not be readily tracked without advanced methods.
     ▪ Transaction Processing Model: the belief was that customers were calling before receiving an approved claim or policy because they wanted faster service. Instead, they were calling because they needed to review and verify their documents; the real aim was for customers to validate that they had followed the process.
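The First Call Resolution example above hinges on labeling each call as resolved or not before any model can be trained. A hypothetical sketch in Python using a simple proxy, "no repeat call from the same customer within seven days"; the window length, log format, and dates are assumptions for illustration, not the model the class actually built:

```python
from datetime import datetime, timedelta

# Hypothetical call log: (customer_id, call timestamp).
calls = [
    ("C1", datetime(2017, 6, 1)),
    ("C1", datetime(2017, 6, 4)),  # repeat within 7 days of the first call
    ("C2", datetime(2017, 6, 2)),
]

window = timedelta(days=7)
labels = []
for cust, ts in calls:
    # A call is "resolved" if no later call from the same customer
    # falls inside the follow-up window.
    repeat = any(c == cust and ts < t2 <= ts + window for c, t2 in calls)
    labels.append((cust, ts.date().isoformat(), not repeat))

print(labels)
```

The hard part the slide alludes to is upstream of this loop: stitching the voice leg of a call to its IVR session so the log above can be built at all.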
  15. Instructor’s View: Principles and Best Practices
     ▪ Business understanding was their most important preparation
     ▪ Basic analysis skills, e.g. Excel, should not be underrated
     ▪ The many tools, open source and COTS, are best learned with hands-on examples
     ▪ The beginning of data science is translating a business need into a value proposition
     ▪ The art of data science is translating the value proposition into a question for the data
     ▪ Preparing the data correctly is more important than choosing the “right” predictive modeling algorithm
     ▪ Removing noise from the data is essential, but difficult, work
     ▪ Testing is about finding the weaknesses, not the strengths, of your model, so make the tests HARD!
     ▪ Testing on many time periods exposes the weaknesses of complex models and delivers better insights
     ▪ Validating assumptions delivers much more insight than we ever think it will
     ▪ Simple models are (almost) always best
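The advice to test on many time periods can be sketched as a rolling-origin evaluation: score each period using only the data that preceded it, then inspect the spread of errors rather than a single average. A minimal sketch in Python with invented quarterly figures and a naive last-value baseline standing in for a real model:

```python
# Hypothetical quarterly series: (period, actual value).
series = [("2016-Q1", 100), ("2016-Q2", 110), ("2016-Q3", 108),
          ("2016-Q4", 130), ("2017-Q1", 125)]

errors = []
for i in range(2, len(series)):
    history = [value for _, value in series[:i]]
    forecast = history[-1]          # naive baseline: repeat the last value
    period, actual = series[i]
    errors.append((period, abs(actual - forecast)))

print(errors)
```

A complex model that beats this baseline on one hold-out quarter but loses on the others is exactly the weakness this kind of test is designed to expose.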
  16. Step 7: Keep Momentum on Projects
     ▪ First Call Resolution Model: adding variables
     ▪ Policy Submissions/Staffing Model: in production
     ▪ Fraud Analytics Model: in production
     ▪ Transaction Processing Model: funded project
     ▪ Audit Optimization Model: A/B testing
     ▪ Employee Engagement Model: on hold
     5-year NPV: $1MM+
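The 5-year NPV figure is the standard discounted-cash-flow sum. A sketch in Python with invented inputs (an upfront cost of $150K, $300K of annual benefit over five years, and a 10% discount rate; none of these numbers come from the deck):

```python
# Hypothetical cash flows: year-0 program cost, then 5 years of benefit.
rate = 0.10
cash_flows = [-150_000] + [300_000] * 5

# NPV: each cash flow discounted back to year 0.
npv = sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))
print(round(npv))
```

With these assumed inputs the result lands just under $1MM; the deck's "$1MM+" claim would correspond to slightly larger benefits or a lower discount rate.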
  17. Step 8: Observe and Internalize Lessons Learned / Key Take-aways
     ▪ Success indicators for staff: grit, creativity, reporting capability, curiosity, impatience, HIPOs (high-potential employees)
     ▪ Success indicators for leadership: self-funding, upholding them as thought leaders, having the fight up front, appealing to employee investment, real benefits, some funding from them
     ▪ Success indicators for the project: implementable; data that is hard but not too hard; within the DAP practice areas; nail down the “question” very early
     ▪ Success indicators for the program: involve Risk teams up front, split the funding, leverage outside support for “legitimacy,” let students struggle a bit, build camaraderie
  18. We operate as John Hancock in the United States and Manulife in other parts of the world. Thank you.
