
Developer Night - Opticon18

Lightning talks on best practices for product and engineering teams to experiment everywhere in their applications.

First presented at Optimizely's user conference, Opticon18, on September 12th, 2018.

Published in: Software

Developer Night - Opticon18

  1. 1. Welcome to Developer Night! • We'll be doing Q&A at the end; you can ask your questions at any time by submitting them at sli.do/U017 • Take our Developer Survey by the food and enter a drawing for an Arduino Kit • Make sure to grab your Optimizely Developer Sticker
  2. 2. Developer Night Sponsored by
  3. 3. Agenda 1. Integrating Analytics the Right Way 2. Going Deeper with Heap and Optimizely 3. Segmenting results with audiences 4. Experimenting in a DevOps World 5. Server-side testing in a Serverless world 6. Optimizing the Performance of Client-Side Experimentation 7. Managing Your Full Stack Experiments From Within Your Own Repository 8. How Optimizely uses Full Stack 9. Q&A
  4. 4. Integrating Analytics the Right Way Rocky McGredy Solutions Engineer, Optimizely Ali Baker Technical Support Engineer, Optimizely
  5. 5. 1. Why Integrate? 2. How Integrations Work 3. Implementation Challenges 4. Solutions/Best Practices Agenda
  6. 6. Why Integrate? • Our results are best for finding winning variations • Analytics platforms contain historical user data and reporting • Knowing when a user is in a variation is useful
  7. 7. How Integrations Work
  8. 8. Enable your Integration
  9. 9. Enable your Integration
  10. 10. Integration Methods
  11. 11. Decisions! Decisions!
  12. 12. Implementation Challenges
  13. 13. Reporting Differences
  14. 14. Implementation Considerations Timing Differences Tag Managers Tracking Variables
  15. 15. Scoping Differences User – Experimentation Session – Personalization User Scope Session Scope Hit Scope
  16. 16. Variables in Reporting Event Tracking Audience Segments Report Filtering
  17. 17. Solutions & Best Practices
  18. 18. Troubleshooting Integrations Adjust Timing Use Debug Tools Add Debug Events
  19. 19. Validating Data Link User IDs Run an A/A Test Export Raw Data
  20. 20. Support for integrations Support Documentation
  21. 21. Integrating With Full Stack • Similar to custom analytics • Notification listeners • Use first party data
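(A minimal sketch of the notification-listener approach from the slide above, using the Python Full Stack SDK. The callback body is an illustration, not the talk's actual code, and listener names and callback signatures vary between SDK versions.)

    import logging
    from optimizely import optimizely
    from optimizely.helpers import enums

    datafile = open('datafile.json').read()  # your project's datafile JSON
    optimizely_client = optimizely.Optimizely(datafile)

    def on_activate(experiment, user_id, attributes, variation, event):
        # Replace this log call with your analytics SDK's tracking call so the
        # decision (experiment key + variation key) lands next to your
        # first-party data.
        logging.info('user %s saw %s / %s', user_id, experiment.key, variation.key)

    # Invoke the callback whenever the SDK buckets a user into a variation.
    optimizely_client.notification_center.add_notification_listener(
        enums.NotificationTypes.ACTIVATE, on_activate)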
  22. 22. Key Learnings • Integrations send experiment decision data • Consider independent factors: timing, tag managers, reporting • Run a test to validate data • Use your debugging tools • We’re here to help!
  23. 23. Diving Deeper with Heap and Optimizely Taylor Udell Lead Solutions Architect, Heap
  24. 24. Heap is a behavioral analytics platform that automatically captures every user interaction and allows you to define and analyze it after the fact. What is Heap?
  25. 25. How many of you have seen results like this?
  26. 26. How do you interpret these results?
  27. 27. Why did this variation win or lose?
  28. 28. How are all my metrics connected?
  29. 29. Optimizing Your Customers' Experience: Driving an Iterative Culture (Hypothesize → Experiment → Analyze → Iterate)
  30. 30. Going Deeper than your Goal Metrics Heap helps you understand the why behind your results without slowing your team down
  31. 31. Key Benefits of using Heap + Optimizely in your stack 1. Develop More Hypotheses 2. Deploy tests and changes without delaying for tracking code 3. Go deeper than goal metrics
  32. 32. Segmenting results with Optimizely audiences Michal Fasanek Technical Support Engineer, Optimizely
  33. 33. What’s this presentation about?
  34. 34. Custom Analytics Integrations • For building your own analytics integrations on top of Optimizely X Web • Great for sending Optimizely data to 3rd party analytics platforms
  35. 35. Custom Analytics Integrations Extensions • Build reusable ‘plugins’ that can be added to your experiments • Create visitor segments based on the pre-defined Optimizely audiences
  36. 36. • Create custom attributes that correspond with your audiences • Add them to an experiment • Create audiences matching the segments you care about How does it work? • Build the custom analytics extension and add it to the experiment
  37. 37. The code sample
  38. 38. Why is this awesome? • Doubles the value of existing audiences • Easy to scale • No additional costs
  39. 39. Resources • Github repository containing the code sample used in this presentation: https://github.com/michal- optimizely/audience_segment_builder • Documentation for Custom analytics extensions • Documentation for Custom Attributes • How to: Segmenting experiment results
  40. 40. Experimenting in a DevOps World Joy Scharmen Director of DevOps, Optimizely "All life DevOps is an experiment. The more experiments you make the better." - Ralph Waldo Emerson, sort of
  41. 41. DevOps is built on a powerful foundation of experiments.
  42. 42. But first, what is “Infrastructure”?
  43. 43. Infrastructure as Code: Define your hardware like you write your software
  44. 44. Experiment with your Infrastructure
  45. 45. Build your code & Deploy it to your users
  46. 46. Continuous Integration & Continuous Deployment: Continuous Experimentation
  47. 47. Canary Deployments: Testing Experimenting in Production
  48. 48. Feature Flags: Turning it off and on again
  49. 49. Blue/Green Deployment: Experiment down to your server layer
  50. 50. Benefits of DevOps Experimentation!
  51. 51. Server-side testing in a Serverless world Andreas Bloomquist Sr. Solutions Engineer
  52. 52. Serverless Computing
  53. 53. • Easy: Run your backend code without concern for provisioning, managing, or scaling your own server architecture • Cheap*: Ephemeral resources - only pay for your event driven code execution time • Flexible: Manage functions as microservices What is serverless (or FaaS) anyway?
  54. 54. How does it work? Use Case: Image processing. Event: image uploaded to file storage → Function: container provisioned, code executes → Output: hilarious meme
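(A minimal sketch of that event → function → output flow, assuming AWS Lambda with an S3 trigger; the output bucket name and the caption helper are hypothetical placeholders for the real image-processing step.)

    import boto3

    s3 = boto3.client('s3')

    def add_caption(image_bytes, text):
        # Placeholder for the real image processing that produces the meme.
        return image_bytes

    def handler(event, context):
        # Event: an image was uploaded to file storage, which triggers the function.
        record = event['Records'][0]['s3']
        bucket, key = record['bucket']['name'], record['object']['key']

        # Function: a container is provisioned by the cloud provider and the code runs.
        image = s3.get_object(Bucket=bucket, Key=key)['Body'].read()
        meme = add_caption(image, 'hilarious caption')

        # Output: the processed image is written back to storage.
        s3.put_object(Bucket='meme-output-bucket', Key=key, Body=meme)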
  55. 55. Serverless: Not just for hobbyists Encode media files from S3 Streamline real-time processing of interdependent data sets Lowered costs by ~66% with serverless vending machine loyalty service
  56. 56. let’s experiment!
  57. 57. Serverless + Full Stack Stateless + Stateless Benefits of full stack • Stateless - no network requests for decisioning • Remote configuration of variables • Test anything in code! Drawbacks when using FaaS • Stateless - each run is basically a new instance • No easy way to cache datafile/client object
  58. 58. How does it work w/ Optimizely? Event → Function w/ Optimizely SDK installed (initialize SDK; server provisioned and scaled by cloud provider) → Output. LATENCY
  59. 59. Overcoming latency in FaaS
  60. 60. Option 1
  61. 61. Option 2
  62. 62. stay in your network
  63. 63. Option 3
  64. 64. do your homework
  65. 65. fool me once...
  66. 66. Stateless !== stateless
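(A minimal sketch of what "Stateless !== stateless" buys you: module-level state survives on a warm FaaS container, so the datafile fetch and client construction can be skipped on most invocations. The datafile URL and the 'daily_deal' feature key are hypothetical.)

    import urllib.request
    from optimizely import optimizely

    DATAFILE_URL = 'https://cdn.optimizely.com/json/<project_id>.json'  # hypothetical URL
    _client = None  # module-level: reused across invocations on a warm container

    def get_client():
        global _client
        if _client is None:  # only on a cold start
            datafile = urllib.request.urlopen(DATAFILE_URL).read().decode('utf-8')
            _client = optimizely.Optimizely(datafile)
        return _client

    def handler(event, context):
        client = get_client()
        user_id = event.get('user_id', 'anonymous')
        enabled = client.is_feature_enabled('daily_deal', user_id)  # hypothetical key
        return {'daily_deal': enabled}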
  67. 67. Put it into practice w/ Alexa: My Puppy Store Daily Deal Alexa Skill ("Alexa, ask Puppy Store for a daily deal!"). Upload code to Lambda → Alexa event executes → check if a cached datafile exists, else get it from the CDN / API (JSON Datafile via Akamai CDN or REST API) → send back the variation response ("Save on squeaky toys!") → events sent back to Optimizely → results in the Dashboard
  68. 68. Time for a demo...
  69. 69. Optimizing the Performance of Client-Side Experimentation Spencer Wilson Software Engineer, Optimizely
  70. 70. Performance testing is a kind of experimentation You have... hypotheses “Increasing image compression will increase session length” independent variables Image compression magnitude dependent variables Session length results interpretation Should we roll out any variation to 100% of traffic?
  71. 71. Instrumenting a page with Optimizely Web
  72. 72. #1: Placement <html> <head> <link rel="stylesheet" href="...url"/> ... <!-- more stylesheets --> <script src="...optly1"></script> <script src="...url"></script> ... <!-- more scripts --> </head> <body> <header>...</header> <article>...</article> <footer>...</footer> </body> </html>
  73. 73. #1: Placement <html> <head> <link rel="stylesheet" href="...url"/> ... <!-- more stylesheets --> <script src="...url"></script> ... <!-- more scripts --> <script src="...optly2"></script> </head> <body> <header>...</header> <article>...</article> <footer>...</footer> </body> </html>
  74. 74. #1: Placement <html> <head> <link rel="stylesheet" href="...url"/> ... <!-- more stylesheets --> <script src="...url"></script> ... <!-- more scripts --> </head> <body> <header>...</header> <article>...</article> <footer>...</footer> <script src="...optly3"></script> </body> </html>
  75. 75. #2: Attributes Some possibilities: ● <script> (synchronous) ● <script async> ● <script defer> Image credit: Daniel Imms, from https://www.growingwiththeweb.com/2014/02/async-vs-defer-attributes.html
  76. 76. #3: Resource Hints Some possibilities: ● None ● Link: <...url>; rel="dns-prefetch" ● Link: <...url>; rel="preconnect" ● <link rel="preload" as="script" href="...url"> W3C specs: ● https://www.w3.org/TR/resource-hints/ ● https://www.w3.org/TR/preload/
  77. 77. Metrics ● DO: Endeavor to measure impact on users ○ first [contentful] paint, via browser ○ first meaningful paint, via you ○ key business metrics ● DON’T: ○ Use any single metric, like the document’s load event
  78. 78. L I V E D E M O
  79. 79. Recommended initial configuration. Test what works best for your product! Placement ● First script element in the document ● In the <head> Attributes ● None (synchronous) Resource Hints ● If script element is in HTML: none ● Otherwise: <link rel="preload" as="script" href="...url"> ● If using cross-origin targeting, preload the iframe document
  80. 80. More #PerfThings Performance Workshop Tomorrow, Brera 2, 12pm Performance Certification $100 → Free w/code OPTICON18 Performance Whitepaper /resources/optimizing-performance/
  81. 81. In summary 1. Experimentation is an effective tool for informing perf-impacting decisions. Use it! 2. Focus on high-level business metrics. Low-level metrics are supplementary. 3. Attend the perf workshop tomorrow at noon. It will teach tips for going fast.
  82. 82. References ● Steve Souders, “I <3 image bytes”: https://www.stevesouders.com/blog/2013/04/26/i/ ● Ilya Grigorik, “Chrome’s preloader delivers a ~20% speed improvement!”: https://plus.google.com/+IlyaGrigorik/posts/8AwRUE7wqAE ● Tony Gentilcore, “The WebKit PreloadScanner”: http://gent.ilcore.com/2011/01/webkit- preloadscanner.html ● Philip Walton, “User-centric Performance Metrics”: https://developers.google.com/web/fundamentals/performance/user-centric-performance- metrics ● Addy Osmani, “Preload, Prefetch and Priorities in Chrome”: https://medium.com/reloading/preload-prefetch-and-priorities-in-chrome-776165961bbf
  83. 83. Managing Your Full Stack Experiments From Within Your Own Repository Travis Beck Software Engineer, Optimizely
  84. 84. Optimizely Full Stack is easy to use
  85. 85. But do developers want to switch in and out of yet another web app?
  86. 86. Can we make it easier?
  87. 87. A New Feature: turbo_mode We want to test it out on low risk users If it works, then roll it out to everyone
  88. 88. Testing the New Feature attributes = {'plan': 'basic', 'language': 'en'} enabled = optimizely_client.is_feature_enabled('turbo_mode', user_id, attributes) if enabled: # feature implementation ... later ... optimizely_client.track('task_complete', user_id, attributes) tags = {'value': 100} optimizely_client.track('completion_time', user_id, attributes, tags)
  89. 89. What do we have to create in Optimizely? attributes = {'plan': 'basic', 'language': 'en'} enabled = optimizely_client.is_feature_enabled('turbo_mode', user_id, attributes) if enabled: # feature implementation ... later ... optimizely_client.track('task_complete', user_id, attributes) tags = {'value': 100} optimizely_client.track('completion_time', user_id, attributes, tags) Attributes
  90. 90. What do we have to create in Optimizely? attributes = {'plan': 'basic', 'language': 'en'} enabled = optimizely_client.is_feature_enabled('turbo_mode', user_id, attributes) if enabled: # feature implementation ... later ... optimizely_client.track('task_complete', user_id, attributes) tags = {'value': 100} optimizely_client.track('completion_time', user_id, attributes, tags) Attributes Feature
  91. 91. What do we have to create in Optimizely? attributes = {'plan': 'basic', 'language': 'en'} enabled = optimizely_client.is_feature_enabled('turbo_mode', user_id, attributes) if enabled: # feature implementation ... later ... optimizely_client.track('task_complete', user_id, attributes) tags = {'value': 100} optimizely_client.track('completion_time', user_id, attributes, tags) Attributes Feature Events
  92. 92. What do we have to create in Optimizely? attributes = {'plan': 'basic', 'language': 'en'} enabled = optimizely_client.is_feature_enabled('turbo_mode', user_id, attributes) if enabled: # feature implementation ... later ... optimizely_client.track('task_complete', user_id, attributes) tags = {'value': 100} optimizely_client.track('completion_time', user_id, attributes, tags) Attributes Feature Events + Experiment (since we're testing the Feature)
  93. 93. What do we have to create in Optimizely? attributes = {'plan': 'basic', 'language': 'en'} enabled = optimizely_client.is_feature_enabled('turbo_mode', user_id, attributes) if enabled: # feature implementation ... later ... optimizely_client.track('task_complete', user_id, attributes) tags = {'value': 100} optimizely_client.track('completion_time', user_id, attributes, tags) Attributes Feature Events + Experiment (since we're testing the Feature) + Audience (for targeting)
  94. 94. Questions you may be asking Do I have to have constant access to Optimizely to do basic application development? Can we keep this metadata that is fundamental to our code running properly closer to the code itself?
  95. 95. optimizely-cli A command line tool for managing your Optimizely data Every serious developer-focused service needs a command-line interface Built entirely on top of the Optimizely v2 REST API Works well for Full Stack. May work for Web.
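(For a sense of what "built entirely on top of the Optimizely v2 REST API" looks like, here is a short sketch of one such call using the requests library; it assumes a personal access token in the OPTIMIZELY_API_TOKEN environment variable.)

    import os
    import requests

    API_BASE = 'https://api.optimizely.com/v2'
    HEADERS = {'Authorization': 'Bearer {}'.format(os.environ['OPTIMIZELY_API_TOKEN'])}

    def list_experiments(project_id):
        # List the experiments in a project via the v2 REST API.
        resp = requests.get('{}/experiments'.format(API_BASE),
                            headers=HEADERS,
                            params={'project_id': project_id})
        resp.raise_for_status()
        return resp.json()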
  96. 96. setup $ opti init OAuths with your Optimizely Account once Links your code with a specific Optimizely project
  97. 97. Pulling Experiment Data $ opti pull Pulls all Optimizely data and writes it to an optimizely/ directory as yaml files
  98. 98. Pushing Back Experiment Data $ opti push Detects changes to your experiments and pushes back modified experiments to Optimizely
  99. 99. Advantages Scriptable - Automate changes to Optimizely in scripts Code Review - Make important modifications as a Pull Request in your own repo Historical Record - Use a webhook or update on a schedule to track changes over time
  100. 100. Try it out Install: pip install optimizely-cli Repository: https://github.com/optimizely/optimizely-cli Take a look at the code for good v2 REST API examples
  101. 101. Journey Up Mt. Experimentation Ali Rizvi, Software Engineer Mike Ng, Software Engineer
  102. 102. [Architecture diagram] Optimizely Application ↔ Optimizely Backend (experiment management, results); Java SDK, Python SDK, and JS SDK send events and receive results. 3 projects, 6 datafiles, 30+ experiments, 400+ features; 2 projects, 2 datafiles, ~5 experiments, ~2 features
  103. 103. Experimentation Maturity Curve (Optimizely): business value vs. velocity/volume. LEVEL 1 Executional Start, LEVEL 2 Foundational Growth, LEVEL 3 Cross-functional Advancement, LEVEL 4 Operational Excellence, LEVEL 5 Culture of Experimentation
  104. 104. How did we get here?
  105. 105. Level 1 Managing the datafile
  106. 106. Datafile caching in memcache
  107. 107. Datafile retrieval from memcache
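(A minimal sketch of the memcache caching and retrieval the two slides above describe, assuming the App Engine standard environment; the datafile URL, cache key, and five-minute TTL are placeholders.)

    from google.appengine.api import memcache, urlfetch

    DATAFILE_URL = 'https://cdn.optimizely.com/json/<project_id>.json'  # hypothetical URL
    DATAFILE_CACHE_KEY = 'optimizely_datafile'

    def get_datafile():
        # Try memcache first; on a miss, fetch from the CDN and repopulate the cache.
        datafile = memcache.get(DATAFILE_CACHE_KEY)
        if datafile is None:
            datafile = urlfetch.fetch(DATAFILE_URL).content
            memcache.set(DATAFILE_CACHE_KEY, datafile, time=300)
        return datafile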
  108. 108. Level 1 Event dispatching
  109. 109. Async event dispatching using Task Queues
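(A minimal sketch of async event dispatching, assuming the Python SDK's pluggable event_dispatcher hook and an App Engine push queue; the queue name and worker URL are hypothetical, and a handler at that URL would still need to POST the payload to event.url.)

    import json
    from google.appengine.api import taskqueue

    class TaskQueueEventDispatcher(object):
        def dispatch_event(self, event):
            # Enqueue the Optimizely event instead of sending it inline, so the
            # request thread returns immediately; a worker replays it later.
            taskqueue.add(
                queue_name='optimizely-events',     # hypothetical queue
                url='/tasks/dispatch_optimizely',   # hypothetical worker endpoint
                payload=json.dumps({
                    'url': event.url,
                    'http_verb': event.http_verb,
                    'params': event.params,
                    'headers': event.headers,
                }))

    # Wire it in when constructing the client:
    # optimizely_client = optimizely.Optimizely(datafile, event_dispatcher=TaskQueueEventDispatcher())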
  110. 110. Level 2 Datafile syncing
  111. 111. Consolidating the datafile between backend and frontend: App Engine App (Python) → Frontend (Browser)
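(One possible shape for that backend-to-frontend sync, sketched with webapp2 on App Engine: the Python app renders the datafile it already has into the page so the browser-side SDK can start without a second network fetch. The window.optimizelyDatafile global and the get_datafile helper are assumptions.)

    import webapp2

    class HomePage(webapp2.RequestHandler):
        def get(self):
            datafile = get_datafile()  # e.g. the memcache helper sketched earlier
            # Inline the datafile so the frontend SDK can be instantiated immediately.
            self.response.write(
                '<script>window.optimizelyDatafile = %s;</script>' % datafile)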
  112. 112. Level 3 Convenience methods
  113. 113. Cache user info for API calls
  114. 114. isFeatureEnabled call in product...
  115. 115. Method proxying
  116. 116. Continued...
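(A minimal sketch of the convenience-method idea behind these slides: cache the user's info once and proxy SDK calls through a small wrapper so product code never repeats user_id and attributes. The class and method names here are hypothetical, not the talk's actual wrapper.)

    class ExperimentClient(object):
        """Caches user info and proxies calls to the Optimizely SDK client."""

        def __init__(self, optimizely_client, user_id, attributes=None):
            self._client = optimizely_client
            self._user_id = user_id
            self._attributes = attributes or {}

        def is_feature_enabled(self, feature_key):
            return self._client.is_feature_enabled(
                feature_key, self._user_id, self._attributes)

        def track(self, event_key, tags=None):
            return self._client.track(
                event_key, self._user_id, self._attributes, tags)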
  117. 117. Level 3 Consolidating datafiles
  118. 118. Level 3 Implement proper logging
  119. 119. Why isn't my feature enabled?
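(A minimal sketch of wiring up SDK logging so "why isn't my feature enabled?" can be answered from the logs; SimpleLogger is the Python SDK's built-in logger, but verify the import path and constructor for your SDK version.)

    import logging
    from optimizely import optimizely
    from optimizely import logger as optimizely_logging

    datafile = open('datafile.json').read()  # your project's datafile JSON
    optimizely_client = optimizely.Optimizely(
        datafile,
        logger=optimizely_logging.SimpleLogger(min_level=logging.DEBUG))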
  120. 120. Level 4 Making QA Easy - Demo
  121. 121. Reaching the next stage • Consolidate projects • Use environments across all projects • Experiments / Feature Flags cleanup • Increase automated test coverage of all experiment paths
  122. 122. Takeaways Performance - Pass the datafile from the backend to the frontend - Cache the datafile in memcache - can also cache the Optimizely client instance if appropriate Quality - Make it easy for users to QA your features and tests - Write automated tests for the different forks/paths created for experiments Productivity - Make it easy for developers to run experiments with wrapper/convenience methods - Always include a logger with the implementation
  123. 123. Q&A
  124. 124. Up Next... • Digital Lab of Pop Art Party Tonight! See you at the Marquee Nightclub at 8:30pm. • Breakout sessions from Vivid Seats, FullStory, Tealium and Gap start here tomorrow at 10am
  125. 125. Thank You!
