5. How do we do it
• Development rules + code review
• Feature flags + progressive rollouts
• Tests: FE UT, BE UT, Integration tests, UI tests, Smoke tests
• A/B tests
• KPI gate: performance regression tests
• Zero-downtime blue-green deployments
• Analytics + monitoring
• Continuous deployment
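The blue-green item above can be sketched as a traffic switch that flips to the freshly deployed environment only after it passes a health check; all class and method names here are illustrative, not from the deck:

```python
# Minimal sketch of a zero-downtime blue-green switch (illustrative names).
class LoadBalancer:
    def __init__(self):
        self.live = "blue"    # environment currently serving traffic
        self.idle = "green"   # environment receiving the new release

    def deploy(self, version, health_check):
        # Deploy to the idle environment, then switch traffic only if healthy.
        if not health_check(self.idle, version):
            return False      # keep serving from the old environment
        # The swap is the "zero-downtime" moment: users never see a gap.
        self.live, self.idle = self.idle, self.live
        return True

lb = LoadBalancer()
lb.deploy("v2.0", lambda env, version: True)  # pretend the check passes
print(lb.live)  # -> green
```

If the health check fails, traffic keeps flowing to the old environment, which is also what makes rollback trivial.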
6. Development rules + code review
• New features ship only behind a feature flag
• Every new feature has a rollout schedule starting at 1%
• A feature cannot be deployed without tests
• Code style + pre-commit git hooks
• Bug fixes without unit tests do not pass code review
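A rollout schedule that ramps from 1% upward is typically implemented with deterministic user bucketing, so raising the percentage only ever adds users and never flip-flops anyone. A minimal sketch, assuming a stable hash of user id plus flag name (function and flag names are hypothetical):

```python
import hashlib

def in_rollout(user_id: str, feature: str, percent: float) -> bool:
    """Deterministically place a user in a bucket in [0, 100).
    The same user always lands in the same bucket for a given feature,
    so ramping percent from 1 to 100 only adds users."""
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) % 10000 / 100.0  # 0.00 .. 99.99
    return bucket < percent

# Start the schedule at 1%, then ramp: 1 -> 5 -> 25 -> 100.
enabled = in_rollout("user-42", "new-checkout", 1.0)
```

Because the bucket depends on the feature name too, different flags ramp over different user subsets.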
7. Tests + performance regression
• FE UT: 3552, 87% code coverage
• BE UT: 690
• Integration tests: 195
• UI tests: 495 Scenarios
• Smoke tests: a subset of UI tests running against the real API on dev, staging, and pr environments
• Web Page Performance: Load time, fully loaded, start render, speed index
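The KPI gate mentioned on the previous slide can compare the measured page metrics above against a stored baseline and block the release on regression. A sketch under assumed metric names, baseline values, and a 10% tolerance (none of these numbers are from the deck):

```python
# Hypothetical KPI gate: fail the build if a page metric regresses past
# the allowed tolerance versus the stored baseline (values illustrative).
BASELINE = {"load_time_ms": 1200, "fully_loaded_ms": 2500,
            "start_render_ms": 800, "speed_index": 1500}
TOLERANCE = 0.10  # allow up to 10% regression before blocking the release

def kpi_gate(measured: dict) -> list:
    """Return the names of metrics that regressed beyond the tolerance."""
    return [name for name, base in BASELINE.items()
            if measured.get(name, base) > base * (1 + TOLERANCE)]

failures = kpi_gate({"load_time_ms": 1400, "speed_index": 1480})
print(failures)  # -> ['load_time_ms']  (1400 > 1200 * 1.10)
```

An empty failure list lets the deployment proceed; anything else fails the pipeline step.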
12. Analytics + monitoring
• Keep an eye on metrics after every deployment
• Real-time metrics for: system, application, business
• Quality alerts
[Chart: quality metric over time, annotated "Bug introduced" and "Bug fixed"]
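A quality alert of the kind above can be as simple as a sliding-window error rate compared against a threshold, so a bug introduced by a deployment shows up within a window of requests. The window size and threshold here are illustrative assumptions:

```python
from collections import deque

class ErrorRateMonitor:
    """Alert when the error rate over the last `window` requests
    exceeds `threshold` (both values are illustrative)."""
    def __init__(self, window=100, threshold=0.05):
        self.events = deque(maxlen=window)  # True = error, False = success
        self.threshold = threshold

    def record(self, is_error: bool):
        self.events.append(is_error)

    def alert(self) -> bool:
        if not self.events:
            return False
        return sum(self.events) / len(self.events) > self.threshold
```

Wiring the alert to the deployment timeline is what turns the "bug introduced / bug fixed" markers on the chart into actionable signals.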
13. Conclusion
• Fosters a culture of quality: the mindset of always keeping the product in a
releasable state
• Low risk releases: small change -> small risk; zero-downtime deployments
that are undetectable to users
• Faster time to market: Value delivered immediately to the user when it's
ready; Quick feedback on product changes
• Higher quality: automated tools discover regressions; easy to roll back;
easy to identify bugs
• Better products: A/B testing enables us to take a hypothesis-driven approach
to product development
• Happier teams