5. Why to assess
- Identify gaps in the process
- Have a discussion
- Transfer knowledge
- Share practices
- Compare your team to other teams
- Improve – or inspect and adapt
6. Nokia Scrum Test
- Are iterations time-boxed to less than six weeks?
- Do your sprints start and end on planned dates?
- Is the software completely tested and working at the end of an iteration?
- Can the iteration start before the specification is complete?
7. Assessing through surveys
- Good for gathering data quickly
- No discussion, no feedback
- Require specific questions
- Do not discover hidden issues
- Good as a team practice
- A good survey from Mike Cohn is available at http://comparativeagility.com
8. Question types
Yes-no or specific (closed):
- Simpler
- Good as a pocket guide
- Faster
- Do not discover hidden issues
Areas to discover (open):
- Time-consuming
- Explore the process
- Deal with creativity and innovation
- People find answers by themselves
9. Our assessment method
- Face-to-face discussion
- Open questions
- Mark and Importance scoring
- 2.5–3 hours in quick mode
- A detailed assessment requires the assessor to attend all the ceremonies and hold additional meetings with the project team
13. Team structure
- Cross-functional
- Self-organizing
- Roles
- Acting as a team
- Collocation
14. Requirements Management
- Backlog written in a JiT/JE (just-in-time / just-enough) manner
- Backlog items written in a JiT/JE manner
- Backlog prioritization
- Product Owner responsiveness
- Scalability of Product Ownership
- Cross-team dependency tracking
- Single backlog for dependent teams
- Scrum of Scrums (SoS)
15. Release Planning
- Backlog sizing: the meeting, values, re-sizing
- Velocity usage
- Release burndown chart usage
- Release planning culture
- Projection vs. planning
- Long-term commitments vs. indication
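The "projection vs. planning" distinction above can be sketched with a small example: a release burndown projects completion from observed velocity rather than promising a date. The function name and the numbers below are illustrative, not from the talk.

```python
# Hedged sketch: projecting release completion from past velocity,
# as a release burndown chart would. All numbers are made up.
import math
import statistics

def sprints_remaining(backlog_points, past_velocities):
    """Project how many sprints remain, burning down at average velocity.

    Rounds up, since a partially used sprint still occupies a full sprint.
    """
    avg_velocity = statistics.mean(past_velocities)
    return math.ceil(backlog_points / avg_velocity)

# 120 story points left; velocities from the last four sprints.
print(sprints_remaining(120, [22, 25, 19, 24]))  # → 6
```

The result is an indication that shifts as velocity data accumulates, which is why the slides contrast it with long-term commitments.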
17. Engineering Practices
- TDD and unit testing
- Continuous integration
- Distributed parallel build systems
- CI lamps
- Peer review
- Pair programming
- Refactoring
- Coding standards
- Collective code ownership
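Of the practices above, TDD and unit testing can be shown in a minimal sketch: the test states the expected behavior, and only enough code is written to make it pass. The function and test names here are illustrative, not part of the assessment material.

```python
# Minimal test-first sketch (illustrative names): the test below is
# written first, then the function is implemented to satisfy it.
import unittest

def story_points_total(stories):
    """Sum story-point estimates, ignoring unestimated (None) items."""
    return sum(p for p in stories if p is not None)

class StoryPointsTest(unittest.TestCase):
    def test_ignores_unestimated_items(self):
        self.assertEqual(story_points_total([3, None, 5]), 8)

if __name__ == "__main__":
    # exit=False lets the script continue after the test run.
    unittest.main(exit=False)
```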
18. QA and Acceptance
- Definition of Done
- Acceptance criteria
- Sprint review
- Automated testing
- Manual testing
- Developers doing QA work (cultural aspect)
- Dealing with defects
19. Continuous Learning and Improvements
- Retrospectives
- Problem solving
- Root cause analysis
- Willingness to learn
- Knowledge sharing
- Process refinements
20. Cooperation and Collaboration in Distributed Settings
- Unified process across teams
- Daily Scrums: same time, same location
- Collaboration tools
- Impediments
- Scrum of Scrums
- Trips
- Phone, video, IM
- Distribution strategy
- Proxies
22. Assessment Report
- Comes with a detailed description of the current process
- Gaps and areas for improvement identified
- Recommendations broken down into categories: knowledge, immediate, short term, longer term
24. Why to measure
- To identify gaps in the process
- To improve processes
- To ensure predictability of the project
- To ensure agility
25. What to measure
Attributes:
- Team size over time
- Team members' contribution
- Sprint length
- Velocity variance
- Cycle time
- Technical debt
- Post-sprint defect arrival
- Root-cause-fixed defects
- Bug-fixing to implementation time
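Velocity variance from the list above is a predictability signal: a team with erratic velocity is hard to plan around. A minimal sketch, using the coefficient of variation (stdev relative to mean) as one possible way to express it; the numbers are invented.

```python
# Hedged sketch: velocity variance as a predictability indicator.
# Coefficient of variation is one possible measure; data is made up.
import statistics

def velocity_cv(velocities):
    """Sample stdev relative to mean, in percent. Lower is more predictable."""
    return 100 * statistics.stdev(velocities) / statistics.mean(velocities)

stable = [20, 22, 21, 23]   # hypothetical steady team
erratic = [10, 35, 15, 30]  # hypothetical unstable team
print(round(velocity_cv(stable)), round(velocity_cv(erratic)))
```

A low percentage suggests the team's projections can be trusted; a high one suggests the process, not the estimates, needs attention.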
26. What not to measure
- Individual performance
- Business value as productivity
- Source lines of code
- Velocity to compare teams
28. Cycle time
- Lead time (in-sprint cycle time)
- Measured in days or as % of sprint length
- Computed as completion date minus start date
- Shorter is better
- An indicator of productivity and predictability
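The computation described above (completion minus start, expressed as days or as a percentage of sprint length) is simple enough to sketch directly; the dates and the 10-day sprint length are illustrative assumptions.

```python
# Sketch of in-sprint cycle time: completion date minus start date,
# also expressed as % of sprint length. Dates are illustrative.
from datetime import date

def cycle_time_days(started, completed):
    """Cycle time in calendar days."""
    return (completed - started).days

def cycle_time_pct(started, completed, sprint_length_days=10):
    """Cycle time as a percentage of the sprint length (assumed 10 days)."""
    return 100 * cycle_time_days(started, completed) / sprint_length_days

started, completed = date(2012, 3, 5), date(2012, 3, 9)
print(cycle_time_days(started, completed))  # → 4
print(cycle_time_pct(started, completed))   # → 40.0
```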
29. Technical Debt
- A metaphor developed by Ward Cunningham
- A complex metric that is hard to measure; a high/low scale makes the most sense
- Quick-and-dirty approaches produce more debt
- Should be paid back
- Has a significant impact on velocity
The ability to react quickly to impediments, speed, and so on. Conformance to Agile principles, meaning flexibility and the ability to respond to impediments. Scrum is taken as the core.
In such a conversation, people often come up with the answers themselves.
Video link. Lunch together.
PO: team members should think ahead about the next day or the next few days.
The enhanced burndown chart shocked the customer.
Both pre-assignment and using a pool of tasks make sense.