4. Feedback & Iterative Development
Feedback
• All steps for shipment
• Empirical management
• Learning by review and test
• Quick
• Many times

Inspect, adapt and improve
• Learning
• Effective goal reaching
5. Why Definition of Done?
Inspect, adapt and improve
• All steps of the software life cycle (development to deployment) get feedback
• Product feedback: test performance, demo, etc.
• Process feedback: coding quality, peer review, deployment, etc.
6. Why Definition of Done?
Almost done is not done at all
• PO and Dev in discussion:
• PO: Is it done?
• Dev: Yes, almost.
• PO: Can we go to production?
• Dev: No, not yet.
• PO: Why not?
• Dev: Some bugs, some tests, not sure it works on prod, the webservice is not reachable in the business domain, the manual has to be written, etc.
• PO: When can we go to production?
• Dev: I am not sure...
7. Why Definition of Done?
Better release planning
• No need for hardening iterations (iterations where bugs are solved, tests are done, and deployment is prepared)
• Estimate and plan on iterations
17. Why Definition of Done?
Minimize the delay of risk
• Undone work reveals itself in production!
18. Why Definition of Done?
Defines the agility/quality/maturity of the team
• A team should be able to complete a (new) feature in one iteration and release it immediately to production, with all steps defined in the DoD necessary to guarantee the best quality.
19. Definition of Done
Two definitions of done; gaps come in two flavors:
- Competence -> Automation ("can't")
- Maturity -> ("won't")

In use / Current
• Transparent for the Product Owner
• Represents the capability of the team
• Shows what to improve

Optimal / Ideal
• Where you want to go
20. Definition of Done
Two definitions of done

Ideal
• Code checked in
• Code build green on build server
• Coding quality check green(er) (Sonar)
• Unit tests on build server OK
• Unit tests on build server OK (code coverage 80%)
• Peer reviewed
• (Automated) deployed on CI server
• One click on demo server
• (All deployment includes automated database deployment on all mentioned servers)
• (Automated) integration tests run on CI
• (Automated) acceptance tests run on CI
• (Automated) performance tests run on CI
• (Automated) deployed on ST server
• (Automated) deployed on UAT server
• Exploratory testing done on ST server
• Integration (chain) testing done on UAT server
• Demoed and approved by Product Owner
• All sprint-related bugs solved
• Deployment guide up to date
• Interface documentation up to date
• Use cases up to date
• RMS up to date
• Release notes up to date
• User manual up to date
• SRS updated
• Iteration test report up to date
• Technical design updated (when absolutely necessary)
• Product backlog up to date

Current
• Code checked in
• Code build green on build server
• Coding quality check green(er) (Sonar)
• Unit tests on build server OK
• Peer reviewed
• (Automated) deployed on CI server
• (All deployment includes automated database deployment on all mentioned servers)
• (Automated) integration tests run on CI
• (Automated) acceptance tests run on CI
• (Automated) deployed on ST server
• Exploratory testing done on ST server
• Demoed and approved by Product Owner
• All sprint-related bugs solved
• Deployment guide up to date
• Interface documentation up to date
• Use cases up to date
• RMS up to date
• Product backlog up to date

The gap between Current and Ideal is the delay of risk: it manifests in production.
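"Almost done is not done at all" can be made concrete by treating the Definition of Done as an executable checklist: an item is done only when every criterion holds, never "mostly". This is a minimal sketch; the criteria names echo the DoD lists above, and the `is_done` function is an illustrative stand-in, not an integration with any real build server or tool.

```python
# Sketch: the Definition of Done as an all-or-nothing checklist.
# Criteria names mirror the "Current" DoD above; checking them off
# would come from real CI/review tooling in practice (assumption).

DONE_CRITERIA = [
    "code checked in",
    "build green on build server",
    "coding quality check green (Sonar)",
    "unit tests OK on build server",
    "peer reviewed",
    "deployed on CI server",
]

def is_done(completed: set) -> bool:
    """Almost done is not done: every single criterion must be met."""
    return all(criterion in completed for criterion in DONE_CRITERIA)

# A partially completed item is simply not done.
print(is_done({"code checked in", "peer reviewed"}))  # False
print(is_done(set(DONE_CRITERIA)))                    # True
```

The point of the design is that there is no partial credit: the undone remainder is exactly the "delay of risk" that would otherwise surface in production.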
22. Definition of Done
Conclusion
Definition of Done helps you with:
• Improving team quality/agility/maturity
• Transparency to stakeholders
• Making burndown charts meaningful
• Better release planning
• Minimizing the delay of risk
23. Product Backlog
• List of whatever needs to be done in order to successfully deliver a working software system
• Features, functionality, technology, issues, emergent items
• Prioritized and estimated
• Product Owner is responsible for priority
• More detail on higher-priority items
• Anyone can contribute
• Posted visibly and maintained
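The backlog properties above (prioritized, estimated, more detail at the top) can be sketched as a simple data structure. All names and field choices here are illustrative assumptions, not any real tool's schema.

```python
from dataclasses import dataclass
from typing import Optional

# Sketch of a prioritized, estimated Product Backlog.
# Assumes integer priorities (1 = highest) and story-point estimates;
# coarse, low-priority items may not have an estimate yet.

@dataclass
class BacklogItem:
    title: str
    priority: int               # set by the Product Owner
    estimate: Optional[int]     # story points; None for vague items
    details: str = ""           # higher-priority items carry more detail

def ordered(backlog):
    """Posted visibly: always show the backlog in priority order."""
    return sorted(backlog, key=lambda item: item.priority)

backlog = [
    BacklogItem("Reporting module", priority=5, estimate=None),
    BacklogItem("Fix login bug", priority=1, estimate=3,
                details="Steps to reproduce, acceptance criteria..."),
]
print([item.title for item in ordered(backlog)])  # highest priority first
```

Keeping estimate optional reflects the refinement flow described next: items gain estimates and detail as they rise in priority.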
24. Product Backlog Refinement
• Time-boxed meeting of about 1.5 hours every week with the whole team
• The Product Owner should attend
• Split, clarify and estimate work items, user stories, RFCs
• Share new insights with the team
• Re-estimate when necessary
• Priority is determined by the Product Owner
• The goal is to have a "ready" Product Backlog for the next planning
• Prevents discussions in the planning session
• Visualize release planning
• (Also known as Backlog Refactoring, Backlog Maintenance, Backlog Grooming)
25. Product Backlog Refinement
[Diagram: the backlog as a gradient from clear-fine items at the top to vague-coarse items at the bottom]
• Clear-fine items: detailed and small enough to be picked up by development for implementation
• Vague-coarse items: need more details, more discussion, more acceptance criteria, need to be made smaller, etc.
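The clear-fine versus vague-coarse split can be expressed as a "ready" predicate that refinement drives items toward. This is a hypothetical sketch; the size threshold of 8 story points is an illustrative assumption, not a Scrum rule.

```python
# Sketch: a hypothetical "ready" check for the clear-fine end of the
# backlog. The 8-point threshold is an assumption; larger items
# should be split during refinement before planning picks them up.

MAX_READY_ESTIMATE = 8

def is_ready(has_acceptance_criteria, estimate):
    """Detailed and small enough to be picked up for implementation."""
    return (
        has_acceptance_criteria
        and estimate is not None
        and estimate <= MAX_READY_ESTIMATE
    )

print(is_ready(True, 5))       # True:  clear-fine item
print(is_ready(False, None))   # False: vague-coarse, needs refinement
print(is_ready(True, 13))      # False: too big, needs splitting
```

A check like this is what makes "a ready backlog for the next planning" testable instead of a matter of opinion.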
28. Planning
Planning session 1:
• Determine the capacity of the team
• Pick user stories based on "feeling", with velocity in mind
• Time: 5-10 minutes

Planning session 2:
• Define tasks and hours
• Time: 2 hours
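Picking stories "with velocity in mind" in session 1 can be sketched as a greedy selection down the priority-ordered backlog until the team's velocity is used up. The velocity value and story names are illustrative assumptions.

```python
# Sketch of planning session 1: walk the priority-ordered backlog and
# take each story that still fits in the remaining capacity, where
# capacity starts at the team's velocity (average points/iteration).

def pick_stories(stories, velocity):
    """stories: (title, estimate) pairs, already in priority order."""
    picked, remaining = [], velocity
    for title, estimate in stories:
        if estimate <= remaining:
            picked.append(title)
            remaining -= estimate
    return picked

stories = [("Login", 5), ("Search", 8), ("Reports", 3)]
print(pick_stories(stories, velocity=10))  # ['Login', 'Reports']
```

Real teams commit by feel rather than by algorithm, which is why the session takes only 5-10 minutes; the sketch just shows why velocity bounds the pick.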