6. Measurements / Behaviours
“Tell me how you will measure me and I will tell you how I will behave.”
– Eliyahu Goldratt
Hawthorne Effect – that which is measured will improve (people change their behaviour when they know it is being observed).
17. Organizational Dysfunctions
Agile organization with Traditional measurements:
• Rewards the wrong behaviours
• Creates/exacerbates the chasm
• Individual stature
• Promote myself
• Me first, team second
• Me versus everyone else
• Desire to become an SME (subject-matter expert) rather than share knowledge
18. Appraisals mapping to Agile manifesto and Principles
Creating Competition Vs. Collaboration
Ranking systems are used as a basis for dismissing the lowest performers,
making them even more threatening. When team members are in
competition with each other for their livelihood, teamwork quickly
evaporates.
Competition between teams, rather than individuals, may seem like a good
idea, but it can be equally damaging.
19. Appraisals mapping to Agile manifesto and Principles
Creating Idea of Impossibility
• Financial incentives are powerful motivators, so
there is a chance that the Individual or the
team might find a way to do the impossible.
• The more likely case is that the promise of a bonus
that was impossible to achieve would make the
team cynical, and the team would be even less
motivated to meet the deadline than before
the incentive was offered.
• When people find management exhorting them
to do what is clearly impossible rather than
helping to make the task possible, they are
likely to be insulted by the offer of a reward
and give up without half trying.
Principle # 5
Build projects around
motivated individuals. Give
them the environment and
support that they need and
trust them to get the job
done
20. Creating Sub-Optimization
• Recently, in a software organization, management offered testers $100 for every
defect they could find in a product about to go into UAT release.
• It was thought this would encourage the testers to work harder, but the
result was quite different.
• After all, the more problems the testers found, the more money they made – the
testers now had a stake in defects existing, not in preventing them.
• When we optimize a part of a chain, we invariably sub-optimize overall
performance
21. Destroying Intrinsic Motivation
Once employees get used to receiving financial rewards for meeting goals,
they begin to work for the rewards, not the intrinsic motivation that comes
from doing a good job and helping their company be successful.
Many studies have shown that extrinsic rewards like grades and pay will,
over time, destroy the intrinsic reward that comes from the work itself.
24. Performance Review – Should be regular
• While most systems focus on the individual, most agile people feel that the
team should be judged and rewarded if anybody is going to be judged and
rewarded.
• Managers are used to sitting down at the beginning of the year and
establishing some kind of performance goals for each employee.
• So, here's a tip that is perfectly workable: Give all of the members of an
agile team the same performance goals.
• When you do this, several things happen. First is that you find that you have
to establish goals that are higher level than the individual goals that you are
used to.
• By giving each member of a team the same goal, you have just
implemented a team goal.
25. Performance Review – Should be regular
Performance reviews of an Agile team should be done at regular intervals,
rather than as a once-a-year event.
Reviews should be cross-functional and cross-sectional (see the following slides).
26. Sprint Report Card – Customer to Agile Team
FROM: Customer TO: Team
Project: < Name of the Project>
Sprint #: < ## >
Sprint Dates: < From and To Dates>
Sprint Review Date: < >
Measurements Score (0 - 100)
Value of delivery in terms of quantity, business value?
Do committed Stories work as expected?
Was the time frame commitment met?
Is the UI intuitive, professional and pleasing?
Overall customer satisfaction score: <sum/4>
Additional comments: <enter additional comments here>
27. Sprint Report Card – Sample Filled
FROM: Customer TO: Team
Project: Clicker Software for Student Response System
Sprint #: 05
Sprint Dates: 9th March – 20th March 2015
Sprint Review Date: 20th March 2015 @ 3.00 pm
Feedback Provided by: Simon Reed – Chief Marketing Officer
Measurements Score (0 - 100)
Value of delivery in terms of quantity, business value? 90
Do committed Stories work as expected? 85
Was the time frame commitment met? 100
Is the UI intuitive, professional and pleasing? 85
Overall customer satisfaction score: 90
Additional comments:
• Background colour does not match our company standards
• Application crashed when more than 10 clickers were working together
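The `<sum/4>` cell in the template hints at the scoring rule: the overall customer satisfaction score is the arithmetic mean of the individual measurement scores (90 + 85 + 100 + 85, divided by 4, gives the 90 on the sample card). A minimal sketch, with function and dictionary names of my own choosing:

```python
def overall_score(scores):
    """Overall satisfaction score: arithmetic mean of the individual
    measurement scores (the <sum/4> cell in the template)."""
    scores = list(scores)
    return round(sum(scores) / len(scores))

# Measurement scores from the filled sample card above
card = {
    "Value of delivery": 90,
    "Committed stories work as expected": 85,
    "Time frame commitment met": 100,
    "UI intuitive, professional, pleasing": 85,
}
print(overall_score(card.values()))  # 90
```

The same rule reproduces the overall scores on the other report cards in this deck, regardless of how many measurement rows a card has.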
28. Product Owner Report Card to Development Team
FROM: Product Owner TO: Team
Project: Clicker Software for Student Response System
Sprint #: 05
Sprint Dates: 9th March – 20th March 2015
Sprint Review Date: 20th March 2015 @ 4.30 pm
Measurements Score (0 - 100)
Team's "mojo" (teamwork, collaboration, helpfulness) 100
Team's ability to deliver to their commitment 95
Team's overall technical abilities 95
Team's ability to react to changes 95
Team's ownership of customer needs 90
Overall customer satisfaction score: 95
Additional comments:
I thought the team overall did pretty well on this project. The team had some members new to this
area so it took them a while to get up to speed and contribute at a high technical level. Also there
were a few times where I felt the team was not truly accepting ownership of the customer desires.
29. Review of Product Owner by the Development Team
FROM: Team TO: Product Owner
Project: Clicker Software for Student Response System
Sprint #: 05
Sprint Dates: 9th March – 20th March 2015
Sprint Review Date: 20th March 2015 @ 4.30 pm
Measurements Score (0 - 100)
Availability to team
Understanding of customer requirements
Product Backlog organization, prioritization, and maintenance
Speed of answers to team
Leadership skills
Overall customer satisfaction score: XX
Additional comments:
30. Scrum Master Report Card – Development Team to SM
FROM: Team Member TO: SM
Project: Clicker Software for Student Response System
Sprint #: 05
Sprint Dates: 9th March – 20th March 2015
Sprint Review Date: 20th March 2015 @ 4.30 pm
Measurements Score (0 - 100)
Ownership of impediments 90
Ability to solve impediments 90
Coaching Skills 90
Knowledge of Agile / Scrum methods 90
Ability to effectively facilitate the team meetings 80
Overall customer satisfaction score: 88
Additional comments:
31. Team Member Peer Review Report Card
FROM: Jain Smith TO: Sam Dean
Project: Clicker Software for Student Response System
Sprint #: 05
Sprint Dates: 9th March – 20th March 2015
Feedback date: 20th March 2015
Measurements Score (0 - 100)
Collaboration skills 70
Helpfulness to rest of team 60
Ownership of the team's deliverables 70
Technical expertise 80
Ability to meet commitments 70
Overall customer satisfaction score: 70
Additional comments:
Does not communicate his impediments and stalled progress effectively. He has the potential to be
really good because of his technical prowess, but he needs to understand the team concept better.
Several times I was hoping for much stronger collaboration.
32. A Word About: No End-Customer?
In some projects, the team does not have direct access to the end-customer.
What can you do in this case?
• Proxy – internal person playing role of customer (BA, stakeholder, etc.)
• Product Owner – customer voice in Scrum team
• Agile Manager
• Others?
33. End of Sprint Diagnostics – That Could assist Reviews
Some “diagnostics” that are definitely worth measuring:*
• Customer opinion
- Using the Sprint Report Card as shown in the previous few slides
• User story points delivered
- For team velocity computation
• Flow-based attributes
- # of user stories that were ready to start at the sprint planning meeting
- # of user stories completed – Met the DoD or considered complete by PO
- # of user stories committed, but could not meet DoD – therefore considered pending
* Be careful – only measure meaningful, easy-to-collect items.
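The "user story points delivered" diagnostic is what feeds velocity. A sketch of a rolling-average velocity computation; the three-sprint window, the function name, and the sample numbers are assumptions for illustration:

```python
def rolling_velocity(points_delivered, window=3):
    """Team velocity as the rolling average of story points delivered
    in the most recent `window` sprints (3 is a common but arbitrary choice)."""
    recent = points_delivered[-window:]
    return sum(recent) / len(recent)

history = [21, 18, 24, 27]  # points delivered per sprint (hypothetical)
print(rolling_velocity(history))  # 23.0 (mean of the last three sprints)
```

A rolling window dampens one-off spikes and dips, so the number reflects the team's current, not historical, pace.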
34. A Word About: Velocity – Should we use it for Reviews?
“Should we measure velocity?”
• Answer: Yes, of course.
“Should we use velocity for release planning?”
• Answer: Yes, of course.
“Should we use velocity to help gauge how much to choose in a sprint?”
• Answer: Yes, of course.
“Should we use velocity for evaluating the team or an individual?”
• Answer: Definitely not. A BIG NO.
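The legitimate uses of velocity named above (release planning, gauging how much to take into a sprint) amount to simple forecasting arithmetic. A hedged sketch, with names and numbers that are purely illustrative:

```python
import math

def sprints_needed(remaining_backlog_points, avg_velocity):
    """Release planning with velocity: roughly how many sprints the
    remaining backlog will take. A forecast, not a commitment."""
    return math.ceil(remaining_backlog_points / avg_velocity)

# 120 backlog points at an average velocity of 23 points per sprint
print(sprints_needed(120, 23.0))  # 6
```

The same average velocity bounds what the team should pull into the next sprint; using it to rank the team or individuals is exactly the misuse the slide warns against.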
35. End of Sprint – Do these measures make business sense?
Avoid measuring:
Individual velocity
Sum of all task hours for a person
• Compared to 40 hours/week
Number of tasks/person
Accuracy of task estimates – something like our old favourite, effort variance
Accuracy of story point estimates (remember, an estimate is an estimate; it
cannot be "accurate")
What about team’s ability to hit their commitment?
36. Make it different and make it count
• One manager in China said to me, "Scrum seems to hide the contributions of
individuals, and I can't judge them as well."
• My first thought was, "Well, what does that tell you?" Nobody knows better than
the other members of a Scrum team how each person is doing.
• Make use of that by trying 360-degree reviews. Doing them frequently takes away a
lot of the sting and, after a couple of rounds, there will be no surprises left.
• Doing them outside the formal review process will allow people to comfortably give
the kind of feedback that can really make a difference.
• Here's what I suggested:
• Announce to the team that we would do voluntary 360-degree reviews.
• Nobody was required to participate, but if you wanted feedback you had to give
feedback for everybody else on the team, including the ScrumMaster and
product owner.
• All feedback was handed in to the manager, who edited it together to make it
impossible to determine who said what.
• Feedback was then delivered to each person directly and in private by the
manager.
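The manager's editing step above, coalescing everyone's feedback so it is impossible to determine who said what, can be sketched as follows; the function name and the (author, recipient, comment) tuple shape are my assumptions:

```python
import random
from collections import defaultdict

def anonymize_feedback(feedback):
    """The manager's editing step: group comments by recipient, strip the
    authors, and shuffle the order so nobody can tell who said what.
    `feedback` is a list of (author, recipient, comment) tuples."""
    by_recipient = defaultdict(list)
    for _author, recipient, comment in feedback:
        by_recipient[recipient].append(comment)  # author deliberately dropped
    for comments in by_recipient.values():
        random.shuffle(comments)  # remove ordering clues as well
    return dict(by_recipient)
```

Each person then receives only their own shuffled, unattributed list, delivered privately, which is what takes the sting out of frequent 360-degree rounds.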
37. Project Report Card
FROM: Customer TO: Team
Project: Clicker Software for Student Response System
Date of the Project Completion: 1st June 2015
Feedback Provided by: Simon Reed – Chief Marketing Officer
Measurements Score (0 - 100)
Level of collaboration experienced with Product Owner and development team 95
Team's ability to react to changes? 90
Timeliness of the project? 95
Quantity of features delivered? 95
Quality of features delivered? 95
Overall customer satisfaction score: 94
Additional comments:
Overall a great project & a great team! The high levels of communication & collaboration were quite
refreshing to me considering my previous failed attempt with another company.
38. If the Projects are too long spanning across years …
Some projects are very long (1+ year) due to quantity of functionality and/or
complexity. What can you do in this case?
• Snapshot – collect the project report card 2 or 3 times / year
• Timing – collect the project report card right before the annual review
• Timing – collect the project report card at the end of each major release
• Others?
39. Agile Annual Performance Review process
Conducted 1-on-1, because of its personal nature and because the team member
has likely worked on multiple projects
Performed by Agile manager, director, etc.
• Gather and coalesce the report card data!
• Average team score for all sprint report cards
• Average team score for all project report cards
• Average team score from all Product Owner report cards
• Average scores from project peer reviews
• Query key co-workers, POs, Scrum Masters
for additional insights on the person
Discuss what these measurements show
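The gather-and-coalesce step above is plain averaging of report-card scores per source. A minimal sketch; the function and key names are illustrative, not from the slides:

```python
from statistics import mean

def coalesce_report_cards(sprint_cards, project_cards, po_cards, peer_reviews):
    """Coalesce the report-card data the Agile manager gathers for the
    annual review. Each argument is a list of overall scores (0-100)."""
    return {
        "sprint_report_avg": mean(sprint_cards),
        "project_report_avg": mean(project_cards),
        "po_report_avg": mean(po_cards),
        "peer_review_avg": mean(peer_reviews),
    }
```

These averages are the starting point of the 1-on-1 discussion, alongside the qualitative input from co-workers, Product Owners, and Scrum Masters.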
40. Agile Annual Performance Review process
• Identify how to improve the teams
• Identify training needed to make teams more cross-functional
• Honest and transparent discussion
• Both positive and constructive
41. Agile Performance Review process – A few thoughts
The Scrum framework doesn't prevent you from measuring whatever you
want to measure. However, you'd be missing the spirit of Scrum if you
measured the work of a single team member.
The fundamental point is that if a Scrum team member is not able to
contribute, then this points to a team problem.
As with any new approach, introducing an Agile performance evaluation
system will have its challenges:
Resistance to change
Fear – it can feel too revealing
Deflation of the general optimism
Loss of control
Comfort zones being disturbed ("Who moved my cheese?"), etc.
A very important part of any Agile rollout is to align the performance
evaluation system (and other HR practices) with what Agile emphasizes
42. Final Thoughts in this journey
The conclusion is that tracking an individual's performance is, in effect, a crime in
the Scrum world.
The fundamental idea of Scrum is team collaboration, and team members
"volunteer" for tasks rather than having them assigned.
One of the core values of Scrum is courage, and it's not a bad practice to
announce at the daily stand-up, "I missed my task timeline today." If that's the
case, the Scrum team collaboratively aligns itself to meet the sprint backlog, at
least enough to result in a shippable product when the sprint ends.
The keys, then, are accurate sprint planning and a "must-do" sprint
retrospective. Most important, don't wreck the Scrum by measuring an
individual's work items.
It is human nature for people to modify their behaviours to match the
evaluation system.
Not aligning that system with Agile values causes dysfunction that will erode the team's effectiveness.
43. Final Thoughts in this journey
Sprint report card
Sprint - a few diagnostics to measure and a few not to
Project report card
Product Owner report card
Project peer review
Annual “Agile performance review”