Moneyball for Software Teams?
Date: This event took place live on September 16, 2011
Presented by: Jonathan Alexander
Duration: Approximately 60 minutes.
Cost: Free
The new movie "Moneyball" starring Brad Pitt is about to be released. Based on the bestselling book of the same name, the movie explores the use of "sabermetrics" to build winning baseball teams. In this webcast, Jonathan Alexander, author of Codermetrics, suggests that these same ideas be applied to software teams. Jonathan will discuss how you can apply similar ideas to improve your software teams, giving examples of specific metrics and techniques to help you identify, analyze, and discuss the successes and failures of your software engineers, and to help make the team more cohesive and productive. If you manage or lead teams of software developers and engineers, and want to make sure that your focus is on success, you can't afford to miss this entertaining and instructive event.
1. Moneyball for Software Teams?
Presented by Jonathan Alexander
VP Engineering at Vocalocity
Author of Codermetrics (O'Reilly 2011)
2. The Popularity of Moneyball
Advanced stats used to analyze baseball players and teams
Bill James, the father of sabermetrics, author and consultant
Michael Lewis, author of Moneyball (published 2003)
Moneyball starring Brad Pitt being released Sept. 23rd, 2011
O'Reilly Strata Summit: The Business of Data (NYC Sept. 20-21) has Paul DePodesta, VP of the NY Mets, as a featured speaker about Moneyball
3. Metrics Have Changed the Game(s)
Scouting
Drafting
Trades
Coaching
Player Development
Salary Arbitration
6. Principles of Moneyball:
Study and Learn from Outliers
anomaly (noun): an incongruity or inconsistency; a deviation from the norm
outlier (noun): a person or thing that lies outside; a point widely separated from the main cluster
Player            Games  At Bats  Hits   Doubles  Triples  HRs  RBI    Avg.  OPS   All Stars
Piazza ('92-'07)  1,912  6,911    2,127  344      8        427  1,335  .308  .922  12
Pudge ('91-'10)   2,499  9,468    2,817  565      51       309  1,313  .298  .800  14
7. Techniques Used For Moneyball
Leverage basic performance statistics
Hits, Runs Batted In (RBI), Earned Run Average (ERA)
Add “situational” statistics gathered by “spotters”
Errors, out-of-zone fielding, pressure situations
Develop “advanced” statistics through combinations and formulas
OPS (on-base plus slugging), FIP (fielding independent pitching),
BABIP (batting average on balls in play), WAR (wins above replacement)
Analyze statistics to find best predictors of individual
and team success
8. Moneyball for Software Teams?
Implement new techniques to gather metrics on a
wide variety of contributions
Find ways to measure “wins” and “losses”
Analyze how individual contributions and team
“chemistry” correlate to wins and losses
Examine Assumptions
Discover Patterns
Use metrics to create focus and help identify
opportunities to change, adjust, improve
9. The Magic Triangle Challenge
Oft-discussed “triangle”: Features-Time-Quality
Is it true? You can't add more work unless you lengthen time or reduce quality
            Avg. Complexity  Total Complexity  Quality Problems  Release Quality %
Release 1   1.2              272               86                68%
Release 2   1.6              248               77                69%
Release 3   1.5              274               109               60%
Release 4   2.8              318               69                78%
Release 5   2.4              347               88                75%
Release 6   1.4              261               92                65%
Release Quality % = 100 × (1 – Quality Problems / Total Complexity)
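The Release Quality % column above can be reproduced with a short script (the helper name and the rounding to whole percentages are ours):

```python
def release_quality(quality_problems, total_complexity):
    """Release Quality % = 100 * (1 - Quality Problems / Total Complexity)."""
    return round(100 * (1 - quality_problems / total_complexity))

# (quality problems, total complexity) pairs for Releases 1-6 from the table
releases = [(86, 272), (77, 248), (109, 274), (69, 318), (88, 347), (92, 261)]
print([release_quality(p, c) for p, c in releases])  # [68, 69, 60, 78, 75, 65]
```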
10. Questions To Answer
How well do team members handle their core
responsibilities?
Examples: Design, Code, Test
In what ways do team members contribute beyond
their core responsibilities?
Examples: Innovate, Take Initiative, Handle Adversity
How much do team members help others?
Examples: Assist, Mentor, Motivate
Is the software team succeeding or failing?
Examples: New Users, Production Bugs, Efficiency
11. What Are The Roles On Your Teams?
Playmakers and Scorers
Defensive Stoppers
Utility and Role Players
Backups
Motivators
Veterans and Rookies
12. Example: Skill Metrics
Points: measures the overall productivity of each coder on assigned tasks
  Points = Sum of complexity ratings for all completed tasks
Utility: measures how many assigned tasks each coder completes
  Utility = Number of tasks completed
Assists: measures the amount of coder interruptions and how much a coder helps others
  Assists = Count of times that coder helps others
Saves: measures how often a coder helps fix urgent production issues
  Saves = Number of severe product issues coder helps fix
Tackles: measures how many potential issues a coder handles proactively
  Tackles = Number of times a coder takes initiative or innovates
Turnovers: measures the complexity of assigned tasks that a coder fails to complete
  Turnovers = Sum of complexity for all tasks that are not completed
Errors: measures the magnitude of production issues found in areas of coder responsibility
  Errors = Sum of bug severity factored by population affected
Range: measures how many areas of software a coder works on
  Range = Number of areas worked on by a coder
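A minimal sketch of how a few of these skill metrics could be computed from a task log; the field names and sample data are illustrative, not from the talk:

```python
# Hypothetical task log: each entry records the coder, the task's
# complexity rating, and whether the task was completed.
tasks = [
    {"coder": "amy", "complexity": 3, "completed": True},
    {"coder": "amy", "complexity": 5, "completed": True},
    {"coder": "amy", "complexity": 2, "completed": False},
]

def skill_metrics(tasks, coder):
    done = [t for t in tasks if t["coder"] == coder and t["completed"]]
    missed = [t for t in tasks if t["coder"] == coder and not t["completed"]]
    return {
        "points": sum(t["complexity"] for t in done),       # productivity
        "utility": len(done),                               # tasks completed
        "turnovers": sum(t["complexity"] for t in missed),  # uncompleted complexity
    }

print(skill_metrics(tasks, "amy"))  # {'points': 8, 'utility': 2, 'turnovers': 2}
```

Assists, Saves, and Tackles would come from a separate log of observed events (see the "spotters" technique later in the talk) rather than from the task tracker.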
13. Example: Response Metrics
Wins: measures the number of active users added
  Wins = Sum (User Activations)
Losses: measures the number of active users lost
  Losses = Sum (User Deactivations)
Win Rate: determines the average amount of time it takes to get a “win” (new user)
  Win Rate = Time elapsed / Number of new users
Loss Rate: determines the average amount of time it takes to accumulate each “loss” (lost user)
  Loss Rate = Time elapsed / Number of lost users
Win Percentage: measures the percentage of trials that successfully convert to active users
  Win Percentage = (Successful Trials / Trials Completed) × 100
Gain: measures the number of Wins minus the missed opportunities and Losses
  Gain = Wins – ((Trials Completed – Successful Trials) + Losses)
Penalties Per Win: measures the overall urgency of customer support issues relative to the number of new users
  Penalties Per Win = Penalties / Wins
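The response formulas above translate directly into code; this sketch uses our own parameter names and made-up numbers for the example call:

```python
def response_metrics(days_elapsed, wins, losses, trials_completed, successful_trials):
    """Response metrics per the slide's formulas; time is measured in days here."""
    return {
        "win_rate": days_elapsed / wins,      # avg days per new user
        "loss_rate": days_elapsed / losses,   # avg days per lost user
        "win_pct": 100 * successful_trials / trials_completed,
        "gain": wins - ((trials_completed - successful_trials) + losses),
    }

# Hypothetical 30-day period: 60 activations, 10 deactivations,
# 40 trials completed of which 30 converted.
print(response_metrics(30, 60, 10, 40, 30))
```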
14. Example: “Advanced” Metrics
Power: measures the average complexity of the tasks that a coder completes
  Power = Points / Utility
Temperature: measures how “hot” or “cold” a coder is at any given time (start with Temp. 72)
  Temperature = Previous Temp. × (Current Points / Previous Points)
O-Impact: “Offensive Impact,” summarizing how a coder helps move projects along
  O-Impact = Points + Utility + Assists
D-Impact: “Defensive Impact,” summarizing how a coder helps solve issues or avoid larger problems
  D-Impact = (Saves + Tackles) × Range
Plus-Minus: measures the amount of positive contributions versus negative issues for each coder
  Plus-Minus = Points – Turnovers – Errors
Teamwork: establishes a relative rating for team-oriented contributions
  Teamwork = Assists + Saves + Range – Turnovers
Fielding: establishes a relative rating for the range and breadth of work successfully handled
  Fielding = (Utility + Range) – (Turnovers + Errors)
Intensity: establishes a relative rating for heightened productivity and dealing with hot issues
  Intensity = Saves + Tackles + (Avg. Temp. – Start Temp.)
Win Shares: assigns a relative level of credit to each coder for new users
  Win Shares = Wins × Influence × Efficiency
Loss Shares: assigns a relative level of responsibility to each coder for lost users
  Loss Shares = Losses × (1.0 – Efficiency)
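Since the advanced metrics are simple combinations of the base metrics, they are easy to express as plain functions; a sketch of a few of them, with our own parameter names:

```python
def power(points, utility):
    # Average complexity of the tasks a coder completes
    return points / utility

def temperature(prev_temp, current_points, previous_points):
    # Every coder starts at 72; temperature rises or falls with output
    return prev_temp * (current_points / previous_points)

def o_impact(points, utility, assists):
    # "Offensive Impact": how a coder moves projects along
    return points + utility + assists

def d_impact(saves, tackles, range_):
    # "Defensive Impact": solving issues and avoiding larger problems
    return (saves + tackles) * range_

def plus_minus(points, turnovers, errors):
    # Positive contributions versus negative issues
    return points - turnovers - errors
```

For example, a coder who scored 8 points this period after 10 last period cools from 72 to 57.6, while one who jumped from 8 to 10 heats up to 90.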
15. Techniques to Gather and Track Metrics
Get data from existing systems
Project tracking, bug tracking, customer support
Instrument your software for usage data
New users, lost users, feature usage, measured benefits
Self-reporting or “spotters” for situational data
Create documents or database for metric storage and tracking
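For the storage step, even a flat CSV appended to each sprint is enough to start; a minimal sketch (the column names and sample rows are ours):

```python
import csv
import io

# One row per coder per sprint, exported from your tracking systems.
rows = [
    {"sprint": "2011-09", "coder": "amy", "points": 8, "assists": 3},
    {"sprint": "2011-09", "coder": "bob", "points": 5, "assists": 1},
]

buf = io.StringIO()  # stands in for a real file on disk
writer = csv.DictWriter(buf, fieldnames=["sprint", "coder", "points", "assists"])
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

A spreadsheet or a small database table works just as well; the point is to accumulate history so trends and correlations can be analyzed later.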
18. Identify Key Goals and Accomplishments
Boost: measures the amount of additional user benefits delivered
  Boost = Sum of the percentage of users receiving each benefit
Acceleration: measures the ratio of user benefits delivered to urgent user issues created
  Acceleration = (Boost / Number of Urgent User Issues) × 100
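Both goal metrics are one-liners; this sketch assumes you track, for each shipped benefit, the percentage of users who received it:

```python
def boost(benefit_user_percentages):
    # Sum of the percentage of users receiving each delivered benefit
    return sum(benefit_user_percentages)

def acceleration(boost_value, urgent_user_issues):
    # Ratio of benefits delivered to urgent issues created
    return (boost_value / urgent_user_issues) * 100

# Hypothetical release: three benefits reaching 40%, 25%, and 10% of users,
# against 5 urgent user issues created.
b = boost([40, 25, 10])
print(b, acceleration(b, 5))
```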
19. Steps for Adopting Metrics
1. Find a Sponsor
2. Create a Focus Group
3. Conduct a Trial (restart or stop if trial fails)
4. Introduce Metrics to the Team
5. Create a Metrics Storage System
6. Establish a Forum for Discourse
7. Expand Metrics Used and Analysis
20. Places and Times to Use Metrics
Regular Team Meetings (sprint retrospectives)
Project Post-Mortems
Mentoring
Establishing Goals and Rewards
Performance Reviews (validation)
Promotion Consideration
21. Moneyball Strategies for
Building Better Software Teams
Recruit for “Comps”
Profile your team, identify roles you need, then recruit
Improve the Farm System
Use interns, contract-to-perm, promote from within
Make Trades
Re-organize teams internally to fill roles and balance skills
Coach the Skills You Need
Focus on those with aptitude, use target metrics
22. Recruiting Comps
              Defensive Stopper  Candidate A  Candidate B  Candidate C
              (Target Profile)   Profile      Profile      Profile
Avg. Points   Medium             High         Medium       Medium
Avg. Utility  Medium             Medium       Medium       Medium
Avg. Assists  Medium             Low          High         Medium
Avg. Errors   Low                Medium       Low          Medium
Avg. Saves    High               Low          High         Medium
Avg. Tackles  High               Low          Medium       Low
Avg. Range    Medium             Low          Medium       Medium
The best candidate is the one whose profile most closely matches the target profile.
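One simple way to "comp" candidates against a target profile is to count matching Low/Medium/High ratings; the profiles below are from the table, the scoring scheme is our own illustration:

```python
# Target "Defensive Stopper" profile and candidate profiles from the table.
TARGET = {"points": "Medium", "utility": "Medium", "assists": "Medium",
          "errors": "Low", "saves": "High", "tackles": "High", "range": "Medium"}

CANDIDATES = {
    "A": {"points": "High", "utility": "Medium", "assists": "Low",
          "errors": "Medium", "saves": "Low", "tackles": "Low", "range": "Low"},
    "B": {"points": "Medium", "utility": "Medium", "assists": "High",
          "errors": "Low", "saves": "High", "tackles": "Medium", "range": "Medium"},
    "C": {"points": "Medium", "utility": "Medium", "assists": "Medium",
          "errors": "Medium", "saves": "Medium", "tackles": "Low", "range": "Medium"},
}

def match_score(candidate):
    # Number of metrics where the candidate's rating equals the target's
    return sum(candidate[k] == TARGET[k] for k in TARGET)

best = max(CANDIDATES, key=lambda name: match_score(CANDIDATES[name]))
print(best, {n: match_score(c) for n, c in CANDIDATES.items()})  # B {'A': 1, 'B': 5, 'C': 4}
```

A refinement would be to weight the metrics that define the role (here Saves and Tackles) more heavily than the rest.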
24. Resources for Further Exploration
Codermetrics: Analytics for Improving Software Teams
262 Pages
Released August, 2011
In bookstores, Safari Online, or at http://www.oreilly.com
Codermetrics.Org – community website
Post ideas or stories
Share resources (spreadsheets, analysis)
Ask questions
Post Events
Follow on Twitter @codermetrics
25. Special Offer
Visit http://oreilly.com to purchase your copy of Codermetrics
Enter code 4CAST to save 40% off the print book and 50% off the ebook
Visit http://oreilly.com/webcasts to view upcoming webcasts and online events