2. Best Way to Contract an Outsourced
Agile Test Team
Clareice Chaney, CPCM, PMP
Clyneice Chaney, PMP, CQMgr.
3. XBOSoft info
• Founded in 2006
• Dedicated to software quality
• Software QA consulting
• Software testing services
• Offices in San Francisco, Beijing, Oslo and Amsterdam
5. About the Speakers
Clareice Chaney has over 30 years of experience in
commercial and government contracting with an
emphasis in information technology contracting and
specific expertise in performance-based contracting
techniques.
Clyneice Chaney brings over 20 years of testing, quality
assurance, and process improvement experience.
Clyneice holds certifications from the American Society
for Quality as a Certified Quality Manager, QAI Global
Institute's Certified Quality Analyst, and the Project
Management Institute's Project Management Professional.
6. Housekeeping
• Everyone except the speakers is muted
• Submit questions via the GoToWebinar control on the right side of your screen
• Questions can be asked throughout the webinar; we'll try to fit them in when appropriate
• General Q&A at the end of the webinar
• You will receive information on the recording after the webinar
7. Next Webinar - July 30
• Mobile Testing Tutorial
• JeanAnn Harrison
• BYOD
• Registration page in your chat
9. Outsourced Testing: Typical Categories
Consulting Services:
• Staff augmentation
• Functional testing
• Integration testing
• Performance testing
• Testability assessment
Quality Assurance:
• Test management
• TCOE
• Risk-based testing
• Test automation
• Agile testing
Quality Management:
• Test data management
• Managed quality services
• Quality center of excellence
• On-site advisory/assessment
• Shared-outcome QA
10. Outsourced Testing Challenges
• No single best testing practice
• Complexity of the system under test
• Test automation
– Misaligned baselines
– Operational inefficiencies
– Poor risk and performance management
– Collaboration failures
11. Is There a Difference Outsourcing Agile?
• Traditional: one contract spanning Requirement, Design, Build, Test, Deploy
• Agile:
– Foils controlling systems
– Defines deliverables differently
12. Agile & Contracting: Understanding a
Basic Contradiction
• You need an agreement to define a fair and professional relationship
• Contracts typically don't allow for quick or easy changes
• Poorly constructed contracts have the potential to derail business objectives
13. Impact of Agile on Contracts
• Co-location
• Decision-making authority
• Customer participation
• Pricing concerns
• Cross-functional, not separate, teams
– Testing happens often before delivery and as part of the development process
– More automated testing & analysis for early bug identification
14. What’s the Fix?
• Modular contracting to provide flexibility: several smaller acquisitions that:
– Are easier to manage individually
– Address complex IT objectives
– Provide for delivery, implementation & testing of workable system solutions in increments
– Allow subsequent increments to take advantage of changes in technology or requirements
15. And… Fix #2
• First phase is a typical service/consulting contract
• Second phase is a series of fixed-price contracts where you have defined:
– What to build
– Time frame
– How much it will cost
• Result: a framing contract that is really about the collaboration and the end goal
• Defines: the number of iterations and the T&Cs for both phases
16. And Fix #3: Focus on What You Know
Contract project objectives rather than the solution:
• Target cost
• Target schedule
• Business objectives, with a high-level overview of how the software will help
• Time and resource budgets
Well-defined planning and acceptance process for each iteration.
Shared-benefit contracts:
• The goal is to provide financial incentives for meeting project objectives and penalties if they aren't met
• Not technical objectives… business objectives
17. Road to Good Outsourced Test Performance (Supplier Performance)
#1 Identify outcomes
#2 Determine outcome measures
#3 Define service levels
#4 Establish monitoring methods & monitor outcomes
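Step 4 (establish monitoring methods and monitor outcomes) can be sketched as a simple service-level check. This is an illustrative sketch only: the metric names, targets, and measured values below are invented, not from the webinar.

```python
# Hypothetical service levels agreed with the supplier. For each metric
# we record the target and whether a higher value is better.
service_levels = {
    "test_effectiveness_pct": {"target": 85.0, "higher_is_better": True},
    "defect_turnaround_days": {"target": 3.0, "higher_is_better": False},
}

# Invented measurements from one monitoring period.
measured = {
    "test_effectiveness_pct": 78.5,
    "defect_turnaround_days": 2.0,
}

def sla_breaches(levels, actuals):
    """Return the names of metrics whose measured value misses the target."""
    breaches = []
    for name, rule in levels.items():
        value = actuals[name]
        ok = value >= rule["target"] if rule["higher_is_better"] else value <= rule["target"]
        if not ok:
            breaches.append(name)
    return breaches

print(sla_breaches(service_levels, measured))  # effectiveness misses the 85% target
```

A check like this is only as good as the outcome measures chosen in steps 1 and 2; the point is that once service levels are written down, monitoring them is mechanical.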
19. What Do You Need to Know?
Quality of Product:
• Defect density
• Quality attributes
• Number of defects
• Status vs. quality target
Quality of Service:
• Schedule
• Test effectiveness
• Test planning effectiveness
• Defect detection
Adapted from James Bach, Rapid Software Testing.
20. Measuring the Supplier's Outcomes
• Number of tests attempted by a given time
• Number of tests passed by a given time
• Number of bugs found by a given time
• Test effectiveness ratio
• Defect detection capability
• Duration of testing processes
• Defects by category/severity
• Product defect density
• Backlog of open bugs
• Product quality over time
• Defect severity index
22. 5P Performance Measurement Framework
Adapted from Kalyana, "Performance Measurement Framework for Outsourced Testing Projects" (April 3, 2009)
• Project: for each task or project, monitor whether it meets its requirements and service levels
• Process: monitor the supplier's processes, particularly the ones they agreed to perform
• Product: monitor the excellence of deliverables
• People: monitor supplier staff: average experience, availability, capability
• Price: dollar amount saved, price variance
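One way to roll the five dimensions into a single engagement score is a weighted scorecard. The weights and 0-10 scores below are invented for illustration; as the framework notes, the weighting of each P would shift per engagement depending on the results you want.

```python
# Hypothetical 5P weights (must sum to 1.0) and per-dimension scores (0-10).
weights = {"project": 0.30, "process": 0.20, "product": 0.25,
           "people": 0.15, "price": 0.10}
scores = {"project": 8, "process": 6, "product": 9, "people": 7, "price": 5}

def weighted_score(weights, scores):
    """Combine per-dimension scores into one weighted overall score."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[dim] * scores[dim] for dim in weights)

overall = weighted_score(weights, scores)
print(round(overall, 2))  # 7.4
```

The single number is only a summary; the per-dimension scores are what tell you whether, say, process discipline or pricing is where the relationship is slipping.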
23. Setting Testing Service Quality Levels
• Timeliness of service
• Accessibility and convenience
• Accuracy
• Staff availability
• Service delivery: courteous?
• Adequacy of information disseminated
• Customer satisfaction
• Condition and safety of facilities used
Clareice: Holds a master's degree, PMP certification from the Project Management Institute, and is a Certified Professional Contracts Manager through the National Contract Management Association (NCMA). She has presented at the National Contract Management Association World Congress.
Clyneice: She has participated as an examiner for state quality awards for Georgia and Virginia. She is currently an instructor for the International Institute for Software Testing and has presented technical papers at the Software Engineering Institute's SEPG Conference, the American Society for Quality's Quality Manager's conference, the Quality Assurance Institute International Testing Conference, the International Conference on Software Process Improvement, and the Software Test and Performance Testing conferences.
There are numerous approaches to testing. Collaboration failures, in particular, can collectively destroy up to 90% of the value of an outsourcing relationship.
Yes, there is. Contracting challenges the agile context. Conventional contracting factors, such as SDLC scope, schedule milestones, and deliverable payment terms, are built on tight control and documented changes to the contract. Large organizations and complex systems often run into trouble because agile foils most controlling systems and makes it difficult to define deliverables, which are the primary basis of the contract. Fixing the scope to deliver, the time, and the cost up front doesn't fit the agile context. Almost every aspect of traditional contracting is different in agile, and the contracting terms should account for the differences. You'll need to structure the contract clauses from a specifically agile perspective. The following slides will tell you how.
One of the major drawbacks of contracts is that they don't allow you to change easily or quickly, so a standard contract is contradictory to agile. This isn't unique to agile; most software development efforts have similar issues. You need contracts to secure a fair and professional business relationship; on the other hand, poorly constructed contracts have the potential to derail business objectives.
Functional requirements are frequently not identified sufficiently at the level needed to develop a good cost estimate.
Contract the process that will lead to the results you want.
Phase 1: The objective is to learn and understand, not to deliver: learning and set-up to achieve a common vision of what the system looks like from a technical and business view (3-6 weeks max).
Phase 2: Iterations. Agree to run a series of iterations, each with: a list of mandatory functionality for the upcoming iteration; an estimate of development effort for each item on the list; and consent between client and contractor that the iteration makes sense from both a technical and a business perspective. Plan and implement each iteration according to the state of the art. The process must provide enough formality for safety for both parties: iteration planning sessions get written minutes that are mutually signed off, acceptance is based on automated tests, and all tests should be documented. You could, for example, set up a budget for 10 iterations and a list of features the domain experts think are the minimum to deliver business success.
For Phase 2 you have defined: what to build (the list), the time frame (the next iteration), and how much it will cost (the estimate). You end up with a framing contract that is really about the collaboration and the end goal. It fixes the process: how many iterations, and the terms and conditions for both phases.
Lean Software Development: Cutter Senior Consultant Mary Poppendieck proposes focusing the contract on what you know.
What Gets Measured and Reported Gets Attention
Terminology:
Goal: what you hope to achieve.
Inputs: amounts of resources used (code, number of test scripts), often expressed as an amount of funds, a number of employee-years or hours, or both.
Outputs: what you do; the products or services delivered (quantitative). Outputs consist of the completed products of an activity, i.e., the amount of work done (how much, how many, how economical, how prompt, how accurate, how responsive). Outputs do not by themselves tell you anything about the results achieved, although they are expected to lead to desired outcomes.
Outcomes: impact (qualitative); what the outputs accomplish (how well, how valuable, how reliable, how courteous, how fast, how we respond). For performance measurement, the amounts actually used, not budgeted, are relevant. Outcomes are events, occurrences, or conditions outside the activity or program itself that are of direct importance to customers. An outcome indicator is a measure of the amount and/or frequency of such occurrences. It is important to measure both the outcomes and the outputs.
Indicator: observable and measurable (generally numeric) behavior.
Benchmark: something that serves as a standard by which others may be measured or judged.
Target: threshold of success.
Accomplishments: how your results compare to your targets; what actually happened.
While amounts of work by themselves are not outputs or outcomes, workload data can be used to produce outcome data: the amount of work not completed at the end of a reporting period can be considered a proxy for delays of service to customers; the size of the backlog can indicate customer wait times; and indicators of the extent of delays, such as the percentage of cases in which the time between a service request and when the service was provided exceeds X days, where X is the standard.
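The delay indicator mentioned in the terminology notes (the percentage of cases where the time from service request to delivery exceeds a standard of X days) is simple to compute. A minimal sketch; the turnaround times and standard below are invented sample data:

```python
# Invented per-request turnaround times, in days, for one reporting period.
turnaround_days = [1, 2, 5, 3, 8, 2, 4, 10, 1, 6]
standard_x = 4  # the agreed service standard, in days

def pct_exceeding(times, x):
    """Percentage of requests whose turnaround exceeded the standard x."""
    late = sum(1 for t in times if t > x)
    return 100.0 * late / len(times)

print(pct_exceeding(turnaround_days, standard_x))  # 40.0
```

This turns raw workload data into an outcome indicator: not how much work was done, but how often customers waited longer than the standard.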
It is pretty well understood in the software industry that testing is a specialized area that helps organizations reduce risk and derive greater business value across the entire software development lifecycle. However, many organizations continue to struggle to figure out the best way to define services and the outcomes that can govern testing relationships. Test metrics are the means by which software quality can be measured; they often provide visibility into the readiness of the product and give a clear measurement of its quality and completeness. Performance measures should provide answers to these questions: How long will it take to test? How bad/good is the product? How many bugs still remain in the product? Will testing be completed on time? Was the testing done effectively? How much effort went into testing the product? Put on the board: there is an acronym for what makes a good metric: SMART: specific, measurable, attainable, repeatable, and time-dependent.
Supplier metrics: how good are they at doing the job you are paying for?
• Defect detection capability = detected defects / total time spent
• Test effectiveness = (bugs found in test / total bugs found) x 100
• Productivity variance
• Rate of incoming defects by product, module, etc.
What is the release quality of the product they are testing?
• Product defect density = number of defects / length of source code
• Backlog of open defects = count of open bugs, weighted over time
• Product quality over time = weighted defect count by month or quarter
• Bug density = number of bugs by severity for each component
• Defect severity index: a direct measure of quality presenting the average severity of defects; calculated as the total of all defects' severity ratings divided by the total number of defects
We would like to know how many tests we have and whether they are all working. Consider a simple chart showing, for each time period (day, week, or iteration), how many customer acceptance tests exist. Color in green the number that are working, and in red the number that are not. The chart shows what you might see on a project with an orderly progression of more and more tests running.
Story changes can be bad or good. It's good when the customer learns, but too many changes might suggest that they're being fickle or that they'd benefit from a little more thought. Good or bad, however, defect fixes and story changes consume time that might be better spent. If you suspect there's a problem, try a chart to see what happens. The chart shows new work "above the line" and defects and changes below the line; it shows very vividly the effect that changes and defects are having on new production. It's probably always good to track velocity, in terms of number of stories or story points. It can also be of value to show how your velocity is being spent, in terms of defect fixes or changes to previous stories.
One "classical" chart from Scrum is the burn-down chart. One issue with the burn-down chart is that it doesn't do a very good job of dealing with changes in the number of stories. One simple alternative is the burn-up chart. Here's an example of a chart showing an increase in the number of stories and, therefore, an adjustment in the anticipated ship date. I like this chart: it's simple, it shows clearly that changes in requirements are impacting the release date, and it makes it clear that the decision about what to do to make the date lies in the hands of the requirements-givers. It's a really nice example of a simple chart with powerful impact, yet without much in the way of confrontation.
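The burn-up projection can also be sketched numerically: when scope grows, the forecast finish date moves out. This is a minimal illustration assuming a simple average-velocity forecast; the story-point figures are invented, not from the webinar.

```python
import math

# Invented data: points completed in the iterations run so far, and the
# total scope in points after a mid-release scope increase.
completed_per_iteration = [10, 12, 11]
total_scope = 90

def projected_iterations(completed, scope):
    """Forecast the total iteration count needed to burn up to full scope."""
    done = sum(completed)
    velocity = done / len(completed)      # average points per iteration
    remaining = scope - done
    # iterations already run, plus full iterations still needed
    return len(completed) + math.ceil(remaining / velocity)

print(projected_iterations(completed_per_iteration, total_scope))  # 9
```

Rerunning the projection whenever scope changes is exactly what the burn-up chart shows visually: the target line rises, and the intercept (the ship date) slides.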
Here is an example of a PMF that addresses five key components of performance measurement, so that you focus on outcomes and results via different performance indicators. For each engagement, the priority level may change depending on the results you want: sometimes process is the critical area for performance measurement, sometimes it is price. Formulate performance measurement metrics that assess vendor engagement and relationship health in addition to operational measures. The framework provides a collection of metrics to choose from across multiple dimensions of the testing engagement, and the metrics can be catered to a wide variety of projects. Kalyana, "Performance Measurement Framework for Outsourced Testing Projects" (April 3, 2009).
Companies are outsourcing test case execution, test script automation, and test case development to offshore companies, independent contractors, niche QA companies, and system integrators. Outsourcing approaches vary widely: some companies outsource their manual testing needs, while others outsource specific testing tasks. Approaches may vary, but having clearly defined service quality expectations is critical. Quality matters, so let's look at some ways to determine service quality: timeliness of service; accessibility and convenience of the service; staff availability; convenience of hours of operation; location; accuracy; courteousness of service delivery; adequacy of information disseminated; condition and safety of facilities used (this doesn't seem important, but the negative press both Apple and, more recently, Wal-Mart received because of outsourced factory conditions, such as factory fires, is a reminder to keep it in mind); and customer satisfaction with a particular characteristic of the service delivery or overall. These are all potential areas for measurement.