Develop & Improve Your Software Testing Strategy
Conveying and Inspiring Confidence
Info-Tech Research Group
"Software implementation is a cozy bonfire, warm, bright, a bustle of comforting concrete activity. But beyond the flames is an immense zone of darkness. Testing is the exploration of this darkness." - extracted from the 1992 Software Maintenance Technology Reference Guide
Whether you develop it, configure it, or integrate it, you simply can't afford to ignore it
Info-Tech Research Group
This solution set is for you if…
This solution set will answer the following…
Software Quality Assurance: IT exists to put functioning software into the hands of businesses, their staff, and their customers. SQA exists to make sure those needs are met in a useful, meaningful, and acceptable manner. All require testing to ensure that requirements are met and the application functions according to expectations.
Executive Summary
Your success is dependent on how well you test your applications. Whether you have developed them or simply integrated or configured them, this solution will help you understand how and why certain testing should be completed.
Software Quality Assurance (SQA) is an umbrella term that refers to the processes involved in ensuring quality in your deliverables. It encompasses both Software Quality Control (SQC) and Software Testing.
Business leaders will all eagerly agree that quality is important, but understanding how to get there, what processes to implement, and what people to engage is often a difficult decision. This solution will help you sift through all the information and turn it into usable knowledge to increase your success.
As any development organization matures, priorities jockey for position. For startups, the elusive quality is not the major concern, delivery is; but as the organization matures, quality ultimately ends up at the top.
Attempting to turn a screw with a knife blade can work, but it works much better, faster, and more accurately with the proper tool. Having trained quality professionals will help the organization achieve the level of quality that your customers expect, and deserve.
Develop and improve your overall quality by learning and implementing proper process that fits your organizational needs. Whether you develop for the web or integrate off-the-shelf software, your customers deserve your best, and you deserve their trust. This solution will help guide you through creating and maintaining a process that works for you.
Info-Tech Research Group
Assess your readiness … Improve your success … Develop the means to achieve … Understand SQA, SQC, and Testing
Info-Tech Research Group
Develop & Improve Your Software Testing Strategy
Assess where you stand with software quality assurance
Success is determined by how well your customers feel toward you, your service, and your product
Assessing your place in the world
Assess: The General Situation | Your Environment
Improve: Your Testing Focus | Your Testing Strategy | Your Testing Effectiveness | Best Practices
Develop: Your Testers & Test Coverage | Your Ability to Measure Success
Understand: If you only test a little, this is what you need to know | How testing fits in the bigger SQA picture
Your approach to testing may be different depending on your development focus. Most development shops fall into one of three categories.
Info-Tech Research Group
Most businesses with development groups will be system builders or integrators and will mature at a predictable pace
Info-Tech Research Group
Inexperienced: 1-5 Dev, no Dev Mgr, no BA, no PM, no dev standards, projects run by dev, no real planning, requirements are extremely loose, no change control, deadlines are all ASAP. **Top Priority -> Delivery
Testing: no testers; ad-hoc developer testing only.
Mildly Experienced: 5-15 Dev, 1-2 Dev Mgr, no BA, no PM, unofficial dev standards, projects run by dev, no formalized planning, requirements are loose, no change control, deadlines are all ASAP.
Testing: 0-1 testers; very little testing, some developer testing, some ad-hoc exploratory-type testing.
Experienced: 15-30 Dev, 2-3 Dev Mgr, 1 Sr Dev Mgr, 0-1 BA, 0-1 PM, some formalized dev standards, most projects run by dev leaders, some early formalized planning, requirements mostly known, little change control, deadlines set on rough estimates and business/client need still largely ASAP.
Testing: 1 SQA/Test Leader, 5-10 testers; some developer testing, some standardized testing, mostly regression, smoke, exploratory, acceptance.
Very Experienced: 30-50 Dev, 5-8 Dev Mgr, 1-2 Sr Dev Mgr, 1-2 BA, 3-5 PM, formalized development standards being followed, some projects still run by dev leaders, most run by PM, formalized planning, requirements are set, some change control is attempted, deadlines set using an established estimation process with business/client needs factoring heavily.
Testing: 1-2 SQA/Test Leaders, 8-20 testers; some developer testing, standardized SQA-run testing, separate test environment.
Seasoned Veteran: 50+ Dev, 8+ Dev Mgr, 3+ Sr Dev Mgr, 2-3+ BA, 8+ PM, formalized dev standards are followed, all projects controlled by PM, standardized and formalized planning, requirements are known, change control is in place, deadlines are based on well-thought-out estimates and business/client needs. **Top Priority -> Quality
Testing: 1-2 SQA/Test Leaders, 20+ testers; developer unit testing, automated build testing, standardized SQA-run testing, automated regression and other automated testing, multiple separate test environments.
Individual business focus may be different, the approach to testing may be different, but the goal of quality will be constant.
As development organizations mature, top priorities change from delivery to quality, with additional priorities along the way
Info-Tech Research Group
[Maturity diagram: Startup (inexperienced) to Established (experienced), plotting priorities such as Application Delivery, Chaotic vs. Controlled Change, Technology Used, React Quickly, Standards Adoption, Process Adherence, Planning & Design, Project Management, and Application Quality.]
As an organization matures, priorities and their importance to the overall organization become closer and more tightly interwoven. As a result, breakdowns anywhere in a more mature organization will have a far more critical overall impact on the organization.
Low-risk projects can take it slow implementing SQA process, but if your projects are high risk, good SQA is critical for success
For best results a quality process should, at minimum, include the following:
- Requirements management, with a goal of clear, complete, testable requirement specifications
- Design and code inspections
- Project post-mortems/retrospectives
Info-Tech Research Group
SQA processes should always be balanced with productivity so as to keep the proverbial red tape from getting out of hand.
In many cases, convincing management of the need for testing resources is more difficult than you think
Info-Tech Research Group
Read the full "Case Study: The business doesn't care" in Appendix I of this document.
Scenario: A financial-industry business develops software for the web and for internal applications. They have no testing resources, and all active testing is completed by developers. The business has not experienced any major pain from their lack of testing and as such will not justify the expense.
What Is Wrong:
Lessons:
When times are tight, cutting testers and letting developers do the testing is common, but it is a short-sighted decision
Info-Tech Research Group
Develop and Test: opposites, but connected by their similar being; distinct by orientation, unique by skills, and incapable of performing the other!
If you take this quick-fix route, the results will be:
You know you're cutting corners, you know it is wrong, but you're a small shop with limited resources, so what do you do?
Info-Tech Research Group
Read the full "Case Study: Even small shops need a testing strategy" in Appendix II of this document.
Scenario: A small IT shop that primarily deals with 3rd-party application integration. Internal resources are limited and a lot of trust is placed in the vendor. No testing resources exist, and there is often very limited time available. The manager, who is also the CIO, cuts corners regularly and knows it.
What Is Wrong:
Lessons:
When you develop Web apps, you need to understand that there are differences in testing requirements
Info-Tech Research Group
"With Internet apps, testing could be the last chance to ensure the safety of the data and the organization" - IT Professional with 25+ years of experience
Your Web app is potentially visible to the world, so your effort to ensure its quality should not be taken lightly
Info-Tech Research Group
When building apps for the Web, you must consider and test for the following:
Testing can save a lot of people a lot of time by ensuring that every app has at least considered the basics
Info-Tech Research Group
Developers should always follow best practices, but it is the job of testing to make sure they have. When testing your web applications, make sure your developers have considered the basic requirements for web application development.
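As one hedged illustration of turning such basics into automated checks, the sketch below verifies two common web-app expectations: pages return the right HTTP status, and error pages do not leak internal details. The base URL and paths are hypothetical placeholders, and the `requests` package is assumed to be available.

```python
# Minimal sketch of automated checks for two common web-app basics:
# pages respond successfully, and error pages don't leak internals.
# BASE_URL and the paths are hypothetical placeholders.
import requests

BASE_URL = "https://example.test"  # hypothetical application under test

def test_home_page_responds():
    resp = requests.get(f"{BASE_URL}/", timeout=10)
    assert resp.status_code == 200

def test_error_page_does_not_leak_internals():
    resp = requests.get(f"{BASE_URL}/this-page-does-not-exist", timeout=10)
    assert resp.status_code == 404
    # A friendly error page should not expose implementation details.
    assert "Traceback" not in resp.text
    assert "ORA-" not in resp.text  # e.g. raw database error codes
```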
If development is Agile, there is still a place for testers, but your focus and testing approach need to change
Info-Tech Research Group
Agile development presents a different set of challenges for Software Testing. Agile requires:
- communication between team members, which is critical for success
- innovative thinking and possibly non-standard resources for testing
- close communication, which results in better clarity and understanding of the system
Testing best suited for use in Agile Development:
"With short timelines, it is critical that the tester has sufficient knowledge of the system, the objectives, and the requirements" - SQA-trained professional with over 16 years of experience
Rapid test-on-the-fly may be the norm in Agile Development, but resources focused on testing are still critical to your success
Info-Tech Research Group
Critical test considerations for Agile Development:
Your testers should also be the voice on the team to make sure that:
- The objectives of the project are clear to the entire team
- At every stage the software is tested to see if it meets the requirements
- Every requirement is translated to a test case
- While processes and documentation are not stressed, sufficient steps are taken to ensure that the software is delivered as per user expectations
- Each (sprint) delivery is tested thoroughly before it is released
If you're testing an enterprise app, it is of little value how much testing has been done if it cannot be integrated
Info-Tech Research Group
Make sure your testers are aware of these Enterprise Testing Fundamentals:
Testing is a critical requirement for making good enterprise deployment decisions. Recommendations include approaching your enterprise testing with a phased approach to your component, feature, and system testing.
The test process for enterprise solutions can best be viewed as a reverse engineering of the product development. Typically this can be done in a 3-step process involving module verification, feature verification, and usage.
Product testing can only begin once one or more of the interrelated functions have been unit tested sufficiently to allow a normal progression through each of the major areas of functionality.
Info-Tech Insight: Start testing early to provide your supplier, vendor, or internal development group sufficient time to respond with fixes, and have your test team verify the fixes while still preserving your timeline.
Enterprise App testing can include some assumptions of previous testing, but you still need to load it and smoke it
Info-Tech Research Group
Testers should be aware that:
Info-Tech Insight: Since system testing will help determine the production readiness of the enterprise system, customer acceptance testing techniques are best used during enterprise testing.
Quality test data is a critical piece of ensuring the ultimate quality is there; without it, it's a guessing game
Info-Tech Research Group
Read the full "Case Study: Test data, the critical component" in Appendix III of this document.
Scenario: A large organization with a large development and technology footprint. Well over 100 IT employees spread across multiple disciplines. SQA has 10 dedicated resources, including a formally trained SQA Manager. The organization is responsible for custom development for internal purposes, web development, and enterprise integration.
What Is Wrong:
Lessons:
Info-Tech Research Group
Develop & Improve Your Software Testing Strategy
Improve your overall approach to software quality testing
It is impossible to "test" quality into a product
Improving your testing effectiveness
Assess: The General Situation | Your Environment
Improve: Your Testing Focus | Your Testing Strategy | Your Testing Effectiveness | Best Practices
Develop: Your Testers & Test Coverage | Your Ability to Measure Success
Understand: If you only test a little, this is what you need to know | How testing fits in the bigger SQA picture
Incomplete requirements, poor specifications, and unclear objectives are cited clearly as the biggest project problems
Info-Tech Research Group
In a recent unofficial, independent poll of Software Quality Assurance professionals (LinkedIn), when asked "What do you think the top problems with development projects are?", the results showed that poor requirements were the leading cause of failure in projects.
[Poll results (approx. N=19): reported shares of 28%, 18%, 13%, 8%, and 7% across Poor Requirements, Feature Creep, Miscommunication, Inadequate Testing, and Unrealistic Schedule, with Poor Requirements leading.]
Inadequate testing was NOT the biggest concern; poor requirements were.
Make sure your project team fully understands the requirements, and make sure those requirements are testable. If your team can't test the requirement, you have a problem!
Having testers involved early in projects can help to prevent the number one cause of failure in projects.
*** Visionary requirements cannot be tested, but detailed, well-thought-out requirements can!
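To make "testable" concrete: a requirement such as "the system should be fast" cannot be verified, while "search returns results in under 2 seconds" maps directly to an automated check. The sketch below is illustrative only; the function name and the 2-second threshold are assumptions, not part of the original material.

```python
# "The system should be fast" cannot be tested as written.
# "Search returns results in under 2 seconds" is testable: it maps
# directly to an automated check. search_catalogue is a stand-in.
import time

def search_catalogue(term):
    ...  # stand-in for the real search call under test

def test_search_meets_response_time_requirement():
    start = time.perf_counter()
    search_catalogue("blue widgets")
    elapsed = time.perf_counter() - start
    assert elapsed < 2.0, f"Search took {elapsed:.2f}s, requirement is < 2s"
```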
Improve your testing process by first understanding the 5 most common problems and how to address them
Info-Tech Research Group
1) Solid requirements - clear, complete, detailed, attainable, testable requirements that are agreed to by all stakeholders.
2) Realistic schedules - allow adequate time for planning, design, testing, bug fixing, re-testing, changes, and documentation; project resources should be able to complete the project without burning out.
3) Adequate testing - start testing early, re-test after fixes or changes, and plan adequate time for testing and bug fixing. 'Early' testing could include static code analysis, test-first development, unit testing by developers, automated post-build testing, etc.
4) Stick to initial requirements where feasible - be prepared to defend against excessive changes and additions once development has begun, and be prepared to explain the consequences. If changes are necessary, they should be adequately reflected in related schedule changes. If possible, work closely with customers/end-users to manage expectations.
5) Communication - require walkthroughs and inspections when appropriate; make extensive use of group communication tools (groupware, wikis, bug-tracking and change management tools, intranet capabilities, etc.); ensure that information and documentation are available and up to date, preferably electronic, not paper; promote teamwork and cooperation; use prototypes and/or continuous communication with end-users where possible to clarify expectations.
To ensure you maintain a good quality process, make sure you follow these basic steps to keep yourself on the upward slope
Info-Tech Research Group
Phases: Strategy, Planning & Requirements | Development | Design | Deployment
Do not limit your testing resources to just testers; include resources from other areas to improve your success
Info-Tech Research Group
"Just like development, to be effective SQA needs to have good processes. For good testing results those processes need to include other people as necessary." - SQA Analyst, IT Professional Services (Financial Industry)
Organizations that included resources outside of testing throughout their process proved to be more successful in every case.
[Chart (N=72, Source: Info-Tech Research Group): clients' use of additional resources - Business Analysts, Business Users, Developers, and other IT - with success improvements of +38%, +33%, +24%, and +83% for those that did versus those that did not.]
Use Info-Tech's testing checklist as a guide to ensure you and your testing team have sufficient test coverage
Info-Tech Research Group
The checklist is a great way to remind yourself, your test team, and the project team what has been done, and what has yet to be done. If you don't already have your own checklist, try this one out.
All good projects begin with a plan. For testers, that means drafting a test strategy and a comprehensive test plan
Info-Tech Research Group
Test Plan - What you need to know!
Project test plans are documents that describe the objectives, scope, and approach for the project testing effort. The process of creating the test plan is a useful exercise that forces an evaluation of the entire validation procedure for the given project. The completed test plan document is also extremely useful for others to understand the validation process, specifically why and how certain validation will occur.
Of respondents who had the most success with their projects, 95% agreed they followed a testing strategy.
Plan sections: Initialization; Overview & Setup; Testing Details.
Improve & streamline your test planning by utilizing Info-Tech's Test Plan Template
Info-Tech Research Group
Use Info-Tech's "Test Plan Template" to quickly lay out and streamline your project test plan. Test strategies are unique, but the categories and type of information you need to record are the same each time. This template will guide you through building a complete and comprehensive test plan and strategy document.
The plan you produce should provide sufficient detail to permit identification of the major testing tasks and estimation of the time required to do each one.
With this template you will be able to describe the overall approach to testing. For each major group of features or feature combinations, you will be able to specify the approach that will ensure these feature groups are adequately tested, and specify the major activities, techniques, and tools used to test the designated groups of features.
Descriptive reporting of any issues found during testing is critical to ensure you continue on the upward slope of quality
Info-Tech Research Group
When reporting an incident, include:
When reporting a resolution, include:
When re-testing a resolution, include:
When a bug is found in the software it needs to be communicated back to the development team so that they can fix it. After development has resolved the issue, it needs to be re-tested, and determinations made against the requirements as to whether any regression testing should be executed to check whether the fix has created problems elsewhere in the software. If a problem/issue tracking system is in place, it should already handle these processes. There is a very broad and robust commercial software base of issue-tracking systems available if you do not currently have one.
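As a hedged sketch of the kind of information a well-formed incident report carries (reproducible steps, expected versus actual results, environment, severity), the structure below uses illustrative field names rather than the schema of any particular issue-tracking product.

```python
# A minimal sketch of an incident/defect record. Field names are
# illustrative, not the schema of any particular issue tracker.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DefectReport:
    identifier: str                  # unique ID, e.g. "PROJ-1432"
    summary: str                     # short, unambiguous title
    steps_to_reproduce: List[str]    # exact steps, including test data used
    expected_result: str             # behaviour per the requirement
    actual_result: str               # what was observed instead
    environment: str                 # build number, OS, browser, test env
    severity: str                    # e.g. "blocker", "major", "minor"
    attachments: List[str] = field(default_factory=list)  # logs, screenshots

report = DefectReport(
    identifier="PROJ-1432",
    summary="Order total ignores volume discount over 100 units",
    steps_to_reproduce=["Log in as test buyer", "Add 150 units of SKU-77", "View cart"],
    expected_result="10% volume discount applied to line total",
    actual_result="Full list price charged",
    environment="Build 2.4.1, staging environment, Chrome",
    severity="major",
)
```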
One of the most difficult questions is knowing when enough is enough; when testing should stop
Info-Tech Research Group
How do you know when enough is enough; when do you stop testing?
What do you do when software released to testing is just too buggy to continue?
- The best thing to do in this case is to document the most critical and blocking bugs.
- This type of problem can have a significant effect on project timelines and schedules.
- It can point to deeper-rooted problems within the development organization: insufficient unit testing, insufficient integration testing, poor design, lack of following requirements, or improper adherence to development build and release processes.
- SQA must ensure that development managers are made aware of these situations and are provided with adequate documentation.
It sounds like an easy question to answer; however, it can be one of the most difficult for SQA resources to determine. It is imperative that SQA provide a comfort rating to management to aid in the decision to release the software to customers.
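One hedged way to anchor that comfort rating is to agree on explicit exit criteria up front and check them mechanically; the thresholds in the sketch below are illustrative assumptions, not recommended values.

```python
# Sketch of explicit exit criteria backing the "comfort rating".
# The thresholds are illustrative; set them from your own strategy.
def ready_to_stop_testing(open_blockers: int,
                          pass_rate: float,
                          requirements_covered: float) -> bool:
    """Return True if the agreed exit criteria are met."""
    return (
        open_blockers == 0               # no unresolved blocking defects
        and pass_rate >= 0.95            # e.g. 95% of planned test cases pass
        and requirements_covered >= 1.0  # every requirement has been exercised
    )

print(ready_to_stop_testing(open_blockers=0, pass_rate=0.97, requirements_covered=1.0))  # True
print(ready_to_stop_testing(open_blockers=2, pass_rate=0.99, requirements_covered=1.0))  # False
```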
Despite all your good efforts, sometimes the clock just isn't on your side. If that happens, where does that leave testing?
Info-Tech Research Group
You have done everything right. You planned, you strategized, you lined everything up … but the deadline is looming.
Evaluating risk can be challenging, but consider this:
What if the project just isn't big enough to justify extensive testing?
Improve your effectiveness by following this simple list of best practices for software quality testers
Info-Tech Research Group
1) Learn to analyze your test results thoroughly
2) Learn to maximize your test coverage
3) Break your application into smaller functional modules
4) Write test cases for intended functionality
5) Start testing the application with the intent of finding bugs
6) Write your test cases in the requirement analysis and design phase
7) Make your test cases available to developers prior to coding
8) If possible, identify and group your test cases for later regression testing
9) Applications requiring critical response time should be thoroughly tested for performance
10) Developers should not test their own code
11) Go beyond requirement testing
12) While doing regression testing, use bug data
13) Note down the new terms and concepts you learn while testing
14) Note down all code changes done for testing
15) Keep developers away from the test environment
16) It's a good practice to involve testers right from the software requirement and design phase
17) Testing teams should share best testing practices
18) Increase your conversations with the developers
19) Don't run out of time to do high-priority testing tasks
20) Write clear, descriptive, unambiguous bug reports
Read the full "Testing Best Practices" in Appendix IV of this document.
Don't forget: testing is a creative and challenging task. How you handle this challenge depends on your skill and experience.
Info-Tech Research Group
Develop & Improve Your Software Testing Strategy
Develop your testers, your process, and the means to measure
Involve SQA early, but not too early … wait until the ambiguity starts to settle
Developing your testing strategies
Assess: The General Situation | Your Environment
Improve: Your Testing Focus | Your Testing Strategy | Your Testing Effectiveness | Best Practices
Develop: Your Testers & Test Coverage | Your Ability to Measure Success
Understand: If you only test a little, this is what you need to know | How testing fits in the bigger SQA picture
Like gears in a well-oiled machine, having the right people on the job is your best guarantee for a quality product
Info-Tech Research Group
Software Tester - focus on testing - reaction oriented
Software QA - focus on overall quality - prevention oriented
QA Manager - focus on quality - process & team
65% of survey respondents say formal SQA training greatly improves their success.
Having the right mix of experience and training in the right roles makes a difference to your chance of success
Info-Tech Research Group
Our survey shows (N=72, Source: Info-Tech Research Group):
- Clients that showed greater success had 36% more experienced staff on their team
- Organizations that showed greater success had 45% more staff with development backgrounds
- Clients that showed greater success had 41% more staff with formal QA training
In a related unofficial poll of SQA professionals asking whether there is a good ratio of developers to testers, the response was an overwhelming and unanimous "No." The unanimous reasoning was simply that there are too many variables to give a reasonably accurate answer. However, the poll did show that while ratios ranged from 0:1 up to 1:30, the most common ratio was 1:3 (testers to developers).
"Having the right resources spread between Dev, SQA, and the BAs is like 3 legs of a stool … you need them all" - IT Manager, Financial Industry
Trained SQA testers should know what type of test is best, but using these should be your minimum coverage
Info-Tech Research Group
Unit Testing - the most micro scale of testing; tests particular functions or code modules. Typically done by the developer and not by testers, as it requires detailed knowledge of the internal program design and code. Not always easily done unless the application has a well-designed architecture with tight code; may require developing test driver modules or test harnesses.
Functional Testing - black-box testing geared to the functional requirements of an application; this type of testing should be done by testers. This doesn't mean that the programmers shouldn't check that their code works before releasing it (which of course applies to any stage of testing).
Integration Testing - testing of combined parts of an application to determine if they function together correctly. The 'parts' can be code modules, individual applications, client and server applications on a network, etc. This type of testing is especially relevant to client/server and distributed systems.
Exploratory & Ad-Hoc Testing - often taken to mean a creative, informal software test that is not based on formal test plans or test cases; testers may be learning the software as they test it.
Regression Testing - re-testing after fixes or modifications of the software or its environment. It can be difficult to determine how much re-testing is needed, especially near the end of the development cycle. Automated testing approaches can be especially useful for this type of testing.
Usability Testing & User Acceptance - testing for user-friendliness. This is the most subjective and will depend on the end-user or customer. User interviews, surveys, video recording of user sessions, and other techniques can be used. Developers and testers are usually not appropriate as usability testers.
"Automated testing is not the silver bullet that everyone believes it to be, but yet another tool in the QA toolbox. An automation framework is great when it is used to enhance testing." - SQA-trained professional with over 16 years of experience
Read the complete list of testing types in Appendix V.
What to, when to, or even whether to automate is a crucial decision; make sure you choose the right tool for the job
Info-Tech Research Group
Automation comes in many flavors. Some of the popular test tools/suites available today include:
- HP QuickTest Professional - Functional & Regression
- IBM Rational Functional Tester - Regression, Record & Playback
- Selenium - Testing for Web Apps
- SilkTest - Functional Enterprise Testing
- TestComplete - Full Suite: Functional, Regression, Unit, GUI…
- TestPartner - GUI
- WatiR/WatiN/WatiJ - Web, Browser-based testing
Automated testing tools can only find what they are programmed to find. Most will make this determination from a comparison to a baseline. If you don't have a good baseline, your tool will be of no use to you.
"Invest in your people first, then the tools. The tools are no magic bullet if the people don't know the process." - IT Manager, Financial Industry
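As a small illustration of what tool-driven automation looks like in practice, the sketch below uses Selenium (one of the tools listed above) to drive a browser through a login page and compare the outcome to an expected baseline. The URL, element names, and credentials are hypothetical placeholders; the `selenium` package and a matching browser driver are assumed to be installed.

```python
# Minimal Selenium sketch: drive a browser, exercise a page, and compare
# the outcome to an expected baseline. URL and element names are
# placeholders for whatever your application actually exposes.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://example.test/login")            # hypothetical URL
    driver.find_element(By.NAME, "username").send_keys("demo_user")
    driver.find_element(By.NAME, "password").send_keys("not-a-real-password")
    driver.find_element(By.ID, "login-button").click()
    assert "Dashboard" in driver.title                   # expected baseline
finally:
    driver.quit()
```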
Get in the habit of writing test cases to validate your coverage. Follow these steps to develop a repeatable test case format
Info-Tech Research Group
What is a test case? A test case describes an input, action, or event and an expected response, to determine whether a feature of an application is working correctly and as expected.
Use Info-Tech's "Defect Reporting Template" to quickly lay out and streamline your project test cases. Test cases are unique, but the categories and type of information we need to record are the same each time. This template will guide you through building a complete and comprehensive test case.
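One hedged way to keep that format repeatable is to capture each case in a simple structure with an ID, preconditions, steps, expected result, and a traceability link back to the requirement. The field names and values below are illustrative, not a prescribed schema.

```python
# One hedged way to keep test cases in a consistent, repeatable format:
# a simple structure capturing ID, preconditions, steps and expected result.
TEST_CASE = {
    "id": "TC-INV-012",
    "title": "Reject order quantity of zero",
    "preconditions": ["User is logged in", "Product SKU-77 is in stock"],
    "steps": [
        "Open the order form",
        "Enter quantity 0 for SKU-77",
        "Submit the form",
    ],
    "expected_result": "Validation error 'Quantity must be at least 1' is shown "
                       "and no order is created",
    "requirement": "REQ-ORD-4.2",   # traceability back to the requirement
}
```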
Develop a strategy for projects by accounting for the likelihood and impact of application failure
Info-Tech Research Group
Use Info-Tech's "Software Testing Strategy & Risk Assessment" tool to determine risk and testing strategies by looking at the influencing factors.
Testing strategies should be unique to each development project. The strategy for each project will vary depending on some of the important variables measured in Info-Tech's Strategy Assessment tool. A strategy for each system in the application portfolio makes up the organization's overall strategy for testing, leading to investments in resources that truly reflect testing needs.
Assess the strengths & limitations of your development efforts by evaluating & measuring the quality of your product
Info-Tech Research Group
Assessing Product Quality
"Many development shops only have a vague sense of their overall quality; usually this comes in the form of gut feel." - IT Professional with 25+ years of experience
Evaluate the benefits of your system testing group by measuring and evaluating their costs and application coverage
Info-Tech Research Group
Evaluate System Testing
Cost of Quality (Effort/Activity)
Cost of Defects
An analysis of the costs and benefits of testing can encourage better decision making and ensure that resources are allocated effectively to support the maximum level of quality for a project at the lowest cost.
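To make the cost/benefit comparison concrete, here is a hedged back-of-the-envelope sketch; every figure is a made-up placeholder to be replaced with your own effort, rate, and defect-cost numbers.

```python
# Back-of-the-envelope cost comparison. Every figure below is a placeholder;
# substitute your own hours, rates and defect-cost estimates.
cost_of_quality = (
    40 * 65      # test planning & design: 40 hours at $65/hour
    + 120 * 65   # test execution and reporting
    + 20 * 80    # reviews/inspections with senior staff
)
cost_of_defects_avoided = (
    8 * 4 * 90   # 8 production defects x 4 hours each x $90/hour to fix
    + 15000      # estimated business impact of one outage prevented
)
print(f"Cost of quality effort:  ${cost_of_quality:,}")
print(f"Estimated cost avoided:  ${cost_of_defects_avoided:,}")
print("Testing pays for itself" if cost_of_defects_avoided > cost_of_quality
      else "Re-examine the scope of testing")
```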
Info-Tech Research Group
Develop and Improve Your Software Testing Strategy
If you only do a little, you need to understand the basics
Understanding is the first step toward useful knowledge
Understand how it all fits together
Assess: The General Situation | Your Environment
Improve: Your Testing Focus | Your Testing Strategy | Your Testing Effectiveness | Best Practices
Develop: Your Testers & Test Coverage | Your Ability to Measure Success
Understand: If you only test a little, this is what you need to know | How testing fits in the bigger SQA picture
Unit Testing is the 1st level of testing and the most important! Detecting and fixing bugs early helps reduce costly fixes later
Info-Tech Research Group
Time pressures to get the job done may result in developers cutting corners in unit testing. It helps to write scripts that automate part of unit testing; automating where necessary will help ensure that the necessary tests were done.
An effective unit testing process can increase software reliability and the credibility of the developer. Many new developers take unit testing tasks lightly and realize their importance too late in the project.
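As a minimal illustration of the unit level, the sketch below tests a single function in isolation, pytest-style; the function itself is an illustrative stand-in for real application code.

```python
# Minimal unit test sketch (pytest style). calculate_discount is an
# illustrative stand-in for a real function under test.
import pytest

def calculate_discount(quantity: int) -> float:
    """Return the discount rate for an order quantity."""
    if quantity < 0:
        raise ValueError("quantity cannot be negative")
    return 0.10 if quantity >= 100 else 0.0

def test_no_discount_below_threshold():
    assert calculate_discount(99) == 0.0

def test_discount_at_threshold():
    assert calculate_discount(100) == 0.10

def test_negative_quantity_rejected():
    with pytest.raises(ValueError):
        calculate_discount(-1)
```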
The size of the hamster powering the wheel is unimportant when functional testing is performed - requirements only
Info-Tech Research Group
The objective of functional testing is to measure the quality of the business components of the system.
Functional testing can be an overwhelming task for teams with little experience; to ensure success, the scope of the testing effort must be well defined.
94% of survey respondents ranked functional testing as their most important testing.
If your application is going to be used by people, then you need to do usability testing … it's just that simple!
Info-Tech Research Group
Usability testing is testing for user-friendliness, and it is the most subjective of all tests performed. Developers and designers, while talented, aren't like "normal" people; designing systems that make sense to developers will often lead to a site that is not usable by the average person.
While particularly necessary for web development or client application development, usability testing should also be considered for corporate or business application development.
Any software change can cause existing functionality to fail; it is common for defect fixes to introduce new problems
Info-Tech Research Group
Systems tend to become more fragile with each change, requiring additional testing of the original system in addition to the changes. With each change, regression testing can become more costly.
Regression testing is particularly important in software methodologies where change is embraced and occurs often, e.g. Agile.
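Best practice #8 earlier suggested grouping test cases for regression. One hedged way to do that in an automated suite is with test markers; the marker name and the guarded behaviour below are our own illustrative choices, and custom markers would normally be registered in the project's pytest configuration.

```python
# Hedged sketch: tag the tests that guard existing behaviour so the
# regression group can be run on its own after every change.
# "regression" is our own marker name, not a pytest built-in; register it
# under the `markers` option in pytest.ini to silence the unknown-marker warning.
import pytest

def round_currency(amount: float) -> float:
    # Stand-in for application code that once had a rounding defect.
    return float(f"{amount + 1e-9:.2f}")

@pytest.mark.regression
def test_fixed_rounding_bug_stays_fixed():
    # Guards an earlier defect fix so it is not silently reintroduced.
    assert round_currency(10.005) == 10.01

# Run only the regression group after each change, e.g.:
#   pytest -m regression
```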
While unit testing focuses on testing the individual pieces, integration testing tests all the pieces as they come together
Info-Tech Research Group
Even if a component is successfully unit tested, it is of little value if the component cannot be successfully integrated with the rest of the application. Integration test cases should focus on scenarios where one component is being called from another. It is very common for a lot of bugs to be discovered during integration testing.
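A minimal sketch of what an integration-level check looks like compared to a unit test: it exercises two components together rather than either in isolation. Both classes are illustrative stand-ins, not part of any real system.

```python
# Integration sketch: exercise two components together (an order service
# calling an inventory store) rather than each in isolation.
class InventoryStore:
    def __init__(self):
        self._stock = {"SKU-77": 5}

    def reserve(self, sku: str, qty: int) -> bool:
        if self._stock.get(sku, 0) >= qty:
            self._stock[sku] -= qty
            return True
        return False

class OrderService:
    def __init__(self, inventory: InventoryStore):
        self._inventory = inventory

    def place_order(self, sku: str, qty: int) -> str:
        return "CONFIRMED" if self._inventory.reserve(sku, qty) else "REJECTED"

def test_order_service_and_inventory_integrate():
    service = OrderService(InventoryStore())
    assert service.place_order("SKU-77", 3) == "CONFIRMED"
    assert service.place_order("SKU-77", 3) == "REJECTED"   # only 2 units left
```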
Automated testing tools make sense when the efficiency gained from use is greater than the tool's cost to purchase and maintain
Info-Tech Research Group
In some environments (e.g. web development) where various hardware-related differences exist, or where multiple browsers must be tested, automation can be an amazing time saver. Where manual testing may require several days, automated testing may require only hours, which translates directly into cost and time savings.
Automated tests can be shared between your test group and developers, and can be triggered to run automatically every time new or changed source code is checked in. If a test fails, the developer can be notified automatically, avoiding involving your testing resources until the build is successful and ready for testers to take control.
Automated testing can be affordable, and is recommended in order to save both time and money; however, you should be prepared to spend time setting up and maintaining automated test scripts.
If your testing team is not already expert at testing, then automation is a bad idea and a waste of time
Info-Tech Research Group
Certain testing can never be automated, as it requires a person. Exploratory testing and usability testing in particular require more thought and randomness and, in the case of usability, require the subjectivity of a real person.
It is highly recommended that, when just learning to automate, you bring in (or hire) a technical test engineer with specific and relevant experience with the automation software. If you don't, you run the risk of reinventing the proverbial wheel, which can lead to an extremely expensive venture.
Info-Tech Research Group
Software Quality Assurance is not all about the testing; in fact SQA encompasses all aspects of the Software Development Life Cycle (SDLC), from planning through support.
An organization that focuses on Quality Management looks at all aspects from beginning to end with a goal of total satisfaction.
Software testers typically get involved during the design and development phases … but you really shouldn't wait, or stop there.
"Getting them engaged early creates more successful projects because they have the context to plan and prepare." - Manager of QA, Government (Health Sector)
"If you relegate QA responsibilities to just testing the end product, you overlook an opportunity to integrate quality into the entire software development life cycle." - SQA-trained professional with over 16 years of experience
"To truly be effective at building quality, you can't just look at testing alone. Making sure the requirements are even testable is a crucial step towards quality." - SQA-trained professional with over 16 years of experience
Info-Tech Research Group
Often you will hear the terms SQA, SQC (QA/QC), or even testing used as if they were 100% interchangeable. It is important to understand the distinction, by the classic definitions, to avoid confusion and add clarity to your resource roles and responsibilities.
Software Quality Assurance comprehends the whole; Software Quality Control encompasses testing; Testing is at the heart of Quality Control.
Using any of these terms is typically sufficient to get your point across to any of the development team, including management.
Develop & Improve Your Testing Strategy: Summary & Conclusions
Info-Tech Research Group
Assess your readiness
Improve your success
Develop the means to achieve
Understand SQA, SQC, and testing
This solution has been designed to A I D U in improving your current testing practices, developing new, practical, tried-and-true processes, and providing a level of understanding of software testing that will enable you to make informed and strategic decisions for your organization in your pursuit of quality!
Appendix
Info-Tech Research Group
Appendix I - Case Study: Testing is limited. The business doesn't care
Appendix II - Case Study: Even small shops need a testing strategy
Appendix III - Case Study: Test data, the critical component
Appendix IV - Testing Best Practices
Appendix V - Understand the various testing types
Appendix I
Case Study: Testing is limited. The business doesn't care
Info-Tech Research Group
The Situation: Finance industry. Applications developed include the website and internal applications. There are approximately 30 people in IT and 10 Web Developers (no testers), spread across an average of 10 applications and/or projects. Some project managers are involved in some of the projects and will sometimes produce a plan. Development staff typically draft a test plan (of sorts). There is no standard methodology followed; everything is very ad-hoc. Attempts are made to collaborate with the business when possible; project managers are too distant and operational managers are not equipped with the knowledge to provide feedback.
The Challenge: One of the biggest struggles facing the group is trying to get the business included and involved. The business has no understanding of development or testing and does not support what it doesn't know. IT has no idea what they are supposed to test, and because this has not caused a major problem yet, the business does not feel enough incentive to get involved, or to support the need for testing. The only time the business gets involved in projects is when they need to sign off for compliance reasons. Generally the business takes on the attitude of "we just trust you."
Comments: The business remains uninvolved because they trust the development group. There is not enough regression testing by the manager's estimation, and as a result there are post-production issues. Since the business doesn't see it as a concern, they may never change testing practices. "We have monitoring software for servers. Monitor but don't test. We put it out there and hope that it doesn't crash. It will go down and we're reactive about it. The IT budget is limited, so we can't invest in testing."
Recommendations: This business needs to take a step back and look at the risk associated with not doing testing. Trusting their developers is admirable, but it won't help the business. A serious look at the cost justifications for testing should be done. Compared against the cost of reacting to problems and the risk they are exposing themselves to, especially considering their industry, this should be enough to point them in the right direction. Many businesses fall into this pit: they trust their developers, and so testing becomes a cost they cannot justify. Before you write it off, as with any good business decision, do your due diligence; quality is worth it!
Go Back
Appendix II
Case Study: Even small shops need a testing strategy
Info-Tech Research Group
The Situation: Small shop profile. CIO, application manager, and QA manager at a manufacturing-industry IT shop of four. Development for their iSeries-based commercial ERP platform is handled by an external partner with whom they have had years of experience. They work with a VAR to specify and test ERP enhancements, and sometimes just don't have time to test.
The Challenge: Big projects are tested in conjunction with developers. The alternative: for minor projects, and when there just isn't time, they have the development partner test functionality.
Comments: "No, I think that for us, because we're a smaller shop, it comes down to time. We just don't (test). I'm gonna say from a manpower perspective, we definitely take some shortcuts right now. In some cases, I have my VAR, my key contact, he will test it. He'll show me what, we'll quickly go over it and I'll say 'Yeah, okay that looks good' and I just let it go. I don't test it; I don't necessarily test them myself just because I don't have the time to do it."
For vendor patch updates, the partner installs and tests patches. But they don't put all their faith in the hands of the development partner: before any patches are applied, a copy is made of the previous release. If something happens once the development partner applies the patches, they go back to the original.
Recommendations: The manager in this case recognizes the need to test; he uses the 3rd-party VAR, whom he trusts, to take on a lot of the testing responsibilities, and trusts that if something goes wrong they can back out of it. As with the first case study, this business needs to take a step back and look at the risk associated with not doing testing. Trusting the VAR in this case is risky, and it won't help the business; the VAR does not have the same stake if something goes wrong. A serious look at the cost of testing should be done. All may be well today, but what about tomorrow?
As a small shop of only four, it is easy to understand, and easy to justify, not having dedicated testing resources. However, improved quality can be achieved by having dedicated resources that know how to plan, how to strategize, and how to test the integration of the systems. A dedicated resource can free up the development resources to work on more relevant tasks. The development resources in small shops can be an amazing asset, but they can also be very quick to move on. Keep them productive and happy doing things they love to do: develop. Focus the developers' attention on the solutions, and bring in co-op students to handle the testing. It's your quality, don't leave it to someone else!
Go Back
Appendix III
Case Study: Test data, the critical component
Info-Tech Research Group
The Situation: A 100-person IT shop with 10 QA staff and lots of custom and commercial testing. There are software architects, developers, e-business analysts, testers, quality-analyst-type roles, project managers, and application desk-side support analysts.
The Challenge: For security/compliance reasons, an enterprise is limited in the real data it can test in applications, and incomplete test data can limit testing. The result is that testing does not actually meet the targets in the test script. People need to create custom data, and sometimes they just don't create enough.
Comments: "So I've seen one of my testers was testing maintenance for something, and there literally was not enough data to sort a column. And that project, that application, had made it through to production." Another scenario is when data is not unique enough: "We've had this one happen. The entire test, people in there had the same birth date, so you're trying to validate reports that generate different things based on date of birth, and you don't really know if it's working or not because you're always seeing the same result."
Recommendations: This organization has a well-established and well-formed environment. Their resource makeup is good, the processes they have in place are good, the testing strategy is good, and there is adequate planning and support for their testing efforts. The shortfall is with the infrastructure for the test environments; in this case, data. Time needs to be spent ensuring that the test team has sufficient data to properly test for conditions that emulate the real world. The recommendation would be to duplicate one of the production databases, move it to an isolated test environment, and take the time to change the data from real to test data that is obviously fake; for example, changing names, titles, and all other facets of the data. This is a tedious and time-consuming task, but once completed it will increase the effectiveness of the testing team. An alternative recommendation would be to create a mirror within the production system that could write (changed) data to a separate database for use in the test environment.
Go Back
Appendix IV
Testing Best Practices
Info-Tech Research Group
1) Learn to analyze your test results thoroughly. Do not ignore the test result. The final test result may be 'pass' or 'fail', but troubleshooting the root cause of 'fail' will lead you to the solution of the problem. Testers will be respected if they not only log the bugs but also provide solutions.
2) Learn to maximize the test coverage every time you test any application. Though 100 percent test coverage might not be possible, you can always try to reach it.
3) To ensure maximum test coverage, break your application into smaller functional modules. Write test cases on individual unit modules. Also, if possible, break these modules into smaller parts. E.g.: if you have divided your website application into modules and "accepting user information" is one of the modules, you can break this "user information" screen into smaller parts for writing test cases: parts like UI testing, security testing, and functional testing of the user information form. Apply all form field type and size tests, and negative and validation tests, on input fields and write all test cases for maximum coverage.
4) While writing test cases, write test cases for the intended functionality first, i.e. for valid conditions according to requirements. Then write test cases for invalid conditions. This will cover expected as well as unexpected behavior of the application.
5) Think positive. Start testing the application with the intent of finding bugs/errors. Don't think beforehand that there will not be any bugs in the application. If you test the application with the intention of finding bugs you will definitely succeed.
6) Write your test cases in the requirement analysis and design phase itself. This way you can ensure all the requirements are testable.
7) Make your test cases available to developers prior to coding. Don't keep your test cases with you waiting to get the final application release for testing, thinking that you can log more bugs. Let developers analyze your test cases thoroughly to develop a quality application. This will also save re-work time.
Go Back
Appendix IV - continued
Testing Best Practices
Info-Tech Research Group
8) If possible, identify and group your test cases for regression testing. This will ensure quick and effective manual regression testing.
9) Applications requiring critical response time should be thoroughly tested for performance. Performance testing is a critical part of many applications, and in manual testing it is mostly ignored by testers. Find ways to test your application for performance. If it is not possible to create test data manually, then write some basic scripts to create test data for performance testing, or ask the developers to write them for you.
10) Programmers should not test their own code. Basic unit testing of the developed application should be enough for developers to release the application to the testers. But testers should not force developers to release the product for testing; let them take their own time. Everyone from lead to manager will know when the module/update is released for testing and can estimate the testing time accordingly. This is a typical situation in an agile project environment.
11) Go beyond requirement testing. Test the application for what it is not supposed to do.
12) While doing regression testing, use previous bug information. This can be useful to predict the most probable bug-filled parts of the application.
13) Keep a text file open while testing an application and write down the new terms and concepts you learn. Use these notepad observations while preparing the final test release report. This good habit will help you to provide a complete, unambiguous test report and release details.
14) Many times testers or developers make changes in the code base of the application under test. This is a required step in development or testing environments to avoid execution of live transaction processing, as in banking projects. Record all such code changes done for testing purposes, and at the time of final release make sure you have removed all these changes from the final client-side deployment file resources.
Go Back
Appendix IV - continued (2)
Testing Best Practices
Info-Tech Research Group
15) Keep developers away from the test environment. This is a required step to detect any configuration changes missing from a release or deployment document. Sometimes developers make system or application configuration changes but forget to mention them in the deployment steps. If developers don't have access to the testing environment they will not make any of these changes accidentally, and the changes must be captured in the right place.
16) It's a good practice to involve testers right from the software requirement and design phase. This way testers can gain knowledge of the application's dependability, resulting in detailed test coverage. If you are not being asked to be part of this development cycle, then make a request to your lead or manager to involve your testing team in all decision-making processes or meetings.
17) Testing teams should share best testing practices and experience with other teams in their organization.
18) Increase your conversation with developers to learn more about the product. Whenever possible, use face-to-face communication for resolving disputes quickly and to avoid misunderstandings. But when you reach an understanding or resolve a dispute, make sure to communicate the same in writing. Do not leave anything strictly verbal.
19) Don't run out of time to do high-priority testing tasks. Prioritize your testing work from high to low priority and plan your work accordingly. Analyze all associated risks to prioritize your work.
20) Write clear, descriptive, unambiguous bug reports. Do not provide only the bug symptoms; also provide the effect of the bug and all possible solutions.
Go Back
Appendix V Understand the various testing types Info-Tech Research Group
Black box testing - not based on any knowledge of internal design or code. Tests are based on requirements and functionality.
White box testing - based on knowledge of the internal logic of an application's code. Tests are based on coverage of code statements, branches, paths, and conditions.
Unit Testing - the most micro scale of testing; tests particular functions or code modules. Typically done by the developer and not by testers, as it requires detailed knowledge of the internal program design and code. Not always easily done unless the application has a well-designed architecture with tight code; may require developing test driver modules or test harnesses (see the example after this list).
Incremental Integration Testing - continuous testing of an application as new functionality is added; requires that various aspects of an application's functionality be independent enough to work separately before all parts of the program are completed, or that test drivers be developed as needed; done by programmers or by testers.
Integration Testing - testing of combined parts of an application to determine if they function together correctly. The 'parts' can be code modules, individual applications, client and server applications on a network, etc. This type of testing is especially relevant to client/server and distributed systems.
Functional Testing - black-box testing geared to the functional requirements of an application; this type of testing should be done by testers. This doesn't mean that the programmers shouldn't check that their code works before releasing it (which of course applies to any stage of testing).
System Testing - black-box testing based on the overall requirements specifications; covers all combined parts of a system.
End-to-End Testing - similar to system testing; the macro end of the test scale; involves testing of a complete application environment in a situation that mimics real-world use, such as interacting with a database, using network communications, or interacting with other hardware, applications, or systems as appropriate.
Smoke Testing - typically an initial testing effort to determine if a new software version is performing well enough to accept it for a major testing effort. For example, if the new software is crashing systems every 5 minutes, bogging systems down to a crawl, or corrupting databases, the software may not be in a condition to warrant further testing.
Regression Testing - re-testing after fixes or modifications of the software or its environment. It can be difficult to determine how much re-testing is needed, especially near the end of the development cycle. Automated testing approaches can be especially useful for this type of testing.
Acceptance Testing - final testing based on specifications of the end-user or customer, or based on use by end-users/customers over some specified period of time.
Load Testing - testing an application under heavy loads, such as testing a web site under a range of loads to determine at what point the system's response time degrades or fails.
Go Back
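To make the unit-testing entry above concrete, here is a minimal sketch using Python's standard unittest module. The function under test and its 0-50% discount rule are invented for the example; the point is simply that unit tests exercise one function in isolation, while the last case is requirement-driven in spirit, checking what the code is not supposed to do (practice 11 in Appendix IV).

```python
import unittest

def apply_discount(total, rate):
    """Illustrative function under test: apply a percentage discount.
    The 0-50% business rule is an assumption made up for this example."""
    if not 0 <= rate <= 0.5:
        raise ValueError("discount rate must be between 0% and 50%")
    return round(total * (1 - rate), 2)

class ApplyDiscountUnitTests(unittest.TestCase):
    # Unit-level checks: exercise a single function in isolation.
    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.00, 0.10), 90.00)

    def test_zero_discount_leaves_total_unchanged(self):
        self.assertEqual(apply_discount(59.99, 0.0), 59.99)

    # Requirement-driven check of what the code must NOT do: discounts
    # above 50% are rejected rather than silently applied.
    def test_excessive_discount_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.00, 0.75)

if __name__ == "__main__":
    unittest.main()
```

Tests like these are also the natural seed for an automated regression suite, since they can be re-run unattended after every fix or change.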
Appendix V – continued Understand the various testing types Info-Tech Research Group
Stress Testing - often used interchangeably with load and performance testing. Also used to describe tests such as system functional testing under unusually heavy loads, heavy repetition of certain actions or inputs, input of large numerical values, large complex queries to a database system, etc.
Performance Testing - often used interchangeably with stress and load testing. Ideally, performance testing is defined in the requirements documentation or in QA or test plans (specified performance criteria from the customer). A minimal measurement sketch follows this list.
Usability Testing - testing for user-friendliness. This is the most subjective type, and will depend on the end-user or customer. User interviews, surveys, video recording of user sessions, and other techniques can be used. Developers and testers are usually not appropriate as usability testers.
Deployment Testing - testing of full, partial, or upgrade install/uninstall processes.
Recovery & Failover Testing - testing how well a system recovers from crashes, hardware failures, or other catastrophic problems.
Security Testing - testing how well the system protects against unauthorized internal or external access, willful damage, etc.; may require sophisticated testing techniques.
Compatibility Testing - testing how well software performs in a particular hardware/software/operating system/network environment.
Exploratory Testing - often taken to mean a creative, informal software test that is not based on formal test plans or test cases; testers may be learning the software as they test it.
Ad-Hoc Testing - similar to exploratory testing, but often taken to mean that the testers have significant understanding of the software before testing it.
User Acceptance Testing - determining whether software is satisfactory to an end-user or customer.
Comparison Testing - comparing software weaknesses and strengths to competing products.
Alpha/Beta Testing - testing of an application when development is nearing completion; minor design changes may still be made as a result of alpha testing. Beta testing occurs when development and testing are essentially complete and final bugs and problems need to be found before final release. Typically, alpha and beta testing is done by end-users or others, not by developers or testers.
Go Back
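As a rough sketch of the measurement that load, stress, and performance testing build on, and not a substitute for a dedicated load-testing tool, the following Python script issues concurrent requests against a single endpoint and reports median and 95th-percentile latency. The URL, thread count, and request volume are placeholder assumptions; point it only at a test environment, never production.

```python
import statistics
import time
import concurrent.futures
import urllib.request

# Placeholders: replace with your own test-environment endpoint and volumes.
TARGET_URL = "http://localhost:8080/health"
CONCURRENT_USERS = 20
REQUESTS_PER_USER = 50

def timed_request(url):
    """Issue one GET and return its latency in seconds (None on failure)."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            response.read()
        return time.perf_counter() - start
    except Exception:
        return None

def run_load_test():
    latencies, failures = [], 0
    with concurrent.futures.ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        futures = [pool.submit(timed_request, TARGET_URL)
                   for _ in range(CONCURRENT_USERS * REQUESTS_PER_USER)]
        for future in concurrent.futures.as_completed(futures):
            elapsed = future.result()
            if elapsed is None:
                failures += 1
            else:
                latencies.append(elapsed)
    if latencies:
        latencies.sort()
        print(f"requests: {len(latencies)} ok, {failures} failed")
        print(f"median latency: {statistics.median(latencies) * 1000:.1f} ms")
        print(f"95th percentile: {latencies[int(len(latencies) * 0.95) - 1] * 1000:.1f} ms")

if __name__ == "__main__":
    run_load_test()
```

Re-running the same script at increasing concurrency levels is the simplest way to see the point at which response time degrades or the system starts failing, which is exactly what the load and performance entries describe.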
Appendix VI Available Tools & Templates Info-Tech Research Group
Use Info-Tech's "Test Plan Template" to quickly lay out and streamline your project test plan (strategy).
Use Info-Tech's "Defect Reporting Template" to quickly record and document your project defects.
Use Info-Tech's "Software Testing Strategy & Risk Assessment" tool to determine risk and testing strategies by assessing the key influencing factors.
  • 1. Develop & Improve Your Software Testing Strategy Conveying and Inspiring Confidence Info-Tech Research Group “ Software implementation is a cozy bonfire, warm, bright, a bustle of comforting concrete activity. But beyond the flames is an immense zone of darkness. Testing is the exploration of this darkness.” - extracted from the 1992 Software Maintenance Technology Reference Guide
  • 2.
  • 3. Executive Summary Your success is dependant on how well you test your applications. Whether you have developed them or simply integrated or configured them, this solution will help you understand how and why certain testing should be completed Software Quality Assurance or SQA is an umbrella term that refers to the process involved in ensuring quality in your deliverables. It encompasses both Software Quality Control (SQC) and Software Testing. Business leaders will all eagerly agree that quality is important , but understanding how to get there, what processes to implement, what people to engage is often a difficult decision … this solution will help you to sift through all the information and turn it into useable knowledge to increase your success As any development organization matures, priorities jockey for position. For startups the elusive quality is not the major concern , delivery is, but as the organization matures quality ultimately ends up at the top Attempting to turn a screw with a knife blade can work, but it works much better, faster, and more accurately with the proper tool. Having trained quality professionals will help the organization achieve the level of quality that your customers expect … and deserve Develop and Improve your overall quality by learning and implementing proper process that fits your organizational needs. Whether you develop for the web, or integrate off the shelf software, your customers deserve your best, and you deserve their trust. This solution will help guide you through creating and maintaining a process that works for you Info-Tech Research Group A ssess your readiness … I mprove your success D evelop the means to achieve … U nderstand SQA, SQC, and Testing
  • 4. Info-Tech Research Group Develop & Improve Your Software Testing Strategy Assess where you stand with software quality assurance Success is determined by how well your customers feel toward you, your service, and your product Assessing your place in the world Assess The General Situation Your Environment Improve Registered Certified Gold Redesigned Network Develop Subscription Based Software Competencies Understand Program Management Concordant Bodies & Alternatives
  • 5.
  • 6. Most businesses with development groups will be system builders or integrators and will mature at a predictable pace Info-Tech Research Group Experienced : 15-30 Dev, 2-3 Dev Mgr, 1 Sr Dev Mgr, 0-1 BA, 0-1 PM, some formalized dev standards, most projects run by dev leaders, some early formalized planning, requirements mostly known, little change control, deadlines are set based on rough estimates and business/client need still largely asap. Inexperienced : 1-5 Dev, no Dev Mgr, no BA, no PM, no dev standards, projects run by dev, no real planning, requirements are extremely loose, no change control, deadlines are all asap. **Top Priority -> Delivery Mildly Experienced : 5-15 Dev, 1-2 Dev Mgr, no BA, no PM, unofficial dev standards, projects run by dev, no formalized planning, requirements are loose, no change control, deadlines are all asap. Very Experienced : 30-50 Dev, 5-8 Dev Mgr, 1-2 Sr Dev Mgr, 1-2 BA, 3-5 PM, formalized development standards being followed, some projects still run by dev leaders, most run by PM, formalized planning, requirements are set, some change control is attempted, deadlines are set based on established estimation process with business/client needs factoring heavily. Seasoned Veteran : 50+ Dev, 8+ Dev Mgr, 3+ Sr Dev Mgr, 2-3+ BA, 8+ PM, formalized dev standards are followed, all projects controlled by PM, standardized and formalized planning, requirements are known, change control is in place, deadlines are based on well thought out estimates and business/client needs. ** Top Priority -> Quality No Testers ad-hoc developer testing only 0-1 Testers very little testing, some developer testing, some ad-hoc exploratory type testing 1 SQA/Test Leader, 5-10 Testers some developer testing, some standardized testing, mostly regression, smoke, exploratory, acceptance 1-2 SQA/Test Leader, 8-20 Testers some developer testing, standardized SQA run testing – separate test environment 1-2 SQA/Test Leaders, 20+ Testers developer unit testing, automated build testing standardized SQA run testing, automated regression, other automated testing – multiple separate test environments Individual business focus may be different, the approach to testing may be different, but the goal toward quality will be constant
  • 7.
  • 8.
  • 9.
  • 10.
  • 11.
  • 12.
  • 13.
  • 14.
  • 15.
  • 16.
  • 17. If you’re testing an enterprise app, it is of little value how much testing has been done if it cannot be integrated Info-Tech Research Group Make sure your testers are aware of these Enterprise Testing Fundamentals: Testing is a critical requirement to making good enterprise deployment decisions. Recommendations include approaching your enterprise testing with a phased approach to your component, feature, and system testing. The test process for enterprise solutions can be best viewed as a reverse engineering of the product development. Typically this can be done in a 3-step process involving module verification , feature verification , and usage . Product testing can only begin once one or more of the interrelated functions have been unit tested sufficiently as to allow a normal progression through each of the major areas of functionality. Start testing early to provide your supplier, vendor, or internal development group sufficient time to respond with fixes and have your test team verify the fixes while still preserving your timeline Info-Tech Insight:
  • 18.
  • 19.
  • 20. Info-Tech Research Group Develop & Improve Your Software Testing Strategy Improve your overall approach to software quality testing It is impossible to “test” quality into a product Improving your testing effectiveness Assess The General Situation Your Environment Improve Your Testing Focus Your Testing Strategy Your Testing Effectiveness Best Practices Develop Subscription Based Software Competencies Understand Program Management Concordant Bodies & Alternatives
  • 21. Incomplete requirements, poor specifications, and unclear objectives are listed clearly as the biggest project problems 0 Info-Tech Research Group In a recent unofficial independent poll of Software Quality Assurance professionals (Linked-In) when asked “What do you think the top problems with development projects are?” The results showed that poor requirements was the leading cause of failure in projects. Inadequate testing was NOT the biggest concern, poor requirements were Make sure your project team fully understands the requirements and make sure those requirements are testable If your team can’t test the requirement, you have a problem! approx N=19 18% 13% 7% 8% 28% Poor Requirements Feature Creep Miscommunication Inadequate Testing Unrealistic Schedule Having testers involved early in the projects can help to prevent the number one cause of failure in projects. *** Visionary requirements cannot be tested , but detailed, well thought out requirements can!
  • 22. Improve your testing process by first understanding the 5 most common problems and how to address them Info-Tech Research Group solid requirements - clear, complete, detailed, attainable, testable requirements that are agreed to by all stakeholders. realistic schedules - allow adequate time for planning, design, testing, bug fixing, re-testing, changes, and documentation; project resources should be able to complete the project without burning out. adequate testing - start testing early, re-test after fixes or changes, plan for adequate time for testing and bug-fixing. 'Early' testing could include static code analysis/testing, test-first development, unit testing by developers, automated post-build testing, etc. stick to initial requirements where feasible - be prepared to defend against excessive changes and additions once development has begun, and be prepared to explain consequences. If changes are necessary, they should be adequately reflected in related schedule changes. If possible, work closely with customers/end-users to manage expectations. communication - require walkthroughs and inspections when appropriate; make extensive use of group communication tools - groupware, wiki's, bug-tracking tools and change management tools, intranet capabilities, etc.; ensure that information/documentation is available and up-to-date - preferably electronic, not paper; promote teamwork and cooperation; use prototypes and/or continuous communication with end-users if possible to clarify expectations.
  • 23.
  • 24. Do not limit your testing resources to just testers, include resources from other areas to improve your success 0 Info-Tech Research Group “ Just like development, to be effective SQA needs to have good processes For good testing results those processes need to include other people as necessary” - SQA Analyst– IT Professional Services (Financial Industry) Organizations that included resources outside of testing throughout their process showed to be more successful in every case. N=72 Source: Info-Tech Research Group Business Analysts Business Users Developers Other IT +38% +33% +24% +83% Those that did Those that did not Clients use of additional resources
  • 25.
  • 26.
  • 27.
  • 28.
  • 29.
  • 30.
  • 31. Improve your effectiveness by following this simple list of best practices for software quality testers Info-Tech Research Group 1) Learn to analyze your test results thoroughly 2) Learn to maximize your test coverage 3) Break your application into smaller functional modules 4) Write test cases for intended functionality 5) Start testing the application with the intent of finding bugs 6) Write your test cases in the requirement analysis and design phase 7) Make your test cases available to developers prior to coding 8 ) If possible identify and group your test cases for later regression testing 9) Applications requiring critical response time should be thoroughly tested for performance 10) Developers should not test their own code 11) Go beyond requirement testing 12) While doing regression testing use bug data 13) Note down the new terms, concepts you learn while testing 14) Note down all code changes done for testing 15) Keep developers away from test environment 16) It’s a good practice to involve testers right from software requirement and design phase 17) Testing teams should share best testing practices 18) Increase your conversations with the developers 19) Don’t run out of time to do high priority testing tasks 20) Write clear, descriptive, unambiguous bug reports Read the full “ Testing Best Practices ” in Appendix IV of this document. Don’t forget: testing is a creative and challenging task. It depends on your skill and experience, how you handle this challenge
  • 32. Info-Tech Research Group Develop & Improve Your Software Testing Strategy Develop your testers, your process, and the means to measure Involve SQA early, but not too early … wait until the ambiguity starts to settle Developing your testing strategies Assess The General Situation Your Environment Improve Your Testing Focus Your Testing Strategy Your Testing Effectiveness Best Practices Develop Your Testers & Test Coverage Your ability to measure success Understand Program Management Concordant Bodies & Alternatives
  • 33.
  • 34. Having the right mix of experience and training in the right roles makes a difference to your chance of success 0 Info-Tech Research Group Increasing Success Our survey shows: Clients that showed greater success had 36% more experienced staff on their team Organizations that showed greater success had 45% more staff with development backgrounds Clients that showed greater success had 41% more staff with formal QA training N=72 Source: Info-Tech Research Group In a related unofficial poll of SQA Professionals, when asked “ What is a good ratio of developers to testers? ” the response was an overwhelming and unanimous “NO” The unanimous reasoning was simply that there are too many variables to give a reasonably accurate answer However the survey did show that while the minimum ratio was 0:1, the maximum was 1:30 the most common ratio was 1:3 (testers to developers) “ Having the right resources spread between Dev, SQA, and the BA’s is like 3 legs of a stool … you need them all ” - IT Manager Financial Industry +45% +36% +41% Relevant Work Experience Formal QA Training Development Background
  • 35. Trained SQA testers should know what type of test is best, but using these should be your minimum coverage Info-Tech Research Group Unit Testing - the most micro scale of testing; to test particular functions or code modules. Typically done by the developer and not by testers, as it requires detailed knowledge of the internal program design and code. Not always easily done unless the application has a well-designed architecture with tight code; may require developing test driver modules or test harnesses Functional Testing - black-box type testing geared to functional requirements of an application; this type of testing should be done by testers. This doesn't mean that the programmers shouldn't check that their code works before releasing it (which of course applies to any stage of testing) Integration Testing - testing of combined parts of an application to determine if they function together correctly. The 'parts' can be code modules, individual applications, client and server applications on a network, etc. This type of testing is especially relevant to client/server and distributed systems Exploratory & Ad-Hoc Testing - often taken to mean a creative, informal software test that is not based on formal test plans or test cases; testers may be learning the software as they test it Regression Testing - re-testing after fixes or modifications of the software or its environment. It can be difficult to determine how much re-testing is needed, especially near the end of the development cycle. Automated testing approaches can be especially useful for this type of testing Usability Testing & User Acceptance - testing for user-friendliness. This is the most subjective, and will depend on the end-user or customer. User interviews, surveys, video recording of user sessions, and other techniques can be used. Developers and testers are usually not appropriate as usability testers “ Automated testing is not the silver bullet that everyone believes it to be but yet another tool in the QA toolbox. An automation framework is great when it is used to enhance testing ” - SQA Trained Professional with over 16yrs exp . Read the complete list of testing types in Appendix V
  • 36.
  • 37.
  • 38.
  • 39.
  • 40.
  • 41. Info-Tech Research Group Develop and Improve Your Software Testing Strategy If you only do a little, you need to understand the basics Understanding is the first step toward useful knowledge Understand how it all fits together Assess The General Situation Your Environment Improve Your Testing Focus Your Testing Strategy Your Testing Effectiveness Best Practices Develop Your Testers & Test Coverage Your ability to measure success Understand If you only test a little, this is what you need to know How testing fits in the bigger SQA picture
  • 42.
  • 43.
  • 44.
  • 45.
  • 46.
  • 47.
  • 48.
  • 49. Info-Tech Research Group Software Quality Assurance is not all about the testing, in fact SQA encompasses all aspects of the Software Development Life Cycle from planning through support An organization that focuses on Quality Management looks at all aspects from beginning to end with a goal of total satisfaction. “ Getting them engaged early creates more successful projects because they have the context to plan and prepare.” - Manager of QA – Government (Health Sector) “ If you relegate QA responsibilities to just testing the end product, you overlook an opportunity to integrate quality into the entire software development life cycle.” - SQA Trained Professional with over 16yrs exp. “ To truly be effective at building quality, you can’t just look at testing alone. Making sure the requirements are even testable is a crucial step towards quality.” - SQA Trained Professional with over 16yrs exp. Software testers typically get involved during design and development phases …but you really shouldn’t wait, or stop there. Software Development Life Cycle (SDLC)
  • 50.
  • 51.
  • 52.
  • 53. Appendix I Case Study: Testing is limited. The business doesn’t care Info-Tech Research Group The business remains uninvolved because they trust the development group. There is not enough regression testing to the manager’s estimation, and as a result there are post production issues. Since the business doesn’t see it as a concern they may never change testing practices. “ We have monitoring software for servers. Monitor but don’t test . We put it out there and hope that it doesn’t crash . It will go down and we’re reactive about it. The IT budget is limited, so we can’t invest in testing” The Situation : Finance Industry – Applications developed include website and internal applications . There are approximately 30 people in IT, 10 Web Developers ( no testers ) – Spread across an average of 10 applications and/or projects. Some project managers are involved in some of the projects and sometimes will produce a plan. Development staff typically will draft a test plan (of sorts). There is no standard methodology followed, everything is very ad-hoc. Attempts are made to collaborate with the business when possible; projects managers are too distant and operational managers are not equipped with knowledge to provide feedback. The Challenge : One of the biggest struggles facing the group is trying to get the business included and involved. They have no understanding of development or testing and do not support what they don’t know. IT has no idea what they are supposed to test, and because this has not caused a major problem yet , the business does not feel enough incentive to get involved , or to support the need for testing. The only time the business gets involved in our projects is when they need to sign off for compliance reasons. Generally the business takes on the attitude of … “we just trust you.” Comments: Recommendations: This business needs to take a step back and look at the risk associated with not doing testing. Trusting of their developers is admirable, but it won’t help the business. A serious look at the cost justifications for testing should be done. Compared against the cost to react to problems and the risk they are exposing themselves to, especially considering their industry, should be enough to help them in the right direction. Many businesses fall into this pit … they trust their developers, and so testing is a cost not justifiable. Before you right it off , as with any good business decision … do your due diligence , quality is worth it! Go Back
  • 54. Appendix II Case Study: Even small shops need a testing strategy Info-Tech Research Group “ No, I think that for us, because we’re a smaller shop, it comes down to time. We just don’t (test)– I’m gonna say from a manpower perspective, we definitely take some shortcuts right now . In some cases, I have my VAR, my key contact, he will test it. He’ll show me what – we’ll quickly go over it and I’ll say “ Yeah, okay that looks good ” and I just let it go . I don’t test it ; I don’t necessarily test them myself just because I don’t have the time to do it .” For vendor patch updates, the partner installs and tests patches. But they don’t put all their faith in the hands of the development partner. Before any patches are applied, a copy is made of the previous release. If something happens once the development partner applies the patches they go back to original. The Situation : Small Shop Profile –CIO, Application manager and QA manager at a manufacturing industry IT shop of four.  Development for their iSeries based commercial ERP platform is an external partner, with whom they have had years of experience. They work with a VAR to specify and test ERP enhancements, and sometimes just don’t have time to test. The Challenge : Big projects – will test in conjunction with developers The alternative: For minor projects, and when there just isn’t time, will have the development partner test functionality.   Comments: Recommendations: The manager in this case recognizes the need to test, he uses the 3 rd party VAR whom he trusts to take on a lot of the testing responsibilities and trusts that if something goes wrong, they can back out of it. As with in the first case study, this business needs to take a step back and look at the risk associated with not doing testing. Trusting of their VAR in this case is risky, and it won’t help the business. The VAR does not have the same stake if something goes wrong. A serious look at the cost for testing should be done. All may be well today, but what about tomorrow… As a small shop of only four developers, it is easy to understand, and easy to justify not having specific testing resources. However, improved quality can be achieved by having dedicated resources that know how to plan, how to strategize and test the integration of the systems. A dedicated resource can free up the development resources to work on more relevant tasks. The development resources in small shops can be an amazing resource, but they can also be very quick to move on. Keep them productive and happy doing things they love to do …. develop. Focus the developers attention on the solutions, and bring in co-op students to handle the testing. its your quality, don’t leave it to someone else! Go Back
  • 55. Appendix III Case Study: Test data, the critical component Info-Tech Research Group “ So I’ve seen one of my testers was testing maintenance for something, and there literally was not enough data to sort a column. And that project, that application had made it through to production.” Another scenario is when data is not unique enough. “We’ve had this one happen – the entire test, people in there had the same birth date, so you’re trying to validate reports that generate different things based on date of birth, you don’t really know if it’s working or not because you’re always seeing the same result.” The Situation : 100 person IT shop. 10 QA staff, lots of custom and commercial testing. There are software architects, developers, e-business analysts, and testers, and kind of quality analyst type roles, project managers, application desk-side support analysts. The Challenge : For security/compliance reasons an enterprise is limited in the real data they can tests in applications, and incomplete test data can limit testing. The result is that testing does not actually meet the targets in the test script. People need to create custom data and sometimes they just don’t create enough.   Comments: Recommendations: This organization has a well established and well formed environment. Their resource make up is good, the processes they have in place are good, the testing strategy is good, and there is adequate planning and support for their testing efforts. The short fall is with the infrastructure for the test environments. In this case, data. Time needs to be spent ensuring that the test team has sufficient data to properly test for conditions that properly reflect those situations that emulate the real world. The recommendation would be to duplicate one of the production databases, move this to an isolated test environment, and take the time to change the data from real to test … for example, changing names, changing titles, changing all facets of the data to test (fake) and obviously fake, data. This is a tedious and time consuming task, however once completed it will increase the effectiveness of the testing team. Another alternative recommendation would be to create a mirror within the production system that could write (changed) data to a separate database for use in the test environment. Go Back
  • 56. Appendix IV Testing Best Practices Info-Tech Research Group 1) Learn to analyze your test results thoroughly. Do not ignore the test result. The final test result may be ‘pass’ or ‘fail’ but troubleshooting the root cause of ‘fail’ will lead you to the solution of the problem. Testers will be respected if they not only log the bugs but also provide solutions . 2) Learn to maximize the test coverage every time you test any application. Though 100 percent test coverage might not be possible you can always try to reach it. 3) To ensure maximum test coverage break your application into smaller functional modules. Write test cases on individual unit modules. Also, if possible, break these modules into smaller parts, e.g.: If you have divided your website application in modules and “ accepting user information ” is one of the modules. You can break this “ user information ” screen into smaller parts for writing test cases: Parts like UI testing, security testing, functional testing of the user information form etc. Apply all form field type and size tests, negative and validation tests on input fields and write all test cases for maximum coverage. 4) While writing test cases, write test cases for the intended functionality first i.e: for valid conditions according to requirements. Then write test cases for invalid conditions. This will cover expected as well unexpected behavior of the application. 5) Think positive . Start testing the application by intending to find bugs/errors. Don’t think beforehand that there will not be any bugs in the application. If you test the application with the intention of finding bugs you will definitely succeed. 6) Write your test cases in requirement analysis and the design phase itself. This way you can ensure all the requirements are testable. 7) Make your test cases available to developers prior to coding. Don’t keep your test cases with you waiting to get the final application release for testing, thinking that you can log more bugs. Let developers analyze your test cases thoroughly to develop a quality application. This will also save the re-work time. Go Back
  • 57. Appendix IV - continued Testing Best Practices Info-Tech Research Group 8) If possible identify and group your test cases for regression testing. This will ensure quick and effective manual regression testing. 9) Applications requiring critical response time should be thoroughly tested for performance. Performance testing is the critical part of many applications. In manual testing this is mostly ignored by testers. Find out ways to test your application for performance. If it is not possible to create test data manually, then write some basic scripts to create test data for performance testing or ask the developers to write it for you. 10) Programmers should not test their own code. Basic unit testing of the developed application should be enough for developers to release the application for the testers. But testers should not force developers to release the product for testing. Let them take their own time. Everyone from lead to manger will know when the module/update is released for testing and they can estimate the testing time accordingly. This is a typical situation in an agile project environment . 11) Go beyond requirement testing. Test the application for what it is not supposed to do. 12) While doing regression testing use previous bug information. This can be useful to predict the most probable bug filled part of the application. 13) Keep a text file open while testing an application and write down the new terms and concepts you learn. Use these notepad observations while preparing a final test release report. This good habit will help you to provide a complete unambiguous test report and release details. 14) Many times testers or developers make changes in the code base for the application when under test. This is a required step in development or testing environments to avoid execution of live transaction processing like in banking projects . Record all such code changes done for testing purposes and at the time of final release make sure you have removed all these changes from the final client side deployment file resources. Go Back
  • 58. Appendix IV – continued (2) Testing Best Practices Info-Tech Research Group Go Back 15) Keep developers away from the test environment. This is a required step to detect any configuration changes missing in a release or deployment document. Sometimes developers do some system or application configuration changes but forget to mention those in deployment steps. If developers don’t have access to the testing environment they will not make any of these changes accidentally. These changes must be captured at the right place . 16) It’s a good practice to involve testers right from the software requirement and design phase. This way testers can get knowledge of the application dependability resulting in detailed test coverage. If you are not being asked to be part of this development cycle then make a request to your lead or manager to involve your testing team in all decision making processes or meetings . 17) Testing teams should share best testing practices , and experience with other teams in their organization. 18) Increase your conversation with developers to know more about the product. Whenever possible, use face-to-face communication for resolving disputes quickly and to avoid any misunderstandings. But also, when you reach an understanding or resolve any dispute - make sure to communicate the same in writing. Do not leave anything strictly verbal . 19) Don’t run out of time to do high priority testing tasks. Prioritize your testing work from high to low priority and plan your work accordingly. Analyze all associated risks to prioritize your work. 20) Write clear, descriptive, unambiguous bug reports. Do not provide only the bug symptoms but also provide the effect of the bug and all possible solutions.
• 59. Appendix V Understand the various testing types Info-Tech Research Group
Black Box Testing - not based on any knowledge of internal design or code. Tests are based on requirements and functionality.
White Box Testing - based on knowledge of the internal logic of an application's code. Tests are based on coverage of code statements, branches, paths, and conditions.
Unit Testing - the most micro scale of testing; tests particular functions or code modules. Typically done by the developer and not by testers, as it requires detailed knowledge of the internal program design and code. Not always easily done unless the application has a well-designed architecture with tight code; may require developing test driver modules or test harnesses. (A minimal example follows this list.)
Incremental Integration Testing - continuous testing of an application as new functionality is added; requires that various aspects of an application's functionality be independent enough to work separately before all parts of the program are completed, or that test drivers be developed as needed; done by programmers or by testers.
Integration Testing - testing of combined parts of an application to determine if they function together correctly. The 'parts' can be code modules, individual applications, client and server applications on a network, etc. This type of testing is especially relevant to client/server and distributed systems.
Functional Testing - black-box testing geared to the functional requirements of an application; this type of testing should be done by testers. This doesn't mean that programmers shouldn't check that their code works before releasing it (which of course applies to any stage of testing).
System Testing - black-box testing based on overall requirements specifications; covers all combined parts of a system.
End-to-End Testing - similar to system testing; the macro end of the test scale; involves testing of a complete application environment in a situation that mimics real-world use, such as interacting with a database, using network communications, or interacting with other hardware, applications, or systems if appropriate.
Smoke Testing - typically an initial testing effort to determine whether a new software version is performing well enough to accept it for a major testing effort. For example, if the new software is crashing systems every 5 minutes, bogging systems down to a crawl, or corrupting databases, it may not be in a condition to warrant further testing.
Regression Testing - re-testing after fixes or modifications to the software or its environment. It can be difficult to determine how much re-testing is needed, especially near the end of the development cycle. Automated testing approaches can be especially useful for this type of testing.
Acceptance Testing - final testing based on the specifications of the end-user or customer, or based on use by end-users/customers over some specified period of time.
Load Testing - testing an application under heavy loads, such as testing a web site under a range of loads to determine at what point the system's response time degrades or fails.
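The unit-testing entry above notes that unit tests exercise individual functions or modules, usually written by the developer. Below is a minimal sketch using Python's standard unittest module; the discount_price function and its rules are invented for this example and are not part of the solution set.

```python
# Minimal sketch of a unit test, illustrating the "Unit Testing" entry above.
# The discount_price function and its business rules are invented for this example.
import unittest


def discount_price(price: float, is_member: bool) -> float:
    """Apply a 10% member discount; non-members pay full price."""
    if price < 0:
        raise ValueError("price must be non-negative")
    return round(price * 0.9, 2) if is_member else price


class DiscountPriceTests(unittest.TestCase):
    def test_member_gets_ten_percent_off(self):
        self.assertEqual(discount_price(100.0, is_member=True), 90.0)

    def test_non_member_pays_full_price(self):
        self.assertEqual(discount_price(100.0, is_member=False), 100.0)

    def test_negative_price_is_rejected(self):
        with self.assertRaises(ValueError):
            discount_price(-1.0, is_member=True)


if __name__ == "__main__":
    unittest.main()
```

Run after every fix, the same test class also doubles as a small automated regression suite in the sense described in the Regression Testing entry.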
• 60. Appendix V – continued Understand the various testing types Info-Tech Research Group
Stress Testing - often used interchangeably with load and performance testing. Also used to describe tests such as system functional testing under unusually heavy loads, heavy repetition of certain actions or inputs, input of large numerical values, large complex queries to a database system, etc.
Performance Testing - often used interchangeably with stress and load testing. Ideally, performance testing is defined in requirements documentation or QA/test plans (specified performance criteria from the customer). (A minimal load/performance sketch follows this list.)
Usability Testing - testing for user-friendliness. This is the most subjective type of testing and will depend on the end-user or customer. User interviews, surveys, video recording of user sessions, and other techniques can be used. Developers and testers are usually not appropriate as usability testers.
Deployment Testing - testing of full, partial, or upgrade install/uninstall processes.
Recovery & Failover Testing - testing how well a system recovers from crashes, hardware failures, or other catastrophic problems.
Security Testing - testing how well the system protects against unauthorized internal or external access, willful damage, etc.; may require sophisticated testing techniques.
Compatibility Testing - testing how well software performs in a particular hardware/software/operating system/network environment.
Exploratory Testing - often taken to mean a creative, informal software test that is not based on formal test plans or test cases; testers may be learning the software as they test it.
Ad-Hoc Testing - similar to exploratory testing, but often taken to mean that the testers have significant understanding of the software before testing it.
User Acceptance Testing - determining if software is satisfactory to an end-user or customer.
Comparison Testing - comparing software weaknesses and strengths to competing products.
Alpha/Beta Testing - testing of an application when development is nearing completion; minor design changes may still be made as a result of alpha testing. Beta testing occurs when development and testing are essentially complete and final bugs and problems need to be found before release. Typically, alpha and beta testing is done by end-users or others, not by developers or testers.
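To make the stress and performance entries (and the load-testing entry on the previous slide) concrete, here is a minimal sketch that fires concurrent HTTP requests and reports rough response-time percentiles. The target URL, request count, and concurrency level are assumptions; for serious performance work, a dedicated load-testing tool is usually the better choice.

```python
# Minimal sketch of a load/performance check using only the standard library.
# The target URL, request count, and concurrency level are illustrative assumptions.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET_URL = "http://localhost:8080/health"  # hypothetical endpoint
REQUESTS = 200
CONCURRENCY = 20


def timed_request(url: str) -> float:
    """Issue one GET request and return its response time in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as response:
        response.read()
    return time.perf_counter() - start


if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
        timings = sorted(pool.map(timed_request, [TARGET_URL] * REQUESTS))
    print(f"requests: {len(timings)}")
    print(f"median:   {timings[len(timings) // 2]:.3f}s")
    print(f"p95:      {timings[int(len(timings) * 0.95)]:.3f}s")
```

Watching how the median and p95 figures change as CONCURRENCY rises is a quick way to see where response time starts to degrade, which is exactly what the load-testing definition asks for.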
• 61. Appendix VI Available Tools & Templates Info-Tech Research Group
Use Info-Tech's "Test Plan Template" to quickly lay out and streamline your project test plan (strategy).
Use Info-Tech's "Defect Reporting Template" to quickly record and document your project test cases.
Use Info-Tech's "Software Testing Strategy & Risk Assessment" tool to determine risk and testing strategies by looking at these influencing factors.