Why Test Automation Fails In Theory and In Practice

May 20th, 2016
Jim Trentadue, Software Quality Consulting Director, Original Software

Agenda
1. Test automation industry recap and current trends
2. Adoption challenges with test automation
3. Why test automation adoptions are failing
4. Why test automation implementations are failing
5. Correcting the test automation approach, concepts, and applications
6. Session recap

Test Automation Industry: Recap and Current Trends

Test automation industry: recap
- The first record/playback tools appeared many years ago. Their primary objective was to test desktop applications; some tools could check for broken URL links, but could not dig into the underlying objects.
- The tester was required to have scripting or programming knowledge to make the tests run effectively, even with record/playback.
- Framework evolution since then:
  - Record/playback
  - Structured testing (invokes more conditions)
  - Data-driven
  - Keyword-driven
  - Model/object-based
  - Action-based
  - Hybrid (combines two or more of the previous frameworks)

Test automation industry: current trends
- The number of test automation tools has grown to over 25-30, focused on taking scripting and programming out of the mix for manual testers.
- Technical complexities have increased considerably to cover the newer technology stacks.
- Manual testers are slow to delve into test automation because almost every commercial and open-source solution requires some degree of coding.
- Products are becoming single-niche tools, sometimes not offering a fully integrated quality solution for the SQA organization.
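The recap above traces the progression from record/playback to data-driven, keyword-driven, and hybrid frameworks. As an illustration that is not part of the original deck, here is a minimal data-driven sketch, assuming pytest and a hypothetical `login` function standing in for the application under test; the point is that the test logic is written once and new cases are added as data rows.

```python
# Minimal data-driven sketch (assumed example, not from the deck):
# the test logic is written once and the data rows drive the cases.
import pytest

# Hypothetical stand-in for the application under test.
def login(username: str, password: str) -> bool:
    return username == "alice" and password == "s3cret"

# Data rows live apart from the test logic; adding a case means adding a row.
LOGIN_CASES = [
    ("alice", "s3cret", True),   # valid credentials
    ("alice", "wrong", False),   # bad password
    ("", "", False),             # empty input
]

@pytest.mark.parametrize("username,password,expected", LOGIN_CASES)
def test_login(username, password, expected):
    assert login(username, password) == expected
```

A keyword-driven framework goes one step further: the rows name actions ("open", "enter", "click") that a small interpreter maps onto functions, which is what lets non-programmers compose tests from an agreed vocabulary.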
Adoption of Test Automation: Common Myths and Perceptions

Adoption challenges
- New challenges have slowed productivity:
  - Rapid deployment times to production
  - Shifts from a waterfall SDLC to an Agile methodology
  - How Test-Driven Development (TDD) affects when to automate tests
  - How component testing (database, data warehouse, web-service, or messaging testing) affects which tests to automate
- To ensure success, many organizations have employed a proof of concept.

Common myths and perceptions
- It will require a significant increase in time and people.
- It can be implemented with the click of a button.
- "Test automation can't be accomplished!"
- Existing testers will not be needed.
- Associates' training will have no impact.
- It will serve all testing needs.

Why is test automation adoption failing in organizations?

Case Study 1: Management Struggles
Excerpts from a test engineer's interactions with management:
- "It must be good, I've already advertised it." Management's intention was to reduce testing time for the system test. There was no calculation of our investment; we had to go with it because manual testing was no longer needed. The goal was to automate 100% of all test cases.
- "Automate bugs." One manager's idea was to automate bugs received from the customer care center. We were told to read each bug and automate that exact user action. Not knowing the exact function, we hard-coded the user data into our automation. We were automating bugs for versions that were no longer in the field.
- "Testers aren't programmers." My manager hired testers not to be programmers, so that he did not have more developers in the department. Once told that some tools require heavy scripting and coding, the message relayed was that testers need to do "advanced scripting" rather than programming.
- "Impress customers (the wrong way)." My boss had the habit of installing untested beta versions for presentations of the software in front of customers. He would install unstable versions and then call developers at 5:30am to fix them immediately. We introduced automated smoke tests to ensure good builds.
Source: Experiences of Test Automation: Case Studies of Software Test Automation, Graham and Fewster.

Adoption questions
Questions that go unanswered, or unasked, grouped by category:
- Who are the right people to create and maintain test cases and object repository information? Who may execute automated tests instead of creating them? Who will be the dedicated go-to person(s) for the tool?
- What is our organization's definition of automated testing: what it is and what it is not? What are the test automation objectives? What testing coverage is expected from test automation?
- When are testing resources available for this initiative? When can training happen in relation to procurement? When are there project deadlines that compete against test automation?
- Where will the test automation take place; is there a dedicated test environment? Where are the associates located (this decides whether training should be remote or onsite)? Where will a central repository reside?
- Why start a test automation initiative; what are the specific issues? Why run a vendor POC program; what do we want from that assessment? Why start with a top-down management approach?
- How do our test automation objectives differ from our overall testing objectives? How does manual testing fit into a test automation framework? How will a fully integrated quality suite function?

Adoption failures
The not-so-best practices and assumptions for adoption:
- "Automate the manual test cases": manual test cases are not written for automation; error handling is not considered part of manual testing; modularity isn't a key element of manual test case planning.
- "Must cover 100% of the application": testers should just automate the entire regression test library; application changes are instantly added to the automated library; a full overnight run with every discrepancy logged as an application defect.
- "Testers can automate within a release": replace manual testing time with automated testing time; manual testers can automate their own tests; a separate initiative is not required because there is ample time within the project.

Why is test automation implementation failing in organizations?

Case Study 2: Choosing the Wrong Tool
- Background: the goal was to automate testing of major functionality of web development tools at an internet software development company.
- Pre-existing automation: most functional tests were executed manually; some dated automation scripts could potentially have been useful if restored.
- New tool or major maintenance effort? The application's GUI had gone through major changes, so the tool had to be flexible with application changes and easy to maintain.
- Moving forward with the existing tool? The consensus was that the tool required coding, which was difficult for manual testers, and its native object recognition did not work.
- What was done after the tool was replaced? Understand how automated tools use objects, and recognize that identification by image was not adequate.
- Conclusion: examine the solutions that work best with your applications and controls. Maintainability and error handling should be treated as critical.
Source: Experiences of Test Automation: Case Studies of Software Test Automation, Graham and Fewster.
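Case Study 2 turns on how a tool recognizes GUI controls: by image or by the objects themselves. As a hedged illustration (not from the case study), the sketch below uses Selenium WebDriver against a hypothetical login page with stable element IDs; locating by object properties is what survives the GUI changes that broke the original tool, whereas image-based locators must be re-captured whenever fonts, themes, or layout shift.

```python
# Object-based identification sketch (assumed example): each element is found
# by a stable property agreed with development, not by pixels or coordinates.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://example.test/login")  # hypothetical AUT URL

    # Locating by ID survives visual changes (theme, font, resolution);
    # an image-based locator would have to be re-captured for each of them.
    driver.find_element(By.ID, "username").send_keys("alice")
    driver.find_element(By.ID, "password").send_keys("s3cret")
    driver.find_element(By.ID, "login-button").click()

    assert driver.find_element(By.ID, "welcome-banner").is_displayed()
finally:
    driver.quit()
```

This is also why the later correction slide stresses standardizing object names with development: stable IDs are what keep this style of locator maintainable.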
Implementation failures
What can go wrong when trying to use the procured solution:
- "It won't work on my application": the AUT uses technologies that don't work with the tool; an attempt to play back a recording doesn't play back at all; multiple tools are needed for the varied technologies (desktop, web, mobile).
- "Can I automate my AUT if it changes?": heavy maintenance re-recording tests for GUI changes; useful only for regression testing of legacy apps, not new apps; can't cope when the application is deployed incrementally, which is key for Agile.
- "It's too complex for manual testers": no time given project schedules and over-allocation; marketed to testers with a development skillset; assumed not to require cross-department support.

Correcting the Test Automation Approach

Adoption corrections
Changing the mindset concerning test automation:
- "Automate the manual test cases": automated cases differ from manual ones in their entry, data, and exit criteria; error handling is an essential part of automated test cases; modularity is how automated test cases are developed; objectives aim for the highest ROI (see the sketch after this list).
- "Must cover 100% of the application": start with the regression library, but focus on tests with a high ROI; plan application changes and automate them in a maintenance window; perform failure analysis: is the defect in the automated test case or in the AUT?; the test plan includes both automated and manual testing.
- "Testers can automate within a release": the initial cycle is longer than manual testing, but cycle runtime is reduced afterwards; it requires training on the tool, the approach, and best practices; it is a separate initiative with its own objectives, goals, and timelines apart from projects.
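The first correction above says an automated case differs from a manual one in its entry, data, and exit criteria, with error handling built in. The sketch below shows one possible shape for that, assuming pytest and hypothetical helpers (`app_is_reachable`, `load_order_data`, `place_order`); none of it comes from the deck itself.

```python
# Sketch of a modular automated test case (assumed example): entry criteria,
# data, and exit criteria are explicit, and preconditions are handled so a
# failure can be classified as an automation problem vs. an application defect.
import pytest

# Hypothetical stand-ins for the application under test.
def app_is_reachable() -> bool:
    return True  # entry criterion: the environment is up

def load_order_data() -> dict:
    return {"customer": "ACME", "quantity": 3}  # data kept outside the steps

def place_order(data: dict) -> str:
    return "CONFIRMED"

@pytest.fixture
def order_data():
    # Entry criterion: skip (don't fail) if the precondition isn't met, so an
    # environment problem is not logged as an application defect overnight.
    if not app_is_reachable():
        pytest.skip("entry criterion not met: AUT unreachable")
    return load_order_data()

def test_place_order(order_data):
    status = place_order(order_data)
    # Exit criterion: the observable result the manual case only implied.
    assert status == "CONFIRMED"
```

Classifying an unmet precondition as a skip rather than a failure is what keeps a full overnight run from burying real application defects under environment noise, which is the failure-analysis point the correction slide makes.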
Case Study 3: Multi-Solution Strategy
- Background: a leading electronic design automation company had a product suite comprising over 120 programs, GUI-based and ported to all the leading industry workstation hardware platforms.
- Proposed solution: purchase a solution or build internally; hire an outside consultant to validate the tool selection prior to purchase, then train the testing staff. Decision: build an in-house tool.
- Testing activities at the time: three distinct test phases: evaluation (done by the end user), integration, and system. Integration and system testing were done manually and consumed 38 person-weeks of effort for every software release, on every platform.
- Key benefits: integration test automation; system test automation; test effort per platform cut in half; consistent regression testing; less dependence on system SMEs; better use of testing resources.
- First trial: completed in under a year. Manual: 2 person-weeks; automated: 1 person-week, although not everything could be automated.
- Without automation: manual testing took 10 person-weeks on every platform.
- Automating in two phases (breadth and depth): manual effort dropped to 8 person-weeks per platform, with 4 person-weeks of automated effort in total. Approximately 79% of the tests were automated.
Source: Software Test Automation: Effective Use of Test Execution Tools, Fewster and Graham.

Implementation corrections
Changing the approach to put automation into practice:
- "It won't work on my application": develop criteria for vendors to prove their solution works; standardize the object naming structure with development (very important); research fully integrated suites (vendor management will thank you); a POC will prove the solution.
- "Can I automate my AUT if it changes?": find a solution where you don't have to re-record when new controls are added; consider starting your testing with automated tests; make test automation a key element of sprint planning; understand the object landscape.
- "It's too complex for manual testers": schedule just-in-time training shortly after procurement; lay out a resource plan for how every tester contributes; testing management can build the IT cross-functional support plan; develop a resource plan for everyone.

Case Study 4: Automating a Government System
- Background: speed up Department of Defense (DoD) software delivery. Leading-edge technology was in use, but lengthy testing and certification processes stood between software and deployment.
- Must not be intrusive to the System Under Test (SUT): the tool has to be independent of the SUT; no modules may be added to it.
- Must be OS independent: the solution(s) had to work across Windows, Linux, Solaris, Mac, and so on.
- Must be independent of the GUI: the tool needed to support every language the applications were written in.
- Must automate tests for both display and non-display interfaces: the system should handle operations through the GUI and through backend processes.
- Must work in a networked, multi-computer environment: the tool needs to test a system of systems: multiple servers, monitors, and displays interconnected to form one SUT.
- Non-developers must be able to use the tool: one of the main DoD requirements. Testers were SMEs in the application, not software developers writing scripts, so a code-free solution was needed.
- Must support an automated requirements traceability matrix: a framework was needed that allowed test management activities, including documenting a test case in the Automated Test and Re-Test (ATRT) solution along with a hierarchical breakdown of test cases into test steps and sub-steps (a minimal sketch of such a structure appears after the recap).
- Test automation ROI: the solutions selected spanned the technologies involved; test cycles were reduced from five testers to one; the testing department worked on automated test cases, both creation and execution; additional processes for test case and test data management were now enabled.
Source: Experiences of Test Automation: Case Studies of Software Test Automation, Graham and Fewster.

Session recap

Presentation recap
- The automation industry focused on desktop applications initially, but now must cover multi-technology stacks across many platforms.
- The primary skillset for automation used to be programming, but solutions offering code-free test automation are now available.
- Common perceptions and myths will exist; review the list of questions to ask to drive your strategy.
- Present real-life case studies to management and make sure the same mistakes do not occur during your research and implementation.
- Review the common failures in theory and in implementation, and check whether any are problematic in your organization.
- Take the corrective action for each failure accordingly, and incorporate test automation as part of your overall test strategy.

Jim Trentadue, Software Quality Consulting Director, Original Software
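Appendix: Case Study 4 asks for a hierarchical breakdown of test cases into steps and sub-steps, tied to an automated requirements traceability matrix. The sketch below is a hypothetical illustration of that data shape only; it does not depict the ATRT product, and every name in it (TestCase, TestStep, the requirement IDs) is invented for the example.

```python
# Hypothetical sketch: test cases broken into steps and sub-steps, each tagged
# with the requirement(s) it covers, so a traceability matrix can be generated.
from dataclasses import dataclass, field

@dataclass
class TestStep:
    description: str
    requirements: list[str] = field(default_factory=list)
    substeps: list["TestStep"] = field(default_factory=list)

@dataclass
class TestCase:
    name: str
    steps: list[TestStep] = field(default_factory=list)

def traceability(cases: list[TestCase]) -> dict[str, list[str]]:
    """Map requirement ID -> test cases that touch it."""
    matrix: dict[str, list[str]] = {}
    def walk(case_name: str, step: TestStep) -> None:
        for req in step.requirements:
            matrix.setdefault(req, []).append(case_name)
        for sub in step.substeps:
            walk(case_name, sub)
    for case in cases:
        for step in case.steps:
            walk(case.name, step)
    return matrix

login_case = TestCase(
    name="TC-001 Operator login",
    steps=[
        TestStep(
            "Log in through the GUI",
            requirements=["REQ-AUTH-01"],
            substeps=[TestStep("Verify audit record written", ["REQ-AUDIT-03"])],
        )
    ],
)

print(traceability([login_case]))
# {'REQ-AUTH-01': ['TC-001 Operator login'], 'REQ-AUDIT-03': ['TC-001 Operator login']}
```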