
Posted August 13, 2015

Reducing Regression Execution Times


We all know the saying "time is money." QA managers are constantly under pressure not only to deliver high-quality software products, but also to do so within tight time constraints. Regression testing is a vital component of any software development life cycle, ensuring that no new errors are introduced by new features or by fixes to existing bugs. Every time we modify existing source code, new and existing test cases need to be executed across different configurations, such as operating systems and platforms. Testing all of these permutations manually is neither cost- nor time-effective, and it produces inconsistent results. Automated regression addresses these challenges. But as feature coverage and permutation testing grow, execution times can climb to a level that is no longer acceptable for delivering high-quality products on a tight schedule. Here are a few ways to improve execution time:

1. Introduce Continuous Integration (CI) Tools

When going from a few scripts to a few thousand scripts, you begin to notice some growing pains. I frequently see engineers manually executing script batches one at a time, in a serialized manner, and monitoring them as they run. This consumes both time and resources. It becomes even more challenging when regression runs overnight or over weekends, with no one available to troubleshoot. As your automation grows, you need an infrastructure in place that allows you to scale and to run regressions unattended. I recommend using a Continuous Integration (CI) tool such as CircleCI or Jenkins to manage automated regression executions. These tools can help you bring up virtual machines, start regressions, handle a more dynamic queuing mechanism, monitor regressions, and warn you if something goes wrong that requires manual intervention. They can also trigger a recovery mechanism if something becomes non-operational, as the sketch below illustrates.
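As a rough illustration of the monitor-and-recover logic a CI job can drive, here is a minimal Python sketch. The helper names (start_regression, is_finished, is_healthy, recover) are hypothetical placeholders, not a real CI or vendor API; in practice each would call your CI tool's plugins or REST API.

```python
import time

# Hypothetical placeholders for whatever your CI tool or test scripts expose;
# a real setup would call the CI tool's plugins or REST API instead.
def start_regression(batch):
    print(f"starting regression batch: {batch}")

def is_finished(batch):
    return True   # placeholder: pretend the batch completed; replace with a real status check

def is_healthy(batch):
    return True   # placeholder: e.g. poll a status endpoint or heartbeat file

def recover(batch):
    print(f"recovering batch: {batch}")  # e.g. reboot the VM and requeue the batch

def run_unattended(batches, poll_seconds=60):
    """Start every batch, then keep watching so nobody has to babysit an overnight run."""
    for batch in batches:
        start_regression(batch)
    pending = set(batches)
    while pending:
        for batch in list(pending):
            if is_finished(batch):
                pending.discard(batch)
            elif not is_healthy(batch):
                recover(batch)  # automatic recovery instead of a 2 a.m. page
        if pending:
            time.sleep(poll_seconds)

if __name__ == "__main__":
    run_unattended(["smoke", "api", "ui"])
```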

2. Use CI Tools Not Only to Run Scripts, but Also to Automate All Manual Steps!

When it is time to execute regression, there is a series of steps beyond simply running the scripts, such as:

  • Loading the new software to be tested

  • Updating your scripts with the latest version

  • Configuring servers

  • Executing scripts

  • Posting results

  • Communicating failure details

Very frequently we see customers using CI only to run scripts, relying on a manual process for the remaining steps, which is a very time-consuming approach. I recommend minimizing manual intervention and working toward an end-to-end automated process, using a CI tool that lets you monitor and orchestrate the different tasks, as in the sketch below.
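To make the idea concrete, here is a minimal Python sketch of an end-to-end pipeline that chains the steps listed above. The shell commands are hypothetical stand-ins for your own deployment, configuration, and reporting tooling; in a real setup each step would map to a stage in your CI tool's pipeline definition.

```python
import subprocess
import sys

# Hypothetical commands standing in for your own tooling; in a real pipeline
# each entry would be a stage in your CI tool's configuration.
STEPS = [
    ("load build under test", "deploy_build.sh latest"),
    ("update test scripts",   "git pull --ff-only"),
    ("configure servers",     "configure_servers.sh staging"),
    ("execute scripts",       "run_regression.sh"),
    ("post results",          "post_results.sh"),
]

def run_pipeline():
    for name, cmd in STEPS:
        print(f"--- {name}: {cmd}")
        result = subprocess.run(cmd, shell=True)
        if result.returncode != 0:
            # Communicating failure details is itself automated, not a manual email.
            subprocess.run(f"notify_team.sh '{name} failed'", shell=True)
            sys.exit(result.returncode)

if __name__ == "__main__":
    run_pipeline()
```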

3. Introduce Dynamic Timeouts

Timeouts are a necessary evil when writing automation. They allow you to slow down your automation to simulate more human-like interaction, or simply to wait for something to happen before the next step. When abused, this practice can increase execution time substantially. (I have seen regressions running three times slower before adjusting timeouts – no kidding!) I recommend using dynamic timeouts: instead of always sleeping for a fixed period, wait only until the expected event occurs, with the fixed value as an upper bound, which reduces waiting time. The effectiveness of dynamic timeouts depends on your implementation, but in most cases it beats hard-coding them, as the sketch below shows.
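Here is a minimal Python sketch of the idea: a generic wait helper that polls for a condition and returns as soon as it is met, rather than always sleeping for the full hard-coded period. The names are illustrative and not tied to any specific framework (Selenium's WebDriverWait, for example, offers the same pattern out of the box).

```python
import time

def wait_until(condition, timeout=30.0, poll_interval=0.5):
    """Poll `condition` until it returns a truthy value or `timeout` expires.

    The timeout is only an upper bound; if the event happens early we stop waiting.
    Returns the condition's value on success, or raises TimeoutError.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = condition()
        if result:
            return result
        time.sleep(poll_interval)
    raise TimeoutError(f"condition not met within {timeout} seconds")

# Hard-coded timeout: always burns the full 30 seconds.
#   time.sleep(30)
#   assert page_is_loaded()
#
# Dynamic timeout: returns the moment the page is ready (page_is_loaded is a
# hypothetical check from your own framework).
#   wait_until(page_is_loaded, timeout=30)
```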

4. Unlock the Power of Parallel Execution and Virtualization

Once you have a seamless end-to-end automated process using a CI tool, your next productivity leap can come from increasing capacity through parallel execution. To run in parallel, add physical or virtual machines and let your CI tool handle dynamic queuing and load balancing across them. This can cut execution time roughly in proportion to the number of VMs or servers you add, as long as the test batches are independent; the sketch below shows the basic pattern.
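As a rough illustration of the speedup, here is a Python sketch that fans independent test batches out to a pool of workers instead of running them one after another. The batch names and the simulated run_batch function are hypothetical; a CI tool would do the same fan-out with build agents or VMs rather than local threads.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed
import time

# Hypothetical batches; in practice each might target a different OS/browser configuration.
BATCHES = ["suite_login", "suite_checkout", "suite_search", "suite_admin"]

def run_batch(batch):
    # Stand-in for dispatching a suite to a worker VM or CI agent and waiting for it.
    time.sleep(1)          # pretend the suite takes a while
    return batch, 0        # return the batch name and an exit code

def run_parallel(batches, workers=4):
    """Run independent batches concurrently; wall-clock time shrinks roughly in
    proportion to the number of workers (serial time / workers, plus overhead)."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(run_batch, b) for b in batches]
        for future in as_completed(futures):
            batch, code = future.result()
            print(f"{batch}: {'PASS' if code == 0 else 'FAIL'}")

if __name__ == "__main__":
    start = time.time()
    run_parallel(BATCHES)
    print(f"finished in {time.time() - start:.1f}s (about 1s instead of 4s serially)")
```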

5. Build a Fully Integrated Automation Framework

To have an efficient automation framework, all of its components must be fully integrated and talking to each other. Missing or partially integrated components add extra time to the whole regression process, because you end up executing those steps manually. I recommend integrating a centralized tool to keep track of results. It can be as simple and inexpensive as automatically posting results to a Google spreadsheet, a cloud-based test case management tool consumed as a service and paid per engineer, or a licensed tool you buy and install yourself. Analyze each component in your automated regression framework and make sure it is fully integrated, so you can achieve higher levels of efficiency and effectiveness; the sketch below shows the simplest version of automated results posting.
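For the simplest case mentioned above, automatically publishing results to a central place, here is a minimal Python sketch that posts a run summary to a reporting endpoint. The URL and payload shape are hypothetical; a real setup would use the Google Sheets API or your test case management tool's own REST API.

```python
import json
import urllib.request

# Hypothetical endpoint; a real setup would use the Google Sheets API or your
# test case management tool's REST API instead.
RESULTS_URL = "https://example.com/api/regression-results"

def post_results(run_id, passed, failed, skipped):
    """Push a run summary to the central results store as the last pipeline step."""
    payload = {"run_id": run_id, "passed": passed, "failed": failed, "skipped": skipped}
    request = urllib.request.Request(
        RESULTS_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status

# Example (hypothetical numbers), called automatically at the end of a regression run:
# post_results(run_id="nightly-regression", passed=482, failed=9, skipped=3)
```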

6. Combat High Script Failure Rates

If script failure rates are high, the time spent on failure analysis can invalidate automated regression, leading to unacceptable time loss. I have personally seen cases where automated regression became irrelevant because of this problem. A high failure rate (anywhere from 25-50%) makes it extremely difficult to detect new issues introduced by code changes. Before you continue adding scripts to an automated regression environment with a high failure rate, perform a root cause analysis of the failures and focus on fixing them until you reach, ideally, a failure rate below 5%. The sketch below shows one simple way to surface the worst offenders.
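As one simple starting point for that root cause analysis, here is a Python sketch that computes per-script failure rates from historical results and flags anything above the target threshold. The result format and sample data are hypothetical; feed it from whatever your framework already records.

```python
from collections import defaultdict

# Hypothetical history: (script_name, passed) tuples from past regression runs.
HISTORY = [
    ("login_test", True), ("login_test", False), ("login_test", False),
    ("search_test", True), ("search_test", True),
    ("checkout_test", False), ("checkout_test", True), ("checkout_test", False),
]

def failure_rates(history):
    runs, failures = defaultdict(int), defaultdict(int)
    for script, passed in history:
        runs[script] += 1
        if not passed:
            failures[script] += 1
    return {script: failures[script] / runs[script] for script in runs}

def worst_offenders(history, threshold=0.05):
    """Scripts failing more often than the target rate (5% here), worst first."""
    rates = failure_rates(history)
    flagged = [(s, r) for s, r in rates.items() if r > threshold]
    return sorted(flagged, key=lambda item: item[1], reverse=True)

if __name__ == "__main__":
    for script, rate in worst_offenders(HISTORY):
        print(f"{script}: {rate:.0%} failure rate - investigate before adding more scripts")
```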

Conclusion

Many factors can adversely affect the time spent on automated regressions, but done right, they can deliver high levels of effectiveness.

Israel Felix has more than 20 years of experience in the technology industry, serving in multiple leadership roles within development and test groups. The last 15 years have been focused on leading functional, regression, system, API, and sanity testing in both automated and manual environments, across technologies such as switching, routing, network management, wireless, voice, and cloud-based software. He has also managed large global test groups across the US, India, Thailand, and Mexico.
