Reactive test automation for continuous delivery

Testing’s not properly automated until every last manual process has been removed

In 2014, around 70% of testing remained manual, and testing could take up to half the time in the SDLC.[1] As a consequence, test automation was, and still is, on the rise, as more organizations attempt to eliminate testing bottlenecks and keep up with the short cycles demanded by changing business needs.

Automating test execution makes sense, given how execution has traditionally been the longest phase of testing and automated tests run in a fraction of the time compared to manual ones.[2]

Test execution automation is not a “silver bullet”

However, if only test execution is automated, many other tasks will remain too slow and manual, while test automation itself can introduce additional labour. These challenges can make rigorous testing impossible within a sprint. They were set out in the CA webinar “Reactive Automation – Moving from Requirements to Automation”, and are summarized below.

The first issue is manual scripting. Though many good automation frameworks exist, they tend to rely on scripting, and that brings you back to manual test case design. Hours or even days go into deriving automated scripts from static requirements; at one team we worked with, it took six hours to write just 11 tests.[3]

What’s more, manual test design is unsystematic, producing only whichever tests testers happen to think of based on ambiguous, incomplete requirements. Even a simple system will contain several thousand logical combinations – more than even the most talented tester could hold in their head.
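To see how quickly combinations pile up, consider a minimal sketch using a hypothetical order form (the fields and counts below are illustrative assumptions, not from any real system). The number of logical combinations is simply the product of each input's distinct equivalence classes:

```python
from math import prod

# Hypothetical example: a simple order form with a handful of inputs.
# Each field's distinct equivalence classes multiply together.
field_options = {
    "customer_type": 3,   # new, returning, guest
    "payment_method": 4,  # card, transfer, voucher, invoice
    "shipping": 3,        # standard, express, pickup
    "discount_code": 5,   # none, plus four code classes
    "region": 6,
    "stock_status": 3,    # in stock, back-ordered, discontinued
}

total_combinations = prod(field_options.values())
print(total_combinations)  # 3 * 4 * 3 * 5 * 6 * 3 = 3240
```

Six unremarkable fields already yield over three thousand combinations – far more than anyone could enumerate, let alone test, by hand.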

As a consequence, typical test coverage is as low as 10–20%, and the 11 test cases just mentioned achieved just 16% coverage. Negative testing is particularly neglected, leaving the majority of the system exposed to costly defects.

Maintenance remains a major pain point with manual test generation, and test automation has historically been criticized for failing to manage change.[4] Test scripts are not traceable to the static requirements from which they were derived, so testers have to manually check and update a mountain of existing tests to reflect every change made to the system.

Often, every existing test is “burned” and re-created from scratch, but then testing stands no real chance of keeping up with the rate of changing user needs. Alternatively, more and more tests are piled up in an effort to maintain coverage, but this leads to wasteful over-testing while invalid and broken tests lead to automated test failure.

“Reactive Automation” and keeping up with changing requirements

If rigorous testing is going to be achieved within the duration of a sprint, automation needs to extend beyond test execution, to include test creation and maintenance.

This can be achieved if automated tests are derived from “active” requirements: requirements stored in a mathematically precise format, such as a flowchart, from which tests can be generated directly and automatically.
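A minimal sketch of the idea, assuming a hypothetical login flow (the node names and loop bound are illustrative, not any vendor's format): model the flowchart as a directed graph, and treat every start-to-end path as one generated test case.

```python
# Hypothetical login flow modelled as a directed graph.
# Each edge is a decision outcome; every start-to-end path is a test case.
flowchart = {
    "start": ["enter_credentials"],
    "enter_credentials": ["valid", "invalid"],
    "valid": ["dashboard"],
    "invalid": ["retry", "lockout"],
    "retry": ["enter_credentials"],  # loop back, bounded by max_visits
    "lockout": ["end"],
    "dashboard": ["end"],
    "end": [],
}

def all_paths(graph, node="start", path=None, max_visits=2):
    """Enumerate start-to-end paths, revisiting any node at most max_visits times."""
    path = (path or []) + [node]
    if node == "end":
        yield path
        return
    for nxt in graph[node]:
        if path.count(nxt) < max_visits:
            yield from all_paths(graph, nxt, path, max_visits)

tests = list(all_paths(flowchart))
for t in tests:
    print(" -> ".join(t))
```

Even this toy flow yields four test paths, including the negative ones (failed login, lockout) that manual design tends to miss; changing the flowchart and re-running regenerates the suite.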

This eliminates the time wasted on manual test generation, while optimization techniques can be applied to generate the smallest set of tests needed for maximum coverage. More defects are detected earlier, and test cycles shorten as a consequence.
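One common optimization technique is greedy set cover: pick the smallest subset of candidate tests that still exercises every edge of the model. The sketch below assumes a hypothetical set of candidate test paths (the names and edges are illustrative):

```python
# Hypothetical candidate tests, each covering a set of flowchart edges.
candidate_tests = {
    "T1": {("start", "A"), ("A", "pass"), ("pass", "end")},
    "T2": {("start", "A"), ("A", "fail"), ("fail", "end")},
    "T3": {("start", "B"), ("B", "pass"), ("pass", "end")},
    "T4": {("start", "A"), ("A", "pass"), ("pass", "end")},  # duplicate of T1
}

def minimise(tests):
    """Greedily pick tests until every edge is covered."""
    uncovered = set().union(*tests.values())
    chosen = []
    while uncovered:
        # Pick the test covering the most still-uncovered edges.
        best = max(tests, key=lambda t: len(tests[t] & uncovered))
        if not tests[best] & uncovered:
            break  # remaining edges are unreachable by any test
        chosen.append(best)
        uncovered -= tests[best]
    return chosen

print(sorted(minimise(candidate_tests)))  # T4 is redundant and dropped
```

Here three of the four candidates achieve full edge coverage; the redundant duplicate is never selected, which is exactly the over-testing the optimization step is meant to prevent.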

Arguably the greatest advantage of this approach is that maintenance is automated. Because automated tests have been derived directly from the requirements, they are traceable back to them. They can therefore be updated automatically when the requirements change, with any broken or redundant tests removed automatically and any new tests needed for maximum coverage created.[4]
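In practice, automated maintenance can be as simple as regenerating the suite from the updated model and diffing it against the current one. A minimal sketch, assuming hypothetical test names for a login feature where a lockout rule is removed and SSO is added:

```python
# Suites regenerated from the requirements model, before and after a change.
old_suite = {"login_ok", "login_bad_password", "login_lockout"}
new_suite = {"login_ok", "login_bad_password", "login_sso"}

retired = old_suite - new_suite  # tests broken or invalidated by the change
created = new_suite - old_suite  # new tests needed for maximum coverage
print(sorted(retired))  # ['login_lockout']
print(sorted(created))  # ['login_sso']
```

Because every test is traceable to the model, the diff is computed rather than hunted down by hand across the existing suite.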

Missed the webinar? Watch it here.

References:

[1] Bloor Research, Automated test case generation, 2015, p. 5.

[2] Dorothy Graham & Mark Fewster, That’s No Reason to Automate!, 2009.

[3] https://communities.ca.com/community/ca-devtest-community/ca-test-data-management-test-case-optimization/blog/2016/02/24/test-case-calamity

[4] Philip Howard, Automated testing: Coping with change, Bloor Research, 2015, p. 3.

 


Tom Pryce is a Product Marketing Manager, having been a technical and content writer for…
