Is Your Software Development Ship Sinking?

August 15, 2017

Legacy testing is not a good fit for agile

Today’s application economy demands flexibility and youthful ambition coupled with sage-like industry experience and vision. The organizations surviving this new digital climate realized at some point that the approaching technological storm was not something to merely be weathered. They either needed to change course or capsize. “That’s just how we do it here” is no longer an authoritative response for rebuffing new ideas. Yet many companies are still doing requirements engineering, designing test cases, and building code the same way they have for decades: the old-fashioned (and slow) way!

My Shipwreck Adventure

Not too long ago, I joined a software development project that had started a year before my arrival. We were already over budget and out of time when I came on board. My first two meetings were extremely informative. The scope of the project was discussed at length and in great detail because no one had the original requirements engineering and project scoping documents, and no one agreed on what the end result should be! Although it looked like we hadn’t even started, upper management deemed it prudent to stop dragging our feet and force the application into production as quickly as possible.

The purpose of this application was to automate our exceedingly tedious, manual business processes. The baseline value-add, the whole reason for the endeavor, was to make life easier. But this was hardcore waterfall at its worst. To quiet the confusion and show we knew how to do requirements engineering, we quickly diagrammed the flow of the new application on a whiteboard: the basic nuts and bolts of what we were after. About six months later, it was time for acceptance testing. The code was promoted to our QA environment on a Wednesday, and the production release was scheduled for Friday. Despite the impending doom of having neither the time nor the resources to test properly, I tested away as fast as possible. Did our shiny new product execute on the nuts and bolts as it was supposed to? Yes, but not well.

After this thing hit production, the net productivity gain was negative. We had lost critical business-process functionality that was never documented, and each new defect we found spawned its own new manual workaround. Clearly, our requirements engineering efforts had failed to deliver. Our boat flooded to sea level, and it was all we could do to paddle around waist-deep for another year while more IT resources were spent bailing out water and patching holes.

The Agile Life Raft

The natural assumption is that waterfall methods served their purpose back in the day, and that agile methodologies would have saved us. This is only partly true. When implementing agile, one of the first hull breaches teams discover is still the QA cycle. Being agile lets teams course-correct their code and features more quickly, but it does nothing to shorten test case design, relieve test data shortages, or reduce manual testing, test automation scripting, or the requirements rework caused by defects and changes along the way. Agile methods allow us to work smarter, but we need the right tooling to be more efficient and keep up with our sprints.

Building a Better Ship

Let’s take a step back for a second and ask some important questions.

What are we testing? The functionality of the coded application.
Where did that come from? The developer.
Why did they code it? The business or customer requested it.
Was it coded as requested? Nope.
Why not? Because communication about the requirements broke down from the moment we started collecting and documenting them.

It might surprise you to know that more than 50 percent of defects are the result of poor requirements. So, if requirements are the issue, we need better requirements engineering and test case design tools. Typically, requirements are gathered and presented in one of two forms: too much or not enough. Either way, they struggle to convey ideas thoroughly within the limitations of the written word. Yet developers use the requirements to code, making assumptions wherever there’s ambiguity. Testers refer back to the requirements when designing their test cases, oblivious to Dev’s interpretation, and test automation engineers scour through them to begin scripting their workflows. Think back to my story. What was required for solid communication of ideas and outcomes? It wasn’t a twenty-page scoping document full of screenshots and Boolean statements written out as best one can in English. It was a whiteboard and, more importantly, a model.

If the process is modeled, there’s far less need for all of that rework, confusion, and miscommunication. Each path through your model is a test case directly tied to the requirements. Each time a change is made to the model, new test cases are generated and broken ones are repaired. Negative test cases can be designed simultaneously. Furthermore, why maintain individual automation scripts for each test case? Tie automation to the model and export the individual scripts! Imagine a tool that could do these things and operate as a single authoritative source: a reference point for the Product Owner, the BA, the Developer, the Tester, and the Automation Engineer. Watch this video to see why model-based testing is so important.
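To make the idea concrete, here is a minimal sketch in Python of what “each path through the model is a test case” means. It assumes a hypothetical login-and-submit flow; the states, actions, and transitions are illustrative inventions, not the output of any particular modeling tool:

```python
# A minimal sketch of model-based test generation.
# The flow, state names, and actions below are hypothetical examples.

from collections import defaultdict

# The model: a directed graph of application states.
# Each edge is labeled with the user action that triggers the transition.
flow = defaultdict(list)
flow["Start"].append(("enter valid credentials", "Dashboard"))
flow["Start"].append(("enter invalid credentials", "LoginError"))   # negative path
flow["LoginError"].append(("retry with valid credentials", "Dashboard"))
flow["Dashboard"].append(("submit request form", "Confirmation"))
flow["Dashboard"].append(("submit empty form", "ValidationError"))  # negative path

END_STATES = {"Confirmation", "ValidationError"}

def enumerate_paths(state, path):
    """Walk every route through the model; each complete route is one test case."""
    if state in END_STATES:
        yield path
        return
    for action, next_state in flow[state]:
        yield from enumerate_paths(next_state, path + [(action, next_state)])

# Every path through the model becomes a test case tied to the flow it exercises.
# Change the model and rerun, and the whole suite regenerates itself.
for i, path in enumerate(enumerate_paths("Start", []), start=1):
    steps = " -> ".join(f"{action} [{state}]" for action, state in path)
    print(f"Test case {i}: Start -> {steps}")
```

Running this tiny model yields four test cases, positive and negative paths alike, and adding or rerouting a single transition regenerates the suite automatically, which is exactly the maintenance burden that hand-written test cases and automation scripts can never shed.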

The technology already exists to let teams model their applications and quickly adapt to changes. This modeling technology increases application quality by eliminating requirements-based defects. Fewer defects mean lower defect-fixing costs and less time spent in rework. With fewer defects, less rework, and lower costs, applications hit the market faster, allowing revenue streams to be realized sooner. Isn’t it time to build a better software ship?

Try the technology for yourself with a 30-day trial, and instead of struggling not to capsize, let the winds of change propel your organization toward continuous delivery through continuous testing.