Practical Test Management

It sort of pains me to even have to write about #4 on my list of mandatory practices (see Tuesday’s blog):

“4. Testing of every requirement (using the RM tool to track progress)”

because I always think “how else would you test?”. But perhaps that’s because I’m originally from an aerospace background where we had a very formal set of practices that involved not only tracing every requirement to a test but also formal documentation of a Test Plan, Test Description, and Test Report.

This is what we’d expect for things like Space Shuttle software and fighter jets, but these practices probably wouldn’t pass a cost/benefit tradeoff for most commercial software.

However, there are surely some basic things you’d like to see as a result of your testing. For example, wouldn’t you like to:
#1 Know that each requirement was tested (or what’s left to be tested)?
#2 Understand how thoroughly the requirement was tested?
#3 Understand what needs to be retested after changes are made or new features are added?

#1 can be achieved by the relatively simple practice of “checking off” each requirement as it’s tested. This assumes that the requirements are written down somewhere and that there is some way for the tester to do this “check off”. Attributes associated with requirements in a formal requirements management (RM) tool make this pretty easy, and that’s a big reason I recommend a formal tool. You can do a similar thing in a word processor using annotations, or in a spreadsheet program by creating a column for this information.

The “check off” can be done by entering the testing date, and if your team has more than one tester, identifying “who tested” might also be worthwhile (and painless). By analyzing what has a “check off” and what doesn’t, you can easily determine how much more testing the product needs – very often a question asked of the QA Manager (and therefore of the Project Manager).
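
Here’s a minimal sketch of that analysis, assuming the requirements live in a hypothetical spreadsheet exported as requirements.csv with made-up columns id, description, tested_on, and tested_by; attributes in an RM tool would serve the same purpose:

```python
# Sketch only: report which requirements have been "checked off" and which
# still need testing, based on a hypothetical tested_on column.
import csv

def coverage_report(path="requirements.csv"):
    tested, untested = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # A requirement counts as checked off once a test date is recorded.
            if (row.get("tested_on") or "").strip():
                tested.append(row)
            else:
                untested.append(row)
    print(f"Tested: {len(tested)}   Remaining: {len(untested)}")
    for row in untested:
        print(f"  still to test: {row['id']} - {row['description']}")

if __name__ == "__main__":
    coverage_report()
```

Run against the spreadsheet, a few lines like these answer the “how much testing is left?” question in seconds.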

The second item above, understanding how thoroughly each requirement was tested, could be addressed by having the tester describe how he/she tested, but I’ve found that this is usually a lost cause. Testers don’t like to write down this detail. My workaround was to meet with the test team to make sure they understood the product and what the new features were supposed to do, and to ensure that they followed the basics of good testing (high/low limits, error conditions, constraints, etc.). Hopefully that last concern is eliminated when the tester is hired in the first place, but it never hurts to be certain, at least until trust is built.
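
To make “the basics” concrete, here’s a sketch of the kind of boundary and error-condition checks I have in mind, written with pytest against a made-up validate_quantity() function that is assumed to accept integers from 1 to 100:

```python
# Sketch only: boundary (high/low limit) and error-condition tests for a
# made-up validate_quantity() function.
import pytest

def validate_quantity(n):
    # Stand-in for the real function under test.
    if not isinstance(n, int):
        raise TypeError("quantity must be an integer")
    if not 1 <= n <= 100:
        raise ValueError("quantity out of range")
    return n

def test_low_and_high_limits():
    assert validate_quantity(1) == 1      # low boundary
    assert validate_quantity(100) == 100  # high boundary

def test_out_of_range_rejected():
    with pytest.raises(ValueError):
        validate_quantity(0)
    with pytest.raises(ValueError):
        validate_quantity(101)

def test_bad_type_rejected():
    with pytest.raises(TypeError):
        validate_quantity("5")
```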

Any test information that is written down (test data, test results, special cases, etc.) could be stored in files that are linked or referenced in the attribute fields (or annotations) of the requirements. I have to admit that this isn’t the most-used bunch of files, but it did come in handy once when a customer asked about the specific testing of a complicated feature.
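
As a sketch of that linking, suppose the same hypothetical requirements.csv gains an evidence column holding a semicolon-separated list of file paths:

```python
# Sketch only: look up the test artifacts recorded against one requirement,
# assuming a hypothetical "evidence" column of semicolon-separated paths.
import csv

def artifacts_for(req_id, path="requirements.csv"):
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["id"] == req_id:
                # Split the attribute into individual file references.
                return [p.strip() for p in (row.get("evidence") or "").split(";") if p.strip()]
    return []

# Example: list the files recorded against a made-up requirement id.
print(artifacts_for("REQ-042"))
```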

Regression testing, the third item above, is always a challenge. It’s sometimes hard to retest something that was thoroughly tested before. It’s usually best to create a suite of tests that gets run with each release, making this an automatic practice that is a requirement for releasing the software. The suite should be expanded as time goes on, and each release should get a review for areas that might be affected by changes and new features. This can be aided by maintaining a single requirements spec (or a spec per feature) that is updated with changes and new features, annotated with the release number, so you can see the areas affected and make regression testing plans accordingly.
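
Here’s a sketch of wiring that suite into the release step, assuming the tests live in a hypothetical regression_tests/ directory and are run with pytest:

```python
# Sketch only: run the regression suite and refuse the release step if
# anything fails. Directory name and pytest are assumptions, not a prescription.
import subprocess
import sys

def run_regression_suite(suite_dir="regression_tests"):
    # pytest returns a non-zero exit code when any test fails.
    result = subprocess.run([sys.executable, "-m", "pytest", suite_dir])
    return result.returncode == 0

if __name__ == "__main__":
    if not run_regression_suite():
        sys.exit("Regression suite failed - do not release this build.")
    print("Regression suite passed.")
```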

There’s lots more to discuss relative to testing practices (performance testing comes to mind), but that’s for another time.

Good, practical test practices usually evolve over time, but it’s important to start with some. I’ve found it very hard to introduce practices once a test team has gotten used to working without any. But testers will come and chat about modifying practices that aren’t working in order to improve them. If you start with the reason for each practice (what you want to accomplish), it’s much easier to make your case.

Anita Wotiz
www.duckpondsoftware.com (see the Consulting tab)
Instructor, UCSC Extension in Silicon Valley
Software Requirements Engineering and Management (next class Aug 2009)
Software Project Planning, Monitoring, and Management (next class Oct/Nov 2009)
