Is your Test Management system really delivering results?

August 29, 2011

If you’re reading this I expect you’ve got a test management system in place in some form or another. The question is, how much value is that system delivering for you and your team? The answers to the following five questions should give you a pretty good idea.

1. What reports do you use? Reports are a key tool to support your business process. Yes, we all get enticed by those smart and flashy 3D pie charts most solutions produce these days. Ask, though, whether the data in those charts is of any use. If you’re not using those reports on a daily, or at least weekly, basis then you’ve probably got problems.

Reports are supposed to guide your decision-making process. If you’re not using them then you’re probably basing your decisions on gut feel, instinct or guesswork. If that’s the case, you should ask why you aren’t using your reports as the basis of your decision making. Perhaps you don’t believe the information, or the information isn’t presented in the right format. Either way, if you can’t use the data you’re being provided with, you need to ask why you’re expecting everyone else to go to the trouble of entering it in the first place.
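If your tool can export raw run records, even a trivial script can turn them into a report you would actually read each morning. A minimal sketch, assuming a hypothetical list of result records exported from your tool; the field names are an assumption, so adapt them to whatever your system produces:

```python
# Minimal daily execution summary built straight from test-run records.
# The "status" field and the record shape are hypothetical examples.
from collections import Counter

def daily_summary(runs):
    """Summarise run counts per status from a list of run records."""
    counts = Counter(run["status"] for run in runs)
    total = sum(counts.values())
    lines = [f"Total runs: {total}"]
    for status in sorted(counts):
        pct = 100 * counts[status] / total
        lines.append(f"  {status:<8} {counts[status]:>4}  ({pct:.0f}%)")
    return "\n".join(lines)

runs = [
    {"id": 1, "status": "pass"},
    {"id": 2, "status": "fail"},
    {"id": 3, "status": "pass"},
    {"id": 4, "status": "blocked"},
]
print(daily_summary(runs))
```

The point isn’t the script itself: if a dozen lines of code can produce something more useful than the built-in charts, that tells you the built-in reports aren’t earning their keep.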

2. What business processes are being supported? This question has a subtle edge to it. I’m not asking which business processes the tool should be used for, but which processes it is actually supporting. If your key business processes are being supported outside of your test management tool then chances are that your tool is failing in a pretty major way.

For example, is test execution time a key factor in your planning process? If so, are you expecting your testers to enter the time taken to test in spreadsheets rather than in the test management tool? I’ve seen it happen: companies spend tens of thousands on a tool, and 12 months down the road everybody in the team is being asked to track time in a spreadsheet. This is data the system should be tracking in the first place.
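Capturing execution time against the test record itself is usually trivial to automate. A sketch under stated assumptions: the record here is a hypothetical dict, standing in for whatever record structure or API field your tool actually exposes:

```python
# Sketch: attach elapsed execution time to the test record itself,
# rather than tracking it in a side spreadsheet. The record structure
# is a hypothetical stand-in for your tool's own record or API.
import time
from contextlib import contextmanager

@contextmanager
def timed_execution(record):
    """Record elapsed execution time (in seconds) on a test record."""
    start = time.monotonic()
    try:
        yield record
    finally:
        record["execution_seconds"] = round(time.monotonic() - start, 2)

record = {"test_id": "TC-101", "status": None}
with timed_execution(record):
    # ... run the actual test steps here ...
    record["status"] = "pass"
print(record)
```

If time-per-test matters to your planning, it belongs in the record like this, where your reports can reach it.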

3. How good is the data? You don’t need to spend months analysing the data to work out if it’s any good. A quick look at a handful of records and a few carefully chosen questions to individual testers can give you a pretty good indication. If you find poor records, ask the people who entered them why they’re not up to scratch. They’ll be the first to tell you if the process for entering the information is too onerous or if the forms for entering the information are irrelevant for the purposes they serve.

Very rarely is the cause of poor data down to the end user. Poor training, poorly designed data capture forms, or a workflow implemented in the tool that doesn’t really match the real-world workflow: all of these are far more likely to be the cause. Yet once a tool starts to get a reputation for poor data quality it’s very difficult to convince others that the tool is worth using.
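That quick spot check of a handful of records can itself be scripted. A rough sketch, assuming hypothetical field names and placeholder values; adjust both to your tool’s actual schema:

```python
# Sketch of a data-quality spot check: sample a few records and flag
# empty or placeholder fields. REQUIRED and PLACEHOLDERS are
# assumptions -- swap in the fields your own process cares about.
import random

REQUIRED = ("title", "steps", "expected_result")
PLACEHOLDERS = {"", "tbd", "todo", "n/a", "-"}

def spot_check(records, sample_size=5, seed=None):
    """Return (record_id, field_name) pairs that look incomplete."""
    rng = random.Random(seed)
    sample = rng.sample(records, min(sample_size, len(records)))
    problems = []
    for rec in sample:
        for field in REQUIRED:
            value = str(rec.get(field, "")).strip().lower()
            if value in PLACEHOLDERS:
                problems.append((rec["id"], field))
    return problems

records = [
    {"id": "TC-1", "title": "Login works", "steps": "Submit valid credentials",
     "expected_result": "TBD"},
    {"id": "TC-2", "title": "", "steps": "Open the dashboard",
     "expected_result": "Dashboard shown"},
]
print(spot_check(records, sample_size=2, seed=0))
```

A check like this won’t tell you *why* the records are poor, which is why the conversation with the tester who entered them still matters.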

4. How do you train new users? All teams have to bring on new team members at some point, or engage the services of people outside of the immediate team. Either way, when you get a new user of the test management system, are they being trained properly?

Without good training you’ll get inconsistent usage. Inconsistent usage usually leads to poor data quality, and we already know that this leads to a lack of confidence in the tool. As always, poor training results in a less productive team.

5. When did you last make changes? Your business process doesn’t stand still. Your real-life processes change as the organisation changes, as customer demands change and as you improve your process. If your test management tool isn’t keeping up with this change then it isn’t providing the real value that you need.

So go back and see when you last updated the workflow models in the system. Compare the workflow models implemented in the tool with the workflow some of your test team are following at the moment. Is it a good match? Has the real-world process left the model you’ve implemented behind? Either way, if you’re not updating your test management tool, it’s probable that the tool you implemented is sliding into obsolescence.
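Part of that comparison can even be automated, if you can export the statuses actually recorded against recent test runs and compare them with what the tool’s workflow defines. A rough sketch with hypothetical status lists:

```python
# Sketch: a crude check for workflow drift -- compare the statuses
# configured in the tool's workflow against those actually appearing
# in recent records. Both lists below are hypothetical examples.
def workflow_drift(configured, observed):
    """Return (statuses in use but not modelled, modelled but unused)."""
    configured, observed = set(configured), set(observed)
    return sorted(observed - configured), sorted(configured - observed)

configured = ["draft", "ready", "in progress", "passed", "failed"]
observed = ["ready", "in progress", "passed", "failed", "deferred"]

unexpected, unused = workflow_drift(configured, observed)
print(unexpected)  # statuses testers use that the workflow doesn't model
print(unused)      # statuses the workflow models that nobody uses
```

Statuses turning up in records that the workflow never modelled, or modelled states that nobody ever uses, are both signs the real-world process has moved on.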

I’ve seen few implementations that haven’t failed on one or two of the points listed above. Yet there aren’t usually any insurmountable issues involved, which means it’s usually quick and simple to put things right.

It all boils down to having confidence in the system you’ve implemented. And that test management system needs to be current, it needs to be well used and it needs to contain quality data.