We are debating the value of software testing standards right now on the context-driven testing list.
Here's my latest post ...
>and by the same token writing test cases doesn't make
>your testing worth any more than if you wrote
>NOTHING AT ALL.
For the broad, general case, James, I agree with you.
However, (to borrow a phrase) can you imagine a situation where this is not the case?
For example - instead of two pages of MS Word documentation per test case, imagine one row in a spreadsheet, with five columns, including:
What you should do
What we expect to happen
What actually happened
Your program is a Fahrenheit-to-Celsius conversion. The requirements describe the formula and give cases for 0, 32, 212, and 100 - but don't cover bounds or rounding.
The test cases cover bounds and rounding, and the customer views them and agrees.
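To make the example concrete, here's a minimal sketch of the conversion under test in Python - the function name and the one-decimal-place rounding rule are my assumptions, since the requirements in the story leave rounding unspecified:

```python
# Hypothetical sketch: a Fahrenheit-to-Celsius conversion with an
# assumed rounding rule (one decimal place). Not from the original post.
def fahr_to_cel(fahr):
    """Convert Fahrenheit to Celsius, rounded to one decimal place."""
    return round((fahr - 32) * 5.0 / 9.0, 1)

# The cases the requirements document covers:
print(fahr_to_cel(32))   # freezing point
print(fahr_to_cel(212))  # boiling point

# A case they don't - this is where the test cases add information:
print(fahr_to_cel(0))    # exercises the rounding behavior
```

The spreadsheet rows for bounds and rounding would document exactly the behavior that last call exposes.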
In this case, test cases are a form of documentation. Heck, I wrote an article on it!
My main problem with this is that, by this side-effect logic, you aren't adding value to the forensic and investigative process of figuring out whether the software works.
In other words, your "test documentation" may help with something, but, at this point, it is not helping you test. So why call it test documentation?
Luckily, I can think of other examples. Say you have an API that does the conversion, and a test suite that looks like this:
use strict;
use Test::More tests => 3;

my ($blnOk, $msg, $convert) = FahrToCel(5001);
ok(!$blnOk, 'Limit of function is 5000');
ok($msg eq 'FahrToCel Limit Exceeded', 'And error message makes sense');

($blnOk, $msg, $convert) = FahrToCel(5000);
ok($blnOk, '5000 and under work fine');
----> These examples not only provide basic regression, they provide examples of the basic API for the maintenance programmer, and they get the easy, simple bugs out of the way so that we can focus on finding the really hard ones.
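For readers who don't speak Perl, the same suite might look like this in Python - the function name, return shape, and 5000-degree limit here mirror the Perl example above, but this Python version is my own sketch, not the original code:

```python
# Sketch of the same API and tests in Python (names are assumptions).
def fahr_to_cel(fahr):
    """Return (ok, msg, value); reject inputs above the 5000-degree limit."""
    if fahr > 5000:
        return (False, 'FahrToCel Limit Exceeded', None)
    return (True, '', round((fahr - 32) * 5.0 / 9.0, 1))

# The tests double as API documentation for the maintenance programmer:
ok, msg, value = fahr_to_cel(5001)
assert not ok, 'Limit of function is 5000'
assert msg == 'FahrToCel Limit Exceeded', 'And error message makes sense'

ok, msg, value = fahr_to_cel(5000)
assert ok, '5000 and under work fine'
```

As in the Perl version, each assertion is both a regression check and a worked example of how to call the API.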
Sadly, I have to agree with James's and Cem's comments. In the years that I have heard the mantra of "You must document your test cases," the few examples I saw had much more complexity and needless detail than the examples above, and we never automated at the API level with simple, straightforward code - meaning that the return on investment for the practice went way down.
Again, sadly, I suspect that's because the gurus had never actually, well ... done much of the stuff in the field.
And that, in a nutshell, is why I am involved in the context-driven community. :-)
Schedule and Events
March 26-29, 2012, Software Test Professionals Conference, New Orleans
July 14-15, 2012 - Test Coach Camp, San Jose, California
July 16-18, 2012 - Conference for the Association for Software Testing (CAST 2012), San Jose, California
August 2012+ - At Liberty; available. Contact me by email: Matt.Heusser@gmail.com