Schedule and Events

March 26-29, 2012, Software Test Professionals Conference, New Orleans
July 14-15, 2012 - Test Coach Camp, San Jose, California
July 16-18, 2012 - Conference for the Association for Software Testing (CAST 2012), San Jose, California
August 2012+ - At Liberty; available. Contact me by email:

Tuesday, August 11, 2009

Becoming a software testing expert ($50 USD/year paid registration) has been hosting a discussion on "Becoming a testing expert" in its forum lately. A number of the comments were insightful and interesting. I put out a short follow-up reply that I thought might be helpful to Creative Chaos readers:

I've heard it said that you can tell a newbie because they want to be told what to do. You bring them in to remodel your kitchen or write your software (or maybe test it), and they ask for a spec or maybe a test plan. When this is kinda vague, they get mad at you. This is a 'contractual' worldview.

A different worldview is that you are discovering the requirements together. The craftsman doesn't ask for a spec; instead, he asks a bunch of questions, and eventually makes a prototype: "Is this what you want?"

The first prototype is not a solution; instead, it's designed to provoke a reaction: "No, but now that I see that, I know what I really want." The game continues until the prototype is close enough to the desired functionality for work to continue.

That's how I like to approach testing - as a collaborative risk management exercise. Does that make me an expert? Not alone, and that's really for you to decide in your own mind, anyway. But what I can tell you is this: the people whining that the requirements are too vague, or that they should have been involved up front, or that they need a test plan ... well, you can probably guess what my initial response is to that kind of rhetoric.

But that's just me talking. YMMV.

This ideal lines up with my concept of the Boutique Tester, in that you have the contributor taking the test process by the horns and shaping a test strategy for each engagement. It is far from complete. What do you think?


Alex said...

I used to use that rhetoric. I whined about poor requirements and demanded testers get involved at the beginning of projects.
Well, people listened, and testers were brought in earlier. But it didn't matter. The requirements didn't get better and the testers didn't contribute. All I got were complaints from product managers that they were paying for testers without anything to test.

The problem was/is exactly what you posted -- wanting to be told what to do. I have almost an identical post on an internal blog.

The answer is to move from a "push" where you're waiting for instruction to a "pull" where you request, and look for, and investigate, and discover. And collaborate, and learn.

Ben Kelly said...

I was with you all the way up to "But what I can tell you is the people whining about the requirements are too vague, or they should have been involved up front, or they need a test plan ... well, you can probably guess what my initial response is to that kind of rhetoric."

If you have a poor attitude, or if you're ignorant of what your responsibilities as a tester are, it doesn't matter when you're involved, or what documentation exists.

Being involved early in a project can be important - I would argue that as a tester it's tough to do collaborative risk assessment if you're not involved until system integration testing (SIT) begins on a waterfall project, for instance. The same arguments can be made for whatever flavour of SDLC you prefer.

Having clear specifications can be important, as can having a test plan. Knowing why they're important and how people use them is the difference, not whether or not they ask for them.

If, as a tester, I ask about requirements documentation that doesn't exist, that alone doesn't make me a bad tester. If I down tools and refuse to work, or claim that I am prevented from working because of the lack of documentation, then a bad-tester tag might be warranted.

As testers, the onus is on us to make sure that other members of the team know what we need to help us do our job (of helping them). Being proactive isn't necessarily the hallmark of an expert tester, but I do think it is a prerequisite for being considered a competent one.

Ted said...

Hi! I have compiled a list of the top Agile blogs, and yours was included! Check it out.

You can claim your link badge at

Matthew said...

Ben, Alex -

Thank you for the thoughtful feedback. As for the idea that being involved earlier /can/ help, I certainly agree. It can make me more effective, and it's good.

But if it doesn't happen - if I am brought in at the last minute - I am not going to throw up my hands and say, "I can't test this and it's your fault."

... and, I suspect, neither would you.

Anonymous said...

I recently came across your blog and have been reading along. I thought I would leave my first comment. I don't know what to say except that I have enjoyed reading. Nice blog. I will keep visiting this blog very often.