Schedule and Events



March 26-29, 2012 - Software Test Professionals Conference, New Orleans
July 14-15, 2012 - Test Coach Camp, San Jose, California
July 16-18, 2012 - Conference of the Association for Software Testing (CAST 2012), San Jose, California
August 2012+ - At Liberty; available. Contact me by email: Matt.Heusser@gmail.com

Sunday, April 27, 2008

100% Test Automation

A recent post I put out to the Agile-Testing List:

The original poster asked:
> I have been asked to work out a system to make a 100%
> automated testing solution to suit the agile development
> process that the dev team are using.

Perhaps I don't understand what a "100% automated testing solution" means. When I hear those words, I take it to mean that every test should be able to run at the click of a button - that no test should ever involve a human setting up test data or checking the screen personally to make sure the results are "right."
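
To make that concrete, here's a rough sketch of what "runs at the click of a button" implies - a check that builds its own test data and asserts the result, so nobody has to eyeball the screen. (Python's unittest is just my choice of illustration; checkout_total and its numbers are invented for the example, not anything from the original thread.)

    import unittest

    # Invented function under test -- stands in for any feature a
    # "100% automated" suite would be expected to cover.
    def checkout_total(items, tax_rate):
        """Sum (price, quantity) line items and apply a flat tax rate."""
        subtotal = sum(price * qty for price, qty in items)
        return round(subtotal * (1 + tax_rate), 2)

    class CheckoutTotalTest(unittest.TestCase):
        def test_total_with_tax(self):
            # The test creates its own data -- no human fixture setup ...
            cart = [(19.99, 2), (5.00, 1)]
            # ... and the assertion replaces a person checking the screen.
            self.assertEqual(checkout_total(cart, tax_rate=0.06), 47.68)

    if __name__ == "__main__":
        unittest.main()

Notice there is nothing in there that can assert the screen makes sense to a human being.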

Did I get that right?

If so, has *anyone* on your entire team *ever* actually had success with such an approach? AT ALL?

How will you automate usability testing?

Later on, the author asked:
> I think that while we are all very enthusiastic
> about agile development and automation of the acceptance
> tests we are not entirely sure of the practices that
> we need to implement.

I find it helpful to remind myself that our end goal is to produce working software, not cool automation. So one technique I have used is to write up the automation as stories and let the PM prioritize those stories. That means that sometimes, the automation doesn't get written. That's ok, because the business has made a decision to not invest in the feature - a fancy word for this is "governance." On the other hand, the automation that does come out is a true project - not something you, as a tester, are expected to write on your lunch hour despite other deadlines.

Of course, everything I'm writing above is in reference to customer-facing test automation, not developer-facing. Developer-facing unit tests, in my mind, are just part of the discipline of development, and I simply expect them to happen. If they don't, I expect the devs to either live with the pain or start automating. (If the code released to QA just plain doesn't work, and the devs have no automated unit tests, then I begin a ... collegial conversation among peers ...)
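
When I say developer-facing unit tests should just happen, I mean something this small, written alongside the code in the same sitting. (Again, a sketch in Python's unittest; slugify is an invented example, not anyone's production code.)

    import unittest

    def slugify(title):
        """Turn a page title into a URL slug -- the kind of tiny
        utility a developer tests as a matter of course."""
        return "-".join(title.lower().split())

    class SlugifyTest(unittest.TestCase):
        def test_basic_title(self):
            self.assertEqual(slugify("Hello World"), "hello-world")

        def test_collapses_extra_whitespace(self):
            # split() with no arguments collapses runs of whitespace,
            # so stray spaces never leak into the slug.
            self.assertEqual(slugify("  Agile   Testing "), "agile-testing")

    if __name__ == "__main__":
        unittest.main()

No automation "project," no story on the backlog - it ships with the code.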

Regards,

--heusser
xndev.blogspot.com

1 comment:

Shrini Kulkarni said...

Matt, you are spot on!! Here I am, discussing with one of my colleagues who claims that their team has had great success automating about 98%. They also point out that this program has been running for the past 2-3 years and that the customer has put his stamp of approval on it.

Sales folks take this story and go around telling everyone that 98% automation has saved 70% of the manual testing effort ...

I still need to feel the ground under my feet ... but this is what these people claim ...

When I asked "98% of what?" they said 98% of all available test cases have been automated. Further, since all these test cases have been 100% mapped to requirements (not sure what that means), these test cases *symbolically* represent 100% of the application functionality (assuming that what is in the requirements has been implemented in the best possible way) ... In effect, they say they have been able to automate about 98% of the testing ... a tall claim. The application is a desktop application, and a pretty stable one.

As I dug into this ... I told them just one thing ...

Do you know what you will *lose* if you automate a test? In that context, what would automation to the tune of 98% mean? Only 2% human testing ...

To that they say ... "Yes ... the application is pretty stable. Automation has proved highly useful."

What do you say? What are the problems with such a large percentage of automation ... whatever the denominator may be ... 98% is HIGH and SCARY ...

What would your questions be if someone claimed 98% automation?

Shrini