We've been discussing tab-order tests on my discussion list, SW-Improve, and also the Software Testing Yahoo Group.
Elisabeth Hendrickson (or "ESH", for short) put out a post on SW-Improve saying that, while automating browser tests might be prohibitively expensive for many shops, high-functioning agile shops doing acceptance-test-driven development often face a different equation than a "traditional" shop - for them, GUI test automation may be cheaper and deliver a greater return, more quickly.
Now, ESH is one of those rare birds with a real understanding of agile development and broad, deep experience with software testing in a great variety of contexts, coupled with an actual understanding of human nature and interaction design.
So when ESH says something, I try very hard to listen. Reading her comments made me take a very hard, realistic look at my position. Along the way, I learned a thing or two.
This is my response:
For the record, I think XP-style developer-facing TDD, as per the Jeffries and Beck books, is totally awesome. This post is about browser-driving system or 'acceptance tests.'
Elisabeth Hendrickson wrote:
>So yes, if I am working in my preferred kind of context that already
>has a significant investment in automated tests, I would automate this
>test. And I would automate others like it if there are other, similar,
>expectations that we had not yet captured and articulated in automated
>tests.
If we frame the problem in the context of acceptance-tests - things the customer actually cares enough to articulate - I think a test for tab order might make sense. For example, if we are taking some sort of green-screen application that the data-entry people zip through and putting it on the web, we darn well better make sure not to require the operator to take his hand off the keyboard and onto the mouse between fields. (Yes, we might be able to automate the whole thing; different post.)
If we even bring it up and the customer says "Yes, add that to the acceptance tests", we might want to automate it.
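To make that concrete: a browser-driving tool would send Tab keystrokes and check which element has focus, but the ordering logic underneath is simple enough to sketch on its own. Here is a minimal sketch of computing the expected tab order from a form described as (field name, tabindex) pairs in document order - the field names are hypothetical, not from any real application:

```python
def tab_order(fields):
    """Compute the expected keyboard tab order for a form.

    fields: list of (name, tabindex) tuples in document order;
    a tabindex of None means the attribute is unset.
    Per HTML focus rules: positive tabindex values come first
    (ascending, ties in document order), then tabindex 0 or unset
    in document order; negative tabindex is skipped entirely.
    """
    positive = [f for f in fields if f[1] is not None and f[1] > 0]
    positive.sort(key=lambda f: f[1])  # list.sort is stable, so ties keep document order
    default = [f for f in fields if f[1] is None or f[1] == 0]
    return [name for name, _ in positive + default]


# A data-entry form where markup order differs from the intended entry order:
form = [("city", None), ("last", 2), ("first", 1), ("decoration", -1), ("zip", 0)]
print(tab_order(form))  # ['first', 'last', 'city', 'zip']
```

An automated acceptance test would then drive the real browser, press Tab repeatedly, and assert that the observed focus sequence matches this expected order.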
At the same time, I assert that I could find enough of these "quick tests", that everyone agrees "should" be automated, that the cost of writing the test automation grows to 3-5x the cost of developing the code itself. I have empirical evidence from Google that can back this up, and proposed a session at Agile 2009 to discuss it.
This is Heusser's first law of software testing: The better the tester you are, the more test ideas you have - but the Good Lord doesn't give you any more time. I have found that people who want to automate every test they can think of simply cannot think of as many tests as I can.
So, I /might/ do it if the customer is willing to add it to the short list of 'automated specifications', or browser-driving tests. Of course, being Agile, we want this list to lead to working software, /not/ to be a comprehensive list.
Many people believe that automated tests, once written, are "free". I've worked in shops - one in particular, a best-in-class, high-functioning agile shop - that followed this advice and now have a very large number of tests - many thousands of automated tests. Unless the GUI is static (in which case you're done, and there is no new information), these tests tend to create /drag/ on the project. The tests tend to break under CI when the GUI changes, and you need to invest time in bringing them back up to date.
With straight ATDD, this is really manageable. With "automate everything", it's extremely painful.
Of course, I am a member of the context-driven school of software testing, so I do not believe in - and am essentially not /allowed/ to make - the kind of claims of "best practice" that you may often hear about testing. What I wrote above is simply my experience and current opinion. In the face of different information, I might change that opinion.
regards,
--
Matthew Heusser
Friday, January 23, 2009
Wednesday, January 21, 2009
CYA's don't
John McConda posted to the Software-Testing Yahoo Group yesterday, asking if it made sense to get rid of a test suite that took two people full time to maintain - yet had never found a single bug in the six months since he had been hired.
This is my reply:
John McConda:
>I feel like the suite has been used as a parachute for
>some time, where if bugs get through to production,
>they've been able to point to it and say, "but look
>at all the tests we've run!"
Ahh, Heusser's First Law of Product Development:
"The desire to avoid blame is, if not the root of all kinds of evil, at least the cause of many a death spiral on technology projects."
Or, to quote Rich Sheridan of Menlo Innovations:
"If we ran hospitals like we ran software projects, the goal of the doctor would be - when the patient is dead, have a convincing explanation why it isn't your fault."
Or, Heusser's First Law of Product Dev stated more succinctly:
"CYA's Don't."
Wednesday, January 14, 2009
30th Anniversary of the Spreadsheet
On the Software-Testing Yahoo Group lately, we've been debating the pros and cons of W. Edwards Deming. Among that group, I am surprisingly pro-Deming. Here's a quote from his Wikipedia page that I particularly like:
Deming realized that many important things that must be managed couldn’t be measured. Both points are important. One, not everything of importance to management can be measured. And two, you must still manage those important things. Spend $20,000 training 10 people in a special skill. What's the benefit? "You'll never know," answered Deming. "You'll never be able to measure it. Why did you do it? Because you believed it would pay off. Theory." Dr. Deming is often incorrectly quoted as saying, "You can't manage what you can't measure." In fact, he stated that one of the seven deadly diseases of management is running a company on visible figures alone.
Now that is not something you hear every day when you talk to a management "guru."
In that spirit, John Dvorak just published a humorous and, I hope, tongue-in-cheek critique of that ultimate management-by-the-numbers tool: The Spreadsheet. Yes, the spreadsheet, where you model the problem mathematically and it pops an answer out.
According to Dvorak, it's done more harm than good. Like I said, I think his column is intended to be at least slightly humorous - so please take it with a grain of salt - but also, please, take a look.
Monday, January 12, 2009
January ST&Pedia in the mail (and, ah, intarwebs)
The January issue of Software Test&Performance Magazine is out. You can download it now (for free); our column, which covers Application Lifecycle Management tools and Service-Oriented Architectures, appears on page 28.
Friday, January 09, 2009
This is how we do it ...
Adina Levin, our VP of Products, presented "how we develop software at Socialtext" yesterday at a meeting of the Silicon Valley Product Management Association.
Slides:
UPDATE: I'm trying to add more multimedia to Creative Chaos, but keeping it in the < 10 minute range. Adina's presentation is 17 minutes, which will be the exception.
What do you think? More Multimedia? Less? Was this good? I appreciate your feedback.
Wednesday, January 07, 2009
Black-belt testing challenge - II
(A hint)
The challenge involves watching a video that introduces a simple web-based application, and describes a testing strategy - then identifying your own testing strategy for the same web-based app, and defending your answer. You can email me to join the challenge (matt.heusser@gmail.com) or leave non-revealing questions or comments on the blog itself.
Tuesday, January 06, 2009
Black-belt testing challenge
I've got a black-belt thinking analysis problem I am working on. It is non-trivial; your expected involvement would be on the order of 1.5 hours. At this point, /I/ am not 100% certain that /I/ have confidence in my own answer to the challenge.
If you are interested, drop me an email: matt.heusser@gmail.com
James Bach on Test Improvement
James is one of the best-known skilled testers in the world - and he can explain his ideas well. Those three things are key:
1) He's made an effort to be heard above the din ("well known"). That means he is involved in the community. A tester who isn't involved in the community may be doing great work and have great ideas, but others won't get any benefit from that experience.
2) He's skilled. I think that one goes without saying. Unskilled people who are well known waste your time at best, and can lead you toward bad testing at worst.
3) He can explain his ideas. It has been my experience that whenever you try to say anything above the fray, it is very easy to have your ideas misunderstood, misapplied, and attacked. You've got to be able to defend what you stand for, fight for it, articulate it well, and deal with attacks with grace and aplomb.
In our little world of testing, James is one to learn from. Now watch the video!
If you want more, I'd suggest his blog or the book Lessons Learned in Software Testing, which James co-authored with Cem Kaner and Brett "The Guy with the Cowboy Hat" Pettichord.
Monday, January 05, 2009
New Blog Up
Happy New Year! It's time to try something new!
I've decided to try a little experiment and split my blog, based on my interests. Creative Chaos will remain an exploration of dynamics in development, with a focus on testing and systems thinking.
My new blog, The Craft Of Software will be about how we approach our work - how we learn, how we improve, how we demonstrate expertise and mastery in all technical disciplines - Development, Project Management, and Testing. Crafting Software will probably have a little bit more of an agile and development feel.
You can pick which one to read, based on your interests, or hopefully, read both. :-)