I just got my copy of How We Test Software at Microsoft in the mail. Weighing in at 420 pages, the book will take a while to digest.
One of the more ... interesting things about the Microsoft test culture is the insistence that the Software Development Engineer in Test, or SDET, be a fully qualified developer. This has caused some degree of confusion; Developer-Types and Agile Advocates say things like "All Microsoft testers write automation all the time" or "All tests should be automated" or "Microsoft views testing as an automation activity." (Don't believe me? I was challenged on it in an interview just last week.)
So here's what the Microsoft guys have to say, straight from the horse's mouth:
The concept of hiring a software engineer with a passion for testing is powerful and is the biggest differentiator between the Microsoft approach to software testing and the typical industry approach. The most common conclusion drawn is that we hire these "coders" for test positions because we want to automate everything and eliminate manual testing. Although we do want testers who can write effective automation, that's only a small part of the equation. Testers who understand programming concepts and computer architecture typically have the analysis skills that testing requires. They can also find bugs earlier and understand the root cause, so that they can quickly find other similar bugs and implement early detection. This strong grounding in computer science - the same grounding a developer has - reinforces the tester's skills and gives us a more dynamic and flexible workforce of testers. - Page 23
In my words: an SDET who is at least a qualified entry-level developer will understand things like signed/unsigned errors and buffer overflows, and will be able to test tools like compilers and linkers far more effectively.
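To make that concrete, here is a minimal C sketch - my own example, not one from the book - of the kind of signed/unsigned error such a tester knows to hunt for:

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        const char *input = "";
        /* strlen() returns size_t, which is unsigned: the subtraction
           below can never go negative. On empty input, 0 - 1 wraps
           around to SIZE_MAX, a huge positive number. */
        if (strlen(input) - 1 > 10)       /* true when input has length 0 */
            printf("input too long?!\n"); /* fires on EMPTY input */
        else
            printf("input ok\n");
        return 0;
    }

A tester without that grounding reports "fails on empty input" and stops. A tester who can read the code spots the unsigned wraparound, names the root cause, and knows every other length check in the codebase deserves the same scrutiny - exactly the "find other similar bugs" effect the authors describe.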
By making that tradeoff, Microsoft gets certain benefits and also, thanks to the law of supply and demand, pays a little more for its testers. I suppose I don't have a problem with a company that makes compilers and linkers wanting to make that distinction.
But let's not jump to the conclusion that all tests at Microsoft run unattended at the push of a button. In fact, on page 220, the authors give a guide to how they decide whether or not to automate. The biggest problem they list with test automation - or "unattended test execution," as I'd call it - is that it can report false errors. These error messages are usually the result of a configuration change or a problem with the test set-up. Those tests need to be re-run, observed, the problem tracked down ... and all of this takes time and attention.
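One cheap defense - my sketch, not a recipe from the book - is to have the harness check its own preconditions up front and report setup problems distinctly from product failures, so a red run at least tells you which kind of time sink you're in for. The environment variable here is hypothetical:

    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        /* TEST_SERVER is a made-up precondition, purely for illustration. */
        const char *server = getenv("TEST_SERVER");
        if (server == NULL) {
            fprintf(stderr, "SETUP ERROR (not a product bug): "
                            "TEST_SERVER is not set\n");
            return 2; /* distinct exit code: setup failure, not a test failure */
        }
        printf("running tests against %s\n", server);
        /* ... the actual automated checks would run here ... */
        return 0;
    }

It doesn't eliminate the time and attention, but it keeps one broken configuration from masquerading as a pile of product bugs.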
Wednesday, February 18, 2009
Time and Attention
The agile movement is beginning to recognize the importance of tech debt, the value of exploratory testing, how distributed development can be done in an Agile way, and the importance of a balanced breakfast - er, a balanced approach - to your testing.
Not only that, the agile conference is in Chicago and I could actually afford to go. I put in four proposals and was super-excited - this was going to be my year!
Well ... that was on top of the blogging. And the day job at Socialtext. And teaching nights at Calvin College. And the monthly column for Software Test and Performance Magazine. And contributing a chapter to "Beautiful Testing". And speaking at the occasional users' group. And obligations in the real world, like coaching soccer and being a father and husband.
So, after much careful thought, I have pulled my proposals from Agile2009 - I simply don't have time to give it my full attention, and anything less would be, well, something less than excellent. As the guy who started a conference on software excellence, I didn't find that idea very appealing.
I do, however, have one more challenge, if anybody is interested. Jim Shore has his test of agile maturity here. The challenge: think of at least two situations where the "mature" answer might actually be the /wrong/, or "less mature", answer.
UPDATE: Fixed the bad link.
Wednesday, February 11, 2009
February ST&P is out!
Chris McMahon and I cover Service-Oriented Architectures in this issue of STPedia, which, yes, you can download for free. Our column appears on page 18.
Tuesday, February 10, 2009
Metrics
Michael Bolton just did a wonderful blog post that summarizes his position on metrics - it is very close to my own. I highly recommend you not only read it, but also start emailing it to executives whenever they kick off a metrics initiative.
I am completely serious.
Monday, February 09, 2009
Black-Belt Testing Challenge - III
About a month ago I threw out a black-belt testing challenge. This one was a little more involved than past challenges - I designed it both to take a little more work to do and a little more work to get involved in.
Instead of posting a simple "here's a salt shaker -- test it!" on the blog, I asked participants to email me. That made the first test itself a test of initiative.
The next test was one of commitment: readers had to watch a one-hour video that introduced a piece of software and offered a test strategy. Then they had to critique that strategy, invent their own, and defend it over email.
It was not an easy test.
Now, my colleague Chris McMahon has said that there are many wrong ways to test software, and nearly as many "right" ones, and he's correct. I believe test strategy is much more often a question of better or worse than of right and wrong. I do not expect participants to come to the same answer I would; I am more concerned that their strategy be capable of (A) rapid assessment and (B) finding a wide range of bug categories, and (C) that they stand behind it as personally responsible professionals.
Now, some people don't want to invest two or three hours in a test strategy that isn't real. Others don't feel they need to 'prove' anything, and that's fine. I've also had several inquirers who didn't finish the process, and there's nothing wrong with that.
At the same time, we just had our first challenger stand up, define a strategy and defend it well. If you don't know Markus Gaertner, you might want to get to know him. He has some fine ideas.
And if you think you've got what it takes to wear the black belt - or you don't, and you just want some free testing training from old man Heusser - feel free to email me: Matt.Heusser@gmail.com.
The test is still open. There's time for one more. Will it be you?
Wednesday, February 04, 2009
Context or what?
There have been a lot of conversations on Twitter recently about context - specifically, about the context-driven school of software testing, which says that the way we do our work should be (strongly) influenced by the business problem we are currently living in.
I went into some detail on this earlier in the week in an interview with the Elegant Coders.
Two days later, I read Ron Jeffries's post "Context My Foot."
Now Ron is a very smart guy, and when he and I disagree, I start to wonder what's really going on.
Here's what I see as Ron's objection: You say your team wants to do Extreme Programming, but you have business analysts who write requirements documents. You can't fire them - they are your context. So you don't do story cards; you do written documents instead.
And your executives are used to getting estimates of the exact day the software will be done, with all features. And you can't change them - why, that's the business context!
And you've got this one Vice President of Engineering who really hates pair programming, and the HR department won't let you move the cubicles around to create an open environment ...
etc, etc, etc. Rinse and repeat. Eventually, you've made so many compromises that you aren't really doing agile at all.
But you're context-driven, right?
Well, uh ... no.
When I speak of the business domain, I do not mean the fact that a certain vice president is stuck in his ways. Those might be impediments - they may even be context - but they are accidental.
When I talk about context, I mean the essence of the business problem. The essence of the problem for the Xbox 360 team at Microsoft is different from the essence for people developing embedded systems at Boeing.
Don't believe Matt Heusser? How about Michael Porter, professor of business at Harvard University? His book, Competitive Strategy: Techniques for Analyzing Industries and Competitors, introduces a 'competitive forces' model for examining a business.
It is not light reading, but the basic argument is that to improve the business, you need to examine and adapt to your competitive environment. Instead of a simple prescription of "best practices", Porter gives his readers tools to figure out for themselves the right thing to do.
The book is about essential business context - not accidental. I've read it twice now, and I cannot recall a single instance where Porter talks about giving up on a strategy because the HR department doesn't like the idea.
So I remain context-driven. For nearly any practice, I can come up with cases where I might not want to apply it. And, at the same time, Ron Jeffries does have a point - some appeals to context are just whining.
Our challenge is to know the difference.
Tuesday, February 03, 2009
Laws of Software Development
Gerald M. Weinberg has two books, Secrets of Consulting and More Secrets of Consulting, packed with guides, rules of thumb, bits of advice, rubrics, and so on. I found these immensely helpful. At the same time, I am reluctant to use the term 'law' (as in Newton's laws), because 'law' implies that the idea is /always right/. Still, there are a few things I have seen enough times in a row to classify as dang strong bits of advice(*), and a "law of software testing" has a nice ring to it.
So, after years of doing test automation, speaking at large conferences about it, consulting on it, running events, and getting a broad and deep consensus from the community, here is Heusser's last law of test automation:
"Despite all your automation awesomeness, with you automated build and build verification hooked into your CI and FITnesse and sellenium and automatic deploy and deployment-notification emails ... before you send those notification emails to that distribution list, you probably want to have a human being poke and it and make sure it's ok.
Just trust me on this one. Or don't; your call."
That's my last law. I suppose I should go find the first few. :-)
--heusser
(*) - Yes, I am a card-carrying member of the context-driven school of software testing (www.context-driven-testing.com) - so I am part of a community that essentially censures the term "best practice." For nearly any "best practice", I can find examples where you might /not/ want to do it. And when it comes to finding absolute things that are always true on every project, the few I can come up with are in the area of "brush your teeth and wear deodorant."
As for this latest bru-ha-ha on the intarwebs about context-driven testing?
That's tomorrow's post, bub.
Monday, February 02, 2009
Agile Testing Preview
If you'd like to hear a basic seat-of-the-pants overview of my current thinking on agile testing, you might want to check out my interview with David Starr on his podcast "Elegant Code." He just put the interview up on the web today. The audio quality is a little choppy in parts, but I'm more interested in your feedback on the quality of the ideas.
You can download the MP3 directly here.
If you noticed that the content of the interview comes very close to matching my Agile 2009 proposals, well, gee, that's interesting, isn't it?