This went out over SW-IMPROVE, my software discussion list. I will post my reply tomorrow ...
I heartily agree that the first two items are anathema to effective software development.
However, I see (with significant disclaimers) some benefit in the last two. When you say "hand off to the next person in the chain," that presumes a serial relationship between workers who, assembly-line fashion, perform tasks that incrementally transform raw materials into some manufactured good. This might work if software were a manufactured good; as an approach to software development, it's remarkably stupid. Rivets and holes and the like in a manufactured good are standardized, subject to no uncertainty or renegotiation at the point of production. But if you and I work together on two parts of a software system, we may start with a rough interface between our stuff, and then we'll be working out the details between us.
I assert the value of the "artifacts that are intermediate work products" to record the AS-BUILT connections and theory of operation of each piece. A few reasons come to mind. First, some poor slob is going to have to maintain this mess. It's nice to have a written starting point to help that guy get into the "Zen" of the code, especially when that poor slob is either of us, six months after our initial development push.
Second, suppose one of us gets hit by a truck or leaves the company. The only true reality of a software solution is the object code, but when you look at the source code, you easily lose the forest for the trees.
After a software system has existed for a few years, someone gets it into his head to "rewrite this mess, and get it right this time." I have fingerprints on three generations of software that solve similar, overlapping problems. Would you believe I've seen similar bug reports issued against each? Each time, we ran into the same issues, issues inherent to the problem being solved. What often looks like cruft in the code is bug fixes. Those crufty-looking bug fixes reflect difficult parts of the problem being solved, or limitations of the initial design. The fact that they're bug fixes indicates that the requirements and design were inadequate at those points. I have often heard "build the new one just like the old one, but..." If I can meaningfully study the old one's requirements and design while aware of the bugs in that implementation, I can write more complete requirements and a more complete design.
It's my opinion that the most enduring artifact of any software development effort is the requirements discovery. I have picked up the requirements for a DOS system written in the 1980s and used them to directly identify test cases for a three-generations-later rewrite. Thus, when we discovered that we had no decent requirements statement for a project (written in the early 1990s), I told the woman tasked with producing one, "Managers come and go. (I looked at my boss.) Programmers come and go. (I pointed to myself.) But the single most important thing you will do in your entire tenure with this company will be this requirements statement."
This happened in the context of an existing code base that had been making the company money for over 15 years. Making the new one exactly like the old one would not work, because in trying to do so, we found differences that reflected undeniable errors I had made in the early '90s. Back then, a fellow who has since left the company and I sweated bullets over the requirements, coming up with nod-and-a-handshake agreements at each point. We didn't write them down; I just coded it that way. Thus, when I did the rewrite this spring, I had nothing to go by except code known to have bugs and fading memories. My fading memories are a sucky basis for establishing requirements, and even less would have been available to the company had I died of cancer five years ago. Conversely, if we had documented what we agreed to back then, we'd have saved the company a LOT of rework when the new implementation introduced variations that ignored unstated business rules.
With this in mind, I happily assert: "if you didn't *record* it, it didn't happen." I know you like to satisfy documentation requirements "creatively," so I'll gladly accept VHS (or Betamax) tapes of conversations between principals where requirements are discovered and concomitant decisions are made. Similarly, I'm cool with the meeting minutes of design meetings and reviews.
Schedule and Events
March 26-29, 2012, Software Test Professionals Conference, New Orleans
July 14-15, 2012 - Test Coach Camp, San Jose, California
July 16-18, 2012 - Conference for the Association for Software Testing (CAST 2012), San Jose, California
August 2012+ - At Liberty; available. Contact me by email: Matt.Heusser@gmail.com