Schedule and Events
Tuesday, October 31, 2006
1) "All things to all people" - The Marketing-Driven Approach (Also known as the "checklist" approach)
2) "Sense and Respond" - The Microsoft Approach
Sense and Respond is a very real thing. In fact, it's covered in pages 120-126 of Information Systems Management in Practice, 6th Edition. You can buy the seventh edition from Amazon here.
My problem is that Sense and Respond is still a perfection-driven, do-everything, be-the-best approach. It is a modern approach; it is a better mousetrap.
So, let's look at the iPod. Keep in mind, there were MP3 Players already on the market before the iPod. They had some software. Why didn't they sell?
A) The Big Checklist was a problem. The "add features fast" theory led to features that were not well thought out, no one would use, and cluttered up the user interface.
B) That approach appeals to early adopters and the tech savvy. There was a vast, untapped market that just wanted to listen to music, just like they used the VCR to watch movies. Keep in mind that the clock still blinks 12:00 in many households.
C) The vast multitude didn't want to take pictures, record audio, and make phone calls ... They just wanted to listen to Britney Spears. To do that, they had to "rip" music from the CDs they owned, and the MP3 players rarely shipped with software that was useful. (After all, the companies thought "We make hardware, not software.")
Enter Apple. Enter the iPod.
The first thing I find interesting about the iPod is that it does one thing (music), and does it well.
It's the "one check box" approach. Check the box, but check it so well that you create barriers to entry. Heck, get a patent.
In fact, to make the iPod, Apple took things that are on the checklist of any MP3 player that no one even thinks about (like the ability to switch batteries) and took them off.
That's right. You can't switch batteries on your iPod. What's more, you probably don't care because it's so fruity-good.
Under this model, usability and simplicity are king. So Apple made iTunes, and made it really easy to transfer music from CD to PC and MP3 player, or to buy music online. Apple then let an entire line of partners make tools - armbands, radio transmitters, and car chargers. Each partner either does an awesome job at one thing, or, well ... Probably loses the market.
Simplicity. Usability. Do one thing and do it well. Make sure that your one thing integrates well with the world around you.
Here's my argument: These ideas are inherently post-modern. Over the next ten years, in the competitive market of ideas, I think they are going to win.
Once I find that I have read a hundred thousand words of fiction inside a week, in my spare time, because the writing is just so good that I gobble it up, I tend to listen to what the person has to say.
And I don't have a lot of spare time, either. :-)
So in that spirit, I just read a post on presentations called "Please stop giving (cruddy) presentations" by David Rogers. Check it out here.
Granted, it's a trade magazine, which means it is paid for by adverts, which means reader beware, but there is some good information in it, and each issue seems to have a little bit more for me to dig into. This month (Free PDF here) Scott Barber has a column on a company with a very different software strategy.
Where most companies poll and wine and dine the 'decision makers' with the purse-strings, this company actually interviewed and tried to understand the users of its software, to try to build the best product - instead of the product that was easiest to sell. They actually paid the travel expenses for the user group, had an advisory board that actually mattered, and, according to Scott, built the best product they could.
While I was reading the article, I kept noticing the time-and-energy pains that the company took to "get it right." I kept asking myself "Who has the resource - time, people, staff, to do this kind of in-depth interviewing about a software product?"
I thought that it had to be Oracle, HP/Mercury, Microsoft, audacity, or another, similar company.
And, of course, it was Microsoft.
Why does this matter? Because this is a cultural shift in software strategy. The dominant strategy of the 1980's and 90's was one that I like to call the "checklist" strategy.
Under the checklist, company X compiles a list of features that their product offers vs. the competition. Then the product manager meets with the manager of software development and insists that company X "has" to have all of the features the competition has but company X does not.
Then the coders write code like heck for 6 months to a year. At this point, hopefully, the marketing or product manager can make the big grid.
You've seen the big grid - it's a collection of check-boxes. Company X has all of them. The competition has ... some of them. The sales manager creates a glossy brochure that prominently features the big grid and goes and sells thousands of units.
The big grid is designed to make the product easy to sell - see how well it stacks up against the competition? The problem is that it does not indicate if the features were implemented well, if they are useful, or even helpful to the people that will buy the tool.
In fact, the "squeeze in as many features as you can" mentality almost guarantees that the features will be junk.
My perfect example is the cell phone: Remember in 1999, when every cell phone was advertised as running Java? Could your sales person even tell you what Java was any good for? Probably not, but he knew that you had to have it, even if your interface was 10x10 characters of ASCII text.
Today the big checklist is still in full swing with cell phones - witness text messaging, camera-phone, video-phone, or internet-phone. A very large percentage of the population just wants to use cell phones as, well, phones.
My argument is that the shift needs to go further - someone could actually make a big pile of cash by selling a cell phone that was actually easy to use as a phone.
This total shift? Another company is doing it, with another product, called the iPod. More about that tomorrow.
Monday, October 30, 2006
As always, Mike impresses me with his ability to deal with people issues (and even his own issues) with grace. Color me impressed.
A few months back I found Jonathan Coulton, a software developer turned musician. One of his songs, Code Monkey, is a personal favorite on my MP3 playlist. It's also available for free download. If you want to understand Jonathan's motivations, you can read this blog entry.
Friday, October 27, 2006
I think it's great, and I've added his site to my blogroll.
My take on Agile is a little different than most. I don't think you *have* to have everyone in the same room, 100% customer availability, 100% pair programming, a big set of index cards, and so on.
Yes, in general, if an organization uses the practices well, I think they will be better off than with heavyweight methods. Still, in my book, those things are outward appearances - physical manifestations of an inner change in values. In other words, hopefully they demonstrate that the organization is choosing quick iterations and high-volume communication methods because it follows the principles and values of the agile manifesto.
Then again, it's possible that the organization is just practicing Cargo Cult Software Engineering.
In my mind, Agile isn't "Yes", or "No", it is "More Or Less." More important to me than practices 2, 3, and 4 on the Extreme Programming checklist are questions like "Does this organization respond to change? Can they make the tough decisions and live with the consequences?"
To me, the little poster and note that Charlie put up is one of the more mature decisions any individual technical contributor can make; and I'm not just saying that because I have one in my cube that is signed "HoyZa." :-)
I was especially impressed by the out-of-town people who came, in two ways: First, many of them assumed we were an old, established event (like PNSQC), and second, I got lots of compliments on organization.
(What those people didn't know is that I can't even keep my desk clean at work. :-)
This leads me to three insights about organization and events ...
1) To re-phrase Clarke's Third Law, "Motivation, interest, desire, and pride, sufficiently advanced, is indistinguishable from good organization."
or, more likely ...
2) If you lack organization skills but can surround yourself with people who have them, there is no effective difference. (Really, all of the conference volunteers were top flight; we compensated very well for each other)
and, also likely, to paraphrase Jerry Weinberg
3) There are very few new ideas. The best way to get "new" ideas is to either cross-pollinate old ideas, incrementally improve existing ideas, or, well ... steal from the best.
Thank you PNSQC, thank you STAREast, thank you IQAA, thank you OSCon ...
Wednesday, October 25, 2006
Post-Modern Software Development is in its infancy. It’s inspired by a talk Larry Wall gave at the Perl conference, by Context-Driven Software Testing, and by conversations I have had with Jon Kohl and Mike Kelly.
On my way back from the IQAA Conference, I listened to a podcast interview of Jason Fried (of Basecamp fame) that basically defined post-modern development.
The twelve-or-so principles of the Post-Modern School of Software Development:
1) Analysis, Requirements, Design, Coding, Test, and Deploying are not phases, they are activities.
2) Analysis, Requirements, Design, Coding, Testing and Deploying are not roles or jobs either, they are activities. (But we already said that)
3) If anything changes, you end up implementing all activities throughout the project. By the way: In business software development, stuff changes. This means that there are basically two ‘phases’ to development: Pre-Release and Maintenance.
4) Maintenance is a lot bigger than the academic success literature implies. To support maintenance projects, get good at regression testing. To do short cycles and quick releases, you probably need test automation of some type.
5) What the customer wants is going to change. Get used to it; make sure your process has a way of dealing with it.
6) Consistency is fine as long as it is helpful. We are not overly concerned with enforcing consistency when it is not helpful.
7) If what we used to call 'phases' are really 'activities', then knowing how to do more than one activity is probably helpful.
8) Job titles are fine, as long as they are helpful, not excuses.
9) The industry has spent decades trying to refine job descriptions to make them reflect reality, yet I have never seen an accurate job description. Ever. Why bother?
10) Simple and working (generally) beats perfect and in development.
11) We use other principles (classic, baroque, romantic, modern) when they are helpful
12) Vision is important. We’ve all seen the MP3 Players with 59 features that no one can figure out how to use, and the iPod that just plays music and speech. Do one thing, and do it well.
I just made these up. Eventually, I’d like to present them somewhere (perhaps IWST or GLSEC), build some consensus, and get them adopted as an actual, well, thing.
Then again, do we really need another buzzwordy manifesto-thing? Probably not. Personally, I am more interested in the discussion.
I covet your feedback.
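A postscript on principle 4: the claim that short cycles and quick releases need some kind of test automation can be made concrete with even a tiny script. Here is a minimal sketch (the `slugify` function and its cases are hypothetical, invented purely for illustration) of the sort of automated regression suite that lets a maintenance team re-check old behavior on every build instead of re-testing by hand:

```python
import re

def slugify(title):
    """Turn a page title into a URL slug -- the kind of small utility
    that quietly breaks during maintenance if nothing re-checks it."""
    slug = title.strip().lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse runs of punctuation/spaces
    return slug.strip("-")

# A tiny regression suite: each case pins down behavior that a past
# release depended on, and the whole list runs on every build.
REGRESSION_CASES = [
    ("Hello World", "hello-world"),
    ("Agile: More or Less?", "agile-more-or-less"),
    ("  --Post-Modern!!  ", "post-modern"),
]

for title, expected in REGRESSION_CASES:
    actual = slugify(title)
    assert actual == expected, f"{title!r}: expected {expected!r}, got {actual!r}"
print(f"{len(REGRESSION_CASES)} regression cases pass")
```

The point isn't the function; it's that once old behavior is pinned down in executable checks, releasing often stops being scary.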
Tuesday, October 24, 2006
Due to issues with blogger I had to repost this. Notice several newer articles appear below this post.
(Begin old post)
I recently placed an article on software testing in a magazine; it will appear in the February issue.
I think it needs work, and I wonder if you agree. Would you like a sneak peek in trade for some feedback?
A Box Of A Different Color
A short while ago I had to prepare a testing course for software developers. The development team was responsible for both unit testing and system testing, and programmed in a relatively unpopular programming language based loosely on Pascal. Searching for course materials, the information I found kept pointing back to Java and C++, but the development staff simply did not know or use those languages. In fact, the best material I could find was from a course by Dr. Cem Kaner known as Black Box Software Testing, or BBST.(*)
Next I posted on a software discussion list, actively asking people to recommend open-source materials and what they thought of using BBST materials for software developers. The answers I got back were consistently the same: "No, No, No, you can’t teach a course in Black-Box Testing, go get some material on glass-box testing", and they would recommend a course that ended up being in Java or C++. Occasionally they would recommend a book or class and say something like, "I have heard good things about this but haven’t read it" - and it would wind up being in Java or C++. The existing material would simply not be helpful to the development staff, and I kept coming back to BBST.
My next question was to ask why the BBST course was inappropriate. The typical response was "It’s black-box, genius. That’s not the right kind of testing." Others would say "Well, black-box testing is really for people who can’t see the code - it’s not really for the developers."
Both of those answers were unsatisfying to me. They were the definitions of the term, not reasons to reject the course material. One or two people mentioned that black-box testing doesn’t include coverage metrics, which is true, but it turned out that the customer had no requirements for coverage to be measured, and Dr. Kaner’s BBST course actually includes just enough information about coverage and coverage metrics to meet my needs.
So I looked at the basic skills that the Black-Box course covered: critical thinking, general systems thinking, patterns of software failure, issue isolation, and investigation skills. These are skills that any tester might want to grow, and I found a huge overlap in these skills, regardless of box color. (See Venn diagram, above right)
There were other areas of the BBST that I might want to skip. Bug Advocacy and How to Write a Good Bug Report become less important if the person who logs the bug is the one who is going to fix it - and bugs found in unit testing might not get logged at all. Still, it seemed foolish to throw out the entire course when I could just skip a couple of chapters.
This got me to think about what it takes to be a good black box tester and what it takes to be a good clear box tester.
The developers who would take my course already had the skills on the left. Because equivalence classes work just as well on functions as they do on GUIs, the developers could stand to benefit from any black-box material that would cover the skills in the middle.
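Since equivalence classes transfer directly from GUIs to functions, a developer-tester can partition a function's inputs the same way a black-box tester partitions a form field: one representative value per class, plus the boundaries between classes. A minimal sketch (the `shipping_cost` function and its price tiers are hypothetical, invented for illustration):

```python
# Hypothetical function under test: shipping cost tiered by package weight.
def shipping_cost(weight_kg):
    if weight_kg <= 0:
        raise ValueError("weight must be positive")
    if weight_kg <= 1:
        return 5.00
    if weight_kg <= 10:
        return 12.50
    return 40.00

# One representative per equivalence class, plus the class boundaries --
# exactly the partitioning you'd apply to a GUI input field.
cases = [
    (0.5, 5.00),    # class: light package
    (1.0, 5.00),    # boundary: exactly 1 kg
    (1.01, 12.50),  # boundary: just over 1 kg
    (5.0, 12.50),   # class: medium package
    (10.0, 12.50),  # boundary: exactly 10 kg
    (25.0, 40.00),  # class: heavy package
]
for weight, expected in cases:
    assert shipping_cost(weight) == expected

# Invalid class: zero or negative weight should be rejected.
try:
    shipping_cost(0)
    assert False, "expected ValueError"
except ValueError:
    pass
print("all equivalence-class cases pass")
```

Six values and one error case cover every behavior of the function; that economy is the whole point of the technique, whatever color the box is.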
If I can screen things from the far right out of the training materials and the developers already have the skills at left, then the differences between Black-Box and Clear Box Testing (or the value of distinguishing the difference) greatly decreases.
At that point, if the difference between the two is not valuable, then why are we splitting them up? On this project, splitting hairs over the color of the box was actually harmful. It caused me, and many others, to shut out potentially valuable learning opportunities that were available, and even free.
Yet the valuable things in the course are not specific techniques (like jUnit or MockObjects) but instead genuine skills that can apply to many areas of our lives. We could learn those skills from many sources, and this is one of the traditional arguments for a liberal-arts education. When it comes to evaluating courses, conferences, and books, instead of listing specific techniques, we would often be better off listing skills and asking if the course will teach those skills. In that case, "Bug Advocacy" becomes influence, and "How to write a good bug report" becomes writing skills - both of which have universal value to technology workers.
Once we have the skills, the techniques and terminology can help. After all, we the community created test terminology as a servant. Our terminology is a pattern language that allows us to express complex ideas quickly and unambiguously. I may grouse and complain a bit about 'Oracles', 'Heuristics', and 'Context', but when I use the terms, people know what I mean. With this glass/black-box discussion, we the community were becoming a servant of the terminology; we got the essence confused with the accident(**).
Speaking of liberal arts, when I was preparing for the course I asked my wife what books she would recommend to teach systems thinking. Her response was simply "Aristotle".
I would like to know: What color box is that?
(*) - http://www.testingeducation.org
(**) - For a great read, please consider Rethinking Systems Analysis and Design by Gerald Weinberg. In the book, Dr. Weinberg describes a teaching method where he has his students analyze the inputs and outputs of a physical black (mechanical) box to determine behavior. It’s a fascinating read, and the skills delivered to his students apply to a whole lot more than BBST.
In the software testing world, we have an idea of testing strategy. Two common strategies are a risk-based strategy (find all the bugs you can, or, alternatively, "find the worst bugs") and an information-based strategy (report about the status of the software). You can even download a course that talks about it.
Your strategy is the goal; it drives how you do things. Strategy helps you decide what to do, and what not to do.
So what are the strategies for software development? Most of the time, most companies have an "everything should be perfect" or "crystal ball" strategy. That means that the goal is to make reality conform to a plan that was written early in the project, based on a silly-wild-assumptive-guess (SWAG) that someone made at the very beginning. Sometimes the SWAG is really just wishful thinking.
Here are a few of the more common software strategies:
Feature-Complete Driven Development: “The software will be done when it’s done.” Id Software is famous for using this strategy successfully for Doom, Quake, and other games. Microsoft is famous for failing with this strategy for Windows.
Schedule-Driven Development: Also known as crystal-ball driven development. Under schedule-driven development, the team has to meet some date, and, when things change, the project manager has to manipulate reality to make the schedule work. This strategy generally involves wishful thinking, lots of overtime, and a team that gets what it deserves. Unless, of course, the schedule is padded to the nth degree up front. Then the team might do okay, but the company generally does poorly in the marketplace.
Date-Driven Development: This comes in two flavors. The less-disciplined version declares a date, provides a high-level vision, and thrashes and constantly insists that the date be met. Sometimes this works; sometimes the date comes, the code is an awful mess on the floor, and the project team is burnt out and exhausted. The more disciplined version is called Feature-Driven Development, and I have outlined it below.
Feature Driven Development: Also known as evolutionary development. Using this strategy, the requirements are broken into features, and each feature is implemented as a thin slice, end-to-end, before moving on to the next slice. This strategy enables the team to build a simple working system quickly, and thus hit any specific date. Thus, FDD trades off time for features. Generally, features are listed and implemented in priority order, so that if the project is late, it can still ship – with fewer features.
Audit-Driven Development: In this case, the important thing is that your project comply with some external standard. For example, all code must have traceability to requirements, or all code must be reviewed. Generally, audit-driven development requires artifacts which can be inspected to prove the code did what it was supposed to do. Sarbanes-Oxley, HIPAA, and SAS70 are three common auditing standards.
Governance-Driven Development: In this meta-strategy, the person paying for the work gets to decide what drives development, and can change her mind. Under governance-driven development, when a bug is found, the developer doesn’t fix it, because that would mean that the developer has to stop the current work. Instead, the developer defers the question to the appropriate decision maker to determine which they want more – the new feature or to see the old feature fixed. Sometimes strategy and mission statements can automate that decision, but they are written and signed by the person paying for the work.
Technology-Driven Development: When the development house (and sometimes the marketers) run the store, the important thing is that the software use interesting technology, such as XML, Tagging, RPC, SOAP, RSS, object-oriented databases, Ruby, and so on. Dates and features become less important than doing fun stuff. Technology-Driven companies often build robust, scalable, extensible infrastructure before they begin working on features, as the infrastructure will accelerate development. It generally doesn’t accelerate development much, but the coders do get to have a lot of fun.
Phase-Driven Development: Also known as Assembly-Line Driven Development, this strategy views development as a series of steps to be followed in an orderly process. Under phase-driven development, the requirements are specified completely. Then the entire project is estimated and scheduled. Then the software is designed, then coded, then tested, then deployed. Each activity is referred to as a phase; if the project is late, then testing is cut. This means that phase-driven development trades off time for quality.
The strategies above just scratch the surface of the motivations a team might have. They are important, because (A) strategies can be an extension of personality, (B) if you don’t pick your strategy, someone else will pick it for you, and (C) it is entirely possible to do what you believe is right but miss out on someone else’s hidden-agenda software strategy.
Figure out your team’s strategy, then make it explicit and work toward it. Put it on the wall on a big poster. Then, when someone says “You shipped on Oct 1st without the inverted sort square root feature”, you can point to the poster on the wall and explain that decision.
Developing a strategy, refining and communicating that strategy is an important step in team sports. In software development, unlike professional sports, it is possible for everyone to win.
So let's take the game seriously and develop a strategy.
Monday, October 23, 2006
My colleague and friend Jon Kohl just finished the excellent post The Agile Backlash.
While I agree with his sentiment, my conclusions are a little different:
1) Jon wrote: Obviously, this is behavior that is completely against the values of any Agile methodology I've come across.
It's easy to say that you are doing Extreme Programming. It's really easy to say that you are doing Scrum. The hard part is making decisions that line up with the Agile Manifesto - such as forgoing mind-numbing documentation, making the tough decision to forgo an "extensible" infrastructure, or refusing to accept the status-quo requirements document and insisting on a face-to-face meeting.
To quote Shakespeare from Henry IV ...
Glendower: "I can call spirits from the vasty deep!"
Hotspur: "Why so can I; or so can any man. But will they come when you do call for them?"
Right now, I am much less interested in companies using Agile terminology and much more interested in them making good decisions. Personally, I believe the values of the Agile Manifesto are better than the current corporate default.
2) Jon wrote: They forget to adapt and improve to changing conditions, and they just stick to the process, and try to follow the formula that worked in the past. As they get more desperate, they make wild claims promising that "Agile" will help them unseat a powerful competitor who is "waterfall", or will guarantee bug free software, improve all employee morale problems, etc.
This is really odd - and very familiar. I see a lot of "Agile" process zealots. I don't get it. One of the points of the manifesto is to focus on individuals and interactions over process and tools, and yet people keep selling methodologies and tools. I suppose that's because it's easy to sell. I intend to propose a session at the Agile Conference 2007 to discuss this.
----> Overall, here's my takeaway. To paraphrase Esther Derby "Any good idea can be implemented poorly." Personally, I think this Agile/Non-Agile discussion is divisive and damaging, for some of the same reasons Jon outlines in his article. In my discussions with people, I have dropped Agile/Non-Agile from my vocabulary. My talks are more about More-Agile or Less-Agile, and by more-or-less I don't mean adherence to a process document, but instead consciously choosing to focus on:
individuals and interactions over processes and tools
working software over comprehensive documentation
customer collaboration over contract negotiation
responding to change over following a plan
Of course, this isn't as easy as mindlessly following someone's process description; you have to think.
I wouldn't have it any other way.
Friday, October 20, 2006
When a contributor doesn’t realize this, it can be bad. The person on the other side of the table is probably a professional negotiator – an executive, sales person, or marketer. So the sales person does a great job negotiating … and the contributor ends up promising the impossible. Three months later, when the project is supposed to be "done" but isn’t … the results are pretty predictable.
The solution to this, in my mind, is to set boundaries and limits, and stick to them, consistently. Give in once, and you’ve set precedent. There are a lot of ways to do it; in "Behind Closed Doors: Secrets of Great Management" Rothman and Derby recommend saying something like "I can not commit to what I know I can not do."
The real challenge, however, isn’t what you say – it is getting the guts to say anything like that at all. So, if you want practice, I have one idea.
Some of the big box retail stores (Kohl’s, Toys R’ Us, and sometimes Radio Shack) ask for personal information that they don’t really need at checkout – like your zip code or email address. When they ask "Can I have your phone number, please?" try something new: politely say no.
Really, the person won’t hit you or yell at you. The worst that can happen is that they can try to make you feel bad – but they won’t, because they asked something unreasonable and you gave them a reasonable answer.
Try it three or four times at stores, then try it in the office. If you struggle with telling people "No", you might be surprised how far setting reasonable limits – and sticking to them – will get you.
Thursday, October 19, 2006
That presupposes that the talk was any good. You could listen for yourself and decide.
The title is "So You're Doomed" -
Here's a link to the PowerPoint (5MB)
And the Audio (45MB)
And I forgot Stephen Covey and Dale Carnegie, both of whom were very helpful to me personally.
Wednesday, October 18, 2006
After the IQAA conference, a number of people asked me how I learned presentation techniques. Typically, I would either mumble something about my experience as a military cadet or deny any skill and say something to the effect of "well, if you practice for 100 hours for a one-hour talk, you'll probably do ok."
Basically, I do this because I don't want to brag. Still, while that answer may be modest, it doesn't help anyone else improve their presentation skills ... at all.
So, at the risk of sounding a bit arrogant, I'm going to try to answer the question.
Here is my approach to speaking improvement:
(1) Recognize that I am not that good.
That's the first step. For me, for a number of years, a number of people told me that I needed to work on my "communication skills", so, well, I did. I read books, I surfed the web, I tried to learn. Eventually, I figured out that the problem wasn't communication at all --- it was a difference of *values*, and a matter of *timing*. Half of the secret of giving a good talk is having a receptive audience when they are in the right mood.
Because of #1, I sought advice. Here are a few of my favorite references:
(A) My old professor, Dean Defino once said something to the effect of "I don't read the book. You have the assignment in class to read the book; my job is to connect the dots, make it interesting, and add the things that the book misses - to make the course well-rounded."
So, with PowerPoint, I don't read the bullet points aloud.
Dr. Defino also said many times in class that "The answer to any essay question is always 'it depends; if X then 1, if Y then 2 ...'" I try less to tell people what to do and more to provide tools that help them decide for themselves what to do.
(B) Presentation Aikido - Damian Conway gave this at OSCON a few years back. I borrowed a copy of the slides from a friend of a friend. You might be able to Google them, but I haven't found them in a casual search. Aikido roughly translates as "Spirit-Way-Path-Harmony"; the talk is about how to do a 1-hour talk well. There is a good review here.
(If anybody wants to pay for shipping, I can probably get you photocopies)
(C) Presentation Judo - Mark Jason Dominus - This one is on how to do an all-day tutorial, and it is on-line here.
Note that you can click on details and get more info than just the slides.
(D) The ClueTrain Manifesto:
Scroll down to the list of 95 elements. In a nutshell, don't be a disembodied talking head. Instead, talk like a person.
(E) Remember item #1. I'm not that good.
So I get feedback from people I trust. I try to give every talk at least once, if not three times, in front of a receptive audience.
(F) If possible, I try to introduce myself to the audience before the talk starts - in a personal way. I show up 15 minutes early, introduce myself to people, and ask "So, why did you choose this track?" or "What are you looking to get out of this talk today?" -- I noticed Esther Derby do this at BetterSoftware a few years back. At worst, the audience views you as a person, not a talking head (our ClueTrain goal). At best, this allows you to customize your talk by a few sentences and really reach your audience. Singers often change words in a song to mention the local hometown or hometown team; I'm not recommending that, but if the entire audience is interested in All-Pairs testing, you can spend a few extra minutes on that slide.
(G) Know your competition. Technology conference? Expect death by PowerPoint, right? Be the contrarian.
(H) Know the difference between a gimmick and a technique. Remember I said to know that I'm not that good? So when I want to try something that's borderline, I pursue feedback well before the talk. I wanted to use a technique that I got from Tim Lister - fewer slides, no bullet points, make each slide a single big picture, and tell one story per slide. (Tim did a talk like that at ADC 2004 and also at the Better Software Conference in 2005; if you email him he'd probably send you the PDFs.) I was on the fence, so I asked several people; my wife finally convinced me that getting rid of the bullet points would be a good choice for a Friday-afternoon-burnt-out talk.
(I) You can use the fieldstone method on talks (make each slide a story) - see Weinberg on Writing: The Fieldstone Method.
(J) What do you value? I like things that are simple and holistic - viewing the entire project instead of segmenting it into parts. This is very different than the primary way of thinking of the IS Industry - we like things that are big, perfect, comprehensive, orderly, broken down into straight lines ... Enterprise-y.
Scott Ambler and Ken Schwaber both have great talks on ITConversations. Joel Spolsky, Phillip Greenspun, and Steve McConnell all have good interviews. Download them and listen to them. Here is one of my favorites.
(K) Find bad presentations, and learn what not to do.
So what's my conclusion? I was convinced that I was an incompetent speaker, and worked hard to become competent - to move my way up the abstraction chain.
The list above is just the ten or so most interesting things I learned along the way.
Right now, probably the worst thing I could do for myself is to consider myself *good* ... :-)
Monday, October 16, 2006
I just got back from the IQAA Conference.
A few years back I used to blog on use.perl.org, but Mike Kelly inspired me to start up again. Hopefully, I'll be able to share some of that follow-up here instead of a Byzantine set of emails that may or may not reach the right people. Only time will tell. In the mean time, if the things on this list are good, it's my idea; if they are bad, well, it's all Mike Kelly's fault. :-)