Schedule and Events



March 26-29, 2012 - Software Test Professionals Conference, New Orleans
July 14-15, 2012 - Test Coach Camp, San Jose, California
July 16-18, 2012 - Conference of the Association for Software Testing (CAST 2012), San Jose, California
August 2012+ - At Liberty; available. Contact me by email: Matt.Heusser@gmail.com

Thursday, March 01, 2007

A Testing Challenge

As a fan of incremental/iterative methods, I like the idea of test automation.

In theory, everything should be retested every release, but with two-week iterations, that simply is not going to happen. With test automation, we can at least have some confidence that the latest changes didn't introduce any major regression errors outside of the features obviously being changed.

So, it would be really nice to have a computer with a big button that says "test" that runs all the tests and comes back with a green (or red) light. For unit tests, I use these in spades.

The problem is customer acceptance tests. There are some tools for automating acceptance tests, most notably FIT and FitNesse.

FIT and FitNesse take a set of inputs and compare them against a set of expected outputs. To do this, they use 'fixtures' that call into the application, invoke a function, get the results, and compare them to the expected output. Since FitNesse is written in Java, it can have 'hooks' into your application -- if that app is also written in Java or a language that FitNesse supports.
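The mechanics are simple enough to sketch. Here is a minimal, hypothetical Python version of the fixture idea -- a table of inputs and expected outputs, checked row by row. The `discount` function and the table values are made up for illustration:

```python
# Minimal sketch of the FIT fixture idea: a table of inputs and
# expected outputs, checked cell by cell. All names are hypothetical.

def discount(order_total):
    """Stands in for the application code a fixture would call into."""
    return 0.10 if order_total >= 100 else 0.0

# Each row: (input, expected output) -- what a customer might write
# in a FIT/FitNesse table.
table = [
    (50, 0.0),
    (100, 0.10),
    (250, 0.10),
]

def run_fixture(rows):
    """Call the application for each row; collect any mismatches."""
    failures = [(inp, exp, discount(inp))
                for inp, exp in rows
                if discount(inp) != exp]
    return "green" if not failures else "red"

print(run_fixture(table))  # green light: every row matched
```

The real tools add table parsing and HTML reporting on top, but the core loop -- call in, compare, report green or red -- is this small.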

This can work for standalone applications; logical silos that take an input, transform it, and provide an output.

Now the challenge:

----> Lately I've been working with IS Shops, not software companies. These are organizations that support a traditional, brick-and-mortar business. Instead of producing standalone apps, these organizations are more often integrating two applications.

The software being tested isn't the app itself (that was commercial-off-the-shelf) but the data bridge between the apps.

For example, one application pulls data out of an ERP system, stores it as a flat file, and imports it into a financial system. The requirements are "Make the Financial System LOOK LIKE the ERP System for accounts A, B, and C."
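Stripped to its essence, that acceptance check is a comparison of the in-scope accounts across the two systems. A hypothetical Python sketch, where the dicts stand in for queries against the real ERP and financial systems:

```python
# Sketch of the acceptance check "make the financial system LOOK LIKE
# the ERP system for accounts A, B, and C". The dicts and balances are
# hypothetical stand-ins for queries against each real system.

erp = {"A": 1200.00, "B": 450.50, "C": 0.00, "D": 99.99}
financial = {"A": 1200.00, "B": 450.50, "C": 0.00}

accounts_in_scope = ["A", "B", "C"]  # "D" is deliberately out of scope

def mismatches(source, target, accounts):
    """Return the in-scope accounts whose balances differ between systems."""
    return [a for a in accounts if source.get(a) != target.get(a)]

print(mismatches(erp, financial, accounts_in_scope))  # [] -> systems agree
```

The hard part, of course, is not this comparison -- it's getting trustworthy reads out of both black boxes in the first place.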

Another application pulls data out of the ERP system and populates the data warehouse. A third creates a flat file which is sent over to a trading partner; the requirements are "Make sure the trading partner knows all our gold members, their member IDs, and their eligibility dates."

Think about it - the requirement is to take one set of black-box data, and import it into another black box. We can test the data file that is created, but the real proof is what the second system accepts -- or rejects.

And, no offense, but for some of these Apps, FIT isn't a very good fit.

First of all, the test databases used are refreshed every three months from production. That means that you either have to find test scenarios from live data (and hope they don't change next time you refresh) or re-enter every scenario in test every three months.

Now, take the trading partner example. The best you can do within your organization is to test the file. The interface might take three hours to run; then you grep the file for results and examine them. You'll have to write custom fixtures to do this, and your programming language isn't supported by FitNesse. Or you could write a fixture that takes a SELECT statement to count the number of rows that should appear in the file, run the interface, and compare.
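A sketch of that SELECT-count fixture, using an in-memory SQLite database as a stand-in for the real source system (table, columns, and file format are all hypothetical):

```python
# Sketch of the row-count fixture: run a SELECT to count the rows the
# interface *should* produce, then count the rows it actually wrote.
# The schema, data, and file format are hypothetical stand-ins.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE members (id INTEGER, tier TEXT)")
conn.executemany("INSERT INTO members VALUES (?, ?)",
                 [(1, "gold"), (2, "gold"), (3, "silver")])

expected = conn.execute(
    "SELECT COUNT(*) FROM members WHERE tier = 'gold'").fetchone()[0]

# Stand-in for the flat file the interface generated.
extract_lines = ["1|gold", "2|gold"]

assert len(extract_lines) == expected, "row count mismatch"
print("row counts match:", expected)
```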

Of course, a programmer is going to have to write the SELECT statement. Is it a valid acceptance test?

Or you could have the number of rows fixture be approximate - "Between 10,000 and 15,000" - customers could write this, and it guarantees that you didn't blow a join, but not much else.
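The approximate version is almost trivial to express, which is exactly its appeal -- a customer could plausibly write the range themselves. A hypothetical sketch:

```python
# Sketch of the approximate fixture: the customer supplies a plausible
# range, and the test only guarantees the count isn't wildly wrong
# (e.g. a blown join returning zero or millions of rows).

def within_range(actual, low, high):
    """True if the observed row count falls inside the customer's range."""
    return low <= actual <= high

actual_rows = 12_345  # stand-in for the generated file's row count
print(within_range(actual_rows, 10_000, 15_000))  # True -> plausible
```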

You could write code that accesses the deep guts of the application, turning it sideways to generate a single member at a time, thus speeding up the acceptance test runs to a few seconds. That's great for the feedback loop, but it's more of a unit test than an acceptance test.

You could suggest I re-write the whole thing to use web services, but that introduces testing challenges of an entirely different kind. To be frank, when I have a problem and people suggest that I re-write the whole thing without recognizing that it would present an entirely different set of challenges, it's a sign to me of naiveté.

I submit that all of these would be a significant investment in time and effort for not a whole lot of value generated.

So, I still want to write customer acceptance tests, but I'm not sure this is the right paradigm to do it. I also have a handful of tricks and techniques I have used over the years to make this easier. I will reveal them in a future post, but in the meantime, here's my challenge:

What would you suggest to solve this puzzle?

I should add that I don't think this is a trivial puzzle; at least, more than half of the people I pose it to give an answer that I believe to be unsatisfactory. Can you do better?

11 comments:

Elisabeth said...

I started to write my answer in a comment, but it got too long. So I wrote a blog post instead.

Chris McMahon said...

As far as I'm concerned, the Bible for this kind of work is Gregor Hohpe's work at http://eaipatterns.com/.

In particular, check out http://www.enterpriseintegrationpatterns.com/docs/TestDrivenEAI.pdf

I've taken two approaches to this kind of work:

First, have tests that validate that any input to the system is processed correctly in the output. For instance, check that input Field 22 appears in the filename, and input Field 87 appears in the output file header, or whatever. This approach is amenable to risk-based analysis.
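That first approach -- asserting that specific input fields survive into the output -- can be sketched like this (field names, filename, and header format are all hypothetical):

```python
# Sketch of the "input survives into output" check: pick specific input
# fields and assert each one lands where the spec says it should.
# Field names and output formats are hypothetical.

input_record = {"field_22": "ACME", "field_87": "2007-03-01"}

# Stand-ins for the artifacts the interface produced.
output_filename = "extract_ACME_001.dat"
output_header = "HDR|2007-03-01|v1"

checks = [
    input_record["field_22"] in output_filename,  # field 22 -> filename
    input_record["field_87"] in output_header,    # field 87 -> file header
]
print(all(checks))  # True when every mapping held
```

Because each check targets one field-to-location mapping, you can prioritize the mappings by risk rather than testing all of them.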

Second, account for the possible range of input data. Sometimes this means selecting records semi-randomly from the input system. Sometimes it means generating input data using semi-random methods.

Finally, run a LOT of tests. Run tests all night, run tests all week, run tests all month. Harry Robinson's work is a good reference for this sort of thing.
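The semi-random generation in the second approach might look like this in Python -- note the fixed seed, so a failing run can be reproduced (the field names and value ranges are hypothetical):

```python
# Sketch of semi-random input-data generation: randomized values, but
# seeded so any failure is reproducible. Fields are hypothetical.
import random

random.seed(42)  # fixed seed -> the same "random" batch every run

def make_member():
    """Generate one plausible-but-random input record."""
    return {
        "member_id": random.randint(100_000, 999_999),
        "tier": random.choice(["gold", "silver", "bronze"]),
        "eligible_from": f"200{random.randint(0, 7)}-0{random.randint(1, 9)}-15",
    }

batch = [make_member() for _ in range(1000)]
print(len(batch), "records generated")
```

Feed batches like this through the bridge all night, and the volume does the exploring for you.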

Anonymous said...

If the live data refreshed from production is exhaustive and you cannot find a test scenario, then you may technically have a bug in the code, but you will not hit it with the current production data (though you may hit it in the future with new production data). Now you can wonder: "if a bug exists in the code but doesn't make any noise, did the tree really fall?"

One approach - get the live data from system A, run the data transformation/import into system B in a test environment, and get users from systems A and B in a room together to discuss whether system B properly represents the data from system A. The discussion should at least address some normal scenarios and several flavors of extreme/weird scenarios.

Mallik said...

A few things to test apart from what's already mentioned:
-If A is pumping out data at a faster pace than B can take it in, there should be proper queueing logic in the bridge.
-If B suddenly goes down, is the data pumped in by A queued?
-What if the bridge itself cannot process the data at the speed A pushes it in?

-Mallik
Code Inspections

Anonymous said...

When faced with this sort of problem I usually try to reverse engineer the input files from the output files and compare. That way I can find fields and records that are missing, extra stuff, bad transformations, etc. Of course, this isn't always possible but it's a good approach if you can do it. You might have to extract information from the target system to check that the import was complete.
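The reverse-engineering idea amounts to undoing the export transformation and diffing against the original. A hypothetical Python sketch (record layout and transformation rules invented for illustration):

```python
# Sketch of the reverse-engineering check: parse the output file back
# into the input's shape and diff the two. Formats are hypothetical.

# What the source system held: member_id -> (tier, eligibility date).
input_records = {101: ("gold", "2007-01-01"),
                 102: ("silver", "2007-02-01")}

# What the interface wrote: pipe-delimited, uppercased, compact dates.
output_file = ["101|GOLD|20070101",
               "102|SILVER|20070201"]

def reverse(line):
    """Undo the export transformation to recover the input record."""
    mid, tier, date = line.split("|")
    return int(mid), (tier.lower(), f"{date[:4]}-{date[4:6]}-{date[6:]}")

recovered = dict(reverse(line) for line in output_file)
print(recovered == input_records)  # True when nothing was lost or mangled
```

A straight dict comparison surfaces missing records, extra records, and bad transformations in one shot -- exactly the failure classes the comment lists.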

Integration testing and architectural-level issues are a whole different kettle of fish and something that the agile guys neatly avoid discussing. Unit tests aren't the answer to everything.

Ben Simo said...

If I understand, you have three different processes taking data out of one system and placing it into three different systems/formats:

1) Another application
2) A database
3) A flatfile

The requirement to test is that the data in all three destination systems is the same as displayed by the source system.

Do I have this right?

Vondran Andre said...

First of all: Great Blog! You really inspired me to learn more, thank you!
Would you say that Cubes are an option to solve this?
Once again, thanks for your contribution to the testing world.
Best Regards, André Vondran


Anonymous said...

Rewrite the financial system and the ERP system. It will probably take less time, and you'll have a more stable, robust, featureful, user friendly, and testable system to boot.

I hate those "enterprise" apps. I'm surprised though that you're even talking about using SQL, since things like Siebel, Peoplesoft, etc. do their worst to keep your hands off of your data in your database.

Seriously, though, that's the biggest challenge. Documenting your acceptance criteria and pointing tests to them is about all you can do. I'm working (a few hours a week) on a FitNesse-like tool, but there isn't a really good way to do it.

Cucumber (cukes.info) is popular, but it runs into the same problem -- it's code, and a very limited grammar. I'm convinced that plain English documentation tied to good old-fashioned test scripts (either manual or automated) is the way to go.

I don't want to say it, but something like Test Director/Quality Center is where I'm headed, but using open source tools like Selenium and xUnit.