RE: XML and unit testing
- From: Leigh Dodds <firstname.lastname@example.org>
- To: xml-dev <email@example.com>
- Date: Thu, 05 Jul 2001 09:57:11 +0100
> -----Original Message-----
> From: Simon St.Laurent [mailto:firstname.lastname@example.org]
> Sent: 04 July 2001 19:30
> To: email@example.com
> Subject: XML and unit testing
> In testing a filter, I need to be able to test component pieces, some of
> which are plain old objects and some of which are representations of
> those objects as SAX events and/or XML. I've had a number of cases
> involving shallow copying (finally fixed) where the objects looked great
> at the time of their creation but morphed by the time they reached the
> XML output, so I really need to be able to test these things in a
> variety of situations.
I'm currently using JUnit for testing all my code, and am pretty pleased
with it. Admittedly I haven't been pushing the tool as far as Lars seems
to, but it hits the 80/20 point for me.
I usually follow the XP/Refactoring approach and build the tests in step
with the code. I've found that this not only has the benefit of testing your
code immediately, but also puts you firmly on the 'client' side of your
API / interfaces. This has generally led to improvements in those APIs.
So my approach to your problem would be to build separate test cases
for the individual components first. If your filter delegates much of its
work to other objects, then these can be tested in isolation from the
filter itself.
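As a minimal sketch of what such an isolated component test might look like — the "Message" class and its deep-copy method are hypothetical stand-ins for your own objects, and I've used plain assertions rather than a real JUnit TestCase so the example is self-contained:

```java
// Hypothetical component: a plain old object that will later be rendered
// as SAX events / XML. Tested here before any filter is involved.
class Message {
    private final String title;

    Message(String title) { this.title = title; }

    String getTitle() { return title; }

    // Deep copy, so later mutations elsewhere cannot "morph" this instance.
    Message copy() { return new Message(title); }
}

public class MessageTest {
    public static void main(String[] args) {
        Message original = new Message("hello");
        Message copy = original.copy();

        // The copy must be a distinct object with equal state.
        if (copy == original)
            throw new AssertionError("copy should be a new instance");
        if (!"hello".equals(copy.getTitle()))
            throw new AssertionError("copy should preserve state");

        System.out.println("MessageTest passed");
    }
}
```

Catching the shallow-copy problem at this level keeps the failure close to its cause, rather than discovering it later at the XML output stage.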
Admittedly, testing the filter itself can be trickier, particularly if you want
automated tests across a range of inputs. However, I've found that adding
utility code to the test suites helps here.
You might also want to consider Schematron for testing the output of
your application. The assertion mechanism can be used to check for
correct values and structure in the document.
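As a flavour of what that looks like — the element and attribute names here are hypothetical, and I'm using the Schematron 1.x namespace:

```xml
<schema xmlns="http://www.ascc.net/xml/schematron">
  <pattern name="Output checks">
    <rule context="order">
      <assert test="@id">An order must carry an id attribute.</assert>
      <assert test="count(item) &gt; 0">An order must contain
      at least one item.</assert>
    </rule>
  </pattern>
</schema>
```

Because the assertions are XPath expressions, this can check relationships between values and structure that a plain DTD can't express.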
Mock objects are another avenue worth exploring. These are objects
that implement your application's interfaces but don't actually do any
useful work: they may instead contain hard-coded test values, debug logging,
or additional assertions. They can then be plugged into your application
in place of the real implementation, allowing some additional kinds of tests.
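A minimal sketch of the idea — the OutputSink interface and emitGreeting() are hypothetical, standing in for whatever interface your filter writes to:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical application interface: something the filter writes output to.
interface OutputSink {
    void write(String fragment);
}

// Mock implementation: does no real work, just records calls for assertions.
class MockOutputSink implements OutputSink {
    final List<String> written = new ArrayList<String>();
    public void write(String fragment) {
        written.add(fragment);  // could also log, or assert on the fragment here
    }
}

public class MockDemo {
    // Hypothetical production code, written against the interface,
    // not the concrete implementation.
    static void emitGreeting(OutputSink sink) {
        sink.write("<greeting>hello</greeting>");
    }

    public static void main(String[] args) {
        MockOutputSink mock = new MockOutputSink();
        emitGreeting(mock);  // plug the mock in instead of the real sink

        if (!mock.written.equals(Arrays.asList("<greeting>hello</greeting>")))
            throw new AssertionError("unexpected output: " + mock.written);
        System.out.println("mock test passed");
    }
}
```

The test can then inspect exactly what the code under test did, without parsing real output or touching the filesystem.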
> So far, I've been using a set of test cases and my own eyeballs. It's
> worked pretty well as far as figuring out high-level pass/fail, but does
> very little to help me track down where the pass/fail came from.
I'd suggest that suitable logging might help here, as it sounds like your
unit test has done what it's supposed to: identify a failure.
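The kind of logging I mean is state snapshots at the interesting points — here sketched with java.util.logging, though any logging toolkit would do, and the "state" string is a hypothetical stand-in for your object's serialised form:

```java
import java.util.logging.Level;
import java.util.logging.Logger;

public class CopyTracing {
    private static final Logger log = Logger.getLogger("filter.copy");

    public static void main(String[] args) {
        // Hypothetical object state, logged at creation...
        String state = "<item id='1'/>";
        log.log(Level.FINE, "created with state {0}", state);

        // ... filter pipeline runs here ...

        // ... and again just before XML output, so a "morphed" object
        // can be traced back to the stage where it changed.
        log.log(Level.FINE, "reached XML output with state {0}", state);
        System.out.println("done");
    }
}
```

When a test then reports a failure, comparing the two snapshots tells you whether the damage happened before or inside the pipeline, which is exactly the localisation that eyeballing the final output doesn't give you.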