[Dennis Sosnoski]
>I have a hard time communicating with the hard-core text backers who
>appear to see any transformation of XML (other than gzip, which apparently
>is blessed by virtue of predating XML itself) as inherently evil.
Eh? As a hard-core text backer I'm not sure I understand you.
gzip performs a lossless compression. I use it with XML all the time. Works
great. I use it on the wire too with HTTP. I've no problem with it; why
would anyone have a problem, given that the XML is unharmed?
Here is my simpleton's guide to handling size in XML:
(1) If the size of your XML-in-situ is thought to be a problem, gzip it.
Example - OpenOffice.
(2) If the size of your XML-in-transit-on-the-wire is a problem, gzip it,
preferably transparently to either end, e.g. with HTTP compression (a quick
sketch of both (1) and (2) follows the list below).
(3) If, after trying these, the size of the XML is still seen as a problem,
then **don't use XML as your native format at all**. Instead, provide
lossless ToXML and FromXML input/output filters from your native object
model (again, a sketch follows below).
(4) Beware programmers trying to treat XML like a set of database records.
Many such programmers blame XML when they cannot load an *entire* record
set into memory. The same programmers would never contemplate loading an
entire database into memory. Approaching XML processing the wrong way (it's
just a database, right?) can lead to non sequiturs like binary XML. (A
streaming sketch follows below.)
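To make (1) and (2) concrete, here is a minimal Python sketch of both sides:
compressing an XML document at rest with gzip, and asking an HTTP server for
gzip-compressed XML over the wire. The file name, the URL, and the sample
document are placeholders of mine, not from any particular system.

import gzip
import urllib.request

# (1) XML at rest: write the document through gzip, read it back unharmed.
xml_text = "<order><item sku='123' qty='2'/></order>"
with gzip.open("order.xml.gz", "wt", encoding="utf-8") as f:
    f.write(xml_text)
with gzip.open("order.xml.gz", "rt", encoding="utf-8") as f:
    assert f.read() == xml_text   # lossless: same XML in, same XML out

# (2) XML on the wire: advertise gzip support, decompress transparently.
# (http://example.com/feed.xml is a placeholder URL.)
req = urllib.request.Request(
    "http://example.com/feed.xml",
    headers={"Accept-Encoding": "gzip"},
)
with urllib.request.urlopen(req) as resp:
    body = resp.read()
    if resp.headers.get("Content-Encoding") == "gzip":
        body = gzip.decompress(body)
    xml_from_wire = body.decode("utf-8")

The XML itself is untouched at both ends; only the bytes in storage or in
flight get smaller.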
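For (3), a hand-wavy sketch assuming a toy native object model. The Order
class and the to_xml/from_xml names are mine for illustration, not any
particular library's API; the point is only that the filters round-trip
without loss.

import xml.etree.ElementTree as ET
from dataclasses import dataclass

# Toy "native" object model; the real native format is whatever suits the app.
@dataclass
class Order:
    sku: str
    qty: int

def to_xml(order: Order) -> str:
    # Lossless export: every field of the native model appears in the XML.
    elem = ET.Element("order", sku=order.sku, qty=str(order.qty))
    return ET.tostring(elem, encoding="unicode")

def from_xml(text: str) -> Order:
    # Lossless import: rebuild the native object from the XML view.
    elem = ET.fromstring(text)
    return Order(sku=elem.get("sku"), qty=int(elem.get("qty")))

o = Order(sku="123", qty=2)
assert from_xml(to_xml(o)) == o   # round-trips with nothing lost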
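And for (4), a minimal sketch of treating a big XML record set as a stream
rather than something to be swallowed whole. The file records.xml, the
"record" element, and its "amount" attribute are placeholders.

import xml.etree.ElementTree as ET

# Stream the document one record at a time instead of building the whole tree.
total = 0
for event, elem in ET.iterparse("records.xml", events=("end",)):
    if elem.tag == "record":
        total += int(elem.get("amount", "0"))
        elem.clear()   # discard the processed record so memory stays flat
print(total)

Memory use stays roughly constant however large the document gets, which is
exactly the property the "load it all, then complain" approach gives up.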
Sean
http://seanmcgrath.blogspot.com