Jeff Lowery wrote:
> I'm sure this has been asked before, but I missed it:
>
> What can be achieved by binary XML that can't be similarly achieved
> using well-known text compression algorithms?
Do you mean gzip and friends?
If so, then the answer is "reduced complexity" and "improved
performance"! gzip is a fairly complex algorithm, easy to get wrong,
hard to understand, and it uses a lot of buffer space (especially when
compressing). Plus it's quite CPU-intensive.
Many have pointed out that gzipping XML produces a smaller file than PER
encoding alone. That is certainly true when the XML in question is more
text than tags: in general, PER represents the character data much as
XML does, so all the binary encoding gains you is the conversion of tags
and structured data into compact forms. However, gzipped PER will
generally be even smaller than gzipped XML; so if you have the spare
resources at each end to gzip everything, then using something like PER
instead of XML as the stuff you compress regains some of the time you
lose to gzipping, since there is less data to feed through the
compressor.
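For the curious, here is a rough, purely illustrative sketch of the
effect in Python. The "binary" encoding below is not real PER - it just
hand-packs the same fields with the struct module as a stand-in for any
compact, tag-free encoding - and the record layout and field values are
invented for the example:

  import gzip
  import struct

  # 200 small records with varying field values (invented example data).
  records = [(i, i % 50, 1999 + 7 * i, "customer-%d" % i) for i in range(200)]

  # Verbose, tag-heavy XML serialisation of the records.
  xml = b"".join(
      ("<order><id>%d</id><qty>%d</qty><price>%d</price>"
       "<customer>%s</customer></order>\n" % (i, q, p, c)).encode()
      for i, q, p, c in records
  )

  # Hand-packed binary stand-in: NOT real PER, just a compact fixed-layout
  # encoding (little-endian id, qty, price in cents, then the name).
  binary = b"".join(
      struct.pack("<IHI", i, q, p) + c.encode() + b"\0"
      for i, q, p, c in records
  )

  for name, data in (("xml", xml), ("binary", binary)):
      z = gzip.compress(data)
      print("%-6s raw=%6d  gzipped=%6d  ratio=%4.1f:1"
            % (name, len(data), len(z), len(data) / len(z)))

The thing to compare is the absolute gzipped sizes, not the ratios: the
XML shows the flashier ratio simply because its tags are pure redundancy
that gzip strips out, while the compact encoding never carried that
redundancy in the first place.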
So, to conclude: byte stream compressors like gzip, bzip2, and so on gain
you space efficiency in transit at the cost of time and space efficiency
at the endpoints. But byte stream compression sits at a lower level than
XML, PER, XDR, and so on - it can be applied equally well to all of
them, although formats with less redundant information in them tend to
produce smaller compressed versions (don't be fooled by the fact that
the compression ratio of highly redundant data is high - a 10:1 ratio on
a bloated encoding can still leave more bytes on the wire than 2:1 on a
compact one ;-)
ABS