John Cowan wrote:
> A variety of
> small-scale studies have shown that general-purpose compression is generally
> as good as, or better than, some scheme that knows it's compressing XML.
Err, quite the opposite. XMill beats gzip. BiM/BiX requires a schema,
but there are many ways in which a schema can be deduced, even from just
a raw document (and it can be done more intelligently than most of the
tools I've seen that deduce schema information from instances). And BiX
quite clearly beats gzip or bzip2, often by a large margin.
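To make the deduction point concrete, here is a minimal sketch of the
naive baseline: walk a single instance document and record, per element
name, the child-element sequences and attribute names actually observed,
which gives you a crude DTD-like content model. This is plain Python with
only the standard library; the script name, the file argument, and the
function name are my own illustration, not how BiM/BiX or any particular
inference tool actually works.

    import sys
    import xml.etree.ElementTree as ET
    from collections import defaultdict

    def infer_content_models(root):
        # For each element name, collect every child-element sequence
        # and every attribute name seen anywhere in this one instance.
        children = defaultdict(set)
        attributes = defaultdict(set)
        for elem in root.iter():
            children[elem.tag].add(tuple(child.tag for child in elem))
            attributes[elem.tag].update(elem.attrib)
        return children, attributes

    if __name__ == "__main__":
        tree = ET.parse(sys.argv[1])
        children, attributes = infer_content_models(tree.getroot())
        for tag in sorted(children):
            # Print observed content models, DTD-style; an element with
            # no children at all is reported as EMPTY.
            models = " | ".join(", ".join(seq) or "EMPTY"
                                for seq in sorted(children[tag]))
            line = "%s: (%s)" % (tag, models)
            attrs = " ".join(sorted(attributes[tag]))
            if attrs:
                line += "  [attrs: %s]" % attrs
            print(line)

The "more intelligently" part is what a real tool adds on top of this:
generalising the enumerated sequences into proper content models (e.g.
collapsing repeated children into a "*" particle) instead of just listing
every variant it happened to see.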
> A scheme for compressing a particular XML document type might be useful
> in extreme cases, but is probably not worth standardizing.
Techniques that are to be used by a large array of different
participants are usually worth standardizing. Binary XML is already used
in a number of broadcast-related domains (set-top box "websites",
audio-video metadata, TV guides, digital television, digital radio...)
and there is strong and growing interest from other sectors, notably in
the Web Services and mobile arenas. If by extreme you mean "extremely
limited compared to your average desktop or server" or "requiring
extremely big payloads" then you're right, but if you meant "rare" then
I'm afraid your point doesn't hold: small devices consuming XML are
crawling all over the place :)
In addition, having a single standardized way of binary-encoding XML
means that industry-specific standards organisations can stop wasting
their time creating ad hoc binary encodings for their XML data that
fall apart at the first need for change, and can use tried and shared
technology instead.
--
Robin Berjon <robin.berjon@expway.fr>
Research Engineer, Expway
7FC0 6F5F D864 EFB8 08CE 8E74 58E6 D5DB 4889 2488