On Wednesday 26 February 2003 16:46, Robin Berjon wrote:
> Alaric B. Snell wrote:
> > Binary data encodings that compare with XML seem to quote being able to
> > reduce the size of the data to 20%-50% of the original size. Happy? :-)
> Where do you get those numbers from? Results naturally vary a lot according
> to the data set, but given a sufficiently smart scheme compression of the
> structural information tends to revolve around 2% of the original size (x20
> factor) quite easily.
I was taking an admittedly conservative rough range of the numbers I've
seen aired on this list - and including the actual CDATA, for the kinds of
documents that have a lot of text in them.
But for something that's mainly elements, numbers, and dates, the compression
factor can indeed be very large.
> With some SOAP messages it can get really stupid on some requests with
> factors over 200.
Way to go :-)
> And then the rest varies according to the data/structure ratio, whether
> zlib is an option or not, whether some things can be optimised away, etc.
> but 50% would seem highly unusual to me.
That's what you'd get without gzipping (so the CDATA is transferred as-is)
for something that's 50% actual text rather than tags - verging perilously on
actual document markup rather than data interchange.
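To make the data/structure ratio point concrete, here's a rough sketch (the two documents are my own made-up illustrations, not anything measured on this list) comparing what zlib does to a tag-heavy data document versus a text-heavy one:

```python
import zlib

# Illustrative documents (invented for this example, not from the thread):
# one that is mostly tags and fixed-format values, and one that is
# mostly free text with minimal markup.
tag_heavy = b"<rows>" + b"".join(
    b"<row><id>%d</id><date>2003-02-26</date></row>" % i for i in range(100)
) + b"</rows>"

text_heavy = (
    b"<p>Binary encodings of XML trade interoperability for size and "
    b"parse speed; whether that trade is worth making depends entirely "
    b"on how much of your document is structure and how much is prose "
    b"that no clever tokenisation scheme can shrink for you.</p>"
)

def ratio(doc: bytes) -> float:
    """Compressed size as a fraction of the original size."""
    return len(zlib.compress(doc, 9)) / len(doc)

# The repetitive, structure-dominated document compresses to a tiny
# fraction of its size; the prose-dominated one barely budges.
print("tag-heavy:  %.0f%%" % (100 * ratio(tag_heavy)))
print("text-heavy: %.0f%%" % (100 * ratio(text_heavy)))
```

The repetitive element structure is where the x20-and-up factors come from; once the bulk of the bytes is free text, a general-purpose compressor is doing all the work and the ratio climbs back toward the 50% end of the range.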