OASIS Mailing List Archives
   Re: [xml-dev] Why not just zip the XML? [was the MS FUD thread]



On Nov 20, 2003, at 2:36 PM, Jeff Lowery wrote:

> Real-time stream compression algorithms certainly exist, although I
> understand their compression rates are around 50% or less. I suppose
> some applications would require better than that, though.
>

One thing that sticks with me from the binary infoset serialization 
workshop is that what really matters to users is latency; compression 
is a way people TRY to improve latency, but it seldom works well except 
for the hardware-based streaming compression built into modems, etc. 
In the experiments I'm aware of, conventional (gzip, I believe) 
compression only produced a net decrease in latency when there was 
plenty of spare processor power on the compressing side and the network 
itself was fairly slow. If the network is fast or the processor slow, 
conventional compression showed no improvement in overall latency, and 
could easily slow things down.
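A back-of-the-envelope way to see why: model end-to-end latency as 
compression time plus wire-transfer time. The sketch below is my own 
illustration, not the experiments mentioned above; the payload, link 
speeds, and the cpu_slowdown knob are all made-up assumptions.

```python
import gzip
import time

def latency_seconds(payload: bytes, bandwidth_bps: float,
                    compress: bool, cpu_slowdown: float = 1.0) -> float:
    """Crude latency model: optional compression time plus transfer
    time. cpu_slowdown > 1 stands in for a weak sender CPU.
    (Illustrative assumption, not a measured model.)"""
    if compress:
        start = time.perf_counter()
        wire_bytes = gzip.compress(payload)
        compress_time = (time.perf_counter() - start) * cpu_slowdown
    else:
        wire_bytes = payload
        compress_time = 0.0
    transfer_time = len(wire_bytes) * 8 / bandwidth_bps
    return compress_time + transfer_time

# Hypothetical, highly compressible XML-ish payload.
doc = b"<item><name>widget</name><qty>1</qty></item>" * 5000

slow_link = 56_000        # roughly modem speed, bits/sec
fast_link = 100_000_000   # roughly fast LAN, bits/sec

# On the slow link the transfer savings dwarf the CPU cost, so
# compressing reduces modeled latency; on the fast link it may not.
print(latency_seconds(doc, slow_link, compress=True) <
      latency_seconds(doc, slow_link, compress=False))
```

Plugging in a fast link and a large cpu_slowdown reverses the verdict, 
which matches the workshop observation: compression only pays off when 
the processor is cheap relative to the wire.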

The holy grail would be a compression/decompression scheme that added 
very little to the processor load (which directly relates to battery 
life in wireless apps) but significantly reduced the number of bits to 
be sent over the wire. I guess the holy spear :-) would be a format 
that is fast to compress, decompress, and parse back to an Infoset.

I don't know this subject very deeply, but I suspect that the Holy 
Spear is unobtainable, and there will be a tradeoff between compression 
efficiency and the amount of work it takes to reconstruct an Infoset. 
  





 


Copyright 2001 XML.org. This site is hosted by OASIS