
RE: almost four years ago....



> -----Original Message-----
> From: Alaric Snell [mailto:alaric@alaric-snell.com]
> Sent: Saturday, June 16, 2001 11:10 AM
> To: The Deviants
> Subject: Re: almost four years ago....

> This is easy to do. GZIP is massively crippled by having no
> information about the structure of the file - it's just a string of
> bytes that it has to make some assumptions about the probable
> structure of with regards to frequency distributions that won't even
> apply very well to XML;

Not really wishing to start up the binary XML/ASN.1 argument again, but it
would be nice to see concrete, real-world data supporting this assertion.
How MUCH better does XML-aware compression work than a generic tool? How
much more or less computation does it require?
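For what it's worth, the generic baseline is cheap to measure. Here's a
minimal Python sketch of the sort of measurement I mean, assuming some
representative document sits in a hypothetical "sample.xml"; an XML-aware
scheme would have to beat these numbers by enough to matter:

    import gzip
    import time

    # Compress a sample XML document with a generic tool (gzip) and
    # record the size ratio and the time taken. "sample.xml" is a
    # stand-in for any representative document.
    with open("sample.xml", "rb") as f:
        raw = f.read()

    start = time.perf_counter()
    packed = gzip.compress(raw, compresslevel=9)
    elapsed = time.perf_counter() - start

    print(f"original: {len(raw)} bytes")
    print(f"gzip -9:  {len(packed)} bytes "
          f"({len(packed) / len(raw):.1%} of original, {elapsed:.3f}s)")

Run the same thing against the XML-aware candidate and you have the two
numbers the question is actually about.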

> it's trivial to write something that compresses better

Undoubtedly true, but will the difference make a difference? Can you make a
business case that someone will get happier users for the actual product by
writing something that compresses x% better than an off-the-shelf generic
technology? The success of HTTP/HTML is probably due to the economy gained
by ignoring many of the nasty problems that previous hypertext proposals
handled "better". And the bankruptcy courts are clogged with companies that
had a "better" solution to a problem that people didn't care about strongly
enough to actually pay to have solved. "Worse is better" may be lousy
technology, but it tends to be good business, and engineering is all about
making appropriate tradeoffs between the two.