RE: "Binary XML" proposals
- From: "Al B. Snell" <firstname.lastname@example.org>
- To: Danny Ayers <email@example.com>
- Date: Tue, 10 Apr 2001 15:10:17 +0100 (BST)
On Tue, 10 Apr 2001, Danny Ayers wrote:
> BTW, I don't believe this :
> <- but the amount of information in a binary-XML file is less than
> <- the amount
> <- of information in a text-XML file.
The point was that a binary XML format would omit redundant and
unnecessary information such as whitespace and the tag name in the close
tag, while carrying the same amount of "useful" information; therefore,
there is less information that needs transmitting, so the lower bound on
compressed size is smaller (using a standard dictionary/entropy compressor
such as gzip; not sure about bzip2, which I have yet to fully grok).
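For what it's worth, the effect is easy to demonstrate with a toy encoding. (The one-byte tag codes and length-prefixed strings below are made up for illustration - they're not any real binary-XML format - but any scheme that drops the whitespace and close-tag names behaves similarly.)

```python
import gzip

# Some sample records, serialised twice: once as verbose text XML,
# once as a hypothetical compact "binary XML" stream.
records = [("first%02d" % i, "last%02d" % i) for i in range(100)]

text_xml = "<people>\n" + "".join(
    "  <person>\n    <first>%s</first>\n    <last>%s</last>\n  </person>\n"
    % (f, l) for f, l in records
) + "</people>\n"

# Compact encoding: one-byte tag code, one-byte length, then the data.
# No whitespace, no redundant close-tag names.
binary_xml = bytearray()
for f, l in records:
    binary_xml += bytes([0x01, len(f)]) + f.encode()  # tag 0x01 = <first>
    binary_xml += bytes([0x02, len(l)]) + l.encode()  # tag 0x02 = <last>

print(len(gzip.compress(text_xml.encode())))   # text XML, gzipped
print(len(gzip.compress(bytes(binary_xml))))   # "binary XML", gzipped
```

On my reading, the binary stream gzips smaller even though gzip already squeezes most of the repeated markup out of the text version - the redundant bytes still cost something.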
> Or that a linear increase in the length of a chain of XML processors will
> cause an exponential rise in processor requirements.
I didn't mean that - let me rephrase it better.
Some problems can be solved faster by using lots of commodity machines in
parallel; the relationship between speed and hardware cost is
linear. Stuff like Web servers that serve lots of little independent
requests work like this. These problems are nice.
Some problems, however, cannot be parallelised; you need a single, faster
machine to make them happen faster. And the cost of a single processor
rises (APPROXIMATELY!) exponentially with its speed... certainly
nonlinearly, with an increasing gradient - perhaps I'm committing myself
too much by saying "exponentially".
I don't mean the difference in prices between an Athlon 900MHz and an
Athlon 1.5GHz - I mean the difference between an x86 machine and a 64 bit
Alpha or UltraSPARC :-)
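To make the two problem classes concrete, here's a toy sketch (the workloads are stand-ins, not benchmarks):

```python
from multiprocessing.pool import ThreadPool

# "Nice" problem: lots of little independent requests, like a Web
# server's. Any worker can take any item, so throughput scales roughly
# linearly with the hardware you throw at it.
def handle(request):
    return request * 2  # stand-in for serving one independent request

with ThreadPool(4) as pool:
    results = pool.map(handle, range(8))

# "Hard" problem: each step needs the previous step's result, so extra
# machines are useless - only a faster single processor helps.
x = 1
for _ in range(8):
    x = (x * 3 + 1) % 1000  # stand-in for an inherently serial chain
```

The first loop body could run on a pile of cheap boxes; the second is stuck on one CPU no matter how many you buy.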
Alaric B. Snell
http://www.alaric-snell.com/ http://RFC.net/ http://www.warhead.org.uk/
Any sufficiently advanced technology can be emulated in software