[Date Prev]
| [Thread Prev]
| [Thread Next]
| [Date Next]
--
[Date Index]
| [Thread Index]
RE: [xml-dev] No XML Binaries? Buy Hardware
- From: "Len Bullard" <cbullard@hiwaay.net>
- To: "'Michael Champion'" <mc@xegesis.org>, <xml-dev@lists.xml.org>
- Date: Thu, 22 Feb 2007 12:27:07 -0600
Which is the question asked waaaay back when the XML binary issue first came
up. I will be surprised if any single binary does the job as well as
dedicated binary techniques, although the patents may still be a thorn. The
3D graphics guys claim to have a strong case for a binary, if not a common
one. I await the results from the working group on that one. The question
seems to be whether one standard or a handful of profiles sharing common
techniques (an adaptive standard) is the right approach.
On the other hand, the public safety group at Intergraph didn't bother to
wait. They binarized and moved on because of the traffic volume you allude
to. The commonality is real-time system messaging, whether it be virtual
real time or reality itself, as is the case with dispatch systems. Not so
weirdly, those cases are merging because simulations are being used to
drive training on the real-time dispatch systems. The next 9/11 will be
entirely fought in pixels if we're lucky.
My question was more along the lines of what I asked the other Mike (the one
with the accent): what are these appliances doing that can't be done in
software, or is the hardware simply more effective, and why?
len
From: Michael Champion [mailto:mc@xegesis.org]
> -----Original Message-----
> From: Len Bullard [mailto:cbullard@hiwaay.net]
> If XML has such a negligible impact on performance, why do companies like
> Cisco and IBM buy small companies that build XML hardware accelerators?
Who says that XML has a negligible effect on performance anymore? It's
probably true that in traditional document-processing scenarios the
percentage of overall app time spent actually parsing and transforming XML
is so small that optimizing it would be pointless, but in high-volume
messaging and database scenarios there's clear evidence that XML processing
is often a bottleneck.
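As a rough illustration of that bottleneck (not from the original thread; the message format here is hypothetical), one can compare the cost of parsing a small XML message against decoding the same fields from a fixed binary layout agreed on by both ends:

```python
# Hypothetical dispatch-style message: an id plus two coordinates.
# Compare text-based XML parsing with a fixed binary layout decode.
import struct
import timeit
import xml.etree.ElementTree as ET

xml_msg = b'<msg><id>42</id><lat>34.73</lat><lon>-86.59</lon></msg>'

def parse_xml(data=xml_msg):
    # Full XML parse: tokenize markup, build a tree, convert text to numbers.
    root = ET.fromstring(data)
    return (int(root.findtext('id')),
            float(root.findtext('lat')),
            float(root.findtext('lon')))

# Same fields in a fixed binary layout (network byte order):
# unsigned 32-bit id followed by two 64-bit doubles -- 20 bytes total.
bin_msg = struct.pack('!Idd', 42, 34.73, -86.59)

def parse_bin(data=bin_msg):
    # No markup to scan: decode fields at known offsets.
    return struct.unpack('!Idd', data)

if __name__ == '__main__':
    xml_time = timeit.timeit(parse_xml, number=10_000)
    bin_time = timeit.timeit(parse_bin, number=10_000)
    print(f'XML parse: {xml_time:.4f}s, binary decode: {bin_time:.4f}s '
          f'({len(xml_msg)} bytes vs {len(bin_msg)} bytes)')
```

The binary variant is both smaller on the wire and far cheaper to decode, which is the gap that hardware accelerators and schema-driven binary formats both target; the trade, as noted below, is that both ends must agree on the layout in advance.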
The main current dispute is over whether a single binary format can meet a
wide enough range of needs to be worth standardizing. I suspect there will
also be a fun debate over whether tightly coupling the components of a
distributed application via schemas is worth the performance gains that a
schema-driven efficient serialization format would offer. The W3C EXI WG
has collected a mountain of data on how much performance and compression
improvement one gets under different scenarios, but it hasn't yet been
released in a conveniently usable form, AFAIK.