   RE: [xml-dev] Re: Where does the "nothing left but toolkits" myth come from?

> -----Original Message-----
> From: Elliotte Harold [mailto:elharo@metalab.unc.edu]
> 
> Rich Salz wrote:
> > Let's ignore the Infoset.  But if we posit
> > 	XML --> a binary serialization --> XML
> >
> > how does that actively damage the XML community?
> 
> First, you have to realize that most binary proposals do not in fact
> posit that. They either superset or subset or only intersect with
> regular XML. But if we rule all those formats out, there are still
> problems:
> 
> 1. Patents are beginning to invade this space, closing off
> interoperability and open software.

All the more reason to create a decent standard before the space is
completely closed off, barring entry of useful, innovative ideas.  The
easiest way to block a patent is to publish the same idea openly first.

> 2. The data that's transmitted in this binary format is less
> inspectable than data in the regular XML format.

True.  In my own experience with binary-xml, this is probably its
single largest weakness.  Then again, by tying binary-xml to XML, you
gain the option of sending the same data as text-xml instead.  A
critical part of any binary-xml design should be to emphasize that
binary-xml is just an optimization of the normal text-xml path.  This
is a key reason why I think binary-_XML_ is actually better than ASN.1
or some new format.
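To make that "same path, two encodings" idea concrete, here is a
minimal Java sketch.  Everything in it is hypothetical: the
XmlEventWriter interface and the token layout in BinaryXmlWriter are my
own inventions to show the design shape, not any proposed binary-xml
format.

    // Hypothetical sketch: the same event stream can be rendered either
    // as text-xml or as a binary encoding, so the binary form stays a
    // drop-in optimization of the text path.
    import java.io.*;
    import java.nio.charset.StandardCharsets;

    interface XmlEventWriter {
        void startElement(String name) throws IOException;
        void characters(String text) throws IOException;
        void endElement(String name) throws IOException;
    }

    // Plain text-xml output (escaping, attributes, and namespaces omitted).
    class TextXmlWriter implements XmlEventWriter {
        private final Writer out;
        TextXmlWriter(Writer out) { this.out = out; }
        public void startElement(String name) throws IOException { out.write("<" + name + ">"); }
        public void characters(String text) throws IOException { out.write(text); }
        public void endElement(String name) throws IOException { out.write("</" + name + ">"); }
    }

    // Made-up binary encoding: one token byte followed by a
    // length-prefixed UTF-8 string.  The point is the shared interface,
    // not the particular byte layout.
    class BinaryXmlWriter implements XmlEventWriter {
        private static final byte START = 1, TEXT = 2, END = 3;
        private final DataOutputStream out;
        BinaryXmlWriter(OutputStream out) { this.out = new DataOutputStream(out); }
        private void token(byte tag, String s) throws IOException {
            byte[] bytes = s.getBytes(StandardCharsets.UTF_8);
            out.writeByte(tag);
            out.writeInt(bytes.length);
            out.write(bytes);
        }
        public void startElement(String name) throws IOException { token(START, name); }
        public void characters(String text) throws IOException { token(TEXT, name == null ? "" : text); }
        public void endElement(String name) throws IOException { token(END, name); }
    }

With a shape like this, switching a producer from binary back to text
is just a matter of which writer you hand it, which is exactly the
fallback-to-text property argued for above.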

> 3. Software vendors will publish tools that only consume the binary
> data; and therefore systems will refuse to accept the textual data.

Just as today there are vendors whose tools only consume ASCII or UTF-8
XML.  This relates back to point 2, and to general market pressures.
If the market really only wants binary-xml, then let it go that way.  I
tend to believe that most markets will quickly see the value in being
able to choose text or binary, depending on current needs.  Are you
debugging, or still building the system?  You probably want text-xml
(back to your point #2).  Are you tuning the system for absolute
throughput?  Choose binary.

> 4. Binary parsers often forgo well-formedness checks such as name
> characters that textual parsers make. They incorrectly assume that
> nobody can or will inject broken data into the system.

I am in complete agreement with you here as well, but this can (and
should) be addressed by conformance tests and clear requirements within
the specification.
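As a hedged sketch of the kind of requirement I mean, here is a Java
name check that a binary-xml decoder could still run on every element
and attribute name it reads.  The ranges below cover only a simplified
subset of the XML 1.0 NameStartChar / NameChar productions; a real
conformance suite would exercise the full productions.

    // Simplified name check for a binary-xml decoder; illustrative only.
    final class NameCheck {
        static boolean isNameStartChar(int c) {
            return c == ':' || c == '_'
                || (c >= 'A' && c <= 'Z') || (c >= 'a' && c <= 'z')
                || (c >= 0xC0 && c <= 0xD6) || (c >= 0xD8 && c <= 0xF6)
                || (c >= 0xF8 && c <= 0x2FF) || (c >= 0x370 && c <= 0x1FFF);
        }
        static boolean isNameChar(int c) {
            return isNameStartChar(c) || c == '-' || c == '.'
                || (c >= '0' && c <= '9') || c == 0xB7;
        }
        static boolean isWellFormedName(String name) {
            if (name.isEmpty() || !isNameStartChar(name.charAt(0))) return false;
            for (int i = 1; i < name.length(); i++) {
                if (!isNameChar(name.charAt(i))) return false;
            }
            return true;
        }
    }

The point is that reading a length-prefixed name from a binary stream
does not exempt the decoder from rejecting names that a text parser
would reject.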

> These problems are not insurmountable, but once you surmount them
> you're very close to reinventing real XML, and being about as fast and
> maybe marginally slower.

Reinventing XML would mean going back to SGML and restarting the long
and arduous process of trimming it down.  Binary-xml is not about
reinventing XML; it is about leveraging large parts of the existing XML
ecosystem of tools, languages, and general knowledge.  Having worked
with binary-xml in a number of forms, including implementations with
full well-formedness checking, I can say that binary-xml can easily be
much faster than text-xml.  For some scenarios (especially where binary
serialization avoids numeric type conversions) the performance gain can
be huge.
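As a toy illustration of that numeric-conversion point (my own example,
not a benchmark of any real encoder): the text path below has to format
and re-parse every double, while the binary path writes and reads the
raw IEEE-754 bytes via java.io.DataOutputStream.  It only contrasts the
two code paths; it does not time anything.

    import java.io.*;

    class NumericRoundTrip {
        public static void main(String[] args) throws IOException {
            double[] samples = new double[100000];
            for (int i = 0; i < samples.length; i++) samples[i] = Math.random();

            // Text path: every value becomes a decimal string and is parsed back.
            StringBuilder text = new StringBuilder();
            for (double d : samples) text.append(d).append(' ');
            double textSum = 0;
            for (String s : text.toString().trim().split(" ")) textSum += Double.parseDouble(s);

            // Binary path: 8 bytes per value, no formatting or parsing.
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            DataOutputStream out = new DataOutputStream(buf);
            for (double d : samples) out.writeDouble(d);
            DataInputStream in = new DataInputStream(new ByteArrayInputStream(buf.toByteArray()));
            double binSum = 0;
            for (int i = 0; i < samples.length; i++) binSum += in.readDouble();

            // Both paths recover the same data; only the work done differs.
            System.out.println(textSum + " vs " + binSum);
        }
    }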

Basically, you appear to be arguing that because existing proposals are
flawed, all future proposals will be similarly flawed.  That logic
would have prevented XML from ever happening, would mean that hybrid
cars would never exist, and so on.  I'm not sure the W3C can succeed
with its binary-xml effort, but declaring it broken in advance provides
value to no one.

-derek




 
