Some notes on the binxml permathread (was: Re: [xml-dev] Parsing efficien


Dear deviants,

Only a few hours into the thread and we are already rehashing old ground. 
Here are some notes, in the (foolish) hope of avoiding too much of that 
well-trodden and dead ground.

  - "Binary XML" is an oxymoron. There is no such thing, and most likely will 
never be. Whatever binarisation scheme you use you're binarising an infoset.
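
    To make that concrete, here is a minimal sketch (in Python, with a byte 
layout invented purely for this illustration -- it is not any real or 
proposed format) of what every binarisation scheme boils down to: walking 
infoset items (elements, attributes, character data) and encoding those, 
never the angle brackets:

    import struct
    import xml.etree.ElementTree as ET

    # Toy one-byte event codes for a hypothetical binary infoset stream.
    ELEM, ATTR, TEXT, END = 0x01, 0x02, 0x03, 0x04

    def pack_event(code, *fields):
        """Encode an event code followed by length-prefixed UTF-8 fields."""
        out = bytes([code])
        for field in fields:
            raw = field.encode("utf-8")
            out += struct.pack(">H", len(raw)) + raw
        return out

    def binarise(elem):
        """Recursively encode an element infoset item and its content."""
        out = pack_event(ELEM, elem.tag)
        for name, value in elem.attrib.items():
            out += pack_event(ATTR, name, value)
        if elem.text and elem.text.strip():
            out += pack_event(TEXT, elem.text)
        for child in elem:
            out += binarise(child)
        return out + bytes([END])

    doc = ET.fromstring('<order id="42"><item qty="2">widget</item></order>')
    blob = binarise(doc)  # the same infoset as the text form, in other bytes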


  - Debates as to whether it should "happen" or not are moot: it's already 
happened. You may not have seen it yet, but binary infosets are used in many 
areas, and it's probably too late to stop them even if you wanted to. 
ISO/MPEG, 3GPP, ARIB, DVB, DAB, TV Anytime... this is just a small sample of 
the organisations I know off the top of my head to be investigating (with 
the intention of using) or already using binary infosets, and then there is 
the list of companies. These uses are not meant to stay inside closed 
systems either.

    Thus, the more interesting questions are imho: Should we find a way of 
standardising it before we have an interop nightmare (and before so many 
people are interested in it that it becomes impossible not to produce 
bloat)? In which cases is it ok to use it? Can a binary infoset be 
considered an "encoding" of XML, or is it something completely different 
(MIME-wise)? Should binarisation be done by Textual Fanatics or left up to 
the object-serialisation-everything-is-typed people?

("yes", "whenever it solves an XML-related problem", "tough question", "the 
former, of course")


  - On this topic, one frequently hears broad statements from list members of 
the type "It won't give you any speedup", "XML parsing is never the bottleneck", 
"gzip compression beats anything else/is good enough", etc.

    Where low- to medium-performance applications running on reasonably 
powerful boxes are concerned, those statements are true (apart from the 
"gzip beats anything" one, of course). That, however, leaves out 
high-performance applications and lower-power devices. Two big areas. I 
would very much appreciate it if the people who believe those statements 
hold in those two cases provided empirical data, because my own flatly 
contradicts them.

    And of course those statements do not cover requirements relating to 
streaming, packaging, fragmenting, random access...
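
    For anyone inclined to provide such data, a harness along these lines (a 
Python sketch; "doc.xml" and "doc.xml.gz" are assumed to be the same 
document, plain and gzipped) shows where the time goes -- and that gzip adds 
decompression on top of parsing rather than removing any of it:

    import gzip
    import timeit
    import xml.etree.ElementTree as ET

    def parse_plain():
        with open("doc.xml", "rb") as f:
            ET.parse(f)

    def parse_gzipped():
        # gzip shrinks the bytes on the wire, but the parser still has to
        # chew through the full text XML once it is decompressed.
        with gzip.open("doc.xml.gz", "rb") as f:
            ET.parse(f)

    for fn in (parse_plain, parse_gzipped):
        seconds = timeit.timeit(fn, number=100)
        print("%s: %.2f ms/doc" % (fn.__name__, seconds * 10))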


  - The "Don't use XML then" argument also comes up quite frequently. In some 
cases, it's right on -- XML should clearly only be used when there's a benefit 
in using it. In others it's quite hard to buy.

    If you have a workflow in which nine steps out of ten use XML and reap 
great benefits from it (many existing tools, open, proven, powerful, 
interoperable, low coupling, many developers, standard APIs...) but one step 
in which XML proves unusable, you basically have two options:

    . Reinvent it all. You throw away all the tools, all the knowledge, all 
the interop, all the reliability, all the goodies, etc., and recreate them 
all ad hoc for your system. Why? Because you are using XML for something "it 
wasn't designed for" and any other option will get Hans Blix on your ass. 
Yes, people do use this argument on occasion.

    . Keep it the way it is, but find a way to solve the issues you have in 
that one step. This can, in some cases, involve binary infosets. You lose 
nothing in the nine other steps, and binfosets can be made to quack like 
XML, as sketched below, so that your workflow isn't disrupted.
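
    A minimal sketch of the quacking, decoding the toy byte layout invented 
earlier (again, a format made up for illustration only) and replaying it as 
standard SAX events, so the XML-based steps downstream never know the wire 
format changed:

    import struct
    from xml.sax.handler import ContentHandler
    from xml.sax.xmlreader import AttributesImpl

    ELEM, ATTR, TEXT, END = 0x01, 0x02, 0x03, 0x04

    def unpack_str(blob, pos):
        """Read one length-prefixed UTF-8 string; return it and the new offset."""
        (n,) = struct.unpack_from(">H", blob, pos)
        return blob[pos + 2:pos + 2 + n].decode("utf-8"), pos + 2 + n

    def replay(blob, handler):
        """Decode a toy binary infoset and drive any SAX ContentHandler."""
        stack, pos = [], 0
        handler.startDocument()
        while pos < len(blob):
            code, pos = blob[pos], pos + 1
            if code == ELEM:
                tag, pos = unpack_str(blob, pos)
                attrs = {}
                # Attribute events immediately follow their element event.
                while pos < len(blob) and blob[pos] == ATTR:
                    name, pos = unpack_str(blob, pos + 1)
                    value, pos = unpack_str(blob, pos)
                    attrs[name] = value
                handler.startElement(tag, AttributesImpl(attrs))
                stack.append(tag)
            elif code == TEXT:
                text, pos = unpack_str(blob, pos)
                handler.characters(text)
            elif code == END:
                handler.endElement(stack.pop())
        handler.endDocument()

Any handler, pipeline stage or validator already written against SAX then 
runs unchanged on the binary stream.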


I'm probably forgetting a number of points, but hopefully these will help :)

-- 
Robin Berjon <robin.berjon@expway.fr>
Research Engineer, Expway        http://expway.fr/
7FC0 6F5F D864 EFB8 08CE  8E74 58E6 D5DB 4889 2488





 
