   RE: [xml-dev] Fast text output from SAX?


We will now have a comparative history debate 
in which two points of view separated by different 
interests will attempt the usual futile fusions. 
For those who aren't interested, hit the delete 
key now.

Claude L Bullard wrote:
>> There were reasons to go to markup over all of the
>> binary solutions that dominated the markets when 
>> that decision was made.   The success of that 
>> decision is evident by the near ubiquitous use
>> of markup systems now and the ease with which
>> unprecedented integration and standardization
>> is occurring.

>I'm sorry but I just don't accept this as anything other than
>rewriting history. As far as I know, there was only *one* moment in
>this industry when a large number of people sat down to debate the
>"text vs binary" decision. That moment came back in the 80's during
>the early days of X.400 and X.500. And, the result of that debate was
>to choose ASN.1 as the winner. 

Yes, and for a while ASN.1 had the upper hand, which made things moderately 
difficult for the markup community, mostly engaged after that period in 
DoD work.  We found out that reuse of binaries was horrible, and that 
ASN.1 systems were not readily available and, as a result, were not cost 
effective.  However, the more pressing problem was Adobe's insistence 
first on PostScript, then on PDF, and the WYSIWYG snake charmers who 
wanted to trade ease for both speed and reuse.   Every time we did 
the work, it came back to simple tagged data objects as the sweet 
spot: scalable, easily vetted, cheap, human readable, designable by people 
without master's degrees in computer science, and so on.   It was this 
intersection of human, machine, and commercial interests that eventually 
made STEP/EXPRESS and SGML the winners of the CALS initiative. And this 
was when the cost of an SGML parser was outrageous.  As a US Army civil 
servant said at a large meeting, "It's terrible but the best we have". 

In short, it wasn't the computer science industry that chose.  They never 
have that choice.  It was the customers. Typically, system integrators 
get to choose formats, not system vendors.  Lesson learned.  No matter 
what the propellerHeads tell me, it is the customer that has to be sold 
on the idea.  XML was sold to propellerHeads with a dose of "and this 
will kill off SGML", and they were naive enough to buy it.  Slick!

>What happened later was that HTML grew in popularity due to
>HTML's use in a single (but compelling) application. 

Gencoding always wins if a naive customer is choosing.  There 
are good reasons for it.  That is why it is the original markup 
application dating from the 1960s.  It is the first thing everyone 
tries.  It just has the problem of not being a vertical tag stack.

>The HTML folk were then joined by their SGML cousins (the SGML folk
>wanted to get "back on top" again) and we eventually got XML. 

As several people on the list can tell you, SGMLers were there from 
the time TimBL first started demoing at CERN.  Certain ones were 
incredulous that, after so much hypermedia research, the 
wireHead was proposing gencoding.  At that time, there was no 
'top' to get on.  HTML doesn't impress anyone with a background 
in the domain.  But it was a heckuva good onramp.  The more 
pecuniary among the SGMLers noted that quickly and worked hard 
both to make HTML a compliant SGML application, then to get 
SGML On The Web.  Slam dunk because it came in low and fast 
under the radar.  The W3C gets the public credit for XML, but 
that's another bit of mythInformation.  It was SGMLers who 
made it happen with not an insignificant amount of bad behavior, 
but that's life online.

>This wasn't a decision
>-- it was a social process that was driven by primarily non-technical
>forces. 

No significant decisions made for large systems are ever completely 
technical or non-technical.  If they were, we'd all be using LISP 
on the command line.

>However, there are still
>applications around that have a need for hard "technical"
>characteristics (like compactness and parsing speed) that is greater
>than their need for the social benefits of ease of use, debugging,
>etc.

Here we agree.  I've said that from the first email.  There are 
applications for which XML is inappropriate on-the-wire or in 
memory because verbosity matters very much, datatype support matters 
very much, and in these cases, human readability matters a lot less.
VRML is the case I use.  Real-time 3D staggers a processor unless 
the author is quite proficient, and that both sends the cost of the 
document soaring and limits uptake by the occasional user.  So in 
X3D, there are three encodings, including a binary.
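To make the verbosity point concrete, here is a minimal sketch comparing a text encoding of coordinate data against a packed binary of the same values. The `Coordinate` element and the float-packing format are illustrative only; this is not the actual X3D binary encoding, just a rough size comparison under those assumptions.

```python
import struct
import xml.etree.ElementTree as ET

# Hypothetical geometry data: 1000 3-D points with non-terminating
# decimal expansions, the kind of data 3D formats carry in bulk.
points = [(i / 7, i / 11, i / 13) for i in range(1000)]

# Text encoding: points as a space-separated attribute, loosely in
# the spirit of X3D's <Coordinate point="..."/>.
coord = ET.Element("Coordinate")
coord.set("point", " ".join(" ".join(repr(v) for v in p) for p in points))
text_size = len(ET.tostring(coord))

# Binary encoding: the same values as raw little-endian 32-bit floats.
binary = struct.pack("<%df" % (len(points) * 3),
                     *(v for p in points for v in p))
binary_size = len(binary)

# The binary is a fixed 12 bytes per point; the text form is larger
# and also costs a float() parse per value on the way back in.
print(text_size, binary_size)
```

The trade the thread is circling is visible even in this toy: the binary wins on size and decode cost, the text wins on inspectability, and a general-purpose binary has to pay some of the text form's generality back in overhead.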

It comes back to: "Will a generalized XML binary work well enough 
for the applications that need a binary to accept the penalties 
of generalization?"  IOW, will they end up creating their own 
binary anyway, thus nailing to the church door one of the reasons 
for the WG: preventing single-use binaries from proliferating?

>The original "decision" to choose ASN.1 was made in a time
>when the players were all much more aware of the technical
>requirements than they were of social needs. Machines and networks
>were terribly slow. Software development was highly concentrated in a
>small number of highly skilled groups. The software industry still had
>very little experience with non-technical products of any kind.

Yes.  But ASN.1 went to the parking lot quickly because cheap 
processors and memory, GUIs, and the opening of the Internet to 
commercial use started a grassfire that took out the wheatfield 
with it.  For all of the awards given to the web, the first SGML 
hypermedia systems predate it, and yes, they worked on networks. 
Some of them still do.  But as you say, history and markets pick 
winners and conveniently forget facts that don't fit the mythology. 
That's fine.

>Now, times have changed and we've got XML to solve a whole
>class of issues. Today, the percentage of people who *need* something
>like binary encodings is much smaller than it was back in the 80's,
>but there are still a significant number of applications and
>organizations that still have relatively severe technical requirements
>for compactness and speed... That's why this perma-thread has never
>died. The need is still there.

I agree.

>What we need is detente between these two camps. They have
>been battling for decades now. It is time to stop fighting and learn
>to work *together*. We should use XML for text based stuff and ASN.1
>binary encodings for the binary stuff. Anyone who isn't satisfied by
>one of these two probably has a requirement so bizarre that there
>isn't anything a general standard could do for them.

That's what we are here to determine, but when the thread started, it 
was a polite request to do this on the public list for this task, and 
so far, I haven't received a single email from that list.

It's a demonstration example I want to provide to a certain 'list 
community maestro' who has told another organization that he can 
get bigList denizens to accept his forums after years of the 
community being on a big list.  There are cultures that just won't 
be corralled.  There are technologies that just won't die or be 
replaced.   There are niches of alternatives.  So we need to 
work out just how much need there is for another XML system 
format, because unless we can show a tenfold benefit for everyone, 
it will be better to work this for a smaller application domain.

Thanks Bob.  Enjoyable as always.

len






 


Copyright 2001 XML.org. This site is hosted by OASIS