OASIS Mailing List Archives
Re: [xml-dev] json v. xml

Michael Champion wrote:
> I'm not sure it's
> a comforting thought to know that this could all be done the SGML way, given
> that SGML was not exactly a resounding success outside a very small
> community. 
>   
Oh, your comfort was my fondest concern, be assured :-)  But I'm 
not recommending a syntax or meta-syntax or data model or 
meta-data-model, let alone SGML (though certainly SGML is already 
modularized in ISO 8879, so it would be entirely possible to redefine it 
as a set of transformations that ultimately generate XML or JSON, 
cleaning up a few things and becoming more expressive along the way).

There is an SGML angle to it, though. Non-dinosaurs may be surprised to 
learn that SGML's earliest, near-fatal challenger was not other formats, 
but WYSIWYG. Older text-processing tools (troff, WordPerfect, TeX, etc.) 
all allowed you to play with tags; even the editors with presentation 
preview modes allowed you to edit the tags. Then WYSIWYG came along 
(with a bastardized version of Ben Shneiderman's "direct manipulation" 
ideas) and the push was on for hiding tags both on-screen and in binary 
data formats, and against batch processing and transformation. SGML 
fitted into the UNIX pipes world, which, while it never went away, was 
not the kind of mom-and-pop technology that soaked up all the capital 
and market share.
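The "pipes world" way of working can be sketched with a toy example (purely illustrative, not any actual SGML toolchain): tags in a text stream are just bytes, so a one-line batch pipe can transform them, which is exactly the mode of working the WYSIWYG push marginalized.

```shell
# Toy illustration of the batch/pipes style: tags are plain text,
# so a stream editor can retarget them with no WYSIWYG editor
# or binary file format in sight.
echo '<title>SGML and pipes</title>' \
  | sed -e 's/<title>/<h1>/' -e 's|</title>|</h1>|'
# prints: <h1>SGML and pipes</h1>
```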

Apple, Adobe, MS, Corel, and all the software houses spent hundreds of 
millions of marketing dollars to push the glamour of WYSIWYG. Concepts 
of repurposing, semantic markup, hypertext links between documents, 
schema checking, document construction from components, let alone 
archiving or application-neutrality, were abandoned.  The "failure" of 
SGML is the "failure" of vi against PageMaker.

Failure is a matter of expectation. Is the Wiki format a failed 
technology? From the POV of sales, I am sure it is; from the POV of the 
number of people using it, compared to Office or OpenOffice, I am sure 
it is; from the POV of its ability to be useful in creating 
Wikipedia-like things, it is obviously a roaring success (and Office and 
OpenOffice are failures).

So what is the angle? That a good idea ultimately wins through, despite 
counter-marketing, but only when the technological conditions are right. 
JSON could be in the same position.

In 1985, the question "Do different data formats need some underlying 
way to unify them" had an answer "yes", responding to the technology of 
the time (and lo SGML was born). In 1996, the question was answered "no" 
(and XML was born).  In 2007, the question is getting asked again, and 
it may well have a different answer.  But, in the mid-80s, parsing 
theory was relatively widely taught, and systems like UNIX reflected it; 
by the mid-90s, parsing theory was not well-known, and systems like Macs 
and PCs reflected that; now in the 00s, I don't see any great resurgence 
in knowledge of parsing that would make the old SGML approach 
particularly congenial for users, even though perhaps more people are 
getting an introduction through XSD grammars and XSD regex to some 
concepts.

> Some sort of underlying
> unification principle, whether it be grammar-based or datamodel-based, would
> seem useful to make their lives easier.   
>   
Here's a unification principle behind XML, JSON and Fast XML: don't 
re-invent the wheel.

Cheers
Rick Jelliffe

