   The waterfall model lives? (was Re: [xml-dev] The subsetting has begun)


On Sat, 22 Feb 2003 05:21:15 -0500, Daniel Veillard <veillard@redhat.com> 
wrote:


> My gut feeling is that your problem is a framework one, not purely a
> problem with the XML spec, and to be perfectly frank the reliance on
> Java just makes 2 of the 4 points I pointed out insanely large.
> Still, it's not a valid justification to blatantly break a
> well-established specification.
> Fix your parsers/framework instead of tweaking the spec to get the
> overall solution to fit your constraints :-(

I find this whole discussion a bit disorienting.  I sort of hate it when I
disagree with so many people whom I respect and generally agree with, and I
have rethought my position many times over the last few years.  Still, I
don't agree with the argument that the XML 1.x specification is the fixed
point around which the "XML world" (broadly defined) should revolve.

It reminds me of the "waterfall model" of software design/development:
requirements are gathered, written down, reviewed, and cast in stone;
designs are devised, written down, and cast in stone; then the design is
implemented in code, the code is evaluated against the requirements, and
tweaked until the requirements are met.  This is more or less a strawman in
most software engineering texts that I've seen in the last 20 years or so,
but it seems to be treated as incontrovertible truth when we're talking
about the XML spec.  The larger world of software engineering has come to
terms with the fact that the world changes faster than we can write specs
(spec writing being, after all, more a political process than a technical
one), that technology changes can make the engineering tradeoffs in designs
(and the assumptions behind requirements) obsolete very quickly, and that
the only way to keep projects from degenerating into chaos is to have some
sort of tight feedback "spiral" among requirements, designs, and
implementations.

In this case, "XML" (I agree with the critique that whatever this J2ME
subset thingie is, it's not XML 1.x, and Sun should make that crystal
clear) is being put to uses, and into environments, that were AFAIK outside
the expertise of the original XML WG.  They made plenty of
requirements/design tradeoffs in subsetting SGML to meet the needs of the
Web of backoffice servers and desktop browsers.  The last five years have
shown that to a VERY great extent they made good tradeoffs, and XML has
been far more successful than anyone (at least anyone I knew of back then)
predicted.  But AFAIK they weren't thinking about Java VMs running on
cellphones, or enterprise-class transaction processing engines handling
thousands of messages per minute.  It's not surprising that XML 1.0 turns
out not to be exactly optimal for these environments.

What does surprise me is how reluctant the XML community is to apply the 
techniques we've learned for building robust software to building robust 
specifications.  Monoliths are fragile, but layered, modular architectures 
are adaptable.  External realities change, and requirements have to be able 
to change or they will be bypassed.  It looks to me (from a distance) as 
though the J2ME people are doing the Right Thing -- setting up tight 
feedback loops among requirements, designs, and implementations and 
weighing the business value of each.  Sure they've tweaked the spec to fit 
their constraints, but to do otherwise would be to set themselves up for 
failure, as so many software projects following the "waterfall" approach 
have over the last few decades.

I'm afraid I think it's XML that needs to accommodate the requirements of
its "customers" by becoming more modular, so that people in Sun's situation
aren't faced with a stark choice between being non-compliant with any spec
and bloating their code / making their products more expensive in order to
comply with parts of a monolithic spec that (apparently) add little
business value.

So what if XML were "refactored" so that the bare-bones well-formed syntax
(and/or data model, but that's another issue!) were the common core, and
DTD processing were at the next layer up?  That would solve the J2ME
issue, address the high-speed SOAP processing issue, standardize the
"Common XML Core" that is the rock-solid basis of de facto
interoperability, and so on.  DTD users wouldn't suffer, since they could
simply avoid implementations that clearly don't meet their needs.  Other
users would gain, since as a practical matter raw XML with entity
references isn't going to go out to a cellphone anyway, and not too many
DocBook documents are likely to be seen in
thousands-of-transactions-per-minute environments.  And having an
identifiable conformance level to explicitly check against (J2ME would be
clearly labelled as conforming only to "XML Basic" or whatever) would tend
to prevent nasty surprises.  And if Moore's Law or the growth of XML
Everywhere leads to a demand for DocBook on cellphones, well, then J2ME
can just evolve to meet changing realities like everyone else has to.  (Or
more likely the use case for J2ME itself will disappear, but such is
life.)
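
To make that layering concrete: today's Java parsers already expose the
DTD machinery as individually switchable pieces, so a rough sketch of a
"well-formedness only" parse might look like the following.  This is just
an illustration of the separation, not anything any spec defines; it uses
the standard JAXP/SAX2 APIs, and whether a particular parser actually
honors these feature switches is implementation-dependent.

    import javax.xml.parsers.SAXParser;
    import javax.xml.parsers.SAXParserFactory;
    import org.xml.sax.helpers.DefaultHandler;

    // Sketch: approximate an "XML Basic" layer by switching off the
    // DTD-related machinery a full XML 1.0 processor carries along.
    public class BareBonesParse {
        public static void main(String[] args) throws Exception {
            SAXParserFactory factory = SAXParserFactory.newInstance();
            factory.setValidating(false);  // no DTD validation layer
            // Standard SAX2 feature names; a parser that recognizes but
            // cannot disable them throws SAXNotSupportedException.
            factory.setFeature(
                "http://xml.org/sax/features/external-general-entities",
                false);
            factory.setFeature(
                "http://xml.org/sax/features/external-parameter-entities",
                false);

            SAXParser parser = factory.newSAXParser();
            // DefaultHandler ignores content; this just checks that the
            // document is well-formed.
            parser.parse(args[0], new DefaultHandler());
        }
    }

The point isn't this particular API; it's that the "core" and the "DTD
layer" are already separable in practice.  A refactored spec would just
make that separation official and testable.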

Sigh, another half hour lost to this permathread ... oh well.







 
