> -----Original Message-----
> From: Edd Dumbill [mailto:firstname.lastname@example.org]
> Sent: Sunday, August 05, 2001 2:44 PM
> To: Tim Bray
> Cc: email@example.com
> Subject: RE: SAX Filters for Namespace Processing
> In my opinion we need to look beyond the label that he ignored common
> practice and take another look at what he did, picking out design
> principles like some of those espoused by Extreme
> Programming. The Web seemed to be doing the simplest thing
> that would possibly work.
> P.S. Another reflection: while it was possible to spawn
> the web via such mechanisms, is the continued development of
> the web plausible in such a way?
Well, the Extreme Programming people say that after you've built the first version that is the simplest thing that could possibly work, and then worked with the customer to define some additional features, you've probably ended up with a bit of a mess.
That is EXACTLY where the XML world is right now ... except that we (the W3C anyway) probably went a bit overboard in adding a fair number of features that most "customers" don't give a rat's patootie about.
That's when the Extreme Programming people say that you refactor -- put common pieces that have been independently invented into a single module, wrap it up in a clean interface, and rewrite the various bits to access the common functionality through the new interface. (Do they say to euthanize the mistakes at this point too?) BUT ... the conventional wisdom on this list is that things such as XML namespaces and schemas are "finished" and there's nothing that can be done about them ... we just have to live with the consequences.
The way I see it, there are two viable options: One is to bite the bullet and start to refactor the XML "superstructure" specs. Maybe that is Edd's question ... is this idea politically plausible? I certainly wish it were, but can't point to any evidence that there are a significant number of people active in the W3C who take this option seriously.
The other is a variant of what Simon St. Laurent proposes in the "breaking up is hard to do" thread: Bifurcate into those who spend their energy propping up and adding on to the ivory tower ... and those who spend their time finding the "simplest things that ACTUALLY work" within the XML infrastructure and superstructure. There are a lot of people in and out of the W3C who are sympathetic to the search for best practices; there is a lot we could do to promote this idea (and I think Edd has done a great job on XML.com of at least treating this approach as having some legitimacy while giving the "ivory tower" its due as well). It really just requires a subtle shift in the way we think about W3C specs, from "the spec is sure to be a de-facto standard, so I'd better support it" to "the spec is sure to have been well thought-out by some smart people from lots of different companies, so I should consider it carefully."
This is probably another thread, but I for one interpreted Sun's recent announcement of a multi-schema validator (see http://xml.coverpages.org/ni2001-08-01-c.html) as a step in this direction. "MSV supports RELAX NG, RELAX Namespace, RELAX Core, TREX, XML DTDs, and a subset of W3C XML Schema Part 1." I think [obligatory disclaimer: I have not yet persuaded my employer of the wisdom of the following pontification, so don't blame them :~) ] that more XML implementors should admit that the emperor is "sartorially challenged," implement just those parts of the specs that make sense, and call a subset a subset. This might limit interoperability in principle, but I for one would rather work around an explicitly missing feature than find (after wasted nights of debugging) that the implementers interpreted the spec differently.