- From: Tim Bray <tbray@textuality.com>
- To: xml-dev@ic.ac.uk
- Date: Fri, 28 Jan 2000 16:35:27 -0800
At 05:39 PM 1/28/00 -0600, Steven R. Newcomb wrote:
[lots of stuff]
I don't think I have anything to add to the arguments about whether
the 10744 family of standards is being given its proper due. But there
is one assertion here that ought to be addressed:
>The W3C process appears to be based on the naive belief that
>independent design assignments for all the various aspects of
>Web-based information interchange can be made to a plethora of
>independent technical committees, and, in the end, everything can be
>made to work together somehow.
Hmm. The W3C is well-supplied [some would say "infested"] with
Co-ordination Groups which exist precisely to detect cases where
different working groups need to do extra work to ensure consistency
and interoperability between their outputs. A tremendous amount of
time and effort goes into this work. The committees are *not*
independent in theory or in practice. It's reasonable to argue over
the quality of the results, but the "naive belief" that Steve describes
simply doesn't exist.
It is the case that there is no over-arching formal, architectural
underpinning of the W3C's design for the future of the Web. TimBL has
some living documents that he updates regularly, at
http://www.w3.org/DesignIssues/
but they don't really have any normative force, and lots of people in the
process disagree with parts of them. There has been some discussion of
how such a formalization might be created, but nobody has to date
proposed anything that's politically viable.
So what we have is a partitioning of the design problem, allowing parallel
work by multiple groups of people, with an added cost due to the need for
co-ordination. The work in creating mammoth standards like 8879 and 10744
is more centralized and in my experience mostly done by a really tiny group
of individuals who do all the heavy lifting. Arguably, you get
better end-to-end consistency in the ISO system.
Well, we're all spectators at ringside; nobody knows what the right way
is to build the standards infrastructure for the largest experiment in
information processing ever attempted; everyone is making it up as they
go along.
I'm certainly not going to defend the W3C model of work as the be-all
and end-all, or even go so far as to say that it couldn't be improved
quite a lot. But at this point in history it seems obvious that of
the 3 models of work we have before us (ISO, IETF, W3C), none can be
said to have been shown to be either triumphant or bankrupt, based
on results. The most likely conclusion is that they excel in different
problem domains.
I'll end with a question. Steve asserts:
> XML Schema *does not* address the problem of
> validating mixed vocabularies. As far as I can tell, this
> fundamental problem doesn't even appear on that committee's radar.)
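(For concreteness, a "mixed vocabulary" document presumably means one that
draws element types from more than one namespace, roughly like the sketch
below; the purchasing namespace URI is invented for illustration, while the
XHTML one is real:

    <order xmlns="http://example.org/purchasing"
           xmlns:xhtml="http://www.w3.org/1999/xhtml">
      <item sku="1234">
        <!-- XHTML markup embedded inside a purchasing vocabulary -->
        <xhtml:p>Ships in <xhtml:em>two</xhtml:em> weeks.</xhtml:p>
      </item>
    </order>

The validation question is which schema, if any, gets to say whether the
xhtml:p element is allowed where it appears, and against whose content model
its children are checked.)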
Is this true? I haven't been following the schema work that closely, but I
know the issue is for sure on their radar. I would agree with Steve that
skipping this would be a pretty serious omission. -Tim