It seems to circle back to the "If you depart
from syntax as the means to ensure portability
of data, XML makes no promises about its
interoperability" position. A conservative
and sane position, but it seems to get in
the way of relentless innovation because it
comes down to "If it hurts, don't do that!"
I don't mean to be flip, but the XML-Dev archives
are littered with the uniform XML data model
threads. Given transformation into and out
of non-XML systems, he seems not to be asking
for an XML solution, but for a universal
data model that all consenting systems must
agree to process uniformly and reliably.
Perhaps I am overreading his email.
I've not read the XML Fragment Interchange CR.
Is this a likely and productive place to start
to answer TimBL's request, or just what we
have as a place to start?
len
From: 'Liam Quin' [mailto:liam@w3.org]
On Tue, Jan 13, 2004 at 03:24:17PM -0600, Bullard, Claude L (Len) wrote:
> The question is, what does the infoset specification
> provide that would answer or help create an answer
> to the request for equivalence of any given XML set
> under use by multiple systems?
>
> (I don't know how to put Timbl's "chunk" term more precisely
> given namespaces.)
Neither do I. The Information Set specification doesn't seem to
me to help at all if you take things that are not in fact XML
documents, and neither does the XML Specification itself. In
those specifications, you can only really talk about a chunk of
XML in terms of the XML document that contains it.
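To make that concrete, here is a purely illustrative chunk (the
element names are invented for the example):
    <inv:item code="i42">
      <inv:price currency="USD">12.50</inv:price>
    </inv:item>
Standing alone, the inv: prefix is unbound, and any relative URI
inside it has no base; both are only resolvable against the
xmlns:inv and xml:base declarations on ancestor elements in the
containing document. That is the context a fragment mechanism
would somehow have to carry along.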
The XML Fragment Interchange (a candidate recommendation since 2001)
[1] may be a starting point, but it needs two implementations to get
out of CR and move towards becoming a REC. Possibly that spec needs
to be updated, as it predates the Infoset and xml:base for example,
and maybe interest in it is finally building.