From: "Mike Brown" <mike@skew.org>
> I think the solution should not have to involve changing the semantics or the
> level of abstraction at which a character reference operates. They should not
> tread some middle ground between the fairly discrete levels of abstraction
> (between characters, code points, encodings) that have been established in XML
> 1.0 and that are, IMHO, not crying out to be broken just to make it easier for
> XML to carry binary payloads.
Yes.
But I think the real issue here is a flaw in XML Schema: the bin64 datatype
was introduced to allow transmission of data that could not be fitted into XML's
constraints, but once it has been received there is no way to restore it
to its original form: we can un-encode it, but into what?
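(For concreteness, a minimal sketch of the situation, with an element name chosen
purely for illustration:

   <xs:element name="payload" type="xs:base64Binary"/>

   <payload>SGVsbG8sIHdvcmxkIQ==</payload>

Decoding that base64 gives back a sequence of octets, but nothing in the type
says whether those octets were originally UTF-8 text, an image, or something
else entirely.)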
And there needs to be a change in the type hierarchy to introduce
   anyString
before
   string
where anyString allows any character (except 0x00) and has a facet
   transmission-encoding ( plain | bin64 | bin16 | q )  "plain"
which expresses the lexical form of the data being sent.
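As one purely hypothetical way of writing that down (neither anyString nor a
transmission-encoding facet exists in XML Schema today; the spelling below is
only a sketch of the idea):

   <xs:simpleType name="payload">
     <xs:restriction base="xs:anyString">
       <xs:transmission-encoding value="bin64"/>
     </xs:restriction>
   </xs:simpleType>

The receiver would then know both the value and the lexical form it travelled
in, so the original data could be restored rather than left as bare octets.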
Cheers
Rick Jelliffe