True. It is reasonably straightforward to tokenize
XML at the infoset definition. Then one can drop to
the name sets and, using a schema, do a good job a la
WAP, MPEG and so forth. Where the rubber meets the
road in the graphics is in the content: what is between
the pointies, and that is what the X3D spec group has
been working on. I'm not that conversant with the work,
but have kept up with some of the public discussions.
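For a concrete flavor, here is a minimal Python sketch of that
kind of schema-driven tokenization, in the WBXML spirit: element
names drawn from a vocabulary both sides know collapse to one-byte
tokens, while character data rides along inline. The tag codes and
the tiny vocabulary below are invented for illustration; real WBXML
defines its own code pages and string tables.

  # Hypothetical tag table -- in a real encoder this would come
  # from the schema / code page, not a hand-written dict.
  TAG_CODES = {"Scene": 0x05, "Shape": 0x06, "Appearance": 0x07}
  END = 0x01    # token closing the current element
  STR_I = 0x03  # token introducing an inline NUL-terminated string

  def tokenize(events):
      """events: ('start', name) | ('end',) | ('text', data)."""
      out = bytearray()
      for ev in events:
          if ev[0] == "start":
              out.append(TAG_CODES[ev[1]])   # element name -> one byte
          elif ev[0] == "end":
              out.append(END)
          else:
              out.append(STR_I)
              out += ev[1].encode("utf-8") + b"\x00"
      return bytes(out)

  # <Scene><Shape>hello</Shape></Scene> collapses to 11 bytes.
  print(tokenize([("start", "Scene"), ("start", "Shape"),
                  ("text", "hello"), ("end",), ("end",)]).hex())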
So to me, the 'we must get a handle on the wild'
arguments aren't compelling until one can show an
advantage big enough for ALL of the efforts. That may
be the case, but I note that most of the binaries I
know about were done not to fix XML problems, but to
optimize particular applications with particular
problems. I am interested to see what comes out of
Liam's workshop.
len
From: Jonathan Borden [mailto:jonathan@openhealth.org]
A few years ago (circa 1998) I thought it would be a good idea to
develop an XML representation of the tagged binary ACR/NEMA DICOM
(Digital Imaging and Communications in Medicine) standard. Well, it
turns out that when you are transmitting megabyte-to-terabyte hunks
of data around, having a full network protocol stack may actually be
the way to go (DICOM mostly runs over TCP/IP, but there is a spec for
a DICOM physical-layer connector, i.e. a wire :-).
It also turns out that developing an XML representation of a tagged
binary format (modulo chunks of raw image data) is a fairly easy thing
to do .... hmm, it looks like Robin Cover has archived one of my efforts
at this here: http://xml.coverpages.org/DICOM-dtds.zip and see:
http://xml.coverpages.org/astmHealthcare.html
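To show why the mapping is nearly mechanical: a DICOM data set is
just a run of (group, element, length, value) records, so one pass
over the bytes turns almost directly into markup. A rough Python
sketch, assuming implicit-VR little-endian encoding and a made-up
two-entry tag dictionary; real DICOM has explicit VRs, nested
sequences, and thousands of registered tags.

  import struct
  from xml.sax.saxutils import escape

  # Toy dictionary -- the real standard registers thousands of tags.
  NAMES = {(0x0010, 0x0010): "PatientName",
           (0x0008, 0x0060): "Modality"}

  def dicom_to_xml(buf):
      """Walk (group, element, length, value) records into XML."""
      out, pos = ["<dicom>"], 0
      while pos < len(buf):
          group, elem, length = struct.unpack_from("<HHI", buf, pos)
          pos += 8
          value = buf[pos:pos + length].decode("ascii").rstrip()
          pos += length
          name = NAMES.get((group, elem), "Unknown")
          out.append('  <%s tag="%04X,%04X">%s</%s>'
                     % (name, group, elem, escape(value), name))
      out.append("</dicom>")
      return "\n".join(out)

  # One PatientName record, padded to even length as DICOM requires.
  record = struct.pack("<HHI", 0x0010, 0x0010, 6) + b"DOE^J "
  print(dicom_to_xml(record))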
That said, aside from the "gee I can turn anything into XML!" factor, I
think there is a perfectly good place for binary data formats where they
are appropriate, so several years later I am still using DICOM -- many,
many times a day -- but I haven't spent much time actually using the
XMLization I created.