Dennis Sosnoski wrote:
> When dealing with the outside world many applications
> want to use XML for all the standard reasons. In reality,
> though, it's only the Infoset that they care about.
The contention that "only the Infoset" really matters is one
of the key sources of conflict between the two sides of the "binary vs
text" debate. Supporters of binary encodings generally consider the
primacy of the Infoset to be intuitively obvious. Those opposed to
binary often argue that encoding is more important. This is, of
course, downright baffling for the binary folk. Clearly, it isn't
quite so intuitive or we wouldn't be having this discussion.
Personally, I'm in the group that likes to view the world via
the Infoset. I want my code to deal with abstract concepts that are as
close as possible to my conceptual view of the application and as far
as reasonable from the dirty details of how the machine will actually
process things. I want highly efficient ways for expressing my
"intent" which minimize the amount of effort that I must put into even
understanding mechanisms like encoding rules. As a result, I like to
stay "encoding-neutral." My code presents an abstract object to an
interface which then serializes the thing in what I hope will be a
compact and efficient manner. I really don't want to know how the bits
actually look while on the disk (unless I've got some debugging to
do). My goal is to separate "interface" from "implementation." I want
an Infoset "interface" in memory (where my program sits) and really
wish that I didn't need to know what the encoding rules are.
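To make the separation concrete, here is a minimal sketch of what I
mean by "encoding-neutral" code. All the names here are hypothetical,
not from any real toolkit: the application hands an abstract element
(a name plus an attribute map) to a Serializer interface, and only the
concrete implementation knows what the bytes look like.

```python
import json
import xml.etree.ElementTree as ET
from abc import ABC, abstractmethod

class Serializer(ABC):
    """Abstract 'encoding' boundary: the application never sees the bytes."""

    @abstractmethod
    def serialize(self, name, attributes):
        """Encode an abstract element (name + attribute dict) to bytes."""

class XmlSerializer(Serializer):
    # One concrete encoding: the familiar angle-bracket syntax.
    def serialize(self, name, attributes):
        elem = ET.Element(name, {k: str(v) for k, v in attributes.items()})
        return ET.tostring(elem)

class JsonSerializer(Serializer):
    # A stand-in for any alternative (e.g. binary) encoding of the same infoset.
    def serialize(self, name, attributes):
        return json.dumps({name: attributes}).encode("utf-8")

def save(serializer, attributes):
    # Application code stays encoding-neutral: same call, any wire format.
    return serializer.serialize("order", attributes)

order = {"id": 42, "qty": 3}
print(save(XmlSerializer(), order))
print(save(JsonSerializer(), order))
```

The point of the sketch is that `save` expresses only the abstract
intent; swapping `XmlSerializer` for `JsonSerializer` changes the bits
on the wire without touching the application logic at all.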
I would appreciate it very much if at least one of the folk
that sees syntax as being more important than the Infoset interface
could try to clarify their position on this. The better we understand
each other, the less we've got to fight about... Can anyone explain
why encoding syntax is more important than the Infoset interface?