If I load a Word or OpenOffice document into a text editor, can I read
it? (Well, after you unzip an OpenOffice document it's XML, so that's
proving your point. ;-) )
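As a concrete illustration (a minimal sketch; the file name example.odt
is hypothetical, and OpenDocument packages keep their markup in
content.xml):

    import zipfile

    # An OpenOffice/OpenDocument file is an ordinary zip archive;
    # the document markup lives in content.xml inside it.
    with zipfile.ZipFile("example.odt") as odf:  # hypothetical file name
        xml_text = odf.read("content.xml").decode("utf-8")

    print(xml_text[:500])  # plain, human-readable XML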
If the format is self-describing and standardized, for some value of
standardized / open source, then any editor that supports that format
can view it.
Of course, it is not text. I worked through some text-only encodings,
but I couldn't get the efficiency features I needed from them.
On the other hand, as a practical matter XML quickly becomes cumbersome
without an XML-aware editor. When I wrote a paper in XML last year, I
quickly grew tired of worrying about matching tags while I was
concentrating on content.
The fact that esXML can be 'edited' directly means that XML editors can
use it as an internal format easily also. No need to 'unzip' it first.
I don't see how not requiring pedantic well-formedness checking breaks
a contract of any kind. If a certain application context requires a
check, then check. I'm not somehow preventing that, only decoupling it
from actual use of the data.
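For example (a minimal Python sketch of the decoupling I mean; the
function names are illustrative, not part of esXML):

    import xml.etree.ElementTree as ET

    def is_well_formed(data: bytes) -> bool:
        """Pedantic check, run only when the context demands it."""
        try:
            ET.fromstring(data)
            return True
        except ET.ParseError:
            return False

    def process(data: bytes, require_check: bool = False):
        # Checking is decoupled from use: callers that need the
        # guarantee ask for it; others just use the data directly.
        if require_check and not is_well_formed(data):
            raise ValueError("document is not well-formed")
        ...  # actual use of the data goes here

An application context that requires the guarantee passes
require_check=True; nothing prevents the check, it just isn't welded to
every access.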
Quite a number of people have experienced the need for something
better, in various ways, than XML 1.1. The question "Do we need a
general solution?" is strange to me. Each person solving a problem
needs a solution for that problem. Experienced people know as a basic
rule that solving the meta-problem, either similar application
situations or similar representation/processing situations or both, is
better in the long run. A good architect/designer will always try to
solve the general problem that includes the one they are working on.
Often, an analysis leads to use of an existing general solution created
by those who have worked to solve the general problem, hence the
popularity of XML. As the overhead pain of XML is often great, and
more and more often too great, this same kind of analysis leads to the
conclusion "I want all of the goodness of XML because I know how bad
it was before, but I need to fix, or have fixed, problems X, Y, and Z".
Sometimes you can feel potential solution elements, but often the
balancing of requirements to get an optimized solution is beyond the
scope you can handle on the present project. Enough of these instances
will cause people to work on an extension to the art. Based on a
project I architected in 1998 with Java, XML, web application servers,
and an expert system/rule engine, I started such a quest, slowly.
Many inefficiencies and shortcomings don't get worked on because they
aren't recognized, because no one thinks there is a viable solution,
because no person or group has the resolve to work toward or
standardize a solution, or because not enough people beyond an
energized person or group appear to be receptive to using a successful
result.
In 1998 these premises were mostly true, at least from my point of
view. Today, none are. In my interpretation, you and others are
arguing that:
"Work toward solving general problems by building on existing
successful standardized solutions shouldn't be done publicly and
shouldn't be done expecting to successfully standardize on anything
unless it is first proved that everyone needs it and that it can be
done to X level. Furthermore, it is disrespectful to the creators of
the successful standard to leverage their correct choices with
additional efficiency/complexity, to call the result anything similar,
to risk confusing poor programmers, or to have any impact on the
lucrative revenue of hardware and software vendors who benefit from
the proliferation of inefficient processing standards."
The problem is real, many people by now have experienced it, viable
solutions have to some extent been proposed, benchmark methods are
clear, all of the purely useful semantics of XML are being retained,
programmers could use better models and fewer coding requirements, and
application owners are demanding better efficiency.
sdw
Bullard, Claude L (Len) wrote:
If I open esYourML in a text editor, can I
read it? That's all the view source issue amounts to.
Probably not, unless the 'uncompress' code is
invoked, as it is for gzip when X3D uses that.
No, X3D definitely needs more than zipping,
because zipping we already have.
Some believe a binary helps with the view
source problem. Others know it doesn't help much.
Also, because the range of XML applications,
from real documents to message payloads,
gets differing amounts of inspection (I
can't envision many people view-sourcing a
SOAP message, but lots of them view source
X3D, HTML, etc. to acquire techniques),
the requirements for this one problem can
differ.
The problem with the esXML approach so far is
that it breaks the widely held contract of
well-formedness and substitutes asNeeded
well-formedness checking. While a good idea where
there is a lot of control, is that acceptable
for the kinds of blind exchanges that the
web at large requires? So again, it's not a
one-size-fits-all problem, and at this point I am
interested in the question I posed: do we
really need a general solution? I don't
expect an answer here but I expect that
question to stay at the front of the WG
agenda because otherwise, the rush to design
and implement will overcome the
more fundamental ecosystem questions of
adding new formats when existing
formats may be good enough. Just as some
don't want to add new addressing
systems, some don't want yetAnotherFlash.
The need for speed may not
justify all of the effort required to get a
new format into a worldwide federated
system of systems.
Again, I am pruning the CC lists.
len
esXML does not obscure anything, as all of the information needed for
XML 1.1 equivalence is in its self-describing format, to be
'uncompressed' by
any version of the library. If you are exchanging deltas, a
man-in-the-middle might not have access to the parent, but that's not
real obscurity.
The new formats, as we've discussed, are about efficiency of one or
more types, and explicitly not only size efficiency. Schema-based
approaches do tend to obscure, but self-describing formats like esXML
and, apparently, finf do not, beyond the need to uncompress to text.
This is not totally unlike needing to ungzip.
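The gzip analogy in code (a sketch only; Python's standard gzip module
stands in here for whatever library 'uncompresses' esXML):

    import gzip

    xml_text = b"<doc><item>42</item></doc>"

    packed = gzip.compress(xml_text)    # compact wire form, not readable as text
    unpacked = gzip.decompress(packed)  # any consumer with the library recovers it

    assert unpacked == xml_text         # nothing is obscured, only re-encoded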
sdw
Bullard, Claude L (Len) wrote:
It's a fair question. Lots of technologists and
marketing types have been drafting on XML since it
became successful, but this really is a request from
parts of the XML community to create a faster format
that XML systems can use. Reasons differ, mostly
they are the "need for speed", but also some want
to obscure the content from prying eyes and are not
bothered by arguments that say anything can be
reverse-engineered. There are customers who resent
view source prying and for good reasons. No, this
is not the best means to stop that but it helps like
that almost worthless bolt lock people use on their
doors that anyone with a little determination can
overcome. The difference is the number of people
who are really determined and able vs. those who
just want to do a bit of light burglary.
len
-----Original Message-----
From: Rick Marshall [mailto:rjm@zenucom.com]
i don't understand at all why we have to have binary or optimised xml.
it just seems to me that if what you want is eg asn, then use it. if you
want xml, use it. if your application can benefit from transforming xml
to asn or using asn with its "xml" extensions, then use a translator.
why not let xml do its job and asn and others do theirs? i can live with
importing and exporting data from databases when and as it seems
sensible to use xml for representation and databases for storage, and i'm
not convinced (probably never will be) that there's any advantage in
confusing rather than using standards and technologies.
rick
On Wed, 2004-04-14 at 23:43, Bullard, Claude L (Len) wrote:
Well, actually I mean the idea of calling something
XML that clearly isn't. The spinning of the 'what
is XML' thread doesn't impress me much. I agree
with Elliotte. The spec tells us exactly what
XML is.
People who want to do things that experience has
shown are short-sighted are sometimes called innovators
while their critics are labeled Luddites or Sabots.
After the innovators do their damage, it is a little late
to hit them with shoes. We really do need to know
if a binary is something only some applications need,
and therefore, a generalized spec and standard are
not required. Once a binary is approved for
all XML applications, XML will rarely be seen,
as programmers rush to the binary format for
the same reason countries fear they will be second
class without nukes.
My problem with the current thread is that it is
designing a binary ahead of making that determination.
The case is made for some applications using a binary.
The case is not made for it being generalized.
len
From: Rick Marshall [mailto:rjm@zenucom.com]
On Fri, 2004-04-09 at 23:50, Robin Berjon wrote:
Bullard, Claude L (Len) wrote:
References to 'optimized XML' without a clear
set of definitions for this. The slippery slope
is evident.
That's why there's a WG about it :)
i think len means the wg is the slippery slope. i certainly suspect it
is
rick
-----------------------------------------------------------------
The xml-dev list is sponsored by XML.org <http://www.xml.org>, an
initiative of OASIS <http://www.oasis-open.org>
The list archives are at http://lists.xml.org/archives/xml-dev/
To subscribe or unsubscribe from this list use the subscription
manager: <http://www.oasis-open.org/mlmanage/index.php>
--
swilliams@hpti.com http://www.hpti.com Personal: sdw@lig.net http://sdw.st
Stephen D. Williams 703-724-0118W 703-995-0407Fax 20147-4622 AIM: sdw