As a rule of thumb, view source is pretty good. Given that
it exposes the tightly bound, system-specific definition, it:
1. Lets me see as much detail as the vendor decides
to expose. Mileage varies here.
2. For the detail that is exposed, lets me infer, from the
properties of one instance, the likely meaning given more instances,
and from that build up a data dictionary (what schemas do
well) likely to express a probabilistic truth (theory) of the document type.
So a feature of the system (enabling openness) is likely to be as important as, if not more than, the actual schema used. In short, yes, docs are thin on the ground, but view source enables one to develop some and improve others, given informed insight and a sufficient number of instances to be considered representative.
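The inference described above can be sketched in a few lines: tally element names and parent/child pairs across several inspected instances, and let the frequencies suggest what is required versus optional. The document type, element names, and sample instances here are all hypothetical, just to illustrate the idea.

```python
# Sketch: building a rough "data dictionary" from viewed-source XML
# instances. All element names and samples below are made up.
import xml.etree.ElementTree as ET
from collections import Counter

def tally_vocabulary(documents):
    """Count element names and parent->child pairs across instances.
    More instances raise confidence that a name or relationship is real."""
    names = Counter()
    edges = Counter()
    for doc in documents:
        root = ET.fromstring(doc)
        for parent in root.iter():
            names[parent.tag] += 1
            for child in parent:
                edges[(parent.tag, child.tag)] += 1
    return names, edges

# Three "view source" samples of the same hypothetical document type.
samples = [
    "<memo><to>A</to><body>hi</body></memo>",
    "<memo><to>B</to><cc>C</cc><body>yo</body></memo>",
    "<memo><to>D</to><body>ok</body></memo>",
]
names, edges = tally_vocabulary(samples)
# <to> and <body> appear in every instance; <cc> looks optional.
print(names["to"], names["cc"])   # 3 1
print(edges[("memo", "body")])    # 3
```

With enough representative instances, the counts become the probabilistic theory of the document type the post describes; with only a handful, they are just a hypothesis.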
Pretty much what we do with RTF.
It says nothing about any given schema except insofar as inspection
reveals what experience suggests is the subset that expresses the
most widely used properties for some class of system.
len
-----Original Message-----
From: Tim Bray [mailto:tbray@textuality.com]
K. Ari Krupnikov wrote:
> Data formats need to be documented, the documentation needs to be
> publicly available and correct. No matter how good or bad a format is,
> if the documentation you have doesn't match what you find in actual
> files, or if there are crucial features that are not publicly
> documented, the format is useless.
Except for, they're usually not. Good docs are thin on the ground. The
virtue of the Web was that you could figure it out yourself anyhow by
doing a "view source". That's the real test I'd want to apply to the
MSFT offering or any other. I've taken one brief look at the XML output
from Office when it was at the alpha level; that first cut passed the
"View Source" test. Early evidence and not conclusive. -Tim