- To: xml-dev@lists.xml.org
- Subject: RE: [xml-dev] Identifying Data for Interchange [was: XML Components]
- From: "Bullard, Claude L (Len)" <clbullar@ingr.com>
- Date: Mon, 6 Jan 2003 10:56:21 -0600
True enough, but up front I am interested in knowing that the
proposed standard first describes the source conditions as
precisely as needed, and then uses descriptions that are widely
understood. Sorry, Walter, but I do have to ensure that the
receiving nodes interpret particular items in a particular way.
I don't have to ensure that they all implement in a particular
way unless the implementation affects performance constraints in
a way unacceptable to the overall communication; still, the Mars
Climate Orbiter proved the value of a shared interpretation of
the numbers.
Yes, position data is better than distance for the applications
I deal with. To those, we add timestamps because we need this
data not once but frequently: we are plotting in real time on a
map. Frequency matters, as does speed, because we are calculating
not distance but arrival time, given alternative routes and
timestamped conditions on those routes. Not everyone needs that
data, so we would not propose a generic position element to be
used by any application.
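For illustration only, here is a minimal sketch of the kind of
report I mean; the element and attribute names, the WGS84 datum,
and the coordinate values are placeholders of my own, not a
proposal:

   <PositionReport timestamp="2003-01-06T10:56:21-06:00"
                   datum="WGS84">
     <Latitude>34.7304</Latitude>
     <Longitude>-86.5861</Longitude>
   </PositionReport>

Two requirements follow from a report like that: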
1. The timestamp also has to be standardized. Not hard,
but don't ignore it. The XML Schema datatype threads have
exposed the issues here, so we don't need to repeat them
(see the schema sketch after this list).
2. The representation must be processable fast enough that
the near-real-time update of the map appears to be true real
time to the person at the console. The accuracy of the actual
recorded data must be close enough to stand up in litigation.
Those who do aircraft accident simulation for investigation
understand this.
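On the first point, a hedged sketch of how the timestamp could
be typed, assuming the usual xs prefix bound to
http://www.w3.org/2001/XMLSchema. Note that xs:dateTime permits
an optional timezone in the lexical form; whether to require one
is exactly the sort of issue those datatype threads surfaced:

   <xs:element name="PositionReport">
     <xs:complexType>
       <xs:sequence>
         <xs:element name="Latitude" type="xs:double"/>
         <xs:element name="Longitude" type="xs:double"/>
       </xs:sequence>
       <xs:attribute name="timestamp" type="xs:dateTime"
                     use="required"/>
       <xs:attribute name="datum" type="xs:string"
                     use="required"/>
     </xs:complexType>
   </xs:element>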
It is difficult (no duh) to write a standard for data without
reconciling the target applications. It is dangerous to write
a standard that tightly couples to the target applications. The
middle ground is about right but tends to make everyone equally
dissatisfied. No free lunch.
Extreme Programming artifacts are dubious standards candidates
when initially fielded. If they survive and thrive, they are
good candidates, but that is true of any data application
regardless of the methodology. For cost reasons, the test
of standardization should be success in the market. Everything
else is a specification for an application, and here the XP
fans have it right: specifications should become standards only
when proven to work for multiple parties with quantified and
valid interests. That effectively halves the life cycle
of a standard.
len
-----Original Message-----
From: Mike Champion [mailto:mc@xegesis.org]
On Sat, 04 Jan 2003 12:03:50 -0500, Roger L. Costello <costello@mitre.org>
wrote:
> Instead, define precisely one, unambiguous "interchange standard". This
> becomes the "lingua franca" interchange format.
Sigh. The eternal optimism of nerds ;-) that human factors can somehow be
assumed away.