- To: <email@example.com>, "Thomas B. Passin" <firstname.lastname@example.org>
- Subject: RE: [xml-dev] Are people really using Identity constraints specified in XML schema?
- From: "Hunsberger, Peter" <Peter.Hunsberger@STJUDE.ORG>
- Date: Fri, 20 Aug 2004 08:57:11 -0500
- Cc: <email@example.com>
- Thread-index: AcSGWrWeXp8NLwT2SaichBXcPw/16QAYkCKw
- Thread-topic: [xml-dev] Are people really using Identity constraints specified in XML schema?
> When I read this I feel good about how we have engineered CAM.
> These real world examples show that we have it right - since
> it can handle all of this in its stride. 3 different ways of
> looking at one value? No problems. And context driven - yes!
I don't think it can be emphasized too much how important context-sensitive
validation is. The real-world requirements are simultaneously local and
global. Two main issues arise:
1) What is best practice for determining context? A simple hierarchical
mapping covers only some subset of the problem. A complete rules engine is
complex and expensive, and an ontological traversal is similarly complex
and expensive.
2) Normalization. Static XML templates don't easily provide a useful
degree of normalization. Given the lack of a clear path for my first
issue this may not seem like a real problem yet, but we're already
running into it. Our business analysts have handed us many requirements
to let them reuse already existing template fragments (we use them for
presentation, styling, and validation) across multiple contexts when a
portion of the template is identical.
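To make issue 1) concrete, here is a minimal sketch (hypothetical names and
rules, not our actual system) of the purely hierarchical mapping approach:
each element path resolves to exactly one rule, which is precisely why it
only covers a subset of the problem.

```python
# Hypothetical hierarchical context map: element path -> validation rule.
HIERARCHICAL_RULES = {
    "patient/name": {"required": True},
    "patient/visit/date": {"format": "ISO-8601"},
}

def lookup_rule(path):
    """Resolve a rule by walking up the path until a mapping matches."""
    while path:
        if path in HIERARCHICAL_RULES:
            return HIERARCHICAL_RULES[path]
        path = "/".join(path.split("/")[:-1])  # fall back to parent context
    return None

# Works: context is fully determined by position in the hierarchy.
print(lookup_rule("patient/visit/date"))  # {'format': 'ISO-8601'}

# Fails: if the rule for 'date' should also depend on a sibling value
# (say, the visit type), the path alone can no longer select the right
# rule -- the part of the problem a hierarchical mapping cannot cover.
```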
Currently for 1) we use a simple XSLT-based rules engine traversing
multiple XML hierarchies. I know where I want to go with this, and I
believe I can keep the processing costs reasonable and get an 80%
solution. The solution to 2) is joined at the hip with 1). As you
traverse the rules graph you locate pointers to the template fragments
needed to create the entire template: start globally, traverse to local,
and recursively use rules to determine how to combine the results.
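The shape of that traversal can be sketched roughly as follows. This is a
Python illustration with invented node names and fragments, not our actual
XSLT engine: each rules-graph node carries a pointer to a template fragment
plus child contexts, and assembly starts at the global root and recursively
combines fragments on the way down to the local context.

```python
# Hypothetical rules graph: node -> (template-fragment pointer, children).
RULES_GRAPH = {
    "global":     {"fragment": "<base-layout/>",    "children": ["department"]},
    "department": {"fragment": "<dept-header/>",    "children": ["form"]},
    "form":       {"fragment": "<patient-fields/>", "children": []},
}

def assemble(node, graph):
    """Start globally, traverse to local, combining fragments on the way."""
    entry = graph[node]
    parts = [entry["fragment"]] if entry["fragment"] else []
    for child in entry["children"]:
        parts.extend(assemble(child, graph))  # recurse toward local context
    return parts

template = assemble("global", RULES_GRAPH)
print(template)
# ['<base-layout/>', '<dept-header/>', '<patient-fields/>']
```

In the real system the "combine" step is itself rule-driven rather than a
simple concatenation, but the recursive global-to-local walk is the same.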
The implication for CAM is that, in the long run, you also need a VAM: a
Validation Assembly Mechanism. This may seem like overkill, but I
really think this is a multi-dimensional problem with the same
solution over each dimension: triples-driven graph traversal.
An interesting side effect of looking at the problem this way is that it
gives you a precise definition of ambiguity. Ambiguity arises when the
traversal over each dimension does not arrive at a single point but
rather at some higher-order space (be it 2D, 3D, or whatever). If you
arrive at such a space you either have to have defaults or a way to
alert some portion of the organizations involved that they have not yet
agreed on a workable solution (local augmentation of the rules graphs
is likely the near-term fix...). Now if only we really knew what each
dimension was (Zachman's architecture framework, anyone?:
http://www.zifa.com/). Clearly, MVC only captures a small portion of
the problem.
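That definition of ambiguity can be sketched directly. The triples and names
below are hypothetical: a one-step traversal over (subject, predicate,
object) triples either lands on a single point (unambiguous), or on a larger
set, in which case you fall back to a default or surface the disagreement.

```python
# Hypothetical (subject, predicate, object) triples; two organizations
# have registered conflicting validation rules for the admission form.
TRIPLES = {
    ("admission-form", "validated-by", "rules-v1"),
    ("admission-form", "validated-by", "rules-v2"),
    ("discharge-form", "validated-by", "rules-v1"),
}

def traverse(subject, predicate, triples):
    """Return every endpoint reachable via one predicate step."""
    return {o for (s, p, o) in triples if s == subject and p == predicate}

def resolve(subject, predicate, triples, default=None):
    endpoints = traverse(subject, predicate, triples)
    if len(endpoints) == 1:
        return endpoints.pop()   # unambiguous: traversal hit a single point
    if default is not None:
        return default           # fall back to an agreed default
    # Otherwise: alert the organizations that they have not yet agreed.
    raise ValueError(f"ambiguous: {subject} {predicate} -> {sorted(endpoints)}")

print(resolve("discharge-form", "validated-by", TRIPLES))  # rules-v1
```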
> Sure people can bitch about this not being 'simple' - but the
> kind of use cases you have shown only proves you need strong
> adaptability and flexibility to solve these real world needs.
> Fortunately - unlike Tim's thoughts on XQuery - I feel very
> confident that CAM has managed to address the needs.
It appears you're doing very good work on one portion of the problem.
However, watching the various standards chase each other in circles
around the world, feeding on vast amounts of human resources, growing
ever fatter, and continually failing to simplify my life doesn't leave
me with quite your confidence.