Jeff Lowery <Jeff.Lowery@creo.com> wrote:
> For me, it's not an issue of "how do I represent this data?",
> but "how do I define this data so that it can be represented
> several different ways?" (and efficiently).
Boy, that's a $64,000 question...
> There's certainly a lot of effort being expended in the
> development of mapping between relational and object models,
> and between object models and XML document models right now.
> The problem is that all mapping technologies
> seem to require a primacy of one model over the other. For
> example, I can
> generate class defs from a document model (Castor, JAXB), or
> I can generate document models from class definitions (JiBX).
> Is there a universal mapping language that can be used across
> all data representations (a.k.a. mediums)? At the risk of
> sounding like a thrall of certain fascists, I really do think
> that any such universal mapping language will, at its heart,
> be formulated on relational algebra. That's not to say that
> all models must conform to integrity constraints under all
> operations; what it does say is that those potential
> integrity violations are understood and handled correctly
> when data is moved from one representation to the next
> through the defined mapping operations.
Hmm, I might even grant integrity-constraint conformance (99.99% of the
time). What I wouldn't necessarily expect is normalization that conforms
to what experts in the current relational world might expect: I'm starting
to believe that data normalization and metadata normalization are orthogonal
to each other. Schemas that optimize one may do so at the expense of the
other. However, having (properly, whatever that might mean) done the
normalization either way, it seems that data will conform to the constraints
that result (more or less by definition)... E.g., there are known relational
patterns for normalizing a non-cyclical hierarchical structure. If I use
one of these to organize my data based on its structural relationships, the
resulting data normalization may be rather different than if I apply
standard data normalization to the same domain...
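To make the hierarchy point concrete, here is a minimal sketch of one such
relational pattern, the adjacency list: a non-cyclic hierarchy flattened into
rows of (id, parentId, name). All names here (HierarchyRows, Row, descendants)
are illustrative inventions, not from any particular schema or tool; the point
is only that the structural relationships, not the domain data, drive this
normalization.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the adjacency-list pattern: each row points at its parent,
// so the tree structure lives entirely in the (id, parentId) pairs.
public class HierarchyRows {
    record Row(int id, Integer parentId, String name) {}

    // Collect the names of all descendants of the node with the given id,
    // walking the parent pointers depth-first.
    static List<String> descendants(List<Row> rows, int rootId) {
        List<String> out = new ArrayList<>();
        for (Row r : rows) {
            if (r.parentId() != null && r.parentId() == rootId) {
                out.add(r.name());
                out.addAll(descendants(rows, r.id()));
            }
        }
        return out;
    }

    public static void main(String[] args) {
        List<Row> rows = List.of(
            new Row(1, null, "root"),
            new Row(2, 1, "child-a"),
            new Row(3, 1, "child-b"),
            new Row(4, 2, "grandchild"));
        // Depth-first order: [child-a, grandchild, child-b]
        System.out.println(descendants(rows, 1));
    }
}
```

Standard data normalization of the same domain might instead split rows by
attribute dependencies and never produce this self-referencing shape, which
is the orthogonality being suggested above.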
> I believe we're a long way from getting a mapping language
> that's both universal and easy to use. It may be a language
> with many dialects, each suited to a particular set of
> representations and mapping direction. It does seem, though,
> that the current technique of writing one-off Java, C++, and
> XSLT algorithms to perform these representational transforms
> is horribly error prone.
True, and yet all the domains agree on so many basic concepts (sigh).
The "model"-driven development world may provide some of the best ways to
attack this. An abstract model representation (a.k.a. UML, and more likely its
predecessors) holds out hope of being able to generate multiple