RE: Picking the Tools -- Marrying processing models to data models
- From: "Bullard, Claude L (Len)" <clbullar@ingr.com>
- To: Jeff Lowery <jlowery@scenicsoft.com>,'Uche Ogbuji' <uche.ogbuji@fourthought.com>
- Date: Tue, 22 May 2001 16:34:40 -0500
Well, yeah, it can be harder.
Is the thinking one-off? The suite operates around a
common data model. XML thinking asks the question:
what if the suite requires a common data model so
we can *afford* to build the non-common variants?
When an application passes some threshold of
functionality, it becomes too expensive to build for a
sale of one. The common data model creates a market.
How common is it? Perry says, local nodes rule. Ok.
True enough. Yet, does that remove the necessity of a
common model, or in some cases, does it mean that
the common model is shallow but effective, i.e., it shapes
the dataspace effectively?
In the police business, there is a standard for
police data. It doesn't have a bit of XML, OOP, or even C in
it. It just names the names, associates a description,
and a type to the data. We build everything else based
on a not-so-common model of business rules that determine
relationships (a before b, a and b, a not b) and so forth.
The common model is simply the definition of the data for the
last guy in the food chain: the FBI. The local guys
couldn't care less about it, yet as stated, accommodating the
last receiver gives us enough definition that
we can create structure, and once we have structures,
the local rules are just customization with a few
new thingies here and there. Now we have a product
and a market because there is a fair amount of custom work
and a fair amount of shared definitions. There is
reuse but it may not be very deep. There is commonality,
but only in that there is a consumer at the end of
the food chain waiting to be fed. Last guy rules.
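A minimal sketch of what such a standard might look like in code: just names, descriptions, and types, with local rules layered on as customization. The field names here are hypothetical illustrations, not drawn from any actual police data standard.

```python
from dataclasses import dataclass

# A "common model" entry: a name, a description, and a type.
# No XML, OOP, or C required -- just definitions.
@dataclass
class FieldDef:
    name: str
    description: str
    datatype: type

# The shared dictionary exists to satisfy the last receiver
# in the food chain. Entries below are invented examples.
COMMON_MODEL = [
    FieldDef("incident_id", "Unique identifier for the incident", str),
    FieldDef("incident_date", "Date the incident occurred", str),
    FieldDef("offense_code", "Standard code for the offense", str),
]

def validate(record: dict) -> list:
    """Check a local record against the common definitions.
    Local business rules would be added on top of this."""
    errors = []
    for f in COMMON_MODEL:
        if f.name not in record:
            errors.append("missing " + f.name)
        elif not isinstance(record[f.name], f.datatype):
            errors.append(f.name + " should be " + f.datatype.__name__)
    return errors
```

The point of the sketch: once the definitions exist, structure follows, and local variation is just customization against that structure.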
What about standard processes? Two scenarios:
Evolution:
Over time, a few leaders in a market emerge, mostly
the top dog and the second dog. The decisions they
enforce locally about their product colonize the market.
The nos and yeses for commitments to modifications and
enhancements add features and variants; you sold the
second version to recoup the cost of the prototype,
and now you have customers on maintenance, so some
bugs became features, but...
It begins to converge because
o the customer tends to imitate his peers and
o with only two dominant attractors, the shape of
the process emerges, and people understand it, think
it is the right way instead of simply a product
of feedback (processes evolve and devolve). In
other words, you accept local minima because customers
get what they pay for. Promise control is everything.
After some time of doing that, it becomes ripe for sharing
process definitions. Process controls emerge from
neighbors that exchange anything. The longer the
chain, the more important it is to satisfy the
last guy first. The bottom of the stack rules.
The XML is the easy bit. The trick is
getting Big Dogs to share. Interoperation is about
willingly sacrificing market share to cohesion with
the anticipated reward being expansion or replacing
old with new. I hear Malthus rattling....
Fiat:
Some committee grinds for years and says, "ok,
here is what you are doing and here is what is
common and here is a standard definition. We
described the things we named and typed and
now you can implement structures for these."
Back to square one. Who sez? Well, who cares?
Len
http://www.mp3.com/LenBullard
Ekam sat.h, Vipraah bahudhaa vadanti.
Daamyata. Datta. Dayadhvam.h
-----Original Message-----
From: Jeff Lowery [mailto:jlowery@scenicsoft.com]
Any suite of applications built around a
common database is reusing the same data model. The ugly problem is that
it's a relational model most of the time, and it winds up having to be mapped
into a set of hierarchical objects *by hand*. I'm not going to hazard a
guess as to whether the process could be automated if the data in the object
hierarchy, with many of the attendant constraints, were represented as a
schema. It couldn't be any harder, however.