Yes, they do become an issue. The more games we play
with metametameta sources to generate applications via joins,
the more the system starts to crawl.
There are definitely tradeoffs between approaches that
say "code fast and get something running" and those
that say "model it all, then implement it". I don't
want to restart that permathread, but an approach
that unites triadic relations to objects and relational
stores should sort that out. Ontological lifecycles
(versioning, etc.) have to be controlled if they
are to be dynamic. Jeff Heflin's paper at
http://concept.cs.uah.edu/
takes up the topic. Harry's paper is an informative
read and mentions the Helsinki principles (ISO TR9007:1987).
The Meme paper (ref'd yesterday) is very good although
deep and long.
Good thread. Again, there's a publication in this
topic because hybrid systems seem to be a permanent
aspect of the landscape. I hope one of you takes
the time to write it.
len
From: Hunsberger, Peter [mailto:Peter.Hunsberger@STJUDE.ORG]
He, he, yes, with sub-second response time.
Joins do eventually become an issue. A couple of weeks ago, in a note to
myself about optimization, I wrote:
"The joins across the system for the constant resolution of type data
are killing us. [Big RDBMS vendors product] just can't keep the
precedence of the relationship structure straight WRT to the originating
query focus. Why should it? How should it know what we don't already
know? If we knew the relationships in advance the schema would be fixed
instead of dynamic!"
That (and some discussions on xml-dev) is partly what led me to a
triples model in a relational database: we were joining to resolve
relationship types instead of explicitly modeling them in the first place.
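To make that concrete, here is a minimal sketch of a triples model sitting in a
relational store. It is an illustration only, assuming a generic SQLite
subject/predicate/object table rather than the actual schema discussed above:
relationship types become rows of data, so the schema stays fixed while the
relationships stay dynamic.

# A minimal sketch of a "triples in a relational store" layout, assuming a
# generic SQLite subject/predicate/object table (illustration only, not the
# schema discussed above): relationship types are stored as data, so new
# kinds of relationships need no schema change.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE triple (
        subject   TEXT NOT NULL,
        predicate TEXT NOT NULL,   -- the relationship type, held as data
        object    TEXT NOT NULL
    )
""")
conn.executemany(
    "INSERT INTO triple VALUES (?, ?, ?)",
    [
        ("doc:1",        "instanceOf", "class:Report"),
        ("class:Report", "subClassOf", "class:Document"),
        ("doc:1",        "authoredBy", "person:peter"),
    ],
)

# Resolving an indirect relationship is still a join (a self-join on the
# triple table), but the join is driven by predicate values rather than by
# a fixed schema that had to anticipate every relationship type.
rows = conn.execute("""
    SELECT t1.subject, t2.object
      FROM triple AS t1
      JOIN triple AS t2 ON t1.object = t2.subject
     WHERE t1.predicate = 'instanceOf'
       AND t2.predicate = 'subClassOf'
""").fetchall()
print(rows)   # [('doc:1', 'class:Document')]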
<snip>Danny's responses -- see my other reply to him</snip>