- From: "Bullard, Claude L (Len)" <clbullar@ingr.com>
- To: "Simon St.Laurent" <simonstl@simonstl.com>
- Date: Mon, 17 Jul 2000 15:37:41 -0500
Hi Simon:
I read it, and thanks. Interesting. I think it
is a good start and eminently recognizable, which
is why I trust it. If it were truly revolutionary
with no basis for comparison, I would reject it
as too far ahead of the event horizon for immediate
use. As 'tis, I know what to do with this: engineer
an enterprise for business.
This is the enterprise engineering stuff. Hooray,
XML finally gets away from the period of domination
by parserHeads and into the really truly fun bits
for application engineers. The wait is over. We
can finally boogie in the big arenas.
To the WaaaaayBack machine, Sherman.
The seminal period for me in thinking about this was
1988 to 1992. If you could poke around the CALS and
PDES archives, you might find quite a bit of the thinking
from that period. At that time, we were looking at
non-linear dynamic systems and exploring their realization
in real time systems for describing very large integrated
product development environments, particularly, how
hypermedia could be used to enable these. The problem
to be solved was not just cost reduction and high
quality, but the issues of noisy environments. How
to hold a band together on a stage and still be
able to improvise at will is the same problem as
managing a business through a period of technical
emergence: predictability depends on discovery
of known good sources of information. Gotta learn
how, and deep knowledge isn't always as good as
awareness, desperation, and dumb luck.
But it should be.
One example at the time was enabling high-tech plants in
the countries just then coming from behind the crumbling
Soviet bloc where 50 years of Marxism had created
low-resolution work environments (non-competitive
and very difficult to change or retrain). In the face of a
dysfunctional culture made that way by accidents of
history, how does one shape the behavior
toward a stable cooperating system? Marxist/socialist
systems did precisely the opposite. They tended
toward obfuscation to hide mistakes instead of
translating mistakes into learning. This is also, by
the way, the "to the metal" problem of teaching XML,
sometimes just called NIH.
Competition is essential to local coherence
in communications. They have to WANT to work. A whip
just won't do it. The pigeon bites the hand
that holds the whip AND the food after a while.
There are some fundamental concepts:
o Event-based. Signals are typed and are point-to-point.
This means the system does not require locked
synchronization but does allow scheduling. Locked
synchronization depends on centralization, and that
impedes local discovery. Take a spontaneous ride on your
instrument while playing Beethoven's Fifth and
see who is first violin next week.
o Rules and Contexts: the orchestration model is
pretty directly the model the HyTimers discussed.
Remember, that was a music description language
to begin with, and the understanding that the
timing and gestural model of music could be generalized
to an orchestrated performance was seminal. Take a
great ride during a performance of Don't Get Around
Much Anymore and unless you step on the guy next to you,
you can be the first saxophonist next week unless
he does a better one this weekend. Context and rules
count in a negotiated set of services and roles.
(I didn't see a role model in SCL. Hmm.)
o Hierarchical description of business processes as
contracts. This is simply a Work Breakdown Structure
with discoverability. It enables a percolation model
for performance.
When percolating, you can't predict an exact path
but you don't have to. Virtual time is top down and
real time is bottom up. These are view dimensions.
The upper level views are managerial/control views
and the lower level views are real processes. The
idea is to enable the scheduling of opening and closing
views that have tests for well-performed behaviors,
a.k.a. goodness, or the reliability of and trust in the
information. See Claude Shannon: "Data becomes
information as it removes uncertainty" and Boltzmann's
equation, S = k log W, in which the number of good
referents in the system determines its entropic state.
o Binding points. We talked a lot in those days about
tessellating models. It is a geometric concept, but it
explored the idea of DTDs (now schemas) that enabled
point-to-point constructions. The idea is that the
schemas could be bound as needed when needed but more
important, adapted within a defined space. Schemas
define boundaries. We think of these now as namespaces
but the concept is the same: non-ambiguity in an address.
Timing plays a big role here, as do loose and
tight coupling, such that orchestration is not overly
constrained and the resulting performance is not
overly predictable (it is entertaining, and thus keeps
the attention (the real commodity of cost) of
those who must perform it and those who choose
to attend it).
Compare classical music performance
to jazz or rock performance to get a feel for why this
has to work this way. Latency and noisy environments
are very important. The question is, how did Ringo
keep time for the rest of the band in the face of
a few thousand screaming girls? He watched John
and Paul's buns. They wiggled as they played, so while
he could not hear, he could keep the beat. This is a
gestural system lightly coupled to ensure a reasonable
if not perfect rendering. It was a discoverable
service of John and Paul's buns which Ringo could
use when all else failed due to noise. Ringo was
the timekeeper and he scaled time to the motion
of the buns to ensure real time coherence (on the
other hand, not great musicality if you listen to
the Hollywood Bowl performance, but the customer
couldn't care less, and folks, that is quality).
If we use the notion of the time quanta, it sets the lowest
level of process resolution and is scalable. The timeline
can be said to exhibit the features of a Cantor set, or
in a two-dimensional realization, a Sierpinski gasket.
Mappings onto the coordinate space of events that fail are not
addressable, or simply, fall through the cracks, or better,
are not viewed or modeled. Closure of a process creates a
continuous map. Failure to close is a discontinuity.
It is important to remember that a hierarchy of infinities
is not a real object; it is a recursive process or a nesting
of recursive processes: loop to success or exhaustion.
Transfinite numbers are neither real nor really numbers,
just a way to talk about accuracy or granularity.
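A minimal sketch of that timeline view, in Python, assuming an
arbitrary unit interval and depth and the usual remove-the-middle-third
construction; none of this comes from SCL, it only illustrates quanta
that remain addressable and events that fall through the cracks:

def cantor_quanta(start, end, depth):
    """Return the intervals still addressable after `depth` removals."""
    if depth == 0:
        return [(start, end)]
    third = (end - start) / 3.0
    # Events mapped into the removed middle third "fall through the
    # cracks": they are not addressable at this resolution.
    left = cantor_quanta(start, start + third, depth - 1)
    right = cantor_quanta(end - third, end, depth - 1)
    return left + right

for interval in cantor_quanta(0.0, 1.0, 3):
    print("addressable quantum: %.3f - %.3f" % interval)
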
Human systems can be viewed as loosely coupled real-time
systems where policy directs events. Time is not a
director, just a scaling frame. Events direct events.
If time to reply is not important, rough granularity is
acceptable (again, synchronization of very large systems
is problematic due to latency). Document-based
systems or policies create a rough, closed feedback loop
for adapting the communications: do it, or do it again
until it closes. Latency is the key issue not just
in speed of send/respond, but also in absorption of the
signal in the presence of noise (roughly, the power law
at the receiver). Self-correcting systems (e.g.,
shared schema) are a means to detect and correct
for noise. Self-adjusting systems (e.g., dynamic
schema) enable the system to self-correct or to
learn and thus evolve new capability. In genetic terms,
self-selection. Note that in Darwinian thinking,
only living systems have local rules for self-directed
evolution. It is a fascinating idea.
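A rough sketch of the self-correcting versus self-adjusting
distinction, assuming a toy "schema" that is nothing more than a
dict of field names to types; purely illustrative, not a real XML
schema and not anything in SCL:

# Self-correcting: a fixed, shared schema detects and repairs noise.
# Self-adjusting: the schema itself is extended when new signal appears.
shared_schema = {"part": str, "qty": int}

def self_correct(message, schema):
    """Drop or coerce fields that do not match the shared schema."""
    clean = {}
    for field, value in message.items():
        expected = schema.get(field)
        if expected is None:
            continue                        # unknown field: treat as noise
        try:
            clean[field] = expected(value)  # coerce, e.g. "12" -> 12
        except (TypeError, ValueError):
            pass                            # uncorrectable: discard
    return clean

def self_adjust(message, schema):
    """Learn: admit a new field into the schema instead of discarding it."""
    for field, value in message.items():
        schema.setdefault(field, type(value))
    return self_correct(message, schema)

noisy = {"part": "bolt", "qty": "12", "color": "red"}
print(self_correct(noisy, dict(shared_schema)))  # color discarded
print(self_adjust(noisy, dict(shared_schema)))   # color admitted, learned
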
The problem of the web is superstition; that is, the
web is an amplifier that feeds signals back into itself,
and this enables degraded modes to emerge and be sustained.
Little differences become big differences, and events
occurring in non-visible dimensions produce
effects that amplify across the boundaries. That is
why I get so wrapped around the history thread from
time to time because in Darwinian thinking, the history
of events is accidental but affective. Not understanding
this leads to more superstitions and more incorrect
behaviors. The system may still be sustainable but
its direction is questionable. Feeding error back
into a system is stochastic composition with a
pseudo-deterministic model of known processes.
Information does not want to be free (superstition);
Information wants to cohere (a prediction based on a
range of sustainable communications).
Think of it as a fractal event stream (feedback-mediated,
but false values in the control range). Cool for
some compositions, but not exactly what the price-sensitive,
weWantWhatWeSpecified, results-oriented business models are
supposed to predict and, as a control, produce. In
other words, if we want a high goodness factor in
the nested business model, we need trusted patterns,
and trust is a Markovian function. We can introduce
an episodic model depending on how much error the
processes can tolerate and still close correctly.
Episodic models do not have to close with absolute
precision as long as 'the job gets done'.
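Since trust as a Markovian function is carrying a lot of weight
here, a minimal sketch of one way to read it: the next trust state
depends only on the current state and whether the last exchange
closed correctly, not on the whole history. The states and
transitions below are invented for illustration:

# Illustration only: trust as a small Markov chain over named states.
TRANSITIONS = {
    # (current_state, last_exchange_closed_correctly) -> next_state
    ("untrusted", True):  "probation",
    ("untrusted", False): "untrusted",
    ("probation", True):  "trusted",
    ("probation", False): "untrusted",
    ("trusted",   True):  "trusted",
    ("trusted",   False): "probation",
}

def next_trust(state, closed_ok):
    return TRANSITIONS[(state, closed_ok)]

state = "untrusted"
for outcome in [True, True, False, True]:
    state = next_trust(state, outcome)
    print(outcome, "->", state)
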
To define the processes of the enterprise (discoverable
services), create a bounding definition or mission
statement of goals, then hierarchies of processes
to meet the goals (as defined in that document
element in the contract). Remember, this is not
a simulation model, although one could create one;
it is a computable living contract for real
business processes. It alerts the humans if it
detects something amiss (opportunity for negotiation
and learning or more discovery). Drift in the
model means costs, so renegotiation is always an
option and why I asked you about the negotiation
model in SCL. Again, virtual time (project time
and project costs) is top down and real time
(performance time and performance costs) is
bottom up. Renegotiation also entails taking
a misbehaving process and putting it to sleep
so another task can use the resource (a foldable
procedure). If the process is dependent and
a circularity develops, the performance locks.
Min/max local states must be detectable (instrumented).
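A small sketch of what one node of such a computable living contract
could look like, assuming a hierarchy of processes with instrumented
min/max limits and an alert hook for the renegotiation opportunity;
the class and field names are invented here, not taken from SCL:

# Hypothetical sketch: a hierarchy of process nodes (a WBS with limits).
# Each node carries min/max bounds; a reading outside them alerts a
# human, which is the renegotiation opportunity described above.
class ProcessNode:
    def __init__(self, name, min_value=None, max_value=None, children=None):
        self.name = name
        self.min_value = min_value
        self.max_value = max_value
        self.children = children or []

    def report(self, name, value, alert):
        """Push a measured value down the hierarchy; alert if out of bounds."""
        if self.name == name:
            if self.min_value is not None and value < self.min_value:
                alert(self.name, value, "below minimum")
            elif self.max_value is not None and value > self.max_value:
                alert(self.name, value, "above maximum")
            return True
        return any(c.report(name, value, alert) for c in self.children)

def notify_human(name, value, reason):
    print("renegotiate %s: %s (%s)" % (name, value, reason))

contract = ProcessNode("deliver-system", children=[
    ProcessNode("author-docs", min_value=10, max_value=40),
    ProcessNode("integrate-data", min_value=5, max_value=20),
])

contract.report("author-docs", 55, notify_human)  # drifts above maximum
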
If we adopt the notion that these processes as
declared are geometric, then they are space-filling
and the process space is bounded (can be said
to have an energy budget), thus the notions of
binding points, discontinuities, and so forth.
The performance will have a distinct shape, but
it should be of a class of shapes for like
performances. That provides a visualization of
the goodness of the performance instance. It
may also point to the kind of math that can
be used to analyze cluster density based on
the links and link types. This is the fractal
stuff; look there, not simply at
storage behaviors because it might let one
predict network saturation. If instead of
just plotting points, one thinks of a
Koch curve as a copy, rotate and scale
operation, the process fills the space of
addressable points in a space of variations.
The goal or Product is the classification
characteristic. Product and process wind
together, inseparable but separated, like
a double helix in DNA. A query can be seen as
a digital enzyme for state maintenance.
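For the Koch-curve remark, a bare-bones sketch of the copy, rotate,
and scale operation under the standard construction (each segment
becomes four copies scaled by one third, two of them rotated by
sixty degrees); the endpoints and depth are arbitrary:

import cmath, math

# Standard Koch construction as copy/rotate/scale on complex-plane points.
def koch(a, b, depth):
    if depth == 0:
        return [a, b]
    d = (b - a) / 3.0
    p1, p2 = a + d, a + 2 * d
    peak = p1 + d * cmath.exp(1j * math.pi / 3)  # copy rotated +60 degrees
    pts = []
    for seg in [(a, p1), (p1, peak), (peak, p2), (p2, b)]:
        part = koch(seg[0], seg[1], depth - 1)
        pts.extend(part[:-1])        # avoid duplicating shared endpoints
    pts.append(b)
    return pts

points = koch(complex(0, 0), complex(1, 0), 3)
print(len(points), "addressable points")  # 4^3 segments -> 65 points
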
That's enough for today. We may want to
come back, dump the geometry, and talk
about the relationships between style
systems and orchestration. Or maybe
someone else wants to pick this up at
the head and riff awhile. :-)
Len Bullard
Intergraph Public Safety
clbullar@ingr.com
http://fly.hiwaay.net/~cbullard/lensongs.ram
Ekam sat.h, Vipraah bahudhaa vadanti.
Daamyata. Datta. Dayadhvam.h
-----Original Message-----
From: Simon St.Laurent [mailto:simonstl@simonstl.com]
I've not had the chance to read it in detail, but at least it seems like an
interesting start.