- From: "Bullard, Claude L (Len)" <clbullar@ingr.com>
- To: "Simon St.Laurent" <simonstl@simonstl.com>, xml-dev@xml.org
- Date: Thu, 19 Oct 2000 13:59:58 -0500
-----Original Message-----
From: Simon St.Laurent [mailto:simonstl@simonstl.com]
>In the first part, the Web becomes a much more powerful means
>for collaboration between people. [...]
>
>In the second part of the dream, collaborations extend to computers.
>Machines become capable of analyzing all the data on the Web - the
>content, links, and transactions between people and computers.
>A "Semantic Web," which should make this possible, has yet to emerge,
>but when it does, the day-to-day mechanisms of trade, bureaucracy,
>and our daily lives will be handled by machines talking to machines,
>leaving humans to provide the inspiration and intuition. The intelligent
>"agents" people have touted for ages will finally materialize. This
>machine-understandable Web will come about through the implementation
>of a series of technical advancements and social agreements that are
>now beginning (and which I describe in the next chapter.)
Ok, have there been some advancements in inference
engine technology that will improve expert systems
on distributed networks? I confess to being unaware
of them.
Collaboration between people over the network is
the first and last reason we use it. Extending
it to machines looks logical but beware the
real time system problems of superstitious acquisition
and cascading. Don't bet the farm on a potentially
noisy system. Chaos is chapter 1 in real time control.
Machines are very bad when faced with uncertainty.
October 1960: the early warning systems
in Thule were screaming to launch. The operator
turned off the system and found out they were
getting return radar echoes from the moon.
If it's going to get windy, let it blow AI:
From "Beyond the Book Metaphor" (1991), 1.6: The Secret of the
Devil and The Deep Blue Sea.
"Episodic memory is typically
implemented as scripts and rule sets which are invoked
when a set of data inputs for an event match a stored
set of parametric values in a system model data base.
These model sets are knowledge bases operated on by a
decision support system which by means of an engine
employing techniques such as Dempster-Shafer evidential
reasoning and with knowledge of the application, operations
and organizational domains can be used to map requirements
to capabilities and provide best fit solutions."
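The episodic-memory matching the quote describes, where a rule set fires when an event's data inputs line up with a stored set of parametric values, can be sketched roughly as follows. This is a minimal illustration, not the cited system; the episode structure, field names, and tolerance threshold are all invented for the example.

```python
# Hypothetical sketch of episodic memory as parametric matching: each
# stored episode pairs parametric values with a remediation script, and
# an incoming event is matched against them within a tolerance.

def match_episode(event, episodes, tolerance=0.1):
    """Return the stored episode whose parameters best match the event,
    or None if no episode matches within tolerance on every parameter."""
    best, best_score = None, None
    for ep in episodes:
        params = ep["params"]
        if params.keys() != event.keys():
            continue
        # Relative deviation per parameter; lower means a closer match.
        devs = [abs(event[k] - params[k]) / (abs(params[k]) or 1.0)
                for k in params]
        score = sum(devs) / len(devs)
        if all(d <= tolerance for d in devs):
            if best_score is None or score < best_score:
                best, best_score = ep, score
    return best

episodes = [
    {"params": {"temp": 90.0, "pressure": 2.0}, "script": "open_relief_valve"},
    {"params": {"temp": 20.0, "pressure": 1.0}, "script": "steady_state"},
]
hit = match_episode({"temp": 88.0, "pressure": 2.1}, episodes)
```

An event near a stored episode invokes that episode's script; an event matching nothing falls through, which is where the uncertainty handling discussed below takes over.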
The trick was iteration and a system that learned to
detect and remediate chaos-inducing feedback. Episodic
memory is used to flag known problems that occur
when applying the frame-base. To reduce destabilization,
tests are performed to ensure the process has executed
correctly and to proof for known error sources. Testing
is a scheduled event. The system becomes noise-tolerant,
not noiseless. Tests can slow it down and that becomes
a latency issue (perhaps worse if the comm protocol is
stateless but that is an issue to think about).
Latency in the hierarchy of control processes
is a potential source of chaos. Business objects that are
ACID-conformant typically isolate the transactions such
that rollback is always possible. However, that means
you have to beware of processes with timing issues or
hidden couplers if the process is new and/or mission-critical.
The principle of least commitment, or just-in-time
binding, is applied. Analysis of historical data is used
to refine the knowledge base to offset the effects of
inadequate rules or noisy signals, that is, reasoning
with uncertainty. These were often called "learning systems"
and a lot was made of the power of neural network models
for predicting failures prior to their occurrence such
that damping could be applied prior to runaway cascades.
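The reasoning-with-uncertainty step can be illustrated with Dempster's rule of combination, the Dempster-Shafer technique named in the quote above. This is a minimal sketch under textbook assumptions; the sensors, hypotheses, and mass values are invented:

```python
# Dempster's rule of combination: focal elements are frozensets of
# hypotheses, each mass function sums to 1, and conflicting mass
# (intersections that come up empty) is discarded by normalization.

def combine(m1, m2):
    """Combine two mass functions by Dempster's rule."""
    combined, conflict = {}, 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mb * mc
            else:
                conflict += mb * mc  # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are irreconcilable")
    return {a: m / (1.0 - conflict) for a, m in combined.items()}

# Two noisy sources weigh in on whether a fault lies in 'pump' or 'valve';
# mass on the whole set {pump, valve} represents "don't know".
m1 = {frozenset({"pump"}): 0.6, frozenset({"pump", "valve"}): 0.4}
m2 = {frozenset({"pump"}): 0.5, frozenset({"valve"}): 0.3,
      frozenset({"pump", "valve"}): 0.2}
m12 = combine(m1, m2)
```

The point for the thread: combining noisy sources does not eliminate uncertainty, it redistributes belief while explicitly tracking how much of the evidence was in conflict, which is what makes the system noise-tolerant rather than noiseless.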
The collaboration model Tim is dreaming about used
to be called a Type C production system (see "Automated
Design of Displays for Technical Data," Westfold, et al.,
Kestrel Institute under contract to AFHRL, AF Systems
Command, Brooks AF Base, TX, Sept 1990). These systems
generate displays using relations in the technical data.
A generator creates a space of possible displays and a
selector prunes this space using rule-based criteria for
possible displays. Similar to today's DHTML systems, one
could talk about a tactical interface: created for the
particular use based on particular conditions. This is
very useful when considering fielding systems that have
variant workflows.
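The generator/selector pairing can be sketched as generate-and-prune. This is an illustrative toy in the spirit of the description above, not the Kestrel system itself; the fields, widgets, and rules are invented:

```python
# Generate-and-prune: a generator enumerates the space of possible
# displays (one widget per data field), and a rule-based selector
# prunes that space down to the displays that satisfy every rule.
from itertools import product

def generate(fields, widgets):
    """Yield every display assigning one widget to each field."""
    for combo in product(widgets, repeat=len(fields)):
        yield dict(zip(fields, combo))

def select(displays, rules):
    """Keep only displays that satisfy every selection rule."""
    return [d for d in displays if all(rule(d) for rule in rules)]

fields = ["temperature", "status"]
widgets = ["gauge", "text", "table"]
rules = [
    lambda d: d["temperature"] != "table",  # a scalar gets no table widget
    lambda d: d["status"] == "text",        # status is rendered as text
]
chosen = select(generate(fields, widgets), rules)
```

Swap the rule set per deployment and the same generator yields a different "tactical interface" for each variant workflow, which is the appeal of the approach.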
A navigation method for locating data
during production and when fielded is provided. This might
be configured with a component hierarchy with cross references.
The model based reasoning component can be used to analyse
symptoms, diagnose faults, and present tests in the optimum
order, then incorporates user feedback or sensor feedback
to determine further tests.
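"Present tests in the optimum order" can be sketched as ranking candidate tests by discriminating power per unit cost and reranking as results narrow the fault set. The scoring heuristic here is an assumption of mine, not the cited system's method, and the tests and faults are invented:

```python
# Rank diagnostic tests by how many still-plausible faults each one can
# discriminate, divided by its cost; rerun after each result prunes the
# fault set, so the ordering adapts to user or sensor feedback.

def rank_tests(tests, faults):
    """Order tests by (plausible faults discriminated) / cost, best first."""
    def score(t):
        return len(t["discriminates"] & faults) / t["cost"]
    return sorted(tests, key=score, reverse=True)

faults = {"pump", "valve", "sensor"}
tests = [
    {"name": "pressure_check", "cost": 1.0,
     "discriminates": {"pump", "valve"}},
    {"name": "teardown", "cost": 10.0,
     "discriminates": {"pump", "valve", "sensor"}},
    {"name": "sensor_ping", "cost": 0.5,
     "discriminates": {"sensor"}},
]
ordered = rank_tests(tests, faults)
```

Cheap, informative tests surface first; the expensive teardown is deferred until cheaper evidence has failed to localize the fault.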
Most of this doesn't work for so-called wicked problems:
the unknown-unknowns. However, to get
back to the services thread, the basic components
of the knowledge base were (based on David Hu's work,
C/C++ for Expert Systems):
o Access controller - decomposes queries, routes subqueries
to local data bases, processes data and forwards replies
(Similar to workflow layer in current architectures)
o Inference engine - backward and forward chaining,
demon objects, blackboard communications, meta-strategy
execution and control
o Weights and weight propagation for fuzzy diagnosis
and confidence values for handling uncertainty
o Frame or object based representations and rule
language including default values for incomplete
knowledge reasoning
o Editors
o Explanation interface (advisor) to help clarify why and
how a goal was achieved
o Pattern classifier for categorization and incremental
update of episodic memory and the system synthesis model
And so on and so forth...
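The inference-engine component in that list can be sketched, in its forward-chaining half, as iterating rules over working memory to a fixed point. A minimal sketch; the facts and rules are invented (echoing the Thule anecdote) and real engines add conflict resolution, demons, and a blackboard:

```python
# Minimal forward chaining: each rule maps a set of antecedent facts to
# a consequent; rules fire when all antecedents are in working memory,
# and iteration continues until no new fact can be derived.

def forward_chain(facts, rules):
    """Apply rules to working memory until a fixed point is reached."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if consequent not in facts and antecedents <= facts:
                facts.add(consequent)
                changed = True
    return facts

rules = [
    ({"echo_delay_long"}, "source_distant"),
    ({"source_distant", "no_hostile_track"}, "probable_false_alarm"),
]
derived = forward_chain({"echo_delay_long", "no_hostile_track"}, rules)
```

Backward chaining runs the same rules goal-first; either way, the engine is only as good as the rule base, which is the whole caution of this post.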
Len Bullard
Intergraph Public Safety
clbullar@ingr.com
http://www.mp3.com/LenBullard
Ekam sat.h, Vipraah bahudhaa vadanti.
Daamyata. Datta. Dayadhvam.h