   Re: [xml-dev] Reductionist vs Holistic Semantics


Roger L. Costello wrote:

>Hi Folks,
>
>Reading [1] this morning stimulated some thoughts ...
>
>"The reductionist approach [involves] dissecting the world into the
>smallest and simplest pieces you can.  You look for the solution of some
>more or less idealized set of problems, somewhat divorced from the real
>world, and constrained sufficiently so that you can find a solution."[1]
>
>As I read this it occurred to me that the RDFS and OWL approach is
>reductionist.  That is, you take the existing world, break it up into
>pieces, and then document the relationship of those pieces.  This
>documentation of relationships constitutes the "semantics" of those
>pieces.
>  
>
The "new" semantics for RDF and OWL are based upon Model Theory. Current 
Model Theory is derived from Tarksi's writings e.g.<a 
href="http://www.ditext.com/tarski/tarski.html";>The Semantic Conception 
of Truth...</a>

I would say (though this is certainly something that can reasonably be 
debated) that this comes primarily from a study of the interaction of 
*language* and *meaning*, i.e. "truth", rather than primarily from a 
study of reductionist science; indeed, note the "Polemical Remarks" at 
the end of the above article.

For "pre informatics" molecular biology, biochemistry, etc. there is 
hardly any mention of first order logic etc.
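
To make the model-theoretic view concrete, here is a toy sketch, 
entirely my own construction (domain, predicates, and individuals are 
all invented): an interpretation assigns extensions to the predicate 
symbols of a language, and a sentence is then true or false *relative 
to that interpretation*.

# A toy Tarskian interpretation: truth is defined relative to a model,
# i.e. a domain plus an assignment of extensions to predicate symbols.
# All names here are invented for illustration.

domain = {"nikon_f", "leica_m", "tripod"}

# Extensions of the unary predicates Camera(x) and SLR(x).
interpretation = {
    "Camera": {"nikon_f", "leica_m"},
    "SLR": {"nikon_f"},
}

def holds(pred, x):
    """True iff individual x is in the extension of predicate pred."""
    return x in interpretation[pred]

# Evaluate the sentence  forall x. SLR(x) -> Camera(x)  in this model.
sentence_true = all(
    (not holds("SLR", x)) or holds("Camera", x)
    for x in domain
)
print(sentence_true)  # True: every SLR in this model is a Camera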

>For example, consider the camera domain.  An OWL ontology will break up
>this domain into pieces such as Camera, SLR, aperture, f-stop, etc.  And
>then it will relate those pieces like this:
>
>   - SLR is a type of Camera
>   - aperture is synonymous with f-stop
>
>But is it reasonable to treat semantics with such a sterilized,
>laboratory approach?
>  
>
It depends. A photographer needs to assemble a set of equipment in order 
to go out on a shoot. This generally requires -- at least at the 
professional level -- an understanding of cameras, lenses, shutters, etc.
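
For what it's worth, the quoted camera example is trivial to write 
down; here is a minimal pure-Python sketch (names invented; a real 
system would use RDF/OWL tooling rather than raw tuples) of the 
subclass inference an RDFS processor performs:

# The camera ontology as subject/predicate/object triples, with a
# hand-rolled rdfs:subClassOf inference step. Invented toy example.

triples = {
    ("SLR", "rdfs:subClassOf", "Camera"),
    ("aperture", "owl:equivalentProperty", "f-stop"),
    ("myNikon", "rdf:type", "SLR"),
}

def infer(triples):
    """Apply the RDFS rule: x type C and C subClassOf D => x type D."""
    inferred = set(triples)
    changed = True
    while changed:
        changed = False
        new = {
            (x, "rdf:type", d)
            for (x, p1, c) in inferred if p1 == "rdf:type"
            for (c2, p2, d) in inferred
            if p2 == "rdfs:subClassOf" and c2 == c
        }
        if not new <= inferred:
            inferred |= new
            changed = True
    return inferred

print(("myNikon", "rdf:type", "Camera") in infer(triples))  # True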

>"The real world demands ... a more holistic approach.  Everything
>affects everything else, and you have to understand the whole web of
>interactions."[1]
>  
>
Yeah, but when you need to sit down and start doing some real work, you 
need to start *somewhere*. If you sit down and start thinking about 
*everything* you probably won't get much done, unless perhaps you are a 
philosopher and are working on a generalized model of the world.

>I will argue here that semantics must be approached from a holistic
>approach (i.e., a complex systems approach).
>  
>
The complex systems approaches I've studied are generally *highly* 
mathematical in nature; they just use different equations than model 
theory does. A good place to read about the interaction between 
complex-systems modelling and graph theory is the literature on "Petri 
nets".
...
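
For readers who haven't met Petri nets, here is a toy sketch, entirely 
my own invention (place and transition names are made up): places hold 
tokens, and a transition fires when every input place has a token, 
consuming from inputs and producing to outputs.

# Toy Petri net: places carry token counts; a transition is enabled
# when each input place holds at least one token, and firing moves one
# token from each input to each output. Names invented for illustration.

marking = {"film_loaded": 1, "shutter_cocked": 1, "exposure": 0}

transitions = {
    "take_photo": {"inputs": ["film_loaded", "shutter_cocked"],
                   "outputs": ["exposure"]},
}

def enabled(name):
    return all(marking[p] >= 1 for p in transitions[name]["inputs"])

def fire(name):
    assert enabled(name), f"{name} is not enabled"
    for p in transitions[name]["inputs"]:
        marking[p] -= 1
    for p in transitions[name]["outputs"]:
        marking[p] += 1

fire("take_photo")
print(marking)  # {'film_loaded': 0, 'shutter_cocked': 0, 'exposure': 1}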

>
>There may be utility to treating semantics with a sterilized,
>laboratory approach.  Certainly if this were the 17th century, when
>computers weren't available, then such a static, taxonomy-like approach
>would be acceptable.  But in today's dynamic, computer-driven world
>surely we can do better ... much better.
>  
>
We, as in who? Because there are many folks doing actual coding of 
software that processes ontologies, as well as organizations such as the 
College of American Pathologists (CAP), which has spent countless 
man-hours and millions of dollars developing SNOMED, a medical ontology 
with roughly 400,000 terms. The American Medical Association has 
developed the CPT coding system (essentially an ontology) on which 
virtually *all* of the American medical reimbursement system is based. 
This is *literally* trillion-dollar stuff.

Can you actually do better? Pick a real-world problem domain (one that 
folks are willing to spend big $$$ working in) and implement something 
*real*.

>I believe that Didier and Mike Champion made mention of Google as a tool
>which provides semantics in a holistic fashion.  I found their
>statements extremely enlightening.  I totally agree with them.  Yes, I
>think that Google is the best semantics tool today.
>  
>
>While Google provides semantics in a holistic fashion, it is more or
>less semantics for eyeballs, i.e., the results it returns are intended
>for humans to process.  The critical problem is how to create a tool
>which provides semantics in a holistic fashion *for computers*.  Would
>someone care to take a stab at characterizing the nature of such a tool?
>  
>
Well, Google is at its essence a *network analysis tool*, i.e. a piece 
of graph-processing software. There are all sorts of probabilistic 
network models, e.g. Bayesian networks.
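
As a concrete illustration of "Google as graph processing", here is a 
minimal PageRank power-iteration sketch of my own (the link graph is 
invented, and the production system is of course far more involved):

# PageRank by power iteration on a tiny invented link graph -- the
# network-analysis core of Google's original ranking, much simplified.

links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
pages = list(links)
damping = 0.85
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):  # iterate until the ranks settle
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += damping * share
    rank = new_rank

print({p: round(r, 3) for p, r in rank.items()})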

I don't think we need to throw the baby out with the bathwater, i.e. I 
don't think the current work on ontologies is misguided *in the least*. 
On the other hand, we do need to incorporate Bayesian/stochastic 
modelling into the current work, for example to classify (i.e. diagnose) 
patients on the basis of symptoms/findings/lab tests.
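
To make the diagnosis example concrete, here is a naive-Bayes sketch 
with invented toy numbers (emphatically not real clinical data): 
combine a disease prior with per-finding likelihoods and take the most 
probable class.

# Naive Bayes classification of a patient from observed findings.
# Priors and likelihoods are invented toy numbers, not clinical data.
import math

priors = {"flu": 0.3, "cold": 0.7}
likelihood = {  # P(finding present | disease)
    "flu":  {"fever": 0.9, "cough": 0.8},
    "cold": {"fever": 0.2, "cough": 0.6},
}

def posterior_scores(findings):
    """Log-space naive Bayes: log P(d) + sum over f of log P(f | d)."""
    return {
        d: math.log(priors[d])
           + sum(math.log(likelihood[d][f]) for f in findings)
        for d in priors
    }

scores = posterior_scores(["fever", "cough"])
print(max(scores, key=scores.get))  # flu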

To me, where the money is at, and I've said this before (search the 
archives), would be to develop some sort of melding of a Bayesian or 
Markov-based pattern analysis tool with the current finite-state tools 
and/or DL inference engines.
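
To sketch the kind of melding I mean (a toy of my own, with invented 
states, probabilities, and threshold): score a trajectory with a Markov 
transition model, then feed the result through a simple 
finite-state-style rule.

# Toy melding of a Markov sequence model with a rule-based check:
# score a sequence of observations, then apply a hard constraint.
# States, probabilities, and the threshold are invented.
import math

# P(next state | current state) for a two-state process.
transition = {
    "stable":    {"stable": 0.9, "worsening": 0.1},
    "worsening": {"stable": 0.3, "worsening": 0.7},
}

def sequence_log_prob(states):
    """Markov-chain log-probability of a state sequence (uniform start)."""
    logp = math.log(0.5)
    for prev, cur in zip(states, states[1:]):
        logp += math.log(transition[prev][cur])
    return logp

def flag_patient(states, threshold=-4.0):
    # Finite-state-style rule: flag if the trajectory ends "worsening"
    # and the Markov score says the trajectory itself is plausible.
    return states[-1] == "worsening" and sequence_log_prob(states) > threshold

print(flag_patient(["stable", "stable", "worsening"]))  # True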

Jonathan





 
