
RE: [xml-dev] Adam Bosworth on XML and W3C



Only Google. :-)

My intuition is that the two kinds of measure unite in the original entropy 
definitions for addressability and resource consumption, based on the 
individual access to the address (Boltzmann).  View dimensionality is 
point-of-view dependent, and a point of view is n-dimensional (limited, 
but not a priori).  Loading and consumption are analogous terms.
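
To make the Boltzmann remark concrete, a back-of-the-envelope sketch
(Python; the address counts and access probabilities are made up, nothing
measured):

import math

def shannon_bits(probabilities):
    # H = -sum(p * log2(p)), skipping zero-probability addresses
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

W = 1024                        # number of addressable states
uniform = [1.0 / W] * W
print(shannon_bits(uniform))    # 10.0 bits, i.e. log2(W)
print(math.log(W))              # ln(W): Boltzmann's count-of-states term, constant dropped

skewed = [0.70, 0.20, 0.05, 0.05]   # four addresses, unevenly loaded
print(shannon_bits(skewed))         # ~1.26 bits, below the uniform maximum of 2

When every address is equally likely, the Shannon figure collapses to
log2(W), which is Boltzmann's S = k ln W in a different log base; skewing
the access distribution only lowers it.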

One might ask: what is the entropy of the net (statistically, 
how do we predict 404s)?  As a rule of thumb: given two 
specifications for the *same* technology, which costs more 
to implement given some model of an implementor (the model 
is the view)?  I emphasize "same" because one might ask 
whether RELAX NG and XML Schema are the same technology with 
respect to their feature sets.  I think they are not, and that 
could be said of XHTML vs. XML Schema even more so.  So one 
could derive the two complexity measures, but I don't think 
the results can be generalized except in terms of cost.
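
For the 404 question, a toy calculation (the failure rates here are
invented, just to show the shape of the measure):

import math

def binary_entropy(p_404):
    # entropy in bits of one link resolution: it either resolves or 404s
    return -sum(p * math.log2(p) for p in (p_404, 1.0 - p_404) if p > 0)

print(binary_entropy(0.05))   # ~0.29 bits: 404s are rare, outcomes are predictable
print(binary_entropy(0.50))   # 1.0 bit: resolution is a coin flip, maximum entropy

A net whose links almost never 404 is low-entropy in this sense; one where
resolution is a coin flip sits at the maximum.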

len

-----Original Message-----
From: Fuchs, Matthew [mailto:matthew.fuchs@commerceone.com]
Sent: Tuesday, October 09, 2001 1:12 PM
To: Bullard, Claude L (Len); xml-dev@lists.xml.org
Subject: RE: [xml-dev] Adam Bosworth on XML and W3C


I think both are relevant - schema is consumed by both carbon and silicon
agents.  Two markup-related examples: ambiguous content models, and
Murata Makoto's work with hedge automata.  There seems to be a human
(carbon-based agent) predilection for ambiguous content models because they
can be easier to write, but they can cause problems for the behavior of
programs (silicon-based agents).  Likewise, Makoto's work is very elegant,
and implementation may not be so hard (low Kolmogorov complexity), but there
are cases requiring exponential processing time, which is why I was against
using hedge automata directly in Schema (when I heard "exponential" I thought
"denial-of-service attack").
 
It would be awesome if there were some way to relate the formal complexity
measures with psychological complexity.  Do you know of any sources?