RE: [xml-dev] Adam Bosworth on XML and W3C
- From: "Bullard, Claude L (Len)" <clbullar@ingr.com>
- To: Eric van der Vlist <vdv@dyomedea.com>
- Date: Tue, 09 Oct 2001 12:47:52 -0500
For the spec, the quality and clarity of the examples
would be measured as part of psychological complexity
(cognitive loading). For XML Schema, one could apply
Kolmogorov complexity (the shortest effective program).
The useful complexity would be a point-of-view measure
that used all of the dimensions of the various measures
to determine complexity for an individual observer
at some scale. Measures should be scale invariant but
I never figured that bit out. That could be entropy or
old age and the need to keep my reserves up for treachery.
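
A minimal sketch of that Kolmogorov idea in Python: K(x) itself is
uncomputable, so compressed size is a common practical proxy (the
"compressibility" gloss quoted below). The schema fragment here is
made up; in practice you would read a real .xsd.

import gzip

def compressed_size(text):
    """Bytes of the gzip-compressed UTF-8 text, a crude stand-in for K(x)."""
    return len(gzip.compress(text.encode("utf-8")))

schema = """<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="order" type="xs:string"/>
  <xs:element name="item" type="xs:string"/>
</xs:schema>"""

raw = len(schema.encode("utf-8"))
packed = compressed_size(schema)
# A lower ratio suggests more internal regularity (simpler in this sense);
# a ratio near 1.0 suggests little exploitable structure.
print(f"{packed}/{raw} bytes, ratio {packed / raw:.2f}")
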
Come to think of it, XForms is doomed: too complex, and we
already have HTML forms, so why bother. ;-)
len
-----Original Message-----
From: Eric van der Vlist [mailto:vdv@dyomedea.com]
Bullard, Claude L (Len) wrote:
> It depends on the complexity measure suited to the application domain.
Yes, that's why I wanted to know which algorithm Bjorn used for the
W3C specifications.
Length is obviously not a good indicator: if more explanations are
added, the specification should become easier to understand, not count
as more complex (though that depends on how well the explanations are
written).
Eric
>
> Approximate entropy is one measure.
>
> "Approximate entropy is a statistical measure to quantify the regularity
in
> relatively short noisy time series. It is defined as the rate of entropy
for
> an approximating Markov chain to the process. Useful in deistinguishing
> between correlated stochastic processes and composite
> deterministic/stochastic models."
> http://www.cpm.mmu.ac.uk/~bruce/combib/compref260.html
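
A minimal sketch of the standard ApEn recipe (in the sense of Pincus)
in Python; the series and the parameters m and r below are made-up
illustrative choices, not values from any spec:

import math

def apen(u, m, r):
    """Approximate entropy of series u with window m and tolerance r."""
    n = len(u)

    def phi(m):
        # All length-m windows of u.
        x = [u[i:i + m] for i in range(n - m + 1)]
        total = 0.0
        for xi in x:
            # Count windows within Chebyshev distance r (self included).
            c = sum(1 for xj in x
                    if max(abs(a - b) for a, b in zip(xi, xj)) <= r)
            total += math.log(c / len(x))
        return total / len(x)

    return phi(m) - phi(m + 1)

# A perfectly regular series scores near zero; noise scores higher.
print(apen([1, 2] * 16, m=2, r=0.5))
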
>
> Rounds complexity is another.
>
> "rounds complexity, are defined and then illustrated by designing and
> analyzing two algorithms: a parallel summation algorithm which proceeds
> along an implicit complete binary tree and a recursive doubling algorithm
> which proceeds along a linked list. In both cases replacing global
> synchronization with local synchronization yields algorithms with reduced
> complexity."
> http://csdocs.cs.nyu.edu/Dienst/UI/2.0/Describe/ncstrl.nyu_cs%2FTR1991-539
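
A toy serial simulation of the first algorithm quoted above, just to
make the rounds count concrete: summing n values pairwise up an
implicit binary tree takes ceil(log2 n) globally synchronized rounds,
and that rounds count is the complexity measure.

def tree_sum(values):
    """Sum values as a synchronous parallel tree; return (sum, rounds)."""
    vals = list(values)
    rounds = 0
    while len(vals) > 1:
        # One globally synchronized round: adjacent pairs combine "in parallel".
        vals = [vals[i] + vals[i + 1] if i + 1 < len(vals) else vals[i]
                for i in range(0, len(vals), 2)]
        rounds += 1
    return vals[0], rounds

print(tree_sum(range(1, 17)))  # (136, 4): 16 leaves, ceil(log2 16) = 4 rounds
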
>
> Kolmogorov complexity is another: "the length of the shortest effective
> description of an individual object", a.k.a. compressibility.
> http://www.cwi.nl/~paulv/kolmogorov.html
>
> Psychological complexity: "psychological complexity measure developed at
> Clemson University, called the Permitted Interactions (PI) measure, uses
> design information to calculate the psychological complexity as a measure
> of effort. However there is a general demand for measures that can use
> information present at earlier phases. Following this requirement the
> measure reported here estimates complexity at the domain analysis phase
> which is the earliest development phase in an object-oriented software
> process. Psychological complexity relates to the cognitive load imposed on
> the developers of the software system which is in turn directly related to
> the time to completion of the development process."
> http://citeseer.nj.nec.com/167340.html
>
> Structural measures of disorder using graphs (connectance ratio)
> http://crl.nmsu.edu/users/madavis/Site/Present/tsld001.htm
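
A minimal sketch of one such graph measure, a connectance ratio
(actual edges over possible edges); the dependency graph below, types
in a schema referencing each other, is invented for illustration:

def connectance(nodes, edges):
    """Actual edges over possible directed edges (no self-loops)."""
    possible = len(nodes) * (len(nodes) - 1)
    return len(edges) / possible

nodes = ["PurchaseOrder", "Address", "Item", "Money"]
edges = [("PurchaseOrder", "Address"),
         ("PurchaseOrder", "Item"),
         ("Item", "Money")]
print(connectance(nodes, edges))  # 3/12 = 0.25
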
>
> and so on.
>
> len
>
>
> -----Original Message-----
> From: Eric van der Vlist [mailto:vdv@dyomedea.com]
>
> How do you measure the complexity of a specification?
>
> Eric (just being curious)
>
>
>
--
See you in Paris for a guided tour of the XML nebula.
http://dyomedea.com/formation/
------------------------------------------------------------------------
Eric van der Vlist http://xmlfr.org http://dyomedea.com
http://xsltunit.org http://4xt.org http://examplotron.org
------------------------------------------------------------------------