Any technology or information system that must encapsulate multiple points of view becomes complex. One should look at the level of specification over the level of application. If these are not clear or clearly separated, complexity is unavoidable. One must come to understand Boltzmann entropy.
I can simplify your travel vehicle to two wheels, pedals, a brake and handlebars. Do you really need gears? Do you really need an engine? Yes, iteration over a design simplifies it, but only for a specified value of 'done' or a specified task.
There are tasks for which SGML is a better choice if the bare XML language is all one has to work with. If one has XML plus the myriad technologies developed to support it, the choice is different.
Also, is the complexity of the metalanguage vs. the complexity of an instance of that language the same or different? One can take two instances of markup, applying first SGML and then XML. Given the inherent capabilities of SGML, the instance string can be much shorter. The cost is complexity in parsing.
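For example, here is the same list in both notations (a hypothetical fragment; the SGML version assumes a DTD declared with OMITTAG YES and an omissible end tag for item, so the parser can infer the missing end tags):

    SGML: <list><item>one<item>two<item>three</list>
    XML:  <list><item>one</item><item>two</item><item>three</item></list>

The SGML string is shorter, but the parser must consult the DTD to decide where each item ends; an XML parser never needs to.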
What XML provides is a sweet spot in the tradeoffs among the various components of an overall system of systems. A FoxPro program for working with a database is waaaay simpler than the same program written in Visual Basic, but the cost is in the generality of the programming language.
len
Hi Folks,
I observed a few days ago that XML is able to achieve virtually endless complexity through the use of a couple of simple building blocks and a couple of simple assembly mechanisms. (I am continuing with the Lego analogy.)
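(One way to make that concrete: the building blocks are elements and attributes, and the assembly mechanisms are nesting and sequencing. A made-up fragment:

    <trip date="2002-06-01">
      <leg from="BOS" to="ORD"/>
      <leg from="ORD" to="SFO"/>
    </trip>

Nothing here but those two blocks and those two mechanisms, yet arbitrarily deep and wide structures can be built the same way.)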
We all know that XML has its ancestry in SGML and that XML is a simplification of SGML. So, to achieve forward progress, a complicated technology was simplified (progress via simplification).
There are other technologies which achieve great complexity with simple building blocks and simple rules. For example, Cellular Automata. I don't have enough experience to state it for certain, but people tell me that the programming languages Lisp and Forth have simple building blocks and simple rules, and are able to achieve tremendous complexity.

It is asserted that all the amazing complexity seen in nature is achievable by using simple components and simple rules [1].
It occurs to me that in the development of a technology there are 2 approaches:

Approach 1 - Progress via Simplification
With this approach the attitude is "what is the simplest collection of components needed to achieve all the complexity required?". Interestingly, this approach strives for greater complexity by removing complexity.
I think that typically the right collection of components is not found on the first attempt. Typically, the first attempt produces a collection of components that is too complicated. So, successive versions of the technology result in simpler components. But these simple components can be assembled to produce results that are as complex as (or more so than) the earlier components.
As noted above, XML is an example of a technology that made forward progress by simplification (of SGML).
For the past 6 months I have been putting together a demo. The first version of my demo was horribly complicated. Then I realized how to simplify it. The second version was much simpler (and more powerful) than the first version. But even the second version was too complicated. After some time I realized that it could be simplified still further. I went through 6 versions, each version getting simpler and more powerful. My current version is astonishingly simple. This experience humbled me (it's humbling to scrap all my hard work and complex code in favor of something that is simple; somehow complex code seems more "manly"), and it opened my eyes to the value of progress via simplification.
Approach 2 - Progress via Complexification
With this second approach the attitude is "the existing functionality does not give users all the desired complexity, so let's add more functionality". Thus, greater complexity is achieved by adding more complexity.
As I look at the next version of some of the XML technologies, it appears that this second approach is being taken. For example, with XSLT 2.0 and XPath 2.0 you are able to accomplish what was extremely difficult (or impossible) in 1.0. However, this enhanced complexity is achieved by adding more complexity to the language. I believe that XML Schemas 2.0 is going along the same path - more complexity by adding more complexity. I am not trying to "knock" any of these technologies. In fact, as a technology geek, I like the cool stuff that has been put into the 2.0 versions of XSLT and XPath.
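(To make the 1.0 vs. 2.0 contrast concrete: grouping in XSLT 1.0 required the rather arcane Muenchian-keys technique, while XSLT 2.0 reduces it to a single instruction. A minimal sketch, assuming an input document whose city elements carry a country attribute:

    <xsl:for-each-group select="cities/city" group-by="@country">
      <country name="{current-grouping-key()}">
        <xsl:copy-of select="current-group()"/>
      </country>
    </xsl:for-each-group>

The gain in power is real, but it comes from a bigger language: new instructions, new functions, and a richer data model underneath.)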
But I keep thinking about the lessons I learned from my demo, and keep wondering if the 2.0 versions of these XML technologies could have achieved the additional complexity by recognizing "the collection of components in 1.0 is wrong; it does not provide the desired complexity; let's scrap those components and find the right collection that's simple yet powerful".
Perhaps for some things progress must come about by adding more complexity. I don't know. What do you think?
/Roger
[1] A New Kind of Science by Stephen Wolfram.