A thought experiment on directing the evolution of web systems:
http://people.clarkson.edu/~bolltem/Papers/ChapterChaosandBifurcationContr.pdf
If we are specifying systems, we are targeting markets. Given that the
web is a chaotic system, we rely on local continuities (predictable
behavior over short time cycles) to transport the ideas we want to
develop. This is the low-energy approach.
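To make "local continuity" concrete, here is a minimal sketch in
Python, using the logistic map as a stand-in for any chaotic system
(the map is my choice of illustration, not an example drawn from the
paper). Two trajectories that start almost together stay nearly
indistinguishable for a handful of iterations before diverging; that
short window is the predictable behavior the low-energy approach
rides on.

    # Short-horizon predictability in a chaotic system.
    # The logistic map at r = 4.0 is fully chaotic, yet two
    # trajectories started a hair apart track each other closely
    # for the first several steps.

    def logistic(x, r=4.0):
        return r * x * (1.0 - x)

    x, y = 0.2, 0.2 + 1e-9   # nearly identical starting points
    for step in range(25):
        x, y = logistic(x), logistic(y)
        print(f"step {step:2d}  separation {abs(x - y):.2e}")

    # The separation stays tiny for roughly ten steps (the local
    # continuity), then grows exponentially until the trajectories
    # are uncorrelated.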
If we are to do this without creating environmental monsters, shouldn't
we "trade time for energy". The critical problem is knowing when to
apply feedback. How can the low-energy approach be applied? Isn't it
better to build and prove a prototype away from the community and get
feedback at exactly the right time? First mover advantage is an
advantage only to the first mover. Starting a project by seeking the
largest buy-in first (e.g., creating lots of liaison relationships
before there is a single line of running code) is possibly the worst way
to evolve the environment/market into which one is fielding it. Open
source and open standards might actually be harmful if they open up too
fast and too large. That may be why Adobe is so successful and why
standards such as X3D are so tough to kill (small numbers of developers
over longer time cycles with a selected number of iterations). That is
what makes them tough.
"...considering the exponential growth rate of such errors from an
optimistic standpoint, a vanishingly small energy input has the
potential to yield a wide range of outcomes. The problem of programming
when and how much those perturbations should be applied is the targeting
problem."
len