That is my take too. Just as with 'wisdom of crowds' one asks,
'crowds of what?', the principle isn't applicable without
knowing the task, because it is 'least power adequate to the
job at hand'. As Jeliffe said, otherwise one is a troll.
AJAX is probably applicable to lots of jobs only because
there are objects around to do the heavy lifting. The
fact of xmlhttp and loading up a bigChunk of XML, well,
that is how markup was supposed to work from before the
web, and it was talked about while everyone else was arguing
for 'thin clients', 'HTML is holy' and 'separate content
from presentation'. Load balancing IS holy. The client
gets thicker by task, the separation of content from
presentation IS about reuse, and so on. It isn't that the
ideas aren't right; they are only right sometimes, and
even then they may start becoming wrong if practice and
application take another direction. It's like driving
in the left lane in right-lane countries.
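(For concreteness, here is a minimal sketch of that xmlhttp pattern:
fetch a chunk of XML and hand it to client-side code for the heavy
lifting. The URL and the handler are hypothetical placeholders, not
anything from this thread.)

    // Minimal sketch of the "xmlhttp loads a big chunk of XML" pattern.
    // The endpoint URL and the processing step are placeholder assumptions.
    function loadChunk(url: string, onLoaded: (doc: Document) => void): void {
      const xhr = new XMLHttpRequest();
      xhr.open("GET", url, true);          // asynchronous request
      xhr.onreadystatechange = () => {
        if (xhr.readyState === 4 && xhr.status === 200 && xhr.responseXML) {
          onLoaded(xhr.responseXML);       // hand the parsed XML to the client code
        }
      };
      xhr.send();
    }

    // Usage: the client does the heavy lifting on the returned markup.
    loadChunk("/data/bigChunk.xml", (doc) => {
      const items = doc.getElementsByTagName("item");
      console.log(`loaded ${items.length} items`);
    });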
So when I see these principles, I have some scepticism
until 'crowds of junkyard dog experts' chew on them. If
they stand up to that (for example, "Dare to do less" was
debated but it stands up to abuse and is a sound principle),
they are probably good to go.
len
From: Peter Hunsberger [mailto:peter.hunsberger@gmail.com]
On 2/16/06, Bullard, Claude L (Len) <len.bullard@intergraph.com> wrote:
> I hate to see web architectural principles in the same
> light as pop psychology. So if there really is a
> deeper and clarifying principle here, one wants to be
> able to express it in simple terms that the marketing
> department can't screw up.
I don't think there is any deep clarifying principle here. Even if
there were, it's not one that couldn't be screwed up...
I recall the first time I encountered the Mandelbrot set: the
algorithm looked pretty simple, so I coded it up in a high-level
language I was using at the time. It had good floating-point
libraries and I figured things would work fine. The resulting program
was probably about 200 lines of code and took something like 30 minutes
to produce a very low-resolution plot. So next I turned to C. Now I got
it down to maybe 100 lines of code and got a better-resolution graph
in a couple of minutes, but still nothing like the images I
wanted. Finally, I turned to 370 Assembler. I had direct access to
the floating-point registers, so I could pull a couple of numerical
manipulation tricks, and I finally got something that ran in seconds
and produced the results I wanted with probably about 40 lines of
code. (All of these essentially fed the same graphics library.)
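(For reference, the core iteration behind all three versions is roughly
the following. This is a minimal text-plot sketch, not the original
program; the grid, escape radius of 2, and the 1000-iteration cap are
the usual conventional choices, not anything from the post.)

    // Minimal Mandelbrot sketch: for each point c in a grid, iterate
    // z = z^2 + c and count the steps until |z| exceeds 2 (the escape radius).
    const maxIter = 1000;

    function escapeCount(cr: number, ci: number): number {
      let zr = 0, zi = 0;
      for (let n = 0; n < maxIter; n++) {
        const zrNext = zr * zr - zi * zi + cr;  // real part of z^2 + c
        const ziNext = 2 * zr * zi + ci;        // imaginary part of z^2 + c
        zr = zrNext;
        zi = ziNext;
        if (zr * zr + zi * zi > 4) return n;    // escaped: outside radius 2
      }
      return maxIter;                           // treated as inside the set
    }

    // Crude text plot over the classic window [-2, 1] x [-1.25, 1.25].
    for (let row = 0; row < 40; row++) {
      let line = "";
      for (let col = 0; col < 100; col++) {
        const cr = -2 + (col / 100) * 3;
        const ci = -1.25 + (row / 40) * 2.5;
        line += escapeCount(cr, ci) === maxIter ? "*" : " ";
      }
      console.log(line);
    }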
Could I base any development principles on this? Absolutely not; the
result was completely specific to the problem at hand, and I think it
always will be. It seems to me that finding the "least powerful" way
to implement an algorithm, system or whatever requires as much
analysis, modelling and experimentation as any other approach to
matching requirements to implementation, if not more, and it is not
something that can be generalized or encapsulated in a couple of pithy
sound bites' worth of "wisdom".