From: sterling [mailto:sstouden@thelinks.com]
>... the preconditioned environment sets the stage for acceptance.
Yes. Low-energy transport marketing models indicate that one does this with minimal means. For example, when selling a political candidate to a local electorate, yard signs work better than mass media. Why? Frequency combined with the exciter of location (opinion leaders). Door-to-door visits by the candidate work best. The exception is a highly loaded environment where one or two issues dominate the discourse: like a heavily loaded pachinko shot, a single debate can turn the outcome of the game. Then the real problems begin: can the *elected* do the work?
>... attempt to explain the technology even before the products under development become available for sale.
The parallel track is research. The Internet is a result of DARPA steering. This is the 'breed more cultivars than are needed, based on the elite cultivar' model: winnow only as the losers become obvious, and to make that sustainable, keep the cultivation costs low.
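To make the winnowing loop concrete, here is a minimal sketch in Python. It assumes nothing but a cheap, noisy per-round fitness probe; the probe, the cut rate, and the names are my inventions for illustration, not anything from the DARPA history.

    import random

    def winnow(candidates, probe, rounds=5, cut=0.25):
        """Keep many cultivars alive; drop only the clear losers each round."""
        scores = {c: 0.0 for c in candidates}
        for _ in range(rounds):
            for c in scores:
                scores[c] += probe(c)               # cheap, noisy fitness signal
            ranked = sorted(scores, key=scores.get)
            for c in ranked[:int(len(ranked) * cut)]:
                del scores[c]                       # winnow only as it becomes obvious
        return sorted(scores, key=scores.get, reverse=True)

    # Seed widely, probe cheaply, keep cultivation costs low.
    survivors = winnow([f"cultivar-{i}" for i in range(20)],
                       probe=lambda c: random.random())

The point of the low cut rate is the same as in the prose: the cost of keeping a candidate alive stays low, so nothing is killed before the evidence accumulates.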
>Many times the actual acceptance requirements have little to do with the real needs of the buyer, and as a result the best
>product for the buyer is eliminated from competition at selection time. This is generally true because the buyer has no
>relevant experience whatsoever in the field at the time that they buy.
That can happen (naïve buyer), but generally there is a support system for making nested or related selections, and the buyer gets at least 80% (a vapor statistic) of what they need. Some features turn out to be bad fits; these become irritants, and the irritation becomes a feedback site, first to the local system and then to the global market.
>The best technology [for whom?] is a moving target. It is subject to insertion within a particular set of buyer needs,
>but it does not meet the general needs of all buyers.
Yes. It might be better, though, to think about a situation semantic as requirements in motion: a functional tensor that gains or loses functions based on the proximity of other requesting objects (strong and weak relationships), or, in de Moor's writings on Pragmatics, the norms of the environment and the affordances of any semiote within it. This is exactly the model one adopts for online real-time 3D worlds. The proximate (strong) relationships stabilize the evolution of the situation semantic. The weak relationships (signals) are the harbingers of change, so a director observes outliers. Think intensity (force vector), frequency (rate of observation of an instance), clustering (the type map), and speed of intersection. It is a multidimensional resonant manifold that you *learn* to plot a course across, factoring energy costs over time. This is the model of directed evolution.
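Since the four axes are named, here is a toy sketch of what a director's outlier watch might compute. The scoring rule (a weak signal is loosely clustered but arriving hard and fast) is my reading of the paragraph above, an assumption rather than a formal model.

    from dataclasses import dataclass

    @dataclass
    class Signal:
        intensity: float    # force vector magnitude
        frequency: float    # rate of observation of an instance
        clustering: float   # how tightly it sits in the type map, 0..1
        speed: float        # speed of intersection with current requirements

    def resonance(s: Signal) -> float:
        # Weak relationships: loosely clustered, but intense, frequent, and fast.
        return s.intensity * s.frequency * s.speed * (1.0 - s.clustering)

    def outliers(signals, top=3):
        # The director watches the highest-resonance weak signals.
        return sorted(signals, key=resonance, reverse=True)[:top]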
>"expected results from particular application types", capacity of the buyer to learn within the time needed for the
>application, the benefits of the application is the product is applied to it, and things like that.
Yes.
>Take a look at Netscape 1.0; it replaced the government's browser in just a few months.
It depends on which browser you are talking about. IADS, for example, is still heavily used. I think the original Mosaic browser was done away with quite quickly, although it lives on in the guts of IE and probably Netscape. The users there did some pretty dumb things, and they had to shut down sites for a while to clean up. As predicted, the web has morphed from 'the wild west' into a utility that is going to be heavily regulated. The pundits who were so popular some years ago were very wrong about that.
>At the time about 3 million users were online because it was learnable, free, distributed without cost by download, and
>more user friendly, and because three million users interacted to teach each other how to use it.
That was true, at least in the markets I was observing and involved in. I had to give a speech at Lockheed Martin in Orlando to the top brass to tell them that some of their product lines, even if more advanced, were dead in the market because of the learning curve and costs. Again, the elite cultivar approach means taking plants or animals that thrive in the wild and breeding them with a model that represents 'the best' to obtain survival genes or new qualities.
Keep that Long Tail in mind and think of it as a slope some are climbing and others are sliding down; then extend that into a 4- or 5-dimensional plot of use, competence, cost, availability, criticality, etc. There are still hypermedia applications for which an Internet browser is a bad platform, but increasingly fewer as the formerly thin clients become rich clients. There are still applications where the web server is isolated from the rest of the LAN.
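As a toy version of that 4- or 5-dimensional plot: score each platform as a vector over the named axes and read the sign of a crude slope as climbing or sliding. The axes come from the paragraph above; the weighting and the sample numbers are invented for illustration.

    def slope(p):
        # Positive: climbing the Long Tail slope; negative: sliding down it.
        return (p["use"] + p["competence"] + p["availability"]
                + p["criticality"] - p["cost"])

    platforms = {
        "browser":      dict(use=0.9, competence=0.8, cost=0.2,
                             availability=0.95, criticality=0.6),
        "thick client": dict(use=0.3, competence=0.9, cost=0.7,
                             availability=0.40, criticality=0.8),
    }
    for name in sorted(platforms, key=lambda n: slope(platforms[n]), reverse=True):
        print(name, round(slope(platforms[name]), 2))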
>The government browser preconditioned the audience to use a browser and clearly demonstrated the limitations of text-only
>technology.
Ummm... We've been stuffing images into browsers since way before Netscape or Mozilla. The "Web" in general is fairly ignorant about hypermedia history. They are sort of the Scientologists of Hypertext. Otherwise, yes. Preconditioning is part of fixing the game if one thinks it is a contest, but in an ecological model, it is soil preparation.
>I wonder, if Netscape had been the first browser, whether it would have succeeded. The reason is that on the old 300-baud
>dial-up networks it slowed things down quite a bit.
Probably not, but that isn't the point. Success is relative to the goal of the aggregate market. Andreessen believed his own publicity. That is deadly. The majority of people who actually do shape the future are largely invisible to the public, and that is the way information ecosystems work; it is not a conspiracy.
len