That would be a logical positivist approach.
Another would be a hybrid approach. The abstract
qualities which some ontology declares as present in any
web service, and which are neither true nor false because
they are not measurable (and, in accordance with the logical
positivist point of view, therefore meaningless), might be
assigned values. Either an expert appraiser or the web
service owner assigns them. If one needs metaphysical
assurance, one can have it.
The measurable qualities must be observable, testable,
and provable. Measurement can be done by any independent
agent with access to the context of measurement, and that
agent could itself be a web service, e.g., a sensor. A UDDI
statement could attest that the web service provider is a
subscriber to a testing organization and that the values
provided in the UDDI statement come from that source. The
measure of the measurer is the reputation of that source of
measurement. That reputation might itself be subject to
measurement, and we can recurse into absurdity, or use a
metric such as how many subscribe to that measuring service
(much like a page rank), though that is actually a
metaphysical quality (reputation). So not exactly
"meaningless" given some means to measure, but:
1. A WS QoS framework is created: the set of means
of measurement and a set of metrics applicable to any WS.
Some will be abstract, but any abstract metric must
explicitly declare its means and source.
2. Each WS, by industry type, provides additional metrics
and means pertinent to that industry.
3. A separate QoS framework compares sources to determine
deviations from the mean; sources should be essentially in
agreement.
4. Experience is used to tune all of the above.
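The steps above can be sketched in code. This is a minimal, hypothetical
illustration only: the MetricReport type, the metric names, the 1.5-sigma
deviation threshold, and the subscriber-count reputation proxy are all
invented for this sketch and are not part of UDDI or any WS-QoS
specification.

```python
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class MetricReport:
    service: str        # the web service being measured
    source: str         # who measured it: testing org, sensor, or owner
    metric: str         # e.g. "latency_ms" (hypothetical name)
    value: float
    abstract: bool = False  # step 1: abstract metrics must still declare a source

def outlier_sources(reports, threshold=1.5):
    """Step 3: compare sources reporting the same metric and flag any
    whose value deviates from the mean by more than `threshold`
    standard deviations."""
    values = [r.value for r in reports]
    if len(values) < 2:
        return []
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [r.source for r in reports if abs(r.value - mu) / sigma > threshold]

def reputation_ranking(subscriber_counts):
    """Page-rank-like proxy: rank measuring services by how many
    subscribe to them, most-subscribed first."""
    return sorted(subscriber_counts, key=subscriber_counts.get, reverse=True)
```

For example, if four testing sources report latencies near 100 ms and a
fifth reports 250 ms, `outlier_sources` flags only the fifth; the
reputation of each source can then feed back into how much weight its
reports get (step 4).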
Feedback: the maker and destroyer of reputations and
enterprises alike. (musavada veramani)
From: Jason Kohls [mailto:firstname.lastname@example.org]
I totally agree.
I can see "Web Services Brokers" cropping up, each focusing on a
particular vertical, etc. Of course, some sort of "industry standard"
metrics/SLAs (like those Application Service Providers have been moving
toward), against which each Web service provider can be benchmarked,
will have to be solidified, with vendor backing.
I doubt this will ever be perfected programmatically, however.
Reviewing/rating/measuring Web services will be a time-consuming,
experience-based process, one that naturally lends itself to
expert appraisal, much like appraising antique furniture or real estate.