On the question of sources, you are right. I am aware of the disparity
between US and European or Asia-Pacific coverage. That is a
benefit of the web: easy access to multiple sources. What isn't
provided is independent validation beyond one's own good judgment.
Consider the technology. In the thread that kicked this off, the
cited source was attempting to cast the technology of aggregation
as a source of fabrication. In essence, aggregators can be fooled.
Does anyone here dispute that? (No, I didn't ask whether there are
conspiracies afoot, black helicopters, etc., just whether the
aggregators can be fooled.)
Of course they can. NSS. The general rule is: don't put anything
on the web you don't want to see on the front page of the New York Times,
and don't believe anything you read on the web until you have had
it validated by multiple independent sources using multiple
independent means. Possibly true of the Times too, given a bad
day in the editor's chair, but less likely.
At this time in the US, many states are considering legislation
for the dataMegaMarts that use aggregators. We know by observation
that such legislation is often flawed because it is passed without
expert understanding of the technology. While we may dispute other
aspects of a given topic, it is possible to provide expertise about
the technology as it exists, as it is used, and as it is deployed.
Otherwise, policy prevails. There are policies, such as 28 CFR Part 23
for systems used to gather information on individuals in the course
of a criminal investigation, that govern the accuracy of such information
and the conditions under which it can be gathered. Systems designed and
provided to agencies charged with these activities must be certified
as compliant with this CFR.
What isn't determined is whether information purchased by agencies from
the dataMegaMarts is subject to the CFR. Likely not. Information
from these systems cannot be mixed into the CFR-compliant systems.
The fair assessment is that such CFR-compliant systems must be the
gateways or clients of the non-compliant systems. How to do this
is the issue at hand.
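To make the gateway point concrete, here is a rough sketch of what I have
in mind. The names and structure are mine alone and purely hypothetical,
not any agency's actual system: the compliant side queries the data mart
on demand, labels everything that comes back as coming from a non-compliant
source, and never writes it into the compliant store.

# A toy sketch of the "compliant gateway to a non-compliant mart" idea.
# All names here are hypothetical; this is not any real agency system.

from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Record:
    subject: str
    source: str              # where the data came from
    cfr_compliant: bool      # gathered under 28 CFR Part 23 rules?
    data: Dict[str, str] = field(default_factory=dict)


class DataMegaMart:
    """Stands in for a commercial aggregator (a non-compliant source)."""

    def query(self, subject: str) -> List[Record]:
        # Purchased data: provenance and accuracy are not certified.
        return [Record(subject, "dataMegaMart", False, {"address": "unverified"})]


class CompliantGateway:
    """A CFR-compliant system acting as a client of the non-compliant mart.

    Purchased records are returned to the analyst clearly labeled,
    but they are never written into the compliant store.
    """

    def __init__(self, mart: DataMegaMart):
        self.mart = mart
        self.compliant_store: List[Record] = []  # only CFR-gathered records

    def add_compliant(self, record: Record) -> None:
        if not record.cfr_compliant:
            raise ValueError("refusing to mix non-compliant data into the store")
        self.compliant_store.append(record)

    def lookup(self, subject: str) -> List[Record]:
        internal = [r for r in self.compliant_store if r.subject == subject]
        purchased = self.mart.query(subject)  # fetched on demand, never stored
        return internal + purchased


if __name__ == "__main__":
    gw = CompliantGateway(DataMegaMart())
    gw.add_compliant(Record("J. Doe", "field interview", True, {"note": "verified"}))
    for rec in gw.lookup("J. Doe"):
        flag = "compliant" if rec.cfr_compliant else "NON-COMPLIANT SOURCE"
        print(rec.source, flag, rec.data)

The design choice is simply that the boundary is enforced in one place:
purchased records can be viewed alongside compliant records but are never
persisted with them.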
len
From: Ronald Bourret [mailto:rpbourret@rpbourret.com]
Sent: Tuesday, March 15, 2005 3:01 AM
1) If there are 8 billion Web pages, there is no way anyone can rate
even a meaningful subset of them by hand. Google's voting system, while
imperfect, strikes me as a pretty good way to do this automatically (a
toy sketch of the voting idea follows at the end of this note).
2) The problem with any system of "experts" is deciding who the experts
are. Mainstream journalism may at least be fact-checked, but it's not
clear the world will ever agree that those writing about a particular
topic are indeed "experts" to anyone besides their peers. If you don't
believe this, read US and European articles about the same topic and see
if you think they're even covering the same event.
3) The Web is like a great, big bar with a zillion drunken
conversations. You'll meet some interesting people, discover some
dubious facts, and have a good time, but anyone who trusts it implicitly
is asking for trouble, and all the ratings in the world are never going
to convince the black helicopter crowd [1] that the UN isn't really
invading the US.
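P.S. Since "voting system" can sound mysterious, here is a toy sketch of
link-based voting, in the spirit of point 1. It is a simplified
PageRank-style tally; the link graph and damping value are invented for
illustration, and this is nothing like Google's production system.

# Toy link-based "voting": a simplified PageRank-style power iteration.
# The link graph and damping factor below are made up for illustration.

def rank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    scores = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:  # dangling page: share its vote evenly
                for p in pages:
                    new[p] += damping * scores[page] / n
            else:
                for target in outgoing:
                    new[target] += damping * scores[page] / len(outgoing)
        scores = new
    return scores

if __name__ == "__main__":
    toy_web = {
        "A": ["B", "C"],
        "B": ["C"],
        "C": ["A"],
        "D": ["C"],   # D votes for C, but nobody votes for D
    }
    for page, score in sorted(rank(toy_web).items(), key=lambda kv: -kv[1]):
        print(page, round(score, 3))

The point of the sketch: a page's score is a weighted sum of the votes of
the pages linking to it, which is also why it can be gamed by anyone
willing to manufacture enough links.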