> Google maintains a cache (in fact, quite a lot of them) that contains
> snapshots of the whole Web, thus making hash of the notion that
> copyright owners get to control the making of copies.
Not so: Google honors the robots.txt file, so if your site's robots.txt lists
the places you don't want crawled, Google will stay away from them.
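As a rough sketch (the paths here are just placeholders), a robots.txt at the
site root might look like:

    User-agent: Googlebot
    Disallow: /private/
    Disallow: /drafts/

which tells Google's crawler to skip those directories entirely.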
They also recently stopped serving their cached copies of old NYTimes
articles because the NYT asked them not to.
/r$
--
Rich Salz Chief Security Architect
DataPower Technology http://www.datapower.com
XS40 XML Security Gateway http://www.datapower.com/products/xs40.html
XML Security Overview http://www.datapower.com/xmldev/xmlsecurity.html