Re: Traffic Analysis and Namespace Dereferencing
- From: firstname.lastname@example.org (Henry S. Thompson)
- To: Norman Walsh <email@example.com>
- Date: Tue, 09 Jan 2001 10:16:30 +0000
Norman Walsh <firstname.lastname@example.org> writes:
> / Miles Sabin <MSabin@interx.com> was heard to say:
> | There are already many production-side systems which validate,
> | and I'm sure there'll be many more in the future. Where the input
> | docs can't be assumed up front to be valid and where DTDs/
> | schemas are cached locally, this doesn't seem like such a crime.
> The local caching seems to be part of the problem. I make extensive
> use of OASIS Catalogs on my own system to map public identifiers (and a
> few system identifiers) to local copies.
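[For readers who haven't seen one: entries in a TR9401-style OASIS catalog map identifiers to local files, roughly like this. The DocBook identifier is real; the local paths are illustrative.]

```
-- Map a public identifier to a local copy of the DTD --
PUBLIC "-//OASIS//DTD DocBook XML V4.1.2//EN" "docbook/docbookx.dtd"

-- System identifiers can be remapped too --
SYSTEM "http://www.oasis-open.org/docbook/xml/4.1.2/docbookx.dtd"
       "docbook/docbookx.dtd"
```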
> The OASIS Entity Resolution Technical Committee is currently working
> on a revision to OASIS Catalogs.
> I'd like to see other systems (like XSLT processors and Schema
> processors) extended so that they passed all their URIs through a
> resolver as well.
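[As a sketch of what "passing URIs through a resolver" looks like at the API level, here is a minimal catalog-style entity resolver using Python's xml.sax. The public identifier, local path, and in-memory catalog dictionary are all made up for illustration; a real setup would read the mapping from a catalog file and return the actual local DTD rather than an empty stream.]

```python
import io
import xml.sax

# Hypothetical catalog: public identifier -> local copy.  In practice
# this mapping would be loaded from an OASIS catalog file.
CATALOG = {
    "-//Example//DTD Greeting V1.0//EN": "/local/dtds/greeting.dtd",
}

class CatalogResolver(xml.sax.handler.EntityResolver):
    """Redirect external entity fetches to local copies when we have one."""

    def __init__(self):
        self.resolved = []  # (public id, resolved location) pairs

    def resolveEntity(self, publicId, systemId):
        local = CATALOG.get(publicId, systemId)
        self.resolved.append((publicId, local))
        # Hand the parser an InputSource of our own instead of letting
        # it dereference the remote system identifier.  An empty byte
        # stream stands in for the local DTD in this sketch.
        source = xml.sax.xmlreader.InputSource(local)
        source.setByteStream(io.BytesIO(b""))
        return source

DOC = b"""<?xml version="1.0"?>
<!DOCTYPE greeting PUBLIC "-//Example//DTD Greeting V1.0//EN"
  "http://example.org/dtds/greeting.dtd">
<greeting>hello</greeting>"""

parser = xml.sax.make_parser()
resolver = CatalogResolver()
parser.setEntityResolver(resolver)
# Ask the parser to process external entities so the resolver is consulted.
parser.setFeature(xml.sax.handler.feature_external_ges, True)
parser.parse(io.BytesIO(DOC))
```

The point of the exercise: the parser never touches example.org, because the resolver intercepts the lookup first.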
> The other alternative is to design some sort of caching system (like
> wwwoffle) to do the caching work for you. But this generally seems to
> require vastly more technical skill on the part of the person setting
> up the system. I know lots of people who could be trained to edit a
> catalog file that would find setting up a proxy server way too
Norm and I have a long-standing disagreement about catalogues. I much
prefer the local caching proxy approach, as it works for any software
which accesses any aspect of the web through the standard libraries.
I'm about to try installing wwwoffle under Win2k, and will report back.
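[The appeal of the proxy approach is that nothing XML-specific is needed: any tool that fetches over HTTP through the standard machinery picks up the cache. A rough sketch in Python; the address is an assumption about the local setup (wwwoffle's usual default port is 8080).]

```python
import urllib.request

# A local caching proxy such as wwwoffle typically listens on
# localhost:8080; the address here is an assumption about the setup.
proxy = urllib.request.ProxyHandler({"http": "http://localhost:8080/"})
opener = urllib.request.build_opener(proxy)

# Once installed, every urlopen() call -- and so every DTD or schema
# fetch made through it -- goes via the cache, with no per-tool
# catalog configuration.
urllib.request.install_opener(opener)
```

[Equivalently, setting the `http_proxy` environment variable achieves the same effect for most HTTP clients without touching any code.]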
Henry S. Thompson, HCRC Language Technology Group, University of Edinburgh
W3C Fellow 1999--2001, part-time member of W3C Team
2 Buccleuch Place, Edinburgh EH8 9LW, SCOTLAND -- (44) 131 650-4440
Fax: (44) 131 650-4587, e-mail: email@example.com