RE: Traffic Analysis and Namespace Dereferencing
- From: Miles Sabin <MSabin@interx.com>
- To: Norman Walsh <firstname.lastname@example.org>, email@example.com
- Date: Mon, 08 Jan 2001 23:56:00 +0000
Norman Walsh wrote,
> / Miles Sabin <MSabin@interx.com> was heard to say:
> | There are already many production-side systems which validate,
> | and I'm sure there'll be many more in the future. Where the input
> | docs can't be assumed up front to be valid and where DTDs/
> | schemas are cached locally, this doesn't seem like such a crime.
> The local caching seems to be part of the problem. I make
> extensive use of OASIS Catalogs on my own system to map public
> identifiers (and a few system identifiers) to local copies.
Ahh, but that _is_ a cache ... a manually maintained one.
Sorry, maybe I didn't make it all that clear that I meant 'cache'
in a rather broader sense than 'HTTP caching proxy server'.
> The other alternative is to design some sort of caching system
> (like wwwoffle) to do the caching work for you. But this
> generally seems to require vastly more technical skill on the
> part of the person setting up the system. I know lots of people
> who could be trained to edit a catalog file that would find
> setting up a proxy server way too challenging.
What requires more skill than what? Designing and implementing
a (good) proxy is non-trivial; installing and configuring one is
quite a bit easier. Writing a catalog file is easy; manually
maintaining one could be quite a headache (cp. hosts files vs. DNS).
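For concreteness, a catalog file of the kind being discussed is just a list of identifier-to-local-file mappings in the OASIS TR9401 format. The DocBook identifier and local path below are illustrative, not taken from this thread:

```
-- illustrative TR9401 catalog entries, mapping identifiers to local copies --
PUBLIC "-//OASIS//DTD DocBook XML V4.1.2//EN"
       "local/docbook/docbookx.dtd"
SYSTEM "http://www.oasis-open.org/docbook/xml/4.1.2/docbookx.dtd"
       "local/docbook/docbookx.dtd"
```

Easy enough to write once; the maintenance burden comes from keeping entries like these current by hand as schemas move or new versions appear.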
Don't get me wrong, I'm all in favour of the work that you and
the OASIS Entity Management TC are doing. But for me that's only
a very partial solution to a much bigger problem ... it's a host
file, I want DNS.
Miles Sabin
Internet Systems Architect
InterX
5/6 Glenthorne Mews
London, W6 0LJ, England
+44 (0)20 8817 4030