RE: Traffic Analysis and Namespace Dereferencing
- From: Miles Sabin <MSabin@interx.com>
- To: xml-dev@lists.xml.org
- Date: Tue, 02 Jan 2001 16:31:26 +0000
David Megginson wrote,
> John Wilson wrote,
> > Performing an HTTP GET on an arbitrary URL is not an
> > innocuous action.
>
> Very well put -- there are many dangers, including (as John
> points out) denial-of-service (intentional or unintentional)
> and maliciously altered schema information.
It's worth bearing in mind that this also applies to the
dereferencing of DTD external subsets. Generic XML processors
which want to validate arbitrary document instances and don't
already have a cached copy of any external subset will have to
fetch it, and that opens up the same possibility of DoS, spoofing,
and disclosure.
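For what it's worth, SAX2 at least gives you a hook here: install
an EntityResolver and the processor need never perform a silent
HTTP GET on an arbitrary system id. A minimal sketch of a
refuse-everything policy (the class name and the policy itself are
mine, purely illustrative, not anything standardised):

    import java.io.IOException;
    import org.xml.sax.EntityResolver;
    import org.xml.sax.InputSource;
    import org.xml.sax.SAXException;

    // Illustrative: refuses to dereference any external subset
    // rather than quietly fetching an arbitrary system id.
    public class ParanoidResolver implements EntityResolver
    {
        public InputSource resolveEntity(String publicId,
                                         String systemId)
            throws SAXException, IOException
        {
            throw new SAXException("refusing to fetch: " + systemId);
        }
    }

wired up in the obvious way,

    XMLReader reader = XMLReaderFactory.createXMLReader();
    reader.setEntityResolver(new ParanoidResolver());

That's too blunt for a validating processor, of course, but it
shows that the fetch is interceptable.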
I can't help worrying that unintentional DoS might turn out to be
a major problem in the not too distant future ... the W3C's
servers host an awful lot of critical DTDs, and an awful lot of
generic XML processors don't cache external subsets or use
caching HTTP proxies by default. So what would happen if w3.org
collapsed under the strain of a couple of hundred thousand XML
editors all starting up at once?
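The flip side is that the same EntityResolver hook gives you a poor
man's cache: map the handful of well-known public ids to local
copies and only go to the network for anything else. A sketch,
with the public id and file path purely illustrative (a real
processor would read the mapping from a catalog rather than
hard-coding it):

    import java.io.FileReader;
    import java.io.IOException;
    import java.util.HashMap;
    import java.util.Map;
    import org.xml.sax.EntityResolver;
    import org.xml.sax.InputSource;
    import org.xml.sax.SAXException;

    // Resolves well-known public ids against local copies so
    // that validation needn't touch w3.org at all.
    public class CachingResolver implements EntityResolver
    {
        private final Map cache = new HashMap();

        public CachingResolver()
        {
            // Illustrative entry; extend or load from a catalog.
            cache.put("-//W3C//DTD XHTML 1.0 Strict//EN",
                      "/usr/local/share/dtds/xhtml1-strict.dtd");
        }

        public InputSource resolveEntity(String publicId,
                                         String systemId)
            throws SAXException, IOException
        {
            String local = (String)cache.get(publicId);
            if (local != null)
                return new InputSource(new FileReader(local));
            return null; // fall back to default (network) resolution
        }
    }

If editors and generic processors shipped with something like this
by default, the startup stampede wouldn't land on w3.org.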
Cheers,
Miles
--
Miles Sabin InterX
Internet Systems Architect 5/6 Glenthorne Mews
+44 (0)20 8817 4030 London, W6 0LJ, England
msabin@interx.com http://www.interx.com/