Karl Waclawek writes:
> are there any well-known ways to protect against
> malicious XML, e.g. XML that causes your parser
> to eat up all memory?
You build sanity limits into your parser. That will require the
parser to reject some well-formed documents, and technically, it will
probably make the parser non-conformant, but that's just tough. I'd
suggest something like a size limit of 1024 characters on element and
attribute names, an element nesting depth limit of 512, and a limit
of 128 attributes per element. Anyone who exceeds these limits is
just goofy. SAX already lets you break up long strings of character
data into reasonable-sized chunks, so there should be no problem
there; limits will also be necessary for IDs (if validating),
processing instructions, and possibly comments (if the parser bothers
with them).
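For concreteness, here's roughly how those checks might look as a
SAX2 filter in Java -- a minimal sketch, not production code. The
class name SanityLimitFilter is invented for illustration, and the
constants are just the values suggested above:

    import org.xml.sax.Attributes;
    import org.xml.sax.SAXException;
    import org.xml.sax.helpers.XMLFilterImpl;

    // Sketch of a filter that rejects documents exceeding sanity limits.
    public class SanityLimitFilter extends XMLFilterImpl {
        private static final int MAX_NAME_LENGTH = 1024; // element/attr names
        private static final int MAX_DEPTH = 512;        // element nesting
        private static final int MAX_ATTRIBUTES = 128;   // attrs per element

        private int depth = 0;

        public void startElement(String uri, String localName, String qName,
                                 Attributes atts) throws SAXException {
            if (++depth > MAX_DEPTH) {
                throw new SAXException("nesting deeper than " + MAX_DEPTH);
            }
            if (qName.length() > MAX_NAME_LENGTH) {
                throw new SAXException("element name longer than "
                                       + MAX_NAME_LENGTH);
            }
            if (atts.getLength() > MAX_ATTRIBUTES) {
                throw new SAXException("more than " + MAX_ATTRIBUTES
                                       + " attributes");
            }
            for (int i = 0; i < atts.getLength(); i++) {
                if (atts.getQName(i).length() > MAX_NAME_LENGTH) {
                    throw new SAXException("attribute name longer than "
                                           + MAX_NAME_LENGTH);
                }
            }
            super.startElement(uri, localName, qName, atts);
        }

        public void endElement(String uri, String localName, String qName)
                throws SAXException {
            depth--;
            super.endElement(uri, localName, qName);
        }
    }

Doing this in a filter rather than inside the parser means any
conformant SAX2 parser can sit underneath it unchanged.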
Fixing the parser probably isn't the biggest issue, though -- the
application sitting on the other end of it (whether a DOM tree
builder, a point-of-sale system, or what-have-you) is probably using
most of the memory, and the fixes for that are application-specific.
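As one sketch of the application side, here's a hypothetical handler
that budgets the total character data it will accept before giving
up, wired onto the filter above (BoundedHandler and the 10 MB budget
are both made up; pick a budget that fits your application):

    import org.xml.sax.SAXException;
    import org.xml.sax.XMLReader;
    import org.xml.sax.helpers.DefaultHandler;
    import org.xml.sax.helpers.XMLReaderFactory;

    // Hypothetical application handler that caps total character data,
    // since the application usually holds most of the memory.
    class BoundedHandler extends DefaultHandler {
        private static final long MAX_TOTAL_CHARS = 10000000L; // arbitrary
        private long totalChars = 0;

        public void characters(char[] ch, int start, int length)
                throws SAXException {
            totalChars += length;
            if (totalChars > MAX_TOTAL_CHARS) {
                throw new SAXException("character data exceeds "
                                       + MAX_TOTAL_CHARS + " chars");
            }
            // ... hand the chunk on to the real application code ...
        }
    }

    public class Demo {
        public static void main(String[] args) throws Exception {
            XMLReader parser = XMLReaderFactory.createXMLReader();
            SanityLimitFilter filter = new SanityLimitFilter();
            filter.setParent(parser);       // any SAX2 parser underneath
            filter.setContentHandler(new BoundedHandler());
            filter.parse(args[0]);          // system ID from command line
        }
    }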
All the best,
David
--
David Megginson, david@megginson.com, http://www.megginson.com/