- From: Gabe Beged-Dov <email@example.com>
- To: David Megginson <firstname.lastname@example.org>
- Date: Mon, 29 Mar 1999 09:58:57 -0800
David Megginson wrote:
> In other words, it's not the XML *input* that you need to optimize,
> but the *output* -- for example, if you have a Perl script that
> renders XML in HTML, the best speed optimization is to cache the
> result and re-serve it for any request with the same parameters.
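To make the quoted suggestion concrete, here is a minimal sketch of output caching keyed on request parameters (the render function and parameter names are invented stand-ins, not anyone's actual code):

```python
# Sketch of the output-caching idea: render once, then re-serve the
# cached result for any request with the same parameters.
cache = {}

def render(params):
    # Stand-in for an expensive XML-to-HTML transformation.
    body = ",".join(f"{k}={v}" for k, v in sorted(params.items()))
    return "<html>" + body + "</html>"

def handle_request(params):
    key = tuple(sorted(params.items()))
    if key not in cache:
        cache[key] = render(params)   # expensive path, runs once per key
    return cache[key]                 # cheap path on repeated requests

first = handle_request({"page": "1"})
second = handle_request({"page": "1"})
assert first is second  # second request came straight from the cache
```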
Assume that caching isn't an option, i.e. you have to make all of your processing reasonably
fast. It's not acceptable to make only 80% of your processing really fast.
> The XML/SGML processing model is generally to walk through a document
> (as a collection of events or as a tree) and fire off handlers for
> different types of things. Even a short to medium-length XML document
> can cause the handlers to be fired off many thousands of times, and if
> you're trying to handle hundreds of requests per second, that's going
> to cause problems with or without XML.
Are we talking about throughput or responsiveness? It would be useful to bring up some
use cases where XML processing can't use the default handler-firing model and try to
understand what the alternatives are.
Matt Sergeant has brought up one, involving large-scale usage, that he might be able to
flesh out. I'm sure there are others.
xml-dev: A list for W3C XML Developers. To post, mailto:email@example.com
Archived as: http://www.lists.ic.ac.uk/hypermail/xml-dev/ and on CD-ROM/ISBN 981-02-3594-1
To (un)subscribe, mailto:firstname.lastname@example.org the following message;
To subscribe to the digests, mailto:email@example.com the following message;
List coordinator, Henry Rzepa (mailto:firstname.lastname@example.org)