OASIS Mailing List Archives


   Re: [xml-dev] Parsing efficiency? - why not 'compile'????


Peter Finch writes:

 > Do you think that if there were a way to compile a document and an
 > open way to edit, query and modify it, people might be more
 > inclined to store the documents in a persistent DOM form (or the
 > like)? I am talking about documents, not data that is just used
 > for interchange.

I used to, but as I've mentioned, this is something that has come up
over and over again during the past five years, and nothing has
succeeded in catching on.

For shorter documents (under 10MB or so), there would probably be
little or no advantage to a binary format -- parsing the whole XML
document into memory in one fast burst is likely much faster than a
lot of random disk access.  Perhaps there just are not enough people
dealing with bigger XML documents to make a compiled format catch on.
Building a DOM tree can be slow, admittedly, because of all the memory
allocation required, but people dealing with more than trivially-small
documents don't generally build a DOM anyway (even if the documents
are human-readable prose).
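
To make that concrete, here is a small Python sketch (my own
illustration, with a made-up test document) of the two styles:
building a full in-memory tree at once versus streaming through the
document and discarding each node as soon as it is handled:

```python
import io
import xml.etree.ElementTree as ET

# Hypothetical test document: many small records.
xml_text = "<log>" + "".join(
    f"<entry id='{i}'>record {i}</entry>" for i in range(10000)
) + "</log>"

# Tree-building style: parse the whole document into memory in one burst.
root = ET.fromstring(xml_text)
tree_count = len(root)

# Streaming style: iterparse visits elements one at a time, so memory
# stays flat for arbitrarily large inputs if we clear as we go.
stream_count = 0
for event, elem in ET.iterparse(io.StringIO(xml_text), events=("end",)):
    if elem.tag == "entry":
        stream_count += 1
        elem.clear()  # free the subtree as soon as it is processed

assert tree_count == stream_count == 10000
```

The streaming loop never holds more than one record's subtree, which
is why large-document processing tends to skip the DOM entirely.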

 > With all the open standards for accessing an XML document, is
 > storing them in a flat file on a filesystem a little inefficient
 > (especially if they are large)? We use databases for record
 > based data and people would never store records in flat
 > files... but isn't that essentially what most people do with
 > their XML documents?

Inefficient in size?  I think that most people just compress them,
then uncompress them on the fly when needed.  That does slow down
reading very slightly, of course.
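
For example, with Python's standard library (an illustrative sketch,
not anything from the thread -- XML's repetitive markup compresses
very well, and the reader never sees the compressed form):

```python
import gzip
import io
import xml.etree.ElementTree as ET

xml_text = "<doc>" + "<item>hello</item>" * 1000 + "</doc>"
raw = xml_text.encode("utf-8")

# Store the document compressed on disk.
packed = gzip.compress(raw)

# Decompress transparently while parsing.
with gzip.open(io.BytesIO(packed), "rb") as f:
    root = ET.parse(f).getroot()

assert root.tag == "doc" and len(root) == 1000
print(f"{len(raw)} bytes raw -> {len(packed)} bytes compressed")
```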

Note that a naive binary implementation is not small, because you need
to keep several pointers in each node: to beat out compressed XML, you
would have to do a lot of optimization (buckets,
common-prefix removal, etc.) that would make the format considerably
harder to implement (and thus, less likely to catch on among initial
implementors).

 > Is this a view held by most people? Is compiling an XML document
 > into a persistent DOM format a waste of time... if so... why?

I'm taking a free-market perspective -- people have been attempting
and offering this kind of thing for half a decade, and virtually no
one has taken them up on it.

Like you, I can think of cases where a random-access equivalent of an
XML document might be useful, but text-based XML seems to be good
enough that people don't seem inclined to do a lot of extra work for a
small additional advantage.

All the best,


David Megginson, david@megginson.com, http://www.megginson.com/



Copyright 2001 XML.org. This site is hosted by OASIS