- From: Tim Crook <tcrook@JetForm.com>
- To: XML-Dev Mailing list <xml-dev@xml.org>
- Date: Tue, 21 Nov 2000 17:17:21 -0500
Hello all.
I am currently using expat to create an XML DOM, which works pretty well
when files are small.
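
For context, here is a minimal sketch of the kind of setup I mean; the handler names, the file name, and the printf stand-ins for the tree-building step are only illustrative, not my actual code:

#include <stdio.h>
#include <expat.h>

/* expat streams through the document and calls these back; a real
   DOM builder would push/pop nodes here instead of printing. */
static void start_element(void *userData, const char *name,
                          const char **atts)
{
    (void)userData; (void)atts;
    printf("start: %s\n", name);
}

static void end_element(void *userData, const char *name)
{
    (void)userData;
    printf("end: %s\n", name);
}

int main(void)
{
    char buf[8192];
    int done = 0;
    FILE *fp = fopen("data.xml", "rb");
    XML_Parser p = XML_ParserCreate(NULL);

    if (fp == NULL || p == NULL)
        return 1;
    XML_SetElementHandler(p, start_element, end_element);
    while (!done) {
        size_t len = fread(buf, 1, sizeof(buf), fp);
        done = feof(fp);
        if (!XML_Parse(p, buf, (int)len, done)) {
            fprintf(stderr, "parse error: %s\n",
                    XML_ErrorString(XML_GetErrorCode(p)));
            break;
        }
    }
    XML_ParserFree(p);
    fclose(fp);
    return 0;
}
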
I was trying to think of a way to do a selective load of a small group of
repeated elements out of a very large group of repeated elements in a large
data file or data stream. With expat, from what I have seen, the only way of
going back and re-reading previously parsed elements (which may have been
discarded) would be to destroy the expat parser with XML_ParserFree and
restart parsing the document from the beginning. Is there a better way?
My understanding of expat is that going backwards in the data stream and
trying to resume parsing at that point is not really viable.
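
To make the workaround concrete, this is roughly what I mean by destroying
the parser and starting over; the element name "record", the flag in
userData, and the assumption that the input is a rewindable FILE* are just
placeholders for the sketch:

#include <stdio.h>
#include <string.h>
#include <expat.h>

/* Ignore events until the repeated element we care about comes around
   again; "record" stands in for the real element name. */
static void start_element(void *userData, const char *name,
                          const char **atts)
{
    int *in_record = (int *)userData;
    (void)atts;
    if (strcmp(name, "record") == 0)
        *in_record = 1;
}

static void end_element(void *userData, const char *name)
{
    int *in_record = (int *)userData;
    if (strcmp(name, "record") == 0)
        *in_record = 0;
}

/* Re-parse the whole file from the top: the old parser has to be freed
   and a brand-new one created, since I know of no way to reposition an
   existing parser part-way into the stream. */
static int reparse_from_start(FILE *fp)
{
    char buf[8192];
    int done = 0, ok = 1, in_record = 0;
    XML_Parser p = XML_ParserCreate(NULL);

    if (p == NULL)
        return 0;
    XML_SetUserData(p, &in_record);
    XML_SetElementHandler(p, start_element, end_element);
    rewind(fp);                     /* back to the beginning of the file */
    while (!done && ok) {
        size_t len = fread(buf, 1, sizeof(buf), fp);
        done = feof(fp);
        ok = XML_Parse(p, buf, (int)len, done);
    }
    XML_ParserFree(p);              /* throw the parser away entirely */
    return ok;
}
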
There just doesn't seem to be an explicit way of resetting the context of the
parser to any particular point in the file.
Any ideas? Is there something I may be overlooking?