OASIS Mailing List Archives
RE: [xml-dev] Validating Bulk XML Data

Michael Kay writes:

> I can't see any intrinsic reason why validating 1000 small documents should
> take longer than validating one document formed by concatenating the
> content, provided the schema itself is only prepared once.

Yes, in principle.  If the particular validation tool you use happens to 
have a long startup or setup time, then validating many small documents 
would of course be slower.  I noticed in your earlier email you suggested 
a good approach in which Java APIs would be used to prepare a schema once 
and then apply it repeatedly.  That would presumably be fast.  If one 
instead made repeated calls to a command-line parser interface, perhaps 
from a script or batch file, and if that interface did not support 
precompilation or preparation of the schema, then validating 1000 small 
documents would probably be slow.  I suspect that's what you meant, but 
in this case it's not clear what tools are being used.
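The "prepare the schema once, apply it repeatedly" approach described above can be sketched with the standard JAXP validation API (`javax.xml.validation`). This is a minimal illustration, not anyone's actual code from the thread: the tiny inline schema, the `<doc>` element, and the class and method names are invented for the example.

```java
import java.io.StringReader;
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;

public class BulkValidate {

    // Hypothetical schema, for illustration only: a single <doc> element.
    static final String XSD =
        "<xs:schema xmlns:xs='http://www.w3.org/2001/XMLSchema'>"
      + "<xs:element name='doc' type='xs:string'/>"
      + "</xs:schema>";

    // Compile the schema ONCE, then validate n small documents against it.
    // The per-document cost is just parsing and validation, not schema setup.
    static int validateAll(int n) throws Exception {
        SchemaFactory factory =
            SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema =
            factory.newSchema(new StreamSource(new StringReader(XSD)));
        // A Validator may be reused sequentially (it is not thread-safe).
        Validator validator = schema.newValidator();
        int validated = 0;
        for (int i = 0; i < n; i++) {
            validator.validate(
                new StreamSource(new StringReader("<doc>item " + i + "</doc>")));
            validated++;
        }
        return validated;
    }

    public static void main(String[] args) throws Exception {
        System.out.println("validated " + validateAll(1000) + " documents");
    }
}
```

A command-line tool invoked once per file would, by contrast, pay the JVM startup and schema-compilation cost on every document, which is the slow case described above.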

Noah

--------------------------------------
Noah Mendelsohn 
IBM Corporation
One Rogers Street
Cambridge, MA 02142
1-617-693-4036
--------------------------------------