At 11:38 AM +1100 2/25/03, Matthew.Bennett@facs.gov.au wrote:
>Why parse repeatedly, if it's so damned inefficient? Why not come up with
>the concept of a 'compiled' xml document; one where structural info. is
>stored, and access is *FAST*, and validity and well-formedness have already
>been 'certified'?
XML's interoperability is largely based on the refusal to certify
well-formedness in advance: everyone checks it for themselves.
There's a reason compiled data doesn't port nearly as well as text
source code. There's a reason corrupt binary data routinely crashes
programs ranging from Microsoft Word to simple JPEG viewers. There's
a reason Java VMs verify byte code before executing it. When
receiving data from heterogeneous sources, you do not want to assume
the data is correct. Binary or text, it needs to be verified. Moving
to a binary format would not remove this requirement from XML
processing.
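The point about everyone checking for themselves can be sketched in a few
lines (an illustrative example, not from the original post, using Python's
standard expat bindings): each consumer re-verifies well-formedness on
receipt instead of trusting that a sender has "certified" the document.

```python
# Sketch: a receiver verifies well-formedness itself rather than
# trusting upstream certification. Uses Python's stdlib expat parser.
import xml.parsers.expat

def is_well_formed(data: bytes) -> bool:
    """Return True if the byte stream parses as well-formed XML."""
    parser = xml.parsers.expat.ParserCreate()
    try:
        parser.Parse(data, True)  # True = this is the final chunk
        return True
    except xml.parsers.expat.ExpatError:
        return False

print(is_well_formed(b"<doc><item/></doc>"))  # well-formed -> True
print(is_well_formed(b"<doc><item></doc>"))   # mismatched tag -> False
```

Malformed input is rejected cleanly at the parser boundary instead of
propagating corrupt data into the application, which is exactly the
guarantee a "pre-certified" binary format would tempt receivers to skip.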
--
+-----------------------+------------------------+-------------------+
| Elliotte Rusty Harold | elharo@metalab.unc.edu | Writer/Programmer |
+-----------------------+------------------------+-------------------+
| Processing XML with Java (Addison-Wesley, 2002) |
| http://www.cafeconleche.org/books/xmljava |
| http://www.amazon.com/exec/obidos/ISBN%3D0201771861/cafeaulaitA |
+----------------------------------+---------------------------------+
| Read Cafe au Lait for Java News: http://www.cafeaulait.org/ |
| Read Cafe con Leche for XML News: http://www.cafeconleche.org/ |
+----------------------------------+---------------------------------+