Elliotte Rusty Harold wrote:
> ...
> However, most processors of binary formats such as Word do not start
> with the assumption that they are reading an arbitrary stream of
> bytes. They assume they're reading data in a known format and build
> assumptions about the format into their code. When those assumptions
> are violated, the program heads south in unanticipated and potentially
> damaging and dangerous ways. This is why it really bothers me when
> processors attempt to gain speed compared to traditional XML parsing
> by skipping well-formedness checks. This applies to both many binary
> parsers and some so-called minimal parsers that process traditional
> XML without checking for well-formedness.
Binary doesn't imply there isn't any well-formedness checking, obviously.
Incremental, or lazily evaluated, well-formedness checking is useful and
potentially protects applications just as well as full up-front
well-formedness checking, with better efficiency.
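To make that concrete, here is a minimal sketch of the lazy approach in
Python. The names (LazyNode, _check) and the toy checks are hypothetical,
not drawn from any real parser; the point is only that a region of input
is verified the first time the application touches it, and never if it is
never touched:

    # Minimal sketch of incremental (lazy) well-formedness checking.
    # LazyNode and its checks are illustrative only, not a real parser.
    import re

    class LazyNode:
        """Wraps one region of raw input; validates it on first access."""

        _NAME = re.compile(r'^[A-Za-z_][\w.\-]*$')  # simplified XML Name

        def __init__(self, tag: str, text: str):
            self._tag = tag
            self._text = text
            self._checked = False

        def _check(self):
            # Runs once; later accesses skip the cost entirely.
            if not self._checked:
                if not self._NAME.match(self._tag):
                    raise ValueError(f"ill-formed element name: {self._tag!r}")
                if '<' in self._text:
                    raise ValueError("unescaped '<' in character data")
                self._checked = True

        @property
        def text(self) -> str:
            self._check()          # pay the well-formedness cost lazily
            return self._text

    node = LazyNode("price", "42.00")
    print(node.text)

Regions the application never reads are never checked, yet anything it
does read is as safe as with a full pass over that region.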
Allowing the application the option of avoiding repetitive or unnecessary
well-formedness checking is a valid strategy. Your argument that data a
program receives must always be fully validated, in every situation,
could just as easily be extended to libraries and modules receiving DOM
references or the like. What one system does with libraries or software
modules, another may do with plugins, and another with n-tier processing
steps. Does the granularity of the implementation somehow necessarily
change the fundamental likelihood of corruption?
sdw
--
swilliams@hpti.com http://www.hpti.com Per: sdw@lig.net http://sdw.st
Stephen D. Williams 703-724-0118W 703-995-0407Fax 20147-4622 AIM: sdw