   Re: [xml-dev] The triples datamodel -- was Re: [xml-dev] Semantic Web p

At 12:56 PM +0100 6/6/04, Dave Pawson wrote:

>What processing expectation should we have for such 'extensions' Elliotte?
>  Its foreign to an expected schema, not previously met... What is
>your definition of robust and flexible? What one might call robust
>and flexible, others might call guesswork?

This response demonstrates yet another common fallacy in software 
design. There are unexamined principles at the foundation of your 
question which are so deeply ingrained in your thinking that it 
doesn't occur to you that they need to be examined or justified, but 
they do.

The fallacy here is that a document has some sort of processing 
expectation, but this is simply not true in the heterogeneous world 
of the Internet. The document is what it is, and will be processed 
differently by different actors. I likely do not want to do the same 
thing with the same document as you do, nor is it necessary that I do 
so. The demand that we provide and adhere to schemas is often little 
more than a demand that we process documents in only certain 
preapproved ways. That is a fundamentally limited perspective. It may 
work within one program or a small organization. However, it does not 
scale to the needs of large organizations and groups of organizations 
with different, unique objectives.

You are assuming that the extensions must be processed because 
they're there. I disagree. If I don't need them, I am free to ignore 
them. My only concern is whether the document contains what I need in 
order to perform my task. You likely have different requirements for 
that document than I do. I do not guess how to handle anything. I 
take what I need, and ignore the rest.
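The "take what I need, ignore the rest" approach is easy to sketch in code. The document and element names below are hypothetical, invented purely for illustration: a consumer pulls out the `item` elements it cares about and never even looks at an extension element in a foreign namespace.

```python
# A minimal sketch of extension-tolerant processing. The <order> document
# and the ext:loyaltyPoints extension are made-up examples, not anything
# from the original discussion.
import xml.etree.ElementTree as ET

doc = """<order xmlns:ext="http://example.com/extensions">
  <item sku="A-1" quantity="2"/>
  <ext:loyaltyPoints>150</ext:loyaltyPoints>
  <item sku="B-7" quantity="1"/>
</order>"""

root = ET.fromstring(doc)

# Take what we need: the item elements. The ext:loyaltyPoints element is
# simply never selected, so no guessing about its meaning is required.
items = [(i.get("sku"), int(i.get("quantity"))) for i in root.findall("item")]
print(items)  # [('A-1', 2), ('B-7', 1)]
```

The extension is not "handled" at all; it is ignored, and the consumer's only concern is whether the elements it needs are present.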

>>Sometimes the answer, is "I don't know" and the document may need 
>>to be kicked to a human for further analysis.
>
>Which some might equate to 'fall over and die'?

Absolutely not. The fact is computers aren't that smart, and robust 
systems allow and prepare for human intervention. In practice, most 
debugged and deployed systems rarely require human intervention of 
this sort. However, when they do (and sooner or later they all do) it 
is better to be ready for it and acknowledge it rather than silently 
drop the problem on the floor by rejecting invalid documents and 
claiming it's not your problem.
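One way to build in that human fallback, sketched with invented names (the queue, the document shapes, and the `process` function are all illustrative assumptions, not part of the original post): instead of rejecting a document the code cannot handle, it is set aside for a person to examine.

```python
# A minimal sketch: process what we can, and queue anything we can't
# understand for human review rather than rejecting it.
import xml.etree.ElementTree as ET

human_review_queue = []  # in a real system: a database, ticket system, etc.

def process(doc_text):
    try:
        root = ET.fromstring(doc_text)
    except ET.ParseError as err:
        # Not even well-formed: a human has to look at it.
        human_review_queue.append((doc_text, str(err)))
        return None
    item = root.find("item")
    if item is None:
        # Well-formed, but missing what this task needs: kick it to a human.
        human_review_queue.append((doc_text, "no item element found"))
        return None
    return item.get("sku")

print(process("<order><item sku='A-1'/></order>"))  # A-1
print(process("<order><note>???</note></order>"))   # None, and queued
```

Nothing is silently dropped on the floor: every document either yields an answer or lands in the queue with a reason attached.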

>I think the SGML world got it right on this one.

As proven by the massive success of SGML, and the complete failure of XML. :-)
-- 

   Elliotte Rusty Harold
   elharo@metalab.unc.edu
   Effective XML (Addison-Wesley, 2003)
   http://www.cafeconleche.org/books/effectivexml
   http://www.amazon.com/exec/obidos/ISBN%3D0321150406/ref%3Dnosim/cafeaulaitA
