   RE: [xml-dev] Local Vs Global Vocabularies


<snip on filter bait/>

> 
> Anywho... better topic.  When designing vocabularies for 
> very large communities, how do youse guys/y'all/anyone 
> approach the dilemma of scale vs localization?  In reading 
> a currently proposed language, we find that the approach 
> taken was to review some n number of examples and boil 
> that down to some n number of productions.  It seems 
> sensible enough until one actually tries to implement 
> that for local sites and discovers how much customization 
> one puts back to deal with the fact that boiling it 
> down proved to be locally lossy even if globally complete.
> 
> Of course, XSLT cures all ills, but ....

This is an issue we have in spades: thousands of protocols, each with a
vocabulary highly specific to a research sponsor, all sharing a common
database and driven from common metadata.  The process for us is tedious
at best:

1) document specific use cases;
2) model abstract commonality;
3) get everyone together and get concurrence that the abstract models
can be extended to meet their specific use cases and that the abstract
representations can be commonly understood (i.e., get everyone to agree
on abstract semantics!);
4) have business analysts implement the resultant generic metadata;
5) have business analysts implement customization of each model for each
group of users representing a common view (sometimes an aggregation of
many different protocols);
6) add escape hatches for local customization. In particular, we drive
the collected data to customized data marts where it is no longer in our
system.  This sometimes means having (small) mapping tables to map from
the generic models to the required data marts (something like the sketch
below)...
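
For concreteness, a hypothetical sketch of such a mapping-table entry;
the element and attribute names are invented for illustration and are
not our actual metadata vocabulary:

  <!-- Hypothetical sketch: mapping a generic model to data-mart
       columns.  All names here are invented for illustration. -->
  <mapping model="AdverseEvent" mart="SPONSOR_AE_MART">
    <map from="event/onsetDate"    to="AE_ONSET_DT"/>
    <map from="event/severity"     to="AE_SEV_CD"/>
    <!-- locally customized field carried through the escape hatch -->
    <map from="event/x-localGrade" to="AE_LOCAL_GRADE"/>
  </mapping>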

Note that the metadata doesn't just describe data representations but
also security models (equivalent to row- and column-level access by role
in context) and, coming this year, workflow models....
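
Purely by way of illustration (again, the names are invented), security
metadata of that sort might look something like:

  <!-- Hypothetical sketch: column- and row-level security by role.
       Element and attribute names are invented for illustration. -->
  <entity name="AdverseEvent">
    <column name="patientId">
      <grant role="investigator" read="yes" write="yes"/>
      <grant role="sponsor"      read="no"  write="no"/>
    </column>
    <!-- row filter: investigators see only their own site's rows -->
    <rows role="investigator" where="siteId = $user.siteId"/>
  </entity>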

We use XSLTs as generic rule-processing engines.  The generic metadata
is combined with the domain-specific metadata to produce abstract data
object models for each specific domain according to rules encoded as
XSLT templates.  Parts of the system depend on the ability to
dynamically produce XSLT from other metadata that is then used to filter
and encode instance-specific data.  Finally, a presentation-specific
XSLT layer renders XHTML, and later this year Flash (with long-range
plans for PDF and other presentations), for the end user from the
abstract object models combined with instance-specific data.
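
For readers who haven't tried it, one way to produce XSLT from metadata
with XSLT itself is xsl:namespace-alias.  A minimal sketch, assuming an
invented metadata vocabulary in which fields are flagged
visible="yes|no":

  <?xml version="1.0"?>
  <xsl:stylesheet version="1.0"
      xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
      xmlns:out="urn:alias-for-generated-xslt">

    <!-- Literal result elements in the "out" namespace are emitted
         as xsl:* elements in the generated stylesheet. -->
    <xsl:namespace-alias stylesheet-prefix="out" result-prefix="xsl"/>

    <xsl:template match="/metadata">
      <out:stylesheet version="1.0">
        <!-- identity rule: pass instance data through unchanged -->
        <out:template match="@* | node()">
          <out:copy>
            <out:apply-templates select="@* | node()"/>
          </out:copy>
        </out:template>
        <!-- one empty rule per suppressed field drops it on output -->
        <xsl:for-each select="field[@visible = 'no']">
          <out:template match="{@name}"/>
        </xsl:for-each>
      </out:stylesheet>
    </xsl:template>

  </xsl:stylesheet>

The generated stylesheet is then applied to the instance-specific data
in a second pass to filter it down to what a given site should see.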

The resulting system (regular xml-dev readers may have noted that we use
Apache Cocoon as the underlying enabler for all this) can churn out a
new research protocol in less than a week, but the cost is lots of
hardware.  This system isn't specific to research protocols, but I do
think it is best for areas where the business requirements are not well
defined (i.e., research).  For better-defined business problems, the
mangling of all the layers of metadata required to produce any given
specific customization likely couldn't be cost-justified (though the
tool might work for rapid prototyping).  For areas where hundreds of
thousands of dollars are spent to make small steps forward (or to save
lives), the cost of adding another server is small by comparison (until
we run out of server rack space)....
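
For the curious, the Cocoon side reduces to ordinary sitemap pipelines.
A simplified, hypothetical match (the stylesheet and metadata file names
are invented for illustration, not lifted from our sitemap):

  <!-- Hypothetical, simplified Cocoon sitemap fragment -->
  <map:pipeline xmlns:map="http://apache.org/cocoon/sitemap/1.0">
    <map:match pattern="protocol/*.html">
      <map:generate src="metadata/{1}.xml"/>
      <!-- merge generic and domain metadata into an abstract model -->
      <map:transform src="xslt/build-abstract-model.xsl"/>
      <!-- render the abstract model as XHTML for the end user -->
      <map:transform src="xslt/xhtml-presentation.xsl"/>
      <map:serialize type="html"/>
    </map:match>
  </map:pipeline>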
