   RE: [xml-dev] XML Schema for large database

You are missing the question:

Are there any major road blocks for future consumers of my XML files if the XML Schema they reference is very large?

Bill Riegel
Landmark Graphics
Phone: 713-839-3388

-----Original Message-----
From: CHIUSANO, Joseph [mailto:JCHIUSANO@lmi.org]
Sent: Thursday, August 15, 2002 11:08 AM
To: 'Bill Riegel'; 'xml-dev@lists.xml.org'
Subject: RE: [xml-dev] XML Schema for large database

Bill,

One thing to keep in mind is the structure of the schema (and therefore the corresponding XML documents), as this may affect locking when committing information to a database (and the more information you have, the longer the duration of the locks).  That is, suppose you have an Order table and an Item table - there are 2 basic ways to structure this information:

(1) "Nested" approach
(2) "Blocks" approach

In (1), the XML document would look as follows (as you can see, the Item records are "nested" inside their corresponding Order records):

  <Orders>
    <Order orderID="123">
      ...order information here...
      <Items>
        <Item itemID="0116602800">
          ...item information here...
        </Item>
        <Item itemID="0116659813">
          ...item information here...
        </Item>
        ...more items...
      </Items>
    </Order>
    ...more orders...
  </Orders>

It seems reasonable to assume that when committing this information to a database, one may have to lock multiple tables simultaneously (thereby violating the best practice rule of "hold locks for as little time as possible").  That is, the Order table would need to be locked, then the Item table (while the Order table is locked), etc.  The more tables that are involved, the more simultaneous locks that need to be held.  There are certainly creative ways around this, but these would involve an intermediate step that places the information in a more "lock-friendly" structure.
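To make the locking concern concrete, here is a minimal sketch of the single-transaction load the nested layout tends to force, assuming Python with sqlite3 and hypothetical orders/items tables. SQLite itself locks the whole database rather than individual tables, so read the comments as standing in for a table-locking RDBMS.

import sqlite3
import xml.etree.ElementTree as ET

NESTED_XML = """
<Orders>
  <Order orderID="123">
    <Items>
      <Item itemID="0116602800"/>
      <Item itemID="0116659813"/>
    </Items>
  </Order>
</Orders>
"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id TEXT PRIMARY KEY)")
conn.execute("CREATE TABLE items (item_id TEXT PRIMARY KEY, order_id TEXT)")

root = ET.fromstring(NESTED_XML)

# Because each <Item> sits inside its <Order>, the natural load walks both
# levels together, so a single transaction writes to (and, on a table-locking
# engine, would hold locks on) both tables for its entire duration.
with conn:
    for order in root.iter("Order"):
        conn.execute("INSERT INTO orders VALUES (?)", (order.get("orderID"),))
        for item in order.iter("Item"):
            conn.execute("INSERT INTO items VALUES (?, ?)",
                         (item.get("itemID"), order.get("orderID")))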

In (2), the information is in "blocks" that correspond to database tables.  For example:

  <Orders>
    <Order orderID="123">
      ...order information here...
    </Order>
    ...more orders...
  </Orders>
  <Items>
    <Item itemID="0116602800" orderID="123">
      ...item information here...
    </Item>
    <Item itemID="1106500706" orderID="456">
      ...item information here...
    </Item>
    ...more items...
  </Items>

With this approach, only the Order table needs to be locked when committing Orders to the database, then only the Item table, etc.  The only caveat is that there needs to be a cross-reference from Items to Orders, so that it is clear which items belong to which orders (see the orderID attribute above).
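And a companion sketch of the "blocks" layout (same hypothetical tables, fresh in-memory connection so it runs on its own): each table is committed in its own short transaction, and the orderID attribute carries the cross-reference. The <Batch> wrapper is only there to give the fragment a single root element.

import sqlite3
import xml.etree.ElementTree as ET

BLOCKS_XML = """
<Batch>
  <Orders>
    <Order orderID="123"/>
    <Order orderID="456"/>
  </Orders>
  <Items>
    <Item itemID="0116602800" orderID="123"/>
    <Item itemID="1106500706" orderID="456"/>
  </Items>
</Batch>
"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id TEXT PRIMARY KEY)")
conn.execute("CREATE TABLE items (item_id TEXT PRIMARY KEY, order_id TEXT)")

root = ET.fromstring(BLOCKS_XML)

with conn:  # transaction 1: only the orders table is written
    for order in root.iter("Order"):
        conn.execute("INSERT INTO orders VALUES (?)", (order.get("orderID"),))

with conn:  # transaction 2: only the items table, linked back via orderID
    for item in root.iter("Item"):
        conn.execute("INSERT INTO items VALUES (?, ?)",
                     (item.get("itemID"), item.get("orderID")))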

Hope this helps,
Joe Chiusano
LMI

> **************************************************************************
>   Joseph M. Chiusano
>   Logistics Management Institute
>   2000 Corporate Ridge
>   McLean, VA 22102
>   Email: jchiusano@lmi.org
>   Tel: 571.633.7722
> **************************************************************************
>


-----Original Message-----
From: Bill Riegel [mailto:BRiegel@lgc.com]
Sent: Thursday, August 15, 2002 10:17 AM
To: 'xml-dev@lists.xml.org'
Subject: [xml-dev] XML Schema for large database


I want to serialize the contents of portions of a large database, i.e. 500-1000 table definitions. The purpose of the file is to allow it to be loaded by someone else, somewhere else.

I am trying to understand the implications of creating an XML Schema that reflects the rules of the entire database. The XML Schema would be thousands, if not tens of thousands, of lines long.

I would auto-create the XML Schema from the database's metadata.
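A minimal sketch of what that auto-generation could look like, assuming SQLite metadata read via PRAGMA table_info and a naive SQL-to-XSD type mapping; the table and column names are illustrative only.

import sqlite3

# Naive mapping from SQL column types to XSD built-ins; a real generator for
# 500-1000 tables would also need keys, constraints, and nullability.
TYPE_MAP = {"INTEGER": "xs:integer", "TEXT": "xs:string", "REAL": "xs:decimal"}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer TEXT, total REAL)")
conn.execute("CREATE TABLE items (item_id INTEGER, order_id INTEGER, sku TEXT)")

def complex_type_for(table):
    """Emit one xs:complexType describing a table's columns."""
    cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
    lines = [f'  <xs:complexType name="{table}Type">', "    <xs:sequence>"]
    for _cid, name, sqltype, *_rest in cols:
        xsd_type = TYPE_MAP.get(sqltype.upper(), "xs:string")
        lines.append(f'      <xs:element name="{name}" type="{xsd_type}"/>')
    lines += ["    </xs:sequence>", "  </xs:complexType>"]
    return "\n".join(lines)

tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
schema = ['<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">']
schema += [complex_type_for(t) for t in tables]
schema.append("</xs:schema>")
print("\n".join(schema))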

But would there be problems with parsers or transformation engines being able to consume it?

I have been toying with the idea of breaking up the logical model into several smaller sets, and only allowing the user to select tables that are defined within a given set.

Or have a minimal amount of data in each XML file, and collect the files in a zip file. Each file would then have a small(er) schema.
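A small sketch of that packaging idea using Python's zipfile module; the file names, schema contents, and xsi:noNamespaceSchemaLocation references are all hypothetical.

import zipfile

# Each subset document points at its own, smaller schema; both travel together
# in one archive so the consumer never needs the full-database XSD.
subsets = {
    "orders.xml": ('<Orders xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"\n'
                   '        xsi:noNamespaceSchemaLocation="orders.xsd">...</Orders>'),
    "items.xml": ('<Items xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"\n'
                  '       xsi:noNamespaceSchemaLocation="items.xsd">...</Items>'),
}

with zipfile.ZipFile("export.zip", "w") as archive:
    for name, xml_text in subsets.items():
        archive.writestr(name, xml_text)
    archive.writestr("orders.xsd", "<!-- smaller schema covering only the Orders subset -->")
    archive.writestr("items.xsd", "<!-- smaller schema covering only the Items subset -->")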

Is it possible/probable that a large schema could not be processed?

Bill Riegel
Landmark Graphics
Phone: 713-839-3388

-----------------------------------------------------------------
The xml-dev list is sponsored by XML.org <http://www.xml.org>, an
initiative of OASIS <http://www.oasis-open.org>

The list archives are at http://lists.xml.org/archives/xml-dev/

To subscribe or unsubscribe from this list use the subscription
manager: <http://lists.xml.org/ob/adm.pl>





 
