- To: <xml-dev@lists.xml.org>
- Subject: RE: [xml-dev] Large xml processing into SQL Server
- From: <william.j.boxall@accenture.com>
- Date: Fri, 7 Jan 2005 14:14:18 -0600
- Importance: normal
- Priority: normal
- Thread-index: AcT03r6hMl6yiFTFTt+YNZItmIodEAAEcsD7
- Thread-topic: [xml-dev] Large xml processing into SQL Server
Thank you to all who have offered to help!
I am home sick right now, so I don't have access to my work computer and can't check all the settings. The driver we are using, though, is weblogic.jdbc.sqlserver.SQLServerDriver.
The temp table I am attempting to write the records to is:
CREATE TABLE [TMP_TAN_TAB] (
[TAN_PREFIX] [char] (4) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
[TAN_NUMBER] [char] (9) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
[CHANNEL_TYPE] [char] (1) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
[STATUS] [char] (1) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
[TIN] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[NON_GDS_IND] [bit] NULL ,
[ACTIVE_IND] [bit] NOT NULL ,
[LAST_UPDATE] [smalldatetime] NOT NULL ,
[INTERFACE_NAME] [varchar] (15) COLLATE SQL_Latin1_General_CP1_CI_AS NULL
) ON [PRIMARY]
GO
And a sample 3-record XML file is:
<?xml version="1.0"?>
<AMEXTANToACNTAN DateTimeCreated="20040127165815" RecordCount="3" xmlns="stsi/amextan" SerialNum="1">
<BusinessUnit Code="AAAA" TANCount="2">
<ITEM TAN="AAAA00324">
<ChannelType>P</ChannelType>
<TANStatus>A</TANStatus>
<TIN> </TIN>
</ITEM>
<ITEM TAN="AAAA98422">
<ChannelType>R</ChannelType>
<TANStatus>U</TANStatus>
<TIN>1234567</TIN>
</ITEM>
</BusinessUnit>
<BusinessUnit Code="BBBB" TANCount="1">
<ITEM TAN="BBBB00420">
<ChannelType>P</ChannelType>
<TANStatus>C</TANStatus>
<TIN> </TIN>
</ITEM>
</BusinessUnit>
</AMEXTANToACNTAN>
But the size of the file IS big. The sample file we received was 72 meg, which equates to roughly 819,000 records.
A file this large won't be an everyday occurrence, though. It's a one-time initial load; the daily files we receive thereafter will be smaller, though we do not know exactly what size they will be.
Currently, I'm attempting to work with a smaller version of the file, which is only 3 meg, or about 33,000 records.
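At 72 meg, a streaming (SAX) parse may be worth considering, since it never holds the whole document in memory the way a stored document type does. A minimal sketch, using the element and attribute names from the sample file above (the class name and the field capture are illustrative, not the actual process):

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.SAXException;
import org.xml.sax.helpers.DefaultHandler;

// Streams the feed element by element; only the current BusinessUnit
// code and the last ITEM attributes are held in memory at any time.
public class TanFeedHandler extends DefaultHandler {
    int itemCount = 0;
    String currentUnitCode;   // BusinessUnit/@Code
    String lastTan;           // ITEM/@TAN

    @Override
    public void startElement(String uri, String localName,
                             String qName, Attributes attrs) throws SAXException {
        if ("BusinessUnit".equals(qName)) {
            currentUnitCode = attrs.getValue("Code");
        } else if ("ITEM".equals(qName)) {
            itemCount++;
            lastTan = attrs.getValue("TAN");
            // here: validate the record and hand it to the batch writer
        }
    }

    public static TanFeedHandler parse(InputStream in) throws Exception {
        TanFeedHandler handler = new TanFeedHandler();
        // Default factory is not namespace-aware, so qName matching works
        // even with the feed's default namespace (xmlns="stsi/amextan").
        SAXParserFactory.newInstance().newSAXParser().parse(in, handler);
        return handler;
    }

    public static void main(String[] args) throws Exception {
        String sample = "<AMEXTANToACNTAN xmlns=\"stsi/amextan\">"
            + "<BusinessUnit Code=\"AAAA\" TANCount=\"1\">"
            + "<ITEM TAN=\"AAAA00324\"><ChannelType>P</ChannelType></ITEM>"
            + "</BusinessUnit></AMEXTANToACNTAN>";
        TanFeedHandler h = parse(new ByteArrayInputStream(sample.getBytes("UTF-8")));
        System.out.println(h.itemCount + " item(s), last TAN " + h.lastTan);
    }
}
```

This keeps memory flat regardless of file size, which matters far more for the 819,000-record load than for the daily files.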
To process the XML, we're using whatever came packaged with BEA WebLogic; I'll admit I didn't dig too deeply into that. A file event generator picks up the physical file and passes it to my process as an XML document. I store it in a document type and iterate through it, running validation checks on individual fields; as each record passes the validation routine, I add it to a batch SQL statement, which I pass to the database at the end.
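One tweak to the approach above that others on this list have suggested for large loads is flushing the batch every few thousand rows instead of sending one statement for the entire file, since a single huge batch can exhaust app-server memory or run into driver limits. A hedged sketch of the pattern (the class name, row shape, and flush callback are illustrative; in the real process the callback would do the PreparedStatement.addBatch()/executeBatch() work):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Buffers validated records and flushes them to the database in
// fixed-size chunks. The flush callback stands in for the JDBC
// batch-execute step against TMP_TAN_TAB.
public class RecordBatcher {
    private final int batchSize;
    private final Consumer<List<String[]>> flush; // e.g. runs the JDBC batch
    private final List<String[]> buffer = new ArrayList<>();

    public RecordBatcher(int batchSize, Consumer<List<String[]>> flush) {
        this.batchSize = batchSize;
        this.flush = flush;
    }

    // Called once per record that passes the validation routine.
    public void add(String[] row) {
        buffer.add(row);
        if (buffer.size() >= batchSize) {
            flushNow();
        }
    }

    // Called after the last record so a partial batch is not lost.
    public void flushNow() {
        if (!buffer.isEmpty()) {
            flush.accept(new ArrayList<>(buffer));
            buffer.clear();
        }
    }
}
```

With a batch size around 1,000 to 5,000 rows, the 819,000-record load becomes a few hundred round trips instead of one enormous statement.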
We do not use DTDs or XSLT currently, though we may have to. We do have a schema, which is in hierarchical format; it may need to be flattened if we are to do a bulk load.
Once again, thank you for any help you can give!
Bill