Re: [xml-dev] Re: [ubl-dev] Top 10 uses of XML in 2007

I'll read all the posts later today, but here is my "evidence".

1. First, my views on binary formats, and the measurements behind them, 
are (or should be) well known here: don't do it, as Elliotte says.
2. In my own area of expertise, at least 20 years ago when I was 
writing my database system, I faced two problems - variable-length 
records and binary coding of numbers.

To some extent I'm sorry I never solved the first, although limited real 
estate in the real world (carton labels and other ugly intrusions of 
reality into cyberspace) means I've not really suffered for it.

More importantly for this debate: do you use binary-coded numbers 
(integers, floats, etc.) because they can be stored more densely, 
operated on natively, and so on?

Through an optimisation process that took years (nearly a decade in the 
end), and by comparing performance against industry-leading ISAM APIs, I 
never found any evidence that end-to-end performance was helped by 
binary storage. In fact my plain-ASCII storage mechanism (for numbers in 
particular) is still very fast. To find out why, you have to see where 
the time is actually being spent.

Remember profiling?

It turns out that the slow things are not what you expect. Converting 
between binary and ASCII representations of numbers is very slow, 
involving a large number of operations, and it very quickly swamps any 
savings made by the calculations on the numbers themselves. And almost 
all the reports we run have more numbers that are simply displayed than 
numbers that are calculated - empirical data from thousands of report 
and screen programs.
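
A quick way to see this for yourself is a micro-benchmark like the 
following (a minimal Python sketch of mine, not from the original post), 
timing the conversions against the arithmetic they are supposed to 
speed up:

import timeit

# converting between text and native representations of a number...
to_int = timeit.timeit("int('123456789')", number=1_000_000)
to_str = timeit.timeit("str(123456789)", number=1_000_000)
# ...versus actually calculating with the native number
multiply = timeit.timeit("n * 7", setup="n = 123456789", number=1_000_000)

print(f"parse:    {to_int:.3f}s")
print(f"format:   {to_str:.3f}s")
print(f"multiply: {multiply:.3f}s")

Typically the parse and format steps each cost several times the 
multiply, which is the point: a number that is converted once and then 
used in only a few calculations pays more for the conversion than for 
the work.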

Back to binary XML. You have to understand transmission time as a 
percentage of end-to-end time. If transmission is, say, 30% of the total 
time for message generation, transmission and handling, and you halve 
the transmission time, then you get a net saving of only 15%. You might 
get that much and more by looking at the other 70% and optimising it.
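
That is just Amdahl's law applied to messaging; a small hypothetical 
helper (mine, not anything from the thread) makes the arithmetic 
explicit:

def net_saving(fraction, speedup):
    # overall fraction of end-to-end time saved by speeding up one
    # phase that accounts for `fraction` of the total
    return fraction * (1 - 1 / speedup)

print(net_saving(0.30, 2.0))  # halve a 30% phase: 0.15, i.e. 15% overall
print(net_saving(0.70, 1.3))  # a modest 1.3x on the other 70%: ~16%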

I remain a skeptic on binary anything at the higher levels until someone 
comes up with a detailed analysis of where the time is being spent - 
including a proper profile of the APIs involved. And it takes years to 
do this properly, not weeks or months.
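
For anyone wanting to produce that analysis, the Python standard-library 
profiler is a reasonable starting point (a minimal sketch; 
process_message is a hypothetical stand-in for whatever generation, 
transmission and handling pipeline you actually run):

import cProfile
import pstats

def process_message():
    # stand-in: generate a document full of numbers, then parse it back
    doc = "".join(f"<n>{i}</n>" for i in range(10_000))
    nums = doc.replace("<n>", " ").replace("</n>", " ").split()
    return sum(int(s) for s in nums)

cProfile.run("process_message()", "profile.out")
pstats.Stats("profile.out").sort_stats("cumulative").print_stats(10)

The point is not this toy pipeline but the habit: sort by cumulative 
time and see which layer - generation, transmission or conversion - 
actually dominates before deciding what to compress.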

Size, as it turns out, is not everything...

Rick

Elliotte Harold wrote:
> Stephen Green wrote:
>> Hi David
>>
>> I agree that when we are doing B2B there may already be compression
>> in many cases. In non-B2B settings though, such as within an
>> organisation's network or intranet, I would see binary XML becoming
>> commonplace to increase performance. 
>
> And the evidence you have that it will do this is what exactly? A lot 
> of people are working under twenty year old assumptions about what is 
> and is not fast, that haven't been true for years. Binary formats are 
> not a magic panacea to improve performance. In many cases, XML is 
> actually smaller than competing binary formats. (Compare OpenDocument 
> to the equivalent Microsoft Office binary, for example.)
>
> There are a lot of myths and wild guesses about performance. I don't 
> doubt that people who never bother to crack open an analyzer or write 
> a good benchmark will switch to binary XML for no good reason. That's 
> a big reason I oppose it. The only area in which the arguments for 
> binary XML are the least bit compelling is the wireless space, and 
> that has a lot more to do with battery life than document size.
>



