
   RE: [xml-dev] Partitioning?


Hi Didier,

 

Ramin said:

 

2)     There was too much performance variance on the client. Unfortunately, the server doesn't necessarily know the memory/CPU configuration of the client before sending up raw XML. It seems to me the decision point for whether to do the processing on the server vs. the client isn't whether the client *has* an XSLT engine, but whether the user experience is enhanced by doing it up there on the desktop. We thought of putting a sort of 'pre-tester plug-in' up in the client to sniff out performance metrics and communicate them to the server (the way video games 'pre-test' your graphics card and build a capability profile the first time you run them). Haven't tried that yet.
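A rough sketch of what such a pre-tester might look like: time a small CPU-bound workload on the client and map the result to a capability tier the server could consult when deciding where to run the transform. Everything here (function names, thresholds, the stand-in workload) is purely illustrative, not the plug-in described above:

```javascript
// Time a small stand-in workload; a real pre-tester would time an
// actual sample XSLT transform instead of this arithmetic loop.
function benchmarkClient(iterations) {
  const start = Date.now();
  let acc = 0;
  for (let i = 0; i < iterations; i++) {
    acc += Math.sqrt(i); // stand-in for real transform work
  }
  return Date.now() - start;
}

// Map the measured time to a tier; the thresholds below are invented
// and would have to be tuned empirically per application.
function capabilityTier(elapsedMs) {
  if (elapsedMs < 100) return "client-transform"; // fast enough for client-side XSLT
  if (elapsedMs < 500) return "client-maybe";     // borderline; test further
  return "server-transform";                      // keep the work on the server
}

const elapsed = benchmarkClient(1000000);
console.log(capabilityTier(elapsed));
```

The tier (rather than raw timings) is what you'd report back to the server, much like a videogame stores a graphics capability profile after its first run.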

 

Didier replies:

 

So it seems that your XSLT stylesheets were quite CPU-demanding. The tests I made on my side lead me to think that if the XML document to be processed is not too big, and the stylesheet produces output similar to other web pages, then client-side transformation seems to work on most PCs. On very slow PCs, even sophisticated HTML pages take some time to render. I found that the Microsoft engine runs fast enough for most machines (taking into account the XML document size and the template-matching complexity).

 

Ramin replies:

They were what I would call 'moderately complex' but yes, demanding on the CPU. Some of the performance issues, I believe, also have to do with problems like hitting the 'back' button to revisit a page that had been rendered from XML/XSL in the browser. At the time I was doing the testing, the browser was not smart enough to know that the previous page had already been rendered from XML/XSL and should therefore reuse the same HTML output instead of re-transforming the whole thing. With HTML, the browser is smart enough to know (or guess). I haven't checked the very latest browser versions to see if that's still the case. Have you tried that?

 

Ramin said:

 

3)     You get a *lot* of performance boost if you can cache the stylesheet and graphics up in the client. We used a programmable browser cache program we had developed and the performance was much better since you usually only had to download the raw XML data and could retrieve the XSLT stylesheet locally. But since not every client has the cache software, you still have to develop your server code to handle other clients.
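For reference, the client-side association is made with the standard xml-stylesheet processing instruction; once the browser has the stylesheet (and any graphics) cached, only the small XML payload travels on each request. The paths and content below are made up for illustration:

```xml
<?xml version="1.0"?>
<!-- The href is fetched once (then served from cache) by the browser;
     only this XML document needs to travel on each request. -->
<?xml-stylesheet type="text/xsl" href="/styles/report.xsl"?>
<report>
  <item id="1">fresh data from the server</item>
</report>
```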

 

Didier replies:

Very good point. I discovered myself that if the server settings are OK, the stylesheet is cached on the client side (by setting the cache parameter for the template). Since the XML document was dynamically created from a database, I could not cache the XML document itself, but just having the stylesheet's cache parameters properly set improved the overall performance, since the Microsoft browser keeps documents in the cache as specified by the HTTP headers. So my question is: why didn't your team use the cache parameters to make the Microsoft browser cache the stylesheet?
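In HTTP terms, this approach amounts to serving the stylesheet with explicit caching headers so the browser keeps it for a set lifetime while the XML payload stays uncached. A response for the stylesheet might look like this (header values are illustrative, not from either of our setups):

```http
HTTP/1.1 200 OK
Content-Type: text/xsl
Cache-Control: max-age=86400
Expires: Thu, 01 Mar 2001 00:00:00 GMT
```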

 

Ramin replies:

Actually, the main reason was lack of control over the native browser cache. If someone went out and did a 'delete temporary internet files', it would get rid of the cache contents. I haven't done much follow-up testing to see whether today the HTTP headers consistently override this or not. The separate programmable cache avoided this whole issue and let us put other things like JavaScript, frequently used images, etc. in there, so you could have a reasonably dependable user experience. Another goal was to eventually integrate the programmable browser cache with a 'smart' caching XSL engine that would try to do optimized generation of only the bits that had changed (even 'trickle' transforming in the background).

 

Ramin said:

 

4)     A pragmatic matter: when you're under deadline to build a web-based application, you barely have enough time to finish the server-side code, let alone branch it. Effectively you are writing two applications in one. You can do some clever designs to reduce this work, but it's extra work that would otherwise delay a project, requires some advanced programming, and may or may not provide a performance boost.

 

Didier replies:

Ramin, it seems this is a problem we encountered during those "irrational exuberance" times. People thought that programming for the web was a different affair from other kinds of programming tasks. I often heard the expression "web time" used to signify, in fact, sloppy development methods :-)

 

Ramin replies:

I agree. But that's a problem that pre-dates the "irrational exuberance" period (cf. "Mythical Man Month" by Fred Brooks).

     

Ramin said:

Overall, I think the approach has merit. But the server frameworks have a long way to go before being able to handle this automatically. Until then, you have to roll your own.

 

 

Didier replies:

So, bottom line, you didn't get much help from what's actually on the shelves.

 

Ramin replies:

I didn't really expect to. Any reasonably functional 'off-the-shelf' system still needs customization. I don't know of any distributed application tool that blindly re-partitions the code for you. Even if one *were* available, I'd want to know what I was giving up in return for gaining something. In this case, doing the processing up in the browser might save you some data transfer time and reduce the processing load on the central server, but it *might* increase the risk of a bad user experience. I would recommend gathering some good stats on each option before proceeding. As they say, YMMV.

 

Best,

Ramin

 





 


Copyright 2001 XML.org. This site is hosted by OASIS