Hello David,
On Fri, 3 Jun 2005 1:02 pm, Didier PH Martin wrote:
>..we finally get out of the dark ages and the current
> mainframe-centric architecture connected to dumb terminals (something we
> call a browser).
Yes, but now you get to see the dark ages in colour... at least that is the
upside. And you get to look at jumping things called Flash animations..
Didier:
Yes indeed, but most web applications are slow as molasses. It's like
asking mom for every move it wants to make :-)
> It's incredible how fast we stepped backward. The evolution of the
> early 90s, compound documents (in a single package and with in-place
> editing) and rich component-based client-server environments disappeared;
> it's as if dinosaurs came back and wiped out more intelligent life forms.
but at least programming and software development is so much better now..
Didier:
What???? Can you expand on this? Frankly, some pieces (like the server side)
are indeed in better shape. On the other hand, the client side is still on
the darker side... and in shape like a couch potato.
> You have a messaging system (i.e. a MOM); one publisher needs to publish
> content that can be consumed by a, b, and c. This has to be done with the
> least amount of effort from the developer.
Years of experience has taught me to be really careful when anybody
says "all I want is a simple system that just....".
Didier:
You are totally right on this one David. It's not that easy, especially if
you want the same data to be consumed by three different environments. We do
not yet have a common object serialization. We have just a remote procedure
call mechanism with SOAP, but no way to move objects. Moreover, if these
objects are to be used in three different environments it is even less
obvious.
Different solution paths:
a) Serialize the objects in Java serialization format. You are now facing
the problem of reconstructing the objects for C# managed code in a CLR
environment. Idem for ECMAScript. Do we have off-the-shelf tools to do it?
b) Serialize the objects in ECMAScript format; you are now facing the
problem of transforming serialized ECMAScript objects into Java and C#.
Again, what are the off-the-shelf tools to perform this transformation?
In cases a and b you need a piece of code to demarshal the objects in these
respective environments.
c) Serialize the objects in XML and then transform them into ECMAScript,
Java and C# objects.
For somebody who knows XSLT this is straightforward and not too long to do.
The XSLT template can be made generic for different cases.
For a, b and c the problem is now with the behavior side of the objects. So
in cases a, b and c you need a demarshalling agent that merges the data with
the respective behavior for each platform.
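To make the demarshalling-agent idea concrete, here is a minimal sketch of option c on the Python side. All of the names (the Invoice class, the <invoice> element, the with_tax method) are hypothetical illustrations, not anything from this thread: the XML payload carries only the data, while the behavior lives in a locally defined class that the agent merges it with.

```python
# Sketch of a demarshalling agent (option c): the XML payload carries the
# data; the behavior lives in a platform-local class. All names here
# (Invoice, <invoice>, with_tax) are hypothetical, invented for the example.
import xml.etree.ElementTree as ET

class Invoice:
    """Local class providing the behavior side of the object."""
    def __init__(self, number, amount):
        self.number = number
        self.amount = amount

    def with_tax(self, rate):
        return self.amount * (1 + rate)

def demarshal(xml_text):
    """Merge the serialized data with the local behavior class."""
    root = ET.fromstring(xml_text)
    return Invoice(root.findtext("number"),
                   float(root.findtext("amount")))

payload = "<invoice><number>A-42</number><amount>100.0</amount></invoice>"
inv = demarshal(payload)
print(inv.with_tax(0.05))  # the reconstructed object has behavior again
```

Each target platform (CLR, JVM, browser) would need its own small agent like this one, but the XML payload itself stays the same.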
So, yes, in all cases it's not easy to do. The question is now: is it easier
to do with off-the-shelf XML technologies than with something else?
Of course, there are more than the a, b or c solutions. If anyone thinks of
something easier to implement than XML+XSLT, I am all ears. Personally
I found it easier to resolve the problem with off-the-shelf XML tools like
XSLT. Making it dynamic is doable but not easy to implement, especially for
static languages like Java or C#. However, for languages like Python, Perl,
ECMAScript or any other dynamic language it's a piece of cake and could be
done in a matter of hours.
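As a hedged illustration of why dynamic languages make this a piece of cake, here is a Python sketch (element names invented for the example) that builds objects from arbitrary XML at run time, with no pre-declared classes or schema, something a static language would need code generation or reflection for:

```python
# Dynamic demarshalling: classes are created on the fly from the XML tag
# names, so no schema or pre-declared class is needed. The element names
# below are invented for the example.
import xml.etree.ElementTree as ET

def to_object(element):
    """Build an object whose class is named after the tag and whose
    attributes come from the child elements."""
    cls = type(element.tag.capitalize(), (object,), {})
    obj = cls()
    for child in element:
        setattr(obj, child.tag, child.text)
    return obj

doc = "<customer><name>Ada</name><city>Quebec</city></customer>"
c = to_object(ET.fromstring(doc))
print(type(c).__name__, c.name, c.city)  # prints: Customer Ada Quebec
```

The same trick works in Perl or ECMAScript; in Java or C# you would instead generate source or use reflection against a known type.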
Now another question:
Do we really need the second tier? Can we have more responsive applications?
People coined a new acronym, "AJAX", for more responsive applications. Has
anyone played with Google Maps? Is the experience better than with a "page
at a time" kind of application? (ref http://map.google.com). Has anybody
experimented with SQL/XML? Is the possibility of creating objects from
templates submitted by a client and filled by a server a way to resolve the
impedance mismatch (at least for dynamic languages)?
I think that an answer to any of the above questions is probably a move
forward, getting us out of the dark ages and out of contentment with the
current poor-quality status quo. The only progress I saw in recent years was
mainly on the server side. We now have new technologies to implement
services that are far beyond what we had before.
Bottom line: the server side progressed tremendously.
The client side receded tremendously.
Any real progress now has to occur on the client side.
Some clues:
a) Is it possible to move a collection of objects from server to client
instead of one object at a time? Is it possible to "check out" these objects
and work on them even offline? Is XML useful for encoding such objects?
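One way to read clue a, sketched here in Python with entirely hypothetical names (Task items, a <tasks> document): a whole collection is "checked out" as a single XML document, can be edited offline, and is later demarshalled back into objects.

```python
# Sketch of clue a: check out a collection of objects as one XML document,
# work on it offline, then reconstruct the objects. The names (<tasks>,
# <task>) are hypothetical illustrations, not from the original post.
import xml.etree.ElementTree as ET

def check_out(tasks):
    """Serialize a collection of (id, title) pairs into one XML document."""
    root = ET.Element("tasks")
    for task_id, title in tasks:
        item = ET.SubElement(root, "task", id=task_id)
        item.text = title
    return ET.tostring(root, encoding="unicode")

def check_in(xml_text):
    """Reconstruct the collection from the checked-out document."""
    root = ET.fromstring(xml_text)
    return [(t.get("id"), t.text) for t in root.findall("task")]

doc = check_out([("1", "write draft"), ("2", "review")])
assert check_in(doc) == [("1", "write draft"), ("2", "review")]
```

Because the checked-out document is plain XML, the offline copy can travel to any of the three environments and be demarshalled there.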
b) Is it possible to build a less complex and still robust architecture on a
two-tier system? On the one hand, clients running applications; on the other
hand, data/object/service sources.
c) Can both a and b be viable with a low cost of ownership?
d) Can we improve the user experience and have applications at least as
responsive as those we had in a not-so-distant past with Visual Basic,
PowerBuilder and other component-based development/run-time environments?
e) Can we leverage the new generation of machines, each having more power
than several mainframes of a not-so-distant past? Is using them as simple
HTML decoders the best use of so much power? Are these machines incapable of
running applications? That is not what I see with more and more powerful
games.
f) Is there any alternative to rich environments like XAML? How do we expect
a user to react in front of a web page versus in front of a highly
interactive XAML application?
And so on and so forth... And have a good weekend.
Cheers
Didier PH Martin