   RE: [xml-dev] Principle of Sustainable Complexity

> From: Micah Dubinko [mailto:MDubinko@cardiff.com]

> Also with interest I observed Simon's thoughts on an 'XMLchucker'
> protocol--a simple send with (at most) a checksum back. I don't doubt
> that such a straightforward protocol is already in use today, though
> it's hardly as press-worthy as huge corporations "teaming" on
> pseudo-standards organizations.

I can think of one particular system where we took such an approach (over
raw sockets, not HTTP) as a middleware layer to hook components in a web
application into a mainframe, while abstracting away the idiosyncrasies of
the mainframe interface. As I try to digest this REST debate (and I can't
possibly keep up with the latest threads -- you'd think these folks don't
have day jobs ;-)), I keep trying to relate this all back to the things I've
done and consider: is this solving any problems for me?
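
For the curious, here is a minimal sketch of what such a socket-level
exchange can look like: a length-prefixed XML payload goes out, and a
checksum of the received bytes comes back. The framing, host name, and port
are made up for illustration; this is not the actual system described above.

import socket
import struct
import zlib

def chuck_xml(host, port, xml_payload):
    """Send one XML document; return the CRC32 the receiver echoes back."""
    with socket.create_connection((host, port)) as sock:
        # 4-byte big-endian length prefix, then the raw XML bytes.
        sock.sendall(struct.pack(">I", len(xml_payload)) + xml_payload)
        # The receiver answers with a 4-byte CRC32 of what it got.
        (received_crc,) = struct.unpack(">I", sock.recv(4))
        return received_crc

if __name__ == "__main__":
    doc = b"<order><item sku='123' qty='2'/></order>"
    expected = zlib.crc32(doc)
    # received = chuck_xml("gateway.example.com", 9000, doc)
    # assert received == expected, "payload corrupted in transit"
    print("local CRC32: %#010x" % expected)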

One thing that has become quite evident to me in working through these
thought exercises is the point Paul Prescod made in an earlier post that
REST is more intrusive into business-level designs and architectures than
protocols typically are. Is this a good thing or a bad thing? The notion of
modelling the application in terms of uniquely addressable resources makes
sense. The typical web application design that relies upon a "session"
(correlated with a cookie) and sticks references to application objects into
some session object as name-value pairs sucks. I've always hated that
approach; not because it violates REST, but because it does a lousy job of
solving the problems it is supposed to solve for me. I have always struggled
towards designs that allow me to correlate the information in a request
with the specific component that will process it (and that component will
probably delegate processing of portions of the message to other
components). So the simple "chucker" protocol starts to look like a crude
ORB very quickly in complex applications. REST seems to be telling me to
make the criteria I use to identify the target component explicit at the
protocol level and that the URI should be sufficient for that, whereas the
approach I've tended to take (and which seems to be the prevailing web
service approach) is to minimize intrusion of the protocol layer into my
design and rely upon criteria in the message to do dispatching. It's the same
complexity either way. It's just a question of the right way to use the
protocol and the implications of that on the design process.
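
To make the contrast concrete, here is a rough sketch of the two dispatching
styles, with hypothetical handler names: one routes on the URI path, the
other inspects the root element of the XML body. Neither is drawn from a
real system.

import xml.etree.ElementTree as ET

def handle_order(doc):
    return "order handler"

def handle_invoice(doc):
    return "invoice handler"

# Style 1: the URI identifies the target component explicitly (REST-flavoured).
URI_ROUTES = {
    "/orders": handle_order,
    "/invoices": handle_invoice,
}

def dispatch_by_uri(path, body):
    return URI_ROUTES[path](ET.fromstring(body))

# Style 2: one endpoint; the dispatching criteria live inside the message.
ELEMENT_ROUTES = {
    "order": handle_order,
    "invoice": handle_invoice,
}

def dispatch_by_message(body):
    doc = ET.fromstring(body)
    return ELEMENT_ROUTES[doc.tag](doc)

print(dispatch_by_uri("/orders", "<order/>"))    # -> order handler
print(dispatch_by_message("<invoice/>"))         # -> invoice handler

Either way, the routing table has to exist somewhere; the question is whether
the protocol layer (the URI) or the message itself carries the key.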

> There isn't a rich set of technical writings behind the simple
> concepts like XMLchucker or XMLbaton. For one thing, the concepts are
> so simple that interop isn't an issue.

I think that's oversimplifying. Interop is an issue. Basically, to make this
work, you need to invent a protocol. Even just to handle a checksum, both
parties need to agree on the algorithm used to compute the checksum and on
its representation on the wire. As requirements get more complex (security,
to take just one example), the protocol grows
more complex. As you start to want to do meaningful things, you start
defining message structures. As you start to discover commonality in these
message structures, then you start to abstract that commonality and either
incorporate it into the protocol, or define general, modular, reusable XML
fragments as building blocks. And things get more complex.
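
Even the checksum illustrates the point. Both sides have to pin down the
algorithm and its wire form; the choices below (SHA-256, lowercase hex,
computed over the exact bytes sent) are assumptions for the sake of example,
not any published convention.

import hashlib

def checksum_for_wire(payload):
    """Agreed form: lowercase hex SHA-256 of the exact bytes sent."""
    return hashlib.sha256(payload).hexdigest()

doc = b"<ping/>"
print(checksum_for_wire(doc))

If the receiver hashed a re-serialized or re-encoded copy of the document
instead of the bytes as sent, the digests would disagree even though "the
XML" is the same -- which is exactly why the agreement has to be written
down as a protocol.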

> Are there aspects of Web Services that should be this simple, but are
> being made more complex only in order to satisfy the principle of
> Sustainable Complexity?

Simple solutions can solve simple problems. Sometimes solutions are
overengineered, but often the complexity is there because you are trying to
solve a complex problem. I think that many looking toward web services are
trying to solve complex problems, and web services have been evolving to try
to address that challenge.

I'm not completely convinced that REST has all the answers, but simplifying
things to the point that there is no protocol there other than the notion of
chucking some XML in one direction, and getting some XML chucked back, simply
pushes the complexity onto the individual solutions and forces them to
reinvent solutions to the same problems over and over again.

The challenge is to develop a layered protocol that allows simple solutions
to the simple problems, yet still lets those tackling more complex issues
do things in a standardized fashion. I would say the track
record of the W3C in designing specs in this fashion is uneven, to say the
least, and seems to be degenerating. But this is what we should hope for,
not leaving those with complex problems to solve out in the cold.
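
As a sketch of what "layered" might mean in practice: a bare envelope that a
simple client can send untouched, plus optional header blocks added in one
standardized place for the harder cases. The element names here are invented
for illustration and are not taken from any actual spec.

import xml.etree.ElementTree as ET

def build_envelope(body_xml, headers=None):
    """Wrap a payload; optional header blocks extend it without changing it."""
    env = ET.Element("envelope")
    if headers:
        hdr = ET.SubElement(env, "headers")
        for name, value in headers.items():
            ET.SubElement(hdr, name).text = value
    body = ET.SubElement(env, "body")
    body.append(ET.fromstring(body_xml))
    return ET.tostring(env, encoding="unicode")

# Simple problem: no headers at all.
print(build_envelope("<ping/>"))
# Complex problem: the same envelope, extended in the standardized slot.
print(build_envelope("<transfer amount='100'/>", {"signature": "..."}))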

However, I also think there tends to be some fetishism of standards these
days. Interoperability is important, but so is picking the right solution
for the problem at hand. I am doubtful that one XML Protocol can be defined
that will please everyone and will support all requirements. Developers will
still need other options in at least some cases.




 
