OASIS Mailing List Archives





   Re: [xml-dev] SOAP, XSD, and HTTP


Mike Champion wrote:
> It's time to move on ...  Or to put it more bluntly, "REST WON!
> Learn to accept victory gracefully!"

Let me see if I understand the sense in which REST won. The XMLP working
group is going to add language to its primer (and maybe to the SOAP spec
itself) saying: "please don't use SOAP when HTTP GET is more
appropriate." In other words, SOAP has ceded a part of the solution
space to HTTP GET.

This is a big step in the right direction, but let me describe why it
isn't victory.
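To make the distinction the primer is drawing concrete, here is a sketch of the same safe fetch expressed both ways: as a plain HTTP GET against a URI, and as a SOAP call tunnelled through POST. The quote service, paths, and operation names are all invented for illustration.

```python
# Hypothetical stock-quote fetch, written both ways.

def rest_get(symbol):
    # The resource has a URI; the request line alone says what is fetched.
    return ("GET /quotes/%s HTTP/1.1\r\n"
            "Host: example.org\r\n\r\n" % symbol)

def soap_post(symbol):
    # The same fetch tunnelled through POST; the "address" is inside
    # the XML body, invisible to caches and generic HTTP tools.
    body = ('<soap:Envelope '
            'xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
            '<soap:Body><GetQuote><symbol>%s</symbol></GetQuote>'
            '</soap:Body></soap:Envelope>' % symbol)
    return ("POST /quoteService HTTP/1.1\r\n"
            "Host: example.org\r\n"
            "Content-Type: text/xml; charset=utf-8\r\n"
            "Content-Length: %d\r\n\r\n%s" % (len(body), body))
```

Note that in the second form the thing being addressed never appears in the request line, which is exactly the addressing issue.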

First, different people have different motivations for REST. I latched
onto the addressing issue because for me it was a no-brainer. Others
(e.g. Fielding and Baker) feel that it will be impossible to deploy a
protocol securely and widely across administrative boundaries without
methods of known meaning. So addressing is *just one issue*. It happens
to be the issue I am most interested in.

Second, we haven't even won the addressability issue yet. Consider if
the standards lawyers on xml-dev applaud this decision and go home. The
regular developers may or may not ever read the primer or standard. They
may or may not ever hear that there are cases where you are supposed to
use HTTP GET instead of SOAP.

Okay, but let's be optimistic and say that they DO read the primer and
they DO understand it. 

Now they sit down to develop a web service with the tools they have
available. Let's say that they have Visual Studio. So they need to
develop a web service that uses HTTP for safe, fetching operations and
SOAP for the rest. So they need to be familiar with both SOAP tools and
HTTP tools. Both the "Web Services Toolkit" and .asp (or whatever). They
need to understand both the SOAP *and* HTTP protocols. They'll need to
deal with the fact that Microsoft wizard-izes the hell out of one but
not the other. What will they do? Revert to using just plain old SOAP
for everything.
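Even in its simplest form, the split the developer faces looks something like this sketch (all names hypothetical): one service, two dispatch paths, two toolchains, two mental models.

```python
# Mixed-mode service sketch: safe fetches go through URI space,
# everything else through a SOAP endpoint. The handlers are stand-ins
# for the two separate toolchains the developer has to learn.

def fetch_resource(path):
    # the HTTP/.asp side: addressable, cacheable, GET-able
    return "representation of %s" % path

def dispatch_soap(envelope):
    # the Web Services Toolkit side: the operation name lives
    # inside the envelope, not in the request line
    return "invoked operation in %s" % envelope

def handle_request(method, path, body=None):
    if method == "GET":
        return fetch_resource(path)
    return dispatch_soap(body)
```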

Okay, but let's be optimistic and say that one in ten sends a feature
request to Microsoft asking for better support for mixed-mode usage in
their toolkit.

So a program manager at Microsoft gets all of these requests and says:
"Hmmm. Only one in ten programmers cares about this issue, and they are
the most technical programmers, who could figure out how to work around it.
And adding support for HTTP GET in our interface will complicate things.
And the SOAP specification doesn't say we *have* to do anything about
the issue at all. And anyhow HTTP is just one transport, what if they
want to deploy the same service over another transport where there is no
HTTP GET? Then we'll have to give them some kind of error message that
says that some features of particular services are restricted to some
transports." Chances are high they'll say "screw it, let them deal with
two different interfaces if they want to make mixed-mode services."

And by the way, making a mixed-mode SOAP/HTTP service does not strike me
as trivial even *with* a good tool.

Okay, but let's be optimistic and say that the program manager is a real
web and REST zealot, and he buys into it and adds support to the web
services toolkit in Visual Studio.NET 2 (a year from now). And maybe he
finds a magic bullet to make it really easy to migrate between the two
different modes of thinking.

This same version of Visual Studio.NET might well have wonderful
features for signing, encrypting, authorizing, etc. SOAP messages. But
you can't use these features with HTTP GET. So now your service is not
only split into two different interfaces, but you actually have to use
different security mechanisms for the two different halves. And you can
use SOAP routing to route your side-effect-creating messages but
something else to route your safe, side-effect-free ones. As time goes
by, more and more extensions to SOAP make this divergence more and more
stark. But all of the time, money and effort is going into SOAP tools.
What developer in their right mind would spend effort trying to maintain
the URI-space when it costs twice as much to do so and they lose
features that they are otherwise used to?

Does this sound like a world in which REST has won?

> If SOAP doesn't meet any needs for some project,
> don't use it! But looking ahead, it is very likely that SOAP will be used
> to provide a *framework* for protocol interoperability (much like XML provides a
> framework for data interoperability).  As David Orchard says in the TAG
> discussion quoted above, "I  think that [Roy Fielding] is right on here -
> SOAP specifies what to do in POST since there is no format otherwise."

If SOAP is an envelope and HTTP uses MIME as an envelope, then how is
it that before there was no format and now there is? Does putting an
envelope in an envelope sound like it gives good bang for buck to you?
And let's say that it does give bang for buck because of SOAP routing
and security, etc. Then why wouldn't you want those same features for
GET-like operations?
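The envelope-in-envelope point can be shown literally. A sketch, with an invented payload:

```python
# One payload, two framing layers. HTTP already frames the body with
# MIME-style headers; SOAP adds a second XML envelope inside it.

payload = "<GetQuote><symbol>IBM</symbol></GetQuote>"

soap_envelope = (
    '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
    '<soap:Body>' + payload + '</soap:Body></soap:Envelope>')

http_message = ("POST /service HTTP/1.1\r\n"
                "Content-Type: text/xml\r\n\r\n" + soap_envelope)
```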

> Also, if the point is that in a pure HTTP environment, much of what SOAP
> offers is un-necessary, that point is well taken.  BUT the simple fact
> is that in many, many of the environments where SOAP is flourishing, HTTP
> is just one of the protocols in the mix -- there's proprietary stuff such
> as MQ Series ... JMS ... maybe BEEP someday... as well as HTTP/SMTP/etc.
> SOAP's value is clarified when there ARE intermediaries, and one needs
> a way of bridging the information in HTTP headers across protocols that
> don't have them.  

The standard way of bridging HTTP is to wrap the HTTP message in another
message and send it. You're proposing to translate HTTP into SOAP and
then wrap *that*. How is that better?

And let's say that it is better: the WG defined no way to translate HTTP
GET into SOAP. So I guess those messages just drop on the floor!
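The two bridging strategies, and the hole for GET, can be sketched like this (both functions are hypothetical):

```python
# Standard bridging: tunnel the HTTP message opaquely; it survives intact.
def wrap_http(raw_message):
    return {"tunnelled": raw_message}

# SOAP bridging: translate HTTP into SOAP first, then carry that.
# With no defined SOAP translation for HTTP GET, a GET has no
# representation on the far side.
def translate_to_soap(method, body):
    if method == "POST":
        return ('<soap:Envelope xmlns:soap='
                '"http://schemas.xmlsoap.org/soap/envelope/">'
                '<soap:Body>%s</soap:Body></soap:Envelope>' % body)
    return None  # dropped on the floor
```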

> ... Likewise, other protocols have things that HTTP doesn't
> have, such as asynch notifications, built-in reliability, transactions,
> security ... 

If these other protocols are Web protocols then it is likely that they
will look a *lot* like HTTP in terms of semantics (URIs, caching, etc.)
and should just be defined today as HTTP 2.0 or as HTTP extensions. If
the protocols are NOT Web protocols then I don't think that they are the
W3C's responsibility. It always surprises me that people argue that
the W3C should spend effort on SOAP because it will be so wonderful in
non-Web environments!

If SOAP had been developed in an open process, its inventors would have
come to the W3C and said: "here are the ways in which HTTP doesn't meet
my needs." Microsoft could have expended resources alongside the W3C
team to come up with something that did everything that HTTP does and
everything that SOAP does. We could have all worked together to come up
with something that met both sets of needs. But from day one, SOAP was
not intended to be a web technology (as you point out above). And
inventing it within the W3C might have turned it *into* a Web
technology, which would have made it more difficult to use as a
tunnelling tool. 

So now we are stuck with a situation where you can either use the Web
technologies (HTTP, URIs) and get one list of features or use SOAP and
get an orthogonal list of features. 

I'm sorry, but I have a hard time seeing this total mess as a victory
for REST.

Occasionally my mind latches onto the solution of redefining HTTP/URI
semantics on top of SOAP but that means that you would be running
HTTP-NG on top of SOAP on top of HTTP and you would *lose* HTTP's nice
text versus binary agnosticism. We would lose efficiency to get back to
where we are. But at least I could say I'm "cooperating with SOAP
people" and "being constructive."

 Paul Prescod



Copyright 2001 XML.org. This site is hosted by OASIS