I hate to dash your hopes, but that will depend greatly
on the domain(s) of the given type(s). The idea that
standards wonks can hammer down the processes of
different groups into a set of singleton documents
is hopeful at best for lots of what would otherwise
be interesting and profitable businesses. Yes, one
can cite a lot of limited successes, say the
rendering vocabularies, where the reason for multiple
languages is usually local software; but for, say,
legal systems, it is almost hopeless given some
set of boundaries (e.g., countries). So XSLT is good,
and even that won't fix the problems of really
disjunct enterprise processing models.
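The XSLT point above can be sketched in plain Python (the two
purchase-order vocabularies, their element names, and the field map are all
hypothetical, and element-renaming stands in for a real XSLT stylesheet): a
transform handles the part of two vocabularies that overlaps, but anything
that exists in only one processing model simply has nowhere to go.

```python
import xml.etree.ElementTree as ET

# Hypothetical vocabulary A: <order><buyer/><total/>...</order>
# Hypothetical vocabulary B: <purchase><customer/><amount/></purchase>
FIELD_MAP = {"buyer": "customer", "total": "amount"}  # the overlapping part

def a_to_b(xml_a: str) -> str:
    """Translate vocabulary A into vocabulary B, element by element.
    Elements with no counterpart in B are silently dropped -- the
    'disjunct' part of the two processing models that no transform
    can invent a mapping for."""
    src = ET.fromstring(xml_a)
    dst = ET.Element("purchase")
    for child in src:
        if child.tag in FIELD_MAP:
            ET.SubElement(dst, FIELD_MAP[child.tag]).text = child.text
    return ET.tostring(dst, encoding="unicode")

print(a_to_b("<order><buyer>ACME</buyer><total>100</total>"
             "<jurisdiction>US</jurisdiction></order>"))
# → <purchase><customer>ACME</customer><amount>100</amount></purchase>
```

Note that the (hypothetical) jurisdiction element vanishes in translation,
which is the legal-systems problem in miniature: the boundary concept has no
counterpart in the target vocabulary.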
I agree wholeheartedly that REST can't solve
that. It may be that SOAP RPC is
easier to manage in these cases precisely
because one might prefer the edge system
to open an API and code to that. It will be
faster than trying to get agreements to
iron out the disjunct domains. Divide
and conquer on a fixed price.
From: Jason Diamond [mailto:firstname.lastname@example.org]
> In summary, the argument that I was making is that using REST helps reduce
> the number of access methods (i.e., accessing is scalable), but the
> "processing" of n arbitrary XML documents is non-scalable.
Why are you expecting your machine to process every XML vocabulary on the
Web? Won't most businesses only interact with sites that output XML
vocabularies they can actually process? I don't see the scalability
problem for specific domains where the number of vocabularies is limited
(hopefully to one).
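The split being argued over here can be sketched in Python (the vocabulary
names, handlers, and the dispatch-on-root-element scheme are all
hypothetical): access is uniform, a single entry point for any retrieved
representation, while processing still requires one handler per vocabulary a
business has chosen to support.

```python
import xml.etree.ElementTree as ET

# One uniform entry point for every representation; processing remains
# per-vocabulary, registered by the root element name.
HANDLERS = {}

def handles(root_tag):
    """Register a processor for one XML vocabulary (keyed on its root)."""
    def register(fn):
        HANDLERS[root_tag] = fn
        return fn
    return register

@handles("invoice")
def process_invoice(doc):
    return ("invoice", doc.findtext("amount"))

@handles("quote")
def process_quote(doc):
    return ("quote", doc.findtext("price"))

def process(representation: str):
    """The single access path: parse, then dispatch on vocabulary.
    Unknown vocabularies are rejected rather than half-processed."""
    doc = ET.fromstring(representation)
    handler = HANDLERS.get(doc.tag)
    if handler is None:
        raise ValueError(f"no processor for vocabulary {doc.tag!r}")
    return handler(doc)

print(process("<invoice><amount>42</amount></invoice>"))
# → ('invoice', '42')
```

The n-vocabulary cost shows up only in the handler table, which for a
specific domain is small, which is the point being made above.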