> RPC's aesthetics put a premium on minimizing the impedance
> mismatch between programming languages and the web, "hiding the
> network" as a number of people here have said.
One can argue that that exact statement is the reason
it ends up so difficult to "do RPC right". In fact it's not so
much an aesthetic as a justification for why "procedures"
(methods, etc.) are the interaction model, rather than messages.
Rather than trying to make remote things look local, and
"hide the web" so much that the inevitable failures come
as surprises, I think the other way around is better:
The goal should be to have an appropriate model for
accessing remote data, one which highlights the realities of
its distinct failure modes and higher latencies. Into
that model one should then be able to slip "local" data,
which just happens to act like a particularly fast and
reliable resource.
That is, mask the local/remote dichotomy not by trying
to hide remoteness under a locally-oriented "procedure"
model (which doesn't work well) ... but by hiding local-ness
under a model that cleanly exposes remoteness.
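To make that concrete, here's a minimal sketch (all names hypothetical, not from any real library) of what such a remote-oriented access model might look like: every fetch takes a timeout and returns an explicit success-or-failure result, and "local" data simply implements the same interface as a fast, reliable resource.

```python
# Hypothetical sketch: a remote-oriented access model that local data
# can slip into, instead of a local "procedure" model hiding the network.
from dataclasses import dataclass
from typing import Union


@dataclass
class Ok:
    value: str


@dataclass
class Err:
    # Failure is part of the interface, never a surprise.
    reason: str


Result = Union[Ok, Err]


class Resource:
    """Access model that exposes remoteness: every get() can fail,
    and every caller must state how long it is willing to wait."""

    def get(self, uri: str, timeout: float) -> Result:
        raise NotImplementedError


class LocalResource(Resource):
    """Local data 'slipped into' the remote model: it behaves like a
    particularly fast and reliable resource, via the same interface."""

    def __init__(self, data: dict):
        self._data = data

    def get(self, uri: str, timeout: float) -> Result:
        if uri in self._data:
            return Ok(self._data[uri])
        return Err("not found")


# Callers write one code path; failure and latency stay in view.
store = LocalResource({"/greeting": "hello"})
print(store.get("/greeting", timeout=0.1))  # Ok(value='hello')
print(store.get("/missing", timeout=0.1))   # Err(reason='not found')
```

The point of the sketch is the direction of the masking: a network-backed implementation of `Resource` would differ only in its constructor, while callers already handle failure and latency explicitly.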
REST fits that model better than RPC. The fact
that they're Turing-equivalent is beside the point;
nobody programs Turing machine bytecode. :)