Miles Sabin wrote:
> If reading habits and credit card details were as sensitive as medical
> records then Amazon would be. But they're not.
If there were a documented, public credit-card theft at Amazon, it would
have a massive impact on their stock price and potentially even on their
sales. I think that they are exactly as paranoid as Rich's customers,
but they have decided that interoperability with the Web is more
important to them than their fear of a break-in. After all, without the
Web, they don't have a business.
> Hmm ... that's either avoiding the question (decrypt in the DMZ) or
> refusing to acknowledge the practical problems of auditing an HTTP
> server (backend supports HTTPS/proxying).
You asked me how to do it with REST. I gave two ways that people who are
security-conscious but not legally encumbered can do it. If a customer
came to me with this problem but without legal restrictions, I would tell them:
Have a proxy in your DMZ that forwards only authenticated HTTPS
connections to a locked down intermediary machine. The intermediary
machine runs either OpenBSD or an audited microkernel. It only runs one
network service, which is an audited minimal HTTPS server written in a
language immune to buffer overflows. The intermediary has as few tools
installed as possible. Most live on a CD. This service decrypts (and
perhaps validates) data and gateways to the "real application", which
only accepts connections from the intermediary. Security is about risk
management. There are much easier ways to steal someone's medical
information than trying to attack this intermediary.
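To make the shape of that intermediary concrete, here is a rough sketch in Python. It is only an illustration of the architecture described above, not the audited, overflow-immune implementation the post calls for; the backend address and the `validate_request` checks are made-up placeholders.

```python
# Sketch of the intermediary's single network service: terminate TLS,
# minimally validate the decrypted request, then relay the plaintext to
# the "real application", which accepts connections only from this host.
import socket
import ssl

BACKEND_HOST, BACKEND_PORT = "10.0.0.2", 8080   # hypothetical internal address
MAX_REQUEST = 64 * 1024                          # refuse anything larger

def validate_request(data: bytes) -> bool:
    """Reject oversized or obviously malformed requests before relaying."""
    if len(data) > MAX_REQUEST:
        return False
    line, _, _ = data.partition(b"\r\n")
    parts = line.split(b" ")
    return len(parts) == 3 and parts[0] in (b"GET", b"POST", b"PUT")

def serve(certfile: str, keyfile: str, port: int = 443) -> None:
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain(certfile, keyfile)
    with socket.create_server(("", port)) as listener:
        with ctx.wrap_socket(listener, server_side=True) as tls:
            while True:
                conn, _ = tls.accept()           # TLS handshake; decryption
                with conn:
                    data = conn.recv(MAX_REQUEST + 1)
                    if not validate_request(data):
                        continue                 # drop bad requests silently
                    with socket.create_connection(
                            (BACKEND_HOST, BACKEND_PORT)) as backend:
                        backend.sendall(data)    # plaintext to the real app
                        conn.sendall(backend.recv(MAX_REQUEST))
```

The point of the design is that this process is the only thing an attacker can talk to: everything after `wrap_socket` operates on already-decrypted bytes, so the backend never needs to speak TLS or sit in the DMZ.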
You say that there are "practical problems" auditing an HTTPS server.
What makes you believe that it is any harder to audit this server than
to audit the union of the existing protocol and Rich's decryption
software? A minimal HTTP implementation takes very few lines of code.
I've never implemented TLS, but surely it is on the same order of
complexity as other encryption technologies.
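As a rough measure of "very few lines", here is a toy HTTP responder. It is a size argument only, not the audited server under discussion, and the fixed response body is invented for the example.

```python
# A toy HTTP/1.0 responder: answer well-formed GETs, reject everything else.
import socket

def handle(raw: bytes) -> bytes:
    """Parse the request line; serve a fixed body for GET, 400 otherwise."""
    parts = raw.split(b"\r\n", 1)[0].split(b" ")
    if len(parts) != 3 or parts[0] != b"GET":
        return b"HTTP/1.0 400 Bad Request\r\n\r\n"
    body = b"ok\n"
    return (b"HTTP/1.0 200 OK\r\nContent-Length: "
            + str(len(body)).encode() + b"\r\n\r\n" + body)

def run(port: int = 8080) -> None:
    with socket.create_server(("", port)) as srv:
        while True:
            conn, _ = srv.accept()
            with conn:
                conn.sendall(handle(conn.recv(8192)))
```

A dozen lines of protocol handling is plausibly easier to audit than a full general-purpose server plus a custom decryption layer.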
> Not a good analogy. XSLT transformations can be done offline where all
> inputs and outputs can be tightly controlled. Offline HTTP servers are
> ... ahem ... not terribly useful.
It's exactly the same. A buffer overflow in the XSLT engine can be
exploited to break out and attack the OS. By definition, if the XSLT is
running at runtime it is "online", not "offline".
> > I feel like I'm repeating myself: The whole point of REST is the
> > intermediation
> This is news to me. I thought that resource modelling was the whole point.
Resource modelling for its own sake? It's not UML. You do the resource
modelling in order to make your information conform to the standard so as
to improve interoperability, just as you do ER modelling in order to
conform to the relational model.
> And if it isn't already obvious, the issue here is that the way
> that resource modelling is realized in HTTP(S) leaks information.
That's an outrageously off-base statement. You have not demonstrated
*anything* remotely like this. We haven't even discussed the details of
the SSL protocol in this thread. "Leaks information" has a precise
meaning in the security world. The best you've demonstrated is "the SSL
protocol requires the use of software that costs money to audit." That's
a totally different statement.
Come discuss XML and REST web services at:
Open Source Conference: July 22-26, 2002, conferences.oreillynet.com
Extreme Markup: Aug 4-9, 2002, www.extrememarkup.com/extreme/