   RE: [xml-dev] Re: Can A Web Site Be Reliably Defended Against DoS Attack


Yes, it was, Michael.  At the time the web was swooshing over
other competitors, there was the whole hurrah about the
simplicity of TCP/IP, "worse is better", 80/20, and so on from
the usual suspects.  It was witless and self-serving; the myths
became accepted as facts, were fed into the Darwin Machine, and
uncaring nature, as it always does, discriminated little
between safe applications and unsafe applications.  The
defenses for it read like the P2P file-sharing defenses that
claim they can't filter, when anyone who understands metadata
and hash marks knows they can.  But hey, "up the system", as
they used to say in the olden daze while getting famous for the
crap that passed for art.  Now engineers have the same 'be a
star' disease.  Me too.  The difference is I have to answer the
RFP truthfully even if it costs my company the contract,
because the procurement personnel 'drank the Kool-Aid' while
accepting the Federal dollars.
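
As a rough illustration of the hash-based filtering mentioned
above, here is a minimal sketch in Python; the blocklist digest
and the file path in the usage comment are hypothetical, purely
for illustration:

    import hashlib

    # Hypothetical blocklist of SHA-256 digests of files already
    # identified for filtering (the single entry here is just the
    # digest of an empty file, as a placeholder).
    BLOCKED_DIGESTS = {
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    }

    def is_blocked(path):
        """Return True if the file's SHA-256 digest is on the blocklist."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest() in BLOCKED_DIGESTS

    # Usage (hypothetical path):
    # print(is_blocked("/tmp/shared-file.mp3"))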

The reason we have a brain is to make judgements about
such things, not to go lemming-like over the cliff just
because the crowd is going there.  There are lots of
businesses for which such risks are acceptable.  There
are some for which they are not, but today we have the
pundits espousing, the consultants reading, and few
considering the really obvious problems at the base of
an 80/20 design being used for applications for which
it isn't designed.

That's stupid.  You may be right about the curse of
the witless.  You'll hear it when those PDAs in the
hands of the first responders stop working as their
servers go 'off the air', just as the truck or plane
filled with something nasty takes out the local mall.
Hang on to your radio.  RF doesn't know about DDoS.

The web was fielded witlessly.

len


From: Michael Champion [mailto:mc@xegesis.org]

On Feb 5, 2004, at 9:21 AM, Bullard, Claude L (Len) wrote:
> , but the fault lies in the design of the Internet
> itself;  specifically, TCP/IP.  It's another case of 80/20 coming
> back to bite us.
>
> The web was fielded witlessly.


The Web (or rather the TCP/IP Internet, which you appear to mean from
the context) wasn't "fielded witlessly"; it was designed to solve a
problem that, had we ever actually faced it, would have left the
Russian malware authors as radioactively dead as the American male
potency enhancement spammers.  The commercial world jumped on it
because TCP/IP works as well for very real infrastructure link
failures as it does in hypothetical nuclear attacks.  As always,
Father Darwin had the last word, and the monoculture that this
success created serves as fertile ground for all sorts of parasites
that exploit the lack of accountability, traceability, and
confirmation, features that would be inconsistent with TCP/IP's core
mission of simply and efficiently routing around points of failure.

It may be time to change priorities and either handle routing and
reliable messaging farther down in the infrastructure or add new
layers in the middle of the stack so that there can be better
authentication and accountability at the levels we care about.
Whatever is done, it's likely that a new type of parasite will
evolve to exploit it -- maybe we'll trade the hassles the spammers
create for the hassles the bureaucrats create, and in 20 years curse
the "witless" people who created the situation.  Such is life (and I
think the inevitability of parasitism is well-known in Artificial
Life as well).




 
