> -----Original Message-----
> From: Bullard, Claude L (Len) [mailto:email@example.com]
> Sent: Wednesday, February 28, 2001 9:23 AM
> To: James Robertson; firstname.lastname@example.org
> Subject: RE: Why 90 percent of XML standards will fail
> Not at all, James. His arguments are vacuous.
> He cites no examples and offers no proof other
> than that other initiatives have failed. He
> fails to cite the successes or offer any
> contrasts as to why they may have succeeded.
It's a trade paper think piece, not an academic article!
The gist of the piece seems to be:
The specification initiatives (aka "standards") most likely to fail are those that are not aligned with the real business needs of major companies, that over-promise, that enter an already crowded space, or that try to dictate business processes rather than accommodate existing practice. Since 90% of XML initiatives fall afoul of more than one of these, 90% are likely to fail.
Is this really unreasonable, or inconsistent with the experience we have had with various "standards" efforts? I don't think so ... I find this list potentially useful in predicting which efforts will succeed and which won't, so that I can ration my scarce attention on the ones that might go somewhere.
The last point, about standardizing business processes, is the most controversial, I'd guess. Does ebXML's inclusion of a "Business Process and Information Meta Model" fall afoul of this? If so, is it really reason to worry? Not sure ... but at least it's an interesting question that would not have occurred to me before reading the article in question. Of course it would be nice to have more examples in order to come up with a credible answer, but that's not what think pieces are supposed to do. They're supposed to provoke thought!