   Transactional Integrity of Services (LONG)

  • To: XML DEV <xml-dev@lists.xml.org>
  • Subject: Transactional Integrity of Services (LONG)
  • From: "W. E. Perry" <wperry@fiduciary.com>
  • Date: Fri, 28 Mar 2003 16:38:42 -0500
  • Organization: Fiduciary Automation

    Transactional Integrity of Services vs. Validation to Standard

Following the fraud at HealthSouth
http://www.nytimes.com/2003/03/28/business/28CARE.html I should like to
focus XML-DEV's attention once again on the consequences of a
fundamental design decision all too commonly made in the implementation
of XML:  http://www.xml.com/pub/a/2002/05/29/perry.html

Here's the problem: functional nodes--call them, for the sake of argument,
web services--can be designed to execute upon the invocation of
a public interface, or not. Under what has been for more than 20 years
the orthodoxy of two-phase commit (2PC) the counterparties to a
transaction (or, more precisely, the functional nodes proxying for them
in the execution of the data transaction) share--that is, execute
against--the same instantiated data structure. But an XML
document--whether it is designed as an interprocess message or not;
whether it is intended as a remote procedure call or not--is simply not
an instantiated data structure which a procedure might execute against.
The much-touted 'interoperability' which XML facilitates is very
different from interchangeability. That is, an XML document, even when
composed to the constraints of a strict vocabulary or messaging
standard, is not simply, nor reliably, the vehicle to bear a given
instantiated data structure from one procedure to another. Burden that
XML document with all sorts of post-schema-validation type decoration
and it is still not a data structure against which process might execute
directly. Nor does anything in any XML standard oblige a receiving
process to instantiate any XML document as the same data structure which
the transmitter of that document might have 'serialized' it from. So
tight, and so brittle, an understanding between parties sharing an XML
document can be achieved only through a priori (and out-of-document)
agreement so extensive that it obviates any need for XML at all, since
what passes between the parties is not a message but some direct
representation of the fully-instantiated data structure itself.
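
The point above can be made concrete. The following is a minimal sketch (the document, vocabulary, and receiver behaviors are all invented for illustration): one well-formed XML document, two receivers, and two incompatible in-memory structures, each a faithful instantiation of the same document.

```python
# Hypothetical sketch: one valid XML document, two receivers, two
# different instantiated data structures. The vocabulary here is
# illustrative, not drawn from any standard.
import xml.etree.ElementTree as ET

doc = """<trade id="T-1001">
  <amount currency="USD">250000</amount>
  <counterparty>ACME Corp</counterparty>
</trade>"""

root = ET.fromstring(doc)

# Receiver A instantiates a flat dict, amounts left as strings.
structure_a = {
    "id": root.get("id"),
    "amount": root.find("amount").text,
    "counterparty": root.find("counterparty").text,
}

# Receiver B instantiates a typed tuple: amount converted to integer
# cents, currency promoted to its own field, name normalized.
structure_b = (
    root.get("id"),
    int(root.find("amount").text) * 100,       # cents
    root.find("amount").get("currency"),
    root.find("counterparty").text.upper(),    # normalized name
)

# Both are legitimate instantiations of the same document, yet a
# procedure written against one cannot execute against the other.
print(structure_a)
print(structure_b)
```

Nothing in the document itself privileges either structure; the choice belongs entirely to the receiving process.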

Now, I understand that the fraud at HealthSouth was not perpetrated
using XML documents and that the inability to audit it, which Ernst &
Young pleads, was not the effect of web services practice. However, the
fundamental principles of transaction process design which this fraud
(like so many recently!) illustrates, albeit in misapplication, are
those very points which, as regularly as they are misunderstood, ignite
predictable permathreads in our discussion on this list. Fraud was
possible at HealthSouth because too many parties assumed that what
looked like a transaction was a transaction and, worse, each assumed
that it was a transaction corresponding to his own particular
understanding of a transaction. This is the Fallacy of Extrapolating
from Validation, and it was ruthlessly exploited by those who designed
this fraud. Conformance to a content model or to any schematic is quite
simply just that and does not imply that any such valid document
instance may be (forget about will be!) reliably instantiated as a
particular data structure. Document instances of content models or of
other schematics are inevitably in many-to-many relationship with
instantiated data structures expressing the same content model or
schematic. That can be narrowed to a one-to-one relationship only by
constraining away all of the things that make the document a document, so
that you are effectively transmitting the particular data structure itself.
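
A toy sketch of this fallacy, with entirely hypothetical field names and ledger: a fabricated record that conforms perfectly to the expected content model passes every structural check, while an independent check against what actually happened does not.

```python
# The "Fallacy of Extrapolating from Validation", in miniature.
# Structural validity and transactional substance are separate
# questions. All names and data here are invented for illustration.

REQUIRED_FIELDS = {"txn_id", "date", "amount", "account"}

def validates(record: dict) -> bool:
    """Structural check only: the right fields, the right types."""
    return (REQUIRED_FIELDS <= record.keys()
            and isinstance(record["amount"], (int, float)))

def backed_by_ledger(record: dict, ledger: set) -> bool:
    """Independent check: did the transaction occur anywhere else?"""
    return record["txn_id"] in ledger

fabricated = {"txn_id": "TX-999", "date": "2003-03-28",
              "amount": 1_000_000, "account": "revenue"}

independent_ledger = {"TX-100", "TX-101"}   # TX-999 was never executed

print(validates(fabricated))                             # True
print(backed_by_ledger(fabricated, independent_ledger))  # False
```

Validation answers only the question it asks; it says nothing about whether any party-vs.-counterparty action stands behind the record.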

Those not in on the fraud at HealthSouth assumed (!) that what looked
like a transaction was a transaction because it exhibited the precise
form and content expected of a transaction. For a full fifteen years no
one examined how these apparent transactions were created, and therefore
no one understood that they were records only, not backed up by
discoverable party-vs.-counterparty actions which those records should
record. I regard this as records crossing a boundary of understanding
without the necessary corresponding change in the understanding of those
records. The perpetrators of fraud who created those transaction records
understood full well that the records were backed by nothing. However,
they also understood that those downstream, including their auditors,
who examined the records would conclude that those records were
themselves evidence of the substance of transactions because the records
exhibited the proper form and validated to the expected content model
and schematics. The downstream recipients were manipulated into carrying
out the intent of the document creators. The documents crossed a
boundary of understanding between those who knew that they were
fraudulent and intended to induce a particular outcome and those who
tried to see in the document an expression of the creator's intent and
thereby were manipulated into carrying out that intent. Each recipient
should instead have examined each record to see whether the particular
data structure required for its particular use of the document could be
instantiated, based somehow on, or triggered somehow by, the record as
received. In the case of the auditors, they should have looked to the
document as received to provide them sufficient identifiers to locate
components of the transaction recorded and to instantiate the content of
that transaction independent of that one record of it. That is, they
should have instantiated the specific data structure--a recreation of an
original transaction--required for the execution of their own expert
function and should have ignored (because it is not specific to, and
therefore of no use for the execution of their particular expert
process) the data structure which the document creator might have
'intended' them to instantiate from the document provided.
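
The audit posture described above can be sketched as follows. The record is treated only as a source of identifiers; the auditor's own data structure is instantiated from independent sources and compared against the record's claims. The confirmation sources and record layout are hypothetical.

```python
# Sketch: audit by reconstruction rather than by inspection of form.
# Independent sources an auditor might consult, keyed by identifier;
# both are invented for illustration.
BANK_CONFIRMATIONS = {"TX-100": 5000}
COUNTERPARTY_CONFIRMATIONS = {"TX-100": 5000}

def audit(record: dict) -> bool:
    """Instantiate the auditor's own data structure from identifiers
    in the record, ignoring the amounts the record itself asserts."""
    txn_id = record["txn_id"]
    bank = BANK_CONFIRMATIONS.get(txn_id)
    cpty = COUNTERPARTY_CONFIRMATIONS.get(txn_id)
    if bank is None or cpty is None:
        return False    # no discoverable counterparty action at all
    return bank == cpty == record["amount"]

genuine = {"txn_id": "TX-100", "amount": 5000}
fabricated = {"txn_id": "TX-999", "amount": 1_000_000}

print(audit(genuine))     # True: independently reconstructible
print(audit(fabricated))  # False: a record exists, a transaction does not
```

A record that cannot seed such a reconstruction is evidence of nothing but its own form.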

Unfortunately, all interface-initiated processes operate on exactly the
mechanism of this fraud. Standard data vocabularies in XML also seek to
create exactly these conditions of process. The fundamental assumption of
standard data vocabularies is that intent can be communicated and
particular semantics transmitted by the choice of specific blessed
syntax. Within the enterprise firewall or in other hermetic environments
that may be a defensible premise, and in such situations two-phase
commit is workable. The inter-process communication there, however, is
not by means of documents but by direct representations of, or pointers
to, the instantiated data structure itself, to which the processes of
both parties are specifically fitted. However, once documents are the
means of communication there is no one fixed data structure which might
be instantiated for a particular process, given a particular document.
And, because processes no longer share a single understanding of data,
expressed in a particular data structure, one process should not be
expected to operate as another might intend or expect. This last point
is the basis for truly independent, specialized process--real web
services--just as it should have been the basis for independent audit of
the fraudulent transaction records at HealthSouth.
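
The distinction drawn in this last paragraph can be shown in a few lines, here in-process for simplicity: two parties holding pointers to one instantiated structure (the premise under which two-phase commit is workable) versus two parties each instantiating their own structure from a document that crossed a boundary. JSON stands in for any document format.

```python
# Shared instantiated structure: both parties execute against the
# same object, as inside a firewall or other hermetic environment.
shared = {"balance": 100}
party_a = shared          # a pointer to the one structure
party_b = shared
party_a["balance"] -= 30
print(party_b["balance"])  # both saw the same execution

# Document-mediated communication: each receiver instantiates its
# own structure from the document, and neither binds the other.
import json
document = json.dumps({"balance": 100})  # the document crosses a boundary
recv_a = json.loads(document)
recv_b = json.loads(document)
recv_a["balance"] -= 30
print(recv_b["balance"])   # unchanged: a separate instantiation
```

Once a document, rather than a shared structure, is the medium, no receiver's instantiation can be assumed by any other party.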


Walter Perry

