Mike wrote:
"Yes, but what did we learn from the experience? That external schemas are
quite difficult to deploy and evolve in the real world ... That parsing
data with markup is much easier and more robust than parsing data
against an external schema ... That the type systems of the programming
languages in widespread use are so diverse that it is very difficult
to define a neutral data interchange method that doesn't lose
information yet is simple enough to use by ordinary programmers ..."
Hear, hear. If an 'ordinary' programmer has to deal with the validity of incoming messages, then the environment has failed. Verifying the validity of a message is a middleware function, not an application function.
Given a schema for a message type, and programs that must be written to handle messages of that type, the best approach is to use the schema to generate a set of classes that can be instantiated directly from a message instance. The more complete the schema language, the more watertight the objects.
The advantages of this are:
1. Early binding: much faster parsing and much smaller memory use
2. Version control: the objects in the program match the schema of the message (and the schema of the database if that needs to change too)
3. Free validation: the objects will fail to instantiate correctly if the message is not valid.
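The "free validation" point above can be sketched in Java. This is a hypothetical hand-written stand-in for what a schema-driven binding tool might generate; the Order class, its fields, and its constraints are all illustrative, not taken from any real schema:

```java
// Hypothetical sketch of a class a data-binding tool might generate
// from a schema describing a simple order message.
public class Order {
    private final String id;
    private final int quantity;

    // The constructor enforces the schema's constraints, so an Order
    // object cannot exist in an invalid state: validation comes free
    // with instantiation, as point 3 claims.
    public Order(String id, int quantity) {
        if (id == null || id.isEmpty()) {
            throw new IllegalArgumentException("order id is required");
        }
        if (quantity < 1) {
            throw new IllegalArgumentException("quantity must be positive");
        }
        this.id = id;
        this.quantity = quantity;
    }

    public String getId() { return id; }
    public int getQuantity() { return quantity; }

    public static void main(String[] args) {
        Order ok = new Order("A42", 3);          // valid message instantiates
        System.out.println(ok.getId() + " x" + ok.getQuantity());
        try {
            new Order("A43", 0);                 // violates a schema constraint
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

An invalid message never becomes an object at all, which is exactly the property the list is arguing for.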
What's the moral? That class definitions in OO languages are in fact schemas for the data types in programs. If you can't instantiate an object without a definition of its class, the language is strongly typed. So XML Schema has to have complex types in order to generate complete sets of objects. In a language like Java, the equivalent of the PSVI information is available through introspection and reflection.
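A minimal sketch of that last point: Java's reflection API lets a program read a class definition as if it were a schema, enumerating the typed fields every instance must carry. The Point class here is an assumed example, not something from the thread:

```java
import java.lang.reflect.Field;

// Reading a class definition as a schema via reflection.
public class SchemaViaReflection {
    // Illustrative data class standing in for a schema-derived type.
    static class Point {
        int x;
        int y;
    }

    public static void main(String[] args) {
        // The Class object plays the role of the schema: it describes
        // the named, typed components of every valid instance.
        for (Field f : Point.class.getDeclaredFields()) {
            System.out.println(f.getName() + " : " + f.getType().getSimpleName());
        }
    }
}
```

This is the Java-side analogue of consulting the post-schema-validation infoset: the type metadata survives into the running program and can be queried.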