There are a number of points raised here which ought to be separated.

For XML itself, as a data interchange standard, the only serious attempt to improve on 1.0 was the 1.1 standard, which proved unsuccessful despite the fact that there were a number of implementations. Essentially what we learnt from this exercise was that the world benefited from having a single stable standard more than it benefited from the minor improvements offered by a new version. (And in fact the main improvements were then retrofitted into 1.0 by sleight of hand.)

For other components of the ecosystem such as XSLT, XPath, and XSD, a significant number of people are using later versions (2.0, 3.1, 1.1), but it is quite true that many implementations have remained stubbornly at the 1.0 level. I think there are two factors that cause this:

(a) The 1.0 standards are good enough for 80% of users. As with any technology, 80% of users only use 20% of the capability, and these users have little to gain from any subsequent enhancements. Even where they do have something to gain (like improved productivity), they may not recognize the fact.

(b) The implementors of the 1.0 standards, as you point out, were driven by a rush of enthusiasm. They broadly fell into two camps: amateur enthusiasts producing open-source products in their own time (libxml/libxslt being a prime example), and big vendors (Microsoft, Oracle, Sun, IBM) creating implementations that were given away as part of some platform. When subsequent standards came out from W3C, the amateur enthusiasts had found other, more interesting things to do with their weekends, while the enthusiasts within the big vendors weren't allowed to spend any more money on development without presenting their management with a sound business case -- a difficult thing to do once the expectation has been established that the software is free.
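To make the "something to gain" concrete, one illustrative example (not from the original post): grouping, which in XSLT 1.0 requires the Muenchian technique with keys and generate-id(), became a single instruction in XSLT 2.0. The element and attribute names below are hypothetical.

```xml
<!-- XSLT 2.0: group hypothetical <city> elements by their @country
     attribute. Achieving the same in 1.0 needs an xsl:key plus a
     generate-id() comparison (the Muenchian workaround). -->
<xsl:for-each-group select="//city" group-by="@country">
  <country name="{current-grouping-key()}">
    <!-- current-group() returns all cities sharing this key -->
    <xsl:copy-of select="current-group()"/>
  </country>
</xsl:for-each-group>
```

A user who never needs grouping, of course, sees no reason to upgrade -- which is exactly the point about the 80%.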
Saxonica was one of the few vendors that thrived in this environment, essentially by establishing a business model where our development costs were covered by license fees from users who actually needed (or appreciated the value of) the enhanced capability.

The other question is: why has Java been the dominant platform? Well, it's not the only platform addressed by third-party vendors: XmlPrime and Exselt, for example, chose to go for the .NET platform. But I think it's a brave third party who invests in a platform where they could be wiped out at any time if Microsoft decides to move into their space. Or perhaps it's because the Java user community is more receptive to technology built by third parties? The Microsoft user base is still a little disinclined to touch anything that doesn't come from Microsoft.

As for other languages and platforms (say C++ or JavaScript), each has its own dynamic. Implementing a technology like XSLT in the C/C++ environment is technically much more difficult than doing the same thing in Java or C#, and I have seen many attempts fail. For JavaScript, it's also true that until recently, writing system software to run on the JS platform has been very challenging.

Open source (or more generally, free software) was fundamental to the initial success of the XML ecosystem, and it is also fundamental to the lack of investment following on from this initial success. If the value that users are getting from the technology doesn't feed back into investment in enhancing the technology, then the enhancements will not come.

Michael Kay
Saxonica