Re: [xml-dev] How long before services sending/receiving XML might need replacement?

Hi Folks,

This has been a great discussion. Thank you, Stephen, for raising the issue and for staying engaged in the discussion.

Below are a few of my favorite excerpts from the discussion.

 

Mechanical transformation of XML to JSON and vice versa always produces a mess

Stephen Green wrote:

 

> Is it maybe time to focus on standardizing mappings
> and transformations from XML to JSON?

 

Michael Kay responded:

 

> Unfortunately, converting a good XML design to a
> good JSON design can only be done with knowledge
> of the semantic data model: mechanical transformations
> (in either direction) done without knowledge of the
> data model always produce a mess.
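
A minimal sketch of such a mechanical converter (my own Python illustration, not from the thread) shows where the mess comes from: without the model, the converter cannot know that an <item> element is repeatable, so the shape of the output depends on how many happen to be present.

    import json
    import xml.etree.ElementTree as ET

    def naive_xml_to_json(elem):
        """Mechanically map an element to a dict, with no knowledge of
        the data model behind the markup."""
        node = dict(elem.attrib)  # attributes and child names share one key space
        for child in elem:
            value = (naive_xml_to_json(child)
                     if len(child) or child.attrib else child.text)
            if child.tag in node:  # on a second occurrence, silently switch to a list
                if not isinstance(node[child.tag], list):
                    node[child.tag] = [node[child.tag]]
                node[child.tag].append(value)
            else:
                node[child.tag] = value
        return node

    two = ET.fromstring("<order><item>tea</item><item>milk</item></order>")
    one = ET.fromstring("<order><item>tea</item></order>")
    print(json.dumps(naive_xml_to_json(two)))  # {"item": ["tea", "milk"]}
    print(json.dumps(naive_xml_to_json(one)))  # {"item": "tea"} -- same model, different shape

Consumers of the JSON now have to handle both shapes, and the ambiguity only gets worse once mixed content or attribute/element name collisions enter the picture.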

 

What is a semantic data model?

Roger Costello asked:

> What is a "semantic data model"?

 

Michael Kay responded:

 

A semantic data model tells you how the items in your data relate to things in the real world. An XML document might have a <product> element, but it doesn't tell you what a product is - and it's always more complicated than you think (for example, if you put your cornflakes in a Christmas-themed package one week, does that make it a different product?).

 

> Would you give an example of a semantic data model for XML
> and an example of a semantic data model for JSON, please?

 

You're putting the cart before the horse. You create a semantic data model for a problem domain (for example, in UML), and then you define XML or JSON representations of the data in that domain. The conceptual data model comes first, the concrete realisations come second.

 

Of course, people often skimp on data modelling and can get into a mess as a result. I did some modelling work with a TV company that couldn't agree what a "channel" was - some people thought it was a content service that people subscribed to, others that it was a set of electrical signals transmitted on a particular frequency.

 

> How are their semantic data models used to transform
> XML to JSON and vice versa?

 

You design an XML representation of the data model, and you design a JSON representation of the data model, and then you work out how they relate to each other, by referring back to the data model.
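
As a concrete sketch of that workflow (the Product domain, element names, and unit conventions below are all my own assumptions, not from the thread): the model is captured once as plain types, each syntax gets its own idiomatic design, and the converter is written against the model rather than against the other syntax.

    import json
    import xml.etree.ElementTree as ET
    from dataclasses import dataclass

    # 1. The conceptual model comes first (hypothetical Product domain).
    @dataclass
    class Product:
        sku: str           # stable identifier, independent of packaging variants
        name: str
        unit_price: float  # price per sellable unit

    # 2. The XML representation is designed on its own terms:
    #    sku as an attribute, price in integer minor units.
    def product_from_xml(xml_text: str) -> Product:
        e = ET.fromstring(xml_text)
        return Product(sku=e.get("sku"),
                       name=e.findtext("name"),
                       unit_price=int(e.findtext("price")) / 100)

    # 3. The JSON representation is designed on its own terms:
    #    camelCase keys, price as a decimal number.
    def product_to_json(p: Product) -> str:
        return json.dumps({"sku": p.sku, "name": p.name, "unitPrice": p.unit_price})

    xml_doc = '<product sku="CF-01"><name>Cornflakes</name><price>349</price></product>'
    print(product_to_json(product_from_xml(xml_doc)))
    # {"sku": "CF-01", "name": "Cornflakes", "unitPrice": 3.49}

Neither representation needs to know the other exists; each is related to the model, and the mapping between them falls out of that.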

 

The more widely used a data interchange standard is, the more resilient it is … there's absolutely no reason to doubt that there will still be plenty of XML around in 100 years’ time

Michael Kay wrote:

 

Genealogy data is still exchanged in GEDCOM format, as it has been since the 1970s (it was at version 5.5.1 for twenty years, and has recently been revised to 5.5.5, largely to change the character encoding from ANSEL to Unicode). There are plenty of other examples of interchange standards in widespread use that predate XML. I think someone mentioned MIDI as another example. Once these standards are incorporated into enough different applications, the cost of change invariably exceeds the benefits. And attempts to replace them often fail dismally. The same considerations apply to standards built on XML: if they are in active use, they will endure. So there's absolutely no reason to doubt that there will still be plenty of XML around in 100 years’ time. And if there is XML around, there will be XML technology around to process it, because the data represents a much larger investment than the software.

 

Of course new things will come along: all standards can be improved, especially if you focus on particular areas of application that the old standard wasn't optimized for. But the question starting this thread was whether and when services using XML might need replacement because XML technology or skills are no longer available, and I think the answer to that is never. We're still using Unix APIs designed 50 years ago that everyone knows can be improved upon; we're still using SQL, which is equally ancient and crumbly: key interfaces like that, which are essential to the interoperability of complex IT systems, don't wither and die however antiquated they become. No-one can afford the cost.

 

Simon St. Laurent wrote:

In the document-centric worlds that I follow, I don't get any sense that people still working with XML are calling for revisions of the foundation specs, or for their decommissioning.

I can't say that XML's use WILL be perpetual, but I definitely think that it's reasonable that it COULD be perpetual. As with MIDI, there are people who want to do more (and less) and extend it in their own ways, but most of that seems (so far) to be confined to specific projects.

Marcus Reichardt wrote:

I think XML still has a stronghold in digital/cross-media publishing, but it's time to review the purpose of XML, or maybe to find a new SGML subset or extension that brings markup back in line with what's actually needed: an archival format (where XML may work well), an intermediate format in publishing pipelines (ditto), or an authoring format (where XML is a poor choice, considering that digital text today is mostly written in Markdown and other wiki syntaxes).

Jim DeLaHunt wrote:

What I take from the conversation on this list is that if the information is encoded using the right XML language (and schema, etc.), then it will be a more comprehensible, more re-usable, and thus more valuable asset in future decades, on future systems, than the same information encoded as CSV or JSON.

 

The open-source model doesn't encourage continuous innovation

Stephen Green wrote:

> Yet there have not been any marked improvements in XML handling in 15 years.

 

Michael Kay responded:

 

The developers of both the Java and .NET platforms have left the field to third parties, and most third parties have found it difficult to establish a profitable niche (Saxonica being an exception!).

 

The biggest challenge here has been the open-source business model. XML's success would never have happened without open-source software, but at the same time the open-source model doesn't encourage continuous innovation, because the value that users get from it doesn't flow back to the developer. The big corporates like Microsoft and Oracle and IBM stopped doing new XML work because they couldn't construct a business case that offered a return on investment, and the weekend hobbyists who created some of the original great products like libxslt stopped doing new XML work because they wanted their weekends back. The users who wanted new improved stuff found they were going to have to pay for it.

 

But the other challenge is that the first wave of products met 90% of users' requirements anyway, so even where better things became available and were free, users didn't move forward. Witness the fact that people are still using DOM, despite much better (and free!) alternatives being widely available. It reinforces the fact that old technology, if widely and successfully deployed, simply doesn't die that easily, even when better things are available.
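
To make the comparison concrete (my own Python illustration; the thread names no particular alternative), here is the same extraction in the W3C DOM and in ElementTree, both of which have shipped in Python's standard library for many years:

    import xml.dom.minidom as minidom
    import xml.etree.ElementTree as ET

    xml_text = "<order><item>tea</item><item>milk</item></order>"

    # DOM: generic node machinery, manual descent to text nodes.
    doc = minidom.parseString(xml_text)
    items_dom = [n.firstChild.data for n in doc.getElementsByTagName("item")]

    # ElementTree: direct, list-like access.
    items_et = [item.text for item in ET.fromstring(xml_text).iter("item")]

    assert items_dom == items_et == ["tea", "milk"]

The DOM version works, has always worked, and is documented in twenty years of books and tutorials; that inertia, not technical merit, is what keeps it in use.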

 

 

Will JSON replace XML in government IT?

Ihe Onwuka wrote:

 

JSON is 15 years old now. That's plenty of time for there to have been (at the very least) a successful pilot of an XML replacement project in govt IT.

Ain't heard of one yet. 

Show us the money.

 

The purpose of XML hasn’t changed

Norman Gray wrote:

I don't think the 'purpose' of XML has changed (and I have 'purpose' deliberately in scare-quotes there).

 

It is, as it always was, a way of overlaying explicit structure onto text.  In the 00s it was also used (and I think widely _abused_) as a means of serialising objects and messaging.  It can work for that purpose, and for certain types of large object/message it's the right solution, but for simple or small messages, it's unattractively cumbersome, and a less good solution than, for example, JSON.  It only became used for serialisation and messaging because there wasn't, at that precise time, an obvious better alternative.  There were other remarks on this in the 'over-engineered' thread here, a few months ago.

 

So XML is still arguably a good solution for the cases it was initially designed for, and what's happening now is that the cruft of misapplications of XML is falling away.
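
The point about small messages is easy to quantify with a toy example (my own, not from the thread):

    # The same tiny status message in both notations.
    json_msg = '{"status": "ok", "elapsedMs": 12}'
    xml_msg = ('<?xml version="1.0"?>'
               '<response><status>ok</status><elapsedMs>12</elapsedMs></response>')
    print(len(json_msg), len(xml_msg))  # 33 vs. 86 bytes for the same content

For a large document the envelope overhead disappears into the noise; for a two-field message it dominates.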

JSON is a necessary evil

Kurt Cagle wrote:

JSON is a necessary evil because most programmers have been trained to NOT be systemic thinkers, but rather to concentrate on their own particular module or component. JSON fits this mentality well because JSON serializes cleanly into JavaScript objects, and reasonably well into Python objects.
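
That clean mapping is a one-liner (a Python sketch of my own):

    import json

    # JSON's types land directly on the host language's native ones:
    # object -> dict, array -> list, string/number/true/false/null
    # -> str / int or float / True / False / None.
    msg = json.loads('{"user": "ada", "roles": ["admin", "dev"], "active": true}')

    print(msg["roles"][0])      # admin -- plain dict/list access, no tree API to learn
    print(type(msg["active"]))  # <class 'bool'>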

 


