OASIS Mailing List Archives
[Date Prev] | [Thread Prev] | [Thread Next] | [Date Next] -- [Date Index] | [Thread Index]
Re: [xml-dev] Too much power? was RE: [xml-dev] 2007 Predictions


> Is that really true?  I thought AJAX flowered largely as a way to bring
> modern UIs and interactivity to web apps -- the standardization and easy
> deployment of web "pages", but a user experience close to at least a 1990's
> desktop application.  Even had everyone been less evil and more willing to
> follow those Leading the Web to its Full Potential in the '90s, we'd have a
> welter of patches and hacks today because somebody or other would be working
> to move beyond the 1999 state of the art, and their competitors would
> follow.  The W3C wouldn't be able to stay ahead of the game because if they
> really did make the rules everybody had to follow, there would be 50-100
> people on all the WGs and nothing would get done :-)

I don't think it's a case of the W3C making the rules that's necessarily the issue here. The 1999 state of the art was: XSLT just announced, with a Microsoft version of XSL that wouldn't be made compatible on IE for another three years; XML Schema still a couple of years from being done; XForms likewise; Netscape sold off to AOL, with the open sourcing of Mozilla a couple of years out; Opera barely surviving by its fingertips; the XMLHttpRequest object just announced by Microsoft as something to make IE browsers more powerful; DHTML a convenient marketing buzzword; and the vast majority of people not even aware of what XML was. Microsoft SAT on Internet Explorer because it saw no real advantage to investing money in it at that point, and it could effectively thumb its nose at the W3C because there was really only one gateway to the web. AJAX as a technology existed on the Microsoft platform, admittedly, but it was a pre-XML technology that was (and for the most part still is) frozen in time prior to the emergence of XML.

Most of the W3C suite is in fact emerging on Mozilla, and people ARE beginning to adopt it. The 800-pound Internet Explorer gorilla is still there, but what's happening now is that you're seeing AJAX-friendly sites that ignore IE, and you're seeing enterprises adopt non-IE browsers because of the improved XML support ... support that's increasingly desired to extend the XML workflows they've built on their servers.

Yes, JSON is also becoming a key technology, but I don't see that as being a bad thing - it's an easy enough trick to parse JSON to XML and back on the server, and as JSON becomes more widely available I can see languages extending to do just that, or continuing to work with JSON as a general exchange format (a la Ruby) without breaking open the cask of XML.

JSON won't replace XML for the same reason that SimpleXML never went anywhere - you reach a point where the distinction between data and document blurs, and JSON is too lightweight for the latter case. Here's a radical notion - JSON is simply another serialization format of an XML infoset, one that didn't come from the W3C. Gee - what a concept. That holds true for E4X as well, which is similarly a lightweight non-W3C notation that comes from ECMA, a body that has had a few really good ideas come out of it. Significantly, neither JSON nor E4X originated from Microsoft; both emerged primarily from people working largely with Mozilla who found XML a pain to manipulate in its DOM form.
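To make the "easy enough trick" concrete, here is a minimal sketch of round-tripping JSON to XML and back. It assumes flat JSON objects with text-only values; the function names `jsonToXml` and `xmlToJson` are invented for this illustration and come from no particular library:

```javascript
// Hypothetical round-trip between a flat JSON object and an XML string.
// Assumes string/number values only, with element-safe key names.
function jsonToXml(obj, root) {
  var inner = Object.keys(obj)
    .map(function (k) { return "<" + k + ">" + obj[k] + "</" + k + ">"; })
    .join("");
  return "<" + root + ">" + inner + "</" + root + ">";
}

function xmlToJson(xml) {
  var obj = {};
  var re = /<(\w+)>([^<]*)<\/\1>/g;
  var m;
  // Each leaf element <k>v</k> becomes a key/value pair.
  while ((m = re.exec(xml)) !== null) {
    obj[m[1]] = m[2];
  }
  return obj;
}

var flight = { carrier: "AC", from: "SEA", to: "JFK" };
var xml = jsonToXml(flight, "flight");
// xml === "<flight><carrier>AC</carrier><from>SEA</from><to>JFK</to></flight>"
```

A real server would of course need namespace, attribute, and escaping handling, which is exactly where the data/document distinction starts to blur.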

For years, two messages have come consistently from Microsoft: "We don't want to change the browser because we are afraid of angering our customers" and "The W3C has taken too long to get any standards work done, and the free-market approach that we espouse works better because we're more reactive to our customers." I think the two are mutually exclusive. Most of the core work within the W3C was done between 1998 and 2003; in some cases second-generation iterations of those technologies have appeared in less time than it took IE to go through any significant upgrade. Yet Microsoft has done almost nothing to work with them - it has implemented those standards that it had a direct hand in (XSD Schema, which is a mess) and largely ignored those it didn't.

Yes, the W3C missed the boat a couple of times, most notably on whether it should be involved in higher-level "application" stack pieces such as XMLHttpRequest, which many perceived as something that should be vendor specific. But even there you can point to work done just in the last year to rectify the tardy decision to get involved in that area. And just by dint of discussions on this and related lists, there are a number of people in those working groups you seem so inclined to dismiss who are asking questions such as "Should lightweight non-DOM manipulation be a part of the W3C mandate?", "Should behaviors be a part of the W3C mandate?" and "Should we be considering an infoset serialization that doesn't have pointy brackets?"

The web EVOLVES, and as it does, so does the W3C. In the first few years of this decade the W3C had to kick-start the jump to a more narrowly defined structure than HTML, and that in turn meant that a few core standards needed to be defined; it had to be proactive. Yet increasingly those pieces are solid, and the W3C is transitioning into an organization that watches the web develop and then places its imprimatur on those things that seem most likely to further the overall cohesion of the Internet. We've become so used to "innovative standards" (in many respects an oxymoron) that now that the W3C is acting more like what a standards body should be - a legitimizing body that determines the canonical usage of a technology for interoperability - many people are angry that it isn't acting the way it used to.

Let me ask a question: outside of the Microsoft "universe", who uses XAML? Yet Microsoft uses W3C standards in most of its products because its customers would crucify it if it didn't. Opera, Safari, Konqueror and Mozilla all support SVG; Microsoft uses VML, which is poorly documented, poorly implemented, has strong dependencies on Microsoft libraries, and has been "frozen" since the mid-1990s. When Adobe's SVG plugin support expires in 2008, how many people who depend upon SVG for their applications (such as the city of Toronto, not a small customer by any means) will just decide to refactor those applications to work on Mozilla and jettison their reliance on IE? When XForms support becomes integrated into Mozilla as part of the core suite (mid-summer 2007), how many InfoPath customers will start to see it as a commercially viable alternative? Or when Opera and Safari follow suit? How many web developers will just say "to hell with it" because IE's JavaScript support (excuse me, JScript support) doesn't even bother to support getters or setters, and the cost of maintaining two code bases gets to be too onerous? No, the lagging users are still firmly in the Microsoft camp, but the leading edge has been dropping IE in favor of alternatives at a far higher rate than users are moving the other way.
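For readers who haven't hit the getter/setter gap mentioned above: the object-literal `get`/`set` accessor syntax below worked in Mozilla's JavaScript engine of that era (and was later standardized in ECMAScript 5) but not in IE's JScript. The temperature object is purely illustrative:

```javascript
// Illustrative object using accessor syntax that Mozilla's JavaScript
// supported but IE's JScript did not.
var temperature = {
  _celsius: 0,
  // Reading .fahrenheit derives the value from _celsius on the fly ...
  get fahrenheit() {
    return this._celsius * 9 / 5 + 32;
  },
  // ... and assigning to .fahrenheit updates _celsius behind the scenes.
  set fahrenheit(f) {
    this._celsius = (f - 32) * 5 / 9;
  }
};

temperature.fahrenheit = 212; // _celsius is now 100
```

Emulating the same behavior for JScript meant explicit getFahrenheit()/setFahrenheit() methods everywhere, which is exactly the two-code-base cost described above.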

The browser is the leading-edge application for any technology - it is where people spend 75% of their time interacting with the computer. My browser can support word processing and spreadsheets and dynamically generated graphics and behaviors and complex forms. It certainly can support PowerPoint-like presentations, oh, and it contains an integrated database. Yours can too, but it only works on one platform, still doesn't conform to established CSS standards, has only a very minimal extensibility story, and is remarkably fragile in the process. The advantage that IE has is that it comes free on every new computer that Microsoft has an OEM relationship with, whereas you have to go out and actively download Mozilla or Opera. But most of those downloads now take maybe five to ten minutes ... and if I don't like them, I can uninstall them. That option's fraught with peril for anyone who decides they DON'T want IE on their desktop.

I think before throwing stones, it would be worth taking a good hard look at where the rest of the world (Microsoft's potential customers) is going. Oddly enough, most people seem far more inclined to take the W3C as a reliable arbiter there.

> But my interest here is not to look back at what might have been but to look
> forward to what could be.  Applying the "Principle of Least Power" seems to
> imply that given the choice of two travel services that take a query for
> flights between Seattle and New York ...
> - One of which will return a list of scheduled flights sorted by time;
> - The other of which will return a list of flights that have seats actually
> available that I can be reimbursed for under my employer's travel policies,
> ordered by a tradeoff across time convenience, intermediate stops, and
> price, with opportunities to upgrade the seat using my personal frequent
> flyer miles flagged;
>
> ... that I should choose the first because it has less power???  OK, it is
> more mashup-friendly, could be implemented with a declarative query language
> without any nasty server-side state or imperative code, ... but it's not
> what non-geeks actually prefer.

If all I'm looking for is a list of scheduled flights sorted by time, then yes, I would choose the first. The point I don't understand here is that this has nothing to do with JSON or XML - it is a matter of what is exposed by the data provider. If my database provides this information and if I as a web designer choose to expose this information then it really makes little difference what serialization format I'm using. If I as a data provider feel comfortable exposing this information to the users in question, I might even provide for them an appropriate query tool to talk to the database, but this to me also raises questions about security and data integrity.

All web communication is ultimately a question of the contract between the information provider and the information consumer. Few information providers want to give unfettered access, even read-only access, to their databases, because doing so raises questions of security and privacy. Additionally, most people are not so much interested in getting ALL of the information they could possibly get as in getting the information most germane to their problem at the time, in a view appropriate to the context in which they are asking the question.

Put another way (and to get back to your analogy): I need a certain voltage and amperage to run my computer. I rely upon the energy grid to ensure there is comparatively little variability in the line, and upon a transformer to get just the precise amount of power needed to keep my laptop happy. Too much power will blow up my computer; too little and it can't run. That's the principle of least power - provide what is necessary and sufficient at the moment.





Copyright 1993-2007 XML.org. This site is hosted by OASIS