- From: rev-bob@gotc.com
- To: xml-dev@ic.ac.uk
- Date: 16 Nov 99 22:05:15 -0500
Okay, my first version of this response just wound up being far too huge. To make
things really simple, here's the point I'm trying to make:
Abstraction is a tool. It's neither good nor bad in and of itself - no more than a chainsaw
is. However, with any tool, its potential to be Really Good is matched by its potential to be
Really Bad, and the more powerful it is, the greater *BOTH* of those potentials are.
This is what I mean by "danger" in this context; if we give XML (a really
powerful structure) a really high level of abstraction (putting that power in the hands of
the masses), we will have something really powerful. This is not necessarily a good
thing, and that's all I'm really trying to say. (A nuclear bomb is really powerful, but tell
the people affected by one that its power is beneficial.)
So far, we have exactly one example class of high abstraction authoring tools on the Web
- the WYSIWYG HTML editor. Is there really anyone here who will argue that the
development of this class was a good thing? The questions about where this concept
failed are, for my purposes, largely irrelevant; I am merely pointing out that high
abstraction and ease of content generation are by no means a formula for a good tool. In
short(er), I'm just trying to urge caution and thought before we charge gung-ho down this
path.
Now, as for amplification on specific points....
> Ah, I see. So what you're saying is that if something may have negative
> consequences as a result of its misuse, it should be denied to the
> uninitiated. OK, that's fine. Let's start with the Bible.
I'm not saying that dangerous things should never be used - what I *am* saying is that
when you're using something dangerous, you should at least know that risks are involved
and how to deal with them. (And as far as the Bible goes, since you bring it up, I
consider it absolutely unconscionable to indoctrinate children in *any* religious belief
structure before they're old enough to actually consider and weigh the beliefs involved.
"Because Mommy says so" is not a sufficient foundation for faith - but that's a topic for
a private thread.)
> > Perhaps my point will be clearer if I explain that I'm all in favor of automation -
> > but if you don't know what you're automating, you will get into trouble. Heck, my
> > site is automated to the hilt - and yet, I control every last byte that comes out. See
> > the distinction?
>
> No.
Okay, here's another try, and I'm going to attempt to make it a short one.
In terms of content generation ("author level"), my site is really simple. A lot of code
gets stuffed into macros for ease of use and consistency of generation; the result is that if
I suddenly decide that for my XHTML pages, I want to transform FONT tags into
UA-sensitive CSS directives, I simply write one set of definitions, get 'em right, and go with
it. In one sense, this is close to what I understand of XSLT - take your content,
transform it according to certain definitions, and send it in the desired format. The
interesting part is that I don't have to depend on any client software to achieve this; all
the transformation is done either on my machine (pre-upload) or on-the-fly at the server.
Until I have a way to accomplish this with XML, the current system works pretty well.
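To make that concrete, here's a rough sketch of the kind of pre-upload rewrite I mean.
This is Python rather than my actual macro system, and the FONT-to-CSS mapping and the
sample fragment are purely illustrative - the point is only that the content gets
transformed before any browser ever sees it:

import xml.etree.ElementTree as ET

def font_to_css(xhtml):
    # Parse an XHTML fragment and rewrite presentational FONT elements
    # into CSS-styled spans before the page is published.
    root = ET.fromstring(xhtml)
    for font in root.iter("FONT"):
        style = []
        if "color" in font.attrib:
            style.append("color: " + font.attrib.pop("color"))
        if "face" in font.attrib:
            style.append("font-family: " + font.attrib.pop("face"))
        font.tag = "span"
        if style:
            font.set("style", "; ".join(style))
    return ET.tostring(root, encoding="unicode")

print(font_to_css('<p>Hello <FONT color="red" face="Verdana">world</FONT></p>'))
# <p>Hello <span style="color: red; font-family: Verdana">world</span></p>

Swap the FONT rule for any other definition and every page picks up the change on the
next build - the same idea as an XSLT stylesheet, minus the client-side dependency.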
Here's the key - the abstraction is entirely in my hands. If it doesn't work, I don't have
to look for another piece of software that *might* work better. To return to the
chainsaw metaphor, if I cut myself with it, the blame is entirely mine. Similarly, if I
turbocharge it, I take the extra risks and (hopefully) reap the added benefits. If I may
coin a term, it's a completely open abstraction, instead of the closed sort common to
APIs and such. ("We don't care how it works, only THAT it works.") Just as I would
hope none of you would trust your data to an XSLT transformation that you could not
influence, I would not trust a site to a high-level closed abstraction. Why? Because time
moves on, and even the best abstraction must change on occasion. With a closed
abstraction, that means buying a new tool and learning how to use it. With an open one,
it simply means tinkering with the one you already have and are familiar with.
> In fact, seeing as how the Web took off long before there was a decent
> widely available HTML editor, you could blame the failure on handcoders
> alone. So let's stop hiding behind these poorly-abstracted editors you
> keep harping about.
In other words, let's stop talking about abstraction and shift to "Why HTML Failed"? No,
that's the wrong tangent. The topic is abstraction and its potential with Web
technologies; as such, we have to consider the existing examples.
> Seeing as how XML is fundamentally about validation and/or
> well-formedness, and that the structure has been carefully kept from the
> presentation, and the logic kept separate from both, there shouldn't be
> any trouble, now, should there?
Shouldn't, true. Then again, since nobody in their right mind would produce a
WYSIWYG-format HTML editor, I suppose FrontPage is just a figment of my
imagination. "Should" is wishful thinking; deal with what IS. Microsoft et al. had no
excuse to release (let alone sell) products that spit out horrible code - and yet, they did.
(Say, didn't MSXML do fairly terribly on XML conformance tests? Naah, this couldn't
be a sign of history repeating itself....)
I'm simply trying to point out that high-level abstraction for Web design has been tried
before, with spectacularly BAD results. Is it really so much to ask that this time we
might give the issue some really serious thought before trying the same idea again?
> One could argue that the problem with FrontPage and its kind is less a
> problem of poor HTML output and more of introducing a page-centric view
> into what should be performed using higher-level abstractions (e.g.,
> "sitelets", "areas", "chunks").
Possibly so. Speaking for myself, divorcing content from formatting at the page level
leaves me increasingly able to administer the site as a whole instead of each page as a
separate document. Sure, there are times I need to go in and tweak individual content
files - but these days, that's either for the purpose of enhancing metadata relevant to that
particular page or to bring an odd bit of code into line with a more widespread rule. That
sort of abstraction works wonders for my development efforts; what I am wary of is
closed abstraction that a given author has no control over. Yeah, saying that someone
should be able to create a new XML document with a couple of program lines sounds
really neat - but who writes the rules for that construction?
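Just to separate the easy part from the hard part: generating a document programmatically
takes almost nothing. Here's an illustrative Python sketch (not any particular product's
API) - notice that nothing in it says which elements are legal or what a consumer is
entitled to expect:

import xml.etree.ElementTree as ET

# A "couple of program lines" of authoring - but whose rules does it follow?
doc = ET.Element("article")
ET.SubElement(doc, "title").text = "Hello"
ET.SubElement(doc, "body").text = "Easy to emit; hard to govern."
print(ET.tostring(doc, encoding="unicode"))
# <article><title>Hello</title><body>Easy to emit; hard to govern.</body></article>

The generation is trivial; the rules are where the real authority - and the real danger -
lives.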
> > With FrontPage et al., there is no such impetus; the "child" never knows
> > he is being misunderstood.
>
> Similarly so for handcoders, and people who only test their markup in
> their own favorite browser, etc.
This is precisely what I like about XML - *in theory*, I should be able to write a
document, select or write a DTD, maybe write a set of XSL/T rendering rules for some
odd stuff, and have it work on any XML parser. As I understand it, XML documents
are supposed to either work or break - no ambiguity. I'm looking forward to that.
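That "work or break" behavior is easy to demonstrate. Here's a quick illustrative sketch
(Python, chosen purely for brevity - any conforming parser behaves the same way, and
checking validity against a DTD is a separate, stricter step on top of this):

from xml.dom.minidom import parseString
from xml.parsers.expat import ExpatError

good = "<note><to>Bob</to></note>"
bad  = "<note><to>Bob</note>"    # mismatched end tag: not well-formed

parseString(good)                # parses cleanly
try:
    parseString(bad)
except ExpatError as err:
    print("rejected:", err)      # the parser refuses it outright

No "best guess" rendering, no silent repair - the broken document simply does not get
through.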
> You sincerely believe that the Web was ruined by lazy people with WYSIWYG
> editors, don't you?
They certainly didn't help matters.
> > Let the novices play with HTML; it's already broken. Why should we make
> > it easy for them to break XML as well?
>
> How will making something easy to use, while not surrendering its inherent
> robustness, make it easy for novices to break it? If anything, it seems to
> me that it will ensure its success.
Chainsaws are easy to use; they are also extremely easy to MISuse. That is my concern.
FrontPage is easy to use if you're happy with broken code; I am not. Personally
speaking, I'll sacrifice some ease of use for clean results every time...but then, I care
about how my documents look. I'm concerned with getting pages up quickly, sure - but
what does it profit me to get something up fast if nobody can read it?
There's already been talk here about conflicting XML parsers and possible fracturing of
the spec...and this is at the developer level! Am I really supposed to believe that this will
get *better* by shoving it into the FrontPage "XML in a box" mold?
> I find it extremely difficult to say the same about requiring everyone to
> hand-code their own start- and end-tag handlers, thereby virtually
> guaranteeing that one of the following happens:
>
> 1) people who aren't programmers will stay away from XML
> 2) bad programmers will screw up XML worse than they did HTML
This, folks, is why strict conformance specs and validating parsers in Web user agents
are desirable - so that if a document is invalid, it doesn't work anywhere. (As opposed to
working in one parser but not in another - which is just Browser Wars: The Sequel.)
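For the record, "hand-coding your own start- and end-tag handlers" looks roughly like
the following - an illustrative SAX sketch in Python, not an endorsement of any
particular parser. Note that a conforming parser rejects a document that isn't
well-formed instead of guessing, which is exactly the behavior I want user agents to
share:

import xml.sax

class EchoHandler(xml.sax.ContentHandler):
    # The "hand-coded" part: what to do at each start and end tag.
    def startElement(self, name, attrs):
        print("start:", name)
    def endElement(self, name):
        print("end:  ", name)

xml.sax.parseString(b"<doc><p>hi</p></doc>", EchoHandler())    # fine

try:
    xml.sax.parseString(b"<doc><p>hi</doc>", EchoHandler())    # mismatched tag
except xml.sax.SAXParseException as err:
    print("rejected:", err)

Yes, that is more work than clicking a toolbar button; it is also work that cannot
quietly produce tag soup.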
If someone can make a product that conforms strictly to the specs and yet has a high
enough level of abstraction to be widely usable, more power to 'em - but I *am* going to
be skeptical, for the reasons I've stated above. If, on the other hand, this mythical
someone has to sacrifice conformance for a higher abstraction - well, in that case, I'd
much rather see 'em junk the project altogether. IMO, it's better to have no tool than a
bad tool.
Rev. Robert L. Hood | http://rev-bob.gotc.com/
Get Off The Cross! | http://www.gotc.com/
Download NeoPlanet at http://www.neoplanet.com