Yes. The test specifications were the next step.
Alan Hudson was working on those last I checked.
A reference implementation was considered back in
the VRML days but wisely turned down. Once I understood
the implications, it seemed right to turn to open
source for public proof of implementability and to
a test suite for the rest.
Sure, XML is a good first step and widely shared, but
it is about the sharing, not the technology. Claims
of interoperability based on XML alone are often
overstated in the spec process. That led to a lot
of angst and anger when X3D was being spec'd. The
object model is the important piece, and it has
proven itself as extensions have been added.
len
From: Paul Downey [mailto:paul.downey@whatfettle.com]
On 5 Dec 2005, at 15:46, Bullard, Claude L (Len) wrote:
>
[snip]
> We debated a reference implementation but that has a way of
> strangling innovation for qualities such as speed and ease
> of extensibility. Over time, the open source implementation
> became the proving ground for those who require transparency
> of process and code. That works reasonably well.
>
> It was the lack of clear open source that killed many an
> early SGML project, some of which have well-known names. So
> while it is true that ISO does not endorse them, it is a
> good idea to have implementors working side by side with
> the standards members, for reasons most of us here understand,
> I'm sure.
I'm not keen on the idea of a 'reference implementation', but
I'm a big fan of 'test driven specification', by which I mean
if you can't test it, and don't have a test for it, it doesn't
belong in the spec. Interoperable specifications describe how
things should work, not how people should think about things:
XML technologies that define the order of pointy-brackets in a
stream of bits are going to interoperate a whole lot better than
those that describe the expected firing of neurones in a bag of meat.
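A minimal sketch of the kind of testable assertion that
belongs in a spec, written here in Python with only the
standard library (the test name and sample documents are
illustrative, not drawn from any real test suite):

    import xml.etree.ElementTree as ET

    def test_attribute_order_is_not_significant():
        # Hypothetical spec assertion: two documents that differ
        # only in attribute order carry the same information.
        a = ET.fromstring('<e x="1" y="2"/>')
        b = ET.fromstring('<e y="2" x="1"/>')
        assert a.tag == b.tag and a.attrib == b.attrib

Run it under pytest or call it directly; either way the claim
passes or fails against actual bits in a stream, not against
anyone's mental model.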
Paul