RE: [xml-dev] More predictions to mull over
- From: "Len Bullard" <cbullard@hiwaay.net>
- To: <noah_mendelsohn@us.ibm.com>, "'bryan rasmussen'" <rasmussen.bryan@gmail.com>
- Date: Wed, 14 Feb 2007 09:52:40 -0600
That's similar to what I see in the real-time 3D space.
If you do the Barnes and Noble test, most of the books on the shelf for online games
programming use an engine called Torque and a language called TorqueScript,
not Flash or the other competitors. Torque CDs are included, examples are
plentiful, and so on. When you open the books and look at the script language,
it is butt-ugly, a bit like PHP. On the other hand, as you get into the
details, you see good support for the big ol' data glops that games need for
making the persistence mojo work on the server, and the organization of these
neatly reflects the concerns of the game author: character/avatar support,
scores, guns, vehicles, etc. When you look at the scripting commands, you find
that all of the hard work it takes to make water and skies is wrapped up in
high-level commands.
If you do this stuff with VRML/X3D, you either find open PROTO libraries or
you build them yourself. The language is higher level than OpenGL but lower
level than Torque. For anyone who needs cross-genre (games are not 3D on the
Web; they are a genre of real-time 3D on the Web) and only wants to use one
set of tools and one language because of the lifecycle costs, VRML/X3D is in
the sweet spot of complexity vs. cost and effort. Some things are easier,
such as sending events among the objects in different files, and the script
language is Javascript/Ecmascript/CDumbedDown, so that part of the curve is
easy (see the sketch below). The lack of a network sensor or a physics engine
is bad, but standards wonks and the vendors are looking into the first and
working on the second, so that gets solved by market forces. I don't see any
competitors for that sweet spot because, given the fundamentals of real-time
3D, anyone writing a cross-genre standard will produce something very similar,
and anything else is a dedicated genre language or a library on top of the
cross-genre standard.
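To make the event wiring concrete, here is a rough, from-memory sketch in the
X3D XML encoding: a touch sensor routed into an Ecmascript Script node and
back out to a material. The node and field names are illustrative only, not
taken from any shipping world, so treat it as a sketch rather than a tested
file:

<X3D profile="Immersive" version="3.0">
  <Scene>
    <Transform>
      <Shape>
        <Appearance><Material DEF="MAT"/></Appearance>
        <Box/>
      </Shape>
      <TouchSensor DEF="TOUCH"/>
    </Transform>
    <Script DEF="LOGIC">
      <field name="clicked" type="SFTime" accessType="inputOnly"/>
      <field name="newColor" type="SFColor" accessType="outputOnly"/>
      <![CDATA[ecmascript:
        // when the sensor fires, emit a new diffuse color
        function clicked(value, timestamp) {
          newColor = new SFColor(1, 0, 0);
        }
      ]]>
    </Script>
    <!-- the wiring: sensor -> script -> material -->
    <ROUTE fromNode="TOUCH" fromField="touchTime" toNode="LOGIC" toField="clicked"/>
    <ROUTE fromNode="LOGIC" fromField="newColor" toNode="MAT" toField="diffuseColor"/>
  </Scene>
</X3D>

The same ROUTE mechanism is what carries events between objects, and, as far
as I recall, across file boundaries once the nodes in an Inline are exported,
which is the part an engine like Torque hides behind its own calls.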
Still, today, by the B&N test, Torque rules. So if you learn it and it does
what you need, it is convenient and good enough. Same for HTML, Fortran,
Cobol and all the other languages that find themselves on life support: while
they aren't being improved, they are still being used because they do the job
they were optimized for.
REST is fundamental, like XML. If you do the B&N test on XML, you find that
the section that used to be one half of one shelf is now five shelves. Any
app can use these if you don't mind the tedium. WS, done well if ever, tends
toward the application-specific, and you'll need to learn more to do less.
If WS is mired, it is because its designers keep trying to make it general
and a substitute for REST. Messages, by the nature of their use, tend toward
the lingua franca of the community that uses them, and that is, by its nature,
a subset. XML? Is XML evolving at all? Really? Does that stop us from
using it?
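To make the "learn more to do less" point concrete, here is an illustrative
contrast (the resource, service and namespace names are made up, not from any
real deployment). The REST-style fetch of a record is a plain HTTP GET of an
XML representation:

GET /orders/42 HTTP/1.1
Host: example.org
Accept: application/xml

The WS-style equivalent wraps the same question in an envelope, a service
endpoint and an action header you have to learn first:

POST /OrderService HTTP/1.1
Host: example.org
Content-Type: text/xml; charset=utf-8
SOAPAction: "urn:example:orders/GetOrder"

<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetOrder xmlns="urn:example:orders">
      <OrderId>42</OrderId>
    </GetOrder>
  </soap:Body>
</soap:Envelope>

Same data either way; one of them just costs more to get started.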
There is no longer any real reason to pretend that we have "The Web". We
have webs sitting on a substrate of network standards, proprietary products
and loosely accepting protocols. That's as good as it gets because that is
as free of constraints as a system can be made. Support your local
plurality.
If you aren't a Microsoft, Sun or IBM, it's better to slice off a part of "A
Web" and dominate it than to try to take it all from the guys with the
firepower. That is where REST thrives, to the advantage of the developer.
If that seems inverted, well, it is.
BTW: after the Royal Brit sniper forces survey a target, they come back and
create a static 3D model. They make it out of sticks and mud because a) the
materials are cheap and convenient and b) it is easy to destroy it after
they are done with it. That is fundamentals in action.
len
From: noah_mendelsohn@us.ibm.com [mailto:noah_mendelsohn@us.ibm.com]
Sent: Wednesday, February 14, 2007 8:37 AM
Not sure whether it's still true, but 10+ years ago a lot of scientists
were still using FORTRAN in part because the optimizations people were
putting into the compilers were more suited to numeric codes, while the
optimizations for languages like C variants were focused more on systems
code (pointer chasing, etc.). I think the math libraries were also better,
complex number types were more of a first class citizen, etc. In short,
while the language was in many ways dated, it was being well maintained
for a specific audience that other languages weren't addressing as well.
In part for that reason, innovation continued on the FORTRAN platform
certainly into the early '90s, and I suspect well beyond. It's probably
more like 15+ years ago, but folks like Ken Kennedy of Rice U. (who I'm
very sad to say died just a week or two ago) were doing most of their
parallel computing experimentation on a FORTRAN base. I suspect things
have changed some since then. More and more large compiler efforts share
back ends and optimizers across multiple source languages, suggesting
that the optimizations for C languages and FORTRAN have probably gotten
closer, etc. No doubt a lot of parallel computing work, including for
numeric codes, has moved to newer languages. Still, there were reasons
other than "lack of vision" why scientists have persisted on using FORTAN
long after most CS types figure it's become petrified. Maybe it has now,
but if so that's relatively recent.