On Fri, 31 Dec 2004 07:03:25 -0800 (PST), Benjamin Franz wrote:
> And yes - commercial planes _HAVE_ crashed where one of the proximate
> causes was broken software (accidents are rarely 'one thing' - they are
> usually two or more things in unusual combination): Put American Airlines
> Flight 965 into a search engine.
I did, and got some interesting information. (This was the
"controlled flight into terrain" incident near Cali, Colombia.) One
phrase "pilots may be overconfident in automation" leaped out at me;
that's certainly relevant to our discussion of procedural/declarative
approaches if you think of declarative as more "automated."
> I'm of the opinion that software is where engineering was about a century
> ago: In demand, unregulated, and open to anyone who wants to call
> themselves a 'programmer', regardless of skill or training. Disasters
> directly traceable to poor 'engineering' by people with neither skill or
> training killed a number of people and laws were passed restricting who
> can legally call themselves an 'engineer'.
From what I took away from a quick scan of the AA 965 incident, the
problem wasn't so much poor engineering as bad specwriting (or
"program management" maybe)-- everything worked as designed, but the
designs did not really "work" for human beings who had to cope with an
unexpected situation -- there was no graphic display of the terrain,
the pilots failed to revert to traditional navigation techniques when
they should have, and so on. Arguably the mechanical/software
engineering was too good, or too optimized for situations that prevail
99.99% of the time, so the overall *system* degraded precipitously
when, uhh, the brown smelly stuff hit the fan.
Did you have a different take on this from your more extensive
reading? What poor engineering killed people on AA965? More
importantly, is there really a well-defined state of the art for the
overall systems engineering -- including the human factors -- that
could be taught and certified?
I admit that I'm a bit biased (or perhaps defensive, since I don't
have an engineering degree or certification and have had a bunch of
jobs with "engineer" in the title). I wouldn't disagree that this is
a Bad Thing, and that laws should be made to make the term "engineer"
mean something in the software world. I will guess that continual
*updates* to training and education are more important than
certifications and licenses per se. For example, look at the sample
chapter in "Writing Secure Code, 2nd Edition".
How many people with degrees/certifications in software engineering
that are more than a few years old would "treat all input as evil
unless proven otherwise"? And how many would know all the perverse
ways that have been found to subvert code written to 1990's best
practices for validating lengths before calling strcpy()?
My point is not to disagree about the need for rigor, but to argue
that it is not enough -- designers of safe systems also need a lot of
attention to touchy-feely human issues, a dose of Machiavellianism to
deal with the people who will deliberately try to make them fail, and
an appreciation for the possibility that Father Darwin tends to
design more robust systems by trial and error than an Intelligent
Designer can realistically hope to. :-) I have no idea how to create
a licensing exam to test for this kind of stuff.