OASIS Mailing List Archives

   Re: [xml-dev] The Airplane Example (was Re: [xml-dev] Streaming XML)


On Fri, 31 Dec 2004, Michael Champion wrote:

> From what I took away from a quick scan of the AA 965 incident, the
> problem wasn't so much poor engineering as bad specwriting (or
> "program management" maybe)-- everything worked as designed, but the
> designs did not really "work" for human beings who had to cope with an
> unexpected situation -- there was no graphic display of the terrain,
> the pilots failed to revert to traditional navigation techniques when
> they should have, and so on.  Arguably the mechanical/software
> engineering was too good, or too optimized for situations that prevail
> 99.99% of the time, so the overall *system* degraded precipitously
> when, uhh, the brown smelly stuff hit the fan.
>
> Did you have a different take on this from your more extensive
> reading?  What poor engineering killed people on AA965?

It is too long to put here, but see 'section 1.16.2 At Jeppesen Sanderson' 
from the accident report 
<URL:http://sunnyday.mit.edu/accidents/calirep.html>.

In essence, the system allowed the creation of an ambiguous navigation 
entry in a way that lent itself to retrieving the wrong navigation data 
without realizing it.

Dropping down to page 41:

   The evidence indicates that either the captain or the first officer
   selected and executed a direct course to the identifier "R," in the
   mistaken belief that R was Rozo as it was identified on the approach
   chart. The pilots could not know without verification with the EHSI
   display or considerable calculation that instead of selecting Rozo, they
   had selected the Rome beacon, located near Bogota, some 132 miles
   east-northeast of Cali. Both beacons had the same radio frequency, 274
   kilohertz, and had the same identifier "R" provided in Morse code on
   that frequency.

The poor engineering was the failure to force the user to disambiguate the 
identifiers. The pilots were able to retrieve a database entry without 
confirming it was the _correct_ entry when there was more than one match. 
The system should have presented them with a choice like

   Select:
     (A) Romeo
     (B) Rozo

rather than just returning an entry without telling them there was 
_another_ entry with the same identifier.

> More importantly, is there really a well defined state of the art for 
> the overall systems engineering -- including the human factors -- that
> could be taught/certified?

Hmmm...Hard one. I would say yes. Becoming a licensed professional 
engineer requires more than graduating with an engineering degree. It also 
requires some years of experience (typically four) before you are eligible 
for licensing, plus a grueling examination. The medical profession uses a 
similar degree/testing/internship system, as does the legal profession.

The pattern is pretty standard in the professions:
education + testing/board + internship -> licensing

Here is an interesting diagram from the National Society of Professional 
Engineers:

  http://www.nspe.org/etweb/1!-00-2000_Licensure_Model_Law.asp


> I admit that I'm a bit biased (or perhaps defensive, since I don't
> have an engineering degree or certification and have had a bunch of
> jobs with "engineer" in the title).

I'm in the same boat, so I have many of the same biases. 24 years of 
experience, no software degree. I think that in any field evolving from 
'just invented' to 'formalized profession' this is going to be true.

> I wouldn't disagree that this is a Bad Thing, and that laws should be 
> made to make the term "engineer" mean something in the software world. I 
> will guess that continual *updates* to training and education are more 
> important than certifications and licenses per se.  For example, look at 
> the sample chapter in Writing Secure Code, 2nd Edition 
> http://www.microsoft.com/mspress/books/sampchap/5957.asp#SampleChapter. 
> How many people with degrees/certifications in software engineering that 
> are more than a few years old would "treat all input as evil unless 
> proven otherwise"?

Lack of attention to validating inputs is nearly universal in my 
experience. That is a whole 'nother thread. :)

> And how many would know all the perverse ways that have been found to 
> subvert code written to 1990's best practices for validating lengths 
> before calling strcpy()- and memcpy()-like methods?
>
> My point is not to disagree about the need for rigor, but to argue
> that it is not enough -- designers of safe systems also need a lot of
> attention to touchy-feely human issues, a dose of Machiavellianism to
> deal with the people who will deliberately try to make them fail, and
> an appreciation for the possibility that Father Darwin tends to
> design more robust systems by trial and error than an Intelligent
> Designer can realistically hope to. :-)  I have no idea how to create
> a licensing exam to test for this kind of stuff.

Practical exams and board exams.

You give people examples of broken code, with descriptions of what they 
are _supposed_ to do, and tell them to fix them. You give them design 
specs and tell them to build to them. You give them example programs and 
ask them to _tell you_ whether the programs are well or poorly written - 
and why. You give them a spec with an important item or two missing; to 
pass, they have to figure out the right questions to complete the spec. 
You give them a spec for something that cannot be done and expect them 
to tell you that it can't be done. You give them programs, tell them to 
test them, and ask what is wrong with the user interfaces and how they 
could be corrected.

I think that the best way to find out whether someone knows how to do 
something well is to ask them to do it and have the result reviewed by 
professionals in the field. An experienced professional in any field can 
always tell shoddy workmanship, whether you are talking programming or 
carpentry. Professional certification boards do much more than review 
paper documentation.

It is doable. Not easy or cheap, but doable. People have a habit of 
thinking in terms of multiple-guess tests and their limits when it comes 
to testing knowledge and skill: practical, essay, and verbal exams have 
been around far longer and can test much more. They just cost more to 
administer.

-- 
Benjamin Franz

"All right, where is the answer? The battle of wits has begun.
It ends when you click and we both serve pages - and find out who is right,
and who is slashdotted." - David Brandt




 
