> >>Absolutely not. The fact is computers aren't that smart, and robust
> >>systems allow and prepare for human intervention. In practice, most
> >>debugged and deployed systems rarely require human intervention of
> >>this sort.
> >
> >I've not met many such perfect systems.
>
> The systems don't need to be perfect. That's why I wrote "rarely"
> instead of "never". Pretty much all systems do require intervention
> of this sort, but rarely. The vast majority of transactions go
> through without a hitch. It's only the rare one that needs manual
> assistance.

I am reminded of a system that I used (as a lowly clerk on a student
vacation job) where we (the clerks) discovered a need to intercept letters
written to certain people before they were mailed out. We achieved this, in
effect, by entering dummy codes into the system so that these letters were
all sorted together and could be easily removed from the batch before being
posted. The following year they had improved the data validation so that
this procedure didn't work, but they hadn't fixed the problem that led to
the need for this manual process in the first place. In fact, they probably
never knew it was happening.

In general, I think validation is good and proper. It's possible that our
dummy codes were causing havoc in some other part of the system. But the
people writing the validation rules should always write them to allow
maximum flexibility, recognizing that the system designers aren't
omniscient. Validation rules, for example, should never force users to tell
lies in order to get past validation (like the web sites, fortunately now
rare, that require me to enter a fax number - someone somewhere is getting
some strange faxes by now).
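
To make the principle concrete, here is a minimal sketch (mine, not from the
original post) of a form validator in which the fax number is optional and is
checked only if the user chooses to supply one. The field names and the
phone-number pattern are illustrative assumptions, not anything defined in this
thread.

    import re

    # Illustrative pattern for "looks like a phone/fax number"; an assumption,
    # not a standard.
    PHONE_PATTERN = re.compile(r"^\+?[0-9 ()-]{7,20}$")

    def validate_contact(form: dict) -> list:
        """Return a list of validation errors; an empty list means the form is valid."""
        errors = []

        # Genuinely required information is still required.
        if not form.get("name", "").strip():
            errors.append("name is required")
        if not form.get("email", "").strip():
            errors.append("email is required")

        # The fax number is optional: check its format only if one was supplied,
        # so nobody has to invent a fake number just to get past validation.
        fax = form.get("fax", "").strip()
        if fax and not PHONE_PATTERN.match(fax):
            errors.append("fax number, if given, must look like a phone number")

        return errors

    # Example: a form with no fax number passes; a nonsense fax number does not.
    print(validate_contact({"name": "A. User", "email": "a@example.com"}))
    print(validate_contact({"name": "A. User", "email": "a@example.com", "fax": "xyz"}))

The design point is simply that the rule constrains the value only when the
field is actually used, rather than demanding a value the user may not have.
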
Michael Kay