Everyone who writes safety critical software should read this
Joseph Rushton Wakeling
joseph.wakeling at webdrake.net
Thu Oct 31 05:32:37 PDT 2013
On 30/10/13 23:31, Chris wrote:
> I know. A lot of people are like that. But who (mis)guides them? The big PR
> campaigns by big companies who talk about "safety" and "precision" and give
> users a false sense of security. Now that I think of it, maybe the fact that
> they don't have a simple mechanical backup is not because of the engineering
> culture. Maybe it has to do with the fact that a product might seem less
> attractive if, by including a backup mechanism, the company admits that it can fail.
I'll play devil's advocate here, if nothing else because I'm curious what
Walter's response may be ... :-)
One of the things that makes a car different from an aeroplane is that pilots
form a relatively small group of highly trained people. Car drivers get
trained, but not to a very high level.
So, in those circumstances, any control you put in the vehicle needs to be
weighed against at least four questions:

  1. What are the expected benefits if this control needs to be used and is
     used correctly?
  2. What are the expected problems if this control doesn't need to be used,
     but is used anyway?
  3. What's the likelihood of a situation arising where the control needs to
     be used?
  4. What's the likelihood that the driver can correctly distinguish when it
     needs to be used -- what are the expected false positives and false
     negatives?
The point being that a manual override in the hands of the average driver could
in fact _increase_ the risk of an accident, because the most likely outcome is a
driver engaging it incorrectly.
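To make the arithmetic concrete, here's a minimal back-of-the-envelope sketch
in D (since we're on this list) of the expected-value comparison those four
questions imply. Every probability and cost figure below is invented purely for
illustration, and it pessimistically assumes that any non-correct engagement
when the override is needed actively causes harm; the point is only the shape
of the calculation, not the numbers.

    import std.stdio;

    void main()
    {
        // All figures are hypothetical, chosen only to illustrate the trade-off.
        double pNeeded            = 0.0001; // chance a situation needing the override arises
        double pCorrectWhenNeeded = 0.5;    // chance the driver then recognises it and uses it correctly
        double pSpuriousUse       = 0.001;  // chance the driver engages it when it isn't needed

        double benefitPerCorrectUse = 1.0;  // accidents avoided per correct use (relative units)
        double harmPerMisuse        = 0.2;  // accidents caused per spurious or botched use

        // Expected accidents avoided by having the override available:
        double expectedBenefit = pNeeded * pCorrectWhenNeeded * benefitPerCorrectUse;

        // Expected accidents caused by the override being engaged when it isn't
        // needed, or (simplifying assumption) engaged incorrectly when it is:
        double expectedHarm = pSpuriousUse * harmPerMisuse
                            + pNeeded * (1.0 - pCorrectWhenNeeded) * harmPerMisuse;

        writefln("expected benefit: %g", expectedBenefit);
        writefln("expected harm:    %g", expectedHarm);
        writefln("net effect of adding the override: %g", expectedBenefit - expectedHarm);
    }

With those made-up numbers the net effect comes out negative: the spurious-use
term dominates, which is exactly the scenario where adding the control makes
things worse. Realistic estimates of those probabilities are, of course, what
the four questions above are asking for.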