Logical Fallacies in the Safety Sphere

Sometimes I feel like I really missed out by not receiving a "classical" education. While I can probably live without the Latin and Greek philosophy, one area I've been keen to pick up is formal logic. Forming a coherent and valid argument is a key skill which is, in my opinion, overlooked in safety management. That's disappointing, since making such an argument is at the heart of making a safety case.

I'm not going to tackle the subject of logic today - to be honest, I don't know enough about it as a whole. Instead, I'm going to focus on the typical failings found in logical arguments - the logical fallacies.

A logical fallacy is essentially an error in reasoning that leads to an invalid argument.

Firstly, it is funny that most definitions I saw on the web described fallacies as "errors" - a term which carries a particular meaning in aviation safety circles regarding intent. I just want to be clear that fallacies are not restricted to unintentional errors - they can be made deliberately.

More importantly, I should define a valid argument.

A valid argument is one in which the truth of the conclusion flows from the truth of the premises.
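A classic syllogism makes the structure clear: all unstabilised approaches should end in a go-around; this approach is unstabilised; therefore, this approach should end in a go-around. If the premises are true, the conclusion has to be true - that's what makes the argument valid.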

Now, there are a lot of specific types of fallacies. So many, in fact, that people have even developed taxonomies of them. Recently, I found a good primer in this area thanks to a team from Virginia.

But I've got a bit of a problem with one aspect of this paper. The authors seem to have a higher opinion of safety professionals than I do. These are some of the offending sentences:

We assumed that safety arguments do not contain emotional appeals for their acceptance or willful attempts at deception.

For example, wishful thinking was excluded because it concerns arguments in which a claim is asserted to be true on the basis of a personal desire or vested interest in it being true. Such an argument is unlikely to appear explicitly in a safety argument.

That second one really grates on my nerves. Safety tends to cost money, and money is the most basic "vested interest".

I have sat through quite a few presentations on aviation safety that have deliberately pulled on the heart-strings to promote their agenda. This is a type of fallacy known as an emotional appeal.

Under the emotional appeal category, there are a few different types. Each is based on a different emotion - fear, envy, hatred, etc. But it is probably the appeal to pity (or the argumentum ad misericordiam) that I've seen the most. Here is a run-through of the most vivid of my encounters - de-identified, of course.

This presentation was on a certain approach to operational safety. I'll at least say that it wasn't SMS, but let's leave it at that. The majority of the presentation was, I assume, a fairly accurate outline of this approach and how it was to be applied in the presenter's operational environment.

What I had a problem with was the introduction and the regular references back to what I considered a grossly inappropriate emotional appeal made at the start. The commentary came on top of a series of personal photos, backed by a lamenting ballad, and outlined the heart-wrenching plight of "Jane".

Jane was happily married for a few short years...was the centre of her husband's world...had recently welcomed her first child into the world...until one day her world was torn apart by an aviation tragedy which claimed the life of her husband...

I'm a generally emotional guy and this story got to me. I'm passionate about safety and on some level, I want to minimise the number of "Janes" out there.

But her story, and the thousands like it, had absolutely no bearing on the case put forward in the rest of the presentation. In fact, I felt it detracted from the substance of the information presented. After overcoming my tears and quivering chin, I probably bounced back into a super-critical stance as a reaction to the manipulation that had just occurred.

It is very tempting to employ cheap tricks such as these in an effort to increase the impact of one's safety case. But in the long run, it will only hurt it - either by casting doubt on the truth of your conclusion or by turning people against the argument regardless of its overall validity.

I might be getting a little bit more philosophical in the coming months as Mr Dekker and Mr Taleb continue to blow my mind with just culture, complexity, randomness and the black swan - more to come.

Dan Parsons

Dan is an airport operations manager currently working at Queenstown Airport in beautiful New Zealand. His previous roles have included airport and non-process infrastructure operation manager in the mining industry, government inspector with the Civil Aviation Safety Authority and airport trainer. Dan’s special interests include risk management, leadership and process hacks to make running airports easier. 

http://therunwaycentreline.com