
Pilots, presentation skills and preparation: The crash of Air France 447 into the Atlantic – The official French investigative report shows what pilot training and presentation skills have in common

If you’ve attended a workshop or seminar of mine, whether about presentation skills, technology or health, you’ll know that sooner or later I introduce something about commercial aviation.

I have a passion for this, and partly earn my living working with patients who suffer from a fear of flying – a fear which affects a significant number of people and cuts across profession, gender, intelligence and age, among other variables.

Above, you will see the cover of the just-published technical report into the causes of the total loss of an Air France Airbus A330 a little over three years ago, while it was making its way from Rio de Janeiro to Paris. There were no survivors; the aircraft was lost over the Atlantic after entering a quite severe weather system.

Very modern and ultra-reliable aircraft like the A330 don’t just “fall out of the sky”, and its mysterious loss was compounded by the difficulty of locating the tell-tale cockpit voice recorder (CVR) and the flight data recorder – the “black box” which records the aircraft’s performance across the whole flight. These two devices, when recovered, usually enable investigators to piece together likely causative factors; combined with the recovery of as many aircraft parts (and human remains, for that matter) as possible, they allow a plausible story to be assembled.

Sometimes, mechanical failure is the sole cause of a catastrophic loss. But it is rare for there to be no human involvement in such failures, whether on the flight deck, in a radar facility, in an engineering maintenance facility, in the original manufacturer’s assembly plant (including all the sub-contractors supplying parts), or in an airline’s navigation and flight planning department – any of which can contribute to the ultimate loss of an aircraft.

Each of these involves humans, and so the field of Human Factors is an important one in the prevention of incidents, in the training of personnel, and in helping to explain when all seems to fail, as it did with Air France 447 in 2009. Total hull losses of modern aircraft are very rare when pitched against the total number of flights undertaken each year. Indeed, as I show my patients on my iPad (projected onto my wall in a 6ft x 6ft picture via an AppleTV), there are thousands of aircraft in operation at any one time, 24/7. The app I use is FlightRadar24 ($2.99), which shows traffic, airport information, aircraft routes and various aircraft parameters in real time. You can see what it looks like below:

Notice in the very bottom right corner it says: “Showing 72 of 2063”.

This is because the app has detected 2063 flights worldwide – at 3:30pm AEST, which is early morning in Europe. I usually tell my patients that at any one time there are, on average, 7000 commercial aircraft in the air, en route between their departure and destination points. Essentially, commercial aviation is the safest form of mass transport, after elevators and escalators.

For some patients, this is comforting; for others it’s not where the action is. But when events like AF447 occur, everyone in commercial aviation sits up and pays attention, especially airlines which operate the same aircraft type. It’s an anxious time, particularly when it’s the first total loss of an aircraft type, for fear some design fault or build issue has finally shown its ugly face, potentially meaning the entire fleet of such aircraft, across all airlines, needs to be grounded.

I recall working with Qantas pilots in Sydney in the months after AF447, and later visiting their A330 simulators, where I learnt that the world’s airlines, including Qantas, were attempting to simulate the known events of the Air France loss, to tease out the potential contributions of the pilots versus the aircraft’s systems. Here’s me at the simulation centre near Sydney International Airport:

This was about a year after the loss, and only speculation could be entertained, as the recorders had not yet been recovered.

Ultimately, using very advanced and expensive equipment, searchers recovered wreckage and the tell-tale recorders from miles beneath the surface of the Atlantic, and investigators began to meticulously piece together the events contributing to AF447’s final moments.

I’ve downloaded the French investigators’ (BEA) report, and it is a very detailed, technical document which operators of A330s will be poring over this weekend.

One of the aviation blogs I read regularly (Flightblogger) captured my attention with its current entry, reporting on the investigative outcome:

Two short paragraphs of the Air France AF447 investigation report offer an (sic) curious insight into the brain’s response to aural alarm signals – and might go some way to explain not just the crew’s failure to recognise the A330’s stall but why terrain-warning systems sometimes seem to bark at pilots to ‘pull up’ in vain.

Stall warnings on the ill-fated Airbus sounded continuously for 54 seconds. But the inquiry report, sourcing seven different research papers, states that aural warnings demand the use of cognitive resources already engaged during periods of high workload.

“The ability to turn one’s attention to this [aural] information is very wasteful,” the analysis says, adding that the rarity – and even “aggressive nature” – of such warnings might lead to their being ignored.

Studies on visual-auditory conflict, it states, show a “natural tendency” to favour visual over auditory perception when information acquired by both senses appears to be contradictory.

“Piloting, calling heavily on visual activity, could lead pilots to a type of auditory insensitivity to the appearance of aural warnings that are rare and in contradiction with cockpit information,” the analysis adds. Visual-auditory conflict during heavy workload translates into “attention selectivity” which accepts visual information but disregards critical aural warnings.

Those of you who have been to a Presentation Magic workshop will acknowledge almost instantly why this sub-section leapt off the page at me.

In the course of the workshop, attendees learn about the multimedia theory of persuasive presenting, drawing on research from the field of affective neuroscience to support this understanding.

It follows a model of Don Norman – a psychologist and engineer, formerly an Apple Fellow and now part of the Nielsen Norman Group – who speaks of our emotional relationships with technology.

He reminds us that we have at least three ways of interacting with and understanding the world outside ourselves. Here is the final slide I use, built up by discussing each of the three elements you see below in turn:

It is Norman’s plea to industrial and software designers (link to his 2003 TED talk) to take all three into account when designing everyday things for humans to use. Technologies that attend to these elements, such as the iPhone and iPad, become indispensable to and beloved by their owners.

When we drop down a level from thinking and planning – a top-of-the-brain phenomenon, literally – we use our senses to make sense of the world. For humans, the sense we rely on most is vision: somewhere between 40% and 60% of the brain’s real estate is devoted to processing visual cues. Think of all the things we do with vision. We detect:

1. Size and difference in size between objects

2. Distance – whether one object is further away or closer than another

3. Speed – how fast an object is travelling, a combination of 1. and 2. above

4. Colour

5. Motion – coming closer, or moving away from us. In aviation, this is helped by colour, because a silhouetted aircraft can fool us as to its direction. So the left (port) wing tip has a red light and the right (starboard) has a green light, remembered by the mnemonic, “There is no red port left.”

6. Transparency – we can see the spatial orientation of objects behind or in front of one another.

7. Texture – our eyes pick up edges, smooth areas, folds, etc., and our brains assign meaning to these elements in a haptic fashion, so we can anticipate what an object will feel like when we run our hands over it.

8. Sameness or likeness, such as with faces. There is a region of the brain, the fusiform gyrus, whose task is to recognise faces. See neuroscientist Vilayanur Ramachandran’s fascinating TED talk for what happens when this area is damaged.

9. Balance – our visual system works intimately with our vestibular system so we know what’s up or down, and where we are in space at any moment. Disagreement between the two senses – which is what happens when you’re momentarily weightless on a rollercoaster or in a fast-moving elevator, or when you’re sitting in the back seat of a car with limited vision, or on a boat in heavy swells – will usually leave you feeling very uncomfortable and nauseous.

Our other senses – hearing, taste, touch and smell – are all important too, but not at the level of the visual sense. On the other hand, in terms of priorities, your dog makes sense of its world in this order: smell, hearing, vision.

Dogs have nostrils which smell in stereo, so they can locate the source of a scent present in only tiny amounts. In working dogs, the outer ears, the pinnae, can swivel independently, acting as funnels to add extra directional detection too.

If you go back to the Air France investigative report, it confirms what I’ve been teaching in Presentation Magic workshops: give priority to the visual display of information in a timely manner, build up a complex story gradually so as not to overload the viewer, and keep words on the slide to a useful minimum. Audiences don’t just read words (more quickly than you can physically say them); they also sub-vocalise them, so they “hear” the words. This automatic process can be put in competition with the visual sense, and a mixed or diluted message is perceived.

Here’s what the Air France report said (p.105):

In addition, studies on the visual-auditory conflict show a natural tendency to favour visual to auditory perception when information that is contradictory and conflicting, or seen as such, of both senses is presented [4, 5, and 6]. Piloting, calling heavily on visual activity, could lead pilots to a type of auditory insensitivity to the appearance of aural warnings that are rare and in contradiction with cockpit information. A recent study in electrophysiology on a piloting task seems to confirm that the appearance of such visual-auditory conflicts in a heavy workload situation translates into an attention selectivity mechanism that favours visual information and leads to disregarding critical aural warnings [7].

If we generalise this to presentations – if only the pilots and their trainers had attended some presentation training featuring sections on affective neuroscience – it reminds us once more to stop piling words onto slides, or too many pictures for that matter, because we unwittingly push our audiences into cognitive overload. A narrative flow of ideas, using both the spoken word and images consistent with those words, minimises overload and allows for greater information management.

In aviation, that means the crew stays on task, maintaining what’s called “situational awareness”; in presentations, it means your audience stays engaged, curious and likely to eventually “connect the dots” in a meaningful way.

Personal connection

I first learnt of the primacy of human factors in commercial aviation following the total loss of an Air New Zealand DC-10 on the Antarctic plateau, when in 1979 this most sophisticated aircraft of its day crashed into Mt. Erebus, an active volcano.

This was despite the presence of a very experienced crew, and of commentators on board with significant experience of the region.

This total loss has proved significant in many ways. The New Zealand Royal Commission, led by Justice Mahon, became a model for such investigations. The initial findings of the nation’s Chief Investigator, whose experience was limited to lighter aircraft and who placed the blame principally on the flight crew, were overturned by the Royal Commission, which (to make a complex story rather simple) found that the correction of a long-standing navigation error was itself the error, placing the aircraft on a direct path to the 12,448-foot, snow-covered mountain.

(An aside – some hold Mahon’s investigation in high regard as a worldwide turning point in hull-loss investigations incorporating human factors. The Erebus disaster also shifted the field of traumatology for rescue workers, examining how such workers can be helped to recover from the awful sights and smells they witness. The key psychologist who investigated this is New Zealander Tony Taylor.)

Unaware of their true situation – the pilots likely believed they were 20 miles from the mountain, as per their briefing and charts – the crew continued the aircraft down its plotted course. The trained observers looking out the flight deck windows did not see anything out of the ordinary, an example of confirmation bias: we see what we expect to see…

But two questions ought to leap out at you if you do not know the story of this crash on Mt. Erebus in 1979.

1. Why didn’t the pilots see the mountain?

2. Why didn’t the radar warn them?

To answer 1., again rather simply, it appears a polar optical phenomenon called whiteout was present at the time, creating an illusion of cloud ahead and obscuring the mountain. Unless they knew a 12,448-foot mountain was dead ahead, the flight crew would have been deceived by this polar optical illusion into believing that all was safe and that they remained 20 miles from a direct line to the mountain. Again, they saw what they expected to see. This is because our eyes are merely data detectors: the data is sorted into useful information in our brains, where it is compared with known past events, and sense is made of it.

The Royal Commission called numerous experts to give evidence, but the evidence on whiteout was given by Ross Day, who was my psychology professor in my undergraduate days. He was an experimental psychologist with an interest in perceptual illusions, and his evidence was convincing. When Justice Mahon visited the crash site on the first anniversary of the incident, he serendipitously witnessed the whiteout phenomenon on board a Hercules aircraft, confirming the illusion Professor Day had described. In 1974, when I was Professor Day’s first-year student participating in his lectures and experiments, I had no idea that what he was teaching me – dry experimental science – would one day become a subset of what I teach others in Presentation Magic workshops.

To answer 2., one needs to know that the Bendix weather radar on board the DC-10 had two “mapping” modes. In one, with the nose-located radar aimed downwards, it scans the terrain below for features. At the 6,000-foot cruising altitude of the DC-10 over Antarctica, it would more likely have been set to locate weather activity directly ahead, looking for the water droplets that indicate rain and thunderstorms – something to be avoided for the comfort of passengers.

So why didn’t the weather radar detect the snow on Mt. Erebus? Because the dense, dry snow covering the mountain would have absorbed the radar signals rather than reflecting them, unlike the individual moving particles of water in rain, hail or falling snow. Mt. Erebus would not have been painted on the radar display unless the flight crew knew to look for it directly in front of them. Eventually, as the aircraft approached the mountain at significant speed, another instrument – the ground proximity warning system – detected low ground clearance, and an automated voice called out “Pull up! Pull up!”. (See the section from the French BEA report, above.)

From the recovery teams’ information, the crew responded as per their training (to judge from the full-power “go around” settings discovered), but to no avail.

With aircraft becoming more and more automated, the tasks of flight crews continue to change, and human factors become ever more important in terms of conditions such as attention, engagement, activation and rehearsal. These bear an uncanny resemblance to the skills presenters need if they are to conduct presentations which engage and make a difference in their audiences’ lives.