
Brief Airings, Big Bearings

Writer: SQSQ

2025 appears to be in the running for one of the worst years in aviation, or at least that’s what it feels like from all the media reports. On 29 January, American Airlines Flight 5342 was about to land at Ronald Reagan Washington National Airport when it collided mid-air with a US Army Black Hawk helicopter. Nobody survived.



Two days later, on 31 January, Med Jets Flight 056, a medevac flight with six souls on board including a pediatric patient, crashed violently about 30 seconds after take-off. Nobody on board survived, and a driver whose car was caught in the crash died too.


On 6 February, Bering Air Flight 445, a nine-passenger Cessna 208B Grand Caravan operating as a scheduled domestic flight in Alaska, crashed after covering three-quarters of its 147-mile journey. Nobody survived.


On 17 February, Delta Connection Flight 4819, flying from Minneapolis, USA to Toronto, Canada, crashed upon landing; its right wing snapped off before the aircraft overturned and came to a halt upside down. Everyone survived.


Each of these tragedies is now under investigation, and we won’t speculate on the causes. But moments like these offer lessons and reflections for anyone managing risk, leading teams, or making high-stakes decisions.


The Perils of Vague Communication


When media outlets analyzed the Black Hawk collision in Washington, DC, one air traffic control (ATC) transmission stood out: "do you have the CRJ in sight?" It suggested that the helicopter pilots may not have noticed the correct airplane in the sky. To be fair, 40 seconds earlier ATC had provided a detailed description: "Traffic just south of the Woodrow Wilson Bridge, a CRJ, it's 1,200 feet setting up for Runway 33."


Yet, in a busy, dark sky where aircraft appear as nothing more than bright lights (even more so through night vision goggles, as SingHealth's resident ex-helicopter pilot Joseph Lim would attest), the challenge of visually acquiring the correct target remains.


Vague communication creates uncertainty, and uncertainty invites mistakes. Consider these ambiguous phrases:


"Closely monitor this patient."--What exactly are we watching for? How much more closely are we monitoring compared to our usual monitoring protocol?


"Did you give the patient fluids?"--You meant intravenous. She was adamant she did not orally hydrate the patient.


"Cut the meat at an angle."--This was an instruction that, in my case, ended with someone handing me the knife to demonstrate what I meant.


The issue lies in perspective. When we ask questions, we assume others see the world as we do. They don’t. As a result, a single vague phrase or question can be the difference between clarity and catastrophe. Do the questions you ask seem clear to you because of what you know, yet sound ambiguous to everyone else's ears? In the rush of daily life, how easily do we wrongly conclude that we’re seeing the same thing when we aren’t, without even realizing it?


If clarity matters, don’t assume. Be specific.


Why the Right Components Must Break



It was fortunate that Delta Connection Flight 4819's right wing broke away from the fuselage, and that the plane skidded clear of the fireball. The wing box is actually one of the strongest parts of a commercial plane, connecting the fuselage to the wings and the main landing gear. Yet while these structures are made extra rigid to withstand the stresses of flight, other design features are deliberately meant to fail in order to ensure safety.


In engineering, a sacrificial component is intentionally designed to fail first under excessive mechanical or electrical stress, so that other parts of the system downstream remain undamaged. Think electrical fuses and crumple zones in cars. In healthcare, many patient gowns are designed to be easily torn away for CPR and other emergency procedures.
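The same idea shows up in software design. Below is a minimal sketch of a "fuse" in code: a circuit breaker that deliberately fails fast after repeated errors so the fragile service downstream is spared further stress. The class and parameter names here are our own illustration, not any real library's API.

import time

class CircuitBreaker:
    """A software 'sacrificial component': after too many failures it
    'blows' and fails fast, shielding the downstream system until a
    cool-down period has passed."""

    def __init__(self, max_failures=3, reset_after_s=30.0):
        self.max_failures = max_failures
        self.reset_after_s = reset_after_s
        self.failures = 0
        self.opened_at = None  # set to a timestamp when the breaker blows

    def call(self, fn, *args, **kwargs):
        if self.opened_at is not None:
            # While open, fail immediately instead of hammering the service.
            if time.monotonic() - self.opened_at < self.reset_after_s:
                raise RuntimeError("circuit open: failing fast")
            # Cool-down elapsed: reset and allow a trial call through.
            self.opened_at = None
            self.failures = 0
        try:
            result = fn(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()  # the "fuse" blows
            raise
        self.failures = 0  # any success resets the failure count
        return result

Like an electrical fuse, the breaker is the one part that is allowed to fail, converting an uncontrolled cascade into a controlled, predictable failure mode.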


A sacrificial part is a safety feature that allows failure to happen in a controlled, predictable, and safe manner. In some companies, this may mean financial reserves that absorb shocks without crippling the business. Airplane landing gear, of course, is designed to be extremely strong and robust, and is not supposed to be a breakaway or frangible part (although transport aircraft do use fuse pins so that, under extreme overload, the gear separates cleanly without rupturing the fuel tanks).


Do Your People Fight, Flee, or Freeze During a Crisis?


So, parts of the plane were in flames, and the plane itself was upside down. How, then, did everyone on Flight 4819 survive the horrifying crash? One major contributor was the way everyone responded: "Everybody was helping their neighbor", the plane was fully evacuated within minutes, and multiple emergency services were already on site by the time the plane came to a halt. Yet another "textbook response", to quote the Toronto airport's CEO.


We typically respond in one of three ways under stress: fight, flight, or freeze (or fawn, which is irrelevant here). Our instincts might have us confront the threat directly, or distance ourselves from it as fast as possible. Our bodies adapt physiologically to support whichever attempt we make to resolve the threat.


Some of us might instinctively freeze, stunned like a vegetable in the midst of a dangerous situation while our brains suffer from analysis paralysis. Physiologically, rather than breathing faster to take in more oxygen, we hold our breath or restrict our breathing. Freezing in the face of a dire situation is the worst possible response.


Our responses can also be understood using Jens Rasmussen's skills-rules-knowledge framework. Also known as the SRK model, it illustrates the cognitive processes and effort involved when people encounter situations of varying familiarity. Skill-based behaviors happen automatically, like driving a car without thinking about every action. Rule-based behaviors rely on learned procedures once we recognize a situation (e.g., recalling DRS ABCD upon discovering a collapsed person). Knowledge-based behaviors occur when situations are novel, and we are left to integrate the details and figure out the next steps.


The skill-, rule- and knowledge-based model (adapted from Rasmussen, 1986).
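As a loose analogy only (a toy sketch, not a validated cognitive model; the situations and responses below are invented for illustration), the SRK ladder behaves like tiered dispatch in code: try the automatic, skill-based response first, fall back to a stored rule when the situation is merely recognized, and resort to slow, knowledge-based reasoning only when nothing matches.

# Toy sketch of Rasmussen's SRK ladder as tiered dispatch (illustrative only).

SKILLS = {"drifting_out_of_lane": "micro-correct steering"}  # automatic, no deliberation
RULES = {"collapsed_person": "run through DRS ABCD"}         # recognized -> stored procedure

def respond(situation: str) -> str:
    if situation in SKILLS:   # skill-based: fast and effortless
        return SKILLS[situation]
    if situation in RULES:    # rule-based: recognize, then apply the learned procedure
        return RULES[situation]
    # knowledge-based: novel situation, slow and effortful reasoning
    return f"stop and reason from first principles about '{situation}'"

print(respond("collapsed_person"))     # -> run through DRS ABCD
print(respond("upside-down aircraft")) # -> falls through to knowledge-based reasoning

Training, in this framing, is the act of moving responses down the ladder, so that fewer crises land in the slow, knowledge-based branch.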

In high-stakes situations, minimizing reliance on knowledge-based responses through training can improve decision-making and prevent delays. The aviation industry knows this, and drills emergency procedures relentlessly. Pilots, crew, and ground staff repeat crisis scenarios until correct action becomes second nature. The goal is to move from slow, effortful problem-solving (which fails under stress) to automatic, "muscle-memory" reactions that save lives.


During a crisis, thinking slows. That’s why training must replace thinking with instinct. Are there life-saving behaviors we should inculcate as habits? And when we undergo refresher training, do we give it the respect and attention it deserves?




