Two Deadly Accidents On US Destroyers, One Big Lesson In Human Factors

The summer of 2017 was not a good one for the US Navy, and it holds an important lesson in human factors. On June 17, 2017, the USS Fitzgerald collided with the container ship MV ACX Crystal off the coast of Japan. Seven crew members were killed, several were injured (including the ship’s commanding officer), and the destroyer sustained $367 million in damage.

In the aftermath, Rear Adm. Brian Fort submitted an internal probe to the Navy. Its details were shocking. The Navy Times reported “routine, almost casual, violations of standing orders on a Fitz bridge that often lacked skippers and executive officers, even during potentially dangerous voyages at night through busy waterways.” The probe also described personal distrust between the officer of the deck, Lt. j.g. Sarah Coppock, and the combat information center (CIC), which led her to avoid communicating with the CIC while the destroyer was crossing the shipping superhighway.

The Navy Times shared that “when Fort walked into the trash-strewn CIC in the wake of the disaster, he was hit with the acrid smell of urine. He saw kettlebells on the deck and bottles filled with pee. Some radar controls didn’t work and he soon discovered crew members who didn’t know how to use them anyway.” Fort also found that the Voyage Management System (VMS), designed to help watchstanders navigate without paper charts, generated more trouble calls than any other piece of electronic navigational equipment. The VMS station in the skipper’s quarters had stopped working properly, so sailors were cannibalizing it for parts to keep the rest of the system running.

In his conclusion, Fort told the Navy that the Fitzgerald’s crew suffered from low morale, that the destroyer was overseen by a dysfunctional chiefs’ mess, and that sailors were exhausted, with very little time to train or complete critical certifications. Obviously, right? WRONG!

As a result of these findings, the Navy relieved two senior officers and the senior enlisted sailor of their duties. Twelve more sailors were slated for discipline over failures that led to the collision, though at least one has since been cleared of negligent homicide. You are probably catching on to our point: the organization, the Navy, took no responsibility for its role in the disaster. There was no mention of design deficiencies. In the Navy’s view, the evidence clearly pointed to human error as a root cause. But what do we know about root cause and human error? The human is never the cause… they are the recipient of problems originating much further upstream, whether organizational or design-related.

As human factors experts, far too often we see humans being blamed when disasters like this occur. The term “human error” is casually used to assert that people are the root cause, rather than as a signal that there are root causes to be addressed within the system or the organization. In the case of the Fitzgerald (and in most disasters), human error is simply not the root cause. To find the real root cause, we must ask: why the human error? Why was the bridge lacking skippers and executive officers during dangerous voyages? Why was there personal distrust between an officer and the CIC? Why were there urine and trash in the CIC? Why didn’t crew members know how to use the radar controls? And why, oh why, was there a broken VMS station that forced sailors to “rig” parts just to keep the system working?

Clearly, many breakdowns led to the accident. What many organizations fail to recognize is that safety begins and ends with the human being, specifically with understanding their limitations and capabilities. Human factors is the discipline that deals directly with human safety and the prevention of human error. As an integrated part of the team, human factors experts could have addressed the safety concerns embedded in the ship’s processes, policies, and procedures. Specifically, they could have optimized workflows to mitigate risk and augment crew capabilities, reducing the likelihood of human error and safety incidents. The same applies to the design of the ship itself, which must support a safe environment for everyone onboard. Human factors experts would have examined all the parts of the system and how each component worked together, identified where risks were being introduced, and determined how they could be prevented. They would have helped ensure that users were never handed the responsibility of adapting to, or overcoming, a bad design.

Had these issues been addressed immediately, or never been present in the first place, a second deadly collision would likely not have occurred just two months later. On August 21, 2017, the USS John S. McCain collided with the Liberian vessel Alnic MC in the Singapore Strait. Ten crew members on the McCain were killed and five were injured. A subsequent safety investigation found that the McCain made a sudden turn into the path of the Alnic MC after a series of missteps that followed a transfer of propulsion controls. The crew of the Alnic MC saw the McCain turning and assumed it would pass safely. Within three minutes, the ships had collided. This raises the obvious question: who was on the safety team for the first accident, and what expertise was called upon to conduct the root cause analysis? Far too often, a human factors expert is not brought onto the safety team to conduct the root cause analysis. While we don’t know for sure who was on the Navy’s safety team, this is a common missing link in root cause analysis.

Considering the second event, we will again ask the very important question: why? The Navy may have answered it indirectly earlier this month. “US Navy to ditch touch screen ship controls,” the headlines recently read.

Two years after the deadly Fitzgerald and McCain incidents, the Navy has announced that both were preventable and resulted from multiple failures. It pointed out that sailors on both destroyers were not properly trained to use the touchscreen controls on board. An investigation found that the touchscreen throttle-control system was overly complex and that, combined with training deficiencies, it led to a loss of control. The Navy has now decided to revert ships in this class to a physical throttle and traditional helm control system within the next 18 to 24 months. This, after the fleet overwhelmingly said it preferred mechanical controls to touchscreens in the first place. But notice once again that the Navy points to the users as a root cause, claiming they weren’t properly trained. Unfortunately, this is one of the worst causal explanations on offer, because training will never overcome a bad design. All it accomplishes is shifting blame to the user and away from the organization.

We must ask why sailors were using new technology they didn’t have a real need for in the first place. Technology should never be created or integrated just for the sake of it. Just because something is the “latest and greatest” (like a touchscreen control) doesn’t mean it’s better, or safer, than the old way of getting the job done. Users must have a need for the product. It must solve problems for users… not create them. It must function seamlessly and easily while being accessible and usable for everyone. This is the very core principle of Human-Centered Design. A user’s inability to use something successfully is never rooted in a lack of training; it is rooted in the design. We would love to be able to help the Navy so that no more lives are lost to a lack of human factors.
