Posted April 16, 2013
In 2012, the Naval Safety Center received 15,677 mishap reports from fleet units. Is that a lot? Yes. Does that mean Sailors and Marines had exactly 15,677 mishaps? Not even close. Was every report complete and informative? Now you’re just being ridiculous.
Given the answers to the last two questions, does the Naval Safety Center still want all of the reports that are required? Yes. And it’s not that we’re gluttons for punishment and enjoy scratching our heads about what some of the reports include (or don’t include). We just hate squishy data. You should too.
Granted, your basic, non-headline-worthy mishap report (minimal lost time, inexpensive) would seem to be of direct interest to only a few people. First, the person who suffered the pain and/or embarrassment. Second, the person who had to fill out the report (and who thereby experienced a different kind of pain). Third, the IT and data staff here at the Naval Safety Center who regurgitate statistics, studies and updates. Fourth, me, who has to root through thousands of reports to find promising raw material for the Summary of Mishaps. Finally, local safety specialists whose job is mishap prevention.
This somewhat limited audience is no reason to avoid reporting mishaps. Neither is a perceived lack of attention. If you think “nobody reads them,” ask the new JO ground-safety officer who wrote the following in a recent WESS report: “ATAN cut herself on servicing equipment and was instructed to no longer cut herself on said equipment. Recommend all maintainers work to not cut themselves.” This incisive investigation earned an unexpected visit and counseling from one of our O-5s (who happened to be nearby).
So let’s dispense with the idea that you don’t need to report mishaps because nobody pays attention to them. We’ll call this Bogus Reason #1. Here are a few more.
BR #2: The report asks for too much information. OK, this is only partly bogus, because the basic WESS report does ask for a lot, and not every data point is equally crucial. However, the data points are carefully chosen. You need as much data as possible in order to analyze trends. When the true role of human error in mishaps became clear, there was an amazing amount of potentially meaningful data that we hadn’t been collecting.
BR #3: The mishap was embarrassing and makes the chain of command look bad. Yes, some of the mishaps are embarrassing. ‘Fess up, take the heat, and learn to avoid something similar next time. Maybe a mishap does make the command look bad, but lying about mishaps or hiding them makes a command look even worse. Note that leaders must fight the temptation to shoot the messenger, even for things that aren’t exactly great headwork. If you are a leader, reward honesty, don’t punish it.
BR #4: When you report mishaps, you are keeping the Navy or Marine Corps from meeting their mishap-prevention goals. This line of reasoning shows the flaw behind real or perceived quotas. Those of you who, like me, worked through the rise and fall of Total Quality Management remember that #11 of Deming’s 14 points was the elimination of numerical quotas for the workforce and numerical goals for management. The problem with quotas and goals is that they focus on outcomes rather than the processes that lead to those outcomes. If a process is stable, the outcome won’t change because of an arbitrary goal. If the process is unstable, the results are unpredictable, producing sporadic bouts of “OMG, the sky is falling!” and “Hooray, we’re doing great, wish we knew why!”
Like it or not, we need all of the data. If we’re only getting part of it, how can we tell when the Navy and Marine Corps are doing better at preventing mishaps? If an improved safety culture results in more honest mishap reporting, it might seem like things are getting worse instead of better. That wouldn’t be cool.