
THE CHANGING FACE OF SAFETY MEASUREMENT

Safety professionals who lack basic data on their company's safety efforts are not likely to convince management that an intervention requiring a significant cash outlay is necessary or warranted. Phil La Duke of OE Learning explains why they had better do a meticulous job of measuring safety indicators and understanding what these indicators show.

Posted: June 29, 2010


I just returned from Lima, Peru, where I presented at the International Symposium on Mining Safety. My topic, Selling Safety In Tough Times, may not appear to relate to how the measurement of safety has changed, but the two are fundamentally linked.

Safety professionals who lack the basic data on their company's safety efforts are not likely to convince Operations leadership that an intervention, particularly one that requires a significant cash outlay, is necessary or warranted. Safety professionals who want to be respected and have the ear of Operations leadership had better do a meticulous job of measuring safety indicators and understanding what these indicators show.

MEASUREMENT BASICS
There's little dispute that measurement is important, but what should we measure? There are two types of measurements: leading indicators and trailing, or lagging, indicators. Lagging indicators are measurements of events that have already happened. Leading indicators measure activities that tend to correlate with future improvements. More and more safety professionals pride themselves on reducing the number of lagging indicators they track and increasing the number of leading indicators. However, this trend is shortsighted and possibly dangerous.

The information provided by lagging indicators is limited, but useful nonetheless. While many safety professionals decry lagging indicators as "measurements of failure," these indicators can provide very useful information.

For starters, most governmental agencies require at least some lagging indicators: Incident Rates, Days Away or Restricted Time (DART) rates, and Lost Workday Due to Injury (LWDI) rates are all lagging indicators. Even though these figures reflect past performance, they should not be disregarded as irrelevant to an organization's current safety performance; they provide a solid baseline against which incremental progress can be reported.
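To make the arithmetic behind these lagging indicators concrete, here is a minimal sketch of the standard rate calculation, which normalizes case counts to 200,000 hours worked (roughly 100 full-time employees working a full year). The function names and the sample figures are illustrative assumptions, not figures from this article.

# Minimal sketch: computing two common lagging indicators.
# The 200,000-hour base approximates 100 full-time employees
# working 40 hours a week for 50 weeks.

OSHA_HOUR_BASE = 200_000

def incident_rate(recordable_cases: int, hours_worked: float) -> float:
    """Recordable incident rate per 200,000 hours worked."""
    return recordable_cases * OSHA_HOUR_BASE / hours_worked

def dart_rate(dart_cases: int, hours_worked: float) -> float:
    """DART rate: cases involving days away or restricted duty."""
    return dart_cases * OSHA_HOUR_BASE / hours_worked

if __name__ == "__main__":
    # Illustrative figures only: roughly 350 employees, ~700,000 hours in a year.
    hours = 700_000
    print(f"Incident rate: {incident_rate(9, hours):.2f}")  # ~2.57
    print(f"DART rate:     {dart_rate(4, hours):.2f}")      # ~1.14

Tracked period over period, rates like these give the baseline against which incremental progress can be judged.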

That said, proponents of leading indicators do have a point: past performance is typically a poor predictor of future performance. Several years back I was speaking at the National Safety Council, where I made the point that the absence of injuries does not denote the presence of safety. After my speech one of the audience members challenged this point by saying that the dictionary definition of safety was "the lack or absence of injuries." He went on to defend his safety program by stating he had gone three years without an injury. I congratulated him and told him how I envied him, because he could never die in a car crash. Puzzled, he admitted missing the connection.

I explained that the logic that led him to conclude his workplace was safe (i.e., there was little to no chance that an injury would occur) was analogous to believing that because he had never been killed in a car crash he never could be. To continue the analogy, he could have assembled a set of risk-factor measurements to show that his risk of dying in a car crash was minimal: the frequency with which he violates traffic laws, the traffic volume where he drives, the number of miles he drives, and so on. In doing so, he could express his risk in measurable terms. This thinking is the driving force behind leading indicators, and leading indicators, when used correctly, are powerful tools.
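As a rough sketch of what expressing risk in measurable terms could look like, the snippet below combines a few normalized risk factors into a single weighted score. The factors, weights, and values are invented for the driving analogy; this is not a published risk model.

# Illustrative sketch: expressing risk as a weighted score of leading
# indicators, in the spirit of the driving analogy above. All factors,
# weights, and values are assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class RiskFactor:
    name: str
    value: float   # observed measurement, normalized to the range 0..1
    weight: float  # relative contribution to the overall score

def risk_score(factors: list[RiskFactor]) -> float:
    """Weighted average of normalized risk factors (0 = low risk, 1 = high)."""
    total_weight = sum(f.weight for f in factors)
    return sum(f.value * f.weight for f in factors) / total_weight

driving_risk = [
    RiskFactor("traffic-law violations per month", 0.2, 3.0),
    RiskFactor("traffic volume on usual routes",   0.6, 2.0),
    RiskFactor("miles driven per week",            0.4, 1.0),
]

print(f"Composite risk score: {risk_score(driving_risk):.2f}")  # 0.37

The same structure applies to workplace leading indicators: each factor is something you can observe today, and the composite score expresses risk before any injury occurs.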

Leading indicators also have their limitations. For example, one frequently used leading indicator is participation in safety meetings. On the face of it, this would seem to be a reasonable and important indicator of healthy safety activity. Unfortunately, many organizations that use it do a poor job of defining the metric.

Using only quantitative data (such as whether or not a meeting occurred) ignores the quality of the event itself. In this scenario, a meeting attended only by the safety manager would be given the same credence as a meeting that was fully attended. Quantitative data also ignores meetings with no agenda, or a deeply flawed one; meetings that do not stick to the agenda; and meetings that accomplish nothing. Certainly, one could count only those meetings that met fixed criteria, but doing so becomes time consuming and expensive.
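A simple way to picture that trade-off is a filter that counts only meetings meeting fixed quality criteria. The criteria in this sketch (a written agenda, a minimum attendance rate, and at least one closed action item) are illustrative assumptions, and the sketch deliberately ignores the cost of collecting the extra data.

# Sketch: counting only safety meetings that meet fixed quality criteria,
# rather than every meeting that merely occurred. Criteria are assumptions.

from dataclasses import dataclass

@dataclass
class SafetyMeeting:
    attendees: int
    invited: int
    had_agenda: bool
    action_items_closed: int

def counts_as_effective(m: SafetyMeeting) -> bool:
    """A meeting counts only if it had an agenda, 75%+ attendance, and results."""
    attendance = m.attendees / m.invited if m.invited else 0.0
    return m.had_agenda and attendance >= 0.75 and m.action_items_closed > 0

meetings = [
    SafetyMeeting(attendees=1,  invited=12, had_agenda=False, action_items_closed=0),
    SafetyMeeting(attendees=10, invited=12, had_agenda=True,  action_items_closed=3),
]

effective = sum(counts_as_effective(m) for m in meetings)
print(f"{effective} of {len(meetings)} meetings met the quality criteria")

The logic is trivial; the expense lies in gathering and verifying the attendance, agenda, and follow-up data for every meeting.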

YOU GET WHAT YOU MEASURE
Metrics are a tricky business; in real terms, you get what you measure. One of the principal criticisms leveled at behavior-based safety (BBS) and safety incentive systems is that they tend to encourage under-reporting of injuries. BBS proponents are quick to point out that these measurements are not part of their systems and that the poor measurements, not an intrinsic flaw in their methodologies, are the cause of the under-reporting. I would agree.

I remember, years ago, reading about a study in which a call center tried to improve its customer service. When the consultants did a current-state analysis, they found that the performance of call center employees was measured not on the quality of a call but on its length. Because employees were held accountable for the length of the call, not the satisfaction of the customer, they hurried through calls and, in some cases, even hung up on customers mid-call.

The company changed the measurement to include customer satisfaction and the satisfaction scores climbed dramatically. So whatever you decide to measure, you need to carefully consider the outcomes that your measurements could induce.

A LOOK FORWARD
The future of safety metrics is likely to take two tracks. The first is a trend toward fewer, more meaningful measurements that influence safety but have not traditionally been viewed as safety indicators.

For example, manufacturers who are behind in production tend to have employees working out of process, and working out of process dramatically increases the likelihood of injury. These hybrid indicators are good predictors not only of safety outcomes but of other process shortfalls as well. This trend will lead to a closer relationship between safety outcomes and their effects on the business elements most important to Operations.

The second track we are likely to see is a shift toward viewing safety as an expression of risk rather than as the absence of injuries. This trend will be characterized by the use of both lagging and leading indicators. But where these indicators are currently treated as discrete elements, the new view will examine the relationship between them to craft a fuller picture of the state of safety within the organization.
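One simple way to examine that relationship is to correlate a leading indicator with a lagging one over time, as in the sketch below. The indicator names and monthly figures are invented for illustration; real data would call for more months and more careful statistics.

# Sketch: relating a leading indicator (out-of-process work observations)
# to a lagging one (recordable injuries) by month. Figures are invented.

from math import sqrt

def pearson_r(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / sqrt(var_x * var_y)

out_of_process_events = [4, 7, 3, 9, 12, 6]   # leading indicator, by month
recordable_injuries   = [0, 1, 0, 2, 3, 1]    # lagging indicator, by month

r = pearson_r(out_of_process_events, recordable_injuries)
print(f"Correlation between the two series: {r:.2f}")

A strong, stable relationship between a leading indicator and injury outcomes is what turns that indicator from an article of faith into something Operations can act on.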

MAKING SENSE OF IT ALL
Measuring for measurement's sake is pointless and costly, so once you have developed robust metrics, you will have to interpret them and craft recommendations for Operations. This process translates data points into useful information that Operations can put into action. As you design metrics to measure the health of your safety efforts, remember to:

• Keep it simple. The more complex your metric, the more likely you are to introduce statistical noise and unintended consequences.
• Beware of unintended outcomes. Remember, you get what you measure. Be sure you have carefully considered every possible outcome, and take pains to keep both your data and the scope of your metrics clean.
• Link your metrics to outcomes important to Operations.

– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –

Phil La Duke is the director of performance improvement at OE Learning, Inc., 2125 Butterfield, Suite 300N, Troy, MI 48084, 248-816-4400, www.oe.com. For questions or comments on this column, contact Phil at 248-816-4442 or [email protected].
