Traffic safety is an important issue across the country, and it is the media’s responsibility to report relevant statistics responsibly.
Unfortunately, the media often uses faulty numbers when reporting safety statistics, leading the general public to accept misinformation as fact.
In order to discuss the issue of highway fatalities, it is crucial to understand the differences between the terms “fatalities,” “fatality rate,” and “fatal accident rate.” Each measures highway safety trends with a different degree of accuracy.
Most Accurate (rarely referenced by the media):
The fatal accident rate: The number of fatal accidents on a per-vehicle-mile-driven basis (fatal accidents per 100 million vehicle miles traveled).
Somewhat Accurate (occasionally referenced by the media):
The fatality rate: The number of people killed in automobile accidents on a per-vehicle-mile-driven basis.
Not Accurate (frequently used by the media):
Fatality figures: A simple tally of the number of people killed in automobile accidents.
Reports and studies based on the numbers of fatalities have little merit or meaning within the context of highway safety trends.
Fatality rates and fatal accident rates are a more accurate measure of highway safety trends because they are based on the concept of “exposure.”
A motorist who drives 50,000 miles a year has 10 times the accident exposure of a driver who logs 5,000 miles in a year.
Unfortunately, fatality figures are the numbers most often quoted by the media, insurance industry lobbying groups and even the National Highway Traffic Safety Administration (NHTSA).
Organizations and individuals that quote raw fatality statistics claim “rates” are too complicated for the public to understand. Actually, fatality rates and fatal accident rates are very simple to determine and understand.
A Quick Hypothetical Example:
Assume that in a given year, 20,000 motorists drive 20,000 miles each, for a total of 400 million vehicle miles, and fatal accidents kill eight people. The fatality rate, per 100 million vehicle miles traveled (the recognized standard for measuring fatalities), is therefore eight deaths divided by four such units, or two.
The next year, the number of drivers doubles to 40,000, with each of them driving 20,000 miles per year. Everything else being equal, it makes sense that 16 people will die in accidents. However, the fatality rate remains identical: Two deaths per 100 million vehicle miles traveled.
Although fatalities doubled, the highways were equally safe in each of those years.
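The arithmetic behind the hypothetical can be sketched in a few lines of Python (the function name is ours, not part of any official formula, but the calculation follows the standard deaths-per-100-million-VMT definition used above):

```python
def fatality_rate(deaths, drivers, miles_per_driver):
    """Deaths per 100 million vehicle miles traveled (VMT)."""
    total_miles = drivers * miles_per_driver
    return deaths / (total_miles / 100_000_000)

# Year 1: 20,000 drivers, 20,000 miles each, 8 deaths
print(fatality_rate(8, 20_000, 20_000))   # 2.0

# Year 2: drivers and deaths both double; the rate is unchanged
print(fatality_rate(16, 40_000, 20_000))  # 2.0
```

Doubling both exposure and deaths leaves the rate untouched, which is exactly why raw fatality counts mislead.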
In this hypothetical situation, the public’s key source of information on highway safety — the media — will likely quote raw fatality numbers given by self-proclaimed “safety” organizations, with headlines such as, “Traffic Fatalities Double From 1995 to 1996!”
Although accurate, the statement is completely misleading.
It also shows how comparing percentage changes in raw numbers, particularly when examining small populations (such as individual states), can be extremely misleading.
For example, from 1995 to 1996, Missouri experienced an increase in fatalities from 1,109 to 1,149, a 3.6-percent increase. Yet highways were equally safe in both of those years, as is shown by the fatality rate, which remained the same, at 1.9.
Nationally, although fatalities increased by 90 from 1995 to 1996, from 41,817 to 41,907, the fatality rate actually declined.
So, while the public saw headlines that read, “Fatalities increase . . .,” the media, unknowingly, failed to tell the public that the fatality rate dropped 2.2 percent, from 1.726 to 1.688.
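The two national figures cited above can be checked with the same simple arithmetic (a sketch using only the numbers given in the article):

```python
# National figures, 1995 vs. 1996, as cited above
deaths_1995, deaths_1996 = 41_817, 41_907
rate_1995, rate_1996 = 1.726, 1.688  # deaths per 100 million VMT

deaths_change = (deaths_1996 - deaths_1995) / deaths_1995 * 100
rate_change = (rate_1995 - rate_1996) / rate_1995 * 100

# Fatalities rose about 0.2 percent, while the rate fell about 2.2 percent
print(f"Fatalities: +{deaths_change:.1f}%  Rate: -{rate_change:.1f}%")
```

The count and the rate moved in opposite directions, which is the story the headlines missed.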
In reality, for the individual motorist, the highways had become safer!