Jumping to conclusions

“Flashing lights have boosted intersection safety,” proclaimed the headline on the local news site. After four months, flashing red lights on stop signs were declared successful. I was immediately suspicious. In Lincoln, signs are for public relations, and nobody understands intersection safety.

It’s not impossible that a flashing light on top of a stop sign improves safety. But false statements about traffic safety are common. How can we tell?

There are statistical techniques to measure whether safety improved. The short answer is, I don’t know if the intersection is safer, and neither does anybody else.

Here is the big clue that we can’t tell. The police chief “said there was a spate of serious accidents, some with injuries, over a period of several weeks.” So the lights were added.

Anything you do in response to an unusual cluster of accidents is likely to appear to help. That is due to a statistical phenomenon known as “regression to the mean.” If there are three accidents a year from 2010 to 2016 and six in 2017, having three in 2018 is not an improvement. It is a return to normal.
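
Here is a rough simulation of that effect. It assumes, purely for illustration, that accident counts follow a Poisson distribution averaging three per year (the figure used below) and that nothing about the intersection changes from one year to the next.

    import numpy as np

    rng = np.random.default_rng(1)
    # 100,000 simulated intersections, each with a "this year" and "next year"
    # accident count drawn from the same Poisson(3) distribution.
    years = rng.poisson(lam=3, size=(100_000, 2))

    spikes = years[years[:, 0] >= 6]   # a "spate of accidents" triggers a response
    print("Simulated spike years:", len(spikes))
    print("Average accidents the following year:", spikes[:, 1].mean())
    # The second number comes out near 3: the bad year is followed by an
    # ordinary one whether or not anyone installs a flashing light.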

Let’s look at numbers. I wrote before, “I have a list of all reported accidents in my area for the past 27 years.” Combining Massachusetts’ ominously named “crash portal” and a records request for older data, my lists go from 1990 to 2016. (MassDOT hasn’t released 2017 yet.) Data since 2002 is of better quality, so I’ll use only that. A 15-year baseline is good enough.

From 2002 to 2016 there were three multi-vehicle accidents per year on average: two with property damage only and one with an injury.

Over the winter the town put flashing red lights on two stop signs. Somebody likes the results. Let’s assume that means there have been no accidents in the past four months.

What can we learn?

Nothing. The long-term average time between accidents is four months. Four months accident-free is meaningless. It’s a coin flip coming up heads. There have been many longer accident-free periods over the years.
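
The arithmetic behind that coin flip, treating accidents as a Poisson process at the long-term rate of three per year (an assumption, but a reasonable one for rare, independent events):

    import math

    rate_per_year = 3
    months = 4
    p_quiet = math.exp(-rate_per_year * months / 12)
    print(f"Chance of no accidents in {months} months: {p_quiet:.2f}")   # about 0.37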

An accident-free year would be unconvincing. There was one before.
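
Under the same Poisson assumption, a whole year without accidents is only mildly surprising, and over a 15-year baseline there are roughly even odds of seeing at least one such year, which matches the record:

    import math

    p_zero_year = math.exp(-3)                          # about 0.05
    p_one_in_baseline = 1 - (1 - p_zero_year) ** 15     # about 0.54
    print(f"Chance a given year is accident-free: {p_zero_year:.2f}")
    print(f"Chance of at least one such year in 15: {p_one_in_baseline:.2f}")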

A simple technique to decide whether a change is definitely not significant comes out of basic statistics. Think of accidents as independent random events. Then the typical year-to-year variation in the number of accidents should be about the square root of the total number. Any variation close to that must be assumed to be purely random. If three accidents per year become two, there is no meaningful change. If 29,900 traffic deaths per year become 30,100, there is no meaningful change.
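
A quick check of both examples with that rule of thumb, treating yearly counts as independent random events so the typical random swing is about the square root of the count:

    import math

    for before, after in [(3, 2), (29_900, 30_100)]:
        swing = math.sqrt(max(before, after))   # typical random year-to-year variation
        change = abs(after - before)
        print(f"{before} -> {after}: change of {change}, random swing about {swing:.0f}")
    # 3 -> 2: a change of 1 against a swing of about 2 -- pure noise.
    # 29,900 -> 30,100: a change of 200 against a swing of about 173 -- also noise.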

You can use a longer period than a year. At the newly flashing intersection you’ll need to. One paper says you need three years of data to avoid suffering regression-to-the-mean bias.

Once you are no longer certain the data prove nothing, you have to crack open the books. One book is Observational Before-After Studies in Road Safety by Ezra Hauer. Hauer explains why apparent changes might have a different cause than you think. (For example, accidents per vehicle-mile tend to drop as traffic gets heavier.)
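
To make the parenthetical concrete, here is a hypothetical sketch of the traffic-volume confound. The concave crash-versus-volume relationship and every number in it are invented for illustration; the shape of the curve is the point, not the values.

    # Hypothetical numbers: a concave crash-vs-volume relationship means the
    # per-vehicle rate falls as traffic grows, with no change at the intersection.
    a, exponent = 0.001, 0.8        # both values invented for this sketch

    for volume in (5_000, 10_000):  # vehicles per day entering, also invented
        crashes = a * volume ** exponent
        rate = crashes / volume * 1_000_000
        print(f"volume {volume}: {crashes:.1f} crashes/yr, "
              f"{rate:.0f} crashes per million vehicles")
    # Doubling the traffic raises crashes (0.9 -> 1.6) but lowers the rate
    # (182 -> 158), so a falling rate is not by itself evidence of improvement.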

My guess is the lights make no detectable difference. But I don’t know and the police chief doesn’t know either. Nobody will know for several years.

