We don’t live in a black and white world. Reality has a whole messy collection of grays too, not to mention stripes and plaids. And while we like simple answers that cut through the morass, real life doesn’t always provide them.

On Thursday the National Highway Traffic Safety Administration (NHTSA, pronounced “nit-sah”) announced plans to investigate the May 7 fatality of a Florida man who was behind the wheel of – but not driving – a Tesla Model S. The car has an Autopilot feature that allows it to take full control of highway driving, and at the time of the accident, the car was in control.

So is Tesla at fault? The real answers are far from black and white.

Tesla’s Autopilot feature is a “beta” that’s disabled every time you turn the car off. This driver (and every driver who wants the feature) had to turn it on and click through the warnings. And there are many warnings. Among them is this one:

Warning: Traffic-Aware Cruise Control cannot detect all objects and may not detect a stationary vehicle or other object in the lane of travel. There may be situations in which Traffic-Aware Cruise Control does not detect a vehicle, bicycle, or pedestrian. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death.

Maybe the driver is responsible. That warning is pretty clear – but disclaimers are just that: disclaimers. You don’t get to absolve yourself of responsibility simply because you post a note saying you aren’t responsible. If a restaurant hung a sign saying “eat at your own risk,” would it be off the hook for food poisoning?

That said, what does “beta” mean in this context? Cars aren’t computers. We’re fine dealing with “beta” software on a computer, where crashes are as frequent as unpopped kernels in a bag of popcorn. Crashes on the highway don’t lead to rebooting; they lead to twisted metal. Simply by dint of the potential outcomes, unfinished software shouldn’t be released to drivers.

A note on Tesla’s website carries more than a tinge of defensiveness, as though a project manager at the company is already preparing to be excoriated for the death. The blog post is titled “A Tragic Loss,” but it opens not with notes of sadness but with this comment on the incidence of collisions:

“This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.”

It’s as if the company were saying, “Hey, we didn’t do it! Lots of people die every year!!” Only in the final paragraph of the note does the company acknowledge that “the customer who died in this crash had a loving family and we are beyond saddened by their loss… We would like to extend our deepest sympathies to his family and friends.”

David Weinberger, a senior researcher at Harvard’s Berkman Center, wrote an essay for us last year titled, “Should your self-driving car kill you to save a schoolbus full of kids?”

Today’s cars should do all they can to preserve the life of the driver and passenger, he argued, because that’s about as far as today’s tech can go. In the future, when cars are completely networked, they’ll know all about their passengers as well — at which point cars will need to make moral judgments.

Imagine this scenario: Two autonomous cars are about to crash, and the computers driving can save either one, but not both. One has a 25-year-old mother in it. The other has a 70-year-old childless man in it. Do we program our cars to always prefer the life of someone young? Of a parent? Do we give extra weight to the life of a medical worker beginning a journey to an Ebola-stricken area, or a renowned violinist, or a promising scientist, or a beloved children’s author?

But it’s not Tesla’s fault, at least not completely. When Tesla enabled the Autopilot feature, people immediately posted videos of themselves jumping into the backseat while the car steered down the highway. One man was caught napping behind the wheel of his Tesla as the car blithely drove itself down the highway. Even in a fully autonomous vehicle – which Tesla doesn’t claim to manufacture – we should be awake and alert as 5,000 pounds of steel, leather, and batteries zips us along at 80 miles per hour.

Cars aren’t toys, and cars that can steer themselves and avoid obstacles shouldn’t turn us into passengers or children.

Then there’s the driver’s record: he had received eight speeding tickets in six years. In theory, a self-driving car could turn him into a better driver, one who obeys speed limits and doesn’t change lanes recklessly. That’s in the future, of course, when cars are fully autonomous. Today’s cars are hardly smart enough.

Perhaps the trillion-dollar question in this case – “Is it Tesla’s fault?” – should be rephrased as, “How do you deal with human nature?”

It’s inevitable that people will act recklessly – the videos of people pulling stupid stunts are evidence of that. How do self-driving cars (and the people who program them) deal with that? Google has said it wants to make its cars drive more like humans. After all, human drivers expect other vehicles on the road to act as they would – and humans aren’t particularly good drivers. Imagine if the car in front of you came to a full stop at a yellow light, as it’s supposed to, rather than tearing through as you would. Would that catch you by surprise? A car that anticipates human foibles – one that knows enough to accelerate through a yellow light – may actually reduce accidents.

The ultimate point of self-driving vehicles is just that: reducing accidents. Call them what they really are: collisions, and usually avoidable ones at that. More than a million people die every year in vehicle crashes, and the vast majority of them are caused simply because humans are human. We look at cell phones. We get distracted by others, our own thoughts, the radio, passing street signs, UFOs, whatever.

While this incident was a tragedy, it shouldn’t detract from the larger goal of reducing vehicular deaths. If designed right, computers will be much better drivers than we are – they never tire, they don’t get distracted, they come to a full stop and respect yellow lights. The road to complete autonomy for cars is potholed and full of obstacles. But let’s keep the destination in our sights.
