Commentary
7/2/2016 12:06 PM
Thomas Claburn

Tesla Autopilot Crash Under NHTSA Investigation

The National Highway Traffic Safety Administration is looking into the circumstances surrounding a fatal accident involving a Tesla being driven under Autopilot.


The National Highway Traffic Safety Administration has opened an inquiry into the Autopilot system in Tesla's Model S following the death of a driver who was using the system.

In a statement posted on the Tesla Motors website on June 30, the company acknowledged the inquiry and characterized the incident as "the first known fatality in just over 130 million miles where Autopilot was activated."

The NHTSA said in a statement Tesla had alerted the agency to the crash, which occurred on May 7 in Williston, Fla.

The Levy Journal Online, which covers Levy County, Fla., where the crash occurred, described the accident based on an account provided by the Florida Highway Patrol. A tractor-trailer was traveling west on US 27A and made a left turn onto NE 140 Court as the Tesla was heading in the opposite direction. The Tesla passed underneath the 18-wheeler, its roof colliding with the underside of the trailer, then continued along the road before striking two fences and a utility pole.

(Image: Google)

"Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," Tesla said in its statement. "The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S."

The failure of Tesla's computer vision system to distinguish the truck from the similarly colored sky appears to have been compounded by radar code designed to reduce false positives during automated braking. Asked on Twitter why the Tesla's radar didn't detect what its cameras missed, CEO Elon Musk responded, "Radar tunes out what looks like an overhead road sign to avoid false braking events."
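Musk's reply implies some form of elevation-based veto: radar returns whose geometry suggests an overhead structure are discarded so the car doesn't brake for every bridge and road sign. A minimal sketch of that kind of filter follows; the function name, angles, and thresholds are hypothetical (Tesla's actual logic is not public), but it shows how a high-riding trailer bed could land on the wrong side of the cutoff:

```python
# Hypothetical overhead-object filter for radar returns. All names and
# numbers are illustrative; this is not Tesla's implementation.
import math

def is_overhead(range_m: float, elevation_deg: float,
                sensor_height_m: float = 0.5,
                clearance_threshold_m: float = 3.5) -> bool:
    """Classify a return as an overhead structure (sign, bridge) when its
    estimated height above the road exceeds a clearance threshold."""
    height_m = sensor_height_m + range_m * math.sin(math.radians(elevation_deg))
    return height_m >= clearance_threshold_m

returns = [
    {"range_m": 60.0, "elevation_deg": 4.0},  # overhead sign, ~4.7 m: ignored (correct)
    {"range_m": 60.0, "elevation_deg": 1.0},  # car-height obstacle, ~1.5 m: braked for
    {"range_m": 60.0, "elevation_deg": 3.0},  # high trailer bed, ~3.6 m: ignored (wrong)
]
for r in returns:
    action = "ignore" if is_overhead(**r) else "brake-relevant"
    print(r, "->", action)
```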

The driver of the Model S, identified in media reports as 40-year-old Joshua D. Brown of Canton, Ohio, died at the scene.

The driver of the truck, 62-year-old Frank Baressi, told the Associated Press that Brown was "playing Harry Potter on the TV screen" at the time of the crash.

A spokesperson for the Florida Highway Patrol did not immediately respond to a request to confirm details about the accident.

In its June 30 statement, Tesla said drivers who engage Autopilot are warned to keep both hands on the wheel at all times. Autopilot, despite its name, is intended as an assistive feature rather than an alternative to manual control.

The incident has stoked doubts about the viability of self-driving cars and the maturity of Tesla's technology. Clearly, a computer vision system that cannot separate truck from sky in certain light conditions needs further improvement. It was unclear at press time whether Tesla would face any liability claims related to its code or sensing hardware.

However, Tesla insisted in its statement that, when Autopilot is used under human supervision, "the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving."

(Image: Tesla)

In April, at an event in Norway, Musk said, "The probability of having an accident is 50% lower if you have Autopilot on," according to Electrek.
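Those two figures invite a quick back-of-the-envelope check. Tesla's statement cited one fatality in just over 130 million Autopilot miles, against the US-wide baseline it quoted of roughly one fatality per 94 million vehicle miles. Taking both numbers at face value (a rough exercise: a single event gives a very wide margin of error, and the fleets and road types being compared differ):

```python
# Rough comparison of the per-mile fatality rates cited at the time.
# One Autopilot fatality is far too small a sample for firm conclusions.

autopilot_miles_per_fatality = 130e6  # Tesla's June 30 statement
us_miles_per_fatality = 94e6          # US average cited in the same statement

rate_ratio = us_miles_per_fatality / autopilot_miles_per_fatality
print(f"Fatalities per mile, Autopilot vs. US average: {rate_ratio:.2f}")
# ~0.72, i.e. roughly 28% fewer fatalities per mile -- an improvement if
# the numbers held up, but short of the 50% reduction Musk described.
```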

That may be, but data isn't the only consideration. When human lives are at stake, perception and emotion come into play. Automated driving systems will have to be demonstrably better than human drivers before people trust them with their lives.

Yet, perfection is too much to expect from autopilot systems. Machines fail, and fallible people are likely to remain in the loop. In aviation, automation is common. It has prompted concerns that it degrades the skills pilots need when intervention is called for. If the same holds true for cars with autopilot systems, we can expect to become worse drivers, less able to respond to emergencies, even as our autopilot systems reduce fatalities overall.

There may be no getting around the fact that, given current vehicle designs, driving down a highway at high speed entails some degree of risk, whether a person or a computer is at the wheel.

Thomas Claburn has been writing about business and technology since 1996, for publications such as New Architect, PC Computing, InformationWeek, Salon, Wired, and Ziff Davis Smart Business. Before that, he worked in film and television.
Comments
jastroff, User Rank: Ninja
7/3/2016 | 2:29:03 PM
Self-Driving Vehicles
We've had lots of discussion on the IW message center about self-driving vehicles.

There are those who think they will be perfected and rule the highway. There are those who believe their use will be limited to controlled environments which don't necessarily involve people, e.g. industrial uses or smaller travel routes with controlled environments.

I tend to be in the latter camp. In thinking more about self-driving vehicles, it's clear that the driver will always have to be aware of the traffic, their surroundings, etc. With the amount of distracted driving that's going on today, when we need BOTH hands on the wheel, it's hard to imagine one would get into a car as a passenger with the driver watching videos and looking at their text messages, even with, or especially because of, autopilot applications, which people can't always control.
Whoopty, User Rank: Ninja
7/4/2016 | 7:37:49 AM
Re: Self-Driving Vehicles
Although I think you're right about industrial uses of self-driving vehicles, I have to disagree that there will be a limit on its usefulness. There are still major problems with the technology; it still needs to get better, still needs to be tweaked and used over millions of miles of roads.

That means it's going to be a slow transition. Yes, for now people should pay attention, but there's very little point in driverless technology if we also have to have a human watching over it. It will be possible in the future to make a driverless car that is more aware than human counterparts and can react faster too.

At that point why would we not move towards removing humans from the equation in driving? Insurance costs come down, fuel efficiency improves, the roads become less dangerous for pedestrians. 

As sad as it will be that people may well die on the road while the technology is improved, I believe it will be worth it in the long run.
vnewman2, User Rank: Ninja
7/4/2016 | 11:57:55 PM
Re: Self-Driving Vehicles
"there's very little point in driverless technology if we also have to have a human watching over it..."

Exactly - if you still have to pay attention, then why not just do it yourself?  Who is willing to risk their life depending on a hunk of metal to think for them?  Computers are only as smart as the people who programmed them, and as we can now see, those people didn't cover every circumstance.  Talk about a bug in the system...yikes.  No thanks!
FreonPSandoz, User Rank: Apprentice
7/5/2016 | 3:04:19 AM
Re: Self-Driving Vehicles
People get killed all the time when an idiot in a big rig turns in front of them illegally without enough distance to complete the turn. I can't understand why on earth anyone thinks that a human driver would have been any more successful at avoiding a collision than the Autopilot was.
Whoopty, User Rank: Ninja
7/5/2016 | 7:32:54 AM
Re: Self-Driving Vehicles
Although the human eye is (probably) better at differentiating between bright colours than the Tesla's sensor system, I do agree, human drivers make awful mistakes all the time. If all cars were automated with location data in a central database of some kind, this sort of collision would never happen. It would be impossible.

I'm looking forward to automated cars, but just as with the earliest of vehicles and safety concerns we weren't aware of yet, there will unfortunately be deaths and injuries on the road to full autonomy. When we get there though, the stats suggest we'll all be far safer. 
jastroff, User Rank: Ninja
7/5/2016 | 3:39:51 PM
Re: Self-Driving Vehicles
Good point -- >> Exactly - if you still have to pay attention, then why not just do it yourself? 

I think there are machines that do things for us, but which we monitor. Sometimes we monitor closely, sometimes we just read dials.

But with a car, we are talking about sitting IN the machine, and being on the roadway with other machines. It's a very strange set of circumstances. It's not like an automated factory where someone watches the "line" to make sure the ketchup bottles are all getting filled properly.
vnewman2, User Rank: Ninja
7/5/2016 | 4:00:01 PM
Re: Self-Driving Vehicles
@jastroff - I love your ketchup analogy!  You hit the nail on the head!
paulno, User Rank: Strategist
7/4/2016 | 10:50:31 AM
Self driving vehicles
We talked about this all Saturday evening. It seems crazy that the driver was watching Harry Potter on his DVD player while driving on a highway. I suppose he developed complete trust and confidence in his car over time, but for me Tesla is not responsible for what happened: they always said it's a "beta" technology.

In my family circle, some think Tesla should be prosecuted for this accident, but the fact is the driver should have kept an eye on the road and his hands on the wheel, to react quickly. We all know it's not 100% safe, just a new technology, with failures and risks. Even Google's car, which is much safer with its 360° radar on the roof, can't be used without being mindful of the journey.

Google's technology was already famous for being better, so this is hard news for Tesla. But my opinion is that they're not at fault in the accident.
paulno, User Rank: Strategist
7/4/2016 | 10:58:47 AM
Second thought
By the way, I don't understand how that kind of vehicle can be driven by ordinary people on ordinary roads. To my mind that's extremely dangerous, both for the driver and for every other vehicle, and it should still be tested for years on private circuits by professionals.

That's the case for planes, helicopters, sports cars, and so on. It's astonishing that the lives of ordinary people, who are just going to work like any other day, could be destroyed by others who choose to take thoughtless risks. They can do what they want with their own lives, but I could not accept it if my daughters were injured in such a situation.
jastroff, User Rank: Ninja
7/4/2016 | 12:39:35 PM
Re: Second thought
Excellent point -- driving in controlled environments -- as several car companies are doing. Seems like Tesla took a risk based on arrogance? Can't tell, but excellent point.

>> To my mind that's extremely dangerous, both for the driver and for every other vehicle, and it should still be tested for years on private circuits by professionals.
NJ Mike, User Rank: Moderator
7/5/2016 | 11:04:42 AM
Just some thoughts
"People get killed all the time when an idiot in a big rig turns in front of them illegally without enough distance to complete the turn" -- from the article, we don't know if the truck turned illegally or didn't allow enough distance.

 

The point is made that automation is degrading pilot skills.  That will be magified with cars for two big reasons.  First, getting a pilot's license takes a lot more training than a drivers license.  Secondly, there are lot more systems used to control air flight to keep air craft organized in the sky.  Therefore, if a jumbo jet makes an unexpected turn, there is rarely another aircraft in the area.  And we can add to that the many regulations pertaining to a pilot's behavior before flight (such as no drinking for a specified number of hours before takeoff).

 

JMHO - all this neat technology should be used to assist the driver, but the driver must always be in control.
vnewman2, User Rank: Ninja
7/5/2016 | 2:13:47 PM
Re: Just some thoughts
"The driver must always be in control"

Exactly, which is why you can't be watching Harry Potter movies when "chaperoning" your "driverless" car.

 

 
Technocrati, User Rank: Ninja
7/5/2016 | 4:02:49 PM
Re: Just some thoughts
It amazes me that someone would trust their life to "autopilot" while driving. And then to be so trusting that they would want to watch a movie while doing it. This makes absolutely no sense to me.
vnewman2, User Rank: Ninja
7/5/2016 | 4:37:08 PM
Re: Just some thoughts
Agree - what Tesla has implemented is like an "advanced cruise control" - an assisted-driving technology much like assisted parking.  But people will treat it like autopilot, as people will do under a false sense of security - and unfortunate accidents like this will become more common.
TerryB, User Rank: Ninja
7/6/2016 | 1:16:13 PM
Re: Just some thoughts
We'll know this technology has made it when it is capable of keeping an impaired driver from getting a DWI. And even if the technology is sound, is society ever going to let the primary "operator" off the hook for staying below the tested level anyway?

To me (not because I'm a big drinker!), this is the killer app of a driverless car. Just replacing the driving work you do only goes towards the convenience/laziness factor human drivers have. But a car in the Midwest that can get you home from a bar birthday party without a $40 cab trip (or $200 hotel) or the risk of blowing a .085 at a checkpoint has a huge financial and safety benefit. I'd argue the technology is already better than those repeat DWI offenders blowing .24 when pulled over. I'm sure the tech doesn't get on the wrong side of a divided highway without knowing it already.
jastroff, User Rank: Ninja
7/7/2016 | 10:56:10 AM
Re: Just some thoughts
Agree. I've always thought cars, etc. were weapons -- they can kill easily. Keep control, always.
Charlie Babcock, User Rank: Author
7/6/2016 | 7:37:08 PM
Ah, auto-pilot is not equivalent to 'self-driving car'
In the July 2 New York Times, the headline referred to "A Fatality In A Self-Driving Car Forces Tesla To Confront Its Limits." The car was not a self-driving car. It was a software- and sensor-enhanced form of cruise control, with Tesla urging drivers using it not to take their hands off the wheel or their attention from the road. I would not prejudge the outcome of the investigation by pillorying the driver. But I certainly urge auto-pilot users to put some limits on total trust in auto-pilot.
Technocrati, User Rank: Ninja
7/7/2016 | 2:01:44 PM
Re: Ah, auto-pilot is not equivalent to 'self-driving car'
@Charlie  Thanks for the clarification, and to everyone along this thread who understands the issue better than I do.

If nothing else, this might serve to make people think before they entrust their life to a vehicle with Tesla stamped on it.

Technocrati, User Rank: Ninja
7/7/2016 | 2:20:49 PM
Tesla: Not Autopilot? Then Don't Call It That
"...Autopilot, despite its name, is intended as an assistive feature rather than an alternative to manual control."

Tesla should change the name. People associate the term with its traditional sense, and since anyone who can afford a Tesla thinks these are wonder machines, it is easy to see how this function could be misconstrued.

vnewman2, User Rank: Ninja
7/8/2016 | 1:13:26 PM
Re: Telsa: Not AutoPilot ? Then Don't Call It That
I agree @Technocrati - make up a new name or just be boring and call it Driving Assist.  Everyone has a preconceived notion of the term "autopilot" and even if Tesla tries to reframe it for their own purposes, not everyone is going to "get" it.
Charlie Babcock, User Rank: Author
7/7/2016 | 2:27:50 PM
Auto-pilot: before and after accident
In a June 30 blog post, Tesla presented Autopilot, with its "lane keeping and automatic braking capabilities – among others," as "a driving-assist feature" that "is not intended to be used as a fully autonomous vehicle technology." That's after the May 7 accident. How did Tesla present Autopilot to customers before the accident?
BobbyB269, User Rank: Apprentice
7/7/2016 | 3:41:07 PM
Tesla is not a self-driving car
Tesla's "self-driving" car uses a vigilant-human approach: 

"Tesla notes that Autopilot is meant only to assist drivers, not to replace them. And its onscreen warnings and owner's manual emphasize that drivers should remain vigilant and keep their hands on or near the wheel at all times."  --nytimes.com

Google, OTOH, has monitored drivers (employees) while they were driving "vigilant-human approach automobiles" and found the drivers were often profoundly distracted and even napping. They realized that the vigilant-human approach was scary, because most humans were lulled into totally trusting the car after hundreds of miles without incident. As a result, Google is approaching it from the perspective that the car must be reliably self-driving, with no steering wheel, no brake pedal, and no accelerator pedal. In other words, until their cars can drive WITHOUT ANY driver assistance, they are not good enough.

I figure that in the long run, the only approach that will survive the market and the regulators will be Google's approach to self-driving cars. Tesla will either figure this out or lose the market.