
Commentary
7/2/2016 | 12:06 PM
Thomas Claburn

Tesla Autopilot Crash Under NHTSA Investigation

The National Highway Traffic Safety Administration is looking into the circumstances surrounding a fatal accident involving a Tesla being driven under Autopilot.


The National Highway Traffic Safety Administration has opened an inquiry into the autopilot system in Tesla's Model S following the death of a driver who was using the system.

In a statement posted on the Tesla Motors website on June 30, the company acknowledged the inquiry and characterized the incident as "the first known fatality in just over 130 million miles where Autopilot was activated."

The NHTSA said in a statement that Tesla had alerted the agency to the crash, which occurred on May 7 in Williston, Fla.

The Levy Journal Online, which covers Levy County, Fla., where the crash occurred, described the accident based on an account provided by the Florida Highway Patrol. A tractor-trailer was traveling west on US 27A and made a left turn onto NE 140 Court as the Tesla driver was heading in the opposite direction. The Tesla passed underneath the 18-wheeler and its roof collided with the truck. It then continued along the road before striking two fences and a utility pole.

(Image: Google)

"Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," Tesla said in its statement. "The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S."

The failure of Tesla's computer vision system to distinguish the truck from the similarly colored sky appears to have been compounded by radar code designed to reduce false positives during automated braking. Asked on Twitter why the Tesla's radar didn't detect what its cameras missed, CEO Elon Musk responded, "Radar tunes out what looks like an overhead road sign to avoid false braking events."
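Musk's description amounts to a false-positive filter on radar returns. What follows is a minimal, purely hypothetical sketch of that idea; none of these names, thresholds, or structures come from Tesla's software. The logic is simply that returns whose estimated elevation places them above the vehicle's path get treated as overhead structures and ignored for braking:

    from dataclasses import dataclass

    @dataclass
    class RadarReturn:
        range_m: float            # distance to the detection, in meters
        elevation_m: float        # estimated height of the return above the road
        closing_speed_mps: float  # positive when the object is getting closer

    # Hypothetical clearance threshold: anything estimated above this height
    # is treated as an overhead structure (sign, bridge), not an obstacle.
    OVERHEAD_CLEARANCE_M = 4.0

    def should_brake(returns: list[RadarReturn]) -> bool:
        """Illustrative false-positive filter: ignore returns that look like
        overhead road signs and brake only for objects in the vehicle's path."""
        for r in returns:
            if r.elevation_m >= OVERHEAD_CLEARANCE_M:
                continue  # "tuned out" as an overhead sign, per Musk's description
            if r.closing_speed_mps > 0 and r.range_m < 50.0:
                return True  # an in-path obstacle is closing; trigger braking
        return False

The Williston crash maps onto exactly this trade-off: a trailer with a high ride height, positioned across the roadway, can present a radar signature similar to an overhead sign to a filter of this kind.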

The driver of the Model S, identified in media reports as 40-year-old Joshua D. Brown from Canton, Ohio, died at the scene.

The driver of the truck, 62-year-old Frank Baressi, told the Associated Press that Brown was "playing Harry Potter on the TV screen" at the time of the crash.

A spokesperson for the Florida Highway Patrol did not immediately respond to a request to confirm details about the accident.

In its June 30 statement, Tesla said drivers who engage Autopilot are warned to keep both hands on the wheel at all times. Autopilot, despite its name, is intended as an assistive feature rather than an alternative to manual control.

The incident has stoked doubts about the viability of self-driving cars and the maturity of Tesla's technology. Clearly, a computer vision system that cannot separate truck from sky in certain light conditions could use further improvement. It was unclear at press time whether Tesla would face any liability claims related to its code or sensing hardware.

However, Tesla insisted in its statement that, when Autopilot is used under human supervision, "the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving."

(Image: Tesla)

In April, at an event in Norway, Musk said, "The probability of having an accident is 50% lower if you have Autopilot on," according to Electrek.
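Neither figure settles the question statistically, though. As a rough check (my own back-of-envelope, not Tesla's analysis, using the roughly one-fatality-per-94-million-miles US average cited in Tesla's statement), a single fatality in 130 million miles is entirely consistent with the ordinary US rate:

    import math

    # Back-of-envelope check: if Autopilot fatalities occurred at the overall
    # US rate Tesla cited (about one per 94 million miles), how surprising
    # would one fatality in 130 million miles be?
    us_rate = 1 / 94e6        # fatalities per mile, US average per Tesla
    exposure = 130e6          # miles driven with Autopilot activated
    lam = us_rate * exposure  # expected fatalities under the US rate (~1.38)

    # Poisson probability of observing at most one fatality at that rate
    p_at_most_one = math.exp(-lam) * (1 + lam)
    print(f"expected: {lam:.2f}, P(X <= 1): {p_at_most_one:.2f}")  # ~1.38, ~0.60

With roughly a 60% chance of seeing one or fewer fatalities even at the average US rate, a single event supports neither Musk's 50% figure nor the skeptics; many more Autopilot miles are needed before either claim can be tested.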

That may be, but data isn't the only consideration. When human lives are at stake, perception and emotion come into play. Automated driving systems will have to be demonstrably better than human drivers before people trust them with their lives.

Yet perfection is too much to expect from autopilot systems. Machines fail, and fallible people are likely to remain in the loop. In aviation, automation is common, and it has prompted concerns that it degrades the skills pilots need when manual intervention is called for. If the same holds true for cars with autopilot systems, we can expect to become worse drivers, less able to respond to emergencies, even as our autopilot systems reduce fatalities overall.

There may be no getting around the fact that, given current vehicle designs, driving down a highway at high speed entails some degree of risk, whether a person or a computer is at the wheel.

Thomas Claburn has been writing about business and technology since 1996, for publications such as New Architect, PC Computing, InformationWeek, Salon, Wired, and Ziff Davis Smart Business. Before that, he worked in film and television, having earned a not particularly useful ...
Comments
jastroff, User Rank: Ninja
7/3/2016 | 2:29:03 PM
Self-Driving Vehicles
We've had lots of discussion on the IW message center about self-driving vehicles.

There are those who think they will be perfected and rule the highway. There are those who believe their use will be limited to controlled environments that don't necessarily involve people, i.e., industrial uses or smaller travel routes in controlled settings.

I tend to be in the latter camp. Thinking more about self-driving vehicles, it's clear that the driver will always have to be aware of traffic, their surroundings, and so on. Given the amount of distracted driving going on today, when we need BOTH hands on the wheel, it's hard to imagine getting into a car as a passenger while the driver watches videos and reads text messages, even with, or especially because of, autopilot applications, which people can't always control.
Whoopty, User Rank: Ninja
7/4/2016 | 7:37:49 AM
Re: Self-Driving Vehicles
Although I think you're right about industrial uses of self-driving vehicles, I have to disagree that there will be a limit on its usefulness. There are still major problems with the technology; it still needs to get better, to be tweaked and proven over millions of miles of road.

That means it's going to be a slow transition. Yes, for now people should pay attention, but there's very little point in driverless technology if we also have to have a human watching over it. It will be possible in the future to make a driverless car that is more aware than human counterparts and can react faster too.

At that point why would we not move towards removing humans from the equation in driving? Insurance costs come down, fuel efficiency improves, the roads become less dangerous for pedestrians. 

As sad as it will be that people may well die on the road while the technology is improved, I believe it will be worth it in the long run.
paulno, User Rank: Strategist
7/4/2016 | 10:50:31 AM
Self driving vehicles
We talked about that the whole Saturday evening. It seems crazy that the driver was watching Harry Potter on his DVD player while driving on a highway. I suppose he had developed complete trust and confidence in his car over time, but for me Tesla is not responsible for what happened: they always said it's a "beta" technology.

In my family circle, some consider that Tesla should be prosecuted for this accident, but the fact is the driver should have kept an eye on the road and his hands on the wheel to react quickly. We all know it's not 100% safe, just a new technology with failures and risks. Even Google's car, which is much safer with the 360° radar on the roof, can't be used without being mindful of the journey.


Google's technology was already famous for being better, so this is hard news for Tesla. But my opinion is that they're not at fault in the accident.
paulno, User Rank: Strategist
7/4/2016 | 10:58:47 AM
Second thought
By the way, I don't understand how that kind of vehicle can be driven by ordinary people on public roads. To my mind that's extremely dangerous, both for the driver and for every other vehicle, and it should still be tested for years on private circuits by professionals.

That's the case for planes, helicopters, sports cars, and so on. It's astounding that the lives of ordinary people who are just going to work like any other day could be destroyed by others who accept such thoughtless risks. They can do what they want where their own lives are concerned, but I could not accept it if my daughters were injured in such a situation.
jastroff, User Rank: Ninja
7/4/2016 | 12:39:35 PM
Re: Second thought
Excellent point -- driving in controlled environments -- as several car companies are doing. Seems like Tesla took a risk based on arrogance? Can't tell, but excellent point.

>> That's extremely dangerous on my mind, both for the driver and every other vehicles and it should still be tested during years on private circuits by professionals.
vnewman2, User Rank: Ninja
7/4/2016 | 11:57:55 PM
Re: Self-Driving Vehicles
"there's very little point in driverless technology if we also have to have a human watching over it..."

Exactly - if you still have to pay attention, then why not just do it yourself? Who is willing to risk their life depending on a hunk of metal to think for them? Computers are only as smart as the people who programmed them, and as we can now see, those people didn't cover every circumstance. Talk about a bug in the system...yikes. No thanks!
FreonPSandoz, User Rank: Apprentice
7/5/2016 | 3:04:19 AM
Re: Self-Driving Vehicles
People get killed all the time when an idiot in a big rig turns in front of them illegally without enough distance to complete the turn. I can't understand why on earth anyone thinks that a human driver would have been any more successful at avoiding a collision than the Autopilot was.
Whoopty, User Rank: Ninja
7/5/2016 | 7:32:54 AM
Re: Self-Driving Vehicles
Although the human eye is (probably) better at differentiating between bright colours than the Tesla's sensor system, I do agree that human drivers make awful mistakes all the time. If all cars were automated, with location data in a central database of some kind, this sort of collision would never happen. It would be impossible.

I'm looking forward to automated cars, but just as with the earliest vehicles and the safety concerns we weren't yet aware of, there will unfortunately be deaths and injuries on the road to full autonomy. When we get there, though, the stats suggest we'll all be far safer.
NJ Mike, User Rank: Moderator
7/5/2016 | 11:04:42 AM
Just some thoughts
"People get killed all the time when an idiot in a big rig turns in front of them illegally without enough distance to complete the turn" -- from the article, we don't know if the truck turned illegally or didn't allow enough distance.

 

The point is made that automation is degrading pilot skills.  That will be magified with cars for two big reasons.  First, getting a pilot's license takes a lot more training than a drivers license.  Secondly, there are lot more systems used to control air flight to keep air craft organized in the sky.  Therefore, if a jumbo jet makes an unexpected turn, there is rarely another aircraft in the area.  And we can add to that the many regulations pertaining to a pilot's behavior before flight (such as no drinking for a specified number of hours before takeoff).

 

JMHO - all this neat technology should be used to assist the driver, but the driver must always be in control.
vnewman2, User Rank: Ninja
7/5/2016 | 2:13:47 PM
Re: Just some thoughts
"The driver must always be in control"

Exactly, which is why you can't be watching Harry Potter movies when "chaperoning" your "driverless" car.