
Car was exceeding the speed limit at time of Tesla Autopilot accident

[Image: The roof was ripped off Joshua Brown's Model S in the Autopilot accident]
[Image: The side of the truck hit by Joshua Brown's Model S]

Tesla's Autopilot system has come under intense scrutiny after a driver using it was killed when his Model S hit a white tractor-trailer on a Florida highway. More details have now emerged about the accident, with investigators revealing the car was traveling 9 mph over the posted limit at the time of impact.

The accident occurred on May 7, when a tractor-trailer pulled across US Highway 27A in Florida. According to a report from the National Transportation Safety Board, performance data from the Model S shows it was traveling at 74 mph (119 km/h) at the point of impact, 9 mph (14 km/h) above the posted limit of 65 mph (105 km/h).

In a release, Tesla said the combination of a bright sky and the white trailer may have made the truck difficult to see, potentially contributing to the accident.

After slamming into the side of the truck, the car continued for around 300 ft (91 m) before hitting a telephone pole. Having snapped the pole, it carried on another 50 ft (15 m) before coming to rest in someone's front yard.

In a statement released in June, Tesla was at pains to point out that Autopilot is still a public beta. Elon Musk has since clarified the point in his Master Plan, Part Deux, reassuring the public that every iteration of the system is thoroughly tested in-house and that the beta label is used to discourage complacency among drivers.

The investigation into Joshua Brown's death is ongoing, with Tesla cooperating with the NHTSA and NTSB to identify who was at fault.

Source: National Transportation Safety Board

14 comments
Daishi
You can't just put it in autopilot and go to sleep or stop looking at the road. The accident is still the fault of the driver (who was watching a Harry Potter DVD or something at the time).
It does highlight that autonomous driving will have a lot of scenarios that have to be accounted for and will need to coordinate input from multiple sensor types. Companies are throwing hundreds of millions into R&D of these systems and the best system on the market just missed seeing the broad side of a tractor trailer.
When fully complete, these systems will be massive, difficult-to-maintain codebases that will require a lot of onboard computing power and layers upon layers of expensive redundancy.
As they become reliable enough to /seem/ predictable, people are going to let their guard down and be less alert, like typical passengers, so we will keep seeing cases like this for a while during that long transition.
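As a rough illustration of Daishi's point about coordinating input from multiple sensor types, here is a minimal Python sketch of a fusion rule that declares an obstacle only when the sensors, individually or in agreement, are sufficiently confident. All class names, thresholds, and the confidence model are invented for illustration; this is not how Tesla's or any production system works.

```python
from dataclasses import dataclass


@dataclass
class Detection:
    """One sensor's belief that an obstacle occupies the lane ahead."""
    sensor: str        # e.g. "camera", "radar"
    confidence: float  # 0.0 (nothing there) .. 1.0 (certain obstacle)


def obstacle_ahead(detections: list[Detection],
                   single_sensor_threshold: float = 0.9,
                   fused_threshold: float = 0.6) -> bool:
    """Treat the lane as blocked if any one sensor is very sure,
    or if the average belief across sensor types is moderately high.

    A camera washed out by a bright sky and a radar tuned to discount
    large flat returns (to avoid braking for overhead signs) can each
    report low confidence individually -- exactly the failure mode a
    fusion rule has to guard against.
    """
    if not detections:
        return False
    if any(d.confidence >= single_sensor_threshold for d in detections):
        return True
    fused = sum(d.confidence for d in detections) / len(detections)
    return fused >= fused_threshold


# The white-trailer-against-bright-sky scenario: both sensors unsure,
# so this naive rule misses the obstacle entirely.
print(obstacle_ahead([Detection("camera", 0.2), Detection("radar", 0.4)]))  # False
```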
Robert Walther
I am wildly in favor of automated cars, but I think trusting such currently rare vehicles and beta tech on roads with same-level cross intersections and 65 mph speed limits might be a little premature.
Mel Tisdale
Surely a minimum spec, even for a beta application, has to be a fly-by-wire throttle linked via Bluetooth to a detailed road map containing all the information an autonomous vehicle would need, with speed limits at or near the top of the list of such features. (Bluetooth should be backed up by micro-electronic inertial navigation for any loss of signal, and the map itself needs to be linked to a roads control centre and updated as and when required. Sorry if such provision would put an end to high-speed police chases.)
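A rough sketch of what Mel Tisdale describes, in Python: a speed-limit lookup against a position-keyed road map, with a crude dead-reckoning fallback for loss of GPS signal. The map format, coordinates, and function names are all hypothetical, and the fallback is a placeholder for real inertial navigation.

```python
import math
from typing import Optional

# Hypothetical map: (rounded lat, rounded lon) -> posted limit in mph
SPEED_LIMIT_MAP = {
    (29.37, -82.45): 65,  # e.g. a segment of US Highway 27A
}


def dead_reckon(last_fix: tuple[float, float],
                heading_deg: float, speed_mph: float,
                seconds_since_fix: float) -> tuple[float, float]:
    """Crude inertial fallback: project the last known fix forward.
    A real system would integrate IMU data; this ignores the shrinking
    of longitude degrees away from the equator and is only illustrative."""
    miles = speed_mph * seconds_since_fix / 3600.0
    dlat = (miles / 69.0) * math.cos(math.radians(heading_deg))
    dlon = (miles / 69.0) * math.sin(math.radians(heading_deg))
    return (last_fix[0] + dlat, last_fix[1] + dlon)


def posted_limit(position: tuple[float, float]) -> Optional[int]:
    """Return the mapped limit, or None if this road segment is unmapped."""
    key = (round(position[0], 2), round(position[1], 2))
    return SPEED_LIMIT_MAP.get(key)
```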
Tanstar
9 mph over the speed limit isn't unusual, but why not limit the speed to the posted limit when Autopilot is engaged? My car's GPS shows when I'm speeding, as do Garmin devices. This would only work where the speed limit was known, and there would need to be a way to submit speed-limit changes for review as well. Still, it seems like a worthwhile additional safety precaution. Nothing will replace actually paying attention to where you are going.
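Tanstar's suggestion is easy to sketch in Python. The function name and behavior are assumptions for illustration, not Tesla's actual cruise-control logic: clamp the driver's setpoint only where the limit is known, and fall back to the driver's choice on unmapped roads.

```python
from typing import Optional


def effective_set_speed(driver_setpoint_mph: float,
                        mapped_limit_mph: Optional[float]) -> float:
    """Never exceed a known posted limit; if the limit is unknown
    (unmapped road, stale map data), defer to the driver's setpoint."""
    if mapped_limit_mph is None:
        return driver_setpoint_mph
    return min(driver_setpoint_mph, mapped_limit_mph)


# In the Florida crash the setpoint was 74 mph on a 65 mph road,
# so a clamp like this would have held the car at 65.
assert effective_set_speed(74, 65) == 65
assert effective_set_speed(74, None) == 74
```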
JamesPatrickCase
The speed was set by the driver in this case, ergo the speed (and arguably the accident) is not the fault of the car's systems. "System performance data also revealed that the driver was operating the car using the advanced driver assistance features Traffic-Aware Cruise Control and Autosteer lane keeping assistance. The car was also equipped with automatic emergency braking that is designed to automatically apply the brakes to reduce the severity of or assist in avoiding frontal collisions." (NTSB pre-report). This is by no means a fully automated driving system, but simply a semi-automated steering and mild speed-control system (plus or minus a few miles per hour). It seems pretty clearly the driver's fault (both the speed and the accident), although if they had implemented even more safety features (closed beta, max speed set to the known speed limit based on GPS?) it could have been prevented. But if you want to live in a world where humans are not allowed to act like adults and take responsibility for their own actions, then blame the people who could have been a nanny but instead treated their clients like adults. I think that this article is well thought out, but the "autopilot" software needed to be better explained. cited: http://www.teslamotors.wiki/wiki/Traffic-Aware_Cruise_Control_(TACC) https://teslamotorsclub.com/tmc/threads/three-recent-autopilot-cruise-control-accidents.70998/
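To make the quoted automatic emergency braking behavior concrete, here is a textbook time-to-collision (TTC) trigger in Python. It illustrates the general technique only; the thresholds and function names are assumptions, not Tesla's implementation.

```python
def time_to_collision(range_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact at the current closing speed.
    Returns infinity when the gap is opening or constant."""
    if closing_speed_mps <= 0:
        return float("inf")
    return range_m / closing_speed_mps


def should_emergency_brake(range_m: float, closing_speed_mps: float,
                           ttc_threshold_s: float = 1.5) -> bool:
    """Fire AEB when impact is imminent. Crucially, this only helps if
    the forward sensors classified the object as an obstacle at all --
    the step that reportedly failed in this accident."""
    return time_to_collision(range_m, closing_speed_mps) < ttc_threshold_s


# 74 mph is roughly 33 m/s; a trailer first resolved 40 m ahead
# leaves about 1.2 s -- brake, but only if it was detected.
print(should_emergency_brake(40.0, 33.0))  # True
```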
bobflint
Why did the autopilot mode not reduce the speed to within the limit?
Koolski
This is a VERY common set of circumstances. This is why it will be a long time before I trust software to drive me around. That being said, I do understand that we are still in the infancy of autonomous vehicles.
Westtrekker
+9 mph is not "speeding" in most of the country.
bobcat4424
A couple of points the articles seem to be missing. 1) Tesla's electronic control system amasses a huge amount of data. This data helps Tesla constantly tune its systems, but it is also going to appear more and more in legal venues as accident cases gain the benefit of all that data. 2) Autonomous cars (the ultimate goal) involve a lot more than just a fancy software suite. There is a great deal more to making this work.
The four components of autonomous cars are: 1) The manufacturer. 'Nuff said, but most manufacturers are on top of their game even though 5-10 years behind Tesla. 2) The driver. Drivers are going to have to adjust to a new paradigm where they have to learn and follow a new set of rules. This is going to be tough. "Monitor on Psychology" did a theme issue on this early in 2015, including the ethical issues. 3) Car-to-car communications. Cars need to talk to one another, because this allows sharing of sensor information (giving an almost total situational view) and lets cars signal their intentions. Though no one talks about it, this one is incredibly important. 4) Markings. Highway and vehicle markings need to be improved so that they are more easily and accurately seen. No more faded stripes or "featureless" semi trailers. Even the most minimal marking would have prevented the collision in this particular incident.
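As a toy illustration of bobcat4424's third component, car-to-car communication, here is a hypothetical intent-broadcast message in Python. The fields are invented for illustration; real V2V systems use standardized formats such as SAE J2735 basic safety messages, with signing and congestion control this sketch omits.

```python
import json
from dataclasses import dataclass, asdict


@dataclass
class IntentMessage:
    vehicle_id: str
    lat: float
    lon: float
    speed_mph: float
    intent: str  # e.g. "crossing_highway", "lane_keep", "braking"


def broadcast(msg: IntentMessage) -> bytes:
    """Serialize for broadcast; a real stack would sign and rate-limit."""
    return json.dumps(asdict(msg)).encode()


# Had the truck announced its turn across the highway, following traffic
# could have slowed long before any camera had to resolve a white trailer.
truck = IntentMessage("truck-1", 29.37, -82.45, 10.0, "crossing_highway")
print(broadcast(truck))
```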
There is no rationale that supports holding back on the technology until it is "perfect." It will never be perfect and at even Tesla's simplistic level, it is already safer than without driver assistance. This is just a variation on the nitrates and nitrites in processed meat paradox. The added chemicals are well-known to cause cancer. But the number of cancer deaths is less than 1/4000th of the deaths from spoiled meat. There is an old engineering saw that "perfect is the enemy of excellent."
ezeflyer
Press on regardless Elon!