Radar vanishes from Tesla’s view, prompting swift safety concerns

Elon Musk’s disdain for lidar has been well documented. Now the Tesla CEO has cooled on another sensor most automakers deem essential for underpinning certain safety features.

Tesla said last week it stopped equipping Model 3 and Model Y vehicles with radar sensors in April. The change sparked a swift response. Three leading automotive safety organizations stripped the vehicles of their safety designations.

NHTSA, the federal regulatory agency charged with ensuring vehicle safety, revised its five-star crash safety ratings. It removed checkmarks the vehicles had earned for offering features such as forward collision warning, lane departure warning, crash avoidance braking and dynamic brake support.

Consumer Reports and the Insurance Institute for Highway Safety followed by revoking the “Top Pick” and “Top Safety Pick+” designations, respectively, for the Model 3.

“If a driver thinks their vehicle has a safety feature and it doesn’t, that fundamentally changes the safety profile of the vehicle,” said David Friedman, vice president of advocacy at Consumer Reports and a former acting administrator of NHTSA. “It might not be there when they think it would save their lives.”

Instead of equipping the cars with radar, Tesla is transitioning to a camera-only system called Tesla Vision, which relies on machine learning to run certain safety features, Autopilot and a “Full Self Driving” feature that still requires human motorists to be responsible for driving.

Most automakers are taking the opposite approach. They’ve relied on cameras and radar for information that supports certain driver-assist features, and many intend to add lidar to those systems in the years ahead. Cameras, lidar and radar each have their strengths and weaknesses. By combining different types of sensors, automakers can gain complementary sets of information that provide a cross-check.
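To see why that cross-check matters, consider a minimal, purely illustrative sketch of how a radar range reading can validate a camera depth estimate for the same object. The function names, fusion weights and tolerance below are hypothetical and are not drawn from any automaker’s software.

```python
# Illustrative sketch only: cross-checking a camera depth estimate against a
# radar range measurement for the same tracked object. Thresholds and weights
# are hypothetical, not taken from any production driver-assist system.

def sensors_agree(camera_depth_m: float, radar_range_m: float,
                  tolerance_m: float = 2.0) -> bool:
    """Return True if the two independent range estimates are consistent."""
    return abs(camera_depth_m - radar_range_m) <= tolerance_m

def fused_range(camera_depth_m: float, radar_range_m: float) -> float:
    """Naive fusion: weight radar more heavily, since it measures range directly."""
    return 0.7 * radar_range_m + 0.3 * camera_depth_m

# Example: the camera estimates the lead vehicle is 41 m ahead, radar says 39.5 m.
camera_depth, radar_range = 41.0, 39.5
if sensors_agree(camera_depth, radar_range):
    print(f"Fused range: {fused_range(camera_depth, radar_range):.1f} m")
else:
    print("Sensor disagreement: fall back to the more conservative estimate.")
```

Remove the radar input and the camera estimate stands alone, with nothing independent to catch its errors.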

In numerous incidents, including a high-profile crash investigated by the National Transportation Safety Board, Tesla vehicles have collided with parked first-responder vehicles while safety features such as Autopilot were activated.

Removing radar means the systems will go from bad to worse, says Sam Abuelsamid, principal research analyst at Guidehouse Insights.

“Clearly, the setup they had with the single radar sensor on the front was never going to be adequate for a real Level 2 system anyway, and they clearly have not been able to figure out how to use it effectively,” he said. “The better solution would be to not get rid of one sensor, but to follow the path of GM and Ford, with four corner radar sensors to help with cut-ins, because that’s definitely an area where Tesla has issues.”

Tesla said the radar removal affects Model 3 and Model Y vehicles made on or after April 27. In a blog post, the company said cars equipped with Tesla Vision may find some features “temporarily limited or inactive.” Those include Smart Summon and Emergency Lane Departure Avoidance. In a matter of weeks, Tesla says, those features will be restored via several over-the-air software updates. The automaker did not respond to a request for further comment.

Whether a camera-only system can adequately replace one that previously used both cameras and radar remains a serious safety question, says Bryan Reimer, a research scientist at the Massachusetts Institute of Technology.

“If performance is not greater in the new system, we are taking a step backward in efficacy and consumer transparency, negatively impacting safety,” he said.

A camera-only system could underpin certain features. Subaru’s EyeSight system, for example, uses a pair of cameras mounted roughly 12 inches apart. Other automakers are exploring ways to push the cameras farther apart, toward the vehicle’s A-pillars; the wider the separation between the cameras, the more accurately the system can judge the depth of objects.
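The reason comes from basic stereo geometry: depth is recovered from the pixel offset (disparity) between the two views, and a small disparity error translates into a depth error that shrinks as the camera baseline grows. The sketch below uses the standard pinhole stereo relation with illustrative numbers, not figures from any production system.

```python
# Standard pinhole stereo model: depth z = f * B / d, where f is focal length
# in pixels, B is the baseline between the two cameras, and d is disparity.
# All numbers below are illustrative assumptions, not production values.

def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    return focal_px * baseline_m / disparity_px

def depth_error_m(focal_px: float, baseline_m: float, depth_m: float,
                  disparity_error_px: float = 0.25) -> float:
    """Approximate depth error from a fixed disparity error: grows as z**2 / (f * B)."""
    return (depth_m ** 2 / (focal_px * baseline_m)) * disparity_error_px

f_px = 1200.0                    # assumed focal length in pixels
for baseline in (0.30, 1.20):    # ~12-inch spacing vs. a hypothetical A-pillar-width spacing
    err = depth_error_m(f_px, baseline, depth_m=50.0)
    print(f"baseline {baseline:.2f} m -> ~{err:.2f} m depth error at 50 m")
```

With the same quarter-pixel disparity error, quadrupling the baseline cuts the depth error at 50 meters by roughly a factor of four.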

Abuelsamid said Tesla Vision’s three cameras are clustered close together, calling into question the ability to make those precise measurements.

“What they’re doing is a less-than-ideal solution for measuring distance, and then you don’t even have radar as a backup to verify those measurements, which you really need in low light, fog or rain,” Abuelsamid said. “Really, I think they decided ‘We’re going all-in on cameras, and we save ourselves a few bucks by dropping a radar sensor.’ ”