The National Highway Traffic Safety Administration said Friday it will assess whether Tesla’s system, also known as FSD, can detect and appropriately respond to fog and other reduced-visibility conditions.
The agency said four crashes have been reported in such conditions while FSD was engaged.
In one of those crashes, the Tesla vehicle fatally struck a pedestrian, and another collision resulted in a reported injury, according to NHTSA. Tesla representatives didn’t respond to an emailed request for comment.
The probe marks a potentially major setback to CEO Elon Musk’s efforts to position Tesla as a leader in automated driving.
The company staged an event at a Los Angeles-area movie studio just last week with autonomous vehicle concepts. It has for years charged consumers thousands of dollars for FSD, which requires constant driver supervision.

The defect investigation comes on top of a recall query NHTSA opened in April into whether Tesla had done enough to keep drivers from misusing another set of assistance features marketed as Autopilot. The agency is looking into whether a software update Tesla deployed late last year ensures that drivers stay engaged while using the system.
NHTSA has said there’s been “a critical safety gap” between what drivers think Autopilot can do and its actual capabilities.