
Tesla is all eyes as it drops more sensors from semi-autonomous systems

Tesla has started to remove ultrasonic sensors from its new vehicles and will rely only on cameras for the operation of its Autopilot and Full Self-Driving features.

Teslas are becoming less like bats every day.

The automaker has announced that it has begun removing the ultrasonic sensors its current vehicles use for electronic driver aids, including Autopilot and Full Self-Driving, which will transition to a solely camera-based system.

Tesla began removing radar from its vehicles last year, and the Model 3 and Model Y will be the first to lose the ultrasonic sensors, followed by the Model S and Model X next year.

A "vision-based occupancy network" will draw information from eight cameras installed on the vehicles.


"This approach gives Autopilot high-definition spatial positioning, longer range visibility and ability to identify and differentiate between objects. As with many Tesla features, our occupancy network will continue to improve rapidly over time," Tesla said on its website.

Owners of existing cars are being notified that they will temporarily lose some of their semi-autonomous features with the associated software update, but that the features will be restored once the new software has been fully validated for safety.

The functions include park assist, which currently uses the sensors to detect nearby objects at low speeds; Autopark, which can steer a vehicle into a parking space; and both Summon and Smart Summon, which allow the vehicle to be controlled remotely via an app at low speeds with no one on board.


Other functions, including Autosteer, blind spot warning and automatic emergency braking, will continue to operate as they currently do, and all of the vehicles will retain their current crash test ratings, which factor in active safety systems.

While other automakers use all three sensor types and are starting to incorporate lidar into their semi-automated driving systems, Elon Musk has championed what the automaker calls "Tesla Vision" and argued that having multiple inputs can overwhelm a system with conflicting data.

Some Tesla owners, however, have posted video evidence showing how the camera-only system can misread distances when stop signs are not standard size, or mistake the Moon for a traffic light.

Tesla had hinted at the move with the latest Cybertruck prototypes, which are equipped with cameras but no ultrasonic sensors.
