Tesla doesn't want you to see this video of a car hitting child dummies

Tesla has demanded a safety advocacy group take down videos of its vehicles running over child dummies when in self-driving mode.

The Dawn Project, a tech safety group, launched a nationwide TV campaign warning of the alleged potential dangers of Tesla’s ‘Full Self-Driving’ (FSD) software.

One of the videos posted by the group shows a Tesla vehicle with FSD software running over child-sized mannequins, with the message ‘Tell Congress to Shut it Down’.

Elon Musk’s car company has called the videos ‘defamatory’ and ‘misleading’ in a Cease and Desist letter sent on August 11.

Tesla said the Dawn Project and its founder ‘have been disparaging Tesla’s commercial interests and disseminating defamatory information to the public’.

The electric carmaker has threatened to take legal action, saying the tests in the videos are ‘likely fraudulent’ and ‘misrepresent the capabilities of Tesla’s technology’.

The company said its ‘FSD Beta’, a test version of its new automated driving technology used by a limited number of consumers, ‘does recognize pedestrians, including children, and when utilized properly, the system reacts to prevent or mitigate a collision’.

The Dawn Project claims that its tests ‘are completely legitimate and not deceptive’ and called Tesla’s letter ‘marketing propaganda’.

Musk’s car company is already on thin ice with the US safety regulator, which is investigating its Autopilot system following a series of crashes.

Just this year, one Tesla crashed into a building, causing an estimated £286,692 in damage, and another crashed into a £1.5 million private jet while being ‘summoned’.

Recent research showed that 58% of UK residents were uncomfortable using self-driving cars, while 55% did not feel safe sharing the road with them.