WASHINGTON (Reuters) – A U.S. senator on Friday urged Tesla Inc (TSLA.O) to rebrand its driver assistance system, Autopilot, saying the name is "inherently misleading and subject to potentially dangerous misuse."
FILE PHOTO: An advertisement promoting Tesla's Autopilot is seen in a showroom of the U.S. carmaker in Zurich, Switzerland, March 28, 2018. REUTERS/Arnd Wiegmann/File Photo
Tesla said in a letter that it had taken steps to ensure drivers remain engaged with the system and to improve safety.
The electric carmaker added new warnings for red lights and stop signs last year "to minimize the potential risk of running a red light or stop sign as a result of temporary driver inattention," Tesla said in the letter.
Senator Edward Markey said he believed the potential dangers of Autopilot can be overcome, but he called for "rebranding and remarketing the system to reduce misuse, as well as building backup driver monitoring tools that will make sure no one falls asleep at the wheel."
Markey’s comments came in a press release that included a copy of a Dec. 20 letter from Tesla addressing some of the Democratic senator’s concerns.
Autopilot had been engaged in at least three Tesla vehicles involved in fatal U.S. crashes since 2016.
Crashes in which Autopilot was engaged have raised questions about the driver-assistance system’s ability to detect hazards, especially stationary objects.
There are mounting safety concerns worldwide about systems that can perform driving tasks for extended stretches with little or no human intervention but that cannot completely replace human drivers.
Markey cited a video of a Tesla driver who appeared to be asleep at the wheel while Autopilot was engaged, and others in which drivers said they were able to defeat its safeguards by wedging a banana or a bottle of water in the steering wheel so it appeared they were in control of the vehicle.
Tesla said in its letter that revisions to its steering wheel monitoring mean that "in most situations, a limp hand on the wheel from a sleepy driver will not work, nor will the coarse hand pressure of a person with impaired motor controls, such as a drunk driver."
It added that devices marketed to trick Autopilot "may be able to trick the system for a short period of time, but generally not for an entire trip before Autopilot disengages."
Tesla wrote that while the videos cited by Markey showed "a few bad actors who are grossly abusing Autopilot," they represented "a very small percentage of our customer base."
Earlier this month, the U.S. National Highway Traffic Safety Administration (NHTSA) said it was launching an investigation into a 14th crash involving a Tesla in which it suspects Autopilot or another advanced driver assistance system was in use.
NHTSA is probing the Dec. 29 fatal crash of a Tesla Model S in Gardena, California. In that incident, the vehicle exited the 91 Freeway, ran a red light and struck a 2006 Honda Civic, killing its two occupants.
The National Transportation Safety Board will hold a Feb. 25 hearing to determine the probable cause of a fatal 2018 Tesla Autopilot crash in Mountain View, California.
Reporting by David Shepardson; Editing by Chizu Nomiyama and Tom Brown