California has informed Tesla that it is considering stricter regulation of the electric carmaker's driver assistance tools currently being tested on public roads, after videos of alarming incidents were posted online.
Several clips on YouTube and Twitter show drivers testing the Full Self-Driving beta suddenly having to retake control of their vehicles to keep their Tesla from hitting a pole or veering into oncoming traffic.
Tesla has noted that the tools require active driver supervision, but the California Department of Motor Vehicles said in a January 5 letter to the company that it is reviewing whether the features meet the state's definition of an autonomous vehicle.
Elon Musk's car company has recruited some motorists for real-world testing of the FSD beta, which is designed to navigate city streets, stop automatically and make turns. California's DMV wrote in its letter that it is revisiting its "classification decision following recent software updates, videos showing dangerous use of that technology and open investigations" by US regulators.
"DMV will be initiating further review of the latest releases, including any expansion of the program and features," the letter said. If the DMV decides to classify Tesla's driver assistance systems as autonomous vehicle technology, the company would face stricter rules.
Tesla would, for example, have to report any problems it encounters to the agency and identify all drivers testing its new tools. The company did not respond to a request for comment.