By David Shepardson

WASHINGTON (Reuters) – General Motors' self-driving startup Cruise LLC said Thursday it had recalled and updated software in 80 self-driving vehicles after a June crash in San Francisco that left two people injured.

Federal regulators said the recalled software could “incorrectly predict” an oncoming vehicle’s path. Cruise said it had determined this unusual scenario would not recur after the update.

The National Highway Traffic Safety Administration (NHTSA) has stepped up its scrutiny of advanced driver assistance systems and autonomous vehicle systems in recent months. Last year, it directed all automakers and tech companies to promptly report crashes involving self-driving vehicles.

NHTSA said Thursday that Cruise’s recall filing “to address a safety defect in its automated driving systems software” was required by law.

NHTSA added it “expects all manufacturers, including those developing automated driving systems, to continuously ensure that they are meeting their requirements to initiate a recall for any safety issue that poses an unreasonable risk to safety.”

NHTSA said the recalled Cruise software could “in certain circumstances when making an unprotected left, cause the (autonomous driving system) to incorrectly predict another vehicle’s path or be insufficiently reactive to the sudden path change of a road user.”

Cruise disclosed Thursday that after the June 3 crash in San Francisco, it temporarily prevented its vehicles from making unprotected left turns and reduced the area in which its vehicles could operate.

After the software update on July 6, Cruise said it had gradually reintroduced unprotected left turns, meaning turns made at an intersection on a solid green light that governs all traffic rather than a designated green arrow just for turning vehicles.

Cruise emphasized in a statement Thursday that all its vehicles had received the software update and that the recall “does not impact or change our current on-road operations.”

The company added that “Cruise AVs are even better equipped to prevent this singular, exceptional event.”

NHTSA said “an improper (Automated Driving Systems) response can increase the risk of a crash.”

The agency said last month it had opened a special investigation into the Cruise crash.

Cruise said that in rare circumstances the software caused the autonomous vehicle to brake hard while performing an unprotected left turn, a response it deemed necessary to avoid a severe front-end collision.

The self-driving vehicle “had to decide between two different risk scenarios and chose the one with the least potential for a serious collision at the time, before the oncoming vehicle’s sudden change of direction,” Cruise said.

Cruise also noted a police report found the party most at fault for the June crash was the other vehicle, which was traveling at 40 miles per hour in a 25-mph zone.

In March, startup technology firm Pony.ai agreed to recall some versions of its autonomous driving system software, used in three vehicles, after an October crash in California.

GM has lost nearly $5 billion since 2018 trying to build a robotaxi business and disclosed in July that it lost $500 million on Cruise during the second quarter, when the unit began charging for rides in a limited area of San Francisco.

GM and Cruise in February disclosed they petitioned NHTSA for permission to deploy self-driving vehicles without steering wheels, mirrors, turn signals or windshield wipers.

Last month, NHTSA said it would extend a public comment period on the request.

(Reporting by David Shepardson; Editing by Jason Neely, Bernadette Baum and Alexander Smith)



Source: One America News Network
