Officials have warned that there could be accidents and deaths involving self-driving cars
WASHINGTON:
The death of a woman in Arizona who was struck by a self-driving car operating with no human control, the first fatality involving a fully autonomous vehicle, is an event the nascent industry has long dreaded, and it comes at a sensitive time.
Monday's accident involving an Uber Technologies Inc car is shaping up as the first significant test of how policy makers and the public will respond to the new technology. The incident occurred as companies have been pushing for regulatory clearance to offer self-driving car ride services as soon as next year. On Friday, Uber and Alphabet Inc's Waymo self-driving car unit wrote to U.S. senators urging them to approve sweeping self-driving car legislation "in the coming weeks."
Automakers and technology companies such as Uber, General Motors Co and Toyota Motor Corp have made substantial investments that hinge on significant revisions to existing vehicle safety regulations written under the assumption that a licensed human would always be in control of a vehicle.
Auto and technology industry officials have warned that there could be accidents and deaths involving self-driving cars, but they have said countless additional lives would be saved as robotic systems programmed to obey traffic laws took over for distracted, sleepy or impaired human drivers.
Mark Rosenker, a former chairman of the National Transportation Safety Board, said on Monday the public should not overreact to the Uber incident. He noted that about 6,000 pedestrians and nearly 40,000 people die on U.S. roads each year in more than 6 million crashes.
"This is going to be a unfortunate obstacle that we are going to have to deal with to regain (the public's) belief that these devices are safe," Rosenker said.
The incident prompted Uber to suspend all testing of self-driving cars.
The immediate impact of the fatality may be to further delay or change a landmark bill pending in Congress that would speed the testing of self-driving cars and that was already stalled by objections from a handful of Democrats over safety concerns.
Senator John Thune, a Republican who chairs the Commerce Committee, said the "tragedy underscores the need to adopt laws and policies tailored for self-driving vehicles."
However, two Democratic U.S. senators on Thune's committee, Ed Markey of Massachusetts and Richard Blumenthal of Connecticut, said the Uber incident demands a tough response.
"This tragic incident makes clear that autonomous vehicle technology has a long way to go before it is truly safe for the passengers, pedestrians, and drivers who share America's roads," Blumenthal said in a statement.
The Trump administration has been working to dismantle regulatory roadblocks to self-driving cars, but it has also said it is focused on ensuring safety. "The goal is to develop common sense regulations that do not hamper innovation, while preserving safety," Transportation Secretary Elaine Chao said on March 1. A spokesman for Chao had no additional comment Monday.
Chao is reviewing a petition that GM filed in January with the National Highway Traffic Safety Administration requesting an exemption to allow a small number of autonomous vehicles without steering wheels or human drivers to operate in a ride-share program.
The International Brotherhood of Teamsters said Monday in a statement the incident demonstrated "there are enormous risks inherent to testing unproven technologies on public roads. It is critical that pedestrians and drivers are safeguarded."
Former U.S. Transportation Secretary Anthony Foxx said Monday the incident is a "wake up call to the entire AV industry and government to put a high priority on safety."
In September, the U.S. House of Representatives unanimously passed a measure that would allow automakers to win exemptions from safety rules that require human controls. A Senate version would allow automakers, within three years, to each sell up to 80,000 self-driving vehicles annually if they could demonstrate to regulators they are as safe as current vehicles.
Concerns over the safety of autonomous vehicles flared in July 2016 when a man driving a Tesla Model S in semi-autonomous "Autopilot" mode died when his car struck a tractor-trailer.
In January 2017, federal safety regulators concluded there was no defect in the Tesla Autopilot system, and that the driver should have maintained control.
© Thomson Reuters 2018
(This story has not been edited by NDTV staff and is auto-generated from a syndicated feed.)