Waymo says it's bringing its fully driverless robotaxis to Miami. But are they safe?
Published in Business News
Waymo, the self-driving vehicle company owned by Google parent Alphabet, has announced it’s bringing its driverless robotaxis to Miami amid wide-ranging questions about their safety and an ongoing federal investigation into collisions involving its cars in cities where the service is already operating.
In a statement, the company said it will re-launch testing of its adapted Jaguar I-PACE electric robotaxis in early 2025 in Miami, then deploy its fully driverless Waymo One ride-hailing service locally in 2026, in collaboration with mobility tech firm Moove. Waymo had previously tested out its vehicles in Miami as recently as last year.
Miami would become the fifth U.S. city with widely available Waymo service, after San Francisco, Los Angeles, Phoenix and Austin, where the company says it’s now providing some 150,000 weekly driverless trips. Much like Uber and Lyft, the Waymo cars are summoned through a phone app — only there’s no one at the wheel.
Waymo repeatedly touted its vehicles’ safety in its Dec. 5 statement.
“Together, we will provide safe, seamless trips for riders, and scale faster and more cost-effectively over time, with safety continuing to lead the way,” Ryan McNamara, vice president of Waymo operations, is quoted as saying in the release.
That’s a view echoed in the statement by Miami Mayor Francis Suarez, who is quoted as saying: “Fully autonomous driving technology offers a safe and convenient option to the people of Miami. I’m so pleased to welcome Waymo to our city.”
However, Suarez has no way to back that up: Waymo, which declined to speak on the record with the Miami Herald, doesn’t share internal data about the cars’ operation or safety protocols. The city doesn’t regulate Waymo or autonomous vehicles and doesn’t have access to, review or certify safety or operational data from the companies running them.
Neither does anyone else in Florida.
In fact, the self-driving autos, which are powered by algorithms, artificial intelligence and machine learning, require no specific permits to operate in Florida and remain largely unregulated across the country.
In the absence of any Congressional action, there are no federal rules governing their use. States like California and Florida have adopted a mostly hands-off approach to the tech and auto giants, including Alphabet and Tesla, that are seeking to operate them amid an intensely competitive and expensive race to market.
NHTSA probe
The only federal intervention comes after crashes have already occurred. The National Highway Traffic Safety Administration in May opened an investigation into Waymo after numerous instances in which its robotaxis collided with stationary objects, like parked cars and gates, that “a competent driver would be expected to avoid,” or operated in ways that violate traffic laws, such as wrong-way driving and running red lights. The agency has issued no findings.
The software that runs autonomous cars, which rely on artificial intelligence as the vehicles “learn” to cope with varying road conditions as they go, is proprietary, and so are test results and safety protocols.
That means members of the public concerned about their safety must rely on assurances from the companies themselves — and those have proven questionable in some instances as the firms run what amounts to experiments in often chaotic urban settings, some experts say.
Waymo insists that its own data from a decade of testing and 25 million miles of fully autonomous driving, in areas ranging from reported accidents to injury rates and the number of airbag deployments, shows its vehicles are now considerably safer than cars operated by people — a claim that experts say is impossible to corroborate independently.
The experts stress they’re not saying Waymo’s cars are unsafe, just that no one can say with any reasonable certainty whether they are or not — including Waymo itself. That’s in part, they say, because the cars have not been driven nearly enough miles to generate sufficient information to judge. Experts say the vehicles would have to travel a few billion miles before there’s enough data to reach reliable conclusions.
“We don’t know a lot. We know what Waymo tells us,” said Philip Koopman, an expert on autonomous vehicle safety at Carnegie Mellon University who has emerged as a leading skeptic of the company’s reported safety statistics. “Basically you are trusting Waymo to do the right thing.”
‘Cherry-picked’ data
Koopman contends the company has at times “cherry-picked” what limited data it has provided publicly in published papers, and has made claims based on insufficient data. That includes promotional materials claiming that its robotaxis save lives because they have a lower fatality rate than cars driven by people.
“That is not true. Full stop,” said Koopman, a professor of electrical and computer engineering. “When you hear the company talk you have to pay extreme attention and know what they say is only sort of true.”
Like Koopman, Henry Liu, a mechanical engineering professor who runs autonomous vehicle research and testing facilities at the University of Michigan, said he can’t say whether Waymo’s vehicles are safe given the lack of available data.
“Obviously they are not going through a rigorous third-party review,” said Liu, who with colleagues at Michigan has urged the federal government to develop safety guidelines for autonomous vehicles and require testing, much like a human driver has to take a test to get a license. “How much the average consumer can trust them, I can’t say.”
But he said he is optimistic about the potential for self-driving vehicles, which have improved rapidly and substantially thanks to the use of artificial intelligence to help train them.
He has ridden in Waymo robotaxis in San Francisco and says there were “no hiccups” and they consistently drove well. He thinks the service is probably safe enough to use right now given its relatively small-scale deployment and slow surface-street speeds. So far the Waymo One cars don’t go on highways, where he said significantly higher speeds would present a more formidable challenge to sensors and operating systems, and a far greater chance of serious injuries or fatalities in collisions.
“When that happens,” Liu said, referring to highway driving, “the number of accidents will occur a lot more. The question is, when they go to a larger scale, how do they deploy in a safe manner? How will it impact traffic flow and congestion?
“We don’t have enough information right now. Waymo has lots of experience already. But sometimes the cars do make mistakes. We want them to be much, much better than human drivers before they are scaled up.”
GM puts brakes on effort
What is clear is that Waymo has taken an early lead in the race to deploy as efforts by Amazon and Uber have foundered. Its chief competitor, Cruise, majority-owned by General Motors, crashed out of San Francisco after one of its driverless cars critically injured a pedestrian who had been struck by a car driven by a human. The Cruise vehicle failed to stop, ran her over and dragged her 20 feet down the street.
The state suspended Cruise operations and federal transportation authorities sanctioned the company for misleading investigators about the collision. This week, GM pulled the plug on the formerly fast-growing Cruise for good, citing the high expense and difficulty of running the program.
Among U.S. automakers only Tesla, whose autopilot feature in its private cars has been blamed for numerous fatal crashes amid criticism that the company has overpromised on the technology’s safety and reliability, has said it still plans to deploy a driverless ride-hailing service.
Google launched what would become Waymo with its Self-Driving Car Project in 2009. Now a separate division under the Alphabet umbrella, Waymo has sought to operate first in warm, dry states because its technology reportedly doesn’t fare well in snow, though its cars are being tested in winter conditions. Difficulty handling Florida’s often heavy rainfall was apparently one reason for the company’s repeated testing in Miami, an issue Waymo suggests in its announcement it has successfully addressed.
In Miami, the rollout will follow the script it has used in San Francisco and other cities. At first, the cars will be tested with Waymo technicians in the driver’s seat as their systems “map” city streets, then will provide driverless ride-hailing to employees. After that, the service will be made available to the general public.
Waymo’s fares, the company says, are comparable to Uber and Lyft’s.
Mixed reception in San Francisco
In San Francisco, Waymo’s cars are now a nearly ubiquitous sight.
The company began testing the driverless cars in the city in 2021 before making the service widely available to the public in June. It now operates a fleet of 300 vehicles on the city’s hilly neighborhood streets.
The white Jags are easy to pick out because of the large cameras, sensors and other instruments, some of them whirling like fans, that the cars are equipped with. Because they are adapted for use as robotaxis and not designed or built specifically for that use, the Jags have steering wheels, pedals and a driver’s seat, which is unoccupied. No Waymo staffer sits in the car, though remote personnel monitor operations and can intervene if problems occur.
The Waymo robotaxis have proven especially popular with tourists who hail the cars for rides to attractions like Fisherman’s Wharf and Coit Tower.
But the reception from the locals has been a decidedly mixed bag, drawing anger and at times conflict. The cars were initially greeted by protests and, in at least one instance, vandalism, as people attacked Waymo vehicles, striking, spray-painting and setting one on fire. Pranksters soon learned that an orange traffic cone placed on the hood of the vehicles would cause them to freeze. Waymo has sued alleged vandals for hundreds of thousands of dollars.
The city, meanwhile, complained that some of the robotaxis drive erratically and interfere with police and fire operations, running through yellow caution tape, blocking fire trucks and running over fire hoses.
But the city has been powerless to do anything about it. That’s because Waymo and other driverless vehicle companies have lobbied state legislatures to block local regulation of the services in California and other states, Carnegie Mellon’s Koopman said.
The San Francisco city attorney sued in court and filed objections with the state commission that approved the robotaxis, saying the agency abused its authority by allowing fully driverless operations without measures to ensure their safety.
Waymo has recalled its fleet several times for fixes, at times voluntarily and sometimes under orders from federal regulators, after instances of erratic driving or avoidable collisions, such as repeatedly crashing into poles.
A record of incidents
In October, San Francisco police had to intervene to move a Waymo car when it blocked a motorcade in which Vice President Kamala Harris was traveling. Residents say questionable maneuvers or cars confused by unfamiliar situations can snarl traffic. A reporter visiting the city in November witnessed one Waymo vehicle force its way across three lanes of gridlocked traffic as it made a possibly illegal turn from a side street.
The vehicles have also been blamed for creating nuisances. In one case, Waymo cars leaving and entering a company parking lot were honking relentlessly at one another, waking or otherwise disturbing local residents, who said they had to complain repeatedly before the company acted to fix the problem, the result of a software bug.
The company has said it’s constantly revising software as the vehicles gather reams of new information every time they go out. It also boasts that insurance groups like Swiss Re have praised the service for reducing claims, though some experts have challenged the dramatic improvements over human performance that it claimed.
The burgeoning robotaxi industry raises other issues as well.
Planners say the robocars will likely add considerable traffic to already congested urban streets, much like Uber and Lyft did, while potentially siphoning off riders from public transit. Some Uber and Lyft drivers in Los Angeles and Phoenix, meanwhile, have complained that Waymo is cutting into their earnings, Business Insider reported.
Legal experts say the cars also raise unanswered questions about liability and responsibility for accidents or crashes given that they have no human driver who can be ticketed, criminally charged or sued — though the operating company can be sued. Sometimes, depending on how state laws define “driver,” police are barred outright from issuing tickets for traffic violations, as is the case in California, Koopman said.
Florida laws approved by the legislature in 2019 and 2021 cleared the way for robocars to operate in the state with virtually no oversight. The rules specify insurance levels operators must carry, but shield the original carmakers — though not the companies operating robotaxis — from liability.
Moreover, because the cars are operated by highly complex and opaque computerized systems, assigning blame for a crash will likely require victims to hire experts to analyze whether a vehicle’s auto-control systems failed to operate safely, legal experts say.
But the questions about robocars’ safety go to the heart of the companies’ promises.
Supporters say self-driving vehicles powered by software, artificial intelligence and machine learning should be, in theory at least, safer than vehicles operated by error-prone human beings who may also choose to get behind the wheel while impaired.
The company insists they already are, and provides summaries of data on its website. Getting into a Waymo, it says, is far safer than climbing into your car to go to the drugstore — a decision fraught with risks that millions of people make every day without a second thought.
Many users, including journalists, have posted videos, reports and testimonials vouching for the pleasant experience of riding in a Waymo car, noting the vehicles seem to maneuver carefully, closely observing speed limits and traffic signs and signals while slowing or stopping for pedestrians and cyclists. Some have even complained that the cars are a bit poky compared to human-driven cars.
Early research, however, presents a mixed picture on the safety advantages of fully autonomous and driver-assist systems over human drivers. An analysis of crash data published in the scientific journal Nature Communications found that there’s insufficient information at this early stage to say anything conclusively, simply because the cars haven’t been driven enough yet.
The challenges of the road
Available data does suggest, the study concluded, that the automated cars may fare better when driving on a straight course, but worse than humans in other conditions, including at dusk and dawn and while the vehicles are changing direction — such as making turns and merging or changing lanes.
That may be one reason Waymo is not yet operating on highways. Another, Koopman suggested, is that it’s far safer to run the vehicles on low-speed urban streets. Cars traveling at 25 mph are far less likely to kill or seriously injure anyone than cars going at highway speeds.
Experts say that because software is inherently subject to bugs and defects, it’s not possible or realistic to expect that robocars would not make errors or would always react safely when the unpredictable happens. But while a software problem in a home computer program may cause a laptop to freeze or crash, the consequences of such an issue in a moving vehicle can be catastrophic, the experts note.
Liu, the Michigan professor, said the vehicles have likely not yet encountered enough of the rare or unusual events that human drivers often have to confront and that can lead to serious errors and collisions. For instance, Liu said, autonomous vehicles have demonstrated difficulties when they come across traffic lights that are not operating or have been covered up because they’re newly installed and not yet working. Waymo robotaxis have also been filmed entering blocked construction areas, where conditions can change on a daily basis.
Other experts say the robocars have trouble identifying animals and their unpredictable movements, a problem that may apply as well to children. Liu said the vehicles have shown issues during Halloween, when they can be confused by humans wearing, say, a dinosaur costume — something a human driver would easily understand.
Liu cites another example in which a Waymo was recorded on video as it drove behind a pickup truck carrying a tree. The vehicle read the tree as a stationary obstacle and slammed on the brakes.
“For the most part they drive really well, but they just don’t see enough of these situations, so they have difficulty with them,” Liu said.
Also subject to potential failure, experts note, are the cars’ complex hardware systems — sensors, cameras and the sophisticated onboard processors that handle and analyze immense amounts of real-time data to make constant decisions on speed, braking and steering. Like airplanes, the cars are equipped with backups if a piece of hardware fails.
The experts say the driverless cars should at a minimum reduce risks to a reasonably acceptable level. But with the criteria and definition left up to companies motivated by profit and competition, they say, it’s hard to say whether they are clearing that bar.
The bottom line, some experts say, is that the safety of robocars remains unproven even as they’re rapidly deployed.
The critical question, Koopman said, is not so much how risky it would be to ride in one, but how capable the cars prove at avoiding collisions with pedestrians, cyclists and animals.
“If your readers want to get in one, it will probably be fine. People certainly do more dangerous things on a regular basis,” Koopman said. “The right question is, should you walk in front of one?
“I, personally, do not walk in front of one.”
©2024 Miami Herald. Visit at miamiherald.com. Distributed by Tribune Content Agency, LLC.