
Carmakers Are Pushing Autonomous Tech. This Engineer Wants Limits.

Missy Cummings, who spent more than a year at the federal auto safety agency, said that drivers were putting too much trust in systems like Tesla’s Autopilot and that regulators needed to restrict their use.

Image
Missy Cummings, a professor of engineering and computer science, said regulators needed to place more restrictions on how and when driver-assistance systems could be used. Credit: Lexey Swall for The New York Times

Last fall, Missy Cummings sent a document to her colleagues at the National Highway Traffic Safety Administration (NHTSA) that revealed a surprising trend: When people using advanced driver-assistance systems die or are injured in a car crash, they are more likely to have been speeding than people driving cars on their own.

The two-page analysis of nearly 400 crashes involving systems like Tesla’s Autopilot and General Motors’ Super Cruise is far from conclusive. But it raises fresh questions about the technologies that have been installed in hundreds of thousands of cars on U.S. roads. Dr. Cummings said the data indicated that drivers were becoming too confident in the systems’ abilities and that automakers and regulators should restrict when and how the technology was used.

People “are over-trusting the technology,” she said. “They are letting the cars speed. And they are getting into accidents that are seriously injuring them or killing them.”

Dr. Cummings, an engineering and computer science professor at George Mason University who specializes in autonomous systems, recently returned to academia after more than a year at the safety agency. On Wednesday, she will present some of her findings at the University of Michigan, a short drive from Detroit, the main hub of the U.S. auto industry.

Image
Systems like General Motors’ Super Cruise can steer, brake and accelerate vehicles on their own. Credit: Mason Dent for The New York Times

Systems like Autopilot and Super Cruise, which can steer, brake and accelerate vehicles on their own, are becoming increasingly common as automakers compete to win over car buyers with promises of superior technology. Companies sometimes market these systems as if they made cars autonomous. But their legal fine print requires drivers to stay alert and be ready to take control of the vehicle at any time.

In interviews last week, Dr. Cummings said automakers and regulators ought to prevent such systems from operating over the speed limit and require drivers using them to keep their hands on the steering wheel and eyes on the road.

“Car companies — meaning Tesla and others — are marketing this as a hands-free technology,” she said. “That is a nightmare.”

But these are not measures that NHTSA can easily put in place. Any effort to rein in how driver-assistance systems are used will probably be met with criticism and lawsuits from the auto industry, especially from Tesla and its chief executive, Elon Musk, who has long chafed at rules he considers antiquated.

Safety experts also said the agency was chronically underfunded and lacked enough skilled staff to do its job adequately. It has operated without a permanent, Senate-confirmed leader for much of the past six years.

Dr. Cummings acknowledged that the rules she was calling for would be difficult to put in effect. She said she also knew that her comments could again inflame supporters of Mr. Musk and Tesla, who attacked her on social media and sent her death threats after she was appointed a senior adviser at the safety agency.

But Dr. Cummings, 56, one of the first female fighter pilots in the Navy, said she felt compelled to speak out because “the technology is being abused by humans.”

“We need to put in regulations that deal with this,” she said.

The safety agency and Tesla did not respond to requests for comment. G.M. pointed to studies that it had conducted with the University of Michigan that examined the safety of its technology.

Because Autopilot and similar systems allow drivers to relinquish active control of the car, many safety experts worry that the technology will lull people into believing the cars are driving themselves. When the technology malfunctions or cannot handle a situation, like having to veer quickly to avoid a stalled vehicle, drivers may be unprepared to take control quickly enough.

Image
“Car companies — meaning Tesla and others — are marketing this as a hands-free technology,” Dr. Cummings said. “That is a nightmare.” Credit: Mike Blake/Reuters

The systems use cameras and other sensors to check whether a driver’s hands are on the wheel and his or her eyes are watching the road. And they will disengage if the driver is not attentive for a significant amount of time. But they operate for stretches when the driver is not focused on driving.

Dr. Cummings has long warned that this can be a problem — in academic papers, in interviews and on social media. She was named senior adviser for safety at NHTSA in October 2021, not long after the agency began collecting crash data involving cars using driver-assistance systems.

Mr. Musk responded to her appointment in a post on Twitter, accusing her of being “extremely biased against Tesla,” without citing any evidence. This set off an avalanche of similar statements from his supporters on social media and in emails to Dr. Cummings.

She said she eventually had to shut down her Twitter account and temporarily leave her home because of the harassment and death threats she was receiving at the time. One threat was serious enough to be investigated by the police in Durham, N.C., where she lived.

Many of the claims were nonsensical and false. Some of Mr. Musk’s supporters noticed that she was serving as a board member of Veoneer, a Swedish company that sells sensors to Tesla and other automakers, but confused the company with Velodyne, a U.S. company whose laser sensor technology — called lidar — is seen as a competitor to the sensors that Tesla uses for Autopilot.

“We know you own lidar companies and if you accept the NHTSA adviser position, we will kill you and your family,” one email sent to her said.

Jennifer Homendy, who leads the National Transportation Safety Board, the agency that investigates serious automobile crashes, and who has also been attacked by fans of Mr. Musk, told CNN Business in 2021 that the false claims about Dr. Cummings were a “calculated attempt to distract from the real safety issues.”

Before joining NHTSA, Dr. Cummings left Veoneer’s board, sold her shares in the company and recused herself from the agency’s investigations that solely involved Tesla, one of which was announced before her arrival.

The analysis she sent to agency officials in the fall looked at advanced driver-assistance systems from multiple companies, including Tesla, G.M. and Ford Motor. When cars using these systems were involved in fatal crashes, they were traveling over the speed limit 50 percent of the time. In crashes with serious injuries, they were speeding 42 percent of the time.

In crashes that did not involve driver-assistance systems, those figures were 29 percent and 13 percent.

The amount of data that the government has collected on crashes involving these systems is still relatively small. Other factors could be skewing the results.

Image
Self-driving taxis operated by Cruise, a division of G.M., have been ferrying passengers in San Francisco without anyone at the wheel. Credit: Jason Henry for The New York Times

Advanced driver-assistance systems are used far more often on highways than on city streets, for instance. And the crash data that Dr. Cummings analyzed is dominated by Tesla, because its systems are more widely used than others. This could mean that the results unfairly reflect on the performance of systems offered by other companies.

During her time at the federal safety agency, she also examined so-called phantom braking, which is when driver-assistance systems cause cars to slow or stop for no apparent reason. Last month, for example, the news site The Intercept published footage of a Tesla vehicle inexplicably braking in the middle of the Bay Bridge connecting San Francisco and Oakland and causing an eight-car pileup that injured nine people, including a 2-year-old.

Dr. Cummings said data from automakers and customer complaints showed that this was a problem with multiple driver-assistance systems and with robotaxis developed by companies like Waymo, owned by Google’s parent company, and Cruise, a division of G.M. Now under test in multiple cities, these self-driving taxis are designed to operate with no driver, and they are ferrying passengers in San Francisco and the Phoenix area.

Many crashes apparently happen because people traveling behind these cars are not prepared for those erratic stops. “The cars are braking in ways that people do not anticipate and are not able to respond to,” she said.

Waymo and Cruise declined to comment.

Dr. Cummings said the federal safety agency should work with automakers to restrict advanced driver-assistance systems through its standard recall process, in which companies voluntarily agree to make changes.

But experts questioned whether the automakers would make such changes without a significant fight.

The agency could also establish new rules that explicitly control the use of these systems, but this would take years and could result in lawsuits.

“NHTSA could do this, but would the courts uphold it?” said Matthew Wansley, a professor at the Cardozo School of Law at Yeshiva University in New York who specializes in emerging automotive technologies.

Dr. Cummings said robotaxis were arriving at about the right pace: After limited tests, federal, state and local regulators are keeping a lid on their growth until the technology is better understood.

But, she said, the government must do more to ensure the safety of advanced driver-assistance systems like Autopilot and Super Cruise.

NHTSA “needs to flex its muscles more,” she said. “It needs to not be afraid of Elon or of moving markets if there is an obvious unreasonable risk.”

Cade Metz is a technology reporter and the author of “Genius Makers: The Mavericks Who Brought A.I. to Google, Facebook, and The World.” He covers artificial intelligence, driverless cars, robotics, virtual reality and other emerging areas.

A version of this article appears in print in Section B, Page 1 of the New York edition with the headline: A Warning Signal on Self-Driving Cars.
