
A School District Tried to Help Train Waymo to Stop for School Buses, and Failed

📅 2026-03-29 15:00 · Aarian Marshall · AI
Waymo · Self-driving cars · AI safety · Computer vision · Edge cases
📌 One-line summary: This article investigates persistent safety failures by Waymo's self-driving cars in recognizing school-bus stop arms, highlighting the limits of current autonomous learning models despite software updates and a federal recall.

📝 Detailed summary: The report examines a series of safety incidents in Austin in which Waymo autonomous vehicles failed to stop for school buses with extended stop arms. Despite Waymo's claims that its fleet learns collectively, and despite software updates and a federal recall, the vehicles continued to pass school buses illegally. The article details the school district's collaboration with Waymo to gather training data, which ultimately failed to resolve the problem. Experts note that autonomous systems have long struggled to recognize specific road safety devices such as flashing lights and long, thin stop arms, raising broader questions about the industry's ability to effectively address critical edge cases.

One of the purported advantages of self-driving car tech is that every car can learn from one vehicle’s mistakes. Here’s how Waymo puts it on its website: “The Waymo Driver learns from the collective experiences gathered across our fleet, including previous hardware generations.”

But in Austin, Waymo’s vehicles struggled for months to learn how to stop for school buses as drivers picked up and dropped off children. An official with the Austin Independent School District (AISD) alleged that the vehicles had, in at least 19 instances, “illegally and dangerously” passed the district’s school buses while their red lights were flashing and their stop arms were extended rather than coming to complete stops, as the law requires.

In early December, Waymo even issued a federal recall related to the incidents, acknowledging at least 12 of them to federal regulators at the National Highway Traffic Safety Administration (NHTSA), which oversees road safety. According to federal filings, engineers with the self-driving vehicle company had “developed software changes to address the behavior” weeks before.

But even after the recall, the school-bus-passing incidents continued, according to school officials and a report from the National Transportation Safety Board (NTSB), an independent federal safety watchdog that’s also investigating the situation.

Now, email and text messages between school officials and Waymo representatives, obtained by WIRED through a public records request, show the lengths that the Austin public school district and Waymo went to try to solve the problem. AISD even hosted a half-day “data collection” event in a school parking lot in mid-December, the documents show, with several employees pulling together school buses and stop-arm signals from across the fleet so the self-driving car company could collect information related to vehicles and their flashing lights.

Still, by mid-January, over a month later, the school district reported at least four more school-bus-passing incidents had taken place in Austin. “The data we collected from the beginning of the school year to the end of the semester shows that about 98 percent of people that receive one violation do not receive another,” an official with the school’s police department told the local NBC affiliate that month. “That tells us that the person is learning, but it does not appear the Waymo automated driver system is learning through its software updates, its recall, what have you, because we are still having violations.”

The situation raises questions about self-driving technology's curious blind spots and the industry's ability to compensate for them even after they've been spotted.

Self-driving software has long struggled with recognizing flashing emergency lights and road safety devices with long, thin arms, including gates and stop-arms, says Missy Cummings, who researches autonomous vehicles at George Mason University and served as a safety adviser to the NHTSA during the Biden administration. “If [the company] didn't fix this a few years ago, the more they drive, the more it’s going to be a problem,” she says. “That’s exactly what’s happening here.”

Waymo did not respond to WIRED’s requests for comment. A spokesperson for the Austin Independent School District referred WIRED to the NTSB while the incidents are under investigation. A spokesperson for the NTSB declined to answer WIRED’s questions while its investigation continues.

Illegal Passing

By midwinter of 2025, AISD officials were frustrated. In one of the 19 incidents alleged by a lawyer for the district in a letter later released by federal road safety regulators, a Waymo passed a school bus letting off children “only moments after a student crossed in front of the vehicle, and while the student was still in the road.”

“Alarmingly,” the lawyer wrote, five of the alleged incidents had occurred after Waymo had assured the district that it had updated its software to fix the problem. Federal regulators with the NHTSA had already launched a probe into the behavior. “Austin ISD is evaluating all potential legal remedies at its disposal and intends to take whatever action is necessary to protect the safety of its students, if required,” the lawyer warned.

