Tesla Autopilot Was Uniquely Risky—and May Still Be

Introduction

Tesla's Autopilot system carried unique risks, and those risks may still be there.


Conclusion

Tesla Autopilot is a distinctive and appealing automated driving system, but it also carries considerable risk. According to some experts, the system may still run into problems in operation. Even so, it remains a technology worth investing in and exploring, and one that many buyers are interested in and want to experience.

A federal report published today found that Tesla’s Autopilot system was involved in at least 13 fatal crashes in which drivers misused the system in ways the automaker should have foreseen—and done more to prevent. Not only that, but the report called out Tesla as an “industry outlier” because its driver assistance features lacked some of the basic precautions taken by its competitors. Now regulators are questioning whether a Tesla Autopilot update designed to fix these basic design issues and prevent fatal incidents has gone far enough.

These fatal crashes killed 14 people and injured 49, according to data collected and published by the National Highway Traffic Safety Administration, the federal road-safety regulator in the US.

At least half of the 109 “frontal plane” crashes closely examined by government engineers—those in which a Tesla crashed into a vehicle or obstacle directly in its path—involved hazards visible five seconds or more before impact. That’s enough time, government engineers concluded, for an attentive driver to have prevented the crash or at least avoided the worst of the impact.

In one such crash, a March 2023 incident in North Carolina, a Model Y traveling at highway speed struck a teenager while he was exiting a school bus. The teen was airlifted to a hospital to treat his serious injuries. The NHTSA concluded that “both the bus and the pedestrian would have been visible to an attentive driver and allowed the driver to avoid or minimize the severity of this crash.”

Government engineers wrote that, throughout their investigation, they “observed a trend of avoidable crashes involving hazards that would have been visible to an attentive driver.”

Tesla, which disbanded its public affairs department in 2021, did not respond to a request for comment.

Damningly, the report called Tesla “an industry outlier” in its approach to automated driving systems. Unlike other automotive companies, the report says, Tesla let Autopilot operate in situations it wasn’t designed to, and failed to pair it with a driver engagement system that required its users to pay attention to the road.

Regulators concluded that even the Autopilot product name was a problem, encouraging drivers to rely on the system rather than collaborate with it. Automotive competitors often use “assist,” “sense,” or “team” language, the report stated, specifically because these systems aren’t designed to fully drive themselves.

Last year, California state regulators accused Tesla of falsely advertising its Autopilot and Full Self-Driving systems, alleging that Tesla misled consumers into believing the cars could drive themselves. In a filing, Tesla said that the state’s failure to object to the Autopilot branding for years constituted an implicit approval of the carmaker’s advertising strategy.

The NHTSA’s investigation also concluded that, compared to competitors’ products, Autopilot was resistant when drivers tried to steer their vehicles themselves—a design, the agency wrote in its summary of an almost two-year investigation into Autopilot, that discourages drivers from participating in the work of driving.

A New Autopilot Probe

These crashes occurred before Tesla recalled and updated its Autopilot software via an over-the-air update earlier this year. But along with closing this investigation, regulators have also opened a fresh probe into whether the Tesla updates, pushed in February, did enough to prevent drivers from misusing Autopilot, from misunderstanding when the feature was actually in use, or from using it in places where it is not designed to operate.

The review comes after a Washington state driver last week said his Tesla Model S was on Autopilot—while he was using his phone—when the vehicle struck and killed a motorcyclist.