A still from a video broadcast by the Chinese government news channel CCTV that depicts a fatal crash that may have occurred while Tesla’s automated driver-assist system was in use.

Tesla Motors came under renewed questioning about the safety of its Autopilot technology after news emerged on Wednesday of a fatal crash in China that may have occurred while the automated driver-assist system was operating.

The crash took place on Jan. 20 and killed Gao Yaning, 23, when the Tesla Model S he was driving slammed into a road sweeper on a highway near Handan, a city about 300 miles south of Beijing, according to a report broadcast on Wednesday by the Chinese government news channel CCTV.

The report includes in-car video looking through the windshield as the car travels in the left lane at highway speed just before ramming into a parked or slow-moving orange truck. The video, apparently shot by a camera mounted on the rearview mirror, recorded no images, sounds or jolts that would suggest the driver or the car hit the brakes before impact. At that point, the in-car video ends.

“When it was approaching the road sweeper, the car didn’t put on the brake or avoid it,” a police officer said in the CCTV report. “Instead, it crashed right into it.”

In an emailed statement, Tesla said on Wednesday that it had not been able to determine whether Autopilot was active at the time of the Handan accident. The company declined to say when it learned of the fatality in China, or whether it had reported the crash to United States safety officials, who are investigating a fatal accident in Florida on May 7 in which Autopilot was engaged.

So far, that Florida accident is the only confirmed death involving a Tesla with Autopilot turned on. In that accident, there was no sign that the driver or Autopilot had applied the brakes before the car collided at high speed with a tractor-trailer that had turned in front of it.
News of the Chinese crash will renew questions about when the company should disclose information about accidents in cars equipped with Autopilot and what information should be shared.

“Because of the damage caused by the collision, the car was physically incapable of transmitting log data to our servers, and we therefore have no way of knowing whether or not Autopilot was engaged at the time of the crash,” a Tesla spokeswoman, Alexis Georgeson, said in the company’s statement.

“We have tried repeatedly to work with our customer to investigate the cause of the crash, but he has not provided us with any additional information that would allow us to do so,” she said of the car’s owner, Mr. Gao’s father.

She said Tesla was saddened to learn of the death of Mr. Gao. “We take any incident with our vehicles very seriously and immediately reached out to our customer when we learned of the crash,” she said.

Tesla and Autopilot have been under scrutiny since the disclosure of the May fatality. That crash killed Joshua Brown, 40, whose 2015 Model S was traveling 74 miles per hour when it collided with a tractor-trailer that had turned left and was crossing a highway near Williston, Fla. Autopilot’s radar and cameras failed to recognize the white truck against a bright sky.

News of the China crash comes just three days after Tesla’s chief executive, Elon Musk, outlined changes planned for Autopilot that he said would have prevented Mr. Brown’s accident and that he contended would make the Model S one of the safest cars on the road. The changes include refinements in Autopilot’s radar that improve its ability to spot and identify obstacles down the road, and additional warnings to force drivers to keep their hands on the steering wheel and eyes on the road while the system is active.
A Tesla Model S at an auto show in Beijing in April. Credit: Wu Hong/European Pressphoto Agency

Tesla has said Autopilot is not meant to take over completely for a human driver. When Autopilot is turned on, drivers are given audio and text warnings to remain alert and engaged while using it.

In August, Reuters reported that Tesla removed a Chinese term for “self-driving” from its China website after a driver in Beijing had a nonfatal crash while Autopilot was engaged. That driver later complained that the carmaker had oversold Autopilot’s capability.

The video of the January accident indicates the type of unexpected problem that can crop up at highway speeds. Critics of Autopilot say a driver can be lulled into complacency, leaving too little time to take back control of the vehicle.

Mr. Gao was traveling in the left lane of a three-lane highway with another car ahead of him. When the car ahead moved into the center lane, it revealed the orange truck, which was straddling the road’s left shoulder. Mr. Gao’s car never slowed before plowing into the truck.

Police investigators concluded that Mr. Gao was responsible for the accident, CCTV reported. But in July his family sued the dealer who had sold the Tesla. The driver’s father, Gao Jubin, told CCTV he thought his son had been relying on Autopilot to drive the car and so was not watching the road when the crash took place.

The lawsuit was filed “to let the public know that self-driving technology has some defects,” the family’s lawyer said in the report. “We are hoping Tesla, when marketing its products, will be more cautious. Don’t just use self-driving as a selling point for young people.”