Can Self-Driving Cars Be Engineered to Be Ethical?
  March 22, 2018

  This week, a self-driving vehicle struck and killed a woman in the American state of Arizona. The death is believed to be the first involving a fully autonomous car on U.S. public roads.

  The vehicle was operating in self-driving mode when it hit a 49-year-old woman crossing the street near the city of Phoenix. Police say Elaine Herzberg was walking her bicycle across the road, outside of a crosswalk, when she was struck. The Volvo SUV, which belonged to the ride-sharing company Uber, was traveling at about 65 kilometers an hour. A human safety driver was sitting behind the wheel even though the car was set to drive itself.

  After the crash, Uber announced it was halting all its road testing programs while the incident is investigated. Local and federal officials are taking part in the investigation.

  Effects of the crash

  Some transportation experts believe the deadly crash could slow the development of self-driving technology. They argue that U.S. lawmakers and the public may resist the technology if they believe self-driving cars will repeatedly make deadly mistakes.

  Nicholas Evans is a professor of philosophy at the University of Massachusetts in Lowell, Massachusetts. He thinks Uber made the right decision to stop its self-driving tests while the crash is investigated. But he says future accidents could change public opinion about the technology.

  "I think that consumers get very worried about kind of dramatic individual episodes, and that can really shake consumer confidence in products and investor confidence in products."

  Evans said companies test self-driving vehicles in different ways.

  "At the moment, companies like Tesla and Uber are taking a more kind of open-ended approach to their testing. They're just kind of putting their cars out there and seeing what happens."

  On the other hand, Evans said Waymo, Google's self-driving vehicle company, is carrying out its testing more scientifically, closer to the way medical trials are done. For example, Waymo is choosing areas of the country with different environmental conditions to test its vehicles.

  Ethics for self-driving cars

  Right now, Evans is specifically studying ways to make driverless vehicles capable of making ethical decisions. In many situations, vehicles will have to quickly decide levels of risk. These risks could relate to the driver of the self-driving car, other drivers or passengers on the road, or pedestrians.

  For example, a self-driving vehicle might have to decide whether to save its own driver or to crash in order to avoid hitting a vehicle carrying many people, such as a school bus.

  Evans is receiving money from the National Science Foundation to study the ethics of decision-making algorithms in autonomous vehicles. He says self-driving cars need to be programmed to react to many difficult situations. But, he adds, even simple driving activities – such as having vehicles enter a busy street – can be dangerous.

  "So we're looking at how exactly we should think about imposing those kinds of small risks that autonomous vehicles are going to have to take every day that they're driving, and how those small risks add up into larger health and community impacts."

  One of the most basic questions is how to decide the value of human lives. Evans says most people do not like to think about this question. But, he says, it is highly important in developing self-driving technology.

  A driver might believe that since the vehicle belongs to him, the car should be programmed to take care of that person above all others. Or some might believe that pedestrians should always be protected in crash situations.

  Evans says whom to value is one of the most difficult parts of teaching a driverless car to act on its own.

  "So this is one of the really tricky questions behind autonomous vehicles – is how do you value different people's lives and how do you program a car to value different people's lives."

  As a philosophy professor, Evans teaches ethics to engineers working on self-driving cars and other technologies. He says one of his first assignments is to discuss an often-heard argument made by supporters of self-driving technology who want development to keep moving quickly. They argue that up to 90 percent of current traffic accidents are caused by human error. That number, they argue, would be greatly reduced if machines took over the driving.

  But, Evans says, his students realize this argument alone does not permit them to move carelessly. They must still carry out thoughtful, safe testing of the technology.

  Personally, Evans says he is not sure he would be ready to take his hands off the wheel to go to sleep or do some work while riding in a driverless vehicle. He said his feelings have less to do with self-driving technology and more to do with other risks, such as a lack of road markings or extreme weather.

  "But give it 10 years and who knows, maybe I'll be on board like everyone else," he added.

  I'm Bryan Lynn.

  Bryan Lynn reported this story for VOA Learning English. Additional information came from Reuters and the Associated Press. Kelly Jean Kelly was the editor.

  We want to hear from you. Write to us in the Comments section.

  
