It’s coming up: 2020 seems to be the magic year for self-driving cars. Google plans to introduce the technology by 2020, and a number of other companies, including Toyota and Nissan, have set similar targets. But while autonomous cars are supposed to make our lives easier, they raise a boggling number of questions, not only legal but moral. For one: Should your self-driving car kill you in order to save the lives of others in a crash?
Of course, autonomous cars are designed to react more quickly than humans can to prevent accidents. But what about extreme cases? A recent article in the Los Angeles Times raises an interesting scenario: A driverless car is traveling down the road when it detects children crossing ahead. It brakes, but loses traction thanks to a recent rock slide. Should it send its occupants off a cliff if that means saving several, or perhaps dozens of, children in the crosswalk?
And then, as the LA Times also asks, what if a careless motorcyclist swerves into your lane to get ahead going into a curve? Should the car put your life at risk to save his?
This raises even more questions: Is it a numbers game when it comes to saving lives? Who matters more: drivers, passengers, bikers, or pedestrians? And should cars be allowed to make these decisions for us at all? Perhaps by the time 2020 rolls around, manufacturers will have answered more of these questions. At the very least, government regulators should eventually address them.
Should we trust driverless cars to make quick life or death decisions? Let us know your thoughts in the comments below.
Source: Los Angeles Times