Corvette Stingray Coupe

Make: Chevrolet
Segment: Coupe

Imagine you're driving down the street and suddenly see a small child walking obliviously across the road. Being a non-serial killer with a sense of empathy, you'd do whatever possible, swerving or braking, to avoid hitting them, right? Even if it meant bashing the car next to you, the headache brought on by the minor damage is nothing compared to even the thought of hitting the kid. Now imagine the same scenario one more time, except replace the child with a traffic cone.

Chances are you would hardly flinch as your tires imposed the burden of a two-ton vehicle on a piece of plastic. While this seems like a pretty straightforward case of using basic intuition to make a split-second decision, autonomous cars do not have the same ability. To them, a traffic cone is the same thing as a child. That poses a huge moral dilemma we will soon have to answer, and it supplies yet another argument against computer drivers. Given that more cars gain semi-autonomous features with each passing year and a fully independent self-driving car is looming on the horizon, this is kind of a big deal. A recent study by the American Association for the Advancement of Science surveyed 1,900 people about moral decisions and self-driving cars.
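To make that point concrete, here's a toy sketch of a naive avoidance planner that scores every detected obstacle identically. The obstacle labels, maneuvers, and one-hit-per-obstacle cost are invented for illustration and don't reflect any real vehicle's software; the sketch just shows why, without extra moral weighting, swerving into a cone and staying on course toward a child look like equally good choices to the machine.

from dataclasses import dataclass

@dataclass
class Obstacle:
    label: str       # "child" or "traffic_cone"; the planner never looks at this
    in_path: bool    # True if this maneuver would hit the obstacle

def collision_cost(obstacles):
    # Naive scoring: every obstacle struck counts as exactly one "hit".
    return sum(1 for o in obstacles if o.in_path)

def choose_maneuver(options):
    # Pick the maneuver with the fewest hits; ties are broken arbitrarily.
    return min(options, key=lambda name: collision_cost(options[name]))

options = {
    "stay_course": [Obstacle("child", in_path=True)],
    "swerve_left": [Obstacle("traffic_cone", in_path=True)],
}

# Both maneuvers cost exactly one "hit", so the planner sees no reason to
# prefer sparing the child over sparing the cone.
print(choose_maneuver(options))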

They were asked to imagine various situations in which a self-driving car had control over the outcome of an impending accident. In one scenario, the car had the choice to save 10 pedestrians in exchange for killing the front passenger, or to save the passenger in exchange for 10 new grave plots. Most chose the first option, deeming 10 lives more important than one. But things get tricky when the scenario is changed by putting a loved one in the car. After all, it's kind of hard to OK the decision to kill off your own mother. As much as we hate to answer these sorts of questions, we may soon have to write regulations that take them into account. Looks like the future won't be any easier after all.