A Computer With A Conscience

Nick Carr's hypothetical:

So you’re happily tweeting away as your Google self-driving car crosses a bridge, its speed precisely synced to the 50 m.p.h. limit. A group of frisky schoolchildren is also heading across the bridge, on the pedestrian walkway. Suddenly, there’s a tussle, and three of the kids are pushed into the road, right in your vehicle’s path. Your self-driving car has a fraction of a second to make a choice: Either it swerves off the bridge, possibly killing you, or it runs over the children. What does the Google algorithm tell it to do?

He concludes that we “don’t even really know what a conscience is, but somebody’s going to have to program one nonetheless.” Earlier Dish on machine morality here.

(Hat tip: Jacobs)