Considering how driverless cars will change the way we think about driving, Eric Jaffe wonders whether it would be possible to program one for road rage:
I posed that question to Chris Urmson, head of Google’s self-driving car program, when I rode in the car on city streets in late April. In a strict technical sense, sure, the car could be programmed for aggression. But in line with the safety points mentioned above, Urmson said it’s “probably not the right thing to emulate all the human behavior” in programming driverless cars. … Urmson believes self-driving cars might have a therapeutic effect on aggressive driving styles. Slowly you’ll stop noticing the things that once made you irate on the road, and eventually you’ll forget they even existed. That’s a huge change in how we travel. Riding in cars, in this case, would become more like riding on trains or subways: the occasional unexpected stop will be annoying, but largely outweighed by the chances for diversion.
John Michael McGrath expects self-driving buses to be a more promising innovation than self-driving personal cars:
I strongly suspect sitting in traffic isn’t actually going to be more amusing just because your car is a robot. (Or at least, not after the first few times.) We will still need to find ways to move people more efficiently than any single-passenger vehicle can. That’s why people are mistaken when they say autonomous vehicles are going to mean the end of traditional mass transit.
Rather, the same kind of technology that allows self-driving cars should also allow transit operators to introduce self-driving buses, if voters (and transit unions) will accept it. Thanks to physics and geometry, buses will continue to make more efficient use of the road than even the slimmest self-driving cars. Voters can be leery of driverless transit, but it can offer much higher frequency in off-peak times than systems that rely on more expensive labour.
T.C. Sottek posits that driverless cars could be a boon for privacy as well, by eliminating one of our most common encounters with the police:
Privacy is about more than just data collection. It’s also about feeling secure against someone searching through your belongings. While the Bill of Rights protects citizens against unreasonable searches, it’s no guarantee that your rights won’t be violated — just ask David Eckert. Eckert’s example is extreme, but the kind of traffic stops that led to his ordeal are very common. Forty-two percent of involuntary encounters with police in the United States happen in cars, and many of these encounters lead to searches. …
In total, violations based on driver behavior accounted for 68.1 percent of traffic stops by police. In other words, human beings were pulled over in most cases because they’re human: they break the rules of the road and sometimes make mistakes. In some cases, such as speed limits, there’s even a cultural expectation that most people will routinely break the law. As the ACLU’s senior policy analyst Jay Stanley tells The Verge, this means that roads are quasi-authoritarian spaces that give police huge discretion in choosing whom to punish. But in a world with self-driving cars, things would look much different. “The latitude of the police to pull people over would be much reduced,” Stanley says. “People wouldn’t be subject to so much arbitrary enforcement.”
And Camille Francois hopes they will get people to pay more attention to surveillance:
It’s quite clear: for most people, the link between government surveillance and freedom is more plainly understood through cars than through personal computers. As more and more objects become connected to the Internet, these questions will grow in importance. And cars in particular might become, as Ryan Calo puts it in a 2011 article on drones, “a privacy catalyst”: an object giving us an opportunity to drag our privacy laws into the 21st century; an object that restores our mental model of what a privacy violation is.
When my grandmother starts to consider technologically enabled constraints on how she can drive, or the prospect of people knowing exactly where she can go, abstract issues of “autonomy” and “privacy” become much more real. … And that is important because in order for our society to shape the rules that will make the future of self-driving cars one in which we want to live, we need all members of society to contribute to the conversation. We need to ask: what happens when cars become increasingly like computers? With self-driving cars, are we getting the best of the computer industry and the car industry, or the worst of both worlds?