
02 October 2017

Trolleys Are Not Model T's Are Not SUV's

I was listening to the most recent RadioLab on my commute to work this morning. The episode revisits a philosophical/moral dilemma that I find very interesting. It asks, as the self-driving car inches toward reality, how we will decide to program driving computers when a crash is unavoidable and the choice is between fewer and more casualties or fatalities. (For the tl;dl version, skip ahead to the 27:14 mark.) One example of such a decision proffered in the episode was that a car should be programmed to crash into a brick wall, killing its lone passenger, rather than careen into a crowd, maiming and killing several. The general conclusion was that this would be unworkable: most people recognized the logic of programming cars this way, but few would be willing to ride in such a vehicle, and few would want to sell one.

My observation is this: we are already making such decisions! People regularly make transportation choices that, on the whole, value their own safety at the expense of others. People often cite "safety," for example, as a reason for buying SUV's, but what they mean by "safety," whether they recognize it or not, is safety for themselves (or their loved ones), even if that means making pedestrians and people in other cars less safe. I have written about this before. Science fiction has long been recognized as an effective tool for understanding complicated current social circumstances. Too often, when that science fiction is projected as near and inevitable, we fail to get the same message.
