Profile

Naveed M
Attended Virginia Tech
581 followers | 244,441 views

Stream

Naveed M

Shared publicly

Jay Geater: and it won't cost a lot

Naveed M

Shared publicly

Ever since seeing this article a few days ago, it's been bugging me. We know that self-driving cars will have to solve real-life "trolley problems": those favorite hypotheticals of Philosophy 101 classes wherein you have to choose between saving, say, one person's life or five, or saving five people's lives by pushing another person off a bridge, or things like that. And ethicists (and even more so, the media) have spent a lot of time talking about how impossible it will be to ever trust computers with such decisions, and why, therefore, autonomous machines are frightening.

What bugs me about this is that we make these kinds of decisions all the time. There are plenty of concrete, real-world cases that actually happen: do you swerve into a tree rather than hit a pedestrian? (That's greatly increasing the risk to your life -- and your passengers' -- to save another person.)

I think that part of the reason that we're so nervous about computerizing these ethical decisions is not so much that they're hard, as that doing this would require us to be very explicit about how we want these decisions made -- and people tend to talk around that very explicit decision, because when they do, it tends to reveal that their actual preferences aren't the same as the ones they want their neighbors to think they have.

For example: I suspect that most people, if driving alone in a vehicle, will go to fairly significant lengths to avoid hitting a pedestrian, including putting themselves at risk by hitting a tree or running into a ditch. I suspect that if the pedestrian is pushing a stroller with a baby, they'll feel even more strongly this way. But as soon as you have passengers in the car, things change: what if it's your spouse? Your children? What if you don't particularly like your spouse?

Or we can phrase it the way the headline below does: "Will your self-driving car be programmed to kill you if it means saving more strangers?" This phrasing is deliberately chosen to trigger revulsion, and if I phrase it instead the way I did above -- in terms of running into a tree to avoid a pedestrian -- your answer might be different. The headline's phrasing seems to tap into a fear of loss of autonomy, which I often hear in other discussions of the future of cars. Here's a place where a decision you normally make -- based on secret factors which only you, in your heart, know, and which nobody else will ever know for sure -- is instead going to be made by someone else, and not necessarily to your advantage. We all suspect that it would sometimes make that decision in a way that, if we were making it in secret (and with the plausible deniability that comes from it being hard to operate a car during an emergency), we might make quite differently.

Oddly, if you think about such decisions being made by a human taxi driver, people's reactions seem different, even though there's the same loss of autonomy -- and now, instead of a rule you can understand, you're subject to the driver's secret decisions.

I suspect that the truth is this:

Most people would go to more lengths than they expect to save a life that they in some way cared about.

Most people would go to more lengths than they are willing to admit to save their own life: their actual balance, in the clinch, between protecting themselves and protecting others isn't the one they say it is. And most people secretly suspect that this is true, which is why the notion of the car "being programmed to kill you" in order to save other people's lives -- taking away that last chance to change your mind -- is frightening.

Most people's calculus about the lives in question is actually fairly complex, and may vary from day to day. But people's immediate conscious thoughts -- who they're happy with, who they're mad at -- may not accurately reflect what they would end up doing.

And so what's frightening about this isn't that the decision would be made by a third party, but that even if we ourselves individually made the decision, setting the knobs and dials of our car's Ethics-O-Meter every morning, we would be forcing ourselves to explicitly state what we really wanted to happen, and commit ourselves, staking our own lives and those of others on it. The opportunity to have a private calculus of life and death would go away.

As a side note, for cars this is actually less relevant, because there are very few cases in which you would have to choose between hitting a pedestrian and crashing into a tree that didn't come from driver inattention or other unsafe driving behaviors leading to loss of vehicle control -- precisely the sorts of things self-driving cars don't have. So these mortal cases would be far rarer than they are in our lives today, which is precisely where the advantage of self-driving cars comes from.

For robotic weapons such as armed drones, of course, these questions happen all the time. But in that case, we have a simple ethical answer as well: if you program a drone to kill everyone matching a certain pattern in a certain area, and it does so, then the moral fault lies with the person who launched it; the device may be more complex (and trigger our subconscious identification of it as being a "sort-of animate entity," as our minds tend to do), but ultimately it's no more a moral or ethical decision agent than a spear that we've thrown at someone, once it's left our hand and is on its mortal flight.

With the cars, the choice of the programming of ethics is the point at which these decisions are made. This programming may be erroneous, or it may fail in circumstances beyond those which were originally foreseen (and what planning for life and death doesn't?), but ultimately, ethical programming is just like any other kind of programming: you tell it you want X, and it will deliver X for you. If X was not what you really wanted, that's because you were dishonest with the computer.

The real challenge is this: if we agree on a standard ethical programming for cars, we have to agree and deal with the fact that we don't all want the same thing. If we each program our own car's ethical bounds, then we each have that individual responsibility. And in either case, these cars give us the practical requirement to be completely explicit and precise about what we do, and don't, want to happen when faced with a real-life trolley problem.
The computer brains inside autonomous vehicles will be fast enough to make life-or-death decisions. But should they? A bioethicist weighs in on a thorny problem of the dawning robot age.
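To make that explicitness concrete, here is a minimal sketch, in Python, of what "setting the knobs and dials of our car's Ethics-O-Meter" might look like. Everything in it (the EthicsConfig names, the weights, the risk numbers, the scoring rule) is a hypothetical illustration, not how any actual vehicle is programmed:

```python
# A hypothetical "Ethics-O-Meter" sketch. All names, weights, and risk
# numbers here are invented for illustration; no real vehicle uses this.
from dataclasses import dataclass

@dataclass
class EthicsConfig:
    occupant_weight: float    # how much the owner values people inside the car
    pedestrian_weight: float  # how much the owner values people outside it

@dataclass
class Maneuver:
    name: str
    occupant_fatality_risk: float    # estimated probability, 0.0 to 1.0
    pedestrian_fatality_risk: float

def expected_harm(m: Maneuver, cfg: EthicsConfig,
                  occupants: int, pedestrians: int) -> float:
    """Weighted expected fatalities for one candidate maneuver."""
    return (cfg.occupant_weight * m.occupant_fatality_risk * occupants +
            cfg.pedestrian_weight * m.pedestrian_fatality_risk * pedestrians)

def choose(maneuvers, cfg, occupants, pedestrians):
    # This is the essay's point: the preference has to be stated explicitly.
    return min(maneuvers, key=lambda m: expected_harm(m, cfg, occupants, pedestrians))

options = [
    Maneuver("brake straight", occupant_fatality_risk=0.01, pedestrian_fatality_risk=0.60),
    Maneuver("swerve into tree", occupant_fatality_risk=0.30, pedestrian_fatality_risk=0.01),
]
# With equal weights the car swerves into the tree; raise occupant_weight
# past about 2 and it brakes straight instead.
print(choose(options, EthicsConfig(1.0, 1.0), occupants=1, pedestrians=1).name)
```

The point isn't this particular rule; it's that whoever writes the config can no longer talk around their preferences. The private calculus becomes a number anyone can read.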

Naveed M

Shared publicly

Claw machines can be programmed to automatically reduce their grip strength to maximize profits, while allowing an infrequent full-strength grip to entice suckers. From the instruction manual for a...
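The scheme described is simple enough to sketch in code. A hedged illustration, assuming the operator dials in the machine's economics; the variable names and numbers are mine, not from any actual manual:

```python
import random

# Hypothetical sketch of the profit-driven grip scheme described above.
# All names and numbers are illustrative, not taken from a real manual.

PLAY_COST = 1.00      # what the customer pays per play
PRIZE_COST = 8.00     # what the prize costs the operator
TARGET_PROFIT = 0.50  # fraction of revenue the operator wants to keep

# Revenue between wins must cover the prize plus the profit margin:
#   plays_per_win * PLAY_COST * (1 - TARGET_PROFIT) = PRIZE_COST
plays_per_win = PRIZE_COST / (PLAY_COST * (1 - TARGET_PROFIT))  # 16 plays

def grip_strength() -> float:
    """Usually return a grip too weak to hold a prize; occasionally
    return full strength so players see wins actually happen."""
    if random.random() < 1.0 / plays_per_win:
        return 1.0  # full-strength, winnable grip
    return 0.3      # enough to lift the prize, not enough to keep it

# Empirical check: full-strength grips should occur about 1 in 16 plays.
print(sum(grip_strength() == 1.0 for _ in range(10_000)) / 10_000)
```

With these illustrative numbers, the machine grants a winnable grip roughly once every 16 plays, which is what makes the occasional dramatic win profitable rather than generous.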

Naveed M

Shared publicly

The “highest economic growth decade was the 1960s. Income tax rates [for the top bracket] were 90 percent,” Bill Gates said.

Naveed M

Shared publicly

"I have no respect for your ancestors. As far as your ancestors are concerned, I shouldn't be a law professor at Georgetown. I should be a slave. That's why they fought that war. I don't understand what it means to be proud of a legacy of terrorism and violence. Last week at this time, I was in Israel. The idea that a German would say, you know, that thing we did called the Holocaust, that was wrong, but I respect the courage of my Nazi ancestors. That wouldn't happen. The reason people can say what you said in the United States, is because, again, black life just doesn't matter to a lot of people."
Comment: I just saw that in a different forum. Wow!

Naveed M

Shared publicly

Another coward with a badge. If this cop was so scared of a golden retriever leashed up in the back yard, imagine how terrified he would be if he ever ran across an unarmed black kid.
 
A golden retriever that was leashed in a backyard barked at a Cleveland police officer; the cop, fearing for his life, shot the dog dead.

#Cleveland   #Clevelandpolice   #pets  

Naveed M

Shared publicly

Look at that leg room. No US airlines made the top 10.
 
We're losing to Qatar, Emirates, and Turkey.
The World Airline Awards recognize the best airlines based on millions of passenger reviews.

Naveed M

Shared publicly

Crime in black communities and crime committed against black people by the state are not created equal.
Hello, and welcome to the guide to debunking “black-on-black crime” and all of its rhetorical cousins.
cedrick wilson: This nonsense goes all the way back to the end of slavery, and then the racist movie Birth of a Nation, which told white America that since Blacks aren't in chains anymore, they're coming to rape your wife, take your job, and steal your property.

The media has kept up the "black people are animalistic and violent" narrative for over a century, despite overwhelming facts that it's not true.

Good post.

Naveed M

Shared publicly

"70 people were injured while filming this movie with 100 untamed lions"

Naveed M

Shared publicly

"At least 4 million Vietnamese died as a direct result of the war, which means that at least 2 million civilians perished at the hands of U.S. forces and their mercenary brethren. When the war commenced in earnest in the 1960s, Vietnam's population was 19 million. An incredible 21 percent of this population therefore perished. In 1960 the U.S. population was about 180 million. Imagine a war that killed nearly 38 million Americans."
It is the Vietnamese we should honor, commemorate, remember.
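As a quick check of the quote's arithmetic (every input figure below comes from the quote itself, not from independent sources):

```python
# Reproducing the quoted arithmetic; all input figures are the quote's own.
vietnam_deaths = 4_000_000
vietnam_population = 19_000_000
share = vietnam_deaths / vietnam_population        # ~0.21, i.e. 21 percent

us_population_1960 = 180_000_000
us_equivalent = share * us_population_1960         # ~37.9 million
print(f"{share:.0%} of the population; US equivalent: {us_equivalent / 1e6:.0f} million")
```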
People
In his circles: 440 people
Have him in circles: 581 people
Dave Horner, Philip Kauffman, Gautam Gaikwad, David Khorram, Florian Ragwitz, Tsubasa Kato, shahid raza, Philip Schwartz, Tony Merritt
Education
  • Virginia Tech
Links
YouTube
Story
Tagline
Open Source Programmer + World Citizen + Soccer Coach
Introduction
Born in Iran and raised in the US, I am a citizen of the world. I create open source software at work and in my free time. My favorite programming language is Perl. I post about issues related to social justice, computer programming, free and open source software, green energy, and education. I am also passionate about using technology to teach children: I created a free, open source webapp to help teach kids math, https://mathsheets.org.
Work
Occupation
software developer
Employment
  • software developer, present
Basic Information
Gender
Male