Club Cobra

Club Cobra (http://www.clubcobra.com/forums/)
-   ALL COBRA TALK (http://www.clubcobra.com/forums/all-cobra-talk/)
-   -   Autonomous Driverless cars (http://www.clubcobra.com/forums/all-cobra-talk/141234-autonomous-driverless-cars.html)

Dimis 07-22-2018 08:12 PM

Autonomous Driverless cars
 
Hmmm.... Decisions decisions... What to do?

Driverless, autonomous cars present ethical challenges

Enjoy.

Jim Vander Wal 07-22-2018 08:45 PM

This whole thing scares me. A well done article.
Jim

rodneym 07-22-2018 09:08 PM

All this autonomy crap and they can't even keep the brakes working!

Maybe the Germans will figure it out after they learn to make real air conditioning.

Gaz64 07-22-2018 09:38 PM

You won't see me in one. ;)

eschaider 07-23-2018 12:26 AM

Anything that preserves stupid people at the cost of innocent possibly smarter people is wrong.

If you are stupid enough to put others into a life and death situation through no fault of their own, then you should pay the price, irrespective of what that price may be.

If it is a life and death situation then the gene pool has been cleaned up going forward.


Ed

Ron61 07-23-2018 03:21 AM

I hate those damn things. We have already had people killed here by them doing the wrong thing and I don't want any computer-driven piece of junk deciding whether I live or die. If people are unable to drive themselves then keep them off the road.

Ron

Tom Wells 07-24-2018 03:54 AM

Not sure what you guys are on about.

All the cars with cell phone zombies behind the wheel are driverless.

Must be millions of them on the road as we type...

(sorry, couldn't resist)

Tom

Ron61 07-24-2018 03:59 AM

Tom,

I completely agree with you and even our police here seem to always have a cell phone stuck in their ear when they are just driving around.

Ron

strictlypersonl 07-24-2018 04:39 AM

Even with the current state of technology, I would rather have an autonomous car next to me on the highway than a half-conscious driver. I'll wait a bit to make that preference for urban driving, but I suspect that it won't be too long. Exceeding the current "average" driver's competence is not that big a jump.

eschaider 07-24-2018 08:50 AM

Quote:

Originally Posted by Tom Wells (Post 1448533)
Not sure what you guys are on about.

All the cars with cell phone zombies behind the wheel are driverless.

Must be millions of them on the road as we type...

(sorry, couldn't resist)

Tom

Quote:

Originally Posted by strictlypersonl (Post 1448540)
Even with the current state of technology, I would rather have an autonomous car next to me on the highway than a half-conscious driver. I'll wait a bit to make that preference for urban driving, but I suspect that it won't be too long. Exceeding the current "average" driver's competence is not that big a jump.

I think possibly my point is getting confused with the capability of autonomous technology vs human guidance. I have absolutely no doubt that the autonomous model is the more capable model.

My issue is with programming the autonomous model to respond to particular traffic situations involving brain-dead participants who put innocent bystanders at risk, sometimes at risk of their lives.

The programming of the vehicle's autonomous guidance system should attempt to avoid any type of operation that puts humans at risk. That said, when a human does something stupid, the operational logic should not select a course of action that puts an innocent observer to the event at risk, for life or otherwise.

When someone does something stupid that puts themselves at risk, then they should own the responsibility for that decision. The risk and potential personal injury should not be transferred to another human being who just happens to be in proximity to the stupid person.

The guiding light, if there is one, is personal responsibility. Each individual must be responsible for their own decisions and actions. You should not transfer the responsibility for, and the results of, those bad decisions and actions to an innocent bystander in order to escape them yourself. The attempt to escape responsibility by transferring it to an innocent bystander is patently wrong whether done by an individual or a machine.

The bottom line is people will still do stupid things, accidents will still occur, and anything that transfers the outcome and responsibility for those accidents to an innocent bystander is patently wrong to do.

We should try to educate and inform humans about dangerous decision-making logic and encourage them not to use it. In the event they still choose to use it, no one but the individual who makes the stupid choice should have to pay for the bad judgement, especially if it will result in loss of life.

A pedestrian stepping into oncoming traffic on the expressway is a perfect example of poor decision logic applied to a life-threatening situation. In this example, the decision and the attendant potential for loss of human life should be contained to the stupid person alone.

In an autonomous vehicle situation, the vehicular driving and accident avoidance logic should attempt to slow the vehicle without injuring the individual and, importantly, anyone else. If that is not possible, then the vehicle's forward motion should be arrested in such a fashion that it does the least damage to human life in the immediate area of the accident. The majority of the loss of life and limb, if at all possible, should be contained and focused on the person who made the stupid choice.

This example is simple. The moment you add multiple actors, the complexity of the logical solution skyrockets, and a computer can resolve the logical problem better and more quickly. This is only true if the accident avoidance and resolution routines have been programmed both correctly and completely, which is a very difficult and demanding task by any standard.
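To put that priority ordering in rough terms a programmer might use, here is a minimal sketch in Python. It is purely illustrative: the maneuver names, the harm scores, and the idea of a single identifiable at-fault party are all assumptions made for the example, not anything out of a real autonomous driving stack.

Code:

from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    bystander_harm: float   # estimated harm to innocent bystanders (0 = none)
    occupant_harm: float    # estimated harm to the vehicle's occupants
    at_fault_harm: float    # estimated harm to the person who created the hazard

def choose_maneuver(options: list) -> Maneuver:
    """Prefer any option that harms no one; failing that, minimize harm to
    innocents (bystanders first, then occupants) and only then consider
    harm to the party that created the hazard."""
    safe = [m for m in options
            if m.bystander_harm == 0 and m.occupant_harm == 0 and m.at_fault_harm == 0]
    if safe:
        return safe[0]
    return min(options, key=lambda m: (m.bystander_harm, m.occupant_harm, m.at_fault_harm))

if __name__ == "__main__":
    options = [
        Maneuver("hard brake in lane",   bystander_harm=0.0, occupant_harm=0.1, at_fault_harm=0.6),
        Maneuver("swerve onto sidewalk", bystander_harm=0.8, occupant_harm=0.2, at_fault_harm=0.0),
        Maneuver("swerve into barrier",  bystander_harm=0.0, occupant_harm=0.7, at_fault_harm=0.0),
    ]
    print(choose_maneuver(options).name)   # -> hard brake in lane

The hard part, of course, is not the ordering itself but producing those harm estimates reliably, in a few milliseconds, in a messy real-world scene, which is where programming the routines correctly and completely comes in.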


Ed

CSX 4133 07-24-2018 09:20 AM

Unfortunately the same "brain dead" incompetent driver types we complain about are writing the programs that direct these autonomous cars.

rodneym 07-24-2018 10:57 AM

I like where strictlypersonl is going with this.
I like autonomous driving for everybody else!
:p

Jaydee 07-24-2018 06:08 PM

The other day my 5 year old son was asking me if I'm going to teach him to drive. I said by the time you're 16 you won't need to learn; it will drive itself. So I thought, will the driving age be lowered? Like I'll strap my 5 year old in the seat and tell the car to take him to school by himself. That doesn't sound so bad? When a drink driver gets caught several times, he is forced to fit a breathalyzer to his car. I reckon if someone gets too many traffic violations, they should be forced to drive an autonomous car. Eventually we will become used to trusting this technology. For years we have trusted our lives on a tiny bit of rubber in the master cylinder. I just can't see how an autonomous car will recognize dirt roads, a bit of loose gravel on a hard surface on a corner, etc.
Subarus already have a system to stop little old dears from putting it in the wrong gear and running into buildings, missing the brake pedal, or panicking.
JD

xb-60 07-24-2018 07:50 PM

Quote:

Originally Posted by Jaydee (Post 1448595)
....For years we have trusted our lives on a tiny bit of rubber in the master cylinder....

Rubber has a degree of memory, but it doesn't have any intelligence....;)

Looking forward a few years, got to admit that the antics of autonomously controlled cars jostling for parking spaces at school pickup time could be entertaining :rolleyes:

Cheers,
Glen

Dimis 07-25-2018 02:33 AM

Quote:

Originally Posted by eschaider (Post 1448554)
I think possibly my point is getting confused with the capability of autonomous technology vs human guidance. I have absolutely no doubt that the.... etc etc etc...

Ed

@Ed.
Sure. But what do you do in scenario 4?
I.e. "You're driving down a narrow suburban street when a child suddenly runs into the road. There's no time to brake."

The article is but the tip of the iceberg on the ethics surrounding this stuff.

While I agree we might not be far away on the technology, I think we are miles away on the ethics.

Dimis 07-25-2018 02:38 AM

Quote:

Originally Posted by xb-60 (Post 1448597)
Rubber has a degree of memory, but it doesn't have any intelligence....;)

Looking forward a few years, got to admit that the antics of autonomously controlled cars jostling for parking spaces at school pickup time could be entertaining :rolleyes:

Cheers,
Glen

Haha. You are a very funny man.

DanEC 07-25-2018 04:08 PM

I'm a little bit with Strictlypersonl on this - I see people doing stupid, dangerous things every day in my commute to and from work. While I'm sure autonomous vehicles can and will manage to do some unintelligent things, I doubt they will rival many of the brain-dead or just plain risk-loving drivers we cross paths with every day. But what will be a real mystery is the multi-autonomous-vehicle situation, when each one is going through its risk/ethical/monetary/human injury hierarchy analysis and making decisions without knowing what decisions the other vehicles are making at the same time, decisions which may conflict. I guess it's not much of a stretch that at some point they will have all of the vehicles communicating constantly with each other and coordinating emergency actions. At that point I guess they will be deciding amongst themselves if a Democrat is more important than a Republican, or an engineer more important than a teacher, or a 30 year old more valuable to society than a 70 year old, etc.

strictlypersonl 07-25-2018 04:34 PM

And then there's this:
The trolley dilemma: would you kill one person to save five?
Life is complicated. Fortunately, dilemmas like that come up maybe once in a hundred lifetimes.

eschaider 07-25-2018 04:54 PM

The issue you are identifying, Dan, is just one variation of the decision tree dilemma that must be resolved by the on-board ethics logic. The challenge you have accurately identified has only one possible solution, and that is to have all ethics programs employ a basic set of cardinal rules and values to ensure all robots come to the same conclusion, the same way, at the same time.

A giant in the science fiction space, Isaac Asimov, evolved the most basic of these cardinal rules with what has become known as Asimov's "Three Laws of Robotics". Those three laws, as conceived by Asimov, are (in order):

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

While simple at first glance, the three laws are profoundly complete and all-encompassing, and they represent an excellent foundation on which to build the remaining ethical and operational routines.
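To make the precedence concrete, here is a toy sketch (my own illustration in Python, not anything Asimov specified or any real robotics framework provides) of how a machine might rank candidate actions so that a lower law always yields to a higher one:

Code:

def law_violations(action: dict) -> tuple:
    """Score a candidate action against the three laws, in priority order.
    Tuples compare element by element, so a First Law violation always
    outweighs any number of lower-law violations."""
    violates_first  = action.get("harms_human", False)      # harm, or allow harm, to a human
    violates_second = not action.get("obeys_order", True)   # disobey a human order
    violates_third  = action.get("destroys_self", False)    # fail to protect itself
    return (violates_first, violates_second, violates_third)

def choose_action(candidates: list) -> dict:
    """Pick the candidate with the lexicographically smallest violation tuple."""
    return min(candidates, key=law_violations)

if __name__ == "__main__":
    candidates = [
        {"name": "obey the order, a human gets hurt",  "obeys_order": True,  "harms_human": True},
        {"name": "refuse the order, nobody gets hurt", "obeys_order": False, "harms_human": False},
    ]
    print(choose_action(candidates)["name"])   # -> refuse the order, nobody gets hurt

In this toy version, refusing an order (a Second Law violation) always beats harming a human (a First Law violation), which is exactly the ordering Asimov intended.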

Asimov conceived these rules/guidelines almost 80 years ago, before we actually had any real robots to work with. The guy was not only an award-winning author with engaging publications, he was also quite gifted across a wide range of disciplines.


Ed

eschaider 07-25-2018 05:02 PM

Excellent ethical dilemmas, Bob. Those questions, and questions like them, are excellent representations of the additional ethical and operational layering necessary for these complex 'thinking machines' whose potential, both good and bad, we are only beginning to realize.

Ed

