

The Real Moral Dilemma of Self-Driving Cars



We talk about all the potentially challenging situations autonomous cars could get into, but not about how bad human drivers are. Tens of thousands of people die on the roads every year in collisions, most of which autonomous vehicles could prevent.
Sponsored by BMW

I had wanted to make a video about autonomous cars for some time but hadn't had the opportunity. Self-driving technology is already at a state where it could save lives, if only it were more widely implemented.

Links to original clips:
TED-Ed: https://www.youtube.com/watch?v=ixIoDYVfKA0
BBC Newsnight: https://www.youtube.com/watch?v=FypPSJfCRFk&t=172s

Music from http://www.epidemicsound.com "Ambient Electronic Groove," "Pet Animals 2," "The Long Ride."
Filmed by Raquel Nuno
Edited by Trevor Carlee


Views: 999369
Runtime: 3:41
Comments: 5370



Comments:

Author MyChico333 ( ago)
People don't care about being safe, people care about feeling safe.

Humans are morons by nature

Author Petr Novák ( ago)
To add my two cents to the discussion here, I see a scary problem different from ethics or assigning guilt in cases of accidents... I see a problem with hacking.
I presume the cars won't be completely separated from the global net. And if the software's architecture isn't 100% safe from breach, this would be the first time ever that someone could "hack" into a controllable lethal weapon on a large scale. I mean, pilotless war drones are probably not being remotely controlled by criminals, but there are very few of these, they are probably really well made, and they are controlled by the military. This stuff will one day be made in China for a thousand dollars... what level of security will those cars have?

Author Bongan ( ago)
A quick question: did you rewrite the script at all, or just go with what BMW gave you? Watching your other videos, there is a noticeable difference. And now I feel sad.

Author Jim Teevan ( ago)
The problem is fairness. I completely agree that fewer crashes are a good thing, and I also agree that automation is the way to do that. It simply "FEELS" unfair if I end up in an accident because of a machine's decision. If I crash, it's my fault, and I can deal with that. If something crashes me, it seems somehow unfair.

Author KDreezy Gaming ( ago)
"Filling the roads with autonomous cars will eliminate jobs" - False. Who will program new algorithms,software and protocols for these cars? Programmers. Who will repair these cars? Mechanics, electrical engineers, IT. Who will harvest the materials for these vehicles? Miners, heavy machine/vehicle operators, managers etc. Who will build the factories to produce these vehicles and the robots which assemble the parts that humans don't assemble? Construction workers. The list goes on and on. People said the same thing about computers and calculators when they came out. "What will all the people with jobs in mathematics and engineering do with all these computers eliminating their jobs?" They find other things to do. There is always something. I have the theory of the conservation of jobs, jobs can't be created or destroyed, they can only change form or be moved. :P

Author Daniel Matthews ( ago)
The moral dilemmas are not the problem so much as the fact that if you hard code solutions into car software people can then exploit those scenarios reliably to murder people. e.g. If I want to kill the driver of car [A] I just have to wait at a point I know they routinely pass and then roll an empty pram into the path of oncoming car [B] such that it swerves into the path of car [A] rather than kill a baby that does not have the protection of a car around it. Do it on a bridge or near a steep drop-off and the destruction is potentially even greater. It is the predictability of the autonomous cars that will be the problem, a very practical exploit and not a question of philosophy etc.

Author Borhan Zadeh ( ago)
Featuring a tiny glimpse of who films these vids for Derek (2:43).

Author lovebird mic ( ago)
So, one day, we may not need to worry about the problems of auto driving. If everyone had an autonomous car, we could put them on the same network and then they could tell each other "OBSTACLE! CAR ON LEFT? YES, CAR ON LEFT, BRAKE." Or something similar... Just an interesting idea...

Author Udayveer Singh ( ago)
I guess the main thing is that there are too many variables. The examples you gave, Veritasium, of AI in planes and elevators are much like cruise control in cars: it controls a limited number of variables and it can't really go wrong (for the cruise control example, that's why the driver is told to keep paying attention to the road). However, with complete AI control, there are way too many variables, possibilities, and unknown circumstances to be completely covered by an AI in a way that ensures human safety.

Author ProCactus ( ago)
So the solution is to build things for idiots and morons.

Author melissa zabower ( ago)
In a different video, you talk about thinking and Drew's inability to hold too many pieces of information. As a middle school teacher, I've had students ask if they can use the calculator, and I always say no, because we know that young children need to learn calculations so well that they become automatic. Every year, to prove my point, I would challenge a student to come up with a multi-digit multiplication problem. Another student and I would race; he/she would use a calculator and I would do it in my head. I almost always won. But I also practice. I know; I'm a geek. My point here is that if we allow drivers, especially young ones but really any age, to release their responsibility to the vehicle, then those people will lose the ability to analyze road conditions and make smart choices. I predict that we'll see this first with lane departure warnings. People will get used to hearing a beep, until it becomes part of the background and they don't hear it anymore. So yes, for a while you'll see fewer accidents, but I think in the long run it will backfire. Someone needs to put cell phone jammers in cars so that as long as the tires are moving, the cell phone won't work. I think that will do a lot more to reduce accidents than an autonomous car.

Author 1MoreRep ( ago)
I will trust my own brain over a machine thank you.

Author Broockle ( ago)
Why even own a car? Cars should be a lot smaller on average and owned by private companies or the government or whatever, and you could just order one with an app. It would pick you up, drop you off, and go meet the next person's needs.
Like a self-driving Uber, basically. Is that a thing yet?
Also drones could replace delivery in many forms with the right infrastructure, which would take more vehicles off the roads.

Author Dennis Doroslovac ( ago)
Remember the BlackBerry days, when your device froze and required a battery pull? What happens when the car's computer freezes?

Author blonde brute ( ago)
The biggest reason I want a self-driving car...
So I can get wasted but take my car home XD

Author Scott Kendall ( ago)
Veritasium had a good segment on probability; this observation should have been on his mind when he made the comment that planes on autopilot are safer. It's a poor example. Pilots put planes on autopilot when the probability of things going wrong is lowest. Planes crash on takeoff and landing; rarely do they crash in between, when pilots have them on autopilot.

Author First Last ( ago)
I like how half of these comments are science-based comments from experience, with disagreement and open discussion of both pros and cons of the idea, even though I'm totally against the idea of cars with no human control.

Author Bob Bob ( ago)
I always thought the moral dilemma was the CIA's ability to control these kinds of cars.

Author Momo Surname ( ago)
If everyone had a self-driving car then the dilemma would not exist. No programmed car will make a mistake.

Author Yonkage ( ago)
The answer to the moral "dilemma" is to not swerve and endanger any vehicle other than itself.

Author First Last ( ago)
How many fatal airplane crashes killing hundreds of people at once occurred because people were relying on a bad auto pilot?

Author Sam Stevenson ( ago)
Should have also talked about the Horsey Horseless. Seems like a good automotive example of people not being comfortable with change.

Author Samuel Clayton ( ago)
Or... we could just bring back trains...

Author ganondorfchampin ( ago)
I mean, you need to make sure the cars are safe before you put them on the road, moral dilemmas aside.

Author Antie Cuteness ( ago)
Why don't we all use bikes?

Author MilitantPeaceist ( ago)
The real moral dilemma is:
We can accept that humans make mistakes because we know we make mistakes, so we would accept a person choosing either the car to the left or the bike to the right, since the split-second decision time prevents any rationale from being considered.
But we would not accept this in a machine that decided on the same course each time, resulting in bike riders having a phobia about passing or riding next to an automated car, etc.

The blame will be placed on the technology itself (not the driver) and then on the software writers, and they will be morally judged differently from a human driver forced to make a split decision, because the software writers chose for the bike rider to die.
This will feel like premeditated homicide.

Author Matthew Hafner ( ago)
The correct choice is that the car should make the decision most likely to protect its occupants. I would never buy a car that is programmed to kill me. Nor would I hire a bodyguard who doesn't put me as his priority. Most people agree with this sentiment.

If you can't convince the majority of people to buy autonomous cars, there *won't be* autonomous cars.

Author codediporpal ( ago)
The correct answer is always SLAM ON THE BRAKES!!!! That's why all accidents involving self-driving cars have the self-driving car being rear-ended.

Author ZloTip ( ago)
+Veritasium There was a case of death caused by a self-driving car. The driver was killed. No thanks from me.

Author Chuck Norris ( ago)
Wow. I just noticed Veritasium has an atomic mass of 42.0.








420?

Author Axel Schmidlin ( ago)
This video explains the problem in way too simplistic a manner. Treating the moral dilemma purely with "quantifiable" arguments (number of deaths, accidents, etc...) does not bring out the real moral dilemma: the fact that we lose our ability to choose. Imagine if, instead of your mentioned scenarios, the driver had to die to save an entire school bus? Should the driver do so? What if it was just one family car? We must not let algorithms take such decisions, no matter the death toll that produces. We may reduce the number of people dying on the streets, but in doing so we are actually diminishing our right to self-determination and what constitutes our humanity.

Author Michael Colby ( ago)
Hurry the heck up and mainstream self-driving cars so I can sleep during the extra 20-minute drive to work. I want tinted windows as well so I can play with mys ... I mean pick boogers without anyone noticing.

Author Adam Eason ( ago)
Yeah, like who would program a car specifically to be racist? It doesn't have time to calculate that stuff. In this low-chance scenario it doesn't really matter in the long run, because it's not going to happen very often, and if it does, why can't the driver take control?

Author Adam Eason ( ago)
0:43 Huh... they should really make a smartphone with haptic feedback... kind of like Nintendo's HD Rumble... so that smartphones can have good gamepads... and then we should make a new store for apps, because the Google Play Store is nothing but shovelware...

Author jhanks2012 ( ago)
The real answer to the posed dilemma is that the car will attempt to swerve towards either the SUV or the motorcycle, and it will communicate with whichever vehicle so that vehicle also moves, and so on if there is another vehicle beside that one, etc. There is no reason to assume the cars will not be aware of each other and talking to each other. Essentially all accidents will be avoided, because each car will respond to the others in a way that avoids collisions altogether.
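
A minimal sketch of the kind of vehicle-to-vehicle handshake this comment imagines. Every name, field, and threshold here is a hypothetical illustration, not any real V2V standard (production systems use protocols such as DSRC or C-V2X):

# Hypothetical swerve-negotiation sketch; message format and logic are
# invented for illustration, not taken from any real V2V protocol.
from dataclasses import dataclass

@dataclass
class SwerveRequest:
    sender_id: str
    direction: int   # +1 = move one lane right, -1 = move one lane left
    urgency: float   # 0.0 (advisory) .. 1.0 (imminent collision)

class Vehicle:
    def __init__(self, vehicle_id: str, lane: int):
        self.vehicle_id = vehicle_id
        self.lane = lane

    def handle(self, req: SwerveRequest):
        """Yield a lane and pass the request down the row of cars."""
        if req.urgency < 0.5:
            return None                      # low urgency: hold position
        self.lane += req.direction           # make room
        # Re-broadcast with decayed urgency so the chain terminates.
        return SwerveRequest(self.vehicle_id, req.direction, req.urgency * 0.7)

# A braking car asks its right-hand neighbours to shift over, one by one.
row = [Vehicle("suv", lane=2), Vehicle("sedan", lane=3)]
msg = SwerveRequest("ego-car", direction=+1, urgency=1.0)
for car in row:
    if msg is None:
        break
    msg = car.handle(msg)
print([(c.vehicle_id, c.lane) for c in row])  # [('suv', 3), ('sedan', 4)]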

Author Tony01013 ( ago)
Here's a moral dilemma for you: How can you procreate knowing full well that your offspring is going to die and had no choice of whether or not he/she wanted to exist in the first place.

Author 14OF12 ( ago)
Why is braking never an option in these moral dilemma problems?

Author Ashton Giovanni ( ago)
I personally don't think tech is ready for self driving cars. I think we should focus on improving current designs and maybe computer assisted driving.

Author 99NEPtune99 ™ ( ago)
The real moral dilemma is whether the car should save you or pedestrians outside. Like the car either swerves into the crowd to save you, or the car causes you to die and not run over pedestrians

Author Blarghenschnarf ( ago)
The way I see it (This is how *I* see it, so, y'know, opinions and such), is that the problem with driverless cars is their "brains" are too...weak. Much slower in processing, much worse memory, much less intuition, much less adaptability than a human brain, but the problem with drivers is their brains are too strong. I tried to fit this in four lines, but sadly...

The human brain is super powerful, so even something as complex as driving is so simple the mind can wander and do something else and have a fair chance of not crashing, and if the driver is concentrating completely on driving, even in bad conditions, you (the driver) can trust yourself. So the mind wanders because driving is too easy, and so people get distracted and crash.

I guess what i'm trying to say is that each system has positives and negatives. Computers are laughably, astoundingly worse than a human mind and might not be capable of handling sudden icy roads or potholes or whiteouts in bad weather, and might not be able to make the "right call" if something horrible happens, but on the other side, driving computers can't get drunk, distracted, angry, or sleepy.

If you give a car a perfectly flat road with no cars close to it on a sunny day, the car is going to be better than the person. But if something goes wrong and some quick thinking or intuition is required, I'd rather be the one driving. So weigh the odds: one situation happens more often, and the other is catastrophic if you get it wrong.

Author the official sandidge02 ( ago)
Soo... electric cars may be "almost" on the road by the time global warming is bad? Because Arkansas will have a beach in the middle of the state...

Author Drew Downey ( ago)
Google is the only viable technology despite not having millions of blue money to put into it. The answer is if it's not Google, kiss the ground when you get out. The fact you let that BMW drive is equivalent to a Blind Man and a great memory.

Author okrajoe ( ago)
Hey, it's cruising along in Vegas!

Author Tyler Mauldin ( ago)
I'm skeptical that self-driving cars are intrinsically safer than people-driven cars. I personally wouldn't put my life in the hands of a self-driving car. Even if they are safer, that will take the fun out of driving cars. :( But of course, if they really are safer, it's better to go with the boring option if it saves thousands of lives.

Author Zeb ( ago)
Great point about the elevators. I wonder how manufacturers can make self-driving cars appealing to human psychology at first, too.

Author Michael Zhang ( ago)
I think if all the cars were self-driving, with the driving AI running the same code, there wouldn't be much of a problem. If you think about it, if all the cars were following one rule, that would be safer.

Author none ya ( ago)
I can't see myself ever being comfortable with this self-driving car crap. I don't think it'll solve anything in the long run. Are these vehicles going to be impervious to malfunctions? Just what we need, cars going haywire with no way to stop them! Or worse... people hacking into these cars and driving them from their couch. People love to have the latest and greatest and are also lazy, so these things will sell no matter what is said... and that's what it all comes down to, money.

Author Wurstwaren Schmecken ( ago)
I am disappointed in this video. Very superficial. At least you can read in the description that this is pure advertisement for BMW and autonomous cars. Sad.

Author Kalum Batsch ( ago)
2:11 "We're using our phones"

What do you mean, "we"? That super cool douchebag there is using his phone.

Author Leslie Borregard ( ago)
Wonder how self-driving cars would road rage after they somehow crashed into one another?

Would they just shoot web slang at each other like STFU or GKY while us humans watched?

Imagine two cars fighting, like a Prius vs. a Tesla Model X; it'd be highway gold lol.

Author Soryykah Wilson ( ago)
The moral dilemma of "where to swerve" can be fixed by a cloud system between the cars themselves. The car senses the impending danger in the road ahead and sends this signal to all the cars around it, which are also getting their own input. The cars determine the best, safest way to react, and the entire roadway moves in brilliant synchronization to avoid disaster.

The only problem here is, if the entire roadway is connected, how does this open up cars on a network to being hacked and used? It's a whole new business of cyber security.
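
One standard mitigation for exactly that hacking worry is to authenticate every broadcast, so a spoofed "danger ahead" message gets dropped. A minimal sketch using Python's hmac module; the single fleet-wide shared key is an assumption made for brevity, where a real deployment would use per-vehicle certificates:

import hashlib
import hmac

# Simplification: real V2X security uses per-vehicle certificates (PKI),
# not one fleet-wide shared key.
SHARED_KEY = b"fleet-secret"

def sign(message: bytes) -> bytes:
    return hmac.new(SHARED_KEY, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    # compare_digest runs in constant time, resisting timing attacks on the tag.
    return hmac.compare_digest(sign(message), tag)

msg = b"BRAKE: obstacle ahead, lane 2"
tag = sign(msg)
print(verify(msg, tag))                 # True: legitimate broadcast accepted
print(verify(b"SWERVE LEFT NOW", tag))  # False: forged message rejected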

Author EElectric_M ( ago)
The solution? Don't make self driving cars and invent teleportation. No need to thank me.

Author TL M ( ago)
How about technology where the car's computer doesn't start because it detects alcohol on your breath? Or when you get in the vehicle, your phone automatically goes into lock mode while the vehicle is in drive. Are you tired or sleepy? The computer says: sorry, get some rest, I detect heavy eyelids, this vehicle will not start. I believe technology can actually make us better drivers if we allow it to.
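
Pieces of this already exist (breathalyzer ignition interlocks, do-not-disturb-while-driving phone modes). A toy sketch of the gating logic the comment describes; the sensor names, thresholds, and phone hook are all invented for illustration:

# Toy pre-start checks; sensor readings and cutoffs are invented values,
# not figures from any real interlock system.
def may_start(breath_alcohol_mg_l: float, eyelid_closure_pct: float) -> bool:
    if breath_alcohol_mg_l > 0.05:        # hypothetical interlock threshold
        print("Engine locked: alcohol detected on breath.")
        return False
    if eyelid_closure_pct > 60.0:         # hypothetical drowsiness cutoff
        print("Sorry, get some rest. I detect heavy eyelids; not starting.")
        return False
    return True

def on_shift_to_drive(phone) -> None:
    # Hypothetical handset hook: block notifications/apps while in Drive.
    phone.enable_driving_lock()

print(may_start(breath_alcohol_mg_l=0.00, eyelid_closure_pct=10.0))  # True
print(may_start(breath_alcohol_mg_l=0.12, eyelid_closure_pct=10.0))  # False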

Author Fennec Besixdouze ( ago)
My favorite study about self-driving cars is one that shows Google cars have an astonishingly high accident rate, about as bad as senior citizens and slightly higher than teenagers. Except that wasn't what the study decided to highlight; instead they judged "at fault" using some fairly questionable metrics and declared "when Google cars get in accidents, human drivers are to blame".

In other words, self-driving cars are unpredictable to humans; they don't behave in human ways, which causes humans around them to get into accidents that, under some stringent technical definition, are "the human's fault", but under common sense are obviously the Google car's fault, because the average human wouldn't have such a high accident rate.

If you want to lower the overall accident rate, you want to lower everyone's accident rate. Google cars increase the accident rate, and yet they're defended because of some tricky accounting that blames their atrociously high accident rate on other human drivers. This is an anti-scientific absurdity: confirmation bias of literally the highest order. Self-driving cars are not decreasing accident rates, they are increasing them, and there is no reason whatsoever to believe that, as more and more manufacturers adopt makeshift versions of this software and put it under the guidance of wider and wider segments of the population, this will improve, and every reason to suspect it will lead to more accidents.

A shocking reality most people don't understand is that self-driving cars are most successful by COPYING THE HUMAN DRIVERS AROUND THEM. Fleets of self-driving cars, despite the common misconception that they would all act in magical harmony with each other and eliminate all traffic jams, are actually horrendously poorly behaved at the moment.

Unfortunately this healthy skepticism, which is actually well accepted in the research community, is associated by the public subconsciously with anti-science movements like anti-vax nonsense or climate change denialism, meaning people react instinctively in a negative way to someone pointing out that self-driving cars are not working. And even though researchers have a better understanding and are more skeptical, they can't really share that with the public, because the public is also notorious for not funding anything they don't believe in with all their soul (despite the fact that all good science comes out of studying things that AREN'T a sure bet, and a good science funding program will fund more duds than successes, because you have to bet wide to win big in science). The confirmation bias feeds in on itself to the point where policy and scientific practice merge into a bastardized conflagration of self-confirming idiocy.

Author Underscore Zero ( ago)
The trick the companies need to play is: make the user _feel_ in control, like they're doing most of the work, when in reality very little is controlled by the user. I mean they should make cars basically be in 'easy mode', you could say.

Author Morph Verse ( ago)
Can't wait for this future; it can't come soon enough..

Self-driving cars will likely go full scale when 5G internet becomes widespread..

Author GeorgeIsYourMan ( ago)
Nailed it!

Author matt orton ( ago)
Machines should be programmed to be safe and not have a need for self preservation. Face the facts: your car will kill itself and you before killing another. Once this is removed from the programming we have real AI to deal with.

Author Andrew Marx ( ago)
Fascinating! A haptic adaptive interface...and we're a hundred years ahead of time!

Author Tristan Johnson ( ago)
why don't those stupid cars drive themselves on the road

Author Nordkiinach ( ago)
Think deeply about the answer I am giving you... This is why - this answer is the only fact.




Money.

Author Marko ( ago)
1:42, even the car itself knows how to be a typical BMW driver and park right on the white parking line. What a time to be alive.

Author OriginalPiMan ( ago)
To the question of the motorbike or the SUV, the answer is the SUV, because its occupants are less likely to be seriously injured or killed in a crash.
Of course, a good self-driving car should be far enough back from the next vehicle that it could stop in its own lane.
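
That following-distance point can be made concrete with the textbook stopping-distance formula d = v*t_react + v^2/(2a). A quick calculation, with the reaction times and deceleration assumed for illustration (~0.3 s sensor-to-brake latency for a computer, ~1.5 s for an attentive human, ~7 m/s^2 braking on dry pavement):

# Stopping distance d = v*t_react + v**2 / (2*a).
# Reaction times and deceleration below are assumed illustrative values.
def stopping_distance(v_ms: float, t_react_s: float, decel_ms2: float) -> float:
    return v_ms * t_react_s + v_ms**2 / (2 * decel_ms2)

v = 100 / 3.6  # 100 km/h expressed in m/s (~27.8 m/s)
print(f"computer: {stopping_distance(v, 0.3, 7.0):.0f} m")  # ~63 m
print(f"human:    {stopping_distance(v, 1.5, 7.0):.0f} m")  # ~97 m

On those assumed numbers, the machine's shorter reaction time alone buys back over 30 m at highway speed, which is the commenter's point about being able to stop in-lane.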

Author Achmed ( ago)
Self-driving cars don't have to be perfect, they just have to be better than human drivers. Which is a very easy target to achieve.
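
For scale, that bar can be put in rough numbers. In the US around the time of this video there were about 37,000 road deaths per year over roughly 3.2 trillion vehicle-miles traveled; a back-of-envelope calculation (both inputs approximate):

# Back-of-envelope human baseline; both inputs are approximate (~2016 US data).
deaths_per_year = 37_000
vehicle_miles_per_year = 3.2e12
rate = deaths_per_year / vehicle_miles_per_year * 1e8
print(f"~{rate:.2f} deaths per 100 million vehicle-miles")  # ~1.16

An autonomous fleet would have to beat that rate across all driving conditions, not just easy highway miles, to clear the bar this comment describes.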

Author 556johny556 ( ago)
So BMW, a brand that insists it is still concerned with building fun driving machines, sponsors a video talking about getting rid of driving. Interesting. Other than that, the response to the "real moral dilemma" is that autonomous cars are not yet capable of being safe outside of highways. And sometimes not even on them. So as much as many manufacturers like to wave 2020 around as the year the revolution happens, we might need to wait a little longer to purchase something that gets us from point A to point B without any input other than what points A and B are.

Author Gary Generous ( ago)
My issue with "driverless" vehicles is not so much with the type of vehicle shown; I would likely buy one when affordable. The type of driverless car I have an issue with is the type Tesla is suggesting, where there is NO command console (steering wheel, gas, and brakes) with which I can control the vehicle. First, the technology is too new to be truly comfortable with, and more importantly there is the question of who is legally responsible.

If I am in a vehicle that is completely driving itself and something goes wrong (e.g. black ice at highway speeds causing my vehicle to crash into another vehicle, killing one or more passengers in that vehicle), who is legally responsible? The car manufacturer who made the vehicle and maintains the self-driving software, or me, as the owner of a vehicle over which I had absolutely no control?

On the one hand it would seem that it should be the manufacturer who is responsible, not me, but insurance and legal codes currently do not allow for this sort of scenario, and thus I would likely be liable for an incident which I could not have stopped or altered in any way.

On the other hand, in the example used in the video of self-driving elevators, the owner of the building is liable for any incidents which occur on their premises, not the manufacturer, unless it can be proved that the manufacturer knew about an issue and did not correct it. In the current car situation, I as the owner of the vehicle am liable for any damages which occur due to my vehicle, even if I am not the person driving, or even in the car. So we can see that currently liability would fall squarely on the owner of the vehicle, even though that owner would have no control or way of controlling the vehicle at the time of the incident.

Until this gets worked out in a way equitable to all parties, and while I will definitely use driving and safety aids, y'all will only get my steering wheel out of my cold dead hands.

Author A Dude ( ago)
Mistitled video; it should be "A Moral Dilemma in Delaying the Implementation of Self-Driving/Autonomous Vehicles", because there are many other moral dilemmas. For example, what do you do with the hundreds of thousands of people directly employed to drive vehicles around, most of whom would no longer be employed?

Is saving 10,000 lives a year - many of whom are killed due to their own error - worth the entire elimination of 100,000 to 1,000,000 (or many more, if you want to include industries that rely on drivers as a main source of customers, like restaurants, coffee shops, etc., particularly in rural to small urban centres between the large urban centres) fairly well-paying jobs?

Is the unknown (and potentially unknowable, prior to implementation) risk of catastrophic failure of a driving network greater than the known risk of driving today? I.e.: someone finds a way to take control of cars remotely? The Vault 7 release from WikiLeaks already indicates that the CIA may have the ability to take control of cars remotely; imagine if Putin had that power and could take control of the cars driving around Congressmen or their families. Or someone connected to ISIS. Or just some random 400lb jerk in his basement out for shiate and giggles. Or a debt collection agency could simply drive a car from your lot to their lot.

What is the risk to losing that essential skill - driving - to our society, which is so dependent on goods and services moving around with precise timing? If there is a hiccup in the system (and EVERY computer system has hiccups) that causes trucks to be unable to drive, there is a real possibility that people may die due to lack of critical needs (perishable medicines for the most part) at critical times. The longer society is used to vehicles that drive themselves, the more people there will be who are unable to drive a vehicle "manually" (in this case meaning driving as we know it today, not driving a manual transmission vehicle). As an example of essential skills being lost, most people in the West today could not reliably grow their own food if their lives literally depended on it - the majority of people in the West would starve to death if they were forced to suddenly grow their own food. Particularly in places with a winter that stops any plants from growing.

Etc., etc. Etc. So the idea that this is "The Moral Dilemma" is just plain wrong. It is A moral dilemma. I would argue that it is not even the most important moral dilemma, personally, though I could be convinced otherwise. But it is not THE Moral Dilemma. Thumbs down for that arrogance.

Author m3gadork ( ago)
I'm gonna ask the obvious question about the holographic screen at the beginning...why wouldn't they just use a touchscreen?? It's super cool and futuristic, but why???

Author Ste Richards ( ago)
2:47 - Does he hire an old short Japanese lady to hold the camera?

Author Mike Boulrice ( ago)
Because of government regulation and liability.

Author James Edward ( ago)
If you want to make a real moral choice to stop needless death, you would stop eating meat.

Author chocolate man ( ago)
i hope you die you money hungry cunt

Author MEMECREFT LATS PLEYS! ( ago)
3:30 + 420

Author Jonah Kirk ( ago)
Ah yes, I recall learning about the Industrial Revolution and how everyone was having philosophical debates about the ethics of factories and engines, as well as the tangential jobs they put in place. Then again, a lot of those jobs were boring and highly dangerous at the time. Still more profitable than others around. Now with these "self-driving cars" there are fewer of the boredom and safety concerns, as this technology is implemented to lessen the boredom of driving and increase safety... You know what? We'd be better off living by a lake. Come on, people!

Author Bartosz Buliński ( ago)
Where is the joy and pleasure of driving?

Author Chris Thomas ( ago)
I need one because I'm always wanting to go places, but my vision isn't good enough for driving.

Author Remi Caron ( ago)
The real problem is why haven't we banned cars altogether since they are the most inefficient way to move people or things?

Author Carson Landry ( ago)
Could have mentioned Teslas...

Author Edhi Kurniawan ( ago)
I think it's because of the disparity between the self-driving car population and the man-driven one.
If all the cars on the street were self-driven, and there were a perfected expert system that could sense abnormalities that could lead to accidents and then regulate the entire driving network, like moving the car on the right faster so it would be safe to swerve to the right, it would be perfect, at least I think so.

Author Hutchens ( ago)
Yeah, I'll just stay in control of my own vehicle and not text and drive.

Author Jappleng University ( ago)
If you don't give the driver a moral choice configuration, very few people will opt to ride in a self-driving car. Someone jaywalking / running across the street unexpectedly should not be the fault of the driver, should the only choice be their death or your death, regardless of age. People usually value their own lives more than other people's, and that means pedestrians and cyclists had better learn to watch for self-driving cars when they become commonplace. I don't think there's a moral dilemma if an accident happens while the road laws are followed and there was no software bug. Notice that there hasn't been a self-driving vehicle accident that was caused by the self-driving vehicle, only by those around it breaking the road rules / law, which caused said accidents. Again, if they expect people to buy them, cars will need to do everything they can to protect the passengers inside the vehicle, and other cars on the road will have to try to do the same. If the risk of driving a motorcycle means getting hit by a self-driving car in an unusual circumstance, then so be it; it would be the same if a human driver were involved anyway, except humans are less likely to avoid such situations. To avoid 99.9% of accidents with self-driving cars, or to keep going with the ~50% chance of getting into an accident by human error? Not much of a dilemma there.

Also, stay about 300ft away from vehicles carrying anything, it's a pretty common rule to follow, perhaps a law. No self-driving vehicle would stay close to that vehicle in real life.

Author daddyleon ( ago)
Inspired by CGPGrey on HI?

Author Michael Hutchinson ( ago)
People aren't thinking of the bigger problem this solves... insurance. Rear-end collisions will no longer be your fault, will they?

Author Dead Zone ( ago)
"How much can you trust an autonomous car?"
Well, let's see.
A major hack attack could send millions of cars off the road.
60,000,000 cars on the road can already be remotely controlled by hackers, and the CIA/ NSA.
Blue screen of death.
The self-driving vehicle could value the life of, say, a jaywalking homeless man over my own life, and slam me into a wall and kill me instead in order to save him.
I guess I have to say, I don't trust self driving or heavily electronic cars very much at all...

Author Blast King ( ago)
I expect trolls in the future to troll those self-driving cars.

Author orlando garcia ( ago)
My problem with this, as a car guy, is not the safety; it's the driving experience. If I wanted to go from A to B I would take a taxi, or buy an autonomous car that isn't too powerful, because I wouldn't be the one driving it.

Author SnakeEngine ( ago)
Sorry, no, I will never trust an auto-car; there are just too many factors where it can go wrong. Maybe as an emergency solution it will be a great aid, when the driver falls asleep or a threat that has to be avoided becomes obvious, but other than that a computer can never be as robust as a biological system.

Author The Pot Scientist Reports ( ago)
Also, there's the moral dilemma of organ donations! Without all those auto accidents, fewer organs will be available to save lives of those who need them!

Author Salt & Pepe 69 ( ago)
Or you could just make urban sprawl nonexistent and have walkways like Japan's that go over streets. Lmfao, problem solved.

Author Mixlop ( ago)
What's with all the dislikes

Author Man Lamp ( ago)
If every car were to be self driving, there would be no accidents. Sensor = ez

Author JayJay Jones ( ago)
It's the next best thing after a flying car!

Author G Eduardo Bicelis G ( ago)
I suppose autonomous driving is an option to be added, not the only way to commute. Is that what is being programmed? Thanks for your educational videos.

Author Brandon Marvel ( ago)
So when a BMW drives itself, does it also refuse to use the blinker?

Will the BMW self-driving car still drive like most BMW drivers?

Author Dan Baker ( ago)
You are a fool.

Author zippy ( ago)
It is going to be a lot longer than you think before we have delivery drones and driverless cars. They just don't have the tech yet. They tried it in California and there were too many accidents. Not for 50 years.

Author No Name Provided ( ago)
Bigger moral dilemma: anybody who gets their hands on CIA exploits can self-drive you right into a tree!

Author T Kevin ( ago)
This video isn't supported by Audible.

Author Klaudius Harsch ( ago)
Hey Veritasium, you are one of the only YouTubers who gets a thumbs up from me before I even watch the video. I haven't seen a bad video from you. I really appreciate what you are doing here. Never stop!

Author Paskaloth ( ago)
Doesn't seem so complex. Make its moral compass an option to be set up or changed; that frees the car manufacturer from blame and keeps the moral dilemma where it's always been and, in my opinion, where it should always be: with the driver.
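
A purely illustrative sketch of what such a user-set option could look like. No manufacturer actually exposes a setting like this, and whether one legally could is an open question; the modes, risk tuples, and maneuver names are all invented:

# Purely illustrative: a user-selectable collision-ethics policy.
from enum import Enum

class EthicsMode(Enum):
    PROTECT_OCCUPANTS = "protect occupants first"
    MINIMIZE_TOTAL_HARM = "minimize expected casualties overall"
    NEVER_SWERVE = "brake in lane; never redirect risk onto others"

class CollisionPlanner:
    def __init__(self, mode: EthicsMode):
        self.mode = mode  # set by the owner, who thereby owns the dilemma

    def choose(self, options):
        # options: list of (maneuver, occupant_risk, external_risk) tuples
        if self.mode is EthicsMode.PROTECT_OCCUPANTS:
            return min(options, key=lambda o: o[1])
        if self.mode is EthicsMode.MINIMIZE_TOTAL_HARM:
            return min(options, key=lambda o: o[1] + o[2])
        return next(o for o in options if o[0] == "brake_in_lane")

planner = CollisionPlanner(EthicsMode.MINIMIZE_TOTAL_HARM)
options = [("brake_in_lane", 0.6, 0.0),
           ("swerve_left_into_suv", 0.2, 0.3),
           ("swerve_right_into_bike", 0.1, 0.9)]
print(planner.choose(options))  # ('swerve_left_into_suv', 0.2, 0.3)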
