The Real Moral Dilemma of Self-Driving Cars

  • Added:  4 months ago
  • We talk about all the potentially challenging situations autonomous cars could get into, but not about how bad human drivers are. Tens of thousands of people die on the roads every year in collisions, most of which could be prevented by autonomous vehicles.
    Sponsored by BMW

    I had wanted to make a video about autonomous cars for some time but hadn't had the opportunity. Self-driving technology is already at a state where it could save lives, if only it were more widely implemented.

    Links to original clips:
    TED-Ed https://www.youtube.com/watch?v=ixIoDYVfKA0
    BBC Newsnight: https://www.youtube.com/watch?v=FypPSJfCRFk&t=172s

    Music from http://www.epidemicsound.com "Ambient Electronic Groove," "Pet Animals 2," "The Long Ride."
    Filmed by Raquel Nuno
    Edited by Trevor Carlee
  • Video Category: Education
  • Runtime: 3:41
  • Tags for this video:  veritasium  self-driving  autonomous  car  cars  moral  ethics  ethical  dilemma  trolley problem  bmw  self-driving cars  autonomous cars  vehicles  driverless  automatic  safety  

Comments: 5 523

  • MyChico333
    MyChico333 4 months ago

    People don't care about being safe, people care about feeling safe.

    Humans are morons by nature

    • ManintheArmor
      ManintheArmor 19 days ago

      The problem is that we give them too much of a choice in the matter. Not only that, but there are people who need to get places even though they're not qualified to drive, and no one cares enough to lend a hand. Not everyone can drive, but they have to, considering the nature of our society.

      Society is said to owe you nothing, but it does, considering how much it wants to sustain itself. Not everyone wants to be part of it, but they end up being assimilated into it anyway. Thus it is a priority for society to equip its citizens appropriately rather than to pretend these tools are privileges... no, these are necessities.


    • Andrei S
      Andrei S 23 days ago

      If you actually believe that, you are gravely mistaken on two counts.
      1. 99% of human-error accidents are avoidable by humans (i.e., by not speeding, not driving recklessly, not drinking and driving). Educating humans to stop driving like it's a game will reduce the number of deaths more than 3-4 decades of self-driving car development will. If a driver drives as he should, he won't kill anyone.

      2. People right now are speeding and driving recklessly despite the huge risks they face. They don't care about them. Do you actually believe they would give up driving for an autonomous car just because of its safety? They won't. Those cars will come, nobody will use the feature, they'll become a gimmick, and that will be the end of them. I agree it's sad, but that's the truth.


    • Bastian Cellarius
      Bastian Cellarius 23 days ago

      @FreelanceDev4life Good for you. I just don't think other people will die as gladly through your fault.


  • Sander Koni
    Sander Koni 1 day ago

    The Tesla Model S and Model X can already drive themselves.

  • LaFleur13
    LaFleur13 1 day ago

    That's the best moral dilemma you can come up with? There are more important ones than this one...

  • EmuStar Gaming
    EmuStar Gaming 2 days ago

    The real problem with self-driving cars is how much they'll cost. While yes, they can and will save numerous lives and trips to the emergency room, plus millions of dollars in vehicle damage, how will any of this happen if a lot of people can't afford the car in the first place?

  • Thashreef Muhammed
    Thashreef Muhammed 2 days ago

    What if a criminal hijacks the car by hacking...??? I am afraid of a disaster like the one in The Fate of the Furious...

  • Ecosse57
    Ecosse57 3 days ago

    just take a cab and let the rest of us not be utterly consumed by technology. friggin lemmings.

  • nwimpney
    nwimpney 4 days ago

    As with most statistics, the interpretation of the data is key. Even if 94% of collisions are caused by driver error, you have to consider whether the average driver's likelihood of causing a crash is proportional.
    If a very small percentage of drivers are "bad" and causing those accidents, the self-driving car may beat the average in aggregate while still putting the average driver at greater risk than they would face driving themselves.

    I don't know if this is the case or not, but most of the statistics I've heard are pretty simplistic.
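
    The aggregate-vs-individual point in the comment above can be made concrete with a toy calculation. All numbers below are invented purely for illustration, not real crash statistics:

```python
# Toy illustration: a small minority of "bad" drivers can dominate the
# population-average crash rate, so an autonomous car can beat the
# average while still being riskier than a careful driver.
# All numbers are hypothetical, chosen only to show the arithmetic.
bad_share = 0.05    # 5% of drivers are "bad"
bad_rate = 0.020    # crashes per driver-year for a bad driver
good_rate = 0.001   # crashes per driver-year for a careful driver

# The population average an autonomous car is usually compared against.
avg_rate = bad_share * bad_rate + (1 - bad_share) * good_rate
print(round(avg_rate, 5))  # 0.00195

auto_rate = 0.0015  # hypothetical autonomous-car crash rate

# The autonomous car beats the population average...
print(auto_rate < avg_rate)   # True
# ...yet is still riskier than the careful majority would be on their own.
print(auto_rate > good_rate)  # True
```

    Under these made-up numbers, the 5% of bad drivers contribute more than half of all crashes, which is exactly why "better than average" and "better than you" can come apart.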

  • Angry Encore
    Angry Encore 4 days ago

    Why do we have to choose between hitting the SUV or the motorcycle? If we're talking about a future car, it should be able to hit them both.

  • Angry Encore
    Angry Encore 4 days ago

    Always swerve into the motorcycle... that's an easy one, everyone knows that.

  • Hoss Cartwright
    Hoss Cartwright 4 days ago

    The only reason I'd like to have a driverless car is to get my drunk ass home.

  • Mole Downunder
    Mole Downunder 4 days ago

    The behaviour of the self-driving car should correspond to the rational self-interest of the driver as much as possible. We don't need to decide what should be programmed into the car; our focus should be on making it as close to the driver's rational self-interest as possible. This will improve over time as technology advances, so I don't think there really will be a dilemma. For example, a crude recognition system could swerve the car towards the bigger vehicle. A more advanced system could swerve the car towards what it recognises as a stranger rather than a family member.

  • Area85 Restorations

    Autonomous cars are going to kill more people than they save, in my opinion. My friend's 2016 BMW X5 caught fire one morning due to a rat chewing on the harness. Imagine doing 90 mph when your autonomous car has a short and steers you into a barricade. Murphy's law is an absolute, coming from someone with a master's in mechanical engineering.

  • Wolflens
    Wolflens 5 days ago

    Trains?

  • SweetAsHoney
    SweetAsHoney 5 days ago

    Yep, more control/regulation by the government and corporations. More conformity, less freedom. Make people feel safer in order to steal their freedom. A simple strategy used for thousands of years by royals and elites.

  • legion
    legion 7 days ago

    NO MORE TICKETS!!!

  • pascal jacob
    pascal jacob 8 days ago

    I can't wait to drive to work and then send my car off to find a parking space on its own.

  • Des Troya
    Des Troya 9 days ago

    I speak for millions of Americans: we don't want self-driving cars. The price for them is just not worth it. Not only the cost of the new technology, but the cost in terms of lost freedom and more control for government regulators and car companies over where we can or cannot drive. Also, the possibility of cars being hacked is already a very real issue; just look at what happened to Michael Hastings. Finally, there is the issue of who is responsible for accidents from self-driving cars, and the feeling of helplessness as your car crashes for you when, if you had been behind the wheel, you could have avoided it. It is obvious from the start that this is not a good idea, and I hope it never becomes a reality.

  • Robert Weekes
    Robert Weekes 11 days ago

    The other dilemma is replacing the millions of people who drive for a living; transportation is the #1 industry in America.

  • Josh Daniel
    Josh Daniel 11 days ago

    You shoulda tried a Tesla Model X

  • PowerPC602
    PowerPC602 12 days ago

    I WANT TO DRIVE MY CAR. PERIOD.

  • Gleison Storto
    Gleison Storto 15 days ago

    I am still skeptical about seeing a company assume full responsibility for a self-driving car. I am sure they will keep Tesla's or the aircraft manufacturers' policy that the driver/pilot should always be ready to take control of the vehicle if the vehicle's internal systems fail.
    I would also like to see studies about driver training. Pilots must be periodically trained to be able to handle the automation in a modern aircraft. Why should self-driving cars be different?

  • Yuriy Fitsay
    Yuriy Fitsay 15 days ago

    We want to drive the car, to feel its inner mechanics, to have full control of this machine. That's a big reason why there isn't such a large demand.

  • Yog  Bakhru
    Yog Bakhru 15 days ago

    not at all

  • Unidorsal Icosahedron

    Nice safira.

  • shinobody
    shinobody 17 days ago

    You're committing a huge fallacy tho - because HUMANS program these computers in self-driving cars. The same humans that make mistakes - when driving, or coding. How do we have ANY proof that in a pinch, autonomous cars would be safer? It's coder's mistake vs driver's mistake, until we get solid numbers.

  • Alex MC
    Alex MC 18 days ago

    What if all the cars were connected to each other, so that if one makes a decision the others know what to do too, and everyone is safe?

  • ThePatrioticGamer
    ThePatrioticGamer 19 days ago

    Tesla already has a completely autonomous car that is 2x as safe as a human driver.

  • م هه
    م هه 19 days ago

    Im sorry. But people need to die... Even with all these deaths overpopulation is still happening. 400 every 3 minutes in India

  • BR5491Z1Z
    BR5491Z1Z 19 days ago

    ya stupid fock why didnt u go to an AMERICAN company like TESLA to make this video . ya DUMB ASS

  • Exelius
    Exelius 19 days ago

    The real moral dilemma is why we let stupid people drive cars and cause accidents.

  • François Maspuche
    François Maspuche 19 days ago

    It makes me think about the time I started with the motorcycle. I was like 16 and had never driven, but I still felt safer when I was driving than when he was... pretty strange.

  • Becky Ricard
    Becky Ricard 22 days ago

    Okay, distracted driving is a problem, and people do make errors. Here's the thing, though: once these things are out on the lot, how much of the public will be able to afford these safe vehicles? I'll bet that, due to insurance, the people who probably should have these cars won't be able to afford them.

  • prototip99
    prototip99 23 days ago

    You want to be driven places? Oh, that exists. It's called public transport.

  • Tidal Yacht
    Tidal Yacht 23 days ago

    Level 5 automated cars in the future will bring us one step closer to an NSA controlled society. Not only that, but they won't protect us from hackers.

  • wolverine96
    wolverine96 24 days ago

    I wonder how many people will be killed and injured if the autonomous cars get hacked?

  • M Lienau
    M Lienau 25 days ago

    Think about all the jobs that would be lost... those who drive for a living would have to learn a technical skill... it just doesn't cut it anymore!

  • Roland Németh
    Roland Németh 25 days ago

    Safe or not, I want to drive my car and not the car driving me.

  • kevlar20
    kevlar20 25 days ago

    I enjoyed that behind the scenes look in the elevator mirror.

  • FreelanceDev4life
    FreelanceDev4life 26 days ago

    The problem with this is that, by nature, cars will not have the level of human emotion associated with human decision making. As such, a purely calculated choice by a machine will assess the situation without considering the human reactions of the other party involved.
    For example: someone runs a stop sign, a collision is imminent, someone is going to die. Is the car going to be able to predict that the other driver was at fault? Why should I suffer for both man's and machine's actions? I already suffer for other idiots' actions; why put the machine on top of it?

    Human emotion is what makes us dangerous, but it's what gives us our uniqueness. Without being able to make decisions using human emotion, we can't even begin to understand the other person. I fear that this will create a world where empathy is even harder to come by. I see people coping by saying, "It wasn't your fault, it was a machine and another person," but that won't change the effect it has when you know that someone was killed because of the actions of the vehicle you were in. When people are involved in a way that hurts or kills someone, there is often grief, sorrow, pain, soul searching, and either bitterness or improvement. Without any chance to change that and grow as human beings, we will only grow backwards as a species. Pain, death, hard times, unfair treatment, and suffering all give us chances to improve ourselves, as they lead to self-reflection. In a world without pain, no self-improvement will happen, and society will die, because there is little to no reason to overcome the world's cruelty, and because people's cruelty cannot be overcome by machines alone, there is no reason to be better.

  • Choco melk
    Choco melk 26 days ago

    This is sad news for people who enjoy driving a car by themselves. I hope that in the future driving autonomously will not be the only option, and that you could always take the wheel if you want to.

  • Stefan Baars
    Stefan Baars 26 days ago

    The thing I have with self-driving cars is the fact that they're practically computers, which means they can be hacked or get a virus. That makes self-driving cars far less safe than the cars we have today.

    However, if there's an autopilot button in your car which gives you the option to make it drive by itself, people will be less concerned about self-driving cars, since you still have the freedom of choice.

  • Don Bodacious
    Don Bodacious 26 days ago

    Aha, and I forgot: speeding, substance abuse, sleep deprivation, and other incorrect behavior are the source of many accidents.

  • Jonathan Cross
    Jonathan Cross 26 days ago

    After all of that, he didn't answer the question? Who decides the tough choices and why?

  • Jurre Volkers
    Jurre Volkers 26 days ago

    Most of the donor organs come from people who died in car crashes. What will happen when those organs are more scarce because of safer autonomous cars? That is the real moral dilemma.

  • MrScottishBeaver
    MrScottishBeaver 26 days ago

    That was a pretty cheap conclusion. I thought he'd talk about the tradeoff between risk aversion and speed. In the future it's entirely possible for "jailbroken" cars - or different safety preferences - to compromise everyone else on the road. Being autonomous also requires uniformity.

  • Jayyy Zeee
    Jayyy Zeee 29 days ago

    Excellent point! Despite the risks, autonomous cars will save more lives than they lose. We need to put the risks into perspective.

  • VioletGiraffe
    VioletGiraffe 29 days ago

    I trust an autonomous car more than a human driver, that's for sure.

  • Libertina Grimm
    Libertina Grimm 1 month ago

    Let's get a video that is not sponsored by BMW and is more balanced. This just feels like an ad for BMW.

  • James Daniel Marrs Ritchey

    Would be cool if you could have an app on your phone that allows you to request your car to drive to your current location.
    Me, "Car come pick me up." Car, "Sure bro. Be there in 10."

  • James Daniel Marrs Ritchey

    You're reaching.

  • Nishant Gupta
    Nishant Gupta 1 month ago

    creating a video out of nothing, now that's what I call creativity 😎

  • zdrux
    zdrux 1 month ago

    The real dilemma is when will governments make it illegal to own your own car and drive by yourself? It will be done in the name of safety of course (all rights are removed using this method). You won't be able to go anywhere unless it's in an automated vehicle so it's always logged and tracked.. for safety reasons.

  • mjktrash
    mjktrash 1 month ago

    I used to like you and your channel, right up until I saw this video. Good bye.

  • Crotoman111
    Crotoman111 1 month ago

    I think self-driving cars will be great overall, but when problems DO arise, it'll be a legal nightmare to sort out. We can do it though! We've done more incredible stuff, like MAKE SELF-DRIVING CARS

  • bingola45
    bingola45 1 month ago

    The moral dilemma of automatic elevators was ignored in favor of cost.
    Sack a load of Blacks; save a load of money.

  • 144pandagirl
    144pandagirl 1 month ago

    When we shirk our responsibilities off onto other people or other things, we also end up losing our freedoms. I agree that distracted driving is a bad thing and needs to stop. If people can't control themselves and focus on the road, the government may one day mandate self-driving cars. Who knows, eventually the car will probably decide your route for you and not give you the choice either. Something else that I think needs to be considered is whether these cars can be hacked... what kind of deaths could we see from terrorists hacking cars? Any self-driving car should have a manual override that can be engaged without going through the computer.

  • Air Pex!
    Air Pex! 1 month ago

    1:38 I love how the autonomous BMW is double-parking right there :D

  • Andrew and Byron Juggling

    YES YES YES, A THOUSAND TIMES YES! Give me self driving cars yesterday!

  • Meltroxgeolix LunarGlare

    People complain about the computer system messing up. But may I remind you that we rely on computer systems for literally everything - even the flight system that organizes which flights take place at which times, and things like dishwashers or microwave ovens. So relying on computers is nothing new. Not to mention, people are way too easily distracted, while computers can't get distracted. They take in data from their components and process it, then choose a result based on how we programmed them. So they are way safer than humans.

    However! The one problem is that they have to track road lines, they can't really see as far ahead as humans can, and they can't recognize things as well as we do. But that is only a problem because the roads are not made for self-driving cars. For example, if every car were autonomous, they could all communicate through a data system and know where virtually every car around them is. The roads could be fitted with some sort of emitter the vehicle can track, so it wouldn't even need road lines, and driving in bad weather wouldn't be a problem. Street lights would not be necessary either; instead, everything could be linked to an autonomous system and synchronized. And if for any reason something did go wrong, the vehicles in the area would be notified, letting you know of an accident ahead and giving you alternate routes. If the whole system were connected, cars would not even need such a physical tracking system, since the routes of the cars around yours could be shared, so your car knows not to pull into a lane at the moment another car is in it.

    So currently, self-driving cars are a mess. But if the whole system were redone (a lot of time and money), it would be nearly perfect, and a whole lot safer and easier for everyone.

  • gnarmad
    gnarmad 1 month ago

    see? Las Vegas is cold in the winter

  • John Smith
    John Smith 1 month ago

    I am really skeptical about the abilities of self driving cars during a Canadian winter. With salt, snow, sand, and ice blowing around (hindering sensors), coupled with slippery roads, I don't think the car would fare well. I saw an "incident" on YouTube where a Tesla on autopilot followed a truck that was changing lanes (because it could not see the lane markings), forcing the driver to intervene to prevent the car from ramming another vehicle in the adjacent lane.

    Yes, some humans do a bad job of driving in all seasons, but there are other ways to prevent that - for example, harsh penalties for distracted driving (i.e., jail time), and cops targeting distracted drivers rather than issuing tickets for technicalities.

  • Erik Le Blanc Pleym
    Erik Le Blanc Pleym 1 month ago

    This "Real Moral Dilemma" applies to all the technology which we westerners are so privileged to have. Had the right socio-economic system been in place, we could stop and completely reverse global warming in 20 years and build a colony on the moon as a hobby.
    The technology that is already invented and just sits there waiting for the economy to catch up is beyond our wildest trans-humanist utopian dreams. However, the science is only as strong as its weakest link, and right now the economy is at the point in its development where it executes Galileo for blasphemy.
    The very notion that many will immediately jump to conclusions about me being a voluntaryist or communist or whatever economic theory they most despise just goes to show how dark the dark ages of political economy are.
    Most are too unwilling to face the discomfort that will come with such a radical economic reform as we will unquestionably need before the next century. (Look up "HUMANS NEED NOT APPLY" to understand why.)
    So the real moral dilemma is that we're not stepping out of our comfort zone and asking what we are going to do about this.

  • Petr Novák
    Petr Novák 1 month ago

    To add my two cents to the discussions here: I see a different scary problem than ethics or assigning guilt in cases of accidents... I see a problem with hacking.
    I presume the cars won't be completely separated from the global net. And if the software's architecture isn't 100% safe from breach, this would be the first time ever that someone could "hack" into a controllable lethal weapon on a large scale. I mean, pilotless war drones are probably not being remotely controlled by criminals, but there are very few of those; they are probably really well made and are controlled by the military. This stuff will one day be made in China for a thousand dollars... what level of security will those cars have?

  • Bongan
    Bongan 1 month ago

    A quick question: did you rewrite the script at all, or just go with what BMW gave you? Watching your other videos, there is a noticeable difference. And now I feel sad.

  • Jim Teevan
    Jim Teevan 1 month ago

    The problem is fairness. I completely agree that fewer crashes are a good thing, and I also agree that automation is the way to do that. It simply "FEELS" unfair if I end up in an accident because of a machine's decision. If I crash, it's my fault, and I can deal with that. If something crashes me, it seems somehow unfair.

  • KDreezy Gaming
    KDreezy Gaming 1 month ago

    "Filling the roads with autonomous cars will eliminate jobs" - False. Who will program new algorithms, software and protocols for these cars? Programmers. Who will repair these cars? Mechanics, electrical engineers, IT. Who will harvest the materials for these vehicles? Miners, heavy machine/vehicle operators, managers, etc. Who will build the factories to produce these vehicles and the robots which assemble the parts that humans don't assemble? Construction workers. The list goes on and on. People said the same thing about computers and calculators when they came out: "What will all the people with jobs in mathematics and engineering do with all these computers eliminating their jobs?" They find other things to do. There is always something. I have a theory of the conservation of jobs: jobs can't be created or destroyed, they can only change form or be moved. :P

  • Daniel Matthews
    Daniel Matthews 1 month ago

    The moral dilemmas are not the problem so much as the fact that if you hard code solutions into car software people can then exploit those scenarios reliably to murder people. e.g. If I want to kill the driver of car [A] I just have to wait at a point I know they routinely pass and then roll an empty pram into the path of oncoming car [B] such that it swerves into the path of car [A] rather than kill a baby that does not have the protection of a car around it. Do it on a bridge or near a steep drop-off and the destruction is potentially even greater. It is the predictability of the autonomous cars that will be the problem, a very practical exploit and not a question of philosophy etc.

  • Borhan Zadeh
    Borhan Zadeh 1 month ago

    Featuring a tiny glimpse into who films these vids for Derrick (2:43).

  • lovebird mic
    lovebird mic 1 month ago

    So, one day, we may not need to worry about the problems of auto driving. If everyone had an auto car, we could put them on the same network, and then they could tell each other "OBSTACLE! CAR ON LEFT? YES, CAR ON LEFT, BRAKE." Or something similar... Just an interesting idea...
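
    The broadcast idea in the comment above can be sketched in a few lines. All class and message names here are made up for illustration; real vehicle-to-vehicle systems (e.g. DSRC or C-V2X) are far more involved:

```python
# Minimal sketch of cars on a shared network: the first car to detect
# an obstacle broadcasts a warning, and every other car reacts.
class Channel:
    """A shared broadcast medium connecting all cars."""
    def __init__(self):
        self.cars = []

    def broadcast(self, sender, message):
        # Deliver the message to every car except the sender.
        for car in self.cars:
            if car is not sender:
                car.receive(message)

class Car:
    def __init__(self, name, channel):
        self.name = name
        self.braking = False
        self.channel = channel
        channel.cars.append(self)

    def detect_obstacle(self):
        # This car sees the obstacle itself, brakes, and warns the rest.
        self.braking = True
        self.channel.broadcast(self, "OBSTACLE_AHEAD")

    def receive(self, message):
        if message == "OBSTACLE_AHEAD":
            self.braking = True

channel = Channel()
a, b, c = Car("A", channel), Car("B", channel), Car("C", channel)
a.detect_obstacle()
print(all(car.braking for car in (a, b, c)))  # True
```

    The point of the sketch is only that coordination removes the "swerve left or right?" guesswork: once every car knows what the others know, each can brake in lane instead of choosing a victim.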

  • Udayveer Singh
    Udayveer Singh 1 month ago

    I guess the main thing is that there are too many variables. The examples you gave, Veritasium, of AI in planes and elevators are much like cruise control in cars: it controls a limited number of variables and can't really go wrong (for the cruise control example, that's why the driver is told to keep paying attention to the road). However, with complete AI control, there are way too many variables, possibilities and unknown circumstances to be completely covered by an AI which ensures human safety.

    • Udayveer Singh
      Udayveer Singh 1 month ago

      If they were to make such an AI, its hardware would probably take up more space than the whole car itself.


  • ProCactus
    ProCactus 1 month ago

    So the solution is to build things for idiots and morons.

  • melissa zabower
    melissa zabower 1 month ago

    In a different video, you talk about thinking and Drew's inability to hold too many pieces of information. As a middle school teacher, I've had students ask if they can use the calculator, and I always say no, because we know that young children need to learn calculations so well that they become automatic. Every year, to prove my point, I would challenge a student to come up with a multi-digit multiplication problem. Another student and I would race; he/she would use a calculator and I would do it in my head. I almost always won. But I also practice. I know; I'm a geek.
    My point here is that if we allow drivers, especially young ones but really any age, to hand their responsibility over to the vehicle, then those people will lose the ability to analyze road conditions and make smart choices. I predict that we'll see this first with lane departure warnings. People will get used to hearing a beep until it becomes part of the background and they don't hear it anymore. So yes, for a while you'll see fewer accidents, but I think in the long run it will backfire. Someone needs to put cell phone jammers in cars so that as long as the tires are moving, the cell phone won't work. I think that would do a lot more to reduce accidents than an autonomous car.

  • 1MoreRep
    1MoreRep 1 month ago

    I will trust my own brain over a machine thank you.

    • Tobaddl
      Tobaddl 17 days ago

      1MoreRep Yeah, but can you trust the brain that is beside you on the motorway?


  • Broockle
    Broockle 1 month ago

    Why even own a car? Cars should be a lot smaller on average and owned by private companies or the government or whatever, and you could just order one with an app. It would pick you up, drop you off, and go meet the next person's needs.
    Basically a self-driving Uber driver - is that a thing yet?
    Also, drones could replace many forms of delivery with the right infrastructure, which would take more vehicles off the roads.

  • Dennis Doroslovac
    Dennis Doroslovac 1 month ago

    Remember the BlackBerry days, when your device froze and required a battery pull? What happens when the car's computer freezes?

  • blond beast
    blond beast 1 month ago

    The biggest reason I want a self-driving car...
    So I can get wasted but take my car home XD

  • Scott Kendall
    Scott Kendall 1 month ago

    Veritasium had a good segment on probability; this observation should have been on his mind when he made the comment that planes on autopilot are safer. This is a poor example. Pilots put planes on autopilot when the probability of things going wrong is lowest. Planes crash on takeoff and landing; rarely do they crash in between, when pilots have them on autopilot.

  • First Last
    First Last 1 month ago

    I like how half of these comments are science-based comments from experience, with disagreement and open discussion of both the pros and cons of the idea, even though I'm totally against the idea of cars with no human control.

  • Bob Bob
    Bob Bob 1 month ago

    I always thought the moral dilemma was the CIA's ability to control these kinds of cars.

  • Momo eater
    Momo eater 1 month ago

    If everyone had a self-driving car, then the dilemma would not exist. No programmed car will make a mistake.

  • Yonkage
    Yonkage 1 month ago

    The answer to the moral "dilemma" is to not swerve, so that no vehicle other than its own is endangered.

  • First Last
    First Last 1 month ago

    How many fatal airplane crashes killing hundreds of people at once occurred because people were relying on a bad auto pilot?

  • Sam Stevenson
    Sam Stevenson 1 month ago

    You should have also talked about the Horsey Horseless. It seems like a good automotive example of people not being comfortable with change.

  • Samuel Clayton
    Samuel Clayton 1 month ago

    Or... we could just bring back trains...

  • ganondorfchampin
    ganondorfchampin 1 month ago

    I mean, you need to make sure the cars are safe before you put them on the road, moral dilemmas aside.

  • Antie Cuteness
    Antie Cuteness 1 month ago

    Why don't we all use bikes?

    • Yonkage
      Yonkage 1 month ago

      Too long a commute. Very few people live fewer than 10 miles from their job, and nobody wants to add four extra hours of commuting to an eight-hour workday.


  • MilitantPeaceist
    MilitantPeaceist 1 month ago

    The real moral dilemma is:
    We can accept that humans make mistakes, because we know we make mistakes, so we would accept a person choosing either the car to the left or the bike to the right, and the split-second decision time prevents any rationale from being considered.
    But we would not accept this in a machine that decided on the same course each time, resulting in bike riders having a phobia about passing or riding next to an automated car, etc.

    The blame will be placed on the technology itself (not the driver) and then on the software writers, and they will be morally judged differently from a human driver forced to make a split-second decision, because the software writers chose the bike rider to die.
    This will feel like premeditated homicide.

  • Matthew Hafner
    Matthew Hafner 1 month ago

    The correct choice is that the car should make the choice that is most likely to protect its occupants. I would never buy a car that is programmed to kill me, nor would I hire a bodyguard who doesn't put me as his priority. Most people agree with this sentiment.

    If you can't convince the majority of people to buy autonomous cars, there *won't be* autonomous cars.

  • codediporpal
    codediporpal 1 month ago

    The correct answer is always SLAM ON THE BRAKES!!!! That's why all accidents involving self-driving cars have the self-driving car being rear-ended.

  • ZloTip
    ZloTip 1 month ago

    +Veritasium There was a case of a death caused by a self-driving car. The driver was killed. No thanks from me.

  • Chuck Norris
    Chuck Norris 1 month ago

    Wow. I just noticed Veritasium has an atomic mass of 42.0.

    420?

  • Axel Schmidlin
    Axel Schmidlin 1 month ago

    This video explains the problem in a way too simplistic manner. Treating the moral dilemma purely with "quantifiable" arguments (number of deaths, accidents, etc...) does not bring out the real moral dilemma: the fact that we lose our ability to choose. Imagine if, instead of your mentioned scenarios, the driver had to die to save an entire school bus. Should the driver do so? What if it was just one family car? We must not let algorithms make such decisions, no matter the death toll this produces. We may reduce the number of people dying on the streets, but in doing so, we are actually diminishing our right to self-determination and what constitutes our humanity.

    • nipi tiri
      nipi tiri 1 month ago

      How dare you speak out against our future robot overlords. I'm a good house human, yes I am!


  • Michael Colby
    Michael Colby 1 month ago

    Hurry the heck up and mainstream self-driving cars so I can sleep through the extra 20-minute drive to work. I want tinted windows as well so I can play with mys ... I mean pick boogers without anyone noticing.

    • nipi tiri
      nipi tiri 1 month ago

      Once those cars are here, please never order a cab. I don't want to be the next customer.


  • Adam Eason
    Adam Eason 1 month ago

    Yea, like who would program a car specifically to be racist? It doesn't have time to calculate that stuff. In this low-chance scenario it doesn't really matter in the long run because it's not going to happen very often, and if it does, why can't the driver take control?

    • nipi tiri
      nipi tiri 1 month ago

      Let me guess: you think the AI will be able to tell that it's about to screw up. It will notify the driver, who will then take a look at what's going on and react accordingly. I'm sorry, but if there is enough time for all that, then it's not really a dangerous situation. I wouldn't trust an AI that can't handle such a situation on its own with my safety.


    • Adam Eason
      Adam Eason 1 month ago

      The driver wouldn't need to remain aware and alert all the time really, and in the case of a brake malfunction or something of the sort, I don't think knowing traffic laws matters at that point, and the driver can take control and lead the car to safety if he so chooses. If he does not choose to do so, the car should take the quickest route to safety. Fewer people would die this way...


    • nipi tiri
      nipi tiri 1 month ago

      What would be the point of having a self-driving car if the "driver" has to remain aware and alert all the time? And really, people won't do that.

      Also, if the AI usually does all the driving, then the "driver" will end up too unskilled to properly react should there ever be a need to do so.


  • Adam Eason
    Adam Eason 1 month ago

    0:43 Huh... they should really make a smartphone with haptic feedback... kind of like Nintendo's HD rumble... so that smartphones can have good gamepads... and then we should make a new store for apps because the Google Play Store is nothing but shovelware...

  • jhanks2012
    jhanks2012 1 month ago

    The real answer to the posed dilemma is that the car will attempt to swerve towards either the SUV or the motorcycle, and it will communicate with whichever vehicle so that vehicle also moves, and so on if there is another vehicle beside that one, etc. There is no reason to assume the cars will not be aware of each other and talking to each other. Essentially all accidents will be avoided because each car will respond to each other in such a way as to avoid collisions altogether.

  • Tony01013
    Tony01013 1 month ago

    Here's a moral dilemma for you: How can you procreate knowing full well that your offspring is going to die and had no choice of whether or not he/she wanted to exist in the first place.

  • 14OF12
    14OF12 1 month ago

    Why is braking never an option in these moral dilemma problems?

    • nipi tiri
      nipi tiri 1 month ago

      Because if braking could solve the problem, then it wouldn't be a moral dilemma.


  • Ashton Giovanni
    Ashton Giovanni 1 month ago

    I personally don't think tech is ready for self driving cars. I think we should focus on improving current designs and maybe computer assisted driving.

    • Ashton Giovanni
      Ashton Giovanni 1 month ago

      That's a good idea. Maybe cars could alert each other of accidents or traffic, etc.


    • nipi tiri
      nipi tiri 1 month ago

      I think the problem is all those illogical and unpredictable human drivers on the streets. All the cars need to be communicating with each other before we can have self-driving cars.


  • 99NEPtune99 ™
    99NEPtune99 ™ 1 month ago

    The real moral dilemma is whether the car should save you or pedestrians outside. Like the car either swerves into the crowd to save you, or the car causes you to die and not run over pedestrians
