﻿WEBVTT

1
00:00:01.661 --> 00:00:04.419
<v ->Human beings are great at a lot of things</v>

2
00:00:04.419 --> 00:00:06.419
but driving, not so much.

3
00:00:07.611 --> 00:00:10.712
In the United States alone, over 30,000 people die

4
00:00:10.712 --> 00:00:12.996
every year in car accidents.

5
00:00:12.996 --> 00:00:16.015
That's because drivers are all too human.

6
00:00:16.015 --> 00:00:20.042
They're too distracted; their reflexes are too slow.

7
00:00:20.042 --> 00:00:20.875
[smashing]

8
00:00:20.875 --> 00:00:22.889
A lot of times they are too drunk.

9
00:00:22.889 --> 00:00:24.700
[smashing]

10
00:00:24.700 --> 00:00:26.392
[smashing]

11
00:00:26.392 --> 00:00:28.108
You okay, little buddy?

12
00:00:28.108 --> 00:00:30.336
Can we leave insurance out of this?

13
00:00:30.336 --> 00:00:33.641
Which is why self-driving cars are coming, and fast.

14
00:00:33.641 --> 00:00:37.352
They promise to make accidents all but a thing of the past.

15
00:00:37.352 --> 00:00:39.449 line:15% 
But let's pump the brakes. Self-driving cars

16
00:00:39.449 --> 00:00:41.816 line:15% 
won't save everyone.

17
00:00:41.816 --> 00:00:44.029 line:15% 
In fact, sometimes they'll have to make the decision

18
00:00:44.029 --> 00:00:46.341 line:15% 
to harm you, the passenger.

19
00:00:46.341 --> 00:00:48.872
Consider this scenario: you're riding in your shiny

20
00:00:48.872 --> 00:00:51.838
new self-driving car, and suddenly a group of people

21
00:00:51.838 --> 00:00:54.330
spills into the road.

22
00:00:54.330 --> 00:00:56.889
At this point, the car has to make a decision.

23
00:00:56.889 --> 00:00:59.376
Either swerve into the wall, killing you,

24
00:00:59.376 --> 00:01:01.877
or plow through the crowd, saving your life

25
00:01:01.877 --> 00:01:04.591
but taking several others.

26
00:01:04.591 --> 00:01:07.026
While surveys show that most people would choose to sacrifice

27
00:01:07.026 --> 00:01:09.972
themselves for the crowd, those same people wouldn't

28
00:01:09.972 --> 00:01:12.643
wanna buy a self-driving car that would intentionally

29
00:01:12.643 --> 00:01:13.476
harm them.

30
00:01:14.523 --> 00:01:16.435
Which is silly and irrational.

31
00:01:16.435 --> 00:01:20.294
Of course, this scenario would be exceedingly rare.

32
00:01:20.294 --> 00:01:23.187
You are far safer with a machine in control.

33
00:01:23.187 --> 00:01:28.060
Hell, even basic autonomy can cut traffic deaths by 80%.

34
00:01:28.060 --> 00:01:30.775
But as a society we have to start talking about

35
00:01:30.775 --> 00:01:33.401
how robocars will transform not just the logistics

36
00:01:33.401 --> 00:01:36.484
of driving but the very ethics of it.

37
00:01:39.712 --> 00:01:40.809
[moves into funky music]

38
00:01:40.809 --> 00:01:42.717
After all, in a way, self-driving cars

39
00:01:42.717 --> 00:01:44.867
will be programmed to kill.

40
00:01:44.867 --> 00:01:47.209
So if a self-driving car chooses to injure you,

41
00:01:47.209 --> 00:01:49.883
does that make the car maker liable?

42
00:01:49.883 --> 00:01:52.500
And from a PR standpoint, how will car makers

43
00:01:52.500 --> 00:01:55.117
convince the public that self-driving cars will have to

44
00:01:55.117 --> 00:01:59.686
sacrifice a few drivers for the greater good?

45
00:01:59.686 --> 00:02:01.128
No one has answers yet.

46
00:02:01.128 --> 00:02:03.766
Which is fine, because these are still early days,

47
00:02:03.766 --> 00:02:05.006
but one thing is certain:

48
00:02:05.006 --> 00:02:09.710
self-driving cars are coming, so buckle up.

49
00:02:09.710 --> 00:02:10.960
Let's go buddy.

