﻿WEBVTT

1
00:00:01.199 --> 00:00:02.997
<v ->So what I want to do in this story is I want</v>

2
00:00:02.997 --> 00:00:05.452
to get into the specifics of the new product launch

3
00:00:05.452 --> 00:00:07.095 line:15% 
and the new things you are doing and the stuff

4
00:00:07.095 --> 00:00:09.365 line:15% 
that's coming out right now in the machine learning.

5
00:00:09.365 --> 00:00:11.373 line:15% 
But I also want to tie it into a broader story

6
00:00:11.373 --> 00:00:14.100
about Instagram and how you decided to prioritize niceness

7
00:00:14.100 --> 00:00:16.240
and how it became such a big thing for you

8
00:00:16.240 --> 00:00:17.898
and how you reoriented the whole company.

9
00:00:17.898 --> 00:00:20.031
So I'm gonna ask you some questions about the specific

10
00:00:20.031 --> 00:00:22.225
product and then some bigger questions.

11
00:00:22.225 --> 00:00:23.058
<v ->I'm down.</v>

12
00:00:23.058 --> 00:00:25.462
<v ->All right, so let's start at the beginning.</v>

13
00:00:25.462 --> 00:00:27.932
I know that from the very beginning you cared

14
00:00:27.932 --> 00:00:28.854
a lot about comments.

15
00:00:28.854 --> 00:00:31.154
You cared a lot about niceness, and in fact,

16
00:00:31.154 --> 00:00:33.456
you and your co-founder, Mike Krieger, would go in

17
00:00:33.456 --> 00:00:35.602
early on and delete comments yourself.

18
00:00:35.602 --> 00:00:36.435
<v ->Yep.</v>

19
00:00:36.435 --> 00:00:37.268 line:15% 
<v Nicholas>Tell me about that.</v>

20
00:00:37.268 --> 00:00:38.897 line:15% 
<v ->Yeah, not only would we delete comments,</v>

21
00:00:38.897 --> 00:00:40.408 line:15% 
but we did the unthinkable.

22
00:00:40.408 --> 00:00:42.097
We actually removed accounts that were being

23
00:00:42.097 --> 00:00:43.992
not so nice to people.

24
00:00:43.992 --> 00:00:44.988
<v ->So, for example, whom?</v>

25
00:00:44.988 --> 00:00:46.816
<v ->Well, I don't remember exactly whom,</v>

26
00:00:46.816 --> 00:00:50.453
but the back story is my wife is one of the nicest

27
00:00:50.453 --> 00:00:51.819
people you'll ever meet.

28
00:00:51.819 --> 00:00:53.258
And that bleeds over to me.

29
00:00:53.258 --> 00:00:55.123
And I try to model it.

30
00:00:55.123 --> 00:00:59.567
So when we were starting the app, we watched this video

31
00:00:59.567 --> 00:01:02.290
about like, basically, like how to start a company.

32
00:01:02.290 --> 00:01:05.759
And it was by this guy who started the Lolcats company

33
00:01:05.759 --> 00:01:09.344
or meme, and he basically said to form community

34
00:01:09.344 --> 00:01:10.601
you need to do something.

35
00:01:10.601 --> 00:01:12.760
And he called it prune the trolls.

36
00:01:12.760 --> 00:01:14.505
Nicole would always joke with me.

37
00:01:14.505 --> 00:01:16.637
She's like, hey, listen, when your community's getting rough

38
00:01:16.637 --> 00:01:17.948
you gotta prune the trolls.

39
00:01:17.948 --> 00:01:20.566
And that's something she still says to me today

40
00:01:20.566 --> 00:01:23.092
to remind me of the importance of community

41
00:01:23.092 --> 00:01:25.175
but also how important it is to be nice.

42
00:01:25.175 --> 00:01:26.809
So back in the day, we would go in,

43
00:01:26.809 --> 00:01:28.802
and if people were mistreating people,

44
00:01:28.802 --> 00:01:30.611
we'd just remove their accounts.

45
00:01:30.611 --> 00:01:32.109
And I think that set an early tone

46
00:01:32.109 --> 00:01:35.811
for the community to be nice and be welcoming.

47
00:01:35.811 --> 00:01:38.034
<v ->But what's interesting is that this is 2010, right.</v>

48
00:01:38.034 --> 00:01:38.867
<v ->Yeah.</v>

49
00:01:38.867 --> 00:01:40.959
<v ->And 2010 is the moment where a lot of people</v>

50
00:01:40.959 --> 00:01:43.414
are talking about free speech and the internet,

51
00:01:43.414 --> 00:01:45.645
and Twitter's role in the Iranian Revolution.

52
00:01:45.645 --> 00:01:48.687
So it was a moment where free speech was actually

53
00:01:48.687 --> 00:01:53.072
valued on the internet probably even more than it is now.

54
00:01:53.072 --> 00:01:56.857
How did you end up being more in the prune the trolls camp?

55
00:01:56.857 --> 00:02:00.771
<v ->Well, there's an age-old debate between free speech</v>

56
00:02:00.771 --> 00:02:05.116
and like, what is the limit of free speech?

57
00:02:05.116 --> 00:02:07.609
And is it free speech to just be mean to someone?

58
00:02:07.609 --> 00:02:10.863
And I think if you look at the history of the law

59
00:02:10.863 --> 00:02:13.247
around free speech, et cetera, you'll find that generally

60
00:02:13.247 --> 00:02:16.901
there's a line where like you don't want to cross

61
00:02:16.901 --> 00:02:19.461
because you're starting to be aggressive or be mean

62
00:02:19.461 --> 00:02:23.652
or racist, and you get to a point where you want

63
00:02:23.652 --> 00:02:25.743
to make sure that in a closed community

64
00:02:25.743 --> 00:02:28.441
that's trying to grow and thrive, you make sure that you

65
00:02:28.441 --> 00:02:30.712
actually optimize for overall free speech.

66
00:02:30.712 --> 00:02:33.071
So if I don't feel like I can be myself,

67
00:02:33.071 --> 00:02:35.216
if I don't feel like I can express myself,

68
00:02:35.216 --> 00:02:38.218
because if I do that, I will get attacked,

69
00:02:38.218 --> 00:02:39.551
it's not a community we want to create.

70
00:02:39.551 --> 00:02:42.202
We just decided to be on the side of making sure

71
00:02:42.202 --> 00:02:45.392
that we optimized for speech that was expressive

72
00:02:45.392 --> 00:02:48.497
and felt like you had the freedom to be yourself.

73
00:02:48.497 --> 00:02:51.298
<v ->So one of the foundational decisions at Instagram</v>

74
00:02:51.298 --> 00:02:53.510
that helped make it nicer than some of your peers

75
00:02:53.510 --> 00:02:55.692
was the decision to not allow resharing, right,

76
00:02:55.692 --> 00:02:57.863
and to not allow something that I put out there

77
00:02:57.863 --> 00:03:00.148
to be kind of appropriated by someone else

78
00:03:00.148 --> 00:03:01.861
and sent out into the world by someone else.

79
00:03:01.861 --> 00:03:04.658
How was that decision made, and were there other

80
00:03:04.658 --> 00:03:08.062
foundational design and product decisions that were made

81
00:03:08.062 --> 00:03:09.516
because of niceness?

82
00:03:09.516 --> 00:03:12.933
<v ->Well, we debate the reshare thing a lot</v>

83
00:03:14.061 --> 00:03:16.095
because obviously people love the idea of resharing

84
00:03:16.095 --> 00:03:17.788
content that they find.

85
00:03:17.788 --> 00:03:19.564
Instagram is full of awesome stuff.

86
00:03:19.564 --> 00:03:22.591
In fact, one of the main ways people communicate

87
00:03:22.591 --> 00:03:25.695
over Instagram Direct now is actually they share content

88
00:03:25.695 --> 00:03:27.883
that they find on Instagram.

89
00:03:27.883 --> 00:03:30.026
So that's been a debate over and over again.

90
00:03:30.026 --> 00:03:32.635
But really that decision is about keeping your feed

91
00:03:32.635 --> 00:03:35.530
focused on the people you know rather than the people

92
00:03:35.530 --> 00:03:37.902
you know finding other stuff for you to see.

93
00:03:37.902 --> 00:03:41.558
And I think that is more of a testament of our focus

94
00:03:41.558 --> 00:03:45.550
on authenticity and on the connections you actually have

95
00:03:45.550 --> 00:03:46.965
than about anything else.

96
00:03:46.965 --> 00:03:48.988
<v ->So after you went to VidCon, you posted an image</v>

97
00:03:48.988 --> 00:03:51.620
on your Instagram feed of you and a bunch of the

98
00:03:51.620 --> 00:03:52.558
celebrities.
<v ->Totally.</v>

99
00:03:52.558 --> 00:03:53.391
<v ->I'm gonna read some of the comments.</v>

100
00:03:53.391 --> 00:03:54.224
<v Kevin>In fact it was a Boomerang.</v>

101
00:03:54.224 --> 00:03:55.501
<v ->It was a Boomerang, right, which got you in a little</v>

102
00:03:55.501 --> 00:03:56.334
bit of trouble.
<v ->Let's get technical here.</v>

103
00:03:56.334 --> 00:04:00.176
<v ->I'm gonna read some of the comments on @Kevin's post.</v>

104
00:04:00.176 --> 00:04:01.009
<v ->Sure.</v>

105
00:04:01.009 --> 00:04:01.842
<v ->These are the comments.</v>

106
00:04:01.842 --> 00:04:02.675
Suck.

107
00:04:02.675 --> 00:04:03.508
Suck.

108
00:04:03.508 --> 00:04:04.341
Suck me.

109
00:04:04.341 --> 00:04:05.174
Suck.

110
00:04:05.174 --> 00:04:06.559
Can you make Instagram have auto scroll feature?

111
00:04:06.559 --> 00:04:08.038
That would be awesome and expand Instagram

112
00:04:08.038 --> 00:04:10.128
as an app that could grow even more.

113
00:04:10.128 --> 00:04:12.083
Meme lives matter.

114
00:04:12.083 --> 00:04:12.916
You suck.

115
00:04:12.916 --> 00:04:14.821
You can delete memes but not cancer patients.

116
00:04:14.821 --> 00:04:16.041
I love meme lives matter.

117
00:04:16.041 --> 00:04:16.891
All memes matter.

118
00:04:16.891 --> 00:04:17.724
Suck.

119
00:04:17.724 --> 00:04:18.557
MLM.

120
00:04:18.557 --> 00:04:19.390
Meme revolution.

121
00:04:19.390 --> 00:04:20.223
Cuck.

122
00:04:20.223 --> 00:04:21.056
Meme.

123
00:04:21.056 --> 00:04:21.971
Stop the meme genocide.

124
00:04:21.971 --> 00:04:23.168
Make Instagram great again.

125
00:04:23.168 --> 00:04:24.001
Meme lives matter.

126
00:04:24.001 --> 00:04:24.834
Meme lives matter.

127
00:04:24.834 --> 00:04:26.273
Meme lives matter.

128
00:04:26.273 --> 00:04:27.134
Mmm, gang.

129
00:04:27.134 --> 00:04:27.967
MLM gang.

130
00:04:27.967 --> 00:04:31.035
I'm not quite sure what all this means.

131
00:04:31.035 --> 00:04:32.787
Is this typical?

132
00:04:32.787 --> 00:04:34.790
<v ->It was typical, but I'd encourage you to go</v>

133
00:04:34.790 --> 00:04:36.907
to my last post where I listed

134
00:04:36.907 --> 00:04:38.255
for Father's Day.
<v ->Your last post is all nice.</v>

135
00:04:38.255 --> 00:04:39.300
<v ->It's all nice.</v>

136
00:04:39.300 --> 00:04:40.769
<v ->It's all about how handsome your father is.</v>

137
00:04:40.769 --> 00:04:41.719
<v ->Right.</v>

138
00:04:41.719 --> 00:04:43.389
There are a lot, listen.

139
00:04:43.389 --> 00:04:44.407
He is taken.

140
00:04:44.407 --> 00:04:45.346
My mom is wonderful.

141
00:04:45.346 --> 00:04:46.959
[laughs]

142
00:04:46.959 --> 00:04:49.480
But there are a lot of really wonderful comments there.

143
00:04:49.480 --> 00:04:53.647
<v ->So why is this post from a year ago full of cuck</v>

144
00:04:55.659 --> 00:04:57.153
and meme lives matter, and the most recent post

145
00:04:57.153 --> 00:04:59.488
is full of how handsome Kevin Systrom's dad is?

146
00:04:59.488 --> 00:05:01.668
<v ->Yeah, well, that's a good question.</v>

147
00:05:01.668 --> 00:05:03.212
I would love to be able to explain it.

148
00:05:03.212 --> 00:05:07.293
But the first thing I think is back then,

149
00:05:07.293 --> 00:05:09.493
there were a bunch of people who, I think, were unhappy

150
00:05:09.493 --> 00:05:13.319
about the way Instagram was managing accounts.

151
00:05:13.319 --> 00:05:15.459
And there are groups of people that like to get together

152
00:05:15.459 --> 00:05:18.665
and band up and bully people.

153
00:05:18.665 --> 00:05:20.199
But it's a good example of how someone

154
00:05:20.199 --> 00:05:21.591
can get bullied, right?

155
00:05:21.591 --> 00:05:23.504
The good news is I run the company.

156
00:05:23.504 --> 00:05:25.138
I have a thick skin, and I can deal with it.

157
00:05:25.138 --> 00:05:28.409
But imagine you're someone who's trying to express yourself

158
00:05:28.409 --> 00:05:32.358
about depression or anxiety or body image issues,

159
00:05:32.358 --> 00:05:33.970
and you get that.

160
00:05:33.970 --> 00:05:35.207
Does that make you want to come back

161
00:05:35.207 --> 00:05:36.903
and post on the platform?

162
00:05:36.903 --> 00:05:37.749
<v Nicholas>Certainly not.</v>

163
00:05:37.749 --> 00:05:39.818
<v ->And if you're seeing that, does that make you want</v>

164
00:05:39.818 --> 00:05:41.804
to be open about those issues as well?

165
00:05:41.804 --> 00:05:42.772
No.

166
00:05:42.772 --> 00:05:45.810
So a year ago, I think we had much more of a problem,

167
00:05:45.810 --> 00:05:49.691
but the focus over that year on both comment filtering,

168
00:05:49.691 --> 00:05:52.097
so now, you can go in and enter your own words

169
00:05:52.097 --> 00:05:56.274
that basically filter out comments that include that word.

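The user-controlled keyword filter described here can be sketched in a few lines of Python. The blocked words and sample comments below are invented for illustration and are not Instagram's actual list or implementation:

```python
# A user enters their own words; comments containing any of them are hidden.
# Words and comments here are invented for illustration.
blocked_words = {"suck", "cuck"}

def is_hidden(comment: str) -> bool:
    # Hide the comment if any blocked word appears among its tokens.
    return any(word in comment.lower().split() for word in blocked_words)

comments = ["You suck", "Great photo!", "Meme lives matter"]
visible = [c for c in comments if not is_hidden(c)]
print(visible)  # only the comments that pass the filter
```

A real filter would also need to handle punctuation, misspellings, and embedded substrings, which is part of why the machine learning approach discussed below matters.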
170
00:05:56.274 --> 00:05:59.749
We have spam filtering that actually works really well.

171
00:05:59.749 --> 00:06:01.635
So probably a bunch of those would have been caught up

172
00:06:01.635 --> 00:06:03.161
in the spam filter that we have

173
00:06:03.161 --> 00:06:06.570
because they were repeated comments.

174
00:06:06.570 --> 00:06:09.990
And also, just a general awareness of kind comments.

175
00:06:09.990 --> 00:06:11.721
We have this awesome campaign that we started

176
00:06:11.721 --> 00:06:13.722
called #KindComments.

177
00:06:13.722 --> 00:06:16.621
I don't know if you, you know, The Late Night Show,

178
00:06:16.621 --> 00:06:20.786
they read off mean comments on another social platform.

179
00:06:20.786 --> 00:06:24.467
We started Kind Comments to basically set a standard

180
00:06:24.467 --> 00:06:27.010
in the community that it was better and cooler

181
00:06:27.010 --> 00:06:29.369
to actually leave kind comments, and now, there's

182
00:06:29.369 --> 00:06:31.820
this amazing meme that has spread throughout Instagram

183
00:06:31.820 --> 00:06:33.987
about leaving kind comments.

184
00:06:33.987 --> 00:06:35.773
But you can see the marked difference between

185
00:06:35.773 --> 00:06:39.959
the post about Father's Day and that post a year ago

186
00:06:39.959 --> 00:06:43.256
on what technology can do to create a kinder community.

187
00:06:43.256 --> 00:06:44.394
And I think we're making progress,

188
00:06:44.394 --> 00:06:45.659
which is the important part.

189
00:06:45.659 --> 00:06:48.749
<v ->Tell me about sort of steps one, two, three, four, five.</v>

190
00:06:48.749 --> 00:06:50.825
Like, how do you, you don't automatically decide

191
00:06:50.825 --> 00:06:53.056
to launch the 17 things you've launched since then.

192
00:06:53.056 --> 00:06:53.889
<v ->No.</v>

193
00:06:53.889 --> 00:06:54.751
<v Nicholas>Tell me about the early conversations.</v>

194
00:06:54.751 --> 00:06:56.250
<v ->Well, the early conversations were really</v>

195
00:06:56.250 --> 00:06:57.864
about what problem are we solving?

196
00:06:57.864 --> 00:07:00.416
We looked to the community for stories.

197
00:07:00.416 --> 00:07:02.178
We talked to community members.

198
00:07:02.178 --> 00:07:04.356
We have a giant community team here at Instagram,

199
00:07:04.356 --> 00:07:06.590
which I think is pretty unique for technology companies

200
00:07:06.590 --> 00:07:09.292
that literally their job is to interface with the community

201
00:07:09.292 --> 00:07:12.104
and get feedback and highlight members who are doing

202
00:07:12.104 --> 00:07:14.543
amazing things on the platform.

203
00:07:14.543 --> 00:07:17.423
So getting that type of feedback from the community

204
00:07:17.423 --> 00:07:19.375
about what types of problems they were experiencing

205
00:07:19.375 --> 00:07:20.494
in their comments.

206
00:07:20.494 --> 00:07:22.753
Then, led us to brainstorm about all the different

207
00:07:22.753 --> 00:07:25.796
things we could build, and what we realized

208
00:07:25.796 --> 00:07:27.950
was that there was a giant wave of machine learning

209
00:07:27.950 --> 00:07:31.056
and artificial intelligence, and Facebook had developed

210
00:07:31.056 --> 00:07:34.083
this thing that basically, it's called Deep Text,

211
00:07:34.083 --> 00:07:34.916
which basically--

212
00:07:34.916 --> 00:07:36.441
<v ->Which launches in June of 2016, right?</v>

213
00:07:36.441 --> 00:07:37.274
So it's right there.

214
00:07:37.274 --> 00:07:38.602
<v ->Yep, so they have this technology,</v>

215
00:07:38.602 --> 00:07:40.407
and we put two and two together, and we said,

216
00:07:40.407 --> 00:07:41.240
you know what?

217
00:07:41.240 --> 00:07:44.282
I think if we get a bunch of people to look at comments

218
00:07:44.282 --> 00:07:46.785
and rate them good or bad, like you go on Pandora.

219
00:07:46.785 --> 00:07:48.029
And you listen to a song.

220
00:07:48.029 --> 00:07:49.084
Is it good or is it bad.

221
00:07:49.084 --> 00:07:50.191
<v Nicholas>Yeah.</v>

222
00:07:50.191 --> 00:07:51.125
<v ->Get a bunch of people to do that.</v>

223
00:07:51.125 --> 00:07:51.958
That's your training set.

224
00:07:51.958 --> 00:07:54.654
And then what you do is you basically feed it to the machine

225
00:07:54.654 --> 00:07:57.642
learning system, and you let it go through 80% of it.

226
00:07:57.642 --> 00:08:00.506
And then you hold out 20% of the other comments.

227
00:08:00.506 --> 00:08:02.215
And then you say, okay, machine.

228
00:08:02.215 --> 00:08:06.241
Go and rate these comments for us based on the training set.

229
00:08:06.241 --> 00:08:07.884
And then we see how well it does.

230
00:08:07.884 --> 00:08:09.275
And we tweak it over time.
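The procedure described here, in which humans rate comments good or bad, the system learns from 80% of the ratings, and is then scored on the held-out 20%, can be sketched as follows. The tiny word-count "classifier" and the sample comments are stand-ins for illustration, not Deep Text:

```python
from collections import Counter

# Toy labeled data: (comment, is_bad) pairs as if rated by humans.
# The comments and the word-count "model" are invented for illustration.
labeled = [("love this photo", False), ("you suck", True),
           ("great shot", False), ("suck", True),
           ("so inspiring", False), ("meme lives matter", True)] * 20

# Hold out every fifth rating (20%) for evaluation; train on the rest.
holdout = labeled[::5]
train = [pair for i, pair in enumerate(labeled) if i % 5]

# Stand-in classifier: count which words appear in bad vs. good comments.
bad_words, good_words = Counter(), Counter()
for text, is_bad in train:
    (bad_words if is_bad else good_words).update(text.split())

def predict_bad(text: str) -> bool:
    score = sum(bad_words[w] - good_words[w] for w in text.split())
    return score > 0

# "Okay, machine, go and rate these comments": evaluate only on the
# 20% the classifier never trained on, then tweak and repeat.
correct = sum(predict_bad(text) == label for text, label in holdout)
print(f"holdout accuracy: {correct}/{len(holdout)}")
```

The held-out 20% is what makes the accuracy number honest: the model is graded on ratings it never saw during training.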
231
00:08:09.275 --> 00:08:10.742
And now, we're at a point where basically,

232
00:08:10.742 --> 00:08:13.464
this machine learning can detect a bad comment

233
00:08:13.464 --> 00:08:17.232
or a mean comment with almost amazing accuracy.

234
00:08:17.232 --> 00:08:19.351
Basically a 1% false positive rate.

235
00:08:19.351 --> 00:08:22.429
So throughout that process of both brainstorming,

236
00:08:22.429 --> 00:08:24.276
looking at the technology available

237
00:08:24.276 --> 00:08:27.365
and training this filter over time with real humans

238
00:08:27.365 --> 00:08:29.978
who are deciding this stuff, gathering feedback

239
00:08:29.978 --> 00:08:32.723
from our community, and gathering feedback from our team

240
00:08:32.723 --> 00:08:35.584
about how it works, we are able to create something

241
00:08:35.584 --> 00:08:36.647
we're really proud of.

242
00:08:36.647 --> 00:08:38.836
<v ->So when you launch it, you make a very important decision.</v>

243
00:08:38.836 --> 00:08:41.079
Do you want it to be aggressive in which case it'll

244
00:08:41.079 --> 00:08:43.028
probably kick out some stuff it shouldn't,

245
00:08:43.028 --> 00:08:45.201
or do you want it to be less aggressive in which case

246
00:08:45.201 --> 00:08:47.099
a lot of bad stuff'll get through?

247
00:08:47.099 --> 00:08:50.275
<v ->Yeah, this is the classic problem.</v>

248
00:08:50.275 --> 00:08:54.777
If you go for accuracy, you will misclassify a bunch

249
00:08:54.777 --> 00:08:56.397
of stuff that actually was pretty good,

250
00:08:56.397 --> 00:08:58.964
so you know, if I'm just, you're my friend.

251
00:08:58.964 --> 00:09:01.062
And I go onto your photo, and I'm just joking around

252
00:09:01.062 --> 00:09:02.873
with you and giving you a hard time.

253
00:09:02.873 --> 00:09:05.143
Instagram should let that through 'cause you guys,

254
00:09:05.143 --> 00:09:07.308
we're friends, and
<v ->Right.</v>

255
00:09:07.308 --> 00:09:09.624
<v ->you know, I'm just giving you a hard time,</v>

256
00:09:09.624 --> 00:09:11.733
and that's a funny banter back and forth.

257
00:09:11.733 --> 00:09:14.767
Whereas if you don't know me, and I come on,

258
00:09:14.767 --> 00:09:17.606
and I make fun of your photo, that feels very different.

259
00:09:17.606 --> 00:09:20.560
Understanding the nuance between those two is super

260
00:09:20.560 --> 00:09:25.028
important, and the thing we don't want to do is have any

261
00:09:25.028 --> 00:09:27.041
instances where we block something

262
00:09:27.041 --> 00:09:28.418
that shouldn't be blocked.

263
00:09:28.418 --> 00:09:31.067
Now, the reality is, it's going to happen.

264
00:09:31.067 --> 00:09:31.900
<v Nicholas>Definitely.</v>

265
00:09:31.900 --> 00:09:33.099
<v ->The reality is it's going to happen.</v>

266
00:09:33.099 --> 00:09:36.131
So the question is is that margin of error worth it

267
00:09:36.131 --> 00:09:39.392
for all the really bad stuff that gets blocked?

268
00:09:39.392 --> 00:09:43.377
And that's a fine balance to figure out.

269
00:09:43.377 --> 00:09:44.964
That's something we're working on.

270
00:09:44.964 --> 00:09:47.805
We trained the filter basically to have

271
00:09:47.805 --> 00:09:49.704
a 1% false positive rate.

272
00:09:49.704 --> 00:09:53.389
That means 1% of things that get marked as bad

273
00:09:53.389 --> 00:09:54.923
are actually good.

274
00:09:54.923 --> 00:09:58.159
And that was a top priority for us because we're not here

275
00:09:58.159 --> 00:09:59.231
to curb free speech.

276
00:09:59.231 --> 00:10:01.839
We're not here to curb fun conversations between friends,

277
00:10:01.839 --> 00:10:05.209
but we want to make sure we are largely attacking

278
00:10:05.209 --> 00:10:07.662
the problem of bad comments on Instagram.

279
00:10:07.662 --> 00:10:09.505
<v ->And so you go, and every comment that goes in</v>

280
00:10:09.505 --> 00:10:11.840
gets sort of run through an algorithm, and the algorithm

281
00:10:11.840 --> 00:10:14.296
gives it a score from zero to one on whether it's likely

282
00:10:14.296 --> 00:10:16.463
a comment that should be filtered or a comment

283
00:10:16.463 --> 00:10:17.409
that should not be filtered.

284
00:10:17.409 --> 00:10:18.242
<v ->Right.</v>

285
00:10:18.242 --> 00:10:20.867
<v ->And then that score combined with the relationship</v>

286
00:10:20.867 --> 00:10:22.364
of the two people?

287
00:10:22.364 --> 00:10:25.580
<v ->No, the score actually is influenced based</v>

288
00:10:25.580 --> 00:10:26.413
on the relationship.

289
00:10:26.413 --> 00:10:27.895
<v ->So the original score is influenced by it,</v>

290
00:10:27.895 --> 00:10:31.139
and Instagram, I believe I have this correct,

291
00:10:31.139 --> 00:10:33.182
has something like a karma score for every user

292
00:10:33.182 --> 00:10:36.620
where the number of times they've been flagged

293
00:10:36.620 --> 00:10:38.876
or the number of critiques made of them is added in

294
00:10:38.876 --> 00:10:41.919
to something on the back end, and that goes into this, too?

295
00:10:41.919 --> 00:10:43.426
<v ->So without getting into the magic sauce,</v>

296
00:10:43.426 --> 00:10:47.122
you're asking like, Coca-Cola to give up its recipe.

297
00:10:47.122 --> 00:10:49.824
I'm gonna tell you that there's a lot of complicated stuff

298
00:10:49.824 --> 00:10:53.848
that goes into it, but, basically, it looks at the words.

299
00:10:53.848 --> 00:10:55.292
It looks at our relationship.

300
00:10:55.292 --> 00:10:57.206
And it looks at a bunch of other signals including

301
00:10:57.206 --> 00:11:01.871
account age and account history, that kind of stuff.

302
00:11:01.871 --> 00:11:04.534
And it combines all the signals, and then it spits

303
00:11:04.534 --> 00:11:06.912
out a score of zero to one about how bad

304
00:11:06.912 --> 00:11:09.036
this comment is likely to be.

305
00:11:09.036 --> 00:11:12.530
And then, basically, you set a threshold that optimizes

306
00:11:12.530 --> 00:11:15.107
for 1% false positive rate.

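Choosing that operating point can be sketched in Python. Note that the metric as Systrom defines it earlier ("1% of things that get marked as bad are actually good") is, strictly, a false discovery rate, and the scores below are synthetic stand-ins for real model output:

```python
# Synthetic (score, is_actually_bad) pairs standing in for scored comments:
# scores run 0.000-0.999, and here everything scored 0.3+ is truly bad.
scored = [(i / 1000, i >= 300) for i in range(1000)]

def false_discovery_rate(threshold: float) -> float:
    """Fraction of comments blocked at this threshold that are actually good."""
    blocked = [bad for score, bad in scored if score >= threshold]
    if not blocked:
        return 0.0
    return blocked.count(False) / len(blocked)

# Sweep candidate thresholds and keep the lowest one meeting the 1% target,
# i.e. block as much bad content as possible while rarely blocking good.
target = 0.01
threshold = min((t / 100 for t in range(101)
                 if false_discovery_rate(t / 100) <= target),
                default=1.0)
print(f"chosen threshold: {threshold:.2f}")
```

Lowering the threshold blocks more bad comments but misfires on more good ones; the sweep makes that trade-off explicit instead of leaving it implicit in the model.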
307
00:11:15.107 --> 00:11:16.940
<v ->When do you decide it's ready to go?</v>

308
00:11:16.940 --> 00:11:19.911
<v ->[sighs] I think at a point where the accuracy</v>

309
00:11:19.911 --> 00:11:23.109
gets to a point that internally we're happy with it.

310
00:11:23.109 --> 00:11:25.376
So one of things we do here at Instagram is we do

311
00:11:25.376 --> 00:11:27.960
this thing called dog-fooding, and not a lot of people

312
00:11:27.960 --> 00:11:29.497
know this term, but in the tech industry,

313
00:11:29.497 --> 00:11:31.515
it means, you know, eat your own dog food.

314
00:11:31.515 --> 00:11:33.865
So what we do is we take the products,

315
00:11:33.865 --> 00:11:35.908
and we always apply them to ourselves

316
00:11:35.908 --> 00:11:37.517
before we go out to the community.

317
00:11:37.517 --> 00:11:39.376
And there are these amazing groups at Instagram,

318
00:11:39.376 --> 00:11:41.759
and I'd love to take you through them,

319
00:11:41.759 --> 00:11:43.494
but they're actually all confidential.

320
00:11:43.494 --> 00:11:47.586
But it's employees giving feedback about how they feel

321
00:11:47.586 --> 00:11:50.709
about specific features, and--

322
00:11:50.709 --> 00:11:53.717
<v ->So this is live on the phones of a bunch of Instagram</v>

323
00:11:53.717 --> 00:11:54.904
employees right now?

324
00:11:54.904 --> 00:11:57.260
<v ->There are always features that are not launched</v>

325
00:11:57.260 --> 00:12:00.882
that are live on Instagram employees' phones

326
00:12:00.882 --> 00:12:02.786
including things like this.

327
00:12:02.786 --> 00:12:06.249
<v ->So there's a critique of a lot of the advances in machine</v>

328
00:12:06.249 --> 00:12:09.360
learning that the corpus on which it's based

329
00:12:09.360 --> 00:12:11.296
has biases built into it.

330
00:12:11.296 --> 00:12:14.988
So Deep Text analyzed all Facebook comments, right.

331
00:12:14.988 --> 00:12:17.022
It analyzed some massive corpus of words

332
00:12:17.022 --> 00:12:18.531
that people have typed into the internet,

333
00:12:18.531 --> 00:12:20.742
but when you analyze those, you get certain biases

334
00:12:20.742 --> 00:12:21.662
built into them.

335
00:12:21.662 --> 00:12:23.730
So for example, I was reading a paper,

336
00:12:23.730 --> 00:12:25.922
and somebody had taken a massive corpus of text

337
00:12:25.922 --> 00:12:28.508
and created a machine learning algorithm

338
00:12:28.508 --> 00:12:32.461
to rank restaurants and to look at the comments

339
00:12:32.461 --> 00:12:33.724
that people had given under restaurants

340
00:12:33.724 --> 00:12:35.910
and then try to guess the quality of the restaurants.

341
00:12:35.910 --> 00:12:37.385
He went through, and he ran it.

342
00:12:37.385 --> 00:12:39.285
And he was like, interesting because all of the Mexican

343
00:12:39.285 --> 00:12:41.030
restaurants were ranked badly.

344
00:12:41.030 --> 00:12:42.378
So why is that?

345
00:12:42.378 --> 00:12:44.546
Well, it turns out as he dug deeper into the algorithm,

346
00:12:44.546 --> 00:12:46.565
it's because in the massive corpus of texts,

347
00:12:46.565 --> 00:12:49.226
the word, Mexican, is associated with illegal,

348
00:12:49.226 --> 00:12:50.538
illegal Mexican immigrant.

349
00:12:50.538 --> 00:12:52.417
Because that is used so frequently.

350
00:12:52.417 --> 00:12:55.672
So there are lots of slurs attached to the word, Mexican,

351
00:12:55.672 --> 00:12:58.086
so the word, Mexican, has negative connotations

352
00:12:58.086 --> 00:13:01.217
in the machine learning base corpus,

353
00:13:01.217 --> 00:13:03.073
which then affects the restaurant rankings

354
00:13:03.073 --> 00:13:05.031
of Mexican restaurants.

355
00:13:05.031 --> 00:13:05.963
<v ->It sounds awful.</v>

356
00:13:05.963 --> 00:13:07.286
<v Nicholas>So how do you deal with that?</v>

357
00:13:07.286 --> 00:13:09.959
<v ->Yeah, uh, well, good news is we're not in the business</v>

358
00:13:09.959 --> 00:13:11.242
of ranking restaurants.

359
00:13:11.242 --> 00:13:15.512
<v ->But you are ranking sentences based on this huge corpus</v>

360
00:13:15.512 --> 00:13:18.287
of texts that Facebook has analyzed as part of Deep Text.

361
00:13:18.287 --> 00:13:21.117
<v ->Well, it's a little bit more complicated than that.</v>

362
00:13:21.117 --> 00:13:22.775
So all of our training actually

363
00:13:22.775 --> 00:13:25.637
comes from Instagram comments.

364
00:13:25.637 --> 00:13:29.458
So we have hundreds of raters, and it's actually

365
00:13:29.458 --> 00:13:32.140
pretty interesting what we've done with this set of raters.

366
00:13:32.140 --> 00:13:34.814
Basically, human beings that sit there, and by the way,

367
00:13:34.814 --> 00:13:36.276
human beings are not unbiased.

368
00:13:36.276 --> 00:13:37.480
That's not what I'm claiming.

369
00:13:37.480 --> 00:13:39.260
But you have human beings.

370
00:13:39.260 --> 00:13:41.199
Each of those raters is bilingual.

371
00:13:41.199 --> 00:13:42.445
So they speak two languages.

372
00:13:42.445 --> 00:13:43.674
They have a diverse perspective.

373
00:13:43.674 --> 00:13:45.663
They're from all over the world.

374
00:13:45.663 --> 00:13:48.143
And they rank those comments basically

375
00:13:48.143 --> 00:13:49.834
thumbs up or thumbs down.

376
00:13:49.834 --> 00:13:51.892
Basically, the Instagram corpus, right?

377
00:13:51.892 --> 00:13:53.883
So you feed it the thumbs up, thumb down

378
00:13:53.883 --> 00:13:55.735
based on an individual, and you might say, but wait.

379
00:13:55.735 --> 00:13:59.260
Isn't a single individual biased in some way?

380
00:13:59.260 --> 00:14:01.575
Which is why we make sure every comment is actually

381
00:14:01.575 --> 00:14:05.919
seen twice and given a rating twice by at least two people

382
00:14:05.919 --> 00:14:09.602
to make sure that there's as minimal an amount

383
00:14:09.602 --> 00:14:11.927
of bias in the system as possible.

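The two-rater safeguard described here, where every comment is rated by at least two people so no single rater's bias sets a label, can be sketched as a simple agreement rule. The votes are invented, and sending disagreements for another opinion is an assumption about the process, not something stated in the interview:

```python
# Each comment gets independent thumbs-up ("bad") / thumbs-down votes
# from two raters. Ratings here are invented for illustration.
ratings = {
    "suck":             [True, True],    # both raters: bad
    "nice shot":        [False, False],  # both raters: fine
    "prune the trolls": [True, False],   # raters disagree
}

def resolve(votes):
    # Only a unanimous label enters the training data (assumed rule);
    # a disagreement gets no label and would need another opinion.
    if votes[0] == votes[1]:
        return votes[0]
    return None

labels = {text: resolve(votes) for text, votes in ratings.items()}
print(labels)
```

Requiring two independent ratings to agree dilutes any one rater's bias, which is the point Systrom is making about the diverse, bilingual rater pool.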
384
00:14:11.927 --> 00:14:15.083
And then on top of that, we also gain feedback

385
00:14:15.083 --> 00:14:17.624
from not only our team but also the community.

386
00:14:17.624 --> 00:14:19.418
And then we're able to tweak things on the margin

387
00:14:19.418 --> 00:14:21.768
to make sure that things like that don't happen.

388
00:14:21.768 --> 00:14:23.759
I'm not claiming that it won't happen.

389
00:14:23.759 --> 00:14:26.302
That's, of course, a risk, but the biggest risk of all

390
00:14:26.302 --> 00:14:30.277
is doing nothing because we're afraid of these things

391
00:14:30.277 --> 00:14:32.841
happening, and I think it's more important that we are

392
00:14:32.841 --> 00:14:37.489
A, aware of them, and B, monitoring them actively,

393
00:14:37.489 --> 00:14:39.820
and C, making sure that we have a diverse group

394
00:14:39.820 --> 00:14:43.038
of raters that not only speak two languages

395
00:14:43.038 --> 00:14:44.926
but are from all over the world and represent

396
00:14:44.926 --> 00:14:47.123
different perspectives to make sure we have

397
00:14:47.123 --> 00:14:48.875
an unbiased classifier.

398
00:14:48.875 --> 00:14:51.030
<v ->So let's take a sentence like, these hoes ain't loyal,</v>

399
00:14:51.030 --> 00:14:53.460
which is a phrase that I believe a previous study

400
00:14:53.460 --> 00:14:55.487
on Twitter had a lot of trouble with.

401
00:14:55.487 --> 00:14:57.310
Your theory is that some people will say,

402
00:14:57.310 --> 00:14:58.858
oh, that's a lyric.

403
00:14:58.858 --> 00:14:59.911
Therefore, it's okay.

404
00:14:59.911 --> 00:15:01.484
Some people won't know, and it won't get through,

405
00:15:01.484 --> 00:15:04.305
but enough raters looking at enough comments over time

406
00:15:04.305 --> 00:15:06.786
will allow lyrics to get through,

407
00:15:06.786 --> 00:15:08.883
and these hoes ain't loyal, I can post that

408
00:15:08.883 --> 00:15:10.615
on your Instagram feed if you post a picture

409
00:15:10.615 --> 00:15:11.966
which deserves that comment.

410
00:15:11.966 --> 00:15:14.445
<v ->Well, I think what I would counter is if you post</v>

411
00:15:14.445 --> 00:15:17.096
that sentence to any person watching this,

412
00:15:17.096 --> 00:15:20.121
not a single one of them would say that's a mean-spirited

413
00:15:20.121 --> 00:15:22.072
comment to any of us, right.

414
00:15:22.072 --> 00:15:24.912
So I think that's pretty easy to get to.

415
00:15:24.912 --> 00:15:26.878
I think if there are more nuanced examples,

416
00:15:26.878 --> 00:15:28.632
and I think that's in the spirit of your question.

417
00:15:28.632 --> 00:15:29.465
<v ->Yeah.</v>

418
00:15:29.465 --> 00:15:30.939
<v ->Which is that there are gray areas.</v>

419
00:15:30.939 --> 00:15:33.826
The whole idea of machine learning is that it's far better

420
00:15:33.826 --> 00:15:37.007
at understanding those nuances than any algorithm

421
00:15:37.007 --> 00:15:40.404
has in the past or any single human being could.

422
00:15:40.404 --> 00:15:42.581
And I think what we have to do over time is figure out

423
00:15:42.581 --> 00:15:45.268
how to get into that gray area, and like judge

424
00:15:45.268 --> 00:15:47.898
the performance of this algorithm over time

425
00:15:47.898 --> 00:15:49.645
to see if it actually improves things.

426
00:15:49.645 --> 00:15:52.381
Because by the way, if it causes trouble, and it doesn't

427
00:15:52.381 --> 00:15:55.673
work, we'll scrap it and start over with something new.

428
00:15:55.673 --> 00:15:57.586
But the whole idea here is that we're trying something,

429
00:15:57.586 --> 00:15:59.726
and I think a lot of the fears that you're bringing up

430
00:15:59.726 --> 00:16:02.506
are warranted, but that's exactly what keeps most

431
00:16:02.506 --> 00:16:04.546
companies from even trying in the first place.

432
00:16:04.546 --> 00:16:05.963
<v ->Right, and so first you're gonna</v>

433
00:16:05.963 --> 00:16:08.027
launch this filtering bad comments.

434
00:16:08.027 --> 00:16:10.308
The second thing you're gonna do is the elevation

435
00:16:10.308 --> 00:16:11.316
of positive comments.

436
00:16:11.316 --> 00:16:13.120
Tell me about how that is gonna work

437
00:16:13.120 --> 00:16:14.826
and why that is a priority.

438
00:16:14.826 --> 00:16:18.133
<v ->Um, the elevation of positive comments is more about</v>

439
00:16:18.133 --> 00:16:19.309
modeling in the system.

440
00:16:19.309 --> 00:16:21.609
We've seen a bunch of times in the system

441
00:16:21.609 --> 00:16:24.939
where we have this thing called the mimicry effect.

442
00:16:24.939 --> 00:16:29.155
So if you actually, if you raise kind comments,

443
00:16:29.155 --> 00:16:31.475
you actually see more kind comments.

444
00:16:31.475 --> 00:16:34.322
You see more people giving kind comments.

445
00:16:34.322 --> 00:16:36.468
It's not that we ever ran this test,

446
00:16:36.468 --> 00:16:38.620
but I'm sure if you raised a bunch of mean comments,

447
00:16:38.620 --> 00:16:40.412
you'd see more mean comments.

448
00:16:40.412 --> 00:16:43.739
Part of this is the piling on effect, and I think

449
00:16:43.739 --> 00:16:47.308
what we can do is by modeling what great conversations

450
00:16:47.308 --> 00:16:50.196
are, more people will see Instagram as a place for that

451
00:16:50.196 --> 00:16:52.519
and less for the bad stuff.

452
00:16:52.519 --> 00:16:54.941
And it's got this interesting psychological effect

453
00:16:54.941 --> 00:16:58.398
that people want to fit in, and people want to do

454
00:16:58.398 --> 00:17:01.539
what they're seeing, and that means that people

455
00:17:01.539 --> 00:17:03.128
are more positive over time.

456
00:17:03.128 --> 00:17:05.653
<v ->And are you at all worried that you're gonna turn</v>

457
00:17:05.653 --> 00:17:07.082
Instagram into the equivalent of

458
00:17:07.082 --> 00:17:10.714
an East Coast liberal arts college where people are--

459
00:17:10.714 --> 00:17:12.205
<v ->[laughs] I think those of us who grew up</v>

460
00:17:12.205 --> 00:17:14.010
on the East Coast might take offense to that.

461
00:17:14.010 --> 00:17:15.306
[laughs]

462
00:17:15.306 --> 00:17:16.920
I'm not sure what you mean exactly.

463
00:17:16.920 --> 00:17:21.420
<v ->I mean, a place where there are trigger warnings</v>

464
00:17:21.420 --> 00:17:24.021
everywhere where people feel like they can't have

465
00:17:24.021 --> 00:17:26.697
certain opinions, or people feel like they can't say things

466
00:17:26.697 --> 00:17:30.706
where everything, where you put this sheen over

467
00:17:30.706 --> 00:17:33.969
all your conversations as though everything

468
00:17:33.969 --> 00:17:35.456
in the world is rosy, and that bad stuff

469
00:17:35.456 --> 00:17:37.116
we're just gonna sweep it under the rug.

470
00:17:37.116 --> 00:17:38.358
<v ->Yeah, that would be bad.</v>

471
00:17:38.358 --> 00:17:39.836
That's not something we want.

472
00:17:39.836 --> 00:17:43.238
So I think in the range of bad, we're talking about

473
00:17:43.238 --> 00:17:47.560
like the lower 5%, like the really, really bad stuff.

474
00:17:47.560 --> 00:17:49.104
I don't think we're trying to play anywhere

475
00:17:49.104 --> 00:17:50.770
in the area of gray.

476
00:17:50.770 --> 00:17:53.285
Although I realize there's no black or white,

477
00:17:53.285 --> 00:17:55.036
and we're gonna have to play at some level.

478
00:17:55.036 --> 00:17:57.730
But the idea here is to take out, I don't know,

479
00:17:57.730 --> 00:18:00.830
the bottom 5% of nasty stuff.

480
00:18:00.830 --> 00:18:03.693
I don't think anyone would argue that that makes

481
00:18:03.693 --> 00:18:06.003
Instagram a rosy place.

482
00:18:06.003 --> 00:18:07.718
It just doesn't make it a hateful place.

483
00:18:07.718 --> 00:18:09.680
<v ->And you wouldn't want all the comments</v>

484
00:18:09.680 --> 00:18:12.354
on your, you know, on your VidCon post.

485
00:18:12.354 --> 00:18:17.204
It's a mix of sort of jokes and nastiness and vapidity

486
00:18:17.204 --> 00:18:19.565
and useful product feedback.

487
00:18:19.565 --> 00:18:20.912
And you're getting rid of the nasty stuff.

488
00:18:20.912 --> 00:18:22.427
But would it be better if you raised like

489
00:18:22.427 --> 00:18:25.029
the best product feedback up and then the funny jokes

490
00:18:25.029 --> 00:18:26.132
to the top?

491
00:18:26.132 --> 00:18:26.965
<v ->Maybe.</v>

492
00:18:26.965 --> 00:18:28.792
And maybe that's a problem we'll decide to solve

493
00:18:28.792 --> 00:18:31.214
at some point, but right now, we're just focused

494
00:18:31.214 --> 00:18:33.876
on making sure that people don't feel hate, you know.

495
00:18:33.876 --> 00:18:37.153
And I think that's a valid thing to go after,

496
00:18:37.153 --> 00:18:38.069
and I'm excited to do it.

497
00:18:38.069 --> 00:18:39.723
<v ->So the thing that interests me the most</v>

498
00:18:39.723 --> 00:18:42.145
is that it's like Instagram is a world

499
00:18:42.145 --> 00:18:44.633
with 700 million people, and you're writing

500
00:18:44.633 --> 00:18:46.807
the constitution for the world.

501
00:18:46.807 --> 00:18:48.376
When you get up in the morning, and you think about

502
00:18:48.376 --> 00:18:51.575
that power, that responsibility, how does it affect you?

503
00:18:51.575 --> 00:18:54.386
<v ->Doing nothing felt like the worst option in the world,</v>

504
00:18:54.386 --> 00:18:56.652
so starting to tackle it means

505
00:18:56.652 --> 00:18:58.323
that we can improve the world.

506
00:18:58.323 --> 00:19:00.184
We can improve the lives of many young people

507
00:19:00.184 --> 00:19:03.014
around the world who live on social media.

508
00:19:03.014 --> 00:19:04.344
I don't have kids yet.

509
00:19:04.344 --> 00:19:05.351
I will someday.

510
00:19:05.351 --> 00:19:07.868
And I hope that kid, boy or girl, grows up

511
00:19:07.868 --> 00:19:09.981
in a world where they feel safe online,

512
00:19:09.981 --> 00:19:13.326
where I, as a parent, feel like they are safe online.

513
00:19:13.326 --> 00:19:17.480
And, you know, the cheesy saying, with great power

514
00:19:17.480 --> 00:19:19.164
comes great responsibility?

515
00:19:19.164 --> 00:19:21.232
Like we take on that responsibility,

516
00:19:21.232 --> 00:19:23.136
and we're gonna go after it, but that doesn't mean

517
00:19:23.136 --> 00:19:26.413
that not acting is the correct option.

518
00:19:26.413 --> 00:19:29.585
There are all sorts of issues that come with acting.

519
00:19:29.585 --> 00:19:31.572
You've highlighted a number of them today,

520
00:19:31.572 --> 00:19:33.351
but that doesn't mean we shouldn't act.

521
00:19:33.351 --> 00:19:34.922
It just means we should be aware of them,

522
00:19:34.922 --> 00:19:37.418
and we should be monitoring them over time.

523
00:19:37.418 --> 00:19:39.409
<v ->One of the critiques is that Instagram, particularly</v>

524
00:19:39.409 --> 00:19:41.184
for young people, is very addictive.

525
00:19:41.184 --> 00:19:43.416
And in fact, there's a critique being made

526
00:19:43.416 --> 00:19:45.478
by Tristan Harris who is

527
00:19:45.478 --> 00:19:47.102
a classmate of yours
<v ->Classmate of mine.</v>

528
00:19:47.102 --> 00:19:49.364
<v ->and a classmate of Mike's, and he says</v>

529
00:19:49.364 --> 00:19:52.510
that the design of Instagram

530
00:19:52.510 --> 00:19:53.472
deliberately addicts you.

531
00:19:53.472 --> 00:19:55.301
For example, when you open it up--

532
00:19:55.301 --> 00:19:59.465
<v ->Sorry, I'm laughing just because I think the idea</v>

533
00:19:59.465 --> 00:20:02.672
that anyone inside here tries to design something

534
00:20:02.672 --> 00:20:06.839
that is maliciously addictive is just like so far-fetched.

535
00:20:08.319 --> 00:20:10.113
We try to solve problems for people,

536
00:20:10.113 --> 00:20:12.525
and if by solving those problems for people, they like

537
00:20:12.525 --> 00:20:15.642
to use the product, I think we've done our job well.

538
00:20:15.642 --> 00:20:17.285
This is not a casino.

539
00:20:17.285 --> 00:20:19.362
We are not trying to eke money out of people

540
00:20:19.362 --> 00:20:21.244
in a malicious way.

541
00:20:21.244 --> 00:20:23.999
The idea of Instagram is that we create something

542
00:20:23.999 --> 00:20:26.173
that allows them to connect with their friends

543
00:20:26.173 --> 00:20:27.883
and their family and their interests

544
00:20:27.883 --> 00:20:31.629
through positive experiences, and I think any criticism

545
00:20:31.629 --> 00:20:35.411
of building that system is unfounded.

546
00:20:35.411 --> 00:20:38.895
<v ->And so all of this is aimed at making Instagram better,</v>

547
00:20:38.895 --> 00:20:40.389
and it sounds like changes so far

548
00:20:40.389 --> 00:20:42.376
have made Instagram better.

549
00:20:42.376 --> 00:20:44.496
Is any of it aimed at making people better,

550
00:20:44.496 --> 00:20:46.736
or is there any chance that the changes

551
00:20:46.736 --> 00:20:49.250
that happen on Instagram will seep into the real world

552
00:20:49.250 --> 00:20:52.316
and maybe, just a little bit, conversations in this country

553
00:20:52.316 --> 00:20:54.794
will be more positive than they've been?

554
00:20:54.794 --> 00:20:57.966
<v ->I sure hope we can stem any negativity in the world.</v>

555
00:20:57.966 --> 00:21:02.197
I'm not sure we would sign up for that day one.

556
00:21:02.197 --> 00:21:04.562
But I actually want to challenge the initial premise,

557
00:21:04.562 --> 00:21:07.708
which is this is about making Instagram better.

558
00:21:07.708 --> 00:21:11.306
I actually think it's about making the internet better.

559
00:21:11.306 --> 00:21:13.127
I hope some day the technology that we develop

560
00:21:13.127 --> 00:21:15.462
and the trainings that we develop and the things

561
00:21:15.462 --> 00:21:17.681
we learn, we can pass on to start-ups.

562
00:21:17.681 --> 00:21:19.946
We can pass on to our peers in technology.

563
00:21:19.946 --> 00:21:22.878
And that we actually together build a kinder,

564
00:21:22.878 --> 00:21:25.393
safer, more inclusive community online.

565
00:21:25.393 --> 00:21:26.910
<v Nicholas>Will you open source the software</v>

566
00:21:26.910 --> 00:21:28.247
you've built for this?

567
00:21:28.247 --> 00:21:29.695
<v ->I'm not sure.</v>

568
00:21:29.695 --> 00:21:30.528
I'm not sure.

569
00:21:30.528 --> 00:21:33.527
I think a lot of it comes back to how well it performs,

570
00:21:33.527 --> 00:21:37.184
and the willingness of our partners to adopt it.

571
00:21:37.184 --> 00:21:39.413
<v ->But what if this fails?</v>

572
00:21:39.413 --> 00:21:42.853
What if actually people kind of get turned off

573
00:21:42.853 --> 00:21:43.686
by Instagram?

574
00:21:43.686 --> 00:21:45.568
They say, Instagram's becoming like Disneyland.

575
00:21:45.568 --> 00:21:47.398
I don't want to be there, and they share less.

576
00:21:47.398 --> 00:21:49.614
<v ->[laughs] The thing I love about Silicon Valley</v>

577
00:21:49.614 --> 00:21:52.744
is that we bear hug failure.

578
00:21:52.744 --> 00:21:56.077
Like, failure is what we all start with.

579
00:21:59.078 --> 00:22:01.076
We go through.

580
00:22:01.076 --> 00:22:03.998
Hopefully, we don't end on it on the way to success.

581
00:22:03.998 --> 00:22:06.038
I mean, Instagram wasn't Instagram initially.

582
00:22:06.038 --> 00:22:07.879
It was a failed startup before.

583
00:22:07.879 --> 00:22:09.928
I turned down a bunch of job offers that would have been

584
00:22:09.928 --> 00:22:11.720
really awesome along the way.

585
00:22:11.720 --> 00:22:12.971
That was failure.

586
00:22:12.971 --> 00:22:15.690
I've had numerous product ideas at Instagram

587
00:22:15.690 --> 00:22:19.552
that were total flops, they were total failures.

588
00:22:19.552 --> 00:22:20.805
And that's okay.

589
00:22:20.805 --> 00:22:23.249
We bear hug it because when you fail

590
00:22:23.249 --> 00:22:25.137
at least you're trying, and I think that's actually

591
00:22:25.137 --> 00:22:26.845
what makes Silicon Valley different

592
00:22:26.845 --> 00:22:29.892
from traditional business is that our tolerance

593
00:22:29.892 --> 00:22:32.541
for failure here is so much higher.

594
00:22:32.541 --> 00:22:33.925
And that's why you see bigger risks

595
00:22:33.925 --> 00:22:36.008
and also bigger payoffs.

