﻿WEBVTT

1
00:00:00.150 --> 00:00:02.770
<v Announcer>Please welcome Chris Cox with Lauren Goode.</v>

2
00:00:02.770 --> 00:00:05.353
[upbeat music]

3
00:00:17.250 --> 00:00:18.499
<v ->Chris, thank you so much for being here.</v>

4
00:00:18.499 --> 00:00:19.340
<v ->Thank you.</v>

5
00:00:19.340 --> 00:00:21.074
<v ->And thanks to everybody for being here as well.</v>

6
00:00:21.074 --> 00:00:23.019
Let's talk about Facebook.

7
00:00:23.019 --> 00:00:24.380
<v ->All right.</v>

8
00:00:24.380 --> 00:00:25.315
<v ->All right.</v>

9
00:00:25.315 --> 00:00:26.527
Let's just get right into it.

10
00:00:26.527 --> 00:00:27.360
<v ->I've heard about it.</v>

11
00:00:27.360 --> 00:00:28.300
<v ->Yes, you've heard about it.</v>

12
00:00:28.300 --> 00:00:29.960
Uh, you're wearing Facebook colors too.

13
00:00:29.960 --> 00:00:30.800
<v ->I guess so.</v>

14
00:00:30.800 --> 00:00:31.633
<v ->I don't know if that was intentional, but.</v>

15
00:00:31.633 --> 00:00:35.810
Yeah, so you left Facebook as Chief Product Officer

16
00:00:35.810 --> 00:00:38.240
several months ago now, I think it was March.

17
00:00:38.240 --> 00:00:39.073
<v ->Yep.</v>

18
00:00:39.073 --> 00:00:43.651
<v ->Of this year and interestingly, the year before that,</v>

19
00:00:43.651 --> 00:00:46.599 line:15% 
before you left, you had just been promoted to run

20
00:00:46.599 --> 00:00:49.952 line:15% 
product at not only Facebook but also, WhatsApp, Instagram

21
00:00:49.952 --> 00:00:51.370
and Messenger.

22
00:00:51.370 --> 00:00:52.203
So you were really, like, you

23
00:00:52.203 --> 00:00:54.403
were running the product show there.

24
00:00:54.403 --> 00:00:56.968
That's 4 of the 6 core apps of the company.

25
00:00:56.968 --> 00:00:59.633
And then you left.

26
00:01:01.460 --> 00:01:03.040
How are you thinking about Facebook

27
00:01:03.040 --> 00:01:04.140
as a platform right now,

28
00:01:04.140 --> 00:01:07.837
now that you've had several months away from working there?

29
00:01:07.837 --> 00:01:10.140
<v ->Yeah well, there's my personal experience.</v>

30
00:01:10.140 --> 00:01:14.490
I was there for 13 years so I started there when I was 23.

31
00:01:14.490 --> 00:01:16.913
I was the, I think the 13th engineer.

32
00:01:18.391 --> 00:01:21.250 line:15% 
And so it's intertwined with my whole like 20s

33
00:01:21.250 --> 00:01:22.240 line:15% 
and a lot of my 30s,

34
00:01:22.240 --> 00:01:25.233 line:15% 
with the experience of growing up with the company.

35
00:01:26.923 --> 00:01:28.570
And then there's all the people.

36
00:01:28.570 --> 00:01:30.493
I miss a lot of the people there, I miss,

37
00:01:31.443 --> 00:01:34.784
some days I miss the intensity of the work,

38
00:01:34.784 --> 00:01:36.413
some days I don't.

39
00:01:38.339 --> 00:01:42.446
At least for, you know, part of the reason I was okay

40
00:01:42.446 --> 00:01:45.060
leaving was having, after 2016,

41
00:01:45.060 --> 00:01:45.950
spent a couple of years

42
00:01:45.950 --> 00:01:47.043
building out a bunch of the teams

43
00:01:47.043 --> 00:01:49.233
that I felt were most important.

44
00:01:50.920 --> 00:01:53.350
To sort of take the lessons that we learned

45
00:01:53.350 --> 00:01:55.320
through some of 2016 and start to put in place

46
00:01:55.320 --> 00:01:58.280
institutions that could help the company

47
00:01:58.280 --> 00:02:01.947
be more responsible and be a better communicator

48
00:02:01.947 --> 00:02:03.963
on some of the key issues.

49
00:02:05.280 --> 00:02:07.830
The misinformation team, I know that's a hot topic right now

50
00:02:07.830 --> 00:02:10.123
I'd be delighted to spend 40 minutes on that.

51
00:02:11.870 --> 00:02:14.530
As well as the team focused on protecting elections,

52
00:02:14.530 --> 00:02:17.210
which is called the Elections Integrity team.

53
00:02:17.210 --> 00:02:20.310
As well as the team focused on at-risk countries,

54
00:02:20.310 --> 00:02:23.670
which is countries where there's more potent risk

55
00:02:23.670 --> 00:02:24.590
of real-world harm,

56
00:02:24.590 --> 00:02:26.180
because of the use of social media.

57
00:02:26.180 --> 00:02:30.031
And so, each of these teams set about forming partnerships

58
00:02:30.031 --> 00:02:32.809
with NGOs with good leaders.

59
00:02:32.809 --> 00:02:35.530
Building up people who cared about working on those things.

60
00:02:35.530 --> 00:02:38.164
Building ways of measuring progress against the problem.

61
00:02:38.164 --> 00:02:41.392
And ultimately sort of changing the institution.

62
00:02:41.392 --> 00:02:43.630
It's obviously not done.

63
00:02:43.630 --> 00:02:44.820
<v ->But you felt like it was</v>

64
00:02:44.820 --> 00:02:46.153
in a good enough place for you to.

65
00:02:46.153 --> 00:02:48.640
<v ->I felt that the teams were absolutely in a place</v>

66
00:02:48.640 --> 00:02:50.670
where they were going to do a great job

67
00:02:50.670 --> 00:02:51.710
with or without me.

68
00:02:51.710 --> 00:02:54.076
<v ->So it's been suggested and reported</v>

69
00:02:54.076 --> 00:02:57.417
that part of the reason why you left is that,

70
00:02:57.417 --> 00:02:59.300
basically, throughout the time from when

71
00:02:59.300 --> 00:03:01.906
you were first promoted to run product

72
00:03:01.906 --> 00:03:03.661
at all over all of these product categories,

73
00:03:03.661 --> 00:03:05.760
and the time that you left

74
00:03:05.760 --> 00:03:08.380
was also around the time that CEO Mark Zuckerberg

75
00:03:08.380 --> 00:03:11.240
started to talk more about the future of social networks

76
00:03:11.240 --> 00:03:13.110
being private, more private.

77
00:03:13.110 --> 00:03:15.600
And talking about unifying backend systems

78
00:03:15.600 --> 00:03:17.230
and end-to-end encryption.

79
00:03:17.230 --> 00:03:19.273
And it's been suggested that philosophically there might

80
00:03:19.273 --> 00:03:20.899
have been a difference in how you felt

81
00:03:20.899 --> 00:03:24.173
versus how he felt about the future of the company.

82
00:03:24.173 --> 00:03:26.430
You know, where things were going.

83
00:03:26.430 --> 00:03:29.550
Would you say that's an accurate characterization?

84
00:03:29.550 --> 00:03:31.770
<v ->Yeah, I mean it was true both that</v>

85
00:03:31.770 --> 00:03:33.470
I had been there 13 years

86
00:03:33.470 --> 00:03:35.560
and it wasn't something where I felt I wanted

87
00:03:35.560 --> 00:03:38.803
to spend another 13 years on social media.

88
00:03:40.400 --> 00:03:43.230
And also that, you know, as Mark and I both said,

89
00:03:43.230 --> 00:03:44.590
we saw things a little bit differently.

90
00:03:44.590 --> 00:03:46.260
And then those two things combined

91
00:03:46.260 --> 00:03:48.461
to help me, you know, make that decision.

92
00:03:48.461 --> 00:03:50.160
And we made that decision together.

93
00:03:50.160 --> 00:03:51.262
We spent a long time figuring out

94
00:03:51.262 --> 00:03:53.470
what was the right timing for that

95
00:03:53.470 --> 00:03:54.760
and what was the right moment so.

96
00:03:54.760 --> 00:03:56.930
<v ->How would you characterize how you're feeling</v>

97
00:03:56.930 --> 00:03:59.230
about things like encryption for example?

98
00:03:59.230 --> 00:04:00.063
Encryption's been a big topic here so far this morning.

99
00:04:00.063 --> 00:04:01.377
<v ->Yeah.</v>

100
00:04:01.377 --> 00:04:02.958
<v ->And I think we're going to continue to talk about it.</v>

101
00:04:02.958 --> 00:04:03.791
<v ->Yep.</v>

102
00:04:03.791 --> 00:04:05.110
I mean, I think it's great.

103
00:04:05.110 --> 00:04:05.943
I think.

104
00:04:05.943 --> 00:04:07.826
<v Lauren>You think encryption's great?</v>

105
00:04:07.826 --> 00:04:08.850
<v ->I do.</v>

106
00:04:08.850 --> 00:04:09.683
<v Lauren>Okay.</v>

107
00:04:09.683 --> 00:04:11.385
<v ->I think it offers an enormous amount of protection.</v>

108
00:04:11.385 --> 00:04:15.580
I think we are still investigating, we as an industry,

109
00:04:15.580 --> 00:04:18.100
how do you balance protecting the privacy

110
00:04:18.100 --> 00:04:21.210
of people's information and continue to keep people safe.

111
00:04:21.210 --> 00:04:23.950
There's not a short answer to that question,

112
00:04:23.950 --> 00:04:25.587
but there I think is a bunch of really important research

113
00:04:25.587 --> 00:04:27.020
happening on that question,

114
00:04:27.020 --> 00:04:30.350
which will be, I think a lot of the important work

115
00:04:30.350 --> 00:04:31.800
that these companies do.

116
00:04:31.800 --> 00:04:33.856
<v ->How do you think a platform like Facebook</v>

117
00:04:33.856 --> 00:04:37.870
squares encryption with some of the things

118
00:04:37.870 --> 00:04:38.771
that you were working on,

119
00:04:38.771 --> 00:04:41.931
filter bubbles, hate speech, misinformation?

120
00:04:41.931 --> 00:04:44.114
Some of that, the latter categories seems to require

121
00:04:44.114 --> 00:04:47.100
things being a little bit more out in the open.

122
00:04:47.100 --> 00:04:49.500
And encryption of course means you're providing

123
00:04:49.500 --> 00:04:50.333
a certain layer of protection and privacy for people.

124
00:04:50.333 --> 00:04:52.001
<v Chris>Yep.</v>

125
00:04:52.001 --> 00:04:55.902
<v ->How do your beliefs about encryption square with</v>

126
00:04:55.902 --> 00:04:58.535
the kind of products you were working on?

127
00:04:58.535 --> 00:05:00.042
<v ->I mean it certainly makes some</v>

128
00:05:00.042 --> 00:05:01.639
of those things more complicated.

129
00:05:01.639 --> 00:05:04.970
I think in a messaging system,

130
00:05:04.970 --> 00:05:06.566
there's a much higher expectation of privacy

131
00:05:06.566 --> 00:05:09.000
which is totally legit.

132
00:05:09.000 --> 00:05:12.080
And there's also, at least on WhatsApp,

133
00:05:12.080 --> 00:05:13.373
we did some work that,

134
00:05:13.373 --> 00:05:15.490
I think some of the WhatsApp folks are here,

135
00:05:15.490 --> 00:05:16.939
we did some work to try and understand

136
00:05:16.939 --> 00:05:19.070
how you could combat misinformation

137
00:05:19.070 --> 00:05:20.220
in an encrypted environment.

138
00:05:20.220 --> 00:05:22.170
And I think we did some very good work.

139
00:05:23.160 --> 00:05:24.760
Some of it was public education.

140
00:05:25.963 --> 00:05:28.268
So just doing digital literacy campaigns

141
00:05:28.268 --> 00:05:30.653
in India and Brazil prior to elections.

142
00:05:31.710 --> 00:05:33.724
Some of it was building message forwarding systems

143
00:05:33.724 --> 00:05:36.199
so that people could connect with fact checkers

144
00:05:36.199 --> 00:05:38.310
and then could speak to groups

145
00:05:38.310 --> 00:05:40.820
that they were in about what they were seeing.

146
00:05:40.820 --> 00:05:43.739
So I think there's a different toolkit than the toolkit

147
00:05:43.739 --> 00:05:45.890
that you had used in a public feed system

148
00:05:45.890 --> 00:05:47.393
where you can just start to see what goes viral

149
00:05:47.393 --> 00:05:49.530
and for anything that looks like it's gonna go viral,

150
00:05:49.530 --> 00:05:51.030
you send it to a fact checker.

151
00:05:52.488 --> 00:05:54.790
I think there are pros and cons with these systems

152
00:05:54.790 --> 00:05:57.963
and I'm not a hardliner on any one of them.

153
00:05:58.800 --> 00:06:01.010
And I think the decisions the company's

154
00:06:01.010 --> 00:06:02.289
making on encryption and privacy

155
00:06:02.289 --> 00:06:07.013
come from a place that is resonant with what people want.

156
00:06:07.013 --> 00:06:08.740
Which I appreciate.

157
00:06:08.740 --> 00:06:09.658
<v ->Interesting, okay.</v>

158
00:06:09.658 --> 00:06:13.610
I want to get to, eventually,

159
00:06:13.610 --> 00:06:15.215
what you're going to be doing next.

160
00:06:15.215 --> 00:06:16.510
<v ->Okay.</v>

161
00:06:16.510 --> 00:06:17.870
<v ->But I do have more questions about Facebook.</v>

162
00:06:17.870 --> 00:06:18.703
<v ->Sure.</v>

163
00:06:18.703 --> 00:06:20.053
<v ->So, political advertising.</v>

164
00:06:20.053 --> 00:06:20.890
<v Chris>Yes.</v>

165
00:06:20.890 --> 00:06:22.050
<v ->This is something that's been in the news</v>

166
00:06:22.050 --> 00:06:23.067
a lot over the past couple of weeks.

167
00:06:23.067 --> 00:06:25.757
Facebook has taken a bit of a hard line on

168
00:06:25.757 --> 00:06:30.070
political advertising, excuse me, Twitter has taken

169
00:06:30.070 --> 00:06:31.600
more of a hard line on political advertising

170
00:06:31.600 --> 00:06:33.210
on its platform whereas Facebook has said

171
00:06:33.210 --> 00:06:34.531
it's going to allow political advertising

172
00:06:34.531 --> 00:06:39.531
even if those ads may contain untruths.

173
00:06:39.604 --> 00:06:42.305
And Mark Zuckerberg has spoken a lot about this

174
00:06:42.305 --> 00:06:43.812
just yesterday, I should add,

175
00:06:43.812 --> 00:06:47.131
Facebook's Chief Policy Officer, Nick Clegg

176
00:06:47.131 --> 00:06:49.780
actually said that they're going to be looking at things

177
00:06:49.780 --> 00:06:52.925
like micro-targeting a little more carefully and so.

178
00:06:52.925 --> 00:06:56.260
Like how do you feel Facebook

179
00:06:56.260 --> 00:06:57.951
is actually handling this right now?

180
00:06:57.951 --> 00:07:00.600
Is this the right approach?

181
00:07:00.600 --> 00:07:02.830
<v ->Well I think what Mark and Nick</v>

182
00:07:02.830 --> 00:07:03.684
said in the last couple of days

183
00:07:03.684 --> 00:07:05.743
was they're looking at ways to adopt

184
00:07:05.743 --> 00:07:08.162
some of what was suggested in the employee letter,

185
00:07:08.162 --> 00:07:10.221
a lot of which I agree with.

186
00:07:10.221 --> 00:07:13.520
While still taking the stand, which is,

187
00:07:13.520 --> 00:07:15.846
we think political advertising can be good and helpful.

188
00:07:15.846 --> 00:07:20.510 line:15% 
It often favors up-and-comers versus incumbents

189
00:07:20.510 --> 00:07:22.932 line:15% 
I think more often than not, which is important.

190
00:07:22.932 --> 00:07:24.860 line:15% 
If you look at the democratic field,

191
00:07:24.860 --> 00:07:27.445 line:15% 
you have a lot of folks who needed a way to find a platform

192
00:07:27.445 --> 00:07:28.739
that didn't quite have one yet

193
00:07:28.739 --> 00:07:32.020
and that creates diversity which I think is good.

194
00:07:32.020 --> 00:07:33.500
They're used a lot in local elections,

195
00:07:33.500 --> 00:07:34.972
which I think is important and is tied up

196
00:07:34.972 --> 00:07:37.581
in the question of micro-targeting by the way.

197
00:07:37.581 --> 00:07:40.280
And then they're adjacent to issue ads.

198
00:07:40.280 --> 00:07:45.280
Which are ads about a list of 23 issues which are political.

199
00:07:46.771 --> 00:07:50.034
So you have, if you wanna talk to people about gun control

200
00:07:50.034 --> 00:07:53.374
or climate change or immigration reform

201
00:07:53.374 --> 00:07:55.714
or women's rights issues, those are all political

202
00:07:55.714 --> 00:07:57.300
even the NFL, as they say

203
00:07:57.300 --> 00:08:00.150
the last thing that wasn't political, is now political.

204
00:08:01.620 --> 00:08:06.620
And so, I think there's good rationale for supporting these

205
00:08:06.798 --> 00:08:10.341
in a system that is designed to help people reach

206
00:08:10.341 --> 00:08:12.770
the community of people that wanna hear from them.

207
00:08:12.770 --> 00:08:14.750
And I stand behind that.

208
00:08:14.750 --> 00:08:16.600
<v ->Well what's interesting is it seems that you are</v>

209
00:08:16.600 --> 00:08:18.577
a proponent of fact checking in general

210
00:08:18.577 --> 00:08:20.242
we've talked about this.

211
00:08:20.242 --> 00:08:21.075
<v ->I'm a big fan.</v>

212
00:08:21.075 --> 00:08:24.570
<v ->You're a big fan of fact checking, we are too at Wired.</v>

213
00:08:24.570 --> 00:08:26.680
And so you're a proponent of fact checking,

214
00:08:26.680 --> 00:08:28.070
you've said this in earlier interviews,

215
00:08:28.070 --> 00:08:29.872 line:15% 
you've just said it, in this interview,

216
00:08:29.872 --> 00:08:34.117 line:15% 
and yet it seems as though, by Facebook saying,

217
00:08:34.117 --> 00:08:35.884 line:15% 
"We're going to allow political advertising,

218
00:08:35.884 --> 00:08:38.957
"all political advertising, and we don't really feel it's

219
00:08:38.957 --> 00:08:40.440
"necessarily a good thing."

220
00:08:40.440 --> 00:08:41.710
And I'm paraphrasing of course,

221
00:08:41.710 --> 00:08:43.765
to have fact checking in place for those political ads.

222
00:08:43.765 --> 00:08:46.024
Those two things are perhaps

223
00:08:46.024 --> 00:08:48.830
not in alignment with one another.

224
00:08:48.830 --> 00:08:50.738
<v ->Well political ads are their own animal.</v>

225
00:08:50.738 --> 00:08:53.157
I mean most political ads are highly partisan

226
00:08:53.157 --> 00:08:56.920
and fact checking, part of the place

227
00:08:56.920 --> 00:08:57.890
I think you wanna get to

228
00:08:57.890 --> 00:08:59.020
is to find a way of doing

229
00:08:59.020 --> 00:09:00.070
some sort of fact checking on these

230
00:09:00.070 --> 00:09:01.573
that's not so partisan.

231
00:09:03.540 --> 00:09:06.600
I think, you know, one of the systems that

232
00:09:07.990 --> 00:09:09.570
I worked on a while ago that the company's

233
00:09:09.570 --> 00:09:12.070
been vetting with academics is one that would help

234
00:09:13.350 --> 00:09:17.320
submit content to some representative panel

235
00:09:17.320 --> 00:09:22.130
of people in order for them to vet

236
00:09:22.130 --> 00:09:23.910
whether or not they feel it's misleading.

237
00:09:23.910 --> 00:09:26.780
And ends up at least according to some of the data

238
00:09:26.780 --> 00:09:27.920
that the academics were looking at

239
00:09:27.920 --> 00:09:30.005
ends up being a pretty good system.

240
00:09:30.005 --> 00:09:33.639
I think the company should investigate and is investigating

241
00:09:33.639 --> 00:09:38.380
micro-targeting, specifically in the political context

242
00:09:38.380 --> 00:09:39.815
because the thesis of all of this stuff

243
00:09:39.815 --> 00:09:42.410
is that it should be out in the open.

244
00:09:42.410 --> 00:09:44.108
And it is in the political ad archive,

245
00:09:44.108 --> 00:09:46.020
but if there's hundreds of variants

246
00:09:46.020 --> 00:09:48.180
being run of a creative then it's tricky

247
00:09:48.180 --> 00:09:51.162
to get your arms around what's being said to whom.

248
00:09:51.162 --> 00:09:54.638
I also think exploring more context in the UI

249
00:09:54.638 --> 00:09:57.102
both in the consumer experience and

250
00:09:57.102 --> 00:09:59.485
in the political ad archive could be helpful

251
00:09:59.485 --> 00:10:03.790
at continuing a position where you're looking for ways

252
00:10:03.790 --> 00:10:05.527
for fact checking to not be so partisan

253
00:10:05.527 --> 00:10:09.380
while also giving the user or the consumer

254
00:10:09.380 --> 00:10:10.610
a good experience.

255
00:10:10.610 --> 00:10:12.990
<v ->Spoken like a true product officer.</v>

256
00:10:12.990 --> 00:10:15.100
Tweak the UI and maybe.

257
00:10:15.100 --> 00:10:17.093
<v ->Well these tweaks are hugely impactful</v>

258
00:10:17.093 --> 00:10:19.820
so I don't say that lightly.

259
00:10:19.820 --> 00:10:23.650
<v ->And hugely impactful because of the size of the user base?</v>

260
00:10:23.650 --> 00:10:25.930
<v ->Well the whole experience is on a phone</v>

261
00:10:25.930 --> 00:10:27.276
in a hundred pixels, you know,

262
00:10:27.276 --> 00:10:30.900
so when you had, you know, 20 more that are helping people

263
00:10:30.900 --> 00:10:33.240
see other points of view or helping people vet

264
00:10:34.650 --> 00:10:38.520
sort of whether journalistic institutions have made

265
00:10:38.520 --> 00:10:39.772
comments about this, there are a bunch of things

266
00:10:39.772 --> 00:10:43.010
that can be done that I know the company is exploring.

267
00:10:43.010 --> 00:10:44.600
<v ->One more quick question about Facebook.</v>

268
00:10:44.600 --> 00:10:48.389
How confident are you in Facebook's ability

269
00:10:48.389 --> 00:10:50.910
to mitigate potential issues that may

270
00:10:50.910 --> 00:10:54.036
come up as we enter, it feels like we've already entered

271
00:10:54.036 --> 00:10:58.033
and been in for 20 years, the 2020 election cycle?

272
00:10:58.884 --> 00:11:02.430
<v ->Well, I can tell you the company has a whole bunch</v>

273
00:11:02.430 --> 00:11:04.295
more resources in terms of money, people,

274
00:11:04.295 --> 00:11:06.702
partnerships, programs.

275
00:11:06.702 --> 00:11:11.190
Not just in the United States but with most of the countries

276
00:11:11.190 --> 00:11:13.120
where major elections are happening.

277
00:11:13.120 --> 00:11:14.550
To anticipate these things.

278
00:11:14.550 --> 00:11:17.763
I can't be 100 percent confident, the company can't either.

279
00:11:19.070 --> 00:11:21.541
But we've put in place a real immune system

280
00:11:21.541 --> 00:11:24.810
that at least accounts for a lot of the issues

281
00:11:24.810 --> 00:11:26.443
we saw in 2016 and again saw

282
00:11:26.443 --> 00:11:28.782
in some of the mid-term elections.

283
00:11:28.782 --> 00:11:31.020
And have also seen looking at other

284
00:11:31.020 --> 00:11:32.660
elections around the world.

285
00:11:32.660 --> 00:11:34.925
<v ->So that seems like a good segue to Acronym.</v>

286
00:11:34.925 --> 00:11:36.100
<v Chris>Yes.</v>

287
00:11:36.100 --> 00:11:37.558
<v ->Tell people what you're doing with Acronym</v>

288
00:11:37.558 --> 00:11:39.610
and what Acronym is.

289
00:11:39.610 --> 00:11:40.450
<v ->Acronym.</v>

290
00:11:40.450 --> 00:11:41.564
<v ->Acronym.</v>

291
00:11:41.564 --> 00:11:43.713
<v ->Has anyone heard of Acronym?</v>

292
00:11:45.080 --> 00:11:46.093
One head.

293
00:11:47.620 --> 00:11:50.040
So one of the problems I've been interested in

294
00:11:50.040 --> 00:11:52.497
is the progressives in the United States

295
00:11:52.497 --> 00:11:55.060
having a good technology stack.

296
00:11:55.060 --> 00:11:57.369
Like good technical infrastructure for understanding

297
00:11:57.369 --> 00:12:01.149
how to develop messaging and then run campaigns.

298
00:12:01.149 --> 00:12:04.700
This is an area where my perception is that

299
00:12:04.700 --> 00:12:06.613
the progressives have been behind

300
00:12:06.613 --> 00:12:11.057
on ability to develop and use as a team

301
00:12:11.057 --> 00:12:14.995
infrastructure that helps you have a good voter file,

302
00:12:14.995 --> 00:12:19.995
how to develop messaging, just basic politics in 2019.

303
00:12:21.474 --> 00:12:25.434
Again that's my impression and I'm not a political expert.

304
00:12:25.434 --> 00:12:28.290
<v ->And you're sitting on the advisory board?</v>

305
00:12:28.290 --> 00:12:29.530
<v ->I'm sitting on the board along with</v>

306
00:12:29.530 --> 00:12:32.110
I'm not sitting on the board of directors,

307
00:12:32.110 --> 00:12:35.170
but I've been helping to advise Tara,

308
00:12:35.170 --> 00:12:37.483
raise money, hire a team.

309
00:12:38.380 --> 00:12:40.620
There's a woman named Tara McGowan who runs Acronym.

310
00:12:40.620 --> 00:12:42.490
They're based in Washington D.C.

311
00:12:42.490 --> 00:12:43.939
They work with progressive organizations

312
00:12:43.939 --> 00:12:46.413
like Planned Parenthood and the ACLU.

313
00:12:47.380 --> 00:12:48.698
Their mission is to help them effectively

314
00:12:48.698 --> 00:12:51.633
use social media and the internet.

315
00:12:53.320 --> 00:12:57.410
And I believe that it's important that such technology

316
00:12:57.410 --> 00:13:00.458
is available for whomever is our nominee.

317
00:13:00.458 --> 00:13:04.030
And for the sort of progressive institutions going forward.

318
00:13:04.030 --> 00:13:08.378
<v ->So you, by participating in these activities with Acronym,</v>

319
00:13:08.378 --> 00:13:13.378
you're really aligning yourself with a set of ideologies.

320
00:13:14.120 --> 00:13:16.570
You're saying they're progressives.

321
00:13:16.570 --> 00:13:19.070
I've read about it online it says like they want

322
00:13:19.070 --> 00:13:20.617
to basically, want to "Arm the left

323
00:13:20.617 --> 00:13:23.097
"with digital tools to help them combat

324
00:13:23.097 --> 00:13:23.930
"some of the more aggressive tactics of the right."

325
00:13:23.930 --> 00:13:26.214
<v ->Yes.</v>

326
00:13:26.214 --> 00:13:31.200
<v ->And do you feel like you couldn't perhaps have worked</v>

327
00:13:31.200 --> 00:13:32.400
on something like this in your

328
00:13:32.400 --> 00:13:35.080
role as Chief Products Officer at Facebook?

329
00:13:35.080 --> 00:13:35.913
<v ->Absolutely not.</v>

330
00:13:35.913 --> 00:13:37.133
<v ->And why is that?</v>

331
00:13:37.133 --> 00:13:39.390
<v ->I think when you're in a very very senior role at</v>

332
00:13:39.390 --> 00:13:43.519
a platform, you need to be, you have a duty

333
00:13:43.519 --> 00:13:46.833
to be much more neutral in your politics.

334
00:13:47.716 --> 00:13:49.693
<v ->Why is that?</v>

335
00:13:49.693 --> 00:13:54.693
<v ->I think that's part of running a platform</v>

336
00:13:55.070 --> 00:13:56.872
whose customers are across the aisle

337
00:13:56.872 --> 00:13:58.180
in a lot of different ways.

338
00:13:58.180 --> 00:13:59.987
And I think it's part of aiming to have institutions

339
00:13:59.987 --> 00:14:02.924
that can bring the country together.

340
00:14:02.924 --> 00:14:04.250
<v ->So you're free now?</v>

341
00:14:04.250 --> 00:14:06.299
And this is, in a sense?

342
00:14:06.299 --> 00:14:08.121
<v ->I certainly feel more free.</v>

343
00:14:08.121 --> 00:14:10.159
<v ->Okay, interesting.</v>

344
00:14:10.159 --> 00:14:12.740
<v ->You know, this is something I've wanted</v>

345
00:14:12.740 --> 00:14:14.437
to work on for a while.

346
00:14:14.437 --> 00:14:17.654
And I've come to understand at least in my own analysis

347
00:14:17.654 --> 00:14:20.521
of what happened since 2016

348
00:14:20.521 --> 00:14:22.930
that just good execution

349
00:14:22.930 --> 00:14:25.240
and like running a good campaign,

350
00:14:25.240 --> 00:14:28.514
using the internet, is something where I think

351
00:14:28.514 --> 00:14:31.200
that matters a lot.

352
00:14:31.200 --> 00:14:32.575
<v ->Was any of this driven by any sort of</v>

353
00:14:32.575 --> 00:14:35.010
sense of personal responsibility,

354
00:14:35.010 --> 00:14:36.470
for having been a part of Facebook

355
00:14:36.470 --> 00:14:37.600
during the 2016 election cycle?

356
00:14:37.600 --> 00:14:38.940
<v ->You know, maybe.</v>

357
00:14:38.940 --> 00:14:41.393
Not in a way I have direct access to.

358
00:14:42.840 --> 00:14:44.130
You know I definitely felt

359
00:14:44.130 --> 00:14:46.490
a sense of personal responsibility for cleaning up a lot

360
00:14:46.490 --> 00:14:48.470
of the pieces of the platform that I felt weren't

361
00:14:48.470 --> 00:14:50.672
going well, but this was just a conviction I think.

362
00:14:50.672 --> 00:14:52.853
Trump should not be our president.

363
00:14:54.310 --> 00:14:55.940
The other thing I care a lot about right now

364
00:14:55.940 --> 00:14:59.240
is climate change and he is not going to help us there.

365
00:14:59.240 --> 00:15:04.240
And four more years, you know, at five gigatons a year,

366
00:15:04.300 --> 00:15:06.759
that's a lot of carbon that we're not gonna get

367
00:15:06.759 --> 00:15:10.650
to go back and take back down unless we build

368
00:15:10.650 --> 00:15:13.183
some crazy technology that no one knows about yet.

369
00:15:15.060 --> 00:15:17.110
<v ->So in your times since you've left Facebook,</v>

370
00:15:17.110 --> 00:15:18.010
you were saying earlier,

371
00:15:18.010 --> 00:15:23.010
you've spent a lot of time researching climate change?

372
00:15:23.120 --> 00:15:26.310
You have, you're a smart person, you have the resources,

373
00:15:26.310 --> 00:15:29.590
you've really been heads down in this

374
00:15:29.590 --> 00:15:32.460
and you're doing something really interesting with a company

375
00:15:32.460 --> 00:15:33.293
that works with satellite technology.

376
00:15:33.293 --> 00:15:34.126
<v ->Yeah.</v>

377
00:15:34.126 --> 00:15:34.959
<v ->Talk about this.</v>

378
00:15:34.959 --> 00:15:37.821
<v ->Yeah so there's a really neat company called Planet Labs</v>

379
00:15:37.821 --> 00:15:41.800
in San Francisco, more of you are nodding about

380
00:15:41.800 --> 00:15:42.773
them than Acronym.

381
00:15:44.248 --> 00:15:48.610
They build satellites and design the satellites

382
00:15:48.610 --> 00:15:50.717
and build them right here on Harrison Street.

383
00:15:50.717 --> 00:15:53.130
I got to know the CEO and co-founder

384
00:15:53.130 --> 00:15:55.240
Will Marshall very well.

385
00:15:55.240 --> 00:15:57.570
A bunch of ex-NASA folks work there.

386
00:15:57.570 --> 00:15:59.087
The vision was to build these small,

387
00:15:59.087 --> 00:16:04.087
about shoebox-size satellites with solar panel wings.

388
00:16:04.930 --> 00:16:07.063
And have a fleet of them in space

389
00:16:07.063 --> 00:16:10.460
which is real time imaging the Earth.

390
00:16:10.460 --> 00:16:13.650
So every hour, you get a snapshot of every tile on Earth

391
00:16:13.650 --> 00:16:14.810
at medium resolution.

392
00:16:14.810 --> 00:16:17.666
Which means each pixel is about 3 meters.

393
00:16:17.666 --> 00:16:20.960
So this field has been called remote sensing.

394
00:16:20.960 --> 00:16:22.990
Remote sensing normally was something where you

395
00:16:22.990 --> 00:16:25.790
could get a picture every few weeks or every few months.

396
00:16:26.800 --> 00:16:29.123
But with this sort of time resolution,

397
00:16:29.990 --> 00:16:32.177
you can start to ask questions like,

398
00:16:32.177 --> 00:16:34.487
"What's going on with wildfires today?

399
00:16:34.487 --> 00:16:35.677
"How quickly are they spreading?

400
00:16:35.677 --> 00:16:37.410
"Where are they spreading?"

401
00:16:37.410 --> 00:16:39.696
Deforestation in the Amazon.

402
00:16:39.696 --> 00:16:41.043
Active coal power plants.

403
00:16:41.043 --> 00:16:43.763
How many coal power plants are firing right now?

404
00:16:45.350 --> 00:16:47.062
Methane emissions are on the horizon

405
00:16:47.062 --> 00:16:49.120
as something that we believe.

406
00:16:49.120 --> 00:16:54.120
We, the industry of computer vision and satellite folks,

407
00:16:54.530 --> 00:16:56.450
you may be able to see from space.

408
00:16:56.450 --> 00:16:58.335
Which is crazy 'cause it's a gas, but we know methane

409
00:16:58.335 --> 00:17:01.290
is a real contributor and it's coming

410
00:17:01.290 --> 00:17:04.160
from certain farms and refineries and stuff like that.

411
00:17:04.160 --> 00:17:05.883
So if you could identify leaks,

412
00:17:05.883 --> 00:17:10.766
you could start to contribute to having a health system

413
00:17:10.766 --> 00:17:13.591
where you're basically imaging the Earth every hour

414
00:17:13.591 --> 00:17:17.340
and then you're creating some public data set

415
00:17:17.340 --> 00:17:20.261
with tools that plug into decision makers, banks,

416
00:17:20.261 --> 00:17:23.323
insurance companies, policy makers.

417
00:17:25.460 --> 00:17:30.460
Investors, journalists, interested persons, the youth.

418
00:17:32.990 --> 00:17:34.970
Like, I could imagine in a classroom one day,

419
00:17:34.970 --> 00:17:36.540
you go and you say this is the Earth,

420
00:17:36.540 --> 00:17:39.266
here's a simulation of what 2050 looks like.

421
00:17:39.266 --> 00:17:41.990
They're getting pretty ugly by the way.

422
00:17:41.990 --> 00:17:44.430
As we update our assessment of the heating

423
00:17:44.430 --> 00:17:48.260
that we think might happen by 2100.

424
00:17:48.260 --> 00:17:50.357
My wife's from Bangkok, like, it's gone.

425
00:17:50.357 --> 00:17:55.357
And I think we're still doing such a poor job.

426
00:17:55.980 --> 00:17:59.540
We, humanity, of really wrapping our heads around

427
00:18:00.808 --> 00:18:04.760
how few precious years we have

428
00:18:04.760 --> 00:18:06.690
and I've even come to understand this more since

429
00:18:06.690 --> 00:18:07.713
leaving the company.

430
00:18:09.410 --> 00:18:11.450
Which to me is why one of the most important

431
00:18:11.450 --> 00:18:12.283
things you can do is try and get somebody in office

432
00:18:12.283 --> 00:18:15.332
who cares about this.

433
00:18:15.332 --> 00:18:18.680
<v ->So you're advising Planet Labs in addition to Acronym?</v>

434
00:18:18.680 --> 00:18:19.810
<v ->Yep.</v>

435
00:18:19.810 --> 00:18:21.880
I have a badge, I'm learning about satellites.

436
00:18:21.880 --> 00:18:22.733
<v ->Okay.</v>

437
00:18:24.010 --> 00:18:25.210
<v ->I'm learning as much from them</v>

438
00:18:25.210 --> 00:18:29.070
and then I'm trying to help them build some software,

439
00:18:29.070 --> 00:18:30.310
'cause this is like building software

440
00:18:30.310 --> 00:18:32.420
on top of all this really cool satellite imagery.

441
00:18:32.420 --> 00:18:35.056
<v ->Have you considered starting your own company</v>

442
00:18:35.056 --> 00:18:36.452
around climate change?

443
00:18:36.452 --> 00:18:38.840
<v ->You know, I've thought about it and I've started</v>

444
00:18:38.840 --> 00:18:43.300
to look a little bit at what are some of the gaps.

445
00:18:43.300 --> 00:18:46.138
I'm still so new to this field

446
00:18:46.138 --> 00:18:49.367
that I don't have enough confidence

447
00:18:49.367 --> 00:18:51.853
in my own mental model of the world.

448
00:18:53.306 --> 00:18:56.310
But it's been really fun starting to go see some

449
00:18:56.310 --> 00:18:59.350
of the technologists working on climate change.

450
00:18:59.350 --> 00:19:03.680
It's been very interesting, you forwarded me an incredible

451
00:19:03.680 --> 00:19:07.177
article in the New Yorker called, "Money Is The Fuel

452
00:19:07.177 --> 00:19:10.877
"That The Fire Of Climate Change Burns On."

453
00:19:12.200 --> 00:19:16.690
And it's specifically about the financial industry

454
00:19:16.690 --> 00:19:18.970
and the insurance industry because there's so much

455
00:19:18.970 --> 00:19:22.650
private money that goes into banks and insurance companies

456
00:19:22.650 --> 00:19:26.211
that then actually finances coal and oil and gas

457
00:19:26.211 --> 00:19:31.211
and that's generally opaque.

458
00:19:32.560 --> 00:19:37.170
To investors, to private wealth funds, to countries,

459
00:19:37.170 --> 00:19:39.370
a lot of whom give their money to banks.

460
00:19:39.370 --> 00:19:41.317
And on the one hand, the country's saying, "We're working

461
00:19:41.317 --> 00:19:45.600
"on a 1.5 degree plan for Japan."

462
00:19:45.600 --> 00:19:46.433
But on the other hand

463
00:19:46.433 --> 00:19:47.747
a lot of their money is actually financing

464
00:19:47.747 --> 00:19:50.853
exactly the problem.

465
00:19:50.853 --> 00:19:54.082
Which is the supply of money to go build new coal

466
00:19:54.082 --> 00:19:58.393
and oil and do fracking and all of that other kind of stuff.

467
00:19:59.440 --> 00:20:01.968
<v ->What do you think big tech's responsibility</v>

468
00:20:01.968 --> 00:20:04.263
is when it comes to climate change?

469
00:20:05.160 --> 00:20:06.753
<v ->Well at the very least,</v>

470
00:20:07.980 --> 00:20:09.483
I think there's gradations.

471
00:20:10.700 --> 00:20:12.422
I think at the very least, it's making a commitment

472
00:20:12.422 --> 00:20:14.193
to being carbon negative.

473
00:20:15.190 --> 00:20:16.674
<v ->And that's in manufacturing processes,</v>

474
00:20:16.674 --> 00:20:19.200
that's in shipping even.

475
00:20:19.200 --> 00:20:21.300
<v ->Yeah, it's a lot easier for Facebook and Google</v>

476
00:20:21.300 --> 00:20:23.540
than it is for Amazon and Apple.

477
00:20:23.540 --> 00:20:25.146
Because we're not running giant supply chains

478
00:20:25.146 --> 00:20:26.429
and trucking stuff around the world

479
00:20:26.429 --> 00:20:27.936
and packaging things.

480
00:20:27.936 --> 00:20:28.769
So it's easy for Facebook to say

481
00:20:28.769 --> 00:20:33.586
'cause it's basically data centers and buildings.

482
00:20:33.586 --> 00:20:36.680
It's a lot easier to get a data center on renewable energy

483
00:20:36.680 --> 00:20:39.420
than it is to re-engineer your entire truck

484
00:20:39.420 --> 00:20:40.830
and supply chain system which is what

485
00:20:40.830 --> 00:20:42.150
Amazon has to think about.

486
00:20:42.150 --> 00:20:44.430
With that said, admirably, they've made a commitment

487
00:20:44.430 --> 00:20:47.973
to get to carbon neutral, I think it's by 2040.

488
00:20:49.100 --> 00:20:50.143
<v ->They've actually said they were going to do it</v>

489
00:20:50.143 --> 00:20:51.930
10 years earlier than everybody else.

490
00:20:51.930 --> 00:20:52.763
<v ->That's right, yeah.</v>

491
00:20:52.763 --> 00:20:54.280
<v ->So let's see if they can meet that goal.</v>

492
00:20:54.280 --> 00:20:55.947
<v ->You know, putting the tactics aside,</v>

493
00:20:55.947 --> 00:20:59.320
tech isn't actually creating that much carbon.

494
00:20:59.320 --> 00:21:03.233
I think Amazon's was 44 million.

495
00:21:04.650 --> 00:21:07.790
I did the math the other day, it's around a percent

496
00:21:09.780 --> 00:21:11.820
of the United States' carbon output.

497
00:21:11.820 --> 00:21:12.790
<v ->Okay.</v>

498
00:21:12.790 --> 00:21:14.750
<v ->Which is pretty amazing considering what they do.</v>

499
00:21:14.750 --> 00:21:16.480
One of the interesting metrics to look at is

500
00:21:16.480 --> 00:21:19.742
carbon intensity which is per dollar GDP

501
00:21:19.742 --> 00:21:23.860
how much carbon, what's the externality in terms of carbon?

502
00:21:23.860 --> 00:21:25.794
High carbon intensity is a gas company,

503
00:21:25.794 --> 00:21:30.103
extremely low carbon intensity is most tech companies.

504
00:21:30.103 --> 00:21:33.483
But I do think tech can lead.

505
00:21:34.470 --> 00:21:37.790
We're an industry that in some ways is still looked to as

506
00:21:37.790 --> 00:21:40.900
a leader and in some ways is looked at with a squint

507
00:21:40.900 --> 00:21:42.870
to see if we'll improve.

508
00:21:42.870 --> 00:21:44.618
I think that both of those are healthy.

509
00:21:44.618 --> 00:21:47.960
But I think we can start to think about ways

510
00:21:47.960 --> 00:21:52.960
of more carefully dialing in our attention to carbon impact.

511
00:21:55.090 --> 00:21:57.490
And I don't know exactly what that looks like for big tech,

512
00:21:57.490 --> 00:22:02.490
but I believe that sort of riling up employees more

513
00:22:02.798 --> 00:22:07.798
and getting more power into the hands of the officers

514
00:22:09.053 --> 00:22:10.695
who are responsible for this at the company

515
00:22:10.695 --> 00:22:12.495
is ultimately gonna be a good thing.

516
00:22:13.370 --> 00:22:14.957
<v ->Chris, I have to let you go, we're out of time</v>

517
00:22:14.957 --> 00:22:16.180
but thank you so much.

518
00:22:16.180 --> 00:22:17.290
<v ->Yep, my pleasure.</v>

519
00:22:17.290 --> 00:22:18.321
Thanks everyone, appreciate it.

520
00:22:18.321 --> 00:22:19.154
<v ->Thank you everybody.</v>

521
00:22:19.154 --> 00:22:21.295
[audience applauding]

