WEBVTT

1
00:00:06.516 --> 00:00:08.217
Welcome to the Be Good podcast,

2
00:00:08.578 --> 00:00:14.622
where we explore the application of behavioral economics for good in order to nudge better business and better lives.

3
00:00:16.203 --> 00:00:16.443
Hi,

4
00:00:16.823 --> 00:00:19.565
and welcome to this episode of Be Good,

5
00:00:20.186 --> 00:00:22.488
brought to you by BVA Nudge Consulting,

6
00:00:22.908 --> 00:00:29.633
a global consultancy specializing in the application of behavioral science for successful behavioral change.

7
00:00:30.233 --> 00:00:30.894
Every month,

8
00:00:30.914 --> 00:00:34.136
we get to speak with a leader in the fields of behavioral science,

9
00:00:34.560 --> 00:00:38.784
psychology and neuroscience in order to get to know more about them,

10
00:00:38.884 --> 00:00:39.504
their work,

11
00:00:40.005 --> 00:00:42.146
and its application to emerging issues.

12
00:00:42.787 --> 00:00:44.048
My name is Eric Singler,

13
00:00:44.348 --> 00:00:48.952
Managing Director of the BVA family and CEO of BVA Nudge Consulting,

14
00:00:49.432 --> 00:00:53.075
and with me is my colleague Richard Bordenave,

15
00:00:53.436 --> 00:00:56.458
Chief Behavioral Science Officer at PIRAS in Vivo.

16
00:00:57.139 --> 00:00:57.879
Hi Richard.

17
00:00:58.520 --> 00:00:59.120
Hi Eric,

18
00:00:59.460 --> 00:01:03.644
I'm excited to be joining you for this episode and I'm delighted

19
00:01:03.784 --> 00:01:05.125
to be introducing our guest,

20
00:01:05.205 --> 00:01:06.887
Professor Gerd Gigerenzer.

21
00:01:07.928 --> 00:01:13.553
Professor Gigerenzer is a giant in the world of academic research on human behavior and decision making.

22
00:01:14.554 --> 00:01:20.619
Professor Gigerenzer is director of the Harding Center for Risk Literacy at the University of Potsdam,

23
00:01:21.440 --> 00:01:25.564
director emeritus at the Max Planck Institute for Human Development,

24
00:01:25.724 --> 00:01:28.006
and a partner at Simply Rational,

25
00:01:28.706 --> 00:01:30.248
the Institute for Decision and...

26
00:01:30.700 --> 00:01:34.022
He's also vice president of the European Research Council,

27
00:01:35.083 --> 00:01:39.186
and he's a former professor of psychology at the University of Chicago.

28
00:01:40.287 --> 00:01:41.047
So a few of

29
00:01:41.628 --> 00:01:43.669
Professor Gigerenzer's awards include

30
00:01:44.170 --> 00:01:46.711
the American Association for the Advancement of Science

31
00:01:47.672 --> 00:01:50.354
a prize for the best article in the behavioral sciences,

32
00:01:51.395 --> 00:01:56.458
the Association of American Publishers Prize for the best book in social and behavioral sciences,

33
00:01:57.199 --> 00:01:58.900
the German Psychology Award,

34
00:01:59.280 --> 00:02:02.842
and the Communicator Award of the German Research Foundation.

35
00:02:03.602 --> 00:02:03.723
So

36
00:02:04.643 --> 00:02:09.186
Professor Gigerenzer is author of multiple books on heuristics and decision-making,

37
00:02:10.146 --> 00:02:13.448
which have been translated into many languages,

38
00:02:13.488 --> 00:02:14.228
more than 20,

39
00:02:14.869 --> 00:02:16.570
including the book of today,

40
00:02:17.290 --> 00:02:18.231
Smart Management,

41
00:02:18.731 --> 00:02:18.991
How

42
00:02:19.792 --> 00:02:24.034
Simple Heuristics Help Leaders Make Good Decisions in an Uncertain

43
00:02:24.394 --> 00:02:27.156
World. I'm very happy to welcome you,

44
00:02:27.216 --> 00:02:27.676
Professor.

45
00:02:28.684 --> 00:02:31.665
And welcome to our Be Good podcast.

46
00:02:32.565 --> 00:02:32.685
Oh,

47
00:02:32.885 --> 00:02:35.346
thank you for having me here again.

48
00:02:36.347 --> 00:02:40.328
And I'm looking forward to having another insightful conversation with you.

49
00:02:41.468 --> 00:02:42.608
Professor Gigerenzer,

50
00:02:43.209 --> 00:02:48.410
thank you so much again for being with us today for this episode of Be Good.

51
00:02:49.871 --> 00:02:51.631
Before talking about your work,

52
00:02:51.951 --> 00:02:53.172
smart management,

53
00:02:53.412 --> 00:02:54.352
we would like to know...

54
00:02:54.848 --> 00:02:55.709
A little more,

55
00:02:55.749 --> 00:02:56.389
if possible,

56
00:02:56.509 --> 00:02:59.392
about you and your amazing career.

57
00:03:00.793 --> 00:03:07.318
Can you tell us first about how you came to be interested in human behavior in general,

58
00:03:07.919 --> 00:03:14.344
and maybe specifically about your interest in decision-making processes?

59
00:03:16.005 --> 00:03:18.487
I had an earlier career as a musician,

60
00:03:20.048 --> 00:03:23.191
and that is how I financed my studies.

61
00:03:24.832 --> 00:03:26.894
when I did my PhD,

62
00:03:26.914 --> 00:03:32.838
I had to make a decision whether to stay on the stage and play.

63
00:03:33.478 --> 00:03:35.119
That was entertainment music,

64
00:03:35.580 --> 00:03:36.821
mostly jazz,

65
00:03:36.881 --> 00:03:37.501
Dixieland,

66
00:03:37.601 --> 00:03:37.921
soul,

67
00:03:38.942 --> 00:03:44.166
or risk an academic career.

68
00:03:45.607 --> 00:03:47.008
That was a decision.

69
00:03:47.048 --> 00:03:48.469
How do you make such a decision?

70
00:03:48.909 --> 00:03:49.109
Now,

71
00:03:49.750 --> 00:03:50.030
for me,

72
00:03:50.670 --> 00:03:52.251
the way I do it is,

73
00:03:52.231 --> 00:03:52.852
I'm not going to be a

74
00:03:53.472 --> 00:03:54.893
I have been playing music for,

75
00:03:55.293 --> 00:03:55.874
at this time,

76
00:03:55.874 --> 00:03:57.394
for 14 years on the stage.

77
00:03:58.175 --> 00:03:59.516
It was the safe option.

78
00:04:00.456 --> 00:04:05.379
I knew how it works and I earned much more money than as an assistant professor.

79
00:04:06.900 --> 00:04:08.541
And on the other side,

80
00:04:10.643 --> 00:04:11.263
I thought,

81
00:04:11.583 --> 00:04:11.923
okay,

82
00:04:12.684 --> 00:04:15.646
is this what you want to do for the rest of your life?

83
00:04:17.167 --> 00:04:19.008
And at the end...

84
00:04:19.836 --> 00:04:20.817
I took the risk,

85
00:04:21.317 --> 00:04:28.502
but for me it was a risky decision because I couldn't know whether I would ever make it to a professor at a good university.

86
00:04:29.423 --> 00:04:37.069
And that's a kind of important life decision that inspired me to look closer,

87
00:04:37.469 --> 00:04:39.531
how we actually make decisions.

88
00:04:40.771 --> 00:04:46.696
Could you share with us any mentors who had a particularly strong influence on you?

89
00:04:49.120 --> 00:04:57.604
Do you have any researcher or other people who have played an influential role in your professional career?

90
00:04:59.113 --> 00:05:02.415
So the people who intellectually influenced me,

91
00:05:02.535 --> 00:05:04.016
that was certainly Herbert Simon,

92
00:05:04.836 --> 00:05:09.439
with respect to his acknowledgment of uncertainty,

93
00:05:09.859 --> 00:05:15.242
as opposed to a world of calculable risk where you can optimize.

94
00:05:15.622 --> 00:05:18.063
That's his famous concept of satisficing.

95
00:05:18.824 --> 00:05:24.327
But also a German-American psychologist,

96
00:05:24.507 --> 00:05:25.407
Egon Brunswik,

97
00:05:25.467 --> 00:05:26.428
who is less known.

98
00:05:28.789 --> 00:05:31.631
He emphasized that to understand behavior,

99
00:05:31.771 --> 00:05:32.652
just like Simon did,

100
00:05:33.352 --> 00:05:34.794
we need to look at the world,

101
00:05:34.934 --> 00:05:36.094
the structure of the world,

102
00:05:36.175 --> 00:05:43.780
not just inside, at traits or risk preferences and risk aversion and such things.

103
00:05:46.342 --> 00:05:48.564
During my life,

104
00:05:51.606 --> 00:05:55.789
I benefited very much from the research group at the Max Planck Institute.

105
00:05:57.369 --> 00:06:04.633
And that was around 30 to 35 researchers,

106
00:06:04.673 --> 00:06:10.336
from graduate students to what would be, in the US system, associate professors.

107
00:06:11.397 --> 00:06:19.361
And we had for many years an open culture where ideas could be discussed,

108
00:06:19.381 --> 00:06:24.044
where nobody had anxieties about saying,

109
00:06:24.144 --> 00:06:25.945
what's the evidence for that point?

110
00:06:27.041 --> 00:06:28.161
But at the same time,

111
00:06:28.161 --> 00:06:33.104
the group was a big family and we still meet every year.

112
00:06:34.144 --> 00:06:35.945
And I'm just coming back from Barcelona,

113
00:06:36.445 --> 00:06:37.285
the last meeting.

114
00:06:37.805 --> 00:06:40.126
So much of the things

115
00:06:40.787 --> 00:06:42.407
I learned and

116
00:06:43.588 --> 00:06:50.230
rediscovered were the products of many people discussing all the time.

117
00:06:51.891 --> 00:06:56.133
And the social aspect of ideas is...

118
00:06:56.553 --> 00:06:57.233
Very important.

119
00:06:58.394 --> 00:06:58.514
Yes.

120
00:06:58.534 --> 00:06:58.654
So,

121
00:06:58.754 --> 00:06:59.214
Professor

122
00:06:59.594 --> 00:07:02.995
Gigerenzer, as I mentioned earlier, your recent book,

123
00:07:03.535 --> 00:07:04.275
Smart Management,

124
00:07:04.355 --> 00:07:07.136
How Simple Heuristics Help Leaders Make Good Decisions in an Uncertain World,

125
00:07:07.896 --> 00:07:09.137
which has just been published,

126
00:07:09.177 --> 00:07:12.257
is co-authored with Jochen Reb and Shenghua Luan.

127
00:07:13.398 --> 00:07:15.418
And it will be at the center of the conversation today.

128
00:07:15.818 --> 00:07:18.619
But before we move into the content of it,

129
00:07:19.339 --> 00:07:22.600
can you just tell us what was the inspiration behind it?

130
00:07:23.000 --> 00:07:24.861
Why write for managers?

131
00:07:25.601 --> 00:07:27.262
What was the idea behind the book?

132
00:07:28.262 --> 00:07:34.464
Management is an ideal topic of how to make decisions under uncertainty.

133
00:07:35.484 --> 00:07:35.624
Now,

134
00:07:35.804 --> 00:07:46.087
uncertainty means that you do not know the complete set of events that might happen in the future,

135
00:07:46.387 --> 00:07:47.467
nor the consequences.

136
00:07:49.208 --> 00:07:50.628
So that's management.

137
00:07:51.588 --> 00:07:52.569
That's most of life.

138
00:07:53.569 --> 00:07:56.231
as opposed to what

139
00:07:57.512 --> 00:07:59.633
Jimmie Savage called a small world,

140
00:08:00.834 --> 00:08:06.638
where you know all the future events and their consequences and their probabilities.

141
00:08:07.439 --> 00:08:09.780
So management is not a lottery.

142
00:08:12.022 --> 00:08:20.327
And so that work came out of the earlier work of the ABC research group,

143
00:08:20.788 --> 00:08:22.469
that is, ABC stands for adaptive

144
00:08:22.901 --> 00:08:24.542
behavior and cognition.

145
00:08:25.783 --> 00:08:33.629
And adaptive means that you need to tune your heuristics to the problem at hand.

146
00:08:34.349 --> 00:08:37.131
There is no single hammer that works for everything.

147
00:08:38.252 --> 00:08:43.195
And my two co-authors were part of the research group,

148
00:08:44.356 --> 00:08:46.678
and they are now in Beijing and Singapore.

149
00:08:47.438 --> 00:08:48.979
So we wrote this thing together.

150
00:08:49.019 --> 00:08:50.600
We wanted to do it more personally,

151
00:08:50.680 --> 00:08:51.441
but it was...

152
00:08:51.957 --> 00:08:52.277
COVID-19,

153
00:08:54.138 --> 00:08:58.661
but we managed to meet often enough.

154
00:08:59.621 --> 00:08:59.841
And

155
00:09:00.862 --> 00:09:08.566
I think this book brings very concrete examples,

156
00:09:08.706 --> 00:09:15.370
very concrete heuristics and discusses when it's a good thing to apply them and when not.

157
00:09:16.811 --> 00:09:20.393
And it provides an alternative to the usual book.

158
00:09:20.969 --> 00:09:27.776
the curricula in business schools, where you learn: how should you make good decisions?

159
00:09:28.156 --> 00:09:30.178
Expected utility maximization.

160
00:09:30.759 --> 00:09:32.781
How should you not make them?

161
00:09:33.302 --> 00:09:34.663
Cognitive illusions.

162
00:09:35.304 --> 00:09:35.644
Errors.

163
00:09:36.625 --> 00:09:41.290
But then you don't know what to do because you can't maximize in the real world,

164
00:09:42.131 --> 00:09:43.072
in a VUCA world,

165
00:09:43.232 --> 00:09:44.413
and...

166
00:09:45.217 --> 00:09:47.118
If you just point out what's going wrong,

167
00:09:47.418 --> 00:09:48.938
you don't know what to do right.

168
00:09:49.838 --> 00:09:51.219
And that's what this book,

169
00:09:52.319 --> 00:09:52.439
so

170
00:09:53.139 --> 00:09:53.939
the blind spot,

171
00:09:54.199 --> 00:09:54.760
it fills.

172
00:09:55.720 --> 00:09:56.080
Thank you.

173
00:09:57.360 --> 00:09:58.241
So for managers,

174
00:09:58.581 --> 00:10:03.202
what would be the one key learning or benefit in trusting intuition?

175
00:10:04.702 --> 00:10:05.163
So first,

176
00:10:06.763 --> 00:10:09.924
the book teaches a few principles,

177
00:10:09.984 --> 00:10:12.685
which is take uncertainty seriously.

178
00:10:13.725 --> 00:10:14.005
Often,

179
00:10:14.569 --> 00:10:17.310
uncertainty is reduced to risk,

180
00:10:17.510 --> 00:10:19.971
at least in most economic models,

181
00:10:20.571 --> 00:10:25.052
but also by many behavioral economists who assume that this is the right way to do it,

182
00:10:26.392 --> 00:10:29.133
that is right in a world of known risk,

183
00:10:30.053 --> 00:10:33.674
and that is then immediately applied to the real world, the VUCA world.

184
00:10:35.075 --> 00:10:37.315
And then take heuristics seriously,

185
00:10:38.756 --> 00:10:43.617
which means to be a little bit more humble.

186
00:10:44.169 --> 00:10:48.250
and accept that in much of the real world you can't optimize.

187
00:10:49.310 --> 00:10:53.051
And maximizing expected utility is just a version of that.

188
00:10:53.732 --> 00:10:56.672
So what do you really do?

189
00:10:56.813 --> 00:11:00.734
And then the book shows there are many ways to satisfice.

190
00:11:01.154 --> 00:11:01.614
For instance,

191
00:11:01.674 --> 00:11:05.275
many ways to hire a person,

192
00:11:07.435 --> 00:11:10.016
to decide between the applicants,

193
00:11:10.756 --> 00:11:12.777
and think about that.

194
00:11:14.117 --> 00:11:20.082
And learn the adaptive toolbox of heuristics and learn wisely to select them.

195
00:11:20.982 --> 00:11:22.423
And here you need experience.

196
00:11:23.284 --> 00:11:26.346
While in the traditional framework of expected utility maximization,

197
00:11:26.506 --> 00:11:27.767
you don't need experience.

198
00:11:28.147 --> 00:11:29.308
You don't need to learn.

199
00:11:29.608 --> 00:11:30.829
You just do a calculation.

200
00:11:32.510 --> 00:11:32.990
Professor,

201
00:11:33.351 --> 00:11:36.713
I would like to start with a key concept from your book.

202
00:11:37.093 --> 00:11:38.774
And first of all,

203
00:11:38.834 --> 00:11:42.377
with the concept of simple heuristics.

204
00:11:43.173 --> 00:11:50.718
Could you explain what you mean by a simple heuristic or what you call smart heuristics?

205
00:11:51.819 --> 00:11:52.339
Let's take

206
00:11:53.020 --> 00:11:54.000
Harry Markowitz,

207
00:11:54.000 --> 00:11:54.661
who won his

208
00:11:55.341 --> 00:11:59.884
Economics Nobel Prize for an optimization method.

209
00:12:00.405 --> 00:12:08.110
So it's known as an optimal portfolio for the question,

210
00:12:08.150 --> 00:12:09.371
you have N assets,

211
00:12:09.591 --> 00:12:12.173
how do you invest across them?

212
00:12:14.157 --> 00:12:16.879
It's known as the mean variance portfolio.

213
00:12:17.759 --> 00:12:17.979
When

214
00:12:18.660 --> 00:12:23.563
Harry Markowitz made his own investments for the time of his retirement,

215
00:12:24.143 --> 00:12:29.286
then we might assume he used his Nobel Prize winning optimization method.

216
00:12:29.906 --> 00:12:30.887
He did not.

217
00:12:31.847 --> 00:12:33.328
He used a simple heuristic.

218
00:12:34.129 --> 00:12:34.909
In that case,

219
00:12:35.650 --> 00:12:36.790
the heuristic is called

220
00:12:37.231 --> 00:12:38.631
1 over n.

221
00:12:39.472 --> 00:12:41.213
n is the number of assets.

222
00:12:42.257 --> 00:12:44.540
And it means divide equally.

223
00:12:44.920 --> 00:12:46.642
So if you have two options,

224
00:12:47.323 --> 00:12:48.364
then 50-50.

225
00:12:48.844 --> 00:12:49.164
Three,

226
00:12:49.365 --> 00:12:49.805
a third,

227
00:12:49.825 --> 00:12:50.226
a third,

228
00:12:50.286 --> 00:12:50.746
and so on.

229
00:12:51.247 --> 00:12:52.148
That's a heuristic.

230
00:12:53.609 --> 00:12:57.233
The optimization portfolio is not a heuristic.

231
00:12:58.294 --> 00:13:00.196
It requires extensive data,

232
00:13:00.757 --> 00:13:02.138
extensive calculation,

233
00:13:03.319 --> 00:13:04.240
and estimation.

234
00:13:05.577 --> 00:13:05.697
So,

235
00:13:06.978 --> 00:13:07.379
studies,

236
00:13:07.459 --> 00:13:07.859
by the way,

237
00:13:07.999 --> 00:13:12.903
show that in many investment situations,

238
00:13:13.023 --> 00:13:15.185
1 over n makes more money,

239
00:13:16.266 --> 00:13:17.667
measured by Sharpe ratio,

240
00:13:17.767 --> 00:13:20.729
by other traditional criteria,

241
00:13:21.310 --> 00:13:23.672
than the Markowitz portfolio.

242
00:13:24.152 --> 00:13:24.573
And also,

243
00:13:25.533 --> 00:13:29.136
many modern versions cannot systematically do better.

244
00:13:30.337 --> 00:13:30.457
So,

245
00:13:31.258 --> 00:13:32.779
this is an example of a heuristic.

246
00:13:33.960 --> 00:13:36.842
And many people use the same heuristic for other things.

247
00:13:36.902 --> 00:13:37.322
For instance,

248
00:13:37.362 --> 00:13:44.247
parents with two or more children try to divide their love and time equally.

249
00:13:44.567 --> 00:13:45.708
That's one over n.

250
00:13:46.688 --> 00:13:48.289
And the heuristics they carry,

251
00:13:48.930 --> 00:13:49.931
as one can see here,

252
00:13:50.451 --> 00:13:52.212
often a sense of fairness.

253
00:13:53.193 --> 00:13:53.593
With them,

254
00:13:54.254 --> 00:13:55.334
an equal division.

255
00:13:55.895 --> 00:13:58.937
And the point is that one needs to...

256
00:13:59.297 --> 00:14:03.039
study how well they do in,

257
00:14:03.059 --> 00:14:03.579
for instance,

258
00:14:03.920 --> 00:14:04.440
accuracy.

259
00:14:05.341 --> 00:14:05.461
So,

260
00:14:05.961 --> 00:14:12.585
what's often called naive diversification can do very well under uncertainty.

261
00:14:14.066 --> 00:14:17.468
Could you give us some concrete examples?

262
00:14:19.989 --> 00:14:20.189
Here,

263
00:14:20.209 --> 00:14:23.031
I'll give you an example of three heuristics.

264
00:14:24.692 --> 00:14:26.113
Let's start with Elon Musk.

265
00:14:27.081 --> 00:14:30.062
When Elon Musk was young and Tesla was young,

266
00:14:30.882 --> 00:14:39.365
he reported that he did the hiring himself and relied on a single criterion,

267
00:14:41.206 --> 00:14:41.626
which is,

268
00:14:43.007 --> 00:14:45.748
does the person have an exceptional ability?

269
00:14:46.468 --> 00:14:46.848
If no,

270
00:14:47.768 --> 00:14:48.429
not hired.

271
00:14:48.789 --> 00:14:49.249
If yes,

272
00:14:49.629 --> 00:14:50.009
hired.

273
00:14:50.789 --> 00:14:52.470
This is an extreme heuristic,

274
00:14:53.150 --> 00:14:55.291
because it just looks at one.

275
00:14:56.083 --> 00:14:58.444
variable or one feature.

276
00:14:59.685 --> 00:14:59.885
Now,

277
00:15:00.625 --> 00:15:04.246
you might think that it is a bit irrational.

278
00:15:06.487 --> 00:15:13.270
At least it would look that way from the heuristics and biases program: he might have cognitive limitations or something like that.

279
00:15:13.370 --> 00:15:14.030
But no,

280
00:15:14.931 --> 00:15:16.571
one needs to look at this and study it.

281
00:15:16.912 --> 00:15:17.032
So,

282
00:15:17.612 --> 00:15:18.492
in this case,

283
00:15:19.052 --> 00:15:24.995
there is one variable that is strongly correlated with a number of other variables.

284
00:15:26.055 --> 00:15:26.876
For instance,

285
00:15:27.436 --> 00:15:29.457
if someone has an exceptional ability,

286
00:15:29.597 --> 00:15:33.379
and even if this person is an excellent musician,

287
00:15:33.499 --> 00:15:37.202
which is not necessarily important for Tesla,

288
00:15:38.662 --> 00:15:42.324
but that means the person is likely able to concentrate,

289
00:15:43.025 --> 00:15:44.366
to stay on the problem,

290
00:15:44.886 --> 00:15:45.486
to sweat,

291
00:15:46.087 --> 00:15:46.987
to persevere,

292
00:15:47.968 --> 00:15:51.290
to be able to work with others together,

293
00:15:51.370 --> 00:15:52.991
like in an orchestra.

294
00:15:54.011 --> 00:15:54.131
So,

295
00:15:55.312 --> 00:16:00.475
finding an excellent variable is often the real question.

296
00:16:01.275 --> 00:16:05.217
And that brings many others along with it.

297
00:16:05.898 --> 00:16:06.018
So,

298
00:16:06.078 --> 00:16:08.399
this is a heuristic from a broader class.

299
00:16:09.120 --> 00:16:12.121
It's the one-clever-cue heuristic.

300
00:16:12.261 --> 00:16:14.422
Find an important aspect.

301
00:16:15.423 --> 00:16:15.723
Now,

302
00:16:16.504 --> 00:16:17.184
let's move on.

303
00:16:17.624 --> 00:16:18.685
A second example.

304
00:16:20.046 --> 00:16:20.166
So,

305
00:16:20.266 --> 00:16:21.106
Jeff Bezos,

306
00:16:21.747 --> 00:16:22.167
Amazon.

307
00:16:22.811 --> 00:16:26.132
He reported that when Amazon was small,

308
00:16:26.252 --> 00:16:27.213
he did the hiring,

309
00:16:28.313 --> 00:16:30.574
but he did it slightly different from Musk.

310
00:16:31.194 --> 00:16:36.997
He used a different heuristic that we call a fast and frugal tree.

311
00:16:37.617 --> 00:16:37.837
Okay,

312
00:16:37.897 --> 00:16:38.657
let me explain.

313
00:16:39.618 --> 00:16:39.738
So,

314
00:16:40.458 --> 00:16:43.099
it looks at most at three variables,

315
00:16:43.219 --> 00:16:43.940
not just one.

316
00:16:44.440 --> 00:16:45.240
And interestingly,

317
00:16:45.280 --> 00:16:48.021
the first one was the same as Musk.

318
00:16:48.401 --> 00:16:48.682
Namely,

319
00:16:49.102 --> 00:16:51.343
does the person have an exceptional ability?

320
00:16:51.423 --> 00:16:51.823
If no...

321
00:16:52.951 --> 00:16:53.511
No hire.

322
00:16:54.512 --> 00:16:55.192
If yes,

323
00:16:55.832 --> 00:16:56.713
that's not enough.

324
00:16:57.673 --> 00:16:58.313
For Bezos,

325
00:16:58.773 --> 00:17:00.334
a second question is asked,

326
00:17:00.614 --> 00:17:00.894
namely,

327
00:17:01.595 --> 00:17:04.216
can I admire this person,

328
00:17:04.636 --> 00:17:06.537
which is an unusual question.

329
00:17:09.058 --> 00:17:09.278
But

330
00:17:09.718 --> 00:17:10.378
Bezos said,

331
00:17:11.078 --> 00:17:12.559
if I admire a person,

332
00:17:12.939 --> 00:17:14.740
I will learn from that person,

333
00:17:15.020 --> 00:17:16.241
and that's important for him.

334
00:17:17.601 --> 00:17:17.721
So,

335
00:17:18.482 --> 00:17:19.342
if the person...

336
00:17:19.362 --> 00:17:20.242
Eh.

337
00:17:22.543 --> 00:17:25.085
If he thinks he cannot admire the person,

338
00:17:25.506 --> 00:17:26.126
no hire.

339
00:17:27.727 --> 00:17:28.228
But yes,

340
00:17:28.448 --> 00:17:30.830
a third and last question is asked,

341
00:17:30.850 --> 00:17:36.755
will the person improve the average quality in the unit he or she will be in?

342
00:17:37.576 --> 00:17:41.299
And that's a reasonable question because by hiring in this way,

343
00:17:41.859 --> 00:17:44.101
you always improve the group.

344
00:17:44.642 --> 00:17:46.043
And only then is the person hired.

345
00:17:46.744 --> 00:17:50.687
So this can be seen as a decision tree that is not complete.

346
00:17:51.259 --> 00:17:53.120
where there's a possible decision at every point.

347
00:17:53.620 --> 00:17:55.121
So exceptional ability,

348
00:17:55.501 --> 00:17:55.981
yes or no?

349
00:17:56.781 --> 00:17:57.382
And if yes,

350
00:17:57.822 --> 00:17:59.643
then admire,

351
00:18:00.443 --> 00:18:00.883
yes or no?

352
00:18:01.123 --> 00:18:01.864
And if no,

353
00:18:02.264 --> 00:18:05.805
then, will the person raise the average?

354
00:18:07.146 --> 00:18:12.649
Note that these fast-and-frugal trees are not like a full tree,

355
00:18:13.589 --> 00:18:14.849
and there's an order in it.

356
00:18:15.970 --> 00:18:17.531
So if you fail on the first one,

357
00:18:18.367 --> 00:18:20.569
the other two will not compensate.

358
00:18:20.949 --> 00:18:23.831
That's very different from most rational decision models,

359
00:18:24.212 --> 00:18:25.553
where you always can compensate.

360
00:18:27.575 --> 00:18:34.660
A second concept that is fundamental in your thinking is ecological rationality.

361
00:18:35.541 --> 00:18:44.849
Could you define what ecological rationality is and why this concept is fundamental for making good decisions?

362
00:18:45.869 --> 00:18:46.710
Yeah.

363
00:18:47.431 --> 00:18:47.551
The...

364
00:18:48.615 --> 00:18:49.595
As you may see,

365
00:18:50.276 --> 00:18:55.598
my approach to decision-making is to look precisely at how a decision is being made.

366
00:18:56.459 --> 00:19:00.060
The standard concept of rationality is about consistency.

367
00:19:01.441 --> 00:19:01.561
So,

368
00:19:02.281 --> 00:19:07.323
whether it's transitive, and other consistency axioms.

369
00:19:08.304 --> 00:19:15.107
And ecological rationality is a term that...

370
00:19:16.291 --> 00:19:20.073
basically maps onto Simon's concept of bounded rationality,

371
00:19:21.394 --> 00:19:24.576
which is not what Kahneman called bounded rationality,

372
00:19:25.997 --> 00:19:30.199
because that's also the reason why we use the term ecological rationality.

373
00:19:30.199 --> 00:19:31.740
It means it's a functional term.

374
00:19:32.961 --> 00:19:34.241
How good is this heuristic?

375
00:19:34.702 --> 00:19:36.983
To what degree is it adapted to the problem?

376
00:19:38.544 --> 00:19:40.725
So let me illustrate.

377
00:19:41.285 --> 00:19:44.207
In the case of Elon Musk,

378
00:19:45.083 --> 00:19:48.964
where the hiring heuristic is just one variable.

379
00:19:49.504 --> 00:19:51.625
Does this person have an exceptional ability?

380
00:19:52.786 --> 00:19:53.086
That,

381
00:19:53.706 --> 00:20:01.388
now you can do some mathematics and see whether this cue,

382
00:20:03.069 --> 00:20:03.369
whether,

383
00:20:04.289 --> 00:20:04.889
put it this way,

384
00:20:05.290 --> 00:20:06.170
if you add more,

385
00:20:07.570 --> 00:20:08.730
variables to this cue,

386
00:20:08.891 --> 00:20:09.691
does it improve?

387
00:20:11.891 --> 00:20:15.312
And if you imagine now,

388
00:20:16.012 --> 00:20:16.193
say,

389
00:20:17.733 --> 00:20:22.354
the first variable has a certain validity in terms of a linear model,

390
00:20:22.714 --> 00:20:25.295
and the second one has only half of that,

391
00:20:25.335 --> 00:20:27.115
and the third additional contribution,

392
00:20:27.176 --> 00:20:27.956
like beta weights,

393
00:20:28.596 --> 00:20:31.397
then you can prove that in this situation,

394
00:20:33.177 --> 00:20:36.278
looking at more variables will

395
00:20:36.498 --> 00:20:40.020
not improve the decision.

396
00:20:40.020 --> 00:20:46.684
It can even make the decision worse, because you are getting more and more error.

397
00:20:47.264 --> 00:20:54.488
So ecological rationality studies the condition under which certain heuristics work and do not work.

398
00:20:55.549 --> 00:21:04.174
And this is a new discipline, because it has been assumed, and is still assumed, by the mainstream

399
00:21:04.722 --> 00:21:10.906
behavioral economists that a heuristic is always second class and optimization is always better.

400
00:21:11.966 --> 00:21:12.126
Now,

401
00:21:13.147 --> 00:21:14.448
in a large world,

402
00:21:14.628 --> 00:21:15.928
you can't optimize,

403
00:21:16.189 --> 00:21:16.629
period.

404
00:21:17.669 --> 00:21:19.911
That's an illusion to think this way.

405
00:21:20.331 --> 00:21:22.012
You have to ask another question.

406
00:21:22.392 --> 00:21:23.513
There are many heuristics.

407
00:21:24.373 --> 00:21:27.795
Which one of them will work in this situation?

408
00:21:28.996 --> 00:21:32.498
And if you have exponentially decreasing

409
00:21:33.899 --> 00:21:34.319
weights,

410
00:21:34.920 --> 00:21:40.204
then you can just go with the best reason and ignore everything else.

411
00:21:40.625 --> 00:21:41.806
If that's not the case,

412
00:21:43.087 --> 00:21:46.730
if the weights of the variables are rather flat,

413
00:21:47.711 --> 00:21:50.493
then procedures like

414
00:21:50.973 --> 00:21:57.659
1 over n or tallying, which don't estimate weights

415
00:21:57.739 --> 00:22:00.482
but just use unit weights, can

416
00:22:01.570 --> 00:22:07.613
avoid error in estimation and can be better than regression models.

417
00:22:09.893 --> 00:22:13.695
And these are effects that we call less is more.

418
00:22:14.035 --> 00:22:16.516
And less is more happens under uncertainty.

419
00:22:17.517 --> 00:22:20.438
It doesn't happen in the world of,

420
00:22:21.398 --> 00:22:23.499
in the small world of risk models.

421
00:22:24.840 --> 00:22:30.762
The third concept that seems fundamentally in your thinking is the adaptive.

422
00:22:30.882 --> 00:22:31.342
toolbox.

423
00:22:32.163 --> 00:22:32.543
Again,

424
00:22:32.763 --> 00:22:36.266
could you explain this idea of the adaptive toolbox?

425
00:22:37.807 --> 00:22:42.009
How do you think humans make decisions using this toolbox?

426
00:22:43.110 --> 00:22:45.272
There is an adaptive toolbox of heuristics;

427
00:22:45.392 --> 00:22:47.753
they can be used consciously or unconsciously.

428
00:22:48.374 --> 00:22:49.655
Take the example of hiring.

429
00:22:50.415 --> 00:22:50.915
Now we have

430
00:22:51.356 --> 00:22:54.178
Musk's one reason hiring rule.

431
00:22:54.358 --> 00:22:58.721
We have Jeff Bezos' fast-and-frugal tree with up to three reasons.

432
00:22:59.301 --> 00:23:00.462
Then you can add a few.

433
00:23:01.054 --> 00:23:01.534
For instance,

434
00:23:01.595 --> 00:23:03.456
a number of heuristics are social,

435
00:23:04.417 --> 00:23:10.302
and a social heuristic for hiring would be word of mouth.

436
00:23:10.342 --> 00:23:12.123
Word of mouth is often used,

437
00:23:12.664 --> 00:23:14.585
so you ask your own employees,

438
00:23:15.106 --> 00:23:19.890
do you know someone who would be a good person for that job?

439
00:23:21.031 --> 00:23:25.555
And there are studies that show that in a healthy company,

440
00:23:26.496 --> 00:23:28.838
word of mouth is hard to beat.

441
00:23:30.458 --> 00:23:31.500
And you can see why.

442
00:23:32.040 --> 00:23:41.833
Because the person who recommends another person to the job feels responsible him or herself.

443
00:23:42.954 --> 00:23:47.060
I will not recommend someone who works less than he or she.

444
00:23:51.066 --> 00:23:54.769
So that's an example of the adaptive toolbox.

445
00:23:55.869 --> 00:24:01.613
And then you can classify the heuristics into broader classes, as we've just seen.

446
00:24:02.354 --> 00:24:04.055
One is social heuristics,

447
00:24:04.195 --> 00:24:05.056
like imitation.

448
00:24:05.236 --> 00:24:08.458
Very important for innovation in businesses.

449
00:24:09.299 --> 00:24:12.781
Much of innovation is copying and editing.

450
00:24:14.882 --> 00:24:18.405
It's not that Facebook invented everything itself.

451
00:24:19.386 --> 00:24:19.946
It copied.

452
00:24:20.794 --> 00:24:21.715
Or Google copied.

453
00:24:23.196 --> 00:24:32.524
And so these type of heuristics can be systematically studied and their ecological rationality investigated.

454
00:24:34.466 --> 00:24:38.349
And that's much harder than saying you just maximize some utility.

455
00:24:40.102 --> 00:24:44.925
This is one of the central elements of the book and more generally of your thinking.

456
00:24:45.566 --> 00:24:51.910
And you highlight particularly the underestimation of the importance of intuition in decision making.

457
00:24:52.390 --> 00:24:53.931
And just for the audience to know,

458
00:24:53.991 --> 00:24:59.435
you've also recently devoted another book called The Intelligence of Intuition.

459
00:25:00.136 --> 00:25:05.119
So one thing I would like to know is what is your definition of intuition?

460
00:25:05.279 --> 00:25:08.301
What do you have in mind when you talk about intuition?

461
00:25:08.301 --> 00:25:09.262
Because I think you have a very...

462
00:25:10.042 --> 00:25:11.743
specific definition of it.

463
00:25:12.903 --> 00:25:16.125
So intuition is a form of unconscious intelligence.

464
00:25:16.865 --> 00:25:21.467
So it is a feeling that has three components.

465
00:25:21.667 --> 00:25:24.929
One is it's based on long experience.

466
00:25:25.989 --> 00:25:26.349
Otherwise,

467
00:25:26.369 --> 00:25:27.610
there's no intuition.

468
00:25:28.550 --> 00:25:30.931
So there may be years-long experience with the subject.

469
00:25:31.211 --> 00:25:31.531
Second,

470
00:25:33.111 --> 00:25:37.953
you sense very quickly what you should do or not do.

471
00:25:39.073 --> 00:25:39.713
And third,

472
00:25:41.113 --> 00:25:48.195
you cannot explain why you sense that you should do that or not.

473
00:25:49.096 --> 00:25:54.557
So intuition is not a sixth sense, nor God's voice.

474
00:25:54.977 --> 00:25:57.018
It's also not something that...

475
00:25:58.018 --> 00:25:59.479
only women have,

476
00:25:59.659 --> 00:26:00.699
and men do not;

477
00:26:00.899 --> 00:26:01.980
everybody has intuition,

478
00:26:02.400 --> 00:26:09.863
who works for sufficient time in a certain topic where there's also feedback.

479
00:26:11.944 --> 00:26:17.006
And you are right saying that in decision-making theory,

480
00:26:17.146 --> 00:26:21.068
intuition is looked at with suspicion,

481
00:26:21.748 --> 00:26:24.069
but not in many other fields.

482
00:26:25.210 --> 00:26:25.330
So,

483
00:26:25.330 --> 00:26:25.710
you're right.

484
00:26:26.670 --> 00:26:29.651
Just think about mathematics or physics,

485
00:26:31.231 --> 00:26:32.832
where intuition is respected.

486
00:26:34.392 --> 00:26:35.172
Einstein said,

487
00:26:36.793 --> 00:26:40.174
intuition is a gift.

488
00:26:40.714 --> 00:26:43.335
So the intuitive spirit,

489
00:26:43.395 --> 00:26:43.815
he said,

490
00:26:44.315 --> 00:26:45.455
is a gift,

491
00:26:45.915 --> 00:26:49.976
and the rational spirit is its servant.

492
00:26:50.857 --> 00:26:55.598
And we have created a society that honors the servant and has forgotten

493
00:26:55.798 --> 00:26:56.418
the gift.

494
00:26:56.958 --> 00:26:57.979
Just an example.

495
00:26:58.759 --> 00:27:03.300
If you talk with the best chess players in the world,

496
00:27:03.420 --> 00:27:05.741
Judit Polgár or Magnus Carlsen,

497
00:27:06.281 --> 00:27:15.324
they emphasize that their excellent play is a mixture of intuition and deliberate learning.

498
00:27:16.724 --> 00:27:18.765
And this is the first important point.

499
00:27:20.365 --> 00:27:20.665
Namely,

500
00:27:20.965 --> 00:27:22.866
intuition is not opposed

501
00:27:23.350 --> 00:27:25.451
to deliberate thinking,

502
00:27:26.171 --> 00:27:29.532
as it's still assumed in much of behavioral economics,

503
00:27:29.892 --> 00:27:31.872
where you have a system one and a system two.

504
00:27:32.532 --> 00:27:32.653
No,

505
00:27:32.753 --> 00:27:32.873
no,

506
00:27:33.393 --> 00:27:34.853
it goes hand in hand.

507
00:27:36.253 --> 00:27:36.374
So,

508
00:27:37.254 --> 00:27:38.314
another example,

509
00:27:38.674 --> 00:27:42.015
a doctor who sees you often,

510
00:27:42.775 --> 00:27:46.876
and today the doctor thinks something is wrong with you,

511
00:27:48.217 --> 00:27:50.537
but cannot explain what's wrong.

512
00:27:51.658 --> 00:27:52.558
That's an intuition.

513
00:27:53.766 --> 00:27:54.846
Long experience,

514
00:27:55.447 --> 00:27:58.667
it comes quickly into consciousness,

515
00:27:59.288 --> 00:28:00.788
and the doctor cannot explain it.

516
00:28:01.928 --> 00:28:05.329
And then the doctor will move on and do diagnostics.

517
00:28:06.229 --> 00:28:07.870
But there's no contradiction there.

518
00:28:08.410 --> 00:28:13.331
It's not an either-or that you should stop all your intuition,

519
00:28:13.411 --> 00:28:14.612
as we are told.

520
00:28:16.152 --> 00:28:20.033
And that is an important insight.

521
00:28:20.333 --> 00:28:22.474
There will be almost no area

522
00:28:23.206 --> 00:28:27.249
where you will do well without intuition.

523
00:28:28.810 --> 00:28:29.631
And you can add:

524
00:28:29.671 --> 00:28:34.354
there will be almost no area where you will do well without thinking.

525
00:28:35.494 --> 00:28:37.096
And it's not a contradiction.

526
00:28:37.976 --> 00:28:38.356
Interesting,

527
00:28:38.356 --> 00:28:39.157
because in management,

528
00:28:39.697 --> 00:28:47.863
sometimes we have an ambiguous point of view where managers are also invited to overcome their cognitive biases.

529
00:28:48.744 --> 00:28:51.465
So what's your thinking?

530
00:28:52.466 --> 00:29:08.530
about these approaches? Can intuition actually be leveraged while avoiding pitfalls? Yeah, of course. So I'm not going to criticize the heuristics and biases approach much here.

531
00:29:08.590 --> 00:29:21.894
I've done this many times. So just to be clear: many of the so-called biases, as we know them today, aren't biases at all.

532
00:29:22.390 --> 00:29:23.651
If you ignore information,

533
00:29:24.291 --> 00:29:25.591
then that may be something good or not.

534
00:29:25.631 --> 00:29:27.452
The question is ecological rationality.

535
00:29:28.132 --> 00:29:28.393
So when

536
00:29:28.873 --> 00:29:33.195
Harry Markowitz ignores his own optimizing portfolio model,

537
00:29:33.935 --> 00:29:34.655
he may be right.

538
00:29:36.476 --> 00:29:36.616
When

539
00:29:37.857 --> 00:29:40.558
Elon Musk just relies on one variable,

540
00:29:40.838 --> 00:29:41.578
he may be right.

541
00:29:41.958 --> 00:29:42.979
Depends on the situation.

542
00:29:43.859 --> 00:29:48.181
So the heuristics and biases tradition needs an ecological perspective.

543
00:29:49.038 --> 00:29:52.701
It's not true that being consistent would always be right.

544
00:29:54.041 --> 00:29:57.224
Nor that using a heuristic would always be a bad thing.

545
00:29:58.985 --> 00:30:00.606
Intuition is,

546
00:30:01.487 --> 00:30:03.688
I have worked with many large companies.

547
00:30:04.789 --> 00:30:06.130
And the

548
00:30:06.530 --> 00:30:09.612
CEOs and the top leaders of large companies,

549
00:30:11.653 --> 00:30:15.216
when in the studies with these companies,

550
00:30:17.097 --> 00:30:18.238
you will find that

551
00:30:18.746 --> 00:30:19.706
They say that

552
00:30:21.067 --> 00:30:27.568
50% of the important professional decisions are at the end gut decisions,

553
00:30:27.848 --> 00:30:28.809
so an intuition.

554
00:30:31.429 --> 00:30:44.613
You get these numbers if you do an anonymous survey with them, or in some cases we use a person high up in the company who

555
00:30:44.653 --> 00:30:48.154
has the trust of everyone and can just openly talk.

556
00:30:49.298 --> 00:30:58.422
The same executives make about every other decision at the end intuitively,

557
00:30:58.862 --> 00:31:00.543
and the emphasis is on the end.

558
00:31:00.843 --> 00:31:02.563
It's not arbitrary.

559
00:31:03.364 --> 00:31:04.724
So they sit on data,

560
00:31:05.825 --> 00:31:06.945
and there's too much data,

561
00:31:07.185 --> 00:31:09.006
and they don't know how reliable it is.

562
00:31:09.346 --> 00:31:10.387
And in many situations,

563
00:31:10.387 --> 00:31:11.527
the data gives you an answer,

564
00:31:11.607 --> 00:31:12.827
and in many others,

565
00:31:13.088 --> 00:31:13.208
no.

566
00:31:14.228 --> 00:31:18.010
And an intuitive decision is that you,

567
00:31:18.430 --> 00:31:18.710
after,

568
00:31:19.282 --> 00:31:21.464
all that the data can tell you,

569
00:31:23.065 --> 00:31:24.626
feel that you shouldn't do that.

570
00:31:26.908 --> 00:31:27.028
So,

571
00:31:27.828 --> 00:31:28.469
in the studies,

572
00:31:28.509 --> 00:31:29.029
as I said,

573
00:31:30.831 --> 00:31:33.793
in each of the large companies I've worked with,

574
00:31:33.813 --> 00:31:35.114
and these are worldwide companies,

575
00:31:35.154 --> 00:31:37.555
about 50% of all decisions are,

576
00:31:37.615 --> 00:31:38.116
at the end,

577
00:31:38.496 --> 00:31:39.437
an intuitive one.

578
00:31:40.297 --> 00:31:41.538
And now the interesting point,

579
00:31:42.899 --> 00:31:46.002
the same executives would never admit that in public.

580
00:31:48.410 --> 00:31:49.070
There's fear.

581
00:31:50.411 --> 00:31:52.892
For an intuitive decision,

582
00:31:53.512 --> 00:31:55.233
you have to take responsibility.

583
00:31:56.373 --> 00:32:03.076
We live in a society where fewer and fewer executives are willing to take responsibility.

584
00:32:03.757 --> 00:32:04.517
So what do they do?

585
00:32:05.737 --> 00:32:07.938
They just made an intuitive decision,

586
00:32:08.759 --> 00:32:12.700
but they can't announce it as one.

587
00:32:13.181 --> 00:32:13.301
So,

588
00:32:15.362 --> 00:32:16.022
one version is...

589
00:32:16.590 --> 00:32:22.775
They take a middle manager and then would ask him,

590
00:32:23.296 --> 00:32:24.456
you have a week or two,

591
00:32:25.337 --> 00:32:26.178
find me the reasons.

592
00:32:29.460 --> 00:32:30.561
And that's a loss of time,

593
00:32:31.742 --> 00:32:33.844
intelligence and resources.

594
00:32:34.805 --> 00:32:45.393
The more expensive version of that is the company or the top management hires a consulting firm.

595
00:32:46.754 --> 00:32:54.976
That then will waste months on finding reasons for the already made decision.

596
00:32:57.657 --> 00:32:58.818
How often does it happen?

597
00:32:59.678 --> 00:33:08.020
I have worked with one of the largest worldwide consulting firms and asked the principal over lunch,

598
00:33:08.420 --> 00:33:13.622
would you be willing to tell me how often client contracts involve

599
00:33:14.462 --> 00:33:18.746
justifying decisions that have already been made?

600
00:33:19.867 --> 00:33:20.288
He said,

601
00:33:20.388 --> 00:33:21.409
Professor Gigerenzer,

602
00:33:21.409 --> 00:33:22.169
if you don't tell me,

603
00:33:23.751 --> 00:33:24.772
if you don't tell my name,

604
00:33:24.792 --> 00:33:25.212
I will.

605
00:33:26.053 --> 00:33:27.514
It's over 50%.

606
00:33:28.635 --> 00:33:28.755
So,

607
00:33:29.756 --> 00:33:35.642
I tell you this story to illustrate the anxiety of admitting gut decisions.

608
00:33:36.022 --> 00:33:37.223
They are made,

609
00:33:38.104 --> 00:33:39.285
but there is an anxiety.

610
00:33:40.586 --> 00:33:44.450
And together with the anxiety of taking responsibility,

611
00:33:45.851 --> 00:33:50.015
and then the waste of time and resources,

612
00:33:51.616 --> 00:33:58.623
only to pretend that the decision has been made

613
00:33:58.943 --> 00:34:00.244
with no intuition.

614
00:34:02.266 --> 00:34:03.447
And this is the world we live in.

615
00:34:03.988 --> 00:34:05.129
Sounds familiar somehow.

616
00:34:07.090 --> 00:34:11.211
We do research and we can hear that very often.

617
00:34:11.852 --> 00:34:11.972
Now,

618
00:34:11.972 --> 00:34:16.573
there are still some major advantages in using intuition,

619
00:34:16.653 --> 00:34:20.034
sometimes against algorithms or complex systems.

620
00:34:21.234 --> 00:34:29.036
Would you highlight to managers why intuition can actually be a very good way to make decisions?

621
00:34:29.576 --> 00:34:32.477
What are the main advantages you would highlight?

622
00:34:33.678 --> 00:34:34.358
Yeah.

623
00:34:34.358 --> 00:34:34.878
So first...

624
00:34:35.298 --> 00:34:36.379
You don't waste money.

625
00:34:37.200 --> 00:34:38.662
You don't waste time.

626
00:34:41.645 --> 00:34:45.008
But you need courage to stand up and say,

627
00:34:45.649 --> 00:34:48.371
as a manager or as a politician,

628
00:34:48.992 --> 00:34:49.232
look,

629
00:34:50.574 --> 00:34:50.694
so...

630
00:34:53.002 --> 00:34:54.683
The honest version would be,

631
00:34:55.383 --> 00:34:56.743
in the cases I've described,

632
00:34:57.004 --> 00:35:01.965
not to hire a consulting firm and deceive yourself and everyone else,

633
00:35:02.465 --> 00:35:04.546
but to stand up and say,

634
00:35:04.786 --> 00:35:04.966
look,

635
00:35:06.166 --> 00:35:06.286
we,

636
00:35:06.966 --> 00:35:07.466
the board,

637
00:35:07.927 --> 00:35:11.928
have now spent a week over this decision.

638
00:35:12.168 --> 00:35:12.408
Maybe,

639
00:35:13.948 --> 00:35:15.149
should we buy this company,

640
00:35:15.269 --> 00:35:15.949
another company,

641
00:35:16.009 --> 00:35:16.429
or not?

642
00:35:17.309 --> 00:35:20.170
Or should we move to Vietnam?

643
00:35:21.158 --> 00:35:21.718
Something big.

644
00:35:23.759 --> 00:35:24.820
We have looked at data,

645
00:35:25.460 --> 00:35:26.661
at previous experience,

646
00:35:27.181 --> 00:35:28.422
and the data isn't clear.

647
00:35:30.603 --> 00:35:33.405
And there's no point now to go on with that.

648
00:35:34.105 --> 00:35:37.347
But someone has to take the responsibility,

649
00:35:37.667 --> 00:35:38.127
and I,

650
00:35:38.147 --> 00:35:38.808
as the CEO,

651
00:35:39.508 --> 00:35:39.988
have to do it.

652
00:35:41.189 --> 00:35:41.309
So,

653
00:35:42.670 --> 00:35:44.711
based on my own experience,

654
00:35:46.232 --> 00:35:48.713
and on my intuition,

655
00:35:49.253 --> 00:35:50.654
I think we should not

656
00:35:50.914 --> 00:35:54.856
pursue this any further.

657
00:35:54.856 --> 00:35:56.317
So that would be the honest version.

658
00:35:58.018 --> 00:36:00.159
And it would save the company lots of time.

659
00:36:02.060 --> 00:36:06.623
And many of the problems of being too slow inside the company,

660
00:36:06.803 --> 00:36:07.844
or with the customers,

661
00:36:08.164 --> 00:36:12.866
come from the anxiety of making decisions and prolonging them.

662
00:36:12.866 --> 00:36:14.868
You can measure this;

663
00:36:15.528 --> 00:36:19.030
the technical term is defensive decision making.

664
00:36:20.282 --> 00:36:28.268
So you protect yourself and may even go with a decision that's not the best for the company.

665
00:36:32.452 --> 00:36:33.032
So what you have is

666
00:36:34.173 --> 00:36:38.156
defensive decision making and the lack of a good error culture,

667
00:36:39.457 --> 00:36:42.720
and that is what hurts many companies.

668
00:36:43.180 --> 00:36:45.882
You do not find it as much in family companies.

669
00:36:47.938 --> 00:36:49.119
And in a family,

670
00:36:49.599 --> 00:36:50.419
it's your own money.

671
00:36:51.760 --> 00:36:52.481
For the CEO,

672
00:36:52.681 --> 00:36:54.442
it's not his or her own money.

673
00:36:55.682 --> 00:36:56.663
It's a company's money.

674
00:36:59.404 --> 00:37:01.966
And family businesses plan long-term,

675
00:37:02.046 --> 00:37:03.067
for the next generation,

676
00:37:03.087 --> 00:37:05.968
and not for the next three-month report.

677
00:37:08.329 --> 00:37:09.850
So intuition,

678
00:37:10.251 --> 00:37:13.312
not taking intuition seriously,

679
00:37:13.953 --> 00:37:16.394
costs enormous amounts of time

680
00:37:17.082 --> 00:37:18.683
and money to companies.

681
00:37:19.964 --> 00:37:20.885
Two very good reasons.

682
00:37:20.925 --> 00:37:22.426
Maybe I'm going to hand over to

683
00:37:23.027 --> 00:37:24.949
Eric, just conscious of time.

684
00:37:27.451 --> 00:37:27.711
Yes,

685
00:37:27.751 --> 00:37:28.311
professor.

686
00:37:30.013 --> 00:37:34.797
How would it be possible to change this situation?

687
00:37:35.277 --> 00:37:45.385
Meaning, to put intuition at the center of our decision-making process, the intuition as you define it.

688
00:37:45.545 --> 00:37:46.126
So,

689
00:37:46.962 --> 00:37:56.246
There are a number of very concrete procedures I have used in my work with Simply Rational.

690
00:37:57.126 --> 00:37:57.907
So one is,

691
00:37:58.127 --> 00:37:59.207
I'll give you just two examples.

692
00:37:59.347 --> 00:38:05.730
Almost all big companies have problems in defensive decision making,

693
00:38:06.731 --> 00:38:08.211
in wasting too much time.

694
00:38:08.611 --> 00:38:09.492
There are exceptions.

695
00:38:11.093 --> 00:38:11.713
For instance,

696
00:38:12.553 --> 00:38:12.673
the

697
00:38:13.173 --> 00:38:14.234
international airlines.

698
00:38:16.018 --> 00:38:16.138
So,

699
00:38:17.499 --> 00:38:18.699
companies like Lufthansa,

700
00:38:18.759 --> 00:38:19.659
with whom I have worked,

701
00:38:20.660 --> 00:38:22.900
they have an excellent cockpit culture.

702
00:38:24.281 --> 00:38:25.101
Cockpit culture,

703
00:38:25.121 --> 00:38:30.522
that doesn't mean that in the rest of these companies everything is okay.

704
00:38:31.663 --> 00:38:33.523
And that's why it's so safe to fly.

705
00:38:34.783 --> 00:38:35.504
On the other hand,

706
00:38:35.864 --> 00:38:38.805
many hospitals don't have a good error culture.

707
00:38:40.465 --> 00:38:43.226
And that's why so many people die in hospitals.

708
00:38:44.126 --> 00:38:45.127
That's not the only reason.

709
00:38:46.368 --> 00:38:50.811
So this just illustrates that we live in a society where we have,

710
00:38:51.251 --> 00:38:56.054
here's a pocket of excellent decision culture and error culture,

711
00:38:56.115 --> 00:38:56.875
and there it is not.

712
00:38:58.016 --> 00:39:00.538
So what to do if you have a company where it's not?

713
00:39:01.218 --> 00:39:01.959
That's the question.

714
00:39:03.420 --> 00:39:06.722
I'll give you just two examples.

715
00:39:07.122 --> 00:39:09.904
One is a heuristic that's called,

716
00:39:10.245 --> 00:39:10.365
yeah.

717
00:39:12.566 --> 00:39:19.221
Find a good role model, and the best model would be the CEO or people at the very,

718
00:39:19.301 --> 00:39:19.802
very top.

719
00:39:20.283 --> 00:39:20.784
Example.

720
00:39:22.619 --> 00:39:27.863
I worked with a large health provider that had the same problem,

721
00:39:28.104 --> 00:39:29.485
that decisions are too slow,

722
00:39:30.045 --> 00:39:32.327
that they are too defensive,

723
00:39:33.168 --> 00:39:36.131
and nobody wants to take responsibility.

724
00:39:37.512 --> 00:39:38.913
So there was a new CEO.

725
00:39:40.334 --> 00:39:40.935
It was a she.

726
00:39:41.615 --> 00:39:42.756
And in my experience,

727
00:39:42.836 --> 00:39:47.260
often women have less fear to make decisions.

728
00:39:48.581 --> 00:39:49.702
And she...

729
00:39:51.779 --> 00:39:56.582
She assembled all the top management and said,

730
00:39:56.822 --> 00:39:57.002
look,

731
00:39:57.943 --> 00:40:00.204
last year we made this decision.

732
00:40:00.444 --> 00:40:01.665
We all were for it.

733
00:40:01.945 --> 00:40:02.905
And it was wrong,

734
00:40:03.066 --> 00:40:03.786
as we know now.

735
00:40:04.847 --> 00:40:05.007
Now,

736
00:40:05.327 --> 00:40:06.768
let's discuss what

737
00:40:07.208 --> 00:40:10.030
I, who voted for the decision,

738
00:40:10.830 --> 00:40:10.970
what

739
00:40:11.490 --> 00:40:12.651
I did wrong.

740
00:40:14.152 --> 00:40:18.835
And that sends a totally different signal for a different culture.

741
00:40:19.471 --> 00:40:19.591
So,

742
00:40:19.651 --> 00:40:21.773
for discussing the error culture.

743
00:40:21.773 --> 00:40:22.133
And then,

744
00:40:22.153 --> 00:40:22.553
of course,

745
00:40:22.933 --> 00:40:23.854
the manager says,

746
00:40:24.074 --> 00:40:24.194
oh,

747
00:40:24.234 --> 00:40:25.355
we can talk about that.

748
00:40:25.775 --> 00:40:26.756
We don't have to hide that.

749
00:40:28.157 --> 00:40:28.938
It's one example.

750
00:40:29.838 --> 00:40:40.486
A second example is: set some signals that show that the culture has changed.

751
00:40:41.046 --> 00:40:42.687
You can just talk about it.

752
00:40:42.687 --> 00:40:43.548
It doesn't help much.

753
00:40:43.748 --> 00:40:44.108
Signals.

754
00:40:44.588 --> 00:40:45.129
Here's one.

755
00:40:46.690 --> 00:40:47.971
Do you know the game

756
00:40:48.567 --> 00:41:03.898
Monopoly? So there, you can get into jail, and there is a card called the Get Out of Jail card. So we instructed the CEO to

757
00:41:04.438 --> 00:41:17.367
give a Get Out of Jail card to every manager, with the instruction: if you take risks for the company and it goes wrong, hand in your card.

758
00:41:17.847 --> 00:41:18.548
No questions.

759
00:41:19.028 --> 00:41:19.689
And also,

760
00:41:20.349 --> 00:41:24.392
put the card on your desk so that you see it every day.

761
00:41:25.453 --> 00:41:27.715
And that changes the entire situation.

762
00:41:28.055 --> 00:41:28.616
For instance,

763
00:41:29.256 --> 00:41:33.160
if there's someone who has the card lying on the desk for years,

764
00:41:34.100 --> 00:41:36.722
you ask another question,

765
00:41:36.742 --> 00:41:37.703
a different one.

766
00:41:38.944 --> 00:41:46.931
So these are signals and role models that can help to change the culture into one that helps the organization,

767
00:41:47.583 --> 00:41:51.365
and where the decision makers don't have to hide themselves.

768
00:41:53.306 --> 00:41:53.827
There is,

769
00:41:54.407 --> 00:41:54.827
I think,

770
00:41:54.827 --> 00:41:58.989
a fundamental topic in your book,

771
00:41:59.470 --> 00:42:06.374
which is about helping leaders to create a smart decision culture,

772
00:42:07.334 --> 00:42:09.716
what you call a positive Volcker culture,

773
00:42:09.896 --> 00:42:11.016
heuristic culture,

774
00:42:11.096 --> 00:42:12.417
and error culture.

775
00:42:12.857 --> 00:42:15.759
Could you summarize what...

776
00:42:16.019 --> 00:42:23.122
What are your recommendations to create this smart decision culture in organizations?

777
00:42:23.182 --> 00:42:32.826
I'll give you an example from my own experience as a director at a Max Planck Institute with a research group of maybe 35 researchers,

778
00:42:33.787 --> 00:42:35.308
a dozen technical people,

779
00:42:35.448 --> 00:42:40.390
and any number of research assistants and secretaries and so on.

780
00:42:41.250 --> 00:42:42.431
How to deal with such a group?

781
00:42:43.483 --> 00:42:44.523
The biggest problem is,

782
00:42:46.904 --> 00:42:48.325
it's an interdisciplinary group,

783
00:42:48.365 --> 00:42:49.325
how to get them together.

784
00:42:50.285 --> 00:42:50.405
So,

785
00:42:51.586 --> 00:42:54.947
one heuristic for setting up a research group is,

786
00:42:57.147 --> 00:42:58.028
have them begin,

787
00:42:58.588 --> 00:42:59.928
all of them at the same time,

788
00:43:00.948 --> 00:43:01.529
within a week.

789
00:43:03.069 --> 00:43:03.789
That's important,

790
00:43:04.309 --> 00:43:05.450
because in my experience,

791
00:43:05.850 --> 00:43:08.411
those who are hired a few months earlier,

792
00:43:09.399 --> 00:43:14.721
tend to think of those who come later always as younger siblings.

793
00:43:17.221 --> 00:43:29.405
And having everyone together and putting the administration into chaos helps everyone to join and to face this new situation.

794
00:43:30.285 --> 00:43:31.905
Second heuristic is,

795
00:43:32.526 --> 00:43:35.726
so every day at four o'clock,

796
00:43:36.167 --> 00:43:36.827
coffee and tea.

797
00:43:38.743 --> 00:43:39.824
And no obligations.

798
00:43:42.366 --> 00:43:43.727
And it's not a waste of time.

799
00:43:45.729 --> 00:43:46.490
This half an hour,

800
00:43:46.630 --> 00:43:47.851
or even if it's more,

801
00:43:48.832 --> 00:43:49.132
people,

802
00:43:49.853 --> 00:43:50.733
there will be trust.

803
00:43:50.773 --> 00:43:53.456
If you talk about something that has nothing to do with research,

804
00:43:54.056 --> 00:43:54.437
trust.

805
00:43:56.574 --> 00:43:57.595
If they talk of research,

806
00:43:57.715 --> 00:44:01.397
this is one of the most important places where new ideas are generated.

807
00:44:03.498 --> 00:44:03.638
And

808
00:44:04.719 --> 00:44:05.199
I always,

809
00:44:05.259 --> 00:44:05.659
when I was,

810
00:44:06.260 --> 00:44:07.340
I always went there.

811
00:44:07.961 --> 00:44:09.461
I never asked people to go there.

812
00:44:09.782 --> 00:44:11.943
I just signaled it.

813
00:44:12.283 --> 00:44:12.863
There's no point in forcing it.

814
00:44:13.544 --> 00:44:18.046
I know some would try to imitate the model and force everyone to go there.

815
00:44:18.146 --> 00:44:18.266
No,

816
00:44:18.507 --> 00:44:20.227
that's not important.

817
00:44:20.227 --> 00:44:24.390
Or, what's also very important for any manager,

818
00:44:25.002 --> 00:44:26.883
as well as for a research director,

819
00:44:27.424 --> 00:44:31.166
is to have at least one contrarian in your group.

820
00:44:31.907 --> 00:44:35.429
A person who speaks up against the director,

821
00:44:35.890 --> 00:44:37.971
against the group consensus,

822
00:44:38.392 --> 00:44:39.913
but on a factual basis,

823
00:44:40.513 --> 00:44:41.254
with respect.

824
00:44:43.675 --> 00:44:48.218
Many politicians fail to see that.

825
00:44:48.699 --> 00:44:51.901
We know about famous politicians like Putin,

826
00:44:52.902 --> 00:44:53.022
who...

827
00:44:53.794 --> 00:44:55.035
As far as I know,

828
00:44:56.677 --> 00:44:57.938
wants to have people who clap.

829
00:44:59.499 --> 00:45:00.420
You don't want that.

830
00:45:01.681 --> 00:45:08.627
You want to have people who are willing to stand up for the facts and inform you and criticize you.

831
00:45:09.548 --> 00:45:10.769
I've written an entire paper,

832
00:45:10.809 --> 00:45:12.550
and it's in the book,

833
00:45:12.590 --> 00:45:13.752
The Intelligence of Intuition,

834
00:45:13.752 --> 00:45:15.213
where you find more of these rules.

835
00:45:15.693 --> 00:45:16.374
And important,

836
00:45:16.954 --> 00:45:17.595
the rules,

837
00:45:18.115 --> 00:45:19.517
you need always to adapt them.

838
00:45:19.897 --> 00:45:20.317
For instance,

839
00:45:20.317 --> 00:45:21.078
the first rule:

840
00:45:21.490 --> 00:45:26.595
Have everyone start at the same time is not a good rule six years later.

841
00:45:27.796 --> 00:45:33.941
You would do very badly if you would replace your old group at the same time,

842
00:45:34.382 --> 00:45:37.505
because the culture that has evolved will die out.

843
00:45:39.847 --> 00:45:44.771
And it's always going back and forth between facts,

844
00:45:45.192 --> 00:45:46.213
between experience,

845
00:45:46.653 --> 00:45:47.534
between intuition.

846
00:45:49.518 --> 00:45:50.919
That's what leadership does.

847
00:45:50.979 --> 00:45:52.160
There is no one recipe.

848
00:45:53.361 --> 00:45:53.742
Thank you,

849
00:45:54.582 --> 00:45:55.023
Professor.

850
00:45:55.443 --> 00:45:55.603
Well,

851
00:45:56.024 --> 00:45:58.165
maybe a bit of an opening,

852
00:45:58.165 --> 00:46:01.929
because there's a hot topic currently with artificial intelligence.

853
00:46:02.629 --> 00:46:07.914
And I've read that you make a link between artificial intelligence and psychological intelligence.

854
00:46:09.735 --> 00:46:11.657
And one can actually help the other.

855
00:46:11.797 --> 00:46:12.718
They could help each other.

856
00:46:13.158 --> 00:46:17.061
So can you just develop a little bit about what can...

857
00:46:18.666 --> 00:46:23.790
Maybe from the theory of Herbert Simon or the development of artificial intelligence,

858
00:46:24.070 --> 00:46:27.172
how can it be inspired by human psychology?

859
00:46:27.793 --> 00:46:27.953
Yes.

860
00:46:28.654 --> 00:46:28.774
So

861
00:46:29.574 --> 00:46:35.859
I started out from the distinction that Jimmy Savage made between small worlds and large worlds.

862
00:46:37.019 --> 00:46:37.700
In a small world,

863
00:46:37.700 --> 00:46:38.561
you can optimize.

864
00:46:39.081 --> 00:46:40.762
Complex methods will work.

865
00:46:41.663 --> 00:46:46.046
And that's why so many decision theorists study

866
00:46:46.482 --> 00:46:47.783
choices between gambles;

867
00:46:47.863 --> 00:46:48.704
it's a small world.

868
00:46:49.464 --> 00:46:51.986
But whether that translates into the real world is not clear.

869
00:46:52.947 --> 00:46:53.668
In a large world,

870
00:46:53.708 --> 00:46:54.649
you can't optimize.

871
00:46:55.730 --> 00:46:58.292
And that's what in management is called VUCA.

872
00:47:00.053 --> 00:47:00.854
In this situation,

873
00:47:00.874 --> 00:47:01.834
you need heuristics.

874
00:47:02.435 --> 00:47:06.718
The human mind evolved not to deal with gambles,

875
00:47:07.239 --> 00:47:08.260
well-defined gambles,

876
00:47:08.300 --> 00:47:09.260
but with uncertainty.

877
00:47:10.762 --> 00:47:13.744
And that's why we have this adaptive toolbox.

878
00:47:14.364 --> 00:47:14.485
So,

879
00:47:15.225 --> 00:47:16.066
having said that...

880
00:47:17.502 --> 00:47:18.263
The question is,

881
00:47:18.943 --> 00:47:23.946
where can complex algorithms be likely successful and where not?

882
00:47:25.407 --> 00:47:29.949
And that's what I call the difference between a stable world and an unstable world.

883
00:47:30.590 --> 00:47:35.692
And it corresponds to the small world of Savage and others and the large world.

884
00:47:37.033 --> 00:47:43.777
The big successes of AI are in well-defined and stable worlds.

885
00:47:44.077 --> 00:47:45.898
A well-defined world is one like chess.

886
00:47:46.138 --> 00:47:50.781
and Go, and a stable world is one where tomorrow is likely like yesterday.

887
00:47:53.303 --> 00:47:56.405
Unstable worlds include predicting the flu,

888
00:47:57.946 --> 00:47:58.567
predicting

889
00:47:59.167 --> 00:48:02.910
COVID-19, predicting human behavior,

890
00:48:03.910 --> 00:48:05.572
predicting recidivism.

891
00:48:06.652 --> 00:48:07.713
In these areas,

892
00:48:09.114 --> 00:48:14.238
the success of complex algorithms

893
00:48:15.106 --> 00:48:15.967
is not there.

894
00:48:17.968 --> 00:48:20.611
It is in well-defined problems.

895
00:48:21.952 --> 00:48:25.494
Or like in large language models, where there is language,

896
00:48:25.515 --> 00:48:28.177
there is a matrix of correlations,

897
00:48:28.217 --> 00:48:29.358
and that's fairly stable.

898
00:48:30.739 --> 00:48:32.220
But just to give an example,

899
00:48:33.781 --> 00:48:40.747
researchers at Princeton University asked the research community,

900
00:48:41.087 --> 00:48:42.688
the machine learning community,

901
00:48:43.329 --> 00:48:44.630
to predict the future

902
00:48:45.062 --> 00:48:46.884
of so-called fragile families.

903
00:48:48.887 --> 00:48:51.389
And they delivered,

904
00:48:52.331 --> 00:48:52.451
yeah,

905
00:48:52.711 --> 00:48:54.173
what can be called big data,

906
00:48:54.573 --> 00:48:57.016
millions of data points for these families.

907
00:48:57.016 --> 00:49:00.360
So fragile families are usually families with only one

908
00:49:01.188 --> 00:49:01.608
parent.

909
00:49:02.469 --> 00:49:05.370
And this was real prediction,

910
00:49:05.450 --> 00:49:06.690
not out of sample,

911
00:49:07.191 --> 00:49:09.071
but real prediction of the future,

912
00:49:09.792 --> 00:49:10.092
where,

913
00:49:11.072 --> 00:49:11.733
for instance,

914
00:49:11.913 --> 00:49:22.797
the child's grade point average at age 15 was predicted or whether the mother still has a job or a house,

915
00:49:23.177 --> 00:49:23.657
a home,

916
00:49:24.798 --> 00:49:25.738
in a certain time.

917
00:49:27.379 --> 00:49:28.660
They had,

918
00:49:29.360 --> 00:49:29.660
I think,

919
00:49:29.944 --> 00:49:30.884
If I remember correctly,

920
00:49:30.904 --> 00:49:38.226
1,600 submissions, mostly with highly complicated algorithms.

921
00:49:39.027 --> 00:49:48.129
And the result was that most of these machine learning algorithms could not beat a very simple algorithm.

922
00:49:48.509 --> 00:49:52.530
It just looked at three or four data points, or maybe just two,

923
00:49:53.070 --> 00:49:56.791
such as how did the child perform six years ago.

924
00:50:00.585 --> 00:50:09.390
And these are examples where machine learning or complex algorithms in general don't do very well.

925
00:50:09.670 --> 00:50:10.690
And we need to see this.

926
00:50:11.090 --> 00:50:12.011
In our studies,

927
00:50:12.031 --> 00:50:12.491
for instance,

928
00:50:12.531 --> 00:50:14.752
closer to management,

929
00:50:15.333 --> 00:50:16.874
there is a common problem.

930
00:50:17.334 --> 00:50:19.355
You have a huge database.

931
00:50:19.435 --> 00:50:27.099
So a company has a huge database and wants to know which of the customers will likely

932
00:50:27.139 --> 00:50:28.520
make purchases again.

933
00:50:30.229 --> 00:50:32.650
And how do you predict that?

934
00:50:33.111 --> 00:50:34.792
This is all under high uncertainty.

935
00:50:34.792 --> 00:50:37.314
There are so many factors that predict this.

936
00:50:37.874 --> 00:50:39.455
And then there are two answers again.

937
00:50:40.176 --> 00:50:43.538
I use complex models,

938
00:50:44.258 --> 00:50:45.519
or,

939
00:50:46.100 --> 00:50:46.900
rather, I investigate.

940
00:50:46.900 --> 00:50:49.002
Now we're getting to psychological AI.

941
00:50:49.582 --> 00:50:50.763
That's the term I use.

942
00:50:51.704 --> 00:50:57.948
We don't start with logistic regressions or random forests

943
00:50:58.708 --> 00:51:01.589
or other machine learning methods and try which one does better.

944
00:51:01.970 --> 00:51:05.291
Instead, we looked at how experienced managers do that.

945
00:51:06.932 --> 00:51:16.736
And the answer is most of the managers use a simple heuristic that's called the hiatus heuristic,

946
00:51:17.356 --> 00:51:21.578
which is if someone hasn't bought anything in the last nine months,

947
00:51:22.979 --> 00:51:23.179
out,

948
00:51:23.839 --> 00:51:24.460
otherwise in.

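The hiatus heuristic just described can be sketched in a few lines: one reason, one threshold. The nine-month window comes from the discussion here; as noted later in the conversation, the window itself still has to be estimated per problem, so treat the cutoff as an assumption.

```python
# Sketch of the hiatus heuristic: one reason, one threshold.
# The 9-month window is taken from the discussion above; in practice
# it must be estimated for each problem (e.g., 6 vs. 9 months).

HIATUS_MONTHS = 9

def is_active_customer(months_since_last_purchase: float) -> bool:
    """Predict a repeat purchase iff the customer bought recently."""
    return months_since_last_purchase < HIATUS_MONTHS

# Hypothetical customers: months since their last purchase.
customers = {"A": 2, "B": 14, "C": 8}
active = {name for name, m in customers.items() if is_active_customer(m)}
print(sorted(active))  # -> ['A', 'C']
```

A customer outside the window is "out", everyone else is "in"; no interactions, no weights to fit.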
949
00:51:25.528 --> 00:51:26.949
That's again one reason,

950
00:51:27.029 --> 00:51:27.870
like Elon Musk's,

951
00:51:28.351 --> 00:51:28.871
one reason.

952
00:51:29.952 --> 00:51:34.836
And one-reason heuristics can be powerful under high uncertainty,

953
00:51:35.597 --> 00:51:38.560
not in a world where you know everything.

954
00:51:39.460 --> 00:51:39.580
So,

955
00:51:40.261 --> 00:51:52.492
we have shown with 24 companies that this hiatus heuristic predicted future outcomes better than

956
00:51:53.112 --> 00:51:55.614
the typical machine learning algorithms.

957
00:51:55.755 --> 00:51:58.197
So that is random forests,

958
00:51:59.157 --> 00:52:02.360
regularized logistic regression in that case.

959
00:52:03.841 --> 00:52:07.424
And so the insight from that is the following.

960
00:52:08.605 --> 00:52:18.493
Complex algorithms can work very well and better than humans in certain areas,

961
00:52:18.753 --> 00:52:20.034
which are called stable worlds,

962
00:52:20.555 --> 00:52:21.876
but not necessarily

963
00:52:22.312 --> 00:52:22.732
everywhere.

964
00:52:24.054 --> 00:52:37.705
And we can inform AI by studying how experts make decisions and then putting that heuristic as an algorithm.

965
00:52:37.785 --> 00:52:40.467
And the example I just gave is a very simple algorithm.

966
00:52:40.467 --> 00:52:41.468
You just have one reason.

967
00:52:42.028 --> 00:52:44.651
You still have to estimate the time,

968
00:52:44.731 --> 00:52:46.692
like is it nine months or six months,

969
00:52:46.712 --> 00:52:47.733
depending on the problem.

970
00:52:49.234 --> 00:52:50.876
But it gives you a different thinking.

971
00:52:51.376 --> 00:52:51.796
For instance,

972
00:52:51.796 --> 00:52:55.039
we also have shown that with the same heuristic,

973
00:52:55.119 --> 00:52:58.001
so take just the most recent,

974
00:52:59.383 --> 00:53:00.463
we could predict

975
00:53:01.624 --> 00:53:03.426
the spread of the flu,

976
00:53:04.126 --> 00:53:05.688
better than Google Flu Trends.

977
00:53:06.688 --> 00:53:07.049
Remember,

978
00:53:07.129 --> 00:53:09.511
Google Flu Trends ran for eight years.

979
00:53:10.431 --> 00:53:15.736
It was closed down in 2015. The idea was that with big data,

980
00:53:17.186 --> 00:53:21.367
you can predict flu-related doctor visits,

981
00:53:21.767 --> 00:53:23.388
which is a useful problem.

982
00:53:23.428 --> 00:53:26.049
So it's what's called nowcasting.

983
00:53:26.749 --> 00:53:27.449
What's happening now?

984
00:53:27.549 --> 00:53:29.489
Knowing what's happening right now.

985
00:53:30.710 --> 00:53:35.591
And we have seen that a simple heuristic,

986
00:53:35.691 --> 00:53:36.051
that is,

987
00:53:36.711 --> 00:53:41.013
that we know that humans use in situations of high uncertainty,

988
00:53:41.433 --> 00:53:45.954
namely just take the most recent data point seriously

989
00:53:46.322 --> 00:53:47.243
and ignore the rest.

990
00:53:47.263 --> 00:53:59.292
That could predict the spread of the flu better, every year, than Google Flu Trends, and better than all of its updates.

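The recency heuristic described here amounts to a one-line forecaster: predict that this period will look like the most recent one, and ignore the rest of the history. A minimal sketch, with made-up visit counts:

```python
# Recency heuristic for nowcasting: take the most recent data point
# seriously and ignore the rest. Visit counts below are illustrative,
# not real flu data.

def recency_forecast(history: list) -> float:
    """Forecast the next value as the last observed value."""
    if not history:
        raise ValueError("need at least one observation")
    return history[-1]

weekly_flu_visits = [120.0, 135.0, 160.0, 210.0]  # hypothetical weeks
print(recency_forecast(weekly_flu_visits))  # -> 210.0
```

Because the forecast is anchored to the latest observation only, it adjusts immediately when something unexpected happens, which is the adaptability the big-data model lacked.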
991
00:54:00.072 --> 00:54:03.995
The Google engineers updated it when something unexpected happened,

992
00:54:04.515 --> 00:54:05.116
like the swine flu,

993
00:54:06.257 --> 00:54:07.137
and it couldn't adapt.

994
00:54:08.218 --> 00:54:09.279
With big data,

995
00:54:09.579 --> 00:54:11.200
you are like a big tanker.

996
00:54:11.641 --> 00:54:13.122
You can't steer around.

997
00:54:13.898 --> 00:54:16.800
But the simple heuristic can adapt to anything that's new.

998
00:54:18.641 --> 00:54:18.761
So,

999
00:54:19.601 --> 00:54:22.823
psychological AI is the idea to,

1000
00:54:24.984 --> 00:54:27.325
if the problem is about uncertainty,

1001
00:54:28.446 --> 00:54:31.828
psychological AI doesn't help you for playing chess or Go.

1002
00:54:32.828 --> 00:54:39.772
That was Herbert Simon's one error: he thought that it would help in these areas.

1003
00:54:40.513 --> 00:54:42.534
But it helps you under uncertainty.

1004
00:54:43.262 --> 00:54:48.504
So find out how experienced people solve problems, and the answer is heuristics,

1005
00:54:48.704 --> 00:54:51.746
because they have to be sufficiently simple to be robust.

1006
00:54:52.626 --> 00:54:59.229
And then model this as an algorithm and compare your complex algorithms with the simple ones.

1007
00:54:59.389 --> 00:55:02.730
And often you will find you actually do better with the simple one.

1008
00:55:03.370 --> 00:55:06.172
And that has political consequences because,

1009
00:55:07.252 --> 00:55:07.872
as you may know,

1010
00:55:08.353 --> 00:55:08.473
the

1011
00:55:08.813 --> 00:55:11.574
European Union has a Digital Data Act.

1012
00:55:12.166 --> 00:55:14.168
And one of the issues is transparency,

1013
00:55:14.568 --> 00:55:16.909
meaning understandability for the public,

1014
00:55:17.490 --> 00:55:18.511
for credit scoring.

1015
00:55:19.151 --> 00:55:21.793
You want to know why you have a bad score.

1016
00:55:22.834 --> 00:55:24.375
You might want to improve it,

1017
00:55:24.855 --> 00:55:25.636
but it's black box.

1018
00:55:26.737 --> 00:55:30.920
And I am working with the largest credit scorer in Germany.

1019
00:55:31.880 --> 00:55:36.924
And we found that they are using highly complex algorithms.

1020
00:55:38.445 --> 00:55:39.686
And what that means is this.

1021
00:55:41.271 --> 00:55:44.193
You can't even say that if you have more than two credit cards,

1022
00:55:44.734 --> 00:55:45.434
that hurts you.

1023
00:55:46.015 --> 00:55:47.556
It's only true in maybe 80,

1024
00:55:47.636 --> 00:55:48.817
90% of the cases,

1025
00:55:48.857 --> 00:55:50.759
because everything is linked with everything.

1026
00:55:51.500 --> 00:55:54.302
So people cannot know what to do.

1027
00:55:55.463 --> 00:56:00.367
And we found that if you have an algorithm,

1028
00:56:00.527 --> 00:56:08.174
if you delete all these interactions and reduce it to half a dozen or a few more variables,

1029
00:56:09.182 --> 00:56:10.242
You do as well or better.

1030
00:56:10.743 --> 00:56:11.623
And it's transparent.

1031
00:56:12.043 --> 00:56:13.044
People can understand.

1032
00:56:13.244 --> 00:56:14.444
They can change their behavior.

1033
00:56:15.325 --> 00:56:20.207
And you don't have to nudge them into something that they don't understand.

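The transparent alternative described here, a handful of variables with no interaction terms, can be pictured as a simple tally. Everything below (variable names, weights, the example applicant) is hypothetical and does not reflect any real scorer's model:

```python
# Hypothetical transparent credit score: half a dozen yes/no reasons
# with unit weights and no interactions, so a person can see exactly
# which behavior helps or hurts. Illustrative only.

REASONS = {
    "missed_payment_last_year": -1,
    "more_than_two_credit_cards": -1,
    "many_recent_credit_inquiries": -1,
    "long_account_history": +1,
    "stable_address": +1,
    "steady_income": +1,
}

def transparent_score(applicant: dict) -> int:
    """Tally the unit-weighted reasons that apply to this applicant."""
    return sum(w for reason, w in REASONS.items() if applicant.get(reason))

applicant = {"long_account_history": True, "stable_address": True,
             "more_than_two_credit_cards": True}
print(transparent_score(applicant))  # -> 1
```

Unlike a black-box model with interactions, each reason here contributes the same way for every applicant, so "more than two credit cards hurts you" is always true and actionable.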
1034
00:56:20.927 --> 00:56:21.047
So

1035
00:56:21.767 --> 00:56:33.492
I think that simplicity has also societal value and allows people to understand much better when they're being scored.

1036
00:56:33.972 --> 00:56:35.853
And that was a very insightful conclusion.

1037
00:56:36.033 --> 00:56:37.554
Thank you very much for...

1038
00:56:38.174 --> 00:56:39.074
your participation,

1039
00:56:39.154 --> 00:56:40.215
Professor Gigerenzer.

1040
00:56:41.275 --> 00:56:44.316
Is there maybe anything you'd like to leave our listeners with,

1041
00:56:44.536 --> 00:56:47.216
like perhaps where they can find more about your work?

1042
00:56:48.297 --> 00:56:48.697
Well,

1043
00:56:48.697 --> 00:56:48.877
I mean,

1044
00:56:49.017 --> 00:56:53.998
you can start with the book that is about smart management,

1045
00:56:56.079 --> 00:56:59.220
or you might read the book,

1046
00:56:59.260 --> 00:57:00.060
as you mentioned,

1047
00:57:00.060 --> 00:57:01.560
The Intelligence of Intuition,

1048
00:57:02.861 --> 00:57:05.722
or you can read some of my more scientific books,

1049
00:57:07.918 --> 00:57:10.060
like Rationality for Mortals,

1050
00:57:13.702 --> 00:57:14.383
but do read.

1051
00:57:15.163 --> 00:57:15.663
And also,

1052
00:57:16.484 --> 00:57:25.590
I think the basic insights are these: take uncertainty seriously, and notice that, still,

1053
00:57:26.671 --> 00:57:29.933
the mainstream of behavioral economics doesn't do that.

1054
00:57:31.034 --> 00:57:31.434
Otherwise,

1055
00:57:31.434 --> 00:57:33.135
we wouldn't declare everything an error.

1056
00:57:34.356 --> 00:57:35.257
That is the first.

1057
00:57:38.142 --> 00:57:38.703
And second,

1058
00:57:39.023 --> 00:57:39.804
aligned to that,

1059
00:57:40.144 --> 00:57:42.405
take heuristics seriously.

1060
00:57:43.566 --> 00:57:47.089
They are not second best in a VUCA world.

1061
00:57:47.829 --> 00:57:49.771
They are the only thing we can do.

1062
00:57:50.751 --> 00:57:52.252
And the question is a different one.

1063
00:57:53.033 --> 00:57:54.474
Which heuristic makes sense?

1064
00:57:54.934 --> 00:58:00.898
Which is smart and which is not smart in this application?

1065
00:58:01.799 --> 00:58:02.379
And finally,

1066
00:58:02.820 --> 00:58:05.902
have the courage to make decisions.

1067
00:58:06.322 --> 00:58:09.653
and to stand up for them and take responsibility.

1068
00:58:10.756 --> 00:58:11.318
Be Good,

1069
00:58:11.599 --> 00:58:13.826
a podcast by the BVA Nudge Unit.

