Episode 161: Reach-Around Knowledge and Bottom Performers (The Dunning-Kruger Effect)
Very Bad Wizards, April 02, 2019
161
01:25:01, 78.27 MB


The less we know, the more we know it. David and Tamler talk about the notorious Dunning-Kruger effect, which makes us overconfident in beliefs on topics we're ignorant about and under-confident when we're experts. Plus, we break down an evolutionary psychology article on why poor men and hungry men prefer women with big breasts. Trust us, it's a really bad study. We're sure about it.

Support Very Bad Wizards


[00:00:00] Very Bad Wizards is a podcast with a philosopher, my dad, and a psychologist, David Pizarro, having

[00:00:06] an informal discussion about issues in science and ethics.

[00:00:09] Please note that the discussion contains bad words that I'm not allowed to say, and knowing

[00:00:14] my dad, some very inappropriate jokes.

[00:00:17] Is magic real?

[00:00:18] Fine.

[00:00:19] And old withered.

[00:00:21] Hmph.

[00:00:22] The wizard who lost his faith in magic.

[00:00:25] That's ironic.

[00:00:26] Ironic and sad.

[00:00:28] The Queen in Oz has spoken.

[00:00:37] Pay no attention to that man behind the curtain.

[00:00:59] They think deep thoughts and with no more brains than you have.

[00:01:14] Anybody can have a brain.

[00:01:15] You're a very bad man.

[00:01:16] I'm a very good man.

[00:01:17] Just a very bad wizard.

[00:01:18] Welcome to Very Bad Wizards.

[00:01:19] I'm from the University of Houston.

[00:01:21] Dave, I like big boobs and I cannot lie.

[00:01:24] Does that mean I'm poor?

[00:01:27] It might make you poor.

[00:01:29] It might actually.

[00:01:31] Depending on where you go in Vegas, that can lead to poverty.

[00:01:35] To poverty?

[00:01:36] Yeah.

[00:01:37] But you've got the causal direction backwards, I think.

[00:01:39] I know.

[00:01:40] We have the science to tell us.

[00:01:42] This is...

[00:01:43] I'm David Pizarro from Cornell University.

[00:01:46] And I am okay with any size breasts.

[00:01:48] I love them all.

[00:01:50] I like big man boobs.

[00:01:52] I like it when a good friend of mine has gotten so heavy that they just have tits.

[00:02:00] They can be your wingman.

[00:02:01] A good wingman should have tits.

[00:02:06] We have science.

[00:02:07] We're going to talk about it.

[00:02:10] We're going to talk about it.

[00:02:11] So on today's episode, we're going to talk about the Dunning-Kruger effect in the

[00:02:15] second segment.

[00:02:17] What it is, what the research is that supports it.

[00:02:26] Are you waiting for me to start?

[00:02:28] You're going to pick up on this?

[00:02:30] Because I realize I barely know it.

[00:02:32] The Dunning-Kruger effect is the increasingly popular, documented effect that people who

[00:02:40] know less about something are more confident that they know about that thing.

[00:02:45] The more expert you are, the less confidence you have in your knowledge.

[00:02:50] The less they know, the more they know it.

[00:02:53] That's why we exude confidence.

[00:02:55] Exactly.

[00:02:56] I'm going to be especially kind of slow today just because I'm very hungover.

[00:03:02] Oh yeah, you did a public event.

[00:03:06] I did.

[00:03:07] It was philosophy and drinking.

[00:03:12] At Rudyard's Pub in Houston.

[00:03:14] But it was fun.

[00:03:16] It was good.

[00:03:17] There were a couple of VBW listeners, at least four or five that came out.

[00:03:22] That's amazing.

[00:03:24] Remember when all we wanted was four or five listeners?

[00:03:27] I think if we're not careful tonight, we might be headed back there.

[00:03:32] Four or five listeners.

[00:03:34] So let's talk about this paper we're going to talk about in the first segment, which

[00:03:37] is kind of, it's unbelievable.

[00:03:41] It is a, well just describe it.

[00:03:44] It's really unbelievable.

[00:03:45] So this is, I don't remember how I came across it on Twitter, but you know, we mentioned that

[00:03:50] Neuroskeptic was failing us.

[00:03:52] I found this somewhere else, but it's definitely the kind of paper that you would tweet out.

[00:03:56] It's called Resource Security Impacts Men's Female Breast Size Preferences, by Viren Swami

[00:04:05] and Martin Tovée.

[00:04:07] And this was published in PLOS One.

[00:04:09] It's the open access journal.

[00:04:12] A good journal?

[00:04:14] You know, it's a funny one.

[00:04:17] So PLOS One, I think, has had a reputation of being a not-good journal in the eyes of many

[00:04:23] people, but it actually publishes sometimes really good stuff. People usually

[00:04:29] work their way down to get there, and sometimes it's just trash.

[00:04:34] I don't want to shit on it because in fact like there have been really good papers

[00:04:37] in it, but it is considered an easier journal to publish in.

[00:04:41] But it is open access, which is a good thing.

[00:04:43] And it is peer reviewed.

[00:04:44] So it's not just a scam journal, right?

[00:04:46] It's not like one of those that you get.

[00:04:49] So this was just, I couldn't believe it.

[00:04:53] Like I'll just get to the heart of it.

[00:04:57] The argument is that because as we all know breast size in human women is an indicator

[00:05:08] of fat reserves, this in turn might indicate that that woman has access to resources.

[00:05:16] And so these two guys conducted a test of this hypothesis.

[00:05:21] Two studies. One was a study of men in Malaysia, from three different cities that vary in socioeconomic

status.

[00:05:30] So here their thinking was that men from lower SES groups, poor men in Malaysia,

[00:05:39] would prefer women with larger breasts because this was an indicator of like, hey,

[00:05:46] there's like good shit in the environment.

[00:05:48] Whereas men from higher socioeconomic status would prefer smaller breasts.

[00:05:52] The second one is the killer though.

[00:05:53] This was like, well, like let's not do a correlational study.

[00:05:56] We want to test this directly.

[00:05:58] So they got guys who were either coming out of a cafeteria or going into a cafeteria.

[00:06:05] They selected them for being either hungry or satiated and they asked them to rate what

[00:06:15] their preference for women's body shape would be.

[00:06:19] And they used this like 3D modeling that you could rotate.

[00:06:26] And what they found was that hungry men like bigger boobs.

[00:06:31] Because after all, you're hungry.

[00:06:33] You're looking for resources.

[00:06:34] I can't even talk about this without...

[00:06:38] Let me just, I know I always say something along these lines, but is this real?

[00:06:44] This isn't a hoax? Or at least, this is a real journal?

[00:06:47] This isn't a parody.

[00:06:48] Not only that, Tamler, this thing led me down a rabbit hole.

[00:06:53] So I looked up the first author because I was like, this could be like a reverse James

[00:06:59] Lindsay, like somebody trying to mock evolutionary psych.

[00:07:04] And this guy, Viren Swami, no knock on you.

[00:07:06] I don't know who he is.

[00:07:07] I'm sure he's a nice guy.

[00:07:09] Has a shit ton of publications.

[00:07:13] There is apparently just a cottage industry of boob size research.

[00:07:21] And this guy is at the forefront of it because it is a pressing question, Tamler.

[00:07:27] No, it's a way better cottage industry than like trolley problems.

[00:07:33] Probably.

[00:07:34] But like there's a ton...

[00:07:37] So there's one called Perception of Female Buttocks and Breast Size in Profile.

[00:07:44] That was from 2007, some earlier work of his.

[00:07:49] So this is actually what he does.

[00:07:50] Not all of his papers are like this.

[00:07:52] He does some work on why people believe conspiracy theories.

[00:07:56] But this is one from 2007 in a journal called Body Image.

[00:07:59] A lot of these are just journals I've never heard of.

[00:08:02] Unattractive, promiscuous and heavy drinkers.

[00:08:05] Colon.

[00:08:06] Perceptions of women with tattoos.

[00:08:11] So this guy is just maybe somebody who is not allowed to watch porn at home

[00:08:17] and so has developed a whole research program about being able...

[00:08:22] Yeah, maybe.

[00:08:23] The nice thing is we don't even need to ad hominem him, because like, what's the logic here?

[00:08:29] Yeah, we should state it.

[00:08:31] The idea is that if you are poor, we would have evolved a trait, a disposition to look

[00:08:41] for women with high access to resources.

[00:08:47] Yeah.

[00:08:48] Big breasts signal high access to resources.

[00:08:53] So people who are less economically secure will be more attracted to those women because

[00:09:00] of the signaling of high access to resources.

[00:09:04] Yeah.

[00:09:05] But okay, so let's talk a little bit about this.

[00:09:07] So there is a weird thing going on with the fixation on the female anatomy because if

[00:09:16] it is the case that body fat indicates the presence of rich resources, then poor people

[00:09:24] should like fat dudes and hungry people should love fat dudes too.

[00:09:29] Like that is a better indicator that there are good resources, which leads me to my

[00:09:34] other thing.

[00:09:36] Breast size doesn't just change like that when you...

[00:09:42] It's not that people who eat a lot just get huge boobs.

[00:09:46] I know people like there are plenty of people who probably wish that were the case because

[00:09:50] fat...

[00:09:51] Wait, so there's a few things that you were saying that you sort of ran together.

[00:09:54] So the first is the fixation on female versus male.

[00:10:01] Everybody's trying to know, where is the experiment about guys' johnsons?

[00:10:06] Yeah, there's just so much work on what men find attractive in women.

[00:10:15] Sexist men prefer big breasts apparently.

[00:10:18] Men who want a submissive partner prefer small breasts.

[00:10:23] But men who are independent and non-nurturant prefer large breasts, which was by the

[00:10:30] way a study done in 1968 that I actually tracked down and it is ridiculous.

[00:10:36] Like it can't be true.

[00:10:38] You did it.

[00:10:39] Yeah, I got it.

[00:10:40] You're just going to grab it all.

[00:10:42] Men financially stable prefer small breasts.

[00:10:45] Men ready for fatherhood prefer large breasts.

[00:10:46] So a ton of this work has been done and it's hilarious to read about the methodology

[00:10:51] because they're apparently very concerned that early research only used pencil, like

[00:10:55] line drawings, which weren't ecologically valid.

[00:10:58] So like they wanted to use real 3D models.

[00:11:02] That fixes it then.

[00:11:05] Yeah.

[00:11:06] So here I have a couple of questions, just as a non-evolutionary-biologist. Could there possibly be a set

[00:11:17] of genes that were triggered, like, if you sense that you're poor, to make women with bigger

[00:11:25] breasts more attractive and women with smaller breasts less attractive, and vice versa?

[00:11:33] How is this?

[00:11:34] I'm not an evolutionary biologist either, so I should say, take this with a grain of salt.

[00:11:39] But I don't think there is any plausible account for that.

[00:11:42] There's no plausible biological mechanism for this?

[00:11:46] I mean, there is a lot of work on what poverty might do psychologically.

[00:11:53] But so like we know that for instance it can affect like the stress of being poor and being

[00:12:00] in environments where you're scraping together anything like whatever you can to live.

[00:12:05] Like that stress affects all sorts of things.

[00:12:08] I just don't think that there is. Somebody can correct me if I'm wrong.

[00:12:10] But in order for this to be true, there has to be a mechanism that is trying to track resources

[00:12:19] in the environment by looking at somebody else's body.

[00:12:24] And that has to be a fairly reliable signal that there are resources in the environment.

[00:12:32] But the problem is and they don't even really flesh this out but breast size is variable.

[00:12:40] So it's not as if, when hunter-gatherers moved to that area of the savannah that had lots of game...

[00:12:48] Wait, what?

[00:12:49] You would have to have what they're arguing is that breast size is a reliable indicator

[00:12:56] of the resources and environment.

[00:12:58] So if you were like a hunter-gatherer and you're walking around and you see a woman

[00:13:03] with big boobs, you follow her back, because there's

[00:13:07] supposed to be a lot more food around, you know.

[00:13:10] Well, only if you were poor would you do that.

[00:13:12] No, but they're saying poverty.

[00:13:13] They're saying that poverty is a proxy here,

[00:13:18] because remember, they're also doing the thing with hungry and not hungry

[00:13:23] men of the same socioeconomic status.

[00:13:26] So the main thing is whether you're hungry or not.

[00:13:29] And socioeconomic status is a proxy for determining hunger.

[00:13:34] So both socioeconomic status and hunger are ways to measure people who are

[00:13:41] lacking in resources.

[00:13:43] OK, right?

[00:13:50] So these are two different ways of attacking this same question.

[00:13:50] So then you see someone with big breasts and there's these biological

[00:13:56] mechanism that says, be attracted to her, because then you can follow her

[00:14:03] and... You want to eat, buddy?

[00:14:05] You're starving, you're hurting, you're hurting for some resources.

[00:14:10] Follow the big-boobed girl and, you know, marry her or whatever.

[00:14:15] It's like the pimp gene. Which is not to say, by the way,

[00:14:19] that the presence of breasts isn't here for a selected reason.

[00:14:26] But the variability in breasts and breast size... like, I'm sorry.

[00:14:31] This just can't be true.

[00:14:33] It is, though.

[00:14:34] I mean, I suppose it could be true, but so many alternate explanations

[00:14:38] spring to mind and they do address that.

[00:14:41] And some of the funniest quotes come from addressing that.

[00:14:44] But oh my God, the limitations.

[00:14:47] This is the thing that people always ridicule evolutionary psychology for.

[00:14:52] You're positing the presence of something that nobody has remotely

[00:14:58] provided any sort of biological account of how this could possibly be true.

[00:15:04] You know, this is the one thing I've learned.

[00:15:06] I have like very, very good colleagues in my department who are,

[00:15:09] you know, they study the physiological mechanisms in animals and like,

[00:15:14] you know, I have a colleague who studies brain size across all species

[00:15:18] over evolutionary history.

[00:15:19] They're like hardcore, and one thing that they've taught me

[00:15:24] is that you shouldn't say shit about evolutionary biology

[00:15:29] unless you really have learned it.

[00:15:32] Because this is the constant critique

[00:15:35] of, like, crappy evolutionary psychology, that they're just making up a story

[00:15:39] to explain this shit.

[00:15:40] Just-so stories.

[00:15:41] Yeah, just-so stories that have the whiff of science.

[00:15:46] But seriously, like, just to take it on,

[00:15:49] it's like let's just assume that this is a plausible hypothesis.

[00:15:52] Wouldn't you expect that obese people would be more attractive

[00:15:58] across the board if you're resource-starved?

[00:16:01] That is something that really does fluctuate in a reliable way

[00:16:05] based on how much access to food you have.

[00:16:07] But then they'll say no, because being skinny is like a handicap.

[00:16:12] And that's a signal.

[00:16:15] I know they have a real story for pretty much any kind of eventuality.

[00:16:20] Right. The handicap.

[00:16:21] And it's not that some of these things aren't true.

[00:16:24] It's just that so many of them are just post hoc, and you just have...

[00:16:30] And like, you know, they say, well, this

[00:16:33] isn't the only reason, and obviously there's sociocultural factors,

[00:16:37] even though now that I'm thinking of it,

[00:16:41] did you ever see the kind of famous Black Jeopardy with Tom Hanks?

[00:16:46] You know, Black Jeopardy on Saturday Night Live.

[00:16:48] Yeah, I'm trying to remember.

[00:16:50] I don't remember. Yeah.

[00:16:52] So it was a kind of a brilliant skit.

[00:16:53] The premise of it was that Tom Hanks is this blue-collar,

[00:16:59] lower class guy and he goes on Black Jeopardy.

[00:17:03] And up till they get to the Lives Matter question,

[00:17:08] they agree on pretty much everything.

[00:17:12] OK, let's go to Big Girls for two hundred.

[00:17:16] OK, the answer there.

[00:17:18] Skinny women can do this for you.

[00:17:20] Doug, what is not a damn thing?

[00:17:33] My wife, my wife, she's a sturdy gal.

[00:17:37] That is my man right there.

[00:17:39] Go Doug, go Doug.

[00:17:41] Right. So maybe that's a new hypothesis for you to test.

[00:17:45] I think that it's a more plausible hypothesis

[00:17:48] that ass size would be correlated with this than boob size.

[00:17:54] I just don't.

[00:17:56] I mean, boob size fluctuates through the lifespan, like, once you hit puberty.

[00:18:00] And it's not... you know, it is interesting that female humans

[00:18:05] are the only mammals that have permanent boobs.

[00:18:09] Like all the other animals, actually, you know, they get them

[00:18:13] when they're nursing, and then they just disappear.

[00:18:17] And so, you know, there I think there's a lot of

[00:18:20] you could speculate a lot about why.

[00:18:22] I just can't get past this part.

[00:18:25] It's just like, OK, let's read some quotes.

[00:18:29] Yeah. When the two authors are talking about

[00:18:32] the limitations, as good scientists do. I'm at the limitations.

[00:18:36] One of the things they say is third, our focus on breast size

[00:18:39] comes at the expense of other breast related variables

[00:18:43] that may have impacted upon participants' ratings, such as symmetry,

[00:18:46] shape and areola size because after all, they only showed

[00:18:52] a woman in a bikini.

[00:18:54] That's a confound right there.

[00:18:55] Areola size, right?

[00:18:59] It's just, yeah, it wasn't manipulated, so we don't know.

[00:19:02] And "further research," like that.

[00:19:06] Can I read one? Like, just, I mean, I have so many. It's like:

[00:19:11] of course, this is not to suggest that adipose tissue reserves

[00:19:17] are the only thing indicated by larger breast size.

[00:19:21] If this were the case, then larger breast size should be no more

[00:19:24] important than fat stored in any part of a woman's body.

[00:19:27] Right. Rather, breast size may act as a cue of nulliparity,

[00:19:32] age, sexual maturity and fertility.

[00:19:35] And furthermore, there may be other, more important

[00:19:40] cues of fat storage compared to the breast, such as overall body

[00:19:44] size. This just addresses your objection.

[00:19:47] This may help to explain the

[00:19:53] small to moderate effect sizes uncovered in both studies reported.

[00:19:58] That right there. Oh my God, we're stumped by the

[00:20:04] small to moderate effect sizes, like this should be almost

[00:20:08] unanimous. But like, I love how they set the stage, like, you know,

[00:20:13] hey, look, if you want to argue with us, this is further research,

[00:20:16] we need to do more. They're even, like, bodying themselves in

[00:20:21] the limitations, because they're coming up with, like,

[00:20:24] plausible alternative explanations that I hadn't even thought

[00:20:27] of. They're like, totally, like: it is possible that

[00:20:31] figures with larger breast size were perceived as heavier

[00:20:33] overall. If so, it is possible that our findings were driven

[00:20:35] by body size preferences in general rather than breast

[00:20:37] size per se. And it's, well, why don't you fucking, like,

[00:20:42] do one more little manipulation of like thigh size so that you

[00:20:46] could have some discriminant validity to your actual test

[00:20:48] of this hypothesis. So here's another one: nor do our findings

[00:20:53] deny a role for sociocultural factors in shaping breast

[00:20:57] size judgments. It has been argued, for example, that

[00:21:00] breasts are one of the most important sites of objectification

[00:21:04] of the female body in socioeconomically developed settings,

[00:21:08] and media targeted at some men appear to fetishize large

[00:21:13] breasts. Media targeted at some men "appear." Appear! We can't,

[00:21:18] we can't totally, you know, we can't take a stand on this,

[00:21:22] but I'm just saying that it appears to fetishize

[00:21:26] large breasts. But anyway, as an aside, this should not be taken to

[00:21:31] suggest that the importance of breasts varies across cultures

[00:21:36] and that our methodology artificially inflates the

[00:21:39] importance of breast size. Earlier ethnographic research

[00:21:42] indicates that breasts are eroticized in many different

[00:21:46] cultures. This is just... So here's a plausible

[00:21:53] alternative hypothesis that, to me, is equally

[00:21:58] plausible. If their reasoning is right, and

[00:22:04] people from traditionally impoverished societies prefer

[00:22:10] women with large boobs, then over time sexual selection

[00:22:16] should mean that they actually get larger boobs.

[00:22:20] Breast size grows over time, and perhaps at

[00:22:24] the expense, like a peacock's tail, at the expense of

[00:22:27] other, like, the distribution of fat in the body

[00:22:31] or any other indicator of fitness. And so over

[00:22:36] time you should see that in poorer cultures,

[00:22:41] so long as they've been poor for a while, women have bigger

[00:22:43] boobs, right? Yeah. And then, if so, breast

[00:22:51] size would no longer be a reliable indicator of how

[00:22:53] rich the environment was. Like, an a priori objection. This

[00:22:58] isn't an a priori objection. I really do want to know if

[00:23:03] there's any good evolutionary psychology on dick

[00:23:08] aesthetics. I predict that an upward curve ought to have

[00:23:12] been selected because it gave women more pleasure, thus

[00:23:17] making it more likely they would tell their

[00:23:21] friends to fuck you. You know, but what throws a wrench

[00:23:24] in all of this is, hell, lesbian women and gay

[00:23:27] men are thwarting our evolutionary hypotheses by

[00:23:30] selecting on, like, what are they doing? Like, how are they

[00:23:33] even picking? Like, what is there to say who

[00:23:37] you should be attracted to? There's like a bird in his

[00:23:39] ass. It's like, I just don't know. Like, I'm just

[00:23:43] paralyzed. There's nothing. Actually, this dude

[00:23:48] actually did publish something on ratings of

[00:23:52] attractiveness for women of different body mass

[00:23:55] indices. Yeah, he surveyed lesbian feminists. So this

[00:24:01] just goes to, like, the mindset of this research. He got

[00:24:05] feminist and non-feminist women. By what, I don't know what

[00:24:08] measure. Do they like Christina Hoff Sommers? Only if

[00:24:13] they like facts. And he, like, he wanted to test

[00:24:19] whether or not there would be, I don't know what the

[00:24:22] hypothesis was, some interaction between feminism and

[00:24:25] sexuality, but he asked all of the women what bodies they

[00:24:28] find attractive. The only finding there was

[00:24:32] that lesbian women report, and it's not even who

[00:24:36] you're attracted to, but just who you find attractive

[00:24:38] that lesbian women choose a body type that's higher in

[00:24:42] body mass index than straight women, which again, I

[00:24:48] don't know, like, why... This is the exact kind of

[00:24:51] study that needs to be preregistered. I haven't talked

[00:24:53] about any of the data, but it's you know, it's a very,

[00:24:55] very simple statistical test and the numbers they report

[00:24:58] like, if they're, you know, if they're the right numbers,

[00:25:02] like, they do show a significant difference, but

[00:25:06] they could have sliced the data in different ways.

[00:25:09] This is exactly the kind of thing that would be

[00:25:10] suspicious, because, like, they selected

[00:25:13] only people who are on two ends of the scale,

[00:25:16] hungry and not hungry, not people kind of in the

[00:25:19] middle. They split up the ratings into, you know,

[00:25:23] five categories of breast size. There's a lot of ways

[00:25:26] in which you could p-hack the data in order to find

[00:25:28] something significant. But at the end, in spite of all

[00:25:31] the limitations they highlight, they say that these

[00:25:34] limitations notwithstanding, the present

[00:25:37] set of results provides ample evidence that

[00:25:40] breast size may play a role in men's

[00:25:43] assessment of female access to resources.

[00:25:47] But they never tested that.

[00:25:49] They're right.

[00:25:51] They never tested that. It's purely that they

[00:25:54] prefer them. Yeah.

[00:25:55] So they assume men prefer women with access

[00:25:59] to resources if they're poor. All things being

[00:26:02] equal, men from relatively low socioeconomic

[00:26:05] contexts and who experience temporary hunger

[00:26:09] rate women with larger breast sizes as more

[00:26:12] attractive than men from high socioeconomic

[00:26:15] contexts or are experiencing seicy... secity...

[00:26:21] Satiety, satiety.

[00:26:23] I was actually wondering how to say that. I think it's satiety.

[00:26:25] Like, if you read that sentence,

[00:26:28] and maybe this could be something like a future

[00:26:30] gimmick, a future VBW gimmick, like real evolutionary

[00:26:35] psychology or fake evolutionary psychology, you know,

[00:26:39] like and then the person had to judge whether

[00:26:42] they thought it was real or whether it's something

[00:26:44] that we made up.

[00:26:45] Like if I read that, would you say fake or real?

[00:26:49] I would have said fake.

[00:26:51] I would have to.

[00:26:52] I mean, there is.

[00:26:55] They're, you know, they're bordering on some of

[00:26:58] that Real Peer Review stuff.

[00:26:59] Like, I wish Real Peer Review would not

[00:27:02] just, like, fuck with the postmodernists.

[00:27:05] Yeah, they would actually put some of this

[00:27:07] shit up there because like they cite a

[00:27:09] bunch of this stuff.

[00:27:10] People look at eye tracking of men looking at women.

[00:27:16] And there is something like, look, I'm not going

[00:27:18] to say that this person or these researchers are

[00:27:21] sexist, but like I'm pretty sure that there

[00:27:25] is something fucked up about this being

[00:27:28] the obvious research question to ask.

[00:27:31] Like we got this amazing, amazing device

[00:27:34] that allows you to see exactly where

[00:27:35] human beings are looking.

[00:27:36] Oh, get some boobies up on the screen.

[00:27:38] So I don't always agree with my stepmother,

[00:27:40] but here I now think Me Too has gone

[00:27:42] too far.

[00:27:43] If you could attribute sexism to people who are

[00:27:47] purely invested in a scientific enterprise,

[00:27:52] just wanting.

[00:27:54] Then I was ready to jump on you about the

[00:27:58] patriarchy.

[00:28:00] I challenge anybody to read just the reference

[00:28:04] titles of this paper and not tell me that

[00:28:07] this is incredibly, incredibly weird.

[00:28:11] I feel like we didn't do it justice, and I

[00:28:14] apologize, but really, if all we do is give

[00:28:17] you the link to this, we have done you a great

[00:28:20] service. It is high comedy.

[00:28:22] It is a comic masterpiece, actually,

[00:28:26] this whole page.

[00:28:27] And I have to say this, and like,

[00:28:30] I would like to hear from our women listeners.

[00:28:33] You know, they talk about the male gaze,

[00:28:35] not as in the homosexuals, but like, as in

[00:28:39] the look.

[00:28:40] Male gaze.

[00:28:43] I can't help but think that at the

[00:28:47] conferences when they talk about this stuff,

[00:28:51] there's just not that many women.

[00:28:54] It just feels weird, man.

[00:28:56] If there were a group of women talking

[00:28:58] about how the bulge in your pants

[00:29:03] was a good fitness indicator, I would

[00:29:06] feel really weird walking into a crowd

[00:29:09] of people who study that. And I'm

[00:29:12] like my life is good.

[00:29:14] Like I don't have anything to worry about.

[00:29:16] But I would still feel weird.

[00:29:17] I'd be psyched.

[00:29:20] Hey ladies, check out the bulge.

[00:29:23] High access to resources.

[00:29:25] All right.

[00:29:27] We were going to do a Guilty's Confession,

[00:29:28] but maybe since this has gone on, we

[00:29:30] should table that.

[00:29:33] Let's table it.

[00:29:34] I wasn't happy with mine anyway.

[00:29:36] I wasn't.

[00:29:37] It was too obvious.

[00:29:38] I feel terribly guilty about that.

[00:29:40] So.

[00:29:42] But I did want to get cleansed like you

[00:29:45] people believe, like you just say,

[00:29:47] oh, I did this.

[00:29:48] You say a couple of Hail Marys and you're good.

[00:29:51] Like I'm not Catholic, but also

[00:29:53] didn't you read the Bible last time?

[00:29:55] Who knows if we go into the ground

[00:29:58] like animals?

[00:29:59] Right. Well, this was probably not

[00:30:01] as edifying as our discussion of Ecclesiastes,

[00:30:04] but we'll be right back to talk about

[00:30:06] the Dunning-Kruger effect.

[00:30:46] Welcome back to Very Bad Wizards.

[00:30:48] This is the time where we like to take a moment

[00:30:51] to thank all the people who get in

[00:30:53] touch with us,

[00:30:55] who email us, who contact us

[00:30:58] in all the various ways that you do.

[00:31:01] We got a bunch of really nice feedback

[00:31:03] about the Ecclesiastes episode.

[00:31:06] Who knew?

[00:31:07] I know.

[00:31:09] It was really gratifying because

[00:31:11] I don't think either of us went into it

[00:31:13] with any high expectations.

[00:31:17] It's not like we were talking about the intellectual

[00:31:19] dark web or something, you know?

[00:31:22] Maybe it's a lesson that if we talk

[00:31:23] about things we actually like intrinsically,

[00:31:26] it'll actually make for a better episode.

[00:31:28] No, I mean, I think that's right.

[00:31:30] You know, like the Borges episodes are like that.

[00:31:33] Some of the movie episodes are like that for me

[00:31:35] where we go into something

[00:31:38] with that kind of naive excitement

[00:31:40] as opposed to when we

[00:31:45] trash an article like we did

[00:31:47] in the last segment.

[00:31:51] Exactly.

[00:31:53] So, yes, to get in touch with us,

[00:31:56] you can email us VeryBadWizards

[00:31:59] at gmail.com.

[00:32:01] You can tweet us

[00:32:04] at Tamler, at Peez, at VeryBadWizards.

[00:32:07] You can like us on Facebook.

[00:32:11] You can join the discussion

[00:32:13] on our subreddit.

[00:32:16] You can like us, follow us.

[00:32:19] Fuck, follow us on Instagram

[00:32:22] and you can rate us on iTunes.

[00:32:25] Subscribe to us on iTunes.

[00:32:27] Give us a review so that we stay in the

[00:32:31] at least in the what's hot section.

[00:32:33] Even if they've stopped ranking us for some reason.

[00:32:37] I think we're back, by the way.

[00:32:38] I think we're back.

[00:32:40] Oh, we are? Good.

[00:32:40] Yeah.

[00:32:41] You can also support us in more tangible ways.

[00:32:44] You can first of all go to our support page.

[00:32:48] Click on the Amazon link before you make

[00:32:52] any purchase, large or small.

[00:32:56] It all helps and we really appreciate that.

[00:32:59] You can give us a one time donation on PayPal,

[00:33:03] possibly even a recurring donation on PayPal.

[00:33:06] Is that right?

[00:33:08] Yeah, you can do that.

[00:33:09] And in fact, we're looking into being able to share some of the bonus

[00:33:13] content with our recurring PayPal donors.

[00:33:16] Yeah. So OK, because I know a lot of international people can't use Patreon.

[00:33:19] So we appreciate all you PayPal donors.

[00:33:21] Plus there are people who think Patreon is like the gulag Soviet thought police.

[00:33:27] That's right. That's right.

[00:33:29] Which we're apparently fine with.

[00:33:31] We're sorry, guys.

[00:33:36] We'd be first in line to like erase people from history and

[00:33:40] I'm all for a very bad wizard's listener

[00:33:43] starting their own donation system

[00:33:46] and, yeah, bringing us in to beta test it.

[00:33:49] We are, though, just scrambling to figure out how to find the time

[00:33:53] to just do this stuff that we do.

[00:33:55] Exactly.

[00:33:57] We did just drop an episode, a bonus episode on Patreon for our $2.00

[00:34:03] and up listeners on Star Trek, The Inner Light, which I enjoyed.

[00:34:10] And I hope listeners did too.

[00:34:13] And we got people signing up.

[00:34:15] Yeah, we got people signing up for it.

[00:34:17] So and any time you sign up, you can get access to all of our bonus

[00:34:22] content. So all of our bonus episodes by putting in the RSS feed

[00:34:29] that will give you access to everything.

[00:34:31] Or if you don't want to do that, you can always just scroll through

[00:34:34] all the posts and get all the bonus content since the very beginning.

[00:34:38] All your beats, all your beat volumes and all the bonus episodes.

[00:34:38] So, I think Jesse, Natalia, and I are planning on doing another one soon,

[00:34:49] possibly on Inland Empire, possibly on Lost Highway.

[00:34:53] And this is news to you live on air.

[00:34:55] I actually ended up talking to Barry Lam about doing a Star Trek episode

[00:35:01] for both his and our Patreon.

[00:35:04] Really? Yeah. Wow.

[00:35:07] Yeah. How do you feel?

[00:35:09] Impersion.

[00:35:12] Listen, you're the one off having threesomes.

[00:35:16] Threesomes.

[00:35:17] Yeah, with Natalia and Jesse.

[00:35:19] Oh, yeah. Yeah.

[00:35:20] Not the good kind of.

[00:35:24] So yes, you can become one of our beloved Patreon patrons.

[00:35:30] And we really appreciate that.

[00:35:31] It's our most consistent form of income.

[00:35:34] Ads have slowed down.

[00:35:35] Sponsors have slowed down.

[00:35:38] So right now, you know, we're relying on them.

[00:35:41] They don't need to know that, Tamler.

[00:35:43] They need to think that we're turning down sponsors.

[00:35:46] Well, we are turning down sponsors.

[00:35:48] I should be. Yeah, that's true.

[00:35:50] We shouldn't be, but we are turning down sponsors

[00:35:53] that we don't feel like we can fully endorse.

[00:35:59] Or at least, you know, that we're a little worried about.

[00:36:02] Anyway, yes, we really appreciate your support

[00:36:05] and we appreciate just all the different ways

[00:36:09] you get in touch with us.

[00:36:10] It was really nice to see some listeners

[00:36:12] drinking philosophically at Rudyard's Pub,

[00:36:16] that event that I did a couple of days ago.

[00:36:20] And that was, I don't know how the first segment went last night

[00:36:23] because I was just so exhausted and hung over

[00:36:28] from the previous.

[00:36:29] I just thought you were drunk and high.

[00:36:31] Well, I mean, I was,

[00:36:33] Harry, those aren't mutually exclusive.

[00:36:35] Yeah.

[00:36:36] You know, I just made quick mention

[00:36:41] of being on Freakonomics last time.

[00:36:42] And I wanted to say we recorded this Freakonomics Live episode

[00:36:46] that just got released as we're recording.

[00:36:49] And this was in New York City

[00:36:51] and it was in front of a live audience

[00:36:52] and there were no Very Bad Wizards listeners.

[00:36:55] Very Bad Wizards listeners.

[00:36:58] Nobody came up to me and said,

[00:37:00] hey, I know you from your other podcast.

[00:37:04] That's why we should do a live episode

[00:37:06] in Houston rather than New York.

[00:37:09] We're gonna be there together, right?

[00:37:12] One of the rare times that we're together,

[00:37:14] we're going to be in Vancouver

[00:37:15] for the American Philosophical Association

[00:37:20] meetings in Vancouver, British Columbia.

[00:37:23] And we're making no promises,

[00:37:25] but if you are a Very Bad Wizards listener

[00:37:29] and you will be in town

[00:37:31] and you think a meetup might be fun,

[00:37:33] just tweet at us at the Very Bad Wizards account

[00:37:35] or at our personal accounts.

[00:37:37] And if there's enough interest,

[00:37:39] maybe we'll go grab a beer with you,

[00:37:41] but not the weird ones.

[00:37:42] Don't be weird.

[00:37:46] You can be a little weird.

[00:37:47] Yeah, I'm sure.

[00:37:48] Everyone's a little weird.

[00:37:51] So yeah, that would be fun.

[00:37:54] We may do that.

[00:37:55] We'll be there from April.

[00:37:58] I'm gonna be there from the 16th to the 20th.

[00:38:00] Yeah, 16th, yeah.

[00:38:02] Okay, well, so thanks everybody.

[00:38:08] Let's get to our main topic.

[00:38:10] So, our main topic today is a topic,

[00:38:17] I don't think we've discussed.

[00:38:19] In fact, weirdly, some of these social psychology effects

[00:38:22] that are supposed to be part of my job

[00:38:25] I rarely talk about.

[00:38:27] But this is the Dunning-Kruger effect.

[00:38:30] It is a phenomenon named after two researchers

[00:38:33] who were at Cornell University.

[00:38:35] And it's basically an effect that,

[00:38:38] let me see if I do it justice, explaining the gist of it.

[00:38:42] People who don't know very much about any given domain

[00:38:47] tend to think they know a lot more than they do.

[00:38:50] So being ignorant about a domain leads to

[00:38:56] self enhancement of the very biased variety.

[00:39:00] And at the same time, and we'll explain why,

[00:39:04] if you're at the top of your domain,

[00:39:06] if you're like an expert in your field

[00:39:09] or in your area of knowledge or in whatever performance

[00:39:13] you're good at, you tend to underestimate your performance

[00:39:18] relative to other people.

[00:39:20] So you have this weird effect.

[00:39:22] It's often referred to as unskilled and unaware

[00:39:25] from the very first article they published back in,

[00:39:28] I think the late 90s, 99, which by the way,

[00:39:31] an aside, the original paper was published by a grad student

[00:39:34] named Justin Kruger and his advisor, a professor named

[00:39:37] David Dunning, and it was Kruger and Dunning.

[00:39:40] And for some reason it is called

[00:39:42] the Dunning-Kruger effect,

[00:39:45] which is just shitting on grad students.

[00:39:47] And Dunning, whenever he talks about it,

[00:39:49] he's always sort of jokes about like,

[00:39:51] I don't know how this happened,

[00:39:53] but let me just tell you,

[00:39:54] a lot of the grad students at Cornell used to go

[00:39:57] onto the Wikipedia and actually change it

[00:39:59] to the Kruger-Dunning effect.

[00:40:02] And it would miraculously get changed back.

[00:40:04] That's capitalism right there, the exploitation.

[00:40:08] That's right, so pyramid scheme

[00:40:10] as people have offered.

[00:40:11] Yeah.

[00:40:12] But the reason I, so the reason,

[00:40:14] at least one of the reasons

[00:40:15] that I thought it might be interesting to talk about is

[00:40:17] for some reason in the last few years,

[00:40:19] a lot of people talk about this.

[00:40:21] It's like in popular press,

[00:40:23] like I always hear people referring

[00:40:24] to the Dunning-Kruger effect.

[00:40:26] And it's like, it's a very, very weird thing.

[00:40:27] It would be like the media all of a sudden talking

[00:40:30] about Gettier cases.

[00:40:32] Because it was just like finding in social psychology

[00:40:35] that was fairly just sort of like,

[00:40:38] just about our little section of the field.

[00:40:43] And now it's a very, very popular way to explain.

[00:40:47] Yeah.

[00:40:48] Well, I mean, so let me ask you this.

[00:40:50] Does its recent popularity coincide

[00:40:54] with the Trump election and with just the general dumbing down

[00:41:00] of political debate over the last few years

[00:41:03] because of social media, because of Twitter, because of...

[00:41:07] You know, that's a good question.

[00:41:08] I wish I had done a little bit of research

[00:41:10] into, like, how much the phrase is used.

[00:41:13] I feel like I started hearing it maybe, you know,

[00:41:16] like four or five years ago.

[00:41:18] But I'm sure that it's gotten even more cited

[00:41:23] because of that.

[00:41:24] It's like a very smug way of...

[00:41:26] It's like actually a pretty smug way

[00:41:27] of accusing people of being dumb.

[00:41:29] Right, exactly.

[00:41:30] Like I mean, Vic, I think you...

[00:41:32] It's always liberals accusing conservatives of it

[00:41:36] or using them as an example.

[00:41:38] And I think part of the sort of the sinister aspect

[00:41:41] of the effect is you think it applies to other people,

[00:41:44] but not you.

[00:41:45] That's right.

[00:41:46] That's right.

[00:41:48] Where it really applies to all of us.

[00:41:51] So just to get a sense of how,

[00:41:53] what the boundaries of this effect are,

[00:41:56] I take it that if I know nothing about a field,

[00:42:03] like if you ask me about some organic chemistry

[00:42:09] or something like that,

[00:42:10] I'm not going to overestimate my abilities about that.

[00:42:15] So I can't be totally ignorant.

[00:42:18] That's right.

[00:42:19] And in fact,

[00:42:23] I think that's one of the crucial sort of features

[00:42:29] of this bias is that it's knowing a little bit

[00:42:31] that seems to get the effect going.

[00:42:34] Students who take like one biology class

[00:42:39] feel like they know a ton about evolution

[00:42:42] way more than people.

[00:42:43] And I remember this myself.

[00:42:45] I remember literally feeling after I took intro psych,

[00:42:49] like, wow man, I just know everything about this.

[00:42:52] Like college, like I don't know where to go from here.

[00:42:55] No, it's like the most cringe worthy memory I have

[00:42:58] was after taking economics, like macro economics.

[00:43:03] Yeah, I would think it was macro economics or micro.

[00:43:06] I don't know.

[00:43:06] I don't even remember that.

[00:43:09] I went back like, you know,

[00:43:11] for Christmas break, I was a frickin' freshman

[00:43:13] and I was lecturing people about like economics issues.

[00:43:18] Supply and demand.

[00:43:20] Oh God, it was just, it's so embarrassing how,

[00:43:24] and I think that is, this is something you see

[00:43:28] in college students because they take a class.

[00:43:30] I mean, students do this all the time.

[00:43:31] They'll raise their hand

[00:43:33] and they'll start like informing me about some,

[00:43:37] you know, psychological effect

[00:43:38] because they took intro psych and, you know.

[00:43:43] Exactly.

[00:43:44] I have actually, so you're right, it's cringe worthy

[00:43:46] and I know that these students in retrospect might feel

[00:43:51] the same thing that I feel now,

[00:43:53] but I had a student come in,

[00:43:54] I was teaching a seminar in social psychology

[00:43:57] and I had a student, who I love actually,

[00:44:00] but she started telling me

[00:44:01] about my own study, my very own study.

[00:44:03] Like when I was lecturing on trolley problems

[00:44:07] and I was just smiling,

[00:44:08] just waiting for her to finish the explanation.

[00:44:12] So there's some sort of sweet spot

[00:44:14] of a little bit of knowledge

[00:44:17] where you start to vastly overestimate your abilities.

[00:44:22] Yeah, and I don't know what like the,

[00:44:25] you know, the critical dosage is, right?

[00:44:29] I just know that it's knowing a bit

[00:44:31] that leads, it's not being in complete ignorance.

[00:44:34] And then how much of an expert do you have to be

[00:44:37] where the effects reverses?

[00:44:40] So usually this is talked about in percentiles.

[00:44:46] So at what portion of the distribution are you?

[00:44:50] So are you at the top,

[00:44:53] are you at the top quartile of the distribution?

[00:44:56] I don't know that we have precision

[00:44:58] about like where it starts coming out,

[00:45:00] but I believe that it is,

[00:45:03] that the studies at least that I'm familiar with

[00:45:05] look at people who are at the bottom 25%

[00:45:08] and the top 25% of the distribution

[00:45:11] and show this effect there.

[00:45:13] Let me maybe distinguish this from a more general effect

[00:45:16] that has been found over and over again in psychology,

[00:45:19] at least in the psychology of non-Eastern peoples

[00:45:23] like Western folks,

[00:45:24] which is the above average effects,

[00:45:27] which probably most people are familiar with

[00:45:29] that most people think they're better

[00:45:31] at a host of like positive traits.

[00:45:36] Sometimes it's called the Lake Wobegon effect, I guess.

[00:45:39] But the idea is that if you ask people,

[00:45:41] my favorite example of this is,

[00:45:44] I forget the percentages,

[00:45:45] but if you ask professors how they stand compared

[00:45:50] to all other university professors

[00:45:52] in terms of like say clarity of their lectures

[00:45:56] or the quality of their lectures,

[00:45:59] it's like most professors rate

[00:46:02] that they're in the 90th percentile,

[00:46:06] which is statistically impossible.

[00:46:09] Most people can't be better than most people.

[00:46:12] Right.

[00:46:15] So that's just overestimating your ability in general?

[00:46:21] In general and on positive traits.

[00:46:23] Not knowledge necessarily, not.

[00:46:27] You know it's found in a whole bunch of stuff.

[00:46:29] It's found in skills like driving,

[00:46:31] like most people think they're better drivers than others.

[00:46:34] Most people think they are more moral.

[00:46:36] Better sense of humor, right?

[00:46:38] Better sense of humor, less biased,

[00:46:40] you alluded to this,

[00:46:41] most people think they're less biased than other people.

[00:46:46] And so that seems to be a robust and general effect

[00:46:49] where people just kind of inflate

[00:46:51] their standing on most positive things.

[00:46:54] Now you can get rid of this by,

[00:46:56] well, you can show that this gets lower and lower

[00:47:02] depending on how you ask the questions.

[00:47:04] So if I ask a student say at Cornell,

[00:47:08] how smart are you compared to the average person?

[00:47:11] You'll get a fairly big like effect,

[00:47:16] but in fact they might be smarter than the average person,

[00:47:19] but if you ask them.

[00:47:21] Or they might have just paid like a track coach to say

[00:47:25] that they're a track star.

[00:47:31] If you ask people how smart are you compared

[00:47:33] to the average Cornell student,

[00:47:35] then you start getting a bit lower.

[00:47:38] But that's just not, assuming Cornell students

[00:47:41] are smarter on average than the population,

[00:47:44] that makes sense.

[00:47:44] So there's a rational explanation,

[00:47:47] but here's where it starts getting interesting.

[00:47:49] If you ask people how smart they are compared

[00:47:52] to the average student at their university,

[00:47:56] it goes down a bit.

[00:47:57] If you ask them how smart they are compared

[00:47:59] to the random person sitting next to them,

[00:48:02] it goes down even more.

[00:48:04] So it's not just that they're sort of rationally adjusting,

[00:48:08] it's that there's something about the concrete comparison.

[00:48:11] There's something about the abstract level of comparison

[00:48:13] that when you think about an average person

[00:48:16] or an average Cornell student

[00:48:18] that lets you create sort of in this self-enhancing way,

[00:48:23] you create somebody who's not as good as you in your head.

[00:48:25] And it's harder to do when that person is right there,

[00:48:28] when you're thinking about an actual, actual person.

[00:48:31] And so one of my favorite findings

[00:48:33] is also from David Dunning.

[00:48:36] You can show that people use idiosyncratic

[00:48:39] trait definitions.

[00:48:40] So if I ask people how smart are you

[00:48:44] and what you get is that a bunch of people report

[00:48:47] that they're smarter than average.

[00:48:49] What you find is that people are using

[00:48:51] very different definitions of what smart means.

[00:48:53] So some people might say like,

[00:48:54] well I'm just street smart, right?

[00:48:56] Or I'm savvy, like I'm not a sucker.

[00:48:58] Maybe I'm not book smart,

[00:48:59] but so they use that definition in order to self-enhance

[00:49:03] whereas somebody else like us might be like,

[00:49:05] well I'm book smart.

[00:49:07] That's what smart really means.

[00:49:10] I'm street smart.

[00:49:11] Yeah, I am sure you are.

[00:49:14] You're from South East.

[00:49:16] And so if you give people a definition,

[00:49:19] like if you define what you mean,

[00:49:22] like how intelligent are you?

[00:49:23] And by intelligence we mean this.

[00:49:25] Then the effect seems to go down.

[00:49:27] So people, it seems to be like when you give people

[00:49:30] a little bit of flexibility,

[00:49:31] they find a way for the self enhancement to creep in.

[00:49:36] But you are distinguishing this from the Dunning-Kruger effect.

[00:49:39] And this seems to be like maybe part of it,

[00:49:44] the motivated part to seem like you're better

[00:49:47] or to really believe that you're better than other people

[00:49:49] seems to be kind of at the heart

[00:49:51] of the Dunning-Kruger effect.

[00:49:53] It's just that in the Dunning-Kruger effect,

[00:49:55] what's going on is this additional thing I believe

[00:49:57] which is that a little bit of knowledge

[00:50:00] is now really, really doing like work

[00:50:05] on the self enhancement.

[00:50:07] And so what Dunning and Kruger are saying is

[00:50:10] it's the ignorance.

[00:50:11] So like say you and I after taking our intro courses.

[00:50:14] Yeah.

[00:50:15] The claim, the central claim is because we have no idea

[00:50:20] what we don't know,

[00:50:21] like all of the other things that there are,

[00:50:24] like our denominator is off, right?

[00:50:25] Like I felt like I knew 98 out of 100 things

[00:50:29] to know in psychology.

[00:50:31] What I didn't realize is there was,

[00:50:32] I knew 98 out of like 10,400 things.

[00:50:36] And it's that numerator, speaking metaphorically

[00:50:40] that's driving my sense of how smart I am.

[00:50:43] That makes sense.

[00:50:44] Yeah.

[00:50:45] So if you know that your numerator is zero

[00:50:47] then you won't.

[00:50:48] Exactly.

[00:50:49] And so what seems to be happening with the people

[00:50:53] at the top end of the distribution

[00:50:56] is something different.

[00:50:59] So, and I think that we make this mistake all the times

[00:51:03] when, or at least I do in lecture,

[00:51:07] when I say, when I start talking about the trolley problem

[00:51:10] for instance, I feel like,

[00:51:14] I'm just telling them something they already know.

[00:51:16] Like I don't need to explain what this is, right?

[00:51:19] Everybody knows the trolley problem.

[00:51:20] And so I'll say like raise your hand

[00:51:21] if you've heard of the trolley problem.

[00:51:23] And like barely any of the kids raised their hand.

[00:51:25] Yeah.

[00:51:26] Right.

[00:51:27] Now I know.

[00:51:28] So.

[00:51:28] And yeah, I've just been noticing this too

[00:51:31] with, like, all sorts of things.

[00:51:33] I mentioned tenure.

[00:51:35] Like they would definitely know what tenure was.

[00:51:38] And most of them had no idea what tenure was.

[00:51:41] Yeah.

[00:51:42] It's insane that they don't know the details of our lives.

[00:51:44] Yeah.

[00:51:47] They won't even know that I'm up for full professor this year.

[00:51:53] You know, this is shameful

[00:51:55] but we've talked about this before

[00:51:57] when we're at a conference

[00:51:59] and we're talking to our colleagues

[00:52:00] and somebody mentions our podcast

[00:52:02] and someone says, oh, you have a podcast?

[00:52:04] I'm like, fuck you.

[00:52:05] You knew that.

[00:52:06] Everybody knows that.

[00:52:08] I know.

[00:52:09] I get like an inner like, fuck you.

[00:52:13] So.

[00:52:14] But then I'm also really modest and humble about it.

[00:52:17] Oh yeah.

[00:52:18] I would never actually let them know.

[00:52:19] Like, you know.

[00:52:20] So this phenomenon actually that thinking

[00:52:22] that other people know the same things that you know

[00:52:24] or have the same attitudes that you have

[00:52:26] is called the false consensus effect

[00:52:28] in social psychology.

[00:52:30] It's sort of inflating how similar people are to you.

[00:52:33] And that's what seems to be going on with experts.

[00:52:35] They seem to think that people must know a lot more

[00:52:39] than they actually do because of false consensus.

[00:52:41] So they're not self enhancing

[00:52:43] and they're not self effacing either.

[00:52:45] So I think one of the findings that illustrates this

[00:52:49] is that if you get an expert say like,

[00:52:55] you know, ask me about Star Trek trivia.

[00:52:59] Right?

[00:53:00] I actually think people know a lot about Star Trek

[00:53:03] a lot more than they actually do.

[00:53:05] So if you ask me how much do I know about Star Trek

[00:53:11] and make me rank it in terms of percentile

[00:53:13] like compared to the population, I get it wrong.

[00:53:16] So I actually undershoot how much I know about Star Trek

[00:53:20] because I think most people know some about Star Trek.

[00:53:24] If, on the other hand, you ask me if you say,

[00:53:28] hey, I'm going to ask you 30 questions,

[00:53:30] 30 trivia questions about Star Trek.

[00:53:32] How many do you think you'll get right?

[00:53:34] There I'm actually well calibrated.

[00:53:37] And that's something that doesn't happen

[00:53:39] at the levels of ignorance.

[00:53:41] The people on the low end,

[00:53:43] the people who are overestimating their knowledge,

[00:53:46] they overestimate both the raw amount

[00:53:50] and the percentile amount.

[00:53:52] And people at the top end,

[00:53:54] they're pretty good at knowing what they don't know.

[00:53:56] So this all boils down to what I think the interesting part is

[00:54:00] is that we just, it's Rumsfeld's unknown unknowns.

[00:54:06] We have no idea how much we don't know about something.

[00:54:09] Right.

[00:54:10] And yeah.

[00:54:11] And this is an idea that traces back

[00:54:16] in Western philosophy to Socrates

[00:54:18] where he was declared the wisest man in Athens.

[00:54:21] And he said it was because only he is aware

[00:54:26] of how much he doesn't know.

[00:54:28] It's funny because I used to think of that

[00:54:30] as just some sort of mock humility.

[00:54:33] Yeah. Well, it can come off as mock humility sometimes

[00:54:36] in some of the dialogues, but I do think there,

[00:54:42] from what I understand,

[00:54:43] the historical Socrates really did have that attitude

[00:54:47] that the way in which he was wiser

[00:54:52] than all the other Athenian citizens

[00:54:55] was that they were all overconfident

[00:54:58] that they knew about the world,

[00:55:00] about moral knowledge,

[00:55:01] about what a good life was,

[00:55:04] about why it was important to have the job

[00:55:09] or the political office that they did.

[00:55:11] But it does often also come out as,

[00:55:15] I think, false modesty sometimes.

[00:55:18] Right, right.

[00:55:20] So there's this tough thing where,

[00:55:24] like one of the reasons that social psychologists

[00:55:26] study this so much is that first it was like,

[00:55:28] well overconfidence is a problem, right?

[00:55:31] Like it'll lead you to make all sorts of errors.

[00:55:33] And so it's of concern for that reason.

[00:55:36] And so a lot of people used to talk about,

[00:55:40] well, they still do like,

[00:55:41] well, why do we have these overconfidence effects?

[00:55:44] Now maybe that they're just cultural,

[00:55:46] maybe that it's like, you know.

[00:55:48] Yeah.

[00:55:49] Japanese, right?

[00:55:50] They underestimate their...

[00:55:52] Yeah, yeah, right.

[00:55:54] There is some research, I think,

[00:55:58] don't quote me on it,

[00:55:59] that shows that what collectivist individuals do is they,

[00:56:04] they don't self-enhance, they group-enhance.

[00:56:06] So maybe that there is still a general tendency to view

[00:56:11] at some level view things in a more positive light

[00:56:14] than they actually are.

[00:56:15] That would make sense.

[00:56:16] If you identify with the group,

[00:56:18] yeah, yeah.

[00:56:19] Then that self enhancement property would apply to that

[00:56:24] and not to you as an individual.

[00:56:26] Some people have argued that overconfidence

[00:56:29] is actually a good thing because, one, it makes you happier.

[00:56:34] So there's this really depressing sort of literature

[00:56:39] on depressive realism

[00:56:40] that when you're depressed,

[00:56:41] you actually seem to be more accurate

[00:56:43] about your skills and abilities.

[00:56:47] And so some people have argued that,

[00:56:49] yeah, like thinking that you're slightly better

[00:56:51] than you are is actually good for you.

[00:56:53] Yeah.

[00:56:54] Right, optimism, or rather irrational optimism,

[00:56:58] leads to greater success.

[00:57:00] That's right.

[00:57:01] In the long run, there's that effect.

[00:57:03] I mean, we all,

[00:57:05] I think I have been driven often by irrational confidence

[00:57:10] and undue optimism.

[00:57:15] It's all humility on my end.

[00:57:18] Yeah, I'm sure.

[00:57:19] The most.

[00:57:20] And I often tell,

[00:57:20] like I feel like students in academia these days

[00:57:24] are erring on the other end,

[00:57:27] but maybe they're just being so pessimistic

[00:57:29] because that's the real state of affairs right now

[00:57:33] and the job market and grad school.

[00:57:35] Yeah, I mean,

[00:57:36] because that's just never

[00:57:38] how I thought of things.

[00:57:39] It just seems like that's a terrible way to live

[00:57:42] is to believe that stuff.

[00:57:44] Oh, right, right.

[00:57:45] Like get a healthy, healthy dose of overconfidence.

[00:57:50] You know, I don't know what the right answer is.

[00:57:52] Like where I was going with this is like,

[00:57:54] it's hard to know, you know,

[00:57:57] like is accuracy the goal?

[00:58:00] Right?

[00:58:00] So I think the Dunning-Kruger effect is distressing

[00:58:04] because it's saying,

[00:58:06] it's not just that everybody's miscalibrated a bit.

[00:58:08] It's that the most ignorant people

[00:58:10] are the most miscalibrated.

[00:58:11] And so it seems to be that like we should avoid this.

[00:58:15] We should either be, you know,

[00:58:18] just more humble about what it is that we know

[00:58:22] or maybe not even like, you know,

[00:58:24] don't ever just take one class on something.

[00:58:26] I mean, we're creating a bunch of like-

[00:58:28] Well, that is one of the ideas, right?

[00:58:30] Is just stop filling your head with too much knowledge

[00:58:35] because unless you're really diving into something,

[00:58:39] you're just not, it's gonna end up doing more harm than good

[00:58:43] in terms of your understanding of that field.

[00:58:47] Right?

[00:58:48] So like do you think our liberal arts education

[00:58:50] is creating a bunch of Dunning-Kruger effects?

[00:58:52] Like because you take one class in art history,

[00:58:55] you take, you know, one class in anthropology.

[00:59:00] I mean, so here's a potential confound.

[00:59:04] I also think that this is something very characteristic

[00:59:07] of young people and something that's very characteristic

[00:59:10] of growing older is you start to realize just how much

[00:59:15] of the world you don't know that you don't understand

[00:59:17] that's much more complicated than you once thought it was.

[00:59:20] And now sometimes that's because you just know more

[00:59:26] but I also, I don't know,

[00:59:27] doesn't it feel like as you've grown older,

[00:59:30] you start to see the complexity in things

[00:59:33] that you weren't able to see?

[00:59:35] I'm of two minds because I can actually,

[00:59:37] I can get the exact opposite intuition about things too.

[00:59:40] That like I am more willing to speak

[00:59:45] with authority about things than I ever was before.

[00:59:49] Right?

[00:59:50] Like pontificate.

[00:59:52] Yeah.

[00:59:52] If you, I remember actually when I was in grad school,

[00:59:58] I remember talking to this person I was working with

[01:00:01] who was actually working on some of this stuff

[01:00:03] above average effects.

[01:00:04] And we were talking about expertise

[01:00:07] and I was telling him like,

[01:00:09] I don't think I'll ever feel like an expert in anything

[01:00:14] because it's so overwhelming to me.

[01:00:17] And he was like, what are you talking about?

[01:00:19] Like, you know, like I know for a fact

[01:00:23] that I know more about X than anybody,

[01:00:25] like than most people in the world.

[01:00:27] Like you're expert in a lot of things.

[01:00:30] Like I wasn't being mock humble.

[01:00:32] I was actually just expressing like that.

[01:00:35] So you probably were more modest,

[01:00:39] maybe insecure about your ability.

[01:00:41] I was insecure for sure.

[01:00:43] Yeah.

[01:00:44] Whereas I was just kind of insufferable.

[01:00:46] I read The Selfish Gene and I thought I understood

[01:00:50] like the right way to understand evolutionary biology

[01:00:54] and that was my approach.

[01:01:00] So this gets to this other problem

[01:01:03] that I think arises from nuance,

[01:01:07] which is it makes us really boring.

[01:01:10] So you and I have had this feeling where like,

[01:01:16] I don't know, maybe I just remember the most recent time

[01:01:18] when you were telling me that I was hedging too much.

[01:01:21] Like the more expert I become at something,

[01:01:24] the more I want to like hedge, even hedge my hedges.

[01:01:28] Qualify everything.

[01:01:29] Qualify my qualifications and it makes for like,

[01:01:34] it makes for a real inability to communicate action.

[01:01:39] I mean, one of the reasons to believe firmly in something

[01:01:42] is because it leads you to act.

[01:01:44] If you don't believe anything,

[01:01:45] then like why would you do anything?

[01:01:47] Yeah, no, that's right.

[01:01:49] And as someone who has edited some of your just

[01:01:53] like endless sentences with constant qualifications.

[01:01:58] Like you'll probably be getting a missive.

[01:01:59] Sometimes you just qualify everything

[01:02:00] and you never get to what you would expect.

[01:02:01] I never popped back up the stack.

[01:02:03] I never popped all the way back up the stack.

[01:02:05] To the actual view that you were gonna express.

[01:02:07] You know, I heard Sarah on the Rationally Speaking

[01:02:10] podcast, Julia Galef had Sarah Haider on

[01:02:17] and they were talking about this

[01:02:18] and they were talking about the fact

[01:02:20] that nuance is a real trade off

[01:02:22] because once you inject nuance into your dialogues

[01:02:28] with others and even just in the way you think about things,

[01:02:31] it really does make you less inclined

[01:02:35] to inspire action and passion in others

[01:02:39] and makes it harder to get yourself motivated.

[01:02:44] And it's a trade off that sometimes

[01:02:49] is a good trade off to make, but maybe sometimes isn't.

[01:02:52] Right.

[01:02:53] And you know, this really matters in the way

[01:02:56] that we communicate for instance,

[01:02:58] say science to the public, right?

[01:03:00] So one of the things that I think academics

[01:03:04] are always getting annoyed at is when somebody

[01:03:07] like a journalist or whatever writes up something

[01:03:12] and they speak with it as if it's just true

[01:03:16] and we know all of the ways in which it's not true,

[01:03:20] but it's something that I've had to kind of

[01:03:22] disabuse myself of in,

[01:03:25] I remember one of the first times

[01:03:27] that I was on a live radio interview, I sucked.

[01:03:30] It sucked.

[01:03:31] They didn't even like, I was talking like a professor

[01:03:34] and they were out of time.

[01:03:36] Like I didn't even get to the point

[01:03:40] and I had to learn, which I don't learn in this podcast

[01:03:42] because I know you're gonna edit it.

[01:03:44] I had to learn how to say things in 10 seconds,

[01:03:49] but I always think, well, maybe it's just that

[01:03:52] the uncertainty that I have like

[01:03:55] about what I know about human psychology for instance

[01:03:59] is an uncertainty that seems like huge variance.

[01:04:04] Like if you had error bars on my views or whatever,

[01:04:07] like it would seem huge,

[01:04:08] like the confidence intervals would seem huge,

[01:04:11] but the truth is compared to everybody else,

[01:04:15] like I actually probably know things with a degree of certainty

[01:04:20] that I am failing to convey

[01:04:22] because my comparison is my peers

[01:04:24] when I'm giving like a talk and they're like,

[01:04:25] well did you control for this or whatever?

[01:04:28] And it's a hard balance to strike

[01:04:31] because if you really want to get people

[01:04:34] to believe something, maybe this is why

[01:04:38] people still consistently find ways

[01:04:41] to misinterpret science

[01:04:46] because we're always hedging.

[01:04:48] Yeah, that's really interesting

[01:04:49] because and it connects to that overestimation

[01:04:54] of how much other people know

[01:04:56] and so that leads you to think

[01:05:00] that certain things need to be qualified

[01:05:02] just cause they'll already know the...

[01:05:05] Right.

[01:05:05] Yeah.

[01:05:06] Yeah, yeah.

[01:05:07] I really struggled with that

[01:05:09] when I was doing interviews and podcasts for the Honor book.

[01:05:13] Like I just, and I don't think I ever figured it out,

[01:05:15] especially for the really shorter pieces,

[01:05:18] just how to convey something with confidence,

[01:05:22] you know, on a topic that I felt

[01:05:25] I'm ambivalent about to some degree.

[01:05:29] Yeah, and I honestly like I would have to really

[01:05:35] try to get better at that

[01:05:36] if I did another one of those things.

[01:05:38] Yeah, we've talked about this a bit

[01:05:40] about how one of the things I liked about your book

[01:05:43] was that you were nuanced

[01:05:45] and you weren't making crazy claims,

[01:05:47] but you know, maybe that just doesn't sell.

[01:05:53] Like nobody wants to hear people hedging, right?

[01:05:56] It's no things for no people.

[01:05:58] It's like...

[01:05:59] Yeah, exactly.

[01:06:02] But yet it still managed to also piss people off.

[01:06:05] Yeah, that's the best of all worlds.

[01:06:08] It was.

[01:06:08] So I have a couple of questions about...

[01:06:11] So this comes from the article,

[01:06:13] the review, the long review you sent me,

[01:06:15] where he talks about reach around knowledge.

[01:06:21] You know what, man?

[01:06:22] Reach around.

[01:06:24] Do you suck dicks?

[01:06:25] Sir, no sir!

[01:06:26] Are you a peter puffer?

[01:06:27] Sir, no sir!

[01:06:29] I bet you're the kind of guy

[01:06:30] that would fuck a person in the ass

[01:06:31] and not even have the goddamn common courtesy

[01:06:33] of giving him a reach around.

[01:06:35] I'll be watching you!

[01:06:38] Dude, I cannot...

[01:06:39] Like I had a text message,

[01:06:44] one of my former students

[01:06:47] whose main advisor was Dave Dunning

[01:06:49] and like I screenshot it and I was like,

[01:06:51] is he just fucking with us?

[01:06:53] Yeah, that's what I wanted to know first.

[01:06:55] I mean, I want to know what it is,

[01:06:57] but mainly I want to know,

[01:06:59] was he, was that a joke?

[01:07:00] Like did he do that kind of intentionally?

[01:07:03] Our theory is that he is Tobias Funke's long lost brother

[01:07:11] because later on,

[01:07:13] he talks about bottom performers and top performers

[01:07:16] and I'm like, no, no Dave, no.

[01:07:20] What are you doing?

[01:07:22] That's hilarious.

[01:07:25] I think all that he means by reach around knowledge,

[01:07:29] Jesus Christ,

[01:07:30] what he means by reach around knowledge is,

[01:07:34] is just when you,

[01:07:39] you just use whatever knowledge you have at hand.

[01:07:43] Yeah.

[01:07:43] So like he's referring to why people answer like,

[01:07:47] like with certainty that they know something about

[01:07:50] something that's completely made up.

[01:07:51] So like if I ask you what's the,

[01:07:55] like if he asks about,

[01:07:57] yes in one study,

[01:07:59] people were asked what they think about the agricultural trade act,

[01:08:03] which is just a fictitious act.

[01:08:06] It's not real, it doesn't actually exist.

[01:08:07] What they do is they reach for whatever similar shit

[01:08:10] they think they can talk about

[01:08:11] and they use that to respond to this question.

[01:08:16] Like they're,

[01:08:17] they don't have any real domain specific knowledge.

[01:08:20] Like they just find whatever in their heads

[01:08:22] and use that. They reach around, grab that knowledge.

[01:08:26] Yeah.

[01:08:29] So yeah, I mean the Jimmy Kimmel stuff that he talks about

[01:08:32] where he goes and interviews people

[01:08:34] at South by Southwest about,

[01:08:36] Exactly.

[01:08:37] He asks them about fake bands.

[01:08:39] Now you might think,

[01:08:42] well they don't know anything about the band.

[01:08:45] And I thought when people didn't know anything

[01:08:49] they didn't overestimate their knowledge.

[01:08:51] But in this case,

[01:08:52] they know about music in general

[01:08:55] and so they have something to reach around

[01:09:01] and grab.

[01:09:04] Jesus Christ, why?

[01:09:06] Why?

[01:09:08] Yeah, so that's,

[01:09:10] so that was one question.

[01:09:13] And again, it's like,

[01:09:15] if you're on Jimmy Kimmel Live,

[01:09:17] like you could say I've never heard of them,

[01:09:19] but that's boring.

[01:09:20] It's just sort of like

[01:09:21] what we were talking about before.

[01:09:25] Being really honest about your lack of knowledge

[01:09:30] is not that interesting.

[01:09:32] You don't often get to be on Jimmy Kimmel, so.

[01:09:34] Right.

[01:09:35] And this is, have you had this experience though

[01:09:37] that it took me a while to have the confidence

[01:09:42] to tell students, I don't know,

[01:09:44] when they asked a question?

[01:09:47] Like I remember thinking that my role was,

[01:09:49] well, like to reach around and grab whatever you know

[01:09:53] and like construct an answer.

[01:09:56] But when I realized that honestly,

[01:09:59] I didn't know the answer and I was able to say like,

[01:10:01] oh, that's a good question.

[01:10:02] I don't know the answer, you know,

[01:10:03] or like I'll look it up or whatever,

[01:10:05] or if anybody in the class wants to look it up,

[01:10:07] it feels so much better.

[01:10:08] Yeah, no, it does.

[01:10:10] I don't know if I had trouble earlier.

[01:10:13] I know for a while

[01:10:15] I was very comfortable saying I don't know.

[01:10:20] Which isn't to say that I don't still sometimes

[01:10:23] do that kind of pontificating about

[01:10:26] if I feel like there's something I can connect

[01:10:31] to their question, but yeah,

[01:10:34] I say I don't know all the time.

[01:10:36] Good question, I have no idea.

[01:10:38] So one way of addressing this,

[01:10:41] so he talks about people who have just taken

[01:10:44] a driving class are more likely to get into accidents

[01:10:47] because now they feel like they have certain skills.

[01:10:50] And he says, in cases like this,

[01:10:53] the most enlightened approach as proposed

[01:10:56] by Swedish researcher Nils Petter Gregersen

[01:11:00] may be to avoid teaching such skills at all.

[01:11:03] Instead of training drivers how to negotiate icy conditions,

[01:11:06] perhaps classes should just convey the inherent danger,

[01:11:10] they should scare inexperienced students away

[01:11:12] from driving in winter conditions in the first place

[01:11:15] and leave it at that.

[01:11:16] I mean, there is something to this.

[01:11:20] I don't know about just scaring them,

[01:11:22] but there is something about maybe the right approach is

[01:11:26] to let people figure out their skills themselves

[01:11:30] rather than giving them the skills

[01:11:33] and trying to impose it on them.

[01:11:37] And that's like a...

[01:11:41] Yeah, it's a good question.

[01:11:43] I don't know what the right answer is,

[01:11:45] but he talks about this in the longer article

[01:11:48] in cases of, say, medical students training

[01:11:52] where they have a little bit of knowledge,

[01:11:54] so they sort of act with confidence

[01:11:57] when I believe one of the examples was in treating a wound.

[01:12:03] There's like 10 things that you have to be very careful

[01:12:07] about lest the wound get infected.

[01:12:10] And they know like three of them, right?

[01:12:14] They have three or four of them.

[01:12:15] And because they seem...

[01:12:17] They think that there are only three or four things to know,

[01:12:20] they think they did a great job,

[01:12:21] but this can lead to actual people dying.

[01:12:26] Amputation.

[01:12:27] Yeah, right.

[01:12:28] And so how do you get them or the inexperienced driver

[01:12:33] to know when you maybe shouldn't go out on the road

[01:12:37] or know when you should ask a senior medical...

[01:12:41] If you don't even know that you didn't know that,

[01:12:45] why would you go ask somebody, did I do it right?

[01:12:47] And I feel like giving the experience of failure somehow

[01:12:50] ought to be a good way to communicate it.

[01:12:55] And there has to be a good way to give that feedback,

[01:12:58] like a driving simulator or even a video

[01:13:02] of somebody in an accident,

[01:13:04] show how quickly things can happen, right?

[01:13:06] Yeah.

[01:13:07] Yeah, I mean, he suggests this when you're teaching science

[01:13:11] or teaching...

[01:13:13] I think he uses the example of evolutionary theory.

[01:13:16] Teach it in such a way that they're going to misinterpret it

[01:13:19] and then show them how they misinterpret it.

[01:13:23] That's sort of what you're talking about.

[01:13:24] You give them the experience of failure.

[01:13:27] You just show them flat out

[01:13:30] how prone they were to misinterpret

[01:13:34] something that they thought they knew.

[01:13:36] And that's a good way of learning

[01:13:38] what the right interpretation is.

[01:13:40] Right, you know, in the Star Trek universe.

[01:13:47] Somebody tweeted about this and it's a great example.

[01:13:50] There is a... When they're training cadets,

[01:13:53] they put them through a simulation.

[01:13:55] It's called the Kobayashi Maru.

[01:13:56] And the simulation is like,

[01:13:58] okay, you're piloting the ship

[01:14:00] and some whatever Klingons start attacking your ship

[01:14:04] and you have to make the decisions that will...

[01:14:08] Like what decisions will you make

[01:14:09] to get you out of the situation?

[01:14:11] And the... But the test is rigged

[01:14:14] such that no decision ever leads to success.

[01:14:18] And the whole point of the test

[01:14:20] is to show the cadets

[01:14:23] that there will be no-win situations, right?

[01:14:26] That there will be times when you actually get everything wrong

[01:14:28] and you fail through no fault of your own...

[01:14:32] There's no amount of knowledge

[01:14:33] that could have prepared you for it.

[01:14:35] And it seems as if we ought to be able

[01:14:36] to teach this domain-generally to people.

[01:14:39] Like if you're trying to teach a kid,

[01:14:41] like I just don't know if it would work, right?

[01:14:45] Sometimes there's no answer.

[01:14:47] There's no good solution.

[01:14:48] There's nothing like...

[01:14:50] It's just...

[01:14:53] I was reading, in an earlier class,

[01:14:55] the Zhuangzi, which is an early text

[01:14:59] in Taoist philosophy.

[01:15:01] And one of the big messages of that text

[01:15:07] is that our perspectives are so limited

[01:15:11] and we know so little,

[01:15:13] the big mistake is not recognizing that.

[01:15:16] So very Socrates-like in that sense,

[01:15:18] not recognizing it leads to all sorts of political disasters.

[01:15:23] It leads to personal disasters.

[01:15:24] It leads to all sorts of misunderstandings and tragedies.

[01:15:29] But unlike Socrates whose solution to that

[01:15:33] was to keep interrogating the world

[01:15:35] until your ignorance gets diminished a bit,

[01:15:39] his solution seemed to be to just not even try.

[01:15:43] Like don't...

[01:15:44] Because it's just too impossible.

[01:15:47] It's too daunting.

[01:15:48] It is like just understanding the world

[01:15:51] from a perspective as limited as ours,

[01:15:56] is just not possible.

[01:15:57] So stop thinking that you can do it.

[01:16:00] Stop even trying to do it.

[01:16:03] Cause I guess the worry is in some sense,

[01:16:07] the best you can do is get to that point

[01:16:10] where you'll start vastly overestimating

[01:16:13] how much you know.

[01:16:14] Right.

[01:16:15] Yeah.

[01:16:16] Like you'll get to that point

[01:16:16] on the Dunning-Kruger effect

[01:16:18] and that's the worst situation that you can be in.

[01:16:21] So like if you take the Socrates path,

[01:16:23] that's where you're gonna end up

[01:16:25] and there's something very depressing

[01:16:29] about that, paralyzing, but it's interesting.

[01:16:32] It might be right.

[01:16:33] It might be like I just don't know

[01:16:36] how far do you take that?

[01:16:37] Because it seems as if not...

[01:16:39] Like it would require you to just do less shit.

[01:16:42] So like take an example from our own podcast

[01:16:45] and we often say things like this.

[01:16:46] Like well, you and I don't know

[01:16:49] shit about Ecclesiastes really.

[01:16:51] Right.

[01:16:52] There are people who dedicate their lives

[01:16:53] to studying the books of wisdom.

[01:16:55] So we kind of go into it,

[01:16:58] our remedy is to just sort of admit

[01:17:00] that we don't know anything and still have fun doing it.

[01:17:02] But that's because it's sort of a low consequence action.

[01:17:05] Yeah.

[01:17:06] But like there's tons of stuff that like,

[01:17:09] you know I'm sure I'm overconfident in my driving.

[01:17:12] Like what do I do?

[01:17:13] Do I not drive?

[01:17:14] Right.

[01:17:15] Like there's a weird way in which

[01:17:16] you just have to like bite the bullet and say

[01:17:18] like I know enough about this thing to do a thing.

[01:17:22] But what about something like supporting a political candidate?

[01:17:26] That's why I don't like voting.

[01:17:28] Our knowledge of politics, economics,

[01:17:32] healthcare, climate science,

[01:17:34] is so limited. If you took this view

[01:17:38] it would be hard to justify going door to door

[01:17:41] and volunteering for some politician to get elected.

[01:17:45] You know Bernie or whoever, Beto.

[01:17:49] Right.

[01:17:50] Well...

[01:17:51] And you know because that takes a lot,

[01:17:52] that kind of activism, political activism

[01:17:55] it would seem like would just disappear

if you took that Zhuangzi approach of just...

[01:18:02] You know this is often, not always,

[01:18:05] often a criticism of the more liberal mentality.

[01:18:09] Not so much impugning conservatives for being overconfident

[01:18:13] but rather accusing liberals of being so uncertain

[01:18:18] so as to not be able to really commit to anything.

[01:18:21] And it's in stark contrast to the certainty, right?

[01:18:24] So when you were giving that example

[01:18:26] I was like yeah, you know what would happen?

[01:18:27] It's just the conservatives would win

[01:18:29] because they're more willing to say like, you know, I know

[01:18:32] and this isn't a condemnation, right?

[01:18:34] There's times when you need to act

[01:18:37] and just bite the bullet and act with limited information

[01:18:41] and the error of not acting might be worse.

[01:18:45] But I feel like there is something at least

[01:18:49] in modern politics about the wishy-washiness of a liberal

[01:18:53] compared to the certainty of the conservatives.

[01:18:56] Well and that's more centrist liberal

[01:18:59] because then there is the activist progressives

[01:19:04] who share the passion and certainty

[01:19:09] of their, of the people on the right.

[01:19:13] And actually that's why they're making big inroads

[01:19:16] in the Democratic Party, right?

[01:19:18] Because they have that energy and the inspiration

[01:19:24] that the wafflers don't have.

[01:19:28] Right, yeah, no, that's true.

[01:19:30] It's sort of like you have to pick your poison

[01:19:34] and then act confidently about it.

[01:19:37] And it's a hard step for me to take

[01:19:40] in that domain in particular because

[01:19:42] I'm like, I'm a waffler, I think.

[01:19:45] I need to reach around more often.

[01:19:47] Yeah, we should all reach around.

[01:19:49] Have the goddamn courtesy.

[01:19:52] I have a question about one of these studies.

[01:19:54] Yeah.

[01:19:55] So they asked people. Among conservatives,

[01:20:01] 27%, relative to just 10% of liberals,

[01:20:05] agreed both that President Obama's rhetorical skills

[01:20:08] are elegant but are insufficient

[01:20:11] to influence major international issues

[01:20:14] and that President Obama has not done enough

[01:20:17] to use his rhetorical skills

[01:20:19] to effect regime change in Iraq.

[01:20:23] Those don't seem inconsistent.

[01:20:27] At all.

[01:20:27] As you were saying it to me,

[01:20:28] I was waiting for the gotcha.

[01:20:32] I mean, I just like the President Obama,

[01:20:34] I don't even know what that means.

[01:20:35] I mean, President Obama has not done enough

[01:20:37] to use his rhetorical skills to it.

[01:20:41] So that sentence doesn't even make sense.

[01:20:44] But if the best sense I can make of it,

[01:20:48] it is not inconsistent with it.

[01:20:51] There might be some tension

[01:20:52] but it's not inconsistent with the first.

[01:20:54] So I think that if you take these as like his,

[01:20:59] President Obama's rhetorical skills

[01:21:01] are insufficient to influence major international issues.

[01:21:05] You're saying like his rhetorical skills

[01:21:07] can never make a change.

[01:21:08] And then he's not using his rhetorical skills

[01:21:11] to make a change enough.

[01:21:13] If you've just said that he's incapable of,

[01:21:16] his rhetoric is incapable of a change,

[01:21:18] then ought implies can.

[01:21:21] Yeah, I guess.

[01:21:22] But like you could also interpret using

[01:21:25] a principle of charity that this is what they're,

[01:21:29] how they're interpreting it.

[01:21:30] He hasn't tried to develop his rhetorical skills

[01:21:34] enough in a way that would make a change in Iraq.

[01:21:40] I mean like maybe logically,

[01:21:43] if you interpret strictly literally these sentences,

[01:21:47] but that's just not how we understand sentences.

[01:21:49] I mean, I agree with you.

[01:21:52] Gricean, yeah.

[01:21:52] Yeah, correct.

[01:21:53] I agree with you.

[01:21:54] That's why it took me a second careful reading

[01:21:56] to even find that because I think,

[01:22:01] this is just true in social psychology in general.

[01:22:05] I find that like often some of the examples,

[01:22:07] they're betraying such an agenda.

[01:22:09] Like what they choose to ask people about.

[01:22:15] You know we, I should say,

[01:22:17] often literally have zero clue

[01:22:19] that what we've just asked is like,

[01:22:26] betraying our ignorance about it.

[01:22:26] Even when you're testing the Dunning program.

[01:22:29] Exactly, exactly.

[01:22:31] God knows all the ways that we've betrayed

[01:22:34] our ignorance.

[01:22:37] It's shtick for us.

[01:22:38] We know all the errors we're making.

[01:22:40] We're just doing them just for your entertainment.

[01:22:43] Every single error we're making,

[01:22:45] we're aware of and we're doing it to inspire you

[01:22:52] to imagine yourself better.

[01:22:55] I don't know.

[01:22:56] My big Dunning-Kruger is I'll start a sentence

[01:22:59] thinking I know how to finish it

[01:23:02] properly and often I just don't.

[01:23:06] You just have to finish it with confidence

[01:23:07] no matter what you say.

[01:23:09] Just finish it louder

[01:23:10] and with more confidence than you started it.

[01:23:12] And then just with a nod.

[01:23:14] You know like in a Q&A sometimes,

[01:23:17] you just have to give that nod

[01:23:18] to indicate that you're done because.

[01:23:21] Yeah.

[01:23:22] There's nothing in the actual words

[01:23:24] that indicates that it's done.

[01:23:26] I give a very confident non-answer.

[01:23:29] I pause for three seconds while I'm nodding

[01:23:32] and then I say,

[01:23:33] I hope that answered your question

[01:23:34] and then I just move on.

[01:23:37] Does that make sense?

[01:23:39] Yeah.

[01:23:39] Over there.

[01:23:42] You've had your hand up for a while.

[01:23:44] So you get like fairness norms,

[01:23:46] like preventing them from interrupting you.

[01:23:48] Yes, the disabled person in the back had a question.

[01:23:54] You've been waiting patiently, sir.

[01:23:59] God, I'm gonna use that for real.

[01:24:00] I'm not disabled.

[01:24:04] Whatever.

[01:24:05] Go ahead and ask your question.

[01:24:10] All right, join us next time on Very Bad Wizards.