David and Tamler talk about the invasion of dual process theories in psychology. Why do we love theories that divide complex phenomena into just two categories? Is there any evidence to back up these theories? Are we distorting our understanding of the mind and morality? And what can we do to get out of this mess? Plus, Liam Neeson, moral pet peeves, and oral ethics.
Sponsored By:
- Mack Weldon Promo Code: VERYBADWIZARDS
Links:
[00:00:00] Very Bad Wizards is a podcast with a philosopher, my dad, and a psychologist, David Pizarro, having
[00:00:06] an informal discussion about issues in science and ethics.
[00:00:09] Please note that the discussion contains bad words that I'm not allowed to say and knowing
[00:00:14] my dad some very inappropriate jokes.
[00:00:16] I thought it took God seven days to make the world.
[00:00:20] He rested on the seventh.
[00:00:22] I always thought he should have put the extra day in instead of half-assing it.
[00:00:27] Welcome to Very Bad Wizards.
[00:01:17] I'm Tamler Sommers from the University of Houston.
[00:01:20] Dave, the actress Michelle Rodriguez defended Liam Neeson saying he can't be racist because
[00:01:27] of how he kissed Viola Davis and you don't make out with the race you hate.
[00:01:32] Not like that.
[00:01:34] By that standard are you racist?
[00:01:42] By that standard, unfortunately I hate most people in the world.
[00:01:51] I'm not going to air my dirty laundry.
[00:01:56] You're not going to say which people you hate exactly?
[00:01:59] No, no.
[00:02:02] But you know, in my life experience, I blame society.
[00:02:08] They set a standard of beauty for me, and especially as like a light-skinned Latino male, the
[00:02:17] colorism that I was just raised with is entrenched in my psyche.
[00:02:23] I don't endorse it.
[00:02:25] I don't want it to be my true self.
[00:02:27] It's just my, perhaps you could call it my system one, just acting up.
[00:02:33] I would call you a light-skinned quote unquote Latino male.
[00:02:37] Well, your use of quotes is just what a Jew would say.
[00:02:45] What you're doing is essentially the Liam Neeson thing where you're sort of confessing
[00:02:50] something you're not particularly proud of.
[00:02:55] I try.
[00:02:56] I try it.
[00:02:57] I just didn't have enough.
[00:03:02] I've tried to make out with people of all races.
[00:03:12] Really I tried to make out.
[00:03:14] What happened?
[00:03:15] They just slapped me, you know?
[00:03:17] Yeah.
[00:03:18] They're like, who are you?
[00:03:19] Like, who are you?
[00:03:20] Like, why are you walking up to me?
[00:03:21] This is turning.
[00:03:22] I didn't expect it to go this direction, but it's turning into a kind of Harvey Weinstein
[00:03:26] kind of.
[00:03:27] It's a no, but there is an interesting discussion.
[00:03:29] I don't remember if we've ever talked about it.
[00:03:34] I think that if you ask most people and they're being honest, I could be wrong.
[00:03:38] I don't know.
[00:03:39] I don't have hard data on this, but I think that people have preferences for particular
[00:03:46] looks and particular races.
[00:03:49] There is evidence actually from speed dating studies that I think we've mentioned before.
[00:03:54] The question of whether it's racist to prefer one race in terms of sexual attraction is an
[00:04:02] interesting one to me.
[00:04:05] I want to defend people's ability to say that I'm only attracted to, say, black men and
[00:04:11] not call them racist against Asian men.
[00:04:14] I don't think that equality in that sense requires that you be attracted to everybody.
[00:04:20] If so, then I would be like a horrible anti-male bigot,
[00:04:27] the kind the drummers in the woods are worried about.
[00:04:35] I totally agree with you.
[00:04:38] You can't control what and who you're attracted to.
[00:04:45] I was reading something and I forget where.
[00:04:48] I don't know to what extent this is true, but apparently there's a certain segment of
[00:04:51] the transgendered community.
[00:04:54] If someone is a lesbian, they want them to still be attracted to someone who just transitioned
[00:05:00] into a woman or is transitioning, and they say it's transphobic if you make a distinction.
[00:05:09] I was reading something and she's like, I'm a lesbian.
[00:05:13] You have a penis.
[00:05:14] This isn't complicated.
[00:05:21] I think that's fine.
[00:05:22] I defend the lesbian's right to not be attracted to someone with a penis and I defend your
[00:05:27] right to only be attracted to Hitler youth.
[00:05:33] People who look like Hitler youth.
[00:05:37] The Liam Neeson thing, which I know barely about.
[00:05:40] I didn't really read too much about it.
[00:05:45] I think it's fair to say that he was totally being racist when he did what he did.
[00:05:51] Whether or not he is racist now, I don't know.
[00:05:54] But definitely, whether or not he kissed a black woman is not what settles it.
[00:05:59] Imagine all those clubs that white people used to attend back in the day. There
[00:06:08] were plenty of white men who were totally attracted to black women, and they were still
[00:06:14] fucking asshole racists.
[00:06:16] They weren't marching.
[00:06:17] They weren't like marching with Dr. King.
[00:06:19] No, but I do think it does say something about a person if they're not just
[00:06:26] willing but even eager to be intimate with a person of a certain race. There's
[00:06:34] a way in which it doesn't say for sure, it's not overriding.
[00:06:40] It's not absolute that they're not racist.
[00:06:42] But it speaks to something.
[00:06:45] I think it does speak to something.
[00:06:48] I don't think it's.
[00:06:49] It speaks to Bill Burr that he's married to a black woman.
[00:06:51] You know, he makes a lot of sort of borderline race
[00:06:55] jokes, and like, I think the fact that he's married to a black woman like
[00:07:00] credentials him.
[00:07:02] So OK, so that's a different claim.
[00:07:04] But the first one, that it signals something about you.
[00:07:12] I think it might.
[00:07:14] So like, I grant that it's like information that could be on your side
[00:07:21] here, but I do also think that it's possible, like set aside the marriage
[00:07:25] thing, so I think if you're married to somebody from another race and
[00:07:27] you're constantly interacting and, you know, you're
[00:07:31] learning things that you might not have known about their experience,
[00:07:34] that's one thing. But like, just say, a white man
[00:07:37] making out with a black woman.
[00:07:39] I don't think that it requires any sense of equality or non-racism.
[00:07:44] I think that you can fully be racist in a fundamental sense
[00:07:50] that doesn't require disgust at, like, the physicality of them.
[00:07:55] I know that some people are racist like that.
[00:07:58] They, you know, old timey white people might say like, oh,
[00:08:01] like they might actually respond with disgust at like non-white people.
[00:08:05] But I don't think that it requires it now.
[00:08:06] You might in fact, you could marry somebody and treat them like shit.
[00:08:10] Right. And and there's of course, like the horrible ugly history
[00:08:14] of the slave and the plantation owner or whatever.
[00:08:19] And you know, right.
[00:08:20] But yeah, Ice Cube has a great song back when he was not
[00:08:25] making kids' movies called Horny Little Devil, which is all about
[00:08:30] like white men doing exactly that.
[00:08:33] And it's actually hilarious, really offensive if you're the kind of white
[00:08:37] person who thinks that people shouldn't talk bad about you.
[00:08:40] But also really funny.
[00:08:43] So yeah, I don't.
[00:08:44] So can we just talk one second about Liam Neeson?
[00:08:47] So here's the thing. I agree with what you've said,
[00:08:53] certainly at that time. I think him talking about it
[00:08:57] 30 years later is overall kind of a good thing, that he would bring it up.
[00:09:04] And, you know, I don't know
[00:09:07] the extent to which this is a huge story and people are wanting to like not see Liam Neeson
[00:09:13] movies. It's probably not that big a deal.
[00:09:15] But to the extent that it is, I think it shouldn't be. That's counterproductive.
[00:09:19] It's yeah.
[00:09:21] I and I didn't hear like his comments about it.
[00:09:24] Like if he was talking about like the unfortunate way in which he acted,
[00:09:28] then I think it's actually it.
[00:09:31] I think we don't have enough of those conversations.
[00:09:33] In fact, like when I'm talking to students in intro
[00:09:36] psych, and this makes my fellow professors nervous, we're largely white,
[00:09:40] I try when talking about race,
[00:09:44] I try to like defuse the situation by just admitting
[00:09:49] the ways in which I'm racist.
[00:09:51] And I feel like that opens things up a little bit.
[00:09:56] It makes people less defensive, like if somebody is willing to say
[00:10:00] that they themselves have thoughts or feelings that are racist,
[00:10:05] even though I reject them, they exist.
[00:10:08] And actually talking about that,
[00:10:11] I think is the only way we're going to get past some of the things,
[00:10:14] some of the ills of society.
[00:10:16] It's just that it's very difficult to do so when
[00:10:21] you don't have a relationship of trust with somebody, right?
[00:10:24] And the media is not the kind of trustworthy partner
[00:10:27] that you want to be talking to about this stuff.
[00:10:29] Yeah, and social media also.
[00:10:32] And yeah, right.
[00:10:36] All right. So I had said system one
[00:10:38] to give a segue into what we're going to talk about today
[00:10:41] so that you can say yes.
[00:10:42] So segment two, we're going to talk about
[00:10:45] something actually inspired by a suggestion
[00:10:48] that didn't make our finalists.
[00:10:52] But it was it's about this idea of conceptual binaries
[00:10:55] and the degree to which we're wired to see things in terms of binaries
[00:11:00] and how that plays out in philosophy and psychology.
[00:11:03] And you found a really good paper
[00:11:05] about how this is played out in psychology
[00:11:08] and the problems with that.
[00:11:10] So that's what we'll talk about in the in the second segment.
[00:11:14] First segment, we're going to do another suggestion:
[00:11:17] moral pet peeves.
[00:11:19] Before we even get to that, I know this has been all preludes.
[00:11:24] But since our last episode, you are now like the star,
[00:11:29] the headliner of the Atlantic,
[00:11:33] the Atlantic Monthly, if they still call it that. The Atlantic.
[00:11:37] That's that's that's right.
[00:11:40] I'm actually too big for this podcast now.
[00:11:43] Yeah. No, no, no.
[00:11:44] Thank you.
[00:11:45] Yeah, there was a really nice article written up by a woman named
[00:11:48] Kathleen McAuliffe about disgust and politics.
[00:11:51] And but I just wanted to give a shout out
[00:11:54] to the graduate students who did a lot of the work.
[00:11:58] So Yoel and I have been doing this.
[00:12:00] Yoel Inbar, some of you may know him
[00:12:02] from when he used to be a friend of the podcast,
[00:12:06] and I have been doing this work on political orientation and disgust.
[00:12:09] But a couple of
[00:12:11] grad students, Benjamin Ruisch, who's mentioned in the article,
[00:12:14] and Rajen Anderson, who unfortunately isn't mentioned in the article,
[00:12:18] spearheaded this work.
[00:12:19] It's on taste sensitivity and and political orientation.
[00:12:23] And it's a really cool finding.
[00:12:24] But like as always, the professors get the bulk of the credit
[00:12:28] and the grad students get a little bit shit on.
[00:12:31] So for all the grads: Rajen.
[00:12:34] What's his name?
[00:12:34] Poor Rajen Anderson.
[00:12:38] He's he's one of my students.
[00:12:40] Great, great guy.
[00:12:41] But, you know, he's not the lead.
[00:12:43] You know, though, like, that's kind of precious, to be named Rajen.
[00:12:50] He's half Indian, half white, dude.
[00:12:52] Oh, that's OK.
[00:12:53] I didn't know that, because I thought, Anderson? Bastard.
[00:12:56] No, Anderson is not a particularly Indian name.
[00:12:59] I know there are biracial people, Tim.
[00:13:02] You know, God, I'd love to love to be with one of them.
[00:13:08] I wanted to know where that one was going to end.
[00:13:12] All right, let's talk about moral.
[00:13:15] But thanks for letting me talk about my own shit.
[00:13:18] But you want to talk more pet peeves?
[00:13:20] Sure. Yeah.
[00:13:21] So we've come up with some moral pet peeves.
[00:13:25] This was sort of something that you
[00:13:28] expressed interest in.
[00:13:29] So maybe talk about why you want to do it.
[00:13:31] OK, let me see. I feel like I'm very, very tolerant of people's
[00:13:35] immorality. So like I'm happily friends with people who are shady,
[00:13:39] like who I know have done shit.
[00:13:41] That's bad.
[00:13:43] Not that I endorse it, but like I don't.
[00:13:45] I think I'm not a judgmental person.
[00:13:49] But what I realize is that I'm a moralizing person in my everyday life
[00:13:54] and that like bulk of my moral cognition and emotion
[00:13:58] comes from everyday interactions with people that like anger me to a degree
[00:14:03] that is probably well, not probably completely unjustified.
[00:14:08] And so so I wanted just a chance like in some of these we've mentioned before.
[00:14:14] Yeah, like I think, well, returning the cart.
[00:14:20] Yeah, returning the cart.
[00:14:21] Returning the cart at the supermarket parking lot. The people who don't do that.
[00:14:25] I think we both agreed they should be like lined up and shot.
[00:14:28] Like it should be a fuck proportionality.
[00:14:30] Well, no, no, that is.
[00:14:34] That is lex talionis.
[00:14:36] And then last episode we talked about people, people who go to the 10 items
[00:14:40] or fewer line with like 20 items and don't seem to have like.
[00:14:47] Like it'd be one thing if they were turning around
[00:14:49] and being like, I'm really sorry, I'm in a rush or something.
[00:14:51] I'd be like, oh, yeah, fine.
[00:14:52] But like, there's just no indication that they think anything is wrong.
[00:14:56] It's totally shameless.
[00:14:57] Yeah. Yeah.
[00:14:58] It's like, so my moralizing is what led to this topic.
[00:15:03] My hyper-moralizing. But I take it that you.
[00:15:06] Yeah, I do this.
[00:15:07] I was good.
[00:15:08] You know, it was harder than I thought to think of this even.
[00:15:11] And I think that like, you know, I've been meditating now for three and a half
[00:15:14] years, I think if there's one thing meditating is good for it is not getting
[00:15:20] as irritated and angry about these kinds of things because actually one
[00:15:26] of my pet peeves was people who talk about meditation.
[00:15:28] Yeah, no, I totally get that shit.
[00:15:31] But if you meditated, then it wouldn't bother you that much.
[00:15:37] Well, I kissed I kissed a girl who meditates.
[00:15:39] So, you know, you must not be a meditationist.
[00:15:45] All right.
[00:15:46] Sorry, one thing is my difficulty was distinguishing between bad manners
[00:15:51] and what might be called an actual moral pet peeve.
[00:15:54] And I don't know that there's a good line.
[00:15:56] And I think that part of my problem is that I blur the line,
[00:15:59] but we can get right into it.
[00:16:00] No, yeah.
[00:16:00] And don't even think about it in terms of binaries like that,
[00:16:03] like moral conventional.
[00:16:05] You want to start?
[00:16:07] Sure.
[00:16:09] So this still can get me pretty worked up and furious and yelling.
[00:16:16] And like when you're on with somebody in customer service at a,
[00:16:22] you know, like a big company and they are telling you that they
[00:16:28] can't do anything to help resolve the issue.
[00:16:31] And they're telling you that when you know with 100% certainty,
[00:16:35] they absolutely can that they're just choosing not to.
[00:16:38] And they know that you know that.
[00:16:41] And it's just a massive insult to your intelligence.
[00:16:45] It's just a game that they're playing like.
[00:16:47] Exactly.
[00:16:48] Like, like toying with you, like you're a mouse and they're a cat.
[00:16:53] I like, I, that drives me absolutely crazy.
[00:16:56] You're like, you realize like I could call, I could just hang up,
[00:16:58] call back, get somebody else who for whatever reason just decides
[00:17:02] that they do want to help me.
[00:17:04] Like we all know that that's how this works.
[00:17:07] Like so like that, I hate that.
[00:17:09] I hate people who insult your intelligence in that kind of
[00:17:12] way.
[00:17:13] And is your certainty because you have had it happen before?
[00:17:17] Like, how do you know for sure, for sure that they can?
[00:17:19] Because like every single time, like you get another person
[00:17:22] who has that same position and they're able to magically like
[00:17:27] make your problems go away.
[00:17:29] Like I do think it is a choice on their part, you know?
[00:17:33] I don't know what it would entail, but it's a choice on their part.
[00:17:37] Yeah.
[00:17:38] They probably have disincentives to, like, say, credit someone's
[00:17:43] account.
[00:17:43] Like they probably are on this like horrible like spreadsheet about
[00:17:47] like how much they.
[00:17:48] So that's fine.
[00:17:49] And so I understand that.
[00:17:50] No, no, I agree with that.
[00:17:51] Just tell me that's the deal.
[00:17:53] Like I'm not helping you because I already helped two guys today
[00:17:57] and like so you sorry, just be upfront.
[00:18:00] I can actually take a real setback if I feel like the people are
[00:18:04] just being honest with me about what caused it and what's the
[00:18:08] deal going forward.
[00:18:10] Right.
[00:18:10] Yeah.
[00:18:11] No, I totally agree with you.
[00:18:13] The flip side is I've had like wonderful discussions.
[00:18:15] Sometimes like I get on customer service and like the person
[00:18:18] is really nice and we end up just shooting the shit for a
[00:18:21] while.
[00:18:22] This one time I was on with Verizon and I don't know what the
[00:18:26] problem I was having.
[00:18:27] I don't, it wasn't about my bill or anything.
[00:18:29] I think it was about service or something like that.
[00:18:31] And like I was just like actually having a fun conversation.
[00:18:36] And at the end she was like, sweetie, is there anything else
[00:18:39] I can help you with today?
[00:18:40] And I was like, yeah, like could you just like cover my
[00:18:43] bill for the next year?
[00:18:45] And just joking, obviously.
[00:18:47] And she started laughing and she goes, no, but I'll tell you what,
[00:18:50] I'll credit your account with $100.
[00:18:54] I was like, oh.
[00:18:57] That's what I mean.
[00:18:58] I think, when they're good people, because you have such
[00:19:02] nightmarish, Kafka-esque experiences, the ones who are actually
[00:19:06] really helpful, you feel like they're the best people in the
[00:19:09] entire world.
[00:19:10] Okay, I'm going to work my way from least disgusting to most.
[00:19:15] One thing I hate, hate is behavior in airports when you're at
[00:19:21] the gate and you're waiting for your flight and there's never
[00:19:24] obviously enough seats to accommodate the number of people
[00:19:26] who are on a plane.
[00:19:28] And so you're just looking for a place to sit and somebody has
[00:19:32] decided that their bag and their coat like require one or two
[00:19:38] extra seats and you're walking by clearly looking for a
[00:19:41] place to sit and they don't even make eye contact.
[00:19:44] Like it hasn't even entered their mind.
[00:19:47] They're not saving that for anybody.
[00:19:49] They're just being dicks or even just like when they have their
[00:19:52] carry on in the middle of the row and you're trying to get
[00:19:56] around it and they're not even bothering to move it.
[00:19:58] It's like those little things make travel.
[00:20:01] Travel is just like one whole set of minor annoyances.
[00:20:07] Yeah.
[00:20:07] Like please just be kind.
[00:20:09] Yeah, just don't add to them.
[00:20:11] Like we're all in this bullshit together.
[00:20:14] I mean, that's one where I'm like Stockholm syndromed.
[00:20:18] I just expect it.
[00:20:19] Like I don't expect it to get better.
[00:20:21] I just yeah, I'm resigned to it.
[00:20:25] Well, it's so bad that actually, so I should say that like
[00:20:29] I'm sure I engage in a lot of the behavior that I'm
[00:20:31] currently complaining about. I do all of it, except probably the customer service one.
[00:20:34] I've never worked in customer service.
[00:20:36] Sure, I've done something like that.
[00:20:38] Right.
[00:20:38] And sometimes you just genuinely don't know so I can
[00:20:41] like give people a benefit of the doubt.
[00:20:42] All right.
[00:20:42] My second one is people who don't control.
[00:20:46] I wrote this down and what I wrote is people who don't
[00:20:50] control their bratty fucking kids.
[00:20:54] Why, did you not want to say the race?
[00:20:56] Is that what you were thinking?
[00:20:58] No, no, no.
[00:20:59] I actually think it's the opposite.
[00:21:02] Like this is very common in the white community and
[00:21:05] probably more so.
[00:21:06] Like it's race agnostic.
[00:21:09] Yeah.
[00:21:10] I mean, like I think if anything it's more common
[00:21:13] like among white people.
[00:21:15] So actually the example I had was on an airplane and
[00:21:19] there's a kid in back of you and the kid's just
[00:21:21] kicking the seat over and over again and the parents
[00:21:24] just sit there not saying or doing anything.
[00:21:27] And then also the ones who like bring their kids
[00:21:29] over to your house and they're yelling, given free rein,
[00:21:34] banging on the piano and like making huge
[00:21:38] messes. And just, we're both
[00:21:39] parents.
[00:21:40] So I feel like we have some right to complain about this
[00:21:45] because you know, although Eliza was kind of an angel
[00:21:49] that you know when she was at this age.
[00:21:51] Sure, sure.
[00:21:52] Like so was my daughter.
[00:21:53] But yeah, partly because of what what we did.
[00:21:56] Right.
[00:21:56] Yeah.
[00:21:57] Like, well, maybe in our parenting
[00:22:00] episode.
[00:22:01] Yes.
[00:22:02] Well, we have a big announcement to make after the break.
[00:22:06] So yeah, like and you know, on the rare occasions
[00:22:11] where Eliza did act up in ways that other people
[00:22:15] had to suffer like we'd stop what was going on.
[00:22:18] Like we'd take care of it.
[00:22:20] Yeah, absolutely.
[00:22:21] Like if there's like a general principle I have
[00:22:24] of like respecting other people's shit.
[00:22:26] And when you have kids, you bring them over
[00:22:30] to somebody's house and they start like touching
[00:22:32] the TV with their hands or, you know,
[00:22:34] anything like that, and you're like,
[00:22:37] are you not noticing this?
[00:22:39] Like right.
[00:22:41] I think a lot of the parents are just
[00:22:43] dead, numb to it. They're just dead.
[00:22:45] Yeah, they're like because they have to deal with
[00:22:47] this all the time.
[00:22:49] And this is nothing.
[00:22:50] Yeah.
[00:22:51] But that's a case where you feel like
[00:22:52] you can't bring it up because like you're an
[00:22:54] asshole if you tell somebody that their kids
[00:22:56] are like behaving like could you tell your kid
[00:22:58] not to like fucking knock over my my vase.
[00:23:02] I suspect I'm more capable of actually doing
[00:23:05] that and just saying it than you are.
[00:23:09] Probably.
[00:23:11] All right, mine.
[00:23:13] Mine is a puzzling behavior.
[00:23:15] It's gross, but.
[00:23:18] Like taking a shit and not flushing a toilet.
[00:23:22] Like in a public bathroom.
[00:23:23] Yeah, this is like one of those things where
[00:23:25] like the flushers in public restrooms
[00:23:29] are beautiful feats of engineering.
[00:23:32] Like I wish I had that in my home.
[00:23:34] Like now we have these low flow toilets
[00:23:36] and like sometimes it gets stuck and you have to.
[00:23:38] But like one press of that thing
[00:23:41] and it's sucking everything down.
[00:23:44] And I have to walk into like an explosive
[00:23:47] diarrhea situation or just even a big log.
[00:23:50] Like, what I don't understand is how
[00:23:54] you could just not
[00:23:56] think to flush the toilet.
[00:23:58] Like are they just forgetting?
[00:23:59] It seems like a weird thing to forget.
[00:24:01] You wipe your ass presumably and you look at the toilet.
[00:24:04] You put the toilet paper there.
[00:24:05] You look at the toilet.
[00:24:06] Yeah, you kind of want to review and be like,
[00:24:08] am I proud of this or like.
[00:24:11] Should I let someone else see this?
[00:24:13] Yeah, I mean, sometimes it's so beautiful that you don't.
[00:24:19] You don't want to be the only person
[00:24:20] who ever gets to see it, you know, take a picture.
[00:24:23] That's what.
[00:24:25] Yeah, I know.
[00:24:26] I have.
[00:24:27] I didn't put this on because it's not like,
[00:24:29] but my thing with public bathrooms is the people who go
[00:24:32] into a stall and piss into the stall,
[00:24:35] but they leave the toilet seat down.
[00:24:38] So like now.
[00:24:39] So yeah, now there's just like a bunch of pee.
[00:24:41] And yeah. Yeah.
[00:24:42] And so now, if you're going in there to piss,
[00:24:46] you're faced with: do I put the toilet seat up,
[00:24:49] even though it has pee on it?
[00:24:50] Or am I going to just also piss onto the toilet seat that's down?
[00:24:56] Yeah. You know, like the one that you would sit on
[00:24:59] for a shit and, you know, like I've done both.
[00:25:05] Sometimes I just suck it up and like partly
[00:25:07] because I don't want people to think that I'm the one that did that.
[00:25:10] Oh, I know. I know. Yeah. No, I play janitor.
[00:25:13] Like and I just like wipe it.
[00:25:16] But that deeply angers me.
[00:25:18] So this is more of a professional one,
[00:25:21] but I think it's a moral issue for sure.
[00:25:23] People who, when they're giving a lecture or presentation,
[00:25:27] run over time. And at a conference.
[00:25:32] Yeah, like when there's like three people in the same session or something.
[00:25:35] So then it's like horrible.
[00:25:38] But even if it's just their session
[00:25:41] and they were told, you know, 45 minutes and they're going to an hour.
[00:25:45] And, you know, those same people are the ones that are like constantly
[00:25:48] shuffling papers, you know, oh, should I do this part?
[00:25:51] I don't know. Like it's like, did you not prepare for this?
[00:25:55] Like, you knew we agreed to this like three months ago.
[00:25:59] Why are you doing this right now?
[00:26:01] Like why are you making these decisions as if like somebody told you five minutes
[00:26:05] beforehand, hey, you're going to go on and do this paper in this amount of time.
[00:26:10] Go. I don't like that.
[00:26:13] So it's a disrespect for your audience.
[00:26:15] It's really yeah.
[00:26:16] And the worst is when they acknowledge it laughingly
[00:26:19] and they're like, you guys don't mind, right?
[00:26:21] Like you guys don't mind if I go a little over.
[00:26:23] Like I really want to get to this. And I'm like, yeah, I do mind. I didn't even want to see
[00:26:26] the normal-length version of the paper.
[00:26:31] It's funny to say shuffling papers, though, because in psychology,
[00:26:35] everybody just uses PowerPoint. Like nobody actually has a paper.
[00:26:38] Yeah.
[00:26:40] Which makes sense. But then what they do is they scroll through
[00:26:43] and they're like, do I have time for these slides?
[00:26:44] Yeah. OK, I'll do it.
[00:26:46] Yeah, it's the same principle.
[00:26:48] Um, I know I definitely.
[00:26:52] This is my final one.
[00:26:54] I wasn't sure whether to put this on and there are conditions to it.
[00:26:58] So I'm going to say it and just suspend your judgment.
[00:27:02] You're married, so this might not apply to you.
[00:27:05] Not reciprocating oral sex.
[00:27:11] So I am of the opinion that
[00:27:14] if you perform oral sex,
[00:27:18] at the very least somebody should
[00:27:22] mention
[00:27:24] that they might do it for you, not just ignore it entirely
[00:27:29] and just like rest on the leg, like, I'm cool.
[00:27:33] Thanks.
[00:27:35] Like I feel like there should really be some reciprocity there.
[00:27:39] Just, you know, it doesn't mean that you have to.
[00:27:41] Like maybe you hate giving oral sex and you love receiving it,
[00:27:44] which would be totally fine.
[00:27:46] But then just say that, like just give somebody a heads up
[00:27:49] that you're not just being an asshole.
[00:27:54] This is sort of like my customer service thing.
[00:27:56] Like just be honest. Just be honest.
[00:28:00] You know you could.
[00:28:04] Oh, wow. OK.
[00:28:07] Yeah, I fear that this is gendered in the opposite way,
[00:28:10] that men are much more reluctant. Like they'll take it
[00:28:13] all the time and not give it.
[00:28:16] Yeah. In fact, I'm sure that that's the case.
[00:28:18] I'm pretty sure too.
[00:28:20] I think there's more of an expectation one way than in the other.
[00:28:24] And I just find that rude.
[00:28:26] This this is this is just.
[00:28:27] But you want true reciprocity.
[00:28:30] Well, I want there to be,
[00:28:33] I want there to be the politeness of at least acknowledging it.
[00:28:37] If it's not going to happen, like, fine.
[00:28:40] Like there are people who hate doing it.
[00:28:42] Like that doesn't bother me.
[00:28:44] But like, do you feel like they owe you that information before you?
[00:28:50] No, no, no.
[00:28:52] They can spring it on you afterwards.
[00:28:57] They could spring it on you afterwards.
[00:28:58] I think that, like, just acknowledging it.
[00:29:02] You know, like even if somebody said, hey, like in the airport,
[00:29:05] you know, I know my bag is here, but like I have something really valuable.
[00:29:09] I don't want to put on the floor. I'm really sorry.
[00:29:11] Like that would be fine.
[00:29:12] In the same way that like saying like, hey, man, thanks.
[00:29:15] Thanks for like the oral sex.
[00:29:17] But just so you know, like I have like a gag reflex or whatever.
[00:29:22] It does not allow me to.
[00:29:25] Just so you know, like I have no interest.
[00:29:26] There's no fucking way that I'm going to touch you after you.
[00:29:31] Do you love me?
[00:29:33] Very, very much.
[00:29:43] We kiss it.
[00:29:46] But you first.
[00:29:49] OK.
[00:29:57] But but wait a minute.
[00:30:00] So I'm trying to just pin down the parameters of this.
[00:30:05] Like, are you talking about like a one night stand or is this like even into a relationship?
[00:30:11] Like, I think that like if it goes unspoken into a relationship.
[00:30:16] So here's where where like it could happen.
[00:30:18] Like so it could happen that that one partner really loves doing it.
[00:30:22] So they always do it to to their partner.
[00:30:26] And because they love doing it and they love giving their partner pleasure
[00:30:29] or maybe it brings them pleasure to do it, they continue to do it.
[00:30:33] But then it's never mentioned why their partner isn't doing it back.
[00:30:38] Like I would worry, like, dude, do I smell bad?
[00:30:43] Like, you know, is there some reason that you just
[00:30:48] continuously ignore my side of things that is hanging there?
[00:30:54] Yeah, I feel like I feel like it should just be something that's put out there.
[00:30:59] You know, like you're the married man.
[00:31:02] You tell me how it works as a married person.
[00:31:05] Is it in the contract?
[00:31:07] It's in the yeah, it's in the things that you sign.
[00:31:11] The pre.
[00:31:13] Yeah, no, I think that I always wondered what would be the episode where,
[00:31:17] you know, like we finally get in trouble and like charges brought against us.
[00:31:23] And I didn't think it would be the moral pet peeves one, but
[00:31:27] episode 158, the last episode.
[00:31:30] This is like, I mean, this goes both ways.
[00:31:34] There's nothing gendered about this.
[00:31:36] Like, you know, it might be a greater problem in the gay male community.
[00:31:40] I'd love to hear what the norms are there.
[00:31:42] Like, is it just expected?
[00:31:44] Like, I don't know.
[00:31:46] Yeah, well, this is an empirical question.
[00:31:48] Hit us up. Email us.
[00:31:51] Um,
[00:31:53] does reciprocity extend?
[00:31:55] Does it matter if you're kind of like the butch one or the?
[00:31:59] Yeah, like are there expectations that if you're the bottom that you're the one
[00:32:04] who's going to initiate oral sex, like, like I'm I'm actually curious about
[00:32:09] the general norms and conventions.
[00:32:12] All right.
[00:32:13] You know, one thing I was worried about is
[00:32:17] that this would just explode into you and me yelling at each other about like it
[00:32:21] would just be pet peeves about what the other person does.
[00:32:25] Like I know I was I was trying to think of pet peeves.
[00:32:28] I don't think so. I think that, like, we pretty much air our grievances.
[00:32:34] Maybe not at first. Like in the first bit, there were
[00:32:39] things that, like, I'm sure I was passive aggressive about.
[00:32:42] But like now I think I'll just tell you like are you fucking kidding me, dude?
[00:32:45] Is that what you're going to do?
[00:32:46] Yeah, like we just
[00:32:49] figured out that it was better to air out
[00:32:53] what the problem was.
[00:32:55] But I do remember like 24 hour periods where I'd be like, I know,
[00:32:58] Dave is pissed at me right now.
[00:33:00] I absolutely know and it's coming and I'd just be waiting for that text or that.
[00:33:05] And then it would come with a vengeance.
[00:33:09] I remember actually being annoyed at you because
[00:33:13] there was one time where you like got mad at me because you thought I hadn't
[00:33:18] prepared appropriately. It was like maybe the day before or like the afternoon
[00:33:23] before we were going to record and you're like, you haven't read the article yet.
[00:33:27] It's like, I was going to read it, just not yet. But you were mad at me.
[00:33:31] And then I remember just like holding on to that until the day that you
[00:33:36] told me that you hadn't prepared just to be like, see, see.
[00:33:41] That probably didn't take long.
[00:33:42] Probably didn't take long for that day to come.
[00:33:47] It's much better to just say, say things like,
[00:33:50] like for instance, this was my way of telling you,
[00:33:53] you never fucking reciprocate.
[00:33:56] I'm sorry, I'll do it.
[00:33:58] I didn't know you felt that way about waiting for a blow job for so long.
[00:34:02] Like, how does it not enter your mind that I might want it to?
[00:34:06] I thought we had that kind of relationship was just all right.
[00:34:11] Yeah, this is it.
[00:34:12] That was our last first segment.
[00:34:14] Now we're going to go into our last second segment.
[00:34:18] We'll be right back to talk about conceptual binaries.
[00:34:24] Dave, how are your balls feeling right now?
[00:34:28] It's funny you should ask, Tamler.
[00:34:31] Currently they're feeling loved, comfortable, hugged.
[00:34:35] And that's because we have a new sponsor, it's Mack Weldon.
[00:34:39] Yes, Mack Weldon is better than whatever you're wearing right now.
[00:34:43] I'm currently girding my loins with their 18 hour Jersey boxer briefs.
[00:34:49] To all of our listeners who aren't
[00:34:51] girding their loins with Mack Weldon, I feel bad for you.
[00:34:54] I feel your pain.
[00:34:55] I feel your rash kind of rashy itch.
[00:34:58] This is why empathy is bad.
[00:35:00] Paul Bloom is right.
[00:35:01] Mack Weldon is a premium men's
[00:35:03] essentials brand that believes in smart design and premium fabrics.
[00:35:07] Mack Weldon will be the most comfortable underwear, socks, shirts,
[00:35:10] undershirts, hoodies, sweatpants and more that you will ever wear.
[00:35:15] And in fact, I haven't been this excited about underwear,
[00:35:18] I think, since I was about six or seven and I had a pair of Green Lantern
[00:35:21] Underoos, and I liked to just put them on top of my pants and pose.
[00:35:26] Now I'm just going to take pictures of me wearing my Mack Weldon.
[00:35:31] Available for purchase.
[00:35:32] Mack Weldon believes in smart design, premium fabrics and simple shopping.
[00:35:39] Website is very easy to use.
[00:35:41] They have a line of silver underwear and shirts that are naturally anti-microbial,
[00:35:46] which means they eliminate odor and they want you to be comfortable.
[00:35:49] So if you don't like your first pair, you can keep it and they will still
[00:35:53] refund you. No questions asked.
[00:35:56] I'm very glad that they don't ask you to return it.
[00:35:59] I wouldn't trust any company that asks you to return underwear.
[00:36:02] I got to say, I love the website too.
[00:36:04] I also bought they make travel accessories and, you know,
[00:36:08] I'm kind of a big deal and I travel a lot
[00:36:11] and right.
[00:36:13] Oh, yeah.
[00:36:14] I was just going to pick up on that.
[00:36:16] You are a big deal and you do travel a lot.
[00:36:18] And I bought myself an Ion tech case.
[00:36:21] It's like a laptop sleeve with some pockets that means you can
[00:36:26] just easily switch your stuff from bag to bag without having to like repack.
[00:36:31] And I love that too.
[00:36:32] I got my eye on their hoodie.
[00:36:35] Yeah, yeah, actually, you need to.
[00:36:38] So for twenty percent off your first order,
[00:36:41] visit mackweldon.com and enter promo code VERYBADWIZARDS.
[00:36:46] One word, VERYBADWIZARDS, at checkout.
[00:36:49] Again, for twenty percent off your first order,
[00:36:51] go to mackweldon.com and enter promo code VERYBADWIZARDS at checkout.
[00:36:57] You won't regret it.
[00:36:59] Thank you to Mack Weldon for sponsoring this episode of Very Bad Wizards.
[00:37:04] Welcome back to Very Bad Wizards.
[00:38:07] Thank you to all the people who get in touch with us, who email us,
[00:38:11] who tweet at us in the various ways that you get in touch with us.
[00:38:17] Email is probably always the best; Facebook message, probably the worst.
[00:38:24] We really appreciate it.
[00:38:26] We've had some really nice emails lately, some really interesting emails,
[00:38:29] even some critical emails and I don't know.
[00:38:32] I really... we don't have time to respond to a huge percentage of them.
[00:38:36] But we do read all of them and we really value it.
[00:38:39] So you can email us, verybadwizards@gmail.com, tweet at us: @tamler,
[00:38:46] @peez, @verybadwizards, the three Twitter accounts.
[00:38:52] You can like us on Facebook.
[00:38:56] You can go to our Reddit subreddit, which is not run by us,
[00:39:00] not organized by us, and we make no promises about what we read and what
[00:39:05] we don't, but it's a really good community.
[00:39:08] Like, Reddit has its ups and downs.
[00:39:13] And I'd say that this is a really nice community overall.
[00:39:18] It's still Reddit, but overall it seems like a thoughtful group of people.
[00:39:24] And they post a lot of funny things too.
[00:39:25] So so you can go there and you can like us on Instagram or follow us on Instagram.
[00:39:33] Sorry, you can follow us on Instagram.
[00:39:35] And you can support us in more tangible ways as well.
[00:39:40] And a special thank you to all the people who do this.
[00:39:44] There are three ways to do it. Three to four.
[00:39:48] There is to click on the Amazon link on the website and then purchase,
[00:39:53] make the purchases you would otherwise make.
[00:39:55] And we get a small cut of that.
[00:39:58] I really appreciate everybody who does that.
[00:40:02] You can give us a one time donation or even a recurring donation on PayPal.
[00:40:08] You can get some merch on Teespring.
[00:40:13] We got to get another campaign up and running at a certain point.
[00:40:17] And finally, you can become one of our deeply beloved
[00:40:22] Patreon supporters and give us a little amount per month.
[00:40:26] And there are some bonus content up there depending on your level,
[00:40:32] including the fact that our
[00:40:35] well, all of our patrons get to suggest episodes and our five dollar and up
[00:40:41] patrons get to vote on a topic.
[00:40:44] And we gave the finalists last time and it turned out to be a two topic race
[00:40:52] for the most part.
[00:40:53] There was ethics of care, denial of death and what was the third one?
[00:41:00] Oh, polyamory.
[00:41:02] And those three never really made a strong run for it.
[00:41:07] But it was self deception and parenting.
[00:41:11] Does parenting matter?
[00:41:12] And also ethics of parenting.
[00:41:15] They were kind of flip flopping for a while.
[00:41:18] It was sort of fun to watch.
[00:41:19] But in the end, parenting does parenting matter and parenting ethics is our winner.
[00:41:26] And that will and maybe we'll get Paul to talk about this.
[00:41:30] Yeah, I think it might be that he would be a great person to talk to.
[00:41:33] Yeah, I think he inspired this.
[00:41:35] Right. He's a genuine developmental psychologist.
[00:41:38] I was rooting for self deception.
[00:41:40] Were you though?
[00:41:41] You think you were.
[00:41:44] Yes, thank you to all of our patrons and all of the people who get in touch
[00:41:48] with us and who support us.
[00:41:50] You are what makes us do this at the increasing expense of our other lives
[00:41:58] and obligations.
[00:42:00] That's right.
[00:42:00] That's right.
[00:42:01] All right.
[00:42:02] So yeah, so this was your idea for a paper and
[00:42:09] a topic that you wanted to get to sooner rather than later.
[00:42:12] So so tell us about it.
[00:42:14] So, you know, I've griped about this before.
[00:42:18] So I'll give a little bit.
[00:42:19] So the idea is
[00:42:22] the broad idea is that we tend to think in sort of these conceptual dichotomies
[00:42:27] across many domains, and like Tamler was saying, in our fields,
[00:42:33] people just like
[00:42:36] theorizing, classifying, categorizing things into two.
[00:42:39] In psychology, in my area of social psychology,
[00:42:44] the field has been sort of dominated by these dual process theories of the human mind.
[00:42:52] And they all have a very similar flavor, even though the specifics of all of them
[00:42:58] vary, but ever since like the seventies, people have been proposing that there
[00:43:04] are two primary ways in which, say, cognition works.
[00:43:09] It's Kahn...
[00:43:10] And is it Kahneman?
[00:43:12] Kahneman. Kahneman wrote the book Thinking Fast and Slow.
[00:43:15] But is that based on his early work, the type one, type two sort of?
[00:43:20] The type one and type two does come from Kahneman's work,
[00:43:24] Kahneman and Tversky, and "type one" and "type two"
[00:43:28] was coined a bit later, like in the nineties.
[00:43:32] But but yeah, that's one example, right?
[00:43:35] So the idea there being
[00:43:38] that there is a mental process that's associated with effort and deliberation
[00:43:43] and really, really thinking hard about something.
[00:43:47] And there's cognitive processes that are intuitive and fast and automatic and
[00:43:51] unconscious and uncontrolled.
[00:43:53] And so understanding human judgment,
[00:43:57] you can understand it as under what conditions people are making judgments,
[00:44:05] specifically for Kahneman and Tversky, judgments under uncertainty.
[00:44:10] Like whether they're stopping to think about it deeply or whether they're
[00:44:15] just going with their gut. That made its way into moral psychology, as any
[00:44:19] listener of this podcast probably knows.
[00:44:22] So Josh Greene's view of morality is that if you think hard about it,
[00:44:26] you become a consequentialist.
[00:44:28] But if you go with your gut, you're more like a deontologist.
[00:44:31] And that itself, those moral theories,
[00:44:35] those are also dichotomies, right?
[00:44:37] Dichotomies that don't necessarily have to...
[00:44:42] That's right.
[00:44:42] Dichotomies. Like even before type one, type two,
[00:44:48] there were theories of persuasion in social
[00:44:50] psychology that essentially were the same thing.
[00:44:58] So the question was, say, Tamler, I want to convince you to stop smoking.
[00:44:58] Do I present you with a bunch of statistics about how you might die when you smoke?
[00:45:04] Or do I give you an ad with like like an attractive person telling you,
[00:45:09] like, don't smoke, I don't smoke either.
[00:45:12] And so the idea in those theories of persuasion was, well, which one
[00:45:19] works depends on whether you are under the right conditions.
[00:45:23] Do you have the cognitive resources to think deeply about it?
[00:45:26] Then the statistics will work or are you just going with your gut reaction?
[00:45:30] In which case, like the celebrity ad will work.
[00:45:33] So they've been around for a long time and they've always just really annoyed me.
[00:45:39] Right.
[00:45:40] So and like my gripe with it always was
[00:45:45] that when you divide cognition into two types like Josh Greene does,
[00:45:53] or Kahneman and Tversky, and you say one is about deliberation and the other one is
[00:45:58] about whatever, you're missing out on so much different psychology.
[00:46:03] And Jonathan Haidt does this in his original paper, The Emotional Dog and Its
[00:46:09] Rational Tail, where he outlines what system one is, which is the fast,
[00:46:15] unconscious one, right? He says this category of thinking includes emotions,
[00:46:21] intuitions, uncontrollable and unconscious thinking.
[00:46:25] And all of those things could be very, very psychologically different.
[00:46:30] In fact, even what kind of emotion is influencing your judgment?
[00:46:35] Like it matters to distinguish which emotion is influencing you.
[00:46:39] But let's just even take intuition.
[00:46:41] Right. When we use the word intuition,
[00:46:43] we might mean an intuition that has come from a whole bunch of learning.
[00:46:48] Like a chess master has intuitions about what move to perform next.
[00:46:53] That's a very, very different intuition than one that is, say,
[00:46:57] evolved, say like the resistance to incest, a disgust reaction to incest.
[00:47:04] That's like an emotional response at something that that probably evolved.
[00:47:09] So if you care about what the mind is like,
[00:47:12] like how the mind works to lump everything together in that system one,
[00:47:16] I think is ridiculous.
[00:47:18] It's doing a disservice.
[00:47:19] I'll get to maybe a defense of this view later, but like,
[00:47:24] but it was nice to find this paper.
[00:47:26] So this is a paper by David Melnikoff and John Bargh called The Mythical Number
[00:47:30] Two that tries to sort of take apart this this style of theorizing.
[00:47:34] So can I just, look, for people like myself who are generally familiar with this
[00:47:39] type one, type two stuff, but not the specifics.
[00:47:44] And just in terms of this paper, so because it talks a lot about this type one,
[00:47:50] type two distinction.
[00:47:51] But the idea is that in type one,
[00:47:54] judgments are unconscious, unintentional, uncontrollable and efficient.
[00:48:02] Yeah.
[00:48:02] Like all those four things.
[00:48:04] And then in the other one, it's the opposite.
[00:48:07] So inefficient, controllable, conscious and intentional.
[00:48:15] Intentional.
[00:48:16] So and just the way they start out is by saying, look,
[00:48:22] if you even just agree that those are the proper
[00:48:27] categories, there's no reason to think that
[00:48:33] four of them are grouped in one place and four of them are grouped in the other
[00:48:37] in just two bundles. Like there's 16 possible combinations.
[00:48:42] And so why would you think that they would all be grouped together?
[00:48:46] That's actually a pretty striking claim that would need a lot of evidence
[00:48:50] in support of it, and it doesn't have that.
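As an aside for readers following along, the combinatorics here are easy to check. A minimal sketch (Python, purely illustrative; the feature names are just the four properties discussed in the episode, and the two bundled profiles are the ones dual-process theories assume):

```python
from itertools import product

# The four properties that dual-process theories treat as co-occurring.
features = ["conscious", "intentional", "controllable", "inefficient"]

# Each property is binary (present/absent), so there are 2**4 = 16
# possible combinations, not just two.
profiles = list(product([True, False], repeat=len(features)))
print(len(profiles))  # 16

# The two bundles the theories assume are only 2 of those 16:
type_two = (True, True, True, True)      # conscious, intentional, controllable, inefficient
type_one = (False, False, False, False)  # unconscious, unintentional, uncontrollable, efficient
print(type_one in profiles, type_two in profiles)  # True True
```

The point being made in the discussion is that nothing rules out the other 14 profiles, e.g. an intentional-but-automatic process like driving.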
[00:48:54] Right. Right.
[00:48:55] So it's like dual process theory has just proceeded
[00:49:00] largely by an agreement that this assumption was OK.
[00:49:05] And I have to say, John Bargh, you know, even though he's writing this critique,
[00:49:09] he's the person who put sort of automatic behavior on the map.
[00:49:14] He's the name associated with automaticity in psychology.
[00:49:19] But that's OK if he.
[00:49:21] Yeah, no, no.
[00:49:21] It's great that he's it's great that that he's actually writing this paper.
[00:49:26] So yeah.
[00:49:27] And I remember, early on, before this system... they call it type one, type two.
[00:49:32] But the Kahneman and Tversky way is system one, system two.
[00:49:36] So forgive me if I float between the two.
[00:49:38] I'll forget.
[00:49:40] But you know, early on, I remember, when I was talking about these
[00:49:44] persuasion models, there's one called the elaboration likelihood model
[00:49:49] by Richard Petty and John Cacioppo.
[00:49:53] And it seemed like really complicated, right?
[00:49:55] It has all these like boxes and arrows.
[00:49:57] And it just all boiled down to sometimes you think hard and sometimes you don't.
[00:50:01] Like, it's like all of these theories boiled down to that
[00:50:04] and what it means specifically to think hard and not think hard.
[00:50:07] And like it took such a hold of the field that I remember
[00:50:14] when I was in grad school, like an edited volume came out
[00:50:18] that was called dual process theories in social psychology.
[00:50:22] And I remember
[00:50:26] being like, like even at that point,
[00:50:29] this is like why would you categorize theories by the number of things in the theory?
[00:50:35] Like it was literally like, hey,
[00:50:38] the number two seems to pop up across a whole bunch of different theories.
[00:50:42] Like let's get people together and write about the number two.
[00:50:45] It has like some magic hold on people.
[00:50:48] So this article, the mythical number two is as you said,
[00:50:52] trying to argue that it is a bad thing to continue to talk about
[00:50:59] two systems or two types for the very reason that you said, because
[00:51:03] you can have things that are say
[00:51:08] intentional and also automatic, which I think is obvious.
[00:51:13] So so
[00:51:15] you know, you learn to drive. When I drive, I am doing so intentionally.
[00:51:20] But it's pretty automatic, like me turning the steering wheel or shifting gears
[00:51:25] or whatever the task might be.
[00:51:27] It's almost, I don't know.
[00:51:30] You as an outsider, you tell me,
[00:51:32] but it's always seemed to me like so obvious that I don't know why it took hold.
[00:51:37] Like that's just.
[00:51:38] Yeah, no, I mean, I think that's right.
[00:51:40] Like it seems very faith based, just like a religion where people
[00:51:46] never really think to question some of the totally implausible, almost insane claims.
[00:51:53] That's how it seems from the outside.
[00:51:56] And from the inside, it's like, no, these are our starting premises.
[00:52:00] We've grown up with these like this is our world.
[00:52:03] Right.
[00:52:04] I think that part of it is a result of conceptual sloppiness on the part of
[00:52:10] social psychologists, because I don't think
[00:52:14] that they were so insensitive to the facts of the matter.
[00:52:18] It's just that they were very easily comfortable
[00:52:22] with these slippery definitions of what automatic means and what unconscious means.
[00:52:28] And in fact, this isn't cited because it was a paper that nobody read,
[00:52:31] but Paul Bloom, Eric Uhlmann and I once wrote a paper called The Varieties
[00:52:35] of Unconscious Social Cognition where we were trying to point out that like
[00:52:39] oftentimes
[00:52:42] there's different kinds of unconscious, and,
[00:52:45] as the authors of this paper point out too, you can be unconscious
[00:52:49] of the influence that caused you to have a preference, say.
[00:52:52] So like, why do I like coffee?
[00:52:56] I don't know.
[00:52:57] For all I know it's because somebody in my youth drank coffee and I liked them
[00:53:03] and the smell like made me kind of like it and that's why I enjoy it now.
[00:53:07] But I know I like coffee.
[00:53:09] That's not unconscious.
[00:53:10] Right.
[00:53:11] Like, it's not unavailable to me to tell you that I have this preference for coffee.
[00:53:16] But but for some reason,
[00:53:19] we as social psychologists just started
[00:53:23] lumping these concepts together and not really questioning
[00:53:26] what the differences were.
[00:53:29] And this is like again, you know, like obviously I'm a fan of philosophy,
[00:53:33] but I think a philosopher would have noticed right away.
[00:53:35] Well,
[00:53:38] like what exactly do you mean by unconscious here?
[00:53:40] Like what do you mean that it's completely inaccessible?
[00:53:45] Do you mean that you are unaware of the things that gave rise to this?
[00:53:49] Like I see.
[00:53:50] I think you're giving philosophy too much credit.
[00:53:53] I mean, we might have done that about this particular issue.
[00:53:57] Yeah.
[00:53:57] But when you look at moral philosophy and other kinds of philosophy,
[00:54:02] you find all sorts of similar analogous binaries,
[00:54:07] like retributivism and utilitarianism in the philosophy of punishment,
[00:54:11] which just seemed like the only two conceivable approaches to punishment
[00:54:15] for a very long time.
[00:54:18] And in the minds of some people, a lot of people still today.
[00:54:22] And yeah, you know, on the margins, people are like, well, there's this
[00:54:27] kind of retributivism and that kind of retributivism.
[00:54:29] But you're still working within that framework or people who came up with
[00:54:33] hybrid theories.
[00:54:34] It's like, yeah, but you're still thinking it has to be part of this and part of that.
[00:54:41] And, you know, that's the that's like the materials that you're working with.
[00:54:46] It's yeah, it's what you're allowed to start with.
[00:54:48] Right.
[00:54:49] You know, one of my favorite examples of this is really old, and it
[00:54:54] crosses disciplines, and it's cognition and emotion.
[00:54:58] And so yeah, or reason and emotion.
[00:55:01] And so, ever since at least when I started studying emotion in grad school,
[00:55:07] everybody just says, well, we know that they are not clear dichotomies.
[00:55:14] But then but then they always proceed to treat them as dichotomies.
[00:55:18] Exactly.
[00:55:20] That's what I mean by shrugging it off.
[00:55:22] They acknowledge at some abstract level the total
[00:55:27] oversimplification of their model and then go on to use the model.
[00:55:31] Right.
[00:55:32] As if it's not an oversimplification.
[00:55:34] I actually found this analogy that they give in their conclusion really helpful.
[00:55:38] So they say like there are two types of cars, convertibles and hard tops.
[00:55:44] And that's true.
[00:55:46] No problem there.
[00:55:47] And then we also say there is automatic and manual transmission.
[00:55:51] Yep, that's right.
[00:55:52] Gasoline and electric motors and hybrids, foreign and domestic.
[00:56:01] And then they say the point is that these are all different types of cars.
[00:56:01] We know that it's not just two types of cars overall.
[00:56:05] Some convertibles have manual transmissions, gasoline engines and are manufactured
[00:56:09] overseas, and some hard tops have automatic... like, but that's right.
[00:56:13] Imagine having a theory of cars that just thought that there
[00:56:17] were only two types of cars. Can you imagine that?
[00:56:22] That this group has these four features and that group has those four features.
[00:56:26] Can you imagine like a thousand papers on like convertibles versus
[00:56:30] hard tops and being like, well, you know,
[00:56:32] sometimes we find a difference and sometimes we don't.
[00:56:34] And it's like really like, you know,
[00:56:37] sometimes we find that they both can travel at the same speed.
[00:56:42] And who knows why?
[00:56:43] This is a deep mystery for for the future of the study of cars.
[00:56:47] Further research should investigate what the model is.
[00:56:52] I mean, we know that it's slightly
[00:56:54] oversimplifying to say that there's only two types of cars.
[00:56:57] But anyway, we ran this experiment where
[00:57:02] we tried to open up an electric car.
[00:57:05] This is science.
[00:57:06] We like cracked open a hard top and we put a cloth thing over it.
[00:57:11] And it still drove the same.
[00:57:13] It was weird. It was very, very weird.
[00:57:15] Like it was a failed experiment.
[00:57:16] But thankfully open science allows us to report non-significant results.
[00:57:22] So so what you could say is that, well,
[00:57:25] cognition is a separate thing from emotion.
[00:57:28] It's just that sometimes when we're talking about emotion,
[00:57:33] we're not realizing that cognition is also coming along for the ride.
[00:57:37] OK, fine.
[00:57:38] But you're still embracing the dichotomy.
[00:57:40] You're still saying like these are two things.
[00:57:42] These are two ingredients.
[00:57:43] And it just turns out that when you use one ingredient,
[00:57:46] the other one like either necessarily has to come along or or just
[00:57:50] contingently comes along.
[00:57:52] But but people would say like there is no
[00:57:54] just clear distinction between cognition and emotion.
[00:57:57] And it's like, well, then just fucking stop saying cognition and emotion.
[00:58:00] Like come up with another
[00:58:04] Con... cog-emotion.
[00:58:05] I don't know.
[00:58:06] Like if you really believe that there's no distinction,
[00:58:08] it's hard to overemphasize how much these kinds of dichotomies and
[00:58:14] distinctions are at the roots, the core, the foundation of the research
[00:58:20] that is being done.
[00:58:22] Like so really questioning them is is questioning something pretty foundational.
[00:58:28] You're really bringing the house down if you start to challenge this,
[00:58:33] which is probably another reason why
[00:58:36] you're closer to the center of that Quinean web
[00:58:41] of belief than people would like. It's the Quinean web,
[00:58:45] You know where it's like there's certain things core beliefs
[00:58:49] that if you challenge those, it'll destroy all your beliefs,
[00:58:54] your whole web of beliefs.
[00:58:55] And then there's ones on the fringe where
[00:58:57] if you change your mind about those, it doesn't really affect that much of the
[00:59:01] rest of the web.
[00:59:02] But if you go into the center of the web,
[00:59:04] like the whole web comes apart and you have to like start all over essentially.
[00:59:09] Right.
[00:59:10] Right.
[00:59:10] And so I can't overemphasize how much Kahneman and
[00:59:14] Tversky and others, the judgment and decision making folks,
[00:59:17] how much that system one system two has influenced not just the field,
[00:59:24] but applied work, the work the government is doing, work that even I do in
[00:59:30] consulting. And the core idea there is
[00:59:39] that look, we have a number of heuristics that we use,
[00:59:43] the shortcuts to thinking that we use.
[00:59:45] And so that's what Kahneman and
[00:59:46] Tversky sort of famously introduced, or at least popularized: a catalog of heuristics,
[00:59:54] which are just mental shortcuts.
[00:59:56] The idea was always that, hey, your mind, you know, we have a limited
[01:00:01] amount of cognitive resources.
[01:00:04] We can't think deeply about every single decision we make.
[01:00:08] So perhaps what has evolved, or maybe what we've learned, are these just
[01:00:14] general rules to help us avoid having to deliberate every time we make a
[01:00:19] decision. So I think something like a stereotype would be a heuristic.
[01:00:23] All kinds of these heuristics that influence our judgment
[01:00:29] that are automatic and they're quick.
[01:00:30] And the reason that we have them is because they work on average.
[01:00:34] But the key message is that they lead to error.
[01:00:39] So they lead to bias.
[01:00:40] So Kahneman and Tversky, their career was built on showing under what conditions you
[01:00:46] could show that people fuck up really badly by relying on these heuristics.
[01:00:50] And so the goal, which is not a bad goal, like say for like public policy,
[01:00:54] why do people make bad decisions, say about savings
[01:00:59] or about organ donation or whatever?
[01:01:03] It's because they're relying on these heuristics.
[01:01:05] So if only we can get people to use system two, that deliberate
[01:01:09] process, then maybe we can stop people from making errors.
[01:01:12] And that's exactly what, say, Josh Greene believes about consequentialism.
[01:01:16] Right? If only we could get people to stop going with their gut,
[01:01:19] we could get them to realize that consequentialism
[01:01:23] is the right theory to endorse.
[01:01:26] Right. On Josh Greene's view, the evidence against
[01:01:30] deontological judgments is the fact that we make them unconsciously and
[01:01:36] automatically. Or at least that's some of the evidence against them.
[01:01:41] You know, and we've evolved to do it and there's that whole element of it.
[01:01:47] But that's the idea is if this comes from this automatic heuristic
[01:01:53] that we have to save cognitive resources, then that gives us reason
[01:01:58] to doubt the truth of it.
[01:02:00] Right. And interestingly, like what got lost along the way
[01:02:05] was the claim that the heuristics
[01:02:12] often do lead to true judgments.
[01:02:15] Right. Like, that's why they exist.
[01:02:16] And so, like, there is nothing
[01:02:21] inherent in the process of applying a heuristic that means
[01:02:27] that it's wrong, right? It could be truth tracking.
[01:02:30] Like, you know, it's like you need a different set of evidence
[01:02:34] to show when it's right and when it's wrong.
[01:02:37] And the flip side, which they describe very well in this paper,
[01:02:42] a lot of the time when we're reasoning and
[01:02:44] deliberating, we're very prone to make errors, because there's all sorts of things
[01:02:49] like, you know, reasoning errors, motivated reasoning,
[01:02:55] all sorts of different ways in which we can convince ourselves
[01:02:58] that a belief is true using our reason.
[01:03:03] But we won't think of it that way.
[01:03:05] We'll think of it as we are deliberating using this,
[01:03:10] using our reason, our rational capacities towards the truth when, in fact,
[01:03:16] that's not what we're doing.
[01:03:17] Right. Somehow it got conflated that the process of deliberation
[01:03:23] is akin or equal to valid reasoning.
[01:03:27] And that's just obviously not the case.
[01:03:31] And like we've talked many times on this podcast about like the stuff
[01:03:34] on motivated reasoning that when when you get people who are especially good
[01:03:39] at reasoning, they're able to convince themselves better that they're
[01:03:45] right. So the bias is actually accentuated, not attenuated.
[01:03:49] So like I'm convinced by the paper that we should abandon this.
[01:03:52] And I think they oversell how much people disagree with this.
[01:03:56] I think that like a lot, a lot of people are really dissatisfied with this
[01:04:01] dichotomy in theory.
[01:04:03] It's just that and this might get us to the next topic.
[01:04:06] It's hard
[01:04:08] to think of things in like 16 cells versus two.
[01:04:16] Yeah. Right.
[01:04:17] Like it just is it's harder.
[01:04:20] And so I don't you know that this has always been my fear about psychology
[01:04:25] that if you really want to understand why, for instance, we make the judgments that
[01:04:29] we do, we're going to need to understand interactions between like 50 different
[01:04:34] variables and at that point, like only a computer will be able to to
[01:04:40] like figure out the underlying principles if there are any of our judgment.
[01:04:45] But I've been living in this world
[01:04:47] of
[01:04:49] experimental social psychology where at best you have like
[01:04:55] what we call a two-by-two design, four cells.
[01:04:57] So you say like I gave people coffee versus didn't and I gave them an easy versus
[01:05:03] a hard test and then I put up a graph and it's really, really easy to grok
[01:05:07] because it's just like, OK, two here, two here, I can see the interaction.
[01:05:12] And that just I just don't think that that can be getting us very far.
[01:05:18] And that corresponds with the graphs, you know, of two axes.
[01:05:22] And like there's all everything like our methods and our tools are designed
[01:05:28] for things to be in twos.
[01:05:32] And that could just not be the way the world works.
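Dave's coffee-and-test example can be sketched as a toy simulation. The effect sizes, noise level, and sample size below are all invented, just to show how an interaction is read off the four cell means of a two-by-two design:

```python
import random

random.seed(0)

# Toy 2x2 design: coffee (yes/no) x test difficulty (easy/hard).
# All effect sizes are made up for illustration.
def score(coffee, hard):
    base = 70.0
    base += 5.0 if coffee else 0.0             # main effect of coffee
    base -= 10.0 if hard else 0.0              # main effect of difficulty
    base += 8.0 if (coffee and hard) else 0.0  # interaction: coffee helps more on hard tests
    return base + random.gauss(0, 2)           # per-subject noise

# Average 200 simulated subjects per cell.
cells = {}
for coffee in (False, True):
    for hard in (False, True):
        scores = [score(coffee, hard) for _ in range(200)]
        cells[(coffee, hard)] = sum(scores) / len(scores)

# The interaction is a difference of differences across the four cells.
interaction = (cells[(True, True)] - cells[(False, True)]) - (
    cells[(True, False)] - cells[(False, False)])
print(round(interaction, 1))  # close to the built-in 8-point interaction
```

The point in the episode is that four cells like these are easy to grok on a two-line graph; a fifty-variable version of the same exercise would not be.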
[01:05:34] Like I've been looking for people to commiserate about this for a long time now.
[01:05:39] And I'll never forget one of my favorite
[01:05:42] quotes from a colleague of mine, Piotr Winkielman, who is at UC San Diego.
[01:05:49] It was like 14 years ago, I remember, because my wife at the time was pregnant.
[01:05:54] I'm talking to him about my dissatisfaction with this
[01:05:57] and he's just thinking about it and he's like sort of tapping his cheek
[01:06:02] with his fingers, like looking away.
[01:06:04] And I'm like, oh, he's a smart guy.
[01:06:06] He's going to say like he's going to really school me.
[01:06:08] And he goes, yes, I mean, what if there are five things?
[01:06:14] Like.
[01:06:15] So let's let's let's divide the rest of the discussion into two parts.
[01:06:23] I actually didn't even mean that as a joke.
[01:06:25] So the first is like a possible defense of what people are doing,
[01:06:31] which I think you gestured and we should talk about.
[01:06:33] And I have an idea about that.
[01:06:36] And the second is where do we go from here?
[01:06:39] You know, once we recognize this.
[01:06:41] My quick defense is that when you're doing,
[01:06:45] say you are doing applied work where you're trying to get people to make better
[01:06:50] judgments, so better judgments about how they handle their finances,
[01:06:54] or about how they make choices about what to purchase.
[01:06:59] Maybe even how they make choices about who to date.
[01:07:03] And you have evidence that people systematically make these errors
[01:07:08] that even they themselves realize are errors and say that you show that if you
[01:07:14] give them the opportunity to really think about it,
[01:07:17] that they can correct those errors, then you might be able.
[01:07:22] You can deploy this theory not as as a deep theory about how the mind works,
[01:07:28] but rather as a surface pragmatic theory about how to correct judgment under
[01:07:34] certain circumstances.
[01:07:35] So if I if I know for a fact, for instance,
[01:07:37] there's a lot of good work on poverty by a guy named Eldar Shafir.
[01:07:42] One of the things that he points out is that
[01:07:45] just being financially burdened and I think we've both been there.
[01:07:50] You're under a lot of stress, a lot of pressure.
[01:07:53] And under those circumstances, so like a lot of say single parents, right?
[01:07:58] One of the reasons that they don't even open a bank account is that they
[01:08:01] don't have time.
[01:08:03] They don't have the ability to set their kids aside to go to the bank to
[01:08:08] talk to somebody about opening an account.
[01:08:11] Yeah. Yeah.
[01:08:12] And and even the stress and the pressure of knowing that you're
[01:08:17] you're going to overdraft, like your check isn't going to clear.
[01:08:20] That's like an incredible amount of
[01:08:22] what we would call cognitive load, but which just means there's a lot
[01:08:25] of shit on your mind that you can't handle.
[01:08:27] And so your other decisions might suffer.
[01:08:30] So if if you really want to understand maybe the conditions under which you can
[01:08:34] maximize good decisions, but not all of them, but at least some of them,
[01:08:41] then then using a dichotomy might actually be fine.
[01:08:45] Like it might actually be the case that if so long as you specify
[01:08:48] under under which conditions thinking hard about something might lead to a better
[01:08:52] life, then you can go ahead and try to plan interventions for that.
[01:08:56] But that's that's an applied theory and not one that's discovering new things
[01:09:01] about the mind, it's just helping fix a problem.
[01:09:04] It's more of an engineering problem than it is a psychological theory.
[01:09:10] So that doesn't seem to me to be a defense of what's happened,
[01:09:16] given that that's not how this
[01:09:20] dichotomy has operated. It's more a productive way of possibly taking some
[01:09:24] of those ideas and... right.
[01:09:29] Like yeah, and I fear I actually fear that the thing that I just said is what a lot
[01:09:35] of people have in mind, but that crept its way into a true descriptive theory about
[01:09:40] the mind. I think that somewhere it actually
[01:09:47] like made people think that this is this is the right way of understanding how
[01:09:52] judgment works in general.
[01:09:54] And I think that that's that's right.
[01:09:57] So it's a it's a real tepid defense.
[01:09:59] It's like only very specifically do I think that maybe it's useful.
[01:10:05] Yeah. So here's my defense, very qualified and I'm not sure if I buy it.
[01:10:10] But so here's a good example of a dichotomy in philosophy that's just
[01:10:15] really weird: compatibilism and incompatibilism about free will
[01:10:20] and moral responsibility. It's just weird that the camps got divided into those two
[01:10:25] things, where libertarians are lumped with skeptics as incompatibilists.
[01:10:33] You know, like a compatibilist like Daniel Dennett is lumped with
[01:10:37] Strawson, and I mean it's just a silly and bizarre way for that
[01:10:43] debate to have been shaped up.
[01:10:45] But maybe the defense is look, we had to start somewhere.
[01:10:49] We had to find a way of talking about this and we can't, you know, start out with
[01:10:55] an like a massive, massively complex topology.
[01:11:00] Then we'll just never go anywhere by starting with this kind of framework that
[01:11:05] allowed us to tease out distinctions and all sorts of ways in which, you know,
[01:11:10] the different theories can be formed and the Strawsonian compatibilism is
[01:11:16] different from, you know, Frankfurt's compatibilism or Susan Wolf's kind of
[01:11:21] compatibilism and but the dichotomy as sort of silly and over simplistic
[01:11:29] as it may look to us now was necessary to get us thinking about this in more
[01:11:35] complex ways.
[01:11:36] Right, maybe and it's like, do I reject or accept e-categoricalism?
[01:11:42] I mean, because we were posed with that question, now we can make real progress
[01:11:50] on consciousness.
[01:11:51] I just want to report that I almost got a spit take out of Tamler.
[01:11:57] Yeah, no, that's interesting because I think like here's what might be going
[01:12:02] on like say we just really have the desire to think in twos
[01:12:08] and then you keep splitting, so it's essentially like a chart growing exponentially.
[01:12:13] And I think just sociologically, the problem is that now you have like 64
[01:12:21] different positions, but most people are just talking to each other about
[01:12:25] the one versus the other.
[01:12:27] And it was really hard to integrate any of this stuff into like a general theory.
[01:12:33] You know what?
[01:12:33] Honestly, if somebody did write an article about the 64 varieties of
[01:12:37] noncognitivism, I'd be like,
[01:12:43] you know what?
[01:12:45] I'll reciprocate the oral sex.
[01:12:47] So that's fine.
[01:12:49] Just I don't want to have to read that paper.
[01:12:51] Either either you will or you won't.
[01:12:58] Yeah, you painted that as a binary cut.
[01:13:00] I mean, you could just lick it once.
[01:13:06] There's all sorts of complex shadings.
[01:13:08] That's right.
[01:13:09] If you use mostly hand and a little bit of mouth, is that oral sex?
[01:13:13] Yeah.
[01:13:14] So I don't think it's a limitation in our in our abilities.
[01:13:20] I think we can represent like just like, you know, they do in this paper.
[01:13:25] You can represent 16 different ways of thinking of the mind.
[01:13:31] I just fear that people will always fall back on on the two.
[01:13:36] They'll say exactly like as you were saying, sort of jokingly.
[01:13:39] Well, yeah, yeah, I know.
[01:13:40] But like for now, let's just think of this.
[01:13:42] Well, this is an intervention paper, though.
[01:13:45] The way they're doing it is to critique.
[01:13:48] It is to critique a certain way of viewing things. The challenge will be
[01:13:54] now trying to take this more complex map and create some better
[01:14:01] framework or way of thinking about the mind.
[01:14:04] Yeah, right.
[01:14:05] So, you know, I mean, I took it a little bit as a positive claim to say like,
[01:14:08] let's look at these 16.
[01:14:09] Right. They review research that has has sort of separated these processes.
[01:14:14] No, but that's what I mean.
[01:14:15] But it's in a critical way like to show, hey, look,
[01:14:18] you can be intentional and yet unconscious,
[01:14:25] or whatever the various combinations they come up with.
[01:14:29] But that's not part of a larger theory.
[01:14:32] That's to show that your theory is false.
[01:14:35] It's their counter.
[01:14:38] But but it but it's also providing a positive description of how the mind works.
[01:14:42] But you're right.
[01:14:43] It is in an attempt to show that this dual process theory is wrong.
[01:14:47] I like I wanted to say, actually, it reminds me of something that that I didn't
[01:14:52] talk about, which is there's always been in social psychology, a really dominant
[01:14:59] set of theories about attribution where the question is,
[01:15:02] right, Tamler, you behave in a certain way.
[01:15:04] Like what do I infer from your behavior?
[01:15:08] Right. So you bumped into me.
[01:15:10] Are you an asshole or not was right.
[01:15:13] And so the distinction that people have always made is something like
[01:15:21] what was it situational or dispositional?
[01:15:23] Is did it emanate from your disposition or was it just a product of the external
[01:15:29] forces working on on your action, Stanford Prison experiment?
[01:15:34] Exactly. Right.
[01:15:34] Situations or was it the fact that they were brutal people?
[01:15:38] Yeah. And social psychologists just go nuts over this kind of distinction.
[01:15:43] And I remember feeling that like it really broke down when when I was trying to study
[01:15:49] moral judgment and I was studying how people make inferences about emotional actions.
[01:15:55] So say that you
[01:15:58] you donated to charity because you were so overcome with emotion when you saw
[01:16:04] a commercial of starving children that you gave money.
[01:16:08] Like is that a situational force or is it a dispositional force?
[01:16:14] Yeah, like it's pretty clear that it's both.
[01:16:16] Right. It's not it doesn't like an emotion can generate from the kind of person
[01:16:21] that you are like you're the kind of person who is moved by by this.
[01:16:25] And these are distinctions that philosophers have made for a long time.
[01:16:28] And yet social psychology still like like reviewers would be like,
[01:16:33] I feel like it's situational versus dispositional.
[01:16:36] Yeah. That's so I totally agree.
[01:16:39] And I certainly have, you know, I used to teach a section on
[01:16:44] situationism and I was, you know, and again, that thing of now I know obviously
[01:16:51] there's a little bit of both.
[01:16:52] So you do a little hand wavy stuff to to say that you acknowledge that this is
[01:16:57] a spectrum and not, you know, but then you then like immediately forget it.
[01:17:02] Here's another kind of thing.
[01:17:03] And I wanted to ask you about this because I was just commenting on a paper
[01:17:06] that was saying that like the emotion anger has a behavioral.
[01:17:10] That's the action tendency.
[01:17:12] Yes, action tendency.
[01:17:14] The action tendency.
[01:17:16] So anger people want to punish, make suffer, you know, engage in acts of retribution.
[01:17:24] So whenever you're angry, that is.
[01:17:25] Yeah.
[01:17:26] And I just couldn't believe that now I might be wrong and maybe I am wrong.
[01:17:33] I couldn't believe that that psychologists really believed that.
[01:17:36] Like there's so many different kinds of anger and there's a lot of times where
[01:17:40] you're angry and you don't want to hurt the person, you don't want to like them
[01:17:43] to suffer in any way.
[01:17:45] You don't want like even to feel guilty, you just are pissed off.
[01:17:49] And maybe you want something to change or maybe you want them to acknowledge
[01:17:52] something or maybe, you know, you want them to apologize.
[01:17:56] But this idea that there's only one action tendency for something
[01:18:01] like anger seemed crazy to me.
[01:18:03] Yeah.
[01:18:04] And like, there is a lot of that, which is, you know,
[01:18:10] somebody
[01:18:12] spelling out the prototypical kind of emotional response.
[01:18:17] So like fear makes you want to run away.
[01:18:20] Right.
[01:18:20] But obviously that's also not not always the case.
[01:18:25] And I think that this is so like there is this sort of self
[01:18:30] perpetuating thing where you then bring people into the lab and do an experiment
[01:18:36] where you piss somebody off and you show that they are actually more likely
[01:18:41] to be mean to somebody else.
[01:18:43] And you say this is evidence that the action tendency for anger is to punish.
[01:18:48] But you haven't bothered to test a different kind of anger, right?
[01:18:51] You've been so driven by this view that
[01:18:57] you haven't actually even thought of the variety of ways in which you could test
[01:19:01] this, you know, Lisa Feldman Barrett actually in her she she's really opposed
[01:19:05] to this view of emotion.
[01:19:06] She says even something like fear, we think of a prototypical instance like,
[01:19:12] you know, you're in the fucking savannah and you see a lion, right?
[01:19:16] That's like what we think of as fear.
[01:19:18] And she points out that that there are so many different kinds of fear experiences
[01:19:25] that I think maybe she takes it too far.
[01:19:28] But because she argues that maybe this is just purely
[01:19:34] arbitrary, that we lump all of them together under a certain word.
[01:19:39] But I do take her point that it masks differences to use only one word, right?
[01:19:44] Just in the same way that calling something system one will mask differences
[01:19:49] because you've lumped it into one word.
[01:19:50] And like, the way that this gets embedded: so let's say you want to find
[01:19:56] out if somebody is angry, but you can't measure their anger directly.
[01:20:02] So you'll maybe give them a questionnaire
[01:20:06] that says that they want to punish somebody and then you'll infer from
[01:20:10] that that they're angry because you've established that link.
[01:20:14] So it's like, well, the two things go together.
[01:20:18] So like if you can show one, that's a sign that they have the other.
[01:20:22] And then, you know,
[01:20:24] that kind of stuff gets done all the time.
[01:20:27] I know, I know that you probably won't get this reference
[01:20:29] because you don't watch Seinfeld, but there is a hilarious episode where
[01:20:33] Uncle Leo, who is Jerry's Jerry's uncle, burns off his eyebrows.
[01:20:39] And Elaine is trying to sneak into his doctor's appointment because she can't
[01:20:44] get a doctor's appointment and she paints eyebrows on him.
[01:20:47] But the way that she paints them makes it makes him look like he's making an anger
[01:20:51] face. Yeah. And then when the doctor comes in, he's like, Leo, calm down.
[01:20:55] Like he's like, what are you talking about?
[01:20:56] He's like, Leo, why?
[01:20:59] Just reminds me of exactly what you said.
[01:21:01] The presence of a superficial feature does not necessarily indicate that feature.
[01:21:06] That's right.
[01:21:08] Yeah, no, so these
[01:21:15] people who talk about action tendencies are often appraisal theorists.
[01:21:19] So they think that there are some judgments that give rise to an emotion.
[01:21:23] So what it means to be angry is that you feel that somebody has done a blameworthy thing.
[01:21:31] So you feel that they had control over what they did and it was harmful to you.
[01:21:36] And sometimes they blur the line between
[01:21:42] just what anger means, like what we generally take the sentiment anger to be
[01:21:49] describing, and an empirical finding about when I anger somebody,
[01:21:55] because you have to make a conceptual definition, or else
[01:21:57] how do I anger somebody?
[01:21:59] So so yeah, like I
[01:22:03] it is problematic across a number of fields.
[01:22:07] And I'm sure it's true in like history or political theory.
[01:22:11] There's probably a lot of this these kinds of dichotomies that are blurring things.
[01:22:16] And, you know, I think the question is whether it's more helpful than harmful.
[01:22:21] Like I think there are probably times where it's helpful.
[01:22:23] But there are times where it's it's masking important distinctions that
[01:22:29] then just multiply like now you're linking this phenomenon with a different kind of
[01:22:34] phenomenon and, you know, the oversimplifications are multiplying to the point where
[01:22:42] you're just not you're no closer to understanding the thing that you're trying
[01:22:47] to understand than you were before you even started any of this research.
[01:22:53] Right. That's and that's a depressing conclusion.
[01:22:56] But but it might be where we are with some with some things.
[01:23:00] Yeah, yeah.
[01:23:02] So so like why so why are we like this?
[01:23:05] I think there's like a trivial answer to why we dichotomize.
[01:23:11] But then there's maybe more interesting ones like it's easy.
[01:23:15] Like, the simplest answer is that it's just very,
[01:23:19] very easy to think about two things.
[01:23:23] And and maybe there's a more interesting
[01:23:27] there's an interesting body of research about numeracy
[01:23:31] that you might be familiar with.
[01:23:32] But the idea is that
[01:23:35] in some human cultures,
[01:23:40] at least early human cultures, and even in infants and some animals,
[01:23:45] the ability to track numbers is very limited.
[01:23:50] That is, they don't have numbers.
[01:23:52] It took human beings a long time to invent like, you know, integers.
[01:23:58] But people are able to track one, two and many.
[01:24:04] So in some cases, one, two, three and many.
[01:24:07] And and that doesn't seem to require
[01:24:11] anything like a complex numerical system.
[01:24:14] That's just sort of like our brains are capable of that without any without any
[01:24:19] real training. And so it makes sense that
[01:24:23] that we would favor the numbers two and three, you know, they pop up all over the place.
[01:24:30] Like this is something deeply embedded in us.
[01:24:34] I think the other reason is that often many of the important decisions we would
[01:24:39] have made are dichotomous.
[01:24:41] So like you have to decide whether or not to do something.
[01:24:44] Right. And so like take good and bad.
[01:24:48] I was talking recently to Molly Crockett.
[01:24:51] She came and gave a talk here and she was talking about
[01:24:54] inferences, moral inferences about people and some of the work that I did with her too.
[01:24:59] Like we tend to categorize people as good or bad.
[01:25:02] And we do it really easily and we do it a lot.
[01:25:06] And the question I was asking her is like, it's weird.
[01:25:10] We both study this and we both know that people do this.
[01:25:12] But like it's also obviously true that nobody is either good like all good
[01:25:17] or all bad, like there's no such thing as just a true villain.
[01:25:22] And we were talking about, you know,
[01:25:24] if if the decision really is do I want to cooperate with you?
[01:25:32] Like I have to boil it down to a dichotomous decision, either yes or no.
[01:25:37] And so categorizing people into yeses and noes
[01:25:41] might actually be a very, very efficient way of maintaining sort of this
[01:25:47] your ability to to gather resources from other people.
[01:25:52] Because you do have to make decisions, otherwise you'd be paralyzed.
[01:25:55] Yeah, the analogy that I that I really like, I think I've brought this up before.
[01:26:00] There's a website, I think called Umbrella or not.
[01:26:04] Have I brought this up before?
[01:26:06] Yes, many, many times.
[01:26:08] I probably edited it out every single time, every single time.
[01:26:11] But in this case, don't edit it out because it is an exactly
[01:26:15] right kind of analogy for why dichotomous reasoning is important,
[01:26:20] because it's hard to know what it means
[01:26:24] when it says 35 percent chance of rain or 70 percent chance of rain.
[01:26:27] All I want to know is like, do I take an umbrella or not?
[01:26:31] That's like this is the decision that I have to make.
[01:26:34] It's very hard for human beings to
[01:26:37] process that well, on these kind of days, 70 percent of the time it rained.
[01:26:42] Like, what do I care?
[01:26:43] Just tell me whether or not to wear a fucking raincoat.
[01:26:48] And I think that.
[01:26:50] I've probably said this too, but maybe I added this out too.
[01:26:54] But this is the thing that drives Nate Silver crazy.
[01:26:57] Like when he says Hillary had a 70 percent chance of winning,
[01:27:03] people just took that as a 100 percent chance.
[01:27:07] And so there was like, you got it wrong.
[01:27:09] It's like, no, I didn't like three out of ten times.
[01:27:12] This is what would happen.
[01:27:14] Of course, Nate Silver, you're just backing off from your claim.
[01:27:17] It's funny. It's funny.
[01:27:18] We don't understand it.
[01:27:20] And I sometimes don't.
[01:27:21] I don't understand what it means
[01:27:23] if somebody has a 30 percent chance of winning in a one-time event.
[01:27:28] Yeah, you know, it's too hard.
[01:27:30] It's like it's actually kind of meaningless.
[01:27:32] Like what it means is over time.
[01:27:35] Exactly. What it means is over time.
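The "over time" reading of a forecast, and the way people collapse it into the dichotomous answer they actually want, can be sketched like this; the umbrella and soaked costs are made-up numbers:

```python
import random

random.seed(1)

# A calibrated "70 percent chance of rain" forecast, simulated over many days:
# roughly 30 percent of such days stay dry, which is all the number means.
trials = 100_000
rainy = sum(1 for _ in range(trials) if random.random() < 0.70)
print(rainy / trials)  # roughly 0.70

# Turning the probability back into the dichotomous answer people want
# ("umbrella or not") is just a cost threshold. The costs are invented.
COST_CARRY = 1.0   # hassle of lugging an umbrella around
COST_SOAKED = 5.0  # misery of getting rained on without one

def take_umbrella(p_rain):
    # Take it whenever expected misery without it exceeds the carrying cost.
    return p_rain * COST_SOAKED > COST_CARRY

print(take_umbrella(0.35), take_umbrella(0.10))  # True False
```

The single-event case is exactly what makes this hard: no one day "is" 70 percent rainy, but the decision rule still has to spit out a yes or a no.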
[01:27:37] And for a lot of things, especially those things that are just a single event,
[01:27:42] it's hard to then sort of figure out.
[01:27:44] That's why it's good to have rules like when you're playing 21 Blackjack.
[01:27:49] Right? You know that it's probabilistic.
[01:27:51] You know that if you have a 16, like you may or may not win.
[01:27:55] But like to have a rule that keeps you in check so that over time you do better.
[01:28:00] Like the rule is a dichotomous one.
[01:28:03] It's like and people treat it almost as just like absolute truth.
[01:28:07] Like no, you never hit on whatever 17 or something.
[01:28:10] Yeah.
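The blackjack rule is exactly this kind of dichotomous compression of a probabilistic situation. A minimal sketch of the crude version described here (real basic strategy also looks at the dealer's up card and soft hands):

```python
def should_hit(hand_total):
    # The dichotomous table rule: hit on 16 or less, stand on 17 or more.
    # It throws away the underlying probabilities and hands back a yes/no
    # that does better than gut feel over many hands.
    return hand_total <= 16

print(should_hit(16), should_hit(17))  # True False
```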
[01:28:12] So it is important for that.
[01:28:15] And then it really probably just depends on the case whether it does more harm than
[01:28:19] good to divide things up in a certain way.
[01:28:23] Like I think in the punishment debate,
[01:28:24] it's probably done more harm than good that it immediately got divided into
[01:28:30] these two kinds of approaches to the expense of all sorts of other
[01:28:36] possible ways and also, you know, in ways that are are masking how criminal justice
[01:28:44] really works and like what is actually going on, which has nothing to do with
[01:28:49] this simple dichotomy of are we pushing you in a retributive way or a utilitarian way?
[01:28:54] Right.
[01:28:54] And so but people think of punishment that way and that's been really harmful.
[01:28:59] But in other cases, maybe it's been a helpful first step to divide things up
[01:29:05] pretty neatly. Maybe compatibilism and incompatibilism was like this, where it's
[01:29:10] actually it was a necessary way to try to wrap our heads around an immensely complex
[01:29:16] problem to just start out with something and then with that framework.
[01:29:22] Yeah.
[01:29:22] Start to tease out the complexities.
[01:29:24] I mean, your field is like that.
[01:29:26] Right.
[01:29:27] The way the way that like the analogy that comes to my head is like suppose
[01:29:31] that you have like a big pile of like a bunch of different things,
[01:29:34] say Legos and you want to organize them.
[01:29:37] It's not unreasonable to start with like, OK, I'm going to just at least my first
[01:29:41] pass is going to separate like the big blocks from the little ones.
[01:29:44] Right.
[01:29:45] Like and so you start there so that you can then refine because that actually
[01:29:49] might save you time.
[01:29:50] If you have to make a decision about whether this is one
[01:29:55] of eight things, like that might actually take you a longer time than if you
[01:30:00] say two, then two, then two.
[01:30:02] Right.
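The Lego point, that successive two-way cuts can stand in for one many-way decision, can be sketched as binary coding: three yes/no questions pick out one of 2**3 = 8 bins. The feature names in the comment are invented:

```python
def bin_index(answers):
    """Map a sequence of yes/no answers to one of 2**n bins."""
    index = 0
    for yes in answers:
        index = index * 2 + (1 if yes else 0)  # each dichotomous cut halves the pile
    return index

# e.g. (big?, plastic?, red?): three two-way passes instead of
# one eight-way comparison.
print(bin_index((True, False, True)))    # 5
print(bin_index((False, False, False)))  # 0
```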
[01:30:03] But I agree with you that it stifles innovation and creativity
[01:30:07] to rely on just dichotomous categories, like the punishment example you brought up.
[01:30:13] But, you know, I think in a lot of cases,
[01:30:16] it just leads people to ignore what might be staring them in the face,
[01:30:21] which is that there is a richness to to the world or to the conceptual world
[01:30:26] that is being completely ignored because of reliance on these categories.
[01:30:32] Yeah, I mean, so take the analogy with cars. If for some reason people had
[01:30:38] grouped cars into two types, you know, the ones that are
[01:30:42] convertible and manual and American and gasoline,
[01:30:48] versus electric, foreign, and whatever,
[01:30:52] that wouldn't have been helpful at all.
[01:30:55] Like any further research on cars and classifying cars
[01:31:02] would have just been completely corrupted by that arbitrary
[01:31:08] way of grouping them.
[01:31:11] And yet, you know, so like that really would have been a just
[01:31:16] something that distorted your understanding of cars, not something that was
[01:31:21] the first step towards understanding cars better.
[01:31:24] Right, it's hard to know when it's been,
[01:31:27] when it's an arbitrary classification unless you do a little bit more digging
[01:31:31] to find like, is there anything that is generating this dichotomy that I brought up?
[01:31:37] Like, oh, this has a this is a gasoline car so we can infer from that
[01:31:41] that it's also a hard top car.
[01:31:44] And then we run this experiment on hard-top cars and just say
[01:31:49] this is true of gasoline cars, and then
[01:31:54] that's your established finding.
[01:31:56] Gasoline cars have this feature, because it was the hard top that they ran
[01:32:01] the experiment on, and you generate a lot of work, so it makes it seem like you're progressing.
[01:32:05] I think this is what's happened in dual process theories of the human mind.
[01:32:08] Like you you catalog all this work that you've been doing.
[01:32:12] You're like, no, like, you know, like we're moving beyond just these two,
[01:32:15] but like we're making progress.
[01:32:17] Don't worry.
[01:32:18] And it's like, no, you could actually
[01:32:21] be completely fucking it up.
[01:32:22] Yeah, that's what I mean.
[01:32:23] Like this I guess it's whether it's at the core of the web or at the fringes.
[01:32:28] Like it could be at the core with dual process stuff.
[01:32:32] And if it's that then this whole
[01:32:35] edifice comes tumbling down of results and findings and because they've been
[01:32:41] relying on that dichotomy right in all their experiments.
[01:32:45] And then confirmation bias kicks in, you know, and it's really hard to discard
[01:32:51] to discard that feature of the theory.
[01:32:53] The last thing I was going to say is that like I think that thinking in dichotomies
[01:32:58] at some point in the history of humanity just was much like it was fine for a lot
[01:33:04] of things that could just be that the world was such that dichotomous decisions
[01:33:09] dominated and so we weren't making dumb mistakes all the time.
[01:33:14] But now like the world is so complex and there's so many things.
[01:33:18] That are actually, you know, they're multiple.
[01:33:22] Everything has multiple attributes, a lot of them which are continuous.
[01:33:27] And it's just really hard to come up with with like how to evaluate this stuff.
[01:33:33] This is why like the websites like the wire cutter have you ever been on the wire cutter?
[01:33:37] No, like all it if you basically if you want to know like what the best
[01:33:43] air purifier is, it just tells you.
[01:33:45] It just says buy this one, right?
[01:33:47] It doesn't like list all of them and give them ratings because that's like what?
[01:33:50] Yeah, it's a great idea.
[01:33:52] That's perfect. Yeah.
[01:33:54] Good. Yeah, I need I'm definitely going to go to that.
[01:33:57] Yeah, I mean there's a couple.
[01:34:00] So it could just be that we have these blinkered lenses on now and we're too
[01:34:06] deep in to do anything about it.
[01:34:09] You know, we could start being like
[01:34:11] Kantians and start to think, no, this is our conceptual way of understanding.
[01:34:16] Like, this is our conceptual framework,
[01:34:19] and so it's transcendentally true because these are our categories or
[01:34:25] something. But in terms of really understanding it the way we like to think,
[01:34:32] it could literally be that that's not like we're not capable of at this point
[01:34:39] fixing this.
[01:34:40] This is like the truth will come from
[01:34:43] from machines and algorithms and we'll just be slaves to the truth that they're
[01:34:49] tracking, like they'll be right more and more often than not.
[01:34:52] Like now we're relying on models to predict
[01:34:55] that do way, way better than human judgment.
[01:35:00] And this is like for a conversation for another time.
[01:35:03] But when we get back to the question of justice and punishment and the legal
[01:35:06] system and we talk about the role of machines in in actually making decisions,
[01:35:13] say about parole, I wonder if you'll be more open to to an algorithm that can
[01:35:21] take into account a lot more information to generate an optimal judgment.
[01:35:27] Well, we will have to see.
[01:35:31] All right, we've been going on way too long.
[01:35:33] So join us next time on Very Bad Wizards.
