Oxford philosophy professor Will MacAskill joins us to talk about effective altruism, moral uncertainty, and why you shouldn't eat your grandmother (even if consequentialism is true). How should we act when we're not sure which moral theory is the right one? Can we formulate a guide for behavior, modeled on decision theory, that maximizes expected moral value? How do we assign credences to ethical (as opposed to empirical) claims? Why has effective altruism become so popular, so fast, yet at the same time seemed off-putting to many people? Plus, Tamler faces a dilemma when narrating his audiobook, and Dave is the Louis CK of his own backyard.
0:00 - 25:41 Tamler's dilemma and Guilty Confessions.
25:41 - 31:15 Break, contact info, updates, thanks to our listeners and supporters.
31:16 - 1:43:19 Will MacAskill interview.
Special Guest: William MacAskill.
Links:
- William MacAskill homepage
- The Most Efficient Way to Save a Life - The Atlantic
- Doing Good Better: How Effective Altruism Can Help You Help Others, Do Work that Matters, and Make Smarter Choices about Giving Back [amazon.com affiliate link]
- Moral uncertainty - Effective Altruism Concepts
- 80,000 Hours: How to make a difference with your career
- Giving What We Can
[00:00:00] [SPEAKER_01]: Very Bad Wizards is a podcast with a philosopher, my dad, and a psychologist, David Pizarro, having
[00:00:06] [SPEAKER_01]: an informal discussion about issues in science and ethics.
[00:00:09] [SPEAKER_01]: Please note that the discussion contains bad words that I'm not allowed to say, and
[00:00:14] [SPEAKER_01]: knowing my dad, some very inappropriate jokes.
[00:00:17] [SPEAKER_00]: Indeed.
[00:00:18] [SPEAKER_00]: We scholars like to think science has all the answers, but in the end it's just
[00:00:22] [SPEAKER_00]: a bunch of unprovable nonsense.
[00:00:25] [SPEAKER_00]: Son of art.
[00:00:29] [SPEAKER_11]: The Greatest, I'm a very good man.
[00:00:55] [SPEAKER_08]: Brains and U.S.
[00:01:03] [SPEAKER_09]: Anybody can have a brain.
[00:01:10] [SPEAKER_10]: Very good man.
[00:01:11] [SPEAKER_10]: Just a very bad wizard.
[00:01:15] [SPEAKER_02]: Welcome to Very Bad Wizards.
[00:01:17] [SPEAKER_02]: I'm Tamler Sommers from the University of Houston.
[00:01:20] [SPEAKER_02]: Dave, today we are going to have the effective altruism boy wonder, Will MacAskill on this podcast.
[00:01:28] [SPEAKER_02]: Do you think all of our Patreon supporters are going to leave us once this episode comes out?
[00:01:33] [SPEAKER_05]: It's a... it's not... it's an actual worry.
[00:01:38] [SPEAKER_05]: I know.
[00:01:39] [SPEAKER_05]: No, okay. The wonderful thing about utilitarianism is that you can sort of take your best guess at what maximizing would actually be.
[00:01:48] [SPEAKER_05]: And in this case, getting the word out about Will MacAskill is made possible by the generosity of our Patreon subscribers.
[00:01:59] [SPEAKER_05]: And so...
[00:02:01] [SPEAKER_02]: But now once the episode is out, unless they think we're going to have him again...
[00:02:08] [SPEAKER_05]: We could... well, you know, there's Peter Singer on deck.
[00:02:11] [SPEAKER_05]: That's right.
[00:02:12] [SPEAKER_05]: If Will's the boy wonder, Peter Singer is Batman.
[00:02:16] [SPEAKER_02]: We're just going to become like a Silicon Valley effective altruism, like PR.
[00:02:22] [SPEAKER_05]: But then we need to get out some stodgy content.
[00:02:25] [SPEAKER_05]: Like the kind that not even I would listen to.
[00:02:27] [SPEAKER_05]: Like one that's really concerned about what this word meant in German when Kant said it in some letter to somebody else.
[00:02:34] [SPEAKER_02]: Yeah, that'll get the Patreon supporters all fired up.
[00:02:38] [SPEAKER_02]: Speaking of PR, he is very good PR for that movement.
[00:02:44] [SPEAKER_02]: He doesn't sound like a robot at all.
[00:02:46] [SPEAKER_05]: No, no. It's amazing how artificial intelligence has made strides.
[00:02:53] [SPEAKER_14]: Do you think that was like the... what do you call it? Test?
[00:02:58] [SPEAKER_14]: The Turing test.
[00:02:59] [SPEAKER_14]: The Turing test, yeah.
[00:03:01] [SPEAKER_05]: The whole time we thought Will was a human being.
[00:03:05] [SPEAKER_02]: He was like ex machina.
[00:03:07] [SPEAKER_02]: We fell in love with him. That was the thing.
[00:03:09] [SPEAKER_05]: It should be a regular segment, the things that you get wrong about Will MacAskill.
[00:03:14] [SPEAKER_05]: Like we should just always just refer to him as something that he's completely not.
[00:03:20] [SPEAKER_02]: Well, his identity is completely fractured by his commitments.
[00:03:25] [SPEAKER_02]: So I'm always right whenever I miss an identity.
[00:03:31] [SPEAKER_02]: It's Borg...
[00:03:34] [SPEAKER_02]: Borgesian?
[00:03:35] [SPEAKER_14]: Borg-es-ian?
[00:03:37] [SPEAKER_14]: Borgesian?
[00:03:39] [SPEAKER_14]: Yeah, that last one.
[00:03:41] [SPEAKER_02]: Borgesian.
[00:03:43] [SPEAKER_02]: Okay, so before that in our first segment, we're of course going to do our famous and really mind blowing segment,
[00:03:52] [SPEAKER_02]: Guilty Confessions, but first with a little mini dilemma that I thought might be interesting for our listeners.
[00:03:58] [SPEAKER_02]: So I go out to Grand Haven, Michigan to be the narrator for my audiobook that will be coming out.
[00:04:07] [SPEAKER_02]: Yeah, and while I was there, I met up with a listener named Matt.
[00:04:12] [SPEAKER_02]: Had a good time with him, so that was fun.
[00:04:14] [SPEAKER_02]: But anyway, I was narrating the audio book, which was not as terrifying an ordeal as I had worried.
[00:04:24] [SPEAKER_02]: I mean, I thought I'm either going to be really bad at it and I'm just not going to be able to do it or I'm going to hate my book as I'm reading it.
[00:04:38] [SPEAKER_02]: And while I certainly had some problems with it, I didn't hate it.
[00:04:42] [SPEAKER_02]: In fact, I...
[00:04:43] [SPEAKER_05]: It went mildly poorly and you only kind of hated it.
[00:04:47] [SPEAKER_02]: No, I mean, yeah, I don't know. According to the engineer slash director, I was good for an author, but I don't know how good.
[00:04:57] [SPEAKER_02]: Good for an author?
[00:04:59] [SPEAKER_02]: Yeah.
[00:04:59] [SPEAKER_05]: Like good looking for a Jew?
[00:05:02] [SPEAKER_05]: Exactly.
[00:05:05] [SPEAKER_02]: Yeah, so I don't know what the bar is for that.
[00:05:08] [SPEAKER_05]: Were they yelling out instructions like, okay, now this time like you're in pain.
[00:05:12] [SPEAKER_05]: Okay, now do it like you've just come back from a shopping spree at Saks Fifth Avenue and you're in the 1950s.
[00:05:21] [SPEAKER_14]: No.
[00:05:22] [SPEAKER_14]: No.
[00:05:23] [SPEAKER_02]: But yeah, no, they weren't...
[00:05:26] [SPEAKER_02]: You know, I was like, there's not that many voices in my book, but when there are...
[00:05:34] [SPEAKER_02]: Like I had to do like Shakespeare.
[00:05:35] [SPEAKER_02]: Talk about something I was worried about, like me doing Shakespeare.
[00:05:38] [SPEAKER_02]: But you know, like I thought it was fine.
[00:05:43] [SPEAKER_02]: I'm looking forward to it coming out and people giving me shit for it.
[00:05:46] [SPEAKER_02]: Anyway, so towards the end of the book, something that I had been worried about but didn't realize the extent of my worry until it was the day before,
[00:05:59] [SPEAKER_02]: I realized that I have this rap battle at the end between Loaded Lux and Calicoe.
[00:06:08] [SPEAKER_02]: Loaded Lux unfortunately never got back to me about the lyrics.
[00:06:14] [SPEAKER_02]: So I could only quote...
[00:06:15] [SPEAKER_02]: Now actually that's good.
[00:06:16] [SPEAKER_02]: It turns out that's very good because I was very limited in how much I could quote from his lyrics.
[00:06:22] [SPEAKER_02]: But in a couple places there were lyrics, one by him and one by another battle rapper that had the N word in it.
[00:06:33] [SPEAKER_02]: You know with the A at the end.
[00:06:36] [SPEAKER_05]: I know the word of which you speak.
[00:06:39] [SPEAKER_05]: Yes.
[00:06:40] [SPEAKER_02]: And so what am I going to do about that?
[00:06:43] [SPEAKER_02]: Like I got to speak it and it was pretty essential to the meaning of what I was trying to get across.
[00:06:50] [SPEAKER_02]: So am I just going to say it?
[00:06:53] [SPEAKER_02]: Am I going to say it like a white Jew rapping? And then, you know, like, if I could do it well...
[00:07:01] [SPEAKER_02]: But that was out of the question.
[00:07:03] [SPEAKER_02]: I mean that was some of my like maybe my worst narrating was of me trying to do like a battle rap.
[00:07:11] [SPEAKER_05]: Right.
[00:07:12] [SPEAKER_05]: In an ideal world the solution would be if the rappers themselves could recite those lyrics.
[00:07:16] [SPEAKER_05]: Yes.
[00:07:17] [SPEAKER_02]: Yeah.
[00:07:18] [SPEAKER_02]: Like and which wouldn't be impossible they were on YouTube like so I could quote it from YouTube except that I don't think that they...
[00:07:27] [SPEAKER_02]: The rights.
[00:07:27] [SPEAKER_02]: The lawyers care more about rights than about.
[00:07:30] [SPEAKER_02]: And it's impossible because I actually like tried to track down who has control of the rights and stuff like that from the battle, from the rap battles on YouTube for the battle production company that does them.
[00:07:43] [SPEAKER_02]: And like just it was really impossible for me and maybe somebody else could have figured it out.
[00:07:51] [SPEAKER_02]: But I did try and I spent a while doing it.
[00:07:53] [SPEAKER_05]: I even tried.
[00:07:55] [SPEAKER_05]: I even was really trying to track down like who would own the rights to this stuff and it's completely unclear.
[00:08:00] [SPEAKER_05]: Yeah.
[00:08:00] [SPEAKER_05]: It's not even clear to me that there's a DVD I think right.
[00:08:04] [SPEAKER_05]: So like definitely someone owns the rights to the footage of the performance.
[00:08:09] [SPEAKER_05]: But the lyrics themselves like are those things...
[00:08:12] [SPEAKER_05]: If there are any lawyers who listen like...
[00:08:15] [SPEAKER_02]: Yeah I'm curious because you can find the lyrics on any number of lyrics websites.
[00:08:20] [SPEAKER_02]: Yeah.
[00:08:21] [SPEAKER_02]: But basic books was just like you can't quote more than a tiny little excerpt if you don't have the rights for them.
[00:08:28] [SPEAKER_02]: Right.
[00:08:28] [SPEAKER_02]: And I don't know if they were being paranoid but I don't think so.
[00:08:31] [SPEAKER_02]: Anyway, so what did I...
[00:08:33] [SPEAKER_02]: What do I do?
[00:08:34] [SPEAKER_02]: Like in that situation I need to be able to get it across.
[00:08:37] [SPEAKER_02]: Because my first instinct was I should just say it.
[00:08:41] [SPEAKER_02]: Like you know that's what it is.
[00:08:43] [SPEAKER_02]: That's how they would read it in the book.
[00:08:47] [SPEAKER_02]: So why not say it?
[00:08:50] [SPEAKER_02]: But then the more I actually pictured me doing it the less I liked that idea.
[00:08:58] [SPEAKER_02]: And...
[00:08:58] [SPEAKER_05]: Did you know that I would just like take a little audio snippet?
[00:09:02] [SPEAKER_05]: I would just like...
[00:09:04] [SPEAKER_05]: Just lead off every episode with it.
[00:09:06] [SPEAKER_05]: Like...
[00:09:08] [SPEAKER_05]: Completely out of context.
[00:09:11] [SPEAKER_14]: Yeah, VBW no context.
[00:09:14] [SPEAKER_02]: We'd have a field day with that.
[00:09:16] [SPEAKER_02]: So I texted you and I asked the engineer and the engineer and the director said that
[00:09:24] [SPEAKER_02]: he would talk to the people there but nobody really knew.
[00:09:28] [SPEAKER_05]: You should have just asked Kanye West for permission.
[00:09:31] [SPEAKER_02]: I reached out to him but he didn't give me permission either.
[00:09:37] [SPEAKER_02]: So like everybody sort of naturally coalesced around saying it but bleeping it.
[00:09:45] [SPEAKER_02]: Yeah.
[00:09:46] [SPEAKER_05]: Well, I just said a bleep.
[00:09:48] [SPEAKER_05]: Like the saying part is I think the most interesting part of this conversation
[00:09:51] [SPEAKER_05]: because I think that it's a natural...
[00:09:55] [SPEAKER_05]: Although I also suggested just saying the word nickel but that might be just another lost nickel.
[00:10:04] [SPEAKER_14]: There is the nickel.
[00:10:06] [SPEAKER_05]: But I was assuming when I said bleep I meant as you're saying it pause during the time
[00:10:14] [SPEAKER_05]: that that word comes up and then just insert a bleep.
[00:10:18] [SPEAKER_05]: But that's a little weird.
[00:10:19] [SPEAKER_02]: No, that's...
[00:10:20] [SPEAKER_02]: I think you have to like any bleep you usually get like the first little bit of the word.
[00:10:25] [SPEAKER_02]: Oh, so there's like...
[00:10:27] [SPEAKER_02]: Yeah.
[00:10:28] [SPEAKER_02]: Otherwise you know it could be like mother fuck or something like that even though I say fuck other places in the book but...
[00:10:35] [SPEAKER_02]: Yeah.
[00:10:37] [SPEAKER_02]: So I think they have to...
[00:10:38] [SPEAKER_02]: Which is weird.
[00:10:39] [SPEAKER_02]: I mean that's such a strange thing.
[00:10:41] [SPEAKER_02]: Like you're saying it you know what the word is but it's bleep so like it's...
[00:10:48] [SPEAKER_02]: You know there is something that doesn't seem right about that to me
[00:10:51] [SPEAKER_02]: but it seems like maybe least bad of all the different options.
[00:10:55] [SPEAKER_05]: Right, right.
[00:10:57] [SPEAKER_05]: If you had the proper rhythm and cadence my suggestion was to just strategically pause during that word
[00:11:07] [SPEAKER_05]: but your rap skills probably just aren't on point like that.
[00:11:11] [SPEAKER_02]: I feel like even for me I did it badly and although I could have done more takes of it
[00:11:20] [SPEAKER_02]: I just sort of wanted it to be over.
[00:11:23] [SPEAKER_02]: I wanted to move on.
[00:11:25] [SPEAKER_02]: It was also...
[00:11:25] [SPEAKER_02]: It's very close to the end of the book and I was pretty much done narrating.
[00:11:31] [SPEAKER_02]: So like I just wanted to move on but if I could do it over again I would do those...
[00:11:38] [SPEAKER_02]: I'd do a couple of those lines again because in my mind it is going to be really, really bad.
[00:11:46] [SPEAKER_02]: And the engineer was like an ex DJ and so you know he didn't say anything about it afterwards.
[00:11:54] [SPEAKER_02]: I think he was just like, well...
[00:11:57] [SPEAKER_05]: I would have been laughing my ass off.
[00:12:00] [SPEAKER_02]: So you think I went with the right option except...
[00:12:02] [SPEAKER_05]: I think that's the only option.
[00:12:04] [SPEAKER_05]: I mean again like you know...
[00:12:06] [SPEAKER_05]: What about just saying?
[00:12:07] [SPEAKER_05]: Actually getting Loaded Lux in the studio to do it for your audiobook.
[00:12:15] [SPEAKER_05]: That would have been the ideal option.
[00:12:17] [SPEAKER_05]: Not even just copying YouTube, just saying like...
[00:12:19] [SPEAKER_05]: And now here's Loaded Lux.
[00:12:20] [SPEAKER_05]: That would have been the ideal option.
[00:12:23] [SPEAKER_14]: Right but he's not flying out to Grand Haven, Michigan.
[00:12:26] [SPEAKER_05]: The second best option is getting Samuel L. Jackson to read it.
[00:12:32] [SPEAKER_05]: Third best option is to bleep yourself.
[00:12:35] [SPEAKER_05]: Fourth best option is to just say it but then you would have had to like kind of audio footnote.
[00:12:41] [SPEAKER_05]: Like I might just say like...
[00:12:43] [SPEAKER_05]: I'm just going to use the N word because it's part of the text.
[00:12:47] [SPEAKER_05]: You would say it...
[00:12:48] [SPEAKER_05]: I don't know what I would do but I think that that would be...
[00:12:51] [SPEAKER_05]: It seems not unreasonable to read the text as it is written in the same way that like when you read Huck Finn...
[00:12:59] [SPEAKER_05]: I don't know what they do nowadays if they read aloud still like you know when you take turns reading.
[00:13:04] [SPEAKER_05]: I don't know that we ever took turns reading Huck Finn aloud because it was a little bit later
[00:13:09] [SPEAKER_05]: but you can imagine that that's similar kind of dilemma.
[00:13:13] [SPEAKER_05]: And there's a way in which you could just say well that's the text.
[00:13:16] [SPEAKER_02]: You know, this reminds me of that debate or discussion that we had really long ago,
[00:13:22] [SPEAKER_02]: when Eliza was in a To Kill a Mockingbird play, and they nixed it at the last second
[00:13:27] [SPEAKER_02]: because they were cutting out all the N words and the playwright wouldn't allow them to do it
[00:13:34] [SPEAKER_02]: unless they were going to use the word.
[00:13:37] [SPEAKER_02]: And you know...
[00:13:38] [SPEAKER_02]: So they ended up not doing the play and I remember thinking that was sort of a shame
[00:13:45] [SPEAKER_02]: because the whole power of that text comes from the fact that this is how people were
[00:13:51] [SPEAKER_02]: and this is what people were doing but I remember we got that email from a listener
[00:13:56] [SPEAKER_02]: really long thoughtful email saying if I'm in that audience and I see kids
[00:14:03] [SPEAKER_02]: and I even get the slightest hint as is probably likely that some of the kids aren't taking this seriously
[00:14:10] [SPEAKER_02]: what's happening then that's going to be really hurtful for me.
[00:14:16] [SPEAKER_02]: And hard to take and you know I totally get that.
[00:14:21] [SPEAKER_05]: Yeah and none of what we're saying is that we have no problem saying it
[00:14:27] [SPEAKER_05]: but the PC police are going to like come after us.
[00:14:29] [SPEAKER_05]: Like that's not the issue.
[00:14:31] [SPEAKER_05]: No.
[00:14:31] [SPEAKER_05]: But fuck that.
[00:14:32] [SPEAKER_02]: I felt uncomfortable saying it you know again not because I thought the PC police
[00:14:38] [SPEAKER_02]: I think there is, for better and for worse, and maybe for better on balance,
[00:14:45] [SPEAKER_02]: just right now just an undeniable poison to that word that it just can't you know
[00:14:52] [SPEAKER_02]: that we talked about that Lenny Bruce that old Lenny Bruce bit about trying to drain the poison out of it
[00:14:59] [SPEAKER_02]: and I like that idea in theory I really do like you know drain the poison
[00:15:04] [SPEAKER_02]: make it not a word but it just doesn't seem like that can happen for whatever reason.
[00:15:10] [SPEAKER_05]: I think within a segment of the black population in the US
[00:15:16] [SPEAKER_05]: that's happened, but within very restricted guidelines.
[00:15:21] [SPEAKER_05]: Yeah, but even, like, older civil-rights-era black people, you know, like, don't like that.
[00:15:28] [SPEAKER_02]: And... you love Pulp Fiction, it's like your favorite movie.
[00:15:33] [SPEAKER_02]: Yeah, what do you think about when Quentin Tarantino says it? But it's, you know...
[00:15:39] [SPEAKER_05]: there's a lot of jarring shit in that movie. That's about the most jarring, like,
[00:15:44] [SPEAKER_05]: and I think that it's because there is something about the way in which somebody says something
[00:15:50] [SPEAKER_05]: and the intention behind it maybe because Quentin Tarantino is such a bad actor
[00:15:56] [SPEAKER_05]: but also the fact that he gave himself those words to say like you know I don't have a problem
[00:16:04] [SPEAKER_05]: you know art is art it's just like there's real discussion to be had but like him in that role
[00:16:11] [SPEAKER_05]: giving it to himself to say in that way is jarring to me.
[00:16:15] [SPEAKER_02]: So what's interesting about it that's almost you could look at it as a little bit of a cop out
[00:16:21] [SPEAKER_02]: they make a big deal about the fact that he's really good friends with Jules
[00:16:26] [SPEAKER_02]: that's why Jules is going there and in a completely gratuitous like hypothetical scene
[00:16:33] [SPEAKER_02]: they show that his wife is black and like just absolutely no reason for that it doesn't flow with that
[00:16:41] [SPEAKER_02]: it's really the only sort of because that I think is probably the best segment in that movie the Bonnie situation
[00:16:47] [SPEAKER_02]: and yeah like there's just no reason for that sort of hypothetical what if she came home
[00:16:54] [SPEAKER_02]: except to say that he's married to a black woman so it's okay that he says that word
[00:17:01] [SPEAKER_05]: in my headcanon the hypothetical is that Quentin Tarantino has just been lying about having a black wife
[00:17:06] [SPEAKER_05]: so that he can say those things
[00:17:11] [SPEAKER_05]: which reminds me one day we will do a whole Pulp Fiction episode
[00:17:14] [SPEAKER_05]: I think we've been planning it for six years, but since Tamler has veto power...
[00:17:22] [SPEAKER_02]: one day
[00:17:24] [SPEAKER_02]: alright should we do our guilty confessions
[00:17:26] [SPEAKER_02]: let's do it
[00:17:27] [SPEAKER_02]: I think you should go first
[00:17:29] [SPEAKER_05]: no no no you should just
[00:17:31] [SPEAKER_05]: we know how that goes
[00:17:35] [SPEAKER_02]: fine
[00:17:36] [SPEAKER_02]: okay so this is this is really hard to confess but I
[00:17:40] [SPEAKER_02]: I guess I owe you one
[00:17:45] [SPEAKER_02]: so not this last time I was traveling but the time before
[00:17:50] [SPEAKER_02]: I put an auto reply for my email
[00:17:54] [SPEAKER_02]: that said I apologize for the delayed response but I'll be traveling and I will have limited access to e-mail
[00:18:03] [SPEAKER_02]: in fact
[00:18:05] [SPEAKER_02]: my access to e-mail wasn't all that limited
[00:18:13] [SPEAKER_05]: Jesus fucking Christ
[00:18:16] [SPEAKER_02]: no that's not my real
[00:18:18] [SPEAKER_14]: but it is true
[00:18:21] [SPEAKER_14]: it's pretty much true of every time I've put that I think
[00:18:24] [SPEAKER_02]: like someone tweeted out that's the biggest lie in all of academia
[00:18:27] [SPEAKER_05]: yeah but it feels so fucking good
[00:18:29] [SPEAKER_05]: it's so good
[00:18:30] [SPEAKER_02]: yeah
[00:18:31] [SPEAKER_02]: and sometimes it's true depending on you know
[00:18:33] [SPEAKER_02]: I don't want to do a conceptual analysis of limited but you know
[00:18:38] [SPEAKER_02]: sometimes
[00:18:40] [SPEAKER_02]: so this is as close as I'm going to get to a real one
[00:18:43] [SPEAKER_02]: so you can fuck me over to get revenge
[00:18:46] [SPEAKER_02]: first year Eliza's born
[00:18:48] [SPEAKER_02]: right, it's 2004, she's born
[00:18:52] [SPEAKER_02]: you could just, through these online pharmacies that had popped up around that time,
[00:18:57] [SPEAKER_02]: you could just get Vicodin
[00:18:59] [SPEAKER_02]: you could just get Norco you know like the best kind of Vicodin
[00:19:03] [SPEAKER_02]: God
[00:19:03] [SPEAKER_02]: all you had to do is like fax them something that said like you hurt your back once or something
[00:19:10] [SPEAKER_02]: and they would just send you pretty much unlimited supplies
[00:19:14] [SPEAKER_02]: so first year Eliza was born
[00:19:17] [SPEAKER_02]: and I don't even know if my wife knows this
[00:19:19] [SPEAKER_02]: but pretty much every day or if not just every day
[00:19:23] [SPEAKER_02]: I was just on Vicodin
[00:19:26] [SPEAKER_02]: you know
[00:19:27] [SPEAKER_02]: and I get that like especially in these times with the opioid epidemic
[00:19:31] [SPEAKER_05]: but it is like Disneyland for your mind
[00:19:34] [SPEAKER_05]: you know
[00:19:35] [SPEAKER_05]: right
[00:19:36] [SPEAKER_02]: it's in
[00:19:37] [SPEAKER_02]: you know I have I'm lucky enough to be wired in a way that when I you know a year later
[00:19:43] [SPEAKER_02]: when it became a little more difficult to get
[00:19:46] [SPEAKER_02]: I just said okay fuck this I'm not getting it
[00:19:48] [SPEAKER_02]: I'm not just doing this anymore plus
[00:19:50] [SPEAKER_02]: it was also like I realized
[00:19:52] [SPEAKER_02]: I wasn't taking enough to keep up with the you know whatever the
[00:19:58] [SPEAKER_02]: you get used to it what's that called
[00:20:00] [SPEAKER_02]: tolerance
[00:20:00] [SPEAKER_02]: tolerance tolerance yeah so
[00:20:03] [SPEAKER_02]: so I was able to quit it I know a lot of people aren't
[00:20:06] [SPEAKER_02]: but
[00:20:07] [SPEAKER_02]: given that I have to say like
[00:20:10] [SPEAKER_02]: it was great
[00:20:11] [SPEAKER_02]: I would come home
[00:20:13] [SPEAKER_02]: I would come home and Eliza
[00:20:15] [SPEAKER_02]: you know, she was like six months old, a delightful
[00:20:19] [SPEAKER_02]: baby, and I'd like just, like, chill out with some Vicodin
[00:20:23] [SPEAKER_02]: and hang out with Eliza because Jen was working evenings then
[00:20:27] [SPEAKER_02]: and so it really just be me and her
[00:20:29] [SPEAKER_02]: the only time it was
[00:20:30] [SPEAKER_02]: it was really bad and maybe irresponsible. I was away on a trip
[00:20:35] [SPEAKER_02]: and then I met a friend and just had a really bad night of drinking
[00:20:40] [SPEAKER_02]: and drinking and Vicodin don't often go all that well
[00:20:44] [SPEAKER_02]: if you do
[00:20:44] [SPEAKER_05]: says it on the label
[00:20:46] [SPEAKER_02]: one or both to excess
[00:20:48] [SPEAKER_02]: you can have a few drinks and like and that's that's perfect
[00:20:52] [SPEAKER_02]: that's called activating it yeah that's what I call it
[00:20:54] [SPEAKER_02]: that's right exactly but
[00:20:56] [SPEAKER_02]: but that night was really bad and then I come home
[00:21:00] [SPEAKER_02]: and I'm taking care of Eliza
[00:21:01] [SPEAKER_02]: and I am just I put her in the Jolly Jumper
[00:21:04] [SPEAKER_02]: and I am just vomiting just in just the like all over
[00:21:09] [SPEAKER_02]: and I remember, I can picture it right now, she's just laughing at me
[00:21:12] [SPEAKER_02]: in her Jolly Jumper as I'm vomiting over the toilet
[00:21:15] [SPEAKER_05]: oh my god you know
[00:21:18] [SPEAKER_05]: I'm just gonna call you Bubs from now on
[00:21:20] [SPEAKER_05]: I'm just picturing you on the couch with drool coming down
[00:21:23] [SPEAKER_05]: just quick question did you like not shit for a year
[00:21:27] [SPEAKER_02]: that yeah that was an issue
[00:21:30] [SPEAKER_14]: yeah that sucks and it feels like there should be a way around that
[00:21:33] [SPEAKER_14]: and there doesn't seem to be
[00:21:35] [SPEAKER_05]: fuck medicine fuck western science and it's
[00:21:40] [SPEAKER_05]: jokes aside like yeah shit
[00:21:42] [SPEAKER_05]: it sucks because some people really you know obviously do get addicted
[00:21:45] [SPEAKER_05]: and it's easy to see
[00:21:49] [SPEAKER_05]: it's easy to see why, like, there but for the grace...
[00:21:53] [SPEAKER_05]: like if that shit was available over the counter
[00:21:54] [SPEAKER_02]: I mean withdrawals sucked
[00:21:57] [SPEAKER_02]: but you know like I think some people are just wired
[00:22:00] [SPEAKER_02]: like I have this, like, closeness to addiction
[00:22:05] [SPEAKER_02]: but I can always jump ship
[00:22:08] [SPEAKER_02]: when I realize this is bad for me
[00:22:11] [SPEAKER_05]: yeah drugs are like drugs are like donuts to me
[00:22:13] [SPEAKER_05]: if they're there yeah like all eat them
[00:22:17] [SPEAKER_05]: if they're not I'm not like going out to Dunkin Donuts
[00:22:20] [SPEAKER_05]: like I'm not like, I'll suck your dick
[00:22:24] [SPEAKER_05]: what'd you say motherfucker
[00:22:26] [SPEAKER_05]: yeah no I broke my ankle once I didn't know it was broken
[00:22:30] [SPEAKER_05]: I thought it was a sprain I had sprained it many times
[00:22:32] [SPEAKER_05]: so I go to a doctor this is when I was a postdoc
[00:22:34] [SPEAKER_05]: and he X-rayed me, he's like, you know you broke your ankle
[00:22:37] [SPEAKER_05]: and I was like oh shit I just thought it was you know
[00:22:40] [SPEAKER_05]: sprained it plenty of times from basketball and he goes
[00:22:43] [SPEAKER_05]: I'm gonna give you something it's a narcotic
[00:22:48] [SPEAKER_05]: and he gave me like a fat bottle of Vicodin with a refill
[00:22:53] [SPEAKER_05]: and this again is right before my daughter was born
[00:22:56] [SPEAKER_05]: but I would play Halo
[00:22:58] [SPEAKER_05]: on Xbox with my friends and have a glass of wine
[00:23:02] [SPEAKER_05]: and pop a Vicodin, and it is like, when people ask me
[00:23:05] [SPEAKER_05]: to describe what my paradise would be like, yeah,
[00:23:07] [SPEAKER_05]: it would be, you know, circa 2003, 2004:
[00:23:12] [SPEAKER_05]: Halo, Vicodin, and wine. Oh, and what I was gonna say
[00:23:16] [SPEAKER_02]: is Vicodin and playing online poker like it could be
[00:23:19] [SPEAKER_02]: a little boring to play solid tight aggressive poker
[00:23:22] [SPEAKER_02]: you know like you know but just so you win money
[00:23:25] [SPEAKER_02]: but not on Vicodin it was you know like Vicodin
[00:23:28] [SPEAKER_02]: and a couple, a few beers, like, and then like
[00:23:31] [SPEAKER_02]: you know then my daughter was born so now it's
[00:23:33] [SPEAKER_02]: not poker it's it's her but it was like oh it's
[00:23:36] [SPEAKER_05]: that's good too. Same risk involved
[00:23:42] [SPEAKER_05]: yeah, don't do drugs. Sorry, we probably all know...
[00:23:45] [SPEAKER_05]: I know people who were actually addicted
[00:23:49] [SPEAKER_05]: and a lot of people who overdosed, so...
[00:23:53] [SPEAKER_05]: I'm usually libertarian about this stuff, but like
[00:23:55] [SPEAKER_05]: there's a good reason to regulate that shit
[00:23:57] [SPEAKER_05]: especially that stuff, because this is the sneaky...
[00:24:01] [SPEAKER_05]: it's super sneaky. Alright, my guilty confession:
[00:24:06] [SPEAKER_05]: when I arrived here at Cornell
[00:24:09] [SPEAKER_05]: in 2006 I bought a house for the first time had
[00:24:15] [SPEAKER_05]: my own piece of land nice and master of all you
[00:24:19] [SPEAKER_05]: survey master sometimes even though the bathroom
[00:24:24] [SPEAKER_05]: is closer to me I just go outside and pee
[00:24:27] [SPEAKER_05]: in my backyard really I just I have a little
[00:24:33] [SPEAKER_05]: deck and there is just nothing like peeing
[00:24:35] [SPEAKER_05]: on my own land just I feel I feel close to nature
[00:24:39] [SPEAKER_05]: I feel like this is this is mine this is how
[00:24:43] [SPEAKER_05]: this is how humans were meant to be male humans
[00:24:46] [SPEAKER_05]: were meant to be it just feels good
[00:24:49] [SPEAKER_05]: what do your neighbors think of that? They...
[00:24:52] [SPEAKER_05]: it's... they're far enough away that they
[00:24:53] [SPEAKER_05]: can't see me. Nice. Believe me, I try. I'm
[00:24:56] [SPEAKER_02]: waving, look at me, I'm peeing! Hey Bill, Nancy,
[00:25:00] [SPEAKER_02]: check out this stream yeah that's right do
[00:25:04] [SPEAKER_02]: you take dumps out there too? No, no, I don't.
[00:25:06] [SPEAKER_05]: I've never taken a dump anywhere but the
[00:25:08] [SPEAKER_05]: toilet. I don't know how people do that.
[00:25:10] [SPEAKER_05]: my knees just don't work. Camping
[00:25:12] [SPEAKER_02]: somewhere with no toilets? No, never.
[00:25:17] [SPEAKER_02]: alright, and we'll be right back with
[00:25:20] [SPEAKER_02]: Will MacAskill. That was so fucking
[00:25:22] [SPEAKER_02]: long. It was. We were just rambling.
[00:26:44] [SPEAKER_05]: welcome back to very bad wizards at
[00:26:46] [SPEAKER_05]: this point we'd like to thank everybody
[00:26:49] [SPEAKER_05]: for all your support like we said last
[00:26:53] [SPEAKER_05]: time it's been six years I never would
[00:26:56] [SPEAKER_05]: have predicted it would last six years
[00:26:57] [SPEAKER_05]: but it is really really mainly
[00:27:03] [SPEAKER_05]: causally connected to the fact that we
[00:27:05] [SPEAKER_05]: have such great listeners people who
[00:27:07] [SPEAKER_05]: write us people who support us if you
[00:27:11] [SPEAKER_05]: would like to get ahold of us email us
[00:27:13] [SPEAKER_05]: very bad wizards at gmail.com you
[00:27:15] [SPEAKER_05]: can tweet to us at very bad wizards
[00:27:17] [SPEAKER_05]: at peez at tamler you can join the
[00:27:21] [SPEAKER_05]: discussions the lively discussions on
[00:27:23] [SPEAKER_05]: facebook.com slash very bad wizards
[00:27:25] [SPEAKER_05]: or reddit.com slash r slash very bad
[00:27:28] [SPEAKER_05]: wizards for that subreddit get some
[00:27:30] [SPEAKER_05]: in-depth discussion with other
[00:27:32] [SPEAKER_05]: listeners always smart always
[00:27:35] [SPEAKER_05]: interesting and you can follow us
[00:27:38] [SPEAKER_05]: on Instagram as well at the very
[00:27:41] [SPEAKER_05]: bad wizards account you can
[00:27:44] [SPEAKER_05]: support us in more tangible ways by
[00:27:46] [SPEAKER_05]: going to our very bad wizards dot com
[00:27:50] [SPEAKER_05]: slash support page and there you will
[00:27:54] [SPEAKER_05]: find the various ways in which you can
[00:27:55] [SPEAKER_05]: support us there's actually a donate tab
[00:27:56] [SPEAKER_05]: now Fireside our host has a
[00:28:00] [SPEAKER_05]: dedicated page where you can go donate
[00:28:04] [SPEAKER_05]: to our patreon you can give us a one
[00:28:07] [SPEAKER_05]: time PayPal donation you can shop on
[00:28:10] [SPEAKER_05]: Amazon which is always great through
[00:28:12] [SPEAKER_05]: our link and like I just said you
[00:28:14] [SPEAKER_05]: can go to our patreon page thank you so
[00:28:15] [SPEAKER_05]: much to all those people who sign up for
[00:28:17] [SPEAKER_05]: even the smallest of regular
[00:28:18] [SPEAKER_05]: contributions if you get value from what
[00:28:20] [SPEAKER_05]: we do we really appreciate it you
[00:28:22] [SPEAKER_05]: certainly don't have to give us money
[00:28:23] [SPEAKER_05]: but that is one way that that we
[00:28:28] [SPEAKER_05]: appreciate you showing appreciation
[00:28:31] [SPEAKER_02]: and we have content coming out yeah
[00:28:34] [SPEAKER_02]: yeah like what I can only imagine
[00:28:38] [SPEAKER_02]: will be a two to three hour bonus
[00:28:40] [SPEAKER_02]: episode with Jesse Graham and Natalia
[00:28:43] [SPEAKER_02]: Washington she's
[00:28:47] [SPEAKER_02]: a philosopher at Utah and Jesse's
[00:28:50] [SPEAKER_02]: a psychologist
[00:28:52] [SPEAKER_02]: Jesse has been a guest on very bad
[00:28:54] [SPEAKER_02]: wizards way back I think it was episode
[00:28:56] [SPEAKER_02]: 18 and we are going to talk about
[00:29:01] [SPEAKER_02]: Twin Peaks season three so and that
[00:29:04] [SPEAKER_02]: should be up I don't know in a week
[00:29:08] [SPEAKER_02]: or so after this drops so yeah if you
[00:29:11] [SPEAKER_02]: if you haven't seen it you should
[00:29:13] [SPEAKER_02]: see it because it's maybe the
[00:29:15] [SPEAKER_02]: greatest work of art in the last 50 years
[00:29:18] [SPEAKER_02]: and it is fascinating to talk about I
[00:29:21] [SPEAKER_02]: also might do and I don't know maybe
[00:29:23] [SPEAKER_02]: I'll tack it on at the end of an episode
[00:29:24] [SPEAKER_02]: if you agree but my daughter and I have
[00:29:27] [SPEAKER_02]: been talking about it for like a year
[00:29:29] [SPEAKER_02]: over a year now tell her to start her
[00:29:32] [SPEAKER_02]: own damn podcast I know and so I might
[00:29:35] [SPEAKER_02]: put a little conversation between her
[00:29:39] [SPEAKER_02]: and me about it to somewhere I'm not
[00:29:40] [SPEAKER_02]: sure so yeah but that'll definitely come
[00:29:44] [SPEAKER_02]: on patreon very shortly Sorry to
[00:29:48] [SPEAKER_02]: Bother You like we just never got a
[00:29:50] [SPEAKER_02]: chance to see it again yeah apparently
[00:29:52] [SPEAKER_05]: it might be re-released I just saw
[00:29:55] [SPEAKER_05]: Boots Riley just tweeted that it was
[00:29:57] [SPEAKER_05]: coming back to theaters oh good so
[00:29:59] [SPEAKER_05]: I think we might get a chance to see it
[00:30:01] [SPEAKER_05]: again given that Tamler is going
[00:30:06] [SPEAKER_05]: outside of the relationship to record
[00:30:08] [SPEAKER_05]: bonus content which I'm all for for the
[00:30:10] [SPEAKER_05]: record I was just telling Tamler I would
[00:30:15] [SPEAKER_05]: love to talk to an academic who loves
[00:30:18] [SPEAKER_05]: hip-hop as much as I do for bonus
[00:30:20] [SPEAKER_05]: content so reach out if you're one of
[00:30:22] [SPEAKER_05]: those the last thing I'm gonna say is
[00:30:25] [SPEAKER_05]: that we've been for years now thinking
[00:30:28] [SPEAKER_05]: about just putting together a quick
[00:30:30] [SPEAKER_05]: little survey just to learn who
[00:30:33] [SPEAKER_05]: listens to us and now that we have so
[00:30:35] [SPEAKER_05]: many listeners I am a little bit
[00:30:38] [SPEAKER_05]: curious as to what sort of the political
[00:30:40] [SPEAKER_05]: orientation breakdown is of our
[00:30:42] [SPEAKER_05]: listeners and we can include really dumb
[00:30:44] [SPEAKER_05]: questions how many of you wipe front
[00:30:46] [SPEAKER_05]: to back or is that long-form
[00:30:48] [SPEAKER_05]: piece of journalism that you linked to
[00:30:52] [SPEAKER_05]: Twitter the other day how many men
[00:30:55] [SPEAKER_05]: get into the bathtub on their hands
[00:30:57] [SPEAKER_14]: the pictures the drawings the drawings
[00:31:01] [SPEAKER_05]: were golden so so anyway thank you all
[00:31:06] [SPEAKER_05]: for all your support and we look forward
[00:31:10] [SPEAKER_05]: to hearing from you and without further
[00:31:13] [SPEAKER_05]: ado the one true Scotsman okay we are
[00:31:17] [SPEAKER_02]: here with Will MacAskill and our
[00:31:20] [SPEAKER_02]: listeners have been asking us to have
[00:31:22] [SPEAKER_02]: you on for a while now thank you for
[00:31:26] [SPEAKER_06]: joining us thanks so much for having me
[00:31:28] [SPEAKER_02]: on so we're gonna focus most of our
[00:31:30] [SPEAKER_02]: discussion today on the question of
[00:31:34] [SPEAKER_02]: moral uncertainty which you've written
[00:31:36] [SPEAKER_02]: about but could you just start out
[00:31:39] [SPEAKER_02]: giving us a brief introduction to the
[00:31:42] [SPEAKER_02]: movement of effective altruism and
[00:31:45] [SPEAKER_02]: your involvement with it great so
[00:31:48] [SPEAKER_06]: effective altruism this movement is
[00:31:50] [SPEAKER_06]: about using your time or your money
[00:31:53] [SPEAKER_06]: as effectively as possible to try and
[00:31:55] [SPEAKER_06]: make the world a better place so if you
[00:31:58] [SPEAKER_06]: have some money that you want to give
[00:31:59] [SPEAKER_06]: to charity how do you do that to benefit
[00:32:02] [SPEAKER_06]: other people by as much as possible
[00:32:03] [SPEAKER_06]: or if you're thinking what career should
[00:32:06] [SPEAKER_06]: I pursue what you know what's the career
[00:32:09] [SPEAKER_06]: where you can have the biggest positive
[00:32:10] [SPEAKER_06]: impact on the world and this is a
[00:32:13] [SPEAKER_06]: movement that's philosophical with a lot
[00:32:17] [SPEAKER_06]: of philosophers working on it and also
[00:32:19] [SPEAKER_06]: people engaged in empirical
[00:32:20] [SPEAKER_06]: research to figure out what are the
[00:32:22] [SPEAKER_06]: best charities or career paths and
[00:32:25] [SPEAKER_06]: there are typically a few cause areas
[00:32:26] [SPEAKER_06]: that we champion above all one is
[00:32:29] [SPEAKER_06]: global health and development
[00:32:31] [SPEAKER_06]: second is the elimination of
[00:32:34] [SPEAKER_06]: factory farming the kind of worst
[00:32:36] [SPEAKER_06]: excesses of animal suffering there
[00:32:38] [SPEAKER_06]: and then the third is preservation of
[00:32:40] [SPEAKER_06]: the long-run future of humankind in
[00:32:43] [SPEAKER_06]: particular working on reducing
[00:32:45] [SPEAKER_06]: risks of human extinction and
[00:32:47] [SPEAKER_06]: the general idea of effective
[00:32:49] [SPEAKER_06]: altruism has been an idea I've
[00:32:51] [SPEAKER_06]: helped to promote and also work through
[00:32:53] [SPEAKER_06]: over quite a number of years now
[00:32:55] [SPEAKER_06]: both in theory and as far as I can
[00:32:57] [SPEAKER_06]: in practice
[00:32:57] [SPEAKER_02]: tell us about the
[00:32:59] [SPEAKER_02]: practice what have you committed
[00:33:01] [SPEAKER_02]: yourself to do yeah so my first big
[00:33:04] [SPEAKER_06]: commitment I made was financial
[00:33:07] [SPEAKER_06]: so I decided originally to give
[00:33:10] [SPEAKER_06]: 10% of whatever I earned
[00:33:12] [SPEAKER_06]: for the rest of my life but then
[00:33:14] [SPEAKER_06]: you know I thought about it more
[00:33:16] [SPEAKER_06]: and went further than that decided to
[00:33:19] [SPEAKER_06]: just kind of cap my income at
[00:33:22] [SPEAKER_06]: what at the time was about 20,000
[00:33:23] [SPEAKER_06]: pounds post tax that's now about 25,000
[00:33:26] [SPEAKER_06]: pounds post tax and just say look I'm
[00:33:30] [SPEAKER_06]: really pretty happy on that amount of money
[00:33:32] [SPEAKER_06]: that's about the median income in the UK
[00:33:34] [SPEAKER_06]: so if I think oh well I need more money
[00:33:37] [SPEAKER_06]: than that then well what about all the
[00:33:39] [SPEAKER_06]: people who are so much poorer and in
[00:33:41] [SPEAKER_06]: fact even on that amount of money I'm
[00:33:43] [SPEAKER_06]: still in about the richest 2% of the world's population
[00:33:46] [SPEAKER_06]: and anything I earn above that I just
[00:33:48] [SPEAKER_06]: donate to whatever I think is most effective
[00:33:50] [SPEAKER_06]: did you just make it that number so
[00:33:52] [SPEAKER_05]: nobody could accuse you of being in the 1%
[00:33:55] [SPEAKER_05]: yeah I just really wanted to
[00:33:58] [SPEAKER_06]: sneer at the people I'm actually in the 1.1%
[00:34:01] [SPEAKER_06]: and so I can really sneer at all those
[00:34:04] [SPEAKER_06]: higher paid professors who are
[00:34:08] [SPEAKER_06]: who the Occupy protesters should be
[00:34:10] [SPEAKER_02]: campaigning against you're very young
[00:34:14] [SPEAKER_02]: I take it you don't have a family yet
[00:34:17] [SPEAKER_02]: are you married?
[00:34:18] [SPEAKER_06]: yeah I don't have a family I mean I have a partner
[00:34:22] [SPEAKER_06]: at the moment you know I don't
[00:34:24] [SPEAKER_06]: particularly intend to have a family it's
[00:34:26] [SPEAKER_06]: not a commitment that I won't have one
[00:34:28] [SPEAKER_06]: I do have various conditions in this
[00:34:31] [SPEAKER_06]: commitment such that you know if I had
[00:34:34] [SPEAKER_06]: a kid for example or many kids I would
[00:34:38] [SPEAKER_06]: increase the amount of money that I was
[00:34:39] [SPEAKER_06]: living on so you know 25,000 pounds
[00:34:42] [SPEAKER_06]: is a good amount for me but if I was
[00:34:44] [SPEAKER_06]: to have a kid I don't want to you know
[00:34:46] [SPEAKER_06]: impose undue sacrifices on the kid as
[00:34:48] [SPEAKER_06]: well so I'd increase that by something
[00:34:52] [SPEAKER_06]: like 6 or 7,000 pounds if I was with
[00:34:54] [SPEAKER_06]: another partner or more than that if I
[00:34:56] [SPEAKER_06]: was a single parent
[00:34:56] [SPEAKER_02]: this movement has touched a nerve
[00:35:00] [SPEAKER_02]: both on the positive side and also
[00:35:03] [SPEAKER_02]: on the negative side so it's inspired
[00:35:05] [SPEAKER_02]: a lot of people I know on my campus
[00:35:08] [SPEAKER_02]: I was teaching the Singer article and
[00:35:11] [SPEAKER_02]: then I'm walking to class and
[00:35:13] [SPEAKER_02]: I see this new student group
[00:35:16] [SPEAKER_02]: Effective Altruism so it has inspired
[00:35:19] [SPEAKER_02]: a lot of adherents it's organized a lot
[00:35:21] [SPEAKER_02]: of people and then it's also touched
[00:35:25] [SPEAKER_02]: a nerve on the negative side there
[00:35:29] [SPEAKER_02]: have been some ardent critiques of
[00:35:32] [SPEAKER_02]: effective altruism people who see it
[00:35:35] [SPEAKER_02]: as almost a threat why do you think
[00:35:39] [SPEAKER_02]: people have responded that way on the
[00:35:43] [SPEAKER_02]: negative side what about the movement
[00:35:45] [SPEAKER_02]: do you think has touched a nerve
[00:35:46] [SPEAKER_06]: right yeah I think there's two reasons
[00:35:49] [SPEAKER_06]: for that so I should say we actually
[00:35:51] [SPEAKER_06]: get less criticism than I would expect
[00:35:54] [SPEAKER_06]: given the nature of what we're doing
[00:35:56] [SPEAKER_06]: but we definitely get some and I think
[00:35:58] [SPEAKER_06]: there's two reasons so one is just
[00:36:00] [SPEAKER_06]: we're making it very salient how much
[00:36:03] [SPEAKER_06]: good you can do with your time and
[00:36:05] [SPEAKER_06]: money and it's really a vast amount
[00:36:07] [SPEAKER_06]: it's an amount such that I think
[00:36:09] [SPEAKER_06]: it's really pretty incontrovertible
[00:36:11] [SPEAKER_06]: that individuals in rich countries
[00:36:14] [SPEAKER_06]: have a pretty strong obligation to use
[00:36:16] [SPEAKER_06]: at least a significant part of their
[00:36:18] [SPEAKER_06]: resources to try and make the world
[00:36:21] [SPEAKER_06]: better as well as they can and obviously
[00:36:23] [SPEAKER_06]: that's not a message that people
[00:36:25] [SPEAKER_06]: particularly want to hear because it means
[00:36:27] [SPEAKER_06]: you've now got to you know give away
[00:36:28] [SPEAKER_06]: some of your money or make
[00:36:30] [SPEAKER_06]: significant changes to your plans the other side
[00:36:33] [SPEAKER_06]: is people who just think the
[00:36:35] [SPEAKER_06]: particular things that we're choosing
[00:36:36] [SPEAKER_06]: to focus on are misconceived so
[00:36:40] [SPEAKER_06]: within poverty for example we focus a lot
[00:36:43] [SPEAKER_06]: on global health because this is
[00:36:44] [SPEAKER_06]: something that firstly has a great
[00:36:46] [SPEAKER_06]: track record over the last 60 years
[00:36:48] [SPEAKER_06]: it's also something you can have a lot
[00:36:49] [SPEAKER_06]: of evidence to show it's doing a lot
[00:36:51] [SPEAKER_06]: of good and I think some of the aid
[00:36:53] [SPEAKER_06]: skeptical worries just don't apply as
[00:36:55] [SPEAKER_06]: much then you know focusing on factory
[00:36:58] [SPEAKER_06]: farming reducing extinction risks too
[00:37:00] [SPEAKER_06]: but there are some people who just
[00:37:02] [SPEAKER_06]: think wow this is totally wrong
[00:37:05] [SPEAKER_06]: what you should really be doing is
[00:37:06] [SPEAKER_06]: overthrowing capitalism for example
[00:37:08] [SPEAKER_06]: and any action where you
[00:37:10] [SPEAKER_06]: say well we can make the world better
[00:37:12] [SPEAKER_06]: in perhaps these kind of more marginal
[00:37:15] [SPEAKER_06]: ways that's just in some sense
[00:37:19] [SPEAKER_06]: kind of pandering to the bourgeoisie
[00:37:21] [SPEAKER_06]: making you feel like you're making
[00:37:22] [SPEAKER_06]: some sort of moral progress whereas
[00:37:25] [SPEAKER_06]: really you're not tackling the very
[00:37:29] [SPEAKER_06]: core of the issue which is private
[00:37:31] [SPEAKER_06]: ownership of property now I mentioned
[00:37:34] [SPEAKER_06]: kind of you know overthrowing
[00:37:35] [SPEAKER_06]: capitalism there's various other
[00:37:37] [SPEAKER_06]: things that you could plug into that
[00:37:38] [SPEAKER_06]: black box of kind of what you should
[00:37:40] [SPEAKER_06]: really be focusing on is X though
[00:37:42] [SPEAKER_06]: that is I think among the most common
[00:37:46] [SPEAKER_06]: and there it's again just not really
[00:37:47] [SPEAKER_06]: surprising that we're going to get
[00:37:49] [SPEAKER_06]: criticism here because we're saying
[00:37:50] [SPEAKER_06]: well this is what we think is most
[00:37:52] [SPEAKER_06]: important and if you've already invested
[00:37:54] [SPEAKER_06]: a lot of time into some other cause area
[00:37:56] [SPEAKER_06]: that is not the one we claim to be
[00:37:58] [SPEAKER_06]: most important well it's pretty
[00:38:00] [SPEAKER_06]: natural that you'll respond by being
[00:38:02] [SPEAKER_06]: like well fuck you and so that's
[00:38:04] [SPEAKER_06]: kind of why I'm surprised that we don't
[00:38:06] [SPEAKER_06]: get more criticism in fact so both
[00:38:08] [SPEAKER_02]: criticisms you think stem
[00:38:11] [SPEAKER_02]: from a place that is in some sense
[00:38:14] [SPEAKER_02]: disingenuous or self-serving and the
[00:38:17] [SPEAKER_02]: first it's I don't want to give
[00:38:18] [SPEAKER_02]: up cable I don't want to give up my
[00:38:20] [SPEAKER_02]: iPhone and in the second it's I
[00:38:23] [SPEAKER_02]: don't want to waste the time that
[00:38:25] [SPEAKER_02]: I've spent trying to organize my
[00:38:27] [SPEAKER_02]: Marxist radical group or whatever
[00:38:30] [SPEAKER_02]: it is is that a worry about effective
[00:38:33] [SPEAKER_02]: altruism that it often responds
[00:38:37] [SPEAKER_02]: this way to criticism sort of
[00:38:40] [SPEAKER_02]: explaining them away as
[00:38:43] [SPEAKER_02]: disingenuous or bad faith on the
[00:38:46] [SPEAKER_02]: part of the people making the critiques
[00:38:48] [SPEAKER_06]: yeah so I do
[00:38:50] [SPEAKER_06]: I do have a view which is kind of
[00:38:53] [SPEAKER_06]: against meta arguments so I think
[00:38:56] [SPEAKER_06]: if you give me an argument and I say
[00:38:58] [SPEAKER_06]: well I don't believe that argument
[00:38:59] [SPEAKER_06]: because you're biased because of
[00:39:01] [SPEAKER_06]: XYZ I think that's just generally a
[00:39:03] [SPEAKER_06]: very bad move in arguments because
[00:39:05] [SPEAKER_06]: you can always come up with a bias
[00:39:06] [SPEAKER_06]: for someone or other
[00:39:09] [SPEAKER_06]: and then I think in the case
[00:39:12] [SPEAKER_06]: I think in the case of well I don't
[00:39:14] [SPEAKER_06]: want to give my money away I just
[00:39:16] [SPEAKER_06]: think look maybe that's my one
[00:39:18] [SPEAKER_06]: exception or something because I
[00:39:19] [SPEAKER_06]: think the kind of bias is so strong
[00:39:22] [SPEAKER_06]: that it's like just so obvious that
[00:39:25] [SPEAKER_06]: you have a vested interest in one
[00:39:27] [SPEAKER_06]: side of the argument being correct
[00:39:28] [SPEAKER_05]: that's the surprising part
[00:39:31] [SPEAKER_05]: in teaching ethics the surprising
[00:39:33] [SPEAKER_05]: part is the degree to which people
[00:39:34] [SPEAKER_05]: are very clear that
[00:39:37] [SPEAKER_05]: the source of that objection
[00:39:39] [SPEAKER_05]: really is I don't want
[00:39:42] [SPEAKER_05]: I don't want to feel guilt
[00:39:43] [SPEAKER_05]: I don't think that what you're
[00:39:45] [SPEAKER_05]: making me feel is right because I
[00:39:47] [SPEAKER_05]: think that the other objections
[00:39:49] [SPEAKER_05]: about the particular
[00:39:51] [SPEAKER_05]: strategy that you're using
[00:39:53] [SPEAKER_05]: to combat suffering or to maximize
[00:39:56] [SPEAKER_05]: welfare take it at face value
[00:39:58] [SPEAKER_05]: and then it's an empirical question
[00:40:00] [SPEAKER_05]: presumably what the
[00:40:02] [SPEAKER_05]: best strategy is and I haven't known at
[00:40:04] [SPEAKER_05]: least anybody
[00:40:07] [SPEAKER_05]: that embraces either
[00:40:08] [SPEAKER_05]: utilitarianism or effective altruism
[00:40:10] [SPEAKER_05]: to not just simply say well
[00:40:12] [SPEAKER_05]: then sure like
[00:40:15] [SPEAKER_05]: we're open to the
[00:40:16] [SPEAKER_05]: we're open to the thing that will maximize
[00:40:18] [SPEAKER_05]: welfare yeah like that's pretty clear
[00:40:21] [SPEAKER_05]: yeah if that's a communist
[00:40:22] [SPEAKER_13]: revolution then that's then it's a communist
[00:40:24] [SPEAKER_13]: revolution yeah and
[00:40:26] [SPEAKER_06]: it would be hard to know how but yeah
[00:40:28] [SPEAKER_06]: and that is exactly how I feel and this
[00:40:30] [SPEAKER_06]: being a bunch of things where we really
[00:40:32] [SPEAKER_06]: have changed our views so
[00:40:34] [SPEAKER_06]: I mean one is just
[00:40:37] [SPEAKER_06]: people say well you know you just
[00:40:39] [SPEAKER_06]: focus too much on
[00:40:40] [SPEAKER_06]: what's measurable and short-term
[00:40:43] [SPEAKER_06]: effects if you're looking at say cost to save a life
[00:40:45] [SPEAKER_06]: or other health metrics
[00:40:47] [SPEAKER_06]: I basically think that's right
[00:40:49] [SPEAKER_06]: now despite such an early emphasis
[00:40:51] [SPEAKER_06]: on it and
[00:40:53] [SPEAKER_06]: I tend to think the things that are most
[00:40:54] [SPEAKER_06]: important are extremely hard to
[00:40:57] [SPEAKER_06]: measure because they're
[00:40:59] [SPEAKER_06]: like influencing the very
[00:41:00] [SPEAKER_06]: long-run future of civilization you
[00:41:03] [SPEAKER_06]: know the vast numbers of people who
[00:41:04] [SPEAKER_06]: will exist in the future rather than
[00:41:07] [SPEAKER_06]: the large but much smaller
[00:41:08] [SPEAKER_06]: number of people who live today
[00:41:11] [SPEAKER_06]: a second
[00:41:13] [SPEAKER_06]: worry is
[00:41:14] [SPEAKER_06]: perhaps the emphasis on
[00:41:16] [SPEAKER_06]: earning to give as well this idea
[00:41:19] [SPEAKER_06]: that you mentioned in an earlier
[00:41:20] [SPEAKER_06]: episode of deliberately
[00:41:22] [SPEAKER_06]: entering a high-paying career in order to
[00:41:26] [SPEAKER_06]: donate
[00:41:27] [SPEAKER_06]: a lot of money
[00:41:29] [SPEAKER_05]: where Tamler confused you with a rich
[00:41:31] [SPEAKER_06]: investor yeah where you confused me
[00:41:33] [SPEAKER_06]: with yeah Matt Wage
[00:41:34] [SPEAKER_06]: I thought you were maybe making some
[00:41:36] [SPEAKER_06]: some subtle joke about how utilitarianism
[00:41:40] [SPEAKER_06]: ignores the separateness
[00:41:41] [SPEAKER_06]: of persons but I
[00:41:42] [SPEAKER_06]: suspect that's exactly what it was
[00:41:45] [SPEAKER_02]: that's definitely what I was doing it was
[00:41:47] [SPEAKER_02]: very meta
[00:41:49] [SPEAKER_02]: and you're one of the few people who got
[00:41:51] [SPEAKER_02]: that it was most people just
[00:41:53] [SPEAKER_02]: thought I made a careless mistake
[00:41:55] [SPEAKER_06]: yeah no it was beautiful yeah
[00:41:57] [SPEAKER_06]: then I think when you corrected
[00:41:59] [SPEAKER_06]: the mistake you also called him Matt Page
[00:42:01] [SPEAKER_06]: um but
[00:42:04] [SPEAKER_06]: I won't go
[00:42:05] [SPEAKER_02]: people will be scholars will be puzzling
[00:42:07] [SPEAKER_02]: over that one for years
[00:42:09] [SPEAKER_06]: yeah I was
[00:42:10] [SPEAKER_06]: I found that joke a little bit harder to
[00:42:13] [SPEAKER_06]: see through I'm still
[00:42:15] [SPEAKER_06]: working on it well you're young
[00:42:17] [SPEAKER_12]: you're young yeah so you have time
[00:42:19] [SPEAKER_06]: that would be my next research
[00:42:21] [SPEAKER_06]: project
[00:42:24] [SPEAKER_06]: but yeah so
[00:42:25] [SPEAKER_06]: but I think some of the arguments we used
[00:42:27] [SPEAKER_06]: for that are actually like incorrect
[00:42:29] [SPEAKER_06]: so one classic one
[00:42:31] [SPEAKER_06]: was this idea that
[00:42:32] [SPEAKER_06]: if you enter the charity
[00:42:35] [SPEAKER_06]: sector well you're just displacing someone
[00:42:37] [SPEAKER_06]: else who would have worked there otherwise
[00:42:39] [SPEAKER_06]: but if you're earning to give then
[00:42:41] [SPEAKER_06]: you're displacing
[00:42:43] [SPEAKER_06]: sure another financier or whatever
[00:42:45] [SPEAKER_06]: but someone who wouldn't
[00:42:48] [SPEAKER_06]: have otherwise been donating as much
[00:42:50] [SPEAKER_06]: um
[00:42:51] [SPEAKER_06]: and that idea that like well
[00:42:53] [SPEAKER_06]: you enter the charity sector you're just displacing
[00:42:55] [SPEAKER_06]: someone else I think that just doesn't really
[00:42:57] [SPEAKER_06]: work and I think that was
[00:42:59] [SPEAKER_06]: uh I certainly was kind of
[00:43:01] [SPEAKER_06]: naive on just supply
[00:43:03] [SPEAKER_06]: and demand economics there
[00:43:05] [SPEAKER_06]: and so um I still
[00:43:07] [SPEAKER_06]: think that this can be a good option for many people but
[00:43:09] [SPEAKER_06]: it's something that we've like
[00:43:11] [SPEAKER_06]: really moderated as a view or certainly
[00:43:13] [SPEAKER_06]: I've moderated as a view very significantly
[00:43:15] [SPEAKER_06]: from many years ago
[00:43:16] [SPEAKER_06]: um and so yeah
[00:43:19] [SPEAKER_06]: in general we're very
[00:43:20] [SPEAKER_06]: genuinely open to
[00:43:22] [SPEAKER_06]: being completely wrong about our current best guesses
[00:43:25] [SPEAKER_06]: about you know how we can do as much good
[00:43:27] [SPEAKER_02]: as possible. So I think there's a third
[00:43:29] [SPEAKER_02]: objection
[00:43:31] [SPEAKER_02]: that has been made to effective
[00:43:33] [SPEAKER_02]: altruism
[00:43:34] [SPEAKER_02]: and it's on the question of moral
[00:43:36] [SPEAKER_02]: uncertainty and here you've talked about
[00:43:39] [SPEAKER_02]: the uncertainty in
[00:43:41] [SPEAKER_02]: what the
[00:43:42] [SPEAKER_02]: effective course of action
[00:43:44] [SPEAKER_02]: will be for an individual
[00:43:46] [SPEAKER_02]: um but in this paper
[00:43:49] [SPEAKER_02]: you also talk about
[00:43:51] [SPEAKER_02]: uncertainty as to
[00:43:53] [SPEAKER_02]: what moral theories are
[00:43:54] [SPEAKER_02]: true and how to incorporate
[00:43:56] [SPEAKER_02]: that uncertainty
[00:43:58] [SPEAKER_02]: into moral decision making
[00:44:01] [SPEAKER_02]: could you
[00:44:01] [SPEAKER_02]: give us a brief cliff notes
[00:44:04] [SPEAKER_02]: version of your position
[00:44:06] [SPEAKER_02]: in that paper? Absolutely
[00:44:08] [SPEAKER_06]: so yeah we make decisions in the face of
[00:44:10] [SPEAKER_06]: empirical uncertainty or uncertainty
[00:44:12] [SPEAKER_06]: about matters of fact all the time
[00:44:14] [SPEAKER_06]: so if I'm speeding
[00:44:16] [SPEAKER_06]: around a um
[00:44:18] [SPEAKER_06]: corner
[00:44:20] [SPEAKER_06]: well why is it wrong to drive really fast
[00:44:22] [SPEAKER_06]: around a blind corner
[00:44:25] [SPEAKER_06]: it's not because you know that you're going to hit
[00:44:26] [SPEAKER_06]: anybody it's because you think there's some chance
[00:44:29] [SPEAKER_06]: you'll hit somebody and if you did
[00:44:30] [SPEAKER_06]: then that would be extremely wrong
[00:44:32] [SPEAKER_06]: so we look at the probabilities
[00:44:34] [SPEAKER_06]: of different outcomes and how
[00:44:36] [SPEAKER_06]: good or in this case bad they would be
[00:44:38] [SPEAKER_06]: and you
[00:44:40] [SPEAKER_06]: take a kind of average of
[00:44:42] [SPEAKER_06]: the likelihood of something happening and how
[00:44:44] [SPEAKER_06]: good or bad it would be and you come
[00:44:46] [SPEAKER_06]: to a kind of compromise option which
[00:44:48] [SPEAKER_06]: decision theorists call kind of maximizing
[00:44:51] [SPEAKER_06]: expected value
[00:44:54] [SPEAKER_06]: um and what I'm doing in the paper
[00:44:56] [SPEAKER_06]: is arguing well we should
[00:44:58] [SPEAKER_06]: use that same sort of reasoning
[00:45:01] [SPEAKER_06]: in the moral
[00:45:02] [SPEAKER_06]: case as well so rather than
[00:45:04] [SPEAKER_06]: having what I call the football fan model
[00:45:06] [SPEAKER_06]: of moral decision making
[00:45:08] [SPEAKER_06]: where you just say well I'm a
[00:45:10] [SPEAKER_06]: utilitarian or libertarian
[00:45:12] [SPEAKER_06]: or marxist or whatever
[00:45:14] [SPEAKER_06]: and then figure
[00:45:16] [SPEAKER_06]: out what follows given that kind of overall
[00:45:18] [SPEAKER_06]: worldview instead you think
[00:45:20] [SPEAKER_06]: well how likely do I find each
[00:45:22] [SPEAKER_06]: of these different positions and then
[00:45:24] [SPEAKER_06]: given a certain action how
[00:45:26] [SPEAKER_06]: um good or bad
[00:45:28] [SPEAKER_06]: would this action be given those different moral
[00:45:30] [SPEAKER_06]: views and then again you take
[00:45:32] [SPEAKER_06]: this kind of average um between
[00:45:35] [SPEAKER_06]: likelihood of the different moral
[00:45:37] [SPEAKER_06]: views and how good or bad
[00:45:39] [SPEAKER_06]: the action would be given those different moral
[00:45:40] [SPEAKER_06]: views and then choose
[00:45:42] [SPEAKER_06]: the action that is the best compromise
[00:45:44] [SPEAKER_06]: or again in this case what has
[00:45:47] [SPEAKER_06]: um what maximizes
[00:45:48] [SPEAKER_06]: expected value or expected
[00:45:51] [SPEAKER_06]: choice worthiness and
[00:45:53] [SPEAKER_06]: an instance of this is to think about
[00:45:55] [SPEAKER_06]: say um decision
[00:45:56] [SPEAKER_06]: whether to uh
[00:45:59] [SPEAKER_06]: eat some chicken or eat a
[00:46:00] [SPEAKER_06]: vegetarian option and
[00:46:02] [SPEAKER_06]: you might think well it's probably the
[00:46:04] [SPEAKER_06]: case that
[00:46:06] [SPEAKER_06]: animals don't matter
[00:46:08] [SPEAKER_06]: morally suppose you think that's your most
[00:46:10] [SPEAKER_06]: likely view maybe you've even considered it at
[00:46:12] [SPEAKER_06]: quite some length you think if
[00:46:14] [SPEAKER_06]: that's true then you know I
[00:46:16] [SPEAKER_06]: should eat the chicken because I would prefer that
[00:46:18] [SPEAKER_06]: but you think I'm not sure about that I'm not completely
[00:46:20] [SPEAKER_06]: certain there's some chance that
[00:46:23] [SPEAKER_06]: uh the vegetarians
[00:46:24] [SPEAKER_06]: are right the animal welfare folks
[00:46:26] [SPEAKER_06]: Singer and so on and
[00:46:28] [SPEAKER_06]: if so then it would be
[00:46:30] [SPEAKER_06]: extremely wrong to eat the chicken
[00:46:32] [SPEAKER_06]: um I'd be causing a huge
[00:46:34] [SPEAKER_06]: amount of suffering it would be just as bad
[00:46:36] [SPEAKER_06]: as if that suffering were caused to a human
[00:46:39] [SPEAKER_06]: and so
[00:46:40] [SPEAKER_06]: if you would apply the sort of
[00:46:42] [SPEAKER_06]: maximizing expected
[00:46:44] [SPEAKER_06]: um value or maximizing
[00:46:46] [SPEAKER_06]: expected choice worthiness sort of reasoning in this case
[00:46:49] [SPEAKER_06]: then you would think
[00:46:50] [SPEAKER_06]: well
[00:46:52] [SPEAKER_06]: it's only a small amount that I'm going to gain
[00:46:54] [SPEAKER_06]: by choosing the chicken
[00:46:56] [SPEAKER_06]: over the vegetarian option
[00:46:58] [SPEAKER_06]: if it is the case that animals don't matter
[00:47:00] [SPEAKER_06]: whereas there's a huge
[00:47:02] [SPEAKER_06]: amount morally at stake if animals
[00:47:04] [SPEAKER_06]: do matter so even though I think
[00:47:06] [SPEAKER_06]: it's you know less likely
[00:47:08] [SPEAKER_06]: that animals do matter morally
[00:47:10] [SPEAKER_06]: still I ought all things considered
[00:47:12] [SPEAKER_06]: to choose the vegetarian option that's the kind of
[00:47:16] [SPEAKER_06]: safe bet morally given my own
[00:47:18] [SPEAKER_06]: uncertainty about what's morally at stake
[00:47:20] [SPEAKER_05]: can I ask
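The expected choice-worthiness reasoning described above can be sketched as a small calculation. The credences and choice-worthiness values below are purely hypothetical numbers chosen for illustration, not figures from the conversation:

```python
# A minimal sketch of maximizing expected choice-worthiness under moral
# uncertainty. All numbers here are hypothetical illustrations.

# Credences: how likely you think each moral view is to be true.
credences = {"animals_dont_matter": 0.7, "animal_welfare": 0.3}

# Choice-worthiness of each option under each view, in arbitrary units:
# eating chicken is mildly good if animals don't matter morally,
# and very bad if the animal welfare view is right.
choiceworthiness = {
    "eat_chicken": {"animals_dont_matter": 1.0, "animal_welfare": -100.0},
    "eat_vegetarian": {"animals_dont_matter": 0.0, "animal_welfare": 0.0},
}

def expected_choiceworthiness(option):
    """Credence-weighted average of an option's value across moral views."""
    return sum(credences[view] * choiceworthiness[option][view]
               for view in credences)

# Even with 70% credence that animals don't matter, the vegetarian option
# wins: 0.0 versus 0.7 * 1.0 + 0.3 * (-100.0) = -29.3.
best = max(choiceworthiness, key=expected_choiceworthiness)
print(best)
```

This is the same structure as expected-value reasoning under empirical uncertainty (the blind-corner example), with moral views in place of states of the world.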
[00:47:22] [SPEAKER_05]: so
[00:47:23] [SPEAKER_05]: I really like
[00:47:26] [SPEAKER_05]: enjoy this topic but my
[00:47:28] [SPEAKER_05]: like in some
[00:47:30] [SPEAKER_05]: ways I enjoy it because of the discomfort
[00:47:32] [SPEAKER_05]: that it gives me that stems not so much from
[00:47:34] [SPEAKER_05]: the content of your claims
[00:47:36] [SPEAKER_05]: but rather from this
[00:47:38] [SPEAKER_05]: the step of evaluating
[00:47:41] [SPEAKER_05]: the
[00:47:42] [SPEAKER_05]: likelihood of truth of theories
[00:47:44] [SPEAKER_05]: and you I think
[00:47:46] [SPEAKER_05]: here you briefly mention it
[00:47:48] [SPEAKER_05]: or maybe in something else that I was reading
[00:47:50] [SPEAKER_05]: about
[00:47:52] [SPEAKER_05]: the fear that
[00:47:54] [SPEAKER_05]: that this theory itself
[00:47:56] [SPEAKER_05]: will
[00:47:58] [SPEAKER_05]: will need to be evaluated in a
[00:47:59] [SPEAKER_05]: meta-theoretic context of other theories
[00:48:02] [SPEAKER_05]: and
[00:48:04] [SPEAKER_05]: it seems so
[00:48:06] [SPEAKER_05]: so foreign to me to
[00:48:08] [SPEAKER_05]: start evaluating
[00:48:10] [SPEAKER_05]: at the theory level
[00:48:12] [SPEAKER_05]: and it's not rational but there is
[00:48:14] [SPEAKER_05]: this fear of an infinite regress of
[00:48:16] [SPEAKER_05]: evaluating theories with
[00:48:18] [SPEAKER_05]: increasingly unlikely
[00:48:19] [SPEAKER_05]: ways to determine the truth
[00:48:22] [SPEAKER_05]: maybe that's the concrete question there
[00:48:24] [SPEAKER_05]: how
[00:48:26] [SPEAKER_05]: could you pop out of the normative
[00:48:28] [SPEAKER_05]: ethical level and decide
[00:48:30] [SPEAKER_05]: how to balance the
[00:48:32] [SPEAKER_05]: truth condition
[00:48:34] [SPEAKER_05]: what are the truth conditions for normative theory to be
[00:48:36] [SPEAKER_05]: to be true
[00:48:37] [SPEAKER_05]: is it a thing
[00:48:39] [SPEAKER_05]: which it may not be and this paper would still be valuable
[00:48:42] [SPEAKER_05]: but is it even a thing that people are capable
[00:48:44] [SPEAKER_05]: of doing
[00:48:46] [SPEAKER_05]: of popping out
[00:48:48] [SPEAKER_05]: and evaluating theories
[00:48:50] [SPEAKER_05]: and assigning weights to their probability of being
[00:48:52] [SPEAKER_05]: true what are you using
[00:48:53] [SPEAKER_05]: to evaluate the probability that a
[00:48:55] [SPEAKER_05]: normative ethical theory is true
[00:48:57] [SPEAKER_06]: ok terrific well you brought up a couple
[00:48:59] [SPEAKER_06]: of issues there so
[00:49:01] [SPEAKER_06]: first you mentioned was this idea of a regress
[00:49:04] [SPEAKER_06]: so
[00:49:05] [SPEAKER_06]: you're unsure about what's right
[00:49:08] [SPEAKER_06]: ethically speaking
[00:49:09] [SPEAKER_06]: and I give you an account of what to do in the face
[00:49:11] [SPEAKER_06]: of your moral
[00:49:13] [SPEAKER_06]: uncertainty
[00:49:14] [SPEAKER_06]: the issue is that you shouldn't be completely certain
[00:49:17] [SPEAKER_06]: of my account either
[00:49:19] [SPEAKER_06]: um
[00:49:21] [SPEAKER_06]: so what do you do then well do you have this kind of
[00:49:23] [SPEAKER_06]: meta meta
[00:49:25] [SPEAKER_06]: uncertainty principle where
[00:49:27] [SPEAKER_06]: you take into account your uncertainty over
[00:49:29] [SPEAKER_06]: how to take into account uncertainty over
[00:49:31] [SPEAKER_06]: different moral views
[00:49:33] [SPEAKER_06]: well then you're going to be unsure
[00:49:35] [SPEAKER_06]: of that too and you're going to have to keep
[00:49:37] [SPEAKER_06]: going further and further back up this chain
[00:49:39] [SPEAKER_06]: and so I think there are two responses to
[00:49:41] [SPEAKER_06]: you know to this regress issue
[00:49:43] [SPEAKER_06]: one is to say yep
[00:49:45] [SPEAKER_06]: you just keep going up the regress
[00:49:47] [SPEAKER_06]: until you get some sort of convergence
[00:49:49] [SPEAKER_06]: perhaps
[00:49:51] [SPEAKER_06]: or until you just don't have any more
[00:49:53] [SPEAKER_06]: beliefs because it's just like it's too high
[00:49:56] [SPEAKER_06]: and that's kind of one way of going
[00:49:57] [SPEAKER_06]: a second way is to
[00:49:59] [SPEAKER_06]: distinguish what it's kind of moral
[00:50:01] [SPEAKER_06]: to do and what it's rational to do
[00:50:03] [SPEAKER_06]: and to say that what I'm talking about
[00:50:05] [SPEAKER_06]: is just what rationally
[00:50:07] [SPEAKER_06]: ought you to do given your uncertainty
[00:50:09] [SPEAKER_06]: in moral matters and empirical
[00:50:11] [SPEAKER_06]: matters and then what it's rational
[00:50:13] [SPEAKER_06]: to do is just to
[00:50:15] [SPEAKER_06]: maximize expected choice worthiness
[00:50:17] [SPEAKER_06]: and then when you come and say
[00:50:19] [SPEAKER_06]: well what's it rational to do
[00:50:21] [SPEAKER_06]: when you're unsure about what it's rational to do
[00:50:23] [SPEAKER_06]: to then say at that point
[00:50:25] [SPEAKER_06]: that doesn't make any sense you've got this kind of
[00:50:28] [SPEAKER_06]: you're right you could just sort of
[00:50:29] [SPEAKER_06]: a Wittgensteinian silence
[00:50:32] [SPEAKER_06]: basically yeah you just have to say
[00:50:33] [SPEAKER_06]: look I've got to have a fixed
[00:50:35] [SPEAKER_06]: point somewhere this is where
[00:50:37] [SPEAKER_06]: I'm staking my
[00:50:38] [SPEAKER_06]: you know staking my fixed point
[00:50:40] [SPEAKER_06]: it doesn't matter if you're unsure
[00:50:43] [SPEAKER_06]: about this it's still the case this is
[00:50:44] [SPEAKER_06]: what you want to do but if you're
[00:50:46] [SPEAKER_02]: opening the door for Wittgensteinian silence
[00:50:48] [SPEAKER_02]: about uncertainty as a stopping point
[00:50:50] [SPEAKER_02]: it seems like it could be a stopping point
[00:50:52] [SPEAKER_02]: much earlier than you
[00:50:54] [SPEAKER_02]: would want as well
[00:50:57] [SPEAKER_02]: I'm uncertain about
[00:50:58] [SPEAKER_02]: whether to give
[00:51:00] [SPEAKER_02]: money to a charity
[00:51:02] [SPEAKER_02]: that I have a personal connection
[00:51:04] [SPEAKER_02]: to because my friend works there I'm just
[00:51:06] [SPEAKER_02]: going to do it right that's my
[00:51:08] [SPEAKER_02]: stopping point you know what I'm saying
[00:51:10] [SPEAKER_02]: so it seems like you that
[00:51:12] [SPEAKER_02]: would be something that you would
[00:51:14] [SPEAKER_02]: want to appeal to only
[00:51:16] [SPEAKER_02]: as a last resort right
[00:51:18] [SPEAKER_02]: have a kind of arbitrarily
[00:51:20] [SPEAKER_02]: chosen fixed point
[00:51:23] [SPEAKER_06]: yeah I agree
[00:51:24] [SPEAKER_06]: with that in particular
[00:51:26] [SPEAKER_06]: it just seems for any
[00:51:28] [SPEAKER_06]: case where
[00:51:30] [SPEAKER_06]: you know you're unsure about something
[00:51:32] [SPEAKER_06]: and I could say hey you can find out
[00:51:34] [SPEAKER_06]: at very low cost
[00:51:37] [SPEAKER_06]: whether this thing is true or not
[00:51:39] [SPEAKER_06]: um it seems like
[00:51:40] [SPEAKER_06]: yeah it's worth your time you know paying
[00:51:42] [SPEAKER_06]: that small cost in order to find out whether this
[00:51:44] [SPEAKER_06]: thing is true but if I say
[00:51:47] [SPEAKER_06]: well there's just this fixed point and you just ought
[00:51:49] [SPEAKER_06]: to obey the principle
[00:51:52] [SPEAKER_06]: um
[00:51:53] [SPEAKER_06]: that you should maximize expected
[00:51:55] [SPEAKER_06]: value no matter what you think about it
[00:51:56] [SPEAKER_06]: then it's not the case that it's worth you
[00:51:58] [SPEAKER_06]: investigating on this kind of view
[00:52:01] [SPEAKER_06]: instead you should just
[00:52:02] [SPEAKER_06]: you should just do the thing
[00:52:04] [SPEAKER_06]: that maximizes expected value
[00:52:06] [SPEAKER_06]: or choice worthiness
[00:52:07] [SPEAKER_06]: and so that kind of seems wrong it seems like
[00:52:10] [SPEAKER_06]: when we have this higher order
[00:52:12] [SPEAKER_06]: uncertainty we really do want to
[00:52:14] [SPEAKER_06]: know what
[00:52:16] [SPEAKER_06]: like how to resolve that uncertainty how to make
[00:52:18] [SPEAKER_06]: a decision but then
[00:52:21] [SPEAKER_06]: there's this difficult and in my view kind
[00:52:22] [SPEAKER_06]: of unresolved issue of
[00:52:24] [SPEAKER_06]: just making sense of that as a full view
[00:52:27] [SPEAKER_06]: um making sense of this
[00:52:28] [SPEAKER_06]: idea that you can just always
[00:52:30] [SPEAKER_06]: go more and more and more
[00:52:32] [SPEAKER_06]: um uncertain
[00:52:35] [SPEAKER_06]: so I do think
[00:52:36] [SPEAKER_06]: you know the field of
[00:52:38] [SPEAKER_06]: decision making under normative uncertainty it's still very
[00:52:40] [SPEAKER_06]: new and
[00:52:42] [SPEAKER_06]: there's this is one of these things
[00:52:44] [SPEAKER_06]: where I just don't think anyone has a kind of great answer
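The regress MacAskill describes can be made concrete with a toy sketch. Nothing below is from the episode: the credences, the choice-worthiness numbers, and the 60/40 weighting over decision rules are all invented for illustration. It sets up two first-order moral views, two rival level-1 decision rules ("maximize expected choice-worthiness" versus "just follow your most probable view"), and a level-2 rule that is uncertain between those rules; the regress worry is that the same move could be repeated at level 3, 4, and beyond.

```python
# Toy sketch of the regress, with invented numbers (not from the episode).
credences = {"animals_matter": 0.3, "animals_dont_matter": 0.7}

# Choice-worthiness of each option on each view (arbitrary units).
cw = {
    "chicken":    {"animals_matter": -100.0, "animals_dont_matter": 1.0},
    "vegetarian": {"animals_matter":    0.0, "animals_dont_matter": 0.0},
}
options = list(cw)

def mec(option):
    """Level-1 rule: score by expected choice-worthiness."""
    return sum(credences[v] * cw[option][v] for v in credences)

def favorite_theory(option):
    """Rival level-1 rule: score by what the single most probable view says."""
    top = max(credences, key=credences.get)
    return cw[option][top]

def pick(rule):
    """Choose the option the given rule scores highest."""
    return max(options, key=rule)

# Level 2: we are also unsure which decision rule is right, so weight
# each rule's scores by a (made-up) credence in that rule -- and nothing
# stops us repeating this move at level 3, 4, ... which is the regress.
rule_credences = [(mec, 0.6), (favorite_theory, 0.4)]

def level2(option):
    return sum(p * rule(option) for rule, p in rule_credences)

print(pick(mec))             # expected choice-worthiness favors vegetarian
print(pick(favorite_theory)) # the favorite-theory rule favors chicken
print(pick(level2))          # the level-2 blend again favors vegetarian
```

The sketch also shows why the two stopping strategies discussed above differ: if you fix `mec` as your unquestioned fixed point you never build `level2` at all, whereas taking higher-order uncertainty seriously means constructing ever-higher levels.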
[00:52:46] [SPEAKER_02]: so the second part
[00:52:48] [SPEAKER_02]: of Dave's question I think
[00:52:50] [SPEAKER_02]: was on the question of how do you assign
[00:52:52] [SPEAKER_02]: credences to normative
[00:52:54] [SPEAKER_02]: theories
[00:52:55] [SPEAKER_02]: yeah how is that even in principle
[00:52:58] [SPEAKER_02]: possible
[00:52:59] [SPEAKER_06]: yeah so I think
[00:53:01] [SPEAKER_06]: is definitely going to be hard
[00:53:02] [SPEAKER_06]: I'm not going to say like well I have a
[00:53:04] [SPEAKER_06]: 18.54%
[00:53:06] [SPEAKER_06]: credence in scanlonian
[00:53:09] [SPEAKER_06]: contractualism
[00:53:10] [SPEAKER_06]: you know we're not going to have kind of precise
[00:53:12] [SPEAKER_06]: degrees of belief
[00:53:14] [SPEAKER_06]: but it certainly seems plausible that we have
[00:53:17] [SPEAKER_06]: at least approximate degrees of belief
[00:53:18] [SPEAKER_06]: so you know do you think
[00:53:21] [SPEAKER_06]: it's more likely
[00:53:22] [SPEAKER_06]: that
[00:53:24] [SPEAKER_06]: you know Singer's right that we ought to give money
[00:53:27] [SPEAKER_06]: um to improve the world
[00:53:29] [SPEAKER_06]: than that
[00:53:30] [SPEAKER_06]: torturing children for fun
[00:53:33] [SPEAKER_06]: um
[00:53:35] [SPEAKER_06]: is a morally good thing
[00:53:37] [SPEAKER_06]: to do and it's like okay we've clearly
[00:53:39] [SPEAKER_02]: those aren't theories those are just moral
[00:53:41] [SPEAKER_02]: claims
[00:53:42] [SPEAKER_02]: it seems like what your view requires
[00:53:44] [SPEAKER_02]: is assigning credences
[00:53:46] [SPEAKER_02]: to entire theories
[00:53:48] [SPEAKER_06]: yeah so
[00:53:49] [SPEAKER_06]: one
[00:53:52] [SPEAKER_06]: slight issue with the literature is we do
[00:53:55] [SPEAKER_06]: often talk just about
[00:53:56] [SPEAKER_06]: in this kind of simplified way as if you
[00:53:59] [SPEAKER_06]: just got credences and kind of whole theories
[00:54:00] [SPEAKER_06]: and there are some reasons for doing that
[00:54:03] [SPEAKER_06]: but the fundamental view
[00:54:05] [SPEAKER_06]: can just be anything I can just
[00:54:06] [SPEAKER_06]: you know I just divide up the
[00:54:09] [SPEAKER_06]: space of kind of moral possibilities
[00:54:10] [SPEAKER_06]: into the set
[00:54:12] [SPEAKER_06]: of views on which animals
[00:54:15] [SPEAKER_06]: matter the set of views on which animals don't
[00:54:16] [SPEAKER_06]: matter and then I just have to look
[00:54:19] [SPEAKER_06]: at my degree of belief in
[00:54:20] [SPEAKER_06]: the proposition that animals matter or the
[00:54:23] [SPEAKER_06]: proposition that animals
[00:54:25] [SPEAKER_06]: don't matter and then I can just do
[00:54:27] [SPEAKER_06]: the maximize expected choice
[00:54:29] [SPEAKER_06]: worthiness thing over that
[00:54:30] [SPEAKER_06]: so even though I talk about theories and I
[00:54:32] [SPEAKER_06]: admit that's kind of
[00:54:35] [SPEAKER_06]: confusing it's really not
[00:54:36] [SPEAKER_06]: fundamentally about
[00:54:38] [SPEAKER_06]: complete you know having
[00:54:39] [SPEAKER_06]: these complete moral theories
[00:54:43] [SPEAKER_06]: um having credences just in moral claims
[00:54:45] [SPEAKER_06]: is um is more
[00:54:47] [SPEAKER_06]: you know is more than enough
[00:54:48] [SPEAKER_02]: okay good that's helpful so
[00:54:51] [SPEAKER_02]: so when you say
[00:54:53] [SPEAKER_02]: we can have
[00:54:55] [SPEAKER_02]: imprecise if not precise
[00:54:57] [SPEAKER_02]: credences in
[00:54:59] [SPEAKER_02]: moral claims
[00:55:01] [SPEAKER_02]: where are we getting
[00:55:03] [SPEAKER_02]: even the imprecise or
[00:55:05] [SPEAKER_02]: approximate
[00:55:06] [SPEAKER_02]: credences about the plausibility of
[00:55:08] [SPEAKER_02]: more controversial moral claims
[00:55:11] [SPEAKER_02]: not
[00:55:11] [SPEAKER_02]: not like torturing children is
[00:55:15] [SPEAKER_02]: morally obligatory or something
[00:55:17] [SPEAKER_02]: like that but
[00:55:18] [SPEAKER_02]: but other ones
[00:55:20] [SPEAKER_02]: say about the
[00:55:21] [SPEAKER_02]: morality of abortion or
[00:55:24] [SPEAKER_06]: yeah terrific so
[00:55:26] [SPEAKER_06]: your precise answer to this is going to depend on
[00:55:29] [SPEAKER_06]: what metaethical view you have
[00:55:32] [SPEAKER_06]: so supposing you've got some sort of
[00:55:34] [SPEAKER_06]: ideal subjectivist view
[00:55:36] [SPEAKER_06]: where what it means
[00:55:38] [SPEAKER_06]: to say that something is right
[00:55:40] [SPEAKER_06]: or wrong is just to say well
[00:55:42] [SPEAKER_06]: this is an action that would
[00:55:43] [SPEAKER_06]: that my ideal self
[00:55:45] [SPEAKER_06]: you know myself if I had
[00:55:47] [SPEAKER_06]: you know super large brain
[00:55:50] [SPEAKER_06]: and an indefinite amount of time to think
[00:55:51] [SPEAKER_06]: reflect on it on these issues
[00:55:53] [SPEAKER_06]: what the ideal self would kind of want
[00:55:55] [SPEAKER_06]: me to want in which
[00:55:57] [SPEAKER_06]: case when I have
[00:55:59] [SPEAKER_06]: um when I say you know
[00:56:02] [SPEAKER_06]: it's 60%
[00:56:03] [SPEAKER_06]: likely that abortion is permissible
[00:56:05] [SPEAKER_06]: what I'm saying is actually
[00:56:07] [SPEAKER_06]: well I predict that if
[00:56:09] [SPEAKER_06]: all these conditions held and at
[00:56:11] [SPEAKER_06]: unlimited like million years to think
[00:56:13] [SPEAKER_06]: about stuff and um
[00:56:15] [SPEAKER_06]: reflect and had amazing
[00:56:18] [SPEAKER_06]: computational resources in my brain
[00:56:20] [SPEAKER_06]: or so on 60% chance that I would come
[00:56:23] [SPEAKER_06]: to want myself
[00:56:25] [SPEAKER_06]: you know want myself to be okay with
[00:56:27] [SPEAKER_06]: um
[00:56:29] [SPEAKER_06]: having an abortion or
[00:56:31] [SPEAKER_06]: abortion being allowed in society
[00:56:34] [SPEAKER_06]: a 40% chance that
[00:56:36] [SPEAKER_06]: I would want myself to not want to have
[00:56:38] [SPEAKER_06]: abortions
[00:56:39] [SPEAKER_06]: in society so if
[00:56:41] [SPEAKER_06]: you've got this kind of ideal subjectivist view
[00:56:43] [SPEAKER_06]: then you can make sense of
[00:56:45] [SPEAKER_06]: these kind of degrees of belief really quite neatly
[00:56:48] [SPEAKER_06]: and then on
[00:56:49] [SPEAKER_06]: other metaethical views you'll have
[00:56:51] [SPEAKER_06]: somewhat different accounts
[00:56:53] [SPEAKER_06]: so in you know if
[00:56:55] [SPEAKER_06]: you're a moral realist
[00:56:58] [SPEAKER_06]: then what you're
[00:56:59] [SPEAKER_06]: doing is just actually making a guess about
[00:57:01] [SPEAKER_06]: what moral reality looks like
[00:57:03] [SPEAKER_06]: and the way
[00:57:05] [SPEAKER_06]: you form your degree of belief is just
[00:57:07] [SPEAKER_06]: by weighing up the evidence
[00:57:09] [SPEAKER_06]: that's there um in front of you where
[00:57:11] [SPEAKER_06]: your evidence will include
[00:57:13] [SPEAKER_06]: your kind of starting moral intuitions
[00:57:16] [SPEAKER_06]: plausibly it will include
[00:57:17] [SPEAKER_06]: the kind of views of other people who disagree
[00:57:19] [SPEAKER_06]: with you it will also include kind of
[00:57:21] [SPEAKER_06]: more abstract arguments
[00:57:23] [SPEAKER_06]: may also include kind of theoretical considerations
[00:57:26] [SPEAKER_06]: as well um like
[00:57:28] [SPEAKER_06]: you know simplicity
[00:57:29] [SPEAKER_06]: explanatory power if we're talking about
[00:57:31] [SPEAKER_06]: broader theories um
[00:57:33] [SPEAKER_06]: and then for other metaethical views you might
[00:57:35] [SPEAKER_06]: have uh different
[00:57:37] [SPEAKER_06]: accounts um again
[00:57:40] [SPEAKER_06]: but I think on most
[00:57:42] [SPEAKER_06]: metaethics you can
[00:57:43] [SPEAKER_06]: still make sense of the idea of there being
[00:57:45] [SPEAKER_06]: degrees of belief like rather neatly
[00:57:48] [SPEAKER_06]: um and then
[00:57:49] [SPEAKER_06]: it seems very plausible that
[00:57:51] [SPEAKER_06]: and intuitive but we do
[00:57:53] [SPEAKER_06]: think that some things are
[00:57:54] [SPEAKER_06]: people say all the time like I think
[00:57:58] [SPEAKER_06]: abortion is almost certainly permissible
[00:57:59] [SPEAKER_06]: you know they're making a kind of
[00:58:01] [SPEAKER_06]: approximately quantitative claim
[00:58:03] [SPEAKER_06]: about the degree of belief
[00:58:04] [SPEAKER_02]: and your view doesn't depend on them being right
[00:58:07] [SPEAKER_02]: it just you're just looking
[00:58:09] [SPEAKER_02]: to give
[00:58:11] [SPEAKER_02]: a method
[00:58:12] [SPEAKER_02]: how to address this problem
[00:58:14] [SPEAKER_02]: and if the people are misguided
[00:58:17] [SPEAKER_02]: in what they take to be
[00:58:19] [SPEAKER_02]: a plausible or implausible moral claim
[00:58:22] [SPEAKER_02]: that's a separate
[00:58:23] [SPEAKER_02]: that's a separate question
[00:58:25] [SPEAKER_02]: correct
[00:58:25] [SPEAKER_06]: yeah that's right
[00:58:27] [SPEAKER_06]: so you can call this garbage in garbage out
[00:58:29] [SPEAKER_06]: um if you're
[00:58:31] [SPEAKER_06]: you know 90% sure that
[00:58:33] [SPEAKER_06]: torturing children
[00:58:34] [SPEAKER_06]: for fun is a good thing to do
[00:58:36] [SPEAKER_06]: then my account can still
[00:58:38] [SPEAKER_06]: tell you what to do in light of that uncertainty
[00:58:40] [SPEAKER_06]: um there would be a separate
[00:58:42] [SPEAKER_06]: theory that you'd want to develop
[00:58:44] [SPEAKER_06]: to explain why that's
[00:58:46] [SPEAKER_06]: um you know a very bad
[00:58:48] [SPEAKER_06]: degree of belief to have
[00:58:50] [SPEAKER_06]: um but that's not part of what I'm
[00:58:52] [SPEAKER_06]: kind of doing in this paper
[00:58:53] [SPEAKER_05]: so in the paper
[00:58:56] [SPEAKER_05]: you frame this as you did now
[00:58:58] [SPEAKER_05]: um as
[00:59:00] [SPEAKER_05]: something like a procedure analogous to
[00:59:02] [SPEAKER_05]: judgments under uncertainty
[00:59:04] [SPEAKER_05]: um in the
[00:59:06] [SPEAKER_05]: and this is
[00:59:08] [SPEAKER_05]: for many people the normative
[00:59:10] [SPEAKER_05]: framework by which to evaluate the quality of any given decision so
[00:59:13] [SPEAKER_05]: you know what's
[00:59:14] [SPEAKER_05]: what's the chance that it's going to rain
[00:59:16] [SPEAKER_05]: you know what should I
[00:59:19] [SPEAKER_05]: when I'm deciding to take an umbrella
[00:59:20] [SPEAKER_05]: um how much do I
[00:59:22] [SPEAKER_05]: dislike getting rained on
[00:59:24] [SPEAKER_05]: and you plug those in
[00:59:26] [SPEAKER_05]: and at the heart of
[00:59:28] [SPEAKER_05]: the question
[00:59:30] [SPEAKER_05]: that Tamler and I are asking
[00:59:32] [SPEAKER_05]: right now I think is
[00:59:34] [SPEAKER_05]: the can it possibly
[00:59:36] [SPEAKER_05]: be analogous
[00:59:38] [SPEAKER_05]: um or is it simply
[00:59:40] [SPEAKER_05]: metaphor because it is
[00:59:42] [SPEAKER_05]: so clear right even the
[00:59:44] [SPEAKER_05]: dismissal like that
[00:59:46] [SPEAKER_05]: Tamler had to let's you know the
[00:59:48] [SPEAKER_05]: torture case which is I'm not talking about those
[00:59:50] [SPEAKER_05]: obvious ones
[00:59:52] [SPEAKER_05]: it strikes me as
[00:59:54] [SPEAKER_05]: those importantly to build a theory
[00:59:56] [SPEAKER_05]: you it really
[00:59:57] [SPEAKER_05]: it matters
[01:00:00] [SPEAKER_05]: a great deal why we think
[01:00:02] [SPEAKER_05]: the torture of innocent children for
[01:00:03] [SPEAKER_05]: shits and giggles is so wrong
[01:00:06] [SPEAKER_05]: and sure you wouldn't want to waste your time arguing with somebody who thought that was true
[01:00:09] [SPEAKER_05]: but presumably because there are probably
[01:00:11] [SPEAKER_05]: reasons unresponsive at that point
[01:00:13] [SPEAKER_05]: but it does there there is
[01:00:15] [SPEAKER_05]: no moral claim that ought not be evaluated
[01:00:17] [SPEAKER_05]: fairly easily on this procedure
[01:00:19] [SPEAKER_05]: I guess right
[01:00:21] [SPEAKER_05]: so that's one thing and the other
[01:00:23] [SPEAKER_05]: thing is and I think
[01:00:25] [SPEAKER_05]: you you discuss this in the paper quite nicely
[01:00:29] [SPEAKER_05]: which is that
[01:00:31] [SPEAKER_05]: the content of many moral theories
[01:00:33] [SPEAKER_05]: is to define something
[01:00:36] [SPEAKER_05]: as
[01:00:37] [SPEAKER_05]: impermissible
[01:00:40] [SPEAKER_05]: if asked to give the
[01:00:41] [SPEAKER_05]: weight of an innocent life
[01:00:44] [SPEAKER_05]: um people would
[01:00:45] [SPEAKER_05]: say it's infinite
[01:00:47] [SPEAKER_05]: you can't plug this into an equation
[01:00:49] [SPEAKER_05]: and in fact there's
[01:00:51] [SPEAKER_05]: you know there's good
[01:00:52] [SPEAKER_05]: social psychology work by
[01:00:54] [SPEAKER_05]: Phil Tetlock and colleagues
[01:00:57] [SPEAKER_05]: that this is this
[01:00:58] [SPEAKER_05]: seems to be what people
[01:01:00] [SPEAKER_05]: what people do when you ask them to
[01:01:02] [SPEAKER_05]: put a market value on the life of their child right which
[01:01:06] [SPEAKER_05]: unfortunately
[01:01:06] [SPEAKER_05]: things like insurance companies have to come up with
[01:01:08] [SPEAKER_05]: values for like limbs and children
[01:01:10] [SPEAKER_05]: and place a value on it but
[01:01:12] [SPEAKER_05]: people people find it so offensive
[01:01:14] [SPEAKER_05]: but part of some ethical theories is to say
[01:01:16] [SPEAKER_05]: no what it means
[01:01:18] [SPEAKER_05]: to endorse my ethical theory is
[01:01:20] [SPEAKER_05]: that this is most certainly wrong
[01:01:22] [SPEAKER_05]: in an infinite sense
[01:01:26] [SPEAKER_06]: yeah terrific so
[01:01:28] [SPEAKER_06]: um yeah this in the paper
[01:01:32] [SPEAKER_06]: the way I'm talking about it is actually
[01:01:35] [SPEAKER_06]: in an important way
[01:01:36] [SPEAKER_06]: relatively limited it just says
[01:01:39] [SPEAKER_06]: when theories
[01:01:40] [SPEAKER_06]: or the views you have
[01:01:43] [SPEAKER_06]: credences in have the right sort of structure
[01:01:44] [SPEAKER_06]: that is when they give you kind of quantities
[01:01:47] [SPEAKER_06]: of wrongness such that
[01:01:50] [SPEAKER_06]: um for any three actions
[01:01:52] [SPEAKER_06]: I can say that the difference
[01:01:53] [SPEAKER_06]: in the wrongness or difference
[01:01:55] [SPEAKER_06]: in reason that you have
[01:01:58] [SPEAKER_06]: between
[01:01:59] [SPEAKER_06]: A and B is
[01:02:00] [SPEAKER_06]: at least approximately N times as great
[01:02:03] [SPEAKER_06]: is the difference between B and C
[01:02:06] [SPEAKER_06]: um
[01:02:07] [SPEAKER_06]: on that theory so you can get at least have some
[01:02:09] [SPEAKER_06]: sort of idea like quantity to strength of
[01:02:11] [SPEAKER_06]: wrongness in that case then
[01:02:13] [SPEAKER_06]: maximize expected choice worthiness
[01:02:15] [SPEAKER_06]: but then there's
[01:02:18] [SPEAKER_06]: yeah difficult issues
[01:02:20] [SPEAKER_06]: when you have
[01:02:22] [SPEAKER_06]: something like a view that says
[01:02:23] [SPEAKER_06]: yeah my child's death is infinitely
[01:02:26] [SPEAKER_06]: wrong or
[01:02:27] [SPEAKER_06]: um you know
[01:02:29] [SPEAKER_06]: killing another person is
[01:02:31] [SPEAKER_06]: infinitely wrong no amount of good
[01:02:33] [SPEAKER_06]: can outweigh it now I don't find those views
[01:02:36] [SPEAKER_06]: very plausible um
[01:02:37] [SPEAKER_06]: there's difficult questions about well
[01:02:40] [SPEAKER_06]: what you do under uncertainty so
[01:02:42] [SPEAKER_06]: you know if you think okay your child's
[01:02:44] [SPEAKER_06]: death is infinitely wrong then you should just
[01:02:45] [SPEAKER_06]: never let them out of the house
[01:02:47] [SPEAKER_06]: um yeah you want to just minimize
[01:02:49] [SPEAKER_06]: all possible costs that's exactly what it means
[01:02:52] [SPEAKER_05]: well that's why
[01:02:53] [SPEAKER_05]: my daughter is that's why my daughter is
[01:02:55] [SPEAKER_05]: currently locked in her room yeah okay well
[01:02:57] [SPEAKER_06]: as long as she's got computer games I guess
[01:02:59] [SPEAKER_06]: that's okay but you know in
[01:03:01] [SPEAKER_06]: the spirit of taking uncertainty really seriously
[01:03:03] [SPEAKER_06]: so you know I don't think you should be completely
[01:03:05] [SPEAKER_06]: confident that you know I don't think
[01:03:07] [SPEAKER_06]: you should have zero credence
[01:03:09] [SPEAKER_06]: in the idea that some wrongs are
[01:03:11] [SPEAKER_06]: like infinitely great um
[01:03:13] [SPEAKER_06]: but I think there should
[01:03:16] [SPEAKER_06]: be some account of decision-making
[01:03:18] [SPEAKER_06]: under uncertainty in that case
[01:03:20] [SPEAKER_06]: but I think it should be
[01:03:21] [SPEAKER_06]: different um because it just doesn't
[01:03:24] [SPEAKER_06]: work within this framework of maximizing
[01:03:26] [SPEAKER_06]: expected value because now
[01:03:27] [SPEAKER_06]: like every action I have
[01:03:30] [SPEAKER_06]: has some probability of being infinitely
[01:03:32] [SPEAKER_06]: good and infinitely bad
[01:03:33] [SPEAKER_06]: and then the whole
[01:03:36] [SPEAKER_06]: apparatus just completely breaks it's just
[01:03:37] [SPEAKER_06]: suddenly I can't make comparisons between
[01:03:39] [SPEAKER_06]: anything but then I have
[01:03:42] [SPEAKER_06]: in other work I then kind of address this
[01:03:44] [SPEAKER_06]: issue and actually say that
[01:03:46] [SPEAKER_06]: um you know when you've
[01:03:48] [SPEAKER_06]: got these theories
[01:03:49] [SPEAKER_06]: that don't have this kind of nice quantitative
[01:03:51] [SPEAKER_06]: notion of wrongness we need
[01:03:54] [SPEAKER_06]: to do something other than maximizing
[01:03:56] [SPEAKER_06]: expected value it's still hedging
[01:03:58] [SPEAKER_06]: our bets in light of moral uncertainty so
[01:04:00] [SPEAKER_06]: it's kind of same in spirit
[01:04:02] [SPEAKER_06]: but the precise details are
[01:04:03] [SPEAKER_06]: quite different.
[01:04:05] [SPEAKER_02]: the third part of Dave's earlier question
[01:04:07] [SPEAKER_02]: was about
[01:04:09] [SPEAKER_02]: how alien
[01:04:11] [SPEAKER_02]: this is from the way
[01:04:13] [SPEAKER_02]: most people
[01:04:15] [SPEAKER_02]: um go about their
[01:04:17] [SPEAKER_02]: moral decision-making so
[01:04:19] [SPEAKER_02]: I remember a while ago
[01:04:21] [SPEAKER_02]: there was a GoFundMe
[01:04:23] [SPEAKER_02]: for Dave's ex-wife's
[01:04:25] [SPEAKER_02]: dog that
[01:04:27] [SPEAKER_02]: needed surgery. When I was
[01:04:29] [SPEAKER_02]: deciding whether to give to that
[01:04:31] [SPEAKER_02]: I wasn't
[01:04:33] [SPEAKER_02]: making calculations I wasn't
[01:04:35] [SPEAKER_02]: deciding between
[01:04:37] [SPEAKER_02]: you know what the probability
[01:04:38] [SPEAKER_02]: that singer is right that this is not
[01:04:40] [SPEAKER_02]: the most effective um
[01:04:42] [SPEAKER_02]: way for me to spend my money that that money
[01:04:44] [SPEAKER_02]: could be doing more good elsewhere
[01:04:47] [SPEAKER_02]: it was none of those things
[01:04:48] [SPEAKER_02]: I was just okay I have the money
[01:04:51] [SPEAKER_02]: I saw it on Facebook
[01:04:53] [SPEAKER_02]: I'm gonna give some money to it
[01:04:55] [SPEAKER_02]: in part I think
[01:04:56] [SPEAKER_02]: a lot of people who act
[01:04:58] [SPEAKER_02]: in that way that's part of
[01:05:00] [SPEAKER_02]: what defines them
[01:05:03] [SPEAKER_02]: in part that
[01:05:04] [SPEAKER_02]: they are willing
[01:05:06] [SPEAKER_02]: to a certain extent to
[01:05:08] [SPEAKER_02]: almost unthinkingly
[01:05:10] [SPEAKER_02]: support a person that
[01:05:12] [SPEAKER_02]: they have some sort of connection with
[01:05:14] [SPEAKER_02]: to what extent are you trying to
[01:05:16] [SPEAKER_02]: reform the way we
[01:05:18] [SPEAKER_02]: go about our moral decision-making
[01:05:20] [SPEAKER_02]: and to what extent are you just trying to
[01:05:22] [SPEAKER_02]: capture it
[01:05:24] [SPEAKER_02]: in more precise terms than
[01:05:26] [SPEAKER_02]: we may think about it.
[01:05:28] [SPEAKER_06]: yeah so I'm very heavily in the reforming camp
[01:05:31] [SPEAKER_06]: I'm definitely on the
[01:05:32] [SPEAKER_06]: philosophy is about changing the world
[01:05:34] [SPEAKER_06]: not just understanding it
[01:05:36] [SPEAKER_06]: um and
[01:05:38] [SPEAKER_06]: yeah I think you're right but
[01:05:40] [SPEAKER_06]: I think this is part of a general issue that people
[01:05:42] [SPEAKER_06]: just aren't very morally reflective
[01:05:44] [SPEAKER_06]: um and I think that's a tremendously
[01:05:46] [SPEAKER_06]: bad thing
[01:05:47] [SPEAKER_06]: when we look at
[01:05:50] [SPEAKER_06]: the history of
[01:05:52] [SPEAKER_06]: you know human civilization
[01:05:54] [SPEAKER_06]: we've just committed like awful moral
[01:05:56] [SPEAKER_06]: wrongs every single
[01:05:58] [SPEAKER_06]: generation and people have been
[01:06:00] [SPEAKER_06]: perfectly happy just to accept that and
[01:06:02] [SPEAKER_06]: they've never really thought about
[01:06:04] [SPEAKER_06]: the idea that maybe
[01:06:05] [SPEAKER_06]: women have equal worth to men or
[01:06:08] [SPEAKER_06]: people of color have equal
[01:06:10] [SPEAKER_06]: worth to like white people and so on
[01:06:12] [SPEAKER_06]: people just don't reflect
[01:06:14] [SPEAKER_06]: ethically that much
[01:06:16] [SPEAKER_06]: um certainly not when
[01:06:17] [SPEAKER_06]: you know no one's kind of challenging them to do
[01:06:20] [SPEAKER_06]: so
[01:06:22] [SPEAKER_05]: it's not until they have to
[01:06:22] [SPEAKER_05]: cancel their cable subscription that they
[01:06:24] [SPEAKER_05]: really start questioning
[01:06:26] [SPEAKER_06]: yeah exactly
[01:06:26] [SPEAKER_06]: then they've got a million different ethical arguments
[01:06:28] [SPEAKER_06]: um to defend to defend that view
[01:06:30] [SPEAKER_06]: I mean I guess actually yeah
[01:06:32] [SPEAKER_02]: who's the old person here nobody has
[01:06:34] [SPEAKER_02]: cable anymore
[01:06:36] [SPEAKER_06]: I do.
[01:06:38] [SPEAKER_02]: Wow. So if you want
[01:06:38] [SPEAKER_02]: to reform people
[01:06:40] [SPEAKER_02]: one of the concerns about the effective altruism
[01:06:42] [SPEAKER_02]: movement and
[01:06:44] [SPEAKER_02]: utilitarianism more
[01:06:46] [SPEAKER_02]: generally has
[01:06:48] [SPEAKER_02]: been that you are taking
[01:06:50] [SPEAKER_02]: flesh and blood human beings
[01:06:52] [SPEAKER_02]: and trying to turn them into
[01:06:53] [SPEAKER_02]: computer algorithms
[01:06:55] [SPEAKER_02]: that
[01:06:57] [SPEAKER_02]: are and in doing
[01:06:59] [SPEAKER_02]: so you are stripping them
[01:07:01] [SPEAKER_02]: of some part
[01:07:03] [SPEAKER_02]: at least of who they
[01:07:05] [SPEAKER_02]: are. It's not clear
[01:07:06] [SPEAKER_02]: what the justification for
[01:07:09] [SPEAKER_02]: that is or what standing
[01:07:11] [SPEAKER_02]: you have to tell people
[01:07:13] [SPEAKER_02]: that they should be more reflective in
[01:07:15] [SPEAKER_02]: the way that you would like them to be more
[01:07:17] [SPEAKER_02]: reflective in this
[01:07:18] [SPEAKER_02]: uh somewhat formalized
[01:07:21] [SPEAKER_02]: decision making way so
[01:07:23] [SPEAKER_02]: if you're not just trying to say
[01:07:25] [SPEAKER_02]: hey you're doing
[01:07:26] [SPEAKER_02]: moral reflection like this but here's a better
[01:07:28] [SPEAKER_02]: way to do it that is still in line with
[01:07:31] [SPEAKER_02]: the way you're approaching it that's one
[01:07:33] [SPEAKER_02]: thing but if you want to radically change
[01:07:35] [SPEAKER_02]: the way people are engaging in
[01:07:37] [SPEAKER_02]: it then there's the question of
[01:07:39] [SPEAKER_02]: well why
[01:07:40] [SPEAKER_02]: why should we do that especially
[01:07:43] [SPEAKER_02]: if it's so
[01:07:44] [SPEAKER_02]: alien to
[01:07:47] [SPEAKER_02]: the way we interact
[01:07:48] [SPEAKER_02]: with the world right now.
[01:07:50] [SPEAKER_06]: Yeah so
[01:07:51] [SPEAKER_06]: I mean to get a grip on this I think I'd want
[01:07:54] [SPEAKER_06]: to just run that kind of form
[01:07:56] [SPEAKER_06]: of argument across
[01:07:58] [SPEAKER_06]: like different periods of
[01:07:59] [SPEAKER_06]: time and space so
[01:08:01] [SPEAKER_06]: thinking about this like
[01:08:04] [SPEAKER_06]: as a response to abolitionism or something
[01:08:06] [SPEAKER_06]: where you're saying well
[01:08:07] [SPEAKER_06]: my projects involve
[01:08:10] [SPEAKER_06]: um you know
[01:08:11] [SPEAKER_06]: keeping slaves and so on it's just
[01:08:14] [SPEAKER_06]: I don't really value this
[01:08:16] [SPEAKER_06]: in a fundamental way and
[01:08:17] [SPEAKER_06]: and so on
[01:08:19] [SPEAKER_06]: it just you know that kind of seems absurd
[01:08:22] [SPEAKER_06]: and so
[01:08:24] [SPEAKER_06]: I think we all when we
[01:08:25] [SPEAKER_06]: actually reflect kind of agree that
[01:08:29] [SPEAKER_06]: we just
[01:08:29] [SPEAKER_06]: don't have license to just kind of choose
[01:08:31] [SPEAKER_06]: our moral projects
[01:08:34] [SPEAKER_06]: at least when they're sufficiently
[01:08:35] [SPEAKER_06]: important kind of on a whim perhaps
[01:08:37] [SPEAKER_06]: there's some things like do you become
[01:08:39] [SPEAKER_06]: a you know do you have
[01:08:41] [SPEAKER_06]: a hobby of being a clarinetist or something
[01:08:43] [SPEAKER_06]: else like well this is your realm of kind of
[01:08:45] [SPEAKER_06]: the permissible
[01:08:48] [SPEAKER_06]: whereas very many
[01:08:49] [SPEAKER_06]: moral issues and I think very
[01:08:51] [SPEAKER_06]: many more than most
[01:08:53] [SPEAKER_06]: people tend to acknowledge are not like that
[01:08:56] [SPEAKER_06]: there's just actually things that are right
[01:08:57] [SPEAKER_06]: and wrong and
[01:09:00] [SPEAKER_06]: we currently have
[01:09:01] [SPEAKER_06]: a really bad track record about
[01:09:03] [SPEAKER_06]: getting things right we've done really
[01:09:06] [SPEAKER_06]: horrendous moral things
[01:09:07] [SPEAKER_06]: on the basis of moral
[01:09:09] [SPEAKER_06]: error that was just very widely accepted
[01:09:11] [SPEAKER_06]: regarded as common sense
[01:09:12] [SPEAKER_06]: and so the whole
[01:09:14] [SPEAKER_06]: drive behind at least
[01:09:16] [SPEAKER_06]: for me the drive behind both
[01:09:18] [SPEAKER_06]: effective altruism
[01:09:20] [SPEAKER_06]: especially the aspect of
[01:09:22] [SPEAKER_06]: thinking about you know all the different problems in the world
[01:09:24] [SPEAKER_06]: and how to prioritize among them
[01:09:26] [SPEAKER_06]: and then also of moral uncertainty
[01:09:28] [SPEAKER_06]: and thinking the ways that I could
[01:09:30] [SPEAKER_06]: potentially be badly morally wrong
[01:09:32] [SPEAKER_06]: the real motivating drive
[01:09:34] [SPEAKER_06]: for that is this fact that
[01:09:36] [SPEAKER_06]: we've been so badly wrong in the past
[01:09:38] [SPEAKER_06]: probably my starting
[01:09:40] [SPEAKER_06]: moral projects are going to be
[01:09:42] [SPEAKER_06]: fundamentally misguided
[01:09:44] [SPEAKER_06]: they're going to be products of my upbringing
[01:09:46] [SPEAKER_06]: my culture
[01:09:48] [SPEAKER_06]: whereas
[01:09:52] [SPEAKER_06]: there are things we could be doing
[01:09:54] [SPEAKER_06]: that are genuinely morally right
[01:09:56] [SPEAKER_06]: genuinely morally better leading to
[01:09:58] [SPEAKER_06]: moral progress
[01:10:00] [SPEAKER_02]: Is the implication then that if
[01:10:02] [SPEAKER_02]: say the Romans
[01:10:04] [SPEAKER_02]: had they applied
[01:10:06] [SPEAKER_02]: this decision making
[01:10:09] [SPEAKER_02]: process
[01:10:10] [SPEAKER_02]: to their own
[01:10:12] [SPEAKER_02]: society they would have recognized
[01:10:14] [SPEAKER_02]: the immorality of
[01:10:16] [SPEAKER_02]: having slaves at that time
[01:10:20] [SPEAKER_02]: Is the implication that this not engaging it
[01:10:23] [SPEAKER_02]: is what led to the moral
[01:10:24] [SPEAKER_02]: atrocities of the past
[01:10:26] [SPEAKER_06]: I think
[01:10:28] [SPEAKER_06]: it is true that
[01:10:30] [SPEAKER_06]: if kind of everyone throughout
[01:10:32] [SPEAKER_06]: history had been
[01:10:34] [SPEAKER_06]: you know maximizing expected
[01:10:36] [SPEAKER_06]: choiceworthiness rather than just
[01:10:38] [SPEAKER_06]: following their favored moral view
[01:10:40] [SPEAKER_06]: I do think the world would be much much better
[01:10:43] [SPEAKER_06]: and yeah take say
[01:10:45] [SPEAKER_06]: the fights in the Colosseum
[01:10:49] [SPEAKER_06]: where hundreds of thousands
[01:10:51] [SPEAKER_06]: of people were killed
[01:10:53] [SPEAKER_06]: for entertainment
[01:10:56] [SPEAKER_06]: over the course of many years
[01:10:58] [SPEAKER_06]: now if someone was thinking like well
[01:11:01] [SPEAKER_06]: I think it's probably
[01:11:02] [SPEAKER_06]: fine to do this because
[01:11:04] [SPEAKER_06]: they're slaves or they've been captured
[01:11:06] [SPEAKER_06]: but
[01:11:07] [SPEAKER_06]: you know maybe it is wrong
[01:11:09] [SPEAKER_06]: maybe crazy thought
[01:11:11] [SPEAKER_06]: maybe it's just as wrong to
[01:11:13] [SPEAKER_06]: feed someone to lions
[01:11:15] [SPEAKER_06]: who's not part of my community
[01:11:17] [SPEAKER_06]: as it is to feed Roman citizens to lions
[01:11:20] [SPEAKER_06]: well I guess I should give that some
[01:11:22] [SPEAKER_06]: degree of belief you know a few percent
[01:11:24] [SPEAKER_06]: or so on and if it is
[01:11:25] [SPEAKER_06]: that seems to outweigh the kind of
[01:11:29] [SPEAKER_06]: mild you know
[01:11:29] [SPEAKER_06]: the pleasure that I get from
[01:11:31] [SPEAKER_06]: seeing someone getting eaten by lions
[01:11:33] [SPEAKER_06]: compared to the other
[01:11:35] [SPEAKER_06]: entertainment that I could have been having at the same time
[01:11:37] [SPEAKER_06]: so I think that you know it would
[01:11:39] [SPEAKER_06]: have worked in that case
[01:11:40] [SPEAKER_02]: yeah I guess my point is it's an empirical question
[01:11:43] [SPEAKER_02]: the question of what
[01:11:45] [SPEAKER_02]: has been the most effective
[01:11:47] [SPEAKER_02]: means of reforming
[01:11:49] [SPEAKER_02]: morally abhorrent practices
[01:11:51] [SPEAKER_02]: greater rational
[01:11:53] [SPEAKER_02]: reflection about
[01:11:55] [SPEAKER_02]: moral issues at least arguably
[01:11:58] [SPEAKER_02]: hasn't been as
[01:11:59] [SPEAKER_02]: effective as
[01:12:01] [SPEAKER_02]: other forms
[01:12:03] [SPEAKER_02]: throughout history so
[01:12:05] [SPEAKER_02]: without having a strong
[01:12:07] [SPEAKER_02]: stand on this question
[01:12:09] [SPEAKER_02]: I guess I'm looking for
[01:12:11] [SPEAKER_02]: you to at least acknowledge that it is an empirical question
[01:12:14] [SPEAKER_02]: and we can't just assume
[01:12:15] [SPEAKER_02]: that greater moral
[01:12:17] [SPEAKER_02]: and rational reflection during
[01:12:19] [SPEAKER_02]: these times would have
[01:12:20] [SPEAKER_02]: led to the reforms
[01:12:22] [SPEAKER_02]: that were desirable
[01:12:25] [SPEAKER_06]: okay terrific yeah
[01:12:27] [SPEAKER_06]: so I definitely agree it's like
[01:12:29] [SPEAKER_06]: it's very unclear
[01:12:31] [SPEAKER_06]: why moral progress happens
[01:12:33] [SPEAKER_06]: are people reflecting more
[01:12:36] [SPEAKER_06]: are people
[01:12:39] [SPEAKER_06]: actually just
[01:12:41] [SPEAKER_06]: getting different experiences and that leads them
[01:12:44] [SPEAKER_06]: to change their views
[01:12:44] [SPEAKER_06]: is it just a mere matter of economics
[01:12:47] [SPEAKER_06]: or technological state that means that people actually
[01:12:50] [SPEAKER_06]: change
[01:12:50] [SPEAKER_06]: I agree that's like super hard
[01:12:53] [SPEAKER_06]: question I'm like very interested in
[01:12:57] [SPEAKER_06]: on the question
[01:12:59] [SPEAKER_06]: there's a like sub question
[01:13:01] [SPEAKER_06]: which was just
[01:13:02] [SPEAKER_06]: if they employ
[01:13:04] [SPEAKER_06]: this maximized expected choiceworthiness idea
[01:13:07] [SPEAKER_06]: rather than
[01:13:10] [SPEAKER_06]: just going with their favourite theory
[01:13:13] [SPEAKER_06]: and if you allow me the premise
[01:13:16] [SPEAKER_06]: that people's beliefs
[01:13:18] [SPEAKER_06]: on average
[01:13:20] [SPEAKER_06]: are you know
[01:13:21] [SPEAKER_06]: closer to being
[01:13:23] [SPEAKER_06]: correct than completely
[01:13:25] [SPEAKER_06]: false
[01:13:26] [SPEAKER_06]: so they're kind of like on average at least
[01:13:29] [SPEAKER_06]: getting somewhere in the right direction
[01:13:31] [SPEAKER_06]: then it's not just an empirical
[01:13:33] [SPEAKER_06]: fact but like a mathematical
[01:13:35] [SPEAKER_06]: fact that someone who's
[01:13:38] [SPEAKER_06]: maximizing
[01:13:40] [SPEAKER_06]: expected choiceworthiness
[01:13:41] [SPEAKER_06]: is going to do more good than
[01:13:43] [SPEAKER_06]: someone
[01:13:45] [SPEAKER_06]: who is just doing their kind of favourite
[01:13:47] [SPEAKER_06]: view at least if they've got
[01:13:49] [SPEAKER_06]: like a wide if we've got like a wide variety
[01:13:51] [SPEAKER_06]: of people with different views and so on
[01:13:53] [SPEAKER_06]: but they're all like somewhat correlated
[01:13:55] [SPEAKER_06]: with the truth
[01:13:56] [SPEAKER_06]: in the same way as
[01:13:58] [SPEAKER_06]: if you're just gambling, you're playing poker
[01:14:01] [SPEAKER_06]: and
[01:14:03] [SPEAKER_06]: one of
[01:14:04] [SPEAKER_06]: you know
[01:14:04] [SPEAKER_06]: one of you is
[01:14:08] [SPEAKER_06]: maximizing your expected
[01:14:10] [SPEAKER_06]: financial payoff
[01:14:12] [SPEAKER_06]: so you're doing whatever bets
[01:14:14] [SPEAKER_06]: kind of maximize expected
[01:14:16] [SPEAKER_06]: monetary gain
[01:14:18] [SPEAKER_06]: and the other of you just looks at what's most likely
[01:14:21] [SPEAKER_06]: to get you some money and just always bets
[01:14:23] [SPEAKER_06]: in accordance with that
[01:14:25] [SPEAKER_06]: well the one of you who is maximizing
[01:14:27] [SPEAKER_06]: expected financial payoff
[01:14:29] [SPEAKER_06]: in the long run will do better
[01:14:31] [SPEAKER_06]: so that's kind of just what I was meaning
[01:14:33] [SPEAKER_06]: with
[01:14:35] [SPEAKER_06]: my earlier comment about things would have been
[01:14:37] [SPEAKER_06]: better I do agree though that like
[01:14:40] [SPEAKER_06]: the role of
[01:14:41] [SPEAKER_06]: kind of ethical reasoning
[01:14:44] [SPEAKER_06]: in
[01:14:47] [SPEAKER_06]: moral progress
[01:14:49] [SPEAKER_06]: is certainly
[01:14:50] [SPEAKER_06]: up for debate
[01:14:51] [SPEAKER_06]: I'm like relatively optimistic about it
[01:14:53] [SPEAKER_06]: I don't think it's in any way
[01:14:55] [SPEAKER_06]: the only thing or even the most important
[01:14:58] [SPEAKER_06]: thing but I think
[01:14:59] [SPEAKER_06]: our capacity to reason and reflect
[01:15:02] [SPEAKER_06]: has been at least in part responsible
[01:15:03] [SPEAKER_06]: for moral progress over time
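[Editor's note: MacAskill's claim above, that maximizing expected choiceworthiness beats always acting on your single favored theory, is the moral analogue of expected-value reasoning. The following is a minimal sketch, with entirely made-up credences and choiceworthiness scores; none of these numbers come from the episode.]

```python
# Toy "maximize expected choiceworthiness" (MEC) calculation.
# All credences and choiceworthiness scores are invented for illustration.

def expected_choiceworthiness(action, credences, cw):
    """Credence-weighted average choiceworthiness of an action
    across the moral theories we give some belief to."""
    return sum(credences[theory] * cw[theory][action] for theory in credences)

# 90% belief in the favored view, 10% in a rival view (the Colosseum case).
credences = {"favored": 0.9, "rival": 0.1}

# Each theory's (hypothetical) rating of each action.
cw = {
    "favored": {"spectacle": 1, "abstain": 0},     # mild entertainment value
    "rival":   {"spectacle": -100, "abstain": 0},  # rival view: gravely wrong
}

# The favored theory alone picks the spectacle, but MEC picks abstaining:
# 0.9 * 1 + 0.1 * (-100) = -9.1, versus 0 for abstaining.
best = max(["spectacle", "abstain"],
           key=lambda a: expected_choiceworthiness(a, credences, cw))
print(best)  # abstain
```

[The poker analogy in the same answer is the same arithmetic: the expected-value bettor may lose any single hand to the "most likely to win" bettor, but comes out ahead in the long run.]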
[01:15:06] [SPEAKER_05]: so
[01:15:07] [SPEAKER_05]: there is a
[01:15:09] [SPEAKER_05]: an aspect of this that I've been
[01:15:12] [SPEAKER_05]: dying to get to and there is
[01:15:14] [SPEAKER_05]: it's
[01:15:15] [SPEAKER_05]: it's similar to
[01:15:17] [SPEAKER_05]: in some form
[01:15:19] [SPEAKER_05]: to what Tamler was saying but it's a
[01:15:21] [SPEAKER_05]: slightly different take on it and
[01:15:22] [SPEAKER_05]: it is
[01:15:25] [SPEAKER_05]: maybe at the heart of it
[01:15:26] [SPEAKER_05]: it's the distinction
[01:15:29] [SPEAKER_05]: between
[01:15:31] [SPEAKER_05]: what
[01:15:33] [SPEAKER_05]: say a theory like consequentialism
[01:15:35] [SPEAKER_05]: or utilitarianism
[01:15:36] [SPEAKER_05]: the difference between the claim that
[01:15:38] [SPEAKER_05]: it is a way to evaluate
[01:15:40] [SPEAKER_05]: the rightness of an action
[01:15:42] [SPEAKER_05]: or an outcome
[01:15:44] [SPEAKER_05]: versus
[01:15:46] [SPEAKER_05]: it's usefulness as a guide
[01:15:48] [SPEAKER_05]: to decision making and I think
[01:15:50] [SPEAKER_05]: in past shows
[01:15:52] [SPEAKER_05]: where I've been sort of
[01:15:55] [SPEAKER_05]: critical
[01:15:56] [SPEAKER_05]: of consequentialism
[01:15:58] [SPEAKER_05]: and I think this is
[01:16:01] [SPEAKER_05]: just my fault
[01:16:02] [SPEAKER_05]: for not distinguishing and not knowing
[01:16:04] [SPEAKER_05]: but I've been schooled by many consequentialists
[01:16:07] [SPEAKER_05]: on the importance of this distinction
[01:16:08] [SPEAKER_05]: which is that I actually have
[01:16:10] [SPEAKER_05]: I'm very very sympathetic to the claim
[01:16:12] [SPEAKER_05]: that consequentialism is
[01:16:17] [SPEAKER_05]: the way to evaluate
[01:16:18] [SPEAKER_05]: the goodness of an act
[01:16:20] [SPEAKER_05]: or an outcome. What I'm very
[01:16:22] [SPEAKER_05]: what I have a lot of problems
[01:16:24] [SPEAKER_05]: with is whether or not people
[01:16:26] [SPEAKER_05]: believing that consequentialism
[01:16:28] [SPEAKER_05]: is the right thing and using it as a guide
[01:16:30] [SPEAKER_05]: is the best
[01:16:32] [SPEAKER_05]: way to bring about
[01:16:34] [SPEAKER_05]: a world that is good
[01:16:36] [SPEAKER_05]: and I think
[01:16:38] [SPEAKER_05]: that part of my
[01:16:40] [SPEAKER_05]: my confusion
[01:16:42] [SPEAKER_05]: in thinking and in communicating has come
[01:16:44] [SPEAKER_05]: from
[01:16:46] [SPEAKER_05]: my familiarity with
[01:16:48] [SPEAKER_05]: utilitarianism and its family
[01:16:50] [SPEAKER_05]: of theories
[01:16:51] [SPEAKER_05]: coming from people like
[01:16:54] [SPEAKER_05]: you and Singer and to some extent
[01:16:56] [SPEAKER_05]: even Josh Green
[01:16:58] [SPEAKER_05]: who are actually
[01:17:01] [SPEAKER_05]: who actually
[01:17:02] [SPEAKER_05]: seem to endorse the view
[01:17:04] [SPEAKER_05]: that if people were to
[01:17:06] [SPEAKER_05]: calculate better
[01:17:07] [SPEAKER_05]: the world would be a better place
[01:17:10] [SPEAKER_05]: and I think this is at the heart of
[01:17:12] [SPEAKER_05]: maybe what Tamler's
[01:17:14] [SPEAKER_05]: saying and what my objections have always been
[01:17:16] [SPEAKER_05]: that people are very uncomfortable
[01:17:18] [SPEAKER_05]: with this and they're also bad at it
[01:17:20] [SPEAKER_05]: and so
[01:17:22] [SPEAKER_05]: I've always thought that there's a great
[01:17:24] [SPEAKER_05]: Sidgwick quote about secretly wanting the world
[01:17:26] [SPEAKER_05]: not to be utilitarian
[01:17:28] [SPEAKER_05]: because that would make for a better world
[01:17:30] [SPEAKER_05]: this is all to say that I think
[01:17:32] [SPEAKER_05]: one of the big challenges of
[01:17:34] [SPEAKER_05]: actually getting the world to be a better place
[01:17:36] [SPEAKER_05]: is using the right psychological tools
[01:17:38] [SPEAKER_05]: to make them think that
[01:17:40] [SPEAKER_05]: maximizing is the right thing to do
[01:17:43] [SPEAKER_05]: and
[01:17:43] [SPEAKER_05]: one of the big objections
[01:17:46] [SPEAKER_05]: that I hear when people
[01:17:48] [SPEAKER_05]: talk about things like effective altruism
[01:17:50] [SPEAKER_05]: is
[01:17:50] [SPEAKER_05]: a variety of Tamler's objection which is
[01:17:54] [SPEAKER_05]: it seems
[01:17:56] [SPEAKER_05]: off-putting to calculate
[01:17:58] [SPEAKER_05]: cost and benefit to my moral
[01:18:00] [SPEAKER_05]: actions and
[01:18:01] [SPEAKER_05]: people really want
[01:18:03] [SPEAKER_05]: charity
[01:18:05] [SPEAKER_05]: to come from a particular
[01:18:08] [SPEAKER_05]: sometimes emotional
[01:18:10] [SPEAKER_05]: response or reflection
[01:18:12] [SPEAKER_05]: of connection
[01:18:13] [SPEAKER_05]: to another human being
[01:18:15] [SPEAKER_05]: sometimes they want it to reflect
[01:18:18] [SPEAKER_05]: their values
[01:18:19] [SPEAKER_05]: and so they
[01:18:21] [SPEAKER_05]: so opening say
[01:18:23] [SPEAKER_05]: some account
[01:18:25] [SPEAKER_05]: where some percentage of your income
[01:18:27] [SPEAKER_05]: is going to the
[01:18:29] [SPEAKER_05]: latest number crunching algorithm
[01:18:31] [SPEAKER_05]: that is telling you
[01:18:33] [SPEAKER_05]: what the best maximizing
[01:18:35] [SPEAKER_05]: strategy is
[01:18:37] [SPEAKER_05]: is off-putting to them
[01:18:39] [SPEAKER_05]: because that's just
[01:18:41] [SPEAKER_05]: not how we're built
[01:18:42] [SPEAKER_05]: we're not well suited
[01:18:45] [SPEAKER_05]: for that kind of decision making
[01:18:47] [SPEAKER_05]: and moreover
[01:18:49] [SPEAKER_05]: if we do try it we often err in all
[01:18:51] [SPEAKER_05]: the ways that psychologists might
[01:18:53] [SPEAKER_05]: expect where you distort
[01:18:55] [SPEAKER_05]: the probabilities and the values of future
[01:18:57] [SPEAKER_05]: events in a self-serving way
[01:18:58] [SPEAKER_06]: that makes a lot of sense
[01:19:01] [SPEAKER_06]: firstly the distinction between
[01:19:03] [SPEAKER_06]: the criterion of rightness
[01:19:05] [SPEAKER_06]: and the rules you should follow
[01:19:07] [SPEAKER_06]: is very important
[01:19:10] [SPEAKER_06]: I'm a mediocre squash player
[01:19:12] [SPEAKER_06]: the aim of squash
[01:19:14] [SPEAKER_06]: is to get more than
[01:19:16] [SPEAKER_06]: 9 points in each game
[01:19:17] [SPEAKER_06]: and so on
[01:19:18] [SPEAKER_06]: the rule I follow is just hit it down the wall
[01:19:20] [SPEAKER_06]: and if I try to follow any other rule
[01:19:22] [SPEAKER_06]: than that I do worse
[01:19:23] [SPEAKER_06]: so I should just forget about the score as a goal
[01:19:27] [SPEAKER_06]: the other thing I should say
[01:19:29] [SPEAKER_06]: you mentioned kind of what we're built to do
[01:19:31] [SPEAKER_06]: and I think one thing that
[01:19:33] [SPEAKER_06]: I see as a kind of win for utilitarianism
[01:19:36] [SPEAKER_06]: is recognizing
[01:19:37] [SPEAKER_06]: that the moral situation
[01:19:39] [SPEAKER_06]: we're in as
[01:19:41] [SPEAKER_06]: middle class
[01:19:44] [SPEAKER_06]: members of rich countries in
[01:19:45] [SPEAKER_06]: the early 21st century
[01:19:47] [SPEAKER_06]: it's just radically different
[01:19:49] [SPEAKER_06]: than the moral situation that
[01:19:51] [SPEAKER_06]: we're built for or used to
[01:19:53] [SPEAKER_06]: all we would regard as kind of normal
[01:19:55] [SPEAKER_06]: so kind of as a thought experiment
[01:19:58] [SPEAKER_06]: just think about like
[01:19:59] [SPEAKER_06]: what would a utilitarian say about what someone
[01:20:01] [SPEAKER_06]: in the poorest 2 billion people
[01:20:03] [SPEAKER_06]: in the world who are living
[01:20:05] [SPEAKER_06]: on less than a few dollars per day
[01:20:07] [SPEAKER_06]: you know what's their kind of moral recommendation
[01:20:09] [SPEAKER_06]: for those people
[01:20:10] [SPEAKER_06]: which is actually much more like
[01:20:12] [SPEAKER_06]: the normal situation for humans
[01:20:15] [SPEAKER_06]: and it's like common sense
[01:20:16] [SPEAKER_06]: it's like work hard, take care of your family
[01:20:20] [SPEAKER_06]: be a good person
[01:20:21] [SPEAKER_06]: don't steal, don't cheat
[01:20:23] [SPEAKER_06]: you know if you're like
[01:20:25] [SPEAKER_06]: engaged by some community thing
[01:20:28] [SPEAKER_06]: and you're going to work hard at
[01:20:29] [SPEAKER_06]: that too it's like actually
[01:20:31] [SPEAKER_06]: really not that
[01:20:32] [SPEAKER_06]: uncommonsensical
[01:20:34] [SPEAKER_06]: it gets weird when
[01:20:37] [SPEAKER_06]: we're talking about you or I
[01:20:39] [SPEAKER_06]: but we're in an incredibly weird position
[01:20:41] [SPEAKER_06]: we're like outliers statistically speaking
[01:20:42] [SPEAKER_06]: we're in the top few percent of the world's population
[01:20:45] [SPEAKER_06]: and so I think like
[01:20:47] [SPEAKER_06]: yeah a lot of the unintuitiveness
[01:20:48] [SPEAKER_06]: of utilitarianism
[01:20:51] [SPEAKER_06]: comes from
[01:20:51] [SPEAKER_06]: the empirical fact not the moral fact
[01:20:55] [SPEAKER_05]: right and most there's
[01:20:56] [SPEAKER_05]: a lot of agreement as to
[01:20:59] [SPEAKER_05]: what normative theories would
[01:21:01] [SPEAKER_05]: you know would claim
[01:21:02] [SPEAKER_05]: like so it's not
[01:21:04] [SPEAKER_05]: as if in the wild people are
[01:21:07] [SPEAKER_05]: in supreme disagreement
[01:21:09] [SPEAKER_05]: because one of them is a deontologist
[01:21:11] [SPEAKER_05]: and the other ones a utilitarian
[01:21:13] [SPEAKER_06]: yeah I actually
[01:21:15] [SPEAKER_06]: I think when it comes to people like you
[01:21:17] [SPEAKER_06]: or I there's like quite a lot of
[01:21:19] [SPEAKER_06]: disagreement insofar as there's
[01:21:21] [SPEAKER_06]: you know non-consequentialist
[01:21:22] [SPEAKER_05]: this domain of charitable giving
[01:21:25] [SPEAKER_05]: is probably the most
[01:21:26] [SPEAKER_05]: right like the most
[01:21:29] [SPEAKER_05]: salient way
[01:21:31] [SPEAKER_05]: in which people might disagree
[01:21:33] [SPEAKER_05]: yeah that's right
[01:21:35] [SPEAKER_05]: but like to
[01:21:36] [SPEAKER_05]: focus my question more
[01:21:38] [SPEAKER_05]: like I think so
[01:21:40] [SPEAKER_05]: would you be sympathetic
[01:21:42] [SPEAKER_05]: to the view that
[01:21:43] [SPEAKER_05]: the best way to
[01:21:46] [SPEAKER_05]: to get people
[01:21:48] [SPEAKER_05]: to donate
[01:21:49] [SPEAKER_05]: to you know
[01:21:51] [SPEAKER_05]: to actually maximize the good in the world
[01:21:53] [SPEAKER_05]: would be to simply appeal
[01:21:56] [SPEAKER_05]: to their more simple deontological intuitions
[01:22:00] [SPEAKER_06]: I'm not
[01:22:01] [SPEAKER_06]: sure one kind of background thing is
[01:22:04] [SPEAKER_06]: that I'm just generally
[01:22:06] [SPEAKER_06]: extremely reticent about anything
[01:22:08] [SPEAKER_06]: that's kind of disingenuous
[01:22:10] [SPEAKER_06]: so certainly if it was the case that
[01:22:12] [SPEAKER_06]: and you know even just on
[01:22:14] [SPEAKER_06]: consequentialist grounds
[01:22:15] [SPEAKER_06]: so certainly if it was the case that you're misleading people
[01:22:17] [SPEAKER_06]: then I'm just
[01:22:19] [SPEAKER_06]: you know my strong default
[01:22:21] [SPEAKER_06]: is that's going to be a kind of bad thing to do
[01:22:23] [SPEAKER_06]: but of course you don't need to
[01:22:24] [SPEAKER_06]: you don't need to mislead people
[01:22:27] [SPEAKER_06]: there can be multiple
[01:22:28] [SPEAKER_06]: like why give away your money
[01:22:30] [SPEAKER_06]: I think there's two strong
[01:22:32] [SPEAKER_06]: completely independent reasons
[01:22:35] [SPEAKER_06]: one is that we're morally required to do so
[01:22:37] [SPEAKER_06]: another is this
[01:22:38] [SPEAKER_06]: amazing opportunity to
[01:22:40] [SPEAKER_06]: have a more fulfilling and meaningful life
[01:22:43] [SPEAKER_06]: I think both of those things can be true
[01:22:45] [SPEAKER_06]: and if you're sending a message you can choose
[01:22:46] [SPEAKER_06]: which message to broadcast
[01:22:50] [SPEAKER_06]: right
[01:22:51] [SPEAKER_06]: then I would have definitely
[01:22:53] [SPEAKER_06]: been like confident that yes
[01:22:55] [SPEAKER_06]: you should
[01:22:57] [SPEAKER_06]: tailor the message
[01:22:58] [SPEAKER_06]: not just give these kind of rational arguments
[01:23:00] [SPEAKER_06]: and so on but I kind of think
[01:23:02] [SPEAKER_06]: that the success of
[01:23:04] [SPEAKER_06]: effective altruism belies
[01:23:06] [SPEAKER_06]: that at least a little bit and I think that's for
[01:23:09] [SPEAKER_06]: two reasons
[01:23:09] [SPEAKER_06]: one is just there's this weird class
[01:23:12] [SPEAKER_06]: of people who are kind of like aliens
[01:23:14] [SPEAKER_06]: who hear
[01:23:16] [SPEAKER_06]: and I'm one of them
[01:23:17] [SPEAKER_06]: who hear these arguments and just like
[01:23:20] [SPEAKER_06]: oh yeah that makes sense
[01:23:22] [SPEAKER_06]: and then they
[01:23:24] [SPEAKER_06]: often make just massive changes to their life
[01:23:26] [SPEAKER_06]: like giving 10% of their income and so on
[01:23:29] [SPEAKER_06]: whereas
[01:23:30] [SPEAKER_06]: all the psychological evidence is about much
[01:23:32] [SPEAKER_06]: smaller amounts of money you know it's a few dollars
[01:23:34] [SPEAKER_06]: or something in a lab because you can't
[01:23:37] [SPEAKER_06]: you wouldn't be able to get the
[01:23:38] [SPEAKER_06]: statistical significance if it's pledging 10%
[01:23:41] [SPEAKER_06]: then the second aspect is just
[01:23:43] [SPEAKER_06]: like
[01:23:44] [SPEAKER_06]: marketing on the margin question so there's
[01:23:46] [SPEAKER_06]: already huge numbers of
[01:23:49] [SPEAKER_06]: organizations that are
[01:23:50] [SPEAKER_06]: appealing to our guts and our emotions
[01:23:53] [SPEAKER_06]: and trying to get
[01:23:54] [SPEAKER_06]: people just to do like a little bit more
[01:23:58] [SPEAKER_06]: the fact
[01:23:58] [SPEAKER_06]: that we're saying look
[01:24:00] [SPEAKER_06]: we're just going to give you the arguments
[01:24:03] [SPEAKER_06]: and you can listen to them
[01:24:05] [SPEAKER_06]: if you want to
[01:24:06] [SPEAKER_06]: or you can ignore them but this is like
[01:24:08] [SPEAKER_06]: how we're going to speak
[01:24:10] [SPEAKER_06]: just actually makes us kind of
[01:24:13] [SPEAKER_06]: much more distinctive
[01:24:14] [SPEAKER_06]: and so just thinking about it
[01:24:16] [SPEAKER_06]: from a marketing perspective
[01:24:19] [SPEAKER_06]: even if that might not be the best
[01:24:21] [SPEAKER_06]: strategy in general I think there's a case for it
[01:24:22] [SPEAKER_06]: for being the best strategy for us
[01:24:24] [SPEAKER_02]: yeah I mean in some ways
[01:24:26] [SPEAKER_02]: yes the popularity
[01:24:28] [SPEAKER_02]: and the rapid increase
[01:24:30] [SPEAKER_02]: in popularity of the movement
[01:24:31] [SPEAKER_02]: does belie
[01:24:34] [SPEAKER_02]: the view that it is
[01:24:36] [SPEAKER_02]: psychologically
[01:24:37] [SPEAKER_02]: undoable for
[01:24:38] [SPEAKER_02]: at least many of us or some of us
[01:24:43] [SPEAKER_02]: I actually though
[01:24:44] [SPEAKER_02]: want to
[01:24:45] [SPEAKER_02]: build on Dave's question
[01:24:48] [SPEAKER_02]: and I know we're getting short on time
[01:24:50] [SPEAKER_02]: about the off-puttingness
[01:24:51] [SPEAKER_02]: of it and I just want to present to you
[01:24:54] [SPEAKER_02]: a case so imagine
[01:24:55] [SPEAKER_02]: you're an effective altruist
[01:24:57] [SPEAKER_02]: committed to the principles
[01:24:59] [SPEAKER_02]: behind the movement
[01:25:02] [SPEAKER_02]: and your child
[01:25:05] [SPEAKER_02]: needs
[01:25:05] [SPEAKER_02]: surgery
[01:25:07] [SPEAKER_02]: maybe she won't die
[01:25:09] [SPEAKER_02]: but she'll go blind if she doesn't
[01:25:12] [SPEAKER_02]: have the surgery
[01:25:13] [SPEAKER_02]: and the surgery is expensive
[01:25:15] [SPEAKER_02]: maybe you've gone into
[01:25:17] [SPEAKER_02]: banking you know so that
[01:25:19] [SPEAKER_02]: you can give a big portion of your money to
[01:25:21] [SPEAKER_02]: so you have the money for the surgery
[01:25:24] [SPEAKER_02]: now you apply
[01:25:26] [SPEAKER_02]: your decision-making procedure
[01:25:28] [SPEAKER_02]: to this question
[01:25:30] [SPEAKER_02]: of whether you should give that money
[01:25:32] [SPEAKER_02]: for your daughter's surgery
[01:25:35] [SPEAKER_02]: or whether you should give it
[01:25:36] [SPEAKER_02]: for a much more
[01:25:38] [SPEAKER_02]: effective use
[01:25:40] [SPEAKER_02]: a much you know
[01:25:41] [SPEAKER_02]: reducing suffering to a much greater degree
[01:25:44] [SPEAKER_02]: than a single person
[01:25:46] [SPEAKER_02]: not losing their sight
[01:25:48] [SPEAKER_02]: so two things
[01:25:50] [SPEAKER_02]: it seems like you might
[01:25:51] [SPEAKER_02]: that decision procedure might
[01:25:53] [SPEAKER_02]: lead you to say no I'm not doing
[01:25:55] [SPEAKER_02]: the surgery is that something
[01:25:57] [SPEAKER_02]: you're comfortable with
[01:25:59] [SPEAKER_02]: but the second part of it
[01:26:01] [SPEAKER_02]: is even if you think the decision
[01:26:03] [SPEAKER_02]: procedure wouldn't lead to that
[01:26:05] [SPEAKER_02]: I think the off-puttingness
[01:26:07] [SPEAKER_02]: sometimes comes from
[01:26:09] [SPEAKER_02]: just the fact that you
[01:26:11] [SPEAKER_02]: would be engaging
[01:26:12] [SPEAKER_02]: in a decision procedure of this
[01:26:15] [SPEAKER_02]: nature at all
[01:26:17] [SPEAKER_02]: about something so personal
[01:26:19] [SPEAKER_02]: that this should just be a no-brainer
[01:26:21] [SPEAKER_02]: and not a no-brainer
[01:26:23] [SPEAKER_02]: because you've applied a decision
[01:26:25] [SPEAKER_02]: procedure and it
[01:26:27] [SPEAKER_02]: and it leads to okay it's
[01:26:29] [SPEAKER_02]: permissible to
[01:26:31] [SPEAKER_02]: spend the money for the surgery
[01:26:33] [SPEAKER_02]: I guess this is a version of the one thought too many
[01:26:35] [SPEAKER_02]: no you just do it
[01:26:37] [SPEAKER_02]: not because
[01:26:38] [SPEAKER_02]: the decision procedure said it was okay
[01:26:41] [SPEAKER_02]: but because she's your daughter
[01:26:43] [SPEAKER_02]: and she needs
[01:26:44] [SPEAKER_02]: your help
[01:26:45] [SPEAKER_06]: terrific so
[01:26:46] [SPEAKER_06]: the first thing I want to say
[01:26:48] [SPEAKER_06]: sorry not terrific that this hypothetical daughter
[01:26:51] [SPEAKER_06]: is going blind
[01:26:52] [SPEAKER_06]: I have a terrible
[01:26:55] [SPEAKER_06]: habit of saying terrific whenever anyone says anything
[01:26:57] [SPEAKER_06]: to me and it really gets me into trouble
[01:26:59] [SPEAKER_06]: when you're talking about this
[01:27:00] [SPEAKER_06]: I was thinking oh great that was a good question
[01:27:03] [SPEAKER_13]: every time you said terrific
[01:27:04] [SPEAKER_13]: but now I have to rethink that
[01:27:07] [SPEAKER_06]: no but it is an excellent question
[01:27:09] [SPEAKER_06]: and I firstly just
[01:27:11] [SPEAKER_06]: I really want to distinguish
[01:27:12] [SPEAKER_06]: effective altruism from
[01:27:15] [SPEAKER_06]: utilitarianism where
[01:27:16] [SPEAKER_06]: effective altruism is just the idea
[01:27:18] [SPEAKER_06]: you know, the
[01:27:19] [SPEAKER_06]: project of trying to do as much good
[01:27:22] [SPEAKER_06]: as possible with a given amount of money or resources
[01:27:24] [SPEAKER_06]: and people making significant kinds of
[01:27:26] [SPEAKER_06]: sacrifices and if you want
[01:27:28] [SPEAKER_06]: if you can bring in kind of Singer's principle which is
[01:27:30] [SPEAKER_06]: if you can prevent something very bad
[01:27:32] [SPEAKER_06]: from happening without thereby
[01:27:34] [SPEAKER_06]: sacrificing anything of moral significance
[01:27:37] [SPEAKER_06]: you ought morally to do it
[01:27:38] [SPEAKER_06]: and Singer's principle
[01:27:40] [SPEAKER_06]: wouldn't apply in this case your child going blind
[01:27:42] [SPEAKER_06]: is clearly of moral significance
[01:27:44] [SPEAKER_02]: but his strong principle would apply
[01:27:46] [SPEAKER_02]: and that is
[01:27:48] [SPEAKER_02]: certainly a principle you would have to
[01:27:50] [SPEAKER_02]: think and it's one
[01:27:52] [SPEAKER_02]: that he thinks is true, the one of comparable
[01:27:54] [SPEAKER_02]: moral significance
[01:27:56] [SPEAKER_02]: and so you would have to if you had
[01:27:58] [SPEAKER_02]: your way we would put the
[01:28:00] [SPEAKER_02]: plausibility of that principle into
[01:28:02] [SPEAKER_02]: the decision making process right
[01:28:06] [SPEAKER_06]: yes that's right so I think you ought to have
[01:28:08] [SPEAKER_06]: so utilitarianism will say
[01:28:10] [SPEAKER_06]: and this is like
[01:28:12] [SPEAKER_06]: you know utilitarianism being an extremely
[01:28:14] [SPEAKER_06]: radical view would say that
[01:28:16] [SPEAKER_06]: ultimately at rock bottom
[01:28:18] [SPEAKER_06]: yeah everyone's life is equal
[01:28:20] [SPEAKER_06]: the fact that someone's your daughter doesn't give
[01:28:24] [SPEAKER_06]: you special reason to prioritize
[01:28:26] [SPEAKER_06]: them above someone else
[01:28:29] [SPEAKER_06]: you know obviously there's
[01:28:31] [SPEAKER_06]: various things you can say about how it will affect
[01:28:32] [SPEAKER_06]: you and like the good
[01:28:34] [SPEAKER_06]: that you might do afterwards and so on
[01:28:36] [SPEAKER_06]: but kind of the core thing is just that everyone is equal
[01:28:39] [SPEAKER_06]: and that's a tremendously radical view
[01:28:41] [SPEAKER_06]: that's clearly a very unintuitive view
[01:28:43] [SPEAKER_06]: if you come to believe that view it's because
[01:28:47] [SPEAKER_06]: of
[01:28:48] [SPEAKER_06]: you know thinking about how you were
[01:28:50] [SPEAKER_06]: biased by evolution and so on
[01:28:53] [SPEAKER_06]: to favor kind of
[01:28:54] [SPEAKER_06]: the near and dear
[01:28:56] [SPEAKER_06]: and then
[01:28:58] [SPEAKER_06]: in my kind of moral uncertainty
[01:29:00] [SPEAKER_06]: framework yeah you should place kind of some
[01:29:02] [SPEAKER_06]: degree of belief on that and it's a very open question
[01:29:04] [SPEAKER_06]: what degree of belief you should place
[01:29:08] [SPEAKER_06]: but then it could be the
[01:29:10] [SPEAKER_06]: case that given my moral uncertainty
[01:29:12] [SPEAKER_06]: framework you do end up concluding well
[01:29:14] [SPEAKER_06]: we should help people I should
[01:29:16] [SPEAKER_06]: help the strangers rather than my own child
[01:29:19] [SPEAKER_06]: but it would depend
[01:29:22] [SPEAKER_06]: you know crucially on just how plausible
[01:29:25] [SPEAKER_06]: do you find this view to be
[01:29:26] [SPEAKER_06]: and if you as it seems
[01:29:28] [SPEAKER_06]: like you do find it incredibly implausible
[01:29:30] [SPEAKER_06]: well then
[01:29:33] [SPEAKER_06]: the framework
[01:29:34] [SPEAKER_06]: would you know say no
[01:29:35] [SPEAKER_06]: you want to save your child
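The expected-value framework MacAskill is describing here (assign credences to rival moral views, weigh what each view says about an option, pick the option with the highest expected moral value) can be sketched in code. This is an illustrative toy only; the theory names, options, credences, and valuation numbers below are all made up for the example, not anything stated in the episode:

```python
# Toy sketch of "maximize expected choice-worthiness" under moral uncertainty.
# All numbers are illustrative assumptions, not values from the discussion.

def expected_choiceworthiness(credences, valuations, option):
    # Weight each theory's valuation of the option by your credence in that theory.
    return sum(credences[t] * valuations[t][option] for t in credences)

# Credence: how plausible you find each moral theory (sums to 1).
credences = {"utilitarianism": 0.3, "common_sense": 0.7}

# How good each theory says each option is, on some common scale.
valuations = {
    "utilitarianism": {"save_strangers": 10.0, "save_child": 1.0},
    "common_sense":   {"save_strangers": 2.0,  "save_child": 9.0},
}

options = ["save_strangers", "save_child"]
best = max(options, key=lambda o: expected_choiceworthiness(credences, valuations, o))
```

With these made-up numbers, `save_strangers` scores 0.3*10 + 0.7*2 = 4.4 and `save_child` scores 0.3*1 + 0.7*9 = 6.6, so the framework recommends saving your child, which is exactly the point in the conversation: if you find the radical utilitarian view very implausible, the uncertainty calculation comes out the ordinary way.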
[01:29:38] [SPEAKER_06]: and you might say
[01:29:40] [SPEAKER_06]: anticipating a possible response
[01:29:41] [SPEAKER_06]: say well even the process
[01:29:44] [SPEAKER_06]: of going through this well I'm going to place
[01:29:46] [SPEAKER_06]: credences on these different
[01:29:48] [SPEAKER_06]: views and figure out what's right
[01:29:49] [SPEAKER_06]: even that is kind of violating the one thought too many
[01:29:52] [SPEAKER_06]: idea
[01:29:53] [SPEAKER_06]: but there again I'd want to distinguish between
[01:29:56] [SPEAKER_06]: the criterion of rightness
[01:29:58] [SPEAKER_06]: and the decision procedure where
[01:30:00] [SPEAKER_06]: my moral uncertainty framework
[01:30:02] [SPEAKER_06]: is saying what is
[01:30:04] [SPEAKER_06]: in fact the right decision
[01:30:06] [SPEAKER_06]: it's not making a claim about how
[01:30:08] [SPEAKER_06]: ought you to reason
[01:30:10] [SPEAKER_06]: so in a sense if you just flip a coin
[01:30:12] [SPEAKER_06]: and then pick the right option
[01:30:13] [SPEAKER_06]: then you've done the right thing
[01:30:16] [SPEAKER_06]: and so there's
[01:30:17] [SPEAKER_06]: so yeah so you might think
[01:30:19] [SPEAKER_06]: in fact the right way to deliberate
[01:30:21] [SPEAKER_06]: in this case is just to pick the thing
[01:30:24] [SPEAKER_06]: that's right because you've cultivated the right
[01:30:25] [SPEAKER_06]: kind of moral virtues
[01:30:28] [SPEAKER_06]: and so on and so
[01:30:30] [SPEAKER_06]: in that sense the kind of analogy of the discussion
[01:30:32] [SPEAKER_06]: that we had before where we're talking about
[01:30:35] [SPEAKER_06]: utilitarianism or how I play squash
[01:30:37] [SPEAKER_06]: where you want to have a sharp
[01:30:39] [SPEAKER_06]: division between what things
[01:30:41] [SPEAKER_06]: are actually right and wrong
[01:30:42] [SPEAKER_06]: and how should we go about
[01:30:44] [SPEAKER_06]: deliberating or thinking so
[01:30:46] [SPEAKER_05]: I think these are really important
[01:30:49] [SPEAKER_05]: distinctions to make because
[01:30:50] [SPEAKER_05]: I think it is
[01:30:53] [SPEAKER_05]: to the
[01:30:56] [SPEAKER_05]: disadvantage of
[01:30:57] [SPEAKER_05]: both consequentialists and
[01:30:59] [SPEAKER_05]: deontologists the thought that
[01:31:00] [SPEAKER_05]: doing more to
[01:31:02] [SPEAKER_05]: help out people in the
[01:31:04] [SPEAKER_05]: world is
[01:31:06] [SPEAKER_05]: the purview of utilitarianism alone
[01:31:09] [SPEAKER_05]: that thought is wrong-headed
[01:31:12] [SPEAKER_05]: right so you can
[01:31:13] [SPEAKER_05]: you can
[01:31:14] [SPEAKER_05]: be a deontologist a
[01:31:16] [SPEAKER_05]: hardcore Kantian and still be convinced
[01:31:18] [SPEAKER_05]: that you personally ought to do more
[01:31:21] [SPEAKER_05]: and when you decide that
[01:31:24] [SPEAKER_05]: then it's to me
[01:31:25] [SPEAKER_05]: a no brainer to try to
[01:31:26] [SPEAKER_05]: find the people who might need it the most
[01:31:29] [SPEAKER_05]: that's not to say that you
[01:31:30] [SPEAKER_05]: can't continue to help out family and friends
[01:31:32] [SPEAKER_05]: but everybody could do a little more
[01:31:34] [SPEAKER_05]: and there's nothing in
[01:31:36] [SPEAKER_05]: deontology that prevents you from doing it
[01:31:39] [SPEAKER_05]: so or virtue ethics
[01:31:41] [SPEAKER_13]: or really
[01:31:44] [SPEAKER_05]: no particular ethical
[01:31:46] [SPEAKER_05]: theory
[01:31:47] [SPEAKER_05]: even a particularist
[01:31:49] [SPEAKER_02]: would want to do it
[01:31:51] [SPEAKER_02]: and then might be interested in knowing what the
[01:31:52] [SPEAKER_02]: most effective
[01:31:54] [SPEAKER_02]: way of helping people is
[01:31:56] [SPEAKER_05]: yeah and because I haven't had a chance
[01:31:59] [SPEAKER_05]: to say it yet I want to
[01:32:01] [SPEAKER_05]: say like all of the work that you've been doing
[01:32:03] [SPEAKER_05]: is something that should
[01:32:05] [SPEAKER_05]: be championed
[01:32:06] [SPEAKER_05]: on behalf of humanity I thank you
[01:32:08] [SPEAKER_05]: not to kiss your ass but like I think that
[01:32:11] [SPEAKER_05]: this is really important
[01:32:12] [SPEAKER_05]: work but all that
[01:32:14] [SPEAKER_05]: said the question of the decision
[01:32:16] [SPEAKER_05]: procedure and how people arrive
[01:32:19] [SPEAKER_05]: at decision making I think
[01:32:21] [SPEAKER_05]: is more important than just for
[01:32:23] [SPEAKER_05]: what you said right so I think you're absolutely
[01:32:25] [SPEAKER_05]: right that like if you can just convince
[01:32:26] [SPEAKER_05]: whatever you know
[01:32:28] [SPEAKER_05]: 1% of the 1%
[01:32:30] [SPEAKER_05]: who are rich silicon
[01:32:32] [SPEAKER_05]: valley engineers to give money this might
[01:32:35] [SPEAKER_05]: solve a lot of the world's problems and in that
[01:32:37] [SPEAKER_05]: sense you don't need to
[01:32:39] [SPEAKER_05]: pay that much attention to
[01:32:41] [SPEAKER_05]: the descriptive data about
[01:32:43] [SPEAKER_05]: the average person
[01:32:44] [SPEAKER_05]: but there is there is
[01:32:47] [SPEAKER_05]: a sense in which I think
[01:32:48] [SPEAKER_05]: everything that Tamler
[01:32:51] [SPEAKER_05]: was saying about
[01:32:53] [SPEAKER_05]: people's reactions
[01:32:54] [SPEAKER_05]: and the one thought too many that
[01:32:56] [SPEAKER_05]: that is I think still
[01:32:59] [SPEAKER_05]: important in the communication of
[01:33:00] [SPEAKER_05]: this stuff because
[01:33:03] [SPEAKER_05]: it isn't
[01:33:04] [SPEAKER_05]: it's not just
[01:33:07] [SPEAKER_05]: that
[01:33:07] [SPEAKER_05]: there are certain people who are comfortable doing the
[01:33:10] [SPEAKER_05]: calculations it's that those people themselves
[01:33:14] [SPEAKER_05]: I think
[01:33:15] [SPEAKER_05]: are just perceived poorly
[01:33:16] [SPEAKER_05]: and wrongly
[01:33:18] [SPEAKER_05]: because of what they're doing and so
[01:33:20] [SPEAKER_05]: I want to give this
[01:33:22] [SPEAKER_05]: example from my
[01:33:24] [SPEAKER_05]: incoming student Andres Montelegre
[01:33:26] [SPEAKER_05]: who's very much into
[01:33:28] [SPEAKER_05]: effective altruism
[01:33:31] [SPEAKER_05]: he was on a bus
[01:33:33] [SPEAKER_05]: here you know he's a young kid
[01:33:35] [SPEAKER_05]: he's a
[01:33:36] [SPEAKER_05]: student from Colombia he was here for
[01:33:38] [SPEAKER_05]: the Fourth of July weekend and
[01:33:40] [SPEAKER_05]: here in Ithaca you can take the bus
[01:33:42] [SPEAKER_05]: to go see the fireworks at the lake
[01:33:45] [SPEAKER_05]: and
[01:33:46] [SPEAKER_05]: it's free so he
[01:33:48] [SPEAKER_05]: hopped on one of the buses
[01:33:50] [SPEAKER_05]: and
[01:33:51] [SPEAKER_05]: at the end of the bus ride the bus driver
[01:33:54] [SPEAKER_05]: stood up and said
[01:33:56] [SPEAKER_05]: hey you guys I
[01:33:58] [SPEAKER_05]: just want to take a moment if you
[01:34:00] [SPEAKER_05]: let me
[01:34:02] [SPEAKER_05]: it was something like his friend's daughter
[01:34:04] [SPEAKER_05]: was in the hospital
[01:34:07] [SPEAKER_05]: with cancer
[01:34:08] [SPEAKER_05]: and they were struggling
[01:34:10] [SPEAKER_05]: you know here in the good old U.S.
[01:34:12] [SPEAKER_05]: of A we don't pay for everybody's
[01:34:14] [SPEAKER_05]: health insurance so he said
[01:34:16] [SPEAKER_05]: I'd just like to ask
[01:34:18] [SPEAKER_05]: if anybody would be willing to donate some money
[01:34:20] [SPEAKER_05]: and so he was walking
[01:34:21] [SPEAKER_05]: down the bus aisle asking
[01:34:24] [SPEAKER_05]: people to give money people were taking out their
[01:34:26] [SPEAKER_05]: money and my
[01:34:27] [SPEAKER_05]: student was with somebody else
[01:34:30] [SPEAKER_05]: and he just sort of
[01:34:32] [SPEAKER_05]: very clearly
[01:34:34] [SPEAKER_05]: said no when the guy came
[01:34:36] [SPEAKER_05]: around and his friend
[01:34:38] [SPEAKER_05]: says what like why did you say no
[01:34:40] [SPEAKER_05]: and his response was
[01:34:42] [SPEAKER_05]: there are just other more effective ways that I
[01:34:44] [SPEAKER_05]: can use my money
[01:34:46] [SPEAKER_05]: and of course
[01:34:48] [SPEAKER_05]: this response was sort of
[01:34:50] [SPEAKER_05]: abhorrent to both the person
[01:34:52] [SPEAKER_05]: he was with and in some ways
[01:34:53] [SPEAKER_05]: to me and I think this is at the heart
[01:34:56] [SPEAKER_05]: of it and
[01:34:58] [SPEAKER_05]: shout out to Molly Crockett and Jim Everett who
[01:35:00] [SPEAKER_05]: did research with me on this very
[01:35:02] [SPEAKER_05]: question is that your moral
[01:35:05] [SPEAKER_05]: deliberation
[01:35:05] [SPEAKER_05]: in that case is signaling
[01:35:07] [SPEAKER_05]: something that people really care
[01:35:09] [SPEAKER_05]: about which is
[01:35:10] [SPEAKER_05]: what kind of a person are you like
[01:35:13] [SPEAKER_05]: can I trust you are you compassionate and kind
[01:35:16] [SPEAKER_05]: and
[01:35:17] [SPEAKER_05]: what I was telling my student is it's
[01:35:19] [SPEAKER_05]: very in those cases
[01:35:21] [SPEAKER_05]: very easy to just take a dollar out of your pocket
[01:35:23] [SPEAKER_05]: and give it
[01:35:25] [SPEAKER_05]: because people are now going to associate
[01:35:27] [SPEAKER_05]: the effective altruism movement with
[01:35:29] [SPEAKER_05]: like some spectrumy
[01:35:31] [SPEAKER_05]: computer crunching guy who doesn't care about
[01:35:33] [SPEAKER_05]: the suffering of this bus driver's daughter
[01:35:35] [SPEAKER_05]: right
[01:35:36] [SPEAKER_05]: and so it's not to be disingenuous but there are
[01:35:39] [SPEAKER_05]: ways of communicating
[01:35:41] [SPEAKER_05]: the right
[01:35:42] [SPEAKER_05]: kind of
[01:35:45] [SPEAKER_05]: sensibility you don't have to sacrifice
[01:35:47] [SPEAKER_05]: the sensibilities of
[01:35:49] [SPEAKER_05]: a normal person interpersonally
[01:35:51] [SPEAKER_05]: when you
[01:35:52] [SPEAKER_05]: are judging somebody and you're judging
[01:35:55] [SPEAKER_05]: their trustworthiness
[01:35:56] [SPEAKER_05]: and their character you want to know
[01:35:59] [SPEAKER_05]: that their
[01:36:00] [SPEAKER_05]: heart actually is
[01:36:01] [SPEAKER_05]: warmed by others
[01:36:04] [SPEAKER_05]: that they are loyal, that they're kind
[01:36:06] [SPEAKER_05]: and compassionate and that just matters
[01:36:08] [SPEAKER_05]: because we live with other people
[01:36:09] [SPEAKER_05]: it's not even a big moral principle
[01:36:11] [SPEAKER_05]: it's just how
[01:36:14] [SPEAKER_05]: people will value you and respect you
[01:36:16] [SPEAKER_05]: and what kind of a citizen
[01:36:17] [SPEAKER_05]: you'll be to all those around you right
[01:36:19] [SPEAKER_06]: yeah I strongly agree with that
[01:36:22] [SPEAKER_06]: I think there's an easy
[01:36:24] [SPEAKER_06]: risk of over optimizing
[01:36:25] [SPEAKER_06]: in a whole host of ways
[01:36:28] [SPEAKER_06]: and
[01:36:29] [SPEAKER_06]: yeah you're exactly right
[01:36:31] [SPEAKER_06]: it's like
[01:36:32] [SPEAKER_06]: giving a dollar to someone
[01:36:36] [SPEAKER_06]: there's an
[01:36:37] [SPEAKER_06]: interesting question because the reaction you get
[01:36:41] [SPEAKER_06]: one of the things I get this strongly with
[01:36:44] [SPEAKER_06]: is
[01:36:45] [SPEAKER_06]: there's some people I know
[01:36:48] [SPEAKER_06]: who
[01:36:51] [SPEAKER_06]: care immensely about animal welfare
[01:36:54] [SPEAKER_06]: but then still eat
[01:36:56] [SPEAKER_06]: certain forms of meat
[01:36:57] [SPEAKER_06]: that are kind of humanely raised and so on
[01:36:59] [SPEAKER_06]: and I'm like
[01:37:00] [SPEAKER_06]: I don't understand you
[01:37:03] [SPEAKER_06]: from my perspective
[01:37:06] [SPEAKER_04]: you and me both
[01:37:07] [SPEAKER_02]: yeah I'm like
[01:37:09] [SPEAKER_02]: there's a very simple
[01:37:11] [SPEAKER_02]: answer right
[01:37:12] [SPEAKER_02]: that these animals
[01:37:15] [SPEAKER_02]: who are raised humanely
[01:37:17] [SPEAKER_02]: are given lives to enjoy that they otherwise
[01:37:19] [SPEAKER_02]: wouldn't have
[01:37:22] [SPEAKER_02]: and
[01:37:23] [SPEAKER_02]: so
[01:37:25] [SPEAKER_02]: that's something to support
[01:37:26] [SPEAKER_06]: yeah I theoretically completely
[01:37:28] [SPEAKER_06]: understand you I'm kind of on board with all of the arguments
[01:37:31] [SPEAKER_06]: oh I see
[01:37:32] [SPEAKER_06]: it's just the psychological thing of like
[01:37:34] [SPEAKER_06]: could I myself
[01:37:36] [SPEAKER_06]: knowing what I know
[01:37:38] [SPEAKER_06]: feeling the way I do about like
[01:37:41] [SPEAKER_06]: animals
[01:37:42] [SPEAKER_06]: and the way they're taken care of and so on
[01:37:44] [SPEAKER_06]: or the terrible ways they're treated
[01:37:47] [SPEAKER_06]: often
[01:37:47] [SPEAKER_06]: like and thinking that they have
[01:37:49] [SPEAKER_06]: you know moral dignity and respect
[01:37:50] [SPEAKER_06]: then just like eat one
[01:37:52] [SPEAKER_06]: it's like just that would be
[01:37:54] [SPEAKER_06]: psychologically impossible for me
[01:37:57] [SPEAKER_06]: or psychologically very difficult
[01:37:58] [SPEAKER_06]: in the same way as like supposing
[01:38:01] [SPEAKER_06]: your grandmother dies you love her
[01:38:03] [SPEAKER_06]: very much
[01:38:05] [SPEAKER_06]: you were there for her
[01:38:06] [SPEAKER_06]: like when she passed away and so on
[01:38:08] [SPEAKER_06]: and then you just throw her out in the garbage
[01:38:12] [SPEAKER_06]: and you're like well there's nothing wrong with that
[01:38:14] [SPEAKER_06]: like she doesn't care like
[01:38:16] [SPEAKER_06]: or even
[01:38:18] [SPEAKER_06]: you eat her right yeah
[01:38:20] [SPEAKER_06]: even more directly
[01:38:21] [SPEAKER_06]: rather than let that go to waste
[01:38:24] [SPEAKER_02]: that's good grandmother meat right there
[01:38:27] [SPEAKER_14]: you can make a stew
[01:38:31] [SPEAKER_06]: yeah so there's like
[01:38:32] [SPEAKER_06]: a large class of activities
[01:38:34] [SPEAKER_06]: that are
[01:38:36] [SPEAKER_06]: regarded as
[01:38:38] [SPEAKER_06]: that are like importantly symbolic
[01:38:40] [SPEAKER_06]: and like they are symbolic
[01:38:42] [SPEAKER_06]: and that means like
[01:38:44] [SPEAKER_06]: you know you both do kind of represent
[01:38:45] [SPEAKER_06]: the sort of person you are by doing something
[01:38:50] [SPEAKER_06]: and potentially
[01:38:52] [SPEAKER_06]: you know there's a risk you do become a sort of
[01:38:53] [SPEAKER_06]: certain sort of person by violating certain symbols
[01:38:56] [SPEAKER_06]: and it's at least
[01:38:58] [SPEAKER_06]: I think like as a default like take those
[01:39:00] [SPEAKER_06]: kinds of symbols
[01:39:01] [SPEAKER_06]: as given there's some cases where
[01:39:04] [SPEAKER_06]: we want to push on them and say no
[01:39:05] [SPEAKER_06]: this symbolic action
[01:39:08] [SPEAKER_06]: should not be symbolic in this way you should have some other
[01:39:10] [SPEAKER_06]: sort of symbol perhaps
[01:39:12] [SPEAKER_06]: you know with certain things like giving to charity people
[01:39:14] [SPEAKER_06]: regard it as like a symbolic action
[01:39:16] [SPEAKER_06]: I'm actually like fine with that as long as we think
[01:39:18] [SPEAKER_06]: of it as a different thing from giving
[01:39:20] [SPEAKER_06]: to charity for
[01:39:22] [SPEAKER_06]: you know giving to charity for the aims of like
[01:39:24] [SPEAKER_06]: trying to make the world better
[01:39:26] [SPEAKER_06]: but I think yeah just in the same way as like
[01:39:28] [SPEAKER_06]: yeah treat the people with respect and so on
[01:39:31] [SPEAKER_06]: who are close to you and again you're best not
[01:39:34] [SPEAKER_06]: to think about it in terms of well
[01:39:36] [SPEAKER_06]: if I treat my close friend in a nice
[01:39:38] [SPEAKER_06]: way then this good thing will happen
[01:39:40] [SPEAKER_06]: you know I think even from a purely
[01:39:42] [SPEAKER_06]: consequential perspective you want to just forget
[01:39:44] [SPEAKER_06]: about that and just attend
[01:39:46] [SPEAKER_06]: to your close friends needs
[01:39:49] [SPEAKER_06]: you know that
[01:39:50] [SPEAKER_06]: I think probably applies also
[01:39:52] [SPEAKER_06]: to like you know common sense small
[01:39:54] [SPEAKER_06]: acts of kindness Dave unless you have
[01:39:56] [SPEAKER_02]: anything else I think we should wrap this up
[01:39:58] [SPEAKER_02]: that was a great analogy actually the vegetarian one
[01:40:01] [SPEAKER_02]: that
[01:40:02] [SPEAKER_02]: sort of gave me a flash of insight
[01:40:04] [SPEAKER_02]: about how
[01:40:06] [SPEAKER_02]: a vegetarian
[01:40:08] [SPEAKER_02]: often
[01:40:08] [SPEAKER_02]: has no real
[01:40:11] [SPEAKER_02]: no real understanding of the position
[01:40:14] [SPEAKER_02]: that I have and when I say understanding
[01:40:16] [SPEAKER_02]: it's this kind of thing
[01:40:17] [SPEAKER_02]: and probably views me in the way
[01:40:20] [SPEAKER_02]: that people viewed Dave's student the
[01:40:22] [SPEAKER_02]: effective altruist on the bus
[01:40:24] [SPEAKER_02]: like yeah there's something
[01:40:26] [SPEAKER_02]: wrong with
[01:40:28] [SPEAKER_02]: you that you would treat
[01:40:30] [SPEAKER_02]: a case like this in that
[01:40:32] [SPEAKER_02]: very theoretical
[01:40:34] [SPEAKER_02]: way or that
[01:40:36] [SPEAKER_02]: very algorithmic way that
[01:40:38] [SPEAKER_02]: you're sending off some kind of bad signal
[01:40:40] [SPEAKER_02]: in the same way that someone
[01:40:42] [SPEAKER_02]: who eats meat but says
[01:40:44] [SPEAKER_02]: they care about animals might be sending
[01:40:46] [SPEAKER_02]: off a bad signal to the vegetarian
[01:40:48] [SPEAKER_06]: yeah exactly
[01:40:50] [SPEAKER_02]: yeah that sounds like
[01:40:52] [SPEAKER_06]: maybe you're more of an EA than you know
[01:40:55] [SPEAKER_02]: well I
[01:40:56] [SPEAKER_02]: want to I don't claim like Dave
[01:40:58] [SPEAKER_02]: to speak for all of humanity but I
[01:41:00] [SPEAKER_02]: would also like
[01:41:02] [SPEAKER_02]: to thank you for
[01:41:04] [SPEAKER_02]: being on our podcast and
[01:41:06] [SPEAKER_02]: and it really is impressive
[01:41:08] [SPEAKER_02]: what you have done
[01:41:10] [SPEAKER_02]: with this movement and
[01:41:12] [SPEAKER_02]: how much you've contributed
[01:41:14] [SPEAKER_02]: I think most people
[01:41:16] [SPEAKER_02]: would agree
[01:41:17] [SPEAKER_02]: whatever problems they have with the theoretical
[01:41:20] [SPEAKER_02]: grounding that it's
[01:41:22] [SPEAKER_02]: unambiguously a good thing
[01:41:23] [SPEAKER_02]: that that's happened so
[01:41:26] [SPEAKER_05]: we'll put links in the show notes
[01:41:28] [SPEAKER_05]: but it's only some small percentage of our listeners
[01:41:30] [SPEAKER_05]: who actually go so what are the best places
[01:41:32] [SPEAKER_05]: if you want to
[01:41:34] [SPEAKER_05]: find out the best charities to donate to
[01:41:36] [SPEAKER_06]: right so if you want to find out more
[01:41:38] [SPEAKER_06]: about effective altruism go to
[01:41:40] [SPEAKER_06]: effectivealtruism.org
[01:41:42] [SPEAKER_06]: and then for a variety
[01:41:44] [SPEAKER_06]: of areas to donate
[01:41:46] [SPEAKER_06]: so both the best places within
[01:41:49] [SPEAKER_06]: global development
[01:41:50] [SPEAKER_06]: factory farming
[01:41:52] [SPEAKER_06]: long-term future
[01:41:54] [SPEAKER_06]: and you know promoting these
[01:41:56] [SPEAKER_06]: ideas themselves
[01:41:58] [SPEAKER_06]: you can go to the Effective
[01:41:59] [SPEAKER_06]: Altruism Funds
[01:42:01] [SPEAKER_06]: if you google them
[01:42:02] [SPEAKER_06]: and patreon.com
[01:42:04] [SPEAKER_05]: which is the most effective
[01:42:06] [SPEAKER_05]: charity possible
[01:42:08] [SPEAKER_05]: I do thanks for coming on
[01:42:11] [SPEAKER_05]: I do wish we had been able to talk about
[01:42:13] [SPEAKER_05]: the long-termism stuff but maybe that we could save
[01:42:15] [SPEAKER_05]: that for some other time if
[01:42:18] [SPEAKER_05]: this wasn't
[01:42:19] [SPEAKER_05]: a horrible experience
[01:42:20] [SPEAKER_02]: yeah if your decision-making procedure leads you to come
[01:42:23] [SPEAKER_02]: on again
[01:42:25] [SPEAKER_06]: I'm sure it will
[01:42:26] [SPEAKER_06]: I'd love to
[01:42:27] [SPEAKER_06]: thanks so much for having me on
[01:42:29] [SPEAKER_02]: thank you Will
