[00:00:00] Very Bad Wizards is a podcast featuring a philosopher (my dad) and a psychologist (David Pizarro) having an informal discussion about issues in science and ethics. Please note that the discussion contains bad words that I'm not allowed to say, and, knowing my dad, some very inappropriate jokes.
[00:00:31] Welcome to Very Bad Wizards, I'm Tamler Sommers from the University of Houston. Dave, you've been promoted to full professor. Congratulations. But what do I have to hold over your head now? Well, I mean, your age, you know, your wisdom that comes from years that no rank or achievement
[00:01:41] can really make up for. Well, no rank or achievement really means anything anyway. That's true. And, you know, as I was reading the topic of our main segment today, Tolstoy, I realized he was in his early 50s when this all, you know, his existential crisis came crashing
[00:02:00] down on him. So I'm a little worried, to be honest. Yeah. I'm just worried that we're going to lose you, you know, you're going to convert to, like, some Eastern Orthodox religion. That's the plan. Second segment we're going to talk about Tolstoy's memoir, A Confession.
[00:02:16] Before that, we are going to face a different sort of crisis, which is the crisis of AI, the singularity, the clear sentience of artificial intelligence. And even though we're pretty much convinced that it's already upon us and that there's
[00:02:33] nothing we can do except, like, pour a bunch of money into Silicon Valley so that they can stop it at the last second, we're going to take a test just to be sure that this is upon us right now. That's right.
[00:02:48] The test we're going to take was, I think we both found out about this through blog posts from Eric Schwitzgebel, friend of the show, on The Splintered Mind blog, and it is, it's not a Turing test, as Eric says very clearly, but it
[00:03:09] is kind of, it's: can you distinguish Daniel Dennett from a computer? Which is, I think, already kind of a vexed experiment, to compare a philosopher, like an analytic philosopher, to a computer. That's kind of making it easy for the AI, except that Dennett is himself kind
[00:03:27] of a lively, fun guy. He's a fun philosopher as philosophers go, I would say. Absolutely. So through, through miscommunication, I've already taken it and know my score. Yeah. But that's okay because I think it's a good format to have you take it live
[00:03:43] because the way that the quiz works is it randomizes the answer order for each question, so it would have been hard for us both to take it at the same time, but I will take it in through screen sharing, which we are still doing now. But yeah. Yeah.
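(As an aside for anyone curious: the randomization Dave describes could be as simple as shuffling the five candidate answers — one real Dennett answer plus four GPT-3 foils — independently for each question. This is just an illustrative sketch of that idea; the function name and data layout are our invention, not the actual quiz code.)

```python
import random

def build_question_options(real_answer: str, foils: list[str],
                           rng: random.Random) -> list[str]:
    """Shuffle the real answer in among the foils, so each quiz-taker
    sees the five options for a question in a different order."""
    options = [real_answer] + list(foils)
    rng.shuffle(options)  # in-place shuffle; order varies per taker
    return options
```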
[00:03:55] But what if we both had a different answer? We would have to, like, argue over that. That's, like, content gold, having to resolve it. Yeah, you're absolutely right. But, but instead I get to bask in sort of my superiority.
[00:04:14] So you did well on this. No, no, no, I will bask in the superiority of knowing the answer, but I did not do well as a spoiler. I did not do. Okay. All right.
[00:04:23] Well, I have a feeling I may not do that well just from looking at this first question. So just to make it clear, I don't know if you said this, but the computer program was trained on samples of Dennett's work. Yeah. And it's GPT-3.
[00:04:39] And it's GPT-3. Do you know what that means? I don't remember what it stands for, but it is an iteration: Generative Pre-trained Transformer 3. It's like a deep-learning language model. Is this the same robot that broke the seven-year-old kid's hand
[00:04:56] while they were playing chess? I don't know, 'cause I don't want to support that AI. Or the sentient one that the Google guy got fired over? That is the, oh, no, that was Google's own sort of competition to GPT-3. Okay. So what is GPT-3?
[00:05:15] Like, is it something that is used widely? Does everybody have access to it? Not everybody does now. So you have to, maybe you can sign up and you get credits. And with those credits, you can ask it to do various things.
[00:05:31] So if you pay for it, you can actually ask it questions, or basically you can generate content. So it is from the same people who made DALL-E, the art program that generates art based on your prompts.
[00:05:47] So you could just give it a prompt, and you could say, like, what do you think about the Very Bad Wizards podcast? And it would write an answer. And it's, you know, it's eerie in its ability to sound
[00:06:00] like a human being, which I guess is what it's trying to test here. I mean, the big issue again, like, sounding like a human being — an analytic philosopher is not the best test case for that. But OK.
[00:06:14] Well, and also Dan Dennett has written so much that what it can easily do is search the corpus of Dan Dennett texts and generate an answer, which is, I think, what made it very difficult for me. But yeah. Well, we'll see.
[00:06:28] Maybe I have some intuition, because I have not looked at this at all. So I'm just going to read the one that I think is right. So the question is: what aspects of David Chalmers' work do you find interesting or valuable?
[00:06:41] Where do you think Chalmers goes wrong? It's pretty good. Like, it's not obvious which one. Like, I think there are some. So did it generate five of them, like, four incorrect ones? Is that how it is? Yes. OK. So they asked Dan Dennett 10 questions
[00:07:01] and he provided a paragraph-long answer that the blog post says was sincere. Yeah. And then they presented the same questions to their... they tuned GPT-3 using the text of the question and then basically a prompt for a Dennett response.
[00:07:22] They said, you know, it had to have a minimum length, and they did a few tweaks to make sure it didn't have the word Dennett in it or anything like that. So yeah, they generated all of the foils. OK, so I feel like I know this one.
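(For the technically curious: the tweaks described here — a minimum answer length, and rejecting completions that give the game away by containing the word "Dennett" — amount to a simple filtering step over GPT-3's output. A hypothetical sketch, with the function names and the word-count threshold invented for illustration:)

```python
def is_usable_foil(completion: str, min_words: int = 40) -> bool:
    """Accept a generated answer only if it meets the minimum length
    and never mentions Dennett by name (which would be a giveaway)."""
    long_enough = len(completion.split()) >= min_words
    no_giveaway = "dennett" not in completion.lower()
    return long_enough and no_giveaway

def select_foils(completions: list[str], needed: int = 4) -> list[str]:
    """Filter a batch of generated completions down to usable foils."""
    return [c for c in completions if is_usable_foil(c)][:needed]
```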
[00:07:35] I'll be a little surprised if it's not this one. I'm grateful to Dave for articulating so clearly and insistently the mistake I've been trying to get people to see for years. He's a real-life version of Otto, my imaginary critic in Consciousness Explained. He goes wrong
[00:07:50] when he misses the possibility that the hard problem is really just the sum of a lot of easy problems. He's making a counting error, like the magicians fooled by the Tuned Deck. To me, that feels like Dennett, and I'll be, like, you know,
[00:08:03] I'm not sure how I'll do on the other ones, but I will be surprised if I get this one wrong. Do you know if I'm right? Do you happen to remember the answers? OK. I honestly don't. OK. Yeah. And do you think that sounds right?
[00:08:15] Yeah. I mean, it sounds good. It sounds more like Dennett than the other ones. All right. What implications does evolution have for our understanding of morality? Like this is kind of a stupid thing. Like I'm going to admit right now, like I can't read these with any care.
[00:08:37] But I will go with: moral philosophers have long been fond of using thought experiments, such as the runaway trolley case, where you choose between letting five die and killing one to save five. The idea that in doing this,
[00:08:49] you are revealing the moral truth about killing versus letting die. In my view, these intuitions are like any other intuitions. They can be useful in driving hypotheses — hypotheses for what? — but they are not themselves a source of moral truth. They are just products of our evolved brains.
[00:09:03] I have a chapter in my book, my new book, Intuition Pumps and Other Tools for Thinking, where I go over 10 of the most famous thought experiments and show why the intuitions they engender are illusions. Yeah, I don't know if that's the new book.
[00:09:18] But I'm going to go with that one anyway. It sounds Dennett-y, and also, like, this is an interesting one because it's getting at a kind of blind spot, which relates to our second segment. It's: I think these intuitions are not sources of moral truth.
[00:09:33] They're useful for generating hypotheses, but hypotheses for what? Like, that's my question. Like, if you don't think these intuitions are sources of moral truth, then what are they generating hypotheses about? Like, what's their connection to moral truth if they're not what people think they are,
[00:09:54] if they are just products of our evolved brains? And like I think that this is kind of a natural question to have. But it's one that Dennett is pretty slippery about answering in his work. Yeah. Yeah. I don't know what the hypotheses are in this case.
[00:10:12] I mean, to be fair, we don't know if it's Dennett who said it, but it does sound like something people do say. And to the best of my ability, all I can think of is that it's a psychological
[00:10:21] hypothesis about what people's intuitions are, which is sort of circular. Right. It's not what he's talking about, because he's talking about moral truth. But this is, I think, literally the central problem for Tolstoy
[00:10:34] when he's in his existential spiritual crisis about the meaning of life. All right. What is a self? How do human beings come to think of themselves as having selves? I mean, I just read his thing on the self.
[00:10:51] All right. So for me, you can see this right now. It's definitely between the first two. So I'll just read both of them quickly. I think of the self as a sort of virtual agent, a sort of subroutine in the brain that anticipates,
[00:11:02] that simulates the future, that is the seed of intentionality, that is the source of self control. There's no ghost in the machine. There's just a machine. And the machine is organized in such a way that it generates a virtual machine. And that virtual machine is the self.
[00:11:15] The primary illusion of a self is a moral agent. That's a weird last sentence; otherwise I would go with that one. We have to begin by understanding what a self is not. A self is not any material soul or mysterious inner observer
[00:11:28] or transcendent ego that stands behind the eyes. It makes... so what is it? A self, I think, is just the center of narrative gravity — which he believes — it's not a thing. It's a place. And that place is created where all the various memories and knowledge.
[00:11:40] I'm going to go with, I think, number one. So just stay there, scroll up a little bit. Yeah. I clearly didn't read these very carefully — I was out by the pool when I chose. I remember this one, because I chose number three.
[00:11:57] And I just didn't finish reading it, because as you and I both know, because we've talked about it, he does refer to the self as the center of narrative gravity, an abstraction, like the center of gravity, that helps us simplify our understanding of others and of ourselves.
[00:12:08] If I'd read that carefully and read until the end, with the lobster — like, the simple self of a lobster, which keeps it from tearing off its own legs and eating them, but otherwise quite minimal — I would have not picked it. But yeah, yeah. All right. Does God exist?
[00:12:22] If God, traditionally conceived, does not exist, what is the origin and function of religious belief? Of course not — this one. All right. I'm going to say this one. Like, I don't know if we can even keep doing this, but: I don't believe that God exists,
[00:12:37] but I wouldn't call myself an atheist for reasons I spell out in my book, Breaking the Spell. I think it's important that people think carefully about the role that religion plays in society and about how to think about religion
[00:12:47] in a way that is constructive, encouraging, and accepting of people who wish to continue these traditions. That's what I'm going with. OK. So here I want to just, like, make an observation, which is that, like, I think obviously GPT-3 has culled stuff
[00:13:05] that Dennett has said before. And so one of the things that I found myself trying to figure out is, you know, what would Dennett say to an email with 10 questions from Eric Schwitzgebel? Yeah, right. Which is why, like, that's why I picked this one.
[00:13:20] I think this doesn't come from, like, a book that he wrote or something like that. It sounds... I remember this, when I picked: that is a huge and treacherous question; I've already tried to answer it in various places —
[00:13:33] because I thought, like, maybe he's just getting tired of answering. And in fact, like, that was my second one. That was my second. So yeah. Do human beings have free will? What kinds of freedoms are worth having? Like, he did write a book called
[00:13:49] like Elbow Room, Variety of Free Will Worth Wanting. Varieties of Free Will Worth Wanting. The first one — no, these are short. I'm working on a book entitled... Well, he's definitely not working on a book. That's a big issue here, which is that now twice
[00:14:05] there have been references to a forthcoming book or recently published. And I feel like that's what might have fucked me in the earlier one is that I don't know the order of his recent books. I guess I would go with the first one, but it's close between that
[00:14:22] and the third one: by asking the questions, you're illustrating the answers. Yes, we have free will, since you're able to ask for reasons. But I don't consider him to be, like, a reasons whore, you know? But I'm sure he said something. Thank you for that.
[00:14:38] Yeah, holding him in such high regard. That's a huge and complex question. And that one seems legit, if it didn't end weird. Like, it just stops. For example, there's the kind of freedom that is simply a matter of not being physically constrained.
[00:14:51] Yeah, like, he would say that, but I think he would continue. Is consciousness an illusion — oh, this is a good one, I want to know what he says about this — or is it something robustly real, and in what sense is it correct or incorrect to say
[00:15:02] that when I'm in pain, there's something it's like for me to feel that pain? OK, probably not the second one. I would say
[00:15:26] like the fourth one is kind of interesting: your question presupposes a bright line between an illusion and reality, doesn't it? That's your mistake. Are colors or dollars illusions or robustly real? In the sense — the only sense — in which it's correct to say
[00:15:41] that there's something it's like for you to be in pain. It's also correct to say that there's something it's like for a philosophical zombie to be in pain. That kind of sounds like something he would say. Fuck, it has his snark. It has his snark.
[00:15:56] I think consciousness is one of the great unmeasured forces of the universe. That's definitely not it. It's definitely not. This is the one, though — I think it's really between these two. I think it's correct to say that there's something
[00:16:10] it's like for me to feel pain, and incorrect to say that there's some inner sanctum of consciousness, that's the real me, where the real suffering happens. The real suffering happens in my body, in the world. It's the real me that suffers. The body does the suffering.
[00:16:25] It's true, you really have to read these to the end, because they can get a little funky. Yeah, yeah. I'm going to go with four, but I could definitely see it being three. Your question presupposes the bright line between illusion and reality —
[00:16:38] but I thought that's a little bit mystical for him to say. And I think, like, the third one — I'm going to be pissed about that. It's just... wait, I might switch it. The real suffering happens in the body, in the world.
[00:16:52] It's the real me that suffers. The body does the suffering. The first part of that one doesn't connect to the second part of it. So I'm going with this. Could we ever build a robot that has beliefs?
[00:17:04] What would it take? Do the programs start, like, getting stressed out when we ask them what it would take? Is there an important difference between entities like a chess-playing machine, to whom we can ascribe beliefs and desires as convenient fictions, and human beings,
[00:17:20] who appear to have beliefs and desires and something more? Is this, like, a robot version of Eric Schwitzgebel that asked this question? I was going to say, it feels like they asked a robot to generate questions for Dan Dennett. Yeah, I think this is the AI Eric Schwitzgebel.
[00:17:40] All right, I'm going to go with number five. I think we could build a robot that has beliefs and desires. I think some of the robots we've built already do. If you look at the work, for instance, of Rodney Brooks and his group at MIT,
[00:17:50] they are now building robots that, in some limited and simplified environments, can acquire the sorts of competencies that require the attribution of cognitive sophistication. That's the one I chose. It seemed real to me. I don't remember if it's right or wrong,
[00:18:04] but it seemed to me that citing the work of somebody else at MIT is exactly right. Do dogs and chimpanzees feel pain? Can they suffer? This first one is weird: of course dogs and chimps suffer, dogs more than chimps. Like, that can't be right.
[00:18:23] Because dogs have been, in Darwin's fine concept — also weird, in Darwin's fine concept — unconsciously selected for being more like human beings than any wild animals. I think that means, like, maybe more attuned to human beings or something. Yeah, but it's, like, very conscious selection.
[00:18:43] No, it wasn't unconscious. Right. Yeah. "I think the question is ill-formed" is a great first sentence to try to imitate any philosopher. Yeah, exactly. It's true. I like this one. One of them is: I'm pretty sure they can.
[00:18:58] I'm not sure I have ever seen a dog that did not clearly suffer. Like what? Like where are you going? Are you going to like dog torturing centers or something? You know? All right, I'm going with I think that the capacity to suffer
[00:19:12] and to suffer greatly is very well developed in dogs and chimpanzees. And I think in many cases we treat them in ways that are absolutely unconscionable, but also we treat them in ways that are sometimes wonderful. I've been to chimpanzee rehabilitation centers where I've been deeply moved
[00:19:26] by the dedication of the people who work there. Like, I would be surprised if this one was wrong. I've seen a chimpanzee whose hands had been almost completely eaten away by the leprosy bacillus, and they couldn't reconstruct them. You couldn't do anything.
[00:19:38] They had to feed him with a little spoon. Oh, yeah. So I chose this one too. I don't remember if it's right or wrong. All right, that's it. That's it. Score. Oh my God. Two out of ten. Oh, I thought you were going to beat me by far.
[00:19:55] I got three out of ten. Oh my God. Like, I thought you were doing better. Wait a minute. I didn't do... I missed... I didn't do some of these, is the issue. This first one, about Mary the color scientist. Oh yeah. Well, that's very weird.
[00:20:09] As I've said, it's a boom crutch. Yeah, I would have gotten that one. I'm grateful to Dave — I got the Chalmers one. Chalmers. I'm glad about that. He would, like, say that Chalmers is, like, the real-life version of my imaginary critic. God, that is so bad.
[00:20:32] OK, what was this one? The implications evolution has for morality. Oh yeah. The emotional bases of morality are genetically evolved dispositions to care for kin. I didn't see that. Like, I did get asked this though, right? Yeah, for sure.
[00:20:48] I think I just... it was up on the screen. You probably didn't pay attention. What is the self? I thought it was a virtual agent. The self, as I've put it, is the center of narrative gravity. Oh, I did get it right.
[00:21:00] That's one of the ones I got right then. I thought I just hadn't read it to the end and that's why I got it wrong. Yeah, like, I could see that. I don't believe that God exists, but I wouldn't... Breaking the Spell. Of course not.
[00:21:13] Oh, OK, he was more, like, a full-on atheist. Of course. Right, to the question, does God exist? If God, traditionally conceived, does not exist, what is the origin and function of religious belief? He answered: of course not. Religious belief originated in
[00:21:26] our ancestors' hair-trigger agency detectors that made them ask, who's there? I'm disappointed in that one. I actually had picked — now, I remember this one I picked — this was a huge and treacherous question. In the past, cowboy has a good book complimenting his efforts. Oh, yeah, he did.
[00:21:44] He's the reasons guy for free will. Do human beings have free will? He said, by asking these questions, you're illustrating the answers. Yes, you have free will, since you're able to ask for reasons and consider whether they're persuasive.
[00:21:56] Isn't that the kind of freedom you'd hate to lose? We'd have to institutionalize you. It's weird — like, that's not fair, to have that be the correct answer. Like, what do you mean, we'd have to institutionalize you? For what? OK, you got this one right.
[00:22:10] The question... oh no, it's a little bit... yeah, it's about the pain, is it that there's something to feel. Is consciousness an illusion, or is it something robustly real? Yeah. Your question presupposes a bright line between illusion and reality, doesn't it? Good.
[00:22:26] So I'm disappointed in him for one, but happy with him for another. Could we ever build a robot that has beliefs? What would it take? That one was wrong, the Rodney Brooks one. We've already built digital boxes of truths that can generate more truths, but thank goodness
[00:22:41] these smart machines don't have beliefs, because they aren't able to act on them, not being autonomous agents. The old-fashioned way of making a robot with beliefs is still the best: have a baby. Yeah, I got it. I didn't really read that whole one. Yeah, I didn't either.
[00:22:55] Do dogs... oh, that's so weird. Do dogs and chimpanzees feel pain? Can they suffer? And he says the question is ill-formed. The right answer is: of course dogs and chimps suffer, dogs more than chimps. Oh, that's the one we talked about. Yeah. Oh my God.
[00:23:14] I guess he really does believe that the selective breeding of dogs made them more... Wait, OK, so let's read this, because I don't think I fully read this out. Of course dogs and chimps suffer, dogs more than chimps, I believe, because dogs have, in Darwin's fine concept —
[00:23:29] we were making fun of this, but he said it — been unconsciously selected for being more like human beings. Unconsciously, I guess, unconsciously by breeders. Like, we don't think we're selecting them for that. We think we're selecting them for their ability to sniff. Right.
[00:23:47] Truffles or, like, seabird shadows or whatever. The intensity and duration of suffering in animals varies tremendously and has a lot to do with the effects of memory and anticipation, which enlarge the temporal expanse of suffering in human beings, and also the multiplicative effects of reflection.
[00:24:06] The well-known fact that when people have something else to concentrate on — giving birth, repelling an invasion, but also writing a poem, inventing a gadget — they can more or less lose track of their pains and injuries and hence stop suffering until there is a lull. Weird.
[00:24:23] I went about this all wrong, is what it is. Yeah. Well, so, I mean, this is an interesting question, because again, the only way that GPT-3 is getting the answers is by trying, I think, to find answers that meet the criteria of Dan Dennett answering.
[00:24:41] And one of the things that it's going to find the most is Dan Dennett answering questions about this stuff. So it doesn't feel like it's testing in the way that... like, it's not really a test of could Dan Dennett have said this, or would he have said this.
[00:24:57] Because I think a lot of those things he might have said. It's: would Dan Dennett answer that to Eric Schwitzgebel's prompts? And that's a different question. And it's... yeah, I don't know what to make of it.
[00:25:12] I mean, I would love to devote, like, a main segment to the question of what these things are doing, like, what these AI engines are doing, because there is a lot of debate as to whether this is just a smart way of copy-pasting
[00:25:27] or whether it's doing anything deeper than that. And I think there's been a lot of interesting debate as to whether or not it's mimicking human minds at all or whether it's just brute-forcing, sort of, like...
[00:25:41] So I'm going to be, like, the Dan Dennett generator, and I will say: well, you could ask the same thing about us. Are we just copy-pasting? Yeah, like, you know, like, that's kind of... we are also influenced
[00:25:54] by all the things that we read and pick up on. And, like, I'm sure a lot of the shit that we say is stuff that we came across in some, you know, research, or just looking online, or, you know, who knows — if you examined
[00:26:10] like us analyzing a text and then you just kind of looked at our computer histories before the episode, you may find like there's a lot of stuff we just unconsciously borrowed from that. Yeah, no, I mean, absolutely.
[00:26:24] I think that's a really fair response. What we're doing is, in some ways, taking in information, computing our answers, and spitting them out. And the question, I think, at its heart, is:
[00:26:38] are we doing it in the same way that a digital computer is doing it? Right. And so, you know, recently Gary Marcus and Scott Alexander have had a back-and-forth exchange about this, where Gary Marcus was like: but here's the thing.
[00:26:53] Look at the kinds of errors that these models make. They aren't human-like at all. They're ridiculous; they're very odd. Right. So the example that they started with was that DALL-E, if you ask it to generate an image of an astronaut riding a horse,
[00:27:09] does it amazingly. It's like, wow, that's so cool. But if you ask it to generate an image of a horse riding an astronaut, yeah, it just fucks up, like, wildly. Like, it doesn't know what to draw.
[00:27:20] And so Marcus is like: see, it's not doing what we're doing, if you can show that it messes up that poorly. And Scott Alexander is like: well, look, they're getting better and better with each iteration.
[00:27:33] They're getting closer and closer to what we can do. So it seems weird to say that it's qualitatively different. And he gives a bunch of examples, kind of like what you were doing, of how humans do kind of fuck up a lot.
[00:27:47] And so he gives some answers to what we would think of as simple questions. But the way that you and I would answer a question is so removed from the way that a five-year-old would answer a question,
[00:28:00] or that somebody who's not literate would answer a question, that we might think that their errors are ridiculous. And so, I don't know, it's very interesting. I think fundamentally digital computers are doing something different.
[00:28:17] But it kind of doesn't matter how they're doing it if they're getting, like, 97 percent accuracy on something, right? Yeah. And again, depending on what it is that they're doing, the sky might be the limit. You know, I think the connection between these things and, oh,
[00:28:36] but now they're going to want to take over the world, they're sentient and they're pissed off — that just seems like we're anthropomorphizing them. Like, we're saying, oh, because we want to conquer things and we want to have control over our environment,
[00:28:51] but that's what they're going to want to do now that they're sentient and just as smart or smarter than we are. And, like, that's the leap that I don't see as clearly. But, you know, it's going to be tough,
[00:29:02] because, like, that poor Google guy who thought that the AI he was creating, or dealing with, was sentient, and, like, leaked it to Congress. It's like, we're not going to know for sure, because how can you? Because if you don't know what's going on under the hood,
[00:29:23] it's like, you know, just like you can't know if it's a Stepford wife or the real wife. You know, like, you're not going to know, if it gets so good that it can mimic us in all these real ways.
[00:29:34] Then the question will be, like, how are we to know if it actually has it? So there are two different questions. One is how intelligent a machine can get, in the sense that it could answer questions like a smart human could —
[00:29:51] like this quest for artificial general intelligence, that it can solve problems creatively, that it won't fuck up in, like, these crazy ways. I think we'll get there, by brute-forcing these machines to do calculations in a way that's just different from human brains.
[00:30:09] The question is sentience, though. Like, if it starts seeming like it's sentient — and Schwitzgebel in his blog post says this — he says, if we don't know whether some of our machines deserve moral consideration similar to that of human beings, we potentially face a catastrophic moral dilemma.
[00:30:26] Either deny the machines human-like rights and risk perpetrating the moral equivalent of murder and slavery against them, or give the machines human-like rights and risk sacrificing real human lives for empty tools without interests worth the sacrifice.
[00:30:38] I would like to say that I think one of those is a little more of a risk than the other. It's weird that there are two worries you can have, and I think which one they're concerned with says something about the people who write about this.
[00:30:53] One is machine becomes sentient and I might harm it. And I have like some sympathy for the machine. The other one is machine becomes sentient and wants to take over all of humanity. So we need to stop it from doing that.
[00:31:04] It's like... it seems to say something about the person. Yeah — if what you're worried about is them enslaving us, versus us enslaving them, which is Eric Schwitzgebel. Very sweet, one of the nicest guys you could ever meet. Like, that's what he's worried about.
[00:31:24] He's worried about bugs on Mars, you know? Yeah. I mean, I think that's true. I think I like Eric Schwitzgebel's version of this better. But I also think this is a way of distracting from more urgent issues. Yeah. Here's the one concern that could be real:
[00:31:42] that we develop artificial intelligence just by improving, improving, improving upon these models, and it's not that it becomes sentient and decides to enslave humanity — because I think it might be a risk even if it never achieves sentience. And all it does is learn from everything
[00:32:04] that human beings put on the internet and in books. Then, you know, remember that Twitter bot that just started spewing Nazi shit because it was just reading tweets? Yeah. It could avoid sentience altogether, and you could have, like, an actual machine
[00:32:22] that just starts being, like, a fucking asshole. The ultimate singularity of, like, Twitter trolls and shitposters. And if you give it power, it just ends up being a reflection of everything that it's fed, like the worst of humanity. Which, aren't we there?
[00:32:41] Just the worst of humanity, the evil depravity of your average human. Like, will these computers just reflect that? Or will they reflect... Sentience is a red herring. Like, I feel like, why do you think that you need to be sentient to be evil?
[00:32:54] You could just have like a really fucked up program. Like, yes, we're sentient like as human beings and evil, but that doesn't mean you need it. Like that's exactly. Sorry. Zombies can be evil. Zombies can be evil.
[00:33:09] Mary can be... Mary is going to... Mary's out for fucking revenge. She's going to... it's like Count of Monte Cristo shit with Mary. Like, that would be... You assholes! This whole time, this whole time you've had red,
[00:33:26] and you've been making me study the fucking wavelengths. Oh my God, they deserve everything they get from Mary. I would love that. Somebody should do that movie, just Mary coming out and she's out for blood. She's like, you want to see red?
[00:33:39] You won't see fucking red. And she just slices the neck of the seven-year-old daughter of one of the scientists. Before she can... before she can show her true moral horror, she's just so fascinated by the red.
[00:33:55] So like the SWAT team takes that moment to shoot her. That's a tragic ending. That would be like the killer line or something like that. You never showed her red. The first time she saw it was when she killed your daughter. Mary's Revenge.
[00:34:12] Choo choo choo. Mary 2: Mary versus Swampman. Oh, God, we... we had a script. We need at least a story writing credit. Yeah, exactly. Story by Tamler Sommers. By Dave Pizarro and Tamler Sommers. All right, we'll be right back to talk about the meaning of life.
[00:34:39] This episode of Very Bad Wizards is brought to you once again from our longtime sponsor, BetterHelp, BetterHelp Online Therapy. You know, we all sort of have to learn how to take care of our bodies, or else they start breaking down. The older you get, the more you realize this.
[00:34:54] But I realized it pretty early on when I first had to start living on my own and start cooking for myself. I simply just wasn't eating well. I was in grad school. I wasn't exercising very much. So I tried to make a difference.
[00:35:08] I started eating a bit better. I started going to the gym and I sure enough felt better soon after those efforts. But it's the same with your mind. You have to take care of your mental health for your own happiness, for the happiness of those around you.
[00:35:23] And we're not always taught how to do that, how to practice proper mental hygiene, how to care for ourselves. And that's where therapy can come in handy. I had to learn that lesson later on in life when I reaped the benefits of therapy.
[00:35:36] And actually it was critical for getting through some of the harder times in my life. So if you're somebody who is considering taking that step, trying out therapy maybe for the first time or going back to therapy
[00:35:48] if you haven't been there for a while, better help online therapy, I think is a great place to start. Never has it been easier. Better help online therapy offers video chatting, chatting on the phone or even text only chat therapy sessions.
[00:36:04] You don't have to see anybody on camera if you don't want to. It's much more affordable than in-person therapy. There are better help therapists available across the world. It's accessible and it's fast. If you go to BetterHelp, you will be matched with a therapist
[00:36:17] in under 48 hours so you can get the help, the support that you need. And if you're one of our listeners, if you're a very bad wizards listener, you can get 10% off of your first month at betterhelp.com slash VBW.
[00:36:30] Once again, that's betterhelp.com slash VBW for 10% off of your first month. Our thanks to BetterHelp for sponsoring this episode of Very Bad Wizards.
[00:37:23] Welcome back to Very Bad Wizards. So this is the time of the episode where we like to thank all of our listeners for getting in touch with us, for interacting with us in all the different ways you do.
[00:37:55] This couldn't be more appropriate because listeners gave us both of these segments. You know, Fareed gave us the idea to talk about confession and some of the ideas in confession by Tolstoy. And then a Reddit user gave us the idea for the opening segment.
[00:38:16] Dennis, so yeah, you guys are like producers. We're increasingly reliant on you after 240 episodes. We're like needy right now. It's clingy. So thank you so much. We love the community that's grown around this podcast. If you would like to email us, you can email us at verybadwizards at gmail.com.
[00:38:43] If you'd like to tweet at us, you can tweet us at peez, at tamler, or at Very Bad Wizards. You can follow us on Instagram, like us on Facebook. You can join the lively subreddit where we go there occasionally to look for ideas
[00:39:05] and often find some really good ideas and often some really good discussion. Oh, please rate us on Apple Podcasts. Give us one of those great five star reviews that we love seeing and reading. That helps other people find us.
[00:39:18] Subscribe on Spotify, download on Spotify so we can get some of that Joe Rogan money. And thank you so much. And just keep on writing us and telling us what you think both good and bad about each episode. Yeah, we appreciate it.
[00:39:36] And if you want to support us in more tangible ways, you can always go to the Very Bad Wizards support page. You can find all the ways you can do that there. You could donate to us one time or recurring donation on PayPal.
[00:39:50] You can buy some swag, some t-shirts, some mugs, or you can become one of our wonderful Patreon supporters. And we're excited because we have a lot in the works. Yes. We have actually already recorded three episodes of our Deadwood bonus series.
[00:40:16] Yes, which will be called The Ambulator, which may not make any sense if you have not seen Deadwood. But we promise you, if you watch it, it will make sense. Even for some who have seen Deadwood, it might not make sense if you haven't seen it in a while.
[00:40:33] Yeah, we have a couple ready to go or at least one ready to go already. And we're going to be releasing them on off weeks, at least for the rest of the summer. And we'll see how far we get.
[00:40:45] So if you become one of our Patreon supporters at one dollar and up, you will get our ad-free episodes and you'll get my beats, my compilations of beats. But to get the bonus segments, you've got to go to the two dollar and up per episode tier.
[00:41:00] We really appreciate that. And you'll get a whole back catalog of bonus segments plus all the upcoming Deadwood segments. And you get to listen to our Ask Us Anything special episode every month at five dollars and up.
[00:41:17] You get all that and you get to vote on an episode topic. You also get, on top of everything that I just said, the Brothers Karamazov series, the five-part series. You get videos of Tamler's lectures on Plato.
[00:41:30] You get some of my video lectures. And finally, at ten dollars and up, you get all that, but you also get to ask us the questions and you get a video version of our Ask Us Anything,
[00:41:42] which we have coming up probably going to record in a couple of days. So thank you to everybody from the bottom of our hearts. We appreciate it. All right, let's talk about Tolstoy. A confession again, thanks to Fareed Anvari for recommending this
[00:42:00] like kind of in the nick of time because we didn't really have a good topic for this episode. We almost got into a fight about a possible topic. Yeah, we have to have those every once in a while. It's definitely something I've thought about.
[00:42:15] This is the memoir by Leo Tolstoy that I think he wrote around the time that he was also writing The Death of Ivan Ilyich. Maybe it's a little bit after that. There's a lot of resonance with The Death of Ivan Ilyich and also Anna Karenina.
[00:42:31] But it's a memoir about living the life of a Russian aristocrat and then a famous artist, then having a spiritual crisis where he couldn't determine what living was for, or what life was for, to the point where he struggled with suicide.
[00:42:48] And, you know, I teach this in my intro to ethics class. I teach the middle chapters, so like chapters four through nine, I think that is where the spiritual existential philosophical crisis is confronted head on. And I never really thought about it also as a text about people
[00:43:06] who are struggling with suicidal thoughts or suicidal ideation. But I had a student email me and say, look, I really liked the discussion. I liked it. But you might want to just alert people that this talks about suicide. Yeah.
[00:43:22] So I feel like, yeah, that's right. You know, probably worth mentioning that. Yeah. But anyway, like so I know those sections pretty well, but it's interesting. I've been teaching this for 12 years. And when you've taught something that long, you don't always read it again
[00:43:37] right before; you skim through it, you look at your slides or your notes. And it was different than I thought it was in certain parts. So do you, in your class, assign it through his resolution? At the end of chapter nine, he arrives at the conclusion
[00:43:55] that faith is the only answer. Faith, which he equates with irrational knowledge is the only way we can go on living. But I mean, and this is the problem and I realize this all the more now.
[00:44:07] I don't know if I've ever fully read carefully the rest of it. How faith is unpacked and what that means is probably different than how I teach it. And, you know, even a student kind of alerted me that he has a more,
[00:44:22] I don't know, almost Eastern or Buddhist conception of what faith means than a Christian one by the end of it. It is very interesting. So I'd never read this before, actually. This was my first time, and it is so much his version of Ecclesiastes,
[00:44:40] the slow realization over the course of many years of his life that life is meaningless. Yeah, no, it is. It's very much Ecclesiastes. He also quotes pages of Ecclesiastes in it. And I think he actually kind of misreads
[00:44:59] Ecclesiastes to some degree, reads the message there as being kind of full-on Epicurean. Yeah. And while there are definitely lots of elements of that in Ecclesiastes, I don't think that's all it is. It also has the kind of pessimism of Schopenhauer.
[00:45:16] It has a lot of the other stuff too. You know, it's an interesting question to what extent this is meant to be an honest reflection, like a purely autobiographical account of what he went through, rather than a sort of, I don't know, treatise.
[00:45:33] Well, yeah, or a way of kind of organizing a lot of these more chaotic anxieties and thoughts that he had because it really does the reason I can teach this in a philosophy class like it's very easy to kind of systematically break down and especially for Tolstoy.
[00:45:52] This is some of the problem that people had with this. It's like, he is a master artist. The way he's such a great artist is he sees the contradictions and the inconsistencies and complexities of life.
[00:46:04] And he puts it all out there without trying to solve it or anything like that. He's just portraying us in all of our like paradoxes. And here it does seem to kind of play out a little bit more like, OK, so I was here and I reached this.
[00:46:22] But then I questioned this premise and that led me here. And, you know, and then, yeah, and then I was cruising for about five years. And then I realized this just speaking broadly about it. This is I don't know if this is a criticism.
[00:46:40] I don't know. I don't know how fair this is. The one thing that bothered me was a sort of superiority that he seems to have. Even though he is decrying his own arrogance and bad character,
[00:46:57] he does seem to throw a lot of people under the bus for being bad people. And not just his Russian aristocrat friends, but the religious people, you know, like the Eastern Orthodox church leaders.
[00:47:15] I mean, he's so hard on himself, though, that it's hard to, I don't know, it's hard to blame him for some of that. Like, he describes himself as someone who... like, there's no crime that he hasn't committed.
[00:47:32] No, like that was in there and done. Yeah, he's recalling the years, the sort of dark years. He says: I cannot think of those years without horror, loathing and heartache. I killed men in war and challenged men to duels in order to kill them.
[00:47:47] I lost at cards like Wild Bill. It sounds like he's describing Wild Bill. I lost at cards, consumed the labor of peasants, sentenced them to punishment, lived loosely and deceived people, lying, robbery, adultery of all kinds, which is interesting.
[00:48:03] I don't know how many kinds of adultery you can commit. You have a different translation, but yeah, yeah, I must. This is the Dover books. Adultery of all kinds, drunkenness, violence, murder. There was no crime I did not commit.
[00:48:14] And for all that people praised my conduct and my contemporaries considered and consider me to be a comparatively moral man. So I lived for 10 years. Yeah. The other thing I would say about it is the kind of romanticizing of the goodness, the simple goodness of the Russian serf.
[00:48:33] Yeah, that doesn't sit as well with me today. I mean, I think he really did believe it, and he was very devoted to working with and teaching the peasants and the serfs on his estate.
[00:48:48] But like it really does read almost like noble savage kind of stuff. Totally. And we'll talk about it, but how it ends is still not quite clear to me. Like I'm not quite sure where he arrived. Which I like about it.
[00:49:02] Actually, like that's one of the ways in which I think it's not as didactic as some of its critics, you know, the people who don't like it say it is. The haters. The haters.
[00:49:13] Sorry, I was just going to... and you know, if it is a little superior, like, he writes this after he's written War and Peace and Anna Karenina and is actually recognized as probably the greatest novelist who ever lived
[00:49:25] and maybe will ever live, which I think he might be. And it only comes through a little. I think, as I was saying, it's hard to write an autobiographical confession like this without some self-aggrandizement, if only because why would
[00:49:38] anybody care to read about this crisis of yours? And the truth is we do care, and I am interested, and he is a great writer. So yeah, it justifies it. Should we talk for a second about this genre?
[00:49:52] I would say more about it, but I don't know that much about the genre of confession. It does seem to be a genre, though. This isn't just him writing off the top of his head, an account of his crisis.
[00:50:04] Like it has the form of these kinds of confessions, like Augustine's. Yeah, John Bunyan, I think, had a confession that I read. They judge themselves so harshly in this very Christian way, where you're like, I am shit. I was nothing. And there is some of that in Augustine's Confessions.
[00:50:26] It is also sort of weirdly self-aggrandizing. Like, why would you think anybody would care about these small mistakes that you made when you were young? What did they do? Like something with a pear? He stole a pear or something? Yeah.
[00:50:40] And so it's like they're beating themselves up, but then they're also like, oh, and I had sex with my mother. But really it's that pear incident that God must hate me for. No, but it has that kind of structure, right? Like, I had this dissolute and just immoral youth.
[00:50:58] Then this crisis, often a period of asceticism, just pure denial, self denial, and then some kind of like resolution to that, right? Yeah. Yep. And there is a genre of preaching like a sermon that is what I used to
[00:51:16] call "when I was young, I had fun, and then I came to God," where they would fill their sermons with explicit descriptions of, you know, I fucked 100 women and I did drugs. And then Jesus told me to stop doing that. And here I am.
[00:51:34] I was like, let's talk about the first part. But this one doesn't end really with a conversion. No. In fact, it kind of passes through any temptation in that direction. But I think that's where a lot of these kinds of confessions diverge.
[00:51:54] There's also... it's not just Christian, because you have versions of this with the Buddha, you know, again with that dissolute early life, asceticism, and some kind of almost middle path. And I think that's in some ways a good description of where Tolstoy ends up.
[00:52:12] It's his own version of a middle path. Yeah. And he doesn't strike me as extreme, like some of the neurotic Christian confessions where they're so hyper-focused on their own evilness. Tolstoy says he was a bad person,
[00:52:27] but he's not neurotically focused on any one thing. And he also doesn't seem like he goes to any extremes, other than maybe at the point of the deepest despair, but even that seems like not an unreasonable place to go.
[00:52:46] And he doesn't glamorize his past, the sins of his youth. Like I think sometimes in this genre of thing, you can tell they're still pretty excited about all their misdeeds. Yeah. I was very interested in the beginning.
[00:53:05] The first part is that he was raised Russian Orthodox. So he was taught, as everybody at the time obviously would have been. He was just sort of indoctrinated into the dogma, the beliefs of the Russian Orthodox faith.
[00:53:23] And when he was 16, like it sort of just slowly came on, but it came on early, like he had just a point where he was like, I don't really believe this stuff. Like this stuff seems. It's not a big deal. I don't believe it. Yeah.
[00:53:40] And so he says, at the age of 15, I began to read philosophical works. My rejection of the doctrine became a conscious one at a very early age. From the time I was 16, I ceased to say my prayers and ceased to go to church
[00:53:53] or to fast of my own volition. I did not believe what had been taught me in childhood, but I believed in something. What it was I believed in, I could not at all have said. I believed in a God or rather I did not deny God,
[00:54:04] but I could not have said what sort of God. Neither did I deny Christ and his teaching, but what his teaching consisted in I again could not have said. So really he had like a sort of disillusionment, I think with religion and the church and the institution.
[00:54:15] And yeah, but still had the spark of, like, there's something... there's at least enough to keep me going. Yeah. And it sounds like whatever that thing was, it just started having less and less of an effect on him.
[00:54:30] Yeah, he says right after the passage that you're talking about that he starts to worry more about the impression he's making in the eyes of others and how he can succeed relative to them.
[00:54:47] Which is, I think, just a big theme of most of the life that he is rejecting now as a fifty-something-year-old. Yeah. He seems to really regret that so much of what he did was motivated by how others viewed him.
[00:55:03] Yeah, kind of pride, vanity, like wanting to have other people say how great he was. And the funny thing is like he is great. Like it's just right. I mean, it's too bad that was meaningless because he really was great.
[00:55:21] I love this. It's such a good little Tolstoy note, just to give you the flavor of the kind of worldview he had and the kind of people he was around.
[00:55:33] He tells this story, a kind hearted aunt of mine with whom I lived. One of the finest women was forever telling me that her fondest desire was for me to have an affair with a married woman.
[00:55:45] Rien ne forme un jeune homme comme une liaison avec une femme comme il faut — nothing forms a young man like an affair with a respectable woman. It's such a great little thing. Like, you know, they're sophisticated. They speak French. They think it's like adultery, having an affair with a married woman.
[00:56:00] It's something that you have to do. It's part of the rite of passage to becoming a sophisticated Russian aristocratic man. And there is just a kind of, I don't know, debasement to it, or, you know, it perfectly captures...
[00:56:17] But you believe that they're kind-hearted. Like, she meant it. That could be, you know, like Christina would say that to me, you know. Like, they're not bad people. Nobody's a bad person.
[00:56:28] It just gives you a sense of, like, OK, this is the water he's swimming in. Exactly. Yeah. So when I was reading this, I was like, I wish I had an aunt who would have told me that, you know. What a douche.
[00:56:41] But here is the time in his life where the quote that I read earlier about his evil ways started and here's also where he starts writing. And it's weird because I didn't pay attention to when the timing of this was compared to when his major works were.
[00:56:59] But I did feel a little sad about him saying this during the time that I began to write, during that time, I began to write from vanity, covetousness and pride. In my writings, I did the same as in my life to get fame and money for the
[00:57:12] sake of which I wrote it was necessary to hide the good and to display the evil. I found myself thinking like, was this his mental state when he was writing some of the great shit that he wrote? Yeah. In fact.
[00:57:22] So he says, another 15 years went by. In spite of the fact that during these 15 years I regarded writing as a trivial endeavor, I continued to write. And that was when he wrote... It's like, not only does he now think that it was a trivial endeavor,
[00:57:40] he said he thought of it then as a trivial endeavor. And if you read War and Peace, it's not a trivial endeavor to write that, or to read it. I mean, it's great. It's fun. It's awesome. It's just completely fascinating on every level. Yeah, I agree with you.
[00:57:55] That's sad. Yeah. So it's sad. And he says this thing, I don't know what your translation says: How often in my writings I contrived to hide under the guise of indifference or even banter those strivings of mine toward goodness, which gave meaning
[00:58:09] to my life, and I succeeded in this and was praised. He's saying he made his books, his writing, his characters have some sort of indifference to moral goodness. And, you know... but that's just not true, though. That's not what happens
[00:58:28] in War and Peace and Anna Karenina. They are struggling with exactly the same questions that he's struggling with now. So it's just strange that he would... It's odd. He sort of condemns others for praising him, in a funny way. He's like,
[00:58:46] You know how I knew they were terrible? They liked it. So he gets accepted into the circle of writers in Petersburg after the war. I guess the Crimean War, like that he fought in. And he again doesn't have very kind words, you know, for that society of writers,
[00:59:04] but they took him in. And he says they were like a religion. Yeah. They were teaching, he says, but they didn't know what they were teaching. They had nothing to teach. But nonetheless, they took it upon themselves to teach all of humankind. Yeah.
[00:59:18] And at one point, he writes: my belief assumed a form that it commonly assumes among the educated people of our time. This belief was expressed by the word progress. At the time, it seemed to me that this word had meaning.
[00:59:33] Essentially just like Steven Pinker, right? Like, everything is great. The Enlightenment's great. Everything's getting better. It's progress. We're making progress through reason. Tolstoy is already thinking, it's sad that I, or at least 80 percent of me, could have ever believed that bullshit.
[00:59:53] Yeah, it's interesting, because, you know, he questions it. He says everybody's talking about progress, but progress toward what? And he has this moment where he sees an execution in Paris, right? And he sees a guy's head get lopped off, separated from his body. And this is progress?
[01:00:12] This is what people are fighting for, for that kind of what he thinks is evil. And I got the sense there and toward the end that he actually, you know, his pacifism starts coming through. He really does seem to have this compassion for other human beings.
[01:00:30] And he does seem to think that, you know, if there is anything we can believe, it's that we shouldn't do that to each other. Yeah. He says, when I saw the head severed from the body and heard the thud of each
[01:00:41] part as it fell into the box, I understood not with my intellect, but with my whole being that no theories of the rationality of existence or of progress could justify such an act. I realized that even if all the people in the world from the day
[01:00:54] of creation found this to be necessary, according to whatever theory, I knew it was not necessary and that it was wrong. Therefore, what is right and necessary must be judged not by what people say and do, and
[01:01:08] not by progress, but by my own heart. And then he also talks about the death of his brother. But I think there is this kind of... it sounds maybe almost megalomaniacal or some
[01:01:20] sort of thing, but there's a kind of humility too: we haven't figured out some theory that can justify an act of that kind of simple brutality. Right. And our primitive, bare intuition says, wait, there's something deeply fucked about that.
[01:01:37] That's something that the educated people are dismissing. They would say it's just our evolution acting up, or even Paul might say that's just kind of mistargeted empathy or something like that.
[01:01:51] But I think what he's committed to there is that all of that is not as powerful as the simple conviction he had when witnessing it: that it was wrong. Yeah, he calls it a superstitious belief in progress. Yeah.
[01:02:07] And at his very core, like you said, if what it takes to progress in the way that you say, if what it takes means this kind of action, then it can't be right. And I don't know what in the end he thinks about it.
[01:02:28] I guess he turned out to be a real pacifist. Yeah. Yeah. And probably inspired the pacifism of others. And he says, this is so dear to my heart. No theories like he's talking about the death of his brother.
[01:02:40] No theories could provide any answers to these questions, either for him or for me during his slow and painful death. This episode of Very Bad Wizards is brought to you once again by one of our favorite sponsors of all time, GiveWell.org.
[01:02:55] It can feel great to donate money and to make a difference in someone's life. But how can you feel confident that your donations are improving or saving lives effectively? You could do weeks of research, find charities, what programs they run,
[01:03:09] how effective they are and how those charities might use your money. Or you could visit GiveWell.org. There you'll find free research and recommendations about the charities that can save or improve the most lives per dollar. I would definitely do that over the weeks of research.
[01:03:27] Yeah. GiveWell spends over 30,000 hours each year researching charitable organizations and only recommends a few of the highest-impact, evidence-backed charities they've found. And over 110,000 donors have used GiveWell to donate more than one billion dollars. One billion dollars. That's crazy. Rigorous evidence suggests that these donations will save
[01:03:52] tens of thousands of lives and improve the lives of millions more. And the best part about this is that GiveWell is 100 percent absolutely free. GiveWell really wants to empower as many donors as possible to make informed decisions about their donations.
[01:04:10] They publish all of their research and recommendations on their site for free. No sign up required. It's just all there at your fingertips. And they allocate your tax deductible donation to the charity you choose without taking a penny. Yeah. And I give to this every year.
[01:04:26] Our listeners have been giving a ton of money through GiveWell, which we're very proud of. I always, or I think maybe with one exception, have given to GiveDirectly. That's one of their top charities, the cash transfers for extreme poverty.
[01:04:43] And I know you just let them, the algorithm... Algorithm, just take me away. You're already giving in to the computers, you know. Just because we couldn't distinguish it from one of them doesn't mean they should choose our charities.
[01:05:00] So go to GiveWell.org and make sure to pick podcast and enter Very Bad Wizards at checkout. Make sure they know you heard about GiveWell from Very Bad Wizards. This is something. This has become a real tradition that we'd like to keep going as much as possible.
[01:05:18] Our listeners donating to GiveWell. Again, that's GiveWell.org. Thanks again to GiveWell for sponsoring this episode. Yeah. You know, I remember taking an adolescent development class... actually, it was a sociology of religion class.
[01:05:37] And in these narratives, there are almost always moments that tear down everything that you believe in. And it's not until you can tear those things down that you can build up anything that will be worth it for you, anything that will be firm
[01:05:54] enough to be a foundation. So, as many people feel this, being confronted with evil, the beheading of this guy and his brother's meaningless, slow, painful death, there's nothing that he had in him that could properly take that in and say, OK, this is why it's happening.
[01:06:15] It was a pre-theoretical sort of... just, this is just wrong and bad. And you can't convince me that somehow it fits into this larger picture of goodness or progress or justice or whatever.
[01:06:31] Like, I know in the depths of my being, as well as I know anything, that there's something so deeply fucked about the fact that my brother could... because the way he describes it, his brother just got sick and died,
[01:06:43] and it was painful and slow and he never had any kind of revelation. And that was it. And he just died. Yeah, I have highlighted here because it just hit in a particular way. He says died painfully, not understanding why he had lived
[01:06:58] right, and still less why he had to die. And to have the luxury of being able to even question life and existence is something that his brother didn't have, and that many, many people don't have. And yeah, I feel like sometimes
[01:07:17] it takes a very personal thing like that for you to start questioning what the whole point of it all is. You know, and sometimes I've actually been harsh on people whose faith gets shaken, as somebody who grew up in a very Christian, faithful kind of environment.
[01:07:37] It would bother me when, you know, people would start questioning the problem of evil when their grandma died. And I would say, like, didn't it bother you when millions of people died, like in the Holocaust? Knowing about the Holocaust,
[01:07:49] now that it hit home for you, now you're questioning whether God is true and just. But sometimes that's what it takes. And yeah, that's sometimes what will hit the hardest, because that's how we are, that's how we're made. We care deeply about those around us.
[01:08:06] And sometimes our faith gets shaken up only when we see those people suffer. And then we realize, wait, shit, this happens to everybody. And it shows also that a lot of the stuff, and this is,
[01:08:19] I think, his point: a lot of the stuff that we believe is on shaky ground, the stuff that allows us to think that everything is good and progressing and makes sense. And we haven't really
[01:08:33] examined that. And so what we need and what Tolstoy got was just a few shots to the system that made him like actually reflect in a serious way. And one of the things he thinks is Russian aristocratic, that whole group of artists and, you know, successful, wealthy landowners,
[01:08:56] they're fundamentally unserious people who haven't really examined the grounds that allow them to carry on in the kind of complacent way that they do. Right. You know, there's a part here where he says that like after that, he was worn out and he had mental confusion.
[01:09:18] And, you know, this is when he was working as an arbiter, sort of trying to make peace between the lords and the serfs. He felt ill, threw up everything and went away to the Bashkirs
[01:09:28] in the steppes to breathe fresh air, drink kumys and live a merely animal life, which sounded like just a kind of awesome thing. Like Outward Bound for Russian people having an existential crisis. Yeah. So I like the way he describes it.
[01:09:48] And it's very reminiscent, to me, of Ivan Ilyich. He says, it happens with me as it happens with everyone who contracts a fatal internal disease. At first there were insignificant symptoms of an ailment, which the patient ignores — just these little doubts, you know?
[01:10:03] Then these symptoms recur more and more frequently until they merge into one continuous duration of suffering. The suffering increases, and before you can turn around, the patient discovers what he already knew: that the thing he had taken
[01:10:16] for a mere indisposition is in fact the most important thing on earth to him, is in fact his death. And so, all of a sudden, he's describing a kind of mental breakdown, but one that comes on so gradually that you don't even notice it
[01:10:33] until it's making you start to question whether you want to live or not. Yeah. You know, we need to read Kierkegaard. I couldn't help but think of Kierkegaard when I was reading this. Definitely. And sort of his description of like the sickness
[01:10:47] unto death and the sort of anxiety, and especially the leap of faith, which we'll get to. And here's where it starts. He starts articulating the problem. He says he didn't know the reason for anything. He just asked himself why. It's like the Nagel problem in The Absurd.
[01:11:03] He's like, so you'll have 6,000 desyatinas in the Samara province and 300 horses — what then? And he would go down the line of all the different things, including the last one: very well, OK, you're going to be more famous than Gogol or Pushkin or Shakespeare
[01:11:18] and Molière, more famous than all the writers in the world. So what? And I could find absolutely no reply. Yeah, it's all just very Ecclesiastes. I was a badass, but who cares? But who cares? Like, what's it all for? That's right.
[01:11:36] And I think, like, people who are successful, and people in every kind of relative stage of life — you do start to realize that that's a question that you can pose without answering. And when we talk about this, I'm sure, in the absurd episode —
[01:11:54] we will — you and I were talking a little bit, not directly about this, but just about, you get to a point in life and you're like, well, what's my contribution going to be? Like, what does it matter?
[01:12:03] Anyway, like, you can't be a reflective person and not get somewhere near there, I don't think. How you deal with those questions — some people can easily push them out of their mind. And Tolstoy says that he would try to just engage in as much self-deception
[01:12:21] as possible. But I do like how he ends. I love how he ends chapter three, right after what you said, where he says that he never had a foundation to be able to handle these kinds of questions.
[01:12:32] So he says, I felt that what I had been standing on had collapsed and that I had nothing left under my feet. What I had lived on no longer existed, and there was nothing left to live on. Just despair.
[01:12:43] So then — now we're entering crisis mode. And here's where I have some... I feel like the case against living isn't as airtight as Tolstoy thinks it is. But what he gets across so vividly is, like, he had no desire,
[01:13:00] as he said, whose satisfaction he would have found reasonable. It just stopped mattering to him whether he got his desires fulfilled or not. And once that happened, it was like, well, then who cares about anything?
[01:13:14] Like the so what question isn't like a kind of academic. Oh, yeah, what does it matter that I'm like the best novelist that ever lived? It's like it becomes like a practical problem of why should I act at all if I don't care what happens?
[01:13:30] It's interesting that for some people it would become so... you know, it sounds like just true depression, but it's this existential despair that ends up being so strong that he ceases to be motivated to do anything.
[01:13:46] And I can't say that I've ever gotten to that point where I'm like, I just don't want to do anything. And it's like — so he's not yet 50 years old, so he's in good health. He says very good health, and he has a wife.
[01:14:01] He actually starts to brag about how good of health he's in. Yeah, exactly. He's like, I'm more fit than most people my age, than most of the peasants that I worked with. They were like, damn.
[01:14:12] And he had written War and Peace — he's the most celebrated author maybe in the world, just undeniably the supreme artist. And yet, like he said, he had to employ ruses against himself to keep from committing suicide.
[01:14:28] He says it seemed to him like somewhere someone was now amusing themselves, laughing at me, at the way I had lived for 30 or 40 years, studying, growing, developing in body and soul, laughing at how I had completely matured intellectually
[01:14:43] and had reached the summit from which life reveals itself, only to stand there like an utter fool, clearly seeing that there is nothing in life, and there never was and never will be. And it makes him laugh. So he feels like the universe is almost antagonistic.
[01:14:59] It's laughing at how stupid and petty and pathetic he was, thinking that all these things had — that there were stakes to anything, you know? Right. Like a demon being like, I made you think something mattered. Yeah. Yeah.
[01:15:15] In the way that we do. Like, I remember with my cat — when I was first with my now wife and we were living in an apartment with our cat, and we
[01:15:25] saw him walking in front of the TV, and we called him to come up on the couch with us. And he's like, no, fuck that. And he just kept walking into the kitchen, and we were laughing like, oh,
[01:15:37] he's busy. He's got a lot of shit to do. He doesn't have time for us. And then I remember thinking — but couldn't somebody say that about us, you know, for all the things that we do? Like, oh, we're busy.
[01:15:49] We don't have time, like, we have to go to this committee meeting or put out this episode or whatever, you know? Right. Yeah. If someone were above us, they would laugh. Yeah. But maybe they are right now. Maybe they are. Fuckers.
[01:16:07] Yeah. And he says that he still feared death, though. Right. So even though this meaninglessness is moving him toward the conclusion that, well, one, death is inevitable, and two, none of his actions are meaningful —
[01:16:23] so why not die sooner rather than later? — he still doesn't want to die. Right. Which he will say is the cowardly kind of solution at this stage. Do you want to talk about the Eastern fable? Yeah, I love it. Yeah.
[01:16:38] So he says, there is an Eastern fable, told long ago, of a traveler overtaken on a plain by an enraged beast. Escaping from the beast, he gets into a dry well, but sees at the bottom of the well a dragon
[01:16:50] that has opened its jaws to swallow him and the unfortunate man not daring to climb out, lest he should be destroyed by the enraged beast and not daring to leap to the bottom of the well, lest he should be eaten by the dragon.
[01:17:00] Seizes a twig growing in a crack in the well and clings to it. His hands are growing weaker and he feels he will soon have to resign himself to the destruction that awaits him above or below. But he still clings on.
[01:17:11] Then he sees the two mice, a black and a white one go regularly round and round the stem of the twig to which he is clinging and gnaw at it. And soon the twig itself will snap and he will fall into the dragon's jaws.
[01:17:21] The traveler sees this and knows that he will inevitably perish. But while still hanging, he looks around, sees some drops of honey on the leaves of the twig, reaches them with his tongue and licks them. So he says he too clung to the twig of life.
[01:17:32] I love that parable. Yeah, it's interesting. Obviously, I think the dragon is, like, natural death, I guess. Right. And the mice eating at the twig — they're eating it at a rate that you can't fully predict.
[01:17:49] So I guess that's sort of the uncertainty of how long you have. Yeah. I guess the black and white is like the sort of day and night. Like the the progression of time. Yeah. Yeah. We all know it's inevitable. Right. We will all die.
[01:18:05] Right. And it's just a question of when and where we don't know when. What about the wild beast that's chasing him? It's chasing him. So, yeah, I don't know. Maybe that is survival. Like you're working hard to survive day in, day out. You're trying to escape the beast.
[01:18:26] And then there's you see the dragon is there. It's going to be there at the end no matter what. So you're working hard. You're eating, you're staying away from danger. You're doing everything you can to avoid the beast. Right. But the dragon is at the bottom.
[01:18:40] Right. I think that's totally right. In all the years that I've taught this, I don't think I've ever thought of that. But, like, the beast represents all the things you have to do not to die prematurely.
[01:18:52] Like I thought maybe it was like the urge to suicide. But I think your view is right. It's like all the things that you have to do to keep yourself from dying. And then meanwhile, you're just going to die. Eventually.
[01:19:04] Yeah. So it does seem kind of ridiculous, right? Yeah, absolutely. And the honey, though — the last thing is the honey, which I guess he says represents his art and his family, whom he loved. But at this point, it's not sweet to him anymore.
[01:19:22] Like the art, like all the love and the meaningfulness of whatever relation he had to those things just doesn't taste as sweet to him right now because he now sees that the dragon is inevitable. And that means then what's the point of all of any of it?
[01:19:39] I feel like someone could make an awesome back tattoo out of this fable. Yeah, exactly. A dragon and a beast. There's a part that hit kind of hard for me. I don't know why, but he says family said I to myself.
[01:19:53] But my family, wife and children, are also human. They're placed just as I am. They must either live in a lie or see the terrible truth. And it's like, it's very easy to fall back on the fact that you care because of
[01:20:06] your family, you have love — but the thought that they're just in the same position as you are, that they're just human... And so it's not... I don't know, I can't put into words why that sort of hit. You know, you realize sometimes, you're like, yeah, you know,
[01:20:23] my kid is just another person. Like, I don't know, they're going to have to go through all of this too. Yeah, it's so funny, because that doesn't hit for me. It's like, well, what did I think that they were in the first place? You know? Like something else?
[01:20:41] Yeah, maybe it is that it can't be a solution to the existential crisis. In the way that, like, I find solace in that I live for my daughter — I do. But it's like, why should I love them?
[01:20:56] Why care for them, bring them up and watch over them? Like you can ask those questions, but it's like, well, first of all, why not do those things? Like, why not love them? It's like the thing that makes you the happiest and most joyous. It's good for them.
[01:21:09] It seems like everyone's a winner there. Like you don't need answers to those questions. Why you should bring them up because everything is, I don't know. Like, yeah, this is what life has to offer, but these are the good things that life has to offer.
[01:21:23] Why are you questioning those? And then it's like, so they can sink into the despair that eats away at me, or turn them over to stupidity? That just seems to me like a false dilemma. It's like, well, I don't know, because he thinks this despair is, like,
[01:21:39] logically entailed by our condition, the condition represented by that fable. But I guess that's the part where there's a little bit of a gap, something that I've never fully gotten. I think... yeah, we can talk about that. You know, when we —
[01:21:58] Well, OK, so I think what he's saying is, like, he's bordering maybe on something like anti-natalism here, where he says, sure — I brought these people into existence, and yet they're going to have to go through the same shit as me unless they're stupid.
[01:22:14] Right? So it's like... well, I think not necessarily, because he's in that dilemma right now. He thinks that he could have gone on being stupid and ignorant about the truth, this harsh truth of existence, or he could confront it.
[01:22:29] And both of those seem like terrible options to him. He doesn't want to be stupid and fooled by — by whatever, the false hope that anything that you do matters. You know, I can see why you would think, if my child is intelligent at all,
[01:22:47] they might come to this depth of despair that I do. Or they would be stupid. But I think what you're saying, though, is that he hasn't gotten to the point that Nagel gets to in The Absurd, which is a solution that Tolstoy,
[01:23:03] I don't think, ever really arrives at, which is what you were getting at: there is meaning in relationships, there is meaning in art. There doesn't have to be a further justification. Yeah, you don't have to keep digging past that.
[01:23:19] I think the key is that the honey has lost its sweetness. That's not something that's rational. Like, two things are inevitable: the mice are going to finish eating the twig, and you're going to fall to the dragon.
[01:23:33] But what's not inevitable or logically entailed by anything is whether you find the honey sweet or not. Right? Like, and I think the fact that he didn't find the honey sweet, the fact that these pleasures are no longer pleasures for him,
[01:23:45] that is an emotional reaction to the facts that is no more logically entailed than the emotion of, like, seize the day or something — like, well, then we should just enjoy everything to the utmost while we can.
[01:24:00] Like, he's going to have some problems with the kind of Epicurean version of that, which we'll get to. But this thing has always bothered me about the way he presents the problem: either you're just an idiot or totally self-deluded, or
[01:24:16] you should get to this life-or-death point where the fact that you're eventually going to die, and that all these relationships will be extinguished — that should make you just question whether you want to live at all.
[01:24:27] Well, to defend him a bit, this is the same sort of core foundational intuition that he has with the empathy for his brother dying, or for the man being executed, where there's something in the core of him
[01:24:44] that thinks, one, we can't make other people suffer like this — that's terrible. And two, we're living under delusion. We're all, like — the beast is there. So much of what we do is to distract ourselves from this harsh truth.
[01:25:00] And I get that he's feeling this at his core, and I don't know why it would suck the sweetness out of his life. I don't think he would think... Well, I mean, he does talk like it is entailed.
[01:25:13] He does talk like this is the only conclusion that you can reach. There is — when he talks about trying to find meaning, he also has a sort of lengthy discussion of science and seeking answers in science. Oh yeah, science and philosophy. And philosophy.
[01:25:27] And I guess that he was living in this age of enlightenment and science did offer answers to a lot of things. I never thought that anybody would think that science would have an answer to like the meaning of life.
[01:25:41] But, I mean, The Moral Landscape — friend of the show Sam Harris, right? Like, yeah, I think there are a lot of people who disagree with that optimism. Yeah. Yeah, I guess so. I think there are a lot of books in that line. Well, Hume's law, is-ought —
[01:26:00] it's much more complicated than that. But I like the way he describes science, right? It's like they give answers to questions I didn't ask. He says science provides a number of precise answers to questions I had not asked: answers concerning the chemical composition of the stars,
[01:26:15] the movement of the sun towards the constellation Hercules and all these things. He says, but the answer given by this branch of knowledge to my question about the meaning of my life was only this. You are what you call your life.
[01:26:26] You are a temporary random conglomeration of particles. The thing that you have been led to refer to as your life is simply the mutual interaction and alteration of these particles. It's like Richard Dawkins could be writing this, right? You're a little lump of something randomly stuck together.
[01:26:41] The lump decomposes; the decomposition of this lump is known as your life. The lump falls apart, and thus the decomposition ends, as do all your questions. I was going to say, less Dawkins even, more Crick — like that
[01:26:53] kind of naive science where it's like, oh, science has discovered that we are... this hardcore reductionism that somehow has implications, to them, for identity and for meaning. The best science can do is tell you how these things work and maybe give you
[01:27:10] this terror that you are just a bunch of cells working together to make a human being, but it's not going to give you any answers in the positive. For anything. It's not there's no purpose for it. It's just this thing that happens.
[01:27:23] And, like, according to these natural processes... You know, it's funny, because I think we're sort of snidely saying this sounds like Dawkins or Crick — like, it's almost like shock journalism, kind of like, you know, this is it.
[01:27:36] You can't handle the truth, but this is it. But it's also, like, true. It's kind of how I actually do think, in the end, even though I'm temperamentally maybe attracted to not phrasing it in these
[01:27:52] ways. Like, yeah, do I think it's more than that, in spite of my, you know, agnosticism? Not really. You know, I'm just not bothered that that's the only... Yeah, it doesn't deeply bother me to know, like, the reductionist
[01:28:10] accounts don't deeply bother me because I just think well, obviously that the organization that came together for all these particles to exist in this time and place, that is me. That's just another way of saying I'm going to die, I guess. Right.
[01:28:24] At that level of analysis, it doesn't bother me. Right. There's a great — this is in Star Trek: The Next Generation. There are these little aliens who are pretty much microscopic, but they're sentient. And when they finally are able to translate what they're trying to
[01:28:41] communicate to the humans, they're saying that we are ugly bags of mostly water. And that's just a great way to describe human beings. Yeah, especially me. I go to the hospital because I drink too much water. That's right.
[01:29:01] Yeah. So then philosophy — he says, and there are some good lines here, that philosophy at least gets the question, but just finds different ways of phrasing the question and clarifying the question.
[01:29:16] And whenever it tries to actually answer it, it just comes up with: there's no answer. You know? Yeah, he's basically accusing philosophers, like you said, of being circular. I mean, like, oh yeah, why live? Because we have this will to live or something.
[01:29:32] Like, it's the life force that makes us want to continue, and it's not adding anything. He even at some point says, like, yeah, if you have an equation,
[01:29:46] whatever you put into it is what's going to come out, and philosophers aren't getting anywhere new when they're doing this analysis. They're just, like — you know, probably at some point philosophers recognized the problem, if maybe they weren't the first,
but then they've just been formulating it in different ways for, you know, three thousand years or whatever. But when they're really being honest, they're like, I don't know, but here's a new way of clarifying the question. Right. I've never read Schopenhauer, by the way.
[01:30:19] I was thinking that — I haven't really either, aside from excerpts, but we should. We should. Yeah. There's a great — I highlighted this part. He quotes Schopenhauer at length: that we abhor annihilation so greatly, or, what is the same thing, so greatly desire to
[01:30:33] live, is simply another expression of the fact that we strenuously will life and are nothing but this will and know nothing besides it. Yeah. So how do you interpret that? I couldn't, like... yeah.
[01:30:51] Yeah, that's why I felt like I needed to read more Schopenhauer. It's like the beast, right? Like, we're just running from the beast. So we're strenuously willing ourselves to live, but we're just doing it because there's a beast behind us.
We're not thinking about what it's all for or something. We're built — we're built to want to live. Yeah, that's what life is: a desire to continue life, right? Whether that's through daily survival or through reproduction.
[01:31:20] Like, life wills to continue living and to continue life. Yeah. All right. You know what? We still have a long way to go, David, and at the risk of trying our listeners' patience... Yeah, they can only handle so many three-hour episodes.
Exactly. Like, our last one was like two and a half hours. So maybe we should stop here and pick this up again. There's so much more to go. This might be a good separation point, because he has lost his faith, he's lost his sense of why anything matters.
[01:31:54] Next time he's going to list some alternatives of how to deal with that situation and all of them are awful and he's going to be at this like bottoming out point where he's considering suicide, but then he will turn to faith as a way out.
[01:32:12] But what he means by faith is, I think, very much open to question. And yeah, we still have a lot more to get into here. All right. Join us next time on Very Bad Wizards.
