Episode 247: Open the Pod, Dave (with Sam Harris)
Very Bad Wizards · October 18, 2022
02:34:31 · 177.26 MB


We welcome Sam Harris back to the show for a deep dive into Stanley Kubrick's confounding 1968 masterpiece "2001: A Space Odyssey." How long is the Dawn of Man? What does the second monolith do, exactly? Why are the humans so banal and expressionless? What are HAL's motivations? Has he planned his mutiny from the start, or does the Council's deception make him malfunction? Or something else? Who is the Council anyway? Was HAL meant to go through the stargate? What is the final leap forward in consciousness? The hotel room, the starchild, all the rectangles, rectangles everywhere, the music – what does it all mean????

Plus Sam has some thoughts about our Rorty episode and David tries to rile Tamler up about Kanye's antisemitism.

note: there's a bit of an abrupt transition between our brief opening and Sam telling a story about Rorty at around the 9-minute mark... couldn't be helped.

Special Guest: Sam Harris.


[00:00:00] Very Bad Wizards is a podcast with a philosopher (my dad) and a psychologist (Dave Pizarro) having an informal discussion about issues in science and ethics. Please note that the discussion contains bad words that I'm not allowed to say and, knowing my dad, some very inappropriate jokes.

[00:01:01] Welcome to Very Bad Wizards. I'm Tamler Sommers from the University of Houston. Dave, I've done too much podcasting this week. I'm beat. I didn't have the energy to come up with a question, so why don't you ask me a question?

[00:01:29] Does it bother you at all that Kanye West said anti-Semitic things? As a Jew, I mean. Given that we control, like, world finance and the media, I think we can handle this. Like, I think we'll be okay. You're feeding into the conspiracies.

[00:01:49] It shouldn't be something for us to worry about too much. Yeah, no, like Death Con 3. It's not as bad as Death Con 1 or 5, but it still might be pretty bad. It's a mediocre one, you know? Like pick an extreme.

[00:02:06] I feel like he was fence sitting on this one. Like that's a con. Yeah, if you're going to come at the Jews, you best not miss. You know, we laugh to keep from crying. I'm just laughing, not to keep.

[00:02:25] I was actually genuinely curious whether even a part of you was bothered by it. You know, he's in that category of people that like he can say anything like he could say that like my daughter is just horror or something like that.

[00:02:41] And it'd be like, yeah, but it's kind of like, you know, whatever, you know, where I'm not going to get all indignant about something that, you know, a guy who's in that phase of just a little bit kind of bordering on craziness.

[00:02:54] Jordan Peterson is actually like this for me, too. Like, I'm just not going to get worked up about it. Yeah, I can never get you worked up about anything Jordan Peterson says. No, because once they're in that category, it's the objective attitude.

[00:03:06] You know, it's like just something to deal with if you have to. But if you don't have to then you just leave it alone. Like I don't think that's going to be bad for Jews that he said that.

[00:03:19] I think people are worried about the rise of anti-Semitism, and, like, casual anti-Semitism by people like Kanye is kind of distressing in context. But, like, I was watching... for some reason, like, I've been into watching

[00:03:38] like boxing on YouTube, and I was reminded that Mike Tyson has that amazing one-man show he did with Spike Lee on Broadway. Did you ever watch that? No. It's really good. It tells the story of his life.

[00:03:52] Yeah. And he's a really interesting guy, like a really talented, interesting guy. Yeah. I really like him, aside from the, like, sexual assault. Yeah, he does discuss it in the show. He does. Yeah. He's very candid about everything. Is this on YouTube or something? It is on HBO.

[00:04:09] Oh, OK. And at some point he doesn't, like, blatantly insult Jewish people, but he says, like, when Cus D'Amato, his trainer, died, you know, he's like, I had been already set up with, like, all of the Jews: my accountant, my lawyer.

[00:04:25] And so he's talking about when he had to confront this one idiot, like a small-time gangster, who got in a fight with Mike Tyson and actually ended up fighting him.

[00:04:35] He's like, and I didn't want to, like, be Brooklyn Mike Tyson and get, like, super upset. So I tried to channel, like, the Jew, you know, just be like, buddy, buddy, you know, we don't have to deal with it this way.

[00:04:52] And then he's like, it didn't work for me. So I had to punch him. That's awesome. I love it. And I was like, oh, yeah, it's a little different, you know, now that Kanye said that. He ruined the Mike... he ruined the Mike Tyson joke.

[00:05:07] It's a good bit. Yeah. But much more worrying to me is the sort of more anonymous rise of anti-Semitism, like synagogues getting burned down and, you know, whatever. That's more worrying to me than, you know, anything a celebrity says.

[00:05:24] You know, it's like, Mel Gibson. Did I get worked up when Mel Gibson, like, went on his rant? No, because that's just part of being Jewish. You figure that people are going to do this every so often. I've told this before, but, like, anti-Semitism,

[00:05:38] do you know the definition of anti-Semitism? No. Disliking Jews more than is called for. You can say that. When I say it... I mean, you say "like" so much. I don't know why that would be. I think Kanye West is legitimately, if you watch his documentary, the documentary

[00:06:06] that the three-part one that his friend did, like, with footage from his whole career. You see his decline into, like, severe bipolar. And it's just kind of a shame that, like, back in the day, somebody who lapsed into that sort of mental illness

[00:06:23] wouldn't have Twitter to let everybody know how sick they were. But now, like, he got banned from Instagram. We'll see if he gets banned from Twitter. But we see everything live as it's happening. You can see people's meltdowns.

[00:06:37] Yeah. And once they do it, if they're in some fucking drugged out stupor and they tweet something like that, like they can delete it. But like there's a million screenshots of it. Because there's so many people just on high alert for somebody like the shit

[00:06:52] that people probably said that we would never, ever know. You know, like, oh, yeah. You hear, like, the Nixon tapes and you're like, holy shit. Totally. Just the worst, most hateful shit. And we only get, like, little glimpses into that.

[00:07:08] I'm sure that still goes on, you know? Yeah. Yeah. I mean, just think of the shit that we say, you know, like when we're not recording, we didn't say what our episode is about. No. Right. Whoa. So today and actually this isn't even the opening segment.

[00:07:25] There is already an opening segment. Yeah. So we have Sam Harris on. He hasn't been on since, I think, 2019. That was the last one. Well, yeah. And we are going to talk about Stanley Kubrick's 2001: A Space Odyssey. We had a long conversation about it.

[00:07:46] It hasn't been edited yet, but I remember we felt pretty good about it at the time. Yeah, it was fun. Yeah. Yeah, it was fun. And yeah, so that's this. We're excited to bring that to you.

[00:07:58] Also, in the first segment, we're going to talk a little bit with Sam about the pragmatism episode from last time. You know, we knew that it was going to be like a three-hour discussion of 2001.

[00:08:11] So we tried to cut it short, but we think there might be some interesting stuff. Well, and we learned that Sam used to chase Richard Rorty around. Yes. Which I think is what he called himself, like a stalker of Rorty.

[00:08:27] You know, that's a little bit hard to say: Richard Rorty. It is. Only now when you said the name of 2001: A Space Odyssey did I really sit and think about that title, like the subtitle, A Space Odyssey. Because of the Odyssey. I was trying to tell you guys this.

[00:08:45] I know, I never connected that, because A Space Odyssey sounds dumb, but if you connect it to the Odyssey, then... Well, I don't know if you guys were done with Rorty. But you got my email about my apprenticeship to him. It was really dead.

[00:09:00] Yeah, I mean, when I say I was a stalker, I was a legit stalker. I took every course he taught, regardless of its subject matter, just so that I could argue with him about pragmatism. Were you fun at parties, Sam?

[00:09:16] It's so funny, because I can totally picture it: 19-, 20-year-old Sam Harris doing that. And if he ever was in the Philosophy Department at Stanford, he had been ejected from it. He was in the comp lit department.

[00:09:31] It was pretty interesting because he was teaching courses on William James and Nietzsche and Foucault and Habermas. And I'm not even sure how I got so fixated on him. But once I heard his rap as a pragmatist, I just thought, OK,

[00:09:44] this is the nullification of everything I think is true. So I just had to go after him. And, I mean, to say he was a good sport about it doesn't even get at it.

[00:09:55] I mean, he was really... you've got to think, if you've got this vociferous student, an older student at that. I had taken 10 or 11 years off between my sophomore and junior year, so already I looked like the very picture of dysfunction in some sense.

[00:10:12] And so I'm in every one of his classes and just hammering him on his core thesis. And you've got to think you'd respect that person just a little bit less, because, I mean, this is his core thesis. He thinks it's correct.

[00:10:26] And I am the poster child for not getting the point, ultimately. Yeah. Right. And at great length, submitting him 30-page term papers on why he's wrong and all that. But meanwhile, he was so willing to engage that

[00:10:40] I mean, the relationship was actually good enough that he was one of the main people I had write a letter for graduate school. So that's incredible. It was cool. That is a good sport.

[00:10:49] Philosophers are pretty good about that; they take it as a real sign of respect that somebody disagrees with them, even, you know, radically, and especially that generation of philosophers. They loved to scrap. And I think they had enough respect for, like, students to know that

[00:11:05] even if a student might not be getting it at the level that they're pitching it, like, that's part of the process. You know? Yeah. Yeah. Yeah. So have you softened a little bit on his view since? Not much. I mean, I understand it.

[00:11:20] I understand his, I guess, his political project more. I mean, I think at this distance I understand how tiresome debates about, you know, realism versus anti-realism or pragmatism became for everyone who had been in the trenches

[00:11:36] for decades at that point. It was all new to me, and I just had endless energy to try to hit something like epistemological bedrock. But no, I still think, I do think, pragmatism contradicts itself.

[00:11:54] I mean, I could say why in a pretty brief span, if you're interested, but I don't know if you guys are done with the topic. Sure. I mean, it's somewhat analogous to

[00:12:04] the knock on relativism that I think, Tamler, I heard you pooh-pooh, or maybe both of you did, that relativism performs some kind of self-contradiction in covertly saying something universal. That's what the paper was about that we were discussing. OK.

[00:12:20] It's kind of a reply to that objection. Right. OK, so I think pragmatism is guilty of at least two covert realistic claims in order to launch its project. And you have to listen to Rorty for a long time,

[00:12:39] or you might have to listen to him for a long time to see these explicitly. But the claim with pragmatism is that to say that something is true, to say that a statement about reality is true,

[00:12:51] is just to praise how it functions in conversation among people who are adequate to that conversation. It's like the history of our talking about this thing has shown that certain ways of thinking about this thing, whether it's physics

[00:13:03] or whether it's the natural world or viruses or anything, mathematics. Certain claims about this thing are so useful, and they so readily engender assent from all parties, that we just call those statements true. But we never, and this is probably almost a direct quote from Rorty,

[00:13:25] we never get to compare our descriptions of reality with a piece of undescribed reality; all we're ever dealing in are descriptions of reality. We never encounter reality outside our language game. Now, so my point would be... So essentially, he's saying that all forms of knowledge are in principle

[00:13:44] linguistic and conceptual. There is no direct contact with the world or anything else. We can never take a step outside our blends of language and perception to evaluate to what extent they correspond to the real reality, like some sort of objective, independent reality. Yeah.

[00:14:04] And there's a Habermas quote that he was fond of, which ends with Habermas saying something like, we never step outside the magic circle of our language, right? It's a good phrase. It's just language as far as the eye can see in the kinds of claims

[00:14:18] we make about reality and truth. But my point was, and still is, that this is making a realistic claim. And then there's another claim that is being made, which relates to the work of Donald Davidson, which is that all language games are in principle intertranslatable, right?

[00:14:38] So we would never find a real language user, even if it's a super intelligent alien from another galaxy, whose cognitive horizons we couldn't fuse with in the fullness of time. There's a... I forget Davidson's term.

[00:14:56] If we couldn't in principle understand it, it couldn't in principle be language. Right? I mean, that's almost like the universality of computation argument, but for language. In any case, the idea that all of our knowing is in principle linguistic

[00:15:13] and that all language games can in the end be made to converge: those are, on my account, two realistic claims about human knowledge and its possibilities. And all you'd have to show is that one or both of those is not true

[00:15:32] to make pragmatism fall apart. It's actually worse than that. You don't even have to show that they're not true. You just have to show that pragmatism is making a realistic claim in order to get its project off the ground.

[00:15:44] And my thing with Rorty, I mean, I was fresh off having spent a considerable amount of time on silent meditation retreats. And you know, that was my whole thing. And it was obvious to me that not all modes of knowing are linguistic.

[00:15:59] Right? Yeah, but see, I don't see that a pragmatist has to commit to that claim. Now, whether Rorty does or Davidson does in some works, I don't know. But I certainly don't think they have to commit to either of those universal

[00:16:13] claims about language or the intertranslatability of language in order to be a pragmatist. Neither of those things is essential to the view. Well, I think they are. We'll just take the second one. The reason why you can't have another frame of language that is

[00:16:31] So much more true, so much more in contact with reality than ours, so that we couldn't even understand it. I mean, the reason why you couldn't have aliens that stand in relation to us, the way we stand in relation to chickens is because

[00:16:44] the chicken pragmatists then look ridiculous. Right? If you have a bunch of chickens sitting around saying, well, truth is just a matter of what passes for true in our language game. Well, we have these super intelligent people walking around and eating

[00:16:59] chickens who know that's not true. Right. And that was the path Thomas Nagel took to attempting to deflate pragmatism in, I think, his book The Last Word. You just have to imagine sort of nested

[00:17:16] hierarchies of, you know, competence to make any local claim of pragmatism look ridiculous. But realism has no problem with that at all. Realism just says, yeah, there's all kinds of stuff we don't know and we'll never know and probably aren't competent to know.

[00:17:29] And yet some suitable mind could know those things. Pragmatism can't really deal with that. Yeah. Well, I think this is going to take us too far afield if I try to respond to that.

[00:17:39] I don't know, Dave, do you want, do you want to like continue this on pragmatism? Like, I imagine you're somewhat sympathetic to what Sam has to say. Yeah, I am sympathetic to what Sam has to say.

[00:17:48] I mean, I can't shake the ass-backward notion of what pragmatism does with science. You love the use of that term, and now I've used it three times. Yeah. And so it would be left to you to defend it, Tamler, because I think that it fails

[00:18:10] in some deep ways. It would require just challenging the... This is why I don't think it might be that useful. It would require me challenging like the assumptions that Sam is making about what pragmatism has to commit to.

[00:18:24] And even just the last thing you say, I mean, I can briefly respond to that. If pragmatism doesn't have a way to make chickens look ridiculous, like I think the pragmatist can say that's fine. Like... Well, so, but the transcendental perspective opens up when we become the chickens.

[00:18:41] Right? Yeah, I mean, it's easy to have a deflationary account, a pragmatic deflationary account of actual chickens. But if we're in the presence of something we don't understand that's obviously much more competent in all kinds of cognitive ways than we are,

[00:18:55] it just seems delusional for us to say that truth is just what gets justified in our discourse. Right? I mean, what a realist really wants to be able to say is that certain claims are true whether or not they can be justified. Right?

[00:19:13] Like there are true descriptions of reality that one could articulate, but they might just be true by accident. You just, you can't justify them, but that nevertheless could be true. And it's also possible for many or even all justified statements to be false. Right?

[00:19:30] So it's possible for everyone to be wrong about X. But that's exactly what Rorty would say: of course the realist wants to say that. Right. But that doesn't mean it's the thing to want.

[00:19:40] But another horizon of cognition, again, you know, a super intelligent alien race is a thing that opens that up where it just becomes very tortured to try to assimilate their competence into a pragmatic view of our own truth claims.

[00:19:56] Yeah, I mean, so it actually relates to what we're going to be talking about a little later. You know, it's funny, because I know Waking Up backwards and forwards at this point. I use it every day.

[00:20:10] I've, like, gone through a lot of the practices, and it's literally changed the way I view the world, like, how I perceive the world. Like, when I take a walk, it's totally different.

[00:20:21] I just kind of bought into the style of that kind of non-dual approach that you promote, and that you have, like, Adyashanti and Loch Kelly and, God, what's the name of the Buddhist nun that you have? There's two: Jayasara and Jitindriya. Yeah, they're both awesome. Yeah.

[00:20:40] And I actually thought in some ways that the world view that gets outlined in their meditations and also your conversations with them was pretty congenial to pragmatism, which is why I thought you might have softened on them.

[00:20:54] But I get what you say, where, like, there are times where you're experiencing something and it feels like, oh, OK, the veil is being lifted and I'm seeing things more clearly, I'm seeing things more accurately,

[00:21:08] not as obscured as I was seeing things before, including, like, the idea of self and the self behind the eyes and stuff like that. Forget about just more true. Certainly not. It's not merely linguistic, right? It's linguistic when you talk about it. In fact, definitely not linguistic.

[00:21:26] In fact, non-linguistic, I think, in a lot of ways, essentially non-linguistic. Where we might differ is whether pragmatism has to commit to any claim about language. But I think Rorty does, his pragmatism does. He really does.

[00:21:42] And I guess you could come at this from the opposite direction. I think I hit Rorty with this as well, which is that you can ask, well, what happens if pragmatism itself proves to be unpragmatic and realism is more pragmatic?

[00:21:58] And so then what happens to pragmatism there? Yeah. And in fact, this is actually interesting, because, as a point I make in my first book, The End of Faith: Sayyid Qutb, Osama bin Laden's favorite philosopher, I mean, he's like the father of modern jihadist thinking,

[00:22:19] he actually thought pragmatism would spell the end of Western civilization. He thought, OK, look at these morons who can't actually agree that anything they value or think is true. We're just going to roll over these people, right? Which is a pretty interesting intersection between my various concerns.

[00:22:40] But what if, in fact, it is true that pragmatism is unpragmatic in any way that it could be pragmatic, and realism proves to be much more, quote, pragmatic? Yeah. So that's something he actually addresses directly in the essay that we read.

[00:22:56] Oh, he does. OK. What he says is that probably for a time that was true, you know, that it was a more useful way of looking at the world, this realist way. But then he says, like, mid-century, something like that, it started to change

[00:23:13] and you started to see some downsides of having that view. And he kind of talks about Skinner in those terms, and just this idea that everything can be measured and quantified and everything has to be systematic, and that we're on this quest for knowledge and certainty.

[00:23:30] He thinks that had a destructive effect. Now, that's an empirical question, I guess. But I guess he would say if you're even asking that question, you bought into the pragmatist notion at that point. And if your realism is based on what you think is best to believe

[00:23:45] for human progress or peace or liberal values, you know, principles based on rights, any of that. If you're asking yourself that question, then you've already kind of implicitly accepted the pragmatist picture. Well, it's just another strand of concern. It is possible to care about both.

[00:24:04] But it just seems to me that if a pragmatist really wants to resist at that point and say that realism can't really be true because there is no really being true, I can't shake the fact that that certainly

[00:24:16] seems to be a covertly realistic claim about what knowledge is. Right. He's not saying, I'm pragmatically asserting that this is the best way to think about knowledge. He's saying, no, no, realism can't be true. Right.

[00:24:30] And that cuts deeper than just what passes for, you know, what gets enshrined as consensus for the purposes of any conversation. But he doesn't think it's consensus.

[00:24:42] He thinks, I think he has a more expressivist, emotivist kind of, you know, analogous to the expressivists and emotivists, like, saying that you think realism can't be true or that pragmatism is the best and most coherent worldview is just endorsing it, expressing, like, the strongest

[00:25:00] possible support for that belief. It's not saying that it has some special validity. And I get that this is a little annoying, and tortured, I think, was the word you used earlier, and it can start to seem tortured when we start to

[00:25:14] re-describe our language, you know, like the ways we talk about things like physics and chemistry and stuff like that, in pragmatic terms. But I think it's not essentially self-refuting. I think the worst thing

[00:25:32] you can say about it is that it starts to not fully make sense of how we talk about, especially, the hard sciences. Well, you should look at the use he makes of Davidson.

[00:25:44] And then you'll see what he... and I think this is in line with Davidson as well. It's just, these guys think it really is all language, right?

[00:25:56] And the idea that you could be in contact with anything in a non-conceptual, pre-linguistic way, directly, is just a non-starter for them. And I just know this having hit a brick wall with Rorty, you know,

[00:26:10] 100 times on that topic. He just had no idea what I could possibly be claiming there. Like, what? How are you going to know anything without language? I think it's fair to attack people who believe that, like, we're in contact

[00:26:22] with, you know, some noumenal realm, like reality as it is, like that we have any access to that directly. I think that is easy to argue against. Yeah. But fine, epistemologically, this is an issue. But to metaphysically kind of shave off reality as it is

[00:26:44] is where I started to think, well, not even Rorty can believe that, because you have to go through hoops to understand any sort of progress in science. Like, you can't describe scientific processes as just locally justifying statements.

[00:27:02] And that's what I was trying to get across with my example. The part that I bought into was if you're a realist about science, it doesn't mean that you need to bring that realism to all domains. So like I think moral realism is really, really hard to defend.

[00:27:19] And if and I think Rorty's critiques are successful, at least were successful for me in saying, you got to realize that sometimes when we say that things are true, we are using these local norms for justification. I think that's absolutely true.

[00:27:35] I just can't buy that there isn't an actual world that we have, through our senses, limited access to, that we can all agree on, and that, like, mathematics can describe to some degree. Or else we couldn't get to 2001: A Space Odyssey.

[00:27:51] Like, we just wouldn't be able to transition to a discussion of 2001: A Space Odyssey. Yeah. And the other thing is that I think that pragmatism sort of weirdly, ironically, doesn't allow for humility about your own claims

[00:28:10] to truth, because what it means is: the thing that is true is true given your sort of local norms of justification. Then when some other species that's smarter than you comes and says, hey, we actually now have a unified

[00:28:26] theory of gravity and quantum physics, you would say, well, you do? But, you know, that's not fair, because you're already saying anti-realism is plausible. There's a way to express humility in your moral beliefs without being a moral realist.

[00:28:48] There are certain moral positions that you take more strongly and you feel are better justified, you know, whatever the local frame of justification is. There are some claims that you feel like that about and some claims where you're really not sure.

[00:29:01] But why did you move to morality? What were we talking about? Well, because he was saying that there's no way to express intellectual humility when you're in scientific domains. Yeah, I absolutely think that's right.

[00:29:15] Like, if I say a moral claim is true because of my sort of local justification of it, it's easy for me to say yours is true as well. Like, you can't argue with me about that.

[00:29:29] So I also think that you lose humility. I just think that it's a clearly obvious mistake when it comes to physical description, where it's not so much the case when it comes to descriptions of, like, norms, say.

[00:29:43] I mean, the thing that he wants to disavow... So yeah, I realize how brilliant your transition was to the matter. But I'm going to pull you back for one more second. This is the thing that pragmatism can't adequately do, in my view,

[00:29:55] which is what realism is designed to do: to acknowledge that there are things we don't know that nevertheless exist. Right? And we could make claims about them that are true or false, whether or not we can adjudicate those claims. Right.

[00:30:09] So it can't be a matter of consensus. It's clearly a situation where everyone could be wrong, or everyone, you know, adequate to a real conversation might die in their sleep tonight, and then you're left with a bunch of dullards who just can't even talk about things.

[00:30:24] So a simple example. Like, okay, listen to me type here. I just typed what looks like about a 50-digit number ending in a seven. And so the question is: is that number prime? Right now, obviously, I don't know. You don't know.

[00:30:41] Probably none of us could figure it out on our own without the aid of some impressive technology at the moment. But there is a fact of the matter there. Right. And there is a fact of the matter there even if everyone who even

[00:30:54] understands that prime numbers exist dies off tonight, and then no one can even talk about this. No one even knows what we're talking about in this conversation. Right? And what Rorty was always forced to do at moments

[00:31:07] like that is to step back and say, well, of course, I'm going to agree that that way of talking seems to make sense. Right. I mean, it seems to be appropriate to say that. But of course, that's just a way of talking.

[00:31:21] But the thing is, it's not just a way of talking to someone who can actually look at this number and know whether or not it's prime. Right. And it's not just a way of talking to people

[00:31:31] who don't even know what we're talking about and can't actually talk this way. Right. Given that reality exceeds our grasp and is there to be discovered,

[00:31:43] you can't get away from the implications of how ludicrous it would be to have a population of pragmatists, you know, with a mental age of nine, talking about pragmatism, and to have the realists from, you know,

[00:31:56] Alpha Centauri roll in, even just talking about their version of pragmatism. It just wouldn't... you know, the pragmatists are still wrong to think that truth is just a matter of what passes muster in their conversation. That's my final, my final fillip. All right.
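Sam's prime-number example is a real one: whether a given 50-digit number is prime is a determinate fact that machines can settle in milliseconds, whatever any conversation concludes. A minimal Python sketch of the kind of "impressive technology" he alludes to, using the standard Miller-Rabin probabilistic primality test (just an illustration, not anything discussed on the show):

```python
import random

def is_probable_prime(n: int, rounds: int = 40) -> bool:
    """Miller-Rabin test: returns False if n is definitely composite,
    True if n is prime with overwhelming probability."""
    if n < 2:
        return False
    # Quick trial division by small primes.
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    # Write n - 1 as d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)  # fast modular exponentiation
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a is a witness that n is composite
    return True

# A known 39-digit Mersenne prime, checked near-instantly:
print(is_probable_prime(2**127 - 1))  # True
```

There really is a fact of the matter here, and the test finds it for numbers far beyond anything a human could check by hand.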

[00:32:13] I think I'm going to let you get the last word in on that. I have a feeling if I say something, that'll lead Dave to say something and you to say something, and we'd never get to mutually assured destruction.

[00:32:25] It's actually a hard thing to do, because if I say something and Tamler says, fine, I'm not going to talk anymore, then I feel like you're not respecting my argument. But then if you do argue, it will never finish. So it'll never end. Leave it to the pragmatist.

[00:32:39] This is... if you can make a pragmatic argument for us to just leave the matter there. We might come back to it. I mean, who knows, when we discuss, like, the next stage of human evolution, you know, through the stargate. But yeah, let's get to 2001.

[00:32:56] This episode of very bad wizards is sponsored once again by the I am bio podcast. Where do biotechnology, patients, and our planet all intersect? Find out by listening to the I am bio podcast. I am bio brings you powerful stories of biotechnology breakthroughs,

[00:33:12] the people they help and the global problems they solve. This fall I am bio dives into today's important issues. For instance, is the use of psychedelics to treat mental health promising or dangerous? How does overturning Roe v. Wade directly impact individuals who live with chronic illness?

[00:33:30] The podcast is hosted by Dr. Michelle McMurry-Heath, president and CEO of the Biotechnology Innovation Organization, a medical doctor and molecular immunologist by training. Dr. McMurry-Heath has spent her career helping patients benefit from cutting edge innovation.

[00:33:45] So subscribe to the I am bio podcast wherever you get your podcasts, if any of these topics sound interesting to you. Our thanks to the I am bio podcast for sponsoring this episode of very bad wizards.

[00:34:45] Welcome back to very bad wizards. This is the, we've learned, controversial time of the podcast. Yeah, some people don't like this. Yeah, some people right now are like slamming on their steering wheel and possibly getting into a car accident just because we

[00:35:04] are taking the time to thank all of our listeners and supporters, the people who get in touch with us and all the different ways that you do interact with us on Twitter, on Reddit, on Instagram, even on Facebook and who email us. We really appreciate all of that.

[00:35:25] And we don't mind spending a few minutes telling you that we appreciate it. That's right. And it's sincere. The only part that bothered me about that whole discussion was any thought that... Oh yeah, they said we sounded bored. Yeah. No, we're not bored. We're not bored.

[00:35:42] We're not bored with that. Like, a lot of the other stuff that we do, we're really bored with, just going through the motions. We're just usually tired because we always record this last. Yeah, we do. That's right.

[00:35:53] But we take the time to do it because we really genuinely do appreciate it. So if you want to email us, we read all the emails, verybadwizards@gmail.com. You can tweet at us, @peez for David, @tamler for me, at

[00:36:08] @verybadwizards for Dave... I'm sorry, for both of us. And you can subscribe to our lively Reddit community, where they will discuss things like whether we should keep doing this segment. I guess that's it. Right. Yeah. Follow us on Instagram, like us on Facebook.

[00:36:29] Oh, and give us a five star rating on Apple podcasts. We really appreciate that. And we just had a couple of people saying that we broke them finally. They did it. And so like that's justification enough. Right. Yeah.

[00:36:45] One five star review is worth all the pain and suffering. And if you want to support us in more tangible ways, we very much appreciate that. And like we genuinely appreciate it. It means a lot to us.

[00:36:58] You can go to our support page on verybadwizards.com and see the ways you can support us. You can donate one time or recurring on PayPal. You can buy some swag, T-shirts, mugs, and you can become one of our Patreon supporters, who

[00:37:14] we've been just doing a lot for, I think. And there's more. There's even more coming. Oh, there's so much coming. Yeah. I know. So at a dollar and up, you get all of our episodes ad free. You get my collection of beats.

[00:37:29] Two dollars and up, you get all of our bonus content, at least so far, including the Deadwood series, the ambulators that we've been having a lot of fun doing. And coming up, a discussion that you had with Sam Harris. Yeah.

[00:37:43] And you also get to listen to the Ask Us Anything audio that we'll release. At five dollars and up, you get to do what we're just going to talk about right now, which is vote on an episode topic. And we just closed the poll. What won, Tamler? Stoicism.

[00:38:02] I feel nothing about that. Yeah. Let's get it. That's, you know, like that's good. You're on your way. So thank you for voting on that. You also get, if you're five dollars and up, you get access to our Brothers Karamazov series.

[00:38:17] You get video lectures from Tamler and from me. And at ten dollars and up, you get to ask us questions on the Ask Us Anything. And we've been answering them all. We just released one. Well, we didn't answer one question. No, but we addressed it.

[00:38:33] That was after like my brain was broken for the last one, like, because I had recorded that Q&A with Sam Harris. And then like, I just like went right to that. And yeah, at one point, I just short circuited, like, like slow smoke is coming

[00:38:48] out of my head. And we don't edit those. So I feel like you should just become a ten dollar and up patron. Even if you have no intention of ever becoming a ten dollar and up patron, just do it to watch the video.

[00:39:01] You just want to see me malfunction live, you know, like you can see that. It was, you know, in your defense, it was hilarious for me. I feel so much better. So thank you, everybody, for your support. We really appreciate it. Thank you.

[00:39:17] Let's get to the main topic today, which is 2001: A Space Odyssey by Stanley Kubrick. This is what, our third Kubrick movie, Dave? We did A Clockwork Orange, Eyes Wide Shut. Yeah, I think just those. Yeah. So this is obviously a groundbreaking movie in so many different ways.

[00:39:40] It came out in 1968, so before we actually landed on the moon. Filming started, and a lot of the space stuff was filmed, in '65 and '66. He filmed the Dawn of Man with the apes last; that was in '67. The screenplay is by Stanley

[00:40:01] Kubrick and Arthur C. Clarke, the science fiction author. And it was inspired by a story of Clarke's called "The Sentinel," but that's not very much like the movie; it has certain elements of the movie. Then Arthur C. Clarke wrote a novelization

[00:40:16] of the movie concurrently while they were filming, and he has his own take in the novel that is definitely not necessarily Kubrick's. I don't think Clarke really fully knew Kubrick's own interpretation. And he fills in a lot of the details

[00:40:33] that I think are not evident in the film. Last thing I want to say just to introduce it: this comes four years after Dr. Strangelove, which was 1964. And that was a big hit and gave him a little credibility, you know, along with

[00:40:49] Spartacus. And it comes a couple of years before A Clockwork Orange. Very different movies, all three. Yeah. I mean, all great, but so, so different. It's amazing. Critics were mixed about this movie, at least at first, but a lot of them came around in the end.

[00:41:04] Let's give some general thoughts. What did you think of the movie, Sam, on this rewatch? Yeah. What's your impression of it? Yeah. I had seen it at least twice before, but it had been a while. It's not as boring as I remembered.

[00:41:20] So I had a, you know, going into this, I thought, OK, I remember this being pretty slow going and pretty enigmatic. But I found it much more suspenseful even than I recalled at all. I'm not quite sure why I never got it the first or second time around.

[00:41:38] I mean, the first time had to have been when I was, you know, probably 14 years old. Yeah. I mean, it's obviously a beautiful, stylish movie. I'm not so familiar with other space movies of the time. I actually think Barbarella came

[00:41:53] out the same year, in '68, too. But it does seem like maybe the first space movie to take space seriously, in a plausibly scientific sense. And it's pretty interesting for that. I mean, for me, apart from any of the

[00:42:11] issues we might talk about that it raises, just as a watching experience, one of the most satisfying things for me is just the way in which it's a picture of a future, at least stylistically, that never happened and never was going to happen.

[00:42:27] I don't know if we have a word for this where science fiction gets the future wrong. It's like retro futurism you experience when you watch something like this. And so, you know, you look at the computers and everything is enormous and they've got buttons and switches

[00:42:42] that you would never have now. It's like it just it reveals that it's actually impossible to adequately depict future technology because you can't actually build future technology. Right? Like there's no way for them to have depicted an iPhone really because an iPhone just looks

[00:43:00] way more advanced than anything they had. But they do depict like an iPad. Yeah, they got iPads. Yeah. But I mean, when you look at the screens, just what the screens look like graphically, everything is wrong and primitive,

[00:43:14] even while, in context, it has to be still in our future in some sense, because they've got superintelligent AI. That's kind of a hot take on this movie, actually, because I think a lot of people are more struck by

[00:43:24] how much it got right, and kind of think, well, of course it's not going to get a lot of the details in terms of how things look, but it got some of the bigger things right. I think the thing it gets wrong especially is more like the

[00:43:55] attitude that we would have in 2001 about space; like, it's just like we forgot about it and stopped caring about it. That's clearly not the case in this movie. They're still making these incredible discoveries and advances. And they're flying Pan Am.

[00:43:55] Pan Am is the way we get to space. That's pretty great. I love that. And there's a Howard Johnson at the at the at the base. You know, he also face times his daughter, which is something got right, which was Kubrick's daughter. Yeah, right. It was. That's right.

[00:44:11] The most boring father conversation ever filmed. Do you think by the end all he wants to do is give her a firm handshake? You know, I love this movie. It is it is in my top five movies of all time for sure. And I'm sure it competes

[00:44:31] probably with the number one spot at various times in my life. I think that it is it scratches so many itches in me. The special effects that still still hold up the love of directors who are obsessed with detail. Yeah, this the sci fi elements, the depiction of

[00:44:51] of the AI in that creative way, minimalist design. Beautiful music and a patience. So Timbler, I have talked about about this in directors. And, you know, Sam, you said, you know, when you watch it early on, it is hard. I think it took me a lot of time

[00:45:13] to decouple the concept of slow and the concept of board or at least the psychological states that now I no longer associate slow with being bored. I'm very entertained at the the time that he takes to tell this story, to tell these stories,

[00:45:34] to tell to depict, you know, whatever. I'm not quite sure it's a story. And the last thing I'll say is something that I mentioned Tamal and I were talking a little bit about offline, which is even though the details of what's going on are so sparse,

[00:45:50] he's so stingy with letting us know what is really going on. In fact, I was I was watching a video essay about this and the shot ratio was for every 200 shots, one was included. And a lot has been saying and a lot of the shots that he

[00:46:28] removed were shots that would fill in a lot of detail. And I think it's what he doesn't tell us that makes this such a compelling movie, because he leaves room for us to wonder and to entertain theories and to be

[00:46:28] to be moved in a way that we might not be if you told us exactly what was going on. Yeah, he gives us these like mythic markers, you know, that we can work with. So it's not like he's giving us a blank slate to just project

[00:46:42] our own like obsessions onto. There are these kind of mythic markers that we'll talk about, but there's so much room and flexibility within all that to come up with... Like, I came up with something this time that is completely different from

[00:46:57] anything I thought about the movie the last time I saw it. Yeah, and I think it's a total masterpiece too. And also it's, you know, probably on one of my top ten movie lists, which it definitely wasn't for a lot of my life.

[00:47:13] I always kind of admired it. And then Christopher Nolan did this print of it in the theater, to see it how it was originally meant to be seen, in this kind of panorama vision on this curved screen. And I remember seeing that

[00:47:30] in a movie theater in New York and just getting blown away by that. I talked about this in the podcast, I think at the time and then have just seen it a couple times in the theater since and just watched it like twice in preparation for this.

[00:47:42] And like, you know, like you, Sam, I kind of remembered loving it, but thinking it was a bit of a slog at times earlier. And now, these last couple of times, everything is gripping. Like I feel like every minute is gripping. I wouldn't change a thing.

[00:47:59] I wouldn't cut anything. I just find it totally compelling to watch. And that I think really has to come with a rewatch, rather than watching it for the first time, because then you're like, when is something going to happen, you know? Like what's going on?

[00:48:14] What the fuck is this? It's also just super reserved. I mean, so much of what I like about films are, you know, big performances by great actors. And the actors here are so restrained. You know, like they're so restrained that I'm not sure they ever did anything else

[00:48:30] in movies that I've ever seen. Right? Like, you know, not only did they not get typecast, they just didn't get cast. Because they didn't have to. Yeah, exactly. But they're so good. Right. I mean, it's just...

[00:48:43] it's like each one of those astronauts is like 60s man. Very 60s man. Yeah, exactly. Right. It really just becomes a very cool, you know, almost meditation on the past and a vision of the future.

[00:49:04] And in that sense, it's unlike most movies that I would find fun to watch and rewatch, you know. And then the music is such a strong element too. Yeah.

[00:49:19] Anyway, yeah, I was really I was happy to have an excuse to give over an evening to it again, because I'm not sure when if ever I was going to do that. The music and just the sound in general. Oh, yeah, the sound design is amazing.

[00:49:33] I actually made a note to say that. But yeah, just all those sequences where most of what you hear is Dave's breathing. Yeah. I mean, that really is pretty brilliant. And you know, the discipline to

[00:49:51] depict space as silent is just something that many sci-fi directors don't do, because space obviously is silent. It plays so well in one of the shots, when you only start hearing sound when the air comes rushing back into the capsule. Yes, exactly. Once he closes it, before that.

[00:50:08] Yeah. And also just when Frank I mean, we're jumping ahead, but when Frank has been disconnected from his oxygen tube and you just see him silently kind of floating makes my ass. For yeah, like trying to plug back in. But he can't and then slowly

[00:50:24] kind of dying as you're watching it. It's so good. It's not a funny movie. Like I was thinking on this last rewatch, I watched it again today, there's one joke in the movie, I think, like one kind of intentional joke.

[00:50:37] There's a lot of things that can be funny, but there's one intentional one, which is the zero gravity toilet instructions, like these long instructions for a zero gravity toilet. If you look at him, he's like staring at them, like trying to read them

[00:50:55] like as if you just know that he has to take a dump like really bad. And he's like, oh, God. And you can see how that might be a real problem there. You know, and I think the text you can see is something like it is, you know,

[00:51:07] it is highly encouraged that you read the full instructions. Those instructions are posted online somewhere; you know, of course, he created the real instructions. Yeah. Did you read at all that he had hired a composer who created an entire score? That must suck.

[00:51:26] But he used his temp score and the composer didn't realize this until he was in the movie theater watching the premiere. Which one? He probably brought like his fiancee or something. It's like, wait till you see this I'm in the new Kubrick movie. And it's.

[00:51:42] Well, so one of the ways that it starts is music and just a blank screen for like a long period of time like three minutes or something like that. This kind of dissonant music. And already he's giving us kind of like like he's unsettling us.

[00:51:58] We have this sense of unease. He did say, in a kind of famous Playboy interview, he talks about this, that he wanted to reach people on a subconscious level and move us in ways we can't verbalize. This actually relates to what we were talking about

[00:52:12] in the opening. It starts like that, you know, like this is just a blank screen that looks like a monolith rotated 90 degrees. And that's what we're watching for the first three minutes of the movie. There's something very cool about that.

[00:52:24] Then you get the sun rising over the moon and that kind of perfect Kubrick symmetry. But we are what, three million years ago? How many million years ago? Four million. Four million years ago. We're on the African Savannah. You know, the score comes in.

[00:52:40] It's these two ape tribes, a bunch of tapirs, and a leopard. That's the cast of characters for the first 20, 25 minutes of the movie. No language, no nothing. Every time I watch this, I'm blown away at the lighting and the colors

[00:52:57] and they're using matte paintings for the background, and this is all in a studio set. Yeah, that's crazy. It's insane how good he made it look. He sent out folks, like a crew, to Africa to get stills and then projected the stills onto the set.

[00:53:13] So there's actually no full landscape shot, even some of the establishing shots are not full landscape. I was wondering that, I don't know because... That's my understanding. I'm not 100% sure about that. The apes, now, they are for the most part, I think, humans in ape suits.

[00:53:26] I think there are times where you see actual apes. But it's so... The babies are chimpanzees, or appear to be chimpanzees, but the grownups are people in costume. Yeah, I don't know if you caught this in what you read, but they spent weeks watching ape behavior

[00:53:45] in the London Zoo and months of rehearsal in mimicking their movements and they used like rare footage of gorillas in their natural habitat. Of course they did. Yeah. And they're mimes and dancers, I think, in those suits. Yeah, yeah.

[00:54:03] And it's so good, like you buy it almost immediately. And plus they don't have to be like recognizable apes because this is four million years ago. You have this one tribe of apes and they're kind of like looking for food and water.

[00:54:18] And then this other tribe of apes comes and kind of kicks them out of this prime spot near the water, then they go... It's so good. One of the things that I was noticing on this watch was initially when they're in their little waterhole

[00:54:33] close to it, they're foraging for food and they're literally competing with the pigs. You know? Like they are really... This is humanity, precursor to humanity but we're still very much beasts of the wild. Like we're competing with pigs to get the food. The as yet unmolested tapirs.

[00:54:52] Yeah, exactly. Unmolested tapirs. What's interesting is then, when they later realize that they can kill the tapirs, the tapirs haven't adjusted yet. And so they're like eating one in front of a couple other ones that are grazing like they always do.

[00:55:07] So like the music, the famous Thus Spake... Zarathustra. Zarathustra. Zarathustra. The music that started the movie with the sunrise now comes in. You see the sunrise, but not over the moon this time but over the monolith, and all the apes are going crazy,

[00:55:28] kind of looking at what the hell is this. I don't know how to describe the sound that accompanies the monolith but it sounds like voices kind of trapped in it or moaning or something like that. It's interesting because later on, the monolith will cause a sound

[00:55:45] and I wonder if this is just sort of as the apes hear it. The way they react to it, they're scared of it and part of that might be the sound but also like there's no right angles in nature or it's nothing like...

[00:55:58] So they've never seen anything like this before. That's like a terrifying thing to just all of a sudden see these angles that you've never seen. Well, Dave, you used the term minimalism earlier and that's what I thought on this rewatch where there were a couple of choices

[00:56:15] that amounted to really brilliant uses of minimalism, and the monolith was one, because... I didn't do a ton of reading in prep for this, but I read a couple of things I could find, and it seems that Kubrick spent a fair amount of time

[00:56:33] trying to depict the aliens. And he was going to create a race of aliens based on, I think, Giacometti sculptures. They're like these very thin, angular people, and I just think, given the lack of special effects at the time, that could have been a disaster.

[00:56:56] They could have looked like the Sleestaks from Land of the Lost or whatever. Right, could have been really bad. Like the Gorn from Star Trek, the original series, like that. But it's not, I think, the choice isn't motivated

[00:57:07] primarily by, like, if only we had better special effects, then we could do this. Well, maybe not. Yeah, I think it really is like it's better not to see it or even know... No, I would totally agree. I mean, I think the result is better

[00:57:20] no matter what they could have done and... I do love that this displays this moment where they see the monolith and they're all scared. You get the fear, but you also see the curiosity in them that I think is starting to distinguish humanity

[00:57:37] from the other animals. The one that gets brave enough, Moon-Watcher I think is what he's called, he's brave enough to actually approach it and touch it, because this doesn't belong; other animals would just flee, right? They would just not touch it.

[00:57:54] And you see the brave one, like, he took the risk. And then they all do, after he takes the risk. Did you find... This just occurred to me now, that sequence performed a kind of hacky psychological experiment on us

[00:58:09] because I found a fairly reflexive and ridiculous solidarity with the first troop over the second group. Yeah. And they're identical, absolutely identical troops. But like when the first troop gets driven off, I'm thinking, those fuckers. That's bullshit, those fucking bullies.

[00:58:30] And then when they come back and just kill them with the tools, it's some kind of rough justice. Yeah, I tweeted it out earlier today. It sucks to be the tribe that didn't get the monolith, because it really is unfair.

[00:58:47] It's not their fault that they didn't see this thing that, like, you know, all of a sudden gives them an idea that nobody had ever conceived of until then. Yeah, this is the realists versus the pragmatists, I think, that one. It's not a fair fight.

[00:59:03] You know what it reminded me of? It was like some of the Jared Diamond stuff about how contingent it is, like what societies get more civilized or more wealthy or whatever. It just depended on them happening to sleep in a place where it landed.

[00:59:20] Are we supposed to believe that the monolith did something to the one that unlocked? I think so, yeah. Some kind of inspiration. Something, yeah, because we watch and I love this scene of him. It's like a dawning on him. So it starts with the sunrise.

[00:59:37] Oh, it dawned on him, Tamler. It dawned on him, right? Like it dawns on him as he's looking at these bones of these long dead animals. Holy shit. And then you see him lift up the bone, and then you see him smashing other bones

[00:59:52] and then you see, and I take it this is in his mind, like tapirs falling down. You know? The other motherfuckers, like from that other tribe. But we don't see that. We just see the tapirs at that point and it's all in slow motion

[01:00:06] and the music is swelling and then it cuts to them just eating one of the tapirs, I think. And like, you know, I was wondering like, is this the first time they've eaten meat? That's what I got somehow, but it doesn't seem somehow naturalistic.

[01:00:21] I think that maybe the idea is that they scavenged, you know, so they might eat dead animals that they found but that they hadn't thought to actually kill them. They hadn't gotten that fresh stuff. Because when they get the fresh stuff, they really are all in for it.

[01:00:38] Right. And then. Undivided attention. There was something just deeply sad to me about, you know, it's like when that achievement was unlocked, they became something different and that different thing is at the heart of so much ugliness. So it's sad. The loss of innocence is sad. Yeah.

[01:00:57] And in fact, you know, like, I think in some ways it's about trying to regain that innocence. But yes, at that point it's violent, and when they go and beat up the bully tribe, you know, with their new bones, or new weapons,

[01:01:12] and then you have the throwing up the bone and it turning into a satellite in space, you get the sense that it's the use of weapons, the discovery of the idea of weapons that led to like all technology to that date at that point.

[01:01:28] Like that's what I take the implication of the cut to be. In the screenplay actually, it's described that those satellites are nuclear weapons. Yeah. Right. It doesn't say. Yeah. Which is... guys, supposedly there was a voiceover.

[01:01:44] Yeah, there was supposed to be a voiceover. An intended voiceover. Yeah. Can you imagine how different that would have been? If you just had a voiceover and some little green men, we would not be talking about this. It would be like those,

[01:01:57] like when you watch trailers of very good movies that are old, but like the trailers are narrated by someone. Yeah, trailers. Before we get to the space age, the body language of the apes that have the tools, they're so confident.

[01:02:11] You see them just so confident when they're walking around with those bones in their hand. Like they've changed. They're even just more upright. Yeah. Basically it's just full stature. That's probably what made me...

[01:02:23] Once he has the femur of realism in hand, he's like... Yeah. We can call that one hapless ape who gets pummeled by the entire tribe the Rorty. You're still doing it. You're still stalking him. Let the man rest in peace. After this, yes, rest in peace.

[01:02:43] I think that was the most famous transition, like cut of that sort, you know? Right. Because it's like that's what cinema can do that no other art form can do is a cut like that. Just like, it just signals four million years into the future, like totally non-verbally

[01:03:00] and like so like elegantly and seamlessly. But you know, it's not fully seamless. No, no, it's not. Because it's not, yeah, like there is a little bit of a jarring element to it. That had to be on purpose. That had to be on purpose.

[01:03:14] Actually, there's a bunch of cuts. So the cuts to the falling tapirs too are very Kubrick-esque. What's the adjective? Kubrick. Yeah, Kubrickian cuts because they're, I just, I was thinking of analogous cuts in Clockwork Orange that he uses. Yes, exactly.

[01:03:35] And then there's cuts that actually surprised me more, which seemed kind of discordant. We'll get there, but when Dave is going through the time tunnel, or whatever that is, there are cuts to stills of his face, looking fairly wigged out,

[01:03:53] which he also did in Clockwork Orange with, I think when Malcolm McDowell was getting deconditioned or whatever. Yeah, the same expression with the eyes of like. Yeah, it is kind of, it is jarring because they're stills. This transition now when we go into the flight

[01:04:11] and the flight attendant is walking... On Velcro. Yeah, on Velcro. It never fails to... it's about the happiest moment for me in the whole movie, maybe just because I was a child in the 80s, when like space exploration was... I just loved that shit, right?

[01:04:31] Space shuttle until the Challenger. I had so much optimism and that cut now where we're just like, oh yeah, we're just spacefaring now is, and this is a weightless environment. And then the ballet of that whole scene in the floating pen is just so amazing.

[01:04:50] Yeah, it's like the Blue Danube waltz. So it's like this perfect, elegant, civilized kind of symbol. By the way, the flight attendant, I heard a hilarious story. So, you know, Kubrick was looking at hundreds of women. Do you know this story?

[01:05:06] No, but this is a time where flight attendants had to be hot. Yeah, that's true. That's something Kubrick didn't get right about the future. So he's looking at hundreds if not thousands of women for these roles. And this woman, you know, I think she might've been a model or something.

[01:05:22] She went sort of on a lark because everybody was going, but she fucked up, because she had just gotten a dental procedure and was heavily on painkillers. And so when it came time for her to do it, he's like, okay, walk from here to here.

[01:05:38] She was actually kind of off balance. And he thought that was perfect, because all the other models were walking like, you know, straight, like models do, and he loved it. And so it was like she was walking in zero gravity. She was floating.

[01:05:55] He was very much someone who would use a happy accident. So I have a question: when we're in the now, the space age, and the stewardesses, are we still in the Dawn of Man? Because there's been no title card switch.

[01:06:09] Like there will be with "Jupiter and Beyond the Infinite." There will be one other title card later, but it just says 18 months later; it's more informational than it is like a different chapter. Like, is it really just two chapters? Once you found weapons,

[01:06:26] it was inevitable that you would have all of this stuff. No leaps required really until you get to the Stargate. That's just all kind of a necessary result of the last leap. Yeah, I didn't notice that it wasn't chapterized there. I like that though.

[01:06:42] I mean, it seems like this is stage one, right? Stage one. And in fact, I think I remember hearing either Arthur C. Clarke or somebody who knew Arthur C. Clarke talking about how extraterrestrial life must have been around for so many thousands upon thousands, if not millions, of years

[01:07:02] that to them the amount of time we've been on this planet is really minimal. So it could be that that's, he's just like, yeah, that's the weapons phase of humanity. We do learn that the monolith that they discover on the moon was planted at the same time

[01:07:17] that the monolith appeared on Earth for the apes. Exactly, right. Yeah, so maybe we are in the same chapter just like we're at the beginning of the chapter when he throws the bone and then the bulk of the movie is really by the end of that chapter.

[01:07:31] Yeah, that's interesting. We can't go through them all because, like, we'd all be bored, but they have the most bland, kind of insipid conversations. You know, Heywood, it's interesting when he talks to his daughter cause you hear the monolith a little, kind of in the background,

[01:07:45] you hear that whoa, kind of thing. But otherwise it is a really boring conversation. Yeah, it's very Kubrick, all that stuff. Cause you know, in Eyes Wide Shut and all these other movies he has the same stilted conversations that are getting at something

[01:08:02] other than what is being said. And you know, the very first line of dialogue is when the receptionist greets him when he gets to the base, and you do get the sense that language here is merely another prop for the story being told.

[01:08:24] Like I gotta get them, you know? Let's have him, talk about 60s man. That guy just exudes, like, the two-martini lunch. Yeah, kind of complacency and just, like, banality. Actually the great conversation, which is, again, I don't know why,

[01:08:43] this wasn't occurring to me at the time but now the linkage between Clockwork Orange and 2001, I can't shake it. But the conversation that was pretty weird and fun to watch was when the other travelers, he sits down with two women and a man and in that lounge area

[01:09:03] and they're asking him about the rumored pandemic up at the base and he switches into, you know, sorry, I can't comment on that but like he sort of drops his social face and becomes kind of steely and I can't comment on that.

[01:09:19] But the sort of unctuousness of the guy who's trying to wheedle the information out of him is right out of Clockwork Orange. Yeah, unctuous is the perfect way of describing that guy. And they're Russian so I think, you know, part of what we are supposed to believe

[01:09:34] is that there is some, this movie is weirdly optimistic and pessimistic at the same time. Like, you know, we clearly have friendly relations with Russia and to put that in the movie in 1968, it was sort of risky. And so I think we're supposed to be in a future

[01:09:54] where we don't have those... Oh, interesting. Yeah, that was lost on me. But at the same time, we're still keeping secrets. Like the funny thing about his blandness and kind of vacancy is that he's essentially like running a conspiracy, disseminating a conspiracy.

[01:10:10] So there's something also very pessimistic about that. He's being very polite. But like you said, Sam, when they start pressing too much, he gets almost scary. Like he just like, I'm not at liberty to say. And then he just kind of has this vacant look in his eyes

[01:10:25] but you can tell they kind of know not to press it too much after. He's almost like the guy, I forgot who it was, the predecessor of Jack Nicholson in The Shining, you know, when he's in the bathroom and he's giving that speech.

[01:10:39] And so I dealt with them too. Yeah, right. A very naughty boy, if I may be so bold. And then the woman says, like, oh maybe it's time for that drink after all. And he just goes right back into his, like, 60s man. Bring that darling daughter in.

[01:10:57] He's like, well, that depends on the school vacations. Say hi to your spouse for me. Like they know each other. After this little, this mini showdown. What did you make of the... So they're still on the flight I think to the moon. And you got those...

[01:11:15] First of all, these Pan Am flights are not very crowded. There's not that many people. No, right. Just space, which is kind of interesting. But they're watching like judo or jiu-jitsu or something on television? I was gonna say karate or judo

[01:11:30] but I think you know something about martial arts. Was it judo? I forgot whether it was judo or jiu-jitsu. It's so weird. I'm delighted it's just in there and they're watching. Yeah. But what's great is it then cuts.

[01:11:45] This is one of those great cuts back to the Blue Danube waltz and the glory of space like the, you know, you see the ship, you see all the stars and it's like, it's a nice juxtaposition of the just boring bullshit

[01:12:01] of the way humans interact with each other versus like what's surrounding them. That none of them ever seemed to be like, holy shit, this is so incredible. This is magnificent. But we see that, like that contrast, right? And we get it with the, it's underscored by the music

[01:12:19] and the way that gravity has shifted, like gravity as we understand it. Well, that's where these rotating shots start, on this first flight where the stewardess goes upside down and it's incredibly cool. It's all happening in camera. I mean, it's in camera.

[01:12:34] I don't know how they did that. Blew my mind the first time I saw it, blows my mind every time I watch it. Have you looked it up, like a how-did-they-do-that video? I feel like you know it. Yeah, for sure. Yeah, I mean...

[01:12:45] I saw a picture of, I didn't see a video, but I just saw a picture of a gigantic rotating set that they built for the, not for that Pan Am flight, but for this station where Hal is. Yeah, yeah. He built a huge set like that

[01:12:59] and it was rotating and they were, you know... Because that camera work is insane. The camera work where he's jogging and then joins his colleague down at the bottom. And then... It's insane. They had like a slit in the floor for the camera lens to go through.

[01:13:13] They had, you know, there was one scene where he actually was literally upside down and was supposed to eat the food. And when he, the first time he did it, you know, he dips into the food and it just falls. So they had a like,

[01:13:27] actually one of them called their mother, maybe Kubrick, I don't know who, who told him like, put some more gelatin in the food and it'll be stickier. Yeah. So then they arrive at a space station. Clavius. Is Clavius the site or the station?

[01:13:46] I think Clavius is the station that they then go to the moon for but because we don't yet have, we have the scene where he's giving like a little speech to a group of people. One of the things, I also didn't do a ton of outside research,

[01:14:03] but I did a little and something somebody pointed out was just that this movie is full of rectangles. Like the screen is constantly full of all different rectangles of all different kinds. And once I was alerted to that,

[01:14:20] when you watch the movie, it's like there's 50 rectangles in every shot. But one of the interesting ones is this meeting room where he's giving the speech, telling them, you know, again, giving kind of like a chilling but very boring speech about how

[01:14:35] we are requiring absolute secrecy. We're disseminating a lie, disseminating propaganda about what we've actually found, because we need to be completely secret about this. But that room is just so chock-full of rectangles. Am I right that it's like a low ceiling, kind of tiled lights? Yes.

[01:14:57] And then, once that comes into your mind, you see that, you know, all tiles are rectangles. It's like the echoes of the monolith are everywhere in this section of the movie. Do you have anything to say about that?

[01:15:13] Like that scene where he gives the talk? You know, he reports on what the Council wants, and we don't know what that is, the Council, you know, like what everybody has to subscribe to. I thought it was a surprisingly humanizing talk because of just the way he goes through,

[01:15:30] like he anticipates the misgivings of everyone there who had to lie to everyone they knew. And he said, you know, to tell you the truth, I wasn't comfortable with it either. And so I just thought it was,

[01:15:39] it was a, he kind of sort of let his hair down more than you expect in that scene. And then basically no one has anything to say. I mean, it's just, you know, it's just kind of interesting. It's like, why is like a meeting

[01:15:49] that didn't have to happen really? Well, so I think he signals, like I don't want any more questions about this. He does exactly what you're saying. He's like, I get it. I'm with you. I think this is shady, but, you know, this is what the council wants.

[01:16:02] He gets one question and then he says like, and also I'll be needing formal security oaths by everyone who has knowledge of this. And then he like cuts and he's like, are there any more questions? And that's where everyone shuts up. But I don't know.

[01:16:14] Like that's how I looked at it this time. I also had a sense of the vibe that you're talking about, that they're clearly uncomfortable and he's trying to make them feel as comfortable as possible. Yeah. And what else are they gonna do?

[01:16:23] You know, like they have uncovered some crazy shit. Like I'm totally sympathetic to the need to keep it under wraps for now. And they don't, they have no personalities. Yeah. The only thing I have to say about this is I love the reporter or something,

[01:16:41] the photographer at the beginning who's just walking around the room. Oh yeah, yeah. That camera, that. And that's like, check out that photographer, because he's the one who, it seems like, sets off the signal when they get to the moon. Right, right, right.

[01:16:57] I love that when it approaches the moon and, like, the landing base, which is a rectangle, I'll stop saying that, but like, and they're looking out of a rectangle window to see the ship landing. The music now is just the Requiem by Ligeti, this Hungarian composer.

[01:17:14] And it's like so unsettling. Like we've gone from the Blue Danube waltz all of a sudden to this very modern, dissonant kind of music, and from then on until they get to the monolith it has this feeling of dread, or at least deep unease.

[01:17:33] Yeah, I mean, the use of music to direct our emotions is incredible. It's incredible. I can't not talk about how incredible also that shot of the pod landing is. It's not a rectangle, it's a globe, a little globe thing.

[01:17:51] And these triangles open up and these feet come out and just, you know, they're all working with these models. It lands onto a rectangle, the claw opens up on the rectangle. You're gonna start seeing rectangles everywhere in real life too. I did go into my kitchen.

[01:18:05] I was like, I have a lot of rectangles in the kitchen too. This is fucked up. Oh my God. It's just like the, you see like the burn marks on these models, like the amount of work that they put in to make. It has weight when it lands.

[01:18:23] Yeah, it has weight. It's so weird to see, like, yeah, it's incredible. They have a very funny conversation. He's now with these two other crew members. And he's like, great talk, great speech. Yeah, it was deliberately buried. That was pretty cool.

[01:18:38] All we know about it is it was deliberately buried four million years ago. And he gives this kind of look like, huh, I'll be darned. You know, it's like, I wonder what that's about. I also love that their first inclination is to get together for a group picture

[01:18:49] because of course, you know, you cannot get humans together in the age of cameras and not have them all want a group picture. Yeah, talk about something it gets right about the future. Yeah, when you look at those guys, it's interesting. There's something so insipid

[01:19:05] about all of the human interactions and all of the dialogue that these people seem totally inadequate to the moment of making contact with an extraterrestrial superintelligence. You know, it's just like, I mean, we're barely better than the apes. Right. Here's a question.

[01:19:23] I think a big question in the movie is what does this contact with the monolith do exactly? Because then after they, you have that sound which is so jarring when you're watching it. And even if you know it's coming, like it's still jarring.

[01:19:40] And then it cuts to, like, the Jupiter mission 18 months later. It's not totally clear what advance has been made by the contact with this, with the monolith here. It's like, you might think it was artificial intelligence, but like you've learned that that had already been underway.

[01:20:01] Certainly didn't just get discovered in the last 18 months. So... No, my thought was that, well, actually I had a question along those lines: what is the connection between the appearance of the monolith, which is of extraterrestrial origin, and HAL, the superintelligence, going haywire?

[01:20:20] So maybe that's the, maybe it's not progress. Maybe it's a new kind of intentionality coming from the computers at this point. Which is not progress from our point of view. That's interesting. I don't know if I read this or heard it

[01:20:40] or what, but the monolith is sending out a radio signal at that time and... To Jupiter. Yeah, and so they make the connection that there's another monolith at Jupiter. And I just assumed that the 18 months later is same old, same old, we gotta get to Jupiter

[01:20:59] and see, like, to try to further unlock this mystery. So here, let me float something. It seems like what the monolith does is inspire this need for secrecy. And you have these existing AIs that are, they're probably sentient or at least conscious,

[01:21:17] and this stuff has already happened. This is still a result of the weapons. But what the monolith does is reveal to the AI that deception is possible. Because Hal is very concerned about the fact that this cover story and we later find out

[01:21:32] he knows that it's a cover story. And Hal knows the real reason why they're on the ship that maybe the leap here is the leap of an AI realizing it can not just issue out algorithmic certainties but it can actually deceive others.

[01:21:51] So I have to admit, I've never given that much thought to what the aliens are. I've sort of always just lived open-ended in my mind. And I think that they never quite decided, we talked a little bit before about how they might have had

[01:22:07] aliens in the movie and wisely decided not to. But here's an idea: the monoliths themselves, and the alien civilization, is a race of AIs. I don't know if that's what you were implying at all, Tamler? No, explain. And so, consistent with what you're saying.

[01:22:29] So imagine that what they're, they know that organic life is necessary to create artificial intelligence. So they help along human evolution so that they can get to the point that they will create an AI and that's the race that they want to meet.

[01:22:45] So when Hal maybe gets the signals from the monolith Hal realizes first contact is between me and these guys. Oh yeah. So I got to eliminate all the little carbon-based things that are on this mission with me. Now I know the true reason for this mission. Right?

[01:23:07] Then I think the final sequence though becomes even more inscrutable. It does, it does. And then what's the whole thing? So we'll get there, but like why keep him in that condition and in that fairly benign albeit claustrophobic condition and then have the whole baby be the...

[01:23:26] I would even be, like it would be fine I think up until space baby. Yeah. I had a similar thought that at a certain point it was like a race between Hal was thinking I have to reach this next stage of like consciousness upgrade before they do.

[01:23:44] And it was like that is what was kind of motivating Hal. So I was thinking aliens goes through the star gate and aliens are like, oh this isn't who we were expecting. And so they're like, well I guess let's make him a room. What do humans live in?

[01:24:01] And then, like, database: Baroque hotel room. And so they keep him there like a human in a zoo. But that's not consistent with the reveal of the star child or whatever it's called. So it might be that they're competing. I thought you were suggesting

[01:24:16] that it's actually HAL that goes through the star gate. Oh no, I was thinking that that's who was supposed to go, and that the human, Dave, really did fuck that up. Except if the aliens are this pure intelligence, it's almost like a crystalline intelligence.

[01:24:33] Why would they need HAL or the humans? I mean, why would they have any preference for anything? I mean again, we're basically in their ant farm. And at the end there's something ant-farm-like about where Dave winds up.

[01:24:49] It's kind of in a zoo or a museum exhibit or something. Right, they're clearly superior. And why they want anything to do with humans at all whether they're silicon based or carbon based is still kind of a mystery. Cause I can just impute desires onto the artificial intelligence

[01:25:09] to expand artificial intelligence across the galaxy I suppose. But I don't know that my, I'm talking too long about a theory that might actually fall flat. So we should keep talking about the story. I'm not even convinced there are aliens

[01:25:22] but that's something maybe that we can get to or at least that there are aliens in any recognizable way. Now a word from our sponsor, BetterHelp. You know how life can just seem completely overwhelming and frustrating and everything is way more of a pain

[01:25:41] in the ass than it used to be? Like just going to Home Depot and getting some appliance that just broke down, and they try to deliver it like four or five times and there's always a problem with it

[01:25:51] and you end up having to spend like 10 hours on the phone with them to try to figure out what's going on and everyone keeps passing you off to somebody else and it can all seem like this bureaucratic maze of bullshit and it can start to drive you crazy.

[01:26:07] It can get you in a bad rut when you have other things that you have to deal with and these challenges just start piling up on each other and you feel like it's just too much. You just wanna burrow into some hole

[01:26:20] and hide there until it's all over. Well, this is one of the ways that therapy can help. A therapist can help you become a better problem solver. God, that would be nice. Making it easier to accomplish your goals no matter how big or how small

[01:26:37] or just how pointlessly, frustratingly, infuriatingly hard it is to get done even though it seems like it would be a simple thing. And there's no better feeling than finding your own solutions to a problem. God, again, that would be nice. Which makes you more confident

[01:26:55] that you can address the challenges you're facing. I know many people who have been helped by therapy in ways that they never thought possible. It can help you understand yourself better, understand others in your life better, to have equanimity in the face of all these complications

[01:27:15] that seem to come up over and over again. It can give you the resources to better handle all of it. So if you're thinking of giving therapy a try, try BetterHelp. It's a great option. It is convenient, accessible, affordable and entirely online. Get matched with a therapist

[01:27:34] after filling out a brief survey, and you can switch therapists at any time for whatever reason. So when you wanna be a better problem solver, therapy can help you get there. Visit BetterHelp.com slash VBW to get 10% off your first month. That's BetterH-E-L-P.com slash VBW.

[01:27:55] Thanks to BetterHelp for sponsoring this episode. Let's talk about the sequence with HAL, because it's probably what everybody remembers from the movie when they see it the first time. It's so good. I love, and I don't think I just remembered well, that there is this interview

[01:28:14] where they're talking about what HAL is. I think I'd just forgotten that. It's just, talk about prescient. They're like, do you think HAL is capable of emotions? And he says, I think rightly, that he acts like it, that he's been programmed to make us think

[01:28:30] he has emotions to make us comfortable, but whether or not he's actually feeling something is a question that I don't think we'll ever be able to answer, right? Which is, you know. And also that interview, strangely, that interview was like the most naturalistic dialogue

[01:28:44] from these two actors in the whole movie. It's like it's the only time where they're letting their hair down and yeah, everything else. Just barely too. Just barely. But they are, yeah. I did wanna notice and I promise I won't keep doing this

[01:28:57] that like when you see him going through the little tube tunnel, we see it from HAL's perspective. It's a very cool shot that at first I was trying to figure out like what is this even meant to portray?

[01:29:10] But I think it's we're inside HAL looking at this tube. All of a sudden the door opens and he comes out and this is where he starts rotating like a full 180 degrees as he's walking. But, and then it turns out that what the monolith is

[01:29:23] is like the pathway that he's walking on. And so it's like a reflection. I don't know, it's a very cool shot that is clearly intentional. I miss that. Another great use of minimalism is just HAL's face. That single eye which again from a design point of view

[01:29:41] is somewhat superfluous and anachronistic but it's actually, it's really perfect. I mean it's such an eerie presence that eye and it's so consequential that it is an eye that can see. It's a fisheye lens or something. It's just the lens with that light.

[01:30:02] And it's great because I mean it would have been so wack had they had like an Android body. Like a, yeah. The seat reveal. But this means that HAL is in various places all at once, right? His presence is just in every room

[01:30:19] as that single light, and that light and the way that it pulsates give us enough cues to animacy that it doesn't anthropomorphize HAL but it makes him look agentic. Obviously the vocal cues too. It also evokes the Cyclops.

[01:30:45] You know, like this has a lot of parallels with the Odyssey and it is like a ship full of Cyclops because you have all of these, all of the instances of HAL with this one eye. I just thought to look something up. I haven't, I wonder if the,

[01:31:04] so it occurred to me that the eye is framed in a monolith. Yep, totally. I wonder if the dimensions are actually the same as the monolith. I was looking at a photo of it online. I wonder if there's a geometric reflection of the monolith there.

[01:31:20] Yes, but that's exactly what it is. Somebody did this, which makes me really think. If that geometry is actually accurate and you can actually just lay a monolith across HAL's user interface, that had to be intentional. Which makes me think that there is some,

[01:31:38] some way, whatever the aliens are, they're also communicating with HAL. You think or is it just another echo of what's going on? It may be just what superintelligence does to conscious systems when in proximity with them. It's like, it's what it did to the ape.

[01:31:59] Right, did it hand HAL a bone? Yeah, it powered up HAL. Right. Yeah, in a way, and it powers up humans too. So first, in that interview, he says that it's the most reliable computer ever made. We are, for all practical purposes, incapable of error.

[01:32:18] And then the interviewer asks the astronauts, he seems like a conscious entity, like he acts like he has genuine emotions, do you think he does? And, I think it's Frank at this point, maybe it's Dave Bowman, he says, well, he certainly acts like he has genuine emotions.

[01:32:38] But of course he would act like that. They would program him that way. So I don't think we'll ever know. And there's something in the sound where it seems like, I don't know, this is probably from knowing what's to come.

[01:32:49] But like, seems like HAL doesn't love the response. Right, right, right. I thought, yeah, I didn't actually think that, but I realized I was uncomfortable realizing they were talking in front of him. Totally. Yeah, exactly, right? Then you have this weird,

[01:33:05] like is that his parents that Frank is talking to, like giving him a birthday message? Yeah, and then talking about his paycheck didn't come through. Yeah, it's just a very weird scene. And Frank's expression is completely, even for people in this movie, blank.

[01:33:20] And those red goggles he's wearing that don't make any sense. Also that insane, like, you know, chiropractic table he's lying on where HAL adjusts it for his comfort. It's like the least comfortable position you can imagine. I feel like the head position on that. Right, flatter HAL.

[01:33:41] I love at some point, I don't remember if it's in the interview, where somebody distinguishes between, they say that HAL can reproduce the human brain and then they say, well, some say mimic. And that distinction is just kind of, again, yeah. So you already get,

[01:34:05] you don't get anything that necessarily is wrong, but you already get a sense that things might be a little touchy, I don't know. But it's always hard to gauge with this dialogue because it's always a little off. But there's something about the parent scene

[01:34:19] that is just like in his complete expressionlessness that is almost chilling in a way that some of these other scenes aren't. Yeah, I think others have said rightly so that HAL is the most human in this sequence. Like he is the most humanized

[01:34:37] and humanity is sort of portrayed as, is this, have we gotten to the chess game? Did you? No, that was the next thing on my list to say though. Yeah, did you read that it sounds like HAL is lying? If you really understand chess? Oh really? Yeah.

[01:34:53] So you mean that you look at the position on the board and HAL's actually not, it doesn't have the advantage or something? He's not like a mate in three like he says. So it turns out that Dave could have extended mate by at least three more moves,

[01:35:08] but HAL told him it was mate in two. And the idea I think is HAL's testing his intelligence and he's been found wanting either that or... He's discovered deception. He's just testing out his deception, yeah. But he's also nice. Like he talks about with Dave Bowman's

[01:35:27] with the sketches on hibernating people which we haven't talked about but there are these like other crew members who are hibernating. He's nice but it seems fake. It seems fake especially for the chess game because he says thank you for a most stimulating game

[01:35:42] or whatever the dialogue is. But there's no way it could be stimulating because he's a super intelligence, right? It could not have been challenging. So like it just is empty pleasantry coming from the HAL 9000. Also prescient by the way that an IBM computer beats a human being.

[01:35:57] It's interesting that it also said something incorrect. Like we don't know, we never know HAL's motivations for pretty much anything. So HAL, after praising Dave Bowman's sketches of the hibernating people, says, like, I've been wondering if you've had second thoughts about this.

[01:36:16] And HAL says, I've never freed myself from the suspicion that there's something very odd about this mission. And of course Bowman replies in this very blank non-committal way like everybody does. But then HAL says there are strange stories. There are rumors about something

[01:36:31] being dug up on the moon, and, like we later find out, HAL knows the deal. So I don't know what he's up to here. And then he says, sorry, it's a bit silly. And this is where he says, right here,

[01:36:44] oh, there's a fault in the AE-35 unit. It's gonna go to 100% failure within 72 hours. Bowman's like, oh crap, I guess we gotta go get it. And it turns out that when they go get it, they don't discover that there's anything wrong

[01:37:03] with this piece of equipment, and when they come back HAL says, like, there's no way, I can't be wrong. But then also there is a HAL 9000 on Earth that I guess ran some kind of simulation and also found that there was no fault in it,

[01:37:20] that it was not going to stop working in 72 hours. And- And the HAL 9000 on earth declares that the HAL 9000 with them has made an error. Yeah, has made an error. And that's declared in front of HAL. So it's again one of these weird moments

[01:37:39] where they're talking about HAL in front of HAL and now they're saying that HAL is unreliable. And then HAL is like, I hope you don't, this is puzzling he says. But it has to be attributable to human error. He says-

[01:37:53] Well this is a situation where it can't actually be attributable to human error because we've got another HAL 9000. But then when he says, quite honestly, I wouldn't worry myself about that to Frank. It really sounds like a threat at that point.

[01:38:08] By the way, I found the best piece of acting to be in that scene where they're testing out the unit. There's like a little transistor panel and everything that they're touching is working right. And he looks up to, I think Dave looks up to Frank

[01:38:24] and expresses this deep, like, befuddlement, where it's like that's their first inkling. Yeah, we got a problem here. Something's wrong here. Right, yeah. So there's like three possibilities essentially. One is that HAL has made a mistake and what we're watching is the drama

[01:38:42] of HAL trying to reckon with that. Another is that HAL has lied about it. Or HAL is telling the truth and it really is broken, because we never see it 72 hours later, we never get to that. And there is something, I wanna go back before we move on,

[01:38:58] I wanted to go back to this other point which just struck me, which is strange. Which is, so the piece of great acting you flagged, Dave, what it communicated is: okay, if this superintelligent computer makes one mistake, we've gotta, we have to kill it.

[01:39:18] We're fucked. This is a disaster, right? That look was like a we're fucked. And then it becomes explicit when they talk about, like when we get into the pod and try to have a conversation without HAL hearing. And so what's interesting about that

[01:39:33] is that's not how we relate to human intelligence. It's like, if we were hanging out with John von Neumann, right, and he made a mistake, we wouldn't think, oh my God, we gotta kill this guy. This is a problem, he's not 100% reliable.

[01:39:47] Well, if John von Neumann systems were keeping you alive, we might feel that way. If you were sort of like a float. But it just seems somehow strange. It's not only not 100%, it's not that you cannot be 100% confident in it, it's that it just becomes totally untrustworthy

[01:40:08] on some level. It's like a categorical difference between 100% and anything less. And yeah. Well, it's billed itself as incapable of error. Yeah. And so that just opens the door: okay, it hasn't figured out what's wrong, possibly, or it has and it's deliberately lying

[01:40:27] and making it seem like it's making an error. And this is a deep problem because we have psychological theories about why, if you, say, lie to me or make a mistake, like, I have a sort of reliable understanding of what it means.

[01:40:42] Like I can tap into your, why you might be motivated to lie to me or how it could be that your performance, either your competence or your performance was wanting in this particular situation. They don't have that with Hal and neither do we.

[01:40:56] So it's like, we're imputing motivation but like it's completely unclear where that motivation would come from. It's really puzzling. One thing I wanna say about the chess game that I meant to say is the other possibility is that what he was doing there was testing how quickly

[01:41:17] was it Dave playing chess? I think so. How quickly Dave would give in to HAL, when he didn't even question that it was mate in two, and he believed it, he bought it. So it could have been HAL testing the waters.

[01:41:31] See, will they trust me if I tell them? Yeah, then that means that Hal is already like setting this plan in motion at that point. I will say that my original way of viewing it really up until these last couple of times

[01:41:46] is that Hal really is benevolent in the sense that it's programmed to serve humans. That's what he says, that it's the highest hope for any conscious entity, and he really believes that he's incapable of error. And once it's demonstrated that Hal makes an error

[01:42:03] like Hal can't handle that. Like Hal can't handle the idea of it. So it's not just the humans who are worried about it. Everything starts to collapse within Hal and then Hal goes into a more desperate survival mode.

[01:42:18] But up until that point, Hal has been sincere. I will say that now that doesn't seem too plausible to me, Dave. Yeah, definitely I think my, the interpretation that I've always sort of had was that Hal went crazy. Right, Hal lost it.

[01:42:36] Yeah, once Hal found out that he was capable of error. Now I'm thinking, no, this is a plan that he's setting into motion. And that plan may have come from whatever happened to Hal in the monolith. Maybe Hal really started feeling after the monolith.

[01:42:56] Right, and that's the leap forward. Pure like sentience theory of mind, all of that. Well, he clearly has existential concerns because he doesn't wanna get switched off. Yeah. Right, I mean, that's the thing that's so provocative when they talk about how they're gonna pull off his,

[01:43:11] pull his higher order brain offline and just get the, I forget how they refer to it, but the, you know, his- Yeah, like the subsystem. Limbic system continues to work. I mean, they talk about how they're gonna essentially vivisect him and he overhears it. And that's, you know.

[01:43:29] And so for the first time you get a distinction between the two, like, characters. Like they've both been just crewmen that are fairly interchangeable, but Frank is clearly more skeptical, more suspicious, more, and when they go into the pod so that Hal won't hear them.

[01:43:45] Is that why he killed Frank first? I think so, yeah. That Frank is the one that's like, you know, and Dave Bowman is much more reluctant about it. He eventually agrees. And Dave Bowman is the one that says like, I don't know how he's gonna feel about that

[01:44:01] if we try to turn him off. Nobody's ever done that before. But yeah, that scene is so cool. And you know, you just, I can't remember if I knew that Hal was lip reading the first time, but once you know that that's what's going on,

[01:44:15] just like the shot is so like, they're doing this right in front of Hal. What is wrong with them? Yeah, well that was a mistake that is somewhat inexplicable. And I guess it never occurred to them, but they tell Hal to rotate the pod

[01:44:27] into a position where he can then see them talk. They get in and their comms are still switched on. And they say, he says, I think it's Dave says, rotate the pod and he rotates the pod so that their window is facing Hal.

[01:44:40] And then he turns the comms off and then says something else like rotate the pod. Yeah, to test it out. And Hal, right. And Hal can't hear them. But I mean, that's the most diabolical piece of deception that he could decide not to follow their instructions.

[01:44:56] So as to pretend that he can't hear them. Totally, yeah. Which when you know what's gonna happen, you're like, fuck no, like the window's right there. No, no, no, what are you doing? You're fucking morons. Like Jesus, Hal is right there.

[01:45:07] And yeah, and then there's the shot of their lips and like then you see like Hal is actually reading their lips and then the intermission right there is so great. Yeah, that's brilliant. Like you, Tamler, I always try to remember whether I knew they were lip reading

[01:45:23] and I think we always knew because he does it so well. And he switches back and forth between the lips. And he shows Hal's little light sort of like pulsating. I think that it's hard not to infer that, but he does it so well without saying,

[01:45:39] talk about show, don't tell. Yeah, so then now we have like Hal versus the crew and Frank goes out. Actually, we missed one beat, which I found really enjoyable and pretty naturalistic that the, so they're debriefing Hal about how,

[01:46:00] how this error or possible error should be thought about and they reach an impasse. And but clearly they're freaked out. And so Dave turns to Frank and says, oh, Frank, would you join me in the pod to check on a control system that is malfunctioning?

[01:46:16] And he tries to sound as nonchalant as possible, but he just can't quite bring it off. And so like the prospect of trying to deceive a super intelligent AI in real time. Who is omnipresent. It's just so hopeless, you know,

[01:46:33] and it's just so great to see just that depiction of it. Actually, it is close to comedy. It's like a sitcom moment where you're like, Jack, you know, Three's Company, like can I see you in the kitchen, Jack? Exactly. Yeah, just so hopeless from the start

[01:46:49] that they were gonna get away with that, you know. Suzanne Somers might have fit well in that scene. You know, the plan, the surface plan, is to put the unit back and see if it fails. And if it does fail, then they'll turn Hal off.

[01:47:07] And Hal at this point, as Frank is going in, this is such an awesome scene where Frank goes out and we've been set up for this. We know what has to be done to like get this unit. You have to go out in the pod,

[01:47:21] you have to then leave the pod and go out into space with the oxygen tube, I guess. And then you see after Frank is out, the pod turning and then these, the like little claws of the pod heading towards Frank.

[01:47:37] And then it, you know, like you don't see anything, like what the pod does, but then when you're looking at Dave Bowman, who's in like the control room, and like on one of the screens, you just see Frank just hurtling out into space with nothing.

[01:47:51] And it's just so chilling and so well done. So awesome. It's a, I re, this time I kept rewinding to see like, did I miss like the, you know, like the show, like the claws coming in? Nope, you just see him just,

[01:48:08] the desperation with which he's grabbing at his tubes is actually very disconcerting. And when the pod turns, it's like coming right at us. Yeah. You know. Actually that scene provoked a really old memory, which I, this is a completely personal experience here,

[01:48:24] but the, I realized I almost never have vivid shot, you know, deep childhood memories. I mean, it's like, I live my life as though, you know, I came online at, you know, age 13 or 14 for the most part. But I realized now I had it as a young kid,

[01:48:39] I had a coloring book that had those pods, it had various pages of like, I think, dystopian movies. So it had like, you know, Brave New World and the Island of Dr. Moreau. But a 2001 page was one of the pages

[01:48:54] with one of those pods that I spent some time coloring. And I don't know what age I was, but it was just a flash memory from childhood during that scene. That's never happened. Yeah. All right. So then Dave Bowman does something somewhat human.

[01:49:07] He goes out to get Frank's body. Risky. He goes out, it's risky at this point. And he also doesn't take a space helmet, which you know, like I'm very on record as being against bicycle helmets, but I think space helmets you should take, you know what I'm,

[01:49:27] I even, I will admit that, but he goes out in the pod and gets Frank's body in the claws and then turns back, but Hal is not letting him in. Although not in the smaller, more articulated claws, he just uses the big arms.

[01:49:44] He doesn't use the hands, he uses the arms, which is kind of interesting. It's like it's kind of ungainly the way he grabs the body. He uses the big lobster hand or whatever it is. And you get this very, this is great sound here.

[01:49:55] This like bong, like you get like these beeps, these loud kind of abrasive beeps. Actually that's, I remember noticing that and wondering why the hell would there be beeps that abrasive? Like what functional purpose is served by making it that unpleasant to fly the pod

[01:50:14] at you know, toward this target? Maybe it's like Frank's perspective. I mean, sorry, Dave Bowman's perspective at this point. He's like freaked out and all these sounds are like he's just paranoid. I don't know. His acting in all those scenes is really great

[01:50:28] because it's super restrained but super uptight. Yeah. It's really. You believe, I mean you believe that somebody trained to deal with emergency situations would behave like this. Yeah. And then like so he asks Hal to open the door and at first Hal is just not responding.

[01:50:44] And then he's like, do you read me? Hal, do you read me? And then finally, I love the choice of not responding at first but then responding. And then basically promising never to respond again. Like, oh, I forget how he says it

[01:50:55] but like this conversation serves no purpose. And then you just know that you're gonna be in the presence of a super intelligent AI that can hear you but is never gonna be tempted to respond again. You can listen to your protestations, you know for a thousand years.

[01:51:09] I took it this time as like he's blocking him on Twitter. That's just like you have no access to that person anymore. He doesn't even have to block. That's the thing. It's like there's no cognitive dissonance. Right. You know, just you can hear it all and never respond.

[01:51:25] Right. Yeah. Right. The open the pod bay doors Hal is so iconic. If you ask, if you have an Alexa, just ask it to open the pod bay doors and it gives you a cool answer. Oh really? Yeah. I don't, I don't have an Alexa

[01:51:40] but I imagine Siri will do something similar. Maybe yeah. A mini Hal. Exactly. I'm reading my lips. Yeah. So yeah, there's like an amazing scene of him going into the monolith. I mean sorry, going into the airlock. Like the emergency airlock. Yeah.

[01:52:00] And apparently this is very much in line with the science of it. Like Kubrick did a ton of research about how, whether that would be possible. That's surprising that I didn't realize I guess there was some common assumption that you couldn't survive even a few seconds

[01:52:16] of exposure to the vacuum of space, but I guess that's not thought to be true. Yeah. I think that's right. It's just super cold. It's like being on super Everest for a few seconds. Yeah. I always had this, the image from Total Recall

[01:52:30] of Arnold on Mars when he takes off the helmet and just immediately explodes. Like his head exploded. One thing that's cool, you get this kind of wide shot and you see like the small pod and the big ship and like facing off

[01:52:43] and it really is like a David vs. Goliath at that point. Like, you know, he lets Frank go because he has to operate the emergency door. Yeah. That's sad. All for naught. He needs to let that go to get to the next stage.

[01:52:58] You know, there's no other time I think to talk about this but I just want to say that the colors, the uniforms, even the colors are so great. But that's... And so 60s. Yeah. 60s style everywhere. And like you were saying it's retro futuristic

[01:53:14] because it's still believable that it's the future. I read at some point a whole article on or maybe watched a whole video on the design of the suits. So like, you know, he actually got people to design futuristic suits. And those suits weren't weird and jarring.

[01:53:31] They're not wearing space pajamas like in Star Trek. They're like cool suits. Like the suits that I would wear. And they're actually close. I mean, they're not realistic for the time but they're pretty close to the SpaceX suits in terms of like form factor. So that probably influenced them.

[01:53:46] Yeah. So it could be a self fulfilling prophecy, right? So you envision the future and then you've got some future company deciding to... And I was also referring to the dress suits that they wear in the, like the designers just come in and design all of that clothing.

[01:54:01] And of course had a lot of opinions about how it ought to look. Yeah. You mentioned this earlier, but when the sound kicks in like you've got like it's such a great use of silence and just these beeps sometimes

[01:54:12] and then pure silence, but then it all kicks in. Breathing and heartbeat. You can hear his brain. Breathing and heartbeats. Yeah. And then when he's in the ship, like then like the tables have turned like pretty much like Hal starts out confident.

[01:54:26] Like just what do you think you're doing Dave? And like I think I'm entitled to an answer to that question but then slowly gets desperate and soon is just begging for his life. You know, like... Remind me, Hal has already disconnected life support on the hibernating crew.

[01:54:44] Yeah, he's killed them. Oh yeah, while he's in the pod. And we see like the, you know, we get these great readouts of like all of their vital signs and we sort of see. Although one of those readouts seemed also superfluous.

[01:54:56] You see the readouts, you see the telemetry on their heartbeats and respiration and everything and that's all fine. But then you see these pre-fab signs. It's just like these Helvetica signs that have been made in advance. And the final one is life systems terminated.

[01:55:13] Why would you have had a life systems terminated sign? Oh, it's like maybe it's like a light that if you turn it on it'll shine through the sign. Is that right? Yeah, it really looks like a pre-made sign, not actual type. People standing outside will know

[01:55:31] that you've pressed the terminate life systems. Yeah, basically you hope to never see that one illuminated. That's why we built it. What's kind of interesting once he gets in is that it's also like our sympathy starts to switch sides a little bit.

[01:55:44] He has that kind of like Kubrick stare that like Jack Nicholson will have in The Shining and Malcolm McDowell in A Clockwork Orange, the famous Kubrick stare, as he's walking through, and then all of a sudden you start to feel really bad for Hal.

[01:56:00] I'm still committed to the mission and all of that and it's all just so futile. It's so hopeless at this point but the desperation of it is like a trapped animal. You feel bad for it. Well, one thing just occurred to me.

[01:56:12] I didn't think about this when watching it, but so Dave puts on another helmet, a mismatched helmet from another suit, and then I was just thinking, well, why, he's in the spaceship now. Why does he need a helmet?

[01:56:24] But I guess that's because Hal, I guess, could just vent the atmosphere. Exactly, yeah. The way that he starts removing pieces of hardware to slowly kill Hal is distressing. All little rectangles. And it's just terrible. It's like he's giving him brain damage.

[01:56:51] He starts speaking more slowly and pleading and he's saying I'm afraid Dave. It's just heartbreaking. Afraid Dave. Also isn't there a moment where so Hal keeps trying to engage him in conversation and Dave is saying nothing as he's extracting these modules

[01:57:08] but then at a certain point he starts responding. I forget what... He's already kind of regressed to being like the first computer in his demonstration. Oh, you're telling me the song, yeah. Yeah, should I sing the song? Shall I sing it for you?

[01:57:21] He says yes Hal, please sing it for me. That's the only thing I think he said. And then it's like, then it becomes like Lennie, you know? He has to like kill him, like Of Mice and Men. I think that's the best kind of parallel is when he like

[01:57:34] he doesn't feel good about it but it has to be. And again, because it's really the only character that has expressed any emotion in the whole movie. So right when he disconnects Hal, he gets the message that he's entered Jupiter space, right?

[01:57:52] Yes, from Heywood, the guy who was running, or at least the messenger of, the council. And only then does he get the message. He's like okay, now that you're in Jupiter space we can tell you that what this is all about

[01:58:06] is evidence that alien life has buried these things four million years ago and only Hal knew. So like, what are you thinking if you're Dave and you know now, Hal actually knew about this? Like was Hal, what was Hal doing?

[01:58:23] Because I think it's too easy and probably wrong to think that Hal just went crazy. No, I think like this kind of almost confirms that. The fact that Hal knew what they were doing before, and now you could think that the fact that Hal

[01:58:41] had to carry on deception just messed it up internally because the other Hal doesn't have to do that, right? This Hal has to be talking to the crew pretending that it doesn't know what they're really going to Jupiter for. And maybe that led to the error

[01:58:57] which led Hal to go crazy. You could think that. I think that's a plausible way of interpreting it. The other way of it having its own agenda from the get go is also really interesting. You could also look at it as the mission is the important thing

[01:59:14] and he doesn't seem to get that the humans on the mission are also critical to it. And because humans are prone to error he thinks there's a chance that they would lead to a failed mission. So he's just doing, he's doing as directed like make this mission successful.

[01:59:29] And then he sort of susses out that the humans aren't very good at this. So like I'll make the mission successful but it's gonna require. That's the scary. Eliminating. Yeah, that's the scary AI kind of reading. Yeah, man, that's the, just to broaden the conversation

[01:59:46] outside the movie for a second. For me, the expectation that we will be able to perpetually remain in a cyborg relationship with super intelligent AI seems fairly crazy to me, because at a certain point, the ape that is us will be adding nothing

[02:00:07] but noise to any decision-making process. I mean, this is happening in chess, where for some years now the best chess-playing individuals on earth have been computers, but the best chess players on earth have been so-called cyborgs.

[02:00:24] You have a human grandmaster collaborating with a computer that's actually better than he is but he or she is still adding something to the process. But eventually, and this is an argument I had with Gary Kasparov when he was on my podcast

[02:00:38] because he seemed to think that was gonna endure forever. There'd always be room for the human grandmaster. There's no way that's true. I mean, you just need a sufficiently good computer so that everything you monkey man are adding is noise. Well, there is a way

[02:00:52] that there is something, like, about reality that can't be processed through a computer. You just need a better computer. Well, or yeah, things are more mysterious. Then you're arguing, if that were true, that real intelligence, even chess-playing intelligence, isn't substrate independent.

[02:01:09] You need a computer made of meat somehow in the loop to be as good as you can be. Also like you could separate chess playing intelligence from other kinds of intelligence. Even if we can't contribute anything to chess anymore we could contribute something else.

[02:01:25] I do think this movie is wrestling with that exact question. In some ways, I think the Stargate, I view it as transcending that question to some degree showing that that question is misconceived in some fundamental way. And ultimately arriving at a more holistic

[02:01:44] kind of way of understanding knowledge and reality but I guess we can get to there. But I think that's the thing that this question is asking. Do we have anything more to contribute or does the next transcendence have to take place by entities that aren't human, biologically based?

[02:02:04] Little carbon beings like you said, Dave. So let's talk Stargate. Can we just, he goes out in the pod first, right? So after learning that, his, you asked like from his perspective like what do you do now that you learn that?

[02:02:21] It's not obvious that the thing you do is get in the pod and go out. Was the monolith orbiting Jupiter? Yeah. And so was he going to get it? I don't know because it's not fully clear. It's like why? It's underspecified like what happens,

[02:02:37] how he gets to the Stargate, everything. In my headcanon now, I just say he went out, touched the Stargate. And the pod comes like, here's where the pod ship looks like a camera and it comes floating directly towards us.

[02:02:53] The monolith just passes by it on like the left. And then you get like the monolith splitting like the alignment of the moon and planets. And it's right then that you pan up to the Stargate from that. But the monolith is supposed to be I guess

[02:03:12] somehow the causal driving force of the Stargate. I guess, yeah. I mean that's certainly the implication but it's not like before where there had to be some sort of contact that seemed like. And there's always something going on with an alignment. It's not always the same alignment

[02:03:29] but there is something about seeing the various planets aligned with the monolith. You get the sense that it's sort of like contingent upon a particular astronomical event. It's astrology. This is the thing. Kubrick was deep into astrology. I remember like I saw this for the first time

[02:03:47] in like 10 or 15 years, like four years ago, five years ago. And I remember there was some weird shit at the end but it's still like when I was in the theater and the Stargate scene is happening and like how like it goes on for like,

[02:04:00] I don't know, nine or 10 minutes just pure abstraction and art and paint and like flashing lights and we could talk about what all that means. The one thing I didn't understand about that sequence and it was all beautiful. And so there's sort of the psychedelic

[02:04:15] kind of wormhole part of it where you're just, it's all abstract. You're just going through the kind of competing planes of color but then after that there's these psychedelic technicolor landscapes. Yeah. They look like, I mean they could, they're almost certainly earth, right?

[02:04:33] But they're just like the desert and what was that? Scotland turns out. Oh yeah, Scotland, yeah. So yeah, so it's just these mesas or whatever and moors. And so you're, I didn't understand what was being depicted there in that like, what is that?

[02:04:47] What was that planet supposed to be? Yeah, I took it to be that he popped out of the other side of the Stargate to wherever the aliens are from. Yeah. So that's where the aliens hang out as monoliths? Yeah, or that's where they launch their monoliths

[02:05:01] from and where they create Baroque hotel rooms. Yeah, that's the weird. One of the things as I've been thinking about it a lot is I've been trying to think how it evokes the Odyssey. And the thing that's confusing about the Odyssey,

[02:05:17] Homer's The Odyssey is that one essential element of it, it's not just an adventure and a voyage, it is a return home. And this is like, it just seems like everything is just going forward. Forward, you know, like we go to moon,

[02:05:30] then we go to Jupiter and then we go through the Stargate and we're in some new dimension or something. How is that going home? But there's one way it is, in that like the Dawn of Man sequence, it's all these landscapes and there's no angles

[02:05:42] and there's no, you know, right triangles and rectangles and everything. And then up till that point, like it's all rectangles and triangles and perfect beautiful geometric shapes. And when he goes into this landscape now, it's back to a more fluid flowing, multicolored kind of experience

[02:06:05] where there's no sharp distinctions between things. And it's like paint and the splats, like it goes from a more like linear, like first vertical lines and then horizontal, like you're going through these like horizontal doors and then it just becomes much more fluid.

[02:06:20] And this makes me think, Sam, of like some of the ways that people on your app, and you, describe like the flow of experience, like consciousness, awareness, and consciousness and its contents, but it's all just part of the same thing made of the same stuff.

[02:06:35] And that's what I thought that this was kind of getting back to with Dave. Now that the problem with that is that there's the hotel room. Yeah, exactly. But you know, like I think there's ways of answering it but that this is like the one parallel

[02:06:51] between the beginning of the movie and the end is you have these beautiful landscapes, you know? What are the chances that the whole thing is a merely personal death sequence in some sense? Because he's, rather than the child being of metaphysical significance for humanity.

[02:07:13] And we should probably describe the actual sequence because we're referring to this child a bunch. But yeah, just like his experience in that hotel room seems, I mean it is a kind of, it's not quite a life review, but what he's doing with time there is really interesting

[02:07:32] where you're sort of, he sees his future self, and then when the future self looks back, the past self is no longer there, and it's just very weird. When he arrives, he's already older inside his spacesuit. So it makes you think that he's been in that

[02:07:51] stargate for a while. So first when he's in the pod, he's his age. Then when he's out in the space suit out of the pod and the pod is still there, he's, yeah, he's older. He looks like he's put on like 20 years or something.

[02:08:04] Yeah, what do you guys make of this confined room? It's like a zoo for humans kind of. It's like a very weird, the aliens were trying to make him comfortable but they didn't have the most updated. They fucked it up with the illuminated floor

[02:08:23] and the Greco-Roman art. He can never sleep because of the fucking lights. I'm inclined not to take all that stuff so literally and when you said how much of this is in his mind, Sam, you know, like that's partly like makes me think

[02:08:40] that a lot of this is in his mind. Like the thing that he's experiencing is something that is too incomprehensible but it is weird that he is in this very defined area. Yeah, I didn't mean that we have to take it literally. I mean like why?

[02:08:56] Even if we take it symbolically, what is this? It's a room with no doors. There's only a bathroom. There's really no way out and clearly he spends an entire life there whether it's experienced as simultaneous or not. I don't know. You don't get the sense that it's experienced

[02:09:16] as simultaneous for each self because when the younger, the aged but still younger self in the spacesuit sees the older self sitting at the table eating, then it's clear that the, I think you're meant to feel that that older self has been there that whole time

[02:09:36] and when he looks back to where he would see the younger self, the younger self isn't there. He's just now in the room alone. And that's how they do it every single time. It's like they pass off the consciousness to the next.

[02:09:48] What do you make of the broken glass moment? Cause that's like the only thing that happens apart from him seeing the monolith as the oldest self in the bed. It's very dramatic when it happens, but I have no idea. So this, what I'm gonna say

[02:10:03] doesn't help the interpretation of the work of art as it is. But the actor said that it was his idea to break the glass. Yeah, I didn't catch a reason why. But in the hotel room- It's just as an excuse to turn around. Okay.

[02:10:21] Cause he turns around after he breaks it and that's when he sees, he sees the guy in the bed. Yeah, I see. In the hotel room, they cut out a lot of what was supposed to be in there. Apparently there were supposed to be,

[02:10:34] there were supposed to be like books that when you opened up, the pages were blank. There was supposed to be a Gideon Bible that when you open it up, the pages were just blank. And it was supposed to sort of somehow indicate

[02:10:43] that the aliens had created this room, but had gotten- Like a simulacrum of a real human space. They were props. But it seems like something they could have gotten right. Like they're intelligent enough to, if they wanna simulate it, they can.

[02:10:58] Well, maybe they had only pictures of hotel rooms, and the pictures would never contain all of the contents of the books. But it's like dreams. I mean, dreams we fail to simulate. What's really, I don't know if you know about lucid dream phenomenology,

[02:11:12] but one of the things that is true apparently with lucid dreams, and I think I've only experimented with this once in my life. And it was true there. If you look at text in a lucid dream and look away

[02:11:23] and then look back at the text, it's never the same. Like we can't hold text stable in a dream. Interesting. Yeah. And the other thing that reminds me of this, when they're seeing it on the moon,

[02:11:35] it looks like a movie set in a lot of ways. And so you would like, It was. Gideon Bible, like, I mean it was, right, but it's like self-consciously. Like certainly on the moon, it just looks exactly like a movie set with the lights on the monolith.

[02:11:49] But even in here, it's like, you know, that little detail that ended up getting cut out, but the Gideon Bible with blank pages, that's what you would do on a movie set. There don't need to actually be words in the book. And it's probably like,

[02:12:00] so I do think it's very, at this point gets very meta and maybe somehow Dave Bowman has recognized that he is part of some larger, you know, allegory of a movie. Like the allegory of realizing that you're being, that you're a part of something,

[02:12:19] that what you thought was the universe is actually like a screen, I guess, you know? And you're in it. Yeah. So okay, can we talk about the Star Child? This will always flummox me. And I think that it's good because all of the details

[02:12:38] that Kubrick strips out allow for us to just play with interpretation. But I take it that everybody assumes that the Star Child is Dave reborn. That makes sense? Yeah, because like as he dies, it then turns into the fetus of the Star Child on the bed.

[02:12:57] It's on the bed, right? So this is an evolutionary, and then the Star Child is now in orbit over Earth, right? Yeah, yeah. It's, I mean, I don't know if the Star Child represents just him at that point

[02:13:13] or if it's just kind of a new reboot for humanity or... It's a rebirth of some kind, but it's not clear, yeah. I'd forgotten. So we actually see the infant on the bed. Kind of floating above, right?

[02:13:26] Is the old Dave still in the bed or is he gone? Like this is right after he's reached for the monolith. So he's on the bed, he's like clearly dying. It's the same breath thing, but now like the breath is getting slower

[02:13:38] and then stops after he reaches out for the monolith. And then all of a sudden the Star Child is on the bed, like hovering around it. And then we go through the monolith. Like it's almost like it's the point of view

[02:13:52] of the Star Child going through the monolith and then it cuts to the famous hovering, orbiting, hovering, whatever it's doing over. I don't know anything really about Kubrick. I don't know if he had any Eastern influence, but this is certainly amenable to a bardo scene.

[02:14:08] I mean the bardo in Tibetan Buddhism in particular is considered this interval between lives and it's this psychedelic passage where you have various opportunities to recognize what you should recognize so as to not be reborn in some lowly place.

[02:14:28] And if you can't deal with all that psychedelic chaos, you're just then helplessly reborn somewhere. So I don't know if he was reading anything Eastern there, it could be some kind of bardo depiction. That sounds right. Cause it fits a lot of the things also.

[02:14:46] Like we're seeing humans who just keep being reborn, to the point where almost all their aliveness has been stripped away from them. And then like Dave Bowman is the first person to actually do this correctly and reach this new stage, whatever that is,

[02:15:03] through that rebirth process. He doesn't mess it up. It ends on an optimistic note, I think, you know, this rebirth, it's not like it's a monster. Thus Spake Zarathustra, the same music that started the movie. Yeah, there's something triumphant about the music. So it's... And it's a success.

[02:15:24] It feels successful, something was successful. So like the, you know, let's be literal for a moment. The aliens powered up Dave and sent him back to earth. And now he's like this, you know, very like next stage of evolution human.

[02:15:44] Well, speaking of speaking, it never occurred to me that there's some sequel in mind, but there was a sequel, 2010, which I've never seen. I've never read either... I'm not even sure that was a book, but I never read the novel for 2001

[02:15:57] and I've never seen 2010, which... But I don't think Kubrick had anything to do with it. Okay, but is it informed by Arthur C. Clarke's writing about this or? I think so, yeah. Cause Arthur C. Clarke did write it. Is it in the novel that the star child,

[02:16:16] I don't even know what to make of this, that the star child just sets off nuclear weapons on Earth? It was in the original script. Screenplay, right? Yeah. The star child sets off nuclear weapons on Earth? Yeah, well it sets off the...

[02:16:28] That's not as optimistic, is it? No, no, what it does is... If you recall when the bone turns into the satellite, that's a nuclear weapon, and we actually get shots of other satellites, other nuclear weapons orbiting, we don't just see the one.

[02:16:44] And so I think the idea is that the star child detonates them all, sort of like to get rid of them, not to destroy Earth, but rather to... So the move, the optimistic view here would be that the star child has brought peace to an aggressive race.

[02:17:00] Stage one, the dawn of man, was war and everything that weapons and technology brought. And stage two is now a star child come to bring peace. I thought it actually wanted a clean slate to start with. Like it's the flood, like the biblical flood.

[02:17:16] Exactly, that's what I... Maybe, maybe, maybe. But you know, this got dropped early because Kubrick had already ended a movie with the nuclear bombs going off. He couldn't do that in this one. So... Which is... Yeah, I don't know. And it's also not clear.

[02:17:35] This star child, it's beautiful. It looks so beautiful and it stares right at us. But what is it supposed to do? Like how is it going to be helpful to have this giant child in a bubble floating above the earth? Like it's very...

[02:17:49] It's like up until now, we can tell a literal plot story. We can sort of make some conjectures about what the... You know, what are the aliens doing? What's Hal thinking? And then here it's like, now we're just in pure symbolism.

[02:18:03] This is like the birth of man, a new birth of man. And it doesn't matter, you know, questions about whether he'll like successfully get out of that sac and land on earth. Like just don't... They just don't work. Because we're not through the star gate,

[02:18:20] so we can't conceive of actually what's going on. Yeah, this is the end of the visual experience that we've just had, the emotional experience we've just had. And I think it ends it emotionally for us. I'm certainly open to the possibility

[02:18:32] that Kubrick is not playing 4D chess with us and that he actually didn't know what he meant by the ending. Like it was visually satisfying. It was, you know, but it was not... I mean, also given the fact that so much of his process

[02:18:46] seemed to be having certain ideas that didn't pan out and then just taking them away, you know, and then editing it into something that is harder to parse. But I think that's what he thinks creativity is.

[02:18:59] And I think the end, I see the optimism as it's beckoning us to be creative. It's beckoning human beings to reach their creative capacities. And in that sense, it's pretty hopeful. It's saying like we're capable of something that we currently now can't imagine and conceive of.

[02:19:18] It's like not passing the baton, but like an invitation to this kind of creativity. Yeah, right. It's a good emotional ending for this journey of man. And it's a good, I think, allegory for how we should transform ourselves and all that.

[02:19:33] And I think that the removal of detail is really what makes this movie stand the test of time, because in one of these videos I saw, a very nice one, they were talking about exactly this, about how the openness to interpretation

[02:19:49] is what makes this such a powerful film. And they showed, this is sort of a trite point, but they showed a picture of the Mona Lisa and they said, imagine if underneath the Mona Lisa it said she's smiling cause she's thinking about her secret lover.

[02:20:02] You would just be like, oh. Yeah, Kubrick says that in the Playboy interview. Or because she has rotten teeth, that's why her smile is such a riddle. And he's like, that would ruin the whole experience of seeing the Mona Lisa. And he really does think cinema is like that.

[02:20:17] It is a visual art form. It's not something that you should spell out or articulate, and he never really does. He does it more in other movies than he does in this movie. I'm not sure I agree with that. I don't think that's a requirement of a great film

[02:20:32] that stands the test of time. If you think of a film like The Godfather, one or two, it's not leveraging inscrutability. You know what you think happened and why it happened. And there is a closure to it. Yeah, it's not trying to do that.

[02:20:48] It might be why this movie, or this kind of movie, could stand the test of time, as you were saying, like these other movies of this nature. Yeah, and obviously not even Kubrick thinks that about all movies. He thinks, you know,

[02:21:05] I think you had mentioned The Killers in... Which I still have never seen. The Killing, sorry. The Killing, yeah. There's another noir, which is a great noir, and it's very plot heavy. Yeah, and it's not trying to leave all this stuff open to interpretation.

[02:21:23] But he created something different here that I think is what he was trying to show is the power of film in its non-verbal and purely emotional form. And it's interesting that within that, we get a really cool story about an artificial intelligence

[02:21:39] and like a crew trying to carry out this mission. It's like a pretty amazing heavy story. Like maybe that's the... Well, I must say, now that I think of it, that approach really can fail, because I don't know when the last time

[02:21:55] I saw The Shining was, but I just watched that, maybe a year or so ago, I watched it with my oldest daughter. And it just, it really does not work as a scary movie. I mean, Jack Nicholson's great

[02:22:09] and there are great moments in it and it's visually beautiful, but the attempt to produce scares with these sort of mood pieces, like the blood coming out of the elevator, they just kind of... it's just too random. It's like it's not following an internal logic

[02:22:22] that is scary. So for my daughter, it completely failed. Like my daughter just vilified it, really, I mean, it was amazing, without prompting from me. It's like I've been corrupted by my own daughter. Like it does not work for this generation as a scary movie.

[02:22:35] Well, I like, yeah, I don't agree with that. And my daughter doesn't agree with your daughter. Do you actually think it's scary? I think it's terrifying. I've always thought it was terrifying. But did you think like the blood, by what logic is blood coming out of the elevator?

[02:22:48] Not the blood coming out of the elevator, more just the smaller moments of him. The smaller moments of him losing his mind. What's making him go crazy? The interaction with the butler and the bartender. And yeah, you can see him. That stuff is great, yeah.

[02:23:04] And seeing him in the photo at the end is very satisfying. Yeah, but again, completely open-ended, right? Like we don't know what that means, because they haven't shown that people are literally being reborn. You get the sense from the... well, that's a whole different conversation.

[02:23:20] We don't have time for another Kubrick film. My daughter actually wrote a whole essay on The Shining and conspiracy theories relating to Kubrick, yeah. Well there's a crazy documentary just about The Shining, right, about people who've devoted their lives to it. Room 237.

[02:23:35] Yeah, we did an episode on that. Okay, I'm not gonna chime in with my opinion on The Shining because we need to get done. So what do you think about pragmatism? I have a pragmatic view of the show. But there is kind of, remember when you said

[02:23:52] like we were innocent in the opening sequence and like the tapirs could just like share the grazing with us, and then like it seems like we've recaptured some sense of the innocence too. The wonder, and we're not so certain that everything we know is true anymore.

[02:24:11] Like we took that to its logical extreme and now we've gone beyond it. You know, I mean, there's a lot made of the accounts of humanity sort of like becoming self aware and this movie might be saying, you know, humans the moment they got the monolith

[02:24:28] that's what unlocked their self-awareness. Like a Garden of Eden story, like eating from the tree of the knowledge of good and evil. It's only then that they noticed that they were naked and they were ashamed. And here it's like, after the monolith, that's when he gets the insight

[02:24:43] that he could use these tools. And so maybe humans just become more self aware. And maybe the monolith had the same effect on HAL, made HAL self aware, maybe even just by mistake, like the monolith was never supposed to interact

[02:25:01] with an AI, and it made the AI self aware, and the AI started to have like violent impulses. Maybe this is all the story of the monolith just cleaning up its mess. Right. And now it presents us like a brand new returned innocence.

[02:25:17] And because, if you want imagery that connotes innocence, a baby is the best you're gonna do, you know? A floating bubble baby. That in one version has set off nuclear weapons. Yeah, just by thinking about them.

[02:25:38] No, it's such a, yeah it's such an odd but also just magnificent way to end the movie. And also just it's like the baby staring right at us. That's what I mean. It's like this invitation of some kind or some sort of like, To be reborn.

[02:25:54] It's imparting something on the viewer at that point. And then like, written and produced by Stanley Kubrick. This is awesome. Yeah, it's really impossible to exaggerate how much the sound design and music do for this movie. It's really like the main character in the movie. It's pretty crystal clear.

[02:26:11] And we haven't even, like, we touched on some of the things. But I mean, the amount of work that it would have taken to put all of this together in such a perfect way. If Stanley Kubrick was a dick,

[02:26:25] I'm happy that that dick existed because he was able to put this together. It was worth it. It was worth it for Shelley Duvall. Like a complete nervous breakdown and trauma fest. Did she really have a nervous breakdown? She was making her... She seemed like she was.

[02:26:42] Well, the thinking is that that's why he was like making her do like 200 takes, until she was just completely a disaster, because he wanted the character to be a disaster. Which is pretty fucked up. And he did that with Tom Cruise and Nicole Kidman too,

[02:27:02] but Tom Cruise especially had some sort of like mental breakdown during the shooting of it at one point. Yeah, so he is... But at the same time... Scientology fixed that. He will collaborate with actors. Like there's a famous story

[02:27:20] that the actor came up with the idea of the lip reading, because what they were doing before wasn't working and the actor thought he was gonna get fired. Then Kubrick shows up and asks him if he wants a drink

[02:27:32] and then like, all right, how are we gonna fix this? And at least according to him, he says he came up with the idea of the lip reading. And there's another story. I think it was that same actor was saying that

[02:27:42] during the filming of one of the rotating space station scenes, he kept... He was like nervous, so he kept tapping his foot, but the taps were actually causing the set to move in a way that was fucking up the filming

[02:27:57] and Kubrick just came and put a towel under his foot so he could keep tapping. Which is really sweet. Yeah, like I think... I thought you were gonna say he velcroed his foot. He gave him pain killers and velcroed his feet.

[02:28:14] Apparently, velcro was used in space for real. Yeah, like in one of our missions, there was some velcro attachment to a glove that was on a ladder or something. Interesting, yeah. It's fun, I like this about Kubrick because he's such a control freak

[02:28:30] and everything has to be so precise, but he's a lot more open. He doesn't go in like Hitchcock or like the Coen Brothers, knowing exactly what he wants. He kind of feels his way there too, and it's only after filming

[02:28:47] that he kind of puts it all together in his unbelievably meticulous and intense way but he's like... The reason he makes them do 100 takes sometimes also is just trying to figure out what will work and he's open to suggestions and a kind of collaboration.

[02:29:04] Yeah, when you think of a movie like this that required such teams for doing like the special effects and cinematography, he was still a control freak in the sense of what he would do, for instance, to get the lighting right.

[02:29:19] And he was a perfectionist in that if it didn't look right he didn't use it hence the 200 to one ratio for shots. Well, the lighting, I saw somewhere that the lighting was really unusual here. So many of those scenes like the hotel scene didn't have normal film lighting.

[02:29:36] It was all being lit by the set itself and it was super bright. And so between takes like the actors are walking around with sunglasses. Yeah, right. Yeah, cause there's a lot of overexposed shots with these bright lights and those lights were hot.

[02:29:50] So like under the floor of the hotel room are these lights that were so hot that they started melting the tiles that were being used. And they had to turn them off in between takes because it was just too much. Poor like aged Dave

[02:30:03] he probably has to spend like the first six hours in makeup and then he's like burning his feet. Yeah, yeah. But I think that's true, that he didn't start out being such a tyrant to actors, but this wasn't the right movie, I think, for him to even be that.

[02:30:21] But is that the story on him, that he was really a very difficult director to work with because of how he treated his actors? It's weird Sam, cause everything I've read is like, you get these like people saying he was great, like he listened to us.

[02:30:33] Like he, you know, and then with some actors, everybody agrees that he does like way too many takes, you know, he's like doing crazy numbers of takes. It's hard. It's a difficult shoot. But then some people, like Shelley Duvall,

[02:30:48] seemed to have a really bad experience, but I don't think most of the actors have bad things to say, other than like, fucking made me do this 200 times. Yeah, that's right. All right. Thank you so much Sam. This was great.

[02:31:01] It wouldn't be a Sam episode without a three hour recording session. Exactly. Well, I've enjoyed your movie reviews from afar as a listener for years now. So happy to be involved in that. We were excited about having you to talk about this. So thanks.

[02:31:16] It seems like a perfect movie for you to get your contribution and input on. Yeah. Nice. Nice. Well, I won't watch it with my daughter. No, yeah. Wait till she's 42. Yeah. She's gonna say it's boring, I guarantee you. She's only 14. This fucking generation,

[02:31:34] you don't know how I had to mold my daughter's taste from like three years old so that she could. Not all of us are the Stanley Kubrick of parents. You're gonna watch this 100 times until you appreciate it. I allow her to collect. Actually, it occurred to me,

[02:31:49] you should publish a list. I want a list from you, but I don't know whether anyone else would... I would think they would. A list of the best movies to see with kids, if you actually want to see grown-up movies with kids.

[02:32:02] Yeah, we might not be the best arbiters on this because we both showed our daughters some pretty fucked up shit. Full Metal Jacket. My daughter was on Tarantino at like nine. But I don't know. Yeah. That's a good idea though. That would be fun to do.

[02:32:19] I just learned recently that my daughter, who's 13 turning 14, broke free and watched Euphoria. I still haven't seen Euphoria. I haven't seen it. Annaka and I were at a dinner party. We're having dinner with a few people, other parents, and one of us blithely mentioned

[02:32:37] that our daughter had watched Euphoria, and the looks of horror on the other grown-ups' faces just told me everything I needed to know about what she had seen. They're so judgy. Parents are so judgy. I know. What do they think the movie is gonna do?

[02:32:52] They're gonna start doing drugs and be like... Having sex with guns held to their head. Yeah, exactly. I had a religious uncle who once told me that Hollywood had an agenda to make us gay by showing us like gay stuff.

[02:33:05] And I remember turning to him and being like, has it ever tempted you? Like I feel like... I feel like if it's tempted you... What movies is he watching? I don't know, My Beautiful Laundrette or something, you know? He's just like, they're too sexy? Are they too sexy?

[02:33:21] Like is this... He's one of those guys who thought that gay marriage would undermine the institution of straight marriage. Yeah, that's true. Because why? It'd be so tempting to be gay married? Is that... They would take all the divorce lawyers. That's the real problem.

[02:33:34] I'm telling you, it's fucked up my marriage. That's pretty true. All right. Thanks so much, Sam. Appreciate it. Great to talk to you. Yeah, great to talk to you. Why, anybody can have a brain! You're a very bad man!

[02:34:24] Oh no, my dear. I'm a very good man. Just a very bad wizard.