Join the conversation on Discord!
June 18, 2024

S6E12 - The Future of Humanity: TESCREAL


Wanna chat about the episode? Or just hang out?

Come join us on discord!


---

AI will probably most likely lead to the end of the world, but in the meantime, there'll be great companies. - Sam Altman


Chris & Kayla talk about the future of humanity, in the long term.

---

*Search Categories*

Anthropological; Science / Pseudoscience; Common interest / Fandom; Internet Culture


---

*Topic Spoiler*

TESCREAL & interview with Dr. Émile P. Torres


---

*Further Reading*

Dr. Émile P. Torres's Website

Dr. Émile P. Torres's Wikipedia

Dr. Émile P. Torres's Twitter

Dr. Torres & Dr. Gebru's joint paper on TESCREAL


---

*Patreon Credits*

Michaela Evans, Heather Aunspach, Alyssa Ottum, David Whiteside, Jade A, amy sarah marshall, Martina Dobson, Eillie Anzilotti, Lewis Brown, Kelly Smith Upton, Wild Hunt Alex, Niklas Brock, Jim Fingal


Jenny Lamb, Matthew Walden, Rebecca Kirsch, Pam Westergard, Ryan Quinn, Paul Sweeney, Erin Bratu, Liz T, Lianne Cole, Samantha Bayliff, Katie Larimer, Fio H, Jessica Senk, Proper Gander, Nancy Carlson, Carly Westergard-Dobson, banana, Megan Blackburn, Instantly Joy, Athena of CaveSystem, John Grelish, Rose Kerchinske, Annika Ramen, Alicia Smith, Kevin, Velm, Dan Malmud, tiny, Dom, Tribe Label - Panda - Austin, Noelle Hoover, Tesa Hamilton, Nicole Carter, Paige, Brian Lancaster, tiny

Transcript
1
00:00:01,560 --> 00:00:34,790
Emile Torres: In the near future, where we go from our current human state to this radically different post human state, in a sense, that's the end of the world. The end of the world as we know it, at least. And that transition itself will be. There's this utopian promise on the other side of it, but we have to cross this field of landmines to get there. And that's sort of like the apocalypse, right? Right. One possible outcome of trying to build this utopian post human future is human extinction.

2
00:00:48,130 --> 00:00:52,618
Chris: All right, let's get started. Welcome to Cult or Just Weird. This is Chris.

3
00:00:52,754 --> 00:00:53,674
Kayla: This is Kayla.

4
00:00:53,762 --> 00:01:00,326
Chris: I forgot to give my credentials. I'm a game designer and a data scientist, even though those credentials are totally worthless for what I'm doing.

5
00:01:00,418 --> 00:01:10,614
Kayla: This is Kayla. I already said that. That's my biggest credential. And then my next credential is that I'm a tv writer. And then my third credential is that I host a podcast.

6
00:01:10,782 --> 00:01:26,006
Chris: Well, welcome to Cult or Just Weird. We'll get right into it here. But before we do that, just a few quick items of business, or maybe just one item of business, unless you have anything. But we have a new Patreon subscriber, so gotta do our shout out. Thank you to Jim Fingal.

7
00:01:26,078 --> 00:01:26,486
Kayla: Hell, yeah.

8
00:01:26,518 --> 00:01:39,562
Chris: For subscribing to our patreon. Hopefully you join us on our discord. You don't need to be in our patreon to join us on our discord. It'll just give you access to exclusive chat rooms in the discord.

9
00:01:39,626 --> 00:01:43,922
Kayla: Our discord is a lot of fun, my friends. There's a lot of really good memes in there.

10
00:01:44,066 --> 00:02:07,188
Chris: It's all memes that I've stolen from Elon Musk because he steals them from everybody else. Yeah. So if you just go to our show notes, there's a link to the discord there. It's also on our website and pretty much everywhere else that we've done anything on the Internet. All right, Kayla, now we need to transition to talking about transhumanism. That got you. Really? I thought that was pretty lame.

11
00:02:07,244 --> 00:02:08,639
Kayla: You're fired. I'm quitting.

12
00:02:10,500 --> 00:02:20,276
Chris: Well, so we're not just talking about transhumanism. Actually, if you've been with us for the last several episodes, what, like five episodes now? I think five or six. Something like that. Really?

13
00:02:20,308 --> 00:02:31,848
Kayla: This whole season, more than we anticipated. This season took some turns. I mean, we're still on theme, which I'm very proud of us for. But we did end up taking some turns that I don't think we were.

14
00:02:31,864 --> 00:02:51,584
Chris: Expecting, well, this is one of those serious rabbit hole situations. This rabbit hole is deep with many chambers and it's confusing and it's hard to know. It's hard to orient yourself or get a framework for understanding any of this stuff. So what do we do when we are confused about something and we want to learn more about it?

15
00:02:51,712 --> 00:02:52,984
Kayla: Well, go to Twitter.

16
00:02:53,072 --> 00:03:23,032
Chris: We go to Twitter. We look at my mom's Facebook posts. Or alternatively, we look for an expert to talk to, which is exactly what we did here. So the next three episodes will be an interview with myself and Doctor Emile Torres. They were kind enough to speak with me about a concept that. Well, we'll get to that. But basically it's an effort to tie a lot of the concepts that we've already talked about in these last few episodes together.

17
00:03:23,186 --> 00:03:26,476
Kayla: There's an umbrella here. I just don't know what that umbrella is.

18
00:03:26,628 --> 00:03:35,596
Chris: It prevents you from getting rained on. Oh, yeah, I'm just cribbing this directly from the Wikipedia article on.

19
00:03:35,668 --> 00:03:39,004
Kayla: That's how you know you've made it, is when you got a Wikipedia.

20
00:03:39,092 --> 00:03:45,700
Chris: I know. Which is like, it's actually not that you've got a Wikipedia, it's that you have maintained a Wikipedia page.

21
00:03:45,740 --> 00:03:48,780
Kayla: They haven't stolen it away from you because they've deemed you unworthy.

22
00:03:48,860 --> 00:04:13,072
Chris: Yeah. So Doctor Emile Torres, who does have a Wikipedia page, that, to my knowledge, has lasted at least a few weeks since I've been doing this research. According to the Wikipedia page, they are an American philosopher, intellectual historian, author and postdoctoral researcher at Case Western Reserve University. Their research focuses on eschatology, existential risk and human extinction.

23
00:04:13,176 --> 00:04:13,920
Kayla: Cool.

24
00:04:14,080 --> 00:04:20,598
Chris: Along with computer scientist Timnit Gebru, Torres coined the acronym. Well, we'll get to that.

25
00:04:20,654 --> 00:04:21,886
Kayla: Like how you keep hiding it.

26
00:04:21,958 --> 00:04:53,226
Chris: I know, I know. It's like, going to be so anticlimactic. Oh, whatever. But that's what we're here to talk about. So anyway, without further ado, here is the first part of my interview with Doctor Emile Torres. I'm super excited to talk to you because. Yeah, I followed you on Twitter and on your other socials for a little while now and I'm sort of like a fan of singularity type stuff. But I also agree with what you're saying. So I've just been really curious to, like, I don't know, help me unpack some of this stuff.

27
00:04:53,378 --> 00:04:54,114
Emile Torres: Yeah, sure.

28
00:04:54,202 --> 00:05:02,910
Chris: So let's just start with just the real basic stuff. If you could introduce yourself for our audience, if you want to talk about your work or even plug anything you're working on right now?

29
00:05:03,290 --> 00:05:38,170
Emile Torres: Sure. I'm Emile P. Torres, and my work over the past decade plus has focused on existential threats to humanity and civilization. And my most recent book was on the history of thinking about human extinction within the western tradition and the ethics of human extinction. More recently, I've done a lot of AI ethics work, including publishing a paper with Doctor Timnit Gebru on what we call the TESCREAL bundle of ideologies, from transhumanism, the t, to long termism, the l, the TESCREAL bundle. So that's who I am. That's what I've been up to.

30
00:05:38,830 --> 00:05:46,530
Chris: That sounds like a fascinating line of work to research human extinction. How did you get into that?

31
00:05:47,640 --> 00:06:11,768
Emile Torres: So I think what piqued my curiosity and whetted my appetite for this topic was probably growing up in a deeply religious community where there was lots of talk about the end of the world and the end of humanity. So there was the expectation that the rapture was going to happen at any point, the Antichrist was going to emerge.

32
00:06:11,864 --> 00:06:14,220
Chris: What religion were you brought up in, if you don't mind my asking?

33
00:06:14,760 --> 00:06:32,752
Emile Torres: I don't mind at all. It was, yes, a very fundamentalist evangelical community. I attended a Baptist church for many years. And so, yeah, there was this expectation that the Rapture was imminent. The 1990s is when the Left Behind series was.

34
00:06:32,896 --> 00:06:34,528
Chris: Oh, right.

35
00:06:34,624 --> 00:07:07,930
Emile Torres: Remember that? I mean, I think the series. I mean, they sold. I can't remember exactly what the numbers were, but, I mean, it was a massive success. Like, people don't realize how many books. I think it sold as many books as were sold in total the previous three decades, something like that. Yeah, it was incredibly successful. Yeah, I'm awful with the details, but, I mean, massively popular. So I was very much sort of caught up in that, and I think that planted the seeds of interest in the future of humanity. And the end of the age where you.

36
00:07:07,970 --> 00:07:11,216
Chris: Sorry, that age where you. You were a believer, too.

37
00:07:11,408 --> 00:07:12,288
Emile Torres: Absolutely, yes.

38
00:07:12,344 --> 00:07:12,824
Chris: Okay.

39
00:07:12,912 --> 00:08:05,546
Emile Torres: Very much so, yes. I mean, I sort of had, you know, some dreams of possibly being a preacher. So, yeah, I was very much a believer. But ultimately, you know, by the early two thousands, I had pretty much left religion. The problem of evil, as philosophers call it, was a main reason. I just couldn't understand how God would allow there to be so much or even any evil in the world if God really was omnibenevolent, perfectly good, and omnipotent, all powerful. So that's what led me out. But nonetheless, what was left behind was a void that took the shape of eschatology, and that is just the study of last things. So it's been a branch of theology since forever.

40
00:08:05,698 --> 00:08:08,756
Chris: Yeah, yeah. Nice callback to left behind there, by the way.

41
00:08:08,908 --> 00:08:54,258
Emile Torres: Yeah, yeah, right. Yeah. So, yeah, I don't know. Then I stumbled upon this. Writings from transhumanists and long termists, although the word long termism hadn't been coined at that point. And they basically were talking about some radical transformation in the near future where we go from our current human state to this, you know, radically different post human state. In a sense, that's the end of the world. The end of the world as we know it, at least. And that transition itself will be. There's this utopian promise on the other side of it, but we have to cross this field of landmines to get there. And that's sort of like the apocalypse.

42
00:08:54,394 --> 00:08:55,130
Chris: Right? Right.

43
00:08:55,170 --> 00:09:36,110
Emile Torres: And one possible outcome of trying to build this utopian post human future is human extinction. So that's how I got into it. And I was very much involved in that transhumanist long termist movement for a decade or more and then became a critic of it. Felt like their whole approach to thinking about the future of humanity is misguided. But still, I mean, you can be interested in human extinction and the long term future of humanity without subscribing to any of these ideologies. So that's sort of where I am now. Anti-transhumanist, anti long termist, but nonetheless super interested in this possibility that our species may cease to exist at some point in the future.

44
00:09:36,270 --> 00:09:39,730
Chris: It seems like maybe mission accomplished on the preacher thing there.

45
00:09:40,030 --> 00:09:40,638
Emile Torres: Yeah, right.

46
00:09:40,694 --> 00:09:41,890
Chris: Just a different topic.

47
00:09:42,270 --> 00:10:04,708
Emile Torres: Yeah, yeah, absolutely. I mean, I talked to a journalist just the other day who asked me if it would be appropriate to refer to some leading figures in the transhumanist and long termist movement as prophets. And it's like, kind of. I mean, you know, they'd object to that term, of course, because they don't want to be associated with, you know, traditional organized religion.

48
00:10:04,884 --> 00:10:06,640
Chris: But Ray Kurzweil, for sure.

49
00:10:07,140 --> 00:10:40,320
Emile Torres: Yeah, exactly. I mean, even down to his specific predictions about when the singularity, sometimes called the techno rapture, will occur: 2045, you know, which is a very specific date. And dates like that for the second coming of Christ or the rapture, I mean, those have been very common throughout the history of religion. You know, the Millerites famously thought 1844, the second coming would happen. Harold Camping, not that long ago, predicted. I think it was 2011, the rapture was going to happen.

50
00:10:40,860 --> 00:10:42,732
Chris: So at least it tracks, honestly.

51
00:10:42,876 --> 00:10:46,120
Emile Torres: Yeah, right. There are a lot of examples. Yeah.

52
00:10:46,760 --> 00:11:09,580
Chris: I do want to get back to your journey, because I also have questions about my own journey with these ideologies. But first, I want to level set a little bit. We've been talking on the show the previous few episodes about transhumanism. I've mentioned extropianism, but I kind of want to level set with you. What is TESCREAL? What does that acronym mean?

53
00:11:09,880 --> 00:11:36,530
Emile Torres: So, the reason that Timnit Gebru and I came up with that term was we were writing a paper trying to understand the ideologies that have driven and shaped the current race to build artificial general intelligence, or AGI. So what are these? Where did this race to build AGI come from? Because ten years ago, maybe even really five years ago, basically, nobody was talking about AGI.

54
00:11:37,510 --> 00:11:39,062
Chris: That's kind of a good point. Yeah.

55
00:11:39,246 --> 00:12:01,458
Emile Torres: Right. So it sort of came out of nowhere. Even ten years ago, a lot of people in the field that we would now consider to be AI wouldn't use the term AI. They were just embarrassed by the long history of unfulfilled promises. So there was this. Okay, we're working on, like, machine learning or natural language processing.

56
00:12:01,554 --> 00:12:18,626
Chris: Yeah, you're right. The language was different. And, I mean, I've. Sorry to interrupt, but machine learning is something that I'm pretty familiar with from my work with big data analysis, and. Yeah, that's all you heard. Like, you know, you never heard AI. You only heard machine learning.

57
00:12:18,778 --> 00:12:57,910
Emile Torres: Yeah. So that's a fascinating transformation of the field that is very recent. And so I don't want to speak for Gebru, but my understanding is that part of the reason that we. She got in touch with me in the first place was because she was trying to understand. Why is it that all of a sudden everybody's talking about AGI, and these companies that have billions and billions of dollars in funding or investments have as their explicit goal the creation of these artificially general intelligent systems. How did that happen? And so our claim in the paper that we just recently published, there are two interpretations of it.

58
00:12:58,410 --> 00:13:32,250
Emile Torres: One is the weak thesis, and that is just the claim that if you want to understand the origins of the race to build AGI, if you want to understand where these companies came from in the first place, a complete explanation requires reference to seven ideologies, and these are the TESCREAL ideologies. So there's transhumanism, extropianism, singularitarianism, cosmism, rationalism, effective altruism, and long termism. I know that it's a mouthful. The big polysyllabic words.

59
00:13:32,370 --> 00:13:35,458
Chris: No, you just gave me the rest of the episodes for the season. So there you go.

60
00:13:35,514 --> 00:14:34,148
Emile Torres: Yeah, great. I mean, without a doubt, like, a book could be written on each of these ideologies. So the weak thesis is just saying, like, if you want to give a complete explanation, you have to reference these ideologies. The strong thesis says that actually. So the strong thesis accepts the weak thesis, but goes beyond it in saying that actually we should conceptualize these ideologies as forming a single cohesive bundle. You know, it's basically just one tradition of thinking that has these various little branches, these variations. And so that is what we defend in the article. Transhumanism is sort of the backbone of the TESCREAL bundle, pretty much. But, I mean, all of the other ideologies grew out of the transhumanist movement, either entirely or at least partly. So extropianism, singularitarianism, and cosmism are just versions of transhumanism.

61
00:14:34,324 --> 00:15:40,330
Emile Torres: Rationalism was founded by a transhumanist, extropian singularitarian who has close connections to cosmism. Effective altruism came out of the transhumanist movement and was developed within the rationalist blogosphere, on the blogs run by rationalists. And long termism is really just a kind of ethical interpretation of the vision that is at the heart of the worldviews of a lot of transhumanists. We go out, colonize space, we create this sprawling, huge, multi galactic civilization. It's very similar to cosmism in terms of its vision of the future. So that's the strong thesis. And I think the evidence for the weak thesis is unambiguous and overwhelming. Those ideologies played an integral role in the race to build AGI. And I also think the evidence for the strong thesis is extremely strong. That's my contention, at least.

62
00:15:40,410 --> 00:15:57,340
Chris: Right? Yeah, it's interesting. You're kind of getting into my follow up question. I was gonna ask why you guys coined this term. Like, does it convey something important that simply saying transhumanist or singularitarian doesn't convey by themselves?

63
00:15:58,040 --> 00:16:27,620
Emile Torres: Yeah, yeah. Great question. So I would say, like, part of the importance of the TESCREAL concept is the following. I think a lot of companies within our capitalist system, their behavior can be explained fully and completely simply by referencing the profit motive. Right. So why are the fossil fuel companies destroying the world? Well, they're just trying to maximize profit.

64
00:16:27,740 --> 00:16:30,044
Chris: Maximize shareholder value, line go up.

65
00:16:30,172 --> 00:17:27,287
Emile Torres: Yeah. Their only obligation is to their shareholders, not to future generations or, you know, whatever. So part of the claim then is that the AI companies are a weird anomaly. They're not like other companies. Without a doubt, there are larger companies that are pouring money into these AI companies, like Microsoft and Google, Amazon and so on. And why are they doing that? Because they expect to make billions in profits. But the founding of these AI companies in the first place really didn't have that much to do with profit. It had to do with this techno utopian vision that lies at the very heart of the TESCREAL bundle and has been given slightly different interpretations by some of these different ideologies. Singularitarianism emphasizes the singularity a bit more than extropianism or cosmism.

66
00:17:27,463 --> 00:18:34,142
Emile Torres: But really, there's just tremendous overlap in terms of the communities that have coalesced around each of these ideologies. In terms of the value commitments, the intellectual tendencies, and the visions of the future, these ideologies overlap significantly. And then historically, they just grew out of each other. Extropianism was the first organized modern transhumanist movement. It gave birth to singularitarianism and cosmism. And then, as I mentioned before, it was out of this early TESCREAL movement that you get rationalism, effective altruism, then ultimately long termism. So the work that this concept of TESCREALism is doing is trying to provide an explanatory framework for understanding these phenomena. And, you know, if you ask philosophers, they'll say that one sort of theory about what makes an explanation in science, for example, a good explanation is its capacity to unify a bunch of disparate phenomena, right?

67
00:18:34,206 --> 00:19:09,410
Emile Torres: So, like, evolutionary theory does this. It's like you've got all this evidence from the fossil record, you've got biogeographical data, you know, some species over here. And evolutionary theory just brings it all together, explains why you see what you see. So I think that one of the virtues of the TESCREAL framework or explanation is that it does precisely this. You might think, well, you know, like, what exactly does long termism have to do with transhumanism or cosmism? The TESCREAL framework aims to explain that and thereby bring together these various phenomena into one single coherent framework.

68
00:19:09,530 --> 00:19:37,346
Chris: Yeah, if it catches on when I use it, just talking with my co host, it definitely reminds me that there are multiple ideologies that we're talking about. It reminds me that they're tied together, and it reminds me that there's a history, like you said, of one growing out of the other. So it kind of puts that by saying that the one word to encompass all of it. It puts that all at the forefront. So that makes sense.

69
00:19:37,538 --> 00:20:02,650
Emile Torres: Absolutely. That is one of the aims with this concept is to foreground these particular ideologies, to make them explicit, to name them in a way that they weren't before. So, for example, there's a cultural critic named Douglas Rushkoff, and he published this fantastic book, maybe in 2022, I can't remember, but quite recently, called Survival of the Richest.

70
00:20:02,810 --> 00:20:03,786
Chris: Oh, I've heard of that book.

71
00:20:03,818 --> 00:20:16,554
Emile Torres: Yeah, yeah, it's really good. So he follows a bunch of, like, tech billionaires around and asks them questions about their views of the likelihood that civilization will collapse. And they. All right, yeah.

72
00:20:16,682 --> 00:20:22,122
Chris: This is the one where he talks to everyone about their, like, mega bunkers in New Zealand.

73
00:20:22,266 --> 00:20:22,838
Emile Torres: Yes.

74
00:20:22,954 --> 00:20:37,246
Chris: Which is so weird, by the way. Sorry. The whole, like, oh, man, the world's going to end, and we can't. We, the most powerful people on earth, can't do anything about it. So let's. Let's have mega bunkers in New Zealand. Oh, my God. Sorry. Yeah, go ahead.

75
00:20:37,438 --> 00:20:53,060
Emile Torres: No, I highly recommend it. It is very amusing. And he is extremely critical. And, you know, I think he had extraordinary access to all of these really powerful, super richest individuals. And he completely burned all those bridges, I think by intention.

76
00:20:53,400 --> 00:20:54,264
Chris: Right? Yeah.

77
00:20:54,392 --> 00:21:05,248
Emile Torres: Basically kind of mocking them because they just, you know, like, hard not to. Right? Yeah, yeah. He has some questions, like, well, you know. Okay, so you could hire these security guards to keep your bunker.

78
00:21:05,304 --> 00:21:09,952
Chris: Oh, he was the security guard guy. Yep, I remember that. Please tell a story. I love that story.

79
00:21:10,056 --> 00:21:30,816
Emile Torres: Yeah. I don't remember the exact details, which are juicy and very amusing, but it was something like, well, when civilization collapses and money is worthless, how exactly are you going to prevent them from just killing you? And he has stories of these billionaires going like, oh, shit, I hadn't really thought about that.

80
00:21:30,928 --> 00:21:37,020
Chris: Right. Yeah, well, you can't, sir. Have you tried being friends with them?

81
00:21:40,320 --> 00:22:27,864
Emile Torres: So, ultimately, the reason I mentioned this is a lot of the billionaires that he talks to have this vision of the future. Like, okay, maybe civilization could collapse, but we're going to survive the apocalypse. And ultimately, their vision of the future is that they will digitize their brains, live in, you know, live on the cloud in some computer simulation, and that is the future. And so Rushkoff calls this The Mindset, with a capital T and a capital M. And basically it's more or less synonymous with TESCREAL. And so I think one of the virtues of TESCREALism, and I suspect I don't want to speak for Rushkoff, but I suspect he would agree with this. One of the virtues is that it foregrounds, like I said, it foregrounds these ideologies. What is The Mindset?

82
00:22:27,952 --> 00:22:48,960
Emile Torres: The Mindset is the vision built on transhumanism and a kind of cosmist or long termist view of the longer term future of humanity whereby we re-engineer ourselves, we upload our minds to computers, we spread into space and so on. So, yeah, I think that's partly why, insofar as the concept is valuable, I think that's one reason.

83
00:22:51,020 --> 00:22:55,468
Chris: So does that explain it all, Kayla? Pretty much. Do you understand all of the. Everything now?

84
00:22:55,644 --> 00:22:58,164
Kayla: All of the everything. Everything everywhere all at once.

85
00:22:58,212 --> 00:22:59,828
Chris: All of it, yeah.

86
00:22:59,964 --> 00:23:01,396
Kayla: Thanks for listening to Cult or Just Weird.

87
00:23:01,428 --> 00:23:03,188
Chris: We did it. We explained everything.

88
00:23:03,284 --> 00:23:07,982
Kayla: I think that was the appetizer course.

89
00:23:08,126 --> 00:23:08,494
Chris: Yeah.

90
00:23:08,542 --> 00:23:09,798
Kayla: So there's two more.

91
00:23:09,894 --> 00:23:11,294
Chris: Yeah. Two more episodes coming.

92
00:23:11,422 --> 00:23:14,974
Kayla: An entree and then a dessert. Is that what's coming?

93
00:23:15,102 --> 00:23:16,342
Chris: As long as there's not a requirement.

94
00:23:16,366 --> 00:23:17,718
Kayla: Or is it more like tapas?

95
00:23:17,894 --> 00:23:23,214
Chris: I think it's more like tapas. Cause I'm just concerned that the final episode may not be sweet, it may be more savory.

96
00:23:23,342 --> 00:23:27,878
Kayla: So this is the dim sum or sweet and sour episode run. Okay. Got it.

97
00:23:27,894 --> 00:23:36,180
Chris: Well, cuz, you know, we used to do these episodes where it was like, here's a two hour long interview, and then it was like us talking for 2 hours. So we're trying to break it up into smaller chunks.

98
00:23:36,600 --> 00:23:40,424
Kayla: Well, I took. I took a lot of notes while we were listening because there's a lot of. There's a lot of things that.

99
00:23:40,472 --> 00:23:41,144
Chris: You took notes.

100
00:23:41,192 --> 00:23:42,816
Kayla: Of course you did too. Don't hate.

101
00:23:42,848 --> 00:23:43,512
Chris: Yeah, it's horrible.

102
00:23:43,536 --> 00:23:44,408
Kayla: You took more than me.

103
00:23:44,504 --> 00:23:46,736
Chris: I took more notes than I ever did in school.

104
00:23:46,808 --> 00:23:59,458
Kayla: Oh, my God. Like, combined, there were a lot of things that the two of you talked about that I just wanted to be like, ooh. So I'll just mention some of those things. I don't necessarily have a lot to add. I have a lot to talk about when it comes to survival of the richest. But, you know, we'll get to that.

105
00:23:59,514 --> 00:24:00,234
Chris: A lot of takes.

106
00:24:00,282 --> 00:24:12,306
Kayla: A lot of takes to give. I just think it's interesting how we've drawn the parallel before in this season, but how much of a parallel there is between millenary. I can't ever say the word millenarian.

107
00:24:12,418 --> 00:24:13,042
Chris: Millenarian.

108
00:24:13,106 --> 00:24:15,010
Kayla: Millenarian religious beliefs.

109
00:24:15,090 --> 00:24:16,386
Chris: Millie Brownian.

110
00:24:16,458 --> 00:24:24,692
Kayla: Thank you. Millenarian religious beliefs and transhumanist singularity. Long termist future beliefs.

111
00:24:24,756 --> 00:24:36,932
Chris: Yeah. And like, we've seen it in other areas, too, and in other groups we've covered on the show to the point where I'm just kind of like. I think, like, millenarianism is just something that. It's like an archetype, you know?

112
00:24:36,956 --> 00:24:50,372
Kayla: I think there's some sort of fundamental human thing. Exactly like furries, like how we decided there was something fundamentally human about furry fandom. I think furry fandom and millenarian beliefs, like those are really two things that.

113
00:24:50,396 --> 00:25:04,028
Chris: Define you as a human being. I agree. You know, we mentioned the Left Behind books and how that kind of, like, planted some of those seeds in eschatology. I just want to mention a couple things. First of all, there was an RTS made by the Left Behind guys.

114
00:25:04,084 --> 00:25:05,476
Kayla: That's a type of video game.

115
00:25:05,588 --> 00:25:09,948
Chris: Yeah. I think it was basically just Command & Conquer. But then, like, you could also use God powers or something.

116
00:25:10,004 --> 00:25:15,004
Kayla: Did we say Left Behind? The conceit of Left Behind is that it's post. It's the world post rapture.

117
00:25:15,052 --> 00:25:15,492
Chris: Yeah, yeah.

118
00:25:15,516 --> 00:25:18,414
Kayla: So the rapture happens. Those that have been left behind.

119
00:25:18,542 --> 00:25:36,846
Chris: Right. And there's a million books. I tried because it was just, like, ridiculously popular. I tried reading the first one and I just couldn't do it. Was it dog shit? I don't like if anybody's out there listening. It was not for me. If you like the series, I'm not trying to shit on you or anything, but it just. I could not. I didn't even get past the first chapter.

120
00:25:36,918 --> 00:26:06,184
Kayla: I think kind of staying on this, like, talking about the religious aspect and the faith aspect. The two of you talked about prophets in TESCREAL and whether or not prophets exist in TESCREAL. And you mentioned Ray Kurzweil and just. I think we could name a lot of people. I think that Max More could be granted prophet status. I think those would definitely be exonyms. I doubt that folks within the community would be interested in having those labels applied to themselves.

121
00:26:06,232 --> 00:26:08,620
Chris: Depends on how tongue in cheek they want to be about themselves.

122
00:26:09,140 --> 00:26:26,268
Kayla: There's a future episode coming up after these interviews where they use that term. I don't want to spoil too much of the episode, but literally, futurists, long termists, singularitarians that use the term prophet to describe the futurist thinkers of the past.

123
00:26:26,364 --> 00:26:37,888
Chris: Yeah, I mean, it can be pejorative, but at the same time, futurist and prophet are both just. They're two people that are trying to make predictions about the future. They just get their information from different sources.

124
00:26:38,024 --> 00:26:44,232
Kayla: Allegedly in one reality, Roko of Roko's basilisk is simply a prophet. It's simply Roko's prophet.

125
00:26:44,336 --> 00:26:44,840
Chris: He's a prophet.

126
00:26:44,880 --> 00:26:46,448
Kayla: Simply Basilisk's prophet.

127
00:26:46,544 --> 00:26:53,784
Chris: Right. He's more of, like, a herald of Roko's Basilisk. He's the basilisk's herald. It's like Galactus.

128
00:26:53,912 --> 00:26:56,144
Kayla: H a r o l d. Harold.

129
00:26:56,192 --> 00:27:10,650
Chris: Harold. Yeah. Harold. Emile also mentioned how their first chink in the armor of religion was theodicy. It was the problem of evil. And I just thought that was like, oh, yeah, that's everyone's, right? Like, isn't that everyone?

130
00:27:10,770 --> 00:27:46,506
Kayla: This is, again, why I say I want to have, like. And I've done no work to make this happen, but I say, like, oh, I want to have, like, a religious academic or a religious scholar or, like, a priest. Like, somebody who's very learned in the, like, academics of faith and not, like, in a. Not in a way that Emile Torres is. But I mean, like, somebody who, like, is within the church. To fucking talk to me about that, because, come on. I need to. I. If this is such a. If this is the turning point for so many people, there has to be a lot of philosophical thought about it.

131
00:27:46,578 --> 00:27:53,218
Chris: If this were a business, there would be a lot of research done on. This is what's causing people to unsub from our app.

132
00:27:53,274 --> 00:27:53,914
Kayla: Yeah.

133
00:27:54,082 --> 00:28:16,014
Chris: Yeah. Overall, I think, you know, the. Obviously, the takeaway from. From this episode and really, the reason that I had Emile on the show, the reason that I talked to Emile was because of this whole TESCREAL thing, because the ideologies that you and I have been sort of like, rabbit holing down are so. There's, like, so many of them, and they're, like, all same but all different.

134
00:28:16,102 --> 00:28:16,518
Kayla: Right.

135
00:28:16,614 --> 00:29:03,706
Chris: And so I was, like. I was getting overwhelmed. And I just, like, the. The framework of TESCREAL helps me think about this stuff easier. And, like they were saying, it forefronts the. It reminds you that when you're talking about one thing, you're also kind of talking about the other. Right. Like, it reminds you that when you're talking about so and so, you know, Elon Musk and his SpaceX cosmism, you know, he's also got this, like, effective altruism slash singularitarian thing, too. So. And also that bit about, quote, unquote, The Mindset. I think that's. And we'll get into more of that in future episodes. But just how influential this stuff is the other big, like, why I wanted to talk to Doctor Torres, right.

136
00:29:03,778 --> 00:29:18,212
Chris: Because if this was just, like, four people in Nebraska that were just like, we have a club to talk about robots. Who cares? But this is the most powerful people in the world that are into The Mindset that they're talking about. It's the Silicon Valley ruling class.

137
00:29:18,396 --> 00:29:54,634
Kayla: And I think that's maybe the thing that is the fundamental human component is this just like, the common ground between that Silicon Valley mindset that begets TESCREAL and millenarian religious beliefs is, I think, that something that sets us apart as human beings from other creatures, other animals, is that we're not just thinking about our own mortality. We're cursed with thinking about and considering the extinction of our species, which is kind of inevitable, like, in the grand scheme of things.

138
00:29:54,802 --> 00:29:56,550
Chris: No, it's not kind of inevitable.

139
00:29:57,290 --> 00:30:00,234
Kayla: I'm softening the blow. It's an inevitability, y'all.

140
00:30:00,282 --> 00:30:01,586
Chris: Everyone is gonna eventually die.

141
00:30:01,658 --> 00:30:26,552
Kayla: Everyone, including the entire species, will be no more. And so I think that maybe that's the fundamental human condition at the core here. And, like, of course, that's terrifying. And of. I don't really blame anybody for trying to come up with more, quote, unquote, science minded approaches to that feeling, that fear, doing something about it.

142
00:30:26,616 --> 00:30:30,456
Chris: Would you say they are using those approaches to manage their terror?

143
00:30:30,528 --> 00:30:33,760
Kayla: I totally forgot about terror management theory, which is underpinning.

144
00:30:33,800 --> 00:30:35,048
Chris: That was only a few episodes ago.

145
00:30:35,104 --> 00:30:49,548
Kayla: I know. I got a lot on my mind. Yeah. I think that terror management. And I don't ever want to. I don't ever want terror management to take away from why somebody is doing something.

146
00:30:49,644 --> 00:30:50,280
Chris: Sure.

147
00:30:51,580 --> 00:30:53,540
Kayla: But I think it's very present here.

148
00:30:53,700 --> 00:31:12,918
Chris: Right. I agree. I think that it's not a coincidence that we wound up here talking about this stuff. Right. We're like, hey, let's do a season where we kind of talk about death and overcoming death and, oh, cryonics. It's really quite a natural progression to where we are when that's what you start with.

149
00:31:13,054 --> 00:31:18,790
Kayla: Right. I mean, look, two of the folks we've talked to this season already were brought up in Baptist households, right?

150
00:31:18,910 --> 00:31:19,446
Chris: Yeah.

151
00:31:19,558 --> 00:31:21,290
Kayla: Specifically Baptist households.

152
00:31:22,070 --> 00:31:28,430
Chris: So Emile mentioned. I think they mentioned all of the different names of the things in the acronym.

153
00:31:28,550 --> 00:31:29,290
Kayla: Yes.

154
00:31:29,670 --> 00:31:32,966
Chris: I'll just go over it again one more time just for, like, a.

155
00:31:33,078 --> 00:31:33,572
Kayla: Please.

156
00:31:33,686 --> 00:31:40,752
Chris: It's good to repeat stuff. Right? All right, so TESCREAL. So, luckily, it starts with transhumanism, because that's what we started with.

157
00:31:40,816 --> 00:31:41,472
Kayla: Right.

158
00:31:41,656 --> 00:32:36,700
Chris: And then the e is extropianism. We mentioned that on the show a few episodes ago, but you were kind of actually just talking about this. You were talking about, like, human lifespan extinction and also species extinction. And extropians are kind of like, we're anti that. It was coined to sort of be like an opposite to entropy, right? If entropy is going to kill everyone eventually, then can we reverse that? Can we prevent it? And also an opposite to, an alternative to, utopia, because of all the negative things that utopianism carries with it. So that's the e in TESCREAL. The s is singularitarianism. The singularity. Singularitarians, people that are into the techno rapture. That's basically what that is. That's like, we're all going to upload our minds and there's going to be a super AI that enables us to do cool stuff like that.

159
00:32:37,400 --> 00:32:39,152
Chris: C is cosmism.

160
00:32:39,336 --> 00:32:40,712
Kayla: That's one I don't quite understand.

161
00:32:40,776 --> 00:33:12,206
Chris: Yeah, cosmism is, at least in my estimation, it's kind of like the black sheep of the acronym. Not that it doesn't belong, but it's kind of harder to fit that puzzle piece in. But don't you worry, Kayla. We are going to do that. We're going to jam that puzzle piece in there for everyone. We'll get to that. But anyway, just think of it as like, our destiny is to colonize space and then, like, apply other parts of TESCREAL to it. So our destiny is to colonize space and, you know, with robots.

162
00:33:12,398 --> 00:33:13,170
Kayla: Cool.

163
00:33:13,710 --> 00:33:45,424
Chris: The r is rationalism. We just filled the airwaves with three episodes worth of rationalism when we talked about LessWrong and the rationalist diaspora. The e and the a are both effective altruism. We'll get to that as well. And the l is long termism. Long termism is super weird, you guys. I think we'll probably get to that too in a future episode. But I just wanted to say all the words in the acronym so that we can kind of level set. So are you ready to go live in a mega bunker in New Zealand then?

164
00:33:45,472 --> 00:34:02,524
Kayla: No, because I don't think we should even talk about it because we're trying to keep these episodes digestible. And I could talk about that for a long time. And also, like, where are my stories about that kind of dystopian setting? I want like a Sci-Fi book about.

165
00:34:02,572 --> 00:34:05,500
Chris: What happens, like, when the billionaires actually are the only ones.

166
00:34:05,540 --> 00:34:13,219
Kayla: The billionaires are in the bunkers and their people turn on them. Yeah, that would be great. Where is it? Somebody who writes books, do it.

167
00:34:13,340 --> 00:34:17,956
Chris: If only we had a tv writer that could write a television show about this thing.

168
00:34:17,987 --> 00:34:54,608
Kayla: I got too many projects already, none of them paid. Somebody, hire me. I think that it's one of the. We were talking a little bit as we were supposed to be listening to the interview and waiting to talk on the air, but we didn't wait. Cause we're really bad at it. Of just how, like, I don't even know what the correct word is, just how impressive it is that the billionaires of the world, the most powerful people on planet Earth, which could mean the most powerful people in the universe, the most powerful in the solar system, all.

169
00:34:54,623 --> 00:34:57,139
Chris: Those people in the solar system cannot.

170
00:34:57,840 --> 00:35:17,122
Kayla: Solve the problems of climate change, of disaster, of, like, oh, no, human extinction is upon us. Cannot solve that problem in any way beyond. I gotta get up in a bunker and, like, figure out how to, like, not be killed by my own security. Like, that's impressive.

171
00:35:17,226 --> 00:35:26,310
Chris: I remember reading that. So I think that the book Survival of the Richest was actually based on an article. And I want to say Slate or something.

172
00:35:26,650 --> 00:35:28,274
Kayla: I feel like you and I read this article.

173
00:35:28,402 --> 00:35:54,646
Chris: I think everyone read this article because I remember talking about it. It was like, water cooler topic for a hot minute there. But, yeah, it's crazy. This guy went and talked to the world's richest people, the world's most powerful people, and they were like, we need your advice, man. Like, how do we keep our security guards from turning on us when money isn't a thing, when the current money system goes away? And it was kind of like, you can't. What are you talking about?

174
00:35:54,758 --> 00:35:59,966
Kayla: You could try and stop that from happening with your extreme wealth.

175
00:36:00,118 --> 00:36:06,086
Chris: Yeah. And this also goes back to what you and I were discussing while we were listening to the interview, as I'm.

176
00:36:06,118 --> 00:36:08,556
Kayla: Lecturing all the billionaires listening to our show.

177
00:36:08,678 --> 00:36:14,432
Chris: Elon, listen here. If you just give this, if you donate to Cult or Just Weird, you can avert this whole problem.

178
00:36:14,496 --> 00:36:15,300
Kayla: There we go.

179
00:36:15,880 --> 00:37:05,104
Chris: So we were talking about how it's interesting that they feel sort of almost trapped by the same system that we feel trapped by. And I think that the problem is that just that it's one of those passing through the eye of the needle things where the only way to do it is to drop the thing that is precious to you. So, like, for these elite billionaires, right? It's like, hey, how do we prevent the world from collapsing? Is there a way? I'll do anything, right? And then the answer is like, yeah, share. And they're like, oh, is that the only way? Yeah, I'm afraid so. And the consequences. What? You said it was extinction. Oh, that's a tough call. I don't know, man. I don't think I can do that.

180
00:37:05,152 --> 00:37:35,326
Kayla: I actually feel like that weirdly does and doesn't feel TESCREAL aligned. Like, how is that aligned with extropianism? Like, how is that aligned with the tech utopia ideals? It's not. But I'm not surprised that mentality is part of this mindset. But it doesn't really align with what is preached by, what I seem to understand as being preached by, the greater community here.

181
00:37:35,398 --> 00:38:18,536
Chris: Yeah, I agree. This is going to be a recurrent theme. And this is one thing that we did talk about in the Humanity Plus episode, was that there does seem to be at least one large fracture within the sort of TESCREAL type community. And also, by the way, Doctor Torres themselves say this. Not everyone who is a transhumanist, for example, or not everyone who is a cosmist would they classify as a TESCREAList. So it's not like a one size fits all. And they're very clear about that. I've seen them do other talks where they're very clear about how diverse the TESCREAL, transhumanist, blah. I won't say all the words, how diverse that community is.

182
00:38:18,728 --> 00:38:30,486
Chris: And I think that's something we're going to keep coming back to, because it does feel like there is at least one major fracture between the sort of like, libertarian Peter Thiel Elon Musk, like, ultra wealthy set.

183
00:38:30,638 --> 00:38:31,262
Kayla: Right?

184
00:38:31,406 --> 00:38:40,430
Chris: And then people who were like, I think it would be cool if there were robots and I could upload my mind to San Junipero. That'd be neat.

185
00:38:40,550 --> 00:38:44,350
Kayla: Oh, San Junipero. I forgot about that. That was nice.

186
00:38:44,510 --> 00:38:45,742
Chris: That's a reference for.

187
00:38:45,846 --> 00:38:48,350
Kayla: That's the one nice episode of Black.

188
00:38:48,390 --> 00:39:15,792
Chris: Mirror where people uploaded their minds to, like, a. It's like a sick utopia. Anyway, just to answer your question. I think that there is kind of a fracture there, right? And I think that's a theme that is just a recurrent theme with this whole topic. Next time on Cult or Just Weird. How are all these poor billionaires gonna save themselves? When the shit hits the fan, the screen.

189
00:39:15,816 --> 00:39:22,630
Kayla: I like it. I didn't know if I'm gonna say anything else. The way for them to do that is to, like, share and subscribe.

190
00:39:22,970 --> 00:39:24,130
Chris: That's how you save the world.

191
00:39:24,170 --> 00:39:38,938
Kayla: If you want to save the world, come hang out on our discord. The link is in the show notes. If you want to save the world even harder, you can go to patreon.com/cultorjustweird. Get access to bonus content and behind the scenes stuff. And, you know, just keep listening and we'll see you next time.

192
00:39:38,994 --> 00:39:43,178
Chris: Kayla, please, okay? We're not trying to save the world. We're trying to save the billionaires.

193
00:39:43,274 --> 00:39:44,490
Kayla: We're trying to save ourselves.

194
00:39:44,610 --> 00:39:49,030
Chris: When the billionaires are what's important, everyone else can die.

195
00:39:49,410 --> 00:39:51,786
Kayla: That's what they think. And that's why I hate them.

196
00:39:51,898 --> 00:39:57,950
Chris: This is Chris, this is Kayla, and this has been Cult or Just TESCREAL?

197
00:40:14,300 --> 00:40:17,160
Kayla: Subscribe.

198
00:40:19,420 --> 00:40:19,740
Emile Torres: Our channel.


Dr. Émile P. Torres

Author / Eschatology expert

Émile P. Torres is a philosopher and historian whose work focuses on existential threats to civilization and humanity. They have published on a wide range of topics, including machine superintelligence, emerging technologies, and religious eschatology, as well as the history and ethics of human extinction. Their most recent book is Human Extinction: A History of the Science and Ethics of Annihilation (Routledge).