Join the conversation on Discord!
Sept. 3, 2024

S6E23 - The Mindset


Wanna chat about the episode? Or just hang out?

Come join us on Discord!

 

---

The future is there… looking back at us. Trying to make sense of the fiction we will have become.

-William Gibson

 

Chris & Kayla finally land the plane on this whole TESCREAL situation.

---

*Search Categories*

anthropological; internet culture; science/pseudoscience

---

*Topic Spoiler*

TESCREAL "is it a cult or just weird?" evaluation & discussion

---

*Further Reading*

https://en.wikipedia.org/wiki/TESCREAL#:~:text=TESCREAL%20is%20an%20acronym%20neologism,effective%20altruism%2C%20and%20longtermism%22.

https://firstmonday.org/ojs/index.php/fm/article/view/13636

---

*Patreon Credits*

Michaela Evans, Heather Aunspach, Alyssa Ottum, David Whiteside, Jade A, amy sarah marshall, Martina Dobson, Eillie Anzilotti, Lewis Brown, Kelly Smith Upton, Wild Hunt Alex, Niklas Brock, Jim Fingal

---

Jenny Lamb, Matthew Walden, Rebecca Kirsch, Pam Westergard, Ryan Quinn, Paul Sweeney, Erin Bratu, Liz T, Lianne Cole, Samantha Bayliff, Katie Larimer, Fio H, Jessica Senk, Proper Gander, Nancy Carlson, Carly Westergard-Dobson, banana, Megan Blackburn, Instantly Joy, Athena of CaveSystem, John Grelish, Rose Kerchinske, Annika Ramen, Alicia Smith, Kevin, Velm, Dan Malmud, tiny, Dom, Tribe Label - Panda - Austin, Noelle Hoover, Tesa Hamilton, Nicole Carter, Paige, Brian Lancaster, tiny, GD

Transcript
1
00:00:24,560 --> 00:00:27,290
Chris: Welcome to Cult or Just Weird. I'm Chris.

2
00:00:28,220 --> 00:00:29,880
Kayla: Are you confused about it?

3
00:00:30,860 --> 00:00:32,828
Chris: No, I just don't know how to start.

4
00:00:33,004 --> 00:00:34,420
Kayla: I can start. You want me to start it?

5
00:00:34,460 --> 00:00:35,124
Chris: Yeah, go ahead.

6
00:00:35,212 --> 00:00:37,452
Kayla: I'll just start it with the fucking intro regular.

7
00:00:37,516 --> 00:00:40,652
Chris: Oh, you just. Well, okay. Well, in that case, welcome to Cult or Just Weird. I'm Chris.

8
00:00:40,836 --> 00:00:48,956
Kayla: Say a little better. Say it like this. Say it like a podcast. Hey, guys, welcome to.

9
00:00:49,068 --> 00:00:50,924
Chris: Oh, no, that's good. That's good. Keep doing it.

10
00:00:51,012 --> 00:00:51,564
Kayla: Terrible.

11
00:00:51,652 --> 00:01:01,900
Chris: No, that's very TikTokable. That's so TikTokable. Hey, guys, welcome to Cult or Just Weird. I'm Chris. I'm here showing you some food videos that are way too rapid.

12
00:01:02,280 --> 00:01:04,379
Kayla: And I'm Kayla. I'm a tv writer.

13
00:01:05,239 --> 00:01:08,088
Chris: Your energy was not up to par. Okay.

14
00:01:08,184 --> 00:01:15,656
Kayla: Welcome to Cult or Just Weird, the podcast where we talk about cults and weirds. That's Chris. He is a data scientist and a game designer.

15
00:01:15,808 --> 00:01:18,328
Chris: And that's Kayla. She is a tv writer.

16
00:01:18,464 --> 00:01:24,768
Kayla: And welcome to the season finale for season six of Cult or Just Weird.

17
00:01:24,904 --> 00:01:25,660
Chris: Yay.

18
00:01:26,530 --> 00:01:50,150
Kayla: Little bit of a surprise, little bit of a curve ball we are throwing at you there. But as we've been doing these episodes on the TESCREAL bundle, which we'll define for anybody who might be tuning in to our show for the first time, and it happens to be the finale, we are at the point where we are to evaluate TESCREAL based on our. Is it a cult or just weird criteria? And that kind of felt like a good stopping point for the season.

19
00:01:50,490 --> 00:02:24,938
Chris: Yeah, I don't think that was where we necessarily saw the season going back in, you know, December or January, but that's definitely where the. That's definitely where the story took us, is down this sort of, like, transhumanist extra. So we'll just define it now. It's an acronym standing for transhumanism, extropianism, Scientology. What's the. Oh, no. Singular. I'm going to keep that one. Singletarianism. Cosmism. I'm, like, sitting here having to remember this.

20
00:02:24,994 --> 00:02:26,386
Kayla: I'm having to remember how to spell.

21
00:02:26,498 --> 00:02:35,802
Chris: Yeah. Rationalism. That's T, E, S, C, R. Then EA is effective altruism. That's one thing. And then the L is long term.

22
00:02:35,906 --> 00:02:38,658
Kayla: Ism, basically what the Silicon Valley guys are into.

23
00:02:38,754 --> 00:02:44,058
Chris: It's that. And that's kind of the thing. That's why we ended up kind of thinking, like, oh, maybe this is important, actually.

24
00:02:44,154 --> 00:02:44,870
Kayla: Yeah.

25
00:02:45,170 --> 00:02:59,940
Chris: Anyway, yeah. So we're here to evaluate whether TESCREAL is a cult or not. This is definitely the longest we've gone between using our criteria. It's like our gimmick. And we didn't do it almost at all. What did we do it for this season? Do you remember?

26
00:03:00,560 --> 00:03:02,032
Kayla: We must have done it for cryonics.

27
00:03:02,096 --> 00:03:05,180
Chris: We did it for cryonics. I don't remember what we decided.

28
00:03:05,760 --> 00:03:07,656
Kayla: Not a cult. We decided it was not a cult.

29
00:03:07,688 --> 00:03:10,000
Chris: That sounds right. And then I think we also did it for Less.

30
00:03:10,040 --> 00:03:11,872
Kayla: Wrong, probably.

31
00:03:12,056 --> 00:03:17,950
Chris: And I think we also decided. No, that one. We said it was a cult, but that's okay.

32
00:03:18,530 --> 00:03:20,674
Kayla: And we also did it with the Church of Perpetual Life.

33
00:03:20,802 --> 00:03:37,530
Chris: Oh, yes, we did. You're right. You're right. Okay, so now is when we're gonna just bring everything together and talk about TESCREAL as a whole. Before we do that, though, I think we have just a couple little admin things to get out of the way just to tell you guys about. You're the one with the notes, so.

34
00:03:37,610 --> 00:04:00,400
Kayla: Oh, I didn't know what you were doing. He was pointing his little fingers at me. I just wanna tell you guys about how kind of the rest of the year is going to play out for Cult or Just Weird. So, like we said, this is the official season finale, but there are a few more stories that we want to tell that didn't fit neatly into the season. Things that are non-TESCREAL related, and then some side journeys, some avenues we want to go down that are related to TESCREAL.

35
00:04:00,440 --> 00:04:09,146
Chris: So there's one thing that I want to talk about that is, like, maybe could be like, a lowercase letter that is in the acronym that I think should be there.

36
00:04:09,288 --> 00:04:41,010
Kayla: So what we're going to be doing is we're going to be releasing these stories as we get to them kind of through the end of the year, not on a regular schedule. But what we do want to do is make sure they're available to our Patreon patrons first, and then they will publish on the main feed after that. And then, of course, we will also be catching up on our Patreon exclusive bonus episodes. So if you are a patron and you're wondering where those are, they're coming. And if you are not yet a patron, go to patreon.com/cultorjustweird so you can make sure you don't miss any Cult or Just Weird content.

37
00:04:41,480 --> 00:04:50,960
Chris: And if you are not chatting with us on Discord, the link for that is in the show notes. You should be chatting with us on Discord because we talk about all the stuff that we talk about on the show.

38
00:04:51,040 --> 00:04:56,944
Kayla: There are some great conversations happening in there, and they are the things we talk about here.

39
00:04:57,112 --> 00:04:58,544
Chris: Yeah. I'm bleeding right now.

40
00:04:58,632 --> 00:05:00,940
Kayla: I wouldn't be recording a podcast if I weren't.

41
00:05:01,760 --> 00:05:18,868
Chris: Okay, so I think that now is the time for us to actually do our gimmick and go to. Drumroll. Maybe I'll insert a drumroll here. Oh, God. You can tell when we don't have a script because we're just so lame.

42
00:05:18,924 --> 00:05:24,040
Kayla: You can tell we don't have a script and we've just had a nap because we're all like, oh, I have energy.

43
00:05:24,500 --> 00:05:32,020
Chris: Anyway with that, let's talk about whether TESCREAL is a cult or not. Kayla, what's the first criterion?

44
00:05:32,100 --> 00:05:34,636
Kayla: He keeps saying cult or not. And you're missing an opportunity.

45
00:05:34,668 --> 00:05:38,684
Chris: Excuse me. Excuse me. You're right. Cult or whether it's just weird.

46
00:05:38,732 --> 00:05:39,452
Kayla: There we go. Thank you.

47
00:05:39,476 --> 00:05:41,960
Chris: Because it's definitely at least weird.

48
00:05:42,340 --> 00:05:43,916
Kayla: Everything is, actually.

49
00:05:43,948 --> 00:05:49,988
Chris: Do you want to do the thing where we. Like I ask you? So what's your initial? Before we go through the criteria?

50
00:05:50,044 --> 00:05:51,004
Kayla: Yeah. Always ask me that.

51
00:05:51,052 --> 00:05:54,556
Chris: Yeah, well, what is it? Cult, cult, cult.

52
00:05:54,668 --> 00:05:58,920
Kayla: But I also feel like, I say everything's a cult, and then I go through the criteria and then it convinces me that it's not.

53
00:05:59,380 --> 00:06:04,348
Chris: I mean, I think that guilty until proven innocent is perfectly valid.

54
00:06:04,404 --> 00:06:06,156
Kayla: That's what's made this country great.

55
00:06:06,228 --> 00:06:09,180
Chris: That's right. Okay, so you think Cult. I think.

56
00:06:10,480 --> 00:06:11,856
Kayla: I think it's a network of cults.

57
00:06:11,888 --> 00:06:15,056
Chris: Frankly, I think it's too big to be anything but.

58
00:06:15,088 --> 00:06:17,896
Kayla: Like, it's too big to be anything but TESCREAL.

59
00:06:18,008 --> 00:06:20,340
Chris: Yeah. I don't know. I'll say cult.

60
00:06:20,920 --> 00:06:23,300
Kayla: Did Doctor Torres have an opinion?

61
00:06:24,520 --> 00:06:29,340
Chris: So we did talk about that. I might try to find that clip and splice it in here.

62
00:06:30,600 --> 00:07:20,200
Dr Emile Torres: I would say that there is a good case to make that it's that a lot of these communities are run like a cult. It might sound hyperbolic, but I really don't think it is. So take EA, for example. It is very hierarchical. There's a lot of hero worship. I mean, by their own admission, you could find them talking about this. People in the EA community have said there's way too much hero worship. There's a lot of worship of MacAskill and Toby Ord, Eliezer Yudkowsky and Nick Bostrom and so on. You know, you sort of, like, if you're on one of the lower rungs of this hierarchy, you oftentimes need some kind of permission to do certain things. You know, the Center for Effective Altruism has guidelines for what people in the EA community should and should not say to journalists.

63
00:07:20,780 --> 00:07:54,420
Dr Emile Torres: They tested a ranking system, secretly tested an internal ranking system of members of the EA community based in part on IQ. So members who have an IQ of less than 100 get points removed. Once they have an IQ of 120, then they can start to have points added. All of this sounds very cultish. They even do something. Many members participate in what they call doom circles, where you get in a circle and you criticize each other one at a time. And if you.

64
00:07:54,840 --> 00:07:57,508
Chris: Oh man, yeah, that is for sure.

65
00:07:57,704 --> 00:08:47,092
Dr Emile Torres: Right? And if you go on the EA forum website, you can find the statement that they begin each critique with. And it is like, you know, for the greater good, for the benefit of this individual and the aim of improving themselves and their epistemic hygiene and so on. It sounds so bizarre. I mean, there are many examples. There's a lot of myth making in the community. You know, leading figures like MacAskill basically propagated lies about Sam Bankman-Fried, one of the saints of the EA, rationalist, longtermist community. He drove a beat-up Toyota Corolla and lived this very humble lifestyle. What they didn't tell you, what they knew, was that Bankman-Fried lived in a huge mansion and owned $300 million in Bahamian real estate.

66
00:08:47,276 --> 00:09:28,738
Dr Emile Torres: So just like a lot of the televangelists who preach one thing and then have private jets, who are flying around in private jets without anybody really being supposed to know about that. So anyways, there's probably 50 more things to say about that. But it is very cultish in many ways. And in fact, when I was part of the community, we had a number of conversations about whether or not we are part of a cult. Even. Here's my last thing I'll say. I've even had. This is 100% true. I've had family members of some of the leading EAs contact me in private to say that they're worried their family member is part of a cult.

67
00:09:28,874 --> 00:09:34,870
Chris: Oh my gosh. I mean, you're. Those. Some of the things you mentioned are pretty classic.

68
00:09:37,490 --> 00:09:41,434
Kayla: Well, I think the first criteria. Criterion, criteria.

69
00:09:41,482 --> 00:09:43,430
Chris: Eight criteria.

70
00:09:43,930 --> 00:09:58,100
Kayla: First criteria. It was going great. This one's difficult. And I don't know how we would define who the charismatic leader of TESCREAL is.

71
00:09:58,260 --> 00:10:03,196
Chris: Yeah, I think this kind of goes directly to the. It's so big, right?

72
00:10:03,268 --> 00:10:03,828
Kayla: Because I can name.

73
00:10:03,844 --> 00:10:04,920
Chris: This is what she said.

74
00:10:05,220 --> 00:10:24,086
Kayla: I can name a bunch of guys, which is also what she said. And I don't know if I can, like. Is it Ray Kurzweil? I don't know. Is it Elon Musk? I don't know. Is it Nick Bostrom? I don't know. Is it William MacAskill? I don't know. I could probably make an argument for any of those. Probably least of all Elon Musk, frankly.

75
00:10:24,158 --> 00:10:40,530
Chris: I think so. And then there's Julian Huxley, who coined the term. And then there's Max More, who coined other terms like extropianism, so we could Voltron all these white dudes together into, like, a giant nerd.

76
00:10:40,910 --> 00:10:46,528
Kayla: I was just going to say, is the acronym itself the charismatic leader?

77
00:10:46,664 --> 00:10:47,792
Chris: So it's Emile's fault.

78
00:10:47,856 --> 00:10:49,980
Kayla: It's all Doctor Torres's fault?

79
00:10:50,760 --> 00:10:53,080
Chris: No, I think let's pick someone.

80
00:10:53,240 --> 00:11:18,630
Kayla: And I mean, I think that not being able to pick someone is also an answer. If there's not a definitive person, is that the answer? That there is no good, singular charismatic leader. But then that makes me question the criteria, because there's certainly been cults with more than one leader. Thinking on theme, Heaven's Gate for most of its life had two leaders, Ti and Do, right? So I don't know.

81
00:11:19,850 --> 00:11:27,202
Chris: Yeah, so I think you're right that, like, the more diffuse it is, the less this particular criterion hits.

82
00:11:27,306 --> 00:11:28,346
Kayla: Criterias.

83
00:11:28,498 --> 00:11:46,734
Chris: Sorry, criterias hits. But I think that if I did have to pick one, and maybe we can talk about multiple people as being charismatic leaders within the movement. Like, you know, we could talk about MacAskill, I think, would be a good choice. I think Max More would be a good choice. But I think if I really had to pick someone, I would pick Nick Bostrom.

84
00:11:46,902 --> 00:11:47,702
Kayla: Interesting.

85
00:11:47,766 --> 00:12:39,122
Chris: And it's unfortunate that we didn't talk more about him on the show. He came up a couple times, but I really feel like he should have come up more because of how influential he has been across all of the letters. Because remember, we're not talking about just transhumanism. We're not talking about just EA. We're not talking about just singletarianism. Right. Like, if it was the T, I would say maybe Max More. E, Max More. The S, I'll give you Ray Kurzweil for that one, for sure. And he's definitely charismatic. C, we'd have to maybe go to Ben Goertzel. So he's, like, the father of modern cosmism and also invented the term AGI, artificial general intelligence. If it was R, rationalism, maybe you could say. See, now that I'm going through them, though, I'm like, each one has its own guy.

86
00:12:39,186 --> 00:12:40,554
Kayla: Each one has its own guy.

87
00:12:40,642 --> 00:12:44,722
Chris: And like, it's okay. I'm like, picturing, like, a Power Rangers set up here.

88
00:12:44,786 --> 00:12:51,910
Kayla: It's funny now, because now I've gone back to convincing myself in my head that the best argument is actually Elon Musk.

89
00:12:52,850 --> 00:13:24,266
Chris: See, that's why the best argument for me is Nick Bostrom, because Elon Musk feels more like a receiver than a. Than a giver in terms of these philosophies. Wait, let me finish the letters, though. Let me finish the letters. Okay, so you got R. That's rationalism. That's LessWrong. That's Eliezer Yudkowsky. Then you got EA. That's William MacAskill. Then you got L. That's also William MacAskill and Nick Bostrom. Yeah, but now that we're picking one for each, let's say the EA is Sam Bankman-Fried. Let's say it's Sam Bankman-Fried.

90
00:13:24,418 --> 00:13:31,866
Kayla: I don't think. Yeah, wait, sorry. SBF for EA. Too many syllables. Too many acronyms.

91
00:13:31,938 --> 00:13:33,474
Chris: It's a very acronym heavy season.

92
00:13:33,562 --> 00:13:37,298
Kayla: I think MacAskill is the EA and.

93
00:13:37,314 --> 00:13:56,818
Chris: The L. Okay, so we'll count him for both of those. And then, like, so those are the Power Rangers, or those are the Voltron characters, and the "I'll form the head" one, for anybody who's watched Voltron, which is going to be, like, one person here. But the main guy that's sort of, like, sitting above it all seems to me to be Nick Bostrom.

94
00:13:56,914 --> 00:13:58,426
Kayla: Nick Bostrom is.

95
00:13:58,618 --> 00:14:00,138
Chris: He touches all of those letters.

96
00:14:00,194 --> 00:14:13,142
Kayla: Possibly, not even arguably, the most, like, preeminent philosopher on this stuff. Like, if you were to get into the philosophy world and say, who's the most famous guy talking about x-risk and AI and.

97
00:14:13,326 --> 00:14:17,934
Chris: Right, but he's also talked about transhumanism. He's also talked about EA.

98
00:14:18,022 --> 00:14:21,294
Kayla: It would be him. Yeah, but do you want to hear my Elon Musk argument?

99
00:14:21,382 --> 00:14:22,050
Chris: Yeah.

100
00:14:22,830 --> 00:14:25,742
Kayla: I really don't want to give him the credit for this, however.

101
00:14:25,926 --> 00:14:28,190
Chris: Well, it's about time he gets credit for something, Kayla.

102
00:14:28,230 --> 00:14:50,740
Kayla: He gets credit for something. I think, of all of those names that we've said, the person who has made TESCREAL the most visible to the people, to the non-Silicon Valley people, to the non-academia people, to the non-upper-echelon elitists, to us plebs out here listening to podcasts, is potentially Elon Musk.

103
00:14:51,916 --> 00:14:52,760
Chris: Okay.

104
00:14:53,140 --> 00:15:23,662
Kayla: And so I think that I could see the argument here of, like, yeah, sure. He's not. He's definitely not coming up with these ideas. He's not a philosopher. He's not an academic. However, does the person coming up with the ideas have to be the number one, or can they be the number two? There definitely are instances we can think of in cults where there is a figurehead and a number two that maybe one is actually coming up with the ideas more than the other. But I don't think these guys. I also. I have to say this. I think a lot of these guys probably don't like Elon Musk.

105
00:15:23,846 --> 00:15:25,860
Chris: Probably. I imagine that.

106
00:15:25,950 --> 00:15:28,416
Kayla: I think Eliezer and Elon get into it on Twitter.

107
00:15:28,528 --> 00:15:44,304
Chris: Yeah. And I imagine that, like, a guy like William MacAskill probably doesn't care for him. But I don't want to put words in anybody's mouth. Okay, so what I think. Let's sort of summarize here. I think there's, like, potential arguments for Nick Bostrom, potential arguments for Elon, and then each individual.

108
00:15:44,312 --> 00:15:48,416
Kayla: I'd rather Nick Bostrom. Cause fuck Elon Musk, but also fuck Nick Bostrom.

109
00:15:48,448 --> 00:16:03,406
Chris: Well, yeah, fuck both. They're both shitty. And then each individual letter kind of has its own little leader. So I kind of feel like that means, to me, that means overall charismatic leader is high. You have a group of charismatic leaders, and then you have an overlord thing.

110
00:16:03,438 --> 00:16:27,492
Kayla: A charismatic someone is the overlord. It reminds me, then, of Hare Krishna, of the International Society for Krishna Consciousness, in which after the overlord, after Srila Prabhupada passed, instead of the group then passing to the second in command, it went to ten guys. And so there were, like, ten. And they all had individual factions and were their own kinds of charismatic leaders.

111
00:16:27,556 --> 00:16:33,260
Chris: And their own flavors of ISKCON, while our guys have their own flavors of TESCREAL.

112
00:16:33,300 --> 00:16:34,700
Kayla: So it reminds me of that a little bit.

113
00:16:34,780 --> 00:16:40,748
Chris: Okay. For the art for this, I want to draw these guys' faces in Power Rangers outfits.

114
00:16:40,844 --> 00:16:43,260
Kayla: Please do. Wait, why Power Rangers? Why not Voltron?

115
00:16:43,300 --> 00:16:49,316
Chris: Well. Cause it's easy. I mean, what am I gonna do? Like, stick them in one of those giant metal tigers for that? I don't know. How's that gonna come out?

116
00:16:49,348 --> 00:16:50,356
Kayla: I don't know. It sounds cool, though.

117
00:16:50,428 --> 00:16:56,070
Chris: It does. Okay, so first criteria, we're gonna say high charismatic leader. What do we have up next?

118
00:16:56,150 --> 00:16:57,290
Kayla: Expected harm.

119
00:16:57,790 --> 00:17:05,246
Chris: Expected harm. Oh, boy. Ugh. Some of these things feel more harmful than others.

120
00:17:05,438 --> 00:17:21,638
Kayla: I don't think that most of these guys are gonna, like. I'm trying to think of, like, other things that have had expected harm. They're not gonna take, like, a bath with, like, a mud product that was made from, like, petroleum on accident and say that it's, like, healing them. I don't think they're going to eat borax. I don't think they're going to.

121
00:17:22,219 --> 00:17:34,563
Chris: Yeah. And I guess if we're saying expected harm, like, yeah, we're talking about rank and file, not leaders. So if you are a person that's doing EA stuff, I don't think it's going to be harmful to you for the most part.

122
00:17:34,651 --> 00:17:40,227
Kayla: Well, I don't think physically, but I think that there's other harms that can, especially, like, the LessWrong communities.

123
00:17:40,323 --> 00:17:44,483
Chris: Right. Roko's basilisk and all the people that donated money to MIRI after that, the.

124
00:17:44,491 --> 00:18:21,344
Kayla: Psychological harm, the financial harm. I do think if we do start talking about, you know, we already evaluated Church of Perpetual Life on its own, but if we talk about the folks in this bundle who get into some of this life extension longevity stuff, there is potential for harm there. And a lot of these TESCREAL guys do kind of end up being, like, antithetical to quote-unquote big pharma, which I don't fault them for, but that can wade into dangerous territory. And then I also think, just, like, social harm. You and I have heard stories of how it's impossible to go to a party in the Silicon Valley area anymore without people just talking about AI.

125
00:18:21,422 --> 00:18:22,476
Chris: What about AI?

126
00:18:22,628 --> 00:18:23,364
Kayla: And like that.

127
00:18:23,412 --> 00:18:25,180
Chris: You're supposed to talk about the weather.

128
00:18:25,300 --> 00:18:40,340
Kayla: Okay, I'd rather talk about the weather at this point. If that's the only social skill that is being built up in this population to the point where you would not be able to then go to a party in Boston. That feels like a harm to me.

129
00:18:40,460 --> 00:18:43,684
Chris: Well, you can't go to a party in Boston without talking about the Pats, so it's kind of similar.

130
00:18:43,732 --> 00:18:44,740
Kayla: I guess that's true.

131
00:18:44,900 --> 00:18:47,840
Chris: Yeah. So I see what you're saying. I think.

132
00:18:49,790 --> 00:18:52,550
Kayla: How do you think. Actually, can I interrupt you, please?

133
00:18:52,630 --> 00:18:55,566
Chris: Because I don't even know what the fuck I'm thinking.

134
00:18:55,758 --> 00:19:05,998
Kayla: Not to keep appealing to authority, but how do you think Doctor Torres would answer this question? Doctor Torres, who is one of the academics who coined the phrase, well, they.

135
00:19:06,014 --> 00:19:08,610
Chris: Think that it's a tremendously dangerous ideology.

136
00:19:09,030 --> 00:19:10,286
Kayla: That says something to me.

137
00:19:10,358 --> 00:20:00,316
Chris: Yeah. And I think that's part of what I was going to get at: it kind of depends on what we're talking about. First of all, which letter we're talking about. One letter might be more harmful than another letter. But if we're talking about TESCREAL, if we're talking about the big Voltron, then it does dip into at least these ideas that allow billionaires to say, well, it's okay if we nuke most of the world as long as it doesn't threaten the project. I should say specifically, that was Eliezer, who was not a billionaire. Point stands. There's some ideas there that they seem rather comfortable with. Lots of bad things happening to currently living people in order to preserve this bizarre vision of a long, long-term future.

138
00:20:00,428 --> 00:20:10,492
Kayla: Yeah. Interestingly, the expected harm for this is high for everyone, even people not in it, which is different from most of the groups we've talked about.

139
00:20:10,556 --> 00:20:14,620
Chris: And you know what I'm thinking of now, which either sucks or is really interesting?

140
00:20:14,700 --> 00:20:15,560
Kayla: QAnon.

141
00:20:16,800 --> 00:20:18,904
Chris: They both suck and are interesting.

142
00:20:18,952 --> 00:20:23,056
Kayla: I just mean that QAnon has, like, oh, that's gonna hurt everybody, not just the people in it.

143
00:20:23,128 --> 00:21:03,720
Chris: Right, right. Yeah. No, I'm thinking, like, for us to evaluate expected harm, we kind of have to do the math. You know, we kind of have to sit here and go, like, well, it's pretty unlikely that these guys are gonna cause nuclear war, but if they do, the punishment is very high. So when you multiply those two together, it equals this. Like, I kind of felt myself doing that in my head, and I'm like, no, don't do that. That's them. That's what they do. But that being said, I don't know, there's also a place and a time for that, and maybe this is one of those. So I'm gonna say relatively high, although the big harms definitely have a lower chance.

144
00:21:04,740 --> 00:21:09,452
Kayla: I think that's valid. Okay, the next one's easy. Cause it's kind of our gimme question.

145
00:21:09,636 --> 00:21:10,196
Chris: Okay.

146
00:21:10,268 --> 00:21:14,044
Kayla: Presence of ritual. Oh, yeah, it's so high, it couldn't be higher.

147
00:21:14,092 --> 00:21:14,628
Chris: Oh, my God.

148
00:21:14,684 --> 00:21:19,612
Kayla: And I think that you and I potentially need to reevaluate this criteria just.

149
00:21:19,636 --> 00:21:21,080
Chris: Because it's just on everything.

150
00:21:21,740 --> 00:21:25,716
Kayla: Have we ever talked about something that didn't have high presence of ritual? I don't know.

151
00:21:25,748 --> 00:21:34,668
Chris: I don't know. I kind of want to go back and we've actually talked about this before where, like, this criterion criterias. Excuse me. This is usually like the canary in the coal mine.

152
00:21:34,724 --> 00:21:37,060
Kayla: You know, this is what usually gets us into a topic.

153
00:21:37,140 --> 00:22:05,702
Chris: Yeah. Like, with Irvine, it's like, why does Irvine feel weird? Well, it's got a weird vibe, but, like, mainly it's the logo, right. You know, like, it's always something like that. Like, what are those people doing? Why are they chanting that weird chant? Why are they all wearing the same clothes? Why do they use that weird language? Right. And that's one of the things that Doctor Shore has talked about too, is, like, when talking to these outsiders who are worried about their insider friends and family, they'll say, like, oh, they use this weird vocabulary now.

154
00:22:05,846 --> 00:22:08,502
Kayla: Yeah, the jargon in this, the jargon.

155
00:22:08,526 --> 00:22:09,718
Chris: In LessWrong was insane.

156
00:22:09,774 --> 00:22:27,208
Kayla: The chant and the jargon. The jargon doesn't just mean, like, oh, they use a word as slang. No, when you're talking about LessWrong, it's like you have to have entire volumes of context in order to understand the phrases and words that are being used in other posts and contexts.

157
00:22:27,264 --> 00:22:29,592
Chris: Yeah. Okay, ritual is high.

158
00:22:29,736 --> 00:22:34,060
Kayla: What other examples of ritual do you think? Because it's definitely jargon, but there's other stuff too.

159
00:22:34,560 --> 00:23:04,226
Chris: I mean, if we're going all the way back to our cryonics episodes, which I think we should, I think that cryonics does fit as a puzzle piece into the TESCREAL landscape. There's plenty of stuff there. I mean, essentially it is a death ritual in order to preserve your body, to be resurrected in the future. There's all kinds of procedures. I don't know, maybe it's not a ritual if it's science based and it's medical based, but I don't know.

160
00:23:04,418 --> 00:23:19,186
Kayla: I think another example of presence of ritual is, we didn't necessarily touch on this a lot this season, but this is a very conference-based community. There's a lot of symposiums and conferences and talks.

161
00:23:19,258 --> 00:23:21,282
Chris: We talked about that in the Humanity Plus episode a little.

162
00:23:21,346 --> 00:23:48,320
Kayla: The act of getting together and being in community. There's Singularity Summits, there's effective altruism summits, there's RAADfests, there's all this stuff that is, yes, an opportunity to spread the good word, but it's also reifying the community and taking part in actions that solidify your identity as a member of this group. I think that's a high ritual.

163
00:23:48,620 --> 00:24:19,328
Chris: Yeah. Now that we're kind of going through it a little more, I don't know if this is the highest ritual we've ever seen. Now that we're actually talking about individual examples, I'm like, okay, yeah, there's some ritual going on with that. And if you add them all together, it's quite a bit. And certainly, depending, again, on the letter, right. If it was just the rationalists, then I would say it's much higher. But if it's, like, extropians, then, you know, I don't know. They had a listserv, they had a group. I don't know if they had, like, logos and chants and songs.

164
00:24:19,424 --> 00:24:22,408
Kayla: They definitely don't have chants and songs, unfortunately.

165
00:24:22,544 --> 00:24:28,376
Chris: So I think it's definitely there. I just. But I'm not sure that it's, like, the highest thing that is ever.

166
00:24:28,448 --> 00:24:29,904
Kayla: Are we talking ourselves into a medium?

167
00:24:30,032 --> 00:24:34,112
Chris: I think I'm talking. Yeah, I think we talked ourselves into a medium ritual or, like, medium.

168
00:24:34,176 --> 00:24:36,344
Kayla: Well, I'll agree to that.

169
00:24:36,432 --> 00:24:38,060
Chris: Okay, what's the next one?

170
00:24:38,440 --> 00:24:57,370
Kayla: I think another fairly simple answer. Niche within society. Actually, this is not a simple one. No, this is tremendously difficult, because the actual people who are, like, who would identify as a TESCREAL, and again, they wouldn't identify as a TESCREAL, but you know what I mean? That's niche within society. The wide-ranging effects are massive.

171
00:24:58,470 --> 00:25:36,712
Chris: Yeah. It's so strange, because if you don't either, a, live in Silicon Valley, or, b, listen to this podcast, then it's likely you don't know what the hell this is. Even my mom was telling me the other day, she's like, I was listening to some of your recent episodes, and I was kind of lost. And I was like, yeah, that makes sense. You're not online. A lot of this stuff, to normal people that aren't living on Twitter or engaging with this type of content and these types of movements, does feel kind of like, wait, what? What are you talking about? What the hell is that? And Silicon Valley itself, in terms of number of people, is a small number of people. A very small number of people.

172
00:25:36,776 --> 00:25:37,432
Kayla: Right.

173
00:25:37,616 --> 00:25:39,168
Chris: But they control the world.

174
00:25:39,224 --> 00:25:56,812
Kayla: So the people we're talking about are Elon Musk and Marc Andreessen and Peter Thiel and Ray Kurzweil, people who have the ear of the most powerful governments in the world, and the most powerful entertainment industries in the world, and the most powerful industries in the world, affecting all of them.

175
00:25:56,836 --> 00:25:59,972
Chris: Sitting on top of giant influence platforms like Facebook and Twitter.

176
00:26:00,036 --> 00:26:07,884
Kayla: Like, yes. You walk down the street and ask people, do you know what TESCREAL is? They wouldn't know what the fuck you're talking about. You ask them what AI is. Everybody would know.

177
00:26:07,972 --> 00:26:13,192
Chris: Also do note, TESCREAL as an acronym is only, like, what, two years old, year and a half.

178
00:26:13,286 --> 00:26:15,308
Kayla: That's the niche within society of that.

179
00:26:15,404 --> 00:26:29,846
Chris: Yeah. So I think I am going to call this niche, because I'm going to grade it on how many people we might put in the cult, how many members there are. I'm gonna grade it on that, not on its outsized influence.

180
00:26:29,918 --> 00:26:40,246
Kayla: This analogy might help. If you were to evaluate if the Illuminati were real in the way that conspiracy theorists think it's real, would you evaluate the Illuminati as.

181
00:26:40,278 --> 00:26:41,158
Chris: This is just as hard.

182
00:26:41,214 --> 00:26:46,630
Kayla: I don't know, as niche within society, because they say it's a small group of elites that are controlling the world.

183
00:26:46,670 --> 00:26:47,270
Chris: The world.

184
00:26:47,390 --> 00:27:00,808
Kayla: And I think that still makes it niche within society, because it is a secret society. It is a hidden activity. That kind of feels more similar to what the TESCREALists are doing. I'm not saying the Illuminati is real. I don't think the Illuminati is real.

185
00:27:00,864 --> 00:27:14,872
Chris: No. I'm actually going the other direction in my head. And I'm like, I kind of see some of the critique of TESCREAL being a conspiracy theory. I don't think it is, but I think it has some elements, and it should definitely raise some pink flags at least.

186
00:27:14,936 --> 00:27:34,612
Kayla: I just think. Make sure you don't fall into the idea that there is a grand unifying theory of, hello, we are the council of TESCREAL elders, and this is our unified goal, and we are using all of these arms to make sure that it happens. That's the conspiracy theory. It's not a conspiracy theory to say that Elon Musk uses his money to exert influence. That's not.

187
00:27:34,636 --> 00:27:54,232
Chris: Right. And it's not a conspiracy theory to say one of these ideas influences the next. It's just not a cabal. Okay. I think I would still say niche for the Illuminati, although maybe a little less here, because, like, most people know that word now, at least. But I would still say niche. I mean, what would you say? I don't think that's definitively the correct way to look at it.

188
00:27:54,256 --> 00:27:57,784
Kayla: I think I would call. I think if I were to evaluate the Illuminati, I would call it a niche group.

189
00:27:57,872 --> 00:27:59,860
Chris: Okay, then I think this is, too.

190
00:28:00,160 --> 00:28:03,272
Kayla: I think there's exclusivity, and I think there's exclusivity in TESCREAL.

191
00:28:03,336 --> 00:28:04,660
Chris: Yep. Yep.

192
00:28:05,000 --> 00:28:10,200
Kayla: Different kind of exclusivity, because you don't have to, like, do a secret handshake, but they should probably come up with.

193
00:28:10,240 --> 00:28:15,504
Chris: Yeah, you don't have to do, like, a satanic ritual to kill children or whatever, like you have to do with the Illuminati.

194
00:28:15,552 --> 00:28:18,424
Kayla: Yeah. Next one is antifactuality.

195
00:28:18,512 --> 00:28:30,592
Chris: Actually, you probably do. All right, sorry. Antifactuality. This one, I'm just gonna go ahead and say is relatively. I don't know.

196
00:28:30,656 --> 00:28:32,976
Kayla: I would call it high, man.

197
00:28:33,008 --> 00:28:37,016
Chris: As I'm saying the sentence out loud, my brain is going through each letter.

198
00:28:37,128 --> 00:28:37,896
Kayla: I know.

199
00:28:38,048 --> 00:28:52,712
Chris: And transhumanists are like, we want to transcend our human limitations. I'm like, okay, that's just the thing that you want to do. That's not antifactual. But then by the time my brain gets to long termism, I'm like, there's a bunch of crap there.

200
00:28:52,776 --> 00:29:35,256
Kayla: I think the buck pretty easily stops with effective altruism, in that, like we talked about in the episode, I think they very clearly downplay facts, and they very clearly downplay patterns of history. In order to keep the glue holding the philosophy together, you have to ignore the way wealthy classes and power structures have existed in our societies forever. And that, to me, is antifactual. And I would love to have a conversation with William MacAskill about this, because I cannot believe that he has not thought about that. I do believe that he has thoughts and feelings about it. I just don't know what they are.

201
00:29:35,408 --> 00:30:21,782
Chris: Yeah, that would kind of go into, I think. Yeah, some of the blind spots part of antifactuality. I don't know about, like, logical fallacies, but I do think that there are some. Like, I don't. I don't necessarily believe them when they say each dollar spent is worth, like, X number of lives. Like, I get the impulse to do something like that, but, like, I don't know. That doesn't feel, like, super based in reality. I don't know, though, about, like, other logical fallacies. I mean, if we're gonna link together transhumanism and eugenics, which we kind of did these last two episodes, there's a ton there. Right. So to the extent that TESCREALists are into the eugenicsy type stuff, which, personally, that's part of, I think, of how I would.

202
00:30:21,926 --> 00:30:30,246
Chris: Part of how I differentiate TESCREALists from, like, other people who are just into one or more of the letters, is that they have a more eugenicsy sort of bent.

203
00:30:30,318 --> 00:30:30,930
Kayla: Right?

204
00:30:31,290 --> 00:30:43,114
Chris: So, for example, when a Nick Bostrom, who we have called charismatic leader, talks about dysgenic pressures, that's when you go, like, oh, okay. You got some blind spots in your facts.

205
00:30:43,202 --> 00:30:46,498
Kayla: That makes me go back to expected harm. Yeah, high.

206
00:30:46,674 --> 00:30:54,474
Chris: Yeah. So what are we saying on this one, then? This topic is so big. It's too big.

207
00:30:54,562 --> 00:30:59,226
Kayla: TESCREAL is not antifactual in the way that, like, an MLM is antifactual.

208
00:30:59,298 --> 00:31:01,870
Chris: Right. Or in the way that QAnon is. Yeah.

209
00:31:02,170 --> 00:31:21,138
Kayla: I think that we can sit here. I think what's happening more with TESCREAL, and this is kind of getting into a personal place, is that it's a bundle of philosophies that have reached different conclusions, re: philosophy, than, like, you or I would. And maybe that's not antifactual.

210
00:31:21,274 --> 00:31:22,058
Chris: That's true.

211
00:31:22,194 --> 00:31:33,480
Kayla: I think it's a lot of people with the same information reaching different conclusions. And that doesn't necessarily mean the whole thing is antifactual. We can definitely find instances of, like, this is wrong. This is wrong.

212
00:31:33,520 --> 00:31:43,840
Chris: But I think the dysgenic pressures and scientific racism, there's actual evidence against that, whereas some of the other stuff is just, like, you've reached different conclusions.

213
00:31:43,920 --> 00:31:44,592
Kayla: Right.

214
00:31:44,776 --> 00:31:49,976
Chris: Yeah. And I think that's important because antifactual can't just mean, like, I disagree with your ideas.

215
00:31:50,048 --> 00:31:50,424
Kayla: Right.

216
00:31:50,512 --> 00:31:59,320
Chris: Right. It has to be, like, some specific. There's evidence against this. I am noting the presence of logical fallacies and blind spots.

217
00:31:59,700 --> 00:32:10,612
Kayla: So, like, I don't agree with the conclusions that longtermists have drawn, and I think that's. I think I'm right, but I don't think that makes them antifactual.

218
00:32:10,716 --> 00:32:10,988
Chris: Okay.

219
00:32:11,004 --> 00:32:20,600
Kayla: Which I could be wrong, honestly. This is one that I really want to hear from, like, people on. So, like, please, if you're listening right now and you're, like, screaming at your podcast, please let us know. Cause this one's hard.

220
00:32:21,060 --> 00:32:24,318
Chris: Discord call to action. Come join us on Discord. Talk about it.

221
00:32:24,444 --> 00:32:30,346
Kayla: Do we want to just say low for, like, funsies? Okay, that sounds low to me.

222
00:32:30,498 --> 00:32:35,354
Chris: It's, like, low, but it's not in the. It's just above a quarter.

223
00:32:35,522 --> 00:32:36,162
Kayla: Right.

224
00:32:36,306 --> 00:32:38,978
Chris: So that's, you know, it's science.

225
00:32:39,074 --> 00:32:44,030
Kayla: Science. We're doing science here. Percentage of life consumed or life consumption.

226
00:32:45,850 --> 00:32:56,604
Chris: Oh, God. Again, this topic is so big. Depending on who we're talking about and which letter, it could be a lot or a little. If it's cryonics, it consumes your whole life.

227
00:32:56,692 --> 00:33:04,356
Kayla: No, it doesn't. And also, we evaluated cryonics on its own, and it's not part of the TESCREAL bundle. I would honestly.

228
00:33:04,388 --> 00:33:06,060
Chris: It's a little barnacle on the TESCREAL ship.

229
00:33:06,100 --> 00:33:12,116
Kayla: It's like. The overlap of the Venn diagram is quite small, actually.

230
00:33:12,188 --> 00:33:17,428
Chris: I don't know about that, my dude. Like, there's a lot of TESCREALists that are signed up for cryonics.

231
00:33:17,524 --> 00:33:33,076
Kayla: I think that there's more cryonics people, people signed up for cryonics, that would subscribe to a TESCREAL philosophy than there are independent TESCREALists that are signed up for cryonics. I don't think I agree with you in saying that there's a lot of TESCREALists signed up for cryonics.

232
00:33:33,228 --> 00:33:50,072
Chris: I see it as, like, if TESCREAL is, like, nerd culture, then cryonics is, like, Magic: The Gathering. Nerd culture includes a bunch of different things. Yeah, there's definitely. There's a lot of nerds that play Magic, but not every nerd plays Magic. Some nerds play Pokemon.

233
00:33:50,176 --> 00:33:55,752
Kayla: Okay, well, this is not about cryonics. You're just trying to. You're just trying to avoid answering the question.

234
00:33:55,816 --> 00:33:56,600
Chris: Yeah, I don't want to.

235
00:33:56,640 --> 00:33:58,800
Kayla: Percentage of life. I'll answer it. I think it's high.

236
00:33:58,920 --> 00:34:00,024
Chris: Why do you think it's high?

237
00:34:00,152 --> 00:34:08,496
Kayla: I think it's high because I'm basing it off of a lot of, like, the folks, the specific folks that we've talked about. If you're talking about William MacAskill, this is his entire.

238
00:34:08,648 --> 00:34:09,560
Chris: But those are the leaders.

239
00:34:09,639 --> 00:34:20,760
Kayla: I know. I will land the plane. Nick Bostrom, William MacAskill. The names that we said, like, 100%, like, this is their entire lives. I'm sure they have hobbies as well that don't have anything to do with it.

240
00:34:20,880 --> 00:34:21,775
Chris: Magic: The Gathering.

241
00:34:21,848 --> 00:35:07,300
Kayla: I think that the stereotype that you and I have come across of, like, if you're a Silicon Valley person, this is all you can talk about, while it may not 100% reflect reality, reflects a trend, reflects a trope, reflects something that is going on in the culture of TESCREAL, in which this becomes a very big part of one's life. I think the folks that are on LessWrong spend a lot of time thinking and learning about LessWrong and rationalist stuff. I think the potential for a large portion of your life to be dedicated to this is very high, and the TESCREAL bundle has on-ramps that I think that you can, like, kind of. I don't want to keep comparing it to QAnon because there are huge, huge differences.

242
00:35:07,460 --> 00:35:17,878
Kayla: But I think there's a similar on-ramp, a similar onboarding, where, like, you can kind of take a small step in, and it's really easy to snowball that into: this is my everything.

243
00:35:18,014 --> 00:35:21,382
Chris: Both QAnon and TESCREAL have letters in their titles.

244
00:35:21,446 --> 00:35:25,070
Kayla: It's just the acronyms. Yeah. So I guess QAnon's not an acronym.

245
00:35:25,110 --> 00:35:55,312
Chris: Not really, no. Yes. I think I agree with you, and I think Emile also was kind of talking about that with the, like, I know people who emailed me. Like, that suggests that there's, like, a. It's not family separation, but it's, like, maybe step one of that. It's like dipping your toes into family separation, where somebody's starting to spend a lot of time doing this other thing. That being said, I don't know. There's no compounds. There's a lot of conferences that might take up most of your life going to all these different conferences.

246
00:35:55,416 --> 00:35:59,288
Kayla: It does seem like if you're into this, you might be going to a.

247
00:35:59,304 --> 00:36:02,016
Chris: Lot of conferences, eating a lot of hotel food.

248
00:36:02,088 --> 00:36:34,724
Kayla: Yeah, yeah, you're right. It's not the same as. I'm trying to think of something that has a compound. It's not the same as the Hare Krishnas, where people were leaving their families to go live communally. It's not the same as an MLM, where it literally does require you to have your entire life be consumed by the project. But I'm thinking again about, if we go back to Russian cosmism, which is one of the forefathers of this, the father of Russian cosmism very much had the belief that this should be.

249
00:36:34,772 --> 00:36:53,700
Chris: Everybody's dedicated to the task. The sort of, like, oh, my God, this is the most important thing ever. That thread definitely carried through into TESCREAL at large. I think I'm gonna say medium. Medium.

250
00:36:53,740 --> 00:36:57,720
Kayla: Because the potential's there, but it doesn't have to be. I think you can be a casual TESCREAL guy.

251
00:36:58,100 --> 00:36:59,140
Chris: Casual TESCREAL.

252
00:36:59,180 --> 00:37:09,760
Kayla: I think you can casually be like, wow, this effective altruism thing sounds cool. Or you can casually read a Nick Bostrom book. We have a Nick Bostrom book on our bookshelf. You can casually dip your toe into these ideas.

253
00:37:09,840 --> 00:37:11,620
Chris: Yeah, I'm not gonna read it, though.

254
00:37:11,920 --> 00:37:12,920
Kayla: I think I had this last.

255
00:37:13,000 --> 00:37:14,432
Chris: Yeah, maybe I do want to.

256
00:37:14,576 --> 00:37:20,072
Kayla: Let's at least take it off our bookshelf. Cause now I'm embarrassed when people come over and see it sitting next to, like, our Ray Kurzweil one, which I'm.

257
00:37:20,096 --> 00:37:26,904
Chris: And they listen to the show. Like, people that come to the. Come to our apartment also listen to the show. So they'll just be like, I thought you hated that guy.

258
00:37:26,952 --> 00:37:35,960
Kayla: I'd rather keep Ray Kurzweil on our bookshelf than Nick Bostrom. And, like, frankly, Ray Kurzweil is way more of a magical prophet than, like, a factual source.

259
00:37:36,260 --> 00:37:42,036
Chris: Yeah, I still like him, though. I still like Ray Kurzweil. You're not going to get me to not like the guy.

260
00:37:42,188 --> 00:37:44,652
Kayla: He's got a lot of ideas that are.

261
00:37:44,796 --> 00:37:46,120
Chris: Yeah. Oh, yeah.

262
00:37:46,620 --> 00:37:48,436
Kayla: Not based on anything.

263
00:37:48,588 --> 00:37:49,480
Chris: Agreed.

264
00:37:49,820 --> 00:37:52,260
Kayla: And spends a lot of time disseminating those ideas.

265
00:37:52,380 --> 00:38:03,982
Chris: Okay. So because we have the ability for people that are into maybe one of the letters, to get into all of the letters and be consumed by the letters, then we're gonna say, what medium?

266
00:38:04,046 --> 00:38:04,518
Kayla: Medium's good.

267
00:38:04,534 --> 00:38:06,050
Chris: Medium. Okay. Great.

268
00:38:06,510 --> 00:38:07,526
Kayla: Dogmatic beliefs.

269
00:38:07,558 --> 00:38:08,566
Chris: Are you keeping track of all these?

270
00:38:08,598 --> 00:38:09,210
Kayla: Yes.

271
00:38:11,150 --> 00:38:17,590
Chris: We're right. Everybody else is wrong. Kick you out if you dissent in any way. Is that gonna depend on the letter?

272
00:38:17,630 --> 00:38:20,990
Kayla: Because I feel like, LessWrong. No, I feel like there's room for dissent.

273
00:38:21,030 --> 00:38:22,446
Chris: LessWrong was very low dogma.

274
00:38:22,558 --> 00:38:41,366
Kayla: But I feel like. I feel like. And this is so mean. This is not like, let's take this out of the evaluation, but I just wanna say it. I feel like I could have a. This is just based on vibes. I feel like I could have a conversation with William MacAskill in which I could challenge his ideas. I don't know if I could have a conversation with Nick Bostrom and challenge his ideas. I don't think he.

275
00:38:41,398 --> 00:38:41,950
Chris: Why is that mean?

276
00:38:41,990 --> 00:38:46,974
Kayla: I'm keeping that because that's literally based on nothing. And I'm basing a lot of things.

277
00:38:47,022 --> 00:38:47,930
Chris: You said feel.

278
00:38:48,630 --> 00:38:52,470
Kayla: I know, and I think it's mean to say, like, I feel like this guy's a dick.

279
00:38:52,550 --> 00:38:56,734
Chris: Yeah. Okay. Fair. It's not mean. It's maybe, like, a little antifactual.

280
00:38:56,862 --> 00:39:15,066
Kayla: It's definitely antifactual, and that's why I want to say it. Still, let's strike it from the record. Separate it from the criteria. There's no evidence for this, whereas we have the evidence for: there's room for dissent on LessWrong. I think there's even room. I mean, I'm in the cryonics Discord. There's room for dissent there. I think that also.

281
00:39:15,138 --> 00:39:17,390
Chris: Now you're bringing up cryonics when it supports you.

282
00:39:17,770 --> 00:39:19,394
Kayla: You said we was allowed.

283
00:39:19,522 --> 00:39:22,474
Chris: Just throwing it around whenever you think it's useful.

284
00:39:22,562 --> 00:39:47,786
Kayla: I think there's room for dissent in the effective altruist community. I've seen examples of differing philosophers and academics and people being able to talk about this, and I can't imagine that if you go to a Silicon Valley party where everyone's talking about AI, they all have the exact same thought about AI. Even within TESCREAL, there are differing, diametrically opposed ideas about AI specifically. And those people are all part of TESCREAL.

285
00:39:47,818 --> 00:40:00,092
Chris: Right? If you're gonna call Eliezer Yudkowsky and Marc Andreessen TESCREALists, which I think we are, then you have to say that the dogma is low, because they have some very different views about how to go about this.

286
00:40:00,156 --> 00:40:00,760
Kayla: Right.

287
00:40:01,660 --> 00:40:02,644
Chris: Low dogma.

288
00:40:02,772 --> 00:40:04,404
Kayla: Ugh. That feels weird.

289
00:40:04,572 --> 00:40:09,640
Chris: Too bad. We thought about it. We said it. We are infallible.

290
00:40:10,060 --> 00:40:12,876
Kayla: That feels weird. But yeah.

291
00:40:12,908 --> 00:40:16,588
Chris: Yeah. Well, low dogma is just weird. And that's what's coming up on the.

292
00:40:16,604 --> 00:40:18,560
Kayla: End here, you got two more. Chain of victims.

293
00:40:19,430 --> 00:40:22,702
Chris: I mean, I had you read The Singularity Is Near, so I. Yeah.

294
00:40:22,726 --> 00:40:23,430
Kayla: Can I confess something?

295
00:40:23,470 --> 00:40:24,690
Chris: I got you into it.

296
00:40:24,990 --> 00:40:26,518
Kayla: I never finished that book.

297
00:40:26,654 --> 00:40:27,070
Chris: What?

298
00:40:27,150 --> 00:40:30,254
Kayla: I never. It's fabulous, Kayla. It's so big. I never finished.

299
00:40:30,262 --> 00:40:31,462
Chris: It is pretty. It's a big fabric.

300
00:40:31,486 --> 00:40:41,570
Kayla: I've read portions of it from beginning to end. I referenced it for these episodes. I've Wikipedia'd it. It's like when you watch a movie on TNT. I've seen it.

301
00:40:42,190 --> 00:40:54,468
Chris: Okay, no, that is not at all like that. So if you read little chunks of. Of The Singularity Is Near, like, small little chunks over time, and then, like, eventually enough of them overlap that you know the whole movie.

302
00:40:54,524 --> 00:40:55,236
Kayla: Yeah, I think it's that.

303
00:40:55,268 --> 00:40:57,420
Chris: Then it's like WarGames. I mean, TNT.

304
00:40:57,500 --> 00:40:59,800
Kayla: Yeah. Yes. It's like Lord of the Rings.

305
00:41:00,780 --> 00:41:02,960
Chris: Lord of the Rings is your TNT example.

306
00:41:03,340 --> 00:41:08,020
Kayla: I just think it's interesting. Before I cut the cord, it was, like, always on TNT. Oh, man.

307
00:41:08,060 --> 00:41:14,228
Chris: No, for me, maybe it's just, like, an age gap thing, but for me, it's either WarGames or Tremors.

308
00:41:14,324 --> 00:41:18,280
Kayla: I just think we didn't have TNT when I was growing up. I think that was, like, a teenage thing.

309
00:41:18,720 --> 00:41:22,180
Chris: And now we have all the Gen Zs listening. What the hell's a TNT?

310
00:41:23,440 --> 00:41:25,900
Kayla: Okay, so chain of victims. Is it recruit-y?

311
00:41:29,240 --> 00:41:35,128
Chris: I don't get the sense that it compels anyone to want to recruit people into the fold.

312
00:41:35,224 --> 00:41:42,232
Kayla: It's not an MLM in that regard. It's not like, here's a codified recruitment chain, and it's not.

313
00:41:42,296 --> 00:41:43,328
Chris: It's not Qanon.

314
00:41:43,464 --> 00:42:10,324
Kayla: I gotta bring it up again. It's not QAnon, in which, like, you can see the virality. I think that this has the potential for virality, and that's probably the way it spreads more, in that a book gets published and someone reads it, and then you recommend it to your friend. Or you're on Twitter and you see a thing and you're like, oh, what's that? Or you see an interview with Elon Musk, and he talks about population stuff that's going on. I think it has the potential for that virality, but it doesn't seem to grab hold the same way that QAnon did.

315
00:42:10,452 --> 00:42:14,644
Chris: Yeah. What about something like. Cause I would call NXIVM high chain of victims, too, because.

316
00:42:14,732 --> 00:42:15,604
Kayla: Yeah, but that started as an.

317
00:42:15,652 --> 00:42:30,792
Chris: It's hard to differentiate, but that's true. But it's hard to differentiate there between victim and perpetrator, especially at those mid to high levels, where they still got recruited in. They're still victims, but then they also did bad things to people.

318
00:42:30,936 --> 00:42:54,340
Kayla: I think that this. Having diffuse, even though it's high, having diffuse charismatic leaders helps with that a little bit, in that there's not one point that is forcing everybody to revolve around them and requiring this top-down structure. I don't think that this requires. I think if you really got into it, you could make an argument, but it would be an exonym, not an endonym, if that makes sense.

319
00:42:54,420 --> 00:42:54,748
Chris: Yeah.

320
00:42:54,804 --> 00:43:01,308
Kayla: It would be something that we are a framework that we are assigning rather than a framework that has been designed or emerged.

321
00:43:01,404 --> 00:43:26,994
Chris: I think also, I think this criteria is sort of inextricably linked to the expected harm one. Because if there's harm and you bring somebody in, then you start to get that ambiguity between victim and perpetrator here. I'm not feeling the harm on the individual member level nearly as much as I'm feeling the harm on a global influence level.

322
00:43:27,122 --> 00:43:29,034
Kayla: We are all chained victims in this.

323
00:43:29,082 --> 00:43:34,442
Chris: Right. That's true. We are the victims of TESCREAL. So I'm gonna say low on chain of victims.

324
00:43:34,506 --> 00:43:41,242
Kayla: I do think there's a little bit of virality with, again, going back to our favorite man, I think that Elon Musk has a little bit of a viral reach.

325
00:43:41,386 --> 00:43:41,882
Chris: Sure.

326
00:43:41,986 --> 00:43:51,038
Kayla: And I think that there's some virality in AI stuff. But again, just because I can go, like, well, this and this doesn't mean that TESCREAL as a whole.

327
00:43:51,174 --> 00:44:05,290
Chris: Right. And even with Elon, like, he has a big reach. Is it a viral reach? Yeah, I guess maybe people do like to share his shit, but, like, usually because it's stupid, I don't know. Yeah, I think. But I think overall, low.

328
00:44:07,190 --> 00:44:13,498
Kayla: And this is the last one. The final criteria. We know that.

329
00:44:13,514 --> 00:44:15,546
Chris: I think people are going to come away from this and, like, start using.

330
00:44:15,578 --> 00:44:19,070
Kayla: The word experience is the wrong word. Safe or unsafe exit.

331
00:44:20,050 --> 00:44:21,870
Chris: I don't feel like it's unsafe to.

332
00:44:22,170 --> 00:44:33,026
Kayla: I don't feel like it's unsafe at all. And if you're a TESCREAList and you want to leave, come, leave. You'll be fine. We'll take care of you. We'll welcome you into the fold of crip theory. And watching tv on the couch and not talking about AI.

333
00:44:33,138 --> 00:44:41,378
Chris: Yeah. You know, I lurked around LessWrong. And then I stopped. And, like, I didn't get shunned by anybody. Of course, granted, I didn't make any friends.

334
00:44:41,434 --> 00:44:45,802
Kayla: No. And we were singularitarians once upon a time, and it was easy to get in, easy to get out.

335
00:44:45,906 --> 00:44:52,690
Chris: Yeah. But again, I don't know. Like, if we were friends with Ray Kurzweil, do you think he'd shun us now if we were like, dude, I don't know.

336
00:44:52,770 --> 00:45:02,618
Kayla: If I got shunned by Ray Kurzweil, that would be the greatest thing to ever happen to me. Imagine getting to put that on your, like, resume.

337
00:45:02,714 --> 00:45:03,138
Chris: LinkedIn.

338
00:45:03,194 --> 00:45:03,818
Kayla: On your LinkedIn.

339
00:45:03,874 --> 00:45:04,914
Chris: Shunned by Ray Kurzweil.

340
00:45:04,962 --> 00:45:05,930
Kayla: Shunned by Ray Kurzweil.

341
00:45:05,970 --> 00:45:10,886
Chris: Would you put, like, a date there? You know, like, 2005. Attended so and so college and shunned.

342
00:45:10,918 --> 00:45:12,838
Kayla: By Ray Kurzweil. Put under special skills.

343
00:45:12,934 --> 00:45:14,570
Chris: Special skills. Okay.

344
00:45:15,030 --> 00:45:27,958
Kayla: I think that the exit is safe. You're not gonna be cut off from your friends and family. You're not gonna be. Yeah. Shunned as in the Amish. You're not gonna lose your entire community, as in QAnon. I think that the exit is safe here.

345
00:45:28,054 --> 00:45:52,194
Chris: And, of course, your mileage may vary. I'm sure that has happened to some people where they were, like, really into EA, and they and their EA friends didn't care for it when they stopped being into it. I'm sure that exists. But from our research on these letters, seems low, so. Okay, so you know what's weird is that, like, the first few were, like, so high.

346
00:45:52,322 --> 00:45:52,722
Kayla: Yeah.

347
00:45:52,786 --> 00:46:01,746
Chris: And I was like, oh, we are just heading straight towards a cult evaluation. And then it just kind of petered out. And the last few were low.

348
00:46:01,818 --> 00:46:17,170
Kayla: So just a reminder for everybody. Charismatic leader, high. Expected harm, high. Presence of ritual, medium, maybe medium-well. Niche within society? Yes, it is niche. Antifactuality, low. Life consumption, medium. Dogmatic beliefs, low. Chain of victims, low. Safe or unsafe exit, low.

349
00:46:17,510 --> 00:46:18,886
Chris: It's almost split down the middle.

350
00:46:18,958 --> 00:46:22,102
Kayla: This, my friends, I think, is just weird.

351
00:46:22,166 --> 00:46:29,470
Chris: It's. Oh, man, I really was not expecting that, but it has such potential for danger and harm and perniciousness.

352
00:46:29,550 --> 00:46:40,174
Kayla: I think, if we were to desire to punish our listeners, we'd go through every letter and evaluate every letter. And we're not.

353
00:46:40,222 --> 00:46:41,782
Chris: We should have done that at the time. That ship's sailed.

354
00:46:41,806 --> 00:47:13,964
Kayla: We're not doing that. We love our listeners. We're not going to do that. We could probably make individual cases that are different. Like, maybe the T is a little higher in this and the L is a little lower in this, and the L is a cult and the EA is just weird. We could probably play that game. But because we've discovered that this thing is a bundle, that the TESCREAL acronym is a terror management theory bundle, we're evaluating it as one. And I think you're right that it split down the middle. But I think that there's enough lows that I'm hitting just weird, and I don't want to say that.

355
00:47:14,012 --> 00:47:49,646
Chris: Yeah. And I also think, like, for me, this needs to, like, for me to call something a cult with these criteria, it needs to be, like, more than simple majority. It needs to be, like, most of the criteria. Like, you know, maybe with like one or two or three that don't. So I agree. Based on our criteria, it's not a cult. It's just weird. And I. You know, and I think about, like, okay, well, does anything make me balk about that? The harmfulness, like, things can be harmful without being a cult, right? Things can have, like, top down leaders and, you know, people influencing other people without being a cult.

356
00:47:49,718 --> 00:47:57,190
Kayla: But I wouldn't call, like. Is that the case? Is Chevron a cult? Probably. I don't know. Companies are cults. We did that.

357
00:47:57,230 --> 00:48:06,650
Chris: Yeah, we did. Well, I mean, that's the risk of a show like this, right? Is that we're like, you know, everything's a cult, right? Like, we could easily fall into that. And I think we shouldn't.

358
00:48:07,830 --> 00:48:26,112
Kayla: Man, I really don't want to call it just weird, but I think that. I think that TESCREAL as a whole, while it has the potential to grow and blossom into something more, currently is just weird and has some cult-like tendencies, for sure, as we learned from Doctor Torres and just from other people we've talked to. But, yeah, I think that's our definitive answer this season.

359
00:48:26,256 --> 00:48:30,216
Chris: I'm gonna tell Emile. They're gonna be so upset. They're gonna disown us on Twitter.

360
00:48:30,408 --> 00:48:38,672
Kayla: If you are in the same camp as Doctor Emile Torres and you disagree with us again, please let us know, because we're wrong all the time.

361
00:48:38,816 --> 00:48:39,136
Chris: Yeah.

362
00:48:39,168 --> 00:48:40,208
Kayla: And we wanna be corrected.

363
00:48:40,264 --> 00:48:42,936
Chris: No, we want to be less wrong.

364
00:48:43,008 --> 00:48:43,504
Kayla: Oh, God.

365
00:48:43,552 --> 00:49:34,214
Chris: Okay, so at the end of all this. Cause, like, you mentioned this earlier, we've kind of self-identified in the past as singularitarians, and in general, we kind of like this stuff. Like, in general, we kind of like cool, sort of, like, futurist cyborg stuff. We would like to not necessarily have to die at age 77. We would like to have more control over when we die. There's a lot of things that are appealing that have traditionally, in the past, been appealing to us. Maybe still are. So after doing this season of the show, how do you feel about TESCREAL? And I'll say TESCREAL, I know you've never been like, I'm an Extropian, but how do you feel about that part of your identity and belief system?

366
00:49:34,262 --> 00:49:37,890
Kayla: Now you're calling me a TESCREAList?

367
00:49:38,470 --> 00:49:39,250
Chris: Yeah.

368
00:49:41,990 --> 00:50:28,716
Kayla: I don't think that you have to be a TESCREAList in order to be a futurist. Futurist is not part of the TESCREAL acronym. It's not fTESCREAL. It probably could be, maybe, because a lot of TESCREAL people are futurists. But I don't think you have to be a TESCREAList to be a futurist. To be somebody who thinks about, is interested in, is excited about, is passionate about, maybe has a reverent relationship with what the future might hold in terms of, like, our technology and our expansion and our growth and what that means for, like, human beings and humankind and going off planet. I don't think you have to be a TESCREAList to be interested in those things. And I think that it's a little bit unfortunate that TESCREAL is such a big chunk of that community these days.

369
00:50:28,748 --> 00:50:42,852
Kayla: And it reminds me about how effective altruism got a little bit bastardized by the community that kind of took it over. It started as like, oh, this is people who want to be, like, frugal and ascetic and donate lots of their.

370
00:50:42,876 --> 00:50:52,740
Chris: Money to things versus, and to be fair, had, like, a naivete about what, you know, the power structures are that enabled them to have that sort of takeover.

371
00:50:52,820 --> 00:50:58,732
Kayla: They didn't build. William MacAskill didn't build a philosophy around, like, how can Elon Musk get billions of dollars?

372
00:50:58,836 --> 00:50:59,244
Chris: Right.

373
00:50:59,332 --> 00:51:04,796
Kayla: It just kind of, because of the blind spots in EA, got co-opted by the billionaires.

374
00:51:04,828 --> 00:51:05,388
Chris: Sure. Yeah.

375
00:51:05,444 --> 00:51:17,172
Kayla: And I think that futurism, or, like, thinking about the future and thinking about technology and thinking about cyborgs and thinking about all these cool things, has gotten a bit co-opted, via the blind spots, by TESCREAL.

376
00:51:17,236 --> 00:51:17,676
Chris: Right.

377
00:51:17,788 --> 00:51:24,004
Kayla: And so I think that I can still identify as somebody who is a futurist.

378
00:51:24,132 --> 00:51:29,716
Chris: Would you, if somebody asked, identify as a TESCREAList? If somebody asked you, were you a transhumanist, what would you say?

379
00:51:29,868 --> 00:51:41,596
Kayla: No, no, I think because you introduced crip theory in the last episode, and that gives me something that I align with more to talk about.

380
00:51:41,668 --> 00:51:49,020
Chris: Then you could say you're into crip technoscience. Yeah, yeah, I kind of. So I agree with all that.

381
00:51:49,100 --> 00:52:00,416
Kayla: I'm not saying all TESCREALists are bad and I don't talk to them. I'd love to have these conversations with people. And there's definitely, you know, take what you like and leave the rest. There's definitely things there that I can take, and there's a lot of things that I can leave.

382
00:52:00,488 --> 00:52:39,830
Chris: Yeah, I think take what you like and leave the rest is a big part of this show. We've definitely talked over the years about sort of being anti the concept that, like, oh, you're brainwashed when you're in the cult, and then when you're out of the cult, you're super not brainwashed. That sort of, like, binary of, like, oh, my God, I believe everything these guys are saying and the rest is evil. And then when you get out, oh, my God, those guys were evil, and I believe the opposite of everything. I think that what we advocate, or think is a better model for understanding, is a more integrative approach. For example, I'm really glad that we did those episodes on Ayn Rand and objectivism, because I think that's a good example.

383
00:52:40,530 --> 00:53:24,318
Chris: There's definitely still some things that I think I keep in me from that time, even though a lot of it is crap, a lot of it is horrible. And I just think that's a more healthy relationship with some of our past engagements with things that we may not like now. It's the sort of, like, again, take what you want. It kind of sounds so cliche, because it is a cliche, but, like, I think it's the best approach, because if I sat here now and was like, everything about objectivism was bad, and now everything I think is right, then, like, I would just be running into the same. I would have the same blind spots. I would have the same, like, I can't be wrong, everything I believe right now must be correct. And then what good is that?

384
00:53:24,414 --> 00:53:46,262
Kayla: And I think that having been a part of some of these communities in some ways better positions you to be able to criticize from within. Part of why we wanted to bring Doctor Torres on the show as an authority is not simply because they helped coin the term TESCREAL and have done this research, but also because they have a deep background in being a member of that community.

385
00:53:46,366 --> 00:53:50,456
Chris: Yeah, they helped edit Ray Kurzweil's second book.

386
00:53:50,598 --> 00:54:01,260
Kayla: And I think that people who have that lived experience are better positioned to offer honest and valid criticism to a movement.

387
00:54:01,380 --> 00:55:00,368
Chris: Yeah. So with that in mind, would I call myself a transhumanist? I don't know, maybe I would say I'm a crip technoscientist instead. But if somebody said, yes or no, just transhumanist, and didn't ask me to give an alternative, then I might still say yes. I think there's a lot of danger here. I think that doing the eugenics episodes definitely opened my eyes to the idea that. And I think this is partially because of us talking about crip theory and crip technoscience. Like, the how you go about achieving a goal really does matter. The goal, like. You might say that this is me putting words in people's mouths, so this is just my thought. But you might say, like, a cyborgist and a eugenicist both have this goal of, like, transcending their biology. Right.

388
00:55:00,504 --> 00:55:31,956
Chris: But, like, one is going about it via very pernicious, bad means that have been proven to be extremely dangerous, to the genocide level. And then the other one is doing it in a manner that is, like, you know, creative and individual-focused and blah, blah. So I think that, like, the how makes a big difference. And that's why I think I'd be comfortable saying I'm a transhumanist. I'd be like, I'm a transhumanist, but I don't like Idiocracy. I'm a transhumanist, but I don't believe in dysgenic pressures.

389
00:55:32,068 --> 00:55:35,076
Kayla: So it's a little bit like my relationship to the word vegetarian.

390
00:55:35,188 --> 00:55:35,940
Chris: Mm.

391
00:55:36,100 --> 00:55:38,068
Kayla: I don't like to call myself that.

392
00:55:38,244 --> 00:55:41,494
Chris: And part of that vegetarianism is a dysgenic pressure.

393
00:55:41,572 --> 00:55:58,402
Kayla: It's absolutely a cult. No, part of that is because of the stigma that's associated with it, which is definitely less now. Like, I'm somebody who. My vegetarianism has waxed and waned through my life, but I came up as a vegetarian in a time when it was, like, really not looked upon kindly.

394
00:55:58,426 --> 00:56:01,722
Chris: Well, you also lived in a pretty red meat sort of area.

395
00:56:01,786 --> 00:56:29,476
Kayla: Yeah. And so there was stigma attached. And then also there is internal stigma. There are people who are vegetarians and vegans who I am not in community with, because I don't fuck with the way that they talk about the underpinning philosophy of vegetarianism and veganism. So I have a little bit of a complicated relationship with that label. And I think that mirrors a little bit, maybe, how you're feeling about something like TESCREAL or transhumanism.

396
00:56:29,628 --> 00:56:58,138
Chris: Yeah. And I don't think we're nearly at the place where somebody's going to ask me if I'm a TESCREAList. It's not nearly that much in the lexicon yet, but I think somebody might ask you if you're a transhumanist. And certainly I think. I think our friends might ask us how we feel about the whole thing now that we've done these episodes this season. And I think that's my answer. I had already kind of given up on a few of the singularitarian things that feel a little more fantastical.

397
00:56:58,234 --> 00:57:01,390
Kayla: No hope in my life.

398
00:57:03,330 --> 00:57:32,130
Chris: But I really have. Until I started really engaging with some of the TESCREAL stuff from Timnit and Emile, I didn't see as much harm as I see now, and as much potential harm. And I think that is useful when these ideas are so influential in Silicon Valley. They're so influential in the most influential place that has ever existed. I think it's important to know those things.

399
00:57:34,590 --> 00:57:42,518
Kayla: So, Chris, I think that's a good place to kind of transition us into the latter. The second half of this episode or.

400
00:57:42,534 --> 00:57:45,330
Chris: Second part of this episode to Transhuman.

401
00:57:45,710 --> 00:58:27,804
Kayla: To transfer what we've been talking about. We started this season by talking about death. We were really captivated by what cryonics meant about our relationship to death when we started the season and before we started the season, and we wanted to extrapolate that and explore that. And that led us to TESCREAL, which I would still argue is very much about the fear of death and is very much about managing terror regarding our mortality. So much of this has to do with immortality. And not just immortality of the self, but immortality of the species, ensuring that humanity exists forever. People who are talking about x-risk, existential risk, are managing their.

402
00:58:27,892 --> 00:58:39,446
Chris: Their terror about the species. Yeah. And transhumanism is like, we want to transcend human biology and do all these things and make people better in so many ways, but mostly we want to have longevity.

403
00:58:39,518 --> 00:58:43,622
Kayla: Right? And so I kind of want to. You asked me a question, so I'm gonna ask you a question.

404
00:58:43,726 --> 00:58:44,750
Chris: No, please don't.

405
00:58:44,870 --> 00:59:01,050
Kayla: I think we've distilled what we're saying about, like, TESCREAL. What are we saying about the fear of death with this season? Like, what is your takeaway on that? And I have a second question to transition into that. So if you want me to ask you the second half, I can. Or you can answer this part.

406
00:59:01,990 --> 00:59:03,730
Chris: Go ahead and lay the second half on me.

407
00:59:04,100 --> 00:59:11,200
Kayla: I want to talk to you about how maybe your relationship to your fear of death has been altered, or not, by this season.

408
00:59:15,100 --> 00:59:28,012
Chris: Honestly, I think I had. If this season had taken away my dreams of living in a San Junipero, of uploading my brain to a heaven-adjacent techno heaven.

409
00:59:28,076 --> 00:59:28,540
Kayla: Right.

410
00:59:28,660 --> 00:59:29,092
Chris: So that I could.

411
00:59:29,116 --> 00:59:31,156
Kayla: Persistent world. Or persistent-world digital self.

412
00:59:31,228 --> 01:00:26,270
Chris: Right. Of pleasure and joy or whatever, then I think I would have a different answer than I do. But I think I had already kind of. Before we started this season, I was already pretty skeptical that the rapture for nerds was actually going to become a reality, right? So I don't think it's really changed my relationship to death as much. I do think that the last episode of last season, though, did. The death cafes. The death cafes did. And I don't necessarily think. I'm a little torn. Inside me, there are two wolves, as always, and one wolf is like, hey, cryonics might be cool, and it would be nice to live as long as you want and choose your own time of passing. That would be cool. And then there's another wolf inside me that's like, that.

413
01:00:26,310 --> 01:01:18,256
Chris: Fear has led humans down some weird paths. According to terror management theory, all of the paths we have gone down over the centuries and millennia have been because of our fear of death. So there's this other wolf that's kind of going, like, yeah, but maybe it's better to somehow find a way to peace, to come to terms with that reality. And I think that I definitely get more of that from the death cafe. One of the biggest takeaways I have from that is how often we talk about living well and how not often we talk about dying well. And so, at least over the past year, I've thought that if somebody dies well, that's important. I don't know. Sometimes we joke about this, right? Like, okay, Evel Knievel does 47 flips over the Grand Canyon while he's on fire, and cool.

414
01:01:18,288 --> 01:01:32,770
Chris: And we're like, well, but what if he dies? And I'm like, what a way to go. Like, after going to death cafe and sort of having that mindset changed a little bit about dying well, right? Now, when I say what a way to go, I'm, like, kind of serious.

415
01:01:32,930 --> 01:01:33,530
Kayla: Yeah.

416
01:01:33,650 --> 01:01:40,442
Chris: You know, like, if that's. If your life's work is doing crazy shit like that, and that's how you.

417
01:01:40,466 --> 01:01:47,230
Kayla: Go out if your life's work is dying. Well, yeah, that's tremendous.

418
01:01:47,860 --> 01:01:57,596
Chris: Yeah. So, like, sometimes now when I see things where it's like, well, that's risky, you know, rather than being like, oh, no, the most. The overriding, most important thing is to not die.

419
01:01:57,668 --> 01:01:58,280
Kayla: Right.

420
01:01:58,780 --> 01:02:02,012
Chris: But maybe we should make some room for dying well.

421
01:02:02,076 --> 01:02:14,588
Kayla: Not that we're advocating everybody go out and jump the Grand Canyon on motorcycles. I'm still. I totally see what you're saying, and I still will probably continue to live my life with, like, oh, no, that's scary. That's risky. I don't want to.

422
01:02:14,604 --> 01:02:19,376
Chris: No. Yeah, yeah. I think that having an appreciation for risk and safety is also important.

423
01:02:19,508 --> 01:02:54,216
Kayla: It makes me think more about, like, you know, I don't. No spoilers, but it makes me think about a recent tv show that we've watched in which a survival scenario was presented to a group of people, and lots of those people did not survive. Most of the people did not survive. And many of those people went out in ways that, while fictional, were beautiful. Defending other people, helping other people, you know, making meaning out of their death in ways that, frankly, made them more immortal than simple immortality would.

424
01:02:54,328 --> 01:03:09,608
Chris: Yeah. Yeah. So I know that this season has definitely changed some of my perspectives. I just don't know if it's changed my perspective on death. It's changed my perspective on TESCREAL. Has it changed yours? I'm still pretty afraid of dying.

425
01:03:09,664 --> 01:03:10,420
Kayla: Me too.

426
01:03:10,930 --> 01:03:20,546
Chris: The only thing that has been able to alleviate that a little bit for me was in the brief moments that our son was alive.

427
01:03:20,698 --> 01:03:24,122
Kayla: Yeah. So not the podcast.

428
01:03:24,226 --> 01:03:29,310
Chris: So not the podcast. Sadly, it's a good podcast. It's not that good.

429
01:03:30,330 --> 01:03:34,674
Kayla: I think, for me, it has affected my fear of death.

430
01:03:34,802 --> 01:03:35,474
Chris: Really?

431
01:03:35,642 --> 01:03:56,954
Kayla: And I think when I say that, I mean not that it's lessened my fear at all. I'm still terrified. But it's made me more comfortable in, like, knowing that I kind of have to build a relationship with that fear and not try to manage it away. Like, yes, according to terror management theory, that's kind of what we're always doing.

432
01:03:57,082 --> 01:04:03,350
Chris: The desire to not manage away your terror is itself terror management.

433
01:04:04,410 --> 01:04:38,026
Kayla: I think it's just made me. When I first learned about cryonics, it instilled a panic in me, really. I became panicked to a degree of, like, oh, fuck, I have to do this, and I have to do this now, and I have to do this fast, and I need to get you signed up, and, like, should we have done this for our son? Am I a bad parent? Do I need my. Do I need to get my family involved in this? Like, we need to do this now. Now. Like, I felt a panic of, like, if I don't do this, then I'm going to die and be dead, and then that's it. And there's no. It's like, it's the worst thing I can think of. And to me, that was terror management. That was me going, like, well, Kayla.

434
01:04:38,098 --> 01:04:39,338
Chris: That was just your hyperfixation.

435
01:04:39,394 --> 01:04:47,954
Kayla: Yeah. I've discovered a way to solve this unsolvable thing. I've discovered a way to surmount death. And so I have to do it now. I could die at any time. I have to do it now.

436
01:04:48,122 --> 01:04:48,802
Chris: Yeah.

437
01:04:48,946 --> 01:04:57,130
Kayla: And going through this season, I don't feel that anymore. I actually feel much more calm in a weird way where.

438
01:04:57,250 --> 01:05:01,314
Chris: Cause we don't wanna wake up alongside Peter Thiel and Nick Bostrom.

439
01:05:01,362 --> 01:05:57,496
Kayla: Peter Thiel and Nick Bostrom. That does help, I'm not gonna lie. And also, I think just talking about Crown X and talking about the Church of Perpetual Life and talking about longtermism and EA and transhumanism and extropianism and cosmism, and how much, like, panic and anxiety are behind these things, makes me realize that I can either spend my life in panic and anxiety and try and stop death for myself, or not. And kind of either one's okay. And that's. I'm seeing a possibility that there's just as much value in going, I'm not going to solve my own death, and that's okay, as there is in going, I think I found a way, and that's what I want to do. And I'm more comfortable right now with the part of me that's going, I can have a relationship with this fear of death.

440
01:05:57,528 --> 01:06:09,464
Kayla: And if I don't sign up for cryonics and I'm committing to the finality rather than leaving an ellipsis, I feel more comfortable about that than I did when we started the season. I feel a lot more comfortable about that.

441
01:06:09,552 --> 01:06:14,056
Chris: Yeah. Dying well can be as valuable as living forever.

442
01:06:14,208 --> 01:06:14,980
Kayla: Yeah.

443
01:06:16,160 --> 01:06:20,780
Chris: Would you still sign up for cryonics? Say you had money to spare?

444
01:06:21,880 --> 01:06:24,400
Kayla: If I had money. If I had. Fuck you money? Yes.

445
01:06:24,560 --> 01:06:25,288
Chris: Okay.

446
01:06:25,424 --> 01:06:29,192
Kayla: I would sooner sign up for cryonics than buy a fancy car. But considering that I don't.

447
01:06:29,216 --> 01:06:31,140
Chris: What if you had, like, screw you, buddy money?

448
01:06:31,760 --> 01:07:06,840
Kayla: Considering that I don't. And so I'm evaluating my life as the person I am now who has to make budgetary decisions and decide, okay, do I want to live in a one bedroom apartment and not have children and live as frugally as I possibly can in this life for the possibility of an extension. I'm not currently willing to make that trade off, and I'm. And I feel comfortable with that. I feel comfortable with living well now or living as well as I can, rather than trying to tie my life to the possibility of more.

449
01:07:08,260 --> 01:07:16,924
Chris: What if you woke up in a solarpunk utopia, though? Well, you wouldn't. You wouldn't. Because the Peter Thiel and Nick Bostrom.

450
01:07:17,052 --> 01:07:20,034
Kayla: Yeah, that's cool. And also, like, what if they weren't?

451
01:07:20,132 --> 01:07:26,886
Chris: What if it was, like, people that you liked were all signed up and all these douchebags weren't? Would that actually make a difference?

452
01:07:27,078 --> 01:07:32,166
Kayla: Yeah, kind of would. Yes, it would. But that's not the reality.

453
01:07:32,318 --> 01:07:33,246
Chris: Yeah, I know.

454
01:07:33,358 --> 01:08:16,564
Kayla: And I also think that, like, right now, where I am talking about this, I'm feeling less attachment to, like, I have. I. The me. The me that is the I that I am right now has to experience that, or else it's not valuable. Or else. Or else I'm a little detached. I'm a little less attached to the FOMO. And maybe that's just because I'm like, I don't know. Am I about to have a midlife crisis? I don't know. But I think that I can find. I think part of that is talking this season about how so many of these people are not. Like I said, they're not all cryonicists. So many of these people are not signing up for forever life for themselves.

455
01:08:16,732 --> 01:08:22,220
Kayla: So many of these people are talking about longtermism and talking about the future of humanity, not the future of themselves.

456
01:08:22,340 --> 01:08:22,812
Chris: Right.

457
01:08:22,916 --> 01:08:57,944
Kayla: And so if I can do. If I can take what I like from that and leave the rest, it's okay that other people will get to experience that and not me. And I find a value in that. Like, even though my consciousness does not experience that. Like, we're all just pieces of the universe waking up to itself. We're all that. I do find some comfort in thinking about how somebody else will experience that. And in a way, that's a humanity thing, not a me thing. Does that make sense?

458
01:08:58,032 --> 01:09:01,504
Chris: It does. I don't know if this is for the show. It's kind of gonna burst your bubble.

459
01:09:01,671 --> 01:09:03,819
Kayla: No. Why?

460
01:09:06,979 --> 01:09:10,080
Chris: That's great. I love it.

461
01:09:10,979 --> 01:09:13,228
Kayla: But cool story, bro.

462
01:09:13,283 --> 01:09:22,308
Chris: Cool story, bro. No, it's beautiful. But isn't that also true for people who got put in bronze bulls and tortured and.

463
01:09:22,484 --> 01:09:23,300
Kayla: What do you mean?

464
01:09:23,420 --> 01:09:35,801
Chris: Horrible things happen, too. It's kind of like at the end of season one of True Detective. Bro, you ask me, for every good thing, for every part of the universe that's waking up experiencing something cool, there's also, brother, a negative one.

465
01:09:35,906 --> 01:09:49,590
Kayla: You can't ask me, what if you woke up in a solarpunk utopia? Wouldn't that be great? And then when I say, like, yeah, I hope other people get to experience that, you can't gotcha me and say, like, well, what if there's bad stuff? You presented me a scenario and I answered it.

466
01:09:50,170 --> 01:10:02,970
Chris: What if the solarpunk utopia doesn't have SPF? Then what are you gonna do with all the solars? Yeah, no, I think that's a good answer. That is a beautiful answer.

467
01:10:04,190 --> 01:10:26,102
Kayla: And I'm not saying I don't want to live forever. I'm just saying I'm a little more okay, after this season, with acknowledging and accepting that may not happen. When I learned about the singularity and I learned about, oh, my God, I could live forever, I kind of really attached to that, and I'm unattaching now.

468
01:10:26,286 --> 01:10:47,062
Chris: Yeah. Just to answer my own question, I think I would still sign up for cryonics. I don't know. I think it's basically very similar to what you said, where there are budgetary constraints for us. We're not fuck-you money. But if the budgetary constraints were relaxed for whatever reason, because we do get fuck-you money or whatever, if I could.

469
01:10:47,086 --> 01:10:51,534
Kayla: Get a good life insurance policy, which unfortunately, I probably can't, it's kind of.

470
01:10:51,542 --> 01:10:56,818
Chris: Like the might as well thing. I'm going to be dead anyway. Might as well roll the dice.

471
01:10:56,954 --> 01:11:03,698
Kayla: And I think coming at it from a might as well perspective versus a panic and anxiety perspective feels fine.

472
01:11:03,794 --> 01:11:12,770
Chris: And I think I had a bit of that sort of transition, like you were talking about more with the death cafe than with all this stuff.

473
01:11:12,850 --> 01:11:16,710
Kayla: Yeah. Are you feeling done?

474
01:11:17,810 --> 01:11:34,354
Chris: What I'm feeling is, like, I've had a million thoughts over the past, like, five months about all of this, and now, like, I don't know if I'm, like, now the panic I have is there's, like, just all these really good thoughts up in my head that I just can't remember.

475
01:11:34,482 --> 01:11:40,510
Kayla: That's why we're going to leave room open for future episodes this season, even though this is the finale.

476
01:11:41,050 --> 01:11:44,314
Chris: Oh, wow. What a good. I didn't even do that on purpose.

477
01:11:44,362 --> 01:11:49,640
Kayla: I know. This is called making accommodations for ADHD. We're an accessible podcast.

478
01:11:49,720 --> 01:11:50,536
Chris: That's right.

479
01:11:50,688 --> 01:11:56,288
Kayla: I guess if we're ready to close out, I kind of have just a little food for thought to maybe close us out.

480
01:11:56,384 --> 01:11:58,136
Chris: Ooh, what is it? Is it Mac and cheese?

481
01:11:58,208 --> 01:12:10,820
Kayla: Well, I just. I think that something that is cool about TESCREAL and futurism and cryonics and all of this that we've talked about this season is that it presents the opportunity for a grand adventure. This is all grand adventure stuff.

482
01:12:11,120 --> 01:12:19,170
Chris: Even though I'm waking up next to Peter Thiel, it might be in this, like, crazy Samurai Jack postmodern weird.

483
01:12:19,210 --> 01:12:29,746
Kayla: world. And the things that transhumanism may do to our bodies that we can't even conceive of, and thinking about what the long-term future is and what humanity might look like when it spreads out across the stars.

484
01:12:29,858 --> 01:12:31,554
Chris: This is grand adventure stuff.

485
01:12:31,602 --> 01:12:31,786
Kayla: Yeah.

486
01:12:31,818 --> 01:12:33,042
Chris: I would like to visit Mars.

487
01:12:33,106 --> 01:12:34,074
Kayla: It's very cool.

488
01:12:34,162 --> 01:12:39,610
Chris: I know there's a lot to dislike about it all. There's a lot to criticize, but, man, I really want to visit Mars.

489
01:12:39,770 --> 01:13:04,384
Kayla: And I think that's something, again, that I will take from TESCREAL: that there's really cool things to think about here. Sci-fi is cool for a reason, and this is why. The grand adventure. But I came across a quote from Jane Goodall recently, and it kind of offered an alternative to that as the only great adventure. So I'm just going to read this quote with you and the rest of our listeners.

490
01:13:04,472 --> 01:13:06,020
Chris: Okay. Is it ook?

491
01:13:07,510 --> 01:13:09,370
Kayla: Is that how chimpanzees talk?

492
01:13:09,710 --> 01:13:12,926
Chris: I think they do sign language, actually. So you're gonna sign this to me?

493
01:13:12,958 --> 01:13:14,254
Kayla: I'm not gonna sign this to you.

494
01:13:14,342 --> 01:13:17,930
Chris: Okay. That would not be good on a podcast. That would be actually not accessible.

495
01:13:18,550 --> 01:13:34,370
Kayla: So here's the quote. My next great adventure, aged 90, is going to be dying. There's either nothing or something. If there's nothing, that's it. If there's something, I can't think of a greater adventure than finding out what it is.

496
01:13:36,790 --> 01:13:47,662
Chris: And thank you to all of our wonderful listeners for going on this adventure with us, of this TESCREAL terror management season of Cult or Just Weird.

497
01:13:47,846 --> 01:13:49,670
Kayla: This is Kayla, and this is Chris.

498
01:13:49,790 --> 01:13:52,190
Chris: And this has been season six of.

499
01:13:52,310 --> 01:13:53,910
Kayla: Cult or Just Weird.