Join the conversation on Discord!
June 4, 2024

S6E10 - The Future of Humanity: Rationalists

Wanna chat about the episode? Or just hang out?

Come join us on discord!

 

---

The Territory is not The Map.

 

Chris & Kayla fall out of a coconut tree and continue their discussion of LessWrong.

 

---

*Search Categories*

Anthropological; Science / Pseudoscience; Common interest / Fandom; Internet Culture

 

---

*Topic Spoiler*

LessWrong

 

---

*Further Reading*

https://www.lesswrong.com/

https://www.reddit.com/r/LessWrong/

https://rationalwiki.org/wiki/LessWrong

https://en.wikipedia.org/wiki/LessWrong

The Sequences

just some Harry Potter fanfic

https://www.lesswrong.com/posts/DyG8tzmj3NnGRE8Gt/explaining-the-rationalist-movement-to-the-uninitiated

https://www.lesswrong.com/posts/LgavAYtzFQZKg95WC/extreme-rationality-it-s-not-that-great (the community is remarkably self-aware)

Yudkowsky on cults!

Yudkowsky on cults again!

the podcast eats its tail (CoJW's previous LessWrong episode)

https://rationalwiki.org/wiki/Roko's_basilisk

https://news.ycombinator.com/item?id=8060440

https://www.reddit.com/r/askphilosophy/comments/19cmro3/should_i_read_lesswrong_wiki_or_not/

https://www.reddit.com/r/askphilosophy/comments/2ddfs6/comment/cjp22pf/

Slate Star Codex (LessWrong 2.0)

somethingawful has a bitch-thread about LessWrong

https://www.reddit.com/r/SneerClub/ (LessWrong haters' group)

https://rationalwiki.org/wiki/Bayesian

https://en.wikipedia.org/wiki/Functional_Decision_Theory

https://en.wikipedia.org/wiki/Newcomb%27s_paradox

Time platforms Yudkowsky's doomerism

Receipts for how bad Yudkowsky's predictions have been

 

---

*Patreon Credits*

Michaela Evans, Heather Aunspach, Alyssa Ottum, David Whiteside, Jade A, amy sarah marshall, Martina Dobson, Eillie Anzilotti, Lewis Brown, Kelly Smith Upton, Wild Hunt Alex, Niklas Brock


Jenny Lamb, Matthew Walden, Rebecca Kirsch, Pam Westergard, Ryan Quinn, Paul Sweeney, Erin Bratu, Liz T, Lianne Cole, Samantha Bayliff, Katie Larimer, Fio H, Jessica Senk, Proper Gander, Nancy Carlson, Carly Westergard-Dobson, banana, Megan Blackburn, Instantly Joy, Athena of CaveSystem, John Grelish, Rose Kerchinske, Annika Ramen, Alicia Smith, Kevin, Velm, Dan Malmud, tiny, Dom, Tribe Label - Panda - Austin, Noelle Hoover, Tesa Hamilton, Nicole Carter, Paige, Brian Lancaster, tiny

Transcript
1
00:00:00,520 --> 00:00:28,530
Chris: Somebody posts, should I read the LessWrong wiki or not? And then says, like, yeah, I recently discovered it because of Roko's basilisk, actually. So, like, what's going on there? Like, should I read it? Is it good? And then the first answer here says, I read through a bunch of it some years ago because I had heard some of my students mention it and wanted to see what the fuss was about. The main problems I could find were the following, and each one of these problems listed has a whole description. But I'm just gonna just read the bullet points. One. Reinventing the wheel.

2
00:00:29,310 --> 00:00:34,950
Kayla: Two. See, that's what happens when you don't engage with. What do you think, you just fell out of a coconut tree?

3
00:00:48,050 --> 00:00:54,106
Chris: Welcome back to Cult or Just Weird. I'm Chris. I'm a game designer and data scientist.

4
00:00:54,298 --> 00:01:00,818
Kayla: I'm Kayla. I have so many biases. So many biases. I'm biased and irrational.

5
00:01:00,874 --> 00:01:02,936
Chris: I am incredibly biased and irrational.

6
00:01:03,058 --> 00:01:04,596
Kayla: That was one of the reviews we've gotten.

7
00:01:04,628 --> 00:01:06,228
Chris: No, it's irresponsible and biased.

8
00:01:06,244 --> 00:01:20,268
Kayla: No, it wasn't irrational. Okay, well, in addition to being irresponsible and biased, we are also irrational, irresponsible, and biased. If you want to support those three things about this show, you can by going to patreon.com/cultorjustweird. And if you want to pay us —

9
00:01:20,284 --> 00:01:21,708
Chris: For our irrationality, if you want to.

10
00:01:21,724 --> 00:01:26,600
Kayla: Talk about how irresponsible we are, join us on our discord that is linked in the show notes.

11
00:01:27,540 --> 00:01:50,658
Chris: So, Kayla, this is probably the hardest question I've ever asked you at the top of a cult or just weird episode. But I need you to think of just one of the wonderful things about me that you like. I know it's going to be a challenge to think of just one, but try. And here, I'll give you some help. I've heard you say before that you really like how I like to think about things really carefully.

12
00:01:50,714 --> 00:01:52,146
Kayla: That's my favorite thing.

13
00:01:52,338 --> 00:01:57,056
Chris: And, like, all joking aside, that's not just something that you like about me. That's something I like about you, too.

14
00:01:57,128 --> 00:01:59,020
Kayla: Oh, I don't do it the same way you do.

15
00:02:00,520 --> 00:02:11,008
Chris: Well, that's why you're irresponsible and biased. So. But what if I told you that there was a whole community of Internet denizens that was dedicated to precisely that sort of thing?

16
00:02:11,104 --> 00:02:12,736
Kayla: I believe they're called netizens.

17
00:02:12,808 --> 00:02:24,274
Chris: And what if I told you that it was the same community that we are currently comparing to a cult because of how inside baseball the jargon is and how they have a set of scriptures written by a charismatic leader.

18
00:02:24,392 --> 00:02:30,550
Kayla: Are you saying that thinking is a cult activity? That sounds like something a cult would say.

19
00:02:30,630 --> 00:02:31,638
Chris: Well, we are a cult.

20
00:02:31,774 --> 00:02:33,294
Kayla: I don't know how to feel anymore.

21
00:02:33,422 --> 00:03:06,380
Chris: It's okay. Good. Because now you know how I have been feeling, and you mentioned this last episode, but, yeah, you already know that I've been, like, really disoriented because I've been, like, venting and fuming all week about how disoriented all this research has made me feel. But now I want to vent and fume on the podcast. I don't know what the hell is going on. I can't decide whether — in anything, but also in particular, right now, with LessWrong — I can't decide whether it's just, like, a doomsday cult or whether it's something I like.

22
00:03:07,160 --> 00:03:35,036
Kayla: I mean, do we want to get into this here? But, like, you and I have been talking a little bit, the last little bit, about how the singularity. Singularitarians and, like, some of these guys, like, echo millenarial, millenarian beliefs. Yeah, echo millenarian beliefs. Echo, like, trying to bring about or, like, wanting the end times to occur because the end times will signal the death of the bad and the birth of the good.

23
00:03:35,148 --> 00:03:54,318
Chris: Absolutely. And we will get to that. Singularitarianism, which I believe is the correct ism to refer to singularitarians, has a huge overlap with this community, which we will get to. But right now, I'm just like, I want to talk about how. I don't know. Like, I was really.

24
00:03:54,334 --> 00:03:57,902
Kayla: You just want to gab? You just want to, like, I just want to gab. You want to sit crooked and talk straight.

25
00:03:57,966 --> 00:04:20,140
Chris: Yeah, exactly. Like, when I was doing this research, I was like, all ready to be like, all right, let's see what these dipshits are about. You know, because I had my same biases you do about, like, Eliezer and, like, his dumbass quotes, because those are the things that rise to the surface, blah, blah. And, like, as I was reading more of the posts, I was like, I don't know. Like, I hate to use these words, but, like, I was like, man, these are kind of my people a little bit.

26
00:04:21,160 --> 00:04:22,952
Kayla: It's like being a Rick and Morty fan.

27
00:04:23,056 --> 00:04:28,936
Chris: Yeah. Yeah. Like, they kind of, like, I like arguing about pedantic stuff just to think about it.

28
00:04:28,968 --> 00:04:29,232
Kayla: Right.

29
00:04:29,296 --> 00:04:37,944
Chris: You know, I like arguing about, like, strange hypotheticals and, like, stuff where, like, a lot of people will be like, that doesn't apply to real life what the hell is wrong with.

30
00:04:37,952 --> 00:04:43,172
Kayla: I once had an eight hour conversation with friends about what ontologically makes up a sandwich.

31
00:04:43,296 --> 00:04:52,380
Chris: Yes. Oh, exactly. Right. Okay. Yeah, exactly. So it's that kind of stuff. Like, but on a website with a lot of people, and I was like, as I was reading this, I was like, man, I really like that actually.

32
00:04:52,460 --> 00:04:53,020
Kayla: Yeah.

33
00:04:53,140 --> 00:05:34,496
Chris: And, like, for the most part, there are good comment sections on the Internet and there are bad comment sections on the Internet. And, like, this comment section, like, felt pretty decent to me. Like, everybody was kind of, like, respectful even though they disagreed. That's so nice. Because they kind of, like, it's kind of built in, like, we're supposed to disagree. Like, we're supposed to be, like, challenging each other's biases and stuff. Now, that doesn't always happen, but for the most part, it seemed like posts and comments on those posts were... total circle jerk, because they always linked to stuff all around their own site — so it's total tvtropes.com — but for the most part, I don't know, nobody was, like, calling each other an idiot or being like, ugh, this is stupid libtard stuff.

34
00:05:34,568 --> 00:05:35,656
Kayla: It's very refreshing.

35
00:05:35,768 --> 00:05:37,416
Chris: Yeah. Like, I kind of like this.

36
00:05:37,488 --> 00:05:39,060
Kayla: I know that sounds really nice.

37
00:05:39,900 --> 00:05:42,932
Chris: So I don't know. Like, I don't know how to.

38
00:05:42,956 --> 00:05:43,820
Kayla: Can I ask you a question?

39
00:05:43,860 --> 00:05:48,644
Chris: Can I approach this? But, yeah, you totally can. So I have a pros and cons list.

40
00:05:48,732 --> 00:06:11,414
Kayla: Nice. Okay, I want to hear the pros and cons. Nice is not on the same, like, spectrum as good and bad. So nice is great — prefer nice — and also, kind versus nice is a different thing, because you can be extremely nice and polite while being, like, a racist bigot.

41
00:06:11,502 --> 00:06:12,942
Chris: Oh, okay. Sure, sure.

42
00:06:13,006 --> 00:06:32,610
Kayla: And so my question here is: is the style of communication polite, and also there's room for racism and bigotry? Or is the style of communication polite, and it is also a welcoming atmosphere to marginalized populations?

43
00:06:33,550 --> 00:06:37,702
Chris: So I think that this is the type of community, and I don't know.

44
00:06:37,726 --> 00:06:39,230
Kayla: Either way, I'm truly asking.

45
00:06:39,270 --> 00:07:23,712
Chris: No, I know. Yeah, I think this is the type of community — I think any community can definitely have some of that, like, bad stuff creep in. But I think this community is maybe a little more prone than, like, average, because they definitely have, like, a laissez-faire attitude — more prone to having, like, bad ideas, bad actors, shitty people creep in. I think they are more prone to that because of the sort of, like, you know, veneer of, well, we're all just sort of talking about rationalism stuff, and we're not going to — I'm not going to critique you for something I disagree with. So somebody can come in and be like, hello, here's some racist things, and, like, it's not really encouraged to, like, shun someone like that.

46
00:07:23,736 --> 00:07:31,426
Chris: It's more encouraged to, like, engage with that person. And they're more pro-debate than they are pro-deplatforming, if that makes sense.

47
00:07:31,458 --> 00:07:44,114
Kayla: Sure. Which is. Which is, again, like, I don't know if that's right or wrong. I just mean more like, it's possible to have an extremely polite conversation about whether or not certain queer people should be able to exist in society.

48
00:07:44,162 --> 00:08:24,356
Chris: Totally. Yeah. So I think it's more like what I was just saying where it's like, it's. I don't think the majority of them have that view themselves, but I. You're gonna find it a little more difficult to combat if you're not willing to curate as part of your sort of overall steez, which they've kind of had some of that problem, which we'll get to a little bit, but for the most part. Yeah, for the most part, I don't think. I don't. I don't find it to be. I didn't find it to be a community that was like, we are very polite. We're like, polite owners of a plantation. Like, I didn't get that impression. I got more of, like, we are always trying to engage in good faith, and sometimes that can bite you in the ass.

49
00:08:24,488 --> 00:08:26,240
Kayla: That's. And that's.

50
00:08:27,100 --> 00:08:27,652
Chris: I know.

51
00:08:27,716 --> 00:08:32,700
Kayla: I think that's a really. In certain arenas, like, that is the way to operate.

52
00:08:32,780 --> 00:08:33,236
Chris: I know.

53
00:08:33,308 --> 00:08:44,724
Kayla: Not in all arenas necessarily, but in certain arenas, acting as if everyone is engaging in good faith is, like, very important. But, yeah, you're gonna get fucked by people who aren't engaging in good faith.

54
00:08:44,772 --> 00:08:57,000
Chris: Yeah, I think that they had a little bit of security by obscurity that something like Twitter wouldn't have with that. Right? You're not really gonna find yourself on LessWrong unless you give a shit and you, like, read about it somewhere else.

55
00:08:57,080 --> 00:08:57,536
Kayla: Right.

56
00:08:57,648 --> 00:09:08,904
Chris: Whereas, like, a cross-population sample on Twitter is gonna have x percent that are, like, deplorable idiots.

57
00:09:08,912 --> 00:09:09,940
Kayla: Ballsy of you to use that word.

58
00:09:12,080 --> 00:09:17,300
Chris: You know, Hillary's my gal.

59
00:09:18,400 --> 00:09:19,420
Kayla: I'm with her.

60
00:09:20,960 --> 00:09:49,506
Chris: You're totally derailing me, Kayla. That's right, this is the derail episode of this. But yeah, no, there's going to be a percentage of folks on a site like Twitter that are just going to suck, and so you have to curate more on a site like that. You can kind of get away with it more on a site like LessWrong. But that doesn't mean that it doesn't create problems, which we will get to. Bodybuilding.com — so, yeah, that's another great example. We should definitely do that for the show at some point. All right, let's get to pros and cons. Let's get to the good and the bad. This is probably not comprehensive, but I just kind of wanted to, like, lay it all out in front of me.

61
00:09:49,538 --> 00:09:52,850
Kayla: Now, is this a rational pros and cons list or is this an irrational pros?

62
00:09:52,890 --> 00:10:02,230
Chris: I wrote it. So it is extremely irrational, extremely pristinely rational, and it cannot be challenged because otherwise you're just being illogical.

63
00:10:02,610 --> 00:10:03,910
Kayla: I need to adjust.

64
00:10:04,650 --> 00:10:06,430
Chris: Adjust your thinking or your chair.

65
00:10:07,410 --> 00:10:11,378
Kayla: My whole body is asleep. Lay it on me.

66
00:10:11,514 --> 00:10:19,894
Chris: All right, so the first. And, like, this is obvious. Cause we've already talked about this a lot, but the whole challenging bias thing, right? Like, that's what they're all about.

67
00:10:19,942 --> 00:10:21,570
Kayla: And I, wait, is this a pro or a con?

68
00:10:23,190 --> 00:10:24,582
Chris: We're starting with the good.

69
00:10:24,646 --> 00:10:25,370
Kayla: Okay.

70
00:10:27,070 --> 00:10:53,478
Chris: We can go back and forth, but the way I have it written here is we'll talk about the good, then we'll talk about the bad. So anyway, I think the whole challenging bias thing is pretty good. I like that. That's something. That's why we sort of led off today's episode with, like, I like to think about stuff because, like, that's. That's what they do. And I really like it. You know, I enjoy the same sort of, like, oh, yeah, that is a bias. Interesting. Okay, how can I combat that?

71
00:10:53,494 --> 00:11:10,286
Kayla: Like, and I like having my biases, like, pointed out to me, not like, pointedly. I don't want someone to walk up to me and be like, hello, you just did a bias. But I like having that. Those styles of thinking pointed out to me. I like my pitfalls. I like to. I like to notice my pitfalls these days.

72
00:11:10,358 --> 00:11:18,774
Chris: Me too. I think it's fascinating. I've always liked to be like, I kind of like to be proven wrong. Like, I don't know.

73
00:11:18,822 --> 00:11:21,810
Kayla: Oh, you love being proven wrong. I hate it. But you love it.

74
00:11:22,790 --> 00:11:24,822
Chris: Like, I know that's another bias.

75
00:11:24,886 --> 00:11:26,062
Kayla: That's the thing I like about you.

76
00:11:26,126 --> 00:11:38,702
Chris: Me saying that means, like, well, that means that all the ideas I have are correct. Because if it wasn't, then I. Because I like being proven wrong. I would have been proven wrong. And so everything I have in my brain is correct.

77
00:11:38,806 --> 00:11:39,438
Kayla: I know, but I just love.

78
00:11:39,454 --> 00:11:40,490
Chris: Is that fucked up.

79
00:11:41,390 --> 00:11:52,302
Kayla: Something that I really like about you is that we can engage in, like, there's a point in arguing with you. There's not a point in arguing with a lot of people, necessarily, but there's a point in arguing with you.

80
00:11:52,366 --> 00:11:55,698
Chris: Oh, good. So I'm basically inviting people to argue with me. That's a great.

81
00:11:55,704 --> 00:12:20,612
Kayla: There's a point in arguing with you in that you engage in a debate or discussion or an argument with the possibility in your head that you could change your mind. And I think that's a difficult thing to do. But you kind of come by that naturally. We've had arguments where instead of both of us just digging in on the respective side, we've changed each other's minds. And I think you're actually better at.

82
00:12:20,636 --> 00:12:30,188
Chris: That than I am, so, Kayla, cis hetero white men are born with the debate gene, where we just like to debate things.

83
00:12:30,244 --> 00:12:39,576
Kayla: Yeah. That's different than being able to change your mind. I think that's actually the opposite. The debate gene is the thing that you guys have where you just want to fucking argue, to argue, and you can't. Your minds will not be.

84
00:12:39,608 --> 00:12:40,512
Chris: Yeah, yeah, I have that, too.

85
00:12:40,576 --> 00:12:49,592
Kayla: You'll be presented with new information, and you figure out how to, like, maintain your biases versus taking that new information and changing your mind, which, again, it's.

86
00:12:49,616 --> 00:12:51,648
Chris: A bias for me to think that I'm not the.

87
00:12:51,744 --> 00:12:52,904
Kayla: Yeah, no, I shouldn't be.

88
00:12:53,032 --> 00:12:53,392
Chris: Right.

89
00:12:53,456 --> 00:12:55,552
Kayla: I shouldn't be tooting your horn too much here.

90
00:12:55,736 --> 00:13:21,520
Chris: And I actually think that — and this is sort of riffing here, but, you know, I was joking by saying, like, oh, is this that white dude thing, we're like that. But, like, when you come from a position of privilege, it is easier to view things in, like, this sort of neutral manner. It's like, you have this position of privilege where, like, you know, whatever you're talking about probably doesn't affect you. Like, you're probably fine. So you can sit —

91
00:13:21,560 --> 00:13:28,280
Kayla: You get to be rational about, like, whether somebody should have rights versus the person who's like, I'm literally trying to fight for right to exist.

92
00:13:28,360 --> 00:13:50,210
Chris: Exactly. So, like, both of those sides of those coins offer their own bias. If you're not privileged, you have some sort of bias based on that. And if you're privileged, you have some bias based on that. So I think there is some truth there, that part of the privilege of being privileged is also the privilege of having topics of discussion not affect you as much.

93
00:13:50,370 --> 00:14:13,572
Kayla: I also think it makes you unable to have those discussions in fully rounded ways. And there's this cognitive bias of, like, oh, I can be objective about it because it doesn't affect me. And you're not actually being objective about it. Not you, but the general you. You still have a very biased and influenced perspective on said thing because you're not a part of it.

94
00:14:13,636 --> 00:14:29,460
Chris: Right. Yeah, I think it kind of cuts both ways. Back to LessWrong: they have, like, some pretty good insights in there, too, about the bias stuff. So I just collated a few of them here. There's one thing that they call the happy death spiral.

95
00:14:30,840 --> 00:14:32,340
Kayla: I don't want you to explain that.

96
00:14:33,880 --> 00:15:13,168
Chris: I know. When I was reading — so, as with everything on TV Tropes, I mean, LessWrong — you encounter a term several times before you ever actually click on a link to see what the hell it is. Otherwise, you can't read things. And so, yeah, I definitely read that five, six, seven times before I finally read what it was. I don't know if I was disappointed or not. But, like, they definitely had, like, a... ooh, I wonder if that's, like, I don't know, you just, like, go on drugs and then you just, like, spiral your way into death because you're just so happy from the drugs? But no. And, by the way, that's, you know, an example of the jargonization of this website.

97
00:15:13,224 --> 00:15:14,060
Kayla: Right, right.

98
00:15:14,440 --> 00:15:20,240
Chris: So what it is, is basically, like, a feedback loop from the halo effect. Have you heard of the halo effect?

99
00:15:20,280 --> 00:15:23,432
Kayla: The halo effect is where if you're hot, you're going to get stuff.

100
00:15:23,496 --> 00:15:30,864
Chris: It's mostly, if you're hot, people think that you are better in other ways as well, which is like, you know, you're smarter or more —

101
00:15:31,032 --> 00:15:33,856
Kayla: That's the whole thing of, like, oh, that's why JFK won.

102
00:15:34,048 --> 00:15:34,920
Chris: Yeah, and, like.

103
00:15:34,960 --> 00:15:36,648
Kayla: Cause he didn't look sweaty in that one debate.

104
00:15:36,744 --> 00:15:40,504
Chris: Yeah, that was the first debate on tv. And he was a.

105
00:15:40,632 --> 00:15:41,600
Kayla: He was not sweaty.

106
00:15:41,680 --> 00:15:44,280
Chris: More charismatic looking man than Richard Nixon.

107
00:15:44,320 --> 00:15:46,456
Kayla: Yeah, Richard Nixon looked real sweaty in that one. I'm sorry.

108
00:15:46,528 --> 00:16:14,436
Chris: I think it was the sweat. Which is why I will never become president, because I'm extremely sweaty. But that's the only reason. The halo effect also can be more generalized as, like, if one person has a property that's, like, awesome — like, maybe instead of being good looking, they're smart — then you're gonna think that they are better in other properties. So it's not just good looking, it's also like, well, he's smart at math, so he must also be a good person. Cause smart people aren't —

109
00:16:14,548 --> 00:16:16,344
Kayla: Oh, the Neil degrasse Tyson problem.

110
00:16:16,492 --> 00:16:58,476
Chris: Right. And then there's the, like, the domain thing. Like, if you're smart at physics, then you also know about vaccine science, and you also know about this and that. So what they call the happy death spiral is a feedback loop on that where it's like, if you think something that you. That yourself maybe are involved in, you know, your club is better at, like, you have really smart people in physics, whatever, you're also going to think they're better in other areas. And then thinking that you're better in other areas is then going to feed back to every other property about the group. So there's this feedback loop where the halo effect from one property affects all the other properties, and then those properties are biased in this way that makes those seem better, which makes everything else seem better.

111
00:16:58,508 --> 00:17:04,228
Chris: And there's a death spiral based on that. This cognitive bias death spiral. Does that make sense?

112
00:17:04,284 --> 00:17:04,660
Kayla: Yes.

113
00:17:04,740 --> 00:17:23,223
Chris: Okay, so they have solutions for that. They say you really have to be looking out for the bad stuff and not just accepting the good stuff. And it also helps to chop up ideas into smaller chunks for evaluation, which I thought was, like, pretty insightful, you know? So, like, rather, this is kind of related to, like, one true cure.

114
00:17:23,311 --> 00:17:25,311
Kayla: We do hate gestalts on this podcast.

115
00:17:25,415 --> 00:17:41,980
Chris: Yeah, exactly. So if you chop up, like, well, this group is good at this. Let's. Let's evaluate their claims about physics, but then let's separately, let's evaluate their claims about this thing, and let's separately, let's evaluate their effectiveness over here. Then that can help you sort of combat that.

116
00:17:42,320 --> 00:17:43,176
Kayla: I like that.

117
00:17:43,288 --> 00:17:51,184
Chris: Yeah. Another example: Eliezer Yudkowsky has a whole section in the Sequences about cults and cult-like behavior.

118
00:17:51,312 --> 00:18:03,656
Kayla: And one answer there, I think that people who are not extremely well researched and have classic academic backgrounds, I think that they shouldn't talk about cults.

119
00:18:03,848 --> 00:18:06,040
Chris: That's kind of the problem.

120
00:18:06,160 --> 00:18:08,620
Kayla: I was making a joke about our podcast.

121
00:18:10,230 --> 00:18:20,846
Chris: Oh, man. Oh, that was real bad. All right, we're gonna end here today because I feel really bad. Now.

122
00:18:20,998 --> 00:18:22,614
Kayla: Do you want to keep that in, or do you.

123
00:18:22,662 --> 00:18:54,134
Chris: No, I. Look, if I. If I cut that out, it would be really biased. In an. Yeah. In an episode about, like, examining one's own biases, I think that would be like. I don't know. That's my happy death spiral. That's bad. I agree. I think that we do that. I think that's our entire. That's another. What I was gonna say in my like, utterly blind state was. This is, like, also an example of what does he know about cults? Like, he knows about AI research and cognitive stuff.

124
00:18:54,182 --> 00:18:54,462
Kayla: Sure.

125
00:18:54,526 --> 00:18:55,918
Chris: But it looks like it's kind of related.

126
00:18:56,014 --> 00:19:01,690
Kayla: I don't talk about cults because I'm really smart at anything. I talk about cults because I'm not smart at anything.

127
00:19:02,150 --> 00:19:25,528
Chris: Right. And neither am I. But he does talk about cults, and there's a user in that — when one of those... I don't know, somebody in one of those essays commented something and brought up a great point about — what, like, it was essentially, he was talking about our "is it niche?" criteria. Like, obviously he didn't put it that way, but, like, there was this, like, argument of, like, well, cults are just religions that haven't caught on yet.

128
00:19:25,584 --> 00:19:25,920
Kayla: Right?

129
00:19:26,000 --> 00:19:34,134
Chris: Like, cults are just. There's no difference between, like, a new religious movement and an established religion, other than, like, society accepts one or the other.

130
00:19:34,182 --> 00:19:34,726
Kayla: Right?

131
00:19:34,878 --> 00:19:53,342
Chris: And this guy brought up a really good point, which is, like, actually there are some, like, things that we see that feel, like, irrational sometimes aren't. And, like, when you think about something, like, say, Catholicism versus a new religious movement, Catholicism has thousands of years of, like, we're not gonna ask you to Jim Jones yourselves.

132
00:19:53,446 --> 00:19:54,006
Kayla: Yeah.

133
00:19:54,118 --> 00:20:12,264
Chris: And has, like, millions of adherents. So there's, like, a ton of, like, branding power with the group that's been around longer, and there's, like, a ton of social proof with the group that is larger right now. Obvious caveat. The catholic church has been responsible for all kinds of horrible atrocities.

134
00:20:12,312 --> 00:20:13,872
Kayla: Let's list all the atrocities.

135
00:20:14,016 --> 00:20:34,342
Chris: But if you're a person that's like, I would like to join a group, please, and there's a bunch of new groups that just popped into existence this year, and then there's one that's been around for 1000 years — one of those is a known quantity, and the others aren't. Maybe one of them might be good, but one of them might be Heaven's Gate, and that might be bad for you.

136
00:20:34,446 --> 00:20:37,686
Kayla: The Heaven's Gate people didn't think it was bad. Fair.

137
00:20:37,838 --> 00:20:41,062
Chris: But if you're someone who is wanting.

138
00:20:41,086 --> 00:20:43,574
Kayla: To join something, not simply to die.

139
00:20:43,702 --> 00:20:49,230
Chris: Not simply to die — if you don't already have those Heaven's Gate-y beliefs, you might not want to join.

140
00:20:49,350 --> 00:20:50,726
Kayla: Correct? Yeah.

141
00:20:50,838 --> 00:21:05,620
Chris: So I thought that was a pretty good point that I'm like, I don't know if I'd ever really thought about that on the show. We sort of, like, have this criteria about, like, is it niche? And that sort of, you know, separates whether it's a cult or a religion in our, you know, little show parlance.

142
00:21:05,700 --> 00:21:06,348
Kayla: Right.

143
00:21:06,524 --> 00:21:09,812
Chris: But it doesn't necessarily mean that's like an irrational thing to do.

144
00:21:09,916 --> 00:21:10,348
Kayla: No.

145
00:21:10,444 --> 00:21:18,484
Chris: Or an irrational way to think that, like, people have reasons for belonging or viewing religions in a certain way versus new religious movements.

146
00:21:18,532 --> 00:21:22,756
Kayla: And it's because of what your mom and dad made you do on a Sunday morning.

147
00:21:22,868 --> 00:21:34,462
Chris: Exactly. So, yeah, the comments on less wrong really do seem to support the notion that these guys are trying to be rational and fight their biases. We talked about that already.

148
00:21:34,566 --> 00:21:35,246
Kayla: That's really good.

149
00:21:35,278 --> 00:21:39,518
Chris: The comments were thoughtful and measured. That's another part that I was like, oh, man, I like them.

150
00:21:39,574 --> 00:21:40,730
Kayla: That's fantastic.

151
00:21:42,230 --> 00:21:45,302
Chris: They really like thinking fast and slow, the book.

152
00:21:45,366 --> 00:21:46,518
Kayla: Or they like doing that.

153
00:21:46,614 --> 00:22:05,550
Chris: They like thinking fast and then they like thinking slow, and then they like thinking fast again. No, they like the book by Daniel Kahneman. Like, that's, like, one of their things that they, like, reference a lot. And that's, like, one of the things that — so you were asking me a little bit ago, like, what does Eliezer like, what did he read? So one of the things is Thinking, Fast and Slow.

154
00:22:05,590 --> 00:22:07,806
Kayla: Every man in my life has read thinking fast and slow.

155
00:22:07,838 --> 00:22:09,918
Chris: They're, like, into the pop psychology.

156
00:22:10,094 --> 00:22:12,630
Kayla: Okay, how do they feel about Malcolm Gladwell?

157
00:22:12,790 --> 00:22:13,638
Chris: That I don't know.

158
00:22:13,694 --> 00:22:14,766
Kayla: I really want to know.

159
00:22:14,878 --> 00:22:21,784
Chris: I would like to know that as well. I think I have. I have read the words "tipping point" on the site.

160
00:22:21,872 --> 00:22:22,240
Kayla: Okay.

161
00:22:22,280 --> 00:23:02,400
Chris: But that doesn't mean they necessarily like Malcolm Gladwell. Tipping point is a thing other than just the title of his book. But Thinking, Fast and Slow, for you guys, is just a book written by an experimental psychologist. And based on his research, we sort of have two modes of thinking. There's the fast, low-resource, low-energy kind of thinking, and that's sort of like our snap intuition, right? That's your first thought, and that's what's in control most of the time. Most of the time, you're not — even though it may feel that way, you're not actually sitting there, like, making a decision every second of your life. Most of the time, you're, like, kind of on autopilot. In Thinking, Fast and Slow, they call the first one System 1 and the second one System 2.

162
00:23:02,440 --> 00:23:35,042
Chris: So the fast part is System 1, the slow part is System 2, and the slow part is your executive function, where you're actually sitting down, focusing on a problem and trying to make a logical decision on it. And that part is, like, very resource intensive. So, like, that's why most of the time you're doing intuitive, sort of snap thinking. That's, like, you know, 80-20 — it's, like, probably gonna be right most of the time. And then the second part is more for, like, I need to actually look at this. And then, you know, like, people kind of, like, make some inferences, maybe, that they shouldn't sometimes. Like, oh, well, obviously the second one's better.

163
00:23:35,146 --> 00:23:36,026
Kayla: Right? Right.

164
00:23:36,178 --> 00:23:41,098
Chris: So we should try to do that as much as possible. It's like, no, that's not what he was saying.

165
00:23:41,154 --> 00:23:47,928
Kayla: You shouldn't do it as much as possible. I'm not saying ethically or morally or rationally, but, like, your brain. Bad for brain.

166
00:23:48,024 --> 00:23:56,312
Chris: Yeah, but I'm bringing it up here in the good list just because, like, I really liked that book. So, like, when I know you didn't. When I saw that was something that they liked too, I was, like.

167
00:23:56,496 --> 00:24:16,008
Kayla: Thinking, Fast and Slow is for people who read Malcolm Gladwell's Blink, and then later on realized, like, oh, shit, that was an extremely poorly researched book, and, like, you can make the argument that Malcolm Gladwell is a hack. And so then you got your hands on Thinking, Fast and Slow, and you went, oh, sweet, so all the things I learned in Blink are true, but they're more true now.

168
00:24:16,184 --> 00:24:26,752
Chris: That's what that point is, right? Talk about a fucking bias. Holy shit. And then finally, on my good bad list here, they like debating fun philosophy, hypothetical type stuff.

169
00:24:26,816 --> 00:24:28,056
Kayla: That sounds like the best.

170
00:24:28,248 --> 00:24:41,336
Chris: They're kind of like an online version of that dumb Ayn Rand club I had in college. Shout out to last season's objectivism episodes, where it's just like a bunch of nerds discussing utilitarianism and how to think better.

171
00:24:41,408 --> 00:24:42,620
Kayla: Again, it's a fandom.

172
00:24:43,010 --> 00:24:47,510
Chris: Speaking of objectivism, let's move from talking about the good to talking about the bad and ugly.

173
00:24:47,850 --> 00:24:49,350
Kayla: This is what I'm here for.

174
00:24:49,770 --> 00:24:59,626
Chris: So objectivism is a good segue because the less wrong community has a good amount of overlap with some of the same sort of, like, tropes that were also present in the objectivist community as.

175
00:24:59,658 --> 00:25:02,350
Kayla: We have brought up having sex with your young fans.

176
00:25:03,210 --> 00:25:04,778
Chris: I didn't see anything about that.

177
00:25:04,834 --> 00:25:05,590
Kayla: Okay.

178
00:25:08,300 --> 00:25:32,468
Chris: So that's not one of the overlaps, but one of the overlaps is the whole, like, amateur thing that has come up a couple times, just like Ayn Rand and sort of the objectivists around her. They also, like, hate incumbents. Maybe hate's a strong word, but they don't care for incumbents in the field that they themselves are working in. And so, like — what do you mean, incumbents? — they don't like them. So, like, they don't engage with, like —

179
00:25:32,484 --> 00:25:33,732
Kayla: They don't read philosophy.

180
00:25:33,876 --> 00:25:49,660
Chris: Yeah. So they don't — yeah, they think that philosophers, and, like, the field of philosophy, is, like, oh, kind of stupid and a waste of time. But then all they do is talk about philosophy without, like, really realizing that's what they're doing. And so then, I mean, so, like —

181
00:25:49,740 --> 00:25:51,252
Kayla: I have a hard time with that one.

182
00:25:51,356 --> 00:26:00,924
Chris: Yeah. r/askphilosophy — so there's a Reddit community where it's like, you can ask a philosopher stuff, right? Like, philosophers go on there and answer people's questions.

183
00:26:00,972 --> 00:26:02,160
Kayla: Hard time with that, too.

184
00:26:03,090 --> 00:26:35,250
Chris: I saw at least two posts of somebody being like, hey, I just encountered this site called LessWrong, and I was just wondering, like, what you guys think about it. They seem to talk about a lot of philosophical topics. Is it worth my time? And the answers are, like, not great. Let me see if I can actually find this. Yeah. So here — I won't read the whole Reddit post here because it's long, but somebody posts, should I read the LessWrong wiki or not? And then says, like, yeah, I recently discovered it because of Roko's basilisk, actually. So, like, what's going on there? Like, should I read it? Is it good?

185
00:26:35,550 --> 00:26:53,890
Chris: And then the first answer here says, I read through a bunch of it some years ago because I had heard some of my students mention it and wanted to see what the fuss was about. The main problems I could find were the following. And each one of these problems listed has a whole description, but I'm just gonna just read the bullet points. One, reinventing the wheel.

186
00:26:54,710 --> 00:27:08,008
Kayla: Two, see, that's what happens when you don't engage with, what do you think, you just fell out of a coconut tree? You exist in the context of all in which you live and what came before you. Thank you, Kamala.

187
00:27:08,064 --> 00:27:25,968
Chris: That is gonna be the title of this episode, because that honestly really hits the nail on the head for this. Bullet point number two, contempt for philosophy. Oh, bullet point number three, errors. Bullet point number four, cult-like atmosphere surrounding the author. The author being Eliezer Yudkowsky.

188
00:27:26,024 --> 00:27:26,464
Kayla: Okay.

189
00:27:26,552 --> 00:27:44,682
Chris: Yeah. And again, like, yeah, there's this weird sort of reverence for his writing, right? And then five biographical stories from the author all over the place. So, like, Eliezer, he makes a lot of, like, this happened to me statements. And it's usually just kind of cringe. And so this person says, what, like.

190
00:27:44,706 --> 00:27:46,194
Kayla: Everybody clapped type stuff?

191
00:27:46,322 --> 00:27:51,470
Chris: Sort of. So, like, here's a quote. I was like, I'm an intellectual maverick after all, because I work in machine learning.

192
00:27:51,810 --> 00:27:54,450
Kayla: Wait, I'm gonna throw up. I'm sorry. I'm sorry.

193
00:27:54,490 --> 00:28:01,070
Chris: I told them I work in AI, and their jaws hit the floor. People often say they're afraid of me because I work in AI.

194
00:28:03,890 --> 00:28:07,930
Kayla: I'm speechless. See, this is what I mean. I would not get along with this person.

195
00:28:07,970 --> 00:28:08,670
Chris: I know.

196
00:28:10,090 --> 00:28:20,010
Kayla: I can't handle that kind of self-perception. I find it very grating.

197
00:28:20,170 --> 00:28:49,908
Chris: Yeah, I think most people do. Another good quote I read about them — and this is actually, like, kind of quoted in several places when people are critiquing LessWrong — is somebody said in an article, the good bits aren't original and the original bits aren't good. So they dismiss academics and philosophers, reinvent a wheel, and then, like, claim they discovered something. And when I say "they," I'm referring to LessWrong. But, like, in this case, I'm mostly talking about Eliezer and his Sequences.

198
00:28:50,004 --> 00:28:55,290
Kayla: I really see. This is the kind of stuff. Yeah. You didn't just fall out of a coconut tree.

199
00:28:55,590 --> 00:29:14,406
Chris: Yeah. And then if somebody takes a philosophy course, they kind of have to, like — I saw this as well on another Reddit post — like, they kind of have to, like, unlearn this, like, weird framework of vocab and mishmash and then, like, correct course onto, like, you know, what the current thinking is and, like, the way that philosophers talk about these problems. And there's nothing —

200
00:29:14,438 --> 00:29:20,560
Kayla: Wrong with already exists being original and having original thoughts. The problem is when you don't. Insane.

201
00:29:21,900 --> 00:30:09,342
Chris: Yeah. The good bits are not original, and the original bits are not good. My next bad, con, whatever bullet, is the jargon, which we've already talked about at length. So I'll just note it under the bad column here. But it also reminds me a little bit of objectivism. So objectivism has vocabulary tropes. Like, "A is A." And, like, "don't be an Ellsworth Toohey." You know, like, they have their own sort of, like, in-group language as well. Like, not in, like, as extensive a manner. But it just kind of reminded me of that, too, where you won't know what somebody is talking about unless, like, you're both already in the know. It's more of, like, a signaling thing, right? Both objectivism and LessWrong have their foundational texts. On LessWrong, you should read the Sequences to become familiar with their ideology.

202
00:30:09,486 --> 00:30:47,618
Chris: In objectivism, you should read Atlas Shrugged. And now, to be fair, I read several posts on less wrong, where members are discussing the exact problem of, hey, if somebody asks me what less wrong is, you can't really direct them to a collection of essays that will take them a week to read, right? There has to be some better way of summarizing the community and its goals. So there's definitely people on the site that are aware of that. So they've created some summaries like this in the form of increasingly short additional blog posts. One person summarized it in a post that's like 2200 words. Another one gets all the way down to 1600 words.

203
00:30:47,754 --> 00:31:09,110
Chris: And then finally I read one that was, like, just a seven-minute read and comes complete with a one-sentence definition of the rationalist movement — which, the rationalist movement is considered to be sort of synonymous with LessWrong. Or it was, when LessWrong was more active. Now it's its own thing, right? Would you like to hear the one-sentence definition?

204
00:31:09,480 --> 00:31:10,224
Kayla: Yes, please.

205
00:31:10,312 --> 00:31:38,932
Chris: Good, because I was going to say it anyway. All right, so this definition is provided by a user named Mellivora on LessWrong, and it goes as follows: "Rationalism is the philosophy of pragmatism, which is the practical application of the scientific method, experiment, and evidence, as well as logical thought and an awareness of limitations and uncertainty, to achieving our goals." So it's the philosophy of pragmatism, which is all that, with a particular focus on Bayesian statistics and cognitive biases, end quote.

206
00:31:39,036 --> 00:31:42,000
Kayla: I won't lie. I don't. I don't understand.

207
00:31:43,620 --> 00:31:48,420
Chris: They like science. They like to think about biases, and they like to think in terms of, like, probability and statistics.

208
00:31:48,460 --> 00:31:49,520
Kayla: Got it. Thank you.

209
00:31:50,980 --> 00:32:15,178
Chris: Like objectivists, they're sort of, like, head over heels in love with logic and rationality. Now, counterpoint to this — like, I think correctly — there's, like, a bias that they point out, like, a fallacy, which is to pit logic against emotion. Which, I don't get that sort of, like, sense from objectivism. Like, I think they're like, yes, emotion sucks, right?

210
00:32:15,234 --> 00:32:15,690
Kayla: Right.

211
00:32:15,810 --> 00:32:22,570
Chris: Whereas on less wrong, the sense I got was more like, no, that's a false dichotomy.

212
00:32:22,610 --> 00:32:25,754
Kayla: Hey — that we've created. Because that's, like... that's a problem you see a lot.

213
00:32:25,802 --> 00:32:28,554
Chris: And they even have a word for it called straw Vulcan.

214
00:32:28,642 --> 00:32:29,576
Kayla: I'm in love.

215
00:32:29,738 --> 00:32:48,732
Chris: So it's like straw man argument, except the straw man is like, look at that dumb Vulcan, because all he uses is logic, and that means that he can't ever feel emotion, and that gets him into trouble. And so they're like, no, it's like a false dichotomy. Like, emotion is also, like, a thing and is, like, a valid way to help us come to decisions, blah, blah.

216
00:32:48,756 --> 00:32:49,460
Kayla: Right, right.

217
00:32:49,620 --> 00:33:05,064
Chris: So I want to give them credit there. Like, objectivists, they both have a tendency to be overconfident in conclusions because they think it comes from a rational place. And this is mostly manifest when they strike out into areas other than just talking about shit like confirmation bias.

218
00:33:05,112 --> 00:33:05,700
Kayla: Right.

219
00:33:06,040 --> 00:33:18,760
Chris: Just like with Ayn Rand, when she's talking about metaphysics and epistemology, it's mostly okay. But then when she extrapolates that out to, like. And therefore, the only author you should enjoy, rationally speaking, is Victor Hugo.

220
00:33:18,920 --> 00:33:24,024
Kayla: Oh, she had a problem, too with the, like, I have invented an idea.

221
00:33:24,112 --> 00:33:24,680
Chris: Oh, totally.

222
00:33:24,720 --> 00:33:26,208
Kayla: And it was already an existing idea.

223
00:33:26,264 --> 00:33:56,650
Chris: Absolutely. That was why that was the first thing I brought up, is because that's very objectivist. I have disdain for these things. I am going to do this thing, though, and reinvent the wheel in a sloppier way. Anyway, that extrapolation into those other fields is where it gets weird. Now, I see a lot less of that on less wrong than I do in objectivism. They wouldn't say, you can rationally only enjoy this type of music.

224
00:33:57,710 --> 00:33:59,650
Kayla: God bless Ayn Rand. What if —

225
00:33:59,990 --> 00:34:18,853
Chris: Jesus, I know, like, talk about confident. You know, like, holy shit. So they're not quite that crazy, but I still see a little bit of that. And then, like, on the other end of the, like, this isn't in our wheelhouse spectrum. They steer clear of politics to a fault. So we mentioned the jargon phrase politics is the mind killer.

226
00:34:18,902 --> 00:34:20,487
Kayla: Right. I want to know what that means to.

227
00:34:20,533 --> 00:34:22,331
Chris: So what that means is basically like.

228
00:34:22,435 --> 00:34:26,507
Kayla: Cause I wanna spoiler you on something. Oh, no, I don't think you can.

229
00:34:26,683 --> 00:34:28,130
Chris: You already know the results of the election.

230
00:34:28,195 --> 00:34:37,650
Kayla: I do. I don't think you can avoid politics is the thing. I think you can avoid talking about, like, electoral politics, but you can't actually avoid politics. Sorry, say what you were gonna say.

231
00:34:37,675 --> 00:34:41,554
Chris: No, that's not a spoiler. That's a critique I have also read about the LessWrong community.

232
00:34:41,627 --> 00:34:42,399
Kayla: Gotcha.

233
00:34:43,299 --> 00:34:56,304
Chris: But, yeah, so they say politics is the mind killer because, when you get into a political mindset, you have so much emotional investment, and also, like, you're getting into that thing where, like, identity comes into it.

234
00:34:56,337 --> 00:34:58,049
Kayla: A lot of identity stuff there makes it hard.

235
00:34:58,129 --> 00:35:43,130
Chris: It makes it really hard to be rational about something and, like, say, I could be wrong about this, because, like, it attacks something fundamental. And that's, like, well researched. Yeah, so that's true. But then, because they think they're, like, neutrally divorced from politics, they have a little bit of, like, a blind spot in the both-sides-isms, which we kind of talked about earlier. Just getting back to that, this has boiled over, like, once or twice. And in fact, there's a site — or there was a site, it's now defunct, which is good — there was a site that was called More Right, which is a great name. And it was founded by a bunch of LessWrongers that wanted to be, like, more openly racist and such with their commentary. And so they made their own little place for it.

236
00:35:43,250 --> 00:35:44,506
Kayla: Okay, good bye.

237
00:35:44,578 --> 00:36:04,114
Chris: Yeah, no, so actually, yeah, that's like self policing working. I saw comments that were almost exactly what you're saying right now. Like, to their credit, there was like a post I read on less wrong about this schism. And like, a lot of the comments on the post were like, I really hope that, like, the assholes self select out of our community now and like, it's like cleaned up in here.

238
00:36:04,282 --> 00:36:27,862
Kayla: That style of, like, policing, quote unquote, your community — yeah, that can work, especially for these smaller communities like you're talking about. I think it's also similar to what we talked about with Empty Spaces. Like, especially what we talked about with Nicole in the bonus episode, talking about, like, okay, well, what happens when, like, shit-ass people come into your forum and start being shit-ass?

239
00:36:28,006 --> 00:36:34,214
Chris: Especially with empty spaces because it's like so diffuse. There's not like a site with a guy with, you know, it's like even.

240
00:36:34,262 --> 00:36:58,534
Kayla: Harder, I think where we ended up with that was that essentially because of the way this community functions, the people that come in that are saying like, because again, it was a very queer community, et cetera. The people that are coming in and saying like, dumbass, racist, horrible shit are simply not going to get the attention. The eyes, they're going to essentially be siloed out of the community because no one's going to be paying it any attention. And that kind of seems similar to.

241
00:36:58,542 --> 00:36:59,730
Chris: What I think that's here.

242
00:37:00,030 --> 00:37:03,430
Kayla: Community siloed. There was a siloing effect happening.

243
00:37:03,510 --> 00:37:07,576
Chris: Yeah, I think that happened here. I just think that happened slower. I.

244
00:37:07,638 --> 00:37:07,956
Kayla: Okay.

245
00:37:07,988 --> 00:37:10,276
Chris: Than in a place like empty spaces.

246
00:37:10,308 --> 00:37:16,836
Kayla: And I don't know if Empty Spaces is still doing okay with this. I have not checked in on it, particularly, in the last year, but that's how it was at the time.

247
00:37:16,908 --> 00:37:22,932
Chris: Yeah, I think the slowness is because like, these guys are really into engaging in good faith, like we talked about earlier.

248
00:37:22,996 --> 00:37:23,196
Kayla: Right?

249
00:37:23,228 --> 00:37:30,580
Chris: So, like, they're really, like, it's gonna take longer to be like, okay, I'm not talking to this guy. Like, he just keeps bringing up the bell curve and I'm sick of it.

250
00:37:30,660 --> 00:37:46,486
Kayla: Yeah. It's like, how do you engage, how long do you engage in good faith when the, like, when the engaging in good faith is actually part of your DNA, making your community less safe? Because if people are coming in, being like, bell curve, bell curve, then that's not a safe community for the people that the bell curve targets.

251
00:37:46,678 --> 00:38:02,402
Chris: Yeah. So that's related to the whole "politics is the mind killer" thing, too, because they feel like they're neutral. They don't want to talk about politics. So when somebody does talk about politics, there's the "la la la, this isn't happening," until eventually somebody's racist enough that you have to be like, we're not talking to this guy.

252
00:38:02,466 --> 00:38:02,834
Kayla: Right.

253
00:38:02,922 --> 00:38:48,238
Chris: And then also, as you pointed out, you can't really avoid politics. You can avoid electoral politics, but you can't avoid, like, the political realities of, like, the different power structures that are in our world. You just kind of can't. I think some LessWrongers — I've definitely seen this, and I think this includes Eliezer, actually — where he basically says, like, look, when we say politics is the mind killer, we just mean, like, let's not bring it into debates about XYZ, because it's going to bias us. That doesn't mean that we shouldn't bring our rational-thinking community, blah, blah, to bear on politics outside of this site. So again, like, Eliezer said that —

254
00:38:48,254 --> 00:39:05,820
Kayla: It sounds like they've thought — good or bad, who's to say — they've thought about a lot of this stuff. And I'm coming in with the bias of, like, you know... intangible. And it's not that they're just coming to it, like, superficially. And I think that's something that I'm also grappling with these days, as, you know, trying to check some of my biases and failing in a lot of ways.

255
00:39:05,980 --> 00:39:06,708
Chris: It's hard.

256
00:39:06,804 --> 00:39:16,360
Kayla: Have I said this before on the show? Of, like, it's probably better to assume that the person you're engaging with, with a different opinion than you, has —

257
00:39:17,140 --> 00:39:18,836
Chris: We've definitely talked about that in real.

258
00:39:18,868 --> 00:39:31,976
Kayla: Life, has the same amount of information. Like, it's, it is very possible for somebody to have reached a different conclusion than you with the same amount of information. And not everybody does have the same amount of information. And you're probably gonna talk to somebody who hasn't read the sequence.

259
00:39:32,008 --> 00:39:33,400
Chris: You can never know.

260
00:39:33,480 --> 00:39:34,336
Kayla: You can never know.

261
00:39:34,408 --> 00:39:38,872
Chris: Because we are all islands inside our own little brains. We don't know what information people have up there.

262
00:39:38,936 --> 00:39:59,180
Kayla: And also, it does seem like a cognitive bias to me to always jump to the conclusion of that person thinks the right wing take on this, and I think the left wing. And that means they haven't done enough reading. That means they haven't learned. That means that they don't know they're voting against their own interests. That means that they're dumb. That means that they're less smart.

263
00:39:59,260 --> 00:40:07,600
Chris: Almost always it means there's some sort of premises that you're not thinking of. There's probably something in that person's head that you just don't know.

264
00:40:08,020 --> 00:40:09,244
Kayla: They don't necessarily have.

265
00:40:09,292 --> 00:40:12,148
Chris: They don't have, quote unquote less information. Right, exactly.

266
00:40:12,244 --> 00:40:14,116
Kayla: They just have reached a different conclusion.

267
00:40:14,188 --> 00:40:23,026
Chris: Right. So speaking of, like, going outside the LessWrong club, there's, like, a whole hater community dedicated to it on Reddit, called Sneer Club.

268
00:40:23,178 --> 00:40:24,690
Kayla: Oh, is that what sneer club is?

269
00:40:24,730 --> 00:40:30,698
Chris: That's what Sneer Club is. It's, like, a LessWrong-hating community where, just, like, people discuss the dumbass stuff that happens with LessWrong.

270
00:40:30,754 --> 00:40:32,910
Kayla: I have a lot of feelings about hater communities.

271
00:40:33,330 --> 00:40:46,314
Chris: I think. Don't quote me on this, but I think Sneer Club comes from Eliezer talking about hater communities and saying, like, well, people will form, like, a sneer club just to hate you. And then I think that's where they came up with the name Sneer club.

272
00:40:46,402 --> 00:40:47,522
Kayla: Okay, that's. I like that.

273
00:40:47,546 --> 00:40:58,004
Chris: Which is funny. And lots of the vitriol on Sneer club is devoted to Eliezer in particular. So his insanely smug arrogance, as it turns out, has rubbed a lot of people the wrong way.

274
00:40:58,052 --> 00:40:59,120
Kayla: I wonder why.

275
00:41:00,140 --> 00:41:28,122
Chris: And sort of like the, you know how you always only hear about the really crazy stuff people say and you never hear about the reasonable stuff? You know, like, the crazier takes rise to the surface, because that's the thing that goes viral. That's kind of present in the Sneer Club, too, where it's like, okay, yeah, you're talking about, this is, like, an insane thing that this person said, but for every insane thing, there's also, like, ten normal things that are just not fun to sneer about.

276
00:41:28,186 --> 00:41:28,790
Kayla: Right.

277
00:41:29,170 --> 00:41:46,562
Chris: I'm also speaking a bit in the past here, because Sneer Club is currently in stasis thanks to Reddit shenanigans. Like cryostasis, it's frozen in time, thanks to Reddit. So, you know, there's, like, the whole Reddit controversy, like, way outside the scope here. But anyway, so they basically, like, put a pause on it. Like, you can still read it, you just can't post there.

278
00:41:46,626 --> 00:41:49,022
Kayla: Wait, did the mods put a pause, or did Reddit put a pause?

279
00:41:49,106 --> 00:41:51,390
Chris: The mods put a pause based on the actions of Reddit.

280
00:41:51,430 --> 00:41:52,170
Kayla: Got it.

281
00:41:52,750 --> 00:42:08,606
Chris: But this is also somewhat past tense, thanks to the fact that LessWrong is not nearly as active as it once was. So Roko's Basilisk, this is not part of the, like, pro/con thing, but the Roko's Basilisk incident alienated and drove away a chunk of users.

282
00:42:08,678 --> 00:42:09,246
Kayla: Really?

283
00:42:09,398 --> 00:42:41,514
Chris: Yes, really. Then other incidents, like Yudkowskyisms and associations over the years, eventually made it so that the rationalist community refers to themselves now as an online diaspora, the rationalist diaspora, because they don't really feel like they have a home. Kind of like, that's what LessWrong was, that was kind of their home, and now it's kind of like, well, we can't really go there. Although a large chunk of them apparently went to another website called Slate Star Codex, which to my understanding is a bit of, like, a LessWrong 2.0.

284
00:42:41,562 --> 00:42:42,522
Kayla: I like that name.

285
00:42:42,626 --> 00:43:24,180
Chris: I know. I was like, wait, what the hell does that mean? I don't actually know what that means, since this is not the episode about that, but that's sort of my pro and con list. And then there's, like, things that I don't know, people don't like, but I think are kind of good. Like, are either good or, like, why don't you like this? This is just them, like, pontificating. So I think that there's definitely, I've read stuff where it's like, they are too rational and don't think that intuition or emotion are good things to base decisions on. Like, I've seen that critique, and I'm like, no, if you actually read what they're saying, I don't think that's the case. And then, like, you know, they do weird thought experiments that sometimes get weaponized against them.

286
00:43:25,600 --> 00:43:35,000
Chris: One of the things we talked about in our previous episode on Roko's Basilisk, and will probably talk about again in a future episode here, is the mote of dust thing.

287
00:43:35,120 --> 00:43:38,144
Kayla: I don't want to talk about the mote of dust thing. Cause it's going to tear our marriage apart.

288
00:43:38,192 --> 00:44:13,372
Chris: It is going to tear our marriage apart. I think that we should have a discussion about it on microphone. But for right now, and I know this is going to be really tough to compartmentalize, but for right now, I'm just going to say that I'll explain what it is. So there's this thought experiment of, like, if you had X number of people, where X is, like, a number so large that it makes the number of atoms in the universe look like nothing, get a mote of dust in their eye, would you torture someone for 50 years, one person, to make that not happen? So it's basically like the trolley problem on steroids. No, it's a utilitarian thing.
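To make the arithmetic the thought experiment leans on explicit, here is a minimal sketch of the naive additive-utilitarian comparison being described; the notation is ours, not LessWrong's. Let N be the astronomically large number of people, let ε be the tiny disutility of one mote of dust, and let H be the disutility of torturing one person for 50 years:

$$N \cdot \varepsilon \;\;\text{vs.}\;\; H, \qquad \text{and } N \cdot \varepsilon > H \text{ whenever } N > H/\varepsilon.$$

On that accounting, any fixed ε, however small, is eventually outweighed by a large enough N, which is exactly the conclusion people find abhorrent and exactly what the argument is about.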

289
00:44:13,436 --> 00:44:14,260
Kayla: I'm gonna go insane.

290
00:44:14,300 --> 00:44:37,102
Chris: It's a utilitarian thing. You can sit there and go insane, because we don't have time to get into it right now. But that's, don't shoot the messenger, that's what it is. And it does make people insane. And then they talk about it in terms of, like, can you believe that they think that you should do this? Like, that's so abominable. But actually, I've read the posts, and it's just them talking about it as a thought experiment.

291
00:44:37,126 --> 00:44:44,710
Kayla: Oh, no, that's dumb. If people are getting mad at them for talking about it or sneering at them for talking about the thought experiment, that's silly to me.

292
00:44:44,870 --> 00:44:45,862
Chris: What else would it be?

293
00:44:45,926 --> 00:44:50,810
Kayla: I'm just saying that if you come to the conclusion that's different than my conclusion, you're bad and wrong.

294
00:44:51,510 --> 00:44:53,382
Chris: Right. We are not biased here.

295
00:44:53,446 --> 00:44:58,126
Kayla: I'm not saying you shouldn't talk about it. I'm just saying that if you come to the wrong conclusion, you're bad and wrong.

296
00:44:58,158 --> 00:45:23,288
Chris: Yeah, I know how you feel, Kayla. Anyway, after all that, good, bad, whatever. Bottom line, I don't really know what to think. There were definitely a few days of me reading posts on LessWrong where I was like, I don't really understand what the problem is. These guys seem pretty cool to me. Like, they're kind of my kind of nerds, like, the way they like to think about thinking and fight their biases. But then I read some of the receipts that Sneer Club had, and I'm like, oh, my God, these guys are idiots.

297
00:45:23,464 --> 00:45:27,828
Kayla: I think, really, the big problem for me is simply Eliezer Yudkowsky.

298
00:45:27,884 --> 00:45:31,400
Chris: I think so, too. A lot of the weird, like, guru worship.

299
00:45:32,540 --> 00:46:02,686
Kayla: That's really the problem. That it's like, problem is such a nebulous word, but that is a red flag that comes up for me in this community and any community that would have a de facto leader like this, where, yeah, no one's saying, oh, he's infallible, but there is kind of that undercurrent, or, like, things being inferred and implied. And then also simply the way that he talks about himself and the authority with which he makes certain statements, what.

300
00:46:02,718 --> 00:46:13,470
Chris: They would say, and I don't know, I don't think I agree with this, but I think it's a valid counterpoint, they would say, well, just because something has the tropes and trappings of X doesn't make it X.

301
00:46:13,590 --> 00:46:17,090
Kayla: We're not a pyramid scheme because a pyramid scheme is illegal.

302
00:46:18,270 --> 00:46:27,006
Chris: So they kind of say that, like, yeah, I mean, just because I hold this man in high regard doesn't mean I'm, like, hero worshipping him or whatever.

303
00:46:27,078 --> 00:46:28,510
Kayla: You can hold a man in high regard.

304
00:46:28,590 --> 00:46:29,150
Chris: Yeah.

305
00:46:29,270 --> 00:46:34,878
Kayla: That doesn't mean that man doesn't hold himself to a level that a man shouldn't.

306
00:46:34,974 --> 00:46:35,294
Chris: Yeah.

307
00:46:35,342 --> 00:46:47,814
Kayla: Then that's just kind of the thing that sticks in my craw here. You know what? They talked about breaking things down into chunks, and the good chunk, all the stuff that we've talked about, is having these conversations.

308
00:46:47,862 --> 00:46:48,918
Chris: That's a wise idea.

309
00:46:48,974 --> 00:46:56,254
Kayla: That chunk over here. And then the chunk of the danger of the charismatic leader stuff. Let's put that chunk over there in the bad.

310
00:46:56,342 --> 00:47:03,206
Chris: Yeah. Cause it kind of feels like, and I didn't read this on their site, so I'm just gonna coin it right now because I'm a genius, because you've.

311
00:47:03,238 --> 00:47:05,710
Kayla: Come independently come to this conclusion, independently.

312
00:47:05,750 --> 00:47:08,406
Chris: Come to a conclusion nobody has ever thought before.

313
00:47:08,478 --> 00:47:09,734
Kayla: Out of the coconut tree.

314
00:47:09,862 --> 00:47:28,860
Chris: You know what, Kayla? I did fall out of the coconut tree. Okay? Kamala can't tell me what to do. I think there's also, like, an unhappy death spiral, where there's, like, a, this guy said some stupid shit, look at this other stupid shit, and then you're like, everything about this place is bad. Clearly they're a bad group. Obviously they're a cult.

315
00:47:29,020 --> 00:47:29,460
Kayla: Right.

316
00:47:29,540 --> 00:47:31,628
Chris: And I think that's its own bias, too.

317
00:47:31,724 --> 00:47:34,956
Kayla: They get cult accusations levied against them. Oh, okay. Yeah.

318
00:47:35,068 --> 00:47:38,972
Chris: One of the, I think the first, when you Google search is less wrong.

319
00:47:38,996 --> 00:47:39,940
Kayla: Is less wrong a cult?

320
00:47:40,060 --> 00:47:45,370
Chris: The first thing it'll autocomplete is "less wrong." And the second thing it'll autocomplete is "less wrong" space "cult."

321
00:47:45,540 --> 00:47:46,254
Kayla: Okay.

322
00:47:46,382 --> 00:47:49,070
Chris: And certainly, like, Sneer Club obviously says that.

323
00:47:49,110 --> 00:47:49,406
Kayla: Okay.

324
00:47:49,438 --> 00:47:59,126
Chris: Which I have a little bit about here, too, which is kind of what you just said. Like, hater communities, like, I got a problem. They have an irritating vibe, too. Like, why are they obsessed with the wrong things on LessWrong?

325
00:47:59,198 --> 00:48:00,502
Kayla: I think there's just.

326
00:48:00,686 --> 00:48:10,142
Chris: Yeah, but counter, counterpoint, there is an answer to that. I think Sneer Club is also, like, a lot of burned ex-LessWrongers.

327
00:48:10,206 --> 00:48:11,886
Kayla: And you know what? People that kind of need to work.

328
00:48:11,918 --> 00:48:13,958
Chris: Out their feelings of disappointment and betrayal.

329
00:48:14,014 --> 00:48:15,464
Kayla: They deserve to have a space too.

330
00:48:15,582 --> 00:48:42,566
Chris: But it's just like, as an outsider trying to make some sort of judgment call about this community, it's like it's impossible, right? And that's part of probably why the rationalists feel like they're part of a diaspora, right? I have a few quotes here too, just to confuse us further. So here's one good quote I liked from an article. I was trying to get a sense of how LessWrong felt about their critics, and I really liked this quote. This guy, I almost reached out to talk to him, and I might still.

331
00:48:42,708 --> 00:48:44,330
Kayla: Is it I don't think about you at all?

332
00:48:44,410 --> 00:49:08,430
Chris: No, it's not that cool. They say, quote: From what I can tell, the anti-rationalist, anti-transhumanist crowd does not seem interested in object-level arguments about AI existential safety at this time. They are analyzing the situation purely from a social perspective. So what do we do? Do we get mad at them? What is the rational thing to do? I think the answer is understanding and kindness. End quote.

333
00:49:08,490 --> 00:49:09,038
Kayla: I like that.

334
00:49:09,094 --> 00:49:24,238
Chris: And I'm like, dude, yeah, man, better than me. And then like, further down that article, that person was like, here's some like, letters I wrote to these people that don't like us and say like, you know, I totally understand why you might not like us. And you know, I support you even though I disagree with you. Like, it was really good.

335
00:49:24,374 --> 00:49:25,090
Kayla: Yeah.

336
00:49:26,150 --> 00:50:02,082
Chris: And then, like, even Eliezer has his own good quotes. Quote: Once upon a time, I tried to tell my mother about the problem of expert calibration, saying, so when an expert says they're 99% confident, it actually only happens about 70% of the time. I don't know where he got these numbers, they're probably made up. The expert calibration problem is, like, an expert, even though they're an expert, they're going to be overconfident about how likely whatever they're saying is to happen. So then, there was a pause as suddenly, and I'm talking, this is Eliezer talking now, there was a pause as suddenly I realized I was talking to my mother.

337
00:50:02,226 --> 00:50:36,396
Chris: And I hastily added, of course you've got to make sure that you apply that skepticism even-handedly, including to yourself, rather than just using it to argue against anything you disagree with. Dot, dot. And then he goes on to talk about how it can actually be dangerous even to know these things about cognitive biases. If you know the cognitive biases, you can still, there's this period where, if you don't recognize that it's also talking about you, then you're liable to just use it as, like, oh, that's a bias that person's using, therefore I'm right. I think that's a great insight.
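As a rough illustration of the calibration problem Chris is describing, here is a minimal Python sketch; the prediction data is invented for the example and is not from the episode or from Yudkowsky.

# Minimal calibration check: compare an expert's stated confidence with how
# often their predictions actually came true. The data below is invented
# purely for illustration.
from statistics import mean

# (stated confidence, did the predicted event actually happen?)
predictions = [
    (0.99, True), (0.99, True), (0.99, False), (0.99, True), (0.99, False),
    (0.99, True), (0.99, True), (0.99, False), (0.99, True), (0.99, True),
]

stated = mean(conf for conf, _ in predictions)                  # what was claimed
observed = mean(1.0 if hit else 0.0 for _, hit in predictions)  # what actually happened

print(f"stated confidence:  {stated:.0%}")    # 99%
print(f"observed frequency: {observed:.0%}")  # 70%, i.e. overconfident

A well-calibrated forecaster is one whose observed frequency roughly matches their stated confidence at every level, which is the standard the quote is holding the hypothetical expert to.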

338
00:50:36,468 --> 00:50:38,770
Kayla: Yep, yep.

339
00:50:39,470 --> 00:50:45,050
Chris: Another quote. This is really short, but just: not every change is an improvement, but every improvement is a change.

340
00:50:45,550 --> 00:50:47,078
Kayla: That's cute. I like that.

341
00:50:47,254 --> 00:50:58,310
Chris: He also has a whole Medium article, and this will be the last quote I talk about. Well, it's not really a quote, I guess, but he has a whole Medium article called hashtag RebootThePolice. And this is just to challenge your hatred of him.

342
00:50:58,350 --> 00:50:59,366
Kayla: Wanna kill him?

343
00:50:59,438 --> 00:51:05,504
Chris: I'm sorry. I know, I know. Even with that, he still has to use the, like, Silicon Valley words.

344
00:51:05,632 --> 00:51:07,340
Kayla: Just turn it off and on again.

345
00:51:08,000 --> 00:51:09,264
Chris: It's a long article, but I'll turn.

346
00:51:09,272 --> 00:51:13,300
Kayla: It off and then not on again, is the thing.

347
00:51:14,080 --> 00:51:18,328
Chris: I mean, it is kind of like a turn it off and on again. That kind of summarizes it.

348
00:51:18,464 --> 00:51:19,720
Kayla: I know. And I'm saying that's.

349
00:51:19,760 --> 00:51:28,616
Chris: No, I know. Yeah, I know. That's, well, he does. So, like, in this article, he talks about, like, well, this is sort of, like, different from abolish the police, but there seems to be a lot of appetite for.

350
00:51:28,648 --> 00:51:33,608
Kayla: You want to make sure that there's people with sticks and guns that have government power that can kill you. Okay.

351
00:51:33,624 --> 00:51:50,250
Chris: The laws do need to be enforced. And I say this as a hater of the police, but, like, laws do need to be enforced. But here, let me read you some of the bullet points from this. One, completely disentangle punition from revenue, including outlawing civil forfeiture.

352
00:51:50,370 --> 00:51:51,106
Kayla: Hell yeah.

353
00:51:51,218 --> 00:51:53,754
Chris: Two, delegalize police unions.

354
00:51:53,882 --> 00:51:55,050
Kayla: Oh, fuck yeah.

355
00:51:55,130 --> 00:51:55,466
Chris: Three.

356
00:51:55,538 --> 00:51:55,930
Kayla: Hell yeah.

357
00:51:55,970 --> 00:51:57,890
Chris: Fire body cams with teeth.

358
00:51:58,050 --> 00:52:00,194
Kayla: What does that mean? Like the genitals?

359
00:52:00,362 --> 00:52:04,762
Chris: Yeah. So the body cam will actually have some teeth attached to it. It'll bite you, like a little necklace.

360
00:52:04,866 --> 00:52:05,510
Kayla: Cool.

361
00:52:06,250 --> 00:52:15,866
Chris: Four, nationwide zero tolerance policy for death of unarmed persons caused by law enforcement. Five, demilitarize city and county police and most state police.

362
00:52:15,978 --> 00:52:16,506
Kayla: Be nice.

363
00:52:16,578 --> 00:52:18,506
Chris: Six, abolish qualified immunity.

364
00:52:18,658 --> 00:52:19,866
Kayla: Oh, that'd be nice.

365
00:52:20,018 --> 00:52:27,722
Chris: Oh, my God. Seven, separate oversight from policing. Eight, separate investigation from policing. And he goes on, there's more, and a shit ton of it.

366
00:52:27,746 --> 00:52:31,512
Kayla: And the best part about all of that is that he came to all those conclusions himself.

367
00:52:31,666 --> 00:52:32,788
Chris: There's 23 points.

368
00:52:32,844 --> 00:52:45,320
Kayla: All of those conclusions weren't, like, already existing, like, philosophy and rhetoric in the abolitionist movement. That has extremely lengthy roots. He came to this all by himself. All by himself.

369
00:52:45,780 --> 00:53:00,314
Chris: I can't. Look, the generous read here, because he does talk about, like, yeah, there seems to be, like, a lot of appetite for these types of things, so I'm just kind of like, cool. The generous read here is that he actually did engage with that, and it's sort of like, that's nice, collating his own version of it.

370
00:53:00,362 --> 00:53:01,522
Kayla: I was mostly just being a bitch.

371
00:53:01,586 --> 00:53:10,826
Chris: Definitely. I mean, it definitely makes sense to, like, have that as your initial assumption about the guy. I more read that, though.

372
00:53:11,018 --> 00:53:19,230
Kayla: There's a lot of good stuff there. And also, I'm just gonna fall back on you existing in the context of all the living before you.

373
00:53:20,610 --> 00:53:22,546
Chris: That's definitely gonna be in the show notes.

374
00:53:22,618 --> 00:53:23,310
Kayla: Sorry.

375
00:53:23,890 --> 00:53:32,374
Chris: No. So the reason I bring it up, though, is to, like, sort of challenge the, like, this guy's just a dickhead thing. It's hard. But then he has bad quotes, too.

376
00:53:32,462 --> 00:53:33,374
Kayla: Yes, he does.

377
00:53:33,502 --> 00:54:01,478
Chris: So he has, like, "to worship a sacred mystery was just to worship your own ignorance." That's a quote I read of his. And so he kind of thinks that, like, a sacred, I think that might be from the Harry Potter thing, but basically, like, that, like, mystery is just ignorance in a different form, and, like, mysteries exist to be, like, uncovered and known. And so he doesn't have, like, a good view of, like, the value of mystery as, like, a sacred mystery, which I could not agree with more.

378
00:54:01,534 --> 00:54:03,246
Kayla: Follow a religious faith.

379
00:54:03,438 --> 00:54:17,174
Chris: I mean, well, there's definitely that. But I would even say, like, that goes beyond just, like, you're dumb if you're religious. I think that sacred mysteries aren't actually, like, valuable, whether they're religious or not.

380
00:54:17,262 --> 00:54:21,974
Kayla: I want to know how he feels about the book of Job, then. I really want to know. I want.

381
00:54:22,022 --> 00:55:16,416
Chris: He probably has a very definitive answer, to be honest, that he's super overconfident about. There's another quote here that I don't care for, which is "the purpose of doubt is to annihilate some particular belief." So basically, he sees doubt as entirely chained to a particular belief. Like, I have doubt in this belief, and either the belief stands and the doubt is resolved, or it doesn't stand, and the doubt annihilates that belief as it was born to do. So, like, there's always this, like, pull of, like, doubt exists to destroy this thing, and either it does or doesn't, and then goes away. And it's a little, this is, like, a little pedantic, but I, and I've read other people that critique this in the same way. Like, sometimes doubt is just a thing that, like, you have to kind of, like, continuously hold on to.

382
00:55:16,488 --> 00:55:17,064
Kayla: Yeah.

383
00:55:17,192 --> 00:55:27,582
Chris: And I think that comes into, like, even, like, the very core of LessWrong itself. Like, you kind of always have to doubt your own cognitive processes, your own decision making, because if you don't, it's.

384
00:55:27,606 --> 00:55:29,358
Kayla: Kind of, like, baked into the scientific method.

385
00:55:29,454 --> 00:55:46,174
Chris: Right. So I don't see, like, and I think a lot of people don't see, doubt as, like, a, you know, like a moment in time where, like, it needs to either get resolved or not. Some doubts are, like, just kind of, like, things that you kind of have to carry with you for safety's sake.

386
00:55:46,262 --> 00:55:48,854
Kayla: That does feel very Ayn Rand-y to me, in a way.

387
00:55:48,942 --> 00:55:49,418
Chris: Totally.

388
00:55:49,494 --> 00:55:52,350
Kayla: Like, we should play a game. Whose quote is it?

389
00:55:54,170 --> 00:56:02,450
Chris: That's a really good idea. I think there's another quote that you mentioned. Did you mention it in our cryonics episodes? I know you wanted to talk about it. Waiting for me to say it.

390
00:56:02,490 --> 00:56:03,554
Kayla: Chomping at the microphone.

391
00:56:03,602 --> 00:56:05,554
Chris: Chomp. Do you want to say what it is?

392
00:56:05,642 --> 00:56:52,902
Kayla: It's not a quote. I mean, I can pull some quotes. It's not a quote, but it's just some of the stuff that I came across in visiting LessWrong, specifically on their cryonics topic. So obviously, Eliezer Yudkowsky is very pro-cryonics and is signed up to be cryopreserved. And I believe his whole family is signed up to be cryopreserved, which, like, great, cool, awesome. Like, I want more pro-cryonics people out there in the world with platforms because, I don't know, man, we talked, we had multiple episodes about this, but I'm just like, just give us more options of what to do with my corpse after I die. Mister Yudkowsky has come to a specific conclusion about cryonics, and that is, if you don't sign up your kids for cryonics, then you are a lousy parent. This is from an essay written in 2010.

393
00:56:53,006 --> 00:57:10,854
Kayla: So very old. This may not be this person's stance anymore. This is from an article called "Normal Cryonics," and it's about when Eliezer Yudkowsky attended a cryonics gathering that was specifically about getting young people signed up for cryonics, and how he felt, finally.

394
00:57:10,902 --> 00:57:16,488
Chris: Oh, okay. Well, I mean, that's, you know, there's a lot of motivated reasoning going on there, right?

395
00:57:16,624 --> 00:57:23,736
Kayla: Oh, wait, I'm sorry. It was for young people who signed up for cryonics to meet older people who'd signed up for cryonics. Kind of like this meeting of the minds.

396
00:57:23,808 --> 00:57:26,704
Chris: Oh, that's okay. That's an interesting summit.

397
00:57:26,872 --> 00:58:10,218
Kayla: And this essay kind of ended up being about how he finally felt sane for a moment because he was surrounded by parents who were signing their kids up for cryonics as, like, a matter of default, and how he feels so insane and abnormal in the regular world where people just have kids that die. And this is coming from a place of, I believe, Eliezer Yudkowsky had a younger brother who died as a child and was not cryopreserved and is therefore simply lost. There is no hope of maybe one day his brother coming back, which is a tough thing to deal with. But the conclusions drawn in this article are things like, I am a hero. Like, that's a quote from here.

398
00:58:10,394 --> 00:58:12,306
Chris: It's a quote. I am a hero.

399
00:58:12,498 --> 00:58:16,190
Kayla: And it's specifically written like this. I am a hero.

400
00:58:16,650 --> 00:58:19,390
Chris: Oh, man. I mean, that's so alien. Look.

401
00:58:20,210 --> 00:59:04,334
Kayla: And he's a hero because he's signed up for. I got more to say, hold on. And the reason why I just, why this one stuck in my craw, and I don't want to say it's, like, dumb or bad, whatever. The reason why it stuck in my craw is because it's only having half a conversation. If you are an Internet philosopher and this is, like, your whole thing, and you're coming to the conclusion of, if you don't sign your kids up for cryonics, you're a lousy parent, because they could die and then they're dead, and you have failed them as a parent because you have not signed them up for the thing that could preserve their life, and then you're not engaging in a conversation of, is it ethical to have children at all? You're only having half the conversation if you're a lousy parent because.

402
00:59:04,422 --> 00:59:05,438
Chris: Which I am, by the way.

403
00:59:05,494 --> 00:59:30,352
Kayla: Absolutely. If you're a lousy parent because you don't sign your kid up for cryonics, I want to have the rest of the conversation then, of, how are you not a lousy parent for having kids at all? You're creating an entity that did not have a say in it before it happened, because that's impossible, who is now essentially doomed to a life of entropy. I want to have the rest of that conversation. And I'm obviously biased. Not in this article.

404
00:59:30,416 --> 00:59:31,120
Chris: Not in that article.

405
00:59:31,200 --> 01:00:12,962
Kayla: Yeah, I'm also biased coming into this, because I am somebody who has a dead child that was not signed up for cryonics, and they are dead and do not have any chance of being cryopreserved. Yeah, I'm just saying. And I just found this article so off-putting and coming from a place of, like, this is an authoritative statement, and, like, also very dangerous in a lot of ways. And just, like, so judgmental in the, like, if you don't do this thing that I have decreed as good, you are bad. You are. You are. And he goes into this whole thing of, like, I've been afraid to say it and I haven't said it for so long.

406
01:00:12,986 --> 01:01:03,910
Chris: And, I'm finally. Exactly why people don't like him is because he presents things in this manner that you're discussing. Yeah, I will say that having the context of where this quote was presented does take the edge off for me a little bit, because he's talking to people that have done this. And so, like, yeah, you're the good, the converse of that is, you guys are good parents. Like, you can kind of read it as, like, he's talking to the people that are the good ones so he can say that the other ones are bad ones. That being said, that caveat aside, I agree, obviously, with everything you've said, and yes, that is part of what makes him rub people the wrong way. So now that you and everyone listening are hopefully as confused and disoriented as I am about who to root for here.

407
01:01:04,370 --> 01:01:46,660
Chris: Next time on Cult or Just Weird, I'll talk a little bit more about some of the distinctive properties of the modern rationalism community. I know it was a little strange to put the, like, how-do-I-feel-about-this part of the episode before we talk about all the rest of the details, but I really wanted to move that discussion closer to the forefront, because the uncertainty and mental flip-flopping was, like, such a core part of my research experience. And, like, I still, as we've talked about, still can't really figure out who to be versus here. But that's okay. There's still plenty to talk about as we continue to strive to be incrementally less wrong about LessWrong. Until next week. This is Chris, this is Kayla, and this has been Cult or Just Wrong. Just confused.