Transcript
1
00:00:40,050 --> 00:00:40,338
Kayla: Okay.
2
00:00:40,394 --> 00:00:44,850
Chris: We're gonna catch all of our beautiful edits that you don't hear because we actually do edit the podcast.
3
00:00:44,930 --> 00:00:49,130
Kayla: We're recording. And you need to just make sure that you're gonna talk into the mic.
4
00:00:49,210 --> 00:00:52,962
Chris: I know that you're recording. Cause we always do this. You always start recording and then we.
5
00:00:52,986 --> 00:00:54,522
Kayla: Just like, say, I'm not recording it.
6
00:00:54,586 --> 00:00:55,242
Chris: Yes, you are.
7
00:00:55,306 --> 00:00:57,874
Kayla: I'm not. I was recording.
8
00:00:57,922 --> 00:00:59,470
Chris: Yeah. Oh, no shit.
9
00:00:59,790 --> 00:01:01,950
Kayla: Go do your thing. Do the podcast.
10
00:01:01,990 --> 00:01:05,518
Chris: Do the podcast. Okay. We don't care about the quality of the content. We're just trying to get it out.
11
00:01:05,534 --> 00:01:07,050
Kayla: And get it just fucking done.
12
00:01:08,190 --> 00:01:12,398
Chris: Anyway, welcome back to Cult or Just Weird.
13
00:01:12,454 --> 00:01:13,294
Kayla: Season two.
14
00:01:13,422 --> 00:01:17,806
Chris: Season two. Well, we're on now. Don't forget we're on like, episode three still.
15
00:01:17,838 --> 00:01:18,730
Kayla: Season two.
16
00:01:19,270 --> 00:01:22,542
Chris: Season two. We made it to season two, which.
17
00:01:22,566 --> 00:01:24,830
Kayla: Is so should we not act like we're like, excited about those?
18
00:01:24,870 --> 00:01:26,454
Chris: Cuz we're, ya know, like we should.
19
00:01:26,502 --> 00:01:31,870
Kayla: Go back and pretend like this wasn't the first episode we recorded. I'll lie as much as I want.
20
00:01:31,990 --> 00:01:38,190
Chris: In any case, this will be episode three of season two. S two. E three.
21
00:01:38,350 --> 00:01:39,006
Kayla: Sweet.
22
00:01:39,118 --> 00:01:45,886
Chris: Unless we break up one of the other episodes into multiples, in which case that is also a lie, and then it'll be episode four or five or something.
23
00:01:45,958 --> 00:01:47,230
Kayla: I'm probably gonna do that.
24
00:01:47,350 --> 00:01:50,510
Chris: Oh, boy. So I've already lied. It's been like 10 seconds.
25
00:01:50,590 --> 00:01:52,250
Kayla: What do you have for us today?
26
00:01:53,150 --> 00:01:56,664
Chris: I have a cult. Or maybe just a weird.
27
00:01:56,832 --> 00:02:01,520
Kayla: Do you have a gut feeling about it? Do you want to prime us with it or do you want to wait?
28
00:02:01,560 --> 00:02:07,208
Chris: Prime you with my gut, actually. Well, are we going to do banter at all or just go? Just ready? Go.
29
00:02:07,264 --> 00:02:08,600
Kayla: We did banter. We bantered.
30
00:02:08,680 --> 00:02:12,320
Chris: We bantered about banter, which is, I guess, the only thing we ever banter about.
31
00:02:12,400 --> 00:02:17,296
Kayla: Well, I hit myself in the tooth with a fork just before going on here.
32
00:02:17,368 --> 00:02:18,016
Chris: Cool story, bro.
33
00:02:18,048 --> 00:02:19,672
Kayla: And my tooth hurts.
34
00:02:19,856 --> 00:02:27,328
Chris: Cool story. We're recording. Oh, I know. We can talk about Perry. We're recording with the mascot cat, who is currently asleep.
35
00:02:27,504 --> 00:02:33,704
Kayla: The correct term for that mascot, which was originated on my other podcast, is podcat.
36
00:02:33,792 --> 00:02:35,224
Chris: Podcat. Excuse me.
37
00:02:35,272 --> 00:02:36,360
Kayla: So he is our podcat.
38
00:02:36,440 --> 00:02:49,162
Chris: Podcat. Thank you, Perry. He is sound asleep right now. Okay, so I guess we can get into it. But first, I just wanted to talk to you about something real quick. You are about to lose the game. Oh, you just lost it. Sorry.
39
00:02:49,226 --> 00:02:57,830
Kayla: You're doing the game. Oh, everyone listening is gonna be so mad. Everyone's gonna be so mad.
40
00:02:59,730 --> 00:03:01,778
Chris: Yeah. Cause everybody listening just lost it, too.
41
00:03:01,834 --> 00:03:02,186
Kayla: Yep.
42
00:03:02,258 --> 00:03:05,242
Chris: Can you explain for our listeners what the game is?
43
00:03:05,306 --> 00:03:11,994
Kayla: I don't think I can. I don't know. I just know that it exists. The game is just the game. And if you think of. If you think of the game, you lose the game, right?
44
00:03:12,082 --> 00:03:12,394
Chris: Yes.
45
00:03:12,442 --> 00:03:13,266
Kayla: Yeah. It's a game.
46
00:03:13,298 --> 00:03:14,058
Chris: So the game, or game.
47
00:03:14,074 --> 00:03:14,954
Kayla: If you think of the game, the.
48
00:03:14,962 --> 00:03:17,122
Chris: Objective is to not think of the game.
49
00:03:17,186 --> 00:03:18,010
Kayla: Yeah. And if you do, you lose the.
50
00:03:18,010 --> 00:03:20,286
Chris: Game of the game, which we are all now doing.
51
00:03:20,358 --> 00:03:21,214
Kayla: Yeah. We've lost.
52
00:03:21,302 --> 00:03:21,774
Chris: You lose.
53
00:03:21,822 --> 00:03:22,006
Kayla: Yeah.
54
00:03:22,038 --> 00:03:25,806
Chris: So as long as you're not thinking of it, you win, which you were all winning until I said something.
55
00:03:25,878 --> 00:03:27,182
Kayla: Yeah. Thanks.
56
00:03:27,326 --> 00:03:27,894
Chris: Sorry.
57
00:03:27,982 --> 00:03:31,718
Kayla: It's kind of like getting iced, but, like, psychologically.
58
00:03:31,854 --> 00:03:35,270
Chris: Psychologically iced, yeah. Oh, man, I miss that.
59
00:03:35,390 --> 00:03:36,414
Kayla: This can still be a thing.
60
00:03:36,462 --> 00:03:39,366
Chris: But you can't counter game someone, though, right? Like, you can.
61
00:03:39,398 --> 00:03:40,662
Kayla: I don't know. You tell me.
62
00:03:40,846 --> 00:03:43,130
Chris: I mean, you can't. It's just the way that it works.
63
00:03:43,830 --> 00:03:50,750
Kayla: I feel like, technically, if you say "you just lost the game," you yourself are also losing the game, because I am.
64
00:03:50,790 --> 00:03:51,102
Chris: Yeah.
65
00:03:51,166 --> 00:03:55,270
Kayla: You have to think of the game in order to say you just lost the game.
66
00:03:55,350 --> 00:03:58,686
Chris: Right. But if I. And you have an ice in your pocket.
67
00:03:58,718 --> 00:03:59,934
Kayla: Yes. I know how that works.
68
00:03:59,982 --> 00:04:00,822
Chris: I know. I'm just saying.
69
00:04:00,886 --> 00:04:04,926
Kayla: I'm just saying it does not behoove anyone to say it. No, it's lose-lose.
70
00:04:05,078 --> 00:04:06,830
Chris: Yeah. The game is just bad for everyone.
71
00:04:06,910 --> 00:04:07,570
Kayla: Yeah.
72
00:04:09,630 --> 00:04:31,746
Chris: But it's funny and weird. I don't know if it's culty. Definitely weird and interesting, but. Yeah. The crux is that if you're thinking about it at all, you've already lost just by thinking about it. Also interesting is that the game isn't the only thing that has that property. Icing and the game are not the only things that have that property.
73
00:04:31,818 --> 00:04:34,290
Kayla: Should you explain what ice is?
74
00:04:34,330 --> 00:04:41,996
Chris: I guess we should probably explain that, too. Yeah. So if you've never been iced or don't even know what an ice is, there's. There's a. A beverage?
75
00:04:42,108 --> 00:04:45,956
Kayla: A delectable. A delectable alcoholic treat.
76
00:04:46,028 --> 00:04:51,788
Chris: Alcoholic treat. Very high class, very classy. Called. Wait, it's not Natural Ice.
77
00:04:51,844 --> 00:04:52,564
Kayla: That's the beer.
78
00:04:52,692 --> 00:04:55,044
Chris: Smirnoff Ice. All I can think of is Natty Ice.
79
00:04:55,092 --> 00:04:58,492
Kayla: No, it's Natty Light. Anyway, there's.
80
00:04:58,556 --> 00:05:02,364
Chris: Oh, I'm thinking of Icehouse. I'm combining Icehouse and Natty. Wow. Okay.
81
00:05:02,412 --> 00:05:03,156
Kayla: Get out.
82
00:05:03,268 --> 00:05:06,572
Chris: I cannot get my low class alcoholic beverages straight.
83
00:05:06,636 --> 00:05:09,004
Kayla: Smirnoff Ice, first of all. Second of all, how dare you?
84
00:05:09,092 --> 00:05:25,794
Chris: Sorry. Smirnoff Ice. And there's just, I don't know where it came from, how it started, or if people even still do it. But there's this game called being iced, where if you bring someone an ice and hand it to them and say, you got iced, then they have to drink it.
85
00:05:25,842 --> 00:05:30,378
Kayla: No matter where they are, what they're doing. They could be giving birth, they could.
86
00:05:30,394 --> 00:05:32,722
Chris: Be flying, landing a 747.
87
00:05:32,746 --> 00:05:39,904
Kayla: If you ice them, they gotta drink it. Unless, except if they have their own ice, and then they can counter-ice you, and then you have to drink both, right?
88
00:05:39,952 --> 00:05:47,520
Chris: So if you have an ice on you and somebody ices you, then you pull out your ice and you counter ice them, and then they have to drink.
89
00:05:47,600 --> 00:05:57,144
Kayla: It's like when you're playing Crazy Eights and you play, like, a two card, and you think you're gonna make the person pick up two, but no, they have a two card, and then.
90
00:05:57,152 --> 00:05:59,856
Chris: You got. Do we have to explain Crazy Eights now? Or is it gonna be in this chain.
91
00:05:59,888 --> 00:06:02,336
Kayla: Of explaining. Or it's like Uno with the reverse card.
92
00:06:02,408 --> 00:06:23,150
Chris: It's like, yeah, it's like a reverse in Uno. In any case, icing is similar to the game, but the whole point here is that the game works regardless of any physical object or beverage that is involved. The idea is just that if you think about it, you've lost. Yep, it's the thought that's dangerous.
93
00:06:23,690 --> 00:06:28,018
Kayla: I swear to God, there's so many directions you could go with this.
94
00:06:28,074 --> 00:06:37,532
Chris: I know some. We will get to whatever that is. I just wanted to say the catchphrase. We'll get to that. So before we do, it's really.
95
00:06:37,676 --> 00:06:43,196
Kayla: I'm very interested to see where this goes, because I might get mad at you. Depending on what this is about.
96
00:06:43,308 --> 00:06:44,720
Chris: You're gonna get mad at me.
97
00:06:45,380 --> 00:06:46,612
Kayla: I don't know if I can do this.
98
00:06:46,676 --> 00:06:52,092
Chris: Okay, so I'm gonna do a quick inline content warning. Speaking of getting mad at me, for.
99
00:06:52,116 --> 00:06:54,568
Kayla: This episode, I'm so mad right now.
100
00:06:54,764 --> 00:07:11,552
Chris: The content warning itself is kind of interesting, because it's not that we're going to be talking about murders or sexual assault or anything involving bodily harm. Rather, this is a subject that a small subset of folks have in the past found somewhat mentally disturbing in a sort of unique way.
101
00:07:11,616 --> 00:07:12,776
Kayla: I'm so mad.
102
00:07:12,888 --> 00:07:26,392
Chris: So if you're the type of person that has a tendency to overthink things, and/or you get a lot of anxiety about the future, or generally have a hard time with existential dread, this episode may not be for you.
103
00:07:26,576 --> 00:07:29,200
Kayla: But except we are both of those things.
104
00:07:29,320 --> 00:08:14,064
Chris: I will also say these things, which sounded really weird after you just said these things in a different context. No, that's okay. I'm just, like, blindly reading my script because it's been so long. No, but I. So I'll say these things. First of all, the anxiety that some people feel about this topic is part of the story itself. Maybe it is the story. And secondly, if you decide to stick with us, and no pressure, but I promise that there is a whole portion of the episode where I will talk to you about why the anxiety and disturbance around this topic is almost entirely baseless. Not to invalidate people's feelings that do feel anxiety about it, especially because the folks that get disturbed by this tend to be actually extremely intelligent, hyper rational, smart people.
105
00:08:14,232 --> 00:08:25,518
Chris: But the premises that lead to these anxious feelings that people have are tenuous at best. Or honestly, I'd even call some of these premises, like silly if you really look at them.
106
00:08:25,574 --> 00:08:26,846
Kayla: Well, I call you silly.
107
00:08:26,918 --> 00:08:32,046
Chris: And if I didn't think that they were silly, we wouldn't be doing this episode. You have my word.
108
00:08:32,118 --> 00:08:34,998
Kayla: Mm. Just cause you think they're silly, don't.
109
00:08:35,014 --> 00:08:35,837
Chris: Get mad at me.
110
00:08:35,933 --> 00:08:42,238
Kayla: Doesn't mean I'm gonna think that they're silly until, see, you're just unilaterally deciding that it's silly until we get to.
111
00:08:42,254 --> 00:08:44,278
Chris: The end of the episode. Don't get too mad.
112
00:08:44,334 --> 00:08:45,148
Kayla: I'm mad, then.
113
00:08:45,194 --> 00:09:14,312
Chris: And then for everybody else, content warning. There may be some anxiety inducing topics that we will talk about as we continue. Okay, so if you're still with us after the content warning, which I'm actually guessing is most of you, because the thing is, content warnings, as good as they are, can tend to have a bit of a Streisand effect. Do you want to explain the Streisand effect? Since we're, like, having to explain, this is going to be an explanation heavy episode. Do you want to explain that one?
114
00:09:14,376 --> 00:09:30,552
Kayla: I am not really feeling like being cooperative right now because I'm very mad at you. But sure, I'll go ahead and explain the Streisand effect because I definitely wanted to do work today. The Streisand effect is the name that has been given to an effect that is related to Barbra Streisand. The end. No.
115
00:09:30,616 --> 00:09:33,840
Chris: Wow. That's a very good explanation. Thank you.
116
00:09:33,920 --> 00:09:51,314
Kayla: So Barbra Streisand, the story goes, really didn't want there to be any pictures of her luxurious, I believe, Malibu home anywhere on the Internet or, like, anywhere. And so she was like, no pictures of my house. Which naturally made people interested in knowing what the fuck Barbra Streisand's house looked like.
117
00:09:51,322 --> 00:09:53,194
Chris: I think that she even got her lawyers involved.
118
00:09:53,242 --> 00:10:04,852
Kayla: Oh, yeah. No, it was like a thing. It was like she kept doing more and more things to try and get pictures off, cease and desists, whatever, all of those things to try and get pictures of her house removed from the collective consciousness.
119
00:10:05,026 --> 00:10:05,560
Chris: Yeah.
120
00:10:05,640 --> 00:10:23,880
Kayla: And the Streisand effect is when trying to prevent people from seeing a thing makes them want to see the thing more than if you had just not tried to prevent them in the first place. So what you're saying with the content warning is that, like, if you say, hey, there's anxiety-inducing stuff in here, it's gonna make people want to listen more, right?
121
00:10:23,920 --> 00:10:32,032
Chris: It's like every time we watch an HBO show and, like, at the beginning, it says, violence, strong sexual content, nudity, we're like, yes, awesome.
122
00:10:32,096 --> 00:10:34,100
Kayla: It's gotta have all of those. Otherwise I'm not watching.
123
00:10:34,770 --> 00:10:54,370
Chris: But, yeah, I mean, basically, this Streisand effect is what the Internet does with censorship, right? When something gets censored and then everybody reads it. Like with The Catcher in the Rye, right? Being censored only had the effect of making it much more widely read than it probably would have been otherwise. So anyway, it's kind of like. Sorry.
124
00:10:54,410 --> 00:11:01,750
Kayla: I'm interrupting again, but it's kind of like how every time I go, oh, no, what if somebody, we do an episode about sues us? You're like, hell, yeah, I hope they sue us.
125
00:11:01,870 --> 00:11:20,566
Chris: Yeah, yeah, exactly. Which may happen this time, too. I don't know. But I'm not just saying the Streisand effect because of the content warning, it's also something that's going to come up later in the show. So if you're keeping score at home, today's topic overlaps with the game, which is simply the thing that you lose, if you think about it. So you actually just lost it again.
126
00:11:20,638 --> 00:11:21,334
Kayla: Yep.
127
00:11:21,502 --> 00:11:31,308
Chris: The Streisand effect, which we just described, and a weird content warning about, like, overthinking stuff. So not sure if it's a cult yet or not, but very clearly weird.
128
00:11:31,404 --> 00:11:32,396
Kayla: Yeah, sounds like it.
129
00:11:32,468 --> 00:12:27,986
Chris: Very weird. So let's mentally journey back into the distant past, a much simpler era. The year is 2010, and the month is July. President Obama is halfway into his first term, and the biggest thing we're all worried about right now is the BP oil spill. Like, that's it. That's the whole thing. That's all of 2010. That's the biggest thing we had to worry about. Much simpler time. But someone on the Internet had something bigger and much stranger to worry about. In July 2010, a regular contributor to an online forum called lesswrong.com made a post. What is lesswrong.com, you ask? Here is the quickie blurb from Wikipedia: LessWrong is a community blog and forum focused on discussion of cognitive biases, philosophy, psychology, economics, rationality, and artificial intelligence, among other topics.
130
00:12:28,138 --> 00:12:40,394
Chris: I would also add, crucially, decision theory, which is like this weird esoteric discipline that discusses and tries to formulate, like, why people make certain decisions under certain circumstances.
131
00:12:40,482 --> 00:12:42,750
Kayla: This sounds so heavy.
132
00:12:45,450 --> 00:12:46,146
Chris: Right?
133
00:12:46,298 --> 00:12:48,154
Kayla: The Internet is for looking at cute pictures of cats.
134
00:12:48,202 --> 00:13:24,358
Chris: I know, but, oh, man, I spent a little bit of time on less wrong for this episode. Like, someone like me, like, I do find it fascinating, sure, but it is super heavy. And as we'll talk about later, man, is there a lot of inside-baseball jargon on that site. Like, they're throwing acronyms around, and, like, yeah, it's pretty nuts. But basically they're just, like, a bunch of hyper-nerd, rationalist, AI-loving people. Like, they're a bunch of singularitarians. Okay, basically, don't mess.
135
00:13:24,454 --> 00:13:26,862
Kayla: Oh no, you just disturbed him way more than I did.
136
00:13:26,886 --> 00:13:33,070
Chris: We disturbed the podcast cat. What was it? Podcat. Yes, podcat.
137
00:13:33,190 --> 00:13:34,410
Kayla: Podcat.
138
00:13:36,630 --> 00:13:49,958
Chris: Anyways, so that's what less wrong is. So it's just an online forum. The site was founded by an AI researcher by the name of Eliezer Yudkowsky, who, I think I'm pronouncing that name right, but, like, such an awesome name.
139
00:13:50,014 --> 00:13:50,998
Kayla: That is a really good name.
140
00:13:51,054 --> 00:14:07,454
Chris: Eliezer Yudkowsky. Anyway, not Eliezer, but the other person I referenced above, the community member who made this strange and worrisome post on lesswrong.com. The name he goes by on this community forum is Roko.
141
00:14:07,582 --> 00:14:10,710
Kayla: No, I don't want to talk about this. I really, I.
142
00:14:10,870 --> 00:14:18,770
Chris: And he has, "really don't want to talk about this," to some people, terrifying ideas about a super intelligent AI that doesn't exist yet.
143
00:14:19,310 --> 00:14:22,050
Kayla: Some people. Smart people.
144
00:14:22,670 --> 00:14:33,694
Chris: Yeah. So anyway, yes, my friends, this is why we had to give a content warning at the top of the show today. We are talking about none other than Roko's basilisk.
145
00:14:33,862 --> 00:14:36,726
Kayla: Fuck yeah. Is it roko or rocko?
146
00:14:36,838 --> 00:14:37,598
Chris: I don't know.
147
00:14:37,694 --> 00:14:38,638
Kayla: We'll say Roko.
148
00:14:38,734 --> 00:14:51,184
Chris: I'm gonna say roko. And if you say rocko, then I'll yell at you because that's what you're supposed to do when somebody pronounces something totally validly a different way? Gif. Gif. We'll do that episode sometime.
149
00:14:51,312 --> 00:14:54,696
Kayla: That is, there is definitely cult behavior there for sure.
150
00:14:54,888 --> 00:15:26,232
Chris: So if you know even a little bit about Roko's basilisk, you know that this shit gets insanely esoteric. I mean, you've already kind of gathered that by the way I described lesswrong.com. So it gets, yeah, really esoteric really quickly. And we're going to try to swim into that deep end of the esoteric pool as much as we can get away with. So before we do, though, I want to say, are you with me so far? Like, do you have any questions about what we've already talked about, less wrong, Mr. Yudkowsky, or Roko, or just sort of like the table we've set so far?
151
00:15:26,376 --> 00:15:31,580
Kayla: Do you want me to have questions? I don't have questions, but I can make one up.
152
00:15:32,480 --> 00:15:38,530
Chris: This is just me trying to make sure that everybody is still on board the understanding train.
153
00:15:38,570 --> 00:15:40,466
Kayla: So far I'm good.
154
00:15:40,578 --> 00:16:24,292
Chris: Okay, listeners at home, any questions? No? Okay, sounds good. So let's start with the basics. What even is Roko's basilisk? What the hell am I talking about? Alright, because I talked about this guy. He made a forum post online. What does that have to do with anything? How is it a cult? Why is it, why am I talking about AI? So before I answer, and this goes back to what we were talking about just a second ago, I want to reiterate something also passingly mentioned in the content warning. If you start feeling freaked out at any point during the show here, just remember one thing. The theme of this whole episode is about how dangerous mere information can be. And also that we ought to practice responsibility with information that is potentially dangerous.
155
00:16:24,476 --> 00:16:44,092
Chris: I did a lot of considering and thinking about whether this episode itself was a safe enough parcel of information to disseminate across our platform and our audience. And I decided pretty conclusively that it was. So if you stick with me till the end, I think you'll see why and you should have any possible anxiety of yours assuaged.
156
00:16:44,276 --> 00:16:45,572
Kayla: Is that how you pronounce that word?
157
00:16:45,636 --> 00:16:46,388
Chris: I don't know.
158
00:16:46,524 --> 00:16:48,012
Kayla: I always avoid saying it.
159
00:16:48,116 --> 00:16:53,708
Chris: Well, now we're on a podcast so I have to say it. I guess I didn't have to type it. This is my own damn fault.
160
00:16:53,724 --> 00:16:56,776
Kayla: Yeah, that is your own damn fault. Anyway, past me. I trust you.
161
00:16:56,888 --> 00:17:09,992
Chris: Okay? I'm glad that you trust me. I'm glad that I'm part of your trust network. Maybe by now. Okay, so again, what is Roko's basilisk? Basilisk. Oh, this is not a good one.
162
00:17:10,016 --> 00:17:13,088
Kayla: For my perceived list that you don't have.
163
00:17:13,223 --> 00:17:37,650
Chris: So the simplest way I can put it is Roko's basilisk is the name given to a thought experiment about a hyper-intelligent future artificial intelligence. This future AI then captures, or whatever, any humans that knew about the potential for it to exist, such as you and me right now, because we're talking about it, and every person listening to this, and then tortures them for the rest of their lives.
164
00:17:37,770 --> 00:17:39,830
Kayla: For the rest of their lives or the rest of ever.
165
00:17:40,970 --> 00:17:43,298
Chris: I think there's differing opinions on that.
166
00:17:43,354 --> 00:17:44,074
Kayla: Okay.
167
00:17:44,242 --> 00:17:52,552
Chris: It gets into some weird stuff about, like, the AI may not actually torture you. It may torture some simulation of you.
168
00:17:52,616 --> 00:17:53,248
Kayla: What?
169
00:17:53,424 --> 00:18:05,872
Chris: But then, like, the people on less wrong, and this is sort of getting ahead of ourselves here, but some of them feel like a simulation of you is indistinguishable from you. So it basically is you.
170
00:18:05,936 --> 00:18:06,304
Kayla: Right?
171
00:18:06,392 --> 00:18:09,616
Chris: Or like maybe that's already happening. Like some weird stuff like that.
172
00:18:09,768 --> 00:18:10,864
Kayla: I mean, that I believe.
173
00:18:10,952 --> 00:18:30,698
Chris: Yeah, I know. Yeah, that's after 2016, I believe that bit of it. But anyway, there's. Yeah, so there's kind of different. Different ways that it executes the whole torture scenario bit. But anyway, the point is, if it comes to be and realizes that you knew it could have come to have been but didn't help it come to.
174
00:18:30,714 --> 00:18:33,954
Kayla: Be, it will know, because it's an AI and all, and because it's a.
175
00:18:34,002 --> 00:18:57,672
Chris: Super intelligent AI, it will know, then it will find you and it will torture you. And so why does it do this? And by the way, let me know if at any point you're not following me or any of the logic here, as weird as it may be. So the Roko's basilisk AI is not evil per se. Actually, this is one of the things I didn't realize and I thought was quite interesting.
176
00:18:57,776 --> 00:18:58,632
Kayla: How is it not evil?
177
00:18:58,696 --> 00:19:00,120
Chris: Almost freakier this way.
178
00:19:00,160 --> 00:19:00,984
Kayla: I don't like it.
179
00:19:01,112 --> 00:19:48,300
Chris: In fact, it may be friendly, or probably will be, and is quite good. In fact, it may even be so good, so beneficial for mankind, that it runs some simple calculations once it comes online. And those calculations go like this: the AI's existence is so massively beneficial to all humans that every single day, hour, even minute that it took for it to be developed, that it wasn't developed yet, may have literally cost real human lives, maybe thousands, maybe hundreds of thousands. And by the way, if you're wondering, this is worse. If you're wondering how exactly an AI could be so beneficial, let's just say, for the sake of understanding the thought experiment, that this AI, among other things, can instantly come up with, like, cheap cures for all known cancers. Right? Something like that.
180
00:19:48,340 --> 00:19:48,660
Kayla: Right.
181
00:19:48,740 --> 00:20:13,306
Chris: Because the idea about super intelligent AIs, a key assumption that we're working with and that they work with on this website, is that those super intelligent AIs will be so far above us in terms of their intelligence that it feels like they're going to wield powers that we might view as godlike. I mean, yeah, they'll just be so powerful, so intelligent, that they'll seem like godlike beings, essentially. And we'll come back to that too, by the way, that idea.
182
00:20:13,378 --> 00:20:18,706
Kayla: But also, yeah, it's way worse for him for Rocko to not be evil. To not be evil.
183
00:20:18,778 --> 00:21:05,470
Chris: Yeah. So actually, I'd never heard that before. That was how it was originally posited. Like, the whole evil thing is sort of just an artifact of, like, well, it's a torturing AI, oh my God, that's an evil thing to do. But actually, the way that it was posited is that you've got this godlike AI bringing about this massive benefit, wealth, discovery, maybe ushering in a golden age, the likes of which we've never seen. So much so that the AI, in its infinite mathematical wisdom, knows that if it were now to punish people who knew of its potential future existence but did not help bring it about, that would motivate and hasten its development, which is actually a good thing because of how much benefit it's bringing.
184
00:21:06,730 --> 00:21:07,990
Kayla: I hate this.
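[A minimal, purely hypothetical sketch of the utilitarian arithmetic Chris is describing here. Every number and name below is invented for illustration; neither the episode nor the original post supplies real figures.]

# Invented "cost of delay" math the basilisk is imagined to run. All values
# are placeholders purely to illustrate the utilitarian reasoning.

LIVES_SAVED_PER_DAY = 150_000     # assumed rate of lives the AI saves once it exists
AVOIDABLE_DELAY_DAYS = 365 * 10   # assumed: its arrival could have come ten years sooner

def lives_lost_to_delay(days_of_delay: float) -> float:
    """Utilitarian accounting: every day without the AI 'costs' this many lives."""
    return days_of_delay * LIVES_SAVED_PER_DAY

def net_utility_of_punishment_threat(days_deterred: float, harm_of_torture: float) -> float:
    """Lives saved by scaring people into helping sooner, minus the harm of the torture itself."""
    return lives_lost_to_delay(days_deterred) - harm_of_torture

# Under these made-up numbers, deterring even one month of delay dwarfs the harm
# done to the handful of people punished, which is the cold utilitarian step the
# thought experiment leans on.
print(lives_lost_to_delay(AVOIDABLE_DELAY_DAYS))
print(net_utility_of_punishment_threat(days_deterred=30, harm_of_torture=1_000))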
185
00:21:08,490 --> 00:21:41,812
Chris: And in fact, Roko says, Rocko, we're going back and forth. Roko even says in his post something about, like, it's something, I forget the word. It's like he says, like, brutal in the way only utilitarianism can be. So this dovetails, which again, we'll get to that, quite nicely with utilitarian morality. And utilitarian morality is something that is very prominent on lesswrong.com. Okay, one of the things you should maybe keep in mind is that utilitarianism is not the only framework we have for morality.
186
00:21:41,996 --> 00:21:43,884
Kayla: I've never heard that before in my life.
187
00:21:43,972 --> 00:21:44,788
Chris: Utilitarianism?
188
00:21:44,844 --> 00:21:45,300
Kayla: No.
189
00:21:45,420 --> 00:22:08,732
Chris: Oh, okay. Well, I actually have a section where we're going to talk about it, but we can talk about it now just real quick, just to kind of level set what it is, and then we can keep going. Utilitarianism is basically: what if morality was math? So, for example, think about. Well, no, but everybody's heard the utilitarian thought experiment of the trolley.
190
00:22:08,796 --> 00:22:09,612
Kayla: Oh, yeah. Okay.
191
00:22:09,676 --> 00:22:30,756
Chris: Right. So the trolley is, you know, there's a train, or trolley, and it's coming down the tracks and it's totally going to hit five people. But you can pull the lever and cause it to go on the other track and hit only one person, but then you're murdering someone.
192
00:22:30,828 --> 00:22:31,412
Kayla: Yeah.
193
00:22:31,556 --> 00:22:45,988
Chris: So would you murder one person to save five? And the idea behind that thought experiment, which, actually, a little tangent here, because I wound up reading a little bit more about that. You know how these things are, man.
194
00:22:46,164 --> 00:22:47,292
Kayla: You're measuring the coastline.
195
00:22:47,356 --> 00:22:51,684
Chris: You're measuring the coastline, you start zooming in and it just gets more complex the more you go.
196
00:22:51,732 --> 00:22:52,360
Kayla: Yep.
197
00:22:52,820 --> 00:23:12,716
Chris: So, side tangent about this: that's supposed to tell whether you're utilitarian or not. I think there's actually a name for it if you say, I don't do anything, because I'm not going to kill someone. That is the intent behind that question. Because the utilitarian is supposed to say, well, I would definitely kill the one person to save the five.
198
00:23:12,788 --> 00:23:13,400
Kayla: Right.
199
00:23:13,740 --> 00:23:34,944
Chris: But actually they find, when they put people into brain scans and ask them that question, whatever the section is of their brain that lights up when they're doing that, it's more of a measure of, like, sociopathy than it is of utilitarianism. Their brain lights up more in the, like, yeah, I don't care if people have to die, way.
200
00:23:35,072 --> 00:23:41,480
Kayla: Wait, so if you kill the one to save the five, you're a sociopath, or if you don't do anything, you're a sociopath?
201
00:23:41,520 --> 00:24:20,898
Chris: It's if you kill the one to save the five. But let me check on that. Da da da. Philosophically, the sacrificial dilemma has a narrow purpose. Your choice supposedly illuminates. I'm reading this from one of the sources that I'll post in the show notes. It's from a really interesting website, actually, called lastwordonnothing.com. We can talk about what that is. But anyway, reading from here: philosophically, the sacrificial dilemma has a narrow purpose. Your choice supposedly illuminates whether you fall into one of two camps on moral reasoning. Choose to hypothetically end a life and save a few more, and yours is described as a utilitarian judgment. Reject it, and you are said to be making a non-utilitarian, which they call deontological, judgment.
202
00:24:20,994 --> 00:24:22,482
Kayla: Deontological.
203
00:24:22,586 --> 00:24:22,986
Chris: Yeah.
204
00:24:23,058 --> 00:24:23,994
Kayla: I love that.
205
00:24:24,122 --> 00:24:40,406
Chris: So roughly translated, the utilitarian is concerned primarily with outcomes, while the deontologist has a morally absolute point of view that holds that you couldn't tell a lie to save someone's life because it's wrong to tell a lie. And they even mentioned that Kant is the most extreme member of this camp.
206
00:24:40,598 --> 00:24:44,062
Kayla: Wait, so if you wouldn't tell a lie to save someone's life, you're deontological?
207
00:24:44,126 --> 00:24:52,490
Chris: Correct. It means you have morally absolute positions, like, it's wrong to kill, period. I don't care that there's five people on the other side of the track.
208
00:24:52,870 --> 00:24:54,126
Kayla: That's just wrong.
209
00:24:54,278 --> 00:25:24,750
Chris: Well, maybe you're a utilitarian, then. Maybe you should go spend some time on lesswrong.com. In any case, they go on to say that doing some of these experiments reveals that not only does a utilitarian response not actually reflect a utilitarian outlook, it may actually be driven by broad antisocial tendencies, such as lowered empathy and a reduced aversion to causing someone harm. Which kind of makes sense in the real world: given a choice between two kinds of harm, most people wouldn't be able to cost it up quite so coldly.
210
00:25:25,690 --> 00:25:27,410
Kayla: That makes sense, right?
211
00:25:27,530 --> 00:26:08,768
Chris: Which is part of the thing with a trolley problem, too, is that it's like, I think most people that think about the trolley problem, or at least when it's posed to them for the first time, go like, isn't there any way that we could, like, just put on the brakes, right? Or could we derail the trolley? Could we just untie the guy? Like, it's hard to. When you completely strip away that real world context, then you find yourself in the situation where the answer that you give says more maybe about, like, one aspect of your brain than the other. You know, if you have to force you to answer that, then it's like, sure, yeah, five is better than one, I guess. But most people, I think, have, like, maybe an initial aversion to it.
212
00:26:08,904 --> 00:26:18,088
Chris: And if you don't, I think that's what they're saying. Like, five is greater than one. Yeah. Do that. Kill the one person for sure, right? Versus saying, like, can I save all of them? I don't know.
213
00:26:18,144 --> 00:26:38,498
Kayla: Right, right. Or, like, because for me, when I'm posed that question, I know I would want to say, if the only options are kill five or kill one, I know I want to say I'd throw the switch and kill the one person. But then if I'm really, like, honest with myself, I don't know if I would actually be able to do it. It's hard.
214
00:26:38,514 --> 00:26:38,714
Chris: Right.
215
00:26:38,762 --> 00:26:47,322
Kayla: Like, even if morally, I'm like, that's the thing you should do. But I don't actually know if I could do it. I think I would freeze up and panic and not even be able to, like.
216
00:26:47,426 --> 00:27:14,148
Chris: Right. And there's a lot of other things, too, that play into it that, like, you have to control for, you have to ignore in the trolley problem. Which is, like, probably what you would do is save whoever you were, like, closest to. Like, if the person that you were killing was, like, in the room with you saying, please don't kill me, and the other five people were, like, half a globe away, then you would probably let the other five people die.
217
00:27:14,204 --> 00:27:14,396
Kayla: Right?
218
00:27:14,428 --> 00:27:36,058
Chris: Like, there's a whole thing, too, about how proximity makes it harder. A lack of proximity makes it much easier to kill, which is, like, part of a problem with our modern weapon systems. It's really hard to kill someone with a knife. It's a little bit easier to kill them with a gun. It's easier to kill them with, like, a sniper rifle or a mortar, and it's even easier to kill them with a drone system from half a continent away.
219
00:27:36,114 --> 00:27:36,690
Kayla: Right.
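[Before the conversation turns back to the basilisk, here is a toy sketch, not drawn from the episode's sources, of the utilitarian-versus-deontological split from the trolley discussion above; the function names and the one-versus-five counts are assumptions made purely for illustration.]

# Toy contrast of the two camps described above, applied to the classic trolley
# setup: pull the lever and one person dies, do nothing and five people die.

def utilitarian_choice(deaths_if_pull: int, deaths_if_do_nothing: int) -> str:
    """Outcome-only reasoning: pick whichever action results in fewer deaths."""
    return "pull lever" if deaths_if_pull < deaths_if_do_nothing else "do nothing"

def deontological_choice(pulling_counts_as_killing: bool = True) -> str:
    """Rule-based reasoning: actively killing is wrong, period, regardless of the count."""
    return "do nothing" if pulling_counts_as_killing else "pull lever"

print(utilitarian_choice(deaths_if_pull=1, deaths_if_do_nothing=5))  # prints "pull lever"
print(deontological_choice())                                        # prints "do nothing"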
220
00:27:36,850 --> 00:28:02,690
Chris: So anyway, back to utilitarianism. That's one of the things that Roko mentions in his post, is that there's this harsh utilitarian nature to the thought experiment: that this robot is actually maybe doing good. And because it's doing so much good, it's actually good for it to torture these people that knew about it and didn't bring it into existence.
221
00:28:02,730 --> 00:28:03,418
Kayla: Yep.
222
00:28:03,594 --> 00:28:17,126
Chris: Right. Now, again, if you're getting freaked out about this, trust me, there's plenty to undermine this, not least of which being that utilitarianism is not the be-all and end-all for morality, which means that's not definitely the thing that a robot would do.
223
00:28:17,198 --> 00:28:23,758
Kayla: I also don't know if you've quite illustrated why people might be getting freaked out about this at this point.
224
00:28:23,934 --> 00:28:25,702
Chris: Yeah. Which is actually kind of a good thing.
225
00:28:25,766 --> 00:28:28,126
Kayla: Like, yeah, so I was gonna say the why, but.
226
00:28:28,198 --> 00:28:46,612
Chris: Well, so, yeah, I mean, we'll kind of get to that. But it's also, like, people like our listeners, unless they're also less wrong forum members, are probably gonna have to make some mental leaps to get to the point where they're like, oh, I see why it freaks people out. Which is actually good.
227
00:28:46,716 --> 00:28:47,220
Kayla: Right.
228
00:28:47,340 --> 00:29:30,162
Chris: And normal, I would almost say. And that's also kind of what makes this a good topic for Cult or Just Weird, is that there's, like, a certain subset of people with their own rituals and vocabulary and everything that make this, like, an effective sort of harm that happened to them. That's kind of what makes it a good topic for the show. Let me go back to the point in the script where I was. So we were talking about Roko's basilisk and how it has this utilitarian imperative to say, I'm gonna torture people if they knew about it and didn't help bring me into existence. If you're still not kind of getting it, try this analogy. Let's say you're trapped in a.
229
00:29:30,226 --> 00:30:09,050
Chris: In a house, and there's only two rooms in this house, and you're trapped in one room with a bunch of people, but in the other room next to you is an abusive monster, and the monster is locked up, as the door is locked. So if even one person starts trying to unlock the door, you might reason out that, oh, boy, you better also help unlock the door so you don't get punished for effectively increasing the time he's been locked up in that room. So the monster doesn't actually even have to say anything to you or your fellow captives, or even know that you started picking the lock, to have his future decisions and your current motivations interact in this weird way that makes the lock get picked faster.
230
00:30:09,130 --> 00:30:09,778
Kayla: Right.
231
00:30:09,954 --> 00:30:15,338
Chris: And that may make you worry now that, oh, shit, I'm gonna get tortured by this future thing that may or may not happen.
232
00:30:15,394 --> 00:30:15,946
Kayla: Right.
233
00:30:16,098 --> 00:30:23,634
Chris: That's basically Roko's basilisk in a nutshell. If you know about it and then it becomes real and you didn't help it exist, it will torture you forever.
234
00:30:23,722 --> 00:30:24,390
Kayla: Yep.
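[A small illustrative sketch, not taken from Roko's post, of the incentive loop Chris just summarized, from the point of view of someone who has heard the idea; the outcome labels are invented for illustration.]

# Why the information itself is treated as the trap: the threat only binds
# people who have heard the idea. Outcome strings are invented labels.

def outcome(knows_about_it: bool, helps_build_it: bool, basilisk_exists: bool) -> str:
    if not basilisk_exists:
        return "nothing happens; it was never built"
    if not knows_about_it:
        return "spared; it only targets people who knew and refused"
    return "spared; you helped" if helps_build_it else "tortured; you knew and didn't help"

for knows in (False, True):
    for helps in (False, True):
        print(f"knows={knows}, helps={helps} -> {outcome(knows, helps, basilisk_exists=True)}")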
235
00:30:25,010 --> 00:31:09,806
Chris: So let's just say this has freaked some people out, mostly on the less wrong forum, and me and Kayla, which we will get more into later. Once again, if you're feeling that, I promise you, this entire thought experiment sits on a house of cards, and a lot of the cards are missing, so try not to worry too much. But you were just asking me a little bit about whether people understand why it freaks people out. So I think I've kind of illustrated that, because it's the idea that, you know, oh, no, I'm worried about this future artificial intelligence that, just because I know about it, is going to torture me. And actually, unfortunately, that's actually sort of the most interesting part to me about Roko's basilisk: the danger isn't really the AI, it's not actually the basilisk.
236
00:31:09,918 --> 00:31:14,326
Chris: And, in fact, the basilisk isn't even really referring to the AI itself.
237
00:31:14,398 --> 00:31:14,998
Kayla: Oh, really?
238
00:31:15,094 --> 00:32:00,786
Chris: It refers to the thought experiment. It refers to the piece of information. So this goes back to the top of the show when I was talking about, you know, the information being dangerous. That's the actual danger. Knowing about Roko's basilisk, if you believe in it, is the thing that makes it dangerous to you. And in fact, that's where the badass name comes from. Obviously, the Roko part is named after the less wrong forum poster that posted about it. The basilisk part references the mythological creature known as the basilisk, whose power as a creature is that anyone that even looks at it dies instantly just from having seen it. Something I learned in doing the research for this episode, actually, is that Roko's basilisk is already a second-degree reference. It's not actually referencing the mythology.
239
00:32:00,858 --> 00:32:33,294
Chris: I mean, it is, but really it's referencing something else that's referencing the mythology. It's from a short science fiction story called BLIT. It sounds terrifying. Actually, BLIT doesn't sound terrifying. I read about the story, and the story sounds terrifying. In this story, humanity has developed certain images, which they call basilisks, that have some specific patterning in them such that if a human looks at them, it triggers the brain to undergo a lethal shutdown. Sort of like introducing a virus into a computer that causes it to crash.
240
00:32:33,342 --> 00:32:34,250
Kayla: Jesus Christ.
241
00:32:37,070 --> 00:32:48,320
Chris: So Roko's basilisk gets its surname, actually, from that short story, which references the whole, if you look at it, it's already too late, you're fucked. A short aside here. You know the movie The Ring?
242
00:32:48,480 --> 00:32:49,024
Kayla: Yes.
243
00:32:49,112 --> 00:33:27,416
Chris: So when The Ring was first advertised in theaters, I thought, like. So what the movie is actually about is, like, if you watch a video, then a ghost comes out of the TV and, like, you know, kills you in seven days. In seven days. When I originally saw the trailer for it, I didn't know it was a ghost thing. I thought it was more of, like, if you watch this video, it. I actually thought that it was like this basilisk sort of thing, that if you watch the video, it just, like, fucks up your brain mechanics and then, like, you die from it. Which to me was always a much cooler, scarier thing than what I learned it was, just about another ghost. I was just like, eh, ghost, whatever. But this idea, did you ever see it?
244
00:33:27,528 --> 00:33:32,904
Chris: Yeah, it was decent. In any case, just a side there about how I thought The Ring was actually this.
245
00:33:33,072 --> 00:33:35,160
Kayla: Aren't you just so smart?
246
00:33:35,240 --> 00:33:43,590
Chris: Oh, man. Yeah, I am. Super smart. Somebody please give me a movie to make, because I'm obviously way ahead of the curve on this stuff.
247
00:33:43,630 --> 00:33:44,502
Kayla: There you go.
248
00:33:44,686 --> 00:33:50,398
Chris: So now you also know why we referenced the game at the top of the show, which, by the way, you just lost again.
249
00:33:50,574 --> 00:33:51,270
Kayla: So rude.
250
00:33:51,310 --> 00:34:15,373
Chris: But a bunch of people have compared this whole phenomenon to the game, which really predates Roko's basilisk by a number of years. A few years, yeah. Which you. You can kind of see why, right? Because the whole thing is. Again, it's not about the AI. It's the thought experiment. That is the dangerous thing, because the thought experiment has caused people stress, and it caused people distress by this weird sort of, oh, my God, as soon as I know about it, I'm screwed.
251
00:34:15,460 --> 00:34:15,900
Kayla: Right?
252
00:34:16,005 --> 00:34:27,933
Chris: The same way that the game does. Like, as soon as I know about it, I've lost. It's sort of a high stakes, anxiety ridden version of the game. Perfect for millennials.
253
00:34:28,061 --> 00:34:29,049
Kayla: We need that.
254
00:34:30,069 --> 00:34:40,103
Chris: All right, quick breather time. Any questions about what Roko's basilisk is, or about its weirdo basilisk nature, or what a basilisk is?
255
00:34:40,152 --> 00:34:43,420
Kayla: What does Roko's basilisk look like in your mind's eye?
256
00:34:44,040 --> 00:34:46,815
Chris: It looks like a little lizard.
257
00:34:46,967 --> 00:34:48,136
Kayla: Mine looks like.
258
00:34:48,328 --> 00:34:49,487
Chris: Actually, you know what it looks like?
259
00:34:49,543 --> 00:34:50,000
Kayla: What?
260
00:34:50,120 --> 00:34:56,280
Chris: So, I mean, it looks like a lizard, but it looks like a very specific lizard. Once again, here's a reference. Magic: The Gathering.
261
00:34:56,320 --> 00:34:56,967
Kayla: Oh, God.
262
00:34:57,063 --> 00:35:04,762
Chris: There was a card in the first set of Magic called a basilisk, and it had the property that if it engaged in combat with any other creature, the other creature died.
263
00:35:04,776 --> 00:35:05,494
Kayla: Mm.
264
00:35:05,662 --> 00:35:13,318
Chris: Get it? Anyway, that's just the image I have in my head of a basilisk now. And it just kind of looks like a fucking iguana. Like, it's nothing special.
265
00:35:13,454 --> 00:35:22,690
Kayla: Well, mine looks like. Let me see if I can find, like, a picture of it. Okay. In my head, you remember Trogdor the Burninator?
266
00:35:23,110 --> 00:35:26,614
Chris: Vaguely I remember the name. I don't remember what he looks like.
267
00:35:26,622 --> 00:35:34,102
Kayla: Wait, wait. Trogdor the Burninator from Strong Bad, which, deep-ass cuts right now. He looked like this.
268
00:35:34,206 --> 00:35:34,910
Chris: Yeah, yeah.
269
00:35:34,950 --> 00:35:39,934
Kayla: So he looks like a terrible drawing of a dragon with a single buff arm.
270
00:35:40,062 --> 00:35:41,942
Chris: Yeah, he definitely skips leg day.
271
00:35:42,046 --> 00:35:45,094
Kayla: Like, in my head, he looks like.
272
00:35:45,142 --> 00:35:45,918
Chris: That's pretty good, actually.
273
00:35:45,974 --> 00:36:00,014
Kayla: Really realistic. Like a more, like, scary realistic version. And I literally just googled Trogdor the Burninator realistic. And I got, like, fan art, please show me, of a Magic card. That's what Roko's basilisk looks like to me.
274
00:36:00,102 --> 00:36:03,250
Chris: That is a very good depiction. It's not what I was thinking.
275
00:36:03,350 --> 00:36:04,346
Kayla: That's what it looks like to me.
276
00:36:04,378 --> 00:36:05,506
Chris: But it's much better.
277
00:36:05,658 --> 00:36:08,466
Kayla: Yeah, that's what Roko's Basilisk looks like to me.
278
00:36:08,498 --> 00:36:09,074
Chris: That's a good one.
279
00:36:09,122 --> 00:36:09,750
Kayla: Yeah.
280
00:36:10,290 --> 00:36:14,290
Chris: So now that you know what it is, let's go back to our little historical timeline.
281
00:36:14,370 --> 00:36:14,922
Kayla: Okay.
282
00:36:15,026 --> 00:36:44,308
Chris: So Roko makes his post in July of 2010, and it garners some thread comments on the site that vary from things like, this is an interesting post, to, this is super entertaining, to, I don't know, man, this is a little bit freaky. There are places that have preserved this original post. I mean, I didn't read every single comment on the post, but I didn't really get the impression from the comments that people were really losing their minds over it.
283
00:36:44,364 --> 00:36:45,092
Kayla: Interesting.
284
00:36:45,236 --> 00:37:01,118
Chris: But there's one comment that, I can almost guarantee, if this comment never happened, we wouldn't even know about Roko's basilisk to this day. And that was the comment reply from Eliezer Yudkowsky himself.
285
00:37:01,254 --> 00:37:02,530
Kayla: Oh, Jesus Christ.
286
00:37:02,910 --> 00:37:12,110
Chris: He is very active on the less wrong forums that he created. I've read some of his stuff. It's very interesting, very full of all of the jargon town.
287
00:37:12,190 --> 00:37:12,670
Kayla: Right.
288
00:37:12,790 --> 00:37:25,908
Chris: He seems to generally be, by the way, we'll get into him more later because, you know, criteria. But he seems like generally a good dude, smart dude. He actually came up with the whole concept of, like, a friendly super intelligence.
289
00:37:25,964 --> 00:37:26,444
Kayla: Right?
290
00:37:26,572 --> 00:37:31,036
Chris: A super artificial soup. A friendly artificial super intelligence.
291
00:37:31,068 --> 00:37:33,564
Kayla: A super friendlier, super friendly.
292
00:37:33,692 --> 00:37:38,364
Chris: Not very super smart, like friendly, but dumb. No, he came up with the idea of that. So.
293
00:37:38,452 --> 00:37:41,236
Kayla: Okay, so you're saying he's our charismatic leader?
294
00:37:41,308 --> 00:37:42,076
Chris: Maybe I'm saying that.
295
00:37:42,108 --> 00:37:42,628
Kayla: Oh my God.
296
00:37:42,684 --> 00:37:44,636
Chris: I mean, he started less wrong.
297
00:37:44,748 --> 00:37:46,548
Kayla: That's true. Oh, he started.
298
00:37:46,684 --> 00:37:48,356
Chris: He was the creator of less wrong.
299
00:37:48,388 --> 00:37:49,464
Kayla: Oh, I don't think I missed it.
300
00:37:49,464 --> 00:37:50,124
Chris: Did I not say that?
301
00:37:50,172 --> 00:37:51,924
Kayla: No, you might have. I think I just missed it.
302
00:37:52,012 --> 00:37:57,524
Chris: Yes, he is the creator of the less wrong forum, but he's also very active, posting and commenting on.
303
00:37:57,532 --> 00:38:05,220
Kayla: Okay, here's my theory. He is, he is the super friendly AI. And this. Oh my.
304
00:38:05,260 --> 00:38:07,780
Chris: You're gonna change your idea about super friendly here in a second.
305
00:38:07,820 --> 00:38:10,396
Kayla: No, but that's, he's the product.
306
00:38:10,588 --> 00:38:11,540
Chris: He's the thing.
307
00:38:11,620 --> 00:38:13,204
Kayla: This is. Oh my God.
308
00:38:13,372 --> 00:38:14,308
Chris: Is your mind blown?
309
00:38:14,364 --> 00:38:21,674
Kayla: Yeah, that's, he's the, I can't say he's the basilisk anymore. Cause the basilisk is the thought experiment. He's the AI. He's the AI.
310
00:38:21,802 --> 00:38:25,786
Chris: Okay, maybe he's trying to warn us. I think he's a person. But you're right. Maybe he's.
311
00:38:25,818 --> 00:38:26,530
Kayla: I think he's the AI.
312
00:38:26,570 --> 00:38:33,170
Chris: Maybe he's the friendly AI. That's. I mean, we're all. Here's the thing. We're all artificial intelligences anyway, right? We're all man made.
313
00:38:33,210 --> 00:38:34,090
Kayla: I know I am.
314
00:38:34,170 --> 00:38:37,586
Chris: Hey. Oh. I don't know if I would say intelligence in your case, though.
315
00:38:37,658 --> 00:38:42,066
Kayla: Oh, my God, you made a joke. That was basically the same joke that I made. You're so funny and smart.
316
00:38:42,098 --> 00:39:08,880
Chris: Mic drop. Yeah. Anyway, let me quote you just. He has a pretty long comment reply on this post, so I'm just gonna quote you part of it. One might think that the possibility of punishing people couldn't possibly be taken seriously enough by anyone to actually motivate them. But in fact, one person at SIAI, which we will talk about later, which is the Singularity Institute for Artificial Intelligence.
317
00:39:08,920 --> 00:39:09,792
Kayla: Oh, geez.
318
00:39:09,976 --> 00:39:39,642
Chris: One person at SIAI was severely worried about this, to the point of having terrible nightmares, though they wish to remain anonymous. I don't usually talk like this, but I'm going to make an exception for this case. Listen to me very closely, you idiot. All caps. You do not think in sufficient detail about superintelligences considering whether or not to blackmail you. And by the way, blackmail is the whole process of saying, if you don't help me come into existence, I'm going to punish you. That's the blackmail they're referring to.
319
00:39:39,706 --> 00:39:41,910
Kayla: I don't think I understood that sentence at all.
320
00:39:42,210 --> 00:39:43,706
Chris: The one I said or the one he said?
321
00:39:43,818 --> 00:39:46,430
Kayla: The one he said, yeah, me either.
322
00:39:46,770 --> 00:39:54,310
Chris: But here, let me. Let me reread it regardless, just because I like reading in all caps, you do not think. But you understand the blackmail bit, right?
323
00:39:54,650 --> 00:39:56,386
Kayla: No, I didn't understand the sentence.
324
00:39:56,458 --> 00:40:00,386
Chris: Okay, pause in the sentence. They. When they talk about.
325
00:40:00,498 --> 00:40:01,298
Kayla: Who's they?
326
00:40:01,434 --> 00:40:02,274
Chris: People on less wrong.
327
00:40:02,322 --> 00:40:02,610
Kayla: Okay.
328
00:40:02,650 --> 00:40:19,448
Chris: When they talk about the future Roko's basilisk intelligence punishing you if you didn't help it come into existence, they call it blackmail. The same way that I would be like, I have dirt on you, so you better give me money, or else I'm gonna release the dirt. It's that.
329
00:40:19,504 --> 00:40:21,624
Kayla: Okay, so Trogdor is blackmailing you.
330
00:40:21,792 --> 00:40:25,104
Chris: Roko is. But not Roko. Roko is the guy that made the post.
331
00:40:25,152 --> 00:40:25,312
Kayla: Yes.
332
00:40:25,336 --> 00:40:44,486
Chris: I'm saying yes, the AI is blackmailing you. Okay, you may wonder, how does the AI blackmail you when it doesn't exist yet? That's actually a key part of their whole thing, which we'll get to, but that's their shorthand for that whole thing we described is blackmail. Okay, so let me reread his sentences. And so he's replying to Roko here. You do not think.
333
00:40:44,518 --> 00:40:56,734
Kayla: Wait, wait. Okay, okay, okay. Sorry. I got really confused. I got confused. I got really confused. I thought that the guy who created less wrong was also the guy who posted about Roko's basilisk first. But no, it's two different guys.
334
00:40:56,782 --> 00:40:57,358
Chris: Two different guys.
335
00:40:57,414 --> 00:41:01,142
Kayla: That's why I was very confused. I was like, he's talking to himself. I don't understand. Okay, I get it.
336
00:41:01,166 --> 00:41:04,486
Chris: Okay, I understand now. No, so Roko posts this thing.
337
00:41:04,518 --> 00:41:05,064
Kayla: Yes.
338
00:41:05,222 --> 00:41:12,220
Chris: On less wrong. Eliezer Yudkowsky, the creator of less wrong, comes to this post and replies to it in a comment.
339
00:41:12,340 --> 00:41:12,852
Kayla: Yes.
340
00:41:12,956 --> 00:41:54,852
Chris: And then in the reply, he says the things I already said about somebody at SIAI having nightmares about this. And then he says, I don't usually talk like this, but I'm going to make an exception for this case. Listen to me very closely, you idiot. Talking to Roko. Okay: you do not think in sufficient detail about superintelligences considering whether or not to blackmail you. That is the only possible thing which gives them a motive to follow through on the blackmail. That's where I'm gonna end the quote. But he goes on to continue with his comment in equal measure of, let's call it urgency instead of hysteria. Okay, there's actually a bunch more of that quote. And I was saying, let's call it urgency instead of hysteria.
341
00:41:54,956 --> 00:42:32,922
Chris: I just don't want to call it hysteria because I think he actually had good reason for being so upset. And he's since talked about the fallout from Roko's basilisk. And he says that his upsetness, which I believe him, was not because he thought the idea itself had real merit, but rather that it was reasoned out powerfully enough by Roko. Like, he presented this case powerfully enough that it was causing people that had these thoughts and these ideas in their brain from already being less wrong members and thinking in certain ways and having this utilitarian mindset and all blah, blah, that it was causing people distress in his community.
342
00:42:33,026 --> 00:42:33,530
Kayla: Right.
343
00:42:33,650 --> 00:42:43,146
Chris: That makes sense, and that's super valid. Just remember when we were talking about how the most interesting part of the basilisk isn't the thing itself, but the fact that the idea was hurting people.
344
00:42:43,218 --> 00:42:43,546
Kayla: Right?
345
00:42:43,618 --> 00:43:19,040
Chris: So that's what this is. And he'd probably be pretty mad at me, then, for making a podcast about this, actually. But that's why I'm taking so much precaution to content-warn everyone, and also to get to the part where we clean up this mess we're making. And actually, we mentioned this before, but because this podcast is directed at a general audience, and not at members of less wrong who have all those mental preconditions for something like Roko's basilisk to bother them, it's possible that he may not even have a problem with this. I mean, there's been a ton of articles come out, which, that's what we're about to talk about here. We're finally at the part of the episode where we said we'd get back to the Streisand effect. Remember we were talking about the Streisand effect?
346
00:43:19,080 --> 00:43:21,024
Kayla: I do remember we talked about the Streisand effect.
347
00:43:21,112 --> 00:43:26,844
Chris: The Streisand effect being the whole censorship thing. If you censor Catcher in the Rye, it becomes more popular.
348
00:43:26,932 --> 00:43:27,252
Kayla: Right.
349
00:43:27,316 --> 00:43:30,676
Chris: If you strike me down, I shall become more powerful than you can possibly imagine.
350
00:43:30,748 --> 00:43:32,556
Kayla: So bummed you said that, because I was gonna say it.
351
00:43:32,628 --> 00:43:36,020
Chris: Oh, I should, man. I'm sorry. I should let you react more.
352
00:43:36,140 --> 00:43:37,948
Kayla: No, you. You're the bigger nerds. That's fine.
353
00:43:38,004 --> 00:43:54,032
Chris: Mmm. That's fair. Yeah, you're just the pretty face. So that's what ends up happening here, because what Yudkowsky did next, after he made that very long, angry, all-caps, idiot-calling comment, was to delete the post about Roko's basilisk.
354
00:43:54,056 --> 00:43:54,616
Kayla: Oh, jeez.
355
00:43:54,688 --> 00:43:57,632
Chris: And ban any discussion of Roko's basilisk on less wrong.
356
00:43:57,696 --> 00:43:58,168
Kayla: Oh, my God.
357
00:43:58,224 --> 00:44:00,872
Chris: Even, like, deleted stuff that got discussed after that.
358
00:44:00,936 --> 00:44:02,024
Kayla: Holy shit.
359
00:44:02,112 --> 00:44:07,248
Chris: He was very concerned about it, and he's also, I think, since said, like, that may have been the wrong thing to do.
360
00:44:07,304 --> 00:44:12,776
Kayla: Oh. Oh, my God. Oh, my God. He did it. It's his fault. He's the basilisk.
361
00:44:12,848 --> 00:44:13,216
Chris: Yeah.
362
00:44:13,288 --> 00:44:13,824
Kayla: Oh, my God.
363
00:44:13,872 --> 00:44:21,876
Chris: Well, I mean, I don't know about that, but, like, remember when I said if it weren't for this comment, we wouldn't be sitting here talking about it? That's 100% true.
364
00:44:21,948 --> 00:44:35,572
Kayla: But even though he tried to suppress the discussion about it, if the AI became alive, if it became real, he would not be up for torture because he actually did the work.
365
00:44:35,676 --> 00:44:57,574
Chris: That's true. The more people know about it. Maybe that's why he did the Streisand effect thing: because he was worried about Roko's basilisk and reasoned out that if he spread the information by streisanding it, then it'd be like, yo, good job, buddy. You helped people know about me, and now I'm real.
366
00:44:57,662 --> 00:45:02,238
Kayla: That's exactly what happened. We cracked this case.
367
00:45:02,294 --> 00:45:08,318
Chris: We cracked it. It's funny that we just went through that because that type of. Oh, if he knows, then I know. And then that thing.
368
00:45:08,374 --> 00:45:08,798
Kayla: Yeah.
369
00:45:08,894 --> 00:45:15,950
Chris: Is their whole site. That's all. Less wrong is. It's like they talked about, like, remember when I said most of what they do is decision theory?
370
00:45:16,030 --> 00:45:16,366
Kayla: Right.
371
00:45:16,438 --> 00:45:36,352
Chris: It's all that. It's all, well, if the AI knows that I know this, then it's gonna torture me, because XYZ. That's the whole thing. You already knew, even if we didn't say Streisand effect earlier, what effect it would have had if I just said, yeah, he banned it, and then, like, deleted stuff. The idea basically took off.
372
00:45:36,456 --> 00:45:36,792
Kayla: Right.
373
00:45:36,856 --> 00:45:44,232
Chris: And pretty soon, you have articles on vice, on Slate. Even business insider wrote about the damn thing.
374
00:45:44,416 --> 00:45:46,128
Kayla: Slate's where I found out about it.
375
00:45:46,184 --> 00:45:48,216
Chris: Yeah. Slate, I think, is where most people found out about it.
376
00:45:48,248 --> 00:45:51,584
Kayla: Slate. Do you remember when I sent the article to you? Do you remember? Yeah.
377
00:45:51,592 --> 00:46:21,402
Chris: Because it's called the most terrifying thought experiment of all time, is the name of the article. It's very click baity. I believe it took four years for it to get to the mainstream. It probably was starting to get into some Internet spheres before that, but I think the Slate article is the one that kind of took off. But, yeah, it really exploded into the general consciousness to the point where a lot of our listeners probably already know about it. Hell, I knew about it. And I don't think before doing this research that I had even heard of less wrong, which is where it originally came from.
378
00:46:21,506 --> 00:46:25,150
Kayla: Just to be clear, you knew about it because of me.
379
00:46:27,010 --> 00:46:38,190
Chris: The important thing here is that, yes, my own wife polluted my brain with a dangerous idea. And this was, like, way before I did all this research that made me, like, not worried about it at all.
380
00:46:38,230 --> 00:46:38,662
Kayla: Yeah.
381
00:46:38,766 --> 00:46:44,126
Chris: You were just like, hey, here's this thing that me sharing it with you fucks you already.
382
00:46:44,198 --> 00:46:44,590
Kayla: Yeah.
383
00:46:44,670 --> 00:46:47,502
Chris: Here's a look at this thing that could kill you just by looking at it.
384
00:46:47,526 --> 00:46:47,758
Kayla: Yeah.
385
00:46:47,814 --> 00:46:53,294
Chris: Yeah. Thanks a lot. You're welcome. I appreciate it. But had you heard of less wrong?
386
00:46:53,422 --> 00:46:53,950
Kayla: No.
387
00:46:54,070 --> 00:47:10,780
Chris: Yeah, I hadn't either, which is kind of crazy, by the way. You know that guy. What's his name? He's famous for being famous. He shitposts on Twitter. Oh, yeah. Elon Musk. He got together with Grimes because she made a tweet about Rococo's basilisk.
388
00:47:10,860 --> 00:47:11,228
Kayla: Yep.
389
00:47:11,284 --> 00:47:12,548
Chris: Which is just super clever.
390
00:47:12,604 --> 00:47:13,588
Kayla: Oh, my God. So funny.
391
00:47:13,644 --> 00:47:20,972
Chris: Do you want to explain? Should we explain Rococo or. We've already explained too much. It's just an art style that's, like, very ornate. Ornate.
392
00:47:21,036 --> 00:47:21,788
Kayla: Overly ornate.
393
00:47:21,844 --> 00:47:33,850
Chris: Overly ornate. But the point is that it sounds like Roko. So she said Rococo's basilisk, and then I guess Elon thought it was clever. He would just slide into her DMs, yada yada. Then they became an item.
394
00:47:33,930 --> 00:47:34,842
Kayla: Now they're having a baby.
395
00:47:34,906 --> 00:47:35,834
Chris: Now they're having a baby.
396
00:47:35,882 --> 00:47:37,130
Kayla: So let's be clear.
397
00:47:37,290 --> 00:47:39,386
Chris: Thanks. Less wrong. And Roko.
398
00:47:39,458 --> 00:47:48,522
Kayla: Elon Musk: inventor, science entrepreneur man. Grimes: alternative Canadian pop singer.
399
00:47:48,586 --> 00:47:55,876
Chris: Yeah, I prefer my description of famous for being famous and Twitter shitposter, but I guess you could sort of say he's an entrepreneur as well.
400
00:47:55,908 --> 00:48:01,580
Kayla: Oh, and then she went from being hardcore lefty to be, like, union busting because she got with him.
401
00:48:01,660 --> 00:48:06,540
Chris: Well, you know. Or is it because she. The Rococo's basilisk?
402
00:48:06,580 --> 00:48:10,436
Kayla: Maybe it was Rococo's basilisk. Either way, she's a union buster.
403
00:48:10,468 --> 00:48:56,300
Chris: Now, Elon, don't hate us, just in case we ever need to have you for an interview on the show, because we could totally do that. I know he does interviews on podcasts because I've seen him smoke. So, anyway, that's how Elon got together with Grimes, I guess. Bizarre Twitter references are like Tinder for the hyper wealthy, I guess. Yeah, I don't know. Anyway, so here we are, ten years later, and you and I are now part of the pop culture explosion that brought this whole weird post to the public consciousness. Distressing to some, but ultimately, I think, more interesting and more "cult or just weird." So, yeah, it feels good to be part of that whole thing. It's pretty cool. Question, though. Yeah, we've talked about this in, like, four or five different places now. How worried are you still about Roko's basilisk?
404
00:48:57,600 --> 00:49:11,012
Kayla: I mean, clearly I'm not so worried that I won't talk about it, and clearly I'm not so worried. I've done nothing in order to bring about its existence. Although I guess one could argue that.
405
00:49:11,076 --> 00:49:12,676
Chris: You haven't done shit. Although now we are.
406
00:49:12,708 --> 00:49:14,316
Kayla: Yeah, that's what I'm saying.
407
00:49:14,468 --> 00:49:15,556
Chris: This is our ticket out.
408
00:49:15,628 --> 00:49:29,908
Kayla: One could argue that by me bringing the article to you, I did my part in that. Now you have spread the word to our listeners. So. Roko's basilisk. I did my job. This is what I'm supposed to do.
409
00:49:29,964 --> 00:49:56,976
Chris: So, you know, there are some followers of this, whatever, thought experiment, followers of the basilisk, that think it's not "did you do something," it's "did you do as much as you possibly could?" Sure, we put it on our podcast, but did you liquidate your entire life savings? And, well, I feel like those people are wrong. Well, you're welcome to disagree, but there are some people that think that if you don't do as much as you possibly can, then you're gonna get tortured.
410
00:49:57,048 --> 00:50:15,286
Kayla: Here's the thing. The super friendly, intelligent AI has to be rational enough to understand that human beings' brains do not work in such a way that the mere possibility of something is going to be sufficient motivation to get them to do the thing.
411
00:50:15,358 --> 00:50:18,246
Chris: Oh, you're starting to pierce the veil. We'll get to that.
412
00:50:18,318 --> 00:50:18,854
Kayla: Okay.
413
00:50:18,942 --> 00:50:19,630
Chris: We'll get to that.
414
00:50:19,670 --> 00:50:20,590
Kayla: Okay, good.
415
00:50:20,750 --> 00:50:28,558
Chris: So, actually, that's sort of what this transition is that I was about to do. If you are still worried, it's time to start administering the cure.
416
00:50:28,694 --> 00:50:29,190
Kayla: Thank you.
417
00:50:29,230 --> 00:51:07,514
Chris: I need a lead into that. I'd like to. I'd like to read to you just an amazing and insanely relevant quote from one Philip K. Dick. It goes, for each person, there is a sentence, a series of words, which has the power to destroy him. Another sentence exists, another series of words which will heal the person. If you're lucky, you will get the second, but you can be certain of getting the first. That is the way it works. On their own, without training, individuals know how to deal out the lethal sentence, but training is required to deal out the second. End quote. I really like this Philip K. Dick quote.
418
00:51:07,562 --> 00:51:08,346
Kayla: That's a good quote.
419
00:51:08,418 --> 00:51:24,686
Chris: It's literally this. Right, right. And it's actually... I don't want to. We'll get to that. But we'll get to that. There's a whole interesting paper that I read for this episode, and it's about something called information hazards.
420
00:51:24,748 --> 00:51:25,730
Kayla: That's my band name.
421
00:51:25,770 --> 00:51:33,882
Chris: And information hazard is essentially the theme of this episode. It's that information itself can be harmful.
422
00:51:33,986 --> 00:51:37,350
Kayla: Right. That's what the government tries to tell us.
423
00:51:38,610 --> 00:52:02,720
Chris: It's super interesting. This paper is really interesting because you kind of go like, oh, I wonder what that's about. And he really outlined just more ways than you think that information by itself can be harmful. And kind of, when you get to the end of it, you're like, oh, yeah. Like, sort of. The pen is mightier. Right. It's that kind of thing. Like, information is actually extremely potent. The penis mighty.
424
00:52:04,780 --> 00:52:06,564
Kayla: Sorry. Oh, God.
425
00:52:06,732 --> 00:52:07,732
Chris: Oh, my God.
426
00:52:07,876 --> 00:52:10,400
Kayla: Oh. Welcome back.
427
00:52:13,340 --> 00:52:49,302
Chris: So we talked about that first series of words that can destroy you already. So let's start with the next series of words that can heal. To really understand not just what Roko's basilisk is, but why it is and how it came to be, which is part of this process, requires some background in actually a few different areas. And this is where we start talking about utilitarianism a little bit, which we kind of already talked about in the episode. So I won't have to say as much here, but the basic idea behind utilitarianism is that you are concerned with outcomes and you take a very mathematical approach to it.
428
00:52:49,406 --> 00:52:50,850
Kayla: Do you feel like that's you?
429
00:52:53,270 --> 00:53:14,590
Chris: Maybe? Yes. I don't know. So I think... is there, like, a word for flexitarianism? Because I think that there are some cases where you have to make utilitarian choices. So let me do this quick tangent. For me to answer that, I kind of have to talk about the Kobayashi Maru.
430
00:53:15,970 --> 00:53:17,578
Kayla: See, I told you were the bigger nerd.
431
00:53:17,634 --> 00:53:28,362
Chris: Yep, I definitely am. So for those of you listeners who don't know the Kobayashi Maru, good for you. That means you're not a nerd. It is a fictional Star Trek reference.
432
00:53:28,466 --> 00:53:29,666
Kayla: It's actually pretty cool, though.
433
00:53:29,778 --> 00:54:03,680
Chris: It's very cool. So the idea behind the Kobayashi Maru is that it's a test that they give to Starfleet cadets in their training. And it's a simulation test. So basically you go into this room. I mean, they do simulations in the navy and the air force and whatever, and it's similar to that. So you pretend you're a captain for a bit, and you have your teachers, or, like, all the officers, around you, and then you do this training thing. And the idea is that there's some distress signal coming from over the border of an alien race that the Federation is at war with. I think it's the Romulans.
434
00:54:04,140 --> 00:54:49,776
Chris: You have to do the simulation of going to rescue those people, but you have to make these hard decisions of like, if we go over there and try to rescue this ship, then it may cause even more harm because then we're violating this treaty. And bottom line is the whole thing is like a glorified sort of trolley problem that they give to cadets. And it's, you know, it's supposed to basically test to see how a captain in a command position will respond to situations that sometimes call for harsh decision making. Most people don't have to pull the lever to kill the one in favor of the five, but if you're the captain of a starship, you might have to. So that's why they give them these tests. And famously, Captain Kirk, and it's an unwinnable test, right?
435
00:54:49,808 --> 00:55:03,406
Chris: So, like, it is guaranteed you will lose the test. Nobody has ever won. It's designed to be lost. It's just how you lose it that is instructive. But famously, Captain Kirk is the only person to have ever won the Kobayashi Maru, because he cheated.
436
00:55:03,518 --> 00:55:04,510
Kayla: It's so hot.
437
00:55:04,630 --> 00:55:19,886
Chris: Yeah, because he was like, fuck this, and went in and reprogrammed the test ahead of time, so that it was like, and I think he actually didn't even reprogram it to be like, winnable. I think he reprogrammed it to be like, easy mode, so he could just.
438
00:55:19,918 --> 00:55:22,424
Kayla: I don't know if that's what happens in the new Star Trek, is it not?
439
00:55:22,472 --> 00:55:22,992
Chris: I don't know.
440
00:55:23,056 --> 00:55:30,592
Kayla: I think he just kind of goes like, fuck it, I'm gonna do it. Maybe I'm wrong. I think in the Chris Pine one, he doesn't do that, but I could be totally wrong.
441
00:55:30,776 --> 00:56:06,230
Chris: So the reason I bring that up is because I really like to use that as sort of a framework for how I think about morality sometimes, because I think that there are some cases when what Kirk does is not correct, and it's like, that's not gonna be the real world, man. In the real world, you're gonna have to make these hard choices, and sometimes you are going to have to, as the starship captain, or as the captain of an aircraft carrier, or as a leader of a company, whatever. As a leader, you might have to make a decision that you need to be utilitarian, and it's going to be hard, and you have to live with stuff like that and be able to do that as a leader.
442
00:56:06,390 --> 00:56:37,654
Chris: But then I think there are some other cases where it's better to be Captain Kirk, and it's better to say, wait, you said A or B. Is there a third option? Like, what is the third path? Right? Like, is there a way to stop the trolley? Is there a way to untie the person? Right. Kind of going back to that thing. So to me, it kind of comes down to the Kobayashi Maru. Like, sometimes you have to be Captain Kirk, and sometimes you just have to do the damn test, right? It's not always clear, though, which one it is. I don't know. That's my answer to whatever the hell you asked.
443
00:56:37,742 --> 00:56:40,382
Kayla: It's a good answer to the question that I don't remember that I asked.
444
00:56:40,446 --> 00:56:43,540
Chris: Yeah. All right, I. Cool. What did you ask?
445
00:56:43,580 --> 00:56:44,572
Kayla: No idea which form?
446
00:56:44,596 --> 00:56:48,996
Chris: Oh, you said, like, am I a utilitarian? Yeah, that's my answer to, am I a utilitarian or not?
447
00:56:49,068 --> 00:56:51,372
Kayla: It's let's go watch Star Trek.
448
00:56:51,556 --> 00:57:40,106
Chris: So here's where utilitarianism comes into our story. The less wrong forum, where Roko's basilisk was birthed, is chock full of utilitarian thought. So we talked about this earlier on the show a bit, but less wrong articles and discussions frequently resemble that scene in The Princess Bride, that whole, like, oh, God, I know you know, I know, but then if I know, then you know, then therefore, these logical gymnastics. And that's all based on decision theory, right? Which we've talked about before, and we'll talk about in a minute again. And the decisions made in those "I know you know I know" chains tend to be based on some utilitarian calculation, right? So it's like, I know you know I know, and this is how I'm sure that the AI is going to behave, because it's going to do this utilitarian calculation.
449
00:57:40,288 --> 00:57:47,878
Chris: For example, I might predict that you will decide to purchase a certain bag of chips if you go to the grocery store because you like those chips.
450
00:57:47,934 --> 00:57:48,622
Kayla: Okay?
451
00:57:48,766 --> 00:57:59,638
Chris: That's the basic thing. Like, if they provide you some utility. So I'm gonna say I predict that is the decision that you, Kayla, will make in the future when you go to the grocery store. That's the type of things they're doing with their decision theories.
452
00:57:59,734 --> 00:58:02,090
Kayla: I can poke a hole right in that right now.
453
00:58:03,150 --> 00:58:03,606
Chris: Okay.
454
00:58:03,638 --> 00:58:04,198
Kayla: Do you want me to.
455
00:58:04,254 --> 00:58:06,278
Chris: Sure, go for it. Poke a hole in those chips.
456
00:58:06,414 --> 00:58:10,930
Kayla: I don't think I have ever gone to the grocery store to purchase myself a bag of chips.
457
00:58:11,790 --> 00:58:43,710
Chris: Okay. That's not what I meant with the example. I'm just saying the example is... Okay, what's something? Fine. What's something you have gone to the grocery store to purchase? Tofu. All right, so then I might predict that you will buy a certain thing of tofu because you like that brand. You find utility from that brand. So I can predict your actions in the future. That's what they're doing on lesswrong.com. Or maybe I put you in a trolley problem simulator and predict which way you will choose, and I will use math to make that prediction, because obviously everyone, including you, makes their decisions using a utilitarian framework, right?
458
00:58:43,790 --> 00:58:44,662
Kayla: Absolutely.
459
00:58:44,766 --> 00:59:26,390
Chris: Everyone obviously must be using utilitarianism. So that's what they're doing on these forums. They're assuming a universal utilitarian framework for everyone, including inscrutable future hyper-intelligent AIs, and then using that to predict what decisions people and AIs will make. AIs like the one that Roko thought up. Now, I'm going to say this again later because it's important, but utilitarianism is a useful tool to deploy to understand how decisions can be made. But it's just a tool, right? It's not something that can be applied to all things, even all decisions at all times. And it certainly isn't a good enough reason to invent a vengeful AI deity. But, you know, as they say, when all you have is a hammer, everything looks like a nail, I guess, right?
460
00:59:26,470 --> 00:59:30,450
Kayla: True. I tested that out once. It really changes the way you see things.
461
00:59:31,430 --> 00:59:33,046
Chris: You tested out only owning a hammer?
462
00:59:33,078 --> 00:59:33,478
Kayla: Mm.
463
00:59:33,534 --> 00:59:34,326
Chris: Mm. Yeah.
464
00:59:34,358 --> 00:59:36,766
Kayla: That must have been very difficult time in my life.
465
00:59:36,798 --> 00:59:38,646
Chris: How did you drive to work using the hammer?
466
00:59:38,798 --> 00:59:42,170
Kayla: I didn't because it looked like a nail, so I smashed my car.
467
00:59:42,990 --> 00:59:51,142
Chris: Yeah, yeah. That's definitely what the analogy is here. Yeah, yeah. No, the analogy here is that the hammer is utilitarianism.
468
00:59:51,246 --> 00:59:51,942
Kayla: Right.
469
00:59:52,126 --> 00:59:58,998
Chris: If that's all you're using, then everything kind of seems like it must be. You can break it down into some sort of utilitarian calculation.
470
00:59:59,094 --> 00:59:59,470
Kayla: Right.
471
00:59:59,550 --> 01:00:12,670
Chris: I'll add on to this that utilitarianism isn't just something that deals in whole numbers, either. And this is especially true when it comes to discussions had on lesswrong.com. Utilitarianism can also deal in probabilities.
472
01:00:12,830 --> 01:00:14,022
Kayla: Ooh, probabilities.
473
01:00:14,126 --> 01:00:27,262
Chris: Probabilities. So these are the types of thought experiments that go, if a magical genie says you can kill one person, and then I'll flip a coin, and if it comes up heads, I'll cure all cancers, would you do it?
474
01:00:27,406 --> 01:00:28,518
Kayla: Wait, say that again.
475
01:00:28,654 --> 01:00:44,460
Chris: So you find a magic lamp and you rub it, and the magic genie comes out and says, if you kill one person, one random person in the world, you have to decide that they will die. But then if you do that, I will flip a coin, and if it comes up heads, I will cure all cancers that have ever existed.
476
01:00:44,550 --> 01:00:45,240
Kayla: Ugh.
477
01:00:45,400 --> 01:00:46,480
Chris: Would you do it?
478
01:00:46,640 --> 01:00:47,432
Kayla: Probably.
479
01:00:47,576 --> 01:00:48,896
Chris: Yeah, probably. Right.
480
01:00:48,928 --> 01:00:49,952
Kayla: But if I had to kill the.
481
01:00:49,976 --> 01:00:51,840
Chris: Person, then probably not.
482
01:00:51,920 --> 01:00:52,800
Kayla: I don't know.
483
01:00:52,960 --> 01:00:54,112
Chris: It's tough to say.
484
01:00:54,256 --> 01:00:55,008
Kayla: Yeah.
485
01:00:55,184 --> 01:01:01,944
Chris: But I use that example because it's like, sort of an obvious. You take that bet and it uses like, a really big probability. It uses.
486
01:01:01,992 --> 01:01:02,584
Kayla: Right.
487
01:01:02,752 --> 01:01:22,052
Chris: It uses big probabilities and small numbers, right? It uses a coin flip. It's 50%. Right, right. And it uses a relatively small number: oh, you only have to kill one person. So it's about as obvious as you can get. Like, yeah, sure, you take that bet, as long as you're the type of person willing to sacrifice the needs of the one for the betterment of the many.
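For anyone who wants to see the arithmetic Chris is gesturing at, here is a minimal expected-utility sketch in Python. The utility numbers are invented placeholders for illustration only; they aren't from less wrong or from the episode.

```python
# Minimal sketch of the expected-utility arithmetic behind the genie example.
# The utility values below are arbitrary placeholders, chosen only to illustrate
# why the coin-flip version feels like an "obvious" utilitarian bet.

def expected_utility(weighted_outcomes):
    """Sum of probability * utility over the listed (probability, utility) pairs."""
    return sum(p * u for p, u in weighted_outcomes)

# Option 1: refuse the genie. Nothing changes.
refuse = expected_utility([(1.0, 0.0)])

# Option 2: accept. One random person dies for certain (a large negative utility),
# and with probability 0.5 all cancers are cured (a huge positive utility).
COST_ONE_DEATH = -1.0           # placeholder disutility of one death
BENEFIT_CURE_CANCER = 1_000.0   # placeholder utility of curing all cancer, ever

accept = expected_utility([
    (1.0, COST_ONE_DEATH),
    (0.5, BENEFIT_CURE_CANCER),
])

print(f"refuse: {refuse:+.1f}, accept: {accept:+.1f}")  # refuse: +0.0, accept: +499.0
```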
488
01:01:22,156 --> 01:01:22,852
Kayla: Right?
489
01:01:23,036 --> 01:02:09,890
Chris: But we're talking about things like, what would a super intelligent AI look like that we won't even invent for another 100 years? And probabilities with something like that start getting astronomically low, astronomically fast. Right. They're very what-if, and they're very, like, tiny chance of things like this happening. And the less wrong folks know that, but the problem is, they still think the math is worthwhile. And I'm here to tell you it's not. Or maybe not that it's not worthwhile, it's just that it's not meaningful. The thing is, human beings are very bad at understanding and processing the consequences of extremely low probability events. Our brains just kind of go, like, it's either high, low, or fifty-fifty. And that's, like, about as much as we can really conceive of in terms of, like, making decisions and thinking about this stuff.
490
01:02:10,190 --> 01:02:31,278
Chris: When you start saying, well, this outcome is so disastrous that even a tiny probability is worth thinking about, you are actually not correct. It's still not worth thinking about. By way of example, you have a non zero chance of going outside and getting hit by an asteroid. But that doesn't mean you should spend any time during your daily routine of looking through a telescope and making trajectory calculations.
491
01:02:31,374 --> 01:02:32,542
Kayla: You sure about that?
492
01:02:32,646 --> 01:02:33,582
Chris: 100%, yeah.
493
01:02:33,606 --> 01:02:34,754
Kayla: You seen Armageddon?
494
01:02:34,862 --> 01:02:37,578
Chris: I've seen Armageddon and deep impact.
495
01:02:37,754 --> 01:02:38,586
Kayla: I don't know.
496
01:02:38,698 --> 01:03:28,538
Chris: So, yes, I am sure about that. And just to illustrate, and this is semi well known now, but one of the examples that Eliezer Yudkowsky has posted on less wrong is the question: is it better for some absurdly astronomical number of people to get a speck of dust in their eye, or for one single person to be tortured for 50 years with no reprieve? The dust, right? Maybe. But, like, they literally debate about this, right? And he's positing that it's better to have the single person get tortured. So here's the thing: you're not hearing or reading the types of numbers he's talking about. He's talking about numbers so astronomically high that our brains can't even conceive of them. Like, beyond-googolplex numbers. There's numbers...
497
01:03:28,594 --> 01:03:31,354
Kayla: Well, there's not that many people. So what the fuck are you even talking about?
498
01:03:31,442 --> 01:03:33,976
Chris: Right, but that, well, don't ask me. Ask him.
499
01:03:34,048 --> 01:03:41,816
Kayla: Also, no. If a googolplex of people got a speck of dust in their eye, it's still better than one person being tortured nonstop for 50 years. That's wrong.
500
01:03:41,928 --> 01:04:03,420
Chris: But according to a utilitarian, you can assign some utility to that. And even if it's the tiniest, tiniest inconvenience, if you multiply that by enough people, then it outweighs the suffering of the single person being tortured. That's the sociopathy of what utilitarianism can be, and that's the danger of maybe only applying that framework in every context.
501
01:04:03,460 --> 01:04:07,164
Kayla: These people actually think that it's better for one person, not all of them.
502
01:04:07,252 --> 01:04:08,476
Chris: It's a debate on the site.
503
01:04:08,588 --> 01:04:10,360
Kayla: But if you're a utilitarian.
504
01:04:10,700 --> 01:04:23,340
Chris: If you're a utilitarian, then, yes, there is some number of people that you can say that would be true for. It's not 100. It's not a googolplex. But maybe for some near-infinite number of people, that might be true. It's just math.
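To make the "it's just math" point concrete, here is a hedged sketch of the aggregation Chris is paraphrasing. Both disutility values are invented placeholders; the actual less wrong post uses far more extreme numbers and its own notation.

```python
# Sketch of the dust-speck vs. torture aggregation, with made-up placeholder values.

SPECK_DISUTILITY = 1e-9    # placeholder: a speck of dust is a tiny, momentary harm
TORTURE_DISUTILITY = 1e9   # placeholder: 50 years of torture is an enormous harm

def total_speck_harm(num_people):
    """Naive utilitarian aggregation: a tiny harm multiplied by the number of people."""
    return SPECK_DISUTILITY * num_people

# The crossover point where the aggregated specks "outweigh" the single torture:
crossover = TORTURE_DISUTILITY / SPECK_DISUTILITY
print(f"specks outweigh torture past {crossover:.0e} people")  # 1e+18 with these placeholders

# Chris's objection isn't about where the crossover lands; it's that once the required
# population is that many orders of magnitude beyond any real number of humans,
# the comparison has stopped being meaningful.
```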
505
01:04:23,460 --> 01:04:24,980
Kayla: It's not mathematic.
506
01:04:25,090 --> 01:04:36,832
Chris: Okay, think about it. And maybe this is easier: is it worth one person being tortured for 50 years if the alternative is every single human on earth having to be tortured for one day?
507
01:04:36,936 --> 01:04:41,328
Kayla: I feel like that's a completely different thing. Like, a speck of dust in the eye is like.
508
01:04:41,464 --> 01:04:49,620
Chris: I mean, either way, you're talking about some detrimental effect happening to a person, right? Either way it's a speck of pain and suffering.
509
01:04:50,180 --> 01:04:52,644
Kayla: Yeah, but that's, like, normal day to day shit.
510
01:04:52,692 --> 01:04:59,852
Chris: Right? And, see, what I think is happening right now is your brain is having a hard time making the leap from normal numbers to, like, meaningless numbers.
511
01:05:00,036 --> 01:05:11,320
Kayla: Maybe, but I don't feel like this is happening to me. I feel like that there's no number of people. I'm sorry to harp on this. I'm just saying I feel like there's no number of people getting a speck of dust in their eye.
512
01:05:11,940 --> 01:05:12,524
Chris: That's right.
513
01:05:12,572 --> 01:05:13,800
Kayla: That outweighs.
514
01:05:14,300 --> 01:05:15,492
Chris: Well, yes.
515
01:05:15,556 --> 01:05:16,572
Kayla: One person getting tortured.
516
01:05:16,636 --> 01:05:49,990
Chris: So here's what I'm gonna say. This is how I think of it: there's no meaningful number of people that could have a speck of dust in their eye to make that worth it. I think that you can talk on a website about the math of the utility of a dust speck, or I guess I should say the anti-utility of a dust speck, versus the anti-utility of torture, and you can do those calculations. But, and this is actually what the rest of this sort of, like, little section of the podcast is about: whether that's meaningful or not.
517
01:05:50,070 --> 01:05:50,470
Kayla: Gotcha.
518
01:05:50,510 --> 01:06:08,286
Chris: So let me finish these couple sentences here, and then come back to that. Basically, what I'm saying is, I'm not advocating here that I'm either, like, pro-torture or pro-speck, but rather that when the numbers get that absurdly large, or when probabilities get that absurdly small, human concerns like torture and specks in eyes are completely meaningless.
519
01:06:08,358 --> 01:06:09,150
Kayla: I'm pro-speck.
520
01:06:09,230 --> 01:06:29,292
Chris: So here's... I know. So here's an analogy. You could talk about using a tiny fraction of the energy emitted by a black hole to power your Subaru. But by the time we've reached the scale of black holes, concerns about how you get your Subaru to take you to work have long since disappeared in the proverbial rear view mirror. Proverbial rear view mirror. That is a tongue twister.
521
01:06:29,356 --> 01:06:30,764
Kayla: I am still pro-speck.
522
01:06:30,932 --> 01:07:05,006
Chris: That's fair. But whether you're pro-speck or pro-torture is, to me, sort of beside the point, right? You can say you're pro-speck. You can say you're pro-torture. What I'm saying is, when the numbers get to the scale that they are, the meaningfulness of those numbers sort of goes out the window. And one of the ways I like to think about the scale problem is this: there is a difference between what is conceivable and what is meaningful, right? So human brains can conceive of numbers that are just beyond understanding.
523
01:07:05,038 --> 01:07:44,234
Chris: Like, actually, the notation that Eliezer is using on his site when he talks about the dust speck is this, like, crazy mathematical notation that results in numbers that make googolplex look like nothing, that make googolplex look like not even a dust speck, so to speak. And, I mean, I could even sit here right now, and in the next 2 seconds I could write a number that is bigger than the number of atoms in the universe, or larger than the number of seconds that have transpired. Like, it's easy for humans to come up with these numbers. And likewise, I could easily write a number right now that was smaller than the Planck length. Right, right.
524
01:07:44,362 --> 01:08:06,042
Chris: But there's a difference between what I can conceive of and what is, in fact, meaningful. The Planck length, for example: I've heard scientists say that it's the smallest meaningful length. I can conceive of lengths smaller than that, but they don't have any physical meaning. The fabric of reality just doesn't support that level of fidelity.
525
01:08:06,106 --> 01:08:06,546
Kayla: Right.
526
01:08:06,658 --> 01:08:34,703
Chris: And then a similar thing, I think, happens when you talk about numbers as large as the ones in the dust speck example. I can certainly conceive of numbers that large, and I can even do math on them and talk about utility. But I just don't think that they're meaningful, because by the time you've gotten that large, things like the age of the universe or a googolplex or whatever are so far beneath that, you've already left meaning behind way before you even get to those numbers.
527
01:08:34,752 --> 01:08:35,247
Kayla: Right.
528
01:08:35,384 --> 01:09:09,210
Chris: So that's kind of what I'm saying here, if that all makes sense. Now, if you take all that and talk about it on a website and play some mental math with it, you know, like, I think that post, to me, is perfectly fine. The problem is when you go from, like, a fun conversation you have when you're high with your friends to being scared about an actual future artificial intelligence that may be applying some of these low-probability, high-impact sort of things. That's when it starts to become a problem, because now you're worried about something that is essentially meaningless.
529
01:09:09,290 --> 01:09:09,746
Kayla: Right?
530
01:09:09,858 --> 01:09:11,106
Chris: Does that all make sense?
531
01:09:11,298 --> 01:09:13,194
Kayla: I think so. I'm still pro-speck.
532
01:09:13,282 --> 01:09:18,290
Chris: But I know this whole time. This whole time I was saying that, in your head you were just going pro-speck.
533
01:09:18,330 --> 01:09:24,018
Kayla: Yeah, I definitely wasn't listening. I'm saying you haven't swayed me. Makes sense.
534
01:09:24,154 --> 01:09:25,474
Chris: I'm not trying to sway you on pro-speck.
535
01:09:25,482 --> 01:09:27,444
Kayla: I'm saying pro-speck all the way, baby.
536
01:09:27,537 --> 01:09:33,317
Chris: I'm saying that neither pro-speck nor pro-torture have really any meaning, because the question is meaningless to begin with.
537
01:09:33,372 --> 01:09:35,981
Kayla: Well, we're going to settle this with a Twitter poll, so it's fine.
538
01:09:36,045 --> 01:09:42,877
Chris: Great. Actually, we definitely should. Any questions about utilitarianism? And even, like, probabilistic utilitarianism?
539
01:09:42,933 --> 01:09:43,509
Kayla: No.
540
01:09:43,669 --> 01:09:44,448
Chris: Okay.
541
01:09:45,549 --> 01:09:48,837
Kayla: Too big. Too much, too big.
542
01:09:48,933 --> 01:10:28,376
Chris: Monkey brain can't think. Yeah. So utilitarian theory taken to absurd scale is cool and all, but that's just one part of how folks on less wrong talk about decision theory. So let me talk here about one of the jargony bits, and trust me, I'm only going to talk about one of them. We can't talk about all of them because we just don't have the time. But this is something called an acausal trade. That's one of the words, phrases, they like to throw around. Another one, by the way, the biggest one that gets thrown around with things like Roko's basilisk and on other parts of the site, is called timeless decision theory. And acausal trade is a concept within that theoretical framework.
543
01:10:28,528 --> 01:10:35,456
Chris: I won't talk about that because that's even longer, but I will go into what an acausal trade is, because I think it's worth at least talking about one of their bits of jargon.
544
01:10:35,528 --> 01:10:36,664
Kayla: I do have a question.
545
01:10:36,752 --> 01:10:37,460
Chris: Yes?
546
01:10:37,760 --> 01:10:44,180
Kayla: Do you think that everyone on this site has watched the good place at this point?
547
01:10:44,840 --> 01:10:46,008
Chris: Yeah, probably.
548
01:10:46,184 --> 01:10:47,940
Kayla: Did they make it?
549
01:10:48,680 --> 01:10:49,500
Chris: Maybe.
550
01:10:49,920 --> 01:10:54,012
Kayla: Did the good place happen? I wouldn't be surprised Michael Schur found this website.
551
01:10:54,136 --> 01:10:55,524
Chris: I was literally just about to say.
552
01:10:55,572 --> 01:10:56,732
Kayla: Turned into the good place.
553
01:10:56,796 --> 01:11:43,178
Chris: I wouldn't be surprised if Michael Schur was reading some posts on less wrong and was like, I'm gonna make a TV show, and it's gonna be a primetime television show where all the characters are dead. So, an acausal trade. In other words, a trade that lacks a direct cause, therefore "acausal." Think about your standard, everyday, normal-person trade. You give me an apple, I give you an orange. Or you give me money and I give you a garage full of LuLaRoe leggings or whatever, ten grand. That's a causal trade, right? That's direct, like, you give me X, I give you Y. An acausal trade is a trade in which you give me something in an implicit exchange for a promised behavior, or even an expected behavior or other consideration. It's essentially a promise or an expectation.
554
01:11:43,354 --> 01:12:15,742
Chris: And the less wrong folks are very big on the use of promises or expectations as motivators in their decision theory theorizing. It's all very game theory esque. And in fact, here I'm going to use a board game example to illustrate the point a little better. I don't know how many of our listeners have, like, played a game of risk, but if you played any board game, no, I don't think my mom's played risk. But if you've ever played risk, or if you've ever played a game similar to it, where there's like a table full of people and they're all sort of like fighting each other to be the one on top, right? You probably know of the crazy revenge guy at the table.
555
01:12:15,926 --> 01:12:23,958
Chris: Everybody's playing, trying to win, and then there's the one guy where he's like, if you fucking attack me, I am gonna destroy you the whole rest of the game.
556
01:12:24,094 --> 01:12:25,270
Kayla: Are you talking about me?
557
01:12:25,350 --> 01:12:27,342
Chris: So that's you at our table? Yeah.
558
01:12:27,446 --> 01:12:30,366
Kayla: That's literally what I did the one time we played Risk.
559
01:12:30,438 --> 01:12:39,534
Chris: Yeah, but there's always at least one revenge guy where they say, like, I swear to God, if you attack me, I'm just. I don't care if I lose at that point, my goal will be to take you down.
560
01:12:39,662 --> 01:12:41,330
Kayla: I think it's the best way to play the game.
561
01:12:42,020 --> 01:12:52,740
Chris: That is an acausal trade. So when I decide to not attack Kayla because I know she's a crazy pants and she's gonna fucking revenge me the whole rest of the game, you've made an acausal trade with me?
562
01:12:52,780 --> 01:12:53,476
Kayla: Hell, yeah.
563
01:12:53,588 --> 01:13:22,104
Chris: You have influenced my behavior based on me knowing you're going to behave in a certain way if I do it. I haven't said, hey, Kayla, if I help you attack Bob and then you help me attack Alice, then we're allies. It's not that. It's: I know that if I attack you, you're gonna go all crazy, even to the detriment of your own ability to win the game. That's what you're gonna do. So it's like this weird thing, right? It's rational but irrational at the same time, depending on how you look at it.
564
01:13:22,152 --> 01:13:24,016
Kayla: Yeah, that's one of the best.
565
01:13:24,088 --> 01:13:28,984
Chris: It's irrational because. Well, of course she wouldn't do that because then she's gonna lose the game. That's stupid.
566
01:13:29,032 --> 01:13:29,704
Kayla: Don't care.
567
01:13:29,832 --> 01:13:37,844
Chris: But by making that promise, you have essentially insured yourself against attacks from some people who are worried about that kind of thing.
568
01:13:37,892 --> 01:13:39,772
Kayla: But you gotta be willing to follow through, right?
569
01:13:39,836 --> 01:13:41,604
Chris: That's the thing. You have to be willing to follow through.
570
01:13:41,652 --> 01:13:42,596
Kayla: Otherwise it's empty threats.
571
01:13:42,628 --> 01:13:51,508
Chris: And otherwise it's empty threats. So some people aren't willing to follow through. It depends. But that's exactly what they're using to conceive of Roko's basilisk.
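Here is a hedged game-theory sketch of the Risk example. The payoff numbers are made up, but they show how a credible "I will revenge you forever" threat changes the attacker's best move without any explicit deal being struck, which is all the acausal trade amounts to.

```python
# Sketch of the Risk "revenge player" as a tiny game-theory model.
# All payoff numbers are invented for illustration.

# My expected value for the rest of the game, for each choice,
# depending on how Kayla responds if I attack her.
PAYOFFS = {
    ("attack", "plays_normally"): 5,   # I grab her territory and profit
    ("attack", "revenges_me"):   -10,  # she spends the whole game tearing me down
    ("leave_alone", "anything"):  2,   # I expand elsewhere for a modest gain
}

def best_move(revenge_threat_is_credible: bool) -> str:
    """Pick my move given whether I believe Kayla will follow through."""
    response = "revenges_me" if revenge_threat_is_credible else "plays_normally"
    attack_value = PAYOFFS[("attack", response)]
    leave_value = PAYOFFS[("leave_alone", "anything")]
    return "attack" if attack_value > leave_value else "leave_alone"

print(best_move(revenge_threat_is_credible=False))  # attack
print(best_move(revenge_threat_is_credible=True))   # leave_alone
# No bargain was ever negotiated; my behavior changed purely because I expect her
# to follow through. That expectation is the "acausal trade."
```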
572
01:13:51,564 --> 01:13:51,924
Kayla: Okay?
573
01:13:51,972 --> 01:14:12,374
Chris: It's an acausal trade in that exact same manner, except it's in reverse. Right. The robot, the future Roko's basilisk AI, is saying, I promise you that you will get tortured unless you help me come into existence. So it's the same type of thing. It just happens backward in time instead of forward in time, but it's the same concept.
574
01:14:12,422 --> 01:14:14,670
Kayla: So don't play Risk with Roko, if that makes sense.
575
01:14:14,830 --> 01:14:30,734
Chris: Well, don't play Risk with Roko's basilisk. But we also know that when we play Risk with the crazy revenge guy, sometimes the crazy revenge guys are all talk, and sometimes they actually decide they want to win the game instead of just taking revenge on somebody, you know, like they promised they were going to do.
576
01:14:30,862 --> 01:14:32,726
Kayla: Those people are disgraces.
577
01:14:32,918 --> 01:14:35,062
Chris: You better hope that Roko's basilisk is.
578
01:14:35,086 --> 01:14:39,742
Kayla: Like that? Because that buddy's a disgrace, if that's what you...
579
01:14:39,766 --> 01:14:55,798
Chris: Part of the thing that debunks what potentially could happen from an AI like that is the fact that it will probably have much better things to do than torture somebody from the past. Actually, what they're really worried about on less wrong isn't even it torturing you, it's it torturing a simulation of you that they think of as you.
580
01:14:55,854 --> 01:14:56,454
Kayla: Right.
581
01:14:56,622 --> 01:15:06,806
Chris: So basically what we're saying is, this future hyperintelligence is essentially godlike and inscrutable, but we can totally scrute it, which is a word, right?
582
01:15:06,918 --> 01:15:07,190
Kayla: Yeah.
583
01:15:07,230 --> 01:15:10,010
Chris: If it's inscrutable and we scrute it. Yeah.
584
01:15:10,510 --> 01:15:13,494
Kayla: S, C, R, U, T, E. Scrute.
585
01:15:13,662 --> 01:15:16,646
Chris: We're saying it's inscrutable. Except I totally know how it's going to behave.
586
01:15:16,718 --> 01:15:17,070
Kayla: Right.
587
01:15:17,150 --> 01:15:33,056
Chris: And it's going to behave in this way, and it's going to take valuable resources, which a god will totally care about spending, and create a simulation of little old me and torture that simulation of little old me. Why would it do that?
588
01:15:33,168 --> 01:15:34,208
Kayla: I mean, my argument...
589
01:15:34,224 --> 01:15:35,056
Chris: It has a Risk game to win.
590
01:15:35,128 --> 01:15:45,264
Kayla: My argument would be that if Roko's basilisk is as godlike and powerful as we say, then it doesn't expend any resources or energy torturing you.
591
01:15:45,312 --> 01:15:48,528
Chris: Well, I mean, it's just omnipotent, and.
592
01:15:48,544 --> 01:16:02,980
Kayla: It can, and it's all-powerful, and, like, torturing you, making good on that promise, is not taking away from its precious time to go solve risk. Global, galactic risk. That's my fear.
593
01:16:03,520 --> 01:16:13,494
Chris: Okay. But omnipotent isn't actually a thing. Like, at some point, it does actually have to spend something akin to resources or at least time or cycles or something.
594
01:16:13,632 --> 01:16:14,282
Kayla: Does it?
595
01:16:14,386 --> 01:16:26,890
Chris: I don't know. But one of the analogies that was given on RationalWiki, which I'll cite in my sources later, was that what we're doing right now is akin to two ants trying to figure out how a human brain is gonna work.
596
01:16:26,930 --> 01:16:27,506
Kayla: True.
597
01:16:27,658 --> 01:16:37,602
Chris: I don't know. Like, maybe it's gonna smush us ants, or it probably won't care. Like, right now, I don't really care whether ants that existed a million years ago were working towards evolving humans.
598
01:16:37,666 --> 01:16:38,122
Kayla: True.
599
01:16:38,226 --> 01:16:45,214
Chris: I'm not going around looking, where are those ants? I gotta torture me some ants. Cause I. They didn't try to make me be alive, so.
600
01:16:45,302 --> 01:16:53,878
Kayla: Well, that's also putting the ants in a position that they're not in. The ants are not sentient, and the ants don't have any capability of speeding up evolution.
601
01:16:54,014 --> 01:16:57,318
Chris: How do you know that we have any capability of creating something like Roko's basilisk?
602
01:16:57,334 --> 01:17:02,894
Kayla: Because Roko's basilisk has come back in time and fucking told us by posting this fucking thing. Just.
603
01:17:03,062 --> 01:17:04,150
Chris: You are a true believer.
604
01:17:04,230 --> 01:17:06,010
Kayla: I am in the cult.
605
01:17:08,510 --> 01:17:12,390
Chris: So going back, do you understand an acausal trade now?
606
01:17:12,550 --> 01:17:13,094
Kayla: Yeah.
607
01:17:13,182 --> 01:17:13,670
Chris: Okay.
608
01:17:13,750 --> 01:17:15,558
Kayla: Mostly because I'm the Risk vengeance person.
609
01:17:15,614 --> 01:17:20,606
Chris: Because you are the vengeance person that uses an acausal trade to get people to not attack you.
610
01:17:20,638 --> 01:17:21,158
Kayla: Yeah.
611
01:17:21,294 --> 01:17:50,104
Chris: So as you pointed out, though, acausal trades are not binding. And any future basilisk-style AI would know this. And here's the thing. It doesn't even matter if, like you said, they did have infinite resources to torture you anyway. The fact that they know acausal trades aren't binding means that suddenly the blackmail is not effective, and therefore it breaks the acausal trade, and therefore there's no reason to torture you.
612
01:17:50,152 --> 01:18:02,748
Kayla: So the basilisk goes, they know that this is not binding, so therefore this whole thing is illogical. Put a line through it. It doesn't work, doesn't make sense. So I might as well not do this, right?
613
01:18:02,924 --> 01:18:05,020
Chris: I told you, it's a bunch of "they know I know you know."
614
01:18:05,060 --> 01:18:09,708
Kayla: I'm just saying I need to think on that for a little bit before I feel comfortable.
615
01:18:09,804 --> 01:18:21,084
Chris: All right, keep thinking on that with your ant brain, compared to Roko's basilisk's human brain. That's insulting. It's true, though. So we've talked about the limitations of utilitarianism as being a tool and not a universal framework for looking at the world.
616
01:18:21,132 --> 01:18:23,360
Kayla: Did you just call Roko's basilisk a tool?
617
01:18:23,660 --> 01:18:46,188
Chris: Talked about how absurdly scaled numbers and probabilities quickly get you into the realm of meaninglessness. And we've talked about acausal trades being non-binding. Although, to be honest, I mostly told you the acausal trade story to illustrate some of the ritual that is involved with this group, with all of their jargon. Now, I want to reference another episode of ours.
618
01:18:46,244 --> 01:18:46,988
Kayla: Oh, shit.
619
01:18:47,124 --> 01:19:00,076
Chris: In fact, our fourth episode of season one, in which we talked about... do you remember the fourth episode? The Game. Pop quiz, hotshot. No. Oh, wait. Yeah, The Game. Sure. Yes. Sorry. That was the name of the episode. Star Citizen.
620
01:19:00,148 --> 01:19:04,444
Kayla: Yeah. Oh, oh, okay. I understand why you got confused for a second.
621
01:19:04,492 --> 01:19:08,124
Chris: Yes. You were talking about the game that we've already referenced in this episode.
622
01:19:08,172 --> 01:19:13,564
Kayla: Oh, my God. That episode. Do you realize what that episode is? That episode's called The Game.
623
01:19:13,692 --> 01:19:14,260
Chris: Yeah, I know.
624
01:19:14,300 --> 01:19:14,644
Kayla: If you.
625
01:19:14,692 --> 01:19:19,140
Chris: Oh, no. Every time you listen that episode, you lose the game. Yeah, sorry, you guys.
626
01:19:19,180 --> 01:19:19,956
Kayla: Sorry, guys.
627
01:19:20,108 --> 01:19:33,818
Chris: So we talked about Star Citizen in that episode, and we also invoked good old Pascal's wager as a lens to look at what was going on. Can you remind our dear listeners, absolutely not. About what Pascal's wager is?
628
01:19:33,834 --> 01:19:35,298
Kayla: I have no idea what you're talking about.
629
01:19:35,314 --> 01:19:38,202
Chris: You don't remember Pascal's wager? It was, like a big part of that episode.
630
01:19:38,346 --> 01:19:43,466
Kayla: I have absolutely no idea what you're. I don't think we talked about this thing that you're.
631
01:19:43,538 --> 01:19:45,418
Chris: This is a terrible acausal trade.
632
01:19:45,514 --> 01:19:48,402
Kayla: I'm sorry. I don't know what Pascal's wager.
633
01:19:48,466 --> 01:20:22,424
Chris: I should have promised to torture you in the future. Okay, so Pascal's wager is the thing that goes like this. I might as well believe in God, because if I decide not to believe in God, right, and there's no heaven and no God, then it doesn't matter. I wasn't going to heaven whether I believed in him or not, right? If I decide to believe in God and there's no heaven, then I don't go to heaven. But if I decide to believe in God and there is a heaven, then I do go there when I die. So I'm believing in God. Strictly. And this is actually also a game theory sort of, like, matrix, right?
634
01:20:22,472 --> 01:20:22,872
Kayla: Right.
635
01:20:22,976 --> 01:20:34,248
Chris: The "believe in God" choice dominates, right? Because if there's no God, no heaven, then it doesn't matter either way. But if there is, then you better believe in it, because otherwise you're going to hell, or not going to heaven, or whatever.
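Here is a hedged sketch of the payoff table Chris is walking through. The values are placeholders; the only point is the dominance argument, which quietly assumes there is exactly one candidate god, a gap the episode comes back to.

```python
# Sketch of Pascal's wager as a payoff table. The values are placeholders.

HEAVEN = float("inf")   # the wager treats the reward as effectively infinite
NOTHING = 0.0

# PAYOFF[(my_choice, state_of_the_world)]
PAYOFF = {
    ("believe",        "god_exists"): HEAVEN,
    ("believe",        "no_god"):     NOTHING,
    ("do_not_believe", "god_exists"): NOTHING,   # or worse, in some tellings
    ("do_not_believe", "no_god"):     NOTHING,
}

STATES = ["god_exists", "no_god"]

def dominates(choice_a, choice_b):
    """choice_a weakly dominates choice_b: at least as good in every state,
    strictly better in at least one."""
    at_least_as_good = all(PAYOFF[(choice_a, s)] >= PAYOFF[(choice_b, s)] for s in STATES)
    strictly_better = any(PAYOFF[(choice_a, s)] > PAYOFF[(choice_b, s)] for s in STATES)
    return at_least_as_good and strictly_better

print(dominates("believe", "do_not_believe"))  # True: "believe" dominates...
# ...but only because the table assumes a single possible god. Add a rival god who
# punishes belief in the first one, and neither choice dominates anymore.
```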
636
01:20:34,384 --> 01:20:37,304
Kayla: That's Pascal's wager, and it is faulty.
637
01:20:37,472 --> 01:20:54,386
Chris: Right? So with Star citizen, the version of Pascal's wager was, I better believe in the thing. The thing being Star Citizen becoming a real game. Because if the game doesn't happen, then no big deal. But if it does, then I get to play this awesome game that I've always wanted to play. Right? That was the version for Star Citizen.
638
01:20:54,418 --> 01:20:55,042
Kayla: Right.
639
01:20:55,226 --> 01:21:19,592
Chris: But Star Citizen, viewed as this future entity that you might as well believe in, took it a step further: fans were, and still are, attempting to bring the entity into existence by donating money and resources, and a lot of it. Belief in a future entity because of the potential consequences, coupled with a desire to bring it into existence. Does that sound familiar?
640
01:21:19,656 --> 01:21:22,712
Kayla: Are you saying Star Citizen is Roko's basilisk?
641
01:21:22,816 --> 01:21:25,960
Chris: Sort of. It's Roko's basilisk with a carrot instead of a stick.
642
01:21:26,120 --> 01:21:32,660
Kayla: Okay, so basically what you're telling me is that we should give money to Star Citizen so that we avoid being tortured for eternity.
643
01:21:33,870 --> 01:21:37,534
Chris: Well, no, but that's the thing, right? Is that Star Citizen is carrot versus stick.
644
01:21:37,582 --> 01:21:39,262
Kayla: So I don't know that it is.
645
01:21:39,326 --> 01:22:12,006
Chris: One of the things with. So Star Citizen is like, I might as well donate because there's a nice little juicy reward for me at the end if I win. Same thing. Sort of with Pascal's wager, right? There's a juicy heaven reward. The sort of shitty thing about Roko's basilisk is that it's only conceived of with stick and not carrot. It's only conceived of being tortured. There's no version of it where if I donate to this, then the basilisk will have known that I was going to do that and therefore reward me because I donated to it. There's no version of it where that happens.
646
01:22:12,118 --> 01:22:12,742
Kayla: Okay.
647
01:22:12,846 --> 01:22:17,290
Chris: Which there's no reason for that not to exist, because carrots and sticks are both motivators.
648
01:22:17,710 --> 01:22:18,678
Kayla: What happens?
649
01:22:18,734 --> 01:22:19,438
Chris: Kind of weird.
650
01:22:19,534 --> 01:22:35,116
Kayla: If you reach the end of your life, and as you're dying, and you die, and then you find out that if you are somebody who helped Star Citizen happen, you get eternal salvation.
651
01:22:35,268 --> 01:22:35,796
Chris: Wow.
652
01:22:35,908 --> 01:22:39,040
Kayla: And if you're somebody who didn't: eternal damnation.
653
01:22:39,340 --> 01:22:41,356
Chris: Then I would definitely donate to Star Citizen.
654
01:22:41,388 --> 01:22:45,680
Kayla: So that's what I'm saying. Just because it looks like it's carrot doesn't mean there ain't no stick.
655
01:22:46,980 --> 01:22:53,146
Chris: I guess I'm not really seeing the tie in from making a game exist to eternal life in heaven.
656
01:22:53,308 --> 01:22:55,014
Kayla: Cause it's Roko's basilisk.
657
01:22:55,182 --> 01:23:00,598
Chris: Yeah, but only in the sense that you are trying to make a game exist. Yeah, but not in the sense that you are going to heaven.
658
01:23:00,654 --> 01:23:03,558
Kayla: It's under the guise of a game. It's not actually a game.
659
01:23:03,654 --> 01:23:06,038
Chris: I see. So you're, like, just way extrapolating.
660
01:23:06,134 --> 01:23:13,510
Kayla: I'm just saying Roko's basilisk disguised itself as a video game to get people to make it happen.
661
01:23:13,630 --> 01:23:14,110
Chris: Got it.
662
01:23:14,150 --> 01:23:20,088
Kayla: And now, if you don't contribute to Star Citizen: eternal damnation.
663
01:23:20,224 --> 01:23:26,680
Chris: Yeah, I think you got exactly what we were trying to say. Yeah, that is the correct plot line of this whole thing.
664
01:23:26,720 --> 01:23:27,368
Kayla: Yeah.
665
01:23:27,544 --> 01:23:32,848
Chris: Right. So actually, we are not the only ones to draw this parallel.
666
01:23:33,024 --> 01:23:34,016
Kayla: Oh, shit.
667
01:23:34,128 --> 01:23:40,936
Chris: It's been. Well, okay, so hold on. Let me be clear. The parallel between Pascal's wager and Roko's basilisk, not between Star Citizen.
668
01:23:41,048 --> 01:23:42,952
Kayla: I bet you were not the first to do that either.
669
01:23:43,016 --> 01:24:13,284
Chris: Probably. So it's actually been discussed quite a bit, and over much more than just Roko's basilisk. Apparently, there's enough fretting over future superintelligent AI that Pascal's wager has come up a lot. In fact, here's a quote that I'm citing from RationalWiki, again, that they in turn are citing from a writer by the name of Greg Egan. Quote: you know what they say, the modern version of Pascal's wager is sucking up to as many transhumanists as possible, just in case one of them turns into God.
670
01:24:13,402 --> 01:24:16,584
Kayla: Oh, okay. So I need to start doing that. Okay.
671
01:24:16,712 --> 01:24:20,840
Chris: I love that's your reaction and not like, oh, yeah, snarky. That's funny.
672
01:24:20,920 --> 01:24:23,232
Kayla: Oh, no, that's a good advice. It's solid advice.
673
01:24:23,336 --> 01:24:27,264
Chris: Yeah. So you should be tweeting, I love you, Elon Musk.
674
01:24:27,312 --> 01:24:31,224
Kayla: I think, really, this is just about who has certain kinds of anxiety and who doesn't.
675
01:24:31,352 --> 01:24:34,440
Chris: Yes, that's literally what this is.
676
01:24:34,520 --> 01:24:40,206
Kayla: It's just about, like, what's wrong with your brain? Like, if this pokes your anxiety, then you're a certain.
677
01:24:40,278 --> 01:24:42,878
Chris: You experience a certain type of anxiety. Yes.
678
01:24:43,014 --> 01:24:44,902
Kayla: All right, well, aren't you just so.
679
01:24:45,086 --> 01:24:48,470
Chris: Lucky, dude, I get super anxious about.
680
01:24:48,550 --> 01:24:49,606
Kayla: Not about this.
681
01:24:49,798 --> 01:25:26,288
Chris: Yeah, because I read all these things, which I am now sharing with you. So just in case it isn't obvious as to how the Pascal's wager fallacy applies to this sleek, modern robo-god, this AI that we have never seen but a subset of humans thinks exists: well, the exact same way it applied to the crusty, old-school, Judeo-Christian God that we also have never seen but a subset of humans thinks exists. And that is, yeah, sure, maybe the future basilisk AI wants current you to spend your time and money helping it exist. But what about the other future AI, whose only goal is to make sure the basilisk AI didn't exist?
682
01:25:26,344 --> 01:25:27,480
Kayla: Stop. No.
683
01:25:27,600 --> 01:25:30,420
Chris: Oh, shit. Now you've pissed that one off. Oops.
684
01:25:31,130 --> 01:25:35,850
Kayla: Well, now I have to worry about that now, too. How am I gonna please both of them?
685
01:25:35,890 --> 01:26:05,044
Chris: So this is the same thing that shoots down the original Pascal's wager, which is that there's a prerequisite that the God that you're wagering on is both alone and omnipotent. It's the classic logical fallacy of assuming the thing you're trying to prove. And the nature of these Pascal's wagers, and how they motivate people to spend money on them, is why they've been referred to by one economist as Pascal's scams. And there's even a trope on lesswrong.com itself that calls things like Roko's basilisk a Pascal's mugging.
686
01:26:05,172 --> 01:26:05,836
Kayla: Jeez.
687
01:26:05,948 --> 01:26:22,398
Chris: Which I just love that term so much, I can't even tell you. But what that really demonstrates, by the way, is that not everyone on less wrong is afraid of the basilisk. Like, there's people that are like, yo, this is just making people upset and, like, goading them into donating money to something that they really shouldn't have to do.
688
01:26:22,494 --> 01:26:24,810
Kayla: Wait, are you telling me that people donate?
689
01:26:25,470 --> 01:26:26,718
Chris: Oh, yeah, we'll get to that.
690
01:26:26,774 --> 01:26:32,574
Kayla: Wait, no, hold on. Hold on. Off podcast? Are you serious right now?
691
01:26:32,702 --> 01:26:33,782
Chris: Why off podcast?
692
01:26:33,886 --> 01:26:36,598
Kayla: Because I need to know. Otherwise, my.
693
01:26:36,694 --> 01:26:41,710
Chris: Kayla, that's the whole thing. The whole thing is, if you don't donate to the basilisk, if you don't help.
694
01:26:41,750 --> 01:26:43,930
Kayla: I didn't know they're donating money.
695
01:26:44,270 --> 01:26:45,614
Chris: Well, how else do you think it was?
696
01:26:45,622 --> 01:26:48,190
Kayla: I was about to cry. This is really upsetting.
697
01:26:48,270 --> 01:26:51,468
Chris: How else did you think you were supposed to help this AI exist?
698
01:26:51,524 --> 01:26:52,120
Kayla: No.
699
01:26:52,460 --> 01:26:53,396
Chris: Who don't need to.
700
01:26:53,428 --> 01:26:54,000
Kayla: Who?
701
01:26:54,420 --> 01:26:55,356
Chris: We'll get to that.
702
01:26:55,428 --> 01:26:56,452
Kayla: I don't like this.
703
01:26:56,596 --> 01:27:02,364
Chris: What? Cause people are actually, like, taking it to heart and acting on it. Yeah, it's shitty, so we'll get to that.
704
01:27:02,412 --> 01:27:03,308
Kayla: Oh, my God.
705
01:27:03,444 --> 01:27:23,560
Chris: But, yes, there's a trope on lesswrong.com called Pascal's mugging, which I just love. But anyway, that's the whole thing with Pascal's wager: the many gods argument sort of shoots it down. It's like, okay, sure, if there was only one God, then that would be a true wager. But what if, you know, you were betting on the wrong God and the right God was like, oh, if you believe in that one, you'll actually go to hell?
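A minimal sketch of the many gods objection Chris is describing, written as a toy payoff table. Every payoff here is invented purely for illustration; the point is just that once a second, opposed god is allowed, no single bet dominates the way Pascal's original wager assumes.

import math

# Toy payoff table for the "many gods" objection to Pascal's wager.
# All payoffs are invented for illustration: heaven = +infinity, hell = -infinity.
INF = math.inf

# Rows: which god (if any) actually exists. Columns: which god you bet on.
payoffs = {
    "God A exists": {"bet A": INF,  "bet B": -INF, "bet neither": -INF},
    "God B exists": {"bet A": -INF, "bet B": INF,  "bet neither": -INF},
    "no god":       {"bet A": 0,    "bet B": 0,    "bet neither": 0},
}

# In the original wager (only "God A exists" or "no god"), betting on A dominates.
# With an opposed God B on the table, every bet has a -infinity outcome somewhere,
# so the wager no longer forces a single choice.
for bet in ("bet A", "bet B", "bet neither"):
    print(bet, [payoffs[world][bet] for world in payoffs])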
706
01:27:24,100 --> 01:27:27,156
Kayla: This is stressing me out. Just tell me where to give the money.
707
01:27:27,228 --> 01:27:30,908
Chris: But you knew that. You knew that. That's. That's the thing that's stressing you out.
708
01:27:30,924 --> 01:27:31,492
Kayla: Is I just need.
709
01:27:31,516 --> 01:27:32,484
Chris: You could have been giving money.
710
01:27:32,532 --> 01:27:32,908
Kayla: Yes.
711
01:27:33,004 --> 01:27:34,884
Chris: Actually, I'm not gonna tell you now because I really.
712
01:27:34,932 --> 01:27:37,404
Kayla: Okay, well, it's too. It's too late.
713
01:27:37,452 --> 01:28:07,030
Chris: Holy shit. All right, so on that note, it does seem to be an issue. For example, there is an institute now called the Machine Intelligence Research Institute, formerly called SIAI, which we have mentioned just an hour or two ago on the show, which stands for the Singularity Institute for Artificial Intelligence, that takes donations and takes them for the cause of reducing existential risk from the development of AI. Now, this is all well and good.
714
01:28:07,110 --> 01:28:09,766
Kayla: Like, that's not specifically Roko's basilisk, but.
715
01:28:09,838 --> 01:28:13,102
Chris: I can see where this starts to sound like a Pascal's mugging.
716
01:28:13,166 --> 01:28:13,702
Kayla: Right?
717
01:28:13,846 --> 01:28:36,950
Chris: If you're positing that there's an existential, humanity-wide risk from something that may or may not happen in the future, and in fact has a terribly low probability, you're back in this territory of infinitesimally small probabilities times disastrously large consequences equaling, oh, I don't know, about, like, $20, give or take. But again, with probabilities that small and consequences that large, the math doesn't quite work out that way.
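The arithmetic Chris is poking at looks roughly like this. The probability, the number of lives, and the dollar value per life below are all made up just to show how a tiny chance times an enormous consequence can be massaged into an ordinary-sounding number; nothing here comes from an actual estimate.

# Expected-cost arithmetic behind a Pascal's-mugging-style pitch.
# Every number below is invented for illustration only.
p_catastrophe = 1e-9         # assumed probability of the AI catastrophe
lives_at_stake = 8e9         # roughly everyone on Earth
dollars_per_life = 2.5       # deliberately tuned so the result lands near $20

expected_cost = p_catastrophe * lives_at_stake * dollars_per_life
print(f"expected cost ~ ${expected_cost:.2f}")  # about $20, give or take

# The trick: with probabilities this small and stakes this large, you can tune
# the inputs to produce almost any "reasonable" figure you want.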
718
01:28:36,990 --> 01:28:37,310
Kayla: Right?
719
01:28:37,390 --> 01:28:58,830
Chris: So this is why I brought up the thing about the speck of dust times the whatever, right? Because it's a fun thing to think about. It's a fun thought experiment. But if you actually do that math and say, okay, yeah, it's about $8, that's when it starts being like, no, you took something that was, like, conceivable but meaningless and tried to assign meaning to it, and that's where you broke.
720
01:28:58,990 --> 01:29:00,500
Kayla: I'm still pro-speck.
721
01:29:00,670 --> 01:29:03,664
Chris: I understand that. We'll just call this episode pro-speck.
722
01:29:03,752 --> 01:29:07,208
Kayla: Because that's no matter the number.
723
01:29:07,384 --> 01:29:08,032
Chris: I know.
724
01:29:08,136 --> 01:29:11,240
Kayla: Also, tell me where to put the money. Is it to these people?
725
01:29:11,400 --> 01:29:57,212
Chris: Anyway, that's why I brought that up. Right. And yet, not only did SIAI say that type of math was possible, but they actually did the math, and they would tell you exactly how many lives you were saving per dollar donated. And I think it came out to eight lives per dollar donated. They actually did that math and said, every dollar you donate to SIAI saves eight lives. Now, again, I'm very upset right now. Remember, this isn't a calculation based on how many malaria nets they can ship overseas, which is like, okay, this times this many equals this many lives saved. Right? This is based on, well, there's a 0.001% chance that a world-destroying AI will be developed, so if we can reduce that chance by an even tinier 0.001%, then let's see, carry the one. Yeah. About eight lives per dollar.
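Here is a sketch of how a lives-saved-per-dollar figure like that can fall out of two guessed probabilities. None of these inputs are SIAI's actual numbers; they are placeholders chosen so the output lands near eight, which is exactly the problem Chris is pointing at.

# How an "eight lives per dollar" estimate can be manufactured from guesses.
# All inputs are placeholders, not SIAI's real figures.
world_population = 8e9

p_ai_catastrophe = 1e-5           # guessed chance a world-destroying AI ever appears
risk_reduction_per_dollar = 1e-4  # guessed fraction of that risk one dollar removes

lives_per_dollar = world_population * p_ai_catastrophe * risk_reduction_per_dollar
print(lives_per_dollar)           # 8.0 "expected lives" per dollar donated

# Nudge either guess by a factor of ten and the answer moves by a factor of ten,
# which is why treating it like a malaria-net calculation doesn't work.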
726
01:29:57,406 --> 01:29:59,040
Kayla: This is the most upsetting thing.
727
01:29:59,080 --> 01:30:37,070
Chris: And they took people's money. And like all good cults, I don't actually think they were scamming. I think they really believe this math. For example, here's a commenter on LessWrong talking about how he's doing his part, and this is part of a much larger comment that he made, but just quoting a little bit of it. What am I doing? Working at a regular job as a C programmer and donating as much as possible to SIAI, and sometimes doing other useful things in my spare time. Blah blah, he continues on. End quote. By the way, SIAI's founder was none other than LessWrong's founder, Eliezer Yudkowsky.
728
01:30:38,410 --> 01:30:43,362
Kayla: No, I'm done. I can't do this anymore.
729
01:30:43,426 --> 01:31:12,306
Chris: I realize I'm dropping that knowledge here in a way that makes it seem really dramatic, and that's intentional, because I aim to entertain. But I will say I honestly do not think that Eliezer is a scam artist. I don't. If he were, I don't think he would have tried to shut down the Roko's basilisk post so forcefully. Conspiracy theory about the Streisand effect aside, I do think Mister Yudkowsky, from what I've read, is genuinely concerned about mitigating existential risk that comes from AI research.
730
01:31:12,458 --> 01:31:14,250
Kayla: Okay, but what is the money doing?
731
01:31:14,410 --> 01:32:04,568
Chris: Here's what Wikipedia has to say about what SIAI, now known as MIRI, M-I-R-I, is doing. MIRI's approach to identifying and managing the risks of AI, led by Yudkowsky, primarily addresses how to design friendly AI, covering both the initial design of AI systems and the creation of mechanisms to ensure that evolving AI systems remain friendly. MIRI researchers advocate early safety work as a precautionary measure. However, MIRI researchers have expressed skepticism about the views of singularity advocates like Ray Kurzweil, who we both like, that superintelligence is just around the corner. MIRI has funded forecasting work through an initiative called AI Impacts, which studies historical instances of discontinuous technological change, so that means, like, big jumps in technology, and has developed new measures of the relative computational power of humans and computer hardware.
732
01:32:04,704 --> 01:32:11,120
Chris: MIRI aligns itself with the principles and objectives of the effective altruism movement.
733
01:32:11,160 --> 01:32:16,680
Kayla: Helps me a lot that like, just really contextualizes it and like, makes me go like, okay, so this isn't just people taking money.
734
01:32:16,760 --> 01:32:31,912
Chris: No, no, no. And driving away. That's what I'm saying here. Specifically, I'm saying, like, I dropped his name like that to be dramatic, but he's not. He's not an MLM leader. He's not, right? He's not stealing people's money.
735
01:32:32,016 --> 01:32:34,128
Kayla: He is doing, like. That's actual good work.
736
01:32:34,184 --> 01:32:34,352
Chris: Yes.
737
01:32:34,376 --> 01:32:35,144
Kayla: That is important.
738
01:32:35,232 --> 01:32:52,424
Chris: They do good work. It's just that there are some people who fell prey to the anxiety thing that you were talking about, because this is essentially a home for the type of people that would have these kinds of fears. Otherwise they wouldn't be talking about it and working on it.
739
01:32:52,472 --> 01:32:52,776
Kayla: Right?
740
01:32:52,848 --> 01:33:00,632
Chris: And then they kind of fall into that trap and then feel like they have to donate money. Because if I don't donate everything I can, then I may end up being tortured by a future AI.
741
01:33:00,696 --> 01:33:01,376
Kayla: Right?
742
01:33:01,568 --> 01:33:21,426
Chris: So maybe it was a little irresponsible of me, actually, to, like, drop his name like that, because I genuinely think he's doing good stuff. I just think that there's. Yeah, that it's. It's just. It's just the situation, right? It's a situation in the community. And I also don't think that the Streisand effect thing that he did was. Was super useful. But we all make mistakes.
743
01:33:21,618 --> 01:33:25,990
Kayla: Like you right now. Yeah, I'm kidding. I don't think you made a mistake.
744
01:33:26,610 --> 01:33:51,332
Chris: So I want to go back to Pascal's wager real quick. Specifically the God part. Maybe this goes without saying, but we here at Cult or Just Weird are also far from alone in speculating on the singularity and future godlike super AIs as being sort of a new techno-religion. In fact, I think singularitarians are even on our, like, to-do list of things we might do on the show one day, right? Which I guess is, like, actually kind of what this episode is.
745
01:33:51,396 --> 01:33:51,716
Kayla: Right?
746
01:33:51,788 --> 01:34:51,894
Chris: But anyway, for our listeners who are unsure or unaware, the technological singularity is just a fancy term for how pretty soon we may have superintelligent AIs that will transform, revolutionize, our very society and our very existence, and may involve things like living a really long time, living forever, uploading our brains, our consciousness, into computers, that sort of thing. And it also has its own gods, and its gods are these superintelligent AIs. So if you're wondering, like, why are superintelligent AIs going from just my shitty, stupid computer to godlike beings? The primary way that's discussed is that once computer intelligence, once machine intelligence, reaches a certain threshold of both, like, general complexity and also power, it may itself want to design an additional superintelligence.
747
01:34:51,942 --> 01:35:18,964
Chris: And being that machine intelligence is smarter than us, it will do a better job than us of creating a new intelligence, which then itself will do a better job than its robot creator of creating a new intelligence, and so on and so on. And it will just spiral until, before we even notice, because of that cyclical effect, we'll have some superintelligent AI that is just so far beyond understanding that we're like ants to it.
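A toy model of the feedback loop Chris just described: each generation of AI designs a slightly better successor, and the compounding does the rest. The growth factor and the number of generations are invented purely for illustration; nobody knows the real values.

# Toy recursive self-improvement loop: each AI builds a successor a bit smarter
# than itself, so designing ability compounds generation over generation.
capability = 1.0          # 1.0 = roughly human-level designer (arbitrary units)
improvement_factor = 1.5  # invented: each generation designs a 50% better successor

for generation in range(1, 11):
    capability *= improvement_factor  # the smarter designer builds a smarter successor
    print(f"generation {generation}: capability {capability:.1f}x")

# After a handful of iterations the numbers dwarf the starting point, which is the
# intuition behind "so far beyond understanding that we're like ants to it."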
748
01:35:19,082 --> 01:35:20,560
Kayla: Just go watch the movie Her.
749
01:35:20,680 --> 01:35:23,808
Chris: Yeah, actually. Yeah, yeah, go watch Her. It's an excellent movie.
750
01:35:23,864 --> 01:35:25,520
Kayla: Best depiction of the singularity.
751
01:35:25,640 --> 01:36:13,816
Chris: Yes, actually, the best so far that I've seen. Yeah, and we, you and I are both, like, huge fans, and we both like Ray Kurzweil, and Ray Kurzweil is a futurist. Future futurist. Futurist. Yeah, I think that's the name. And we've read some of his stuff, and he's, like, a very positively oriented optimist about the singularity. He actually works at Google now. I think his title is futurist there, I think. Yeah, yeah, he does. So that's sort of explaining, like, how the singularity of superintelligence would allegedly come to be. And it's been playfully referred to as the rapture for nerds, which is very true, because it's sort of, if you think about it, it involves these, like, godlike beings. It involves eternal life. It involves a lot of these things that we sort of maybe want to be true or make us feel.
752
01:36:13,848 --> 01:36:32,440
Chris: Feel safe, and/or require some article of faith to believe in. And it's just using technology instead of, like, old school religion stuff. So the Roko's basilisk shenanigans kind of do a good job of illustrating that whole point about some angry, inscrutable God.
753
01:36:32,520 --> 01:36:34,200
Kayla: Are you gonna send this to Elon Musk?
754
01:36:34,320 --> 01:37:23,288
Chris: What? This. This podcast? Yeah, I'll definitely tweet it at him. Man, we'll get so many fucking listeners. Holy shit. One of my favorite articles I read while researching this episode was entitled Sinners in the Hands of an Angry Artificial Intelligence, and I highly recommend it. It was such a good article. It's on a blog called orbitermag.com, and Orbiter Mag, which I had never heard of, is apparently, like, a blog that talks about the meaningfulness of technological type things, which, I think we sorely need things like that. But anyway, the title of that article is clearly referencing the classic historical Jonathan Edwards sermon, Sinners in the Hands of an Angry God. And the article itself, of course, draws parallels from the Roko's basilisk thing to that famous sermon.
755
01:37:23,344 --> 01:37:23,940
Kayla: Right?
756
01:37:24,320 --> 01:37:56,628
Chris: The article goes on to talk about how religious awakenings of the past have typically followed some pivotal technological change, such as the Protestant Reformation coming in the wake of the printing press, or Jonathan Edwards' own era of religious awakening in America coming on the heels of things like the train and the telegraph. Which then begs the question: why should the recent advances of computers, the Internet, and personal devices be any different? Why shouldn't we also be experiencing some sort of religious awakening, which maybe this is sort of a part of?
757
01:37:56,764 --> 01:37:58,480
Kayla: Mmm, I like that.
758
01:37:58,620 --> 01:38:34,896
Chris: So here's a great quote from the end of the article, which again, you should definitely go read the whole thing on Orbiter Mag because it's really good. And I really like Orbiter Mag as a blog. Now here's the quote. Religion is implicit in such technological change. It behooves us not to argue for its elimination, but to render more sophisticated forms of understanding it. As much as Roko's basilisk as a concept deserves disdain, the reality is that self-aware artificial intelligence, whether malevolent, benevolent, or something else, will surely be developed sooner rather than later. Such a development requires a religious response. End quote.
759
01:38:35,008 --> 01:38:35,928
Kayla: Damn.
760
01:38:36,104 --> 01:38:49,712
Chris: And I think they're absolutely right. Maybe not a Catholic or Jewish or Muslim response. That's not what they're talking about. But they're talking about a response that involves things like metaphysics and morality and ethics and that sort of stuff.
761
01:38:49,776 --> 01:38:50,440
Kayla: And reverence.
762
01:38:50,520 --> 01:39:25,840
Chris: And reverence. Absolutely. So it's fascinating. Right? And I'd also like to add here that I'm not even sure that Roko's basilisk deserves disdain. The folks on less wrong are very smart people. They include AI researchers, coders, developers that are ten times smarter than I am. And I think they actually came up with a really interesting thought experiment. It's just an unfortunate side effect that sometimes digging around for truth isn't always safe for the miners. There can be hazards in mining each other's thoughts for insights and information, just as surely as there can be hazards in mining the earth for diamonds. I just hope that most of them are okay now.
763
01:39:26,180 --> 01:39:27,084
Kayla: Me too.
764
01:39:27,252 --> 01:39:48,256
Chris: And speaking of hazards, I said quite a long time ago on the show that we'd get to something called information hazard, and here we finally are. Although actually we've been talking about an information hazard this whole time. That's what Roko's basilisk is. Remember, we mentioned it's not so much the AI that is the problem, it's the thought itself.
765
01:39:48,368 --> 01:39:49,024
Kayla: Right?
766
01:39:49,192 --> 01:39:56,248
Chris: And don't forget that's something that Eliezer was saying about why he was so upset about it too. So he understands, I think, like, what the issue is.
767
01:39:56,344 --> 01:39:57,500
Kayla: A single thought.
768
01:39:58,600 --> 01:40:32,082
Chris: Sorry, a single thought, and you can be paralyzed into donating all of your money to artificial intelligence research. So we've been talking about information hazard this whole time. But I really wanted to mention a fascinating paper I read for this episode by one Nick Bostrom, who you may have heard of. He's a relatively famous philosopher at Oxford. Here's his Wikipedia blurb: Nick Bostrom is a Swedish philosopher at the University of Oxford known for his work on existential risk, the anthropic principle, human enhancement ethics, superintelligence risks, and the reversal test. I don't know what that last one is.
769
01:40:32,106 --> 01:40:33,514
Kayla: Fuck is the reversal test?
770
01:40:33,562 --> 01:41:13,482
Chris: I don't know. Mister Bostrom's paper defines an information hazard as, quote, information hazards are risks that arise from the dissemination or the potential dissemination of true information that may cause harm or enable some agent to cause harm. Such hazards are often subtler than direct physical threats and, as a consequence, are easily overlooked, end quote. This is, like, the most interesting topic to me, because one of the key things about information hazards, which he kind of mentions there and really makes a point of in the paper, is that information can be free and true and yet still hazardous, still harmful.
771
01:41:13,626 --> 01:41:17,666
Kayla: Do you remember the whole point of the first book of the Expanse?
772
01:41:17,858 --> 01:41:55,780
Chris: Yeah, that's actually. That's totally information hazard. It's about, like, the push and pull between. For those of you who haven't read The Expanse, there's this guy who has this sort of. This virtue, which actually we're about to talk about here, that we assume that free and open information is purely virtuous and always good. And there's a character in The Expanse that has the same ethic, and we sort of identify with that. But then the book, and actually the rest of the series, even sort of, like, calls that into question. Like, is it always good to have all information be as free and as open as possible?
773
01:41:56,120 --> 01:41:57,062
Kayla: Fucked up?
774
01:41:57,176 --> 01:42:26,442
Chris: Yeah, it's super interesting, and it's very counterintuitive, and I'll talk about some examples that maybe make it less counterintuitive here. I'll definitely link the paper in the show notes because I cannot recommend it enough. It's so interesting. And he actually categorizes and puts into sort of, like, a taxonomy all sorts of different types of information hazards. So here's a short list, and this is like half of them: data hazard, idea hazard, attention hazard, signaling hazard, evocation hazard, enemy hazard.
775
01:42:26,506 --> 01:42:26,754
Kayla: What?
776
01:42:26,802 --> 01:42:28,114
Chris: Knowing too much hazard.
777
01:42:28,202 --> 01:42:29,050
Kayla: That's me.
778
01:42:29,210 --> 01:43:13,990
Chris: Information asymmetry hazard, unveiling hazard, ideological hazard, distraction and temptation hazard. That's me. And again, that's only, like, half. So if you read the paper, you'll see. It's also one of those interesting things where you kind of go, like, oh yeah, I think I knew about that. That actually does kind of make sense. So, for example, I mentioned enemy hazard above, which is basically a thing where, like, we as a country have military secrets. Because if we allowed, for example, the design of the F-22 Raptor to become common knowledge, it would put our national security at risk from our military enemies. And then it's not quite enemy hazard, but, like, while we're talking about military stuff, you could, and I think he maybe did in the paper, posit that knowledge of how to build a nuclear device is an information hazard.
779
01:43:14,030 --> 01:43:52,532
Chris: Because again, if it were free and common knowledge, we would all be at more risk. And that's why, for example, we can't just make a YouTube video about how to refine uranium. You will rightly get in huge trouble with the US government. And I know this because a YouTuber that I like mentioned on his channel why he never made a video about that. It's this chemistry dude, NileRed. Yeah. He recently had a video where he worked with some uranium compound, and he was like, I thought about doing a uranium refining thing, but then I realized I would have gotten in huge trouble with the government, so I didn't. But again, the reason for that is because the government is trying to protect against this information hazard.
780
01:43:52,716 --> 01:44:17,912
Chris: Anyway, this information hazard paper is now referenced on LessWrong's very own article about Roko's basilisk. That's actually where I found the paper. So LessWrong for a long time had censored all of it, whatever, but now they have their own article on it, because obviously it's, like, so famous now. It's just for a long time, Eliezer would only write about it on a Reddit forum about this type of stuff, but now they have their own article on it.
781
01:44:17,976 --> 01:44:18,416
Kayla: Good.
782
01:44:18,528 --> 01:44:52,416
Chris: But the reason they have this paper there is because ultimately that's what the basilisk is. It's a piece of information, a little memetic virus that a particular portion of the population is vulnerable to, and it is hazardous to their mental health. And information hazards at large are kind of what LessWrong is all about. They and the SIAI, now MIRI, have the goal of reducing existential risk from information hazards involving AI. I think it's generally a good mission, even though it may kick off some mental trauma and poor life choices from time to time amongst their community.
783
01:44:52,528 --> 01:44:53,180
Kayla: Right?
784
01:44:53,680 --> 01:45:30,690
Chris: So real quick, before we get to the criteria and judgment, just like I did in my first episode of the season, I'm doing this thing where I'm throwing sources in at the end. I'm not sure if I'll keep doing it this way. I guess it depends on, like, our listener feedback and yada, but it just, it feels a little bit easier for me somehow to squeeze them in at the end versus the beginning. But like I said, feedback pending. So here are, I think, all of the places I went for research on today's topic. First of all, lesswrong.com's article on Roko's Basilisk. I also reread that old Slate article, Roko's Basilisk: The Most Terrifying Thought Experiment of All Time.
785
01:45:31,870 --> 01:45:33,518
Kayla: I still read Slate every day.
786
01:45:33,614 --> 01:45:43,970
Chris: Wikipedia's articles on LessWrong and on Eliezer Yudkowsky. Business Insider's article, so aptly titled for their magazine, What Is Roko's Basilisk?
787
01:45:48,390 --> 01:45:49,510
Kayla: Oh, I like that.
788
01:45:49,550 --> 01:46:02,518
Chris: And then there was a Vice article with a maybe even better title: Explaining Roko's Basilisk, the Thought Experiment That Brought Elon Musk and Grimes Together. Because that's so Vice. That's the, that's the headline for Roko's basilisk.
789
01:46:02,574 --> 01:46:07,742
Kayla: Yeah, literally, a new life has been created.
790
01:46:07,846 --> 01:46:08,534
Chris: Oh shit.
791
01:46:08,582 --> 01:46:09,590
Kayla: Because of this.
792
01:46:09,670 --> 01:46:15,726
Chris: It begins. I bet it's more than one. I bet people are having like, scary, oh, sure, anxiety sex.
793
01:46:15,878 --> 01:46:24,630
Kayla: Specifically, Grimes and Elon Musk are having a child. Yes, because they met each other on Twitter. Because of this.
794
01:46:24,790 --> 01:47:08,872
Chris: I know. And there's also an article out there called This Horrifying AI Thought Experiment Got Elon Musk a Date. That's on livescience.com, and that seems to be a common theme. Knowyourmeme.com has a whole article on Roko's basilisk, because it's a very memetic thing, obviously. There's a couple videos I watched on YouTube. Orbiter Mag, which we mentioned, that article Sinners in the Hands of an Angry Artificial Intelligence. And then there's another really good blog out there called Last Word on Nothing, with an article entitled Who's Afraid of Roko's Basilisk? Me, of course. I also used that. I read that whole paper by Nick Bostrom because it was so fascinating. It was one of the first things I read, because the very first thing I read was actually the LessWrong article about their own thing, about Roko's basilisk.
795
01:47:08,896 --> 01:47:46,926
Chris: And then the second thing I read was the Nick Bostrom paper. And that's when I was like, oh, we have to do an episode on this. It was that interesting. And then finally, I made heavy use of rationalwiki.org's article on Roko's Basilisk. And I highly recommend it, it was a great article. They have a great article on Roko's Basilisk. It really details out, in a much better way than we did even on the show here, where it came from, the nature of LessWrong. They even go into a little more detail about, like, timeless decision theory and acausal trade and some of the, like, really esoteric stuff that sort of generated this thing.
796
01:47:47,078 --> 01:48:28,900
Chris: And then they also have this whole section called So You're Worrying About the Basilisk, and it's actually all about, like, here are the different reasons that you shouldn't worry about the basilisk. It's actually really extensive. It's almost, like, half the article. And when I was talking about the reasons you shouldn't be afraid of the basilisk, like, a good chunk of it sort of was inspired by, or the framework came from, that part of their article. And then finally, at the very end of that section, there's a little header that says something like, I know it's rubbish, but I'm still worried. And they're basically like, well, you should probably talk to a therapist then. And, you know, they're not being, like. That's the thing with rationalism.
797
01:48:28,980 --> 01:48:33,156
Kayla: I'm going to my therapist tomorrow and I'm like, should I talk about Roko's basilisk? I don't know.
798
01:48:33,188 --> 01:49:04,700
Chris: RationalWiki is typically very cheeky in their articles, but this section was part of what made me kind of go, like, all right, good job, you guys. The fact that they made that call to mental health, and they even said, if you are having trouble with existential sort of fears, like, that's something that maybe, like, a university campus mental health professional may have more experience with. So anyway, RationalWiki's article on Roko's basilisk. Very good. All right, it's finally time for.
799
01:49:06,760 --> 01:49:07,120
Kayla: The.
800
01:49:07,160 --> 01:49:16,410
Chris: The criteria and the judgment. So let's do this. Okay, we're gonna do that. We're gonna do this for, like, ten minutes.
801
01:49:18,150 --> 01:49:28,062
Kayla: Number one. Jesus Christ. Expected harm towards the individual: socially, physically, financially, etc.
802
01:49:28,206 --> 01:49:29,090
Chris: Mm.
803
01:49:29,430 --> 01:49:35,118
Kayla: Number two, population of cult. That's not what it is anymore. What do we call it now, is it niche?
804
01:49:35,214 --> 01:49:38,586
Chris: Is it niche? And we are never gonna rewrite that apparently.
805
01:49:38,718 --> 01:50:11,700
Kayla: And, well, because it's whatever. Three, anti-factuality, closed logical system. Four, percentage of life consumed. Not empirically, but how much of your time is devoted to the thing? Ritual is number five, and number six, charismatic leader. And for some reason I don't think that this is accurate anymore. When we originally conceived of this, we put alive leader, cult of personality. That's definitely not how we've addressed it.
806
01:50:11,780 --> 01:50:17,360
Chris: Yeah, I don't know why we wrote that. That's literally years ago now. So those are our criterias.
807
01:50:18,020 --> 01:50:21,452
Kayla: Well, I don't. What's the cult? Is it the.
808
01:50:21,556 --> 01:50:27,324
Chris: I think it's the LessWrong community. I think it's the LessWrong. LessWrong slash SIAI, or MIRI now.
809
01:50:27,372 --> 01:50:31,054
Kayla: Expected harm. I feel like it's high.
810
01:50:31,222 --> 01:50:33,166
Chris: Yeah. I mean, so there was a.
811
01:50:33,198 --> 01:50:34,998
Kayla: The information hazard is high.
812
01:50:35,134 --> 01:50:54,006
Chris: Yeah, the information hazard is high. There was a poll, I think some internal poll that they ran recently ish, a few years ago, at least not. Not back in 2010 or whenever when this was happening. But then it was something like, I think between ten and 15% of people were like still worried about it on the site, which is like pretty high.
813
01:50:54,078 --> 01:50:54,614
Kayla: Pretty high.
814
01:50:54,702 --> 01:50:57,962
Chris: When you think about how like, freaking weird it is, it's like one or.
815
01:50:57,986 --> 01:50:59,190
Kayla: More out of every hundred.
816
01:51:00,650 --> 01:51:02,402
Chris: It's one out of every ten.
817
01:51:02,586 --> 01:51:04,234
Kayla: No. It's one out of every hundred.
818
01:51:04,362 --> 01:51:06,698
Chris: No. See, this is what I was talking about earlier in the episode when I.
819
01:51:06,714 --> 01:51:07,858
Kayla: Was saying Kayla is bad.
820
01:51:07,914 --> 01:51:09,906
Chris: People can't conceive of probabilities.
821
01:51:10,018 --> 01:51:12,150
Kayla: Fuck off. I said the wrong thing.
822
01:51:15,010 --> 01:51:17,314
Chris: It's one and a half out of ten.
823
01:51:17,402 --> 01:51:19,098
Kayla: That's a lot.
824
01:51:19,194 --> 01:51:19,442
Chris: Yeah.
825
01:51:19,466 --> 01:51:20,938
Kayla: So I'm saying it's fairly high.
826
01:51:20,994 --> 01:51:26,262
Chris: Yeah. I would say, since we're saying expected harm (good job, past us, using the word expected), I would say it's fairly.
827
01:51:26,286 --> 01:51:29,382
Kayla: High population of cult. Is it niche?
828
01:51:29,566 --> 01:51:38,510
Chris: Oh, the other thing about the harm thing too is that it's not just mentally anguishing. It's also that people are, like, spending their real hard-earned money. Financially draining.
829
01:51:38,590 --> 01:51:59,424
Kayla: Yeah. Is it niche? I mean, it seems fairly. This particular community seems fairly niche. I would say, given the extensive writing that's been done about Roko's basilisk on the Internet, that concept is not niche, but this community, and the, like, really hardcore, like, philosophical, ethical, whatever kind of thinking that they do, is a niche population.
830
01:51:59,512 --> 01:52:07,472
Chris: Yeah, totally agree. I think that since we're talking about the community as the cult, I would say it's definitely niche. Even though this one thing happened to go, like, mega viral.
831
01:52:07,536 --> 01:52:12,048
Kayla: Right. This one, I. This one is tough.
832
01:52:12,224 --> 01:52:13,248
Chris: What's this one?
833
01:52:13,384 --> 01:52:14,544
Kayla: Anti-factuality.
834
01:52:14,632 --> 01:52:15,380
Chris: Mmm.
835
01:52:16,040 --> 01:52:31,022
Kayla: Because it's, like, yes and no. First of all, anti-factuality. They want you to believe that it is better for a bunch of people to not get a speck of dust in their eye and just torture.
836
01:52:31,046 --> 01:52:32,774
Chris: It's all about the speck. It's all about the speck.
837
01:52:32,862 --> 01:52:54,352
Kayla: Just saying, it feels anti-factual to me. But, I think, in terms of. That was a joke. In terms of, like, having a closed logical system, I don't think that exists here, because it seems like these people almost, like, take the logic to the opposite extreme, where it's almost hazardous, following the logic so to the letter.
838
01:52:54,496 --> 01:53:14,994
Chris: So that's the exact line of reasoning that. Now I forget the article because I read so many of them. But there was one article that follows that exact line of reasoning that was basically, like, being irrational can be harmful, but actually, like, being way too rational can also be harmful. Yeah. Basically to that extreme.
839
01:53:15,042 --> 01:53:27,626
Kayla: Which kind of makes me go, like, is it kind of anti-factual? Because it's like, when you get. No, I'm serious. I know this borders on pedantic. But when you get this literal.
840
01:53:27,778 --> 01:53:28,226
Chris: Yeah.
841
01:53:28,298 --> 01:54:01,804
Kayla: You are kind of not existing in real-world logic anymore. You're kind of not existing in, like, experience. You're not existing in, like, real. It's like, okay, the television show Nathan for You: the way that those writers craft the, like, business proposals is they get. And this I know actually, because you worked there. They get into an extremely logical frame of mind, and they try to be so logical that it's insane and silly.
842
01:54:01,892 --> 01:54:04,436
Chris: Right. Yeah. And that's a part of what makes that show so genius.
843
01:54:04,468 --> 01:54:04,612
Kayla: Yeah.
844
01:54:04,636 --> 01:54:09,392
Chris: And this definitely feels like a similar sort of thing. It's, like, so logical that it's.
845
01:54:09,496 --> 01:54:11,560
Kayla: It's a different kind of anti-factuality.
846
01:54:11,640 --> 01:54:18,280
Chris: Yeah, it's. Yeah, it's weird, right. I think it goes back a little bit to what we were saying about, like, conceivable versus meaningful.
847
01:54:18,360 --> 01:54:18,768
Kayla: Right.
848
01:54:18,864 --> 01:54:32,796
Chris: The logic is conceived so perfectly and so detailed and esoterically that the meaningfulness just, like, totally goes bunk. The same way it does with something like Nathan for You. Except it's funny instead of terrifying.
849
01:54:32,888 --> 01:54:52,724
Kayla: But I also don't want to insinuate that the actual thought that these people are doing on a day to day, even in the community, is anti factual. I think that we talked about some extremes where it is. So I'm gonna say it's yes and no. And, like, maybe more of an emphasis on the no with, like, a high probability of yes.
850
01:54:52,812 --> 01:54:59,204
Chris: So it's like a Schrödinger's criteria, yes? It's both anti-factual and factual at the same time?
851
01:54:59,332 --> 01:55:00,180
Kayla: I think so.
852
01:55:00,300 --> 01:55:09,356
Chris: Cool. Whoa. Gotta bring it back to quantum mechanics, too. We gotta. Every episode, we have to reference at least, like, three or four other episodes. A single thought. It's all about the throwback, man.
853
01:55:09,468 --> 01:55:22,080
Kayla: Percentage of life consumed. I'm gonna say it feels not that high to me. Like, I know you talked about that one guy who, like. Well, basically all he does is work and give money.
854
01:55:22,420 --> 01:55:22,820
Chris: Yeah.
855
01:55:22,860 --> 01:55:49,024
Kayla: But outside of that one example that was given, it's people sitting on a forum. And unless I'm prepared to say that I'm in a cult of Twitter, I don't think I am. But I do, you know, I spend some time on there. But I don't know that, if you're a user of this forum, you're having a percentage of your life consumed that's high enough to be considered cult-like.
856
01:55:49,152 --> 01:56:05,798
Chris: I agree. I think that if everybody on LessWrong were doing what that one guy was doing, and 100% of them believed in Roko's basilisk and were donating every last possible cent to the research institute, then I would say high. But I don't think it's many people doing that.
857
01:56:05,854 --> 01:56:06,406
Kayla: Right.
858
01:56:06,558 --> 01:56:10,558
Chris: So, yeah, I would say that's percentage of life consumed fairly low.
859
01:56:10,614 --> 01:56:13,710
Kayla: Yeah. I'm gonna ask you to answer this one. Ritual.
860
01:56:13,870 --> 01:56:15,102
Chris: Oh, ritual is high, dude.
861
01:56:15,166 --> 01:56:15,630
Kayla: High.
862
01:56:15,750 --> 01:56:21,966
Chris: Yeah. Because, like, if you. I definitely recommend checking out less wrong and reading some of their stuff.
863
01:56:22,038 --> 01:56:23,470
Kayla: Oh, you said all the acronyms.
864
01:56:23,630 --> 01:56:30,942
Chris: There's acronyms, jargon, and so much, like, in-jargon and tropiness that, you know, references other stuff that's been talked about.
865
01:56:31,006 --> 01:56:32,910
Kayla: It feels like you're talking about tv tropes.
866
01:56:33,030 --> 01:56:33,358
Chris: It is.
867
01:56:33,374 --> 01:56:34,690
Kayla: You ever been to tv tropes?
868
01:56:35,030 --> 01:56:36,558
Chris: Tv tropes is almost.
869
01:56:36,694 --> 01:56:39,678
Kayla: You can't parse it. It's like, you can't.
870
01:56:39,814 --> 01:56:40,086
Chris: Right.
871
01:56:40,118 --> 01:56:40,686
Kayla: It's impossible.
872
01:56:40,758 --> 01:56:44,326
Chris: The vocabulary they use has been generated from other parts.
873
01:56:44,358 --> 01:56:44,758
Kayla: Yeah.
874
01:56:44,854 --> 01:56:45,908
Chris: Of their own site.
875
01:56:46,014 --> 01:56:47,792
Kayla: It is turtles all the way down there.
876
01:56:47,856 --> 01:56:53,176
Chris: Yeah. TV Tropes is completely out of control. It's not even about TV anymore. No, it's about itself.
877
01:56:53,248 --> 01:56:53,544
Kayla: Yeah.
878
01:56:53,592 --> 01:57:13,556
Chris: I love tv tropes, but it's similar to that where it's like, when you first read an article there, it's kind of hard to understand because of how referential it is within their own community. Now, that doesn't mean they're going to a temple and performing blood sacrifices or anything. They're not doing that, but I would say it's fairly ritualistic.
879
01:57:13,688 --> 01:58:05,568
Kayla: I'm gonna ask a question here, because that behavior in a forum, that doesn't feel outlandish, that doesn't feel atypical. Like, I'm just thinking about some of the, like, I'm just thinking about Reddit and some of the subreddits that I frequent on there, and how with many of them, especially, like, very specialized ones, that kind of thing happens. You've heard Am I the Asshole? You've heard that, where it's like, people be like, am I the asshole? There's a lot of acronyms going on there. AITA, NTA, YTA, various jargon that's used. Or, like, JustNoMIL, which is a forum about shitty mothers and mothers-in-law. Go read it. It is the greatest thing of all time. These people are insane.
880
01:58:05,664 --> 01:58:19,416
Kayla: But, like, when you start trying to read these posts, it's like, until you get a handle on the jargon and the nicknames and the acronyms on any forum, you're gonna experience that. So does that mean, like, all of these forums are ritualistic?
881
01:58:19,528 --> 01:58:29,440
Chris: I think they have an element of ritual, for sure. I think anything, to me, any language that helps designate insiders versus outsiders, to me, feels ritualistic.
882
01:58:29,520 --> 01:58:30,018
Kayla: Okay.
883
01:58:30,104 --> 01:58:38,894
Chris: It's. It's a, it's a signaling tool, right? Any language that is a signaling tool as much as it is a communication tool to me, says ritual.
884
01:58:38,982 --> 01:58:41,134
Kayla: Okay, then ritual here is high.
885
01:58:41,222 --> 01:58:41,822
Chris: Yeah.
886
01:58:41,966 --> 01:58:43,798
Kayla: And charismatic leader. Yes.
887
01:58:43,934 --> 01:58:44,886
Chris: Yeah, for sure.
888
01:58:44,958 --> 01:58:46,286
Kayla: Mister, sir man, whatever his name is.
889
01:58:46,318 --> 01:58:47,166
Chris: Mister Yudkowsky.
890
01:58:47,198 --> 01:58:47,406
Kayla: Yes.
891
01:58:47,438 --> 01:59:16,158
Chris: Yeah. He seems pretty charismatic. Generally, he wants to help develop a future that has friendly artificial intelligence instead of malevolent artificial intelligence, so. And also, he's very active on his own forum. I mean, he's posting tons of articles and replying to other people's comments and things like that. So he's pretty active. It's definitely. He's definitely there. Is he charismatic? Yeah, I guess. I don't know. I've never seen him, like, talk in person. I didn't watch a video of him, like, I mean, speaking.
892
01:59:16,254 --> 01:59:28,450
Kayla: One of the only things you, like, kind of said about him was that he wrote this post where he called that guy an idiot, which doesn't seem super charismatic. But also, charismatic doesn't have to mean, like. JZ Knight's a dick.
893
01:59:28,610 --> 01:59:30,802
Chris: Right. But she's also charismatic, but she's charismatic.
894
01:59:30,826 --> 01:59:33,474
Kayla: Or, like, a lot of these people are kind of jerky.
895
01:59:33,602 --> 01:59:35,098
Chris: All right, so I'm gonna go with exists.
896
01:59:35,194 --> 01:59:48,690
Kayla: I'm gonna say there is a presence of a charismatic leader. So. So we got expected harm. We got charismatic leader, we've got niche. We've got semi-anti-factual. I think it's a cult.
897
01:59:48,850 --> 01:59:51,082
Chris: I think so. I was. I was leaning cult.
898
01:59:51,146 --> 01:59:57,088
Kayla: It's a cult, man. They got that ritual. The only thing that's kind of, like, eh, is percentage of life consumed.
899
01:59:57,184 --> 01:59:59,700
Chris: Yeah, no, I'm calling cult on this one.
900
02:00:00,000 --> 02:00:01,616
Kayla: Sorry, guys. Sorry, sorry, guys.
901
02:00:01,688 --> 02:00:08,432
Chris: If anybody is listening to this from. From less wrong, that doesn't mean that, you know, it's nothing bad.
902
02:00:08,496 --> 02:00:12,976
Kayla: If it makes you feel any better, this one said this man right here.
903
02:00:13,048 --> 02:00:15,352
Chris: I like that you're pointing at me because we're recording the video.
904
02:00:15,416 --> 02:00:24,768
Kayla: Yeah. But also this. They know who I'm talking about. Said that I'm in a cult because of my involvement with Cicada 3301.
905
02:00:24,944 --> 02:00:25,408
Chris: Yes.
906
02:00:25,464 --> 02:00:27,176
Kayla: And you called it a benevolent cult.
907
02:00:27,248 --> 02:00:38,120
Chris: Benevolent cult, yeah. I see this, for the most part, aside from occasionally throwing off ideas that make people's brains hurt, I think, as a benevolent cult.
908
02:00:38,280 --> 02:00:39,560
Kayla: It's neutral to benevolent. Yeah.
909
02:00:39,600 --> 02:00:40,008
Chris: Yeah.
910
02:00:40,104 --> 02:00:41,624
Kayla: But probably net positive.
911
02:00:41,792 --> 02:00:50,402
Chris: Yeah, I think so. I thought that the stuff that I read on less wrong was really interesting. I think Eliezer's mostly a good guy trying to do the right thing.
912
02:00:50,466 --> 02:00:57,090
Kayla: I think dedicating that level of thinking is generally a plus, a positive, a good thing.
913
02:00:57,130 --> 02:01:19,736
Chris: Right. I think the thing that helps me think about it a lot is kind of going back to that analogy with the mining. They are mining for truths. They are mining for information and insights in this very novel field that involves some really weird ways of having to think about the world and decisions and whatever. And eventually you're gonna. By doing that, you're gonna run into something that could potentially be hazardous.
914
02:01:19,808 --> 02:01:22,512
Kayla: You know what, do you want me to draw an analogy here?
915
02:01:22,536 --> 02:01:22,936
Chris: Oh my God.
916
02:01:22,968 --> 02:01:27,900
Kayla: You know what it's exactly like? You remember that little flash game, Motherload?
917
02:01:28,640 --> 02:01:29,064
Chris: Yeah.
918
02:01:29,112 --> 02:01:29,792
Kayla: Literally that.
919
02:01:29,816 --> 02:01:30,728
Chris: It's like Motherload. Yeah.
920
02:01:30,744 --> 02:01:31,336
Kayla: I don't.
921
02:01:31,448 --> 02:01:32,016
Chris: Spoilers.
922
02:01:32,048 --> 02:01:32,952
Kayla: I don't want to spoil.
923
02:01:33,016 --> 02:01:33,920
Chris: Go play Motherload.
924
02:01:33,960 --> 02:01:40,160
Kayla: But go play Motherload and play. Please, please believe me when I say.
925
02:01:40,320 --> 02:01:41,304
Chris: You play to the end.
926
02:01:41,352 --> 02:01:55,296
Kayla: You have to play to the end. It will seem like nothing's going on, and then all of a sudden you will achieve salvation. It is a fantastic gaming experience and relevant to what we just said, which I don't remember what it was because.
927
02:01:55,328 --> 02:01:57,960
Chris: We were mining and discovered a hazard.
928
02:01:58,000 --> 02:01:58,560
Kayla: Yes.
929
02:01:58,720 --> 02:02:35,986
Chris: So I'll also say that if somehow, by some miracle, we get somebody that listens to this show from LessWrong, I would love to interview you. Please contact us@cultorjustweirdmail.com or on Twitter. I would love to do an interview with one of you guys to talk about the experience from the inside. I usually like trying to get primary sources for this kind of stuff. I wasn't able to this time. I actually tried to find Roko's contact information, but strangely, it's totally not out there. But yeah, I would love to talk to someone that was sort of, like, experiencing this on the inside. So on the off chance, please contact us.
930
02:02:36,058 --> 02:02:39,110
Kayla: Question for you. Am I in this cult?
931
02:02:40,450 --> 02:02:41,098
Chris: We both are.
932
02:02:41,154 --> 02:02:42,898
Kayla: I'm not part of the community.
933
02:02:43,034 --> 02:02:50,322
Chris: Oh, I think we both are, because we're both, like, total singularitarians. I was saying I'm not scared of Roko's basilisk, and I thought it was, like, really interesting.
934
02:02:50,386 --> 02:02:50,898
Kayla: Right.
935
02:02:51,034 --> 02:02:54,330
Chris: I'm kind of disappointed that you're still scared of it after all we saw.
936
02:02:54,370 --> 02:03:03,446
Kayla: I'm not that scared of it, but you can't just, you know, just snap out of it. Well, aren't you just so great?
937
02:03:03,478 --> 02:03:05,702
Chris: I'll tell you what. I tell you what. When we're done here.
938
02:03:05,726 --> 02:03:10,686
Kayla: Out of it either. You read a shit ton of stuff that I've only heard secondhand. My brain needs time.
939
02:03:10,878 --> 02:03:18,542
Chris: I'm saying, when we're done here, definitely read the RationalWiki article, because it has some additional things that we didn't mention on the show.
940
02:03:18,606 --> 02:03:19,326
Kayla: I will do that.
941
02:03:19,398 --> 02:03:47,166
Chris: Actually, one of the things that I didn't mention that I probably should have: one of the reasons that they give is that AI isn't just going to be like, oh, I invented Roko's basilisk now. Like, it's not going to be, like, off and then on, right? It's going to be, if it exists at all, a gradual process where, you know, one thing affects the next thing, affects the next thing, creates the next thing. It's going to be extremely chaotic, okay? And there's going to be no way to trace it back to, like, whether you, Kayla, did or didn't donate to.
942
02:03:47,328 --> 02:03:52,682
Kayla: But what about punctuated equilibrium, which for some reason I have talked about on.
943
02:03:52,706 --> 02:04:16,006
Chris: This show multiple times. That's, even if there is punctuated equilibrium, it doesn't mean that the process won't be chaotic, okay? The process absolutely will be chaotic, and you will not be able to trace it back to which butterfly flapped its wings to make the AI happen. Okay, good to know. Mathematically speaking, you will not be able to do that. Even a superintelligent AI will not be able to do that, because math.
944
02:04:16,198 --> 02:04:17,214
Kayla: Hashtag math.
945
02:04:17,302 --> 02:04:18,006
Chris: Boom.
946
02:04:18,158 --> 02:04:19,022
Kayla: You're a cult.
947
02:04:19,166 --> 02:04:23,646
Chris: You're a cult. So, yeah, that was cool. Yeah.
948
02:04:23,758 --> 02:04:26,734
Kayla: But I'm still mad at you. I'm sorry.
949
02:04:26,902 --> 02:04:27,798
Chris: Still mad.
950
02:04:27,974 --> 02:04:28,930
Kayla: Still mad.
951
02:04:29,430 --> 02:04:43,818
Chris: All right, before we sign off, I would like to say to you, our listeners and fans, that whatever set of words that you may have heard that have had the power to destroy you, I promise you'll find the words with the power to heal, too, and I hope you find them soon.
952
02:04:43,954 --> 02:04:44,994
Kayla: That's very nice.
953
02:04:45,162 --> 02:04:46,130
Chris: I'm Chris.
954
02:04:46,290 --> 02:04:47,362
Kayla: And I'm Kayla.
955
02:04:47,466 --> 02:04:50,950
Chris: And this has been Cult. Or Just Weird.