Transcript
1
00:00:00,840 --> 00:00:49,632
Kayla: When you google Sam Bankman-Fried and effective altruism, you get headlines back like how effective altruism let Sam Bankman-Fried happen. Effective altruism is as bankrupt as Sam Bankman-Fried's FTX. Effective altruist leaders were repeatedly warned about Sam Bankman-Fried years before FTX collapsed. And Sam Bankman-Fried and the effective altruism delusion. It sounds fairly damning. And in the fallout after the FTX scandal and Bankman-Fried's arrest, effective altruism itself seemed to be on trial along with him. Now, when you google Sam Bankman-Fried and polycule, you get back partner swapping, pills, and playing games inside Sam Bankman-Fried's FTX party house, polyamory, penthouses and plenty of loans.
2
00:00:49,776 --> 00:01:04,879
Kayla: Inside the crazy world of FTX. And FTX's crypto empire was reportedly run by a bunch of roommates in the Bahamas who dated each other, according to the news site that helped trigger the company's sudden collapse. What the hell exactly is going on?
3
00:01:20,940 --> 00:01:37,150
Chris: Kayla? It was really hard for me to not reply to anything. You're like, okay, I'm gonna do the intro snippet now. I'm gonna do the cold open now. And I was like, okay. And then you started doing it and I was like, how am I supposed to not fucking respond to any of this?
4
00:01:37,270 --> 00:01:38,358
Kayla: Well, respond now.
5
00:01:38,454 --> 00:01:46,998
Chris: I had to, like, point my mouth away from the microphone cause I started giggling when you brought up the polycule roommates. Bahamas. I mean, it sounds pretty fun.
6
00:01:47,054 --> 00:01:48,570
Kayla: It really does, honestly.
7
00:01:48,970 --> 00:01:52,490
Chris: Where to begin?
8
00:01:52,610 --> 00:01:56,178
Kayla: You know what? We're gonna get into it. So you'll have plenty of time to respond.
9
00:01:56,274 --> 00:01:57,882
Chris: You should know where to begin cause you did the research.
10
00:01:57,946 --> 00:02:08,114
Kayla: I did. Welcome back to Cult or Just Weird. I'm Kayla. I'm a TV writer and a goddamn expert on effective altruism after all of these episodes. Not really, but it sounds good to say.
11
00:02:08,241 --> 00:02:22,700
Chris: I'm Chris. I am a data scientist and game designer. I have never been to the Bahamas with a polycule to cause a major financial collapse before, but I'm looking, if anybody else wants to join me.
12
00:02:23,600 --> 00:02:59,762
Kayla: I already said welcome back to Cult or Just Weird, but I'm going to say it again. Thank you for supporting the show by listening to it and coming back and hearing us yap week after week. If you'd like to support us further, you can go to patreon.com culturesweird. And if you want to yap with us about cults and weirds, you can find us on Discord, linked in our show notes. Last week on the show we tackled effective altruism and longtermism, the last two letters of the TESCREAL bundle. It was a bit of a primer, allowing us to dip our toes into what exactly these concepts are. In short, effective altruism, or EA, means using evidence and reason to figure out how to benefit others as much as possible.
13
00:02:59,866 --> 00:03:10,030
Kayla: And taking action on that basis, usually with, like, charitable donations. And longtermism means extrapolating that idea out to include considerations of the far future and the humans who might live then.
14
00:03:10,170 --> 00:03:16,238
Chris: And that's why you'd be in a polycule, so you can make as many babies as possible. A lot of cross pollination going on.
15
00:03:16,294 --> 00:03:29,670
Kayla: That's not. Not a thing. I don't. Seriously, I don't think this polycule was. I don't think this polycule had anything to do with population stuff. But, like, that's why Elon Musk has twelve kids, or however many.
16
00:03:29,750 --> 00:03:31,302
Chris: Should we explain polycule?
17
00:03:31,406 --> 00:03:32,526
Kayla: Well, we'll get to it. Don't worry.
18
00:03:32,558 --> 00:03:33,342
Chris: Okay. Okay.
19
00:03:33,446 --> 00:03:38,470
Kayla: We could explain it later. I want to keep our listeners on their toes, on bated breath for a moment.
20
00:03:38,550 --> 00:03:40,444
Chris: Don't forget, my mom still listens to this.
21
00:03:40,582 --> 00:03:59,416
Kayla: She'll enjoy this episode. This week we're going to get into some of the criticisms of effective altruism and longtermism. Some of the criticisms, anyway. It's not really possible to be totally comprehensive on this because it's a big topic. In short, we are all now here for the good stuff.
22
00:03:59,568 --> 00:04:01,856
Chris: Yeah. The juicy episode.
23
00:04:01,888 --> 00:04:15,422
Kayla: The juicy episode. So what the hell happened with infamous effective altruist and alleged polycule enthusiast Sam Bankman-Fried? I would like to say up front that we at Cult or Just Weird are pro-polyamory for consenting parties.
24
00:04:15,526 --> 00:04:15,998
Chris: Hell yeah.
25
00:04:16,053 --> 00:04:28,438
Kayla: Absolutely. Nothing wrong with it. Probably a good thing that more people are learning about this relationship structure. That said, we really want to talk about those polycule headlines because, boy, did they scandalize the news media for a while in 2022.
26
00:04:28,574 --> 00:04:45,912
Chris: I feel like we run into this, like, frequently where it's like there's stuff that we genuinely are supporting and in favor of, but like, somehow it's still. We still have this, like, vestige of, like, societal, whatever, shame or something that, like, makes it fun to, like, gossip. Scandalize about.
27
00:04:45,976 --> 00:04:46,496
Kayla: Yeah.
28
00:04:46,608 --> 00:04:50,656
Chris: You know. Yeah, we're hypocrites. We feel like we run into that more than once.
29
00:04:50,808 --> 00:04:54,064
Kayla: Yeah. That's why we have a podcast.
30
00:04:54,112 --> 00:04:54,940
Chris: That's right.
31
00:04:55,440 --> 00:05:04,942
Kayla: Okay, so Sam Bankman-Fried, who I will refer to, as many others do, as SBF, founded a cryptocurrency exchange called FTX. Cause we're gonna get a lot of.
32
00:05:04,966 --> 00:05:06,814
Chris: Acronyms in TESCREAL. Yep.
33
00:05:06,862 --> 00:05:15,430
Kayla: Yeah. He courted a ton of investors, made a boatload of money for himself. Like, I think he ranked 41st in Forbes list of richest Americans at one point. Holy shit.
34
00:05:15,470 --> 00:05:16,414
Chris: He got that high?
35
00:05:16,462 --> 00:05:18,454
Kayla: Oh, yes, he. Yes. Yes.
36
00:05:18,542 --> 00:05:19,270
Chris: That dude.
37
00:05:19,350 --> 00:05:20,174
Kayla: That dude.
38
00:05:20,302 --> 00:05:22,062
Chris: Like, what am I doing with my life?
39
00:05:22,126 --> 00:05:31,406
Kayla: This is like, a Mark Zuckerberg 2.0. He very much leaned into the aesthetic of, like, my hair is unkempt, and I wear t shirts and flip flops. That guy.
40
00:05:31,478 --> 00:05:32,078
Chris: Oh, my God.
41
00:05:32,134 --> 00:05:49,652
Kayla: Okay, nothing wrong with it. I mean, there's a lot wrong with it. But in November 2022, FTX went bankrupt really badly, and the following month, SBF was arrested and indicted on charges of wire fraud, commodities fraud, securities fraud, money laundering, and campaign finance law violations.
42
00:05:49,756 --> 00:05:50,820
Chris: That's a lot of fraud.
43
00:05:50,900 --> 00:06:11,840
Kayla: Oh, yeah. He was convicted on seven counts, sentenced to 25 years in prison, and everyone invested in FTX lost their money. Before shit hit the fan, he helped popularize the concept of effective altruism in the mainstream, as he was a big EA guy. Great, good. We are all caught up. But what about the polycule?
44
00:06:12,340 --> 00:06:13,796
Chris: Okay, get to that.
45
00:06:13,908 --> 00:06:26,116
Kayla: So the story goes that in 2021, SBF and the whole FTX crew moved the operation from Hong Kong to the Bahamas, allegedly because there were fewer regulations there for financial stuff.
46
00:06:26,188 --> 00:06:30,058
Chris: Yeah, I think I. That tracks with what I know about the Caribbean.
47
00:06:30,234 --> 00:06:44,986
Kayla: Ten members of an inner circle, essentially, like, the execs of FTX, lived together in, like, a palatial mansion. Like, just the most Succession-y, most, like, Silicon Valley-y Bahamas mansion you can imagine.
48
00:06:45,058 --> 00:06:47,186
Chris: Sorry. Why isn't there a tv show yet, though?
49
00:06:47,218 --> 00:06:49,946
Kayla: There will be. You better believe there will be.
50
00:06:50,098 --> 00:06:50,786
Chris: Okay.
51
00:06:50,898 --> 00:06:56,476
Kayla: While running, quote, unquote, the cryptocurrency exchange, they also partied hard, obviously.
52
00:06:56,548 --> 00:06:57,004
Chris: Yeah, yeah.
53
00:06:57,052 --> 00:07:09,716
Kayla: The stories go that SBF and his roommates would, like, stay up all night and take amphetamines, like speed. He himself once tweeted, quote, stimulants when you wake up, sleeping pills if you need them when you sleep. Like, that was the tweet.
54
00:07:09,828 --> 00:07:16,596
Chris: I mean, you know, I drink coffee in the morning, and sometimes I smoke weed at night, so I can't really make fun of him too much.
55
00:07:16,668 --> 00:07:27,960
Kayla: A lot of time was also spent watching SBF play video games. Like, it's confirmed that he was literally playing League of Legends during at least one very important fundraising call with Sequoia Capital.
56
00:07:28,260 --> 00:07:29,180
Chris: All right, so here's where.
57
00:07:29,220 --> 00:07:30,240
Kayla: Also, he was bad.
58
00:07:30,900 --> 00:07:32,300
Chris: Oh, he was like a baddie.
59
00:07:32,340 --> 00:07:33,996
Kayla: They found his ranking and he was not good at it.
60
00:07:34,028 --> 00:07:51,060
Chris: Wood league. I would love to make fun of him for playing LoL, and I would love to equate the toxic nature of these types of games with him being a douchebag. But also, I have a bunch of friends that work at Riot, so I feel like I shouldn't do that.
61
00:07:51,100 --> 00:07:53,892
Kayla: But also, anyone who plays video games is bad and wrong.
62
00:07:53,996 --> 00:07:54,544
Chris: That's right.
63
00:07:54,612 --> 00:08:13,780
Kayla: That's the position of the show. Yes. Of course, the rumors go that all ten of the co-ed inner circle group were all dating each other, either pairing up and then mixing and matching in either an on-again, off-again situationship or something akin to swinging, or they were doing straight-up polyamory.
64
00:08:14,120 --> 00:08:29,520
Chris: Okay, okay. So it's unclear whether they were all cuddle puddling in every single night, all ten of them. Or whether it was some amorphous, amoebic sort of. Something would break off and then they would come back in.
65
00:08:30,100 --> 00:08:31,804
Kayla: It was like an incestuous pile.
66
00:08:31,932 --> 00:08:32,508
Chris: Awesome.
67
00:08:32,604 --> 00:08:43,840
Kayla: And to be clear, polyamory is a relationship structure in which multiple people, rather than just a couple, form a romantic unit. So, like, multiple people all dating each other in a single relationship.
68
00:08:44,260 --> 00:08:46,860
Chris: Do we know the ratio of these ten people?
69
00:08:46,980 --> 00:08:55,724
Kayla: You know, that's a good question. I don't. But we do know that Caroline Ellison, CEO. Not related to Larry Ellison, as far as I know.
70
00:08:55,772 --> 00:09:01,348
Chris: Oh, okay. Yeah, you're saying tech people and then Ellison. I'm just assuming it's Larry Ellison.
71
00:09:01,444 --> 00:09:16,520
Kayla: Not related to Larry Ellison. Different Ellison. She was the CEO of a trading firm founded by FTX, and she was also SBF's. There's too many acronyms. She was also Bankman-Fried's sometimes girlfriend.
72
00:09:16,820 --> 00:09:18,228
Chris: SBF's GF.
73
00:09:18,324 --> 00:09:23,936
Kayla: Correct. She also blogged about polyamory on Tumblr and her journey into it.
74
00:09:24,028 --> 00:09:31,088
Chris: She did. You know, I remember her in the sort of, like, fallout news of FTX. Like, she also got quite a bit of heat.
75
00:09:31,144 --> 00:10:11,956
Kayla: Part of the maelstrom. My favorite quote from her Tumblr that I found was this. There's problems here. And if you are a polyamorous person, you will be able to identify them immediately. And if you are not a polyamorous person, I think they'll still, like, scream right in your ear. When I first started my foray into poly, I thought of it as a radical break from my trad past. But, TBH, acronyms, I've come to decide the only acceptable style of poly is best characterized as something like imperial Chinese harem. None of this non-hierarchical bullshit. Everyone should have a ranking of their partners, people should know where they fall on that ranking, and there should be vicious power struggles for the higher ranks.
76
00:10:12,108 --> 00:10:13,788
Chris: That sounds awesome.
77
00:10:13,884 --> 00:10:28,232
Kayla: I cannot confirm or deny whether this is, like, a joke or reality. It definitely falls in that, like, troll-or-not-a-troll realm. But, like, yeah, it does. I have some thoughts.
78
00:10:28,416 --> 00:10:40,488
Chris: Like, if you treat it like a sport, that sounds awesome. You know, if it's, like, not, like, a serious, like, you know, if you're like, well, I'm just gonna play the sport of polyamory and not take it seriously. That sounds like that could be fun.
79
00:10:40,624 --> 00:11:00,442
Kayla: But, yeah, I mean, from my limited understanding of, like, ethical polyamory practices, this is perhaps not the most sound way to pursue a poly relationship. I don't think that relationships should be about vicious power struggles and rankings. That's my personal.
80
00:11:00,506 --> 00:11:02,510
Chris: Then how do you get a Game of Thrones, Kayla?
81
00:11:03,170 --> 00:11:07,090
Kayla: Well, I mean, they didn't even have time to get their Game of Thrones on. Cause they all got arrested or whatever.
82
00:11:07,130 --> 00:11:07,950
Chris: That's true.
83
00:11:08,290 --> 00:11:27,166
Kayla: But why bring any of this up? What does Bankman-Fried's alleged polycule have to do with EA, effective altruism? Doesn't this kind of just seem like we're talking about scandalized, sensationalized reporting to make these tech bros look like weirdos to pearl-clutching Guardian readers?
84
00:11:27,318 --> 00:11:28,374
Chris: Yeah. Isn't that what we're doing?
85
00:11:28,422 --> 00:11:36,870
Kayla: Yeah, I mean, I think Scott Alexander points out in his essay In Continued Defense of Effective Altruism, which we talked about last week.
86
00:11:36,910 --> 00:11:37,942
Chris: Right, the rationalist guy.
87
00:11:38,006 --> 00:12:14,828
Kayla: Correct. He says, quote, nobody cares about preventing pandemics. Everyone cares about whether SBF was in a polycule or not. Effective altruists will only intersect with the parts of the world that other people care about when we screw up. Therefore, everyone will think of us as those guys who are constantly screwing up, and maybe doing other things I'm forgetting right now. In short, he's saying that, like, for every article about SBF's polycule, there are a dozen articles that should have been written about the 200,000 lives Alexander estimates effective altruism has saved.
88
00:12:14,914 --> 00:12:18,248
Chris: I guess he has a point about sensationalism in media and clicks.
89
00:12:18,264 --> 00:12:33,540
Kayla: That's why I brought this up. That's why I brought this up in this episode and not in the previous episode, because I read that quote, and I was like, oh, yeah, let's talk about the. Maybe the good stuff first, so I'm not just being like, ooh. Yeah, but I still want to do the ooh a little bit.
90
00:12:34,000 --> 00:12:36,808
Chris: Mm mm. Dammit, Kayla.
91
00:12:36,904 --> 00:12:41,170
Kayla: Frankly, however, to go back on myself.
92
00:12:41,290 --> 00:12:42,210
Chris: Yeah. Oh, okay.
93
00:12:42,290 --> 00:13:03,618
Kayla: I bring it up because it is relevant. Like, especially when we're getting into the criticism section of effective altruism. Like, if a movement is going to support and even encourage tech billionaires to acquire as much wealth and power as they can so they can donate it to causes of their choice, we also have to look at the whole package of the person we're allowing to acquire said wealth and power.
94
00:13:03,794 --> 00:13:08,794
Chris: I think that's actually a really good point. No, you're bringing me back around now because you're right.
95
00:13:08,922 --> 00:13:16,408
Kayla: I. A big part of. Not to say that polyamory is unethical, I'm just saying that upfront. Sorry, continue.
96
00:13:16,464 --> 00:13:46,916
Chris: No, no, the point is not that polyamory is unethical. It's perfectly ethical if it's consensual and whatever, like anything else, it's more that, like, yeah, maybe there is a reason to interrogate deeply into the lives of people who we have allowed to accumulate such wealth. Because effectively accumulating that wealth is equivalent to saying, we are giving this person a lot of decision making power over how resources are spent and what things are built.
97
00:13:46,988 --> 00:13:47,364
Kayla: Right.
98
00:13:47,452 --> 00:14:00,428
Chris: And if these guys are using those resources to build yachts instead of building bridges or shelters for homeless people, I think that we need to be saying, like, okay, well, what are they doing? Like, what would you say you do here?
99
00:14:00,444 --> 00:14:11,990
Kayla: What would you say you do here? Yeah, I don't know if it's really possible to disentangle one's lived ethics from one's charitable ethics, especially when we're talking about, like you said, people who are hoarding billions of dollars.
100
00:14:12,030 --> 00:14:13,990
Chris: Yeah. Not at that level of wealth. Right, right.
101
00:14:14,070 --> 00:14:21,614
Kayla: But again, there's nothing wrong with polyamory, and there's nothing even wrong with, like, taking drugs or playing video games or, like, fucking off to the Bahamas or whatever. Those aren't the issue.
102
00:14:21,622 --> 00:14:22,410
Chris: That's lucky.
103
00:14:25,190 --> 00:14:37,346
Kayla: But when Caroline Ellison is helping make her boyfriend violently wealthy, and then blogging, this is another quote, blogging that her ideal man can, quote, control most major world governments and has sufficient strength to physically overpower you.
104
00:14:37,498 --> 00:14:38,298
Chris: Okay, hold on.
105
00:14:38,314 --> 00:14:39,746
Kayla: I'm gonna look at that twice.
106
00:14:39,938 --> 00:14:44,274
Chris: Okay, so she's bragging about how her boyfriend can.
107
00:14:44,322 --> 00:14:52,950
Kayla: No, she's saying this is her ideal man. She's not saying, oh, her ideal man. This is my ideal man. Which, I mean, she did and continued to date SBF during this period.
108
00:14:53,290 --> 00:15:02,990
Chris: Well, he became not that guy, for sure. I mean, it sounds like he never was that guy, but after FTX collapsed.
109
00:15:03,070 --> 00:15:10,582
Kayla: I don't think he's the ideal anymore. But I think that he was certainly on his way at one time, right? When you're ranked 41st richest American, he.
110
00:15:10,606 --> 00:15:13,254
Chris: didn't look like he could physically overpower anyone.
111
00:15:13,302 --> 00:15:17,014
Kayla: Well, I don't know. Maybe she was like four foot eleven, I don't know.
112
00:15:17,142 --> 00:15:18,730
Chris: Oh, physically overpower her.
113
00:15:19,150 --> 00:15:26,814
Kayla: She's talking about what the ideal man is for her. It's somebody who can physically overpower her.
114
00:15:26,942 --> 00:15:31,054
Chris: Oh, I thought she was doing like, a my boyfriend can beat your boyfriend up thing.
115
00:15:31,142 --> 00:15:44,642
Kayla: You know what? That could be an interpretation. But my interpretation, she's saying that, like, I am. I am deeply aroused by a man who has extreme global power and can also physically overpower me.
116
00:15:44,706 --> 00:15:45,122
Chris: Okay.
117
00:15:45,186 --> 00:15:46,202
Kayla: I find that erotic.
118
00:15:46,266 --> 00:15:48,642
Chris: Got it. Well, you know, like, to each their own.
119
00:15:48,706 --> 00:16:12,340
Kayla: To each their own. But I don't think that we should be giving a lot of money to people who are eroticizing and idealizing, in reality, individual Silicon Valley tech bros to be able to control major world governments. I know that this is probably some Tumblr roleplay fanfic bullshit, and also, it's kind of what was actually happening, right?
120
00:16:12,380 --> 00:16:13,480
Chris: Troll. Not a troll.
121
00:16:14,060 --> 00:16:41,692
Kayla: And another way I'm going to read the polyamory situation is that sometimes shitty people use polyamory as a cover for shitty, abusive behavior in relationships. And the EA community has been accused of exactly that. Even outside of the Sam Bankman-Fried stuff, both Bloomberg and Time reported on women accusing the EA community, particularly the Silicon Valley EA community, of cultivating a culture in which predatory sexual behavior thrived.
122
00:16:41,796 --> 00:16:51,780
Chris: Because, of course. Yeah, this is. Now we're talking about classic cult stuff here, right? This is like the Source Family or NXIVM or any number of others.
123
00:16:51,900 --> 00:17:26,410
Kayla: Men within the movement were accused of using their power to groom younger women and utilizing the guise of polyamory to do so. The accusers also stated that EA was largely male-dominated and that sexual misconduct in the community was either excused or ignored. The Center for Effective Altruism, which is William MacAskill's organization, argued to Time that they had banned accused perpetrators from the organization and offered to investigate new claims. They also said it's hard to figure out whether the sexual misconduct that went on was unique to the EA community or whether it was just good old-fashioned societal misogyny.
124
00:17:27,949 --> 00:17:44,557
Chris: I don't disagree with that. I mean, you see misogyny get its own flavor no matter where it is. In fact, I was even gonna say, yeah, polyamory can be used as a cover for certain types of abuse. So can just regular old monogamy.
125
00:17:44,653 --> 00:17:45,289
Kayla: Sure.
126
00:17:46,589 --> 00:17:53,033
Chris: The individual family unit is used as a bludgeon by a lot of people to advance political agendas.
127
00:17:53,141 --> 00:18:24,076
Kayla: I hope that they're then donating some money to the structural issues behind societal misogyny that might be taking down their very organization, but I don't think that they are. Oh, we'll get to that. I don't know. I agree with you. And also, that response rubbed me the wrong way a little bit. I don't think it's wrong to acknowledge that these problems grow out of systems, not simply out of effective altruism itself. And also, it feels a little bit like a cop-out and a little.
128
00:18:24,108 --> 00:18:25,532
Chris: Bit like washing your hands.
129
00:18:25,596 --> 00:18:32,720
Kayla: A lack of understanding of how, like, you are currently shaping culture and you're continuing to shape culture in the image of, like, the shitty stuff.
130
00:18:33,100 --> 00:18:43,292
Chris: Yeah, it's hard to tell, especially with these billionaire guys, because so many of them seem like we can't do anything, pass it on, but, like, they're also creating culture. So I. Yeah, I don't know.
131
00:18:43,396 --> 00:19:28,760
Kayla: It's tough. Regarding SBF specifically, there is some question about whether he was, quote unquote, really an effective altruist. And I think those questions kind of expose a deep criticism of EA. It is extremely easy to bastardize the original concept and use it for personal gain that ends up hurting a lot of people. SBF publicly declared himself an EAer, stated that he was, quote, earning to give, and made donations, quote, not based on personal interests, but on the projects that are proven by data to be the most effective at helping people. He was a member of Giving What We Can, the organization we talked about last week, where members pledge to donate at least 10% of their incomes to EA causes. He founded something called the Future Fund, which was supposed to donate money to nonprofits. And guess who was on the team?
132
00:19:29,140 --> 00:19:30,436
Chris: Eliezer Yudkowsky.
133
00:19:30,548 --> 00:19:31,484
Kayla: William MacAskill.
134
00:19:31,532 --> 00:19:32,276
Chris: Oh, okay.
135
00:19:32,348 --> 00:20:03,346
Kayla: One of the founders of the EA movement. And it wasn't the only way MacAskill was connected to SBF. Like, I read somewhere that at one point, Sam Bankman-Fried had, like, worked at the Center for Effective Altruism. I'm not sure if that's true. But in 2022, when Elon Musk was looking to fund his Twitter purchase, William MacAskill acted as a liaison between Musk and Sam Bankman-Fried. MacAskill reached out to Musk, who had once described MacAskill's book What We Owe the Future as a, quote, close match for my philosophy.
136
00:20:03,458 --> 00:20:09,314
Chris: Right. That quote comes up, like, everywhere. That quote has been plastered across the Internet by now. Yeah.
137
00:20:09,442 --> 00:20:22,822
Kayla: And MacAskill said, hey, my collaborator, and, yes, he used the phrase my collaborator, can maybe help secure this funding. And then, you know, ultimately, of course, that did not go through, because Sam Bankman-Fried got arrested and went to jail.
138
00:20:22,886 --> 00:20:51,822
Chris: Yeah. Getting arrested would make. That would put a damper on that. You know, I. I'm picturing now, because you were saying, like, oh, there's other ways that they were tied together. And now I'm picturing there's, like, a fives League of Legends team, and it's like Sam Bankman-Fried, MacAskill, Musk, and, I don't know, pick another fifth, Bostrom or something. And they're like. They're all playing League of Legends, and I'm trying to figure out who would go where. Cause there's very specific roles.
139
00:20:51,886 --> 00:20:53,222
Kayla: But they're also all enemies.
140
00:20:53,366 --> 00:20:58,410
Chris: Yeah, of course. And they're yelling at each other like, dude, you should have been there for the gank, man.
141
00:20:59,990 --> 00:21:22,040
Kayla: Kill me. Leading up to his arrest, Bankman-Fried did an interview with Vox reporter Kelsey Piper via Twitter DM, which, I can't tell if that's really smart or really dumb. He stated that his, quote, ethics stuff was, quote, mostly a front, and that ethics is a, quote, dumb game we woke westerners play where we say all the right shibboleths, and so everyone likes us.
142
00:21:22,160 --> 00:21:24,820
Chris: Ooh, dropping the shibboleth word.
143
00:21:25,120 --> 00:21:25,840
Kayla: Many.
144
00:21:26,000 --> 00:21:27,380
Chris: Should we define that?
145
00:21:27,720 --> 00:21:35,976
Kayla: Can you? It's like a word. No, I can't. It's a word that is used to define in-groups and out-groups.
146
00:21:36,048 --> 00:21:36,504
Chris: Yeah. Yeah.
147
00:21:36,552 --> 00:21:42,244
Kayla: So, like, if you know the word, then you're on the in, and if you don't know the word, then you're identified as an outsider.
148
00:21:42,292 --> 00:21:45,452
Chris: It's like how we talk about jargon being part of the ritual criteria.
149
00:21:45,516 --> 00:22:02,052
Kayla: Yeah, he could have. Yeah. Many, of course, took this statement to mean that he was using EA as a cover for his shady dealings and his accrual of wealth. He later claimed he was referring to things like greenwashing and not EA efforts, but, like, damage kind of done.
150
00:22:02,236 --> 00:22:02,960
Chris: Right.
151
00:22:03,950 --> 00:22:09,318
Kayla: MacAskill has since expressed deep regret at being duped by SBF, for what it's worth.
152
00:22:09,374 --> 00:22:11,094
Chris: So did they break up their League of Legends team?
153
00:22:11,142 --> 00:22:11,838
Kayla: I think they did.
154
00:22:11,894 --> 00:22:12,486
Chris: Oh, no.
155
00:22:12,558 --> 00:22:14,766
Kayla: Well, I don't think you can play LoL in jail.
156
00:22:14,878 --> 00:22:24,022
Chris: Shit. SBF needs another support. Oh, yeah. Now he's just gonna get, like, you know, like, neo-Nazi Steve, his cellmate, is gonna have to be his playing partner.
157
00:22:24,086 --> 00:22:28,054
Kayla: Unfortunately, neo-Nazi Steve is probably not that far from a regular LoL player.
158
00:22:28,142 --> 00:22:31,498
Chris: Oh, zing. Sorry, Riot friends.
159
00:22:31,654 --> 00:22:56,350
Kayla: I don't know anything about LoL. I just wanted to make a burn. For what it's worth, as I mentioned before the intro music, Time reports that MacAskill and other EA leaders were actively warned about SBF being a fraud, being a deceiver, very early on, like 2018, and those warnings were essentially ignored. Like, MacAskill was literally with this guy till the end.
160
00:22:58,370 --> 00:23:10,694
Chris: When he. When the whole FTX thing went down, did MacAskill play it like, I had no idea? Or was he more like, well, I was warned.
161
00:23:10,742 --> 00:23:32,286
Kayla: But, you know, I think he played it more as, like, I'm outraged at how duped I was. I'm outraged at this harm that this guy has caused. I don't think he said, like, I should have known better. I could be wrong. He definitely tweeted about this, so, like, it's free to go and, like, look at and kind of see how you feel about it. But there was a lot of expression of, like, I'm outraged that this guy did this.
162
00:23:32,358 --> 00:23:48,814
Chris: Yeah, I'll give him a couple empathy points here, because, like, I do understand that, like, when you have a cause that you think is really important and you have a hose of money feeding that cause, right, there's gonna be a lot of sunk cost fallacy of, like, no, no, this guy has to be fine. Cause if he's not fine, then I'm fucked.
163
00:23:48,942 --> 00:24:02,794
Kayla: Yeah, and that's a really good point. Like, MacAskill has all these organizations, but, like, he himself is not an FTX tech bro. He himself is not Elon Musk. He himself is not generating billions and billions of dollars of wealth.
164
00:24:02,882 --> 00:24:07,130
Chris: Yeah. So there's a lot of motivation for him to dismiss warnings.
165
00:24:07,250 --> 00:24:13,818
Kayla: Yeah. And, like, none of us are perfect, but I think you gotta be more perfect when you're doing stuff like this.
166
00:24:13,954 --> 00:24:18,794
Chris: Yeah, absolutely. There's a higher standard when you're in command of that many resources.
167
00:24:18,922 --> 00:24:45,022
Kayla: Let's actually continue this conversation about MacAskill. Obviously, we talked about him in last week's episode, but I kind of held off on saying how I felt about him. And part of that was because I wanted people to come and listen to this episode. But another part of it is that I feel really complicated about it. I don't think he's a villain. I do think he's demonstrably naive, or has been demonstrably naive, and I think he's in a really unfortunate position right now.
168
00:24:45,166 --> 00:24:45,890
Chris: Yeah.
169
00:24:46,190 --> 00:25:00,798
Kayla: Academic and researcher Gwilym David Blunt, whose area of focus is, among other things, the ethics of philanthropy, wrote an article for The Philosopher titled Effective Altruism, Longtermism, and the Problem of Arbitrary Power, which you sent to me. So thank you for finding that.
170
00:25:00,854 --> 00:25:02,448
Chris: Wait, his last name was Blunt? Really?
171
00:25:02,574 --> 00:25:03,280
Kayla: Yeah.
172
00:25:03,860 --> 00:25:04,760
Chris: Sweet.
173
00:25:06,020 --> 00:25:09,720
Kayla: In this essay, he explains. Ha ha. That was me laughing at you.
174
00:25:10,020 --> 00:25:11,812
Chris: Yeah. Thanks for the support.
175
00:25:11,996 --> 00:25:53,068
Kayla: In the essay, he explains that there's an atmosphere of schadenfreude surrounding MacAskill now, particularly in the wake of FTX's spectacular fall, largely coming from other philosophers and academics, and, I would also argue, the media. Blunt explains that part of this might be related to MacAskill's success in doing one of the more difficult things in academia, breaking out of it and having a direct and recognized impact on the wider world. Blunt rightfully credits MacAskill with creating both the effective altruist and longtermist movements, and states that his Center for Effective Altruism has, quote, annual expenditures approaching $400 million, with about $46 billion more in funding commitments.
176
00:25:53,254 --> 00:25:55,224
Chris: That's a lot of money.
177
00:25:55,272 --> 00:25:56,768
Kayla: That's a lot of impact, baby.
178
00:25:56,864 --> 00:26:00,856
Chris: That's like a Scrooge McDuck swimming in your little gold coins amount of money.
179
00:26:01,048 --> 00:26:24,720
Kayla: Blunt goes on to describe how, in 2022, MacAskill expressed concern about a deep change in the overall vibe of effective altruism. What he originally imagined and embodied as a practice of ascetic frugality had now become a way for the very wealthy to wield more wealth. In short, his own philosophy, in breaking through to wider culture, had also gotten away from him and its original roots.
180
00:26:24,880 --> 00:26:53,540
Chris: It's interesting that he felt that switch, because I didn't feel it in time, but I definitely feel it in space with this, where I feel like there's kind of. I don't know, there's two different kinds of effective altruists, right? There's, like, the people that like to do some math and figure out where to donate their $10,000 or $5,000, and then there's, like, this Sam Bankman-Fried set of, like, crazy wealthy billionaires that are, like, you know, using it again as, like, a club.
181
00:26:53,880 --> 00:27:06,384
Kayla: I think that it's probably. They were able to tap into a. Ooh, if I tell people that if they invest in me, they're not just investing in me, they're investing in the future, and they're investing in these good causes. I get more money.
182
00:27:06,472 --> 00:27:19,712
Chris: Right. And especially, people are going to take advantage of that. People, you know, compared to 10, 20, 30 years ago, people are much more interested in investing in things that are more activist, investing in things that are more.
183
00:27:19,776 --> 00:27:20,488
Kayla: I hate that phrase.
184
00:27:20,544 --> 00:27:28,752
Chris: I know. I hate that phrase, too. But people are more likely today to invest in something that feels like it's doing something for the capital G. Good.
185
00:27:28,816 --> 00:28:23,782
Kayla: Right. So in this way, I feel for William MacAskill, because that's tough if you come up with this idea and you have this. Monkish is not the greatest word, but it's supposed to be. It was originally supposed to be more frugal. More ascetic is the word that is used. More austere, versus big and ostentatious and billions of dollars. This article kind of softened my heart towards him a little bit, which is good. And I think MacAskill was 24 years old when he developed the idea of effective altruism. He's a little baby boy. And 24-year-olds are, of course, well into adulthood, and MacAskill was deeply educated in philosophy, among other things. And still, 24-year-olds, while they have many gifts of youth that we lose in older age, 24-year-olds also often lack wisdom that does come with age.
186
00:28:23,966 --> 00:28:54,400
Kayla: And I think that there is some wisdom lacking around his approach to his own philosophy. It's worth talking about how he was unable to see some of this coming, that he couldn't look at history and recent events and think, wealthy people usually act in their own self-interest and often resort to unethical means to accrue wealth. It's worth talking about how, despite being warned about SBF and his shady practices, MacAskill was still duped, along with the rest of the FTX investors, and had to take to Twitter to express his rage and embarrassment over the whole thing.
187
00:28:54,700 --> 00:29:05,612
Chris: So were the warnings, like. I mean, if somebody had said, hey, this dude's sus, no cap, skibidi, do you think that would have gotten through to him?
188
00:29:05,676 --> 00:29:08,412
Kayla: I think he would have been like, what the hell are you talking about?
189
00:29:09,324 --> 00:29:12,084
Chris: Oh, he was 24 in 2018. Okay, so he's a millennial.
190
00:29:12,212 --> 00:29:16,740
Kayla: I think he's 37 now. No, he wasn't 24 in 2018. He was 24 when he came up with these ideas.
191
00:29:16,860 --> 00:29:17,372
Chris: Okay.
192
00:29:17,436 --> 00:29:19,924
Kayla: When he founded, like, Giving What We Can and those things.
193
00:29:20,012 --> 00:29:21,324
Chris: But he's more of a millennial.
194
00:29:21,452 --> 00:29:22,900
Kayla: Yeah, I think he's my age.
195
00:29:22,980 --> 00:29:29,840
Chris: Okay, so. Okay, so he's, like, right smack in the middle of millennial. Okay. So you'd have to be, like, hey, man, this SBF guy is cheugy.
196
00:29:30,140 --> 00:29:31,836
Kayla: Cheugy was a Gen Z term, baby.
197
00:29:31,868 --> 00:29:36,226
Chris: Oh, that was a Gen Z term against millennials, right? I don't know what we're talking about.
198
00:29:36,288 --> 00:29:44,014
Kayla: You don't even remember millennial jargon from 15 years ago or whatever. Harry Potter as effective altruism.
199
00:29:44,182 --> 00:29:46,102
Chris: SBF is like Draco Malfoy.
200
00:29:46,166 --> 00:29:47,294
Kayla: Yeah, that would have gotten through.
201
00:29:47,342 --> 00:29:48,490
Chris: Okay. Okay.
202
00:29:49,310 --> 00:30:16,630
Kayla: It's also worth talking about how, when Effective Ventures Foundation, a coalition of EA organizations including the Center for Effective Altruism, Giving What We Can, and 80,000 Hours, all of which MacAskill is involved in, when Effective Ventures Foundation bought Wytham Abbey, a literal manor house on 25 acres of land, for 17 million pounds to use as their new headquarters. And MacAskill does not really seem to see the problem there.
203
00:30:17,770 --> 00:30:26,938
Chris: Yeah. I mean, the optics there aren't great. He does say that. Well, he used the word ascetic, you used the word monk. But if you're gonna get a. You know, if you're gonna be monkish, get an abbey.
204
00:30:27,034 --> 00:30:29,912
Kayla: I guess that's true. Yeah. Like, you should go look up pictures of it.
205
00:30:29,936 --> 00:30:31,584
Chris: It's like a palatial abbey.
206
00:30:31,632 --> 00:30:35,480
Kayla: It's not Versailles, but it's a fucking abbey, man.
207
00:30:35,600 --> 00:30:36,300
Chris: Yeah.
208
00:30:36,640 --> 00:30:53,480
Kayla: And it does just bring up the question of why is an effective altruist group that is like, put your money where your mouth is. Why are they spending 17 million pounds on a mansion when the whole mission of the movement is to spend the most money where it can do the most effective material good.
209
00:30:53,640 --> 00:30:59,940
Chris: Yeah. And you know what? I've heard the argument. I don't know if you're going to bring this up, but I've heard the argument about, like, well, it's, you know, the.
210
00:31:01,500 --> 00:31:03,484
Kayla: We can do the best work always for show.
211
00:31:03,532 --> 00:31:22,212
Chris: Well, it's for show and for influence. So, like, if. Yeah, and we can do the best work, right? Like, we can work better here, people will take us more seriously, blah, blah. All the, you know, the sort of, like, aesthetic things that maybe it brings to the table, and then that has a positive ROI, so it's actually good. Don't really buy it.
212
00:31:22,236 --> 00:31:23,800
Kayla: I just don't buy it anymore, because.
213
00:31:24,780 --> 00:32:04,288
Chris: I feel like if you. If your thing is effective altruism, if your thing is, like, ascetic, you know, high-ROI giving, then wouldn't you be better off advertising yourself as being like, yeah, man, we work out of a warehouse? Like, that, to me, is much more effective. Like, I. There was. I forget who this was. The knowledge of this is long lost to me. But in business school, I remember hearing a story about, like, some CEO of this company, and they were, whatever, they were, like, very conscious about costs, and they were like, hey, we need to, you know, do things cheaply so we can, you know, pass on the savings to the customer, whatever it was. They wanted to be really cognizant about costs.
214
00:32:04,444 --> 00:32:30,616
Chris: And so this guy, like, sat on some. Like, his desk was, like, some busted-ass table, and it wasn't a corner office, and it was like he was sitting on, like, milk crates or something insane like that. And it was like, he could have spent $10 to get a decent chair, but, like, it was for show. It was for, like, hey, I'm sitting on milk crates, so, like, that's the attitude I want you guys to take. And I feel like that also could apply here.
215
00:32:30,768 --> 00:32:46,528
Kayla: If SBF figured out he could get more money by not getting haircuts and wearing flip-flops, then, like, I feel like that could maybe translate. I don't know, business. But also, I still just, like, don't buy the abbey. There's other options between milk crates and an abbey.
216
00:32:46,624 --> 00:32:58,074
Chris: Right, right. But, like, it's just, it's not like it's fine and all, I guess if you're, again, going for the show of it, but don't you want to show your DNA and not, like, what you're not about?
217
00:32:58,162 --> 00:33:11,090
Kayla: That's what I think. Blunt goes on to explain that MacAskill and others have failed to include a working theory of power, which results in major blind spots and loopholes in the EA framework.
218
00:33:11,210 --> 00:33:16,778
Chris: I think that sentence there is why I was like, oh, you got to read this, because I think that's sort of the key insight.
219
00:33:16,914 --> 00:33:31,030
Kayla: There seems to be little or no understanding of the fact that the harm caused by billionaires leveraging a system that allows them to become billionaires enough to embrace EA ideals cannot then be outweighed by the good they might do in their EA endeavors.
220
00:33:31,530 --> 00:33:32,730
Chris: Okay, maybe it's that sentence.
221
00:33:32,770 --> 00:33:36,338
Kayla: Actually, that was my sentence. Oh, that was your sentence summarizing what they're talking about.
222
00:33:36,434 --> 00:34:03,100
Chris: Yeah, I think that's just another thing that's so. And that particular insight, I think even goes beyond just EA, but to altruism in general, to charity in general, because that's a lot of these, starting with the Carnegies and the Rockefellers, that's what they like to do. But why are they in a position to be doing that in the first place? Do they want to interrogate that? Not really.
223
00:34:03,260 --> 00:34:11,139
Kayla: I still don't understand why Bill Gates keeps going. I'm just donating all my money to charity, and then he gets richer and richer. Yeah, I don't understand.
224
00:34:11,300 --> 00:34:13,320
Chris: But God forbid we get taxed.
225
00:34:13,699 --> 00:34:43,099
Kayla: EA founders like MacAskill and Peter Singer seem hellbent on viewing someone like SBF as a bad apple, an outlier, an unfortunate one-off. He's not the result of a flaw in the philosophy, even though the philosophy facilitated his rise to money and power, which facilitated his harmful behavior. Without a working theory of power, without grappling with structural power and what it means, EA and longtermism help put power in the hands of the ultra wealthy and then keep it there. Which is. Do I need to say why that's a problem?
226
00:34:44,719 --> 00:34:53,388
Chris: No, I think we've said it before, like three or four times in this episode. They get to make all the resource decisions. If that happens, that's not great.
227
00:34:53,484 --> 00:35:29,330
Kayla: EA and long termism look around and say, like, hey, let's put our money over here where the people need help, but they do not look around and say, like, hey, what are the structures in place that cause those people to need help in the first place? And how do we change that? Because those causes are so often necessary in generating the kind of wealth disparity that we're talking about. If you buy into the idea, which I do, that billionaires only exist when massive amounts of people live in poverty, it's easy for those billionaires to go, hey, I'll donate some of my wealth, rather than aid in the dismantling of the structures that allowed them to become so rich and powerful.
228
00:35:29,410 --> 00:35:30,082
Chris: Right?
229
00:35:30,266 --> 00:35:32,350
Kayla: It's an indictment on the whole system.
230
00:35:32,970 --> 00:35:34,230
Chris: I feel indicted.
231
00:35:34,650 --> 00:35:55,438
Kayla: EA seems unable to grapple with the fact that there are structural issues within their own philosophical framework and movement. And part of that is because the philosophy seems to avoid grappling with the very idea of structural issues. And like you said, this is a problem that comes up in the ethics of philanthropy and charity time and time again. Like, this is not reinventing the wheel. This problem is time immemorial.
232
00:35:55,574 --> 00:35:56,278
Chris: Right.
233
00:35:56,454 --> 00:35:57,902
Kayla: And they're not fixing it.
234
00:35:58,046 --> 00:36:02,294
Chris: No, because these people want to have their names on their college buildings.
235
00:36:02,342 --> 00:36:17,202
Kayla: It's very important. I will also note again that it's funny to me that the Center for Effective Altruism was like, maybe the issues with sexual misconduct in our ranks were actually the result of systemic misogyny, when they don't really seem equipped to talk about or engage with systemic issues elsewhere.
236
00:36:17,266 --> 00:36:22,030
Chris: Just, yeah, that's a little like, have your cake and eat it, too.
237
00:36:22,850 --> 00:36:50,080
Kayla: MacAskill and SBF and these guys aren't the only place to look for these criticisms of EA and longtermism's effect on empowering the mega wealthy with both, like, a lot of money and a lot of material power. Professor Gary Marcus, who is a researcher focused on the intersection of neuroscience, cognitive psychology, and AI, which is very cool, recently wrote an article titled OpenAI's Sam Altman is becoming one of the most powerful people on Earth. We should be very afraid.
238
00:36:50,540 --> 00:36:52,640
Chris: Great. And I am done. Yeah.
239
00:36:53,340 --> 00:37:14,088
Kayla: So Sam Altman. Another Sam. It's a Sam problem all the way down. Sam Altman is the CEO of OpenAI, the company behind the ubiquitous ChatGPT, and he's kind of recently become the poster child for the Silicon Valley AI movement. And he definitely has that Mark Zuckerberg, I'm just a regular guy and I'm just doing this because I really like it.
240
00:37:14,184 --> 00:37:19,860
Chris: Just a regular guy having people take pictures of me wakeboarding on the 4th of July.
241
00:37:20,440 --> 00:37:33,236
Kayla: Gary Marcus goes on to describe how Sam Altman uses deceit to build an image. He lies about owning no stock in OpenAI when he owns stock in companies that own stock in OpenAI. He's just, like, put layers.
242
00:37:33,388 --> 00:37:40,700
Chris: Wait, so he says, I don't own stock in OpenAI, but then he has an ownership stake in Y.
243
00:37:40,740 --> 00:37:44,440
Kayla: Combinator, in Y Combinator, and Y Combinator owns stock in OpenAI.
244
00:37:45,780 --> 00:37:48,044
Chris: Okay, so he's just straight up lying. Okay.
245
00:37:48,092 --> 00:37:54,240
Kayla: And it's not like he doesn't know, because he was like, I don't know if he currently is, but he was at one time the president of Y Combinator, so he knows.
246
00:37:54,820 --> 00:38:01,366
Chris: That's like me saying, like, well, I don't own XYZ stock, even though, like, it's part of a mutual fund that I own. Yes, I do.
247
00:38:01,398 --> 00:38:25,384
Kayla: Yes, you do. He lies about being pro-AI regulation while actively working against it. He lies about whether or not a certain famous actress voiced his ChatGPT program, even when that certain actress said, don't fucking use my voice. And then he tweeted, quote, her, when the ChatGPT voice came out, Her being the name of a movie that Scarlett Johansson was in, and it sounded exactly like Scarlett Johansson's voice, even though she said, don't use my voice.
248
00:38:25,542 --> 00:38:26,556
Chris: Yeah. And then she sued.
249
00:38:26,628 --> 00:38:27,796
Kayla: Yeah. Which she should have.
250
00:38:27,868 --> 00:38:32,732
Chris: Of course, at no point in that process did she say maybe?
251
00:38:32,796 --> 00:38:33,260
Kayla: No.
252
00:38:33,380 --> 00:38:33,724
Chris: Yeah.
253
00:38:33,772 --> 00:38:58,590
Kayla: No, he was moving fast and breaking things, and you shouldn't do that. Ex-employees of OpenAI have been forced to agree to not talk badly about the company. I forget exactly what it was, but they were coerced into signing contracts that said they would lose all their stock options or something if they talked badly about the company, which is illegal, I think. OpenAI had to be like, oh, sorry, I guess we won't do that.
254
00:38:59,530 --> 00:39:01,362
Chris: Okay. I guess that's good at least.
255
00:39:01,466 --> 00:39:23,180
Kayla: Sam Altman's been recruited by Homeland Security to join their AI safety and security board. I do not know whether he actually has, but I know that, like, they've tried to recruit him, okay? While actively working to dismantle AI safety in his own company. And he's made a shit ton of money doing all this, even though he's, like, one of those guys. Like, I don't take salary, I don't have stock. Yes, you do.
256
00:39:23,370 --> 00:39:28,112
Chris: I thought OpenAI was just, like, losing money, like, burning piles of cash hand over fist.
257
00:39:28,176 --> 00:39:33,112
Kayla: I don't know how anything works, because it seems like that's how every company these days in Silicon Valley is.
258
00:39:33,176 --> 00:39:33,792
Chris: No, you're right.
259
00:39:33,856 --> 00:39:38,544
Kayla: They're losing all this money while the CEO and the execs become fabulously wealthy somehow.
260
00:39:38,592 --> 00:39:41,640
Chris: Yeah. Yeah. I also don't really know how is.
261
00:39:41,680 --> 00:39:45,300
Kayla: Elon Musk so rich when, like, half of his companies are constantly losing money?
262
00:39:46,320 --> 00:40:23,560
Chris: I feel like this is an important question to answer, and I don't quite have the answer to it. But it was like, when we were watching Succession, it was like, it didn't faze me at all that there was, like, this whole plotline of, like, oh, my God, Waystar Royco, like, owes bajillions, and, like, we're way in the red. Way in the red. And yet all the Roys were just, like, on yachts and private jets. I was like, yeah, that makes sense. I don't really understand how this person is less solvent than I am, but they're the one on the yacht. I don't really get it, but that does track.
263
00:40:26,060 --> 00:40:48,764
Kayla: Yeah, it does. There's a lot more to the Sam Altman story. We'll link the article in the show notes so you can read further, because it is an article that. And it's not an anti-AI article. This Gary Marcus fellow is a pro-AI guy. Read the article, see what you think. But just know that Sam Altman is another self-proclaimed effective altruism guy.
264
00:40:48,852 --> 00:40:49,956
Chris: Oh, of course he is.
265
00:40:50,108 --> 00:41:07,580
Kayla: And there's no safeguards in place keeping this guy from fucking everything up with his quest to move fast and break things, deregulate, and de-safetify AI so he can either make his company rich or himself rich, or at least become very powerful, even if he doesn't have any money. This is a powerful man. He's being recruited by Homeland Security.
266
00:41:07,660 --> 00:41:08,560
Chris: Right. Right.
267
00:41:09,140 --> 00:41:19,288
Kayla: There's two more things I want to talk about before we wrap up. First. Last week I said I'd explain the difference between EA and utilitarianism. It didn't really fit anywhere, so I'll just say it here.
268
00:41:19,344 --> 00:41:24,408
Chris: Oh, yeah. Cause we said it was kind of like. It kind of feels like a modern iteration of utilitarianism.
269
00:41:24,504 --> 00:41:27,512
Kayla: Luckily, Wikipedia has a section on exactly this.
270
00:41:27,576 --> 00:41:28,312
Chris: Oh, perfect.
271
00:41:28,416 --> 00:41:57,598
Kayla: It states that EA does not claim that the good is the sum total of well-being, and that, quote, EA does not claim that people should always maximize the good regardless of the means. It goes on to explain that Toby Ord, one of the original philosophers behind EA, described utilitarians as number-crunching, compared with effective altruists, who are guided by conventional wisdom tempered by an eye on the numbers. So they're the same, but different. They're different flavors.
272
00:41:57,694 --> 00:42:03,294
Chris: Okay. Okay. I think I understand. I'm gonna have to give that more thought. But thank you for the disambiguation.
273
00:42:03,382 --> 00:42:17,748
Kayla: I think the effective altruists try to paint the utilitarians as, like, they just look at the numbers and nothing else matters, which, like, maybe some don't. And effective altruists look at the numbers, but they also consider other things. And I kind of think that.
274
00:42:17,924 --> 00:42:18,900
Chris: Okay, so what you're telling me.
275
00:42:18,900 --> 00:42:20,156
Kayla: Think there's a bit of an overlap.
276
00:42:20,228 --> 00:42:27,732
Chris: Is that utilitarians are pro-mote, or no, pro-torture, and the EAs are pro-mote, probably.
277
00:42:27,876 --> 00:42:37,134
Kayla: Okay, you gotta ask them. Lastly, I wanted to talk about community, and sorry if I'm getting on a soapbox here, because it's something that's been.
278
00:42:37,142 --> 00:42:38,822
Chris: Like, Kayla, this is a podcast.
279
00:42:38,886 --> 00:43:18,050
Kayla: I know this whole episode has been a little soapboxy. I was like, here's the criticisms, and they're all mine. This is something that's been rolling around in my head ever since we started talking about these episodes and this topic. I think one of my personal criticisms of EA and longtermism is that it seems to remove a sense of community building and mutual aid from the idea of philanthropy or helping. And I don't think. Again, I don't think it's wrong for MacAskill to argue that it's better, if you have $100, to use that money to help 100 people living in an impoverished area across the world from you rather than helping ten people living next door to you. There's certainly an argument there. I don't think that's wrong.
280
00:43:19,670 --> 00:44:07,000
Kayla: I think it's good to think about the global community and consider where we can help and who matters in this world. But I also think that philosophically diminishing the help you can do in your own community as ethically inferior has its own downsides. Like, I think that people like MacAskill and the Elon Musks and the various Sams of the EA world, they feel very cut off from people like you and me. I think the wealthier you get, the more cut off from, quote unquote, regular society you become, to the point where you can only relate to other extremely wealthy people, and you live in this really hard-edged bubble that cannot be penetrated. Ha ha. The world becomes a series of, unfortunately, calculations and hypotheticals. Like, hypotheticals you're detached from, and that's really, like, the opposite of community.
281
00:44:07,080 --> 00:44:10,096
Kayla: You do not live with the people who are your neighbors.
282
00:44:10,248 --> 00:45:06,810
Chris: Yeah, man. Yeah. I also am of two minds on this. You're right. It's good to widen the scope of who you think of as your neighbor and who you are willing to give charity to, and consider the global community and all of humanity. That all sounds nice. But on the other side, there's the contribution to the atomization of society. And if we're all just doing the math, which seems to be what they complain about with utilitarians, but anyway, if we're all just doing the math to say we can help the most people in XYZ place, don't worry about physically going down to the soup kitchen or whatever, or even just, I don't know, baking a pie for your neighbor. Maybe they're still into that. But it just. It feels like it's emphasizing one thing over the other just because of the efficiency and the effectiveness.
283
00:45:07,710 --> 00:45:09,158
Chris: Yeah, I don't think.
284
00:45:09,254 --> 00:45:10,970
Kayla: It's not the only thing eroding.
285
00:45:11,390 --> 00:45:30,184
Chris: Yeah. Eroding the community down to the, like I said, atomization, where everything you do has to be mediated through an app. And you have to. If you're gonna swim at a friend's pool, it has to be through Swimply. And if you're gonna, you know, get food from a friend, it has to be through Uber, and yada yada. If you're gonna stay at a friend's place, it's gotta be through Airbnb.
286
00:45:30,272 --> 00:45:37,864
Kayla: Right? Right. I read a quote, during all this research that I lost in one of my tab closures, and I haven't been able to find it again. Forgive me.
287
00:45:37,872 --> 00:45:38,800
Chris: Oh, tab closures.
288
00:45:38,880 --> 00:46:21,930
Kayla: No, forgive me for paraphrasing, especially, but someone pointed out that the Silicon Valley effective altruists literally live in their ivory towers above San Francisco, debating and calculating how to get more wealth and what future populations to help with it, while scores of people literally suffer and die in the streets of San Francisco below them. Like, literally below them, literally beneath their feet. And that point stuck with me. Like, we live in Los Angeles. We live in a city that has a tremendous unhoused population. Like, I think it's 75. It's north of 75,000 people. And I think about. That's so many. And I think that's just LA city. I think LA county is more. And so I think about that 200,000 number that Scott Alexander talks about.
289
00:46:22,030 --> 00:46:48,198
Kayla: And I think about, if Elon Musk were to take some of his exorbitant wealth and do something to house unhoused people in Los Angeles. If you want to think about it in longtermist terms, that's not just helping those 75,000 people, that's helping the unhoused people that come after them as well. I don't know why they're not thinking about these things. I think that the destruction of community does have material impact.
290
00:46:48,354 --> 00:46:54,174
Chris: Well, I think that's part of my problem with longtermism, is that there's just a lot of, like, assumption that we are doing the calculations correctly.
291
00:46:54,222 --> 00:46:54,810
Kayla: Right.
292
00:46:55,110 --> 00:47:13,030
Chris: And I just don't know that you can do that. You know, it's like, oh, well, we're definitely spending the money in the right place. Like, are you, like, you have certainty on that? Like, we're applying the certainty of mathematics to something that doesn't feel that certain to me. I'm not saying there isn't a correct answer. I'm just saying that we don't know what it is.
293
00:47:13,070 --> 00:47:13,516
Kayla: Right.
294
00:47:13,638 --> 00:47:23,256
Chris: And you certainly don't know what it is with your EA calculations or your utilitarian calculations. It's just. Yeah, that's one of my problems with it.
295
00:47:23,448 --> 00:47:39,144
Kayla: Essentially, it's not morally inferior to help your neighbors. Like, our communities are important, and I think that effective altruism and longtermism divorce the wealthy from that idea. Yeah, I lied. There's one more point.
296
00:47:39,232 --> 00:47:40,912
Chris: Oh, my God. You're just like, one more point.
297
00:47:40,976 --> 00:47:59,596
Kayla: You're like, I know. The longtermist argument. This is a little point. The longtermist argument that future people morally matter just as much, if not more, than people literally alive right now is, like, a super duper steep, slippery slope. And I worry about the anti-abortion argument lurking in that philosophical mire. Like, I really worry about.
298
00:47:59,628 --> 00:48:01,476
Chris: That's fundamentally natalist.
299
00:48:01,588 --> 00:48:06,620
Kayla: I hope long termists at some point grapple with that, because I'm worried about that one.
300
00:48:06,780 --> 00:48:21,060
Chris: Yeah. And, like, I gotta say, as an economist, I also am kind of annoyed that, like, they're not applying a future discount to the moral value, you know? Like, you do that with money. Future money is worth less than current money.
301
00:48:21,100 --> 00:48:21,716
Kayla: That's true.
302
00:48:21,828 --> 00:48:28,188
Chris: So why aren't you doing that with future people? Like, people ten years from now should be like, 50% of people now and so on.
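A minimal sketch of the standard present-value discounting Chris is gesturing at, assuming a constant annual discount rate r (the rate and the ten-year figure here are illustrative only, not numbers from the episode):

% Present value of a benefit W (money, or moral weight in Chris's analogy)
% realized t years from now, discounted at a constant annual rate r:
\[
  PV = \frac{W}{(1 + r)^{t}}
\]
% Chris's "50% of people now" in ten years corresponds to
\[
  (1 + r)^{10} = 2 \;\Rightarrow\; r = 2^{1/10} - 1 \approx 0.072,
\]
% i.e. roughly a 7.2% annual discount rate.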
303
00:48:28,244 --> 00:48:30,252
Kayla: You should email Sam Altman. Tell him that.
304
00:48:30,316 --> 00:48:36,692
Chris: See, but they're like saying, oh, we're so objective with all of our math. And, like, have you even taken an econ 101 class?
305
00:48:36,756 --> 00:48:43,532
Kayla: I think they have, and I think that's the problem. Caroline Ellison's father is like an econ professor at MIT.
306
00:48:43,636 --> 00:48:45,196
Chris: Well, then he should be bringing this up.
307
00:48:45,308 --> 00:48:46,180
Kayla: You should go email him.
308
00:48:46,220 --> 00:48:46,724
Chris: I'll email him.
309
00:48:46,772 --> 00:49:10,782
Kayla: I'm sure he doesn't have anything else to worry about right now. Okay. I have talked a lot about these guys and about effective altruism and longtermism and everything that goes along with it. And I think it's finally time we made our way through the TESCREAL bundle. We hit basically every letter on our way down the ladder. So next week on Cult or Just Weird: is TESCREAL a cult, or is it just weird?
310
00:49:10,886 --> 00:49:18,174
Chris: Oh, actually, sorry, Kayla. There's one more thing that we need to cover, I think. Which is context.
311
00:49:18,222 --> 00:49:19,582
Kayla: But we did all the TESCREAL letters.
312
00:49:19,646 --> 00:50:00,826
Chris: I know we did do all the TESCREAL letters. And I know that we're really chomping at the bit at this point. I kind of feel like EA, by itself, could be its own criteria. We probably could have evaluated that. But I want to talk a little bit about the context of eugenics that sort of is not. Behind is not a good word. It's sort of like a precursor, but it's a complex precursor to all this stuff. And I don't want to. We'll get to that. That'll be next week's episode. But I don't want to give the impression that, yeah, eugenics, the super Nazi part of eugenics, is just everything we've talked about.
313
00:50:00,858 --> 00:50:02,634
Kayla: You're saying it's E-TESCREAL.
314
00:50:02,802 --> 00:50:14,034
Chris: It's sort of E-TESCREAL, but the TESCREAL bundle has some DNA in the eugenics movement, and I feel like that's important context to bring up before we do the criteria.
315
00:50:14,082 --> 00:50:32,534
Kayla: That's really good, because I left out a bunch of stuff from this episode that has to do with eugenics-adjacent stuff that's related to effective altruism. So. Perfect. All right, next time on Cult or Just Weird, we're talking about eugenics. Again. This is Kayla, this is Chris.
316
00:50:32,622 --> 00:50:36,650
Chris: And this has been Cult or Too Much Context.