Transcript
1
00:00:00,960 --> 00:00:39,234
Chris: Transhumanism's quest to eliminate disability is entangled, therefore, historically, structurally, and symbolically, with racism and cissexism. While transhumanists might want to deny these connections, they are deeply embedded in the construction and positioning of disability. Intellectual disability, in particular, has been used to oppress racial and sexual others. What are. What's happening? That's just.
2
00:00:39,282 --> 00:00:40,658
Kayla: The chair did that.
3
00:00:40,754 --> 00:00:41,994
Chris: The chair did what?
4
00:00:42,042 --> 00:00:43,150
Kayla: Made me turn.
5
00:00:45,450 --> 00:00:46,090
Chris: Oh, my God.
6
00:00:46,130 --> 00:00:47,110
Kayla: This is the chair.
7
00:00:50,690 --> 00:00:52,874
Chris: Turning in the chair made you laugh.
8
00:00:52,962 --> 00:00:54,594
Kayla: Look. No, it's just the chair.
9
00:00:54,682 --> 00:00:58,918
Chris: Well, yeah, it just. It's just off. It's just off. What? I don't know.
10
00:00:58,934 --> 00:01:02,730
Kayla: Like, it's just I have to anchor myself with my hands the whole time.
11
00:01:03,190 --> 00:01:07,798
Chris: Yeah. Welcome to chairing. I don't know. So when you sit in your chair.
12
00:01:07,854 --> 00:01:10,646
Kayla: Out there, you just spin around wildly?
13
00:01:10,798 --> 00:01:19,998
Chris: I don't think that was wild. I think you were just, like, slightly spinning there. All right, I'm warmed up. Was that all recording or.
14
00:01:20,014 --> 00:01:23,918
Kayla: No, that was all. That was all recording. Don't worry. And make sure you get all the gold.
15
00:01:24,014 --> 00:01:28,466
Chris: Good. No, that's not gold. That was. That was garbage. That was garbage. Bamboo.
16
00:01:28,498 --> 00:01:29,594
Kayla: Grade A gold.
17
00:01:29,682 --> 00:01:38,914
Chris: Grade A garbage. Gold. I guess we'll just say welcome to Cult or Just Weird right off the bat. I'm Chris. I'm a game designer slash data scientist.
18
00:01:39,002 --> 00:01:43,550
Kayla: I'm Kayla. I'm not gonna tell you what I am. I'm gonna let you decide for yourselves.
19
00:01:44,170 --> 00:01:46,802
Chris: No, you're supposed. It's our credentials.
20
00:01:46,826 --> 00:01:50,226
Kayla: I'm gonna let you define me. Also, I write TV.
21
00:01:50,378 --> 00:02:06,020
Chris: Welcome to Cult or Just Weird. We are happy that you are here. If you want to make this conversation two-way, you can come join us on Discord. The link is in the show notes. Also, go to Patreon and give us money.
22
00:02:06,320 --> 00:02:07,640
Kayla: Obviously, that's.
23
00:02:07,680 --> 00:02:15,424
Chris: Yeah, more the money thing. And then talk with us. Actually, if you give us money on Patreon, you get access to exclusive channels in the Discord.
24
00:02:15,592 --> 00:02:16,480
Kayla: They're the best channels.
25
00:02:16,520 --> 00:02:25,438
Chris: Bada bing. There's my call to action. Do you want to do some banter? Actually, you know what's funny? It's not funny. It's more horrible and pathetic.
26
00:02:25,494 --> 00:02:26,090
Kayla: Right.
27
00:02:27,030 --> 00:02:30,822
Chris: Like, I was doing. So, you remember, I was doing the transcripts for all the episodes.
28
00:02:30,886 --> 00:02:36,286
Kayla: Yes. Which, all the episode transcripts are now up if you need them, including the old episodes. They're all there for you.
29
00:02:36,398 --> 00:02:41,238
Chris: Which means I had to go through some of the old episodes and. Oh, my God, they are so fucking cringe, dude.
30
00:02:41,294 --> 00:02:41,970
Kayla: Don't.
31
00:02:42,790 --> 00:03:02,638
Chris: Especially the thing that made me think of it was just saying the word banter, because, like, in the old episodes, it's like two or three seasons where, like, each and every episode in a row, our opening chitchat, I'll call it, was like, should we do banter? I don't know. Is this. Does this count? Like, oh, my God, it's so bad.
32
00:03:02,694 --> 00:03:03,734
Kayla: What are we doing here?
33
00:03:03,782 --> 00:03:08,278
Chris: I wanted to, like, reach back in time and choke myself, and it was only, like, three years ago.
34
00:03:08,454 --> 00:03:11,022
Kayla: I want to choke myself three years ago for a lot of reasons.
35
00:03:11,086 --> 00:03:23,236
Chris: Yeah, I guess that's true. Well, okay, so actually, let's just get to it. Like, I was gonna. I had a plan of, like, oh, let's. I'm going to talk about some of the video games I've been playing. Oh, let's talk about the TV show we just watched.
36
00:03:23,348 --> 00:03:24,396
Kayla: The people don't care.
37
00:03:24,468 --> 00:04:11,612
Chris: We got to get to the content here. We got to really get to the topic, because, just like last week, it's kind of big, and I don't think I'm going to make a shorty episode out of such a big topic. So the reason the topic's big is because it's eugenics. We've been talking about eugenics. We want to talk about eugenics because it's related to transhumanism, but eugenics is a big topic. So before we get into episode number two about it, I just kind of want to, like, recap what the goal of these episodes is. So basically, I just. I began feeling more and more like, as we were studying TESCREAL this season, I just got the feeling that it's, like, impossible to understand this bundle of ideologies without understanding eugenics.
38
00:04:11,676 --> 00:04:46,152
Chris: So I just felt like it was super important to at least scratch the surface of eugenics as a topic. As usual, what counts as the surface of something is entirely unclear, and scratching at it just causes you to get sucked in more and more. But I just wanted to state that goal of educating about eugenics, so as not just to enrich, but actually allow for understanding of transhumanism and the rest of all the letters. That doesn't mean I think they're the same thing, or even that one is a subset of the other. The relationship is much more complicated than that.
39
00:04:46,336 --> 00:04:55,696
Kayla: Wait, I thought we were here because you were accusing all transhumanists and TESCREALists of being secret eugenicists. That's not what we're doing here? You're not accusing them all of being secret eugenicists?
40
00:04:55,808 --> 00:04:56,820
Chris: Well, a little.
41
00:04:58,840 --> 00:05:00,740
Kayla: That was a joke from Kayla.
42
00:05:02,400 --> 00:05:16,316
Chris: There's one more thing. I think there's actually another area of study. I have become convinced that there's a third link in that chain there. To understand transhumanism, you have to understand eugenics. And to understand eugenics. Well, we'll get to that.
43
00:05:16,388 --> 00:05:17,172
Kayla: Oh.
44
00:05:17,316 --> 00:05:18,320
Chris: Are you ready?
45
00:05:19,300 --> 00:05:21,316
Kayla: Maybe. Yes, I am.
46
00:05:21,388 --> 00:05:22,676
Chris: You better. It doesn't matter.
47
00:05:22,788 --> 00:05:23,700
Kayla: But I was born ready.
48
00:05:23,740 --> 00:05:46,110
Chris: That was a rhetorical question. So the structure of the episode here, you and I are going to talk about the actual substantive ideological connections between transhumanism and eugenics. And then we're also going to kind of talk about like the people involved and then we're going to talk about that third thing that I said we would get into. Alright, so, question. Do you remember the eugenics slogan we ended last episode with?
49
00:05:46,850 --> 00:05:47,282
Kayla: No.
50
00:05:47,346 --> 00:05:56,594
Chris: No, of course you don't, because that was a week ago. Okay. It goes like. Like this. Like. So eugenics is the self-direction of human evolution.
51
00:05:56,722 --> 00:05:57,858
Kayla: I remember that now.
52
00:05:57,954 --> 00:06:08,080
Chris: Right. So that was a slogan at some eugenics conference in like the twenties or something. But I just, I. The reason I left us with that is because it just sounds like such a transhumanist statement.
53
00:06:08,500 --> 00:06:09,320
Kayla: Yeah.
54
00:06:09,740 --> 00:06:28,040
Chris: Another question. This one should be much easier. A good friend who listens to the show asked me this exact question actually a few weeks ago at dinner. Based on everything we've learned so far, do you think that the fundamental through line in transhumanist ideology is eugenics?
55
00:06:30,180 --> 00:06:34,946
Kayla: Are you asking me? Are you just asking Kayla? Stating that a friend asked you this question?
56
00:06:35,108 --> 00:06:37,782
Chris: Both. Both of those things. Hey, a friend asked me this, but.
57
00:06:37,806 --> 00:07:17,826
Kayla: Don't give me your input. Anyway, moving on. Is it eugenicist to combine with a computer? I think. No, I think that it is not eugenicist to be like, hey, I want a cyborg arm, or I'm going to, like, transhuman my body. I think if the transhumanists are like, I am going to have a baby in this way, with the standard quote unquote way that we have a baby, but while it's in the womb, we're going to genetically. The standard way.
58
00:07:17,858 --> 00:07:19,230
Chris: Can you describe it in detail?
59
00:07:19,930 --> 00:07:21,042
Kayla: Well, it's a lot of in and.
60
00:07:21,066 --> 00:07:22,578
Chris: Out, in and out humping.
61
00:07:22,634 --> 00:07:23,002
Kayla: Yeah.
62
00:07:23,066 --> 00:07:23,830
Chris: Okay.
63
00:07:25,050 --> 00:07:31,050
Kayla: I think where it starts to get into a potentially. Do you really want that to be part of our podcast?
64
00:07:31,210 --> 00:07:31,892
Chris: Yeah.
65
00:07:32,026 --> 00:07:52,264
Kayla: Okay, well, you're the one wasting time. I think where the gray area starts to come in is like, are the transhumanists saying by and large that like, human birth should be controlled largely via IVF in a way that, like, engineers, like, are we doing baby engineering as, like, as, like, a standard?
66
00:07:52,352 --> 00:07:52,624
Chris: Sure.
67
00:07:52,672 --> 00:08:04,936
Kayla: I think there's a difference between doing baby engineering and doing, like, hey, cool. When you, like, are. When you exist as a person, you, like, give yourself a cyborg body. I think those are different things.
68
00:08:05,088 --> 00:08:11,896
Chris: Right. So when you say baby engineering, I'm assuming you mean, like, babies with, like, little hard hats and slide rules going.
69
00:08:11,928 --> 00:08:13,568
Kayla: Out and building construction babies.
70
00:08:13,664 --> 00:08:23,280
Chris: Right. Okay. Okay. That makes sense. No, I think I pretty much agree with you there. But, you know, of course, I'm asking this at the top of the episode, so maybe we revisit that question.
71
00:08:23,320 --> 00:08:26,544
Kayla: Yeah, I'm an ill informed member of the public. I don't know.
72
00:08:26,632 --> 00:08:37,940
Chris: I just wanted your hot take. Here's another hot take. So I don't know if you remember our friend James Hughes. Do you remember that name from multiple episodes ago? No, he's come up a couple times.
73
00:08:38,059 --> 00:08:38,659
Kayla: Okay.
74
00:08:38,740 --> 00:08:56,796
Chris: Okay. So James Hughes was the. He was the head of, I don't know what title it was, but the head of the World Transhumanist Organization. Association, excuse me. That then became Humanity Plus. And he was the guy that was sort of, like, at the helm when they had that big, like, libertarian/left split.
75
00:08:56,868 --> 00:08:57,460
Kayla: Right, right.
76
00:08:57,540 --> 00:09:37,950
Chris: And he's the one that took it left. He's also published a rebuttal to the whole TESCREAL thing. I think we mentioned that in the episodes where we interviewed Doctor Torres. But he published a big, oh, why this is a conspiracy theory, and it's a critique from the left. He's a very left guy, so he's an interesting character. He reminds me of a lot of the guys, and, not to call him a eugenicist, I'm not saying that, but he reminds me a lot of some of the characters from last episode, where it's like, he's progressive, he's science focused. But does he have a blind spot? I don't know.
77
00:09:37,990 --> 00:09:38,310
Kayla: Right?
78
00:09:38,390 --> 00:09:48,654
Chris: I don't know. I did. Like. I'm gonna read this quote, though. Here's what he has to say about that whole question. Basically, quote, whether germinal choice is.
79
00:09:48,742 --> 00:09:50,838
Kayla: What is germinal? I'm sorry.
80
00:09:50,934 --> 00:09:53,222
Chris: Germinal is a country in Europe.
81
00:09:53,326 --> 00:09:54,038
Kayla: Okay.
82
00:09:54,094 --> 00:10:09,338
Chris: Yeah, they have beer and bratwurst. No, germinal. Just germ. Or. Germinal is referring to germ cells, and germ cells are the cells that develop into a reproductive cell, like either an egg or a sperm. So germinal just refers to, like, reproductive stuff.
83
00:10:09,394 --> 00:10:09,890
Kayla: Okay.
84
00:10:09,970 --> 00:10:56,072
Chris: Okay. So he says whether germinal choice really is eugenics depends on one's definition of eugenics, which, like, yeah, I don't know. It feels kind of obvious to me. But anyway, he continues, the eugenics movement that spread across Europe and the United States before 1945 encouraged selective breeding and was responsible for the mandatory sterilization of criminals, the poor, the disabled, and dark-skinned people based on unscientific theories. There are very few advocates of this older eugenics around today. And to the extent that anyone advocates racist, classist, or authoritarian ideas, they are to be despised. But if eugenics includes believing that individuals free of state coercion should have the right to change their own genes and then have children, then the advocates of human enhancement and germinal choice are indeed eugenicists.
85
00:10:56,246 --> 00:11:37,004
Chris: If eugenics also includes the belief that parents and society have an obligation to give our children and the next generation the healthiest bodies and brains possible, then most people are eugenicists. Once safe, beneficial gene therapies are available, parents will feel the same sense of obligation to provide them for their kids as they do a good education and good health care. As bioethicist Arthur Caplan has said, many parents will leap at the chance to make their children smarter, fitter, and prettier. They'll slowly get used to the idea that a genetic edge is not greatly different from an environmental edge. On the other hand, if eugenics is authoritarian genetic correctness, it is precisely the bioLuddites who are today's eugenicists.
86
00:11:37,172 --> 00:11:50,240
Chris: The bioLuddites are the ones who want laws on what kind of children we can and can't have, who want to forbid people from controlling their own bodies and reproductive choices, end quote. Does that clear things up at all?
87
00:11:50,890 --> 00:12:04,298
Kayla: I got a lot of things to say, and I can't say most of them. I think there's some really insidious twisting of words there. And I don't care for the phrase bioLuddites.
88
00:12:04,354 --> 00:12:18,500
Chris: I don't care for that either. I came away from this quote mixed as you are. I still like James Hughes. I think most of what I've read from him has been generally good. And I don't think this is all bad either.
89
00:12:18,580 --> 00:12:18,964
Kayla: Agreed.
90
00:12:19,012 --> 00:12:23,340
Chris: There's just, like, several things in here that are kind of sus, that make me go like, ew, I don't like that you said that.
91
00:12:23,380 --> 00:12:43,052
Kayla: Well, what seems to constantly be missing from these conversations is, like, the hows. And I'm not sure if everybody who's listening to somebody saying those words gets that it implies that all reproduction is going to become IVF. That implies that all reproduction, I mean.
92
00:12:43,076 --> 00:12:46,380
Chris: Transhumanists might say that's the case, though.
93
00:12:46,720 --> 00:13:14,820
Kayla: And I am so worried about the financial aspect of that. I have a lot of thoughts and feelings about that desire to turn reproduction solely into an assisted reproductive technology approach. That's not. I have a lot of things to say. It'll take a lot to. We only have so much time here, so I want to wait and see what you have to say more about this stuff.
94
00:13:15,240 --> 00:13:16,060
Chris: Yeah.
95
00:13:16,240 --> 00:13:20,964
Kayla: Some people watch Gattaca and come away with one conclusion, and some people watch Gattaca and come away with another.
96
00:13:21,092 --> 00:13:54,330
Chris: Right, right. Well, I think it's that type of film and it's that type of topic. Right. Like, it's a confusing topic, even for people who, like, oh, he's progressive and has, like, a lot of good ideas. I'm still kind of uncertain about his take on it. However, I will say that I think that summarizes sort of the transhumanist take on things. Right. Like, a lot of the points he made in there are talking points that come up again and again. Right. Like, oh, well, what's the difference between this and giving my kid the best chance at school and blah, blah, right.
97
00:13:55,230 --> 00:13:57,838
Kayla: It would take a long time to sit here and talk about it, but there are differences.
98
00:13:57,974 --> 00:14:06,150
Chris: There are differences. But as to the talking points of transhumanists, when this topic comes up, I feel like that's a pretty good representation.
99
00:14:06,270 --> 00:14:07,444
Kayla: Okay. Got it.
100
00:14:07,542 --> 00:14:52,618
Chris: As an aside, my answer to my friend a few weeks ago, on the whole, like, is transhumanism just eugenics? I said no. It was kind of equivalent to what you were talking about. But my, like, well, okay, if it's. If the answer is no, then what is it? Then I gave him this whole long answer about how I think that, like, the pedigree of transhumanism is actually more humanism and that eugenics and transhumanism are both branches on that tree. They're not. Like, the trunk isn't eugenics, and transhumanism comes off of it. Humanism is the trunk, and they both come off of it. After doing all this research in the last couple weeks now, I'm like, okay, I still think that's the case.
101
00:14:52,674 --> 00:15:27,298
Chris: I still think that the central thread is humanism and progress and enlightenment, and that those are branches, but there's also all these vines going in between the branches, and one of them is grafted to the other. It's just, it's so complicated. Like, you can't even envision a tree, you know? Like, I get confused. Like, I think we were talking about this today, where sometimes I feel like I'm, like, one zoom out away from just saying, did you know that people influence other people throughout time? People have influenced people. I feel like I'm, like, one step away from that.
102
00:15:27,394 --> 00:15:30,114
Kayla: Everything is everything. Yeah, all of the time.
103
00:15:30,202 --> 00:15:30,936
Chris: I know.
104
00:15:31,098 --> 00:16:19,450
Kayla: I do like what he said about, yo, if you want to change your genetics and then have kids, like, cool. Cool as hell. I'm like, yeah, that sounds great. I think another thing that's missing from this conversation is just what choice means. And it's something that I haven't seen a lot of engagement with, because, like, what does choice mean in this world? Does choice mean that these options are, quote unquote, available to everyone? Or does, like, we keep talking about, does choice mean these options are available to the wealthy elite? Because if these options are only available at a certain price point, then it starts to be a really. An extreme version of the, like, conversation. It's equivalent to somebody being like, well, you don't have to have an iPhone to get by in today's society. You can also have Android.
105
00:16:19,570 --> 00:16:21,026
Kayla: And that's kind of leaving out.
106
00:16:21,058 --> 00:16:21,950
Chris: Excuse me.
107
00:16:22,730 --> 00:16:38,246
Kayla: That's leaving out the fact that, like, an iPhone. There are huge swaths of people that simply cannot afford a smartphone at all. And so it's not about, like, the quote, unquote, choice between smartphones. It's that there is not a choice for some people because of the price.
108
00:16:38,278 --> 00:16:39,318
Chris: Is it the green text box?
109
00:16:39,374 --> 00:16:40,238
Kayla: It's the green text box.
110
00:16:40,294 --> 00:17:30,630
Chris: Yeah, that makes sense. We should eugenics away Android phones. No, no, I get. No, I get what. I get what you're saying. Just to further disambiguate here, like, I think there's kind of also another subdivision. Okay, so we're past coercive control from the government right now. We're into, like, it's freedom of choice. But is it really free if we live in this system where, like, you have to work five jobs just to, like, pay half your rent, and then Elon Musk can, like, pay for 10 million genetic blah, blah, whatever? Right? So that's one thing, and I will say for sure that a guy like James Hughes, I have seen in his writing him stress that actually, one of his big points is, this stuff is going to happen. We have to make it accessible or we're screwed.
111
00:17:30,790 --> 00:17:56,836
Chris: That's part of, like, why he's good. That's part of why is because that's one of his main thrusts, I think that there is still yet another subdivision here, though, of like, okay, even if it's accessible to everyone, even if it's free real estate, right? Is there not still something that makes you feel weird about like certain choices that people might still be compelled, feel compelled to make?
112
00:17:56,948 --> 00:18:00,884
Kayla: Oh, yeah, this is not a choice. It stops being a choice when it becomes the default.
113
00:18:00,972 --> 00:18:11,556
Chris: Yeah. And this is where I want to kind of, like, jump in a little bit with, like, some of the discussions we've been having on Discord, and with you and me offline, like, not sleeping.
114
00:18:11,588 --> 00:18:14,140
Kayla: Because we're just laying back, being like, let's talk about eugenics.
115
00:18:14,300 --> 00:19:02,636
Chris: Jesus Christ. Don't admit that we are the worst. But these discussions were still, I still thought they were interesting, right? So one point that one of our Discorders brought up was that, okay, so we had this scenario that we talked about last week: what if you were able to select for, in utero, some trait, like, say, if somebody was gay or not, right? This is, this is the sort of thought experiment that Doctor Watson of Watson and Crick fame brought up. Would it be okay for the mother to terminate that pregnancy? And so the thing that, the sort of the discussion that came up was, well, I think that she should still have her bodily autonomy. And I think that was Doctor Watson's position as well. That doesn't mean that.
116
00:19:02,708 --> 00:19:08,912
Chris: So I don't think the state should be involved, but that doesn't mean that I won't shun this person and think that they're despicable.
117
00:19:08,996 --> 00:19:09,384
Kayla: Right?
118
00:19:09,472 --> 00:19:19,696
Chris: So I think that's an important distinction is like, it's kind of like a free speech thing, right? It's like, okay, well, just because the government isn't controlling it doesn't mean I like what you have to say.
119
00:19:19,768 --> 00:19:25,328
Kayla: I don't have to sit here and listen to you. And I also then have the free speech to say, shut the fuck up, right? Or walk away.
120
00:19:25,384 --> 00:20:07,800
Chris: I support your, you know, freedom of your bodily autonomy to have made that decision. I don't support the decision itself. I think that's perfectly fine. But I do think that there's an additional layer of like, okay, but there's still like, I can do that. Like Voltaire, I will fight for the right of that person that I disagree with, whatever. But then we actually have an example, and you're the one that actually reminded me about this example. We have a real world example of something that you can actually select for in utero that has a massive impact on society. We don't have to, I think the way that it got brought up, as I was saying, well, it's a good thing we don't have to worry about this so much. Cause the science is bad.
121
00:20:07,840 --> 00:20:19,016
Chris: And you were like, yeah, but what about sex selection, right? What about the fact that every time that you and I have gone to the IVF doctor for talking about implantation, they've been like, so which embryo do you want?
122
00:20:19,048 --> 00:20:29,640
Kayla: Which sex do you want? Do you guys have a sex preference? And it's, like, very casual. And there's no, like, you're not having the conversation with, like, a counselor. You're having it with just, like, the random doctor and/or nurse. It's like.
123
00:20:29,680 --> 00:21:13,626
Chris: And it's a snap decision. It's like, just on. It's like she's, you know, you're sitting on the exam table, and she's just standing there going, like, hey, which one do you want? As if it was, like, a Coke flavor or something. And that strikes us as weird. And maybe that hasn't caused a huge problem over here, but parents that are able to sex select in utero in other countries, that has been very damaging societally. So I'm talking about India and China specifically here. Not to, like, rag on other countries, but there's a major disparity in the number of men and women in those countries. Right now, it's only, like, one or two percent, but that actually ends up translating to being, like, millions and millions.
124
00:21:13,658 --> 00:21:15,002
Kayla: More men than women, more men than.
125
00:21:15,026 --> 00:21:37,826
Chris: Women, which then creates a whole downstream just a host of problems, to the point where the indian government has laws on the books. They try to correct for this. But there's been so much sex selection in utero. And there's another thing. There's infanticide, too. There's been so much. And by sex flushing in utero, I literally mean, like, oh, if you genetically test and it's a girl, you abort it.
126
00:21:37,858 --> 00:21:42,682
Kayla: Right. We're not talking about, like, IVF choosing which embryo. We're talking about quote unquote traditional.
127
00:21:42,706 --> 00:21:43,770
Chris: I'm sure that goes on, too.
128
00:21:43,850 --> 00:21:52,618
Kayla: But we're talking quote unquote traditional pregnancies in which. Yeah, once the fetus in utero is sexed, a decision is made to abort depending on that sex.
129
00:21:52,714 --> 00:22:14,028
Chris: Right? So that's literally eugenics. It's not like a phenotype trait that's like, well, this is caused by a million different factors. It's like, nope, there's an X chromosome and Y chromosome. Real easy. We can do it for sure. And it has a massive effect. And so we were just talking about, like, okay, well, that sucks. And, like, obviously, you know, we don't. Obviously can't blame the parents, right? It's the system, right.
130
00:22:14,044 --> 00:22:15,548
Kayla: I don't blame any individuals for that.
131
00:22:15,564 --> 00:22:30,166
Chris: It's extreme in terms of, like, having a son versus a daughter. It's, like, extreme. It's way more costly, for things like dowry and whatnot, to have a daughter in some of these countries. So I totally don't blame the parents for, I'd say, wanting to make that decision.
132
00:22:30,238 --> 00:22:30,542
Kayla: Sure.
133
00:22:30,606 --> 00:22:32,526
Chris: I'm sure for a lot of them it's needing. Right.
134
00:22:32,598 --> 00:22:47,694
Kayla: And I'm also assuming that, like, I'm assuming that some of, for, in regards to China, I'm assuming that some of this is a result of the quote unquote one child policy that was in place, right? And we are just random Americans, so, like, you know, take this with a grain of salt, but.
135
00:22:47,702 --> 00:22:55,066
Chris: And if you're from there, please email us and, you know, correct us or help us understand. Culturesweirdmail.com. What we do know is.
136
00:22:55,138 --> 00:23:02,010
Kayla: These sex selective abortions are occurring and have drastically altered the sex ratios in the countries right.
137
00:23:02,090 --> 00:23:53,246
Chris: Now. Obviously, the easy answer to this is, well, geez, it would be nice if these existing social norms that we find to be pretty oppressive if those weren't around. And it sure would have been nice if those weren't there before we discovered genetic engineering technology that would have been real nice. But we have these social norms that were developed a thousand years ago over time. And now all of a sudden it's like, and now you can select your sex of your child in Europe, right? And I think that like, aside from just being like an interesting and depressing thing to talk about, it also kind of illustrates that, like, okay, well, even in a quote unquote liberal eugenic situation and maybe even where there's equal access, there might still be an issue.
138
00:23:53,398 --> 00:24:09,694
Chris: If you're doing this type of stuff in a society that has certain norms. It's exactly what Doctor Torres was saying in the interview. Right? If the society has certain preferences and then you give people a choice even though it's not state control, it could end up being bad.
139
00:24:09,782 --> 00:24:10,854
Kayla: Right? Right.
140
00:24:11,022 --> 00:24:17,770
Chris: So this is like we don't have enough time to solve this problem, this intractable problem for the globe.
141
00:24:17,810 --> 00:24:20,778
Kayla: Wait, you and I aren't going to finish this by the end of the hour and a half?
142
00:24:20,914 --> 00:24:47,562
Chris: No, I just wanted to bring it up as an example of some stuff that we've talked about, both you and I and on Discord, and in relation to the multiple layers of why this can get uncomfortable. I also want to mention here, just to kind of, like, harken back to the, hey, this is all of a branch of humanism and enlightenment. I think it's also worth questioning, like, is progress always great?
143
00:24:47,666 --> 00:24:51,682
Kayla: Ooh, is progress bad? Line go up bad.
144
00:24:51,746 --> 00:24:53,706
Chris: Right? Well, that's what Chris is saying.
145
00:24:53,738 --> 00:24:55,190
Kayla: Chris is saying progress is bad.
146
00:24:56,570 --> 00:25:13,202
Chris: Yeah, I'm saying progress is bad. No, like, we've had this discussion, too, right? Like, is progress always good? Sometimes it's clearly good. Like, if you have an ectopic pregnancy now, you don't die. You would have died before. That's good. That's a good part of progress. But is it always good? I don't know. We have.
147
00:25:13,266 --> 00:25:30,978
Kayla: Is having a computer on your car dashboard better than what the car dashboard used to be? No. That's considered progress, is, like, now the dashboard console is an iPad versus before it was dials. And I do not think that's ahead. That's a very clear, to me, example of, like, progress not necessarily being better.
148
00:25:31,074 --> 00:26:20,190
Chris: Yeah. And if you don't subscribe to car, then your car doesn't go. Please kill me. But that's, like, you know, it's a little hard for us to envision, because we're, like, in Western society, but, you know, it might be beneficial to us if we maybe thought about other civilizational paradigms other than Western progress. Line go up, everything, start bad, get good. And, like, it's. It permeates everything. Right? Like, we're talking about evolution and genetics here. Like, a lot of what happens with a lot of the rhetoric, at least it used to be this way, when talking about evolution and natural selection, is that evolution improves organisms and they get better over time and more advanced. And that's what the eugenicists talk about, too. Now, luckily, we've moved past that to talking about being more fit for your environment.
149
00:26:21,450 --> 00:26:24,322
Chris: Evolution doesn't have a bias towards good or bad.
150
00:26:24,386 --> 00:26:38,920
Kayla: When I had to learn that survival of the fittest didn't mean survival of the biggest and the strongest, but survival for the organism most fit to its niche, like, that really changed things for me, how I thought about this.
151
00:26:38,960 --> 00:27:32,744
Chris: There's also the bias about, like, oh, these people are primitive versus our civilization is advanced. Like, we tend to think of certain civilizations as being backwards-in-time versions of us, rather than their own civilization that actually exists alongside us. It's a very colonial mindset, obviously. I don't think we have that as much anymore, but certainly that still permeates, right? And that goes back to the whole, like, everything's progress. Everything is a line going up. Is that true? Always? I think it would be. It would behoove us to question that sometimes. All right, so I said we talk about ideas, and then we talk about people. Time to talk about a people. Time to talk about the Huxley family. So, the Huxley family, man. Aldous, they were. Aldous is one of them. So, Aldous. What did Aldous do? He wrote Brave New World.
152
00:27:32,832 --> 00:27:38,620
Kayla: He did? And I pretended to read that in high school, and I would just, like, carry it around, but it wasn't assigned.
153
00:27:39,320 --> 00:27:39,864
Chris: Whoa.
154
00:27:39,952 --> 00:27:48,464
Kayla: It was just like, oh, I should read this book. And I like, oh, you are so adorable. No, it's because I kept starting it, and then it was, like, kind of really boring, and I never finished it.
155
00:27:48,592 --> 00:27:51,208
Chris: Yeah, we were supposed to read Tess of the d'Urbervilles in high school.
156
00:27:51,264 --> 00:27:52,048
Kayla: Not doing that.
157
00:27:52,144 --> 00:27:57,386
Chris: And we watched the movie so that we wouldn't have to read the book, and we were too bored by the movie.
158
00:27:57,498 --> 00:27:59,190
Kayla: Yeah, that's a bad sign.
159
00:28:00,570 --> 00:28:33,710
Chris: In any case, we're not actually here to talk about Aldous. We're here to talk about another member of his family, his brother, Julian. Actually, before we talk about Julian, their grandfather. I did not write down his first name here, but their grandpa, great grandpappy Huxley. Grandpappy Huxley coined the word agnostic. Oh, so these guys were word coiners. Julian coined a bunch of terms. I would say, aside from the one we're about to talk about, the other most famous one might be ethnic group. So he coined the term ethnic group to talk about.
160
00:28:33,790 --> 00:28:37,650
Kayla: How do you coin that? How's that not just a term?
161
00:28:38,270 --> 00:29:03,452
Chris: Yeah. So it was basically like a replacement for the word race. As the idea of race science was becoming heavily discredited in the late forties and fifties, they needed, like, a different way to talk about human groups in a way that was, like, more precise. So that was. That was him coining that word. He also coined the word transhumanism, or transhumanist.
162
00:29:03,476 --> 00:29:04,908
Kayla: Oh, Huxley did that.
163
00:29:05,004 --> 00:29:17,332
Chris: Huxley did that. There's some. I read some things where it's like, oh, well, actually, so and so said it before in this publication or whatever, but he's widely credited with, like, coining and popularizing the term.
164
00:29:17,396 --> 00:29:19,080
Kayla: He's the Elon Musk of the term.
165
00:29:20,460 --> 00:30:10,428
Chris: But, yeah, that's why he's sort of like. He's like the typical tie-in. So when people talk about, like, oh, you know, transhumanism is just eugenics, right? A lot of the times, the exhibit A is Julian Huxley. Because Julian Huxley was himself a eugenicist, and then he coined the term transhumanism. Now, was he a transhumanist himself? Not really. Like, sort of. Like, he espoused some transhumanist ideas. But, like, the connective tissue there is weird to me, because, like, mostly he was very similar to the guys we talked about last week, where he's, like, just this badass scientist that unfortunately also had eugenics ideas. A little more damning in his case, because it was post-World War Two. But then it was sort of like. It also wasn't the same eugenics as before. Like, it was more liberal eugenics. Right.
166
00:30:10,444 --> 00:30:40,056
Chris: He was a very progressive guy. He was an internationalist. He actually visited the Soviet Union several times, visited the United States. Like, he was a guy that was, like, in favor of internationalism and international community. In fact, he was the first director of UNESCO. Oh, so UNESCO is the United Nations Educational, Scientific and Cultural Organization. That's UNESCO. You've probably heard it before as, like, UNESCO World Heritage Site.
167
00:30:40,128 --> 00:30:40,496
Kayla: Yeah.
168
00:30:40,568 --> 00:30:55,288
Chris: So they're kind of like an international national park type thing, but they actually do, like, a ton of stuff. This is an aside. This is not about Julian Huxley, but there's, like, a ton of stuff that they get involved in. There's, like, all these NGOs that they sponsor.
169
00:30:55,344 --> 00:30:56,152
Kayla: Oh, I didn't know that.
170
00:30:56,216 --> 00:31:02,470
Chris: And. Yeah. Oh, they're into all kinds of stuff. In fact, one of the things I found was, do you know the International Baccalaureate program?
171
00:31:02,600 --> 00:31:04,538
Kayla: Yeah. The thing that you did in high school. Cause you're a big nerd.
172
00:31:04,594 --> 00:31:06,714
Chris: IB? Yeah. Yeah, that's UNESCO.
173
00:31:06,842 --> 00:31:07,954
Kayla: What does that mean?
174
00:31:08,042 --> 00:31:11,522
Chris: UNESCO created the International Baccalaureate program.
175
00:31:11,626 --> 00:31:13,530
Kayla: Is UNESCO bad?
176
00:31:13,570 --> 00:31:14,802
Chris: UNESCO is globalist.
177
00:31:14,866 --> 00:31:15,378
Kayla: Is it bad?
178
00:31:15,434 --> 00:31:29,820
Chris: It's a globalist agenda. I mean, I don't think so. Its founding mission, it says here, is to advance peace, sustainable development, and human rights by facilitating collaboration and dialogue among nations. So that sounds.
179
00:31:29,860 --> 00:31:30,652
Kayla: Do you feel like that's what was.
180
00:31:30,676 --> 00:32:00,340
Chris: Achieved in IB pretty good? Yeah, absolutely. I am the most peaceable and cooperative person that has ever existed, Kayla, thanks to IB. So that's UNESCO. Just to hammer home several things. One is like, he was big and important. Two is he was very internationalist. Three, he was the founding member or head of lots of things. So he was also the founding member of the WWF. Like the chair slamming the wrestlers. Yeah. So he was a wrestler.
181
00:32:00,420 --> 00:32:03,308
Kayla: Hell, yeah. Nah, the animal one, right?
182
00:32:03,364 --> 00:32:22,906
Chris: That was a slow pitch for. Yeah, the animal one. He was big into conservation. Huge into conservation. In fact, if I had to say, like, number one in terms of his interests, it would probably be conservation. Interesting, because he was a biologist, right? So it kind of makes sense. Let's see. He was the secretary of the Zoological Society of London.
183
00:32:23,068 --> 00:32:25,238
Kayla: Okay, I'm sorry, how do people have the time?
184
00:32:25,334 --> 00:32:34,134
Chris: I don't know. And he was also the president of the British Eugenics Society from 1959 to 1962.
185
00:32:34,182 --> 00:32:36,366
Kayla: Oh, also, he's less good than the animals.
186
00:32:36,478 --> 00:33:14,008
Chris: Yeah. First president of the British Humanist Association. As if to just, like, swoop right in and support my argument about humanism being the trunk of the tree here. He was also known for being, like, a good communicator. Like, presentation of science and popularization of science. But again, eugenicist. So here's a quote. His Wikipedia article says he actually used this quote, this argument, several times. No one doubts the wisdom of managing the germ plasm of agricultural stocks. So why not apply the same concept to human stocks?
187
00:33:14,064 --> 00:33:16,280
Kayla: I can think of a couple reasons, Mister Huxley.
188
00:33:16,360 --> 00:33:21,784
Chris: I can think of a couple reasons now, again. But that's a common talking point for eugenicists.
189
00:33:21,872 --> 00:33:29,620
Kayla: Why? Why do they want to think of themselves as livestock? Why do they want to think of themselves as livestock?
190
00:33:30,040 --> 00:33:54,626
Chris: I don't know. But again, mixed bag. He advocated ensuring that lower classes had a nutritious diet, education, and facilities for recreation. And that was, like, partially because, yay. But also partially because he knew that the higher you move up the social ladder, it tends to correlate with a decrease in number of births.
191
00:33:54,698 --> 00:33:56,426
Kayla: The less we'll have to eugenics you.
192
00:33:56,578 --> 00:34:17,216
Chris: Right. The less we'll have to force-eugenics you. I mean, I don't think he believed in forced eugenics, but he did believe in. Well, here, I'll read another quote by him, then. We must plan our eugenic policy along some such lines as the following: the lowest strata, allegedly less well endowed genetically, are reproducing relatively too fast. So there it is again.
193
00:34:17,288 --> 00:34:18,408
Kayla: Did Mike Judge write this?
194
00:34:18,464 --> 00:34:40,550
Chris: I know, right? Therefore, birth control methods must be taught them. They must not have too easy access to relief or hospital treatment, lest the removal of the last check on natural selection should make it too easy for children to be produced or to survive; long unemployment should be a ground for sterilization, or at least relief should be contingent upon no further children being brought into the world, and so on.
195
00:34:40,710 --> 00:34:43,654
Kayla: That's really, really awful.
196
00:34:43,782 --> 00:34:49,870
Chris: Much of our eugenic program will be curative and remedial merely instead of preventative and constructive. End quote.
197
00:34:49,989 --> 00:34:50,870
Kayla: Good Lord.
198
00:34:50,949 --> 00:35:44,936
Chris: I know. It's not great. I'm like, he does all this great stuff, and then he drops that bomb, and I'm like, holy shit. What the fuck? He also, though, like, he, you know, we talked before about the fallacy of thinking of evolution as, like, an advanced, you know, march of advancement. So he has this quote: the ordinary man, or at least the ordinary poet, philosopher, or theologian, always was anxious to find purpose in the evolutionary process. I believe this reasoning to be totally false, end quote. So he was against that. You know, I. It's a mixed bag. Like, I don't know what to say about Julian Huxley, other than, like, he's the common tie-in thread that people like to use when they talk about the connection between eugenics and transhumanism. I personally don't like that thread. Seems very thin to me.
199
00:35:44,968 --> 00:35:56,376
Chris: I think that the ideological thread is just there anyway, and it's much stronger. I don't think Julian Huxley was the guy that was like, oh, eugenics, oh, transhumanism. I'm going to tie them together.
200
00:35:56,448 --> 00:35:56,872
Kayla: Right.
201
00:35:56,976 --> 00:36:45,468
Chris: I don't think it was that. I think it was just like, some of the goals are broadly the same, right? Taking human society, taking the human species, and transcending it into human 2.0 is both a goal of eugenics and also a goal of transhumanism. So I think there's just overlap. I don't think Huxley was, like, the guy that connected it. Oh, he was also a debunker, but in a good way. It says that he took an interest in investigating the claims of parapsychology and spiritualism, joining the Society for Psychical Research in 1928. And then he said, after investigations, he found the field to be unscientific and full of charlatans. So basically, like, he joined the society in, like, good faith, basically being like, this is interesting. And then he was like, oh, this is all bullshit. And then he left.
202
00:36:45,604 --> 00:36:52,340
Chris: So he kind of took this interest in debunking paranormal stuff as well, on top of everything else.
203
00:36:52,500 --> 00:36:57,646
Kayla: The interesting thing here is that, like, this is Aldous Huxley's brother, right?
204
00:36:57,748 --> 00:36:58,434
Chris: Yeah.
205
00:36:58,602 --> 00:37:16,218
Kayla: Brave New World is about a society in which the society has been engineered into an intelligence-based social hierarchy, and it's a dystopia, utilizing things like birth control and these different things.
206
00:37:16,274 --> 00:37:18,202
Chris: They purposely breed dummies to be workers.
207
00:37:18,306 --> 00:37:21,670
Kayla: Yeah. And it's presented as a dystopia.
208
00:37:22,890 --> 00:38:13,410
Chris: I know. I couldn't even tell. Like, I know that one quote I read was pretty damning. But I can't say for sure that all of his ideas about eugenics were, like, that damning. Like, you know, blanket statement that eugenics is shitty overall, for, you know, the many layers that we've talked about. But I don't know. Like, he just. I'm so confused about the guy. Like, I think to me, he's much more confusing than Francis Galton. Francis Galton, you know, big-time scientist, but big, huge blind spot with. With eugenics. Julian Huxley, I don't know. Like, most of what I read about him was just, like, good stuff. And then there was, like, you know, a few paragraphs about, like, oh, he did eugenic stuff, too. And here's, like, a really bad quote. I don't know, man. It's so hard. It's hard.
209
00:38:13,530 --> 00:38:15,298
Kayla: His eugenics stuff was bad. The end.
210
00:38:15,394 --> 00:38:16,154
Chris: Yeah, you're right.
211
00:38:16,202 --> 00:38:20,140
Kayla: I. His conservation stuff, probably good. His eugenics stuff, bad.
212
00:38:21,200 --> 00:38:48,762
Chris: Yeah. All right, so Huxley was doing his stuff in, like, the first half of the 20th century. He died, I think, in 1975. So let's move the timeline forward a little bit here. On the way, I'm just going to point out something out the window here, which is that apparently marriage counseling is eugenicist in its roots. Like, the guy that basically invented marriage counseling had some eugenic.
213
00:38:48,906 --> 00:38:51,426
Kayla: What does that mean? What is the guy who, the guy.
214
00:38:51,458 --> 00:39:18,656
Chris: Who was like, I'm gonna start doing marriage counseling as a thing. Like, marriage counseling hasn't always been a thing. It started being a thing in, like, the fifties or sixties. And the guy that invented it was a eugenicist. And part of the idea behind marriage counseling Washington for, like, eugenic aims, like, let's make sure the good match marriages don't break up. That was part of it was to, like, have this. It had a partially eugenic aim to it.
215
00:39:18,688 --> 00:39:18,976
Kayla: Right?
216
00:39:19,048 --> 00:39:34,618
Chris: That's not what it is now. You can still drink your Celestial Seasonings tea. You can still get your marriage counseling. But it's just, it's interesting how many places eugenics leaked all over and touched. It's, like, a lot of. A lot of people.
217
00:39:34,714 --> 00:39:37,570
Kayla: Yeah. It's like foundational building block shit.
218
00:39:37,730 --> 00:40:41,760
Chris: Yeah. So now let's move into current times. So we're talking about how eugenics and transhumanism are related. Here's another quote, actually, from an article written by Susan Levin on Slate.com. Quote, transhumanism as we know it, however, is a marriage. Hey, marriage. A marriage of sorts between substantive commitments shared with Anglo-American eugenics. There it is. And the notion that living things and machines are basically alike, the latter stemming from developments in computing and information theory during and after World War Two, end quote. So what she's saying is, like, transhumanism equals eugenics plus the sort of overarching paradigm of thinking of human beings as machines. Which she contends, basically, in the article. She contends with both of those things. Okay, so she would say that it's. Yeah, the DNA is definitely there. We've talked about Nick Bostrom quite a bit on.
219
00:40:42,140 --> 00:40:53,548
Chris: On the show recently because he's one of the, like, he's basically the. If you had to name one guy that was like, TESCREAL, or it was like, behind all this. Yeah, you'd probably name Nick Bostrom.
220
00:40:53,604 --> 00:40:56,604
Kayla: We have a Nick Bostrom book in our library. Like, in our bookcase.
221
00:40:56,652 --> 00:40:58,564
Chris: Yeah, it's called Superintelligence. I've never read it, though.
222
00:40:58,612 --> 00:41:02,216
Kayla: I've never read it. Kind of thinking of taking it off of our bookcase.
223
00:41:02,368 --> 00:41:08,740
Chris: Well, you might be more inclined to do that after we talk about the Nick Bostrom email.
224
00:41:09,280 --> 00:41:10,440
Kayla: I don't want to talk about that.
225
00:41:10,480 --> 00:41:36,170
Chris: Let's talk about the Nick Bostrom email, because if we're going to be talking about TESCREAL transhumanists that have eugenic thinking, we need to talk about this email. So this is from a Vice article entitled Prominent AI Philosopher and Father of Longtermism Sent a Very Racist Email to a Nineties Philosophy Listserv. By the way, do you know who dug this email up and found it out in the muck?
226
00:41:36,330 --> 00:41:37,750
Kayla: Would I ever be able to guess?
227
00:41:38,050 --> 00:41:38,946
Chris: Yes.
228
00:41:39,138 --> 00:41:39,826
Kayla: You?
229
00:41:39,978 --> 00:41:41,090
Chris: Emile Torres.
230
00:41:41,250 --> 00:41:41,858
Kayla: Really?
231
00:41:41,994 --> 00:42:02,308
Chris: Yeah. Emile Torres found this email, actually. Good. Quote, Nick Bostrom, an influential philosopher at the University of Oxford who has been called the father of longtermism, of the longtermist movement, has apologized for a racist email he sent in the mid-nineties. In the email, Bostrom said that, quote, trigger warning here, this is not me saying this.
232
00:42:02,404 --> 00:42:03,880
Kayla: He's saying racial slurs.
233
00:42:05,020 --> 00:42:31,434
Chris: Bostrom said that, quote, blacks are more stupid than whites, adding, quote, I like that sentence and think it is true, end quote. And then used the racial slur. So the racial slur part of it was basically like, he continued writing stuff and then was like, I like that sentence and I think that it's true. But if I were to say that in mixed company, then people would hear it as me saying, blah, n-word, blah. Or like, I really.
234
00:42:31,482 --> 00:42:37,770
Kayla: But I'll write it down. But I'll write it down. Send it in mixed company. I'll write it down.
235
00:42:37,930 --> 00:42:51,690
Chris: Yes, but it wasn't mixed company. It was a listserv that contained a lot of people in it that were in this exact movement that we're talking about. So the fact that he felt comfortable sending that email in the company of other people that we might call, like, proto TESCREALists.
236
00:42:51,730 --> 00:42:52,254
Kayla: Right.
237
00:42:52,402 --> 00:43:29,064
Chris: I think is pretty informative. So he did do an apology. So he was, you know, he did the. And it was, like, not a bad apology. You know, I fully. I don't remember the exact wording, but it was basically like, I fully recant this. These are not views that I hold now. I wish I hadn't said that. I don't think it fully addressed it, though. Like, I still just, I kind of feel like too much of the emphasis, both in the reporting on it and in his reply and in his apology. I still think the emphasis is too much on, like, the racial slur and not the sentiment. Not the sentiment.
238
00:43:29,152 --> 00:43:29,984
Kayla: That's icky.
239
00:43:30,112 --> 00:43:43,832
Chris: Like, I'm not gonna say the racial slur, but I actually think that saying blacks are more stupid than whites, I like that sentence and think it's true, is actually the worst thing to say. Because the other thing was just, like, haha, I'm just saying this for, like, shock value.
240
00:43:43,896 --> 00:43:49,604
Kayla: Not a cool thing to say, but not a. The former thing is, like, awful.
241
00:43:49,692 --> 00:44:05,092
Chris: Yeah. The former thing is like, oh, that's actually really dangerous. Yeah, that's, like, literal scientific racism, which has been disproven. Like, first of all, the whole IQ thing. I had a whole thing I was going to talk about, IQ, in this episode.
242
00:44:05,156 --> 00:44:06,164
Kayla: Oh, we gave up on IQ.
243
00:44:06,212 --> 00:44:15,814
Chris: I gave up on IQ. It's just too much, you guys. There's a whole nother rabbit hole of IQ where, like, is that test even meaningful? A lot of people say no.
244
00:44:15,902 --> 00:44:16,942
Kayla: I say no.
245
00:44:17,126 --> 00:45:06,390
Chris: Yeah. So there is a measurable difference. People find a measurable difference in average IQs across different races. But if you control for things like socioeconomic status, those things disappear. And also, they tend to converge over time. So if you look at the black community over time, the average IQ as measured has gone up over time as their socioeconomic conditions have improved. So there's just a ton of evidence pointing to the fact that there's, like. It's widely scientifically accepted that there's no. And also, race is not a scientific thing. It doesn't have any genetic basis. So, like, there's multiple reasons why scientific racism is harmful and also wrong, apparently.
246
00:45:06,510 --> 00:45:21,162
Kayla: Jamie Loftus did an entire podcast season about Mensa, and I think it was recommended to us. The first episode really gets into the eugenic background of Mensa. So since we're not covering that, you can go listen to that instead of our episode. You can go listen to Jamie Loftus.
247
00:45:21,266 --> 00:45:26,354
Chris: Also, did you know that, like, in the general population, IQ has been slowly rising over time?
248
00:45:26,522 --> 00:45:27,138
Kayla: Interesting.
249
00:45:27,194 --> 00:45:29,770
Chris: People don't know why. There's, like, a name for the effect. I forget what it's called.
250
00:45:29,810 --> 00:45:31,070
Kayla: It's because we're eating more.
251
00:45:31,970 --> 00:45:37,034
Chris: That's what some people think. Some people think we're just, like, healthier now. Yeah. Because of, you know, better access.
252
00:45:37,082 --> 00:45:38,418
Kayla: We don't have rickets or whatever.
253
00:45:38,514 --> 00:46:09,500
Chris: Yeah. So I don't know. But in general, it's been going up. And computers. But also, fundamentally, IQ is busted, so what does that mean anyway? Okay, the point of all this, though, was to, again, talk about, like, if we've got this, like, founding father of TESCREAL stuff feeling very comfortable putting his scientific racist viewpoint out there on this listserv, I don't think that looks good for transhumanism and TESCREALism in terms of its eugenic ties.
254
00:46:09,620 --> 00:46:18,544
Kayla: It also makes me feel a lot better about the decision we made at the very beginning of this season. Oh, man. Where we almost. Have we talked about this yet?
255
00:46:18,632 --> 00:46:19,304
Chris: I don't know.
256
00:46:19,392 --> 00:46:25,208
Kayla: We almost opened the season by reading a story written by Nick Bostrom.
257
00:46:25,304 --> 00:46:25,704
Chris: Right.
258
00:46:25,792 --> 00:46:31,632
Kayla: And then went, you know what? Actually, this story's really dumb. And then we didn't.
259
00:46:31,696 --> 00:46:32,624
Chris: I might be an idiot.
260
00:46:32,672 --> 00:46:33,912
Kayla: I still really like the story. It's.
261
00:46:33,976 --> 00:46:35,528
Chris: The story's fine. The story's cool.
262
00:46:35,584 --> 00:46:51,380
Kayla: Especially as adapted by. I don't know who it is. Adapted by the YouTuber who popularized this story. If you go look up the Nick Bostrom dragon story, you'll find it. But, yeah, we almost opened the season by talking about this story that equates death to a big dragon that eats a whole town.
263
00:46:51,420 --> 00:46:53,284
Chris: I'm so glad that our radar was on for that.
264
00:46:53,292 --> 00:46:54,660
Kayla: Yeah, we went, I don't know, actually.
265
00:46:54,700 --> 00:47:09,340
Chris: This is kind of silly. Part of it was. A big part of it was when we were talking about how he was railing against. I don't know who he was bitching about, but, like, hey, why aren't we trying to solve death? Like, why aren't we trying to put resources towards.
266
00:47:09,640 --> 00:47:11,056
Kayla: That was. The story was about curing.
267
00:47:11,088 --> 00:47:15,032
Chris: Death because that's what the story was about. And were like, does he know about medicine?
268
00:47:15,176 --> 00:47:16,940
Kayla: Has he ever gone to the doctor?
269
00:47:17,760 --> 00:47:58,402
Chris: At the bare minimum, he has Nobel disease. So the other thing is, he, like. In case you were wondering if that was a one-time thing: sure, that spike of racism was a one-time thing, but he has repeatedly talked about dysgenic pressures being one. So you know how they're, like, super into existential risk? Like, that's a big thing for them, is, like, for TESCREALs. It's like, oh, we got to fight x-risk. We got to fight existential risk, because that's, you know, that's the thing that will prevent the transhumanist project from happening. One of the existential risks, depending on who you ask, and definitely according to Bostrom, is dysgenic pressures, which is. Which is Idiocracy. They're all worried about Idiocracy.
270
00:47:58,466 --> 00:47:59,866
Kayla: Are you fucking kidding me?
271
00:47:59,938 --> 00:48:01,186
Chris: That's what dysgenic pressures means.
272
00:48:01,218 --> 00:48:12,006
Kayla: Nick Bostrom's a dumbass. Nick Bostrom is really making me think of that quote from the previous episode about, like, an idiot doesn't know he's an idiot. And, like, the people who think that they're the smartest are the ones who are trying to fashion the world in their own image.
273
00:48:12,078 --> 00:48:12,422
Chris: Exactly.
274
00:48:12,446 --> 00:48:17,910
Kayla: And, like, maybe Nick Bostrom should, like, take a long, hard look in the mirror, because none of us are as smart as we think we are.
275
00:48:17,990 --> 00:48:38,394
Chris: And that's where it gets, like, super racist and eugenicist. That's where the eugenics and the racism really come in: the, like, dysgenic pressures. So when you say, dumb people are going to out-produce us, and then you also say, I think blacks are stupider than whites. All right. Had to talk about the Nick Bostrom thing, obviously. We've been sitting on that one for a really long time.
276
00:48:38,442 --> 00:48:40,710
Kayla: Also, Nick Bostrom's institute failed recently.
277
00:48:41,210 --> 00:49:30,532
Chris: Yes. Which, I think, honestly, after doing all this, is probably a good thing. It's the Future of Humanity Institute. That's why we named those previous few episodes Future of Humanity. Okay, so we talked about some ideas. We talked about some people, and I promised you I would complete the sentence: you can't understand transhumanism without understanding eugenics, and you can't understand eugenics without understanding something that I had not read about before, although I had definitely heard its thoughts, its conclusions, the influence of this theoretical practice: crip theory. Oh, so what is crip theory? First of all, I need to specify that I am not using that word in a derogatory manner.
278
00:49:30,596 --> 00:49:31,780
Kayla: That word has historically been.
279
00:49:31,780 --> 00:49:37,108
Chris: That word has historically been used as a slur. It is currently in the process of being reclaimed.
280
00:49:37,284 --> 00:49:41,360
Kayla: It's similar, I think, and I could be speaking out of turn here, to queer.
281
00:49:41,860 --> 00:50:33,006
Chris: I've read in several places that parallels have been drawn between the reclaiming of crip and the reclaiming of queer. And in fact, crip theory specifically talks about the intersection of disability and queerness. So that's, like, one of its big things. All right, so that's what crip theory is in a nutshell. It's a lot more than that, actually. Before I get to this quote, the other thing I should mention is that it intersects heavily with the idea of the social model of disability. So there's something called the social model of disability, which was proposed as an alternative to the medical model of disability. I think this is stuff that we've probably heard about more. We've encountered these ideas a little more. I'd never heard the words crip theory, but I had heard the social model.
282
00:50:33,038 --> 00:51:11,378
Chris: I just don't know if I'd heard the term. But basically the idea with that is that rather than assuming that disability is something that needs to be cured and it's on the individual to conform to whatever the society says is a normal human being, instead, we should view disability as a social construct based on, like, you know, nobody is the average, right? So everybody has some difference from the average. And that is definitely true of disabled people. And we should be focused, therefore, on bringing society up to speed, not on bringing the individuals up to speed.
283
00:51:11,434 --> 00:51:13,058
Kayla: So more like building ramps to get.
284
00:51:13,074 --> 00:51:54,900
Chris: Into. Building ramps rather than curing the, quote unquote, whatever it is. Like, it's very. It's very much about, like, we don't need to be cured. We have value as is, right? This isn't a disease. The guy that invented it, by the way, or that coined the phrase and proposed it, he was, like, very clear that this isn't, like, the be-all and end-all. This is just, like, an alternative to the dominant view, which is the medicalized one. Right? So he wasn't saying, like, I have all the answers. I just want to make that clear. Like, he was saying, this is an alternative view, and we really should be thinking about it at least much more this way than the other way. Because there are critiques, right?
285
00:51:54,940 --> 00:52:15,348
Chris: Like some disability advocates will say, like, well, you know, I actually have something that is, like, biologically very painful, and I would like that to be cured, please. And like, you know, if we push too hard in the other direction, in the social model direction, then do we, you know, make that less likely to happen because we're just trying to do curb cuts and elevators everywhere?
286
00:52:15,484 --> 00:52:15,734
Kayla: I.
287
00:52:15,772 --> 00:52:21,306
Chris: But again, the creator of this was very clear that it wasn't the only answer. Like, well, I'm not saying that this is the only answer.
288
00:52:21,378 --> 00:52:44,830
Kayla: Right. Like, I've heard that argument when talking about things that might be considered disabilities, such as, like, chronic illness. So, like, in the fibromyalgia or, like, ME/CFS community, and probably now the long Covid community. Like, long Covid is disabling, but people who have long Covid probably would rather have a cure than not.
289
00:52:45,730 --> 00:53:08,426
Chris: Yeah. So it's important to have multiple perspectives. But just because the medical perspective is so dominant, I think it's super valuable to have that social model as an alternative way of thinking about it. So here's a quote from an article I read entitled Transhumanism Is Eugenics for Educated White Liberals. Very damning.
290
00:53:08,498 --> 00:53:09,050
Kayla: Yeah.
291
00:53:09,170 --> 00:53:12,612
Chris: And also, I kind of feel like that's what we've been talking about.
292
00:53:12,716 --> 00:53:14,080
Kayla: Yeah. Yeah.
293
00:53:15,700 --> 00:54:09,656
Chris: In sharp contrast to transhumanist thinking and philosophical, quote unquote, common sense in general, crip theorists generally hold that the existence of disability is a positive good, something that should be embraced and valued. This is me just saying, as an aside: whatever critiques there are of the social model or of this, I think that statement is pretty unambiguously true. In Garland-Thomson's words. I don't know who that is, but this is who the article author is quoting. The human variations that we think of as disabilities are something that we should conserve and protect because they are essential, inevitable aspects of human being, and because these lived experiences provide individuals and human communities with multiple opportunities for expression, creativity, resourcefulness, relationships, and flourishing. So this. By the way, the author of this article's name is Michelle Ciurria, who is.
294
00:54:09,728 --> 00:54:39,034
Chris: I'm just gonna read her little blurb from the. From her university website. So she's a professor at the University of Missouri-St. Louis. She is the author of An Intersectional Feminist Theory of Moral Responsibility, published by. We don't need to know that. A regular contributor to Biopolitical Philosophy, she/they is a queer disabled philosopher with specializations in ethics, moral psychology, Marxist feminism, and critical disability theory.
295
00:54:39,162 --> 00:54:40,546
Kayla: Can we be friends?
296
00:54:40,738 --> 00:55:30,558
Chris: So she's just, like, all the things. But I really liked, obviously, that quote. I liked the article in general. I want to read you another quote from the same article, and then we'll talk about it for just a sec. Historically. Oh, this is the quote, by the way. Historically, disability was affiliated with blackness and genderqueerness, and these associations persist today. As so-and-so puts it, we don't need another name: queerness, broadly conceived, is regularly understood or positioned in contemporary culture as always a bit disabled, and vice versa. Consequently, to promote a non-disabled body is to promote a white cisgender body, that is, a normal, this is in the context of the article, non-freakish body. Transhumanism's quest to eliminate disability is entangled, therefore, historically, structurally, and symbolically, with racism and cissexism.
297
00:55:30,694 --> 00:56:18,570
Chris: While transhumanists might want to deny these connections, they are deeply embedded in the construction and positioning of disability. Intellectual disability in particular. This last sentence is kind of why I'm reading this. Intellectual disability in particular has been used to oppress racial and sexual others, end quote. So this is why I say I don't think you can understand eugenics either without understanding crip theory. Because eugenics is basically saying, let's eliminate the people that. Let's do a genocide, essentially, whether it's actually killing them or sterilizing them, on people that we consider to be unfit. And that is a social construct. And because it's a social construct, they can apply it to whatever they want, and they tend to apply it to the power structures that they want to preserve.
298
00:56:18,870 --> 00:56:37,180
Chris: So that's why it's inexorably linked to the racist structures that we have. Because it's like, well, we just need to say that black people aren't. We just need to have smart people like Nick Bostrom say, quote, I think blacks are stupider than whites, and then that just opens the door for whatever tool you want to deploy at them.
299
00:56:37,220 --> 00:56:38,124
Kayla: Right, right.
300
00:56:38,252 --> 00:56:40,684
Chris: So it's. That's. So it's all connected.
301
00:56:40,812 --> 00:56:55,924
Kayla: I'm really grateful that there's, like, academics doing this work, because that was an incredible way to put it into, like, an actual substantive, like, argument. Wait, is this not a good phrase? Human diversity, that's not a good phrase. Right?
302
00:56:55,972 --> 00:57:04,628
Chris: Actually, human biodiversity is one of their favorite things to say. Yeah, crip theorists love the phrase human biodiversity.
303
00:57:04,644 --> 00:57:05,720
Kayla: Okay, then I'll say it.
304
00:57:06,300 --> 00:57:12,280
Chris: They use, like. In fact, they use that to contrast against genetic selection, or eugenics.
305
00:57:13,020 --> 00:57:58,428
Kayla: To have an academic argument for human biodiversity, I think, is really good to have. Because otherwise it can feel like, well, I just feel like it's good to have different kinds of people, and it just feels icky to, like, genetically engineer everyone to be blue-eyed and blond. Like, that's not a great argument, to be like, well, it just feels wrong. So to have academics putting into words and, like, quote unquote, rational arguments why human biodiversity is important and why it is a benefit to our society, as part of the ability to argue against the actual, quote unquote, rationalists, who are able to come up with very logical-sounding arguments for why liberal eugenics is good, actually. I'm really happy to hear that.
306
00:57:58,564 --> 00:58:09,388
Chris: Yeah. And I think it's important to listen to those, like, this-feels-icky feelings. But I think kind of what you're saying, and what this person's doing, is explaining where that feeling comes from.
307
00:58:09,444 --> 00:58:12,228
Kayla: Right. And making it more than just a feeling.
308
00:58:12,324 --> 00:58:21,420
Chris: Yeah, exactly. Exactly. So overall, I guess, what would you make of the crip theorists' derision of transhumanism?
309
00:58:22,720 --> 00:58:36,440
Kayla: I'm just going to go back to the thing that I said about that quote. That's kind of how I feel about it as a whole. I feel like I want to learn more about this because this feels like something that I feel like I could align with.
310
00:58:36,560 --> 00:58:39,980
Chris: How does it make you feel about transhumanism? Like, the fact that they don't like it.
311
00:58:40,730 --> 00:58:48,626
Kayla: The fact that crip theorists don't like transhumanism? Yeah, I think it's probably reductive to say they don't like transhumanism.
312
00:58:48,818 --> 00:58:52,282
Chris: You know what? I'm actually gonna. I'm gonna stop you here because I'm. I'm.
313
00:58:52,386 --> 00:58:56,850
Kayla: You're trying to get me to. You have. You have an answer in mind? Tell me what that is.
314
00:58:57,010 --> 00:58:59,250
Chris: Cause I kind of asked you a trick question.
315
00:58:59,410 --> 00:59:00,610
Kayla: Yeah. Tell me what I'm supposed to think.
316
00:59:00,650 --> 00:59:02,682
Chris: Because I know something that you don't.
317
00:59:02,746 --> 00:59:03,434
Kayla: Oh, no?
318
00:59:03,562 --> 00:59:07,090
Chris: Crip theorists aren't actually anti-transhumanist.
319
00:59:07,210 --> 00:59:08,482
Kayla: Oh, that's what I said.
320
00:59:08,666 --> 00:59:24,114
Chris: I think. I'd say they're very anti-TESCREAL. If I'm gonna, like, pull this back to, like, the TESCREAL thing, and how TESCREAL kind of feels as a bundle different than any one piece of this, I would say that, yeah, they're anti-TESCREAL, anti-Bostrom's flavor of transhumanism.
321
00:59:24,202 --> 00:59:24,830
Kayla: Same.
322
00:59:25,770 --> 00:59:37,764
Chris: However, very interestingly, while crip theorists tend to be against genetic human enhancement, for all the discussed reasons, they actually tend to be very pro-cybernetics.
323
00:59:37,852 --> 00:59:38,836
Kayla: Oh, hell yeah.
324
00:59:38,908 --> 00:59:43,476
Chris: They are very pro. And you can, like. As soon as I say that, you can kind of see why. Right.
325
00:59:43,508 --> 01:00:01,492
Kayla: It's like that TikTok that I sent you 2 seconds ago of, like, that girl who. She. I believe she's an amputee. She does not have her forearms. And she was talking about how, like, prosthetics used to suck because they were trying so hard to make them look like, quote unquote, biological arms. And now these ones that are.
326
01:00:01,516 --> 01:00:05,351
Chris: Nobody was, by the way. Nobody was asking for their input. They were just like, here's your arm.
327
01:00:05,488 --> 01:00:24,984
Kayla: Now that we've started to design, now that disabled people are part of that design process, the fucking prosthetics are cool as hell. Cause they, like, do different things and aren't just hindered by trying to look like a biological arm. And hers were like, in addition to having different abilities than a biological arm, it was also pink and sparkly. It was cool.
328
01:00:25,072 --> 01:01:14,596
Chris: Yeah, it was dope as hell. And again, it hearkens back to the design with, not for. That exact phrase actually came up several times in my reading of crip theory stuff. So it makes a lot of sense. Part of it is what you were just saying. Part of it is also, like. Crip theorists also like to point out that there's, like, this DIY element to it. It's exactly what you were talking about, actually. They are against the commodification, and top-down, non-disabled people designing for them. They are for a disabled person saying, like, I need to be part of this design process. Disabled people designing their own cybernetic enhancements, yada yada. Does that make sense?
329
01:01:14,668 --> 01:01:15,228
Kayla: Yes.
330
01:01:15,364 --> 01:02:05,086
Chris: Okay. So, in fact, there's, like, a very famous. I didn't know about it, but I actually think I might have heard of it. It's called A Cyborg Manifesto. I'm gonna read from its Wikipedia article here. It's an essay written by Donna Haraway and published in 1985 in the Socialist Review. In it, the concept of the cyborg represents a rejection of rigid boundaries, notably those separating human from animal and human from machine. Haraway writes, the cyborg does not dream of community on the model of the organic family, this time without the Oedipal project. A cyborg would not recognize the Garden of Eden. It is not made of mud and cannot dream of returning to dust. The manifesto challenges traditional notions of feminism, particularly feminism that focuses on identity politics, and instead encourages coalition through affinity.
331
01:02:05,238 --> 01:02:21,252
Chris: Haraway uses the concept of a cyborg to represent the plasticity of identity and to highlight the limitations of socially imposed identities. The manifesto is considered a major milestone in the development of feminist posthumanist theory. A lot of ists in there.
332
01:02:21,316 --> 01:02:28,452
Kayla: Fuck yeah. This is the stuff that I fuck with. This feels good. This doesn't feel bad and icky. This feels good.
333
01:02:28,596 --> 01:02:33,040
Chris: This is why I am talking about it at the end, actually, of the show.
334
01:02:33,740 --> 01:03:31,948
Kayla: And I just want to interject here, just because I wanted to. I came across a slightly eugenics. Not slightly. I came across a eugenics-related thing when I was researching effective altruism. Who, again, are the EA of TESCREAL. And one of their founding fathers, Peter Singer, has actually been targeted by disabled activists and by disabled protesters. I can't say definitely crip theorists, but crip-theorist-adjacent folks. Because Peter Singer has a lot to say about how it's morally and ethically okay to euthanize severely disabled infants and severely disabled people. And, you know, this episode's not about that, so I'm not gonna get into the nuance of all of that. But this is a person who's very influential in the TESCREAL movement, who has been protested by disabled activists specifically for saying it's morally and ethically okay to euthanize severely disabled infants.
335
01:03:32,004 --> 01:04:21,938
Chris: That's not a good thing to say. Don't say that. Don't think that. And then also definitely don't say it. So I just thought that was, like. That was very interesting, how as much as they're against genetic selection, eugenics, they are in favor of bionic enhancement. Also, if you want to hear some stuff that you fuck with: modeled on the Cyborg Manifesto, there is a Crip Technoscience Manifesto. It's a whole thing. But I did grab the four commitments of crip technoscience from the manifesto, so I'm just going to read them to you here real quick, because you're going to love it. One, crip technoscience centers the work of disabled people as knowers and makers. Actually, I think that's, like, very salient to what we were talking about before.
336
01:04:22,114 --> 01:04:26,226
Chris: Two, crip technoscience is committed to access as friction.
337
01:04:26,418 --> 01:04:27,298
Kayla: Ooh.
338
01:04:27,474 --> 01:04:40,414
Chris: There's a whole thing in there about, like, access not being this, like, passive something. It's wild. Three, crip technoscience is committed to interdependence as political technology.
339
01:04:40,542 --> 01:04:42,142
Kayla: Yes. Yes.
340
01:04:42,326 --> 01:04:46,118
Chris: Four, crip technoscience is committed to disability justice. That one's easy.
341
01:04:46,174 --> 01:04:46,726
Kayla: Great.
342
01:04:46,878 --> 01:04:52,990
Chris: But I just. I think my favorite one there is number two, the committed to access as friction one. I didn't fully understand what I was reading.
343
01:04:53,030 --> 01:04:58,200
Kayla: Yeah, I don't really understand it either. I really like how it sounds. But number three is my jam.
344
01:04:58,280 --> 01:05:20,936
Chris: Yeah. Number three is. Yeah. Interdependence as political technology. Hell, yeah. So, just wrapping up that whole thing: when I came across this stuff, I was like, oh. Like, if I really want to talk about modern transhumanism versus eugenics, or is it eugenicist, or which parts of transhumanism are and aren't, I just feel like viewing it through this lens is super helpful.
345
01:05:21,008 --> 01:05:39,060
Kayla: It's really nice to know that if I want to, like, talk about transhumanist stuff, I don't have to only be in TESCREAL communities. That there. There's people talking about this stuff in a non-TESCREAL way. And I'm just really grateful to know that's happening. And that really helps me. That really helps me.
346
01:05:39,220 --> 01:06:22,590
Chris: Same, bruh. Same. So, in general, this has been an enormously challenging subject to grapple with across many different avenues. Eugenics and transhumanism, as topics, are emotionally challenging, ethically challenging, and intellectually challenging. It's the type of topic where one finds it extremely difficult to find something, either in the literature or in your own mind, that you might call solid ground. There's been a lot of gray area. We've talked about a lot of, like, well, I like this guy, but I don't know, he's also got, like, really bad stuff, too. There's a lot of, like, I don't like the conclusions I'm being forced to draw about, like, how I feel like people should be able to choose their, you know, Gattaca babies or whatever.
347
01:06:22,630 --> 01:06:23,264
Kayla: I.
348
01:06:23,462 --> 01:07:02,878
Chris: A lot of, I don't understand where idea number one stops and idea number two begins. And by the way, if the right person gets that wrong, it could be very damaging. So when I find myself in this sort of intellectual, ethical mirror funhouse, I'm just going to be on the lookout for a guidepost or handhold of some kind, just to help me think straight. So I want to wrap up these two episodes talking about an article that I read, published in the Atlantic in 2004, by one Michael Sandel. By the way, I looked him up, and apparently he's, like, some sort of rock star philosopher who sells out concert venues and stuff.
349
01:07:02,934 --> 01:07:04,166
Kayla: Oh, my God, he's Ian Malcolm.
350
01:07:04,238 --> 01:07:54,320
Chris: He's, like, literally Ian Malcolm. Anyway, he actually even ended up writing a book centered around this article. But the article is entitled The Case Against Perfection. Here's what I found steadying in the article. And as an aside, he says this idea is actually derived from a theologian named William F. May, just to make sure I'm giving the right credit here. But in this article, he posits that there are two types of love, or at least there is a particular axis along which you can categorize love into two different kinds. The first kind, he says, is accepting love. The second is transforming love. Accepting love affirms someone's being, and transforming love seeks their well-being, and if they are in balance, each corrects for the excesses of the other. By the way, this was proposed.
351
01:07:54,700 --> 01:08:01,612
Chris: I'm framing this as, there are two types of love, because I think that this is very universally applicable.
352
01:08:01,716 --> 01:08:02,412
Kayla: Okay.
353
01:08:02,556 --> 01:08:06,492
Chris: It was proposed initially as there are two types of love that parents give to their children.
354
01:08:06,596 --> 01:08:07,172
Kayla: Okay?
355
01:08:07,276 --> 01:08:11,532
Chris: So that will make some of these quotes make more sense when they start talking about parents and children.
356
01:08:11,596 --> 01:08:11,956
Kayla: Got it.
357
01:08:11,988 --> 01:08:18,326
Chris: It also, like, has a lot to do with eugenics, and that's where the parent-child analogy comes in.
358
01:08:18,358 --> 01:08:18,966
Kayla: Sure, sure.
359
01:08:19,078 --> 01:09:01,545
Chris: I just didn't state it that way initially because I think it's very universally applicable to any type of loving relationship. Anyway, if they are in balance, transforming and accepting love, each corrects for the excesses of the other. So they're not hard concepts. Like, you probably already kind of get it, right? Accepting love is what it sounds like, affirming that which is: loving a son or daughter unconditionally, forgiving your husband for an unspeakable crime, such as hosting a podcast. Transforming love is also kind of what it sounds like. That's the kind of love that makes you want to see your child succeed and thrive, or the kind of love that lets you support your husband in his quest to become a better person and stop podcasting. But you're not gonna laugh at that at all.
360
01:09:01,617 --> 01:09:09,009
Kayla: I got really confused as to when you stopped doing the quote and when you were being you, and I was like, what's happening here?
361
01:09:09,089 --> 01:09:17,837
Chris: Sorry, I did a terrible job. No, this guy is actually, quote, talking about how I shouldn't be podcasting. It's very prescient.
362
01:09:17,894 --> 01:09:22,381
Kayla: I was like, that joke's really on point. But that was you making the joke.
363
01:09:22,486 --> 01:09:25,649
Chris: Okay? Your interpretation was way funnier than the actual joke.
364
01:09:27,750 --> 01:09:30,782
Kayla: I love you enough to allow you to do your podcast.
365
01:09:30,926 --> 01:10:17,744
Chris: Thanks. But the balance is the key, right? So here, I'll start telling you what I'm quoting now. This is how William May, the aforementioned theologian, puts it. It's the quote within a quote. Parents find it difficult to maintain an equilibrium between the two sides of love. Accepting love, without transforming love, slides into indulgence and finally neglect. Transforming love, without accepting love, badgers and finally rejects. End quote. And what Sandel ultimately posits is that the eugenic project represents a sort of apotheosis, a final boss, if you will, of the victory of transformative love over accepting love, wildly out of balance and thus ultimately, unavoidably pernicious.
366
01:10:17,912 --> 01:11:04,830
Chris: Even if you were able to somehow make the science work perfectly and keep state coercion at bay. Not just because it is a powerful tool for social class dominance, which it is, and not just because it is sloppy pseudoscience, which it is, but because it robs humanity of the great gift of mindfully contemplating the unfolding of our time on earth. Sandel: If bioengineering made the myth of the self-made man come true, it would be difficult to view our talents as gifts for which we are indebted, rather than as achievements for which we are responsible. This would transform three key features of our moral landscape: humility, responsibility, and solidarity. In a social world that prizes mastery and control, parenthood is a school for humility.
367
01:11:05,260 --> 01:11:59,850
Chris: That we care deeply about our children and yet cannot choose the kind we want teaches parents to be open to the unbidden. Such openness is a disposition worth affirming, not only within families but in the wider world as well. It invites us to abide the unexpected, to live with dissonance, to rein in the impulse to control. A Gattaca-like world in which parents become accustomed to specifying the sex and genetic traits of their children would be a world inhospitable to the unbidden, a gated community writ large. The awareness that our talents and abilities are not wholly our own doing restrains our tendency towards hubris. This demand for performance and perfection animates the impulse to rail against the given. Sandel: The deepest moral objection to enhancement lies less in the perfection it seeks than in the human disposition it expresses and promotes.
368
01:12:00,230 --> 01:12:22,010
Chris: For example, the parental desire for a child to be of a certain genetic quality is incompatible with the special kind of unconditional love parents should have for their children, he writes. And he's quoting May again here: To appreciate children as gifts is to accept them as they come, not as objects of our design or products of our will or instruments of our ambition. End quote.
369
01:12:23,270 --> 01:12:26,850
Kayla: As gifts or as gifs. Sorry.
370
01:12:28,550 --> 01:12:30,930
Chris: As gifs of our ambition.
371
01:12:33,150 --> 01:12:41,638
Kayla: That is extremely profound. Yeah, I'm gonna be thinking about that for. I don't know if I have a great, like. Here's my pithy reaction to that.
372
01:12:41,694 --> 01:12:43,730
Chris: Give me your 140-character hot take.
373
01:12:44,390 --> 01:12:46,734
Kayla: But I'll be thinking about that for a while. And that.
374
01:12:46,822 --> 01:12:47,570
Chris: Me too.
375
01:12:49,320 --> 01:12:58,752
Kayla: That feels a lot more. That feels a lot better to contemplate than some of the stuff we've been having to contemplate with these TESCREAL episodes.
376
01:12:58,856 --> 01:13:43,194
Chris: Yeah, and it really helps. Again, it helps me. It gives me a framework, a little scaffolding onto which I can kind of, like, hang the different ideas that I'm being buffeted with in all this research. And we've talked about a bunch of scenarios, right? Like, we talked about the scenario earlier of, like, what if we can sex-select? And what if we can do this? And what about this scenario, wouldn't that be bad? Like, there's a lot of what-ifs in this, and a lot of sort of, like, positing different scenarios and asking about what Doctor Watson thought and whatever. In the Case Against Perfection article, Sandel also draws upon such scenarios to help distill the discussion and the ideas of this. He writes, these scenarios raise a hard question. If it is morally troubling to contemplate.
377
01:13:43,242 --> 01:14:36,550
Chris: And then he talks about the scenarios, like stuff we've talked about. If it is morally troubling to contemplate these scenarios, doesn't this suggest that something is wrong with acting on any eugenic preference, even when no state coercion is involved? Removing the coercion does not vindicate eugenics. The problem with eugenics and genetic engineering is that they represent the one-sided triumph of willfulness over giftedness, of dominion over reverence, of molding over beholding. Why, we may wonder, should we worry about this triumph? Why not shake off our unease about genetic enhancement as so much superstition? What would be lost if biotechnology dissolved our sense of giftedness? It is more plausible to view genetic engineering as the ultimate expression of our resolve to see ourselves astride the world, the masters of our nature. But that promise of mastery is flawed.
378
01:14:36,850 --> 01:14:50,542
Chris: It threatens to banish our appreciation of life as a gift and to leave us with nothing to affirm or behold outside our own will. This is Chris, this is Kayla, and.
379
01:14:50,566 --> 01:14:53,110
Kayla: This is Ben Culter. Just weird.