Join the conversation on Discord!
Aug. 20, 2024

S6E21 - The Germ Makers


Wanna chat about the episode? Or just hang out?

Come join us on discord!

 

---

No idiot knows that he is an idiot. As a rule, those of small intellectual equipment are so sure of themselves that they are eager to make the race over in their own image.
-Clarence Darrow

 

Chris & Kayla enjoy dunking on one of history's worst ideas.

---

*Search Categories*

Science / Pseudoscience; Anthropological; Destructive

 

---

*Topic Spoiler*

Eugenics

 

---

Further Reading

https://www.britannica.com/science/eugenics-genetics

https://en.wikipedia.org/wiki/Eugenics

https://en.wikipedia.org/wiki/History_of_eugenics

https://en.wikipedia.org/wiki/New_eugenics

https://openscholarship.wustl.edu/cgi/viewcontent.cgi?params=/context/bio_facpubs/article/1001/&path_info=Eugenics__Annals_of_Eugenics_.pdf

The Deceptive Simplicity of Mendelian Genetics

https://en.wikipedia.org/wiki/Francis_Galton

https://galton.org/essays/1900-1911/galton-1904-am-journ-soc-eugenics-scope-aims.htm

https://en.wikipedia.org/wiki/Intelligence_quotient

https://en.wikipedia.org/wiki/Karl_Pearson

https://en.wikipedia.org/wiki/Adolphe_Quetelet

https://en.wikipedia.org/wiki/Charles_Davenport

The American Eugenics Records Office

https://en.wikipedia.org/wiki/John_Harvey_Kellogg

https://www.plannedparenthood.org/uploads/filer_public/cc/2e/cc2e84f2-126f-41a5-a24b-43e093c47b2c/210414-sanger-opposition-claims-p01.pdf

https://www.npr.org/sections/itsallpolitics/2015/08/14/432080520/fact-check-was-planned-parenthood-started-to-control-the-black-population

https://en.wikipedia.org/wiki/Buck_v._Bell

https://blog.petrieflom.law.harvard.edu/2020/10/14/why-buck-v-bell-still-matters/

https://en.wikipedia.org/wiki/Clarence_Darrow

The Eugenics Cult, by Clarence Darrow

https://en.wikipedia.org/wiki/Nazi_eugenics

https://en.wikipedia.org/wiki/Julian_Huxley

https://en.wikipedia.org/wiki/Directed_evolution_(transhumanism)

https://www.seenandunseen.com/transhumanism-eugenics-digital-age

https://slate.com/technology/2022/03/silicon-valley-transhumanism-eugenics-information.html

https://biopoliticalphilosophy.com/2023/01/19/transhumanism-is-eugenics-for-educated-white-liberals/

Making Us New: From Eugenics to Transhumanism in Modernist Culture

https://www.vice.com/en/article/prominent-ai-philosopher-and-father-of-longtermism-sent-very-racist-email-to-a-90s-philosophy-listserv/

https://www.truthdig.com/articles/longtermism-and-eugenics-a-primer/

https://en.wikipedia.org/wiki/Idiocracy

https://www.theatlantic.com/magazine/archive/2004/04/the-case-against-perfection/302927/

 

---

*Patreon Credits*

Michaela Evans, Heather Aunspach, Alyssa Ottum, David Whiteside, Jade A, amy sarah marshall, Martina Dobson, Eillie Anzilotti, Lewis Brown, Kelly Smith Upton, Wild Hunt Alex, Niklas Brock, Jim Fingal

<<>>

Jenny Lamb, Matthew Walden, Rebecca Kirsch, Pam Westergard, Ryan Quinn, Paul Sweeney, Erin Bratu, Liz T, Lianne Cole, Samantha Bayliff, Katie Larimer, Fio H, Jessica Senk, Proper Gander, Nancy Carlson, Carly Westergard-Dobson, banana, Megan Blackburn, Instantly Joy, Athena of CaveSystem, John Grelish, Rose Kerchinske, Annika Ramen, Alicia Smith, Kevin, Velm, Dan Malmud, tiny, Dom, Tribe Label - Panda - Austin, Noelle Hoover, Tesa Hamilton, Nicole Carter, Paige, Brian Lancaster, tiny, GD

Transcript
1
00:00:01,000 --> 00:00:24,390
Chris: Here's a few quotes from the, uh, lawyer that I mentioned, the semi-famous lawyer Clarence Darrow. Apparently, by the way, he was, like, one of those archetypal Foghorn Leghorn, I'm-just-a-simple-country-lawyer types. So he has a whole screed on eugenics, and it's called, interestingly enough, The Eugenics Cult.

2
00:00:24,510 --> 00:00:26,890
Kayla: Oh, I think we have our answer, friends.

3
00:00:27,590 --> 00:00:30,328
Chris: He wrote this in 1925.

4
00:00:30,424 --> 00:00:31,232
Kayla: Oh.

5
00:00:31,376 --> 00:00:33,128
Chris: No, that's not a oh, that's good. That's.

6
00:00:33,184 --> 00:00:35,728
Kayla: I just mean, oh for all the guys who were, like, still doing eugenics.

7
00:00:35,784 --> 00:00:40,200
Chris: Oh, yeah. That's not good for, like, the Davenports who were, like, working with Nazis in the thirties.

8
00:00:40,280 --> 00:00:44,192
Kayla: Yeah. It's pretty clear that they probably should have figured out this stuff before they.

9
00:00:44,216 --> 00:00:45,304
Chris: Weren't not being told.

10
00:00:45,392 --> 00:00:46,060
Kayla: Yeah.

11
00:00:47,240 --> 00:01:32,792
Chris: Even if human breeding could be so controlled. Sorry, I won't do that. Even if human breeding could be so controlled as to produce a race such as the eugenicists desire, we might still lose much that is worthwhile. It is hardly possible to breed certain qualities in without breeding others out. I, for one, am alarmed at the conceit and sureness of the advocates of this new dream. I shudder at their ruthlessness in meddling with life. I resent their egotistic and stern righteousness. I shrink from their judgment of their fellows. Everyone who passes judgment necessarily assumes that he is right. It seems to me that man can bring comfort and happiness out of life only by tolerance, kindness, and sympathy, all of which seem to find no place in the eugenicist creed.

12
00:01:32,976 --> 00:01:56,850
Chris: The whole program means the absolute violation of what men instinctively feel to be inherent rights, end quote. All right, you ready to just get into this?

13
00:01:57,000 --> 00:01:59,038
Kayla: Oh, yeah.

14
00:01:59,214 --> 00:02:00,494
Chris: It's hot in here.

15
00:02:00,622 --> 00:02:01,614
Kayla: Are you recording?

16
00:02:01,742 --> 00:02:18,310
Chris: Yeah, I'm recording, and I'm sweating. Kayla, are you excited to be here on Cult or Just. This is Cult or Just Weird. Are you excited to be on the show? No, I'm glad to have you on the show. Even though I always have you on the show and you have me.

17
00:02:18,430 --> 00:02:24,970
Kayla: I always have you on the show. I'm always on the show. I'm always excited to be here. I'm a little apprehensive about today's topic, however.

18
00:02:25,360 --> 00:02:35,416
Chris: Why would you be appreh. No, this is going to be a learning experience for everyone involved. I'm Chris. I am a game designer and data scientist.

19
00:02:35,568 --> 00:02:37,304
Kayla: I'm Kayla. I'm a tv writer.

20
00:02:37,392 --> 00:02:42,660
Chris: And as mentioned, this is Cult or Just Weird. I don't think we have any business, do we?

21
00:02:43,240 --> 00:02:44,504
Kayla: I don't, actually.

22
00:02:44,552 --> 00:03:14,274
Chris: Even if we did have, like. We need to get going. Like, one of the things about this topic is it's so big, and we've been trying to do short episodes this season, and I've been really struggling with that. So my compromise to myself is I'm gonna break this up into two episodes, because there's a logical break. I want to do some of our usual asking each other questions. Actually, I ask you questions. The host asks the listener. Or you're the reactor.

23
00:03:14,322 --> 00:03:14,482
Kayla: What?

24
00:03:14,506 --> 00:03:15,706
Chris: Are you listening?

25
00:03:15,778 --> 00:03:16,730
Kayla: Pretty reactive?

26
00:03:16,850 --> 00:03:28,124
Chris: Yeah. Usually one of us asks the other one a question. I'm gonna start the same way today. I wanna play a game called Guess the Wikipedia Article.

27
00:03:28,212 --> 00:03:32,520
Kayla: I don't wanna do this. Oh, no. I'm scared. Okay, let's do it. Let's go. I'm excited.

28
00:03:34,260 --> 00:03:50,128
Chris: The idea of a dystopian society based on dysgenics can be traced back to the work of eugenicist Sir Francis Galton. H.G. Wells's 1895 novel The Time Machine postulates a society of humans which has devolved due to lack of challenges.

29
00:03:50,224 --> 00:03:52,580
Kayla: Oh, I forgot that was a eugenics book.

30
00:03:53,080 --> 00:04:27,542
Chris: While the Epsilon-Minus Semi-Morons of Aldous Huxley's 1932 novel Brave New World have been intentionally bred to provide a low-grade workforce. Perhaps the best parallel. And this is actually talking about the parallel to the. So it's the title of the article. So, perhaps the best parallel is provided by the 1951 short story The Marching Morons by Cyril M. Kornbluth. By the way, I read a summary of The Marching Morons, and it is. It's a hell of a thing. It's not good.

31
00:04:27,646 --> 00:04:37,598
Kayla: Should we. It's a little too late, but I guess maybe just a nice little content warning for our listeners that we're going to be using some not so nice language.

32
00:04:37,774 --> 00:05:02,490
Chris: There's some language. It's not so nice language. And one of the things that we're going to talk about is that this is language that's actually fairly normalized, but didn't really start out that way. So there's like. I don't know, it's got a language that has a complicated history, like, most language, anyway. Okay, so what encyclopedia article do you think that whole description was from? Like, what was. What's the parallel to the marching morons that they were just talking about?

33
00:05:02,570 --> 00:05:03,794
Kayla: I am a moron.

34
00:05:03,842 --> 00:05:07,270
Chris: Self-diagnosed moron.

35
00:05:09,530 --> 00:05:18,896
Kayla: I don't have any idea what the parallel is, because I was going to be like, oh, this is pulled from the eugenics Wikipedia article. But am I close? The eugenics parallel? I don't know.

36
00:05:19,008 --> 00:05:25,980
Chris: Close. That's from the background section of the Wikipedia article on Idiocracy.

37
00:05:27,920 --> 00:05:29,232
Kayla: Of course it is.

38
00:05:29,416 --> 00:05:42,400
Chris: Are we gonna talk about that movie just now? Like, that's literally the only reason I did this bit, is to talk about Idiocracy. Cause it doesn't really fit in anywhere else. But I definitely wanna bitch about it. Cause it is, like, such a... eugenics.

39
00:05:42,480 --> 00:05:43,544
Kayla: Oh, my God. It is.

40
00:05:43,592 --> 00:06:02,234
Chris: The most eugenics movie. That paragraph really hammers that home. The idea of a dystopian society based on dysgenics can be traced back to the work of eugenicist Sir Francis Galton, who we will talk about, as he is the father of eugenics. But that's not a good thing to have your movie be about.

41
00:06:02,322 --> 00:06:05,226
Kayla: No, it's really not. It's really not.

42
00:06:05,378 --> 00:06:08,810
Chris: And we're probably gonna, like, bleed listeners here because people seem to love that movie.

43
00:06:08,850 --> 00:06:16,266
Kayla: People do love Idiocracy. And look, I love what's-his-face. I love that guy. Who's the guy that's in it? Luke. Luke Wilson.

44
00:06:16,338 --> 00:06:18,666
Chris: Luke Wilson. I love what's-his-face. What about, like, Mike Judge?

45
00:06:18,818 --> 00:06:30,474
Kayla: You know, I am a gigantic Beavis and Butt-Head fan. Love Beavis and Butt-Head. I do not like Idiocracy. And I do not like Idiocracy because it is a eugenics movie.

46
00:06:30,602 --> 00:06:37,522
Chris: It is. It's like, what if eugenics was also smug? Like, that's Idiocracy.

47
00:06:37,626 --> 00:06:39,106
Kayla: Politically smug, too.

48
00:06:39,218 --> 00:06:39,736
Chris: Right?

49
00:06:39,858 --> 00:06:46,908
Kayla: And I am resentful of the resurgence that it's had in meme culture.

50
00:06:47,004 --> 00:06:47,668
Chris: It really has.

51
00:06:47,724 --> 00:07:05,676
Kayla: Because we live in a society. Because in America, we have so much political strife, and one of our favorite things to do is go, the other side is dumb. And what better way to do that than show Idiocracy? In which the idea is that everybody became dumb over hundreds of years.

52
00:07:05,828 --> 00:07:07,604
Chris: If you eat at Fuddruckers, you're dumb.

53
00:07:07,652 --> 00:07:21,890
Kayla: You're dumb. I can't deny that. When you have a movie like Idiocracy, where there's, like, pro wrestling guys coming out at the political events, and then.

54
00:07:21,930 --> 00:07:24,418
Chris: You have, yeah, Hulk Hogan was at the RNC recently, and then you have.

55
00:07:24,434 --> 00:07:35,190
Kayla: That exact same thing happen in real life. I see the parallel. And also, this is a eugenics movie. This is a eugenics movie. Look at the Wikipedia. It's a eugenics movie. Are you going to explain what Idiocracy is?

56
00:07:36,330 --> 00:07:53,350
Chris: I mean, I don't think I have time for that, because, as stated at the top of the show, this is a really big topic, and we're trying to get it in two episodes. But it's a movie by Mike Judge. It's funny. That ties back into the first episode of the season, too, the first arc, because Luke Wilson's character gets essentially cryogenically frozen.

57
00:07:53,650 --> 00:07:54,826
Kayla: Oh, he does.

58
00:07:55,018 --> 00:08:23,870
Chris: It's like hibernation sleep, or whatever the hell, you know? But it's basically that. And so then he wakes up in the future where all of the dumb people have outbred the smart people, and everyone in America is dumb, and the conceit is all, hold on. No, you hold on. Luke Wilson. Luke Wilson is, like, in today's whatever, where he's from. Now, he's like a very average or below average intelligence person. But in the world of idiocracy that he wakes up in, he's, like the smartest one.

59
00:08:23,910 --> 00:08:24,302
Kayla: Right.

60
00:08:24,406 --> 00:08:26,726
Chris: But it's just like it's played for laughs.

61
00:08:26,758 --> 00:08:31,526
Kayla: Of course, it's not just that they're dumb, Christopher. They're dumb and fat.

62
00:08:31,678 --> 00:08:45,241
Chris: Oh, right. Well, I mean, they're dumb and they eat McDonald's. To guys like Mike Judge, that's the same thing. That is the same thing. One of the movie posters for Idiocracy is the Vitruvian Man by Leonardo da Vinci. Except it's like a fat guy with.

63
00:08:45,265 --> 00:08:49,233
Kayla: Like, a belly holding a soda and wearing flip flops.

64
00:08:49,321 --> 00:08:54,337
Chris: So fucking smug. Oh, my God. Okay. Really, though, this is not about Idiocracy.

65
00:08:54,513 --> 00:09:00,417
Kayla: This is, you know, where you can talk about Idiocracy. Come to our Discord, linked in the show notes. I will gladly bitch about Idiocracy with you on the Internet.

66
00:09:00,433 --> 00:09:41,878
Chris: That's a good idea. And in fact, we forgot to do a call to action for that, too. Yeah, go to our show notes. That's where the Discord link is. And let's. And you can argue against us, too. Like, I'm sure there's probably a lot of people out there that love Idiocracy, and tell us why we're wrong on Discord. But that was just. Yeah, just a little fun debate to kick off our two-episode arc about eugenics. Okay, I actually have another question for you as well. That's. This is more of, like, a normal Cult or Just Weird question, where I'm just gonna ask you something that will just be interesting to have on the back recesses of our minds while we talk about this topic of eugenics.

67
00:09:41,974 --> 00:10:35,590
Chris: So we will get into the details of what eugenics is, but suffice it to say, eugenics is about breeding human beings for certain qualities. I need that as a precursor to this question, because what I want to ask you is, so eugenics is typically associated with stuff we don't like. It's typically very racist, and we'll get to all this stuff. But what I want to ask you is how would you feel about it if instead of the main eugenicists of prior times and our time, instead of talking about wanting to breed for the master white race or whatever the distasteful thing is that we find distasteful, how would you feel about it if instead of veering into that, instead they were talking about selecting for things like tolerance or love or pro social stuff like, yeah, kindness. Yeah.

68
00:10:36,090 --> 00:10:39,670
Chris: And what if it worked instead of just being bad science?

69
00:10:40,730 --> 00:11:17,170
Kayla: I think, and I hope that I would still have an issue with it, because I don't think the only issue, and I'm sure you'll tell me, I don't think the only issue with eugenics is that, like, the eugenicists were selecting for, like, the wrong things or the things that I don't like. I think it's more, yes, that's bad. And also the selective breeding of humans in this way is really ethically dark and grim and sticky and murky.

70
00:11:17,830 --> 00:11:28,570
Chris: Would it be harder to answer that if I said instead of selecting for kindness, what if it was selecting against people that were going to join the KKK and do mass shootings?

71
00:11:29,050 --> 00:11:34,210
Kayla: Well, I don't believe that you can do that. I don't believe that you can do that.

72
00:11:34,290 --> 00:11:37,146
Chris: What if you could? What if we found a message that.

73
00:11:37,178 --> 00:11:55,590
Kayla: Feels that I don't think that we're gonna find that, and that does feel a little bit like we're getting into, like, arresting people for future crime to, like, a crazy degree. Right. And then it also, this conversation is making me think of conversations we've had before about things like how we can do genetic testing in utero to see.

74
00:11:56,210 --> 00:11:57,202
Chris: That's what I was gonna go to.

75
00:11:57,226 --> 00:12:08,450
Kayla: Next, to, quote unquote, catch if a fetus is going to be born with certain genetic abnormalities, and that gives the pregnant person the option then to potentially terminate the pregnancy or not.

76
00:12:08,950 --> 00:12:10,450
Chris: How would you feel about that?

77
00:12:11,310 --> 00:12:12,198
Kayla: How would I feel about that?

78
00:12:12,214 --> 00:13:04,140
Chris: Or how do I feel about that? What I'm trying to decide is whether. Because we're definitely gonna get a little bit into the birth control movement, but maybe I'll just ask you here. There is a quote from Dr. Watson. Not. Not Sherlock Holmes. Not Sherlock Holmes Watson. This is Watson-and-Crick Watson. Both of those guys apparently had some, like, really weird things to say. Watson. I'll just say it. It's. It came off the wrong way, apparently, with him. Basically, he said, like, if, in theory, there were a gene for homosexuality and we could discover it, would it be ethical for that mother to be able to terminate the pregnancy? And actually, the way he put it was, he's in favor of that, because he thinks any mother should have the freedom to terminate their own pregnancy for any reason.

79
00:13:04,680 --> 00:13:16,060
Chris: He got a lot of flack for that. Understandably so. But it does raise an interesting question of, like, just exactly how pro choice am I?

80
00:13:16,360 --> 00:14:00,942
Kayla: I know. Because I think that what I want to say to that potential pregnant person is, if you would terminate a pregnancy over the thought of raising a gay child, then I don't think that you're ready to be a parent. And I think that's a bigger conversation. But then, am I a hypocrite if I don't have that same knee-jerk reaction to somebody who would terminate a pregnancy in which trisomy 18, or something that's incompatible with life, a disability that's incompatible with life, is present in the fetus? Shouldn't I be then saying to any parent, well, unless you're ready to care for a child that has a severe disability and will die, are you not ready to be a parent yet? I don't know. It gets really. This is what I'm saying. It gets really sticky.

81
00:14:00,966 --> 00:14:01,582
Chris: Oh, it does.

82
00:14:01,686 --> 00:14:23,310
Kayla: It gets really sticky and icky. And I think that this also. I'm sure you're going to talk about this stuff. I worry about the. The power behind something like a eugenics movement and the power behind these quote unquote, sciences, because then we start getting into the forced sterilization of people.

83
00:14:23,850 --> 00:14:26,042
Chris: Well, we definitely get into the forced sterilization of that.

84
00:14:26,066 --> 00:14:37,610
Kayla: And that happens, and that has happened, and that continues to happen, and that is tied deeply to the eugenics movement and tied deeply to these questions that you and I are talking about right now.

85
00:14:37,770 --> 00:15:30,100
Chris: Yeah, it's sticky. That's why I wanted to start with a question like that, because it's not so much that I want to gotcha us into being like, pro life's better. That's not what it's about. It's more like, first of all, to help us try to maybe understand the mindset of a lot of these eugenicists. They thought that they were very much doing the right thing. A lot of these leading eugenicists from the early 20th century believed that they were on the right side of history and science supported them, and they were progressive and they were liberal and all that stuff. I think that we especially, like on this show, we've talked about eugenics several times, and certainly for me, that word just brings up. It's like a one third of a slur. You know what I mean?

86
00:15:30,640 --> 00:15:43,292
Chris: It's like one third of the word genocide. When I hear the word genocide, there's this part of my body that goes. And when I hear eugenics, it's like half of that. You know what I mean? It's like a bad word almost, because.

87
00:15:43,396 --> 00:15:48,596
Kayla: Of what it symbolizes. What the word itself is a symbol for is hideous.

88
00:15:48,708 --> 00:15:58,964
Chris: Right. And part of what I wanted to do with that question is try to get in the mindset of, like, well, these people didn't feel that way at one point. Eugenics was a good word.

89
00:15:59,012 --> 00:16:02,276
Kayla: Right. I have a question for those people.

90
00:16:02,468 --> 00:16:03,076
Chris: They're not here.

91
00:16:03,108 --> 00:16:06,652
Kayla: Well, then I have a question for you as the man speaking for these pro eugenics people.

92
00:16:06,716 --> 00:16:08,280
Chris: Oh, yeah, that's definitely me.

93
00:16:09,520 --> 00:16:22,160
Kayla: Were any of the people involved in these early eugenics movements of the population that these eugenicists were seeking to select out?

94
00:16:22,280 --> 00:16:24,632
Chris: No, I think that's maybe a problem.

95
00:16:24,696 --> 00:16:29,752
Kayla: And it goes back to this, like, what is it? No representation without. What is that? No. About us.

96
00:16:29,776 --> 00:16:31,640
Chris: No representation without representation.

97
00:16:31,720 --> 00:16:36,026
Kayla: Yeah, it goes back to that. That quote of nothing about us without us. Is that what it is?

98
00:16:36,128 --> 00:16:39,550
Chris: It's that. It's also design with, not for design.

99
00:16:39,590 --> 00:16:41,246
Kayla: With. Not for, like, design.

100
00:16:41,318 --> 00:16:43,518
Chris: Taxes without representation about us.

101
00:16:43,574 --> 00:17:06,806
Kayla: It is inherently ethically dubious for a population to be selecting out another population. That is not part of the conversation. I don't want to have this conversation about, like, who we should and should not be aborting with people who were never in danger of that. I would rather have this conversation with people who have the lived experience.

102
00:17:06,998 --> 00:17:45,954
Chris: Right, right. And I think just like, a second thing for that question is also just to kind of get in the mindset of, like, there's multiple things going on here that could be wrong. Part of it is that it's just the science behind eugenics is bad. Part of it is there's, like, the content of it. Like, what if it was good and were selecting against this versus that? And then part of it is like. Or is there some larger thing at play here that feels icky to us? And it feels like the answer to that last one is yes. Because even if we would be selecting for stuff we, like you answered, and I think I would also answer, there's something that still feels bad there.

103
00:17:46,042 --> 00:17:58,834
Kayla: And when you say the science is bad, you mean the science is unsound, ineffective. You don't mean, like, it's bad, and we should. It's morally bad, we shouldn't do it. You mean it's not sound science.

104
00:17:58,922 --> 00:18:00,266
Chris: The science is sour.

105
00:18:00,418 --> 00:18:01,230
Kayla: Okay.

106
00:18:03,050 --> 00:18:12,052
Chris: But let's back up a little bit. I do want to, as we do on the show, get into the history of eugenics, which means we have to start way back in the day.

107
00:18:12,116 --> 00:18:12,964
Kayla: I don't want to.

108
00:18:13,092 --> 00:18:52,488
Chris: Eugenics was only coined as a term at the end of the 19th century, but the idea of breeding people in certain ways, or controlling humans' genetic destiny before we had the words, you know, genetic or heredity. That kind of stuff has been around for a really long time. Plato, in his seminal work Republic, according to Britannica, depicts a society where efforts are undertaken to improve human beings through selective breeding. And then we also just have, like, you know, the general bloodline stuff. Like, bloodlines have always been a thing. Like, oh, you can only interbreed the Habsburgs with other Habsburgs.

109
00:18:52,544 --> 00:18:53,900
Kayla: That shit was weird, man.

110
00:18:54,480 --> 00:19:32,888
Chris: So we've always had some notion of what we would later come to call eugenics. But it wasn't until the late 19th century that we actually got the word eugenics. And we got that from the aforementioned father of eugenics, Sir Francis Galton. So just to kind of set the table for this era. Like, what was. Why did this time period see the explosion of interest and acceptance and scientific, quote unquote, research into eugenics? Two big things happen towards the end of the 19th century. You have Darwin publishing On the Origin of Species.

111
00:19:32,944 --> 00:19:34,736
Kayla: It's all Darwin's fault.

112
00:19:34,848 --> 00:19:38,000
Chris: And you also have Mendel, who, I don't know if you remember your.

113
00:19:38,040 --> 00:19:41,656
Kayla: Yeah, the Punnett squares. The Punnett squares with the peas. It was the peas.

114
00:19:41,728 --> 00:20:33,550
Chris: Yeah. So then you have Mendelian heredity. So you have these two big scientific advancements, frameworks for explaining things that you're able to overlay on stuff that's, like, very old. Like, you knew that heredity and bloodlines were kind of a thing, and you could breed humans. And, you know, we're breeding cows to be more domesticated. So something's going on here with breeding. But these explanatory frameworks made people kind of go like, oh, this is how it works scientifically. So we can do this with the power of science and also think about it like, we're also in just sort of an era of advancement here in the 18 hundreds. And as I'll get to in a second, a lot of these guys, a lot of these eugenicists were extremely distinguished scientists and researchers who also dabbled in eugenics.

115
00:20:33,670 --> 00:20:34,330
Kayla: Right.

116
00:20:34,790 --> 00:20:53,250
Chris: So it gets very complicated with these guys very fast. What is eugenics, though? We kind of defined it before, just so I could ask you that question. But first of all, it became an international movement, and it had many varieties. It gave birth to many varieties with many different genealogies.

117
00:20:53,590 --> 00:20:56,794
Kayla: Can we do the Punnett square for eugenics?

118
00:20:56,922 --> 00:21:41,194
Chris: I kind of feel like that's what TESCREAL is. So the general flow is from England to the US to the rest of the world, most famously Germany. And the reason I say that is because the first two big names are British guys: Sir Francis Galton, as I mentioned, and then another guy named Karl Pearson. So Pearson was actually his protege. And the two of them were sort of like the grandfather and father of eugenics, of the eugenics movement. And then from there, it sort of moved to the US. Charles Davenport was the guy, like, the guy here. There was obviously a lot of scientists, a lot of political figures, a lot of people were involved in the eugenics movement.

119
00:21:41,242 --> 00:22:14,740
Chris: But in terms of, like, if you had to point to a guy: Charles Davenport. And he was essentially, like. I wouldn't call him a student of Galton and Pearson, but he studied them. He met with them. Like, he visited Great Britain and visited the universities at which they worked in the UK. So aside from its international nature, there's also something to keep in mind as we talk about this topic. You can kind of split it into positive and negative eugenics. That doesn't mean good or bad. There's, like, another way that people try to split it that way.

120
00:22:15,210 --> 00:22:15,954
Kayla: Ew.

121
00:22:16,122 --> 00:23:02,520
Chris: Yeah. They try to talk about, like, in fact, ethical eugenics. Yeah. Liberal eugenics is what they say. So there's, like, the eugenics that's top-down and the eugenics that's bottom-up. And that's something we actually talked about with Dr. Torres a little bit. Yeah. One thing I'll say, by the way, just while we're on that tangent: liberal eugenics isn't new. My impression was that eugenics started as an idea that was purely top-down, like state sterilization-type programs. And then we decided that was bad. And now we're like, but what if we could do stuff that was more bottom-up? Like, people get to choose what their kids' genes are, or whatever. That was already. Like, the eugenics movement in the late 19th, early 20th century already featured that. They were already.

122
00:23:03,180 --> 00:23:23,392
Chris: One of the things at the time was like, hey, parents, be responsible and only, like, marry someone that is of good stock. So there was all I know, there was already a push to have individual citizens voluntarily do the eugenics a little bit.

123
00:23:23,456 --> 00:23:31,920
Kayla: Do you know what phrase, I think rings of echoes of eugenics that we still use today? I think that is just the next.

124
00:23:31,960 --> 00:23:35,752
Chris: Generation of good stock, beef stock.

125
00:23:35,856 --> 00:24:03,580
Kayla: When we talk about, oh, they're a well matched couple. You hear that all the time. And people can hide behind, like, well, yeah, because she's a marketing executive and he's a data scientist, and it's like, no, what you're talking about is that they look hot enough for each other, that they're the hot kids. The proper amount of what you think that, like, they're in the same leagues and they're gonna have the proper kids with the proper, like, waist to hip ratio and eye color and white.

126
00:24:03,920 --> 00:24:54,262
Chris: All right, getting back to positive versus negative eugenics, that doesn't mean the same thing as liberal versus top down or coercive eugenics. What that means is positive is somebody saying, like, let's try to get the good ones to breed more. And negative is, let's stop the bad ones from breeding so much. Okay, so that's. There's a differential, like. And some eugenicists cared more about one versus the other. I think Galton himself was more of a positive eugenicist, if I'm not mistaken, from my readings. But I'm going to read to you a definition according to Galton himself, from one of the papers of his that I read. Eugenics is the science which deals with all influences that improve the inborn qualities of a race, and also with those that develop them to the utmost advantage.

127
00:24:54,406 --> 00:25:14,314
Chris: The improvement of the inborn qualities or stock of some one human population will alone be discussed here. I guess here was actually in his paper, so maybe I didn't need to say that. In any case, that was his definition of eugenics, which, I mean, we kind of get it at this point. Right? Like, I just wanted to read his definition. But, like, do you feel like you get an idea of what it is?

128
00:25:14,362 --> 00:25:14,730
Kayla: Yes.

129
00:25:14,810 --> 00:25:28,946
Chris: Because it is multifaceted. Right? Like, some people are in favor of coercion, some people are not. Some people think it's about making the good ones breed, and then some people think about not making the bad ones breed. And if you're a Nazi, maybe it's about just killing them. Just in case.

130
00:25:29,058 --> 00:25:51,184
Kayla: Right. I think it's good to point out that, like, there are multiple different forms, because then when we see it pop up in its less, like, negative... is that the right term? In its less negative forms these days, like, you can still identify it. Like when, like, when Elon Musk is talking about, like, the population issue, that, like, certain people aren't having enough babies. Like, that's positive eugenics.

131
00:25:51,352 --> 00:25:52,744
Chris: That's exactly what that is. Yeah.

132
00:25:52,832 --> 00:25:53,432
Kayla: Ideal.

133
00:25:53,536 --> 00:26:46,316
Chris: Yeah. Like, as an aside, like, I kind of hate that terminology, because, like, positive makes it sound like it's good, and negative makes it sound like it's bad. Like, I wish they had thought of a different term for that. But whatever, plus or minus. One of the things I just want to kind of stop here and mention is that a big takeaway for me, from all of this research, is just how widespread and well accepted and well regarded eugenics was at the time. I mentioned this already, but we just have this bias against that word, right? At this point, it's like a very negative sounding word. And there's a reason for that. Around 1945, it stopped being used as much, but, well, it seems very late. We'll get to that. But it wasn't always that way, right?

134
00:26:46,348 --> 00:26:49,604
Chris: Like, eugenics was a positive word for a long time.

135
00:26:49,652 --> 00:26:54,972
Kayla: Right? It wasn't a dirty word amongst the pop, the general population, when it was born.

136
00:26:55,076 --> 00:27:17,904
Chris: And I know I had that bias. I think, you know, when we talk about Urantia, tea company, and all these things, you know, different things that have eugenics as a part of them on the show, I definitely had the bias of, like, well, it was like a few pseudoscientific racists, the guys in Germany, bada bing, that's it. But it was not just that. It was very much not just that.

137
00:27:17,952 --> 00:27:18,540
Kayla: Right?

138
00:27:18,840 --> 00:27:29,500
Chris: So continuing about Mister Galton here, a few fun facts about him. He was actually Darwin's half cousin, so he was born for the job.

139
00:27:30,850 --> 00:27:32,990
Kayla: Something. We gotta look at the genes of that family.

140
00:27:33,810 --> 00:28:06,200
Chris: He coined the term eugenics. He also coined the... I know it's not a term because it's two words, but nature, nurture. He coined that. And just to hit home how distinguished and how productive these guys were, these scientists who also liked eugenics tended to be, here's what Galton accomplished in his life. This is just from the intro on Wikipedia to his page. He produced over 340 papers and books. He also developed the statistical concept of correlation.

141
00:28:06,900 --> 00:28:08,404
Kayla: Like, just correlation.

142
00:28:08,572 --> 00:28:57,540
Chris: When we talk about correlation now, he basically invented that. A lot of these guys were inventing math that we still use today. That's in textbooks. He also widely promoted a relatively new idea of regression toward the mean, which is like a very important statistical concept. He was the first to apply statistical methods to the study of human differences and inheritance of intelligence. On that note, he created sort of like a proto IQ test. He was the first one to create a test of human intelligence, the first one to do that. He introduced the use of questionnaires and surveys for collecting data on human communities. So, like, I guess people weren't surveying humans before that. He was also a polymath, so he was, like, just good at a lot of stuff. He also founded fields called psychometrics and differential psychology.

143
00:28:57,960 --> 00:29:52,738
Chris: He devised a method for classifying fingerprints that proved useful in forensic science. As the initiator of scientific meteorology, he devised the first weather map, proposed a theory of anticyclones, and was first to establish a complete record of short-term climatic phenomena on a European scale. And then it talks here about... he has another couple, like, little physical inventions that he made. He was knighted, and there's a few other minor things as well. So I know that sounds like a lot. That was like a lot of word vomit, but, like, you get the idea of this guy. And Pearson, following him, and several other of these guys were very similar. In fact, Pearson invented statistical mathematics. So these guys were just like, a lot of them were statisticians. A lot of them were just, like, scientists, but then they also were proponents of eugenics.

144
00:29:52,914 --> 00:29:54,510
Chris: And, you know, it's.

145
00:29:54,850 --> 00:29:57,386
Kayla: This is just the Nobel disease, like, writ large.

146
00:29:57,538 --> 00:30:11,842
Chris: It is. And it's really interesting. Like, these days, we sort of feel like there's maybe, like, a conservative-liberal divide around racism. I think conservatives would probably disagree with that, but I think I would disagree.

147
00:30:11,866 --> 00:30:12,530
Kayla: With that now, too.

148
00:30:12,570 --> 00:30:41,238
Chris: Yeah, probably. But certainly back in these days, right, it was not that way. Like, racism was just a thing that everybody did. It wasn't like a. Like, you could be a progressive and be a racist. And that's exactly what this guy was like. It specifically talks about him being a scientific racist in several places. But again, scientific racism, like, there's a difference between saying I don't like this particular race versus I scientifically think that they're inferior.

149
00:30:41,334 --> 00:30:41,970
Kayla: Right.

150
00:30:42,360 --> 00:31:33,160
Chris: One is a little more insidious, maybe, than the other. All right, so we already talked a little bit about Karl Pearson. Karl Pearson was his protege. He was also from the UK. I won't go into all the details about him, but suffice it to say, very accomplished. And then I also mentioned Charles Davenport. So he was the US guy. So he founded something in the US called the ERO, the Eugenics Record Office, located in Cold Spring Harbor, New York. Which, Cold Spring Harbor is just, like, a cool name. And he found support for that with Carnegie money. So the Carnegie Institution funded the ERO. Although I will say that later, once the ERO was closed, after eugenics was widely discredited, the Carnegie Institution pulled their funding. That's part of why they closed. But he was responsible for the adoption...

151
00:31:33,900 --> 00:31:52,586
Chris: I don't want to say he was solely responsible, but he was like the vanguard guy in America for eugenics. And it got pretty bad over here. So when I talk about bad, I talk about, like, forced sterilization. Yeah, we did that. We did a lot of that.

152
00:31:52,778 --> 00:31:56,010
Kayla: We did that up until, like, the 1970s, and we're probably still doing it.

153
00:31:56,130 --> 00:32:01,706
Chris: I think the last sterilization law went off the books in like, the eighties or something.

154
00:32:01,778 --> 00:32:02,830
Kayla: Jesus Christ.

155
00:32:04,100 --> 00:32:12,132
Chris: They, you know, they really, they slowed down quite a bit. But do you want to guess how many, like, the estimate for how many sterilizations were performed in the United States?

156
00:32:12,276 --> 00:32:14,132
Kayla: Like, over how many years?

157
00:32:14,316 --> 00:32:18,316
Chris: This was in the early. I'll say this is over like 30 years.

158
00:32:18,388 --> 00:32:19,200
Kayla: A million?

159
00:32:20,260 --> 00:32:22,404
Chris: Okay. Good guess. No, I was just going.

160
00:32:22,452 --> 00:32:23,764
Kayla: I hope that's a crazy number.

161
00:32:23,812 --> 00:33:12,368
Chris: It is. 60,000 sterilizations were performed in the US. States started writing these laws in the 1910s. Is that what you call that decade, the 1910s? They started writing these laws, and then, of course, they got challenged, because they're nuts. The most famous case was called Buck v. Bell. A young woman by the name of Carrie Buck. I encourage you to look this case up, because it is interesting. The bottom line is she ended up getting assigned to one of these institutions, institutions for the feeble-minded. That was a word, actually. Feeble-minded is... we talked about this at the top of the episode. A lot of words that actually meant something during this time were, like, a diagnosis. They were like a diagnosis.

162
00:33:12,424 --> 00:34:03,908
Chris: I don't think feeble-minded was a diagnosis, but it was a word that... it was like a catch-all that eugenicists liked to use to talk about people who had something wrong with their brain. Or so they thought. When you actually look at these people, it's like, no, she was just poor. Although they also wanted to... they didn't want poor people to breed either. Yeah, it wasn't just like, hey, you realize this person isn't feeble-minded. They just, like, you know, that was the thing. Like, they wanted to sterilize prostitutes and criminals and basically all these, quote unquote, lower members of society. It all just kind of went in this one bucket. Anyway, the reason I bring up Buck v. Bell is that was the case where Carrie Buck and her lawyer challenged the constitutionality of this law.

163
00:34:04,044 --> 00:34:09,467
Chris: It was a Virginia law, and it went all the way up to the Supreme Court. And the Supreme Court was like, no, that's cool. You can do that.

164
00:34:09,563 --> 00:34:10,179
Kayla: Good Lord.

165
00:34:10,219 --> 00:34:16,532
Chris: So the Supreme Court affirmed that states could forcibly sterilize their citizens.

166
00:34:16,636 --> 00:34:17,764
Kayla: When was this?

167
00:34:17,931 --> 00:34:22,692
Chris: So the case, Buck v. Bell, was decided in 1927.

168
00:34:22,795 --> 00:34:23,684
Kayla: Oh, my God.

169
00:34:23,812 --> 00:35:03,442
Chris: So again, this is all, like... when you think about Francis Galton, he's, like, on the crossover from the 1800s to the 1900s. Pearson kind of comes a little bit after him, and then Davenport a little bit after them. And then we're into this sort of, like... basically, eugenics was all the rage from, like, 1900 to the 1930s. Like, it wasn't just scientists. It was also, like I said, politicians. And it became very popular with just the American people in general. So once this was approved by the Supreme Court, there were other states that got into the whole sterilization game. Of the 60,000, by far the leader was California.

170
00:35:03,586 --> 00:35:08,114
Kayla: I feel like I knew that about California. And I feel like we were sterilizing people really late.

171
00:35:08,162 --> 00:35:10,466
Chris: The absolute best at it.

172
00:35:10,538 --> 00:35:11,430
Kayla: Good Lord.

173
00:35:12,570 --> 00:35:22,570
Chris: They forcibly sterilized approximately 20,000 of the 60,000 in America. And they were so good at it that the Germans...

174
00:35:22,690 --> 00:35:23,414
Kayla: No.

175
00:35:23,602 --> 00:35:30,118
Chris: Came over to kind of figure out what was going on, kind of study how we were doing things over here in the Golden State.

176
00:35:30,254 --> 00:35:34,894
Kayla: Weren't we doing it to a lot of Native populations as well? Like, Native American people?

177
00:35:35,022 --> 00:35:53,918
Chris: I did not come across that, but I am sure. I am sure because, again, the line from, well, we should try to breed our race to become nicer and better and smarter. The line from that to like. And by that, I mean people who don't look like me.

178
00:35:54,014 --> 00:35:54,366
Kayla: Right.

179
00:35:54,438 --> 00:35:58,158
Chris: Is like. It's just very short. It's a very short line.

180
00:35:58,214 --> 00:35:58,810
Kayla: Right?

181
00:35:59,310 --> 00:36:21,016
Chris: So, yeah, so the Nazis really learned a lot from us when it comes to eugenics, unfortunately. Here's a quote I don't think you're going to like. Davenport, remember Charles Davenport, the main eugenics guy in the US, continued to maintain connections with German colleagues in the 1930s. So the thirties is when they were starting to do their shit.

182
00:36:21,128 --> 00:36:21,824
Kayla: Yeah.

183
00:36:21,992 --> 00:36:37,976
Chris: Quote: drawing a line between science and politics allowed him to cultivate and extend his relations with Nazi racial hygienists, end quote, in the years before World War II.

184
00:36:38,048 --> 00:36:53,950
Kayla: That's a fun thing that we don't often talk about over here: that a lot of Americans, and a lot of America, Washington, were not necessarily anti the Nazis for a very long time. Particularly these guys. Yeah.

185
00:36:54,610 --> 00:37:15,798
Chris: The impression I have from doing this research is that there's, like, a smooth development of eugenics across these decades and through these countries, and Germany's policies were simply a next step. And I think I even read that, like, some of them were, like, a little surprised when we were, like, aghast at the end of the war.

186
00:37:15,914 --> 00:37:16,610
Kayla: Yeah.

187
00:37:17,230 --> 00:37:28,918
Chris: Some of them were like, but we got this from you. In fact, specifically, there was a... so they performed, according to the research I did, 450,000 forced sterilizations in Germany.

188
00:37:28,974 --> 00:37:29,574
Kayla: Jesus Christ.

189
00:37:29,622 --> 00:37:59,360
Chris: Far less than the number of people they killed. But the killing was, again, it was for eugenic reasons. Right. It was just the next step. It was just like, well, it's so hard to sterilize these people. What if we just kill them? So they sterilized a lot of people. And that came under scrutiny at the Nuremberg trials. And one of the Nazis that was being tried there pointed to Buck v. Bell and said, your own Supreme Court says that this is okay, so why are you putting us on trial?

190
00:38:00,060 --> 00:38:03,756
Kayla: I don't. Oh, it feels really bad.

191
00:38:03,948 --> 00:38:20,014
Chris: Yeah, it's not great. And it wasn't just Davenport, by the way. There were other Americans that were... other American eugenicists were involved in Germany's eugenics program, which, again, just seemed like another eugenics program.

192
00:38:20,142 --> 00:38:20,774
Kayla: Right, right.

193
00:38:20,822 --> 00:38:24,998
Chris: It wasn't until it went maybe a little too far for most people. Not everyone, but most people.

194
00:38:25,134 --> 00:38:27,654
Kayla: Until it went to its logical conclusion.

195
00:38:27,702 --> 00:38:33,462
Chris: Logical conclusion. Which, you know, 20/20 hindsight. Right. Because at the time, they all thought eugenics was good.

196
00:38:33,526 --> 00:38:34,130
Kayla: Right?

197
00:38:34,870 --> 00:38:46,506
Chris: Oh, actually, let's apply the brakes a little bit there. Like everything. It's not 100% one way, it's not 100% the other way. There were actually a lot of contemporary critiques of eugenics.

198
00:38:46,618 --> 00:39:05,128
Kayla: See, I didn't know that. That's good to hear. I should have assumed that. And this is why I get a bee in my bonnet when we want to apply the, like, well, nobody knew any better back then. Because there usually was not an insignificant amount of people who did, quote unquote, know better.

199
00:39:05,234 --> 00:39:48,696
Chris: Yeah. It's sobering for me to remember that eugenicists weren't the, quote unquote, bad guys, and they, you know, they enjoyed wide support. It's also sobering to know that the wide support wasn't universal, and there were some very sharp critics. Some of the most well known and vociferously opposed to eugenics were folks like Clarence Darrow, who's kind of like a famous lawyer; I'm going to quote him in a minute. A guy named Franz Boas; he had some apparently well known, well regarded debates with Mister Davenport on the subject of eugenics. And then there was a scientist by the name of Thomas Hunt Morgan. And again, I'm trying to keep this episode short, which is why I'm, like, not really going into the details about who these guys were.

200
00:39:48,808 --> 00:39:51,232
Kayla: It sounds like we could do an entire season on this and we're not.

201
00:39:51,256 --> 00:40:46,160
Chris: Doing it, for sure. So Thomas Hunt Morgan was actually a scientist who brought some of the first major research that was like, hey, I didn't find anything, you guys. So here's a quote from his entry in... this is either Britannica or Wikipedia, I did not write down which. The first major challenge to conventional eugenics based on genetic inheritance was made in 1915 by Thomas Hunt Morgan. He demonstrated the event of genetic mutation occurring outside of inheritance, involving the discovery of the hatching of a fruit fly. So fruit flies are, like, you know, the thing that people who study inheritance use, because they have very quick turnover. He demonstrated that mutation could happen outside of inheritance. So it says here it had white eyes from a family with red eyes, demonstrating that major genetic changes occurred outside of inheritance.

202
00:40:47,100 --> 00:41:38,718
Chris: Additionally, Morgan criticized the view that certain traits, such as intelligence and criminality, were hereditary, because these traits were subjective. So those were some of the guys that were anti-eugenics. Some of their criticisms, aside from the one we just talked about, where this Thomas Morgan guy did research and was like, this doesn't work... that's just a subset of one of the major critiques, which is just that the scientific rigor for this was actually trash. A lot of critics at the time noted that, in the process of data gathering, eugenicists gathered largely anecdotal evidence by self-administered questionnaires or other means whose uniformity or accuracy could not be verified. So basically what was happening is the ERO, the Eugenics Record Office, these guys... there was just so much fucking confirmation bias going on here. Sure, sure.

203
00:41:38,814 --> 00:41:45,550
Chris: Like, these guys assumed the thing that they were trying to prove, which is that these traits were heritable.

204
00:41:45,710 --> 00:41:47,262
Kayla: That's the bad science you were talking about.

205
00:41:47,286 --> 00:42:27,306
Chris: And that's the bad science I'm talking about. And they would basically get these family genealogies and be like, yes, Uncle Franz was a criminal, just like Jacob here in the second generation. And then Jacob's brother did deviant behavior, which was just like, he said hi to a girl in fifth grade or whatever. So they would cherry pick. So they would cherry pick those, but they'd be like, those guys look bad. So, of course we need to prevent these people from breeding. But then they would ignore the other ten close relatives on that same tree that didn't have anything that they could point at, which is so silly, because.

206
00:42:27,338 --> 00:42:34,258
Kayla: It's like you're pointing at one thing when there's a million explanations, and that's bad science, right?

207
00:42:34,434 --> 00:42:55,838
Chris: We're into the section here where there's a million things wrong with it. One of the things is how arbitrary and abstract these concepts are. It's absurd to think that you can easily account for something called quote unquote intelligence. But also, even if you could, these guys were cherry picking data, right?

208
00:42:55,894 --> 00:42:56,134
Kayla: Right.

209
00:42:56,182 --> 00:43:38,078
Chris: So it wasn't like... there were, like, structural problems with this, and then there's also the, like, we're just going to cherry pick the things that make us right. So that Morgan guy, that critic we talked about earlier, also on the cons side, he made the comparison of dealing with mental defects to dealing with communicable diseases. So he said, basically, like, in the past we could have, like, bred people for greater resistance to cholera or tuberculosis, but it's way easier and quicker and less painful to just clean up the environment and, you know, give people vaccines or whatever, than it is to try to change the fundamental nature of human constitutions.

210
00:43:38,174 --> 00:43:39,006
Kayla: Yeah, that. Yeah.

211
00:43:39,078 --> 00:44:28,140
Chris: Why aren't we just doing that? So there's, like, a practical argument against it. I mentioned the guy Franz Boas before. He was a professor of anthropology at Columbia University and a strong opponent. He wrote in Scientific Monthly in 1916 that it would seem the first obligation of the eugenicist ought to be to determine what traits are truly inherited and which ones are not. Unfortunately, he wrote, this has not been the method pursued. The battle cry of the eugenicists, nature not nurture, has been raised to the rank of dogma, and the environmental conditions that make and unmake man, physically and mentally, have been relegated to the background. Eugenicists, Boas complained, simply assumed that most traits were inherited, which was, of course, the point to be proven in the first place.

212
00:44:28,760 --> 00:44:31,380
Kayla: Oh. I heard a d word in there.

213
00:44:31,680 --> 00:44:32,616
Chris: What's the d word?

214
00:44:32,688 --> 00:44:33,544
Kayla: Dogma.

215
00:44:33,672 --> 00:45:02,942
Chris: Oh, dogma. Yeah, that's a criteria. Yeah... so, a criterion, excuse me. Here's a few quotes from the other... the lawyer that I mentioned, the semi-famous lawyer Clarence Darrow. Apparently, by the way, he was like one of those archetypal Foghorn Leghorn, like, I'm just a simple country lawyer types. But his pedigree... not to, like, not to get eugenicist about this, but, like, he had...

216
00:45:02,966 --> 00:45:04,446
Kayla: He was really smart.

217
00:45:04,598 --> 00:45:26,358
Chris: Apparently. He had, like, ancestors that had served in the American Revolution. And his father was, according to Wikipedia, an ardent abolitionist and proud iconoclast and religious freethinker who was known throughout his hometown as the village infidel. And his mom was an early supporter of female suffrage and women's rights.

218
00:45:26,494 --> 00:45:28,246
Kayla: I could be called the. Oh, my God.

219
00:45:28,318 --> 00:45:59,228
Chris: I know, I know. So this guy was pretty badass. He was also, like, extremely... Clarence Darrow was, like, extremely intelligent. He was, like, the trope not just of the simple country lawyer, but of the simple country lawyer that's way smarter and more well read than you are. He was a leading member of the ACLU for a while. He did stuff with trade union causes, and he also was the lawyer on the side of, if you remember, the Scopes Monkey Trial.

220
00:45:59,284 --> 00:46:00,420
Kayla: Oh, you better believe I do.

221
00:46:00,500 --> 00:46:12,130
Chris: Yeah. So he was also the good guy lawyer in that one. So he has a whole screed on eugenics, and it's called, interestingly enough, Cult of Eugenics.

222
00:46:12,250 --> 00:46:14,630
Kayla: Oh, I think we have our answer, friends.

223
00:46:15,330 --> 00:46:18,058
Chris: He wrote this in 1925.

224
00:46:18,154 --> 00:46:19,106
Kayla: Oh.

225
00:46:19,298 --> 00:46:20,762
Chris: No, that's not oh, that's good.

226
00:46:20,866 --> 00:46:23,458
Kayla: I just mean "oh" for all the guys who were still doing eugenics.

227
00:46:23,514 --> 00:46:27,946
Chris: Oh, yeah, that's not good for, like, the Davenports, who were, like, working with Nazis in the thirties.

228
00:46:28,018 --> 00:46:31,586
Kayla: Yeah, it's pretty clear that they probably should have figured out this stuff before.

229
00:46:31,698 --> 00:46:40,144
Chris: They weren't not being told. Even if human breeding could be so controlled. Sorry, I won't do that. He didn't actually say.

230
00:46:40,152 --> 00:46:41,992
Kayla: That was pretty good. I was kind of hoping you were going to do it.

231
00:46:42,016 --> 00:47:30,706
Chris: I tried to. Like, I found a clip of him talking about something else, and it wasn't quite Foghorn Leghorn, but it was still pretty good. Even if human breeding could be so controlled as to produce a race such as the eugenicists desire, we might still lose much that is worthwhile. It is hardly possible to breed certain qualities in without breeding others out. I, for one, am alarmed at the conceit and sureness of the advocates of this new dream. I shudder at their ruthlessness in meddling with life. I resent their egotistic and stern righteousness. I shrink from their judgment of their fellows. Everyone who passes judgment necessarily assumes that he is right. It seems to me that man can bring comfort and happiness out of life only by tolerance, kindness and sympathy, all of which seem to find no place in the eugenicist creed.

232
00:47:30,848 --> 00:47:37,294
Chris: The whole program means the absolute violation of what men instinctively feel to be inherent rights. End quote.

233
00:47:37,382 --> 00:47:39,910
Kayla: That. That. That's the whole argument?

234
00:47:39,990 --> 00:47:41,982
Chris: That's the whole argument from 1925.

235
00:47:42,046 --> 00:47:43,250
Kayla: Damn, that's good.

236
00:47:43,590 --> 00:48:07,170
Chris: Here's a second quote by him. In an age of meddling, presumption, and gross denial of all the individual feelings and emotions, the world is urged not only to forcibly control all conduct, but to remake man himself. Amongst the schemes for remolding society, this is the most senseless and impudent that has ever been put forward by irresponsible fanatics to plague a long-suffering human race.

237
00:48:07,470 --> 00:48:08,290
Kayla: Dang.

238
00:48:08,710 --> 00:48:11,102
Chris: So these weren't like, I don't know, maybe it's wrong.

239
00:48:11,246 --> 00:48:16,222
Kayla: This was like, this shit's bad, you guys. And you're feeble minded for liking the.

240
00:48:16,286 --> 00:48:57,576
Chris: No, he actually did. Oh, so he has another quote from this exact same, I don't know, blog post, whatever the hell it was called back then. No idiot knows that he is an idiot. As a rule... and by the way, idiot was a term that was, again, like a diagnosis-type term back then. Probably a problematic thing to use in this context, but I am quoting him directly, and I think he knew what he was doing when he used this word. I think he was specifically trying to, like, use eugenics terminology here against them. Quote: no idiot knows that he is an idiot. As a rule, those of small intellectual equipment are so sure of themselves that they are eager to make the race over in their own image. End quote.

241
00:48:57,768 --> 00:49:07,112
Kayla: That's a good point. I think that just like. Yeah, the people that are being like, I think we should do this. I don't. Again, I don't think that they're looking.

242
00:49:07,136 --> 00:49:08,152
Chris: They're the dumb ones.

243
00:49:08,216 --> 00:49:14,328
Kayla: I don't think they're looking at themselves and going like, what are my deficiencies that I would select against? I think they're looking at other people.

244
00:49:14,424 --> 00:49:25,416
Chris: And my big takeaway from Clarence Darrow is just the shocking arrogance of eugenicists to be like, this is correct. I know better.

245
00:49:25,528 --> 00:49:26,180
Kayla: Right?

246
00:49:27,680 --> 00:49:38,952
Chris: So there were some famous eugenicists. I just want to list them here, 'cause it's interesting. I think a lot of people, by now, maybe know about John Harvey Kellogg. So that's, like, the Kellogg's story.

247
00:49:38,976 --> 00:49:43,744
Kayla: Wait, the Kellogg guy? I mean, I knew he was a nutter, but I didn't know he was a.

248
00:49:43,832 --> 00:49:45,008
Chris: He was a nutter and a eugenicist.

249
00:49:45,024 --> 00:49:47,704
Kayla: I knew he was, like, the anti-masturbation guy. An anti-sex guy.

250
00:49:47,832 --> 00:49:49,712
Chris: Yeah, that was the.

251
00:49:49,816 --> 00:49:51,152
Kayla: I didn't know that part.

252
00:49:51,296 --> 00:50:02,262
Chris: The cornflakes were so you wouldn't jerk off. Yeah. But, yeah, he was also into eugenics. Helen Keller, I have heard this.

253
00:50:02,446 --> 00:50:03,958
Kayla: How did you confirm that?

254
00:50:04,094 --> 00:50:06,206
Chris: I just looked it up in several places.

255
00:50:06,318 --> 00:50:06,990
Kayla: I have heard that.

256
00:50:07,030 --> 00:50:10,574
Chris: Several places that I trust in my trust network. Alexander Graham Bell.

257
00:50:10,702 --> 00:50:11,770
Kayla: I believe it.

258
00:50:12,310 --> 00:50:12,798
Chris: Yeah.

259
00:50:12,854 --> 00:50:15,470
Kayla: He wanted people to say ahoy when they answered the phone, right?

260
00:50:15,510 --> 00:50:42,250
Chris: Oh, yeah. We should breed that out. Teddy Roosevelt, who was like a very mixed-bag guy, so that doesn't really surprise me at all for him. Same with a guy that I'll get back to either this episode or next, called Adolphe Quetelet. He was a eugenicist. He was also the guy that invented BMI. Oh, and it was actually called the Quetelet index for a while, until it was changed to BMI.

261
00:50:42,330 --> 00:50:45,960
Kayla: If you're still using the BMI as a measure of anything, maybe think about that.

262
00:50:46,130 --> 00:51:23,300
Chris: And then the influential economist John Maynard Keynes. If you've ever heard of Keynesian economics, that's him. He's kind of like the opposite of supply side. Unfortunately, his economic theories support the idea of government intervention, and he's sort of anti laissez-faire. But he was also a eugenicist. According to his article, he was a prominent supporter of eugenics, serving as the director of the British Eugenics Society and writing that eugenics is, quote, the most important, significant and, I would add, genuine branch of sociology which exists.

263
00:51:23,840 --> 00:51:25,272
Kayla: What is wrong with these guys?

264
00:51:25,376 --> 00:51:50,140
Chris: Fat blind spots that these guys tend to have. And then, of course, there's a little bit of an elephant in the room with one Margaret Sanger, who was at the forefront of the birth control movement. She founded the organization that would later go on to be called Planned Parenthood. That was hard research to do, because there's so much propaganda trying to...

265
00:51:50,180 --> 00:52:05,340
Kayla: Like, that's where I've heard of this. Not the actual history, but, like, the folks who are... sorry if I'm stepping on you, but folks who are anti-abortion claiming that the whole point of Planned Parenthood was Margaret Sanger eugenicsing black people.

266
00:52:05,500 --> 00:52:55,122
Chris: Yeah. So... and I actually saw... it was tough to tease apart, because I saw some places where it seemed like I should place someone in my trust network, and 90% of what they were saying I could confirm, but then they were actually parroting that talking point. And so I was like, so is that true? And then I looked it up, because there is a statement... it's not... so, there is a statement from Planned Parenthood basically sort of, like, denouncing Margaret Sanger, at least her involvement in the eugenics movement. It still isn't ultra clear to me. There are some arguments that, like, oh, well, the birth control movement just needed the oomph and funding that being pro-eugenics would give them. There's also just, like... birth control and eugenics, as we've talked about, do have, like, a bit of a natural tie.

267
00:52:55,186 --> 00:52:55,970
Kayla: Right, right.

268
00:52:56,090 --> 00:53:22,582
Chris: Like, if you want to prevent people from being born for XYZ reasons, that involves birth control. As far as the, like, putting clinics in predominantly Black neighborhoods, that did happen. It wasn't for eugenic reasons. It was more just that the need was there, as far as I'm aware. And she partnered with Black leaders at the time to put those clinics there. So it wasn't like, you guys should stop having babies. She might have believed that anyway, but that's not what the clinics were for.

269
00:53:22,606 --> 00:53:27,966
Kayla: And it's still a conspiracy theory these days to think that Planned Parenthood's ultimate nefarious goal is to do.

270
00:53:27,998 --> 00:54:14,008
Chris: Eugenics, which is freaking absolutely not. And even if you want to believe the worst propaganda about Sanger, Planned Parenthood has, like, issued a statement that goes into all this. Like, it details out, here's the bad things she did and why we don't like it. But now let's jump ahead to the end of World War II. So I said 1945 was around when it got discredited. Basically what happened is, once the Allies won the war and kind of peeked under the hood in Nazi Germany, we were like, oh, that is the natural conclusion to that, isn't it? There's a lot of dead people. That's really bad. That's really bad. We shouldn't do this anymore. So it basically just suffered the worst brand hit in all history.

271
00:54:14,184 --> 00:54:36,440
Chris: There were all kinds of organizations that had eugenics in their name that changed their names. In fact, there's a term called crypto-eugenics, where basically the idea is, did eugenics go underground and rebrand, or did it get discredited? It's a little bit of both. Some of these organizations that were eugenics organizations probably were still mainly doing the same things and espousing the same ideals.

272
00:54:36,520 --> 00:54:39,340
Kayla: Changing your name, does that mean changing your act?

273
00:54:39,750 --> 00:55:38,656
Chris: No, but the salient point there, I think, is that the popular support for eugenics just kind of evaporated. The point where it started having that negative overtone that it has today, that I'm used to, was after World War II. I will also say, not to go back and hate America first again, but I'm down: there's definitely contention around our eugenic policies, which were, surprise, very much tied to immigration policies. So, like, the eugenicists succeeded in getting a lot of anti-immigration laws passed, because we didn't want that bad stock coming over here and polluting our good American stock. And a lot of that bad stock was, you know, Jewish folks from Europe, right, who otherwise might have been able to refugee themselves over here and couldn't because of the immigration policies that we passed.

274
00:55:38,848 --> 00:55:47,580
Chris: So just another little thing there. So we already talked about some of the debunking of eugenics. I said, there's a million different reasons.

275
00:55:47,880 --> 00:56:05,840
Kayla: And I don't know if you get into this later, but I think that just in talking about the, like, did it go underground and the trends and where it falls, if it's about immigrants, blah. I think that for a long time, the stuff that led to sterilization, forced sterilization, made its way into our prison system.

276
00:56:06,740 --> 00:56:10,284
Chris: I didn't go into that research, but that does sound exactly right.

277
00:56:10,332 --> 00:56:15,956
Kayla: Like, it seems like eugenics kind of goes where we need it to go in order to uphold these certain systems.

278
00:56:16,028 --> 00:56:41,588
Chris: Yeah. Criminality was one of the big things that they thought they could breed out of human beings. Again, very precise with their terminology: criminality. Okay. So, like I said, there's a million reasons why eugenics is not good. To help my thoughts organize around this, I tend to use the Ian Malcolm quote: just because we can doesn't mean that we should.

279
00:56:41,684 --> 00:56:43,320
Kayla: That's from Jurassic Park, you guys.

280
00:56:44,100 --> 00:57:31,318
Chris: So there's the just-because-we-can part. Can we? Is it good science? Is it possible? Not really. And then there's the even-if-it-was-possible, should we? Right, so there's the can and there's the should. So we already talked a little bit about the can. We talked about, like, how much confirmation bias went into the data collection that a lot of these eugenicists were doing. I also want to hit on a couple other things. They like to compare it to domestication of things like, you know, crops or livestock. Like, hey, we did that, why can't we do that with people? A very good point that I read was, like, are we sure that's good? Mmm. Like, think about it from the cow's perspective. Are they actually better off? I don't think so.

281
00:57:31,374 --> 00:57:33,638
Kayla: The cows are not better off. The chickens are not better off.

282
00:57:33,694 --> 00:57:34,718
Chris: None of those animals.

283
00:57:34,814 --> 00:57:40,678
Kayla: And even in some ways, there are instances in which the ways we've done, like, the selective breeding in agriculture, that.

284
00:57:40,694 --> 00:57:42,550
Chris: Has also. Monoculture has been.

285
00:57:42,590 --> 00:57:43,870
Kayla: Bad. Been bad to us.

286
00:57:43,990 --> 00:57:51,610
Chris: Yeah, but, like, if you think about it from the perspective of those animals, none of those things are good for them. It's good for us. We get more cows and they're bigger or whatever.

287
00:57:51,650 --> 00:57:55,538
Kayla: But, like, domesticated cows are not superior to wild cows.

288
00:57:55,554 --> 00:58:00,018
Chris: If you gave them the option to do that to themselves, I don't think that they would.

289
00:58:00,074 --> 00:58:01,474
Kayla: We didn't uplift the cows.

290
00:58:01,522 --> 00:58:07,994
Chris: So it's not a really good analogy to say, like, we made cows better, so we can make ourselves better. No, not if we're doing that.

291
00:58:08,042 --> 00:58:09,642
Kayla: We could make ourselves tastier.

292
00:58:09,826 --> 00:58:29,384
Chris: Yeah, we could probably make ourselves tastier. I also. This is a quote. I think this is from that Morgan guy that did the fruit fly research. But he basically said, would you say that you could enact a successful breeding program for ants based on your knowledge of oyster biology? You probably wouldn't say that.

293
00:58:29,432 --> 00:58:30,336
Kayla: That's true.

294
00:58:30,488 --> 00:59:26,140
Chris: Then why do you think that we can do a human breeding program based on pea pods and fucking fruit flies? And as an aside here, the pea pods thing: Mendel's experiments are, like, shockingly simplified, and I think it's generally considered a good thing. We might not have ever discovered the types of inheritance patterns that he discovered if he hadn't done it purposely. He basically went and got the most basic peas that he could find, ones that had well-known properties of passing on traits to the next generation. Like, he purposely sought out a plant that could control for a lot of complexifying factors that occur in many other species. So even using that as a way to extrapolate, there's ten different reasons why it's a bad idea.

295
00:59:26,680 --> 01:00:11,752
Chris: And then, in terms of the bad science, genetic inheritance is extremely complex, much more complex than the eugenicists at the time gave it credit for, much more than they even knew about. There's a complex interplay between genes, environment, epigenetics, and more when you're talking about the development of an organism. There have been huge, significant discoveries in the science of genetics since the early 1900s. For example, epigenetics, they didn't know about that. In the 1960s, non-coding DNA was discovered, and at first it was thought to be junk DNA. Now we know that it does all kinds of different regulatory stuff inside the nucleus, and something like 99% of the DNA in the genome is non-coding DNA.

296
01:00:11,816 --> 01:00:12,780
Kayla: Oh, my God.

297
01:00:13,400 --> 01:00:42,306
Chris: Just to be clear, non-coding means it does not contain instructions to build a protein. It's there for some other reason. So there was just, like, a lot going on here that they didn't even know about. Like, the unknowns. But again, they're, like, way too arrogant and way too confirmation biased. It's not even confirmation bias. It's, like, motivated reasoning. Right? It's like, I really want this to be true, that intelligence is simple and is simply inherited, because I just want it to be.

298
01:00:42,378 --> 01:01:09,720
Kayla: I think that's why, when you asked me, like, should we do eugenics if we could do it good, I didn't want to invoke the, like, well, I don't think people should be playing God thing. But also, like, this is kind of the reason why we should be the most careful when it comes to, quote unquote, playing God in this way, because we are not God, and we probably will never be as omniscient as a mythological God.

299
01:01:10,260 --> 01:01:11,668
Chris: Think for yourself, Kayla.

300
01:01:11,804 --> 01:01:12,880
Kayla: We need to.

301
01:01:13,420 --> 01:01:14,564
Chris: I'm doing nootropics.

302
01:01:14,652 --> 01:01:19,692
Kayla: Have a little bit of humility in thinking that, like, we're not at the. We're never at the end of science.

303
01:01:19,796 --> 01:01:49,570
Chris: Yeah. The humility is drastically lacking in this whole venture. So those are, like, sort of my summary of why the science isn't good. Just because we can? Well, we can't do it. But even if we could, it doesn't mean we should. We also shouldn't. So, the shouldn't reasons. There's a bunch of those, too. We already talked about it as, like, a terrible use of resources. Like, wouldn't it be smarter to just, whatever the thing is, shouldn't we just do a social program? Wouldn't that be much easier than, like, changing the entire race rather than, like.

304
01:01:49,870 --> 01:01:52,454
Kayla: Breeding out the poor? Shouldn't we just, like.

305
01:01:52,542 --> 01:01:53,250
Chris: Yeah.

306
01:01:53,620 --> 01:01:57,412
Kayla: Structure our societies differently? Way easier than ending up with a lot of poor people?

307
01:01:57,556 --> 01:02:53,872
Chris: And as a lazy person, that reason appeals to me quite a bit. We just talked about the sort of, like, height-of-arrogance problem. There's a researcher named Andrzej Pękalski, from the University of Wrocław, who argues that eugenics can cause harmful loss of genetic diversity. You were kind of talking about this actually with the bananas. With the bananas, the monoculture. Yeah. It can cause harmful loss of genetic diversity if a eugenics program selects against a pleiotropic gene, one that could possibly be associated with a positive trait. So actually, he's not even talking about diversity for diversity's sake, which is important. He uses the example of, what if there's a coercive government eugenics program that prohibits people with myopia from breeding? That might have the unintended consequence of also selecting against high intelligence, since the two tend to go together.

308
01:02:54,056 --> 01:02:55,024
Kayla: Wait, really?

309
01:02:55,152 --> 01:02:59,832
Chris: Well, this is a very tenuous example, but if you imagine that people with.

310
01:02:59,856 --> 01:03:02,032
Kayla: Glasses are smarter, and I do because I'm wearing glasses.

311
01:03:02,056 --> 01:03:15,990
Chris: Because you're wearing glasses. Then in this scenario, if trait a is undesirable but it's correlated with trait b, and you don't let people with trait a mate, then you're going to lose out on trait b. Oops.

312
01:03:16,290 --> 01:03:21,882
Kayla: But what about Chris? The fact that boys don't make passes.

313
01:03:21,946 --> 01:03:29,442
Chris: At girls who wear glasses, so they don't breed anyway. Okay, I guess that just works then.

314
01:03:29,586 --> 01:03:30,630
Kayla: That's not true.

315
01:03:32,290 --> 01:03:43,700
Chris: And then, of course, there's, like, I don't know, this is probably the biggest, or at least it's number one or two in terms of shitty things about eugenics: what do we mean by better?

316
01:03:44,120 --> 01:03:48,420
Kayla: That's a good question. Sorry, I'm gonna not yell that. Let me do that differently.

317
01:03:49,880 --> 01:03:52,080
Chris: I think you should. I'm gonna leave it.

318
01:03:52,160 --> 01:03:53,300
Kayla: That's a good question.

319
01:03:53,840 --> 01:04:03,260
Chris: And, like, if we could even miraculously define intelligence. Like, let's say there's a world where intelligence is a word with precision.

320
01:04:03,800 --> 01:04:06,264
Kayla: Not the most imprecise word known to man.

321
01:04:06,352 --> 01:04:08,180
Chris: Is more intelligent even better?

322
01:04:08,570 --> 01:04:09,310
Kayla: No.

323
01:04:09,690 --> 01:04:13,674
Chris: Would it be better if everyone was of the same level of intelligence?

324
01:04:13,762 --> 01:04:14,390
Kayla: No.

325
01:04:14,730 --> 01:04:18,066
Chris: Like, that's not obvious. That's not an obvious conclusion.

326
01:04:18,138 --> 01:04:23,990
Kayla: It's not a better world if we're all of Mike Judge's intelligence, because then all the movies would be Idiocracy.

327
01:04:26,250 --> 01:04:53,974
Chris: I'm going to jump back to Galton here because he has a little quote here that is not good. He was, again, the main eugenics guy, right? So in one of his papers, he writes, a considerable list of qualities can easily be compiled that nearly everyone except cranks would take into account when picking out the best specimens of his class. So, Kayla, only cranks would disagree with this.

328
01:04:54,022 --> 01:04:55,158
Kayla: Only cranks.

329
01:04:55,294 --> 01:05:01,726
Chris: It would include health, energy, ability, manliness, and courteous disposition.

330
01:05:01,838 --> 01:05:16,550
Kayla: Oh, the famous genes for manliness and courteous disposition. How in God's name are you, with a straight face, telling me that you're taking a nature approach to courteous disposition?

331
01:05:16,970 --> 01:05:42,618
Chris: This makes me go, like, okay, I see why Clarence Darrow was just shredding these guys sometimes. Oh, man. But then it makes me go, like, how? You know what? It's the Nobel disease. Because I was gonna say, how were they so good at statistics and also so dumb? But, like, there you go. You can't just say intelligence, because you might be good at statistics and you might also be a scientific racist.

332
01:05:42,674 --> 01:06:00,444
Kayla: Well, these people. I think that, like, when you are very skilled in one arena, we have this fallacy that, like, that means you're just skilled in all arenas, particularly, like, science arenas. When, like, we could do an entire season of this show, we could probably do multiple seasons of this show, of scientists who were brilliant in one field.

333
01:06:00,492 --> 01:06:04,716
Chris: That would just be a different podcast. Absolute numbskulls.

334
01:06:04,748 --> 01:06:19,668
Kayla: I'm trying to think of a word that's not eugenicists. Absolute diagnosed morons. No, absolute dumbasses when it came to the other things that they were doing. And this seems to be a really clear case.

335
01:06:19,844 --> 01:06:25,888
Chris: I agree. And it's weirdly ironic that it's true about the guy who invented it.

336
01:06:26,024 --> 01:06:27,740
Kayla: Did he go to Harvard? Sorry.

337
01:06:28,080 --> 01:06:31,352
Chris: No, don't put that in there. It might have been Oxford, because he was British.

338
01:06:31,456 --> 01:06:31,864
Kayla: Ew.

339
01:06:31,952 --> 01:07:09,750
Chris: Which, Oxford is every bit as bad. Here's another quote by a journalist named Edwin Black. He penned a book called War Against the Weak, which. I wanted to read it. I didn't have time, but apparently it really gets deep into the American eugenics movement. It's not good stuff that it talks about. I mean, it's good that he's talking about it, anyway. He argues that eugenics is often deemed a pseudoscience because what is defined as a genetic improvement of a desired trait is a cultural choice rather than a matter that can be determined through objective scientific inquiry.

340
01:07:10,570 --> 01:07:12,070
Kayla: That is a very good point.

341
01:07:12,570 --> 01:07:52,752
Chris: And he continues: indeed, the most disputed aspect of eugenics has been the definition of improvement of the human gene pool, such as what is a beneficial characteristic and what is a defect. Historically, this aspect of eugenics is often considered to be tainted with scientific racism and pseudoscience, for obvious reasons. So, again, what is better? What do we mean by better? By what standard? Now, I think on, like, a micro level, maybe you could do it. I think it's just that sometimes the extrapolation out to a macro level doesn't work, or the extrapolation from a micro level to, like, other aspects of our lives. Like, well, you know, if we can prevent Tay-Sachs disease, why can't we make people smarter?

342
01:07:52,856 --> 01:07:53,864
Kayla: Right. Right.

343
01:07:53,992 --> 01:08:39,957
Chris: I use Tay-Sachs as an example because that is something that people are currently, like, making decisions about. Like, whether to have children if they have genes for Tay-Sachs disease. We have already talked about genetic testing in children, which is, like. It's literally a eugenics thing. Right? We're making decisions about breeding based on this, like, data about how they're gonna turn out. I would say it's liberal eugenics because it's individual choice. It's not the state coming down and being like, you can't have that baby, it's gonna have Tay-Sachs. It's individual, but it's definitely eugenics. And, you know, dumbasses as these guys are, I can also see them saying, like, if we can breed Tay-Sachs out, that's good, right? Well, why can't we make smarter people? Isn't that also good?

344
01:08:40,134 --> 01:08:41,438
Kayla: And that's where I think they're wrong.

345
01:08:41,493 --> 01:08:43,526
Chris: I think it's a huge. It's a huge leap.

346
01:08:43,598 --> 01:08:44,064
Kayla: Right? I.

347
01:08:44,142 --> 01:08:46,323
Chris: But that's the leap that they're making.

348
01:08:46,372 --> 01:09:03,292
Kayla: It's as close to saying, I think that we can breed fruit flies, so we can breed humans. It's just such a wild jump to a conclusion that it's really hard to take seriously when we're talking about it from this 20/20 hindsight perspective.

349
01:09:03,435 --> 01:09:37,250
Chris: Yeah. And then the last thing, and I think this is probably number one in terms of why it's shitty, is, we mentioned this already in the show, the path from select for desirable traits to suppress people different than me from breeding is very short. So much of what we've been talking about is just people being afraid of great replacement theory. Like, when I was researching all of this, you read statements of the form, I'm scared that the bad people will outbreed the good people. You read that so much.

350
01:09:37,370 --> 01:09:39,305
Kayla: That's what the premise of idiocracy is.

351
01:09:39,337 --> 01:10:34,792
Chris: And that's why we started with Idiocracy. And that's why Idiocracy is so bad: because this is, like, foundational to eugenics. It's not, it's really not, just, like they say, let's improve the species. The amount of times I read, I'm worried about the bad people outbreeding the good people. Like, I would be a millionaire if I had a nickel for each time I read that. So it's just one of the core pillars of eugenics. And that means, right off the bat, like, I don't fucking buy it. Right? Like, this is just a scientific way to justify, you know, we should only breed good people. What are good people? Well, you know, wealthy, white, don't have syphilis, and don't misbehave when they're, you know, women. When women misbehave, then we should sterilize them. Of course it's that.

352
01:10:34,856 --> 01:10:35,104
Kayla: Right?

353
01:10:35,152 --> 01:10:40,882
Chris: Just, like, when I say by what standard, we're like, you can't standardize this. They kind of do have an answer for that.

354
01:10:40,946 --> 01:10:41,482
Kayla: Right?

355
01:10:41,626 --> 01:10:44,482
Chris: They're like, well, rich white people, obviously.

356
01:10:44,546 --> 01:11:11,334
Kayla: And this is why, when we talked about it, we kind of came to the conclusion that there's a deep naivete in the transhumanist utopian idea of, like, well, yeah, it's okay if we give people the choice to do what they want, right? Cool. But let's look at history. Like I said about Will MacAskill in the last episode, it's like, cool, I love your idealism. Let's look at history. Let's plan for that and not go, it won't be a problem.

357
01:11:11,462 --> 01:11:44,350
Chris: Right. Unless you're thinking this is sort of, like, an implied thing with these guys. It's not. They were like, no, some races are dumber than others, right out. This is what I mean by racism used to kind of just be totally out in the open and fine. It wasn't relegated to a culture war. It was like, yeah, well, these races are dumb and these ones are good. And then we, like, look at Nazi Germany and we're like, man, you guys shouldn't have said that over there, right? We didn't, did we? No, we did. We were saying that. We didn't justify.

358
01:11:44,430 --> 01:11:48,806
Kayla: Like, slavery and genocide with those exact ideas.

359
01:11:48,918 --> 01:11:56,022
Chris: Right? So I'll just say, just to wrap up on that point, like, it seems like people have evolved.

360
01:11:56,166 --> 01:11:57,170
Kayla: Oh.

361
01:11:57,590 --> 01:12:08,472
Chris: To be deeply afraid of things like being outbred by competing organisms. Right? Like, it just seems like we have this deep-seated fear, because it's still around today. Like, people are still terrified of that today.

362
01:12:08,536 --> 01:12:08,808
Kayla: Right?

363
01:12:08,864 --> 01:12:15,888
Chris: People talk about that all the time. People talk about that when I was growing up. People are talking about it now. What if they come in and these bad people out breed everyone?

364
01:12:16,024 --> 01:12:17,064
Kayla: Right, right.

365
01:12:17,232 --> 01:12:30,826
Chris: And it's everywhere. It's in America. It's in France. They talk about that in France, yeah, with Muslim immigrants. But, Kayla, what does this all have to do with transhumanism?

366
01:12:31,018 --> 01:12:32,498
Kayla: I don't know.

367
01:12:32,594 --> 01:13:06,350
Chris: And the rest of the escriole. Next time on Cult or Just Weird, we tie that knot together. But if you want the short version, consider this. It's the slogan from the 1921 International Eugenics Congress: eugenics is the self-direction of human evolution. Now, if that's not a transhumanist statement, I don't know what is. This is Chris, this is Kayla, and this has been a really hard episode of Cult or Just Weird.