Join the conversation on Discord!
May 21, 2024

S6E8 - The Future of Humanity: Plus


Wanna chat about the episode? Or just hang out?

Come join us on discord!

 

---

What would you say you... do... here?

 

Chris dunks on a doofy website for 40 minutes straight; it is also possible we learn some things about Transhumanism along the way.

 

---

*Search Categories*

Anthropological; Science / Pseudoscience; Common interest / Fandom; New Religious Movement

 

---

*Topic Spoiler*

Humanity+

 

---

*Further Reading*

 

https://www.britannica.com/topic/transhumanism

https://en.wikipedia.org/wiki/Humanity%2B

H+'s website: https://www.humanityplus.org/

H+'s weird, tiny, self-published "news articles" (also an example of their liberal/left leanings): https://www.humanityplus.org/news/transhumanists-condemn-police-brutality

other sites in the H+ "webring":

https://willeadership.io/

https://healthspanaction.org/healthspan-action-coalition/

H+-hosted Zoom roundtable on Transhumanism: https://www.youtube.com/watch?v=sdjMoykqxys

H+'s weird mission statement: https://www.humanityplus.org/about

Max More's blog post on the definition of Transhumanism: https://www.humanityplus.org/philsophy-of-transhumanism

Two articles from the early 2000s on the left/right schism in Transhumanism:

https://web.archive.org/web/20060313212747/http://www.twliterary.com/jhughes_utne.html

https://web.archive.org/web/20061231222833/http://www.slate.com/id/2142987/fr/rss/

Notable Transhumanists:

https://en.wikipedia.org/wiki/James_Hughes_(sociologist)

https://en.wikipedia.org/wiki/Max_More

Extropianism? What's that?

https://en.wikipedia.org/wiki/Extropianism

https://www.oed.com/dictionary/extropianism_n?tl=true

 

---

*Patreon Credits*

Michaela Evans, Heather Aunspach, Alyssa Ottum, David Whiteside, Jade A, amy sarah marshall, Martina Dobson, Eillie Anzilotti, Lewis Brown, Kelly Smith Upton, Wild Hunt Alex, Niklas Brock


Jenny Lamb, Matthew Walden, Rebecca Kirsch, Pam Westergard, Ryan Quinn, Paul Sweeney, Erin Bratu, Liz T, Lianne Cole, Samantha Bayliff, Katie Larimer, Fio H, Jessica Senk, Proper Gander, Nancy Carlson, Carly Westergard-Dobson, banana, Megan Blackburn, Instantly Joy, Athena of CaveSystem, John Grelish, Rose Kerchinske, Annika Ramen, Alicia Smith, Kevin, Velm, Dan Malmud, tiny, Dom, Tribe Label - Panda - Austin, Noelle Hoover, Tesa Hamilton, Nicole Carter, Paige, Brian Lancaster, tiny

 

Transcript
1
00:00:00,200 --> 00:00:05,438
Chris: Max More also co-founded another organization called the Extropy Institute.

2
00:00:05,494 --> 00:00:08,029
Kayla: This is what I'm saying. He's got other things going on.

3
00:00:08,109 --> 00:00:09,366
Chris: He's got a lot going on.

4
00:00:09,438 --> 00:00:11,850
Kayla: Working on the mission statement all the time.

5
00:00:12,710 --> 00:00:18,942
Chris: And for that, I could find even less information than I could find about Humanity Plus. And also, it doesn't exist anymore.

6
00:00:19,006 --> 00:00:19,438
Kayla: Oh.

7
00:00:19,534 --> 00:00:27,502
Chris: It closed its doors in 2006 with the board of directors stating that its mission was, quote, essentially completed, end quote.

8
00:00:27,606 --> 00:00:30,758
Kayla: What did they do? How did they win? What happened?

9
00:00:30,814 --> 00:00:32,458
Chris: So where's my live forever pill, man?

10
00:00:32,514 --> 00:00:33,870
Kayla: Yeah, gimme.

11
00:00:49,050 --> 00:00:50,418
Chris: Raw audio.

12
00:00:50,554 --> 00:00:51,274
Kayla: Rawdio.

13
00:00:51,402 --> 00:00:58,082
Chris: WWE Raw. Raw fish. Rawhide. Are you ready to go?

14
00:00:58,146 --> 00:01:00,460
Kayla: Are you recording? Is this for the show?

15
00:01:00,540 --> 00:01:01,140
Chris: Turns out.

16
00:01:01,220 --> 00:01:04,316
Kayla: Oh, I don't know. You're saying random words for the show.

17
00:01:04,388 --> 00:01:08,604
Chris: Anything is for this. Just getting warmed up. But anything and everything could be for the show.

18
00:01:08,732 --> 00:01:10,300
Kayla: All right. Hi.

19
00:01:10,420 --> 00:01:17,932
Chris: So, you know, hi, welcome to Cult or Just Weird. I'm Chris. I'm a game designer and a data scientist.

20
00:01:18,076 --> 00:01:21,600
Kayla: I'm Kayla. I'm still finding myself. I don't know who I am.

21
00:01:21,900 --> 00:01:27,308
Chris: Good news for me. My goal of living forever is going pretty well so far.

22
00:01:27,484 --> 00:01:28,628
Kayla: I mean, you're still here.

23
00:01:28,724 --> 00:01:32,492
Chris: I'm still here. I haven't even taken any supplements yet or gotten an erling chip or anything.

24
00:01:32,556 --> 00:01:34,360
Kayla: You take supplements? Don't lie.

25
00:01:34,700 --> 00:01:38,868
Chris: Okay. But not, like, longevity. Well, I don't know. What does vitamin D count as?

26
00:01:39,044 --> 00:01:39,780
Kayla: I don't know.

27
00:01:39,860 --> 00:01:41,028
Chris: I eat that weird dirt.

28
00:01:41,124 --> 00:01:43,300
Kayla: You eat the weird dirt. We take zinc.

29
00:01:43,420 --> 00:01:47,412
Chris: The dirt is mostly for the calcium. Anyway, yeah.

30
00:01:47,476 --> 00:01:49,040
Kayla: One day, we'll talk about your dirt.

31
00:01:49,780 --> 00:01:59,690
Chris: Holy shit. Any business stuff for you? Cause otherwise, I'm just gonna go right into the call to action. Go talk to us on discord.

32
00:02:00,110 --> 00:02:01,038
Kayla: I don't have any business.

33
00:02:01,094 --> 00:02:01,430
Chris: No business.

34
00:02:01,470 --> 00:02:03,070
Kayla: I got no business doing this.

35
00:02:03,110 --> 00:02:04,638
Chris: That was my call to action. Then.

36
00:02:04,734 --> 00:02:07,278
Kayla: Yeah, go on our discord. Come hang out with us. Come talk to us.

37
00:02:07,334 --> 00:02:10,574
Chris: And it's full of weirdos. You'll like it.

38
00:02:10,662 --> 00:02:18,110
Kayla: If you think we're doing a really good job here, which obviously we are, you can support us on Patreon at patreon.com/cultorjustweird.

39
00:02:18,230 --> 00:02:32,492
Chris: And the link to our discord is in the show notes and also on our website and everywhere else that we are. All right, so what were we talking about last week? It wasn't eugenics again, was it? Fuck. It was.

40
00:02:32,596 --> 00:02:35,148
Kayla: Yeah. Isn't it always? I feel like.

41
00:02:35,204 --> 00:02:35,860
Chris: Damn it.

42
00:02:35,940 --> 00:02:48,148
Kayla: Okay, there's a show called. There's a podcast called If Books Could Kill, part of the Michael Hobbes universe. And they, I think we talked about on the show before they.

43
00:02:48,204 --> 00:02:50,532
Chris: We have. We talked about it a few episodes ago because.

44
00:02:50,596 --> 00:02:52,160
Kayla: Great. I'm glad I'm bringing it up again.

45
00:02:52,950 --> 00:02:57,830
Chris: I know we were like, oh, well, we even. I think you even said that. You were like, we're just pimping Michael Hobbes. Oh, my God.

46
00:02:57,870 --> 00:03:10,070
Kayla: Okay, let me get through this really fast. Their thing is that every airport book they read is like, it all ends up being the same book. At the end of the day, it's really just one book. It's all grifting and shilling the same thing. It's all.

47
00:03:10,110 --> 00:03:13,286
Chris: Did you know the thing you thought isn't true? That's it.

48
00:03:13,438 --> 00:03:26,350
Kayla: I think that we're kind of finding this with a lot of the cults and weirds we talk about on the show is that, man, do a lot of them brush up against or go back to or have foundations in eugenics and it's not good.

49
00:03:27,010 --> 00:04:09,988
Chris: Yeah. If you haven't listened to our previous episode, we've begun a series of them. A series of episodes on transhumanism, an ideology loosely organized around the notion that humankind can and should enhance ourselves and can and should transcend our biology. Hence the name transhumanism. They are into things like advancing prosthetic limb technology and technology in general. They want to make cybernetic implants reality, interfacing the human brain with computers and technology. For example, they're into nootropics, which is the idea that you can medically enhance your intelligence. So think like genius pills, which you.

50
00:04:10,004 --> 00:04:11,760
Kayla: Can buy at the gas station.

51
00:04:12,130 --> 00:04:15,482
Chris: Wait, what? Not real ones, though.

52
00:04:15,546 --> 00:04:26,402
Kayla: Well, what's a real one, first of all? Second of all, like, you can totally go to the gas station and buy, like, pills that say, like, oh, this is gonna make your brain good next to. This is gonna make your dick good next to.

53
00:04:26,426 --> 00:04:27,378
Chris: Next to five hour energy.

54
00:04:27,434 --> 00:04:28,090
Kayla: Yeah, exactly.

55
00:04:28,170 --> 00:04:28,450
Chris: Yeah.

56
00:04:28,490 --> 00:04:33,146
Kayla: Is Five Hour Energy, like, a cult, an on-ramp to transhumanism?

57
00:04:33,178 --> 00:04:34,306
Chris: I kind of think so.

58
00:04:34,378 --> 00:04:38,586
Kayla: I don't disagree with you. I want a five hour energy right now.

59
00:04:38,698 --> 00:05:22,852
Chris: Yeah. Don't think of nootropics as something real, even though there are a lot of transhumanists that do. But, yeah, easily, though, the thing that they are most into, the thing that they want to transcend the most, is human mortality, which. I get it. I am scared to die too, you guys. This makes a lot of them cryonicists since it theoretically would extend one's life or at least allow someone to pause their metabolism until real life extension becomes a reality again, all hypothetical and speculative. And for what it's worth, cryonics also checks another few of the transhumanist boxes. If you get to wake up in a far flung future with all kinds of awesome biology transcending technology, that's a transhumanist's dream.

60
00:05:22,996 --> 00:05:25,876
Kayla: That's my dream, too. But I don't think I'm a transhumanist.

61
00:05:25,908 --> 00:05:30,012
Chris: Are you kayla? Maybe. I don't know. We will determine. We'll get to that.

62
00:05:30,036 --> 00:05:41,700
Kayla: Actually, I don't know if it is my dream. I'm kind of tired. Not to sound like a doomer, but I kind of want the sweet release of death. Sometimes I'm so scared to die. And also, my God, am I tired.

63
00:05:42,320 --> 00:06:03,552
Chris: So there's a lot of overlap, is what I'm saying, between cryonicists and transhumanists. Sometimes that overlap is pretty blatant. We mentioned this on the previous episode, but Max More, considered the father of the modern transhumanist movement, was president and CEO of Alcor, the nonprofit we visited and talked about in our cryonics episodes. He was the CEO there between 2010 and 2020.

64
00:06:03,656 --> 00:06:05,120
Kayla: That's a long time also.

65
00:06:05,200 --> 00:06:15,130
Chris: Yeah. Yeah. I was surprised. Ten years. I was like, I thought it was shorter than that. He is also an advisor for a group that I want to talk about today, humanity plus.

66
00:06:16,070 --> 00:06:32,942
Kayla: So, wait, should I say what? I don't know what. I only know a little bit of what you're going to talk about in the rest of this episode, and I'm a little bit scared. Only because I've been a member of the humanity plus mailing list for probably, I can calculate it because I remember.

67
00:06:32,966 --> 00:06:35,498
Chris: When I said, you can't pretend like you're going to be ignorant of what I'm going to say.

68
00:06:35,574 --> 00:06:47,194
Kayla: I don't open the emails, but I totally signed. I totally signed up to be on the mailing list a decade plus ago.

69
00:06:47,322 --> 00:06:52,354
Chris: I've signed up for, like, so many things where I'm like, oh, yeah, I want to get emails from this. And it's never, ever open.

70
00:06:52,402 --> 00:06:58,362
Kayla: I definitely looked at a few of them, but, you know, every time I kind of went to click that unsubscribe, I didn't. I didn't.

71
00:06:58,386 --> 00:06:59,106
Chris: Okay.

72
00:06:59,298 --> 00:07:00,210
Kayla: I stayed.

73
00:07:00,330 --> 00:07:01,066
Chris: How are we?

74
00:07:01,178 --> 00:07:02,306
Kayla: This is an okay organization.

75
00:07:02,378 --> 00:07:09,892
Chris: Do they have, like, HTML markup in them? Like, are there cool pictures and stuff? Or is it just like, to whom it may concern, in, like, Courier?

76
00:07:09,916 --> 00:07:13,560
Kayla: It's like a cool newsletter. It's like, yeah, in news.

77
00:07:13,900 --> 00:08:00,630
Chris: Well, transhumanism as an ideology and a movement has a pretty huge scope. So take what I'm about to say with a grain of salt, but my impression of Humanity Plus is that they are sort of the organization when it comes to transhumanism. They advocate for the kind of scientific and technological advances that transhumanists like, specifically the ones about augmenting the human mind, body, and lifespan. They advocate for the ideology in general. They hold conferences and meetups. They publish articles and blog posts, and they even have their own magazine called h+, which started as a print magazine and now is strictly digital. And Kayla mentioned, that's what I get. Oh, you got h+. I thought h+ was different than the email newsletter.

78
00:08:00,710 --> 00:08:04,830
Kayla: I don't know. Maybe I get both. I don't know. I think I get both. I signed up for something a long time ago.

79
00:08:04,910 --> 00:08:24,210
Chris: Kayla gets something in her email inbox. We will figure out what that is and get back to you. But I know that they do also have an online, formerly offline, now online zine called h+. Now, if you're getting the impression from that list that all they're really into is kind of, like, sitting around and talking about this stuff.

80
00:08:24,330 --> 00:08:26,770
Kayla: What would you say you do here?

81
00:08:26,890 --> 00:08:29,602
Chris: Actually, that's kind of the impression I got as well.

82
00:08:29,706 --> 00:08:30,626
Kayla: Interesting.

83
00:08:30,778 --> 00:08:44,357
Chris: Truth be told, there isn't too much information out there about Humanity Plus. And a good chunk of what I was able to find was just, like, air quote, news articles from Humanity Plus itself or associated websites.

84
00:08:44,493 --> 00:09:11,168
Kayla: So do they. So they're not like a group of people who are, like, independently. It's probably people who are doing work to bring about a transhumanist ideal future. But as an organization, they're not like a science. They're not, like, building ideas guy, Kayla. They're the ideas guys. But that's interesting to know that. And again, I'm assuming. But I'm assuming the members are individually, like, out there working in the world, and then this is where they come together to talk about the stuff that.

85
00:09:11,184 --> 00:09:50,374
Chris: They do or that was my impression. I think they have, like, a board of directors and things like that. But I don't know if they're like. I don't know if it's like the WGA, where it's like, it's an organization of writers, but there's also people that work for it, like, employed that I don't know. So I don't know if there's people that, like, specifically work for humanity plus or if it's just like, this is our club, they will take your money. There is. And I don't think. I don't sense a grift from them. So I'll just get. I'll get that out up front. But there is, like, you can do a membership. Like, you can say, like, I'm a member of Humanity plus, and they have a discord just like we do. So did you go on it, everybody? Yeah, I did. Yeah.

86
00:09:50,422 --> 00:09:51,570
Kayla: What's going on in there?

87
00:09:51,870 --> 00:09:56,678
Chris: Humanity plus. That is outside of the scope of this discussion.

88
00:09:56,734 --> 00:09:57,174
Kayla: Sorry.

89
00:09:57,262 --> 00:10:20,756
Chris: I. I think humanity plus. I'll keep you informed. If. If anything pops up on there, if the singularity happens, I'll let you guys know. Okay. But, yeah, I don't think that there's anybody there that's, like. I don't think there's, like, a humanity plus lab where people are doing research. I think it's just humanity plus. Like, the coffee table, right? I think it's a coffee table, not a lab room.

90
00:10:20,948 --> 00:10:22,960
Kayla: Do they do, like, grants or anything?

91
00:10:23,260 --> 00:10:33,788
Chris: I couldn't find that. So I have a thing here in a second. Let me say. We'll get to that, and then I'll get to that in just a minute. First, though, Kayla, do you happen to remember web rings?

92
00:10:33,964 --> 00:10:34,532
Kayla: Oh, yeah.

93
00:10:34,596 --> 00:10:35,960
Chris: From Internet 1.0.

94
00:10:36,460 --> 00:10:38,428
Kayla: I'm too young to.

95
00:10:38,564 --> 00:10:39,480
Chris: Oh, please.

96
00:10:39,820 --> 00:10:42,800
Kayla: Have actually been on any web rings, but I do know of them.

97
00:10:43,940 --> 00:10:47,204
Chris: Get out of here. If you know what a web ring is, then that means you're ancient.

98
00:10:47,252 --> 00:10:49,040
Kayla: Please, please let me have this.

99
00:10:49,360 --> 00:10:55,648
Chris: Can you describe what they are so that, like, that more than half of our audience that probably wasn't even alive?

100
00:10:55,784 --> 00:11:07,256
Kayla: Absolutely not. From what I remember and gather, it's like, essentially it was a usenet and forums and emails.

101
00:11:07,328 --> 00:11:08,904
Chris: Oh, that's totally not the impression I got.

102
00:11:08,952 --> 00:11:09,952
Kayla: What is it, then? I don't know.

103
00:11:09,976 --> 00:11:12,700
Chris: I just thought it was, like, a bunch of websites that had, like.

104
00:11:13,520 --> 00:11:15,392
Kayla: Okay, so you don't know what a web ring is either?

105
00:11:15,536 --> 00:11:41,680
Chris: No, I do. I'm describing it right now as you verify live on the show. But a web ring. My impression of a web ring was just like, there's a bunch of websites, and they're all linked to. They all have hyperlinks to each other. And then you put a little widget at the bottom of your little GeoCities webpage that says, click to go to the next page in the web ring, and it would just take you to the next page in the web ring. It was just, like, an association of websites that all wanted to be linking to each other and associated with each other.

106
00:11:41,720 --> 00:11:42,766
Kayla: Yeah, that's exactly what it was.

107
00:11:42,848 --> 00:11:47,650
Chris: Boom. I got it right maybe you actually were too young to know what they are, so I apologize for that.

108
00:11:47,690 --> 00:11:50,042
Kayla: I was too busy playing with my.

109
00:11:50,066 --> 00:12:14,452
Chris: Little Pony in your youth. Anyway, that's the impression that I got when doing the research for this episode, is that there's kind of like a Humanity Plus slash transhumanism web ring. There's, like, all these sites that are just kind of supporting each other's SEO, but the substance of what they're doing is, like, hard to pin down, aside from being sort of a philosophical discussion salon.

110
00:12:14,636 --> 00:12:40,544
Kayla: Well, do you think, do you get the sense that, I mean, I know what I think, but I don't know. I'm not as informed as you. Do you get the sense that this is a boon to the people who are involved. This is something that these people really want to talk about. Define involved to someone who is a humanity plus member. This is something clearly that these people want to talk about. And so they've built a place where they're able to find other people to talk about it with.

111
00:12:40,692 --> 00:12:51,744
Chris: Yeah, I think. I think that there is maybe, like, a misplaced expectation on the part of people doing research into stuff like this.

112
00:12:51,792 --> 00:12:52,408
Kayla: Like you and me.

113
00:12:52,464 --> 00:13:15,164
Chris: Like you and me in that. Like, oh, they're talking about technology. Like, so they must be doing science too, right? And certainly, like, that was, I mean, that's what Alcor said they were doing. So it, you know, I don't feel guilty about having that expectation going into this, but it was definitely checked as I was like, oh, I don't think that they're actually doing any of that. I think they're just talking about how much they like the people that do that.

114
00:13:15,292 --> 00:13:16,124
Kayla: Just a fan club.

115
00:13:16,212 --> 00:14:00,832
Chris: It's a fan club. It's like a fan club for the singularity is basically what it is, my kind of people. And you can definitely hyperlink follow your way to some actual medical organizations that are sort of about life extension and that kind of thing. So you can follow the quote unquote web ring into some people doing actual stuff other than just talking. Let me just tell you about a couple of the things on the Humanity Plus website itself. So the first thing you might want to check out, if you're asking that Office Space question that you brought up before, okay, what do you actually do here, is their projects page. So when you were saying, like, oh, you know, what do they fund? What do they, you know, do they fund any research projects?

116
00:14:00,936 --> 00:14:10,378
Chris: That's what I figured I would find when I went to their projects page. Yeah. So you go to their projects page here, and there's just this wide diversity of things that they do.

117
00:14:10,474 --> 00:14:16,634
Kayla: I get the sense from the way that you're talking that you're not saying what you actually mean.

118
00:14:16,682 --> 00:14:20,674
Chris: No, kayla, please. Okay, so they. They have conferences.

119
00:14:20,842 --> 00:14:21,546
Kayla: Okay.

120
00:14:21,658 --> 00:14:22,970
Chris: They have summits.

121
00:14:23,130 --> 00:14:24,642
Kayla: Okay. What's different?

122
00:14:24,786 --> 00:14:31,210
Chris: They have. They have a symposium. They have. Oh, here's another summit. Oh, there's a party.

123
00:14:31,370 --> 00:14:32,314
Kayla: Oh, that sounds fun.

124
00:14:32,402 --> 00:14:37,790
Chris: Party sounds fun. Here's another summit. Summit.

125
00:14:38,410 --> 00:15:01,162
Kayla: Look, man, this just sounds like people who speaker group really like to hang out and figured out a way to be very organized about the way they hang out, which is like, I'm a person who, if you don't tell me all of the rules, I get really upset and uncomfortable, and I won't show up at your thing. So the fact that these are hangouts with clear rules I'm in. It's just people hanging out in an organized fashion.

126
00:15:01,266 --> 00:15:21,082
Chris: Part of me wants to dunk and be like, there's just guys sitting around talking, but also, like, I don't know. That's valid. If you want to sit around and talk about your fun philosophy with other people that are doing fun stuff with you and your fun transhumanism. Let's all be robots together. More power to you. Fucking, like, French.

127
00:15:21,146 --> 00:15:24,932
Kayla: Philosophers got to do that back in the enlightenment or whatever.

128
00:15:25,106 --> 00:15:39,020
Chris: I think it was just the fact that it said projects page. I was, like, expecting, like, oh, okay. So they have, like, a project to, like, fund, you know, the creation of, like, a more advanced, like, arm prosthetic, I bet. And then I went there, and it's like, symposium, conference, symposium, summit.

129
00:15:39,600 --> 00:15:45,400
Kayla: Well, you know what they say, Chris, that does yappen. Those who can't podcast form a symposium.

130
00:15:45,520 --> 00:16:03,680
Chris: Right, right. They also have a page that says pitch. So I'm like, oh, maybe they're pitching something. If you go there, it's just a transhumanism roundtable that's occurring in Rome later this year. So it's like, oh, go to this entirely separate page of pitch and their pitch. I guess the pitch is that they want to have this roundtable.

131
00:16:03,720 --> 00:16:16,616
Kayla: I mean, yeah. Is the idea. Is there an idea here of, like, by having so many speaking engagements in conferences and get togethers that this is, like, collecting new followers? Like, this is bringing people into the movement?

132
00:16:16,768 --> 00:16:18,680
Chris: Yeah, I think there's opportunity to, like.

133
00:16:18,720 --> 00:16:20,184
Kayla: Learn about this firsthand.

134
00:16:20,312 --> 00:16:30,068
Chris: Yeah, I think it's. They think of it, and this is just me speculating based on my research. I think there's some advocacy and some evangelism in it.

135
00:16:30,164 --> 00:16:30,588
Kayla: Okay.

136
00:16:30,644 --> 00:16:57,128
Chris: I think there's some, like, yeah, get people involved. Like you said, get people interested in the same thing that I'm interested in, because it's important. And then there's also the advocacy of, like, hey, here's what transhumanism is like. There's a whole thing where they try to, like, clear up, quote, unquote, misconceptions. So I think that's a lot of, like, if you. If you wanted to really answer the. What do you say you do here? Aside from the, like, sitting around talking? I would say it's those two things.

137
00:16:57,224 --> 00:17:02,080
Kayla: And to be fair, like, organizing conferences and symposiums.

138
00:17:02,120 --> 00:17:03,064
Chris: It's true. It's a lot of work.

139
00:17:03,112 --> 00:17:04,336
Kayla: So much work.

140
00:17:04,528 --> 00:17:17,112
Chris: I mean, you could just, like, get a room in a hotel and, like, get your eight friends and some chairs. I'm like, see, I'm. The thing is, I'm not sure if it's like that or if it's, like, Comic Con. Like, I've never been to one of these.

141
00:17:17,136 --> 00:17:17,712
Kayla: Probably both.

142
00:17:17,816 --> 00:17:37,265
Chris: Probably could be one or the other. Aside from the projects page, they also have a news section, which I assumed was, like, news articles about transhumanism or about humanity plus, and I was close. It's articles about transhumanism written by humanity plus. Okay, so.

143
00:17:37,337 --> 00:17:38,233
Kayla: So they're doing something.

144
00:17:38,321 --> 00:17:56,410
Chris: They're doing something. But it's not, like, published anywhere. It's just on that. It's just on the site. Like, it's just. It's not like, here's an article we wrote in, you know, Salon or something. It's just like, here's a thing that I wrote, and they're calling it news. It's really weird. That's like, four to five sentences.

145
00:17:56,570 --> 00:17:59,390
Kayla: Why do you think that is?

146
00:17:59,850 --> 00:18:06,098
Chris: I don't know. I think it goes back to the SEO web ring circle jerk thing.

147
00:18:06,274 --> 00:18:11,570
Kayla: I'm not sure, but I don't know. I can send it to the Atlantic. I have no idea.

148
00:18:11,610 --> 00:18:15,722
Chris: I don't know. But it's four to five sentences. It's like, we think this. It's like nothing.

149
00:18:15,826 --> 00:18:16,434
Kayla: Oh, gotcha.

150
00:18:16,482 --> 00:18:19,622
Chris: That's the other thing. It's not like a big, long thing. It's like, nothing.

151
00:18:19,706 --> 00:18:20,094
Kayla: Okay.

152
00:18:20,142 --> 00:18:39,414
Chris: And sometimes it's just like, a YouTube link of, like, Max More or somebody else talking about something. Okay, so then I'm like, okay, I think I need to go to check out their mission page to figure out what they're about. Let's. Let's see what's on their mission page. And, oh, wow, this is actually kind of like the most bizarre part of their website.

153
00:18:39,502 --> 00:18:40,766
Kayla: The mission page?

154
00:18:40,918 --> 00:18:46,982
Chris: Yeah. So, first of all, their mission is, like, seven gigantic paragraphs of text organized into sections.

155
00:18:47,086 --> 00:18:47,614
Kayla: Okay.

156
00:18:47,702 --> 00:19:10,142
Chris: Which is pretty weird that their mission statement isn't just, like a sentence or two that cuts to the core of what they're about. Like, that's kind of what a mission statement's supposed to be. And it's like this huge wall of text, like, multiple webpage lengths long of text, and throughout the page, the sentence construction is just, like, of this really uncanny nature where it reads like either an AI wrote it.

157
00:19:10,206 --> 00:19:10,958
Kayla: AI written.

158
00:19:11,054 --> 00:20:06,190
Chris: Or like, a high schooler wrote it. It's just weird. Listen to these two sentences from their mission page and try to parse what the hell is being said. Other technologies that could extend and expand human capabilities outside physiology include AI, robotics, and brain computer integration, which form the domain of bionics memory transfer. This is all the same sentence, memory transfer, and could be used for developing whole body prosthetics, because these technologies and their respective sciences and strategic models such as blockchain, would take the human beyond the historical normal state of existence society. This is a second sentence, but it started back with blockchain. Including bioethicists and others who advocate the safe use of technology have shown concern and uncertainties about the downside of these technologies and possible problematic and dangerous outcomes for our species. So that's pretty clear, right?

159
00:20:06,610 --> 00:20:10,972
Kayla: That definitely sounds like AI. That sounds like a chat genius.

160
00:20:10,996 --> 00:20:12,756
Chris: And, like, why the hell is blockchain in there?

161
00:20:12,788 --> 00:20:15,916
Kayla: Well, you gotta have blockchain. Like, what do you mean? Why is blockchain.

162
00:20:15,948 --> 00:20:22,276
Chris: You gotta blockchain. I know. It's like somebody was like, oh, we have to put the word blockchain in there. It's a buzzword that'll get us money from Marc Andreessen.

163
00:20:22,468 --> 00:20:29,700
Kayla: But anyway, I wanna say, like, because we may have listeners who came to us through the cryonics episodes.

164
00:20:29,780 --> 00:20:30,500
Chris: Oh, yeah.

165
00:20:30,580 --> 00:20:42,432
Kayla: If we're getting. If we're. If we have the wrong impression of this, like, please let us know. Like, if we're missing something, like, really, please let us know. But this first kind of impression is very different than what I was expecting.

166
00:20:42,536 --> 00:21:34,460
Chris: We should. Cryogenator, the guy that we interviewed for the cryonics episode, he hangs out in our discord sometimes now, which is super awesome. I'm going to ping him and see if he got the same impression that I did from this organization. It's tough to say, though, because it's also got Max More on its board of advisors, and he's like, rah, cryonics. So I don't know. And finally, on the mission page, I won't. I won't bore you with reading these long paragraphs again, but this is so bizarre. Two of the paragraphs on this page, on the mission page, are, like, copies of each other, but, like, bad copies. They're like carbon copies. Like, the text is all the same, except it's only, like, 95% the same. So there's, like, this whole paragraph of content, and then there's some other stuff. Scroll down, scroll down.

167
00:21:35,180 --> 00:21:43,204
Chris: And then there's the same paragraph, except it's not the same. It's got, like, minor changes. It's so weird.

168
00:21:43,292 --> 00:21:45,020
Kayla: Okay, do you want my theory? I have a theory.

169
00:21:45,100 --> 00:21:47,948
Chris: I definitely want your theory, because I really don't understand.

170
00:21:48,084 --> 00:22:34,660
Kayla: So it's a lot of work to run an organization, whether or not the organization is doing hard science or having symposiums. A lot of the people that are involved with Humanity Plus, you've talked about Max More. I won't name other names, in case you're talking about them, but the people that are involved usually tend to have a lot of other stuff going on. They have other organizations that they run, or they have a primary day job that is in the making of science or whatever. My guess is that there's just not a lot of time to go around to manage this kind of stuff. And so the front facing stuff is not as well managed as perhaps. I'm assuming that if you go to a symposium, like, I've seen Max More talk, I've watched videos of Max More.

171
00:22:35,000 --> 00:22:41,080
Kayla: I've seen some of these other people talk on YouTube or whatever. And it's not like this uncomfortable, stilted, repetitive.

172
00:22:41,120 --> 00:22:54,168
Chris: Not at all, actually. So I read he has a few blog entries on the Humanity plus website. I'm going to talk about one of them in a minute. But in this blog entry, talking about transhumanism, he's extremely articulate. That's part of what confused me.

173
00:22:54,224 --> 00:23:36,594
Kayla: These are very, very well spoken people. My theory is just that they all have a lot going on, that something like the mission statement on the Humanity Plus website has kind of fallen to. Like, it's not the top priority thing, particularly because a lot of the people that probably do get involved with Humanity Plus don't get involved because they found a random website and clicked through to the about section and went, wow, I agree with this. Let me go to a symposium. Like, they're probably coming at it from. They've been to a different symposium, or they saw it on the discord, or they went to RAADfest, or they went to this other conference. Like, they're coming at it from an already knowledgeable standpoint. So this is not the priority.

174
00:23:36,642 --> 00:23:45,946
Chris: Not super necessary. Right. I agree with you. And also, if one of the primary goals of this organization seems to be advocacy.

175
00:23:46,018 --> 00:23:47,110
Kayla: Yeah, yeah.

176
00:23:47,890 --> 00:23:54,980
Chris: Then, like, who. Then people writing about this. Like, that's the first thing I clicked on, actually, was mission.

177
00:23:55,060 --> 00:23:55,420
Kayla: Right?

178
00:23:55,500 --> 00:24:20,148
Chris: Not that I'm, like, big New York Times reporter, but, like, just an example of a person in media, which I am because I have a podcast. That was the first thing I clicked on. Cause I was like, okay, what are these guys all about? So, yes, you're right in terms of membership, but in terms of, like, if they do care about advocacy, then, like, that's the landing page for people that are, like, looking into what they are.

179
00:24:20,204 --> 00:24:20,982
Kayla: Right? I.

180
00:24:21,116 --> 00:25:07,114
Chris: So I just think. I still think it was a little weird. I also think that there's something there that just sort of illustrates that maybe they're confused about their own mission a little bit. Like, either too confused to write about it properly or too confused to not, like, outsource it to someone that doesn't really know. I don't know. I got that impression of it being just sort of all over the place, and I did glean the advocacy bit from it, I will say that, but it wasn't easy, and I'm pretty sure they could have done that in one sentence instead of a whole page. So I want to take a little bit of a step back and acknowledge that, yes, I've just been kind of, like, dunking on these poor nerds' doofy website.

181
00:25:07,242 --> 00:25:09,746
Kayla: I'm also a poor nerd, so I feel for them.

182
00:25:09,818 --> 00:25:15,460
Chris: Yeah. And I have. I. Man, I've had some doofy websites. Our site's doofy. But I want to switch.

183
00:25:15,500 --> 00:25:20,356
Kayla: I know I need to go back and look at all of our website stuff and make sure I'm walking the walk.

184
00:25:20,508 --> 00:26:05,850
Chris: Oh, like, we don't have a mission statement on the Cult or Just Weird site. It's, like, written terribly. Yeah, we probably do. But let me talk about some of the good, though. It's not all bad. Some of it's good. So, first of all, I want to go back to that blog post that I was just talking to you about. They do have a whole section that's not really, like, news, quote unquote, news blurbs. It's actually, like, real blogs written by important people in the organization. So our recurring friend Max More has a pretty long blog post on the site entitled The Philosophy of Transhumanism, which I personally found pretty helpful in my trying to grasp this, like, very broad kind of nebulous concept. So I'll talk about two examples that I really liked in his article.

185
00:26:06,430 --> 00:26:16,610
Chris: First, he has this whole section where he says that it's helpful to think about transhumanism as sort of a mixture of trans-humanism and transhuman-ism.

186
00:26:16,910 --> 00:26:18,530
Kayla: Oh, I love that.

187
00:26:19,190 --> 00:26:25,406
Chris: That's so good, because it took me, like, I had to keep reading, and then I had to, like, reread three times before I got what the hell he was talking about.

188
00:26:25,438 --> 00:26:26,182
Kayla: Say it again.

189
00:26:26,326 --> 00:26:36,452
Chris: It's a mixture of trans dash humanism and transhumanism. So, yeah, now I'm curious, because I can't.

190
00:26:36,476 --> 00:26:37,428
Kayla: I don't have. I can't tell.

191
00:26:37,444 --> 00:27:33,920
Chris: You can't articulate what that, you know. You know, it's just a vibe for you. Yeah. Okay. Well, luckily for you, I have articulated it already in writing, thank God. What he's highlighting here is that the philosophy has roots in just regular vanilla humanism, and it's also about transcending our biological humanity. So the trans dash humanism part is like, yeah, we are from humanism. Humanism is like the soil from which we grow or whatever. And then the transhuman dash ism part is like, we are also in the arena of improving ourselves beyond what you might call our biological limitations. So that's kind of what he's talking about with that there. One of the other things he talks about in this particular blog post is something called extropianism, which, depending on who you ask, can mean one or both of the following.

192
00:27:34,740 --> 00:28:10,782
Chris: It can be a play on the word entropy. So extropy is sort of like anti-entropy philosophy, something that believes in humans' ability to live for extreme lengths of time, for human civilization to exist for extreme lengths of time in opposition to the natural decay of the rest of existence implied by entropy. That's the first thing, and that's actually how it was first coined, was in that context. The second thing is also wordplay, and this time it's a play on utopianism. So in this context, extropianism is an alternative to utopianism. And that's pretty much word for word from Mr. More.

193
00:28:10,846 --> 00:28:11,970
Kayla: What does that mean?

194
00:28:12,430 --> 00:28:22,326
Chris: In case you're thinking, what's wrong with utopia? Utopia sounds nice. Yeah, that's a bit big to get into too deeply here.

195
00:28:22,478 --> 00:28:24,046
Kayla: Just go read some Sci-Fi.

196
00:28:24,198 --> 00:28:38,026
Chris: But, you know, utopianism tends to be one of those, like, road to hell paved with good intentions type things. Building someone's or some group's idea of a perfect society is associated with some of the worst shit that we have ever done as a species.

197
00:28:38,178 --> 00:28:40,426
Kayla: And we're back to eugenics.

198
00:28:40,618 --> 00:28:55,698
Chris: Early 20th century eugenicists, which I've helpfully heard called first-wave eugenicists, by the way, were essentially utopians. Like, they wanted to create a utopian society via, you know, having the correct people breed and not breeding the incorrect people.

199
00:28:55,794 --> 00:28:57,770
Kayla: Don't. Don't do that.

200
00:28:57,930 --> 00:29:30,376
Chris: Utopianism is just negatively connotated now because of all that, because it's an end that has incorrectly justified some pretty awful means. And since transhumanists are fundamentally concerned with improving the human condition, which kind of sounds like, don't you want to create a utopia? Then there's a real risk of slipping into some pretty dangerous territory along those lines. That makes sense. To which, we've already discussed that a little bit, but I'll just give a nice big we'll-get-to-that to tease some upcoming episodes about this very subject.

201
00:29:30,448 --> 00:29:31,968
Kayla: Good or bad? Good or bad?

202
00:29:32,144 --> 00:30:02,866
Chris: The episodes will be good, Kayla. So anyway, Max More sort of coined that context of talking about extropianism to emphasize that transhumanists want humanity to consistently, over time, improve their condition and reduce their suffering, not to reach some final state, which is like somebody's idea of an ideal, perfect state. Again, it's like a less shitty alternative to utopianism.

203
00:30:02,978 --> 00:30:04,530
Kayla: Okay, I'm down with that.

204
00:30:04,650 --> 00:30:18,870
Chris: But yeah, Max More not only talks about the idea of extropianism in his blog post, he also kind of invented the idea, at least as it now pertains to that context of an alternative to utopianism.

205
00:30:18,990 --> 00:30:19,810
Kayla: Okay.

206
00:30:20,110 --> 00:30:31,550
Chris: By the way, when it was coined initially, in the context of what I said before, of being, like, anti-entropy, e.g., life extension, that was by a cryonicist. So we're back to cryonics, baby.

207
00:30:31,590 --> 00:30:33,078
Kayla: It all goes back to cryonics.

208
00:30:33,214 --> 00:30:38,638
Chris: Max More also co-founded another organization called the Extropy Institute.

209
00:30:38,694 --> 00:30:41,230
Kayla: This is what I'm saying. He's got other things going on.

210
00:30:41,310 --> 00:30:42,566
Chris: He's got a lot going on.

211
00:30:42,638 --> 00:30:45,050
Kayla: Working on the mission statement all the time.

212
00:30:45,910 --> 00:30:52,126
Chris: And for that, I could find even less information than I could find about Humanity Plus. And also, it doesn't exist anymore.

213
00:30:52,198 --> 00:30:52,638
Kayla: Oh.

214
00:30:52,734 --> 00:31:00,702
Chris: It closed its doors in 2006 with the board of directors stating that its mission was, quote, essentially completed, end quote.

215
00:31:00,806 --> 00:31:03,942
Kayla: What did they do? How did they win? What happened?

216
00:31:04,006 --> 00:31:09,798
Chris: So where's my live forever pill, man? Yeah, gimme. It sounds like if you completed your mission, then those should be.

217
00:31:09,814 --> 00:31:12,370
Kayla: Around. What was. Okay, I need to know what the mission was.

218
00:31:13,060 --> 00:32:04,612
Chris: Well, they're not very good at crafting missions. I think we've established that. Anyway, that's a little side tangent on extropianism. I thought that was interesting. It honestly, to me it feels almost entirely synonymous with transhumanism. But I guess maybe it's that, but even more focused on chasing human immortality. Okay, but I don't know. They're just like, they're very similar concepts, but I guess that's what happens when all you do all day is have symposia and workshops and conferences. Speaking of which, bringing it back to Humanity Plus and Max More's writings on their site. I'll link this Philosophy of Transhumanism article in the show notes because I do think it's worth a read. If you're curious about all of this, I found it valuable. There's one more story I have to tell because it's my favorite thing: a schism. Ooh. The Humanity Plus Wikipedia entry.

219
00:32:04,716 --> 00:32:09,356
Chris: Wikipedia page hints at a very significant undercurrent.

220
00:32:09,428 --> 00:32:10,628
Kayla: I need to know more.

221
00:32:10,684 --> 00:33:06,916
Chris: Tell me. In the narrative of what transhumanism is, or specifically who it is and who it's for. An undercurrent that we've dipped our toes in already and will be continuing to dip in upcoming episodes of Cult or Just Weird. Here's the quote from the Wikipedia page. In 2006, it was reported that there was a political struggle within the World Transhumanist Association that erupted in 2004, largely between the libertarian right and the liberal left, resulting in a center leftward positioning that continued to polarize politics under its former executive director, James Hughes. End quote. As we've already discussed, there's an interesting, messy, confusing, important part of the transhumanist movement, and that it kind of has its roots in, like, hardcore libertarian, like, radical individualism, but it also has roots in humanist, egalitarian, let's-help-everyone, we're-all-in-this-together-ism, right?

222
00:33:06,948 --> 00:33:16,406
Kayla: There's this very classic liberalism, but also, like, hard. I know you just said it, but I'm saying again, like hardcore libertarianism.

223
00:33:16,478 --> 00:33:39,050
Chris: Yep, yep. For what it's worth, the mission statement that we've just been making fun of this whole episode, it does try and make a point to say, like, hey, we are not about elitists, like getting to upload their brains or get frozen. This is for everyone. This is like, we specifically want all of the benefits for humanity that we're talking about to be for everyone.

224
00:33:39,170 --> 00:33:43,922
Kayla: Is that the libertarian stance or the classic liberal stance. I don't know. I'm not kidding.

225
00:33:44,066 --> 00:33:49,930
Chris: Well, I would even say liberal left stance, like classical liberal to me, almost sounds more libertarian, but I don't know. Whatever.

226
00:33:49,970 --> 00:33:53,590
Kayla: So is that the libertarian stance or the liberal left stance?

227
00:33:53,890 --> 00:34:45,331
Chris: It's blurry, right? But I took that as evidence that they do still have this left leaning. They still have successfully kept out some of the more, like, nefarious, pernicious libertarian type stuff, at least from humanity plus. Now, whether having all of this transhuman pie in the sky, whether we can get all of that type of life extension and human augmentation or whatever, whether that actually, it's feasible to make that a reality for everyone, especially in our current political and economic paradigm, that's much tougher to say. But regardless, they're talking about that being a goal. They're talking about that wanting to be something that is true about transhumanism is that it's for everyone.

228
00:34:45,436 --> 00:34:46,156
Kayla: Okay.

229
00:34:46,308 --> 00:35:11,340
Chris: The current content being put out there by Humanity Plus seems to support the Wikipedia statement about the left-right schism. Hell, there's a news article, quote unquote news article, in their news section, the one that has, like, the two sentence articles written by them and not published anywhere else. Yeah, but there's a news. There's a news article there about police brutality. Oh, wow. I can actually read the entire article. Cause it's tiny.

230
00:35:11,500 --> 00:35:12,236
Kayla: Please do.

231
00:35:12,348 --> 00:35:32,380
Chris: Humanity Plus condemns police brutality against the people that it has sworn to protect. In particular, Humanity Plus stands against systematic violence toward people of color. We are Humanity Plus and support the proposition that all people, regardless of race, ethnicity, age, gender, religion, creed, and sexual orientation, have the same rights and opportunities.

232
00:35:33,600 --> 00:35:34,816
Kayla: When was that published?

233
00:35:34,968 --> 00:35:41,780
Chris: June 8. June 8th, written by Humanity plus.

234
00:35:42,080 --> 00:35:44,536
Kayla: No year? June 8 of this year? Because it hasn't been.

235
00:35:44,568 --> 00:35:45,352
Chris: That's all it says.

236
00:35:45,456 --> 00:35:50,520
Kayla: I'm gonna go ahead and assume that was written during the summer of 2020, probably June.

237
00:35:50,600 --> 00:35:58,034
Chris: Sure. Yeah, probably. But that's all it says. June 8, written by Humanity plus. There might be a guy named Humanity Plus.

238
00:35:58,162 --> 00:35:58,866
Kayla: Mister Plus.

239
00:35:58,938 --> 00:36:04,670
Chris: Mister plus. But, you know, they acknowledge the existence of systemic violent racism.

240
00:36:04,970 --> 00:36:09,070
Kayla: They're so woke, they're gonna cancel me.

241
00:36:10,530 --> 00:36:26,308
Chris: I did want to dig a little further, though, since this is kind of important, and this sort of, like, where is this on the political spectrum is kind of important, and it will come up again later, and I'm trying to prepare for an interview. Ooh, another teaser.

242
00:36:26,404 --> 00:36:29,420
Kayla: And I'm like, should I be scared or excited? Excited and scared.

243
00:36:29,540 --> 00:37:18,612
Chris: Yeah, it's like every time on our show, excited. And scared. So I traced through the sources on that blurb from the Wikipedia article about this schism. The info here seems to trace to two different news articles, one from Slate, all the way back in the pre-iPhone dark ages of 2006, and one from the year before, published in Utne magazine, which I had never heard of but apparently is a culture reporting zine. And it's still around, actually. But I want to read the relevant quotes from both articles just to have it in mind for when we talk about the other side of the transhumanist political coin. The first blurb is from the Slate article, where the author attended a 2006 transhumanist conference at Stanford a while back. I'm told there was a left-right battle for the soul of transhumanism, and the left won.

244
00:37:18,796 --> 00:38:23,692
Chris: Libertarians got a few nods at the conference, but mostly for opposing drug laws and the draft. Speakers and attendees called themselves visionaries, futurists, or revolutionaries. They invoked Marcuse, Sartre. Did I pronounce that right? Sartre, Sartre, and Heidegger. They preached struggle and solidarity. They spoke of speciesism, morphological diversity, techno-progressive transhumanism, somatic epistemic technology. I have no idea what that is, actually. I have no idea probably what half of this stuff is. Non-anthropocentric personhood ethics, which sounds awesome. That's me, not the article writer. That sounds awesome is me. And the, quote, illusory distinction between self and cosmos. They called the United States a, quote, bloated uber power, end quote. They cheered calls for a worldwide guaranteed income, free lifelong therapy, and a universal right to art and paid vacations. Quote, I'm a very pragmatic kind of anarchist feminist, said one speaker, end quote.

245
00:38:23,756 --> 00:38:30,828
Chris: And I'm now ending the quote from the article writer as well. So I don't want to someone to hang out with these guys biased here.

246
00:38:30,884 --> 00:38:32,440
Kayla: But these are my people.

247
00:38:33,140 --> 00:38:36,944
Chris: That's my kind of transhumanism. Yeah, sounds pretty good.

248
00:38:36,992 --> 00:38:38,460
Kayla: It sounds like a nice time.

249
00:38:39,680 --> 00:38:47,056
Chris: But keep in mind that this article and this slate article is from 2006, though, so that's a long time ago.

250
00:38:47,128 --> 00:38:50,304
Kayla: The leftism of 2006 might be.

251
00:38:50,352 --> 00:38:51,168
Chris: It's a long time ago.

252
00:38:51,224 --> 00:38:51,900
Kayla: Yeah.

253
00:38:52,240 --> 00:39:36,844
Chris: People that were on a sort of left leaning, like, center liberal left leaning might have had these kind of ideas in 2006 and might have migrated since then. Some of them, yeah. It's pretty complicated. I do think it's pretty out of scope for what we're talking about, even though it, like, by definition kind of has to be in scope. But that's why we're going to cover this over several episodes. This next blurb from the magazine I mentioned, Utne, is from 2005, and it says, quote, since it emerged from the fringes of cyber culture in the late 1980s, the transhumanist movement has been known as much for its libertarian leanings as for its belief in the plugged-in, four-armed human of tomorrow.

254
00:39:37,012 --> 00:40:18,766
Chris: While today all the self proclaimed liberal transhumanists could probably fit in the holodeck of the Starship Enterprise, they count a number of influential scientists, bioethicists, and philosophers in their small but growing ranks. Unlike their libertarian peers, who tend to denounce all regulation, these democratic transhumanists view societal controls as crucial to realizing their openly utopian dreams. This is utopian, by the way, coming from this article writer. I would. I think that they would like to say that they are extropian. But anyway, some argue that the trend is irreversible. As with IVF and other assisted reproduction techniques, the public demand for longer lives, prettier children and better moods will override efforts to stop them.

255
00:40:18,878 --> 00:40:20,130
Kayla: Prettier children?

256
00:40:22,630 --> 00:40:32,144
Chris: Yeah, that's. I feel like that's a little straw-manny, but that's okay. If these powerful new technologies are to be used justly, they say the time to embrace them is now.

257
00:40:32,232 --> 00:40:35,160
Kayla: People aren't doing IVF to make prettier children.

258
00:40:35,280 --> 00:40:44,664
Chris: No, that's not. Hold on. Okay, so that's not what this was saying. This was saying, as with IVF, when something becomes available, people will say, oh, that's cool. I want it.

259
00:40:44,712 --> 00:40:45,192
Kayla: Gotcha.

260
00:40:45,256 --> 00:40:50,688
Chris: So once we can Gattaca, they're basically saying IVF has proven that once we.

261
00:40:50,704 --> 00:41:00,204
Kayla: Can start doing a Gattaca, that's into having big old lips and, like, doe eyes and broccoli hair and four arms.

262
00:41:00,332 --> 00:41:52,832
Chris: Look, I think I disagree with that. I don't think that one implies the other, but that's what it's saying. Okay, I'll reread a sentence here just to pick back up. If these powerful new technologies are to be used justly, they say the time to embrace them is now. Others go even further, heralding the redesigned human as the key to transforming the world along progressive lines. Today, human intelligence in the form of technology is about to make possible the elimination of pain and lives filled with unimaginable pleasure and contentment, writes James Hughes, author of Citizen Cyborg: Why Democratic Societies Must Respond to the Redesigned Human of the Future, the former editor of a zine called Ecosocialist Review, who teaches health policy at Trinity College in Connecticut. Hughes, 44, is an executive director of the World Transhumanist Association, WTA.

263
00:41:53,016 --> 00:42:02,220
Chris: His goal, he says, is to convince fellow liberals that a pro technology, democratic form of transhumanism is the way of the next left, end quote.

264
00:42:02,520 --> 00:42:04,512
Kayla: Do you think that has come to pass?

265
00:42:04,656 --> 00:42:16,366
Chris: No. Yeah, I think this was right around the time that I was reading Singularity Is Near and was also kind of buying into a lot of this stuff. I mean, I still maintain.

266
00:42:16,558 --> 00:42:20,810
Kayla: So why do the liberals hate technological progress?

267
00:42:21,950 --> 00:42:33,278
Chris: It's because technology is not woke enough for them, Kayla. So in this article, it says, in this 2005 article, it says Hughes was director of the World Transhumanist Association.

268
00:42:33,414 --> 00:42:33,966
Kayla: Yeah.

269
00:42:34,078 --> 00:42:55,424
Chris: Which was at that time the name of what would later be called Humanity Plus. Oh. Now, did the schism take? That's kind of what we've been talking about. It's about 20 years now after either of those articles were written. So we ought to ask the question, what are the political underpinnings and political aims of transhumanists in 2024?

270
00:42:55,552 --> 00:43:04,980
Kayla: This is what I would like to know, because like you said, a quote unquote left leaning person in 2006 can be in a really weird quadrant these days.

271
00:43:05,480 --> 00:43:07,968
Chris: Things have really jumbled up lately.

272
00:43:07,984 --> 00:43:08,888
Kayla: It's a weird world.

273
00:43:08,944 --> 00:43:57,632
Chris: Man. Well, we will get to that. My goal with this episode was just to highlight and talk about an actual organization that falls under the transhumanist movement. Humanity Plus was kind of the most obvious one to pick since they're, like, literally about the movement, talking about it a lot anyway. But there are more. For example, there's an Internet forum where a lot of transhumanists and extropian-leaning folks like to hang out. A community that has gotten more and more attention lately, one that extols rationalism and utilitarianism to a degree so extreme it circles back and becomes irrational again. One that a podcast called Cult or Just Weird talked about way back in season two. Next time on Cult or Just Weird, we enter the lair of the basilisk.

274
00:43:57,736 --> 00:44:03,400
Kayla: Oh, no. This is Chris, this is Kayla, and this has been, I'm very scared.

275
00:44:03,520 --> 00:44:04,224
Chris: Cult or Just.