Join the conversation on Discord!
March 31, 2020

S2E1 - The Experts (Cult Awareness Network & Cult Deprogramming)

Cult Or Just Weird

Wanna chat about the episode? Or just hang out?

Come join us on Discord!

 

---

"You cannot save people. You can only love them."
-Anaïs Nin

Welcome to Cult or Just Weird... SEASON 2.

In this episode Chris and Kayla talk about the importance of expertise, and about what happens when the cult is coming from inside the house.

---

*Search Categories*

Science / Pseudoscience; Destructive

---

*Topic Spoiler*

Cult Awareness Network & Cult Deprogramming

---

*Further Reading*

"Cult" resource groups

https://www.icsahome.com/

http://cultresearch.org/

https://culteducation.com/

http://www.cultexperts.org/ (check out their "Buyer Beware" section...)

 

CAN & Deprogramming

https://en.wikipedia.org/wiki/Cult_Awareness_Network

https://en.wikipedia.org/wiki/Deprogramming

https://en.wikipedia.org/wiki/Ted_Patrick

 

A few nodes in the trust network to help you check your assumptions about all of this

"The Brainwashing Myth" on theconversation.com

"Do you Believe in Brainwashing?" on The Guardian

"Brainwashing & Deprogramming are equally mythological" on Skeptoid

 

---


*Patreon Credits*

initiates: Michaela Evans

cultists: Rebecca Kirsch, Pam Westergard, Alyssa Ottum, Ryan Quinn, Paul Sweeney

Transcript
1
00:00:00,360 --> 00:00:01,686
Kayla: Welcome to Cult or Just Weird.

2
00:00:01,798 --> 00:00:02,398
Chris: Hey, everyone.

3
00:00:02,494 --> 00:00:14,470
Kayla: Season two. These are bizarre circumstances, strange times that we are relaunching under. We're both freaked out and anxious and trying to do our best to live our lives in our little apartment.

4
00:00:14,590 --> 00:00:41,458
Chris: Yeah. So. But let me preface this real quick. Kayla and I are going to say a few things about our current situation just for a few minutes here. So if you are tired and sick of hearing content about viruses and all that, and you just want to get on to hear us yap about some regular culty weirdness, you might want to skip to the 11:20 mark, where we'll get started on the show's regular content. We'll not blame you at all if you do that.

5
00:00:41,554 --> 00:00:42,146
Kayla: You're up.

6
00:00:42,218 --> 00:00:42,954
Chris: Oh, okay.

7
00:00:43,002 --> 00:00:44,170
Kayla: Wait, aren't you. You were you.

8
00:00:44,210 --> 00:00:45,754
Chris: I just was interrupting you, so I thought that you.

9
00:00:45,802 --> 00:00:49,402
Kayla: Oh, no, I was just saying. Just guiding us on into our corona.

10
00:00:49,426 --> 00:01:32,162
Chris: Chat, just making it happen. Okay. Yeah, so I. Yeah, so I know we wanted to say something here. I was not sure for a little bit about what I wanted to say. I sort of. I zoomed back out into, why do I do this show? Why did I even turn from not a podcaster into a podcaster in the first place? And I think the answer for me comes down to some of the major themes of the show, which are curiosity and rational skepticism. And I think it was like, really when I started researching my very first episode, the second one of the series on Ramtha and all that stuff that I really went from, I don't know, we could do a podcast about cults.

11
00:01:32,186 --> 00:01:41,618
Chris: I guess to, like, this is something that feels like I personally have something invested in, because, like, I don't know, like, I've never really been, like, into cults before in my life.

12
00:01:41,714 --> 00:01:43,090
Kayla: That's what I brought to this.

13
00:01:43,170 --> 00:02:21,556
Chris: That's what you brought to this. But I've always been into rational skepticism. I've always been into that. So for me, this is, in a fundamental way, almost less of a podcast about cults than it is about seeking truth, fighting fake news and alternative facts. And, you know, being a rational skeptic. And in my opinion, it's none too soon. Figuring out how to cope with and not be consumed by the avalanche of information that is out there right now is truly the defining struggle of this generation and of everyone on earth right now. So I've been thinking about what I wanted to say on our tiny little platform here about the coronavirus crisis, because there are a lot of different things that we could say about it.

14
00:02:21,668 --> 00:03:09,836
Chris: Obviously, it's affecting people in different ways in all walks of life across the globe, in just a tremendous variety of ways. But I think the right message for this show, or maybe just for me, is: the coronavirus pandemic is simply the latest and most urgent battle in the war on science and rationality. We're not really fighting a virus. We are fighting the irrational. We have guidelines from the WHO, from great articles by mathematicians and scientists and trusted sources telling us the exact way to defeat this virus. And we absolutely can defeat it, and I believe we will. So if you're feeling anxiety, just know that I believe we absolutely will defeat this virus. But like with everything else, we also have not just the trusted sources and the WHO, we also have conspiracy mongers and deniers.

15
00:03:09,988 --> 00:03:46,716
Chris: I have a person close to me who just recently told me, only a week and a half ago, that she thought it was a hoax. Just today on social media, I read a conversation in which somebody I know was asking their network, you know, for those of you that don't think this is a problem or think that it's blown out of proportion, like, what's going through your mind right now? What would it take for you to be convinced that this is something we have to be concerned about and act on? And literally one of the people replied to him and said the only thing that would do it for me would be, I'd have to literally get it. I would have to get the disease. And how familiar is this sort of attitude to us on Cult or Just Weird, right?

16
00:03:46,868 --> 00:04:29,140
Chris: This is the exact argument of new age cultists in the Ramtha School of Enlightenment: I am my own God, I don't trust anything other than my own experience. It's the exact argument of flat earthers: I don't believe anything other than what I can see and I can test; I don't believe science, I don't believe NASA, I don't believe the rest of society. It's the argument of MLM victims: I don't trust the math, I can make this work. Of anti-vaxxers: I don't believe the medical community and scientists, I believe what is comforting to me. Of climate science deniers: I don't believe the scientific consensus, I believe everything's fine because I am personally not on fire right now. And the list goes on. Fighting coronavirus, which in reality is fighting coronavirus denial, is nothing new.

17
00:04:29,260 --> 00:05:11,784
Chris: It is literally the same battle we've been fighting over and over for a long time now, maybe even since the dawn of humanity. So why did I want to talk about this? Well, if you guys are anything like me, then this fight can take its toll on you mentally and emotionally. It can get you down, constantly feeling like you have to be the responsible one, fighting climate deniers and MTHFR snake-oil peddlers and MLM converts, and so on. It's hard, and it can feel lonely and thankless. And especially with conspiracy mongers being in control of our political apparatus right now, and such a large and growing proportion of Americans that subscribe to these irrational beliefs, many of whom put the conspiracy mongers in power in 2016, it can really start to feel bleak. I get it.

18
00:05:11,952 --> 00:05:53,990
Chris: So I bring all this up to say that if you ever feel the same way that I do sometimes to counter that thought of aloneness, you are not alone. And together, the rational people will persevere and win. And if Kayla and I are the last motherfuckers on earth broadcasting that you should trust good science, that you should believe in academic experts, that you should believe math, then so be it. We are here for you. We are on your side. And I really do think that despite the large inroads made by the irrational set in recent years, there are more rational people than not. The fight we are facing won't be easy. It definitely won't be fair. The irrational have always benefited unfairly from the work and responsibility of the rational. There's just no escaping that.

19
00:05:54,330 --> 00:06:19,770
Chris: But the victory will be rewarding, and we will do it together. Kayla and I are on your side, and we are tough outs. And finally, we have a secret weapon. The truth always does win. It may take time, it may even be costly, but the nature of truth is that it's inevitable. So, anyway, I hope with our tiny little platform and our small little community here, we can give you some resilience and hope from just knowing that we're here. And that's my bit.

20
00:06:19,850 --> 00:06:20,778
Kayla: That was beautiful.

21
00:06:20,914 --> 00:06:21,482
Chris: Thanks.

22
00:06:21,586 --> 00:06:31,672
Kayla: You're welcome. What? That was beautiful and poignant. And I guess we should also probably point out that we're recording this on.

23
00:06:31,816 --> 00:06:36,064
Chris: March 22, we're recording this on March 22, and we're publishing on March 31.

24
00:06:36,152 --> 00:06:40,160
Kayla: So it could be a massively different landscape on March 31.

25
00:06:40,240 --> 00:06:43,080
Chris: Oh, it will be. Nine days is eternity right now. Yeah.

26
00:06:43,120 --> 00:07:21,940
Kayla: So I hope that everything we're saying is still relevant. I hope that everything we're saying is still helpful. I know for me, the thing that keeps me sane during times like this is looking for the helpers, and then being one of the helpers. We should post that incredible Mister Rogers quote about when things are scary: look for the helpers. And it's really easy to feel helpless and hopeless right now if you're a person that likes to take action and do things and be a part of it. Like, we literally can't leave our houses right now to take to the streets. But there are some things to do. I mean, I think the number one thing to do, as we all know, is to listen to the scientists, listen to the WHO.

27
00:07:22,520 --> 00:07:29,256
Kayla: Stay the fuck inside your house, wash your hands, do all of the things that I'm sure all of our listeners are already doing.

28
00:07:29,368 --> 00:07:32,496
Chris: I know sometimes I feel like we're preaching to the choir with our listeners.

29
00:07:32,688 --> 00:07:54,290
Kayla: But it's good to. It's just good to get it out there. You can continue supporting the economy where you can. So some ways you can do this are buying gift cards at local restaurants. If you don't feel comfortable doing takeout, do takeout if you feel comfortable with it. Gift cards for any local businesses that you know you'll eventually use, you know, in 4618 months.

30
00:07:54,590 --> 00:07:59,342
Chris: And if you do takeout, remember, try not to touch door handles, and wash your hands.

31
00:07:59,366 --> 00:08:45,796
Kayla: If you do right, you can. There's a lot of gyms and fitness centers that are now broadcasting and live streaming their classes. So, you know, sign up for those classes, pay for those classes. I know that some studios are offering live stream classes and are like freezing your membership for the time being. And if you can afford it, ask them to unfreeze your membership. If you're still gonna be taking their livestream classes, ask them to unfreeze your membership so that you're still supporting that business. We'll post some links for businesses local to us that we recommend and we'll post some other links with ideas and places that you can put your money. And in the same vein, donate when you can. Local nonprofits that include disaster relief, homelessness relief, animal rescues, those are really big right now.

32
00:08:45,948 --> 00:09:35,744
Kayla: And one place, not local, that we can all be putting our money towards is the WHO's COVID-19 Solidarity Response Fund. So they've put together this fund, and from their website: this fund will enable us to send essential supplies such as personal protective equipment to frontline health workers; enable all countries to track and detect the disease by boosting laboratory capacity through training and equipment; ensure health workers and communities everywhere have access to the latest science-based information to protect themselves, prevent infection, and care for those in need; and accelerate efforts to fast-track the discovery and development of life-saving vaccines, diagnostics, and treatments. So, a pretty good place to put your money right now if you have some extra. Some other tangible ways: you can donate blood. There's a shortage right now due to everything and people being forced to stay in their homes.

33
00:09:35,872 --> 00:10:16,096
Kayla: We'll post a link to where you can donate. I do want to point out that the FDA has an unfortunate and very disturbing ban on who can and cannot give blood, and that's a whole other episode. So if you are in the LGBTQ community, feel free to take blood donation off your list, because the FDA is a jerk about it. Right now is a good time to foster pets. A lot of rescues and shelters are closing and need people that can open their homes to little fur friends. Right now you can get involved with politics. Now is a good time to do some text or phone banking. Those are things you can do from your home.

34
00:10:16,288 --> 00:10:32,928
Kayla: And I have a bunch of other links that I'm going to post with just opportunities where you can kind of focus your energy and focus your activism right now in ways that are actually tangible and will kind of help get us all through this. One way or another, we'll get through.

35
00:10:32,944 --> 00:11:00,018
Chris: It together and take care of your mental health, too. I know that's kind of what this is about. Ways to be active and things to consider. And that was sort of what my bit was about, was, like, we're here for you, so, you know, take care of each other and take care of your mental health. It's gonna be. It's gonna be a tough time, and we don't even know when this is published, how tough it's gonna be on March 31. One thing that's gonna be hilarious is listening to our third episode that we recorded before this whole thing.

36
00:11:00,074 --> 00:11:08,552
Kayla: We recorded before the lockdowns were in place, before, like, it was recorded, like, literally a week and a half ago, two weeks ago from this point.

37
00:11:08,656 --> 00:11:38,696
Chris: Yeah. So, actually, it might be our fourth episode. I'm not sure. But anyway, you'll know. You'll know which one that is, like, not coronavirus times. So that's, you know, you'll see how that sausage is made. But, yeah. So, anyway, without further ado. Oh, actually, wait, one more thing. The Cult or Just Weird team has a nice little surprise for you guys this season. New season, new music. Now, without further ado, welcome to season two of Cult.

38
00:11:38,848 --> 00:11:40,300
Kayla: Or Just Weird.

39
00:12:04,080 --> 00:12:04,688
Chris: All right.

40
00:12:04,784 --> 00:12:06,260
Kayla: We never said who we are.

41
00:12:07,560 --> 00:12:12,240
Chris: Well, here, this will be the start of the actual episode, though, so for those of you.

42
00:12:12,280 --> 00:12:13,860
Kayla: Sorry, I'll stop saying go.

43
00:12:14,240 --> 00:12:31,652
Chris: No, you said every time you're like. And go, oh, man. Welcome back. For those of you that skipped ahead to this, and for those of you that are here now, welcome back. New season, new decade, new music. You just heard, we had a pretty good off season. We kept acquiring new listeners over our season hiatus.

44
00:12:31,716 --> 00:12:32,340
Kayla: Thanks, guys.

45
00:12:32,420 --> 00:12:45,396
Chris: So that was cool. Yeah. If anybody was spreading word of mouth, you guys are the best pyramid ever. What else is going on? Nothing. Nothing's really in the news right now. Right. So, yeah, let's just ignore it.

46
00:12:45,468 --> 00:12:46,280
Kayla: We're trying.

47
00:12:47,820 --> 00:12:48,540
Chris: Okay.

48
00:12:48,660 --> 00:12:50,460
Kayla: So you're up this week, so.

49
00:12:50,540 --> 00:12:51,116
Chris: That's right.

50
00:12:51,188 --> 00:12:53,582
Kayla: Thank you for being the season opener.

51
00:12:53,676 --> 00:12:59,002
Chris: You're welcome. I take that honor with responsibil. I don't know.

52
00:12:59,066 --> 00:13:00,602
Kayla: That was bad. That was really bad.

53
00:13:00,666 --> 00:13:03,466
Chris: I do things good and yay is what I meant to say.

54
00:13:03,538 --> 00:13:05,018
Kayla: Good. Do we need to give a refresher.

55
00:13:05,074 --> 00:13:34,830
Chris: On, like, what this is, what we're about? Yeah, I mean, do you think people will... maybe people will start with season two, episode one, they might think. Okay, well, yeah, just to level set. I mean, what we are is... actually, this will be good, 'cause I have a good way to segue into the content by talking about what we are. So why don't you start us off? Oh, my God. We are a piece of paper. No. Oh, my God.

56
00:13:35,730 --> 00:13:37,882
Kayla: So we are a podcast.

57
00:13:38,026 --> 00:13:38,618
Chris: Good.

58
00:13:38,754 --> 00:13:39,594
Kayla: Dedicated.

59
00:13:39,722 --> 00:13:41,706
Chris: Really narrowing it down here. Yep.

60
00:13:41,858 --> 00:14:06,858
Kayla: Dedicated to chatting about groups that we find interesting and that maybe or maybe not meet a certain set of criteria. And we think that they're either cult adjacent or they're fucking cults. And a lot of groups we've covered so far have been, you know, they've run the gamut from straight up new age. What is it? New age religious movements. What is it?

61
00:14:06,954 --> 00:14:08,466
Chris: Well, they're so new religious movements.

62
00:14:08,498 --> 00:14:11,202
Kayla: New religious movements to restaurants.

63
00:14:11,266 --> 00:14:17,986
Chris: So new religious movement, by the way, is sort of the academically preferred term. Cult is not really an academically preferred term.

64
00:14:18,058 --> 00:14:18,594
Kayla: Correct.

65
00:14:18,722 --> 00:14:22,622
Chris: It's very colloquial, very loaded. And actually, we'll get to that.

66
00:14:22,726 --> 00:14:23,702
Kayla: Oh, shit.

67
00:14:23,806 --> 00:14:24,614
Chris: In a bit.

68
00:14:24,742 --> 00:14:38,974
Kayla: Okay, so, yeah, like Christopher said, new religious movements. Restaurants, video games, insurance companies, cities. Cities, yeah. Just beauty product companies.

69
00:14:39,062 --> 00:14:47,134
Chris: We try to not have any tying together theme whatsoever. We just. We're just shotgunning all over the place.

70
00:14:47,222 --> 00:14:57,518
Kayla: But I think that what we've learned through doing this podcast is that pretty much anything can become a cult, or cult-like.

71
00:14:57,614 --> 00:15:02,590
Chris: Yeah. Well, and we have a list of criteria, which. Oh, yep. It's on physical.

72
00:15:02,630 --> 00:15:03,454
Kayla: You started this.

73
00:15:03,542 --> 00:15:16,810
Chris: I know. I know. I regret it terribly. And I feel like... so the list of criteria is basically six things that we, as non-experts (which we'll also get to), have come up with. Tell us, yeah, tell us what the criteria are.

74
00:15:16,890 --> 00:15:43,034
Kayla: Number one, expected harm towards the individual. This can be social harm, physical harm, financial harm. Number two, is it niche within its society? This used to be a different criterion, but it didn't make any sense. It used to be "population of cult," which didn't make sense, so now it's: is it niche? Number three, antifactuality. So is there a closed logical system present? Is there a denial of truth?

75
00:15:43,082 --> 00:15:44,514
Chris: Is there motivated reasoning?

76
00:15:44,562 --> 00:16:00,986
Kayla: A lot of that. Number four, percentage of life consumed. No, not like vampires. But how much of your time is devoted to the group, to the activities of the group, to being a part of the group? Number five, ritual. So the more ritualistic behaviors that are involved in this group, the more culty it is.

77
00:16:01,138 --> 00:16:07,994
Chris: And there's a lot of cool stuff with that. I mean, rituals can be words, it can be writings, it can be ceremonies. It can be a lot of different things.

78
00:16:08,042 --> 00:16:11,482
Kayla: Oh, we are very liberal with our.

79
00:16:11,586 --> 00:16:16,964
Chris: Usage of the term rituals. So is "research" as well, true. That's a pretty broad term.

80
00:16:17,012 --> 00:16:41,964
Kayla: True. Number six, last but not least, my personal favorite: charismatic leader. Is there a presence of a charismatic leader? At some point when we started this, we said that it needed a living leader, and we've definitely gone away from that; it clearly doesn't need a presently living charismatic leader. Cult of personality, think along those lines. And those are our criteria. So keep that in mind as Christopher takes us on a journey.

81
00:16:42,052 --> 00:17:03,572
Chris: Yeah, and I feel like that list, what usually happens. So the reason that all of those groups that we said were so diverse is that I feel like what usually happens in the show is like, one of us will notice one or two of those criteria sort of applying to something, and that'll kind of, like, catch our attention, and then we'll say, like, ooh, that would be a good topic for the show.

82
00:17:03,636 --> 00:17:03,980
Kayla: Right.

83
00:17:04,060 --> 00:17:44,480
Chris: And then it kind of becomes like, well, how many of the criteria do they hit? And again, you know, not experts. We have not been academically trained to be cult researchers. We're not even trained journalists. We are just open-minded, curious people that like talking about this stuff. And hopefully we do a good job of entertaining and bringing in experts when we can. And a big part of the show, too, if you did stick with us through the preamble, is, you know, we like to be rational skeptics, too. We like to practice congenial and gracious skepticism.

84
00:17:44,940 --> 00:17:47,596
Kayla: We're skeptics, but not like dicks about it, right?

85
00:17:47,628 --> 00:17:52,508
Chris: That's what it says on our Twitter. Yeah. So anyway, although we could probably be.

86
00:17:52,524 --> 00:17:53,556
Kayla: Dicks about it sometimes.

87
00:17:53,628 --> 00:18:39,318
Chris: Yeah, I'm probably a dick about it way more than I should be. But anyway, so that's Cult or Just Weird, if you're joining us for the first time. All right, so having had some time away from the show, aside from giving you time to do your other 37 jobs or whatever, actually has been good for other reasons, too. It gives you some space and some time to think and contemplate about the show. And with Cult or Just Weird, one of the things that I think about a lot is, well, the thing we just talked about: we're not experts. Right. I thought about that a lot in season one as well, which you can probably tell is important to me from listening to the preamble, right? So that's why I frequently went to try to get experts to come on the show and answer questions for us.

88
00:18:39,334 --> 00:19:18,504
Chris: That's why I tried to get someone to talk to us for the Mary Kay episode and for the MTHFR episode. And then, granted, I still had to select those experts based on my own personal criteria and trust network. But I guess the point is, I just feel a lot of responsibility that when we put a podcast product out there, especially a podcast like ours, where we put a lot of emphasis on skepticism and trust, and we are very critical of things like fake news and people like Teal Swan and JZ Knight, it's really extra incumbent on us to strive for truth and leverage expertise in places where we very much don't have it. It sort of ties in a lot to what I was saying in the pre-show preamble.

89
00:19:18,592 --> 00:19:56,446
Chris: So I thought a lot about that for season two and for this episode, I actually reached out to several people, several personalities. I looked at several sites because I wanted to actually bring somebody on the show to talk to us about, not something specific, but actually more general about the show itself, about cults, about even our criteria. And so actually, that's why this is a delayed recording. So that's why episode three is going to seem so out of place, or three, four, that's why this was a delayed recording, because I was trying to get a hold of some people and it didn't quite work out. I still do want to have somebody on the show to chat about this at one point.

90
00:19:56,638 --> 00:20:29,578
Chris: I've been wanting that for a while now, actually, and I thought season two would be a good place to level set about it, but I wasn't able to get somebody quite in time to go on the show for this episode. But the nice thing about doing research for this show, as you have noted in the past, is that it's much like measuring the coastline. And what we mean about that is that depending on how far you zoom in to look at something, it gets more and more detailed and more wrinkly, and you start noticing things that you didn't notice when you were zoomed out.

91
00:20:29,754 --> 00:20:36,386
Kayla: It's a more descriptive way of saying going down the rabbit hole. I think mostly because I coined the phrase personally.

92
00:20:36,578 --> 00:20:38,442
Chris: Right. That's the important part.

93
00:20:38,626 --> 00:20:39,738
Kayla: It's all about me.

94
00:20:39,794 --> 00:20:59,314
Chris: Kayla™. So the interesting thing about cult research, cult awareness, cult monitoring, whatever the industry is called, whatever that community is called, that I found out while trying to find an expert to talk to, is that it's not exactly well regulated.

95
00:20:59,482 --> 00:21:00,954
Kayla: Wait, what's not well regulated?

96
00:21:01,082 --> 00:21:17,474
Chris: That industry, that community. There's not, like, a governing body. There's not a lot of, like, authority. There's not, like, a Nature magazine for cult activity. There's not, like, an American Journal of Medicine for cult activity.

97
00:21:17,602 --> 00:21:19,226
Kayla: You'd feel like there would be.

98
00:21:19,338 --> 00:21:54,188
Chris: Yeah, but it's not. Well, it is and it isn't, right? Because some of it, I think, comes down to the fact that cults are not really... it would be almost like researching pop culture, because it's a very colloquial term. I think there are researchers that research new age movements and new religious movements. But cults implies something different, because we've talked about some new religious movements on the show. We've also talked about other stuff. You wouldn't classify MLMs as a new religious movement.

99
00:21:54,244 --> 00:21:54,892
Kayla: Right. But it's definitely.

100
00:21:54,916 --> 00:22:03,492
Chris: But they're definitely cult-like. And that's pretty evident. If you go onto, like, the Cult Education Institute's website, for example, they have all kinds of stuff there about MLMs.

101
00:22:03,556 --> 00:22:04,020
Kayla: Right?

102
00:22:04,140 --> 00:22:56,132
Chris: So it's like this weird space that sort of, like, defies classification in a strange way, which is, like, probably why we are always doing criteria on the show. And it's fun. I don't know, but, yeah. So it's this real hodgepodge of things out there. And some of them are experts, some of them are less expert. Here's a short list of groups that are out there. There's a group called Freedomofmind.com, and this is sort of the organization built around a guy by the name of Steve Hassan, who we may talk about more in a bit. He's one of the people I tried to get on the show, but, you know, he is a cult researcher and currently an exit counselor for people trying to leave cults. There's the ICSA, the International Cultic Studies Association. There's cultresearch.org.

103
00:22:56,236 --> 00:23:15,240
Chris: I also tried to get the person that runs cultresearch.org on the show. I didn't have much luck there either. There's openmindsfoundation.org. There's the Cult Awareness Network, cultexperts.org, and of course, the one we just mentioned, the Cult Education Institute, led by Rick Alan Ross, whom I believe we've mentioned on the show before.

104
00:23:15,320 --> 00:23:15,896
Kayla: We have.

105
00:23:16,008 --> 00:23:53,892
Chris: And there's more too. That's just a short list. So I didn't get a chance to look into all these organizations in depth, because I wound up looking into one of them in particular in depth. But the thing I took away while I was going down these rabbit holes trying to find an expert to talk to was that in this field, there's not really a well-defined, like I said, academic governing body. There are academics who study this stuff. In fact, one of the folks I just talked about up there was really the most academic person I could find: Dr. Janja Lalich, who, along with her PhD, has written several books on the matter and is a professor at a California system university. I tried to get her on the show. I'm hoping we can maybe still get her on at some point.

106
00:23:53,916 --> 00:24:21,732
Chris: But she's like the most academic person that I could find. Like, somebody that was actually at a university, that actually had written books about this stuff. But what I'm trying to get across here is that it's a mixed bag of groups. Some are generally very good, like CEI, the Cult Education Institute; some, not so much. One of those ones I mentioned up there, cultexperts.org, I started digging into a little bit, because, you know, sometimes when you're reading something, something just does not quite smell right.

107
00:24:21,796 --> 00:24:31,076
Kayla: Yeah, it's like when there's like a law being passed and it's like, this is from the Americans for freedom of friends, and you're like, what?

108
00:24:31,228 --> 00:24:32,852
Chris: Right? There's just something about it. People look into it.

109
00:24:32,876 --> 00:24:33,908
Kayla: It's something insane.

110
00:24:34,044 --> 00:24:40,952
Chris: So this cultexperts.org is apparently a part of something called the Apologetics Index online.

111
00:24:41,096 --> 00:24:41,780
Kayla: What?

112
00:24:42,080 --> 00:25:34,378
Chris: Listen to this definition of what apologetics is, from Wikipedia: apologetics, from the Greek for "speaking in defense," is the religious discipline of defending religious doctrines through systematic argumentation and discourse. Early Christian writers who defended their beliefs against critics and recommended their faith to outsiders were called Christian apologists. In 21st-century usage, apologetics is often identified with debates over religion and theology. So the long and short of the matter is, some of the groups I listed are more secular in nature. And in fact, there's another place I found that, like, broke down which groups tend to be more secular-focused and which tend to be more religious-focused. But that's the thing: sometimes these anti-cult groups are secular, and sometimes they're religious. Sometimes it's less about cult education, protecting people, whatever, and more about the dominant religion protecting itself.

113
00:25:34,434 --> 00:25:35,242
Kayla: Right, right.

114
00:25:35,386 --> 00:26:23,310
Chris: If you guys have seen Wild Wild Country, it's sort of like the end of Wild Wild Country, where once they drive out the Rajneeshees, which, you know, yeah, they totally should have, like, they were nuts. But the thing it got replaced with was a Christian summer camp. So it was like, you can look at it through two lenses. One lens is that there were some crazy motherfuckers ruining a place, which is true. But then you can also look at it through the lens of: there was a new religious movement being sort of driven out and, maybe in a sense, persecuted by the dominant religion. So it's like this really weird... like, it's kind of a mess, actually. That's why it's, I think, a bit impervious to research: because it's such a mess in that sense. Right.

115
00:26:23,350 --> 00:26:36,340
Chris: Like, some of it's about what we're about, which is rational skepticism and curiosity and open-mindedness. And some of it is more about, like, you know, Christian groups trying to make sure that you're not doing the other religion.

116
00:26:36,420 --> 00:26:37,452
Kayla: Right, right.

117
00:26:37,636 --> 00:27:04,950
Chris: Anyway, so like I said, it's a bit of a mixed bag when you're talking about all these different cult research, cult awareness, whatever groups. And sometimes when you reach into a mixed bag, you pull out something good, you know, like CEI or whatever. Sometimes you pull out something, well, weird. Okay, so let's reach into that proverbial bag here and pull out the name of one of those organizations that I listed above.

118
00:27:05,330 --> 00:27:07,938
Kayla: There were some that definitely sounded culty.

119
00:27:08,034 --> 00:27:10,314
Chris: Yeah, well, actually. Oh, do you want to play a guessing game?

120
00:27:10,362 --> 00:27:11,162
Kayla: I kind of do.

121
00:27:11,266 --> 00:27:15,914
Chris: Okay, so I'm gonna. I'm gonna list them again. I want you to tell me which one you think we're talking about today.

122
00:27:16,002 --> 00:27:18,506
Kayla: Which one I think we're talking about. Which ones I think sound culty.

123
00:27:18,658 --> 00:27:20,586
Chris: The one I think. The one you think we're gonna talk about.

124
00:27:20,618 --> 00:27:21,034
Kayla: Okay.

125
00:27:21,122 --> 00:27:22,402
Chris: Freedomofmind.com.

126
00:27:22,466 --> 00:27:23,270
Kayla: That one.

127
00:27:24,490 --> 00:27:25,034
Chris: All right.

128
00:27:25,082 --> 00:27:27,950
Kayla: I mean. Okay. Actually, keep going because there was another one.

129
00:27:29,450 --> 00:27:38,274
Chris: Freedomofmind.com, the International Cultic Studies Association, cultresearch.org, openmindsfoundation.org.

130
00:27:38,322 --> 00:27:40,170
Kayla: That's the one. That's the one.

131
00:27:40,290 --> 00:27:46,906
Chris: The Cult Awareness Network, cultexperts.org, and the Cult Education Institute.

132
00:27:47,058 --> 00:27:48,306
Kayla: I think it's the open one.

133
00:27:48,378 --> 00:27:52,946
Chris: Do you remember which one? Wait, which one? There's two. Oh, sorry. Yeah. Openmindsfoundation.org.

134
00:27:52,978 --> 00:27:53,218
Kayla: Yeah.

135
00:27:53,274 --> 00:27:54,386
Chris: Ooh, good guess, but no.

136
00:27:54,458 --> 00:27:54,834
Kayla: Okay.

137
00:27:54,882 --> 00:27:59,834
Chris: No. We are reaching into the bag, and we are pulling out the Cult Awareness Network.

138
00:27:59,922 --> 00:28:10,174
Kayla: That one also sounds culty. It sounds like the paid for by Americans with friends. Like, that's what it sounds like. It sounds like. That sounds good, but we'll get to that. Okay.

139
00:28:10,322 --> 00:28:24,558
Chris: Per Wikipedia, the Cult Awareness Network, or CAN, was an organization created by deprogrammer Ted Patrick that provided information on groups it considered to be cults, as well as support and referrals to deprogrammers. Sounds good, right?

140
00:28:24,614 --> 00:28:25,230
Kayla: Sounds great.

141
00:28:25,310 --> 00:28:32,934
Chris: Yeah. The Cult Awareness Network, founded in 1978, still exists today, which is great, because we need resources like that. Right?

142
00:28:33,062 --> 00:28:33,718
Kayla: Yeah.

143
00:28:33,854 --> 00:28:46,712
Chris: Resources that help us identify harmful groups that use coercive psychology and don't have our best interests at heart. Trusted organizations that are definitely not at all related to. We'll get to that.

144
00:28:46,776 --> 00:28:47,872
Kayla: Oh, God. What?

145
00:28:48,016 --> 00:28:50,736
Chris: First, let's talk about their timeline.

146
00:28:50,848 --> 00:28:56,288
Kayla: Okay. Do we have, like, a Best Friends Animal Society kind of thing on our hands?

147
00:28:56,424 --> 00:29:05,146
Chris: In a different way. Best Friends changed their name. These people changed something else.

148
00:29:05,218 --> 00:29:05,970
Kayla: Okay.

149
00:29:06,130 --> 00:29:54,068
Chris: Ted Patrick founded the organization in 1978. It was originally headquartered in Chicago, Illinois, and it collected information on what we've called new religious movements. Early on, they were really into that thing that I said Ted Patrick, their founder, was into: deprogramming. Later on, they allegedly moved away from these methodologies. The group was originally in favor of deprogramming, but distanced itself from the practice in the late seventies. And this is actually when they changed their name. So they were actually founded as the Citizens Freedom Foundation. They changed their name to the Cult Awareness Network and moved away from deprogramming practices. They started to become a bit of a subject of controversy, and they ended up, in sort of the late eighties into the nineties, getting into some legal trouble.

150
00:29:54,164 --> 00:29:55,000
Kayla: Okay.

151
00:29:55,590 --> 00:30:06,330
Chris: Some of the controversies that these guys got into were very much, like, he-said-she-said battles with a particular group that you may have heard of called the Church of Scientology.

152
00:30:06,750 --> 00:30:10,134
Kayla: What? No, I don't want to do this.

153
00:30:10,262 --> 00:30:52,448
Chris: The Cult Awareness Network was very much against the Church of Scientology. They said that a lot of the referrals that they got... so they, as a group, would get contacts from people that wanted help, that had, like, a friend or a sibling or a son or a daughter or whatever that they felt was a member of a cult, or becoming, you know, brainwashed by a cult. And they said that most of their contacts of this nature were from or about the Church of Scientology. And so the Church of Scientology would tend to fire back and say that the Cult Awareness Network was just, you know, persecuting a new religion. And they would kind of go back and forth on that.

154
00:30:52,544 --> 00:30:54,400
Kayla: I hate where this is going.

155
00:30:54,560 --> 00:31:30,630
Chris: This culminated in some lawsuits in 1991. So in 1991, over 50 Scientologists from across the country filed civil suits against CAN. And according to Wikipedia, many of these lawsuits used sort of the same carbon copy of claims, so they were all similar. So it basically seemed like a coordinated attack from Scientology. They filed dozens of discrimination complaints against CAN, and the Cult Awareness Network, which only had a budget of $300,000, basically was unable to keep up with all this litigation.

156
00:31:31,050 --> 00:31:32,310
Kayla: Understandable.

157
00:31:32,770 --> 00:31:51,542
Chris: In 1995, there was an additional case called the Jason Scott case. Jason Scott was a member of the Life Tabernacle Church, which is a Pentecostalist congregation. The lawsuit was brought against both CAN and one Rick Alan Ross. What, of the CEI?

158
00:31:51,686 --> 00:31:54,974
Kayla: Wait, so he was part of the CEI at the time? He wasn't part of this other thing that.

159
00:31:55,022 --> 00:32:02,214
Chris: No, there was no CEI at the time. At the time, Rick Alan Ross was just a guy that was a cult deprogrammer that was working with CAN.

160
00:32:02,342 --> 00:32:05,210
Kayla: I'm stressed out. This is stressing me.

161
00:32:05,710 --> 00:32:48,654
Chris: So there was a criminal trial in which Mister Ross was acquitted. But then there was also a civil trial in which Mister Ross was ordered to pay more than 3 million in damages. And CAN, having referred Ross, this guy, to Jason Scott's mother for the deprogramming, was ordered to pay a judgment of $1 million US. And this was the thing that was finally the straw that broke the camel's back for CAN financially. And the old Cult Awareness Network, which publicly opposed Scientology, as we mentioned, as well as other groups it considered to be cults, was driven into bankruptcy by litigation costs in 1996.

162
00:32:48,822 --> 00:32:51,326
Kayla: That is a very particular tactic.

163
00:32:51,438 --> 00:33:17,646
Chris: Subsequently, Church of Scientology attorney Stephen Hayes appeared in bankruptcy court and won the bidding for what remained of the organization for an amount of $20,000: the name, the logo, the phone number, office equipment, and so on. The Cult Awareness Network, an organization dedicated to tracking data and providing resources to fight cults, was now fully under the control of none other than the Church of Scientology.

164
00:33:17,798 --> 00:33:20,390
Kayla: That is horrifying.

165
00:33:20,550 --> 00:33:38,420
Chris: And I mentioned just a second ago the old Cult Awareness Network. And that's because literature on this now divides them up, even though it's, like, still one entity in accordance with how it was bought during bankruptcy, into the old Cult Awareness Network versus the new Cult Awareness Network.

166
00:33:40,080 --> 00:33:44,088
Kayla: One is the Cult Awareness Network, and the other is the Cult Awareness Network.

167
00:33:44,184 --> 00:33:52,256
Chris: Right. So remember when I said the community of anti cult and cult resource organizations was like a box of chocolates? You never know what you're gonna get.

168
00:33:52,408 --> 00:33:53,414
Kayla: Did you say that?

169
00:33:53,552 --> 00:33:58,858
Chris: No, I'm saying that now. I said it was a grab bag before, but now I'm saying it's box of chocolates.

170
00:33:58,914 --> 00:34:00,426
Kayla: It is like a box of chocolates.

171
00:34:00,578 --> 00:34:18,842
Chris: So, originally created as an anti-cult organization, it has been zombie-converted by Scientology to cover their damn asses. It's like that multi-level marketing bill that we talked about on the show, and they talked about on The Dream, where it's like, it's called the Anti-Pyramid Scheme Act, but it's really about enshrining legal protections for the MLM industry. It's like that.

172
00:34:18,946 --> 00:34:21,414
Kayla: It's America of friends or whatever. I keep saying.

173
00:34:21,522 --> 00:34:32,686
Chris: So. Always read the label, kids. And by the way, superfan Mikayla, thank you for tipping us off about the Cult Awareness Network. She emailed us about this a few months ago.

174
00:34:32,838 --> 00:34:33,518
Kayla: God bless me.

175
00:34:33,574 --> 00:34:57,427
Chris: Never had time to do an episode on it, but then it sort of came up in my research, just trying to find groups that we could talk to and, you know, bringing it back to the theme of this episode, find experts to help level set what we're doing on the show. Ran into this. Zooming into that coastline. So, anyway, that's definitely all I have to say about the Cult Awareness Network. And this has been Cult or Just Weird.

176
00:34:57,563 --> 00:35:00,707
Kayla: I feel like that's not true.

177
00:35:00,843 --> 00:35:03,955
Chris: Oh, wait, sorry. Yes, I'm not done yet.

178
00:35:04,147 --> 00:35:06,679
Kayla: How. No, I don't want to hear anymore.

179
00:35:07,659 --> 00:35:16,426
Chris: Stressed about that Cult Awareness Network? Noble, pure heroes that were subverted and corrupted by evil. Right? Is that what you think at this point?

180
00:35:16,498 --> 00:35:18,450
Kayla: I mean, that's kind of what you presented to me.

181
00:35:18,530 --> 00:35:19,578
Chris: Is that what it sounded like?

182
00:35:19,634 --> 00:35:20,242
Kayla: Yeah.

183
00:35:20,386 --> 00:35:23,146
Chris: Let's zoom a little bit closer into this coastline, shall we?

184
00:35:23,178 --> 00:35:24,138
Kayla: God damn it.

185
00:35:24,274 --> 00:35:42,994
Chris: Let us revisit the part of the story about the Cult Awareness Network's founding and talk for a minute about who one might say is the charismatic leader of this episode. I kind of glossed over him before, but actually, maybe I should talk about his prison time. Oh, oops. Maybe I should have led with that, shouldn't I?

186
00:35:43,122 --> 00:35:44,490
Kayla: What is happening?

187
00:35:44,610 --> 00:35:48,842
Chris: So, remember when I said the Cult Awareness Network was founded by one Ted Patrick?

188
00:35:48,906 --> 00:35:49,506
Kayla: Yes.

189
00:35:49,658 --> 00:36:50,754
Chris: Well, Mister Patrick, at a young age, had some personal experience with faith healers, witch doctors, allegedly voodoo practitioners. He had a lot of experience with cult-like groups himself. So Mister Patrick, as we mentioned, founded the Cult Awareness Network in 1978. And despite a lack of formal education and professional training in the area of cult deprogramming, Mister Patrick was hired by hundreds of parents and family members to deprogram, quote unquote, their loved ones. A high school dropout himself, Patrick based his techniques and practices on his own life experiences. This is from Wikipedia. And he was basically one of the first, if not the first, pioneers of this thing called deprogramming. And he used a very confrontational method. Patrick stood trial several times on kidnapping charges related to his activities. In 1980, Patrick was convicted of conspiracy, kidnapping, and false imprisonment.

190
00:36:50,922 --> 00:37:09,316
Chris: These charges were related to the abduction and attempted deprogramming of Roberta McElfish, a 26-year-old waitress in Tucson. In 1990, Patrick attempted to deprogram Elma Miller, an Amish woman who had joined a liberal sect. He was hired by her husband to return her to him and the Amish church.

191
00:37:09,468 --> 00:37:10,644
Kayla: I'm sorry.

192
00:37:10,772 --> 00:37:44,472
Chris: Criminal charges of conspiracy were filed against Miller's husband, brother, and two others, but were later dropped on her request to the prosecuting attorney, who decided not to charge Patrick. And the list goes on. So, yeah, this whole thing that we sort of glossed over, cult deprogramming, is maybe not the best thing. And there were followers in his footsteps. One of those, going back to that case we talked about, the Jason Scott case, was Rick Alan Ross. Now, I don't want to.

193
00:37:44,496 --> 00:37:45,872
Kayla: Are you about to, like, shatter?

194
00:37:46,016 --> 00:38:10,810
Chris: I don't. Here's the thing. Yes and no. So I don't... the CEI right now is a great resource, and I think Rick Alan Ross is probably a pretty good dude at this point, but he used to be a deprogrammer. And what I'm trying to say here is that deprogramming itself is maybe not great.

195
00:38:11,350 --> 00:38:12,090
Kayla: Always?

196
00:38:12,390 --> 00:38:33,340
Chris: So let's talk about... well, I would say yes, because there's a different term for it now, if it's not what it used to be. So I'm going to spare you the details, but there's a Wikipedia article about the Jason Scott case. But suffice it to say that it was a pretty sort of standard methodology for cult deprogramming, which involves kidnapping.

197
00:38:33,420 --> 00:38:34,040
Kayla: No.

198
00:38:34,580 --> 00:38:43,044
Chris: In fact, maybe I won't spare you the details, so I'm going to read you a section from the Wikipedia article about the Jason Scott case, about deprogramming.

199
00:38:43,132 --> 00:38:43,612
Kayla: Okay.

200
00:38:43,676 --> 00:39:10,660
Chris: Quote: To facilitate the deprogramming, Ross put together a two-man security team. The three traveled to the grandmother's home, locked the two youngest children in the basement, and following several days of argument and lecturing, the boys gave up their Pentecostal beliefs. For deprogramming Jason, Ross demanded a larger fee in view of the fact that he was powerfully built and legally an adult, increasing the risk of prosecution. Ross hired a karate black belt named Clark Rotroff to help with the operation.

201
00:39:10,740 --> 00:39:12,812
Kayla: I feel like I shouldn't have a black belt anymore.

202
00:39:12,956 --> 00:39:20,394
Chris: One evening, as Scott returned to the family residence, he was surprised by Ross's three associates, wrestled to the ground, and dragged into a waiting van.

203
00:39:20,562 --> 00:39:21,270
Kayla: What?

204
00:39:21,970 --> 00:40:05,634
Chris: Scott struggled, but was held down and handcuffed by the three men, gagged with duct tape from ear to ear, and had his ankles tied with rope as he lay face down with his cuffed hands beneath his body. One of the men, weighing 300 pounds, sat on top of his back. Scott's legs, upper body, and back had sustained multiple bruises and abrasions from being dragged to the van and across stairs, floors, and a patio. And it goes on to talk about where they brought him and how he had to basically endure being wrongfully imprisoned. Scott testified that he endured five days of derogatory comments about himself, his beliefs, his girlfriend, and his pastor, and diatribes by Ross about the ways in which Christianity and conservative Protestantism were wrong. He was intimidated, forced to watch videos on cults, and told his church was just the same.

205
00:40:05,802 --> 00:40:22,580
Chris: And it goes on, so on and so forth. In any case, that's why there was such a large settlement filed against CAN and Rick Alan Ross in this case. And I know I just said, like, I think he's a good guy now, but obviously those things I just described were not good.

206
00:40:22,660 --> 00:40:23,092
Kayla: No.

207
00:40:23,196 --> 00:40:28,436
Chris: And in general, that is how cult deprogramming had worked.

208
00:40:28,588 --> 00:40:29,884
Kayla: Why was that allowed?

209
00:40:29,972 --> 00:40:33,860
Chris: That's why Ted Patrick got in trouble with the law frequently.

210
00:40:33,940 --> 00:40:35,650
Kayla: Why did he want to do this?

211
00:40:35,780 --> 00:40:36,166
Chris: Who?

212
00:40:36,238 --> 00:40:37,286
Kayla: Ted Patrick.

213
00:40:37,438 --> 00:41:12,952
Chris: Because he had experience, he had personal experiences growing up with cults, like he himself had been the victim of cults, and so he felt like it was his duty to do that. I think, generally, Ted Patrick just had a tough time growing up. He was born in what he referred to as a red-light district in Chattanooga, Tennessee, and he was just surrounded by a lot of criminal elements. And also, remember I said he was taken to a bunch of faith healers, witch doctors, and voodoo practitioners? That was all because he had a speech impediment. And so they were trying to take him to all these quacks and weirdos to heal him.

214
00:41:13,056 --> 00:41:17,180
Kayla: An improper treatment for speech impediments.

215
00:41:17,560 --> 00:41:55,200
Chris: Yes. So I think that's just one of those things where he had bad experiences himself and then decided to have that be his life's work. But he didn't have any formal training. He didn't have any training as a counselor or mental health professional or anything like that. He just based it on his own experiences, and it became this thing where you kidnap people and deprogram them. Now, the thing is, let's talk a little bit about the court cases. None other than the ACLU would fight against these deprogrammers in court, because obviously kidnapping someone falls under depriving them of their civil liberties.

216
00:41:55,280 --> 00:41:55,776
Kayla: Yeah.

217
00:41:55,888 --> 00:42:15,952
Chris: And the argument the other way, though, was that, well, these parents believe that their kids are in immediate danger, so that's why it's okay for us to do this. Right? That's why it's okay for us in these cases. In these extreme cases, they're an immediate danger. We can deprive them of their civil liberties. But, you know, that's something that is rightfully difficult to do.

218
00:42:16,016 --> 00:42:16,528
Kayla: Right.

219
00:42:16,664 --> 00:42:39,194
Chris: Over the years, that opinion has sort of changed in the court, and now claims of brainwashing are actually inadmissible in court. They're treated sort of like lie detectors. Like, they're not really... they're not real. Yeah. One of the most helpful things I read was a transcript on Skeptoid.com entitled "Brainwashing and Deprogramming," subtitled "Both brainwashing and its opposite, deprogramming, are equally mythological."

220
00:42:39,282 --> 00:42:39,978
Kayla: Wow.

221
00:42:40,154 --> 00:43:25,272
Chris: So what they talk about in this episode, they bring up wartime brainwashing and how this sort of... this concept of people that went over in the Korean War and the Vietnam War and were POWs would get captured by the enemy. They would essentially undergo, you know, this torture and imprisonment, and essentially, like, you know, programming them to say, like, oh, I renounce all of my patriotism and what Americans are doing is wrong, and, you know, blah, blah. So, like, they would... and then they would send those videos out and whatnot. And that sort of became this thing in people's minds. Like, oh, my God. People can be brainwashed.

222
00:43:25,336 --> 00:43:25,944
Kayla: Right?

223
00:43:26,112 --> 00:43:39,632
Chris: And then that sort of entered, and that's how it entered the public sphere was sort of from that. And then you have the rise of all these new religious movements, which are, like, very unfamiliar, and then some of them are actually bad. Like Jonestown.

224
00:43:39,736 --> 00:43:40,096
Kayla: Right?

225
00:43:40,168 --> 00:44:02,854
Chris: And you start putting those two and two together, and you say, okay, people can get brainwashed. And, like, maybe it is worth doing something... maybe it is worth doing something pretty drastic to help people out of that. But it's often ineffective. Like, people would even come back, even POWs who would make these tapes denouncing America would come home, and they'd be like, oh, I was just saying that so they'd stop torturing me.

226
00:44:02,902 --> 00:44:03,566
Kayla: Right, right.

227
00:44:03,638 --> 00:44:47,800
Chris: It wasn't actually effective. And holding someone captive, forcing them to listen to your point of view, to challenge something that maybe you came to believe of your own free will, sounds maybe more like actual brainwashing than anything else, and that's deprogramming. Right? Like, what's actually happening here? Like, is the cult brainwashing bad, or is it the capture and deprogramming? Is that bad? Like, they're both bad, but one doesn't sound necessarily better to me than the other one. Like, the deprogramming, depriving someone of their civil liberties, is more like a POW situation even than going into some coercive thought process.

228
00:44:47,920 --> 00:44:48,560
Kayla: Right?

229
00:44:48,720 --> 00:45:07,910
Chris: So let's revisit that story, then, about the Cult Awareness Network's takeover. Scientology destroyed the Cult Awareness Network and then crawled into its empty husk like a victorious hermit crab. But in my mind, the Cult Awareness Network was only able to be destroyed because it sowed the seeds of its own destruction.

230
00:45:08,000 --> 00:45:08,682
Kayla: Right?

231
00:45:08,866 --> 00:45:34,418
Chris: It had a history of deprogramming, of referrals for deprogramming. There were some accusations of them getting kickbacks for these referrals. So in other words, like, someone would call them, say, help. I need help for this person that I love, that's in blah, blah. And then they would refer a deprogrammer, and then they would get a kickback for that. And oftentimes, deprogramming is very expensive because you have to hire physical security to hold someone against their will.

232
00:45:34,514 --> 00:45:35,482
Kayla: This is insane.

233
00:45:35,626 --> 00:45:55,754
Chris: So the thing is, who's the expert here? Right? Like, this is a complex subject. And if you are looking for help on something like this, if you have someone you love that is, you know, part of what you believe is a cult and you're looking for help, it's a tough situation. It's tough to parse the correct information.

234
00:45:55,882 --> 00:45:56,338
Kayla: Right?

235
00:45:56,434 --> 00:46:45,656
Chris: Yeah. So Mister Patrick, of course, portrayed himself as this expert, as this deprogrammer, but he actually didn't have any training at all in mental health care. So that's not really someone, maybe, that you want doing something like this. And again, who knows if it's even brainwashing. We mentioned a few of these things that were actually just, like... we mentioned that case of the Amish lady whose husband basically just wanted to kidnap her back from some, quote unquote, liberal sect. And we didn't mention this one because I don't think it actually went to court, but Ted Patrick, in 1980, was paid $27,000 to deprogram Susan Wirth, a 35-year-old teacher living in San Francisco. In this case, Patrick was hired by her parents, who objected to her involvement in leftist political activities.

236
00:46:45,808 --> 00:46:49,560
Kayla: So it's just kind of like, I don't like what they think, so I'm gonna.

237
00:46:49,720 --> 00:47:04,904
Chris: So I'm gonna hire a deprogrammer and kidnap them. Right. And there were also accusations of... maybe I shouldn't even say on the show, because I forget who I read this about specifically, but, like, abuses, too, of course. Like, sexual abuse.

238
00:47:05,072 --> 00:47:06,512
Kayla: It's kind of part and parcel of this.

239
00:47:06,536 --> 00:47:13,296
Chris: Right. So this process, actually, in this case, involved handcuffing her to a bed for two weeks and not giving her any food.

240
00:47:13,408 --> 00:47:14,260
Kayla: Jesus.

241
00:47:14,880 --> 00:47:24,944
Chris: She was later released and eventually spoke out against deprogramming practices, but she declined to press legal charges against either her parents or Ted Patrick.

242
00:47:24,992 --> 00:47:25,336
Kayla: Wow.

243
00:47:25,408 --> 00:48:18,056
Chris: So, yeah, so you have this guy who's not really an expert in this subject, who didn't train in mental health care. And then you have the practice being abused by people, some maybe genuinely worried about their spouse or their loved one or whatever, and some who maybe just don't like that their loved one is espousing beliefs that are different from theirs. So, yeah, like I said, I think that the cult awareness network... yes, it's super weird that Scientology went to battle against them and destroyed them in court, as Scientology does, and then backed into that husk, and now it's like the veneer for Scientology's own agenda. But would they have been able to do that if CAN wasn't getting in legal trouble for its own illegitimate practices?

244
00:48:18,128 --> 00:48:19,088
Kayla: Right, right.

245
00:48:19,264 --> 00:48:35,996
Chris: So before we get into the criteria-and-judgment finale part of the episode, I'm just trying something slightly different since it's the new season. Crazy change. I'm citing my sources at the end instead of the beginning. Mind blown. Right. Radical new direction for this season. Sources at the end.

246
00:48:36,108 --> 00:48:37,316
Kayla: Oh, yeah.

247
00:48:37,348 --> 00:48:37,972
Chris: Isn't that exciting?

248
00:48:38,036 --> 00:48:39,084
Kayla: Very mind blowing.

249
00:48:39,212 --> 00:49:22,436
Chris: Yeah. Anyway, so here are my research sources. Obviously, there are Wikipedia articles that I cited a bunch from: on the cult awareness network, the new cult awareness network (since there are two), Ted Patrick, and a few others. All of those sites I listed earlier in the show, when I was talking about looking around for an expert to come on the show, are sources; I took a few articles from some of those places on how they classify and categorize cults. I found Steve Hassan's material on destructive versus benign cults particularly helpful. So he has this whole conception that cults can be either destructive or benign. So you might call a benign cult something like what, for us, is "just weird." Like, we sort of have the same thing.

250
00:49:22,468 --> 00:49:23,964
Kayla: Right. Benevolent cult.

251
00:49:24,052 --> 00:49:35,708
Chris: Yeah. Where we have these criteria, like harm caused and, you know, anti-factuality. Like, if you take those two bits away and it's just something with, like, a lot of ritual and a charismatic leader.

252
00:49:35,844 --> 00:49:36,476
Kayla: Right.

253
00:49:36,628 --> 00:49:38,404
Chris: I don't know, like maybe it's benign. Right.

254
00:49:38,412 --> 00:49:40,212
Kayla: We talked about Cicada 3301.

255
00:49:40,276 --> 00:49:57,330
Chris: Right. Yeah, exactly. Now, the thing is, Steve Hassan has faced his own sort of accusations of harmful deprogramming in the past as well. I don't want to really get into it. I don't think it is quite as bad as some of the stuff we talked about in the program today. But it's more stuff like practicing without a license.

256
00:49:57,750 --> 00:49:58,558
Kayla: It's still bad.

257
00:49:58,654 --> 00:50:17,466
Chris: I know, but that's the thing: all these guys are sort of practicing mental health care without a license. And I think that's what he got in trouble for in Massachusetts. So I guess he's not going to come on the show anymore now. Oh, well. He generally does seem to be doing good work now, though, from what I can tell, and is a strong voice in favor of what's now called exit counseling instead of deprogramming.

258
00:50:17,538 --> 00:50:17,962
Kayla: Right.

259
00:50:18,066 --> 00:50:41,946
Chris: So deprogramming is more confrontational. It involves kidnapping more often than not. It involves, you know, basically berating somebody until they, quote unquote, renounce their beliefs. Exit counseling sounds to me more like traditional counseling, where you're trying to talk with someone and reason with someone on their own terms, like an adult.

260
00:50:42,008 --> 00:50:47,230
Kayla: Right. You're still giving them the dignity of being a human.

261
00:50:47,350 --> 00:51:07,286
Chris: Right. And so, from what I can tell, he's a strong voice in favor of exit counseling now, instead of deprogramming. The last source is that transcript I mentioned, the Skeptoid episode on brainwashing and deprogramming, subtitled, again, "both brainwashing and its opposite, deprogramming, are equally mythological." And by the way, Skeptoid made all of its episodes free during the current lockdown situation. So go take a listen.

262
00:51:07,358 --> 00:51:08,462
Kayla: Thanks, Skeptoid.

263
00:51:08,646 --> 00:51:20,142
Chris: Anyway, it's that moment you've been waiting for all the way since 2019. Time to bust out the criteria. And remember, this is for entertainment. We are not experts.

264
00:51:20,326 --> 00:51:21,414
Kayla: We're not brainwashing anyone.

265
00:51:21,462 --> 00:51:29,886
Chris: We're not brainwashing anyone. And we will probably not hire anyone to come kidnap you. Although actually, that could be a good way to get new listeners. Maybe we should do that.

266
00:51:29,958 --> 00:51:33,478
Kayla: Yeah. It wouldn't be called deprogramming then. It would be called programming.

267
00:51:33,534 --> 00:51:46,848
Chris: Oh, yeah, we'd be doing the brainwashing. But, yeah. So we are not experts. These are for fun, like we said. And the fun is mostly about making the paper sounds.

268
00:51:46,944 --> 00:52:00,392
Kayla: Yeah. Okay, so before we do this, what are we judging here? The organization that was the cult awareness network, or the concept of deprogramming?

269
00:52:00,496 --> 00:52:07,130
Chris: Kind of both? Well, they're sort of linked. Right. The cult awareness network was founded by the originator of deprogramming.

270
00:52:07,170 --> 00:52:46,170
Kayla: Okay. Okay. I do want to say that, like, this is a horrendous, horrifying practice. And I also understand it's definitely been abused in certain cases, like parents just being like, I don't like my leftist daughter being leftist. I also think that, like, if you were in a true situation where a loved one is in, like, a Jonestown or Hale-Bopp situation, where you're like, this is a cult, this is a dangerous cult, what do I do? I understand why you would go to these kind of extreme, maybe harmful lengths, because you just... there's not a lot.

271
00:52:46,210 --> 00:52:47,218
Chris: You don't know what to do. Yeah.

272
00:52:47,234 --> 00:52:52,938
Kayla: You don't know what to do. How do you save your loved one from being a part of this thing that is destructive?

273
00:52:53,074 --> 00:52:53,722
Chris: Right. Yeah.

274
00:52:53,746 --> 00:53:02,222
Kayla: That's part of what almost, like, gives the deprogramming that much more power, because it's like they're preying on the desperate.

275
00:53:02,326 --> 00:53:04,110
Chris: Yeah, we've never seen that in this show before.

276
00:53:04,150 --> 00:53:12,414
Kayla: Yeah. Maybe that'll come up when we start talking about these. Expected harm: high, financially and physically.

277
00:53:12,542 --> 00:53:23,912
Chris: Yeah. Because it's costly and the person that it is targeting is literally deprived of their civil liberties, which is why the ACLU gets involved legally. Yeah.

278
00:53:23,936 --> 00:53:24,752
Kayla: And honestly, like.

279
00:53:24,816 --> 00:53:29,640
Chris: And sometimes physically abused, and mentally abused. And also ineffective.

280
00:53:29,720 --> 00:53:42,792
Kayla: Right. And I can't imagine that the people, like, the victims of deprogramming... I can't imagine that this whole scenario helps the relationship with their loved one who is having this done to them.

281
00:53:42,976 --> 00:53:45,888
Chris: I could probably find an example of where it did, but I'm not.

282
00:53:45,904 --> 00:53:55,852
Kayla: You know, I'm just saying it probably harms that relationship. Probably, I would guess. Is it niche? I don't really know how to answer that one. Probably.

283
00:53:55,956 --> 00:54:24,766
Chris: It's pretty niche. I was reading some things that were basically saying it became sort of a fringe-of-society thing that was happening around, like, the seventies and eighties. Like, that was sort of a time when, again, you know, we were fresh off having a bunch of, like, new age religious stuff come to the forefront, and then, like, Jonestown happened, and then, like, the Rajneeshee thing happened. So I think it was, like, sort of part of the zeitgeist, but still pretty niche.

284
00:54:24,838 --> 00:54:40,166
Kayla: Okay. We can call it niche. I'm down with that. Anti-factuality. I mean, they are doing a thing that's not effective, but I also don't know if they know that it's not.

285
00:54:40,198 --> 00:54:56,868
Chris: I think there's some motivated reasoning here, because to me, this is sort of the "I don't need to be an expert" attitude. Like, if we're gonna lead off the show and talk about how shitty it is that people today are, like, "I don't need dumb math academics saying that, you know, Covid is bad"...

286
00:54:56,964 --> 00:54:57,316
Kayla: Right?

287
00:54:57,388 --> 00:55:11,102
Chris: Like, if we're gonna, you know, say that those people are bad, because today it's all the deniers and all the conspiracy mongers, and part of that is that there's this disdain for expertise.

288
00:55:11,206 --> 00:55:11,878
Kayla: Right.

289
00:55:12,054 --> 00:55:14,094
Chris: Then I think we have to call this antifactual.

290
00:55:14,182 --> 00:55:21,054
Kayla: Okay, I'm down with that. Next one: percentage of life consumed. That's a tough one, too.

291
00:55:21,182 --> 00:55:22,254
Chris: Yeah, this is.

292
00:55:22,422 --> 00:55:26,822
Kayla: Man, it's not a lot. Even if you get deprogrammed. Like, two weeks is.

293
00:55:26,966 --> 00:55:28,670
Chris: I'm just saying, as a percentage. Sure. Yeah.

294
00:55:28,710 --> 00:55:34,726
Kayla: Two weeks is not a large percentage, but the psychological toll might be something that's, like, much.

295
00:55:34,918 --> 00:55:36,662
Chris: While it's happening, it's a hundred.

296
00:55:36,806 --> 00:55:41,838
Kayla: While it's happening, it's a hundred. And I'm sure there's, like, a lot of really terrible fallout after that.

297
00:55:41,854 --> 00:55:56,758
Chris: You're stuck. But by the same token, could you say, like, what if it's successful. Like, I didn't read much about successful attempts, but if the attempt is successful, maybe it frees you from the cult, and then percentage of life consumed goes down. You know, I feel like the percentage.

298
00:55:56,774 --> 00:56:03,744
Kayla: Of life consumed question here. It's kind of, like, there. It's like we've done this before: there, but not, like, there.

299
00:56:03,792 --> 00:56:06,768
Chris: Sometimes a lot of there can be present, but not needed.

300
00:56:06,864 --> 00:56:11,104
Kayla: Yeah, but the next one. Oh, baby. Ritual abounds.

301
00:56:11,192 --> 00:56:11,832
Chris: Mmm.

302
00:56:11,976 --> 00:56:13,648
Kayla: Ritual abounds.

303
00:56:13,704 --> 00:56:36,602
Chris: Oh, yeah. Kidnapping someone, tying them to a bed, depriving them of food. Physical abuse. I don't know, actually. Maybe physical abuse isn't. That might not be an appropriate thing to call ritual. But certainly the, like, the confrontational, sort of, like, you know, yelling at you until you agree with me seems real ritualistic, potentially, I think.

304
00:56:36,626 --> 00:56:47,018
Kayla: So. I'm gonna say the ritual is pretty culty, pretty high. The next one, I think I need your help with. The next one is charismatic leader. And you talk about Ted Patrick.

305
00:56:47,114 --> 00:56:51,390
Chris: Yeah. But, like, it's tough to say. Cause there's not a lot on him.

306
00:56:51,770 --> 00:56:52,352
Kayla: Right.

307
00:56:52,466 --> 00:57:26,998
Chris: You know, he's not, like, the most famous person to have ever existed. Like, maybe niche actually should score higher. But he did sort of lead the wave of a bunch of other deprogrammers. So maybe he led by example. I don't know. Like, I would say there's definitely a presence of a charismatic leader, but I don't know if he was, like, charismatic in the sense of recruiting, or if he just did something that was maybe seen as needed and other people joined, or maybe lucrative and other people joined. I'm actually not sure how lucrative it is, by the way. Just because something's expensive, it doesn't mean it's lucrative.

308
00:57:27,054 --> 00:57:27,462
Kayla: Right.

309
00:57:27,566 --> 00:57:45,010
Chris: If you have to pay a lot of money to, like, you know, your bodyguard... yeah, security... then, you know, you may not be making a lot of money. I don't know. But anyway, I would say there is a charismatic leader, but I don't know if it's, like, as charismatic as, like, a Mary Kay or a Teal Swan.

310
00:57:46,790 --> 00:57:48,490
Kayla: A little bit torn on this one.

311
00:57:48,830 --> 00:57:56,574
Chris: Where does that leave us? I mean, I guess we should also probably say, like, new CAN versus old CAN. I don't know. Like, we didn't say that much about new CAN because I don't know if.

312
00:57:56,582 --> 00:58:02,038
Kayla: I want to make a call about new CAN. Personally, I work in Hollywood. Just saying.

313
00:58:02,134 --> 00:58:09,744
Chris: Oh, right, sure. Well, they're not, like, officially church of Scientology. They're just, like, associated.

314
00:58:09,902 --> 00:58:10,760
Kayla: Okay.

315
00:58:12,420 --> 00:58:14,676
Chris: Okay. Yeah, let's make a call on old CAN.

316
00:58:14,748 --> 00:58:17,800
Kayla: That's about as much as I'm gonna say on that. Old CAN.

317
00:58:18,100 --> 00:58:22,040
Chris: Wow, this is starting to get creepy up in here.

318
00:58:22,580 --> 00:58:27,828
Kayla: Old CAN? Probably cult.

319
00:58:27,884 --> 00:58:38,950
Chris: I think it's a cult. Yeah, I think. Yeah, I think you have a group that forms around one guy's sort of vision.

320
00:58:39,070 --> 00:58:46,902
Kayla: Right? It's like his life experience, too, which is real Teal Swan-ian. Like, here's my background, and my background is what makes me the one that can tell you this and do the thing.

321
00:58:46,966 --> 00:59:05,010
Chris: Right? And it's based on this, like, non-expert, no-training, no-mental-health-professionals-anywhere-in-sight, practicing-without-a-license type stuff. There's a lot of anti-factuality there. There's a lot of ritual there. There's harm there. There's a charismatic-ish leader. I think it's a cult.

322
00:59:05,170 --> 00:59:05,762
Kayla: I think it's a cult.

323
00:59:05,786 --> 00:59:08,546
Chris: I think the cult awareness network is a cult. Is a cult.

324
00:59:08,658 --> 00:59:09,458
Kayla: Oh, shit.

325
00:59:09,554 --> 00:59:14,242
Chris: Very meta. The call is coming from inside the house. Oh, the cult. The cult.

326
00:59:14,266 --> 00:59:15,082
Kayla: The cult is coming from inside.

327
00:59:15,106 --> 00:59:20,306
Chris: The cult is coming from this cult. Everything is cult. All is cult. Welcome to 2020.

328
00:59:20,378 --> 00:59:24,882
Kayla: Yeah, 2020 is a cult. I have decided just now, but, yeah.

329
00:59:24,906 --> 00:59:35,492
Chris: I thought it was a pretty appropriate one to kick off the season with. To talk about is the cult awareness thing a cult? Right. And to talk about scary.

330
00:59:35,596 --> 00:59:37,644
Kayla: Why did you do a scary one?

331
00:59:37,812 --> 00:59:40,892
Chris: Because I selected this topic way, like, months ago.

332
00:59:40,996 --> 00:59:42,796
Kayla: Right. I know. I'm, like, thinking about all of my.

333
00:59:42,828 --> 00:59:44,268
Chris: Topics now and I'm just gonna rethink them all.

334
00:59:44,284 --> 00:59:48,252
Kayla: Do I wanna do these? I think most of mine are, like, fairly tame.

335
00:59:48,396 --> 00:59:48,932
Chris: That's good.

336
00:59:48,996 --> 00:59:54,044
Kayla: Like, good. I've got really. I got some goodies in there I'm very excited for.

337
00:59:54,172 --> 00:59:58,600
Chris: I'm pretty excited about mine too, including, I think this one is pretty cool.

338
00:59:59,460 --> 01:00:00,596
Kayla: Very meta. I like.

339
01:00:00,668 --> 01:00:07,720
Chris: And again, thanks to Michaela for tipping us off about the weird, like, Scientology takeover of CAN back a few months ago.

340
01:00:08,500 --> 01:00:10,972
Kayla: The tip was a few months ago. Not the takeover.

341
01:00:11,036 --> 01:00:13,556
Chris: The tip was. Yes. Correct. Just the tip.

342
01:00:13,748 --> 01:00:15,760
Kayla: No. Inappropriate.

343
01:00:16,380 --> 01:00:17,652
Chris: We don't say things like that.

344
01:00:17,676 --> 01:00:18,388
Kayla: You're a cult.

345
01:00:18,524 --> 01:00:24,196
Chris: You're a cult. Anything else you want to add to the first episode of our second season?

346
01:00:24,308 --> 01:00:41,854
Kayla: Thanks for coming back, guys. Thanks for joining us. If you're brand new, if COVID-19's got you down, we got your back. Yeah. We are all in this fucked up boat together. Listen to our podcast.

347
01:00:41,942 --> 01:00:52,312
Chris: Listen to our podcast. Build your trust network. Listen to experts. Understand that... I don't know. I don't know what I was gonna say about that.

348
01:00:52,376 --> 01:00:57,544
Kayla: If Twitter is doing more harm than good for you in your life, just delete it off your phone. Just do it.

349
01:00:57,672 --> 01:01:01,256
Chris: Like, truly. An episode about Twitter? Yeah. Okay.

350
01:01:01,408 --> 01:01:05,192
Kayla: We'll do that one day. Thank you for taking us on that journey.

351
01:01:05,336 --> 01:01:11,984
Chris: Thank you for going on that journey with me. And I will look forward to speaking with you on the next episode.

352
01:01:12,112 --> 01:01:12,608
Kayla: Yeah.

353
01:01:12,704 --> 01:01:15,408
Chris: Of cult or just weird?

354
01:01:15,504 --> 01:01:16,504
Kayla: We didn't say our names.

355
01:01:16,552 --> 01:01:19,584
Chris: Oh, we didn't say our names. Shit. The episode will never end.

356
01:01:19,632 --> 01:01:20,360
Kayla: Who are you?

357
01:01:20,480 --> 01:01:21,072
Chris: I'm Chris.

358
01:01:21,136 --> 01:01:21,840
Kayla: And I'm Kayla.

359
01:01:21,880 --> 01:01:24,700
Chris: And this has been cult or just weird?