Join the conversation on Discord!
Sept. 21, 2021

S3E13 - The Crusaders: Reprise (QAnon update w QAA's Travis View)

Cult or Just Weird

Wanna chat about the episode? Or just hang out?

Come join us on discord!

 

---

All of this has happened before, and it will all happen again.

Chris & Kayla seek an expert to help deliver an update on one of last season's very timely topics.

Special thanks to guest Travis View (@travis_view on Twitter) from the QAnonAnonymous podcast for his time and insights for this episode!

---

*Search Categories*

Anthropological; Internet culture; Common interest / Fandom; Destructive; Conspiracy Theory

---

*Topic Spoiler*

QAnon, pt 6: update a year later

---

*Further Reading*

All of our research this episode involved chatting with Mr. View, co-host of QAnon Anonymous:

www.qanonanonymous.com

https://podcasts.apple.com/us/podcast/qanon-anonymous/id1428209307

 

---

*Patreon Credits*

Michaela Evans, Heather Aunspach, Annika Ramen, Zero Serres, Alyssa Ottum

<<>>

Rebecca Kirsch, Pam Westergard, Ryan Quinn, Paul Sweeney, Erin Bratu, Liz T, Lianne Cole, Samantha Bayliff, Katie Larimer, Fio H, Jessica Senk, Proper Gander, Kelly Smith Upton, Nancy Carlson, Carly Westergard-Dobson, Benjamin Herman, Anna Krasner

Transcript
1
00:00:00,320 --> 00:00:04,530
Chris: Yeah. How's it going? Thanks for. Thanks for taking the time to talk to us.

2
00:00:05,030 --> 00:00:24,118
Travis View: Yeah, yeah, it's my pleasure. My pleasure. Yeah. I listened to some of the podcast episodes you did on QAnon before. You went really deep, deep dive all the way down to ancient blood libel and stuff. But that kind of stuff is important to give all of this context.

3
00:00:24,294 --> 00:00:26,888
Chris: Yeah. We're sort of addicted to context. It's really bad.

4
00:00:27,014 --> 00:00:29,080
Kayla: It's. It's hard to know what to cut.

5
00:00:30,660 --> 00:01:21,470
Travis View: No, it is important, because I feel like a lot of people, especially when I talk about QAnon, talk about this kind of thing as if it's a completely new phenomenon. Like this style of conspiracism just came out of the blue. I often like to point out, to people's surprise and chagrin, that I wouldn't even call QAnon the most successful conspiracist political movement in the United States. Because in the 19th century, there was the anti-Masonic movement, which, at its peak, captured 10% of the House of Representatives. There were even two members of the Anti-Masonic Party who were governors, and the anti-Masonic presidential candidate even won a single state. So they're not quite.

6
00:01:21,570 --> 00:01:31,662
Travis View: We're not quite at that point where we have QAnon-specific people getting governorships and being serious political candidates. But fingers crossed.

7
00:01:31,806 --> 00:01:32,970
Chris: Knock wood, please.

8
00:01:33,630 --> 00:01:38,582
Travis View: Yeah, knock wood. Good point. It took them a few years to get to that point, so we'll see how that goes.

9
00:01:38,686 --> 00:01:43,606
Chris: Yeah. And anti-Masonic, it's like, that's a half step away from anti-cabal. Right? So.

10
00:01:43,678 --> 00:02:14,960
Travis View: Yeah, yeah. The anti-Masonic movement, I mean, they basically believed that the Masons and the Illuminati were plotting to control the world, and they needed to be put down by political means. It came out of this religious movement, sort of these evangelical Christians who thought that the Masons were basically trying to destroy religion. And actually, there was this Roger Stone of the era named Thurlow Weed, who sort of harnessed the energy of this movement and turned it into a political movement.

11
00:02:15,470 --> 00:02:17,342
Chris: That is. That's fascinating, man.

12
00:02:17,446 --> 00:02:19,010
Kayla: We never learn, do we?

13
00:02:19,670 --> 00:02:20,382
Travis View: No. No.

14
00:02:20,446 --> 00:02:30,318
Chris: I mean, I read something one time that was like: those who don't learn history are doomed to repeat it, and those who do learn history are forced to watch everyone else repeat it.

15
00:02:30,454 --> 00:02:30,838
Kayla: Yeah.

16
00:02:30,894 --> 00:02:41,770
Travis View: Right. Yeah, yeah. It's a lesson. Like, humans are humans, with human needs and human sort of social bonds, and we do the same shit over and over again.

17
00:03:18,400 --> 00:03:25,232
Chris: Welcome to Cult or Just Weird. I'm Chris, a game designer slash data scientist by day, podcaster by night.

18
00:03:25,336 --> 00:03:30,144
Kayla: I'm Kayla. I'm a tv writer. And shockingly, I'm also a podcaster.

19
00:03:30,232 --> 00:03:30,576
Chris: What?

20
00:03:30,648 --> 00:03:31,328
Kayla: I know.

21
00:03:31,464 --> 00:03:34,520
Chris: Maybe I didn't actually need to say that since this is a podcast.

22
00:03:34,600 --> 00:03:35,420
Kayla: Too late.

23
00:03:36,120 --> 00:03:39,848
Chris: Anyway, as you just heard, today we actually have a third person to chat with.

24
00:03:39,904 --> 00:03:40,700
Kayla: We do?

25
00:03:41,180 --> 00:03:41,676
Chris: Yes.

26
00:03:41,748 --> 00:03:42,380
Kayla: Who's that?

27
00:03:42,460 --> 00:03:53,764
Chris: That's. We were lucky enough to snag some time with Travis View of the very awesome, very insightful, and very popular QAnon Anonymous podcast, which is amazing.

28
00:03:53,892 --> 00:04:03,052
Kayla: Even though I was literally a participant in this interview, I am feigning surprise and delight. Okay, so why are we talking to Travis today?

29
00:04:03,196 --> 00:04:54,536
Chris: I'm so glad you asked, Kayla. We here at Cult or Just Weird produced a five-part series on QAnon last year. If you're interested in QAnon, its context, its origins, how it uses powerful psychological devices, how it affects believers and their families, and ways for you to maintain some hope and sanity in the face of such torment, then definitely go check out those episodes. They comprise episodes 16 through 20 of our second season, and they all share the moniker of, quote, the Crusaders, like this episode does. Anyway, being that, first of all, we spent a large chunk of the show's time and energy talking about QAnon, and secondly, QAnon is a bit of a current-event type thing, right? In other words, it's still very much around, it's still very much evolving. In September of 2021, the story is still unfolding.

30
00:04:54,648 --> 00:04:59,048
Chris: We thought it would be good to devote this episode to doing a Q update for y'all.

31
00:04:59,144 --> 00:05:08,558
Kayla: We also got some positive feedback on the Behind Blue Curtains episode, where we had a great chat with Molly Maeve Egan in lieu of discussing a specific group. So we figured, let's go ahead and try that format again.

32
00:05:08,654 --> 00:05:30,182
Chris: So then we figured, who better to talk to for an update on what's been happening with all aspects and angles of QAnon than one of the hosts of the aforementioned QAnon Anonymous podcast? They put on a really good show over there, by the way, and there's an absolute ton of overlap between QAA and Cult or Just Weird. So if you aren't already listening to them, you absolutely should go check them out. It'll be right up your alley.

33
00:05:30,246 --> 00:05:37,934
Kayla: Can confirm. Great show. They talk about the secret queens of Canada. They talk about QAnon getting kicked off of Reddit. They talk about it all.

34
00:05:37,982 --> 00:05:39,318
Chris: Ivermectin, horse paste.

35
00:05:39,374 --> 00:05:39,790
Kayla: All of it.

36
00:05:39,830 --> 00:05:40,390
Chris: Hell, yeah.

37
00:05:40,470 --> 00:05:51,278
Kayla: So then, without further ado, here's a September 2021 update on QAnon, courtesy of QAnon Anonymous co-host Travis View. Can you introduce yourself for our listeners?

38
00:05:51,374 --> 00:06:04,820
Travis View: My name is Travis View. That's not my real name, it's a pseudonym, but if you're so inclined, you can search my name and you'll see that I've been doxxed by the Washington Post. Yeah. And I am the co-host of the QAnon Anonymous podcast.

39
00:06:05,240 --> 00:06:15,256
Chris: Yeah, I feel like you're kind of a bit of a celebrity, like an extremism and disinformation celebrity. So it's really cool that you're on the show. That's crazy. That's a thing, yes.

40
00:06:15,448 --> 00:06:26,490
Travis View: No, no. It really signals how fallen this world has become. I often joke I mourn the loss of the world that had little use for my interests. But the problem is.

41
00:06:26,600 --> 00:06:27,702
Chris: Great joke.

42
00:06:27,886 --> 00:06:44,710
Travis View: But the problem is that, yeah, extremism, which used to be a sort of real fringe topic, made its way into a very mainstream kind of topic, because it was basically promoted by mainstream political figures like Trump.

43
00:06:44,870 --> 00:06:52,552
Chris: So how did you become interested in QAnon then, and disinformation and extremism? And how did your podcast, QAnon Anonymous, get started?

44
00:06:52,726 --> 00:07:53,288
Travis View: I suppose my base interest really came out of my interest in skepticism generally, especially back in the early two thousands. I was really interested in science and epistemology and empiricism. And I was really interested in seeing that there were some really anti-science movements that were quite popular. Specifically, I'm thinking of the intelligent design creationism movement. There was this institution out in Seattle called the Discovery Institute that was pushing very hard to get the pseudoscience called intelligent design creationism placed in schools. It really blew me away that there was this coordinated campaign to fight against basic, established, well-supported scientific principles. Because generally, I feel very grateful to live in a time in which there's a lot more technology and science, because I understood it could have been possible.

45
00:07:53,344 --> 00:08:40,328
Travis View: I'd have been born in a different time, where I was an illiterate subsistence farmer. But that didn't happen. So, yeah, it was very strange to me that people would just reject scientific principles and the principles of skepticism in favor of these bizarre fantasies. So I tried to learn: why do these people believe these wild things? Why do some people believe, as is the case with young earth creationists, that the world is 10,000 years old? Which is not just a little wrong, it's absurdly wrong, and it's wrong in a million different ways. And, you know, that ties into sort of sociological, psychological phenomena. So that's my base interest. And I guess my more specific interest in QAnon came when I started noticing.

46
00:08:40,424 --> 00:09:24,418
Travis View: I was kind of aware of QAnon as a phenomenon in early 2018. But, you know, I kind of dismissed it as a weird chan thing. You know, it's just one of those things that people get really into. It wasn't really worth paying attention to. But what caused me to change my mind was that I noticed a tweet from Charlie Kirk, who is a fairly mainstream conservative commentator. He's met with Trump. He speaks on campuses all over the country. He appears on Fox News fairly regularly. Well, Charlie Kirk tweeted this tweet that contained a lot of bogus statistics about human trafficking, implying that Trump was doing a much better job of fighting human trafficking than Obama ever did.

47
00:09:24,554 --> 00:10:12,748
Travis View: And I knew right away that was bogus, because the Department of Justice takes years to release their statistics about federal arrests and stuff. So I tried to trace the origin of this claim, and I figured out that it basically came from a spreadsheet that was compiled on the Q Research board on 8chan. And so there was this path from the bizarre fringes of conspiracism on 8chan all the way to Charlie Kirk's Twitter account and his hundreds of thousands of followers on his verified account. And that made me realize that, oh, this is not something that is just staying in the bowels of the Internet. This is something that is being picked up by mainstream social media figures and is being pushed onto mainstream social media networks in a significant way.

48
00:10:12,804 --> 00:10:26,040
Travis View: It's not just fringe conspiracists. So that's what really got me interested in that. And so that caused me to sort of track QAnon a little bit more closely and help journalists try to understand what exactly was going on with the phenomenon.

49
00:10:26,460 --> 00:10:28,040
Chris: That's really interesting.

50
00:10:28,580 --> 00:10:37,092
Kayla: So disheartening. It's like, obviously, we've all been talking about QAnon for such a long time, but going back to stuff like that, it's just. It's so disheartening.

51
00:10:37,196 --> 00:10:48,828
Chris: Yeah. And that pathfinding from the fringe to the mainstream, I think, is what really, I mean, that's the scary bit about QAnon. Because otherwise, like you said, you could just kind of forget about it. It's just a fringe, right?

52
00:10:48,924 --> 00:11:27,714
Travis View: Yeah. I mean, I don't generally have a problem with people believing fringe, wild things. I mean, that's a very human thing, I guess. It's the same kind of thing that disturbed me with intelligent design creationism. It wasn't merely that they believed something that was false or deviated from scientific principles, it's that they were working to push these false beliefs into mainstream public schools. So that's what really disturbed me. There's this bizarre, coordinated effort amongst a bunch of people to make their fringe false beliefs much more acceptable and popular.

53
00:11:27,802 --> 00:12:02,006
Chris: So one more comment on that. The way that you said you started being interested in this stuff reminds me of myself a lot, because I was sort of interested in cults when we started doing this show. But really, the draw for me was that I've always been into skepticism, right? Like you were talking about, into science and empiricism and that sort of thing. And we started doing this show, and I thought it was just going to be like, all right, I'm just going to dunk on some weirdos for a bit. You know, our second episode was about What the Bleep Do We Know!? and Ramtha. And so I thought that was kind of going to be the thing.

54
00:12:02,078 --> 00:12:13,930
Chris: And then over the past couple of years, it's just, without even me noticing, like, all of a sudden now we're in this place where it's just so much more than that. It's so ubiquitous. It's so intense.

55
00:12:14,350 --> 00:12:58,976
Travis View: Yeah. I mean, the really troubling thing is that before I really got into QAnon, I didn't think I was really interested in extremism so much, because I guess I didn't really tie those two together. I really thought of it more as, again, a sort of debunking kind of thing. Like, these people believe these false things, and I will correct them with the facts and the more reputable sources, and this is how I'm going to counter these things. But, you know, through the help of a lot of researchers I've connected with, and journalists, I realized that, no, this is actually a much deeper problem than merely false things that need to be debunked.

56
00:12:59,008 --> 00:13:24,510
Travis View: This is an issue of extremism, and this ties into things like white nationalism and other ugly political movements. And the thing is that people believe these extremist ideologies for much deeper reasons than merely believing that they have a secret key to the universe. It's a sense of identity, it's a sense of purpose that runs really deep.

57
00:13:24,680 --> 00:13:26,978
Chris: Yeah, your journey really resonates with me there.

58
00:13:27,034 --> 00:13:45,710
Kayla: Yeah, that is the journey. I feel like that is the journey of anybody who kind of comes to this: you started in one place where it was like, I like to read about skepticism, and now the world is ending. So you've kind of given us where you were before QAnon Anonymous, but where do you see yourself, and where do you see the podcast going from here?

59
00:13:46,450 --> 00:14:15,284
Travis View: Yeah, that's a good question. You know, I mean, the podcast, I often say, as long as conspiracism holds sway in mainstream American politics, I think we'll have plenty of content to work with. And that's really always been my primary interest: the way that pseudoscience and conspiracism seem to grip hold of people and cause them to make very serious political decisions.

60
00:14:15,332 --> 00:15:07,534
Travis View: Like, I mean, we did a recent episode, for example, on ivermectin, which at first glance doesn't seem entirely relevant, but it's based on the same kinds of mechanisms as conspiracism: the belief that there is a secret cure that will end the pandemic, but there's a cabal of big pharma that is suppressing the information, and only a small, brave band of people who push this information into the mainstream are able to save humanity. And I really don't see a scenario in which all baseless conspiracism is vanquished and is no longer part of mainstream American politics, and all of a sudden everyone's just sitting down to talk about basic policy disputes. You know, like some platonic ideal, right?

61
00:15:07,622 --> 00:15:25,158
Travis View: Yeah, if political discourse is no longer a battle of bizarre extremist ideologies and it becomes, you know, your regular policy disputes, then I guess maybe there will be no place for QAnon Anonymous in the media sphere. But at the moment, that isn't the case.

62
00:15:25,334 --> 00:15:33,730
Kayla: It's a weird position to be in, where we're kind of all hoping for our own creative demise. I hope that our skills are not needed in the future.

63
00:15:34,030 --> 00:16:02,522
Travis View: I mean, yeah, it's something I've actually told QAnon followers, back when I used to argue directly with them. I don't do that so much anymore. But I would say, like, listen, if you don't like what I'm doing, starve me of material. Stop pushing nonsense. Checking what you're saying is so easy. I mean, I have to say, when I started gaining a little bit more clout, a little bit more media attention, it's like, you know, you could cause me to be ignored. You could cause me to be irrelevant. It's up to you.

64
00:16:02,666 --> 00:16:22,284
Chris: Yeah. It's also interesting talking about the ivermectin. Once you know what the tropes are, it's, like, hard to unsee them everywhere, right? The secret knowledge, the one true cure. There's these common tropes that, once you know what they are, you're like, oh, it's present there. It's present in anti-vax. It's present in QAnon.

65
00:16:22,452 --> 00:16:46,304
Travis View: Yeah. It often reminds me of the hero's journey narrative, which is ubiquitous in every sort of long story. The conspiracist narrative is just something that seems to be imprinted on the human mind and resonates with us. And so you see it over and over again whenever people push these baseless conspiracy theories.

66
00:16:46,452 --> 00:17:10,420
Chris: So the thrust of this episode is, hey, let's update our listeners on QAnon, because it's been a while since we've talked about it. Folks who have listened to our episodes and folks who listen to your show will probably already know that the seed of QAnon was the Q drops, the actual posts on 4chan and 8kun. So how have those been going since last year's election?

67
00:17:10,920 --> 00:18:05,584
Travis View: Well, there were only a few Q drops after the election, and in fact the very last Q drop was on December 8, 2020, and it consisted solely of a link to a YouTube video of Twisted Sister's "We're Not Gonna Take It." And that video has since been taken down, so at the moment it's a dead link. But since Q stopped posting, it seems as though Ron Watkins, who was the 8kun administrator, kind of took up the mantle and started more directly leading the QAnon community, on Twitter and then later on Telegram when he got banned, directing people, for example, to go to DC on January 6, and other things. So, yeah, the Q drops are gone. I mean, as always, it's.

68
00:18:05,632 --> 00:18:17,148
Travis View: This is the longest silence in Q's history. I think the previous record was just three months. So this is always within the realm of possibility that Q will come back. But for the moment, it seems like Q is totally silent.

69
00:18:17,304 --> 00:18:34,468
Chris: Yeah. I've seen, of course, theories that Ron always was Q, and I've seen it referred to as RonAnon, by the way. Like, why is QAnon so fun to come up with alt names for? BlueAnon, RonAnon, QAnon Anonymous. For some reason, it's very fertile ground there.

70
00:18:34,564 --> 00:18:50,148
Travis View: Yeah. Yeah. I don't know. It is very strange to see QAnon, something at first so obscure that you needed to explain it to people before they even understood what it was, become, like, shorthand for online conspiracism.

71
00:18:50,284 --> 00:19:07,332
Kayla: Right. So, I mean, is QAnon even still a thing, or is it just kind of a word we're now using to describe online conspiracism? Like, without Q being around? It was such a cult of personality, even though we may or may not know who Q is. Like, is it still a thing?

72
00:19:07,476 --> 00:19:37,844
Travis View: Well, you know, I would argue it is still a thing and still very much active. I want to give you an example. Just a few months ago in Dallas, there was a conference called the Patriot Roundup, which was the largest QAnon conference to date. This was a three-day, $500-a-ticket extravaganza that featured General Flynn and Sidney Powell and a host of different QAnon influencers like Jordan Sather.

73
00:19:37,932 --> 00:19:40,520
Chris: Hopefully that included food, right?

74
00:19:41,260 --> 00:19:42,480
Kayla: Hope you got a lot.

75
00:19:43,100 --> 00:20:35,002
Travis View: So I often like to point out that the biggest QAnon conference ever happened this year, after the Q drops ended. I mean, the very first QAnon conference, which I attended about two years ago in DC, was attended by a few dozen people. It was outdoors, it was free, and it was on a makeshift stage in front of the Washington Monument. It wasn't a really impressive event at all. But now there are people who are willing to spend a lot of time and a lot of money to go to very explicitly QAnon events. In fact, the same organizer who hosted the Patriot Roundup event, he goes by the name QAnon John, is hosting a similar three-day event just next month in Las Vegas. So, I mean, QAnon continues to be a draw.

76
00:20:35,146 --> 00:20:50,422
Travis View: And in addition to that, a lot of the original QAnon influencers still have fairly substantial audiences, despite the fact that they've been booted from all the mainstream social networks. They've been forced to go to places like Telegram and Rumble and stuff.

77
00:20:50,526 --> 00:21:01,970
Chris: So it sounds like it's still alive and kicking, maybe even thriving, and also maybe doing a little bit of, what's the word? I don't want to say dispersing, but, like, metastasizing. Is that maybe what's happening here?

78
00:21:02,390 --> 00:21:43,780
Travis View: Yeah, yeah. I mean, the QAnon community is still very active, even though they don't have the central messenger of QAnon anymore. You can see them, for example, when they're not attending these expensive conferences, at school boards. This is partly at the direction of General Flynn, who has adopted the slogan "local action has a national impact." Many QAnon followers have been encouraging other QAnon followers to make their voices heard at school boards and run for local office and stuff. So, you know, they're making a play for some low-level political power.

79
00:21:43,940 --> 00:22:10,606
Kayla: It's very scary. I feel like that phrase you said, local action having this national impact, makes it hard to know how widespread a movement like this is, and how much of it is just a vocal minority versus a giant political party. Do you think it's gotten bigger? Has it acquired more people, or are the people in it just becoming louder?

80
00:22:10,758 --> 00:22:54,216
Travis View: You know, I have to imagine that the people who are involved in QAnon became more vocal. Because this was always my general concern. What really interested me about QAnon was that it was a very passive kind of political movement, because it's based upon this premise that there are big revolutionary changes coming for your benefit, the kind that you like, and you don't need to do a thing. The Q team are handling it behind the scenes. You just need to sit back, grab your popcorn, and watch the show, as Q drops often said. But I thought, man, this can't last forever. No one's patience, when they're waiting for something big to happen, lasts forever.

81
00:22:54,328 --> 00:23:24,866
Travis View: There's going to come a point when they're still going to have that passion, that desire for big changes, but they're going to feel the need to do it themselves in order to scratch that itch. And so, yeah, I think this is partly the kind of thing we see. I mean, obviously, the most dramatic example of this was January 6, but in lesser examples, we can see this in the way that they are making a play for local office.

82
00:23:25,028 --> 00:23:56,530
Chris: Well, you mentioned January 6, which, absolutely crazy. That's definitely an example of what you're talking about, where it goes from this passive to a more active thing. I think when we had Jitarth Jadeja on the show, he called it "forcing the end." So how is it that a real-world violent insurrection has these ties to this online conspiracy thing? Is it just people getting restless and needing to make the thing real?

83
00:23:56,990 --> 00:24:47,134
Travis View: Yeah. Yeah, I think it really is. What's really interesting about QAnon is that conspiracism is usually fairly despairing, because it gives people who otherwise have no political power an explanation: well, it's because there's this cabal who controls everything, and they keep people like you down. And that's kind of a bummer of a message. But Q is different, because it said that, no, actually, this cabal that you hate so much is going to be defeated by you sitting at your computer, posting memes and being involved and connecting with other like-minded QAnon followers. And so all of a sudden, there's this hero's journey. There's this belief that you can, in fact, even though you feel very powerless, defeat this cabal that you believe is.

84
00:24:47,182 --> 00:25:34,330
Travis View: Is keeping you down. And so this is what. But this is what motivated a lot of people to, like, come to DC on January 6. It was not only. It was obviously, it was this confidence that they could, in fact, make this big revolutionary change. And this combined with Trump imploring people to come there on January 6, and Ron Watkins imploring people to come, you know, they felt like they were participating in the storm, and they were able to, you know, essentially reverse the results of the election. And even though. Even though it instilled these people who were otherwise, again, feel very despairing about the state of politics and how much power they have and made them feel empowered. For the majority of them, once they actually got inside the building, they had no idea what to do.

85
00:25:35,270 --> 00:26:08,780
Travis View: This is why they did things like take selfies and just sort of marvel at what a weird situation it was. And then, of course, a lot of them dispersed once Trump got back on Twitter and asked them to. So, I mean, this is basically how it makes people feel like they can have a really significant impact on national politics, in a way that they might not otherwise feel if it wasn't for Q egging them on.

86
00:26:09,240 --> 00:26:43,060
Kayla: Just since we're talking about January 6, and we've talked about social media's role in this, you know, YouTube, Twitter. You've also been talking about QAnon being deplatformed since the beginning, since it got kicked off Reddit. January 6 finally kicked the asses of Twitter and some of the other social media platforms into taking this online rhetoric really seriously, as something very dangerous. How did that deplatforming affect the Q influencers and the Q followers that had kind of set up shop there?

87
00:26:43,220 --> 00:27:27,914
Travis View: I mean, yeah, I've seen statistics on, for example, how much traffic the QAnon aggregators were getting. These are the sites that collected all of the Q drops so that someone who wanted to read them didn't have to go to the source, like 8kun. And the traffic for those sites dropped pretty heavily after January. And, of course, I don't know the exact statistics for all the broader QAnon influencers, but I have to imagine that it hurt them as well. You know, it didn't make them stop posting, but it forced them to go to other sites, and it certainly limited the audience of these main QAnon influencers to a certain degree.

88
00:27:28,082 --> 00:27:37,602
Kayla: Good. I'm glad. I mean, I know it makes some things more dangerous, but deplatforming has kind of been proven to work in some really important ways.

89
00:27:37,786 --> 00:28:09,552
Travis View: Yeah. You know, I think there are important conversations to have about allowing these giant billion-dollar corporations to essentially set the parameters of acceptable discourse in a fairly mainstream way. But from a purely consequentialist perspective, yeah, deplatforming works. It's just that these mainstream platforms, because of network effects, really have the power to limit the exposure of these extremist movements if they decide that they're not going to tolerate them on their platforms.

90
00:28:09,736 --> 00:28:51,050
Kayla: Do you ever feel like, sorry, this is totally out of the realm of what we had planned, but I feel like it's come up a few times when we're talking about these kinds of groups, and we're coming at it from a skeptic or rationalist background: do you ever find yourself feeling like you have to align with or defend a position you don't necessarily believe in or really identify with? Because, in reaction to these actions, like, yeah, I'm not a huge fan of Twitter being able to tell people what they can and cannot post. And also, clearly, there's some people who are using it in dangerous ways. How do you deal with that kind of conflict internally?

91
00:28:51,350 --> 00:29:35,172
Travis View: Well, I guess I'm sort of honest about the sad state of affairs, which is that, again, if we're judging it from a purely consequentialist kind of perspective, in terms of, like, what kind of impact does it have on people in the world? Yeah, I suppose it was a good thing. But I think that, you know, we also have to talk about, like, well, just because it is good in this particular situation doesn't mean it's a general blanket good, and it could have more poisonous effects in the future. I mean, it's such an ugly sort of situation. Yeah, I don't generally like defending, sort of, you know, I have no interest in defending Mark Zuckerberg.

92
00:29:35,296 --> 00:29:35,960
Kayla: Right.

93
00:29:36,820 --> 00:29:53,932
Travis View: Or Jack Dorsey. But, yeah, I mean, it's really tough, because if I'm being honest, I have to say that the fact that they are able to cut off the supply of traffic and eyeballs, I think, has been a net good for society. I mean, what are you going to do?

94
00:29:54,076 --> 00:29:54,548
Kayla: Right?

95
00:29:54,644 --> 00:30:01,172
Chris: Yeah, those are tough questions I'm sure we will be discussing as a society for many years to come.

96
00:30:01,236 --> 00:30:01,920
Kayla: Yay.

97
00:30:03,540 --> 00:30:22,100
Chris: Speaking of many years, we are on to, unfortunately, year two of the COVID pandemic. I know in year one we talked a little bit on our show about how the pandemic sort of threw gasoline on the QAnon fire. How has the movement responded to year two here?

98
00:30:22,640 --> 00:31:29,678
Travis View: Yeah, I mean, generally they believe that any kind of COVID restriction or COVID policy is really a kind of pretext for a communist takeover. And I think it's always healthy to question whether or not any particular government policy is actually to the benefit of anyone, and these sorts of things. But, yeah, they've been reacting very poorly to these sorts of policies. It's just made them, you know, even more paranoid and fearful. And, gosh, honestly, there's no good time for a pandemic, but, man, this could not have come at a worse time, when there's already a great deal of distrust and strife. And then, you're right, the pandemic just caused people to feel like the ground wasn't underneath them and nothing was really stable anymore. And so they turned.

99
00:31:29,734 --> 00:31:40,134
Travis View: And then, of course, as a consequence, they often spent a lot more time at home, at their computer. And so it caused a lot more people to get radicalized. I mean, yeah, it was a really ugly situation.

100
00:31:40,302 --> 00:31:47,846
Chris: Not only was it bad timing, but we've talked about this a couple times where it's like, if you were just going to design, you know, a virus.

101
00:31:47,958 --> 00:31:55,680
Kayla: It makes me rethink, like, my stance against intelligent design, because I'm like, it absolutely feels like there is someone out there designing all of this.

102
00:31:55,760 --> 00:31:56,336
Chris: Yeah.

103
00:31:56,488 --> 00:31:57,536
Kayla: Terrible writing.

104
00:31:57,688 --> 00:32:12,104
Chris: Yeah. If you were to design a virus, like, to exploit the cracks and flaws in american society, this is what you would do. You'd say, okay, let's see. Let's make, like, half of it asymptomatic. Let's have it target different communities and different age levels differently.

105
00:32:12,152 --> 00:32:13,888
Kayla: We all have to work together, and.

106
00:32:13,904 --> 00:32:29,310
Chris: We all have to work together to defeat it. Like, I don't know. I don't really have a question there. Just commenting that not only was it perfect timing, it's also perfectly suited to exploit the cracks that we already have in American society.

107
00:32:29,470 --> 00:32:45,270
Kayla: So today, September 16, 2021, the day we are recording, can you summarize for us, what is the state of the QAnon community and movement just for today? Because who knows what will it be tomorrow? But where are we now right now?

108
00:32:45,310 --> 00:33:40,206
Travis View: I mean, they're still full of hope that they will be vindicated, as surprising as it is. I mean, they're getting some sort of fuel. For example, it was just announced that attorney John Durham sought an indictment for a lawyer in connection with his investigation into the origins of the Trump-Russia investigation. And John Durham is very important in QAnon mythology and QAnon lore. In fact, the second to last Q drop just said "Durham," because they believe that John Durham will finally expose what a sham the Russiagate investigation was, and then, like, James Comey will finally go to prison. Now, most people have not really thought about James Comey in a long time, but of course, he's still very much on the minds of QAnon followers.

109
00:33:40,398 --> 00:34:36,389
Travis View: Q one followers, of course, they're also very convinced of that the election was fraudulent, and there's going to be some sort of big reveal that will usher Trump back into office. They're looking very closely at the Arizona audit, which is still ongoing. But there's this guy named Doug Logan for cyber ninjas. And they believe cyber Doug Logan also happened to communicate pretty extensively with Ron Watkins. And they believe that this audit, once it's finally revealed, will expose all the fraud. And then we'll realize that Biden is not the legitimate president. In fact, there's a theory, that, very popular theory going around QAnon world right now called devolution, which is premised on the idea that Trump set up some special laws or some sort of special orders that make Biden essentially a neutered president, doesn't have any real power.

110
00:34:36,529 --> 00:35:26,060
Travis View: And so that he can sort of be locked in, right, until all those fraud is revealed, and then Trump is rushed back into power. So that's it. They're still full of hope. I mean, the thing is that QAnon followers, they often hope for a two things. Number one, a sort of a bloodless coup, a sort of a way to sort of rush all of themselves and all the people they like into power without needing any kind of violence. Right? And so this, I think, separates them pretty well from other people, like oath keepers, for example, or people in the militia movement who think that, who really think it's their job to take up arms against the state if they believe that, they're being too tyrannical tolerate. But no, I mean, QAnon runs on pure hopium, as they say.

111
00:35:26,520 --> 00:35:44,330
Travis View: This idea that they're going to get everything they want, all their enemies will be vanquished. They will get to say I told you so to your friends, and all their friends and family will say, we are wrong. You're right the whole time. Let's call each other again, please.

112
00:35:46,070 --> 00:36:17,248
Chris: So one of the things that we really appreciate about the QAnon Anonymous podcast is that you guys really go wide on a bunch of this stuff. There's a lot of conspiracy universe stuff whose context is important to QAnon. And actually, I really liked your Gene Ray Time Cube show, so thank you for that little jolt of nostalgia there. But we're curious: why do you think that, no matter how strange or weird or niche, there's always an audience for some of these conspiracies?

113
00:36:17,344 --> 00:36:31,044
Kayla: When you were doing that episode about the secret Queen of Canada, my jaw was on the floor that this person had followers. And like, yeah, they follow a hollow earther who has followers that are getting tattoos of his symbol on their bodies. Just. Sorry, finish the question.

114
00:36:31,092 --> 00:36:37,840
Chris: Yeah, no, just. Why do you think that no matter how weird and out there it is, there's always people willing to follow someone?

115
00:36:38,860 --> 00:37:15,378
Travis View: Well, I think it's a very human desire for esoteric knowledge, which, you know, I think is fairly universal. I mean, I have it. I feel like I'm a fellow traveler with conspiracists in this sense, in that I'm really interested in these bizarre insider stories that only a few people are really privy to. Now, I think there are a lot of healthier ways to get esoteric knowledge. You might, for example, read a history book. You might read a science book, and just understand that if you absorb that knowledge, then you have esoteric knowledge that very few people are privy to.

116
00:37:15,434 --> 00:37:36,906
Travis View: That kind of academic knowledge isn't enough, and they need to feel like they're being an insider movement that gives them an understanding of what's really going on behind the scenes, of how things really work that only they and the small number of people know about. And so that allows them to basically be, you know, the avant garde of a new, powerful movement.

117
00:37:37,058 --> 00:37:45,110
Chris: Well, like, I mean, they can't read a history book, Travis, because that's all just made up anyway by the cabal to hide the real truth about Tartaria.

118
00:37:46,810 --> 00:38:11,530
Kayla: Yeah. What you said about feeling like you're on a similar path as some of these conspiracy theorists, I feel that deeply, too. And it's something that I need to keep in check about myself, because if I get the wrong YouTube video served to me, I'll say something to you like, "I saw this YouTube video about how zero and infinity are the same thing. Did you know that?" And once I hear myself say it, I'm like, no, you're seeking the secret knowledge. Go read a history book.

119
00:38:12,030 --> 00:38:13,438
Chris: I have to scull your mueller.

120
00:38:13,574 --> 00:38:33,078
Kayla: Go read a physics book. We're fine. But I think that's a good point, is that some of us who have this desire to follow these conspiracy theories from a more analytical or investigative perspective, we're probably doing that because we have that desire for the secret knowledge as well.

121
00:38:33,214 --> 00:38:58,636
Travis View: I mean, a lot of these things are very big, intriguing claims. I mean, even going all the way back to, like, intelligent design: really, you think you've overthrown a 150-year-old theory that's taught in the biology department of every single university in the world? I've got to see what you have to say. You know, that kind of interest, I think, is understandable.

122
00:38:58,798 --> 00:39:24,378
Chris: Yeah. And then the funny meta thing there is, people listen to our podcasts to get information that they find intriguing or didn't know about, too. So, like, I totally get that. Right. I mean, even when I write the scripts for my episodes, I have a little reminder for myself: hey, make sure you try to find some stuff to present that will make people go, whoa, that's crazy, I didn't know that. Right. So, yeah, I mean, I.

123
00:39:24,384 --> 00:40:07,736
Travis View: And also, I always try to give the benefit of the doubt to conspiracies as much as I can, asking, like, well, what is possibly true about this? You know, take something like anti-vaxxers: obviously their stance on vaccines is wrong, but their stance on the history of corruption with pharmaceutical companies, I think, is quite spot on. You could find a lot of bad stuff that big pharmaceutical companies like Pfizer have done in the past that's worth being concerned about, and it's sort of understandable why these corporate giants have lost the public trust. That kind of stuff I'm always interested in. Or even, like, government conspiracies. I often.

124
00:40:07,808 --> 00:40:42,052
Travis View: I'm extremely intrigued with the history of, like, mkultra because it sounds like the most wild, bizarre, false conspiracy theory that you could think of, but it's totally true. Like, once the, you know, the CIA was totally in the thrall of cold War paranoia, they felt it justified to run human experiments on us citizens with drugs for try and understand how to control people's minds. It's absolutely insane. Absolutely true. So, you know, I give it to them when they get. When they're on the right track, at least.

125
00:40:42,196 --> 00:40:52,648
Kayla: Right. When you learn about the real history of MKUltra, it absolutely feels like reading a Dan Brown novel. Like, it feels like the secret knowledge, and how is this real? And then it is.

126
00:40:52,704 --> 00:40:59,700
Chris: There's dozens of things like that. We talked a lot about trust erosion on the show, certainly during the QAnon episodes.

127
00:41:00,160 --> 00:41:24,630
Kayla: So with all of this research you've done for the podcast, have you figured out anything prescriptive? Like, has it given you any prescriptive insight for the massive disinformation and extremism problem that we're facing, either at a systems level, macro, or what we can do individually to deradicalize people? Fix the problem, is what we're saying.

128
00:41:25,210 --> 00:42:02,974
Travis View: Yeah. I mean, I wish I had a really good answer. There really isn't one, because the things that are causing these problems are, like we talked about, extremely human and extremely understandable. I mean, the roots of the issues are, I guess, accelerated by technology, and they're sometimes accelerated by our stressful times, but they're rooted in our minds and our social beings and things which are, you know, hardwired. And there really is no sort of easy fix for those kinds of things.

129
00:42:03,102 --> 00:42:04,014
Chris: It's actually a good answer.

130
00:42:04,062 --> 00:42:05,182
Kayla: I mean, very good answer.

131
00:42:05,246 --> 00:42:05,534
Chris: There's.

132
00:42:05,582 --> 00:42:09,230
Kayla: There's no. The good answer is that there is no easy answer. And I think that's part of.

133
00:42:09,270 --> 00:42:11,406
Chris: It's a disappointing answer, but good answer.

134
00:42:11,518 --> 00:42:16,462
Kayla: But it's. It's. I think that's part of what's been so disheartening about the whole thing, is that there is no easy.

135
00:42:16,486 --> 00:43:05,952
Travis View: It is. I mean, conspiracism has been part of our political workings forever. You can trace the real roots of modern conspiracism all the way back to the aftermath of the French Revolution, when there were many people, most notably a priest named Augustin Barruel, who was so enraged by the French Revolution that he thought it was caused by the Illuminati and the Freemasons. And interestingly, Thomas Jefferson actually read one of Augustin Barruel's books, and he called it sort of the ravings of a Bedlamite. But these sorts of ideas and these sorts of perspectives have just always been with us in the modern world.

136
00:43:06,016 --> 00:43:24,166
Travis View: And, I mean, you ask us, like, well, how do we get people back to a kind of consensus reality that didn't really, ever really exist? And, you know, there's just. There's really no real answer. You're asking me to solve a problem that for hundreds of years.

137
00:43:24,318 --> 00:43:28,846
Chris: I don't think that's asking too much, really. You run a great show. I don't know.

138
00:43:28,878 --> 00:43:29,650
Kayla: It's true.

139
00:43:31,870 --> 00:43:45,650
Chris: Speaking of which, one of the questions I wanted to ask is sort of a high-level question. What would you say is the main theme or message of QAnon Anonymous? If it gets a small entry in a history book 100 years from now, what would that say?

140
00:43:46,390 --> 00:44:24,950
Travis View: Well, my perspective when tackling this material is one of what I like to call humane skepticism, which is kind of like a secular version of loving the sinner but hating the sin. It's this idea that we can hate disinformation, and we can hate what it does to people, the way it affects people's minds, the way it causes people to act in self-destructive ways, and we hate the disinformation because we like the people who believe it, because we want better for them. And this is, I think, the healthiest perspective.

141
00:44:24,990 --> 00:45:08,488
Travis View: And I think this is, I think, generally a flaw of a lot of other kind of like, past skeptic movements, like the new atheism, for example, is often kind of like, seen as sort of like, smug and arrogant and dismissive. And the idea that these sorts of beliefs that people have had for thousands of years could only be held. And I often don't think that, like, for example, even with Qanon, I don't think you have to be stupid in the sense of like, having like, a low level of cognitive processing power in order to believe in sort of QAnon conspiracy theories. I think they're mistaken. I think they're wrong. I think it's a result of flawed sort of reasoning processes. But often, I think they are often, sometimes very sharp.

142
00:45:08,544 --> 00:45:39,500
Travis View: It requires doing a lot of, I guess, flawed research, and sometimes theyre often very successful in their personal lives. For example, theres one case, there was a technical executive for Citibank who was the creator of the largest QAnon aggregator site, which is QMap pub. You dont, you can't be a stupid person sort of achieve that kind of position. It's a different kind of cognitive flaw that causes people to fall into QAnon than low intelligence.

143
00:45:40,400 --> 00:46:25,486
Chris: All of those things you just said resonate with us, and we talk about some of those things on the show. We talk about all of those things, actually, I think on our show a lot. Certainly me personally, I get the being disillusioned with new atheism. I went through the same process myself, really liked a lot of those guys, and then slowly started going, really? I don't know about, you know, so same sort of journey for me there, I think. And then to speaking to the point about smart versus dumb, yeah, we see that, too, all the time. We had a guest on the show, Doctor David Gorski, he called it the Nobel disease in the case of certain, like, Nobel Prize winners, where they would then go on and do incredible work to win a Nobel.

144
00:46:25,518 --> 00:46:27,734
Kayla: Prize deserving of a Nobel Prize.

145
00:46:27,862 --> 00:46:56,850
Chris: Right. But then they would go on to, you know, promote quackery or, in some cases, you know, like, eugenicist stuff. In some cases, being responsible for why everybody thinks you need a million milligrams of vitamin C every day. So anyway, that really resonates, too. The whole thing where being smart can also make you better at defending a position that is ultimately false. That's, you know, another thing I feel like we see a lot.

146
00:46:57,230 --> 00:47:10,654
Travis View: Yeah. I mean, if you're a really intelligent person, you can create all these bizarre and convoluted defenses of a position you have a deep emotional attachment to. That's. Yes, that's very difficult to deconstruct.

147
00:47:10,782 --> 00:47:40,150
Kayla: And I think it's difficult for us to engage with. And so, kind of as we're getting to our last question here, something that we're dealing with all the time as podcasters that talk about this: you and your QAnon Anonymous colleagues are always neck deep in the worst sludge of our civilization. So when you're studying all this stuff and constantly inundated with this stream, what do you do to protect your own mental health and stay sane?

148
00:47:40,610 --> 00:47:52,216
Travis View: One thing that I've been doing, at least over these past few months is going on hikes very frequently, you know, and this is the old, you know, touch grass kind of thing.

149
00:47:52,248 --> 00:47:52,820
Kayla: Right.

150
00:47:53,560 --> 00:48:27,936
Travis View: It is important. It does help. You know, I try to, you know, do two things. I go out on a hike, I get some sunshine, I get some exercise, and I take my camera and try to take a picture of something, you know, that's lovely, a nice little composition. And that's, I think, a healthy exercise, at least for me, because it forces me to go out and intentionally try to look for something beautiful, which is sort of the exact opposite of my day job, which is intentionally looking for ugliness on the Internet.

151
00:48:28,128 --> 00:48:37,882
Kayla: Are you able to interact with social media and with things like Twitter in a healthy way now, or. I feel like I can't personally. It's like I go on Twitter, I just see garbage.

152
00:48:37,986 --> 00:49:32,278
Travis View: No, I really can't. Honestly, this is something else I've been doing more recently. I mean, throughout all of this, I was on Twitter very frequently. I was a very active user, tweeting several times a day about crazy things. I'm not quite active at that level anymore, because it just was not benefiting me. I don't think it was benefiting my work as much anymore to be quite that engaged. I mean, I still go on various boards where extremists gather and conspiracists gather. I do still engage with the kind of content that conspiracists engage with, but nothing quite at the same level as I used to, if I'm being honest.

153
00:49:32,454 --> 00:49:37,810
Chris: Do you think you noticed a difference in your own mental state after taking a few steps back from Twitter?

154
00:49:38,270 --> 00:50:27,270
Travis View: Yeah, absolutely. I mean, when you're in the thick of it, it really makes you feel like this bizarre, convoluted mess is the world, especially when you see it sort of start to leak out into mainstream politics. When you take a few steps back, I think it helps you gain some level of perspective. Like, sometimes there are things that I notice gaining steam inside the conspiracist world that I decide to not really report on, because I think, well, listen, there's still a chance this is going to stay quarantined within the conspiracist world. It's not worth blowing it up and sharing.

155
00:50:27,310 --> 00:51:10,290
Travis View: On my Twitter account, which has like 80,000 followers now, one example is that I guess I'm doing it now, but Ron Watkins, he started the whole alien leaks kind of movement. He had set up a website really, into UFO's and aliens, even though that was sort of a project he was doing and something that might be worth being concerned about if it sort of gets to a more mainstream place. I often, I think that, you know, this is not something I don't think is quite, it's not hurting people quite at the level that QAnon is quite yet. So it's not something I'm just going to engage with on a deep level. It's not something I'm going to share with my more mainstream audience because I don't think it's quite as relevant as sort of other things. And that's. Yeah, that's.

156
00:51:10,330 --> 00:51:27,818
Travis View: I mean, it is been beneficial. I have noticed, I mean, sort of a personal benefit, sort of my own mental health, when I'm able to take a step back and do other things besides engaging in bizarre online content for hours every day, that's really good.

157
00:51:27,834 --> 00:51:28,850
Chris: And that helps us. Yeah.

158
00:51:28,890 --> 00:51:56,304
Kayla: It honestly feels a little bit like the advice that's needed for people who get sucked into the conspiracy theories, genuinely. I mean, "touch grass" comes from kind of that thing. Yeah, what's beneficial to us as people looking at this and observing it and talking about it is taking a step away. And it feels like that's kind of the only thing we've seen that might help somebody who's actually getting sucked into a conspiracy theory. That parallel is interesting to me.

159
00:51:56,392 --> 00:52:41,784
Travis View: Yeah, yeah. When individuals come to me and they say, for example, "My friend, my family member, my brother, my sister, my father has gotten sucked into these conspiracy theories. Is there anything you think I can do to help them?" I mean, I have to say, you know, I'm not a mental health professional. This is a very difficult mental health question. But my main recommendation is: try to get them away from the computer and try to help them find meaning somewhere else. Because the thing is that we need to find a way to have meaning and purpose. This is not something that's just good for us. This is something we need in the same sense that our lungs need oxygen. It's just something we need to have. And if you're getting it from QAnon, then you're just not going to give it up.

160
00:52:41,952 --> 00:53:33,448
Travis View: If you realize that it's possible to have to engage with the world and have meaning and enjoy it in a way that does not involve QAnon, then all of a sudden, the QAnon doesn't seem so enticing. And this is the. I mean, honestly, this is the same way I feel about my own work. I enjoy it very much, but I try to get a distance from it. If, for example, we do sort of slide into a utopia in which political discourse consists entirely of policy disputes, then QAnon anonymous is no longer relevant. And that is fine. I'll find something else to do. I'll go back to my old corporate content job or something like that, because I don't want to get to a place where this engaging with this nonsense is just something like I need psychologically.

161
00:53:33,504 --> 00:53:54,820
Travis View: This is something that I base my identity and purpose on. I mean, I've got other things. I've got hobbies. I've got family. I've got to make sure that I have a diverse set of interests that I can identify with so that I don't need being online as a core part of my personal identity, because that is very unhealthy.

162
00:53:55,410 --> 00:54:13,074
Chris: We could start a camping or outdoorsy co-op, because actually, we interviewed Matthew Remski from the Conspirituality podcast just a week or two ago, and we asked him the same question, and his first answer was hiking. So that's interesting. And we like to hike a lot.

163
00:54:13,122 --> 00:54:15,522
Kayla: Touching grass is very important. Form the.

164
00:54:15,546 --> 00:54:17,386
Chris: Touch co-op or something.

165
00:54:17,578 --> 00:54:46,070
Travis View: It is. You know, I think it's important, because the thing about the online world is that it's a world of pure representation. There's nothing that just is what it is. Everything is, you know, symbols that you manipulate and that manipulate you. Whereas when you go out in nature, when you see a tree, it's not a representation of a tree. It just is a tree, and you can connect with, you know, physical reality, atomic reality, in a more productive way.

166
00:54:46,860 --> 00:54:50,340
Chris: That's a really interesting insight, actually, about the representations and symbols.

167
00:54:50,380 --> 00:55:14,310
Kayla: Blowing my mind. I'm like, yeah, I can't just look at trees on the Internet and go, "I've seen trees today." No, go touch grass, girl. I need to go touch that grass. Yeah. There is definitely something very different about engaging with a thing actually, as opposed to virtually. And I feel like, for myself, it can be difficult to articulate what that difference is. But there is that difference.

168
00:55:14,690 --> 00:55:43,756
Travis View: I mean, we're, like, really the first generation ever where you didn't need to engage with the world in an analog way, ever, if you weren't so inclined. And this is a bizarre worldwide experiment. We still don't fully appreciate what the consequences are. So I think a basic analog interaction with people and things is helpful to give you balance and perspective on your digital interaction with the world.

169
00:55:43,948 --> 00:55:48,292
Chris: Yeah, now, that was a deep dive. That was.

170
00:55:48,436 --> 00:55:51,276
Kayla: We all need to get off Twitter, and that's what I'm hearing.

171
00:55:51,468 --> 00:55:57,756
Chris: Well, before we sign off here, anything that you'd like to add? Anything we missed or anything you'd like to plug of yours?

172
00:55:57,908 --> 00:56:19,340
Travis View: No, I would say that if you're interested in what I have to say, come follow me @travis_view on Twitter, though, like I mentioned, I don't tweet quite as much as I used to. You can also listen to us at the QAnon Anonymous podcast. We do one free episode every week, and then one episode just for our patrons who support us.

173
00:56:20,560 --> 00:56:35,170
Chris: That was a really good conversation. I knew it would be. One of the things we said to Travis: he was really good at keeping his answers concise and really getting to the point. I was really impressed with his interview abilities.

174
00:56:35,290 --> 00:56:36,210
Kayla: Better than us?

175
00:56:36,330 --> 00:56:37,274
Chris: Oh, way better than us.

176
00:56:37,322 --> 00:56:42,470
Kayla: I found myself asking questions, being like, I'm talking for three minutes. I'm just trying to ask a question.

177
00:56:43,210 --> 00:56:52,218
Chris: I do that a lot, actually. I feel like I need to get a little bit better. Get to the goddamn point. Well, let me say a paragraph and then ask you a question at the very end of it.

178
00:56:52,274 --> 00:56:55,390
Kayla: My favorite thing is, let me say a paragraph and then not even ask you a question.

179
00:56:56,050 --> 00:56:58,140
Chris: Yeah, I do that, too. Oh, man.

180
00:56:58,300 --> 00:57:03,524
Kayla: But we're not here to self flagellate. Let's react. No, it's time. It's the react portion.

181
00:57:03,612 --> 00:57:04,044
Chris: React.

182
00:57:04,132 --> 00:57:11,436
Kayla: Tm. We're here to now talk about what we just talked about. So what are your. What are your thoughts and feelings?

183
00:57:11,508 --> 00:57:14,476
Chris: Can we then, can we react to the. React and then react to the reaction?

184
00:57:14,588 --> 00:57:15,092
Kayla: Did it?

185
00:57:15,156 --> 00:57:17,012
Chris: Yeah, exactly. It seems so popular.

186
00:57:17,076 --> 00:57:37,322
Kayla: Yeah. What are your thoughts and feelings after that conversation? I mean, you and I do keep up with what's going on in QAnon land, but we also don't study it as closely as Travis View and QAnon Anonymous do. So I feel like this conversation was interesting because it was both informative, like, new information, and also new analysis.

187
00:57:37,426 --> 00:57:37,706
Chris: Yeah.

188
00:57:37,738 --> 00:57:45,506
Kayla: Like, we got both new information on what's going on with QAnon, and then for the stuff that we maybe already knew, there was a new viewpoint, a new analysis.

189
00:57:45,578 --> 00:58:14,334
Chris: Yeah. And then there's stuff that's, like, on my radar that helped kind of fill in some of the cracks, like the QAnon conferences. Because there's the one that he said was in Dallas, and there's one coming up in Vegas, and then talking about devolution and things like that really helped kind of fill in some of those cracks. One thing I just did want to mention is, like, we're talking about some of the, like, the cyclicality of this stuff. I just wanted to differentiate real quick, like, we are still in this cycle. This is not like, you know, there was.

190
00:58:14,422 --> 00:58:15,494
Kayla: We're still in the moral panic.

191
00:58:15,582 --> 00:58:25,822
Chris: Yeah. There was a moral panic in the eighties, there was a moral panic in the 1800s. You know, there's moral panics every 30 years, whatever. We're still in this one. This is the current moral panic.

192
00:58:25,846 --> 00:58:32,950
Kayla: It didn't end. And then we're coming back. No, it did not end. It didn't end with our last episode, shockingly.

193
00:58:33,030 --> 00:58:34,478
Chris: I know. I thought we had solved it, and.

194
00:58:34,494 --> 00:58:38,804
Kayla: It did not end with the. It clearly did not end with the US election.

195
00:58:38,982 --> 00:58:47,360
Chris: Right, right. Do you think this one started with. Would you say it started with Pizzagate? Was that like. Was that like, the beginning of this moral panic?

196
00:58:47,440 --> 00:58:47,864
Kayla: I think.

197
00:58:47,912 --> 00:58:48,664
Chris: Or was it before that?

198
00:58:48,712 --> 00:58:53,688
Kayla: Really difficult to. It can be really hard to say. Like, what's the start?

199
00:58:53,824 --> 00:58:56,040
Chris: I need an exact date down to the second.

200
00:58:56,120 --> 00:59:29,300
Kayla: I think, for all intents and purposes, you can say that this started with Pizzagate. I think that there are probably very compelling arguments to be made on either side of that. And also, I think there is a compelling argument to be made that QAnon itself is not its own moral panic, is not its own satanic panic, is not its own nocturnal ritual fantasy. It is part of a greater conspiracy, specifically the flavor of antisemitism that still very much exists in the world.

201
00:59:29,420 --> 00:59:57,382
Chris: Right. And that's one of the things we talked about on our episode, was like, is this something unique and new and different, or is it just the same old? And I think the answer remains that part of it is cyclical, and part of it is this unique flavor of 2020, 2021, with the pandemic, with 4chan, with things that, you know, they didn't have access to in the eighties, or the things that the Anti-Masonic Party didn't have access to in the 1800s.

202
00:59:57,516 --> 01:00:04,150
Kayla: Right. It's the. The. The meat is the same, but the plate is different.

203
01:00:04,890 --> 01:00:09,690
Chris: Or is the plate the same, but the meat's different? Or is it a different sauce? I think it's just different sauce.

204
01:00:09,730 --> 01:00:10,578
Kayla: Just different sauce?

205
01:00:10,674 --> 01:00:12,698
Chris: Yeah, it's like Béarnaise versus A1.

206
01:00:12,754 --> 01:00:17,514
Kayla: The broad strokes are the same, but then, you know, that's why it's the sauce.

207
01:00:17,642 --> 01:00:20,674
Chris: You put the sauce on with a paintbrush. Right.

208
01:00:20,722 --> 01:00:21,458
Kayla: Since when?

209
01:00:21,594 --> 01:00:23,972
Chris: I don't know. I think. I think this metaphor got away from us.

210
01:00:23,996 --> 01:00:24,932
Kayla: I think you don't know how to.

211
01:00:24,996 --> 01:00:39,332
Chris: Eat. One of the other cool things, I thought. I don't really have too much to add here, but it just was really cool to, like, hear Travis talk about his journey, like, before and during all of this, like, what got him into this sort of work.

212
01:00:39,436 --> 01:00:39,892
Kayla: Right.

213
01:00:39,996 --> 01:00:43,548
Chris: Because I was like, oh, man, I really identify with a lot of that, you know, like, I, coming to this.

214
01:00:43,564 --> 01:00:48,462
Kayla: From a skeptic, like, I like science. I like to read Brian Greene and then be like, oh, all of a sudden.

215
01:00:48,486 --> 01:00:50,494
Chris: Now, why did you do it in that voice?

216
01:00:50,662 --> 01:01:00,918
Kayla: That's how our voices are when we come to this. We're like little babies and then now we're adults going, oh God, oh God.

217
01:01:01,014 --> 01:01:21,296
Chris: Yeah, I know. But there was also a different flavor, too, because he was talking about being really interested in sort of that dominance of that intelligent design discussion that was happening. It's still happening, actually. But, you know, that was really prominent, you know, a decade or two ago. It's funny because, like, I kind of forgot about that.

218
01:01:21,368 --> 01:01:24,816
Kayla: Yeah, me too. And I was in everything that's happened.

219
01:01:24,888 --> 01:01:25,792
Chris: When he said that I was.

220
01:01:25,856 --> 01:01:26,984
Kayla: How big of a deal that was?

221
01:01:27,032 --> 01:01:28,020
Chris: Oh, yeah.

222
01:01:28,400 --> 01:01:50,290
Kayla: I also think something that's interesting and just kind of keeping in line with this part of the conversation, it's come up with you and I. It's come up with talking to other people. It's come up, it came up in this conversation, the disillusionment that happens for folks that kind of come to this from a skeptic viewpoint, the disillusionment with the quote, unquote, like new atheism and the new atheist leaders.

223
01:01:50,670 --> 01:02:40,626
Kayla: And it's interesting to, it's always interesting to hear that journey because, you know, selfishly, from my perspective, like, I saw you go through that journey, because it was a lot of people that maybe you identified with or looked up to with the new atheists. I never felt like I was a part of that, because there was never really space for, there wasn't a lot of space in that movement for people that weren't, I hate to use this, but the straight white men. Like, there wasn't a lot, sorry, but there wasn't a lot of space. And that's why there continue to be issues with the, just the misogyny and the exclusion of women in that space. And I think that can always kind of be a thing. Obviously, that's a red flag.

224
01:02:40,698 --> 01:02:53,378
Kayla: And those of us who are white, straight, cis, whatever, can very easily overlook it. But it's like, now looking back, it's like, yeah, red flags abound, man. Like, look at how exclusive this.

225
01:02:53,474 --> 01:03:16,130
Chris: But, as a group? Yeah, as a white, straight, cis dude, I had a lot of blind spots there. So, like, you know, I wasn't, like, noticing a lot of those red flags, of course. And so your journey is very different than mine. Your journey is like, yeah, man, there were red flags there. I never felt welcome. My journey is like, I had the blind spots and then they were revealed to me.

226
01:03:16,210 --> 01:03:16,570
Kayla: Right.

227
01:03:16,650 --> 01:03:32,796
Chris: My journey was very disorienting. Like, oh, these guys actually kind of suck. And, like, that felt, I guess, ultimately, it just, I'm really glad to talk to people like Travis and to read, you know, articles about sort of that disillusionment, of which there have been some written.

228
01:03:32,868 --> 01:03:33,340
Kayla: Right.

229
01:03:33,460 --> 01:03:37,476
Chris: Because it. It definitely makes me feel less alone and crazy there.

230
01:03:37,548 --> 01:03:37,756
Kayla: Yeah.

231
01:03:37,788 --> 01:04:07,174
Chris: No, because for some of those guys, like, again, Michael Shermer for a long time, and I still, he's still not, it's not like these guys are the worst devils in the world. But, you know, I really looked up to him for a long time. Like, a lot of why I'm a skeptic is his column in Scientific American. And having him be, like, an Andy Ngo fanboy is really upsetting to my brain. So, anyway, it's just, yeah, it's good to hear that some other people, like Travis, have experienced that same sort of journey.

232
01:04:07,222 --> 01:04:49,596
Kayla: And I think it's, I think it also, it does the same thing to make me feel less alone, because my experience with a lot of the new atheism, the Richard Dawkins specifically, a lot of the experience wasn't just like, these guys are bad. It was like, what's wrong with me? Yeah, what's wrong with me that I'm not connecting to this group of people? What's wrong with me that I don't feel welcome here? What's wrong with me that things are making me feel like an outsider or unwelcome or what's wrong with me that this is not connecting to me? It's good to hear. I mean, it sucks, but it's good to hear that multiple people have had this experience and realized, like, oh, yeah, all of the answers weren't here. All of the answers, it really blows.

233
01:04:49,628 --> 01:05:32,504
Chris: That you were feeling that gaslighting of, like, why am I. What's wrong with me? We've talked about this on the show a lot. We asked him, what is the through line of your show? And I think we have a similar sort of through line, but I would say that one of our big through lines is: you're not alone. Like, if you are feeling gaslit in the same way, like, you're not welcome in skeptic communities, or you're not welcome in whatever community you're in, I hope that you can listen to this show and feel like you're heard or seen. I hope you can have some of those same feelings that I get when I listen to Travis and he says, yeah, I was disillusioned by these guys, but, you know, I still have the skeptical bent and.

234
01:05:32,552 --> 01:05:53,224
Chris: But it's, you know, it's this very sort of nuanced approach. Like, he talked about social media, how, like, there's good and bad and moderating it. You know, if you. If you feel the same way about some of these things that we do, but you don't want to be like an arrogant dickweed about it, then I hope. I hope that listening to our show can give you some of that sense of community and belonging.

235
01:05:53,352 --> 01:05:54,976
Kayla: We just want to be better than Richard Dawkins.

236
01:05:55,048 --> 01:05:57,182
Chris: Yeah. Which done and done.

237
01:05:57,326 --> 01:06:24,144
Kayla: I hope so. Anyway, in the same vein of what we're talking about, something that really got to me was that conversation that we had about, like, yeah, skepticism without being a dick about it. The way that Travis put it was, like, humane skepticism, the secular version of, quote unquote, you know, loving the sinner, hating the sin. That felt really important to me. It's something that's very difficult.

238
01:06:24,192 --> 01:06:25,660
Chris: Don't hate the player, hate the game.

239
01:06:26,280 --> 01:06:43,984
Kayla: It's something that's very difficult to engage with right now because it's not just, oh, some people think differently than me or even, oh, some people think differently than me and are trying to make their politics enact their worldview. It's some people think differently than me and are making public health choices that are hurting a lot of people.

240
01:06:44,112 --> 01:06:44,424
Chris: Right.

241
01:06:44,472 --> 01:06:52,300
Kayla: So it can be really difficult. And I don't. I don't think loving the sinner, but hating the sin is always the right.

242
01:06:53,200 --> 01:06:54,280
Chris: Interesting. Yeah.

243
01:06:54,360 --> 01:07:09,512
Kayla: It really depends on what your definition of love is in any given circumstance or what your definition of sin is or your definition of sinner is. But I don't think it's always the right thing to be, like, you're okay. You just have bad beliefs. Like, sometimes you have to punch the Nazi, for lack of a better analogy.

244
01:07:09,616 --> 01:07:51,724
Chris: I mean, this is part of why we added the criteria of chain of victims. Right. Because sometimes, when you're dealing with these groups, it is really difficult to know, like, are you a victim that I feel bad for because you got sucked into this thing, or are you a perpetrator of these crimes that I dislike? And, I mean, there's so much discourse online. That's the thing. It is both. And that's what makes it hard to think about. That's what generates all of the back-and-forth discourse online about, like, you should be able to make fun of ivermectin paste eaters, and then, like, no, you shouldn't do that. So, like, that's why there's always discourse about it. It is so hard to differentiate in this chain of victims. And QAnon is a great example of that.

245
01:07:51,772 --> 01:08:10,934
Kayla: When we're living in a world like this, where we're all stuck inside of our houses and there's an opportunity for a better world and so many of us aren't taking it, sometimes you got to blow off a little steam by joking about all the people eating horse paste. And I don't begrudge people fulfilling that need, but also, it is a complicated thing.

246
01:08:11,022 --> 01:08:13,054
Chris: Or like Travis said, you gotta go out and touch grass, baby.

247
01:08:13,102 --> 01:08:19,381
Kayla: Touch grass. Touching grass is probably the better option than dunking on folks online. Almost always.

248
01:08:19,486 --> 01:08:26,718
Chris: Yeah. Yeah. I think I've been doing a little bit better with the social media. Step back and touching grass a little bit better lately.

249
01:08:26,774 --> 01:08:28,166
Kayla: We went and looked at the trees last week.

250
01:08:28,198 --> 01:08:29,006
Chris: We were like, yeah, we did.

251
01:08:29,037 --> 01:08:39,460
Kayla: Oh, we could sit here. We have an afternoon off. We could sit here and be on the Internet, or we could go drive to a tree place. And we drove to the tree place, and it was better. It was a better choice.

252
01:08:40,160 --> 01:09:04,296
Chris: The thing that Travis said that was different than I've heard or thought of before was the thing about the symbology of the Internet. When you're on the Internet, even a picture of a tree is still just a representation of that tree. It's still a symbol of that tree. It's not, you know, interacting with a tree at the base layer of reality, the way you are when you're out in nature.

253
01:09:04,368 --> 01:09:05,952
Kayla: And he said the atomic level.

254
01:09:06,055 --> 01:09:07,167
Chris: Did he say atomic level?

255
01:09:07,264 --> 01:09:09,432
Kayla: Like, that was one of the phrases is like, you're. Yeah. On the atomic.

256
01:09:09,456 --> 01:09:11,319
Chris: Okay, well, I don't have a scanning electron microscope.

257
01:09:11,359 --> 01:09:15,779
Kayla: I'm just saying you're not dealing with the tree's atoms. You're dealing with a picture.

258
01:09:16,560 --> 01:09:18,424
Chris: I know, I'm being a dildo.

259
01:09:18,471 --> 01:09:19,040
Kayla: Go.

260
01:09:19,200 --> 01:10:09,118
Chris: That's what I do. No, but, yeah, I just. I thought that was a really interesting way of putting it, because, yeah, like, once you get completely unmoored from reality sometimes, when you're spending so much time online in ways that we probably don't even, like, consciously appreciate all the time. If you're only looking at symbols, right? If you divorce all meaning from those symbols, then they become jumbled up. And I think that is also what happens to a lot of conspiracy theorists when that happens. One other thing that we talked about, actually, we just mentioned this a second ago, but deplatforming from social media. And Travis said that there's this consequentialist viewpoint where, yeah, deplatforming works, and it removes power from these Nazis, which is thumbs up. Great.

261
01:10:09,214 --> 01:10:23,790
Chris: But on the other hand, we also don't really think that Twitter should be all-powerful, and Facebook should. I mean, we hate Facebook. Like, Facebook sucks. It's terrible. It sucks. So my question to you, Kayla, is deplatforming good or bad? Go.

262
01:10:24,370 --> 01:10:27,714
Kayla: That's not the proper question. You cannot ask what?

263
01:10:27,762 --> 01:10:28,418
Chris: No stack rank.

264
01:10:28,434 --> 01:10:55,600
Kayla: You cannot properly answer that question. And if I have to stack rank, you can't. It really depends, because if it's, like, Jack Dorsey becoming president and still running Twitter, deplatforming, good or bad? Pretty clear that's bad. Right, right. As Twitter is right now, deplatforming people promoting a harmful rhetoric? Good.

265
01:10:55,760 --> 01:10:57,104
Chris: Yeah, yeah.

266
01:10:57,192 --> 01:11:18,470
Kayla: With a big old asterisk. Because it's like, I don't know if good and bad are appropriate terms here. It's, are there people who are utilizing, capitalizing on, manipulating, benefiting from, leveraging the promotion of conspiracy theory in a way that is harming people left, right, and center?

267
01:11:18,590 --> 01:11:19,534
Chris: Yeah, that's.

268
01:11:19,582 --> 01:12:02,056
Kayla: That's where it becomes important. That's where it becomes like, this is for the public. This is for the greater public good. This is a public health issue. This is not just, like, this is good, or you shouldn't say that. And I don't know who said it, but I saw somebody on Twitter, of course. I thought we were getting off Twitter. I know. Explaining how the deplatforming situation becomes much more critical when we're talking about people with a lot of followers. So people like the Milo Yiannopouloses or the Laura Loomers, people that are far more visible promoting these conspiracy theories, versus, like, you know, first name bunch of numbers accounts that are promoting the conspiracy theories. It's like, it's kind of like how.

269
01:12:02,168 --> 01:12:06,376
Chris: By first name, bunch of numbers, she means, like, accounts with no followers, basically.

270
01:12:06,448 --> 01:12:19,464
Kayla: Yeah. And I think that it seems, from what I understand, and I could be wrong, but it seems like the bulk of the, quote unquote, deplatforming focus has been on those accounts with more reach.

271
01:12:19,552 --> 01:12:27,806
Chris: Yeah. And I think we try to differentiate when we can between like, influencers and followers. But, you know, that gets. That gets messy because again, chain of victims.

272
01:12:27,838 --> 01:12:29,050
Kayla: Chain of victims.

273
01:12:29,510 --> 01:12:55,144
Chris: But yeah, that's. I mean, we're still today dealing with the consequences of the printing press. So it's gonna be a while before we figure out this whole deplatforming thing, I suspect. For me personally, as with many things, I bring sort of, like, my game design lens and, you know, working at a game company that made a large multiplayer game online. I worked at Blizzard.

274
01:12:55,232 --> 01:12:56,976
Kayla: Large multiplayer game online.

275
01:12:57,088 --> 01:13:33,904
Chris: Yeah, that's what we call it. No, people know that I work at Blizzard. It's not a big deal. World of Warcraft without moderation would have been a completely unplayable cesspit of just absolute dog shit. Like, the amount of resources and work and effort and data science and smartness, AI, I mean, you name it, the things that we put into moderating that game, it's a lot. Okay. And even still, there are some areas of that game where, like, you can go and there's, like, a, you know, a common chat room that is still pretty unsavory even after all that.

276
01:13:33,952 --> 01:13:34,544
Kayla: Right.

277
01:13:34,712 --> 01:13:51,218
Chris: So anyway, all that is to say, I bring that perspective to it of like, public spaces need to be. Need to have some sort of moderation or else all of the poop rises to the top and everything becomes unusable for everyone.

278
01:13:51,314 --> 01:13:54,082
Kayla: Are you citing the tragedy of the commons, sir?

279
01:13:54,186 --> 01:14:01,594
Chris: I am, but I'm citing the tragedy of the commons to argue against freedom. How about that? No, no, but that's the thing, right?

280
01:14:01,762 --> 01:14:04,770
Kayla: It's the tragedy of the thing you're paying for.

281
01:14:04,890 --> 01:14:52,600
Chris: It is, though. It is the tragedy of the commons, because the tragedy of the commons is that if you don't have somebody curating a piece of property, then it will get trampled on. And the tragedy of the commons ends up being used to support private property, but really what it's talking about is curation; private property is one way to accomplish that. I'm going to curate something that I own because I'm incentivized to, because I own it. But that's not the only thing. That's not the only way the tragedy of the commons applies. So all I'm saying is that I see this definitely as a curation problem. However, where it gets complicated is that there's a pretty clear sense that World of Warcraft is a private, choice-based thing. I can engage with it or not. It's a game. It's entirely diversionary.

282
01:14:53,300 --> 01:15:00,114
Chris: Is Twitter that or is Twitter a public service? Is Twitter the electric company? Is Twitter the plumbing?

283
01:15:00,162 --> 01:15:10,178
Kayla: It gets hard when you're utilizing something that you don't pay for. And I will use this moment to remind everyone that the reason you don't pay for Twitter is because they are selling your data.

284
01:15:10,274 --> 01:15:14,738
Chris: Yeah, but at the same time, you can even say, like, there's. Our public spaces are curated, too.

285
01:15:14,794 --> 01:15:15,130
Kayla: Right?

286
01:15:15,210 --> 01:15:22,466
Chris: Right. Like, I can't just, like, walk into the middle of Times Square and, like, pop a squat and, like, take a big dump. Can I. Can I do that, actually?

287
01:15:22,498 --> 01:15:23,074
Kayla: I don't know.

288
01:15:23,162 --> 01:15:24,388
Chris: Maybe I can, actually.

289
01:15:24,554 --> 01:15:26,820
Kayla: It's also important. Define "can."

290
01:15:27,160 --> 01:15:29,584
Chris: What is "can"? Without consequences.

291
01:15:29,712 --> 01:15:34,792
Kayla: Without. I feel like, without consequences. What does that mean?

292
01:15:34,976 --> 01:15:56,360
Chris: Without consequences being enforced by the owning entity slash state. Yes. Because I'm not talking. I'm not talking about, like, hey, please don't pee there. Right? Like, that's a consequence. A consequence is somebody's gonna look at me funny. I may not care about that consequence, but I might care if, like, a police officer comes by and arrests me.

293
01:15:56,400 --> 01:15:57,328
Kayla: Right, so you mean punishment.

294
01:15:57,424 --> 01:16:05,216
Chris: I'm talking about punishment, like state powered punishment, or I'm talking about punishment from whatever entity owns whatever thing that I'm talking about.

295
01:16:05,288 --> 01:16:05,712
Kayla: Right.

296
01:16:05,816 --> 01:16:15,104
Chris: In the case of Twitter, that becomes complicated because those lines are blurred. Is it a. Is it a public thing? Is it a public good? Or is it a private space? Hard to say.

297
01:16:15,192 --> 01:16:15,896
Kayla: Well, it is a public.

298
01:16:15,928 --> 01:16:25,730
Chris: That being said, what I'm arguing is that even public spaces, even existing public spaces, there still is curation from. From the state entity.

299
01:16:25,850 --> 01:16:26,274
Travis View: Right.

300
01:16:26,362 --> 01:16:32,482
Chris: So. And that's something we accept, because otherwise, you know, there'd be poop and graffiti and blah, blah, blah, whatever. Everywhere.

301
01:16:32,546 --> 01:16:33,578
Kayla: Go graffiti, go. Do it.

302
01:16:33,594 --> 01:16:52,648
Chris: Okay, fine. Whatever you feel about graffiti. I love graffiti, too. Point is, we have accepted that there are certain, you know, appropriate levels of curation that we do, even with our public spaces. So if you say, like, oh, it should be completely unrestricted, even if Twitter was entirely a public space.

303
01:16:52,744 --> 01:16:54,896
Kayla: Anti loitering laws. Come on.

304
01:16:55,008 --> 01:17:06,544
Chris: Yeah. So that's just. I just want to say that I bring that curation lens to that whole argument. And now that I have completely solved that, let's talk about one more thing.

305
01:17:06,632 --> 01:17:07,300
Kayla: Yes.

306
01:17:07,840 --> 01:17:15,092
Chris: Which Travis talked a bit about, this weird thing where he sort of hopes for his own creative demise.

307
01:17:15,156 --> 01:17:15,840
Kayla: Yes.

308
01:17:16,140 --> 01:17:17,588
Chris: And I think we hope that, too.

309
01:17:17,684 --> 01:17:18,476
Kayla: Oh, man.

310
01:17:18,628 --> 01:17:20,108
Chris: But I don't think it'll ever happen.

311
01:17:20,164 --> 01:17:21,400
Kayla: No, of course not.

312
01:17:22,180 --> 01:17:47,678
Chris: Unfortunately, I think that their podcast and maybe ours will exist forever because all of the things that lead to QAnon are very human nature. Right? Like, he talked about a bunch of this stuff. Desire for secret knowledge. There's a desire for agency when you feel like you have none. There's a desire for community, a desire for hope when it feels like there is no hope. And those who gain power behind the.

313
01:17:47,694 --> 01:17:50,170
Kayla: Claim of also meaning.

314
01:17:50,710 --> 01:18:09,068
Chris: Yeah, the desire for meaning. And those who gain power behind the claim of protecting us seems like they're always going to erode our trust. Right? Like, I really want to trust the CDC and I want to trust big Pharma, but it's hard because they have demonstrated in the past that they can't be trusted in some way.

315
01:18:09,084 --> 01:18:10,436
Kayla: MKUltra, like we talked about.

316
01:18:10,508 --> 01:18:16,172
Chris: Yeah. So there will always be those things, and so there will always be conspiracies that are harmful.

317
01:18:16,236 --> 01:18:21,884
Kayla: Well, do you think that conspiracies always require an erosion of trust, or do you think it's possible?

318
01:18:21,972 --> 01:18:22,908
Chris: No, that was just one of the.

319
01:18:22,924 --> 01:18:25,124
Kayla: Elements, for them to exist without that.

320
01:18:25,252 --> 01:18:32,470
Chris: I think it definitely is. Because there's, again, the desire for secret knowledge, the desire for agency when you feel like you have none. Community, hope.

321
01:18:32,550 --> 01:18:59,194
Kayla: Well, and clearly, like, there's some, like we talked about in our QAnon episodes, the cyclical nature of the nocturnal ritual fantasy going back hundreds and hundreds of years, maybe more. It's like, clearly, I think Travis spoke about this, too. Like, clearly there is some. Our brains are doing this. Clearly there is some sort of, like, universal human experience that is doing this.

322
01:18:59,242 --> 01:19:29,194
Chris: Yeah, I think it's those things that we just listed. I was also talking to a very good friend of mine the other day about Cult or Just Weird and about QAnon Anonymous and extremism research in general. And he asked me a superb question that maybe also plays into the why will it always be around? I don't know, but I want to posit that question to you. Do you think that the whole, like, reporting, podcasting, research ecosystem that has cropped up around QAnon is in some way helping to perpetuate it?

323
01:19:29,242 --> 01:19:30,110
Kayla: Yes, ma'am.

324
01:19:30,770 --> 01:19:36,258
Chris: So should we stop doing this, exactly? Should we just turn this episode off and not publish it?

325
01:19:36,314 --> 01:20:03,828
Kayla: I mean, it's not like you and I didn't talk about that before we decided to do the topic on our show. We did talk about that, but the way we talked about it, it was less to do with this, like, is the reporting ecosystem perpetuating this, and more just the even more baseline question that we have talked about a lot with some of the topics we've covered on the show: does shining the light onto the thing amplify it or disinfect it?

326
01:20:03,964 --> 01:20:04,348
Chris: Yeah.

327
01:20:04,404 --> 01:20:24,110
Kayla: And I think that. I think that the answer is probably both. I think that. That, unfortunately, the more you talk about something, you probably are going to catch people that maybe didn't hear about it before that are going to be like, oh, that's interesting, and go and be funneled into that thing.

328
01:20:24,270 --> 01:20:25,010
Chris: Yeah.

329
01:20:25,550 --> 01:20:51,232
Kayla: But I also think that it isn't inherently wrong to cover that thing. And I think that another answer to this question we would have to do an entire episode on because I think that it would require several hours of conversation to really dissect the ways in which american media, in particular news media, has worsened or amplified.

330
01:20:51,416 --> 01:20:52,464
Chris: Absolutely.

331
01:20:52,632 --> 01:20:53,440
Kayla: Qanon.

332
01:20:53,560 --> 01:21:30,938
Chris: Yeah, worsened or amplified a lot of QAnon and related phenomena. I mean, that's been a huge problem. In fact, it was just recently a huge problem with the whole September 18 thing, where there was supposed to be all this stuff going down in DC, and then it got really amplified by sort of the media ecosystem. And then it turned out that it was, like, a whole lot of nothing. There weren't that many people there. And the only real effect that all of this media coverage had was to signal boost this guy who was, like, a nobody and had no power. Basically just free millions of dollars of advertising for this, like, Nazi guy.

333
01:21:30,994 --> 01:21:41,026
Kayla: And cause fear and anxiety in others. I was nervous. I was like, something could happen. I don't know. Clearly something can happen because we've seen something happen, so who's to say it can't happen again?

334
01:21:41,138 --> 01:22:19,158
Chris: Yeah. And I think the thing that also worries me about this type of stuff is specifically the way that the Internet surfaces the worst things from either side to the other side. And then that creates this sort of, like, a resonance chamber of, like, did you hear what this fill-in-the-blank worst person, Jack Posobiec, whatever, right? Did you hear what Jack Posobiec said about blah, blah? And then, like, everybody on the left is like, fuck that. People on the right are such jackasses. But really what they're thinking is, Jack Posobiec said something that was a total piece of shit.

335
01:22:19,214 --> 01:22:19,742
Kayla: Right?

336
01:22:19,886 --> 01:22:24,314
Chris: Now, granted, there's, like, you know, several dozen more Jack Posobiecs. He's not the only one.

337
01:22:24,362 --> 01:22:24,950
Kayla: Right?

338
01:22:25,290 --> 01:22:47,466
Chris: But is it the majority of the right? Similarly, is the majority of the left people, you know, like, I don't know, like Charlotte Clymer, or, like, people that are saying stupid things on the left? And that's the stuff that gets surfaced to the people on the right. So now you have everyone on the left thinking that everyone on the right is Jack Posobiec. You have everyone on the right thinking that everyone on the left is, probably pick a worse person than Charlotte Clymer, because she. She's not a Nazi, at least.

339
01:22:47,498 --> 01:22:50,858
Kayla: It's just hard, because there's not. There's not a left Jack Posobiec.

340
01:22:50,994 --> 01:22:58,470
Chris: I know, I know. But we also are a little bit blind to it. We're a little bit blind to it because that stuff doesn't get surfaced us. Right. Like, we.

341
01:22:58,930 --> 01:23:04,778
Kayla: I'm just saying there's no Q. There's no left QAnon. In the same way. QAnon is an alt-right phenomenon.

342
01:23:04,914 --> 01:23:07,554
Chris: It is. But there is BlueAnon stuff. I mean, there's people.

343
01:23:07,602 --> 01:23:08,490
Kayla: There is BlueAnon stuff.

344
01:23:08,530 --> 01:23:14,516
Chris: Yeah, there's. I mean, they didn't do an insurrection yet, but I could have said that about QAnon last year.

345
01:23:14,548 --> 01:23:15,476
Kayla: The two things aren't exactly.

346
01:23:15,508 --> 01:23:29,468
Chris: I'm not both sides. I'm not both sides in this. I'm just saying that there is an effect that the loudest, most enraging voices get amplified on social media, because that's how the algorithms are designed, and they.

347
01:23:29,484 --> 01:23:30,876
Kayla: Like to show you things that'll make you mad.

348
01:23:30,948 --> 01:23:43,868
Chris: Right. And then that creates this resonance effect where both sides just hate the other side so much. But even though, at least in this fake online world, on Twitter, which is part of why it's good to get off Twitter and touch grass, this is.

349
01:23:43,884 --> 01:23:48,540
Kayla: Why I've blocked everyone and everything. If it makes me mad, I block it.

350
01:23:48,620 --> 01:24:03,394
Chris: Yeah. And it just. I guess just. It's something we always just need to kind of be on the lookout for, because it's possible that we get caught up in that. Right. It's possible that we get caught up, and they're like, did you hear this crazy thing? I mean, that's kind of part of what our show is about, is, like, crazy stuff.

351
01:24:03,452 --> 01:24:03,950
Kayla: I know.

352
01:24:04,030 --> 01:24:09,654
Chris: So we just need to be careful that we're not like that. We're never saying that. Oh, yeah. This crazy thing blanket applies.

353
01:24:09,782 --> 01:24:10,450
Kayla: Right.

354
01:24:10,750 --> 01:24:50,536
Chris: You know, I don't know. I don't really have a concluding thought here, other than it's just important for folks like, especially folks like us, who are not like professional journalists, but really, for everybody to be careful about, like, what you consider to be true about the opposite side, what you're willing to. The information you're willing to share with each other about it, and fact checking things. Like, you know, we try to do that a lot where we don't. We don't want to say something to our audience unless we've, like, looked up the source and we've corroborated it with another source and we've confirmed that it really did happen. And then we also want to layer a shit ton of context on it.

355
01:24:50,568 --> 01:24:51,140
Kayla: Right.

356
01:24:51,720 --> 01:24:54,820
Chris: Anyway. Yeah.

357
01:24:55,450 --> 01:24:56,670
Kayla: I don't know.

358
01:24:58,330 --> 01:25:00,338
Chris: I don't know. The end.

359
01:25:00,394 --> 01:25:18,154
Kayla: The end. No, those are all good points. Yes, correct. Good. Also, I think where it gets difficult is that. I mean, it's all difficult. You also can't not talk about this stuff. Right. There was a time not so long ago where you could.

360
01:25:18,202 --> 01:25:18,834
Chris: Sweet summer.

361
01:25:18,882 --> 01:25:26,612
Kayla: Not talk about QAnon. You have to talk about QAnon now. When you are talking about the political landscape of our country.

362
01:25:26,716 --> 01:25:28,356
Chris: People broke into the Capitol and killed people.

363
01:25:28,388 --> 01:25:43,116
Kayla: Yeah. You have to talk about QAnon now. So it gets hard to know, like, what exact. Who exactly should I be listening to on this? Who should I be blocking on Twitter? Who should I be following? What news reporter should I be following? What news reporter should I not be following? It becomes who's making it worse?

364
01:25:43,148 --> 01:25:43,764
Chris: Who's making it better.

365
01:25:43,812 --> 01:26:04,478
Kayla: Yeah. So it's hard to. It's hard to know when something is so serious that you must talk about it and when something is not yet at that critical threshold. And Travis talked about that when we were having our conversation with him. Like, what am I choosing to amplify? And what am I choosing to not put out. That's not mine. That's. I'm setting that down. I'm leaving that behind.

366
01:26:04,574 --> 01:26:04,894
Chris: Yeah.

367
01:26:04,942 --> 01:26:06,326
Kayla: I'm not choosing to amplify that.

368
01:26:06,398 --> 01:26:44,178
Chris: Yeah. And I liked his point about, because he is so, you know, eyeball deep in this stuff, because he is so tuned in to what's going on, he can kind of see things before they happen. Right. And then he has to make a decision. You know, do I. Do I surface this thing to my. To my audience? You know, do I communicate about this that I can see happening ahead of everybody else? Or is it better if I just, like, wait for it to become big enough that I feel like I'm definitely not platforming it? I don't know what the right choice is there. It's probably a challenging one to make.

369
01:26:44,274 --> 01:26:48,786
Kayla: I mean, Cassandra, tried to get everyone to believe her, and no one did.

370
01:26:48,858 --> 01:26:49,810
Chris: Poor, poor Cassandra.

371
01:26:49,850 --> 01:26:57,970
Kayla: And she died. So take what you will from that Greek tragedy. I do feel like we should perhaps move on now to our criteria.

372
01:26:58,050 --> 01:27:02,394
Chris: Ooh. Oh, but we didn't. We didn't do a cult this time because it was just QAnon. Update.

373
01:27:02,442 --> 01:27:07,860
Kayla: Yeah, we already decided on that way back in December of 2020, but.

374
01:27:08,400 --> 01:27:14,140
Chris: That's right. We did add two new criteria in season three. So maybe we should talk about those real quick.

375
01:27:14,760 --> 01:27:16,128
Kayla: I think it'll be a very fast conversation.

376
01:27:16,144 --> 01:27:17,180
Chris: It'll be real quick.

377
01:27:17,600 --> 01:27:18,700
Kayla: Yes and yes.

378
01:27:19,400 --> 01:27:23,704
Chris: I mean, look here. The first one is. We've already actually even mentioned it is chain of victims.

379
01:27:23,752 --> 01:27:30,466
Kayla: Yeah. It's more prevalent. It's more prominent in QAnon than, I think, almost anything else we talk about, except for maybe MLMs.

380
01:27:30,578 --> 01:27:36,370
Chris: Yeah. MLMs and QAnon are kind of like, what put this one on the map for us to say, like, this makes.

381
01:27:36,410 --> 01:27:40,810
Kayla: Something cult like, anything fundamental. And extremists is like, here you go.

382
01:27:40,850 --> 01:27:52,970
Chris: Well, it's the recruiting aspect of it. Right. And QAnon, like MLMs, has a big recruiting element of, like, wanting to red pill people, and there's red pilling guides, and there's, you know, there's this intent to wake up the sheep.

383
01:27:53,050 --> 01:28:11,020
Kayla: I don't want to say that's the only way for there to be chain of victims, because I think fundamentalist religions also have an intensely high level of chain of victims, even when they are non-recruiting, such as the Amish, which we talked about. That tends to be a chain of victims that's more like parent to child or older generation to younger generation.

384
01:28:11,140 --> 01:28:36,290
Chris: Right, right. With QAnon, though, it's definitely that recruiting aspect. So, obviously, chain of victims is high here, because QAnon is one of the things that led to that criteria being created in the first place. And then "are their beliefs dogmatic" is the other new criteria. Well, okay, hold on. This is different. Remember, this is different from anti-factuality. This is. This is not, do they believe crazy, dumb stuff?

385
01:28:36,330 --> 01:28:37,310
Kayla: This is maybe.

386
01:28:37,610 --> 01:28:43,274
Chris: Is there a sense that the inside of the bubble is right and the outside of the bubble is wrong? So, given that.

387
01:28:43,322 --> 01:28:47,482
Kayla: Yes, that's very high here. I'm sorry. I'm not trying to be. Be dismissive or insensitive.

388
01:28:47,506 --> 01:28:48,590
Chris: No, I think you're right.

389
01:28:49,010 --> 01:28:54,842
Kayla: It is very dogmatic. It is the kind of thing where if you deviate from the norm, it's like, what's wrong with you?

390
01:28:54,906 --> 01:29:00,730
Chris: That said, well, I've even seen, like, deviations internal to the community creating that sort of, like, you're-not-the-true-Scotsman thing.

391
01:29:00,810 --> 01:29:20,522
Kayla: That said, I do feel like since Q has absconded from the world, there is more room for questioning and still being a part of QAnon. There is a greater percentage of people who consider themselves QAnon but don't view Donald Trump as a savior after the election than there was before.

392
01:29:20,626 --> 01:29:20,938
Chris: Right.

393
01:29:20,994 --> 01:29:22,378
Kayla: So I think, like, when Alexander the.

394
01:29:22,394 --> 01:29:28,930
Chris: Great died and then all of the Hellenistic. I wouldn't use that comparison. Like, there's the Seleucids and the Ptolemies.

395
01:29:29,010 --> 01:29:41,804
Kayla: Get out of here. I'm just saying that I think it's an. It's an interesting place right now where it's, like, there's not. There's no one true QAnon. So no one TrueAnon. There's no one TrueAnon.

396
01:29:41,852 --> 01:29:43,636
Chris: See, it's real fun. It's fun to make.

397
01:29:43,708 --> 01:29:52,068
Kayla: What does dogma look like in this landscape? And I think while it is still very dogmatic, it does look different than pre election.

398
01:29:52,124 --> 01:30:08,894
Chris: I mean, I think it's contributed to maybe some fracturing. Right? Because once there's a deviation, rather than tolerate that deviation and allow it to be in the fold and know, work with it, instead there's like a. You're not. Again, you're not the true Scotsman. Get the hell out of here.

399
01:30:08,942 --> 01:30:09,342
Kayla: Right?

400
01:30:09,446 --> 01:30:23,526
Chris: And then. So that's why you get this fracturing of, like, different influencers kind of having their own little mini QAnons with different flavors that all kind of, like, internecine, hate each other. But anyway, so high and high. So it remains a cult.

401
01:30:23,598 --> 01:30:24,930
Kayla: It's still a cult. Friends.

402
01:30:26,830 --> 01:30:48,152
Chris: To the extent that is a real thing. Anyway, here at the end of our show, as we try to always say, don't smash that like button. Don't worry about subscribing or telling a friend. Just listen and enjoy the show. This is Chris, this is Kayla, and this has been Cult. Are you not.

403
01:30:48,296 --> 01:30:55,978
Kayla: You're not gonna do it? Or Just Weird. You used to say thanks. We said we should only have one person say the whole thing. Oh, and then we changed it back.

404
01:30:56,114 --> 01:30:57,330
Chris: I feel like we always just wing it.

Travis View

Co-host, QAnon Anonymous pod