Transcript
1
00:00:01,200 --> 00:00:25,890
Mike Caulfield: If we show people, hey, here's how you recognize the fundamental parts of a conspiracist story and break it down. Can we engage people in that way and get them to utilize their pattern recognition and understand the narrative structure of these things so that we're taking that. We're taking that compulsion that has so often been used for ill, and we're using it for good. So they're a little more resilient against this stuff.
2
00:00:49,570 --> 00:00:56,642
Chris: And in that case, I am going to start recording season four of Cult or Just Weird.
3
00:00:56,706 --> 00:00:57,390
Kayla: Whoa.
4
00:00:58,060 --> 00:01:02,276
Chris: Yeah, it's pretty wild. The end.
5
00:01:02,348 --> 00:01:04,164
Kayla: Oh, man. How many episodes have we done?
6
00:01:04,292 --> 00:01:14,556
Chris: We have done. Hold on. Let me go back to the script. Okay. I actually didn't. I didn't write it in the script. I just didn't want to have the tap noise when we're actually recording.
7
00:01:14,588 --> 00:01:16,204
Kayla: We got new microphones. I'm gonna pick them up.
8
00:01:16,292 --> 00:01:26,370
Chris: Yeah, well, okay. So we should probably say that. I mean, I'm sure you've noticed by now the dulcet tones of our voices are even more dulcet and even more tonal.
9
00:01:26,710 --> 00:01:27,374
Kayla: Beautiful.
10
00:01:27,462 --> 00:01:27,862
Chris: Yeah.
11
00:01:27,926 --> 00:01:30,006
Kayla: Welcome to season four of Cult or Just Weird.
12
00:01:30,078 --> 00:01:30,238
Kayla: Yeah.
13
00:01:30,254 --> 00:01:31,566
Chris: No, we did get new equipment, though, I hope.
14
00:01:31,598 --> 00:01:32,214
Kayla: New and improved.
15
00:01:32,262 --> 00:01:33,070
Chris: It does sound nice.
16
00:01:33,110 --> 00:01:40,814
Kayla: New microphones. We have a new sound mixer. We have a new setup. We're here to bring you the best. And honestly, we're really glad to be back.
17
00:01:40,942 --> 00:01:48,078
Chris: And if only 1% of our listeners actually even notice that it sounds better, then that's all right by me.
18
00:01:48,174 --> 00:02:02,256
Kayla: It's like that story that my 6th grade teacher read to us about the guy throwing starfish back from the beach, and another guy went up and went, you can't possibly save them all. Why are you throwing the starfish back in the water? You can't save them all.
19
00:02:02,328 --> 00:02:24,844
Chris: It's because Jesus was carrying him that whole time. Yep. That's a great story. I know, but, yeah. Welcome to season four. Thank you for being with us on this long ass journey. That implies we have done 60 episodes, unless you count the bonus episodes we've done for Patreon, like the bonus episode we did for our Patreon regulars at the end of last season on LuLaRoe.
20
00:02:24,972 --> 00:02:26,692
Kayla: Oh, yeah, I forgot about that.
21
00:02:26,756 --> 00:02:31,260
Chris: Yeah. There's been a couple extras, or actually, it's more than a couple extras now. After three years.
22
00:02:31,300 --> 00:02:31,852
Kayla: Yeah.
23
00:02:31,996 --> 00:02:45,108
Chris: So it's been a lot. It's a lot of content. It's way too much content, actually. That's one of the things that I wanted to maybe mention in our intro today. Actually, first let's do our creds. Right. So I'm Chris. I am a game designer and data scientist.
24
00:02:45,204 --> 00:02:47,492
Kayla: I'm Kayla. And I am a television writer.
25
00:02:47,636 --> 00:02:49,996
Chris: And we are both interested in cults.
26
00:02:50,028 --> 00:02:53,164
Kayla: And podcast hosts and things that are just weird.
27
00:02:53,252 --> 00:03:02,932
Chris: Yes. So, as I just mentioned, one of the things this season is to maybe go a little bit lighter on the content. We respect your time.
28
00:03:02,956 --> 00:03:04,276
Kayla: We have a panoply of content up at YouTube.
29
00:03:04,308 --> 00:03:05,440
Chris: We've done too much.
30
00:03:06,420 --> 00:03:10,780
Kayla: We should not be forcing you to sit down and listen to four-hour episodes every other week, which.
31
00:03:10,820 --> 00:03:17,980
Chris: Our first two episodes of last season, the first one clocked in at like an hour and a half, which was fine. And then the next one was like over 3 hours.
32
00:03:18,060 --> 00:03:18,450
Kayla: Right.
33
00:03:18,540 --> 00:03:24,890
Chris: Cause it was on anti-vax. And I just could not figure out what to cut. Cause I'm just. I'm a bad producer, basically.
34
00:03:25,230 --> 00:03:26,798
Kayla: That's what I say about you all the time.
35
00:03:26,894 --> 00:03:36,126
Chris: So, anyway, we're gonna try to do better about that this season. This episode might be a little bit longer for this season. I anticipate this one's probably gonna be like an hour and a half.
36
00:03:36,198 --> 00:03:36,622
Kayla: Okay.
37
00:03:36,686 --> 00:03:57,342
Chris: But I think, ideally, that should probably be as long as they get for this season. I think that'll be, like, on the upper end of length for the season. And it's for our own mental health, our own sort of sanity. And, you know, because there's a lot of content out there, we shouldn't expect our listeners to be listening to us yammer on for 3 hours. We should try to be a little more succinct if we can.
38
00:03:57,406 --> 00:04:17,853
Kayla: Right. If you listened to our conclusion of season three, you'll remember that this season we're trying to be a little gentler, focusing our exploration a little more on how we can take care of ourselves, how you can take care of yourselves, how we can all take care of each other, and how we can survive living in this world that is basically one gigantic cult at this point.
39
00:04:18,022 --> 00:05:01,410
Chris: So, yeah, the shorter episodes are sort of like maybe us, like, leading by example a little bit as well. I mean, it's good for us, but it's also like, hey, also take care of yourselves. And maybe think about when you should cut back or say no to things, or only produce an hour of content when you could be producing 7 hours of content. I don't know how many people that'll apply to, but in general, I think that's sort of a lesson that we're maybe hoping to get across: hey, take time for yourself. Actually, the other thing that we're going to be doing this season is trying to highlight some organizations that we like and think are good, rather than just always talking about the shitty exploitative ones all the time.
40
00:05:02,470 --> 00:05:06,490
Chris: And that's sort of the nature of talking about cults.
41
00:05:06,830 --> 00:05:24,354
Kayla: Let's see if we're able to stick to that is the problem. I have so many episodes I want to do on nice things and organizations that seem like a cult but are actually helpful. Or here's the anti cult. But then I also have so many things where I'm like, have you heard of this horrible thing? So, okay, that may happen.
42
00:05:24,442 --> 00:06:05,010
Chris: Yeah. Okay, we're not gonna go. We're not gonna pivot, like, whole hog. I mean, we have a lot of listeners that like to hear about crazy cult shit, and we like to talk about crazy cult shit. So it's not going away, but I think we're just going to try to emphasize more, so, like, the good example would be, like, last season when we talked to Travis View from QAnon Anonymous. Right. That wasn't about a cult. It was just us talking to a helper. Or when we talked to Nikita Zimov from Pleistocene Park. Right. He's doing climate science. That's like, we sort of gimmicked it into talking about how it could be accomplished. But the bottom line is, like, we wanted to highlight that as a good thing somebody's doing instead of just always talking about all the shit that's flowing downhill.
43
00:06:05,350 --> 00:06:10,094
Chris: Kayla, do you have any other business? Like, what have we been up to in the off season?
44
00:06:10,182 --> 00:06:28,040
Kayla: My business is that I'm sick while we're recording this. So if you think my voice sounds different, it's just that I'm sick. And the only reason I say that is because I love to go. No, I love when I'm watching a sitcom and I go, like, they're sick right now. You can tell. You can tell in their voice. That's my favorite thing. So if you're sitting at home going, she's sick, it's cause I am.
45
00:06:28,160 --> 00:06:32,912
Chris: Okay, so you've confirmed that for all the people that like to play that little mini game. Yeah.
46
00:06:33,016 --> 00:06:36,664
Kayla: But in terms of other business, nah, man, we're hanging in there.
47
00:06:36,712 --> 00:06:38,536
Chris: Wanna talk about the show you've been writing for?
48
00:06:38,608 --> 00:06:43,966
Kayla: I hope that you're all hanging in there. Yeah, I'm on a new show called Hightown. Watch it. It's great.
49
00:06:44,088 --> 00:06:44,810
Chris: It's pretty good.
50
00:06:44,890 --> 00:06:50,002
Kayla: It's a hardcore, dark, gritty crime drama. Enjoy.
51
00:06:50,186 --> 00:06:51,710
Chris: I've been working on a game.
52
00:06:52,130 --> 00:06:53,162
Kayla: You have been working on it?
53
00:06:53,186 --> 00:07:07,666
Chris: I mean, that's actually relevant, because I find the art that I have been. So I've been doing some dabbling in some digital art for said game, and I have found that to be, personally, pretty relaxing and therapeutic and fulfilling.
54
00:07:07,858 --> 00:07:10,058
Kayla: Hey, you have a new hobby. You have a whole new hobby.
55
00:07:10,074 --> 00:07:11,610
Chris: That's right. That's right.
56
00:07:11,730 --> 00:07:13,338
Kayla: I started going to the library, and.
57
00:07:13,354 --> 00:07:22,378
Chris: I am totally, like, if this game ever actually becomes something that we can publish to the public, I'm absolutely using this podcast to sell the shit out of it, you guys.
58
00:07:22,434 --> 00:07:23,442
Kayla: Why else do you have a podcast?
59
00:07:23,506 --> 00:07:31,458
Chris: Yeah, this is all for my personal gain. All right, but maybe if we want to have reasonable length episodes, we should get to.
60
00:07:31,474 --> 00:07:34,522
Kayla: Let's do the show. Let's do it. I want to hear what you have to say this episode.
61
00:07:34,626 --> 00:07:35,322
Chris: Hey, Kayla.
62
00:07:35,386 --> 00:07:36,070
Kayla: Yes?
63
00:07:36,580 --> 00:07:40,000
Chris: Have you ever been to the Internet?
64
00:07:40,460 --> 00:07:41,188
Kayla: The what?
65
00:07:41,284 --> 00:07:43,116
Chris: The Internet. Have you been to the Internet?
66
00:07:43,148 --> 00:07:45,868
Kayla: I don't know what you're talking about. Never heard of it.
67
00:07:45,964 --> 00:07:47,396
Chris: You've been to the Internet?
68
00:07:47,588 --> 00:07:52,476
Kayla: I am never not on the Internet. I grew up in the Internet.
69
00:07:52,628 --> 00:07:57,760
Chris: What's it like over there? Is there enough content?
70
00:07:58,540 --> 00:08:19,524
Kayla: I was just thinking about this the other day, and I'm not the only person to think about this, but, like, there's so much content and so many fewer places to go on the Internet than there used to be. There used to be content that is pretty weird, but you'd have to go to, like, different websites, and now it's just like, it's on TikTok. Everything's on TikTok. All the memes are on TikTok. It's all on TikTok. Yeah, you go on Twitter, you see.
71
00:08:19,532 --> 00:08:24,332
Chris: The TikToks, and then you go on YouTube, and it's collated all the TikToks into a compilation of TikToks.
72
00:08:24,396 --> 00:08:27,388
Kayla: You used to have to go to different people's blogs and stuff.
73
00:08:27,444 --> 00:08:30,356
Chris: Yeah, I know. Oh, I actually went to some blogs for this episode.
74
00:08:30,388 --> 00:08:30,964
Kayla: Oh, my God.
75
00:08:31,012 --> 00:08:36,412
Chris: I know. I know. But, yeah, as Bo Burnham says, could I interest you in everything, all of the time?
76
00:08:36,476 --> 00:08:39,155
Kayla: Oh, God, tattoo that upon my soul.
77
00:08:39,227 --> 00:08:43,804
Chris: Do you ever feel like you struggle with the constant stream of information?
78
00:08:43,972 --> 00:08:47,756
Kayla: Every day of my life.
79
00:08:47,788 --> 00:08:58,484
Chris: 80% of which is probably some degree of garbage, but it's really hard to tell which 80%. Do you ever feel overwhelmed when you log onto Twitter or open TikTok or scroll Instagram?
80
00:08:58,652 --> 00:09:06,280
Kayla: Oh, yes. I went and got my blood drawn today, and I spent five minutes with the phlebotomist, and this is what we talked about.
81
00:09:06,620 --> 00:09:08,820
Chris: You talked to the phlebotomist about how you're.
82
00:09:08,860 --> 00:09:09,732
Kayla: She brought it up.
83
00:09:09,836 --> 00:09:11,132
Chris: Yeah. Well, I mean. Cause we're all.
84
00:09:11,156 --> 00:09:13,596
Kayla: She was like, I can't go on Facebook anymore. And I was like, yeah, I know.
85
00:09:13,708 --> 00:09:18,116
Chris: We're all facing this issue. Like, this is. This is the issue of our time.
86
00:09:18,188 --> 00:09:20,044
Kayla: It really is. It really is.
87
00:09:20,092 --> 00:09:23,492
Chris: Actually, my next line here was, Kayla, you are not alone.
88
00:09:23,596 --> 00:09:24,228
Kayla: Oh, God.
89
00:09:24,324 --> 00:09:25,042
Chris: I. I think that's.
90
00:09:25,066 --> 00:09:26,138
Kayla: Is there a support group for this?
91
00:09:26,194 --> 00:09:28,874
Chris: Pretty clear. Because you talked to your phlebotomist about it.
92
00:09:28,922 --> 00:09:36,274
Kayla: And she. I didn't say anything. I just walked in and she brought it up while she had my arm, and she was like, I really gotta get this off my chest, I guess. I don't know.
93
00:09:36,322 --> 00:09:36,794
Chris: Really? I love it.
94
00:09:36,802 --> 00:09:46,546
Kayla: We're just talking about, like, is it busy here on the weekends? No, it's busy on Monday. And then all of a sudden, she was talking about how she watches the news too much and she can't go on Facebook anymore. Cause it's overwhelming.
95
00:09:46,618 --> 00:10:02,970
Chris: Don't we all? So if I may indulge us in a snippet from our last season's interview with Mister Matt Remski again, and I know we already did this with our finale, but this particular slice of audio has just really stuck with me and influenced you and me quite a bit.
96
00:10:04,350 --> 00:11:09,920
Matthew Remski: The notion that as a journalist, you can bring somebody who's not doing something obviously illegal to account, that you can bring shame to them, that you can be so thorough with your deconstruction of Mikki Willis and the pseudo-documentary that he'll never make another one. That's just completely. It's so naive. It's, like, almost pathologically naive. It's wishing for a world that just does not exist. And you want it to. And you want the world to exist because somehow you're caught on this guinea pig wheel of thinking that you're doing something positive, that you're in a ground war, that you're winning. And I just don't. I don't. I don't think that's. I don't think that's realistic. It's not just that it gives these people oxygen. It's that it. It poisons the soul, really.
97
00:11:12,500 --> 00:11:56,004
Chris: So people who've been listening since last season will. Will definitely remember that clip. But the basic idea here is that attempting to debunk and engage with everything everywhere, all of the time will indeed poison the soul. And I'll also point out, to some degree, the deluge of misinformation is a strategy employed by parties that don't have everyone's best interests at heart. Now, this episode isn't about the GRU or documented Russian state-sponsored disinfo campaigns. The scope there is way too big, and I promised shorter episodes. But if you're interested, you can go check out, like, things the RAND Corporation, for example, has written on this topic. But anyway, I will bring up a quote from everyone's favorite White House chief strategist, Mister Bannon.
98
00:11:56,092 --> 00:11:57,668
Kayla: I don't want to hear anything from.
99
00:11:57,724 --> 00:11:59,044
Chris: I know you forgot he existed.
100
00:11:59,092 --> 00:12:00,120
Kayla: No, I didn't.
101
00:12:00,580 --> 00:12:31,122
Chris: Quote: The Democrats don't matter. The real opposition is the media, and the way to deal with them is to flood the zone with shit. End quote. So that's like a. You might remember that quote from back in the day. Yes, that's the flood the zone with shit quote. Everybody remembers that quote. And as smarter people than me have written, basically what he's saying is the goal is not persuasion anymore. The goal of information and disinformation and propaganda is not persuasion. The goal is to overwhelm and disorient.
102
00:12:31,266 --> 00:12:34,306
Kayla: Yep. I feel overwhelmed and disoriented all of the time.
103
00:12:34,378 --> 00:13:08,508
Chris: Well, that's partially intentional. As Sean Illing writes in Vox, quote, this idea isn't new, but Bannon articulated it about as well as anyone can. The press, ideally, should sift fact from fiction and give the public information it needs to make enlightened political choices. If you short-circuit that process by saturating the ecosystem with misinformation and overwhelm the media's ability to mediate, then you can disrupt the democratic process. Yeah. So, pretty big problem. Yeah, pretty intimidating.
104
00:13:08,684 --> 00:13:21,692
Kayla: Well. Cause how do you fight back against something like that? How is any individual, how is any institution, how is any arm of government, how is any group, how is anyone supposed to fight back against that? Like, literally.
105
00:13:21,756 --> 00:13:22,388
Chris: Kayla, I.
106
00:13:22,404 --> 00:13:23,708
Kayla: Am I setting you up?
107
00:13:23,764 --> 00:13:24,924
Chris: Glad you asked.
108
00:13:25,052 --> 00:13:36,328
Kayla: That was not intentional. I literally. That is my concern. And I'm so glad that you're gonna give me a straight, clear-cut answer. Right. It's gonna be really easy on how we can. Okay. Of course not.
109
00:13:36,424 --> 00:13:44,224
Chris: It might be a little more clear-cut than you thought. Oh, certainly. The thing is, you need the right tool for the job.
110
00:13:44,312 --> 00:13:47,048
Kayla: Okay, so a hammer, kind of.
111
00:13:47,104 --> 00:13:48,968
Chris: Or an information hammer.
112
00:13:49,024 --> 00:13:49,780
Kayla: A wrench.
113
00:13:50,400 --> 00:14:27,430
Chris: I mean, when something weighs too much for you to lift on your own, what do you need? Leverage. Right. When there's too much information for you to consume, you need a tool that gives you the leverage that you need. So Mike Caulfield is a professor who is creating just these exact tools, ones that you can actually use in your own life. And he was kind enough to talk to us for today's episode of Cult or Just Weird. Thanks again so much for coming on the show, Mike. I really love your work, which we are going to talk about here, but if you could introduce yourself for our audience.
114
00:14:27,820 --> 00:15:12,504
Mike Caulfield: Yeah, sure. My name is Mike Caulfield. I'm a research scientist at the Center for an Informed Public at the University of Washington, and I work in the field of online media literacy. This is one space I work in, showing people how to navigate the web, and the other thing I do is rapid response. I actually track misinformation that emerges during emerging events. So misinformation that emerges in response to new things that happen. Ukraine is a current example, but my major focus right now is on election misinformation.
115
00:15:12,672 --> 00:15:54,360
Chris: Gotcha. Yeah, I knew, like, I'd read a bunch of your stuff. I didn't know the rapid response thing until just recently. I thought it was super, actually. I wrote all of these interview questions, and then I was like, man, I really want to ask him about the rapid response. That's like its own whole interview. I don't know, it just sounds fascinating to me, bottom line. But we'll get to that, as we say on the show. But, yeah, I wanted to bring you on here because I wanted to highlight your work as a helper, as a person who is on sort of the front lines of this research about disinformation and media literacy, and also give our audience some real, tangible strategies for navigating their own information spheres. So I really appreciate you being on the show.
116
00:15:54,860 --> 00:16:00,318
Chris: How long have you been working on this, I don't know, project, or how long have you been in this?
117
00:16:00,374 --> 00:16:53,780
Mike Caulfield: Yeah, so the online media literacy side, since about 2010. That actually began working at a small college as an instructional designer, helping faculty construct their courses. We had sort of some global, college-wide outcomes that we were trying to introduce into the curriculum, and one of them was civic digital literacies. And a piece of that civic digital literacies curriculum was critical consumption. How do you kind of sort truth from fiction, from everything in between, on the web? And that ended up being a bigger thing than we thought at the time. But that's really the start. 2010-ish.
118
00:16:53,920 --> 00:16:59,600
Chris: Okay, interesting. I mean, yeah, it certainly sounds like when you say it that way, it's like, oh, my God. Yeah, that's huge.
119
00:17:01,660 --> 00:17:46,862
Mike Caulfield: Well, at the time, actually, we found some stuff that we thought was interesting about it, and went out to try to get a lot of other people on board with teaching this stuff. And it was 2010, 2011. And at that time, we'd go out and say, hey, look, people are not as good at this as they really need to be. And people would say, well, actually, the big problem with the Internet is people are rude. It was 2010, 2011. Everyone was obsessed with civility, and they were like, well, really what we should be teaching is students how to be polite on the Internet. That's not really your big problem here. But, yeah, it took a while for people to recognize the importance of it.
120
00:17:46,966 --> 00:17:51,658
Chris: Yeah. Was that the era of, oh, if everybody's anonymous, then they can say mean things?
121
00:17:51,714 --> 00:18:14,596
Mike Caulfield: Yeah, yeah, exactly. Like the era of David Brooks editorial after David Brooks editorial on how the sort of unwashed hordes of the Internet were ruining civil discourse. And, yeah, I'm not discounting that entirely. It just turns out not to have been the larger problem.
122
00:18:14,738 --> 00:18:29,540
Chris: Yeah, yeah. The more pressing. Yeah. What got you interested in this field of work? Is it just sort of like a natural outgrowth of what you were doing with your instructional design? Or is it, have you always had an interest in this?
123
00:18:30,520 --> 00:19:34,730
Mike Caulfield: Well, I mean, two things kind of converged. One, I had in my past, I'd run a large online political community, and I had thought of that really as separate from my instructional design work. But as I was talking to other people, part of what I realized was this is where a lot of political debate is increasingly happening. And if we're teaching our students the skills they need to be citizens, we should be thinking about this as the medium and really looking at some of the unique challenges that political discourse, civic discourse, civic information seeking has on the web. And so, yeah, once those two things merged, it progressed from there. And then, as I said, our ability to really get people into it was relatively limited until 2016, when the sort of wave of what people then called fake news happened.
124
00:19:34,810 --> 00:19:41,728
Mike Caulfield: And then people were like, oh, yeah, people are really bad at this. We might want to do something about that.
125
00:19:41,834 --> 00:19:51,708
Chris: Yeah, it turns out there's a structural problem. So would you say, would you characterize that then as a sort of, like, you had like, this sort of like, hobby interest in something and then you had your professional interest?
126
00:19:51,764 --> 00:20:35,710
Mike Caulfield: Oh, yeah, yeah. That I knew from my hobby interest that this was where political discourse was headed. And then it's always the case in, like, a liberal arts university, whether it's a public liberal arts like I was at, or a private one, that part of the goal of that education is to prepare students to be scholars, professionals, and citizens. Right. Citizens is the third leg of the stool. But the way that we were thinking about citizenship was just not connected to the environment that citizenship was increasingly practiced in. And so, yeah, I'd say the combination of those two things definitely led me to this.
127
00:20:35,790 --> 00:21:05,072
Chris: Yeah, I love that framework, too. And I love the name of the University of Washington Center for an Informed Public, because it's. I do feel like we got to this point where it's like, college is for a job. You know, like, you graduate college and it's job placement, and it's, what is your resume? And I do feel like there's. There's a lot of space for us to go, you know, try to get back to this idea that we're, like, producing good citizens, that we are training people in things other than just, like, you know, can. Can you land that job at Goldman Sachs type of thing?
128
00:21:05,246 --> 00:21:15,388
Mike Caulfield: Yeah, exactly. I mean, the skills intersect, but, yeah, having a focus on all three of those domains, you know, scholar, professional, and citizen, is just, I think, key.
129
00:21:15,564 --> 00:21:39,632
Chris: Well, that's a perfect segue, because I kind of feel like I'm. I don't know, I'm like a virtual honorary student of yours because, like I said already, I liked your writings. But specifically, there's one thing I want to talk about, which is your. Your SIFT technique. You're kind of well known for it. It's an acronym: S-I-F-T. Can you talk a little bit about that for our audience? Like, what does it stand for and what do those words mean?
130
00:21:39,776 --> 00:22:31,650
Mike Caulfield: Yeah, sure. So, one of the things that we found in 2010, when we first started to look at how students do evaluating things on the web, was that students tended to. They tended to approach these problems the way people approach schoolwork, I guess. You know, given a website and asked, hey, is this reliable? They would look at the features of the page, the way they think: I just got to look deeply at the website in front of me and figure out, hey, is this reliable? Is this not? You know, are there spelling errors? That sort of thing. And they would not actually do what we later learned competent professionals do, which is say, before I even look at this thing, what do other people say about this website? What do other people say about this claim?
131
00:22:33,230 --> 00:23:15,248
Mike Caulfield: What we found, and this is fast-forwarding quite a bit to 2016, when I started talking to a researcher named Sam Wineburg, what we found was that the thing that really differentiated the competent people, the people that didn't get lost on the web versus the people that did, is that the first question of a person who had a web competency was not, is this true or false? It was just, do I even know what I'm looking at here? Right? Like, do I know what I'm looking at? Before I even get into this question of truth or falsity, am I under the correct impression about what I'm looking at? And also this assessment of, like, do I have the capability myself to evaluate this, or do I got to find somebody else?
132
00:23:15,384 --> 00:24:01,910
Mike Caulfield: And so SIFT encapsulates some of these information seeking behaviors that get people to go beyond the thing in front of them and try to tap into the sort of social network of knowledge of the web. The first one, S, is just stop. We ask people, when they look at something, to ask themselves, again, do I know what I'm looking at here? If there's something that they find particularly compelling about it, to make a note of that: why am I mad about this thing? If they don't know what they're looking at, if, hey, this arrived from a source I'm unfamiliar with, this is a claim I've never heard before, then we ask them to go to the next steps. The rest of the steps can be practiced in any sequence. The I in SIFT is to investigate the source.
133
00:24:01,950 --> 00:24:46,660
Mike Caulfield: And again, we're not talking about sort of a Pulitzer Prize winning investigation. We're just saying, you know, before you retweet that thing, maybe hover over that profile and see if this person is a policy expert or a stand-up comedian, because that matters. And maybe a deeper investigation is, if you're unfamiliar with an organization, maybe look up the Wikipedia page and just understand, is this a political advocacy organization? Is this a news organization? Again, when we're talking about these credibilities, a lot of people get very down into this. Well, how can you say one thing is more credible than another? We find that people are, like, sharing stuff from political advocacy organizations thinking that they're newspapers. That's the level of thing. And this, again, comes back to this idea of, do you even know what you're looking at?
134
00:24:46,700 --> 00:25:32,488
Mike Caulfield: Like, everybody wants to get to, like, this issue of what's more credible than what? What's more trustworthy? Like, do you know where to slow down? Do you know what you're looking at? The F is find other coverage, or find better coverage. And that's just, if you're looking at a claim, before you even look at someone's presentation of a claim, just figure out, like, what have people said about this before? This article that someone may have sent you, it may add important points to the conversation on the subject. But like any conversation, one of the metaphors we use is, if you imagine you walk into a party where people are having a very intense conversation about a subject, you don't just jump into that conversation and start saying, well, I think this.
135
00:25:32,544 --> 00:26:16,630
Mike Caulfield: Or you try to get a sense of, hey, what have people talked about so far? What is the conversation so far? You got to do that before you jump in. And so we ask people to find better coverage, see if there's some other sources that they can use to figure out what do people in the know think about this claim. And then the final thing is to trace claims, quotes, and media to the original context. This is kind of a smaller point, but we sometimes find that people will actually reject a lot of things because the way they found out about it was, you know, through someone that maybe didn't have expertise. But, you know, if you click through, you'll find, oh, actually this person is citing something very reputable. You just got to go a little further down that chain.
136
00:26:16,970 --> 00:27:01,150
Mike Caulfield: On the flip side, very often people will present a link to the New York Times as evidence of something that they're saying. And they'll say, even the New York Times says X. And then, you know, you kind of click through the link and you find that's not actually what the article says. Right. And so trace is really about getting to, you know, getting into the right place to do your sort of analysis. But that's it. You know: stop. Investigate the source. Just who is the source? Why might they be in a position to know more than the average bear? Find better coverage. Like, if this is the source that arrived on my doorstep about this issue, but if I was to say to myself, hey, what would be my first choice for a source? What might that look like?
137
00:27:01,190 --> 00:27:19,530
Mike Caulfield: Go see what that person says, and then just make sure you're at the right point for analysis. Trace claims, quotes, and media to the original context. And that also includes things like making sure, if you're looking at a video that's 10 seconds long, that maybe you get the minute or two around that to understand the context of it.
138
00:27:20,470 --> 00:28:03,464
Chris: Thank you so much for that full explanation. Part of what I really love about SIFT, actually, okay, so there's a few things, not to gush. One is that it doesn't say you need to go spend 2 hours deep diving into a specific thing. What it does is it gives sort of, like, a brief heuristic to just, like, you know, 80/20 rule, right? Like, filter out the most egregious stuff. Like, if you're stopping and investigating. I mean, even that right there is almost enough sometimes, right? You bring up the hover method, and I've read some of your work on that as well, how sometimes you can just hover over something and not even have to do anything else.
139
00:28:03,592 --> 00:28:04,640
Mike Caulfield: Yeah, absolutely.
140
00:28:04,720 --> 00:28:19,350
Chris: And that's. That's one thing I really like about it. And the other thing I really like is that the acronym itself is kind of descriptive. Right. It's like the whole point of it is literally to just sift the big grains out and just focus on the small grains kind of thing.
141
00:28:19,470 --> 00:29:11,900
Mike Caulfield: Yeah. And part of that is the scarcity in an information economy is not information, right? Information is the abundance. This goes back to some work by a guy named Herbert Simon in the seventies. So the abundance we have in the information age is information. Right? It creates a scarcity. The scarcity is actually your attention. Your attention is the most valuable thing. And one of the things that happens both by just the nature of the sort of Internet web structures we exist in, platforms we exist in, but also happens as a result of people deliberately trying to overload you, is that you get thrown all these things, just this endless stream of outrage, things to look at, evidence, things with circles around them and arrows.
142
00:29:12,360 --> 00:29:49,730
Mike Caulfield: And the sort of academic approach to that has been to tell students, oh, well, just apply deep attention to these things and figure them out. But what that actually does is it sort of exhausts your critical faculties, and what you really want to do is you want to. Your first step is just often figuring out, is this worth my attention? So that you save your attention and you save your focus and you save your critical capabilities for the things that are really worthy of it, rather than just getting exhausted chasing bullshit proposition after bullshit proposition.
143
00:29:50,790 --> 00:30:28,174
Chris: Yeah. I was going to ask you about this later in the episode, but actually, I think it's probably better here. I liked your article, which I'm going to link in the show notes, entitled Information Literacy for Mortals. And it was sort of one of those moments where I read the whole essay, and I was like, oh, man, that's crazy. I didn't even think about it that way. But then I thought about it for another minute and I was like, oh, yeah, of course. That's right. And it's basically what you're talking about now, right? That it's not strictly better to treat every incoming piece of information as, like, something you would study at a university. Right.
144
00:30:28,302 --> 00:30:37,038
Chris: It's less about, like, doing deep analysis on every single thing that comes into your field of view and more about what can I dismiss? Does that sound.
145
00:30:37,094 --> 00:31:10,552
Mike Caulfield: Yeah, yeah, and, yeah. So there's a bunch of pieces in that article, but yeah, one of them is that there's this assumption in the academy that, and I think in the, in the general culture as well, that the more effort you apply to a decision or you apply to sense making, the more informed you will be. Right. And it kind of makes an intuitive sense to people. You know, if you really care about something, you put more time, you put more effort into it.
146
00:31:10,616 --> 00:31:20,336
Chris: Like, if you spend one decision point on it and it yields this much benefit, then if I spend ten decision points on it, it should yield ten x, right? In theory, yeah.
147
00:31:20,528 --> 00:32:14,808
Mike Caulfield: Ten x, yeah, in theory. Right. But when you actually think about decisions you make and where you've gone wrong, very often, where you go most wrong in a decision is when you take too many variables into account. I remember going shopping, this is gonna give away my age, but shopping for a cassette deck and going to one of these stereo stores. This cassette deck had, like, auto reverse, and this one had, like, the newest Dolby, and this one. And all these different things had all these different characteristics. Right. And you try to take in all those characteristics. You say, I'm going to get as much information about each one of these as is possible. Well, what we know about decision making, consumer decision making, is that's likely to give you a much poorer result
148
00:32:14,994 --> 00:33:03,874
Mike Caulfield: than if you decide, before you go into the store, what are the top three things that are important to me? And then you just buy the lowest price deck that gives you those top three things. When people start to try to balance a whole bunch of different stuff and kind of hold it all in their mind, they tend to get overwhelmed and they lose sight of the most important thing versus the things that are maybe nice to have or interesting or a little bit weird, but not really a deal breaker. And so we do find, when students are looking at a simple question, sometimes they will get to a relatively good idea of what people in a position to know say, what the issues are, after maybe three or five minutes of searching and research.
149
00:33:04,042 --> 00:34:00,076
Mike Caulfield: But as they progress, as they get to that third page, fourth page of Google search results, they sometimes get a little more confused. And part of that is you do find, not always, but often, you do find the most important information about an issue relatively quickly in a search. And then as you get further down, further into the search, you start to find weird trivia. In 2003, one of the reporters at this publication was caught plagiarizing. Like, oh, that's interesting, 2003. You see something else that says one of the people that was on the board of directors is connected in this way to this thing over here. And your mind just has this tendency. I mean, we're sort of, we're pattern-seeking monkeys. And your mind has this tendency to say, well, that's got to fit somewhere in the analysis.
150
00:34:00,268 --> 00:34:39,699
Mike Caulfield: But fundamental things like, is this a news source? Like, do they fund reporters to do actual reporting? You know, those sorts of things have to be weighted much more. And so part of this idea with research, when we talk about wanting people to do research on topics, is that people can get very quickly overwhelmed in detail, and we want to make sure that people answer the most important questions upfront. Then, if they want to go deeper, great. But before you start your whole journey, get the most important questions down and then. And then go from there.
151
00:34:40,079 --> 00:35:17,950
Chris: I think part of what's interesting to me about this topic is that it's not just analysis paralysis, right? It's not just, oh, there's too many variables, and I'm not sure. It's actually that there's too many variables, and then that produces a worse decision result. Like, you're not paralyzed. You do make a decision. It's just bad. Right. And so I think that's part of what really interests me about it. And I like your analogy about, you know, using the idea of different variables, right, where it's like there's two or three that are actually material to the decision, and then there's some other ones that are, like, maybe some light context, maybe.
152
00:35:18,070 --> 00:36:12,610
Mike Caulfield: So thinking about the most important stuff upfront, and then the other stuff you find adds nuance, adds context to the thing. But understanding the difference between those two. Otherwise you get into this sort of cynicism, for example, with sources, where one wrong decision by one source automatically eliminates it. Well, they made a mistake in 2011. When what you really want to know is something in much broader strokes. Otherwise everything gets reduced to the same level of trust. I talk about this a little bit in an article I wrote on something called trust compression, which is the idea that one of the things we see in students is very often it's not that they're coming in and they're gullible, and it's not necessarily that they're coming in and they're cynical.
153
00:36:12,910 --> 00:36:42,730
Mike Caulfield: It's that you show them something from a newspaper that has an occasional clickbaity headline, and they'll say, oh, that's moderate to low trust. You show them something from a journal that publishes almost everything that's nonsense, and they'll say, oh, that's. That's moderate to low trust. And what we really want is the discernment there, right? These things are different, right? There's not sort of a calculus where you. You eliminate things on the basis of small details.
154
00:36:43,350 --> 00:37:20,594
Chris: Going back to SIFT. One of the things I also wanted to say about it is just how I can, like, personally identify with almost each step in that process. Like, just the stop thing. Like, I know I've felt that myself, too. Like, you know, as I consume media and Twitter and whatnot, you know, I will find myself sometimes clicking on something that I'm like, oh, wait, I shouldn't have clicked on that. Like, I have this moment that's, like, emotional almost, right? Like, my brain will kind of catch me being, like, enraged, you know? Like, one half of my brain will be like, hey, dude, you're just clicking on this because you're mad. Like, you're. You're part of the problem right now. So, like, I don't know.
155
00:37:20,602 --> 00:37:39,578
Chris: I just feel like I have some of those, you know. And the I part, like, you know, I do the hover thing, too, a lot. And a lot of times there'll be some claim that I'm like, that sounds bad, but also, I don't know if that sounds right. And then I'll look. It's like, oh, that's from Quillette or something like that. You know, like, okay.
156
00:37:39,634 --> 00:38:24,780
Mike Caulfield: Like, oh, yeah, yeah. So there's a couple of things that push us to do that. One is sort of our emotions. Emotion can sort of route around our defenses. Another thing that we find is just the sort of compulsion to share puts a lot of time pressure on us, right? So when we look at, why don't people check things? That, I think, drives some bad decision making, too, right. That, you know, the thing we know about rumor is it's about being the first person to share. So there is this sort of countdown that people often feel when they're looking at stuff. But you got to balance it, right? You got to balance it. You got to ask yourself, hey, is it really worth trashing my credibility to try to get this out, to try to share this a few minutes earlier?
157
00:38:24,820 --> 00:38:57,688
Mike Caulfield: And one of the other things we just ask people is, the stuff we show people, it takes like 90 seconds. And if you don't want to put in the 90 seconds for it, maybe it's actually not. If you don't want to put the 90 seconds into checking it, maybe it's not worth as much as you think. Right. Because if it was really some sort of groundbreaking revelation, it would be the sort of thing where you say, oh, I'll take 90 seconds for this. If it's like, oh, 90 seconds, I'm not going to bother with this, you probably can just let it slide, right?
158
00:38:57,744 --> 00:39:03,384
Chris: I mean, and we just spent ten minutes talking about how 90 seconds, more than 90 seconds may not even produce.
159
00:39:03,432 --> 00:39:06,256
Mike Caulfield: More than 90 seconds may not matter. Right? Yeah, exactly.
160
00:39:06,408 --> 00:39:47,360
Chris: The trade-off here isn't like, I need to spend 30 minutes researching this before I share it. The trade-off is, like, just the briefest amount of gut check, and if you can't do that, then you probably shouldn't share. Yeah. You also wrote a series of essays on the concept of tropes in our collective discourse. And part of why I really love those essays is they were also prescriptive. You talked about being able to quickly identify a trope as a superpower that fact checkers have, that actually maybe we can all develop. So can you talk a little bit about what you mean by tropes, why they're so powerful, and then how individuals can use this knowledge to their benefit?
161
00:39:48,780 --> 00:40:14,214
Mike Caulfield: Yeah. So tropes mean a number of things. I mean, a trope in some fields, like a trope could just be like a metaphor or a figure of speech. Right. When people talk about narrative tropes and character tropes, that's more of what we're talking about. And so if you are familiar with the site TV Tropes, they show a lot of the common narrative tropes, a lot of the common character tropes, you know.
162
00:40:14,222 --> 00:40:16,490
Chris: Know, so sadly, all too familiar.
163
00:40:16,830 --> 00:40:18,934
Mike Caulfield: Yeah. You can spend a lot of time on TV Tropes.
164
00:40:18,982 --> 00:40:19,350
Chris: Yeah.
165
00:40:19,430 --> 00:41:05,342
Mike Caulfield: So a trope on TV might be counting bullets, right? You're in some sort of firefight, and someone has to, like, figure out how many. How many bullets were fired here. Or a trope could be, like, one bullet left. Right. You know, the dilemma for the character: they only have one bullet left. How are they going to handle this situation? And the idea of a trope is you kind of see these from episode to episode. They're not perceived as being sort of unique, but you understand how it works. If you see a ticking time bomb counting down, like, no one has to turn to a character and say, oh, this is a bomb. When it reaches zero, it will explode. If I cut the wrong wire, it will explode.
166
00:41:05,506 --> 00:41:12,374
Mike Caulfield: It requires, you know, a good degree of care. Like, no one has to, you know. You know how the ticking time bomb works, right?
167
00:41:12,422 --> 00:41:17,974
Chris: Yeah. You know that there's going to be a cut-the-right-color-wire line coming up, right?
168
00:41:18,022 --> 00:42:06,086
Mike Caulfield: Right, exactly. So we see tropes, too, when we look at information. And just because something is a trope doesn't mean it's misinformation. But a lot of how we consume information is based on these tropes. So, for example, there's a trope, really simple thing, like the body double trope. And the way the body double trope works is this. Someone sees a picture of somebody, Hillary Clinton or Bill Gates or something like that, and then they take an old picture of that person or another picture, and they say, it doesn't look like Hillary Clinton. It doesn't look like Bill Gates. Why does this person have a body double? Right? And we're like, oh, body double. And then everybody kind of knows how to play the game. They go out and they find more pictures, and they start comparing the pictures, right?
169
00:42:06,118 --> 00:43:00,954
Mike Caulfield: And so tropes kind of tap into a couple things. One is that they signal to participants in this sort of participatory storytelling that people engage in the sorts of evidence that people might go out and find, like, let's find more pictures of Hillary or something. They also allow people to quickly process things. People immediately, when they hear body double, know a number of reasons why there might be a supposed body double out there. And like all tropes, there are a couple instances in history where body doubles were used, but they're relatively limited, and they generally don't look like sort of the massive conspiracy that people say with body doubles. Same with, like, a false flag attack, right? A false flag attack is a trope where an attack on something is claimed to have been.
170
00:43:01,122 --> 00:43:44,894
Mike Caulfield: The people that were attacked actually started the attack themselves. Right. To gain sympathy or to make a political point or to implement a policy. False flag attacks do exist, but they're relatively contained, they're relatively small. They don't tend to look like 9/11 conspiracy theories. Right. You know, to be like, oh, we had a false flag attack where explosives were set up in a building over time, and then to cover up the fact they were explosives, we flew multiple planes into the buildings. Like, that's not right. So part of the idea of tropes is to understand very quickly kind of the history of that class of claim. Right? So if you think about false flags, it's a good example.
171
00:43:45,092 --> 00:44:29,852
Mike Caulfield: We know some instances of historical false flags, and we know the stuff that has not turned out to be true, the sort of more expansive stuff. And understanding that you're looking at a false flag claim when someone says, oh, Sandy Hook, the Sandy Hook shooting was actually a false flag attack to try to create an environment where gun control was going to be proposed, or something like that. When someone proposes that, understanding, okay, I know the trope, I know the places where a false flag attack kind of sort of applies, and it looks like this. And here's Sandy Hook. Right? And what sort of class does that fall in?
172
00:44:29,876 --> 00:45:01,842
Mike Caulfield: And it becomes apparent that it's using all the same sort of devices and faulty logic and weird connections, and then sort of sub-tropes like crisis actors, that all the stuff over here is doing. And so understanding that, and understanding the sorts of things that people are going to bring up with these false claims. I think if, you know, okay, they're saying it's a false flag, and they're going to say crisis actors, and this is how those false claims work, they're more likely to be resilient against that misinformation because they know how it works. Right.
173
00:45:01,996 --> 00:45:04,862
Chris: Would you say it's like a pattern recognition sort of thing? Like once you.
174
00:45:04,966 --> 00:45:51,220
Mike Caulfield: Yeah, so here's the thing. I think the people that try to deceive us, and I think the people that are merely confused and trying to convert us, rely a lot on getting us to exercise our pattern recognition. Right? I mean, they try to activate our pattern recognition. They say, well, here's a weird detail of this, and here's a weird detail of this. And look at this. This person here was actually an actor on the site over here, right? And they're trying to get us to process all this stuff in this way that's not particularly productive, where every sort of detail, as we've discussed, becomes like another variable in this equation that just becomes sort of overwhelming. But pattern recognition is something that we just can't help doing as humans.
175
00:45:52,240 --> 00:46:41,066
Mike Caulfield: Apart from the fact that we do it to solve problems, we've just also evolved so that pattern recognition is pleasant. We like recognizing patterns; it's something that we enjoy. And so bad actors and the merely confused both make use of this pattern recognition desire. And one of the things that we're looking at is, can we sort of flip that around, right? If we show people, hey, here's how you recognize, like, the fundamental parts of a conspiracy story and break it down, can we engage people in that way and get them to utilize their pattern recognition and understand the narrative structure of these things? So that we're taking that compulsion that has so often been used for ill and we're using it for good, so they're a little more resilient against this stuff.
176
00:46:41,258 --> 00:47:24,864
Chris: That is absolutely my favorite part of it. And I think it sort of ties into my other favorite part of it, and maybe ties into the rest of the things we've talked about today: it takes one unit of effort to create and spread a conspiracy theory, but ten units of effort to do a meticulous debunking of it. There's this scale problem that is just, the more there is, the worse it is. But what I like about the idea of tropes and SIFT is that it sort of tips the scales back and allows you to say, okay, actually, because I recognize this pattern of crisis actor or what have you, now I can basically dismiss it without spending all of those ten units of effort in trying to understand it, because I understand the pieces.
177
00:47:24,912 --> 00:47:27,824
Chris: I've recognized the pattern.
178
00:47:27,872 --> 00:48:24,374
Mike Caulfield: I mean, get some, maybe you can get some pleasure and status out of it too. I mean, one of the things that a lot of conspiracy theory communities provide is if you discover the new piece of evidence, right. And you frame it in a way that plugs into the conspiracy and that's sort of accepted into the core of it, you get a little bit of a status boost from that, right. And that feels good. And I think we don't attend to that enough that so a people enjoy pattern recognition and conspiracy. You know, engaging conspiracy theories allows people to sort of attract. Right. And we don't really, we don't always come back and say we're going to fulfill that human need as well on this side. Like the fact checking side can often feel very dry, like, nope, that's not true. Right.
179
00:48:24,542 --> 00:49:02,370
Mike Caulfield: And we need to engage that. And the status piece, I think, is important too. Right. You know, I'd like to see a world where, you know, people could engage on this stuff, and if, instead of engaging in the theory as fact, be the person that says, oh, you know what this is? This is basically a modified wag-the-dog trope, except what they've done is they've replaced the war with this other camp, you know. And talk about it in that way, where they're engaging that analysis, they're getting maybe some status from being able to break it down, but they're not. They're not sort of deep in the bowels of it. They're maintaining their analytical faculties.
180
00:49:02,450 --> 00:49:51,464
Chris: That is such a good point. In the years of doing the show, yeah, I've encountered what you're talking about. Right. I've encountered the, like, you know, the fun cowboy, do whatever you want, move fast, break things conspiracy side. And then there's, like, the, you know, the correct but boring Vulcan side. But, like, actually, that doesn't need to be the case. Like, that's such a good point. The one thing I wanted to mention about tropes, because I think our audience, or at least long time listeners who have heard our QAnon episodes, might be familiar with, is blood libel. And it feels like that's a perfect example of that, where if you only hear about QAnon's beliefs and you say, like, what they believe, what, that the elites are drinking the blood of babies or, you know, what?
181
00:49:51,552 --> 00:50:21,008
Chris: Adrenochrome, whatever the specifics are there. Sounds weird. But then when you look at it as a trope, when you know that the trope exists, rather, because I think a lot of us don't. I didn't before we did the research. Then it's like, oh, there's actually, like, hundreds of years of history behind this trope, and it has shown up. It showed up in the eighties, and it showed up in this time, and it showed up in, like, the founding of the nation and the 1800s, and back and back and back. So I just bring that up as an example our audience might know.
182
00:50:21,064 --> 00:51:09,470
Mike Caulfield: I mean, the other piece of that is that, you know, blood libel, you know, for. Especially for sort of the tick tock conspiracy theorist, it's just kind of all sort of fun, you know, when you understand the trope of blood libel, you realize, no, it's not fun. That's been responsible for a lot of suffering and a lot of, you know, a lot. A lot of death, a lot of oppression. And so understanding that before you kind of put a toe into it and saying, oh, actually, you know, actually this trope plugs into some. Some rather disturbing stuff, and maybe we don't go down this. Maybe we don't go down this path. Right. So, yeah. Knowing the history, I think, can show you, hey, this is just sort of the retreads on old tires.
183
00:51:10,010 --> 00:51:31,016
Mike Caulfield: It's nothing new, but I think it also can help people understand. Sometimes people get into conspiracy theories and they feel sort of light and bubbly, and then they're confused how the conspiracy theory community kind of, like, slides down into, like, you know, white supremacy or these sorts of things.
184
00:51:31,048 --> 00:51:34,904
Chris: And it's like there's always antisemitism at the bottom for some reason.
185
00:51:34,992 --> 00:51:35,304
Mike Caulfield: Yeah.
186
00:51:35,352 --> 00:51:37,152
Chris: It always funnels into that.
187
00:51:37,336 --> 00:52:05,440
Mike Caulfield: And it's like, well, if you actually knew the history of the tropes that you were using from the beginning, this wouldn't be surprising to you, because all of these things were formed to tell these sorts of stories: these antisemitic stories, these colonialist stories. So, yeah, I think the history can show people that it's not as novel as they think. And then I think it can also clue people in to some of the damage that these things can cause.
188
00:52:05,600 --> 00:52:41,786
Chris: Yeah. And the solution to the scale problem is so great, so neat to me, because rather than having to debunk a thousand individual claims by yourself, a thousand individual things, and that's one of the things we've seen on the show, there's a never-ending flow of cults, cult-like groups, weird stuff out there. Rather than needing to debunk each one individually, you just have to notice that there's 500 of type X and another 500 of type Y. And I know all about type X and type Y, because those are tropes that I understand.
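[Editor's sketch of that scale argument, for readers who think in code: match each incoming claim to a known trope and reuse one refutation per trope, instead of writing a thousand individual debunks. The trope names, keywords, and claims below are invented for illustration, not drawn from the interview.]

# Hypothetical sketch: map claims to known tropes so one refutation
# covers many claims. All trope names and keywords are made up.
TROPES = {
    "body double": ["body double", "imposter", "not really her"],
    "soundstage": ["green screen", "studio lighting", "prop mark"],
}

def match_trope(claim):
    """Return the first trope whose keywords appear in the claim, else None."""
    text = claim.lower()
    for trope, keywords in TROPES.items():
        if any(k in text for k in keywords):
            return trope
    return None

claims = [
    "That's not really her, it's a body double!",
    "You can see the green screen glitch in the video.",
]
for claim in claims:
    print(claim, "->", match_trope(claim))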
189
00:52:41,978 --> 00:53:23,390
Mike Caulfield: Yeah. And, I mean, you can say, okay, this is the body double trope. And you don't have to say that that proves at, like, 99.999% certainty that this is false. You say, like, maybe 99%. You say, look, given the history of this trope, the burden of proof is on you. I'm not going to do the work here. This is a trope that almost always doesn't pan out. The configuration that you've presented this in looks like other things we've seen before. You can go back, you do the work. We're not going to sit here. You have to show why this time it's different.
190
00:53:23,690 --> 00:54:09,948
Mike Caulfield: This is one of the things we struggle with: because people present these claims as novel, because they don't show the fact that, oh, this is just the same thing recycled, they present them as new and therefore as requiring new refutations. But I think once you understand they're old, it's not to say that they're not true, but it is to say, you've got to show why all the old refutations don't apply here as well. Right? The burden is on you, and if you can do that, we'll listen. But very often the game with conspiracy theorists is that the burden of proof is on the person outside of the conspiracy theory. If they can't explain every single detail of the conspiracy theory, somehow they have failed to disprove it.
191
00:54:10,044 --> 00:54:45,690
Mike Caulfield: Whereas if the conspiracy theorist has 98% of their theory knocked down, well, they still have the 2% you haven't shown them. So for stuff that we've seen before, you've got to flip that script. I'm not saying that this is a way to get family members out of it. For that, talk to someone like David Neiwert. But I do think for the conspiracy-curious, for the people that aren't deep down the rabbit hole, I think saying, you know, this isn't new. This is Satanic Panic. This is Britney Spears is the Illuminati. This is pretty old stuff.
192
00:54:46,750 --> 00:55:02,890
Chris: So, talking directly to our audience here, what's a good recommendation? Learn the tropes? Is there a good resource for that? You can always tell them to listen to our podcast more. That's what I'm getting at here.
193
00:55:05,290 --> 00:56:04,806
Mike Caulfield: You can listen to the podcast more. Absolutely. One thing that I think is frustrating on the research side is you can actually learn more about conspiracist tropes right now on TV Tropes than you can in the academic literature. And that's not a great situation. But it's partially because, for a lot of years, conspiracy theory was not taken particularly seriously as an academic subject, whereas by people like TV writers, that was their bread and butter for a long time. Right. So I do think going through TV Tropes is probably actually a good first stop, just understanding how tropes work. I do hope that over time we'll start to identify some of the major tropes in various types of misinformation. But you could start with something like TV Tropes. You could listen to things like this podcast.
194
00:56:04,918 --> 00:56:52,572
Mike Caulfield: And I think it's important for podcasts like this and other podcasts to really tie these things together. When you're dealing with something that seems novel, if you tie it to the history, and I've listened to some of your podcasts, you do seem to do this, to tie it to previous sorts of instantiations. Because again, it's building that up and understanding that this is just old labels on new bottles of wine, or, I forget how that goes. This is just sort of retreads on these ideas. I think that's one of the more powerful cases to be made, and I don't think we make it enough. But, yeah, I hope to do more with the subject of tropes over time and try to figure out what are some of the core tropes that keep on getting used.
195
00:56:52,636 --> 00:57:40,650
Mike Caulfield: I'll give you an example of sort of a meta-trope, a pair of meta-tropes that's used a lot: what I call the Potemkin crisis and the Potemkin victory. The idea of a Potemkin crisis is somebody wants to get a policy solution or implement something or get some result. In order to do that, they have to create enemies. They launch some sort of false flag attack or something like that. So, Potemkin crisis: 9/11, Sandy Hook, these are sort of Potemkin crisis things. Potemkin victory is the flip side of that, right? Potemkin victory is something like the moon landing. We want something that makes us feel like we won. We're great. We're awesome. And so we stage something, right? We stage something to show people, yeah, everything's okay, and then we put that on the moon.
196
00:57:40,950 --> 00:58:30,740
Mike Caulfield: Once you start to think about these things as broad categories, you start to realize they all use the same pieces, right? The minute somebody sees something and it's a Potemkin crisis or Potemkin victory, somebody is going to propose that it was done on a soundstage, right? Whether it's the moon, whether it's Sandy Hook, whether it's the Gulf War in '91 and the accusations that CNN was filming it on a soundstage, someone's going to make the soundstage accusation, right? The minute it's a Potemkin crisis, someone's going to talk about the crisis actor thing, right? And so this is one of the ways that tropes kind of pull things together for people, both in processing things and in analyzing things: once you're sort of in a particular area, all these other things are sort of activated, and you just know they're going to show up.
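[Editor's sketch of that activation idea, assuming a hypothetical mapping from meta-trope to the recycled sub-claims that tend to follow it; the entries are illustrative, not research data.]

# Hypothetical sketch: tag a story with a meta-trope, then predict
# which recycled sub-claims are likely to show up next.
META_TROPES = {
    "potemkin crisis": ["soundstage accusation", "crisis actors", "false flag"],
    "potemkin victory": ["soundstage accusation", "prop marks", "lighting glitches"],
}

def expected_subclaims(meta_trope):
    """Return the sub-claims that tend to accompany this meta-trope."""
    return META_TROPES.get(meta_trope, [])

print(expected_subclaims("potemkin victory"))
# -> ['soundstage accusation', 'prop marks', 'lighting glitches']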
197
00:58:31,120 --> 00:59:22,330
Mike Caulfield: And so if you're prepared for that, you can do a better job. And so part of what I'd like to do is start mapping these out a little more fully and take a little bit of a cue there, maybe, from the TV Tropes site as well as from the academic research. And not just narrative tropes, but things like character tropes. So one of the character tropes in almost all conspiracies is the renegade. You get the person from the inside that's now on the outside. They're going to tell you the truth. And this has been noted since Richard Hofstadter, back in his '64 essay on the paranoid style in American politics. I mean, he noted that back then: conspiracies, they always have a renegade. There's always someone from the inside that's coming to the outside, and they've got to give you the truth.
198
00:59:23,590 --> 00:59:37,982
Mike Caulfield: So these sorts of things: mapping them out and understanding how they all relate, and hopefully helping people sort of learn how to read these things without having to believe them.
199
00:59:38,166 --> 00:59:45,022
Chris: Right. Well, that was actually my next question. Is there, like, a dictionary out there of these types of tropes, or is that what you're talking about?
200
00:59:45,086 --> 01:00:33,064
Mike Caulfield: That's what I'm talking about. I feel like that's what we need. A lot of it overlaps with the stuff that you find in other media, in books and other pieces of fiction. The tropes that are most useful to people in conspiracy communities are a certain subset, and they work a certain way: this Potemkin crisis or this Potemkin victory, and then we get the soundstage. Well, there's a number of things, like, you know how to look for evidence that something's on a soundstage, right? If there are little glitches in the video, you say, well, that's a green screen. So you go and you look for little glitches in the video. If there's sort of a flattening effect, you know, where sometimes cameras, for various reasons of lighting, seem to flatten distance, you say, oh, well, it's studio lighting.
201
01:00:33,112 --> 01:01:23,016
Mike Caulfield: And so these tropes sort of signal what people look for. And the ones that you find in conspiracy communities are not necessarily the full set of tropes that you would find in fiction, but they're very often a subset that is very honed for participation, so that people can go and find additional evidence. In the moon landing, the soundstage: people were looking at the different rocks, and they found a rock that supposedly had, like, a C on it. And they kind of made up this idea that, oh, well, that's a prop mark for a rock. It's like, of course, that's not how props work. They don't put a C on a visible prop. But yeah, part of the idea would be to pull from the stuff we know from fiction.
202
01:01:23,208 --> 01:01:38,544
Mike Caulfield: But one of the interesting pieces of that is how specific tropes are really profitable for finding attention or status, because they allow for the discovery of certain types of evidence.
203
01:01:38,712 --> 01:01:57,890
Chris: Right. It's like, for that clout currency. Yeah. It's interesting, the overlap you mentioned between sort of fiction tropes and these conspiracy tropes, how much overlap there is. Therefore, I think I'm probably good to go, because I've spent a lot of time stuck on TV Tropes. I think I'm pretty much set. But.
204
01:01:59,830 --> 01:02:00,766
Mike Caulfield: There's probably a lot of people.
205
01:02:00,798 --> 01:02:26,284
Chris: That are. There's so many links. It's just link after link, God damn it. But no, I think it's cool that that's actually one of the things you recommend. And it does make sense, because, like you said, yeah, there is an overlap there. Working in this space can be dismal. What is your personal mental health routine? Like, how do you maintain your own sort of sanity and health?
206
01:02:26,452 --> 01:03:10,858
Mike Caulfield: Oh, geez. I mean, I'm not sure I'm the best example, because I'm not as good about protecting it as I should be. But, you know, I think that the things that ground you are your personal relationships. If you're in this work, for example: my wife Nicole, she's an art teacher. She has nothing to do with any of this stuff. And for me, that actually is wonderful. I think having people outside the work that you care about, so that when you start talking about, oh, well, that's a Wag the Dog thing, they're like, I don't know any of the words that just came out of your mouth. I think that's actually really useful. It helps you keep perspective.
207
01:03:10,914 --> 01:03:21,110
Mike Caulfield: And so I would say that. I used to do musical composition, and I'm going to get back to it, and I used to write a bit more.
208
01:03:21,850 --> 01:03:36,282
Chris: Those are great tips. Recently, personally, I've started doing some digital art stuff. I found that really engaging and rewarding in, like, a positive way. Rather than scroll Twitter for two hours, I do that, and it's just way better.
209
01:03:36,426 --> 01:03:37,110
Mike Caulfield: Yeah.
210
01:03:37,890 --> 01:03:48,378
Chris: Is there anything that we didn't cover that you would like to say to our audience?
211
01:03:48,394 --> 01:04:34,308
Mike Caulfield: Spot, you know, I mean, one thing I would say is, I'm trying to tell everybody that I talk to and all the audiences I give presentations to, you know, if you care about misinformation, disinformation, I mean, there's a lot of focus on correction and things like that, and you should go out. If you feel like correcting people, you should do it. One of the other things that people can do if they feel helpless is just share. Good people don't think about this enough, but the people that share bad information are often highly motivated and share a lot. But what we find is this asymmetry is actually part of the problem, that people look at the good information, think, well, that's kind of what I expect. It's like, oh, well, actually, vaccines worked quite a bit. You know, like, that's not surprising.
212
01:04:34,364 --> 01:05:21,680
Mike Caulfield: Like, why would I share that? That's what everybody thinks, and they don't share it. And so then people are in these information environments that are very unbalanced. So if you want to do something simple, just commit to sharing a piece of reliable information a day, to your Facebook account with your parents or whatever, and skip the argument, but you're getting that stuff in there. I think that's underrated. And so SIFT is important. Think before you share. Do all this stuff before you share. But if you find something good and useful, please do share it, because that's the second half of this. We've got to not only try to reduce the flow of bad information, I think we've got to significantly increase the flow of good information, because a lot of people just aren't getting it.
213
01:05:24,150 --> 01:05:24,930
Kayla: Wow.
214
01:05:26,670 --> 01:05:29,590
Chris: No, that was one of my favorite interviews, actually, that I've done.
215
01:05:29,630 --> 01:05:30,558
Kayla: Oh, are we recording?
216
01:05:30,654 --> 01:05:35,430
Chris: Yeah, we are. No, we've been recording. I recorded the wow. The wow was funny.
217
01:05:35,470 --> 01:05:39,770
Kayla: Oh, no. We have to have a better way to come out of an interview.
218
01:05:40,070 --> 01:05:42,382
Chris: Well, I said it was one of my favorites.
219
01:05:42,526 --> 01:05:44,030
Kayla: That was a very good interview.
220
01:05:44,190 --> 01:05:45,710
Chris: Yeah. Like, because it was.
221
01:05:45,790 --> 01:05:47,830
Kayla: I'm very pleased.
222
01:05:47,950 --> 01:05:58,630
Chris: There was so much actionable stuff there. Right. It wasn't just, hey, this is what QAnon's like, and it sucks. Right? It was like, hey, everyone, there's some real.
223
01:05:58,970 --> 01:06:01,930
Kayla: Here's some things to do, tangible things to do. Oh, man.
224
01:06:02,090 --> 01:06:09,042
Chris: Some of the biggest things that sat with me were the fact that your attention is the scarce resource.
225
01:06:09,146 --> 01:06:09,706
Kayla: Yeah.
226
01:06:09,818 --> 01:06:14,186
Chris: Right. And using the idea of tropes, and using the SIFT technique.
227
01:06:14,338 --> 01:06:15,450
Kayla: Actually, let me stop you.
228
01:06:15,530 --> 01:06:15,922
Chris: Yes.
229
01:06:15,986 --> 01:06:22,208
Kayla: As we go into our analysis of said interview, would you mind reminding us what the SIFT technique is?
230
01:06:22,224 --> 01:06:33,704
Chris: What do you want me to sift out? What the SIFT. So S is stop, I is investigate the source. F is... F. No idea.
231
01:06:33,832 --> 01:06:35,808
Kayla: Shit. Didn't write this down?
232
01:06:35,984 --> 01:07:15,490
Chris: No, I had it in my head. F is find better or other coverage, and T is trace back to the original context. So, just real quick: the stop is almost the most important one, because that's the thing that catches. That's the conversation that you have in your own brain between your enraged emotional monkey brain and your higher cognitive thinking. And your higher cognitive thinking is the one saying, oh, stop for a second. I know it's gonna give you some endorphins to go see what stupid thing so-and-so has said this time. Just stop for a second. And then that's what gives you the opportunity to do the other three things.
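[Editor's sketch of the SIFT checklist as a small program, for readers who like their checklists explicit. The step names follow Caulfield's acronym; the prompt questions are paraphrases, not his exact wording, and the example claim is invented.]

# SIFT as data: Stop, Investigate the source, Find better coverage,
# Trace to the original context. Prompts are paraphrased.
SIFT = [
    ("Stop", "Pause. Do I even know what I'm looking at?"),
    ("Investigate the source", "Who is saying this, and why?"),
    ("Find better coverage", "Has a more reliable outlet covered this claim?"),
    ("Trace to the original context", "What did the original source actually say?"),
]

def run_sift(claim):
    """Print the four SIFT prompts for a given claim."""
    print(f"Claim: {claim}")
    for step, prompt in SIFT:
        print(f"  {step}: {prompt}")

run_sift("Celebrity X got sick right after getting the vaccine")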
233
01:07:15,950 --> 01:07:57,932
Kayla: I really respond to the SIFT technique, because you and I have had this conversation in our real lives about how I feel like I'm at this point in my life where I just don't accept any information that is presented to me as fact or something to absorb. And I have to do that in order to survive in the world. But it's not really a fun place to be, where I spend a lot of time on the Internet and I'm just kind of scrolling through stuff going, this is not information that I'm accepting. I cannot accept this into my being. And that's, like, a doomer place to be, as opposed to SIFT, which is a proactive place to be.
234
01:07:58,036 --> 01:08:19,658
Chris: Yeah. And I think in some cases it kind of doesn't matter. You know, it's more about the entertainment than it is about the information. Like, you see something like, hey, look, this dog is taking care of these little ducklings, and now they're a dog-and-duckling family. And it's kind of like, yeah, that's probably fake, you know, but I've watched Nathan for You.
235
01:08:19,714 --> 01:08:21,082
Kayla: I know these things.
236
01:08:21,225 --> 01:09:05,374
Chris: Right. But at the same time, with those, it's kind of like, I don't think you necessarily have to do the effort of, like, fuck that, that's fake. You can just kind of be like, you know what? Whatever. Dog and ducklings. That was neat. It's more entertainment than it is information. Right. But in some cases, the information, or the decision that you need to make about that information, does matter. And that's when you can use something like SIFT, where there's still a lot coming into your field of view, and it's the only way that you can manage to do anything about it. And certainly, like everything else, this is a problem that has been put onto us as individuals by lack of action from the systemic forces that are allowing it to happen.
237
01:09:05,502 --> 01:09:18,685
Chris: But since it's on us as individuals, this gives us a good technique for basically saying, how do I ignore the 80% of the stuff that I need to ignore? Because my attention is the scarce resource. I can just keep going.
238
01:09:18,758 --> 01:09:24,294
Kayla: No, I mean, I would like for you to keep going. I know that you took copious notes on this. I want to.
239
01:09:24,462 --> 01:09:25,765
Chris: Well, I mean, the other thing is.
240
01:09:25,837 --> 01:09:28,585
Kayla: Your face just dropped. You're like, I didn't do shit.
241
01:09:28,697 --> 01:09:43,216
Chris: No, I was dropping to look at the notes. Kayla, I have a note here to say, like, suck it on DISC, because I'm high D. And anybody that's an I, S, or C on the DISC test.
242
01:09:43,233 --> 01:09:44,296
Kayla: Do people know what the DISC test is?
243
01:09:44,353 --> 01:09:45,669
Chris: No, I don't think so.
244
01:09:46,529 --> 01:09:48,761
Kayla: What is the DISC test, and why are we talking about it?
245
01:09:48,785 --> 01:10:06,364
Chris: Just because. D. What? So anyone that's, like, worked in corporate America has probably done, like, a personality-test-type thing. And at Blizzard, we did this one called DISC, which was supposed to sort you into, like, how your interpersonal. How you relate interpersonally.
246
01:10:06,492 --> 01:10:07,748
Kayla: This sounds like a cult.
247
01:10:07,884 --> 01:10:09,076
Chris: Oh, it super is.
248
01:10:09,148 --> 01:10:11,828
Kayla: I did DISC for a job, too. Yeah, worked for a cult.
249
01:10:11,924 --> 01:10:53,866
Chris: Yeah, exactly. So D. D means sort of, like, decision. Like, you just make a decision, even if it's wrong. Right. I is interpersonal. I think it's like you're better at being friends with people. I don't know. S is. C is conscientious. I don't know. Social? No. S is steady. S is don't rock the boat. And C is conscientious. If you're conscientious, that means you like to analyze stuff a lot before making a decision or moving on it. And since I scored high in D, which is just make a fucking decision, I am utilizing this entire interview as a way to say that I am superior to everyone else.
250
01:10:53,898 --> 01:10:57,202
Kayla: I don't think this is the takeaway from this interview with Caulfield.
251
01:10:57,306 --> 01:10:59,114
Chris: No, that was the takeaway: that I'm superior.
252
01:10:59,202 --> 01:11:37,520
Kayla: How about we talk about my takeaways? All right, so I think my biggest takeaway from that is the phrase, do I even know what I'm looking at? And I think that is so important, because I consider myself to be someone who's very media savvy. And, oh my God, the amount of times that I have been duped, that I have caught myself sharing something that's fucking fake, that I have gotten tricked, it's mind-boggling. And I'm somebody who, yeah, I do consider myself Internet savvy.
253
01:11:37,830 --> 01:12:00,638
Kayla: And so just the idea of making sure I know what I'm looking at before I invest those brain cells into engaging, that's so important, to not just know about, but to internalize, to really put that into your body and brain as a way to operate in the world moving forward: do I even know what I'm looking at?
254
01:12:00,734 --> 01:12:39,910
Chris: Yeah, it's an easy thing to say, too. It's an easy little mantra to remember. Stop. Do I even know what I'm looking at? Stop. Do I even know what I'm looking at? And I'll say here, too, while we're on the topic of mea culpa: make sure your own house is clean first. This is totally not just a right-wing thing. I know we talked a lot about Hillary Clinton, and we've complained about Steve Bannon, and all of those things are true. There's a lot of conspiracy theories about Hillary and Bill Gates and whatnot, and there's a lot of conspiracy theory coming out of the right, to be sure. But especially recently, like, with the Ukraine stuff and with. I mean, there's so many, like Rebekah.
255
01:12:39,950 --> 01:12:44,054
Kayla: Jones in Florida. We talked about the body double trope.
256
01:12:44,142 --> 01:12:44,454
Chris: Yeah.
257
01:12:44,502 --> 01:12:49,606
Kayla: The biggest one that I can remember of the last couple years was the Melania Trump body double trope.
258
01:12:49,638 --> 01:12:50,598
Chris: Exactly. And we're like, ha.
259
01:12:50,614 --> 01:12:51,766
Kayla: Hang from the right.
260
01:12:51,838 --> 01:13:06,328
Chris: There's a Hillary Clinton body double trope. Those dumb people on the right. But then, yeah, there's totally one about Melania that ran throughout the entire Trump presidency. Like, oh, it's not really her. So I just want to be super clear.
261
01:13:06,384 --> 01:13:08,448
Kayla: That this is not a partisan issue.
262
01:13:08,504 --> 01:13:25,642
Chris: It's definitely not a partisan issue. And the more that we think that it is, the more that we take these tips as, like, yeah, those people that don't do this stuff are dumb, it's those people that are not like me, right, then the more we are susceptible to it ourselves.
263
01:13:25,706 --> 01:13:49,934
Kayla: We all don't do this stuff. We all don't do this. And it's because the Internet is still new. I know it's been around for most of our lives, but it is still a new thing in a lot of ways. We do not have a clear way of interacting with it correctly. And it's constantly changing, too. It's constantly changing.
264
01:13:49,982 --> 01:13:51,450
Chris: I think we should just throw it away.
265
01:13:51,870 --> 01:14:33,660
Kayla: God. I mean, that actually makes me want to talk about something that came up. Not throwing away the Internet. That came up for me while we were talking about this. And it feels like, again, this is not a partisan issue. This is not reserved for any age group or class or creed or whatever. But it does seem like we have a little bit of an issue right now where there are groups of people that are more susceptible to falsehoods on the Internet than others. And what I'm thinking of particularly are folks who did not grow up with the Internet in the ways that you and I did. So I'm thinking specifically of the boomer generation, and I'm not using that derogatorily, and the Gen Z generation.
266
01:14:34,360 --> 01:14:55,416
Kayla: And, I mean, even in the interview with Mike Caulfield, he talked about the TikTok conspiracy theory culture. And that is a very explicitly Gen Z thing. It's on a Gen Z-affiliated platform, and the culture is largely very young millennials or Gen Z engaging with this content.
267
01:14:55,488 --> 01:14:57,820
Chris: And not, like me, a geriatric millennial.
268
01:14:58,920 --> 01:15:16,442
Kayla: And I don't really know what I have to say about that, other than that it does seem like there is a little bit more of an issue for folks who did not engage with the Internet on a, like, deep level from its origins.
269
01:15:16,546 --> 01:15:17,230
Chris: Yeah.
270
01:15:17,930 --> 01:15:21,930
Kayla: And that is people like our parents who came to the.
271
01:15:21,970 --> 01:15:23,578
Chris: Who came to, you know, they listen to this. Right?
272
01:15:23,674 --> 01:15:36,088
Kayla: Yeah. And I mean, people like our parents came to engage in, dummy. No, but, love you guys. They came to engaging with the Internet at a high level at a much later time in their lives.
273
01:15:36,184 --> 01:15:36,536
Chris: Sure.
274
01:15:36,608 --> 01:15:46,080
Kayla: And Gen Z does not have a framework for the Internet not being what it is now, where it's just Twitter and Facebook and TikTok.
275
01:15:46,200 --> 01:15:53,256
Chris: Yeah, well, you saying that about boomers versus Gen Z versus millennials makes it also feel a little. And Gen X. Sorry. I know you guys are always left out.
276
01:15:53,288 --> 01:15:55,140
Kayla: No, I'm sorry that we're leaving you out.
277
01:15:55,840 --> 01:15:58,000
Chris: It's just the nature of your generation.
278
01:15:58,120 --> 01:16:00,950
Kayla: I'm kind of lumping you in with millennials here.
279
01:16:02,530 --> 01:16:25,418
Chris: My perception of it is that it's maybe just, like, a flavor difference, right? We're all susceptible to different flavors, different versions of things. Like, I would say boomers are probably pretty resilient against things like the moon landing or Holocaust denial, just kind of thinking about the people in my own life that I know, right.
280
01:16:25,474 --> 01:16:31,062
Kayla: Boomers are probably way less susceptible to, like, a conspiracy theory about Pearl Harbor than they would be to something that.
281
01:16:31,126 --> 01:17:17,006
Chris: And 9/11. So there's things that they're more resilient to, and then they're less resilient to things that are Internet-native, whereas it's almost the opposite for Gen Z, where Gen Z might discover some older conspiracies, kind of like what Mike Caulfield was saying about tropes, right? A trope that they don't recognize as a trope, and they think that it's some new thing, and they're, like, less resilient against that kind of stuff. And for us millennials, I don't know. I don't know what the hell we're. Well, something interesting: something that's probably economic for us, to be honest. Like, seriously, with growing up through the 2008 financial crisis.
282
01:17:17,078 --> 01:17:17,526
Mike Caulfield: Sure.
283
01:17:17,638 --> 01:17:30,120
Chris: And what we've had to deal with the last four to five years, whatever. We're probably less resilient against things that have to do with the economy and people in power and being taken advantage of, that kind of stuff.
284
01:17:30,160 --> 01:17:33,620
Kayla: I wouldn't be surprised if we were more susceptible to things like MLMs.
285
01:17:34,000 --> 01:17:38,464
Chris: Honestly, MLMs, maybe even anti-vax, because of just the lack of trust in institutions.
286
01:17:38,632 --> 01:18:09,216
Kayla: I also think that. And again, I'm not trying to say millennials are better at this. I don't think that we are. I think that we just have a different relationship. I think that it is important to talk about the ways in which different generations interact with the Internet differently. And it even just makes me think of, like, I remember when I was in elementary school, when we had computer class, we learned how to Google, and that is a very specific moment in time. I don't even think you had that.
287
01:18:09,288 --> 01:18:10,420
Chris: No, I wasn't.
288
01:18:10,760 --> 01:18:12,680
Kayla: You're only a very little bit older than me.
289
01:18:12,760 --> 01:18:13,264
Chris: Right.
290
01:18:13,392 --> 01:18:46,808
Kayla: And I don't think that takes place in schools as much anymore. I could be totally talking out of my ass there, but from what I have seen of folks talking about that particular phenomenon on TikTok, it's like the Internet is almost taken for granted, in this way where I don't know if people are being taught to Google. I don't know if people are being taught about what the Internet is outside of Google, Facebook, Amazon, Twitter.
291
01:18:46,864 --> 01:19:21,918
Chris: TikTok. Particularly for millennials of our age, the Internet is something that we have memory of there not being one; it is a new thing that was introduced. But because it is so all-encompassing in terms of our society, it's really hard, maybe impossible, for us to wrap our heads around what it's like to just have everything be like that from birth, by default. To grow up with the Internet already a thing, that's probably a mindset, a framework, that it's just impossible for us to get into.
292
01:19:21,974 --> 01:20:04,222
Kayla: Well, it's probably similar to television for us, where if we had news on all the time in our homes as kids, we're not sitting there questioning the validity of what is being shown to us. We're not sitting there going, I wonder. And then you grow up and realize, oh, wait, every news source has a bias. There are certain things that are shown and aren't shown, and sometimes you get total propaganda, blah, blah. And you have to actually be savvy about your news media consumption. And so if you grow up in a society in which the Internet is omnipresent, I think you're less likely to be inherently critical of it or inherently distrustful of it in some way, or.
293
01:20:04,246 --> 01:20:28,048
Chris: Even to know their forget you're not learning techniques like searching or sift or whatever. You might not even know that there are techniques that you need to learn, right? It's like you were saying with the news, right? If I'm ten years old watching the news, I'm just watching the news. I'm not ten years old watching the news and saying, perhaps I should consider the source of that, right? It's something that you're just raised with.
294
01:20:28,104 --> 01:20:58,678
Kayla: And no one really ever taught it. There's no, like, systemic teaching, as far as I'm aware, of how to be a savvy news consumer and how to understand when something you're being shown is propaganda versus when it's not. There's not really a systemic version of that outside of, like, academia. And I think that's why, with Mike Caulfield talking in his interview about how SIFT and the work that he does takes place in an academic setting, it's like, how do we bring that to the wider public?
295
01:20:58,734 --> 01:21:44,154
Chris: Because this podcast is how. I mean, literally, that's what this episode is about. That's why I wanted to do this episode. One more thing I wanted to talk about, and this is just sort of a theory-crafting thing, but I saw an example of the just-asking-questions trope the other day. I forget who it was. I don't think it was Bruce Willis, but it was, like, some celebrity has a bad thing happen to them, and then, did you know they just had gotten the vaccine? We've seen a million of those. If you've been online, you've seen one of those tropes. But I was wondering, I feel like there's trope types, almost. If Mike Caulfield's listening to this, maybe he can answer this.
296
01:21:44,282 --> 01:22:11,630
Chris: But there's the just-asking-questions trope. There's this structural trope, and there's a bunch of ways it manifests itself. Like, oh, so-and-so celebrity got sick after the vaccine. Is that true? Is that meaningful? I don't know. I'm just asking questions. But then there's plenty of other ways that it manifests itself. Like, is Hillary actually responsible for killing Jeffrey Epstein? I don't know. I'm just asking questions.
297
01:22:11,710 --> 01:22:12,710
Kayla: So insidious, right?
298
01:22:12,750 --> 01:22:14,222
Chris: It's like, it's like, what the, it's.
299
01:22:14,286 --> 01:22:17,766
Kayla: What they do on it. Whatever you want, as long as you say you're just asking questions.
300
01:22:17,798 --> 01:22:36,426
Chris: Well, speaking of TV tropes, that's literally what every TV lawyer does, right? Where it's like, oh, I'm gonna ask this question that I shouldn't be asking. And the judge is like, overruled, you can't ask that. And then they'll be like, okay, I rescind what I said. But by that point, it's already.
301
01:22:36,458 --> 01:22:37,018
Kayla: In the jury's ear.
302
01:22:37,034 --> 01:22:40,074
Chris: It's already in the jury's ear. And that's, like, literally what the technique is.
303
01:22:40,122 --> 01:22:52,210
Kayla: Well, and you see that technique in news media again, where a really good thing to remember is that if a headline is written as a question, the answer is probably no.
304
01:22:52,290 --> 01:22:52,930
Chris: Right, exactly.
305
01:22:52,970 --> 01:23:13,562
Kayla: And again, you're not learning that in school, but it is a great rule of thumb: if you're reading something, particularly from a more biased news source, and the headline is a question, like, are millennials eating all of the avocados, and now there's going to be a global shortage? The answer is almost always no. Probably no.
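[Editor's note: that rule of thumb is sometimes called Betteridge's law of headlines, and it is simple enough to sketch in a few lines; this is a crude heuristic for illustration, not a real classifier.]

# Crude sketch of the headline rule of thumb: a question headline
# is a flag for extra skepticism, and the answer is probably no.
def betteridge(headline):
    """Return a skeptical default answer for question headlines."""
    if headline.strip().endswith("?"):
        return "probably no"
    return "not a question headline"

print(betteridge("Are millennials eating all of the avocados?"))
# -> probably no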
306
01:23:13,626 --> 01:23:31,470
Chris: Right. And if you know that trope, the trope in itself, right: written as a question, probably no. What I was wondering with this, though, is, is it a just-asking-questions trope, or is it a celebrity-vaccine-reaction trope? And I feel like it's kind of both. Like, maybe there's, like, categories, you know.
307
01:23:31,550 --> 01:23:33,526
Kayla: Both would be hyperlinked on tvtropes.com.
308
01:23:33,558 --> 01:23:49,406
Chris: Yeah, exactly. Structurally, it's a just-asking-questions trope, but content-wise, it's a celebrity-vaccine-reaction trope. So I feel like there's some space to talk about the categories of tropes, or, like, trope types, maybe. I don't know.
309
01:23:49,438 --> 01:23:56,130
Kayla: I think we need to get Mike back. TV Tropes, but for what he's doing: conspiracy tropes. Yeah, we need this. We need this now.
310
01:23:56,430 --> 01:23:58,478
Chris: I know. I think it's a really good idea.
311
01:23:58,534 --> 01:24:03,214
Kayla: It absolutely is. Because, again, we need this. This stuff needs to be codified.
312
01:24:03,302 --> 01:24:45,280
Chris: Right. And the thing is, again, just for our audience: when I asked Mike about what our audience should do, it would be nice if the answer was, oh yeah, go to this database and screw around on it like you would on TV Tropes, and at the end of the day you'll be resilient against conspiracy theories. Or it could even be, maybe better, something that's easily referenceable, easy to search. You go to this thing and you're like, I saw this claim, and somebody said false flag. And I'm going to type false flag into here. And then you're like, oh yeah, that came up on conspiracytropes.com. False flag. Here's all the instances of that in the past and why it's never panned out. And yada, yada, right.
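[Editor's sketch of the lookup site Chris is imagining, with a hypothetical entry; the fields and examples are invented placeholders, not a real database.]

# Hypothetical "conspiracytropes.com"-style lookup: search a trope name,
# get its summary, prior instances, and track record. Entries invented.
TROPE_DB = {
    "false flag": {
        "summary": "Claim that an event was secretly staged by another party.",
        "prior_instances": ["placeholder example A", "placeholder example B"],
        "track_record": "has essentially never panned out",
    },
}

def lookup(term):
    """Print the database entry for a trope, if one exists."""
    entry = TROPE_DB.get(term.lower())
    if entry is None:
        print(f"No entry for {term!r} yet.")
        return
    print(f"{term}: {entry['summary']}")
    print("Seen before in:", ", ".join(entry["prior_instances"]))
    print("Track record:", entry["track_record"])

lookup("false flag")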
313
01:24:45,940 --> 01:24:54,440
Kayla: I do want to just say one more thing. I want to clarify something that I said, connecting it back to the boomers versus Gen Z.
314
01:24:54,980 --> 01:24:56,916
Chris: You want to apologize to our parents?
315
01:24:56,988 --> 01:25:06,996
Kayla: Okay, I'm sorry, parents. But what I really mean there is that there are unique weaknesses to exploit.
316
01:25:07,108 --> 01:25:07,760
Chris: Right.
317
01:25:08,620 --> 01:25:10,564
Kayla: In various groups of people.
318
01:25:10,652 --> 01:25:12,172
Chris: And I think that does include us.
319
01:25:12,236 --> 01:25:30,354
Kayla: And that absolutely includes. It includes everybody. And that's what I wanted to clarify. I was not trying to say boomers are dumb or Gen Z is dumb, because I don't believe that. I absolutely do not believe that. And, you know, that's not a great trope to propagate.
320
01:25:30,402 --> 01:25:32,146
Chris: Generational tropes. Oh. Yeah.
321
01:25:32,178 --> 01:25:52,982
Kayla: I think that we should absolutely have less generational, millennials-are-entitled fighting and more generational reaching across the aisle. So I just want to clarify that. What I mean is that if there are bad-faith actors, they have multiple different ways to exploit multiple different groups of people.
322
01:25:53,046 --> 01:25:59,886
Chris: Right. And it might be useful, if you find yourself in one of those categories, to consider what your blind spots might be.
323
01:25:59,918 --> 01:26:18,698
Kayla: My big blind spot is that I think I know everything. Like, literally, I think that might be a millennial blind spot: we think that we're so media savvy and so Internet savvy that we're not susceptible to this. And that's when I get tripped up, that's when I make a mistake, and that's when I get duped by misinformation on the Internet.
324
01:26:18,814 --> 01:26:40,426
Chris: Agreed. Before we wrap up, just a few of the things we normally go through on the show. Not have to, but, you know, it's important that we do. Sources: fairly limited this time, because it was mostly the interview, but I did a lot of reading on Mike Caulfield's blog. It's called Hapgood; the address is hapgood.us. I told you I did a lot of blog reading this time, too.
325
01:26:40,458 --> 01:26:42,370
Kayla: I love it. Early days of the Internet.
326
01:26:42,450 --> 01:26:43,146
Chris: Yeah.
327
01:26:43,338 --> 01:26:44,950
Kayla: Dwelling days of the Internet.
328
01:26:45,600 --> 01:27:15,986
Chris: I also recommend following and keeping up with everyone at UWCIP. Again, that's the University of Washington's Center for an Informed Public. I cut this out of the final interview track, but I had this whole thing where I was just being the most obsequious little nerd, thanking him and all his colleagues at UWCIP. I'm like, thank you so much, and tell everyone else that I think they're great. I love you. But they deserve it. I mean, you know. So I follow them on Twitter.
329
01:27:16,058 --> 01:27:17,698
Kayla: Good work. They deserve their flowers.
330
01:27:17,794 --> 01:28:07,292
Chris: That's right. So follow them on Twitter. I won't list them all out here, but you can go to UWCIP's Twitter and find them, and read their blog posts. I also want to do a quick shout-out to their communications director, Michael Grass, who was a huge help with resources and coordination for me. And these last two may be spoilers for future episodes, TBD, I kind of want to, but: I also did some research using the resources available to me from the Stanford Internet Observatory, and I'll specifically recommend following Renée DiResta. And then also the John F. Kennedy School of Government at Harvard; specifically there, I'd highly recommend following Joan Donovan on Twitter and following her work. All these folks are aces on the vanguard of media literacy and misinformation research. And that's it for the sources here.
331
01:28:07,316 --> 01:28:21,196
Chris: I'll put all that stuff in the show notes. This is one of those episodes, as I mentioned up at the top of the show, where we're highlighting a specific helper and some of his most helpful techniques. It's not exactly a cult, so it doesn't really make sense to go over the criteria.
332
01:28:21,388 --> 01:28:24,788
Kayla: Is Mike Caulfield a charismatic leader, yes or no?
333
01:28:24,844 --> 01:29:10,732
Chris: Oh, absolutely. Absolutely charismatic. Yeah. But the reason we're even having a show about the topic is that it does relate to cults, particularly since cult-like groups engage in disinformation, and the Internet has greased the skids for everyone, cult-like groups included. So expect us to bring up the idea of SIFT and information tropes in our future episodes. To sort of wrap this up here, I'll just say that in the past, I think we've characterized the whole, like, I-can-see-where-this-is-gonna-go thing, when we're looking at some of these groups like QAnon or anti-vax, or individual people like Joe Rogan, we've sort of characterized that as, like, a curse. When we were talking to Matthew Remski. Yeah.
334
01:29:10,836 --> 01:30:03,282
Chris: And Travis View last season, they sort of had that future sight. Cassandra-like, they talked at length about how they can tell exactly where XYZ story or person is going to go. And Mr. Remski could predict exactly what was going to happen after Joe Rogan contracted COVID. And how you and I can tell, when we see a few key attributes in a group, those are red flags, and we know what else we're going to see when we look at those groups. And yeah, we've really sort of characterized that as a curse. But after talking to Mr. Caulfield about this idea of tropes, maybe it's not a curse, actually. Maybe it's a superpower. Maybe it's actually us utilizing the power of tropes to keep us safe and to protect the precious resource of our limited attention and short time on earth. I'm Chris.
335
01:30:03,426 --> 01:30:04,338
Kayla: I'm Kayla.
336
01:30:04,434 --> 01:30:07,970
Chris: And this has been Cult or Just Weird.
337
01:30:08,130 --> 01:30:10,170
Kayla: Enjoy the new music for season four.