I’m a really smart conspiracy guy. I read like everything I can about conspiracy theories and stuff on Reddit, and watch tons of conspiracy videos on YouTube, and I’m a lurker on a few other platforms that I won’t name here because I don’t want to get shadowbanned for mentioning them. The cabal is crazy like that. They will ban you just for mentioning stuff they don’t like.
I really love Xbox. Especially the Halo series and Call of Duty. And the Matrix. That movie frickin’ rules. It’s like one of the only movies that like tells the truth about what’s REALLY going on in the world and stuff. I try to watch it like once per month, if not more. I get really high and (if my mom’s not around) turn the sound way up, and just frickin’ chill.
Yeah, I mean, I live with my mom still, but mostly hang out in the basement. So it’s totally cool. There’s a toilet down there and a fold-out couch, so it’s almost like having my own apartment. She makes me vacuum, but I don’t mind. I make it like a game, and imagine I am collecting coins or points or something.
Dark Side of the Moon by Pink Floyd is my favorite album. Followed by Legend by Bob Marley. Both of those albums rule so hard. When I listen to them, I usually turn on this blacklight I have, because I also have some blacklight posters of like aliens and mushrooms and stuff. And the posters look really awesome when you turn the blacklight on. And I just like stare at them and trip out and think about like conspiracies and stuff. It’s totally rad.
I had an entry-level job in construction for a while, but I got fired when my boss’s boss found out I was making conspiracy videos on TikTok. One of them went viral and landed in somebody’s for you page or something who’s a higher up at the company. I guess I probably should of used a fake name, and maybe not mentioned the name of the company I worked for or whatever. But like, they were maybe up to some shady stuff I think. So, in the end it’s for the best probably.
Anyway, then my mom talked to her sister, and my cousin ended up helping me get a job at Walmart pushing shopping carts with him. Which was actually like, totally cool. Cause I love Walmart and all the like $3 DVDs and stuff. And Pringles. It’s like the world capital of Pringles. So it’s really cool.
But like my mom got ultra mad for no reason like always, just because I ended up signing over my first paycheck to this really cool old janitor dude I met who works there. His name is Larry and he was in Nam and is totally into conspiracies too. Plus he sells prepper supplies and libido pills on the side. He said he could cut me in on it, and I could probably make an extra hundred dollars a month selling stuff for him. I thought it sounded like an awesome deal, but my mom was like super pissed, and asked me what the hell I was planning to do with all these penis pills and like buckets of rice and lentils and stuff. I tried to explain it to her, but she just didn’t get it. She’s not as much of a free thinker as me.
She made me quit Walmart because she thinks the senior citizens I work with are a bad influence. She told me as punishment that I had to eat only my prepper supplies from now on and make my own food, cause she wasn’t gonna do it anymore. She said I needed to learn my lesson. But I actually kind of like rice and lentils with some Frank’s Red Hot Sauce; that shit is so good. So the joke is actually kinda on her. The other thing is I am farting like all the time now. But it’s actually kind of funny too, especially if she is around. She gets super mad and says I am gross.
I haven’t heard from my dad in a while. It’s been a couple years actually. We don’t even know where he is living now, which is shitty but whatever. Whenever I used to ask about him, my mom would say that he is a good-for-nothing dirtbag, and if I’m not careful I will end up like him. So I stopped mentioning it.
Even though my mom can be kind of a sheeple, she decided not to get vaccinated against the A.I. Virus. And she said I can do whatever I want cause I’m over 18 now. I actually think the A.I. Virus is a hoax, because like, how could a computer virus even infect a person? It makes no sense.
So, of course I didn’t get vaccinated either. I don’t want to like have all those little microchips in my body and stuff. Cause like, it’s probably the microchips in the first place that makes people act all weird. That’s totally the kind of crap the cabal would do. Plus, I mean like, I don’t even know anybody who got sick. So how can it be real?
Or at least that’s what I thought when this whole thing began…
I know this is one of the favorite tropes of conspiracy people, but I suspect my amazing TikTok account was shadowbanned on account of its (pseudo-)conspiracy content.
The thing about pseudo-conspiracy content, of course, is that it is by and large indistinguishable to the naked eye (or the algorithmic eye) from “real” conspiracy content. Commentary and satire also get thrown onto the ash heap of history, without regard for fundamental differences.
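To make that distinction concrete, here is a minimal sketch (in Python; the keywords and example posts are entirely made up, and this is not how any real platform’s classifier works) of the kind of naive keyword matching an automated filter might rely on. A sincere post and a satirical one score identically, because the surface vocabulary is the same:

```python
# Hypothetical sketch: a naive keyword-based "conspiracy content" filter.
# Keywords and example posts are invented for illustration only.

CONSPIRACY_KEYWORDS = {"cabal", "sheeple", "deep state", "wake up"}

def conspiracy_score(text: str) -> int:
    """Count how many flagged keywords appear in the text."""
    lowered = text.lower()
    return sum(1 for kw in CONSPIRACY_KEYWORDS if kw in lowered)

sincere = "Wake up, sheeple! The cabal controls everything."
satire = "Wake up, sheeple! The cabal controls everything. (This is satire.)"

print(conspiracy_score(sincere))  # 3
print(conspiracy_score(satire))   # 3 -- same score; intent is invisible here
```

Real systems are smarter than this, of course, but intent and irony remain notoriously hard signals to extract, which is the point.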
The thing that’s forever tantalizing about the concept of shadowbanning is that it is all but impossible to find “proof” that it is occurring, especially given the poor-quality stats platforms generally give to users.
For illustrative purposes, here is the past 60 days of engagement:
TikTok embeds don’t tend to play nicely with WordPress, but here is the YouTube version of the video that caused the traffic spike shown above, a little after June 20, 2021:
I had early success with this account by playing on Mandela Effect stuff, which is by and large harmless. After the success of the above, and a few follow-ups, I ended up leaning more in the conspiracy direction. Here is the increase in followers over the corresponding time period:
You can see the followers jumped dramatically around the time the video above was posted, and then basically plateaued. But with such a sudden and dramatic increase in followers, one would theoretically *expect* that any content posted after that bump would automatically get more traffic than content posted prior to it, purely based on distribution to followers.
But if you look at the traffic graph, that is not the case.
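To put rough numbers on that intuition (every figure below is invented; TikTok’s analytics expose nothing this granular), the arithmetic looks something like this:

```python
# Back-of-the-envelope arithmetic with invented numbers. If some fraction of
# followers reliably sees each post, a 10x follower bump should raise the
# floor on views for everything posted afterward.

followers_before = 2_000
followers_after = 20_000   # hypothetical bump after the viral video
reach_rate = 0.10          # assume ~10% of followers see any given post

expected_floor_before = followers_before * reach_rate  # ~200 views per post
expected_floor_after = followers_after * reach_rate    # ~2,000 views per post

observed_after = 250  # made-up figure standing in for the flat traffic graph

print(f"expected floor after bump: {expected_floor_after:.0f} views per post")
print(f"observed after bump:       {observed_after} views per post")
```

When observed per-post views stay near the old baseline despite a large follower bump, that mismatch is exactly what fuels shadowban suspicions, even though it proves nothing on its own.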
One thing I’ve learned working for platforms, however, is that algorithms are inscrutable, even to those who develop and maintain them. The fact of the matter may very well be that there is no explanation. Or if there is, it would just be based on a “best guess” by an engineer, and that’s about as far as it could be taken.
Users of platforms, however, like to believe in the fiction that everything behind the scenes is perfectly and intentionally designed to act a certain way. While that may be the case in broad strokes, it is rarely the case when applied to a specific set of detailed examples. When examining a single example, we might be able to approximately match it to the overall system design, but as I said, it’s rare that you can perfectly suss out what is going on, at least in my years of experience with the matter.
That doesn’t stop platform users from (1) theorizing, (2) assuming that they are being targeted, and (3) assuming the targeting is happening because of their political beliefs.
Here’s an interesting example I noticed while toying with pseudo-conspiracy content on TikTok:
This is a search results page for the somewhat vanilla term “cabal” on TikTok (above). The included text reads:
No results found
This phrase may be associated with behavior or content that violates our guidelines. Promoting a safe and positive experience is TikTok’s top priority. For more information, we invite you to review our Community Guidelines.
TikTok has blocked a number of hashtags related to the QAnon conspiracy theory from appearing in search results, amid concern about misinformation, the BBC has learned…
“QAnon” and related hashtags, such as “Out of Shadows”, “Fall Cabal” and “QAnonTruth”, will no longer return search results on TikTok – although videos using the same tags will remain on the platform.
Now, my usage of #cabal was imitative of QAnon conspiracies, but I intentionally never linked my account to that overall cesspool of content, to which I am personally vehemently opposed.
The word cabal itself is, of course, a neutral and perfectly valid English word:
noun
1. a small group of secret plotters, as against a government or person in authority.
2. the plots and schemes of such a group; intrigue.
3. a clique, as in artistic, literary, or theatrical circles.
There’s even an overtly non-conspiratorial definition of that word, as you can see. And the etymology of the term is even more interesting:
cabal (n.)
1520s, “mystical interpretation of the Old Testament,” later “an intriguing society, a small group meeting privately” (1660s), from French cabal, which had both senses, from Medieval Latin cabbala (see cabbala). Popularized in English 1673 as an acronym for five intriguing ministers of Charles II (Clifford, Arlington, Buckingham, Ashley, and Lauderdale), which gave the word its sinister connotations.
And since that definition links to this one, including for reference:
cabbala (n.)
“Jewish mystic philosophy,” 1520s, also quabbalah, etc., from Medieval Latin cabbala, from Mishnaic Hebrew qabbalah “reception, received lore, tradition,” especially “tradition of mystical interpretation of the Old Testament,” from qibbel “to receive, admit, accept.” Compare Arabic qabala “he received, accepted.” Hence “any secret or esoteric science.” Related: Cabbalist.
So, because of a few bad actors, a term with many layers of rich historical significance can just be disappeared from a platform.
And yet, there’s no issue using other phrases related to conspiracy in general, and they have literally BILLIONS of views:
Whereas, if you type in #cabal (or #qanon), you are not presented with the dropdown to select the “official” tag, and are not told any tally of existing views.
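For what it’s worth, the simplest mechanism that would produce exactly this behavior is a plain blocklist consulted before the tag index. The sketch below is pure speculation; I have no knowledge of TikTok’s internals, and every name and number in it is invented. It just reproduces the observable symptoms: blocked tags get no “official” suggestion and no view tally, while the underlying videos may still exist:

```python
# Purely speculative sketch of the search behavior described above.
# Names, tags, and view counts are all invented for illustration.

BLOCKED_TAGS = {"cabal", "qanon", "qanontruth"}  # illustrative, not the real list

TAG_VIEW_COUNTS = {  # made-up index of tags to total views
    "conspiracy": 5_000_000_000,
    "mandelaeffect": 900_000_000,
    "cabal": 12_000_000,
}

def suggest_tag(query: str):
    """Return an 'official' tag suggestion with a view tally, or None."""
    tag = query.lstrip("#").lower()
    if tag in BLOCKED_TAGS:
        return None  # no dropdown, no tally -- but the videos remain
    if tag in TAG_VIEW_COUNTS:
        return {"tag": f"#{tag}", "views": TAG_VIEW_COUNTS[tag]}
    return None

print(suggest_tag("#conspiracy"))  # {'tag': '#conspiracy', 'views': 5000000000}
print(suggest_tag("#cabal"))       # None
```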
What I take issue with here is not the banning of QAnon-related content. I support that, and god only knows how much of it I myself banned when I was in a position to do so. What I take issue with instead is the heavy-handedness, inconsistency, and reactiveness of platforms in removing this content.
If they wanted to really make a difference, they should have all done it across the board at least a couple years earlier. It was always clear what was happening, and always clear that it was dangerous. The only thing that changed, as far as I could tell, is that news outlets eventually caught wind of it, and started reporting on it, and challenging platforms to remove it with the threat of public embarrassment.
As the BBC article linked above states:
TikTok said it moved to restrict “QAnonTruth” searches after a question from the BBC’s anti-disinformation unit, which noticed a spike in conspiracy videos using the tag. The company expressed concern that such misinformation could harm users and the general public.
As also quoted above, though, TikTok apparently did not remove the majority of that content. They simply made it harder for the average user to find. And only in that one narrow instance.
By contrast, it’s still easy to find dozens if not hundreds of “antivax” accounts, even though that content “could harm users and the general public.” Tons and tons of those accounts remain active and searchable:
Now, you can personally choose to do whatever dumb thing you want with regard to the COVID vaccines, or vaccines in general. My point in illustrating this is that there is an obvious and known public harm here, and yet little to nothing is done about it. And the cause is almost certainly that they have not (yet) been embarrassed by the BBC’s “anti-disinformation” team.
It’s worth noting, however, that they do apply a TINY label to videos which they (apparently) detect as being related to COVID misinfo (see the yellow boxes I added below to highlight the label):
Does any person in their right mind think this little tiny warning stops conspiracy people from conspiracizing? Get real. It’s a joke. Here’s how it looks on a video details page on web:
As you can see from this person’s video content in the screenshot above, when you ban or remove conspiracy content, what this signals to the conspiracy person producing or sharing that content is that they are “on the right track,” because it’s clear to them the platforms are owned by, or in cahoots with, “the cabal” (or else why would that word itself be forbidden on the platform?).
No amount of fact-checking, interstitial labels, or burying things in search results is going to disabuse those people of those notions. It’s just not going to work. Like ever. I’m not being hyperbolic. I’ve seen this play out in the wild thousands of times over the course of 5+ years. The pattern is always the same. “We” are not winning.
So what should platforms do? Just not police their content? Let anything go? Hardly. They should “do their best” to maintain the service they own and pay for in roughly the shape they determine to be the right one. But they should do so knowing that the measures they take to suppress whatever doesn’t fit that shape won’t necessarily produce positive outcomes, or solve the fundamental societal problems at the root of these online behaviors.
I know no one wants to hear this. But there is no simple fix. Platforms are broken because society is broken. Truth is broken and devalued because Hyperreality is simply more engaging. If we want to have conversations with people that result in meaningful changes on these issues, we’re simply going to have to find new and more creative ways to do it, because this present set of approaches is not working.
Thought these prohibitions around misinformation from TikTok’s Community Guidelines section on Integrity & Authenticity were interesting and worth keeping:
Misinformation is defined as content that is inaccurate or false. While we encourage our community to have respectful conversations about subjects that matter to them, we do not permit misinformation that causes harm to individuals, our community, or the larger public regardless of intent.
Do not post, upload, stream, or share:
* Misinformation that incites hate or prejudice
* Misinformation related to emergencies that induces panic
* Medical misinformation that can cause harm to an individual’s physical health
* Content that misleads community members about elections or other civic processes
* Conspiratorial content that attacks a specific protected group or includes a violent call to action, or denies a violent or tragic event occurred
* Digital Forgeries (Synthetic Media or Manipulated Media) that mislead users by distorting the truth of events and cause harm to the subject of the video, other persons, or society
Do not:
* Engage in coordinated inauthentic behaviors (such as the creation of accounts) to exert influence and sway public opinion while misleading individuals and our community about the account’s identity, location, or purpose
Now, I’m someone who likes to find the edges of policies like these. So there are certain things my brain automatically zeroes in on while reading…
“misinformation that causes harm” – “harm” isn’t clearly defined, which means that, apart from the enumerated types in the list that follows, the door is potentially fairly wide open to interpretation (see the sketch at the end of this post).
“attacks a specific protected group” – the definitions of protected groups or classes tend to be somewhat narrower than people think. A Facebook leak from 2017 showed the … complexity of these kinds of definitions when the rubber meets the road.
“denies a violent or tragic event occurred” – does this mean denying happy or non-violent events occurred is also forbidden? Status unclear.
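To see why that first bit of vagueness matters, here is a toy formalization (in Python; everything in it is hypothetical, not anything from TikTok). Rendered as code, “misinformation that causes harm” bottoms out in a harm predicate that the policy itself never defines, so every choice of predicate yields a different policy:

```python
# Thought experiment: translating the guideline into code makes the gap
# obvious. "Misinformation that causes harm" compiles down to a harm
# predicate that the policy never defines, so every reviewer effectively
# supplies their own.

from typing import Callable

def violates_policy(content: str,
                    is_misinformation: Callable[[str], bool],
                    causes_harm: Callable[[str], bool]) -> bool:
    """Mirrors the guideline: misinfo is banned only if it 'causes harm'."""
    return is_misinformation(content) and causes_harm(content)

post = "a hypothetical borderline post"
always_misinfo = lambda _: True     # stipulate the content is misinformation

broad_reviewer = lambda _: True     # reads "harm" expansively
narrow_reviewer = lambda _: False   # reads "harm" narrowly

print(violates_policy(post, always_misinfo, broad_reviewer))   # True
print(violates_policy(post, always_misinfo, narrow_reviewer))  # False
```

Same content, same facts, opposite enforcement outcomes. That interpretive slack is where the heavy-handedness and inconsistency described above come from.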