I describe some concerns I have about content and social media platforms and the vulnerabilities they pose to both creators and their followers.

This episode was published on April 18, 2023 at 12:55am EDT.

You can support my work and this channel by booking an astrology reading.

You can also support this channel with a monthly membership.

Please add yourself to my contact list.

Episode transcript:

Greetings all, welcome to Aquarian Diary.

I’m your host, John Irving.

It is April 18th, 2023.

There are a few things that have been on my mind a lot lately, and a lot of it has to do with getting reach or exposure for all this content I produce, which I put a lot of time into.

And it has been a wonderful experience in many ways. I have connected with some truly fantastic people and am forming relationships with them, which I am very grateful for.

But there have also been a lot of practical challenges. Lately I’ve been having to spend a lot of time on more administrative-type things, trying to understand how to get exposure, for example.

And as you know, I am quite critical of the status quo and my position is quite progressive, which I feel very strongly about.

I comment on social and political issues all the time that are of concern to me. And I know from speaking with many of you that you share a lot of my concerns.

However, that does create some challenges.

Because, for example, on this platform, YouTube, there are a whole bunch of algorithms that determine whether you succeed or fail. And except for probably a few people who work on the coding for YouTube, nobody really knows what those algorithms are.

We know vaguely that certain terms will trigger the algorithms and presumably penalize you for using them. Many people have commented on this.

And so it’s a bit of a crapshoot. You can discuss something and then get penalized and not even know why.

There are other problems as well, including the commenting system. I often see comments that I respond to, but that do not actually appear under the videos when other people look at them.

Furthermore, based on my channel activity, it looks like I have been put in YouTube jail or shadow banned, because the precipitous drop in activity recently just doesn’t make sense.

Neither does the number of subscribers I have, or the number of views some of my videos have received versus others.

So I have no choice but to assume that, like I said, I’m being penalized somehow.

Of course, the content I’m producing is not mainstream. I understand that. I have zero interest in producing fluffy or superficial content. Nonetheless, there are still a significant number of people interested in the types of topics I address.

I don’t own YouTube, so it’s not my product. But these kinds of things are concerning. Because effectively what’s happening is that your exposure is governed by criteria that you have no control over, that you’re not even aware of, and therefore can’t do anything about.

Why does this matter? Well, in my case, I don’t think I have ever posted anything here on my channel that I don’t stand by. In other words, I’ve never published anything that I feel isn’t true from my perspective.

Whereas there are plenty of other channels that produce all kinds of stuff, and in many cases some of the content is demonstrably false or misleading. So there isn’t a level playing field.

And as I said, this applies to comments as well, so the engagement or discussion is being undermined. I see comments that are perfectly fine and not objectionable that literally do not appear. There’s no logic to it. That’s problematic, because I think the actual discussion and engagement between viewers or listeners and content creators is absolutely critical.

For example, in China, the Chinese government, an authoritarian government, has extremely strict rules around what people can search for, what words they can use, and so on. It’s very Orwellian, and it has reached totally absurd proportions: the population has to make up new terms for things the Chinese government doesn’t want them talking about, just to get around the censors.

Well, maybe it’s not as severe, but to some degree that is what’s happening here. These platforms are using algorithms, and the algorithms don’t even work properly in many cases. Some of the content they are blocking is completely legitimate.

I find this quite disturbing because there’s a social component to this.

Facebook, for example, used to be a huge thing years ago and it’s lost a lot of popularity recently. But back in the day, I made many friends through Facebook, through social and political groups, that became friends of mine for many, many years. So it served a purpose back then anyway.

Another example, in Canada we have the Canadian Broadcasting Corporation, the CBC, which is a federally funded news organization, effectively. If you go on and comment on some of those articles, they will immediately block your comments simply because you use certain words or phrases that are often totally legitimate. It’s just that the algorithms block them automatically without considering the context.

It all has that kind of vibe of what’s happening in China. And as someone who started a social media company before Facebook existed, I find this trend very disturbing. It is indiscriminate censorship in many cases.

The platforms are afraid of blowback and so they are overly cautious and it’s totally unregulated and social discourse suffers as a result.

I actually recorded the vast majority of this a few days ago, and today, by chance, I happened to come across an article on The Conversation which addressed this very phenomenon. There’s a term for it: “algospeak”, language designed to get around algorithms.

So clearly there are others concerned about this as well. I’ll put a link to that in the description. Again, very Orwellian.

There’s another factor where this platform or others in the future could simply determine that they don’t want to promote or push content from a certain group of people or from people who focus on particular topics. With a few lines of code, they could completely block all those people and they would lose all of their followers, all of the engagement, all of those connections that they had established would be gone instantly.

Look at what happened to Twitter. Musk single-handedly destroyed it in less than a year. What I’m saying is that this situation leaves us very vulnerable.

And I have considered other platforms but they all have challenges or difficulties. Some are really geared towards people who are highly technical, some are more obscure and not very commonly used, and so forth.

For example, there’s a Twitter alternative called Tribel, but it looks like something that was designed in the 1990s. The interface is terrible. It’s so bad you’d think it was deliberately bad.

But my larger concern is how these platforms are shaping our ability to communicate, what we can communicate, and with whom. And more importantly, because content creators are trying to avoid triggering these mysterious algorithms that they do not fully understand, they’re forced to deliver their content differently than they would under ideal circumstances. So the content itself is skewed or distorted, because people have to play this cat-and-mouse game for fear of censorship.

Now, clearly some content should be screened, but what’s happening is that content that shouldn’t be screened is.

Again, I seem to be penalized, presumably because I have been critical of something socially or politically that may be entirely legitimate but triggered some algorithm out of context. It has nothing to do with the quality or value of my content.

I know how algorithms work. They are simply rules. If this, then that. If this word appears, then do that. It’s not nearly as complicated as most people think, although algorithms can be quite sophisticated.

The concept is actually quite simple. It’s all automated. It runs in the background and then your account probably gets flagged if it finds a certain word.
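The “if this word appears, then do that” idea can be sketched in a few lines of Python. This is purely illustrative: the blacklist and the `flag_content` function are hypothetical examples I am using to show the concept, not any platform’s actual system.

```python
# A naive keyword-based moderation rule: flag any text containing a
# blacklisted word, with no regard for context. Hypothetical example only.

BLACKLIST = {"badword1", "badword2"}  # made-up flagged terms

def flag_content(text: str) -> bool:
    """Return True if any blacklisted term appears, ignoring context."""
    words = text.lower().split()
    return any(word.strip(".,!?") in BLACKLIST for word in words)

# The problem: context is ignored entirely, so legitimate uses get flagged.
print(flag_content("A factual article mentioning badword1 in its title"))  # True
```

The point the sketch makes is the one above: a word-level rule has no notion of context, so a factual news article and an abusive comment containing the same word are treated identically.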

Like, for example, a while ago I posted on my community tab about the whole Clarence Thomas thing that was blowing up. And I linked to an entirely legitimate article talking about the billionaire that he has been associated with who is a collector of certain types of paraphernalia and in the title was a word that probably triggered the algorithm. Even though the article is accurate and factual and from a reputable source, that word is probably blacklisted.

This whole issue I find really concerning. That these platforms can shape society effectively based on their own criteria.

Now, I understand that there is a lot of content out there that should be screened. The problem is that the algorithms are unable to effectively discriminate between what is actual abuse and what isn’t. These words are all in the dictionary. They’re completely legitimate words, so the context matters.

I was speaking with someone recently who said that under these circumstances what you should do is get your subscribers to unsubscribe from your channel, type your channel name into the search bar on YouTube, resubscribe, and then “like” five videos. They don’t even have to listen to the whole video, just get them to like it. That will supposedly reset the algorithms for your channel, because apparently this has happened to other channels.

If you want to do that, fine. I don’t want to lose any of my viewers, but if you want to try that, fine. But I do seem to be in jail.

But like I said, this points to a larger concern: that we are very dependent on these companies, who at any time could make any one of us completely redundant, regardless of how much we have invested in our equipment, our production, the hundreds if not thousands of hours we have put into building up our channels. And that disturbs me. It’s almost like we need a plan B for if, or when, that happens.

Montana, I believe it’s Montana, just this week completely banned TikTok and they want to fine companies for distributing the app.

I am not a fan of TikTok. I actually loathe things like shorts; what can you actually convey in 60 seconds that has any real value? Nothing. It’s all part of the dumbing down of society. I think short attention spans are a huge cultural problem.

And there are concerns about the Chinese government gaining access to user data.

But the point is that at any time some state could ban a platform that they don’t like. And again, this has kind of an Orwellian vibe about it.

I’m just putting this out there because I think it’s something that people need to think about.

I have a website. I have a podcast. I actually encourage everyone to also subscribe to my podcast because I’m mostly just doing audio and a lot of the content I listen to is podcasts. I like podcasts.

But we should have redundancies. You could have a whole social movement completely wiped out.

I mean, yes, there are some people on the right who do not like my content and they’ve complained about it, if not trolled me at times, because they want me to shut up, basically. But I don’t care. I know who I am and where I stand and I expect that.

Who knows, maybe there’s somebody with a political bias who works for one of these platforms, and they decide they don’t like you or your content, and the next thing you know, you’re completely obscure.

I also think we’re going to see more and more governments trying to interfere with tech platforms. And that really concerns me because a lot of these politicians actually know nothing about technology. A lot of them are older and they have a very simplistic understanding of how the internet actually works.

But they don’t know what they don’t know. They think they know how it works, but they don’t. And so they are prone to making really stupid decisions.

Like we have something going on in Canada now where the government wants to control what content is being delivered to people and how. And I find the whole thing completely idiotic.

There was nothing wrong with the way things were. And in fact, the way they’re handling it is mostly going to benefit large established media organizations. It’s not actually going to benefit people. It’ll protect the status quo and the big corporations that own significant media companies.

So anyway, stuff for you to think about.

This could be one of the negative expressions of Pluto transiting Aquarius: a bunch of technocrats determining how we communicate and collaborate, with their own biases and prejudices dictating how we can do that. And/or it could be motivated by financial interests.

We really need some kind of open source platform that doesn’t suck, that is not tied to any particular company.

Like I actually have very significant concerns about the conglomeration of power within certain companies and organizations as it stands already. We have what are effectively a lot of monopolies and I find that very disturbing. There are a lot of powerful people who are very vested in the status quo and the status quo is literally not sustainable.

So that’s that.

If you want to experiment with unsubscribing, you apparently have to do it in this order: you unsubscribe from my channel, you type my name, Aquarian Diary, into the search function on YouTube, you find my channel, you resubscribe, and then you just have to like four or five videos. You don’t have to watch them all, you just have to like them. Apparently that is one of the magic tricks to resolve these issues.

If you want to support my channel, you can book an astrology reading with me. I’ll put a link to that in the description.

Take care, all the best, and I’ll talk to you again soon.

End episode transcript.

Other related episodes or referenced herein:

What is ‘algospeak’? Inside the newest version of linguistic subterfuge

Addendum July 2, 2023:
This episode of the Future Tense podcast (via ABC Australia) was published some 2 1/2 months after I published this episode of Aquarian Diary. It describes the phenomenon I articulated here in more technical, broad — and even more concerning — terms. Link: Cory Doctorow: Platform capitalism and the curse of “enshittification”

#AgeofAquarius #Algospeak #Platphobia

Search for “Aquarian Diary” in your podcast app to find the podcast version of this channel.

Check my “Community Tab” where I comment and share links I find interesting.