Yesterday I heard a Joe Rogan guest mention a particular non-Rogan podcast episode. It was a discussion about science and policy by two professional, well-credentialed scientists and a layperson. I Googled it, couldn't find it. Same for Bing and DDG. Turns out it had been a popular YouTube video, but they objected to the conclusions and de-platformed it from YouTube and the big indexes.
It took maybe another dozen clicks to find it on one of the participants' own blogs. And the podcast is still up and hosted by Apple. So it's something you can find if you know about it, but not by searching on the topic. At this point, at least, the shadow ban is still soft.
Add that as a data point for Google no longer producing high quality search results.
Perhaps with all the deplatforming, delisting, and other deleterious interventions, the 'good' content is removed, while at the same time those removals are skewing their machine learning data sets. It is almost a self-imposed adversarial attack on result quality.
I think that the model is, somehow, irreparably broken. Remember that, when it started returning photos of black people in searches for "gorilla", they just stopped using it for "gorilla" searches.
My suspicion is that it's been poisoned by some combination of an improperly considered tagging process and malfeasance.
I do think it's hilarious that the one time Google doesn't drop a relevant search term in the middle of a query is for something like "black haired man." The way it treats terms, you might expect it to drop the "haired" and return pictures of black men, but no, it's almost exclusively white dudes with black hair (never mind that black and brown people also have black hair).
Rogan apparently* said that Spotify owns the rights to the JRE video, so they'll take down rehosted videos regardless of their content if they discover them. And if a video happens to be controversial by YouTube standards, it gets discovered faster and taken down faster.
> so they'll take down rehosted videos regardless of their content if they discover them
If that were true then Joe's YouTube channel would have 0 posts since the time he joined Spotify. It's chock-full of clips from Spotify broadcasts. I'm also fairly certain Spotify knows of Joe's YT channel, considering that's where they found him and hired him away. It likely also would have been in his Spotify contract whether or not he could continue to post clips to his YT channel, or elsewhere.
> It likely also would have been in his Spotify contract whether or not he could continue to post clips to his YT channel, or elsewhere.
I think this is very likely, to allow Rogan to post short clips on his own channel for marketing purposes and keep existing content, while enforcing copyright on other channels.
I had this issue with a less controversial topic yesterday.
I was trying to explain VACnet to someone and wanted to link the original talk. YouTube didn't have it anymore. 3kliksphilip covered the topic and he linked to the video. The link said the video had been removed, though not why. 3kliksphilip's video helpfully said it was a GDC 2018 talk in the first sentence, so I searched Google for that and found the video hosted on another site.
I don't think censorship was at play. I just think the world of mutable data is volatile, and it's impossible to keep a 100% accurate, up-to-date index of it.
I find it pretty egregious that tech companies in particular consider making misinformation difficult to find to be a way to fight it. Where's the trust in people to critically assess evidence? We seem too afraid they'll pick the wrong "truth". What does that say about our lack of respect for their ability to make decisions? Is behaving this way likely to change their minds?

Stupid, irrational people have always existed and will always exist. Better to at least try to train them to critically assess what they read - which will reduce total stupidity in the world - than to block them from reading such material, which only makes them think there's a conspiracy. And what better way to make high-quality allies than to have them critically assess both sides and decide to pick yours? We might even change our own minds, having a more complete body of information to draw upon - better for everything.
Blocking access to information seems optimally bad for truth.
If limiting the spread of 'misinformation' is their intended goal, it seems like they would have a bigger impact by just toning down their algorithmic recommendations. Leave the videos up and available to search, and stop doing hyper-optimized recommendations designed to increase 'engagement'.
Of course, that would lose them advertising money, which is probably why they didn't go that route.
I think a large portion of the internet agrees with you: the problem isn't that this podcast was de-indexed; the problem is that I was able to learn about it in the first place.
No, the problem is that you're conflating a decision by Google not to serve relevant results, for moral and business reasons, with a failure by Google to do what they intend to do. They don't care if you search for far-right stuff; they just aren't interested in serving it to you. They are interested in serving you information about other stuff you're interested in.
Agreed; the former - a large corporation engaging in "too much free time on their hands" Twitter/1984-tier cancel culture - is arguably far more egregious, but par for the course for a company whose leaders, directors, and executives were brought to literal tears of anger and frustration in a post-2016 election meeting within the company. [0], [1]