I made sure to remove cookies and not sign in, so I think these are the base suggestions made by YouTube.
The YouTube algorithm is biased towards content that encourages further engagement; anger causes that, and right-wing propaganda is designed to make people angry.
Yup, content consumption algorithms almost always seem to push ragebait, clickbait, and other low quality results.
This is the reason why all corporate media has become the dumpster fire it is (not just social media). They use negative emotions like fear and anger to promote engagement, so all you get as a viewer is stuff that gets you fired up. The quality of journalism is so low now that they're fabricating stuff to engage the viewer. Then there's no journalistic accountability when they do get caught with their hand in the cookie jar.
“Then there’s no journalistic accountability when they do get caught with their hand in the cookie jar.”
Literally Fox NewsEntertainment when brought to court over Tucker Carlson's bullshit… "We're not news, we're Entertainment™©®; no reasonable person would believe what Mr. Carlson is saying is true." So they don't need to have any integrity whatsoever…
Because the algorithm caters to nazis.
It's not that the algorithm caters to nazis, but that it would have flagged Republicans as hate speech. As a result, algorithms designed to combat hate speech, racism, white supremacy, and neo-nazism were watered down.
LOL, if the algorithm can't tell Republicans from nazis and they back down when Republicans complain… the result is that it caters to nazis hiding among Republicans, because the goddamned GOP shelters its nazis.
It's definitely heavily weighted towards right-wing content, even though it should be less popular. It's like how right-wing states always complain about "liberal spending" but then their entire states are subsidized by blue states because their policies are trash. Right-wing ideology can't stand on its own without 'training wheels', i.e. funding from massively wealthy donors who want to manipulate people's reality.
With the recent wave of incredibly stupid protests, it simply became a popular thing to search for.
What stupid protests? I seem to have missed them.
Search “Just Stop Oil”. They may have some good intentions but have yet to win over a single person.
I was looking for Captain Marvel movie content way back when it came out and accidentally clicked on one of those pitiful right-wing, woman-hating nerdbro videos, and my suggestions suddenly became quite heavily peppered with similar horrible content. It feels like the same thing doesn't happen with non-conservative content quite so much. I wonder if there's just so much right-wing and liberal content that the algorithm's percentage calculations don't know how to compensate for this sort of political nuance, or is it something more insidious?
I’m trans, watch OneTopicAtATime and similar pro-trans and pro-gay channels and still get recommended Matt Walsh somehow.
It’s probably what people have searched for before. I mean why would anyone watch peaceful protest videos on youtube?
Did you change your IP address as well?
At any rate, Google shows things to people that other people are looking at, as that's how its algorithm decides what is popular or not. So the conclusion that can be drawn from this is that most people who search for videos of protestors are looking to see them get owned. If you consider internet demographics, this should not be that unusual, really.