YouTube more likely to recommend conservative videos to all
go.theregister.com
There goes Woke Big Tech again, downplaying traditional liberal views

How much you want to bet that they’re getting this directive handed from the Feds?

Ha, that’s a moot bet. The Feds censor the press (as the NYT admitted), so no way they would miss such a huge medium as YouTube. And YouTube is an even better tool, because they can not only censor it, but actively push content to people who would never buy their paper rags.

Source on NYT saying Feds censor the press?

Welp yep that’s a solid source all right. Unbelievable

Mad

Probably because, regardless of your politics, they will invoke strong emotions or make you curious.

YT does not care what content it surfaces; suggestions depend solely on what the user wants, the same as in other Google services. If you search for and watch Nazi videos, it puts Nazi videos in your suggestions. YT is only interested in the dough and in being able to place its ads and trackers. There are videos of all political colors, as long as they are not removed for violating the rules or reported by users. Here in Europe I subscribe to various left-wing channels, along with scientists, history and others, and I get suggestions on exactly these topics. Certainly, like all Google services, it’s important to watch the privacy issues, but there is no real alternative to YT; nobody has so much content. The only thing you can do is use a front-end like Invidious, Piped or others.

@OptimusPrime@lemmy.ml

Google influences people to think the way it wants them to. It changes what it shows, and even some websites’ content, depending on the person’s beliefs. They say this in their terms of service, and yet people think Google is only bad because it has some privacy issues. So I don’t think it’s a stretch to think that they make YouTube’s algorithm recommend more of certain content.

Nope, at least not in Europe; YT shows me good suggestions exactly related to my selections. Maybe the algorithm in the US is different. It is different in the search engine, which I don’t use, not only because of the lack of privacy, but also because of the danger of the filter bubble, plus half a page of ads in the results. This problem doesn’t exist as such in YT; it only shows me suggestions from subscribed and related sources. I have a Google account from the times when the motto “don’t be evil” was still valid, so I have access to my dashboard and I know what data it has about me, which is rather little. No relevant histories, as apart from YT I don’t use any other Google services.

suggestions depend solely on what the user wants

Lmao, it can read my mind?

I think even Markiplier showed that if you clear cookies and shit and just keep clicking recommended you’re eventually gonna end up on Holocaust denial

It doesn’t matter what you delete from your browser or locally, but what YT deletes or doesn’t from the logs of your visits that it has saved on its own servers.

No, but your IP, fingerprint, cache, service workers and other data related to your previous searches are. Go to Browserleaks and see what every page you visit, YT included, knows about you.
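The point above is that these signals survive a cookie wipe. A minimal sketch (not any real tracker's code; the signal names and hashing scheme are purely illustrative) of how a server could derive a stable identifier from what the browser sends anyway:

```python
import hashlib

def fingerprint(ip: str, user_agent: str, screen: str) -> str:
    """Derive a stable identifier from signals sent on every request.

    Hypothetical example: real fingerprinting uses many more signals
    (fonts, canvas rendering, timezone, installed plugins, etc.).
    """
    raw = f"{ip}|{user_agent}|{screen}".encode()
    return hashlib.sha256(raw).hexdigest()[:16]

# Same visitor before and after clearing cookies: the signals are
# unchanged, so the derived identifier is unchanged too.
before = fingerprint("203.0.113.7", "Mozilla/5.0 (X11; Linux x86_64)", "1920x1080")
after = fingerprint("203.0.113.7", "Mozilla/5.0 (X11; Linux x86_64)", "1920x1080")
print(before == after)  # True
```

Deleting local state changes nothing here, which is why tools like Browserleaks report these passive signals separately from cookies.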

Yes, and it also seems to favor right-wing content for some reason.

Mad

Correction: it shows what would make the viewer want to stay on the platform, which is not necessarily what the viewer wants to see. So showing political extremes or misrepresentations is encouraged, because they invoke strong emotions and make the viewer want to stay, whether they wanted to see that stuff or not, and whether they wanted to stay on YouTube or not.
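The engagement-first objective described above can be sketched as a toy ranker (this is not YouTube's actual system; the signal names `predicted_watch_minutes` and `provocation` are invented for illustration): videos predicted to hold attention outrank what the viewer actually searched for.

```python
# Toy engagement-first ranker: score = predicted watch time, boosted
# by how emotionally provocative the video is predicted to be.

def engagement_score(video: dict) -> float:
    # "provocation" is a hypothetical 0-1 estimate of how strongly
    # the video triggers reactions (outrage, curiosity, fear).
    return video["predicted_watch_minutes"] * (1.0 + video["provocation"])

def rank(candidates: list[dict]) -> list[dict]:
    # Order candidates by expected engagement, not by relevance
    # to what the viewer asked for.
    return sorted(candidates, key=engagement_score, reverse=True)

candidates = [
    {"title": "calm tutorial", "predicted_watch_minutes": 8.0, "provocation": 0.1},
    {"title": "outrage take", "predicted_watch_minutes": 6.0, "provocation": 0.9},
]

print([v["title"] for v in rank(candidates)])
# ['outrage take', 'calm tutorial']
```

Under this kind of objective the shorter but more provocative video wins, which is the dynamic the comment describes: the platform optimizes for staying, not for what the viewer set out to watch.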

“We found that YouTube’s recommendation algorithm does not lead the vast majority of users down extremist rabbit holes, although it does push users into increasingly narrow ideological ranges of content in what we might call evidence of a (very) mild ideological echo chamber”

And

“Push slightly to the right.”

So the effect is small and needs to be reproduced before it can be assessed correctly. Plus, it all depends on how they defined “right-wing” content, which the news article takes for granted.

@ree @OptimusPrime It doesn’t take much for YouTube’s algorithm to quickly screw up just because you watched something you wouldn’t normally watch out of curiosity.

I watched a few The Quartering videos roasting Fallout 76 when it came out (I did not know who he was back then), and over the next few months my feed was full of stupid fascist shit. Literally a few videos caused this.

The title of this post is bullshit and does not correspond to the study results.

What I want to see in-depth analysis of is Commander Ashtar and the Pleiadian fleet besieging the website!

@terulo@lemmy.pt

deleted by creator

That’s good, but this is mostly a problem for young teens and people like my boomer parents who don’t know the workings of the internet and media very well.

Oh sweet summer child, if only you knew how many youngsters are easy to brainwash.

That’s what I said, though?

I missed the young teens part, but young adults are also incredibly capitalistic. So I covered that.
