• paris@lemmy.blahaj.zone

        YouTube makes money by showing ads on videos people watch. If it shows people the videos they want to watch, it gets to show more ads before someone stops watching YouTube for the day. This incentivizes YouTube to surface the videos people will watch the longest, with no regard for anything but advertisers’ willingness to have their ads played on those videos.
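
        To make the incentive concrete, here’s a minimal sketch (purely illustrative; the names and scoring are my assumptions, not YouTube’s actual code) of a feed ranked on nothing but predicted watch time and advertiser tolerance:

        ```python
        from dataclasses import dataclass

        @dataclass
        class Video:
            title: str
            expected_watch_seconds: float  # hypothetical prediction from past engagement
            advertiser_friendly: bool      # the only content gate in this toy model

        def rank_for_feed(candidates: list[Video]) -> list[Video]:
            # Filter only on advertisers' willingness, then sort by predicted
            # watch time; nothing else about the content is considered.
            eligible = [v for v in candidates if v.advertiser_friendly]
            return sorted(eligible, key=lambda v: v.expected_watch_seconds, reverse=True)
        ```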

        Also it’s expensive to moderate a platform so big that one in three humans uses it.

        • momo420@lemmy.world

          Thank you, I think your answer is much better. Still, I don’t think “capitalism” is an informative answer to why/how these search suggestions work.

  • ImplyingImplications@lemmy.ca

    Google is currently defending itself before the US Supreme Court in a lawsuit alleging that it assisted the terrorist group ISIS in recruiting members, after the YouTube algorithm was found to have promoted ISIS recruiting videos to young men who later committed a terrorist attack.

    So to answer your question using Google’s argument: they have so many videos that an advanced search feature is required to make the site usable, and that feature only suggests things that are popular. It’s not their fault that ISIS recruitment videos (or other violent content) are popular.
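
    A toy version of that argument (hypothetical; the query log and counts are made up) looks something like this: suggestions are just the most frequent past queries with a matching prefix, with no editorial judgment anywhere.

    ```python
    from collections import Counter

    # Invented query log: popularity-only autocomplete has no opinion about
    # content; it only counts what people have already searched for.
    query_log = Counter({
        "protest songs": 1200,
        "protest turned violent": 950,
        "protest near me": 400,
    })

    def suggest(prefix: str, n: int = 3) -> list[str]:
        matches = [(q, c) for q, c in query_log.items() if q.startswith(prefix)]
        return [q for q, _ in sorted(matches, key=lambda m: -m[1])[:n]]

    print(suggest("protest"))  # most-searched completions first, whatever they are
    ```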

    The counter-argument is: Google is curating content by displaying things people didn’t seek out themselves. That is direct promotion by Google itself, and therefore Google should be treated as the publisher of that content. Anyone publishing violent content should be held liable for it.

    • ipkpjersi@lemmy.one

      So wait, is Google only suggesting things that are popular, or are they displaying things people didn’t seek out themselves? How do they prove that in court? Do they need to show their source code or something?

      • Prethoryn Overmind@lemmy.world

        Yeah, except it gets more complicated than that. Google wouldn’t necessarily be promoting it either, since their algorithm surfaces popular searches. And “terrorism” has become an overused word, with protests routinely compared to terrorism.

        As an example, I live in a pretty red state, where I would consider myself a Democrat/liberal. When the George Floyd protests were happening, a lot of people in my state referred to the non-protest raids as terrorism, despite the fact that they will all defend the very clear terrorism at the Capitol as an attempt to save the U.S.

        Point being: take the words “protest” and “terrorism,” set them side by side as an exaggeration for literally anything half the country disagrees with, and boom, you get popular terrorism searches.

        That said, I’m not saying Google isn’t at fault: their algorithm is designed to keep feeding you that kind of content, and the deeper you go, the more ingrained in it you get and the more insane it gets.
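
        A toy model of that rabbit-hole effect (the scale, numbers, and drift factor are invented for illustration):

        ```python
        # Each click nudges the inferred interest toward the extreme end of a
        # topic, and the next recommendation is drawn around the new estimate.
        def simulate_rabbit_hole(clicks: int, drift: float = 0.15) -> float:
            extremity = 0.1  # starting interest on a 0..1 mild-to-extreme scale
            for _ in range(clicks):
                recommended = min(1.0, extremity + drift)  # recommend slightly past current taste
                extremity = recommended  # a click confirms it, so the model updates
            return extremity

        print(simulate_rabbit_hole(5))   # ~0.85: a few clicks deep
        print(simulate_rabbit_hole(10))  # 1.0: pinned at the extreme
        ```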

  • SJ0@lemmy.fbxl.net

    Presumably because that’s what lots of people are searching for.

    OTOH, Google giving insane suggestions is sort of a meme in itself, so who knows what they actually use?

    • Meldroc@lemmy.world

      The algorithm’s designed to promote engagement. Getting angry groups screaming and trying to murder people counts as engagement.

      As long as the screaming and arguing happens on their site and drives ad revenue, they don’t care about the murdering part.
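
      In other words, something like this invented scoring function, where the metric only measures activity and never asks what the activity is:

      ```python
      # Illustrative engagement score (made-up weights): a rage-fueled comment
      # war and a friendly discussion with the same numbers are identical to
      # the ranker, and both drive the same ad revenue.
      def engagement_score(watch_minutes: float, comments: int, shares: int) -> float:
          return watch_minutes * 1.0 + comments * 2.0 + shares * 3.0

      print(engagement_score(watch_minutes=8, comments=50, shares=10))  # 138.0
      ```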