TLDR if you don’t wanna watch the whole thing: Benaminute (the YouTuber here) creates a fresh YouTube account and watches every recommended short without skipping. They repeat this 5 times, each time setting their location to a different random US city.

Below is the number of shorts after which alt-right content was first recommended. Left-wing/liberal content was never recommended first.

  1. Houston: 88 shorts
  2. Chicago: 98 shorts
  3. Atlanta: 109 shorts
  4. NYC: 247 shorts
  5. San Francisco: never (Benaminute stopped after 250 shorts)

There was, however, a certain pattern to this. First, non-political shorts were recommended. After that, AI Jesus shorts started showing up (either AI Jesus talking to you, or an AI narrator reading verses from the Bible). Then came non-political shorts by alt-right personalities (Jordan Peterson, Joe Rogan, Ben Shapiro, etc.). Finally, explicitly alt-right shorts started to be recommended.

What I personally found both disturbing and kinda hilarious was the case of Chicago. The non-political content at the beginning was a lot of Gen Alpha brainrot. Benaminute said this seemed to be the norm for Chicago, as they had observed the same thing in another, similar experiment (which dealt with long-form content instead of shorts). After some shorts, one came up in which AI Gru (the main character from Despicable Me) was telling you to vote for Trump. He went on about how voting for “Kamilia” would lose you “10000 rizz”, and how voting for Trump would get you “1 million rizz”.

In the end, Benaminute, along with Miniminuteman, proposes a hypothesis to explain this phenomenon: alt-right content might incite more emotion and therefore rank higher in the algorithm. They say the algorithm isn’t necessarily left wing or right wing, but that alt-right creators simply have a better understanding of how to capture and grow an audience.
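
To make that hypothesis a bit more concrete, here’s a minimal toy sketch (in Python) of an engagement-weighted ranking score. The field names and weights are made up for illustration and this is obviously not YouTube’s actual algorithm; the point is just that a politically neutral score built from watch time and engagement will favor whatever provokes the strongest reactions.

```python
from dataclasses import dataclass

@dataclass
class Short:
    title: str
    avg_watch_fraction: float  # share of the clip viewers watch on average, 0..1
    engagement_rate: float     # likes + comments + shares per view, 0..1

def rank_score(s: Short, w_watch: float = 0.6, w_engage: float = 0.4) -> float:
    """Toy ranking score: a politically neutral blend of watch time and engagement.

    The weights are assumptions for illustration only; the takeaway is that any
    score built from engagement signals rewards whatever provokes the strongest
    reaction, regardless of its politics.
    """
    return w_watch * s.avg_watch_fraction + w_engage * s.engagement_rate

candidates = [
    Short("calm long-form explainer",    avg_watch_fraction=0.35, engagement_rate=0.02),
    Short("cute duck compilation",       avg_watch_fraction=0.55, engagement_rate=0.04),
    Short("outrage-bait political clip", avg_watch_fraction=0.70, engagement_rate=0.12),
]

# The outrage-bait clip ranks first purely because it keeps people watching and
# reacting, not because the score knows or cares about its politics.
for s in sorted(candidates, key=rank_score, reverse=True):
    print(f"{rank_score(s):.2f}  {s.title}")
```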

  • Zoot@reddthat.com · 4 days ago

    Just anecdotal, but I only ever watch duck videos or funny animal videos, with the occasional other funnies or crazy science things, and that’s still all I ever get. Other days I get plenty of cool music, like Tesla coils making music or other piano music.

    Am I youtubing wrong?

    • UraniumBlazer@lemm.ee (OP) · 4 days ago

      Nah, good for you. Maybe it’s because of your geographical location, or you’re just lucky? I have experienced what the video above describes quite a lot, though.

      I’m not American, so I didn’t exactly see a lot of Trump content (although there was some). I largely saw a lot of Hindu nationalist content (cuz of my geographical location). The more I disliked the videos, the more they got recommended to me. It was absolutely pathetic.

  • labbbb2@thelemmy.club · 7 days ago

    I have encountered this too. Around 2 months ago I visited YouTube to watch some video, rejected all cookies, and there were “woke” videos everywhere in the recommendations. I was using it without an account.

  • gmtom@lemmy.world · 9 days ago

    So you’re saying we need to start pumping out low quality left wing brainrot?

    • eronth@lemmy.world · 8 days ago

      Insanely, that seems to be the play. Not logic or reason, but brainrot and low blows. Which is a bit at odds with the actual desire.

      • xor@lemmy.dbzer0.com · 8 days ago

        fight fire with fire, I guess…
        maybe people get on board quicker if they feel the emotions first and then learn the logic…
        one good example is Noam Chomsky: everything he says is gold, but he says it so slowly and dispassionately that even people who agree with him find it hard to watch.

    • sudo@programming.dev · 8 days ago

      It only needs to be extremely simplified and evoke emotional reactions; those are just basic propaganda rules. The brainrot quality of the content is a consequence of its sheer quantity. You can’t make that volume of content without it being fully automated AI slop.

      What the experiment overlooks is that there are PR companies being paid to flood YouTube with right-wing content, actively trying to game its algorithm. There simply isn’t a left wing with the capital to manufacture that much content. No Soros bucks for AI minions in keffiyehs talking about Medicare.

  • Pennomi@lemmy.world · 10 days ago

    I think the explanation might be even simpler: right-wing content is the lowest common denominator, and mindlessly watching every recommended short drives you downward in quality.

    • Plebcouncilman@sh.itjust.works · 10 days ago

      I was gonna say this. There’s very little liberal or left-leaning media being made, and what there is is mostly aimed at a female or LGBTQ audience. Not saying that men can’t watch those, but there’s not a lot of “testosterone”-infused content with a liberal leaning (one of the reasons Trump won was this), so by sheer volume you’re bound to see more right-leaning content, especially if you are a cisgender male.

      Been considering creating content myself to at least stem the tide a little.

      • BassTurd@lemmy.world · 10 days ago

        I think some of it is that liberal media is more artsy and creative, which is more difficult to just pump out. Creation is a lot more difficult than destruction.

        • anomnom@sh.itjust.works · 9 days ago

          Plus, fact-based videos require research, sourcing, and editing.

          Emotional fiction only takes as long to create as a daydream.

        • Plebcouncilman@sh.itjust.works · 10 days ago

          Not necessarily. For example, a lot of “manosphere” guys have taken hold of philosophy, health, and fitness topics; a liberal influencer could give a liberal view on those same subjects. In philosophy, for instance: explain how Nietzsche was not just saying that you can do whatever the fuck you want, or how Stoicism is actually a philosophy of tolerance, not of superiority, etc. There’s really a lot of space that can be covered.

  • ohlaph@lemmy.world · 10 days ago

    If I see any alt-right content, I immediately block the account and report it. I don’t see any now. I go to YouTube for entertainment only. I don’t want that trash propaganda.

    • KariKariCrunch@lemmy.world · 10 days ago

      Same. I watched one Rogan video in, like, 2019, and it was like opening a floodgate. Almost immediately, nearly every other recommendation was some right-wing personality’s opinion about “cancel culture” or “political correctness”. It eventually calmed down once I started blocking those channels and anything that looked like it might lead to that kind of content. I can only imagine what would pop up now.

  • HoMaster@lemm.ee · 9 days ago

    Alt-right videos are made to elicit outrage, hate, and shock, which our lizard brains react to more strongly (as potential danger) than positive videos spreading unity and love. It’s all about getting as many eyeballs on the video as possible to make money, and this is the most effective way to do it.

    • sudo@programming.dev · 8 days ago

      There’s also an entire industry around mass producing this content and deliberately gaming the algorithm.

  • Yerbouti@sh.itjust.works · 8 days ago

    There’s a Firefox extension to hide Shorts and another to default to your Subscriptions feed. Along with uBlock, those are the only things that make YouTube usable.

    • Ashelyn@lemmy.blahaj.zone · 7 days ago

      That doesn’t fix the out-of-the-box experience of the platform for millions, if not billions, of people. Yes, it’s a good step to take individually, but it’s insufficient to deal with the broader issue raised here: latent alt-right propagandizing.

  • LovableSidekick@lemmy.world · 8 days ago

    Does this mean YouTube preferentially selects alt-right shorts, or that alt-right people make more shorts? Or something else entirely? Jump to your own conclusion.

    • RisingSwell@lemmy.dbzer0.com · 8 days ago

      YouTube selects whatever gives YouTube the most views for the longest time. If that’s right-wing shorts, they don’t care.