Social media companies are receding from their role as watchdogs against conspiracy theories ahead of the 2024 presidential election.

    • @TwilightVulpine@lemmy.world · 1 year ago

      Because, like it or not, that is where a lot of people get information these days. If it keeps pushing bullshit, people believe bullshit. For example, anti-vaxxers didn't use to be so common, until their bullshit was spread all over social media.

      I would love for people to be wise enough to verify information in reliable sources and not just believe everything they see, but sadly that's not the world we live in.

      • @Jakeroxs@sh.itjust.works · 1 year ago

        Antivax sentiment has been around for hundreds of years, long before the Internet, spread mostly through political party rhetoric and/or religion. I'm not saying the spread hasn't increased, but people believe wrong information all the time.

        • @TwilightVulpine@lemmy.world · 1 year ago

          There is always a nutball, but my point is that, yes, it has increased significantly. Vaccines were a settled matter; people far and wide trusted them. Now vaccination rates have gone down and diseases that we had nearly eliminated are making a comeback. This has happened because now any stupid grifter can have a worldwide platform and a following that actively spreads their nonsense.

    • Andy · 1 year ago (edited)

      I think we need to pursue a strategy that discourages the spread of disinformation while avoiding making the platforms themselves the arbiters of truth.

      I think social media platforms are like a giant food court. If you do nothing to discourage the spread of germs, your salad bars and buffets are all going to be petri dishes of human pathogens. That doesn't mean the food court needs hospital-level sterilization measures. It just means the FDA requires restaurants to use dishwashers that reach 71 °C, and employees are required to wash their hands.

      In this case, I think we should experiment. What if platforms were required to let users flag something as disinformation, and share a credible source if they like? Maybe users could see all the flags and upvote or downvote them. The information would still be there, but you’d go to the InfoWars page and it would say, “Hey: You should know that 95% of people say this page posts mostly bullshit.”

      Something like that. I don’t like the role the companies play currently, but disinformation does carry the potential to cause serious harm.

        • Andy · 1 year ago

          Yes?

          I can’t tell if you’re agreeing with me or not.

        • @Jakeroxs@sh.itjust.works · 1 year ago

          I am also against deleting valid news about wrongdoing by Democrats, if you're implying this stance is political in some way.