The European Union has formally announced it suspects X, previously known as Twitter, of breaching its rules in areas including countering illegal content and disinformation.

Digital commissioner Thierry Breton set out the alleged infringements in a post on the social media platform.

He said X, which is owned by Elon Musk, was also suspected of breaching its obligations on transparency.

  • @Geek_King@lemmy.world
    63 points · 1 year ago

    Good, there is nothing wrong with having rules against spreading misinformation if that information can be objectively verified as false. This weird idea that we have to be polite and treat people spreading misinformation with respect is silly. Opinions are opinions, but spreading verifiable lies shouldn’t be allowed.

    • Shake747
      -35 points · 1 year ago

      Where and how do you draw the line between opinion and misinformation? And who is the arbiter of truth?

      Most controversial issues are social/political ones. Like Palestine vs Israel - other than “people are dying on both sides,” what can we say that wouldn’t be a subjective view?

      Many things that are subjective can be spun so they appear objective as well. The media is good at this.

      Sometimes even scientific issues arise in the form of skewed or incomplete data pushed by corporations to give their product the go-ahead or to make a point, rather than to study it objectively. See the sugar or tobacco industries as examples. Even “the food pyramid” that was pushed on us all as kids was a lie built on skewed studies.

      This is sort of the consequence of free speech, one that we currently don’t have a solution for. Personally, I’d rather see “misinformation” than have anything censored. This at least gives you the power to infer what you want, and decide who the good and bad actors are

      • @Geek_King@lemmy.world
        24 points · 1 year ago

        I said verifiable facts, not opinions. Example: during the pandemic, here in America, the vaccine became available in May of 2021, and by around the end of summer, everyone who was going to get it had gotten it. Biden’s administration was trying to figure out how to get more fence sitters to take the vaccine. So they thought a door to door education campaign might help: just people going door to door handing out information packets to help educate people on what an mRNA vaccine was and why it wasn’t something to worry about.

        Well, Fox News aired straight-up lies. On air, Fox News talking heads said that the door to door campaign was going to have people show up at your house WITH the vaccine and demand you get the shot right then and there. And what would be next? They could come to your door and take your guns; they could take your BIBLE. I shit you not. That’s fear mongering aimed at one thing: scaring their viewership and the Republican base. None of that fear mongering had any basis in reality; it was verifiably FALSE.

        So that’s what I’m talking about: spreading lies and misinformation that can be proven false should be illegal, for the same reason it’s illegal to yell “FIRE” in a crowded theater when there isn’t a fire.

        • Shake747
          -20 points · 1 year ago

          All the media outlets say stupid shit like that lol. Extreme emotion is what drives viewership for all of them.

          All I’m saying is I’d rather have it out in the open (so you can see the stupidity, just like you did with Fox) than censored.

          How would you know Fox Corp is full of bad actors without seeing actions like that? Now you know not to trust them and the people they’re connected to.

          • Star
            8 points · 1 year ago (edited)

            Well, if they get slapped with the misinformation label, we won’t trust them, they lose viewership, and they fade out. In a perfect example, that is (meaning it all has to go perfectly).

            I would rather not have misinformation out in the open just so we can decide their character. Some people will still see that misinformation, and now it will spread.

            People are not perfect. Even incredibly educated people can read misinformation and take it as truth.

            As a person, I don’t want to have to analyze everything said to make sure it’s true. I want to simply be presented with true stuff, without worry. I want to believe the people in my communities.

            • Shake747
              -3 points · 1 year ago

              I’d love a world like that too, and I think most people would. There are just too many different cultures and opposing views for us to ever have a single truth, though. Like the Palestine vs Israel stuff, for example.

              It’d be very dangerous if there were a small governing body of people that got to decide what’s true and what isn’t. But if we could figure that out openly, with the public, and keep it decentralized so that no single group could control the narrative, maybe we’d inch closer to that ideal.

              • Star
                1 point · 1 year ago (edited)

                I don’t mean going as far as a “Department of Truth”, but there is a line that needs to be enforced. A wrong opinion is not necessarily misinformation. A single person stating a wrong fact as true is not a dramatic offense in a small context.

                News shows telling the public a lie as news is a problem.

                • sousmerde{retardatR}
                  2 points · 1 year ago (edited)

                  I don’t understand people like you.
                  For me, it’s impossibly naive to believe that our own disinformation doesn’t exist, or that it will be censored as much as the disinformation of our opponents, yet that’s apparently the new mainstream opinion.
                  And x.com is the only social media platform to have implemented Community Notes.
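
                  For context on Community Notes: the publicly documented idea is “bridging-based” rating, where a note is only shown when raters who normally disagree with each other both mark it helpful. Below is a minimal sketch of that idea; the function name, clusters, and thresholds are hypothetical, and X’s real system scores raters and notes with matrix factorization over the full rating history rather than simple per-cluster averages.

                  ```python
                  # Illustrative sketch of "bridging-based" note rating (hypothetical, simplified).
                  from collections import defaultdict

                  def note_is_shown(ratings, min_helpfulness=0.7, min_clusters=2):
                      """ratings: list of (viewpoint_cluster, is_helpful) pairs for a single note."""
                      by_cluster = defaultdict(list)
                      for cluster, is_helpful in ratings:
                          by_cluster[cluster].append(is_helpful)

                      # A cluster "endorses" the note if its average helpfulness clears the bar.
                      endorsing = [
                          cluster for cluster, votes in by_cluster.items()
                          if sum(votes) / len(votes) >= min_helpfulness
                      ]
                      # Require endorsement from more than one side, not just a single camp.
                      return len(endorsing) >= min_clusters

                  # Helpful to raters on both sides -> shown; helpful to only one side -> not shown.
                  print(note_is_shown([("left", True), ("left", True), ("right", True)]))   # True
                  print(note_is_shown([("left", True), ("left", False), ("right", True)]))  # False
                  ```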

          • @Geek_King@lemmy.world
            5 points · 1 year ago

            I can tell that type of drivel is bullshit, but their vast viewership isn’t making that distinction. It’s leading to a huge group of Americans who don’t live in reality, and it’s proving to be dangerous, physically and politically. So I still think that if something is a verifiable lie, it should be called out as such and the spreader of that lie should face consequences.

            • Shake747
              -3 points · 1 year ago

              Fair enough, but censorship won’t fix the underlying issue that people aren’t taking the time to think about what they’re reading/watching. It might make it worse once it becomes information “they don’t want you to see”.

      • @Viking_Hippie@lemmy.world
        16 points · 1 year ago

        Personally, I’d rather see “misinformation” than have anything censored. This at least gives you the power to infer what you want, and decide who the good and bad actors are

        Well that’s fucking idiotic! You’re acting like you’ve never encountered humans before, especially on the internet.

        Propaganda works, and not only on gullible and/or dumb people. That’s why there need to be safeguards against dangerous lies like those of Musk, Trump, Hitler, and other fascists with huge cult followings.

        • Shake747
          -13 points · 1 year ago

          Fascists restrict speech more than anyone else.

          That restricted speech, coupled with propaganda, is what is dangerous. The fact that propaganda exists alongside a completely opposing view is a good thing. It means free speech is working, and we all need to be diligent about what we take in. “Don’t believe everything you read!”

          I don’t want to outsource the cognitive load of deciding who and what I should be trusting, while watching what I say, because that’s exactly how you end up in a fascist state.

          • tired_n_bored
            1 point · 1 year ago

            What you say would work in an ideal world, where people freely discuss topics. In the real world, though, especially on social networks, misinformation spreads like a virus, aided by people or governments who know the truth but benefit from the division such lies create. I agree it’s difficult to draw a line between opinions and misinformation, but I believe it’s necessary.

      • @MagicShel@programming.dev
        9 points · 1 year ago (edited)

        Twenty years ago I might’ve agreed with you, but not anymore. I don’t know how the truth is determined in every case, but I do know the internet is useless when lies outnumber truth 10:1. Ideally such arbiters would be folks who can be held financially or criminally liable for lying, maybe through a professional certification like lawyers and engineers have. If someone doesn’t have any skin in telling the truth, why believe them?

        • Shake747
          0 points · 1 year ago

          I’d argue it’s always been 10:1; we just have access to all of it at the click of a button, and it’s all recorded now. Remember how many old wives’ tales used to get passed around back then?

          “Bubblegum stays in your stomach for YEARS!” “Shaving makes your hair come back thicker” “Don’t crack your knuckles, it’ll give you arthritis!”

          And now we have the ability to weigh these claims against those who dispute them, whereas before it would’ve been much more difficult to uncover the truth.

          But to your point about believing people with skin in the game, I’d say that’s a great idea, as long as we can keep it decentralized and as open to the public as possible. Pharmaceutical companies have a lot of exemptions from this kind of liability, though, so we’d need a method for them as well.

      • @HWK_290@lemmy.world
        7 points · 1 year ago

        I think the idea is that the burden of proof should be on the preacher, not on the pulpit at which they stand. In other words, it’s not rejecting the idea; it’s rejecting the person until they are able to prove their claims with rational evidence.