• @Deello@lemm.ee
      37 · 4 months ago

      I mean yes but that’s like saying Bitcoin is used by criminals to buy drugs and weapons. The problem is that’s not their only use.

      • Baut [she/her] auf.
        9 · 4 months ago

        Bitcoin is a bad example, since it’s not designed as a private currency. Monero/XMR is actually usable.

      • @stonerboner@lemmynsfw.com
        -3 · 4 months ago

        Yep. The issue is that they put out a tool that does some good things, but is also heavily adopted by criminals who piggyback on it.

        Should we let child abuse just proliferate with these tools, because there’s so much need for privacy? How do you weed out the bad without kneecapping the good? There’s no good answer here. The same properties that make the tech good enable the bad parts, too.

        There has to be a certain level of knowledge and acceptance of the bad parts to continue developing it. It’s a catch-22, so law enforcement has to pick between sacrificing the privacy or allowing a tool to exist that proliferates child abuse material and other ills.

        There are valid arguments for the importance of privacy, and valid arguments for making sure these crimes don’t have a safe haven. Action to either end will hurt some people and enrage others.

        • @Grimpen@lemmy.ca
          9 · 4 months ago

          The standard I recall being established back in the nineties, when the question was whether strong encryption was even legal in the US, was “substantial non-infringing use” or similar. It’s been a while.

          The problem with key-escrow or anything similar is that any proscribed circumvention is also available to the “bad guys”.

          I think Telegram’s stance would be that they can’t moderate because of strong end-to-end encryption. Back in the day the parallel would have been made to the phone system or mail.

          Of course this is all happening in France, so I have no idea what effect the combination of French and EU laws will have on this, but I would still broadly expect that if a parallel can be made to mail or phone, Telegram would be in the clear. The phone company and mail service have no expectation of content moderation.

          I guess we’ll see.

          • @stonerboner@lemmynsfw.com
            1 · 4 months ago

            The huge difference between mail or phone and telegram is that both mail and phone work with law enforcement, with useful records being made available upon subpoena. Telegram, by design, will not.

            If you think drawing that parallel is useful to Telegram, they would then also be required to maintain the same standards of security as the mail, with package inspections, drug dogs, entire teams of government officials investigating illegal activities etc.

            The criminals use it precisely because it is not a parallel to other available channels, as it circumvents those safeguards.

      • @Vendetta9076@sh.itjust.works
        14 · 4 months ago

        I don’t get this. It can be used as one. It’s being seen as criminal because of its ability to send encrypted messages. Comments like this are meaningless.

        • punkisundead [they/them]
          3 · 4 months ago

          Yeah, and still we should not push the message that Telegram’s encryption is even remotely close to other encrypted messengers. I’ve met more than one person who somehow thought their messages in Telegram were “safer” than they are in WhatsApp (and they never activated the encrypted chat option in Telegram).

    • @pedroapero@lemmy.ml
      4 · 4 months ago

      No, every service provider must remove infringing content when reported. That is not the case on Telegram.