Da Cap’n@lemmy.dbzer0.com to Privacy@lemmy.dbzer0.com · English · 3 months ago

DeepSeek 'shared user data' with TikTok owner ByteDance

www.bbc.com

18 comments
  • cross-posted to:
  • privacy@lemmy.ca
  • artificialintelligence@lemmy.sdf.org
  • technology@lemmy.zip
  • hackernews@lemmy.bestiver.se
  • world@quokk.au
  • bbc@rss.ponder.cat
111 points
South Korea's data protection regulator says user data was sent to the Chinese owner of TikTok.
  • Scrubbles@poptalk.scrubbles.tech · 40 points · 3 months ago

    I am zero surprised.

  • fxomt@lemmy.dbzer0.com (mod) · 33 points · 3 months ago

    No way… you’re telling me a free AI is profiting off my data?

    Always run AI locally!

    • FarraigePlaisteach@lemmy.world · 5 points · 3 months ago

      Is that feasible for someone with an office PC with integrated graphics? Asking for a friend.

      • BakedCatboy@lemmy.ml · 4 points · 3 months ago

        If you have a lot of RAM, you can run small models slowly on the CPU. I’d guess your integrated graphics won’t fit anything useful in its VRAM, so if you really want to run something locally, getting some extra sticks of RAM is probably your cheapest option.

        I have 64 GB and I run 8–14B models. 32B is pushing it (it’s just really slow).
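As a rough sanity check on those RAM figures, here is a back-of-the-envelope estimate. The bits-per-weight and overhead values are assumptions (roughly a Q4-class GGUF quantization plus ~20% for KV cache and activations); real requirements vary with quantization and context length.

```python
def model_ram_gb(params_billion: float, bits_per_weight: float = 4.5,
                 overhead: float = 1.2) -> float:
    """Rough RAM needed to run a quantized LLM on CPU.

    params_billion:  parameter count in billions (14 for a 14B model)
    bits_per_weight: effective quantization (~4.5 assumed for a Q4-class GGUF)
    overhead:        fudge factor for KV cache and activations (~20% assumed)
    """
    bytes_per_param = bits_per_weight / 8
    return params_billion * bytes_per_param * overhead

print(f"14B: ~{model_ram_gb(14):.1f} GB")  # ~9.4 GB, comfortable in 64 GB
print(f"32B: ~{model_ram_gb(32):.1f} GB")  # ~21.6 GB, fits but less headroom
```

By this estimate even a 32B model fits in 64 GB; the "really slow" part comes from compute and memory bandwidth, not capacity.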

        • Lucy :3@feddit.org · 3 points · 3 months ago

          Don’t iGPUs use system RAM as VRAM directly? You’d only need to configure how much in the BIOS (e.g. by default it uses 1.5 GB of 8 GB or something, and you can set it to 6 of 8 GB).

          • BakedCatboy@lemmy.ml · 2 points · edited · 3 months ago

            Yes for gaming, but for LLMs I’ve heard that the bandwidth limitations of using system RAM as VRAM hurt performance worse than running on the CPU with system memory directly, since smaller models are memory-bandwidth limited.

            I’ve never tried to run AI on an iGPU with system memory, though, so you could try it, assuming it lets you allocate 32 GB or more, ideally 64 GB. I think you’ll also need a special runner that supports iGPUs.
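The bandwidth point can be made concrete: during generation, each new token has to stream (roughly) the entire weight set through memory once, so memory bandwidth sets a hard ceiling on tokens per second no matter whether the CPU or iGPU does the math. A sketch, using illustrative bandwidth numbers (assumptions, not benchmarks):

```python
def max_tokens_per_sec(model_gb: float, bandwidth_gb_s: float) -> float:
    # Upper bound: every generated token reads all weights from memory once.
    return bandwidth_gb_s / model_gb

# Illustrative figures (assumed, not measured):
ddr4_dual_channel = 51.2  # GB/s, DDR4-3200 in dual channel
discrete_gpu = 448.0      # GB/s, e.g. a midrange card with GDDR6

model_size = 9.0  # GB, roughly a 14B model at 4-bit quantization
print(max_tokens_per_sec(model_size, ddr4_dual_channel))  # ~5.7 tok/s ceiling
print(max_tokens_per_sec(model_size, discrete_gpu))       # ~50 tok/s ceiling
```

Since CPU and iGPU share the same DDR bus, the ceiling is the same either way, which is why offloading to an iGPU buys little for small models.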

    • jonne@infosec.pub · 3 points · 3 months ago

      Yeah, AI is even being trained on data provided by the Nazi Steve Huffman’s website.

  • Anarki_@lemmy.blahaj.zone · 27 points · 3 months ago

    ⢀⣠⣾⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⠀⠀⠀⠀⣠⣤⣶⣶ ⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⠀⠀⠀⢰⣿⣿⣿⣿ ⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣧⣀⣀⣾⣿⣿⣿⣿ ⣿⣿⣿⣿⣿⡏⠉⠛⢿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⡿⣿ ⣿⣿⣿⣿⣿⣿⠀⠀⠀⠈⠛⢿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⠿⠛⠉⠁⠀⣿ ⣿⣿⣿⣿⣿⣿⣧⡀⠀⠀⠀⠀⠙⠿⠿⠿⠻⠿⠿⠟⠿⠛⠉⠀⠀⠀⠀⠀⣸⣿ ⣿⣿⣿⣿⣿⣿⣿⣷⣄⠀⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣴⣿⣿ ⣿⣿⣿⣿⣿⣿⣿⣿⣿⠏⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠠⣴⣿⣿⣿⣿ ⣿⣿⣿⣿⣿⣿⣿⣿⡟⠀⠀⢰⣹⡆⠀⠀⠀⠀⠀⠀⣭⣷⠀⠀⠀⠸⣿⣿⣿⣿ ⣿⣿⣿⣿⣿⣿⣿⣿⠃⠀⠀⠈⠉⠀⠀⠤⠄⠀⠀⠀⠉⠁⠀⠀⠀⠀⢿⣿⣿⣿ ⣿⣿⣿⣿⣿⣿⣿⣿⢾⣿⣷⠀⠀⠀⠀⡠⠤⢄⠀⠀⠀⠠⣿⣿⣷⠀⢸⣿⣿⣿ ⣿⣿⣿⣿⣿⣿⣿⣿⡀⠉⠀⠀⠀⠀⠀⢄⠀⢀⠀⠀⠀⠀⠉⠉⠁⠀⠀⣿⣿⣿ ⣿⣿⣿⣿⣿⣿⣿⣿⣧⠀⠀⠀⠀⠀⠀⠀⠈⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢹⣿⣿ ⣿⣿⣿⣿⣿⣿⣿⣿⣿⠃⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢸⣿⣿

    • asudox@lemmy.asudox.dev · 4 points · 3 months ago

      when does it end

  • Ech@lemm.ee · 11 points · 3 months ago

    What!? What a complete and utter shocker!!

    Tbf, I don’t use any of these corporate LLMs for exactly that reason. At best they use user interaction to “improve” the models, and more likely they’re using it to profile and track users as well. Fuck that.

  • Coldmoon@sh.itjust.works · 8 points · 3 months ago

    Surprised pikachu face.

  • ObsidianZed@lemmy.world · 5 points · 3 months ago

    Everyone had it wrong! It was the Chinese Government stealing your data to give to TikTok!

    • Da Cap’n@lemmy.dbzer0.com (OP) · 3 points · 3 months ago

      Uno reverse

    • LWD@lemm.ee · 2 points · 3 months ago

      “I only care if America has my data”

      DeepSeek > TikTok > Oracle > Ellison > America

  • zombiewarrior@social.vivaldi.net · 4 points · 3 months ago

    deleted by creator

    • Da Cap’n@lemmy.dbzer0.com (OP) · 3 points · 3 months ago

      You rang?

      • go $fsck yourself@lemmy.world · 4 points · 3 months ago

        Comments from some federated sources always add the username of the user they’re replying to. It’s one of the things I really hate about cross-federation.

        • Da Cap’n@lemmy.dbzer0.com (OP) · 3 points · 3 months ago

          Oooh, I didn’t know that. Thanks for explaining.

  • _cryptagion [he/him]@lemmy.dbzer0.com · 2 points · 3 months ago

    What? But the tankies assured me this wouldn’t happen!

Privacy@lemmy.dbzer0.com

Welcome! This is a community for all those who are interested in protecting their privacy.

Rules

PS: Don’t be a smartass and try to game the system, we’ll know if you’re breaking the rules when we see it!

  1. Be civil and no prejudice
  2. Don’t promote big-tech software
  3. No apathy and defeatism for privacy (i.e. “They already have my data, why bother?”)
  4. No reposting of news that was already posted
  5. No crypto, blockchain, NFTs
  6. No Xitter links (if absolutely necessary, use xcancel)

Related communities:

Some of these are only vaguely related, but great communities.

  • !opensource@programming.dev
  • !selfhosting@slrpnk.net / !selfhosted@lemmy.world
  • !piracy@lemmy.dbzer0.com
  • !drm@lemmy.dbzer0.com