Deepfake scammer walks off with $25 million in first-of-its-kind AI heist::Hong Kong firm tricked by simulation of multiple real people in video chat, including voices.

  • @theskyisfalling@lemmy.dbzer0.com
    link
    fedilink
    English
    38
    11 months ago

    What kind of company lets a single employee transfer that amount of money without multiple different password entries or checks from different people, seriously?

    Doesn’t matter if they had a conference call with what appeared to be certain people; as the article says, they could easily have used key-pair verification such as PGP. Sounds like poor security all around, especially considering the amounts involved.
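
    Even without PGP, the "checks from different people" part is easy to encode. A rough sketch of a dual-control rule (the names, threshold, and approver count are made up for illustration):

    ```python
    # Toy dual-control rule: large transfers need sign-off from
    # several distinct approvers before they can execute.
    APPROVAL_THRESHOLD = 100_000   # amount above which extra checks kick in
    REQUIRED_APPROVERS = 3         # distinct people who must sign off

    def may_execute(amount: int, approvers: set) -> bool:
        """Return True only if the transfer has enough distinct approvals."""
        if amount < APPROVAL_THRESHOLD:
            return True
        return len(set(approvers)) >= REQUIRED_APPROVERS

    print(may_execute(25_000_000, {"cfo"}))                    # False: one approver
    print(may_execute(25_000_000, {"cfo", "ceo", "auditor"}))  # True
    ```

    A deepfaked CFO on a call can pressure one person, but can’t supply three independent sign-offs.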

    • @WhatAmLemmy@lemmy.world
      link
      fedilink
      English
      14
      edit-2
      11 months ago

      PGP? Have you ever dealt with any banking or financial corporations? You’d have better luck getting the money handlers and decision makers to authenticate transactions with magic.

      Hong Kong and Japan are the absolute worst I’ve experienced. Their online banking UI’s and processes are stuck in the late 90’s to early 2000’s.

        • Jojo
          link
          fedilink
          English
          1
          11 months ago

          It’s stylistically acceptable to put an apostrophe for plurals in cases where the plural thing isn’t a “normal” word, as is the case for initialisms like UI or numbers like the latter two you caught.

          Obviously a given body may make its own rules in this regard, but luckily English has no overall authority, and this is informal communication outside the domain of any minor ones (beyond, perhaps, idle pedants and prescriptivists).

    • @Lmaydev@programming.dev
      link
      fedilink
      English
      2
      edit-2
      11 months ago

      At a place I worked, the CEO’s email got hacked, and the attacker asked the head of finance to change the bank account details for a 100k payment that was due to go out.

      Luckily, the head of finance thought to double-check with the CEO directly. But it came really close to happening.

      This all happened via a phishing email.

      Social engineering is how most hacks happen. Doesn’t matter what protection you put in place. People are always the weakest link.

  • @redcalcium@lemmy.institute
    link
    fedilink
    English
    26
    edit-2
    11 months ago

    Acting senior superintendent Baron Chan Shun-ching of the Hong Kong police emphasized the novelty of this scam, noting that it was the first instance in Hong Kong where victims were deceived in a multi-person video conference setting. He pointed out the scammer’s strategy of not engaging directly with the victim beyond requesting a self-introduction, which made the scam more convincing.

    The police have offered tips for verifying the authenticity of individuals in video calls, such as asking them to move their heads or answer questions that confirm their identity, especially when money transfer requests are involved. Another potential solution to deepfake scams in corporate environments is to equip every employee with an encrypted key pair, establishing trust by signing public keys at in-person meetings. Later, in remote communications, those signed keys could be used to authenticate parties within the meeting.
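
    The in-person signing step the article describes boils down to comparing key fingerprints. A minimal sketch of that idea (the names and key bytes here are made up; a real deployment would use actual PGP or X.509 keys):

    ```python
    import hashlib

    def fingerprint(public_key: bytes) -> str:
        # Short digest two colleagues can compare at an in-person meeting
        return hashlib.sha256(public_key).hexdigest()[:16]

    # Recorded once, in person:
    trusted = {"cfo": fingerprint(b"cfo-public-key-bytes")}

    # Later, in a remote call, only trust a key whose fingerprint matches:
    def is_trusted(name: str, presented_key: bytes) -> bool:
        return trusted.get(name) == fingerprint(presented_key)

    print(is_trusted("cfo", b"cfo-public-key-bytes"))   # True
    print(is_trusted("cfo", b"attacker-key-bytes"))     # False
    ```

    A deepfake can clone a face and voice, but not a private key whose fingerprint was verified face-to-face.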

    If you’re a rank-and-file employee in a virtual meeting with your company’s top brass, it probably won’t occur to you to ask them to turn their heads to see if the video glitches. The scammers can simply act offended and ignore your request, and chances are you’ll fear for your job and apologize profusely.

    The key exchange mechanism suggested by the article sounds impractical because the employees in HK will likely never meet the UK-based CFO in person. Maybe the corporate video conferencing system should have a company-wide key registry, but if the scammers manage to hack in and insert their own key, or steal an executive’s video conferencing account, then it’s probably moot anyway.
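
    The registry idea would look something like this, roughly. Everything here is made up for illustration, and HMAC with a registry secret stands in for the asymmetric signature a real registry would use; the point is only the shape of the check:

    ```python
    import hmac, hashlib

    REGISTRY_SECRET = b"registry-signing-secret"  # stand-in for the registry's private key

    def register(name: str, public_key: bytes) -> bytes:
        # The registry endorses the binding "this name owns this key".
        return hmac.new(REGISTRY_SECRET, name.encode() + b"|" + public_key,
                        hashlib.sha256).digest()

    def verify(name: str, public_key: bytes, tag: bytes) -> bool:
        # A meeting client accepts a participant's key only with a valid tag.
        return hmac.compare_digest(tag, register(name, public_key))

    tag = register("cfo", b"cfo-public-key")
    print(verify("cfo", b"cfo-public-key", tag))   # True
    print(verify("cfo", b"attacker-key", tag))     # False
    ```

    And as noted above, this only helps while the registry itself (and the executives’ accounts) stay uncompromised.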

  • @Sunforged@lemmy.ml
    link
    fedilink
    English
    19
    11 months ago

    This is incredible. And scary. And incredible. I would hate to be the poor sap that fell for it though, oof.

  • @PeroBasta@lemmy.world
    link
    fedilink
    English
    12
    11 months ago

    I’d like to hear the whole story, like how old the scammed guy was, etc.

    To me it smells like he was either an accomplice or a very old person who fully buys into his company’s shit.

    • @redcalcium@lemmy.institute
      link
      fedilink
      English
      5
      11 months ago

      These new realtime deepfake systems are very good. DeepFaceLive is one example. It can generate a deepfake in realtime from just a single photo, and an even more convincing one if you have thousands of the target’s photos/video frames to train the model. It’s not surprising that someone could fall for it if they’re not aware of the technology.

  • BarqsHasBite
    link
    fedilink
    English
    11
    11 months ago

    It used to be so easy to spot scams and fakes; this stuff is getting scary now. I wonder if this will slow things down as we start requiring face-to-face, in-person confirmation.

  • @jet@hackertalks.com
    link
    fedilink
    English
    8
    11 months ago

    And why can’t they claw back the funds through traditional banking? It’s not like they sent crypto to an unknown address.

  • AutoTL;DR
    link
    English
    7
    11 months ago

    This is the best summary I could come up with:


    Deepfakes utilize AI tools to create highly convincing fake videos or audio recordings, posing significant challenges for individuals and organizations to discern real from fabricated content.

    This incident marks the first of its kind in Hong Kong involving a large sum and the use of deepfake technology to simulate a multi-person video conference where all participants (except the victim) were fabricated images of real individuals.

    Despite initial doubts, the employee was convinced enough by the presence of the CFO and others in a group video call to make 15 transfers totaling HK$200 million to five different Hong Kong bank accounts.

    The high-tech theft underscores the growing concern over new uses of AI technology, which has been spotlighted recently due to incidents like the spread of fake explicit images of pop superstar Taylor Swift.

    Over the past year, scammers have been using audio deepfake technology to scam people out of money by impersonating loved ones in trouble.

    The police have offered tips for verifying the authenticity of individuals in video calls, such as asking them to move their heads or answer questions that confirm their identity, especially when money transfer requests are involved.


    The original article contains 519 words, the summary contains 192 words. Saved 63%. I’m a bot and I’m open source!

  • TheMediocreOne
    link
    fedilink
    English
    6
    11 months ago

    That’s scary but at the same time super cool. I mean, kudos to the thieves; they did the hard work, apparently.