Archive.org link

Some key excerpts:

A pseudonymous coder has created and released an open source “tar pit” to indefinitely trap AI training web crawlers in an infinitely, randomly-generating series of pages to waste their time and computing power. The program, called Nepenthes after the genus of carnivorous pitcher plants which trap and consume their prey, can be deployed by webpage owners to protect their own content from being scraped or can be deployed “offensively” as a honeypot trap to waste AI companies’ resources.

“The typical web crawler doesn’t appear to have a lot of logic. It downloads a URL, and if it sees links to other URLs, it downloads those too. Nepenthes generates random links that always point back to itself - the crawler downloads those new links. Nepenthes happily just returns more and more lists of links pointing back to itself,” Aaron B, the creator of Nepenthes, told 404 Media.
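The loop described above can be sketched as a tiny page generator (a minimal illustration, not Nepenthes’ actual code; the `/trap/` route name and the link count are assumptions):

```python
# Minimal tarpit sketch: every page is nothing but random links that
# resolve back to the same handler, so a naive crawler that follows
# every link it sees never runs out of URLs to fetch.
import random
import string

def random_slug(n=12):
    """Generate a random path segment for a link that loops back here."""
    return "".join(random.choices(string.ascii_lowercase, k=n))

def tarpit_page(n_links=10):
    """Return an HTML page whose only content is more tarpit links."""
    links = "\n".join(
        f'<a href="/trap/{random_slug()}">{random_slug()}</a>'
        for _ in range(n_links)
    )
    return f"<html><body>{links}</body></html>"
```

Serving `tarpit_page()` for every request under `/trap/` is enough to keep a link-following crawler busy indefinitely, since each fetch yields ten fresh URLs it has never seen.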

Since they made and deployed a proof-of-concept, Aaron B said their pages have been hit millions of times by internet-scraping bots. On a Hacker News thread, someone claiming to be an AI company CEO said a tarpit like this is easy to avoid; Aaron B told 404 Media “If that’s true, I’ve several million lines of access log that says even Google Almighty didn’t graduate” to avoiding the trap.

  • Onno (VK6FLAB)@lemmy.radio

    This will not work. It sounds great, it sounds plausible, even realistic at some level, but this will not work.

    Here’s why.

    The bot operator has more money than you do. If the efficiency of one bot decreases on one website, they’ll throw another bot at it, rinse and repeat until your website stops responding because it’s ground to dust.

Meta’s bots are good at doing this, hitting your site with thousands of requests a second, over and over again.

    Meta is not alone in this, but in my experience it’s the most destructive.

    Source: One of my clients runs a retail website and I’ve been dealing with this.

At the moment the “best” - “least worst” is probably more accurate - “solution” is to block them as if they’re malicious traffic, which essentially is what they are.

    • G0ldenSp00n@lemmy.jacaranda.club
      I think part of this software, at least per the description on the website, is easy and reliable detection of LLM bots to block. You can run it and it will generate statistics about the bots that get caught in it, so you can easily block big lists of them.
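That workflow - caught-bot statistics feeding a blocklist - might look roughly like this (a sketch assuming a common-log-format access log restricted to tarpit URLs; the field positions, `/trap/` prefix, and threshold are assumptions, not Nepenthes’ actual output):

```python
# Sketch: count hits per client IP on tarpit URLs in an access log,
# then emit the heaviest offenders as a blocklist. Assumes
# common-log-format lines where the IP is the first field and the
# request path is the second token of the quoted request line.
from collections import Counter

def blocklist_from_log(lines, trap_prefix="/trap/", min_hits=100):
    """Return IPs that hit tarpit URLs at least min_hits times."""
    hits = Counter()
    for line in lines:
        fields = line.split()
        if len(fields) > 6 and trap_prefix in fields[6]:
            hits[fields[0]] += 1
    return [ip for ip, n in hits.items() if n >= min_hits]
```

Any client that keeps requesting the randomly generated trap links is, by construction, a link-following bot rather than a human, which is why the statistics can be trusted enough to block on.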