• @General_Effort@lemmy.world
    5 points · 10 months ago

    They say it’s $60 million on an annualized basis. I wonder who’d pay that, given that you can probably scrape it for free.

    Maybe it’s the AI Act in the EU. That might cause trouble in that regard. The US is seeing a lot of rent-seeker PR, too, of course, which might cause some to hedge their bets.

    Maybe some people haven’t realized it yet, but limiting fair use doesn’t just benefit the traditional media corporations; it also benefits the likes of Reddit, Facebook, Apple, etc. Making “robots.txt” legally binding would only benefit the tech companies.
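    For context on that last point: robots.txt is purely advisory today, so a crawler has to *choose* to honor it. A minimal sketch with Python’s standard-library `urllib.robotparser` shows how a well-behaved scraper checks it (the robots.txt content, crawler name, and URLs below are made up for illustration):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for an imaginary site: it singles out one
# AI crawler by user agent and disallows it everywhere, while leaving
# the site open to everything else.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A cooperating crawler asks before fetching; nothing enforces the answer.
print(rp.can_fetch("GPTBot", "https://example.com/post/123"))       # False
print(rp.can_fetch("SomeBrowser", "https://example.com/post/123"))  # True
```

    Nothing stops a crawler from ignoring the file entirely, which is why the legal-bindingness question in the comment above matters.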

    • FaceDeer
      -1 points · 10 months ago

      This is the most frustrating thing, so many people are arguing against their own interests with their efforts to “lock down” their content to prevent AIs from training on it. In this very thread I’ve been accused of being pro-giant-company when I’m quite the opposite. The harder we make it to train AI, the stronger the advantage that the existing giant companies have in this field.

      • @MBM
        0 points · 10 months ago

        deleted by creator