The New York Times is suing OpenAI and Microsoft for copyright infringement, claiming the two companies built their AI models by “copying and using millions” of the publication’s articles and now “directly compete” with its content as a result.
As outlined in the lawsuit, the Times alleges OpenAI and Microsoft’s large language models (LLMs), which power ChatGPT and Copilot, “can generate output that recites Times content verbatim, closely summarizes it, and mimics its expressive style.” This “undermine[s] and damage[s]” the Times’ relationship with readers, the outlet alleges, while also depriving it of “subscription, licensing, advertising, and affiliate revenue.”
The complaint also argues that these AI models “threaten high-quality journalism” by hurting the ability of news outlets to protect and monetize content. “Through Microsoft’s Bing Chat (recently rebranded as “Copilot”) and OpenAI’s ChatGPT, Defendants seek to free-ride on The Times’s massive investment in its journalism by using it to build substitutive products without permission or payment,” the lawsuit states.
The full text of the lawsuit can be found here.
If the garbage that comes out of ChatGPT can be considered legitimate competition, then the New York Times sucks at journalism.
It’s not legitimate competition, that’s the entire point. The claim is that AI models rely on stealing content and changing it slightly or not at all. If a “regular” journalist did this, they would get into trouble. Just because the entity doing it is an AI company doesn’t make the business model legitimate.
A few years ago there was a big plagiarism scandal at IGN because one of their “journalists” mostly took other people’s reviews, changed a few words, and published them. Obviously that’s not fine.
Is the TL;DRbot fair use?
Probably not.
deleted by creator
Copyright law doesn’t actually care about commercial use. It probably should, but it doesn’t.
Nobody is profiting off the TL;DR bot, so the two are completely incomparable, and you’ve shown just how out of your element you are in this discussion.