• @idkwhatimdoing@sh.itjust.works
    10 months ago

    As someone who works in content marketing, this is already untrue at the current quality of LLMs. It still requires a LOT of human oversight, which it obviously was not given in this example, but a good writer paired with knowledgeable use of LLMs is already significantly better than a good content writer alone.

    One example is writing outside a person’s subject expertise at a relatively basic level. This used to take hours or days of entirely self-directed research on a given topic, even if the final article was aimed at beginners and therefore written in broad strokes. With diligent fact-checking and ChatGPT alone, the whole process, including final copy, takes maybe four hours.

    It’s also an enormously useful research tool. Rather than poring over research journals, you can ask an LLM with academic plug-ins for a list of studies that fit very specific criteria, with links to the full texts. It sometimes misfires, of course, hence the continued need for a good writer, but on average this can cut hours from journalistic and review pieces without harming (and often improving) quality.

    All the time writers save by having AI do legwork is then time they can instead spend improving the actual prose and content of an article, post, whatever it is. The folks I know who were hired as writers because they love writing and have incredible commitment to quality are actually happier now using AI and being more “productive” because it deals mostly with the shittiest parts of writing to a deadline and leaves the rest to the human.

    • circuitfarmer
      10 months ago

      It still requires a LOT of human oversight, which it obviously was not given in this example, but a good writer paired with knowledgeable use of LLMs is already significantly better than a good content writer alone.

      I’m talking about the future state. The goal is clearly to avoid the need for human oversight altogether, the purpose being to save some rich people more money. I also disagree that LLMs improve the output of good writers, but even if they did, the cost to society would be high.

      I’d much rather just have the human author, and I just hope that saying “we don’t use AI” becomes a plus for PR due to shifting public opinion.

      • @kromem@lemmy.world
        10 months ago

        No, it’s not the ‘goal’.

        Somehow when it comes to AI it’s humans who have the binary thinking.

        It’s not going to be “either/or” anytime soon.

        Collaboration between humans and ML is going to be the paradigm for the foreseeable future.

        • @M0oP0o@mander.xyz
          10 months ago

          The hundreds of clearly AI-written help articles with bad or useless info that I’ve run into every time I’ve tried to look something up over the last few months say otherwise…