• @psivchaz@reddthat.com

    It is legitimately useful for getting started with a new programming library or tool. Documentation is not always easy to understand or easy to search, so having an LLM generate a baseline (even if it has mistakes) or answer a few questions can save a lot of time.
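
    For example, a rough sketch of that “answer a few questions” workflow using the OpenAI Python client; the model name and the pandas question are placeholders, not something from this thread:

        # Hypothetical sketch: ask an LLM for a starter example of an unfamiliar library.
        # Requires the `openai` package and an OPENAI_API_KEY in the environment.
        from openai import OpenAI

        client = OpenAI()

        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[{
                "role": "user",
                "content": "Give me a minimal working example of reading a CSV "
                           "with pandas, with a comment explaining each step.",
            }],
        )

        # The generated baseline still needs to be checked against the real docs.
        print(response.choices[0].message.content)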

    • @Grandwolf319@sh.itjust.works

      So I used to think that, but I gave it a try since I’m a software dev. I personally didn’t find it that useful, as in I wouldn’t pay for it.

      Usually when I want to get started, I just look up a basic guide and copy its entire example. You could do that with ChatGPT too, but what if it gave you wrong answers?

      I also asked it more specific questions about how to do X in tool Y, something I couldn’t quickly google. It didn’t give me a correct answer, mostly because the question was rather niche.

      So my conclusion was that it may help people who don’t know how to google, or who are learning a very well-known tool/language with lots of good docs, but for those who already know how to use the industry tools, it was basically an expensive hint machine.

      In all fairness, I’ll probably use it here and there, but I wouldn’t pay for it. Also, note that my example was ChatGPT-specific. I’ve heard some companies might use it to make their docs more searchable, which imo might be the first good use case (once it happens lol).

      • @BassTurd@lemmy.world

        I just recently got Copilot in VS Code through work. I typed a comment that said, “create a new model in sqlalchemy named assets with the columns, a, b, c, d”. It couldn’t know the proper data types to use, but it output everything perfectly, including my custom-defined annotations; it just used the same annotation for every column, which I then had to update (roughly the shape sketched below). As a test, that was great, but Copilot also picked up a SQL query I had written in a comment as a reference while making my models, and it generated that entire model for me as well.

        It didn’t do anything I didn’t know how to do, but it saved some typing effort. I use it mostly for its autocomplete functionality and for letting it suggest comments for me.
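
        To make that concrete, here’s a hedged sketch of the kind of SQLAlchemy model such a comment can produce; the columns a–d and the Annotated alias are placeholders standing in for the custom annotations mentioned above, not the actual code:

            # Hypothetical sketch of a Copilot-style completion for the comment
            # "create a new model in sqlalchemy named assets with the columns a, b, c, d".
            # Column names and the annotation alias are placeholders. Needs SQLAlchemy 2.0+.
            from typing import Annotated

            from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

            # Example of a custom-defined annotation the completion might reuse.
            required_str = Annotated[str, mapped_column(nullable=False)]


            class Base(DeclarativeBase):
                pass


            class Assets(Base):
                __tablename__ = "assets"

                id: Mapped[int] = mapped_column(primary_key=True)
                # The completion tends to repeat the same annotation for every column,
                # which is why the types still had to be corrected by hand.
                a: Mapped[required_str]
                b: Mapped[required_str]
                c: Mapped[required_str]
                d: Mapped[required_str]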

        • @Grandwolf319@sh.itjust.works

          That’s awesome, and I would probably find those tools useful.

          Code generators have existed for a long time, but they are usually free. These tools actually cost a lot of money; it costs way more to generate code this way than the traditional way.
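
          For contrast, “the traditional way” can be as simple as a template-based generator; a hypothetical sketch, echoing the model example from the sibling thread:

              # Hypothetical sketch of an old-fashioned, template-based code generator:
              # a fixed template plus a small spec, no LLM involved.
              def generate_model(name: str, table: str, columns: dict[str, str]) -> str:
                  """Render a SQLAlchemy-style model class from a column-name -> type mapping."""
                  lines = [f"class {name}(Base):", f'    __tablename__ = "{table}"', ""]
                  lines += [f"    {col}: Mapped[{typ}] = mapped_column()" for col, typ in columns.items()]
                  return "\n".join(lines)

              print(generate_model("Assets", "assets", {"a": "str", "b": "int", "c": "str", "d": "float"}))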

          So idk if it would be worth it once the venture capitalist money dries up.

          • @BassTurd@lemmy.world

            That’s fair. I don’t know if I will ever pay my own money for it, but if my company will, I’ll use it where it fits.

              • @bamboo@lemm.ee

                Neither of those seems similar to GitHub Copilot, other than that they can reduce keystrokes for some common tasks. The actual applicability of them seems narrow. Frequently I use GitHub Copilot for “implement this function based on this doc comment I wrote” or “write docs for this class/function”. It’s the natural-language component that makes the LLM approach useful.
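
                As a hypothetical illustration of that first workflow (not code from this thread): the human writes only the signature and doc comment, and the assistant proposes a body along these lines, which still needs review:

                    # Hypothetical example of "implement this function based on this doc comment".
                    # The signature and docstring are human-written; the body is the kind of
                    # completion an assistant might suggest.
                    def chunk(items: list, size: int) -> list[list]:
                        """Split `items` into consecutive sublists of at most `size` elements."""
                        if size <= 0:
                            raise ValueError("size must be positive")
                        return [items[i:i + size] for i in range(0, len(items), size)]

                    assert chunk([1, 2, 3, 4, 5], 2) == [[1, 2], [3, 4], [5]]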

                • @Grandwolf319@sh.itjust.works

                  There are also auto doc generators.

                  I think what you’re specifically referring to is accessibility or ease of use. For someone unfamiliar with those tools, I can see the appeal.

                  Personally, as a software dev, I think it’s just a very inefficient way to accomplish this goal. LLMs consume vastly more resources than a simple script, so I wouldn’t use it, especially if I’m paying real money for it.
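
                  For context, the kind of “simple script” meant here can be this small; a hedged sketch that dumps every function’s signature and docstring from a Python source file using only the standard library:

                      # Hypothetical sketch of a no-LLM doc generator: list each function's
                      # signature and docstring from a Python source file.
                      import ast
                      import sys

                      def extract_docs(path: str) -> str:
                          with open(path, encoding="utf-8") as handle:
                              tree = ast.parse(handle.read())
                          entries = []
                          for node in ast.walk(tree):
                              if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
                                  args = ", ".join(arg.arg for arg in node.args.args)
                                  doc = ast.get_docstring(node) or "(no docstring)"
                                  entries.append(f"{node.name}({args})\n    {doc}\n")
                          return "\n".join(entries)

                      if __name__ == "__main__":
                          print(extract_docs(sys.argv[1]))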

      • Dran

        I’m actually working on a vector-DB RAG system for my own documentation. Even in its rudimentary stages, it’s been very helpful for finding functions in my own code when I don’t remember exactly which project I implemented them in but have a vague idea of what they did.

        E.g.:

        Have I ever written a bash function that orders non-semver Git branches?

        Yes! In your ‘webwork automation’ project, starting on line 234, you wrote a function that sorts Git branches based on WebWork’s versioning conventions.
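
        A minimal sketch of the retrieval half of that idea, assuming the sentence-transformers package; the snippet descriptions, metadata keys, and model name are placeholders rather than the actual project code:

            # Hypothetical sketch: embed short descriptions of code snippets and answer
            # "have I ever written..." questions by cosine similarity.
            # Requires the `sentence-transformers` and `numpy` packages.
            import numpy as np
            from sentence_transformers import SentenceTransformer

            model = SentenceTransformer("all-MiniLM-L6-v2")  # placeholder embedding model

            # In a real setup these would be harvested from your repos, keyed by file/line.
            snippets = {
                "webwork automation:234": "bash function sorting git branches by a custom version scheme",
                "backup scripts:10": "bash function rotating tarball backups older than 30 days",
            }

            keys = list(snippets)
            vectors = model.encode([snippets[k] for k in keys], normalize_embeddings=True)

            def ask(question: str) -> str:
                """Return the key of the stored snippet most similar to the question."""
                query = model.encode([question], normalize_embeddings=True)[0]
                scores = vectors @ query  # cosine similarity, since vectors are normalized
                return keys[int(np.argmax(scores))]

            print(ask("Have I ever written a bash function that orders non-semver git branches?"))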

      • @markon@lemmy.world

        Huge time saver. I’ve had GPT doing a lot of work for me, and it makes stuff like managing my Arch install smooth and easy. I don’t use OpenAI stuff much though. Gemini has gotten way better, and Claude 3.5 Sonnet is beastly at code stuff. I guess if you’re writing extremely complex production stuff it’s not going to be able to do that, but try asking most people even what an unsigned integer is. Most people will be like “what?”

        • @Grandwolf319@sh.itjust.works

          but try asking most people even what an unsigned integer is. Most people will be like “what?”

          Why is that relevant? Are you saying that AI makes coding more accessible? I mean that’s great, but it’s like a calculator. Sure it helps people who need simple calculations in the short term, but it might actually discourage software literacy.

          I wish AI could just be a niche tool; instead, it’s like a simple calculator being sold as a smartphone.