• @damndotcommie@lemmy.basedcount.com
    15
    4 months ago

    But AI has no actual intelligence of its own. It’s not going to magically just figure things out. All it can do is spit back what it has been fed.

    • @Lmaydev@programming.dev
      1
      4 months ago

      That’s 100% not how AIs work. Not even LLMs.

      The whole point of AIs is they work beyond their training data. Otherwise they couldn’t do anything.

    • @cm0002@lemmy.world
      -4
      4 months ago

      All it can do is spit back what it has been fed.

      Those who say these things severely underestimate what AI is capable of, or will be capable of in short order, or they just don’t understand how these models work and why.

      But setting that aside, I’m not saying we’ll be able to feed AI a raw decompiled firmware and have it spit out a fully functional emulator in an hour.

      But in the near future we might be able to feed it raw decompiled firmware and it’ll be able to map proprietary, undocumented syscalls in a few minutes. That would knock out a big chunk of work that could otherwise take months, if not years.

      A decent AI model could significantly lower the barrier to entry for emulator development from “A handful of elite hackers and programmers”

      • @Cypher@lemmy.world
        3
        4 months ago

        I see you don’t understand what an LLM is, how one operates, or the kind and volume of training data that is required.

          • @Cypher@lemmy.world
            3
            4 months ago

            With current models? No. See my points above especially the one about the volume of data required.

            Reverse engineering firmware is extremely niche, and even more so for emulation. There are so few examples that current AI models wouldn’t have enough training data to work from.