• @jordanlund@lemmy.world
    19 · 8 months ago

    Roko’s Basilisk. But here’s the thing: once you’re aware of it, you’re fucked. The only solution is to not research it, to not know anything about it. Live in blissful ignorance.

    • @Naich
      14 · 8 months ago

      What about if you read about it and didn’t understand it?

    • @jkrtn@lemmy.ml
      9 · 8 months ago

      You have to believe that a malevolent AI will give enough of a damn about you to bother simulating anything at all, let alone infinite torture, which is useless for it to do once it already exists. Everyone on LessWrong has a well-fed ego, so I get why they were in a tizzy for a while.

      • @jordanlund@lemmy.world
        1 · 8 months ago

        Well, one punishes you if you deny its existence, the other punishes you if you fail to assist in its development. So it’s a LITTLE different. :)

        Fortunately, for me personally, I helped fund a key researcher who could, in theory, be a major contributor to such a thing. So I have plausible deniability. ;) And I’ve been promised a 15-minute head start before he turns it on.

      • @Denjin
        16 · 8 months ago

        Silly thought experiment, which in gullible people could produce psychosomatic symptoms like headaches and insomnia.

      • @Nibodhika@lemmy.world
        6 · 8 months ago

        It’s essentially a thought experiment. Without getting too specific, it goes along the lines of “what if there were a hypothetical bad scenario that gets triggered by you knowing about it?” So if you look it up, now you’re doomed.