• Primarily0617
    1 year ago

    Incorrect.

    They’re designed to be resource intensive to calculate to make them harder to brute force, and impossible to reverse.

    Some literally have a parameter which acts as a sliding scale for how difficult they are to calculate, so that you can increase security as hardware power advances.
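That sliding-scale knob can be sketched with PBKDF2, which ships in Python's standard library (the password and iteration counts here are purely illustrative; OWASP's published guidance for PBKDF2-HMAC-SHA256 is in the hundreds of thousands of iterations):

```python
import hashlib
import os

salt = os.urandom(16)

# Same password, same salt -- the iteration count alone scales the cost.
# Raise it as hardware gets faster.
weak = hashlib.pbkdf2_hmac("sha256", b"hunter2", salt, 1_000)
strong = hashlib.pbkdf2_hmac("sha256", b"hunter2", salt, 600_000)
```

Both calls return a 32-byte key; the second just takes hundreds of times longer to compute, for the attacker as well as for you.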

    • @confusedbytheBasics@lemmy.world
      1 year ago

      I was incorrect, but I still disagree with you. The hashing function is not designed to be resource intensive but to have a controlled cost. Key stretching by adding rounds repeats that controlled cost to make computing the final hash more expensive, but the length of the message passed to the function isn’t really an issue. After the first round, it doesn’t matter whether the message was 10, 128, or 1024 bytes, because every subsequent round receives exactly the number of bytes the one-way hash outputs.
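That point can be sketched with a toy stretching loop (an illustration only, not a real KDF):

```python
import hashlib

def stretch(password: bytes, salt: bytes, rounds: int) -> bytes:
    # Round 1 sees the variable-length message...
    digest = hashlib.sha256(salt + password).digest()
    # ...every later round sees exactly 32 bytes, whatever the input was.
    for _ in range(rounds - 1):
        digest = hashlib.sha256(digest).digest()
    return digest

short_key = stretch(b"pw", b"salt", 10_000)
long_key = stretch(b"x" * 1024, b"salt", 10_000)
```

After round one, the per-round cost is identical for both inputs, since each round only ever hashes a fixed 32-byte digest.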

      • Primarily0617
        1 year ago

        It depends on the hash. OWASP, for example, recommends a minimum of only 2 iterations of Argon2id.

        Yes, a hashing function is designed to be resource intensive, since that’s what makes it hard to brute force. No, a hashing function isn’t designed to be infinitely expensive, because that would be insane. Yes, it’s still a bad thing to provide somebody with a force multiplier like that if they want to run a denial-of-service.

        • @confusedbytheBasics@lemmy.world
          1 year ago

          I’m a bit behind on password specific hashing techniques. Thanks for the education.

          My background is more in general-purpose one-way hashing functions, where we want to calculate hashes quickly, without collisions, and with a consistent amount of resources.

          If the goal is to be resource intensive, why aren’t modern hashing functions designed to use even more resources? What’s the technical problem keeping Argon2 from being designed to eat even more cycles?

          • Primarily0617
            1 year ago

            Argon2 has parameters that allow you to specify the execution time, the memory required, and the degree of parallelism.
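Argon2 itself isn’t in Python’s standard library, but scrypt is, and it exposes the same three knobs (cost, block size, parallelism), so it works as a stand-in sketch here; the parameter values are illustrative, not a recommendation:

```python
import hashlib
import os

salt = os.urandom(16)

# n doubles both time and memory, r scales the block size,
# p adds independent lanes that can run in parallel.
key = hashlib.scrypt(
    b"correct horse battery staple",
    salt=salt,
    n=2**14,       # CPU/memory cost (~16 MiB at r=8)
    r=8,           # block size
    p=1,           # parallelism
    maxmem=2**26,  # allow up to 64 MiB so the n/r choice above fits
)
```

Turning any of those knobs up makes every guess proportionally more expensive for an attacker, too.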

            But at a certain point you get diminishing returns and you’re just wasting resources. It’s a similar question to asking why we don’t just use massive encryption keys.