I’m writing a program that wraps dd to try to warn you if you are doing anything stupid, so I have been giving the man page a good read. While doing this, I noticed that dd supports size suffixes all the way up to quettabytes, a unit orders of magnitude larger than all the data on the entire internet.

This has caused me to wonder what the largest storage operation you guys have done is. I’ve taken a couple of images of hard drives that were a single terabyte large, but I was wondering if the sysadmins among you have had to do something with e.g. a giant RAID 10 array.
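For context, GNU dd parses the same multiplicative suffixes on its bs= and count= operands (K, M, G, T, and so on up to Q for quetta). A harmless sketch using /dev/zero and /dev/null, so nothing touches a real disk:

```shell
# Copy one mebibyte of zeros to nowhere; the 1M suffix is parsed by dd itself.
# status=none suppresses the transfer summary (GNU dd).
dd if=/dev/zero of=/dev/null bs=1M count=1 status=none
```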

  • @freijon
    60 · 4 months ago

    I’m currently backing up my /dev folder to my unlimited cloud storage. The backup of the file /dev/random has been running for two weeks.

    • Eager Eagle
      13 · 4 months ago

      That’s silly. You should compress it before uploading.

    • @Mike1576218@lemmy.ml
      9 · 4 months ago

      No wonder. That file is super slow to transfer for some reason. But wait till you get to /dev/urandom. That file has TBs to transfer at whatever pipe you can throw at it…

      • @PlexSheep@infosec.pub
        3 · 4 months ago

        /dev/random and other “files” in /dev are not really files; they are interfaces which can be used to interact with virtual or hardware devices. /dev/random spits out cryptographically secure random data. Another example is /dev/zero, which spits out only zero bytes.

        Both are infinite.

        Not all “files” in /dev are infinite; for example, hard drives can (depending on which technology they use) be accessed under /dev/sda, /dev/sdb, and so on.
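        These device files behave like ordinary files to tools such as dd, so you can read a bounded amount from them. A minimal sketch (GNU coreutils assumed; the output filenames are just illustrative):

        ```shell
        # 16 bytes of random data from the kernel's CSPRNG:
        dd if=/dev/urandom of=random.bin bs=16 count=1 status=none
        # 1 KiB of zero bytes:
        dd if=/dev/zero of=zeros.bin bs=1K count=1 status=none
        # Without count=, a read from either device would never end.
        ```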

        • data1701d (He/Him) OP
          1 · 4 months ago

          I’m aware of that. I was quite sure the author was joking, with the slightest bit of concern that they might actually make the mistake.