• @Blizzard@lemmy.zip · 131 points · 4 months ago

    My current company has a script that runs and deletes files that haven’t been modified for two years. It doesn’t take into account any other factors, just the modification date. It doesn’t ask for confirmation and doesn’t even inform the end user about it.
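
    For what it’s worth, the core of such a script is probably just a tree walk comparing mtime against a cutoff; a minimal Python sketch of that kind of logic (the share path and the flat two-year window are assumptions):

    ```python
    import os
    import time

    ROOT = r"\\fileserver\share"             # hypothetical target share
    CUTOFF = time.time() - 2 * 365 * 86400   # "two years", judged by mtime only

    for dirpath, _dirnames, filenames in os.walk(ROOT):
        for name in filenames:
            path = os.path.join(dirpath, name)
            # No confirmation, no notification: if it's "old", it's gone.
            if os.path.getmtime(path) < CUTOFF:
                os.remove(path)
    ```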

      • @Blizzard@lemmy.zip · 25 points · 4 months ago

        Thought about it, but I use the modification date for sorting so the stuff I’ve recently worked on stays on top. Instead, I keep the files where the script isn’t looking. The downside is that they’re not backed up, so I might lose them, but if I don’t do that, I’ll lose them for sure…

        • Perhyte · 29 points · 4 months ago

          You don’t actually have to set all the modification dates to now, you can pick any other timestamp you want. So to preserve the order of the files, you could just have the script sort the list of files by date, then update the modification date of the oldest file to some fixed time ago, the second-oldest to a bit later, and so on.

          You could even exclude recently-edited files, because the real modification dates are probably more relevant for those. For example, if you only process files older than 3 months and update those starting from "6 months old"¹, that just leaves remembering to run the script at least once a year or so. Just pick a date and put a recurring reminder in your calendar. (A sketch of such a script follows the footnote.)

          ¹ I picked 6 months there to leave some slack, in case you procrastinate your next run or it’s otherwise delayed because you’re out sick or on vacation or something.
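
          A minimal Python sketch of that idea (the folder, the 3-month/6-month thresholds and the one-minute spacing are all placeholder choices):

          ```python
          import os
          import time

          ROOT = "."                # hypothetical folder the cleanup script scans
          NOW = time.time()
          DAY = 86400
          TOO_OLD = NOW - 90 * DAY  # only touch files older than ~3 months
          START_AGE = 180 * DAY     # refreshed files start out "6 months old"
          STEP = 60                 # one minute apart keeps the sort order stable

          # Collect (mtime, path) for every file past the threshold.
          old_files = []
          for dirpath, _dirs, names in os.walk(ROOT):
              for name in names:
                  path = os.path.join(dirpath, name)
                  mtime = os.path.getmtime(path)
                  if mtime < TOO_OLD:
                      old_files.append((mtime, path))

          # Oldest first: the oldest file gets the oldest new timestamp,
          # so the relative order (and the sort in the file manager) survives.
          old_files.sort()
          for i, (_mtime, path) in enumerate(old_files):
              new_mtime = NOW - START_AGE + i * STEP
              os.utime(path, (new_mtime, new_mtime))  # sets atime and mtime
          ```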

          • @barsquid@lemmy.world · 4 points · 4 months ago

            Change the dates on all the files by scaling their ages to fit the oldest file. Use 1 year as a safe maximum age. So if the oldest file is 1.5 years old, take each file’s current age t and set its modification date to t/1.5 before now; the relative order stays the same, but everything lands within the last year.
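
            Roughly, in Python (the folder and the one-year cap are assumptions):

            ```python
            import os
            import time

            ROOT = "."               # hypothetical folder
            NOW = time.time()
            YEAR = 365 * 86400
            MAX_AGE = 1 * YEAR       # oldest file ends up exactly one year old

            # Record each file's current age in seconds.
            files = []
            for dirpath, _dirs, names in os.walk(ROOT):
                for name in names:
                    path = os.path.join(dirpath, name)
                    files.append((NOW - os.path.getmtime(path), path))

            oldest_age = max((age for age, _ in files), default=0.0)
            if oldest_age > MAX_AGE:                # only rescale when something is past the cap
                scale = MAX_AGE / oldest_age
                for age, path in files:
                    new_mtime = NOW - age * scale   # oldest 1.5y -> 1y; order preserved
                    os.utime(path, (new_mtime, new_mtime))
            ```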

        • LazaroFilm · 4 points · 4 months ago

          Have a script that copies every file that has reached 1.9 years old into a separate folder.
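
          Something like this Python sketch, assuming the point is that the copy gets a fresh modification date (paths and the 1.9-year threshold are placeholders):

          ```python
          import os
          import shutil
          import time

          ROOT = "."                      # hypothetical folder the cleanup script scans
          RESCUE = "./almost_two_years"   # hypothetical rescue folder
          CUTOFF = time.time() - int(1.9 * 365 * 86400)

          os.makedirs(RESCUE, exist_ok=True)
          for dirpath, _dirs, names in os.walk(ROOT):
              if os.path.abspath(dirpath).startswith(os.path.abspath(RESCUE)):
                  continue  # don't re-copy the rescue folder itself
              for name in names:
                  src = os.path.join(dirpath, name)
                  if os.path.getmtime(src) < CUTOFF:
                      # shutil.copy (unlike copy2) does not preserve mtime,
                      # so the copy's two-year clock starts over.
                      shutil.copy(src, os.path.join(RESCUE, name))  # flat copy; name clashes overwrite
          ```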

        • @MNByChoice@midwest.social · 3 points · edited · 4 months ago

          Create a series of folders labeled with dates. Every day, copy the useful stuff into the new folder. Every night, change the modified dates on all files to the current date.
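
          The nightly part is essentially a recursive "touch"; a minimal Python sketch (the folder layout is an assumption):

          ```python
          import os
          import time
          from datetime import date

          ROOT = "."                                 # hypothetical working area
          today_dir = os.path.join(ROOT, date.today().isoformat())
          os.makedirs(today_dir, exist_ok=True)      # today's dated folder to copy into

          # Nightly pass: bump every file's modified date to right now.
          now = time.time()
          for dirpath, _dirs, names in os.walk(ROOT):
              for name in names:
                  os.utime(os.path.join(dirpath, name), (now, now))
          ```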

    • @brygphilomena@lemmy.world · 18 points · 4 months ago

      What industry are you in? This could be compliance for different reasons. Retention is a very specific thing that should be documented in policies.

      I know financial institutions that specifically do not want data just hanging around. This limits liability and exposure if there is a breach, and makes any litigation much easier if the data doesn’t exist by policy.

      Should they be more choosy about what gets deleted? Yeah, probably. But I understand why it’s there.

    • @ramble81@lemm.ee · 8 points · 4 months ago

      That sounds like a lawyer’s dream… “can’t provide it if it doesn’t exist.” Now granted, if they got a subpoena they’d have to save it going forward, but before then, if they’re not bound by something that forces data retention, the less random data lying around the better.

    • @BearOfaTime@lemm.ee · 8 points · 4 months ago

      Put all your files in a single zip file. No compression. Since Windows handles zip files like folders, you can work like normal. And the zip file will always have a recent time stamp.
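
      A one-off Python sketch for building such an archive (paths are placeholders; ZIP_STORED is the "no compression" option):

      ```python
      import os
      import zipfile

      ROOT = "."             # hypothetical folder to wrap up
      ARCHIVE = "work.zip"   # the one file the cleanup script will ever see

      # ZIP_STORED means no compression, so opening files inside stays fast.
      with zipfile.ZipFile(ARCHIVE, "w", compression=zipfile.ZIP_STORED) as zf:
          for dirpath, _dirs, names in os.walk(ROOT):
              for name in names:
                  path = os.path.join(dirpath, name)
                  if os.path.abspath(path) == os.path.abspath(ARCHIVE):
                      continue  # don't zip the archive into itself
                  zf.write(path, arcname=os.path.relpath(path, ROOT))
      ```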

    • Kalkaline · 8 points · 4 months ago

      That’s the worst foresight I think I’ve ever heard of. You might as well make it 3 months if you’re just going to trash thousands of labor hours put into those files.