Dust is a rewrite of du (in Rust, obviously) that visualizes your directory tree and what percentage of space each entry takes up. It only prints as many entries as fit in your terminal height, so you see just the largest ones. It’s been a better experience than du, which isn’t always easy to navigate when you’re hunting for big files (or at least I’m not good at it).
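
For anyone trying it, the invocation is simple; I’m going from memory of dust --help for the flags, so double-check them on your version:

  # scan the current directory and show the biggest files and dirs
  dust

  # limit the output to 30 lines and 3 levels of depth
  dust -n 30 -d 3 ~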

Anyway, I found a log file at .local/state/nvim/log that was 70 GB. I deleted it. Hope it doesn’t bite me. I’d been pushing around 95% of disk space for a while, so this was a huge win 👍

  • bizdelnick@lemmy.ml · 8 months ago

    I usually use something like du -sh * | sort -hr | less, so you don’t need to install anything on your machine.
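
    For anyone who doesn’t use these flags often, here’s the same pipeline annotated (standard GNU du/sort behaviour):

      # -s  print one summary line per argument instead of every subdirectory
      # -h  human-readable sizes (K, M, G)
      # sort -h orders human-readable sizes numerically; -r puts the biggest first
      du -sh * | sort -hr | less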

    • mvirts@lemmy.world · 8 months ago

      Same, but when it’s real bad, sort fails 😅 For some reason my root is always hitting 100%.

      I usually go for du -hx | sort -h and rely on my terminal scrollback.
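
      I suspect the sort failure is GNU sort spilling large inputs to temp files under /tmp, which can’t work when root is at 100%. If another filesystem still has room, -T moves the temp files there (/mnt/scratch below is just a placeholder):

        # -x keeps du on one filesystem; sort -T sets where its temp files go
        du -hx / | sort -h -T /mnt/scratch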

      • bizdelnick@lemmy.ml · 8 months ago

        Maybe, but I only need it about once a year. It’s not a task I want to install a separate tool for.

        • meteokr@community.adiquaints.moe · 8 months ago

          Perfect for your use case, just not as much for others. People sharing tools and all the different ways to solve this kind of problem is great for everyone.

    • digdilem@lemmy.ml · 8 months ago

      Almost the same here. Well, du -shc * | sort -h

      I admin around three hundred Linux servers and this is one of my most common tasks. I use -shc as I like the total too, and I don’t bother with less as it’s only the biggest files and dirs that I’m interested in; they show up last, so there’s no need to scroll back.
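
      Annotated, in case the extra flag isn’t obvious:

        # -c appends a grand total line; with sort -h it is printed last
        du -shc * | sort -h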

      When managing a lot of servers, the storage requirements of installing extra software are never trivial. (Although our storage does do very clever compression and might recognise the duplication of the file even across many VM filesystems, I’m never quite sure that works as advertised on small files.)

      • dan@upvote.au · 8 months ago

        I admin around three hundred linux servers

        What do you use for management? Ansible? Puppet? Chef? Something else entirely?

        • digdilem@lemmy.ml · 8 months ago

          Main tool is Uyuni, but we use Ansible and AWX for building new VMs, and ad-hoc Ansible for some changes.
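
          Ad-hoc Ansible is handy for exactly the kind of disk checks in this thread; a quick sketch (the "webservers" inventory group is made up):

            # run the du one-liner across a group of hosts, no playbook needed
            ansible webservers -m ansible.builtin.shell -a 'du -shc /var/* | sort -h'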

          • dan@upvote.au · 8 months ago

            Interesting; I hadn’t heard of Uyuni before. Thanks for the info!

        • digdilem@lemmy.ml · 8 months ago

          du -xh --max-depth=1 | sort -hr

          Interesting. Do you often deal with dirs on different filesystems?

          • pete_the_cat@lemmy.world · 8 months ago

            Yeah, I was a Linux system admin/engineer for MLB/Disney+ for 5 years. When I was an admin, one of our tasks was clearing out filled filesystems on hosts that alerted.

            • digdilem@lemmy.ml · 8 months ago

              Sounds pretty similar to what I do now, though I’ve never needed the -x. I guess that might be quicker when you’re nested somewhere with a bunch of NFS/SMB stuff mounted in.
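
              For reference, -x (long form --one-file-system) stops du from crossing mount points, so NFS/SMB mounts and pseudo-filesystems don’t get walked:

                # stay on the root filesystem; --max-depth=1 keeps it to top-level dirs
                du -xh --max-depth=1 / | sort -h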

              • pete_the_cat@lemmy.world · 8 months ago

                We’d do it from root (/) and drill down from there. It was usually /var/lib or /var/log that was filling up, but occasionally someone would upload a 4.5 GB file to their home folder, which had a quota of 5 GB.

                Using ncdu would have been the best way, but that would have required installing it on about seven thousand machines.
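
                For anyone who can install it: ncdu gives an interactive view of the same data, and it also takes -x to stay on one filesystem:

                  # browse disk usage interactively without crossing mount points
                  ncdu -x /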

      • digdilem@lemmy.ml · 8 months ago

        With sort -h, the biggest ones are generally at the bottom already, which is often what most people care about.