There are plenty of utilities for analyzing disk usage by space (GUI ones such as Filelight, TUI ones such as dua), but I would like to view my folders based on the count of files they contain. I'm making backups, and folders with lots of small files (e.g. node_modules) take very long to move around, so I figure I'd be better off compressing those into a single archive before backing them up, since it's highly unlikely that I'll need to access them anyway. Thanks for any pointers in advance!

  • d3Xt3r@lemmy.nzM · 4 months ago

    You can use ncdu for this. Launch it with the options --show-itemcount --sort=itemcount

    • isti115@lemmy.worldOP · 4 months ago (edited)

      Oh, wow, thank you! I had ncdu installed, but it was an older version, which didn’t yet have this feature. Now that I updated to the newest (Zig based 🎉) release this looks perfect for my needs!

    • isti115@lemmy.worldOP · 4 months ago

      Thank you for the idea! I didn’t know about the --inodes flag before, this seems like a viable solution for systems where I can’t / don’t want to install additional software!
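      For reference, counting inodes this way could look something like the following (a sketch assuming GNU coreutils — `--inodes` needs du 8.22 or later):

```shell
# Inode (file + directory) count per immediate subdirectory,
# largest first; each entry's count includes everything beneath it.
du --inodes --max-depth=1 . | sort -rn | head
```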

  • palordrolap@kbin.social · 4 months ago

    Cinnamon’s Nemo (GUI) file manager shows the folder item count in the List View’s “Size” column rather than a byte value. It started as a fork of Nautilus (now GNOME Files), so that and its other descendants may also have the same feature.

    The equivalent GNOME command-line tool, gio list, doesn’t seem to do this.

    It wouldn’t be too hard to whip something up in Python, Perl, etc. if you can’t or don’t want to install anything else for some reason.

    e.g.

    perl -wle '$a=$ARGV[0];opendir D, defined $a && -d $a?$a:".";@x=readdir D; print -2+@x'
    
    

    is a Perl incantation that will return the number of entries in the current directory, or the supplied directory if that’s added as a parameter after the command.

    The -2 in there subtracts the count for “.” and “..”. That’s off by one for the root directory, where there’s no “..”, but that’s rare and I didn’t want to add too much for a quick proof of concept.

    • isti115@lemmy.worldOP · 4 months ago

      Thanks for your input! To me it seems like Nemo only counts the direct descendants and doesn’t recurse, which makes it less useful for this purpose, but still nice to know!

      • palordrolap@kbin.social · 4 months ago

        The find command could be your friend for getting a full depth count. Something like:

        find /path/name/here/ | wc -l
        
        

        Or just:

        find . | wc -l
        
        

        for the current directory.
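        To sort the current directory’s subfolders by their recursive entry count — closer to what was asked for — a quick sketch along the same lines (the inner find counts every descendant; filenames containing newlines would skew the numbers):

```shell
# For each immediate subdirectory, print its recursive entry
# count and its name, largest first.
for d in */; do
  printf '%s\t%s\n' "$(find "$d" | wc -l)" "$d"
done | sort -rn
```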

        There’s also a command called locate (often with another letter in front, but accessible by just that word) which maintains a database of filenames on a system that can be used, provided it’s installed and has built that database.

        Pro: Faster than churning the disk every time with find (though the disk cache can alleviate some of this).

        Cons: Can get out of date on a changing filesystem. Harder to search for relative paths like “.”.

        locate -r '^/path/name/here/' | wc -l
        
        

        … would be roughly equivalent to the first find example.

  • rlychilplr@lemmy.world · 4 months ago

    Ranger (TUI) shows the number of files in a directory. I’m not sure about this, but ncdu (also TUI) might do this recursively.