• frezik
    link
    fedilink
    arrow-up
    95
    arrow-down
    1
    ·
    21 days ago

    Zip makes different tradeoffs. Its compression algorithm (DEFLATE) is basically the same as gz, but you wouldn’t know it from the file sizes.

    Tar archives everything together, then compresses. The advantage is that there are more patterns available across all the files, so it can be compressed a lot more.

    Zip compresses individual files, then archives. The individual files aren’t going to be compressed as much because they aren’t handling patterns between files. The advantages are that an error early in the file won’t propagate to all the other files after it, and you can read a file in the middle without decompressing everything before it.
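
    A quick way to see the difference for yourself (the file names here are made up, and this assumes `zip`/`unzip` are installed):

```shell
# Build the same tiny tree into both formats
mkdir -p demo && echo "hello" > demo/a.txt && echo "world" > demo/b.txt
tar -czf demo.tar.gz demo      # archive first, then compress the whole stream
zip -qr demo.zip demo          # compress each file, then archive

# zip can jump straight to one entry and decompress only it:
unzip -p demo.zip demo/b.txt
# tar has to read (and decompress) the stream from the start until it
# reaches the entry:
tar -xzf demo.tar.gz -O demo/b.txt
```

    Both commands print the file, but only the zip route skips past the other entries without decompressing them, which is why random access into big tarballs is slow.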

    • herrvogel@lemmy.world
      link
      fedilink
      arrow-up
      13
      ·
      20 days ago

      Yeah that’s a rather important point that’s conveniently left out too often. I routinely extract individual files out of large archives. Pretty easy and quick with zip, painfully slow and inefficient with (most) tarballs.

      • frezik
        link
        fedilink
        arrow-up
        7
        ·
        20 days ago

        It’s just a different layer of compression. Better than gzip generally, but the tradeoffs are exactly the same.
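
        In other words, you can swap the outer layer while the tar stream inside stays the same (file names made up):

```shell
# Same inner tar stream, two different outer layers
mkdir -p corpus && seq 1 100000 > corpus/numbers.txt
tar -cf corpus.tar corpus
gzip -9 -k corpus.tar                     # -> corpus.tar.gz (keeps the input)
zstd -q -19 corpus.tar -o corpus.tar.zst  # -> corpus.tar.zst
ls -l corpus.tar corpus.tar.gz corpus.tar.zst
```

        Either way there’s one tar stream inside, so the single-file-extraction tradeoff is identical.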

  • renzev@lemmy.world
    link
    fedilink
    arrow-up
    86
    arrow-down
    1
    ·
    21 days ago

    Obligatory shilling for unar, I love that little fucker so much

    • Single command to handle decompressing nearly all formats.
    • No obscure flags to remember, just unar <yourfile>
    • Makes sure output is always contained in a directory
    • Correctly handles weird Japanese zip files with SHIFT-JIS filename encoding, even when standard unzip doesn’t
      • renzev@lemmy.world
        link
        fedilink
        English
        arrow-up
        7
        ·
        edit-2
        20 days ago

        Voicebanks for Utau (free (as in beer, iirc) clone of Vocaloid) are primarily distributed as SHIFT-JIS encoded zips. For example, try downloading Yufu Sekka’s voicebank: http://sekkayufu.web.fc2.com/ . If I try to unzip the “full set” zip, it produces a folder called РсЙ╠ГЖГtТPУ╞Й╣ГtГЛГZГbГgБi111025Бj. But unar detects the encoding and properly extracts it as 雪歌ユフ単独音フルセット(111025). I’m sure there’s some flag you can pass to unzip to specify the encoding, but I like having unar handle it for me automatically.

        • sh__@lemmy.world
          link
          fedilink
          arrow-up
          4
          ·
          20 days ago

          Ah, that’s pretty cool. I’m not sure I know of that program. I do know a little vocaloid though, but I only really listen to 稲葉曇(Inabakumori).

          • renzev@lemmy.world
            link
            fedilink
            English
            arrow-up
            2
            ·
            18 days ago

            I know inabakumori! Their music is so cool! When I first listened to rainy boots and lagtrain, it made me feel emotions I thought I had forgotten a long time ago… I wish my japanese was good enough to understand the lyrics without looking them up ._. I’m also a huge fan of Kikuo. His music is just something completely unique, not to mention his insane tuning. He makes Miku sing in ways I didn’t think were possible lol

            • sh__@lemmy.world
              link
              fedilink
              arrow-up
              2
              ·
              18 days ago

              I get you, I want to learn more Japanese. I only understand a very small amount at this point. I don’t have any Miku songs that I have really wanted to listen to, but that could change. I might check out Kikuo then. Also I love the animations Inabakumori release with their songs too. They have some new stuff that’s really good if you haven’t checked it out yet.

    • Alexstarfire@lemmy.world
      link
      fedilink
      arrow-up
      21
      ·
      21 days ago

      You can’t decrease something by more than 100% without going negative. I’m assuming this doesn’t actually decompress files before you tell it to.

      Does this actually decompress in 1/13th the time?

    • boredsquirrel@slrpnk.net
      link
      fedilink
      arrow-up
      16
      ·
      21 days ago

      Yeah, Facebook!

      Sucks but yes that tool is damn awesome.

      Meta also works on CentOS Stream via its Hyperscale variant.

      • abbadon420@lemm.ee
        link
        fedilink
        arrow-up
        24
        ·
        21 days ago

        Makes sense. There are actual programmers working at Facebook. Programmers want good tools and functionality. They also just want to make good/cool/fun products. I mean, check out this interview with a programmer from Pornhub. The poor dude still has to use jQuery, but he’s passionate about making the best product he can, like everyone in programming.

  • qjkxbmwvz@startrek.website
    link
    fedilink
    arrow-up
    38
    arrow-down
    1
    ·
    21 days ago

    When I’m feeling cool and downloading a *.tar* file, I’ll wget to stdout, and tar from stdin. Archive gets extracted on the fly.

    I have (successfully!) written an .iso to CD this way, too (pipe wget to cdrecord). Fun stuff.
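
    The pipe looks like this (the URL is a placeholder); cat stands in for wget below so the example runs offline:

```shell
# The trick itself:
#   wget -qO- https://example.com/archive.tar.gz | tar -xz
# Any producer that writes the archive to stdout works the same way:
mkdir -p src out && echo "payload" > src/data.txt
tar -czf archive.tar.gz src
cat archive.tar.gz | tar -xz -C out
cat out/src/data.txt
```

    Nothing hits the disk except the extracted files themselves.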

  • Hucklebee@lemmy.world
    link
    fedilink
    arrow-up
    33
    ·
    edit-2
    20 days ago

    Can someone explain why MacOS always seems to create _MACOSX folders in zips that we Linux/Windows users always delete anyway?

    • FIST_FILLET@lemmy.ml
      link
      fedilink
      arrow-up
      16
      ·
      20 days ago

      this is a completely uneducated guess from a relatively tech-illiterate guy, but could it contain Mac-specific information about weird non-essential stuff like folder backgrounds and item placement in the no-grid view?

    • LiveLM@lemmy.zip
      link
      fedilink
      English
      arrow-up
      8
      ·
      20 days ago

      They’re metadata specific to Macs.
      If you download a third-party compression tool, it’ll probably have an option somewhere to exclude these from the zips, but the default tool doesn’t, AFAIK.
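
      If you’d rather deal with it at extraction time, Info-ZIP’s unzip can skip those entries with -x (the archive layout below is made up):

```shell
# Fake a Mac-made zip with metadata entries:
mkdir -p pkg/__MACOSX && echo "real" > pkg/doc.txt && echo "junk" > "pkg/__MACOSX/._doc.txt"
zip -qr pkg.zip pkg
# Exclude the metadata on the way out:
unzip -q pkg.zip -x "*__MACOSX*" "*/.DS_Store" -d clean
ls clean/pkg
```

      Only doc.txt lands in the output, so there’s nothing to delete afterwards.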

      • Hucklebee@lemmy.world
        link
        fedilink
        arrow-up
        2
        ·
        edit-2
        20 days ago

        Thanks! Hmm, never thought of looking at 7zip’s settings to see if it can autodelete/not unpack that stuff. I’ll see if I can find such a setting!

        • ReveredOxygen@sh.itjust.works
          link
          fedilink
          English
          arrow-up
          3
          ·
          20 days ago

          You can definitely check, but I would expect the option to exist when the archive is created rather than when it’s extracted

    • cm0002@lemmy.world
      link
      fedilink
      arrow-up
      5
      arrow-down
      1
      ·
      20 days ago

      Because Apple always gotta fuck with and “innovate” perfectly working shit

      Windows’s built-in tool can make zips without fucking with shit AND the resulting zip works just fine across systems.

      Mac though…Mac produced zips always ALWAYS give me issues when trying to unzip on a non-mac (ESPECIALLY Linux)

    • Hawk@lemmynsfw.com
      link
      fedilink
      arrow-up
      3
      ·
      19 days ago

      HFS+ has a different feature set than NTFS or ext4, so Apple elects to store metadata that way.

      I would imagine modern filesystems like ZFS or btrfs could benefit from something similar, but nobody has chosen to implement it that way.

        • Hawk@lemmynsfw.com
          link
          fedilink
          arrow-up
          3
          ·
          19 days ago

          I gotcha:

          • Btrfs
            • B-tree File System
              • A Copy on Write file system that supports snapshots, supported mostly by
          • ZFS
            • Zettabyte File System
              • Copy on Write file system. Less flexible than BTRFS but generally more robust and stable. Better compression in my experience than BTRFS. Out-of-kernel Linux support and native FreeBSD support.
          • HFS+
            • What Mac uses, I have no clue about this. Some Copy on Write stuff.
          • NTFS
            • Windows file system
            • From what I know, optional per-file compression but no COW
            • In my experience less stable than ext4/ZFS, but maybe it’s better nowadays.
          • TCB13@lemmy.world
            link
            fedilink
            English
            arrow-up
            2
            ·
            19 days ago

            Great summary, but I have to add that NTFS is WAY more stable than ext4 when it comes to hardware glitches and/or power failures. ZFS is obviously superior to both but overkill for most people; BTRFS should be a nice middle ground, and now even NAS manufacturers like Synology are migrating from ext4 to BTRFS.

            • Hawk@lemmynsfw.com
              link
              fedilink
              arrow-up
              1
              ·
              19 days ago

              Well, that’s good to know, because I had some terrible luck with it about a decade ago. Not that I’d go back to Windows; I just don’t need it for work anymore, and it’s become far too complex.

              I’ve also had pretty bad luck with BTRFS, though it seems to have improved a lot in the past 3 years that I’ve been using it.

              ZFS would be good, but having to rebuild the kernel module is a pain in the ass, because when it fails to build you’re unbootable (on root). I also don’t like how clones are dependent on their parents; it requires a lot of forethought when you’re trying to create a reproducible build on e.g. Gentoo.

  • cygon@lemmy.world
    link
    fedilink
    arrow-up
    21
    ·
    20 days ago

    I’m the weird one in the room. I’ve been using 7z for the last 10–15 years and now .tar.zst, after finding out that Zstandard achieves higher compression than 7-Zip, even with 7-Zip in “best” mode, LZMA version 1, huge dictionary sizes, and whatnot.

    zstd --ultra -M99000 -22 files.tar -o files.tar.zst
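
    If your tar is reasonably new (GNU tar 1.31+), it can also drive zstd itself, which folds the two steps into one; the flags below assume GNU tar and made-up file names:

```shell
mkdir -p docs && echo "data" > docs/f.txt
tar --zstd -cf docs.tar.zst docs    # tar shells out to the zstd binary
tar --zstd -tf docs.tar.zst         # lists the archived paths
```

    For zstd’s highest levels you’d still reach for the standalone command as above, since tar doesn’t pass level flags through by default.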

  • jh29a@lemmy.blahaj.zone
    link
    fedilink
    arrow-up
    21
    ·
    edit-2
    20 days ago

    I use .tar.gz for personal backups because it’s built in, and because it’s the command I could get custom subdirectory exclusion to work with.
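
    For the curious, GNU tar’s --exclude is the relevant switch; the paths below are made up:

```shell
mkdir -p home/docs home/cache
echo "keep" > home/docs/note.txt
echo "skip" > home/cache/tmp.bin
# Exclude a subdirectory from the backup:
tar -czf backup.tar.gz --exclude='home/cache' home
tar -tzf backup.tar.gz   # the cache/ subtree does not appear
```

    The pattern also accepts globs, e.g. --exclude='*/cache'.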