I’m trying to find a good method of making periodic, incremental backups. I assume that the most minimal approach would be to have a cron job run rsync periodically, but I’m curious what other solutions may exist.
I’m interested in both command-line and GUI solutions.
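For concreteness, the minimal approach I had in mind is something like this (the paths and schedule are just placeholders):

```bash
# Example crontab entry (add via `crontab -e`): incremental rsync to an
# external drive every night at 02:00. /mnt/backup is a placeholder mount point.
0 2 * * * rsync -a --delete /home/me/ /mnt/backup/home/
```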
I use an rsync + btrfs snapshot solution (rough sketch below).
- Use rsync to incrementally collect all data into a btrfs subvolume
- Deduplicate using duperemove
- Create a read-only snapshot of the subvolume
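Roughly, the flow looks like this (device names, mount points and subvolume names are just examples; adjust to taste):

```bash
#!/usr/bin/env bash
# Rough sketch of the rsync + btrfs snapshot flow.
# Assumes the external drive is already formatted as btrfs and mounted at
# /mnt/backup, with a subvolume named "data".
set -euo pipefail

# 1. Incrementally collect everything into the subvolume
rsync -aHAX --delete /home/ /mnt/backup/data/home/

# 2. Deduplicate (the hash file makes later runs faster)
duperemove -rdh --hashfile=/mnt/backup/dedup.hash /mnt/backup/data

# 3. Create a read-only snapshot named after today's date
btrfs subvolume snapshot -r /mnt/backup/data "/mnt/backup/snapshots/$(date +%F)"
```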
I don’t have a backup server, just an external drive that I only connect during backup.
Deduplication is mediocre; I am still looking for a snapshot-aware duperemove replacement.
I’m not trying to start a flame war, but I’m genuinely curious: why do people like btrfs over zfs? Btrfs seems very much “not ready for prime time”.
I’ve only ever run ZFS on a Proxmox/server system, but doesn’t it require a not-insignificant amount of resources? BTRFS is not flawless, but it does have a pretty good feature set.
btrfs is included in the Linux kernel; zfs is not on most distros
The tiny chance of an external kernel module breaking on a kernel upgrade does happen sometimes, and it’s probably scary enough for a lot of people.
Fair enough
The features necessary for most btrfs use cases are all stable, plus btrfs is readily available in the Linux kernel, whereas for zfs you need an additional kernel module. The availability advantage of btrfs is a big plus in case of a disaster, i.e. no additional work is required to recover your files.
(All the above only applies if your primary OS is Linux, if you use Solaris then zfs might be better.)
Get a Mac, use Time Machine. Go all in on the ecosystem: phone, watch, iPad, TV. I resisted for years, but it’s so good man, and the Apple silicon is just leaps beyond everything else.
Time Machine is such a neglected product. Time Shift is worlds beyond it.
Time Machine is not a proper backup; it is unreliable. I’ve had corrupted Time Machine backups, and its backups are non-portable: you can only read them using an Apple machine. Apple Silicon is also not leaps beyond everything else; a 7000-series AMD chip will trade blows on performance per watt given the same power target. (Source: I measured it; a 7950X with a 60-watt power limit will closely match an M1 Ultra at the same 60 watts.)
Sure, their laptops are tuned better out of the box and have great battery life, but that’s not because of Apple Silicon. Apple had good battery life before, even when their laptops had the same Intel chips as any other laptop. Why? Because of software.
Like before, their new M-chips are nothing special. Apple Silicon chips are great, but so are other modern chips. Apple Silicon is not “leaps beyond everything else”.
If you look past their shiny fanboy-bait chips, you realize you pay **huge** markups on RAM and storage. Apple’s RAM and storage isn’t anything special, but it’s a lot more expensive than any other high-end RAM and storage, and it’s not like it’s better, because, again, an AMD chip can use regular RAM modules and an NVMe SSD and it will match the M-chip’s performance at the same power target. Except you can replace the RAM modules and the SSD on the AMD platform for reasonable prices.
In the end, a MacBook is a great product, and there’s no other laptop that really gets close to its performance at its size. But that’s it; that’s where Apple’s advantage ends. Past their ultra-light MacBooks, you get overpriced hardware and crazy expensive upgrades, with an OS that isn’t better, more reliable, or more stable than Windows 11 (source: I use macOS and Windows 11 daily). You can buy a slightly thicker laptop (still thin and light) with replaceable RAM and SSD, and it will easily match the performance of the magic M1 chip with only a slight reduction in potential battery life. But guess what: if you actually USE your laptop for anything, the battery life of any laptop will quickly drop to 2-3 hours at best.
And that’s just laptops. If you want actual work done, you get a desktop, and for the price of any Apple desktop you can easily get a PC that outperforms it. In some cases, you can buy a PC that outperforms the Apple desktop AND a MacBook for on the go, and still have money left over. Except for power consumption, of course, but who cares about power consumption on a work machine? Only Apple fanboys care about that, because it’s the only thing they have going for them. My time is more expensive than my power bill.
Someone asking for a Linux backup solution may prefer to avoid the Apple ‘ecosystem’.
I use Borg backup with Vorta for a GUI. Hasn’t let me down yet.
This is the correct answer.
I use Pika Backup, which I think uses Borg. Super good-looking GNOME app that has worked for me.
Borgmatic is also a great option, CLI only.
Timeshift for system files, and manually backing up my home folder.
Duplicity (cli) with deja-dup (gui) has saved my sorry ass many times.
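For anyone who wants the CLI side, a typical duplicity run looks roughly like this (the target URL and file names are just examples):

```bash
# Encrypted incremental backup of my home dir to an external drive,
# forcing a fresh full backup once a month (paths are placeholders)
duplicity --full-if-older-than 1M /home/me file:///mnt/backup/home

# Restoring a single file from the latest backup
duplicity restore --file-to-restore Documents/notes.txt \
    file:///mnt/backup/home /home/me/notes-restored.txt
```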
I have a bash script that backs all my stuff up to my home server with Borg. My servers have cron jobs that run similar scripts.
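It’s nothing fancy; the script boils down to something like this (the repo path, excludes and retention policy are just illustrative):

```bash
#!/usr/bin/env bash
# Nightly Borg backup to the home server over ssh, called from cron,
# e.g. with a crontab line like: 0 3 * * * /home/me/bin/backup.sh
set -euo pipefail

export BORG_REPO='ssh://me@homeserver/~/backups/laptop'
export BORG_PASSPHRASE='use-a-real-secret-here'   # or use BORG_PASSCOMMAND

# Create a new archive named after the host and date
borg create --stats --compression lz4 \
    --exclude '*/node_modules' \
    ::'{hostname}-{now:%Y-%m-%d}' \
    ~/Documents ~/Pictures ~/Projects

# Thin out old archives
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6
```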
Check out Pika Backup. It’s a beautiful frontend for Borg. And Borg is the shit.
Most of my data is backed up to (or just stored on) a VPS in the first instance, and then I backup the VPS to a local NAS daily using rsnapshot (the NAS is just a few old hard drives attached to a Raspberry Pi until I can get something more robust). Very occasionally I’ll back the NAS up to a separate drive. I also occasionally backup my laptop directly to a separate hard drive.
Not a particularly robust solution, but it gives me some peace of mind. I would like to build a better NAS that can support RAID, as I was never able to get it working with the Pi.
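For what it’s worth, the daily run is just cron calling rsnapshot’s retain levels, something like this (the times and level names have to match whatever you configured in rsnapshot.conf):

```bash
# /etc/cron.d entries driving rsnapshot (retain levels must match rsnapshot.conf)
30 3 * * *   root  /usr/bin/rsnapshot daily
0  4 * * 1   root  /usr/bin/rsnapshot weekly
30 4 1 * *   root  /usr/bin/rsnapshot monthly
```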
A separate NAS on an Atom CPU, with btrfs in RAID 10, exposed over NFS.
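If anyone’s curious, the setup is basically just this (device names, mount point and export subnet are examples):

```bash
# Create the btrfs RAID 10 array across four disks (device names are examples)
mkfs.btrfs -d raid10 -m raid10 /dev/sda /dev/sdb /dev/sdc /dev/sdd
mount /dev/sda /srv/backup

# Export it over NFS: add to /etc/exports, then reload the export table
echo '/srv/backup 192.168.1.0/24(rw,sync,no_subtree_check)' >> /etc/exports
exportfs -ra
```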
Periodic backup to an external drive via Deja Dup. Plus, I keep all important docs in Google Drive. All photos are in Google Photos. So it’s really only my music that isn’t in the cloud. But I might try uploading it to Drive as well one day.
I just run my own nextcloud instance. Everything important is synced to that with the nextcloud desktop client, and the server keeps a month’s worth of backups on my NAS via rsync.
I use duplicity to a drive mounted off a Pi for local, tarsnap for remote. Both are command-line tools; tarsnap charges for their servers based on exact usage. (And thanks for the reminder; I’m due for another review of exactly what parts of which drives I’m backing up.)
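Tarsnap’s CLI is pleasantly tar-like; my runs are essentially these two commands (archive name and paths are placeholders):

```bash
# Create a dated archive of the stuff I care about
tarsnap -c -f "laptop-$(date +%Y-%m-%d)" ~/Documents ~/Mail

# See what's stored (and therefore what I'm paying for)
tarsnap --list-archives
```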
Restic since 2018, both to locally hosted storage and to remote over ssh. I’ve got “stuff I care about” and “stuff that can be relatively easily replaced” fairly well separated, so my filtering rules are not too complicated. I used duplicity for many years before that, and afbackup to DLT IV tapes prior to that.
Used to use Duplicati but it was buggy and would often need manual intervention to repair corruption. I gave up on it.
Now use Restic to Backblaze B2. I’ve been very happy.
I’ve used restic in the past; it’s good but requires a great deal of setup if memory serves me correctly. I’m currently using Duplicati on both Ubuntu and Windows and I’ve never had any issues. Thanks for sharing your experience though; I’ll be vigilant.
Restic to B2 is made of win.
The quick, change-only backups in a single executable intrigued me; the ability to mount snapshots to get at, e.g., a single file hooked me. The wide, effortless support for services like Backblaze made me an advocate.
I back up nightly to a local disk, and twice a week to B2. Everywhere. I have some 6 machines I do this on; one holds the family photos and our music library, and is near a TB by itself. I still pay only a few dollars per month to B2; it’s a great service.
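For anyone who hasn’t tried it, the whole B2 workflow is only a handful of commands (the bucket name and secrets below are examples; credentials come from your B2 account):

```bash
# One-time: point restic at a B2 bucket and initialize the repository
export B2_ACCOUNT_ID='your-key-id'
export B2_ACCOUNT_KEY='your-application-key'
export RESTIC_PASSWORD='use-a-real-secret-here'
restic -r b2:my-backup-bucket:laptop init

# Nightly: back up, then trim old snapshots
restic -r b2:my-backup-bucket:laptop backup ~/Documents ~/Pictures
restic -r b2:my-backup-bucket:laptop forget --keep-daily 7 --keep-weekly 5 --prune

# The killer feature: browse any snapshot as a filesystem to grab a single file
restic -r b2:my-backup-bucket:laptop mount /mnt/restic
```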
Anything important I keep in my Dropbox folder, so then I have a copy on my desktop, laptop, and in the cloud.
When I turn off my desktop, I use restic to back up my Dropbox folder to a local external hard drive, and then restic runs again to back up to Wasabi, which is a storage service like Amazon’s S3.
Same exact process for when I turn off my laptop… except sometimes I don’t have my laptop’s external HD plugged in, so that gets skipped.
So that’s three local copies, two local backups, and two remote backup storage locations. Not bad.
Changes I might make:
- add another remote location
- rotate local physical backup device somewhere (that seems like a lot of work)
- move to Nextcloud or Seafile instead of Dropbox
I used Seafile for a long time, but I couldn’t keep it up, so I switched to Dropbox.
Advice, thoughts welcome.
I actually move my Documents, Pictures, and other important folders inside my Dropbox folder and symlink them back to their original locations.
This gives me the same Docs, Pics, etc. folders synced on every computer.
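Concretely, that’s just this, using Documents as the example:

```bash
# Move the real folder into Dropbox, then symlink it back into place
mv ~/Documents ~/Dropbox/Documents
ln -s ~/Dropbox/Documents ~/Documents
```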