Hi everyone,
I need some help.
I'm currently self-hosting some of my applications on DigitalOcean, and I run my containers with Portainer CE. I was wondering how you all keep backups of the applications running on Docker.
I'm currently using DigitalOcean's snapshot feature, but is there a better way? Any help on this is highly appreciated.
For databases and data I use restic-compose-backup, because it lets you configure backups with labels in your docker compose files.
For config files I use a git repository.
Uuuh…timeshift and borg??
Hey that is the plot to First Contact.
I just run a pg_dump through kubectl exec and pipe the stdout to a file on my master node. The same script then runs restic to send encrypted backups over to S3. I use restic's host name flag as kind of a hack to get backups per service name, which eliminates the risk of overwriting files or directories with the same name.
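A minimal sketch of that pipeline, assuming a Postgres pod, database name, S3 repo URL, and password file (all hypothetical; restic's `--host` flag is the real one being described):

```shell
#!/bin/sh
# Sketch: dump a database out of a pod, then push it to restic.
backup_service() {
  service="$1" pod="$2" db="$3"     # e.g. backup_service myapp postgres-0 mydb
  dump_dir="/backups/${service}"    # assumed path on the master node
  mkdir -p "$dump_dir"

  # Run pg_dump inside the pod and capture stdout on the master node.
  kubectl exec "$pod" -- pg_dump -U postgres "$db" > "${dump_dir}/${db}.sql"

  # --host tags each snapshot with the service name, so two services
  # with identically named dump files never collide in the repo.
  RESTIC_REPOSITORY="s3:s3.amazonaws.com/my-backups" \
  RESTIC_PASSWORD_FILE="/root/.restic-pass" \
    restic backup --host "$service" "$dump_dir"
}
```

Run it from cron per service, e.g. `backup_service myapp postgres-0 mydb`.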
I back up all the mounted docker volumes once every hour (snapshots). Additionally, I create dumps from all databases with https://github.com/tiredofit/docker-db-backup (once every hour or once a day, depending on the database).
ZFS snapshots.
Duplicati, to take live, crash-consistent backups of all my Windows servers and VMs with Volume Shadow Copy Service (VSS).
On Proxmox, I use Hetzner Storage Box as my backup solution.
I have bind mounts to NFS shares that are backed by ZFS pools, and I run snapshots and sync jobs to another storage device. All containers are ephemeral.
When backing up Docker volumes, shouldn't the container be stopped first? I can't see any support for that in the backup tools mentioned.
Yes, the containers do need to be stopped. I actually built a project that does exactly that.
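The stop/archive/start dance can be sketched like this (container, volume, and destination names are assumptions, not anyone's actual setup):

```shell
#!/bin/sh
# Sketch: stop the container so the volume is quiescent, archive the
# volume from a throwaway helper container, then restart.
backup_volume() {
  container="$1" volume="$2" dest="$3"

  docker stop "$container"

  # Mount the volume read-only into a disposable alpine container and
  # tar its contents onto a host directory.
  docker run --rm \
    -v "${volume}:/data:ro" \
    -v "${dest}:/backup" \
    alpine tar czf "/backup/${volume}-$(date +%F).tar.gz" -C /data .

  docker start "$container"
}
# usage: backup_volume myapp myapp_data /srv/backups
```

The helper-container trick avoids needing to know where Docker stores the volume on disk.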
Thanks, I will look into this.
Proxmox Backup Server (PBS), snapshotting all my VMs / LXCs.
For external VPSes and anything that can't run the PBS client, I rsync important data into my home network first, then do a file-based backup of that data to PBS via the proxmox-backup-client tool. All of this is automated through cron jobs.
Those backups then get synced to a second datastore for a bit of redundancy.
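A rough sketch of the rsync-then-PBS step, with hypothetical host names, paths, and datastore (the `proxmox-backup-client backup` invocation and `--repository user@realm@host:datastore` form are the tool's real shape):

```shell
#!/bin/sh
# Sketch: pull data from an external VPS, then push it to PBS as a
# file-based (.pxar) backup.
sync_and_backup() {
  # Mirror the VPS's important data into the home network first.
  rsync -az --delete "vps.example.com:/srv/appdata/" /mnt/backups/vps/

  # File-level backup of the mirrored data into a PBS datastore.
  proxmox-backup-client backup appdata.pxar:/mnt/backups/vps \
    --repository "backup@pbs@pbs.lan:external"
}
# crontab entry (hypothetical schedule):
# 0 3 * * * /usr/local/bin/sync_and_backup.sh
```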
Kopia has been great.
I use Duplicati to back up to a secure off-site location. Useful for something like Vaultwarden.
I use resticker to add an additional backup service to each compose file, which lets me customize some pre/post-backup actions. Works like a charm 👍
Most of mine are lightweight, so private repos on Git.
For big data I have two NAS that sync daily.
Unraid with Duplicacy and Appdata Backup, doing incremental backups to Backblaze.