What are your backup solutions?

I just started getting into self-hosting with Docker Compose, and I'm wondering about possible backup solutions. So far I only need to save my Docker configs, but I want to back up host files as well. What software and hardware are you using for backups?

69 comments
  • I've had excellent luck with Kopia, backing up to Backblaze B2.

    At work, I do the same to a local directory in my company provided OneDrive account to keep company data on company resources.
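
    A minimal sketch of that Kopia-to-B2 flow; the bucket name, keys, path, and retention numbers are placeholders, not details from the post:

```shell
# Kopia encrypts everything client-side with this password
export KOPIA_PASSWORD='change-me'

# One-time: create an encrypted repository inside a B2 bucket
kopia repository create b2 --bucket=my-backups --key-id=KEY_ID --key=APP_KEY

# Recurring (cron or systemd timer): snapshot a directory
kopia snapshot create /home/me/documents

# Bound how much history the repository keeps
kopia policy set /home/me/documents --keep-daily=7 --keep-weekly=4
```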

  • I doubt you're using NixOS, so this config might seem useless, but at its core it's just a systemd timer, a service, and some bash scripting.

    To convert this to another OS, use cron to call the script at whatever time you want. Copy the part between script = '' and the closing '', then swap out variables like the location of docker-compose, since it's different on NixOS.

    Let me explain the script. We start by defining the backupDate variable; this will be the name of the zip file. As of now that variable would be 2023-07-12. We then go to each folder with a docker-compose.yml file and take the stack down. (You could also replace down with stop if you don't plan on updating each night like I do.) With everything stopped, the script zips /docker/apps into a dated archive, then pulls fresh images and brings the containers back up. I use rclone to upload the archive to Dropbox, but rclone supports many providers, so check it out and see if it has the one you need. Lastly, rclone deletes anything older than 7 days in the backup folder. If you end up going my route and get stuck, let me know and I can help out. Good luck.

    systemd = {
      timers.docker-backup = {
        wantedBy = [ "timers.target" ];
        partOf = [ "docker-backup.service" ];
        timerConfig.OnCalendar = "*-*-* 3:30:00";
      };
      services.docker-backup = {
        serviceConfig.Type = "oneshot";
        serviceConfig.User = "root";
        script = ''
          backupDate=$(date +'%F')

          cd /docker/apps/rss
          ${pkgs.docker-compose}/bin/docker-compose down

          cd /docker/apps/paaster
          ${pkgs.docker-compose}/bin/docker-compose down

          cd /docker/no-backup-apps/nextcloud
          ${pkgs.docker-compose}/bin/docker-compose down

          cd /docker/apps/nginx-proxy-manager
          ${pkgs.docker-compose}/bin/docker-compose down

          cd /docker/backups/
          ${pkgs.zip}/bin/zip -r server-backup-$backupDate.zip /docker/apps

          cd /docker/apps/nginx-proxy-manager
          ${pkgs.docker-compose}/bin/docker-compose pull
          ${pkgs.docker-compose}/bin/docker-compose up -d

          cd /docker/apps/paaster
          ${pkgs.docker-compose}/bin/docker-compose pull
          ${pkgs.docker-compose}/bin/docker-compose up -d

          cd /docker/apps/rss
          ${pkgs.docker-compose}/bin/docker-compose pull
          ${pkgs.docker-compose}/bin/docker-compose up -d

          cd /docker/no-backup-apps/nextcloud
          ${pkgs.docker-compose}/bin/docker-compose pull
          ${pkgs.docker-compose}/bin/docker-compose up -d

          cd /docker/backups/
          ${pkgs.rclone}/bin/rclone copy server-backup-$backupDate.zip Dropbox:Server-Backup/
          rm server-backup-$backupDate.zip
          ${pkgs.rclone}/bin/rclone delete --min-age 7d Dropbox:Server-Backup/
        '';
      };
    };
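
    On a non-NixOS system, the same flow can be sketched as a plain bash script driven by cron. The paths and remote name below are copied from the config above; everything else (script location, tools on PATH) is an assumption:

```shell
#!/usr/bin/env bash
# /usr/local/bin/docker-backup.sh -- hypothetical cron version of the
# NixOS script above. Assumes docker-compose, zip, and rclone are installed.
set -euo pipefail

apps="/docker/apps/rss /docker/apps/paaster /docker/no-backup-apps/nextcloud /docker/apps/nginx-proxy-manager"
backupDate=$(date +'%F')

# Stop every stack so the zip captures a consistent state
for app in $apps; do
  (cd "$app" && docker-compose down)
done

# Archive the app data while everything is down
cd /docker/backups/
zip -r "server-backup-$backupDate.zip" /docker/apps

# Pull fresh images and bring everything back up
for app in $apps; do
  (cd "$app" && docker-compose pull && docker-compose up -d)
done

# Ship the archive offsite and prune anything older than a week
rclone copy "server-backup-$backupDate.zip" Dropbox:Server-Backup/
rm "server-backup-$backupDate.zip"
rclone delete --min-age 7d Dropbox:Server-Backup/
```

    Scheduled with a root crontab entry such as 30 3 * * * /usr/local/bin/docker-backup.sh to match the 3:30 timer.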
    
    
      
    • Thanks! I just started setting up NixOS on my laptop and I'm planning to use it for servers next. Saving this for later!

  • Someone on lemmy here suggested Restic, a backup solution written in Go.

    I back up to an internal 4 TB HDD every 30 minutes. My most important files are also stored encrypted in the cloud.

    Restic is good stuff.
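
    A rough restic workflow matching that setup; the repository path, source directory, and retention counts are placeholders:

```shell
# Point restic at a repository on the internal backup drive
export RESTIC_REPOSITORY=/mnt/backup
export RESTIC_PASSWORD='change-me'

restic init                                  # one-time repository setup
restic backup ~/documents                    # run every 30 min from a timer
restic forget --keep-hourly 48 --keep-daily 14 --prune   # trim old snapshots
restic check                                 # occasionally verify the repo
```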

  • At home I have a Synology NAS for backups of the local desktops. Offsite backups are done with restic to Backblaze B2 and to another location.

  • VM instances run on Proxmox VE, with native integration with Proxmox Backup Server (PBS).

    For non-VM machines, the small PBS client.

  • I host everything in Proxmox VMs, so I just take daily snapshots to my NAS.

  • On Proxmox, I use the built-in system plus storing it on my Synology NAS (RS1221+). I use Active Backup for Business (file sync) to back up the Proxmox config files, and also back up my husband's PC and my work PC.

  • Backblaze B2. Any software that is S3-compatible can use B2 as the target, and it's reasonably priced for the service. I back up all the PCs and services to a Synology NAS and then back that up to B2 (everything except my Plex media; that would be pricey, and it's easy enough to re-rip from disc if needed).

  • All systems back up to Synology, then to AWS Glacier. I'll check out Backblaze for pricing.

  • I run a second Unraid server with a couple of backup-related applications, as well as Duplicati. I have my main server network-mounted and run scheduled jobs to copy data from the main pool to the backup pool, as well as to Backblaze. It's nice having the on-site backup as well as the cloud-based one.

    I occasionally burn to 100 GB Blu-rays as well for a physical backup.

  • I rsync my data once a day to another drive via a script, so if I accidentally delete files, I can easily copy them back. Then, once a day, rclone makes an encrypted backup to a Hetzner Storage Box.
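
    The pattern above boils down to two commands; the source path, mount point, and "storagebox-crypt" (an rclone "crypt" remote pointing at a Hetzner Storage Box) are placeholder names:

```shell
# Local mirror for quick restores. No --delete, so files removed from the
# source stay on the backup drive until pruned by hand.
rsync -a /srv/data/ /mnt/backupdrive/data/

# Encrypted offsite copy; sync mirrors deletions to the remote, use
# "rclone copy" instead if deleted files should survive offsite too.
rclone sync /srv/data storagebox-crypt:backup
```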

  • Everything:

    Kopia encrypted -> another physical drive

    Kopia encrypted -> Backblaze B2

    • Cron job every day at 4:15 AM

    Most important folder (part of everything):

    Duplicati encrypted -> Google Drive

    • Also daily backup

  • For containers (though I use k3s), I use git to store Helmfiles and configuration; secrets live in the CI/CD system.

    For the rest, I use autorestic, which backs up data over SSH and to S3.

  • ZFS send to a pair of mirrored HDDs on the same machine every hour, and a daily restic backup to S3 storage. Every six months I test and verify the cloud backup.
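
    An hourly snapshot-and-send like that could look roughly like this; pool/dataset names and the state file are placeholders:

```shell
# Take an hourly snapshot and replicate it to a second pool on the same box
now=$(date +%F-%H)
zfs snapshot "tank/data@$now"

# Remember the last replicated snapshot so each run can send incrementally
prev=$(cat /var/lib/zfs-backup/last 2>/dev/null || true)
if [ -n "$prev" ]; then
  # Incremental: only blocks changed since the previous snapshot
  zfs send -i "tank/data@$prev" "tank/data@$now" | zfs receive -F backup/data
else
  # First run: full send
  zfs send "tank/data@$now" | zfs receive -F backup/data
fi
echo "$now" > /var/lib/zfs-backup/last
```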

  • A lot of services have some built-in way to create backup files. I have cron jobs doing that daily, then uploading the results to cloud storage with rclone.
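
    For instance, a nightly dump-and-upload job for one service might look like this; the container, database, user, and remote names are all made up for the example:

```shell
# Ask the service for its own backup file (here: a Postgres container)
stamp=$(date +%F)
docker compose exec -T db pg_dump -U app appdb > "/backups/appdb-$stamp.sql"

# Push it to cloud storage and prune dumps older than a month
rclone copy "/backups/appdb-$stamp.sql" remote:backups/appdb/
rclone delete --min-age 30d remote:backups/appdb/
```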

  • Encrypted backup to Google Drive weekly from Unraid; planning to get a NAS for another backup location.

  • For my workstation I'm using a small script that packs and compresses all relevant directories with tar once a week. The resulting file is then copied to a local backup drive and to my NAS. An encrypted version of that file is sent to an offsite VPS.

    For my selfhosted services (on Proxmox) I'm using ProxmoxBackupServer.
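
    The tar part of that workstation script can be sketched as a small function; the paths in the example call are placeholders, and the encrypt/upload step is left as comments since it depends on the VPS setup:

```shell
# Pack one directory into a dated, compressed archive and print its path.
make_backup() {
    src=$1                            # directory to back up
    dest=$2                           # where the archive should land
    stamp=$(date +%F)                 # e.g. 2023-07-12
    archive="$dest/backup-$stamp.tar.gz"
    # -C keeps the archive paths relative to the parent of $src
    tar -czf "$archive" -C "$(dirname "$src")" "$(basename "$src")"
    # Offsite step (assumed, adjust to your VPS): encrypt, then upload
    # gpg --symmetric --cipher-algo AES256 "$archive"
    # scp "$archive.gpg" user@vps:/backups/
    echo "$archive"
}

# Example: make_backup /home/me/documents /mnt/backup
```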

  • Custom rsync script. I connect two different hard disks (one natively, one remotely via SSH) to back up the disk.

    Once per month, I unplug the microSD from my Raspberry Pi 4 server and make a full image of it, so that if the card fails I can restore it to a new SD card.
