
How to Self-Host Duplicati: Encrypted Cloud Backup 2026

OSSAlt Team

TL;DR

Duplicati (LGPL, ~10K GitHub stars, written in C#) is a free backup client that stores encrypted, compressed backups on any cloud storage service or remote server. It supports 25+ storage backends, including S3, Backblaze B2, Google Drive, OneDrive, SFTP, and WebDAV. All backups are encrypted client-side with AES-256 before leaving your machine. For comparison: Backblaze Computer Backup costs $7/month and CrashPlan costs $10/month, while Duplicati plus B2 storage runs roughly $0.50/month for 100GB.

Key Takeaways

  • Duplicati: LGPL, ~10K stars, C# — encrypted backup to cloud storage
  • AES-256 encryption: Data encrypted before upload — storage provider can't read your files
  • 25+ backends: S3, B2, GDrive, OneDrive, SFTP, WebDAV, Azure, FTP, local
  • Deduplication: Block-level — only changed data is uploaded after initial backup
  • Scheduling: Cron-like schedules with retention policies
  • Web UI: Configure and monitor backups from the browser

Duplicati vs Restic vs Borg vs CrashPlan

| Feature | Duplicati | Restic | Borg | CrashPlan |
|---|---|---|---|---|
| Price | Free | Free | Free | $10/mo |
| UI | Web GUI | CLI only | CLI only | Desktop GUI |
| Encryption | AES-256 | AES-256 | AES-256 | AES-256 |
| Deduplication | Block-level | Content-defined | Content-defined | File-level |
| Cloud backends | 25+ | 20+ | SSH only | Cloud only |
| Setup difficulty | Easy | Medium | Medium | Easy |
| Scheduling | Built-in | External (cron) | External (cron) | Built-in |
| Restore | Web UI | CLI | CLI | Desktop app |

Part 1: Docker Setup

# docker-compose.yml
services:
  duplicati:
    image: lscr.io/linuxserver/duplicati:latest
    container_name: duplicati
    restart: unless-stopped
    ports:
      - "8200:8200"
    volumes:
      - duplicati_config:/config
      # Mount directories you want to back up:
      - /home:/source/home:ro
      - /etc:/source/etc:ro
      - /var/lib/docker/volumes:/source/docker-volumes:ro
      # Optional: local backup destination:
      - /mnt/backup:/backups
    environment:
      PUID: 1000
      PGID: 1000
      TZ: America/Los_Angeles

volumes:
  duplicati_config:

docker compose up -d

Visit http://your-server:8200 → set a web UI password when prompted.
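Before opening the browser, it can be worth confirming the container is actually up and the UI is answering. A quick sanity check, assuming the compose file above and that you are on the server itself:

```shell
# Confirm the container is running:
docker compose ps duplicati

# Check that the web UI responds on port 8200 (prints the HTTP status code;
# a 200 or a redirect to the login page both mean it is alive):
curl -fsS -o /dev/null -w '%{http_code}\n' http://localhost:8200
```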


Part 2: HTTPS with Caddy

backup.yourdomain.com {
    reverse_proxy localhost:8200
}

Part 3: Create a Backup Job

Step-by-step wizard

  1. + Add Backup → Configure a new backup

  2. General settings:

    • Name: Daily Server Backup
    • Encryption: AES-256 (default)
    • Passphrase: strong, unique passphrase (SAVE THIS — you cannot recover data without it)
  3. Destination: Choose storage backend

Storage backends

| Backend | Config |
|---|---|
| Backblaze B2 | Bucket name, Application Key ID, Application Key |
| Amazon S3 | Bucket, region, Access Key, Secret Key |
| Google Drive | OAuth login → select folder |
| SFTP | Host, port, username, SSH key or password |
| WebDAV | URL, username, password |
| OneDrive | OAuth login → select folder |
| Local/Network | Path: /backups/server1 |
| MinIO | S3-compatible: endpoint, bucket, access key |
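For B2 it is good practice to give Duplicati an application key scoped to just the backup bucket, so a leaked key cannot touch other data. A sketch with the `b2` CLI; the bucket and key names are examples, and newer CLI releases may use `b2 bucket create` / `b2 key create` instead of the commands shown:

```shell
# Create a private bucket for backups:
b2 create-bucket my-server-backups allPrivate

# Create a key restricted to that bucket, with only the capabilities
# Duplicati needs to write, list, read, and prune backup files:
b2 create-key --bucket my-server-backups duplicati-server1 \
  listBuckets,listFiles,readFiles,writeFiles,deleteFiles
```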

Backblaze B2 example

Backend: B2 Cloud Storage
Bucket: my-server-backups
Folder: server1/
B2 Application Key ID: your-key-id
B2 Application Key: your-key

S3 example

Backend: S3 Compatible
Server: s3.amazonaws.com (or minio.yourdomain.com)
Bucket: backups
Region: us-east-1
Folder: server1/
AWS Access ID: AKIA...
AWS Access Key: your-secret-key

  4. Source data: Select directories to back up

    • /source/home (home directories)
    • /source/etc (system configuration)
    • /source/docker-volumes (Docker data)
  5. Schedule:

    • Frequency: Daily at 3:00 AM
    • Or: Every 6 hours
  6. Options:

    • Keep backups: Smart retention (1/day for 7 days, 1/week for 4 weeks, 1/month for 12 months)
    • Block size: 100KB (the default, a good balance for deduplication)
    • Upload speed limit: optional (avoid saturating your connection)

Part 4: Retention Policies

Keep all backups for: 7 days
Keep 1 backup per week for: 4 weeks
Keep 1 backup per month for: 12 months
Delete everything older than: 1 year
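The smart-retention settings above correspond to Duplicati's `--retention-policy` advanced option, which can also be set per job or passed on the command line. A sketch reusing the B2 URL from Part 6, assuming the documented comma-separated timeframe:interval syntax:

```shell
# Keep one backup per day for a week, one per week for 4 weeks,
# one per month for 12 months (the "smart retention" preset):
docker exec duplicati duplicati-cli backup \
  "b2://my-bucket/server1?auth-username=KEY_ID&auth-password=KEY" \
  /source/home \
  --passphrase="your-encryption-passphrase" \
  --retention-policy="1W:1D,4W:1W,12M:1M"
```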

Custom retention

Keep last: 30 versions

Unlimited retention

Keep all backups (unlimited)

Warning: With unlimited retention, storage usage (and cost) grows without bound as versions accumulate. Smart retention is recommended.


Part 5: Restore Files

Via web UI

  1. Backups → [backup name] → Restore
  2. Browse the backup tree — select files/folders
  3. Choose restore point (date/time)
  4. Restore to: original location or alternate path
  5. Restore → files are downloaded, decrypted, and placed

Restore specific files

Restore → Search:
  *.conf                 — all config files
  /source/home/user/     — specific directory

Restore to different machine

  1. Install Duplicati on the new machine
  2. Point to the same storage backend
  3. Restore from configuration → enter passphrase
  4. Browse and restore any file

Part 6: CLI Usage

# Run backup from command line:
docker exec duplicati duplicati-cli backup \
  "b2://my-bucket/server1?auth-username=KEY_ID&auth-password=KEY" \
  /source/home \
  --passphrase="your-encryption-passphrase" \
  --backup-name="CLI Backup"

# List backup versions:
docker exec duplicati duplicati-cli list \
  "b2://my-bucket/server1?auth-username=KEY_ID&auth-password=KEY" \
  --passphrase="your-encryption-passphrase"

# Restore a file:
docker exec duplicati duplicati-cli restore \
  "b2://my-bucket/server1?auth-username=KEY_ID&auth-password=KEY" \
  --passphrase="your-encryption-passphrase" \
  --restore-path=/tmp/restore \
  "home/user/important-file.txt"

# Verify backup integrity:
docker exec duplicati duplicati-cli test \
  "b2://my-bucket/server1?auth-username=KEY_ID&auth-password=KEY" \
  --passphrase="your-encryption-passphrase" \
  5   # verify 5 random file sets

Part 7: Notifications

Email notifications

Settings → Add send mail advanced option:
  --send-mail-url=smtp://mail.yourdomain.com:587
  --send-mail-username=backup@yourdomain.com
  --send-mail-password=***
  --send-mail-from=backup@yourdomain.com
  --send-mail-to=you@yourdomain.com
  --send-mail-subject=Duplicati %OPERATIONNAME% - %PARSEDRESULT%
  --send-mail-level=Warning,Error,Fatal

Webhook (ntfy, Gotify, etc.)

Advanced Options:
  --send-http-url=https://ntfy.yourdomain.com/backups
  --send-http-message=%OPERATIONNAME% completed: %PARSEDRESULT%
  --send-http-result-output-format=Duplicati
  --send-http-level=Warning,Error,Fatal
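Before relying on the webhook, it is worth verifying the ntfy topic is reachable with a plain POST (the URL is the example from above; ntfy accepts the request body as the notification message):

```shell
# Send a test notification to the ntfy topic:
curl -d "Duplicati notification test" https://ntfy.yourdomain.com/backups
```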

Part 8: Best Practices

What to back up

✅ Home directories (/home)
✅ Config files (/etc)
✅ Docker volumes (/var/lib/docker/volumes)
✅ Database dumps (pre-backup script)
✅ SSL certificates (/etc/letsencrypt)

❌ System binaries (/usr, /bin) — reinstall is faster
❌ Package cache (/var/cache)
❌ Temp files (/tmp)
❌ Docker images — re-pull is faster

Pre-backup database dumps

# Pre-backup dump script. Note: run this on the host, not inside the
# Duplicati container — `docker exec` only works there if the Docker
# socket is mounted into the container.
#!/bin/bash
set -euo pipefail
# Dump all databases so the backup captures a consistent snapshot:
docker exec postgres pg_dumpall -U postgres > /source/home/backups/postgres-all.sql
docker exec mysql mysqldump --all-databases -u root -p"$PASS" > /source/home/backups/mysql-all.sql
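If the dump script runs on the host, schedule it shortly before the backup window. A cron sketch, assuming the 3:00 AM daily job from Part 3 and that the script is saved as /usr/local/bin/dump-databases.sh (path and filename are examples):

```shell
# /etc/cron.d/pre-backup-dumps: dump at 2:30 AM, half an hour
# before the 3:00 AM Duplicati job:
30 2 * * * root /usr/local/bin/dump-databases.sh >> /var/log/pre-backup-dumps.log 2>&1
```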

3-2-1 backup rule

  • 3 copies of data (original + 2 backups)
  • 2 different storage media (local + cloud)
  • 1 offsite (cloud storage)
# Create 2 backup jobs:
# 1. Local backup (fast restore):
#    Destination: /mnt/backup/server1
#    Schedule: Every 6 hours
#
# 2. Cloud backup (disaster recovery):
#    Destination: B2/S3
#    Schedule: Daily at 3 AM

Maintenance

# Update:
docker compose pull
docker compose up -d

# Verify backup integrity (monthly):
# Backups → [job] → Commandline → Verify

# Check storage usage:
# Backups → [job] → Show log → backend statistics

# Compact/repair database:
docker exec duplicati duplicati-cli repair \
  "b2://bucket/path?auth-username=KEY&auth-password=SECRET" \
  --passphrase="your-passphrase"

# Logs:
docker compose logs -f duplicati

See also: Restic + Rclone — CLI-first backup with more advanced deduplication

See all open source DevOps tools at OSSAlt.com/categories/devops.
