How to Self-Host Duplicati: Encrypted Cloud Backup 2026
TL;DR
Duplicati (LGPL, ~10K GitHub stars, C#) is a free backup client that stores encrypted, compressed backups on any cloud storage service or remote server. It supports 25+ storage backends, including S3, Backblaze B2, Google Drive, OneDrive, SFTP, and WebDAV. All backups are encrypted client-side with AES-256 before they leave your machine. For comparison: Backblaze's hosted backup costs $7/month and CrashPlan $10/month, while Duplicati + B2 storage runs roughly $0.60/month for 100GB at B2's $6/TB/month rate.
Key Takeaways
- Duplicati: LGPL, ~10K stars, C# — encrypted backup to cloud storage
- AES-256 encryption: Data encrypted before upload — storage provider can't read your files
- 25+ backends: S3, B2, GDrive, OneDrive, SFTP, WebDAV, Azure, FTP, local
- Deduplication: Block-level — only changed data is uploaded after initial backup
- Scheduling: Cron-like schedules with retention policies
- Web UI: Configure and monitor backups from the browser
Duplicati vs Restic vs Borg vs CrashPlan
| Feature | Duplicati | Restic | Borg | CrashPlan |
|---|---|---|---|---|
| Price | Free | Free | Free | $10/mo |
| UI | Web GUI | CLI only | CLI only | Desktop GUI |
| Encryption | AES-256 | AES-256 | AES-256 | AES-256 |
| Deduplication | Block-level | Content-defined | Content-defined | File-level |
| Cloud backends | 25+ | 20+ | SSH only | Cloud only |
| Setup difficulty | Easy | Medium | Medium | Easy |
| Scheduling | Built-in | External (cron) | External (cron) | Built-in |
| Restore | Web UI | CLI | CLI | Desktop app |
Part 1: Docker Setup
# docker-compose.yml
services:
  duplicati:
    image: lscr.io/linuxserver/duplicati:latest
    container_name: duplicati
    restart: unless-stopped
    ports:
      - "8200:8200"
    volumes:
      - duplicati_config:/config
      # Mount directories you want to back up:
      - /home:/source/home:ro
      - /etc:/source/etc:ro
      - /var/lib/docker/volumes:/source/docker-volumes:ro
      # Optional: local backup destination:
      - /mnt/backup:/backups
    environment:
      PUID: 1000
      PGID: 1000
      TZ: America/Los_Angeles

volumes:
  duplicati_config:
docker compose up -d
Visit http://your-server:8200 → set a web UI password when prompted.
Part 2: HTTPS with Caddy
backup.yourdomain.com {
reverse_proxy localhost:8200
}
Part 3: Create a Backup Job
Step-by-step wizard
1. Add Backup → Configure a new backup
2. General settings:
   - Name: Daily Server Backup
   - Encryption: AES-256 (default)
   - Passphrase: a strong, unique passphrase (SAVE THIS: you cannot recover your data without it)
3. Destination: choose a storage backend
Storage backends
| Backend | Config |
|---|---|
| Backblaze B2 | Bucket name, Application Key ID, Application Key |
| Amazon S3 | Bucket, region, Access Key, Secret Key |
| Google Drive | OAuth login → select folder |
| SFTP | Host, port, username, SSH key or password |
| WebDAV | URL, username, password |
| OneDrive | OAuth login → select folder |
| Local/Network | Path: /backups/server1 |
| MinIO | S3-compatible: endpoint, bucket, access key |
Backblaze B2 example
Backend: B2 Cloud Storage
Bucket: my-server-backups
Folder: server1/
B2 Application Key ID: your-key-id
B2 Application Key: your-key
S3 example
Backend: S3 Compatible
Server: s3.amazonaws.com (or minio.yourdomain.com)
Bucket: backups
Region: us-east-1
Folder: server1/
AWS Access ID: AKIA...
AWS Access Key: your-secret-key
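For CLI use later (Part 6), these form fields collapse into a single destination URL. A sketch of the B2 case, using the placeholder values from the example above:

```shell
# Assemble the B2 destination URL that duplicati-cli expects.
# All values are the placeholders from the example above - substitute your own.
BUCKET=my-server-backups
FOLDER=server1
KEY_ID=your-key-id
APP_KEY=your-key
DEST="b2://${BUCKET}/${FOLDER}?auth-username=${KEY_ID}&auth-password=${APP_KEY}"
echo "$DEST"
# → b2://my-server-backups/server1?auth-username=your-key-id&auth-password=your-key
```

The same pattern applies to the other backends: the scheme (b2://, s3://, ssh://, ...) selects the backend and query parameters carry the credentials.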
4. Source data: select the directories to back up
   - /source/home (home directories)
   - /source/etc (system configuration)
   - /source/docker-volumes (Docker data)
5. Schedule:
   - Frequency: Daily at 3:00 AM (or, e.g., every 6 hours)
6. Options:
   - Keep backups: Smart retention (1/day for 7 days, 1/week for 4 weeks, 1/month for 12 months)
   - Block size: 100KB (default, a good balance for deduplication)
   - Upload speed limit: optional (avoid saturating your connection)
Part 4: Retention Policies
Smart retention (recommended)
Keep all backups for: 7 days
Keep 1 backup per week for: 4 weeks
Keep 1 backup per month for: 12 months
Delete everything older than: 1 year
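This smart-retention schedule can also be expressed through Duplicati's --retention-policy advanced option as timeframe:interval pairs. The mapping below is a sketch; verify the exact syntax against your version's documentation:

```shell
# timeframe:interval pairs - keep one backup per interval within each timeframe:
# 1/day within 1 week, 1/week within 4 weeks, 1/month within 12 months.
RETENTION="1W:1D,4W:1W,12M:1M"
echo "--retention-policy=${RETENTION}"
```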
Custom retention
Keep last: 30 versions
Unlimited retention
Keep all backups (unlimited)
Warning: With unlimited retention, storage costs grow without bound as versions accumulate. Smart retention is recommended.
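A back-of-the-envelope estimate makes the trade-off concrete, assuming B2's roughly $6/TB/month ($0.006/GB) rate:

```shell
# Monthly B2 storage cost for a given stored size, at an assumed $0.006/GB/month.
stored_gb=100
cost=$(awk -v gb="$stored_gb" 'BEGIN { printf "%.2f USD/month", gb * 0.006 }')
echo "$cost"   # → 0.60 USD/month
```

Unlimited retention pushes stored_gb up with every change forever; smart retention keeps it near the size of your working set plus a year of sampled history.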
Part 5: Restore Files
Via web UI
- Backups → [backup name] → Restore
- Browse the backup tree — select files/folders
- Choose restore point (date/time)
- Restore to: original location or alternate path
- Restore → files are downloaded, decrypted, and placed
Restore specific files
Restore → Search:
- *.conf — all config files
- /source/home/user/ — a specific directory
Restore to different machine
- Install Duplicati on the new machine
- Point to the same storage backend
- Restore from configuration → enter passphrase
- Browse and restore any file
Part 6: CLI Usage
# Run backup from command line:
docker exec duplicati duplicati-cli backup \
"b2://my-bucket/server1?auth-username=KEY_ID&auth-password=KEY" \
/source/home \
--passphrase="your-encryption-passphrase" \
--backup-name="CLI Backup"
# List backup versions:
docker exec duplicati duplicati-cli list \
"b2://my-bucket/server1?auth-username=KEY_ID&auth-password=KEY" \
--passphrase="your-encryption-passphrase"
# Restore a file:
docker exec duplicati duplicati-cli restore \
"b2://my-bucket/server1?auth-username=KEY_ID&auth-password=KEY" \
--passphrase="your-encryption-passphrase" \
--restore-path=/tmp/restore \
"home/user/important-file.txt"
# Verify backup integrity:
docker exec duplicati duplicati-cli test \
"b2://my-bucket/server1?auth-username=KEY_ID&auth-password=KEY" \
--passphrase="your-encryption-passphrase" \
5 # verify 5 random file sets
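Passing --passphrase on the command line leaks it into shell history and process listings. One alternative is Duplicati's --parameters-file option, sketched below (the file path is an assumption, and the final command is printed rather than executed):

```shell
# Write sensitive options to a file instead of the command line.
PARAMS=/tmp/duplicati-params.txt
cat > "$PARAMS" <<'EOF'
--passphrase=your-encryption-passphrase
--backup-name=CLI Backup
EOF
chmod 600 "$PARAMS"   # restrict the file to the current user

# The backup command then references the file (echoed here as a dry run):
CMD="docker exec duplicati duplicati-cli backup \"b2://my-bucket/server1?auth-username=KEY_ID&auth-password=KEY\" /source/home --parameters-file=$PARAMS"
echo "$CMD"
```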
Part 7: Notifications
Email notifications
Settings → Add send mail advanced option:
--send-mail-url=smtp://mail.yourdomain.com:587
--send-mail-username=backup@yourdomain.com
--send-mail-password=***
--send-mail-from=backup@yourdomain.com
--send-mail-to=you@yourdomain.com
--send-mail-subject=Duplicati %OPERATIONNAME% - %PARSEDRESULT%
--send-mail-level=Warning,Error,Fatal
Webhook (ntfy, Gotify, etc.)
Advanced Options:
--send-http-url=https://ntfy.yourdomain.com/backups
--send-http-message=%OPERATIONNAME% completed: %PARSEDRESULT%
--send-http-result-output-format=Duplicati
--send-http-level=Warning,Error,Fatal
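It helps to confirm the ntfy topic accepts messages before relying on it for failure alerts. The snippet below builds the test command (printed rather than sent; drop the echo to actually POST, and the hostname is the example from above):

```shell
NTFY_URL="https://ntfy.yourdomain.com/backups"
# ntfy treats a plain POST body as the notification text.
echo curl -d "Duplicati test notification" "$NTFY_URL"
```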
Part 8: Best Practices
What to back up
✅ Home directories (/home)
✅ Config files (/etc)
✅ Docker volumes (/var/lib/docker/volumes)
✅ Database dumps (pre-backup script)
✅ SSL certificates (/etc/letsencrypt)
❌ System binaries (/usr, /bin) — reinstall is faster
❌ Package cache (/var/cache)
❌ Temp files (/tmp)
❌ Docker images — re-pull is faster
Pre-backup database dumps
# Run-before script (add to backup job):
#!/bin/bash
# Dump all databases before backup:
docker exec postgres pg_dumpall -U postgres > /source/home/backups/postgres-all.sql
docker exec mysql mysqldump --all-databases -u root -p"$PASS" > /source/home/backups/mysql-all.sql
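A slightly hardened variant of the script above: it fails fast, stamps dumps with the date, and skips the docker calls when the containers aren't present. The container names, dump directory, and $PASS variable are carried over from the example and remain assumptions:

```shell
#!/bin/bash
# Pre-backup dump script with error handling and dated filenames.
set -euo pipefail
DUMP_DIR=/source/home/backups
STAMP=$(date +%F)                                   # e.g. 2026-01-31
PG_FILE="$DUMP_DIR/postgres-all-$STAMP.sql"
MY_FILE="$DUMP_DIR/mysql-all-$STAMP.sql"

# Only attempt the dumps when docker and the postgres container are around;
# adjust the guard if your container names differ.
if command -v docker >/dev/null 2>&1 \
   && docker ps --format '{{.Names}}' 2>/dev/null | grep -q '^postgres$'; then
  mkdir -p "$DUMP_DIR"
  docker exec postgres pg_dumpall -U postgres > "$PG_FILE"
  docker exec mysql mysqldump --all-databases -u root -p"$PASS" > "$MY_FILE"
fi
echo "dump target: $PG_FILE"
```

Dated filenames mean each backup run picks up a fresh file, and Duplicati's deduplication keeps the incremental upload small when the dumps barely change.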
3-2-1 backup rule
- 3 copies of data (original + 2 backups)
- 2 different storage media (local + cloud)
- 1 offsite (cloud storage)
# Create 2 backup jobs:
# 1. Local backup (fast restore):
# Destination: /mnt/backup/server1
# Schedule: Every 6 hours
#
# 2. Cloud backup (disaster recovery):
# Destination: B2/S3
# Schedule: Daily at 3 AM
Maintenance
# Update:
docker compose pull
docker compose up -d
# Verify backup integrity (monthly):
# Backups → [job] → Commandline → Verify
# Check storage usage:
# Backups → [job] → Show log → backend statistics
# Compact/repair database:
docker exec duplicati duplicati-cli repair \
"b2://bucket/path?auth-username=KEY&auth-password=SECRET" \
--passphrase="your-passphrase"
# Logs:
docker compose logs -f duplicati
See also: Restic + Rclone — CLI-first backup with more advanced deduplication
See all open source DevOps tools at OSSAlt.com/categories/devops.