Self-Host Immich: Google Photos Alternative 2026
TL;DR
Immich (MIT, 90K+ GitHub stars) is the most popular self-hosted photo and video management platform — and the closest thing to a drop-in Google Photos replacement that actually exists. It has native iOS and Android apps with background auto-sync, AI-powered face recognition, natural language CLIP search, shared albums, a Memories feature, and a web interface that looks and feels genuinely modern. With v2.0.0 (released October 2025), the project committed to semantic versioning — no more breaking changes without a major version bump. Google Photos charges for storage beyond 15 GB. Immich costs whatever your server costs.
Key Takeaways
- Immich: MIT license, 90K+ stars — #1 self-hosted photo platform by a wide margin
- Native mobile apps: iOS and Android with background auto-sync, WiFi-only and charging-only options
- AI features: Face recognition (clustering), CLIP natural language search, object detection — all running locally
- v2.0.0 stable: Semantic versioning commitment since October 2025 — no more surprise breaking changes
- Docker Compose: Four-service stack (server, machine-learning, PostgreSQL with pgvecto.rs, Redis)
- RAW support: CR2, NEF, ARW, DNG — plus video with transcoding
- Hardware: Minimum 4 GB RAM; 8 GB+ for AI features; runs on Raspberry Pi 4 (8 GB) for basic use
Why Immich Pulled Ahead
Self-hosted photo management has been a crowded space for years. PhotoPrism has been around longer. LibrePhotos has a dedicated following. But Immich crossed 90,000 GitHub stars for a reason: it prioritizes the daily-driver experience over raw feature completeness.
The two things that made Google Photos indispensable were automatic background sync from your phone and a search box that actually understands what you're looking for. Immich does both. The native mobile apps (not a progressive web app — real native apps in the App Store and Play Store) run in the background and upload your photos as you take them, with the same configurable options you'd expect: WiFi-only, charging-only, or always-on. CLIP-based search lets you type "sunset at the beach" or "birthday cake with candles" and get relevant results without any manual tagging.
The developer, Alex Tran, has been transparent about the long road to stability. The v2.0.0 release in October 2025 marked the project's public commitment to not breaking people's setups between updates — a real concern for anything you're trusting with your entire photo library.
Architecture Overview
A full Immich deployment runs four containers. (Earlier releases ran a separate immich-microservices container; its background-job role has since been merged into immich-server.) Understanding what each one does helps when troubleshooting or sizing your hardware.
| Service | Role |
|---|---|
| immich-server | Main API server and web frontend (SvelteKit) — handles uploads, albums, sharing, plus background jobs: thumbnail generation, metadata extraction, video transcoding |
| immich-machine-learning | Python ML service — face recognition, CLIP embeddings, object detection |
| database | PostgreSQL with the pgvecto.rs extension — stores all metadata, user accounts, album structure, and vector embeddings |
| redis | Job queue and caching layer — coordinates background work inside the server |
All photo and video files land in a single upload directory on your host. The PostgreSQL data directory is bind-mounted to the host, and the ML model cache lives in a named Docker volume. This design makes backups straightforward: back up the upload directory and the PostgreSQL database, and you can restore everything.
System Requirements
| Scenario | CPU | RAM | Storage |
|---|---|---|---|
| Basic (no AI) | 2 cores | 4 GB | Your photo library + 10% |
| AI features enabled | 4 cores | 8 GB | Your photo library + 15% |
| Large library (100K+ photos) | 6+ cores | 16 GB | SSD recommended for DB |
| GPU-accelerated ML | Any + NVIDIA/AMD GPU | 8 GB | Same |
Storage note: Immich stores original files untouched plus generates thumbnails and preview images. Budget roughly 10–20% overhead on top of your raw library size.
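As a quick sanity check, here is the overhead math for a hypothetical 250 GB library at the 15% midpoint (the library size is a made-up example):

```shell
# Back-of-envelope storage budget: raw library size plus ~15% overhead
# for generated thumbnails and preview images
LIBRARY_GB=250
BUDGET_GB=$(( LIBRARY_GB * 115 / 100 ))
echo "Provision at least ${BUDGET_GB} GB"   # → Provision at least 287 GB
```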
First-run warning: Thumbnail generation for an existing large library is CPU-intensive and runs in the background after import. For 50,000 photos on modest hardware, expect this to take several hours. The app is fully usable while this runs — thumbnails just appear progressively.
GPU acceleration (NVIDIA CUDA, OpenCL, or Intel OpenVINO) is optional but meaningfully speeds up the machine learning container. Without a GPU, ML tasks run on CPU and take longer but complete correctly.
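If the first-run ML and thumbnail jobs are swamping a shared box, Docker Compose lets you cap the machine-learning container. A sketch with illustrative limits — these values are not from the official template, so size them to your hardware:

```yaml
# docker-compose.yml — optional resource caps on the ML container
# so background jobs can't starve the rest of the host
  immich-machine-learning:
    deploy:
      resources:
        limits:
          cpus: "4.0"
          memory: 6G
```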
Part 1: Docker Compose Installation
1.1 Create the directory structure
mkdir -p /opt/immich
cd /opt/immich
1.2 Create the .env file
# .env
# Change these values before deploying
# Database password — use something strong
DB_PASSWORD=your_strong_password_here
# Immich version — pin to a specific release tag in production
# Check https://github.com/immich-app/immich/releases for the latest
IMMICH_VERSION=release
# Where photos and videos will be stored on the host
UPLOAD_LOCATION=/opt/immich/library
# PostgreSQL data directory
DB_DATA_LOCATION=/opt/immich/postgres
# Timezone
TZ=America/New_York
# JWT secret — generate with: openssl rand -base64 128
SECRET_KEY=your_jwt_secret_here
Generate a strong secret key:
openssl rand -base64 128
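If you prefer to generate both secrets in one go, here is a small sketch — hex output for the database password avoids characters that would need quoting in .env; paste the printed lines into the file:

```shell
# Generate shell-safe values for DB_PASSWORD and SECRET_KEY.
# -hex 16 gives 32 hex characters; -base64 128 gives 172 characters
# once the line wraps are stripped.
DB_PASSWORD="$(openssl rand -hex 16)"
SECRET_KEY="$(openssl rand -base64 128 | tr -d '\n')"
printf 'DB_PASSWORD=%s\nSECRET_KEY=%s\n' "$DB_PASSWORD" "$SECRET_KEY"
```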
1.3 Create the docker-compose.yml
# docker-compose.yml
name: immich
services:
immich-server:
container_name: immich_server
image: ghcr.io/immich-app/immich-server:${IMMICH_VERSION:-release}
volumes:
- ${UPLOAD_LOCATION}:/usr/src/app/upload
- /etc/localtime:/etc/localtime:ro
env_file:
- .env
ports:
      - "2283:2283"
depends_on:
- redis
- database
restart: always
healthcheck:
disable: false
immich-machine-learning:
container_name: immich_machine_learning
# For GPU acceleration, use a hardware-specific image:
# image: ghcr.io/immich-app/immich-machine-learning:${IMMICH_VERSION:-release}-cuda
image: ghcr.io/immich-app/immich-machine-learning:${IMMICH_VERSION:-release}
volumes:
- model-cache:/cache
env_file:
- .env
restart: always
healthcheck:
disable: false
redis:
container_name: immich_redis
image: docker.io/redis:6.2-alpine@sha256:2d1463258f2764328496376f5d965f20c6a67f66ea2b06dc42af351f75248792
healthcheck:
test: redis-cli ping || exit 1
restart: always
database:
container_name: immich_postgres
image: docker.io/tensorchord/pgvecto-rs:pg14-v0.2.0@sha256:90724186f0a3517cf6914295b5ab410db9ce23190a2d9d0b9dd6463e3fa298f0
environment:
POSTGRES_PASSWORD: ${DB_PASSWORD}
POSTGRES_USER: postgres
POSTGRES_DB: immich
POSTGRES_INITDB_ARGS: "--data-checksums"
volumes:
- ${DB_DATA_LOCATION}:/var/lib/postgresql/data
healthcheck:
test: >-
pg_isready --dbname="$${POSTGRES_DB}" --username="$${POSTGRES_USER}" || exit 1;
Chksum="$$(psql --dbname="$${POSTGRES_DB}" --username="$${POSTGRES_USER}" --tuples-only --no-align
--command='SELECT COALESCE(SUM(checksum_failures), 0) FROM pg_stat_database')";
echo "checksum failure count - $${Chksum}";
[ "$${Chksum}" = '0' ] || exit 1
interval: 5m
start_interval: 30s
start_period: 5m
command:
[
"postgres",
"-c",
"shared_preload_libraries=vectors.so",
"-c",
'search_path="$$user", public, vectors',
"-c",
"logging_collector=on",
"-c",
"max_wal_size=2GB",
"-c",
"shared_buffers=512MB",
"-c",
"wal_compression=lz4",
]
restart: always
volumes:
model-cache:
1.4 Start the stack
docker compose up -d
Check that all containers started cleanly:
docker compose ps
docker compose logs immich-server --tail=50
Immich is now available at http://your-server:2283.
1.5 Initial setup
Open the web interface and create your admin account. The first user to register becomes the admin — disable open registration afterward in Administration → Settings → User Management if you're not on a private network.
Part 2: HTTPS with Caddy
For remote access and mobile app sync outside your home network, you need HTTPS. Caddy handles certificate provisioning automatically.
# /opt/caddy/Caddyfile
photos.yourdomain.com {
reverse_proxy localhost:2283
# Increase max upload size for large video files
request_body {
max_size 50GB
}
}
docker run -d \
--name caddy \
--network host \
-v /opt/caddy/Caddyfile:/etc/caddy/Caddyfile \
-v caddy_data:/data \
-v caddy_config:/config \
caddy:latest
Point your DNS A record for photos.yourdomain.com at your server's public IP. Caddy fetches a Let's Encrypt certificate on the first request.
Part 3: Mobile App Setup
This is where Immich meaningfully beats every other self-hosted photo platform. The mobile apps are real native apps — not a web wrapper, not a PWA — available on the App Store and Google Play.
Connecting the app
- Open the app and enter your server URL: https://photos.yourdomain.com
- Log in with your Immich credentials
- The app scans your camera roll and presents an album to back up
Configuring background sync
In the app settings, go to Backup to configure:
- Background backup: On — uploads new photos automatically
- WiFi only: Recommended — prevents mobile data usage for large uploads
- Charging only: Optional — useful if you take a lot of videos
- Upload on mobile data: Off (unless you have unlimited data)
The sync behavior mirrors Google Photos: photos appear in Immich shortly after you take them, without any manual action.
What syncs
The app backs up your entire camera roll by default, including screenshots, WhatsApp images, and any folder you add to the backup selection. You can exclude specific albums in Backup → Excluded Albums — useful for keeping your camera roll clean on the Immich side.
Part 4: AI Features
Immich's machine learning container runs three distinct AI capabilities. All of them run locally — no data leaves your server.
Face Recognition
Immich clusters detected faces across your library and lets you name them. Once named, a person becomes searchable: search for "Sarah" and see every photo Sarah appears in. The clustering is automatic — Immich groups similar faces together and prompts you to assign names.
Face recognition runs as a background job. For a fresh import of 50,000 photos, initial processing takes hours on CPU. The accuracy is good for consumer use but not perfect — expect occasional misclustering, especially for photos taken in poor lighting or at unusual angles.
CLIP Natural Language Search
CLIP (Contrastive Language-Image Pretraining) lets you search your photos using plain English descriptions rather than tags. Examples that actually work:
- "golden retriever playing in snow"
- "birthday cake with candles"
- "sunset over the ocean"
- "kids at a birthday party"
- "aerial view of a city"
CLIP generates a vector embedding for each photo and stores it in the PostgreSQL database. Search queries are matched against these embeddings semantically, not by filename or EXIF metadata. This is the same underlying technology that powers Google Photos' search.
Object Detection
Immich tags photos with detected objects and scenes automatically. These tags appear in the photo detail view and are searchable. Common tags include car, dog, cat, tree, food, beach, mountain, and hundreds of others.
GPU Acceleration
If your server has an NVIDIA GPU, swap the machine learning image:
# In docker-compose.yml, replace the immich-machine-learning image:
image: ghcr.io/immich-app/immich-machine-learning:${IMMICH_VERSION:-release}-cuda
# Add GPU device access:
deploy:
resources:
reservations:
devices:
- driver: nvidia
count: 1
capabilities: [gpu]
AMD GPUs use the -rocm image variant. Intel Arc and integrated graphics use -openvino.
Part 5: External Libraries
If you have an existing photo archive on disk, Immich can index it without moving or copying files. This is the External Library feature — Immich reads files in place and adds them to your library.
In Administration → External Libraries, add a new library and point it at your existing photo directory:
/mnt/photos/archive
/mnt/photos/family
Immich scans the directory, generates thumbnails and ML embeddings, and adds the photos to your library. The original files are not modified or moved. This is ideal for migrating from a NAS-based photo archive or a Google Photos export (via Google Takeout).
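For Immich to read these files in place, the immich-server container needs the host paths mounted. A sketch of the extra volume entries, assuming the archive paths above — mounted read-only, since Immich only needs to index them:

```yaml
# docker-compose.yml — add the external library paths
# to the server container's volumes (read-only)
  immich-server:
    volumes:
      - ${UPLOAD_LOCATION}:/usr/src/app/upload
      - /etc/localtime:/etc/localtime:ro
      - /mnt/photos/archive:/mnt/photos/archive:ro
      - /mnt/photos/family:/mnt/photos/family:ro
```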
Part 6: Backups
Your Immich data lives in two places: the upload directory and the PostgreSQL database. Both need to be backed up.
Upload directory — standard file backup. Point Restic or your backup tool of choice at ${UPLOAD_LOCATION}.
PostgreSQL database — dump it on a schedule:
# Add to crontab or run as a Docker exec
docker exec immich_postgres pg_dumpall \
--username=postgres \
> /opt/immich/backups/immich-$(date +%Y%m%d).sql
Immich's own documentation recommends backing up the database daily and keeping at least 30 days of history. A single pg_dumpall output for a library with 100,000 photos is typically 1–3 GB uncompressed.
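One way to automate that schedule is a cron fragment — this assumes the paths used throughout this guide; `--clean --if-exists` makes the dump directly restorable, and retention is pruned to 30 days:

```shell
# /etc/cron.d/immich-backup — nightly dump at 02:00, prune after 30 days
# (note: % must be escaped as \% inside crontab entries)
0 2 * * * root docker exec immich_postgres pg_dumpall --clean --if-exists --username=postgres | gzip > /opt/immich/backups/immich-$(date +\%Y\%m\%d).sql.gz
15 2 * * * root find /opt/immich/backups -name 'immich-*.sql.gz' -mtime +30 -delete
```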
Comparison: Immich vs PhotoPrism vs LibrePhotos
| Feature | Immich | PhotoPrism | LibrePhotos |
|---|---|---|---|
| License | MIT | AGPL 3.0 | MIT |
| GitHub Stars | 90K+ | ~35K | ~7K |
| Native mobile apps | iOS + Android | PWA only | PWA only |
| Background auto-sync | Yes | No | No |
| CLIP search | Yes | Yes | Yes |
| Face recognition | Yes (clustering) | Yes | Yes |
| RAW support | Yes (basic) | Yes (excellent) | Limited |
| Video transcoding | Yes | Limited | Yes |
| Shared albums | Yes | Limited | Yes |
| Partner sharing | Yes | No | No |
| Memories feature | Yes | No | No |
| Duplicate detection | Yes | Yes | Limited |
| External libraries | Yes | Yes | No |
| Multiple users | Yes | Yes (paid tiers) | Yes |
| Active development | Very active | Active | Slower |
| Stability | Stable (v2.0+) | Stable | Variable |
| Google Photos UX | Very close | Different paradigm | Different paradigm |
When PhotoPrism is better: If RAW processing quality is your primary concern, PhotoPrism has more mature RAW handling. It also has a more polished timeline view for photographers who think in terms of days and shoots rather than albums. The AGPL license is a consideration if you're building a product on top of it.
When LibrePhotos is better: Honestly, for most use cases, it's not. LibrePhotos is an older project with slower development velocity. It's worth knowing about but harder to recommend over Immich in 2026.
Immich's weaknesses: RAW processing is functional but not as advanced as PhotoPrism. The project moved fast through 2023–2024, which meant some instability — v2.0.0 addressed this with the semantic versioning commitment. Always pin to a specific version tag in production rather than using release.
When to Use Immich
Good fit:
- You want Google Photos-like automatic sync from your phone
- Privacy matters and you want your photos stored only on hardware you control
- You have a library primarily of JPEGs and videos (camera roll, not professional RAW workflow)
- You want AI search without sending data to Google or Apple
- You're running Docker on a home server, NAS (Synology, QNAP, TrueNAS), or VPS
Not ideal:
- Professional photography workflow requiring precise RAW editing — use Darktable or Lightroom Classic
- Very limited hardware (under 4 GB RAM) — PhotoPrism is lighter
- You need deep Apple ecosystem integration — iCloud Photos handles this better
- You want a managed service with no maintenance — Google Photos is genuinely convenient for that
Methodology
This article is based on the Immich GitHub repository, official documentation at immich.app/docs, and the v2.0.0 release notes. System requirements and performance estimates are drawn from community reports in the Immich GitHub discussions and Discord server. The docker-compose.yml shown is based on the official Immich installation template with minor formatting adjustments for clarity. All AI features described run locally — no data is sent to external services.
For more self-hosting guides, see our Jellyfin media server setup, Nextcloud installation guide, Vaultwarden password manager setup, and our self-hosting backup guide for keeping your data safe.