Self-Host Hoarder: AI-Powered Bookmark Manager 2026
TL;DR
Hoarder (AGPL 3.0, ~10K GitHub stars, Next.js) is a self-hosted bookmark manager with AI superpowers. Save a URL and Hoarder automatically fetches the full page, takes a screenshot, extracts text, and uses a local AI (Ollama) to generate tags — zero manual organization required. Raindrop.io Pro ($3/month) offers similar AI features but sends your data to their cloud. Hoarder runs locally with complete privacy.
Key Takeaways
- Hoarder: AGPL 3.0, ~10K stars, Next.js — AI-first bookmark manager with automatic tagging
- Ollama integration: Uses local LLMs (llama3.2, mistral, etc.) for private AI tagging
- Full-page archiving: Saves snapshots of pages so they're available even if the original goes down
- Screenshots: Visual thumbnails for every bookmark — makes browsing bookmarks fast
- Browser extensions: Chrome and Firefox — one-click saving with instant AI processing
- Mobile apps: iOS and Android — save links from any app via share sheet
Hoarder vs Linkding vs Raindrop.io
| Feature | Hoarder | Linkding | Raindrop.io Pro |
|---|---|---|---|
| AI auto-tagging | Yes (Ollama) | No | Yes (cloud) |
| Full-page archive | Yes | No | Yes |
| Screenshots | Yes | No | Yes |
| Privacy | Local | Local | Cloud |
| Setup complexity | Medium | Simple | None |
| Mobile apps | Yes | PWA | Yes |
| Resource usage | ~500MB+ | ~50MB | N/A |
| Price | Free | Free | $3/mo |
Part 1: Docker Setup
Hoarder requires three services: the app itself, MeiliSearch (fast full-text search), and Chromium (for page archiving and screenshots).
```yaml
# docker-compose.yml
services:
  web:
    image: ghcr.io/hoarder-app/hoarder:latest
    container_name: hoarder
    restart: unless-stopped
    ports:
      - "3000:3000"
    volumes:
      - hoarder_data:/data
    environment:
      # Required:
      NEXTAUTH_SECRET: "${NEXTAUTH_SECRET}" # openssl rand -base64 36
      NEXTAUTH_URL: "https://hoarder.yourdomain.com"
      # MeiliSearch:
      MEILI_ADDR: "http://meilisearch:7700"
      MEILI_MASTER_KEY: "${MEILI_MASTER_KEY}" # openssl rand -base64 24
      # Chromium for archiving:
      BROWSER_WEB_URL: "http://chrome:9222"
      # Data directory:
      DATA_DIR: /data
      # Optional: Ollama for AI tagging:
      OLLAMA_BASE_URL: "http://ollama:11434"
      OLLAMA_MODEL: "llama3.2"
      # Optional: disable signup after the first user:
      # DISABLE_SIGNUPS: "true"
    depends_on:
      - meilisearch
      - chrome

  meilisearch:
    image: getmeili/meilisearch:v1.6
    container_name: hoarder-meilisearch
    restart: unless-stopped
    volumes:
      - meilisearch_data:/meili_data
    environment:
      MEILI_MASTER_KEY: "${MEILI_MASTER_KEY}"
      MEILI_NO_ANALYTICS: "true"

  chrome:
    image: gcr.io/zenika-hub/alpine-chrome:latest
    container_name: hoarder-chrome
    restart: unless-stopped
    command: >
      --no-sandbox
      --disable-gpu
      --disable-dev-shm-usage
      --remote-debugging-address=0.0.0.0
      --remote-debugging-port=9222
      --hide-scrollbars

  # Optional: local AI with Ollama
  ollama:
    image: ollama/ollama:latest
    container_name: hoarder-ollama
    restart: unless-stopped
    volumes:
      - ollama_data:/root/.ollama
    # Uncomment for GPU:
    # deploy:
    #   resources:
    #     reservations:
    #       devices:
    #         - driver: nvidia
    #           count: 1
    #           capabilities: [gpu]

volumes:
  hoarder_data:
  meilisearch_data:
  ollama_data:
```
```shell
# Generate secrets:
echo "NEXTAUTH_SECRET=$(openssl rand -base64 36)" >> .env
echo "MEILI_MASTER_KEY=$(openssl rand -base64 24)" >> .env

docker compose up -d

# Pull the AI model (after Ollama starts):
docker exec hoarder-ollama ollama pull llama3.2
```
Part 2: HTTPS with Caddy
```
hoarder.yourdomain.com {
    reverse_proxy localhost:3000
}
```
Visit https://hoarder.yourdomain.com → create your account.
Part 3: Browser Extensions
Chrome
- Install the Hoarder extension from the Chrome Web Store
- Extension → Options:
  - Server URL: `https://hoarder.yourdomain.com`
  - Click Login → authenticates with your account
- Click the extension icon on any page → Save
Firefox
- Install from Mozilla Add-ons: search "Hoarder"
- Same configuration
When you save a link:
- Hoarder immediately stores the URL
- Chromium fetches the full page + screenshot (background)
- Ollama generates tags from the content (background)
- Full text becomes searchable
Part 4: Mobile Apps
iOS
- Install Hoarder for iOS from the App Store
- Server URL: `https://hoarder.yourdomain.com`
- Login with your credentials
- Share Sheet → Hoarder — save any link from Safari, Twitter, etc.
Android
- Install from Play Store: search "Hoarder"
- Same setup
- Share links from any app directly to Hoarder
Part 5: AI Tagging Configuration
Using Ollama (local, private)
```yaml
# In docker-compose.yml:
environment:
  OLLAMA_BASE_URL: "http://ollama:11434"
  OLLAMA_MODEL: "llama3.2" # or mistral, phi3, etc.
```
```shell
# Available models for tagging (CPU-friendly):
docker exec hoarder-ollama ollama pull llama3.2   # 2GB, good quality
docker exec hoarder-ollama ollama pull phi3:mini  # 2.3GB, fast
docker exec hoarder-ollama ollama pull mistral:7b # 4.1GB, best quality

# To switch models, update the OLLAMA_MODEL env var and restart the hoarder container:
docker compose restart web
```
Using OpenAI API (if you prefer cloud)
```yaml
environment:
  OPENAI_API_KEY: "sk-..."
  OPENAI_BASE_URL: "https://api.openai.com/v1" # or any OpenAI-compatible API
  INFERENCE_TEXT_MODEL: "gpt-4o-mini"
  INFERENCE_IMAGE_MODEL: "gpt-4o-mini"
```
Prompt customization
Hoarder uses a default system prompt for tag generation. Override in settings:
```
# Settings → AI → Custom Prompt
You are a bookmark tagging assistant. Generate 3-5 short, lowercase tags
for the provided content. Focus on: topic, type (article/tool/video/paper),
and primary programming language or technology stack.
Return only comma-separated tags.
```
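Even with a strict prompt, model output can be messy: inconsistent capitalization, stray punctuation, duplicates. If you post-process tags yourself, say when scripting against the API, a small normalizer helps. This is an illustrative sketch, not something Hoarder ships:

```python
import re

def normalize_tags(raw: str, max_tags: int = 5) -> list[str]:
    """Turn a model's comma-separated reply into clean, deduplicated tags."""
    tags = []
    for part in raw.split(","):
        tag = part.strip().lower()
        tag = re.sub(r"[^a-z0-9+#./ -]", "", tag)  # drop stray punctuation
        tag = re.sub(r"\s+", "-", tag)             # spaces → hyphens
        if tag and tag not in tags:                # deduplicate, keep order
            tags.append(tag)
    return tags[:max_tags]

print(normalize_tags("Docker,  docker , Self-Hosting, Kubernetes!, devops"))
# → ['docker', 'self-hosting', 'kubernetes', 'devops']
```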
Part 6: Lists and Organization
Lists (manual organization)
Beyond AI tags, organize bookmarks into lists:
- Lists → + New List: e.g. Reading Queue, Work Research, Tools to Try
- Drag bookmarks into lists, or save directly to a list from the extension
- Lists are shareable — each gets a public URL
Filtering
```
# In the search bar:
#docker      — filter by tag
list:work    — filter by list
is:archived  — show archived
is:unread    — show unread

# Combine:
#kubernetes is:unread
```
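These operators compose by simple prefix matching, which a toy parser makes concrete. This is an illustration of the syntax above, not Hoarder's actual implementation:

```python
def parse_query(query: str) -> dict:
    """Split a search query into tags, lists, status flags, and free text."""
    out = {"tags": [], "lists": [], "status": [], "text": []}
    for token in query.split():
        if token.startswith("#"):
            out["tags"].append(token[1:])
        elif token.startswith("list:"):
            out["lists"].append(token[len("list:"):])
        elif token.startswith("is:"):
            out["status"].append(token[len("is:"):])
        else:
            out["text"].append(token)   # plain full-text term
    return out

print(parse_query("#kubernetes is:unread operator guide"))
# → {'tags': ['kubernetes'], 'lists': [], 'status': ['unread'], 'text': ['operator', 'guide']}
```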
Part 7: REST API
```shell
# Get an API key:
# Settings → API Keys → Generate
API_KEY="your-api-key"
BASE="https://hoarder.yourdomain.com"

# List bookmarks:
curl "$BASE/api/v1/bookmarks" \
  -H "Authorization: Bearer $API_KEY" | jq '.bookmarks[].title'

# Add a bookmark:
curl -X POST "$BASE/api/v1/bookmarks" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "type": "link",
    "url": "https://example.com",
    "title": "Optional override title"
  }'

# Get a bookmark with its AI tags:
curl "$BASE/api/v1/bookmarks/BOOKMARK_ID" \
  -H "Authorization: Bearer $API_KEY" | jq '{title: .title, tags: .tags}'

# Archive a bookmark:
curl -X PUT "$BASE/api/v1/bookmarks/BOOKMARK_ID" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"archived": true}'
```
Part 8: Import from Other Services
Import from Pocket
- Pocket → Export → downloads ril_export.html
- Hoarder → Settings → Import → select the Pocket HTML file
- All links import with their original tags preserved
Import from Raindrop.io
- Raindrop.io → Export → CSV or HTML
- Hoarder → Settings → Import → select file
Import bookmarks from browser
- Chrome/Firefox: Bookmarks Manager → Export → HTML
- Hoarder → Settings → Import → select HTML file
All imported bookmarks are queued for AI processing automatically.
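Browser exports (and Pocket's HTML export) use the Netscape bookmark file format, where each link is an `<A HREF=...>` anchor. If you want to inspect or pre-filter an export before importing, Python's stdlib parser is enough. A rough sketch that assumes one title text node per anchor:

```python
from html.parser import HTMLParser

class BookmarkExportParser(HTMLParser):
    """Collect (url, title) pairs from a Netscape bookmarks HTML export."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None

    def handle_starttag(self, tag, attrs):
        # HTMLParser lowercases tag and attribute names for us.
        if tag == "a":
            self._href = dict(attrs).get("href")

    def handle_data(self, data):
        # Text immediately after an <a ...> open tag is the bookmark title.
        if self._href:
            self.links.append((self._href, data.strip()))
            self._href = None

export = '<DL><DT><A HREF="https://example.com" ADD_DATE="0">Example</A></DL>'
parser = BookmarkExportParser()
parser.feed(export)
print(parser.links)
# → [('https://example.com', 'Example')]
```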
Resource Requirements
| Component | Minimum RAM | Recommended RAM |
|---|---|---|
| Hoarder web | 256MB | 512MB |
| MeiliSearch | 256MB | 512MB |
| Chromium | 512MB | 1GB |
| Ollama (llama3.2) | 4GB | 8GB |
| Total | ~1GB (no Ollama) | ~6-10GB (with Ollama) |
If RAM is limited, use OpenAI API instead of Ollama — it requires no local inference.
Maintenance
```shell
# Update:
docker compose pull
docker compose up -d

# Backup the data volume:
tar -czf hoarder-backup-$(date +%Y%m%d).tar.gz \
  $(docker volume inspect hoarder_hoarder_data --format '{{.Mountpoint}}')

# Backup MeiliSearch indexes:
tar -czf hoarder-search-$(date +%Y%m%d).tar.gz \
  $(docker volume inspect hoarder_meilisearch_data --format '{{.Mountpoint}}')

# Trigger re-processing of all bookmarks (after an AI model change):
# Settings → Re-run AI tagging on all bookmarks

# Logs:
docker compose logs -f web
docker compose logs -f chrome
```
See also: Linkding — simpler, minimal bookmark manager with lower resource requirements
See all open source productivity tools at OSSAlt.com/categories/productivity.