
Self-Host OpenClaw: Personal AI Assistant 2026

OSSAlt Team
Tags: openclaw · self-hosted · ai-assistant · docker · llm


TL;DR

OpenClaw is the personal AI assistant that went from 9,000 to 247,000+ GitHub stars in under 6 months — the fastest any open-source project has ever accumulated that kind of community signal. It runs on your own hardware, connects to messaging apps you already use (WhatsApp, Telegram, Slack, Discord, iMessage), and routes your queries to any LLM provider you choose. This guide covers the full Docker Compose self-hosting setup, from zero to a running AI agent in under 30 minutes.

Key Takeaways

  • 247K+ GitHub stars — fastest-growing OSS project in history, surpassing even Docker and Node.js at comparable stages
  • Self-hosted, no subscription — bring your own API key (Anthropic, OpenAI, Ollama, or any OpenAI-compatible endpoint)
  • 50+ messaging integrations — WhatsApp, Telegram, Slack, Discord, Signal, iMessage, Teams, Matrix, LINE, and more
  • Sandbox mode — tool executions run in isolated sub-containers, keeping your host secure
  • Minimum 2 GB RAM, 1 vCPU — runs on any $5/month VPS
  • Skill ecosystem — install community skills to extend OpenClaw beyond built-in capabilities

Why OpenClaw Is Different

Most AI assistants live in a browser tab. OpenClaw lives where you already communicate. The core concept: a self-hosted runtime that connects your AI provider of choice to your messaging apps as a first-class integration — not a webhook hack.

The result: you can message your own AI from WhatsApp at 11pm, ask it to check your calendar, draft a reply to an email, and run a web search — all without leaving your phone or giving any third party access to your data.

The explosion from 9,000 to 247,000 stars (January–March 2026) happened because OpenClaw solved the gap between "I want an AI like Claude at work" and "I don't want another SaaS subscription or cloud service that owns my data." The project even has an official AWS Lightsail AMI and DigitalOcean 1-Click install, which signals serious infrastructure investment for an OSS project.


Prerequisites

  • A Linux VPS (Ubuntu 22.04+ recommended) or Docker Desktop on macOS/Windows
  • Docker Engine + Docker Compose v2
  • Minimum: 2 GB RAM, 1 vCPU, 5 GB disk (4 GB RAM recommended for production with multiple integrations)
  • An API key for at least one LLM provider (Anthropic, OpenAI, or a local Ollama instance)

Step 1: Directory Setup

Create a dedicated directory and pull the official Docker Compose file:

mkdir openclaw && cd openclaw
curl -fsSL https://docs.openclaw.ai/install/docker-compose.yml -o docker-compose.yml

Or create the Compose file manually:

# docker-compose.yml

services:
  openclaw:
    image: openclaw/openclaw:latest
    restart: unless-stopped
    ports:
      - "3100:3100"       # Web UI + API gateway
    volumes:
      - ${OPENCLAW_CONFIG_DIR:-./config}:/home/node/.openclaw
      - ${OPENCLAW_WORKSPACE_DIR:-./workspace}:/home/node/.openclaw/workspace
    environment:
      - NODE_ENV=production
      - SETUP_PASSWORD=${SETUP_PASSWORD}
      - OPENCLAW_GATEWAY_TOKEN=${OPENCLAW_GATEWAY_TOKEN}
      - OPENCLAW_SANDBOX=${OPENCLAW_SANDBOX:-1}
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
      # OR: OPENAI_API_KEY=${OPENAI_API_KEY}
      # OR: OLLAMA_BASE_URL=http://host.docker.internal:11434
    extra_hosts:
      - "host.docker.internal:host-gateway"  # Required on Linux for Ollama
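
Before moving on, it can help to confirm the Compose file parses cleanly. `docker compose config` is the standard Compose v2 validator; it resolves variables and reports syntax errors without starting anything:

```shell
# validate docker-compose.yml without starting containers
if docker compose config --quiet; then
  echo "compose file OK"
fi
```

If this prints an error about an unset variable, you likely haven't created the .env file yet (Step 2).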

Step 2: Environment Configuration

Create your .env file:

# .env — never commit this to git
SETUP_PASSWORD=your-long-random-setup-password-here
OPENCLAW_GATEWAY_TOKEN=your-long-random-gateway-token-here
OPENCLAW_SANDBOX=1

# Choose ONE LLM provider:
ANTHROPIC_API_KEY=sk-ant-...
# OPENAI_API_KEY=sk-...

# Optional: local Ollama
# OLLAMA_BASE_URL=http://host.docker.internal:11434

Security notes:

  • SETUP_PASSWORD is used for the initial web UI setup — use a strong random password (32+ chars)
  • OPENCLAW_GATEWAY_TOKEN authenticates API requests — treat like an API key
  • OPENCLAW_SANDBOX=1 is strongly recommended — it runs tool executions in isolated sub-containers, preventing any tool from accessing your host filesystem

Generate secure values:

echo "SETUP_PASSWORD=$(openssl rand -base64 32)"
echo "OPENCLAW_GATEWAY_TOKEN=$(openssl rand -base64 32)"
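
To write the generated values straight into .env instead of copy-pasting, a small sketch (assumes you run it from the openclaw directory created in Step 1):

```shell
# append generated secrets to .env and restrict permissions to the owner
{
  echo "SETUP_PASSWORD=$(openssl rand -base64 32)"
  echo "OPENCLAW_GATEWAY_TOKEN=$(openssl rand -base64 32)"
} >> .env
chmod 600 .env
```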

Step 3: Start OpenClaw

docker compose up -d

# Verify it started
docker compose logs -f openclaw
# Look for: "OpenClaw is running on port 3100"

Open the web UI at http://your-server-ip:3100 and complete the setup wizard using your SETUP_PASSWORD. Until the reverse proxy from Step 6 is in place, consider firewalling port 3100 so the setup page isn't exposed to the public internet.


Step 4: Connect Your First Messaging Platform

OpenClaw's power comes from connecting to the platforms you already use. Here are the three most popular integrations:

Telegram (Easiest — 5 minutes)

  1. Message @BotFather on Telegram
  2. Send /newbot, choose a name, copy the token
  3. In the OpenClaw web UI → Channels → Telegram → paste token
  4. Message your new bot — it routes to your configured AI
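
If the bot doesn't respond, you can verify the token directly against the Telegram Bot API before digging into OpenClaw (assumes your token is exported as TELEGRAM_BOT_TOKEN; the placeholder below is not a real token):

```shell
# a valid token returns {"ok":true,...} with your bot's username
TELEGRAM_BOT_TOKEN="${TELEGRAM_BOT_TOKEN:-123456:PLACEHOLDER}"
curl -s --max-time 10 "https://api.telegram.org/bot${TELEGRAM_BOT_TOKEN}/getMe" \
  || echo "request failed (check network/token)"
```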

Slack

  1. Go to api.slack.com/apps → Create New App → From scratch
  2. Enable Socket Mode, generate an App-Level Token (scope: connections:write)
  3. Enable Event Subscriptions, subscribe to message.im and app_mention events
  4. Install to your workspace, copy the Bot Token (xoxb-...)
  5. In OpenClaw → Channels → Slack → paste both tokens
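
To sanity-check the bot token outside OpenClaw, Slack's auth.test endpoint accepts any bot token (assumes SLACK_BOT_TOKEN holds your xoxb- token; the placeholder below is not real):

```shell
# returns {"ok":true,...} for a valid token, {"ok":false,"error":"invalid_auth"} otherwise
SLACK_BOT_TOKEN="${SLACK_BOT_TOKEN:-xoxb-placeholder}"
curl -s --max-time 10 -H "Authorization: Bearer ${SLACK_BOT_TOKEN}" \
  https://slack.com/api/auth.test || echo "request failed (check network)"
```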

Discord

  1. Create a new app at discord.com/developers
  2. Under Bot, enable Message Content Intent and copy the token
  3. OAuth2 URL with scopes: bot, permissions: Send Messages, Read Message History
  4. Invite to your server
  5. In OpenClaw → Channels → Discord → paste token + server ID
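
The permissions integer in the OAuth2 invite URL is a bit-field; for the two permissions above it can be computed by hand (YOUR_CLIENT_ID is a placeholder for your application's client ID):

```shell
# Send Messages = 2048 (1<<11), Read Message History = 65536 (1<<16)
PERMS=$((2048 + 65536))
echo "https://discord.com/api/oauth2/authorize?client_id=YOUR_CLIENT_ID&scope=bot&permissions=${PERMS}"
# prints the invite URL with permissions=67584
```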

WhatsApp (via WhatsApp Business API)

WhatsApp requires a Meta Business Account and WhatsApp Business API access. Once set up:

  1. OpenClaw → Channels → WhatsApp → enter your phone number ID and permanent token

Step 5: Configure Your LLM Provider

In the OpenClaw web UI → Settings → LLM Provider:

| Provider | Best for | Notes |
|---|---|---|
| Anthropic Claude | Complex reasoning, long context | claude-3-7-sonnet recommended |
| OpenAI GPT-4o | General purpose, tool use | Widely supported |
| Ollama (local) | Privacy, no API costs | Requires local Llama/Mistral |
| Groq | Speed | Fast inference, rate limits |

For local-first privacy, point OpenClaw at a local Ollama instance:

OLLAMA_BASE_URL=http://host.docker.internal:11434
OLLAMA_MODEL=llama3.2:latest
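
If you go the Ollama route, pull the model on the host first and confirm the API answers (standard Ollama commands; the model tag matches the config above):

```shell
# run on the host, not inside the OpenClaw container
if command -v ollama >/dev/null 2>&1; then
  ollama pull llama3.2:latest              # fetch the model referenced above
  curl -s http://localhost:11434/api/tags  # lists locally available models
fi
```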

Step 6: Reverse Proxy with SSL (Production)

For a public-facing deployment, put OpenClaw behind Nginx or Caddy:

# /etc/caddy/Caddyfile
openclaw.yourdomain.com {
    reverse_proxy localhost:3100
}

# reload to apply; Caddy obtains and renews Let's Encrypt certificates automatically
sudo systemctl reload caddy

Nginx

server {
    listen 443 ssl;
    server_name openclaw.yourdomain.com;

    ssl_certificate /etc/letsencrypt/live/openclaw.yourdomain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/openclaw.yourdomain.com/privkey.pem;

    location / {
        proxy_pass http://localhost:3100;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
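
The Nginx config above assumes the Let's Encrypt certificates already exist. With certbot's nginx plugin they can be obtained in one command (standard certbot usage; substitute your domain, and run it once DNS points at this server):

```shell
# obtains a Let's Encrypt certificate; certbot also sets up auto-renewal
if command -v certbot >/dev/null 2>&1; then
  sudo certbot certonly --nginx -d openclaw.yourdomain.com
fi
```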

Skills: Extending OpenClaw

OpenClaw uses a skills system (similar in concept to Claude Code skills) to add capabilities beyond built-in tools. Install community skills from the OpenClaw Skills Registry:

# Inside the OpenClaw web UI → Skills → Browse Registry
# Or via API:
curl -X POST http://localhost:3100/api/skills/install \
  -H "Authorization: Bearer $OPENCLAW_GATEWAY_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"skill": "calendar-google"}'

Popular community skills:

  • calendar-google — read/write Google Calendar
  • email-gmail — draft and send emails
  • github-tools — create issues, PRs, check CI
  • home-assistant — control smart home devices
  • file-manager — read/write files in the workspace directory

OpenClaw vs Other Self-Hosted AI Assistants

|  | OpenClaw | Open WebUI | Jan | LibreChat |
|---|---|---|---|---|
| Messaging integrations | 50+ | None | None | Discord only |
| Self-hosted | ✓ | ✓ | ✓ | ✓ |
| GitHub stars | 247K | 52K | 24K | 19K |
| Proactive notifications | ✓ | ✗ | ✗ | ✗ |
| Skills/plugins | ✓ | Extensions | ✗ | Plugins |
| Runs offline | With Ollama | With Ollama | ✓ | With Ollama |
| Setup complexity | Medium | Low | Low | Medium |

OpenClaw's key differentiator: it comes to you (via messaging), rather than requiring you to open a browser tab. For automation, scheduling, and AI that works proactively, it's in a different category from chat UI tools like Open WebUI or Jan.


Troubleshooting Common Issues

Container won't start — port 3100 in use:

lsof -i :3100
# Kill the conflicting process or change OpenClaw port in docker-compose.yml
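
If you'd rather move OpenClaw than kill the conflicting process, change the published host port in docker-compose.yml (the container side stays 3100; 3101 here is just an arbitrary free port):

```yaml
services:
  openclaw:
    ports:
      - "3101:3100"   # host port 3101 -> container port 3100
```

Then browse to http://your-server-ip:3101 instead.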

Telegram messages not received:

  • Check bot is not blocked
  • Verify the token is correct in OpenClaw settings
  • Inspect logs: docker compose logs openclaw | grep telegram

Sandbox mode breaking tool execution:

  • Ensure the Docker socket is accessible: ls -la /var/run/docker.sock
  • Add to docker-compose.yml volumes: - /var/run/docker.sock:/var/run/docker.sock
  • Caution: exposing the Docker socket gives the container root-equivalent access to the host, so only do this if you trust every skill you install

Ollama not reachable:

  • On Linux, use host.docker.internal with extra_hosts in docker-compose.yml (shown above)
  • Verify Ollama is running: curl http://localhost:11434/api/tags

Recommendations

Use OpenClaw if:

  • You want AI accessible from your phone without opening new apps
  • Privacy is a priority — all data stays on your server
  • You want proactive AI (scheduled tasks, notifications, reminders)
  • You already use Telegram/Discord/Slack for communication

Stick with cloud alternatives if:

  • You need the absolute latest AI capabilities with zero setup
  • Your team needs shared AI workspace features (Notion-style)
  • You're not comfortable managing a VPS

Methodology

  • Sources: github.com/openclaw/openclaw (247K+ stars), docs.openclaw.ai/install/docker, AWS blog (Lightsail AMI launch), DigitalOcean 1-Click docs, Simon Willison's TIL on OpenClaw Docker, Towards AI Medium article (Feb 2026)
  • Data as of: March 2026

Comparing self-hosted AI assistants? See LocalAI vs Ollama vs LM Studio 2026 and Open WebUI vs LibreChat vs Jan 2026.

Want to automate your AI workflows without self-hosting? Compare n8n vs Zapier alternatives 2026.
