# Self-Host Void Editor: Open Source Cursor Fork 2026
## TL;DR
Void Editor is an open-source VS Code fork that adds AI coding features — inline edits, chat, autocomplete, and agent mode — without routing your code through Cursor's proprietary servers. You bring your own LLM: connect to OpenAI, Anthropic Claude, Google Gemini, Groq, or run a local model via Ollama for completely private coding assistance. Backed by Y Combinator (W25 batch), Void is in active beta, with Linux and macOS builds available.
## Key Takeaways
- Void Editor: VS Code fork + AI features, open-source (Apache 2.0)
- No middleman: your code goes directly from Void to your chosen LLM provider — no Cursor/Windsurf backend
- Any LLM: OpenAI, Anthropic, Gemini, Groq, Ollama (local), LM Studio, any OpenAI-compatible endpoint
- Save Cursor's $20/month: Void is free; you pay only for the API tokens you use
- VS Code compatibility: most VS Code extensions work unchanged in Void
- Features: inline editing, chat panel, autocomplete, agent mode (codebase changes)
- Status (2026): Beta — Linux and macOS available, Windows in progress
- YC-backed: Glass Devtools, W25 batch
## Why Void Over Cursor?
Cursor is the dominant AI code editor in 2026. It's genuinely excellent. The reasons to consider Void instead:
Privacy concerns with Cursor:
- Your code is sent to Cursor's servers before reaching the LLM
- Cursor's privacy mode exists but is limited
- Cursor controls which models you access and at what cost
- $20/month regardless of actual usage
Void's approach:
- Your code goes directly to OpenAI/Anthropic/Ollama — no Void intermediary
- Complete privacy when using Ollama (everything stays on your machine)
- Pay-per-token with OpenAI/Anthropic (often cheaper for light users)
- Free for heavy users who run local models
- Full source code visibility — audit exactly what data is sent where
## Void vs. Cursor vs. Other Alternatives
| Feature | Void | Cursor | Continue.dev | Aider |
|---|---|---|---|---|
| Price | Free | $20/month | Free | Free |
| Open Source | ✅ Apache 2.0 | ❌ | ✅ Apache 2.0 | ✅ Apache 2.0 |
| VS Code fork | ✅ | ✅ | ❌ (extension) | ❌ (CLI) |
| Local LLM (Ollama) | ✅ | ⚠️ Limited | ✅ | ✅ |
| Inline editing | ✅ | ✅ | ✅ | ✅ |
| Chat panel | ✅ | ✅ | ✅ | ✅ |
| Agent mode | ✅ | ✅ | ✅ | ✅ |
| Autocomplete | ✅ | ✅ | ✅ | ❌ |
| Codebase indexing | ✅ | ✅ | ✅ | ✅ |
| Direct LLM connection | ✅ | ❌ | ✅ | ✅ |
| Windows support | In progress | ✅ | ✅ | ✅ |
## Installation
### macOS
```bash
# Current: download the .dmg from https://voideditor.com,
# then mount it and drag Void to your Applications folder.

# Homebrew cask (when available):
brew install --cask void
```
### Linux
```bash
# Download the latest .deb or .AppImage from the GitHub releases page:
# https://github.com/voideditor/void/releases
# (grab the exact asset name there; wildcards don't work inside a URL)

# Debian/Ubuntu, after downloading the .deb:
sudo dpkg -i ./void_*.deb

# AppImage (universal Linux), after downloading:
chmod +x Void_*.AppImage
./Void_*.AppImage

# Arch Linux (AUR; package name may vary):
yay -S void-editor-bin
```
### Build from Source
```bash
# Requirements: Node.js 18+, Python 3, Git, C++ build tools
git clone https://github.com/voideditor/void.git
cd void

# Install dependencies
npm install

# Build for your platform
npm run compile

# Launch
./scripts/code.sh   # Linux/macOS
```
## Configuring Your LLM Provider
Void's model configuration is in Settings → Void Settings:
### OpenAI

```json
{
  "void.models": [
    {
      "provider": "openai",
      "model": "gpt-4o",
      "apiKey": "sk-..."
    },
    {
      "provider": "openai",
      "model": "gpt-4o-mini",
      "apiKey": "sk-..."
    }
  ]
}
```
### Anthropic Claude

```json
{
  "void.models": [
    {
      "provider": "anthropic",
      "model": "claude-opus-4-5",
      "apiKey": "sk-ant-..."
    },
    {
      "provider": "anthropic",
      "model": "claude-sonnet-4-5",
      "apiKey": "sk-ant-..."
    }
  ]
}
```
### Ollama (Fully Local, Zero Cost)

```bash
# 1. Install Ollama
curl -fsSL https://ollama.com/install.sh | sh

# 2. Pull a coding model
ollama pull deepseek-coder-v2:16b   # best for code, 16B params
ollama pull qwen2.5-coder:7b        # good balance, 7B params
ollama pull codellama:13b           # Meta's coding model
```

3. Configure Void to use Ollama:

```json
{
  "void.models": [
    {
      "provider": "ollama",
      "model": "deepseek-coder-v2:16b",
      "endpoint": "http://localhost:11434"
    }
  ]
}
```
With Ollama configured, your entire coding workflow stays local — no tokens, no API bills, no data leaving your machine.
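To see concretely why nothing leaves your machine, here is a minimal Python sketch of a request against Ollama's local `/api/generate` HTTP endpoint: the target URL is `localhost`, and no API key appears anywhere. (Illustrative; Void's own internals may differ.)

```python
import json
from urllib.request import Request

def ollama_request(model: str, prompt: str,
                   endpoint: str = "http://localhost:11434") -> Request:
    """Build a POST to Ollama's /api/generate endpoint.

    Note what's absent: no API key, no third-party host. The only
    server involved is your own machine.
    """
    body = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return Request(
        f"{endpoint}/api/generate",
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = ollama_request("deepseek-coder-v2:16b",
                     "Write a binary search in Python.")
```

Sending it with `urllib.request.urlopen(req)` returns a JSON body whose `response` field holds the generated text.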
### Groq (Fast Free Tier)

```json
{
  "void.models": [
    {
      "provider": "groq",
      "model": "llama-3.3-70b-versatile",
      "apiKey": "gsk_..."
    }
  ]
}
```
Groq's free tier offers 14,400 requests/day at very high speed (500+ tokens/second), which is effectively free for most individual developers.
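A quick sanity check of those quota numbers (taken from the paragraph above) shows why the free tier rarely binds for one person:

```python
# Groq free-tier quota: 14,400 requests/day (figure from the text above).
requests_per_day = 14_400
per_minute = requests_per_day / 24 / 60   # a steady 10 requests/minute, all day

# At 500+ tokens/second, a typical 800-token reply streams in well under 2 s.
tokens_per_second = 500
seconds_per_reply = 800 / tokens_per_second

print(per_minute, seconds_per_reply)
```

Ten requests per minute, sustained around the clock, is far beyond an individual's interactive usage.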
## Core Features
### Inline Editing (Cmd/Ctrl + K)
1. Select code you want to modify
2. Press Cmd+K (macOS) or Ctrl+K (Linux/Windows)
3. Type your instruction: "Refactor this to use async/await"
4. Void sends the system prompt, file context, selected code, and your instruction directly to your configured LLM
5. Diff view shows proposed changes
6. Accept (Tab) or reject (Esc)
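Step 4 above is just an ordinary chat-completions call. A hypothetical Python sketch of how such a payload might be assembled (the function name, prompt wording, and message layout are illustrative assumptions, not Void's actual prompt format):

```python
def inline_edit_messages(file_context: str, selection: str,
                         instruction: str) -> list[dict]:
    """Assemble an OpenAI-style messages array mirroring the inline-edit
    flow: system prompt + file context + selected code + user instruction.
    (Hypothetical layout for illustration only.)"""
    system = ("You are a code editor. Return only the rewritten code "
              "for the selected region.")
    user = (
        f"File context:\n{file_context}\n\n"
        f"Selected code:\n{selection}\n\n"
        f"Instruction: {instruction}"
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]

msgs = inline_edit_messages(
    file_context="def fetch(url): ...",
    selection="def fetch(url): ...",
    instruction="Refactor this to use async/await",
)
```

The resulting list can be POSTed unmodified to any OpenAI-compatible endpoint, which is exactly what "direct LLM connection" means in practice.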
### Chat Panel (Cmd/Ctrl + L)
1. Open chat sidebar
2. Type your question about the codebase
3. Void automatically includes:
- Current file context
- Referenced files (@file syntax)
- Selected code
4. Direct API call to your LLM
5. Code blocks in response have "Apply" button for one-click insertion
### Agent Mode (Multi-File Edits)
1. Describe what you want to build or change
2. Void's agent:
- Reads relevant files in your codebase
- Plans required changes across multiple files
- Executes changes as a series of edits
- Shows diff of all changes before applying
3. Review and accept/reject the full changeset
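The review-then-apply loop above can be sketched in a few lines of Python. The changeset shape (a list of `(path, old_text, new_text)` edits) and both function names are hypothetical stand-ins for illustration, not Void's internals:

```python
import difflib
from pathlib import Path

# Hypothetical changeset shape: one (path, old_text, new_text) tuple per file.
def preview(changeset) -> str:
    """Render a unified diff of every proposed edit before anything is written."""
    lines = []
    for path, old, new in changeset:
        lines += difflib.unified_diff(
            old.splitlines(), new.splitlines(),
            fromfile=f"a/{path}", tofile=f"b/{path}", lineterm="",
        )
    return "\n".join(lines)

def apply_changeset(changeset, accepted: bool) -> int:
    """Write the edits only if the user accepted the full diff; return count."""
    if not accepted:
        return 0
    for path, _old, new in changeset:
        Path(path).write_text(new)
    return len(changeset)
```

The key property, shared by agent mode, is that rejecting the diff leaves every file untouched: no write happens until the whole changeset is accepted.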
## VS Code Extensions Compatibility
Because Void is a VS Code fork, most VS Code extensions work without modification:
Compatible extensions:
- ESLint, Prettier, GitLens ✅
- Docker, Kubernetes ✅
- Python, Go, Rust Analyzer ✅
- Tailwind CSS IntelliSense ✅
- Database clients (SQLite, Postgres) ✅
- Vim/Emacs keybindings ✅
- GitHub Copilot extension ⚠️ (installs, but redundant alongside Void's AI)
- Cursor-specific extensions ❌ (proprietary to Cursor)
## Migrating from Cursor
Cursor and Void are both VS Code forks. Your settings and keybindings transfer:
```bash
# Export Cursor settings (macOS paths shown; on Linux use ~/.config/Cursor/User)
cp ~/Library/Application\ Support/Cursor/User/settings.json ~/cursor-settings-backup.json
cp ~/Library/Application\ Support/Cursor/User/keybindings.json ~/cursor-keybindings-backup.json

# Import into Void (same path structure as VS Code; Linux: ~/.config/Void/User)
cp ~/cursor-settings-backup.json ~/Library/Application\ Support/Void/User/settings.json
cp ~/cursor-keybindings-backup.json ~/Library/Application\ Support/Void/User/keybindings.json

# Install your extensions in Void:
# View → Extensions → search for each extension you use
```
## Cost Comparison

Cursor Pro: $20/month = $240/year

Void + OpenAI GPT-4o ($2.50/1M input tokens, $10/1M output):
- Light usage (10 code chats/day): ~$5-15/month
- Heavy usage (50+ chats/day): ~$20-40/month

Void + Claude Sonnet ($3/1M input, $15/1M output):
- Light usage: ~$3-10/month
- Heavy usage: ~$15-30/month

Void + Ollama (local Llama 3.3 70B or Qwen 2.5):
- Cost: $0 on hardware you already own
- Hardware: 32GB+ RAM for 70B models; 16GB for 7B-13B models

Break-even vs. Cursor Pro:
- Light user: Void + cloud API is cheaper (~$5-15/month)
- Heavy user: roughly the same (~$20-40/month)
- Local models: Void is always cheaper once the hardware is paid for
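The per-month figures above follow from simple token arithmetic. A rough estimator (the per-chat token counts are assumptions; real chats that include file context will be larger):

```python
def monthly_cost(chats_per_day: float, in_tokens: int, out_tokens: int,
                 in_price: float, out_price: float, days: int = 30) -> float:
    """API cost per month in USD; prices are USD per 1M tokens."""
    chats = chats_per_day * days
    return (chats * in_tokens * in_price
            + chats * out_tokens * out_price) / 1_000_000

# Assumed light user: 10 chats/day, ~2,000 input + 800 output tokens per chat,
# at GPT-4o's quoted $2.50 / $10 per 1M tokens.
light = monthly_cost(10, 2_000, 800, 2.50, 10.0)   # $3.90/month
```

Even tripling the assumed prompt size keeps a light user near the low end of the quoted $5-15 range, well under Cursor's flat $20.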
## Current Limitations (Beta Status)

- Windows: not yet available (macOS and Linux only as of 2026)
- Stability: beta; occasional crashes, tracked in GitHub issues
- Autocomplete quality: good, but below Cursor's fine-tuned models
- Agent mode: functional, but less polished than Cursor's Composer
- No Void Cloud: everything is self-configured (a privacy advantage, but more setup work)
Void is the best option for privacy-focused developers who want full control over their LLM stack. For teams where Windows is primary or who want a polished out-of-the-box experience, Cursor is still the smoother option in 2026.
More open-source AI coding tools at OSSAlt.
Related: Best Open Source Cursor Alternatives 2026 · Self-Host Perplexica