Install Leapable
Download the installer for your OS. It detects Docker, pulls the container, and configures your AI clients automatically. No terminal required.
macOS
macOS 12 Monterey or later
~10 MB installer · Requires Docker Desktop or OrbStack
Linux
Ubuntu/Debian/Fedora/Arch
~2 MB .deb · Requires Docker Engine
What the installer does
1. System check
Detects Docker, checks for 30 GB of free disk space, and verifies other prerequisites. Prompts you to install Docker if it's missing.
2. Pull container
Downloads leapable/mcp:latest from Docker Hub with all 157 AI tools pre-installed.
3. Configure clients
Auto-detects Claude Desktop, Claude Code, Cursor, VS Code, Windsurf, Continue.dev, Zed, JetBrains IDEs, and Amazon Q. Writes MCP config for each.
4. Launch
Starts the MCP server on localhost:4100. Your AI clients instantly gain access to all 157 Leapable tools.
Requirements
CPU & Memory
- 8 GB RAM (minimum)
- x86_64 or ARM64
- 2+ CPU cores
Disk Space
- 30 GB free
- Container: ~3.5 GB
- Your databases: variable
Docker
- Docker Desktop (Win/Mac)
- Docker Engine (Linux)
- Installer can install it
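If you'd rather confirm the prerequisites yourself before running the installer, the same checks can be approximated in a shell. The installer performs its own equivalents; these commands are illustrative:

```shell
# Is the Docker CLI on PATH?
command -v docker >/dev/null 2>&1 && echo "docker: found" || echo "docker: missing"

# How much space is free on the current volume? (aim for ~30 GB)
df -h . | tail -n 1
```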
Prefer the terminal?
If you already have Docker running, you can start Leapable with one command:
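A hypothetical sketch of that command, assuming defaults: only the image name (`leapable/mcp:latest`) and port (`4100`) come from this page; the container name, volume name, and mount path are illustrative.

```shell
# Pull and start the Leapable MCP server in the background.
# -v maps a named volume so your databases survive container updates;
# the volume name "leapable-data" and mount path "/data" are assumptions.
docker run -d --name leapable \
  -p 4100:4100 \
  -v leapable-data:/data \
  leapable/mcp:latest
```

Verify it came up with `docker ps`, and inspect startup output with `docker logs leapable`.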
Then add the MCP server to your client. See the For Subscribers page for full MCP config examples.
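The exact MCP config schema varies by client; a minimal sketch for a client that accepts a server URL might look like the following (the server name `leapable` and the field names are illustrative, so use the For Subscribers page for the exact snippet for your client):

```json
{
  "mcpServers": {
    "leapable": {
      "url": "http://localhost:4100"
    }
  }
}
```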
Frequently asked questions
Does the installer need my data?
No. The installer detects Docker, pulls the container, and configures your AI clients. Your databases stay in a local Docker volume. Leapable never sees your data.
Can I use this without Docker?
Not yet. Docker is required so we can ship all 157 tools, Python workers, and ML models as one reproducible unit. Bare-metal install is on our roadmap.
What about updates?
The installer never deletes your data. Re-running it updates the container while preserving your databases and account balance via a stable machine ID.
How do .md and .txt files work?
Text files are processed locally for free — no cloud calls, no charges. PDFs and images route to our cloud GPU pipeline (included in your subscription credits).
Which AI clients are supported?
Claude Desktop, Claude Code, Cursor, VS Code + Copilot, Windsurf, Continue.dev, Zed, JetBrains IDEs (IntelliJ/PyCharm/WebStorm), and Amazon Q. Any MCP-compatible client works — we use the standard Model Context Protocol.
I'm having trouble — where can I get help?
Open an issue on GitHub.
Common issues: Docker not running, WSL2 not installed on Windows (run `wsl --install`), or port 4100 already in use.
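To see which process is holding port 4100 (macOS ships `lsof`; most Linux distros ship it or can install it):

```shell
# Prints the owning process if something is listening on 4100;
# prints nothing if the port is free. On Linux, `ss -ltnp` works too.
lsof -nP -iTCP:4100 -sTCP:LISTEN || true
```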