How to Use OpenCode with Omniside: One API Key for All Models
Step-by-step guide to configuring OpenCode with Omniside as a custom OpenAI-compatible provider. Works on Ubuntu, CentOS, Debian, macOS, and Windows.
OpenCode is one of the hottest open-source AI coding agents right now. It runs in your terminal, connects to multiple LLM providers, and lets you build, debug, and explore code with AI — all without leaving the command line.
The one thing that trips people up? Setting up the provider.
OpenCode supports 75+ providers, but if you want to access GPT-4o, Claude, Gemini, Grok, and DeepSeek through a single endpoint, you need a unified API gateway. That’s where Omniside comes in — one API key, one base URL, all major models.
This guide covers installation and configuration on every major platform: Ubuntu, Debian, CentOS, macOS, and Windows.
Prerequisites
Before you start, you’ll need two things:
- A terminal — any modern terminal works (Terminal.app, WezTerm, Alacritty, Windows Terminal, etc.)
- An Omniside API Key — sign up at platform.omniside.ai (takes 30 seconds, no minimum deposit)
The rest is handled in the steps below.
Step 1: Install OpenCode
macOS
The fastest way is Homebrew:
```
brew install anomalyco/tap/opencode
```

Or use the universal install script:

```
curl -fsSL https://opencode.ai/install | bash
```

Ubuntu / Debian

```
curl -fsSL https://opencode.ai/install | bash
```

Or install via npm (requires Node.js 18+):

```
# Install Node.js if needed
curl -fsSL https://deb.nodesource.com/setup_22.x | sudo -E bash -
sudo apt-get install -y nodejs

# Install OpenCode
npm install -g opencode-ai
```

CentOS / RHEL / Fedora

```
curl -fsSL https://opencode.ai/install | bash
```

Or via npm:

```
# Install Node.js if needed
curl -fsSL https://rpm.nodesource.com/setup_22.x | sudo bash -
sudo yum install -y nodejs   # CentOS/RHEL
# or: sudo dnf install -y nodejs   # Fedora

# Install OpenCode
npm install -g opencode-ai
```

Arch Linux

```
sudo pacman -S opencode
# or from AUR:
paru -S opencode-bin
```

Windows
For the best experience, use WSL (Windows Subsystem for Linux):
```
# In PowerShell (admin), install WSL first if needed
wsl --install

# Then inside WSL, run:
curl -fsSL https://opencode.ai/install | bash
```

Native Windows options are also available:

```
# Chocolatey
choco install opencode

# Scoop
scoop install opencode

# npm
npm install -g opencode-ai
```

Verify Installation
After installation, confirm OpenCode is working:
```
opencode --version
```

You should see the version number printed. If not, make sure the binary is in your PATH.
Step 2: Get Your Omniside API Key
- Go to platform.omniside.ai
- Sign up or log in
- Navigate to API Keys
- Click Create API Key
- Copy the key (starts with sk-)
Keep this key handy — you’ll need it in the next step.
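Before wiring the key into OpenCode, you can smoke-test it directly. This is a minimal sketch, assuming Omniside exposes the standard OpenAI-compatible GET /v1/models endpoint (the same base URL used throughout this guide); it only sends a request if the key is already exported in your shell:

```shell
# Hedged smoke test: assumes the OpenAI-compatible /v1/models endpoint.
# Skips cleanly when the key isn't exported yet.
if [ -z "${OMNISIDE_API_KEY:-}" ]; then
  status="skipped: OMNISIDE_API_KEY not set"
else
  # -s silent, -o discard the body, -w print only the HTTP status code
  status="$(curl -s -o /dev/null -w '%{http_code}' \
    -H "Authorization: Bearer $OMNISIDE_API_KEY" \
    https://api.omniside.ai/v1/models)"
fi
echo "$status"
```

A 200 means the key authenticates; a 401 usually means the key was copied incorrectly. If the endpoint path differs on Omniside's side, skip this check and let OpenCode's /connect flow (Step 4) handle authentication.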
Step 3: Configure Omniside as a Custom Provider
OpenCode uses an opencode.json config file. It can live in two places:
- Global (all projects): ~/.config/opencode/opencode.json
- Per-project: opencode.json in your project root
We’ll set it up globally so every project uses Omniside by default.
Create the Config
```
mkdir -p ~/.config/opencode
```

Now create the config file. Choose one of the three presets below based on your needs.
Option A: Full — All Models with Fallback
Best experience. Uses Claude Sonnet 4.5 as primary with GPT-4o, Gemini, and DeepSeek as fallbacks.
```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "omniside": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Omniside",
      "options": {
        "baseURL": "https://api.omniside.ai/v1"
      },
      "models": {
        "claude-sonnet-4-5-20250514": {
          "name": "Claude Sonnet 4.5",
          "limit": { "context": 200000, "output": 8192 }
        },
        "gpt-4o": {
          "name": "GPT-4o",
          "limit": { "context": 128000, "output": 16384 }
        },
        "gemini-2.5-pro": {
          "name": "Gemini 2.5 Pro",
          "limit": { "context": 1000000, "output": 65536 }
        },
        "deepseek-chat": {
          "name": "DeepSeek V3",
          "limit": { "context": 64000, "output": 8192 }
        },
        "grok-3": {
          "name": "Grok 3",
          "limit": { "context": 131072, "output": 8192 }
        }
      }
    }
  }
}
```

Option B: Smart — Best Quality-to-Cost Ratio
Uses Claude Sonnet 4.5 as primary with GPT-4o as fallback. Recommended for most developers.
```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "omniside": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Omniside",
      "options": {
        "baseURL": "https://api.omniside.ai/v1"
      },
      "models": {
        "claude-sonnet-4-5-20250514": {
          "name": "Claude Sonnet 4.5",
          "limit": { "context": 200000, "output": 8192 }
        },
        "gpt-4o": {
          "name": "GPT-4o",
          "limit": { "context": 128000, "output": 16384 }
        }
      }
    }
  }
}
```

Option C: Budget — Lowest Cost
Uses DeepSeek V3 as primary. Great for everyday coding tasks.
```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "omniside": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Omniside",
      "options": {
        "baseURL": "https://api.omniside.ai/v1"
      },
      "models": {
        "deepseek-chat": {
          "name": "DeepSeek V3",
          "limit": { "context": 64000, "output": 8192 }
        },
        "gpt-4o-mini": {
          "name": "GPT-4o Mini",
          "limit": { "context": 128000, "output": 16384 }
        }
      }
    }
  }
}
```

Save the Config
Save whichever option you chose to ~/.config/opencode/opencode.json.
On Linux and macOS, you can do this in one command (using Option B as an example):
```
cat > ~/.config/opencode/opencode.json << 'EOF'
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "omniside": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Omniside",
      "options": {
        "baseURL": "https://api.omniside.ai/v1"
      },
      "models": {
        "claude-sonnet-4-5-20250514": {
          "name": "Claude Sonnet 4.5",
          "limit": { "context": 200000, "output": 8192 }
        },
        "gpt-4o": {
          "name": "GPT-4o",
          "limit": { "context": 128000, "output": 16384 }
        }
      }
    }
  }
}
EOF
```

On Windows (PowerShell):
```powershell
$configDir = "$env:USERPROFILE\.config\opencode"
New-Item -ItemType Directory -Force -Path $configDir | Out-Null

@'
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "omniside": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Omniside",
      "options": {
        "baseURL": "https://api.omniside.ai/v1"
      },
      "models": {
        "claude-sonnet-4-5-20250514": {
          "name": "Claude Sonnet 4.5",
          "limit": { "context": 200000, "output": 8192 }
        },
        "gpt-4o": {
          "name": "GPT-4o",
          "limit": { "context": 128000, "output": 16384 }
        }
      }
    }
  }
}
'@ | Set-Content -Path "$configDir\opencode.json" -Encoding UTF8
```

Step 4: Add Your API Key
Now connect your Omniside API key. Launch OpenCode and use the /connect command:
```
opencode
```

Inside the TUI, type:

```
/connect
```

When prompted, select “Other” to add a custom provider, then enter:

- Provider name: Omniside
- API key: your sk-... key
The key is stored securely in ~/.local/share/opencode/auth.json (Linux/macOS) or %USERPROFILE%\.local\share\opencode\auth.json (Windows).
Tip: Not sure where OpenCode stores its files? Run opencode paths to see all config, data, and cache paths for your system.
Alternatively, you can set the key as an environment variable. Add this to your shell profile (~/.bashrc, ~/.zshrc, or ~/.profile):
```
export OMNISIDE_API_KEY="sk-your-key-here"
```

Then reference it in your config using the {env:...} syntax:

```json
"options": {
  "baseURL": "https://api.omniside.ai/v1",
  "apiKey": "{env:OMNISIDE_API_KEY}"
}
```

Step 5: Select Your Model
Launch OpenCode in any project directory:
```
cd /path/to/your/project
opencode
```

Then use the /models command to pick a model:

```
/models
```

You’ll see your Omniside models listed. Select one — for example, omniside/claude-sonnet-4-5-20250514.
That’s it. You’re now coding with Claude, GPT-4o, Gemini, or any other model through Omniside — all from your terminal.
Switching Models
You can switch models at any time inside OpenCode:
```
/models
```

Or use the -m flag when launching:

```
opencode -m omniside/gpt-4o
```

Using Environment Variables Instead of Config
If you prefer a minimal setup without a config file, you can set environment variables directly:
```
export OPENAI_API_KEY="sk-your-omniside-key"
export OPENAI_BASE_URL="https://api.omniside.ai/v1"
```

This works because Omniside is OpenAI-compatible, so OpenCode’s built-in OpenAI provider will route through Omniside automatically. The downside is you only get models that OpenCode already knows about (like gpt-4o) — you won’t see Claude or Gemini in the model list.
For the full multi-model experience, use the config file approach from Step 3.
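To see why the env-var shortcut works, it helps to look at how the client builds its request URLs: an OpenAI-compatible client joins the base URL with the endpoint path for each call. The sketch below illustrates that join without sending any request:

```shell
# Illustration only — no request is sent. An OpenAI-compatible client
# appends the endpoint path (e.g. /chat/completions) to the base URL,
# which is why OPENAI_BASE_URL must stop at /v1.
OPENAI_BASE_URL="https://api.omniside.ai/v1"
request_url="${OPENAI_BASE_URL%/}/chat/completions"   # strip any trailing slash, then join
echo "$request_url"
```

This same joining behavior is behind the “Route not found” error covered in Troubleshooting: if the base URL already ends in /chat/completions, the client appends the path a second time.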
Platform-Specific Notes
Ubuntu / Debian
If you get permission errors with npm install -g, either:
- Use sudo npm install -g opencode-ai, or
- Set npm’s global directory: npm config set prefix ~/.npm-global and add ~/.npm-global/bin to your PATH
CentOS / RHEL
On CentOS 7, you may need to install a newer gcc and glibc for Node.js 22. Consider using CentOS Stream 9 or Rocky Linux 9 for a smoother experience.
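A quick check of the installed glibc will tell you whether an older system can run a modern Node build at all — official Node 18+ binaries generally require glibc 2.28 or newer, while CentOS 7 ships 2.17. A minimal sketch:

```shell
# Print the system glibc version before installing Node.js on CentOS/RHEL.
# (Official Node 18+ builds generally need glibc 2.28+; CentOS 7 has 2.17.)
glibc_version="$(ldd --version 2>/dev/null | head -n 1 | awk '{print $NF}')"
[ -n "$glibc_version" ] || glibc_version="unknown"   # e.g. non-glibc systems
echo "glibc $glibc_version"
```

If the version printed is below 2.28, installing Node via the NodeSource script will likely fail, which is why moving to CentOS Stream 9 or Rocky Linux 9 is the smoother path.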
macOS
On first run, macOS may show a security prompt. Go to System Settings → Privacy & Security and allow OpenCode to run.
If using the Homebrew tap (anomalyco/tap/opencode), you always get the latest release. The official brew install opencode formula is maintained by the Homebrew team and may lag behind.
Windows
WSL is the recommended approach. If using native Windows:
- Use Windows Terminal for the best TUI experience
- PowerShell 7+ is recommended over the legacy PowerShell
- Some OpenCode features (like file watching) work better under WSL
ARM / Raspberry Pi
OpenCode runs on ARM Linux. Install via npm:
```
npm install -g opencode-ai
```

The Omniside config works identically — ARM doesn’t affect the API configuration.
Verifying Everything Works
Run a quick test to make sure your setup is correct:
```
# Check OpenCode sees your models
opencode models

# Or start OpenCode and ask a question
opencode
```

Inside the TUI, try a simple prompt:

```
What files are in this directory?
```

If OpenCode responds with a list of files using your chosen Omniside model, you’re all set.
Troubleshooting
“Route /api/messages not found”
The baseURL must end with /v1, not /v1/chat/completions. Make sure your config has:
```
"baseURL": "https://api.omniside.ai/v1"
```

Model not showing in /models
Check that:
- The provider section in your config is properly formatted
- The npm field is set to "@ai-sdk/openai-compatible"
- You’ve restarted OpenCode after changing the config
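The first two checks can be scripted. This is a minimal sketch that validates a config with python3 (assumed available); it writes a trimmed-down sample config to a throwaway file so it runs anywhere — point cfg at your real ~/.config/opencode/opencode.json to check the actual file:

```shell
# Script the first two checks: the config parses as JSON and the
# provider's npm field is correct. Uses a temp sample file; swap in
# ~/.config/opencode/opencode.json to validate your real config.
cfg="$(mktemp)"
cat > "$cfg" << 'EOF'
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "omniside": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Omniside",
      "options": { "baseURL": "https://api.omniside.ai/v1" }
    }
  }
}
EOF
# json.tool exits non-zero on a parse error
python3 -m json.tool "$cfg" > /dev/null && json_ok="yes" || json_ok="no"
npm_field="$(python3 -c 'import json,sys; print(json.load(open(sys.argv[1]))["provider"]["omniside"]["npm"])' "$cfg")"
echo "json_ok=$json_ok npm=$npm_field"
rm -f "$cfg"
```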
Authentication errors
Make sure your API key is set either:
- Via the /connect command inside OpenCode, or
- In the options.apiKey field of your config, or
- As an environment variable referenced with {env:VARIABLE_NAME}
Connection timeout
If you’re behind a corporate proxy or firewall, you may need to set the HTTPS_PROXY environment variable:
```
export HTTPS_PROXY="http://your-proxy:8080"
```

Why Omniside for OpenCode?
The typical OpenCode setup requires you to sign up with each AI provider separately. If you want Claude, GPT-4o, and Gemini, that’s three signups, three API keys, and three billing accounts to manage.
With Omniside, you configure one provider in OpenCode and get access to all major models. Switch between them with /models — no reconfiguration needed.
Check the full list of available models at platform.omniside.ai/models.
Get your API key at platform.omniside.ai and start coding.