OpenClaw is the open-source personal AI assistant that’s been taking the developer community by storm. It runs on your machine, connects to messaging apps like Telegram, Discord, and WhatsApp, and handles everything from coding tasks to email management — powered by LLMs.

The catch? OpenClaw needs at least one LLM provider configured to work. By default, that means signing up for Anthropic, OpenAI, Google, or other providers separately, each with their own API key and billing account.

With Omniside, you skip all of that. One API key, one billing account — access to Claude, GPT-4o, Gemini, Grok, DeepSeek, and more through a single OpenAI-compatible endpoint.

This guide covers installation and configuration on every major platform: macOS, Ubuntu, Debian, CentOS, and Windows.

Prerequisites

You’ll need:

  1. A machine to run OpenClaw on — Mac, Linux, or Windows (WSL recommended)
  2. Node.js 22+ — the install script handles this if you don’t have it
  3. An Omniside API key: sign up here (takes 30 seconds)

Step 1: Install OpenClaw

macOS

The recommended approach:

Terminal window
curl -fsSL https://openclaw.ai/install.sh | bash

Or via Homebrew:

Terminal window
brew install openclaw

Or via npm:

Terminal window
npm install -g openclaw

A desktop companion app is also available — download the .dmg from GitHub Releases.

Ubuntu / Debian

Terminal window
curl -fsSL https://openclaw.ai/install.sh | bash

Or install via npm (requires Node.js 22+):

Terminal window
# Install Node.js if needed
curl -fsSL https://deb.nodesource.com/setup_22.x | sudo -E bash -
sudo apt-get install -y nodejs
# Install OpenClaw
npm install -g openclaw

CentOS / RHEL / Fedora

Terminal window
curl -fsSL https://openclaw.ai/install.sh | bash

Or via npm:

Terminal window
# Install Node.js if needed
curl -fsSL https://rpm.nodesource.com/setup_22.x | sudo bash -
sudo yum install -y nodejs # CentOS/RHEL
# or: sudo dnf install -y nodejs # Fedora
# Install OpenClaw
npm install -g openclaw

Windows

WSL is strongly recommended for the best experience:

Terminal window
# In PowerShell (admin), install WSL first if needed
wsl --install
# Then inside WSL:
curl -fsSL https://openclaw.ai/install.sh | bash

Native Windows is also possible via npm:

Terminal window
npm install -g openclaw

Docker

If you prefer containerized deployment:

Terminal window
docker run -it --rm ghcr.io/openclaw/openclaw

Verify Installation

Terminal window
openclaw --version

Step 2: Get Your Omniside API Key

  1. Go to platform.omniside.ai
  2. Sign up or log in
  3. Navigate to API Keys
  4. Click Create API Key
  5. Copy the key (starts with sk-)

Step 3: Configure Omniside as a Custom Provider

OpenClaw uses an openclaw.json (or openclaw.json5) config file. The config can live in:

  • Global: ~/.config/openclaw/openclaw.json (Linux/macOS) or %USERPROFILE%\.config\openclaw\openclaw.json (Windows)
  • Legacy path: ~/.openclaw/openclaw.json
  • Per-project: openclaw.json in the project root

We’ll set it up globally. Create the directory first:

Terminal window
mkdir -p ~/.config/openclaw

Understanding the Config Structure

OpenClaw custom providers require four things:

  1. A provider definition in models.providers — tells OpenClaw where to send requests
  2. A model allowlist in agents.defaults.models — tells OpenClaw which models are available
  3. mode: "merge" — critical; without this, your custom provider won’t load alongside built-in ones
  4. api: "openai-completions" — required for any OpenAI-compatible endpoint

Option A: Full — All Models with Fallback

Best experience. Claude Sonnet 4.5 as primary, with automatic fallback to GPT-4o, Gemini, and DeepSeek.

{
  "models": {
    "mode": "merge",
    "providers": {
      "omniside": {
        "baseUrl": "https://api.omniside.ai/v1",
        "apiKey": "sk-your-omniside-key",
        "api": "openai-completions",
        "models": [
          {
            "id": "claude-sonnet-4-5-20250514",
            "name": "Claude Sonnet 4.5",
            "reasoning": true,
            "input": ["text", "image"],
            "contextWindow": 200000,
            "maxTokens": 8192
          },
          {
            "id": "gpt-4o",
            "name": "GPT-4o",
            "reasoning": false,
            "input": ["text", "image"],
            "contextWindow": 128000,
            "maxTokens": 16384
          },
          {
            "id": "gemini-2.5-pro",
            "name": "Gemini 2.5 Pro",
            "reasoning": true,
            "input": ["text", "image"],
            "contextWindow": 1000000,
            "maxTokens": 65536
          },
          {
            "id": "deepseek-chat",
            "name": "DeepSeek V3",
            "reasoning": false,
            "input": ["text"],
            "contextWindow": 64000,
            "maxTokens": 8192
          },
          {
            "id": "grok-3",
            "name": "Grok 3",
            "reasoning": false,
            "input": ["text"],
            "contextWindow": 131072,
            "maxTokens": 8192
          }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "omniside/claude-sonnet-4-5-20250514",
        "fallbacks": [
          "omniside/gpt-4o",
          "omniside/gemini-2.5-pro",
          "omniside/deepseek-chat"
        ]
      },
      "models": {
        "omniside/claude-sonnet-4-5-20250514": { "alias": "sonnet" },
        "omniside/gpt-4o": { "alias": "gpt4o" },
        "omniside/gemini-2.5-pro": { "alias": "gemini" },
        "omniside/deepseek-chat": { "alias": "deepseek" },
        "omniside/grok-3": { "alias": "grok" }
      }
    }
  }
}

Option B: Smart — Best Quality-to-Cost Ratio

Claude Sonnet 4.5 as primary with GPT-4o fallback. Recommended for most users.

{
  "models": {
    "mode": "merge",
    "providers": {
      "omniside": {
        "baseUrl": "https://api.omniside.ai/v1",
        "apiKey": "sk-your-omniside-key",
        "api": "openai-completions",
        "models": [
          {
            "id": "claude-sonnet-4-5-20250514",
            "name": "Claude Sonnet 4.5",
            "reasoning": true,
            "input": ["text", "image"],
            "contextWindow": 200000,
            "maxTokens": 8192
          },
          {
            "id": "gpt-4o",
            "name": "GPT-4o",
            "reasoning": false,
            "input": ["text", "image"],
            "contextWindow": 128000,
            "maxTokens": 16384
          }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "omniside/claude-sonnet-4-5-20250514",
        "fallbacks": ["omniside/gpt-4o"]
      },
      "models": {
        "omniside/claude-sonnet-4-5-20250514": { "alias": "sonnet" },
        "omniside/gpt-4o": { "alias": "gpt4o" }
      }
    }
  }
}

Option C: Budget — Lowest Cost

DeepSeek V3 as primary. Great for everyday tasks where cost matters more than peak performance.

{
  "models": {
    "mode": "merge",
    "providers": {
      "omniside": {
        "baseUrl": "https://api.omniside.ai/v1",
        "apiKey": "sk-your-omniside-key",
        "api": "openai-completions",
        "models": [
          {
            "id": "deepseek-chat",
            "name": "DeepSeek V3",
            "reasoning": false,
            "input": ["text"],
            "contextWindow": 64000,
            "maxTokens": 8192
          },
          {
            "id": "gpt-4o-mini",
            "name": "GPT-4o Mini",
            "reasoning": false,
            "input": ["text", "image"],
            "contextWindow": 128000,
            "maxTokens": 16384
          }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "omniside/deepseek-chat",
        "fallbacks": ["omniside/gpt-4o-mini"]
      },
      "models": {
        "omniside/deepseek-chat": { "alias": "deepseek" },
        "omniside/gpt-4o-mini": { "alias": "gpt4omini" }
      }
    }
  }
}

Save the Config

On Linux / macOS, save the JSON to the global config path:

Terminal window
# Write Option B (Smart) config - replace sk-your-omniside-key with your actual key
cat > ~/.config/openclaw/openclaw.json << 'EOF'
{
  "models": {
    "mode": "merge",
    "providers": {
      "omniside": {
        "baseUrl": "https://api.omniside.ai/v1",
        "apiKey": "sk-your-omniside-key",
        "api": "openai-completions",
        "models": [
          {
            "id": "claude-sonnet-4-5-20250514",
            "name": "Claude Sonnet 4.5",
            "reasoning": true,
            "input": ["text", "image"],
            "contextWindow": 200000,
            "maxTokens": 8192
          },
          {
            "id": "gpt-4o",
            "name": "GPT-4o",
            "reasoning": false,
            "input": ["text", "image"],
            "contextWindow": 128000,
            "maxTokens": 16384
          }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "omniside/claude-sonnet-4-5-20250514",
        "fallbacks": ["omniside/gpt-4o"]
      },
      "models": {
        "omniside/claude-sonnet-4-5-20250514": { "alias": "sonnet" },
        "omniside/gpt-4o": { "alias": "gpt4o" }
      }
    }
  }
}
EOF

On Windows (PowerShell):

Terminal window
$configDir = "$env:USERPROFILE\.config\openclaw"
New-Item -ItemType Directory -Force -Path $configDir | Out-Null
notepad "$configDir\openclaw.json"
# Paste your chosen config and save

Important: Replace sk-your-omniside-key with your actual Omniside API key. You can also use environment variable syntax: "apiKey": "${OMNISIDE_API_KEY}" and export the key in your shell profile.

Step 4: Run Onboarding

Now start the OpenClaw onboarding process:

Terminal window
openclaw onboard

When prompted for model selection, choose “Skip for now” — your Omniside models are already configured in the JSON file.

The onboarding wizard will walk you through connecting a chat channel (Telegram, Discord, WhatsApp, etc.).

Step 5: Verify It Works

After onboarding, verify your models are loaded:

Terminal window
openclaw models list

You should see your Omniside models listed with the omniside/ prefix. To switch models:

Terminal window
openclaw models set omniside/gpt-4o

Or restart the gateway to apply config changes:

Terminal window
openclaw gateway restart

Using Environment Variables for the API Key

Instead of hardcoding the key in the config, you can reference an environment variable:

"apiKey": "${OMNISIDE_API_KEY}"

Then add to your shell profile (~/.bashrc, ~/.zshrc, or ~/.profile):

Terminal window
export OMNISIDE_API_KEY="sk-your-key-here"

This keeps your key out of config files that might be committed to git.
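Assuming OpenClaw substitutes ${VAR} placeholders from the environment when it loads the config (the mechanism is not documented here, so treat this as a sketch), the expansion is roughly:

```python
import os
import re

# Illustrative sketch of ${VAR} expansion — not OpenClaw's actual loader.
def expand_env(text: str) -> str:
    """Replace ${VAR} placeholders with environment values (empty string if unset)."""
    return re.sub(r"\$\{(\w+)\}", lambda m: os.environ.get(m.group(1), ""), text)

# Example: with OMNISIDE_API_KEY exported in your shell profile,
# the placeholder resolves to the real key at load time:
os.environ["OMNISIDE_API_KEY"] = "sk-demo"
print(expand_env('"apiKey": "${OMNISIDE_API_KEY}"'))  # → "apiKey": "sk-demo"
```

Note that an unset variable expands to an empty apiKey, which then fails with a 401 — worth checking first if authentication errors appear after switching to this pattern.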

Platform-Specific Notes

Ubuntu / Debian

If npm install -g fails with permission errors:

  • Use sudo npm install -g openclaw, or
  • Set npm prefix: npm config set prefix ~/.npm-global and add ~/.npm-global/bin to PATH

CentOS / RHEL

CentOS 7 may need updated gcc and glibc for Node.js 22. Rocky Linux 9 or CentOS Stream 9 provide a smoother experience.

macOS

On first run, macOS may block OpenClaw with a Gatekeeper warning. Go to System Settings → Privacy & Security → Open Anyway.

Windows

WSL is strongly recommended. If using native Windows:

  • Config path: %USERPROFILE%\.config\openclaw\openclaw.json
  • Use Windows Terminal for the best TUI experience
  • Some features like file watching work better under WSL

Docker

For Docker deployments, pass the API key as an environment variable:

services:
  openclaw:
    image: ghcr.io/openclaw/openclaw:latest
    environment:
      - OMNISIDE_API_KEY=sk-your-key-here
    volumes:
      - ./openclaw.json:/root/.config/openclaw/openclaw.json

Troubleshooting

“No API provider registered for api: undefined”

You’re missing the "api": "openai-completions" field in your provider config. This is the most common mistake.

“Config validation failed: baseUrl expected string”

You set apiKey but forgot baseUrl. Both are required for custom providers:

"baseUrl": "https://api.omniside.ai/v1",
"apiKey": "sk-your-key",
"api": "openai-completions"

Model not appearing in openclaw models list

Check that:

  1. "mode": "merge" is set in the models section
  2. The model is listed in both models.providers.omniside.models AND agents.defaults.models
  3. You restarted the gateway: openclaw gateway restart

401 / 403 Authentication Error

Verify your API key is correct. Test it directly:

Terminal window
curl https://api.omniside.ai/v1/models \
-H "Authorization: Bearer sk-your-key"

baseUrl must end with /v1

Make sure the URL is https://api.omniside.ai/v1 — not /v1/chat/completions.
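If you have pasted a full endpoint URL from another tool, a tiny helper (illustrative only, not part of OpenClaw) can trim it back to the /v1 base:

```python
# Hypothetical helper: normalize a pasted endpoint URL to the /v1 base.
def normalize_base_url(url: str) -> str:
    """Strip common endpoint paths so the URL ends at /v1."""
    url = url.rstrip("/")
    for suffix in ("/chat/completions", "/completions", "/models"):
        if url.endswith("/v1" + suffix):
            return url[: -len(suffix)]
    return url

print(normalize_base_url("https://api.omniside.ai/v1/chat/completions"))
# → https://api.omniside.ai/v1
```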

Why Omniside for OpenClaw?

Without Omniside, setting up OpenClaw with multiple models means signing up for Anthropic, OpenAI, Google, and xAI separately — four signups, four API keys, four billing accounts.

With Omniside, you add one provider block to your config and get access to all major models. Switch between Claude, GPT, and Gemini with a single openclaw models set command — no reconfiguration needed.

Check available models at platform.omniside.ai/models.

Get your API key at platform.omniside.ai and start building.