Last Updated: March 27, 2026

Introduction

OpenCode is an open-source AI coding agent that runs as a terminal UI. Built in Go, it provides a rich interactive interface for coding, planning, reviewing, and debugging directly from your terminal. It was designed with custom model providers in mind and includes first-class support for OpenAI-compatible endpoints. OpenCode pairs well with SaladCloud because:
  • Per-hour pricing means no per-token cost during long coding sessions
  • Custom provider support is built into OpenCode’s config system - no workarounds needed
  • Self-hosted models keep your code on infrastructure you control

Prerequisites

Before getting started, make sure you have:
  • A SaladCloud account
  • Node.js 18+ installed (for the @ai-sdk/openai-compatible package used internally by OpenCode)
  • A terminal environment (Linux, macOS, or WSL on Windows)

Step-by-Step Setup

Step 1: Deploy an LLM Recipe on SaladCloud

First, deploy an OpenAI-compatible LLM server on SaladCloud.
  • Go to the SaladCloud portal and create an account if you do not already have one.
  • Create an organization or choose an existing one, then click “Deploy a container group”.
  • Select an LLM recipe. The best fit for agentic coding is the Qwen3.5-35B-A3B (llama.cpp) recipe. On the recipe page, provide a name and deploy - the rest is preconfigured with recommended settings.
  • Once deployed, your endpoint will be live and serving an OpenAI-compatible API.
Available recipes:
Ready-to-deploy recipes (best for less technical users):
  • qwen3.5-35B-A3B — A powerful Mixture of Experts model optimized for instruction-following tasks, ideal for agentic use cases.
Recipes for custom deployments (best for advanced users):
  • llama.cpp — Supports GGUF models
  • sglang — High-performance inference
  • vllm — Popular LLM serving framework
  • ollama — Simple model management
  • tgi — Hugging Face Text Generation Inference server
After deployment, note your API endpoint URL (e.g., https://your-endpoint.salad.cloud).
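Before wiring up OpenCode, you can sanity-check the endpoint directly. The sketch below (Python, standard library only) builds an OpenAI-compatible chat completion request that carries the Salad-Api-Key header; the endpoint URL, API key, and model name are placeholders for your own deployment values.

```python
import json
import urllib.request


def build_chat_request(endpoint: str, api_key: str, model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat completion request for a SaladCloud endpoint."""
    return {
        "url": endpoint.rstrip("/") + "/v1/chat/completions",
        "headers": {
            "Content-Type": "application/json",
            "Salad-Api-Key": api_key,  # SaladCloud authentication header
        },
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }


req = build_chat_request("https://your-endpoint.salad.cloud",
                         "your-salad-api-key", "qwen3.5-35b", "Say hello")

# Uncomment to actually send the request once your endpoint is live:
# http_req = urllib.request.Request(req["url"], data=req["body"].encode(),
#                                   headers=req["headers"], method="POST")
# with urllib.request.urlopen(http_req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

If the endpoint returns a normal chat completion for a request shaped like this, OpenCode's OpenAI-compatible provider will be able to talk to it as well.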

Step 2: Install OpenCode

Install OpenCode using the official install script:
curl -fsSL https://opencode.ai/install | bash
Or install via npm:
npm install -g opencode-ai
Verify the installation:
opencode --version

Step 3: Configure OpenCode to Use Your SaladCloud Endpoint

Create or edit the OpenCode configuration file at ~/.config/opencode/opencode.json:
{
  "$schema": "https://opencode.ai/config.json",
  "model": "saladcloud/qwen3.5-35b",
  "provider": {
    "saladcloud": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "SaladCloud",
      "options": {
        "baseURL": "https://your-endpoint.salad.cloud/v1",
        "apiKey": "dummy",
        "headers": {
          "Salad-Api-Key": "{env:SALAD_API_KEY}"
        }
      },
      "models": {
        "qwen3.5-35b": {
          "name": "Qwen 3.5-35B-A3B",
          "limit": {
            "context": 262144,
            "output": 8192
          }
        }
      }
    }
  }
}
Replace https://your-endpoint.salad.cloud/v1 with your actual SaladCloud endpoint URL. If your SaladCloud deployment requires authentication, set your API key as an environment variable (hardcoding it in the config is possible but not recommended for security reasons):
export SALAD_API_KEY=your-salad-api-key
OpenCode will inject this as the Salad-Api-Key request header on every API call. The apiKey field in the config must be set to a non-empty string but is otherwise unused - authentication is handled entirely by the header.
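To catch typos before launching OpenCode, you can sanity-check the config file programmatically. This is a minimal sketch, assuming the exact layout shown above (the JSON literal is a copy of that example; point `json.load` at your real ~/.config/opencode/opencode.json instead):

```python
import json

# Stand-in for: config = json.load(open(os.path.expanduser("~/.config/opencode/opencode.json")))
config = json.loads("""
{
  "model": "saladcloud/qwen3.5-35b",
  "provider": {
    "saladcloud": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "SaladCloud",
      "options": {
        "baseURL": "https://your-endpoint.salad.cloud/v1",
        "apiKey": "dummy",
        "headers": {"Salad-Api-Key": "{env:SALAD_API_KEY}"}
      },
      "models": {"qwen3.5-35b": {"name": "Qwen 3.5-35B-A3B"}}
    }
  }
}
""")

opts = config["provider"]["saladcloud"]["options"]
assert opts["baseURL"].endswith("/v1"), "baseURL should include the /v1 path"
assert opts["apiKey"], "apiKey must be a non-empty string, even though it is unused"
assert "Salad-Api-Key" in opts["headers"], "auth is carried by the Salad-Api-Key header"
model_id = config["model"].split("/", 1)[1]  # e.g. "qwen3.5-35b"
assert model_id in config["provider"]["saladcloud"]["models"], "default model must be defined"
print("config looks sane")
```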

Step 4: Launch OpenCode and Test the Connection

Navigate to your project directory and start OpenCode:
cd /path/to/your/project
opencode
OpenCode will open a TUI in your terminal. Test with a simple task:
“Create a hello world Python script that prints ‘Hello from SaladCloud!’”
If OpenCode successfully creates the file, your setup is complete.

Tips for Best Results

Use the Build Agent

OpenCode’s built-in build agent handles code generation and editing with all tools enabled. It uses whichever model is set as the default in your config. You can also switch to the built-in plan agent (Tab key) for read-only analysis and planning without the risk of unintended file modifications.

Context Window

The Qwen 3.5-35B-A3B recipe supports a 262,144-token context window. The context value in the config lets OpenCode track how much context remains and manage prompt sizes accordingly, so make sure it matches your actual deployment configuration.
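As a rough illustration of why the configured limit matters, the sketch below estimates whether a prompt fits using the common ~4-characters-per-token heuristic. This is an approximation, not the model's real tokenizer, and the function name is hypothetical:

```python
def fits_in_context(prompt_chars: int, context_limit: int = 262_144,
                    output_reserve: int = 8_192, chars_per_token: float = 4.0) -> bool:
    """Rough check: does a prompt of this size leave room for the reserved output tokens?"""
    prompt_tokens = prompt_chars / chars_per_token  # crude heuristic, not a real tokenizer
    return prompt_tokens + output_reserve <= context_limit


# A 200,000-character prompt is roughly 50k tokens, well within the window
# once 8,192 tokens are reserved for output.
print(fits_in_context(200_000))
```

If you deploy with a smaller context (to fit a smaller GPU, for example), lower the `context` value in opencode.json to match, or OpenCode will over-estimate the available budget.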

Persist Your API Key

To avoid re-exporting SALAD_API_KEY in every terminal session, add it to your shell profile:
echo 'export SALAD_API_KEY=your-salad-api-key' >> ~/.bashrc
source ~/.bashrc