- Deploy Ollama on SaladCloud.
- Install and onboard OpenClaw locally.
- Point OpenClaw to your Salad-hosted Ollama endpoint.
- Pair and test Telegram.
Why SaladCloud for OpenClaw?
- Cost-effective: Use flat compute pricing instead of per-token API pricing.
- Scalable: Scale replicas based on demand.
- Fast setup: Deploy Ollama with a prebuilt Salad recipe.
- Flexible: Use any Ollama model that fits your use case.
Prerequisites
- A SaladCloud account.
- A local machine for OpenClaw (macOS, Linux, or Windows with WSL2).
- A Telegram account.
Part 1: Deploy Ollama on SaladCloud
Step 1: Choose a model
Recommended starter models from the Ollama OpenClaw guide:

| Model | Description |
|---|---|
| `qwen3-coder` | Optimized for coding tasks |
| `glm-4.7-flash` | Balanced performance and speed |
| `gpt-oss:20b` | Balanced performance and speed |
| `gpt-oss:120b` | Improved capability |
Step 2: Deploy the Ollama recipe
- Open portal.salad.com.
- Select your organization and project.
- Click Deploy a Container Group.
- Choose the Ollama recipe.
- Configure your deployment:
- Container Group Name: a unique name for your deployment
- Model: for example `gpt-oss:20b`
- Replicas: you can start with the default, or switch to `1` if you are going to configure fallback models in OpenClaw.
- Hardware configuration: the default configuration is sufficient for most models; if you want to change it, click Modify Configuration.
- Authentication: if you want to restrict access to your Ollama endpoint, enable authentication, but make sure to add your Salad API key when configuring OpenClaw.
- Click Deploy.
- Wait until at least one replica is Running.
Step 3: Copy endpoint and test
Copy the deployment access domain (example: https://guava-thyme-123.salad.cloud).
Test the Ollama-native endpoint. You can find this command on your deployment page.
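As a quick sanity check, you can list the models the deployment has pulled via Ollama's native API. This is a sketch, not the exact command from your deployment page: substitute your own access domain, and if you enabled authentication, include your Salad API key header.

```
# List available models (replace the domain with your deployment's access domain)
curl https://guava-thyme-123.salad.cloud/api/tags

# If authentication is enabled on the container group:
curl -H "Salad-Api-Key: <YOUR_SALAD_API_KEY>" https://guava-thyme-123.salad.cloud/api/tags
```

A successful response returns a JSON list of models; if your chosen model is missing, the download may still be in progress.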
Part 2: Install and onboard OpenClaw locally
Step 4: Install OpenClaw
Run the following command in your terminal to install the OpenClaw CLI.
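A typical install uses npm's global install (assumed here from the usual OpenClaw distribution; verify the current method in the OpenClaw docs before running):

```
npm install -g openclaw@latest
```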
For full installation details, see the OpenClaw docs.
Step 5: Run onboarding
If OpenClaw has not started the onboarding wizard automatically, you can start it manually from the CLI. During onboarding:
- Accept the local-agent security warning. If you choose `No`, onboarding exits immediately. Choose `Yes` only if you understand that the agent can execute actions with your local user permissions.
- Select the quick start path.
- Skip model setup for now.
- Select Telegram when OpenClaw asks you to choose channels.
- Paste your Telegram bot token when prompted. (See Step 6 below for details.)
- Complete or skip the remaining optional setup steps.
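If the wizard did not launch on its own, it can typically be started like this (command name assumed from the standard OpenClaw CLI; check `openclaw --help` if it differs in your version):

```
openclaw onboard
```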
Step 6: Telegram connection details (during onboarding)
Use this flow when onboarding asks for Telegram setup. First, create a Telegram bot in the Telegram app and get its token:
- Open Telegram and search for `@BotFather`.
- Start a chat and run `/newbot`.
- Set a display name for your bot.
- Set a bot username that ends with `_bot` (for example, `salad_openclaw_bot`).
- Once your bot is created, BotFather sends a summary message with all the bot details. In that message, find and copy the bot token.
- Return to OpenClaw onboarding and paste the token when prompted. Then continue with the next onboarding steps.
- Return to your Telegram bot and start the first conversation. The bot replies with a pairing code and a command you need to run. Copy that command and run it in your terminal to complete pairing; it will look something like this:
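A hypothetical example of the pairing command (the exact command and subcommand names come from the bot's message and may differ in your OpenClaw version; always use what the bot sends you):

```
# Example only -- replace with the exact command and code from the bot's message
openclaw pairing approve telegram <PAIRING_CODE>
```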
Part 3: Configure OpenClaw to use Salad-hosted Ollama
In this part, you will connect OpenClaw to your Salad-hosted Ollama endpoint. You will add a model provider that points to your Salad deployment URL, set the default model OpenClaw should use, and verify that OpenClaw can see the configured model before testing in Telegram.
Step 7: Update OpenClaw config
Edit `~/.openclaw/openclaw.json` and add/merge the provider configuration. If you enabled authentication on your Salad deployment, also add your Salad API key in the headers section:
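A minimal sketch of the provider entry is below. Only the `baseUrl` (ending in `/v1`) and the `Salad-Api-Key` header are grounded in this guide; the surrounding key names and nesting are assumptions, so match them to the schema your OpenClaw version documents. Replace the domain with your deployment's access domain, and omit `headers` if authentication is disabled. Your default-model setting goes in the same file per the OpenClaw docs.

```json
{
  "models": {
    "providers": {
      "ollama": {
        "baseUrl": "https://guava-thyme-123.salad.cloud/v1",
        "headers": {
          "Salad-Api-Key": "<YOUR_SALAD_API_KEY>"
        }
      }
    }
  }
}
```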
Part 4: Restart and test
Step 8: Restart gateway
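Restarting the gateway picks up the new config. The command below assumes the standard OpenClaw CLI layout; check `openclaw --help` if your version names it differently:

```
openclaw gateway restart
```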
Step 9: Send a test message
- Open Telegram and send a direct message to your bot.
- Wait for the bot's response.
- You can also monitor the local gateway at http://127.0.0.1:18789/.
Troubleshooting
OpenClaw cannot connect to Ollama endpoint
- Confirm the Salad container group is Running.
- Verify the endpoint URL is correct.
- Confirm model download has completed (check container logs).
- Verify the provider `baseUrl` ends with `/v1` in the OpenClaw config.
- If auth is enabled, ensure `Salad-Api-Key` is set in headers.
- Use a curl command to send a message directly to the endpoint and confirm it works.
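For example, you can call the OpenAI-compatible chat endpoint that Ollama exposes under `/v1`. Substitute your own access domain and model name, and drop the `Salad-Api-Key` header if authentication is disabled:

```
curl https://guava-thyme-123.salad.cloud/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Salad-Api-Key: <YOUR_SALAD_API_KEY>" \
  -d '{"model": "gpt-oss:20b", "messages": [{"role": "user", "content": "Say hello"}]}'
```

If this returns a valid completion but OpenClaw still fails, the problem is in the OpenClaw config rather than the deployment.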
Telegram bot is not responding
- Verify `botToken` is correct in the OpenClaw config.
- Re-run pairing approval if needed.
- Check the OpenClaw gateway logs.
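The exact log command depends on your OpenClaw version; the subcommand below is an assumption, so fall back to `openclaw --help` if it is not available:

```
openclaw logs
```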
Cost optimization tips
Model selection
- Start with smaller models for simple tasks.
- Move to larger models only when your workflow needs stronger reasoning.
- Compare quality and speed before scaling up.
Scaling strategy
- Start with 1-3 replicas.
- Increase replicas only after you observe real load.
- Use autoscaling for variable demand.
- Configure OpenClaw model fallbacks for resilience during reallocations.