fix: improve error for unconfigured local providers (ollama/vllm)#18183

Merged
steipete merged 1 commit into openclaw:main from arosstale:fix/17328-ollama-error-hint
Feb 16, 2026

Conversation


@arosstale arosstale commented Feb 16, 2026

Summary

When a user configures `agents.defaults.model.primary: "ollama/gemma3:4b"` but forgets to set `OLLAMA_API_KEY`, the error is a confusing `Unknown model: ollama/gemma3:4b`. The Ollama provider requires any dummy API key to register (the local server doesn't actually check it), but this isn't obvious from the error. This fix appends an actionable hint to the error message.


Fixes #17328

Root Cause

src/agents/pi-embedded-runner/model.ts:resolveModel — returns a bare "Unknown model" string with no guidance when a local provider (ollama/vllm) isn't registered due to missing auth config.

Behavior Changes

| Scenario | Before | After |
|---|---|---|
| `ollama` model, no `OLLAMA_API_KEY` | `Unknown model: ollama/gemma3:4b` | `Unknown model: ollama/gemma3:4b. Ollama requires authentication to be registered as a provider. Set OLLAMA_API_KEY="ollama-local" (any value works) or run "openclaw configure". See: https://docs.openclaw.ai/providers/ollama` |
| `vllm` model, no `VLLM_API_KEY` | `Unknown model: vllm/llama-3-70b` | Similar hint with vLLM-specific instructions |
| Non-local provider (e.g. `google`) | `Unknown model: google/model` | Unchanged; no hint appended |
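Put together, the behavior in the table above can be sketched as follows. This is a minimal reconstruction from the PR description, not the merged code: the exact hint wording, the `LOCAL_PROVIDER_HINTS` shape, and in particular the vLLM text (which the table only calls a "similar hint") are assumptions.

```typescript
// Sketch of the hint logic (reconstructed from the PR description; the
// merged model.ts may differ in detail). Hints are keyed by lowercased
// provider id; providers without an entry get the bare error unchanged.
const LOCAL_PROVIDER_HINTS: Record<string, string> = {
  ollama:
    'Ollama requires authentication to be registered as a provider. ' +
    'Set OLLAMA_API_KEY="ollama-local" (any value works) or run "openclaw configure". ' +
    'See: https://docs.openclaw.ai/providers/ollama',
  // The vLLM wording below is assumed by analogy; the table above only
  // says "similar hint with vLLM-specific instructions".
  vllm:
    'vLLM requires authentication to be registered as a provider. ' +
    'Set VLLM_API_KEY="vllm-local" (any value works) or run "openclaw configure". ' +
    'See: https://docs.openclaw.ai/providers/vllm',
};

function buildUnknownModelError(provider: string, modelId: string): string {
  const base = `Unknown model: ${provider}/${modelId}`;
  const hint = LOCAL_PROVIDER_HINTS[provider.toLowerCase()];
  return hint ? `${base}. ${hint}` : base;
}
```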

Codebase and GitHub Search

  • I searched the codebase for existing functionality.
    Searches performed:
    • `rg "Unknown model" src/` — found the single error path in model.ts
    • `rg "OLLAMA_API_KEY" src/` — confirmed no existing hint logic
    • `rg "buildUnknownModelError" src/` — new function, no duplication

Tests

  • Format: `pnpm exec oxfmt --check`
  • Lint: `pnpm check`
  • 3 new tests (18 total in `model.test.ts`): ollama hint, vllm hint, non-local no hint
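The three new cases could be asserted along these lines. This is a hedged sketch of the test shapes, not the actual suite in `model.test.ts`; the helper is re-declared minimally here so the example is self-contained, and the hint strings are assumptions.

```typescript
// Minimal re-declaration for illustration; the real helper and full hint
// text live in src/agents/pi-embedded-runner/model.ts.
const LOCAL_PROVIDER_HINTS: Record<string, string> = {
  ollama: 'Set OLLAMA_API_KEY="ollama-local" (any value works).',
  vllm: 'Set VLLM_API_KEY="vllm-local" (any value works).',
};

function buildUnknownModelError(provider: string, modelId: string): string {
  const base = `Unknown model: ${provider}/${modelId}`;
  const hint = LOCAL_PROVIDER_HINTS[provider.toLowerCase()];
  return hint ? `${base}. ${hint}` : base;
}

// Case 1: ollama model without OLLAMA_API_KEY gets the hint appended.
console.assert(buildUnknownModelError("ollama", "gemma3:4b").includes("OLLAMA_API_KEY"));
// Case 2: vllm model without VLLM_API_KEY gets a vLLM-specific hint.
console.assert(buildUnknownModelError("vllm", "llama-3-70b").includes("VLLM_API_KEY"));
// Case 3: non-local provider leaves the error unchanged.
console.assert(buildUnknownModelError("google", "model") === "Unknown model: google/model");
```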

Sign-Off

  • Models used: Claude Sonnet 4 (via pi coding agent)
  • Submitter effort: Traced from user report → resolveModel error path → missing provider registration due to absent API key
  • Agent notes: AI-assisted.


Copilot AI left a comment


Pull request overview

Improves user-facing diagnostics in the embedded model resolver when a configured model reference (e.g., ollama/... or vllm/...) can’t be resolved, by appending an actionable “set API key / run configure” hint for known local providers.

Changes:

  • Replace the generic unknown-model error in resolveModel() with a helper that can add provider-specific guidance.
  • Add local-provider hint text for ollama and vllm.
  • Add unit tests covering hint inclusion/exclusion cases.

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 1 comment.

| File | Description |
|---|---|
| `src/agents/pi-embedded-runner/model.ts` | Adds `buildUnknownModelError()` and hint mapping for local providers when model lookup fails. |
| `src/agents/pi-embedded-runner/model.test.ts` | Adds tests asserting the new hint behavior for ollama/vllm and non-local providers. |


```typescript
function buildUnknownModelError(provider: string, modelId: string): string {
  const base = `Unknown model: ${provider}/${modelId}`;
  const hint = LOCAL_PROVIDER_HINTS[provider.toLowerCase()];
```

Copilot AI Feb 16, 2026


`buildUnknownModelError` currently appends the auth hint based solely on the requested provider string. This can mislead users when the provider is registered (API key set) but the `modelId` is actually wrong/missing (e.g., model not pulled in Ollama). Consider adding the hint only when the provider is not present in `modelRegistry` (e.g., `modelRegistry.getAll()` contains no models for `normalizeProviderId(provider)`), and use `normalizeProviderId(provider)` instead of `provider.toLowerCase()` for the lookup so whitespace/aliasing is handled consistently.

Suggested change

```diff
-const hint = LOCAL_PROVIDER_HINTS[provider.toLowerCase()];
+const normalizedProviderId = normalizeProviderId(provider);
+const hint = LOCAL_PROVIDER_HINTS[normalizedProviderId];
```
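The `normalizeProviderId` helper the reviewer references is not shown in this thread. A plausible minimal version, offered purely as an assumption rather than the openclaw implementation, would be:

```typescript
// Hypothetical normalizeProviderId: trims whitespace and lowercases the
// provider id so hint lookups behave consistently. The real openclaw helper
// may additionally resolve provider aliases.
function normalizeProviderId(provider: string): string {
  return provider.trim().toLowerCase();
}
```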

…enclaw#17328)

When a user sets `agents.defaults.model.primary: "ollama/gemma3:4b"`
but forgets to set OLLAMA_API_KEY, the error is a confusing
"unknown model: ollama/gemma3:4b". The Ollama provider requires any
dummy API key to register (the local server doesn't actually check it),
but this isn't obvious from the error.

Add `buildUnknownModelError()` that detects known local providers
(ollama, vllm) and appends an actionable hint with the env var name
and a link to the relevant docs page.

Before: Unknown model: ollama/gemma3:4b
After:  Unknown model: ollama/gemma3:4b. Ollama requires authentication
        to be registered as a provider. Set OLLAMA_API_KEY="ollama-local"
        (any value works) or run "openclaw configure".
        See: https://docs.openclaw.ai/providers/ollama

Closes openclaw#17328
@steipete steipete merged commit 4df970d into openclaw:main Feb 16, 2026
23 checks passed

Labels

agents (Agent runtime and tooling), size: S

Projects

None yet

Development

Successfully merging this pull request may close these issues.

[Bug]: Telegram bot replies "unknown model: ollama/gemma3:4b" despite status showing it as default (v2026.2.12)

3 participants