Onboard: add Ollama auth flow and improve model defaults (#41529)

steipete merged 2 commits into openclaw:main
Conversation
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 3ae2525b75
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
Greptile Summary: This PR adds Ollama as a first-class onboarding provider, introducing interactive and non-interactive setup flows with Cloud+Local and Local modes and browser-based sign-in via /api/me.
Finding: the auth credential should be stored after the interactive prompts complete.
Confidence Score: 4/5
Last reviewed commit: 3ae2525

Force-pushed: 3ae2525 → 7c285db → a627efc → 10a39c7 → 9034b4d → fa744d1 (Codex re-reviewed each push with no blocking suggestions)
Hi OpenClaw maintainer team, the Ollama team and I have worked together on this PR to improve the built-in Ollama provider functionality. We're super open to feedback and can make adjustments as needed. We'd love to help OpenClaw users more easily connect to (increasingly capable) local and larger hosted open models (Ollama's cloud models don't store prompts or outputs, and data is never trained on).
Add Ollama as an auth provider in onboarding with Cloud + Local mode selection, browser-based sign-in via /api/me, smart model suggestions per mode, and graceful fallback when the default model is unavailable.

- Extract shared ollama-models.ts
- Auto-pull missing models during onboarding
- Non-interactive mode support for CI/automation

Closes openclaw#8239
Closes openclaw#3494

Co-Authored-By: Jeffrey Morgan <[email protected]>
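The auto-pull step described above can be sketched roughly as follows. This is an illustrative sketch against Ollama's /api/pull endpoint, not the PR's actual implementation: the function name, the injected fetch parameter, and the error handling are all assumptions.

```typescript
// Minimal response/fetch shapes so the sketch is self-contained and testable.
type PullFetch = (
  url: string,
  init: { method: string; headers: Record<string, string>; body: string },
) => Promise<{ ok: boolean }>;

// Pull a model via POST /api/pull only when it is absent from the installed list
// (as reported by /api/tags). Returns whether the model ends up available.
async function pullModelIfMissing(
  fetchFn: PullFetch,
  baseUrl: string,
  model: string,
  installed: string[],
): Promise<boolean> {
  if (installed.includes(model)) return true; // already present, nothing to pull
  const response = await fetchFn(`${baseUrl}/api/pull`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, stream: false }),
  });
  return response.ok; // caller can fall back to another model on failure
}
```

Injecting the fetch function keeps the onboarding step easy to unit-test without a live Ollama server, which matches the PR's emphasis on non-interactive CI flows.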
Force-pushed: fa744d1 → 3ed1cf1
Landed via temp rebase onto main. Thanks @BruceMacD!
💡 Codex Review
Reviewed commit: 3ed1cf1e46
if (!response.ok) {
  return { reachable: true, models: [] };
Treat non-2xx Ollama tag responses as unreachable
fetchOllamaModels currently reports reachable: true for any non-OK /api/tags response, which lets onboarding proceed as if the Ollama endpoint is healthy even when it is returning 401/404/500. In the new non-interactive Ollama flow, that can produce a "successful" config (notably when a :cloud model is requested and no pull is attempted) and defer the real failure until first model invocation. Returning an unreachable/error state for non-2xx responses would fail fast and prevent persisting broken Ollama defaults.
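The fix Codex suggests could look something like the sketch below. This is a hedged illustration, not OpenClaw's actual code: the discriminated-union return type and the injected fetch parameter are assumptions added to make the snippet self-contained.

```typescript
// Minimal fetch shape so the sketch compiles without environment-specific typings.
type TagsFetch = (url: string) => Promise<{
  ok: boolean;
  status: number;
  json(): Promise<unknown>;
}>;

// Discriminated union: unreachable states carry an error instead of an empty model list.
type OllamaProbe =
  | { reachable: true; models: string[] }
  | { reachable: false; error: string };

async function fetchOllamaModels(fetchFn: TagsFetch, baseUrl: string): Promise<OllamaProbe> {
  try {
    const response = await fetchFn(`${baseUrl}/api/tags`);
    if (!response.ok) {
      // Non-2xx now fails fast instead of reporting an empty-but-healthy endpoint.
      return { reachable: false, error: `HTTP ${response.status} from /api/tags` };
    }
    const body = (await response.json()) as { models?: { name: string }[] };
    return { reachable: true, models: (body.models ?? []).map((m) => m.name) };
  } catch (err) {
    return { reachable: false, error: String(err) };
  }
}
```

With this shape, the non-interactive flow can refuse to persist config whenever `reachable` is false, instead of writing broken Ollama defaults.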
* main:
  - fix(agents): check billing errors before context overflow heuristics (openclaw#40409)
  - fix(config): add missing editMessage and createForumTopic to Telegram actions schema (openclaw#35498)
  - fix(signal): add missing accountUuid to Zod config schema (openclaw#35578)
  - fix(voice-call): add speed and instructions to OpenAI TTS config schema (openclaw#39226)
  - fix(telegram): clear stale retain before transient final fallback (openclaw#41763)
  - Fix env proxy bootstrap for model traffic (openclaw#43248)
  - fix: tighten Ollama onboarding cloud handling (openclaw#41529) (thanks @BruceMacD)
  - Onboard: add Ollama auth flow and improve model defaults

(cherry picked from commit 1435fce)
Summary
Adds Ollama onboarding with Cloud + Local modes, browser-based sign-in via /api/me, smart model suggestions per mode, auto-pull of missing models, and graceful fallback. Extracted a shared ollama-models.ts module to avoid duplicating the existing model selection logic for Ollama; models-config.providers.discovery.ts was only refactored to import shared constants from the new module.

Change Type (select all)
Scope (select all touched areas)
Linked Issue/PR
User-visible / Behavior Changes
Non-interactive mode (--auth-choice ollama) supported for CI/automation

Security Impact (required)
- No
- No: stores a placeholder ollama-local credential in auth-profiles for the Ollama provider (same pattern as other local providers); actual auth happens in the external Ollama server.
- Yes: fetches /api/tags, /api/me, and /api/pull from the user-configured Ollama base URL during onboarding
- No
- No

Repro + Verification
Environment
Steps
openclaw onboard

Expected
config.yaml with discovered models

Actual
Evidence
New tests:
ollama-setup.test.ts (13 tests), auth-choice.apply.ollama.test.ts (2 tests), auth-choice-options.test.ts (5 tests)

Human Verification (required)
Compatibility / Migration
- Yes
- No
- No

Failure Recovery (if this breaks)
Risks and Mitigations
- Risk: the /api/me endpoint behavior could change. Mitigation: on failure, checkOllamaCloudAuth returns signedIn: true (assumes no cloud auth needed), so onboarding degrades gracefully.
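The degrade-gracefully mitigation above can be sketched as follows. The function name checkOllamaCloudAuth and its signedIn field come from this PR's description, but the body, the return shape, and the injected fetch parameter are illustrative assumptions, not OpenClaw's actual internals.

```typescript
// Minimal fetch shape so the sketch is self-contained.
type MeFetch = (url: string) => Promise<{ ok: boolean; json(): Promise<unknown> }>;

interface CloudAuthStatus {
  signedIn: boolean;
  user?: string;
}

async function checkOllamaCloudAuth(fetchFn: MeFetch, baseUrl: string): Promise<CloudAuthStatus> {
  try {
    const response = await fetchFn(`${baseUrl}/api/me`);
    if (!response.ok) {
      // Unexpected /api/me response: assume no cloud auth is needed so
      // onboarding can continue in local mode.
      return { signedIn: true };
    }
    const body = (await response.json()) as { name?: string };
    return { signedIn: true, user: body.name };
  } catch {
    return { signedIn: true }; // endpoint change or outage: degrade gracefully
  }
}
```

The trade-off is that a real auth problem surfaces later, at first cloud-model invocation, rather than during onboarding; the PR accepts that in exchange for never blocking local-only setups.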