feat: add OpenRouter + direct OpenAI as alternative LLM providers #1677

bettercallzaal wants to merge 3 commits into `recoupable:test`
Conversation
Adds a `createModel()` helper that resolves model strings through whichever provider is configured, with priority:

1. Vercel AI Gateway (existing production behavior)
2. OpenRouter (one key, many models)
3. Direct OpenAI (strips the provider prefix)

This lets contributors run the app locally without needing a Vercel AI Gateway API key; just set `OPENROUTER_API_KEY` or `OPENAI_API_KEY` instead.

Also fixes `generateMermaidDiagram` to use the provider-prefixed `"anthropic/claude-3-7-sonnet-20250219"` instead of a bare model string.

Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
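The priority order above can be sketched as a pure resolution function. This is a hedged illustration, not the PR's actual code: the real helper returns an AI SDK model instance, while this sketch returns a tag so the routing logic can be shown without any provider packages; the `Env` shape and `resolveProvider` name are assumptions.

```typescript
// Illustrative sketch of the provider-priority resolution described above.
type Provider = "gateway" | "openrouter" | "openai-direct";

interface Env {
  VERCEL_AI_GATEWAY_API_KEY?: string;
  VERCEL_OIDC_TOKEN?: string;
  OPENROUTER_API_KEY?: string;
  OPENAI_API_KEY?: string;
}

function resolveProvider(
  modelId: string,
  env: Env,
): { provider: Provider; modelId: string } {
  // 1. Vercel AI Gateway: existing production behavior (incl. OIDC token auth)
  if (env.VERCEL_AI_GATEWAY_API_KEY || env.VERCEL_OIDC_TOKEN) {
    return { provider: "gateway", modelId };
  }
  // 2. OpenRouter accepts provider-prefixed IDs ("anthropic/...", "openai/...") as-is
  if (env.OPENROUTER_API_KEY) {
    return { provider: "openrouter", modelId };
  }
  // 3. Direct OpenAI: strip the "openai/" prefix before calling the client
  if (env.OPENAI_API_KEY) {
    const bare = modelId.startsWith("openai/")
      ? modelId.slice("openai/".length)
      : modelId;
    return { provider: "openai-direct", modelId: bare };
  }
  throw new Error("No LLM provider configured");
}
```

The gateway branch wins even when other keys are set, matching the "existing production behavior unchanged" claim.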
@bettercallzaal is attempting to deploy a commit to the Recoupable Team on Vercel. A member of the Team first needs to authorize it.
Warning: Rate limit exceeded

Your organization is not enrolled in usage-based pricing. Contact your admin to enable usage-based pricing to continue reviews beyond the rate limit, or try again in 23 minutes and 32 seconds. CodeRabbit enforces hourly rate limits for each developer per organization; once the wait time has elapsed, a review can be re-triggered.

Review configuration: `.coderabbit.yaml`, profile CHILL, plan Pro. 2 files ignored due to path filters.
📒 Files selected for processing (5)
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: ad1b1a8c5e
ℹ️ About Codex in GitHub
Codex has been enabled to automatically review pull requests in this repo. Reviews are triggered when you:
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
When you sign up for Codex through ChatGPT, Codex can also answer questions or update the PR, like "@codex address that feedback".
```ts
const bareModel = modelId.startsWith("openai/") ? modelId.slice(7) : modelId;
return openai(bareModel);
```
Guard direct OpenAI fallback against non-OpenAI model IDs

In the `OPENAI_API_KEY` branch, any model ID that is not prefixed with `openai/` is still forwarded to `openai(...)` unchanged, which breaks when callers pass other providers' IDs. This commit now does exactly that in `generateMermaidDiagram` by calling `createModel("anthropic/${ANTHROPIC_MODEL}")`, so OpenAI-only environments will fail with an invalid-model error whenever the Mermaid tool runs. Add a provider check/remap here so non-OpenAI IDs are rejected or converted before invoking the OpenAI client.
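One possible shape for that guard, sketched as a pure helper. The helper name and error wording are illustrative assumptions, not code from this PR:

```typescript
// Illustrative guard: reject model IDs from other providers before they reach
// the OpenAI client, so OPENAI_API_KEY-only setups fail fast with a clear error
// instead of a confusing invalid-model response from the API.
function toOpenAIModelId(modelId: string): string {
  const slash = modelId.indexOf("/");
  if (slash !== -1) {
    const provider = modelId.slice(0, slash);
    if (provider !== "openai") {
      throw new Error(
        `Direct OpenAI fallback cannot serve "${modelId}": set OPENROUTER_API_KEY ` +
          `or a gateway key to use ${provider} models.`,
      );
    }
    return modelId.slice(slash + 1); // strip the "openai/" prefix
  }
  return modelId; // bare IDs are assumed to already be OpenAI model names
}
```

Failing fast here is what makes the `generateMermaidDiagram` issue below surface as a configuration error rather than a runtime API error.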
```ts
id: DEFAULT_MODEL,
name: "GPT-5 Mini",
description: "Default model for chat and generation",
specification: { specificationVersion: "v2", provider: "openai", modelId: DEFAULT_MODEL },
```
Add pricing metadata to fallback model entries

The new fallback models omit `pricing`, but downstream gating logic (`isFreeModel` and ModelSelect's `handleModelChange`) treats missing pricing as non-free and blocks non-subscribed users from selecting those models. Because `getAvailableModels` now returns this fallback list in non-gateway setups, all fallback options appear paid, and free users can be prevented from switching models. Include `pricing` (or another explicit free/pro signal) on these defaults to avoid incorrect paywall behavior.
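A sketch of what an entry with explicit pricing could look like. The `ModelEntry`/`pricing` shape and the `isFreeModel` body are assumptions about this codebase, shown only to make the gating problem concrete:

```typescript
// Hypothetical fallback entry carrying explicit pricing so gating logic does
// not misclassify it as paid. Field names beyond those quoted in the review
// snippet are assumptions, not the repo's actual types.
interface ModelEntry {
  id: string;
  name: string;
  description: string;
  pricing?: { input: string; output: string };
}

const FALLBACK_MODEL: ModelEntry = {
  id: "gpt-5-mini",
  name: "GPT-5 Mini",
  description: "Default model for chat and generation",
  // Explicit zero pricing marks the entry as free-tier selectable.
  pricing: { input: "0", output: "0" },
};

// Mirrors the gating described above: missing pricing reads as non-free.
function isFreeModel(entry: ModelEntry): boolean {
  return (
    entry.pricing !== undefined &&
    Number(entry.pricing.input) === 0 &&
    Number(entry.pricing.output) === 0
  );
}
```

With pricing omitted, `isFreeModel` returns false and the fallback list would look entirely paid to free-tier users, which is the paywall bug the review describes.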
5 issues found across 7 files

Confidence score: 2/5

- There are concrete runtime break risks in `lib/ai/createModel.ts`: OIDC environments may fail model creation because `VERCEL_OIDC_TOKEN` is ignored, and the direct OpenAI fallback currently passes non-OpenAI IDs through unchanged.
- `lib/tools/generateMermaidDiagram.ts` now always requests an Anthropic model ID, which can break the `OPENAI_API_KEY`-only path when combined with current `createModel()` behavior; this is user-facing in fallback scenarios.
- `lib/ai/getAvailableModels.ts` fallback behavior can mask gateway failures and advertise unreachable models, and missing `pricing` metadata may misclassify free-tier availability; these are medium-risk consistency issues rather than hard blockers.
- Pay close attention to `lib/ai/createModel.ts`, `lib/tools/generateMermaidDiagram.ts`, and `lib/ai/getAvailableModels.ts`: provider-path mismatches and fallback metadata can cause runtime failures or misleading model availability.
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.
```xml
<file name="lib/ai/createModel.ts">
  <violation number="1" location="lib/ai/createModel.ts:14">
    P1: Gateway detection is inconsistent with the rest of the AI stack: `createModel` ignores `VERCEL_OIDC_TOKEN`, which can make model creation fail in OIDC-configured environments.
  </violation>
  <violation number="2" location="lib/ai/createModel.ts:26">
    P2: Direct OpenAI fallback should reject non-OpenAI model IDs; currently it forwards `anthropic/*`/`google/*` IDs unchanged and fails at runtime.
  </violation>
</file>
<file name="lib/tools/generateMermaidDiagram.ts">
  <violation number="1" location="lib/tools/generateMermaidDiagram.ts:26">
    P1: This tool now always requests an Anthropic model ID, which breaks in the direct-OpenAI fallback path (`OPENAI_API_KEY` only) because `createModel()` passes non-OpenAI prefixes through to `openai(...)` unchanged.
  </violation>
</file>
<file name="lib/ai/getAvailableModels.ts">
  <violation number="1" location="lib/ai/getAvailableModels.ts:11">
    P2: The fallback `DEFAULT_MODELS` entries omit `pricing` metadata. If downstream gating logic (e.g. `isFreeModel` checks) treats missing pricing as non-free, all fallback models will appear as paid and free-tier users may be blocked from selecting them in non-gateway environments. Consider adding explicit `pricing` fields (or a free/pro signal) to these entries.
  </violation>
  <violation number="2" location="lib/ai/getAvailableModels.ts:46">
    P2: Do not return non-gateway defaults when a configured gateway request fails; this hides gateway failures and can advertise models that are not actually reachable via the active provider path.
  </violation>
</file>
```
Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review, or fix all with cubic.
```diff
 execute: async ({ context }) => {
   const result = await generateText({
-    model: ANTHROPIC_MODEL,
+    model: createModel(`anthropic/${ANTHROPIC_MODEL}`),
```
P1: This tool now always requests an Anthropic model ID, which breaks in the direct-OpenAI fallback path (`OPENAI_API_KEY` only) because `createModel()` passes non-OpenAI prefixes through to `openai(...)` unchanged.
Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At lib/tools/generateMermaidDiagram.ts, line 26:
<comment>This tool now always requests an Anthropic model ID, which breaks in the direct-OpenAI fallback path (`OPENAI_API_KEY` only) because `createModel()` passes non-OpenAI prefixes through to `openai(...)` unchanged.</comment>
<file context>
```diff
@@ -22,7 +23,7 @@ export const generateMermaidDiagram = tool({
   execute: async ({ context }) => {
     const result = await generateText({
-      model: ANTHROPIC_MODEL,
+      model: createModel(`anthropic/${ANTHROPIC_MODEL}`),
       system: MERMAID_INSTRUCTIONS_PROMPT,
       prompt: `Generate a Mermaid diagram for the following context: ${context}
```
</file context>
Throws a clear error when the direct OpenAI fallback receives a non-OpenAI model (e.g. `"anthropic/..."` or `"google/..."`) instead of silently passing it to the OpenAI API and getting a confusing error.

Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
- Add `VERCEL_OIDC_TOKEN` check to gateway detection in `createModel`
- Return `[]` on gateway failure instead of `DEFAULT_MODELS` to avoid masking gateway errors
- Add pricing metadata to fallback model entries

Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
```ts
if (process.env.OPENROUTER_API_KEY) {
  const openrouter = createOpenRouter({ apiKey: process.env.OPENROUTER_API_KEY });
  return openrouter(modelId);
}
```
KISS principle - Why would we want to add OpenRouter if we already support AI Gateway? How does this change the experience for our end customers?
Summary

- `createModel()` helper (`lib/ai/createModel.ts`) that resolves model strings through whichever provider is configured
- Call sites migrated to `createModel()` (7 files)
- Added the `@openrouter/ai-sdk-provider` dependency
- Fallback model list in `getAvailableModels()` for non-gateway environments
- Fixed the bare `ANTHROPIC_MODEL` in `generateMermaidDiagram` → uses the provider-prefixed `"anthropic/claude-3-7-sonnet-20250219"`

Motivation

Contributors without Vercel AI Gateway access can't run the chat features locally. This lets devs run with just an `OPENROUTER_API_KEY` (one key, many models) or a direct `OPENAI_API_KEY`. No breaking changes: existing production deployments using the gateway are unaffected.
Companion PR
What's NOT changed

- `generateImage.ts`: uses `openai.image()` directly (OpenAI-specific API)
- `generateArray.ts`: uses `anthropic()` directly (Anthropic-specific call)
- The `"ai"` package

Test plan
- `VERCEL_AI_GATEWAY_API_KEY` → verify existing behavior unchanged
- Only `OPENROUTER_API_KEY` → verify chat works through OpenRouter
- Only `OPENAI_API_KEY` → verify chat works through OpenAI directly
- `pnpm build` passes
- `pnpm test` passes

🤖 Generated with Claude Code
Summary by cubic

Adds OpenRouter and direct OpenAI as fallback LLM providers via a new `createModel()` helper. Local devs can run chat with `OPENROUTER_API_KEY` or `OPENAI_API_KEY`; gateway behavior (including `VERCEL_OIDC_TOKEN`) remains unchanged.

New Features

- `createModel()` resolves model strings by provider priority: `@ai-sdk/gateway` (incl. `VERCEL_OIDC_TOKEN`) → `@openrouter/ai-sdk-provider` → `@ai-sdk/openai`.
- Call sites migrated to `createModel()`.
- `getAvailableModels()` returns a small default list when the gateway isn't configured, now with pricing metadata.
- New `@openrouter/ai-sdk-provider` dependency.

Bug Fixes

- `generateMermaidDiagram` now uses `anthropic/${ANTHROPIC_MODEL}` via `createModel()` to ensure the provider prefix.
- On gateway failure, returns `[]` instead of fallback models to avoid masking configuration errors.

Written for commit f2645d0. Summary will update on new commits.