feat: add OpenRouter + direct OpenAI as alternative LLM providers#1677

Open
bettercallzaal wants to merge 3 commits into recoupable:test from bettercallzaal:feat/openrouter-provider
Conversation

@bettercallzaal

@bettercallzaal bettercallzaal commented Apr 15, 2026

Summary

  • Adds createModel() helper (lib/ai/createModel.ts) that resolves model strings through whichever provider is configured
  • Priority cascade: Vercel AI Gateway → OpenRouter → direct OpenAI
  • All AI SDK call sites updated to use createModel() (7 files)
  • Adds @openrouter/ai-sdk-provider dependency
  • Adds fallback model list in getAvailableModels() for non-gateway environments
  • Fixes bare ANTHROPIC_MODEL in generateMermaidDiagram → uses provider-prefixed "anthropic/claude-3-7-sonnet-20250219"
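
The priority cascade above can be sketched as pure resolution logic. This is a hypothetical reduction: `resolveProvider` and its descriptor return type are illustrative only, and the real `createModel()` returns an AI SDK model instance rather than a descriptor.

```typescript
// Hypothetical sketch of createModel()'s provider cascade. The real helper
// returns an AI SDK model instance; this version only resolves which
// provider would handle a given model string.
type Resolution = { provider: "gateway" | "openrouter" | "openai"; modelId: string };

function resolveProvider(
  modelId: string,
  env: Record<string, string | undefined>,
): Resolution {
  if (env.VERCEL_AI_GATEWAY_API_KEY || env.VERCEL_OIDC_TOKEN) {
    // Gateway consumes provider-prefixed IDs ("anthropic/...", "openai/...") as-is.
    return { provider: "gateway", modelId };
  }
  if (env.OPENROUTER_API_KEY) {
    // OpenRouter uses the same "provider/model" ID scheme, so no rewriting is needed.
    return { provider: "openrouter", modelId };
  }
  if (env.OPENAI_API_KEY) {
    // Direct OpenAI only understands bare model names: strip the "openai/"
    // prefix and reject IDs that belong to other providers.
    if (modelId.includes("/") && !modelId.startsWith("openai/")) {
      throw new Error(`Model "${modelId}" is unavailable with only OPENAI_API_KEY set`);
    }
    return { provider: "openai", modelId: modelId.replace(/^openai\//, "") };
  }
  throw new Error("No LLM provider configured");
}
```

Passing `env` explicitly keeps the sketch testable; the real helper would read `process.env` directly.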

Motivation

Contributors without Vercel AI Gateway access can't run the chat features locally. This lets devs run with just an OPENROUTER_API_KEY (one key, many models) or a direct OPENAI_API_KEY.

No breaking changes — existing production deployments using the gateway are unaffected.

Companion PR

What's NOT changed

  • generateImage.ts — uses openai.image() directly (OpenAI-specific API)
  • generateArray.ts — uses anthropic() directly (Anthropic-specific call)
  • Type-only imports from "ai" package

Test plan

  • Set VERCEL_AI_GATEWAY_API_KEY → verify existing behavior unchanged
  • Set only OPENROUTER_API_KEY → verify chat works through OpenRouter
  • Set only OPENAI_API_KEY → verify chat works through OpenAI directly
  • pnpm build passes
  • pnpm test passes

🤖 Generated with Claude Code


Summary by cubic

Adds OpenRouter and direct OpenAI as fallback LLM providers via a new createModel() helper. Local devs can run chat with OPENROUTER_API_KEY or OPENAI_API_KEY; gateway behavior (including VERCEL_OIDC_TOKEN) remains unchanged.

  • New Features

    • Added createModel() to resolve model strings by provider priority: @ai-sdk/gateway (incl. VERCEL_OIDC_TOKEN) → @openrouter/ai-sdk-provider → @ai-sdk/openai.
    • Updated AI SDK call sites to use createModel().
    • getAvailableModels() returns a small default list when the gateway isn’t configured, now with pricing metadata.
    • Added @openrouter/ai-sdk-provider dependency.
  • Bug Fixes

    • generateMermaidDiagram now uses anthropic/${ANTHROPIC_MODEL} via createModel() to ensure the provider prefix.
    • Added a clear error when a non-OpenAI model is used in direct OpenAI mode.
    • Gateway fetch failures now return [] instead of fallback models to avoid masking configuration errors.

Written for commit f2645d0. Summary will update on new commits.

Adds a `createModel()` helper that resolves model strings through
whichever provider is configured, with priority:
1. Vercel AI Gateway (existing production behavior)
2. OpenRouter (one key, many models)
3. Direct OpenAI (strip provider prefix)

This lets contributors run the app locally without needing a
Vercel AI Gateway API key — just set OPENROUTER_API_KEY or
OPENAI_API_KEY instead.

Also fixes generateMermaidDiagram to use provider-prefixed
"anthropic/claude-3-7-sonnet-20250219" instead of bare model string.

Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
@vercel
Contributor

vercel bot commented Apr 15, 2026

@bettercallzaal is attempting to deploy a commit to the Recoupable Team on Vercel.

A member of the Team first needs to authorize it.

@coderabbitai
Contributor

coderabbitai bot commented Apr 15, 2026

Warning

Rate limit exceeded

@bettercallzaal has exceeded the limit for the number of commits that can be reviewed per hour. Please wait 23 minutes and 32 seconds before requesting another review.

Your organization is not enrolled in usage-based pricing. Contact your admin to enable usage-based pricing to continue reviews beyond the rate limit, or try again in 23 minutes and 32 seconds.


ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 93c154d5-170a-4691-a1f2-6f6559c91a84

📥 Commits

Reviewing files that changed from the base of the PR and between 72cce68 and f2645d0.

⛔ Files ignored due to path filters (2)
  • package.json is excluded by none and included by none
  • pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml and included by none
📒 Files selected for processing (5)
  • lib/ai/createModel.ts
  • lib/ai/generateText.ts
  • lib/ai/getAvailableModels.ts
  • lib/catalog/analyzeCatalogBatch.ts
  • lib/tools/generateMermaidDiagram.ts



@chatgpt-codex-connector chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: ad1b1a8c5e

ℹ️ About Codex in GitHub

Codex has been enabled to automatically review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

When you sign up for Codex through ChatGPT, Codex can also answer questions or update the PR, like "@codex address that feedback".

Comment thread lib/ai/createModel.ts
Comment on lines +26 to +27
const bareModel = modelId.startsWith("openai/") ? modelId.slice(7) : modelId;
return openai(bareModel);


P2: Guard direct OpenAI fallback against non-OpenAI model IDs

In the OPENAI_API_KEY branch, any model ID that is not prefixed with openai/ is forwarded to openai(...) unchanged, which breaks when callers pass IDs from other providers. This PR does exactly that in generateMermaidDiagram by calling createModel(`anthropic/${ANTHROPIC_MODEL}`), so OpenAI-only environments will fail with an invalid-model error whenever the Mermaid tool runs. Add a provider check/remap here so non-OpenAI IDs are rejected or converted before invoking the OpenAI client.

Useful? React with 👍 / 👎.
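
A minimal version of the suggested guard could look like this; the function name and error message are illustrative, not the PR's actual code.

```typescript
// Hypothetical guard for the direct-OpenAI branch of createModel().
// Converts "openai/..." IDs to bare names and rejects other providers'
// IDs instead of forwarding them to the OpenAI client.
function toOpenAIModelId(modelId: string): string {
  if (modelId.startsWith("openai/")) return modelId.slice("openai/".length);
  if (modelId.includes("/")) {
    throw new Error(
      `Model "${modelId}" requires a gateway or OpenRouter key; only OPENAI_API_KEY is configured`,
    );
  }
  return modelId; // already a bare OpenAI model name
}
```

The direct-OpenAI branch would then call `openai(toOpenAIModelId(modelId))` so the failure happens before the API request is made.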

Comment thread lib/ai/getAvailableModels.ts
Comment on lines +13 to +16
id: DEFAULT_MODEL,
name: "GPT-5 Mini",
description: "Default model for chat and generation",
specification: { specificationVersion: "v2", provider: "openai", modelId: DEFAULT_MODEL },


P2: Add pricing metadata to fallback model entries

The new fallback models omit pricing, but downstream gating logic (isFreeModel and ModelSelect's handleModelChange) treats missing pricing as non-free and blocks non-subscribed users from selecting those models. Because getAvailableModels now returns this fallback list in non-gateway setups, all fallback options appear paid and free users can be prevented from switching models. Include pricing (or another explicit free/pro signal) on these defaults to avoid incorrect paywall behavior.

Useful? React with 👍 / 👎.
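
One possible shape for a fallback entry with an explicit free signal is sketched below. The DEFAULT_MODEL value, the pricing fields (per-token costs as strings), and the isFreeModel stand-in are assumptions about the project's types, not its actual code.

```typescript
// Hypothetical fallback entry carrying explicit pricing so downstream
// gating does not treat it as a paid model.
const DEFAULT_MODEL = "openai/gpt-5-mini"; // assumed value

interface ModelEntry {
  id: string;
  name: string;
  description: string;
  pricing?: { input: string; output: string }; // assumed shape
}

const FALLBACK_MODELS: ModelEntry[] = [
  {
    id: DEFAULT_MODEL,
    name: "GPT-5 Mini",
    description: "Default model for chat and generation",
    pricing: { input: "0", output: "0" }, // explicit zero => treated as free
  },
];

// Stand-in for the downstream check: missing pricing reads as non-free.
function isFreeModel(m: ModelEntry): boolean {
  return !!m.pricing && Number(m.pricing.input) === 0 && Number(m.pricing.output) === 0;
}
```

With explicit zero pricing, free-tier users can still select the fallback models in non-gateway environments.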


@cubic-dev-ai cubic-dev-ai bot left a comment


5 issues found across 7 files

Confidence score: 2/5

  • There are concrete runtime break risks in lib/ai/createModel.ts: OIDC environments may fail model creation because VERCEL_OIDC_TOKEN is ignored, and direct OpenAI fallback currently passes non-OpenAI IDs through unchanged.
  • lib/tools/generateMermaidDiagram.ts now always requests an Anthropic model ID, which can break the OPENAI_API_KEY-only path when combined with current createModel() behavior; this is user-facing in fallback scenarios.
  • lib/ai/getAvailableModels.ts fallback behavior can mask gateway failures and advertise unreachable models, and missing pricing metadata may misclassify free-tier availability; these are medium-risk consistency issues rather than hard blockers.
  • Pay close attention to lib/ai/createModel.ts, lib/tools/generateMermaidDiagram.ts, lib/ai/getAvailableModels.ts - provider-path mismatches and fallback metadata can cause runtime failures or misleading model availability.
Prompt for AI agents (unresolved issues)

Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.


<file name="lib/ai/createModel.ts">

<violation number="1" location="lib/ai/createModel.ts:14">
P1: Gateway detection is inconsistent with the rest of the AI stack: `createModel` ignores `VERCEL_OIDC_TOKEN`, which can make model creation fail in OIDC-configured environments.</violation>

<violation number="2" location="lib/ai/createModel.ts:26">
P2: Direct OpenAI fallback should reject non-OpenAI model IDs; currently it forwards `anthropic/*`/`google/*` IDs unchanged and fails at runtime.</violation>
</file>

<file name="lib/tools/generateMermaidDiagram.ts">

<violation number="1" location="lib/tools/generateMermaidDiagram.ts:26">
P1: This tool now always requests an Anthropic model ID, which breaks in the direct-OpenAI fallback path (`OPENAI_API_KEY` only) because `createModel()` passes non-OpenAI prefixes through to `openai(...)` unchanged.</violation>
</file>

<file name="lib/ai/getAvailableModels.ts">

<violation number="1" location="lib/ai/getAvailableModels.ts:11">
P2: The fallback `DEFAULT_MODELS` entries omit `pricing` metadata. If downstream gating logic (e.g. `isFreeModel` checks) treats missing pricing as non-free, all fallback models will appear as paid and free-tier users may be blocked from selecting them in non-gateway environments. Consider adding explicit `pricing` fields (or a free/pro signal) to these entries.</violation>

<violation number="2" location="lib/ai/getAvailableModels.ts:46">
P2: Do not return non-gateway defaults when a configured gateway request fails; this hides gateway failures and can advertise models that are not actually reachable via the active provider path.</violation>
</file>

Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review, or fix all with cubic.

Comment thread lib/tools/generateMermaidDiagram.ts Outdated
  execute: async ({ context }) => {
    const result = await generateText({
-      model: ANTHROPIC_MODEL,
+      model: createModel(`anthropic/${ANTHROPIC_MODEL}`),

@cubic-dev-ai cubic-dev-ai bot Apr 15, 2026


P1: This tool now always requests an Anthropic model ID, which breaks in the direct-OpenAI fallback path (OPENAI_API_KEY only) because createModel() passes non-OpenAI prefixes through to openai(...) unchanged.

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At lib/tools/generateMermaidDiagram.ts, line 26:

<comment>This tool now always requests an Anthropic model ID, which breaks in the direct-OpenAI fallback path (`OPENAI_API_KEY` only) because `createModel()` passes non-OpenAI prefixes through to `openai(...)` unchanged.</comment>

<file context>
@@ -22,7 +23,7 @@ export const generateMermaidDiagram = tool({
   execute: async ({ context }) => {
     const result = await generateText({
-      model: ANTHROPIC_MODEL,
+      model: createModel(`anthropic/${ANTHROPIC_MODEL}`),
       system: MERMAID_INSTRUCTIONS_PROMPT,
       prompt: `Generate a Mermaid diagram for the following context: ${context}
</file context>

Comment thread lib/ai/createModel.ts
Comment thread lib/ai/getAvailableModels.ts Outdated
Comment thread lib/ai/getAvailableModels.ts
Throws a clear error when the direct OpenAI fallback receives a
non-OpenAI model (e.g. "anthropic/..." or "google/...") instead of
silently passing it to the OpenAI API and getting a confusing error.

Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>

@cubic-dev-ai cubic-dev-ai bot left a comment


0 issues found across 1 file (changes from recent commits).

Requires human review: Auto-approval blocked by 4 unresolved issues from previous reviews.

- Add VERCEL_OIDC_TOKEN check to gateway detection in createModel
- Return [] on gateway failure instead of DEFAULT_MODELS to avoid
  masking gateway errors
- Add pricing metadata to fallback model entries

Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>

@cubic-dev-ai cubic-dev-ai bot left a comment


0 issues found across 2 files (changes from recent commits).

Requires human review: Auto-approval blocked by 1 unresolved issue from previous reviews.

Comment thread lib/ai/createModel.ts
Comment on lines +19 to +22
if (process.env.OPENROUTER_API_KEY) {
const openrouter = createOpenRouter({ apiKey: process.env.OPENROUTER_API_KEY });
return openrouter(modelId);
}
Collaborator


KISS principle - Why would we want to add OpenRouter if we already support AI Gateway? How does this change the experience for our end customers?

2 participants