feat: add Mistral model support to OpenAI-compatible client#7350

Open
Aboudjem wants to merge 1 commit into microsoft:main from Aboudjem:feat/mistral-support
Conversation


@Aboudjem Aboudjem commented Mar 6, 2026

Summary

  • Adds Mistral AI models to _MODEL_INFO with proper capabilities (vision, function calling, structured output)
  • Adds token limits to _MODEL_TOKEN_LIMITS for all Mistral models
  • Adds model pointer aliases (e.g. mistral-large-latest → mistral-large-2411) to _MODEL_POINTERS
  • Adds MISTRAL_OPENAI_BASE_URL constant

Models added:

| Model | Context | Vision | Notes |
| --- | --- | --- | --- |
| mistral-large-2411 | 131K | No | Flagship |
| mistral-small-2503 | 131K | No | Cost-efficient |
| codestral-2501 | 262K | No | Code-specialized |
| mistral-medium-2505 | 131K | No | Balanced |
| pixtral-large-2411 | 131K | Yes | Multimodal |

All models use ModelFamily.MISTRAL (already defined in autogen-core). Follows the same pattern as existing Gemini and Llama integrations.

Test plan

  • Verify resolve_model and get_info work for Mistral models
  • Integration test with OpenAIChatCompletionClient and Mistral base URL
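The first test-plan item can be sketched as below. `resolve_model` and `get_info` here are simplified stand-ins for the real autogen helpers (their actual signatures may differ), operating on miniature versions of the dicts this PR extends:

```python
# Simplified stand-ins for autogen's model resolution helpers, using
# miniature versions of the registries described in this PR.
_MODEL_POINTERS = {"mistral-large-latest": "mistral-large-2411"}
_MODEL_INFO = {"mistral-large-2411": {"vision": False, "family": "MISTRAL"}}

def resolve_model(model: str) -> str:
    # Follow a pointer alias to the concrete versioned model name;
    # names without an alias resolve to themselves.
    return _MODEL_POINTERS.get(model, model)

def get_info(model: str) -> dict:
    # Look up capabilities for the resolved model name.
    return _MODEL_INFO[resolve_model(model)]

print(resolve_model("mistral-large-latest"))        # mistral-large-2411
print(get_info("mistral-large-latest")["family"])   # MISTRAL
```

The integration test would then construct an `OpenAIChatCompletionClient` pointed at the Mistral OpenAI-compatible endpoint, which requires an API key and network access and so is not sketched here.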

Fixes #6151

🤖 Generated with Claude Code

Adds Mistral AI models (mistral-large, mistral-small, codestral,
mistral-medium, pixtral-large) to _MODEL_INFO, _MODEL_TOKEN_LIMITS,
and _MODEL_POINTERS. Also adds MISTRAL_OPENAI_BASE_URL constant for
the Mistral OpenAI-compatible API endpoint.

This enables using Mistral models via OpenAIChatCompletionClient with
proper model info resolution, token limits, and base URL configuration,
following the same pattern as existing Gemini and Llama integrations.

Fixes microsoft#6151

Co-Authored-By: Claude Opus 4.6 <[email protected]>

Aboudjem commented Mar 6, 2026

@microsoft-github-policy-service agree

@Aboudjem

Hi team — friendly follow-up on this PR adding Mistral model support. Is there anything I should adjust, or a different approach you'd prefer? Happy to iterate. Thanks!


Development

Successfully merging this pull request may close these issues.

Mistral support through OpenAIChatCompletionClient.