feat: add AI Bridge request logs model filter #22230
Conversation
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: a5fb301aba
```sql
GROUP BY
    model
LIMIT COALESCE(NULLIF(@limit_::integer, 0), 100)
```
Add ORDER BY before paginating model list
The new GET /aibridge/models query applies LIMIT/OFFSET immediately after GROUP BY without any ORDER BY, so pagination is nondeterministic. In PostgreSQL, grouped rows can be returned in arbitrary order, which means clients using offset pagination can see duplicates or miss models between pages as execution plans/data change. Add a stable ordering (for example by model) before pagination so page boundaries are consistent.
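A minimal sketch of the suggested fix, following the excerpt above: add a stable `ORDER BY` on `model` before paginating. The `@offset_` parameter name is an assumption here, mirroring the `@limit_` style from the snippet; the actual query may name it differently.

```sql
GROUP BY
    model
-- Stable ordering so LIMIT/OFFSET page boundaries stay consistent
-- across requests and plan changes.
ORDER BY
    model ASC
LIMIT COALESCE(NULLIF(@limit_::integer, 0), 100)
OFFSET @offset_  -- assumed parameter name
```

Any deterministic total ordering works; ordering by `model` itself is the natural choice since it is the only grouped column.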
```tsx
if (firstModel) {
    return {
        label: firstModel,
        value: firstModel,
    };
}
```
Validate selected model matches filter value
getSelectedOption currently treats the first result from /aibridge/models?q=<value>&limit=1 as the selected model even when it is only a prefix match and not the actual current filter value. If a user lands on a query like model:gpt (or any non-exact value), the dropdown can display a different model as selected, which misrepresents the active filter state and can lead to accidental filter changes. Return a selected option only when the fetched model exactly equals value.
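A hedged sketch of the suggested behaviour: only report a selected option when the fetched model exactly equals the filter value. The `Option` shape and helper name are illustrative, not the actual `ModelFilter.tsx` code.

```typescript
type Option = { label: string; value: string };

// Hypothetical helper: treat the first result from
// /aibridge/models?q=<value>&limit=1 as selected only on an exact
// match, so a prefix query like "gpt" is not shown as "gpt-4".
function toSelectedOption(
    value: string,
    firstModel: string | undefined,
): Option | undefined {
    if (firstModel !== undefined && firstModel === value) {
        return { label: firstModel, value: firstModel };
    }
    return undefined;
}
```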
Force-pushed from a5fb301 to 2878930.
Documentation Check: Updates Needed
Automated review via Coder Tasks.
This stack of pull requests is managed by Graphite.
I think the AI left good review comments. I agree with the missing ORDER BY alongside LIMIT/OFFSET.
Not sure if I understand the second one correctly, but from my manual testing it seemed impossible to filter by model prefix, which the PR description mentions as a feature; the first match from the dropdown was selected instead.
Also, it looks like a request to the backend is made every time the model input field value changes (e.g. typing "gpt" results in 3 queries):

Either fetch all the models once, remove limit/offset, and filter on the client side (a bit risky, but in practice there should not be that many models), or query the back-end only when Enter is pressed or after some delay.
Maybe I'm a bit paranoid, but this table is big, so I'd rather keep queries against it to a minimum.
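The "after some delay" option above is a standard debounce. A minimal sketch, assuming a 300 ms delay (the delay value and function names are illustrative, not from the PR):

```typescript
// Hypothetical sketch: debounce the model-search callback so typing
// "gpt" issues one backend query instead of three.
function debounce<A extends unknown[]>(
    fn: (...args: A) => void,
    delayMs: number,
): (...args: A) => void {
    let timer: ReturnType<typeof setTimeout> | undefined;
    return (...args: A) => {
        // Reset the timer on every keystroke; fn only fires once
        // the input has been quiet for delayMs.
        if (timer !== undefined) {
            clearTimeout(timer);
        }
        timer = setTimeout(() => fn(...args), delayMs);
    };
}
```

Wiring this around the existing fetch (e.g. `debounce(searchModels, 300)`) would keep the current UX while collapsing rapid keystrokes into a single request.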
site/src/pages/AIBridgePage/RequestLogsPage/RequestLogsFilter/ModelFilter.tsx
pawbana left a comment:
Approving on credit that the eager endpoint calling on model filter input change will be fixed in a follow-up soon :)
Force-pushed from 3683ce5 to 08fa8ae.

This pull request implements simple filtering logic so that we're able to pick which model the user actually used when logs were sent to AI Bridge.
- `GET /aibridge/models` API endpoint that returns distinct model names from AI Bridge interceptions, with pagination and search support
- `ListAIBridgeModels` SQL query using case-sensitive prefix matching (`LIKE model || '%'`) to allow B-tree index usage
- `ListAuthorizedAIBridgeModels` in `modelqueries.go` for RBAC authorization filter injection
- `AIBridgeModels` search query parser in `searchquery/search.go` (defaults bare terms to the `model` field)
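The last bullet's "bare terms default to the model field" behaviour can be sketched roughly as follows. This is a hypothetical TypeScript illustration of the semantics, not the actual Go `searchquery` package; the field names and return shape are assumptions.

```typescript
// Illustrative parser: a bare term like "gpt" is treated as
// "model:gpt", while "field:value" terms are split on the first colon.
function parseModelQuery(query: string): Record<string, string> {
    const filter: Record<string, string> = {};
    for (const term of query.trim().split(/\s+/).filter(Boolean)) {
        const idx = term.indexOf(":");
        if (idx === -1) {
            // Bare term: default to the model field.
            filter.model = term;
        } else {
            filter[term.slice(0, idx)] = term.slice(idx + 1);
        }
    }
    return filter;
}
```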