Is your feature request related to a problem?
Could you please add vLLM support to the model list in SQLChat settings?
The current Ollama integration doesn’t perform well and is noticeably slow; supporting vLLM would give us a much faster local inference option.
Thanks!
Describe the solution you'd like
Add vLLM support to the model list in SQLChat settings.
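For what it's worth, vLLM ships an OpenAI-compatible HTTP server (e.g. `vllm serve <model>`, which listens on port 8000 by default), so a minimal integration might only need a configurable base URL rather than a whole new client. A rough TypeScript sketch of what a request to a local vLLM server could look like (the URL and model name are placeholders for whatever the user launched vLLM with; this is not SQLChat's actual client code):

```ts
// Sketch only: vLLM exposes an OpenAI-compatible API, typically at
// http://localhost:8000/v1 when started with `vllm serve <model>`.
// Both the base URL and model name below are assumptions/placeholders.
const VLLM_BASE_URL = "http://localhost:8000/v1";

async function chatWithVllm(prompt: string): Promise<string> {
  const res = await fetch(`${VLLM_BASE_URL}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      // Must match the model the vLLM server was launched with.
      model: "meta-llama/Llama-3.1-8B-Instruct",
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`vLLM request failed: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```

If that compatibility holds in practice, the existing OpenAI request path in SQLChat could presumably be reused by just letting the user point it at a vLLM base URL in settings.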
Additional context
No response