Chat at gazelle speed: clone, run, and sprint.
Ghazelle Chat is an open-source, full-stack AI chat app that streams answers in real time from multiple LLM providers (OpenAI, Anthropic, Google Gemini, Groq), with optional OpenRouter expansion. It ships with credential-based authentication, a local libSQL database that can be swapped for Turso for global low-latency reads, and a modern, responsive UI built with Nuxt 3 & Tailwind CSS.
- **Realtime AI chat**: Server-Sent Events (SSE) stream model tokens as they arrive for a snappy UX.
- **Model picker with registry**: A centralized model registry powers a command-palette picker with grouped categories (Best, Fast, Open), search, badges, and a "Browse more models" drawer for discovery. Supports GPT-5.4, Claude Opus/Sonnet 4, Gemini 3 Pro/Flash, Llama 3 on Groq, and more via OpenRouter.
- **Personalized model selection**: Pin, favorite, and track recent models. Preferences persist locally and surface your go-to models at the top of the picker.
- **Resumable streams**: In-flight generations survive page refresh; the backend keeps generating even if you refresh or lose your connection.
- **Persisted conversations**: Drizzle ORM manages the conversations & messages tables (libSQL/Turso).
- **Email / password auth**: Powered by Lucia with a Drizzle adapter.
- **Image & PDF attachments**: Messages support attachments for richer context; users can upload images or PDF files along with their question.
- **Dynamic theming & backgrounds**: One-click light/dark mode and a gallery of ambient wallpapers that swap automatically with the theme.
- **Mobile-first UI**: Radix-Vue primitives via shadcn-nuxt, light/dark themes.
- **Keyboard shortcuts & micro-interactions**: Ctrl + K focuses the input, Esc clears the draft, animated send button, toast notifications.
- **Extensible**: Plug in new LLM providers, vector search, or a PWA offline cache with minimal code changes.
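Consuming the SSE stream on the client mostly comes down to splitting the `text/event-stream` body into frames and collecting the `data:` lines. A minimal sketch, assuming a plain `data:`-only wire format (the actual payload shape is whatever the chat endpoint emits, and `parseSseFrames` is an illustrative name, not part of the codebase):

```typescript
// Minimal SSE frame parser: frames are separated by a blank line, and each
// frame's payload is the concatenation of its `data:` lines. This is a
// simplified stand-in, not the app's actual stream handling.
export function parseSseFrames(buffer: string): string[] {
  return buffer
    .split("\n\n") // blank line terminates a frame
    .map((frame) =>
      frame
        .split("\n")
        .filter((line) => line.startsWith("data:"))
        .map((line) => line.slice("data:".length).trimStart())
        .join("\n"), // multi-line data fields join with newlines
    )
    .filter((data) => data.length > 0);
}
```

In practice you would decode chunks from `fetch(...).body` with a `TextDecoder`, buffer until a blank line arrives, and append each parsed payload to the visible message.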
| Layer | Choice |
|---|---|
| UI framework | Nuxt 3 + Nitro |
| Styling | Tailwind CSS + shadcn-nuxt (Radix-Vue) |
| Auth | Lucia |
| Database / ORM | libSQL / Turso + Drizzle |
| LLM SDKs | openai, groq-sdk, @anthropic-ai/sdk, Gemini REST, OpenRouter (via OpenAI SDK) |
| Validation | Zod |
```bash
# 1. Clone & install deps
pnpm install --frozen-lockfile

# 2. Copy env template & fill in secrets
cp .env.example .env

# 3. Push database schema & optional seed
pnpm db:push   # drizzle-kit push
pnpm db:seed   # optional sample data

# 4. Hack away ✨
pnpm dev       # http://localhost:3000
```

| Key | Description |
|---|---|
| `OPENAI_API_KEY` | Secret key from the OpenAI dashboard |
| `ANTHROPIC_API_KEY` | Secret key from Anthropic |
| `GOOGLE_API_KEY` | Secret key from Google AI Studio (Gemini) |
| `GROQ_API_KEY` | Secret key from GroqCloud |
| `OPENROUTER_API_KEY` | (optional) Key from OpenRouter for expanded model access |
| `DATABASE_URL` | `file:./db.sqlite` (default) or a Turso URL (`libsql://…`) |
| `AUTH_SECRET` | Random 32-byte string used by Lucia |
An example template lives in `.env.example`.

ℹ️ Only the keys for providers you intend to use are required at runtime.
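Since only the configured providers need keys, startup can derive the enabled provider list straight from the environment. A hypothetical helper (the key-to-provider mapping mirrors the table above; `enabledProviders` is an illustrative name, not an export of the codebase):

```typescript
// Map each provider to the env key documented in the table above.
const PROVIDER_KEYS: Record<string, string> = {
  openai: "OPENAI_API_KEY",
  anthropic: "ANTHROPIC_API_KEY",
  google: "GOOGLE_API_KEY",
  groq: "GROQ_API_KEY",
  openrouter: "OPENROUTER_API_KEY",
};

// Return the providers whose keys are present and non-empty, so the model
// picker can hide models you cannot actually call.
export function enabledProviders(
  env: Record<string, string | undefined>,
): string[] {
  return Object.entries(PROVIDER_KEYS)
    .filter(([, key]) => Boolean(env[key]?.trim()))
    .map(([provider]) => provider);
}
```

Called as `enabledProviders(process.env)`, this gives a single place to gate provider-specific routes and UI.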
```
├─ assets/            # Tailwind & global CSS, images
├─ components/        # UI & page-level Vue components (shadcn-nuxt wrappers)
├─ pages/             # Nuxt file-system routes (/, /login, /chat)
├─ server/            # Nitro server routes, db, auth & utils
│  ├─ api/            # REST endpoints (auth, chat, conversations, …)
│  ├─ db/             # Drizzle schema & seed
│  └─ utils/          # OpenAI / Groq / Anthropic helpers
├─ lib/               # Shared utilities & model registry (lib/models/)
│  └─ models/         # Model types, registry, helpers, provider metadata
└─ nuxt.config.ts     # Runtime config & module registration
```
All server routes live under `/server/api` and are automatically mapped by Nitro:
| Method | Route | Purpose |
|---|---|---|
| POST | `/api/auth/register` | Create user |
| POST | `/api/auth/login` | Email + password login |
| POST | `/api/auth/logout` | Revoke session |
| GET | `/api/auth/me` | Current user info |
| POST | `/api/chat` | Stream chat completion |
| GET | `/api/conversations` | List user conversations |
| DELETE | `/api/conversations/[id]` | Delete conversation |
| GET | `/api/conversations/[id]/messages` | Paginated messages |
| GET | `/api/messages/[id]/content` | Full message content |
| PUT | `/api/messages/[id]/stop` | Stop generation |
Responses are JSON, or `text/event-stream` when streaming.
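Nitro derives these routes from file names, with bracketed segments such as `[id]` becoming path parameters. A simplified sketch of that matching, not Nitro's actual router:

```typescript
// Match a concrete URL path against a route pattern such as
// "/api/conversations/[id]/messages", returning the extracted params,
// or null if the shapes differ. Illustrative only.
export function matchRoute(
  pattern: string,
  path: string,
): Record<string, string> | null {
  const patternParts = pattern.split("/").filter(Boolean);
  const pathParts = path.split("/").filter(Boolean);
  if (patternParts.length !== pathParts.length) return null;

  const params: Record<string, string> = {};
  for (let i = 0; i < patternParts.length; i++) {
    const part = patternParts[i];
    if (part.startsWith("[") && part.endsWith("]")) {
      params[part.slice(1, -1)] = pathParts[i]; // dynamic segment
    } else if (part !== pathParts[i]) {
      return null; // literal segment mismatch
    }
  }
  return params;
}
```

So a request to `/api/conversations/42/messages` resolves against the `[id]` pattern with `{ id: "42" }` available to the handler.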
| Script | What it does |
|---|---|
| `pnpm dev` | Launches the Nuxt dev server with auto-reload |
| `pnpm build` | Generates the production build (`.output/`) |
| `pnpm preview` | Serves the production build locally |
| `pnpm db:push` | Runs Drizzle migrations against `DATABASE_URL` |
| `pnpm db:seed` | Populates the DB with demo data |
Ghazelle Chat is optimised for Vercel Edge Functions but will happily run on any Node 18+ environment.
- Set your env vars in the hosting dashboard.
- Ensure `DATABASE_URL` points at Turso/libSQL for global reads.
- `vercel --prod` will call `pnpm build` automatically.

For traditional servers, run `pnpm build && node .output/server/index.mjs`.
We welcome and appreciate contributions of all kinds! If you plan to add a new feature or make a non-trivial change, please open an issue first so we can discuss scope and design.
- Open an issue: discuss the proposed change or bug fix.
- Fork the repo and create a descriptive branch name (`feature/pagination`, `fix/input-focus`, …).
- Keep the pull request focused and include a clear description.
- Ensure `pnpm lint` (and `pnpm test`, coming soon) pass before pushing.
- Update the documentation and `.env.example` if your change introduces new environment variables.
Roadmap / Help Wanted
- Message pagination & infinite scroll
- Rate-limiting middleware for public API routes
- Built-in web search tool (browser with citations)
- Shareable conversation links / export
- Image generation
If any of these interest you, comment on the corresponding issue (or create one) and claim it. New ideas are always welcome!
MIT





