
badhri17/GhazelleChat


Ghazelle Chat

Chat at gazelle speed – clone, run, and sprint.

Ghazelle Chat is an open-source, full-stack AI chat app that streams answers in real time from multiple LLM providers (OpenAI, Anthropic, Google Gemini, Groq), with optional expansion via OpenRouter. It ships with credential-based authentication, a local libSQL database that can be swapped for hosted Turso for global low-latency reads, and a modern, responsive UI built with Nuxt 3 & Tailwind CSS.


📸 Screenshots


✨ Features

• Real-time AI chat – Server-Sent Events (SSE) stream model tokens as they arrive for a snappy UX.

• Model picker with registry – A centralized model registry powers a command-palette picker with grouped categories (Best, Fast, Open), search, badges, and a "Browse more models" drawer for discovery. Supports GPT-5.4, Claude Opus/Sonnet 4, Gemini 3 Pro/Flash, Llama 3 on Groq, and more via OpenRouter.

• Personalized model selection – Pin, favorite, and track recent models. Preferences persist locally and surface your go-to models at the top of the picker.

• Resumable streams – In-flight generations survive page refreshes; the backend keeps generating even if you refresh or lose the connection.

• Persisted conversations – Drizzle ORM manages the conversations & messages tables (libSQL/Turso).

• Email / password auth – Powered by Lucia with a Drizzle adapter.

• Image & PDF attachments – Upload images or PDF files alongside your question for richer context.

• Dynamic theming & backgrounds – One-click light/dark mode and a gallery of ambient wallpapers that swap automatically with the theme.

• Mobile-first UI – Built on Radix-Vue primitives via shadcn-nuxt.

• Keyboard shortcuts & micro-interactions – Ctrl + K focuses the input, Esc clears the draft, animated send button, toast notifications.

• Extensible – Plug in new LLM providers, vector search, or a PWA offline cache with minimal code changes.
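The SSE flow behind real-time chat can be sketched roughly as follows. This is an illustrative client, not the repo's actual code: the `{ token }` payload shape and the way `/api/chat` is called here are assumptions.

```typescript
// Minimal SSE frame parser: turns one raw text/event-stream chunk into tokens.
// The "data:" framing follows the SSE spec; the { token } payload is assumed.
function parseSSEChunk(chunk: string): string[] {
  const tokens: string[] = [];
  for (const line of chunk.split("\n")) {
    if (!line.startsWith("data:")) continue;        // skip blanks / keep-alives
    const data = line.slice("data:".length).trim();
    if (data === "[DONE]") break;                   // common end-of-stream sentinel
    tokens.push(JSON.parse(data).token);            // assumed payload: { token: string }
  }
  return tokens;
}

// Consuming the stream with fetch (works in browsers and Node 18+):
async function streamChat(prompt: string, onToken: (t: string) => void) {
  const res = await fetch("/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    parseSSEChunk(decoder.decode(value, { stream: true })).forEach(onToken);
  }
}
```

Tokens are appended to the UI as each `data:` frame arrives, which is what makes the chat feel instant even for long completions.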


πŸ— Tech Stack

Layer Choice
UI framework Nuxt 3 + Nitro
Styling Tailwind CSS + shadcn-nuxt (Radix-Vue)
Auth Lucia
Database / ORM libSQL / Turso + Drizzle
LLM SDKs openai, groq-sdk, @anthropic-ai/sdk, Gemini REST, OpenRouter (via OpenAI SDK)
Validation Zod

🚀 Quick Start

```bash
# 1. Clone & install deps
git clone https://github.com/badhri17/GhazelleChat.git
cd GhazelleChat
pnpm install --frozen-lockfile

# 2. Copy env template & fill in secrets
cp .env.example .env

# 3. Push database schema & optionally seed
pnpm db:push      # drizzle-kit push
pnpm db:seed      # optional sample data

# 4. Hack away ✨
pnpm dev          # http://localhost:3000
```

Required Environment Variables

| Key | Description |
| --- | --- |
| OPENAI_API_KEY | Secret key from the OpenAI dashboard |
| ANTHROPIC_API_KEY | Secret key from Anthropic |
| GOOGLE_API_KEY | Secret key from Google AI Studio (Gemini) |
| GROQ_API_KEY | Secret key from GroqCloud |
| OPENROUTER_API_KEY | (optional) Key from OpenRouter for expanded model access |
| DATABASE_URL | file:./db.sqlite (default) or a Turso URL (libsql://…) |
| AUTH_SECRET | Random 32-byte string used by Lucia |

An example template lives in .env.example.

ℹ️ Only the keys for providers you intend to use are required at runtime.
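For local development, a filled-in .env might look like the sketch below. All values are placeholders for illustration; real key formats vary by provider.

```bash
# Placeholder values for illustration only
OPENAI_API_KEY=your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
GOOGLE_API_KEY=your-gemini-key
GROQ_API_KEY=your-groq-key
# OPENROUTER_API_KEY=your-openrouter-key   # optional
DATABASE_URL=file:./db.sqlite
AUTH_SECRET=change-me   # e.g. generate with: openssl rand -hex 32
```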


📂 Project Structure (abridged)

```
├─ assets/        # Tailwind & global CSS, images
├─ components/    # UI & page-level Vue components (shadcn-nuxt wrappers)
├─ pages/         # Nuxt file-system routes (/, /login, /chat)
├─ server/        # Nitro server routes, db, auth & utils
│  ├─ api/        # REST endpoints (auth, chat, conversations, …)
│  ├─ db/         # Drizzle schema & seed
│  └─ utils/      # OpenAI / Groq / Anthropic helpers
├─ lib/           # Shared utilities & model registry (lib/models/)
│  └─ models/     # Model types, registry, helpers, provider metadata
└─ nuxt.config.ts # Runtime config & module registration
```
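The registry under lib/models/ that drives the picker (grouped categories plus pinned/recent preferences) could be sketched like this. All names here (ChatModel, surfaceModels, …) are hypothetical, not the repo's actual API.

```typescript
// Illustrative sketch of a model registry with user-preference ranking.
// Types and function names are hypothetical, not the repo's actual API.
interface ChatModel {
  id: string;                       // e.g. "gpt-4o"
  provider: "openai" | "anthropic" | "google" | "groq" | "openrouter";
  category: "Best" | "Fast" | "Open";
}

interface Preferences {
  pinned: string[];                 // model ids pinned by the user
  recents: string[];                // most-recently-used first
}

// Surface pinned models first, then recents (in recency order),
// then everything else in registry order.
function surfaceModels(registry: ChatModel[], prefs: Preferences): ChatModel[] {
  const rank = (m: ChatModel): number => {
    if (prefs.pinned.includes(m.id)) return 0;
    const r = prefs.recents.indexOf(m.id);
    return r >= 0 ? 1 + r / registry.length : 2;
  };
  return [...registry].sort((a, b) => rank(a) - rank(b));
}
```

Persisting `Preferences` locally (e.g. in localStorage) is what lets go-to models appear at the top of the picker across sessions.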

🔌 API Overview

All server routes live under server/api/ and are automatically mapped to /api/* by Nitro:

| Method | Route | Purpose |
| --- | --- | --- |
| POST | /api/auth/register | Create user |
| POST | /api/auth/login | Email + password login |
| POST | /api/auth/logout | Revoke session |
| GET | /api/auth/me | Current user info |
| POST | /api/chat | Stream chat completion |
| GET | /api/conversations | List user conversations |
| DELETE | /api/conversations/[id] | Delete conversation |
| GET | /api/conversations/[id]/messages | Paginated messages |
| GET | /api/messages/[id]/content | Full message content |
| PUT | /api/messages/[id]/stop | Stop generation |

Responses are JSON or text/event-stream when streaming.
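A thin client-side wrapper over these routes might look like the following sketch; the `apiRequest` helper is hypothetical, and only the routes come from the table above.

```typescript
// Hypothetical helper that builds a Request for the routes listed above,
// using the standard Request class available in browsers and Node 18+.
function apiRequest(
  method: "GET" | "POST" | "PUT" | "DELETE",
  route: string,
  body?: unknown,
  base = "http://localhost:3000",
): Request {
  return new Request(new URL(route, base).toString(), {
    method,
    headers: body ? { "Content-Type": "application/json" } : undefined,
    body: body ? JSON.stringify(body) : undefined,
  });
}

// Example: log in, then stop an in-flight generation (ids are illustrative).
const login = apiRequest("POST", "/api/auth/login", { email: "a@b.c", password: "secret" });
const stop = apiRequest("PUT", "/api/messages/abc123/stop");
```

Centralizing request construction like this keeps the JSON headers and base URL in one place when new routes are added.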


🛠 Useful Scripts

| Script | What it does |
| --- | --- |
| pnpm dev | Launches the Nuxt dev server with auto-reload |
| pnpm build | Generates a production build (.output/) |
| pnpm preview | Serves the production build locally |
| pnpm db:push | Runs Drizzle migrations against DATABASE_URL |
| pnpm db:seed | Populates the DB with demo data |

☁️ Deployment

Ghazelle Chat is optimised for Vercel Edge Functions but will happily run on any Node 18+ environment.

  1. Set your env vars in the hosting dashboard.
  2. Ensure DATABASE_URL points at Turso/libSQL for global reads.
  3. vercel --prod will call pnpm build automatically.

For traditional servers, run pnpm build && node .output/server/index.mjs.


🤝 Contributing

We welcome and appreciate contributions of all kinds! If you plan to add a new feature or make a non‑trivial change, please open an issue first so we can discuss scope and design.

  1. Open an issue → discuss the proposed change or bug fix.
  2. Fork the repo and create a descriptive branch name (feature/pagination, fix/input-focus, …).
  3. Keep the pull request focused and include a clear description.
  4. Ensure pnpm lint (and pnpm test, coming soon) pass before pushing.
  5. Update the documentation and .env.example if your change introduces new environment variables.


📌 Roadmap / Help-Wanted

  • Message pagination & infinite scroll

  • Rate‑limiting middleware for public API routes

  • Built-in web search tool (browsing with citations)

  • Shareable conversation links / export

  • Image generation

If any of these interest you, comment on the corresponding issue (or create one) and claim it. New ideas are always welcome!


License

MIT


☕ Support (Buy Me a Coffee)

Buy Me A Coffee
