
Ragbits Example

Companion repository for the Builder Journal — a hands-on guide to building production-ready GenAI applications with Ragbits.

Overview

This repository contains working code for each section of the Builder Journal. Each section builds on the previous one, progressively adding capabilities to a chat application:

| Section | What You Build | Key Concepts |
|---|---|---|
| 1. LLM Proxy | Streaming chat API with web UI | ChatInterface, LiteLLM, RagbitsAPI |
| 2. App Configuration | Branded, persistent, authenticated chat UI | Customization, FileHistoryPersistence, UserSettings, FeedbackConfig, ListAuthenticationBackend |
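At its core, the "streaming chat" pattern from Section 1 is an async generator that yields tokens as they arrive, which the web UI consumes incrementally. A library-agnostic sketch of that pattern (the function names below are illustrative stand-ins, not Ragbits or LiteLLM APIs):

```python
import asyncio
from typing import AsyncIterator

async def fake_llm_stream(prompt: str) -> AsyncIterator[str]:
    # Stand-in for a streaming LLM call (e.g. via LiteLLM);
    # yields response tokens one at a time.
    for token in ["Hello", " ", "from", " ", prompt]:
        await asyncio.sleep(0)  # simulate network latency between tokens
        yield token

async def stream_chat(prompt: str) -> str:
    # Consume the stream incrementally, as the chat UI would,
    # accumulating tokens into the full response.
    chunks = []
    async for token in fake_llm_stream(prompt):
        chunks.append(token)
    return "".join(chunks)

print(asyncio.run(stream_chat("Ragbits")))  # → Hello from Ragbits
```

In the actual application, the ChatInterface implementation plays the role of `stream_chat`, forwarding tokens from the provider to the browser as they are generated rather than waiting for the complete response.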

Prerequisites

  • Python 3.10+
  • An API key from any supported LLM provider (OpenAI, Anthropic, Azure, etc.)

Installation

Install dependencies using uv:

uv sync

Configuration

Set your LLM provider API key:

# OpenAI
export OPENAI_API_KEY="your-api-key"

# Or Anthropic
export ANTHROPIC_API_KEY="your-api-key"

# Or any other provider supported by LiteLLM
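LiteLLM picks the provider based on which environment variable is set. A small, hypothetical pre-flight check (not part of Ragbits) can catch a missing key before the server starts; the variable names below follow LiteLLM's conventions:

```python
import os

# Provider environment variables as read by LiteLLM.
PROVIDER_KEYS = ("OPENAI_API_KEY", "ANTHROPIC_API_KEY", "AZURE_API_KEY")

def find_configured_providers(env=os.environ):
    """Return the provider key names that are set and non-empty."""
    return [key for key in PROVIDER_KEYS if env.get(key)]

# Example with an explicit mapping instead of the real environment:
configured = find_configured_providers({"OPENAI_API_KEY": "sk-..."})
print(configured)  # ['OPENAI_API_KEY']
```

If the list comes back empty, the application has no credentials to call any model with, so exporting one of these variables is the first thing to verify when requests fail.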

Running the Application

Start the chat server:

uv run ragbits api run ragbits_example.main:SimpleStreamingChat

Open http://127.0.0.1:8000 in your browser to access the chat interface.

Server Options

# Custom host and port
uv run ragbits api run ragbits_example.main:SimpleStreamingChat --host 0.0.0.0 --port 9000

# Auto-reload for development
uv run ragbits api run ragbits_example.main:SimpleStreamingChat --reload

# Debug mode
uv run ragbits api run ragbits_example.main:SimpleStreamingChat --debug

Deploying the Application

To deploy the app on GCP or AWS, follow the instructions in infrastructure/README.md.

Project Structure

ragbits-example/
├── src/
│   └── ragbits_example/
│       ├── __init__.py
│       ├── config.py    # UI, forms, auth, and settings configuration
│       └── main.py      # Chat application implementation
├── infrastructure/      # IaC for deploying the app on GCP or AWS
│   ├── aws/
│   │   ├── terraform/
│   │   │   ├── backend.tf
│   │   │   ├── main.tf
│   │   │   ├── outputs.tf
│   │   │   ├── providers.tf
│   │   │   └── variables.tf
│   │   ├── deploy.sh
│   │   └── destroy.sh
│   ├── gcp/
│   │   ├── terraform/
│   │   │   ├── backend.tf
│   │   │   ├── main.tf
│   │   │   ├── outputs.tf
│   │   │   ├── providers.tf
│   │   │   └── variables.tf
│   │   ├── deploy.sh
│   │   └── destroy.sh
│   ├── config.sh
│   ├── deploy_infra.sh
│   ├── destroy_infra.sh
│   └── README.md
├── pyproject.toml       # Project configuration and dependencies
└── README.md

How to Use with the Builder Journal

  1. Read a section in the Builder Journal
  2. Reference the corresponding code in this repository
  3. Run and experiment with the application
  4. Modify and extend based on your needs

The documentation explains the concepts while this repository provides the working implementation.
